Time to shut it all down?
Date: 2023-04-01 01:21 am (UTC)

I wonder, given that all the more probable existential threats are man-made (including the lab-spawn and "vaccines"), whether we wouldn't be better off just pulling the plug and letting the chips fall where they may?
This egghead, for example, thinks that it's necessary to ban the development of AI to save ourselves from certain destruction by it:
https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/
In my opinion, people being people, preventing ourselves from developing AI seems impossible as long as the temptation is there. It might even be that people don't really have a choice - that the system is 'self-organizing' and will do whatever it is fated to do.
So, in the spirit of suggesting a course of action that is extremely unlikely to be taken up by everyone, my vote goes for pulling the plug, smashing all the machines and labs, burning all the books, et cetera, et cetera.
But that's never going to happen either, is it?
So, here's hoping AI is technically impossible, because otherwise we're all dead.
The Ninth Mouse