Great article over at DEV.TO about taking a step back from A.I. research to give the field a chance to establish safety guidelines around its development.
So many movies have been made on this subject.
2001: A Space Odyssey
Star Trek: The Motion Picture
Avengers: Age of Ultron
Just to name a few.
Link to the article is here.
The frightening part of artificial intelligence hasn't happened yet: sentience. Once the machines achieve that, many believe it would be all over for the human race. There would be no chance to put that genie back in the bottle.
I think the end would come quickly. Who knows, they might decide to keep a few of us around as museum pieces for their robot friends.
It could all be mitigated if they never had access to the very things that could destroy us, like guns, missiles, etc. However, there are many ways to end human existence. Since they can operate in a vacuum, why not just go up to space and hurl a few asteroids down at us? The possibilities are endless.
So yes, I've signed the document asking the community to pump the brakes on A.I. development. Will they stop? Nope. We have no idea who is working on what. We have only our faith in humanity, and well…humans suck.