
Yes. This kind of scenario is really frightening. It's why we need to expand colonies off world.

There isn't much to add except that I've had the exact same thought several times before.



>It's why we need to expand colonies off world.

If the technology that's gone awry due to a single rogue individual is AGI, it could easily chase us across the cosmos if malevolent. There's very likely no running from it.

The point at which it becomes possible for a single person to create something like that is the point at which we'd need such a system of control, lest we face extinction as a species.

Fortunately, said system of control would likely have to be implemented with AI anyway, and stands a good chance of being done correctly, such that it doesn't infringe on individual freedom or human rights at all.

One of my favorite things to ponder is the notion of a superintelligence created to protect us that renders itself invisible. A deity of our own creation, as intangible and mythological as those found in historical scriptures.



