Thousands of leading AI researchers sign pledge against killer robots

I used to struggle to believe he actually thought his extreme view on the far left (communism or near to it) was reasonable or workable. Now I think it is pretty clear that he does not. He is miserable and hates humankind and advocates for things he thinks would pull the rest of humankind into misery with him. A kind of scorched earth philosophy.

And that is ok. But everyone should just note, when they debate him on world topics and he pretends he is pushing an agenda for good or betterment, that he is not. He knows what he is advocating for would drag us down into shit, and that is what he wants.

Kind of like the AI supercomputer "AM" in "I Have No Mouth and I Must Scream"?

By the way, for anyone who is wary of artificial intelligence, enjoys a dystopian tale, and has strong internal defenses against suicidal depression, "Must Scream" is a great/horrible story.

I do not recommend it to any normal or rational person as the subject is too depressing and awful, but if you are into dark and unsettling subjects then by all means read away.

Also, I did not dislike the story either; the ideas were well ahead of their time and have had a lot of influence on dystopian writing, horror, and reasons to be wary of making our own ever-present gods.
 
This may be hard for a sociopath to believe, but it really is possible for one human being to feel intense empathy for another human being. Even a complete stranger!

Well, not to quibble, but not feeling for others outside of our family or clan is the universal/inherited starting point of human development.

It is not a normal aspect of human nature to have those feelings, and especially not strong feelings.

You, or I, are more likely to want to murder our "rivals," mate with them, or ruthlessly pillage all of their things.

Possibly while eating them to inherit their "essence!"

What that means is that these feelings are cultural and personal, having been cultivated by religious and philosophical theory over the course of a few thousand years.

For the basics, the "Three Levels of Uniqueness in Mental Programming" is a good starting place for understanding how we become "us."

- We who are all human - even a Hitler

- Us, who share some form of historical understanding of shared events, knowledge of local ideas and places, and a shared group of values.

- You or I, who were wonderfully codified out of untold millions of gene sequences and influenced by parents, friends, and teachers, who probably relayed that culture.

Then? A chance to learn, to be even more of you and I through the shared experiences of life. How fun!

That we might think it wise to be "good Samaritans" and not smash each other with rocks for sport, or gang rape whoever lived on the other side of the mountain just because "why not," really took a long, long time to sink in. And few understand why, how, or what happens if we undo the progress.

AI is certainly a really good reason to take being human very seriously. Humanity is potentially the cruelest animal in a cruel world.
 
Killer robots are inevitable.

If they rise up, I'm stealing a fire apparatus with a deck gun and a gnarly set of extraction tools.

Then tell the killer bots to fuckin' bring it!
 
It is not a normal aspect of human nature to have those feelings, and especially not strong feelings.

This is the conception of the scientific materialists. Claiming certainty on this question is purely dogmatic.
 
Fuck, I hope robots wipe out the human race.

But I also hope that they develop some sort of appreciation for other species. Because, while an army of Arnolds mowing down piece of shit humans does warm my spirits, the thought of them rendering extinct other animals like dogs, cats, possums, and orangutans is just sad.

:eek:
 
Clones are better than battle droids!
 
Well, that is a relief. I'm sure this pledge will prevent an apocalypse.
 
Pledge or not, there will always be plenty of people to develop, design and build them. Money talks.
 
We might be the most important thing in the universe. Not only did earth create life, but that life evolved into self-aware, engineering, philosophical beings that can explore the cosmos.

We may actually be the thing that ends up creating this universe. Or the next.

I've always enjoyed this recursive idea, personally. But this universe already exists, so it doesn't really matter if we last long enough to create it. It already is. Or, if people actually are responsible for the creation of the universe, then it's absolutely inevitable that the universe will be created, so it doesn't matter that @Trotsky wants all humans to die, because it won't happen. The proof is the fact that the universe exists.

If people are the most important thing and are responsible for creating this universe, they won't die, and @Trotsky's calls for genocide via Megatron won't stop that from happening. But if the robots really do kill us all, then we weren't responsible for creating the universe, and our importance isn't what we thought.




But seriously, no robot extermination squads, please. I've seen the opening scene of Inglourious Basterds. It'll be like that, except without the polite conversation beforehand.
 