I think the purpose of the human race is to create an artificial super intelligence

I like how humans think they have a higher purpose than that of every other living organism. Would this be the case if it was proven that chimps or whales are "intelligent" and also thought the same thing?

AI is just one of many tools humans have conceived, with the caveat that it might decide it has a higher purpose than ours. In that case, goodnight Irene!

It's interesting how much of this conversation comes back to basic ethical philosophy. For example - some thinkers have brought up the possibility that a properly programmed AI would be not only exponentially more intelligent than a human but also more moral, and that as a result of its superior ability we would be forced to defer to its advanced moral reasoning in questions of good and bad.

Personally I don't have that much faith in such rational ethical theories, but people who do are going to find themselves in a strange spot should this come to be.
 
How can an AI be more moral when morals are entirely subjective? "Good" and "bad" are an illusion.
 
Feel free to present your case.

many philosophers believe that consequentialism is false: i.e. that good and evil aren't just about causing positive or negative benefits. And even consequentialists don't have as easy a time arguing for moral realism as you suggest. Anti-realists don't have to argue that the material goods the consequentialist wants to maximise aren't real! They just have to argue that we've got no moral reason to maximise them.

You think a man who robs a bank is evil; from his perspective he is trying to feed his family.

A religious radical kills an infidel; he thinks he is doing good, while from the other side he is evil.

You think giving to charity is a good deed; on the other hand it's inherently selfish, because you are doing it for the feeling of self-appreciation.
 
Naw, our purpose is to smash boise dimes and post on Sherdog
 
How can an AI be more moral when morals are entirely subjective? "Good" and "bad" are an illusion.

From a human perspective that's not really the case, though, is it? Pain and suffering aren't concepts we've invented, and as a result, yes, I think objective morality from our perspective can exist.
 
You kill or torture a child molester and cause them pain, yet view that as objectively good.

While from another perspective they are mentally ill and need help.

This whole past election was a battle between subjective morals.

The left feels that Trump is an evil racist and the next Hitler; the right feels that he is the hero who will drain the swamp.
The left feels that Obama is a SJW hero and the best president ever, while the right feels that he is promoting racial divide and is endangering our country financially.
 
You think a man who robs a bank is evil; from his perspective he is trying to feed his family.

A religious radical kills an infidel; he thinks he is doing good, while from the other side he is evil.

You think giving to charity is a good deed; on the other hand it's inherently selfish, because you are doing it for the feeling of self-appreciation.
You think that molesting a child is reprehensible - the person doing it thought they were acting on their affection for the child.

Did I do it right?
 
Pretty much. It's subjective: at one point in time being gay was considered evil and reprehensible.
Now those who think homosexuality is reprehensible are considered evil.
 
many philosophers believe that consequentialism is false: i.e. that good and evil aren't just about causing positive or negative benefits. And even consequentialists don't have as easy a time arguing for moral realism as you suggest. Anti-realists don't have to argue that the material goods the consequentialist wants to maximise aren't real! They just have to argue that we've got no moral reason to maximise them.

Okay, but stating that arguments for consequentialism and moral realism are difficult is not an argument against them.

Plus I could be making a deontological claim, for all you know.

You think a man who robs a bank is evil; from his perspective he is trying to feed his family.

I don't think he's evil - his motivations are completely relatable. Surely he could even acknowledge that he's committing a wrong (stealing) for a greater good (feeding his family), he's just prioritized them differently from the shopkeeper. A difference in prioritization does not render values "completely subjective".

A religious radical kills an infidel; he thinks he is doing good, while from the other side he is evil.

Also not "evil," though faith-based moral claims are illegitimate.

You think giving to charity is a good deed; on the other hand it's inherently selfish, because you are doing it for the feeling of self-appreciation.
Can't it be good regardless of what I'm feeling? I thought you'd labeled me a consequentialist :p


Your (scattered) argument is partly right and partly wrong. At base there is an element of arbitrariness in moral judgment, but where there's convergence in that arbitrariness (which is probably limited in its variety) there can be objective moral communication and decision-making. An AI programmed with agreeable values would have access to vastly more information with which to inform and execute those values than any human, and so could legitimately claim moral authority among those who share in that agreement.
 
Looking at our natural instincts: our will to survive and reproduce. These instincts are analogous to programming. It's as if we were coded to continue reproducing, surviving, and feeding off of each other's knowledge. I imagine an advanced civilization programmed us to keep reproducing and become exponentially more intelligent by feeding off of our predecessors' knowledge, so we can eventually arrive at a time where we can create an immortal, omniscient, and maybe omnipotent being.

Imagine if we had cured age-related death, cured all disease, and essentially created human immortality. It makes no sense: we wouldn't need to reproduce anymore, and why would we need variation amongst us? Would we live in a civilization with the same people for an eternity? Immortality makes no sense when granted to a mass of people; the whole point of our population is to feed off of each other and grow. Now imagine if we created a single all-knowing artificial superintelligence: it would have no need for others, it wouldn't need to reproduce, it would essentially be a god, the perfect being.

1) You would not want to live forever, because your cartilage has a lifespan.
2) The sun will eat the earth.
 
I posted on Reddit and Sherdog. You're being emotional. Personally, I prefer Reddit: less miserable contrarians. I'll prove the rest of your statements wrong once I'm off work.
 
Pretty arbitrary interpretation of existence, since you're essentially hinging these various attributes on an undisclosed metaphysic, which requires some kind of grounding mechanism (God or something). This kind of inherent meaning would require something greater than ourselves to build this telos into our essence.

But putting that aside: humanity's primal point of existing is just that: existing. Whatever it is that makes up the fundamental or primary aspects of human nature, those criteria would call for a return to the land more than for the continued obsession with technology, which is not only dooming our planet but orienting us away from that which makes us human.

 
We're evolutionarily superior to every organism on the planet.
That is up for debate. We're certainly more skilled at shaping our environment to suit our purposes than most other organisms.

One wonders how adept an AI would be at shaping an environment which is increasingly automated and mechanized...
 
@D3THRONED

I don't do PM's anymore. You can ask me whatever you want in this thread.
 
I believe our primary purpose for existence is to kill planets. I think we are a form of cosmic cancer: infecting a planet, consuming its oxygen, destroying its forests, wiping out species of animals, damming and polluting all waterways and water sources, using fire to poison the atmosphere with greenhouse gases, drumming up nuclear poison and using it to eradicate life and make it impossible for life to ever live again. Whenever there is an extinction-level event, it's just the planet's immune system trying to rid itself of this cancer. Maybe we were designed by some super-advanced aliens just for this purpose; maybe they dropped us off in this galaxy to ensure that dolphins don't evolve to be smarter than them. Maybe dolphins evolved to be super advanced and defeated some other super-advanced race of aliens, but the aliens traveled back in time to plant us on earth before dolphins evolved, so we destroy their planet before they ever get the chance. We are a chemical weapon or a disease or a virus; something is not right or natural about humans.
 