Cherdawg
Silver Belt
- Joined
- Jun 12, 2013
- Messages
- 11,887
- Reaction score
- 0
Lmao what a gif!
It's improbable they would know that everything in the universe is necessary in order to prolong their existence. The only way that would happen is if they created a whole world or massive station to house their expansion.
There are several types of intelligence. Emotional intelligence is one that is not programmable (taught), but may instead be mimicked (poorly). Just like Fedor's fight I.Q. may be imitated by a sherdog neckbeard in a walmart parking lot, with disastrous results. Machines fare even worse.
soon
haha
My words are futile
For those that work with or actually understand AI principles in general, it's pretty widely accepted that the question isn't if we develop AI, it's when.
Correct me if I'm wrong, but don't we already have it (it's just really crappy)?
Sorry, I should have been more clear. I meant AI that matches and surpasses human intelligence.
lol that is so incredibly retarded
the computers will soon (as in coming decades) surpass the processing power of a human brain
we will also be able to map the neural connections in the human brain and recreate it with a machine
this is not science fiction, those are facts, and something that leading computer scientists agree on
And those computers will still not be able to do anything except what their human creators program them to be able to do.
You say I'm retarded, I'm a fucking genius compared to the best computers. Computers are retarded, think about what I said about language. A stupid child is more capable of communication than the most advanced computer program. If you want computers that can do what humans can, you will have to be able to create a human. Good luck with that!
Do you work with control systems? Or program systems with feedback?
ok cool.
My take is that it has already surpassed human intelligence in computational ability and in memory.
There are other types of intelligence we possess natural ability in that are not easy for us to even understand, so teaching them (which requires understanding them first) will take quite a long time.
Science fiction has always overestimated the potential for advancement in AI. Robots and computers can only perform the specific tasks they're programmed to perform, however sophisticated those tasks may be. The most advanced walking robots can't balance themselves nearly as well as humans can. For all their processing power, in terms of language the most advanced programs can't communicate with humans better than children.
The whole idea that we should be scared of sentient AI has been funny to me. There's nothing to be scared of.
No and please tell me why it matters.