bio_man
wrote...
Administrator
Educator
Posts: 33243
12 years ago


We've all seen films like Terminator and The Matrix. They imagine a future where machines have become conscious and aren't very nice. But the reality of this field is very different, even if it does feel a lot like science fiction.

The idea of the Singularity has been around for over 50 years. It had a small surge of media attention in the 1990s that quickly died away. This decade, however, it's back in the public eye and far less fringe than ever before. There is a Singularity University, a Singularity hub on Newsweek's website, a feature documentary about it, and it made the cover of Time Magazine in February.

The Singularity, as defined by Singularitarians, is this: our computers are getting faster and smarter at an exponential rate, meaning they are getting faster at getting faster. Eventually they will be able to match human-level intelligence. When they do, they will take over their own development; they will be conscious. At that point the world changes, and it is impossible to predict exactly what will happen afterwards, just as nothing can be seen beyond the event horizon of a black hole. Singularitarians borrow the term from physics and call it the Singularity.
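To make "getting faster at getting faster" concrete, here is a minimal sketch of what exponential growth in computing capability looks like. The two-year doubling period and the 30-year horizon are illustrative assumptions for the example, not figures from the post.

```python
# Minimal sketch of exponential growth in capability: compound doubling.
# The 2-year doubling time and the year values below are illustrative
# assumptions, not claims from the post.

def capability(years, doubling_time=2.0, start=1.0):
    """Relative capability after `years`, doubling every `doubling_time` years."""
    return start * 2 ** (years / doubling_time)

for year in (0, 10, 20, 30):
    print(f"year {year:2d}: {capability(year):>10,.0f}x baseline")
```

Under these assumed numbers, capability is about 32x the baseline after 10 years, roughly 1,000x after 20, and over 30,000x after 30, which is the intuition behind the claim that the curve eventually reaches and passes human-level intelligence.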

Replies
wrote...
12 years ago
Military organizations already have weapons with computers inside them and robots capable of killing humans. They have even given some autonomy to these machines. Should we put strong restrictions on the capability of AI machines to harm humans?

Should we put a limit on the types of intelligence in which computers are allowed to outsmart humans?
wrote...
Valued Member
12 years ago
This is a really good question. I think machines will not innately have a survival instinct or anything that would lead them to WANT to dominate. We could program such features into machines, intentionally or possibly carelessly, but I don't think it's an issue from the start unless WE make it one. We will also become machines. I think we will have the will and capacity to kill each other before machines do.
Don't forget to give me a thumbs up!