
How soon will machine intelligence surpass human intelligence? (Poll Closed)

14 Comments

  • Donald - 13 years ago

    Computers aren't intelligent; they just think they are!

  • post-post - 13 years ago

    "How do you feed spirit and soul into a machine?"

    You don't.

  • post-post - 13 years ago

    This is a poll that makes one think; I don't know if I would answer "I don't know", because these appear to be the three possibilities to choose from, IMO:
    a) It has happened already, but it's keeping quiet.
    b) Vernor Vinge says shortly after 2020, and I agree with that.
    c) Ray Kurzweil says 2045, which sounds about right to me.

    So for once I am commenting without having answered the poll.

  • John Pratt - 13 years ago

    How do you feed spirit and soul into a machine?

  • sheekus - 13 years ago

    BTW, I chose 2045 as the right date. I'll be an oldster then; oh geez, just imagine the build-up of my unparalleled wisdom for nought.

  • sheekus - 13 years ago

    Is it just me, or has anyone else noticed that Kurzweil has delayed his dates over the 20 years of his published books? Compare The Age of Spiritual Machines to The Singularity Is Near, let's say.

  • max - 13 years ago

    We need to talk to that 11% from the it-is-already-here camp! Why have they chosen this time and place to reveal their uncanny knowledge? Lol!

  • Summerspeaker - 13 years ago

    Do a tenth of IEET readers really believe human-level artificial intelligence already exists? I'm in the "I don't know" camp, but Kurzweil's estimate strikes me as one plausible scenario.

  • Sally Morem - 13 years ago

    I agree with Mike Clarke's assessment, except I think accelerating tech is moving somewhat faster than that. I believe it'll happen sometime in the 2020s. Yes, humanity does indeed need to be prepared.

  • Pastor_Alex - 13 years ago

    My difficulty with this poll is that without defining what intelligence is, we can't decide whether a machine is even intelligent. Arguments about intelligence abound, with the favourite measure being the "IQ test". The challenge is that IQ tests have been shown not to be cross-cultural and to be effective only in measuring a small segment of what we call intelligence.

    This doesn't even take into account the relationship between intelligence and consciousness. We could create a machine that had a very high intelligence in a specific area (in fact we already have, in Deep Blue and Watson) but no discernible sense of self. Watson didn't complain that it couldn't keep its winnings and blow them on upgrades.

    The answer to the poll is both that we already have and that we probably never will. I don't think we have a sufficient understanding of consciousness to create it in another being, and from what I have seen of present research into the matter, I don't think we will in the foreseeable future.

  • H. Dufort - 13 years ago

    What if we have a computer with raw processing power that far surpasses the human brain, with some asynchronous parallel processing, but all digital, and without any meaningful application to emulate adaptable intelligence? For example, a supercomputer used to model weather systems or galaxy formation; no intelligence would arise from this organized complexity. This is what I like to call a "super-fancy abacus": it's still a straightforward calculator, even though it escapes von Neumann's narrowest definition.

    I think the real breakthrough will come when we model a highly complex, multiple-feedback system with both local and global signal propagation, with an emotion generator (e.g. hardwired survival strategies, either symbolic or functional), perception+expression communication channels, and dream-based data consolidation. Such a system WON'T need a multi-terabyte execution space or a petaflop-grade supercomputer, at least not to reach a mammal-level richness in its behaviors. I believe the breakthrough won't happen through "emerging" consciousness in complex data networks or "accidental" software; at some point, some guy in a lab will spend a few days programming a carefully thought-out piece of software that will run, albeit slowly, on your aunt's entry-level PC.

    We have the technology already, but we're lagging in our understanding of cognition, consciousness, self-perception, etc. And we're still really, really primitive when it comes to developing software. Just imagine: the Egyptians invented paper 5000 years ago, but the paper plane was only invented in the 19th century.

  • Mike Clarke - 13 years ago

    If you project the convergence of technologies such as neural-net computing, quantum/optical computing, artificial intelligence, self-learning systems, etc., they converge on this point at around 2035. However, it may take another 10 years or so to develop sentient intelligence. Humanity needs to be prepared for this.

  • Tobias - 13 years ago

    I would say that intelligence is not definable concretely enough. There is certainly no really good definition, and I would even contend that such a definition is impossible.

    So by some definitions of intelligence, the point is already past.
    But one will always be able to find a definition under which the point can never happen.
    Personally, I think that for most definitions there will be a machine intelligence that surpasses most humans within this century.

  • Aleksei Riikonen - 13 years ago

    Thank you for presenting the "I don't know" option (and for not joining it with some additional claim that I wouldn't have supported).
