My view, and that of the majority of my colleagues in AI, is that it'll be at least half a century before we see computers matching humans. Given that various breakthroughs are needed, and it's very hard to predict when breakthroughs will happen, it might even be a century or more. If that's the case, you don't need to lose too much sleep tonight.
One reason for believing that machines will get to human-level or even superhuman-level intelligence quickly is the dangerously seductive idea of the technological singularity. This idea can be traced back to a number of people over fifty years ago: John von Neumann, one of the fathers of computing, and the mathematician and Bletchley Park cryptographer I. J. Good. More recently, it's an idea that has been popularised by the science-fiction author Vernor Vinge and the futurist Ray Kurzweil.
The singularity is the anticipated point in humankind's history when we have developed a machine so intelligent that it can recursively redesign itself to be even more intelligent. The idea is that this would be a tipping point, after which machine intelligence would suddenly start to improve exponentially, quickly exceeding human intelligence by orders of magnitude.
Once we reach the technological singularity, we will no longer be the most intelligent species on the planet. It will certainly be an interesting moment in our history. One fear is that it will happen so quickly that we won't have time to monitor and control the development of this super-intelligence, and that this super-intelligence might, intentionally or unintentionally, lead to the end of the human race.
Proponents of the technological singularity (who, tellingly, are usually not AI researchers but futurists or philosophers) behave as if the singularity is inevitable. To them, it is a logical certainty; the only question mark is when. However, like many other AI researchers, I have considerable doubt about its inevitability.
We have learned, over half a century of work, how difficult it is to build computer systems with even modest intelligence. And we have never built a single computer system that can recursively self-improve. Indeed, even the most intelligent system we know of on the planet, the human brain, has made only modest improvements in its cognitive abilities. It is, for example, still as painfully slow today for most of us to learn a second language as it always was. Little of our understanding of the human brain has made the task easier.
Since 1930, there has been a significant and gradual increase in intelligence test scores in many parts of the world. This is called the Flynn effect, after the New Zealand researcher James Flynn, who has done much to identify the phenomenon. However, explanations for this have tended to focus on improvements in nutrition, healthcare and access to school, rather than on how we educate our young people.
There are multiple technical reasons why the technological singularity might never happen. I discussed many of these in my last book. Nevertheless, the meme that the singularity is inevitable doesn't seem to be getting any less popular. Given the importance of the topic (it may decide the fate of the human race), I will return to these arguments in greater detail, and in light of recent developments in the debates. I will also introduce some new arguments against the inevitability of the technological singularity.
My first objection to the supposed inevitability of the singularity is an idea that has been called the 'faster-thinking dog' argument. It considers the consequences of being able to think faster. While processor clock speeds may have plateaued, computers nonetheless still process data faster and faster. They achieve this by exploiting more and more parallelism, doing multiple tasks at the same time, a little like the brain.
There's an expectation that by being able to think longer and harder about problems, machines will eventually become smarter than us. And we certainly have benefited from ever-increasing computer power; the smartphone in your pocket is evidence of that. But processing speed alone probably won't get us to the singularity.
Suppose that you could increase the speed of your dog's brain. Such a faster-thinking dog would still not be able to talk to you, play chess or compose a sonnet. For one thing, it doesn't possess complex language. A faster-thinking dog will likely still be a dog. It will still dream of chasing squirrels and sticks. It may think these thoughts more quickly, but they will likely not be much deeper. Similarly, faster computers alone will not yield higher intelligence.
Intelligence is a product of many things. It takes us years of experience to train our intuitions. And during those years of learning we also refine our ability to abstract: to take ideas from old situations and apply them to novel ones. We add to our common-sense knowledge, which helps us adapt to new circumstances. Our intelligence is thus much more than thinking faster about a problem.
My second argument against the inevitability of the technological singularity is anthropocentricity. Proponents of the singularity place a special importance on human intelligence. Surpassing human intelligence, they argue, is a tipping point. Computers will then recursively be able to redesign and improve themselves. But why is human intelligence such a special point to pass?
Human intelligence cannot be measured on some single, linear scale. And even if it could be, human intelligence would not be a single point, but a spectrum of different intelligences. In a room full of people, some people are smarter than others. So what metric of human intelligence are computers supposed to pass? That of the smartest person in the room? The smartest person on the planet today? The smartest person who ever lived? The smartest person who might ever live in the future? The idea of passing human intelligence is already starting to sound a bit shaky.
But let's put these objections aside for a second. Why is human intelligence, whatever it is, the tipping point to pass, after which machine intelligence will inevitably snowball? The assumption appears to be that if we are smart enough to build a machine smarter than us, then this smarter machine must also be smart enough to build an even smarter machine. And so on. But there is no logical reason that this would be the case. We might be able to build a machine smarter than ourselves. But that smarter machine might not necessarily be able to improve on itself.
There could be some level of intelligence that is a tipping point. But it could be any level of intelligence. It seems unlikely that the tipping point is less than human intelligence. If it were less than human intelligence, we humans could likely simulate such a machine today, use this simulation to build a smarter machine, and thereby already start the process of recursive self-improvement.
So it seems that any tipping point is at, or above, the level of human intelligence. Indeed, it could be well above human intelligence. But if we need to build machines with much greater intelligence than our own, this throws up the possibility that we might not be smart enough to build such machines.
My third argument against the inevitability of the technological singularity concerns meta-intelligence. Intelligence, as I said before, encompasses many different abilities. It includes the ability both to perceive the world and to reason about that perceived world. But it also includes many other abilities, such as creativity.
The argument for the inevitability of the singularity confuses two different abilities. It conflates the ability to do a task and the ability to improve your ability to do a task. We can build intelligent machines that improve their ability to do particular tasks, and do these tasks better than humans. Baidu, for instance, has built Deep Speech 2, a machine-learning algorithm that learned to transcribe Mandarin better than humans.
But Deep Speech 2 has not improved our ability to learn tasks. It takes Deep Speech 2 just as long now to learn to transcribe Mandarin as it always has. Its superhuman ability to transcribe Mandarin hasn't fed back into improvements of the basic deep-learning algorithm itself. Unlike humans, who become better learners as they learn new tasks, Deep Speech 2 doesn't learn faster as it learns more.
Improvements to deep-learning algorithms have come about the old-fashioned way: by humans thinking long and hard about the problem. We have not yet built any self-improving machines. It's not certain that we ever will.
Excerpted with permission from 2062: The World That AI Made, Toby Walsh, Speaking Tiger Books.