It's been 18 years since Deep Blue beat the reigning world champion chess player. It’s been four years since Watson beat the all-time “Jeopardy!” champions. But true artificial intelligence – a computer system that is at least as good as a human at solving problems through reasoning, planning, and abstract thinking – hasn’t come about yet.
Some computer scientists and business leaders predict that a breakthrough is just around the corner; others think that while computers will continue to improve in specific areas, they may never be able to achieve human-like intelligence.
Google is in the former camp, and on Monday, the company took a big step toward building (or letting someone else build) artificial intelligence. Google open-sourced part of TensorFlow, the “deep learning” engine that powers the company’s speech recognition, its Photos app, its translation services, and more.
TensorFlow is a library of algorithms that allows Google to train computer systems, called “neural networks,” to think and learn similarly to the way humans do. The neural networks perform complex mathematical operations on arrays of data, called tensors, discovering patterns and relationships as they chew through all the information that’s available.
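The "tensors" here are nothing exotic: they are multi-dimensional arrays of numbers, and a single neural-network layer is just a matrix multiplication plus a nonlinearity applied to them. The sketch below uses plain numpy rather than TensorFlow itself, and all the numbers are made up for illustration.

```python
import numpy as np

# A "tensor" is just an n-dimensional array of numbers.
# Here: a batch of 4 inputs, each with 3 features.
inputs = np.array([[0.2, 0.8, -0.5],
                   [1.0, -0.1, 0.3],
                   [-0.4, 0.6, 0.9],
                   [0.7, 0.2, -0.8]])

# One neural-network layer: multiply by a weight matrix, add a bias,
# then apply a nonlinearity (here, ReLU). Training means adjusting
# these weights so the outputs match the patterns in the data.
weights = np.array([[0.1, -0.3],
                    [0.5, 0.2],
                    [-0.2, 0.4]])
bias = np.array([0.05, -0.1])

hidden = np.maximum(0.0, inputs @ weights + bias)
print(hidden.shape)  # (4, 2): 4 inputs, each mapped to 2 learned features
```

Deep networks stack many such layers, and engines like TensorFlow handle computing them efficiently and adjusting the weights automatically.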
A neural network allows the Google Photos app to learn the relationship between an object’s name and its appearance – so it can identify a cat within a photo, based on similar pictures it’s seen before. Another neural network allows the Google Translate app to learn how words are commonly used in conversation, so it can produce less-stilted translations.
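That "learning from examples it's seen before" boils down to repeatedly nudging the network's weights to reduce its mistakes. The toy below is not TensorFlow and is far simpler than anything behind Photos or Translate – it trains a one-layer classifier by gradient descent on made-up, clearly separated data – but the principle is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled examples: two clusters of 2-D feature points,
# standing in for "cat" (label 1) and "not cat" (label 0) images.
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),
               rng.normal(+1.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)   # weights, adjusted as the model "sees" examples
b = 0.0
lr = 0.5          # learning rate: how big each adjustment is

for _ in range(200):
    # Forward pass: predicted probability that each example is a "cat".
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    # Gradient of the error with respect to the weights and bias.
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    # Nudge the weights to make the error smaller.
    w -= lr * grad_w
    b -= lr * grad_b

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print("training accuracy:", np.mean(preds == y))
```

TensorFlow automates exactly this loop – forward pass, error gradient, weight update – for networks with millions of weights instead of two.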
For all the progress Google has made in deep learning over the past few years, the company thinks the technology can be improved by letting others help out.
“We hope [open-sourcing TensorFlow] will let the machine learning community—everyone from academic researchers, to engineers, to hobbyists—exchange ideas much more quickly, through working code rather than just research papers,” Google CEO Sundar Pichai wrote in a blog post. “And that, in turn, will accelerate research on machine learning, in the end making technology work better for everyone.”
Google says TensorFlow has been built in such a way that it isn’t tightly tied to the company’s hardware – so researchers can put the machine learning engine to work on their own computers, letting it crunch through additional data sets. TensorFlow is more flexible, more configurable, and about twice as fast as its predecessor, the DistBelief deep learning system, senior Google fellow Jeff Dean and TensorFlow technical lead Rajat Monga wrote on Monday.
Google has been bullish on machine learning for some time. The company hired Geoff Hinton, a pioneer in the field, in 2013 to lead an in-house machine learning team. And Alphabet chairman Eric Schmidt wrote in an op-ed in September that machine learning is on the cusp of revolutionizing the way we approach big problems in climate science, energy, and genomics.