Facebook can identify you in a photograph. Shazam can name the song and artist playing on your radio. An autonomous vehicle can identify and track multiple objects (signs, traffic signals, vehicles, and pedestrians), all at speeds faster than any human. At first glance, these Artificial Intelligence (AI) applications seem awe-inspiring. Surely, they’ll be taking over the world soon, right?

#NotSoFast

Why? Because they have nothing to do with human intelligence. AI applications perform their magic by training deep neural networks (DNNs): highly interconnected, matrix-math-intensive models that can only do one thing well, such as identifying a face in a crowd, naming a song, or spotting a person in a crosswalk. And while data scientists describe these DNN architectures using human-brain terminology like neurons, connections, training, and inference, there’s little overlap with how people process information. Just consider that the human brain runs on roughly the power of a 20-watt lightbulb, while training a single large DNN can gobble up enough energy to leave a carbon footprint comparable to that of five cars over their entire lifetimes.
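
To make the “matrix-math-intensive” point concrete, here is a minimal sketch in Python with NumPy. The layer sizes and random weights are made up for illustration; the point is simply that a forward pass through a DNN is nothing more than repeated matrix multiplications and nonlinearities, tuned for exactly one narrow task.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 3-layer network: 128 input features -> 64 -> 32 -> 10 output classes.
# Real DNNs have millions or billions of these weights, learned from data;
# here they are random placeholders just to show the mechanics.
W1, b1 = rng.standard_normal((64, 128)), np.zeros(64)
W2, b2 = rng.standard_normal((32, 64)), np.zeros(32)
W3, b3 = rng.standard_normal((10, 32)), np.zeros(10)

def relu(x):
    return np.maximum(0.0, x)

def forward(x):
    """One forward pass: each layer is a matrix multiply plus a nonlinearity."""
    h1 = relu(W1 @ x + b1)
    h2 = relu(W2 @ h1 + b2)
    logits = W3 @ h2 + b3
    # Softmax turns the final scores into probabilities over the 10 classes.
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

x = rng.standard_normal(128)   # stand-in for a face crop, audio clip, etc.
print(forward(x).round(3))     # 10 class probabilities, summing to 1
```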

If power consumption doesn’t convince you that AI and human brains function differently, let’s look at each from the perspective of robustness. DNNs are fragile. A DNN trained only on the studio recording of Earth, Wind & Fire’s September wouldn’t be able to identify the same song performed by EW&F live. Yet the human ability to identify songs is far more robust. Once we hear a familiar song, we can recognize multiple versions of it. For example, if your friend picked up a guitar and started playing September, not only would those who knew the song recognize it, regardless of key or tempo, they’d also be able to sing along.
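
The key-and-tempo point can be made with a toy example. The sketch below (plain Python/NumPy, with a made-up melody standing in for the September hook, and a deliberately naive “model”) matches songs by comparing absolute pitches, the way a narrowly trained recognizer might memorize one specific recording. Transpose the melody into a new key and the exact-match model fails, while comparing the intervals between notes, closer to how people recognize a tune, still succeeds. This is not how Shazam actually works; it is only an illustration of fragile versus robust matching.

```python
import numpy as np

# A melody as MIDI note numbers (a hypothetical stand-in for the hook).
studio = np.array([64, 66, 68, 71, 68, 66, 64])

# Your friend plays it on guitar, up a minor third (3 semitones).
live = studio + 3

def exact_match(known, heard):
    """A brittle 'model' that only recognizes the exact pitches it memorized."""
    return np.array_equal(known, heard)

def interval_match(known, heard):
    """Compare successive intervals instead of absolute pitches: key-invariant."""
    return np.array_equal(np.diff(known), np.diff(heard))

print(exact_match(studio, live))     # False - the memorized version doesn't transfer
print(interval_match(studio, live))  # True  - the relative shape of the tune survives
```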

Deep learning has tremendous advantages over humans when it comes to processing large amounts of data, but if those calculations ever hope to approximate the robustness of the human brain, AI scientists will need to change their perspective. The authors of the academic paper Neuroscience-Inspired Artificial Intelligence explain the magnitude of the gap succinctly:

“Human cognition is distinguished by its ability to rapidly learn about new concepts from only a handful of examples, leveraging prior knowledge to enable flexible inductive inferences.” (p. 259)

Human intelligence is built upon interacting with the environment around us. We gather information through our five senses, compare that information with prior knowledge, and determine the best course of action. We react to new situations through an age-old sequence:

  1. Assess fatal threats.
  2. Once those are eliminated, seek to increase pleasure and avoid pain.

And while step #2 mimics the reinforcement learning (carrot and stick) techniques used to train DNNs, the stakes associated with being right or wrong are infinitely higher for a living, breathing human. Until AI applications can incorporate human emotion into their models, machine intelligence will forever remain limited.

DNNs are handicapped by a fundamental flaw: they neglect the most important part of human intelligence, the human condition. AI scientists understand these limitations and are looking to other disciplines, namely neuroscience and psychology, for inspiration to close the gap. But rather than looking at human intelligence holistically, they pursue algorithmic solutions from a bottom-up perspective. The best way for data scientists to capture the robustness of the human brain is to look at their experiments through the eyes of a storyteller.

A story is the result of people pursuing what they want…and we all want to stay alive. It’s Maslow’s hierarchy of needs: we must satisfy our physiological and safety needs before anything else. We can’t worry about the future until we’re confident that we can survive the present.

This fundamental need drives our actions. Instinct alerts us to mortal threats. Our attention is drawn to things that defy our expectations. Our most vivid memories are built on experiential extremes (first/last, hottest/coldest, happiest/saddest), while our ability to bond with other humans is built upon shared experiences.

Humans have a superpower: the ability to react appropriately to situations we’ve never experienced before. Therefore, if achieving the robustness of human intelligence is truly the goal, AI scientists must expand the scope of their exploration beyond the disciplines of pure logic and seek the counsel of storytellers. Storytellers study human nature: the actions of autonomous beings as they journey through the great game of life. They study these actions in the context of the environments people live in. They understand how people assess a situation, thing, or idea, form hypotheses, test those hypotheses, and then act on what they’ve learned. And all of those actions depend on the human capacity to weigh risks against rewards while experiencing profoundly complex concepts such as love, hate, fear, and exhilaration.

Until DNNs can incorporate these complex concepts, their relative intelligence will remain artificial.

AI scientists, meet the storytellers. Storytellers, meet the AI scientists.

Photo Credit: Lee, Russell, photographer. Farmer teaching his six-year-old son to drive his tractor. Tulare County, California, February 1942. Photograph. Library of Congress, https://www.loc.gov/item/2017817214/.