Over the past few years we’ve witnessed a tectonic shift in computing. Until now, there have been two major phases in the evolution of computing: The Tabulating Era and The Programming Era. Slowly but surely we have entered a third era of computing—the Cognitive Era—which is by far the most transformational.
The Tabulating Era (1900s—1940s): Modern computing began with single-purpose mechanical systems that were little more than sophisticated calculators. These machines used punch cards both to input data and, in rudimentary fashion, to instruct the machine to perform calculations.
The Programming Era (1950s—Present): This is the era that today’s computing professionals have largely grown up in. The Programming Era birthed the basic structure and interpretation of modern computer programs: if-then clauses, iteration loops, and other logical operations. The invention of the transistor and the accompanying advancements associated with Moore’s Law have made programmable machines enormously capable.
Every computing device that exists today—a smartphone, tablet, laptop, desktop, or smart watch—is a programmable computer.
The defining trait of a programmable machine is that it can never accomplish more than what a human has programmed it to do. Every program is limited by the creativity, knowledge, and reasoning ability of its author: a human can only program a computer to do what he or she already knows how to do. In other words, a program can only be as good as the programmers who write it.
The Cognitive Era (2017—): By contrast, the Cognitive Era represents a seismic paradigm shift in computing capabilities and architecture. AlphaGo is a widely recognized milestone that marked the beginning of the Cognitive Era: almost everyone remembers when Google DeepMind’s AlphaGo defeated Lee Sedol, winner of 18 world titles and widely considered the greatest Go player of recent history.
AlphaGo’s victory is significant because the game of Go has too many permutations for human programmers to write logical instructions covering every scenario that could occur. It transcended a major limitation of the Programming Era: programmers who couldn’t enumerate all the if-then-else scenarios instead built a machine that could self-learn and adapt, and that machine defeated the world’s best player.
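To make the idea of self-learning concrete, here is a minimal sketch in the same spirit, using tabular Q-learning on a toy “corridor” task. This is a deliberate simplification: AlphaGo’s actual training combined deep neural networks with Monte Carlo tree search, and the environment, states, and reward below are invented purely for illustration. The point is that no move is hand-coded; the program discovers the winning behavior from a reward signal alone.

```python
import random

N_STATES = 5          # positions 0..4; position 4 is the goal
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

# Q-table: learned estimate of long-run value for each (state, action) pair.
# It starts at zero -- nothing about the solution is programmed in.
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def choose(state):
    """Epsilon-greedy: mostly exploit what was learned, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    best = max(Q[(state, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(state, a)] == best])

random.seed(0)
for episode in range(200):
    state = 0
    while state != N_STATES - 1:
        action = choose(state)
        nxt = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if nxt == N_STATES - 1 else 0.0
        # Nudge the estimate toward reward plus discounted future value.
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = nxt

# The greedy policy the agent learned: which way to move from each interior state.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)}
print(policy)
```

After training, the learned policy moves right (+1) from every interior state, even though no rule in the code ever says “move right”; the behavior emerges from experience, which is precisely what enumerated if-then-else programming cannot do.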
Cognitive Computing Enables Intelligence Augmentation at Unprecedented Scale
In our post-AlphaGo world, the big shift in Cognitive Computing is the movement from deterministic systems to adaptive, self-learning systems. Cognitive computing changes the paradigm of how computers learn what they are supposed to do, how they reason when faced with wholly new scenarios not explicitly captured in their programming, and how they interact with humans.
“Cognitive Computing allows computers to transcend the predefined limitations of their programming, adapt to new inputs and scenarios, and make decisions probabilistically.”
Cognitive systems are inherently probabilistic and statistical. Conceptually, these systems start to reflect how humans learn. The scaffolding is in place to power previously inconceivable applications that can meaningfully augment and aid people in their daily lives as well as in the knowledge work at the heart of today’s information economy.
Add in the fact that 90% of the world’s data was generated in the past two years, and that 80% of all data is unstructured, and you start to see why adaptive cognitive systems could be a tremendous asset.
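The difference between a deterministic rule and a probabilistic, learned system can be sketched in a few lines. The example below uses a hypothetical spam-filtering task with invented toy data: a hand-written rule versus a tiny naive Bayes classifier that learns word statistics from labeled examples of unstructured text and outputs a probability rather than a fixed yes/no.

```python
import math
from collections import Counter

# Programming-era style: a deterministic, hand-written rule.
# It can never recognize spam the programmer didn't anticipate.
def rule_based_is_spam(message):
    return "winner" in message or "prize" in message

# Cognitive-era style: learn word statistics from labeled examples.
# (The training data here is invented for illustration.)
train = [
    ("claim your prize now", 1),
    ("you are a winner", 1),
    ("free prize inside", 1),
    ("meeting agenda attached", 0),
    ("lunch at noon", 0),
    ("project status update", 0),
]

counts = {0: Counter(), 1: Counter()}   # word counts per class
totals = {0: 0, 1: 0}                   # total words per class
for text, label in train:
    for word in text.split():
        counts[label][word] += 1
        totals[label] += 1

def p_spam(message):
    """Posterior probability of spam under naive Bayes with add-one smoothing."""
    vocab = set(counts[0]) | set(counts[1])
    log_odds = 0.0  # equal class priors in the toy data
    for word in message.split():
        p1 = (counts[1][word] + 1) / (totals[1] + len(vocab))
        p0 = (counts[0][word] + 1) / (totals[0] + len(vocab))
        log_odds += math.log(p1 / p0)
    return 1 / (1 + math.exp(-log_odds))

print(p_spam("free prize"))      # high probability of spam
print(p_spam("status update"))   # low probability of spam
```

The classifier’s answer is a probability that shifts as the training data changes, and it degrades gracefully on words it has never seen, whereas the hard-coded rule is frozen at whatever its author knew when it was written.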
The Programming Era was all about machines that calculate answers to prescribed scenarios, that ingest structured inputs, and that yield predefined outputs. By contrast, the Cognitive Era is about machines that make decisions probabilistically, that are capable of self-learning, reasoning, and evolving, and that can handle unstructured inputs and yield dynamic outputs. It’s not surprising that the World Bank characterizes the advent of Cognitive Computing as the Fourth Industrial Revolution.
“The Fourth Industrial Revolution…poised to have profound impact across markets and applications.”
—The World Bank