I was recently on a panel hosted by FundingPost that discussed trends in Artificial Intelligence, how it relates to startups, and some items for consideration. Below is a summary of my thoughts.
For the last five years, Big Data has been all the rage. The reality is that corporations have been swimming in data for decades. Many startups launched under the premise that the data they were collecting on users was valuable and unique. But data is not valuable unless one can glean insights from it. The reality is, it's not the size of the data but how you use it!
The Big Data trend has evolved into Artificial Intelligence (AI). The reality is that big data and machine learning have always been subsets of AI. Machine learning, in its various forms, has become the primary method of data-science analysis. One of the major applications we see is natural language processing, or NLP. It's funny how NLP was big in the world of personal development (think Tony Robbins and the Mystery Method), and now we have the analog in the machine world!
There are three waves, or stepping stones, of Artificial Intelligence, in which AI evolves from rules-based analysis to abstraction- and thought-based analysis. The three waves are Handcrafted Knowledge, Statistical Learning, and Contextual Adaptation. I will credit DARPA for this elegant nomenclature. The goal is to eventually have machines or programs operate similarly to how the human brain does: model the system as a brain, don't model the world. Our brains do complex things like understand speech and perceive objects through practice and feedback, not through a collection of exhaustive rules.
The first wave of AI is Handcrafted Knowledge. This is where experts from various fields have taken domain knowledge and characterized it in rules that a computer can process. Examples of this can be seen in logistics (scheduling inventory and shipping), game playing (chess or Backgammon), and tax preparation (TurboTax, which takes the complexities of the tax code and turns them into rules that can be processed).
Handcrafted knowledge enables reasoning over a narrowly defined set of problems. However, these systems fall short on the other critical components of AI: Perceiving, Learning, Abstracting, and Reasoning. There is no learning capability and poor handling of uncertainty. Still, this level of AI remains relevant to the majority of corporations and startups in the world today.
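To make the first wave concrete, here is a minimal sketch of handcrafted knowledge in the TurboTax spirit: an expert-authored rule set encoded as explicit logic. The bracket thresholds and rates are hypothetical, purely for illustration; the key point is that every rule is written by hand and the program never learns or adapts.

```python
def tax_owed(income):
    """Apply hand-written progressive-tax rules (hypothetical brackets)."""
    # Each rule: (upper bound of bracket, rate applied within it).
    brackets = [(10_000, 0.10), (40_000, 0.20), (float("inf"), 0.30)]
    owed, lower = 0.0, 0
    for upper, rate in brackets:
        if income > lower:
            # Tax only the slice of income that falls in this bracket.
            owed += (min(income, upper) - lower) * rate
            lower = upper
    return owed

print(tax_owed(50_000))  # 1000 + 6000 + 3000 = 10000.0
```

The system "reasons" perfectly within its narrow domain, but a new tax rule means a human must edit the code: there is no perception, learning, or abstraction.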
That brings us to the second wave of AI: Statistical Learning. Applications here include voice and facial recognition and the ability to sort photos. We can see it in Facebook, Google Photos, etc., where it is a little scary just how well they recognize who's who in pictures!
In this category of AI, there is a nuanced ability to classify and predict from data, but no ability to understand context, nor any capability to reason. Here, we create a system so that a computer can learn: engineers build statistical models that characterize specific problems and then train them on big data. These systems are very good at Perceiving (separating one face from another, or one vowel sound from another) and at Learning (they can adapt to different situations). However, they are limited when it comes to Reasoning and Abstraction: they have no capability to take information from one domain and apply it in another.
This segment is effectively a spreadsheet on steroids: layers of computation are stacked on top of data, and calculation happens layer after layer. This wave of AI has numerous excellent applications, particularly in cybersecurity (analyzing code and network flows) and in the electromagnetic spectrum (using spectrum efficiently to meet wireless data demand).
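The contrast with the first wave can be sketched in a few lines. Instead of hand-written rules, a statistical model is fit to labeled examples; here a single logistic "neuron" learns a decision boundary by gradient descent. The data points and parameters are made up for illustration, and a real system would stack many such layers, but the shape is the same: train on data, then predict.

```python
import math

# Toy labeled training data: (feature, label) pairs, e.g. two sound classes.
data = [(0.1, 0), (0.3, 0), (0.7, 1), (0.9, 1)]

w, b = 0.0, 0.0                                 # parameters learned from data
for _ in range(2000):                           # gradient-descent training loop
    for x, y in data:
        p = 1 / (1 + math.exp(-(w * x + b)))    # one layer of computation
        w -= 0.5 * (p - y) * x                  # nudge weights to reduce error
        b -= 0.5 * (p - y)

def predict(x):
    """Classify a new input with the trained model."""
    return 1 / (1 + math.exp(-(w * x + b))) > 0.5

print(predict(0.2), predict(0.8))  # False True
```

Notice that nobody wrote a rule separating the two classes; the boundary emerged from the data. That is exactly why these systems perceive and learn well, yet cannot explain or transfer what they learned.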
The second wave of AI does well in many instances; however, it suffers from being statistically impressive overall but individually unreliable. For example, when a baby holds a toothbrush, a computer may caption the picture as "a young boy holding a baseball bat." The system needs specific input about what data it is being exposed to.
That brings us to the third wave of AI: Contextual Adaptation. Systems over time build explanatory models that allow them to characterize real-world phenomena. This will involve building the ability to understand, and have clarity about, why the machine is making certain decisions:
· "I understand why, and why not"
· "I know when you'll succeed or fail, and why"
· "I know when to trust you"
Over time, the system will learn how its models should be structured:
· It will perceive
· It will reason
· It will abstract, taking data further
This is really the frontier of AI, and it's where we are headed. It is where a computer will eventually be able to pass the Turing Test, where a human will not know whether it is interacting with a computer or another human.