Artificial Intelligence Has Been Democratized: Finally, This Time It Is Real

The predictive power of Artificial Intelligence (AI) has proven itself, and its value is no longer under scrutiny. This should explain why AI adoption has been so widespread in recent years. The technology has been applied to many use cases: finance (determining which applicants get loans), medicine (interpreting X-rays, ECG scans, and Electronic Medical Records), understanding human speech (smart speakers such as Alexa), and the movie and book recommendation engines used by the likes of Netflix and Amazon. The list goes on and on. Truthfully, it is probably easier to keep track of the applications that have yet to embrace AI than of those that have. The Wall Street Journal recently ran phenomenal coverage of how several prominent Silicon Valley venture capitalists have developed AI models able to predict the success probability of their portfolio companies. That was followed by another article examining the possible role AI can play in mental health delivery. What makes these projects interesting is that they are (in my humble opinion) unexpected.

Most enterprises can be assured that adopting Machine Learning (the most popular form of Artificial Intelligence) will lead to significant operational efficiencies, provided the following three conditions hold:

  1. Predicting future outcomes is of value and can be monetized
  2. They are in possession of historical operational data that can be used to train Machine Learning (ML) models
  3. They are willing to go beyond their comfort zone and make the nominal investment needed to test the waters

Gaining proficiency in Machine Learning is non-trivial. The domain is complex and, to say the least, hard to master. The complexity stems from its multi-dimensionality: gaining competence in ML requires core skills in data science coupled with depth in math, statistics, computer programming, and cloud computing. This should explain why forming an effective data science team is challenging, expensive, and time-consuming. The magic of the technology (correctly predicting outcomes) is what trained models produce, but it takes a bit of doing until that happens. One can think of a Machine Learning model as a computer program that runs on machines either on premises or hosted by a cloud-computing provider (e.g., Amazon Web Services). It is fed inputs, and it produces predictions based on those inputs. Initially the model is incapable of performing, but it can be trained by having it chew on mountains of historical data. During this training process, it discovers past trends that are likely to repeat in the future and applies that acquired know-how to predict future outcomes. While the process is complex and arduous, the basic logic is not.
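To make that basic logic concrete, here is a minimal sketch of the train-then-predict cycle, written in Python with the popular scikit-learn library. The loan-history file and its column names are hypothetical placeholders invented for illustration, not a reference to any real dataset.

    # A minimal sketch of the train-then-predict cycle described above,
    # written with scikit-learn. File and column names are hypothetical.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Historical operational data: past loan applicants and their outcomes.
    history = pd.read_csv("loan_history.csv")
    X = history[["income", "credit_score", "debt_ratio"]]
    y = history["repaid"]  # 1 = repaid, 0 = defaulted

    # Hold out a slice of history to check whether the model actually learned.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    # Training: the model "chews on" historical data to discover past trends.
    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

    # Prediction: the acquired know-how is applied to a brand-new applicant.
    new_applicant = pd.DataFrame(
        {"income": [52000], "credit_score": [710], "debt_ratio": [0.31]}
    )
    print("Predicted outcome:", model.predict(new_applicant))

A real project would, of course, involve far more data and evaluation, but the fit-then-predict shape of the workflow is exactly this simple.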

The irony is that in a typical Machine Learning project, building and training models (the fun part that geeks like) accounts for merely 10% to 15% of the project time. The remaining time (the most labor-intensive and least interesting part) goes to cleaning and reformatting the training data. Without ample, clean training data, all bets are off.
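As a small illustration of what consumes that other 85% to 90%, the sketch below shows typical cleaning and reformatting chores using the pandas library; the raw export file and its column names are hypothetical.

    # Illustrative data-cleaning chores, written with pandas.
    # The raw export and its column names are hypothetical.
    import pandas as pd

    raw = pd.read_csv("raw_operations.csv")

    raw = raw.drop_duplicates()  # remove repeated records
    raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")  # repair mistyped numbers
    raw["date"] = pd.to_datetime(raw["date"], errors="coerce")     # normalize date formats
    raw = raw.dropna(subset=["amount", "date"])  # discard rows that cannot be repaired
    raw["region"] = raw["region"].str.strip().str.title()  # unify inconsistent labels

    # Reformat categorical columns into numeric ones a model can consume.
    clean = pd.get_dummies(raw, columns=["region"])
    clean.to_csv("training_data.csv", index=False)

Each line looks trivial in isolation; multiplied across dozens of messy columns and sources, this is where the months go.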

The complexities presented above often discourage small and medium-sized businesses from embracing AI. They view the technology as a luxury that only behemoths such as Amazon, Netflix, Apple, Google, and Facebook can enjoy.

While this notion was traditionally valid, it is DEFINITELY no longer the case. The pace of innovation in Machine Learning in the past few years has been mind-boggling, and one of its main fruits is the availability of new tools and platforms that dramatically diminish the complexity and cost of adopting Machine Learning. The scale of simplification is substantial, and the time savings are significant.

Welcome to the era of “No Code AI Platforms”, also known as “AutoML”. Companies such as Dataiku, DataRobot, Domino, H2O.ai, and RapidMiner have enjoyed varying levels of success launching platforms that enable teams with little or no IT or data engineering experience to build AI-powered applications and integrate them into their workflows. Some of their customers have generated truly impressive ROIs in weeks instead of months or even years. Major cloud-computing companies have not been idle either and have jumped on the bandwagon, enriching their traditional Machine Learning product portfolios with various automation tools. “SageMaker Autopilot” by Amazon Web Services, “AutoML” by Google Cloud, and “AI Builder” by Microsoft are some examples.
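To give a sense of how far these platforms compress the workflow, here is a rough sketch using the open-source AutoML module from H2O.ai (one of the vendors named above), which trains and ranks many candidate models automatically. The dataset and its “repaid” target column are hypothetical carry-overs from the earlier sketches.

    # A rough sketch of an automated model search with H2O AutoML.
    # The CSV file and the "repaid" target column are hypothetical.
    import h2o
    from h2o.automl import H2OAutoML

    h2o.init()  # start a local H2O cluster

    train = h2o.import_file("training_data.csv")
    target = "repaid"
    features = [col for col in train.columns if col != target]
    train[target] = train[target].asfactor()  # treat the target as a class label

    # Train and compare many candidate models automatically, for up to 10 minutes.
    aml = H2OAutoML(max_runtime_secs=600, seed=1)
    aml.train(x=features, y=target, training_frame=train)

    print(aml.leaderboard.head())      # ranked list of the models it tried
    best = aml.leader                  # the top-performing model
    print(best.predict(train).head())  # predictions from the best model

A dozen lines now stand in for what used to be weeks of manual model selection and tuning.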

There is also a whole host of new entrants, such as Ushur, BRYTER, Signzy, RunwayML, Fritz AI, and a few others, that have chosen to tackle this massive opportunity as well. They have raised decent VC funding rounds to finance their efforts and seem to be doing remarkable work. As for differentiation, each has chosen to tackle the problem in its own unique way or has opted to address a specific slice of the market.

Despite the impressive power of AutoML tools, companies will still need some data science and IT expertise. It is naive to expect that one can build an effective Machine Learning solution with a few mouse clicks in a few days. It is, however, very reasonable to expect a massive reduction in the time and effort needed to build a meaningful solution compared to two years ago. To bridge the skill-set gap, leading AutoML vendors have formed prolific networks of partners and consultants who can do the heavy lifting for clients that are unable or unwilling to build significant in-house expertise.

While blind adoption of Machine Learning is not advisable, companies have little to lose and much to gain if they develop at least a “proof of concept”. Thanks to “No Code AI Platforms”, this can be done quickly and inexpensively.

Al Gharakhanian