
Dawn of the Predictive Era

Written by Wael Elrifai | Apr 10, 2017


Computing is evolving. It’s likely that the winding down of the forces described by Moore’s Law and Dennard scaling will change many of the ideas we’ve associated with technological advancement. The big story in 2025 won’t be faster, cheaper, and smaller computers; it will be prediction, and where we apply it.
Most readers will be old enough to remember what photography was like in 1995. We bought film, took photos, and hoped for the best when we dropped our handiwork off to be developed. The idea of taking 20 snaps of the same subject in the hope of getting a single good shot was a luxury only professionals could afford. Photography was fundamentally a chemical engineering problem, and an expensive one at that. Today storage is larger and cheaper, displays are far more advanced, and home printers can do work previously restricted to professional photography studios. What was fundamentally a chemical engineering problem was reframed as a home computing one, driven by the dramatic reduction in the price of computing technology.
Similar forces are at work when it comes to prediction.
As computation and storage costs fall over the next decade and predictive techniques like machine learning rapidly improve, we’ll see the predictive ‘toolbox’ applied in ways never before seen. Just a few years ago, the only place we could consider self-driving cars was in a highly controlled environment like a warehouse, where we could program every possible scenario deterministically (not predictively). Today the ability to make accurate predictions means that self-driving cars are being tested in ‘normal’ environments that include pedestrians, drunk drivers, and damaged road signs. Problems that once had to be solved by brute force can now be solved tractably in a probabilistic (predictive) manner. As the cost of prediction continues to plummet, you’ll see it applied in all sorts of new places.

WHY NOW? THE RULE OF THREES!

First, many of the mathematical techniques required to make prediction possible took off in the 1970s, as artificial neural networks advanced along with better methods of training them.
Second, the availability of virtually unlimited and inexpensive storage, compute, and communications infrastructure in the form of cloud computing makes the mathematics mentioned above computationally feasible, moving these techniques from theory into practice.
Third, and perhaps most visible to those of us on the front line of corporate technology, is the diminishing return on ERP investment. Simply capturing data and streamlining business-process workflows yielded enormous benefits over recent decades, but those gains are just about exhausted. We’ve reached a new steady state in which effective ERP implementation is so prevalent that it is no longer a key differentiator.
So what do these three add up to? The dawn of the Predictive Era!  In this era, the ability to onboard data, perform feature engineering (data engineering), train predictive models, and deploy/orchestrate those models is becoming a core competency for every business. Along with new applications for predictive analytics, organisations will benefit from greater returns on complementary skills such as data science.
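To make that workflow concrete, here is a minimal sketch of the onboard, feature-engineer, train, and score loop in Python using pandas and scikit-learn; the file name, column names, and tool choice are invented for illustration and are not specific to Pentaho or to any particular product.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # 1. Onboard data (hypothetical file of daily machine sensor readings).
    df = pd.read_csv("sensor_readings.csv")

    # 2. Feature engineering: derive model inputs from raw columns (names assumed).
    df["temp_delta"] = df["temp_out"] - df["temp_in"]
    features = df[["temp_delta", "vibration", "run_hours"]]
    target = df["failure_within_30d"]

    # 3. Train a predictive model, holding out some data to check its accuracy.
    X_train, X_test, y_train, y_test = train_test_split(features, target, test_size=0.2)
    model = RandomForestClassifier(n_estimators=100)
    model.fit(X_train, y_train)
    print("hold-out accuracy:", model.score(X_test, y_test))

    # 4. Deploy/orchestrate: score new records as they arrive.
    print(model.predict_proba(X_test.head(5)))

In practice, step 4, deploying and orchestrating models against live data, is where much of the engineering effort tends to land, which is exactly where orchestration tooling earns its keep.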
Here at Pentaho we’ve been on the front lines of these innovations, and we’ve always tried to keep our tools pragmatic, practical, and profitable for data scientists and the innovative companies they work for. Our drag-and-drop data integration, visualisation, and machine learning orchestration capabilities mean that you can spend more time tuning your predictive algorithms (in the tool of your choice) and less time feature engineering and operationalising them. Our enterprise platform empowers data engineers and data scientists, who often work in silos, to collaborate instead, dramatically accelerating the development cycle and reducing time-to-value for the business.
The nature of applying machine learning to build a predictive model that gets smarter over time is that the sooner you get started, the more distance you put between your company and your competitors. Our customer Caterpillar Marine recognised this and used Pentaho to outperform the 71 percent of companies benchmarked by Ventana Research that fail to model their event patterns. Caterpillar Marine not only improved productivity and safety and dramatically lowered maintenance costs; it also won a Technology Innovation Leadership Award for IoT from Ventana Research.
In another example, Hitachi Rail uses Pentaho with Hitachi’s Hyper Scale-Out Platform to fulfil its pioneering “Trains-as-a-Service” concept, applying advanced IoT technology across three time horizons: real-time (monitoring and fault alerting), medium-term (predictive maintenance), and long-term (big-data trend analysis). With each train carrying thousands of sensors that generate huge amounts of data every day, the project’s data engineers and scientists face many of the challenges associated with big data and machine learning. Although the project is not yet operational, Pentaho is already helping to deliver productivity improvements across the business.
So what are you waiting for? Now is the time to embrace the Predictive Era and start reaping the benefits! Learn more about machine learning orchestration.
Wael Elrifai | Director of Enterprise Solutions, Pentaho


