Controlling weather with a hammer?


Thomas Hill and Paul Lewicki consider the outlook and trends in predictive and advanced analytics


with a hammer.’ Although slightly flippant, the remark summarises in a single sentence the challenges and opportunities for modern


A


analytics to drive productivity and quality of life: l Enabling an understanding and eventual predictive knowledge and control over extremely complex and ‘big’ data problems


• Connecting this understanding in some way to a very simple, purpose-driven user or automated interface


• Connecting that interface in real time to drive effective action to improve processes and outcomes

At StatSoft (now a part of Dell), we have


observed and, we believe, often led the journey to effective analytics systems and solutions over the past two decades. Here is a list of the top changes and trends that we believe are driving the current and next waves of analytics technologies, enabling solutions that in many ways are more remarkable than a 'hammer that controls the weather' would be.


More data drives new analytic technologies
The prices for collecting and storing more data continue to fall. Today's big data platforms such as Hadoop allow organisations and individuals to store 'everything.' The technology and solutions for analysing text ('unstructured data') are now quite mature. But the ability to collect data will always outstrip the ability to analyse it and to extract meaningful information from it. Analysing high-definition pictures and movies, or streaming conversations, in real time is the new and current challenge.
What's next? While data grows exponentially


and indefinitely, the information contained in data does not. We believe that statistics will be rediscovered – or, more specifically, that new kinds of statistics will be developed that combine the idea of exploratory mining with proven statistical traditions such as highly multivariate experimental designs. These methods could be repurposed to address the continuing challenge of insufficient computing resources to process data. In fact, one might argue that statistics was invented to allow for inference about data (populations) too large to process entirely, and experimental design was invented to query data (real-world observations) to extract the maximum amount of information.

[Image caption: StatSoft's products have been delivering effective analytics for two decades or more]
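The point about inference from a sample rather than the full population can be shown in a small sketch (pure Python; the 'population' and all the numbers in it are synthetic, invented purely for illustration):

```python
import random
import statistics

random.seed(42)

# A "population" too large to inspect item by item in practice;
# here, a million synthetic sensor readings.
population = [random.gauss(mu=50.0, sigma=5.0) for _ in range(1_000_000)]

# Classical statistics: estimate the population mean from a small
# random sample instead of processing everything.
sample = random.sample(population, 1_000)
estimate = statistics.mean(sample)
stderr = statistics.stdev(sample) / (len(sample) ** 0.5)

print(f"sample estimate: {estimate:.2f} +/- {1.96 * stderr:.2f}")
print(f"true mean:       {statistics.mean(population):.2f}")
```

Processing 0.1 per cent of the data yields an estimate of the mean, with a quantified uncertainty, that is for most purposes as useful as the exact answer.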


More data, more models, fewer analysts
The most effective predictive modelling and process monitoring ('quality control') algorithms are ensembles: large numbers of prediction models, such as neural nets and decision trees, together yield the most accurate predictions. In automated manufacturing and high-dimensional process monitoring, large numbers of rule-based, model-based, and uni/multivariate control charts are most effective when one needs to watch tens of thousands of parameters in real time.
What's next? The modelling process itself will have to become automated and 'intelligent.' It is not realistic that a few experienced modellers will build and maintain hundreds or thousands of prediction models (e.g. one for each patient in a hospital, to determine the most promising treatment regimen). Automated, dynamic modelling technologies for building and recalibrating models (as new data become available, sometimes recalibrating and learning in real time) are needed and are emerging.
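The claim that ensembles out-predict single models can be illustrated with a deliberately simple sketch: three independent, weak classifiers combined by majority vote (pure Python; the 70 per cent accuracy figure and the simulated data are invented for illustration and are not StatSoft's actual algorithms):

```python
import random

random.seed(1)

P_CORRECT = 0.7   # each weak model alone is right 70% of the time
N = 20_000        # number of simulated cases

def weak_prediction(label):
    # An independent, noisy model: returns the true label with
    # probability P_CORRECT, the wrong one otherwise.
    return label if random.random() < P_CORRECT else 1 - label

correct_single = 0
correct_vote = 0
for _ in range(N):
    label = random.randint(0, 1)
    preds = [weak_prediction(label) for _ in range(3)]
    vote = 1 if sum(preds) >= 2 else 0   # majority vote of the ensemble
    correct_single += preds[0] == label
    correct_vote += vote == label

single_acc = correct_single / N
vote_acc = correct_vote / N
print(f"single model: {single_acc:.3f}, majority vote: {vote_acc:.3f}")
```

With three independent 70-per-cent-accurate models, the vote is expected to be right about 78 per cent of the time (p^3 + 3p^2(1-p)), and the gain grows as more, and more diverse, models are added; this is why keeping the models' errors uncorrelated matters so much in ensemble methods.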
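A minimal sketch of a model that recalibrates itself as each new observation arrives, in the spirit of the automated, dynamic modelling described above: online least squares via stochastic gradient descent (pure Python; the simulated process y = 3x + 2 and the learning rate are invented for illustration):

```python
import random

random.seed(7)

class OnlineLinearModel:
    """y ~ w*x + b, recalibrated one observation at a time."""

    def __init__(self, learning_rate=0.01):
        self.w, self.b, self.lr = 0.0, 0.0, learning_rate

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        # Fold one new (x, y) observation into the model as soon
        # as it arrives: a single gradient step on squared error.
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = OnlineLinearModel()

# Simulate a live data stream from an unknown process y = 3x + 2 + noise.
for _ in range(5_000):
    x = random.uniform(-1, 1)
    y = 3 * x + 2 + random.gauss(0, 0.1)
    model.update(x, y)

print(f"learned: w={model.w:.2f}, b={model.b:.2f}")
```

No analyst intervenes at any point: the model tracks the process simply by continuing to fold in new observations, which is the behaviour needed when thousands of such models must be maintained at once.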


Prescriptive analytics, decisions, actionable results
More data enable more models about more outcomes. For example, the interactive workflow in which engineers drill down to understand a particular process problem will not scale when there are thousands of parameters to review. Likewise, traditional BI charting becomes ineffective when hundreds or thousands of variables need to be considered, or when thousands of micro-segments of customers need to be reviewed. Interactive analyses and reviews must at least be guided and prioritised; better still, actions based on predicted outcomes, and on how to optimise them, should be initiated instantly and automatically.
What's next? The role of automated prescriptive analytics and decision support will become critical. For example, Internet of Things (IoT) technologies will close the feedback loop from predictive models to individual stakeholders: a patient may receive an automated text message not only in the (relatively 'obvious') case when sensors and monitors detect undesirable changes in the amount of food or water intake, or if it is detected that critical medication is


@scwmagazine | www.scientific-computing.com

