When your day-to-day job is building models that accurately describe data, much of that work is weighting the variables within the model, tweaking and tailoring it for greater accuracy. Once a model is deployed, it is commonly kept in service, as is, for long stretches of time. But the world keeps moving, and the data used to create models continues to change. Old prediction models rely on old data, and that can lead to problems down the road.
When the moods and tastes of customers change, that’s reflected in the products they buy, and that affects the data. Changes in mobile technology and customer usage patterns influence the data across networks. An increase or decrease in users, or a surge in traffic from low-power devices like smartphones and tablets, affects the volume and types of data streaming across the network. Whatever the type of data, it is likely to change over time.
For all the work that analysts and data scientists put into their models, there’s danger in leaving a model unchanged. When the data changes, the model effectively becomes stale. The environmental and economic factors that analysts sought to understand and control have changed, and the models they’ve constructed no longer describe the world as it is. Where last year’s model emphasized SQL injection attacks, today’s threats arrive through co-opted mobile devices.
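To make staleness concrete, here is a minimal sketch of one common way to catch it, sometimes called drift detection: monitor a deployed model’s accuracy on a sliding window of recent, labeled examples and flag the model for review when accuracy falls below a threshold. The window size, the threshold, and the `record`/`is_stale` interface are all illustrative assumptions, not a prescription from the text.

```python
from collections import deque

class StalenessMonitor:
    """Track a deployed model's accuracy on recent examples and
    flag the model as stale when accuracy drops below a threshold.

    Window size and threshold here are illustrative assumptions.
    """

    def __init__(self, window_size=500, min_accuracy=0.85):
        self.recent = deque(maxlen=window_size)  # 1 = correct, 0 = wrong
        self.min_accuracy = min_accuracy

    def record(self, prediction, actual):
        """Record one prediction once the true outcome is known."""
        self.recent.append(1 if prediction == actual else 0)

    def is_stale(self):
        """Report staleness only after the window has filled."""
        if len(self.recent) < self.recent.maxlen:
            return False
        return sum(self.recent) / len(self.recent) < self.min_accuracy


# Usage: feed each (prediction, outcome) pair to the monitor as ground
# truth arrives, and trigger a retrain when it signals staleness.
monitor = StalenessMonitor(window_size=500, min_accuracy=0.85)
# monitor.record(model.predict(x), y_true)
# if monitor.is_stale():
#     retrain(model)
```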
When models for retail or financial services are outdated, profits suffer. When the models guide the care of patients or the movement of people and cars, the stakes of staleness are greater still. In every case, the guidance we get from those models becomes less valuable over time. In spite of this, the nature of analytics work leads most organizations to put models in place as-is and let them continue to inform the decisions of the firm.
New technologies in analytics will transform how models deal with new data. When we use algorithms to devise the model itself, it becomes not just possible but reasonable to automate how we analyze and predict on data. Already some industries are beginning to employ more dynamic modeling systems to help tackle complex and highly variable data problems. For companies that work with thousands or more connected devices, or that must contend with the difficult-to-predict actions of humans, these approaches are the only way to manage the constant variability of data. Without dynamic approaches to modeling and predicting, these forward-thinking data companies will be overwhelmed by the volume and variety of data they manage. With them, their analysts can put less effort into keeping pace with the increasing flow of data and focus instead on the meaning and implications of their analysis.
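As a hedged sketch of what a more dynamic modeling system can look like in practice, the loop below folds each new batch of labeled data into the model as it arrives, so the model tracks the current distribution rather than last year’s. The choice of scikit-learn’s `SGDClassifier`, its `partial_fit` incremental-learning API, and the simulated drifting stream are all assumptions for illustration, not the only way to build such a system.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
classes = np.array([0, 1])
model = SGDClassifier()  # a linear classifier that supports incremental fitting

def simulated_batch(shift):
    """Stand-in for a data stream whose distribution drifts over time."""
    X = rng.normal(loc=shift, scale=1.0, size=(200, 3))
    y = (X.sum(axis=1) > shift * 3).astype(int)
    return X, y

# The first batch initializes the model; later batches update it in
# place, so it keeps pace with the drifting data instead of going stale.
for step, shift in enumerate(np.linspace(0.0, 2.0, 10)):
    X, y = simulated_batch(shift)
    model.partial_fit(X, y, classes=classes)
    print(f"step {step}: accuracy on current batch = {model.score(X, y):.2f}")
```

The design point is the one the paragraph makes: the human effort shifts from repeatedly rebuilding the model by hand to deciding what the continuously updated model’s output means.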