A bold statement indeed, but not unjustified. Within the context of big data, it is conceivable that sooner rather than later, we will be able to accurately predict the future in big business. And here’s why.
Big data has come of age… Or has it?
As many people reading this will be aware, much has been written about big data in recent years in general, and its promise to revolutionize big business in particular.
Such is the power and pervasiveness of big data that the big data analytics industry, dedicated to helping big businesses leverage the petabytes of information they now generate, was worth $122 billion in 2017 – and it is still growing.
The basis of the big data promise is that extremely large data sets can be analyzed computationally to unlock hitherto unfathomable patterns, trends, and associations, especially relating to human behavior and interactions – and that this hidden insight will provide a killer competitive advantage.
Diverse, disparate & messy data go into big data analytic tools, and actionable insights come out.
But so far, big data has somewhat disappointed in delivering that killer competitive advantage.
All style and no substance, maybe
A common criticism is that big data analysis is often shallow compared to the analysis of smaller data sets. In fact, many big data projects involve no large-scale data analysis at all – the challenge lies in the extract, transform, load (ETL) stage of data handling. Big data has become pure process, and the meaning-making is lost.
In our experience, up to 80% of investment is spent organizing and structuring data – only to find that the analytics produced are not useful to the business. And having invested in purpose-built tools to analyze data at scale, businesses have often simply bought stylish interactive dashboards that visualize it.
Big data in and of itself is not enough
In reality, big data analysis must be situated in its social, economic, and political context. But even as companies invest large sums to derive insight, fewer than 40% of employees have sufficiently mature processes and skills to do so.
The Google Flu Trends project is often cited as an example of the failure of big data. The algorithm, trained on historical data about both, aimed to estimate the prevalence of real-world flu cases from Google search queries. Initially it performed well, but it was soon wildly over-estimating the number of cases. Machine-learned algorithms are supposed to get better over time, not worse.
GFT (Google Flu Trends) and other big data methods can be useful, but only if they’re paired with small data – traditional forms of information collection. Put the two together, and you can reach an excellent model of the world as it actually is.
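As a minimal sketch of what that pairing can look like – with entirely invented numbers, not GFT's actual method – imagine a noisy "big data" search signal that carries a systematic bias, and a handful of accurate "small data" observations used to recalibrate it:

```python
import numpy as np

# Hypothetical illustration: a big-data search index tracks real flu cases
# but with a systematic bias. A small, accurate sample of traditional
# surveillance counts is enough to recalibrate the big signal.
rng = np.random.default_rng(0)
weeks = np.arange(52)
true_cases = 1000 + 300 * np.sin(weeks / 8.0)                  # ground truth (mostly unobserved)
search_index = 2.0 * true_cases + 400 + rng.normal(0, 20, 52)  # biased, noisy big-data proxy

# Small data: accurate counts for just 8 sampled weeks
sample = rng.choice(weeks, size=8, replace=False)
slope, intercept = np.polyfit(search_index[sample], true_cases[sample], 1)

# Recalibrated estimate for every week of the year
estimate = slope * search_index + intercept
error = np.abs(estimate - true_cases).mean()
```

Taking the raw proxy at face value (`search_index / 2`) stays hundreds of cases off every week; eight weeks of ground truth bring the average error down by an order of magnitude.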
Further, multivariate methods that probe for the latent structure of the data, such as factor analysis and cluster analysis, have proven useful as analytic approaches that go beyond the bivariate approaches (cross-tabs) typically employed with smaller data sets.
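For readers unfamiliar with cluster analysis, here is a minimal illustration on hypothetical data (not any client's): a simple k-means pass recovers two latent customer segments that a single cross-tab of averages would blur together.

```python
import numpy as np

# Two hypothetical customer segments on (visits per month, average basket value):
# frequent low-spend buyers vs. infrequent high-spend buyers.
rng = np.random.default_rng(1)
segment_a = rng.normal([10, 20], 1.5, size=(50, 2))
segment_b = rng.normal([2, 90], 1.5, size=(50, 2))
X = np.vstack([segment_a, segment_b])

def kmeans(X, k=2, iters=20, seed=1):
    """Bare-bones k-means: assign points to nearest center, re-average, repeat."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Keep the old center if a cluster ever empties out
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(X)
```

With segments this well separated, the labels recover the two groups exactly – the kind of latent structure a cross-tab of overall means would never reveal.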
And this is where it gets interesting.
Predictive insight – accelerated performance
Within the big data conundrum, Nepa focuses on “predictive insight”. A predictive insight is both an insight and a forecast. We use predictive rather than descriptive analytics to form repeatable insights, producing improved performance and optimizing ROI. The output is always “the next best action”, delivered as short- and long-term impacts on our clients’ business.
4 crucial factors for successful big data projects
When we undertake big data projects for our clients, we always stick to 4 proven principles:
- Success with big data starts small. Business value is the end goal. Pick a business challenge and build only what is necessary to prove value. Be successful, fast! Grow with the next question.
- If you’re not keeping score, you are just practicing. Set clear success metrics based on a solid understanding of where decisions are made and the criteria they are based on.
- Work lean and iterate. Work with sample sets of data. Try different methods. Validate with business decision makers and tweak if necessary.
- Projects are led by someone fluent in business and data science. Most data scientists come from outside the world of business. Few people in business speak data science. Successful programs are led by people capable in both.
By staying true to these principles, we provide the right type of insight, to the right person, at the right time.
A case in point – companies expect good advice, again and again and again
In a big data project for one of our retail clients, a key business challenge was that the data structure for the direct-to-consumer business was not being used to its fullest potential. Output tended towards the simple descriptive, rather than the prescriptive. And diagnostics combining a breadth of multiple data sources to evaluate store performance were not easily available.
Our solution was to combine their data sets, and apply advanced data science modelling to generate holistic and predictive insight.
Data sources ranged from the internal (e.g. sales data, store and street traffic, product assortment, and distribution numbers) to the external (e.g. weather patterns and industry and consumer trends).
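As an illustrative sketch only – with invented numbers, not the client's data – combining an internal driver (store traffic) with an external one (temperature) in even a simple linear model already yields a forecast rather than a description:

```python
import numpy as np

# Hypothetical weekly data for one store: sales respond positively to
# traffic (internal source) and negatively to temperature (external source).
rng = np.random.default_rng(2)
n_weeks = 104
traffic = rng.normal(500, 50, n_weeks)       # internal: store & street traffic
temperature = rng.normal(10, 8, n_weeks)     # external: weather pattern
sales = 0.8 * traffic - 3.0 * temperature + rng.normal(0, 10, n_weeks)

# Least-squares fit with an intercept column
X = np.column_stack([np.ones(n_weeks), traffic, temperature])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

# Predict next week's sales given normal traffic and a forecast cold snap
next_week = np.array([1.0, 500.0, -2.0])
forecast = next_week @ coef
```

The point is the shift in output: instead of reporting last week's sales, the model answers "what should we expect, and plan for, next week?" – the prescriptive step the client's reporting was missing.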
Our output consisted of:
- Mid to long term strategic guidance;
- Consulting senior management on their decisions, including discount planning by region and store;
- Long-term staff planning;
- Optimal location for new stores;
- Stores eligible for closing, to increase overall profitability.
All with the aim of optimizing profit across the business.
Big data needs big judgement
According to an article in the Harvard Business Review, big data must ultimately be complemented by big judgement for best effect, and that’s how we predict the future for our clients.
It’s our killer competitive advantage – and we make it yours. So why not get in touch with me, and let’s start to predict your future.