I have been involved in marketing analytics work for some years now. It requires me to regularly talk to CXOs about their big data challenges and their plans to leverage this data to improve business decision making. I am constantly surprised by how many misconceptions exist among executives. All of them read about new technologies and platforms coming out of Silicon Valley that magically clean, organize, analyze and visualize data for them. As if they just have to implement some technology, press a button, and insights will start flowing.
This is a myth. There is no such (magical) technology-based analysis. Period.
I myself am guilty of selling big data solutions under the facade of technology and platforms. In many ways, I have contributed to this misconception about Big Data technology. So, I hope you believe me when I tell you – Big Data’s big deal is not about technology platforms; it is about the appropriate human interface with data technology. Let’s stop assuming that technology platforms will save the enterprise from all its data problems. I have seen the most advanced technology platforms that exist today, and I know one thing – these platforms serve no purpose unless we have trained data professionals who bring three basic things: business/domain knowledge, analytical experience, and the ability to embrace new data technology.
Let’s Start With the Data Deluge (Which, BTW, Is Not the Problem):
We all know there is data everywhere. In the past couple of years, the world has generated more data than in all of prior civilization put together. Whether it is content posted on the web and social media, data transmitted from sensors in cars, appliances, buildings and airplanes, or streams to your mobile, television or computer, we are surrounded and overwhelmed by data. Advances in the technology that generates data are the main driver of this deluge, but similar advances have taken place in the technology to collect and store it. This has made it economical for organizations to build infrastructure to store and manage very large datasets. The real problem is deriving value from this data and making it useful, and this is where most of the stagnation is today. According to International Data Corporation (IDC), only one percent of the digital data generated is currently being analyzed.
The Data Revolution Is About Insights:
Everyone agrees there is a big data revolution happening, but it is not about the volume and scale of data being generated. The revolution is about the ability to actually do something with that data. What used to take millions of dollars – first to build the infrastructure, then to hire really smart and expensive individuals to analyze the data – can now be done for thousands. It comes down to using the right set of new-age technologies and implementing the right set of rules (read: algorithms) to deliver answers that weren’t possible earlier. This is where new-age data computation and analysis shines. We have come a long way in leveraging machine learning, graph analysis, predictive modeling algorithms and other techniques to uncover patterns and correlations that may not be readily apparent, but may turn out to be highly valuable for business decision making.
There have been vast improvements in how, and what types of, datasets can be linked together to capture insights that are not possible with any single dataset. An example everyone understands is how Amazon links together customers’ shopping and purchase histories to make product recommendations. Along with the linking of datasets, improvements in visualization tools have made it much easier for humans to analyze data and see patterns. These technologies are now making inroads into all kinds of disparate use cases, solving complex problems ranging from pharmaceutical drug discovery to terrorism alerts.
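To make the dataset-linking idea concrete, here is a toy sketch in plain Python – the customers and items are made up, and this is only an illustration of the general co-purchase idea, not a description of Amazon’s actual system. It links per-customer purchase histories into item co-purchase counts, then recommends the items most often bought alongside a given one.

```python
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical purchase histories: customer -> set of items bought.
purchases = {
    "alice": {"book", "lamp", "mug"},
    "bob":   {"book", "mug"},
    "carol": {"lamp", "mug", "pen"},
    "dave":  {"book", "pen"},
}

# Link the per-customer records by counting how often
# two items appear together in the same basket.
co_counts = defaultdict(Counter)
for items in purchases.values():
    for a, b in combinations(sorted(items), 2):
        co_counts[a][b] += 1
        co_counts[b][a] += 1

def recommend(item, n=2):
    """Return the items most frequently co-purchased with `item`."""
    return [other for other, _ in co_counts[item].most_common(n)]

print(recommend("book"))
```

Singly, each customer’s history says little; linked across customers, the same rows yield a usable recommendation signal.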
But, Here’s The Problem:
Insights can only be delivered by data scientists, and there is a huge shortage of people who are comfortable handling large amounts of data. Data collection is easy and cheap, and the general approach is to collect everything and worry later about relevancy and finding patterns. This can be a mistake, especially with large datasets: the more variables you collect, the more chance correlations there are, and the more false positives will surface. No matter how sophisticated technologies get, we need more data scientists outside of academia working full-time on solving real-world problems.
Machines cannot replace human beings when it comes to asking the right questions, deciding what to analyze and what data to use, identifying patterns, and interpreting results for the business. Machines are good at fast computation and analysis, but we need data scientists to build hypotheses, design tests, and use the data to confirm or reject those hypotheses. Traditional data scientists are not the whole solution, though. There are many generalists in the data science field who claim that if you throw data at them, they can deliver insights. But here’s the reality: someone who doesn’t know your business can have only limited (if any) impact. In addition, data scientists need to make sure decision makers are not presented with too much data, because it quickly becomes useless. This is where technology and analytical experience come in handy – techniques that aggregate, organize, filter and cluster data are extremely important for reducing datasets to digestible chunks.
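As a toy illustration of that aggregate-and-filter point – the event rows and thresholds below are entirely made up – a few lines of Python can reduce raw records to one summary line per region, the kind of digestible chunk a decision maker can actually act on.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw event data; a real feed would have thousands of rows.
events = [
    {"region": "east", "revenue": 120.0},
    {"region": "west", "revenue": 80.0},
    {"region": "east", "revenue": 95.0},
    {"region": "west", "revenue": 210.0},
    {"region": "east", "revenue": 40.0},
]

# Filter, then aggregate: keep only meaningful rows and
# collapse them into one summary per region.
groups = defaultdict(list)
for e in events:
    if e["revenue"] > 50:          # drop trivial rows (arbitrary cutoff)
        groups[e["region"]].append(e["revenue"])

summary = {region: {"orders": len(vals), "avg": round(mean(vals), 2)}
           for region, vals in groups.items()}
print(summary)
```

The judgment calls – which rows count as trivial, which grouping is relevant – are exactly the business-knowledge decisions no platform makes for you.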
Company executives need to understand that the human touch plays a fundamental role in the big data journey. Insights delivered by technology without a proper human interface can put their business at risk, alienate customers, and damage their brand. Given the current state of the art, it comes down to putting the right technologies to use and getting the right people – people who know your business – in the room to derive value out of Big Data. Is that an easy thing to do? What has been your experience?