Big data is everywhere, and you can’t get around it. But knowing it’s there isn’t helping you make decisions based on it.
The market research industry started in the 1920s and really exploded after World War II. The business was revolutionized in the late nineties with the advent of internet surveying and sampling. During the same period, and probably closely related to the market context brought on by the birth of internet research, a wave of consolidation took place. Aggressive companies began swallowing up smaller firms that had carved out good niches. No research firm that I worked for prior to 2002 exists as an independent entity anymore.
More recently, market research has been hit hard by the lingering downturn in the US economy. As consolidation progressed and the largest players approached annual revenues of one billion dollars, purchasing offices in many companies clamped down on research supply chains. Research managers, who quite honestly had little power to begin with since budgets were typically controlled by the marketing teams, were forced to consolidate their work with the multi-sector conglomerates. Research budgets in general have been squeezed over the past five years. Many, if not most, new research companies have been locked out of growth by the shrinking opportunity pool, and many smaller firms that could not win “preferred provider” agreements have evaporated if they haven’t been acquired. Such is life, and so goes a competitive marketplace.
We’re now in an era of faster and cheaper… I am not going to say better, not just yet; I will explain a little further down. Largely through the use of technology, firms have reduced the time it takes to design and execute research from months to, sometimes, hours. When we worked on a global study in the 1990s, it would take weeks just to set up and manage the back-and-forth across multiple countries. At the same time, the focus has shifted from attitudinal models (what do you think?) to behavioral models (tracking behavior through loyalty programs, online cookies, etc.) as behavioral data became more widely available and more easily processed. The amount of data available for analysis in various markets has exploded.
Big Data can be your friend, or your enemy. I worked briefly in 2001 for a technology company that wanted to capitalize on its massive data streams and sell information about online product exposure to advertisers. It sure sounded cool. We built a tool that sorted billions of transactions from the information stream and tried to produce useful, digestible bits from it. It took us over 24 hours to process 24 hours’ worth of data, which meant we could never catch up. Oh well. We had a great team of engineers, mostly from MIT or Caltech, some of whom willingly worked around the clock on the project at times. Because of that practical problem with processing time, along with other cost concerns, the effort was disbanded a few months after I joined the company. Google now owns that company, and I imagine Google has figured out a way to process that amount of information in a few minutes, or less.
In a well-regarded article in Science, Hilbert and López show that the world’s data storage capacity grew at a CAGR of 23% between 1986 and 2007. Their scholarly study ended with 2007 capacity; below is a graph from IDC (via infotechlead.com) that shows continued and accelerating growth. IDC is essentially saying the amount of data stored in the world should quadruple in the next four years.
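To put those two growth figures on the same footing, here is a quick back-of-the-envelope sketch in Python (the `cagr` helper is mine, not from either source): a 23% CAGR sustained over the 21 years of the Hilbert and López study multiplies capacity roughly 77-fold, while IDC’s quadrupling in four years implies an even steeper rate of about 41% per year.

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by growing from `start` to `end` over `years`."""
    return (end / start) ** (1.0 / years) - 1.0

# Hilbert and Lopez: 23% CAGR over 1986-2007 compounds to roughly
# (1.23)^21, i.e. about a 77x increase in storage capacity.
growth_factor = (1 + 0.23) ** (2007 - 1986)

# IDC's projection: quadrupling in 4 years implies a ~41% annual rate.
idc_rate = cagr(1.0, 4.0, 4)

print(f"23% CAGR over 21 years => {growth_factor:.0f}x total growth")
print(f"Quadrupling in 4 years => {idc_rate:.1%} CAGR")
```

In other words, IDC is not just projecting continued growth; it is projecting growth at nearly double the historical annual rate.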
Because the research world is becoming so data driven, I think a shift has been imposed on the whole business. Energy is being expended on becoming masters of data extraction and delivery, which is great in and of itself. But I think the amount of “bandwidth,” if you will, devoted to understanding the meaning behind the data may be on a downward trend.
More and more, clients tell us to just “give me the data” and they’ll make sense of it. Really? In pharma, at least, I know that the human power to understand (and I mean something more than “analyze”) information has been cut massively over the past few years, at the same time that data flows are exploding in people’s faces. I worked for a pharma company 10 years ago, when its market research team was growing rapidly: key product sales were skyrocketing and the profits were being poured into new drugs. Two years ago that research team was slashed by about two-thirds, although the number of products decreased by less than a tenth. It’s a pharma-wide problem, supposedly mitigated by cursory, live-feed data dashboards that everyone can look at and say “Ah ha!” You’re expected to boil down a massive pool of information and make sense of it through ten graphs. Maybe it’s possible, but I suspect the rush of information is too much for strapped teams to digest well enough to develop new brand insights.
Another thing has become the order of the day in the business: tech people and operations management now run the “insights” business. The research people are practically an afterthought in this model (yes, I am using hyperbole). If you have a social science degree of some sort, you might make a great report writer, but consider another career path if you want to move into a senior management role. For example, a friend of mine worked for one of the research majors for nearly fifteen years and was pushed out as the firm transitioned away from the kind of high-touch research she did. This is happening all over the place. When I worked at Roper in the late nineties, it was run by researchers, people who thought about meaning, not operations. Then it was acquired by NOP, and later by GfK, and the techs took over. The same thing happened at Harris in the late 1990s, when the word “Interactive” was appended to the Harris name (generations of customer-focused research teams might have been responsive, but they weren’t interactive). It has continued apace. No, that research model might not have been scalable, and the profit margins might have been considered low at the time. I bet those margins would look golden now that the floor has opened up and everyone has fallen through to commodity pricing. But we gave clients something solid to think about.
Does anyone remember back in 1997 when the Japanese children’s show Pokémon rapidly flashed a series of images at the millions of kids who were watching it, causing seizures in hundreds of them? Interestingly enough, the images depicted characters fighting inside a computer. So, let’s have the research and marketing community take on all the data streams so they can spit out bits and bytes to clients rapidly and efficiently. Obviously, no one is at risk of information overload and insight deficit.
In Blink, Malcolm Gladwell convinces us quite well that our decisions and judgments are made in less than two seconds. But if our eyes are closed and we’re only allowed to feel one square inch of the elephant’s leg, do we really know what beast we’re dealing with, whether it takes two minutes or two years?
McKinsey, in a 2011 report on Big Data, had this to say about the people needed to manage it:
A shortage of the analytical and managerial talent necessary to make the most of big data is a significant and pressing challenge and one that companies and policy makers can begin to address in the near term. The United States alone faces a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts to analyze big data and make decisions based on their findings.
So the big data is out there, but we’ve got to admit that we probably don’t really understand it. The good news, for market researchers in a declining industry: there’s a lot of opportunity if you want to seize it!
I’ll probably have more to say about this along the way…