Sometimes I think that Big Data has a branding problem.
You see, to gain trust and buy-in from their colleagues, data scientists have to explain how their analysis adds value. They take a “data ocean” of information and distill it into highly specific, actionable insights for every internal customer, refining and refreshing it along the way to keep it as relevant as possible.
It is like they take the most powerful telescope imaginable and look for a speck of dust on the moon. “Here you go, this precise set of data will prove that you are right.”
The success of Big Data initiatives comes, to a large extent, from the ability to drill down from the planetary level to the sub-atomic level. It’s all about getting to those small insights that would never have appeared had you not started large and refocused, refocused and refocused again. Of course, this doesn’t mean that the bigger trends are irrelevant, but we have a tendency to view anything “large” with a certain amount of mistrust.
Somehow we naturally assume that “big” things carry a bigger margin for error, although the assumptions that we made on the way to the smaller insights could …