We are told to think big. We are told that there is no such thing as ‘too much’, as ‘too complex’, as ‘too great’. With so much hyperbole surrounding business intelligence and insight, it is no wonder that so many of us get caught up in the tidal wave of facts and figures, and swept away in the deluge that follows. Would we not be better served by changing our approach a little?

Before we get started here, I must show my hand; I am a fan of data. I don’t believe, necessarily, that there can be too much data. I believe that there are always fresh insights to be uncovered by looking a little deeper, and by reaching a little further.

But I am also a realist. I understand the practical impact of too much data, of too much of anything. I understand that businesses can be overwhelmed, swamped by the sheer volume of the stuff. I understand how easy it is to misjudge our own capabilities and to overreach. And I understand how difficult it can be for businesses to gain the advantages they crave from data.

With this in mind, is it not sometimes better to think a little smaller?

The Good Stuff

Alex Woodie, writing for the appropriately titled Datanami, described a frightening scene:

“About 2.5 exabytes of data will be generated today,” he said, “or, roughly the amount of data that was generated from the dawn of time until 2004.”

Woodie was writing in October 2014. The globe’s daily data output is somewhat higher today!

The crux of the article is this: is more data necessarily better? Or would organizations be better served by focusing on their objectives, using only ‘the good stuff’? In discussing this question, Woodie highlights a 1998 joint study by Anthony Bastardi and Eldar Shafir, psychologists at Stanford and Princeton, respectively.

Bastardi and Shafir found that, at an individual level, people presented with vast amounts of unstructured data were left in a worse position than people presented with a straightforward, logically ordered selection of data. The authors of the study concluded that humans, as a species, struggle to prioritize and eliminate pieces of data, and end up (I’m paraphrasing here) barking up the wrong tree.

The study focused on the choices made by an individual, not on a corporate organization with specialist teams and resources in place to manage this data. However, the principle still applies. By trying to think too big, company bosses could be putting even relatively simple decision-making processes in jeopardy.

Applying the Right Data to the Right Projects

The fact is, if we are overwhelmed by data, it becomes useless. Vast organizations with budgets to match can afford to invest in the science of interpretation, and can deploy, or even construct for themselves, enormous data storage networks that bridge the gap between the cloud and the good old-fashioned physical server.

For most of us, however, this is simply not possible.

Instead, we need to take what we know already and build on this, developing the data as we go.

Let’s say, a couple of years ago, a tech company launched a new accessory into the market. The launch was a major success, and the first production run flew off the shelves. After a few months, sales began to tail off, as they naturally tend to do.

Now, the tech company wants to capitalize on this original success. The trouble is, they don’t know whether to invest in research and development for an upgraded version of the original accessory, or for an entirely new product to launch into the market. To decide, they need to gauge the public’s appetite for an upgrade.

To do this, they can examine customer reviews to see where the public would like improvements to be made. They can examine returns and recall data to see whether there are underlying problems with the existing tech. They can look at contemporary sales of compatible technology to ascertain whether the market still exists. These are objective-based data sources. The insight derived from them feeds directly into the task at hand, furthering understanding and driving towards a solution.

Compare this with the alternative – standing in front of a spurting torrent of data with a big bucket – and you can begin to see why quality is still to be valued over quantity.

Grassroots Data

Habits are difficult to break. If you have developed your business intelligence protocols with the bellow of “Massive Data” ringing in your ears, it could be difficult to stop what is already in motion.

However, in order to get the best out of data, such an overhaul may be necessary. Murray Chick, writing for Marketing Tech News earlier in 2017, recommended a grassroots approach to data.

This means starting small and building up gradually. With each project, assess what insight objectives you need to meet in order to reach the necessary understanding, and stop there. Apply the data you need to reach that level of insight, and then make your decision. Larger, more complex projects will require more data; smaller projects will require less. The key is to understand how much data needs to be applied, and not to confuse the process by piling useless data fragment upon useless data fragment.

This is how we build a lean and mean approach to data. We must discern and discriminate, looking – as Chick says – for “clues” from data, not “facts”, then interpreting these clues to the best of our abilities.

It’s time for an audit: time to take stock of your data sources and to understand the benefit that each one can provide. This way, data stays useful, it stays vital and, above all, it stays relevant. This way, you will know which tap to reach for when it is time to fill that data bucket once again.

