The Four Big Vs Of Big Data

Posted on Oct 20 2017 - 9:54pm by Editorial Staff

Unless you have quite literally been living in a cave for the last few years, you’ll be aware of the Big Data revolution and you’ll know that you’re living in the Big Data Age. What you may not know so much about are the Four Vs of Big Data – volume, variety, velocity and value.

We’re creating more and more data every year – the volume is rising exponentially – and this vast increase is straining our existing data storage methods and infrastructures.

Fortune magazine reported that by 2003, humanity had created five exabytes (five billion gigabytes) of digital data. By 2011, however, we were creating that amount every two days, and just two years later it was being churned out every ten minutes. The total volume of data stored on our computers doubles roughly every 18 months – Big Data is just going to keep on getting bigger.
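To put that doubling claim into numbers, here’s a minimal sketch (in Python) that projects the volume forward, assuming a starting point of five exabytes and a doubling period of 18 months, both taken from the paragraph above purely as an illustration:

```python
# Illustrative projection only: starts from five exabytes and doubles
# every 18 months, as described above. Not a forecast.
START_EXABYTES = 5       # assumed starting volume
DOUBLING_MONTHS = 18     # assumed doubling period

def projected_volume(months_from_now: float) -> float:
    """Projected data volume in exabytes after the given number of months."""
    return START_EXABYTES * 2 ** (months_from_now / DOUBLING_MONTHS)

for years in (1, 3, 5, 10):
    print(f"After {years:2d} years: {projected_volume(years * 12):,.0f} EB")
```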

Different sources

Big Data comes from – and is also defined by – a huge variety of sources and content, which in turn presents lots of new opportunities to take value from this gushing digital stream. There are customer purchase lists from transaction processors, geolocation data, social media posts, tweets, council records and so on. The formats also vary, from familiar-looking database entries to unstructured data like audio files and satellite images. This unstructured data is often handled and processed by information specialists like Bit Plane, who can turn a vast sea of raw data into something the human mind can interpret and learn from.
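As a rough illustration of what handling such a mix can involve, the sketch below maps two made-up records of very different shapes (a transaction row and a geotagged social post) onto one common structure. The field names and sources are invented for the example:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict

@dataclass
class Record:
    """A hypothetical unified record; the shape is invented for illustration."""
    source: str
    timestamp: datetime
    payload: Dict[str, Any]

def from_transaction(row: Dict[str, Any]) -> Record:
    # e.g. one row exported by a transaction processor
    return Record("transactions", datetime.fromisoformat(row["ts"]),
                  {"amount": row["amount"], "customer": row["customer_id"]})

def from_social_post(post: Dict[str, Any]) -> Record:
    # e.g. one geotagged social media post
    return Record("social", datetime.fromisoformat(post["created_at"]),
                  {"text": post["text"], "location": post.get("geo")})

records = [
    from_transaction({"ts": "2017-10-20T09:54:00", "amount": 42.5, "customer_id": "C123"}),
    from_social_post({"created_at": "2017-10-20T10:02:00", "text": "Stuck in traffic again",
                      "geo": (51.5, -0.12)}),
]
print(records)
```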

There’s no point in processing this phenomenally large stream of data if it takes so long that your findings are out of date by the time you get them. All the information must be processed as quickly as possible – in other words, with great velocity. New analytical algorithms and better data transmission methods mean that information can be processed in near real time, which is vital if it needs to be acted upon. Online fraud, for example, needs to be detected and reacted to immediately, as do traffic volumes and delays.
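To give a flavour of what acting on data as it arrives might look like, here’s a minimal sketch of a streaming check that flags unusually large transactions the moment they appear. The threshold and the event fields are assumptions made purely for the example; real fraud detection relies on far more sophisticated models:

```python
from typing import Dict, Iterable, Iterator

FLAG_THRESHOLD = 10_000  # assumed limit for the example; real systems use learned models

def flag_suspicious(events: Iterable[Dict]) -> Iterator[Dict]:
    """Yield events worth a closer look as soon as they arrive, not in a nightly batch."""
    for event in events:  # in practice this could be a live feed, e.g. a message queue consumer
        if event["amount"] > FLAG_THRESHOLD:
            yield event

stream = [
    {"id": 1, "amount": 250},
    {"id": 2, "amount": 18_400},  # flagged the moment it arrives
    {"id": 3, "amount": 99},
]
for suspicious in flag_suspicious(stream):
    print("review:", suspicious)
```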

Value for monetisation

Big Data has changed the value of information, both in terms of usefulness and in terms of monetary worth. That value rests on the reliability, or veracity, of the information, and on two factors in particular. Firstly, if the information is wrong, incomplete or so unstructured that it’s difficult to analyse, it needs to be verified and its sources checked. Secondly, not all of the huge amounts of data stored by organisations are worth the effort – some data strands are essentially just noise that covers up the more valuable information, and these need to be picked out and set aside. Data analysts need to identify which data is worthwhile and which types will matter to the organisation’s aims, so that they don’t waste time on pointless information.
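In its simplest form, that separation of signal from noise might look like the sketch below, which keeps only the fields an analysis actually needs and drops the rest. The field names are invented for illustration:

```python
# Keep only the fields the analysis needs; everything else is treated as noise.
# The field names below are invented for illustration.
USEFUL_FIELDS = {"customer_id", "amount", "timestamp", "region"}

def strip_noise(record: dict) -> dict:
    """Return a copy of the record containing only the fields judged worthwhile."""
    return {key: value for key, value in record.items() if key in USEFUL_FIELDS}

raw = {
    "customer_id": "C123",
    "amount": 42.5,
    "timestamp": "2017-10-20T09:54:00",
    "region": "UK",
    "ad_click_hash": "f3a9c0",        # noise for this particular analysis
    "session_debug": {"retries": 3},  # noise for this particular analysis
}
print(strip_noise(raw))
```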

Data can be, and is being, monetised extensively now. If data has been processed, standardised and (if necessary) anonymised, or if it’s rare or difficult to source in the first place, then it will have monetary value. Rare data will hold a high value even before it’s processed and cleaned up – a new form of currency.
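Before data is sold on, identifying fields are usually removed or replaced. The sketch below shows one deliberately simplified approach: replacing an identifier with a keyed hash so records can still be linked without revealing who they belong to. Real anonymisation takes far more care than this, and the salt and field names here are assumptions made for the example:

```python
import hashlib
import hmac

SECRET_SALT = b"replace-with-a-private-key"  # assumption: kept secret by the data owner

def pseudonymise(customer_id: str) -> str:
    """Replace a direct identifier with a keyed hash: still linkable, no longer directly identifying."""
    return hmac.new(SECRET_SALT, customer_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "C123", "amount": 42.5}
record["customer_id"] = pseudonymise(record["customer_id"])
print(record)
```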
