Data Veracity: a New Key to Big Data

In his speech at Web Summit 2018, Yves Bernaert, Senior Managing Director at Accenture, declared that the quest for data veracity will become increasingly important for making sense of Big Data. In short, Data Science is about to turn from data quantity to data quality.

It is true that data veracity, though always present in Data Science, has long been outshined by the other three big V's: Volume, Velocity, and Variety.

Volume
For data analysis we need enormous volumes of data. Luckily, today data is provided not only by human experts but also by machines, networks, and readings from connected devices. In most cases, we have enough data around us; what we need now is to select the part of it that might be of use.
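That selection step can be sketched in a few lines. The records, field names, and quality threshold below are hypothetical, purely for illustration of filtering useful readings out of a large raw volume:

```python
def select_useful(records, min_quality=0.8):
    """Keep only records that carry a usable signal."""
    for record in records:
        # Drop incomplete readings and those below a quality threshold.
        if record.get("value") is not None and record.get("quality", 0) >= min_quality:
            yield record

raw = [
    {"sensor": "s1", "value": 21.5, "quality": 0.95},
    {"sensor": "s2", "value": None, "quality": 0.99},   # incomplete reading
    {"sensor": "s3", "value": 19.8, "quality": 0.40},   # unreliable reading
]

useful = list(select_useful(raw))
print(len(useful))  # only the first record survives
```

Because `select_useful` is a generator, the filter also scales to data that is too large to hold in memory at once.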

Velocity
In the field of Big Data, velocity means the pace and regularity at which data flows in from various sources. Ideally, the flow of data is massive and continuous, and the data can be obtained in real time or with only a few seconds' delay. Such real-time data helps researchers make more accurate decisions and provides a fuller picture.
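One minimal sketch of working with such a continuous flow is a rolling average: each incoming value immediately updates the current picture over a fixed window, instead of waiting for a full batch. The stream values and window size are made-up illustrations:

```python
from collections import deque

class RollingAverage:
    """Maintain an average over the most recent readings in a stream."""

    def __init__(self, window=5):
        self.values = deque(maxlen=window)  # old readings fall off automatically

    def update(self, value):
        """Ingest one new reading and return the up-to-date average."""
        self.values.append(value)
        return sum(self.values) / len(self.values)

stream = [10, 12, 11, 13, 40, 12]  # hypothetical per-second readings
avg = RollingAverage(window=3)
for reading in stream:
    current = avg.update(reading)  # refreshed with every arriving value
```

In a real pipeline the list would be replaced by a live source (a message queue or socket), but the per-reading update pattern stays the same.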

Variety
For the data to be representative, it should come from various sources and in many types. At present, there are many …

Read More on Datafloq
