Big Data’s Emphasis on Value Over Volume
June 17, 2019, Scott Foster
The phrase “big data” has been around for some time, but the concept continues to evolve. Big data first meant collecting and analyzing data sets too large and complex to be handled by traditional data-processing software, with a focus on the volume, variety, and velocity of the data. Now, we also consider the veracity and value of the data—and the emphasis has shifted to prioritize value. In today’s landscape, where big data refers to predictive analytics, user behavior analytics, and other advanced analytical methods, the size of the data set is no longer the defining characteristic; rather, it’s the value that matters most. After all, what good is having all of this data if you can’t actually do anything with the findings?
From a smart city perspective, the data afforded by advanced metering infrastructure set up by the electrical utility can increase operational efficiency, advance monitoring and management of the grid, and improve customer experience. If the smart grid solution offers a full communications backbone as well, like our Delta Smart Grid Network™ (DSGN™), more data can be captured by Internet of Things devices connected to the network (check out April’s blog post for more on IoT).
Furthermore, to extract even more value, big data is increasingly being combined with artificial intelligence (AI) and machine learning. Though related, the two are distinct:
- AI is the creation of machines that learn from their environment and can solve problems based on what they observe, and
- machine learning is a subset of AI in which the machine uses what it learns to improve itself without being explicitly programmed to do so.
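To make the distinction above concrete, here is a minimal sketch of the machine-learning idea (the data and function names are hypothetical illustrations, not drawn from any utility system): instead of hard-coding a rule, the program estimates it from example data.

```python
# Minimal machine-learning sketch: rather than being explicitly
# programmed with the rule y = 2x, the program derives the slope
# from example data points (hypothetical values for illustration).

def fit_slope(xs, ys):
    """Least-squares estimate of the slope for a line through the origin."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Example observations that happen to follow y = 2x.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

slope = fit_slope(xs, ys)
print(slope)  # the "learned" rule: y is approximately slope * x
```

The same principle, scaled up to millions of meter readings and far richer models, is what lets utilities predict demand or flag anomalies without anyone writing those rules by hand.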
Through robust data analytics, artificial intelligence, and machine learning, the value of big data multiplies. And although volume, variety, velocity, and veracity are still key components of big data, value remains its most crucial characteristic.