
Friday, April 1, 2016

Why Is Big Data BIG?

Big data is a broad term for data sets so large or complex that traditional data processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, querying and information privacy.

Big data also describes the large volume of data – both structured and unstructured – that inundates a business on a day-to-day basis. But it is not the amount of data that is important; it is what organizations do with the data that matters. Big data can be analyzed for insights that lead to better decisions and strategic business moves.

Data sets are growing rapidly, in part because they are increasingly gathered by cheap and numerous information-sensing sources: mobile devices, aerial (remote-sensing) platforms, software logs, cameras, microphones, radio-frequency identification (RFID) readers, and wireless sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, 2.5 exabytes (2.5×10^18 bytes) of data were created every day. One question for large enterprises is determining who should own big data initiatives that affect the entire organization.
Big data usually includes data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage, and process within a tolerable elapsed time. Big data "size" is a constantly moving target; as of 2012 it ranged from a few dozen terabytes to many petabytes of data. Big data requires a set of techniques and technologies with new forms of integration to reveal insights from data sets that are diverse, complex, and of a massive scale.
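One standard answer to "the data does not fit in memory" is to stream it in fixed-size chunks rather than load it whole. The sketch below is illustrative and not from the post; the file, chunk size, and function name are assumptions.

```python
# Minimal out-of-core sketch: process a file larger than memory by
# reading it in fixed-size chunks. All names here are illustrative.
import os
import tempfile

def count_lines_chunked(path, chunk_size=1 << 16):
    """Count newline-delimited records without loading the file into memory."""
    count = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)  # read at most chunk_size bytes
            if not chunk:
                break
            count += chunk.count(b"\n")  # aggregate per chunk, then discard it
    return count

# Tiny demonstration: a temporary file stands in for a large data set.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".log") as tmp:
    tmp.write("record\n" * 1000)
    path = tmp.name

print(count_lines_chunked(path))  # 1000
os.remove(path)
```

The same pattern — read a bounded chunk, update a running aggregate, discard the chunk — is what lets tools process data sets far beyond available RAM.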
 

Big data can be described by the following characteristics:
  • Volume
The quantity of generated and stored data. The size of the data determines the value and potential insight, and whether it can actually be considered big data or not.
  • Variety
The type and nature of the data. This helps people who analyze it to effectively use the resulting insight.
  • Velocity
In this context, the speed at which the data is generated and processed to meet the demands and challenges of growth and development.
  • Variability
Inconsistency of the data set can hamper processes to handle and manage it.
  • Veracity
The quality of captured data can vary greatly, affecting accurate analysis.
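
The first three V's can be made concrete with a toy measurement over a stream of records. This is a sketch of my own, not from the post; the record format, field names, and function are assumptions.

```python
# Illustrative sketch: measure volume (bytes ingested), velocity
# (records per second), and variety (distinct record types) over a
# stream of dict records. All names are assumptions for illustration.
import json
import time

def stream_metrics(records):
    start = time.monotonic()
    volume_bytes = 0
    types_seen = set()
    n = 0
    for rec in records:
        encoded = json.dumps(rec).encode("utf-8")
        volume_bytes += len(encoded)                # volume: raw size seen
        types_seen.add(rec.get("type", "unknown"))  # variety: distinct kinds
        n += 1
    elapsed = max(time.monotonic() - start, 1e-9)   # guard against zero
    return {
        "volume_bytes": volume_bytes,
        "velocity_rps": n / elapsed,                # velocity: throughput
        "variety": len(types_seen),
    }

sample = [
    {"type": "sensor", "v": 1},
    {"type": "log", "msg": "ok"},
    {"type": "sensor", "v": 2},
]
print(stream_metrics(sample))
```

Variability and veracity resist such one-line metrics: they concern schema drift and data quality over time, which is precisely why they are listed as separate challenges.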
 
Factory work and cyber-physical systems may follow a 6C system:
  • Connection (sensor and networks)
  • Cloud (computing and data on demand)
  • Cyber (model and memory)
  • Content/context (meaning and correlation)
  • Community (sharing and collaboration)
  • Customization (personalization and value)
