3Vs (volume, variety and velocity)

Contributor(s): Ivy Wigmore

3Vs (volume, variety and velocity) are three defining properties, or dimensions, of big data. Volume refers to the amount of data, variety refers to the number of types of data and velocity refers to the speed of data processing. According to the 3Vs model, the challenges of big data management result from the expansion of all three properties, rather than from volume alone -- the sheer amount of data to be managed.

Gartner analyst Doug Laney introduced the 3Vs concept in a 2001 META Group research publication, 3D Data Management: Controlling Data Volume, Velocity and Variety. More recently, additional Vs have been proposed for the model, including variability -- the increase in the range of values typical of a large data set -- and value, which addresses the need for valuation of enterprise data.

The infographic below (reproduced with permission from Diya Soubra's post, The 3Vs that define Big Data, on Data Science Central) illustrates the ongoing expansion of the 3Vs.

[Infographic: The 3Vs of big data]

This was last updated in February 2013


Join the conversation

This is ridiculous. Coming up with abstract buzzwords to sell tools to distill garbage data into garbage results to CIOs reading about it in SkyMall.
Great to see the industry finally adopting the "3Vs" of Big Data that Gartner first introduced over 12 years ago! Here's a link to the original piece I wrote on "The Three Dimensional Data Challenge" back in 2001 positing them: http://goo.gl/wH3qG. Interesting also to see others lop on additional "V"s that, while interesting, are decidedly not definitional characteristics of Big Data. --Doug Laney, VP Research, Gartner, @doug_laney
I agree; as far as definitions go, this is pretty useless. It is good to understand if one has never heard of the three Vs before, but what we really need is some kind of empirical definition that transcends time, sort of like Moore's Law.

Here's my suggestion: "Data is Big Data when it is too big to be processed on any one commonly available computer and instead requires a cluster of computers." "Commonly available" would then have to be defined somehow -- for example, "computers available in the majority of large and medium-sized businesses" -- so that mainframes would be excluded.

The reason a "cluster of computers" is important is that it requires a fundamental change in the underlying architecture of how mathematical functions are designed in order to perform acceptably when network communication is part of the system.

The amount of data that one computer can process has certainly changed over the years and will continue to do so. Therefore, this kind of definition should remain useful moving forward.
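To make the commenter's point about clusters concrete, here is a minimal sketch in Python (not from the article or the comment; the data and function names are hypothetical) of why even a simple calculation such as a mean has to be restructured once data is partitioned across machines: each node computes a small partial summary, and only those summaries, not the raw data, cross the network.

# Hypothetical sketch: computing a mean over data partitioned across "nodes".
# On a single machine you could simply call statistics.mean(data); on a
# cluster, the calculation must be split into per-node partial results
# (sum, count) that are cheap to send and are merged by a coordinator.

from typing import Iterable, List, Tuple

def partial_mean(chunk: Iterable[float]) -> Tuple[float, int]:
    # Runs on each node: summarize its local data as (partial_sum, count).
    total, count = 0.0, 0
    for x in chunk:
        total += x
        count += 1
    return total, count

def combine(partials: List[Tuple[float, int]]) -> float:
    # Runs on the coordinator: merge the small per-node summaries.
    total = sum(s for s, _ in partials)
    count = sum(c for _, c in partials)
    return total / count

# Hypothetical per-node partitions standing in for data spread across a cluster.
node_data = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0, 7.0, 8.0, 9.0]]
print(combine([partial_mean(chunk) for chunk in node_data]))  # prints 5.0

This split-then-merge shape is the change the commenter is pointing at: the algorithm itself is redesigned around network communication, not merely run on more data.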
