
3Vs (volume, variety and velocity)

Contributor(s): Ivy Wigmore

3Vs (volume, variety and velocity) are three defining properties, or dimensions, of big data. Volume refers to the amount of data, variety refers to the number of types of data and velocity refers to the speed at which data is generated and processed. According to the 3Vs model, the challenges of big data management result from the expansion of all three properties, not from volume alone -- the sheer amount of data to be managed.
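
The model prescribes no formula, but a minimal sketch can make the three dimensions concrete; the Workload fields and thresholds below are illustrative assumptions, not part of the 3Vs model itself:

```python
from dataclasses import dataclass

@dataclass
class Workload:
    volume_tb: float           # volume: total amount of data, in terabytes
    variety: int               # variety: number of distinct data types or formats
    velocity_gb_per_hr: float  # velocity: rate at which data arrives and must be handled

def is_big_data(w: Workload) -> bool:
    # Per the 3Vs model, challenges can arise from the expansion of any
    # dimension, not volume alone; the thresholds here are arbitrary.
    return (w.volume_tb > 100
            or w.variety > 20
            or w.velocity_gb_per_hr > 50)

# A modest-volume feed can still qualify on variety and velocity alone:
sensor_feed = Workload(volume_tb=5, variety=35, velocity_gb_per_hr=120)
print(is_big_data(sensor_feed))  # True
```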

Gartner analyst Doug Laney introduced the 3Vs concept in a 2001 META Group research publication, 3-D Data Management: Controlling Data Volume, Velocity and Variety. More recently, additional Vs have been proposed as additions to the model, including variability -- the increase in the range of values typical of a large data set -- and value, which addresses the need for valuation of enterprise data.

The infographic below (reproduced with permission from Diya Soubra's post, The 3Vs that define Big Data, on Data Science Central) illustrates the ongoing expansion of the 3Vs.

[Infographic: The 3Vs of big data]

This was last updated in February 2013


Join the conversation

This is ridiculous. Coming up with abstract buzzwords to sell tools to distill garbage data into garbage results to CIOs reading about it in SkyMall.
Great to see the industry finally adopting the "3Vs" of Big Data that Gartner first introduced over 12 years ago! Here's a link to the original piece I wrote on "The Three Dimensional Data Challenge" back in 2001 positing them: http://goo.gl/wH3qG. Interesting also to see others tack on additional "V"s that, while interesting, are decidedly not definitional characteristics of Big Data. --Doug Laney, VP Research, Gartner, @doug_laney
I agree; as far as definitions go, this one is of limited use. It is a good introduction for someone who has never heard of the three V's, but what we really need is some kind of empirical definition that transcends time, sort of like Moore's Law.

Here's my suggestion: "Data is Big Data when it is too big to be processed on any one commonly available computer and instead requires a cluster of computers." "Commonly available" would then have to be defined somehow, for example "computers available in the majority of large and medium-sized businesses," so that mainframes are excluded (see the sketch below).

A "cluster of computers" is the important part because it forces a fundamental change in the underlying architecture of how mathematical functions are designed, in order for them to perform acceptably when network communication is part of the system.

The amount of data that one computer can process has certainly changed over the years and will continue to do so. Therefore this kind of definition should be useful moving forward.
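
A minimal sketch of the proposed test, assuming a hypothetical 10 TB working capacity for one "commonly available computer" (the figure is an illustration only and would shift over time, as the comment notes):

```python
import math

# Hypothetical stand-in for what one "commonly available computer" can
# work on; this number would have to be revised as hardware improves.
COMMON_MACHINE_CAPACITY_TB = 10.0

def is_big_data(dataset_size_tb: float,
                machine_capacity_tb: float = COMMON_MACHINE_CAPACITY_TB) -> bool:
    # Big data, under this definition: too big for any one commonly
    # available computer, so a cluster of computers is required.
    return dataset_size_tb > machine_capacity_tb

def nodes_required(dataset_size_tb: float,
                   machine_capacity_tb: float = COMMON_MACHINE_CAPACITY_TB) -> int:
    # Lower bound on cluster size; real deployments add headroom and
    # replication, and pay the network-communication costs described above.
    return max(1, math.ceil(dataset_size_tb / machine_capacity_tb))

print(is_big_data(2.0))      # False: fits on one machine
print(nodes_required(85.0))  # 9 machines at 10 TB each
```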
