Apache Parquet

Contributor(s): Matthew Haughn

Apache Parquet is a column-oriented storage format for Hadoop. Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. Parquet is optimized to process complex data in bulk and provides efficient data compression and encoding schemes.

Typically, data is stored in a row-oriented fashion. Even in databases, data is conventionally stored this way, an arrangement optimized for working with one record at a time. Parquet instead uses a record-shredding and assembly algorithm to break records apart and reassemble them so that the values of each column are physically stored in contiguous locations. Serializing data by column in this way allows for efficient searches across massive data sets. Since Hadoop is made for big data, columnar storage is a complementary technology.
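To make the idea concrete, here is a minimal Python sketch of the flat-schema case. It is not Parquet's actual record-shredding algorithm (which also handles nested fields), just the intuition of turning rows into contiguous per-column arrays; the field names and values are made up for illustration:

# Hypothetical row-oriented records.
rows = [
    {"user_id": 1, "country": "US", "clicks": 14},
    {"user_id": 2, "country": "DE", "clicks": 3},
    {"user_id": 3, "country": "US", "clicks": 27},
]

# "Shred" the rows: gather each field's values into one contiguous list.
columns = {name: [row[name] for row in rows] for name in rows[0]}
print(columns)
# {'user_id': [1, 2, 3], 'country': ['US', 'DE', 'US'], 'clicks': [14, 3, 27]}

# A scan over one column now touches a single list instead of every record.
total_clicks = sum(columns["clicks"])

# "Assembly" reverses the process to reconstruct the original rows.
rebuilt = [dict(zip(columns, values)) for values in zip(*columns.values())]
assert rebuilt == rows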

Storing data in a columnar format provides benefits such as:

  • More efficient compression, because the values stored together in a column are similar to one another.
  • The uniform data type within each column allows compression techniques tailored to that specific type.
  • Queries that search specific column values need not read an entire row's data, making searches faster (see the sketch after this list).
  • Different encodings can be used per column, so better compression schemes can be adopted as they are developed.
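The following sketch illustrates the column-pruning point using the pyarrow library, one of several Parquet implementations. The file name, column names and data are hypothetical; it writes a small Parquet file with per-column compression and then reads back a single column without touching the rest:

import pyarrow as pa
import pyarrow.parquet as pq

# Build a small in-memory table (hypothetical example data).
table = pa.table({
    "user_id": [1, 2, 3],
    "country": ["US", "DE", "US"],
    "clicks": [14, 3, 27],
})

# Compression and encoding are applied per column chunk; Snappy is a
# common choice for Parquet files.
pq.write_table(table, "events.parquet", compression="snappy")

# Column pruning: only the 'clicks' column is read from the file.
clicks = pq.read_table("events.parquet", columns=["clicks"])
print(clicks.to_pydict())  # {'clicks': [14, 3, 27]}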

Parquet is built using the Apache Thrift framework, which increases its flexibility and allows it to work with languages such as C++, Java and Python.
Parquet is compatible with the majority of data processing frameworks in the Hadoop ecosystem. Other columnar storage file formats include RCFile and its successor ORC, the Optimized Row Columnar format.
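As a small illustration of that interoperability, the file written in the sketch above can also be read from pandas, which delegates Parquet I/O to an underlying engine such as pyarrow. This assumes pyarrow is installed and the hypothetical events.parquet file from the earlier example exists:

import pandas as pd

# pandas uses pyarrow (or fastparquet) as its Parquet engine.
df = pd.read_parquet("events.parquet", columns=["user_id", "clicks"])
print(df)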

Parquet is a top-level project sponsored by the Apache Software Foundation (ASF). The project originated as a joint effort of Twitter and Cloudera.

This was last updated in December 2017
