
Apache Parquet

Contributor(s): Matthew Haughn

Apache Parquet is a column-oriented storage format for Hadoop. Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. Parquet is optimized to work with complex data in bulk and includes methods for efficient data compression and encoding.

Typically, data is stored in a row-oriented fashion. Even in databases, data is conventionally stored this way, optimized for working with one record at a time. Parquet uses a record-shredding and assembly algorithm to break records down so that the values of each column are physically stored in contiguous locations. Storing data column by column in this serialized form allows for efficient searches across massive data sets. Since Hadoop is made for big data, columnar storage is a complementary technology.

Storing data in a columnar format provides benefits such as:

  • More efficient compression, since the columnar layout saves space.
  • The similarity of values within a column allows compression tailored to that column's data type.
  • Queries that search specific column values need not read each row's full data, making searches faster.
  • A different encoding can be chosen per column, so better encodings can be adopted as they are developed.

Parquet's use of the Apache Thrift framework increases flexibility, allowing it to be worked with from C++, Java and Python.
Parquet is compatible with the majority of data processing frameworks in Hadoop. Other columnar storage file formats include RCFile and its successor ORC (Optimized Row Columnar).

Parquet is a top-level project sponsored by the Apache Software Foundation (ASF). The project originated as a joint effort of Twitter and Cloudera.

This was last updated in December 2017

