
Apache Parquet

Apache Parquet is a column-oriented storage format for Hadoop. Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. Parquet is optimized to work with complex data in bulk and offers efficient data compression and encoding schemes.

Typically, data is stored in a row-oriented fashion. Even in databases, data is conventionally stored this way and is optimized for working with one record at a time. Parquet instead uses a record-shredding and assembly algorithm: records are shredded into columns when a file is written, so the values of each column are stored physically next to one another, and are reassembled into records when the file is read. Storing data by column in this way allows for efficient scans across massive data sets, because a query only has to read the columns it needs. Since Hadoop is made for big data, columnar storage is a complementary technology.
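As a rough illustration of that layout, the short sketch below uses the pyarrow library (not mentioned in this article; the file and column names are invented for the example) to write row-like Python data into a Parquet file and read it back column by column.

    # A minimal sketch, assuming the pyarrow library: rows go in,
    # but the file stores and returns data one column at a time.
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Build a small table from row-like Python data (example data only).
    table = pa.table({
        "user_id": [1, 2, 3, 4],
        "country": ["US", "DE", "US", "FR"],
        "clicks":  [10, 3, 7, 12],
    })

    # Writing "shreds" the records so each column is stored contiguously.
    pq.write_table(table, "events.parquet")

    # Reading back returns columnar arrays rather than individual rows.
    result = pq.read_table("events.parquet")
    print(result.column("country"))  # all country values, stored together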

Storing data in a columnar format provides benefits such as:

  • Compression is more efficient because of the space saved by the columnar layout.
  • The likeness of values within a column enables compression schemes suited to that column's data type.
  • Queries that search specific column values do not need to read the rest of each row's data, making scans faster.
  • A different encoding can be chosen per column, so better compression schemes can be adopted as they are developed (see the sketch after this list).
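The following hedged sketch (again using pyarrow, with the file name carried over from the example above) shows two of these benefits in practice: a query that needs only two columns skips the others entirely, and each column chunk records its own encoding and compression codec.

    # Illustrative only; "events.parquet" is the file from the earlier sketch.
    import pyarrow.parquet as pq

    # Read only the columns the query touches; other columns are never scanned.
    subset = pq.read_table("events.parquet", columns=["user_id", "clicks"])
    print(subset.num_columns)  # 2

    # Per-column metadata shows the encoding and codec chosen for each column.
    meta = pq.ParquetFile("events.parquet").metadata
    col = meta.row_group(0).column(1)
    print(col.path_in_schema, col.encodings, col.compression)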

Parquet defines its file metadata with the Apache Thrift framework, which increases flexibility and allows the format to be used from languages such as C++, Java and Python.
Parquet is compatible with the majority of data processing frameworks in Hadoop. Other columnar storage file formats include RCFile and Optimized Row Columnar (ORC).
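As one hedged example of that framework compatibility, Apache Spark (a processing engine commonly run in the Hadoop ecosystem) can read the same file natively; the path below is carried over from the earlier sketches.

    # Illustrative PySpark usage; assumes a local Spark installation.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("parquet-demo").getOrCreate()

    # Spark reads Parquet without extra configuration and can push
    # column selection and filters down into the file format.
    df = spark.read.parquet("events.parquet")
    df.select("country", "clicks").filter(df.clicks > 5).show()

    spark.stop()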

Parquet is a top-level project sponsored by the Apache Software Foundation (ASF). The project originated as a joint effort of Twitter and Cloudera.

This was last updated in December 2017
