
data ingestion

Contributor(s): Stan Gibilisco

Data ingestion is the process of obtaining and importing data for immediate use or storage in a database. To ingest something is to "take something in or absorb something." 

Data can be streamed in real time or ingested in batches. With real-time ingestion, each data item is imported as soon as the source emits it. With batch ingestion, data items are imported in discrete chunks at periodic intervals. An effective data ingestion process begins by prioritizing data sources, validating individual files and routing data items to the correct destination.
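The difference between the two modes can be sketched in a few lines. This is a minimal illustration, not any particular tool's API; the source list, sinks and batch size are hypothetical stand-ins.

```python
def stream_ingest(source, sink):
    """Streaming ingestion: import each item as the source emits it."""
    for item in source:
        sink.append(item)          # each record lands immediately

def batch_ingest(source, sink, batch_size=3):
    """Batch ingestion: buffer items, then import them in discrete chunks."""
    buffer = []
    for item in source:
        buffer.append(item)
        if len(buffer) >= batch_size:
            sink.extend(buffer)    # one bulk write per chunk
            buffer.clear()
    if buffer:                     # flush the final partial chunk
        sink.extend(buffer)

events = [{"id": i} for i in range(7)]
streamed, batched = [], []
stream_ingest(events, streamed)
batch_ingest(events, batched)
# both sinks end up with the same records; only the write pattern differs
```

In practice the trade-off is latency versus overhead: streaming delivers each record immediately, while batching amortizes the cost of each write over a whole chunk.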

When numerous big data sources exist in diverse formats (the sources may often number in the hundreds and the formats in the dozens), it can be challenging for businesses to ingest data at a reasonable speed and process it efficiently in order to maintain a competitive advantage. To that end, vendors offer software programs that are tailored to specific computing environments or software applications. When data ingestion is automated, the software used to carry out the process may also include data preparation features to structure and organize data so it can be analyzed on the fly or at a later time by business intelligence (BI) and business analytics (BA) programs. 
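The validate-and-route step that such software automates can be sketched as follows. Everything here (the route table, field names and quarantine list) is a hypothetical illustration of the idea, not a specific vendor's interface.

```python
ROUTES = {"sales": [], "web": []}   # one destination per known source (hypothetical)
rejects = []                        # quarantine for records that fail validation

def is_valid(record):
    """Minimal validation: a dict carrying both 'id' and 'source' fields."""
    return isinstance(record, dict) and "id" in record and "source" in record

def ingest(records):
    """Route each valid record to the destination for its source; quarantine the rest."""
    for rec in records:
        if is_valid(rec) and rec["source"] in ROUTES:
            ROUTES[rec["source"]].append(rec)
        else:
            rejects.append(rec)

ingest([{"id": 1, "source": "sales"},
        {"id": 2, "source": "web"},
        {"source": "sales"}])       # missing 'id' -> quarantined
```

A real pipeline would add schema checks, retries and monitoring on top of this skeleton, but the prioritize/validate/route shape stays the same before the data reaches BI or BA tools.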

This was last updated in May 2016


Join the conversation

2 comments


How can we measure the Data Ingestion Velocity? We would need to predict the future ingestion rate based on past history. What should be the factors we need to consider for measuring ingestion?
From a technical communication/documentation standpoint, why are we using a biological term to describe a data processing concept? How is "ingestion" superior to existing IT terms, "import" and "process"? Ingestion just sounds like an unclear piece of jargon to me.

