
streaming data architecture

Contributor(s): Kostas Tzoumas

A streaming data architecture is an information technology framework that focuses on processing data in motion and treats extract-transform-load (ETL) batch processing as just one more event in a continuous stream of events. This type of architecture has three basic components -- an aggregator that gathers event streams and batch files from a variety of data sources, a broker that makes data available for consumption, and an analytics engine that analyzes the data, correlates values and blends streams together.
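
The pipeline below is a minimal, self-contained sketch of those three components, assuming an in-process queue as a stand-in for a production broker such as Apache Kafka. The function and variable names are illustrative, not part of any product's API.

```python
import queue
import threading
import time

broker = queue.Queue()  # the broker: holds events until a consumer reads them

def aggregate():
    """The aggregator: gathers events from sources and publishes them."""
    for i in range(5):
        event = {"source": "sensor-1", "seq": i, "value": i * 1.5}
        broker.put(event)
        time.sleep(0.1)
    broker.put(None)  # sentinel marking the end of this demo stream

def analyze():
    """The analytics engine: consumes events and correlates values."""
    running_total = 0.0
    while True:
        event = broker.get()
        if event is None:
            break
        running_total += event["value"]
        print(f"event {event['seq']}: value={event['value']}, total={running_total}")

producer = threading.Thread(target=aggregate)
consumer = threading.Thread(target=analyze)
producer.start(); consumer.start()
producer.join(); consumer.join()
```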

The system that receives and sends data streams and executes the application and real-time analytics logic is called the stream processor. Because a streaming data architecture supports the concept of event sourcing, it reduces the need for developers to create and maintain shared databases. Instead, all changes to an application’s state are stored as a sequence of events that can be replayed to reconstruct or query that state when necessary. Upon receiving an event, the stream processor reacts in real time or near-real time and triggers an action, such as remembering the event for future reference.
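
As a rough illustration of event sourcing, the sketch below records every change to a hypothetical account balance as an event and derives the current state by replaying the log. The event types and field names are assumptions made for the example, not a standard schema.

```python
event_log = []  # the append-only sequence of state changes

def record(event_type, amount):
    """Append an event; this is the only way state ever changes."""
    event_log.append({"type": event_type, "amount": amount})

def current_balance():
    """Reconstruct state on demand by replaying every event in order."""
    balance = 0
    for event in event_log:
        if event["type"] == "deposit":
            balance += event["amount"]
        elif event["type"] == "withdrawal":
            balance -= event["amount"]
    return balance

record("deposit", 100)
record("withdrawal", 30)
record("deposit", 50)
print(current_balance())  # 120 -- derived by replay, never stored directly
```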

The growing popularity of streaming data architectures reflects a shift in the development of services and products from a monolithic architecture to a decentralized one built with microservices. This type of architecture is usually more flexible and scalable than a classic database-centric application architecture because it co-locates data processing with storage to lower application response times (latency) and improve throughput. Another advantage of using a streaming data architecture is that it takes the time at which an event occurs into account, which makes it easier for an application’s state and processing to be partitioned and distributed across many instances.
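
The sketch below illustrates why keys and event time make that partitioning straightforward, assuming a hypothetical pool of four instances, CRC32-based routing and 60-second tumbling windows; none of these choices come from a specific platform.

```python
import zlib

NUM_INSTANCES = 4
WINDOW_SECONDS = 60

def route(event):
    """Pick the instance responsible for this event's key (stable hash)."""
    return zlib.crc32(event["key"].encode()) % NUM_INSTANCES

def window_of(event):
    """Assign the event to a tumbling window based on when it occurred,
    not on when it arrived."""
    return event["event_time"] // WINDOW_SECONDS

events = [
    {"key": "user-a", "event_time": 5,  "value": 1},
    {"key": "user-b", "event_time": 62, "value": 2},
    {"key": "user-a", "event_time": 64, "value": 3},
]

# Per-instance, per-(key, window) counters: each instance owns a disjoint
# slice of the state, so the instances could run on separate machines.
state = [{} for _ in range(NUM_INSTANCES)]
for e in events:
    slot = (e["key"], window_of(e))
    inst = route(e)
    state[inst][slot] = state[inst].get(slot, 0) + e["value"]

for i, s in enumerate(state):
    if s:
        print(f"instance {i}: {s}")
```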

Streaming data architectures enable developers to build applications that use both bounded and unbounded data in new ways. For example, Alibaba’s search infrastructure team uses a streaming data architecture powered by Apache Flink to update product detail and inventory information in real time. Netflix also uses Flink to support its recommendation engines, and ING, the global bank based in the Netherlands, uses the architecture to prevent identity theft and provide better fraud protection. Other platforms that can accommodate both stream and batch processing include Apache Spark, Apache Storm, Google Cloud Dataflow and Amazon Kinesis.
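
As a small illustration of the bounded/unbounded point, the same running-average logic below consumes both a finite batch and a never-ending generator; the sources and numbers are made up for the example.

```python
import itertools

def running_average(events):
    """Yield the running average after each event -- works for any iterable,
    finite or infinite."""
    total, count = 0.0, 0
    for value in events:
        total += value
        count += 1
        yield total / count

batch = [10, 20, 30]                          # bounded data: a finished file
stream = (i % 7 for i in itertools.count())   # unbounded data: never ends

print(list(running_average(batch)))           # consume the batch all at once
for avg in itertools.islice(running_average(stream), 5):
    print(round(avg, 2))                      # take just the first five results
```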

This was last updated in October 2018
