
Apache Kafka

Contributor(s): Matthew Haughn

Apache Kafka is a distributed publish-subscribe messaging system designed to replace traditional message brokers.
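The publish-subscribe model Kafka implements can be illustrated with a minimal in-memory sketch (plain Python, not the Kafka client API): producers append records to a named topic, and any number of independent consumers read from it at their own position.

```python
from collections import defaultdict

class MiniBroker:
    """Toy in-memory publish-subscribe broker -- an illustration of the
    concept only, not Kafka itself."""

    def __init__(self):
        # topic name -> append-only list of messages
        self.topics = defaultdict(list)

    def publish(self, topic, message):
        self.topics[topic].append(message)

    def consume(self, topic, offset=0):
        # Each consumer tracks its own offset, so consumers read
        # independently without removing messages from the topic.
        return self.topics[topic][offset:]

broker = MiniBroker()
broker.publish("page-views", {"user": "alice", "page": "/home"})
broker.publish("page-views", {"user": "bob", "page": "/docs"})

# Two consumers read the same topic independently.
analytics = broker.consume("page-views")            # all messages
monitor = broker.consume("page-views", offset=1)    # only the newest
```

Because the broker retains messages rather than handing each one to a single recipient, publishers and subscribers never need to know about each other, which is the decoupling property the next paragraph describes.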

Kafka was originally created and developed at LinkedIn and open sourced in 2011. It is now developed by the Apache Software Foundation to exploit new data infrastructures made possible by massively parallel commodity clusters.

Message brokers are a type of middleware that translates messages from the sender's messaging protocol into one the receiver understands. Message brokers can also be used to decouple data streams from processing and to buffer unsent messages. Apache Kafka improves on traditional message brokers through higher throughput, built-in partitioning, replication, lower latency and greater reliability.
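Kafka's built-in partitioning assigns each keyed message to one of a topic's partitions by hashing its key, so messages with the same key stay ordered while load spreads across partitions. A hedged sketch of the idea (Kafka's default partitioner hashes the key bytes with murmur2; CRC32 stands in here purely for a deterministic illustration):

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # Hash the key bytes and take the result modulo the partition count,
    # the same shape of computation Kafka's default partitioner performs.
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Every message with key "user-42" lands in the same partition,
# preserving per-key ordering.
p1 = partition_for("user-42", 6)
p2 = partition_for("user-42", 6)
```

Replication then copies each partition to several brokers, so the loss of one machine does not lose the partition's data.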

Kafka can be used for a number of purposes: messaging; real-time website activity tracking; monitoring operational metrics of distributed applications; aggregating logs from numerous servers; event sourcing, where state changes in a database are logged in order; and maintaining commit logs, which distributed systems use to synchronize data and restore data from failed systems.
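The event-sourcing use listed above can be sketched in a few lines: rather than storing current state directly, the system stores an ordered log of changes and derives state by replaying it (a toy illustration with hypothetical account data, not a Kafka API):

```python
# Event sourcing: state is never stored directly; it is derived by
# replaying an ordered, append-only log of state changes.
events = [
    {"account": "a1", "change": +100},
    {"account": "a1", "change": -30},
    {"account": "a2", "change": +50},
]

def replay(log):
    """Fold the event log into current per-account balances."""
    balances = {}
    for e in log:
        balances[e["account"]] = balances.get(e["account"], 0) + e["change"]
    return balances

print(replay(events))  # {'a1': 70, 'a2': 50}
```

Because the log is the source of truth, a failed system can rebuild its state simply by replaying the log from the beginning, which is also how the commit-log and restore uses above work.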

This was last updated in November 2014
