
data-driven disaster

Contributor(s): Ivy Wigmore

A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.

According to the Data Warehousing Institute, data quality problems cost businesses in the United States over $600 billion a year. In addition to the financial burden, problems with data quality and analysis can have a serious impact on security, compliance, project management and human resource management (HRM), among other areas.

Errors can creep into data analytics at any stage. The data itself may be inadequate in the first place: incomplete, inaccurate, out of date, or simply not a reliable indicator of what it is intended to represent. Analysis and interpretation are prone to their own pitfalls. Confounding factors may go unrecognized, and the mathematical or statistical method may be flawed or inappropriate for the data. Correlation may be mistaken for evidence of causation. Statistical significance may be claimed when the data doesn't support it. Even when the data and the analytic processes are valid, results may be deliberately presented in a misleading manner to support an agenda.
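To illustrate one of those pitfalls, the minimal sketch below (in Python, using numpy and scipy, both assumed to be available; all variable names are illustrative) shows how apparently "significant" correlations emerge purely by chance when many unrelated series are compared against a single metric:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(seed=1)
target = rng.normal(size=100)            # the business metric we want to "explain"
candidates = rng.normal(size=(50, 100))  # 50 unrelated random series

# Count how many unrelated series clear a conventional 5% significance threshold.
spurious = []
for i, series in enumerate(candidates):
    r, p = pearsonr(target, series)
    if p < 0.05:
        spurious.append((i, round(r, 2), round(p, 3)))

print(f"{len(spurious)} of 50 unrelated series look 'significant' at p < 0.05")
print(spurious)
```

With a 5% threshold, two or three of the 50 random series will typically clear the bar even though none of them has any real relationship to the target -- exactly the kind of result that can be mistaken for a causal finding.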

In a broader context, flaws in data-driven processes have been blamed for real disasters such as the explosion of the space shuttle Challenger in 1986 and the shooting down of an Iranian Airbus by the USS Vincennes in 1988.

As businesses deal with huge increases in the amount of data collected -- sometimes referred to as big data -- there is a corresponding increase in the trend toward data-driven decision management (DDDM). Problems arise when insufficient resources are applied to data processes and too much confidence is placed in their validity. To prevent data-driven disasters, it's crucial to continually examine data quality and analytic processes, and to pay attention to common sense and even intuition. When the data seems to indicate something that doesn't make logical sense or just seems wrong, it's time to reexamine the source data and the methods of analysis. A few routine checks, as sketched below, can catch many quality problems before they feed a decision.
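A minimal sketch of such routine checks, assuming the data arrives as a pandas DataFrame (the column names "amount" and "recorded_at" and the sample records are purely illustrative):

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Return simple completeness, duplication and sanity indicators."""
    return {
        "rows": len(df),
        "missing_amount_pct": round(df["amount"].isna().mean() * 100, 1),
        "negative_amounts": int((df["amount"] < 0).sum()),
        "duplicate_rows": int(df.duplicated().sum()),
        "newest_record": df["recorded_at"].max(),
    }

# Illustrative records; in practice df would come from the source system under review.
df = pd.DataFrame({
    "amount": [120.0, None, 87.5, -4.0, 87.5],
    "recorded_at": pd.to_datetime(
        ["2013-01-02", "2013-01-02", "2012-12-30", "2012-11-15", "2012-12-30"]
    ),
})
print(quality_report(df))
```

Checks like these don't validate the analysis itself, but they flag the incomplete, stale or implausible source data that is a common starting point for a data-driven disaster.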

This was last updated in January 2013

