
data-driven disaster

Contributor(s): Ivy Wigmore

A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.

According to the Data Warehousing Institute, data quality problems cost businesses in the United States over $600 billion a year. Beyond the financial burden, problems with data quality and analysis can seriously affect security, compliance, project management and human resource management (HRM), among other areas.

Errors can creep into data analytics at any stage. The source data may be inadequate to begin with: incomplete, inaccurate, out of date, or simply not a reliable indicator of what it is intended to represent. Analysis and interpretation are prone to similar pitfalls. Confounding factors may go unaccounted for, or the mathematical method may be flawed or inappropriate for the question at hand. Correlation may be erroneously taken to suggest causation. Statistical significance may be claimed when the data doesn't support it. And even when the data and the analytic process are valid, results may be deliberately presented in a misleading way to support an agenda.
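The correlation-versus-causation pitfall is easy to reproduce. The following is a minimal sketch using entirely made-up data (the variable names and coefficients are illustrative assumptions, not real statistics): two quantities that have no causal link appear strongly correlated because both are driven by a hidden confounder, and the apparent relationship disappears once the confounder is controlled for.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example: ice cream sales and drownings both rise with
# temperature, so they correlate with each other despite no causal link.
temperature = rng.normal(75, 10, 1000)                # the hidden confounder
ice_cream = 2.0 * temperature + rng.normal(0, 5, 1000)
drownings = 0.1 * temperature + rng.normal(0, 0.5, 1000)

# Naive analysis: the two outcomes appear tightly linked.
r_naive = np.corrcoef(ice_cream, drownings)[0, 1]
print(f"ice cream vs. drownings: r = {r_naive:.3f}")

def residuals(y, x):
    # Regress x out of y with a simple linear fit; return what's left over.
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Controlling for the confounder: correlate the residuals after removing
# temperature's influence from both series -- the relationship vanishes.
r_partial = np.corrcoef(residuals(ice_cream, temperature),
                        residuals(drownings, temperature))[0, 1]
print(f"after controlling for temperature: r = {r_partial:.3f}")

An analyst who stopped at the first number might conclude that one variable drives the other; the partial correlation shows the data never supported that reading.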

In a broader context, flaws in data-driven processes have contributed to real disasters, such as the explosion of the space shuttle Challenger in 1986 and the shooting down of an Iranian Airbus by the USS Vincennes in 1988.

As businesses deal with huge increases in the amount of data they collect -- sometimes referred to as big data -- there is a corresponding rise in data-driven decision management (DDDM). Problems arise when insufficient resources are devoted to data processes and too much confidence is placed in their validity. To prevent data-driven disasters, it's crucial to continually examine data quality and analytic processes, and to pay attention to common sense and even intuition: when the data seems to indicate something that doesn't make logical sense or just seems wrong, it's time to reexamine the source data and the methods of analysis.
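Continual examination of data quality can be partly automated. Below is a minimal sketch of that idea; the column names, thresholds, and checks are all hypothetical assumptions chosen for illustration, not an established standard, but they capture the kinds of defects (missing values, stale records, duplicates) that commonly precede a data-driven disaster.

import pandas as pd

def sanity_check(df: pd.DataFrame, timestamp_col: str,
                 max_missing: float = 0.05, max_age_days: int = 30):
    """Return a list of data-quality warnings for a DataFrame."""
    problems = []

    # Completeness: flag columns with a high share of missing values.
    for col, frac in df.isna().mean().items():
        if frac > max_missing:
            problems.append(f"{col}: {frac:.0%} missing")

    # Currency: stale data may no longer represent what it's meant to.
    age = pd.Timestamp.now() - pd.to_datetime(df[timestamp_col]).max()
    if age.days > max_age_days:
        problems.append(f"newest record is {age.days} days old")

    # Duplicates: repeated rows inflate the apparent sample size.
    dupes = df.duplicated().sum()
    if dupes:
        problems.append(f"{dupes} duplicate rows")

    return problems

# Usage with a toy dataset: surface the warnings before acting on analysis.
df = pd.DataFrame({"revenue": [100, None, 120],
                   "updated_at": ["2013-01-02", "2013-01-05", "2013-01-07"]})
for issue in sanity_check(df, "updated_at"):
    print("WARNING:", issue)

Checks like these don't replace common sense or domain intuition, but they make it harder for obviously flawed inputs to flow silently into a decision.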

This was last updated in January 2013

