Definition

data-driven disaster

Contributor(s): Ivy Wigmore

A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.

According to the Data Warehousing Institute, data quality problems cost businesses in the United States over $600 billion a year. Beyond the financial burden, problems with data quality and analysis can have a serious impact on security, compliance, project management and human resource management (HRM), among other areas.

Error can creep into data analytics at any stage. The data itself may be inadequate in the first place: incomplete, inaccurate, out of date, or simply not a reliable indicator of what it is intended to represent. Analysis and interpretation are prone to their own pitfalls. Confounding factors may go unaccounted for, or the mathematical method may be flawed or inappropriate for the data. Correlation can be mistaken for evidence of causation. Statistical significance may be claimed when the data doesn't support it. And even when the data and the analytic processes are valid, results may be deliberately presented in a misleading manner to support an agenda.
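To make one of these pitfalls concrete, the short Python sketch below shows how testing many unrelated variables against a single metric reliably produces "statistically significant" correlations by chance alone. The variable counts and the 0.05 threshold are illustrative assumptions for this example, not anything prescribed above.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_samples, n_variables = 100, 50

# The metric we want to "explain" and the candidate predictors are all
# independent random noise -- no real relationship exists in this data.
target = rng.normal(size=n_samples)
candidates = rng.normal(size=(n_variables, n_samples))

for i, candidate in enumerate(candidates):
    r, p = pearsonr(candidate, target)
    if p < 0.05:  # the conventional significance threshold (assumed here)
        print(f"variable {i}: r={r:+.2f}, p={p:.3f}  <- spurious 'finding'")

# With 50 independent tests at the 0.05 level, two or three spurious hits
# are expected on average, even though every variable is pure noise.
```

Reporting only the handful of variables that cleared the threshold, without mentioning the dozens that were tested, is exactly the kind of misattributed significance described above.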

In a broader context, flaws in data-driven processes have been blamed for real disasters such as the explosion of the space shuttle Challenger in 1986 and the shooting down of an Iranian Airbus by the USS Vincennes in 1988.

As businesses deal with huge increases in the amount of data collected -- sometimes referred to as big data -- there is a corresponding rise in data-driven decision management (DDDM). Problems arise when insufficient resources are applied to data processes and too much confidence is placed in their validity. To prevent data-driven disasters, it's crucial to continually examine data quality and analytic processes, and to pay attention to common sense and even intuition. When data seems to indicate something that doesn't make logical sense or just seems wrong, it's time to reexamine the source data and the methods of analysis.
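As a rough illustration of what continually examining data quality can look like in practice, here is a minimal Python sketch of an automated sanity check run before data feeds a decision. The input file, the column names (recorded_at, revenue) and the thresholds are hypothetical assumptions, not part of any standard tool.

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> list[str]:
    """Flag common data-quality problems before the data drives a decision."""
    problems = []

    # Completeness: columns with a high share of missing values.
    missing_share = df.isna().mean()
    for column, share in missing_share[missing_share > 0.05].items():
        problems.append(f"{column}: {share:.0%} of values missing")

    # Currency: stale data may no longer represent reality.
    if "recorded_at" in df.columns:
        age = pd.Timestamp.now() - pd.to_datetime(df["recorded_at"]).max()
        if age > pd.Timedelta(days=30):
            problems.append(f"newest record is {age.days} days old")

    # Plausibility: values outside a sane range for the domain.
    if "revenue" in df.columns and (df["revenue"] < 0).any():
        problems.append("negative values in 'revenue'")

    return problems

# Hypothetical usage: surface warnings before trusting any analysis.
frame = pd.read_csv("sales.csv")  # assumed input file
for problem in quality_report(frame):
    print("WARNING:", problem)
```

Checks like these are cheap to run on every data refresh, and they operationalize the common-sense review described above: when a report fails them, the source data and the analysis are reexamined before any decision is made.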

This was last updated in January 2013
