Definition

data-driven disaster

Contributor(s): Ivy Wigmore

A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.

According to the Data Warehousing Institute, data quality problems cost businesses in the United States over $600 billion a year. In addition to the financial burden, problems with data quality and analysis can have a serious impact on security, compliance, project management and human resource management (HRM), among other areas.

Error can creep into data analytics at any stage. The source data may be inadequate in the first place: incomplete, inaccurate, out of date, or simply not a reliable indicator of what it is intended to represent. Data analysis and interpretation are prone to similar pitfalls. Confounding factors can go unaccounted for, and the mathematical method can be flawed or inappropriate for the data. Correlation can be mistaken for causation. Statistical significance may be claimed when the data doesn't support it. Even when the data and analytic processes are valid, results may be deliberately presented in a misleading manner to support an agenda.
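
As a rough illustration of the first class of problems, the following Python sketch (using pandas; the function name, column name and staleness threshold are illustrative assumptions, not part of the original definition) flags incomplete, duplicated or stale data before any analysis runs:

    import pandas as pd

    def basic_quality_report(df: pd.DataFrame, timestamp_col: str,
                             max_age_days: int = 30) -> dict:
        """Flag common data quality problems before any analysis runs."""
        report = {
            # Incomplete data: fraction of missing values per column
            "missing_ratio": df.isna().mean().to_dict(),
            # Duplicate records silently inflate counts and averages
            "duplicate_rows": int(df.duplicated().sum()),
        }
        # Stale data: how old is the most recent record?
        latest = pd.to_datetime(df[timestamp_col]).max()
        age_days = (pd.Timestamp.now() - latest).days
        report["days_since_last_record"] = age_days
        report["data_is_stale"] = age_days > max_age_days
        return report

A report like this doesn't guarantee the data is a reliable indicator of what it is meant to represent, but it catches the mechanical failures cheaply, before they can propagate into analysis and decisions.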

In a broader context, flaws in data-driven processes have contributed to real disasters, such as the explosion of the space shuttle Challenger in 1986 and the shooting down of an Iranian Airbus by the USS Vincennes in 1988.

As businesses deal with huge increases in the amount of data they collect -- sometimes referred to as big data -- there is a corresponding rise in data-driven decision management (DDDM). Problems arise when insufficient resources are applied to data processes and too much confidence is placed in their validity. To prevent data-driven disasters, it's crucial to continually examine data quality and analytic processes, and to pay attention to common sense and even intuition. When data seems to indicate something that doesn't make logical sense or just seems wrong, it's time to reexamine the source data and the methods of analysis.
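
One lightweight way to act on that advice is a plausibility gate between the analysis and the decision: if a result falls outside a range that common sense allows, the decision is held back until the data and method have been re-examined. The sketch below is purely illustrative; the metric, range and function names are assumptions rather than an established technique:

    def passes_sanity_check(metric_name: str, value: float,
                            plausible_low: float, plausible_high: float) -> bool:
        """Return False when a result defies common sense and needs review."""
        if not plausible_low <= value <= plausible_high:
            print(f"WARNING: {metric_name}={value} falls outside the plausible "
                  f"range [{plausible_low}, {plausible_high}]; re-examine the "
                  f"source data and the method of analysis before acting.")
            return False
        return True

    # A forecast of 400% quarterly growth should trigger a review of the data
    # and the model, not an automatic data-driven decision.
    if passes_sanity_check("forecast_growth_pct", 400.0, -50.0, 100.0):
        print("Proceed with the data-driven decision.")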

This was last updated in January 2013
