explainable AI (XAI)

Explainable AI (XAI) is artificial intelligence that is programmed to describe its purpose, rationale and decision-making process in a way that can be understood by the average person. XAI is often discussed in relation to deep learning and plays an important role in the FAT ML model (fairness, accountability and transparency in machine learning).

XAI provides general information about how an AI program makes a decision by disclosing:

  • The program's strengths and weaknesses.
  • The specific criteria the program uses to arrive at a decision (see the sketch after this list).
  • Why a program makes a particular decision as opposed to alternatives.
  • The level of trust that's appropriate for various types of decisions.
  • What types of errors the program is prone to.
  • How errors can be corrected.
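
As a concrete illustration of the second point, the minimal sketch below uses permutation importance to report which input features a trained classifier relies on most. The dataset, the random-forest model and the use of scikit-learn are assumptions made for illustration only; they are not part of the definition and are just one of many possible ways to surface a model's decision criteria.

```python
# A minimal sketch (illustrative only): surfacing which features a trained
# "black box" model relies on, using permutation importance from scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)

# The "black box": an ensemble model whose individual predictions are hard
# to trace by inspection alone.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance estimates each feature's contribution by shuffling
# it and measuring the resulting drop in accuracy -- one simple way to
# disclose "the specific criteria the program uses to arrive at a decision."
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0
)

# Print the five most influential features with mean importance and spread.
for idx in result.importances_mean.argsort()[::-1][:5]:
    print(f"{data.feature_names[idx]}: "
          f"{result.importances_mean[idx]:.3f} "
          f"+/- {result.importances_std[idx]:.3f}")
```

Running the sketch prints the five features with the largest mean importance, a simple form of the "specific criteria" disclosure described above.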

An important goal of XAI is to provide algorithmic accountability. Until recently, AI systems have essentially been black boxes: even when the inputs and outputs are known, the algorithms used to arrive at a decision are often proprietary or not easily understood, even in cases where the inner workings of the program are open source and freely available.

As artificial intelligence becomes more prevalent, it is more important than ever to disclose how bias and questions of trust are being addressed. The EU's General Data Protection Regulation (GDPR), for example, includes a right-to-explanation clause.

This was last updated in July 2018
