cognitive computing

Cognitive computing is the simulation of human thought processes in a computerized model.

Cognitive computing involves self-learning systems that use data mining, pattern recognition and natural language processing to mimic the way the human brain works. The goal of cognitive computing is to create automated IT systems that are capable of solving problems without requiring human assistance.

Cognitive computing systems use machine learning algorithms. Such systems continually acquire knowledge by mining the data fed into them for information. The systems refine the way they look for patterns, as well as the way they process data, so that they become capable of anticipating new problems and modeling possible solutions.
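
This self-learning loop can be sketched in a few lines of Python. The sketch below uses scikit-learn's incremental-learning API (SGDClassifier.partial_fit) to update a model as new batches of data arrive; the toy data stream and the choice of classifier are illustrative assumptions, not part of any particular cognitive computing system.

    # A minimal sketch of the self-learning loop described above.
    # The data stream and model are illustrative assumptions only.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)
    model = SGDClassifier()        # a simple linear classifier
    classes = np.array([0, 1])

    for batch in range(5):
        # Simulate a new batch of observations arriving over time:
        # points in the plane, labeled by which side of a line they fall on.
        X = rng.normal(size=(100, 2))
        y = (X[:, 0] + X[:, 1] > 0).astype(int)

        # partial_fit updates the existing model instead of retraining
        # from scratch -- the system refines how it looks for patterns
        # as each new batch of data is fed in.
        model.partial_fit(X, y, classes=classes)
        print(f"batch {batch}: accuracy on latest data = {model.score(X, y):.2f}")

Each call to partial_fit adjusts the model's parameters without discarding what was learned from earlier batches, which is the basic mechanism behind the continual acquisition of knowledge described above.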

Cognitive computing is used in numerous artificial intelligence (AI) applications, including expert systems, natural language processing, neural networks, robotics and virtual reality. The term cognitive computing is closely associated with IBM’s cognitive computer system, Watson.

See also: big data

This was last updated in April 2014
Posted by: Margaret Rouse

Glossaries

  • Business intelligence - business analytics

    - Terms related to business intelligence, including definitions about business analytics and words and phrases about gathering, storing, analyzing and providing access to business data.

  • Internet applications

    - This WhatIs.com glossary contains terms related to Internet applications, including definitions about Software as a Service (SaaS) delivery models and words and phrases about web sites, e-commerce ...
