
A Timeline of Machine Learning History

Machine learning, an application of artificial intelligence (AI), has some impressive capabilities. Without being explicitly programmed, a machine learning algorithm can seemingly grow "smarter," becoming more accurate at predicting outcomes as it is fed historical data.

The History and Future of Machine Learning

Machine learning was first conceived from the mathematical modeling of neural networks. A paper by logician Walter Pitts and neuroscientist Warren McCulloch, published in 1943, attempted to mathematically map out thought processes and decision making in human cognition.

In 1950, Alan Turing proposed the Turing Test, which became the litmus test for whether a machine was deemed "intelligent" or "unintelligent." To qualify as intelligent, a machine had to be able to convince a human being that it, the machine, was also a human being. Soon after, the 1956 summer research program at Dartmouth College became the official birthplace of AI.

From this point on, "intelligent" machine learning algorithms and computer programs started to appear, doing everything from planning travel routes for salespeople, to playing board games with humans such as checkers and tic-tac-toe.

Intelligent machines went on to do everything from using speech recognition, to learning to pronounce words the way a baby would, to defeating a world chess champion at his own game. The infographic below shows how machine learning grew from mathematical models into sophisticated technology.

[Infographic: A timeline of major events in machine learning history.]
