What is a microchip? - Definition from WhatIs.com
Part of the Microprocessors glossary:

A microchip (sometimes just called a "chip") is a unit of packaged computer circuitry (usually called an integrated circuit) that is manufactured from a material such as silicon at a very small scale. Microchips are made for program logic (logic or microprocessor chips) and for computer memory (memory or RAM chips). Other microchips combine logic and memory on a single chip, and still others serve special purposes such as analog-to-digital conversion, bit slicing, and gateways.
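To make one of those special purposes concrete, the short C sketch below illustrates the quantization step an analog-to-digital converter (ADC) chip performs: mapping a continuous input voltage to an N-bit digital code. The 5.0 V reference voltage and 8-bit width are illustrative assumptions, not details taken from this definition.

    #include <stdio.h>
    #include <math.h>

    /* Quantize an analog voltage into an N-bit digital code, the core
     * operation an ADC chip performs. The reference voltage (vref) and
     * bit width are illustrative assumptions for this sketch. */
    unsigned adc_quantize(double vin, double vref, unsigned bits)
    {
        unsigned max_code = (1u << bits) - 1;   /* e.g. 255 for 8 bits */
        if (vin <= 0.0)  return 0;              /* clamp below range */
        if (vin >= vref) return max_code;       /* clamp above range */
        return (unsigned)lround(vin / vref * max_code);
    }

    int main(void)
    {
        /* An 8-bit ADC with a 5.0 V reference: 1.25 V is a quarter of
         * full scale, so it maps to code 64 (out of 0..255). */
        printf("code = %u\n", adc_quantize(1.25, 5.0, 8));
        return 0;
    }

Compiled with a C99 compiler (link with -lm for lround), this prints "code = 64". A real ADC chip implements the same mapping in hardware, typically via successive approximation or a similar circuit technique.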

This was last updated in September 2005
Posted by: Margaret Rouse

Related Terms

Definitions

  • neurosynaptic chip (cognitive chip)

    - A neurosynaptic chip, also known as a cognitive chip, is a computer processor that functions more like a biological brain than a typical CPU does. (WhatIs.com)

  • von Neumann bottleneck

    - The von Neumann bottleneck is a limitation on throughput caused by the standard personal computer architecture. (WhatIs.com)

  • smart card

    - A smart card is a device, often the size of a credit card, which includes an embedded microcontroller or memory chip and can securely process or store data. (SearchSecurity.com)

Glossaries

  • Microprocessors

    - Terms related to microprocessors, including definitions about silicon chips and words and phrases about computer processors.

  • Internet applications

    - This WhatIs.com glossary contains terms related to Internet applications, including definitions about Software as a Service (SaaS) delivery models and words and phrases about web sites, e-commerce ...
