What is a microchip? - Definition from WhatIs.com
Part of the Microprocessors glossary:

A microchip (often just called a "chip") is a unit of packaged computer circuitry, usually called an integrated circuit, that is manufactured from a material such as silicon at a very small scale. Microchips are made for program logic (logic or microprocessor chips) and for computer memory (memory or RAM chips). Some microchips combine both logic and memory on one die, and others serve special purposes such as analog-to-digital conversion, bit slicing, and gateways.

This was last updated in September 2005
Posted by: Margaret Rouse
