What is a microchip? - Definition from WhatIs.com
Part of the Microprocessors glossary:

A microchip (sometimes just called a "chip") is a unit of packaged computer circuitry, usually called an integrated circuit, that is manufactured at a very small scale from a material such as silicon. Microchips are made for program logic (logic or microprocessor chips), for computer memory (memory or RAM chips), and for combinations of the two. They are also made for special purposes such as analog-to-digital conversion, bit slicing, and gateways.
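One of the special-purpose roles mentioned above, analog-to-digital conversion, can be illustrated with a short sketch: an ADC chip maps a continuous input voltage onto one of 2^n discrete digital codes. The following minimal Python simulation is illustrative only; the 8-bit depth and 5 V reference are assumed example values, not details from this article:

```python
def adc_sample(voltage, v_ref=5.0, bits=8):
    """Quantize an analog voltage (0..v_ref) to an n-bit digital code,
    roughly as a dedicated ADC microchip would."""
    levels = 2 ** bits                    # 256 codes for an 8-bit ADC
    code = int(voltage / v_ref * levels)  # scale into code space, truncate
    return max(0, min(levels - 1, code))  # clamp to the valid code range

# A 2.5 V input on an 8-bit, 5 V-reference ADC lands mid-scale:
print(adc_sample(2.5))  # 128
```

Real ADC chips differ in resolution (8 to 24+ bits), sampling method, and reference voltage, but the core idea is this scale-and-quantize step.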

This was last updated in September 2005
Posted by: Margaret Rouse

Related Terms

Definitions

  • biomimetics (biomimicry)

    - Biomimetics refers to human-made processes, substances, devices, or systems that imitate nature. Biomimetic technologies are also known as biomimicry: mimicry of biological systems. (WhatIs.com)

  • clock gating

    - Clock gating is a power-saving feature in semiconductor microelectronics that enables switching off circuits. Many electronic devices use clock gating to turn off buses, controllers, bridges and ... (WhatIs.com)

  • nanomedicine

    - Nanomedicine is the application of nanotechnology (the engineering of tiny machines) to the prevention and treatment of disease in the human body. (WhatIs.com)

Glossaries

  • Microprocessors

    - Terms related to microprocessors, including definitions about silicon chips and words and phrases about computer processors.

