What is norm? - Definition from WhatIs.com
Part of the Computing fundamentals glossary:

A norm (from Latin norma, "carpenter's square") is a model of what should exist or be followed, or an average of what currently does exist in a given context, such as the average salary among members of a large group.
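In the statistical sense, the norm of a group is simply the arithmetic mean of the observed values. A minimal Python sketch of that calculation (the salary figures are hypothetical, chosen only to illustrate the idea):

    # Statistical "norm" of a group: the arithmetic mean of its observed values.
    # The salary figures below are hypothetical and used only for illustration.
    salaries = [52_000, 61_500, 48_250, 70_000, 58_300]

    # Mean = sum of the values divided by the number of group members.
    average_salary = sum(salaries) / len(salaries)

    print(f"Norm (average) salary: {average_salary:,.2f}")  # prints 58,010.00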

This was last updated in August 2007
Posted by: Margaret Rouse

Related Terms

Definitions

  • hypervisor

    - A hypervisor is a function that abstracts, or isolates, operating systems and applications from the underlying computer hardware. (SearchServerVirtualization.com)

  • von Neumann bottleneck

    - The von Neumann bottleneck is a limitation on throughput caused by the standard personal computer architecture. (WhatIs.com)

  • learning curve

    - A learning curve is a visualization of the estimated difficulty of learning a subject over a period of time, as well as relative progress throughout the learning process. The learning curve provi... (WhatIs.com)

Glossaries

  • Computing fundamentals

    - Terms related to computer fundamentals, including computer hardware definitions and words and phrases about software, operating systems, peripherals and troubleshooting.

  • Internet applications

    - This WhatIs.com glossary contains terms related to Internet applications, including definitions about Software as a Service (SaaS) delivery models and words and phrases about web sites, e-commerce ...
