What is millennium? - Definition from WhatIs.com
Part of the Computing fundamentals glossary:

1) A millennium is a period of one thousand years. It is similar to the terms biennium, a period of two years, and century, a period of one hundred years. The term derives from the Latin mille, meaning thousand, and annum, meaning year.

2) The millennium is also the anniversary or celebration of a 1,000-year period. In the United States, the official timekeeper at the Naval Observatory holds that the second millennium ends and the third begins on January 1, 2001. This reckoning is based on the Gregorian calendar, created in 1582 Anno Domini (A.D.), which has since become a world standard for civil affairs. The Gregorian calendar uses the table of dates for Easter established by the sixth-century scholar Dionysius Exiguus, who marked the modern epoch as beginning on January 1, 1 A.D. Because the count starts at year 1 rather than year 0, the third millennium does not actually begin until January 1, 2001. However, because the most dramatic change in the calendar occurred on January 1, 2000, much of the world celebrated the beginning of the third millennium a year early.
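The year-counting rule above can be sketched in a few lines of code. This is a minimal illustration, not from the source; the function name `millennium` is our own, and it assumes the Dionysian convention that the modern epoch begins at 1 A.D. with no year zero:

```python
def millennium(year):
    """Return the 1-based ordinal millennium for a Gregorian year (A.D.).

    Because the epoch begins at 1 A.D. (there is no year 0), the first
    millennium spans years 1-1000, the second spans 1001-2000, and the
    third begins in 2001.
    """
    if year < 1:
        raise ValueError("A.D. years start at 1; there is no year 0")
    return (year - 1) // 1000 + 1

print(millennium(2000))  # 2 -- the year 2000 still belongs to the second millennium
print(millennium(2001))  # 3 -- the third millennium begins here
```

The `year - 1` shift is what encodes the missing year zero: without it, the year 2000 would incorrectly land in the third millennium, which is exactly the popular confusion the definition describes.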

3) The term is also used more generally to refer to a time when great achievements finally come to pass, great happiness prevails, or some other important objective is reached. In the Book of Revelation, the millennium is a period of a thousand years during which Christ is to rule on Earth.

This was last updated in September 2005
Posted by: Margaret Rouse

