What is timestamp? - Definition from WhatIs.com
Part of the Programming glossary:

A timestamp is the time at which an event is recorded by a computer. Through mechanisms such as the Network Time Protocol (NTP), a computer maintains accurate current time, calibrated to minute fractions of a second. Such precision makes it possible for networked computers and applications to communicate effectively. The timestamp mechanism is used for a wide variety of synchronization purposes, such as assigning a sequence order to the events of a multi-event transaction so that the transaction can be voided if a failure occurs. A timestamp can also record time relative to a particular starting point. In IP telephony, for example, the Real-time Transport Protocol (RTP) assigns sequential timestamps to voice packets so that they can be buffered by the receiver, reassembled, and delivered without error. When writing a program, the programmer is usually given an application program interface (API) through which the operating system supplies a timestamp during program execution.
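To illustrate, here is a minimal sketch in Python of how a program typically obtains timestamps from the operating system and uses them to order events. This is just one example API; other languages and platforms expose equivalent calls.

```python
import time
from datetime import datetime, timezone

# Seconds since the Unix epoch (1970-01-01 00:00:00 UTC), as reported
# by the operating system's clock.
epoch_seconds = time.time()

# The same instant as a human-readable UTC timestamp.
utc_now = datetime.now(timezone.utc)

# Timestamps assign a sequence order to events: a later event
# never receives a smaller value than an earlier one (on a
# well-behaved, non-adjusted clock).
t1 = time.time()
t2 = time.time()
assert t2 >= t1

print(f"Epoch seconds: {epoch_seconds:.6f}")
print(f"UTC timestamp: {utc_now.isoformat()}")
```

A program that logs each step of a transaction with such timestamps can later replay the steps in order or, as described above, identify and void an incomplete transaction after a failure.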

This was last updated in September 2005
Posted by: Margaret Rouse

