
Browse Definitions by Alphabet

DAT - DAT

  • data center outsourcing (DCO) - DCO (data center outsourcing) is the practice of outsourcing the day-to-day provisioning and management of computing and storage resources and environments to a third party provider.
  • Data Center Quizzes - We've gathered a collection of our quizzes to test your data center knowledge.
  • data center resiliency - Data center resiliency is the ability of a server, network, storage system, or an entire data center to continue operating even when there has been an equipment failure, power outage or other disruption.
  • data center services - Data center services is a collective term for the supporting components necessary for the proper operation of a repository for storage, management and dissemination of data organized around a body of knowledge or pertaining to an enterprise.
  • data citizen - A data citizen is an employee who relies on data to make decisions and perform job responsibilities.
  • data classification - Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
  • data cleansing - Data scrubbing, also called data cleansing, is the process of cleaning up data in a database that is incorrect, incomplete, or duplicated (see the example sketch after this list).
  • data co-op - A data co-op is a group organized for sharing pooled data from online consumers between two or more companies.
  • data collection - Data collection is a process for gathering information from different sources.
  • Data Communication Equipment - In data communications, DCE (data communication equipment, also called data circuit-terminating equipment) is a device, such as a modem, that sits between data terminal equipment (DTE) and the transmission circuit and establishes, maintains and terminates the connection.
  • data compression - Data compression is a reduction in the number of bits needed to represent data (see the example sketch after this list).
  • data confabulation - Data confabulation is a business intelligence term for the selective and possibly misleading use of data to support a decision that has already been made.
  • data context - Data context is the network of connections among data points.
  • data corruption - Data corruption is the deterioration of computer data as a result of some external agent.
  • data curation - Data curation is the management of data throughout its lifecycle, from creation and initial storage to the time when it is archived for posterity or becomes obsolete and is deleted.
  • data currency (data as currency) - Data currency is monetary value assigned to data so that it can be used as the unit of exchange in a transaction either as the sole payment or in combination with money.
  • data decryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • Data Decryption Integrated Circuits - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • data deduplication - Deduplication retains one unique data instance to reduce storage and bandwidth consumed by remote backups, replication and disaster recovery.
  • data deduplication - Data deduplication -- often called intelligent compression or single-instance storage -- is a process that eliminates redundant copies of data and reduces storage overhead.
  • data deduplication hardware - Data deduplication hardware is a storage product that eliminates redundant copies of data and retains one instance to be stored.
  • data deduplication ratio - To calculate the deduplication ratio, divide the capacity of backed up data before duplicates are removed by the actual capacity used once the backup is complete (see the worked example after this list).
  • Data Definition Language - Data Definition Language (DDL) is a standard for commands that define the different structures in a database.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is a standard for commands that define the different structures in a database (see the example sketch after this list).
  • data democratization - Data democratization is the ability for information in a digital format to be accessible to the average end user.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it is completely unreadable and cannot be accessed or used for unauthorized purposes.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data discovery platform - A data discovery platform is a complete set of tools for detecting patterns in data, as well as outlier results that fall outside those patterns.
  • data discrimination (data censorship) - Data discrimination, also called discrimination by algorithm, is bias that occurs when predefined data types or data sources are intentionally or unintentionally treated differently than others.
  • data dredging - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships in the data.
  • data dredging (data fishing) - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships in the data.
  • Data Dynamics StorageX - Data Dynamics StorageX is a software suite that specializes in data migration and Microsoft Distributed File System management.
  • data encryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • Data Encryption Standard - The Data Encryption Standard (DES) is an outdated symmetric-key method of data encryption.
  • Data Encryption Standard (DES) - The Data Encryption Standard (DES) is an outdated symmetric-key method of data encryption.
  • data encryption/decryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • data engineer - A data engineer is a worker whose primary job responsibilities involve preparing data for analytical or operational uses.
  • data exfiltration - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exfiltration (data extrusion) - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exhaust - Data exhaust is a byproduct of user actions online and consists of the various files generated by web browsers and their plug-ins, such as cookies, log files and temporary internet files.
  • data exploration - Data exploration is the first step in data analysis and typically involves summarizing the main characteristics of a data set, including its size, accuracy, initial patterns in the data and other attributes.
  • data extrusion - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data federation services - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data federation software - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data federation technology - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data file - In data processing, using an office metaphor, a file is a related collection of records.
  • data fishing - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships in the data.
  • data glove - A data glove is an interactive device, resembling a glove worn on the hand, which facilitates tactile sensing and fine-motion control in robotics and virtual reality.
  • data governance - Data governance (DG) is the overall management of the availability, usability, integrity and security of data used in an enterprise.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data groupie - Droupie (for data groupie) is computer jargon for someone who likes to spend time with people who are more computer literate than they are.
  • data hiding - Data hiding is a characteristic of object-oriented programming.
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data hygiene - Data hygiene is the collective processes conducted to ensure the cleanliness of data.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is digital information that is in the process of being transported between locations within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database; data can be ingested in real time or in batches.
  • data integration - Customer data integration (CDI) is the process of defining, consolidating and managing customer information across an organization's business units and systems to achieve a "single version of the truth" for customer data.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data janitor (data wrangler) - A data janitor is an IT employee that cleans up big data sources to prepare them for data analysts and data scientists.
  • data journalism - Data journalism is an approach to writing for the public in which the journalist analyzes large data sets to identify potential news stories.
  • data key - In cryptography, a data key is a key (a variable value that is applied to a string or block of text to encrypt or decrypt it) that is used to encrypt or decrypt data only and is not used to encrypt or decrypt other keys, as some encryption formulas call for (see the example sketch after this list).
  • data labeling - Data labeling, in the context of machine learning, is the process of detecting and tagging data samples.
  • data lake - A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed.
  • data latency - Data latency is the time it takes for data packets to be stored or retrieved.
  • data life cycle - The data life cycle is the sequence of stages that a particular unit of data goes through from its initial generation or capture to its eventual archival and/or deletion at the end of its useful life.
  • data life cycle management - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data life cycle management (DLM) - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data lifecycle management - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data lineage - Data lineage is the history of data, including where the data has traveled throughout its existence within an organization.
  • data link control - Data link control (DLC) is the service provided by the data link layer of the Open Systems Interconnection (OSI) model to manage the reliable transfer of data over a physical link; DLC is also an abbreviation for digital loop carrier.
  • data link control (DLC) - Data link control (DLC) is the service provided by the data link layer of the Open Systems Interconnection (OSI) model to manage the reliable transfer of data over a physical link; DLC is also an abbreviation for digital loop carrier.
  • data link layer - The Data-Link Layer is the protocol layer in a program that handles the moving of data in and out across a physical link in a network.
  • data literacy - Data literacy is the ability to derive information from data, just as literacy in general is the ability to derive information from the written word.
  • data loss - Data loss is the intentional or unintentional destruction of information, caused by people and/or processes from within or outside of an organization.
  • data loss prevention - Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive or critical information outside of the corporate network.
  • data loss prevention (DLP) - Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive or critical information outside of the corporate network.
  • data management platform (DMP) - A data management platform (DMP), also referred to as a unified data management platform (UDMP), is a centralized system for collecting and analyzing large sets of data originating from disparate sources.
  • data management-as-a-service (DMaaS) - Data Management-as-a-Service (DMaaS) is a type of cloud service that provides protection, governance and intelligence across a company’s various data sources.
  • data marketplace (data market) - A data marketplace (data market) is an online store where people can buy data; data marketplaces typically offer various types of data for different markets and from different sources.
  • data mart - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mashup - An enterprise mashup is the integration of heterogeneous digital data and applications from multiple sources for business purposes.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training (see the example sketch after this list).
  • data migration - Data migration is the process of transferring data between data storage systems, data formats or computer systems.
  • data miner - Data mining is the process of sorting through large data sets to identify patterns and establish relationships to solve problems through data analysis.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and establish relationships to solve problems through data analysis.
  • data modeling - Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
  • data monetization - Data monetization is the act of creating currency from corporate data.
  • Data Over Cable Service Interface Specifications - Now known as CableLabs Certified Cable Modems, DOCSIS (Data Over Cable Service Interface Specifications) is a standard interface for cable modems, the devices that handle incoming and outgoing data signals between a cable TV operator and a personal or business computer or television set.
  • Data Over Cable Systems Interface - Now known as CableLabs Certified Cable Modems, DOCSIS (Data Over Cable Service Interface Specifications) is a standard interface for cable modems, the devices that handle incoming and outgoing data signals between a cable TV operator and a personal or business computer or television set.
  • data plan (mobile data plan) - A data plan (mobile data plan) is an agreement between a mobile carrier and a customer that specifies how much mobile data the customer can use, typically per month, for a specific fee; since the advent of the smartphone made mobile internet possible, most carriers offer data plans at varying rates based on the amount of data transfer allowed before a data cap is imposed.
  • data plane (DP) - The data plane (sometimes known as the user plane, forwarding plane, carrier plane or bearer plane) is the part of a network that carries user traffic.
  • data point - A data point is a discrete unit of information.
  • data portability - Data portability is the ability to move data among different application programs, computing environments or cloud services.
  • data preparation - Data preparation is the process of gathering, combining, structuring and organizing data so it can be analyzed as part of data visualization, analytics and machine learning applications.
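
Data cleansing example. Below is a minimal, self-contained sketch of the data scrubbing steps described in the data cleansing entry above: dropping duplicate and incomplete records and normalizing an inconsistent field. The sample rows and field names are illustrative assumptions, not taken from any particular product.

    rows = [
        {"id": 1, "email": "Alice@Example.com"},
        {"id": 1, "email": "Alice@Example.com"},   # exact duplicate
        {"id": 2, "email": None},                  # incomplete record
        {"id": 3, "email": "bob@example.com "},    # inconsistent formatting
    ]

    seen, clean = set(), []
    for row in rows:
        if not row["email"]:
            continue                               # incomplete: drop the record
        email = row["email"].strip().lower()       # normalize the value
        key = (row["id"], email)
        if key in seen:
            continue                               # duplicate: drop the record
        seen.add(key)
        clean.append({"id": row["id"], "email": email})

    print(clean)
    # [{'id': 1, 'email': 'alice@example.com'}, {'id': 3, 'email': 'bob@example.com'}]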
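
Data compression example. The data compression entry above can be illustrated with Python's built-in zlib module: a repetitive byte string is stored in far fewer bytes (and therefore bits), and decompression restores the original data losslessly. The sample text and repetition count are arbitrary.

    import zlib

    original = b"the quick brown fox " * 100        # highly repetitive sample data
    compressed = zlib.compress(original)

    print(len(original), "bytes before,", len(compressed), "bytes after")
    assert zlib.decompress(compressed) == original  # lossless: the original comes back intact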
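
Data deduplication ratio example. The entry above states the formula directly: the capacity of the backed up data before duplicates are removed, divided by the capacity actually used. The sketch below applies it to simple block-level deduplication using SHA-256 chunk fingerprints; the 4 KB chunk size and the sample backup are illustrative assumptions.

    import hashlib

    def dedupe_ratio(data: bytes, chunk_size: int = 4096) -> float:
        chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
        unique = {hashlib.sha256(c).hexdigest(): c for c in chunks}
        logical = sum(len(c) for c in chunks)            # capacity before dedup
        physical = sum(len(c) for c in unique.values())  # capacity actually stored
        return logical / physical

    backup = b"A" * 4096 * 9 + b"B" * 4096   # 10 blocks, only 2 of them unique
    print(f"{dedupe_ratio(backup):.0f}:1")   # prints 5:1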
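
Data Definition Language example. To make the DDL entry concrete, the sketch below runs three common DDL statements (CREATE TABLE, ALTER TABLE, DROP TABLE) through Python's built-in sqlite3 module against an in-memory database; the table and column names are illustrative.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT)")  # define a structure
    conn.execute("ALTER TABLE employee ADD COLUMN hire_date TEXT")             # change the structure
    conn.execute("DROP TABLE employee")                                        # remove the structure
    conn.close()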
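
Data key example. The data key entry above separates keys that encrypt data from keys that encrypt other keys. The sketch below illustrates that split, assuming the third-party cryptography package is installed (pip install cryptography): a data key encrypts the payload, while a separate key-encrypting key only wraps the data key.

    from cryptography.fernet import Fernet

    master_key = Fernet.generate_key()   # key-encrypting key: never touches the data itself
    data_key = Fernet.generate_key()     # data key: used to encrypt and decrypt the data only

    ciphertext = Fernet(data_key).encrypt(b"payroll record")  # data encrypted with the data key
    wrapped_key = Fernet(master_key).encrypt(data_key)        # data key wrapped by the master key

    # To read the data back, unwrap the data key, then decrypt the payload with it.
    recovered_key = Fernet(master_key).decrypt(wrapped_key)
    print(Fernet(recovered_key).decrypt(ciphertext))          # b'payroll record'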
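
Data masking example. Finally, a minimal sketch of the data masking entry above: produce a structurally similar but inauthentic copy of a record that remains usable for testing or training. The record, field names and masking rules are illustrative assumptions.

    import re

    def mask_record(record: dict) -> dict:
        masked = dict(record)
        user, _, domain = record["email"].partition("@")
        masked["email"] = user[0] + "***@" + domain        # keep the format, hide the value
        masked["ssn"] = re.sub(r"\d", "X", record["ssn"])  # replace every digit
        return masked

    print(mask_record({"email": "alice@example.com", "ssn": "123-45-6789"}))
    # {'email': 'a***@example.com', 'ssn': 'XXX-XX-XXXX'}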
