Browse Definitions by Alphabet

DAT - DAT

  • data currency (data as currency) - Data currency is the monetary value assigned to data so that it can be used as a unit of exchange in a transaction, either as the sole payment or in combination with money.
  • data decryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • Data Decryption Integrated Circuits - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • data deduplication - Data deduplication -- often called intelligent compression or single-instance storage -- is a process that eliminates redundant copies of data and reduces storage overhead (a minimal sketch appears after this list).
  • data deduplication hardware - Data deduplication hardware is disk storage that eliminates redundant copies of data and retains one instance to be stored.
  • data deduplication ratio - The data deduplication ratio measures how much deduplication shrinks stored data: divide the capacity of the backed-up data before duplicates are removed by the actual capacity used once the backup is complete. For example, 10 TB of backup data that occupies 2 TB after deduplication yields a 5:1 ratio.
  • Data Definition Language - Data Definition Language (DDL) is a standard for commands that define the different structures in a database.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is a standard for commands that define the different structures in a database (a short example appears after this list).
  • data democratization - Data democratization is the practice of making information in a digital format accessible to the average end user.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it is completely unreadable and cannot be accessed or used for unauthorized purposes.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data discovery platform - A data discovery platform is a complete set of tools for detecting patterns in data, as well as the outlier results that fall outside those patterns.
  • data discrimination (data censorship) - Data discrimination, also called discrimination by algorithm, is bias that occurs when predefined data types or data sources are intentionally or unintentionally treated differently than others.
  • data dredging - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships within the data.
  • data dredging (data fishing) - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships within the data.
  • Data Dynamics StorageX - Data Dynamics StorageX is a software suite that specializes in data migration and Microsoft Distributed File System management.
  • data encryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • Data Encryption Standard - Data Encryption Standard (DES) is an outdated symmetric key method of data encryption.
  • Data Encryption Standard (DES) - Data Encryption Standard (DES) is an outdated symmetric key method of data encryption.
  • data engineer - A data engineer is an IT worker whose primary job is to prepare data for analytical or operational uses.
  • data exfiltration - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exfiltration (data extrusion) - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exhaust - Data exhaust is a byproduct of user actions online and consists of the various files generated by web browsers and their plug-ins, such as cookies, log files and temporary internet files.
  • data exploration - Data exploration is the first step in data analysis, in which data visualization tools and statistical techniques are used to uncover data set characteristics and initial patterns.
  • data extrusion - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data fabric - A data fabric is an architecture and software offering a unified collection of data assets, databases and database architectures within an enterprise.
  • data federation services - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data federation software - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data federation technology - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data file - In data processing, using an office metaphor, a file is a related collection of records.
  • data fishing - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships within the data.
  • data flow diagram (DFD) - A data flow diagram (DFD) is a graphical or visual representation using a standardized set of symbols and notations to describe a business's operations through data movement.
  • data glove - A data glove is an interactive device, resembling a glove worn on the hand, which facilitates tactile sensing and fine-motion control in robotics and virtual reality.
  • data governance - Data governance (DG) is the process of managing the availability, usability, integrity and security of the data in enterprise systems, based on internal data standards and policies that also control data usage.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data hygiene - Data hygiene is the collective processes conducted to ensure the cleanliness of data.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is digital information that is in the process of being transported between locations within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database; data can be ingested in real time or in batches.
  • data integration - Customer data integration (CDI) is the process of defining, consolidating and managing customer information across an organization's business units and systems to achieve a "single version of the truth" for customer data.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data janitor (data wrangler) - A data janitor is an IT employee who cleans up big data sources to prepare them for data analysts and data scientists.
  • data journalism - Data journalism is an approach to writing for the public in which the journalist analyzes large data sets to identify potential news stories.
  • data labeling - Data labeling, in the context of machine learning, is the process of detecting and tagging data samples.
  • data lake - A data lake is a storage repository that holds a vast amount of raw data in its native format until it is needed for analytics applications.
  • data lakehouse - A data lakehouse is a data management architecture that combines the benefits of a traditional data warehouse and a data lake.
  • data latency - Data latency is the time it takes for data packets to be stored or retrieved.
  • data life cycle - The data life cycle is the sequence of stages that a particular unit of data goes through from its initial generation or capture to its eventual archival and/or deletion at the end of its useful life.
  • data life cycle management - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data lifecycle management - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data lifecycle management (DLM) - Data lifecycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its lifecycle: from creation and initial storage to when it becomes obsolete and is deleted.
  • data lineage - Data lineage is the history of data, including where the data has traveled throughout its existence within an organization.
  • data link control - Data link control (DLC) is the service provided by the data link layer of the OSI model; DLC is also an abbreviation for digital loop carrier.
  • data link control (DLC) - Data link control (DLC) is the service provided by the data link layer of the OSI model; DLC is also an abbreviation for digital loop carrier.
  • data link layer - The data link layer is the protocol layer in a program that handles the moving of data into and out of a physical link in a network.
  • data literacy - Data literacy is the ability to derive information from data, just as literacy in general is the ability to derive information from the written word.
  • data loss - Data loss is the intentional or unintentional destruction of information, caused by people or processes from within or outside of an organization.
  • data loss prevention - Data loss prevention (DLP) -- sometimes referred to as data leak prevention, information loss prevention and extrusion prevention -- is a strategy for preventing individuals who do not need sensitive information from accessing it.
  • data loss prevention (DLP) - Data loss prevention (DLP) -- sometimes referred to as data leak prevention, information loss prevention and extrusion prevention -- is a strategy for preventing individuals who do not need sensitive information from accessing it.
  • data management as a service (DMaaS) - Data management as a service (DMaaS) is a type of cloud service that provides enterprises with centralized storage for disparate data sources.
  • Data Management Learning Guides - We've gathered a collection of our learning guides and tutorials on data management.
  • data management platform (DMP) - A data management platform (DMP), also referred to as a unified data management platform (UDMP), is a centralized system for collecting and analyzing large sets of data originating from disparate sources.
  • data marketplace (data market) - A data marketplace is an online store where data sets are bought and sold; data marketplaces typically offer various types of data for different markets and from different sources.
  • data mart - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mashup - An enterprise mashup is the integration of heterogeneous digital data and applications from multiple sources for business purposes.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training (see the masking sketch after this list).
  • data migration - Data migration is the process of transferring data between data storage systems, data formats or computer systems.
  • data miner - Data mining is the process of sorting through large data sets to identify patterns and establish relationships to solve problems through data analysis.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and relationships that can help solve business problems through data analysis.
  • data modeling - Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
  • data monetization - Data monetization is the act of measuring the economic benefit of corporate data.
  • Data Over Cable Service Interface Specifications - Now known as CableLabs Certified Cable Modems, DOCSIS (Data Over Cable Service Interface Specifications) is a standard interface for cable modems, the devices that handle incoming and outgoing data signals between a cable TV operator and a personal or business computer or television set.
  • Data Over Cable Systems Interface - Now known as CableLabs Certified Cable Modems, DOCSIS (Data Over Cable Service Interface Specifications) is a standard interface for cable modems, the devices that handle incoming and outgoing data signals between a cable TV operator and a personal or business computer or television set.
  • data pipeline - A data pipeline is a system that moves data from one (source) location to another (target) location, much like how an oil pipeline moves oil from one location to another (a small sketch appears after this list).
  • data plan (mobile data plan) - A data plan is a mobile carrier subscription that prices internet access by the amount of data transfer allowed before a data cap is imposed; since the advent of the smartphone made mobile internet widespread, most carriers offer data plans at varying rates.
  • data plane (DP) - The data plane (sometimes known as the user plane, forwarding plane, carrier plane or bearer plane) is the part of a network that carries user traffic.
  • data point - A data point is a discrete unit of information.
  • data portability - Data portability is the ability to move data among different application programs, computing environments or cloud services.
  • data preparation - Data preparation is the process of gathering, combining, structuring and organizing data so it can be used in business intelligence (BI), analytics and data visualization applications.
  • data preprocessing - Data preprocessing describes any type of processing performed on raw data to prepare it for another processing procedure.
  • Data Privacy - Consumer privacy, also known as customer privacy, involves the handling and protection of the sensitive personal information provided by customers in the course of everyday transactions.
  • data privacy (information privacy) - Data privacy, also called information privacy, is the aspect of information technology (IT) that deals with the ability an organization or individual has to determine what data in a computer system can be shared with third parties.
  • data products - A data product is digital information that can be purchased.
  • data profiling - Data profiling is the process of examining, analyzing and reviewing data to collect statistics surrounding the quality and hygiene of the dataset (a minimal sketch appears after this list).
  • Data Protection Bill 2017 - The Data Protection Bill 2017 is UK legislation that replaced the Data Protection Act 1998; it was enacted as the Data Protection Act 2018.
  • data protection impact assessment (DPIA) - A data protection impact assessment (DPIA) is a process designed to help organizations determine how data processing systems, procedures or technologies affect individuals’ privacy and eliminate any risks that might violate compliance.
  • data protection management - Data protection management (DPM) comprises the administration, monitoring and management of backup processes to ensure backup tasks run on schedule and data is securely backed up and recoverable.
  • data protection management (DPM) - Data protection management (DPM) comprises the administration, monitoring and management of backup processes to ensure backup tasks run on schedule and data is securely backed up and recoverable.
  • data protection officer (DPO) - A data protection officer (DPO) is an enterprise security officer tasked with ensuring that data management complies with the European Union's General Data Protection Regulation (GDPR).
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data quality assurance - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data rate - Data transfer rate (DTR) is the amount of digital data that is moved from one place to another in a given time.
  • data recovery - Data recovery restores data that has been lost, accidentally deleted, corrupted or made inaccessible.
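
Worked examples for selected entries

The data deduplication entries above describe eliminating redundant copies of data. The Python sketch below shows one common approach under assumed parameters: split the data into fixed-size chunks, fingerprint each chunk with SHA-256 and keep only the first instance of each unique chunk. The chunk size, the in-memory store and the sample payload are illustrative assumptions, not any product's design.

    import hashlib

    CHUNK_SIZE = 4096  # assumed fixed chunk size

    def dedup_store(data, store):
        """Keep one stored copy per unique chunk; return the chunk 'recipe'."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).digest()
            store.setdefault(digest, chunk)  # store only the first instance
            recipe.append(digest)
        return recipe

    store = {}
    payload = b"A" * 8192 + b"B" * 4096 + b"A" * 8192  # redundant sample data
    dedup_store(payload, store)
    physical = sum(len(chunk) for chunk in store.values())
    print(f"dedup ratio: {len(payload) / physical:.1f}:1")  # 20480/8192 -> 2.5:1

The last line is also the deduplication ratio from the entry above: logical (pre-deduplication) capacity divided by the physical capacity actually used.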
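
The Data Definition Language (DDL) entries above describe commands that define database structures. A minimal illustration, run against SQLite through Python's standard library; the table and column names are invented for the example.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    # The classic DDL statements: define, alter and remove a structure
    cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")
    cur.execute("ALTER TABLE employees ADD COLUMN hired_on TEXT")
    cur.execute("DROP TABLE employees")
    conn.close()

DDL syntax varies slightly by database; the statements above use SQLite's dialect.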
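
The data masking entry above describes creating a structurally similar but inauthentic version of data. A minimal sketch, assuming a simple record layout and made-up masking rules (format-preserving randomization, partial masking and substitution):

    import random
    import string

    def mask_record(record):
        masked = dict(record)
        # Keep each field's shape but replace its sensitive content
        masked["name"] = "".join(
            random.choices(string.ascii_uppercase, k=len(record["name"])))
        masked["ssn"] = "XXX-XX-" + record["ssn"][-4:]  # partial masking
        masked["email"] = "user@example.com"            # substitution
        return masked

    print(mask_record({"name": "Alice", "ssn": "123-45-6789",
                       "email": "alice@corp.com"}))

The masked record keeps the original's structure, so it remains usable for software testing without exposing real values.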
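
The data pipeline entry above compares moving data to an oil pipeline. One lightweight way to express a source-to-target flow is a chain of generator stages (extract, transform, load); the stage names and sample data here are illustrative assumptions, not a standard API.

    def extract(rows):
        yield from rows                # source stage

    def transform(rows):
        for row in rows:               # clean/convert stage
            yield row.strip().upper()

    def load(rows):
        return list(rows)              # target stage (here, just a list)

    print(load(transform(extract(["  alpha ", "beta", " gamma "]))))
    # ['ALPHA', 'BETA', 'GAMMA']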
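
The data profiling entry above describes collecting statistics about a dataset. A minimal sketch using only the standard library, profiling a single numeric column; the column and its values are made up for the example.

    import statistics

    column = [4, 8, None, 15, 16, None, 23, 42]
    values = [v for v in column if v is not None]
    profile = {
        "count": len(column),          # total rows
        "nulls": column.count(None),   # missing values
        "min": min(values),
        "max": max(values),
        "mean": statistics.mean(values),
    }
    print(profile)  # count 8, nulls 2, min 4, max 42, mean 18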