Data and data management

Terms related to data, including definitions about data warehousing and words and phrases about data management.

DAT - GZI

  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • data access rights - A data access right (DAR) is a permission that has been granted that allows a person or computer program to locate and read digital information at rest.
  • data activation - Data activation is a marketing approach that uses consumer information and data analytics to help companies gain real-time insight into target audience behavior and plan for future marketing initiatives.
  • data aggregation - Data aggregation is any process in which information is gathered and expressed in a summary form, for purposes such as statistical analysis (see the sketch after this list).
  • data analytics (DA) - Data analytics (DA) is the science of examining raw data with the purpose of drawing conclusions about that information.
  • data anonymization - Data anonymization is the process of removing or encrypting personally identifiable information in a data set so that its source is untraceable (see the sketch after this list).
  • data archiving - Data archiving migrates infrequently used data to low-cost, high-capacity archive storage for long-term retention.
  • data artist - A data artist is a business analytics (BA) specialist who creates graphs, charts, infographics and other visual tools that help people understand complex data.
  • Data as a Service (DaaS) - Data as a Service (DaaS) is an information provision and distribution model in which data files (including text, images, sounds, and videos) are made available to customers over a network, typically the Internet.
  • data availability - Data availability is a term used by some computer storage manufacturers and storage service providers (SSPs) to describe products and services that ensure that data continues to be available at a required level of performance in situations ranging from normal through "disastrous."
  • data breach - A data breach is a confirmed incident in which sensitive, confidential or otherwise protected data has been accessed and/or disclosed in an unauthorized fashion.
  • data center chiller - A data center chiller is a cooling system used in a data center to remove heat from one element and deposit it into another element.
  • data center services - Data center services is a collective term for the supporting components necessary for the proper operation of a data center: a repository for the storage, management and dissemination of data organized around a body of knowledge or pertaining to an enterprise.
  • data citizen - A data citizen is an employee who relies on data to make decisions and perform job responsibilities.
  • data collection - Data collection is a process for gathering information from different sources.
  • data context - Data context is the network of connections among data points.
  • data corruption - Data corruption is the deterioration of computer data as a result of some external agent.
  • data curation - Data curation is the management of data throughout its lifecycle, from creation and initial storage to the time when it is archived for posterity or becomes obsolete and is deleted.
  • data democratization - Data democratization is the ability for information in a digital format to be accessible to the average end user.
  • data destruction - Data destruction is the process of destroying data stored on tapes, hard disks and other forms of electronic media so that it is completely unreadable and cannot be accessed or used for unauthorized purposes.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data dredging (data fishing) - Data dredging, sometimes referred to as data fishing, is a data mining practice in which large volumes of data are searched to find any possible relationships in the data.
  • data encryption/decryption IC - A data encryption/decryption IC is a specialized integrated circuit (IC) that can encrypt outgoing data and decrypt incoming data.
  • data engineer - A data engineer is a worker whose primary job responsibilities involve preparing data for analytical or operational uses.
  • data exfiltration (data extrusion) - Data exfiltration, also called data extrusion, is the unauthorized transfer of data from a computer.
  • data exhaust - Data exhaust is a byproduct of user actions online and consists of the various files generated by web browsers and their plug-ins, such as cookies, log files and temporary internet files.
  • data exploration - Data exploration is the first step in data analysis and typically involves summarizing the main characteristics of a data set, including its size, accuracy, initial patterns in the data and other attributes.
  • data federation software - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data feed - A data feed is an ongoing stream of structured data that provides users with updates of current information from one or more sources.
  • data governance policy - A data governance policy is a documented set of guidelines for ensuring that an organization's data and information assets are managed consistently and used properly.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data historian - A data historian is a software program that records the data created by processes running in a computer system.
  • data hygiene - Data hygiene is the collective processes conducted to ensure the cleanliness of data.
  • data in motion - Data in motion, also referred to as data in transit or data in flight, is digital information that is in the process of being transported between locations within or between computer systems.
  • data in use - Data in use is data that is currently being updated, processed, accessed and read by a system.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database; data can be ingested in real time or in batches.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data janitor (data wrangler) - A data janitor is an IT employee who cleans up big data sources to prepare them for data analysts and data scientists.
  • data life cycle - The data life cycle is the sequence of stages that a particular unit of data goes through from its initial generation or capture to its eventual archival and/or deletion at the end of its useful life.
  • data literacy - Data literacy is the ability to derive information from data, just as literacy in general is the ability to derive information from the written word.
  • data loss - Data loss is the intentional or unintentional destruction of information, caused by people or processes from within or outside of an organization.
  • data loss prevention (DLP) - Data loss prevention (DLP) is a strategy for making sure that end users do not send sensitive or critical information outside of the corporate network.
  • data management platform (DMP) - A data management platform (DMP), also referred to as a unified data management platform (UDMP), is a centralized system for collecting and analyzing large sets of data originating from disparate sources.
  • data management-as-a-service (DMaaS) - Data Management-as-a-Service (DMaaS) is a type of cloud service that provides protection, governance and intelligence across a company’s various data sources.
  • data marketplace (data market) - A data marketplace is an online store where data is bought and sold; data marketplaces typically offer various types of data for different markets and from different sources.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training (see the sketch after this list).
  • data migration - Data migration is the process of transferring data between data storage systems, data formats or computer systems.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and establish relationships to solve problems through data analysis.
  • data modeling - Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
  • data preparation - Data preparation is the process of gathering, combining, structuring and organizing data so it can be analyzed as part of data visualization, analytics and machine learning applications.
  • data preprocessing - Data preprocessing describes any type of processing performed on raw data to prepare it for another processing procedure.
  • data profiling - Data profiling is the process of examining, analyzing and reviewing data to collect statistics surrounding the quality and hygiene of the dataset.
  • Data Protection Bill 2017 - The Data Protection Bill 2017 is United Kingdom legislation intended to replace the Data Protection Act 1998.
  • data protection management (DPM) - Data protection management (DPM) is the administration of backup processes to ensure that tasks run on schedule, and that data is securely backed up and recoverable.
  • data protection officer (DPO) - A data protection officer (DPO) is an enterprise security officer tasked with ensuring that data management is compliant with the European Union's General Data Protection Regulation (GDPR).
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data reduction in primary storage (DRIPS) - Data reduction in primary storage (DRIPS) is the application of capacity optimization techniques for data that is in active use.
  • data replication - Data replication copies data from one location to another over a SAN, LAN or WAN.
  • data residency - Data residency is a concept that refers to the physical location of information, as well as the local regulations imposed on that information based on where it resides.
  • data residency (data sovereignty) - Data residency is the physical location or locations of an organization's data and the area of storage management involved with issues specific to managing data in those particular locations.
  • data retention policy - A data retention policy dictates the types of data to be retained and the duration for which that data must be stored in accordance with operational or regulatory requirements.
  • data science - Data science is the study of where information comes from, what it represents and how it can be turned into a valuable resource in the creation of business and IT strategies.
  • data science as a service (DSaaS) - Data science as a service (DSaaS) is a form of outsourcing that involves the delivery of information gleaned from advanced analytics applications run by data scientists at an outside company to corporate clients for their business use.
  • data scientist - A data scientist is a professional responsible for collecting, analyzing and interpreting extremely large amounts of data.
  • data scrubbing (data cleansing) - Data scrubbing, also called data cleansing, is the process of cleaning up data in a database that is incorrect, incomplete or duplicated (see the sketch after this list).
  • Data Security Council of India (DSCI) - The Data Security Council of India (DSCI) is a not-for-profit organization created to promote the country as a secure destination for information technology (IT) outsourcing.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data shadow - A data shadow is the collective body of data that is automatically generated and recorded as we go about our lives rather than intentionally created.
  • data source name (DSN) - A data source name (DSN) is a data structure that contains the information about a specific database that an Open Database Connectivity (ODBC) driver needs in order to connect to it (see the sketch after this list).
  • data splitting - Data splitting is an approach to protecting sensitive data from unauthorized access by encrypting the data and storing different portions of a file on different servers (see the sketch after this list).
  • data stewardship - Data stewardship is the management and oversight of an organization's data assets to help provide business users with high-quality data that is easily accessible in a consistent manner.
  • data store - A data store is a repository for persistently storing collections of data, such as a database, a file system or a directory.
  • data storytelling - Data storytelling is the process of translating complex data analyses into layman's terms in order to influence a decision or action.
  • data structure - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • Data Transfer Project (DTP) - Data Transfer Project (DTP) is an open source initiative to facilitate the transfer of data between the differing online platforms of providers like Google, Microsoft and Twitter.
  • data transformation - Data transformation is the process of converting data from one format, such as a database file, XML document or Excel spreadsheet, into another (see the sketch after this list).
  • data virtualization - Data virtualization is an umbrella term used to describe any approach to data management that allows an application to retrieve and manipulate data without needing to know any technical details about the data such as how it is formatted or where it is physically located.
  • data virtualization software - Data virtualization software is application programming that facilitates querying data distributed across multiple internal and/or external storage systems.
  • data warehouse - A data warehouse is a federated repository for all the data collected by an enterprise's various operational systems, be they physical or logical.
  • data warehouse appliance - A data warehouse appliance is an all-in-one “black box” solution optimized for data warehousing.
  • data warehouse as a service (DWaaS) - Data warehousing as a service (DWaaS) is an outsourcing model in which a service provider configures and manages the hardware and software resources a data warehouse requires, and the customer provides the data and pays for the managed service.
  • data-driven decision management (DDDM) - Data-driven decision management (DDDM) is an approach to business governance that values actions that can be backed up with verifiable data.
  • data-driven disaster - A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database abstraction layer - A database abstraction layer is a programming interface that unifies communication between an application and the underlying database, allowing the application to work with different database management systems without vendor-specific code.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation, and processing of consumer data (both for customers and potential customers) that is maintained in a company's databases.
  • database mirroring - Database mirroring is the maintenance of redundant copies of a database to ensure continuous data availability and minimize or avoid downtime that might otherwise result from data corruption or loss, or when the operation of a network is partially compromised.
  • database normalization - Database normalization is the process of organizing data in a relational database to minimize redundancy and prevent update anomalies; it is intrinsic to most relational database schemas.
  • database of record (DBOR) - A database of record (DBOR) is a repository for centralized storage of information about objects or people.
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • database-agnostic - Database-agnostic is a term describing the capacity of software to function with any vendor’s database management system (DBMS).
  • DataOps (data operations) - DataOps (data operations) is an Agile approach to designing, implementing and maintaining a distributed data architecture that will support a wide range of open source tools and frameworks in production.
  • DB2 - DB2 is a family of relational database management system (RDBMS) products from IBM that serve a number of different operating system platforms.
  • DDBMS (distributed database management system) - A DDBMS (distributed database management system) is a centralized application that manages a distributed database as if it were all stored on the same computer.
  • deduplication software - Dedupe software eliminates unnecessary copies of data by redirecting new iterations of the data back to the original (see the sketch after this list).
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets made up of both unstructured and semi-structured data.
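
A minimal sketch of data aggregation, using a hypothetical in-memory list of sales records; the field names and values are invented for illustration.

```python
# Data aggregation sketch: raw records are gathered and expressed in
# summary form (totals per region) for statistical analysis.
from collections import defaultdict

records = [
    {"region": "east", "amount": 120.0},
    {"region": "west", "amount": 75.5},
    {"region": "east", "amount": 30.0},
]

totals = defaultdict(float)
for record in records:
    totals[record["region"]] += record["amount"]  # summarize per region

print(dict(totals))  # {'east': 150.0, 'west': 75.5}
```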
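A minimal sketch of data anonymization, assuming hypothetical field names; it drops direct identifiers and substitutes a one-way hash (strictly a pseudonym, since full anonymization would also have to address indirect identifiers).

```python
# Data anonymization sketch: remove direct identifiers and replace the
# identity with a one-way hash so rows stay linkable without revealing
# who they describe.
import hashlib

def anonymize(record, identifier_fields=("name", "email")):
    cleaned = {k: v for k, v in record.items() if k not in identifier_fields}
    raw_id = record.get("email", "")
    cleaned["pseudonym"] = hashlib.sha256(raw_id.encode()).hexdigest()[:12]
    return cleaned

print(anonymize({"name": "Ada", "email": "ada@example.com", "age": 36}))
```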
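A minimal sketch of data masking over a hypothetical customer record: the structure and formats are preserved while the values are replaced with inauthentic ones, so the result is safe for testing or training.

```python
# Data masking sketch: produce a structurally similar but inauthentic
# copy of a record (same field layout, same digit grouping).
import random
import string

def mask_digits(value):
    # Randomize every digit but keep separators and length intact.
    return "".join(random.choice(string.digits) if c.isdigit() else c
                   for c in value)

customer = {"name": "Ada Lovelace", "card": "4111-1111-1111-1111"}
masked = {
    "name": "Test User",                    # substituted with a fake value
    "card": mask_digits(customer["card"]),  # digits randomized, layout kept
}
print(masked)
```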
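A minimal sketch of data scrubbing over a hypothetical list of names: inconsistent formatting is normalized and duplicate entries are removed.

```python
# Data scrubbing (cleansing) sketch: fix incorrect formatting and drop
# duplicates from a small in-memory data set.
rows = [" Alice ", "alice", "BOB", "bob ", "Carol"]

seen = set()
clean = []
for row in rows:
    normalized = row.strip().lower()  # fix whitespace and casing
    if normalized not in seen:        # remove duplicates
        seen.add(normalized)
        clean.append(normalized)

print(clean)  # ['alice', 'bob', 'carol']
```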
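A minimal sketch of connecting through a data source name; it assumes the third-party pyodbc package and an ODBC DSN named sales_dw already configured on the machine (both are assumptions, not part of the definition above).

```python
# DSN sketch: the driver, server and database details live in the DSN
# registered with the ODBC manager, so the application only names it.
import pyodbc

conn = pyodbc.connect("DSN=sales_dw;UID=report_user;PWD=secret")
cursor = conn.cursor()
cursor.execute("SELECT 1")  # connection details came from the DSN, not code
print(cursor.fetchone())
conn.close()
```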
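A minimal sketch of data splitting, using a simple XOR split into two shares as a stand-in for the encryption a commercial product would use; neither share alone reveals the data.

```python
# Data splitting sketch: the secret is split into two shares that would
# be stored on different servers; both are needed to reconstruct it.
import os

secret = b"account: 12345678"
share_a = os.urandom(len(secret))                        # random pad, server A
share_b = bytes(s ^ a for s, a in zip(secret, share_a))  # remainder, server B

# Recombining both shares restores the original.
restored = bytes(a ^ b for a, b in zip(share_a, share_b))
assert restored == secret
```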
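A minimal sketch of data transformation, converting a small hypothetical XML document into CSV; element and column names are invented for illustration.

```python
# Data transformation sketch: XML in, CSV out.
import csv
import io
import xml.etree.ElementTree as ET

xml_doc = "<orders><order id='1' total='9.99'/><order id='2' total='4.50'/></orders>"

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["id", "total"])                    # target format: CSV header
for order in ET.fromstring(xml_doc).iter("order"):  # source format: XML
    writer.writerow([order.get("id"), order.get("total")])

print(out.getvalue())
```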
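A minimal sketch of the idea behind deduplication software: each chunk is stored once, keyed by its hash, and later copies become references back to the original chunk. The chunk size and data are invented for illustration.

```python
# Deduplication sketch: a content-addressed chunk store plus a "recipe"
# of hashes that reconstructs the logical stream.
import hashlib

CHUNK = 4
store = {}   # hash -> chunk bytes (kept once)
recipe = []  # ordered hashes that reconstruct the stream

data = b"ABCDABCDEFGH"  # "ABCD" appears twice
for i in range(0, len(data), CHUNK):
    chunk = data[i:i + CHUNK]
    digest = hashlib.sha256(chunk).hexdigest()
    store.setdefault(digest, chunk)  # repeat chunks point back to the original
    recipe.append(digest)

print(len(store), "unique chunks for", len(recipe), "logical chunks")
```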
