Browse Definitions by Alphabet

DAT - DEC

  • data literacy - Data literacy is the ability to derive information from data, just as literacy in general is the ability to derive information from the written word.
  • data loss - Data loss is the intentional or unintentional destruction of information, caused by people and/or processes from within or outside of an organization.
  • data loss prevention (DLP) - Data loss prevention (DLP) -- sometimes referred to as data leak prevention, information loss prevention and extrusion prevention -- is a strategy for preventing individuals who do not need sensitive information from accessing it.
  • data management as a service (DMaaS) - Data management as a service (DMaaS) is a type of cloud service that provides enterprises with centralized storage for disparate data sources.
  • data management platform (DMP) - A data management platform (DMP), also referred to as a unified data management platform (UDMP), is a centralized system for collecting and analyzing large sets of data originating from disparate sources.
  • data marketplace (data market) - A data marketplace is an online store where data can be bought and sold; marketplaces typically offer various types of data for different markets and from different sources.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training.
  • data migration - Data migration is the process of transferring data between data storage systems, data formats or computer systems.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and relationships that can help solve business problems through data analysis.
  • data modeling - Data modeling is the process of creating a simplified diagram of a software system and the data elements it contains, using text and symbols to represent the data and how it flows.
  • data monetization - Data monetization is the process of generating measurable economic benefit from corporate data.
  • data pipeline - A data pipeline is a system that moves data from one (source) location to another (target) location, much like how an oil pipeline moves oil from one location to another.
  • data plan (mobile data plan) - A data plan is a mobile carrier subscription that sets how much data a customer can transfer before a data cap is imposed; since the advent of the smartphone made mobile internet practical, most carriers offer data plans at varying rates.
  • data plane (DP) - The data plane (sometimes known as the user plane, forwarding plane, carrier plane or bearer plane) is the part of a network that carries user traffic.
  • data point - A data point is a discrete unit of information.
  • data portability - Data portability is the ability to move data among different application programs, computing environments or cloud services.
  • data preprocessing - Data preprocessing, a component of data preparation, describes any type of processing performed on raw data to prepare it for another data processing procedure.
  • data privacy (information privacy) - Data privacy, also called information privacy, is the aspect of information technology (IT) that deals with the ability an organization or individual has to determine what data in a computer system can be shared with third parties.
  • data product - A data product is digital information that can be purchased.
  • data profiling - Data profiling refers to the process of examining, analyzing, reviewing and summarizing data sets to gain insight into the quality of data.
  • data protection impact assessment (DPIA) - A data protection impact assessment (DPIA) is a process designed to help organizations determine how data processing systems, procedures or technologies affect individuals’ privacy and eliminate any risks that might violate compliance.
  • data protection management (DPM) - Data protection management (DPM) comprises the administration, monitoring and management of backup processes to ensure backup tasks run on schedule and data is securely backed up and recoverable.
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data recovery - Data recovery restores data that has been lost, accidentally deleted, corrupted or made inaccessible.
  • data recovery agent (DRA) - A data recovery agent (DRA) is a Microsoft Windows user account with the ability to decrypt data that was encrypted by other users.
  • data reduction - Data reduction is the process of reducing the amount of capacity required to store data.
  • data reduction in primary storage (DRIPS) - Data reduction in primary storage (DRIPS) is the application of capacity optimization techniques for data that is in active use.
  • data replication - Data replication copies data from one location to another over a SAN, LAN or WAN.
  • data residency (data sovereignty) - Data residency is the physical or geographic location of an organization's data, and the area of storage management concerned with issues specific to managing data in those locations.
  • data restore - Data restore is the process of copying backup data from secondary storage and restoring it to its original location or a new location.
  • data retention policy - A data retention policy, or records retention policy, is an organization's established protocol for retaining information for operational or regulatory compliance needs.
  • data sampling - Data sampling is a statistical analysis technique used to select, manipulate and analyze a representative subset of data points to identify patterns and trends in the larger data set being examined.
  • data science as a service (DSaaS) - Data science as a service (DSaaS) is a form of outsourcing that involves the delivery of information gleaned from advanced analytics applications run by data scientists at an outside company to corporate clients for their business use.
  • data science platform - A data science platform is software that allows data scientists to uncover actionable insights from data and communicate those insights throughout an enterprise within a single environment.
  • data scientist - A data scientist is a professional responsible for collecting, analyzing and interpreting extremely large amounts of data.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data shadow - A data shadow is the collective body of data that is automatically generated and recorded as we go about our lives rather than intentionally created.
  • data silo - A data silo exists when an organization's departments and systems cannot, or do not, communicate freely with one another or share business-relevant data.
  • data source name (DSN) - A data source name (DSN) is a data structure that contains the information about a specific database that an Open Database Connectivity (ODBC) driver needs in order to connect to it.
  • data sovereignty - Data sovereignty is the concept that information which has been converted and stored in binary digital form is subject to the laws of the country in which it is located.
  • data splitting - Data splitting is the practice of dividing data into two or more subsets, typically so that a model can be trained on one subset and tested on another.
  • data stewardship - Data stewardship is the management and oversight of an organization's data assets to help provide business users with high-quality data that is easily accessible in a consistent manner.
  • data store - A data store is a repository for persistently storing collections of data, such as a database, a file system or a directory.
  • data storytelling - Data storytelling is the process of translating complex data analyses into layman's terms in order to influence a decision or action.
  • data streaming - Data streaming is the continuous transfer of data at a steady, high-speed rate.
  • data structures - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • Data Transfer Project (DTP) - Data Transfer Project (DTP) is an open source initiative to facilitate customer-controlled data transfers between two online services.
  • data transfer rate (DTR) - Data transfer rate (DTR) is the amount of digital data that is moved from one place to another in a given time.
  • data transformation - Data transformation is the process of converting data from one format, such as a database file, XML document or Excel spreadsheet, into another.
  • data type - A data type, in programming, is a classification that specifies which type of value a variable has and what type of mathematical, relational or logical operations can be applied to it without causing an error.
  • data validation - Data validation is the practice of checking the integrity, accuracy and structure of data before it is used for a business operation.
  • data virtualization - Data virtualization is an umbrella term used to describe any approach to data management that allows an application to retrieve and manipulate data without needing to know any technical details about the data such as how it is formatted or where it is physically located.
  • data visualization - Data visualization is the practice of translating information into a visual context, such as a map or graph, to make data easier for the human brain to understand and pull insights from.
  • data warehouse - A data warehouse is a federated repository for all the data collected by an enterprise's various operational systems, be they physical or logical.
  • data warehouse appliance - A data warehouse appliance is an all-in-one “black box” solution optimized for data warehousing.
  • data warehouse as a service (DWaaS) - Data warehouse as a service (DWaaS) is an outsourcing model in which a service provider configures and manages the hardware and software resources a data warehouse requires, and the customer provides the data and pays for the managed service.
  • data-driven decision management (DDDM) - Data-driven decision management (DDDM) is an approach to business governance that values actions that can be backed up with verifiable data.
  • data-driven marketing - Data-driven marketing is a strategy in which marketers use statistics and metrics to assess the effectiveness of marketing campaigns and help make better decisions about future campaigns.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database abstraction layer - A database abstraction layer is a programming interface that unifies communication between an application and a database, hiding the specifics of the underlying database management system behind a common API.
  • database activity monitoring (DAM) - Database activity monitoring (DAM) systems monitor and record activity in a database and then generate alerts for anything unusual.
  • database administrator (DBA) - A database administrator (DBA) is the information technician responsible for directing or performing all activities related to maintaining a successful database environment.
  • database as a service (DBaaS) - Database as a service (DBaaS) is a cloud computing managed service offering that provides access to a database without requiring the setup of physical hardware, the installation of software or the need to configure the database.
  • database automation - Database automation is the use of unattended processes and self-updating procedures for administrative tasks in a database.
  • database availability group (DAG) - A database availability group (DAG) is a high availability (HA) and data recovery feature introduced in Exchange Server 2010.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation and processing of consumer data.
  • database normalization - Database normalization is the process of organizing the tables and columns of a relational database to reduce redundancy and improve data integrity; it is intrinsic to most relational database designs.
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • database-agnostic - Database-agnostic is a term describing the capacity of software to function with any vendor’s database management system (DBMS).
  • Databricks - Databricks is a company and big data processing platform founded by the creators of Apache Spark.
  • DataCore - DataCore is a software-defined storage (SDS) company, as well as an early storage virtualization software vendor, in Fort Lauderdale, Fla.
  • DataCore SANsymphony-V - DataCore SANsymphony-V is a storage virtualization platform that pools capacity of heterogeneous storage hardware.
  • Datadog - Datadog is a monitoring and analytics tool for information technology (IT) and DevOps teams that can be used to determine performance metrics as well as event monitoring for infrastructure and cloud services.
  • DataOps (data operations) - DataOps (data operations) is an Agile approach to designing, implementing and maintaining a distributed data architecture that will support a wide range of open source tools and frameworks in production.
  • Datto - Datto Inc. is a backup, recovery and business continuity vendor with headquarters in Norwalk, Conn.
  • daughterboard (or daughter board, daughter card, or daughtercard) - A daughterboard (or daughter board, daughter card, or daughtercard) is a circuit board that plugs into and extends the circuitry of another circuit board.
  • Daylight Saving Time (DST) - Daylight Saving Time (DST) is the practice of turning the clock ahead as warmer weather approaches and back as it becomes colder again so that people will have one more hour of daylight in the afternoon and evening during the warmer season of the year.
  • days inventory outstanding (DIO) - Days inventory outstanding (DIO) is the average number of days it takes for inventory to be sold.
  • days sales outstanding (DSO) - Days sales outstanding (DSO) is the measurement of the average number of days it takes a business to collect payments after a sale has been made.
  • Db2 - Db2 is a family of database management system (DBMS) products from IBM that serve a number of different operating system (OS) platforms.
  • DC (direct current) - DC (direct current) is the unidirectional flow or movement of electric charge carriers (which are usually electrons).
  • DCE (Distributed Computing Environment) - In network computing, DCE (Distributed Computing Environment) is an industry-standard software technology for setting up and managing computing and data exchange in a system of distributed computers.
  • DCML (Data Center Markup Language) - DCML (Data Center Markup Language), based on Extensible Markup Language (XML), is a data format and model for exchanging information that describes a data center environment.
  • DCOM (Distributed Component Object Model) - DCOM (Distributed Component Object Model) is a set of Microsoft concepts and program interfaces in which client program objects can request services from server program objects on other computers in a network.
  • DCPromo (Domain Controller Promoter) - DCPromo (Domain Controller Promoter) is a tool in Active Directory that installs and removes Active Directory Domain Services and promotes domain controllers.
  • de facto standard - A de facto standard is something that is used so widely that it is considered a standard for a given application although it has no official status.
  • de jure standard - A de jure standard is a technology, method or product that has been officially endorsed for a given application.
  • de-anonymization (deanonymization) - De-anonymization is a method used to recover the original data from information that has been processed to make it impossible -- or at least harder -- to identify the personally identifiable information (PII) it contains.
  • dead pixel - A dead pixel is a picture element in which all three RGB sub-pixels are permanently turned off, which creates a black spot in the display.
  • dead zone (Wi-Fi dead zone) - A dead zone (Wi-Fi dead zone) is an area within a wireless LAN location where Wi-Fi does not function, typically due to radio interference or range issues.
  • deadlock - A deadlock is a situation in which two computer programs, each holding a resource the other needs, effectively prevent each other from proceeding, resulting in both programs ceasing to function.
  • deal registration - Deal registration is a common feature of vendors' channel partner programs in which a channel partner, such as a value-added reseller (VAR), informs the vendor about a sales lead.
  • death by PowerPoint - Death by PowerPoint is a phenomenon caused by the poor use of presentation software.
  • Debian - Debian is a popular and freely available computer operating system (OS) that uses a Unix-like kernel -- typically Linux -- alongside other program components, many of which come from GNU Project.
  • debouncing - Bouncing is the tendency of any two metal contacts in an electronic device to generate multiple signals as the contacts close or open; debouncing is any hardware device or software mechanism that ensures only a single signal is acted upon for a single opening or closing of a contact.
  • debugging - Debugging, in computer programming and engineering, is a multistep process that involves identifying a problem, isolating the source of the problem, and then either correcting the problem or determining a way to work around it.
  • deception technology - Deception technology is a class of security tools and techniques designed to prevent an attacker who has already entered the network from doing damage.
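
The data masking entry above lends itself to a brief illustration. This is a minimal sketch of one common approach, character substitution; the function name and the "keep the last four characters" rule are hypothetical choices for the example, not a prescribed technique:

```python
def mask_value(value: str, visible: int = 4) -> str:
    """Return a structurally similar but inauthentic copy of value,
    keeping only the last `visible` characters readable."""
    if len(value) <= visible:
        return "*" * len(value)
    return "*" * (len(value) - visible) + value[-visible:]

# A masked card number keeps its length and shape but hides the real digits,
# which is what makes it usable for software testing and user training.
print(mask_value("4111111111111111"))  # ************1111
```

Real masking tools also offer format-preserving substitution (fake but valid-looking names, dates and numbers); the asterisk approach here is just the simplest case.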
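
Data splitting, as defined above, can be sketched in a few lines of Python. The 80/20 ratio and the seeded shuffle are illustrative assumptions, not a standard:

```python
import random

def split_data(records: list, test_ratio: float = 0.2, seed: int = 42):
    """Shuffle a copy of the records and divide it into two subsets."""
    rng = random.Random(seed)   # seeded so the split is reproducible
    shuffled = records[:]       # copy, so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

train, test = split_data(list(range(10)))
print(len(train), len(test))  # 8 2
```

Shuffling before the cut matters: without it, any ordering in the source data (by date, by region) would leak into the subsets and bias whatever is trained or tested on them.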
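
The data validation entry can also be made concrete. This is a minimal sketch of structural checking against a field/type schema; the schema format and function name are assumptions for the example:

```python
def validate_record(record: dict, schema: dict) -> list:
    """Return a list of problems found; an empty list means the record passes."""
    errors = []
    for field, expected_type in schema.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"wrong type for field: {field}")
    return errors

schema = {"id": int, "email": str}
print(validate_record({"id": 1, "email": "a@b.com"}, schema))  # []
print(validate_record({"id": "1"}, schema))
```

Production validation usually goes further (ranges, formats, cross-field consistency), but the principle is the same: reject or flag bad data before it reaches a business operation.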