Database management

Terms related to databases, including definitions about relational databases and words and phrases about database management.

DAT - LIN

  • data - In computing, data is information that has been translated into a form that is efficient for movement or processing.
  • data abstraction - Data abstraction is the reduction of a particular body of data to a simplified representation of the whole.
  • data aggregation - Data aggregation is any process in which information is gathered and expressed in a summary form, for purposes such as statistical analysis.
  • data analytics (DA) - Data analytics (DA) is the science of examining raw data with the purpose of drawing conclusions about that information.
  • data availability - Data availability is a term used by some computer storage manufacturers and storage service providers (SSPs) to describe products and services that ensure that data continues to be available at a required level of performance in situations ranging from normal through "disastrous."
  • data corruption - Data corruption is the deterioration of computer data as a result of some external agent.
  • Data Definition Language (DDL) - Data Definition Language (DDL) is a standard for commands that define the different structures in a database.
  • data dictionary - A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.
  • data hiding - Data hiding is a characteristic of object-oriented programming.
  • data ingestion - Data ingestion is the process of obtaining and importing data for immediate use or storage in a database; data can be ingested in real time or in batches.
  • data integrity - Data integrity is the assurance that digital information is uncorrupted and can only be accessed or modified by those authorized to do so.
  • data management-as-a-service (DMaaS) - Data Management-as-a-Service (DMaaS) is a type of cloud service that provides protection, governance and intelligence across a company’s various data sources.
  • data mart (datamart) - A data mart is a repository of data that is designed to serve a particular community of knowledge workers.
  • data mining - Data mining is the process of sorting through large data sets to identify patterns and establish relationships to solve problems through data analysis.
  • data modeling - Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow.
  • data preprocessing - Data preprocessing describes any type of processing performed on raw data to prepare it for another processing procedure.
  • data profiling - Data profiling is the process of examining, analyzing and reviewing data to collect statistics surrounding the quality and hygiene of the dataset.
  • data quality - Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it's up to date.
  • data scrubbing (data cleansing) - Data scrubbing, also called data cleansing, is the process of cleaning up data in a database that is incorrect, incomplete, or duplicated.
  • data set - A data set is a collection of data that contains individual data units organized (formatted) in a specific way and accessed by one or more specific access methods based on the data set organization and data structure.
  • data source name (DSN) - A data source name (DSN) is a data structure that contains the information about a specific database that an Open Database Connectivity (ODBC) driver needs in order to connect to it.
  • data splitting - Data splitting is an approach to protecting sensitive data from unauthorized access by encrypting the data and storing different portions of a file on different servers.
  • data store - A data store is a repository for persistently storing collections of data, such as a database, a file system or a directory.
  • data structure - A data structure is a specialized format for organizing, processing, retrieving and storing data.
  • data warehouse - A data warehouse is a federated repository for all the data collected by an enterprise's various operational systems, be they physical or logical.
  • database (DB) - A database is a collection of information that is organized so that it can be easily accessed, managed and updated.
  • database abstraction layer - A database abstraction layer is a simplified representation of a database in the form of a written description or a diagram.
  • database activity monitoring (DAM) - Database activity monitoring (DAM) systems monitor and record activity in a database and then generate alerts for anything unusual.
  • database automation - Database automation is the use of unattended processes and self-updating procedures for administrative tasks in a database.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases.
  • database marketing - Database marketing is a systematic approach to the gathering, consolidation, and processing of consumer data (both for customers and potential customers) that is maintained in a company's databases.
  • database mirroring - Database mirroring is the maintenance of redundant copies of a database to ensure continuous data availability and minimize or avoid downtime that might otherwise result from data corruption or loss, or when the operation of a network is partially compromised.
  • database normalization - Database normalization is the process of organizing data in a relational database to reduce redundancy and improve data integrity; it is intrinsic to most relational database schemes.
  • database replication - Database replication is the frequent electronic copying of data from a database in one computer or server to a database in another -- so that all users share the same level of information.
  • database-agnostic - Database-agnostic is a term describing the capacity of software to function with any vendor’s database management system (DBMS).
  • Database: Glossary - This is a glossary of database-related terms.
  • DB2 - DB2 is a family of relational database management system (RDBMS) products from IBM that serve a number of different operating system platforms.
  • DDBMS (distributed database management system) - A DDBMS (distributed database management system) is a centralized application that manages a distributed database as if it were all stored on the same computer.
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets made up of both unstructured and semi-structured data.
  • delimiter - In computer programming, a delimiter is a character that identifies the beginning or the end of a character string (a contiguous sequence of characters).
  • derived object (DO) - A derived object (DO) is a file created in a Versioned Object Base (VOB).
  • digital photo album - A digital photo album is an application that allows the user to import graphic image files from a digital camera, memory card, scanner, or computer hard drive, to a central database.
  • dimension - In data warehousing, a dimension is a collection of reference information about a measurable event (fact).
  • dirty data - In a data warehouse, dirty data is a database record that contains errors.
  • distributed database - A distributed database is a database that consists of two or more files located in different sites either on the same network or on entirely different networks.
  • distributed ledger technology (DLT) - Distributed ledger technology (DLT) is a digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time.
  • distribution - In marketing, distribution is the process of moving a product from its manufacturing source to its customers.
  • document-oriented database - A document-oriented database is a type of NoSQL database in which data is stored and retrieved as documents, typically in a format such as JSON, BSON or XML.
  • drilldown - As currently used in information technology, to drill down (verb) is to move from summary information to the progressively more detailed data that underlies it.
  • DSML (Directory Services Markup Language) - DSML (Directory Services Markup Language) is an application of the Extensible Markup Language (XML) that enables different computer network directory formats to be expressed in a common format and shared by different directory systems.
  • DSTP (Data Space Transfer Protocol) - DSTP (Data Space Transfer Protocol) is a protocol that is used to index and retrieve data from a number of databases, files, and other data structures using a key that can find all the related data about a particular object across all of the data.
  • Dublin Core - Dublin Core is an initiative to create a digital "library card catalog" for the Web.
  • DXL (Domino Extensible Language) - DXL (Domino Extensible Language) is a specific version of Extensible Markup Language (XML) for Lotus Domino data.
  • dynamic SQL (Dynamic Structured Query Language) - Dynamic SQL is an enhanced form of Structured Query Language (SQL) that, unlike standard (or static) SQL, facilitates the automatic generation and execution of program statements.
  • E. F. Codd (Edgar F. "Ted" Codd) - E. F. Codd was an IBM researcher and mathematician who invented the relational model of database management, the theoretical basis for relational databases.
  • ebXML (electronic business xml) - ebXML (Electronic Business XML) is a project to use the Extensible Markup Language (XML) to standardize the secure exchange of business data.
  • Eclipse (Eclipse Foundation) - Eclipse is an open-source Java Integrated Development Environment (IDE) known for its plug-ins that allow developers to develop and test code written in other programming languages.
  • ECMAScript (European Computer Manufacturers Association Script) - ECMAScript is a standard script language, developed with the cooperation of Netscape and Microsoft and mainly derived from Netscape's JavaScript, the widely-used scripting language that is used in Web pages to affect how they look or behave for the user.
  • EDM (Electronic Document Management) - EDM (Electronic Document Management) is the management of different kinds of documents in an enterprise using computer programs and storage.
  • employee self-service (ESS) - Employee self-service (ESS) is a widely used human resources technology that enables employees to perform many job-related functions, such as applying for reimbursement, updating personal information and accessing company benefits information -- which was once largely paper-based, or otherwise would have been maintained by management or administrative staff.
  • enclave - In IBM's OS/390 operating system, an enclave is a representation of a business transaction or unit of work.
  • encoding and decoding - Encoding is the process of putting a sequence of characters (letters, numbers, punctuation, and certain symbols) into a specialized digital format for efficient transmission or transfer.
  • encryption key management - Encryption key management is the administration of tasks involved with protecting, storing, backing up and organizing encryption keys.
  • engine-level encryption - Engine-level encryption is cryptographic encoding and decoding of data that is executed within a database engine.
  • enhancement - In an information technology product, an enhancement is a noteworthy improvement to the product as part of a new version of it.
  • enterprise content management (ECM) - Enterprise content management (ECM) is a set of defined processes, strategies and tools that allow a business to effectively obtain, organize, store and deliver critical information to its employees, business stakeholders and customers.
  • enterprise search - Enterprise search is the practice of making an organization's content searchable; it comes in a number of forms, including local installations, hosted versions, and search appliances, sometimes called "search in a box."
  • Entity Relationship Diagram (ERD) - An entity relationship diagram (ERD), also known as an entity relationship model, is a graphical representation that depicts relationships among people, objects, places, concepts or events within an information technology (IT) system.
  • entity-relationship model (ERM or ER model) - The entity-relationship model (or ER model) is a way of graphically representing the logical relationships of entities (or objects) in order to create a database.
  • Exadata - Exadata (also called Oracle Exadata Database Machine) is an in-memory database appliance that supports both OLTP (transactional) and OLAP (analytical) database systems.
  • Excel - Excel is a spreadsheet program from Microsoft, a component of its Office product group for business applications.
  • executable - In computers, an executable is a file that contains a program ready to run; to execute a program is to run it in the computer.
  • export - In a personal computer application, to export is to convert a file into another format than the one it is currently in.
  • extension - In computer operating systems, a file name extension is an optional addition to the file name, in the form of a suffix, that indicates the file's format or associated program.
  • extract, transform, load (ETL) - In managing databases, extract, transform, load (ETL) refers to three separate functions combined into a single programming tool.
  • failover - Failover is a backup operational mode in which the functions of a system component (such as a processor, server, network, or database, for example) are assumed by secondary system components when the primary component becomes unavailable through either failure or scheduled down time.
  • Fast Guide: SQL Server 2000 commands - Here are ten commands you need to know!
  • fetch - In computer technology, fetch has several meanings related to getting, reading, or moving data objects.
  • field - A field is an area in a fixed or known location in a unit of data such as a record, message header, or computer instruction that has a purpose and usually a fixed size.
  • file format - In a computer, a file format is the layout of a file in terms of how the data within the file is organized.
  • file transfer - File transfer is the movement of one or more files from one location to another.
  • FileMaker (FMP) - FileMaker is a relational database application in which an individual may design -- and easily share on the Internet -- a database file by starting with a blank document or implementing ready-made and customizable templates.
  • fixed data (permanent data, reference data, archival data, or fixed-content data) - Fixed data (sometimes referred to as permanent data) is data that is not, under normal circumstances, subject to change.
  • flat file - A flat file contains records that have no structured interrelationship.
  • flexfield - In an Oracle environment, a flexfield is a database field that has flexibility built into it so that users can define reporting structures that are relevant to their specific organizations.
  • foreign key - A foreign key is a column or set of columns in one table that refers to the primary key in another table, linking the two tables.
  • framework - In computer systems, a framework is often a layered structure indicating what kind of programs can or should be built and how they would interrelate.
  • full-text database - A full-text database is a compilation of documents or other information in the form of a database in which the complete text of each referenced document is available for online viewing, printing, or downloading.
  • GIS (geographic information system) - A GIS (geographic information system) enables you to envision the geographic aspects of a body of data.
  • Good Automated Laboratory Practices (GALP) - Good Automated Laboratory Practices (GALP) is a standardized set of best practices that are used to ensure data integrity for laboratory data that is gathered, processed, and archived by a laboratory information management system (LIMS).
  • Google BigQuery - Google BigQuery is a cloud-based big data analytics web service for processing very large read-only data sets.
  • Google Bigtable - Google Bigtable is a distributed, column-oriented data store created by Google Inc.
  • graph analytics - Graph analytics is a category of tools used to visually represent big data and apply algorithms to better understand the relationship between graph database entries.
  • graph database - A graph database is a type of NoSQL database that uses graph theory to store, map and query relationships.
  • Hadoop Distributed File System (HDFS) - The Hadoop Distributed File System (HDFS) is the primary data storage system used by Hadoop applications.
  • Hancock - Hancock is a C-based programming language developed by AT&T specifically for data mining telephone and Internet records.
  • hashing - Hashing is the transformation of a string of characters into a usually shorter fixed-length value or key that represents the original string.
  • HPFS (High Performance File System) - HPFS (High Performance File System) is the file system introduced with IBM's OS/2 Version 1.
  • hybrid online analytical processing (HOLAP or Hybrid OLAP) - Hybrid online analytical processing (HOLAP) is a combination of relational OLAP (ROLAP) and multidimensional OLAP (usually referred to simply as OLAP).
  • IDEF (Integrated Definition) - IDEF (for Integrated Definition) is a group of modeling methods that can be used to describe operations in an enterprise.
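A few of the terms above lend themselves to short illustrations. First, Data Definition Language (DDL) and foreign key: the sketch below uses Python's built-in sqlite3 module; the table and column names (customer, orders) are hypothetical examples, not from any real schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces foreign keys only when enabled

# DDL statements define the structures in the database.
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id)  -- foreign key
    )
""")

conn.execute("INSERT INTO customer (id, name) VALUES (1, 'Ada')")
conn.execute("INSERT INTO orders (id, customer_id) VALUES (10, 1)")  # valid reference

try:
    # Violates the foreign key: no customer with id 999 exists.
    conn.execute("INSERT INTO orders (id, customer_id) VALUES (11, 999)")
    fk_enforced = False
except sqlite3.IntegrityError:
    fk_enforced = True

print(fk_enforced)  # True
```

The rejected insert shows the referential-integrity guarantee a foreign key provides: a child row cannot point at a parent row that does not exist.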
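Hashing, as defined above, transforms an input of any length into a fixed-length value. A minimal sketch with Python's standard hashlib module (the input strings are arbitrary examples):

```python
import hashlib

# SHA-256 always produces a 256-bit digest (64 hex characters),
# regardless of how long the input is.
short_digest = hashlib.sha256(b"a").hexdigest()
long_digest = hashlib.sha256(b"a much longer string of input data").hexdigest()

print(len(short_digest), len(long_digest))  # 64 64
```

The same input always yields the same digest, which is why hashes can serve as compact keys that represent the original data.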
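The delimiter and flat file entries can be shown together: a flat file holds records with no structured interrelationship, one per line, with a delimiter separating the fields. A sketch using Python's standard csv module on made-up data:

```python
import csv
import io

# A flat file: one record per line, comma as the field delimiter.
flat_file = "id,name,city\n1,Ada,London\n2,Grace,New York\n"

# DictReader uses the first line as field names and splits on the delimiter.
rows = list(csv.DictReader(io.StringIO(flat_file)))

print(rows[0]["name"])  # Ada
```

Changing the `delimiter` argument of DictReader would parse tab- or pipe-separated files the same way.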
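The extract, transform, load (ETL) entry describes three functions combined into one tool; the shape of a pipeline can be sketched in a few lines. This is a toy illustration, not any vendor's ETL API, and all record fields are invented:

```python
# Source records, as they might arrive from an operational system:
# string-typed ids, inconsistent whitespace and capitalization.
source = [{"id": "1", "name": " ada "}, {"id": "2", "name": "grace"}]

def transform(record):
    # Transform step: normalize types and scrub the dirty data.
    return {"id": int(record["id"]), "name": record["name"].strip().title()}

target = {}
for record in source:              # extract: read each source record
    clean = transform(record)      # transform: clean and convert it
    target[clean["id"]] = clean    # load: write it to the target store

print(target[1]["name"])  # Ada
```

Real ETL tools add scheduling, error handling, and bulk loading, but the three-step structure is the same.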
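Finally, the graph database entry: such stores treat relationships as first-class data. The sketch below models a tiny graph as an adjacency map and answers a reachability query with breadth-first traversal; the node names are illustrative and this stands in for what a real graph database would do with a query language.

```python
from collections import deque

# A toy graph: each node maps to the nodes it has edges to.
edges = {"alice": ["bob"], "bob": ["carol"], "carol": []}

def reachable(start):
    # Breadth-first traversal over the relationship structure.
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        if node not in seen:
            seen.add(node)
            queue.extend(edges.get(node, []))
    return seen

print(sorted(reachable("alice")))  # ['alice', 'bob', 'carol']
```

In a relational database the same query would need repeated joins; a graph store follows the edges directly.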
