
Business intelligence - business analytics

This glossary contains definitions related to customer data analytics, predictive analytics, data visualization and operational business intelligence. Some definitions explain the meaning of words related to Hadoop and other software tools used in big data analytics. Other definitions cover the strategies that business intelligence professionals, data scientists, statisticians and data analysts use to make data-driven decisions.

CON - GAR

  • confirmation bias - Confirmation bias is a type of cognitive error in which information that confirms a pre-existing belief is given priority over information that does not support that belief.
  • confusion matrix - A confusion matrix is a table that outlines different predictions and test results and contrasts them with real world values.
  • content analytics - Content analytics is the act of applying business intelligence (BI) and business analytics (BA) practices to digital content.
  • content management system (CMS) - A content management system (CMS) is an application program for creating and managing digital content in a collaborative environment.
  • control framework - A control framework is a data structure that organizes and categorizes an organization’s internal controls, which are practices and procedures established to create business value and minimize risk.
  • core banking system - A core banking system is the software used to support a bank’s most common transactions.
  • corporate performance management (CPM) - Corporate performance management (CPM) is a term used to describe the various processes and methodologies involved in aligning an organization's strategies and goals to its plans and executions in order to control the success of the company.
  • correlation - Correlation is a statistical measure that indicates the extent to which two or more variables fluctuate in relation to each other.
  • correlation coefficient - A correlation coefficient is a statistical measure of the degree to which changes to the value of one variable predict change to the value of another.
  • Cost Per Call - In a call center, cost per call is a numerical metric calculated by dividing the total operational costs by the total number of calls for a given period of time.
  • cost per impression - Cost per Impression (CPI) is a business efficiency measure common to advertising in paper and web-based media.
  • cost per sale (CPS) - The cost per sale (CPS), also known as the pay per sale, is a metric used by advertising teams to determine the amount of money paid for every sale generated by a specific advertisement.
  • CRM (customer relationship management) - Customer relationship management (CRM) is the combination of practices, strategies and technologies that companies use to manage and analyze customer interactions and data throughout the customer lifecycle.
  • crowdcasting - Crowdcasting is a problem-solving and idea-generating tactic in which a corporation disseminates details of a specific problem or situation to a carefully chosen group of people for possible solutions.
  • CRUD cycle (Create, Read, Update and Delete Cycle) - The CRUD cycle describes the elemental functions of a persistent database in a computer.
  • customer analytics (customer data analytics) - Customer analytics, also called customer data analytics, is the systematic examination of a company's customer information and customer behavior to identify, attract and retain the most profitable customers.
  • customer intelligence (CI) - Customer intelligence (CI) is information derived from customer data that an organization collects from both internal and external sources.
  • CVO (chief visionary officer) - Chief Visionary Officer (CVO) is a new title being used in corporations to differentiate the holder from other corporate executives including the Chief Executive Officer (CEO), the Chief Financial Officer (CFO), the Chief Information Officer (CIO), and the Chief Technology Officer (CTO).
  • data anonymization - Data anonymization is the process of altering data so that its source cannot be traced.
  • data breach - A data breach is a confirmed incident in which sensitive, confidential or otherwise protected data has been accessed and/or disclosed in an unauthorized fashion.
  • data classification - Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
  • data confabulation - Data confabulation is a business intelligence term for the selective and possibly misleading use of data to support a decision that has already been made.
  • data context - Data context is the network of connections among data points.
  • data democratization - Data democratization is the ability for information in a digital format to be accessible to the average end user.
  • data exhaust - Data exhaust is a byproduct of user actions online and consists of the various files generated by web browsers and their plug-ins, such as cookies, log files and temporary internet files.
  • data federation software - Data federation software is programming that provides an organization with the ability to collect data from disparate sources and aggregate it in a virtual database where it can be used for business intelligence (BI) or other analysis.
  • data gravity - Data gravity is an attribute of data that is manifest in the way software and services are drawn to it relative to its mass (the amount of data).
  • data hygiene - Data hygiene is the collective processes conducted to ensure the cleanliness of data.
  • data integration - Data integration is the process of combining data from multiple source systems to create unified sets of information for both operational and analytical uses.
  • data journalism - Data journalism is an approach to writing for the public in which the journalist analyzes large data sets to identify potential news stories.
  • data latency - Data latency is the time it takes for data packets to be stored or retrieved.
  • data life cycle management (DLM) - Data life cycle management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its life cycle: from creation and initial storage to the time when it becomes obsolete and is deleted.
  • data literacy - Data literacy is the ability to derive information from data, just as literacy in general is the ability to derive information from the written word.
  • data marketplace (data market) - A data marketplace (data market) is an online store where data is bought and sold; data marketplaces typically offer various types of data for different markets and from different sources.
  • data masking - Data masking is a method of creating a structurally similar but inauthentic version of an organization's data that can be used for purposes such as software testing and user training.
  • data preparation - Data preparation is the process of gathering, combining, structuring and organizing data so it can be used in business intelligence (BI), analytics and data visualization applications.
  • data protection management (DPM) - Data protection management (DPM) is the administration of backup processes to ensure that tasks run on schedule, and that data is securely backed up and recoverable.
  • data sampling - Data sampling is a statistical analysis technique used to select, manipulate and analyze a representative subset of data points to identify patterns and trends in the larger data set being examined.
  • data science as a service (DSaaS) - Data science as a service (DSaaS) is a form of outsourcing that involves the delivery of information gleaned from advanced analytics applications run by data scientists at an outside company to corporate clients for their business use.
  • data science platform - A data science platform is software that includes a variety of technologies for machine learning and other advanced analytics uses, enabling data scientists to plan strategy, uncover actionable insights from data and communicate those insights throughout an enterprise within a single environment.
  • data scientist - A data scientist is a professional responsible for collecting, analyzing and interpreting extremely large amounts of data.
  • Data Security Council of India (DSCI) - The Data Security Council of India (DSCI) is a not-for-profit organization created to promote the country as a secure destination for information technology (IT) outsourcing.
  • data smog - Data smog refers to the volume and velocity of data that is being created by devices connected to the Internet of Things.
  • data storytelling - Data storytelling is the process of translating complex data analyses into layman's terms in order to influence a decision or action.
  • data visualization - Data visualization is the practice of translating information into a visual context, such as a map or graph, to make data easier for the human brain to understand and pull insights from.
  • data visualization (charts, graphs, dashboards, fever charts, heat maps, etc.) - Data visualization is a graphical representation of numerical data.
  • data-driven disaster - A data-driven disaster is a serious problem caused by one or more ineffective data analysis processes.
  • database management system (DBMS) - A database management system (DBMS) is system software for creating and managing databases.
  • database-agnostic - Database-agnostic is a term describing the capacity of software to function with any vendor’s database management system (DBMS).
  • de-anonymization (deanonymization) - De-anonymization is a method used to detect the original data that was subjected to processes designed to make it impossible -- or at least harder -- to identify personally identifiable information (PII).
  • Decision Model and Notation (DMN) - Decision Model and Notation (DMN) is a formalized method of making and mapping out decisions through official business processes.
  • decision tree - A decision tree is a graph that uses a branching method to illustrate every possible outcome of a decision.
  • decision-making process - The decision-making process, in a business context, is a set of steps taken by managers in an enterprise to determine the planned path for business initiatives and to set specific actions in motion.
  • deductive argument - A deductive argument is the presentation of statements that are assumed or known to be true as premises for a conclusion that necessarily follows from those statements.
  • deductive reasoning - Deductive reasoning is a logical process in which a conclusion is based on the accordance of multiple premises that are generally assumed to be true.
  • deep analytics - Deep analytics is the application of sophisticated data processing techniques to yield information from large and typically multi-source data sets composed of both unstructured and semi-structured data.
  • deep learning - Deep learning is a type of machine learning (ML) and artificial intelligence (AI) that imitates the way humans gain certain types of knowledge.
  • Demand Planning - Demand planning is the process of forecasting the demand for a product or service so it can be produced and delivered more efficiently and to the satisfaction of customers.
  • Demand signal repository (DSR) - Demand signal repository (DSR) is a database that aggregates sales data at the point of sale (POS).
  • descriptive analytics - Descriptive analytics is a preliminary stage of data processing that creates a summary of historical data to yield useful information and possibly prepare the data for further analysis.
  • descriptive modeling - Descriptive modeling is a mathematical process that describes real-world events and the relationships between factors responsible for them.
  • direct digital marketing (DDM) - Direct digital marketing (DDM) is the electronic delivery of relevant communications to specific recipients.
  • document capture - Document capture is any one of several processes used to convert a physical document to another format, typically a digital representation.
  • driver-based planning - Driver-based planning is an approach to management that identifies an organization's key business drivers and creates a series of business plans that mathematically model how those things most necessary for the organization's success would be affected by different variables.
  • dynamic BPM (business process management) - Dynamic business process management (BPM) is an approach designed to allow business processes to adjust quickly to changing business needs.
  • dynamic case management (DCM) - Dynamic case management (DCM) is the handling of case-based work through the use of technologies that automate and streamline aspects of each case.
  • e-business (electronic business) - E-business (electronic business) is the conduct of business processes on the internet.
  • e-score - The e-score is a consumer rating metric used to determine an individual's potential value as a customer and to use that information to guide marketing efforts.
  • EBITDA (earnings before interest, taxes, depreciation, and amortization) - EBITDA stands for 'earnings before interest, taxes, depreciation, and amortization'.
  • econometrics - Econometrics is the analysis and testing of economic theories to verify hypotheses and improve prediction of financial trends.
  • edge analytics - Edge analytics is an approach to data collection and analysis in which an automated analytical computation is performed on data at a sensor, network switch or other device instead of waiting for the data to be sent back to a centralized data store.
  • Elastic - Elastic is a software company that provides products and services related to Elasticsearch, its distributed enterprise search engine.
  • Elastic Stack - Elastic Stack is a group of open source products from Elastic designed to help users take data from any type of source and in any format and search, analyze, and visualize that data in real time.
  • embedded analytics - Embedded analytics is the integration of business intelligence (BI) tools and capabilities into business software, including customer relationship management (CRM), enterprise resource planning (ERP), marketing automation and financial systems.
  • embedded BI (embedded business intelligence) - Embedded BI (business intelligence) is the integration of self-service BI tools into commonly used business applications.
  • emotional intelligence (EI) - Emotional intelligence (EI) is the area of cognitive ability that facilitates interpersonal behavior.
  • empiricism - Empiricism is the theory that human knowledge comes predominantly from experiences gathered through the five senses.
  • encryption key management - Encryption key management is the administration of tasks involved with protecting, storing, backing up and organizing encryption keys.
  • ensemble modeling - Ensemble modeling is the process of running two or more related but different analytical models and then synthesizing the results into a single score or spread in order to improve the accuracy of predictive analytics and data mining applications.
  • enterprise - In the computer industry, an enterprise is an organization that uses computers.
  • enterprise architecture (EA) - An enterprise architecture (EA) is a conceptual blueprint that defines the structure and operation of an organization.
  • enterprise content management (ECM) - Enterprise content management (ECM) is a set of defined processes, strategies and tools that allow a business to effectively obtain, organize, store and deliver critical information to its employees, business stakeholders and customers.
  • enterprise document management (EDM) - Enterprise document management is a strategy for overseeing an organization's paper and electronic documents so they can be easily retrieved in the event of a compliance audit or subpoena.
  • Enterprise Identity Mapping (EIM) - Enterprise Identity Mapping (EIM) is an open architecture from IBM for helping an enterprise manage the multiple user registries and identities that enable a computer user to access multiple applications with a single sign-on.
  • enterprise information portal (EIP) - The enterprise information portal (EIP), also known as a business portal, is a concept for a Web site that serves as a single gateway to a company's information and knowledge base for employees and possibly for customers, business partners, and the general public as well.
  • enterprise mashup (or data mashup) - An enterprise mashup is the integration of heterogeneous digital data and applications from multiple sources for business purposes.
  • enterprise relationship management (ERM) - Enterprise relationship management (ERM) is software that analyzes data it has about its customers to develop a better understanding of the customer and how the customer is using its products and services.
  • enterprise search - Enterprise search is the organized retrieval of content from across an organization; there are a number of kinds of enterprise search, including local installations, hosted versions and search appliances, sometimes called "search in a box."
  • explicit data - Explicit data is data that is provided intentionally and taken at face value rather than analyzed or interpreted for further meaning.
  • Extract, Load, Transform (ELT) - Extract, Load, Transform (ELT) is a data integration process for transferring raw data from a source server to a data system (such as a data warehouse or data lake) on a target server and then preparing the information for downstream uses.
  • Fair Information Practices (FIP) - FIP (Fair Information Practices) is a general term for a set of standards governing the collection and use of personal data and addressing issues of privacy and accuracy.
  • falsifiability - Falsifiability is the capacity for some proposition, statement, theory or hypothesis to be proven wrong.
  • fast data - Fast data is the application of big data analytics to smaller data sets in near-real or real-time in order to solve a problem or create business value.
  • financial analytics - Financial analytics is the creation of ad hoc analysis to answer specific business questions and forecast possible future financial scenarios.
  • financial data management - Financial data management (FDM) is a process and policy, usually assisted by specialized software, that allows an enterprise or institution to consolidate its financial information, maintain compliance with accounting rules and laws, and produce detailed financial reports.
  • firmographic data - Firmographic data is descriptive information used to categorize organizations, such as location, name, number of clients, industry and so on.
  • focus group - A focus group is a small and typically diverse panel of people selected to survey for their opinions on a given subject.
  • fresh data - Fresh data is data that is current and immediately usable and useful.
  • gap analysis - A gap analysis assesses the differences between the current and desired performance levels of a company's systems or applications.
  • Gartner - Gartner is an information technology (IT) research and consultancy company, formerly known as Gartner Group.
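The confusion matrix entry above can be made concrete with a short sketch. The 0/1 labels and the Counter-based tallying are illustrative choices here, not tied to any particular analytics tool:

```python
# Minimal sketch: tally a binary classifier's predictions against
# real-world values into the four cells of a 2x2 confusion matrix.
from collections import Counter

def confusion_matrix(actual, predicted):
    """Count (actual, predicted) label pairs into TP/FP/FN/TN cells."""
    counts = Counter(zip(actual, predicted))
    return {
        "TP": counts[(1, 1)],  # predicted positive, actually positive
        "FP": counts[(0, 1)],  # predicted positive, actually negative
        "FN": counts[(1, 0)],  # predicted negative, actually positive
        "TN": counts[(0, 0)],  # predicted negative, actually negative
    }

actual    = [1, 0, 1, 1, 0, 0]
predicted = [1, 0, 0, 1, 1, 0]
print(confusion_matrix(actual, predicted))  # {'TP': 2, 'FP': 1, 'FN': 1, 'TN': 2}
```

From these four counts, familiar metrics such as accuracy and precision follow directly.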
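The correlation coefficient entry above describes the Pearson coefficient, which can be computed from first principles in a few lines (a sketch; real work would normally use a statistics library):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A perfectly linear relationship yields a coefficient of 1.0.
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 3))  # 1.0
```

Values near +1 or -1 indicate that one variable strongly predicts the other; values near 0 indicate no linear relationship.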
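The CRUD cycle entry above names the four elemental database functions; an in-memory sketch shows the cycle end to end (a real system would persist records in a database rather than a dict):

```python
# Illustrative Create, Read, Update, Delete cycle over an in-memory store.
class RecordStore:
    def __init__(self):
        self._rows = {}
        self._next_id = 1

    def create(self, data):
        rid = self._next_id
        self._rows[rid] = data
        self._next_id += 1
        return rid

    def read(self, rid):
        return self._rows.get(rid)

    def update(self, rid, data):
        if rid in self._rows:
            self._rows[rid] = data
            return True
        return False

    def delete(self, rid):
        return self._rows.pop(rid, None) is not None

store = RecordStore()
rid = store.create({"name": "Ada"})           # Create
store.update(rid, {"name": "Ada Lovelace"})   # Update
print(store.read(rid))                        # Read -> {'name': 'Ada Lovelace'}
store.delete(rid)                             # Delete
print(store.read(rid))                        # None
```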
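The data masking entry above can be illustrated with one common technique, character substitution; the `mask` helper and its parameters are hypothetical, and production masking is usually more sophisticated:

```python
# Sketch of character-substitution masking: the masked value keeps the
# shape of the original (same length) but hides all but its tail,
# making it usable for testing or training without exposing real data.
def mask(value, visible=4, fill="*"):
    if len(value) <= visible:
        return fill * len(value)
    return fill * (len(value) - visible) + value[-visible:]

print(mask("4111111111111111"))  # ************1111
```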
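The data sampling entry above mentions selecting a representative subset; the simplest variant, simple random sampling without replacement, can be sketched with the standard library (the seeding is an illustrative choice to make the draw repeatable):

```python
import random

def simple_random_sample(data, k, seed=None):
    """Draw k distinct points uniformly at random from the data set."""
    rng = random.Random(seed)
    return rng.sample(data, k)

population = list(range(1000))
subset = simple_random_sample(population, 50, seed=42)
print(len(subset), len(set(subset)))  # 50 50
```

Stratified and systematic sampling refine this idea when subgroups must be represented proportionally.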
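The ensemble modeling entry above describes synthesizing several models' outputs into one score; the simplest synthesis, averaging, can be sketched as follows (the model scores are made-up illustrative values):

```python
# Sketch of score averaging: combine per-record scores from several
# models into a single ensemble score per record.
def ensemble_average(predictions_per_model):
    """Average the scores each model assigns to each record."""
    return [sum(scores) / len(scores) for scores in zip(*predictions_per_model)]

model_a = [0.9, 0.2, 0.6]  # hypothetical scores from model A
model_b = [0.7, 0.4, 0.8]  # hypothetical scores from model B
print([round(s, 2) for s in ensemble_average([model_a, model_b])])  # [0.8, 0.3, 0.7]
```

Weighted averages or majority voting are common alternatives when some models are more trusted than others.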
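The Extract, Load, Transform (ELT) entry above can be sketched with an in-memory SQLite target: raw data is loaded first, then transformed with SQL inside the target system. The table names and sample rows are hypothetical:

```python
import sqlite3

# Extract: raw rows as strings, exactly as they arrived from the source.
raw = [("2024-01-01", "99.5"), ("2024-01-02", "101.25")]

con = sqlite3.connect(":memory:")

# Load: land the raw data in the target untouched.
con.execute("CREATE TABLE raw_prices (day TEXT, price TEXT)")
con.executemany("INSERT INTO raw_prices VALUES (?, ?)", raw)

# Transform: cast and reshape inside the target, after loading.
con.execute("""CREATE TABLE prices AS
               SELECT day, CAST(price AS REAL) AS price
               FROM raw_prices""")

print(con.execute("SELECT SUM(price) FROM prices").fetchone()[0])  # 200.75
```

The contrast with ETL is the ordering: here the target system, not a separate staging engine, does the transformation work.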
