Data Management Terms

The entries below define the terms associated with the Data Management Glossary.
Data Element

is Technical Metadata describing a unit of data that is considered in context to be indivisible (ISO 11179-1:2015).

Note that a Data Element may realize a Business Element in physical form.
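
As a loose illustration only, a Data Element can be pictured as a small metadata record; the class and field names below are hypothetical and not drawn from ISO 11179.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of a Data Element as Technical Metadata.
# Class and field names are illustrative, not taken from ISO 11179.
@dataclass(frozen=True)
class DataElement:
    name: str                       # physical name, e.g. a column name
    data_type: str                  # in context, the element is indivisible
    realizes: Optional[str] = None  # Business Element realized, if any

trade_price = DataElement(name="TRADE_PX", data_type="decimal(18,6)",
                          realizes="Trade Price")
print(trade_price)
```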

Data Element Concept

See Business Element. This is an ISO 11179 definition that is equivalent to the Business Element concept.

Data Flow

is the flow of data from one point to another, at a specific level of granularity and without involving any intermediaries. A data flow may be identified at different levels of granularity (e.g. Organization, Application, Transport layer).
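
A minimal sketch of how a single data flow might be recorded; the system names and granularity labels are invented for illustration.

```python
from dataclasses import dataclass

# Illustrative record of a Data Flow: data moving from one point to
# another at a stated level of granularity, with no intermediaries.
@dataclass(frozen=True)
class DataFlow:
    source: str
    target: str
    granularity: str  # e.g. "Organization", "Application", "Transport"

flow = DataFlow(source="Trading System", target="Risk Warehouse",
                granularity="Application")
print(flow)
```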

Data Governance Framework

is a Framework that provides Policies, Procedures and Controls to ensure data is managed appropriately within the organization.

Data Identifier Framework

is an Architecture Framework that provides standards and best practices on identifiers for data.

Data Integration Framework

is an Architecture Framework that provides standards and best practices on how data is moved and stored.

Data Lineage

is a record of the sequence of Data Flows involving a data element, from the reference point of interest back to a point that meets the agreed requirements or risk appetite. The information captured about the movement may additionally include data elements that were used to create the requested data element.
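
One non-authoritative way to picture this: lineage is the ordered list of flows walked backwards from the reference point. The flows and system names below are invented.

```python
# Illustrative sketch: Data Lineage as the ordered sequence of Data Flows
# traced backwards from a reference point to an agreed origin.
flows = [
    ("Source Ledger", "Staging Area"),
    ("Staging Area", "Risk Warehouse"),
    ("Risk Warehouse", "Regulatory Report"),
]

def lineage(point_of_interest, flows):
    """Walk (source, target) flows backwards from the point of interest."""
    steps, current = [], point_of_interest
    while True:
        upstream = [f for f in flows if f[1] == current]
        if not upstream:
            return steps
        steps.append(upstream[0])
        current = upstream[0][0]

print(lineage("Regulatory Report", flows))
# [('Risk Warehouse', 'Regulatory Report'),
#  ('Staging Area', 'Risk Warehouse'),
#  ('Source Ledger', 'Staging Area')]
```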

Data Management

is the development and execution of architectures, policies, practices and procedures in order to manage the information lifecycle needs of an enterprise in an effective manner.

See searchdatamanagement.techtarget.com/definition/data-management for more information.

Data Management Assessment

is a holistic view covering a number of Data Management aspects to better determine strengths and weaknesses within an Organization. These can include "Data Architecture Current State Assessment", "Data Architecture Tools Assessment" and "Data Quality Assessment".

Data Management Framework

is a Framework that organizes Data Management constructs in a manner that can then be executed to achieve one or more Data Management principles.

Data Model

is a graphical and/or lexical representation of data, specifying their properties, structure and inter-relationships
[ISO/IEC 11179-1:2004:3.2.7]

Data Modeling Framework

is an Architecture Framework that provides standards and best practices for data modeling.

Data Movement Control

is a Data Control that manages the correct delivery of data (completeness, timeliness, etc.)

Data Owner

is a loose generic Non-Preferred Term that can represent an individual, platform or organization that is either responsible or accountable for some aspect of data.

Data Ownership

is a loose generic Non-Preferred Term for the responsibility or accountability that an individual, platform or organization has for some aspect of data.

Data Point Modelling

The Data Point Model (DPM) is a data modelling methodology developed by BR-AG, a company specializing in XBRL and related technologies.

The DPM has the following characteristics:

  • It produces a multidimensional model
  • It also uses hierarchies and domains to describe the relationships between attributes and to allow defined subcategories and values
  • It generally aims to produce a small number of main reporting items, which can then be narrowed to represent a specific ‘data point’ using a range of metadata attributes (modeled using the dimensions and domains).
  • Implicit information about a reported concept is usually made explicit; for example, measurement information implied elsewhere in a note.
  • The resulting tables produce a description or map of all possible data points resulting from the main reporting items and dimensions; this can then be limited to those useful or valid in actual reporting.

For more information see: www.wikixbrl.info/index.php?title=Guidelines_for_Data_Point_Modeling
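
As a rough sketch of the DPM idea above: a base reporting item crossed with dimension members yields the map of candidate data points. The item, dimensions and members below are invented for illustration, not taken from any actual DPM taxonomy.

```python
from itertools import product

# Hypothetical DPM-style sketch: one base reporting item crossed with
# dimensions whose members come from small, defined domains.
base_item = "Carrying amount"
dimensions = {
    "Instrument": ["Loans", "Debt securities"],
    "Counterparty": ["Households", "Corporates"],
}

# Every combination of dimension members is a candidate data point;
# actual reporting would restrict this map to the valid subset.
for combo in product(*dimensions.values()):
    data_point = dict(zip(dimensions.keys(), combo))
    print(base_item, data_point)
```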

Data Policies

are the collection of individual policies (see Data Policy) that are used to holistically manage and govern data within the organization.

Data Policy

is a policy that covers the control, distribution, and usage of data within the organization.

Data Principle

defines the underlying general rules and guidelines for the use and deployment of all data across the organization.

Data Protection

is an articulation of the rights and obligations that organizations and people have when storing or processing data, in order to prevent misuse.

Data Provenance

is the documentation of data in sufficient detail to allow reproducibility of a specific Data Set.

[Data] Provenance can be used for many purposes, such as understanding how data was collected so it can be meaningfully used, determining ownership and rights over an object, making judgements about information to determine whether to trust it, verifying that the process and steps used to obtain a result complies with given requirements, and reproducing how something was generated. 

https://www.w3.org/TR/prov-primer/
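
A minimal sketch in the spirit of the W3C PROV model cited above, pairing an entity with the activity that generated it and the agent responsible; all identifiers and timestamps are hypothetical.

```python
# Minimal provenance record in the spirit of W3C PROV
# (entity / activity / agent). All identifiers are invented.
provenance = {
    "entity": "report:q3-risk-summary",
    "wasGeneratedBy": {
        "activity": "job:aggregate-positions",
        "startedAtTime": "2024-09-30T02:00:00Z",
        "used": ["dataset:positions-2024-09-27"],
    },
    "wasAttributedTo": "agent:risk-reporting-service",
}
print(provenance["wasGeneratedBy"]["used"])
```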

Data Quality

is a Data Content Control that provides a quantitative assessment of a Data Set, using a set of quantitative and qualitative rules to verify that the dataset is fit for its intended purpose.
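
For illustration, a hedged sketch of scoring a small dataset against simple quality rules; the records, rules and pass-rate measure are all assumptions.

```python
# Illustrative sketch: assessing a small dataset against quality rules.
records = [
    {"id": 1, "price": 101.5, "currency": "USD"},
    {"id": 2, "price": None,  "currency": "USD"},
    {"id": 3, "price": 99.0,  "currency": ""},
]

rules = {
    "price populated":    lambda r: r["price"] is not None,
    "currency populated": lambda r: bool(r["currency"]),
}

# Report the pass rate per rule as a simple fitness measure.
for name, rule in rules.items():
    passed = sum(rule(r) for r in records)
    print(f"{name}: {passed}/{len(records)} pass")
```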

Data Quality Assessment

is the process of determining the relative data quality of the information in a Data Domain.

Data Quality Framework

is an Architecture Framework that provides standards and best practices on how data quality is measured, and on the processes associated with measurement, remediation and root cause analysis.

Data Quality Rule

is a rule that applies an algorithm or logic to a Data Element or Dataset to measure whether the content is fit for purpose. These rules are technical implementations of either a Business Rule or a Technical Rule.
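
A minimal sketch of a single rule, assuming the invented business rule "a trade price must be a positive number":

```python
# Hypothetical Data Quality Rule: a technical implementation of the
# business rule "a trade price must be a positive number".
def price_is_positive(value):
    """Return True when the price element's content is fit for purpose."""
    return isinstance(value, (int, float)) and value > 0

assert price_is_positive(101.5)
assert not price_is_positive(-3.0)
assert not price_is_positive(None)  # missing content fails the rule
```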

Data Retention Control

is a Data Control that uses Data Classification to determine when data should be removed or archived as part of the Data Retention Policy.
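
An illustrative sketch, assuming invented classifications and retention periods; a real control would take both from the Data Retention Policy.

```python
from datetime import date, timedelta

# Hypothetical retention control: Data Classification drives how long
# records are kept before archival or removal. Periods are invented.
RETENTION_DAYS = {"transactional": 7 * 365, "log": 90}

def retention_action(classification, created, today=None):
    today = today or date.today()
    limit = timedelta(days=RETENTION_DAYS[classification])
    return "archive-or-remove" if today - created > limit else "retain"

print(retention_action("log", date(2024, 1, 1), today=date(2024, 6, 1)))
# archive-or-remove
```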

Data Retention Policy

is a recognized Data Policy within an organization for retaining data for operational use while ensuring adherence to the laws and regulations concerning the data.

Data Set

is a collection of data, published or curated by a single agent, and available for access or download in one or more formats. Also known as a DataSet.

Data Set Instance
Data Sharing

is the disclosure of data from one or more organisations to a third party organisation or organisations, or the sharing of data between different parts of an organisation. Data sharing can take the form of:

  • a reciprocal exchange of data;
  • one or more organisations providing data to a third party or parties;
  • several organisations pooling information and making it available to each other;
  • several organisations pooling information and making it available to a third party or parties;
  • exceptional, one-off disclosures of data in unexpected or emergency situations; or
  • different parts of the same organisation making data available to each other.

Data Sharing Agreement

is an agreement that sets out a common set of rules to be adopted by the various organisations involved in a data sharing operation.

The Data Sharing Agreement should minimally document the following:

  • the purpose, or purposes, of the sharing;
  • the potential recipients or types of recipient and the circumstances in which they will have access;
  • the data to be shared;
  • data quality – accuracy, relevance, usability etc;
  • data security;
  • retention of shared data;
  • individuals’ rights – procedures for dealing with access requests, queries and complaints;
  • review of effectiveness/termination of the sharing agreement; and
  • sanctions for failure to comply with the agreement or breaches by individual staff.
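
As an illustration only, the checklist above could be captured as a structured record; every field value below is hypothetical.

```python
# Illustrative record of a Data Sharing Agreement's minimal content,
# mirroring the checklist above. All values are invented.
agreement = {
    "purpose": "fraud detection across group entities",
    "recipients": ["Group Risk", "Internal Audit"],
    "data_shared": ["customer id", "transaction history"],
    "data_quality": "monthly accuracy and relevance review",
    "data_security": "encrypted transfer, role-based access",
    "retention": "24 months after last use",
    "individual_rights": "access requests routed to the privacy office",
    "review": "annual effectiveness review",
    "sanctions": "escalation to the data governance board",
}
print(agreement["purpose"])
```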


Data Standard

is a specification that describes or defines other data. [ISO11179]

Data Storage Framework

is an Architecture Framework that provides standards and best practices on how technology choices influence how data is stored.

Data Taxonomy

is a taxonomy that provides a way to classify data across the organization. Different data taxonomies may exist depending on the use case for the taxonomy.
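
A hedged sketch of one possible taxonomy; the categories and elements are invented, and a real taxonomy would depend on the use case.

```python
# Hypothetical Data Taxonomy: a simple classification tree.
taxonomy = {
    "Customer Data": {
        "Identity": ["name", "date of birth"],
        "Contact": ["email", "phone"],
    },
    "Transaction Data": {
        "Payments": ["amount", "currency"],
    },
}

def classify(element, tree):
    """Return the (category, subcategory) path containing an element."""
    for category, subtree in tree.items():
        for subcategory, elements in subtree.items():
            if element in elements:
                return category, subcategory
    return None

print(classify("email", taxonomy))  # ('Customer Data', 'Contact')
```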

Data Traceability

is the ability to track a data construct back to the construct it was derived from as a more concrete instantiation.

For example, a physical column may trace back to a logical attribute, which in turn may trace to a business term, which traces back to a concept. Data Traceability is commonly confused with Data Lineage, Data Provenance and Link Analysis.
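
The example chain above can be sketched as a simple derived-from mapping; the construct names are hypothetical.

```python
# Illustrative trace chain: physical column -> logical attribute ->
# business term -> concept. All names are invented.
derived_from = {
    "column:TRADE_PX": "attribute:tradePrice",
    "attribute:tradePrice": "term:Trade Price",
    "term:Trade Price": "concept:Price",
}

def trace(construct):
    """Follow each construct back to the construct it derives from."""
    chain = [construct]
    while construct in derived_from:
        construct = derived_from[construct]
        chain.append(construct)
    return chain

print(trace("column:TRADE_PX"))
# ['column:TRADE_PX', 'attribute:tradePrice',
#  'term:Trade Price', 'concept:Price']
```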

Data Trust

is the assessment of the trustworthiness of the input data.

Deployment Control
Design Metadata

is Technical Metadata describing data that represents design constructs. This is often persisted in a form that, once deployed, can be harvested as Operational Metadata.

Detective Application Control

is a Detective Control operating within an Application that is designed to detect errors that may have occurred based on predefined logic or business rules.
https://www.isaca.org/Knowledge-Center/Documents/Glossary/glossary.pdf

Detective Control

is an Internal Control that is designed to detect errors that may have occurred based on predefined logic or business rules. It is usually executed after an action has taken place and often covers a group of transactions.
https://www.isaca.org/Knowledge-Center/Documents/Glossary/glossary.pdf
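
A minimal sketch of the idea, assuming an invented batch of transactions and a single predefined rule applied after the fact.

```python
# Hypothetical detective control: runs after the fact over a group of
# transactions and flags items that violate a predefined rule.
transactions = [
    {"id": "T1", "amount": 250.0},
    {"id": "T2", "amount": -40.0},   # error to be detected
    {"id": "T3", "amount": 900.0},
]

def detect_errors(batch, rule):
    """Return the transactions that fail the predefined rule."""
    return [t for t in batch if not rule(t)]

flagged = detect_errors(transactions, rule=lambda t: t["amount"] > 0)
print(flagged)  # [{'id': 'T2', 'amount': -40.0}]
```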

Domain Specific Framework

is a Framework specific to a particular domain in the organisation, which may have additional standards and best practices.

Element Data Lineage

is the Data Lineage recorded at a Business Element grain of detail.

EUDA

is an End User Developed Application. These are commonly not managed by IT, and are often custom spreadsheets.

FIBO

is an Industry Standard that is a business conceptual ontology developed by the members of the EDM Council. FIBO (Financial Industry Business Ontology) provides a description of the structure and contractual obligations of financial instruments, legal entities and financial processes. See www.edmcouncil.org/financialbusiness for more information.

Financial Model
First Line of Defense
Flow

is the movement of something in one direction.
See dictionary.cambridge.org/dictionary/english/flow for more information.

Folksonomy

is a collection of Terms allocated to resources by users in order to categorise or index them in a way that the users consider useful.
Terms in folksonomies, often called tags, are typically added in an uncontrolled manner, without any underlying structure or principles. They may be idiosyncratic, but may also use current terminology more quickly than it can be incorporated into a controlled and structured scheme.
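
As a loose sketch, a folksonomy is just uncontrolled tags attached to resources; the resource names and tags below are invented.

```python
from collections import defaultdict

# Illustrative folksonomy: users attach free-form tags to resources
# with no controlled vocabulary or underlying structure.
tags = defaultdict(set)
tags["report:q3-risk"].update({"risk", "quarterly", "mkt-risk"})
tags["report:q3-risk"].add("MarketRisk")  # idiosyncratic variant term

# Invert the mapping to find resources by tag.
by_tag = defaultdict(set)
for resource, resource_tags in tags.items():
    for t in resource_tags:
        by_tag[t].add(resource)

print(by_tag["risk"])  # {'report:q3-risk'}
```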

FpML

is an Industry Standard: an open source XML standard for electronic dealing and processing of derivatives. It establishes a protocol for sharing information electronically on, and dealing in, swaps, derivatives and structured products. See www.fpml.org for more information.

Framework

is an organizing construct used to collate related content together in a manner that can then be executed to achieve a goal.
