GLOSSARY

Agile: An approach often used in project and programme management to time-box delivery activity in iterations, or short planned phases, to achieve the overall goal incrementally whilst retaining a focus on quality. It originated with the Agile Manifesto in 2001, when a group of like-minded individuals defined a different, lighter approach to software development. It is now used well beyond software development, in a variety of forms – Scrum, dynamic systems development method (DSDM) and extreme programming (XP), to name a few.

Artificial intelligence: Machine intelligence, in which systems reason and act in order to make decisions. There are numerous forms of artificial intelligence (AI), from what is termed ‘weak AI’ or artificial narrow intelligence (ANI), which is designed for a specific purpose and is the most abundant today (for example Amazon’s Alexa or Apple’s Siri); to artificial general intelligence (AGI), which has a self-aware consciousness able to solve problems, learn and plan for the future; to artificial superintelligence (ASI), which goes beyond the capacity of the human brain and is still thought to be at the theoretical stage in terms of active applications.

Benefits: A quantifiable improvement achieved from an activity that transforms a situation, whether via a programme or simply a series of tasks. Typically, benefits are identified in advance of the activity so that progress can be monitored from an agreed baseline.

Big data: Significant and diverse data sets now being handled by many organisations – structured and unstructured – that are characterised by the three Vs: greater variety, velocity and volume. The explosion of internet-based data, whether from social media or Internet of Things (IoT) applications, and the doubling of available data every two years¹ have led to the development of different ways to capture and process data, and to two further Vs being added for consideration: value and veracity.

Business intelligence: An umbrella term for the technologies, processes and architectural principles that enable an organisation to transform raw data into the appropriate outputs for effective decision making, drawing on the insight into business performance that the whole suite of business intelligence activities coherently provides.

Change management: The use of structured processes to shift an organisation to a future state in which it delivers outcomes that achieve different, desired goals, with stakeholder engagement embedded throughout. Typically, change management is people-focused, so it is closely aligned with culture.

Culture: The shared values, attitudes and characteristics of an organisation. These evolve organically and are shaped by events that define expectations, norms and beliefs over time; they may be quite different from the published ‘corporate’ definition of the organisation’s culture. Whilst some of this may be documented, much of it will not be.

Data accessibility: The ease of gaining access to data within the organisation, recognising that controls may need to be in place for compliance purposes that restrict some access rights.

Data acquisition: The process of gaining data from other sources, including procurement of data from third party data providers. This may be to plug gaps in existing data sets or to enhance these data sets through the addition of data not available from within the organisation.

Data architecture: Models, policies, standards or rules that define how data is to be collected, stored, distributed, transformed and made accessible to relevant users to enable the achievement of business outcomes.

Data compliance: Ensuring data is captured, stored and made accessible in ways that conform to legislative and regulatory frameworks. These will include data privacy, data protection and data segregation, where appropriate.

Data exchange: The sending and receipt of data in a manner that enables it to be consumed in a structured format, preventing any loss or misinterpretation in the process.
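
As a minimal illustration, the sketch below round-trips a record through JSON, one common structured interchange format; the record fields are hypothetical.

    import json

    # Sender serialises a record into an agreed, structured format.
    record = {"customer_id": 1042, "name": "A. Example", "balance": 250.75}
    payload = json.dumps(record)

    # Receiver parses it back; the structure prevents loss or misreading.
    received = json.loads(payload)
    assert received == record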

Data exploitation: The application of a range of tools and techniques to data to enable meaning and, therefore, value to be drawn from it. It is the interpretation and knowledge derived from data, and this ranges from the presentation of information via reporting, through analytics and insight to the world of artificial intelligence.

Data governance: The collective term that encompasses processes, roles, policies, standards and measures to manage and protect the data asset within an organisation. It includes people, process and technology, and is the key driver in ensuring data quality, compliance and integrity are maintained.

Data integration: The act of combining data from different sources into a single, unified view within a system. It is closely related to data migration, in which data is moved from one system to another (typically when replacing systems) and the final step is often data integration.
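
A minimal sketch of the idea, assuming pandas and two invented source tables that share a customer key:

    import pandas as pd

    # Two hypothetical sources holding different facts about the same customers.
    crm = pd.DataFrame({"customer_id": [1, 2], "name": ["Ann", "Bo"]})
    billing = pd.DataFrame({"customer_id": [1, 2], "balance": [100.0, 42.5]})

    # Combine on the shared key to produce a single unified view.
    unified = crm.merge(billing, on="customer_id", how="left")
    print(unified)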

Data lake: A centralised repository in which structured and unstructured data sets can be stored and accessed. This usually stores raw data in its native form, which differentiates it from other approaches (for example data warehouse), with the user manipulating data in the repository as required for the specific purpose for which it is needed. Due to the complexity involved, principal users tend to be data scientists and analysts who are comfortable dealing with large data sets held in an unstructured format, with the knowledge of how to integrate these for their immediate need.

Data maturity: An assessment of an organisation’s sophistication in managing and exploiting the data it holds, and the extent to which data is used to underpin decision making across the organisation. This provides a clear methodology and measurement to track the progress an organisation is making in becoming more data mature.

Data modelling: A visual representation to illustrate connections between data within a system or across systems to enable coherence in understanding the data landscape. There are numerous types of data models which can be developed, each having its own purpose, with the logical data model determining how the data should be curated, and the physical data model representing the actual way data is held within a system.
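
By way of a hedged sketch, the fragment below contrasts a logical model (the entity and attributes to be curated) with one possible physical model (a concrete table definition); the Customer entity is invented for illustration.

    import sqlite3
    from dataclasses import dataclass

    @dataclass
    class Customer:          # logical model: what data should be curated
        customer_id: int
        name: str
        email: str

    # Physical model: how one particular system actually holds the data.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE customer (
                        customer_id INTEGER PRIMARY KEY,
                        name        TEXT NOT NULL,
                        email       TEXT UNIQUE)""")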

Data quality: The multiple dimensions against which data can be assessed. Depending on what is relevant to the organisation, and the data attribute being assessed, these dimensions typically include: completeness, validity, consistency, uniqueness, integrity, auditability, accuracy, precision, timeliness, relevance and reliability. Each of these carries its own definition of the element of quality it seeks to determine. Data quality can be difficult to specify, as it is often a question of how much you are prepared to invest in assuring quality, given the propensity of data (in many cases) to date and degrade if not maintained.
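
As an illustration of a few of these dimensions in practice, the sketch below scores a single hypothetical column for completeness, uniqueness and (crudely) validity; any thresholds applied would be the organisation’s own.

    import pandas as pd

    df = pd.DataFrame({"email": ["a@x.com", None, "b@x.com", "b@x.com"]})

    completeness = df["email"].notna().mean()          # share of non-null values
    uniqueness = df["email"].nunique() / len(df)       # share of distinct values
    validity = df["email"].str.contains("@", na=False).mean()  # crude format check

    print(completeness, uniqueness, validity)          # 0.75 0.5 0.75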

Data retention: Often referred to as records retention, the policies and processes for determining the period over which data is kept within the organisation. There is usually an organisational imperative that determines the retention period, but there must also be cognisance of compliance obligations. It covers not only data in active use, but that which is archived or ‘soft’ deleted on systems.

Data science: An extension of analytics that embraces the exponential growth in computing power and intelligence to provide additional capabilities, taking analytics into a programming environment far richer than that of a traditional analytics approach.

Data security: Specific controls, standard policies and procedures to protect data from a range of issues, including unauthorised access, accidental loss and destruction. The core elements are known as the CIA triad – confidentiality, integrity and availability.

Data standards: Documented agreements on representation, format, definition, structuring, tagging, transmission, manipulation, use and management of data.

Data transformation: The conversion of data from one format or structure to another. It is often part of a wider process known as extract, transform and load (ETL), which is commonly used in data warehouse design, where data is recomposed into a format that is consistent across all incoming data sets being integrated into the warehouse.
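
A minimal ETL-style sketch, with an invented source file and date format: the transform step recomposes each row into a consistent target format.

    import csv, io

    source = "date,amount\n2021/01/05,10\n2021/01/06,20\n"     # extract

    rows = []
    for row in csv.DictReader(io.StringIO(source)):
        rows.append({
            "date": row["date"].replace("/", "-"),             # transform dates
            "amount": int(row["amount"]),                      # and types
        })

    print(rows)                                                # load (stand-in)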

Data virtualisation: The aggregation of real-time data, whether structured or unstructured, in-memory as an abstraction layer rather than a physical transfer of data. Unlike data warehousing, which relies on extract-transform-load (ETL) processes, data virtualisation accesses data at source rather than processing it as an extract routine.

Data visualisation: The delivery of data, typically as information or other refined forms, in a visual and appealing way (such as charts, graphs or maps) to give greater insight or understanding to the recipient.
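
For example, a minimal chart with matplotlib (the monthly figures are invented):

    import matplotlib.pyplot as plt

    months = ["Jan", "Feb", "Mar", "Apr"]
    sales = [120, 135, 128, 150]

    plt.bar(months, sales)            # the visual form conveys the trend
    plt.title("Sales by month")       # faster than the underlying table
    plt.ylabel("Units sold")
    plt.show()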

Data warehouse: Aggregated structured data from multiple sources integrated into a central repository to enable interrogation for reporting and analytics. Typically, this will hold historical data to provide trends and to reduce the burden of running queries on data held in operational/transactional systems.

Descriptive analytics: The process of providing information to illustrate what happened or is happening. It is also commonly referred to as reporting, or management information, as it provides a reliable way of processing a breadth of data over a time series to report trends and performance.
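
As a small sketch, reporting what happened over a time series with pandas (the transactions are hypothetical):

    import pandas as pd

    sales = pd.DataFrame({
        "month": ["2021-01", "2021-01", "2021-02", "2021-02"],
        "amount": [100, 150, 120, 180],
    })

    # Report the trend: total and average amount per month.
    print(sales.groupby("month")["amount"].agg(["sum", "mean"]))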

Diagnostic analytics: An assessment of why something happened; as such, it builds on descriptive analytics. It seeks to establish the root cause of an event, using techniques such as probability, likelihood and the distribution of outcomes to explore the data.

Evaluation: A systematic assessment of the design, implementation and outcomes of an intervention. It involves understanding how an intervention is being, or has been, implemented and what effects it has, for whom and why. It identifies what can be improved and estimates its overall impacts and cost-effectiveness.

Knowledge management: An integrated approach to identifying, capturing, evaluating, retrieving and sharing information, from wherever it is sourced, and to using knowledge effectively across an organisation, so as to provide an information environment that can be accessed and exploited.

Machine learning: A branch of artificial intelligence focused on building applications that learn from data and improve their accuracy over time without being explicitly programmed to do so.
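
A minimal sketch with scikit-learn, assuming a toy data set that roughly follows y = 2x: the model learns the relationship from the data rather than from explicit rules.

    from sklearn.linear_model import LinearRegression

    X = [[1], [2], [3], [4]]            # input feature
    y = [2.1, 3.9, 6.2, 7.8]            # observed outcomes (roughly y = 2x)

    model = LinearRegression().fit(X, y)   # learn from the data
    print(model.predict([[5]]))            # approximately 10 for unseen input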

Master data management: The creation of a single master record for all critical business data, across internal and external data sources and applications. This information becomes a consistent, reliable source for an organisation, and has primacy over other data.

Metadata: A summarised definition of a data attribute. There are six types of metadata: descriptive metadata, which provides basic core information; structural metadata, relating to the nature of the data being described; preservation metadata, which enables the data to be managed appropriately; provenance metadata, which records the data’s origins; use metadata, capturing how the data is used; and administrative metadata, which records any rules, restrictions or constraints on how the data can be utilised.
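
For illustration only, the six types might be recorded for a hypothetical data set along these lines:

    metadata = {
        "descriptive":    {"title": "Customer records", "subject": "CRM"},
        "structural":     {"format": "CSV", "fields": ["id", "name", "email"]},
        "preservation":   {"retention_period": "7 years"},
        "provenance":     {"source": "CRM export", "created": "2021-06-01"},
        "use":            {"last_accessed": "2021-06-15", "access_count": 42},
        "administrative": {"restrictions": "internal use only"},
    }
    print(metadata["provenance"])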

Milestone: A point in a project (or programme) where specific outputs have been achieved on the project timeline. It may mark a shift in the project from one phase to another, or simply some significant achievement within the project that is clearly evident.

Open data: Data that is available to everyone to access, use and share. It must be licensed in a way that permits anyone to use it as they wish, including transforming, combining and sharing it with others, even commercially.

Performance management: The methodologies, metrics, processes and systems used to monitor and manage the business performance of an enterprise. It can also be used to describe the performance of an employee within the organisation via a process of review with a manager.

Portfolio management: The selection, prioritisation and control of an organisation’s programmes and projects, in line with its strategic objectives and capacity to deliver.

Predictive analytics: The use of advanced analytic techniques that leverage historical data to uncover real-time insights and to predict future events. The goal is to go beyond knowing what has happened to providing a best assessment of what will happen, and some sense of when.
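
One very simple sketch of the idea, fitting a linear trend to invented quarterly figures and projecting it forward:

    import numpy as np

    quarters = np.array([1, 2, 3, 4])               # historical periods
    revenue = np.array([10.0, 12.1, 13.9, 16.2])

    slope, intercept = np.polyfit(quarters, revenue, 1)  # fit the trend
    print(slope * 5 + intercept)                         # best assessment of Q5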

Prescriptive analytics: Finding the best course of action in a scenario, given the available data, to influence how an organisation should be proactive in its actions. It is related to both descriptive analytics and predictive analytics, but emphasises actionable insights. Prescriptive analytics is the final step of the analytics continuum.
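
Reduced to its essence, prescriptive analytics selects the action with the best expected outcome from the available data; the actions and payoffs below are invented.

    # Expected payoff of each candidate action, estimated from prior data.
    actions = {"offer_discount": 120, "run_advert": 150, "do_nothing": 100}

    best = max(actions, key=actions.get)   # recommend the best course of action
    print(best)                            # run_advert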

Programme management: A delivery mechanism for achieving change through the combination and coordination of a series of projects, or activities, that collectively contribute to the totality of the change and enable delivery of the organisation’s strategic objectives and direction. Programme management is designed to guide the organisation through this dynamic environment, refining and refocusing as necessary along the way. Programmes are concerned with delivering outcomes, whereas projects are focused on outputs.

Project management: The coordinated means to structure a set of agreed activities with a definite start, middle and end to achieve the overall objective using a coherent and disciplined approach. Project management provides structure and control of the project environment so that the agreed activities will produce the right products or services to meet the customer’s expectations.

Red team: The practice of rigorously challenging plans, policies, systems and assumptions by adopting an adversarial approach. A red team may be a contracted external party or an internal group that uses strategies to encourage an outsider perspective.

Requirements: The elicitation, analysis, specification and validation of requirements and constraints to a level that enables effective development and operations of new or changed software, systems, processes, products and services.

Risk management: The art and science of identifying, analysing and responding to risk factors throughout the life of a project and in the best interests of its objectives.

Route map: A high-level, easy-to-understand overview of the important elements of a programme (or project) plan. It provides a quick snapshot of the aims, important milestones, key deliverables, dependencies and possible risks.

Scrum: A process framework that is one of a number of methodologies to utilise an Agile approach, which involves breaking down activity into stages known as sprints. These are time-boxed periods in which the scope is agreed and optimised for the sprint window. A daily review is conducted to track progress and work through the deliverables collaboratively as a team, including a Scrum master and the product owner.

Stakeholder: Individuals and organisations who are actively involved in, or whose interests may be positively or negatively affected as a result of, the activity being undertaken.

Unstructured data: Data that is not in tabular or delimited format. Examples include natural language documents, email, speech, images and video. It is information that has not been specifically encoded for machines to process but rather authored by humans for humans to understand.

Use cases: The description of how an individual will utilise a system or process to accomplish a goal or a business objective. Use cases are common practice in software development, and used extensively as part of Agile, but have also been adopted more widely as a means of assisting with process definition in organisations. A use case consists of an actor (the user), a system (or process) and a goal (which represents the desired outcome), as sketched below.
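
A hedged sketch of the three components as a simple structure (the names are invented):

    from dataclasses import dataclass

    @dataclass
    class UseCase:
        actor: str    # the user
        system: str   # the system or process
        goal: str     # the desired outcome

    uc = UseCase(actor="Account manager",
                 system="CRM reporting module",
                 goal="Produce a list of accounts due for renewal")
    print(uc.goal)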

User stories: A way of capturing, in the words of those specifying the need, what the outcome should achieve, feel like or deliver. They are relatively detailed, and are then used in Agile’s sprint and product backlog process to determine what will be delivered to the customer at the end of a sprint. They tend to be structured in a broadly common format, as illustrated below.
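
A minimal sketch of that common format, with an invented role, capability and benefit:

    template = "As a {role}, I want {capability}, so that {benefit}."

    story = template.format(
        role="sales manager",
        capability="a monthly revenue dashboard",
        benefit="I can track performance against target",
    )
    print(story)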

Waymarker: One of a series of signs used to mark out a route. In a strategy implementation context, the use of waymarkers is the logical approach to designate direction and an overarching indication of likely progress, rather than milestones, which provide more detailed definition.

 

¹ J. Gantz and D. Reinsel, Extracting Value from Chaos. IDC Digital Universe study, 2011.
