Chapter 3. MDM Components and the Maturity Model

3.1. Introduction

One common misconception at the highest levels of an organization is that any good idea comes ready to roll right out of the box. But make no mistake about it—there is no silver bullet for any enterprise information initiative, let alone master data management. Information professionals recognize that master information consolidation is the right thing to do, but that does not necessarily imply that there are always going to be acute business requirements that support a drastic upheaval to an information management program.
The migration to an organization that relies exclusively on master data management does not take place overnight; rather it evolves through a number of transitional information management stages. Recognizing that the process involves more than purchasing a software package or engaging outside solution vendors is the first step in achieving the MDM evolution. But it is more than that—it means understanding the essential capabilities necessary to successfully deploy MDM and the maturity of those capabilities necessary to make MDM actionable.
No functionality list completely captures the inventory of services that a specific business requires from its master data asset. However, it is worthwhile to explore a high-level enumeration of core MDM capabilities. In this chapter we provide a conceptual outline of technical MDM components. Next, we explore levels of maturity based on the ability to provide MDM services. Presenting the MDM component layers in terms of their maturity enables enterprise architects to target a desired level of MDM maturity and develop a design and implementation road map that articulates the steps to take when assembling an MDM program.

3.2. MDM Basics

The proliferation of enterprise-level application expectations for shared, synchronized information drives the need for developing a single view of the key data entities in common use across the organization. At the technical level, the drivers and fundamentals of master data management can be summarized as processes for consolidating variant versions of instances of core data objects distributed across the organization into a unique representation. In turn, that unique representation is continually synchronized across the enterprise application architecture to allow master data to be available as a shared resource. The result is a master data asset of uniquely identified key data entity instances integrated through a service layer with the applications across the organization.
However, the devil is in the details. To accomplish what may seem to be a relatively straightforward set of ideas, the organization must be prepared for the technical, operational, and management challenges that will appear along the way. In fact, the deployment of an MDM solution could evolve through a number of iterations, introducing data object analysis and model consolidation for analytical purposes as an initial step, then following on with increasing levels of integration, service, and synchronization.
The end-state master data management environment presents an enterprise resource integrated with the enterprise application architecture through a collection of provided services. At the least, a mature MDM solution will encompass the capabilities and services displayed in Figure 3.1.
▪Figure 3.1 MDM component and service model.
Although these layers represent a model of the essential technical, operational, and management components needed to develop an MDM capability, organizations can launch the program even with these components at various levels of maturity. The parts of this component model can be grouped into conceptual architectural levels, beginning with the architecture, followed by governance, management, identity management, integration services, and finally business process management. Examining the levels of maturity of these components and their relationship to the business requirements will guide the MDM program manager in developing an implementation road map. Although the architectural levels are presented from the bottom up, the maturity model will provide insight into how selected pieces of the component model can begin to add value to the organization as the implementation grows. In this chapter we review the fundamentals of each layer in this component stack and then provide some guidance for evaluating levels of maturity.

3.2.1. Architecture

Fundamentally there are three aspects to the master data management architecture, corresponding to the structure, power, and control of the environment. The structure is represented by the MDM master data models, the power reflects the MDM system architecture, and the control is encompassed by the MDM service layer.

3.2.2. Master Data Model

To accommodate any conceptual master data asset, all of the data elements in the various formats and structures that exist across the enterprise need to be presented as a centralized resource that can both accommodate the differences from the existing data sources and feed the master representations back into those different representations. This implies that there must be a consolidated model for representing master data as well as models for the extraction and exchange of data as it is integrated into the master data asset.
Source metadata details can be easily captured and managed in a metadata registry, and this information can be used to develop a representative master object model for every master data object type. The representative master object model for each master data type should be resilient to the differences between existing replicated data instance models, and this suggests creating a model to support all of the data in all of the application models. In other words, the set of data attributes of the consolidated model must be a superset of all of the important attributes from each of the application models, and the format and structure of each attribute must support all the formats and structures used for that attribute across the many variant models. This defines the fundamental challenge of the master data model: supporting variant structures and formats for both accumulating and publishing master data objects.
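To make the superset idea concrete, the following minimal sketch (in Python; the class, field, and mapping names are invented for illustration and are not part of any particular MDM product) shows a consolidated customer model whose attribute set is the union of two hypothetical application models, along with a mapping that records how each source attribute feeds the master attribute and a lineage field that remembers each source's local key.

    # Illustrative consolidated master model (all names hypothetical).
    # The master attribute set is the union (superset) of the source models,
    # and each master attribute is wide enough to hold every source format.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class MasterCustomer:
        master_id: str
        full_name: Optional[str] = None   # billing's cust_nm / CRM's full_name
        email: Optional[str] = None       # only the CRM source supplies this
        phone: Optional[str] = None       # stored unformatted, the widest format
        source_keys: dict = field(default_factory=dict)  # lineage: source -> local key

    # Attribute-level mappings from each application model into the master model
    SOURCE_TO_MASTER = {
        "billing": {"cust_nm": "full_name", "phone10": "phone"},
        "crm": {"full_name": "full_name", "email": "email", "fmt_phone": "phone"},
    }

    alice = MasterCustomer(master_id="M-1", full_name="Alice Jones",
                           source_keys={"billing": "B-100", "crm": "C-200"})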

3.2.3. MDM System Architecture

All data objects are subject to a “data life cycle,” and different system requirements are associated with and affected by each stage of that data life cycle. The MDM system architecture focuses on the stages of that life cycle and incorporates component methods that support each stage generically, exposing them through higher levels of a service layer supporting applications across the enterprise. The MDM system architecture relies on a services-oriented framework, in which the functionality reflects the life cycle activities (create, access, update, retire) as they relate to the master object type.
The core functionality (e.g., create a new master record, access/update a master record) is presented as low-level component services that can be adapted or enhanced for specific master data types (“customer” or “product”) or specific applications. For example, certain pieces of identifying information can be collected at different times and by different applications, but if the different applications are allowed to create a new instance, the creation service may be adapted for each application to acquire what is necessary to complete the business process.
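As a rough illustration of these adaptable life cycle services (a sketch only, with all names invented), the generic operations can be defined once and then specialized for a particular master data type such as customer:

    # Generic life cycle services (create, access, update, retire); a plain
    # dict stands in for the persistent master store.
    import uuid

    class MasterRecordService:
        def __init__(self):
            self._store = {}

        def create(self, attributes: dict) -> str:
            record_id = str(uuid.uuid4())
            self._store[record_id] = {**attributes, "retired": False}
            return record_id

        def access(self, record_id: str) -> dict:
            return self._store[record_id]

        def update(self, record_id: str, changes: dict) -> None:
            self._store[record_id].update(changes)

        def retire(self, record_id: str) -> None:
            self._store[record_id]["retired"] = True

    class CustomerService(MasterRecordService):
        # Adapted creation: this application's business process requires a
        # name before a customer instance may be created.
        def create(self, attributes: dict) -> str:
            if "full_name" not in attributes:
                raise ValueError("cannot create a customer without full_name")
            return super().create(attributes)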

3.2.4. MDM Service Layer Architecture

The MDM system architecture focuses on the core technical components to support the data life cycle. However, as the reliance of applications on the master data management environment increases, there are further requirements for data object services related to the level of service provided for application use, such as synchronization, serialization, embedded access control, integration, consolidation, and access. Business applications then are layered on top of the data object service layer by deploying or possibly reusing specific components associated with business processes.
These more comprehensive management activities for master data objects can be implemented at the system level. But because different types of applications may require different levels of service, it may be worthwhile to segregate those components with a role-based framework. For example, some applications that create new master records may have embedded timeliness requirements, such as a customer creation capability that must establish the customer record before allowing any purchase transactions. If a quick-create capability is needed within the sales organization but not necessarily within the fulfillment organization, then the quick-create can be established at the service layer along with the service level requirements (e.g., the maximum time allowed between master object creation and its availability for use).
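Continuing in the same sketchy vein, a hypothetical quick-create service for the sales organization might embed its service level expectation directly in the service layer; the 500-millisecond bound below is an invented figure used only to show where such a requirement would live:

    # Hypothetical role-based quick-create with an embedded service level.
    import time
    import uuid

    QUICK_CREATE_SLA_SECONDS = 0.5  # assumed: record usable within 500 ms

    def quick_create_customer(name: str, phone: str, store: dict) -> str:
        start = time.monotonic()
        record_id = str(uuid.uuid4())
        store[record_id] = {"full_name": name, "phone": phone}
        elapsed = time.monotonic() - start
        if elapsed > QUICK_CREATE_SLA_SECONDS:
            # A real deployment would alert operations rather than print.
            print(f"service level breach: creation took {elapsed:.3f}s")
        return record_id

    store = {}
    cid = quick_create_customer("Pat Lee", "555-0102", store)
    # The sales application may now reference cid in purchase transactions.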

3.3. Manifesting Information Oversight with Governance

Because MDM is an enterprise initiative, there must be some assurance of stakeholder adherence to the rules that govern participation and information sharing. As we will discuss in great detail in the next chapter, a data governance program applied across different business-level domains will address issues of data stewardship, ownership, privacy, security, data risks, compliance, data sensitivity, and metadata management. Each of these issues focuses on integrating technical data management with oversight, ensuring organizational observance of defined information policies. The four areas of concentration for data governance are the standardization of common use at the data element level, the consolidation of metadata into enterprise management systems, the management of data quality, and operational data stewardship.

3.3.1. Standardized Definitions

Whereas humans are typically adept at resolving ambiguity with words and phrases, application systems are considerably less so. People are able to overcome the barriers of missing information or potentially conflicting definitions, although at some point each individual's translation of a business term may differ slightly from other translations. This becomes an issue during integration and consolidation when data element instances that may share a name do not share a meaning, or differently named data elements are not recognized as representing the same concept. Processes for data analytics and for assessing organizational data element information and coalescing that information into business metadata provide standardized definitions that ultimately drive and control the determination of the catalog of master data objects and how they are resolved into the unique view.

3.3.2. Consolidated Metadata Management

A by-product of the process for identifying and clarifying data element names, definitions, and other relevant attribution is the discovery and documentation of enterprise-wide business metadata. Aside from collecting standard technical details regarding the numerous data elements that are potentially available, there is a need to determine business uses of each data element; which data element definitions refer to the same concept; the applications that refer to manifestations of that concept; how each data element and associated concepts are created, read, modified, or retired by different applications; the data quality characteristics; inspection and monitoring locations within the business process flow; and how all the uses are tied together.
Because the use of the data elements and their underlying concepts drives how the business application operates using master data, the enterprise metadata repository effectively becomes the control center driving and managing the business applications. Therefore, a critical component of an MDM environment is an enterprise business metadata management system to facilitate the desired level of control. At an even grander level, the metadata management framework supports the definition of the master data objects themselves: which data objects are managed within the MDM environment, which application data sources contribute to their consolidation and resolution, the frequency of and processes used for consolidation—everything necessary to understand the complete picture of the distributed use of master data objects across the enterprise.
It is worthwhile to note that advocating an enterprise-wide approach to metadata management does not necessarily mean purchasing an enterprise metadata management tool. Rather, the focus is on the procedures for sharing information, even if that is facilitated through less sophisticated means. The important part is reaching consensus on enterprise metadata.
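For instance, the shared resource can be as modest as an agreed structure for recording each standardized data element; the sketch below shows one assumed shape for such an entry and does not refer to any specific metadata product:

    # Illustrative registry entry for one standardized data element.
    from dataclasses import dataclass, field

    @dataclass
    class DataElementEntry:
        name: str                     # agreed enterprise name
        definition: str               # standardized business definition
        data_type: str
        used_by: list = field(default_factory=list)        # referencing applications
        quality_rules: list = field(default_factory=list)  # inspection/monitoring points

    customer_name = DataElementEntry(
        name="customer_full_name",
        definition="Legal full name of an individual customer.",
        data_type="string(100)",
        used_by=["billing", "crm", "fulfillment"],
        quality_rules=["not null", "no numeric characters"],
    )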

3.3.3. Data Quality

Data quality figures into MDM at two different levels. First, the concept of the unique representation for each real-world object requires a high level of trust in the data; otherwise there would be little incentive for business clients to participate. Second, data quality tools and techniques are employed in the integration and consolidation processes.
More to the point, instituting a data quality management program will ultimately result in a change to the organization, particularly in the way that management, and in turn individual staff members, relate to and assess the information value. Instead of considering data as only the raw input to the operational running of the business, individuals grow to understand how information becomes an asset to be used in many ways for improving the business. As business practices continue to rely on master data, they will become more reliant on high-quality data. The corresponding recognition that business performance and operational productivity at the organizational as well as at the personal level depend on high-quality data becomes a core competency of any MDM program.

3.3.4. Data Stewardship

As more lines of business integrate with core master data object repositories, there must be some assurance of adherence to the rules that govern participation. Because MDM success relies on data governance, an operational aspect to data governance is applied across different business domains, providing economies of scale for enterprise-wide deployment. The operational aspects of governance, typically formulated as data stewardship activities, supplement the ownership models and oversight mechanisms to ensure active management and information quality.

3.4. Operations Management

By definition, a master data environment provides a unified view of the data entities dealt with by the various business applications. There is a requirement for providing the components for maintaining the special characteristics of these master data objects through the data life cycle while supporting each application's corresponding ongoing needs. This includes the unique identification of each object, and the connectivity between the replicas, instances, and usage points of each object, to say nothing of maintaining the ways that different master data objects are possibly connected, such as householded customer names. Aside from the expected administration and configuration management components, the MDM stack must provide “specialty” management services, including identity management for unique key entities, hierarchy management to track association, lineage, and relationships, and migration management as part of the transition to the MDM platform.

3.4.1. Identity Management

Every instance of each master data object type must represent a unique real-world object, implying the constraint that there is one, and only one, uniquely identifiable record for any specific customer (or product, employee, etc.). This means that any time a process seeks a specific individual managed within the master data asset, enough identifying information must be provided to determine that either
• A record for that individual exists and that no more than one record for that individual exists, or
• No record exists and one can be created that can be uniquely distinguished from all others.
Identity management addresses these requirements by enabling and managing the determination of the identifying attributes necessary for unique identification, along with the search and match capabilities used to locate both exact and approximate matches, as well as by maintaining the master index based on the identifying attributes. This component focuses on maintaining the right model for unique identification, and its result is the input to the Identity Search and Resolution component at the next level up. In addition, the policies for distinguishing a unique entity across different data sources, whether applied automatically or requiring manual intervention, are management directives, whereas implementing those policies takes place at the identification layer.
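These two outcomes translate naturally into a "match or create" operation. The sketch below illustrates the one-and-only-one constraint with a deliberately naive exact-key match; the identifying attributes chosen are assumptions, and real deployments rely on the approximate matching described in Section 3.5:

    # Sketch of identity management's core guarantee: at most one master
    # record per real-world entity.
    import uuid

    master_index = {}  # identifying key -> master record id

    def identifying_key(attrs: dict) -> tuple:
        # Assumed identifying attributes; real models come from analysis.
        return (attrs["full_name"].lower(), attrs.get("birth_date"))

    def match_or_create(attrs: dict) -> str:
        key = identifying_key(attrs)
        if key in master_index:          # exactly one record already exists
            return master_index[key]
        record_id = str(uuid.uuid4())    # no record exists: create a unique one
        master_index[key] = record_id
        return record_id

    a = match_or_create({"full_name": "Jon Smith", "birth_date": "1970-01-01"})
    b = match_or_create({"full_name": "JON SMITH", "birth_date": "1970-01-01"})
    print(a == b)  # True: one uniquely identifiable record for this customer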

3.4.2. Hierarchy Management and Data Lineage

The first aspect of hierarchy management essentially focuses on the lineage and process of resolving multiple records into a single representation. Because there may be records representing the unique entity in different application systems, as part of the consolidation it will be necessary to document which application data sources contribute to the master consolidation and, in certain types of MDM architectures, to provide links back from the master index to the original source records in order to materialize master information on demand. This becomes especially important as a data control in case it is determined that there are false positives, in which identifying information for two distinct objects was incorrectly resolved into a single entry, or false negatives, in which more than one master record exists for the same unique entity. To some extent, from this point of view, hierarchy management is more concerned with data lineage as a way to mitigate the inevitable errors in the data integration processing streams.
The second aspect of hierarchy management for MDM revolves around the interconnectedness of master objects across multiple systems. For example, customers may be related to each other (e.g., same family, work for the same business), or different master data types may be related (e.g., the products associated with a specific supplier). These relationships are reflected in linkage hierarchies, and the hierarchy management layer also provides service components supporting the management of these connections.
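One plausible way to retain this lineage (a sketch under our own assumptions, not a prescribed structure) is a cross-reference registry linking each master record back to its contributing source records, which also makes it possible to split a false positive back into its constituents:

    # Illustrative cross-reference (lineage) registry: master id -> list of
    # (source system, source key) pairs consolidated into that master record.
    from collections import defaultdict

    xref = defaultdict(list)

    def record_consolidation(master_id, source, source_key):
        xref[master_id].append((source, source_key))

    def split_false_positive(master_id, new_master_id, source, source_key):
        # Undo an incorrect merge by moving one contributor to a new master.
        xref[master_id].remove((source, source_key))
        xref[new_master_id].append((source, source_key))

    record_consolidation("M1", "crm", "C-1001")
    record_consolidation("M1", "billing", "B-7004")
    split_false_positive("M1", "M2", "billing", "B-7004")  # B-7004 was someone else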

3.4.3. Migration Management

The transition toward application integration with an MDM system is interesting to contrast with general approaches to application modernization. When either incrementally or drastically modernizing a stand-alone application, the migration plan typically will have both the version to be retired and the modernized version running simultaneously for some time period to ensure that the new version properly addresses the business requirements. But for an MDM program, one objective may be to replace the application's underlying data interactions, which would complicate the ability to have different versions operating simultaneously. Therefore, a necessary operational component is the ability to manage application migration and transition to using master data services.

3.4.4. Administration/Configuration

Lastly, because the framework supporting MDM may involve different architectures and frameworks, a master index of entity representations, mappings from the master index to persistent storage, and multiple application interfaces and service invocations to access and use master data, the MDM technical team will need tools and processes to configure and provide ongoing administration of various aspects of the underlying MDM framework.

3.5. Identification and Consolidation

The wide spectrum of applications that deal with each type of master data object will eventually need to be integrated to employ the virtual master resource. That requires three capabilities: the ability to search and match records for identity resolution, links that connect records within their appropriate hierarchies, and the merging and consolidation of multiple records, with survivorship rules applied to the attributes to formulate a single “best” version of each entity.

3.5.1. Identity Search and Resolution

Identity resolution refers to the ability to determine that two or more data representations can be resolved into one representation of a unique object. This is not limited to people's names or addresses, because even though the bulk of data (and consequently, the challenge) is person or business names or addresses, there is a growing need for the resolution of records associated with other kinds of data, such as product names, product codes, object descriptions, reference data, and so on.
For a given data population, identity resolution can be viewed as a two-stage process. The first stage is one of discovery and combines data profiling activities with a manual review of data. Typically, simple probabilistic models can be evolved that then feed into the second stage, which is one of similarity scoring and matching for the purpose of record linkage.
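As a toy example of the second stage, a similarity score might combine weighted per-attribute comparisons. The weights below are invented, and the comparator is the Python standard library's SequenceMatcher; production systems use tuned, attribute-specific comparators (e.g., name- or address-aware matching functions):

    # Toy similarity scorer for the matching stage of identity resolution.
    from difflib import SequenceMatcher

    WEIGHTS = {"name": 0.5, "address": 0.3, "phone": 0.2}  # assumed weights

    def attribute_similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def record_similarity(rec1: dict, rec2: dict) -> float:
        return sum(
            weight * attribute_similarity(rec1.get(attr, ""), rec2.get(attr, ""))
            for attr, weight in WEIGHTS.items()
        )

    r1 = {"name": "Jon Smith", "address": "12 Oak St", "phone": "555-0101"}
    r2 = {"name": "John Smith", "address": "12 Oak Street", "phone": "555-0101"}
    print(record_similarity(r1, r2))  # high score: likely the same person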

3.5.2. Record Linkage

Subsequent to developing the similarity scoring processes and models as part of identity resolution, the algorithms are applied to a larger population of records, taken from the different sources, to link and presumably to automatically establish (within predefined bounds) that some set of records refers to the same entity. Usually, there are some bounds to what can be deemed an automatic match, and these bounds are not just dependent on the quantification of similarity but must be defined based on the application. For example, there is a big difference between business applications that determine whether the same person is being mailed two catalogs instead of one as opposed to applications that determine whether the individual boarding the plane is on the terrorist list. The record linkage component services both the identity management capability and the processes for merging and consolidation.
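These application-defined bounds are often expressed as two thresholds: an automatic-match bound and a lower review bound, with the region between them routed to a data steward. The values below are illustrative assumptions only; a watch-list application would set them far more conservatively than a catalog-mailing application:

    # Sketch of threshold-based linkage classification (thresholds assumed).
    AUTO_MATCH = 0.90   # at or above: link automatically
    REVIEW = 0.70       # between the bounds: route to a data steward

    def classify_pair(score: float) -> str:
        if score >= AUTO_MATCH:
            return "match"
        if score >= REVIEW:
            return "manual review"
        return "non-match"

    for score in (0.95, 0.80, 0.40):
        print(score, "->", classify_pair(score))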

3.5.3. Merging and Consolidation

Enterprise data sets are reviewed using identity resolution to distinguish records representing unique entities and then are loaded into a canonical representation. Record linkage is applied to seek out similar representations, paving the way for the merging and consolidation process. Linked records are then subjected to survivorship algorithms that qualify the values within each data attribute and select the best values for the consolidated record.
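As one example of such an algorithm, a simple survivorship pass might take each attribute from the most recently updated non-empty value among the linked records. This is only one possible rule, assumed here for illustration; real programs define survivorship per attribute, often weighted by source trustworthiness:

    # Toy survivorship: build a consolidated record from linked source
    # records, taking each attribute from the newest non-empty value.
    def consolidate(records: list) -> dict:
        best = {}
        # Visit records oldest to newest so newer values overwrite older ones.
        for rec in sorted(records, key=lambda r: r["updated"]):
            for attr, value in rec["attrs"].items():
                if value:  # empty values never survive
                    best[attr] = value
        return best

    linked = [
        {"updated": "2007-03-01", "attrs": {"name": "J. Smith", "email": "js@example.com"}},
        {"updated": "2008-01-15", "attrs": {"name": "John Smith", "email": ""}},
    ]
    print(consolidate(linked))  # {'name': 'John Smith', 'email': 'js@example.com'}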

3.6. Integration

The objectives of MDM are not achieved through data integration alone. Value is added when the consolidated master entity representation is integrated back into operational and analytical use by the participating applications to truly provide a single, synchronized view of the customer, product, or other master data entity.
The abstraction of the data integration layer as it relates to the development of business applications exposes two ways that master data are integrated into a services-based framework. Tactically, a services layer must be introduced to facilitate the transition of applications to use the master asset. Strategically, the abstraction of the core master entities at a data integration layer provides the foundation for establishing a hierarchical set of information services to support the rapid and efficient development of business applications. Fortunately, both of these imperatives are satisfied by a well-defined abstraction layer for services, and these concepts form the next layer of the component model.

3.6.1. Application Integration with Master Data

An MDM program that solely accumulates data into a consolidated repository without allowing for the use of that data is essentially worthless. One driving factor for establishing the unified view of enterprise master data objects is establishing a high-quality asset that can be shared across the enterprise. This means that information must easily be consolidated into the master view and must be easily accessible by enterprise applications. Production applications can be expected to migrate to access the master data asset as each application's data sets are consolidated within the master view. Therefore, part of the MDM framework must accommodate existing application infrastructures in ways that are minimally disruptive yet provide a standardized path for transitioning to the synchronized master.

3.6.2. MDM Component Service Layer

As MDM becomes more fully integrated to support business applications within the enterprise architecture, those applications can increasingly rely on the abstraction of the conceptual master data objects and their corresponding functionality to support newer business architecture designs. Standardizing master data object representations reduces the need for application architects to focus on traditional data-oriented issues (e.g., data access and manipulation, security and access control, or policy management); instead, they can use abstracted functionality to support business requirements by relying on the lower-level data-directed services whose format and design are dictated through an MDM services layer architecture. This ability to consolidate application functionality (e.g., “creating a new customer” or “listing a new product”) using a services layer that supplements multiple application approaches provides additional value across both existing and future applications by simplifying incremental development.
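For example, a business-level operation such as "creating a new customer" can hide the lower-level identification and synchronization services behind a single call. The composition below is a sketch with invented function names, intended only to show the layering:

    # Sketch: a business-level service composed from lower-level MDM services.
    def create_new_customer(attrs: dict,
                            resolve_identity,  # lower level: match or create
                            synchronize):      # lower level: push to subscribers
        master_id = resolve_identity(attrs)    # enforces uniqueness
        synchronize(master_id, attrs)          # keeps applications in step
        return master_id

    # Usage with trivial stand-ins for the lower-level services:
    ids = {}
    mid = create_new_customer(
        {"full_name": "Ana Diaz"},
        resolve_identity=lambda a: ids.setdefault(a["full_name"], f"M{len(ids) + 1}"),
        synchronize=lambda m, a: None,
    )
    print(mid)  # M1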

3.7. Business Process Management

The highest level of abstraction, business process management, is the one at which the requirements for making application design decisions are exposed. All too often, application designs are technology driven, with implementation decisions made based on technical recommendations rather than business needs. A key (and, perhaps, ironic) goal in MDM system design is to ensure that the system is business driven. Although MDM is largely dependent on the proper disciplines guiding the organizational use of technology, it is widely recognized that deploying the technical components without linking their functionality to a corresponding business process model is essentially pointless. At this level in the component stack, the architects incorporate business process modeling with system architecture. What differentiates MDM from other technology-driven consolidation efforts is the desire to couple the technology more closely with the business processes it supports, and that is made possible through business process integration and the use of rules-based operational systems that rely on formally defined business rules.

3.7.1. Business Process Integration

All business applications should reflect the implementation of business process requirements specified, either explicitly or implicitly, as the way the business operations are performed. A business process model is a logical presentation that communicates the right details of a business process to the right people at the right time. It typically lists the processes involved, their inputs, aspects that control each process, the types of events or triggers that emerge as a result of each process, and the expected output of each process. The model's visual representation relies on the underlying metadata, such as activity purpose, timing attributes, operational triggers, process inputs, process duration, generated events, resources used, and the desired outputs.
As individual activities are linked, the model shows how the outputs of one activity coupled with triggered events from other activities control or influence the behavior of the enclosing application, as well as the collection of applications as a whole. In turn, these business process model descriptions are annotated with the references to the master data objects necessary to complete the procedure. This effectively integrates the business process with the MDM solution, exposing the strict and implicit data dependencies and validating the identification and selection of master data object classes.

3.7.2. Business Rules

Within any business process model, the logic employed for executing a particular operation combines the evaluation of the values of the shared data objects and the values expressed by defined controls. The values are examined to determine which actions to take, and that in turn will create new values and trigger new controls. There are two ways to look at a specific implementation. The first is explicit: embedding the logic within application program code to evaluate the data values and specifically executing the actions. The second, more abstract approach is to systematically use descriptive rules to examine variable values and trigger actions, all used to establish the consistency of overall system state.
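The contrast between the two approaches can be shown in a few lines. Below, the same invented policy (hold orders that exceed a credit limit) is first hard-coded and then expressed as a declarative rule evaluated by a generic engine:

    # Explicit approach: logic embedded directly in application code.
    def check_order_explicit(order):
        if order["amount"] > order["credit_limit"]:
            order["status"] = "held"

    # Declarative approach: rules are data, evaluated by a generic engine.
    RULES = [
        # (condition, action) pairs operating on the shared data object.
        (lambda o: o["amount"] > o["credit_limit"],
         lambda o: o.update(status="held")),
    ]

    def apply_rules(obj, rules):
        for condition, action in rules:
            if condition(obj):
                action(obj)  # may set new values that trigger other rules

    order = {"amount": 5000, "credit_limit": 3000, "status": "open"}
    apply_rules(order, RULES)
    print(order["status"])  # held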
The way that actors interact with the events, controls, and inputs associated with the business process model provides us with the details of the business logic that will ultimately be deployed as formal business rules. Reviewing the business process model enables the application designers to identify the key triggers for specific rules, as well as exposing the full set of conditions that need to be addressed during the business process. This review process leads to a more complete model of the system, and consequently, its corresponding master data dependencies.

3.7.3. MDM Business Component Layer

Underlying the definitions and requirements exposed through the business process modeling and integration component and the implementation of business rules through a rules-based system is the business component layer. It is at this layer that we begin to see the creation of more sophisticated reusable business services (as opposed to the functional services that address interaction with the master data). At the business component layer, we start to see reliance on more interesting master data objects. For example, in addition to referring to master customer records, we might also begin to integrate master customer profiles within predictive analytics embedded within operational applications. The migration toward the use of the master model will open up opportunities for creating analytics-oriented master data object types and combine their use with traditional operational applications.

3.8. MDM Maturity Model

Our objective in defining a maturity model is not to provide a benchmark against which all MDM implementations are measured. Rather, because many organizations have already designed, coded, and deployed various versions of the described capabilities, the level of maturity describes both how already deployed components and services can be exploited for the purposes of a master data management program and which missing capabilities should be acquired in order to advance to more sophisticated application reliance on master data.

3.8.1. Initial

The initial level of maturity (as detailed in Table 3.1) is characterized more by the absence of capabilities than the alternative. At the initial level, there are limited possibilities for exploiting master data, but there is some degree of recognition that there are replicated copies of certain data sets that are relevant to more than one application. At the initial level, some business and technical managers are prepared to explore ways to consolidate data sets for analytical purposes.
Table 3.1 The Initial Maturity Level

Architecture
• Limited enterprise consolidation of representative models
• No master data models
• Collections of data dictionaries in various forms

Governance
• Limited data cleansing by application/line of business, for specific purposes (e.g., address standardization)
• Absence of defined ownership or stewardship models
• Recognition of need for oversight

Management
• Identity management by application when needed (e.g., customers)
• Some application configuration, but not coordinated through centralized management

Identification
• Limited use of identity management by line of business
• “Tiger team” attempts at customer data consolidation as required by applications (e.g., software upgrades or transitioning of accounting applications)

Integration
• Replicated copies of reference data
• Limited data reuse
• No application services reuse

Business process management
• Limited or no business involvement except at highest level of requirements definition

3.8.2. Reactive

At the reactive level (detailed in Table 3.2), not only is there a recognition that the existence of replicated copies of data causes business impacts, but there are some attempts to resolve the issue. Invalid or unusable data are deemed an information technology problem. Data quality tools are purchased as a prelude to “fixing the data,” although the actual business needs may lie unanalyzed while a technical team acquires tools. Initial uses of the tools satisfy some line-of-business application needs, but lessons learned are not shared, leading to a duplication of effort.
Table 3.2 The Reactive Maturity Level

Architecture
• Application architectures are defined for each business application
• Attempts to collect data dictionaries into a single repository
• Initial exploration into low-level application services
• Review of options for information sharing (e.g., enterprise information integration or enterprise application integration)

Governance
• External applications used to manage metadata
• Introduction of data quality management for parsing, standardization, and consolidation

Management
• Resources are assigned to manage the use of introduced tool sets
• Training for enterprise roll-out of tools and technology makes capabilities available on a more widespread basis
• Centralized administration of metadata and master indexes

Identification
• Identity search and match used to reduce duplication
• Identity search and match used for rudimentary record linkage for householding purposes

Integration
• Initial exploration of consolidation of data for newly developed analytical (e.g., customer relationship management) applications
• Data warehouse used as a core repository for master data
• Limited or no integration back into contributing applications

Business process management
• Conceptual business process models are described
• Analytical application integration of consolidated data
• Initial use of business rules embedded within applications
Some attempts are made at consolidating metadata from across different applications, and tools are reviewed and purchased but still are managed as technical resources. Application needs for data sharing are attacked by vigorous and uncoordinated XML (eXtensible Markup Language) schemas and corresponding services, although there is a great need for fine-tuning the variant implementations.

3.8.3. Managed

Once analytical applications have been created that rely on some level of consolidation, individuals within the organization can establish a value proposition for continued use and growth of consolidated master repositories. Gaining senior management buy-in enables more comprehensive enterprise modeling activities, which are supplemented by the MDM program. Whereas at the reactive level the focus may have been on a single area such as customers, at the managed level (detailed in Table 3.3) the ability to use master data becomes a repeatable process and can be expanded to incorporate new applications as well as existing applications, as the consolidation and synchronization services are available as part of the migration package.
Table 3.3 The Managed Maturity Level

Architecture
• Defined core data model for persistence
• Fundamental architecture for shared master data framework
• Identified operational framework for low-level master data life cycle activities
• Defined services for integration with master data asset

Governance
• Data quality tools in place
• Policies and procedures for data quality management
• Data quality issues tracking
• Data standards processes in place
• Line-of-business data stewardship

Management
• Identity management centralized in master index
• Identity management utilized across numerous applications
• Identified hierarchies (households, relationships within a data class) used by analytical applications
• Advanced configuration and administration of application use of master data
• A migration plan is available for selected applications

Identification
• Identity search and match service available to all applications
• Record linkage integrated within the MDM service layer
• Rules for merging and consolidation standardized and managed under centralized control
• Merging and consolidation processes established and repeatable

Integration
• Processes for integration back into contributing applications
• Definition of component services available for application integration
• Services for synchronization between applications and master data services

Business process management
• Integration of business rules with master data operations
• Fundamental connectivity between business applications and core data objects
• Business process analysts participate in master data engineering requirements

3.8.4. Proactive

As organizations establish the core data models and service architectures characterized at the managed level, they become more adept at reducing individual business applications' dependence on their own copies of replicated data, and at the proactive level (detailed in Table 3.4) the applications are generally integrated through the service layer with the master data environment. Synchronization for application data interactions is embedded within the component service layer, as are identity resolution, hierarchy management, and identity management. The business is able to better establish relationships at the customer/supplier/vendor level, as full profiles based on aggregated and consolidated data are managed as a core enterprise resource. Data governance is in effect across the organization, with hierarchical organization down the management chain.
Table 3.4 The Proactive Maturity Level

Architecture
• Master models are established
• Capability to move from index framework to transaction-based MDM framework
• SOA in place for application architecture
• Centralized management of business metadata

Governance
• Enterprise data governance program in place
• Enterprise data standards and metadata management in place
• Proactive monitoring for data quality control feeds into governance program

Management
• Identity management fully integrated across the enterprise
• Unique identification of all master object instances
• Full-cycle hierarchy management supports both analytical and operational activities
• Hierarchy management enables roll-back of false positive consolidation errors

Identification
• Services for data life cycle embed identity search, match, and resolution
• All data life cycle operations structured on top of merging and consolidation services
• Consolidation occurs in background

Integration
• Synchronization completely embedded within life cycle services
• Component layer supports application integration at master object level
• SOA drives business application integration

Business process management
• Business logic is reused
• Business rules are integrated within a rules engine and made available at the business process level
• Business analysts integral to application development
• Personalized customer relationships
• Automated business processes

3.8.5. Strategic Performance

MDM, coupled with a services-oriented architecture, will (at the strategic performance level, as detailed in Table 3.5) ultimately enable rapid development of high-quality applications that support both the operational and analytical requirements of enterprise business applications. Business analysts work closely to enumerate expectations for outward-facing process implementations. Analytical results associated with business intelligence processes will be managed as master objects, enabling more effective and consistent predictive analytics to be embedded within customer-facing applications.
Table 3.5 The Strategic Performance Level

Architecture
• Complete transaction integration available to internal applications
• Published APIs enable straight-through processing involving master data

Governance
• Cross-organization data governance assures high-quality information sharing

Management
• Seamless identity management of all data objects synchronized to both internal and external representations
• Migration of legacy applications complete

Identification
• Identity resolution services exposed externally to the organization
• Business performance directly tied to master dimensions

Integration
• All application development is driven by business process models and their interaction with core master object models

Business process management
• Businesses completely drive application design and development
• Applications largely integrate business rule engines
• Data instance profiles (customer or vendor profiles) managed within master data asset
• MDM enables embedded predictive analytics

3.9. Developing an Implementation Road Map

It is relevant to note that when reviewing the capability/maturity model described in this chapter, your organization may already have a number of these capabilities in place. As an example, as part of many data warehousing projects, the process for consolidating data from multiple application data sets posed questions regarding the quality of the warehouse data. This introduced the need for data cleansing and data quality tools, along with the methods to correct warehouse data to meet analyst and reporting requirements. The availability of the tools in common practice within the organization for parsing, standardization, and cleansing demonstrates that with respect to governance, the organization has already begun to transition from the initial level to the reactive level.
On the one hand, one might expect that all organizations desire to execute at the strategic performance level. However, achieving the capabilities at this level requires a significant investment in time and resources—an investment for which interim value would be expected for delivery. Therefore, it is more reasonable to chart a road map through the different levels of maturity, detailing the business value expected as each level is attained.
This process provides three relevant benefits. First, having learned lessons in the past regarding the longevity, high cost, and limited deliverables of “big bang” projects, project managers instead may seek out the short-term tactical values achieved as the result of intermediate steps taken toward the eventual end state. This provides the business client with tangible benefits during the maturation sequence. Second, envisioning the end state and progressing there clarifies the design, development, and migration processes that will be necessary to evolve both the environment and the structural components to the point where the application and information architectures are able to rely on the master data asset. Third, the strategic view enables proper positioning, communication, and socialization of the organizational changes needed to ensure proper acceptance for the transition.
Fundamentally, the objectives of an implementation road map are to clarify the business goals for MDM, understand the set of capabilities that are necessary to achieve those business goals, determine what sets of capabilities already exist within the organization, assess the gaps, determine where the organization needs to be, and decide how to get there. More formally, we can define a conceptual process as shown in the sidebar.
Implementation Road Map
Evaluate the business goals. Part analysis, part organizational soul-searching, this task is to develop an organizational understanding of the benefits of MDM as they reflect the current and future business objectives. This may involve reviewing the lines of business and their related business processes, evaluating where the absence of a synchronized master view impedes or prevents business processes from completing at all or introduces critical inefficiencies, and identifying key business areas that would benefit from moving to an MDM environment.
Evaluate the business needs. This step is to prioritize the business goals and determine which are most critical (of the critical goals, determine which have dependencies on MDM as success criteria).
Assess current state. Here, staff members can use the maturity model to assess the current landscape of available tools, techniques, methods, and organizational readiness for MDM.
Assess the initial gap. By comparing the business needs against the evaluation of the current state, the analysts can determine the degree to which the current state is satisfactory to support business needs or alternatively determine where there are gaps that can be addressed by implementing additional MDM capabilities.
Envision the desired state. If the current capabilities are not sufficient to support the business needs, the analysts must determine the maturity level that would be satisfactory and set that as the target for the implementation.
Analyze the capability gap. The analysts determine the gaps between the current state and the desired level of maturity and identify which components and capabilities are necessary to fill the gaps.
Map the capabilities. To achieve interim benefits, the analysts seek ways to map the MDM capabilities to existing application needs to demonstrate tactical returns.
Plan the project. Having identified the capabilities, tools, and methods that need to be implemented, the “bundle” can be handed off to the project management team to assemble a project plan for the appropriate level of requirements analysis, design, development, and implementation.
In essence, we are using the maturity model as the yardstick against which the organization's MDM capabilities and readiness are measured. In turn, the maturity model is used to project an end state, which helps the business analysts to map the execution of the plan to the current application architecture and foresee the best way to reach the end state.

3.10. Summary

The transition to MDM is often viewed as a revolution, but it is more effectively developed as an evolution. We have looked at the different components necessary to implement a mature master data management program and investigated the levels of maturity through which organizations may grow. Although no functionality list completely captures the inventory of services that a specific business requires from a master data system, by exploring the core MDM capabilities and a conceptual outline of technical MDM components, we have provided a framework for determining where any organization's capabilities lie.
When faced with the opportunity to assemble a master data management program, one should evaluate the business requirements and then review how those requirements can be addressed at the different levels of the maturity model. The presentation of the MDM component layers in terms of their maturity enables enterprise architects to target a desired level of MDM maturity and develop a design and implementation road map that articulates the steps to take when assembling a program that effectively meets the line-of-business needs.