12 IMPROVING BA SERVICE QUALITY

INTRODUCTION

Delivering a high-quality BA Service for customers, and committing to continually strive for service improvement, should be high on the agenda for any BA leader. Quality must be considered in all aspects of the BA Service, from the delivery of the business analysis portfolio of services to the day-to-day management of the service. Figure 12.1 shows how different aspects of quality contribute to a high-quality service.

Figure 12.1 The journey towards service quality

This chapter discusses some key aspects of service quality in relation to business analysis. These are:

a quality-focused culture;

frameworks for enabling service improvement;

quality assurance of business analysis outputs.

QUALITY AND IMPROVEMENT CULTURE

The culture of an organisation is often simplified to mean ‘the way we do things around here’. Culture can vary hugely between different organisations, and even within a single organisation, but most organisations want to deliver to the best of their ability (a quality culture) and to enhance what they deliver over time (a continual improvement mindset).

Quality culture

The foundation of service improvement is a ‘quality culture’ amongst those delivering the service. The use of quality improvement methods and tools should become core to the delivery of the service. Quality needs to be core to every aspect of the BA Service, from recruitment through to standards, from production of analysis outputs to training and development, and must operate at both the individual and team level.

Research based on a study of over 60 multinational corporations, published in the Harvard Business Review (Srinivasan and Kurey, 2014), identified four key areas of action for organisations aiming to improve their quality culture:

Maintaining a leadership emphasis on quality: leaders walk the walk on quality.

Ensuring message credibility: communication about quality should match up with the activity and behaviour people observe.

Encouraging peer involvement: quality is everyone’s responsibility.

Increasing employee ownership and empowerment: people are able to identify issues, suggest improvements and see new ideas implemented.

As business analysis is one of the first areas where mistakes can affect the direction of a project or piece of work, it is critical that quality features as a topic in key business analysis processes such as planning business analysis work, delivering the BA Service, and conducting lessons-learned reviews and retrospectives.

A quality culture can be enabled by encouraging a ‘continual improvement’ mindset for the team, and a growth mindset for individual business analysts (see Chapter 5).

Continual improvement mindset

This approach to the BA Service means that all business analysts are encouraged and enabled to identify opportunities to improve the service and the quality of the work. This involves both large and small improvements, which should be subject to appropriate prioritisation and governance. This approach allows meaningful improvement of the BA Service in parallel with the delivery of projects and other commitments.

There are several areas to consider when aiming to achieve continual improvement:

acceptance that there is always room for improvement; this can require individuals and the organisation to accept criticism, which may be difficult;

appropriate attention to current ways of working, so that improvements are set in context, are assessed against a baseline and unintended consequences are minimised;

recognition that all business analysts have the responsibility to identify areas for improvement;

availability of mechanisms to raise, document/discuss and agree improvements;

presence of appropriate controls to prevent the implementation of conflicting or diverging improvements.

It is important to focus improvement effort on those business analysis activities and outputs that offer the most value to customers and have the potential to eliminate waste, redundancy and errors. The Lean Six Sigma management approach encourages organisations to continually improve by reducing waste (Lean) and detecting and removing errors (Six Sigma). Lean methodology defines eight common types of waste that can exist in a service or process, highlighted by the acronym DOWNTIME (Defects, Over-production, Waiting, Non-used talent, Transportation, Inventory, Motion, Extra-processing). Table 12.1 describes the types of waste and how each can be considered in the context of the BA Service.

Table 12.1 The eight types of waste

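The table itself is not reproduced here. As a purely illustrative sketch (the waste names follow the DOWNTIME acronym above; the example descriptions are assumptions, not the content of Table 12.1), a BA Service might record the wastes against its own context as a simple mapping:

```python
# Illustrative mapping of the eight Lean DOWNTIME wastes to hypothetical
# BA Service examples. The descriptions are assumptions for illustration,
# not the book's Table 12.1.
DOWNTIME_WASTES = {
    "Defects": "Requirements errors found late, forcing rework",
    "Over-production": "Producing analysis outputs nobody asked for",
    "Waiting": "Deliverables queued for review with no agreed time frame",
    "Non-used talent": "Experienced BAs confined to narrow, low-value tasks",
    "Transportation": "Unnecessary hand-offs of work between teams or tools",
    "Inventory": "Part-finished documents accumulating in draft",
    "Motion": "Hunting for information across scattered repositories",
    "Extra-processing": "Polishing outputs beyond what the customer values",
}

for waste, example in DOWNTIME_WASTES.items():
    print(f"{waste}: {example}")
```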

CONTINUAL SERVICE IMPROVEMENT (CSI)

CSI is an approach to identifying and implementing opportunities to make services better, and to assess and measure the impacts of these improvements over time. The core concept of continual service improvement stems from research and models of quality devised by W. E. Deming. CSI for business analysis means understanding the baseline level of BA Service maturity, identifying improvements and devising a plan to carry out the improvements that have the most impact for customers. Suitable models and approaches for assessing and improving the BA Service are described in this section.

There are several models that can help to evaluate the maturity of the BA Service, two of which are the BA Maturity Model and the Capability Maturity Model Integration (together with a version of the latter adapted for business analysis). We will start this section by examining these.

BA Maturity Model

The practice of business analysis and the role of the BA have developed over several decades; one of the earliest references to business analysis as a formal discipline was in 1986 (Jakob, 1986). Since then, it has been possible to track the development of business analysis from its original focus on system improvement to the wider and more holistic discipline defined in the BA Service Framework (see Chapter 2).

This trajectory is represented in the Business Analysis Maturity Model (BAMM), shown in Figure 12.2. The BAMM identifies three levels of business analysis work through consideration of two axes:

Scope: the extent to which the scope of the assignment or project has been defined. This ranges from a narrow, well-defined and thus highly constrained scope to a broader and therefore more ambiguous scope.

Authority: the extent to which a business analyst can challenge and recommend changes with regard to an assignment or project. This ranges from limited authority to extensive authority.

Figure 12.2 The Business Analysis Maturity Model (reproduced with permission of AssistKD)

The consideration of these axes results in three overlapping levels of business analysis work. The movement from one level to the next results from a progressive reduction in scope definition/constraint and an increase in the business analyst’s authority. These three levels are described in Table 12.2.

Table 12.2 Description of the BAMM levels

System improvement: clear scope that constrains the business analysis work; business analysts have limited authority.

Business analysis work at the system improvement level is concerned with the definition of requirements for an IT project where the scope is clearly defined. The business analyst may challenge the need for certain stated requirements and is responsible for evaluating the feasibility of the requirements and ensuring alignment with organisational objectives. However, their authority is restricted by the defined scope of the project and the responsibilities of their role.

Process improvement: scope is not entirely defined, particularly with regard to cross-functional business needs; business analysts have some authority.

Business analysis work at the process improvement level is concerned with a holistic view, as business analysts are not constrained to focus solely on the requirements for an IT project. Instead, they may take a view outside the project, possibly across an entire change programme, to ensure that all POPIT™ model elements have been considered. One of the key aspects of this work is the redesign of business processes, which requires business analysts to take a cross-functional view of the organisation and to consider the additional impacts relating to an IT project.

Business improvement: scope is deliberately ambiguous; business analysts have significant authority.

Business analysis work at the business improvement level is concerned with defining the scope of a business change programme or project; while the scope is likely to include an IT project, this is not necessarily the case. The ambiguity requires business analysts to deploy a range of relevant techniques to understand the situation and to identify the root causes of the problems to be addressed. The identification and evaluation of options to deal with the business situation is also a key part of business analysts’ work at this level. Essentially, business analysts at this level focus on achieving beneficial business outcomes and ensuring that any investment funds are spent wisely.

The level of maturity of a BA Service with respect to scope and authority is influenced by the factors described in Chapter 1, including its position within the organisation and the structures in place.

The Capability Maturity Model Integration (CMMI)

The CMMI,¹ an overview of which is shown in Figure 12.3, provides an alternative framework for evaluating maturity. Rather than considering the different levels based upon an assessment of scope definition and authority, the CMMI considers the approach to the work and, in particular, the extent of standardisation, measurement and management.

Figure 12.3 Overview of the Capability Maturity Model Integration

The CMMI can be applied to the BAMM and thereby provide further insights into the analysis of business analysis maturity. For example, where an organisation is in the early stages of development for its BA Community of Practice, the business analysts may be employed solely on requirements definition work but there may be well-defined standards for this work that are consistently applied. Therefore, the BAMM level 1 work (system improvement) within the BA Service would be at CMMI level 3. Within the same BA Service, it is possible that the BAMM level 2 work (process improvement) is not as well defined, so is at CMMI level 2, and the business analysts conducting BAMM level 3 work (business improvement) are in the initial stages of developing their processes and standards, so are at CMMI level 1.

Paul, Cadle and Yeates (2014) also defined a customised version of CMMI that helps evaluate the maturity of the business analysis practice. This is shown in Figure 12.4.

Figure 12.4 The CMMI adapted for a BA Service

The BA Service Assessment

The combination of the Business Analysis Maturity Model, the BA Service Framework and the Capability Maturity Model Integration provides a basis for assessing the BA Service. The overview process for such an assessment is shown in Figure 12.5.

Assessing the BA Service involves evaluating the level of maturity for each aspect of the BA Service. It is also useful to consider how the assessment criteria will be evidenced.

Allowing all members of the BA Service to contribute to the service assessment process permits a common understanding to be reached and variations in views and practices to be debated. Common reactions to a service assessment might include feeling:

Overwhelmed: the volume of improvement work identified by the assessment may seem unachievable alongside existing priorities and BA Service delivery.

Disheartened: a great deal of work and effort may have been invested in service improvement, but the service assessment appears to show little improvement and considerable future effort is still required.

Motivated: the assessment brings clarity and direction to the changes that could be made to improve the overall service.

It is useful to reflect on the feelings evoked by the outcome of the assessment and consider questions such as:

Are the target levels realistic?

What is driving the target levels?

Has the assessment been overly critical or overly optimistic?

Figure 12.5 Process for assessing the BA Service

The dimensions and CMMI levels to be considered when assessing a BA Service are shown in Figure 12.6. Each dimension of the BA Service is considered against the maturity levels, and the current and target levels are plotted for each dimension. The areas of the BA Service with the most significant gaps between the current and target levels may indicate where to prioritise improvement activity.

The results of the service assessment can also be represented as a grid, with service dimensions shown against CMMI levels and with both current level and target level indicated, as shown in Figure 12.7. Both these examples show that it may not be possible or even desirable to achieve a single CMMI level for the whole BA Service, and the identified target state will depend on the needs and priorities of the Service and its customers. The ‘gap’ between current and target states is addressed via activities in the Service Improvement Plan.

Figure 12.7 could be extended to include dates, such as the date of the current assessment and the planned dates by which target states should be achieved, and would then represent a high-level plan. The selection of the appropriate visual representation of the service assessment results will depend on the purpose and audience.
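
The gap analysis behind these figures can be expressed directly. The minimal sketch below ranks service dimensions by the gap between current and target CMMI levels; the dimension names and level values are illustrative assumptions, not assessment data from this chapter:

```python
# Hypothetical assessment: dimension -> (current CMMI level, target CMMI level).
# CMMI levels run from 1 (initial) to 5 (optimising).
assessment = {
    "Standards and templates": (3, 4),
    "Requirements definition": (2, 4),
    "Stakeholder engagement": (2, 3),
    "Training and development": (1, 3),
    "Performance measurement": (1, 2),
}

# Rank dimensions by gap; the largest gaps suggest where to
# prioritise improvement activity in the Service Improvement Plan.
ranked = sorted(
    assessment.items(),
    key=lambda item: item[1][1] - item[1][0],
    reverse=True,
)

for dimension, (current, target) in ranked:
    print(f"{dimension}: current {current}, target {target}, gap {target - current}")
```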

Figure 12.6 Example of BA Service Assessment Framework

Figure 12.7 Example of BA Service Assessment Grid

BA Service Improvement Plan

A Service Improvement Plan (SIP), also discussed in Appendix 12, shows the activities required to improve the quality and performance of the service and should focus on the areas of relative weakness highlighted when assessing the BA Service. The process of creating a SIP involves asking questions such as:

What are the real critical success factors (CSFs) for the service?

What are the highest priority improvement areas?

How do these align to organisation strategy?

Is there agreement on priority?

What are the options for improvement?

Which activities will have the most impact?

Who will benefit?

Who will do them?

When is this needed/When can this be achieved?

How will progress be measured? What should KPIs be?

Progress towards the SIP can be shown visually using a milestone chart or a Kanban approach (covered later in this chapter). The aim of the plan is to create a coherent set of improvements providing a clear road map for the ongoing development of the service. Without this, the service may lose sight of why specific improvements are required and continue to ‘tinker’ with processes and metrics with no specific drivers or direction. It is always useful to consider ‘Why are we planning to make this change/improvement and what will the service be able to achieve in future if this improvement is successful?’ If there is no clear answer to this question, it is important to revisit the vision and goals of the BA Service before allowing further ‘improvement’ activity to continue.

BA Service Road Map

Creating a Service Road Map is an excellent way to engage with stakeholders, including business analysts, by visually representing progress towards the service objectives. It particularly focuses on new features and capabilities of the service.

A road map should be:

aligned: it moves the service towards the vision and objectives. Every item on the road map should have a strategic justification;

clear: easily understood with an appropriate level of detail;

maintained: developed and updated regularly to reflect progress and changes in priority;

transparent: available to anyone who has interest in the BA Service.

The road map shows a number of key themes or areas that the service needs to focus on, and time periods that are sufficient to show progress; this will vary between organisations and could be weeks, months, quarters or years. The road map must also be informed by the business case for the BA Service (see Chapter 2), so that the costs, benefits and potential return on investment into the BA Service are understood and agreed.

Figure 12.8 Example of BA Service Road Map

In the example shown in Figure 12.8, the road map shows activities and milestones that work towards six service objectives (shown in Table 12.3). Seeing the entirety of the service development work in this way allows BA leaders to ask questions such as:

Does this order make sense and address priorities?

When will the road map be reviewed to ensure it reflects reality and current priorities?

Is there enough business analyst and management capacity to meet this?

What are the impacts on customers?

Are initiatives staggered appropriately?

What level of change is being introduced? Is this manageable?

Do the timings consider business peaks and troughs and holiday periods?

It can be discouraging to realise that an exciting improvement or initiative cannot start for many months, but it is more important for the BA Service to have a realistic road map and make progress towards it than to set a timeline that cannot be delivered.

Table 12.3 Example BA Service objectives mapped to road map targets

QUALITY MANAGEMENT

Quality management considers the standards, processes and activities required to achieve and maintain the desired quality levels for the BA Service. Establishing quality management includes defining quality objectives and targets, communicating quality expectations, and designing and deploying processes that ensure adherence to the quality management system and provide opportunities to identify service improvements. Quality frameworks and principles provide a common language for quality management. Understanding quality cycles allows BA leaders to acknowledge that quality cannot be ‘achieved and then forgotten’; quality management is an ongoing process of measuring, improving and learning.

The Deming Cycle

The Deming Cycle (see Figure 12.9) is a framework used for the continuous improvement of products and processes; it consists of four elements: Plan, Do, Study and Act (The Deming Institute, 1993; Sutherland and Canwell, 2004). The cycle starts with the Plan step, which involves identifying a goal or targets, formulating an approach and defining success metrics. This is followed by the Do step, where the components of the plan are implemented. Implementation activities are followed up with the Study step, where outcomes are monitored to assess the plan for signs of progress and success, or problems and areas for improvement. The Act step is the fourth stage in the cycle and brings together the learning generated by the entire process, which can be used to adjust the goal, change the approach, and inform future plans. These four steps are then repeated as part of the ongoing cycle of continual learning and improvement.
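
To make the shape of the cycle concrete, the minimal sketch below models a single pass around PDSA for a measurable quality goal; the goal, metric and threshold are illustrative assumptions rather than part of Deming’s model:

```python
# One illustrative pass around Plan-Do-Study-Act for a BA quality goal.

def plan():
    """Plan: identify a goal, an approach and a success metric."""
    return {"goal": "reduce review defects per deliverable", "target": 2.0}

def do(the_plan):
    """Do: implement the planned approach (e.g. self-review checklists).
    In practice this is weeks of delivery work; here we simply record
    the defect rate observed after the change."""
    return {"observed_defect_rate": 2.5}

def study(the_plan, results):
    """Study: monitor outcomes and compare them with the planned target."""
    return results["observed_defect_rate"] <= the_plan["target"]

def act(succeeded):
    """Act: standardise what worked, or adjust the approach and repeat."""
    return "standardise the change" if succeeded else "revise the approach"

p = plan()
outcome = act(study(p, do(p)))
print(outcome)  # 2.5 > 2.0, so the next cycle revises the approach
```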

Figure 12.9 The Deming PDSA Cycle

A related framework that is often used in conjunction with Six Sigma projects is known as Define, Measure, Analyse, Improve and Control (DMAIC). This cycle serves as a useful reminder that the driver may be to ‘improve’, but that this must be set in context and there are a number of steps to take before the improvement can be made.

BA Quality Management Cycle

The Deming Cycle shown in Figure 12.9 has been expanded and adapted in order to provide a specific process for managing the quality of the BA Service as a whole and the quality of work undertaken by individual business analysts. This extended BA Quality Management Cycle is shown in Figure 12.10 and its elements are described in Table 12.4.

Figure 12.10 The BA Quality Management Cycle

International Organization for Standardization (ISO) quality management principles

ISO² has developed a wide range of standards in relation to quality, including the ISO 9000 family.

Table 12.4 Stages in the BA Quality Management Cycle

ISO provides a set of seven quality principles that underpin the standards, and which are relevant to both the quality of business analysis provided to customers and the importance of the continued improvement of the BA Service. These are set out in Table 12.5.

Table 12.5 ISO quality management principles

QUALITY MANAGEMENT TECHNIQUES

Quality management should include both proactive and reactive ways of encouraging and ensuring that appropriate levels of quality are being achieved. Proactive approaches are applied before something is produced and include the provision of adequate training and development (see Chapter 4) and the use of standards and templates (see Chapter 6).

There are numerous reactive techniques that may be used to enable effective quality management and provide quality assurance after something has been produced. This section discusses some of the techniques most useful for business analysis.

The review process

A defined review process is helpful when assessing and improving the quality of business analysis outputs and deliverables. An effective review process will:

involve the right people;

be clear on the purpose and expectations of participants;

have defined stages, time frames and time commitments;

be clear on the quality standard to be achieved and the criteria for assessing the quality of the deliverables.

Without an agreed review process, it is likely that time will be wasted, effort will be duplicated and people may become frustrated.

The review triangle

The review triangle (Figure 12.11) provides a representation of the different levels and types of review that may be conducted. The aim of reviewing business analysis outputs is to ensure that each output:

achieves its agreed purpose;

is appropriate for the audience;

contains the correct content and is complete;

is unambiguous;

meets quality expectations.

There may be several people involved in conducting a quality review, but they do not all have the same review focus. The review triangle shown in Figure 12.11 reflects three levels of review and the differences in the breadth of the review conducted at each level.

1. Self-review is the first level of review, which typically identifies the most errors. This should be performed by the BA who created the output and should be conducted as a separate activity from the creation of the output. A review checklist (see also the Quality checklists section later in this chapter) for a specific type of deliverable can provide an invaluable mechanism in support of self-review.

2. Peer review is the second level of review and is carried out by another member of the BA Service or potentially by a member of the project team.

3. Stakeholder review is the third level of review. This level may require a number of separate iterations; for example, for project team or internal stakeholders, followed by external stakeholders. This level of review should yield the lowest number of errors. By this point, the only errors identified should concern scope and content (as opposed to technical) inaccuracies. Stakeholders should not be required to provide detailed comments; for example, regarding spelling, branding, notational inconsistency, grammatical error or use of language. (However, errors of this nature that are left in the documentation at this stage can undermine the stakeholders’ confidence in the quality of the business analysis work.)

Figure 12.11 The review triangle

The three levels of review are clarified further in Table 12.6.

Table 12.7 shows an example of typical comments that may emerge from different types of review when reviewing a process model.

Business analysts are often expected to operate at a fast pace and to tight timescales for both the production and review of analysis deliverables. It may therefore be necessary to share outputs with stakeholders before, or simultaneously with, other reviews. In this case it is important to:

explain to reviewers what they are seeing;

confirm the purpose of their review;

assure reviewers that any low-level errors (such as format, typos) will be addressed.

Table 12.6 The three levels of review

Some stakeholders may be disconcerted by superficial errors in outputs and this can undermine their ability to trust the content. Other stakeholders may not notice these small errors and may just concentrate on conducting the required level of review. Discussing and agreeing quality expectations and reminding people of the purpose of their review will help to maintain the relationship and ensure the best use of each reviewer’s time.

Table 12.7 Example set of review comments

Self review comments:

Sometimes ‘admin team’ used, sometimes ‘support team’ – is this the same thing? (find out and update)

Moved to branded template

Spelling errors corrected

Final flows updated to use same notation throughout

Added guard conditions to all decision flows

Updated to verb/noun naming convention throughout

Peer review comments:

Add another final flow to prevent crossing lines/make it easier to read

Include annotation to explain decision point logic?

Shows that A OR B can trigger a notification – is this correct? Should it be BOTH A AND B?

Is ‘the notification’ defined somewhere?

Inconsistent spelling of ‘analyze/analyse’

Stakeholder review comments:

Task appears in swimlane X, but sometimes team Y performs this task

Task Z can also trigger a notification to the customer

Is business process D covered by another diagram?

Quality checklists

The purpose of a checklist is to act as a reminder when creating different outputs, to allow business analysts to review their own work objectively, and to assist in peer review. Checklists remove personal preference and interpretation and move the BA Service towards consistent outputs, no matter who created them.

Checklists are generally a simple way of ensuring quality. However, because they are simple to design and implement, more experienced practitioners sometimes assume they do not require them. On the contrary, research across industries as diverse as aviation, manufacturing and medicine has shown that a checklist applied at the right time can support those with even the highest levels of training and experience (Gawande, 2011).

Table 12.8 shows an example checklist for several business analysis deliverables. Checklists should not be too detailed and do not replace the need for standards, templates and appropriate training. Checklists should be updated regularly to reflect recurring issues identified during peer and stakeholder reviews.

Table 12.8 Example checklists for reviewing business analysis deliverables

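As the checklists themselves are not reproduced here, the following minimal sketch shows how a checklist can be held as data and used to flag unconfirmed items during self- or peer review. The items are illustrative assumptions (echoing the review comments in Table 12.7), not the content of Table 12.8:

```python
# Hypothetical self-review checklist for a process model deliverable.
PROCESS_MODEL_CHECKLIST = [
    "Consistent verb/noun naming convention used throughout",
    "All decision flows carry guard conditions",
    "Swimlanes match the teams that actually perform the tasks",
    "Notation is consistent and follows the agreed standard",
    "Output uses the branded template",
]

def outstanding_items(checklist, confirmed):
    """Return the checklist items the reviewer has not yet confirmed."""
    return [item for item in checklist if item not in confirmed]

confirmed_so_far = {
    "Output uses the branded template",
    "Notation is consistent and follows the agreed standard",
}
for item in outstanding_items(PROCESS_MODEL_CHECKLIST, confirmed_so_far):
    print("TO CHECK:", item)
```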

Review mechanisms

There are several approaches that can be used by stakeholders to review business analysis deliverables. It is important to consider different factors in determining the right approach, such as the nature of the deliverable, the number and location of the stakeholders, the risk of non-review or limited review, and the timescales available.

Walk-through

The business analyst walks the reviewers through the output in a workshop, describing how the information was obtained, why it has been presented in this format and what it shows. This provides the opportunity for reviewers to ask questions or make comments. There is no expectation that the deliverable has been looked at in advance of the workshop.

This process may lead to updating the deliverable. A workshop can require a significant investment of time and may be used to acquire input from specific stakeholders. This approach is useful for gathering immediate feedback but is unlikely to deliver a ‘deep’ review.

Written review

Where it is not possible to gather stakeholders together (either face to face or using technology), or the priority of the review is relatively low, asking for written review comments is a suitable review approach. There are various technology-enabled approaches available to support this type of review including:

updates/tracked changes to a shared version of the review item;

updates/tracked changes to multiple versions of the review item;

comments collated separately to the review item, via email or a review record.

It is important to consider the balance between allowing reviewers to use their preferred method of review and the overhead and potential risk for the business analyst who is collating their responses. Before distributing a deliverable for review, the business analyst should consider:

What is the best approach? Is that approach likely to be adhered to?

How experienced are the reviewers in the review process?

Are the purpose of the review, the expectations of reviewers and the required time frames clearly articulated?

Is it necessary for reviewers to build on or respond to each other’s comments and, if so, how will they do this?

Do reviewers expect or require responses to comments?

When the volume or complexity of the comments received exceeds what was expected, this may point to the need for a review meeting. Equally, if very few comments are received, particularly when more were expected, this may suggest that the mechanism (or time frame) was unworkable for reviewers.

Review meeting

The deliverable to be reviewed is shared in advance of an arranged meeting, and there is an expectation that comments and queries will be submitted and collated before the meeting. The meeting is used to address significant comments and agree corresponding updates. This approach is particularly valuable if stakeholders have provided conflicting comments.

Review summary

Senior stakeholders who are asked to approve key deliverables ultimately need assurance that an appropriate level of quality assurance and review has taken place. The completion of a formal review process that has included peer and stakeholder reviews makes a more compelling case for approval than suggesting that they too need to conduct a detailed review. The creation of a review summary is helpful to senior stakeholders; this sets out who has reviewed the deliverable, when the review took place, the main comments that were made, any quality controls (such as checklists) that have been applied and the actions taken to address the comments. It is also helpful to highlight any questions that remain to be decided by the senior stakeholder or group, such as issues related to scope or business strategy.
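
A review summary lends itself to a simple structured record. The sketch below captures the elements described above; the field names and example values are illustrative assumptions, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class ReviewSummary:
    """Record of the review history presented to a senior approver."""
    deliverable: str
    reviewers: list         # who reviewed, and at which level
    review_dates: list      # when each review took place
    main_comments: list     # the significant comments raised
    quality_controls: list  # e.g. checklists applied
    actions_taken: list     # how the comments were addressed
    open_questions: list = field(default_factory=list)  # for the approver

summary = ReviewSummary(
    deliverable="Order-handling process model v1.2",
    reviewers=["Self (author)", "Peer (BA team)", "Stakeholder (operations)"],
    review_dates=["2024-03-01", "2024-03-05", "2024-03-12"],
    main_comments=["Ownership of tasks in swimlane X clarified"],
    quality_controls=["Process model checklist applied"],
    actions_taken=["Diagram updated; task ownership confirmed with operations"],
    open_questions=["Is business process D in scope for this phase?"],
)
```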

Kanban

The concept of Kanban originated from the Total Quality Management (TQM) approach developed at Toyota in Japan. It creates a visual representation that allows monitoring and limitation of work in progress, prevents build-up of excess inventory (see Table 12.1) and encourages a just-in-time (JIT) approach.

Figure 12.12 shows the three columns of the Kanban Board: ‘To do’, ‘In progress’ and ‘Done’. Items in the ‘To do’ column are derived from an agreed backlog of potential work items. However, not everything from the backlog is represented – only items for which there is now (or in the near future) a work commitment should be entered onto the Kanban Board.

Figure 12.12 Structure of a Kanban Board

The aims of Kanban are to ensure that the work:

is understood and agreed (transparent);

meets the agreed quality;

stays within the agreed in-progress limits;

flows through the columns.

Quality and acceptance criteria must be defined, against which work items are assessed when deciding whether to move them from one column to the next. This ensures that work only starts on a particular item when there is clarity on what needs to be achieved, and that work is only considered complete (‘Done’) when it meets the defined quality criteria.

Where work remains ‘In progress’ for an extended period, this may indicate that the work has not been broken down into sensible, achievable chunks, or that barriers may be impeding progress towards meeting the agreed quality. These issues should be investigated and addressed in order to support the achievement of the quality criteria. Having too many items ‘In progress’ has been shown to impact efficiency and productivity (Sjøberg, 2018), so applying a limit on the amount of work in progress (WIP) encourages focus and flow.

Having WIP limits reduces time and effort wasted due to:

context switching;

excess meetings;

working on lower priority deliverables;

miscommunication;

rework;

duplicate effort;

missed deadlines.

While business analysts may be most familiar with the use of the Kanban system to underpin product development, it can also be used as a physical or virtual workload management tool for the service improvement activities identified for the BA Service. An example Kanban Board for suggested BA Service improvement activities is shown in Figure 12.13.

Using this approach, business analysts can see what service improvement activities are coming up, and therefore what they may be able to get involved in if capacity allows. This approach also shows that progress towards service improvement is being made.
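
The mechanics described above can be illustrated with a minimal sketch of a board that enforces a WIP limit and only moves items to ‘Done’ when the agreed quality criteria are met. The column names follow Figure 12.12; the limit value, items and criteria check are illustrative assumptions:

```python
class KanbanBoard:
    """A small Kanban board with a work-in-progress (WIP) limit."""

    def __init__(self, wip_limit=3):
        self.columns = {"To do": [], "In progress": [], "Done": []}
        self.wip_limit = wip_limit

    def commit(self, item):
        """Only committed work enters 'To do', not the whole backlog."""
        self.columns["To do"].append(item)

    def start(self, item):
        """Pull an item into 'In progress', respecting the WIP limit."""
        if len(self.columns["In progress"]) >= self.wip_limit:
            raise RuntimeError("WIP limit reached: finish work before starting more")
        self.columns["To do"].remove(item)
        self.columns["In progress"].append(item)

    def finish(self, item, quality_criteria_met):
        """Move an item to 'Done' only when the quality criteria are met."""
        if not quality_criteria_met:
            raise ValueError(f"'{item}' does not yet meet the agreed quality criteria")
        self.columns["In progress"].remove(item)
        self.columns["Done"].append(item)

board = KanbanBoard(wip_limit=2)
board.commit("Refresh process model standards")
board.commit("Pilot self-review checklist")
board.start("Refresh process model standards")
board.finish("Refresh process model standards", quality_criteria_met=True)
print(board.columns["Done"])  # ['Refresh process model standards']
```

Raising an error when the limit is reached mirrors the Kanban principle of finishing work before starting more, which is what preserves flow through the columns.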

Figure 12.13 Example Kanban Board for BA Service improvement activities

CONCLUSION

Delivering a high-quality BA Service will only happen through design and continued effort, and this requires a quality culture and a continual improvement mindset. It is the responsibility of every member of the BA Service to contribute to the development of service quality.

The provision of processes and approaches to assess the quality of business analysis deliverables will help to ensure that the artefacts produced by business analysts are fit for purpose and offer value to customers.

The maturity of the BA Service can be assessed, and potential areas for service improvement identified and managed, through use of approaches such as the BA Maturity Model, the BA Service Assessment, the BA Service Improvement Plan and the BA Service Road Map.

Various frameworks exist that help to improve service quality, including the Deming Cycle and the ISO quality management principles. Ongoing quality management, which encompasses formal processes and techniques, is needed to ensure that the BA Service maintains a focus on service quality.

CASE STUDY 8: INSTILLING A QUALITY FOCUS TO ENABLE BUSINESS ANALYSTS TO SUCCEED AND THRIVE

Charlie Payne, National Grid

Charlie Payne is a BA manager for a business area within National Grid. He is also a regular presenter at business analysis events and the BA Conference Europe.

Charlie believes in having a clear vision and set of values for the BA team. He asks questions such as, ‘How do we know that we are offering quality?’ He doesn’t believe in micro-managing his staff but in enabling them to succeed when performing their business analysis work. He is focused on achieving the required outcomes and answering the question, ‘Why do we do what we do?’

His approach to measuring performance and improving the quality of business analysis work involves a combination of formal and informal mechanisms; for example, the IIBA Competency model has been used in workshops and discussions as a basis for exploring where the team has strengths and where there are gaps. Charlie believes in listening and understanding – whether working with customers or developing the BA team.

Charlie is very focused on the customer perspective and expects his business analysts to consider the customer view. One of the key questions from this perspective is: ‘How does a customer know that the business analysts know what to do?’ To address this question, he has instigated a new task that the business analysts need to work through when starting a project. This involves defining the approach that will be taken to the work and communicating with the customers to make sure that they understand the approach. At a later stage, Charlie contacts the customers to ask if they have signed up to the proposed approach and whether the approach was communicated effectively by the BA. He is keen for the BAs to understand their stakeholders and to think about what they need from the analysis outputs presented to them. ‘Doing the right thing’ and making sure that this has been thought through is very important.

The major challenge faced regarding performance measurement and quality improvement involves moving from non-measurement to measurement. Charlie feels that it is important to get to the point where people want performance and quality measurement rather than feeling that it has been imposed on them. He wants his team to embrace continuous improvement and feels that he needs to ensure that they are ready for this. He does this through co-creating a high-performance context with the BAs in the team.

Addressing the quality challenge has been accomplished by applying a formal process for working with the team. Providing time and space to discuss how performance measurement and quality might work is just one aspect. The other, more informal, approach involves listening to, supporting and coaching the team members. Charlie uses the analogy of an American football blind-side tackle who protects the ‘guys with the ball’. He sees his role as Chief Cheerleader, looking after the team members and instilling them with the confidence to do the work, knowing that he ‘has their back’.

Charlie’s BA team want to perform and want to offer great quality work. He wants to give them the ability to step up and succeed as BAs – or if business analysis doesn’t suit them, to move into another role. He is currently trialling an approach whereby customers are asked review questions based on the IIBA Competency model:

1. How accurately has the BA captured the information for you to understand, review and validate?

2. How usable is the work that the BA has provided?

3. How effective is the format and presentation of the work?

4. How satisfied are you with how the BA communicates with you?

The BAs can ask these questions of their customers when they want. The idea is to learn and adapt, improving quality as they go, and to be the best BA they can be.

The key lessons are to be clear about what the team is trying to achieve, understand the purpose of the work (Simon Sinek’s ‘start with why’) and know where you are going. Having a focus on the outcome is important, as this helps to pursue continuous improvement. Charlie believes in listening to customers ‘relentlessly’ and wants them to say, ‘we are so glad we have your BAs here’. He also wants colleagues such as project managers to state how valuable the BAs are to the projects.

Charlie recognises that BAs work in the ‘wicked mess’ where there are lots of different people and multiple perspectives. The BAs have to bring those perspectives together. So, it is challenging work, but, with the right support and attitude, he intends to help the BAs in the team to succeed and thrive.

1 Developed by the Software Engineering Institute, Carnegie Mellon University.

2 See www.iso.org/iso-9001-quality-management
