An analytic project, no matter how brilliant, is worth nothing if it does not inform decisions and drive actions. In this chapter, we address techniques you can use to get your project adopted in your organization. We cover testing of the solution to validate it delivers as expected, training of target users to promote understanding and adoption, and phased rollout to build momentum and a solid user base.
The type and degree of testing needed for your solution depend on its scale and level of automation. Will the solution be rolled out enterprise-wide, or will it be deployed to a small user group? Will one or two developers be responsible for Data Development, the Analytical Structure, and Guided Analytics, or will a team of developers be involved?
Large and complex projects designed for enterprise deployment with a high degree of automation require a more formal and structured testing, training, and rollout process. Smaller, focused projects impacting a small group of users can adopt a more agile process. Let's review both scenarios.
Decision Architecture projects on an enterprise scale involve several departments working together in cross-functional teams with a number of roles. A team working on an enterprise-wide Decision Architecture project typically requires a wide scope of capabilities, comprising facilitation, documentation, data architecture design, data analysis, data science and decision theory, analytics, UI/UX design, and dashboard development. These capabilities require cross-functional and cross-departmental collaboration across a broad range of roles, such as project leader, decision architect, decision analyst, data scientist, data librarian, data analyst, data architect, data developer, guided analytics developer, UI/UX designer, project manager, and trainer. Chapter 16 discusses these organizational capabilities and roles in greater detail.
While some of the roles may be held by one person, generally speaking, the project team for large, enterprise solutions will be more complex than a small group of people working in a room together. For many large organizations, team members are likely to be located in different countries around the world. In these cases, coordination and testing at every step are essential to ensuring that the end solution, once brought together, works as intended rather than becoming a jumbled mess.
Large decision architecture projects have a lot in common with software development projects. Accordingly, the testing processes are similar. During the development process, analytic, data, and guided analytic developers will use unit-testing processes to ensure the analysis provides correct results, the data is correct and repeatable, and the front-end tools comply with design and usability requirements.
After development testing is complete, the next step is user acceptance testing (UAT). In this step, developers and selected end users form a test group to ensure the final solution works as intended. Because testers and users may be spread across many locations, it is best to use an online issue-tracking tool or shared document to log the issues found.
During the UAT phase, testers and developers collaborate as needed to meet the project schedule. Table 15.1 shows the format of an issue log we use in our projects when a system issue-logging tool is not available. The issue log is essential for tracking issues identified, root causes, solutions, and timing in order to maintain the project schedule. A well-maintained issue log also serves as an institutional learning tool: it can be shared with other teams working on similar projects so they benefit from the issues raised and the solutions developed.
Table 15.1 Issue Log
| Item No | Issue | Issue Description | Priority (HML) | Tester | Tester Comment | Issue Entered Date | Issue Resolution Date | Owner | Owner Comment |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Alert for Revenue metric on Exec Dashboard | Alert is not triggering for revenue decreases above threshold | M | Sarah | Should change color when revenue decreases below 10% | 10/15/2016 | 10/20/2016 | Bill | Found and fixed issue with the display action |
| 2 | Null handling for displacement calculation | Current displacement metric is incorrect | M | Wenming | Need to add zero-if-null test to the formula | 10/15/2016 | 10/20/2016 | Amal | Modified the calculation as requested |
| 3 | Missing data for property ABC | When ABC is selected in the property pull-down menu, no data is displayed | H | Sarah | | 10/16/2016 | TBD | Sergio | Looking into ETL process; will update when problem is identified |
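When no issue-tracking system is available, even a lightweight script can keep a shared log queryable, for example to pull up the open items before a UAT status call. A minimal sketch (the field names mirror Table 15.1, but the records and helper function are our own illustration, not part of any specific tool):

```python
from dataclasses import dataclass

# Illustrative issue record; fields mirror the Table 15.1 columns we query.
@dataclass
class Issue:
    item_no: int
    issue: str
    description: str
    priority: str   # "H", "M", or "L"
    tester: str
    entered: str    # dates kept as text for simplicity
    resolved: str   # "TBD" while the issue is still open
    owner: str

def open_issues(issues):
    """Return unresolved issues, highest priority first."""
    order = {"H": 0, "M": 1, "L": 2}
    pending = [i for i in issues if i.resolved == "TBD"]
    return sorted(pending, key=lambda i: order[i.priority])

# Two rows adapted from Table 15.1.
log = [
    Issue(1, "Revenue alert", "Alert not triggering", "M",
          "Sarah", "10/15/2016", "10/20/2016", "Bill"),
    Issue(3, "Missing data for property ABC", "No data displayed", "H",
          "Sarah", "10/16/2016", "TBD", "Sergio"),
]

print([i.item_no for i in open_issues(log)])  # → [3]
```

A spreadsheet serves the same purpose; the point is that the log lives in one agreed place and can be filtered by status and priority.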
Small- to medium-sized organizations, or departments within a larger organization, are likely to develop smaller-scale Decision Architecture projects. The teams are typically smaller and often colocated, which eases communication and collaboration. Essential roles found in even the smallest projects include data developer, decision/business analyst, and guided analytics developer. In small-scale projects the development team and the user team often overlap, so it is more difficult to draw a line between unit testing and UAT, but it is still a good idea to follow a structured process in order to keep the project moving.
We also recommend maintaining an issue log to document issues found and resolutions applied even in cases where the tester and the owner may be the same person. This preserves organizational learning and helps to develop best practices in order to not repeat mistakes on subsequent projects.
Early in the development cycle it is important to establish a standard nomenclature for release names, helping team members maintain version control and avoid redundant work or rework. Release-naming nomenclature ranges from the simple v1.0 to more complex multi-component names. Useful best practices identified by Princeton University Records Management can be found at records.princeton.edu/blogs/records-management-manual/file-naming-conventions-version-control; the essential point is that the team agrees on its file-naming and version-control standards up front.
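Whatever nomenclature the team agrees on, it helps to enforce it mechanically so that off-pattern names are caught before they spread. A minimal sketch, assuming a simple vMAJOR.MINOR with an optional patch component (this particular convention is our illustration, not one prescribed by the guidance above):

```python
import re

# Assumed convention: "v", major, ".", minor, optional ".patch",
# e.g. v1.0, v1.1 (a "dot-release"), v2.0.3.
RELEASE_NAME = re.compile(r"^v(\d+)\.(\d+)(?:\.(\d+))?$")

def is_valid_release(name: str) -> bool:
    """True if the name matches the team's agreed release pattern."""
    return RELEASE_NAME.match(name) is not None

print(is_valid_release("v1.0"))      # True
print(is_valid_release("v2.0.3"))    # True
print(is_valid_release("final_v2"))  # False
```

A check like this can run in a pre-commit hook or build script, so ad hoc names such as "final_v2" never make it into the shared repository.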
Once a project has completed UAT, it enters the Adoption step, which comprises Training and Rollout. User training and coaching are important to ensuring the solution is utilized: how well users adopt the solution and how often they use it will dictate the value the organization derives.
Training is a critical step to ensuring user adoption. With busy schedules and many competing projects, users will not spend valuable time on a solution they do not understand or know how to use. Key training components include content, medium, and adoption.
We use several methods to create quick references that help managers tie the solution to actions in their daily work.
It is important to tie actions to decisions, allowing analysts and end users to see the connection and develop a deeper understanding of the solution and how to make use of it.
While it may be tempting to go for the big-bang approach and throw your solution out there to get it going, we strongly recommend using a phased rollout approach. This approach lends itself well to the iterative development style we emphasize throughout this book. Depending on the time available and complexity of the project, we roll out the solution in up to three expanding user group phases: pilot, expanded pilot, and full user base.
When we are ready to deploy a solution after successful completion of the testing process, we first identify a small group of Pilot users to use the solution in a production environment. Although by this point we have been through some degree of unit testing and UAT, we believe in the Agile principle of getting the solution into the hands of real users as soon as possible to uncover application and usability issues that simply cannot be detected in a development environment. We also select one of the team members to act as the Project Champion during the rollout phase, serving as the primary point of contact for the user groups.
We seek out key influencers and heavy users of the analytic solution to form the Pilot group. During the Pilot Rollout phase, we continue to maintain an issue log as presented in Table 15.1 to capture issues raised by our Pilot group. In addition, we also maintain an enhancement log, shown in Table 15.2, as end-users often identify opportunities for improvement that were not scoped in the original requirements. Sometimes these enhancement opportunities are easy to address and can be resolved through a “dot-release,” that is, we can include the enhancement in version 1.1. In other cases, the enhancements requested are more involved and may require an additional development cycle to implement a solution. In some cases, the request is simply not feasible or too far out of scope for the solution.
Table 15.2 Enhancement Log
| Item No | Enhancement | Enhancement Description | Priority (HML) | Tester | Tester Comment | Issue Entered Date | Target Release | Est Level of Effort | Owner | Owner Comment |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Forecast accuracy | Display forecast accuracy in the tooltip on the diagnostic dashboard | H | Tom | This will help users judge the likelihood of hitting the forecast targets | 11/20/2016 | V2 | M | Sergio | Will have to source forecast accuracy metrics and join into the analytic dataset |
| 2 | Add an Opportunity diagnostic dashboard | Add diagnostic metrics and analytics to assess the likelihood of the opportunity | M | Alicia | Would help users determine which opportunity they should prioritize | 11/21/2016 | V3 | H | Amal | Will need to define requirements, develop metrics, source data, and incorporate into the data model |
| 3 | Market names | Replace market names in the current tool with those used by the sales department for field reps | H | Crystal | Current market hierarchy is an older one used by the marketing department but not in sync with one just rolled out by sales to the field reps | 11/22/2016 | V1.5 | L | Sergio | We have identified a source for the new sales market hierarchy and can add to the solution to use in addition to the marketing department hierarchy |
| 4 | Property Management Report | Include metrics from the Property Management Report in order to be able to compare solutions | M | Crystal | While this analytic solution is helpful to determining actions in the marketplace, it would be helpful to be able to compare with performance metrics from the current Property Management Report | 11/22/2016 | N/A | N | Patty | This solution is not aimed at property management diagnostics; inclusion of property management metrics would increase the complexity of the ETL processes, impact front-end performance, and be redundant to existing reports |
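The dispositions captured in Table 15.2 translate naturally into a simple release plan: each accepted request is grouped under its target release, while declined items stay in the log for the record. A minimal sketch (the records and field names are our own condensed illustration of the table; treating a target release of "N/A" as declined is an assumption for this example):

```python
from collections import defaultdict

# Condensed records mirroring Table 15.2 (priority/effort use H/M/L).
requests = [
    {"item": 1, "enhancement": "Forecast accuracy",        "priority": "H", "effort": "M", "release": "V2"},
    {"item": 2, "enhancement": "Opportunity dashboard",    "priority": "M", "effort": "H", "release": "V3"},
    {"item": 3, "enhancement": "Market names",             "priority": "H", "effort": "L", "release": "V1.5"},
    {"item": 4, "enhancement": "Property Mgmt Report",     "priority": "M", "effort": "N", "release": "N/A"},
]

def release_plan(requests):
    """Group accepted requests by target release; 'N/A' items are declined."""
    plan = defaultdict(list)
    for r in requests:
        if r["release"] != "N/A":
            plan[r["release"]].append(r["item"])
    return dict(plan)

print(release_plan(requests))  # → {'V2': [1], 'V3': [2], 'V1.5': [3]}
```

Grouping this way makes it easy to show the Pilot group exactly which of their requests landed in which upcoming version, which supports the clear communication channel described below.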
In all cases, we keep a record of the enhancement requests and the planned disposition to maintain a clear channel of communication with our Pilot user group. It is important that they see their input is taken seriously and acted upon in some capacity in order to encourage ongoing participation. Depending on user locations and the scope of the project, we may schedule conference calls to discuss feedback and proposed solutions. A highly engaged Pilot user group is an invaluable component of the development of the final solution in terms of usability and utility. Once invested in the success of the project, Pilot users often become unofficial champions during the broader adoption phase.
After a period of time, we are ready to move from the Pilot phase to the Expanded Pilot phase. By this time, we may have iterated through several versions and have an increasingly well-honed solution. The Expanded Pilot group will extend to a broader group of influencers and users likely to access the solution on a regular basis as part of their standard work. The process is similar to the one followed with the Pilot group, but with a larger group we might choose to establish automated means to collect feedback, such as a project email or online comment site.
With the Expanded Pilot group, we are less likely to uncover new issues or enhancements, assuming a thorough run with the Pilot group. Rather, we focus on driving understanding of the intent of the solution and how to use it. Information gathered during this phase is passed to the training team to incorporate into training materials.
Even though the process is still technically in Pilot, the solution is live, in production, and delivering value. Pilot users are able to make use of the information, actions, and strategies identified through the solution to enhance their daily work.
Having worked with the Expanded Pilot group, the last process step is full rollout to the entire targeted user base. By this time, the solution is fully in production, maintained by the production team, and the development team has moved on to development of the next version or other projects.
In small-to-medium organizations or small groups within a large organization the Pilot, Expanded Pilot, and Full User Rollout steps may condense into one or two steps.
Without user adoption, the value of the effort you put into your solution will not be realized. Make sure you establish the right level of training and coaching for the solution you are enabling. Check in with users often, during the rollout phase and after, to collect feedback on additional requests or issues they find. Lastly, if possible, create a community around the solution to drive continued innovation and adoption.