Scrum was brought into MegaEnergy through a pilot project, the Title project, which was referred to in Chapter 2. This project had already been attempted twice and had failed both times. An IT director had learned about Scrum and had convinced his fellow IT managers that they should try Scrum on the Title project. They all felt that this was an opportunity to assess Scrum. If Scrum could turn around the Title project, it would be deemed worthy of further evaluation.
Stakeholders are those who have a stake in a project’s outcome, including those who funded the project and those who will be users of the system in development. Most projects at MegaEnergy offered project stakeholders only limited visibility into the project’s operations and progress toward goals. These projects developed internal artifacts such as requirements statements, architectures, models, designs, and code. At the very end of the project, if the project hadn’t stalled in developing these artifacts, they were all pulled together into a working system. Only then did the stakeholders get to see the actual system they were to use.
Project managers at MegaEnergy kept stakeholders and management apprised of a project’s progress through periodic reports. Because traditional projects are managed at the task level, these reports documented the percentage of completed tasks, slippage in task completion, and any problems and recommended remedies. Since tasks have only a casual relationship to user functionality, these reports were often more frustrating than useful to stakeholders. A Gantt report was then used to track project progress. A Gantt report is a key tool in task-level project management, providing a visual mechanism for laying out all a project’s work, the relationships between the work, and the resources assigned to the work.
MegaEnergy had a very formal and traditional project management process developed over the years by its program management office, staffed by senior people who had previously run pipeline construction projects. To them, the Gantt report was the Holy Grail for planning and controlling a project. Their solution to the first Title project failure had been to increase the extent of the initial planning and rigidly enforce change-control processes during the second attempt. They believed that the first project had failed because management had tolerated too many changes to the initial plan. When I heard this, I was reminded of Einstein’s definition of insanity: doing the same thing over and over and expecting different results. Surprisingly, this approach is common. If a project being managed by a defined approach fails, people often assume that the project failed because the defined approach wasn’t adhered to rigorously enough. They conclude that all that is needed for the project to succeed is increased control and project definition.
Senior management, including the steering committee for the Title project and the project management office, knew that something new was going to be tried on this third attempt at the Title project. The people staffing the program management office weren’t familiar with empirical process control; Scrum was a big unknown to them. However, nobody objected to its use as long as the project was going to be controlled the way all projects at MegaEnergy were controlled—with Gantt reports.
This presented us with a dilemma. Should we provide Scrum training to senior management, including the people in the program management office? Should we make them aware of the radically different approach we were about to use on the Title project? Should we enter into a long discussion with them about the differences between defined and empirical process control? We knew that the discussion would have a large emotional impact. The people in the program management office had a long history of success at MegaEnergy. Their approach had been used to manage projects much larger than the Title project. Their take on the previous Title project failures was that it was a people failure, not a process failure. How could we convince them otherwise?
Scrum managers measure and track requirements, not tasks. The Product Backlog indicates the requirements, the prioritization of the requirements, and the anticipated grouping of the requirements into Sprints and releases. The Product Backlog at the start of a specific Sprint looks different from a Product Backlog at the start of a subsequent Sprint because business conditions might have caused the Product Backlog to change or be reprioritized. Some items in the Product Backlog might not have been completed during a Sprint and might have been reallocated to a future Sprint. The release might now include more or less Product Backlog than initially planned. The Product Owner might have restructured or repurposed the release. Planned Sprints might include more or fewer Product Backlog items than before as more is learned about the size of specific Product Backlog items, or as more is learned about the productivity of the teams working on the project.
Scrum reports represent a paradigm shift for most stakeholders and management. Traditionally, a plan is established and any variation from the plan is seen as undesirable. Periodic management reports are supposed to show how closely a project is to the initial plan. Scrum instead reports exceptions to the plan, responses to these exceptions, and the impact on the project plan. Scrum expects change and provides reports that track change and its impact on the project.
The ScrumMaster, Ruth, was a solid project manager at MegaEnergy. She knew MegaEnergy culture inside and out, and she knew the senior executives who would be receiving reports on the Title project’s progress. She had worked with the people in the program management office and knew exactly what they wanted and why. She knew that Gantt reports were the core of their reporting system. She was skilled in preparing and managing these reports with Microsoft Project, the standard project management tool at MegaEnergy.
Ruth and I sat down to figure out how we could get the people in the program management office to permit us to proceed with Scrum. If we convinced them to let us try a new form of project management like Scrum, they wouldn’t necessarily want it to succeed. They had a vested interest in the current method of project planning and management. Reporting would be tricky because we would have to justify every change. The words “empirical,” “self-organizing,” and “emergent” were virtually unknown in the program management office and would probably seem abhorrent to it.
The approach we settled on for introducing Scrum to senior management and the program management office reminds me of an old joke. John sees Hank pulling a long piece of rope up a narrow, winding mountain road. John asks Hank why he is doing this. Hank replies, “Because it’s easier than pushing it!” The approach Ruth and I settled on wasn’t as simple and straightforward as Scrum when it initially comes out of the can, but it seemed a lot simpler than trying to convince everyone that empirical process control, as embodied by Scrum, was a palatable alternative to their current approach. Ruth and I decided to provide management with the Gantt-based reports. However, rather than using task-based planning and reporting, we would plan and report requirements.
Our first step was to acquaint Ruth with Scrum’s reports. Scrum defines four reports for the Product Owner and ScrumMaster to create at the end of each Sprint. The first lists the Product Backlog at the start of the previous Sprint. The second lists the Product Backlog at the start of the new Sprint. The third, the Changes report, details all of the differences between the Product Backlogs in the first two reports. The fourth report is the Product Backlog Burndown report.
The Changes report summarizes what happened during the Sprint, what was seen at the Sprint review, and what adaptations have been made to the project in response to the inspection at the Sprint review. Why have future Sprints been reformulated? Why was the release date or content reformulated? Why did the team complete fewer requirements than anticipated during the Sprint? Where was the incomplete work reprioritized in the Product Backlog? Why was the team less or more productive than it had anticipated? All of these questions are answered in the Changes report. The old and new Product Backlog reports are snapshots of the project between two Sprints. The Changes report documents these differences and their causes. A collection of Changes reports over a period of time documents the changes, inspections, and adaptations made during that period of time.
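The Changes report described above is essentially a diff of two Product Backlog snapshots. A minimal sketch of that diff follows; the item fields (`item_id`, `priority`, `sprint`, `estimate`) and the report's wording are my illustrative assumptions, not a format the book prescribes.

```python
# Sketch: deriving a Changes report by comparing the backlog snapshot
# taken at the start of the previous Sprint with the one taken at the
# start of the new Sprint. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class BacklogItem:
    item_id: str
    description: str
    priority: int    # position in the backlog; lower = more important
    sprint: int      # Sprint the item is currently slotted into
    estimate: float  # remaining work, e.g. in days

def changes_report(old: list[BacklogItem], new: list[BacklogItem]) -> list[str]:
    """List the differences between two Product Backlog snapshots."""
    old_by_id = {i.item_id: i for i in old}
    new_by_id = {i.item_id: i for i in new}
    lines = []
    for item_id in new_by_id.keys() - old_by_id.keys():
        lines.append(f"ADDED   {item_id}")
    for item_id in old_by_id.keys() - new_by_id.keys():
        lines.append(f"DROPPED {item_id}")
    for item_id in old_by_id.keys() & new_by_id.keys():
        o, n = old_by_id[item_id], new_by_id[item_id]
        if o.priority != n.priority:
            lines.append(f"REPRIORITIZED {item_id}: {o.priority} -> {n.priority}")
        if o.sprint != n.sprint:
            lines.append(f"MOVED   {item_id}: Sprint {o.sprint} -> Sprint {n.sprint}")
        if o.estimate != n.estimate:
            lines.append(f"RE-ESTIMATED {item_id}: {o.estimate} -> {n.estimate}")
    return sorted(lines)
```

The diff answers the mechanical "what changed" questions; the ScrumMaster and Product Owner still supply the "why" for each line when they write the report.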
We then set about translating the Product Backlog into a Gantt report. The Product Backlog, shown in Figure 7-1, was maintained in a spreadsheet as a simple prioritized list of requirements. Dependencies between requirements are resolved by placing dependent requirements at a position in the list lower than the requirements on which they depend. Requirements are segmented into Sprints and releases by unique rows in the list.
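The ordering rule above — every requirement sits below the requirements it depends on — is easy to check mechanically. Here is a minimal sketch; the list-and-dictionary shapes and the sample requirement names are my assumptions, not the Title project's actual backlog.

```python
# Sketch: verifying the backlog ordering rule — each dependent
# requirement must appear lower in the prioritized list than its
# prerequisites. Data shapes here are illustrative assumptions.
def ordering_violations(backlog: list[str],
                        depends_on: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Return (item, prerequisite) pairs where the prerequisite
    appears later (lower priority) in the list than the item."""
    position = {name: i for i, name in enumerate(backlog)}
    violations = []
    for item, prereqs in depends_on.items():
        for prereq in prereqs:
            if position[prereq] > position[item]:
                violations.append((item, prereq))
    return violations
```

A spreadsheet-maintained backlog has no tooling to enforce this, so a check like this (or simple discipline at reprioritization time) keeps the list consistent.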
A Gantt report is a lot more impressive than a Product Backlog list, as you can see in Figure 7-2. It is graphic, indicates dependencies with lines, comes in multiple colors, and is much more complicated than a simple list. But appearances can be deceiving. If a Gantt report includes only requirements, and not tasks, it is merely a graphical representation of the Product Backlog.
Ruth opened the Title project Product Backlog spreadsheet in Microsoft Excel and opened a new project in Microsoft Project. She copied and pasted the entire Product Backlog list from Excel into Microsoft Project in the Task Name column. She then copied the estimates into the Duration column. She then arranged the requirements (Microsoft Project thought of them as tasks) by Sprint, as shown in Figure 7-3.
This transfer between two Microsoft products was straightforward. Ruth then populated the Work and Tracking views of Microsoft Project with the estimated work for each Product Backlog item, along with the start date and end date of each item’s actual or planned Sprint. The percentage completed fields were normally 100 percent at the end of the Sprint. We decided that when they weren’t, she would split the items and reallocate the undone work to future Sprints.
The only report that we couldn’t readily translate to existing MegaEnergy reports was the Product Backlog Burndown report, shown in Figure 7-4. The Burndown report graphs remaining estimated workload over the course of the project. Workload at the start of each Sprint is measured by summing all open Product Backlog work estimates. From Sprint to Sprint, the increase or decrease in these sums can be used to assess progress toward completing all work for a release by an expected date.
This Burndown report shows the amount of remaining Product Backlog work on the vertical axis and the time scale, by Sprint, on the horizontal axis. The Product Owner plots the remaining quantity of Product Backlog work at the start of each Sprint. Connecting the plots from all completed Sprints yields a trend line indicating progress toward completing all work. By figuring out the average slope over the last several Sprints and extending a trend line from the plots of these Sprints, the time when zero work remains can be determined: it is the point where the trend line intersects the horizontal axis. Ruth and I decided that this was an important report. It would graphically present to management how the factors of functionality and time were interrelated. We decided to include it in the reports, but as an appendix.
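The extrapolation just described can be sketched in a few lines. This is a minimal illustration under my own assumptions: `remaining` holds the summed open estimates plotted at the start of each completed Sprint, and `window` is how many recent Sprints feed the average slope; neither name comes from the book.

```python
# Sketch: projecting when remaining work hits zero from Burndown plots.
# remaining[i] is the summed open Product Backlog estimate at the start
# of Sprint i+1; window controls how many recent Sprints set the slope.
def projected_completion_sprint(remaining: list[float], window: int = 3) -> float:
    """Extend the average trend of the last `window` Sprints to the
    horizontal axis; return the (possibly fractional) Sprint number at
    which remaining work reaches zero."""
    if len(remaining) < 2:
        raise ValueError("need at least two plots to draw a trend line")
    recent = remaining[-(window + 1):]  # window slopes need window+1 points
    slope = (recent[-1] - recent[0]) / (len(recent) - 1)  # avg change per Sprint
    if slope >= 0:
        raise ValueError("work is not burning down; no completion in sight")
    last_sprint = len(remaining)        # Sprints numbered from 1
    return last_sprint + remaining[-1] / -slope
```

For example, plots of 100, 80, 60, and 40 units burn down 20 units per Sprint, so the trend line reaches zero at Sprint 6. A rising trend is itself information: it tells the stakeholders the release is growing faster than the team is completing it.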
When management got its reports at the end of the first Sprint, the new reports looked a lot like the old reports except that, as Ruth noted in the preface to the reports, she was tracking progress and completion of functionality rather than tasks. When Ruth went over these reports with the steering committee, she used the Product Backlog Burndown report to show the implications of the completed Product Backlog for the entire release schedule. She then used the Product Backlogs to show the difference between the Product Backlog plans at the start of the Sprint and the end of the Sprint. In this case, the difference was fairly dramatic. The Product Owner had capitalized on the value of the first increment by choosing to implement it, meaning that the functionality in the increment was made production ready, the users in the Title department were trained, and the users started using this functionality in their everyday work. This decision introduced a release Sprint of two weeks into the Product Backlog, changing everything. As the steering committee discussed this, it came to realize a core benefit of Scrum: each Sprint’s increment can potentially be implemented. In this case, the Product Owner felt that an early implementation was justified. The Product Owner inspected and adapted. The steering committee was exposed to the incremental nature of Scrum and the benefits of frequent inspection and adaptation.
Ruth correctly assumed that senior management didn’t want to talk about process; it wanted to talk only about results. Introducing a new format for reporting project progress would require teaching management about Scrum. It would require getting the program management office to consider a whole new approach to managing projects. Senior management didn’t care about Scrum. It cared about its investment in the project.
Ruth could have continued task-based reporting to senior management. If she had chosen to do so, she would have developed an anticipated task plan and fit each Sprint Backlog into it. She had neither the time nor the inclination to do this, but neither did she want to change the reporting format. She correctly assessed that she could deliver a new message using the same reporting format, reporting progress on requirements and functionality rather than tasks. By fitting the Product Backlog into Microsoft Project, she was able to normalize the Product Backlog into a known format.
Translating Product Backlog to the Gantt report wasn’t a very big effort. Ruth felt that it was certainly a much smaller effort than convincing the program management office that Scrum and Scrum reporting were acceptable. The only report that didn’t readily fit was the Product Backlog Burndown report, which became an appendix to regular management reports. As management asked questions about the regular reports, Ruth was able to support her discussion with the Product Backlog Burndown reports. When management wanted to know the impact of the early release, Ruth was able to show it to management on the Burndown reports. Ruth was able to teach management how to manage a Scrum project without having to teach it a whole new vocabulary.
Scrum proves its value as projects succeed. However, it is a radically different approach from traditional project management, expecting change rather than fearing it. Adaptation is a normal part of the project rather than an exception. If these concepts are theoretically discussed, most people agree on the reasonableness of the approach. When these concepts are brought up in the context of a critical project, however, management is often extremely wary. Managers want to know why this approach is being suggested. They ask if the approach is risky, and whether allowing this much change isn’t inviting disaster. In this case, Ruth showed the value of the new approach by putting her mouth where management’s money was. She showed the benefits and value of Scrum to management without its knowing or caring about the concepts or theory of agile processes. All management knew was that something good was afoot. As the CEO of another company stated at a Sprint review, “I don’t know what this Scrum is, but I like it.” That’s the way it should be.