The United States became aware of its vulnerability to terrorism on September 11, 2001. Suddenly, vectors of seemingly unrelated activities coalesced. One question persists: how could we have been unaware of this threat? From all of the data collected daily, couldn’t a pattern have been detected, a threat perceived, and a warning issued? Indeed, massive amounts of data are collected daily by government agencies in the United States at federal, state, and local levels. However, little of this data is used for more than supporting a single institution or its initial goals. Privacy concerns and an absence of clear need had previously deterred attempts to convert this data into pertinent, widely shared information.
The possibility of deriving such information from masses of disparate data had been the subject of research at Lapsec for several years. Lapsec referred to the technique as data fusion. Data fusion applied advanced algorithms to large quantities of data, rapidly reduced the data into clusters of consequence, and finally stored it as information that could be subjected to pattern analysis. Lapsec received the go-ahead for this project in early 2002. At the same time, Lapsec was granted access to all of the transactional databases supporting various levels of American government.
The project not only had complexities, but resolving each complexity required different levels of skill, intensity, and persistence. Timely data had to be continuously extracted from agencies that didn’t know the meaning of the word “cooperation.” This data had to be carefully filtered and reduced to minimize extraction, transfer, and load times. Algorithms had to be constructed to browse, search, parse, correlate, and store intermediate results. New data technologies had to be acquired or created to store the intermediate data in dynamically changing data structures. Last, data fusion algorithms had to be employed to identify and highlight the needles in the haystack that could represent threats to national security.
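The stages described above — extract, filter and reduce, correlate, store intermediate results, and fuse — can be sketched as a minimal pipeline. Everything in this sketch is hypothetical: the source names, fields, matching rule, and scoring function are invented for illustration and are not the classified system's actual design.

```python
# Hypothetical sketch of the data-fusion pipeline described above.
# All sources, fields, and rules here are invented for illustration.

def extract(source):
    """Pull raw transactional records from an agency feed (stubbed here)."""
    return source["records"]

def filter_records(records, fields):
    """Reduce each record to the fields of interest,
    minimizing transfer and load cost."""
    return [{k: r[k] for k in fields if k in r} for r in records]

def correlate(records_a, records_b, key):
    """Join two reduced feeds on a shared attribute,
    producing intermediate pairs for later analysis."""
    index = {}
    for r in records_b:
        index.setdefault(r.get(key), []).append(r)
    return [(a, b) for a in records_a for b in index.get(a.get(key), [])]

def fuse(pairs, score):
    """Apply a scoring function and keep only the 'needles
    in the haystack' that exceed the threshold."""
    return [p for p in pairs if score(p) > 0]

# Toy usage with two imaginary feeds:
flights = {"records": [{"name": "A. Smith", "school": "XYZ Flight School"}]}
visas = {"records": [{"name": "A. Smith", "status": "expired"}]}

fa = filter_records(extract(flights), ["name", "school"])
fb = filter_records(extract(visas), ["name", "status"])
pairs = correlate(fa, fb, "name")
hits = fuse(pairs, lambda p: 1 if p[1]["status"] == "expired" else 0)
```

The point of the sketch is the shape of the problem, not the algorithms: each stage reduces or restructures data so the next stage has less to chew on, which is exactly where the team's complexity lay.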
This project was a first for Lapsec in many ways: the technologies involved in this project were untested, and the degree of cooperation required was higher than ever before. Accordingly, the first milestone was a proof of concept. Experts on algorithm, database, and fusion technologies and developers were brought together to form a team to reach this milestone.
The team members struggled for several weeks without making much progress. The unknowns and interdependencies were simply too great. How much data should they initially acquire? Which agencies should they approach for data feeds? How could they quickly acquire the requisite data storage capabilities? Would a commercial data storage facility work, or would they have to build their own? What kind of algorithms should they develop? After several weeks of struggling, the team decided they needed a process that would focus their efforts. Several team members knew of Scrum and suggested that the team try it.
I had doubts about whether Scrum would be able to help resolve the situation. In most complex projects, I’m able to make timely suggestions because I see patterns that I recognize from my previous experiences. This project, however, was classified: I wasn’t allowed to know anything about it. Most of what I thought I knew about the requirements of the project I’d inferred rather than been told explicitly. It is entirely possible that my inferences were incorrect and that the project was actually an effort to catch salmon poachers!
I offer a service called Project Quickstart. It is a two-day exercise that gets a new project team unfamiliar with Scrum launched on its project. During Project Quickstart, I teach the team about Scrum, and I help the team go through a Sprint planning meeting for their project. At the end of the Quickstart, the team has begun its first Sprint, and the first increment of product functionality is due in 30 calendar days.
After the first day of Quickstart with the Lapsec team, I was frustrated. I didn’t feel that they really understood Scrum. The day felt like a formal training exercise, not the start of the team’s first Sprint. Team members were still acting like people from different organizations who were investigating something new. They didn’t act like a self-organizing team performing as a single unit to solve a problem.
How could I help them? Every time I tried to get them to discuss their goals and Product Backlog, they told me they would have to shoot me if they did. Lapsec had decided that progress in the project was so urgent that there wasn’t enough time to get me enough security clearance to learn the basics of the project. At least, that’s what they told me—perhaps I had failed the security clearance! In any case, the team couldn’t tell me anything about its work. Nonetheless, some information was passed on unintentionally. The team felt secure enough to tell me about the project’s data fusion algorithms, the functions of its host organizations, and its merging of a lot of disparate governmental data.
For Scrum to work, the team has to deeply and viscerally understand collective commitment and self-organization. Scrum’s theory, practices, and rules are easy to grasp intellectually. But until a group of individuals has made a collective commitment to deliver something tangible in a fixed amount of time, those individuals probably don’t get Scrum. When the team members stop acting as many and adopt and commit to a common purpose, the team becomes capable of self-organization and can quickly cut through complexity and produce actionable plans. At that point, the members of a team no longer accept obstacles, but instead scheme, plot, noodle, and brainstorm about how to remove them. They figure out how to make their project happen despite the different backgrounds and skill sets each person brings to the job.
I couldn’t sleep for fear that the next day might be a disaster and the training session as a whole might fail. Finally, I decided that I had enough information to construct a hypothetical project. I had been told the vision of the project was to improve national security through the merging and fusing of data. I knew some of the people were national data experts, others were mathematicians responsible for algorithm design, and still others were Internet search experts. Every U.S. citizen was familiar with the criticisms of the government regarding intelligence gathering and usage prior to 9/11. Even though I would probably be wrong about the details, I felt that I could probably construct an adequate hypothetical Product Backlog to help the team meld.
The next morning, I started by reviewing the concepts of Product Backlog and sashimi with the team. I then passed out the hypothetical Product Backlog that I had written the previous night. I asked the team to spend the next two hours going over it and selecting the work for the first Sprint. I told them that after two hours they would have to tell me what Product Backlog they had selected to turn into a full, demonstrable increment of potentially shippable product functionality during the first Sprint. In other words, what could they accomplish? I hoped my hypothetical Product Backlog might be close enough to their real project to make the class seem real to this team.
The Product Backlog consisted of one functional requirement and a number of supporting nonfunctional requirements. The requirements held that the product ought to be able to perform the following functions:
Identify those people who have attended flight school over the past three months and who fit the profile of someone who might intend to commit an act of terror against the United States.
Present information in a graphical manner that leads to intuitive exploration through such techniques as merging and drill down.
Combine information from multiple sources correlated in relationship to the inquiry or criteria being posed.
Deconstruct an inquiry into relevant data.
Provide intermediate storage of extracted data in such a manner that it could be readily codified and later used. Do this dynamically as the inquiries are parsed and without undue intervention by the person making the inquiry.
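As an illustration only, the hypothetical backlog above could be captured as simple structured data. The wording, priorities, and selection rule below are my own paraphrase of the requirements just listed, not the team's actual artifact:

```python
# A minimal representation of the hypothetical Product Backlog above.
# Item wording is paraphrased; priorities are invented for illustration.

product_backlog = [
    {"priority": 1, "type": "functional",
     "item": "Identify recent flight-school attendees fitting a threat profile"},
    {"priority": 2, "type": "nonfunctional",
     "item": "Graphical presentation supporting merging and drill-down"},
    {"priority": 3, "type": "nonfunctional",
     "item": "Correlate information from multiple sources per inquiry"},
    {"priority": 4, "type": "nonfunctional",
     "item": "Deconstruct an inquiry into relevant data"},
    {"priority": 5, "type": "nonfunctional",
     "item": "Dynamic intermediate storage of extracted data"},
]

# Sprint planning: the team selects only the top-priority slice
# it believes it can turn into demonstrable functionality in 30 days.
sprint_selection = [i for i in product_backlog if i["priority"] <= 2]
```

Representing the backlog this way makes the Sprint planning question concrete: of the ordered list, how far down can the team commit to go in one Sprint?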
By combining the requirements to demonstrate just one piece of functionality within the time limit of the Sprint, Scrum forced the team to focus its attention on the immediate. I told the team that I was their Product Owner and would answer any questions regarding the Product Backlog and project. Then I told the team to start work.
As the exercise progressed, the team laid out tentative designs, explored the amount of data that could be retrieved, analyzed the elements and attributes required to support the required functionality, and designed several simple fusion algorithms. They struggled with how they could limit the work. How could they do all of this work and still produce sashimi in one Sprint? Under the pressure of the Sprint time limit, the team realized that it had to use a one-time extract of data from the source databases. It didn’t have time to build formal database interfaces. The team came to realize that it needed to put only representative pieces of the whole product together to produce the desired functionality. There was no need for the team to build every piece of the product.
At the end of the two hours, the team described what it could do. The team had collaborated with me, the Product Owner, to devise something of value that could be done in one Sprint. In the process, the team members had self-organized and become a single cohesive team. The team had gone from being a separate group of individuals at a class to a team committed to finding a solution. The team had learned the essence of the Scrum process!
The rest of the day was spent constructing a hypothetical Sprint goal and Sprint Backlog. Most pleasing to me, team members from different organizations devised Sprint Backlog items that required cross-functional responsibilities that would require significant cooperation to complete. By the end of the day, I was satisfied. I felt that the team had grasped Scrum. I felt that the team members knew how to plan and commit well enough to do so in the future without any help from me.
I asked the team to spend the next day on real work. Its first step would be to modify the hypothetical Product Backlog into an actual project Product Backlog. Then the team would repeat the Sprint planning meeting with live Product Backlog, spending two hours designing and constructing a Sprint Backlog for the real project. I asked the team to call me if it had any questions that I could answer—at least, questions that I could answer without getting security clearance.
I got an e-mail from the person who had engaged me for the Project Quickstart. He related the team’s success the next day and its successful initiation of the first Sprint. But I haven’t heard from the team since. My e-mails go unanswered. I presume the team is alive and well, producing something that makes me secure but is too secret for me to know about. Sometimes I wonder whether I was the only person in the room who was using his real name!
We can see that the ScrumMaster has to be effective regardless of the circumstances. Although ScrumMasters might be tempted to think of themselves as powerful, they are really only enablers. My hands were tied at Lapsec by my lack of knowledge of the application and the technology. My suggestions were based on mere guesswork. It’s one thing to read and talk about Scrum, but it’s another to implement it. Scrum must be put into place before it can be fully understood.
The dynamics of self-organization, collaboration, and emergence are best understood when a team faces a real problem. My description of Scrum remained merely academic for the team until I provided the team with a hypothetical Product Backlog that was similar enough to its own backlog of work that team members could get emotionally involved with it and feel as though they were making real progress. At that point, everyone’s understanding of Scrum quickly moved from intellectual to experiential. From then on, the team was able to employ Scrum’s practices and rules to reduce the complexity of its situation to something that could be addressed within a 30-day Sprint.