How well are retrospectives serving you? We assume you follow the classic retrospective agenda: setting the stage, gathering data, and generating insights before you decide on actions and close the retrospective. We also hope that at your next retrospective you look back at your earlier decisions to evaluate whether you managed to implement those actions and whether they made any difference.
However, how do you know it was those actions that made the difference? And if they did make a difference, what kind of difference did they make? To answer these questions you need a more scientific approach to your retrospectives. This means that for every action (or experiment) under consideration, you explore your hypothesis – what do you assume will happen by taking that action? And how can you later tell whether this hypothesis was validated?
It’s all about the logical process you follow. If you look at all the facts and decide to act on what seems like the best solution, you are being a good detective. You are using what is called “abductive logic.” That can lead to effective improvements, yet it can also lead to a good deal of flailing around: the same problem recurring or resurfacing in a different form – in either case, it isn’t solved. Another approach is to become a good scientist – to use “inductive logic.”
Inductive logic means that when a team thinks of an action to address a problem during a retrospective, it pauses to ask, “What is our hypothesis?” A hypothesis says: “Given this background and based on this general theory or accepted approach, if we do x, y, z, we expect to see the following outcome, which we shall measure by the following. Furthermore, we predict that the measurements will be l, m, n.” This hypothesis forms the basis of an experiment.
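To make the template concrete, here is a minimal sketch (in Python) of how a team could write such a hypothesis down as a structured record. The field names and the code-review example are purely illustrative – they are not part of any prescribed retrospective format.

```python
from dataclasses import dataclass


@dataclass
class Hypothesis:
    """One retrospective experiment, phrased so it can later be checked."""
    background: str         # the situation that prompted the action
    theory: str             # the general theory or accepted approach we lean on
    actions: list[str]      # the x, y, z we commit to doing
    expected_outcome: str   # what we expect to happen
    measure: str            # how we will measure it
    predictions: list[float]  # the l, m, n we predict for those measurements


# Hypothetical example: pull requests sit unreviewed for days.
example = Hypothesis(
    background="Pull requests regularly wait days before anyone reviews them",
    theory="Smaller batches flow faster (lean flow theory)",
    actions=["limit pull requests to 200 changed lines", "review twice a day"],
    expected_outcome="Review turnaround time drops",
    measure="median hours from pull request opened to merged",
    predictions=[24.0],  # we predict a drop from roughly 48h to roughly 24h
)
```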
Measurement is critical. You need to know your baseline measurements before you start your experiment, so that you have something to compare your later results against! If you are uncertain about “how to measure anything,” we highly recommend the book with that title by Douglas W. Hubbard. The measurements can be objective or subjective. Even better, you can look for ways to have a “control group.” For example, some people on your team try a new method while another team member keeps doing it the old way. Or you might be able to compare across teams.
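As a small sketch of comparing a baseline against later results – including a simple “control” – consider the following. The numbers are invented and stand for, say, review-turnaround hours; the point is only that the same measure is taken before and after, for both groups.

```python
from statistics import mean


def relative_change(baseline: list[float], after: list[float]) -> float:
    """Relative change between baseline and post-experiment measurements."""
    return (mean(after) - mean(baseline)) / mean(baseline)


# Hypothetical review-turnaround hours: the part of the team trying the new
# method versus the "control" member who keeps working the old way.
experiment_group = {"baseline": [46.0, 50.0, 47.0], "after": [22.0, 26.0, 24.0]}
control_group = {"baseline": [48.0, 45.0, 49.0], "after": [47.0, 46.0, 48.0]}

print(f"experiment group: {relative_change(**experiment_group):+.0%}")  # about -50%
print(f"control group:    {relative_change(**control_group):+.0%}")     # close to 0%
```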
Only with a hypothesis and an experimental plan with adequate measurements will you be able to tell whether you need a different action or have to rethink your hypothesis. In our book on company-wide agility, we call this process “probing”; it helps you not only improve continuously but also innovate sustainably.
There is one more aspect to probes: the possibility of publication. “What happens in a retrospective stays in the retrospective” is a familiar mantra and must, of course, be respected. However, if you discover a process or procedure that seems to make a big difference, you may wish, with everyone’s consent, to tell others about your findings. Publication, in ways that respect the privacy of the retrospective, could be very important because the act of publishing your results asks others to try to replicate what you think you have learned. You will learn from their experiences and validate that what you have uncovered is truly a new discovery.
So consider developing probes in your retrospectives, becoming more like scientists and less like detectives.
Let us know how it goes!
Jutta & John
Jutta Eckstein
Jutta Eckstein works as an independent coach, consultant, and trainer. She has helped many teams and organizations worldwide to make an Agile transition. She has unique experience in applying Agile processes within medium-sized to large distributed mission-critical projects. Jutta recently pair-wrote a book with John Buck entitled 'Company-wide Agility with Beyond Budgeting, Open Space & Sociocracy' (dubbed BOSSA nova). She has also published her experience in the books 'Agile Software Development in the Large', 'Agile Software Development with Distributed Teams', 'Retrospectives for Organizational Change', and, together with Johanna Rothman, 'Diving for Hidden Treasures: Uncovering the Cost of Delay in your Project Portfolio'.
Jutta is a member of the Agile Alliance (having served on its board of directors from 2003 to 2007) and a member of the program committee of many different American, Asian, and European conferences, where she has also presented her work. She holds an M.A. in Business Coaching & Change Management, a Dipl.Eng. in Product-Engineering, and a B.A. in Education.