I just read Thinking, Fast and Slow by Daniel Kahneman. Kahneman is a winner of the Nobel Prize in Economic Sciences and has done a lot of research in psychology. The book is a great overview of a vast area of psychology and of a lifetime of his and others' research. It gives a very good picture of how the mind works in different situations and of the problems this commonly leads to.
I found it interesting that so many of the concepts he writes about are directly applicable to teams, organizations and software development. This is a short summary of some of those concepts and my thoughts on how they apply to agile software development.
Note that all of these concepts, effects and behaviors have been validated over and over in many studies – so even if you believe they are not true for you or your team, they most likely are.
Question substitution is replacing a hard question with a simpler one – without being aware of it. An example from the book is replacing the question “How happy are you these days?”, which is very hard for humans to answer, with the easier “How is your mood right now?” – or judging a political candidate mostly by how much they look like a leader rather than by their political views.
In our setting, my guess is that this happens a lot. Consider the question “How likely is it that this project will be finished on time?” – it is a very difficult question to answer and requires a lot of analysis and thinking. It is much easier to answer “How do I feel about my current tasks?” or “How likely am I to finish my tasks on time?”, and I believe that this is the question most team members actually answer when asked how likely the project is to finish on time.
It has also been shown that if you like something, you tend to overestimate its positive effects and underestimate the risks and negative effects. This means that if you believe in the product or new feature you are building, you will likely judge its benefits to be bigger than they actually will be, and the risks of building it to be smaller than they really are.
These are serious biases that need to be taken into account when prioritizing and estimating features or projects.
Smiling and frowning
When you work on a very hard task, you start frowning, and when a task is easy, you smile ever so slightly. It turns out that this works the other way around as well – if you are frowning, solving a task will actually take a bit longer, require more energy and fail more often.
The lesson is clear – make sure that you create an environment where people are smiling a lot, especially when trying to solve tricky problems.
When we engage in tasks that require hard cognitive work, the body uses a lot of glucose. This depletes our energy fairly quickly, which has a huge effect on our ability to perform cognitive tasks and make logical decisions. One study showed that judges in Israel approved parole in as many as 65% of cases right after a meal, with the approval rate then dropping significantly, to close to 0%, over the following two hours.
So when you conduct meetings, retrospectives and planning sessions – make sure that all participants can refill their glucose levels every hour, or they will not be very effective.
The author describes several states of mind in which a person is more likely to listen to their gut feeling and disregard facts or critical thinking. One that stands out for me is the feeling of power – when people feel powerful, they are more likely to trust their intuition and disregard facts. This is worth keeping in mind when working with management, team leads and product owners – they might need an extra reminder to look at the numbers, take a step back, consider something more thoroughly or challenge their intuition.
The anchoring effect is a phenomenon where hearing a number greatly influences your estimate of something. This holds true even if you know that the number has no connection to what you are estimating, or that it is random or obviously false. The scary thing is that the effect holds not only when guessing numbers you don’t really know (such as Gandhi’s age when he died), but also when deciding numbers that are part of your expertise. It has been demonstrated with real-estate agents estimating the price of a house, judges deciding prison sentences and software teams estimating the size of a project.
The anchor index in different studies is usually in the range of 30–50%, which means that raising or lowering the anchor will move the estimate by 30–50% of that change – a huge effect.
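To make that arithmetic concrete, here is a tiny Python sketch of what a 40% anchor index implies. The scenario and all numbers are invented for illustration, not taken from any study:

```python
def anchored_estimate(unanchored, anchor, anchor_index=0.40):
    """Simple model of anchoring: the final estimate moves toward
    the anchor by anchor_index of the gap between them."""
    return unanchored + anchor_index * (anchor - unanchored)

# Hypothetical scenario: a team would say 4 months on their own,
# but a stakeholder has already mentioned 8 months.
estimate = anchored_estimate(4.0, 8.0)
print(estimate)  # the estimate lands around 5.6 months instead of 4
```

In other words, with a 30–50% anchor index, even a casually mentioned number can drag an expert estimate a third to half of the way toward itself.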
This means that if we ask our team to estimate a project that we believe will take approximately 5 months, that number will act as a very strong anchor and any estimate will be heavily influenced by it. Always try to avoid anchoring if you can – this is one of the reasons why planning poker, where everyone reveals their estimate simultaneously, is so useful for creating estimates within teams.
Plans and forecasts tend to be unrealistically close to a best-case scenario, and could usually be improved by taking existing statistics from similar cases into account. One reason for this is that we plan for everything we can foresee, add some margin for the risks we know about, and are completely unable to account for the risks we cannot foresee.
If you work as a project leader in a company that has completed only 20% of its projects on time, on budget and in scope, then a good guess is that you have only a 20% chance of doing the same. But having the full project planned out will make you feel very confident, and you will most likely judge your chances of success as much higher.
The proposed way of avoiding the planning fallacy is to take the outside view and use statistical information from similar ventures. When a team estimates how long it will take to release their feature on a new platform, also have them look at how long it has taken other teams or companies to release features on that platform. A good technique is to start from that baseline (how long it has taken other teams) and adjust it based on specific information about your case. If the team's estimate differs a lot from the baseline, challenge them to explain why they have reason to believe they will be faster or slower than most other teams. There are most likely unknown unknowns that they are not accounting for.
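The baseline-plus-adjustment technique above can be sketched in a few lines of Python. All numbers and the adjustment below are made up purely for illustration:

```python
from statistics import mean

# Hypothetical reference class: how long (in weeks) comparable teams
# took to ship a similar feature on this platform.
reference_class = [10, 14, 9, 16, 12, 13]

# The outside view: start from the base rate, not from your own plan.
baseline = mean(reference_class)

# Adjust only for specific, defensible differences in your case.
# Example (invented): the backend API already exists.
known_adjustments = {"backend API already exists": -2}

outside_view_estimate = baseline + sum(known_adjustments.values())
print(f"baseline: {baseline:.1f} weeks, adjusted: {outside_view_estimate:.1f} weeks")
```

The point of the exercise is that every entry in the adjustments list has to be argued for explicitly, which makes it much harder to quietly assume you will beat everyone else's record.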
Individuals and organizations tend to be overly optimistic and confident about their chances of success. The more coherent your constructed plan feels, the more confident you will feel about it. One way to counter this in teams and organizations is to run premortems.
We are all familiar with postmortems – gathering everyone after the end of a project or iteration to figure out what went wrong and how we could have improved. The only problem is that postmortems are held after the fact. Instead, the book suggests that before committing to any major decision you should run a premortem session, where you ask the attendees to “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.” This is a great way to get a team to think about risks and doubts that they have been unconsciously suppressing because they feel confident about the current plan.
These examples are only a small part of what the book covers, and I really recommend that everyone read it – it will change the way you think about your own rationality and how decisions are made.