11 psychological traps in software engineering
By Tomasz Kuczma
There are certain common behavioral patterns and effects that you can observe while working with people. Knowing them helps me anticipate others' behavior, avoid many mental mistakes, and make better decisions. Some of these psychological traps and patterns are especially noticeable in software engineering. Today, I would like to explain my top 11 psychological traps.
Dunning–Kruger effect
Let’s start with my favorite one. This effect was identified in 1999 by two American social psychologists and professors, David Dunning and Justin Kruger. It says that people with low ability (little knowledge of something, or those who have just started exploring an area) tend to have high self-confidence and to believe that their skills and knowledge are much better than they really are (illusory superiority). On the other hand, people with moderate ability tend toward low self-confidence and underestimate their real knowledge. People with very high ability start regaining self-confidence, but they rarely reach the imagined confidence of people with low skills.
It’s usually demonstrated using this chart:
Image source: wikipedia.org
In my opinion, that is the main reason why junior engineers are so eager to rewrite code or add something to it, and why they underestimate the time needed for development.
Impostor syndrome
This is my second favorite psychological effect, and not because of the recent popularity of the game Among Us :) It’s highly noticeable in STEM (science, technology, engineering, and mathematics).
In simple words, it is believing that you are not as competent as others see you to be. There can be a ton of empirical evidence of your competence, but you will still doubt your own skills. It’s not about modesty. People with this syndrome can have a serious fear of being exposed as a “fraud”. Some people turn down promotions or even quit their company because of the related stress.
The fun fact is that early research focused only on high-achieving women, but modern research shows there is no gender preference here. Harvard Business Review, in the article “The Dangers of Feeling Like a Fake”, shows that it’s a common syndrome in executive positions around the world! In my opinion, it is noticeable in many software engineers’ behaviors as well.
Cognitive biases
The rest of the traps are basically cognitive biases. We try to make rational decisions, but sometimes we make these mistakes and, as a consequence, our actions deviate from the correct rational ones. These incorrect decisions can sometimes cost us more than we think. I see two main categories.
Ego-based
- Belief bias - we prefer what we already know or believe in (an algorithm, solution, software system, programming language) and are more willing to accept arguments in discussions that align with that knowledge and to reject counterarguments.
- IKEA effect - people tend to overvalue their “own babies”. The IKEA effect is especially noticeable for things we partially build ourselves, like furniture from IKEA. So we will overvalue and “defend” the computer systems we built ourselves (e.g. a custom monitoring solution for our system).
- Egocentric bias - when somebody claims more credit for the results of a group’s work than they really deserve.
Incorrect probability management
- Normalcy bias - refusing to plan for a disaster that has never happened before. People (incorrectly) assign zero probability to things that have never happened and assume they will never happen in the future either. One example is the recent (March 2021) fire at OVH that destroyed one data center and knocked two others offline. OVH asked customers to execute their disaster recovery plans, but many complained that they didn’t have any.
- Neglect of probability - a tendency to overrate low-probability risks and underrate high-probability ones.
Others
- Automation bias - we trust information coming from automated decision-making systems (aircraft cockpits, Jira tickets) more than other sources. If they are contradictory, we tend to prefer the automated system and ignore the other source. We simply forget that computer programs and devices can have bugs.
- Halo effect - the first-impression effect. You assume things about a person based on the first impression they made, even for traits the person has not demonstrated yet. E.g. you assume somebody is a good manager because they came to a meeting on time and were polite (no management skill has been demonstrated). It also works negatively: e.g. you didn’t like somebody’s high-pitched voice and assumed they were not a responsible person. Of course, some people deliberately exploit this bias during job interviews, selling themselves on a strong first impression.
- Hindsight bias - when we analyze the past, things look much simpler, and we tend to say “it was easy to predict” or “we should have done that instead”. In the past, we didn’t have all the information we have now, and that’s why we made a different decision back then. Now that all the facts are known, it’s easy to claim we would have behaved differently, but that’s a mistake: we didn’t know everything at the time. It’s a common error when writing post-mortems. It’s also very common on the stock market, where people say “I would have bought there and sold here” when looking at a complete chart, but nobody can say “buy now” or “sell now” when they only have the left side of the chart (the future price is unknown).
- And of course there are discrimination-based biases (gender, age, nationality, etc.), but those, I think, are quite obvious.
Conclusions
It is good to be aware of these mental traps. This knowledge has helped me make better decisions in my own career and for the companies I worked for. It has already prevented many months of unnecessary work: avoiding work that is not a priority, cutting off useless discussions, and helping my colleagues grow. It has also given me a lot of self-confidence and better skills for finding consensus in difficult situations. I hope you find this knowledge useful and that it will help you and others too.
A passionate software engineer, interested in computer networks and large-scale distributed computing. He loves to optimize and simplify software at various levels of abstraction, from memory ordering through non-blocking algorithms up to system design and end-user experience. Geek. Linux user.