“The Human Side of Software Development” may just be a tacky slogan I came up with on the spur of the moment to make my WordPress install a tad less generic, but the sentiment behind it is genuine, and something I’ve always meant to expand on in this blog. So here goes, with a review of a decidedly non-technical book.
One thing I’ve been convinced of since I first read Peopleware is that human factors cause more of the problems in the average software team than technical issues do. It’s not just that human problems exist in our teams and are difficult to solve; it’s that we don’t learn from our mistakes. Dev never talks to marketing. Engineers blame the testers, and testers blame engineers. Management write off all techies as difficult to manage. These tropes play out again and again in thousands of teams, and we still don’t seem to have a clear idea of what the underlying problem is, let alone what to do about it.
Nor is this just a matter of individual learning. Teams and whole organisations need to learn from their mistakes, so that we don’t end up pulling in different directions or, even worse, leave the lone people who feel they have solutions powerless to influence the herd.
Enter Chris Argyris, Professor Emeritus at Harvard Business School, who has spent a lifetime researching topics like these. His book Knowledge for Action attempts to tackle one of the most crucial barriers to this sort of organisational learning, namely the defensive habits and routines that make it impossible for organisations to change. Argyris paints an all-too-familiar picture of an organisation where everyone is overtly committed to effecting some change, but politics creeps in, fights break out, and people tacitly cooperate in undermining their own efforts.
Argyris’s main contention is that attempting to change organisations throws up situations of embarrassment or threat, and that people respond by avoiding the difficult issues. Moreover, people silently collaborate in this because it’s in nobody’s interest to uncover the threatening material. The case study at the centre of the book develops the author’s hypothesis that by changing our fundamental internal model of the world (taking the focus off winning/losing and putting it onto objectively verifying our beliefs about others), our individual and team behaviour will naturally follow.
I suspect that two aspects of this book will appeal to those of a technical persuasion. First, the book is research-based and as precise in its analysis as the subject matter allows. This is not some faddy airport self-help guide for middle managers. Second, the approach is the quintessentially nerdy technique of changing the second derivative of the problem: not dealing with things that are bad, or even with how to make them better, but with how to improve the ‘making better’ process itself. Engineers will, I hope, intuitively see the potential for huge leverage in getting this right.
Unfortunately, I can give this only a qualified recommendation for readers from a technical background. Yes, it’s a good book and a great contribution to the growing body of knowledge, but ultimately it’s still a piece of social science research, and the author is clearly writing for other academics with a similar background. I probably read more “soft” science research papers than the average techie, and I found it pretty hard going at times.
So this is really one for the enthusiasts, or those who’ve already read The Fifth Discipline and want to take it further.