Saturday, 1 February 2014

Adopting The Rational Stance

As a programmer, it's not unusual to have to justify my position or think through a problem aloud, as well as listen to others do the same. We argue about logic, about solutions, about frameworks, about coding practices - and all this in an environment where no two people completely agree on anything. Yet despite this, what doesn't happen (at least not in my experience) is one programmer dismissing a proposition by hunting for some emotional "reason" its proponent might happen to hold it.

Are computer programmers under the illusion that humans are purely rational beings, having failed to grasp the role of emotion in cognition? Perhaps, though it seems unlikely that programmers are exempt from the human tendency to see rationality in their own views and emotion in the views of others. What seems more likely is that the question is simply irrelevant to the task at hand. That is to say, even if programmers aren't fully rational, it's still right to adopt the rational stance.

Yet this only seems odd in light of online discussions, where experience has taught me the harsh lesson that rationality is at best a front. Unconstrained by a shared goal, folk psychology tends to dominate. It seems quite ironic that in a state of anonymity we get even more personal.

What I fail to see, however, is much of a distinction between the two activities. To be sure, programming might often benefit from being highly constrained compared to some of the more open questions that people tend to fuss over, yet the aim of the activity is fundamentally the same. It's not whether we can be fully rational, but whether we ought to adopt the ideal of trying to be rational. Failure to do this would be like playing chess for the purpose of flipping over the board.

Any debate over ideas is an invitation to adopt the rational stance - to treat a problem as an object of rational thought, and assess its relative merits as if it were put forward by a rational agent. The goal, normatively speaking, is not to worry about how an idea is held, but whether holding the idea is warranted. No easy task in practice, but as an aim it's evidently achievable. Programmers do it every day, and there's nothing special about programmers.


Ron Tijhaar said...

What you suggest here is that there is some divide between rational thinking and non-rational thinking, say emotional thinking. I think there is no real divide between the two in human discourse, any human discourse. There may be degrees of rationality involved, sure, and as a skeptic myself I think it is in many cases wise to try to maximize the rational part. But it may be unwise to think no emotion is involved at all. Moreover, it may be the case that emotion and intuition serve an important role when it comes to choosing between options in practice. A strong suggestion for the latter comes from the fact that rationality as such lacks a choice-making mechanism when the available data is insufficient to distinguish between the options at hand.

This may sound far-fetched, but I think it easily translates to the practice of a software development team, in which I have spent years of my professional life. Say there is a team meeting and the team has to decide on how to proceed with a certain feature X of the system under development. In many cases I have encountered, the data brought to the table by the various disciplines involved is insufficient to choose between options A, B and C on a rational basis alone. You could decide to gather more information in order to make a better judgement at a later moment, but time and budget are restricted, and the project manager will at some point in the process demand that a decision be taken. Typically one of the options, say option A, is the better one from the perspective of minimizing short-term development effort and therefore short-term cost. And in many cases another option, say option C, is the better one from the perspective of application management and cost-effectiveness in the long run. But the information about the expected lifespan of the system, the expected load on the system and ten other factors involved is unclear and depends on other factors such as management decisions still to come, market response, dependencies with other systems, innovation trends, stakeholder commitment and so on. You cannot make a complete and precise picture of this no matter how hard you put rationality to work. This is where wiggle room for non-rational argument arises. This is where communication skills of participants involved outperform rationality skills.

From my own experience I am inclined to say that the above mechanism is the reason why most business decisions are not primarily rationality-driven but primarily emotion-driven. It is also why the software landscape of large companies - despite all efforts to architect it up front - more often looks like a haphazard construction than a well-architected one.

In my experience, many design decisions in a software development project come down to short-term versus long-term arguments. And they typically cannot be resolved on the data available at hand.

Kel said...

"This is were communication skills of participants involved outperform rationality skills."
Yeah, I've found this too.