Suppose we want to improve research output from universities in Pakistan. The ‘result’ we are interested in is publications in high-quality journals. RBM starts by measuring the output — number of articles, perhaps weighted by an impact factor. Then we provide incentives to increase this output. For example, the number of articles is made the basis of promotions, pay raises and other evaluations of faculty and institutions. As this RBM practice has become popular the world over, the number of ‘fake’ journals has grown explosively. There are now a large number of journals which will quickly publish any kind of garbage upon payment of a hefty fee. Academics churn out paper after worthless paper to increase their publication count, without any consideration of whether the work adds useful or relevant knowledge. Personally, I have seen many CVs containing 10 or more papers that apply a single esoteric and unreliable technique to different data sets, as well as CVs listing 20 papers published in a single year in fake journals. The extraordinary increase in noise due to RBM has made it difficult to distinguish good research from bad. The genuine researcher may have only a few papers of high quality, while a competitor has 20 or more marginal papers in low-quality journals. The most important things — like quality and depth of ideas — cannot be measured. This is just one of many examples. Whenever a number is used to measure performance, people start playing number games instead of improving genuine performance.
The story of McNamara in Vietnam is an amazing example of how numbers blind us to human realities. With the arrogance common to fresh MBAs, McNamara dismissed the advice of his experienced generals and applied the scientific and quantitative approach to war. He calculated that the number of Viet Cong guerrillas was limited, and used formulas to try to achieve kill rates that would wipe them out within two years. Huge emphasis was placed on counting kills, and sorties sent out to count bodies often handed the enemy more kills than the original engagement had. Yet all this measurement failed miserably, as the extreme and wanton cruelty to civilians turned the Vietnamese against the Americans. More killings created even more guerrillas, completely upsetting the quantitative calculations.
PBM pays much more attention to the human dimension. To improve research output, we would think about what leads human beings to produce great research. How can we inspire and motivate people to struggle hard to acquire and produce knowledge? The single most important factor is an environment which encourages research and provides appreciation for intellectual effort. Good researchers are driven far more by passionate commitment than by material rewards. Thus PBM would focus on creating an environment which nurtures research. In particular, developing research-based social networks and connecting students with inspiring role models would be crucial. PBM deals with soft and unquantifiable goals suitable for humans, instead of hard quantitative goals suitable for machines. Rejecting the MBA religion of measurement, PBM is based on common sense. We human beings manage diverse and complex dimensions of our lives, juggling friends, family, and social and professional responsibilities without measuring and quantifying. Business tasks are no different.
Published in The Express Tribune, November 30th, 2015.
Like Opinion & Editorial on Facebook, follow @ETOpEd on Twitter to receive all updates on all our daily pieces.
COMMENTS (2)
Comments are moderated and generally will be posted if they are on-topic and not abusive.
For more information, please see our Comments FAQ