The practice of social science research in Ethiopia: some observations-I
Part I: Obsession with Causality
In two blog posts, I would like to raise an issue that has been boggling my mind and making me question the very importance of my own profession—applied economics. Undoubtedly, this is an era of empirical research fueled by improved access to data and advancements in computing power, which are in fact making my job as an applied economist so much easier. Why do I have a problem with applied economics then? Two things: obsession with causality and fancy models!
The core purpose of empirical social science research is to understand cause and effect: what causes what. The aim is to uncover not just mere association (correlation) between one phenomenon and another (or what the research community calls variables). Correlation studies only tell us which variables co-vary but fail to tell us which one is causing which. Worse, variables may co-vary when they are, in fact, causally unrelated, simply because both happen to be driven by a third common variable. Knowing which variable is causing what, and by how much, is very important: it enables us to steer a phenomenon in the preferred direction by changing its causal variable.
Let me give you a classic example of how an unidentified third factor can trick us into believing two phenomena are causally linked. In the 90s, a series of studies claimed that listening to Mozart’s music makes children smarter. This result became known as the Mozart effect. The findings grew so popular that the governor of the US state of Georgia proposed a budget to finance the distribution of classical music CDs to children. But subsequent studies indicated that the ‘Mozart effect’ is mainly driven by heightened arousal or mood, which can also be generated by other stimuli, such as listening to a pleasant and engaging story.
We humans tend to conflate correlation with causation and draw conclusions from simple correlations. It is thus understandable why social science research emphasizes establishing causality. My own profession, economics, is no exception; if anything, it takes this further than most. The increased interest in causality has spawned ingenious econometric tools, along with novel survey designs and experiments that help establish causality.
However, the preoccupation with causality in applied social science research is becoming a stumbling block to its very own purpose—pushing the frontier of knowledge. The heightened interest in causality is driving researchers away from asking exciting research questions of paramount importance to society. Increasingly, researchers gravitate toward looking at the data first and then figuring out which causal research questions they could plausibly answer given that data. Yes, some interesting research questions are inspired by data; data sometimes make us ask interesting questions. The point here is that intriguing research questions also deserve researchers’ attention even when establishing causality with the available data is a long shot.
Researchers should come to terms with the fact that they do not always have the luxury of waiting until sufficiently large data sets are gathered from a random sample to make statistical inferences. They should be willing to use sparse qualitative and quantitative data, together with their analytical skills, to shed some light on emerging issues, rather than leaving them altogether to journalists and activists. Of course, as more data become available, they can return to the subject and provide deeper insights.
Causality is about precision. It is understandable that it holds a special place in the developed world, where researchers may have little to contribute beyond continually refining the vast knowledge already accumulated over a long history of scientific research. In contrast, for developing countries such as Ethiopia, where we have little grasp of numerous issues, the obsession with causality is simply out of touch. After all, establishing causality in social science research is easier said than done. Let’s be honest.