Learning Objectives
By the end of this section, you will be able to:
- Explain the rationale behind the principle of data access and research transparency
- Understand the benefit of increased openness in quantitative research
While quantitative/statistical analysis, when used properly, can yield powerful evidence to support one’s theoretical claims, improper use of such techniques can ultimately undermine the integrity of both the quantitative method and the research being conducted. Without proper precautions, statistics can lead to misunderstanding as well as to intentional misrepresentation and manipulation of findings.
One of the most important considerations when applying the quantitative method to one’s research is to make sure that the principle of objectivity, which is at the heart of the scientific method, is reflected in practice (Johnson, Reynolds, and Mycoff 2015). In other words, in addition to presenting the information in as objective a manner as possible, one must ensure that all information relevant to interpreting the results is accessible to readers. In practice, this principle implies that a researcher should not only provide access to the data used in a research project but also explain the process by which she reached the conclusions presented in the research. This resonates with the current discourse on data access and research transparency in the political science discipline.
The most recent work on data access and research transparency in the political science discipline was born out of concern among practitioners that scholars were unable to replicate a significant proportion of the research published in top journals. For the discipline to advance knowledge across its subfields and methodological approaches, the principles of data sharing and research transparency became ever more relevant to its discourse. The idea is that evidence-informed knowledge needs to be accessible to members of other research communities whose work may rely on different methodological approaches. In response to growing concern about the lack of norms of data sharing and research transparency among practitioners across methodological communities and substantive subfields, the American Political Science Association (APSA), the national professional organization for political scientists, produced ethics guidelines to ensure that the discipline as a whole can advance a culture and practice of data sharing and research transparency.
The recently updated ethics guidelines published by APSA, discussed in Lupia and Elman (2014), state that “researchers have an ethical obligation to facilitate the evaluation of their evidence-based knowledge claims through data access, production transparency, and analytic transparency so that their work can be tested and replicated”. According to this document, quantitatively oriented research must meet three prongs of research ethics: data access, production transparency, and analytical transparency. When conducting quantitative political research, all three need to be incorporated for the work to meet the ethical standard.
First, researchers must ensure data accessibility. Researchers should clearly reference the data used in their work, and if the data were originally generated, collected, and/or compiled by the researcher, she should provide access to them. Many journals have already adopted this practice by making access to the data used in a manuscript a condition of publication. Some researchers also include the code and commands used in statistical software, such as Stata, SAS, and R, so that others can replicate the published work.
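To illustrate, a shared replication script might look like the following minimal Python sketch. The variable names, numbers, and model here are hypothetical illustrations, not data from any real study; an actual replication script would load the dataset deposited in a public archive rather than an inline table.

```python
# Minimal sketch of a replication script a researcher might share
# alongside a published article. All variable names and values are
# hypothetical; a real script would read the archived dataset, e.g.
# from a CSV file deposited alongside the article.

# Each record: (log GDP per capita, conflict onset indicator)
rows = [
    (6.2, 1), (7.1, 1), (8.3, 0), (9.0, 0), (7.8, 1), (9.5, 0),
]

def ols_slope(data):
    """Ordinary least squares slope of y on x, computed from scratch
    so the analysis step is fully transparent to the reader."""
    n = len(data)
    mean_x = sum(x for x, _ in data) / n
    mean_y = sum(y for _, y in data) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in data)
    var_x = sum((x - mean_x) ** 2 for x, _ in data)
    return cov_xy / var_x

# Re-running this script reproduces the (hypothetical) published estimate.
slope = ols_slope(rows)
print(f"estimated slope: {slope:.3f}")
```

The point of such a script is that anyone with the shared data can re-run the exact commands that produced the reported estimate, rather than trusting a number printed in a table.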
Second, researchers need to practice production transparency. Not only should the researcher share the data themselves, but she also needs to provide a full account of the procedures used to generate and collect the data. First and foremost, this principle provides a safeguard against the unethical practice of misrepresenting or inventing data. Perhaps the most famous recent case of data fraud in political science research is the one involving Michael LaCour (Konnikova 2015). He completely fabricated the data that he and his co-author Donald Green used in research whose findings many political scientists considered miraculous. Only when two UC Berkeley graduate students, David Broockman and Josh Kalla, tried to replicate the study and contacted the firm that LaCour had supposedly used to collect the survey data was it revealed that LaCour had entirely made up the “survey data” the authors used in their research.
Finally, researchers need to ensure analytical transparency: the link between the data and the conclusions of the research must be clearly delineated. In other words, a researcher must explicitly explain the process that led from the data to the conclusions of the research project. The empirical evidence must be clearly mapped onto the theoretical framework of a given research project. Some scholars are concerned about the implications of such radical honesty in political science research, noting that the probability of successful journal publication may diminish as the level of transparency and radical honesty increases (Yom 2018). As a result, radical honesty in political science research requires institutional buy-in beyond ethical practice at the individual level. Unless such practice benefits a scholar, rather than disadvantaging her, the culture of analytical transparency may not spread to the greater political science community beyond the pockets of ethical practitioners that currently exist.
It is important to note that increased openness in quantitative research provides political scientists with a number of benefits beyond its ethical promise (Lupia and Elman 2014). First, transparency and increased data access allow members of a particular research community to examine the current state of their own scholarship. Through such “internal” self-assessment within a particular subfield of political science, scholars are able to cultivate “an evidentiary and logical basis of treating claims as valid” (Lupia and Elman 2014). In many subfields, validating knowledge requires replicating existing work. When access to quality data is limited, it becomes difficult to determine whether we should have confidence in the research findings presented, and the absence of a culture and practice of data access and research transparency undermines confidence in the subfield as a whole.
In the literature on civil war onset, for example, Hegre and Sambanis conducted a sensitivity study of the findings of various published works (Hegre and Sambanis 2006). Essentially, a sensitivity study examines a numerical measurement (e.g., whether a civil war started or not) under conditions different from the original setting. In this particular case, scholars of the civil war literature use different definitions of when a violent conflict constitutes a civil war. The implication is that some scholars may have included or excluded certain cases from their datasets, which in turn influences the results of their studies. So, one way to conduct a sensitivity study is to use the same definition of, for example, the outcome variable and replicate each study to examine the effect of that change.
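The logic of such a check can be sketched in a few lines of Python. The conflict records, battle-death thresholds, and bivariate model below are hypothetical illustrations, not Hegre and Sambanis’s actual data or specification; the point is only to show how re-coding the outcome variable under a different definition can change the estimate.

```python
# Sketch of a sensitivity check: re-run the same estimate while varying
# the definition of the outcome variable. All data are hypothetical.

# Each record: (log GDP per capita, battle deaths in a given year)
conflicts = [
    (6.0, 2000), (6.5, 900), (7.0, 700), (7.5, 40),
    (8.0, 30), (8.5, 10), (9.0, 0), (9.5, 0),
]

def code_onset(deaths, threshold):
    """Whether a case counts as a civil war onset depends on the
    chosen battle-death threshold."""
    return 1 if deaths >= threshold else 0

def ols_slope(xs, ys):
    """Ordinary least squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

xs = [gdp for gdp, _ in conflicts]
# Replicate the estimate under two different coding rules
# (e.g., 25 vs. 1,000 battle deaths to count as a civil war).
for threshold in (25, 1000):
    ys = [code_onset(deaths, threshold) for _, deaths in conflicts]
    print(f"threshold {threshold}: slope {ols_slope(xs, ys):.3f}")
```

In this toy data, the stricter definition re-codes several cases from onset to non-onset, and the estimated association shrinks accordingly; comparing the two estimates is exactly the kind of sensitivity comparison described above.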
This project was the result of the observation that several empirical results were not robust or replicable across studies. Because the authors of the articles included in the sensitivity analysis practiced the ethical culture of data sharing and research transparency, scholars of the civil war literature were able to reflect on the state of their research community. For members of other research communities, the culture and practice of openness can also make findings more persuasive: the better a reader is able to understand the process through which the researchers reached a particular conclusion, the more likely the reader is to believe and value that knowledge.
Next, the culture and practice of openness help political scientists communicate more effectively with members of other communities, including non-political scientists. This is very important, for our research findings often carry real political and social implications. Generally speaking, good political research must contribute to the field of political science as well as to the real world (King, Keohane, and Verba 1994). Our findings are often used by political actors, policy advocates, and various non-profit organizations whose work affects the lives of many members of the public. For example, Dr. Tom Wong, an expert on immigration policy, has worked as an expert advisor in the Obama administration and testified in various federal court cases to advocate for the rights of undocumented immigrants. He supported his position with his research on the impact of undocumented immigrants, which was written primarily for academics. However, he was also able to communicate with non-political scientists partly because his research reflected the values of data access and research transparency (Wong 2015, 2017).
Although political scientists should adopt ethical research practices for their own sake, it is also quite effective to identify the potential benefits of such practices to their research communities, so that practitioners have an incentive to adopt the culture of data sharing and research transparency until it becomes second nature.