
5.4: Procedural Bias


Another type of methodological bias is procedural bias, sometimes referred to as administration bias. This type of bias stems from the conditions of the study, including the setting and how instruments are administered across cultures (He, 2010). The interaction between the research participant and the interviewer is another source of procedural bias that can interfere with cultural comparisons.

    Setting

Where a study is conducted can have a major influence on how the data is collected, analyzed, and later interpreted. Settings can be small (e.g., a home or community center) or large (e.g., countries or regions) and can affect how a survey is administered or how participants respond. In a large cross-cultural health study, Steels and colleagues (2014) found that the postal system in Vietnam was unreliable, which demanded a major and unexpected change in survey methodology; as a result of these challenges, the researchers were forced to draw more participants from urban areas than from rural areas. Harzing and Reiche (2013) found that their online survey was blocked in China because of the Chinese government's internet censoring practices, but with minor changes it was later made available for administration.

    Instrument Administration

In addition to the setting, how the data is collected (e.g., paper-and-pencil versus online survey) may influence levels of socially desirable responding and response rates. Dwight and Feigelson (2000) completed a meta-analysis of computerized testing and socially desirable responding and found that impression management (one dimension of social desirability) was lower in online assessment. The effect was small, but it has broad implications for how results are interpreted and compared across cultural groups when testing occurs online.

Harzing and Reiche (2013) found that paper-and-pencil surveys were overwhelmingly preferred by their participants, a sample of international human resource managers, and had much higher response rates than the online survey. It is important to note that online survey response rates were likely higher in Japan and Korea largely because of difficulties in photocopying and mailing paper versions of the survey.

    Interviewer and Interviewee Issues

The interviewer effect can easily occur when there are communication problems between interviewers and interviewees, especially when they have different first languages and cultural backgrounds (van de Vijver and Tanzer, 2003). Interviewers who are not familiar with cultural norms and values may unintentionally offend participants or colleagues, or compromise the integrity of the study.

An example of the interviewer effect was summarized by Davis and Silver (2003), who found that African American respondents answered fewer political knowledge questions correctly when interviewed by a European American interviewer than by an African American interviewer. Administration conditions that can lead to bias should be taken into consideration before beginning the research, and researchers should exercise caution when interpreting and generalizing results.

Using a translator does not guarantee that interviewer bias will be reduced. Translators may unintentionally change the intent of a question or item by omitting, revising, or reducing content. These language changes can alter the intent or nuance of a survey item (Berman, 2011) and, in turn, the answers participants provide.


    This page titled 5.4: Procedural Bias is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by L. D. Worthy, Trisha Lavigne, & Fernando Romero (Maricopa Open Digital Press) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.