
    Chapter 17. OWI Research Considerations

    Christa Ehmann

    Smarthinking, Inc.

    Beth L. Hewett

    CCCC Committee for Effective Practices in OWI
    Defend & Publish, LLC

    This chapter examines strategies for researching distance education in the OWI context. It considers overall methodological research approaches that can be employed to engage in consistent and useful investigation of one’s program whether it has a writing course or writing center focus. The chapter also addresses key factors related to choice of research instruments, sample selection, data collection, and analysis, as well as issues of reporting and information dissemination post-research.

    Keywords: automated writing evaluation (AWE), data collection, empirical research, mobile technology, MOOC, methodological design, outcomes, practice, RAD research, replicable, qualitative, quantitative, stakeholder, theory

    A significant challenge facing online education is committing to a deeper understanding of the efficacy, values, and inner workings of OWI (both classroom- and tutor-based); its innumerable, rapidly changing modalities; its distinctive nature; and how it functions in a pedagogical sense. The writing studies discipline awaits viable theories of OWI as a philosophy of writing and as a series of strategies for teaching and learning to write in digital settings. To these ends, the CCCC OWI Committee has reasoned that ongoing research is crucial. OWI Principle 15 stated, “OWI/OWL administrators and teachers/tutors should be committed to ongoing research into their programs and courses as well as the very principles in this document” (CCCC OWI Committee, 2013, p. 31).

    This chapter focuses on how useful, ongoing research into OWI might be developed by examining and highlighting the crucial need for a deeper understanding of OWI; it offers suggestions for developing a rigorous framework of investigation when engaging in OWI. Methodological strategies for researching distance education in the OWI context also are considered. Specifically, the chapter examines overall methods and research approaches that can be employed to engage in consistent and useful investigation of one’s program whether it has a writing course or writing center focus. The chapter analyzes key factors related to choice of research instruments, sample selection, data collection, and analysis. In a final section, the chapter discusses reporting and information dissemination post-research.

    A Need for Research into OWI

    The Internet has had a profound impact across educational contexts, with the teaching and learning of writing among the areas most affected. There are innovative and exciting writing models that are linked inextricably to the online modalities that power them. The education landscape is marked by rapid growth and expansion of online technologies that are used to construct and deliver education and instruction—including writing instruction.

    Systematic and broad-scale national and international data specifically targeting OWI trends do not exist. Overall, however, US higher education has seen considerable and enduring growth in online education. For example, the National Center for Education Statistics reported that over 200 fully accredited online higher education institutions currently operate in the United States (Radford, 2011)—these institutions all provide high-enrollment, core courses that include composition. Further, Eduventures—a leading research and consulting firm for higher education institutions—estimated that in the fall of 2010, 2.78 million students enrolled in fully online programs. This number represents 14% of all higher education enrollments (Aslanian & Clinefelter, 2012). Along these lines, 77% of all four-year university presidents reported that their institutions offered some type of online or hybrid classes (Parker, Lenhart, & Moore, 2011). Worldwide trends are similar; both online programs and enrolled students are increasing yearly. Indeed, even students in traditional onsite courses can expect to access some of their course materials and/or communicative experiences in online settings such as an institution’s LMS or, minimally, email.

    These statistics reflect online learning overall. As this book demonstrates, when we consider OWI alone, there is much that remains unknown. Undoubtedly, there is much to learn about how the changing digital landscape affects writing instruction in online settings. Among a wide variety of possible concerns are issues of accessibility, mobile technologies, and experimental learning formats. We provide the following examples to ground the exigency of developing appropriately sound and potentially helpful OWI research.

    Accessibility, for instance, received primary importance in A Position Statement of Principles and Example Effective Practices for OWI (CCCC OWI Committee, 2013). OWI Principle 1 concerned the need for inclusivity and accessibility in OWI for all students and teachers. Where inclusivity has been more or less retrofitted (Oswal & Hewett, 2013) to online settings in the past, it is provided a paramount position in this position statement. With OWI Principle 1, it becomes clear that every activity in an OWI setting and the technology choices for those activities should be determined with accessibility as a priority for both students and teachers (pp. 7-11). Student audiences, for example, are masked in online settings, making it difficult to know whether they are physically disabled, intellectually challenged, struggling with multilingual needs, and/or socioeconomically disadvantaged. Some students may have none of these specific problems but may be poor or slow readers with challenges in online educational settings (Hewett, 2015a). Without self-disclosure of such vital concerns, teachers may be unaware of particular student needs yet somewhat aware of their struggles as revealed in the student’s writing. Indeed, given a potential for difficulties in handling the increased literacy load of a common OWC (Griffin & Minter, 2013), all students might be considered somewhat disadvantaged in OWI settings (Hewett, 2015a). Too little research has been produced to assist with understanding this phenomenon although scholars are beginning to address this stark need (see, for example, Meloncon, 2013). Even though Chapters 8, 9, and 10 address many access and inclusivity concerns, it is useful to consider here some of the most pressing areas to be researched:

    • Regarding teacher/tutor training for accessibility:
      • What kinds of training, if any at all, do teachers and professional and peer tutors currently receive to provide universally accessible and inclusive OWI?
      • What training, if any, from different college disciplines or other online systems would support writing teachers/tutors in providing more accessible and inclusive OWI?
      • What attitudes do teachers/tutors need to acknowledge, address, and/or overcome to develop more accessible and inclusive OWI?
    • Regarding self-disclosure:
      • How many students self-disclose some type of disability before, during, or after their OWCs? What types of disabilities do students disclose?
      • Under what situations (e.g., pre-course preparatory materials or general encouragement from teachers) do students tend to self-disclose accessibility needs for OWCs?
      • Given legal restrictions from the Family Educational Rights and Privacy Act (FERPA) and the Americans with Disabilities Act (ADA), how can students be provided helpful guidance about self-disclosure?
      • What sort of research partnerships are possible between OWI programs and campus disability services to learn about disabled students’ technological needs, learning styles, and other academic preferences?
      • How, if at all, do teachers (or tutors) modify their work with online writing students to meet access needs?
    • Regarding students who may not previously have been considered as needing particular access:
      • What kinds of accessibility and/or inclusivity needs do multilingual students have in an OWC?
      • Under what conditions should multilingual students receive accessibility assistance in an OWC or with an online tutor?
      • What do we know about the accessibility of online tutoring systems for students with disabilities?
      • What are the socioeconomic factors that teachers/tutors must consider when providing accessible OWI for students?
      • What are the socioeconomic factors that administrators must consider when preparing OWI teachers and tutors?

    In the field of writing studies, there is a wealth of scholarship about multilingual student writers as well as students whose literacy performances are affected by their socioeconomic conditions. But there is a dearth of scholarship about how these student populations perform in OWI contexts.

    As a second example of the need for continued OWI research, the trend of using mobile devices and cell phone technology for OWI is among the changes in digital technologies discussed in Chapter 16 of this book. Using such technology in writing courses undoubtedly will have an impact on writing instruction and student competency (Ehmann, 2012) as well as on student reading literacy (Hewett, 2015a). There is growing evidence that an increasing number of students use tablet and even mobile phone technology for educational purposes. Apple alone counts 1.5 million iPads in use by students in K-12 US schools (Kessler, 2012). In early October 2012, Piper Jaffray analyst Gene Munster released survey results showing that iPhone ownership also is growing at a rapid rate. According to Munster’s findings, 34% of surveyed high school students now own an iPhone, and 40% said they planned to buy one in the next month (CourseSmart, 2012). A May 2012 survey released by CourseSmart estimated that 93% of university students own laptops, 57% own smartphones, and 22% own tablets (CourseSmart, 2012). As students become ever-more attached to their mobile technology, online learning opportunities via mobile devices undoubtedly will expand. A 2011 Pearson Foundation study on US students and tablets reported the following findings:

    • Seventy percent of high school seniors and university students would like to own a tablet device.
    • Twenty percent of college students and 7% of university-bound high school seniors planned to purchase one in the next six months.
    • Sixty-nine percent of university-level students reported that they think tablet computing will change education in the future.
    • Sixty-three percent of students surveyed reported that tablets can enhance education.
    • Almost half of the surveyed students expect digital textbooks to replace paper textbooks within the next five years (p. 2).

    In theory, there are some interesting opportunities for learner engagement with mobile devices and the development of writing skills. For example, students can easily and collaboratively share information and ideas with each other through this very social technology—as well as enjoy easy access to peer reviews. Research questions surrounding such mobile technology choices for OWI include:

    • Regarding flipped classrooms:
      • Contemporary composition courses typically are taught in a “flipped” manner—with lecture rejected in favor of in-class practice and peer workshops—because it makes sense to have students perform the hands-on or direct engagement aspects of the learning process in the presence of the teaching professional. Given this scenario, how can mobile devices facilitate the flipped classroom?
      • Are mobile devices the best or just one of several tools for such activities?
      • What are their best uses in a writing instruction context?
      • How do these technologies enable or inhibit accessibility and inclusion?
    • Regarding anywhere/anytime learning:
      • In what ways, if at all, do students actually use such mobile devices for educational writing experiences?
      • What is student satisfaction level with such uses?
      • In what ways, if at all, does the use of these devices hinder or support writing improvement from the teacher’s perspective? From the student’s perspective?
      • To what extent and in what ways do these technological tools promote learning and engage participants?
    • Regarding writing conventions:
      • In what ways, if any, do students follow or reject traditional writing conventions, to include Standard Academic English, when they use mobile devices for educational writing experiences?
      • What kinds of writing skills do students need and use for texting, tweeting, and other instant messaging modes, which are twenty-first century skills necessary both in the workplace and in daily life?
      • How do (or can) writing teachers address and/or support such writing skills?
      • In what real contexts do the discourses of the academy and online discourse overlap, and how are these facilitated (or not) by mobile technology in writing course settings?

    Houghton Mifflin Harcourt conducted a US study from 2010 to 2011 on one of their mobile applications that teaches Algebra I and reported positive results. Although very small, this study is widely cited as evidence for the efficacy of mobile learning within the sciences (HMHEducation.com, 2012). Such a study for writing has yet to be published.

    Another relatively recent movement that is gaining attention and points to the significant growth of online learning with potential impact on writing instruction is that of MOOCs. Targeting learners world-wide, MOOCs are an experimental form of online courses available to large audiences of learners through open Web access. MOOCs developed out of the open educational resources movement as “a response to the challenges faced by organizations and distributed disciplines in a time of information overload” (Cormier, 2010). According to Dave Cormier (2010), the most important feature of a MOOC is that it “builds on the active engagement of several hundred to several thousand ‘students’ who self-organize their participation according to learning goals, prior knowledge and skills, and common interests.” Structurally, MOOCs may be similar to some college-level programs. Typically, MOOCs do not offer the college credits that paying students at colleges and universities receive although some colleges have done so (Koller, 2012).

    The CCCC OWI Committee’s A Position Statement of Principles and Example Effective Practices for OWI currently lists MOOCs as an experimental use of educational technology and OWI Principle 6 suggested that they “should be subject to the same principles of pedagogical soundness, teacher/designer preparation, and oversight detailed in this document” (p. 16). A study of such experimental uses of educational technology has not yet been published for composition or writing-enhanced disciplinary courses, although data have been collected on the four institutions (i.e., Duke University, Georgia Institute of Technology, San Jacinto Community College, The Ohio State University) that won Gates Fellowship Grants to establish MOOCs for writing courses. Questions for such research on MOOCs, adaptable to other experimental OWCs, might be:

    • Regarding the philosophies driving contemporary composition:
      • In what ways, if at all, do MOOCs support core writing studies philosophies?
      • In what ways do they not support core philosophies?
      • How, if at all, do writing MOOCs extend core writing studies philosophies and/or develop new strategies helpful to student writers?
      • The four writing MOOCs piloted in 2013 enrolled a large number of students from around the world. What potential does such global reach of MOOCs offer to writing studies pedagogy?
    • Regarding accessibility, enrollment, retention, and persistence:
      • What student accessibility challenges exist surrounding OWI? How and to what extent can such challenges be addressed?
      • What types of students most benefit from the MOOC environment? In what ways do they benefit?
      • What types of students are most challenged by the environment? In what ways are they challenged?
      • What qualities of MOOCs encourage writing students to enroll?
      • Why do some students persist and others drop out?
      • What measures, if any at all, would encourage greater retention?
    • Regarding the written products that students develop from learning in a MOOC:
      • In what ways, if any, is peer review supportive of a student’s writing development in a MOOC?
      • In what ways, if any, would such students benefit from experienced writers’ (e.g., teaching assistants, teachers, and professional writers) review and response in a MOOC?
      • How, if at all, should student writing be evaluated for course credit when the student has completed a writing MOOC?

    The rapid expansion of such technologies and trends as those listed above and the growing sphere of instructors who engage in digitally enhanced and Internet-based education are evidence that online education in general and OWI specifically are becoming significant within the higher education landscape (Ehmann, 2012; Krause & Lowe, 2014; see also Chapter 18). Although actual technologies, formats, and procedures may change, the Internet has transformed education and the teaching of writing in meaningful ways. Within this exciting context of change and transformation, however, few individuals have investigated the outcomes, processes, and procedures of online teaching and learning in rigorous and empirical ways. This reality also holds true for OWI specifically. Indeed, the empirically based learning science surrounding online writing contexts has a long way to go before replicable results yield convincing learning theories in connection with writing.

    Certainly, the understanding has developed that digitally enhanced or hosted writing instruction is not a replacement for onsite courses but rather a complement to them. Educators have begun to recognize that providing writing instruction in multiple modalities supports writing instruction rather than limits it (Hewett, 2010, 2015a, 2015b; Snart, 2010; Warnock, 2009). OWI thus becomes a substantial tool in a large toolbox that educators use to make learning more accessible to a more diverse population of learners throughout the world, with some students benefiting from online learning and writing instruction more than others (Hewett & Ehmann, 2004). Nonetheless, even the anecdotal literature that exists often avoids the question of what really works and why.

    Many experts have discussed the need to understand more deeply the various pedagogies related to OWI. As seen in the CCCC OWI Committee’s Annotated Bibliography on Online Writing Instruction (2008), which gathered, reviewed, and annotated Web texts, articles, and books from 1980 to 2008, what we know about OWI, what those processes look like, how students learn, and how teachers teach in the online writing environment still is not comprehensive and does not account fully for the intricacies or complexities of participants and contexts (CCCC OWI Committee, 2008). Another annotated bibliography, “Studies Comparing Outcomes among Onsite, Hybrid and Fully Online Writing Courses” compiled by Scott Warnock (2013), studied the notion of difference between traditionally onsite and online (fully online and hybrid) composition courses. Warnock stated, “While generally few differences have been found in terms of educational outcome based on course modality, some studies identify nuanced differences in course experiences” (p. 2). Such nuances tend to affect retention and persistence as well as “student behaviors or performances in online courses in ways that lend themselves to comparison with onsite courses” (p. 2). Yet, such comparisons may be inappropriate for various reasons including that OWI often is scrutinized more than or differently from onsite writing courses, as Warnock indicates in Chapter 4.

    Nonetheless, essential questions as to what distinguishes OWI from composition instruction and learning in onsite settings remain. Such distinctions, reported anecdotally and experientially by practitioners and researchers alike, strongly suggest that OWI may need its own theories unique to student cognition, teacher instruction, and affective dimensions of learning when working online (Hewett & Ehmann, 2004). The most obvious difference from onsite learning is in the affective realm: the loss of real-time body/face/voice connections, which researchers have suggested interferes with developing classroom community (DePew & Lettner-Rust, 2009; Ehmann, 2010; Gouge, 2009). To some degree, these ideas consist of teacher impressions that require thoughtful research to test their veracity. On the other hand, in a few studies that questioned student experiences (see Warnock, 2013, for example), students have reported that their interactions with instructors and peers were similarly satisfactory when compared to those in onsite settings and that their satisfaction was connected in part to the frequency of interactions and prompt feedback from instructors (Boyd, 2008; Johnson & Card, 2008). These issues would benefit from additional research. In particular, affective loss, student progress and retention, and the notion of leveling classroom power would be useful topics for further research.

    Beyond concerns of affective connection, Beth L. Hewett (2010, 2015b) theorized in The Online Writing Conference: A Guide for Teachers and Tutors that OWI requires an increased clarity of written language, what she calls semantic integrity—response to students that recognizes fidelity between what teachers and tutors say in their writing and what they mean. This fidelity enables instructors to express themselves with clear intention and students to interpret their intentions as accurately as possible. Semantic integrity involves using straightforward language that is linguistically direct rather than indirect or suggestive and avoiding such conditional language as rhetorical questions and yes/no closed questions. In Reading to Learn and Writing to Teach: Literacy Strategies for Online Writing Instruction, Hewett (2015a) theorized that the decreased connection of body/face/voice in OWI reflects most strongly in lost cognition. When writing is taught primarily through reading and writing, an increased literacy load (Griffin & Minter, 2013)—where reading and writing are text-heavy and text-rich—taxes students who must make cognitive connections among what they read about writing generally, their writing specifically, and how to apply that information to their own writing-in-progress. Such reading challenges students, but it also challenges teachers who must understand that they are responsible for writing comprehensible instructional text that is straightforward and clear, leading directly back to a need for semantic integrity (Hewett, 2010, 2013, 2015a, 2015b). Indeed, according to David Alan Sapp and James Simon (2005), “Though many writing teachers may have the skills to communicate content and assignment instructions to students online, few have the sophisticated communication skills necessary to connect with students interpersonally, to build trust and rapport in unfamiliar virtual environments” (p. 478). Hewett’s (2015a) literacy-cognition theory of OWI requires additional research both for confirmation and mitigation of its effects on students with various learning needs and styles.

    Although composition theories and some teaching strategies can be migrated and adapted to online settings as described in OWI Principle 4 (also see Warnock, 2009), institutional writing programs cannot simply move or transfer traditional educational methods online in a wholesale manner (pp. 14-15). Rather, new techniques and pedagogies unique to digital environments must be discerned and employed, as revealed in OWI Principle 3, to address the heavily text-based nature of OWI and the myriad of technological devices available for educational uses (pp. 12-14). However, what educators believe are the best online teaching strategies are not always the best learning strategies for students, which means that research must address that aspect of OWI as well. As described in the Introduction, the CCCC OWI Committee’s weakest area of research in terms of its field visits, surveys, and work with the expert/stakeholder panel was in its consideration of student learning experiences. The discipline needs to step back and understand what works for students and then consider how to marry that with overall pedagogical approaches—particularly given students’ accessibility needs. Depending on local institutional constraints, that marriage may look different within and across various contexts.

    With these points in mind, there is a strong need for open-ended research into overarching areas of interest surrounding OWI as it occurs in naturalistic settings across institutions and student and teacher cohorts. To be sure, close-ended, tightly controlled experimental studies that, for example, test pre-determined theories or hypotheses about such issues as how students learn in this environment can serve an important role and can address various research questions about quantitative benefits and measurable outcomes (Ehmann, 2003). These we recommend as well. From a pedagogically driven perspective and in light of the current landscape of OWI knowledge, however, we believe strongly that more interpretive open-ended research should be a leading priority in any study of OWI. Given existing questions about participant experiences and OWI processes, therefore, a primary need is to explore the phenomenon of OWI—with individual cases across various institutions and learning contexts being viewed as opportunities to investigate overall trends and patterns that can lead to a deeper understanding of OWI as a phenomenon in and of itself. Based on previous exploratory research, compelling areas for empirical, theory-generating research are related to literacy and cognition, processes, participant perceptions and experiences, and outcomes. Because much of OWI is text-based, scholars are well positioned to delve into archives of teacher and learner work to explore short-term/one-off moments as well as longitudinal evidence of learning. In addition to deepening our understanding of the literacy and cognition challenges of OWI and the pedagogical inner workings and processes of OWI, it is essential that student outcomes also are assessed. Outcomes include student grades and other performance measures, course retention, and persistence. Although the analysis of such outcomes in relation to student achievement and learning has its pitfalls, it is undeniable that such information is useful as a baseline for both faculty and administrators who require such data as a quantitative measure of efficacy.

    This background strongly suggests that program and course design should be developed to allow such information to be captured and analyzed. Within an interpretative framework, both qualitative and quantitative methodological designs can be employed to address key questions surrounding OWI and OWL outcomes, processes, and participant perspectives. These areas will be discussed in the next sections of this chapter with particular focus on how they relate to planning for a postsecondary program that includes OWI and designing an OWI course; however, readers may adapt these discussions particularly to OWLs where the focus seems relevant.

    Perspectives for OWI Research

    As indicated in the previous section, the urgent call for OWI educators, both in assessing their current courses/programs and in contemplating future development, is, minimally, fourfold:

    • Establishing opportunities within courses to investigate the phenomenon of OWI through an interpretive, process-oriented research framework;
    • Using the analytic data available through LMSs to study students’ assistive technology use and learning patterns;
    • Establishing opportunities to collect and study a baseline of performance-related student outcomes that can be analyzed by internal and external third parties; and
    • Establishing opportunities to collect and study instructional strategies pertinent to both composition and the OWI-environment that also can be analyzed internally and externally for quality measures.

    This need for research is highlighted in OWI Principle 15 of A Position Statement of Principles and Example Effective Practices for OWI (p. 31). Specifically, the CCCC OWI Committee’s statement reported on a common theme from leaders in the field about the need for professional development in the area of OWI and OWLs. The CCCC OWI Committee expressed a pressing need to “educate the writing community on OWI and OWLs and to help direct the teaching and learning of our students with what is known about state of the art and effective practices” (p. 31). Moreover, the statement called for developments in OWI and OWLs to be rooted in “valid and reliable research findings and systematic information dissemination” (p. 31). In other words, of paramount importance is the knowledge sharing that occurs pre-, mid-, and post-research.

    As in other contexts, the challenge of undertaking the aforementioned type of research resides in the time and funding resources needed to accomplish such work. For an obvious example, student writing appropriately is one measure of any writing program’s efficacy. Studying that writing, however, is time and labor intensive. With that in mind, some computer programs or automated writing evaluation (AWE, a term that appears to have been coined by Warschauer and Ware [2006]) tools potentially may be used for research purposes to determine writing differences by, for example, indicating linguistic changes across drafts and revisions—provided the AWE tool is adequately trained and normed to specific prompts informed by human-directed parameters of strong and weak writing. At this time, such AWE programs typically cannot accurately synthesize within individual contexts (i.e., for individual students and their unique essay writing) whether and how those changes represent an individual’s specific rhetorical strengths and weaknesses, content accuracy and depth, and overall writing maturity (NCTE Position Statement on Machine Scoring, 2013; Perelman, 2013). Indeed, this work requires trained human instructors and readers. Given the need to balance the requirements of human expertise with the time and resources necessary for research into OWI, it is worth considering how markers of writing change may be quantitatively assessed via AWE technology in conjunction with human efforts related to the qualitative understanding, teaching, and assessment of writing.
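    To illustrate one narrow slice of this idea, the sketch below (a minimal illustration, not drawn from any particular AWE product) computes a few surface-level indicators of change between a draft and its revision: word count, sentence count, average sentence length, and type-token ratio. These are the kinds of coarse markers that might be tracked quantitatively while human readers handle rhetorical judgment; the metrics, function names, and sample texts are illustrative assumptions, not validated measures of writing quality.

      # Minimal sketch (not an AWE system): coarse, surface-level indicators of
      # change between a draft and its revision. Metric choices are assumptions.
      import re

      def text_metrics(text):
          words = re.findall(r"[A-Za-z']+", text.lower())
          sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
          return {
              "words": len(words),
              "sentences": len(sentences),
              "avg_sentence_length": len(words) / max(len(sentences), 1),
              "type_token_ratio": len(set(words)) / max(len(words), 1),
          }

      def draft_change(draft, revision):
          before, after = text_metrics(draft), text_metrics(revision)
          # Positive values indicate growth from draft to revision.
          return {key: round(after[key] - before[key], 3) for key in before}

      if __name__ == "__main__":
          draft = "My essay argues one thing. It is short."
          revision = ("My essay argues that online writing instruction needs "
                      "sustained research. It is still short, but the claim is "
                      "more specific.")
          print(draft_change(draft, revision))

    Even so simple a tally only flags that something changed; interpreting whether the change reflects rhetorical improvement remains human work, which is precisely the balance between automated markers and human expertise described above.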

    To be sure, there is much controversy surrounding the use of AWE for placement, assessment, and instruction—let alone as a potential analysis tool to aid research of OWI. Earlier critiques of AWE have (1) typically focused on scoring rather than integration of AWE into the writing classroom and (2) relied more on composition theory than on empirical classroom studies of AWE’s impact on student learning (Ericsson & Haswell, 2006; Grimes & Warschauer, 2006; Shermis, Burstein, & Bliss, 2004). More recent studies have investigated the impact of AWE on student scores on standardized tests, teachers’ impressions of AWE, student impressions of AWE, impact on student writing, and student behavior while using AWE applications (Chen & Cheng, 2008; Cotos, 2011; Grimes & Warschauer, 2010; Holman, 2011; Perelman, 2013; Schroder, Grohe, & Pogue, 2008; Shermis & Burstein, 2013; Shermis & Hammer, 2013). The point we wish to make here is that whether or not AWE is positioned as an optimal instructional or evaluative tool for complex and unique writing challenges and development issues that individual learners face, it may be leveraged for research into OWI if done with its strengths and limitations in mind. These items certainly warrant further consideration.

    The human labor involved in OWI research requires training, developed experience, allotted time, and funding, the scarcity of which has been an impediment to OWI and other writing researchers, who typically are not expected, encouraged, or educated to seek grants for such research. Indeed, writing studies scholars—whether focused on onsite instruction or OWI—often are not prepared for such research generally. And, when such research is developed, venues for publishing it can be scarce.

    In 2005, Richard Haswell’s “NCTE/CCCC’s Recent War on Scholarship” cogently argued for RAD (replicable, aggregable, and data-supported) research in composition studies. His argument for empirical research in the field serves as a counterpoint to Stephen M. North’s (1987) popular and helpful The Making of Knowledge in Composition: Portrait of an Emerging Field—in that the field no longer is emerging and that assumptions drawn from research ought to be validated and, to some degree, comparable across studies. Janice M. Lauer and J. William Asher’s (1988) Composition Research: Empirical Designs provided solid advice toward RAD research, but Haswell (2005) speculated that because the key NCTE/CCCC journals for the writing studies field (e.g., College English and College Composition and Communication) failed to publish sufficient RAD research, that failure discouraged publishing scholars from conducting it. To the field’s benefit, some scholars have taken up Haswell’s call to action with powerful critiques of current non-RAD research and practices based on it; Rebecca Day Babcock and Terese Thonus’ (2012) Researching the Writing Center: Toward an Evidence-based Practice comprises one such critique. In support of OWI, a few scholars have taken up such research (Hewett, 2000, 2004-2005, 2006; Jones, Georghiades, & Gunson, 2012; Wolfe & Griffin, 2013), which has helped to ground practice-based OWI development. Nonetheless, few researchers actually have replicated previous studies—using prior research methods, taxonomies, or other analytical tools—allowing the research to become part of a growing body of knowledge where comparisons might be made. Hewett’s work (1998, 2000), in which previously used methods and taxonomies were adapted to a new setting (2003-2005, 2006), provided one example of an attempt at RAD research based on earlier talk and revision taxonomies (Faigley & Witte, 1981; Gere & Abbott, 1985; Gere & Stevens, 1985).

    RAD research into OWI presents considerable challenges that we are not dismissing in this chapter. However, given these challenges, a strong case can be made to apply an action research approach to the overall study of OWI—such that educators can leverage their own practices as scholarly investigation. Grounded in an action research approach that is designed to ask educators to investigate, reflect, and report on their own practices, the CCCC OWI Committee’s explication of this research-focused principle encouraged practitioners to research their own courses, students, and programs (also see Hewett, 2010, 2015b; Hewett & Ehmann, 2004). As co-creators and/or participant observers, OWI and OWL administrators and teachers/tutors alike are situated ideally to commit to continuous research of their courses, students, and programs—with the overall intention of building and strengthening the theoretical and pedagogical frameworks for OWI and OWLs. Pepperdine University’s Center for Collaborative Action Research advocated a similar version of action research. They explained that action research:

    is the systematic, reflective study of one’s actions, and the effects of these actions, in a workplace context. As such, it involves deep inquiry into one’s professional practice... . Action research is a way of learning from and through one’s practice by working through a series of reflective stages that facilitate the development of a form of “adaptive” expertise. Over time, action researchers develop a deep understanding of the ways in which a variety of social and environmental forces interact to create complex patterns. Since these forces are dynamic, action research is a process of living one’s theory into practice. (Riel, 2010)

    As emphasized in OWI Principle 15 of A Position Statement of Principles and Example Effective Practices for OWI, “Empirical, repeatable, and longitudinal research that addresses questions regarding the phenomena of OWI and OWLs will drive a deeper understanding of OWI and OWLs, ultimately benefiting students and the teaching and learning of writing in online contexts” (p. 31). Similarly, follow-up reporting and information dissemination are important phases in the strategy of “progressive inquiry” (Riel, 2010) for action research. The intention is for research findings to be critiqued and validated by peers as well as the scholarly community and administrators in composition studies. As the CCCC OWI Committee’s A Position Statement of Principles and Example Effective Practices for OWI advocated under OWI Principle 15:

    OWI and OWL administrators and teachers/tutors should engage actively in the scholarly conversation by sharing research findings at regional and national conferences and through peer-reviewed journals and other academic publications. OWI and OWL administrators and teachers/tutors should share research findings with the general public in suitable venues to assist with setting appropriate expectations for and understanding of OWI and OWLs. (p. 31)

    Perhaps most importantly, such research can be used to inform and ultimately improve one’s own practice within a cyclical, phased approach.

    As indicated earlier in this chapter, there are numerous main areas that, in light of the current state of knowledge about OWI, can be considered research priorities. For the purposes of this chapter, we categorize these priorities into three overarching areas pertinent to OWI/OWLs: processes and interactions, participant perceptions and experiences, and outcomes. Although a myriad of potential research topics related to these broad areas exist, the point here is to highlight examples of research questions and methodological choices that can serve to deepen understanding and knowledge of processes, participant experiences, and outcomes of OWI for both teaching and tutoring practices.

    Processes and Interactions

    Effective Practice 15.1 of the CCCC OWI Committee’s A Position Statement of Principles and Example Effective Practices for OWI articulated the need to design and deploy empirical investigations surrounding “the processes of asynchronous and/or synchronous OWI or OWL interactions” and “student and teacher/tutor behaviors, actions, and relationships within the context of the actual exchanges” (p. 31). Processes and interactions in an OWI context can be taken to mean one-to-one and group interactions that occur among teachers/tutors and learners.

    Within these areas, there are multiple subsets of OWI that warrant investigation including asynchronous and synchronous modalities, fully online and hybrid contexts, effects of medium on learning, feedback and response strategies using different technologies, and overall approaches to online learning where writing is the focus. Common examples include fully online or hybrid classroom exchanges among individuals, one-to-one exchanges in an online tutorial or conference setting, online conversations that occur in peer revision groups, and email exchanges. Moreover, these interactions occur in asynchronous and synchronous modalities such as delayed, multi-textual, correspondence exchanges; one-way or two-way audio and audio/video messages; and voice and/or text chat-based messaging. Above and beyond the specific context, the primary point here is that the focus on processes and interaction relies on studying the nature of the human-to-human interactions and exchanges among participants in digital, educational settings—all of which, in most settings, can be captured and accessed with consent via online archiving and records. Following an interpretive strategic methodological approach, open-ended research questions about processes can address broader, descriptive notions about the OWI phenomenon (Ehmann, 2003, 2012). Such questions include but are not limited to the following areas:

    • What happens in fully online OWI courses, online conferences (course-based and/or tutorial), and associated text-based exchanges?
    • Similarly, what happens in hybrid courses and conferences?
    • What is the overall structure and format of these online teaching and learning environments—in both fully online and hybrid contexts?
      • What is the length of engagement for a class?
      • What is the frequency, length, and nature of one-to-one conferences, group interaction, and any peer interaction?
    • What is the substantive focus of participant exchanges in these contexts?
    • What topics do teachers/tutors initiate and how do students respond?
    • What topics do students initiate and how do teachers/tutors respond?
    • How do participants “talk” in text about the topics they initiate?
    • What pedagogical strategies do instructors employ in their teaching of writing?
    • What strategies do students employ in their learning of writing?
    • How do these strategies compare with those of students with learning disabilities?
    • What strategies can be developed to suit these students’ learning needs?
    • What indications or evidence of understanding and progress do participants demonstrate in their participation, written work, and revision?

    The kinds of data required to address these types of process/interaction-focused questions point to artifacts that are inherent to OWI. Specifically, A Position Statement of Principles and Example Effective Practices for OWI reported that the OWI and OWL environments are “particularly well positioned as sites of ongoing research in that almost all interactions are saved and archived ... enabling empirical analysis” (p. 31). Indeed, the inherent characteristics and qualities of OWI and OWLs can be leveraged in meeting this call and commitment for on-going research. Key records of interaction among teachers/tutors and learners can involve such texts as email, various modes of online platform communication, online group discussion, and a portfolio of longitudinal writing drafts and revisions. These records can serve as artifacts available for investigating the deep pedagogical processes of OWI in meaningful ways.

    When considering meaningful analysis methods for the interaction data collected, an approach that remains true to the open-ended intent of the research is paramount. With the intention of exploring what individuals actually are doing rather than what existing sentiment says they should or could do, analysis methods need to support the desire to produce findings that accurately reflect participant practices and to build on previous research where applicable and appropriate (Ehmann, 2003). Options for analyzing interactions can involve fine-grained scrutiny of participant talk, such as linguistic or discourse analysis, which entails the detailed and systematic investigation of functional structures and hierarchies potentially related to student revision (see, for example, Hewett, 2006, 2003-2005, 2000, 1998).

    Broader brush-stroke approaches include the identification of thematic activity, behaviors, or writing development within or potentially connected to such learning exchanges. Regardless of the level of granularity, however, a unit of analysis within an interaction must be identified and justified for the investigation’s purposes—whether the unit is a conversation move, an episode, or some other discrete chunk within social and instructional exchanges. Depending on the research objectives, both quantitative and qualitative methods can be used in the analysis of largely qualitative talk or text-based data—for example, using numeric analysis to describe the actions that individuals take in teaching and learning exchanges. Units of analysis can be counted and presented; such an approach may lend external credibility to findings depending on the audience. Based on actual conferencing interactions, Hewett’s (2004-2005, 2006) research is among the most relevant empirical scholarship regarding OWI interactions to date. In her work, Hewett found that students’ revised writing demonstrated linguistic connections to the online conferences in which they participated; where the connections did not exist, however, the students’ open-ended survey responses provided interesting evidence toward her theory that instructors’ semantic integrity is crucial in a text-based setting (Hewett, 2010, 2015a, 2015b). Some of these findings have been replicated in recent research into uses of metaphor in online tutoring (Thonus & Hewett, 2015). Moving forward, new empirical studies might build upon this work, which can ground adaptations and changes in instructional approaches for both asynchronous and synchronous settings.
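    Returning to the counting of units of analysis described above, the brief sketch below tallies conversation moves in an archived exchange that a human coder has already labeled. The coding scheme, roles, and records are invented for illustration; any actual study would define and justify its own units and codes.

      # Minimal sketch: counting hand-coded units of analysis (here, conversation
      # "moves") in an archived OWI exchange. Codes and data are hypothetical.
      from collections import Counter

      # Each record pairs a participant role with the code a human coder assigned.
      coded_exchange = [
          ("tutor", "open_question"),
          ("student", "information"),
          ("tutor", "directive"),
          ("student", "revision_plan"),
          ("tutor", "open_question"),
          ("student", "information"),
      ]

      counts_by_role = {}
      for role, code in coded_exchange:
          counts_by_role.setdefault(role, Counter())[code] += 1

      for role, counts in counts_by_role.items():
          total = sum(counts.values())
          summary = ", ".join(f"{code}: {n} ({n / total:.0%})"
                              for code, n in counts.most_common())
          print(f"{role}: {summary}")

    Such counts could then be compared across conferences, cohorts, or modalities, which is one way the numeric presentation noted above might lend external credibility to otherwise qualitative findings.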

    Stakeholder Perceptions and Experiences

    The importance of deepening our understanding of OWI/OWL interactions and learning exchanges is clear. An additional area that requires investigation is that of participant perceptions and experiences. The CCCC OWI Committee’s example Effective Practice 15.1 called attention to this point, suggesting that “Studies might examine participant perceptions of OWI or OWLs (e.g., benefits, challenges, experiences) via interviews with students, teachers/tutors, and administrators” (p. 31).

    Students are primary stakeholders in the OWI endeavor. As such, their first-hand experiences warrant exploration in addition to their reasons for engaging in OWI and their views about its purpose and value in the postsecondary context. A priority of this approach is to seek descriptive responses that are rooted in respondents’ actual experiences rather than evaluative responses about what OWI should or should not be. Both Nancy Sommers (2006) and Jane Mathison Fife and Peggy O’Neill (2001) have indicated how important it is to understand how students believe they have benefitted from feedback and essay response, for example. The student experience helps to triangulate what researchers see in the many texts that OWI makes archivally available.

    Similarly, with teachers/tutors, a priority is to explore descriptive accounts of OWI experiences as well as observations about OWI beyond the scope of single instances or courses. It is crucial to understand notions about the purpose and value of what they are doing as teachers/tutors within the OWI context as well as why and how they are doing it.

    A third group of OWI stakeholders includes administrators and non-faculty decision-makers about OWI programs. Exploring their views of OWI also is important, with the primary objectives being to discern their perceptions about the purpose and value of OWI and how OWI fits into other ways of teaching and learning writing at the institution.

    With attention to these stakeholders and their needs in mind, the following types of research questions can be formulated:

    • Why do stakeholders engage in and with OWI/OWLs?
    • What are the perceived purposes of OWI/OWLs—from all participant perspectives?
    • What approaches to practice do participants see themselves taking?
      • What pedagogical strategies do they view as most effective?
      • What pedagogical strategies do they view as least effective?
      • What evidence do they look for in terms of efficacy and student learning?
    • What is the perceived value of OWI/OWLs for students, faculty, and administrators?
    • What are the perceived benefits for students, faculty, and administrators?
    • What are the perceived challenges for students, faculty, and administrators?
    • What approaches to practice do stakeholders see themselves taking?
    • What approaches to learning do students see themselves taking?
    • What training and professional development opportunities do instructors and administrators view as most helpful, least helpful, and most necessary?
      • What delivery mechanisms are best for such training and professional development?
      • How can training and professional development best be evaluated for potential efficacy?
    • What orientation and support services do students, in particular, view as most helpful, least helpful, and most necessary?

    Furthermore, regarding students as stakeholders, it is paramount to understand which populations of learners are served better online than others. While ADA law is clear, institutions may not have equipped themselves to support students online who have learning disabilities and/or physical challenges. While their needs may be connected more to ethical than to legal exigency, students with multilingual issues or limited socioeconomic resources, as well as those who are ill-prepared for college, also have access and inclusivity needs. Institutional responsibility includes preparing teachers/tutors for developing their online instruction inclusively. In its early days, online learning was lauded as a new way to help those students who could not get to campus for whatever reason. Ironically, however, the CCCC OWI Committee’s research strongly suggested that writing studies educators really do not know how to support those who are, perhaps, most in need of the access-based opportunities that OWI can afford—such as the blind, the dyslexic, and auditory learners. There is little anecdotal literature and even less research on the matter, making it a crucial area for future research.

    From the instructor perspective, we must determine the strategies, skills, and understandings needed in orientation, training, and on-going professional development—for new teachers as well as veteran onsite teachers entering into the online environment, perhaps for the first time. As is commonly documented in the OWI literature (Hewett & Ehmann, 2004; McGrath, 2012), many have seen firsthand that the best face-to-face teachers and subject matter experts do not always make the best online teachers—this reality crosses international as well as subject-area boundaries. The instructional skills needed in the OWI environment do not transfer directly or straightforwardly from physical contexts, and online instructors need training as well as ongoing professional development to orient them to the most effective strategies for teaching writing in online settings. In addition to comfort and fluency with the online technologies, instructors need to be strong teachers and tutors of writing. Attracting, training, and retaining good teachers to use technology as well as implementing and advancing online writing-specific pedagogies together provide the cornerstone for any online writing program’s success. How to accomplish this twofold goal definitely warrants further study and can be explored via collecting participant perceptions.

    The kinds of data required to access and to investigate individuals’ views and perspectives about OWI can be collected effectively via interviews and surveys (both open- and closed-ended). As an interpretive starting point, semi-structured, open-ended interviews can focus on issues deemed important to an OWI/OWL study and provide an opportunity to understand how respondents make sense of those issues as well as other topics they believe are important within the OWI context. Such perspectives can be captured with individual students in pre-course surveys, mid-course feedback sessions, and exit interviews after the course. They also can be captured in student and professor course evaluations. In the analytical process, it is possible to explore themes surrounding, for example, approaches to OWI practice, attitudes towards OWI and students in the online environment, and perceived benefits and challenges of OWI. With appropriate sampling techniques, findings from empirical analysis of interview data can then inform closed-ended surveys and questionnaires that ultimately can be administered on a larger scale (a brief illustrative sketch of summarizing such closed-ended responses appears below). Based on initial open-ended interviews with identified OWI leaders in the field, the CCCC OWI Committee administered two larger scale surveys regarding hybrid and fully online OWI environments (CCCC OWI Committee, 2011a, 2011b), as described in the Introduction to this book. The State of the Art of OWI (CCCC OWI Committee, 2011c) report indicated important contextual background and trends for the OWI landscape including:

    • the lack of a common pedagogical framework grounded in theory and practice specifically related to OWI,
    • the urgent need for training and professional development of educators and supplemental support (such as online writing centers) involved in OWI,
    • the lack of knowledge about ELL students and those with disabilities in the OWI environment, and
    • the challenges associated with instructors’ professional satisfaction with OWI in terms of adequate institutional support for training and technology needs.

    From the report, it is clear that more work in the aforementioned areas is warranted.
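    As flagged earlier in this section, once closed-ended instruments are administered at a larger scale, even simple descriptive summaries can make stakeholder perceptions comparable across courses and cohorts. The sketch below uses invented Likert-scale items and responses; it is a minimal illustration and is not drawn from the CCCC OWI Committee surveys.

      # Minimal sketch: descriptive summary of closed-ended (Likert 1-5) survey
      # items. Items and responses are invented for illustration only.
      from statistics import mean

      responses = {
          "OWI feedback helped my writing": [5, 4, 4, 3, 5, 2, 4],
          "The course technology was accessible to me": [4, 4, 3, 5, 4, 4, 2],
      }

      for item, scores in responses.items():
          agree = sum(1 for s in scores if s >= 4) / len(scores)
          print(f"{item}: n={len(scores)}, mean={mean(scores):.2f}, "
                f"agree or higher={agree:.0%}")

    Descriptive summaries of this kind complement, rather than replace, the open-ended interview themes described above; the qualitative accounts give the numbers their interpretive context.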

    Outcomes

    Quantitative studies that investigate student performance in terms of learning outcomes or benchmarks, grades, and course retention should be designed and deployed for OWI and OWL settings. From an administrative perspective, return on investment studies also can be deployed to help understand the financial impact—and potential benefits and challenges—of OWI or of an OWL to institutions. Where possible, longitudinal research should be designed and institutionally funded to understand the differing complexities of learning to read and write in digital, online, and distributed online educational settings. Retention is one of the greatest challenges for institutions and students; this certainly is true in the United States, and other countries face the same challenge (Boston, Ice, & Gibson, 2011). With that challenge in mind, experts must pursue targeted, strategic research, and administrators must implement newfound effective practices surrounding the support of online writing students in terms of the expectations and modes of interaction those students encounter in online learning environments. The following types of questions can be formulated to address student and institutional outcomes:

    • What are the demographics of the students who participate in OWI/OWLs?
    • What are the demographics of the instructors who participate in OWI/OWLs?
    • What quantifiable measures of student engagement can be tracked and reported on—such as attendance, assignment completion, participation levels, and student-to-student and/or instructor-to-student contacts?
    • What are student grades in these settings?
    • What are OWI course pass rates?
    • What are OWI course completion rates?
    • What are the OWI course retention and persistence rates?
    • What are the overall institutional retention rates and how are they affected by the OWI courses and/or OWL presence?
    • What other standardized and/or high stakes competency or skills testing gains do OWI students achieve?
    • What is the fiscal and/or human investment for OWI to a department and/or institution and what is the return on this investment?

    The kind of data required to address the aforementioned research questions involves student demographic information that is typically collected in standard institutional database systems. The questions also require course-level information typically collected in an LMS. Data analytics regarding student usage, activity, and other such behavior also can be used to triangulate and provide a fuller picture of outcomes in the OWI environment.
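    As a hedged sketch of how a few of the outcome questions listed above might be operationalized from exported course records, the example below computes completion, pass, and next-term retention rates. The field names, grade scale, passing threshold, and records are assumptions that would need to match a given institution’s actual data.

      # Minimal sketch: course-level outcome rates from exported (invented)
      # records. Field names, grade scale, and passing threshold are assumptions.
      records = [
          {"student": "A", "completed": True, "grade": 3.3, "returned_next_term": True},
          {"student": "B", "completed": True, "grade": 1.7, "returned_next_term": False},
          {"student": "C", "completed": False, "grade": None, "returned_next_term": False},
          {"student": "D", "completed": True, "grade": 2.7, "returned_next_term": True},
      ]

      PASSING_GRADE = 2.0  # assumed threshold on an assumed 4.0 scale

      completed = [r for r in records if r["completed"]]
      passed = [r for r in completed if r["grade"] is not None and r["grade"] >= PASSING_GRADE]
      retained = [r for r in records if r["returned_next_term"]]

      print(f"completion rate: {len(completed) / len(records):.0%}")
      print(f"pass rate (of completers): {len(passed) / len(completed):.0%}")
      print(f"next-term retention: {len(retained) / len(records):.0%}")

    In practice, such figures would be triangulated with the demographic information and usage analytics described above rather than read on their own.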

    With an eye toward understanding these key areas of interactions, experiences, and outcomes, practitioners and administrators alike would do well to design courses with research in mind. Using data that the institution already may have collected for any given course (e.g., student demographic information or performance) can make that research markedly more efficient. Finally, the institution must commit to allowing practitioners the crucial time to organize, analyze, reflect on, present, refine, and disseminate their research and findings. This commitment to time and timing is an important element of course design that must be negotiated in the design process. Additionally, such a commitment to practitioner time for research reflects attention to the need for appropriate compensation for OWI development work, as suggested by OWI Principle 8.

    Recommendations

    This section of the chapter summarizes key points and recommendations for designing an appropriate research strategy for OWI, which includes the uses of OWLs.

    Table 17.1 provides a sample research memo that can be adapted to gain institutional, administrator, and faculty support and resources for the endeavor. It also may be used to encourage buy-in for a course/OWL design that can accommodate ongoing research.

    Table 17.1: Example research proposal memo for OWI

    Introduction

    With the pervasiveness of Internet-based education (and the paucity of data and analysis about it), questions abound surrounding teaching and learning in online writing instructional contexts, instructor preparation, student gains, and the administration and delivery of online programs. This memo outlines potential research options for exploring online writing processes and the relationships between student participation in online writing and student performance in academic courses. It encourages the investigation of student usage data and OWI session archives generated via the course platform.

    Institutional Background

    [Insert key contextual information about the institution]

    Research Concept

    While many factors, such as faculty effort or student demographics, can have an impact on grades, course pass rates, and retention, online writing instructional courses figure prominently in schools’ administrative decisions and budgets. Understanding the potential links between OWI services and various measures of student satisfaction and student academic success is an important step in designing OWI programs that improve overall institutional performance. To reach a better understanding of this intersection, research into OWI is well suited to focus on courses with high attrition risk, such as math, chemistry, and writing. Findings from such a study could facilitate an institution’s assessment of OWI as it relates to student experience and learning, retention, and ultimately return on investment.

    Methods

    Depending on the specific research questions, the project can be designed in various ways, including as a largely instructor-driven initiative. Potential data sets, options for research design and methods, and timeline guidelines are outlined in Chart A below.

    Data Sets

    Outcomes: Institution X’s system tracks student course performance, usage, activity, and grades. These data provide contextual information about how and when students use the service. Overall student performance (grades, retention, etc.), demographic information, and other needed data would be required from the institution.

    Student perspectives: Students’ perspectives on their experience engaging in OWI clarify the extent to which, and the ways in which, they view this type of online instruction as beneficial to their learning and potentially to other aspects of their education. Course evaluations, interviews, and online surveys could provide cost-effective means of collecting such perspectives.

    Faculty perspectives: Via interviews and online surveys, faculty could provide feedback on the effect of OWI on student learning and participation, as well as on how OWI affects their own teaching and pedagogical approaches and understandings. Similarly, administrators could provide feedback on how well OWI is understood and on its impact on overall departmental and institutional advances.

    Interactions: Via archives of student work, peer collaboration/communication/engagement, and faculty-student interaction, the processes and procedures of OWI can be studied. A detailed analysis of such interactional data can yield deep insights into the pedagogical strategies and approaches that are most beneficial to students and can assist with the training and professional development of faculty members.
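
    As a starting point for analyzing such interactional archives, the brief sketch below derives simple participation indicators (turns and word counts per speaker role) from a hypothetical transcript export; the file format and column names are assumptions, and deeper discourse or content analysis would follow from, not replace, these surface counts.

```python
# Minimal sketch: deriving simple participation indicators from an archived
# OWI session transcript before any deeper discourse or content analysis.
# Assumes a hypothetical CSV export with "speaker_role" (e.g., "instructor",
# "student") and "message" columns; real archive formats will vary.

import csv
from collections import defaultdict

def participation_summary(path="session_transcript.csv"):
    turns = defaultdict(int)
    words = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            role = row["speaker_role"].strip().lower()
            turns[role] += 1
            words[role] += len(row["message"].split())
    return {role: {"turns": turns[role], "words": words[role]} for role in turns}

if __name__ == "__main__":
    for role, stats in participation_summary().items():
        print(f"{role}: {stats['turns']} turns, {stats['words']} words")
```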

    Table 17.2 (Ehmann, 2013) further summarizes research strategy options, illustrating key relationships between dependent and explanatory variables; an array of possible data collection methods with corresponding analysis techniques for quantitative and qualitative data; and final reporting and action steps. It succinctly encapsulates research options, and it can be used as a starting point for discussion with various audiences.

    Table 17.2: OWI in fully-online and hybrid contexts (Ehmann, 2013)

    Dependent variables
    • Student grades
    • Course pass & completion rates
    • Institutional retention and persistence rates
    • Measures of student and faculty satisfaction (for example, faculty or course evaluations)

    Explanatory variables
    • Student work and usage patterns: writing, communication, participation, and other interactional indicators; tutorial frequency and duration
    • Student, faculty, and administrator perspectives
    • Teaching and learning processes

    Data collection methods
    • Data analytics via reporting tools (available as a part of most course management systems)
    • In-depth, semi-structured interviews; online surveys; and pre-, mid-, and post-course evaluations or course exit interviews
    • Teacher/tutor/learner interactions captured in asynchronous and synchronous modalities

    Analysis techniques

    Quantitative analysis
    • Descriptive statistics
    • Correlation/co-variation between different quantitative or qualitative measures of student success
    • Probabilistic models (i.e., logit or probit) that link binary outcomes (pass/fail or retention) to explanatory variables (unit of analysis: student)

    Qualitative analysis
    • Discourse analysis, content analysis, or rhetorical analysis of human-to-human interactions to explore patterns of pedagogy, behavior, and learning activity
    • Comparing outputs of student work
    • Tracking types of student participation in tutorial sessions
    • Correlating types of student participation to student work

    Reporting & action steps
    • Written reports & publications
    • Conferences
    • Other peer-reviewed scholarship
    • Social media activity
    • Web logs, blogging, and other outlets
    • User group meetings
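
    As one illustration of the probabilistic models named under the analysis techniques above, the following sketch fits a logit model linking a binary pass/fail outcome to hypothetical explanatory variables using the statsmodels library. The data file, variable names, and model specification are placeholders; any real model would require careful attention to sampling, missing data, and institutional context.

```python
# Minimal sketch: a logit model linking a binary OWI outcome (pass/fail) to
# explanatory variables, as suggested in the analysis techniques above.
# Requires pandas and statsmodels; the data file and column names are
# hypothetical placeholders.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("owi_outcomes.csv")  # passed (0/1), logins, tutorial_minutes, posts

# Unit of analysis: the student. Logit links the binary outcome to usage measures.
result = smf.logit("passed ~ logins + tutorial_minutes + posts", data=df).fit()

print(result.summary())                # coefficients, standard errors, p-values
print(np.exp(result.params).round(3))  # odds ratios for easier interpretation
```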

    The ultimate credibility of any research will rely on the goals, justifications, articulated limitations, and overall transparency of the project. Addressing the following areas can help to establish such credibility in the dissemination of research findings:

    • A plan for researching OWI
    • Research questions
    • Overall research strategy
    • Choice of research instruments
    • Role of researcher
    • Ethical considerations
    • Selecting the sample
    • Conduct of and lessons learned during piloting
    • Data collection
    • Conduct of the interviews and/or surveys, if used
    • Analysis of the interactions
    • Analysis of the interview data
    • Findings
    • Conclusions
    • Recommendations for additional research

    Conclusion

    Rhetoric and composition educators understand the need for OWI research and evidence that supports OWI teaching and learning strategies. As indicated, there are many areas open for research where scholars and practitioners can contribute to the knowledge of this field. Among such areas are:

    • Outcomes research on quantitatively measurable OWI student gains (e.g., grades, course retention, and sequence persistence) that can justify overall course success to, for example, administrators and institutional leadership for funding purposes, and
    • The more qualitative, interpretive, theory-generating work needed to understand the success and value of various strategies, techniques, and pedagogies associated with OWI.

    A significant challenge exists in meeting both ends of this research spectrum. This chapter has outlined strategic investigative approaches and methodologies that can address quantitatively as well as qualitatively focused questions within an action research paradigm. When planning a postsecondary program that includes OWI, designing an OWI course, or working through the intricacies of tutoring in an OWL, these recommendations can be incorporated and ultimately used to strengthen and fortify the teaching of writing and student learning in online and onsite settings alike.

    References

    Aslanian, Carol B. & Clinefelter, David L. (2012). Online college students 2012: Comprehensive data on demands and preferences. Retrieved from www.learninghouse.com/files/documents/resources/Online%20College%20Students%202012.pdf

    Boston, Wallace E., Ice, Phil, & Gibson, Angela (2011). Comprehensive assessment of student retention in online learning environments. Online Journal of Distance Learning Administration, 4(1). Retrieved from http://www.westga.edu/~distance/ojdla/spring141/boston_ice_gibson141.pdf

    Boyd, Patricia Webb. (2008). Analyzing students’ perceptions of their learning in online and hybrid first-year composition courses. Computers and Composition, 25, 224-243.

    CCCC OWI Committee for Effective Practices in Online Writing Instruction. (2008). Annotated bibliography on online writing instruction 1980-2008. Committee for Best Practices on Online Writing Instruction (Keith Gibson & Beth L. Hewett, Eds.). Retrieved from www.ncte.org/library/NCTEFiles/Groups/CCCC/Committees/OWIAnnotatedBib.pdf

    CCCC OWI Committee for Effective Practices in Online Writing Instruction. (2011a). Fully online distance-based courses survey results. Retrieved from http://s.zoomerang.com/sr.aspx?sm=EAupi15gkwWur6G7egRSXUw8kpNMu1f5gjUp01aogtY%3d

    CCCC OWI Committee for Effective Practices in Online Writing Instruction. (2011b). Hybrid/blended course survey results. Retrieved from http://s.zoomerang.com/sr.aspx?sm=%2fPsFeeRDwfznaIyyz4sV0qxkkh5Ry7O1NdnGHCxIBD4%3d

    CCCC Committee for Effective Practices in Online Writing Instruction. (2011c). The state of the art of OWI. Retrieved from www.ncte.org/library/NCTEFiles/Groups/CCCC/Committees/OWI_State-of-Art_Report_April_2011.pdf

    CCCC OWI Committee for Effective Practices in Online Writing Instruction. (2013). A position statement of principles and effective practices for online writing instruction (OWI). Retrieved from http://www.ncte.org/cccc/resources/p.../owiprinciples

    Chen, Chi-Fen Emily & Cheng, Wei-Yuan Eugene. (2008). Beyond the design of automated writing evaluation: Pedagogical practices and perceived learning effectiveness in EFL writing classes. Language Learning & Technology, 12(2), 94-112.

    Cormier, Dave. (2010). What is a MOOC? Retrieved from https://www.youtube.com/watch?v=eW3gMGqcZQc

    Cotos, Elena. (2011). Potential of automated writing evaluation feedback. CALICO Journal, 28(2), 420-459.

    CourseSmart & Wakefield Research. (2012). CourseSmart survey. Retrieved from http://www.coursesmart.com/media#pr10

    DePew, Kevin Eric & Lettner-Rust, Heather. (2009). Mediating power: Distance learning interfaces, classroom epistemology, and the gaze. Computers and Composition, 26, 174–189.

    Ehmann, Christa. (2003). A study of peer tutoring in higher education (Doctoral thesis, The University of Oxford, Oxford, UK).

    Ehmann, Christa. (2010). A study of online writing instructor perceptions. In Beth L. Hewett (Ed.), The online writing conference: A guide for teachers and tutors (pp. 163-171). Portsmouth, NH: Heinemann.

    Ehmann, Christa. (2012, December). Online teaching and learning: Past, present, and future. Conference on e-Learning & e-Libraries, Wellington College, UK.

    Ehmann, Christa. (2013). OWI in fully-online & hybrid contexts [Chart].

    Ericsson, Patricia F. & Haswell, Richard. (Eds.). (2006). Machine scoring of student essays: Truth and consequences. Logan, UT: Utah State University Press.

    Faigley, Lester & Witte, Stephen. (1981). Analyzing revision. College Composition and Communication, 32(4), 400-414.

    Fife, Jane Mathison & O’Neill, Peggy. (2001). Moving beyond the written comment: Narrowing the gap between response practice and research. College Composition and Communication, 53(2), 300-321.

    Gere, Ann Ruggles & Abbott, Robert D. (1985). Talking about writing: The language of writing groups. Research in the Teaching of English, 19(4), 362-381.

    Gere, Ann Ruggles & Stevens, Ralph S. (1985). The language of writing groups: How oral response shapes revision. In Sarah Warshauer Freedman (Ed.), The acquisition of written language: Response and revision (pp. 85-105). Norwood, NJ: Ablex.

    Gouge, Catherine. (2009). Conversation at a crucial moment: Hybrid courses and the future of writing programs. College English, 71(4), 338-362.

    Griffin, June & Minter, Deborah. (2013). The rise of the online writing classroom: Reflecting on the material conditions of college composition teaching. College Composition and Communication, 65(1), 140-161.

    Grimes, Douglas & Warschauer, Mark. (2006, April). Automated essay scoring in the classroom. Paper presented at the American Educational Research Association (AERA) Annual Conference, San Francisco, CA.

    Grimes, Douglas & Warschauer, Mark. (2010). Utility in a fallible tool: A multi-site case study of automated writing evaluation. Journal of Technology, Learning, and Assessment, 8(6). Retrieved from http://ejournals.bc.edu/ojs/index.php/jtla/article/view/1625/1469

    Hewett, Beth L. (1998). The characteristics and effects of oral and computer-mediated peer group talk on the argumentative writing process (Doctoral dissertation). Retrieved from http://www.defendandpublish.com/Diss_Characteristics.pdf

    Hewett, Beth L. (2000). Characteristics of interactive oral and computer-mediated peer group talk and its influence on revision. Computers & Composition, 17, 265-288.

    Hewett, Beth L. (2001). Generating new theory for online writing instruction. Kairos: Rhetoric, Technology, and Pedagogy, 6.2. Retrieved from english.ttu.edu/kairos/6.2/index.html

    Hewett, Beth L. (2004-2005). Asynchronous online instructional commentary: A study of student revision. Readerly/Writerly Texts: Essays in Literary, Composition, and Pedagogical Theory (Double Issue), 11 & 12(1 & 2), 47-67.

    Hewett, Beth L. (2006). Synchronous online conference-based instruction: A study of whiteboard interactions and student writing. Computers and Composition, 23(1), 4-31.

    Hewett, Beth L. (2010). The online writing conference: A guide for teachers and tutors. Portsmouth, NH: Boynton/Cook.

    Hewett, Beth L. (2013). Fully online and hybrid writing instruction. In Gary Tate, Amy Rupiper Taggart, Kurt Schick, & H. Brooke Hessler (Eds.), A guide to composition pedagogies (2nd ed.) (pp. 194-211). New York: Oxford University Press.

    Hewett, Beth L. (2015a). Reading to learn and writing to teach: Literacy strategies for online writing instruction. Boston: Bedford/St. Martin’s Press.

    Hewett, Beth L. (2015b). The online writing conference: A guide for teachers and tutors (Updated). Boston: Bedford/St. Martin’s Press.

    Hewett, Beth L. & Ehmann, Christa. (2004). Preparing educators for online writing instruction. Urbana, IL: National Council of Teachers of English.

    HMHEducation.com (2012). Results of a year-long algebra pilot in Riverside, California. Retrieved from www.hmheducation.com/fuse/pdf/hmh-fuse-riverside-whitepaper.pdf

    Holman, Lester Donnie. (2011). Automated writing evaluation program’s effect on student writing achievement (Doctoral dissertation). Available from ProQuest Dissertations and Theses database. (913400137)

    Johnson, E. Janet & Card, Karen. (2008). The effects of instructor and student immediacy behaviors in writing improvement and course satisfaction in a Web-based undergraduate course. MountainRise, 04(2). Retrieved from http://mountainrise.wcu.edu/index.php/MtnRise/article/view/81/36

    Kessler, Sarah. (2012). Why the iPad won’t transform education just yet. Mashable.com. Retrieved from www.cnn.com/2012/01/20/tech/innovation/ipad-wont-transform-education/index.html

    Koller, Daphne. (2012, June). What we’re learning from online education. [Talk presented at the TEDGlobal 2012, Edinburgh, Scotland, UK]. Retrieved from http://www.ted.com/talks/daphne_koller_what_we_re_learning_from_online_education.html

    Krause, Steven D. & Lowe, Charles (Eds.). (2014). Invasion of the MOOCs: The perils and promises of massive open online courses. Anderson, SC: Parlor Press.

    Marquis, Justin W. & Rivas, K. (2012, June 26). Eight nations leading the way in online education [Web log post]. Retrieved from http://www.onlineuniversities.com/blog/2012/06/8-nations-leading-way-online-education/

    Meloncon, Lisa. (Ed.). (2013). Rhetorical access-ability: At the intersection of technical communication and disability studies. Amityville, NY: Baywood Publishing.

    McGrath, Laura. (2008). In their own voices: Online writing instructors speak out on issues of preparation, development, & support. Computers and Composition Online (Spring). Retrieved from www.bgsu.edu/departments/english/cconline/OWIPDS/index.html

    NCTE Task Force on Writing Assessment. (2013). Machine scoring fails the test. NCTE Position Statement on Machine Scoring. Retrieved from http://www.ncte.org/positions/statements/machine_scoring

    Oswal, Sushil K. & Hewett, Beth L. (2013). Accessibility challenges for visually impaired students and their OWI teachers. In Lisa Meloncon (Ed.), Rhetorical access-ability: At the intersection of technical communication and disability studies (pp. 135-156). Amityville, NY: Baywood Publishing.

    Parker, Kim, Lenhart, Amanda, & Moore, Kathleen. (2011). The digital revolution and higher education. Pew Research Center Internet & American Life Project. Retrieved from www.pewsocialtrends.org/files/2011/08/online-learning.pdf

    Pearson Foundation. (2011). Survey on students and tablets. Retrieved from www.pearsonfoundation.org/downloads/PF_Tablet_Survey_Summary.pdf

    Perelman, Les C. (2013). Critique of Mark D. Shermis & Ben Hammer, “Contrasting state-of-the-art automated scoring of essays: Analysis.” Journal of Writing Assessment, 6. Retrieved from http://www.journalofwritingassessment.org/article.php?article=69

    Radford, Alexandria W. (2011). Learning at a distance: Undergraduate enrollment in distance education courses and degree programs. National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2012/2012154.pdf

    Riel, Margaret. (2010). Understanding action research. Pepperdine University, Center for Collaborative Action Research. Retrieved from http://cadres.pepperdine.edu/ccar/define.html

    Sapp, David Alan & Simon, James. (2005). Comparing grades in online and face-to-face writing courses: Interpersonal accountability and institutional commitment. Computers and Composition, 22(4), 471-489.

    Schroeder, Julie, Grohe, Bonnie, & Pogue, Rene. (2008). The impact of Criterion writing evaluation technology on criminal justice student writing skills. Journal of Criminal Justice Education, 19(8), 432–445.

    Shermis, Mark D., Burstein, Jill. C., & Bliss, Larry. (2004, April). The impact of automated essay scoring on high stakes writing assessments. Paper presented at the annual meetings of the National Council on Measurement in Education, San Diego, CA.

    Shermis, Mark D. & Burstein, Jill C. (Eds.). (2013). Handbook of automated essay evaluation: Current applications and new directions. New York: Routledge.

    Thonus, Terese & Hewett, Beth L. (2015). Follow this path: Conceptual metaphors in writing center online consultations. Manuscript in preparation.


    This page titled 17: OWI Research Considerations is shared under a CC BY-NC-ND license and was authored, remixed, and/or curated by Beth L. Hewett and Kevin Eric DePew, Editors (WAC Clearinghouse).
