
    Chapter 5. Collaborating on Writing-to-Learn in Ninth-Grade Science: What Is Collaboration—and How Can We Sustain It?

    Danielle Myelle-Watson, Deb Spears, David Wellen, Michael McClellan, and Brad Peters

    Other contributors to this volume note that in high school and university partnerships, teachers often want extended time to collaborate with professors so they can learn more about developing specific methods relevant to their own subject areas (Cox and Gimbel, Chapter 2). However, high school circumstances and practices can hinder the collaboration (Beaumont, Pydde, and Tschirpke, Chapter 7). Keeping communication lines open is critical.

In the project recounted here, three science teachers, a high school reading specialist, and a university coordinator of WAC collaborated for a year and a half under an Illinois state grant from Promoting Achievement through Literacy Skills (PALS). Modeled on a previous study at the same high school, the science writing project provided teachers with university credit for a semester’s course that the reading specialist and the WAC coordinator had co-taught on site many times before (Peters 64-66). The following year, the teachers implemented what they learned with continued support from the PALS grant and ongoing collaboration with the reading specialist and the WAC coordinator (McClellan et al.).

Similar to other chapters in this volume, the project matched materials and aims to the needs and interests of the teachers (McMullen-Light, Chapter 6). A chief concern among the teachers was how writing-to-learn might help ninth-grade students attain “scientific literacy,” especially with the emphasis that the new Common Core Standards put on “College and Career Readiness Anchor Standards for Writing” in their science classes (States Standards Initiative 63-66; Cox and Gimbel). Historically, educators have identified science literacy as:

    • A cultural force in the modern world
    • Preparation for the workplace
    • Applicable to everyday living
    • A trait of informed citizenship
    • A particular way of examining nature
    • An approach to understanding scientific reports or discussions in popular media
    • A means of appreciating the aesthetics of natural laws and phenomena
    • A willingness to make use of scientific expertise
    • A way to critique and use technology ethically. (DeBoer 591-93)

But in an urban district where 78.7% of students were low-income, where the teachers’ school was on academic watch, where standardized tests predominated, and where the majority of non-college-bound students needed to complete only two science courses to graduate, such goals seemed out of reach for many (Illinois Interactive Report Card; Rockford Public Schools 7). How could writing-to-learn help?

    During the spring semester prior to implementing the project, the teachers studied resources on writing-to-learn. Then, they composed and experimented with sets of sequenced writing prompts that clarified scientific concepts embedded in class discussions, small-group interaction, and above all, laboratory work. They piloted and evaluated the prompts, using a rubric that other teachers in the same school had developed for an earlier project (Peters 65). In consultation with the high school reading specialist, the teachers decided that all of their next-year courses—two honors sections of biology, five “average” sections of biology, and seven sections of physical science—would engage in four National Writing Project practices:

    • Plan informal writing at least twice a month
    • Discuss writing strategies in the context of course content
    • Do some form of redrafting
    • Collect, examine, and reflect on the writing. (Nagin 44)

    The following summer, the district serendipitously recruited the three teachers and their colleagues in other high schools to design four quarterly tests in biology and physical science as a move toward instruction-based assessment (Gallagher 58-59). District faculty closely aligned the tests with the curriculum. The three science teachers felt the quarterly tests could help them gauge the gains of writing-to-learn in comparison with their students’ district peers.

    When the new academic year began, the teachers agreed to form a “Professional Learning Community” (PLC) that would meet every two weeks. In the PLC meetings, they would review and revise prompts that addressed key concepts in the curriculum. They would discuss samples of student responses, analyze any problems or concerns that came up, and examine the quarterly test results. They would also provide written reflections on each quarter’s progress.

The first quarter of the next year’s collaboration showed great promise. Everyone attended the PLC meetings. The teachers regularly consulted with the reading specialist between the meetings. Both the biology and physical science students responded well to informal writing. At all achievement levels, students improved substantially in their lab reports and scored approximately ten percentage points higher than their district peers on the first-quarter district tests. These differences in test scores were also statistically significant.

    However, the second quarter did not proceed in the same way. Unexpected issues interrupted the PLC meetings, forcing cancellations and sporadic attendance. Although the reading specialist tried to help them all keep up, lack of adequate collaboration as a group had a negative impact. Biology students did much less writing-to-learn than expected. Physical science students (approximately 30% of whom were special education students) struggled with prompts that intimidated or confused them.

After the second quarter, the nature of the collaboration changed. Debriefing among the teachers as a cohort waned. Maintaining a sustainable structure for communication became a challenge (similar to Beaumont, Pydde, and Tschirpke, Chapter 7). Because the project took place at the high school, the reading specialist and WAC coordinator could still meet with teachers individually. Nevertheless, prompts in biology were assigned unevenly through the third and fourth quarters, making it difficult to tell whether writing-to-learn made a difference. Honors biology students earned quarterly test scores that matched, or dropped below, those of their district peers. Yet average biology students achieved slightly higher test scores, and one third-quarter biology class earned scores that were statistically significant.

    On the other hand, after the teachers readjusted some of their practices, physical science students either kept on responding to two sets of prompts per quarter or focused more on writing well-developed conclusions to lab reports. These at-risk students accomplished statistically significant test scores in the third and fourth quarters.

In addition to comparing the quarterly test scores of students who wrote to learn with those of district students who did not, the teachers collected their students’ “write-to-learn” folders to be read, ranked, and examined at the end of the academic year. Descriptive statistics suggested that the frequency and consistency of writing-to-learn tasks were important variables, as was the impact of changes in collaboration among the teachers, the reading specialist, and the WAC coordinator. Accordingly, the story that follows is as much a study of the dynamics of collaboration as it is a study of how writing-to-learn affects student learning outcomes.

    Why Write to Learn? A Theoretical Basis for Enhancing Scientific Literacy

    National studies show how writing-to-learn can help students “learn more deeply” and attain “higher achievement in science” (Peery 17-18, 21; Reeves; Rivard). Moreover, a study conducted by the National Survey of Student Engagement and the National Council of Writing Program Administrators claims that in the most successful classroom experiences, “the amount of pages students wrote was less important for deep learning and gains than interactive writing, meaning‐making, and clear expectations” (Anderson et al. 1). This claim is supported by the earlier findings of Johannessen, Kahn, and Walter who asserted that sequenced writing-to-learn prompts can yield “dramatic gains in only a short time,” enabling high school students to make “essential thinking strategies ... part of their repertoire” when they encounter new material (5, 22). However, those sequenced prompts must be clustered around a specific concept and designed to break a complex thinking process into more manageable steps (Johannessen, Kahn, and Walter 5). Such claims impressed the science teachers particularly because the reading specialist had participated in a previous study of writing-to-learn in their high school, which implied that all students—even low achievers—could “produce statistically significant learning outcomes” (Peters 85).

    The science teachers already asked students to write a fair amount. Lab guidelines sharpened the students’ observational and procedural skills regularly. These questions from a physical science lab on magnets are typical:

    • Place iron filings on a blank piece of white paper over a bar magnet to reveal the magnetic field. Draw the magnetic field lines.
    • What observations do you make?
    • Place two bar magnets so the north pole of one is close to the south pole of another. Draw the field lines near the poles.
    • What observation can you make?

    Furthermore, one teacher provided her students with the following guidelines on “How to write the different parts” of “Power Conclusions” to their lab reports:

    • Reference to hypothesis—“I thought that ...” (if ... then statement)
    • Reflection on hypothesis—“I found out ... therefore my hypothesis was ....”
    • Reference to specific data—“My data showed that ...”
    • Error analysis—“Doing _____ may have affected my results”; “I forgot to ...”; “I could have improved on my ____ skill(s) and this would have ...”
    • Future research—“I would like to investigate more about _____ because ...”; “Knowing _____ will help me to ...”; “We should repeat the experiment because ...” (adapted from Pierce and Shellhaas)

    Students also wrote papers requiring Internet research, e.g., reports on famous scientists’ discoveries, or detailed profiles of simple to complex animal life.

All the same, the science teachers welcomed exercises that have succeeded in science classes nationwide—e.g., freewriting, RAFTs (role/audience/form/topic), the Science Writing Heuristic, Cornell notes, reading logs, interpretations of graphs, and even poetry (Chabot and Tomkiewicz 53-55; Childers and Lowry 1; Hohenshell and Hand 271; Keys 116; Peery 127-51; Petersen 99). They recognized that different written practices could also help their students acquire and negotiate the disciplinary knowledge they needed to become scientifically literate (Navarro and Ravel Chion, Chapter 4). Most important, they were interested in composing sequenced prompts that scaffolded students’ thinking strategies around a specific scientific concept (Johannessen, Kahn, and Walter 3-4; Wood, Bruner, and Ross 89-90).

During the onsite course, the teachers also questioned and re-examined their assumptions about how they taught scientific literacy in their classrooms. For example, one science teacher always asked his students to memorize an acronym similar to the Science Writing Heuristic: “P.H.E.O.I.C.” (define a problem, form a hypothesis, experiment, observe, interpret data, and make a conclusion). He would occasionally quiz them or ask them to recite it. But after a lively discussion with his colleagues about how and why students so often failed to translate this acronym into well-written lab reports, he wondered if he was engaging his students in “knowledge telling” rather than “knowledge transforming” (Bereiter and Scardamalia; Rivard and Straw 586). He went on to compose the following set of sequenced prompts to help students apply the heuristic more meaningfully and critically:

    • Define science, pseudoscience, hypothesis, and law.
    • Identify steps of the scientific method for a classmate who was absent in class the day we reviewed it. Explain why you think scientists follow these steps.
    • Describe all the parts of a controlled experiment. Give reasons why you think it’s important to include a control group and an experimental group. Remember to discuss the effects of independent and dependent variables.
    • Develop a well-thought-out set of instructions that anyone could follow for the lab we did on potatoes. Include guides for analyzing the data and drawing a conclusion from a hypothesis and its test.

    He assigned these prompts at intervals over a two-week period in one class and asked students to talk over their answers each time they wrote. He reported back to his colleagues that it allowed him to intersperse the writing with small-group lab work and what McCann et al. call “authentic discussion”: engaging the students directly in critical thinking about the concept he was teaching, rather than requiring them merely to recite it (2-3). In sum, he and his colleagues deduced that writing-to-learn could become part of several “language-based activities” that contributed to the complex changes that enabled their students to acquire scientific literacy (Rivard 438).

    This high-caliber collaboration occurred among the science teachers throughout the semester’s preparatory course, giving rise to a “WACommunity” that reflected “a true culture of writing” (Cox and Gimbel, Chapter 2; McMullen-Light, Chapter 6). In turn, the reading specialist and the WAC coordinator gained rich insights about the value of teachers applying their disciplinary knowledge to assignment design to address students’ specific learning problems.

    Instruction-Based Assessment: A Response to Standardized Tests

The district’s summer decision to ask all science teachers to develop quarterly tests came at a good time. Some state boards of education are discovering what teachers and scholars have known for years: too much emphasis on standardized tests not only “squelches teaching and learning creativity”—it also “eliminates the need for critical thought” (College Readiness Project, Phase II; Hillocks 136). Yet a recent national survey shows that secondary teachers care a lot about the accountability that tests produce, as long as it does not force them to neglect important aspects of their curriculum (Sunderman, Kim, and Orfield 124). Teachers in a few states have found instruction-driven assessment based on locally designed tests especially compelling, because it enables them to spread assessments throughout the year, increasing the possibility of using test scores to discover and redress their students’ needs before those students leave the classroom or fail (Gallagher 63). Instruction-driven assessment can even encourage teachers to develop individual education plans for students, using writing-to-learn, portfolios, and “student-friendly rubrics to help students understand learning expectations” (Gallagher 67). Science scholars in Washington suggest such an approach may in fact reveal that “how students learn could be more important than what they learn” (College Readiness Project, Phase II).

Nonetheless, it was unusual for the school district to initiate the quarterly tests. As in other large urban districts, teachers often felt “alienated from core decisions about [...] instruction,” where “district objectives, large criterion-referenced tests, and textbooks” dictated the curriculum (Gallagher 65). Some faculty suspected the district intended the tests to enforce cookie-cutter syllabi throughout its four high schools. But after years of reserving nearly a month to prepare students for the Prairie State Achievement Examination (PSAE) and the ACT, the three science teachers cautiously welcomed tests that mirrored what they actually taught. They saw how the quarterly tests could help “develop and refine their curriculum, instruction, professional development ... and assessment,” above all because the tests were also aligned with “power standards” adapted from the Illinois learning standards for science (Gallagher 64-65; Illinois State Board of Education). Moreover, as in other districts that have tried instruction-driven assessment, the quarterly tests would let the science teachers give “heightened attention to particular groups of students, including low-income students, English language learners [ELL], and ethnic or racial groups” (Gallagher 65). Accordingly, the quarterly tests bolstered the collaboration that had begun in the preparatory course. The reading specialist and the WAC coordinator were excited that this approach to assessment could take the collaboration to an even higher level, making the project relevant to the whole district.

While these teacher-designed assessments clearly established expectations for content coverage at a certain pace, the three science teachers still noted places where they could apply writing-to-learn most advantageously. The tests also left room in the curriculum for the teachers to expand upon and connect important issues. For instance, could the physical science curriculum include a short review of earth science—a subject covered only in middle school, even though 25% of the PSAE questions focused on it? They wanted to try.

    After studying the tests in a PLC meeting, the teachers decided the physical science prompts should address the topics of:

    • The relationship between matter and energy
    • Elements, compounds, and mixtures
    • The earth and its atmosphere
    • Atomic structure
    • Chemical reactions (combustion of gas)
    • The scientific problem-solving process
    • Common uses of electricity
    • Electrical circuits
    • Magnetic fields and uses of magnets

    Meanwhile, the teachers decided the biology prompts should address the topics of:

    • Characteristics of living things
    • Biotic and abiotic factors in ecosystems
    • Plant and animal cells
    • Differentiating mitosis and meiosis
    • Cell specialization and embryonic development
    • DNA replication
    • Inheritance of genetic traits
    • Evolution of genetic traits

Because one teacher taught both physical science and biology, it seemed optimal for all three to work together and share the prompts they were creating anew, rather than have each teacher create a separate set. The alignment that the teachers sought between writing-to-learn and the quarterly tests went beyond mere attempts to teach to the tests. They hoped to begin recovering from what a decade of No Child Left Behind had done in their district to narrow the curriculum, emphasize lower-level skills, and decrease teacher and student engagement in the development of science literacy (Gallagher 39).

Although the three teachers, the reading specialist, and the WAC coordinator collaborated closely on the prompts for the first quarter—discussing and revising them in the PLC meetings, as well as examining samples of students’ corresponding work—the rest of the year’s prompts on the topic lists remained unfinished. As chance would have it, the teachers’ plan to continue collaborating on the prompts did not account for the unexpected.

    The Unexpected: First-Quarter Pacing for Different Levels of Achievers

    Pacing was one of the biggest problems the science teachers encountered during the first quarter of their implementation of writing-to-learn. Although the instruction-based, quarterly tests focused on the actual curriculum that teachers taught, and the Common Core Standards encouraged richer, more “varied genres and purposes of writing,” the district hadn’t abandoned the PSAE or ACT, which still emphasized shallow coverage over deep learning—pressuring the teachers to move through the curriculum “more quickly than they would if their professional judgment were their guide” (Cox and Gimbel, Chapter 2; Gallagher 65-66). Such conflicting exigencies could only complicate the science teachers’ attempts to extend instruction “by doing more exploratory learning and more constructivist learning” through scaffolding (Gallagher 66).

    To illustrate, one teacher—Dawn—had her students write “bell ringers” at the beginning of classes, to provide them with the “multiple writing tasks across connected topics” that she had learned to design (Hand, Hohenshell, and Prain 344). She would spend a few minutes getting the students to focus on responding to the bell ringers, and then she would lead a discussion of the students’ responses. Next, she would go on to address the day’s lesson, which connected directly to the bell ringer. After several days and two to three bell ringers, she would give the students the longer prompt to address. She found out, however, that the impact of the bell ringers differed between her average and her honors students in biology. When she did a series of lessons on biotic and abiotic factors, one lesson began with the bell ringer: “Explain the difference between a rock and a wooden board.” Then, she introduced the terms and spent time discussing the distinctions. The next lesson began with: “Define biotic and abiotic. Write about three different objects in this classroom that are biotic and three that are abiotic.” She then provided the students with several scenarios that would affect biotic and abiotic factors—e.g., a hurricane, an oil spill, over-grazing, building a golf course, a nuclear leak, highway construction. She broke the students into groups, each group to analyze how one scenario would affect biotic and abiotic factors. They then presented their group analyses. The following day, she assigned the longer prompt:

    Analyze one of the scenarios we discussed. Identify several biotic and abiotic factors, and explain how each of these factors would be affected by the scenario. Then hypothesize if/how these biological situations could return to their natural or normal state over time.

    Dawn used this sequence with both her honors-level and average classes, just as she had with a previous sequence of prompts on the characteristics of living things. She found out the honors students had “the cognitive tools and the conceptual building blocks necessary” for completing the task (Rivard and Straw 587), but the average students did not. She needed to supplement the task with more follow-up prompts and class discussion before they, too, could distinguish between biotic/abiotic and apply their knowledge to real-life scenarios. She did so, but barely covered the material her students needed for the first-quarter test.

The experience with her biology students made Dawn sensitive to her physical science classes as well. Many of her ELL and special education students not only needed more scaffolding, but also consistently needed more time to complete the culminating prompts (Lee et al. 33; Stretch and Osborn 4). When she gave these students this extra help and time, they performed nearly on par with their higher-achieving peers—though their work showed less detail and grammatical fluency. This finding suggested that even her low-achieving students could benefit from writing-to-learn, contrary to claims otherwise (Bangert-Drowns, Hurley, and Wilkinson 6, 53).

    So a dilemma surfaced. Could Dawn ignore that her lower-achieving students would learn almost as well as her higher-achieving students if she simply slowed down the pace of the curriculum? Her physical science students performed quite well on the material they had covered for the first-quarter test, but when the results of their writing-to-learn compelled her to take longer to accommodate their learning pace, she knew they would fail on the material they had not covered. Unlike Mike, her colleague who taught physical science exclusively, she had no special education teacher assigned to her class to help her physical science students keep up.

Dawn cut back considerably on write-to-learn prompts and focused on helping students improve conclusions to their lab reports instead. The students did three to four lab reports per quarter. This way, she could manage the curricular pace. She reasoned that the guidelines she used for writing the conclusions would provide similar scaffolding. The WAC coordinator noted her reasoning was supported by researchers who state: “scientific writing genres should be explicitly taught, so that all children might have access to the discursive power of scientific texts” (Halliday and Martin xi, 2-4; Keys 118). Above all, researchers assert that young learners affected by poverty and categorized as “low achievers” are especially disadvantaged when not taught explicitly to write scientific texts (Keys 118-19; Rivard 424-25).

David, a teacher who taught only biology classes for the year, developed another way to pace writing-to-learn in the first quarter. Influenced by the Common Core Standards’ emphasis on students reading and integrating facts, definitions, and details from informative texts into their written work, he introduced short articles to supplement the biology textbook and labs (States Standards Initiative 65). For example, students could read an article and formulate simple experiments that tested the claims they read. One article asserted that fish could see and were attracted to different kinds of colors. The students experimented with goldfish and multicolored glow sticks to verify the article’s claim. Then they wrote lab reports that involved “peer review, collaborative problem solving, [and] student partner revision teams” (Mullin and Childers 26; Zimmet 106). David also had the students read the articles, write brief summaries, discuss them, and then revise what they had learned (DeBoer 592-93). After reading an article on scientists who planted a “smart gene” into mouse DNA, one student wrote in her summary that this was a stupid idea. Why did the world need smarter mice? However, when her class discussed the article, learned its vocabulary, grasped why the experiment was conducted, and questioned the ethical implications, she rewrote her summary, saying that she now saw what the article was all about. She still thought that biologically engineering smarter mice made no practical sense, but if it led to making humans smarter, would it help her learn better and perform well on tests? Would it give some people an advantage over others? Yet again, what if gene-planting affected human brains badly? Some of David’s most successful classes were centered on “sociopolitical and moral contexts” such as these that helped develop his students’ science literacy (Soliday 67).

    Although pleased that his students performed well on the first-quarter exam, David also fell behind in the curriculum. He decided to include only one write-to-learn task in the second quarter. He eliminated the shorter prompts that led up to it, using them instead to guide class discussion. In the third and fourth quarters, he did the same, to see if relying more heavily upon “collaborative talk as a heuristic” might compensate for cut-backs in writing (Rivard 424).

    Mike, who taught physical science courses exclusively, was the only one who stayed with the plan to do two sets of prompts each quarter. He made sure that students responded in writing to the shorter, scaffolding prompts, interspersing their written responses with class discussion. For instance, he would have students compose a T-chart to help them organize and define new terms, then he would conduct a discussion of those terms in the context of the scientific concept he was introducing. Next, he would have the students do a lab that applied those concepts. At that point, students could illustrate or graph their findings as well, as Childers and Lowry recommend (n. pag.). Once the labs provided students with the knowledge base they needed, Mike had them write comparisons of their data (Rivard and Straw 586).

Yet this pattern pressed Mike for time in the second quarter as well, so he combined the shorter scaffolding prompts with the longer prompt into a kind of step-by-step, essayistic quiz. When he initiated these essayistic quizzes, the results were disappointing. He did not provide enough class discussion or build in enough shorter prompts to support the students’ written explanations of atomic structure or combustion in a gas engine. So he designed later quizzes as part of a “recursive cycle” with “students applying or practicing each small step” that he modeled, while he or his aide checked that “the class as a whole [was] succeeding on each successive step” (Schmoker n. pag.). Moreover, he carefully strategized the best time to schedule the write-to-learn exercises in relation to lab work. This approach got much better results. In addition, Mike discovered that if students did lab work, recorded observations, and studied results on one day, but wrote the conclusions to lab reports on the following day, they processed what they had learned and composed better conclusions (McClellan n. pag.).

Ironically, what each science teacher found out about their students’ learning needs caused the tightly knit collaboration to unravel. Deb, the reading specialist, and Brad, the WAC coordinator, realized that they could not rally the teachers back to the original collaborative model without challenging each teacher’s decision to incorporate write-to-learn principles in her or his own way. While the group recovered a schedule of meeting regularly, the collaboration with the teachers occurred on a much more individual basis—often with only one teacher attending at a time. Although Dawn and David composed a few more prompts together for their biology students, they stayed mostly with what they felt more comfortable doing, according to their first-quarter adaptations. And as Mike focused independently on implementing the physical science prompts he wrote, Dawn focused on guiding her biology and physical science students alike to write “power conclusions.” For good and thoughtful reasons, the teachers might have lost track altogether of what each was doing had Deb and Brad not continued to compare and analyze with each teacher the successes or problems that the others encountered as the year progressed. This adapted model of collaboration sacrificed a uniform approach for an individuated approach. It not only enriched the project’s outcomes, but it also complicated and altered the disciplinary conversation that the PLC had sustained to that point.

Unfortunately, other issues further hampered the teachers’ efforts to keep pace with the curriculum. Student absences and truancy were always high. Each of the teachers walked into classes on days when 50% or more of the students were missing. Student mobility presented another challenge. Dawn recounted one week in the second quarter when ten new students showed up in a section of physical science. Counting drop-outs and new arrivals together, each teacher’s roster for each class included at least five to seven students who would not finish out the year, and the same number of students who might be added at any time.

    The probability of a strike during the second quarter complicated the situation further. The teachers had to cancel three PLC sessions to attend union meetings instead. A conservative Board of Education had threatened to raise teachers’ insurance to $800 per month, cut 138 jobs, increase class sizes from 26 students to 34 students, shut down four schools, eliminate five major programs, and cancel orders for new textbooks. Only the spring before, the district had closed or consolidated ten schools, slashed special education classes, and dropped 281 jobs (Bayer).1

    Given these issues, the end-of-year results yielded a number of encouraging surprises.

    The Impact of Writing-to-Learn upon Instruction-Based Assessment

    Although this project went in three different directions after the first quarter—reverting to the kinds of “fragmented individualism” that can characterize secondary teachers even when they belong to the same department—the quarterly tests provided an anchor that helped the teachers measure the effects of their varied approaches the rest of the year (Siskin 28-29).

    During the first quarter, when all three science teachers incorporated two sets of write-to-learn prompts, the students’ higher scores in David’s and Dawn’s combined biology classes differed from the other district classes by 11.72 percentage points (Table 1, below).2

Table 1: First-quarter biology

1st Quarter Biology Tests | # of Questions | # of Students | Possible Points | Avg. Points | Avg. % Correct
Biology students who did writing-to-learn | 35 | 181 | 35 | 21.77 | 62.20%
District biology students | 35 | 1369 | 35 | 17.67 | 50.48%

    Biology students who wrote to learn outperformed the rest of the district on 89% of the questions. When a two-sample t-test compared the 181 writing-to-learn students with the 1369 ninth-grade district biology students, results yielded t(68) = 2.94, p < .05—suggesting less than a 5% probability that higher scores among writing-to-learn students were coincidental. The effect size r was also calculated with a result of .3—a medium effect—indicating that writing-to-learn had made a positive impact on the students’ retention of biological concepts (Steinberg 366).
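
For readers who want to trace the mechanics behind these figures, here is a minimal sketch of the comparison, assuming access to per-student raw scores. The chapter reports only the summary statistics in Table 1, so the score arrays below are hypothetical stand-ins generated to match the reported group sizes and means; the effect size is derived from the t statistic and degrees of freedom as r = √(t²/(t² + df)), the conversion Steinberg describes.

```python
"""Illustrative sketch of the chapter's statistical comparison: a two-sample
t-test between writing-to-learn classes and district peers, plus the effect
size r = sqrt(t^2 / (t^2 + df)). The raw scores are hypothetical stand-ins;
only the summary statistics (n = 181 vs. 1369, means 21.77 vs. 17.67 out of
35 points) come from Table 1."""
import math

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
wtl_scores = rng.normal(21.77, 5.0, size=181)        # writing-to-learn classes
district_scores = rng.normal(17.67, 5.0, size=1369)  # rest of the district

# Welch's t-test (no equal-variance assumption); unequal group sizes and
# variances reduce the effective degrees of freedom, e.g., the reported t(68).
t, p = stats.ttest_ind(wtl_scores, district_scores, equal_var=False)

df = 68  # degrees of freedom as reported for the first-quarter biology test
r = math.sqrt(t**2 / (t**2 + df))  # rough guide: .1 small, .3 medium, .5 large

print(f"t({df}) = {t:.2f}, p = {p:.4f}, effect size r = {r:.2f}")
```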

    Scores in Mike’s physical science classes were even more compelling (Table 2).3

Table 2: First-quarter physical science

1st Quarter Physical Science Tests | # of Questions | # of Students | Possible Points | Avg. Points | Avg. % Correct
Physical science students who did writing-to-learn | 30 | 129 | 30 | 18.6 | 61.99%
District physical science students | 30 | 598 | 30 | 14.3 | 47.56%

Mike’s 129 students who did writing-to-learn scored higher on 90% of the questions. The t-test showed that t(58) = 3.06, p < .005. There was less than a 0.5% probability that coincidence could explain the higher percentage of correct answers among students in his classes, a finding bolstered by a large effect size r of .4.

    Dawn’s shift from prompts to a writing-to-learn (WTL) emphasis on better conclusions in lab reports had a moderate, upward effect on her average biology students’ scores (Table 3), but the honors biology students’ scores fluctuated in comparison to the rest of the district.

Table 3: Comparison of 2nd- to 4th-quarter biology, write-to-learn emphasis on lab report conclusions/no known WTL emphasis

Achievement Levels, Biology | WTL Emphasis on Lab Report Conclusions | District—No Known WTL Emphasis
Average-2nd Quarter | 47.88% | 44.36%
Honors-2nd Quarter | 59.05% | 61.72%
Average-3rd Quarter | 51.05% | 42.88%
Honors-3rd Quarter | 65.14% | 63.86%
Average-4th Quarter | 54.50% | 51.26%
Honors-4th Quarter | 65.10% | 69.06%

Because the third-quarter percentage of correct answers for Dawn’s 23 average students was so much higher, a t-test was run. Calculations showed that t(60) = 2.07, p < .05, with a medium effect size r of .256. These results suggested that a WTL emphasis on lab report conclusions might account for her average biology students’ higher percentage of correct answers, at least on the third-quarter test, if not on the second and fourth quarters as well (Table 4). The less positive impact that a WTL emphasis on lab report conclusions had upon honors students will be addressed in the discussion section.
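
As a check, the reported effect size follows directly from the t statistic and degrees of freedom, using the same conversion applied to the other quarters: \( r = \sqrt{t^2 / (t^2 + df)} = \sqrt{2.07^2 / (2.07^2 + 60)} \approx .256 \).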

Table 4: Third-quarter biology, Dawn’s “average” class

3rd Quarter Biology, Dawn’s Average Class | # of Questions | # of Students | Possible Points | Avg. Points | Avg. % Correct
Students with WTL emphasis on lab report conclusions | 31 | 23 | 31 | 15.83 | 51.05%
District biology students | 31 | 744 | 31 | 13.29 | 42.88%

The most pronounced differences in correct-answer percentages between WTL classes and the rest of the district came from physical science during the third and fourth quarters. It was apparent that both writing-to-learn prompts and a WTL emphasis on lab report conclusions pushed all of the classes’ percentages of correct answers above those of the students’ district peers (Tables 5 and 6).

Table 5: Third-quarter physical science

3rd Quarter Physical Science Tests | # of Questions | # of Students | Possible Points | Avg. Points | Avg. % Correct
Physical science students who did writing-to-learn | 30 | 119 | 30 | 19.2 | 63.84%
District physical science students | 30 | 494 | 30 | 14.3 | 47.61%

The 119 third-quarter physical science students who wrote to learn answered an average of 63.84% of the questions correctly, compared with 47.61% among the 494 students in the rest of the district, yielding a t-test of t(58) = 4.75, p < .005—less than a 0.5% possibility that the impact of writing-to-learn was coincidental. Effect size r was large at .53. Given the high percentage of ELL and special education students, these results were worth noting.

    In the fourth quarter, the 111 physical science students who wrote to learn scored an average of 65.33% correct answers, while the 448 district students who did not write to learn scored 44.08%.

Table 6: Fourth-quarter physical science

4th Quarter Physical Science Tests | # of Questions | # of Students | Possible Points | Avg. Points | Avg. % Correct
Physical science students who did writing-to-learn | 30 | 111 | 30 | 19.59 | 65.33%
District physical science students | 30 | 448 | 30 | 13.44 | 44.08%

The t-test revealed that t(58) = 6.79, p < .005, or less than a 0.5% probability that the difference was coincidental. Effect size r came in at .66, which was large.

    The least encouraging results derived from David’s increased emphasis on class discussion to compensate for less writing-to-learn. Table 7 shows a possible negative impact—especially for average biology students in the third quarter.

Table 7: Effects of increased class discussion and decreased writing-to-learn

Achievement Levels, Biology | Increased Discussion, Decreased WTL | District—No Known WTL Emphasis
Average-2nd Quarter | 45.16% | 44.36%
Honors-2nd Quarter | 60.69% | 61.72%
Average-3rd Quarter | 37.41% | 43.84%
Honors-3rd Quarter | 62.01% | 63.86%
Average-4th Quarter | 50.29% | 51.26%
Honors-4th Quarter | 49.13% | 51.26%

    At the end of the year, student folders were also collected to get a sense of what quality of work had been achieved, and how the work might provide insights about differences among honors, average, and lower-achieving students. For students’ responses to the write-to-learn prompts, the science teachers used the rubric in Table 8 to gauge their students’ comprehension of task, content, thinking strategies, and language use (Johannessen, Kahn, and Walter 11-12; Peters 65-66).

    The WAC coordinator then selected student folders that contained at least 60% of the assigned prompts, rated them, and calculated the averages of the two ratings per folder.

A rating of 8 out of 12 possible points meant that a student’s folder had met expectations. Seventy-four percent of the 50 honors biology students who turned in folders completed a minimum of 60% of all writing-to-learn prompts assigned. The overall average rating for those honors biology folders was 9.56, with a S.D. of 1.5. Fifty percent of the 109 “average” biology students who turned in folders completed a minimum of 60% of all writing-to-learn prompts. The overall average rating for the “average biology” folders was 8.35, with a S.D. of 1.1. Forty-six percent of the 163 physical science students who turned in folders completed 60% of all writing-to-learn prompts. The overall average rating for the physical science folders was 8.05, with a S.D. of 1.2. Fifty-four percent of honors biology students achieved a rating of 10 or higher, while 9% of the average biology students, and 4% of the physical science students, rated in the same range. Two percent of honors biology students, 6% of average biology students, and 9% of physical science students rated 7 or lower.
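
The folder analysis lends itself to a compact summary; the sketch below shows one way these descriptive statistics could be computed. The folder records are hypothetical placeholders, while the 60% completion threshold, the averaging of two ratings per folder, and the 8-of-12 “meets expectations” cutoff follow the procedure described above.

```python
"""Illustrative sketch of the end-of-year folder analysis. Each folder is
rated twice on the 12-point rubric (four criteria, 1-3 points each), the two
ratings are averaged, and only folders containing at least 60% of the
assigned prompts are summarized. Folder data are hypothetical placeholders."""
from statistics import mean, stdev

MEETS_EXPECTATIONS = 8.0    # 8 of 12 possible rubric points
COMPLETION_THRESHOLD = 0.60

folders = [  # fraction of assigned prompts completed, plus two rubric ratings
    {"completed": 0.75, "ratings": (10, 9)},
    {"completed": 0.62, "ratings": (8, 9)},
    {"completed": 0.90, "ratings": (11, 10)},
    {"completed": 0.40, "ratings": (7, 8)},  # below threshold; excluded
]

eligible = [f for f in folders if f["completed"] >= COMPLETION_THRESHOLD]
scores = [mean(f["ratings"]) for f in eligible]

print(f"{len(eligible)} of {len(folders)} folders met the completion threshold")
print(f"average rating: {mean(scores):.2f}, S.D.: {stdev(scores):.2f}")
print(f"met expectations (>= 8/12): {sum(s >= MEETS_EXPECTATIONS for s in scores)}")
```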

Table 8: Rubric

CRITERIA | Exceeds expectations (3) | Meets expectations (2) | Misses expectations (1)
Comprehension of task—your ability to respond informally to what the writing prompt asks | You understand and follow instructions exactly. | You understand and follow instructions adequately. | You misunderstand or disregard instructions.
Content—your ability to convey knowledge of course content received from reading or listening | You provide very accurate information and thorough detail. | You provide accurate information and sufficient detail. | You provide inaccurate information and/or insufficient detail.
Strategies—your ability to apply, analyze, back up, compare, classify, critique, define, describe, evaluate, explain, exemplify, graph, hypothesize, illustrate, interpret, observe, organize, predict, question, reflect, review, show cause-effect, solve, summarize, or synthesize | You show clear control over the strategy or strategies that the prompt requires. | You show satisfactory evidence of understanding and practicing required strategies. | You show little or no evidence of understanding required strategies.
Language use—your ability to write a readable response and use conventions of grammar and punctuation | Your response is articulate; errors minimal. | Your response is easy to read; errors don’t prevent understanding. | Your response is hard to read/understand; errors confuse.

    In Retrospect

    In terms of student learning outcomes, perhaps the most thought-provoking finding in the foregoing data has to do with the performance of students who were categorized as low achievers. After a year of consistent writing-to-learn, they successfully demonstrated what they retained in the quarterly tests. Rivard points out: “The ways in which learning strategies have traditionally been utilized in the classroom have effectively denied equal access to knowledge for all students” (424). Yet as he points out, the research literature—including this project—

    suggests that classroom activities which emphasize conceptual understanding, real-life applications, language use, and small-group work may be particularly effective for enhancing the learning of students who have traditionally been marginalized by the educational system: low-ability students, underachievers, and potential school leavers. (Rivard 424)

    Moreover, the data in this study indicate that so-called low achievers are capable of responding well to specific instruction in specialized forms of disciplinary writing when it is structured “to foster more reflective thinking and enhanced student learning from laboratory activities” (Bazerman et al. 42). The same apparently held true to a lesser extent for “average” biology students who more sporadically wrote to learn.

    The drop in honors students’ test scores when they decreased writing-to-learn coincides with other research that shows “high achievers may benefit more from the use of writing” than average or low achievers—for them, especially, “the use of writing enhances learning more than just talk” (Rivard 432). This project suggests that writing-to-learn has a far more substantial effect upon high achievers than previously thought, and they achieved less when they stopped doing it. Furthermore, the results seem to reiterate the National Writing Project’s recommendation of planning write-to-learn activities a minimum of twice a month (Nagin 44).

    Nonetheless, as all three science teachers noticed, if combined consistently with writing-to-learn, “opportunities for all students to engage in extended dialogues signals the expectation that all learners will meet challenging academic standards” (McCann, Johannessen, Kahn, and Flanagan 6-7; Rivard and Straw 567-68). The progressively higher physical science test scores indicated as much.

    Commercially prepared tests that do not align with the local curriculum may not measure such success. Indeed, such tests may have held sway over the potential of low and average-achieving students for far too long. Local, instruction-driven assessment offers real promise for counteracting their detrimental effects, above all if this assessment is aligned with state standards and the national Common Core Standards. As the data here shows, if teachers can be encouraged to find the best, most workable methods for themselves to pace their students through the curriculum, they may use instruction-driven assessment combined with informal writing activities “in a strategic way throughout the school year ... to the extent that students [don’t] even know when they [are] being formally assessed and when they [are] simply carrying out ‘regular’ classroom activities” (Gallagher 63-64).

In terms of collaboration, this project’s findings are equally thought-provoking. While some contributors to this volume rightly assert that even short-term partnerships between high schools and colleges or universities can have a positive impact, productive collaboration often must begin with some kind of formalized arrangement on the part of the post-secondary institution (Cox and Gimbel, Chapter 2; McMullen-Light, Chapter 6; Smith, Chapter 9). High school faculty not only need opportunities to learn, but also need to apply principles of writing across the curriculum and discuss the results. It helps tremendously if a project includes a high school faculty member who can co-facilitate and, at the same time, relate to or convey the teachers’ circumstances and concerns (Peters 63). Conducting several projects at the same school increases the likelihood of success, as one project informs another (McClellan n. pag.). Grants or other sources of funding provide participants with the incentive to keep the project going (McClellan n. pag.; Peters 63). As McMullen-Light claims, “In all aspects of WAC, context is everything,” so if professors can collaborate on site with teachers, it helps tremendously (Chapter 6). Establishing “joint commitments” to sustained, regular times to meet and work together helps participants exchange ideas and keep goals equally in perspective (Blumner and Childers 94). Mapping out a plan and setting specific milestones, even if they only serve as a point of departure, enable participants to stay focused.

In addition, participants in such projects—in large American public schools, at least—should probably realize that the current culture of testing and standards militates against a “culture of collaboration,” especially when those standards are imposed rather than adapted through collective activity (Siskin 28). For example, “Logistical constraints of size, time, and space” complicate the situation, engaging teachers in a kind of “parallel piecework” in their departments, where they are more likely to “work alone, learn alone, and [...] derive their most important personal satisfactions alone” (Huberman 22-23; Siskin 29). Even when collective activity occurs and collaboration is successfully sustained, the pull in the opposite direction is strong. Participants will want to find a system by which they can analyze and reconfigure their collaborative model so that the project survives (Beaumont, Pydde, and Tschirpke, Chapter 7).

    Ultimately, some outcomes might not be met. Some problems will not be resolved. But recognizing this much allows for new discoveries and insights to emerge, as well as guidance for another project, another time.

    Notes

    1. In fact, a three-day strike did take place later in the school year.

2. In an earlier publication on this study (McClellan et al. 2012), errors occurred because of a mistake in the formula that adjusted percentages of correct answers between the writing-to-learn classes and the rest of the district’s classes. The errata are corrected here.

3. Dawn’s physical science scores were removed from the data set because her students hadn’t been able to cover all the material on the first-quarter test.

    Works Cited

    Anderson, Paul, Chris Anson, Bob Gonyea, and Chuck Paine. “Summary: The Consortium for the Study of Writing in College.” Handout. Council of Writing Program Administrators Summer Conference, Minneapolis, MN. Web. 17 July 2009. <http://nsse.indiana.edu/webinars/TuesdaysWithNSSE/2009_09_22_UsingResultsCSWC/Webinar Handout from WPA 2009.pdf>.

    Bangert-Drowns, Robert, Marlene Hurley, and Barbara Wilkinson. “The Effects of School-Based Writing-to-Learn Interventions on Academic Achievement: A Meta-Analysis.” Review of Educational Research 74.1 (Spring 2004): 29-58. Print.

Bayer, Cathy. “Administration’s Proposed Cuts.” Rockford Register Star 3 Mar. 2011. Web. 3 Nov. 2011. <http://www.rrstar.com/carousel/x1174964897/Rockford-School-Board-still-has-3-1-million-more-in-cuts-to-make>.

    Bazerman, Charles, Joseph Little, Lisa Bethel, Teri Chavkin, Danielle Fouquette, and Janet Garufis. Reference Guide to Writing Across the Curriculum. West Lafayette, IN: Parlor Press. Fort Collins: WAC Clearinghouse, 2005. Print.

    Bereiter, Carl, and Marlene Scardamalia. The Psychology of Written Composition. Hillsdale: Erlbaum, 1987. Print.

    Blumner, Jacob, and Pamela Childers. “Building Better Bridges: What Makes High School-College WAC Collaborations Work?” The WAC Journal 22 (2011): 91-101. Web. 11 November 2011.

    Chabot, Chris, and Warren Tomkiewicz. “Writing in the Natural Science Department.” Writing Across the Curriculum 9 (1998): 52-59. Web. 17 November 2014.

Childers, Pamela, and Michael Lowry. “Connecting Visuals to Written Text and Written Text to Visuals in Science.” Across the Disciplines 3 (2005): n. pag. Web. 7 Feb. 2011. <http://wac.colostate.edu/atd/visual/childers_lowry.cfm>.

    College Readiness Project, An Initiative of the Higher Education Coordinating Board to Define English and Science College Readiness in Washington State. Phase II (2008-09) Science Teams. Web. 17 June 2013. <https://web.archive.org/web/20130831194446/http:/collegereadinesswa.org/index.asp>

    DeBoer, George. “Scientific Literacy: Another Look at Its Historical and Contemporary Meanings and Its Relationship to Science Education Reform.” Journal of Research in Science Teaching 37.6 (2000): 582-601. Print.

Gallagher, Chris. Reclaiming Assessment: A Better Alternative to the Accountability Agenda. Portsmouth: Heinemann, 2007. Print.

    Halliday, Michael A. K., and James Martin. Writing Science: Literacy and Discursive Power. Pittsburgh: U of Pittsburgh P, 1993. Print.

    Hand, Brian, Lisa Hohenshell, and Vaughan Prain. “Examining the Effect of Multiple Writing Tasks on Year 10 Biology Students’ Understandings of Cell and Molecular Biology Concepts.” Instructional Science 35.4 (2007): 343-73. Print.

Hillocks, George, Jr. The Testing Trap: How State Assessments Control Learning. New York: Teachers College P, 2002. Print.

Hohenshell, Liesl, and Brian Hand. “Writing-to-Learn Strategies in Secondary School Cell Biology: A Mixed Method Study.” International Journal of Science Education 28.2-3 (2006): 261-89. Print.

    Huberman, Michael. “The Model of the Independent Artisan in Teachers’ Professional Relations.” Teachers’ Work: Individuals, Colleagues, and Contexts. Ed. Judith Little and Milbrey McLaughlin. New York: Teachers College P, 1993. 11-50. Print.

    Illinois Interactive Report Card. 2012. Web. 13 June 2013. <http://illinoisreportcard.com/School.aspx?schoolId=041012050250004>.

    Illinois State Board of Education (ISBE). “Illinois Learning Standards: Science.” Web. 14 March 2013. <http://www.isbe.state.il.us/ils/science/standards.htm>.

    Johannessen, Larry R., Elizabeth Kahn, and Carolyn Walter. Designing and Sequencing Prewriting Activities. Urbana: National Council of Teachers of English, 1982. Print.

Keys, Carolyn. “Revitalizing Instruction in Scientific Genres: Connecting Knowledge Production with Writing-to-Learn in Science.” Science Education 83.2 (1999): 115-30. Print.

    Lee, Okhee, Jaime Maerten-Rivera, Randall Penfield, Kathryn LeRoy, and Walter Secada. “Science Achievement of English Language Learners in Urban Elementary Schools: Results of a First-Year Professional Development Intervention.” Journal of Research in Science Teaching 45 (2008): 31-52. Print.

    McCann, Thomas M., Larry Johannessen, Elizabeth Kahn, and Joseph Flanagan. Talking in Class: Using Discussion to Enhance Teaching and Learning. Urbana: National Council of Teachers of English, 2006. Print.

McClellan, Michael, Dawn Myelle-Watson, Brad Peters, Debora Spears, and David Wellen. “Writing Science in Hard Times.” Across the Disciplines 9.3 (2012). Web. 29 May 2013. <http://wac.colostate.edu/atd/second_educ/mcclellanetal.cfm>.

Mullin, Joan A., and Pamela B. Childers. “The Natural Connection: The WAC Program and the High School Writing Center.” The Clearing House: A Journal of Educational Strategies, Issues and Ideas 69.1 (1995): 24-26. Print.

    Nagin, Carl. Because Writing Matters: Improving Student Writing in Our Schools. San Francisco: Jossey-Bass, 2003. Print.

    Peery, Angela. Writing Matters in Every Classroom. Englewood: Lead+Learn P, 2009. Print.

    Peters, Bradley. “Lessons about Writing to Learn from a University-High School Partnership.” Writing Program Administration: Journal of the Council of Writing Program Administrators 34.2 (Spring 2011): 59-88. Print.

    Petersen, Meg. “The Atomic Weight of Metaphor: Writing Poetry Across the Curriculum.” Writing Across the Curriculum 12 (2001): 97-100. Print.

    Pierce, Bettina, and David Shellhaas. “How to Write a ‘Power Conclusion.’” Irvine Unified School District. Web. 3 June 2013.

    Reeves, Douglas. Accountability in Action: A Blueprint for Learning Organizations. Englewood: Lead and Learn P, 2000. Print.

    Rivard, Leonard P. “Are Language-Based Activities in Science Effective for All Students, Including Low Achievers?” Science Education 88 (2004): 420-42. Print.

    Rivard, Leonard P., and Stanley Straw. “The Effect of Talk and Writing on Learning Science: An Exploratory Study.” Science Education 84 (2000): 566-93. Print.

    Rockford Public Schools. High School Planning Guide 2011-2012. Web. 5 Apr. 2013. <http://www3.rps205.com/District/Documents/Old%20Planning%20Guides/2011-2012_HSPlanningGuide.pdf>.

Schmoker, Mike. “The Lost Art of Teaching Soundly Structured Lessons.” Education Week, 4 June 2013. Web. 10 June 2013. <http://www.edweek.org/tm/articles/2013/06/04/fp_schmoker_lessons.html?tkn=QOTFkyJRaX74kUeRDdmIZji7rSRatnFx%2B8n7&cmp=ENL-TU-NEWS1>.

    Siskin, Leslie. “Subject Divisions.” The Subjects in Question: Departmental Organization and the High School. Eds. Leslie Siskin and Judith Little. New York: Teachers College P, 1995. 23-47. Print.

    Soliday, Mary. “Mapping Classroom Genres in a Science in Society Course.” Genre Across the Curriculum. Eds. Anne Herrington and Charles Moran. Logan: Utah State UP, 2005. 65-82. Print.

States Standards Initiative. Common Core Standards for English Language Arts and Literacy in History/Social Studies, Science, and Technical Subjects. Web. 5 Feb. 2012. <http://www.corestandards.org/ELA-Literacy>.

    Steinberg, Wendy. Statistics Alive! Los Angeles: Sage Publications, 2008. Print.

    Stretch, LoriAnn S., and Jason Osborn. “Extended Time Test Accommodation: Directions for Future Research and Practice.” Practical Assessment, Research, and Evaluation 10 (2005): 1-8. Print.

    Sunderman, Gail, James Kim, and Gary Orfield. NCLB Meets School Realities: Lessons from the Field. Thousand Oaks: Corwin P, 2005. Print.

    Wood, David, Jerome Bruner, and Gail Ross. “The Role of Tutoring in Problem Solving.” Journal of Child Psychology and Psychiatry 17 (1976): 89-100. Print.

    Zimmet, Nancy. “Engaging the Disaffected: Collaborative Writing Across the Curriculum Projects.” English Journal 90 (2000): 102-06. Print.