Citation
Students' perceptions of and responses to teacher written feedback

Material Information

Title:
Students' perceptions of and responses to teacher written feedback
Creator:
Moersen Hohman, Adria Marie
Publication Date:
Language:
English
Physical Description:
viii, 75 leaves ; 28 cm

Subjects

Subjects / Keywords:
Feedback (Psychology) ( lcsh )
Students -- Attitudes ( lcsh )
Grading and marking (Students) ( lcsh )
Students -- Rating of ( lcsh )
Feedback (Psychology) ( fast )
Grading and marking (Students) ( fast )
Students -- Attitudes ( fast )
Students -- Rating of ( fast )
Genre:
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )

Notes

Bibliography:
Includes bibliographical references (leaves 74-75).
General Note:
Department of English
Statement of Responsibility:
by Adria Marie Moersen Hohman.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
268659039 ( OCLC )
ocn268659039
Classification:
LD1193.L54 2008m H46 ( lcc )

Full Text
STUDENTS' PERCEPTIONS OF AND RESPONSES TO
TEACHER WRITTEN FEEDBACK
by
Adria Marie Moersen Hohman
B.A., Saint Olaf College, 2000
A project submitted to the
University of Colorado Denver
in partial fulfillment
of the requirements for the degree of
Master of Arts
English


The thesis for the Master of Arts
degree by
Adria Marie Moersen Hohman
has been approved
by
Hong Guang Ying
Date
Nancy F. Ciccone


Moersen Hohman, Adria Marie (M.A., English)
Students' Perceptions of and Responses to Teacher Written Feedback
Project directed by Associate Professor Joanne Addison
ABSTRACT
In this empirical research project, the researcher studied the ways in which high
school students perceive and respond to teacher written feedback. A closed-response
questionnaire, an open-response questionnaire in response to a paper evaluated by an
instructor, and analysis of the modes and forms of teachers' written feedback were all
used to gather data. Grounded theory methodology was used to code and draw
conclusions from these sources. Throughout the research, four main questions were
addressed:
1) What do students perceive to be the purpose of teacher written feedback and
how do students utilize such feedback?
2) Do students have affective responses to teacher written feedback, and how
intense are those responses?
3) What types of comments make students feel successful or unsuccessful?
4) What types of comments do students perceive as useful or not useful?


The data collected and evaluated during this project suggest that students typically
do read and reflect upon teacher written feedback, most commonly in order to
improve on future writing assignments and to create a desired effect upon an
audience. While feedback is seen as purposeful, students also demonstrated a belief
that the amount of feedback was an indicator of paper quality: the more feedback, the
lower the grade. Students indicated affective responses to teacher feedback.
Varying degrees of negative active emotions were indicated, suggesting that the type
of essay, relative weight of the essay in relation to the course grade, and students' age
may all play a role in how students emotionally react to written feedback. The results
also suggest that comments dealing with global issues garnered most of the students'
attention and that very little attention was paid to symbols placed on essays. Students
overwhelmingly indicated that praise comments helped them feel successful.
Specificity and detail, rather than a particular form or mode, most often determined
whether or not a student valued a piece of written commentary. These conclusions
are further discussed in light of practical teacher application.
This abstract accurately represents the content of the candidate's project. I
recommend its publication.
Signed


TABLE OF CONTENTS
Figures......................................................vii
Tables.......................................................viii
CHAPTER
1. INTRODUCTION..................................................1
2. REVIEW OF LITERATURE..........................................5
3. RESEARCH DESIGN AND RESULTS..................................14
Research Design..........................................14
Data Collection Methodology........................16
Data Analysis Methodology..........................19
Data Analysis: Coding of Each Data Source..........21
Results..................................................25
Results Relating to Attitudes Toward Writing.......25
Results Relating to Perceptions of Feedback........26
Results Relating to Affective Response to Feedback.30
Results Relating to Student-Identified
Portions of Teacher Commentary.....................32
4. DISCUSSION AND CONCLUSIONS...................................42
Discussion of Results....................................42
Students Use Feedback and Understand its Purposes....................42


Students Focus Primarily on Global
Concerns and Comments................................46
High School Students Respond to
Feedback Emotionally.................................47
Students are Confused by Many Comments...............50
Limitations of Research.....................................53
Implications for Future Research............................55
Implications for Classroom Practice.........................55
APPENDIX
A. Consent Form for Teacher Participants.........................59
B. Consent Form for Student Participants.........................61
C. Closed-Response Questionnaire.................................63
D. Open-Response Questionnaire...................................66
E. Results from Initial Questionnaire............................69
F. Coding sheet for teacher commentary and student
responses on open-response questionnaire.......................71
G. Coding Scheme for Teacher Comments............................72
BIBLIOGRAPHY.............................................................74
vi


LIST OF FIGURES
Figure
3.1 Results of Question 5 (Initial Questionnaire).....................28
3.2 Helpful Comments by Mode..........................................38
3.3 Unhelpful Comments by Mode........................................40
vii


LIST OF TABLES
Table
2.1 List of modes and focuses used for coding.................................6
3.1 List of emotions as characterized by type................................22
3.2 Student reasons for liking and disliking writing.........................26
3.3 Subject affective response to written feedback...........................31
3.4 Comments that made students feel successful by focus...................33
3.5 Comments that made students feel unsuccessful by focus.................36
3.6 Helpful and unhelpful comments by focus..................................39
A. 1 Student response to closed-response questionnaire........................69
A.2 Examples of teacher comments and how they were coded.....................72
viii


CHAPTER 1
INTRODUCTION
Walking up the aggressive flight of stairs, I struggled to catch my breath, my heart
pounding and my hands shaking and clammy. I wasn't out of shape; I was nervous. The
date: August 16, 2000. The location: a suburban high school. The reason: my first day
as a bona fide English teacher. Nervousness was matched with excitement as my mind
raced with ideas of lessons and units, workshops and collaborations. My students were
going to embrace the power of English; they were going to read and discuss with
enthusiasm. We would write, we would share, we would explore, we would learn
together. Images of an ideal classroom, full of students writing furiously, flashed through
my mind until the first bell startled me out of my reverie. I looked out into the classroom
and noticed blemishes that I had previously overlooked; the orange carpets were stained
with years of spilled sodas and carelessly discarded chewing gum. The desks were etched
with profanity, and the books sported sketches of drug paraphernalia. As students
poured into the room, my attention was drawn to the boy whose pants drooped around his
knees, underwear clearly showing, and the girl chatting away loudly on her cell phone
while chomping down on a cheeseburger. Three violations before class even started.
I, however, was not to be dissuaded, so I heedlessly and passionately welcomed
students to class, explaining the interactive and collaborative environment we were going
to create. Earnestly, I invited them to view themselves as writers and this classroom as a
forum for them to explore and develop their skills. After sharing with them my grand
1


scheme, I asked them to respond... and they groaned, loudly. And they continued to groan
as the semester continued, begging me to just tell them the answers to write down.
Time set aside for peer collaboration did more to promote school-wide gossip than to
improve writing. Essays were carelessly tossed together, if they were done at all.
Faced with mediocre essays, I eventually caved. I became stricter and stricter
with paper requirements. My classrooms became replicas of my high school experience;
the words "writing process" became an empty phrase, used simply to indicate that students
occasionally had to turn in more than one draft. And I labored. Basic five-paragraph
essays were taking me up to 20 minutes to grade; but no matter how extensive and
elaborate my comments were, students weren't improving. Half the time the papers were
shoved into cluttered notebooks after a quick glance at the last page for the grade. The
other students would read my comments and later, near the end of the year, comment that they
never quite understood what I was talking about. I began to realize that most of my
students weren't improving; my written feedback wasn't helping. In fact, the more I
commented on papers, the less they cared!
In reflection, I began to see that the power relationship between students and
teachers is the most overt at the moment of the grade. I, as the teacher, have been given
the authority to rank, to judge, to comment, and to, frankly, say whatever I want about a
piece of student writing. Somehow, my title has given me the right to do so without
restraint; I cannot think of another aspect of our world where criticism (perhaps without
justification) is so embraced. At the same time, I believe that evaluation is important and
2


cannot be abandoned. If classroom assignments and activities are the brick and mortar
that hold up a writing course, evaluation is the electrical wiring system. Without
light being shed on students' work, completed and in progress, both the writer and I
are lost, fumbling around in the dark as we try to take the next step. Feedback illuminates
students' strengths and weaknesses, helping us to plan the next instructional steps and
assignments. It allows students to monitor their own growth, encouraging them to reflect
on their development and set personalized goals. Without meaningful and purposeful
feedback, I might stumble across a brilliant idea once in a while, but I am equally likely to
teach an unnecessary and redundant topic, to cause student frustration, and to squelch a
student's development before he has even had a chance to take root as a writer.
As an educator, I must find a way to effectively provide students with
written feedback that will help them to improve as writers and that they will perceive as
supportive, rather than punitive. In the second chapter, I will analyze previous studies of
teacher-written feedback, noting the current beliefs of effective feedback and how my
research will attempt to add to the dialogue. In the third chapter, I will describe my
research methodology, explicate the results, and discuss the themes or trends that have
emerged. Analysis of this data will, hopefully, lead to suggested types of responses that
teachers should incorporate into written feedback; these suggestions will be presented in the final
chapter.
I hope that this study will help to provide meand other teachers of
compositionwith the strategies needed in order to accomplish this difficult mission.
3


Oftentimes, the dialogue ends when the paper is passed back: students have written,
teachers have responded, the dialogue is closed. While some students may seek out the
instructor in a writing conference, high school teaching assignments of nearly 150
students often preclude these conferences. By asking students to reflect on the written
feedback that a teacher has given them on an actual writing assignment, I hope to extend
this dialogue and give voice to their affective responses and cognitive perceptions.
4


CHAPTER 2
REVIEW OF LITERATURE
Researchers have studied teacher feedback on writing for many years. Much of
that early research focused on why extensively marking grammar does not improve
student writing and why students were more receptive to comments that were helpful as
opposed to critical (Straub, 1997; Bardine et al., 2000). One particularly notable and
often-referenced study is Twelve Readers Reading conducted by Lunsford and Straub
(1995). In order to determine what types of feedback well-informed teachers of writing
make, they asked twelve college composition teachersnotable experts in the fieldto
provide written feedback on twelve student compositions. Though these papers were not
written by the participants' students, the researchers tried to make the participant
responses more valid by providing them with both classroom and student context;
however, they also instructed the participants to create models of response (that could
potentially be used in teacher training) rather than try to re-create the comments that they
actually may have given. Results varied greatly from instructor to instructor. For
example, some averaged twelve comments per paper while others averaged thirty-five per
paper. Teachers who wrote marginal notes to students tended to provide fewer comments
than teachers who wrote letters to students; such letters also tended to provide students
with more specific details. Regardless of the form, most participants only focused on two
or three areas per paper. Additionally, there were many differences between the
5


participants' commenting styles. In order to determine the styles of each, comments were
carefully coded into categories of focuses and modes as shown in the table below.1
Table 2.1 List of modes and focuses used for coding (Straub & Lunsford, 1995, p. 159)
Focus                                Mode
Global                               Corrections
  Ideas                              Negative Evaluations
  Development                        Qualified Negative Evaluations
  Global Structure                   Imperatives
Local                                Advice
  Local Structure                    Praise
  Wording                            Indirect Requests
  Correctness                        Problem-Posing Questions
Extra-Textual                        Heuristic Questions
                                     Reflective Statements
These categories were associated with different styles of teacher commentary, and the
study results indicated that interactive comments, as opposed to authoritarian comments,
were determined to be the most favored among these readers as models of response;
however, since the context of the paper was fictionalized and participants were instructed
to re-create perfect commentary, this study provides limited insight into the actual
phenomenon of teacher written feedback. The feedback actually provided by teachers to
real students may differ greatly from the ideal feedback; additionally, students' responses
1 The details of Straub and Lunsford's coding schema are included here because this is the coding
schema that I used to analyze teacher comments in my study.
6


were not factored into the study. That is, effective comments were only judged from an
instructor's point of view, not a student's.
Other studies have also investigated teacher commentary in great depth. Connors
and Lunsford (1993) examined 3,000 papers. These essays were provided by instructors
from around the United States; the papers were assigned and evaluated by a single
instructor, unlike the out-of-context essays graded by twelve different individuals in
Straub and Lunsfords work (1995). Connors and Lunsfords examination of teacher
commentary led to some interesting results. They found that 77% of all papers had global
comments, which they defined as dealing with issues of rhetoric, structure, general success,
longitudinal writing development, mastery of conventional generic knowledge, and other
large-scale issues (p. 141). Additionally, they found a dearth of positive comments; only
9% of papers had essentially positive comments and these tended to be short. Simply
negative judgments were on 23% of papers and 42% of papers had comments that started
positively and then became negative. This study is notable, in part, because of
the large-scale nature of the research and the researchers' ability to articulate actual teacher practice,
not ideal teacher practice. The limitation is that comments were examined for only what
they were, not how students perceived or interpreted them. The researchers commented
that the judgments expressed in writing by teachers often seem to come out of some
privately held set of ideals about what good writing should look like...one of our readers
called this tacit assumption the problem of writer-based teacher response (p. 152).
7


Since we rarely seek out student responses to our commentary, it is no surprise that
teachers often fail to recognize their audience (the student) as they write.
Based upon this and other studies, general guidelines for teachers have emerged,
including an emphasis on specificity, dialogue-type responses, limiting the number of
responses, and praise (Heller, 2004; Straub, 1997; Straub, 2000; McGee, 1999; Zacharias,
2007; Bardine, 2000). Much of this research has solicited, through self-report surveys,
students' own perceptions of teacher comments: whether they find teachers'
comments useful and, if so, which types they find most and least helpful (Straub, 1997,
p. 91). However, this has not always been done within a real student-teacher
contextualized relationship. Straub's research published in 1997 asked students to
evaluate teacher comments for effectiveness. One hundred and forty-two first-year
college students served as research subjects; they were provided with a sample of a single
student paper with 40 possible teacher comments. For some questions, the subjects were
asked to simply indicate, using a four-point scale, whether or not they would prefer that
type of feedback; for other questions, the students were also asked to give a reason why
they would/would not like that type of feedback. The comments were categorized as
criticism, praise, imperative, advisory, close question, or open question; the data was
tabulated accordingly. The conclusion indicates that students appreciate specific,
constructive feedback but want to maintain control over their own writing (Straub, 1997).
Though these conclusions were valuable, the impact of student emotional response to
8


receiving feedback was not captured, as students were not asked to reflect upon/judge the
comments that they received on their own writing.
All of this research has been valuable in helping teachers to craft the most
effective feedback. However, only recently have researchers sought to truly examine two
important factors: the sociocultural context of teacher written response and the students'
personal perception of feedback. Fortunately, this has not eluded researchers' attention,
and a paper presented by Billings (1998) examines written feedback in an authentic,
genuine setting, unlike many of the studies of the past that ask students to reflect on
feedback written on other students' papers. By looking at four different instructors'
practices, Billings was able to note how teachers' experiences and beliefs influence their
comments. Most students expressed an understanding of the basic purpose of feedback (to
make it better) and appreciated both praise and specificity in comments; however, these
students also negatively assessed their instructors' comments due to handwriting,
disagreement with comments, and confusion. Through audio-taped think-alouds,
instructors also expressed negativity towards some of their own written feedback. There
was a correlation between teachers' intended focus and the composition of their
comments; for example, content was a priority that was mentioned in over 50% of
comments given; however, actual comparisons between teachers' intended focuses and
students' perceptions were not addressed.
Murphy's (2000) recent work also addresses the importance of placing the
teacher-student relationship within a sociocultural context, indicating that written
9


feedback is one of the ways in which communication is achieved, or not, and by which
knowledge is constructed, or not. This philosophy directed Murphy to carefully
examine other research, concluding that the teacher's side of the interaction was the
focus and that the student side (interpretation, reaction) has been largely overlooked (p.
81). This is an area where further research is most definitely needed.
In order to explore student perceptions within this full context, a key factor in
determining the effectiveness of written feedback, McGee (1999) developed her doctoral
dissertation to address the issue of how students respond to, interpret, and use
teacher-written comments. Her dissertation addresses a question very similar to my own:
what are students' affective responses to teacher-written comments and what accounts for
those responses? In this case study, she focused on five students of different revision
abilities, none of whom were enrolled in her classes. They were asked to keep oral and
written logs of their revisions. Students' affective responses were further explored
through interviews. She found that students did, in fact, respond emotionally to teacher
feedback and that clarity (both of course expectations and the comments themselves) was
of the utmost importance. McGee concluded that all students had some emotional reaction
to feedback, though these reactions were not strong. Through this written reflection,
students also indicated whether or not the comments were fair, what comments were
helpful in revision, and what comments were not helpful.
As a result of the recent work that has attempted to focus more extensively upon
student responses and perceptions of feedback, researchers have noticed a gap that exists
10


between teachers' intentions and the students' interpretations. Montgomery and Baker
(2007) focused their attention on this discrepancy. Students were asked to categorize how
much and what types of feedback they received on papers; additionally, teachers were
asked to categorize how much and what types of feedback they gave on papers. Results
indicated that teachers' self-perceptions were not accurate; teachers generally were giving
local feedback but believing that they were giving global feedback. Additionally, students
perceived that they were receiving much more feedback than teachers perceived giving
(Montgomery & Baker, 2007). Other work has also indicated that an abundance of
feedback could leave students depressed while little feedback was more motivating.
This response, however, was only seen through interviews; the survey results tended to
only indicate student excitement at receiving feedback (Zacharias, 2007, p. 51).
Much of the recent research on teacher-written feedback indicates a need to
continue investigations into students' responses to feedback, particularly to gauge how
emotional responses may alter the effectiveness of teacher feedback. Very little research
into teacher feedback at the high school level has been done; most research has been done
at the collegiate level or on students acquiring English as a second language. One high
school-oriented study from Bardine, Bardine, and Deegan (2000) focused on students'
perceptions of effectiveness but not on students' affective responses. Their study
focused on written feedback provided by the teacher on one set of essays and how twelve
students enrolled in a sophomore Honors English class perceived that feedback. After a
teacher graded one essay from each student, the comments were then numbered before
11


students responded to each comment on a questionnaire. Four individual interviews and
one group interview were also completed.
The focus on college-level composition classes and effectiveness rating of written
feedback has left a clear gap in empirical research, which I intend to begin closing within
my own work. More work must be done into feedback presented to high school students,
as this is a time when many students first experience the rigors of critical, academic
writing and extended written feedback from instructors. Because of the strong emotional
reactions often associated with adolescents, it is particularly important to focus upon how
students react to and perceive this feedback on their own papers, as that is where the
affective response may be the most pronounced.
Just like teachers of college composition, high school English teachers often provide
students with extensive written feedback on essays. Though goals may vary, most would
agree that the purpose is to help identify students' strengths and weaknesses so that
students may improve as writers; however, written feedback is not always as effective as
teachers would like. This may be because students misinterpret teachers' comments. Such
misinterpretation may occur because of the lack of a common vocabulary, because of the type of comments
teachers write, or because students' emotional perceptions of the feedback interfere with
their ability to explore it cognitively. My hypothesis is that comments written by teachers
in order to be helpful are often perceived by students in a much different way. This may
be due to an imbalance in power, in which a teacher's voice as the authority often
12


supersedes a student's own voice as a writer. I aim to begin answering the following
questions in relation to high school students:
What do students perceive to be the purpose of teacher written feedback? Do
students usually read and use teacher feedback according to that purpose?
Do students respond emotionally to teacher written feedback? What is the relative
intensity of those emotional reactions?
What types of comments make students feel successful or unsuccessful?
What types of comments do students perceive as useful or not useful?
Hopefully, the data that I collect will help to develop a more complete picture of the
student-paper-teacher interaction that occurs. Ideally, my conclusions will begin to
establish a grounded theory of feedback, helping teachers to understand students'
perceptions of written feedback so that teachers can be more cognizant of the
effectiveness or ineffectiveness of their own written feedback.
13


CHAPTER 3
RESEARCH DESIGN AND RESULTS
Research Design
My empirical research took place at a mid-sized suburban high school in
Colorado. Between 1600 and 1800 students attend the school in grades 9-12. Student
ages typically range from 14 to 19. During the previous school year (2006-2007), the
school's demographics were approximately 85% white and 15% minority students. The
population also has a 17% mobility rate and a 17% free and reduced lunch rate. According to
the Colorado Department of Education, this school is rated a "high" school. This rating is
based upon standardized state-administered tests as well as the ACT. Other schools in the
area range from "excellent" (the top rating) to "average."
I began by inviting all English teachers at the school to participate. One of the
selection criteria was availability, since the teachers needed to be teaching a class during
a time period that I had available; otherwise, I would not be on hand to administer the
questionnaires. A second selection criterion was teacher willingness. Writing
commentary on essays is a time-consuming and, for some, very personal process.
Participation in the study required a willingness to be subject to such scrutiny as well as a
commitment to assigning an essay and evaluating that essay before the end of the allotted
time. Pragmatically, participation in this study also required them to give up classroom
contact time, a difficult decision in late spring.
14


Three teachers responded and agreed to be participants in the study. One teacher
was a student teacher at the school and two teachers had fifteen or more years of
experience. All of the teacher participants gave their informed consent (Appendix A).
The participation of these individuals gave me access to three classes of students,
approximately 80 students. None of the classes involved students whom I was teaching,
though some of the students were in my classes during previous years. In the end, one of
the classes was an Advanced Placement course; AP courses allow students to earn college
credit during high school. The other two classes were designated Honors courses; these
courses move at a quicker pace and involve more in-depth thinking than a regular course.
At this high school, students were allowed to self-select into the AP course and were
allowed to enter the Honors course after completion of an application. According to the
teachers of these classes, student skills ranged from average to excellent, though one
teacher noted that struggles with writing were clearly evident.
Students were the primary participants in the study. Students were selected based
on the availability of the class for research (as per their teacher's permission) and their
own desire to participate in the study. No students were excluded on any basis, including
special education status, talented/gifted status, race, gender, or current grade in their
English class. No demographic data was kept on the students. All students who were in a
selected class and who completed a consent form were able to participate (Appendix B).
The primary research activities took place with students during the regular school
day and during their English class time. I began with a brief visit to each participating
15


class, outlining the research, the students' potential risks/benefits, and the details of
student participation. At that time, I also passed out the consent forms. Students were
given ample time to read the form and ask questions. Of the 80 or more invited students,
only 25 brought back a signed consent form within the time frame allowed. Fifteen of the
students were sophomores (age range: 15-16) and ten were seniors (age range: 17-19).
Data Collection Methodology
Over the course of the study, I collected data from three sources: a closed-
response student questionnaire; student papers with teacher written feedback; and an
open-response student questionnaire (based on students' responses to the teacher written
feedback on the essays indicated above).
Initially, the participants completed a closed-response questionnaire that included
fifteen items (Appendix C). This questionnaire asked students to reflect upon all of the
academic writing that they had completed in school, including previous years and in
various content area classes. Twelve of the items were presented as standard five-point
Likert items, a common technique for researchers who want to
determine participants' perceptions of or attitudes about a subject. Two of the items
asked students to indicate what they liked or disliked about writing. The 8th question
asked students to complete the sentence "When I see a lot of marks on my paper, I
automatically assume that I've done..." with one of four possible responses. This
questionnaire allowed me to gather some generalized data on students' perceptions of
16


writing and the value of written feedback. Finally, this instrument allowed me to compare
student perceptions on this closed-response, decontextualized questionnaire to the more
personal, open-response questionnaire that was administered next.
By this point, teachers had completed giving written feedback on student essays
for a particular assignment. The two 10th grade classes had just completed a relatively
extensive research paper based upon Death of a Salesman by Arthur Miller. Students
were instructed to research a current, socially relevant issue (depression and mental health,
sports and teenage athletes, the effects of birth order, etc.) and relate the issue to the
literary text. Through this assignment, students were expected to demonstrate an
understanding of the play, an understanding of the social issue, the proper use of MLA
citation, and the ability to formulate and support a thesis. This assignment would be a
significant portion of the students' overall course grades, and nearly a month was spent on
its completion. The essays were evaluated on a teacher-designed rubric. The 12th grade
class had just completed an in-class essay written to an Advanced Placement-type prompt.
Here, students were given a passage and asked to analyze it carefully, examining how the
author reveals the values of the characters and the nature of their society. This was an in-
class, hand-written essay, similar to those completed by the students throughout the school
year. This essay only made up a small portion of the students overall grades in the
course. The essays were evaluated using a standard AP Rubric (9 points).
The teachers evaluated and gave written feedback on these essays as usual.
When teachers completed feedback on one class set of essays, they gave them to me so
17


that I could make copies and code the essays according to the same anonymous
identification number placed on the closed-response questionnaire. The 10th grade essays
were copied without the grades and accompanying rubrics; this was at the teachers'
request, since they would not be returning the remainder of the essays (for students who
were not participating in the study) until later. The 12th grade essays did have scores on
them.
After I had copied and placed identification numbers on the essays, I met with the
student participants without their classroom teacher present. Such teacher presence might
have created apprehension or prevented the students from being as open and forthright
with their responses. I then explained the second, open-response questionnaire (Appendix
D). This questionnaire is modeled after one used by S.J. McGee in her doctoral
dissertation (1999). The third section of the questionnaire relates specifically to students'
affective responses to the feedback that they received; here, the list of adjectives used and
format for student self-reporting of emotions comes from the Brand Emotions Scale for
Writers (BESW) as presented in The Psychology of Writing: The Affective
Experience (Brand, 1989).2 McGee further adapted the BESW in her own work,
removing emotions that seemed inappropriate in the particular context of her research (e.g. affectionate).
2 Brand created this measurement tool after reviewing a number of other measures of emotions.
Her process led her to choose the Differential Emotions scale of C.E. Izard and others since it had
the greatest potential for adaptation to discursive tasks (p. 63). She adapted this scale through a
series of refining processes, including a trial with students; feedback was also gathered from a
group of psychologists and writing specialists. The result was the Brand Emotions Scale for
Writers (BESW), a list of twenty single-word adjectives...provided with a five-step unidirectional
scale (p. 68).
18


With the exception of this third section, the rest of the questionnaire
elicited open-ended responses in response to a specific essay that they had written. They
had not, however, seen the teacher written commentary prior to this point.
I briefly modeled completion of the questionnaire on the overhead projector using
one of my graduate literature papers; however, I had blacked out the content of the teacher
comments so as not to influence students' perceptions of their own papers. Students
were given as long as they needed to complete the questionnaire. The 12th grade
respondents took much less time (approximately 15 minutes) while the 10th grade students
took anywhere from twenty to forty-five minutes to complete the questionnaire. The
seventh section of the questionnaire was removed from one of the classes' packets at the
teacher's request, so no data was analyzed from that section for any of the students.
Data Analysis Methodology:
A Grounded Theory Approach
My qualitative research study followed the tenets of grounded theory research as
established by Strauss and Corbin (1987) and explicated by Creswell (1998). The social
interaction that occurs between teachers and students via the written word (that is,
students' writing and teachers' comments upon the writing) is complex and multi-layered.
Since grounded theory methodology emphasizes the need for developing many concepts
and their linkages in order to capture a great deal of the variation that characterize the
central phenomena studied (p. 7), it is ideal for my project. Only a methodology that
19


allows for the collection of multiple forms of data that are gathered within the appropriate
context can begin to unveil the perceptions and responses that are intrinsic to this
interaction.
Grounded theory methodology is based upon the process of constant comparative
coding, which I used in order to help me move towards the discovery of core categories
and towards the integration of ideas (p. 55). The initial open coding of the students'
questionnaires and the teachers' written feedback allowed me to look for and identify
patterns or trends that emerged. Then, I moved on to the more focused and deliberate step
of axial coding, attaching information to the axis of a larger category (p. 65). This
typically occurred as I took categories from one data source (for example, the open-
response questionnaire) and compared them with categories from another data source (for
example, the teacher written commentary or the closed-response questionnaire). To give
me guidance and to provide me with ideas for coding schema, I also looked to the coding
methodologies of other researchers (Bardine et al., 2000; McGee, 1999; Straub &
Lunsford, 1995). Memoing and constant note-taking also became a vital part of this
process, as I was constantly taking notes on additional connections or comparisons as
well as considering more effective procedural methods. Throughout this detailed coding
process, I focused on thinking comparatively (Strauss & Corbin, 1987, p. 243), a
particularly important task since I was integrating so many different forms of data: a
closed-response questionnaire, an open-response questionnaire, and students' papers with
teacher commentary. In the end, I formed some initial propositions, which are elaborated
20


upon in the conclusion section of this paper, and hope to have started the process of
developing a theory of written feedback grounded in data collected from student and
teacher perceptions.
Data Analysis:
Coding of Each Data Source
Analysis of Closed-Response Questionnaire
I began my analysis through a careful examination of the closed-response
questionnaire. Initially, I used open coding to examine each of the questions. Entering
the responses from the Likert items into a data table and averaging the results gave me a
sense of the group's overall response to the questions (see Appendix E). To create a
clearer picture of student response, I created pie charts that demonstrate the frequency of
each response. Most questions from this measurement tool can be classified into three
categories: attitude towards writing, views of written feedback and grading, and the
balance of power in writing. Results from questions addressing the first two categories
are explicated in more detail below, as these results were later found to be the most
relevant to answering my research questions.
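To make this tabulation step concrete, the short Python sketch below shows how a single Likert item could be averaged and its response frequencies counted for a pie chart. The ratings and the function name are hypothetical illustrations of the procedure described above, not the actual study data.

    # Minimal sketch of the Likert tabulation described above (hypothetical ratings).
    from collections import Counter

    def summarize_likert(ratings):
        """Return the mean rating and the frequency of each response on the 1-5 scale."""
        mean = sum(ratings) / len(ratings)
        frequencies = Counter(ratings)  # frequencies provide the basis for the pie charts
        return mean, frequencies

    # Hypothetical ratings for one five-point item
    item_ratings = [3, 4, 5, 3, 2, 4, 3, 4, 3, 5, 3, 4, 2, 4, 3]
    mean, freqs = summarize_likert(item_ratings)
    print(round(mean, 2), dict(freqs))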
Analysis of Open-Response Questionnaire
The second questionnaire required three different approaches for analysis since
there were three types of questions. Sections I, II, V, and VI all focused upon open
student responses. Students' words were entered into a data table. Then, I began an open
21


coding process, which was later focused into axial coding as I attached students' responses
to different categories and trends.
The third section, which focused on students affective responses to written
feedback, required a different approach. These questions were originally taken from the
work of Alice Gladen Brand (1989). She used the categories put forth by Davitz, a fellow
psychologist, to divide the emotions into three categories: positive, negative active and
negative passive. Following the model of both Brand and McGee, I too divided the
emotions into those categories as shown in the chart below.
Table 3.1 List of emotions as characterized by type
Positive: Adventurous, Excited, Happy, Inspired, Interested, Relieved, Satisfied, Surprised3
Negative Active: Afraid, Angry, Anxious, Disgusted, Frustrated, Surprised
Negative Passive: Bored, Confused, Depressed
Since students had rated each emotion on a scale of one to five (with 1 indicating that
they did not feel an emotion at all and a 5 indicating that they very strongly felt an
emotion), it was prudent to determine the mean of all student responses for each
3 Surprised has been placed in two categories (positive and negative active) as per McGee's
suggestion that surprised can be both a positive and negative reaction (p. 49).
22


emotion. I initially did so by averaging the 10th grade and 12th grade responses together;
however, I noticed discrepancies in the strengths and types of emotions indicated by each
group. Thus, I also completed data analysis for the two grade-level groups separately.
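As an illustration of this grouping step, the sketch below averages emotion ratings separately for each grade-level group and within each category of emotion. The category lists mirror Table 3.1, but the ratings and variable names are hypothetical placeholders rather than the data reported in Table 3.3.

    # Sketch of averaging emotion ratings by grade-level group (hypothetical data).
    CATEGORIES = {
        "positive": ["adventurous", "excited", "happy", "inspired",
                     "interested", "relieved", "satisfied", "surprised"],
        "negative active": ["afraid", "angry", "anxious", "disgusted",
                            "frustrated", "surprised"],
        "negative passive": ["bored", "confused", "depressed"],
    }

    def emotion_means(ratings_by_emotion):
        """ratings_by_emotion maps each emotion to the 1-5 ratings given by one grade group."""
        return {e: sum(r) / len(r) for e, r in ratings_by_emotion.items()}

    def category_mean(means, emotions):
        """Average the per-emotion means that fall within one category of emotion."""
        present = [means[e] for e in emotions if e in means]
        return sum(present) / len(present)

    # Hypothetical ratings for a few emotions from each grade group
    tenth = {"anxious": [4, 3, 5, 2], "happy": [2, 1, 3, 2], "bored": [1, 2, 1, 1]}
    twelfth = {"anxious": [1, 2, 1, 1], "happy": [3, 3, 2, 4], "bored": [2, 2, 1, 3]}
    for group in (tenth, twelfth):
        means = emotion_means(group)
        print(means, category_mean(means, CATEGORIES["negative active"]))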
Section IV of the questionnaire presented a different challenge. Each question in
this section included two elements: student-identified teacher written feedback and the
student response to such feedback. Coding teacher comments was initially difficult, as
they could be categorized in nearly countless ways: location, form, length, content,
purpose, tone, etc. Since the foundation of my research was based, in part, on the work of
S.J. McGee in her doctoral dissertation, I looked to her work for guidance. She utilized
the work that Straub and Lunsford completed in Twelve Readers Reading: Responding to
College Student Writing. In this work, comments were classified twice: once into a
category of focus and a second time into a category of mode. By their own definition,
focus identifies what a comment refers to in the writing, for instance, whether the
comment mainly addresses the writer's wording, organization or ideas, while mode
allows us to examine how the comment is made...[and] characterizes the image a teacher
creates for herself and the degree of control she exerts, through that comment, over the
writing (Straub & Lunsford, 1995, p. 158).
As I began to more thoroughly investigate the concepts of focus and mode, I
found them to be beneficial in my analysis of teacher comments. Straub and Lunsford
provided excellent examples of teacher commentary that fit each of the above modes and
focuses. I was careful to follow their model as I coded (see Appendix F for an example of
23


the coding sheet). Appendix G outlines each of the modes and focuses, examples from the
Straub and Lunsford text, and examples from the essays that I analyzed. I also included a
column which indicates the specific code I used. At times, students would mark a teacher
comment that fit more than one of the modes or focuses; this was most common when
students would refer to a teacher's longer end-note. I then divided the comment into
smaller sections in order to properly code it. Additionally, I did not code every comment
on an essay. Straub, Lunsford, and McGee evaluated every teacher comment placed upon
an essay and were, therefore, able to gain an overall picture of a teacher's response style
and, in McGee's case, the students' responses to such feedback. On the other hand, I
limited my coding to those comments that students marked (as per the questionnaire
instructions) and/or specifically referred to in their responses. This seemed more
appropriate, as I was specifically working to link student perception to teacher comments.
Thus, those comments without corresponding student response were not helpful in
answering my research questions.
Coding the student reactions to the teacher written feedback was the next logical
course of action. I read and re-read the student answers on the questionnaire, looking for
any patterns and trends that might emerge. My initial coding was very broad and quite
messy; when I started comparing the types of responses to the teachers' comments,
however, I was able to gain clarity and start focusing the responses into groups and sub-
groups. Finally, I developed a coding schema and applied it to the student responses.
24


Since many responses were multiple sentences or dealt with more than one reason,
responses were placed into as many categories as appropriate.
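The tallying of these multi-category responses can be pictured with the brief sketch below; the code labels and sample responses are hypothetical stand-ins for the actual schema, included only to show how one response can count toward more than one category.

    # Sketch of tallying coded student responses, where one response may carry several codes.
    from collections import Counter

    coded_responses = [
        {"praise noted", "wants more specificity"},  # one response, two codes (hypothetical labels)
        {"confused by comment"},
        {"praise noted"},
    ]

    tally = Counter(code for codes in coded_responses for code in codes)
    print(tally.most_common())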
Results
The coding and tabulating of responses created a great deal of data for analysis. Below, I
have explicated some of the data that, in the end, proved to be germane to my primary
research questions. Discussion of the results can be found in the final chapter.
Results Relating to Attitudes Toward Writing
Subjects involved in this research study demonstrated generally positive attitudes
towards writing. Thirty-three percent of the students indicated that they "really liked" or
"liked" writing for school; the other 67% indicated that it was "ok," and none of the
students marked a 1 ("really dislike it") or 2 ("dislike it"). The second and third questions
provided students with ten sentences relating to their reasons for liking or disliking
writing. Students were instructed to mark all of the possible options that they felt applied
to them. The table below shows the percentage of students who chose each sentence.
A majority of students (80%) marked that they liked writing because it prepared
them with skills they would need later in life. Only 7% of the students liked writing
because it would improve their grade, whereas 87% indicated worry about a bad grade
being one reason why they disliked writing.
25


Table 3.2 Students' reasons for liking and disliking writing
Statement about liking writing Percentage of students who chose this statement
I always do well, so it's a chance to improve my grade. 7%
It's a way to learn. 53%
It's a chance to express my feelings and opinions. 67%
It's a chance to learn more about the world. 67%
I know that I will need the skills later in life, so I need to practice them now. 80%
Statement about disliking writing Percentage of students who chose this statement
It takes a lot of time and effort. 53%
I'm not always sure what to do. 67%
I am often worried about getting a bad grade. 87%
I am often unsure what the teacher wants. 67%
I don't usually have the chance to express my own ideas. 33%
In regards to the scoring of papers, students also responded to the question "Does
being graded help you work to improve your writing the next time?" The mean of
students' responses was 3.33, similar to that of other questions. One student strongly
disagreed with the statement, and the number of students who responded neutrally reached
only 27%, matching the number of students who responded negatively.
Results Relating to Perceptions of Feedback
Students provided responses on teachers' feedback on the initial questionnaire,
which had the following statement: "When I see a lot of marks on my paper, I
26


automatically assume that I've done..." Students had four options to choose in response:
a) "poorly"; b) "average"; c) "well"; and d) "the number of marks doesn't reflect how well I did."
Nearly half (47%) of respondents completed the sentence with the word "poorly,"
demonstrating a belief that the number of comments equated to the number of errors and,
thus, a poor grade. The remaining student responses were split between "average" (27%)
and "the number of marks doesn't reflect how well I did" (27%). No one indicated that
a plethora of marks equated to a good score.
Another question on feedback asked, "Do you find teachers' comments on writing
assignments/essays helpful?" Students responded on a five-point scale, with a 5
representing students' belief that comments were "very helpful" and a 1 representing
the belief that comments were "not helpful at all." The mean response was a 3.4,
indicative of a generally positive response (see Figure 3.1 below). Responses indicate that
nearly half (47%) of respondents agreed or strongly agreed that teachers' comments were
helpful, forty percent of respondents were neutral, and only 13% disagreed and found that
responses were unhelpful. No student marked "strongly disagree."
27


Figure 3.1 Results of Question 5 (Initial Questionnaire): a pie chart of responses to "Do you find
teachers' comments on writing assignments helpful?" on a scale from 1 (Strongly Disagree) to 5 (Strongly Agree)
In the open-response questionnaire, students were instructed to respond to the question
"Do you usually read the comments that this instructor puts on your paper? Why or why
not?" Responses were initially coded into three categories: students who usually or
always read comments, students who sometimes read comments, and students who rarely
or never read comments. Interestingly, 88% of the respondents indicated that they usually
or always read the comments, 8% indicated that they sometimes read the comments, and
only one student (4% of the total) stated that he/she did not usually read comments.
In addition to asking students whether or not they read teacher written
commentary, the open-response questionnaire also asked students about the relationship
between the amount of commentary and the grade earned. They were asked to look at the
28


number of comments and consider if there were more, the same, or fewer than they were
accustomed to. As a follow-up, the questionnaire asked them, What does this make you
think about your paper? Every 10th grade student who believed that he/she had received
fewer comments than usual also indicated a belief that he/she did better on this essay than
on previous ones. One student wrote, "This [fewer comments] makes me think my paper is
somewhat good. It shows I am improving as a writer." Of the three 10th grade students
who received more teacher comments, two students expressed a belief that the essay was
poor or that the teacher "did not like my paper." Interestingly, one of the three students had
marked on the initial, closed-response questionnaire that the number of comments was not
a reflection of the paper's quality; however, when reflecting on his/her own paper, he/she
responded that, "judging by their length and location...[the paper] wasn't very good."
A large number of students indicated that a certain inevitability existed in regards
to comments and/or their scores. A majority of the students (65%) indicated that they
received the same number of comments that they were used to. For one-third of those
students, the same number of comments indicated a poor essay or one that still had much
to improve upon, perhaps indicating that they usually received self-perceived poor
scores. For another third (5 out of 16), this same number of comments led them to believe
that they would receive the same grade. A few even expressed this as a statement of
inevitability: "I think that I got the grade I usually get" and "Less comments would
usually mean a better score, but I always get the same scores and the same amount of
comments."
29


Results Relating to Affective Responses to Feedback
Students' affective responses to written feedback on an essay were evaluated in a
global sense; that is, they were asked how they felt after reading all of the comments on a
paper. A modified version of the Brand Emotions Scale for Writers was used to begin
this evaluation. This is a five-point scale, which uses 5 to represent very strong
emotions and 1 to represent no emotional response. Table 3.3 shows the emotions
rated, the means of the scores given by students, and the average of the means for each
category of emotion (positive, negative active, and negative passive). Under each
category, emotions are ranked alphabetically.
The most obvious discrepancy between the grade levels was in the area of
negative active emotions. The 10th grade students showed strong negative active
responses, including a high mean score of 3.27 for anxiety. One-third (five out of 15) of
the 10th grade students marked a 4 or 5, indicating strong anxious feelings. The 12th
grade students, conversely, had a relatively low response for anxiety at just 1.50 on
average. Only one of the ten students marked a 4 or a 5. Eighty percent marked a
1, indicating that they did not feel anxious at all.
30


Table 3.3 Subject affective responses to written feedback
Emotion Mean: 10th Grade Mean: 12th Grade
Positive 2.07 2.06
Adventurous 1.40 1.30
Excited 2.20 1.90
Happy 1.80 2.80
Inspired 1.40 1.70
Interested 2.87 2.40
Relieved 2.20 1.90
Satisfied 2.00 2.70
Surprised 2.67 1.80
Negative Active 2.46 1.38
Afraid 2.67 1.10
Angry 2.20 1.20
Anxious 3.27 1.50
Disgusted 1.33 1.00
Frustrated 2.60 1.70
Surprised 2.67 1.80
Negative Passive 1.91 1.80
Bored 1.47 1.80
Confused 2.40 2.50
Depressed 1.87 1.10
31


Results Relating to Student-Identified
Portions of Teacher Commentary
In addition to eliciting a more global response to feedback in the previous
section, students were also asked to identify and comment upon specific portions of
feedback. The four categories that students placed comments into were
1) those that made them feel successful; 2) those that made them feel unsuccessful; 3)
those that were helpful; and 4) those that were unhelpful.
Successful
A total of 23 comments were indicated as making students feel successful as
writers. Two students did not respond: one who marked "N/A" and the other who wrote,
"cannot find one, there is one that makes me feel good, but not successful." The
comments were first coded into categories of focus. Of the 23 comments, 18 comments
were global in nature. They were split relatively evenly among the three types of global
comments: six related to ideas, seven related to development, and five related to global
structure. The remaining five comments that were not classified as having a global focus
32


Table 3.4 Comments that made students feel successful by focus
Focus Number of successful comments (out of 23) Percentage of successful comments
Global (includes ideas, development, global structure) 18 78%
Ideas 6 26%
Development 7 30%
Global Structure 5 22%
Local (includes local structure, wording, and correctness) 0 0%
Local Structure 0 0%
Wording 0 0%
Correctness 0 0%
Extra-Textual 5 22%
were extra-textual and provided a comment about the writing as a whole. Not a single
comment that focused on local structure, wording, or correctness was identified as
making students feel successful.
Secondly, the comments that created feelings of success were coded into
categories of mode. Ninety-six percent of them (22 out of 23) were in the mode of praise.
Such comments ranged from the very short ("good connection") to more extensive ("This
is good. You demonstrate a good understanding of the passage and you do it in clear
writing."). The one student who did not indicate a praise comment chose one that could
be categorized in the negative evaluation mode. The teacher wrote, "This paper would be
33


'A' material if you had used more evidence toward the end." The student responded to
this in a positive manner, writing that it showed her the potential her paper had. Two of
the comments were found to have two modes. Both started with praise before changing
direction to an imperative or negative evaluative statement. The first such mixed comment
ended with "you need more introduction," a clear imperative statement. The second
ended with "this is really where your paper should have focused," a statement of negative
evaluation. This student found the mixing of praise and negative evaluation to be
disheartening, explaining,
This comment is the only compliment on my paper. It makes me feel that I, at
least, had something good in my paper. However, even that was ruined because
the teacher said my whole paper should have been like that paragraph.
Clearly, this student's understanding of a comment as making him/her feel successful is
tainted by a perception of the other comments in the essay. Other students echoed the first
portion of this students statement, indicating that their chosen comment only made them
feel successful in light of the perceived negativity of the other comments on the paper.
Five of the 23 students mentioned the overall sense of criticism and negativity in
relation to their choice. In addition to the response quoted immediately above, their
responses included the following phrases:
"This comment is the only compliment on my paper."
"It makes me feel successful because it does not have any criticism or negative
feedback, so I feel that I have finally written an entire decent paragraph with little
to no mistakes" (emphasis theirs)
"It is a positive comment instead of the usual negative ones."
"I chose this comment because, really, it is the only positive comment... Every
other comment seems to be putting down almost every aspect of my paper."
The responses of the other 18 students who identified teacher commentary that made them feel
successful typically related only to ideas of feeling smart, doing good things,
receiving compliments, and feeling like they had potential.
The average length of these comments was 6.17 words. When the sections of the
teacher response that were categorized as being in either the imperative or negative
evaluation modes were removed from analysis, the average length of teacher commentary
dropped from 6.17 words to 5.00 words per comment.
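To make the arithmetic behind these word-count averages concrete, the sketch below shows one way comment lengths could be tallied and how removing the imperative or negative-evaluation portion of a mixed comment lowers the average. It is an illustration only, written in Python for this write-up; the comment texts and the helper name average_length are hypothetical stand-ins, not data or procedures from this study.

```python
# Hypothetical illustration: average comment length in words, with and without
# the imperative/negative-evaluation portions of mixed comments.

def average_length(comments):
    """Mean number of whitespace-separated words per comment."""
    counts = [len(comment.split()) for comment in comments]
    return sum(counts) / len(counts)

# Stand-in praise comments (not actual teacher feedback from the study).
praise_only = [
    "good connection",
    "This is good. You demonstrate a good understanding of the passage.",
]

# A mixed comment: a praise portion followed by an imperative portion.
mixed = ("Nice start here,", "but you need more introduction")

full_comments = praise_only + [" ".join(mixed)]
trimmed_comments = praise_only + [mixed[0]]  # imperative portion removed

print(f"Average length (full comments): {average_length(full_comments):.2f} words")
print(f"Average length (praise portions only): {average_length(trimmed_comments):.2f} words")
```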
Unsuccessful
A total of 20 comments were indicated as making students feel unsuccessful as
writers. Five students did not respond, and one student wrote a response without
identifying a comment. The comments were first coded into categories of focus; the table
below shows the distribution of comments. The majority of comments (65%) were
global in nature, and the majority of those were in regard to development.
Table 3.5 shows the percentage of each teacher comment as coded by focus.
Secondly, twenty-three comments were classified into modes. This number differs
from the number classified into focus because some comments were determined to be in
more than one mode and some comments could not be classified because they were only
grades/rankings. Of the twenty-three analyzed, seven (30%) were negative evaluations, six
(26%) were problem-posing questions, and five (22%) were imperatives. There were also
two examples of praise and one example each of advice, heuristic questioning, and
reflective statement. Both of the praise comments were linked to other modes: one was
paired with an imperative statement and the other was paired with a negative evaluation.
The average length of each comment was 8.35 words.
Table 3.5 Comments that made students feel unsuccessful by focus
Focus Number of unsuccessful comments (out of 20) Percentage of unsuccessful comments
Global (includes ideas, development, global structure) 13 65%
Ideas 1 5%
Development 8 40%
Global Structure 4 20%
Local (includes local structure, wording, and correctness) 4 20%
Local Structure 2 10%
Wording 0 0%
Correctness 2 10%
Extra-Textual 3 15%
Helpful
Every student responded to the question asking about helpful feedback. In total,
students identified thirty-seven comments as being helpful to them as writers. A number
of identified comments were extensive end-notes, so I further divided them by focus. In
all, there were forty-seven focus-specific comments identified in this category. According
to students' responses, the most helpful comments were usually associated with global
issues, totaling 68%. Comments referring to correctness (19%) were the most prevalent of
those classified as local concerns4.
Teacher comments were also analyzed by mode. Figure 3.2 below shows the
number of comments classified by mode. The modes of the feedback identified were
varied; however, comments in the imperative mode were the most common (43%). No
problem-posing questions were identified, and only one comment dealing specifically
with correctness was identified. Of the many subtypes of reflective comments, all three of
the comments indicated as helpful were explanatory in nature.
Figure 3.2 HELPFUL COMMENTS BY MODE (bar chart of the number of "helpful" comments identified in each mode: reflective statements, heuristic questions, problem-posing questions, indirect requests, praise, advice, imperatives, qualified negative evaluation, negative evaluations, correctness)
Students' rationale for choosing these comments was also analyzed through
careful re-reading and coding. Most commonly, the explanations by students mention one
of two reasons. Fourteen of the responses indicate that they appreciated the teacher's
specificity ("shows me exactly where I need to add something in"), detail, or clarity ("This
comment is helpful because it makes me realize that I must be more clear with my writing,
and after reading the comment, I can now clearly see my mistake"). Five of the students
also referred to the helpfulness of teachers summing up the previous comments, usually in
an end note ("I appreciate the comment at the end of the paper. It summarized what I did
right and what I should try to work on"). When divided into focuses, the average length
of each helpful comment is 9.2 words. When divided by form (that is, treating an end-note
paragraph as one unit), the average length of each comment is 11.7 words.
4 Table 3.6 provides a comparison of helpful and unhelpful comments by focus.
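As a rough illustration of how the focus and mode tallies behind Table 3.6 and Figure 3.2 could be produced, the following sketch counts coded comments and converts the counts to percentages. The focus and mode abbreviations follow the schema in Appendix G, but the coded records themselves are hypothetical examples, not the study's actual responses.

```python
from collections import Counter

# Hypothetical coded comments; "focus" and "mode" use the Appendix G codes.
helpful_comments = [
    {"focus": "GD", "mode": "IMP"},  # development, imperative
    {"focus": "GI", "mode": "PR"},   # ideas, praise
    {"focus": "GS", "mode": "IMP"},  # global structure, imperative
    {"focus": "LC", "mode": "COR"},  # correctness, correction
    {"focus": "GD", "mode": "REF"},  # development, reflective statement
]

GLOBAL_FOCUSES = {"GI", "GD", "GS"}  # ideas, development, global structure

total = len(helpful_comments)
focus_counts = Counter(c["focus"] for c in helpful_comments)
mode_counts = Counter(c["mode"] for c in helpful_comments)

global_count = sum(n for focus, n in focus_counts.items() if focus in GLOBAL_FOCUSES)
print(f"Global focus: {global_count} of {total} ({100 * global_count / total:.0f}%)")

for mode, n in mode_counts.most_common():
    print(f"Mode {mode}: {n} ({100 * n / total:.0f}%)")
```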
Not helpful
The twenty-five participating students identified thirty comments as being
unhelpful to them as writers. Table 3.6 shows both the helpful and unhelpful
comments as broken down by focus. According to students' responses, the least helpful
comments were usually associated with global issues: a total of 53% of the identified
comments related to global concerns, and 56% of those were in regard to ideas.
Seventeen percent of the unhelpful comments were focused upon wording; none of the
helpful comments had this same focus.
Table 3.6 Helpful and unhelpful comments by focus
Focus Number of helpful comments (out of 47) Percentage of helpful comments Number of unhelpful comments (out of 30) Percentage of unhelpful comments
Global (ideas, development, global structure) 32 68% 16 53%
Ideas 8 17% 9 30%
Development 15 32% 5 17%
Global Structure 9 19% 1 3%
Local (local structure, wording, and correctness) 15 32% 12 40%
Local Structure 6 13% 3 10%
Wording 0 0% 5 17%
Correctness 9 19% 4 13%
Extra-Textual 0 0% 3 10%
The modes of the feedback identified were varied; however, comments in the
negative evaluative mode were the most common, making up 30% of the comments
identified by students as being unhelpful. Twenty-three percent of comments labeled as
unhelpful were problem-posing questions. Figure 3.3 shows the number of comments
classified by mode.
Figure 3.3 UNHELPFUL COMMENTS BY MODE
Two comments that fell under the mode of praise were also listed. The comments were
exactly the same on both papers: "This is very good." The written responses explain why
the students felt the praise was not helpful:
"Duh! That's why it's a [grade], tell me how to do better not how I did."
"I did not find it helpful because my paper couldn't have been that good if I only
got a [grade]."
Students' reasons were analyzed through careful re-reading and coding. Most commonly,
the explanations by students refer to their own confusion. Responses were determined to
indicate confusion if they included phrases such as "I don't know," referenced situations
that they saw as contradictory, or mentioned wanting more information on how to make
a change. Nineteen of the responses (63%) indicate that students are confused.
Sometimes the confusion related to the meaning of the comment itself ("My teacher wrote
this above a direct quote I put in from one of my sources. I don't know what it means").
Other times, the confusion related to students' inability to understand how to fix the
identified problem ("There was already another comment just like this one and I can't fix
the problem."). Another common thread focused on students' desire for more information
so that they can better understand the teacher's rationale and learn how to write more
effectively in the future. The average length of an unhelpful comment is 5.9 words5.
5 One comment was simply a circle, so that symbol was not figured into the average length.
CHAPTER 4
DISCUSSION AND CONCLUSIONS
Discussion of Results
Themes and trends began to emerge as results gleaned from the different
measurement tools were compared and seen in conjunction with one another. For
example, the closed-response questionnaire initially seemed useless. The limited response
choices for answers provided little insight into students' rationale or perceptions. I found,
however, that these responses became meaningful when compared to the results of the
open-response questionnaire. By comparing a single student's responses on the first
questionnaire (which was out of context as well as general in nature) to the same student's
responses on the second questionnaire (which was relevant to a particular context), I was
able to see how contextual changes may affect student perceptions. Through this constant
comparative analysis, the following themes emerged.
Students use feedback and understand its purposes
An important concern for teachers is whether or not students read teacher written
commentary on essays; nearly every high school teacher, myself included, can recount an
instance in which a student flipped to the last page, looked at the grade, crumpled the
essay up, and flippantly tossed it in the trash on the way out the door. If students aren't
reading commentary, why do we write it? And obviously, if they aren't reading it, their
affective responses cannot be evaluated. Equally important is students' understanding of
the purpose of feedback. Do they perceive it as a justification of a grade, a means to
improve their future writing, or as a way of simply criticizing?
In this study, most of the students indicated that they valued teacher feedback and
were able to articulate a clear rationale for doing so. Their responses can be associated with
four main trends.
1) Students look to feedback to improve future writing assignments. Beyond
improvement of grades, a number of students indicated that they read comments in order
to help them improve as writers on future drafts or on future assignments. Any reference
to needing the teacher's help or assistance was determined to be an improvement-oriented
response. The open-response questionnaire indicated that 80% (16 out of 25) of students
believe that future improvement was a motivating factor in reading teacher feedback on
their essays. These responses were given in reference to essays that the students knew they
would not be able to revise; interestingly, students still showed an interest in improvement
overall and in writing as a craft.
2) Students use written feedback to gauge audience response. These comments
indicate that students were aware of and interested in their teachers' responses as
individuals. Responses that indicated a respect for the instructor as a writer or as an
educator were also included in this category. Other responses referred to what a teacher
may have liked or disliked. Examples from student questionnaires include:
"I like to know what they [the instructors] thought. I like to compare my thoughts
on my paper to theirs."
"Yes, because I am always curious about if my writing gets the point across that I
intend."
Ten of the students included these audience awareness comments in their responses.
Many of them did not indicate why they valued teachers' opinions and thoughts, however.
It may have been out of a true concern for their writing's effect on an audience; it may
have had solely to do with an attempt to please the teacher as a grade-giver. The first, of
course, is a highly valued rhetorical skill: all writers understand the importance of
appealing to a particular audience for a purpose. The second, however, brings to the
forefront questions of power and relationships in composition.
3) Few students use written feedback to understand the teacher's justification for
a grade. Teachers often write commentary as a justification for giving a student a
particular grade, but only a small number of students' responses reflected that paradigm as
a primary reason for why students read commentary. Only five of the students (25%)
indicated grade justification as a reason for reading comments. Interestingly, the student
who indicated that he/she did not usually read comments related his/her rationale directly
to grading: "No, not usually. If I did better work than I got a grade for, then I read the
comment. Also, if the comment is short." Of the two respondents who sometimes read
teacher comments, one student indicated that the grade received was the primary
motivating factor: "Sometimes I read the comments. If I did badly then I read to see what
I did wrong, and if I did well then I usually just skim over the comments." Of the students
who usually or always read comments, only 14% mentioned grades or scores directly as a
reason for doing so. This does not necessarily indicate that students no longer care about
grades; however, more global concerns (effect on an audience, improvement of writing
skills) seem to be at the forefront of students' minds.
4) Students associate the amount of written feedback with failure. Despite some
clear attempts by students to use written feedback as a means of future improvement, and
despite evidence that students may not often read comments to find a justification for a
grade, there is also evidence to suggest that students find extensive comments on a paper
to be evidence of failure. This is, perhaps, one of the most contradictory elements of
students' responses during this research study; however, it is a belief that both students
and teachers of composition have long held to be true: the more extensive the comments,
the worse the grade.
On the first, out-of-context questionnaire, 27% of students indicated that the
number of comments is not an indicator of doing poorly. On the second, in-context
questionnaire, the portion drops to 13%. This may be a case of a student's analysis of a
hypothetical situation contradicting an analysis of a tangible, personal situation.
Only one student's written response seems to fully espouse the belief that commentary is
not indicative of poor work:
There are more comments on my paper than I am used to receiving, but I like
more comments. More comments let me know what the teacher is thinking. The
comments always make me stop, just for a second, but after I read them I know
why they were placed there.
This response, though notable, still does not get to the issue of whether or not the number
of comments was an indicator of an unsuccessful essay. Instead, the student focused on
the positive purpose of comments and the value of the audience response.
Students Focus Primarily on Global Concerns and Comments
Even though the data indicates that students read and value written commentary,
there is also strong evidence to suggest that the focus of the comment, more so than the
mode, plays a role in how carefully or attentively a student examines a comment.
Nearly every English teacher is well-versed in the codes of editing and grading. An "RO"
indicates a run-on, an "AWK" indicates an awkward phrase or construction, and a
"FRAG" indicates a fragment. A plethora of editing symbols are also used, to represent
everything from adding a comma to starting a new paragraph. For many teachers, such
comments are a mainstay of the feedback given to students. Typically, they would be
classified as local feedback, under the subcategories of wording or
correctness.
Nearly all students ignored such markings when responding to the commentary on
their papers. Instead, they were consistently focused upon global comments. In the two
positively indicated categories (feedback that was helpful and feedback that made them
feel successful) the feedback identified by students was 68% global and 78% global,
respectively. In the two negatively indicated categories (feedback that was unhelpful and
feedback that made them feel unsuccessful) the comments identified by students were
53% global and 65% global. Regardless of category, students continually concentrated
upon global feedback. At first, I thought that perhaps these instructors simply had not
provided much local feedback for students; however, examination of the essays disproved
that theory. Many of the essays were laden with local comments, sometimes upwards of
ten per page. When students did choose to address localized feedback, they most
commonly found it unhelpful.
The conclusion here is relatively clear. Students are likely to pay more attention
to and concentrate more upon global feedback, particularly in contrast to local feedback.
The mere presence of global feedback does not mean that a student will perceive it
positively; however, global feedback was more often seen as helpful than local feedback.
Even when global feedback is seen as being confusing, at least it is being read and
interpreted by students. Localized feedback focusing on wording or error correction, on
the other hand, is likely to be unhelpful for students, if they notice it at all.
High School Students Respond to Feedback Emotionally
One of the primary purposes of this research was to determine what, if any, types
of emotional responses students had to written feedback. Of equal importance was an
analysis of the strength of such responses. In general, I found that students did react
emotionally to teachers' comments, though these reactions tended to be moderate according
to the self-reporting BESW scale. In fact, the open responses to questions indicated
stronger emotional reactions. When indicating which teacher comments were unhelpful or
made them feel unsuccessful, many students used emotionally-loaded words or phrases.
Many of the comments related to specific frustrations, an emotion which was ranked 2.60
on average for the 10th grade students and 1.7 for the 12th grade students. These frustrations
emerged for a number of different reasons, and some student responses below are
illustrative of this:
"...apparently my teacher didn't wish to hear what I thought about the issue"
"I feel like instead of 'unclear' it really says 'no effort' and it's frustrating."
"I hate when teachers asked questions in my paper instead of fully explaining
themselves. It's frustrating and doesn't teach me things about my writing"
Student comments were equally indicative of positive emotions, particularly in their
response to this prompt: "Find a comment that makes you feel successful about yourself
as a writer... briefly explain why." Many simply restate a feeling of success without
further elaboration. Others continue to elaborate about the emotions that they felt:
"It's an encouraging feeling"
"Makes me feel accomplished"
"[These comments] make you feel smart!"
"Makes me feel good"
"Give me confidence."
Clearly, some of these responses don't specifically label an emotion. However, feeling
"smart" and feeling "good" are references to a state of being that is more emotionally than
cognitively driven. All but one of the responses given by students in response to this
question relate to the mode of praise. Though many of such comments were short (e.g.,
"good connection") and were not necessarily labeled as being helpful, they still had a
positive impact on the affective response of the student.
Interestingly, the 10th and 12th grade students had similar positive emotional
scores, which were only slightly lower than those that McGee recorded for college-age
students6. In regard to negative emotions, however, there is a significant discrepancy.
The 10th grade students I studied averaged a score of 2.456 for negative active emotions,
in comparison to the 12th grade students who averaged 1.383. Simply put, the 10th grade
students had a significantly stronger negative reaction to their feedback than the 12th grade
students or the college-age students McGee studied. This could be caused by a number of
factors. First, the 10th grade paper was a significant one in terms of its impact on the
overall course grade and time spent. Additionally, this was the first major research project
they had completed this year. The 12th grade students' in-class essay was similar to many
that they had completed over the course of the year, took less than a class period to write,
and had little impact on their overall course grade. This difference between types of
assignments may account for the discrepancy in emotions such as frustration, fear, and
anxiety. The differences in age and/or emotional maturity between the two groups may
have also contributed to the score difference.
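The grade-level comparison above rests on simple group means of the 1-5 self-report ratings from question III of the open-response questionnaire. The sketch below shows how such means could be computed; the sample ratings and the particular words grouped as "negative active" emotions are assumptions made for illustration, not the study's data or the BESW scale's actual groupings.

```python
from statistics import mean

# Emotion words treated here as "negative active" -- an assumption for illustration.
NEGATIVE_ACTIVE = ["angry", "anxious", "frustrated", "afraid"]

# Hypothetical responses: each student's grade level and 1-5 ratings.
responses = [
    {"grade": 10, "ratings": {"angry": 3, "anxious": 4, "frustrated": 3, "afraid": 2}},
    {"grade": 10, "ratings": {"angry": 2, "anxious": 3, "frustrated": 2, "afraid": 1}},
    {"grade": 12, "ratings": {"angry": 1, "anxious": 2, "frustrated": 1, "afraid": 1}},
    {"grade": 12, "ratings": {"angry": 1, "anxious": 1, "frustrated": 2, "afraid": 1}},
]

for grade in (10, 12):
    # One mean score per student across the negative active words, then a group mean.
    per_student = [
        mean(r["ratings"][word] for word in NEGATIVE_ACTIVE)
        for r in responses
        if r["grade"] == grade
    ]
    print(f"Grade {grade}: mean negative active emotion score = {mean(per_student):.3f}")
```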
In conclusion, there is clear evidence of students' emotional reactions to teacher
written feedback, though these responses are often mild. The strongest reaction was in
relation to negative active emotions, and the degree of emotional response may vary based
upon the age of the subjects, the type of assignment, and the socio-cultural context of the
classroom.
6 McGee's dissertation found that college-age students averaged a 2.3 for positive emotions, a 1.9
for negative active emotions, and a 1.4 for negative passive emotions.
Students are Confused by Many Comments
One of the primary concerns for any instructor is clarity. Self-reflective teachers are
constantly asking themselves if their assignments, lectures, discussions, or comments
make sense to the primary audience: students. Unfortunately, results from this study
strongly indicated that students are frequently confused by teacher written commentary.
This confusion not only prevents learning but also leads to frustration and other negative
affective responses. On their surveys, students labeled confusing comments both as being
unhelpful and as making them feel unsuccessful. These confusions seem to stem from
four main areas:
1. Confusion as to what the teacher desired from the student's writing
"I have no idea what sort of quote my teacher wants."
"I can't fix this problem"
2. Confusion as to what a comment means
"On this comment my citation is half crossed out... I have no idea what that
means."
3. Confusion as to how or why to make changes
"This comment doesn't tell me how to make my intro longer and why."
4. Confusion stemming from perceived contradictions
"At the end of my paper, the teacher mentioned that I should try working on
sentence variety/length. When I tried using some shorter sentences, a
comment was made that they were choppy. Contradicting comments do no
good."
All of these student explanations are related to feedback they identified as unhelpful.
The comments typically lacked the length and specificity of the comments that were
identified as helpful. For instance, the length of an unhelpful comment was (on
average) 5.9 words; the length of a helpful comment was 11.7 words. These longer
comments were more didactic in nature, showing again that students desire to learn and
want specific information. This may lead teachers to focus upon writing fewer, longer
comments that articulate why a change needs to be made, rather than simply identifying
what needs to change.
In regard to the modes of teacher comments, it seems that questions are
generally considered confusing and purposeless. The questions particularly disliked by
students were closed problem-posing questions, in which the teacher "strongly implies
that something is wrong with the text and insinuates in the wording of the comment itself
the teacher's answer to the question... in a way, these comments are imperatives or
evaluations cast in the form of the question" (Straub & Lunsford, 1995, p. 169). Subjects
in this study also felt that problem-posing questions were confusing, since students were
unsure whether the question was a command, something to think about, or a criticism.
Even though problem-posing questions can be imperative in nature, students do
not seem to mind the imperative mode as long as the comment is specific, clear, and
articulates a reason. Twenty of the thirty-seven helpful comments were in the imperative
mode, a clear command from the teacher to do something to the essay. This, of course,
raises questions about authorship and appropriation. It may be that students do not truly
feel ownership of their writing and simply are trying to get the grade. If so, students
would be likely to happily make any change dictated by the teacher. Alternatively,
students may view teachers as important partners in the process of composition and, thus,
value teachers' suggestions for changes as the response of an informed audience. Here,
the context of the classroom and the relationship between the teacher and student become
important points of consideration; unfortunately, these issues were beyond the scope of
this project.
In conclusion, students are often confused by written feedback. This typically is a
result of a lack of specificity. Nancy Sommers (1982) also articulated a similar finding
from her research, concluding that "most teachers' comments are not text-specific and
could be interchanged, rubber-stamped, from text to text" (p. 291). Such vague comments
do not provide students with the guidance they need and are likely to result in confusion.
When revising their work, students' misinterpretation of comments can result in papers
that are less coherent and effective than the original. According to this study, the least
useful comments were often negative evaluations that identified an error without
providing guidance as to fixing it, or were problem-posing questions. In contrast,
comments that are specific and didactic in nature are generally well received by students,
regardless of the focus or mode.
Limitations of Research
This study had a number of limitations which preclude its use to generalize
student responses to feedback as a whole. Primarily, the sample size (25 students) was
quite small. Additionally, the students were not chosen at random; if their teacher did not
agree to participate and allow me to use class time, students were themselves unable to
participate. The students were also all from honors or AP classes. Though this does not
invalidate their own responses to, reactions to, and interactions with teacher-written
feedback, the results may differ from those of students who have more pervasive negative
attitudes towards school. Typically, honors and AP students have a strong desire to be
successful and excel at school; responses from students without the same attitude towards
education could be vastly different.
Another limitation of this study is the teachers' knowledge of their own
participation. Their involvement could have altered the way in which they gave feedback
to student writers. This effect may have been limited in some part because teachers did
not know which students were and were not participating. Thus, they would have needed
to adjust their commenting style for all the essays they graded, not just those of the
students who participated. One of the teachers did not agree to participate until after the
essays had been evaluated and commented upon; thus, these comments were largely
untainted by knowledge of my project.
The study was additionally limited by the participation of only three teachers, all
from the same school. It is possible that certain forms or modes of comments were not
chosen by students because they simply did not exist on their papers. If the three teachers
did not, for example, usually write heuristic questions, then students obviously could not
comment on their responses to heuristic questions. Since not all forms or modes of
commentary were equally represented, it is not possible to say which ones create which
effects on students, either affectively or cognitively.
Implications for Future Research
Though many researchers have delved into the types of feedback that teachers do
and should provide to students, more research needs to be done in order to fully
understand how students react to feedback, why they react in that manner, and what types
of feedback students find both respectful and helpful to themselves as writers.
In particular, more research needs to take place at the high school or middle
school level; my data indicates a difference in the responses of seniors and sophomores,
and analysis of how students react at different ages may help teachers to better formulate
their responses. Within secondary schools, students from a wider range of socio-
economic statuses and academic achievement levels need to be studied. My work focused
on higher-achieving students, and the results are not applicable to the student population at
large. Additionally, researchers could give even greater credence to students' voices by
interviewing them. Time constraints prevented me from conducting interviews; however, they
would have been invaluable in determining why students responded in certain ways to
certain aspects of feedback.
Finally, more work needs to be done to connect what happens within the context
of the classroom to students' perceptions of feedback. How do conferencing, a writers'
workshop environment, and classroom climate affect how teachers comment on students'
work and how students interpret those comments? Knoblauch and Brannon (1981)
articulately express this dilemma in relation to composition studies: "Any remark on a
student essay...owes its meaning and impact to the governing dialogue that influences
some students' reactions to it. Remarks taken out of this context can appear more
restrictive or open-ended, more facilitative or judgmental, than they really are in light of a
teacher's overall communicative habits" (p. 71). Clearly, written commentary is part of a
larger educational dialogue between a teacher and a student. More research must be done
to incorporate this written dialogue within the context of the classroom dialogue.
Implications for Classroom Practice
If I want my dream classroom of writing, exploring, and learning to become a
reality, then my approach to commenting upon and grading student writing needs to
reflect that same interactive, student-centered philosophy. By focusing upon student
perceptions and affective reactions rather than teacher intentions, this research project has
provided me with the guidance I need to evaluate student papers in an effective, caring
manner. I will, without a doubt, continue to provide students with feedback. They clearly
want to learn and want to see how their audience (or at least one member of their
audience!) responds to their writing. The form and type of feedback, however, will be
drastically different.
I expect that my future feedback will be both less and more than my current
feedback. I will be less negative, less picky, and less focused on local concerns. The
results indicating students' high levels of negative active emotions were truly eye-opening,
as were the questionnaires that reflected how few positive comments students
perceived to have been given. Negative comments need to be nearly eliminated in favor
of comments that teach a lesson or provide an example. Local, minor concerns of
wording and grammar need to become the work of final editing groups or writing
conferences before a paper goes to publication; students seem to overlook our detailed
comments focused on local wording and correctness, so they may need to be abandoned
or, at least, limited dramatically. I will also use fewer questions, particularly those defined
as problem-posing; if I do write questions, I will be clear about articulating why I am
writing them and how I want students to interpret them. In all, there will be fewer
comments on students' papers, at least until we are able to engage in open classroom
discussions about the purpose of commentary and why more commentary does not equate
to a poor grade.
There also will be much more in my written feedback on students' future
papers. I want to write with more encouragement, more thoughtfulness, and more
specificity. I will be sure to include statements of praise that are not vague ("good job")
and do not immediately devolve into negative commentary. My focus will be to build
writers who are both confident and effective. Focusing on grammar rules may create more
correct writers; however, this is meaningless if my students become so self-conscious that
they stop writing. In place of copious, vague comments, I want to write fewer, longer
comments that are specific to the essay in front of me, that provide students with a
different perspective about how they may choose to rethink their work. The focus will be
on teaching and responding, not correcting. Finally, my commentary will be more
focused on global issues of structure and ideas. If students are either ignoring or
responding negatively to the rest, then it is not only unhelpful; it is also damaging.
Clearly, these steps do not solve all of the dilemmas that surround the issue of
written feedback; however, they may make the process more fruitful for both teachers and
students. In conclusion, I look to Mark Halliday's poem "Graded Paper," which
articulates many of the delights and contradictions of evaluating writing; near the end of
the poem, the speaker expresses his epiphany:
You are not
me, finally,
and though this is an awkward problem, involving
the inescapable fact that you are so young, so young
it is also a delightful provocation.
This encapsulates one of my main beliefs supported by this research: that we must
remember that we are not the authors of the papers that we grade. That our responses
should be those of caring readers who want to help build great writers. That our first
instinct should be to encourage and our second instinct should be to teach. The papers
that we lug home at night are the products of young writers in age, skill, and
experience. We are not to be their editors and critics, but to be their reflectors and guides.
More than anything, we need to remember and encourage their youthful energy, even if it
means ignoring a few awkward phrases or comma splices.
APPENDIX A
Consent Form for Teacher Participants
Dear Colleague:
I am currently completing research about the effectiveness of teachers written feedback.
To conduct this research, I am asking you to participate in the study. This form provides
you with information about the study; additionally, I will be available to describe this
study to you and answer all of your questions. Please read the information below and ask
questions about anything you don't understand before deciding whether or not to take part.
This study plans to learn more about the effect of teacher written comments on papers.
Teachers spend a great deal of time writing feedback, so it is important that the feedback
helps students. You are being asked to be in this research study because you, as a high
school teacher, frequently give students feedback on their writing. This makes you an
expert on what works, what doesn't work, and what type of feedback you prefer to give.
Up to 150 students and 3 other teachers will participate in this study. Participation would
include the following: allowing me to give one of your classes two questionnaires (copies
are attached); completing a questionnaire on your own beliefs about writing evaluation;
and participating in a 20-30 minute semi-structured interview.
This study presents minimal risk to you as a participant. You may feel uncomfortable
giving feedback about your own teaching methods, particularly since I am a colleague in
your department. However, all information will be kept confidential and pseudonyms will
be used when necessary. Confidentiality will be maintained through the following:
Keeping all of your comments anonymous
Not putting participant names on any documents except for the consent form
Keeping your consent form separate from other information
Using a randomly assigned identification number (not your student ID number or
name)
Not using your name or any identifying information in reports or published
papers.
At no time will names or identities of participants be revealed to supervisors. I will do
everything I can to keep your records a secret; however, it cannot be guaranteed. Both the
records that identify you and the consent form signed by you may be looked at by any
federal agencies that monitor human subject research and the human subject research
committee. The results from the research may be shared at a meeting or may be in
published articles; however, your name will not be used.
Taking part in this study is voluntary. If you choose to take part, you have the right to
stop at any time. You may also have questions about your rights as someone in this study.
You can also call the Human Subject Research Committee (HSRC) at the University of
Colorado Denver. You can call them at 303-556-4060.
Please contact me (Adria Moersen Hohman) if you have any questions about the study at
any stage. I can be reached by phone (303-982-1287), email (amoersen@jeffco.k12.co.us),
or during Access in room G103.
Thank you for your consideration and assistance!
Mrs. Adria Moersen Hohman
Agreement to be in this study
I have read this paper about the study or it was read to me. I understand the possible risks
and benefits of this study. I know that being in this study is voluntary. Please mark one of
the following.
I choose to be in this study; I will get a copy of this consent form.
I am declining to participate in this study.
Signature:
Date:
Print Name:
APPENDIX B
Consent Form for Student Participants
Dear Student:
I am currently completing research about the effectiveness of teachers written feedback on your
essays. To conduct this research, I am asking you to participate in the study. This form provides
you with information about the study. I also will be available to describe this study to you and
answer all questions. Please read the information below and ask questions before deciding whether
or not to take part.
This study is designed to learn more about how you perceive teacher-written feedback and what
feedback is the most effective. Teachers spend a great deal of time writing feedback, so it is
important that the feedback helps you as a writer. This study is designed for me to learn more
about how you interpret written feedback and which pieces of written feedback are the most
effective. You are being asked to be in this research study because you frequently receive feedback
on your writing. This makes you an expert on what works, what doesn't work, and what type of
feedback you prefer.
If you join the study, you will be asked to complete two questionnaires. The first will ask you 15
multiple choice questions on how you feel about writing in general. The second questionnaire will
ask you to examine an actual paper that you have written for class and indicate which comments
are/arent effective. Up to 100 individuals will participate in this portion of the study.
Approximately five to eight of these students will also be invited to be interviewed so that I can
gather more specific information. The interview will be audio-recorded. I will keep this recording
secure in a locked filing cabinet until I transcribe the words and remove any identifying
information; at this time, no later than June 1st, 2008, the recordings will be destroyed. The
interview portion will be completely voluntary and would take place outside of class time. The
initial questionnaires will be done during about 40 total minutes of class time and won't increase
your overall workload. The interview will take no more than 30 minutes.
This study presents minimal risk to you as a participant. You may feel uncomfortable giving
feedback about your teachers comments. However, confidentiality will be maintained through the
following:
Keeping all of your comments anonymous
Not putting your name on any documents except for the consent form
Keeping your consent form separate from other information
Using a randomly assigned identification number (not your student ID number
or name)
Not using your name or any identifying information (including school or
district name) in reports or published papers.
I will do everything I can to keep your records a secret; however, it cannot be guaranteed. Despite
efforts to maintain your anonymity, some of the information that you provide, including the content
of your paper, may identify you. Both the records that identify you and the consent form signed by
you may be looked at by any federal agencies that monitor human subject research and the human
subject research committee.
If you feel uncomfortable at any time, you may stop participating. Additionally, the counselors at
the Arvada West Counselor Center will be available if you would like to talk about your
experience.
Taking part in this study is voluntary. If you choose to take part, you have the right to stop at any
time. Participation in this study will not affect your grade in a class. You may also have questions
about your rights as someone in this study. You can also call the Human Subject Research
Committee (HSRC) at the University of Colorado Denver. You can call them at 303-556-4060.
Please contact me (Adria Moersen Hohman) if you have any questions about the study at any stage.
I can be reached by phone (303-982-1287), email (amoersen@jeffco.k12.co.us), or during Access
in room G103.
Thank you for your consideration and assistance!
Mrs. Adria Moersen Hohman
Agreement to be in this study
I have read this paper about the study or it was read to me. I understand the possible risks and
benefits of this study. I know that being in this study is voluntary. Please mark one of the
following.
I choose to be in this study; I will get a copy of this consent form.
I am declining to participate in this study.
Signature:________________________________________________________________Date:
Print Name:_______________________________________________________________
Parent Signature:_________________________________________________________Date:
Print Name:_______________________________________________________________
APPENDIX C
Closed-Response Questionnaire
As you answer these questions, think about all of the writing that you have done from 6th grade
until now. Try not to focus on your current teachers or classes; remember to consider the writing
that you do in all of your classes, not just your English/Language Arts class. Please answer
honestly; all of your information will be kept anonymous.
For most questions, you have five choices. Circle the number that best represents how you
feel.
1. Do you enjoy writing for school?
_________really dislike it..........think it's ok...................really like it
1 2 3 4 5
2. What do you like about writing? Place an X next to all that apply.
________I always do well, so it's a chance to improve my grade.
________It's a way to learn.
________It's a chance to express my feelings and opinions.
________It's a chance to learn more about the world.
________I know that I need the skills later in life, so I need to practice them now.
3. What do you dislike about writing?
________It takes a lot of time and effort.
________I'm not always sure what to do.
________I am often worried about getting a bad grade.
________I am often unsure what the teacher wants.
________I don't usually have the chance to express my own ideas.
4. How nervous or anxious are you when your teachers pass back graded writing
assignments/essays?
not nervous at all..........somewhat nervous..........very nervous
1 2 3 4 5
5. Do you find teachers' comments on writing assignments/essays helpful?
not helpful at all......somewhat helpful.................very helpful
1 2 3 4 5
6. Do you feel like the grades on your writing assignments/essays are fair?
not fair................usually fair.....................always fair
1 2 3 4 5
7. Does being graded help you work to improve your writing the next time?
not at all..............somewhat.........................quite a bit
1 2 3 4 5
8. When I see a lot of marks on my paper, I automatically assume that I've done: (Circle
one)
A. Poorly B. Average
C. Well D. The number of marks doesn't reflect how well I did.
9. When a teacher grades a piece of writing, to what extent do you feel that they are grading
YOU or grading YOUR WRITING?
grading my work...............................................grading me
1 2 3
10. Do you feel like you are encouraged to share your own experiences and opinions in your
writing?
never......................sometimes..............................very frequently
1 2 3 4 5
11. To what extent do you believe that your teachers value your words, ideas, and views?
very little extent....somewhat.................to a great extent
1 2 3 4 5
12. How often do you and your teachers have conversations about your writing?
never....................sometimes....................frequently
1 2 3 4 5
13. Who has the power or control in your writing?
I have the power..........the power is equally shared..........the teacher has the power
1 2 3 4 5
14. How valuable do you feel writing is in the real world?
not valuable at all................somewhat valuable.................very valuable
1 2 3 4 5
15. After you have finished your formal education (high school and, if you choose, college),
how much writing do you think that you will do?
None........................................a little..................................a lot
1 2 3 4 5
APPENDIX D
Open-Response Questionnaire
Paper Identifying Number:
The following questions ask you to reflect on the written feedback that your teacher has provided to
you. Give as many details as possible, and please be honest. Remember that your comments are
confidential and will not be shared with your teacher while you are still a student in his/her class.
I. Do you usually read the comments that this instructor puts on your paper? Why or why not?
II. Look at the number of comments on your paper. Are there more, the same, or less than you are
used to receiving?
What does that make you think about your paper?
III. Please rate all of the following from 1 to 5, with
5 = very strongly, 4 = strongly, 3 = moderately, 2 = slightly, and 1 = not at all
The first time I read my teacher's comments on my paper, I felt...
Adventurous Frustrated
Afraid Happy
Angry Inspired
Anxious Interested
Bored Relieved
Confused Satisfied
Depressed Surprised
Disgusted Other:
Excited Other:
IV. Look through the comments again. Think about how effective they are and how they make you
feel. Then, answer the questions below. If you cannot find a comment that fits the criteria, please
write N/A.
Find two comments that you think are not helpful. Label them with an A1 and A2. Below,
briefly explain why you did not find them helpful.
A1.
A2.
Find two comments that you think are helpful and will help you become a better writer. Label
them with a B1 and B2. Below, briefly explain why you find them helpful.
B1.
B2.
Find a comment that makes you feel successful about yourself as a writer. Label it with a C.
Below, briefly explain why.
C.
Find a comment that makes you feel unsuccessful as a writer. Label it with a D. Below, briefly
explain why.
D.
V. In general, what types of comments would you like to have more of on your writing? Why?
VI. In general, what types of comments would you like to have less of on your writing? Why?
VII. For the following questions, please circle the number that best represents how you feel.
How much do you enjoy this class?
very little................somewhat.......................very much
1 2 3 4 5
How would you describe your relationship with this teacher?
poor.............................average.........................very good
1 2 3 4 5
How often do you and this teacher have conversations about your writing?
never........................sometimes........................frequently
1 2 3 4 5
APPENDIX E
Results from initial questionnaire
Table A.1 Student responses to fifteen closed-response questions
Do you enjoy writing for school? (1) Really dislike it (2) (3) Think its ok (4) (5) Really like it
0% 0% 67% 20% 13%
How nervous or anxious are you when your teachers pass back graded writing assignments/essays? (1) Not nervous (2) (3) Somewhat nervous (4) (5) Very nervous
0% 7% 47% 13% 33%
Do you find teachers' comments on writing assignments/essays helpful? (1) Not helpful (2) (3) Somewhat helpful (4) (5) Very helpful
0% 13% 40% 40% 7%
Do you feel like the grades on your writing assignments/essays are fair? (1) Not fair (2) (3) Usually fair (4) (5) Always fair
0% 7% 47% 47% 0%
Does being graded help you work to improve your writing the next time? (1) Not at all (2) (3) Somewhat (4) (5) Quite a bit
7% 20% 27% 27% 20%
When a teacher grades a piece of writing, to what extent do you feel that they are grading YOU or grading YOUR WRITING? (1) Grading my work (3) Grading both equally (5) Grading me
33% 67% 0%
Do you feel like you are encouraged to share your own experiences and opinions in your writing? (1) Never (2) (3) Sometimes (4) (5) Very often
0% 40% 33% 27% 0%
To what extent do you believe that your teachers value your words, ideas, and views? (1) Very little extent (2) (3) Somewhat (4) (5) To a great extent
0% 20% 53% 27% 0%
How often do you and your teachers have conversations about your writing? (1) Never (2) (3) Sometimes (4) (5) Frequently
7% 53% 27% 13% 0%
Who has the power or control in your writing? (1) I do (2) (3) It is equally shared (4) (5) The teacher does
7% 40% 33% 20% 0%
How valuable do you feel writing is in the real world? (1) Not valuable (2) (3) Somewhat (4) (5) Very valuable
0% 7% 27% 33% 33%
After you have finished your formal education, how much writing do you think that you will do? (1) None (2) (3) A little (4) (5) A lot
0% 13% 20% 47% 20%
APPENDIX F
Coding sheet for teacher commentary and student
responses on open-response questionnaire
[The coding sheet is a blank table with columns for: Student Code #; Teacher Comment; Focus of teacher comment; Mode of teacher comment; Mode (secondary) of teacher comment; Student Response; and Coding of student response.]
APPENDIX G
Coding Schema for Teacher Comments
Table A.2 Examples of teacher comments and how they were coded
Code Straub and Lunsford Example Example from written teacher responses coded for this project
Focus
Ideas GI An insightful observation You demonstrate a good understanding of the passage.
Development GD What happened? Tell us more. Nice discussion to prove your point. Prove it. Where did you get those ideas?
Global Structure GS If this is your main idea, put it up in the first paragraph. Which is your thesis?
Local Structure LS I have trouble following this sentence. Unclear here
Wording LW Right word? You might want a different word; it works but another might be better.
Correctness LC Subject-verb agreement error. Where are your citations?
Extra-Textual7 E Who is your audience? and I think you have the makings of a very good paper here. Overall, this is a good essay.
Mode
Correctness COR The teacher actually makes a change in the text. The word play written over the word book
Negative evaluations NE Poor sentence structure. Your conclusion should not focus primarily on Death of a Salesman.
Qualified negative evaluation NE.Q I need to know more about the specific techniques. I don't think that this applies to [play character]
7 Extra-textual comments include a wide range of responses, including those outside of the text
(audience, writing process, writing assignment), those relating to tone, those that address the essay
as a whole, and miscellaneous comments.
Table A.2 (Cont)
Imperatives IMP Add more details. Prove it. You need more introduction.
Advice ADV I suggest you put this material about the gangs name at the start of the essay. You probably could have had a little more discussion of research here.
Praise PR Yes! Good introduction.
Indirect Requests IDR Can you give an example? Do you have a name [of the expert]?
Problem-Posing Questions Q.PP.c Is this paragraph really necessary? (closed) Is Charley harsh with Willy or is it the other way around?
Q.PP.o What will some readers like about this quality, and other readers dislike? (open) Any evidence of this in the play?
Heuristic Questions Q.H.c Q.H.o How old were you? (closed) How is fall in Syracuse different from fall in other cities? (open) Is there a good need for discrimination?
Reflective Statements8 REF Here is your second argument. Clearly sets up the direction of the paper for me.
8 Reflective statements take a number of forms: teacher reflections (explanatory, hortatorical) and
reader reflections (interpretive, reader-response comments, reader remarks, reader reactions). Since
few of the comments were determined to be reflective, no further analysis was done to specify the
type of reflective statement made.
BIBLIOGRAPHY
Bardine, B.A., Bardine, M.S., & Deegan, E.F. (2000). Beyond the red pen: Clarifying
our role in the response process. English Journal, 90(1), 94-101.
Billings, S.J. (1998). The story of shifting perspectives: How instructors and
students construct and use instructors' comments on drafts and final versions.
Paper presented at the Conference on College Composition and Communication,
Chicago, IL, April 1-4.
Brand, A.G. (1989). The Psychology of Writing: The Affective Experience. New York:
Greenwood Press.
Connors, R.J., & Lunsford, A.A. (1993). Teachers' Rhetorical Comments on Student
Papers. In R. Straub (Ed.), Key Works on Teacher Response (pp. 135-158).
Portsmouth, NH: Boynton/Cook Publishers, Inc.
Creswell, J.W. (1998). Qualitative Inquiry and Research Design: Choosing Among Five
Traditions. Thousand Oaks, CA: SAGE Publications.
Heller, S.B. (2004). The art of grading papers quickly and effectively. English
Journal, 94 (1), 115-119.
Knoblauch, C.H., & Brannon, L. (1981). Teacher Commentary on Student Writing. In R.
Straub (Ed.), Key Works on Teacher Response (pp. 69-76). Portsmouth, NH:
Boynton/Cook Publishers, Inc.
McGee, S.J. (1999). A qualitative study of student response to teacher-written
comments (Doctoral dissertation, Purdue University, 1999). Dissertation
Abstracts International, 62/06, 2101.
Montgomery, J. L. & Baker, W. (2007). Teacher-written feedback: student
perceptions, teacher self-assessment and actual teacher performance. Journal of
Second Language Writing, 16, 82-99.
Murphy, S. (2000). A sociocultural perspective on teacher response: is there a
student in the room? Assessing Writing, 1, 79-90.
Sommers, N. (1982). Responding to Student Writing. In R.
Straub (Ed.), Key Works on Teacher Response (pp. 69-76). Portsmouth, NH:
Boynton/Cook Publishers, Inc.
Straub, R. (1997). Students' reactions to teacher comments: An exploratory study.
Research in the Teaching of English, 31(1), 91-119.
Straub, R. (2000). The student, the text, and the classroom context: a case study
of teacher response. Assessing Writing, 7, 23-55.
Straub, R., & Lunsford, R.F. (1995). Twelve Readers Reading: Responding to College
Student Writing. Cresskill, NJ: Hampton Press, Inc.
Strauss, A., & Corbin, J. (1987). Qualitative Analysis for Social Scientists. New York:
Cambridge University Press.
Zacharias, N.T. (2007). Teacher and student attitudes toward teacher feedback.
Regional Language Centre Journal, 38(1), 38-52.