A typology of problems learners encounter in online learning environments

Material Information

A typology of problems learners encounter in online learning environments
Batty, Michael S
Publication Date: 2005
Physical Description:
xii, 155 leaves ; 28 cm


Subjects / Keywords:
Education, Higher -- Computer-assisted instruction ( lcsh )
Internet in higher education ( lcsh )
Education, Higher -- Computer-assisted instruction ( fast )
Internet in higher education ( fast )
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )


Includes bibliographical references (leaves 149-155).
General Note:
School of Education and Human Development
Statement of Responsibility:
by Michael S. Batty.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
66899564 ( OCLC )
LD1193.E3 2005d B37 ( lcc )

Full Text
Michael S. Batty
B.U.S., University of New Mexico, 1984
M.B.A., University of New Mexico, 1989
A thesis submitted to
the University of Colorado at Denver
in partial fulfillment
of the requirements for the degree of
Doctor of Philosophy
Educational Leadership and Innovation

© 2005 by Michael Shawn Batty
All rights reserved.

This thesis for the Doctor of Philosophy
degree by
Michael S. Batty
has been approved
Ellen Stevens
Barbara McCombs

Batty, Michael S. (Ph.D., Educational Leadership and Innovation)
A Typology of Problems Learners Encounter in Online Learning Environments
Thesis directed by Professor Brent Wilson
A typology of the kinds of problems learners may encounter in an online
learning environment was developed and used in the study of three online classes to
explore the kinds of problems learners encountered in those environments, how
learners responded to the problems, and possible connections between solving or not
solving problems and learner satisfaction and persistence. The three classes selected
for this study were taught entirely online using the WebCT learning management
system. One instructor taught all three classes. The instructor had previously taught
the classes and had experience teaching online using WebCT.
The original typology of problems included technical, institutional,
instructional design, dispositional, situational/environmental, and social problems.
During the study three changes were made to the typology: the dispositional
category was removed, the instructor category was added, and the definition of the
situational/environmental category was expanded.
All problems reported by learners fit into the typology of problems. Learners
were observed using one of four strategies (keep on task, try it again later, ask for
help, and ignore the problem) to solve problems they encountered. All learners

who participated in the study completed the classes; therefore, there was no
observed effect on learner persistence. Frustration and confusion were observed
most often when learners encountered technical, instructional design, and instructor
problems. Indicators of dispositional problems were not sufficient to warrant a
separate category in the typology. The typology of problems and learner affective
responses were important in understanding the findings and indicate areas for future
research.
This abstract accurately represents the content of the candidate's thesis. I
recommend its publication.
Brent Wilson

While working on my dissertation one of my committee members, Dr. Marie
Wirsing, passed away. Marie was a teacher, philosopher, and friend who always had
time for in-depth discussions and brief chats. She often wrote about the impact of
legislation on the lives of children and teachers. Dr. Marie Wirsing touched the lives
of many people and I consider myself fortunate to be counted as one of the many.

Completing a Ph.D. program is something for which one person is rewarded
even though the work is not done in solitude. Many people have helped me with my
doctoral work by providing feedback, encouragement, and support. I am forever
indebted to my partner, Patricia Bradford, who put up with my quirks and helped
make it possible for me to retire from my day job to complete my doctoral program.
I hope she realizes I was only joking when I said I would go back to work after I
earned my Ph.D.
To my committee chair and advisor, Dr. Brent Wilson, I cannot begin to tell
you what your friendship and guidance means to me. To my committee members
(Dr. Ellen Stevens, Dr. Laura Goodwin, and Dr. Barbara McCombs) I say thank you
for your time and encouragement. I am honored to have had you participate in my
doctoral program.
In addition to my advisor and committee members, I received the support of
Dr. Stevens's Post-Secondary Teaching Lab members and other doctoral students
such as Dr. Noel LeJeune, Dr. Kim Peterson, Dr. Terri McFarlane, and Karen
I would like to further thank (anonymously) the instructor who let me
explore my questions in his online classes and the students who participated in my

study. Not every study runs like a well-oiled machine. To cooperate with a study
when problems arise that might reflect poorly on an instructor says a lot about the
instructor's character and dedication to research.
Finally, I would like to thank everyone with whom I held impromptu
conversations about my dissertation, including the folks at several of my favorite
local cafes and my brother Danny MacCallum.

1. INTRODUCTION.................................................................1
Conceptual Framework........................................................3
Problem Solving.........................................................4
Problem Typology........................................................7
Significance of the Research...............................................13
Research Questions.........................................................14
Overview of Methodology....................................................15
Operational Definitions................................................17
Organization of the Dissertation...........................................18
2. REVIEW OF THE LITERATURE...................................................19
Search Procedure...........................................................19
Online Problems........................................................26
Problem Solving........................................................28
Problem Space..........................................................32
Problem Solvers........................................................36
Individual Differences.....................................................37
3. METHOD.....................................................................43
The Classes................................................................51
Researcher Role........................................................55

Data Collection..........................................................57
Data Analysis............................................................62
Categorizing Problems................................................67
Limitations of the Study.................................................69
4. RESULTS...................................................................75
Research Question No. 1: What kinds of problems do adult learners encounter in an
online learning environment?.............................................76
Research Question No. 1 Summary......................................84
Research Question No. 2: How do learners respond to problems they encounter?........84
Research Question No. 2 Summary......................................93
Research Question No. 3: How do learners' solution patterns relate to their ongoing
satisfaction or frustration with the learning experience?................93
Research Question No. 3 Summary.....................................100
Research Question No. 4: How do learners' solution patterns and ongoing perceptions
and experiences relate to their decisions to participate and continue in the class?.101
Research Question No. 4 Summary.....................................105
5. SUMMARY AND IMPLICATIONS.................................................109
Conceptual Framework....................................................111
Research Question No. 1: What kinds of problems do adult learners encounter in an
online learning environment?............................................114
Research Question No. 2: How do learners respond to problems they encounter?........118
Research Question No. 3: How do learners' solution patterns relate to their ongoing
satisfaction or frustration with the learning experience?...............121
Research Question No. 4: How do learners' solution patterns and ongoing perceptions
and experiences relate to their decisions to participate and continue in the class?.123
Suggestions for Practice................................................130
Recommendations for Future Research.....................................131
A. CONSENT FORM............................................................138
B. CONSENT CONFIRMATION....................................................140
C. PRE-CLASS DEMOGRAPHIC SURVEY............................................141
D. TROUBLE REPORT..........................................................146
E. PROBLEM IMPORTANCE SURVEY...............................................147
F. PILOT OF ONLINE PROBLEM TYPOLOGY........................................148

Figure 1-1 Basic Problem Solving Model.....................................................5
Figure 1-2 Simplified Problem Solving Model................................................6
Figure 3-1 Basic Types of Designs for Case Studies.......................................54
Figure 5-1 Basic Problem Solving Model...................................................112
Figure 5-2 Simplified Problem Solving Model..............................................113

Table 1-1 Typology of problems online learners encounter.......................................10
Table 3-1 Class demographic data...............................................................45
Table 3-2 Female and Male composition..........................................................45
Table 3-3 Learning environment data............................................................46
Table 3-4 Computer access and online experience................................................47
Table 3-5 Computer use data....................................................................48
Table 3-6 Current on-campus classes............................................................48
Table 3-7 Computer skill data..................................................................49
Table 3-8 Data collection and analysis matrix..................................................63
Table 3-9 Initial codes for categorizing data..................................................66
Table 3-10 Keywords for identifying problem categories.........................................69
Table 4-1 Problems by category..................................................................77
Table 4-2 New problem types, including number coded.............................................78
Table 4-3 Problem importance....................................................................80
Table 4-4 Reports with confusion/frustration indicators.........................................86
Table 4-5 Problem solving strategies............................................................86
Table 4-6 Confusion and frustration keywords....................................................89
Table 4-7 Confusion, frustration, and problem type..............................................90
Table 4-8 Problem-solving strategies by gender..................................................92
Table 4-9 Problem, state, and strategy combinations.............................................95
Table 4-10 Confusion, frustration, and strategy................................................97
Table 5-1 New problem typology.................................................................116
Table 5-2 Instructional aids...................................................................129

Since the development of the World Wide Web and the first Web browsers
in 1994 (Berners-Lee, 2000; December & Randall, 1994), Web-based instruction
offered as part of an e-learning initiative has steadily increased in importance in
educational institutions and corporations (G. R. Jones, 1997; Pittinsky, 2003;
Rossett, 2002). Not surprisingly, much has been written about designing effective
web-based instruction (M. Driscoll, 1998; McCormack & Jones, 1997; Palloff &
Pratt, 1999), including the intentional design of problems as part of the instruction
to teach a particular concept (Jonassen, 2004). However, literature on web-based
instruction generally does not address the kinds of unintentional problems learners
encounter.
As Jonassen (2000) noted, "Virtually everyone, in their everyday and
professional lives, regularly solves problems" (p. 63), including solving problems in
the online learning environment. Problem solving ability is often associated with
learner characteristics such as motivation (Bandura, 1997; Pintrich, 1995; Pintrich &
Schunk, 1996; Wlodkowski, 1993) and coping with change (Brock & Salerno, 1994;
McWhinney, 1997; Quinn, 1996; Watzlawick, Weakland, & Fisch, 1974).

Individual learners in online learning environments call upon different skill sets to
solve problems in varying contexts and domains (Jonassen, 2000). On one hand, a
learner's ability to solve a particular problem may lead to increased learner
participation while, on the other hand, a learner's failure to solve a problem may
lead to increased frustration and disengagement from an online class, including
stopping out (a learner no longer attends a class and does not formally drop the
class) or dropping out of the class.
As enrollments in distance and online classes increase (G. R. Jones, 2002;
Pittinsky, 2003; Shea-Shultz & John, 2002) dropout rates continue to be higher than
in face-to-face classes (Carr, 2000; ERIC, 1984; Hoffman & Elias, 1999; Parker,
2001; Ridley, Miller, & Williams, 1995). Data on dropout rates in online classes are
variously reported to be high with no figures given (Diaz, 2002; Frankola,
2001; Martinez, 2003), 10 to 20% higher than in traditional classes (Beck, 2000;
Carr, 2000), and even higher than 40% (Parker, 2001). As Flood (2002) and
Shepherd (2003) report, searching for online dropout rates will yield numbers as
high as 80%. One explanation for the wide range of online learning dropout rates is
that reports of online learning dropout rates are often anecdotal. A reason for the
lack of hard data from institutions offering online learning may be that reporting
high dropout rates in online classes would be bad for business; that is, it can be
difficult for institutions to recruit new online learners if published dropout rates are
higher for online classes than for traditional classes.

Even using a conservative estimate that online dropout rates are 15% higher
than in traditional classes (15% was the most commonly reported figure), it is clear
that more
needs to be done to retain online learners. Improving our understanding of the kinds
of problems online learners encounter, learner problem solving, and learner
participation in online learning environments will improve our ability to design and
deliver instruction that supports learner participation and encourages learner
persistence.
The purpose of this research was to (1) develop a typology of the kinds of
problems adult learners encounter in an online learning environment, (2) gain
insight into how learners respond to problems encountered in an online learning
environment, and (3) explore connections between solving (or not solving)
problems and learner satisfaction and participation. Specific research questions are
presented following the conceptual framework.
Conceptual Framework
The conceptual framework for this study brings together the typology of
problems, problem solving, and learners. As the foundation of the conceptual
framework, the typology of problems is a schema for categorizing problems learners
encounter in online learning environments. Problem solving is the process through
which learners move to arrive at a solution to a problem. Finally, learners provide
the goal setting and the motivation to solve the problems. Should new types of

problems be observed, the typology of problems can be adjusted; problem types can
be merged or subdivided if warranted after examining the observed cases.
Problem Solving
A problem can be defined as having two critical attributes (Jonassen, 2000).
First, a problem has an unknown quality, a difference between states or a difference
between what is and what is desired (Kneeland, 1999). Second, a problem solver
perceives value in solving the problem. Problems also vary in structuredness.
Structured problems have available answers and are straightforwardly solved by
following a set of well-structured rules (Jonassen,
2000). In contrast, ill-structured problems involve a high degree of uncertainty in
both process and outcome, often requiring judgment and value decisions (Jonassen,
2000; Spiro, Coulson, Feltovich, & Anderson, 1988).
Problem solving is the process through which a learner progresses to arrive
at a solution to a given problem (Jonassen, 2000; Kneeland, 1999). There are many
ways to describe the problem solving process, such as progressing through a
series of steps or stages, a cyclic process, or even an upward-moving spiral. As a
cyclic process, problem solving involves multiple steps or stages including learner
assessment of progress, which allows the learner to exit the problem solving process
or cycle through the process until the problem is solved. One example of a cyclic
problem solving process (see Figure 1-1) suggests that a learner first recognizes that
a problem exists; this is the stage at which the learner enters (En) the problem space,

P. As the problem space is created and manipulated the learner (a) defines the
problem (Df), (b) develops a set of solutions to the problem (De), (c) selects a
solution that the learner believes will yield the best results (Se), (d) implements the
solution (Im), and (e) finally evaluates the solution (Ev). If the learner is satisfied
with the solution (i.e., the outcome meets the learner's criteria for success) then the
learner exits the problem space (Ex). Should the learner be unsatisfied with the
solution the learner will cycle through the problem-solving stages until she or he
either solves or abandons the problem (i.e., the learner leaves the problem space
with no solution to the problem).
Figure 1-1 Basic Problem Solving Model
For a problem space (Pn) with entry and exit points and seven stages: (En) entry into the problem
space, (Df) problem definition, (De) solution development, (Se) selection of a solution, (Im)
implementation of a solution, (Ev) evaluation of the solution, and (Ex) exit from the problem space.
This model is based on Bandura's (1997) and Wlodkowski's (1993) theories of motivation and
Jonassen's (2000) theory of problem solving.
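The cycle in Figure 1-1 can be sketched as a loop over the stages. The sketch below is illustrative only: the stage functions, their signatures, and the three-cycle cap are assumptions made for the example, not part of the model itself.

```python
def solve(problem, define, develop, select, implement, evaluate, max_cycles=3):
    """A minimal sketch of the cyclic problem-solving model (Figure 1-1).

    The learner recognizes a problem and enters the problem space (En),
    then cycles through the stages until satisfied with an outcome or
    until the problem is abandoned. All parameters are hypothetical.
    """
    for _ in range(max_cycles):
        definition = define(problem)      # (Df) define the problem
        candidates = develop(definition)  # (De) develop a set of candidate solutions
        choice = select(candidates)       # (Se) select the most promising solution
        outcome = implement(choice)       # (Im) implement the selected solution
        if evaluate(outcome):             # (Ev) outcome meets the learner's criteria?
            return outcome                # (Ex) exit the problem space satisfied
    return None                           # (Ex) exit with no solution (abandonment)
```

For example, a learner facing a dead link might define the problem, develop "retry later" and "ask for help" as candidates, and exit the problem space once a chosen action resolves it; returning None corresponds to leaving the problem space with no solution.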
Progressing through the stages of problem solving can be accomplished
quickly, in a matter of seconds, or over an extended time. In some cases, learners

may circumvent stages to arrive at a quick solution; for example, a learner may
discover that asking a classmate for assistance yields an answer to a problem more
quickly than struggling independently. Whether a learner progresses through the
problem solving stages or takes a short cut to arrive at a solution, the learner will
enter and exit the problem space as shown in the simplified problem-solving model
(Figure 1-2).
Figure 1-2 Simplified Problem Solving Model
Basic problem solving model (Figure 1-1) simplified to show only the entry and exit points for
problem space Pn.
Borrowing from adult learning theory and motivation theory (Cross, 1981;
Ford, 1992; Merriam & Caffarella, 1999; Pintrich & Schunk, 1996), in the online
learning environment, a learner's ability to solve problems, coupled with the level of
importance the learner attaches to solving that problem, may lead to changes in a
learner's level of anxiety which may, in turn, lead to changes in a learner's level of
participation and possibly a learner's decision to drop out of an online class. In

online classes instructors must infer learner levels of participation by noting, for
example, the amount and quality of learner posts, learner participation in discussion
groups, and completed assignments such as tests.
Problem Typology
The final piece in the conceptual framework is a categorization of the kinds
of problems learners may encounter in an online learning environment. The
typology used in this research brings together six kinds of problems identified by
researchers: (1) situational/environmental, (2) institutional, (3) dispositional, (4)
technical, (5) instructional design, and (6) social (Table 1-1).
Cross (1981) identified three categories of barriers to learning: situational,
institutional, and dispositional. Situational barriers are created as a result of events
in a person's life at any given time (Cross). Situational barriers may include access
to computers as well as general life situations (Bullen, 1998; Cross, 1981); for
example, a learner may not be able to purchase a personal computer for home use,
leading to problems borrowing resources from friends and family or the
inconvenience of having to travel to an on-campus computer lab.
Institutions create barriers, directly and indirectly, as a result of policies,
practices, and procedures. Institutional barriers exclude, discourage, or otherwise
hinder participation in online or traditional education (Cross, 1981). Examples of
institutional barriers include requirements for proctored exams (on campus or at

designated testing centers), on campus registration (instead of online registration),
and higher tuition for online classes.
Dispositional barriers are those "related to attitudes and self-perceptions
about oneself as a learner" (Cross, 1981, p. 98). Dispositional barriers are created as
learners develop and change their beliefs and attitudes about their ability to learn
(Cross, 1981). Examples of dispositional barriers include learner attitudes about
their ability to use computer technology and participate effectively in a distance
education setting, a learner's comfort and confidence with the course material, and a
learner's feeling of self-efficacy (Bandura, 1997; Bullen, 1998; Jonassen &
Grabowski, 1993).
Ryan, Carlton, and Ali (1999) identified three basic categories of technical
problems in an online learning environment: hardware, software, and connectivity.
Hardware problems include problems with the learner's personal computer, modem,
and printer. Software problems include problems with applications on the learner's
personal computer, such as word processors and Internet browsers, as well as
software on the learner's Internet Service Provider's (ISP) or the learning
institution's servers. Connectivity problems include problems with telephone,
cable, and wireless network connections as well as software connections, such as
being able to connect to an institution's web site (problems such as invalid
user ids and passwords fall into the connectivity category of technical problems).

M. Driscoll (1998) identified, and Bullen (1998) supported, two common
issues with the design of online learning environments: inadequate materials and
lack of variety. Inadequate materials include incomplete course content and supporting
resources such as old or outdated links. Online classes that take advantage of new
technology present the learner with a variety of methods to access class content (for
example, presenting both an audio and text version of a speech or lecture) and other
class features (for example, using both synchronous and asynchronous
communication). Failure to provide adequate materials and lack of variety may
result in learning environments that do not adequately engage and captivate learners.

Table 1-1 Typology of problems online learners encounter
This typology of problems was developed and tested in a pilot survey prior to collecting data for the study. The problem categories were
sufficient to capture all problems reported by survey respondents.

Situational/Environmental
Definition: Problems arising from the learner's life situation.
Structured example: A learner's PC is located in a noisy part of the house, making it difficult for the learner to concentrate.
Ill-structured example: A learner, faced with limited resources, must determine how to allocate those resources to cover essential items (e.g., rent) and non-essential items (e.g., entertainment) while paying for online access (e.g., a monthly Internet Service Provider (ISP) fee).

Institutional
Definition: Practices or procedures imposed by policies and institutions that discourage learners.
Structured example: A learner is required to log at least two hours per week for each online class.
Ill-structured example: An institution requires learners to travel to campus to take exams or to find a trusted resource (e.g., a professor at another institution closer to the learner's home) to proctor the exam.

Dispositional
Definition: Learner attitudes and beliefs about their ability to learn and their ability to use technology to effectively participate in an online class.
Structured example: A learner who feels prepared for an online class is unfamiliar with a new PC and is concerned that the new cable modem may be hard to use.
Ill-structured example: A learner recently completed an online class but did not do well because the learner felt he or she did not have adequate technology, time management, and study skills.

Technical
Definition: Problems with computer components including hardware and software.
Structured example: An online learner attempts to connect to the online class only to receive a message that the system is unavailable until the following day.
Ill-structured example: An online learner attempts to connect to the online class without success. There are no system messages to help the learner determine the cause(s) of the problem.

Instructional Design
Definition: Problems arising from the design of a class.
Structured example: After clicking a button on a screen, the learner receives a message that the link is inactive. The information on the unavailable web page is not critical for the learner's understanding.
Ill-structured example: After clicking on a button the learner is taken to the instructor's notes, which contain multiple links that are unavailable. The unavailable links contain information that is critical to the learner's understanding of the instructor's notes.

Social
Definition: Problems arising from the learner's life situation including the consequences of learner-instructor or learner-learner interaction.
Structured example: A learner is able to satisfy her or his need for social contact with the instructor and classmates through the use of email or the class discussion room.
Ill-structured example: A learner is flamed (e.g., receives email that is unprofessional or of a disparaging nature) and, as a result, feels hurt and no longer feels safe participating in online discussions.
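As a rough sketch of how trouble reports might be sorted into the Table 1-1 categories, the mapping below pairs each category with indicative keywords. The keyword lists here are invented for illustration; they are not the coding scheme actually used in the study (cf. Table 3-10).

```python
# Hypothetical keywords for each Table 1-1 category; the study's actual
# coding keywords (Table 3-10) are not reproduced here.
CATEGORY_KEYWORDS = {
    "Situational/Environmental": ["noisy", "home", "family", "work schedule"],
    "Institutional": ["registration", "tuition", "proctor", "policy"],
    "Dispositional": ["confidence", "anxious", "study skills"],
    "Technical": ["modem", "browser", "password", "server", "connect"],
    "Instructional Design": ["broken link", "missing page", "unclear assignment"],
    "Social": ["flame", "rude email", "discussion"],
}

def categorize(report):
    """Return every category whose keywords appear in a trouble report."""
    text = report.lower()
    return [category for category, keywords in CATEGORY_KEYWORDS.items()
            if any(keyword in text for keyword in keywords)]
```

A report can match more than one category, which mirrors the qualitative coding problem: a single incident may have, say, both a technical cause and an instructional design consequence, and the researcher must decide how to count it.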

Another barrier to learning online is social in nature. The social
consequences of learner interaction may create barriers to learning (Bullen, 1998;
Holt, 1998; McCormack & Jones, 1997; Strate, Jacobson, & Gibson, 1996); for
example, flaming (sending disparaging comments via email or in a chat
room) may lead to increased learner frustration, decreased learner satisfaction, and
decreased learner participation.
The conceptual framework for this study brings together the typology of
problems, problem solving, and learners. The problem typology comprises six
categories or problem types ((1) Situational/Environmental, (2) Institutional, (3)
Dispositional, (4) Technical, (5) Instructional Design, and (6) Social) and forms the
basis of the conceptual framework. Problem solving is a process through which
learners move to arrive at a solution to a problem. Problem solving has been
variously described as a cyclic process that includes a problem space and a
mechanism for determining whether a problem has been solved. The learner (problem
solver) provides the final piece of the conceptual framework: goal setting and
motivation.
The conceptual framework will help us identify the kinds of problems
learners might encounter in an online learning environment and how learners go
about resolving those problems.

Significance of the Research
Research indicates that dropout rates tend to be higher in online learning
environments than in traditional classroom-based learning environments; the most
commonly reported difference shows dropout rates 15% higher in online classes than
in traditional classes (Carr, 2000; ERIC, 1984; Hoffman & Elias, 1999; Parker,
2001; Ridley et al., 1995). While current research tends to focus on
improving the instructional design and delivery of web-based instruction, students
continue to enroll in, and drop out of, online courses. Students drop out of online
courses for many reasons, both internal and external to the learning environment. As
new technologies are developed and old technologies are improved it is critical that
instructional designers, instructors, and institutions understand the potential benefits
of the new technologies and problems learners may encounter as a result of using
those technologies (Hartley & Bendixen, 2001).
The typology of problems developed for this research will help us identify
the kinds of unintentional problems learners encounter in online learning
environments. Improving our understanding of the kinds of problems learners
encounter in online learning environments will help instructional designers,
instructors, and institutions establish a basis for the development of learner support
tools and resources, the use of which may in turn lead to increased participation and
decreased dropout rates. Once we understand the kinds of problems learners
encounter in online learning environments we can explore how online learners

respond to problems in the online classroom environment. Furthermore,
understanding the kinds of problems learners encounter in online learning
environments will help us link research in adult learning with research in designing
web-based instruction in a way that is centered on the learner. For example, a
teacher may use the results of this study to change the way she or he interacts with
online learners while an instructional designer may use the results of this study to
design learner centered activities or develop support tools for an online class.
Support staff may use the results of this research to identify infrastructure problems
and improve support for both instructors and learners. Finally, learners may use the
results of this research, coupled with tools and resources provided by the institution
or the instructor, to improve their general problem solving skills and, in
particular, their skills in solving the unintentional problems they may
encounter while taking an online class.
Research Questions
The purpose of this research was to (1) develop a typology of the kinds of
problems adult learners encounter in an online learning environment, (2) gain
insight into how learners respond to problems encountered in an online learning
environment, and (3) explore connections between solving (or not solving)
problems and learner satisfaction and participation. The research questions are:
1. What kinds of problems do adult learners encounter in an online learning
environment?

2. How do learners respond to problems they encounter?
3. How do learners' solution patterns relate to their ongoing satisfaction or
frustration with the learning experience?
4. How do learners' solution patterns and ongoing perceptions and
experiences relate to their decisions to participate and continue in the class?
Overview of Methodology
Qualitative methods and selected quantitative measures, within the
boundaries of an embedded case study (Yin, 1994), were used in this study. The
overarching case consisted of three semester-long online classes. The embedded
multiple units of analysis were the problems online learners encountered during the
classes. The classes selected for this study were two sections of History I and one
section of Geology I. The classes were taught online with no planned face-to-face
interactions. The classes were selected for several reasons. First, introductory online classes may be the first online classes to which learners are exposed, and first-time online learners may have difficulty solving problems that experienced online learners find easy to solve. Second, the instructor had taught all three classes for more than three semesters, minimizing instructor problems with the material and the online environment. Third, access to the learners, the instructor, and class artifacts was relatively easy.

Data were collected and analyzed for the duration of the classes. Data were
collected from multiple sources including weekly trouble reports, class postings,
class assignments, and learner and instructor feedback. Using multiple data sources
gave me the flexibility to explore connections, confirm or disconfirm findings, and
develop a description of the events that unfolded during the class.
The demographic survey and weekly trouble reports used to collect data in
this study were self-report tools. There are several acknowledged weaknesses in using self-report data (Krathwohl, 1993; Marshall & Rossman, 1999). Primarily, self-report data relies on open and honest communication with study
participants. Even when participants are being open and honest, perceptions,
situations, and external issues may affect participant responses. For example, a
learner may not have reported all problems they encountered during any given week
simply because the learner believed an event to be a mere inconvenience instead of
a problem.
I was unable to collect data from learners who dropped any of the three
classes or from learners who decided not to participate in the study. Without data
from learners who dropped any of the three classes used in this study I could not
draw conclusions about the kinds of problems they encountered that may have led
to their decisions to drop a class. It is also possible that the learners who remained in
the classes and agreed to participate in this study had generally different characteristics than learners who remained in the classes and chose not to participate
in the study.
Operational Definitions
Definitions of problem, problem space, and problem solving vary. For the
purposes of this research the following definitions were used.
1. Online Learning. Terms such as online learning, Web-based instruction,
and Internet based learning are often used interchangeably. Online
learning, as used in this research, means that learners interact with the
instructor, other class members, and the class material entirely online,
with no planned face-to-face interactions.
2. Problem. A problem exists whenever a learner perceives a gap between a current situation and a desired outcome: between what is and what ought to be (Krulik & Rudnick, 1984; Mager & Pipe, 1983).
3. Problem space. A problem space is a temporary representation of the
problem, created by the problem solver, for the express purpose of
searching for solutions (Jonassen, 2000; Newell, 1990; Newell & Simon,
1972). Additionally, artifacts can be created, used, reused, and disposed
of while the learner manipulates the problem space.
4. Problem solving. A cyclical process used by a problem solver to (a)
identify potential solutions, (b) select an appropriate solution, (c)
implement the solution, (d) assess the impact of selecting the solution, and (e) (if required) repeat the cycle until an acceptable solution is
achieved or the problem is abandoned.
Organization of the Dissertation
This dissertation comprises five chapters. Chapter 1 includes the purpose of
the study, the conceptual framework, the research questions, and a brief description
of the method. Chapter 2 is a review of the relevant literature on online learning,
problem solving, and participation. Chapter 2 includes a brief description of search
procedures and covers, in more detail, problems, problem solving, and problem
solvers. Chapter 3 is a detailed description of the design of the study, including a
description of the participants and the embedded case method. Chapter 4 presents
the results of the case study. Chapter 5 contains a discussion of the study, the
findings, conclusions, and suggestions for future research.

Chapter 1 presented the conceptual framework for this study and the
development of a typology of problems learners may encounter in online learning
environments. Chapter 2 discusses, in detail, the underlying concepts of problems
and problem solving upon which the conceptual framework in chapter 1 was built.
This chapter begins with an overview of the procedures used for searching the
literature related to the study. After reviewing searching methods, the discussion
moves to problems, the process of problem solving, problem space, goals, and the
problem solver. The final section of this chapter discusses possible relationships
between problem solving and learner feelings of frustration and satisfaction.
Search Procedure
While conducting a review of the literature for this study I used the
computer systems at the University of Colorado at Denver's Auraria Library to
search databases locally and nationally (e.g., ERIC). Many of the keywords I used
resulted in high hit rates; for example, using online learning as a keyword, a search
of full text education publications returned 1,005 references and a search of the
Dissertation Abstracts database resulted in 347 listings.

Topics of the 347 dissertations included online collaboration, designing
online learning, teacher perceptions of online learning, examination of specific
online learning classes (mostly medical in nature), and examination of specific
computer science topics (such as robotics and speech recognition). There was no
single dominant theme; instead, there were several categories in which multiple
dissertations appeared. Categories in which more than 15 dissertations appeared
included designing online instruction, research on specific computer science
problems, exploring teacher skills, and dissertations that covered specific online
classes. Searches of other databases produced similar results. Searching databases
with combinations of the keywords such as online, problem(s), problem solving,
critical incidents, and solving online problems reduced the number of listings to
manageable chunks.
I was able to find references to online learning and distance learning;
however, I located only one reference, a dissertation by Smith (2000), that focused
on critical incidents (problems that differ significantly from what learners or faculty
consider normal problems) teachers and learners encountered in an online learning
environment. The purpose of Smith's (2000) qualitative research was to assess how
faculty responded to, and managed, critical incidents in the online classroom. Using
a combination of survey instruments and interviews, Smith collected demographic
and critical incident data. Brookfield's Critical Incident Questionnaire (CIQ) (as cited in Smith, 2000) was used to collect data on critical incidents from both faculty and students. A corroboration instrument was used to verify critical incidents
reported by faculty. The study participants were both faculty and students. Faculty
had at least one year of online teaching experience. Student participants were
selected from the pool of students taught by the faculty participants; no outside
students were used. The classes were taught online, with students participating from
remote locations (home, office, or school labs). A description of the study participants was missing from the dissertation; however, it appears that 30 students and 30 faculty members were provided by each institution that participated in the study; the number of institutions was not given. No statistical measures were recorded, so I was unable to determine the size of the sample(s). However, as the study was primarily qualitative, two of her three key findings nonetheless support my research. Key findings from the Smith (2000) study include: (a) faculty members are not trained to handle critical incidents in the online classroom, (b) feedback from the instructor is critical, and (c) technology problems create barriers to positive learning experiences. The critical nature of instructor feedback that emerged from my study is supported by Smith's finding (b) above; additionally, the Smith study supports one of the types of problems developed for this study, item (c) above.
Much of the research on solving problems online focused on collaborative
problem solving and solving specific types of assigned problems, usually math or computer science problems. Much of the research on learning styles compared student learning styles and student success as measured by grades; generally, the results indicated no significant difference between learning style and student
success. One study, a dissertation by Ahn (1999), looked at learner personality types
from a qualitative perspective, noting that, for example, perceiving types
participated less in online discussions than did judging, sensing, or knowing types.
For general problem solving I turned to Human Problem Solving, an
extended study of problem solving by Newell and Simon (1972). Influenced by the
development of the digital computer and the developing field of information
processing (now called cognitive science with the embedded field of artificial
intelligence), the authors asserted that information processing theory can be used to
explain thought processes (1972). Their goal, and an acknowledged limitation of
their study, was to develop "a precise symbolic model on the basis of which pertinent specific aspects of [a person's] problem solving behavior can be calculated" (p. 5). In selecting an information processing theory, the authors
excluded physiological models previously used to explain problem solving
behavior, freeing them to study things that could not be directly observed, such as
short- and long-term memory. In essence, their study was a break from the
traditional behaviorist approach used to study problem solving. The key elements of
the Newell and Simon information processing theory are the task environment (including the problem space created by the problem solver), the information processing model (comprising short-term memory, long-term memory, external memory, receptors, effectors, and a governing processor), and the goal of the
problem solver. The theory was tested using cryptarithmetic (essentially encrypted math puzzles), logic, and chess problems. Key contributions from the Newell and Simon study (1972) include the definition of a problem space as "the internal representation of the task environment used by the subject" (p. 56) and the model of
an information processing system. While there are other theories of problem solving
(Jonassen, 2000), the information processing theory put forth by Newell and Simon
(1972) was selected for this study because of its foundational status in the problem-
solving literature.
In conclusion, research suggests that problem solving is a process involving:
(a) a problem, (b) the problem solving process, (c) a problem space, (d) a goal or
end state, and (e) the problem solver (Cross, 1981; Holland, Holyoak, Nisbett, &
Thagard, 1986; Jonassen, 2000; Mayer, 1992; Newell & Simon, 1972; Pintrich &
Schunk, 1996). The remainder of this chapter covers each of the five components of
problem solving.
A problem is often described as a state of discomfort, a question requiring an answer, or some unknown quantity or unresolved issue (Jonassen, 2000; Krulik & Rudnick, 1984; Mish & Morse, 1998; Watzlawick et al., 1974); an obstacle to attaining a goal (Hamilton & Ghatala, 1994); or a difference between actual behavior and desired behavior (Elliott et al., 1996). Moreover, solving the problem "must satisfy some social, cultural, or intellectual value" (Jonassen, 2000, p. 65). Newell and Simon (1972) add to the definition of a problem, stating that "to have a problem implies (at least) that certain information is given to the problem solver: information about what is desired, under what conditions, by means of what tools and operations, starting with what initial information, and with access to what resources" (p. 73). Simply put, a problem is often defined as a thing
that requires resolution for which there is no easy or apparent solution (Hamilton &
Ghatala, 1994; Hayes, 1989; Krulik & Rudnick, 1984; Mayer, 1992; Newell &
Simon, 1972; Watzlawick et al., 1974). Watzlawick, on the other hand, admitting that he might be engaged in "semantic hair-splitting" (p. 38), made a distinction between a problem and a difficulty. A difficulty is either a situation that can be resolved without any special problem solving skills or a common problem encountered in the course of daily life that has no known solution, one a person must simply live with (1974). A problem, on the other hand, is the result of a situation that ends in an impasse or deadlock. For the purpose of this study, no distinction was made between a problem and a difficulty.
A problem has at least four characteristics: (1) givens, the initial state; (2) goals, the desired end state; (3) operators, to facilitate movement from one state to another; and (4) obstacles, constraints or barriers (Holland et al., 1986; Mayer, 1992; Newell & Simon, 1972). Additional characteristics of a problem include
complexity, domain specificity, and structuredness (Jonassen, 2000).

Complex problems are defined by issues such as the number of variables, the
interconnectedness of the variables, the dynamic nature of the problem, and the
ability to observe key variables (transparency) (Funke, 1991; Jonassen, 2000).
Problems can be well-structured (simple, with well-defined relationships) or ill-structured (complex, with irregular and ill-defined relationships) (Jonassen, 2000; Spiro & Jehng, 1990). Structured problems are easy to identify: they have correct
and incorrect answers; learner performance while solving the problems is
observable and measurable; and application of knowledge to solve the problem
varies little from learner to learner and situation to situation (M. Driscoll, 1998). Ill-
structured problems are complex and may require the learner to call upon multiple
sources of information, skills, and knowledge to solve (Spiro et al., 1988; Spiro,
Vispoel, Schmitz, Samarapungavan, & Boerger, 1987). Solving ill-structured
problems requires learners to apply knowledge in different ways to arrive at an
acceptable outcome (M. Driscoll, 1998). Finally, "problem-solving is situated, embedded, and therefore dependent on the nature of the context" (Jonassen, 2000, p. 68). A problem solver, therefore, relies on domain-specific cognitive operations to
solve a problem in a given context (Jonassen, 2000).
A further distinction can be made between problems created as part of the
instructional design process (intentional problems), such as the development of problems in a problem-based learning environment (Barrows, 1988, 1994; M. P. Driscoll, 2000; Duffy & Jonassen, 1992), and problems that arise as a result of
participating in an online learning environment (unintentional problems). From a learner's perspective, whether a problem is intentional or unintentional may be of little significance. More important to the learner is whether or not the problem can be resolved, and the effect of solving or not solving the problem on the learner's performance and persistence; e.g., if the problem is not solved, will the learner's grade be impacted, or will the learner give up and drop out? From an instructor, instructional designer, or institutional point of view, understanding the kinds of problems learners encounter in online learning environments is critical.
Generally, problems are not uniform in the way they appear to learners or in
the way they interact with other problems (Jonassen, 2000); furthermore, what can
be a difficulty for one learner can be a problem for another learner. Additionally,
learners may decide not to report a problem to an instructor, teaching assistant, help
desk, or (in the case of this research) to me because the learner views the problem as
a mere nuisance.
Online Problems
Research indicates that as instruction moves from the traditional classroom
to the Web, instructional developers, teachers, and learners must understand how
learners respond to problems and the consequences (desirable versus undesirable, direct versus indirect, and anticipated versus unanticipated [Rogers, 1995]) of teaching in this new environment (Holt, 1998; O'Malley & McCraw, 2001; Palloff & Pratt, 1999). Online instructors and learners will encounter problems, directly and
indirectly, that are unique to the online learning environment. How instructors and
learners respond to problems and to each other in the online learning environment
may affect learner participation and learner attitudes about online learning.
In a web-based classroom, as in a traditional classroom, learners and
teachers interact to perform learning-related tasks. However, in a web-based
classroom learners and teachers are not collocated (McCormack & Jones, 1997).
Instructional technologies (in this case, computers and the Web) are the bridge that
joins learners and teachers separated by time and space (Moore & Kearsley, 1996).
In addition to problems with content, learners and teachers in an online
(web-based) learning environment face the added task of learning about, and
working with, technology. Effective use of web-based learning takes time and
practice for instructors and learners (Cambre, Erdman, & Hall, 1996) and
inexperience with the technology may create a unique set of problems for online learners, problems that may lead an online learner to drop out, stop out, decrease their level of participation, or experience increased frustration.
Underlying all other issues are the limitations of the web (and the Internet)
that lead to changes in the way learners and teachers interact. For example, verbal
and nonverbal cues available to the teacher in the traditional classroom are
nonexistent in the online classroom (Collison, Elbaum, Haavind, & Tinger, 2000).
When a learner is sitting at her or his personal computer responding to another learner's post or an instructor's question, they do so isolated from other learners and
the instructor. Attempts to reduce learner isolation include personalizing online
learning environments by posting photos and bios of learners and instructors.
However, a personalized online learning environment is quite different from a
traditional classroom in which a learner can see and hear other learners and the instructor.
Advances in technology, such as the development of Internet2 or the pan-
European Geant network and the development of new tools will help improve the
web and the delivery of online instruction and possibly allow new ways to reduce
social isolation for online learners. Currently, however, instructional designers,
instructors, learners and learning institutions have to work within the limitations of
existing technology to develop and deliver online instruction which may result in
creating online learning environments that are more teacher centric than learner
centric. Coupled with the higher dropout rates that research suggests occur in online courses compared with traditional classroom-based courses (Carr, 2000; ERIC, 1984; Hoffman & Elias, 1999; Ridley et al., 1995), today's online learning environments may not be the best environments for all learners.
Problem Solving
Much has been written about human problem solving as a subtopic of larger
research areas (Maier, 1970) such as philosophy, psychology, and education
(including the field of instructional design) (Jonassen, 2000). In Western
philosophy, problem solving is often tightly coupled with discussions attempting to

describe reality, arguing, perception, and thought (Angeles, 1992; Dewey, 1991;
Flew, 1989; Fraser, 1959; Tamas, 1991). In The Passion of the Western Mind:
Understanding the Ideas That Have Shaped Our World View, Tamas (1991)
discusses how philosophers concern themselves with explaining the nature of
reality, determining the limits of human knowledge, and critical thought.
Development of a philosopher's worldview is achieved by presenting (or being
presented with) a problem, followed by an extensive dialogue with the self and
others to develop an explanation. In the philosophy of education, knowledge and the nature of that which is valuable are central issues (Kneller, 1971); the philosophy of education is "an attempt to discover what education is and how it takes place" (Archambault, 1964, p. 3). To achieve this goal, educational philosophers engage in
the tasks of representing and solving problems that are speculative, prescriptive, and
analytic (Kneller, 1971). Educational philosophers are concerned with asking the
right (relevant and meaningful) questions (Morris & Pai, 1994). Online learners,
on the other hand, are faced with solving problems that are part of the instruction
(intentional problems developed to challenge a learner's problem solving and
critical thinking abilities) and solving unintentional problems that arise in the course of an online class and prevent learners from participating effectively.
Psychology, Cognitive Psychology, Cognitive Science, and Computer Science (in particular, Artificial Intelligence studies) have contributed greatly to
our understanding of how people think and solve problems (Newell & Simon,
1972). From the behaviorist studies by B. F. Skinner to the social cognitive learning
studies by Albert Bandura, much has been written on human problem solving
activities. Problem solving, however, is often included as an integral part of
discussions such as resolving personal life situations, decision making processes
(including problems caused by faulty decision making processes), and the
development of the person (Bandura, 1997; Bateson, 1972; Kegan, 1982). The
definitions of problems and problem solving, as used in this research, were not
pulled from a single source; instead, the definitions (documented in chapter 1 under
operational definitions) were influenced by literature from multiple disciplines.
The theory of problem solving selected for this study is the information
processing theory presented by Newell and Simon (1972). The components of an
information processing system are (1) an active processor, (2) an input (sensory)
system, (3) an output (motor) system, (4) internal long-term memory (LTM), (5)
internal short-term memory (STM), and (6) external memory (EM). A problem
space comprises information about the problem, the problem solver's initial understanding of the problem, a set of information processing tasks a problem solver may evoke to produce new information about the problem, and the total knowledge available to the problem solver, including temporary dynamic information, information in LTM and EM, and access and reference information or additional information in LTM and EM (Newell & Simon, 1972). Problem solving
in the Newell and Simon model is a process that selects from a large set of possibilities an element having certain properties, or traversing a large space to find one of a rare set of paths with preferred properties (p. 137). Newell and
Simon's definition of problem solving is supported by other researchers who define
problem solving as the process whereby a problem solver moves from a current
state (how things are) to a desired state (how things ought to be) (Elliott et al., 1996;
Jonassen & Grabowski, 1993; Pintrich & Schunk, 1996).
The problem solving process is often described as a cycle (Bruner, 1966;
Gagne & Medsker, 1996; Hamilton & Ghatala, 1994; Newell, 1990; Newell &
Simon, 1972) in which the problem solver moves from an initial state to a desired
end or goal state (Mayer, 1992). The problem solving process comprises steps
variously labeled: (a) formulating a test procedure, implementing the test procedure, and comparing the test results with some criterion (Bruner, 1966); (b) interpretation,
scanning, propositional construal, imaginative insight, new cognitive interpretation,
remembering, and action (Mezirow, 1991); (c) need, create, implement, evaluate
(Knowles, Holton, & Swanson, 1998); (d) finding the problem (recognition),
representing the problem, planning the solution, carrying out the plan, evaluating the
solution, and consolidating gains (learning from the process) (Hayes, 1989); (e)
preparation, incubation, illumination, and verification (Mayer, 1992; Wallace,
1926); and (f) understanding the problem, devising a plan, carrying out the plan,
and looking back (Mayer, 1992; Polya, 1957).

Problem solving has also been defined as: (1) "a higher level cognitive activity either novel or routine, that requires previous learning of various types and that may result in new learning" (Gagne & Medsker, 1996, p. 124); (2) "a means of finding an appropriate way to cross a gap" (Hayes, 1989, p. xii); (3) "the means by which an individual uses previously acquired knowledge, skills, and understanding to satisfy the demands of an unfamiliar situation" (Krulik & Rudnick, 1984, p. 4); and (4) "a process of searching through a state space" (Holland et al., 1986, p. 10).
Problem solving is a process triggered by a person whenever he or she
perceives a difference between a current state and a future or desired state, coupled
with a perceived need to solve the problem (a need to move from the current state to
the desired state). To solve a problem a learner uses a problem solving process such
as those proposed by various researchers (Bruner, 1966; Gagne & Medsker, 1996;
Hamilton & Ghatala, 1994; Jonassen, 2000; Newell, 1990; Newell & Simon, 1972).
An additional, and critical, component of any problem solving process is the
problem space (Newell, 1990; Newell & Simon, 1972).
Problem Space
Defined as the space created by the problem solver within which the
problem solver will conduct a search for a solution to the problem at hand (Newell,
1990), the problem space is, to a large extent, transitory; it is an internal
representation of the problem, created by a problem solver to solve a specific
problem (Newell & Simon, 1972). When a problem space is created, the problem solver adds, deletes, and interprets information; that is, "the problem solver uses knowledge of the language and the world to understand problem information" (Hayes, 1989, p. 9). Once the problem is solved or abandoned, the problem solver's
attention moves on to other issues, resulting in the abandonment of the problem
space. The problem space may include physical artifacts, such as the problem boxes
developed for medical students (Barrows, 1994) and metaphysical objects such as
mental maps and schemas.
A problem space is created by a problem solver to place boundaries on a
problem to make it solvable (Newell, 1990). When the problem space is no longer
needed it is abandoned; some artifacts are preserved, others are discarded; some
concepts are stored in long-term memory, others are forgotten. When a new, similar,
problem is encountered, a new problem space is created. Artifacts and concepts can be reused, recreated, or created anew to solve the existing problem; however, each problem is framed within the context of a problem space.
Problems are solved by searching and manipulating a problem space, either mentally (by manipulating ideas and concepts) or physically (by manipulating physical artifacts, such as resetting a computer modem) (Jonassen, 2000; Newell, 1990). There are two basic kinds of strategies used for searching a problem space.
The first kind of strategy is called the problem search (Newell, 1990), a search in which the problem solver moves immediately to a solution. In the problem search,
the problem solver is familiar with the problem and possible answers to the problem; thus, the time required to search the problem space is minimal. The second
kind of strategy is called a knowledge search (Newell, 1990). In the knowledge
search, the problem solver searches memory for knowledge to guide the problem
solving activities. Depending upon the problem and the problem solver's knowledge base (knowledge about the problem or similar problems), a knowledge search may be conducted continuously, "and the more problematical the situation, the more continuous is the need for it" (p. 98). An additional strategy I have labeled the
problem shortcut occurs when, in essence, a problem solver bypasses her or his
normal problem solving method by seeking assistance or a quick answer from
another source (such as a teacher, another student, or a friend).
Searching the problem space is a matter of applying a set of operators, for example, pressing a button to reset a modem or selecting a phone number from a list of remembered phone numbers (accessing long-term memory) (Newell, 1990).
Another key aspect of a problem space is that operators can be applied freely and in
any order (Newell, 1990); that is, the goal of problem solving activity is to reach an
end state regardless of how operators are applied. Furthermore, searching the
problem space involves a trade-off between deliberation and preparation (Newell,
1990). At some point in the problem solving process a decision must be made. The
problem solver can deliberate, that is, "engage in activities to analyze the situation, the possible responses, their consequences, and so on" (p. 102), or the problem solver can select from various "responses or aspects of responses" (p. 102) stored in the problem solver's memory (Newell, 1990) in order to progress toward the problem solver's end-state or goal.
To solve a problem the problem solver must have a goal in mind; that is, the problem solver must have some knowledge about the desired end state (Hayes, 1989; Mayer, 1992; Newell, 1990; Newell & Simon, 1972). During the problem solving process, the problem solver must keep the desired end state in mind because multiple solution paths can be generated, one or more of which may lead the problem solver away from or toward a desired solution (Newell & Simon, 1972). A
problem solver is not always presented with a solution; instead, a problem solver often uses an existing problem solving process to develop potential solutions
(Newell & Simon, 1972). In examining a goal, the problem solver places boundaries on the problem space, including some information while excluding other
information. Establishing a goal does not necessarily lead to the optimum path
through the problem space. Problem solvers will make errors, requiring a strategy to
back up and take another path (Newell & Simon, 1972).
Much of what we know about goal-setting behavior comes from the field of psychology, such as Bandura's (1997) research on self-efficacy and Lewin's (1999) research on level of aspiration. Lewin (1999) identified four stages in a level of aspiration sequence: (a) the last performance, the learner's perception of their performance the last time they attempted to solve a similar problem; (b) setting the level of aspiration for the next performance, the learner's perception of their potential performance the next time they try to solve a similar problem; (c) the new performance, the learner's perception of their actual performance; and (d) the psychological reaction to the new performance, the affective response to solving, not solving, or abandoning a problem. Bandura defined self-efficacy as "judgments of personal capability" (Bandura, 1997, p. 11) and linked self-efficacy to goal setting, stating
that goal setting is mediated by three behaviors: (a) self-evaluation of the problem
solvers performance, (b) the problem solvers perceived self-efficacy for reaching a
goal, and (c) adjustment of the problem solvers behavior based on performance in
attaining the goal (Bandura, 1997). Lewins goal setting and level of aspiration and
Banduras goal setting and self-efficacy are similar in that they identify self-
evaluation of previous performance, setting an expected level of performance for the
next event, self-evaluation of the event just completed, and the psychological
reaction and subsequent adjustment of behavior based on perceived performance in
attaining a goal.
Problem Solvers
It has been said that human beings are natural problem solvers (M. D. Jones, 1998), that problem solving is part of everyone's daily life (Hamilton & Ghatala, 1994; Jonassen, 2000), that some people are better problem solvers than other people (Elliott et al., 1996), and that people faced with identical problems may create unique internal representations of those problems (Hayes, 1989). In other words, adult learners are not homogeneous (Cross, 1981); each learner has a unique set of filters through which they solve problems and experience life's events, including instruction (Davis, Sumara, & Luce-Kapler, 2000; Jonassen & Grabowski, 1993).
Individual Differences
Although results of studies on individual differences and online learning are mixed, awareness of individual differences will help educators understand some of the difficulties learners encounter when engaged in specific learning and problem-solving tasks (Jonassen & Grabowski, 1993), leading to an increased understanding of what it takes to be a successful online learner. Hartley and Bendixen (2001) argue that "understanding how learners' characteristics impact their ability to succeed in environments that are very different from traditional learning situations" (p. 23) is critical if we are to help learners succeed in online learning. Generally, research on individual differences interprets results of learning style measures, such as Kolb's Learning Styles Inventory (1976), as indicators of preferences or general tendencies (Healey & Jenkins, 2000; Jonassen & Grabowski, 1993). Furthermore, research suggests that, used in conjunction with other measures, learning style measures may be predictors of student success in online learning (Blocher, 2001; Diaz, 2001; Parker, 2001).
An often unstated requirement for successful online learning is prior computer skill, including use of software products such as Microsoft Word, Safari, and Internet Explorer, and a working knowledge of the Internet (M. Driscoll, 1998). Diaz (2001) stated that a learner's prior experience with online learning environments and comfort with asynchronous communication (delivery of the course material precedes the learner's access to the material; O'Malley & McCraw, 2001) may have an impact on a learner's participation. A learner's computer knowledge and/or skill level, intellectual inquisitiveness, and online activity can be predictors of success, while the kinds of problems learners encounter and how learners solve those problems may affect the learner's level of participation.
Problem solving is affected by the problem solver's internal state, particularly when there is a strong feeling of anxiety (Bruner, 1966). Furthermore, there is some evidence that high drive and anxiety lead one to be more prone to "functional fixedness" (p. 52), a condition in which a person selects and focuses exclusively on a single and incorrect solution, resulting in the inability to solve the problem at hand (Bruner, 1966). Problem-solving behavior varies between people (two people may approach a problem differently) and within people (a person may approach similar problems in different ways) (Maier, 1970). "Behavior undergoes an abrupt change when (a) the individual experiences repeated failure because the problem is too difficult for him to solve; (b) pressures to solve are present; (c) escape from the situation is impossible; and (d) substitute goals are not available" (Maier, 1970, p. 177). As a person struggles with a problem, they approach a frustration threshold; when it is exceeded, the autonomic nervous system takes over, and the behavior is characterized by "fixation, aggression, and regression" (Maier, 1970, p. 177). A learner's inability to solve a problem may therefore lead to increased frustration that may lead, in turn, to decreased learner participation.
Bruner (1966) and Maier (1970) are supported by Bandura (1997), who asserted that "the self-efficacy mechanism also plays a pivotal role in the self-regulation of affective states" (p. 137), and Lewin (1999), who found that a person's level of aspiration is dependent upon their need for success, their need to avoid failure, and their perceived probability of success. Put into a framework of problem solving, a problem solver is affected by: (1) previous performance in a similar task; (2) perceived performance of the next task; (3) perceived pressures while performing a task (e.g., time pressure, degree of difficulty); (4) perceived value in completing the task; and (5) belief in the problem solver's ability to complete the task.
While motivation is not the focus of this study, it is an integral part of problem solving behavior; without motivation there would be no problem solving activity. Many theories of motivation exist (M. P. Driscoll, 1994, 2000; Pintrich & Schunk, 1996). Pintrich and Schunk provide an excellent review of motivation theory in Motivation in Education: Theory, Research, and Applications. Included in Pintrich and Schunk's work (1996) are reviews of theories such as Intentions and Behaviors by Fishbein and Ajzen, Goal Setting by Locke and Latham, and Self-regulation and Volition by Zimmerman and Schunk.
Much has been written about motivation, directly and indirectly, embedded in topics such as psychology and learning theory. Knowles (1980) included needs (as motivating forces) as an antecedent to an act of behavior. In this instance, a need exerts pressure on a person, motivating the person to satisfy the need. Kegan (1982) defined meaning making as "physical, social and survival activity that is the primary human motion, irreducible" (p. 19); humans are motivated to make sense of their world. Wlodkowski (1993) pointed out that motivation cannot be directly measured, so researchers develop hypothetical constructs of motivation to explain behavior in terms of motivation. Bandura (1997) defined motivation as a general self-regulatory construct that includes behaviors of selection, activation, and goal direction. Driscoll devoted a chapter in Psychology of Learning for Instruction (1994; 2000) to motivation, recognizing the importance of motivation while making the observation that there is no clear definition of, or way to measure, motivation. Although not always mentioned explicitly, motivation plays a critical role in research on instructional design and problem solving; for example, in Learning to Solve Problems: An Instructional Design Guide, Jonassen (2004) defines problem solving, in part, as a goal-directed activity. Finally, Pintrich and Schunk (1996) acknowledge that there are many definitions of motivation and motivational processes. In Motivation in Education: Theory, Research, and Applications (Pintrich & Schunk, 1996), the authors settled on a definition of motivation (used in this research) as "the process whereby goal-directed activity is instigated and sustained" (1996, p. 4).
Research on problem solving is often embedded in other fields of study, such as philosophy, cognition, psychology, and artificial intelligence. One study in particular, by Newell and Simon (1972), focused on human problem solving as an information processing activity. The information processing theory put forth by Newell and Simon (1972) is a key element of my research. While I found the distinction between a difficulty and a problem, and the difference between an intentional and unintentional problem, to be of interest, no such distinctions were required of the learner. This research relied on learner perception and learner self-reports of problems; to have made a distinction between mere difficulties and intentional or unintentional problems may have caused confusion and resulted in the loss of valuable data. As a researcher I was interested in all problems online learners encountered, including those problems labeled as difficulties.
Literature on problem solving reviewed for this dissertation fell into three
major categories: (1) general problem solving; (2) problem solving in education
(predominantly intentionally developed problems); and (3) problem solving as part
of a decision making process (often embedded within psychological contexts).

Missing from the literature was research on the kinds of unintentional problems
learners encounter. While researchers acknowledge the importance of problem
solving skills in education, the lack of material on the kinds of unintentional
problems learners encounter in online learning environments speaks volumes.
Problem solving involves motivation: motivation to learn, motivation to reach a goal, motivation to solve a problem. Literature on motivation includes, but is not limited to, studies of intentions and behaviors, attribution theory, self-regulation, and goal setting (Bandura, 1997; Cross, 1981; M. P. Driscoll, 1994, 2000; Kegan, 1982; Knowles, 1980; Pintrich & Schunk, 1996; Wlodkowski, 1993). Given the plethora of theories and definitions of motivation, I settled on the definition of motivation as "the process whereby goal-directed activity is instigated and sustained" (Pintrich & Schunk, 1996, p. 4).
Literature on problems, problem solving, distance education, and web-based instruction supports the categories used in the problem typology that is central to this research. The typology of problems learners may encounter in online learning environments, developed for this research, is a framework, a flexible tool that allows the addition or removal of categories should the need arise. Instructors, instructional designers, and institutions can use the typology to improve the creation, delivery, and support of online instruction.

This chapter describes the research method and design, including reasons for
selecting the research method, the unit of study, subjects, data collection procedures,
and instrumentation.
The purpose of this research was to (1) develop a typology of the kinds of
problems adult learners encounter in an online learning environment, (2) gain
insight into how learners respond to problems encountered in an online learning
environment, and (3) explore connections between solving (or not solving)
problems and learner satisfaction and participation. The research questions are:
1. What kinds of problems do adult learners encounter in an online learning environment?
2. How do learners respond to problems they encounter?
3. How do learners' solution patterns relate to their ongoing satisfaction or frustration with the learning experience?
4. How do learners' solution patterns and ongoing perceptions and experiences relate to their decisions to participate and continue in the class?

This study involved a contemporary issue (problems online learners
encounter) within a real-life context (online classes) in which I had little control
over the events as they unfolded within a bounded context (the online environment),
making this an ideal candidate for the case study method (Krathwohl, 1993; Miles &
Huberman, 1994; Stake, 1995; Yin, 1994).
Participants in the study were adults, 18 years or older. Data on the average
learner age in each class were not available; however, the average ages of all
learners at each institution were available (Table 3-1). Using fictitious names for
each institution, the average learner age at Aspen University (AU) was 29 and the
average age at Rocky Mountain University (RMU) was 32. Eighty-nine learners
completed the three classes: 26 (18 female and 8 male) in AU-Geology I, 31 (23
female and 8 male) in AU-History I, and 32 (19 female and 13 male) in RMU-
History I.
The mix of female to male students in each class was 69% female and 31% male in AU Geology I, 74% female and 26% male in AU History I, and 59% female and 41% male in RMU History I. The overall mix of learners who completed the three classes in the study was 60 female and 29 male learners, or 67% female and 33% male. The mix of female and male learners who participated in the study (Table 3-2) roughly matched the overall mix of female and male students who completed the three classes; of the 32 study participants, 24 (75%) were female and 8 (25%) were male. No data on ethnicity were collected.
Table 3-1 Class demographic data
Demographic data were collected using a self-report survey administered at the beginning of the semester, with the exception of the average age at each institution, which was supplied by the instructor.
Notes: 1) AU = Aspen University, RMU = Rocky Mountain University. 2) Geo = Geology, His = History.

Number of students
School  Class   Registered  Dropped before  Dropped after  Completed  Females    Males      Average age
                            census date     census date    the class  completed  completed  at institution
AU      Geo I   39          11              2              26         18         8          29
AU      His I   49          17              1              31         23         8          29
RMU     His I   51          16              3              32         19         13         32
Total           139         44              6              89         60         29         N/A

Percentage of students
AU      Geo I   -           28%             5%             67%        69%        31%        29
AU      His I   -           35%             2%             63%        74%        26%        29
RMU     His I   -           31%             6%             63%        59%        41%        32
Total           -           32%             4%             64%        67%        33%        N/A
Table 3-2 Female and male composition
The number and percentages of female and male study participants.
Note: F = Female, M = Male.

                        Class 1         Class 2         Class 3         All classes
                        F     M    n    F     M    n    F     M    n    F     M    n
Number of learners      10    3    13   9     3    12   5     2    7    24    8    32
Percentage of learners  77%   23%  13   75%   25%  12   75%   25%  7    75%   25%  32

Table 3-3 Learning environment data
Participants were asked to identify the learning environment that most closely described their primary study location.
Noise levels: (1) Quiet, distraction free; (2) Somewhat noisy with occasional distractions; (3) Somewhat noisy with frequent distractions; (4) Noisy with occasional distractions; (5) Noisy with frequent distractions.

              (1)       (2)        (3)      (4)       (5)      n
Class 1       4 (31%)   7 (54%)    1 (8%)   1 (8%)    0 (0%)   13
Class 2       4 (33%)   6 (50%)    0 (0%)   1 (8%)    1 (8%)   12
Class 3       1 (14%)   5 (71%)    0 (0%)   1 (14%)   0 (0%)   7
All classes   9 (28%)   18 (56%)   1 (3%)   3 (9%)    1 (3%)   32

Table 3-4 Computer access and online experience
Depicts the various methods learners used to access the online classes used in this study and learner experience with online learning and the subject matter.

Number of learners
                              Class 1        Class 2        Class 3        All classes
                              Yes  No   n    Yes  No   n    Yes  No   n    Yes  No   n
Home                          13   0    13   12   0    12   7    0    7    32   0    32
Campus                        1    12   13   3    9    12   4    3    7    8    24   32
Work                          3    10   13   2    10   12   3    4    7    8    24   32
Other                         2    11   13   2    10   12   2    5    7    6    26   32
1st online class?             6    7    13   3    9    12   4    3    7    13   19   32
1st subject class?            12   1    13   10   2    12   2    5    7    24   8    32
Attending classes on campus?  3    10   13   4    8    12   4    3    7    11   21   32

Percentage of learners
                              Class 1           Class 2           Class 3           All classes
                              Yes   No     n    Yes   No     n    Yes   No     n    Yes   No     n
Home                          100%  0%     13   100%  0%     12   100%  0%     7    100%  0%     32
Campus                        8%    92%    13   25%   75%    12   57%   43%    7    25%   75%    32
Work                          23%   77%    13   17%   83%    12   43%   57%    7    25%   75%    32
Other                         15%   85%    13   17%   83%    12   29%   71%    7    19%   81%    32
1st online class?             46%   54%    13   25%   75%    12   57%   43%    7    41%   59%    32
1st subject class?            92%   8%     13   83%   17%    12   29%   71%    7    75%   25%    32
Attending classes on campus?  23%   77%    13   33%   67%    12   57%   43%    7    34%   66%    32
With regard to learning environment (Table 3-3), a majority of the learners
(56%) reported that they worked in a somewhat noisy environment with occasional
distractions. All learners (100%) reported having access to a PC in their home
(Table 3-4) with 25% having access to computers at work, 25% having access to
computers on campus, and 19% having additional locations from which they were

able to access a PC. Most learners (97%) reported computer use of five or more
hours per week (Table 3-5), split almost evenly between five to nine hours per week
(44%) and 10 or more hours per week (53%). A majority of the learners (66%) were
not taking classes on campus during the semester (Table 3-6); the remaining 34% of the learners were taking classes on campus during the semester: 25% attended one additional class and 9% attended two classes.
Table 3-5 Computer use data
The estimated number of hours a learner used their PC in any given week.

               Class 1              Class 2              Class 3              All classes
Hours/week     1-4  5-9  10+   n    1-4  5-9  10+   n    1-4  5-9  10+   n    1-4  5-9  10+   n
# of learners  0    9    4     13   1    2    9     12   0    3    4     7    1    14   17    32
% of learners  0%   69%  31%   13   8%   17%  75%   12   0%   43%  57%   7    3%   44%  53%   32
Table 3-6 Current on-campus classes
The number of on-campus classes learners participated in during the semester they took the online class.

                   Class 1            Class 2            Class 3            All classes
Number of classes  0    1    2    n   0    1    2    n   0    1    2    n   0    1    2    n
# of learners      10   2    1    13  8    3    1    12  3    3    1    7   21   8    3    32
% of learners      77%  15%  8%   13  67%  25%  8%   12  43%  43%  14%  7   66%  25%  9%   32
A majority of learners (93%) rated their Internet skills as intermediate (59%) or advanced (34%) (Table 3-7), and 91% rated their email skills as intermediate (66%) or advanced (25%). Learners rated their general PC skills as beginning (34%), intermediate (47%), and advanced (19%). Learners who participated in this study had experience using the Internet and email applications while having somewhat less experience using application software on their PCs (e.g., a word processor or spreadsheet application). When asked if this was the first online class the learners had taken, the answer (Table 3-4) was closely split between yes (41%) and no (59%). When asked if this was the first subject-related class (e.g., the first history class) they had taken, the answer was 75% yes and 25% no. A majority of learners (59%) had completed more than 30 credit hours before the start of the semester, while 12% had completed 0 to 9 hours, 12% had completed 10 to 18 hours, and 16% had completed 19 to 30 hours. Forty-one percent of the learners had not participated in an online class prior to participating in this study (Table 3-4).
Table 3-7 Computer skill data
Computer Skills

Number of learners
             Class 1             Class 2             Class 3             All classes
             Beg  Int  Exp  n    Beg  Int  Exp  n    Beg  Int  Exp  n    Beg  Int  Exp  n
Email        0    9    4    13   3    7    2    12   0    5    2    7    3    21   8    32
Internet     0    10   3    13   2    6    4    12   0    3    4    7    2    19   11   32
General PC   4    9    0    13   5    3    4    12   2    3    2    7    11   15   6    32

Percentage of learners
             Class 1               Class 2               Class 3               All classes
             Beg   Int   Exp  n    Beg   Int   Exp  n    Beg   Int   Exp  n    Beg   Int   Exp  n
Email        0%    69%   31%  13   25%   58%   17%  12   0%    71%   29%  7    9%    66%   25%  32
Internet     0%    77%   23%  13   17%   50%   33%  12   0%    43%   57%  7    6%    59%   34%  32
General PC   31%   69%   0%   13   42%   25%   33%  12   29%   43%   29%  7    34%   47%   19%  32

Beg = Beginner, Int = Intermediate, Exp = Expert

A majority of the learners (78%) were employed during the semester; 22% of the learners were not employed during the semester. Of the learners employed during the semester, 6% worked 1 to 3 hours per week, 3% worked 10 to 19 hours per week, 12% worked 20 to 29 hours per week, 22% worked 30 to 39 hours per week, and the largest group (34%) worked 40 or more hours per week.
One of the attractive features of an online class is that people may participate
in the class regardless of where they live or whether they prefer to complete class
work during the day or evening. This flexibility holds true unless the instructor or
institution places constraints upon the learners. For example, requiring all learners
to log in at 7:00 p.m. Mountain Standard Time (MST) to participate in an interactive
(synchronous) discussion may cause problems for a learner located on the East coast
of the United States in the Eastern Standard Time (EST) zone; it would be 9:00 p.m. on the East coast. Depending on the learner's time zone, a learner located in a faraway place such as England or India could be forced to participate in an online discussion in the middle of the night. The classes used for this study were conducted entirely online with no scheduled interactive (synchronous) discussions. Learners located anywhere they had access to the Internet could have participated in the classes; however, as in previous iterations of the classes used in this study, all learners were located in Colorado.

Selection of the institutions used in this study was dictated by selection of the instructor. The instructor taught at two community colleges in the Rocky Mountain area. The two institutions were similar in that (a) both had enrollments in excess of 10,000 learners per year, (b) the traditional classroom setting, not online learning, was the primary method of teaching, (c) the use of online learning was increasing at both institutions, and (d) both institutions used the same learning management system (WebCT). For confidentiality reasons, I changed the names of the institutions in this publication to Rocky Mountain University (RMU) and Aspen University (AU).
The Classes
Three undergraduate introductory level classes were selected for this study,
two sections of History I and one section of Geology I. The three classes were
offered at two institutions. The classes were semester-long undergraduate classes
taught online with no planned physical (face-to-face) interaction between the
instructor and the learners or between learners. The three classes were selected for
several reasons. First, the classes were taught entirely online using the WebCT
learning management system (LMS); there were no planned face-to-face
interactions between the class members or the instructor. Second, the instructor
had taught all three classes for more than two semesters and was familiar with the WebCT LMS, thus minimizing problems associated with first-time online instructors, such as familiarization with the content, the online environment, and teaching online. Third, as introductory online classes, they may have been among the first online classes in which learners enrolled. Fourth, access to the instructor and the course artifacts was relatively easy. Fifth, the estimated class size was manageable for a qualitative study.
Learners in each class were given the opportunity to participate in the study.
Learners who agreed to participate in this study were given five extra credit points.
The instructor determined the point value based on the possible total points a learner
might have earned during the semester. Learners who did not wish to participate
were given the opportunity to submit a paper to the instructor for equal extra credit (5 points), or they could opt not to receive extra credit. Extra credit points were added to
a learners final class score; extra credit points were not a significant portion of a
learners overall grade. Finally, learners had the option to drop out of the study at
any time; none of the learners who agreed to participate in the study dropped out of
the study. No data, including extra credit data, were available for non-participants.
I assigned fictitious names to the institutions and random identification
numbers (using a table of random numbers) to the instructor and each learner to help
protect the confidentiality of the participants, the instructor, and the institutions used

in this study. I replaced learner names with their assigned participant numbers each time a learner's name appeared in a document.
Confidentiality of communication using electronic methods such as email
could not be guaranteed; however, I took reasonable steps to minimize the risk of
breaching the trust between the learners, the instructor, and me. Each learner
participating in the online class was informed of the potential risks of conducting
research online. A consent form (Appendix A) was sent to each learner who agreed to participate in the study. I established the requirement that consent forms had to be sent to me from the learner's email account so I could match an email address to a learner in the class rosters. Upon receipt of a consent form, I sent a confirmation letter to the learner (Appendix B).
I stored all electronic data on my personal computer to minimize the risk of
unauthorized access to the data; data were not stored on a University computer.
Neither the instructor nor the learners had access to the raw data. Backups were
made of all data on a regular basis. Once this dissertation is complete, data will be
copied to compact disks, tested to ensure the data can be read, then removed from
my personal computer. I will store the compact disks and other documents for the
required timeframe of five years.

Yin (1994) identified four basic types of case study designs (Figure 3-1):
Type 1 holistic, a single case with a single unit of analysis; Type 2 embedded, a
single case with multiple units of analysis; Type 3 holistic, multiple cases with a
single unit of analysis; and Type 4 embedded, multiple cases with multiple units of
analysis. The form of case study selected for this research was a Type 2 embedded design: a single case with multiple units of analysis. The overarching case was an aggregation of three semester-long online classes; the embedded units of analysis were the problems online learners encountered during each class.
[Figure 3-1 Basic Types of Designs for Case Studies: a 2 x 2 matrix of single-case vs. multiple-case designs against holistic (single unit of analysis) vs. embedded (multiple units of analysis) designs. Adapted from Yin (1994, p. 39); original source COSMOS Corporation.]
The model of the kinds of problems online learners may encounter was
piloted in the spring of 2002. Following a suggestion from Dr. Brent Wilson, I
created a list of the kinds of problems online learners may encounter in an online
learning environment. A request (Appendix F) was sent to subscribers of a

University of Colorado list server (software that facilitates two-way communication between subscribers of the list) requesting examples of problems
they encountered while taking online classes. The size of the list server varies from
semester to semester and is difficult to determine; however, more than 100 people
received the request for information. Fourteen subscribers responded with 20
problems. The low response rate may have been a result of low activity levels
(participation) in the list server (reading email from the server), pilot questionnaire
recipients who had not taken online classes and therefore believed they had nothing
to contribute, or list server participants who were engaged in current studies and had
little time to respond to a survey. Each problem reported during the pilot fit within
the proposed list of the kinds of problems learners may encounter in an online
learning environment; no dispositional problems were reported in the pilot of the
problem typology. In addition to testing the problem categories, the format for the problem reports and questions on level of preparation, level of participation, level of satisfaction, and level of importance were presented to two professors and a group of doctoral students for review; questions were revised based on feedback from the reviewers.
Researcher Role
A researcher who observes learners in a traditional classroom environment
often fills the role of participant-observer. The researcher as participant-observer is
known and visible to learners in a traditional learning environment. In the online

environment, however, the learners do not see the researcher as they go about their
online activities. Without constant reminders of the online researcher's presence (such as a notice that appears when learners log into the class website), it is possible that learner awareness of the researcher's presence may wax and wane. In the lexicon of the Internet, whenever a person observes the activities of others without revealing her or his presence, the observer is said to be "lurking." I was not a participant-observer or a concealed-observer (lurking-observer); instead, I was introduced to the learners early in the semester and I reminded them of my presence when I sent email requesting weekly trouble reports. As the sole researcher in an online environment, my role was that of a known-observer.
I collected trouble reports from participants, read participant posts, and on
one occasion had to refer a participant to the instructor because the learner
expressed her frustration about a technical problem that prevented her from turning
a test in on time. Because I was a known-observer, learners knew I was reading their posts and the occasional emails they sent to me. As a result, my observation may have had undesirable effects on the study, such as learners reporting only the problems they thought I wanted to hear about or learners reporting only problems they perceived to be critical.

In order to minimize problems with the instructor, the class material, the software interface (what the online learner sees), the learning management system (LMS; the software that lies beneath the class interface and handles tracking learner progress, testing, and other administrative functions), and possible institutional differences (such as how classes are taught and tests are administered), I used one instructor who was an experienced online teacher familiar with the WebCT LMS. Additionally, the instructor had previously taught all three classes at both institutions. Using a single instructor familiar with WebCT, the course material, and teaching online should have helped ameliorate instructor effects.
There are problems associated with the use of self-report data, such as whether learners provide open and honest answers to questions. I made the assumption that using several sources of data, including weekly trouble reports, class artifacts (such as weekly posts), and email, to validate information provided by learners would help minimize (but not eliminate) problems with self-report data.
Data Collection
All data were collected online. A demographic survey was created and made
available to all participants using the Zoomerang online survey toolset
('http://www.zoomerang.comL Using an external tool to collect data for the class
minimized of time required to develop and deliver the online demographic survey.
After all survey data were collected I closed the survey (to prevent additional input).

I then printed all available reports from the external survey. Unfortunately, there
was no way to download the raw data so that I could import the data into
SPSSand Microsoft Excel. I entered all SPSS and Microsoft Exceldata
from the online surveys manually. Data entered into SPSS and Microsoft
Excel were verified by printing summary reports in SPSS , Microsoft Excel ,
and Zoomerang then the data were compared for accuracy. All errors were
corrected prior to analysis. As stated in the confidentiality section, neither the
instructor nor the learners had access to the raw data. I made weekly (and
occasionally daily) backups of all data. After completion of this study, all data will
be copied to compact discs (CD) and tested to ensure the data can be read. I will
then remove all data from my PC and store the CDs and paper documents for five
years.
I sent each learner a blank weekly trouble report including instructions on
how to complete the reports. When learner trouble reports were not submitted, I sent
reminder notices to participants. Trouble reports were in email format and had to be
transferred to a Microsoft Excel spreadsheet for analysis. Data from each trouble
report were cut from the email and pasted into the spreadsheet. After trouble report
data were transferred to spreadsheets I compared the data in the spreadsheets with
data in the original email. Once the data were verified I copied the email into an
email folder for future reference and to function as a backup location.

I examined class artifacts weekly, including reviews of learner posts. I
created a spreadsheet to record the type of post (weekly question,
weekly answer to another learner's question, comment, or criticism) and the length
of the post. Short posts were one sentence in length, medium posts were two to
three sentences, and long posts were four or more sentences. I selected this
somewhat arbitrary length measurement so that I could get a sense of how learners
were completing class assignments; the length of learner posts was not measured to
assess the quality of learners' posts.
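The length-coding rule above is simple enough to state directly. A minimal sketch, assuming sentence counts were tallied by hand as in the study (the function name is illustrative):

```python
def post_length_category(num_sentences: int) -> str:
    """Classify a discussion post by sentence count, per the
    short/medium/long scheme described in the text."""
    if num_sentences <= 1:
        return "short"    # one sentence
    if num_sentences <= 3:
        return "medium"   # two to three sentences
    return "long"         # four or more sentences
```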
Research was conducted online, with me functioning as a known-observer.
The instructor and learners were aware of my presence as a researcher; however, I
did not actively participate in the classes. I was introduced to the learners at the end
of the first week and during the second week of instruction; I explained what I was
doing and asked for participants. After the introduction phase of the class, I
observed the class by reading trouble reports, class assignments, discussion board
posts, and available email (some email, such as learner-to-learner email, was not
available); beyond the introduction phase, the only other researcher-to-learner
communication was with individual learners to clarify questions and to request
trouble reports. I sent a final thank-you note to each participant during the last week
of class, with a reminder that participants may request (in writing) a copy of the
research results. The only researcher-to-class communication occurred at the
beginning of the class so that I could introduce myself and answer questions about

the research process. Instruments were available online with the exception of the
weekly trouble report format, which was sent via email to the learners during the
introduction phase of the class.
Data were collected from multiple sources, categorized, coded, triangulated,
and examined for emerging patterns. The three sources of data, learner self-reports,
class artifacts, and instructor feedback, formed a triad. The use of multiple data
sources supported my ability to describe the types of problems the online learners
encountered and allowed me to contrast and compare learner responses with actual
performance and instructor perceptions. Data triangulation helped reveal patterns
between learner perception and actual performance.
There were three distinct sources of data: (1) demographic data from the
learners: a demographic survey (Appendix C), trouble reports (Appendix D) with
levels of satisfaction and perceived future participation (Appendix E), and learner-
to-researcher email; (2) class artifacts: weekly posts to a discussion board, email,
and completed class assignments; and (3) instructor feedback.
Demographic data were collected using a 17-question online survey
(Appendix C). All of the survey items were forced-choice. The demographic survey
contained factual data (on the assumption that learners answered the questions
truthfully and accurately). There was no way to verify data (such as comparing
learner answers with information provided by the institutions) on the demographic

surveys. The data on the demographic surveys were used to describe the classes and
the participants.
A weekly trouble report (Appendix D) was an open-ended survey instrument
that asked the learner to describe a problem they had encountered during the week.
While an open-ended question allowed a learner to write as much as they wished,
there was no way to ask immediate follow-up questions that might have provided
additional information. Weekly trouble reports included a reminder that not all
problems are solved in order to minimize the chances learners would skip (not
report) problems they did not solve. However, it is possible that learners
encountered problems they considered mere inconveniences that were not reported.
Weekly trouble reports also included space to describe a problem and three
questions addressing the learner's perceptions of (1) the importance of the problem,
(2) their ability to solve the problem, and (3) their anticipated level of future
participation as a result of solving or not solving a given problem. The total number
of trouble reports submitted for
the course was dependent upon (1) the number of learners and (2) the number of
trouble reports actually submitted by each learner. Ninety-seven trouble reports
were submitted. Learner trouble reports were cross-checked with learner posts to
confirm or disconfirm reports of delayed participation or non-participation, such as
non-participation when a learner was on vacation.

Data Analysis
Multiple tools were used to categorize problems and analyze data. Table 3-8
depicts the relationship between the data collection methods and the research
questions. I used a problem type matrix to categorize problems (problems were
coded using the codes in Table 3-9 and inserted in the matrix) and data from the
learners, including trouble reports, learner-to-researcher email, class assignments,
and posts to class discussions.
Quantitative data (analyzed with SPSS and within Microsoft Excel
spreadsheets) were used in conjunction with qualitative data to explore learner
problems and patterns of learner participation. Quantitative data included
demographic data and learner contribution data (including the number of trouble
reports and posts to the online class). Qualitative data included comments in learner
trouble reports, discussions between learners on common problems (such as a
discussion on the use of two textbooks), and quality of learner posts (including the
length and the depth of learner answers to questions).

Table 3-8 Data collection and analysis matrix
Each research question is listed followed by the data collection method and a brief description of how the data were used.

Question 1 What kinds of problems do adult learners encounter in an online learning environment?
Instrument or Natural Artifact Use Analysis Technique
Pre-study questionnaire (pilot) To fine tune problem typology. Read and coded each returned questionnaire using predetermined categories. Create keyword list for next phase of the study.
Trouble reports To collect data on problems learners encounter. Read and code each trouble report using keyword list to determine proper category in which to place each report. As each trouble report is read, compare it to previous trouble reports using a constant comparative strategy of compare, code, and change keywords list/codes as needed.
Question 2 How do learners respond to problems they encounter?
Instrument or Natural Artifact Use Analysis Technique
Trouble reports To identify actions taken by learners to solve problems. Each trouble report is read and a list of problem-solving strategies is developed from information contained in the reports.
Threaded discussions To identify possible problems that can be revealed in discussions. Read threaded discussions to discover shared problems learners encounter and strategies used by the learners to solve the problems.
(Table 3-8 continues)

Table 3-8 continued
Question 3 How do learners' solution patterns relate to their ongoing satisfaction or frustration with the learning experience?
Instrument or Natural Artifact Use Analysis Technique
Trouble reports To explore possible effects of solving or not solving problems as revealed in trouble reports. Read trouble reports, looking for keywords such as frustration and satisfaction and learner solutions then code in Excel to explore possible connections.
Threaded discussions To explore possible effects of solving or not solving problems as revealed in threaded discussions. Read threaded discussions to look for learner-to-learner discussions about solving problems and expressions of frustration, satisfaction, and dissatisfaction. Record findings in a solution pattern matrix.
Demographic survey To explore possible connections of solving or not solving problems with differences in learner demographic data. Use Excel spreadsheet containing trouble report and demographic data to explore possible connections between solution patterns and expressions of satisfaction, frustrations, or dissatisfaction.
Question 4 How do learners' solution patterns and ongoing perceptions and experiences relate to their decisions to participate and continue in the class?
Instrument or Natural Artifact Use Analysis Technique
Trouble reports To explore data reported by learners for possible connections with problem solving and continued learner participation. Compare data in trouble reports to frequencies of learner posts and completion of or exit from online class.
Email To explore data reported by learners for possible connections with problem solving and continued learner participation. Review email messages for problems, comments about satisfaction, and comments about continuing participation or non-participation in the online class. Compare with data on learner posts to explore possible connections.
Threaded discussions To explore possible connections with problem solving and continued learner participation. Review threaded discussions for problems, comments about satisfaction, and comments about continuing participation or non-participation in the online class. Compare with data on learner posts to explore possible connections.

Self-report data require accurate and honest reporting by participants
(Krathwohl, 1993; Marshall & Rossman, 1999). For this study, learners were asked
to submit weekly trouble reports describing a problem or problems they had during
the week. Conducting the research online may have resulted in "considered rather
than spontaneous" responses (Krathwohl, 1993, p. 394); for example, when
completing a weekly trouble report a learner may have decided to report what they
believed to be a major problem while ignoring a problem they believed was a mere
inconvenience. Additionally, even though open-ended questions were used, learner
responses were generally short. The classes were held during the summer session
(eight weeks) so it was impractical to collect data and conduct weekly interviews
with participants to ask clarifying and/or probing questions.
To address potential problems with self-report data I compared learner
responses to learner activity, looking for changes in participation. For example, if a
learner reported they could not access a class for a week, I reviewed that learner's
posts for the week to confirm or disconfirm a gap in their weekly posts.
Additionally, I compared learner responses as they were coded, looking for similar
problems such as problems with the textbook, syllabus, or class website.
Unfortunately, many of the problem reports could not be triangulated with learner
participation data and class artifacts. For example, there was no way to confirm a
learner's feelings of frustration by reviewing how many learner posts were made in
a week because learners did not express frustration or confusion in weekly posts and

they contributed the minimum number of posts per week, possibly due to the
shortened summer session and the simplistic format of the classes.
Table 3-9 Initial codes for categorizing data
Each problem report was coded using the codes and problem types in this table. Some problems were coded with multiple problem types.
Major Code Sub-code Description
PI Problem importance as perceived by the learner
PI-1 Problem importance low
PI-2 Problem importance medium low
PI-3 Problem importance medium
PI-4 Problem importance medium high
PI-5 Problem importance high
AS Ability to solve a problem as perceived by the learner
AS-1 Ability to solve low
AS-2 Ability to solve medium low
AS-3 Ability to solve medium
AS-4 Ability to solve medium high
AS-5 Ability to solve high
LOP Anticipated level of future participation as perceived by the learner
LOP-1 Anticipated level of future participation low
LOP-2 Anticipated level of future participation medium low
LOP-3 Anticipated level of future participation medium
LOP-4 Anticipated level of future participation medium high
LOP-5 Anticipated level of future participation high
PT Problem type.
PT-INS Institutional problems: Practices or procedures imposed by policies and institutions that discourage learners.
PT-DIS Dispositional: Learner attitudes and beliefs about the subject and their ability to learn.
PT-TEC Technical: Problems with computer components including hardware and software.
PT-ID Instructional Design: Problems arising from the design of a class.
PT-SOC Social: Problems arising from a learner's perception of a lack of social interaction with other learners, teaching assistants, and teachers.
PS Problem structuredness
PS-S Structured, easy to solve, may have known answers.
PS-I Ill-structured, difficult to solve, requires higher order thinking skills.

Categorizing Problems
Although I collected, sorted, coded, and interpreted data, I had two other
raters classify a selection of problem reports submitted by learners. I trained the two
raters on the process for categorizing problem reports, then had the raters
independently categorize a sample of 32 trouble reports. I used Cohen's kappa
(Upton & Cook, 2004) to measure agreement between me and each of the two
raters. Cohen's kappa for researcher-rater 1 agreement was .71 and for
researcher-rater 2 agreement it was .73. A measure of .70 or greater is
considered acceptable. A review of the disagreements led to a change in the wording
of one problem category.
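For readers unfamiliar with the statistic, Cohen's kappa corrects the raw percentage of agreement between two raters for the agreement expected by chance, based on each rater's marginal category frequencies. A minimal sketch follows; the category labels in the example are illustrative, not the study's data:

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa for two raters assigning nominal category labels
    to the same items: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater1)
    # Observed agreement: proportion of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: sum over labels of the product of each
    # rater's marginal proportions for that label.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[label] * c2.get(label, 0) for label in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Note that raw agreement overstates reliability: two raters who agree on 3 of 4 reports (75%) can have a kappa of only .60 once chance agreement is removed.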
Problems were categorized by reading each problem statement and
identifying keywords (Table 3-10) in learner comments that indicated potential
problem categories. For example, a trouble report that contained the phrase "It is
difficult for me to access quiet time to concentrate on my studies" was coded as a
situational/environmental problem. As another example, the keyword link was used
to categorize a problem as both a technical and an instructional design problem. A
comment that stated, "The first problem for me occurred when another History link
showed up on my account" was categorized as both a technical problem and an
instructional design problem because the link was in the incorrect place and the
instructor did not review the class material to ensure proper links were displayed.

The list of keywords was expanded during the study to include as many new words
as possible without making the list unmanageable.
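The keyword pass described above amounts to a set intersection between the words of a report and each category's keyword list. A sketch under stated assumptions: the keyword sets are abridged from Table 3-10 and the names are illustrative. As in the example above, a report mentioning "link" surfaces both the technical and the instructional design categories; the function only proposes candidates, and the researcher makes the final coding decision:

```python
# Abridged, illustrative keyword sets drawn from Table 3-10.
KEYWORDS = {
    "PT-TEC": {"error", "link", "software", "froze", "password"},
    "PT-ID": {"syllabus", "link", "textbook", "quiz", "navigate"},
    "PT-SOC": {"posts", "collaborate", "team", "flame"},
}

def candidate_categories(report_text: str) -> list[str]:
    """Return the problem categories whose keywords appear in a
    trouble report. Suggests candidates only; final coding is a
    human judgment, as described in the text."""
    words = set(report_text.lower().split())
    return sorted(code for code, kws in KEYWORDS.items() if words & kws)
```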
In addition to the keyword list, I used a variation of the constant-
comparative method (Krathwohl, 1993) to code each trouble report. As I read a
trouble report I assigned a code (or codes), then compared the
trouble report with previously coded trouble reports to verify that similar reports
used similar codes. If, for example, I added a new keyword I examined all
previously coded trouble reports for the new (or similar) keyword(s). During the
process of coding trouble reports I found evidence suggesting that I should add an
instructor category. The new instructor category was added and I examined all
previously coded trouble reports. Eleven of the 15 reports previously categorized as
social problems were updated and recategorized as instructor problems.

Table 3-10 Keywords for identifying problem categories
Keywords in this table were used as a guide to help identify problem categories. The list of keywords is not exhaustive. Reading a learner comment to find keywords, words similar to keywords, or phrases that indicated an appropriate category was the first step; in the end the researcher must decide in which category or categories to place each problem.
Problem Type Keywords
Situational/Environmental family, light, noise, organization, quiet, situation, space, time, work, workload, vacation
Institutional access, on-campus, password, registration, required
Dispositional behind, easy, hard, I cant, knowledge, skill, time, workload
Technical attachments, cable, computer, connect, error, error message, failure, froze, hardware, hung up, Internet, link, navigate, opening, PC, power, program, reconnect, save, screen, software, submit, technical, telephone, threads, URL, website, world wide web
Instructional Design assignments, attachments, book, class, content, directions, error, error message, exam, homework, interface, Internet, learning, link, log in, navigate, pace, password, post, questions, quiz, reading, research, screen, submit, syllabus, terminology, terms, test, textbook, threads, website, www, world wide web
Social class, collaborate, communication, contact, family, flame, help, name calling, posts, posting, response, responsiveness, share, social, team
Instructor assistance, availability, communication, contact, help, input, instructor, involvement, response, responsiveness, teacher
Limitations of the Study
Multiple data sources were used for this study, including class posts, trouble
reports, and surveys. Using multiple data sources, I was able to verify aspects of
problem reports and problem resolution by crosschecking learner trouble reports
with patterns of learner posts. For example, one learner reported being unable to
access the class web site for a weekend. A review of the learner's posts for the
weekend in question revealed that the learner had indeed made no posts during the
weekend. Another learner reported that he was out of town for a week. A review of
that learner's posts revealed a gap of one week in the learner's posts.

Another limitation (and strength) of the study lies in its qualitative aspects.
Learners were asked to report problems they encountered by answering several
open-ended questions in a weekly problem report. The words learners used to report
problems were often powerful, particularly when learners reported feelings of
frustration and dissatisfaction. For example, one learner reported, "This problem has
really frustrated me and is discouraging [me] from recommending an online class to
a friend." The same learner later said, "My worst fears are coming true!" Learner
problem reports contained self-report data, and a weakness of self-report data is that
respondents may not be willing to reveal their honest feelings and opinions. With
regard to this study, it is possible that events not related to the online class
environment may have influenced learner responses. However, using multiple data
sources I was able to confirm problem types in weekly trouble reports. For example,
critical problems such as lack of feedback were corroborated with other learner reports.
The conceptual framework developed to identify the kinds of problems
online learners encountered strengthens this study on one hand and can be a
limitation of the study on the other hand. The framework was developed and tested
using participants on a University of Colorado education list server. The initial
typology of the kinds of problems learners may encounter in an online learning
environment was used to categorize all problems study participants encountered
with only one minor change to a category description prior to data collection. The

strength of the conceptual framework was that it provided a lens through which I
viewed problems in the online classes. However, one side effect of a lens is that it is
only one way to look at the world or a problem and by using this particular lens I
may have missed some problems or opportunities to code problems differently.
Another research design aspect turned out to be both a strength and
weakness of the study. Using one instructor who was familiar with the classes and
the technology (he taught both topics multiple times at the same institutions)
strengthened the study by minimizing problems such as those associated with first-
time instructors. Unfortunately, learners in all three classes encountered
communication problems with the instructor that may have masked other problems.
Problems included lack of (or greatly delayed) responses, inappropriate feedback
(as perceived by the learner), and lack of availability. Learners were not alone in
these experiences. For example, I had a face-to-face meeting scheduled with the
instructor; after I traveled to the instructor's office and waited for an hour, the
instructor failed to show up. I received a response from the instructor concerning the
missed meeting three days after sending him an email message to set up another
meeting; we eventually discussed my questions using email.
Another limitation of the study lies in the sample size; there was a problem
obtaining willing participants. While a sample size of 32 is not necessarily small,
more learners opted out of the study (n=57) than those who opted into the study
(n=32). Research design issues such as a requirement to email weekly trouble

reports to me may have led to a sampling of participants with very different traits
from those of non-participants. Also, some learners may have reported only those
incidents they thought were of interest to me as a researcher; they may not have
reported what they considered inconveniences or small problems.
Another research design issue was the failure to obtain responses from
learners who dropped out of the class. Input from dropouts may have provided
insight into a unique set of problems that led to learner decisions to drop out of a
given online class. However, most of the learners who dropped out of the online
classes did so before the census date; three learners dropped the classes after the
census date. Given that these learners did not consent to participate in the study, it
was difficult to obtain data. Additionally, no data from learner-to-learner
communications (such as email conversations about problems) were available. This
behind-the-scenes type of problem solving would have been useful in my research;
however, I have no data to suggest that it did or did not take place.
While the problem typology developed as part of this study can be used in
future research, the design of this study makes it difficult to apply generalizations
about the relative importance of problem types. Different groups of learners may
encounter different mixes of problems and other instructors will have different skills
and abilities and they may use different delivery methods. For example, learners
who receive timely, meaningful, and relevant feedback from an instructor may
report that problem categories other than the instructor category have a stronger

effect on their experiences and beliefs about online learning than was reported by
the learners in this study.
In the end, I can only report on the relative importance of problems reported
by the learners who participated in this study; I cannot report generally on the
relative importance of one category of problems over another for all online learning
classes. For example, it is possible that instructor problems (in this case
communication with the instructor) overshadowed other problem types because of
the instructors behavior. Furthermore, instructors vary in their abilities and delivery
approaches. The behaviors exhibited by the instructor of the classes studied in this
research are most likely not representative of behaviors exhibited by all online instructors.
This chapter described the research method for the study, an embedded case
study with the online class as the overarching case and the problems online learners
encounter as the units of analysis. Although the common way to refer to a
researcher observing a class is participant-observer, I was not physically present.
Instead, I was lurking in the virtual environment, receiving email from learners and
reading posts, acting as a known-observer. A review of the findings of the pilot of
the typology of problems learners may encounter in an online learning environment
resulted in a minor change to one category: the situational/environmental
description was changed to include distractions. The primary data source was the

weekly trouble reports submitted by learners; additional sources (such as learner
posts in discussion areas) were used to improve the reliability of the study.
A primary limitation of this study is its reliance on self-report data. The
reliability of the study is dependent upon the accuracy and honesty of learner
reports. Unfortunately, the dataset collected during this research is not rich enough
to allow me to verify all learner trouble reports, particularly when the trouble reports
express feelings of learner frustration. Additionally, that which I believed to be a
strength at the beginning of the study (one instructor, familiar with the course
material and online learning, teaching three online classes using the same learning
management system), became a liability when problems developed with learner-to-
instructor and instructor-to-learner communication. In the end, all of the problems
reported by the online learners fit within the typology of problems (with the changes
mentioned above).

The purpose of this research was to (1) develop a typology of the kinds of
problems adult learners encounter in an online learning environment; (2) gain
insight into how learners respond to problems encountered in an online learning
environment; and (3) explore connections between solving (or not solving)
problems and learner satisfaction and participation. This chapter presents findings
for each research question. Specific research questions are:
1. What kinds of problems do adult learners encounter in an online learning
environment?
2. How do learners respond to problems they encounter?
3. How do learners' solution patterns relate to their ongoing satisfaction or
frustration with the learning experience?
4. How do learners' solution patterns and ongoing perceptions and
experiences relate to their decisions to participate and continue in the class?

Research Question No. 1: What kinds of problems do adult learners encounter in an
online learning environment?
A framework was created to categorize the kinds of problems learners
encounter in an online learning environment. During the course of the study I
determined (with feedback from my advisor and one committee member) that it was
appropriate to make three changes to the problem typology. First, the definition of
situational/environmental problems was expanded after the pilot of the typology to
include distractions and interruptions such as those caused by taking holiday time
off from school. Second, a new category of instructor was added to better categorize
a group of social problems that focused on instructor issues. Third, the dispositional
category was removed after discovery that dispositional problems reported by
learners in the study were not stand-alone problems; that is, dispositional problems
were reported as components of other problem categories.
Table 4-1 shows the number of problems coded in each category. Using the
new problem typology (Table 4-2) I was able to identify each kind of problem
learners encountered during the online classes. Analysis of trouble reports often
resulted in dual coding; therefore, 93 trouble reports were categorized as 109
problem types in Table 4-1 and Table 4-2. Of the 93 trouble reports, 47 (51%) of the
trouble reports included the importance the learner placed on solving the problem
(Table 4-3). Two of the 47 trouble reports (4%) that included an importance level
were labeled unimportant, 13 of the 47 reports (28%) were labeled

somewhat important, 26 of the 47 trouble reports (55%) were labeled important, and
6 of the 47 trouble reports (13%) were labeled very important. The majority of the
problem reports that included the severity of the problem were labeled important to
very important (32 of 47 or 68%).
Table 4-1 Problems by category
Problem Category Total problems reported = 109 Reported by 24 Females (75% of the participants) Reported by 8 Males (25% of the participants)
Count %of total Count %of category Count %of category
Technical 44 40% 27 61% 17 39%
Instructional Design 22 20% 14 64% 8 36%
Situational/Environmental 23 21% 12 52% 11 48%
Instructor 13 12% 7 54% 6 46%
Social 5 5% 3 60% 2 40%
Institutional 2 2% 1 50% 1 50%

Table 4-2 New problem types, including number coded
Category Total % Description Examples from learner trouble reports
Technical 44 40% Difficulties with technology such as computer components, including network, hardware, and software. - Learner had problems opening threads in discussion areas. - Learner completed two exams but was given credit for three. - Learner's password was not recognized. - Links had been marked as read, yet the learner had no recollection of reading the threads. - School server was down for the weekend.
Instructional Design 22 20% The design and delivery of the class. - Learner had difficulty learning how to navigate the class pages. - Learner had a short version of the text so the chapters did not match the syllabus. - Syllabus contained the term "wiggle room"; multiple learners reported confusion about the meaning. - Learner reported problems with their Internet connection.
Situational/Environmental 23 21% The learner's life situation. - Learner reported taking a holiday from class and had to work hard to catch up. - Learner reported that as a foreigner he had to work harder than English-speaking learners. - Learner reported that the class load required a lot of extra work. - Learner reported that as a mother and a wife it is difficult for her to find quiet time to study.
(Table 4-2 continues)

Table 4-2 continued
Category Total Description Examples from learner trouble reports
Instructor 13 The consequences of instructor-learner interaction. Specifically, problems (as perceived by the learner) caused by the instructor. - Learner reported a delay from Friday to Tuesday concerning an exam grade. - Learner expressed that there was not enough contact with the instructor. - Learner stated the instructor's response to a question was inappropriate; the instructor said to check the course content, as reading carefully is the sign of a true scholar. - Learner reported a two-week delay in hearing about an exam.
Social 5 The social consequences of learner and instructor interaction (includes instructor-learner and learner-learner interaction). - Learner reported that other learners were not posting in correct locations, causing problems finding questions. - Learner reported disappointment with the pace of other learners, who were slow taking tests. - Learner stated the instructor could have been more involved with weekly posts.
Institutional 2 Practices or procedures imposed by policies and institutions that discourage learners. - Multiple learners reported the class website was down for the weekend, causing problems with learner weekend class activity. - Learner reported having to take a proctored exam on-campus.

Table 4-3 Problem importance
Of 93 trouble reports, 47 (or 51%) contained an importance level assigned by the learner.
Count Percent
Trouble reports that included an importance level 47
Problems rated unimportant 2 4%
Problems rated somewhat important 13 28%
Problems rated important 26 55%
Problems rated very important 6 13%
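The percentages in Table 4-3 follow directly from the counts; a quick check of the rounding, with the counts taken from the table:

```python
# Importance levels assigned by learners in 47 of 93 trouble reports.
counts = {"unimportant": 2, "somewhat important": 13,
          "important": 26, "very important": 6}
total = sum(counts.values())  # 47 reports included an importance level
percents = {label: round(100 * n / total) for label, n in counts.items()}
# "Important" and "very important" combined: 26 + 6 = 32 of 47, or 68%.
combined = round(100 * (counts["important"] + counts["very important"]) / total)
```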
Reports submitted by females were coded most often in each category
(Table 4-1) except institutional problems (only two institutional problems were
coded, one by a male and one by a female); however, more females than males
participated in this study. Overall, the number of problems coded as reported by
females and males closely reflected the number of females and males who
participated in the study in three of the six problem categories: Technical,
Instructional Design, and Social. For example, 24 females (75%) who participated
in the study reported 27 technical problems (61% of all technical problems) while 8
males (25%) who participated in the study reported 17 technical problems (39% of
all technical problems). In the end, the data do not suggest a gender difference in
who generally reports more problems in an online learning environment. It is
possible that differences would emerge with a larger sample size; for example, one
gender might report more technical problems than the other.
Overall, learner trouble reports were coded as 44 technical problems, the
largest category of problems reported (40% of the problem types coded). Examples

of technical problems include: inability to access a class, dropping a learner's
connection as the result of a power outage or low-quality telephone lines, the
appearance of another class on the learner's login page, and learner PC problems
such as a frozen screen.
Twenty-two of the learner trouble reports were coded as instructional design
problems (20% of the problem types coded). Examples of instructional design
problems include: unclear exam directions, a problem with the textbook (there was a
long and a short version of the text), problems with the syllabus, problems
navigating the online class web site, and limited feedback on test performance and
individual progress.
Twenty-three of the learner trouble reports were coded as
situational/environmental problems (21% of the problem types coded). Examples of
situational/environmental problems include: class load, language issues (for an ESL
learner), insufficient time to participate in the class (e.g., busy due to work or
entertaining visiting relatives), falling behind, family pressures, lack of sufficient
resources (including, in one case, lack of a PC), and physical discomfort. As an
interesting side note, one learner stated on a trouble report that she did not have
access to a PC at home, yet all learners (100%) reported having home PC access in
the demographic survey. Without follow-up data, I could not determine (a) if the
learner answered the demographic survey incorrectly or (b) if the learner's
situation changed during the course (e.g., her PC may have broken and she no
longer had access to a PC at home).
Originally, 14 learner trouble reports were coded as social problems. After
reviewing learner feedback it became apparent that problems with the instructor
comprised most of the trouble reports categorized as social. The new instructor
category comes from two sources: first, the observation that many of the social
problems recorded as part of this research were related to the instructor; and
second, research into instructor presence, which supports the importance of social
presence and the instructor in an online learning environment (Anderson, Rourke,
Garrison, & Archer, 2001; Brady & Bedient, 2003; Tammelin, 2003; Thomam,
2003; Wegerif, 1998). Therefore, the social category was divided into two
categories: social and instructor. Of the 14 trouble reports coded as social
problems, 13 (12% of all reported problems) were re-categorized as instructor
problems and 5 as social problems (5% of all reported problems); four of the
trouble reports originally coded as social problems were categorized as both social
and instructor problems.
Thirteen learner trouble reports were coded as instructor problems (12% of
the problem types coded). While not the largest number of problem types reported,
much of the emotive language used by learners in trouble reports was found in
trouble reports categorized as instructor problems. For example, one learner
reported that she believed the instructor was mad at her after she had not received
timely feedback from the instructor. Another learner reported that he believed the

instructor did not devote enough time to the class, taking away from the online
learning experience.
Five learner trouble reports were coded as social problems (5% of the
problem types coded). Reports of social problems typically involved the behavior of
learners other than the learner reporting the problem. For example, one learner
reported that other learners were posting questions and answers in the wrong
order; she stated that learners should first post an answer to a question and then
post a weekly question. The learner reported that the problem was important;
however, she also stated that it was the instructor's responsibility to address the
other learners.
Two learner trouble reports were coded as institutional barriers, including 1
problem report submitted for a class that was not part of this study (2% of the
problem types coded). One institutional barrier reported by learners taking the
classes used for this study was the decision by one institution to bring down the
school web site for a weekend, thus barring learners from accessing their classes.
The second institutional barrier involved a problem report turned in for a class
outside this research. In this case, the learner reported having to take a proctored
exam for an online class at the school. The learner reported difficulty accessing the
testing center at a time the learner was available, as well as a problem when the
testing center was not set up to administer the exam. I decided to leave this report
in the findings as another example of an institutional barrier to online learning.

Research Question No. 1 Summary
Problems identified by online learners and coded for this study fit into
the modified problem typology in Table 4-1. Technical problems comprised the
largest category (44 or 40%), followed by situational/environmental problems (23 or
21%), instructional design problems (22 or 20%), instructor problems (13 or 12%),
social problems (5 or 5%), and institutional problems (2 or 2%). The typology of
problems is not rigid; therefore, it is possible to sub-divide one of the existing
problem categories or to add a new category should the need arise. As a heuristic,
instructional designers, instructors, and trainers can use the modified problem
typology to improve the design and delivery of online instruction.
Research Question No. 2: How do learners respond to problems they encounter?
Learner responses to problems varied; however, they typically followed a
process of reaction to the problem (an affective reaction) and developing a strategy
to solve the problem, coupled with actions taken to solve it (an effective response).
Of the 93 trouble reports, 66 (or 71%) contained indicators of confusion
and/or frustration (Table 4-4). Ten trouble reports contained indicators of confusion
and frustration, 16 trouble reports contained indicators of confusion only, and 40
trouble reports contained indicators of frustration only. I categorized (in Table 4-4)
13 instances of confusion and 27 instances of frustration under technical problems;
1 instance of confusion and 2 instances of frustration under institutional problems; 2

instances of confusion and 13 instances of frustration under instructor problems; 1
instance of confusion and 3 instances of frustration under social problems; 13
instances of confusion and 11 instances of frustration under instructional design
problems; and 6 instances of frustration under situational/environmental problems.
Learner problem solving strategies (47 of 99 reports or 48%) fell into four basic
categories: 1) try it again later (used 17 times), 2) ignore the problem (used 11
times), 3) ask for assistance (used 14 times) and 4) keep on task (used 21 times)
(Table 4-5).

Table 4-4 Reports with confusion/frustration indicators
Of 93 trouble reports 66 or 71% contained indicators of confusion and/or frustration.
Count Percent
Total trouble reports that had confusion/frustration indicators (sum of the three following categories) 66
Trouble reports with only frustration indicators 40 61%
Trouble reports with only confusion indicators 16 24%
Trouble reports with both confusion and frustration indicators 10 15%
Table 4-5 Problem solving strategies
Problem solving strategies used by learners fell into four categories: 1. Try it again later (Later), 2. Ignore the problem (Ignore), 3. Ask for assistance (Ask), and 4. Keep on task (Task).
Strategy | Technical | Institutional | Instructor | Social | Instructional Design | Situational/Environmental
Later | 7 (25%) | 2 (100%) | 0 | 0 | 2 (25%) | 6 (50%)
Ignore | 2 (7%) | 0 | 2 (22%) | 2 (50%) | 2 (25%) | 3 (25%)
Ask | 7 (25%) | 0 | 5 (55%) | 0 | 2 (25%) | 0
Task | 12 (43%) | 0 | 2 (22%) | 2 (50%) | 2 (25%) | 3 (25%)
Total | 28 | 2 | 9 | 4 | 8 | 12

The existence of the emotional states of confusion and frustration was
inferred from learner comments and the problem context. As trouble reports were
coded, I developed and updated a list of keywords and phrases that indicated
confusion and frustration (Table 4-6). Using a constant comparative method, I
continually reviewed trouble reports for keywords added to the keyword list.
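The keyword-matching step of this coding process can be sketched in code. The following is a minimal illustration only; the keyword lists shown are hypothetical placeholders, not the actual indicators from Table 4-6, and the real coding relied on human judgment of context rather than automated matching.

```python
# Illustrative sketch of keyword-based tagging of trouble reports.
# The keyword sets below are hypothetical examples, not the study's
# actual indicator lists (see Table 4-6).

CONFUSION_KEYWORDS = {"confused", "confusing", "unclear", "not sure", "lost"}
FRUSTRATION_KEYWORDS = {"frustrated", "frustrating", "annoyed", "upset"}

def tag_report(text):
    """Return the set of emotional-state indicators found in a trouble report."""
    lowered = text.lower()
    tags = set()
    if any(keyword in lowered for keyword in CONFUSION_KEYWORDS):
        tags.add("confusion")
    if any(keyword in lowered for keyword in FRUSTRATION_KEYWORDS):
        tags.add("frustration")
    return tags

report = "I was confused by the two textbook versions, and the delays were frustrating."
print(sorted(tag_report(report)))  # ['confusion', 'frustration']
```

A report matching no keywords yields an empty set, which mirrors the finding that a state of confusion or frustration could not be detected in every trouble report.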
Learners expressed frustration and confusion when attempting to solve
problems they encountered. Learners typically expressed frustration when they did
not receive timely and helpful responses from the instructor and when they
encountered problems that prevented the learners from participating in the online
class at the time they wished to participate.
Learner expressions of confusion included comments on topics such as
trouble finding learner posts, confusion over the instructor's use of the term "wiggle
room," the use of two versions of the same textbook (learners expressed confusion
over the content of the chapters and chapter post areas), the number of chapters to
be read, and the number of tests to be taken. Learner expressions of frustration
included comments on topics such as lack of instructor commitment, the
instructor's responses (and lack of responses), software and hardware problems,
time requirements, and problems with the course materials.
Overall I recorded 62 instances of frustration and 30 instances of confusion
in 93 learner trouble reports (Table 4-7); I was not able to detect a state of confusion
and/or frustration in each trouble report. For example, one learner reported that they

logged into WebCT, tried to get into the discussion area, and were "dropped"
by the system. While I might have found it frustrating had it happened to
me, there was nothing in the learner's trouble report to indicate a state of frustration;
therefore, that trouble report was not counted in the frustration total.
Additional information on confusion and frustration is reported in the answer to
research question number three.
Of the six problem categories in the updated problem type matrix (Table
4-2), the "try it again later" strategy was used 17 times (Table 4-5): 7 times to solve
technical problems, 2 times to solve institutional problems, 2 times to solve
instructional design problems, and 6 times to solve situational/environmental
problems. Learners did not use the "try it again later" strategy to solve instructor or
social problems. The strategy was most often used when learners encountered
technical and situational/environmental problems. For example, when one learner
could not access the Internet due to low-quality telephone lines in a rural setting,
she resorted to the "try it again later" strategy by waiting and dialing her
Internet Service Provider (ISP) at a later time; eventually, a connection was
successfully made between her PC and the ISP.