PROOF OF CONCEPT OF PROTOTYPE TOOL
FOR SOCIAL INTERACTION IN ONLINE INSTRUCTIONAL MATERIALS
Patti Oringel Shank
B.A. University of Maryland, 1977
M.A. George Washington University, 1997
A thesis submitted to the
University of Colorado at Denver
in partial fulfillment
of the requirements for the degree of
Doctor of Philosophy
Educational Leadership and Innovation
2003, Patti Oringel Shank
All Rights Reserved
This thesis for the Doctor of Philosophy
Patti Oringel Shank
has been approved
Joanna C. Dunlap
Shank, Patti Oringel (Ph.D., Educational Leadership and Innovation)
Proof of Concept of Prototype Tool for Social Interaction in Online Instructional Materials
Thesis directed by Professor Brent Wilson
This study evaluated a potential new tool for online social interaction, for dynamically
creating epistemic forms embedded in online instructional materials. Example forms were built and
embedded in actual instructional materials in order to assess their value as an interaction
tool and make suggestions for additional development and use in the future. The study was a design
narrative that looked at how the forms were used, the interactions that occurred, and the resulting
artifact (the form with submissions) in a variety of domains and situations. As a preliminary design
experiment or proof-of-concept, the study was purposely limited. The primary intent of this study
was to determine whether the tool showed enough value to be developed and studied in the future.
Nine instructors, thirteen instructional designers and developers, multiple learners, and I
evaluated the forms in three study parts. Part 1 involved evaluating example forms. Part 2 involved
designing a lesson that utilized one or more forms and Part 3 involved implementing that lesson in
actual use. Participants and learners came from a variety of instructional settings including K-12,
higher education, and corporate training.
Study results showed that the tool has value as a support for online social interaction.
The tool supported a variety of instructional activities and could be used for a variety of content
types and strategies. Participants described social interaction benefits and knowledge building in
some of the study tests. Depending on how the tool was used, people could respond to each other and
were impacted by what others had posted. The resulting artifact could serve as additional content.
Using the tool was intuitive and the visual artifact appeared to be beneficial in a number of
respects. The primary usability issue was the ability to print the resulting artifact. Most of the
instructors and instructional designer/developers who designed and implemented a lesson continued
to use the tool even after the study concluded. Additional features are desired and should be
considered and tested in the next version. There is a need for additional research on this tool and
other types of tools for online social interaction.
This abstract accurately represents the content of the candidate's thesis. I recommend its publication.
Getting a Ph.D. is somewhat like running in a very long race. There are highs and lows and
many times when you feel like you simply cannot take another step. But you do, and somehow it
gets done. Besides intense perseverance and passion, it helps to be slightly crazy. More than this, it
helps to have people who believe in you and who cheer you on, especially when the going gets truly
icky (I told you I would get this word in here, Joni), and gladly listen to you when you have major
aha moments to share.
I have had a terrific cheering section and want to thank them for being there for me. First I
want to thank my committee for putting up with me, which is no small feat, and helping me all along
the way. I thank Joanna Dunlap for getting me into this mess and for providing critical insights and
invaluable guidance. Brent Wilson, my committee chair, believed in me from the beginning, which
helped me persevere, and gave me leeway to follow my passions, which kept me motivated. Alan
Davis had a knack for pointing out important implications and for helping me focus. Carol Kamin's
work was an inspiration for my own, and she always made me feel like I had something important to
say. Dr. Thiagarajan's (Thiagi) input and guidance made my study far more insightful than it would
have been otherwise, and his humor kept me laughing through the hard times. These were time-
consuming and incredible gifts. Thanks to all for helping me make it to the finish line and molding
me into someone who could accomplish this goal.
There are some other people who had a profound influence on this process. Eric Replinger,
an instructional designer and developer with whom I regularly work, helped me with the technical
implementation of this study and the interrater agreement study. Without his assistance, many of the
things that the instructors and instructional designers/developers wanted to do could not have been
tested because I did not have enough programming and database skills to implement them. With his
help, this study was far more robust than it might have been otherwise. He was also the
source of much encouragement when I felt lost. Thank you Eric for being a good friend and making
me look good. You are a gem.
I want to point out the influence of one of my professors, Mark Clarke, because his courses
and conversation substantially changed how I view the world. I am still pretty narrowly purposive
but have come a long way toward being more system oriented. My growing ability to think in
systems terms literally made it possible for me to get this work done. It has also helped me with
family and work relationships, and this may be the most important contribution of my doctoral work.
Thanks so much for making this difference in my life, Mark.
I need to thank my family for supporting me and for putting up with my being preoccupied
too much in the last few years. This has been hard on them and I want them to know how much I
appreciate their support and dinners out at Subway. My husband Greg, an electrical engineer,
listened to me talk about epistemic games and other weird learning topics and even asked questions.
He cheerfully supported me financially and emotionally while I finished writing my dissertation. To
answer one of my son Andy's persistent questions: No, Andy, I will not be doing another Ph.D.
after this one. Another Oringel girl becomes a doctor. Very cool.
The following people were influential in my doctoral studies and merit many thanks. A few
regularly listened to me gripe and comforted me, some were fellow doctoral students or faculty, and
several are clients, colleagues, or former students. All guided or influenced me in any number of
ways. I thank you more than I can say. I look forward to paying it back and forward.
1. INTRODUCTION TO THE STUDY.......................................................... 1
Impetus for Online Learning.........................................................1
Tools for Learner Interaction.......................................................2
Problem Space: Need for Targeted Knowledge Building Tools..........................3
Purpose of the Study...................................................................4
Structure of the Dissertation..........................................................6
Limitations of the Study...............................................................9
2. LITERATURE REVIEW................................................................. 11
Interaction Definitions and Types.....................................................11
People Interaction............................................................... 13
One More: Learner-Technology Interaction...........................................15
Closed vs. Open Systems Interactions..................................................16
Affordances of Interaction for Learning...............................................18
Conceptual Frameworks for Instructional Interaction...................................21
Social Constructivism / Situated Learning..........................................23
Social Interaction and Explicit and Tacit Knowledge................................24
Design of Open Systems Learning Environments......................................27
Asynchronous People Interactions in Online Instruction...............................29
Tools for People Interactions.....................................................31
Tools for Knowledge Building......................................................33
Using Knowledge Building Games in Instruction.....................................36
Collaborative Learning Tools Research.............................................44
Assurances and Confidentiality....................................................51
Limitations of the Design............................................................54
Plan for Reporting Findings..........................................................55
Description of Participants..........................................................60
Description of Study Activities......................................................63
Instructional Activities Coding Category..............................................79
Interactions Coding Category..........................................................93
Tool/Artifact Coding Category........................................................100
Visual and Physical Attributes......................................................100
Summary of Findings....................................................................108
Question 1. What types of content, strategies, objectives, and outcomes does
the tool best support? Does the tool support knowledge building?........112
Question 2. What is the nature of the interactions that occur when using the tool?
In what ways are learners interacting with each other? In what ways are learners
interacting with the content?...........................................117
Question 3. What is the nature of the artifact that is built? How does the tool compare
to computer conferencing tools? What usability issues arise and what additional
features are needed?....................................................119
Limitations of the Study..................................................122
Social Interaction Tools Research.......................................124
Epistemic Form Tools Research...........................................126
A. PART 1 INTERVIEW QUESTIONS............................................133
B. PARTS 1-3 STUDY PARTICIPATION REQUEST FOR INSTRUCTORS AND
INSTRUCTIONAL DESIGNER/DEVELOPERS
C. PART 2 EXAMPLE TEMPLATES FOR LIST AND TABLE FORMS.....................137
D. PART 2 INTERVIEW QUESTIONS............................................139
E. PART 3 STUDY PARTICIPATION REQUEST, LEARNERS..........................141
F. PART 3 INTERVIEW QUESTIONS, INSTRUCTORS AND INSTRUCTIONAL
DESIGNER/DEVELOPERS
G. PART 3 INTERVIEW QUESTIONS, LEARNERS....................145
H. INFORMED CONSENT FORM, INSTRUCTORS AND INSTRUCTIONAL
DESIGNER/DEVELOPERS
I. INFORMED CONSENT FORM, LEARNERS.........................152
J. HUMAN SUBJECTS COMMITTEE APPROVAL.......................156
K. EXAMPLE, ACTIVITY NOTES.................................158
L. EXAMPLE, ARTIFACT NOTES.................................166
M. INTERRATER AGREEMENT DIRECTIONS/EXAMPLES................179
N. INTERRATER AGREEMENT RESULTS............................182
Figure 1.1. Problem space: need for targeted knowledge building tools.......................4
Figure 1.2. Haddon's Matrix embedded in injury prevention course materials...................5
Figure 3.1. Ethics definition list example..................................................47
Figure 3.2. Page design rubrics table example...............................................48
Figure 3.3. Haddon's Matrix example.........................................................49
Figure 4.1. Haddon's Matrix form built by students in a graduate level injury
prevention course............................................................... 67
Figure 4.2. Ancient landscapes form completed by students in an online high school
social studies course............................................................69
Figure 4.3. Portion of comparison form built by students in an online high school
Earth science course.............................................................70
Figure 4.4. Activity 1: Portion of first interactive fiction form...........................72
Figure 4.5. Activity 2: Portion of second interactive fiction story form....................73
Figure 4.6. Activity 2: Portion of second interactive fiction comment form..................73
Figure 4.7. Activity 4: Portion of team development Case 1 form.............................74
Figure 4.8. Activity 4: Portion of team development cases comment form......................74
Figure 4.9. Portion of comments table from case building activity..........................86
Figure 4.10. Portion of comments table from case building activity..........................96
Figure 4.11. Portion of comments table from interactive fiction writing activity............97
Figure 4.12. Portion of table from high school Earth science activity......................102
Figure 5.1. Portion of test team development cases form....................................121
Table 1.1 Study Parts........................................................................6
Table 2.1 Insights for Designing Open Systems Learning Environments..........................28
Table 2.2 Example Epistemic Forms Based on Sugrue's Model of Content
and Objectives Types...............................................................37
Table 3.1 Overview of Study Research Questions, Parts, Data Sources, and
Table 3.2 Research Questions and Initial Coding Themes........................................52
Table 4.1 Recap of Study Research Questions, Data Sources, and Data Analysis
Table 4.2 Overview of Study Participants......................................................62
Table 4.3 Overview of Study Activities........................................................63
Table 4.4 Documents and Contributors..........................................................75
Table 4.5 Code Definitions....................................................................76
Table 4.6 Text Passages Coded in Coding Categories, Codes, and Subcodes......................109
Table 5.1 Uses of Epistemic Forms in the Study, Based on Sugrue's Model of Content
and Objectives Types..............................................................113
INTRODUCTION TO THE STUDY
The ability of learners to interact with other learners in instructional environments conveys
numerous instructional benefits and is considered by many to be critical to learning (Anderson, 2002;
Berge, 1999; Brown & Duguid, 1989; Kearsley, 1995; Kearsley & Shneiderman, 1999; McDonald &
Gibson, 1998; M. G. Moore, 1991, 1993; Nunn, 1996; Scardamalia & Bereiter, 1994). Specific
studies have, in fact, shown that this type of interaction can positively influence learning, motivation,
and problem solving (Nunn, 1996), and help learners gain needed support and overcome frustration
(O'Reilly & Newton, 2002).
Learners who use online instructional materials are often separated by time or place. The
types of learner interactions that would normally occur in a face-to-face setting (discussion, sharing,
peer review, group activities, etc.) commonly occur via online technologies and tools. These tools,
and their inherent utility and usability, place limits on what kinds of online interactions are possible
and likely to happen. If learner interactions convey instructional benefits, careful attention should be
paid to ensuring that constructive learner interactions can occur.
Impetus for Online Learning
Economic, political, and sociocultural changes have brought about increased needs for
education and training and require new technologies to support them (J. S. Brown, 2000; Laurillard,
2002; Luker, 2000; Scardamalia & Bereiter, 1994). The push by organizations to develop and
implement online courses has been fueled by a desire to reduce costs, increase enrollments, become
early adopters, and make money (B. L. Brown, 2000; McMahon, 1997; Shea & Boser, 2001).
Additionally, adults increasingly desire flexibility and options for gaining essential
knowledge and skills (B. L. Brown, 2000). Online courses, both for higher education and for
training, often meet a variety of organizational and learner needs. Distance methods of education
have been used for teaching and learning for more than 100 years (M. G. Moore & Kearsley, 1996)
but the use of Internet and network technologies for teaching and learning is a much more recent
development.
Tools for Learner Interaction
Although Internet technologies can promote real-life types of interactions, they can also
thwart them. Learners cannot effectively interact unless they are able to easily use the media they are
tasked with employing for these purposes (Hillman, Willis, & Gunawardena, 1994; Kruper, 2002;
Salmon, 2001). Web or computer conferencing is one of the most widely used tools for
asynchronous discussions and collaborative work in online courses (Burge, 1994; Cartwright, 2000;
Harasim, 1997). Despite the widespread use of computer conferencing, learners and facilitators
describe numerous problems with its use, including the extended time it takes to feel comfortable
(Cartwright, 2000), and information overload from having to dig through large amounts of postings
and the large percentage of posting content that is off topic or irrelevant (Burge, 1994; Shank, 2002;
The Centre For Systems Science, 1994). A related complaint is the bandwidth and time requirements
of opening numerous postings (McMahon, 1997; Shank, 2002). The bottom line is that computer
conferencing tools can be cumbersome for interaction (Lipponen, 2002).
In face-to-face dialog, when people desire to inquire directly, share understanding, or
develop solutions for a problem, they often build lists, tables, and matrices as knowledge building
artifacts (Collins & Ferguson, 1993; Morrison & Collins, 1996). Building these types of knowledge
building artifacts should be possible in online learning but using computer conferencing tools for this
purpose is difficult.
Epistemic game theory describes how targeted knowledge is often generated in the natural
course of informal and formal dialog during social interaction. Epistemic forms, such as lists, tables,
and matrices, often guide inquiry during formal and informal dialog and the resulting artifact
constitutes new knowledge (Collins & Ferguson, 1993; Morrison & Collins, 1996). For example, a
list form is commonly used in daily conversation as people try to come up with alternatives from
which to choose, and the compare-and-contrast and cause-and-effect forms occur commonly in the
workplace as people attempt to solve problems. Although epistemic forms and games are a natural
part of socially oriented knowledge building, it is hard to take part in them using computer
conferencing tools.
Problem Space: Need for Targeted Knowledge Building Tools
Figure 1.1 illustrates the problem space impacted by this study. Instructors generate
instructional activities for learners. These activities involve interactions with people and content.
Learner interactions are mediated, in online instructional activities, by tools that allow learners to
interact. A given tool allows certain activities and interactions to happen well and others not so well
or not at all. It is my assertion (to be investigated in this dissertation study) that a tool that allows
learners to build targeted knowledge building artifacts should be available as part of the interactions
toolset used by online instructors and instructional designer/developers.
[Figure image not recoverable; the diagram depicts the available people interaction tools.]
Figure 1.1. Problem space: need for targeted knowledge building tools.
Purpose of the Study
Knowledge building discourse is an important instructional activity (Scardamalia &
Bereiter, 1994). Although computer conferencing is commonly used for this purpose, the need to
open and wade through various postings makes discourse for the purpose of targeted knowledge
building very difficult. I believe this difficulty presents a strong need for tools to allow learners to
build targeted knowledge dynamically within instructional content. Such a tool would not require
learners to go to a separate computer conferencing area or wade through numerous discussion
postings to extract relevant information.
This study evaluated prototypes of such tools as mechanisms for interaction and learner
knowledge building in online instructional materials. Figure 1.2 shows an example matrix form (with
the names of the learners hidden) that was embedded in online course materials for an injury
prevention course during this study. This form lets learners populate the matrix with potential
interventions that might occur before, during, or after a motor vehicle occupant injury event and to
consider other learners' intervention ideas. This simple tool allowed learners to practice using a tool
that public health professionals regularly use to consider injury prevention strategies and to see what
other learners were thinking. It made their thinking visible.
[Reconstructed from the screen image; the form is a Haddon's Matrix grid with a Submit button:]
             Host (Human)    Vector (Vehicle)                   Physical & Sociocultural Environment
Pre-event    Drunk driver    Big, gas-guzzling SUV passing      Road repair
                             everyone it can
Event                        Airbags
Post-event                   GPS that automatically sends a     Regionalized trauma care;
                             signal from a crashed car          strategically placed bulldozers
[Submit your answers]
Figure 1.2. Haddon's Matrix embedded in injury prevention course materials.
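Although the study's prototypes were implemented as web forms, the underlying idea of an embedded matrix form can be sketched in a few lines of code. The sketch below is purely illustrative (the class name, methods, and example entries are my own, not the study's actual implementation): a matrix epistemic form is a grid of cells, keyed by row and column headers, that accumulates learner submissions so that every contribution remains visible to the group.

```python
# Hypothetical sketch of a matrix epistemic form (not the study's code):
# a grid of cells that accumulates learner submissions per cell.

class MatrixForm:
    """An epistemic form with fixed row/column headers and open cells."""

    def __init__(self, rows, columns):
        self.rows = rows
        self.columns = columns
        # Each cell holds a list of (learner, text) submissions.
        self.cells = {(r, c): [] for r in rows for c in columns}

    def submit(self, row, column, learner, text):
        """Record one learner's contribution to a specific cell."""
        if (row, column) not in self.cells:
            raise KeyError(f"No such cell: {row!r} x {column!r}")
        self.cells[(row, column)].append((learner, text))

    def render_cell(self, row, column):
        """Return a cell's accumulated entries, one per line."""
        return "\n".join(text for _, text in self.cells[(row, column)])


# Example mirroring Figure 1.2: learners add injury interventions.
form = MatrixForm(
    rows=["Pre-event", "Event", "Post-event"],
    columns=["Host (Human)", "Vector (Vehicle)", "Environment"],
)
form.submit("Pre-event", "Host (Human)", "learner1", "Drunk driver")
form.submit("Event", "Vector (Vehicle)", "learner2", "Airbags")
```

A real implementation would add persistence and page rendering, but even this skeleton suggests why such a form keeps discourse targeted: each submission lands in a specific cell of the artifact rather than in an undifferentiated discussion thread.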
The study's primary purpose was to analyze prototypes of these epistemic form tools, and
the tools in actual use, in order to make suggestions for additional development and use. The primary
question answered by this study was: What interaction affordances do these tools provide? The
methods section of the study offers a more detailed set of questions specific to the inquiry. Table 1.1
provides an overview of the parts and goals of the study.
Table 1.1 Study Parts

Part 1. Goal: Evaluate example forms to determine potential uses and usability. Participants: 10+
instructors/instructional designer/developers. Methodology: Interview instructors/instructional
designer/developers; evaluate inputs to examples.

Part 2. Goal: Use example form templates to build lessons to gain insights about potential uses and
prepare for Part 3. Participants: 5+ instructors/instructional designer/developers. Methodology:
Interview instructors/instructional designer/developers; evaluate lessons.

Part 3. Goal: Evaluate forms in actual use in online instructional materials. Participants: 2+
instructors; 8+ learners. Methodology: Online questionnaires for instructors/students; evaluate
actual use.
Structure of the Dissertation
This dissertation is structured as follows:
1. Chapter 1 provides an introduction to the problem and briefly explains the purpose of the
study. Some definitions are provided and a discussion of the limitations of the study is provided.
2. Chapter 2 is a review of the interaction literature. It presents a theoretical framework which
establishes the need for the study.
3. Chapter 3 explains the research questions, and how the study was conducted in order to
answer these questions, including data collection methods and instruments.
4. Chapter 4 explains how the data were analyzed and presents the results of the study.
5. Chapter 5 summarizes the major findings from the study, answers the research questions,
and assesses the implications and limitations of the study.
The following terms are used throughout the literature review and study and are defined
here for the sake of clarity.
Activity. Person(s) pursuing a specific goal in a purposeful way (Peal & Wilson, 2001).
This is based on activity theory concepts that explain the link between individual cognition and
social practices by looking at people, in groups, involved in attaining specific goals (Engestrom,
1987; Wertsch, 1998). According to activity theorists, activity develops over time and provides the
context for individual and collaborative actions that are embedded in it. Actions are made up of
chains of operations, which are conscious or unconscious behaviors (Wertsch, 1998).
Affordances. An object's capabilities and capacities that can be accessed and employed
(James Gibson, cited in Ryder & Wilson, 1996). Internet technologies have affordances that can be
employed for learning because they provide an infrastructure that allows connections to people and
objects that are not in our immediate physical environment (Ryder & Wilson, 1996).
Authentic. Real or realistic. Authentic learning environments are those suffused with real
or realistic complexity in order to encourage learners' higher-level thinking and transfer to real-
world situations (Brooks & Brooks, 1999; Davis, Sumara, & Luce-Kapler, 2000; Dunlap &
Grabinger, 1996; Lave & Wenger, 1991).
Collaboration. Activity where participants work toward a shared goal or goals.
Computer conferencing. Also known as discussion forums, a tool widely used for
asynchronous dialog and collaborative work in online instruction (Burge, 1994; Cartwright, 2000;
Harasim, 1997). Simple computer conferencing allows people to post text and respond to others'
postings.
Computer-supported collaborative learning (CSCL). A field of study that looks at how
collaborative learning can be supported by technology and how collaboration and technology can
augment learning by allowing knowledge and expertise to be shared (Lipponen, 2002).
Dialog. Communicative acts for the purpose of shaping social interaction. Learning is
mediated by social interactions; dialog displays thinking so it can be acted on by others and influence
the direction of further dialog and inquiry (Burbules & Bruce, 2002).
Interaction. Interaction consists of reciprocal events that require at least two objects and
two actions. Interactions occur when these objects and events mutually influence one another
(Wagner, 1997, p. 20). Therefore, interaction can be seen as a feedback loop that influences learner
activity and learning (Yacci, 2000). Learning is change over time through engagement in activity
(M. Clarke, personal communication, March 2003) and instructional interaction provides the primary
means by which learners are prompted to engage in activity for the purpose of learning.
Knowledge building. Knowledge building is the construction and enhancement of ideas
that bring value to a community. While learning is largely unobservable, knowledge building is an
observable phenomenon (Scardamalia & Bereiter, 2002). It is a type of collaborative activity that
builds shared understanding through construction of conceptual artifacts (Lipponen, 2002).
Tools. Cultural artifacts that mediate activity. Tools are used to perform actions in order to
pursue goals (Wertsch, 1998).
Limitations of the Study
As a proof of concept, the primary purpose of this study was to determine whether the
simple prototype tools I developed were useful for very targeted knowledge building in online
instructional materials. Type 1 developmental research of this kind provides researchers with a way
to establish the potential effectiveness of an instructional product or procedure, with an eye toward
additional testing of the product in the future (Richey, 1997). Tessmer (1993) described how
evaluating instructional materials for the purpose of revising them leads to instructional materials
that are more effective, efficient, motivating, usable, and acceptable to students. This is especially
beneficial, asserted Tessmer, whenever new technologies or instructional strategies are being tried.
This study was exploratory in nature and was designed to determine whether the tools
should be developed and studied further. It has the following potential limitations:
1. Participation in the study. I have access to a wide variety of people who teach online and
design online courses. This was not meant to be a random sample but a purposeful sample of people I
know who would willingly provide the depth of feedback needed for a proof of concept study because
of their own interest in tools for online discussion. Participants were asked to suggest others for
participation in order to improve the diversity of feedback. I specifically included participants from a
wide range of instructional contexts and domains.
2. Data collection. Data were collected in person or at a distance by phone or e-mail. I did not
watch people actually use the tools or collect key press data. Instead, I relied on their written or
verbal comments and the artifacts that were created as they used them.
3. Time. Data were collected and analyzed over the course of a few months. It would be
beneficial to continue the research in the future in order to obtain data about longer-term use of the
tools.
4. Prototypes. The actual tools studied were a small subset of this class of potential tools.
Success with this subset would suggest promise for the tools but generalizing to an entire class of
tools would be difficult.
5. Multiple influences. The outcomes from a design experiment necessarily result from
hundreds of factors (The Design-Based Research Collective, 2003), and it will therefore be somewhat
difficult to determine the exact reason for specific outcomes and the expected replicability of these
outcomes in the future. Chapter 4 includes a very detailed account of what happened so that readers
can judge the potential impacts and their applicability to their own settings. Chapter 5 includes
suggestions for additional studies that will be needed to confirm and add to the findings of this study.
The main way that these limitations impact the study is that generalizability may be limited.
A limited set of these tools was used in a limited number of settings, with a limited number of
participants, over a limited period of time. Therefore, the study's results may not be generalizable to
other participants or instructional situations. Readers will need to determine the applicability of
my results for their own situations.
In this literature review, I present a conceptual framework that describes the importance of
learner interaction in instruction and argues that adequate tools are needed if the outcomes this
type of interaction can yield are to occur in online instruction. The review is organized by my
approach to the problem space and begins with interaction definitions and types. Open-systems
interactions are highlighted as having greater instructional significance than others, and a list of
instructional outcomes from these kinds of interaction is presented.
Since I assert throughout this review that people (social) interactions have high potential
instructional value, a conceptual framework that explains why this is so is presented. Because online
people interactions are mediated through tools, the challenges of the most commonly used tool for
this purpose are described. A specific type of people interaction, targeted knowledge building, is
discussed, along with the challenges of accomplishing this kind of interaction in an online
environment. I suggest the need for additional people interaction tools and make a case for building
and evaluating them. The epistemic form tools in this study were evaluated as potential tools for
building targeted knowledge building artifacts, such as lists, tables, and matrices, within online
instructional materials. The study examined a number of prototypes of these kinds of tools to
determine whether they are useful and usable.
Interaction Definitions and Types
Definitions of the word interaction, used in an instructional context, vary widely. One
simple definition of interaction is a sequence of instructional events that reciprocally affect each
other (Anderson, 2002). Logically, these events can be brought about and responded to by people
(e.g., learners, instructors, etc.) or the instructional activities (e.g., reading text, interacting with
computer exercises, etc.).
A similar definition describes learner input affecting the presentation of the lesson
(Sherman, 1999). An underlying assumption throughout these and other definitions is that interaction
is a method of gaining meaningful feedback (Anderson, 2002; Kearsley, 1995; Yacci, 2000).
Synthesizing definitions leads to the conclusion that interaction provides a primary means by which
learners are prompted to engage in activity for the purpose of learning.
A commonly used typology for describing the object of the instructional interaction
describes interactions with content or with people (M. G. Moore, 1989, 1993). Some definitions
more clearly describe interactions with content and others are more people oriented. Many
definitions apply to both.
The definitions that are primarily concerned with content interactions generally include
levels of interactivity. For example, Sherman's (1999) simple definition is extended with three types
or levels of interactivity. The three types include the following:
Navigational (instructionist): low interactivity level, user navigates (e.g., links and menus)
Functional: medium interactivity level, user gains information (e.g., search engine)
Adaptive (constructionist): high interactivity level, user modifies system for own
purposes (e.g., interactive forms and database-driven sites)
Similarly, Sims (1997) described seven levels of interactivity, from object interactivity,
where the user clicks on an object (e.g., button, image, etc.), through hierarchical interactivity, where
the user can select among a predefined set of options (e.g., main menu, navigation bar), through
update interactivity, where the program initiates interaction between the user and the computer-
generated content (e.g., questions for the user to respond to), and beyond.
Learner-content interaction includes interaction with the actual instructional materials or
content of an instructional website (Moller, 1998; M. G. Moore, 1993). In self-paced online courses,
content interaction is especially important because the instructional content has the full burden of
imparting the desired learning objectives. The types of learner-content interactions most commonly
found in an instructional website include text, questions, forms, graphics, and simulations. Complex
authoring tools and programming technologies allow for (a) development of simulations, tutorials,
and dynamic content that reacts to learner input, and (b) complex branching.
One of the most critical outcomes of learner-content interaction is feedback (Wagner,
1997). Feedback has two main functions: verifying the correctness of a learner's response and
elaborating on a response in order to provide additional information or assist the learner in selecting
a different response (K. Clark & Dwyer, 1998).
Definitions that are more concerned with people interaction explicitly call for the learner to
be socially engaged. Kearsley and Shneiderman (1999) asserted that optimal teaching and learning
using technology requires learners to be "meaningfully engaged in learning activities through
interaction with others and worthwhile tasks... [including] active cognitive processes such as
creating, problem solving, reasoning, decision-making, and evaluation" (p. 20). Garrison (1993)
similarly asserted that interaction requires people contact in order for perspectives to be explained,
evaluated, and challenged. Berge (1999) added the importance of receiving feedback and the ability
to adapt activities based upon the outcomes of the interactions. These people-oriented definitions
assert that meaning-making results from communication with people in the pursuit of authentic tasks.
Learner-learner interaction (in groups or one-to-one) allows learners to learn from each
other. In online instructional activities, this most commonly occurs when learners are engaged in
discourse, collaboration, problem solving, or product building (Adelskold, Aleklett, Axelsson, &
Blomgren, 1999; Moller, 1998). These interactions can occur synchronously (same time) or
asynchronously (different times) and are generally mediated by networked communication tools such
as chat, discussion forums, and other collaborative workspaces.
Learner-instructor interaction allows learners to receive feedback, personal encouragement,
and assistance (Anderson, 2002; Gunawardena & Zittle, 1997; McDonald & Gibson, 1998). The
instructor's skill in questioning, responding to, and assisting learners can have a large impact on the
learning experience (Berge, 1995; Harasim, Hiltz, Teles, & Turoff, 1996).
Harasim et al. (1996) described the importance of an instructor in organizing and
sequencing instructional activities, making sure they function as expected, and evaluating learners
and the instructional activities. One of the critical roles that an instructor performs is to be on the
lookout for learners' incomplete understandings or misunderstandings and help them achieve more
mature understandings. If this does not happen, learners are likely to end up with very different
understandings than what is intended by the objectives of the course or lesson (Bransford, Brown,
Cocking, Donovan, & Pellegrino, 2000; Brooks & Brooks, 1999; Brown & Duguid, 1989; Lave &
Wenger, 1991). This may be a shortcoming of completely self-paced instruction and provides a
compelling rationale for people interaction for most online instructional environments.
One More: Learner-Technology Interaction
Some theorists assert the need for a fourth type of interaction, "interaction that occurs
between the learner and the technologies used to deliver instruction" (Hillman et al., 1994, p. 30).
Since users cannot easily interact with content or people unless they are able to interact with the
technologies, instructional activities that help the learner become proficient and comfortable are
needed. Much of the study of how humans interact with technology comes from the field of human-
computer interface studies (Bannon, 1998; Shneiderman, 1998).
Usability tenets provide a unifying theme for evaluating learner-technology interaction.
Usability involves making sure that something works well: "that a person of average (or even below
average) ability and experience can use the thing... for its intended purpose without getting
hopelessly frustrated" (Krug, 2000, p. 5). Even though usability is often thought of as common sense,
it gets little thought unless it is intentionally considered. Norman (1988) saw usable design as a
competitive advantage. He cautioned us to begin with optimal design in mind instead of forcing
people to adapt to less-than-adequately designed products. Usability goes beyond mere surface
features. The tools that are used and the way that content is presented affects perception and learning
(Fleming & Levie, 1978).
Attention to learner-technology interaction is not only good for learners but also important
to the organizations that design and develop online courses (eLearn Magazine, 2002). Higher than
expected dropout rates are considered a problem in online learning and poor usability is one of the
reasons for this phenomenon (Frankola, n.d.). Research clearly shows that improving usability is
highly cost-effective because it shortens development time, and reduces support, training,
documentation, and maintenance costs. The cost-benefit ratio for usability-related activities is
considered by many experts to be 1:10-100 (Donahue, Weinschenk, & Nowicki, 1999).
Interaction has the potential to imbue instruction with desirable outcomes and these will be
described shortly. Prior to this discussion, however, it is important to differentiate between closed
and open systems interactions, as this distinction forms the foundation for understanding how these
desirable outcomes are achieved.
Closed vs. Open Systems Interactions
Ryder and Wilson (1996) called simple content interactions like those typically found in
computer-based training (CBT) examples of closed systems in which "the content is pre-defined,
responses are anticipated, and action is controlled by the designer alone" (Interactivity heading, para.
1). Instead they called for open systems interactivity, which allows for "the mutual, autonomous, and
simultaneous activity of both participants working toward a common goal" (A. Lippman, cited in
Ryder & Wilson, 1996, Interactivity heading, para. 1). Ryder and Wilson listed Lippman's criteria for
satisfying this definition:
interruptability, the ability of either participant to interrupt the other at any point
graceful degradation, the ability to set aside the unanswerable questions in a way that
does not halt the conversation
limited look-ahead, the quality that makes it impossible to predict the ultimate outcome
of a conversation by either party
no default, a quality which allows the shape of a conversation to develop naturally,
organically, without a preplanned path
the impression of an infinite database, the quality of limitless choices and realistic
responses to any possible input (A. Lippman, cited in Ryder & Wilson, Interactivity
heading, para. 1).
Open systems interactivity depends less on artificial lessons and more on the types of
interactions that happen naturally in the course of authentic activity (Ryder & Wilson, 1996). This is
an important point, as networked technologies can allow interaction beyond the mere hardware and
software in our computers. Ryder and Wilson noted, however, that networked technologies only
provide the opportunity for open systems interactivity, but these opportunities must be fully
exploited in order for their potential to be realized. This opportunity presents a call to action for
online learning course designers and developers to think beyond the closed system interactions that
have largely dominated online instructional materials.
Open systems interactions, therefore, are often achieved with people interaction because
people interaction is more likely to allow the level of spontaneity and iteration that A. Lippman
(cited in Ryder & Wilson, 1996, Interactivity heading, para. 1) was describing. The potential benefits
of people interaction do not make content interaction unnecessary, however. Content interactions
commonly provide grist for people interactions and people interactions can become additional
content (e.g., the text of discussion postings). In addition, pragmatics sometimes makes it difficult or
impossible to provide the type of group-paced, instructor-led activities that provide optimal people
interaction. In those cases, it may be possible to simulate people interaction (e.g., ask the expert
exercises) or to provide some means for interactions with others who are using the course materials
or a subject matter expert. It is therefore most fruitful to ask what range of interactions best realizes
the needs of learners (Bates, 1995; Greeno, 1997) and to attempt to provide those interactions, within the constraints
of the instructional situation.
Open systems interactions provide opportunities for exchanges that can alter the content and
context of learning (Ryder & Wilson, 1996). These types of organic and natural exchanges make
certain outcomes of learning possible.
Affordances of Interaction for Learning
The importance of interpersonal interaction for learning has been clearly documented
(Berge, 1995; Brown & Duguid, 1989; Fulford & Zhang, 1993; Gunawardena & Zittle, 1997;
Kanuka & Anderson, 1998; Kearsley, 1995; Kearsley & Shneiderman, 1999; McDonald & Gibson,
1998; M. G. Moore, 1991, 1993; Nunn, 1996; Scardamalia & Bereiter, 1994). Interpersonal
interaction allows the learner to reflect and reconsider, get help and support, and participate in
authentic problem solving (Berge, 1996; Brooks & Brooks, 1999; Brown & Duguid, 1989; Lave &
Wenger, 1991). A recent meta-analysis (Lou, Abrami, & d'Apollonia, 2001) evaluated the effect of
people interaction in technology-based learning and uncovered important benefits for learners
working together. Three of the more pronounced benefits of people interaction for learners include
improved learning strategies, greater perseverance, and reduced need for help from the instructor.
These outcomes are especially important in distance learning because of the inherent difficulties with
learning without the structure and motivational elements of an in-person classroom setting (M. G.
Moore, 1991). In other words, interaction provides critical affordances for learners who are learning
at a distance.
James Gibson (cited in Ryder & Wilson, 1996) coined the term affordance to describe an
object's perceptual properties that can become opportunities for action in the hands of users
(B. Wilson, personal communication, April 2003). Internet technologies have affordances that can be
employed for learning because they provide an infrastructure that allows connections to people and
objects that are not in our immediate physical environment (Harasim et al., 1996; Ryder & Wilson,
1996). What opportunities for learning are made possible by this infrastructure and its potential for
interaction? The capacity for information retrieval and information sharing are two of the most
obvious (Ryder & Wilson, 1996).
The types of interaction described earlier help us think about interaction in terms of who or
what the learner is interacting with, but they only hint at how Internet technologies can be used for
instruction. Wagner (1997) more clearly suggested potential learning affordances by proposing
instructional outcomes achievable through interaction:
1. Interaction to enhance elaboration and retention. Developing examples or explanations to
make information meaningful and aid transfer to other situations.
2. Interaction to support learner control/self-regulation. Managing the learning process,
including depth and range of study, and resources to be used, in order to become self-regulated,
focused, and complete the learning tasks.
3. Interaction to increase motivation. Obtaining information needed to make the learning
experience relevant and reduce frustrations.
4. Interaction for negotiation of understanding. Conveying understanding of and agreement to
the terms of the learning agreement.
5. Interaction for team building. Supporting group goals, including accepting differences,
listening, shared responsibility, and consensus.
6. Interaction for discovery. Allowing cross-fertilization of ideas and perspectives.
7. Interaction for exploration. Defining the scope, depth, and breadth of new ideas.
8. Interaction for clarification of understanding. Gaining information needed to meet course
requirements.
9. Interaction for closure. Determining whether expectations have been met.
10. Interaction to increase participation. Engaging in the process of learning.
11. Interaction to develop communication. Sharing information and opinions in order to
understand different points of view.
12. Interaction to receive feedback. Correcting performance and understanding (pp. 22-25).
These twelve instructional purposes give designers of online instructional materials a
checklist of potential interactions to consider for their instructional situation. Chism (1998) listed
instructional uses for online dialog which closely match Wagner's (1997) instructional outcomes
from interaction. These uses include:
Building group coherence
Refining communication skills
Most of the interaction outcomes described by Chism (1998) and Wagner (1997) are
operationalized through dialog. Dialog, as defined in Chapter 1, is a communicative act for the
purpose of shaping social interaction. Dialog displays thinking so it can be acted on by others and
influence the direction of further dialog and inquiry (Burbules & Bruce, 2002). Dialog can therefore
be seen to be an especially critical type of people interaction since it conveys the potential for many
desired instructional outcomes.
Open systems interactions afford the types of instructional outcomes described by Wagner
(1997) and Chism (1998). These outcomes are highly valued by constructivist theorists, who believe
that social interaction is critical to learners and society at large (Brown & Duguid, 1989; Burbules &
Bruce, 2002; Grabinger & Dunlap, 1995; Lave & Wenger, 1991; Scardamalia & Bereiter, 1994). In
the next section, I will describe how these theorists make the case for the importance of social
interaction for learning.
Conceptual Frameworks for Instructional Interaction
The following constructivist conceptual frameworks (social constructivism, situated
learning, and activity theory) provide key theoretical understandings about why people interaction is
important to learning and offer insights that help us consider how to maximize the learning benefits
made possible through people interaction. These constructivist approaches to learning stress that
meaningful and transferable learning should be the primary goal of instruction (Grabinger & Dunlap,
1995; McMahon, 1997). One of the main arguments against many educational methods is that they
tend to produce "inert knowledge" (Grabinger & Dunlap, 1995), knowledge that is abstracted from
the complexities of how it is used by real practitioners in real settings. Decontextualized knowledge
is not easily transferred into actual use in the real world (Grabinger & Dunlap, 1995; Scardamalia &
Bereiter, 1994).
In an online business ethics course, learners might be able to select the correct answer from
the four multiple-choice answers given in the browser window, but ethical dilemmas generally do
not present themselves this precisely and decision making is typically more complex than selecting
among four multiple-choice answers. In real life, most problems have multiple factors to consider
and selecting a good course of action involves complex decision making in a dynamically changing
situation. How is this represented in traditional instruction? Many say it is not, with decreased
learning and less transfer as a result (Davis et al., 2000; Dewey, 1997; Grabinger & Dunlap, 1995;
Lave & Wenger, 1991; Scardamalia & Bereiter, 1994). Scardamalia and Bereiter (1994) suggested
that most traditional instructional methods are not designed to replicate the types of iterative problem
solving that characterizes true expertise.
Why do traditional instructional methods lead to decontextualized learning? Berryman
(1991) presented these five flawed instructional assumptions to answer this question:
1. People transfer learning from one situation to another.
2. Learners are passive receivers of wisdom.
3. Learning is the strengthening of bonds between stimuli and correct responses.
4. Learners are blank slates.
5. Skills and knowledge should be acquired independent of their contexts of use (p. 8).
In contrast to Berryman's (1991) five flawed assumptions, Grabinger and Dunlap (1995)
provided alternative assumptions that are more useful for designing meaningful and transferable
learning:
1. People transfer learning from one situation to another with difficulty. Learning is more
likely to be transferred from complex and rich learning environments.
2. Learners take an active role in forming new understandings and are not just passive
recipients of information.
3. Learning is a collaborative process. Students learn not only from experts and teachers, but
also from each other.
4. Learning is cognitive, and involves the processing of information and the constant creation
and evolution of knowledge structures. We must focus on and make visible thinking and reasoning
processes.
5. Learners bring their own needs and experiences to a learning situation and are ready to act
according to those needs. We must incorporate those needs and experiences into learning activities to
help students take ownership and responsibility for their own learning.
6. Skills and knowledge are best acquired within realistic contexts. ...[S]tudents must have the
opportunity to practice and learn the outcomes that are expected of them under realistic or authentic
conditions.
7. Assessment of students must take more realistic and holistic forms, utilizing projects and
portfolios and de-emphasizing standardized testing (pp. 9-10).
Grabinger and Dunlap's (1995) assumptions are best understood through the following
learning theories: social constructivism, situated learning, and activity theory.
Social Constructivism / Situated Learning
Social constructivism argues that learning is more than passively importing information into
a learner's head (McMahon, 1997). Vygotsky (1978), a Soviet learning theorist, pioneered these
ideas and argued that learning is primarily a social construct that is mediated by discourse. Many
theorists and researchers agree that social interaction, especially dialog, provides one of the most
critical interactions for learning (Burbules & Bruce, 2002; Scardamalia & Bereiter, 1994; Sherry,
1998).
One of the central beliefs of social constructivism is the notion that learning is essentially a
situated activity (Brown & Duguid, 1989; Guribye & Wasson, 1999; Lave & Wenger, 1991). All
learning is situated in a specific context because general knowledge is always applied in specific
circumstances. Because knowledge evolves with use and each use changes that knowledge, learning
should ideally be embedded in the types of specific situations in which it is likely to be used (Brown
& Duguid, 1989). Most situations in which knowledge is gained or used involve other people.
The Vygotskian concept of the Zone of Proximal Development (ZPD) has significant
implications for context-oriented learning. The ZPD is defined as "the distance between the actual
developmental level as determined by independent problem solving and the level of potential
development as determined through problem solving under adult guidance or in collaboration with
more capable peers" (Vygotsky, 1978). In other words, the scaffolding provided by working with
others potentially extends an individual's capabilities. Peal and Wilson (2001) described ZPD-
inspired learning environments that include realistic activity systems, structured interaction, guidance
by experts, and surrender of control, over time, to increasingly competent learners.
McMahon (1997) asserted that these kinds of strategies can benefit learners in online
courses. Since the Web has strong social interaction affordances, Internet communication tools may
be effectively used to promote reflection and metacognition (McMahon, 1997), key instructional
strategies for achieving "high road transfer," where knowledge associated with one context can be
utilized in another (Perkins & Salomon, 1988).
Social Interaction and Explicit and Tacit Knowledge
The importance of social interaction becomes especially evident when looking at how we
attain two types of knowledge, explicit and tacit (J. S. Brown, 2000). Explicit knowledge is concrete,
the "know-whats" that are often expressed through text (e.g., documents, procedure manuals, etc.).
For example, the precise steps needed to ring up a sale on a cash register can be considered explicit
knowledge. In other words, the procedure is unambiguous and could be easily written in a job aid.
Tacit knowledge, on the other hand, deals with the "know-hows" and is more ambiguous
and complex. Examples of tacit knowledge include how to handle the myriad problems that might
arise when attempting to ring up a sale (e.g., problems with customers, problems with the sales
codes, cash register problems, etc.). It would be nearly impossible to write a concise job aid for this
type of knowledge because there are simply too many variables to consider. Tacit knowledge is
intuitive and context-specific and is generally passed on informally (J. S. Brown, 2000). Real
expertise, then, depends a great deal on tacit knowledge, making dialog and information sharing
critical. This kind of knowledge is generally ignored in most traditional instruction (Scardamalia &
Bereiter, 1994).
The need to share tacit knowledge is especially evident in the workplace. The ability to
locate, gain, and share tacit knowledge, develop relationships, and synthesize and apply knowledge
from a variety of sources is a key skill for knowledge workers in the information age (Nardi,
Whittaker, & Schwarz, 2000). Nardi et al. (2000, para. 4) explained how under these conditions of
rapid structural change, workers leverage their own personal networks, rather than relying on
unstable, weakening org charts. Workers are empowered only if they are successful at creating and
maintaining personal social networks. The reality of the world of work emphasizes the need for
people interaction and knowledge sharing. Instructional situations that are meant to transfer to these
working conditions must emphasize them as well.
Social interaction in online courses provides the potential for embedding more authentic
contexts. This is especially evident in online training courses. Beer (2000) observed,
Workplace learning [needs to be] social, because skills and knowledge [need to be] learned
and applied in an environment of collaboration, coordination, and negotiation. While
acquiring information may be an individual task, creation of knowledge is not, as it is from
the social context of peers, managers, and customers that your employees get feedback on
the correctness and effectiveness of their knowledge and skill (p. 42).
Thus, interaction with others helps prepare learners for the challenges of real life and real
work (Kearsley & Shneiderman, 1999; Scardamalia & Bereiter, 1994). Even though the necessity for
tacit knowledge sharing is more obvious in workplace learning, all learning environments that intend
to build expertise must grapple with how tacit knowledge is acquired and the transfer of what is
being taught to real situations.
How should we think about individual learning in terms of social interaction? Lipponen
(2002) described individual learning in a social context through the eyes of two important learning
theorists: Piaget and Vygotsky. Piaget believed that an individual mind develops the ability to take
into account others' perspectives and this capacity increases as we come into contact with others'
views. Vygotsky theorized that learning happens as a result of participation in social knowledge
construction and emerges through interaction, mediated by tools (including language). Situated
learning theorists add that individual learning maximizes each individual's ability to contribute to the
whole (Brown & Duguid, 1989; Greeno, 1997; Lave & Wenger, 1991). The major implication is that
the purpose of learning is to enhance the ability of individuals to participate in the domains of
practice that they are involved in or that they will become involved in.
Activity Theory
Activity theory evolved from Vygotsky and other Soviet theorists' thinking about thought
and consciousness. It provides perspectives on how human actions influence consciousness. Activity
theory describes how the individual mind emerges as a result of interactions with the environment and
therefore activity is needed for learning (Engestrom, 1987). This is in contrast to the more common
notion that learning is a precursor to activity. Activity theory concepts help us understand the link
between individual cognition and social practices by looking at people, in groups, involved in
attaining specific goals (S. A. Barab, Barnett, Yamagata-Lynch, Squire, & Keating, n.d.;
Jonassen & Rohrer-Murphy, 1999; Nardi, 1997). Through this lens, we see that learning results
primarily from interaction with people and things in the real world.
Vygotsky and his colleagues explored the ways in which individual interactions with the
environment are mediated through means, that is, tools and signs (Engestrom, 1987). Language and
mathematical signs are tools that have a significant influence on how people see and interact with
their environment. Activity theory claims that it is through our interactions with the environment
(including people and tools) that we learn and our learning likewise affects the activity system in
which we are engaged. Like situated learning, this view provides a social view of learning that is
deeply iterative. We participate in activities that impact our learning. Our learning influences our
participation, which impacts our learning, the activities themselves, and others' learning.
Sherry (1998) described the six elements that are commonly used in activity theory to
specify the boundaries and impacts of a specific activity system (Engestrom, 1996) as they pertain to
online instructional materials:
subject (e.g., a student, teacher, or expert);
object of activity (e.g., a message posted);
mediating tools of the activity (e.g., the discussion forum, multimedia tools, etc.);
community of learners (anyone connected electronically by the network);
division of labor (the responsibilities commonly associated with the roles of student,
teacher, artist in residence, expert, etc.); and
rules or norms regarding appropriate social actions (who can post, participate, moderate
discussion, etc.) (bullet list after "CMC: A Mediating Tool in an Activity System" heading).
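Sherry's six elements can also be read as a simple record structure describing one online instructional activity. The sketch below is a hedged illustration only (the class, field names, and example values are my own hypothetical rendering, not part of Sherry's or Engestrom's work):

```python
# Hypothetical sketch: the six activity-system elements as a record,
# used here to describe an online discussion activity. Field names follow
# Sherry's (1998) list; the concrete values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class ActivitySystem:
    subject: str                    # e.g., a student, teacher, or expert
    object: str                     # e.g., a message posted
    tools: list                     # mediating tools of the activity
    community: str                  # everyone connected by the network
    division_of_labor: dict         # responsibilities by role
    rules: list = field(default_factory=list)  # norms for social action

discussion = ActivitySystem(
    subject="student",
    object="a posted discussion message",
    tools=["discussion forum", "multimedia tools"],
    community="all learners connected to the course network",
    division_of_labor={"student": "post and respond", "teacher": "moderate"},
    rules=["students may post", "teacher moderates discussion"],
)
print(discussion.subject)
```

Writing an activity system down this concretely makes it easier to see which elements a designer can actually manipulate, which is the point the next paragraph develops.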
These six elements focus attention on what can be manipulated in online instructional
activities in order to facilitate authentic, transferable learning. Activity theory, as a lens for looking at
the design and implementation of learning environments, focuses far less on knowledge inside an
individual learner's head than on the activities that people engage in, including the context of those
activities, relationships, tools, and desired outcomes (Jonassen & Rohrer-Murphy, 1999).
Understanding the interrelationships between these entities helps us consider the kinds of
meaningful, open systems interactions needed for a given instructional situation.
Design of Open Systems Learning Environments
How can we utilize the insights gained from social constructivism, situated learning, and
activity theory when designing instruction? Wilson and Myers (1999) provided some excellent
insights for considering instruction from a situated point of view. Renkl et al. (1999) listed pivotal
instructional principles common to situated learning and Grabinger and Dunlap (1995) suggested
instructional strategies that best operationalize a situated and social constructivist point of view.
Table 2.1 shows the alignment between these views.
Table 2.1 Insights for Designing Open Systems Learning Environments

Context and meaning
Wilson and Myers (1999, p. 71): Thinking, learning, and cognition are context specific. Meaning is fostered via communities through dialog.
Grabinger and Dunlap (1995, p. 2): Authentic contexts and relevant, meaningful learning.
Renkl et al. (1999): Learning in real-life situations is ideal. The learning context should be similar to the context in which it is expected to be applied.

Activities
Wilson and Myers: Activities create context and tools for knowledge; knowledge grows through participation. Tools (especially language) enable, limit, or channel intellectual processes. Tools are culturally derived and transmit culture. Tools are changed by and change culture.
Grabinger and Dunlap: Dynamic, generative learning activities.
Renkl et al.: Learning is often best triggered by interesting problems. These should be authentic or at least near-to-reality. Problems should contain the level of complexity found in real life.

Learners' thinking
Wilson and Myers: Situations impact individual thinking/action and individual thinking/action impacts situations.
Grabinger and Dunlap: Intentional learning and student responsibility. Reflection and self-assessment.
Renkl et al.: Learners should engage in reflective and metacognitive practices while working on authentic problems. Learners require support and modeling to grapple with this complexity.

Identity and participation
Wilson and Myers: Learning increases participation and participation increases learning. There is constant interplay between individual cognition and the situations/activities individuals are involved in.
Grabinger and Dunlap: Collaboration and the social negotiation of meaning.
Renkl et al.: Learner epistemological beliefs and tolerance for ambiguity influence ability and willingness to learn in these complex learning environments.
Learning environments designed from these points of view are clearly contextual and
revolve around the types of activities and tools used by real practitioners and real life situations.
Learners are encouraged to systematically think about their thinking and learning so they can better
regulate their learning experience and improve their ability to participate. Since these environments
are complex (mirroring the complexity of real life), support is needed to help learners navigate and
be successful. Individual cognition and group understanding are in constant interplay; each affects
the other.
In summary, social constructivism, situated learning, and activity theory, as theoretical
frameworks, tell us that knowing and meaning happen in the context of the activities that people
engage in. This includes the other people who are involved in those activities and the tools that
mediate language and the activity itself. Learning in this way prepares learners to use the content of
instruction in the real world, rather than simply to take tests.
In online instruction, dialog is mediated by the tools available for this purpose. The
affordances of these tools make certain types of people interactions possible and others less so.
In the next section, the tools used for online dialog are described and the challenges of the most
commonly used tool are delineated. One type of people interaction, targeted knowledge building,
is then described, along with the challenges of accomplishing this kind of interaction online.
Asynchronous People Interactions in Online Instruction
Open systems interactions, by their nature, depend chiefly on interaction with people, and
dialog is the primary method by which these open systems interactions are operationalized.
Interpersonal interaction allows the learner to reflect and reconsider, get help and support, and
participate in authentic problem solving (Berge, 1996; Brooks & Brooks, 1999; Brown & Duguid,
1989; Lave & Wenger, 1991). In online courses, learners must first interact successfully with the
technologies used in order to interact with content or people (Hillman et al., 1994). Learners cannot
effectively interact unless they are able to use the medium with ease (Hillman et al., 1994; Kruper,
2002; Salmon, 2001). Therefore, the quality of their interaction with technology profoundly affects
learners' experiences (Kruper, 2002) and their ability to utilize these technologies for the types of
learning experiences described earlier.
Empirical evidence substantiates the impact that usability has on user experience (Bailey,
2001; Bernard, n.d.; Communication Technologies Branch of the National Cancer Institute's Office
of Communications, n.d.). The need to attend to usability for online learning sites is based on
concepts from psychological and educational research about how humans perceive and process
information and utilize tools for learning (Kruper, 2002). Concepts encountered in the study of
perception, memory and processing, and learner control provide empirical underpinnings for
usability insights for online learning designers and developers (Alessi & Trollip, 2001; Fleming &
Levie, 1978). Raskin (2000, p. xix) explained, "Quality ... is ultimately determined by ... the
interaction between one human and one system. ... If ... not pleasant and facile, the resulting
deficiency will poison the performance of the entire system, however fine that system might be in its other aspects."
Perceptual attributes should be considered when designing and developing online course
materials and when selecting tools and technologies. Design elements falling within the domain of
usability include screen placement, use of text, and navigational elements (Alessi & Trollip, 2001).
These elements help learners know what to do and not feel frustrated.
Dual coding, multiple symbol systems, and the principle of organization provide additional
influences on usability. Dual coding refers to visual and auditory information received
simultaneously and processed separately (Clark & Paivio, 1991, as cited in Alessi & Trollip, 2001).
Closely related is the notion that multiple symbol systems (texts, graphics, animation, video, etc.)
enhance learning (Dickenson, 1985, as cited in Alessi & Trollip, 2001). These systems can be helpful
or harmful (e.g. when the modes or systems provide conflicting information) to learning. The
principle of organization says that information is retained for a longer period when it is well
organized and the learner is provided with clues about the intended organization (Fleming & Levie, 1978).
Usability is clearly a critical component of online learning design and development as it
affects user willingness to engage in the experience, perceptions about the experience, and
ultimately, the effectiveness of the experience. The success of interaction with content and other
people, so critical to the distance learning experience, is dependent on the systems that allow it to
happen (or not happen, as the case may be). Learners' experiences are mediated by the functionality
of the system.
Tools for People Interactions
A variety of tools are commonly used for asynchronous people interactions in online
courses. Although there is plenty of overlap and it is difficult to neatly categorize the tools, the
following categorization system may be useful for making distinctions between them (The Lab at
Brown University, 1999).
Web or computer conferencing (asynchronous discussion groups/forums): Topic based
conversations using mailing lists, web-based discussion tools, or groupware
Data collection/sharing/organization: Repositories of resources, projects using
databases, groupware, and search engines
Document Sharing: Tools that allow users to display, discuss and collaborate on
documents or artifacts using websites, annotation systems, word processing systems,
electronic whiteboards (pp. 7-12)
Web or computer conferencing, also known as asynchronous discussion forums, is widely
used for dialog and collaborative work in online courses (Burge, 1994; Cartwright, 2000; Harasim,
1997). Harasim described five characteristics for this type of communication: many-to-many
communication, place independence, time independence, text-based, and computer-mediated
interaction. The simplest computer conferencing software allows people to post text and allows
others to respond to that text. More complex systems may allow people to post attachments, use
HTML in their postings, archive specific posts, and utilize e-mail for notification of new messages.
Computer conferencing supports collaborative learning and interaction among course
participants, including the instructor (Cartwright, 2000; Harasim, 1997; Shank, 2002), helps learners
practice dealing with complex and realistic problems, and allows learners to help each other with
course, content, and technological problems (Cartwright, 2000). Computer conferencing can reduce
the isolation felt by distance learners learning alone (Burge, 1994). Additional benefits of computer
conferencing include allowing conversations to take place at each individual's convenience and
permitting individuals to take time to digest what is written and to respond (Burge, 1994; Woolley, 1998).
Despite the widespread use of computer conferencing, learners and facilitators express
numerous problems when using these tools. Commonly reported problems include difficulties in
adjusting to the technology and the extended time it takes to feel comfortable using it for online
conversations and group work (Cartwright, 2000), reservations about conversing in print, and worry
about perceptions by others and lack of visual cues (Shank, 2002; The Centre For Systems Science,
1994). A common complaint among online learners is information overload from having to dig
through large amounts of postings and the large percentage of posting content that is off topic or
irrelevant (Burge, 1994; Shank, 2002; The Centre For Systems Science, 1994). A related complaint
is the bandwidth and time requirements of opening numerous postings (McMahon, 1997; Shank,
2002). Additionally, because of the time lag inherent in computer conferencing, discussions may lose
momentum and seem fragmented (Burge, 1994; Shank, 2002; Woolley, 1998). Any or all of these
problems may cause learners to not use computer conferencing, use it less often, or find the
interaction to be less valuable.
Wenger (2001), in a recent survey of tools available for supporting online communities of
practice, asserted that new tools are still in need of development because optimal tools are not always
available to support the following interaction needs continua:
knowledge exchange ↔ social exchange
ongoing ↔ transitory (p. 10)
Tools that minimize the kinds of problems that have been attributed to computer
conferencing could be valuable adjuncts to those that are currently available for people interaction in
online courses and instructional materials.
Tools for Knowledge Building
Knowledge building is an interaction activity that builds shared understanding by building
conceptual artifacts (Lipponen, 2002). The field of Computer-Supported Collaborative Learning
(CSCL), which grew out of Computer-Supported Cooperative Work (CSCW) research in the mid
1990s, looks at ways that collaborative learning can be supported by technology and how
collaboration and technology can augment learning by allowing knowledge and expertise to be
shared (Lipponen, 2002). Although learning itself is not observable, the process of building
knowledge can be observed because the process produces artifacts (Stahl, 2002).
CSCL researchers have articulated the need to explore the ways that technologies can
provide support for optimal interaction and superior instructional strategies rather than merely make
use of available but less-than-optimal technologies (Lipponen, 2002; Suthers, 2001). CSCL
researchers (Lohner & van Joolingen, 2002; Suthers, 2001; Suthers, Girardeau, & Hundhausen,
2002) explained how representational systems, such as the tools used for online dialog, each have
specific affordances. These affordances, of necessity, restrict or enhance what is able to be
represented by the system and thereby change learners' focus and activity. Researchers (Bonk, 2002;
Brush, Bargeron, Grudin, Borning, & Gupta, 2002; Rick, Guzdial, Carroll, Holloway-Attaway, &
Walker, 2002) have described how anchored annotation systems like WebAnn, CoWeb, CoNote, and
CaMILE, where dialog is embedded in the artifact being discussed, can be much more focused than
traditional Web or computer conferencing systems.
Suthers (2001) described a great need for additional research and development into systems
that better facilitate the process of knowledge building. CSCL researchers (Collins & Ferguson,
1993; Maudet & Moore, 1999; D. Moore, 2000; Morrison & Collins, 1996) specifically identified the
role of dialog games for knowledge building in technology supported learning environments. These
descriptions come from a wide variety of domains, including computational linguistics, artificial
intelligence, and education. The calls by CSCL researchers for more and better tools for the range of
dialog activities mirror those by Harasim (1996, 1997) and Wenger (2001). One such tool would
allow epistemic games to be played in online instruction.
Epistemic game theory describes dialog games that can be used for knowledge building.
These games occur naturally during the course of informal dialog and more formal dialog in
instructional settings (Collins & Ferguson, 1993; Morrison & Collins, 1996). According to epistemic
game theorists (Collins & Ferguson, 1993; Morrison & Collins, 1996), knowledge is structured in
certain forms (e.g., lists, tables, matrices) according to cultural and domain specific norms. This
notion aligns with those of activity theorists, who explain that knowledge is constructed as a result of
the practices inherent in culturally oriented activity (Jonassen & Rohrer-Murphy, 1999).
Epistemic forms guide inquiry during dialog by providing a specific format for classifying
knowledge. For instance, a list form is commonly used in daily conversation as people try to come
up with alternatives from which to choose, and the compare-and-contrast and cause-and-effect forms
occur commonly in the workplace as people attempt to solve problems. Using epistemic forms to
guide inquiry makes underlying organization observable and patterns clearer. The process of
carrying out or completing an epistemic form is the epistemic game. Each epistemic game involves
moves (actions that can be taken), entry conditions, constraints (rules of play), and strategies (Sherry
& Trigg, 1996). Collins and Ferguson (1993) proposed three categories of epistemic games:
structural analysis, functional analysis, and process analysis. The types are increasingly challenging
and can be used individually or together.
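For concreteness, the game components described above (moves, entry conditions, constraints, and strategies) and Collins and Ferguson's three categories can be sketched as a simple data structure. This is purely my illustration; the class and field names are assumptions, not part of any existing tool:

```python
from dataclasses import dataclass, field
from enum import Enum

class GameCategory(Enum):
    # Collins and Ferguson's (1993) three categories of epistemic games
    STRUCTURAL = "structural analysis"   # shows concepts or elements
    FUNCTIONAL = "functional analysis"   # shows how elements relate
    PROCESS = "process analysis"         # shows why elements behave as they do

@dataclass
class EpistemicGame:
    """One epistemic game: a form plus the rules for completing it.
    Hypothetical model for illustration only."""
    name: str
    category: GameCategory
    moves: list[str] = field(default_factory=list)        # actions players may take
    entry_conditions: list[str] = field(default_factory=list)
    constraints: list[str] = field(default_factory=list)  # rules of play
    strategies: list[str] = field(default_factory=list)

# Example: a simple list game, an instance of a structural analysis form
list_game = EpistemicGame(
    name="list",
    category=GameCategory.STRUCTURAL,
    moves=["add an item", "combine items", "split an item", "remove an item"],
    entry_conditions=["a question answerable by a set of alternatives"],
    constraints=["items must be distinct", "items must answer the question"],
)
```

Modeling a game this way makes explicit that the form (the artifact) and the game (the rules for filling it in) are separate ideas, which is the distinction the theory draws.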
Structural analysis games are used to show concepts or elements. Examples of structural
analysis forms include simple lists and tables. Functional analysis games are used to show how
concepts or elements relate to each other. Examples of functional analysis forms include organization
charts and maps. Process analysis games are used to show why elements or concepts behave the way
that they do. Examples of process analysis forms include flowcharts and spreadsheets.
A list of epistemic form types (Collins & Ferguson, 1993; Morrison & Collins, 1996) shows
the range of possible games, but the potential universe of game types is limitless. How the games are
structured determines the category of game. Here are a few of the types of games mentioned:
matrix or table
graph or chart
Using Knowledge Building Games in Instruction
How should we decide when and how to use epistemic games during instruction? Insights
from constructivist theorists, described earlier, provide a context for using these games. The range of
games can be used in a variety of ways (Collins & Ferguson, 1993; Morrison & Collins, 1996). It
would be useful to look at how various content types and learning objectives could be supported with these games.
Sugrue (2002) described a content-by-performance model for considering different types of
content and learning objectives which is adapted from Merrill's component display theory (1994). In
this model, she illustrated how each type of content (facts, concepts, principles, etc.) will encompass
certain types of learning objectives depending on whether the desired result is to have learners recall
or utilize the content. Epistemic forms can be developed and used for both types of objectives. If the
objective is to have learners recall content, the form can ask learners to provide their recollections. If
the objective is to have learners utilize (or apply) content, the game can be structured so that
analysis, application, evaluation, or synthesis is needed to respond. In other words, forms can be used
to elicit existing knowledge, or can provide a format for building new knowledge.
The different types of content and associated objectives types can be used to help those who
develop online instructional materials determine how the universe of epistemic form types might be
used in online instruction. Although it is difficult to be strictly prescriptive in describing how and
when to use different types of epistemic forms, Table 2.2 provides examples of how different
epistemic forms might be embedded in instructional content to promote different types of learning
objectives in higher education, training, and continuing education online courses.
Example Epistemic Forms Based on Sugrue's Model of Content and Objectives Types
Content type, with practice/assessment examples (depending on level of performance):
Fact
Recall: List of top agricultural states. Use: List of suggested mnemonics for memorizing top agricultural states.
Recall: Chronology of U.S. decision points during the Cuban missile crisis. Use: Table matching decision points with potential alternate decisions and potential results.
Concept
Recall: List of allowable expenses while on company trips. Use: Classification of allowable expenses into expense report categories.
Recall: List of critical economic indicators. Use: Concept map illustrating how economic indicators impact the stock market.
Principle/Rule
Recall: Table with health behaviors and predicted outcomes. Use: Table describing potential interventions and rationale.
Recall: Incorporation types for small businesses. Use: Decision matrix for selecting one type over another.
Procedure
Recall: List of steps that must be followed for giving a medication injection. Use: Table showing potential problems at each step and what can be done to prevent them.
Recall: Brainstorm of the most critical step in helping a client plan a cruise vacation. Use: Table of decision points and questions to ask the client to help them make the best decision.
Process
Recall: List of milestones for completing an instructional project. Use: Gantt chart for the project with critical path.
Recall: Stages of the performance review process and problems at each stage. Use: Suggestions for actions to reduce the top problems in each stage, with predicted impact.
It is also possible to think of forms that could be used to elicit and perhaps modify attitudes.
Forms that might be used in this way include forms that elicit alternatives and consequences, ideas
and reactions, and points of view.
Clark's (2000) four instructional architectures categorize instruction according to the type
of transfer and skills involved. In near transfer, instruction is aimed at procedural skills that are
performed similarly or almost identically each time. At the other end is far transfer, or principle-based skills, which require adjustment each time they are used. Here is Clark's list of instructional
architectures, from simple to complex. For each architecture, a description of instructional strategies
that can augment learning is provided.
Receptive architectures: The learner is expected to absorb information from video or text,
and there is a lack of externally prompted interaction. To enhance learning in this
architecture, metacognitive activity should be prompted through strategies such as case analysis.
Behavioral architectures: Learning is induced through bottom-up skill building from easy
to complex, with carefully constructed and tested interactions and regular feedback.
This architecture may be good for novices but is not as good for more expert learners.
Situated guided architectures: The learner is provided with resources to help build internal
knowledge bases. Learning is organized around realistic work problems with
naturalistic feedback. This architecture is better for more advanced learners and far transfer.
Exploratory architectures: The learner is provided with links and resources and can move
among topics at will. Learners need good metacognitive skills and prior knowledge.
Usability and information design are critical. Frequent practice can be added to manage cognitive load.
The epistemic form examples listed above can fit into all of these instructional architectures.
For example, epistemic forms could be embedded in receptive architectures to prompt learners to
consider how what they are reading relates to previous knowledge. In behavioral architectures,
epistemic forms could be used for feedback. In situated guided architectures, epistemic forms could
be used to allow learners to share solutions to problems at various stages of the lesson. In exploratory
architectures, epistemic forms could be used to help learners build models to manage cognitive load.
It is important to note that the example epistemic forms listed here are not intended to stand
alone. They are dialog activities that are embedded in instruction or other knowledge building
activities with a specific goal. Although epistemic games are a natural part of socially oriented
knowledge creation, it is hard to take part in them using many existing computer conferencing tools.
A tool developed precisely for the purpose of knowledge building through simple epistemic forms
could allow learners easier access to creating knowledge with others without the overhead of computer conferencing.
The importance of social interaction for learning is widely recognized (Berge, 1995; Brown
& Duguid, 1989; Fulford & Zhang, 1993; Gunawardena & Zittle, 1997; Kanuka & Anderson, 1998;
Kearsley, 1995; Kearsley & Shneiderman, 1999; McDonald & Gibson, 1998; M. G. Moore, 1991,
1993; Nunn, 1996; Scardamalia & Bereiter, 1994). If we want learners to be able to use the
knowledge gained in instruction in real world settings, learning activities must be more authentic,
mirroring the types of thinking, tools, and activities that happen in the real world (Grabinger &
Dunlap, 1995; Lave & Wenger, 1991). Horton (2000) explained that this is just as true in online
learning as it is in classroom-based learning: "People learn little by merely clicking the mouse. ...
People learn by considering, researching, analyzing, evaluating, organizing, synthesizing, discussing,
testing, deciding, and applying ideas. [The] goal is to provoke the exact mental activities that lead to
learning" (2000, p. 192).
Learning in more authentic ways often involves social interaction and social interaction in
online courses and instructional materials most often occurs through tools like computer
conferencing. Despite the widespread use of computer conferencing, learners and facilitators
describe numerous problems in using these tools. Additional tools are needed to support a range of
people interaction activities (Harasim, 1997; Harasim et al., 1996; Wenger, 2001). One activity that
is not easily accomplished with computer conferencing tools is targeted knowledge building with
epistemic forms. A tool that would allow learners to build epistemic forms online could allow
learners easier access to creating knowledge with others without the overhead of computer conferencing.
Since people or social interactions have a high potential instructional value, evaluating
additional people interaction tools and discovering means for overcoming existing problems is, I
believe, a valuable and worthwhile research activity. The study described in the next chapter
evaluates a number of epistemic forms tool prototypes to see if they are useful and usable for people
interaction in online instruction.
Social interaction conveys desirable outcomes in classroom-based and online instruction
(Anderson, 2002; Berge, 1999; Brown & Duguid, 1989; Kearsley, 1995; Kearsley & Shneiderman,
1999; McDonald & Gibson, 1998; M. G. Moore, 1991, 1993; Nunn, 1996; Scardamalia & Bereiter,
1994). The most common tool used for social interaction in online instructional materials, computer
conferencing, has limitations which make it less than optimal for targeted knowledge building
(Burge, 1994; Cartwright, 2000; Harasim, 1997; Shank, 2002).
This study evaluated a potential new social interaction tool, dynamically created epistemic
forms embedded in online instructional materials, as a mechanism for learner interaction and
knowledge building. Prototype epistemic forms were built and tested in order to make suggestions
for additional development and use in the future. The study took the form of a design narrative that
looked at how the forms were used, the interactions that occurred, and the resulting artifact (the form
with submissions) in a variety of domains and situations. As a preliminary design experiment or
proof-of-concept, the study was necessarily and purposely limited. The primary intent of this study
was to determine whether the tool shows enough value to be developed and studied in the future.
The study took place in three parts. Part 1 involved instructors and instructional
designer/developer feedback on and use of epistemic form examples that I had designed and
developed. In Part 2, these participants designed a lesson using one or more of these forms, and in
Part 3, the designed lesson and form were built and tested in actual use. Data about the forms' uses,
interactions, and artifacts were collected and triangulated from multiple sources including face-to-
face, email, and phone communications with instructors, instructional designer/developers, and
learners, lesson documents, resulting artifacts, and my notes. Table 3.1 provides an overview of the
study's research questions, parts, data sources, and data analysis methods.
Overview of Study Research Questions, Parts, Data Sources, and Procedures
Research questions (throughout the study):
What types of content, strategies, objectives, and outcomes does the tool best support? Does the tool support knowledge building?
What is the nature of the interactions that occur when using the tool? In what ways are learners interacting with each other? In what ways are learners interacting with the content?
What is the nature of the artifact that is built? How does the tool compare to computer conferencing tools? What usability issues arise and what additional features are needed?
Part 1 (March-May 2003)
Goals: Determine potential uses, interactions, and artifacts using example forms.
Data sources: Face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix A); artifact (form built by study participants); my notes (general observations and themes).
Data analysis: Code for activities, interactions, and artifact.
Part 2 (March-May 2003)
Goals: Evaluate desired uses, interactions, and artifacts using lessons.
Data sources: Lesson designed by instructor or instructional designer/developer (see Appendix C); face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix D); records of e-mail and phone conversations; my notes (general observations and themes).
Data analysis: Code for activities, interactions, and artifact.
Part 3 (March-May 2003)
Goals: Evaluate uses, interactions, and artifacts in actual online instructional materials.
Data sources: Artifact (form built by participants); face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix F); interviews with learners (see Appendix G); my notes (general observations and themes).
Data analysis: Code for activities, interactions, and artifact.
This type of study is considered developmental (Richey, 1997), and is therefore aimed at
informing practice and providing a critical path toward the creation of new tools for learning. Type 1
developmental research allows researchers to establish the effectiveness of an instructional product
or procedure, with an eye toward additional testing of the product in the future. This type of research
project commonly relies on naturalistic evaluation methods and results in the following kinds of
conclusions: suggested improvements, conditions that promote successful use, impact of the product,
and lessons learned (Richey, 1997). The researcher's goal in this type of research is to intentionally
influence the learning environment (Barab & Kirshner, 2001). While other types of
research seek more to confirm what researchers believe is known, this type of research seeks
primarily to gain insights about the nature of teaching and learning, in natural contexts (Cobb,
Confrey, diSessa, Lehrer, & Schauble, 2003; Kelly, 2003).
Proof of concept investigations are designed to show whether an idea is feasible. They are
widely used in software engineering and have begun to be used in instructional development to
inform decisions about potential instructional innovations (Jones & Richey, 2000). In educational
research, proof of concept research is commonly known as design experiments (Cobb et al., 2003;
Kelly, 2003; The Design-Based Research Collective, 2003). Design experiments involve studying
innovations in context in order to gain insights about the myriad factors involved in learning, and the
affordances of potential instructional materials and methods (The Design-Based Research Collective,
2003). Design experiments typically describe an instructional intervention as it is modified in the
course of practice. Researchers who use these methods accept that the hundreds or thousands of
complex designer, teacher, learner, and researcher details cannot be explained or accounted for, as
might be done when controlling for specific variables in an experiment. Instead, the situation as it
exists and unfolds is described in detail, analyzed, and used to initiate further inquiry (Barab &
Kirshner, 2001; The Design-Based Research Collective, 2003). Reliability is promoted by
triangulation from multiple data sources within a single study and multiple studies over time (The
Design-Based Research Collective, 2003). The typical goal of a preliminary design experiment is to
produce locally applicable knowledge which can inform local practices and set in motion additional
research which can inform larger practice over time. Desired results include better understanding of
specific interventions, improved theoretical understanding of teaching and learning in specific
contexts, and knowledge that informs educational practice (Cobb et al., 2003; The Design-Based
Research Collective, 2003).
The types of results that can be attributed to design experiments are commonly desired by
researchers looking to gain insights about all kinds of educational innovations. In the next section, I
will discuss how researchers in the Computer-Supported Collaborative Learning (CSCL) field use
these methods to study innovations in tools used for collaborative learning.
Collaborative Learning Tools Research
The epistemic forms studied here are proposed tools for targeted knowledge building and
these kinds of tools are commonly studied by researchers in the CSCL field. CSCL researchers assert
that real-world contexts are needed to better understand the effect of these tools (Lipponen, 2002).
Koschmann (2002) calls for CSCL research to be more interested in understanding "the practices of
learning" (p. 18), rather than simply the outcomes of learning (the latter of which is more typical in
educational research). He asserts that CSCL research needs to be primarily concerned with
"meaning-making in the context of joint activity and the ways in which these practices are mediated
through designed artifacts" (Koschmann, 2002, p. 20).
Koschmann (2002) calls for CSCL researchers to be active participants in the design of
technologies that facilitate meaning-making. Likewise, Stahl (2002) calls for research that looks at
"the growth of communal understanding as reflected by construction of a knowledge object that is
shared by the group" (p. 64). Researchers can interpret what is happening, Stahl asserts, by viewing
dialog over time. This view of research on knowledge-building tools is mirrored by Hoadley (2002),
who emphasizes that evaluation of tools for collaborative learning must combine design and research
in order to make the context clear when reporting results. He calls for "design narratives" (p. 454)
that intertwine design and research in order to establish the context for collaboration and the
structures that support collaboration. Since design itself is open-ended and iterative, Hoadley calls
for research on the design of new tools to be likewise.
Like the proponents of design experiments, CSCL researchers (Hoadley, 2002; Koschmann,
2002; Lipponen, 2002; Stahl, 2002) call for research whose specific aim is a better understanding of
"the practices of learning" (Lipponen, 2002). They call for active participation by researchers in
designing and evaluating technologies that help learners build artifacts for the express purposes of
meaning making. Clearly, CSCL researchers are describing research methods and results that mirror
those described by proponents of design experiments.
The data collection methods used in this study were in-situ, naturalistic methods, where I
was fully integrated as a participant in the research process, as suggested for developmental research
and design experiments (Barab & Kirshner, 2001; Cobb et al., 2003; Richey, 1997; The
Design-Based Research Collective, 2003). These methods included interviews, collaborative and
iterative design, and notes about impact and next steps.
The study was intentionally preliminary, as is the case with initial developmental research,
with the goal of describing impact, conditions that promote successful use, potential
improvements, and lessons learned (Richey, 1997). The primary intent of this study was to determine
if the tool has enough worth to suggest further study and development.
This study involved a design experiment that looked at the interaction affordances that
epistemic form tools provide inside online instructional materials. Similar to Hoadley's (2002)
research on the SpeakEasy discussion tool, I evaluated prototype epistemic form tools in stages. List
and table epistemic forms were initially selected for this study because they are among the simplest
and most commonly used forms (Collins & Ferguson, 1993; Morrison & Collins, 1996) and were
therefore expected to be applicable in a wide variety of contexts. In addition, a matrix form was built
(in order to meet the needs of the first Part 2/3 participant) and a version of it was subsequently
provided as another example to Part 1 study participants. These epistemic forms allowed
contributions of text into dynamically populated tables.
Figures 3.1-3.3 show three of the example forms that were developed for Part 1 of the
study. The names of the participants have been masked to protect their identity.
[Name masked] Ethical behavior is behavior that recognizes and adjusts for potential misuses of power, includes the needs of stakeholders, and attempts to be fair to stakeholders.
[Name masked] Ethical behavior is behavior that maximizes outcomes to the larger group (society?) without abusing position or power or influence.
[Name masked] Ethical behavior is behavior that does not violate the guidelines, or give the appearance of violating the guidelines.
[Name masked] Ethical behavior is behavior that is consistent with principles of proper and right conduct. Moral behavior is behavior that is good and right. That's how I think of it anyway.
[Name masked] Ethical behavior is behavior that does unto others what they would like for us to do to them.
[Name masked] Ethical behavior is behavior that comes from the humility that my definition of ethical behavior may not be another person's definition and that both definitions may be valid even if inconsistent.
[Name masked] Ethical behavior is behavior that fairly and justly accounts for the broader needs of the specific community or communities the behavior impacts.
[Name masked] Ethical behavior is behavior that does not make you queasy when you reflect on what you have done.
[Name masked] Ethical behavior is behavior that respects the rights and values of others while honoring an individual's freedom to exercise personal choice.
[Name masked] Ethical behavior is behavior that conforms to accepted professional, legal, social or moral standards of conduct that is consistently applied regardless of whether the conduct is public or private.
[Name masked] Ethical behavior is behavior that honors and respects the shared values of a given group. Ethical dilemmas often arise when different groups have different shared values and those values collide.
Figure 3.1. Ethics definition list example.
In the example above, participants created a list. They were asked to submit a definition of
ethical behavior. Instructional content appeared before this form and provided ideas, readings, and
directions for the activity.
[Screenshot of the rubric form. Entry fields were pre-filled with the prompts "Rubric question: Does the instructional page ..." and "This question is important because ...," followed by a Submit Your Question button. The table built beneath the form (names masked) read:]
name | rubric question | rationale for instructional sites
[Name masked] Does the instructional page use lots of colors? This question is important because it makes the page hard to read.
[Name masked] Does the instructional page allow students to get to main sections of the course (like syllabus, discussion forum, lessons) easily? This question is important because the course should be easy to use. The main areas are used a lot and students shouldn't have to do loads of clicking to get where they need to go.
[Name masked] Does the page apply good layout principles to make information easier to read? For example: in general, is text in tables aligned with the tops of cells rather than the center? Is there white space (padding) within cells? Does the text generally avoid full justification? Although web pages aren't print pages, reading is reading. In English and most western languages, the reader's eye has been trained to read straight across, left to right; having to jump up and down is difficult. Similarly, white space (cell padding, indents from the window border, etc.) costs nothing online but can aid in alignment of information.
[Name masked] Does the instructional page require some type of meaningful interaction (that goes beyond clicking mouse buttons and answering trivial questions)? This question is important because learning is enhanced by interaction. However, mindless interaction does not contribute to useful learning.
[Name masked] Does the instructional page allow documents that are likely to be read to be downloaded (PDF, MS Word, etc.) and printed? Or at least, is the page designed into tables that are easy to print? This question is important because it's hard to read large amounts of text online.
[Name masked] Does the instructional page allow the user to intuitively find and click on valuable navigation tools? Are they understandable, and can the user return to a previous page? This question is important because if the user can't navigate they most likely won't complete the lesson or may not be able to return and check for correct understanding on content elements.
In the next project, we'll learn some advanced Dreamweaver skills and you'll have the chance to bring your
HTML, interface design, site design, page design, and Dreamweaver skills all together...
Figure 3.2. Page design rubrics table example.
In the example above, participants created a table. They were asked to submit potential
rubric questions (in the first column) and a rationale for including each rubric question (in the second column).
Instructional content appeared before this form and provided links to previously studied materials
and directions for this activity.
[Screenshot of the Haddon's Matrix form: a matrix with factor columns including Physical and Sociocultural, cells containing sample entries such as "Big, gas guzzling SUV passing everyone it can," "automatically sends signal on crashed car," and "Regionalized," and a Submit your answers button.]
Figure 3.3. Haddon's Matrix example.
In the example above, participants created a matrix. They were asked to submit a response
in three of the nine cells. Instructional content appeared before this form and provided ideas and
directions for the activity.
Part 1. In Part 1, examples of the list, table, and matrix forms were built and evaluated by
online instructors and instructional designer/developers (see
http://www.learningpeaks.com/interactions_research/ for all examples used in Part 1 of the study).
Interviews were conducted individually or in small groups, in person or by e-mail or phone with a
structured list of questions (see Appendix A) as a starting point. Participants were encouraged to
provide any additional feedback they wished to provide.
As an established instructional technology consultant, I had access to potential participants
from a diverse range of educational settings (higher education, K-12, corporate training, consulting)
who use or desire to use online instructional materials. I solicited participation for Parts 1-3 via e-
mail (see Appendix B) and all who were interested were selected to participate. I asked those people
to recommend others in order to assure more diversity of feedback. The goal was to get a diverse
range of perspectives (Barab & Kirshner, 2001).
Part 2. The instructors and instructional designer/developers who expressed interest in
having a form built for their own use used templates (see Appendix C) to design a lesson that
incorporated one or more forms. I asked Part 2 participants questions about their lesson and we
discussed the rationale behind including the form(s) (see Appendix D). The lesson they designed
determined the functionality needed for the form that was included. Eric Replinger, an experienced
instructional developer with a Master's degree in instructional technology, and I built pages,
database tables, and programming links for each lesson. These were built with Dreamweaver MX
and PHP, and were hosted on my web host's server using the PHP server module and the
MySQL database server.
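The dynamic forms described here follow a simple store-and-redisplay pattern: a page accepts a learner's text submission, writes it to a database table, and re-renders the table of all submissions beneath the form. The study's actual PHP/MySQL pages are not reproduced in this document, so the following Python sketch (using the standard-library sqlite3 module, with hypothetical table and column names) only illustrates the pattern:

```python
import sqlite3

# Illustrative sketch only: the study's forms were built with PHP and MySQL.
# This uses Python's sqlite3 module and hypothetical table/column names to
# show the same store-and-redisplay pattern.

def create_form_table(conn):
    # One row per learner submission.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS submissions ("
        " id INTEGER PRIMARY KEY AUTOINCREMENT,"
        " name TEXT NOT NULL,"
        " response TEXT NOT NULL)"
    )

def submit(conn, name, response):
    # Runs when a learner presses the form's Submit button.
    conn.execute(
        "INSERT INTO submissions (name, response) VALUES (?, ?)",
        (name, response),
    )
    conn.commit()

def render_table(conn):
    # Rebuilds the dynamically populated table shown beneath the form.
    rows = conn.execute("SELECT name, response FROM submissions ORDER BY id")
    return [f"{name}: {response}" for name, response in rows]

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    create_form_table(conn)
    submit(conn, "Learner A", "Ethical behavior is behavior that is fair to stakeholders.")
    print(render_table(conn))
```

In a PHP implementation, the same work would be done by an INSERT query on form submission and a SELECT query that rebuilds the HTML table on each page load.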
I was in touch with the participants throughout development by e-mail and phone in order to
assure that I was developing what they wished. Copies of e-mail communications and notes from
phone conversations were kept for evaluation and coding. The actual instructional pages and forms
were tested by the instructor, Eric, and me prior to using them with learners in Part 3.
Part 3. After the lesson was designed by the instructor or instructional designer/developer
and the forms were built and tested, the instructor or instructional designer/developer notified
learners of the online activity (via e-mail or link from other instructional materials) and that they
were participating in my study. I solicited participation (see Appendix E) from learners in one of the
Part 3 activities in order to get their feedback on the tool. E-mail or phone interviews were conducted
with instructors and interested learners during and after the online activity took place. I used
structured lists of questions (see Appendices F and G) as a starting point, but events that took place
during the activity often led me to ask other questions as well. I asked follow-up questions by e-mail
and phone as needed.
Assurances and Confidentiality
All participants signed informed consent forms (Appendices H and I). The Human Subjects
Committee at the University of Colorado, Denver, approved this dissertation study (see Appendix J).
Qualitative data analysis commonly involves coding portions of the data in order for themes
to emerge (Krathwohl, 1998). Data in this study were coded using QSR's nVivo Qualitative Data
Analysis software, version 2.0. The following textual data sources were brought into nVivo for
coding: answers to interview questions, lesson documents, e-mails, and my notes from interviews
and phone conversations. Conversations between participants and me were not recorded and
transcribed. Instead, I took notes during in-person and phone interviews and transcribed these notes
in documents that were brought into nVivo. After Part 3 of the study concluded, I recorded
impressions of the resulting artifacts (see Appendix L) and these impressions were brought into
nVivo.
Table 3.2 shows initial coding themes that arose from my research questions and the literature.
Table 3.2
Research Questions and Initial Coding Themes
What types of content, strategies, and outcomes does the tool best support? Does the tool support knowledge building?
Initial coding themes: content; strategies; objectives (remember, use); outcomes; knowledge building
What is the nature of the interactions that occur when using the tool? In what ways are learners interacting with each other? In what ways are learners interacting with the content?
Initial coding themes: people interaction; content interaction
What is the nature of the artifact that is built? How does the tool compare to computer conferencing tools? What usability issues arise and what additional features are needed?
Initial coding themes: artifact; computer conferencing; usability; features
The first question references plans for, implementation of, and outcomes from instructional
activities. The content, strategies, and outcomes coding themes come from the instructional design
literature, specifically Merrill's (1994) discussion of strategies for designing instruction according to
types of content and desired outcomes of instruction. Instructional strategies are seen as activities
that are planned (by instructors or instructional designers) so that content can be taught and the
desired outcomes achieved (Merrill, 1994; Sugrue, 2002). The two objectives sub-codes, remember
and use, come from Sugrue's (2002) adaptation of Merrill's (1994) outcomes of instruction.
Knowledge building is a specific outcome in which artifacts are built for the purpose of sharing
understanding with others (Lipponen, 2002; Scardamalia & Bereiter, 2002).
The second question references interactions that occur during instructional activities. The
coding themes for the second question arose from Moore's (1989; 1993) distinction between
interactions with people and interactions with content.
The third question references the artifacts that result when learners submit text into these
forms in instructional activities. The artifact code comes from constructivist and activity theory
considerations about how people regularly build artifacts in the course of learning (Jonassen &
Rohrer-Murphy, 1999; Lipponen, 2002; Peal & Wilson, 2001; Scardamalia & Bereiter, 1994, 2002).
Computer conferencing is a common tool used to build dialog artifacts in online instruction (Bonk,
2002; Harasim, 1997; Harasim et al., 1996; Suthers, 2001). The usability and features codes come
from the assertion (Hillman et al., 1994) that interaction with technology needs to be added to
Moore's (1989; 1993) people and content interaction types. Usability concepts for online
instructional materials come from psychological and educational research about how humans
perceive and process information and utilize tools for learning (R. Clark, 2002; Kruper, 2002). The
usability code is for issues that impact ease of use. The features code was used to look at desired features.
Initial codes came from the literature, but I allowed for the possibility that additional codes
would be added inductively, as needed, in order to answer my research questions. The unit of coding
analysis (Boyatzis, 1998) was individual comments, which makes sense given that I hoped to gain
insights into how individuals, especially instructors and instructional designer/developers, viewed the
tool's use in instructional activities. When coding resulting artifacts, however, the unit of analysis was
both individual comments (my comments about the artifact and participant comments inside the
comments forms) and the artifact as a whole, which represented the outcomes from the activity.
Limitations of the Design
This study is a preliminary design experiment to determine the potential usefulness and
usability of epistemic forms for targeted knowledge building in online instructional materials. There
are some obvious design limitations.
1. Participants. Participants were chosen because they were willing to participate and because
they had experience and interest in tools for online learning. I specifically included participants from
a wide range of instructional contexts and domains but the study does not include a representative
sample of all potential users of this tool and additional testing in other environments will be needed
in the future.
2. Data collection. In most cases, I did not see people actually use the tool and instead relied
on their written or verbal comments and the artifacts that were created as they used them. It would be
useful to watch people use the tools, and this type of usability study should be considered for future research.
3. Time. Data were collected over a relatively short period of time. In order to gain insights
about prolonged use of these tools and the strategies and activities supported by them, the study
would need to include a longer time frame.
4. Limited set of tools. This study examined a limited set of epistemic form tools. Other
variations may prove to be more or less useful than the ones that were examined.
5. Multiple influences. The outcomes from a design experiment necessarily result from
hundreds of factors (The Design-Based Research Collective, 2003), and it will therefore be somewhat
difficult to determine the exact reason for specific outcomes and the expected replicability of these
outcomes in the future. Chapter 4 includes a detailed accounting of study events so the reader can
consider why certain outcomes occurred. Chapter 5 includes suggestions for additional studies that
will help confirm the outcomes from this study.
6. Participant researcher. Although design experiments generally involve the researcher as
an integral participant in the iterative design process and analysis of results, my involvement
certainly cannot be expected to be completely objective. My views, opinions, and expectations as an
instructional designer/developer can be expected to influence the process and analysis.
As a result of these limitations, the study's results may not be generalizable to other
participants or instructional situations (Krathwohl, 1998). Preliminary design experiments are
necessarily limited in scope. The primary intent of this study was to determine whether these forms
showed promise for further study and development in the future. Online instructors and instructional
designers/developers who are interested in the forms will need to determine if the descriptions and
the situations in which the tools were tested provide compelling evidence for their use of the forms in
their own instructional situations.
Plan for Reporting Findings
My original intent was to analyze and report on the data in a sequential manner: Part 1, Part
2, and Part 3. It quickly became apparent that data from the three parts are interrelated and therefore
part of one data set. Data from Part 3 provided more complex and compelling insights than those
gained in earlier Parts. Data were therefore coded and analyzed concurrently during all three parts of
the study. A discussion of how additional codes were added is provided in Chapter 4. I sought
patterns that answered my research questions and provided guidance for continuing development of
these epistemic forms. To help with writing the findings and conclusions sections of the study report,
I kept a sequential log of my thoughts about emerging themes and implications.
Design itself is iterative, and it is through the process of iteratively designing, evaluating,
and improving the design that the researcher comprehends what works and in what circumstances
(Hoadley, 2002; Koschmann, 2002; Lipponen, 2002; Stahl, 2002). Information gained informs the
next steps and uses for an innovation. Since this study preliminarily evaluated the design of a
potential new tool for people interaction in online instructional materials, a design experiment
provided the type of information needed to inform future development and future research needs.
Chapter 4 provides a thick enough description of the study's process and outcomes so that the reader
can make a determination of the forms' value along with me.
The purpose of this design experiment was the evaluation of a new tool for people
interaction in online instructional materials. This study evaluated dynamically created epistemic
forms embedded in online instructional materials, as a mechanism for learner interaction and
knowledge building. Since social interaction affords desirable outcomes for instruction (Anderson,
2002; Brown & Duguid, 1989; M. G. Moore, 1993; Nunn, 1996; Scardamalia & Bereiter, 1994) and
computer conferencing has limitations which make it less than optimal for some types of social
interactions (Burge, 1994; Cartwright, 2000; Harasim, 1997; Shank, 2002), evaluating new tools that
can be used for social interaction in online instructional materials is a valuable endeavor. The
primary purpose for this kind of study is to determine whether the innovation being studied shows
enough value to be developed and studied further (The Design-Based Research Collective, 2003).
Therefore, this study should be viewed as a first step in the process of investigating the merits of this tool.
The study took place in three parts. In Part 1, instructors and instructional
designer/developers evaluated examples of epistemic forms that I had created. In Part 2, instructors
and instructional designer/developers designed lessons and suggested a form or forms to go along
with the lesson. In Part 3, the Part 2 lessons and forms were built and tested in actual use. The
overarching question I hoped to answer was whether dynamically populated epistemic forms show
promise as a tool for social interaction in online instructional materials.
Initially, I mapped specific data sources to specific research questions. It turned out,
however, that data from many of the sources provided clues to each of the questions and triangulated
data from other sources. For example, I assumed that the bulk of data about usability and features
would come from observing how the forms were used and the problems that arose. Instead, much of
the usability and features data came from the iterative design process and discussions with the
instructor or instructional designer/developer about their lessons. In other words, data to answer the
three research questions came from a variety of data sources and the data sources reinforced each
other. Table 4.1 recaps the study questions and data sources for each of the parts of the study. These
data sources will be described in more detail later in the chapter.
Table 4.1
Recap of Study Research Questions, Data Sources, and Data Analysis Methods

Specific research questions (addressed in all three parts):
What types of content, strategies, and outcomes does the tool best support? Does the tool support knowledge building?
What is the nature of the interactions that occur when using the tool? In what ways are learners interacting with each other? In what ways are learners interacting with the content?
What is the nature of the artifact that is built? How does the tool compare to computer conferencing tools? What usability issues arise and what additional features are needed?

Part 1
Data sources: face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix A); artifact (form built by study participants); my notes (general observations and themes)
Data analysis: code for activities, interactions, and artifact

Part 2
Data sources: lesson designed by instructor or instructional designer/developer (see Appendix C); face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix D); records of e-mails and phone conversations; my notes (general observations and themes)
Data analysis: code for activities, interactions, and artifact

Part 3
Data sources: artifact (form built by participants); face-to-face, e-mail, or phone interviews with instructor or instructional designer/developer (see Appendix F); interviews with learners (see Appendix G); my notes (general observations and themes)
Data analysis: code for activities, interactions, and artifact
Developmental research and design experiments (Barab & Kirshner, 2001; Cobb
et al., 2003; Richey, 1997; The Design-Based Research Collective, 2003) suggested the data
collection and analysis methods used in this study, including interviews, collaborative and iterative
design, and notes about impact and next steps. Data were collected during March and April of 2003.
I planned to collect data from a minimum of 10 instructors and instructional designer/developers in
Part 1, five instructors and instructional designer/developers in Part 2, and two instructors and eight
learners in Part 3. I planned to collect data for Parts 1, 2, and 3 more or less sequentially.
In the rest of this chapter, I describe the participants of the study and important details about
the activities that took place in each part of the study, with special emphasis on Part 3 activities.
After this, I describe how the data were analyzed and then I describe the data that emerged in each of
the codes. It is difficult not to initiate a discussion of the implications of the data while presenting it.
That discussion is begun here but a much deeper discussion of these findings follows in Chapter 5.
Description of Participants
I solicited instructor and instructional designer/developer participation at the end of
February 2003 via e-mail (see Appendix B). Interest and participation varied widely. Some
instructors and instructional designer/developers were available immediately and others were
available later. Some initially agreed to only participate in Part 1 but after further consideration
decided to participate further. Since Part 3 required actual implementation of the lesson and form(s),
participation occurred as available instructional circumstances presented themselves. The first Part 3
participant was a medical school professor, IN1, who wanted a matrix for use in one of her courses.
This occurred during the first week of data collection. It became clear that data from the three parts
of the study could not be collected sequentially. It also became clear that data from the three parts
were interrelated and part of one data set. In other words, data collected from actual use of the tool
provided further insights to those collected from looking at the examples in Part 1. Data from all
three parts of the study were therefore coded and analyzed as one data set.
A range of instructors and instructional designer/developers from diverse educational
settings (K-12, higher education, corporate training) participated in the study. I initially solicited
participation from seven instructors and 20 instructional designer/developers. Of those I solicited,
two instructors and eight instructional designer/developers participated in one or more parts of the
study. I asked these people to recommend others and they involved an additional seven instructors
and five instructional designer/developers. A total of nine instructors and 13 instructional
designer/developers participated in all three parts of the study.
Table 4.2 provides an overview of who participated in the study. There are essentially two
categories of participants. One category is the instructors and instructional designer/developers who
evaluated the example forms in Part 1 of the study, some of whom designed lessons and
implemented them in Parts 2 and 3 of the study. Throughout the rest of this chapter, these people will
be referred to as participants. The other category includes the people who participated in the
activities that were designed by instructors and instructional designer/developers for Parts 2 and 3 of
the study. Throughout this chapter, these people will be referred to as learners, even though not all
study activities were formal learning situations.
Each instructor or instructional designer/developer participant is represented in the table by
a single number. IN is an instructor participant and DD is an instructional designer/developer
participant. Symbols indicate whether these participants work in K-12, higher education, or corporate
training environments. Participants who work together in the same organization are grouped together
on one line (e.g., DD3, 4, 5, 6, 7).
Learners ranged from students in an online high school course to trainers and instructional
designers willing to test out a variety of activities designed by a well-known instructional designer
(DD12). In all cases, learners knew that I was observing their interactions and the form as it was
being built. They are not represented by individual numbers because it was difficult, in some cases,
to distinguish between them. I had the opportunity to interview a few of IN1's and IN2's students by
e-mail, but in most cases, I had to glean information about the thoughts of learners from their input
into the forms. DD12 included feedback forms in all of his activities, so much data about learner
thoughts were available from those activities. Learners, however, were not always required to use
real names, so I sometimes had to guess at the actual number of learners (since a single learner could
potentially use more than one name or pseudonym).
Table 4.2
Overview of Study Participants

Part 1
Instructors: IN3, 4, 5, 6; IN7
Instructional designer/developers: DD1, 2; DD3, 4, 5, 6, 7; DD10; DD11; DD12; DD13

Part 2
Instructors: IN1; IN4
Instructional designer/developers: DD1; DD8; DD12

Part 3
Instructors: IN1, 2; IN4, 9
Instructional designer/developers: DD1, 2; DD12
Learners:
IN1, 2: included 13 learners
IN4: Activity 1 included 7 learners; Activity 2 included 6 learners; Activity 3 included 8 learners
DD12: Activity 1 included ~9 learners; Activity 2 included ~19 learners; Activity 3 included ~5 learners; Activity 4 included 5 learners

Note. Symbols (not reproduced here) denote K-12, higher education, and corporate training settings.
Description of Study Activities
Table 4.3 provides an overview of the activities that took place in all three parts of the study.
Table 4.3
Overview of Study Activities
Part 1 Seven examples were evaluated by 6 instructors and 13 instructional designer/developers
Part 2 Ten activities were designed by 3 instructors and 2 instructional designer/developers
Part 3 Nine activities designed in Part 2 were built and tested in actual use:
Test A: 2 instructors (IN1 and 2) used their activity between the 1st and 2nd class meetings; 13 learners participated in the activity; 5 learners participated in e-mail interviews
Test B: 2 instructors (IN4, 9) built and used three activities in an online high school
Activity 1: 7 learners participated
Activity 2: 6 learners participated
Activity 3: 8 learners participated
Test C: 2 instructional designers (DD1 and 2) tested a form for specific uses
Test D: 1 instructional designer (DD12) tested four instructional activities
Activity 1: ~9 learners participated
Activity 2: ~19 learners participated
Activity 3: ~5 learners participated
Activity 4: 5 learners participated
In order to provide background information for the rest of the chapter, additional
information about each of the activities is presented.
Participation in Part 1 was fairly similar across all 19 participants. Instructors and
instructional designer/developers reviewed examples of lessons (each of which included a form) that
I developed. They then answered questions about them (see Appendix A) in person, by phone, or
through e-mail. I followed up by e-mail or phone if I had additional questions, and these comments, if
any, were added to each participant's interview notes. As the study progressed, themes about what
was happening began to emerge so I kept an ongoing, chronological log of my thoughts in order to
remember them. Some participants entered text into the example forms and screenshots of the
artifact (the completed form) and the text within them were documented. These data sources
(interview notes, my notes, and artifacts) were later coded and analyzed in order to answer my research questions.
I had hoped to have most of the Part 1 participants design a lesson using one or more forms
for Part 2, but in general participants were not willing to put forth the effort required to design a
lesson unless they wanted me to build the lesson and form for their use in Part 3. The design of the
lesson and the resulting lesson artifact became part of my data. In some cases, I worked with the
instructor or instructional designer/developer to develop the activity, but in others the instructor or
instructional designer/developer worked alone, using the example forms from Part 1 as examples for
their own activities. In those cases where I communicated with the instructor or instructional
designer/developer about their lesson, the text from e-mails or notes from phone or in-person
conversations was written up as notes. An example of the notes from IN4 is included in Appendix K
(identifying names have been removed). As themes emerged from these activities, I added them to
my log. These data sources (lesson documents, notes from working with the instructor or
instructional designer/developer, and my log) were later coded and analyzed in order to answer my research questions.
In Part 3, the lesson or activity designed in Part 2 was built and put on the Web. Then each
activity was implemented with learners. I took screenshots of the activities as they unfolded and my
impressions of the artifact were documented (see Appendix L for an example of these notes). After
the activity was completed, the instructor or instructional designer/developer answered questions
about the activity (see Appendix F) through e-mail. I followed up by e-mail or phone if I had
additional questions, and these comments, if any, were added to each participant's interview notes.
As themes emerged from these activities, I added them to my log. These data sources, artifacts and
text, instructor or instructional designer/developer interview notes, learner/activity participant
interview notes, and my log were later coded and analyzed in order to answer my research questions.
Because the Part 3 activities differed greatly from each other, additional details about each
of the activities are presented below.
The activity for Test A was designed by a higher education professor (IN1) for use
between weeks 1 and 2 of a classroom-based, graduate-level higher education course. The professor
wanted students to gain skill using a Haddon's Matrix, a tool for considering injury prevention
strategies.
The professor explained that doing this activity in class during the first class session would
not allow students enough time to fully consider the range of strategies that might be employed. She
hoped that doing the activity online would allow students time for more productive consideration so
that they could suggest more creative solutions. Each student was required to fill in answers in three
cells. Figure 4.1 shows the completed matrix that the students built. Names have been masked to
protect student identity.
[Screenshot of the completed Haddon's Matrix. Pre-crash entries included a graduated licensing
system, hands-free cell phones (built into the car), anti-DWI campaigns and car starter devices, open
container laws, tighter vision requirements, stricter penalties, road repair, a public campaign to
reduce sleep deprivation, automatic obstacle proximity detectors, promotion of designated drivers,
defensive driving courses, siped tires for improved traction, urban "no car" zones, and reduced use
of distractions (radio, telephone, etc.). Crash entries included airbags, built-in car infant seats,
crumple zones, collapsing steering columns, crash guardrails, turning off the airbag when a child is
in the front seat, and automatic weight detection of the front seat occupant to turn the airbag on or
off. Post-crash entries included knowledge of basic life support (BLS) and/or CPR, less flammable
fuel, hi-tech emergency care, regional trauma care, GPS for crashed vehicles, red "help" phones,
teaching use of a cell phone to get help, a "panic button" installed in cars to alert rescue, and
training for paramedics. A "Submit your answers" button appeared below the matrix.]
Figure 4.1. Haddon's Matrix form built by students in a graduate-level injury prevention course.
In addition to the text in the form, 5 of the 13 students in the course agreed to let me
interview them by e-mail after they completed the activity (see Appendix G).
Test B involved three separate activities designed by a high school teacher (IN4) for an
online earth science course, part of an online high school program for at-risk teens. Figure 4.2 shows
the first activity, which required students to synthesize information about prehistoric landscapes
found on a museum website into a table. Each student was given a landscape to study and they each
filled in a row of the table. Names have been masked to protect student identity.
[Screenshot of the completed table. Columns: Name, Landscape, MYA, Formation, Period, Plants,
Animals, CO example, Other. Student rows covered landscapes including Colorado's East Coast
(Dakota Sandstone, Cretaceous, Iguanodon, viewable at Dakota Ridge), Submarine Colorado (Pierre
Shale, Late Cretaceous, pterosaurs, Dinosaur Ridge), a Late Jurassic meadow (Morrison,
Apatosaurus, Dinosaur Ridge), The Rockies Explode (Castle Rock Rhyolite, Cenozoic, titanotheres,
Castle Rock and Monument Hill), Red Dirt World (Dawson Arkose, Early Eocene, crocodiles and
the hippo-like Coryphodon, Parker), the ancestral Rockies (Fountain, Pennsylvanian, early reptiles
and insects, Red Rocks and the Boulder Flatirons), and the Front Range today. Below the table, an
empty entry row and a SUBMIT button allowed each student to submit the answers for their
assigned landscape.]
Figure 4.2. Ancient landscapes form completed by students in an online high school Earth science course.
The second activity was similar to the first, requiring students to synthesize information
from another museum website about rock layers and fossils into a similar table. The third activity
was a matrix activity, asking students to consider advantages, disadvantages, and consequences of
three potential courses of action regarding fossils found in a construction zone. Each student was
required to fill in an answer in each of the columns. Figure 4.3 shows this activity. Names have been
masked to protect student identity.
[Screenshot of the completed matrix. Columns included advantages, disadvantages, and
consequences now and later for each course of action. Student entries weighed considerations such
as proximity to home, the expense of building and maintaining a new museum, smaller crowds in a
smaller town, having to wait to study the fossils, nearness to where most fossils are found, the
chance to study the fossils and maybe find new ones, and increased popularity and customers.]
Figure 4.3. Portion of comparison form built by students in an online high school Earth science course.
In Test C, two instructional designer/developers (DD1 and DD2) working in the distance
learning division of a small private university wanted to see if these forms could be used for
overcoming specific difficulties that they experienced with using the computer conferencing system
for some of their courses. They wanted a table where students could paste in a graphic, MathML (a
way to include mathematical expressions in Web pages), or programming code and allow other
students to comment on it.
Dr. Sivasailam Thiagarajan (DD12) set up four tests to see if the forms would be valuable
for building certain types of instructional activities. Dr. Thiagarajan is an internationally recognized
expert on games and simulations for learning. My dissertation topic was largely inspired by his work
and, as a result, he agreed to test these forms and be an outside reader for this dissertation. He agreed
to allow me to identify him in this study in order to properly credit his contribution to the study.
Dr. Thiagarajan had many thoughts about how the forms could be used. He selected a few
of his ideas to try and we worked together on building the activity and forms. As I built them, we
regularly discussed pros and cons of various options and often made changes to the forms before or
during the activity. I kept a log of our phone and e-mail conversations for each of the four activities.
He invited instructional designers and trainers in a variety of corporate and government training
organizations to try these activities and provide input about the activities and the forms. I participated
in the first two activities.
The first activity involved building a story about a fictional performance technologist
working with cultural issues. He began with a few sentences of text to start a story and asked
participants to submit additions to the story. During the activity, Dr. Thiagarajan and I talked about
the need to allow participants to add comments about the story or the process. During the activity, I
added a comments column to the form and participants began adding comments as well as adding to
the story. Figure 4.4 shows a portion of this activity. The left column contains the story and the right
column contains participant comments. Names, other than Dr. Thiagarajan's (Thiagi), have been
masked to protect learner identity.
[Screenshot of the two-column form; story segments and participant comments were interleaved in
this extraction. The final story segment read: "A wave of exhaustion flooded over her. Facing the
Commander at the moment was impossible. She would feign illness. Quickly she returned to her
room, and crawled into bed with a novel. She was soon lost in a world she could understand."
Participant comments included the following.]

Thanks to [name] for coming up with an amusing closing (for now, anyway).

Thiagi & I discussed this format on a trip to Indy yesterday. I personally found it frustrating; I read a
lot of fiction, and the lack of focus and seemingly incompatible goals of the co-authors made this
hard for me to read. Since Thiagi forced me to write another paragraph, I looked it over and tried to
come up with something that would tie together the previous paragraphs and provide a strong hint
for a future direction. I clicked the button, and the result looked good ... until Thiagi pointed out
that my paragraph had been "pre-empted" by [name]'s ([name] had inserted a different paragraph
between the time I started writing mine and the time I clicked the button). That stopped me cold, and
made me not want to contribute again.

I guess overall my two main problems are a lack of focus, and uncertainty about "who gets to go
next". Perhaps these don't bother other people as much. Thiagi & I are working on designing a new
format that might address these issues.

I can't stop myself! I share the frustration mentioned about the multiple directions the story seemed
to take. Being in performance consulting (whatever that is...) I both identified with Chandra and
wanted her to overcome the challenges of being in a field that has much to offer but also one that is
often misunderstood. I learned a thing or two from her epiphanies. For me the story was something
of a cliffhanger; I came back on several occasions to see where things were going. I am interested in
the first comment above. As I read the story, I have a sense of cohesion. (Maybe because my own
experience of performance consulting is so disjointed?) :)

I meant to note this a moment ago: While frustrating in some ways, it was nice to have what
technically could be viewed as a discussion forum in which people, for the most part, creatively
built on the ideas of others while adding their points of view. I have found that in other discussions,
ideas are critiqued overly quickly without opportunity to develop.

This exercise was really interesting to me. Because I was tied up in a conference, I didn't get a
chance to read or add to the story until a lot of creative things had happened. I wanted to contribute
but I felt stuck as to where to go with the buzzing/grinding thing that was being lifted on to the
desk! Feeling out of my element, I thought maybe Chandra
Figure 4.4. Activity 1: Portion of first interactive fiction form.
The second activity evolved from lessons learned during the first activity. This was a
similar interactive fiction building exercise but Dr. Thiagarajan wanted to try having two tables, one
for the story and one for the comments. Figures 4.5 and 4.6 respectively show portions of these two
forms. Names and identifying information, other than Dr. Thiagarajan's (Thiagi), have been masked
to protect learner identity.
So out came the markers and Chandra's creativity to depict a visual that all could use to understand
her mission. Crossing all language and cultural boundaries, Chandra drew her mission, depicting the
ISS members, and their culture, together and in harmony. Chandra took her mission and posted it in
the communal area in the ISS. Stepping back to admire her creation, simple yet effective, Chandra
suddenly realised that others had gathered around to view her creation. She turned to see their
reaction. "What is that thought that I am seeing?", mused Chandra to herself.

"Where are the pictures?" asked Che. "Pictures, what do you mean pictures?" quizzed Chandra.
Then it dawned on her; even though everyone on the station spoke and could read English, no two
saw it the same way. Just at that moment there was a brief shudder through her gravitational boots.
Nothing major, just that "oh oh" sense you get when you reverse over the kid's skateboard on the
driveway.

Chandra's brain was clicking at a high speed and she was multitasking frantically. Her immediate
task was to respond to Che and move to a more nonverbal, iconic way of depicting her performance
improvement mission. But at the same time, she recalled that the Russian scientist and the US
scientist were fighting about plutonium levels in the generator. She was not sure whether the
argument was due to national differences, cultural differences, or language differences. Or could it
be because there was genuine difference of opinions between two prima donna scientists? She will
have to find out very soon. Chandra excused herself hurriedly from her spectators and strode to the
conference room X-15 where she was told the Russian and the US scientists are arguing.

On her way to the conference room, Chandra had to pass through the mile-long inter-dome
traversable tube. She paused to look at the galaxy surrounding her, amazed as always with the
beauty around her. At the stars, planets and comets. That's when it happened; it hit her like a 2-ton
moon rock! It isn't about changing people, but about accepting people. Accepting the differences,
the uniqueness, even the quirkiness of each member of the team. We can be as different and
awesome as what I see before me! The first step is to find a common ground for communication.
Never mind the plutonium problem! I need to sketch this out and brainstorm with some of my
closest associates.

And there it was...crystal clear to her...her mission as the "Resident performance consultant":
making people accept and even enjoy the diversity available in this rich and varied working
environment. At least the "what" was clear, now for the "how".
Figure 4.5. Activity 2: Portion of second interactive fiction story form.
Thiagi: If my count is correct, we have 12 different co-authors. If you are bashful, you can always
contribute a story segment under a pseudonym. We are not technically competent to figure out who
sent what (and nor do we get paranoid).

[name]: I am enjoying the flow of the story and seeing people build on each other's creativity but
have found myself feeling the need for more of a target than just a rolling story. Must be too goal
oriented. Maybe I just forgot.

Thiagi: Here's a goal for you: Can you emphasize the importance of doing unto others what they
would like to be done unto them (instead of merely doing what you would like them to do unto you)
in cross-cultural interactions? This may take more than one story segment to develop. But can you
push the story in that direction?

[name]: Wow, am I impressed! What a great twist with the fighting scientists. I had no idea how
that would turn out, but it is light-years beyond what I expected!

[name]: Nice set up, [name]. It opens up several great possibilities for growth. Thanks for letting me
play.

[name]: Well, I learned again not to let my Thiagi e-mail rest unopened in my inbox too long before
reading it. I missed out on a lot of interaction. By the way, I am an internal performance consultant
at the [organization masked], working with the directors and personnel of the [masked]. I agree with
[name] that we need a focus to bring application to the discussion. I hope I didn't put too much
emphasis on that for others.
Figure 4.6. Activity 2: Portion of second interactive fiction comment form.
The third activity was a debrief list built to allow people who had participated in one of Dr.
Thiagarajan's e-mail games to comment on the game and the process. In the fourth activity, Dr.
Thiagarajan had five learners build five different team building cases incrementally on five
consecutive days. Learners built a different part of each case every day in succession so the
completed case was the product of all five learners' work. An additional comments form allowed
learners to comment on the cases and the process. Figures 4.7 and 4.8 respectively show portions of
one of the cases and the comment form.
Day: Mon. Player: 1. Instructions: Think of an imaginary (but authentic) situation in which the
formation of a team is announced. Specify these types of background information: the organization
of which the team is a part, the reason for organizing the team, the make-up of the team (number,
members), the mission for the team, the team leader, and the sponsors of the team. Case
contribution: The Dream Team. The small government agency suddenly had a need to create
"quality teams." A functionally diverse, yet surprisingly intelligent team was recruited from various
units of the agency. All the recruits shared one dream: to better the work flow and the esprit de
corps of the agency.

Day: Tue. Player: 2. Instructions: Imagine what happens during the FORMING stage of the
formation of the team. Refer back to the website page on Team Development Stages to refresh your
memory. Select a few salient incidents related to the behaviors, reactions, feelings, and statements
of different members of the team. Describe these items succinctly. Case contribution: Some
problems arose immediately: what does it mean to "better the work flow"? How could they improve
"the esprit de corps" of the agency? One group within the team felt that "bettering the workflow"
simply meant speeding up the process, so that the end results could come quicker. Another thought
[illegible] simplifying the tasks of people who do the actual work: greatest good for the greatest
number. And nobody seemed to agree on how to improve esprit de corps.

Day: Wed. Player: 3. Instructions: Imagine what happens during the STORMING stage of the
formation of the team. Refer back to the website page on Team Development Stages to refresh your
memory. Case contribution: The team agreed to meet weekly, but that seemed to be all they agreed
on. After three weeks of meeting, they were no closer to
Figure 4.7. Activity 4: Portion of team development Case 1 form.
A QUESTION TO THE FIVE PLAYERS: In my original note, I implied that this activity will take
about 10-15 minutes of your time. I probably lied. On an average, how much time do you spend on
this activity?

I was tardy today, sorry. Let me catch up with all the debrief I've missed here. No, there is too
much...let me sum up. I agree there should be more column 3 space. Looking forward to Round 6.
Yes, I've read the other cases: I like them! I am probably not spending much more than 15-20
minutes. I'm worried that I'm too wordy, though.

Oops. I messed up BIG, too. I have several questions answered in the above cell.

No problem. You are wordy, but your words are worth reading.

THURSDAY Road Runner Award goes to [name]. He was the first one to post his segment. You
may now go back to sleep, [name].

I don't think it's fair that East coasters and folks who don't sleep get a jump on the others.

I find I'm spending more time now that the stories are longer. I'm needing to read the story from the
beginning & am putting more thought to continuing the story line. But it's still probably about 20
minutes. I also love the idea that the window to enter text would be larger; it's way too small
now...hard to review what you've written before submitting. Should also have a link to the team
development info right by the window so you can click from there to remind yourself of the
characteristics of each phase.

Thanks for your valuable comments and wonderful work. In a previous run of a similar game, I
always created my segment on WORD, edited it, and then cut-and-pasted it inside the text box. For
the time you have spent on this pilot test, you will get a special
Figure 4.8. Activity 4: Portion of team development cases comment form.
Parts 1-3 of the study yielded 48 documents. The number, type, and contributors of the documents
are described in Table 4.4.
Table 4.4
Documents and Contributors
Document Type # Documents Contributors to Document
Part 1, Interviews with instructors, instructional designer/developers 10 Part 1 study participants (instructors, instructional designer/developers)
Part 1, Artifact impressions 4 Patti Shank
Part 2, Lesson documents 3 Part 2 study participants (instructors, instructional designer/developers)
Part 2/3, Notes from e-mails and phone conversations (many included lesson details) 9 Primary: Part 2/3 study participants (instructors, instructional designer/developers) Secondary: Patti Shank
Part 3, Interviews with instructors, instructional designer/developers 5 Part 3 study participants (instructors, instructional designer/developers)
Part 3, Interviews with learners 6 Part 3 activity participants (learners)
Part 3, Artifact descriptions 9 Patti Shank
Part 1/2/3, Chronological notes (impressions, themes) 2 Patti Shank
I began by precoding NVivo with my initial coding themes (see Table 3.2) because these
were the critical issues that emerged from the literature review, as depicted in my problem space (see
Figure 1.1). As I coded the documents, I added a few specific subcodes because these concepts had
emerged as having explanatory value for establishing the value of the tool. I specifically did not
follow all potential concepts and subconcepts because the study is a preliminary one to see if the tool
should be studied further and it therefore made sense to limit the scope of the study.
After coding the documents once, I coded them again to verify initial coding and to check
the subcodes added subsequently. I marked those documents that were more complex and coded
them once again. As I added subcodes, I created a code definition document, Table 4.5, to assist me
in determining the proper coding.
Table 4.5
Code Definitions

Research Question 1: What types of content, strategies, and outcomes does the tool best support?
Does the tool support knowledge building?
Instructional Activities: The codes in this group reference plans for, implementation of, and
outcomes from instructional activities.
  content: What range of instructional subject matter can this tool be used with? Includes categories
  (e.g., K-12, undergraduate, graduate, training) and domains (science, math, business).
  strategies: What instructional methods and activities do these tools support?
    -objectives: What kinds of objectives does the tool support?
      remember: recall or recognition objectives
      use: construction or performance objectives
    -directions: Do instructions impact tool use and the activity?
    -facilitation: How does facilitation and discussion of process impact the activity?
    -time: How does time influence tool use, and tool use influence time?
  outcomes: What results are expected or arise when using these tools?
    -social: Does using the tool convey social interaction?
    -knowledge building: Do participants construct something of value to the group, something
    people in the group would want to print, keep, and use later?

Research Question 2: What is the nature of the interactions that occur when using the tool? In what
ways are learners interacting with each other? In what ways are learners interacting with the
content?
Interactions: The codes in this group reference interactions that occur during instructional activities.
  people interaction: What types of people interactions does the tool afford?
    -respond to others: How does the tool support or limit the ability to directly or indirectly react to
    others' contributions, such as drawing on others' contributions, replying to or adding on to
    earlier contributions, or offering a contribution with the intention of having others reply or add
    on to it?
    -return: Do learners return to view others' contributions?
    -anonymity: How does identity impact interactions?
  content interaction: How does the artifact become additional content?

Research Question 3: What is the nature of the artifact that is built? How does the tool compare to
computer conferencing tools? What usability issues arise and what additional features are needed?
Tool/Artifact: The codes in this group reference the artifact that results from instructional activities.
  visual and physical attributes: What are the attributes and characteristics of the resulting form, and
  what is the effect of viewing the artifact embedded in instruction?
  computer conferencing: How does the tool compare to computer conferencing as a tool for people
  interactions?
  usability: What attributes and characteristics impact ease of use?
  features: What additional tool features are desired?
Overall, I coded 830 text passages in 48 documents. I had difficulties, at first, settling on an
arrangement of the codes and subcodes, because each code was related to so many others. It seemed
that almost every text passage could be coded into multiple places. Many of the text passages, for
instance, could be coded into the Instructional Activities coding category because the study involved
numerous instructional activities. Upon further analysis, it made sense that a great deal of the
passages could be coded in the first coding category because the code arrangement matched my
problem space, a nested series of issues related to online instructional activities. Instructional
activities (question 1) involve interactions with people and content (question 2). These interactions
are mediated by tools which produce artifacts (question 3). The fact that the problem space is nested
necessarily means that there are many interrelations between the codes. The final arrangement of
codes directly matched the nested problem space. The final coding of text passages reflects this
nested problem space as well.
In order to test the validity of my coding, I performed an interrater agreement test, using
two text passages from each of the eight main codes: content, strategies, outcomes, people
interaction, visual and physical attributes, computer conferencing, usability, and features. I did not
include the content interaction code because this code contained only my own descriptions and
impressions from the four Part 1 example artifacts and the five Part 3 artifacts. Eric Replinger, an
instructional designer and developer who helped me develop forms for Part 3 of the study, rated 16
passages (see Appendices M and N for test directions and rating results). We agreed on ratings for 14
out of the 16 passages, for a simple agreement of 87.5%. Cohen's kappa was computed in order to
adjust for chance agreement, and the result was 85.7% agreement.
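As a rough check on these figures, both statistics can be computed directly. The sketch below is illustrative only: the actual distribution of the 16 passages across codes is reported in Appendices M and N, not here, so the sample ratings are hypothetical, chosen so that 14 of the 16 ratings match.

```python
# Illustrative only; the real rating data are in Appendices M and N.
# These hypothetical ratings assume two passages per main code and two
# disagreements, which reproduces the reported 87.5% simple agreement.

def percent_agreement(rater_a, rater_b):
    """Proportion of items both raters coded identically."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (po - pe) / (1 - pe), where pe is the chance
    agreement expected from each rater's marginal code frequencies."""
    n = len(rater_a)
    po = percent_agreement(rater_a, rater_b)
    pe = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (po - pe) / (1 - pe)

codes = ["content", "strategies", "outcomes", "people interaction",
         "visual/physical", "conferencing", "usability", "features"]
rater_1 = [c for c in codes for _ in range(2)]  # 16 passages, 2 per code
rater_2 = list(rater_1)
rater_2[0] = "usability"   # hypothetical disagreement 1
rater_2[2] = "features"    # hypothetical disagreement 2

print(round(percent_agreement(rater_1, rater_2) * 100, 1))  # 87.5
print(round(cohens_kappa(rater_1, rater_2) * 100, 1))       # 85.7
```

With this particular hypothetical distribution, expected chance agreement pe works out to 0.125, so kappa = (0.875 - 0.125) / (1 - 0.125), or about 0.857, consistent with the 85.7% reported above.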
In this section, I describe findings for each of the codes and subcodes in the three primary
coding categories (each category matches one of the research questions). For most codes and
subcodes, I present examples of participant comments to exemplify the types of comments received
about the tool and the activities. Comments have not been changed, with the exception of adding text
in brackets for clarity. I participated in the examples in Part 1 and in two of the Part 3 activities
(both of Dr. Thiagarajan's interactive fiction activities) and whenever I offer my own comments as
examples, that is disclosed. Some of the implications of these findings are described in this chapter,
but the primary discussion of implications occurs in Chapter 5 in order to answer the study's research
questions.
Instructional Activities Coding Category
This is the coding category for the codes that reference plans for, implementation of, and
outcomes from instructional activities. The data from the codes and subcodes in this category are
meant to answer the first research question: What types of content, strategies, and outcomes does the
tool best support? Does the tool support knowledge building?
Content
This code was used for texts describing the range of instructional subject matter (domains)
and levels of instruction (K-12, undergraduate, graduate, training) that participants believed the tool
was valuable for, as well as the range of actual uses that the tool was used for in Part 3 of the study.
Participants described how the tool could be used in a variety of content areas and the entire range of
online K-12, higher education, and training instructional materials. Here are some specific
participant comments about the versatility of the tool for supporting a wide variety of content.
IN8: Seems to support many types of objectives and content.
DD10: What was striking in the examples was the wide variety of courses that
the forms supported.
DD11: I was surprised at the variety of things you could use it for. It was
much broader than I had at first thought.
A few of the content areas and levels specifically mentioned by participants include evidence-based
medicine (graduate level higher education), research on ancient landscapes (high school earth
science), sales training (corporate training), mathematical reasoning (high school mathematics),
creativity (corporate training), pharmaceutical product knowledge (corporate training), web design
(higher education), interactive fiction (corporate training), and reading (high school language arts).
None of the participants mentioned using the forms in elementary or middle school but that is
expected because none of the participants work in these environments.
It was clear from participant comments as well as the artifacts generated in Part 1 and Part 3
that the forms could be used to both generate content and explore existing instructional content. IN4
used the tool to allow high school students to synthesize existing content from a website into a table.
INI used the tool to have students apply a model for injury prevention by generating examples. Dr.
Thiagarajan (DD12) used the tool to allow participants to create a fictional story about a performance
technologist on the International Space Station. He added another table so participants could debrief
the story as it was happening and afterwards. These uses illustrate the potential range of possibilities
for exploring or generating content.
A few participants commented that the tool appeared to support certain types of content better than
others:
DD8: Tool initially seems to be best suited for non-technical course content or
where opinion or no right/wrong answer is OK.
DD9: Offhand I'd say it'd be more useful for less objective content. If the
topic [was] the Civil War, you wouldn't use this to get names and dates of battles,
except possibly to assess the incoming level of knowledge. Even then, you might
prefer a one-on-one medium like e-mail, or a pretest, in order not to make people
DD11: The tool is probably better suited for content that should engender
discussion rather than things for which there is a correct answer.
These comments suggest that the tool may work best for higher levels of instructional
objectives, a suggestion that corresponds to data collected in the Strategies and Objectives code and subcodes.
This code was used for texts describing the range of instructional activities that participants
believed the tool could be used to support, as well as the range of actual activities that the tool was
used for in Parts 1 and 3 of the study. The range of strategies suggested and used by participants was
wide and corresponded to the data in the Content code, showing that the tool could be used both to
explore content and to generate content. The types of activities suggested for exploring existing
content included analyzing a study or case, examining steps in a process, reflecting on readings,
synthesizing content, evaluating laws, and drawing conclusions. The types of activities suggested for
generating new content included developing a process, creative writing, developing examples and
cases, and producing problem sets to be tackled.
When using the tool, the creation of the artifact results in new content; therefore, all uses
might be seen as generating new content. Many participants described activities that both explore
and generate content, implying multiple activities and uses of the tool (or other tools, including
documents or face-to-face dialog) over time. IN9 suggested having students develop conclusions
about the data they had posted earlier about ancient landscapes as a follow-on activity. IN4 asked
students, after reading a scenario about archeological finds in a construction site, to populate a table
with benefits and consequences of three courses of action. After all students contributed, the artifact
would be used in a discussion of the best course of action. DD5 suggested using the tool to analyze a
problem for which insufficient data is provided. Learners could list additional information needed to
solve this problem. Later, after the additional information is gathered or provided, learners could
propose solutions. Uses of the tool over time, such as some of these, are explored further in the Time
subcode.
DD9 described two types of strategies that the form could support and the potential impact
of each. His explanation explores the impact of strategy upon objectives and time, which are
explored further in these individual subcodes.
DD9: In the one-hit scenario, the idea is to have each learner submit one item
(definition, example, etc.), and to read the items of others. The primary intent at
this stage is not to have comments on the work of others, though there may be such
comments. The iterative scenario expects and encourages repeat trips. I write
my definition and post it. I examine the postings of others. I make additional
contributions of my own (another example would be...), and I comment on the
postings of others.
Most of the students in IN1's course said that they would have benefited from additional
preparation before attempting to perform the injury prevention matrix activity. This suggests that
instructors and instructional designer/developers need to consider student preparation for activities
that call for students to generate content. Dr. Thiagarajan (DD12) asserted that it made sense
instructionally, in some instances, to allow students to struggle with this kind of activity.
The tool was also reported to support both classroom-based activities and future activities
(classroom or online). IN1 used the tool in between the first and second class meetings so that
students could practice working with the injury prevention tool in preparation for future class
activities. IN4's activities were sequential and fed into each other.
Strategies-objectives. This code was used to group text passages that represented the type of
instructional objectives the tool could support. Sugrue (2002) described a model for considering two
types of learning objectives, those that involve learners in recalling or remembering and others which
involve learners in application. Participants recognized that the tool could easily support a wide
range of objectives, and the lessons developed for Part 3 of the study included a variety of objective types.
IN4: Whole range... from simple recall to creative and critical thinking.
Depends on use, strategy, follow-up, etc.
DD12: We can use this approach to teach processes, procedures, conceptual
frameworks, and multiple perspectives.
Dr. Thiagarajan explained that these types of games are very powerful because they can be
used at the beginning, middle, and higher levels of learning. For instance, he described how they
could be used to support recall objectives in an introductory Spanish course by having learners
translate words in a matrix from English to Spanish and Spanish to English. They could be used later
to support use objectives, as learners write a story with those words.
Strategies-objectives-remember. This code was used to group text passages that represented
objectives for prompting learners to recall content. Although most of the suggested and actual uses of
the tool supported more use-oriented objectives, the tool also supported remember objectives. In Part
3 of the study, IN4 used the tool to have students synthesize data from a website into a table.
Strategies-objectives-use. This code was used to group text passages that represented
objectives for prompting learners to apply knowledge. In Part 3 of the study, seven of the nine
activities involved use objectives. Participants suggested numerous other uses of the tool that
involved use objectives.
IN8: Identify the researchable question.
IN7: Indicate if the law favors labor or management.
DD12: Generate list of suitable responses.
IN4: Assess advantages, disadvantages, and consequences now and later for
three separate courses of action.
Strategies-directions. The Directions subcode of the Strategies code was used to group text
passages that represented how written directions impacted use of the tool or the activity. Some
participants complained about unclear or cumbersome directions.
Learner, IN1's matrix activity: Assignment was unclear: did professors want
actual interventions inside the matrix or did they want characteristics that affected
exposure and outcomes.