Citation
Developing a virtual mentor to accompany a medical simulator

Material Information

Title:
Developing a virtual mentor to accompany a medical simulator
Creator:
Bagur, Michelle M
Publication Date:
Language:
English
Physical Description:
viii, 79 leaves : illustrations ; 28 cm

Thesis/Dissertation Information

Degree:
Master's (Master of Science)
Degree Grantor:
University of Colorado Denver
Degree Divisions:
College of Liberal Arts and Sciences, CU Denver
Degree Disciplines:
Integrated science

Subjects

Subjects / Keywords:
Virtual reality in medicine ( lcsh )
Medicine -- Study and teaching -- Simulation methods ( lcsh )
Simulated patients ( lcsh )
Medicine -- Study and teaching -- Simulation methods ( fast )
Simulated patients ( fast )
Virtual reality in medicine ( fast )
Genre:
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )

Notes

Bibliography:
Includes bibliographical references (leaves 78-79).
General Note:
Integrated Sciences Program
Statement of Responsibility:
by Michelle M. Bagur.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
71800809 ( OCLC )
ocm71800809
Classification:
LD1193.L584 2006m B33 ( lcc )

Full Text
Developing a Virtual Mentor to Accompany a Medical Simulator
by
Michelle M. Bagur
B.S., University of Texas at Dallas, 1997
A thesis submitted to the
University of Colorado at Denver and Health Sciences Center
in partial fulfillment
of the requirements for the degree of
Master of Integrated Science
2006


This thesis for the Master of Integrated Science degree
by Michelle M. Bagur
has been approved by
Karl Reinig
Date


Bagur, Michelle M. (M.I.S., Master of Integrated Science)
Developing a Virtual Mentor to Accompany a Medical Simulator
Thesis directed by Professor John Lanning
ABSTRACT
As technology advances, simulators are becoming more standard in the
medical field. Training methodologies are emerging that allow doctors to
practice techniques on machines which duplicate patients, instead of on the
patients themselves. This allows for patient safety, a more accessible learning
schedule (as doctors do not need to wait for certain conditions to arise in the
operating theater before they learn how to address the problem), and a more
regulated training environment. Simulators also require less hands-on,
instructional time from a practicing doctor. However, the expertise of the doctor is
still required in a simulated environment, and without being under the watchful
eye of an experienced doctor, the resident needs to be guided through the
necessary curriculum and appropriate steps for the procedure, while being
monitored for ability and comprehension. This thesis describes the creation of a
computer application called the Mentor, which communicates with a medical
simulator. The Mentor guides the resident through expert-dictated curriculum,
allows them to practice procedures on the simulator, and monitors and records
their progress.


This abstract accurately represents the content of the candidate's thesis. I
recommend its publication.
Signed


ACKNOWLEDGEMENTS
Thanks to my thesis committee, Drs. Lanning, Reinig, and Eckhoff. Thank you
also to Neill Kipp, who offered instruction and advice, and introduced me to the
interesting field of Interface Design. Thanks to Dr. Victor Spitzer and Dr. David
Rubinstein, and the staff at the UCHSC Center for Human Simulation and Touch
of Life Technologies. They were integral in this process. I also wish to extend
thanks to the CU Denver CLAS graduate faculty, especially Jill Hutchison,
Tammy Stone, and Doris Kimbrough. And finally, I would like to thank my
wonderful family and dear friends, for encouraging me both to continue the
process, and all along the way.


CONTENTS
FIGURES........................................................................viii
CHAPTER
1. Introduction..................................................................1
1.1 Goals of the Application......................................................2
1.2 Considerations...............................................................3
2. Previous Work.................................................................5
2.1 The history of the Explorable Virtual Human and Interactive Anatomic Animations.5
2.2 Usability Engineering, history and terminology...............................6
2.2.1 Human Factors...............................................................7
2.2.2 Human-Computer Interaction.................................................8
2.3 Usability Engineering Considerations.........................................8
2.4 Content Development.........................................................10
2.5 Current simulators used in training.........................................11
3. Methods......................................................................14
3.1 Analysis of tools for the Mentor............................................15
3.1.1 Java.......................................................................16
3.1.2 Hyper Text Markup Language................................................17
3.1.3 Extensible Markup Language................................18
3.1.4 JavaScript................................................................21
3.1.5 Interactive Anatomic Animations...........................................21
3.1.6 C++........................................................................23
4. Results and Discussion.......................................................26
4.1 Training on Use.............................................................26


4.2 Presentation of Media...................................................28
4.3 Presentation of Content... 30
4.4 Verification of Completeness............................................32
4.5 No keyboard Input.......................................................35
4.6 Tests...................................................................36
4.7 Mentor use in the field.................................................39
5.1 Summary..................................................................40
5.2 Current State and Future Work...........................................41
APPENDIX
A. XML Code: files for lesson...............................................43
B. Html Code: file for individual page (content by Adam Lawson).............45
C. Socket communication and simulator communication.........................46
D. AAOS Faculty Checklist for Evaluating Surgical Performance on a Patient..60
REFERENCES...................................................................78


FIGURES
Figure
1.1 The original EVH, illustrating a concept for the American Academy of
Orthopedic Surgeons............................................................6
2.1 A student using the ToLTech arthrocentesis simulator.....................13
2.2 A student using the ToLTech arthroscopy simulator........................13
4.1 Overview page for the Mentor.............................................28
4.2 Mentor with movie........................................................30
4.3 Image of content page for the Mentor....................................32
4.4 Image of the Mentor preparing the resident for a simulator task.........34


1. Introduction
When a doctor is training a resident in surgical technique, the doctor
typically stands over the resident's shoulder, offering instruction, asking
questions, and guiding the procedure. The doctor does not only suggest and
dictate; he also critiques and evaluates the ability of the resident. The phrase
associated with teaching a resident is "see one, do one, teach one"; an
intimidating notion for the uninitiated. Touch of Life Technologies has spent a
number of years creating medical simulators that allow a resident to practice an
assortment of techniques, such as arthrocentesis (needle injection) and knee
arthroscopy. When demonstrating these simulators, and allowing them to be
used in public, they found that it was useful to have an actual person sitting with
the user, explaining the parameters of the simulator and the intricacies of the
procedure. As time progressed and the procedures became more complex, they
found the presence of a doctor, or a mentor, to be invaluable when it came to
content and procedure. The goal became to create a virtual Mentor that would
perform all of the necessary instruction and guidelines that an actual doctor
would provide, while at the same time keeping careful and accurate track of the
resident's progress throughout the simulator training. The Mentor must be
transparent enough to allow for intuitive use, while guiding and offering the
resident expert feedback. The Mentor must adhere to standard web paradigms
and good user interface guidelines, while also functioning as a teacher, test-giver
and test-grader, and communicating with and controlling the simulator.


1.1 Goals of the Application
My role in this project was to design and begin to implement the user
interface for the Mentor. I was tasked with generating an application that could
accompany simulators initially during demonstrations, and eventually for
commercial development. I used as my original model an application called the
Explorable Virtual Human (EVH), which will be discussed in a future section.
I met with colleagues to determine the needs of the Mentoring program,
and incorporated theories from the field of usability engineering to formulate a
system to meet the requirements designated below.
1. The Mentor must train the resident to use the system, both the Mentor
itself and the simulator.
2. The Mentor must present movies, animations, and images in a useful
and easily accessible way.
3. The Mentor must dictate content in a meaningful order, facilitating
review of lessons but not allowing the resident to jump ahead.
4. The Mentor must verify that each unit has been at least attempted by
asking questions that must be successfully answered in order for the resident to
progress.
5. The Mentor must require no keyboard input from the resident.
6. The Mentor must conduct and score tests designed to determine the
expertise of the resident.


1.2 Considerations
In order to accomplish the goals of the Mentor described above, there are
two main subjects that require special consideration. One is the user interface
of the Mentor, as it relates to ease of use and information conveyance, and the
other is the programmatic and language requirements of the simulator. Due to
the inherent complexity of the simulator, it is vital to provide a new user with a
Mentor interface that will be familiar, logical, and easily understandable. As one
of the tasks of the Mentor is to teach the use of itself, it is aided by incorporating
current understandings from the field of graphical user interface (GUI) design.
Key issues in usability and GUI design will be addressed, as well as trends in the
areas of content development and online and computer aided learning, and
current simulators that are used in the field.
Programmatic language considerations were based on a number of
factors. One is the desire to utilize the accomplishments of past work toward a
"Mentoring" program via an application called the Explorable Virtual Human,
which was written using the Java programming language. The simulators,
however, are written in the C++ programming language, and are usually run on a
separate computer from the Mentor itself. My original thought was to create the
Mentor in an engine that was being developed simultaneously by my colleagues,
and that is in the same programming language as the simulators. However, at
the time when development on the Mentor began, the engine was not complete


enough to handle the requirements of the Mentor. Language considerations
were made based on the need to have a Mentor that was immediately useful and
demonstrable. The decision to have the Mentor run in a different programming
language and on a different computer requires the use of sockets, to
communicate from one computer to another and to convey information from one
programming language to another. Languages and programming tools used, in
addition to a more thorough description of the simulators themselves, will be
discussed in the Methods section.


2. Previous Work
2.1 The history of the Explorable Virtual Human and Interactive Anatomic
Animations
The Explorable Virtual Human (EVH) was an application developed by
the Center for Human Simulation at the University of Colorado Health Sciences
Center. It was undertaken via a grant from the National Library of Medicine, to
satisfy the requirements of a program that would utilize the Next Generation
Internet (Contract Number N01-LM-0-3507). The EVH consisted of two main
parts, a series of hyper text markup language (HTML) pages which contained
text and media content, and accompanying Interactive Anatomic Animations
(IAAs), which describe the anatomy in three dimensions (3D). IAAs will be
discussed more fully in the Methods section. The EVH was successful in
meeting the goals of the grant, but development was mostly discontinued after
the grant ended. In addition, there were some complexities involved in
the application. The use of an HTML renderer that was created in-house was
only partially successful, and failed to take advantage of current advancements in
standard HTML rendering engines (such as Internet Explorer or Netscape). Also,
the EVH had no method for communicating with a simulator, as it was developed
for online learning.


[Figure: screenshot of the original EVH. Its text reads: "This IAA illustrates the difference between the two axes, which are neither colinear nor parallel. The following IAA shows that the corresponding two planes are neither coplanar nor parallel," followed by a "Click to explore" control and a table of the measured angle between the two axes for each of eight CT knees.]

Figure 1.1 The original EVH, illustrating a concept for the American Academy of
Orthopedic Surgeons
2.2 Usability Engineering, history and terminology
Usability is an attribute that assesses how easy an interface is to use.
Jakob Nielsen describes usability as based on five factors: learnability, efficiency,
memorability, errors, and satisfaction (Nielsen 2003). The usability of an
application is an issue that is becoming more and more prominent as people use
computers in more and more aspects of their daily lives. Learnability is how
easy it is for users to accomplish tasks the first time they use the application.


Efficiency is the measure of the time it takes for the user to complete a task.
Memorability concerns how easily users reestablish proficiency when they
return to an application after a period of not using it. The errors factor involves counting
the number, severity, and recoverability of errors that users make. Satisfaction
pertains to how pleasurable it is for the user to interact with and use the
application (Nielsen 2003).
2.2.1 Human Factors
Usability in software engineering emerged as a topic of great importance
in the 1980s. In the 1970s, it was mostly engineers and computing
professionals who interacted with computers. However, as personal computers
became more prevalent, the end users were a much larger pool of people who
were less technical. Marketing and Quality Assurance groups began to analyze
aspects of software applications in regard to human performance. This testing
became known as human factors evaluation. The evaluation allowed
manufacturers to know how well people were able to use their applications, but
such tests were done at a time in the development process when it was too late
to incorporate the findings into the design itself (Rosson and Carroll).


2.2.2 Human-Computer Interaction
In the 1980s, the fields of cognitive psychology and computer science
came together to begin to analyze the way people interact with computers.
According to ACM's SIGCHI, the Association for Computing Machinery's Special
Interest Group on Computer-Human Interaction, "Human-Computer interaction is
a discipline concerned with the design, evaluation and implementation of
interactive computing systems for human use and with the study of major
phenomena surrounding them" (Hewett, et al.).
In other words, Human-Computer interaction is concerned with aspects of
application design and development that make computers more user-friendly.
These concerns include methods for designing and implementing interfaces,
evaluating interfaces, developing new interaction techniques, and creating
predictive models of interaction (Human-Computer Interaction).
2.3 Usability Engineering Considerations
There are a number of different guidelines, theories, and principles that
have emerged in the study of Usability Engineering. These principles are
applicable to software and web design. They have been put forth by usability
experts Jakob Nielsen and Bruce Tognazzini, and are quite extensive. Nielsen
and Tognazzini are principals (along with Donald Norman) in the Nielsen Norman
Group, widely considered the leading consultants in human-computer interaction
(Bruce Tognazzini).


Jakob Nielsen is considered the "guru of webpage usability" (Richtel),
with a decade-long history of helping software companies make their software
more user-friendly. Nielsen's heuristics are fairly concise; they include: visibility
of system status (keep the user informed), match the system to the real world
(speak the users' language), allow for control and freedom (designate a clear
way to exit, undo, and redo tasks), be consistent and standard in terminology and
platform conventions, prevent errors before they occur, plan for recognition rather
than recall (keep options visible so that the user does not have to remember
them), make the software flexible and efficient, maintain aesthetics and
minimalist aspects of design, make error recovery simple, and provide help and
documentation (Nielsen).
Tognazzini, who worked for Sun Microsystems and Apple Computer
before joining the Nielsen Norman Group, has a list of principles of interaction
design as well. They can be summarized in an overarching set of three main
principles:
Effective interfaces are visually apparent and forgiving, instilling in their
users a sense of control. Users quickly see the breadth of their options,
grasp how to achieve their goals, and do their work.
Effective interfaces do not concern the user with the inner workings of the
system. Work is carefully and continuously saved, with full option for the
user to undo any activity at any time.
Effective applications and services perform a maximum of work, while
requiring a minimum of information from users (Tognazzini).


2.4 Content Development
There are a number of companies that exist solely to incorporate content
into an electronic learning (e-learning) or virtual environment. Electronic learning
is normally associated only with the internet, but it in fact encompasses all kinds
of technologies, such as video conferencing or CD/DVD based applications
(Instructional Technology/E-Learning). An e-learning application is generally
part of a Learning Management System (LMS). The LMS incorporates the
registration of the student, delivery of e-learning applications, the tracking of
student progress, and testing and reporting (Learning Management System).
The EVH application described earlier is an example of an e-learning
application; it took anatomical content and presented it in a computerized
environment, distributable across the internet. Custom course development
continues to be a growing market. Most schools and colleges now offer
electronic-based learning, with classes conducted entirely online. Companies
are offering e-based training because it is cost effective and results in high
information retention. It is also easier to track the comprehension of students,
and can be easily updated to reflect changing policies and procedures (Vaas).
As an example, one such company, MountainTop Learning (www.mntntp.com),
analyzes a company's needs, its methods of distribution, and its target
market in order to create web or application based software for learning or
training. They incorporate methods for testing the retention and comprehension
of the user, and also track performance for future assessment. They are a
typical example of a company that creates e-learning applications and Learning


Management Systems.
2.5 Current simulators used in training
Mannequin-based simulators are one type of medical simulator. Medical
Education Technologies, Inc (Meti, www.meti.com) is a company based in
Florida that creates mannequin-based simulators such as the Human Patient
Simulator (HPS). This simulator mimics many of the functions of a normal
patient (pulse, blinking eyes, a circulatory system). The HPS is controlled by the
Patient Editor software. This software allows the user to dictate the name, age
and gender of the patient, and then create a medical history. It is up to the
resident to figure out how to treat the patient.
Other types of simulators involve virtual environments without force
feedback. Mentice Corporation (www.mentice.com), based in Gothenburg,
Sweden, is one such company. MIST, or the Minimally Invasive Surgery Trainer,
teaches core laparoscopic skills and stitch and knot tying. It is a purely virtual
environment (as opposed to a mannequin-based environment) in which the user
holds onto physical tools and looks at a computer monitor for instruction. Tasks
are dictated via step by step instructions, and the user can see clips of the
technique in use juxtaposed with their own hand movements in a more simplistic
graphical environment.
A third type of medical simulator incorporates force feedback devices into
virtual environments. ToLTech (Touch of Life Technologies, www.toltech.net) is


based in Denver, Colorado, and utilizes data from the Visible Human, a project
sponsored by the National Library of Medicine and completed at the University of
Colorado Center for Human Simulation, to create simulators and virtual anatomy
software. The Mentor application will accompany each of the simulators in
development. There is a possibility that eventually there will be one simulator (a
virtual patient) instead of the many that currently exist to address different
procedures. In the virtual patient model, which procedure the user wishes to
perform will be chosen through the Mentor application. However, at the current
time, each simulator functions separately, and each Mentor is specific to the
particular simulator it accompanies. If a user does not have a simulator attached,
the Mentor can be used as a standalone application. The Mentor still presents
the content, but no longer communicates with the simulator. With the exception
of the arthroscopy simulator (which incorporates a mechanical leg), each of the
ToLTech simulators uses a completely virtual environment. The user holds on to
robots which generate force feedback, and they look at a computer monitor while
performing the procedure. In the first image below, the user is working on the
arthrocentesis simulator. She is holding the handles of two haptic devices in her
hands, looking into a mirrored setup and wearing 3D glasses, which cause the
image to float in front of her. The Mentor screen is mounted to her left, just out of
view. In the second image, the user is working on the arthroscopy simulator.
She is leaning on the mechanical leg, looking at the simulator on a monitor on
the superior side of the patient. In this case, the Mentor is displayed on a
monitor to the right of the simulator screen.


Figure 2.1 A student using the ToLTech arthrocentesis simulator
Figure 2.2 A student using the ToLTech arthroscopy simulator


3. Methods
The Mentor follows a standard book paradigm. It has a series of
chapters and lessons, the flow of which is controlled by next and previous
buttons. If there is a simulator connected to the Mentor, the first section of the
Mentor contains information on using the simulator itself. The next section of the
Mentor is the curriculum (or procedure) which the simulator is teaching. This
curriculum has been dictated or advised by at least one authority in the
procedural field; in many cases there are multiple authorities who provide the
content. A content developer creates each page to communicate the information
the authority designates. These pages are organized to create a natural flow.
The resident reads the information on the page, perhaps watches a movie or
examines an image, and interacts with a pre-defined 3D animation. They then
use the simulator to practice tasks that have been presented. The simulator
communicates with the Mentor by way of Windows and Java sockets (a method
of information sharing over a network). As the resident completes tasks on the
simulator, that information is sent to the Mentor. The Mentor then records the
data and displays feedback for the resident. This way, the resident can see how
well they performed the individual tasks, and can also get further instruction if
that is required.
The last section of the Mentor is the overall testing section. Throughout
the use of the Mentor, the resident has been tested both via small informational
tests and on simulator tasks. In the final test, the resident is required to perform
the procedure, this time with no help from the Mentor. The Mentor records the


resident's progress, and creates a report at the end.
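The record-and-report step described above can be sketched in Java, the Mentor's implementation language. This is an illustrative sketch only: the thesis does not show this code, so the class name, method names, and scoring scale are all assumptions.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the Mentor's progress tracking; names are assumed.
public class ProgressRecord {
    private final Map<String, Integer> scores = new LinkedHashMap<>();

    // Called whenever the simulator (or a written test) reports a score.
    public void recordTask(String task, int score) {
        scores.put(task, score);
    }

    // Build a simple end-of-session report for the resident.
    public String report() {
        StringBuilder sb = new StringBuilder("Resident progress report\n");
        for (Map.Entry<String, Integer> e : scores.entrySet()) {
            sb.append(e.getKey()).append(": ").append(e.getValue()).append("/100\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        ProgressRecord record = new ProgressRecord();
        record.recordTask("portal placement", 88);
        record.recordTask("camera navigation", 74);
        System.out.print(record.report());
    }
}
```

A structure of this kind lets the final test section reuse the same records that were accumulated during the lessons.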
The Mentor is displayed on a separate monitor that is next to the
simulator. It does not overlap with the simulator itself. The goal is to keep the
simulator as true to reality as possible. This means that it will have no
transparent views, no unrealistic 3D views, and no floating text or instructions.
All of these things are the tasks of the Mentor. The Mentor resembles a doctor
standing over a resident's shoulder, offering instruction, guidance, and direction.
The resident can look to the Mentor for more information at any time; they can
step away from the simulator in order to revisit previous tasks or curriculum. The
Mentor monitor and mouse will be close enough to the simulator to be easily
accessible, but separate enough to establish it as a separate presence from the
simulator.
3.1 Analysis of tools for the Mentor
The Mentor is written in a combination of Java, HTML (HyperText Markup
Language), JavaScript, and XML (Extensible Markup Language), and
incorporates the functionality of IAAs (Interactive Anatomic Animations). It
communicates with simulators that are written in C++ via sockets. Each of these
technologies will be addressed separately.


3.1.1 Java
Java is a programming language that grew out of a collaborative effort by
Sun Microsystems employees in the 1990s. It arose partially due to
frustrations with the C++ programming language, and according to Patrick
Naughton, a desire to "do fewer things better" (Harold). It has a number of
different aspects to recommend it as the appropriate language for the Mentor.
One of its main recommendations is the fact that it is platform independent. It
can be run on all Internet browsers and on all computing platforms (Windows,
Unix, Macintosh). This is especially important for the Mentor that will run on a
computer without a simulator. If a Mentor is connected to a simulator then it will
probably be on a computer that is packaged with the simulator. If it is not
connected to a simulator, then it can be run anywhere, and the computer
requirements should be minimal. Java's platform independence allows the
Mentor to run on different computers with possibly vastly different operating
systems (although the Mentor has been optimized to run on the Microsoft
Internet Explorer browser). Java automates memory allocation and
deallocation, the manual handling of which is one of the main causes of bugs in
C and C++ applications. Java also incorporates the notion of garbage collection (cleaning
up memory) by means of a thread that runs in the background of the program
and cleans up unused memory, so that the programmer does not have to
remember to write it into the code. It is object oriented, which means that data
are represented by objects which have fields (or variables) and methods (or
functions). Object orientation allows the programmer to create an instance of an


object without having to know exactly how the object is created. This provides
for simplicity of use and more efficient re-use of code. Java also has a number of
security features built into it. Java Applets are mini applications that are created
in Java and are run in the user's internet browser. Java-enabled browsers check
the applets to make sure they are not trying to access information on the user's
hard drive, or write anything to the hard drive. This prevents viruses from
entering the system. There are many other aspects that recommend Java as a
good language to use for the Mentor, but these reasons are particularly pertinent
(Sun).
Java is used in the Mentor both as the language in which the IAAs are
displayed, and as the Mentor-side method of receiving information from and
sending information to the simulator, via sockets. Java is also used to write
information to specific accessible application files, which will eventually be
located on a network, and which allows the program to store pertinent status
information each time a resident logs on or off of the system.
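The socket link described above can be sketched as follows. This is a minimal illustration only: the thesis does not specify the actual Mentor/simulator protocol, so the line-based "TASK:name:score" message format and all names here are assumptions.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.UncheckedIOException;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical sketch of the Mentor-side socket; the message format is assumed.
public class SimulatorLink {

    // Parse one status line from the simulator into {task name, score}.
    public static String[] parseMessage(String line) {
        String[] parts = line.split(":");
        if (parts.length != 3 || !parts[0].equals("TASK")) {
            throw new IllegalArgumentException("unrecognized message: " + line);
        }
        return new String[] { parts[1], parts[2] };
    }

    public static void main(String[] args) throws IOException {
        // The Mentor side listens on a socket; the C++ simulator connects and
        // reports each completed task. Here a local thread stands in for it.
        try (ServerSocket server = new ServerSocket(0)) {
            int port = server.getLocalPort();
            new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println("TASK:arthrocentesis_injection:92");
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            }).start();

            try (Socket sim = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(sim.getInputStream()))) {
                String[] result = parseMessage(in.readLine());
                System.out.println("task=" + result[0] + " score=" + result[1]);
            }
        }
    }
}
```

Because sockets exchange plain bytes over the network, the same scheme works even though the simulator side is written in C++ on a separate machine.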
3.1.2 Hyper Text Markup Language
HTML, commonly used to display information on the World Wide Web, is
rendered (displayed) in Internet Explorer, which allows it to take advantage of
pre-existing capabilities; namely, the ability to display text, images, and movies in
an established environment. HTML is a markup language that grew from SGML,
or Standard Generalized Markup Language. SGML began as the international


standard for representing text electronically. A markup language is one that
describes tags or markings that are defined in a separate file. These tags are put
on either side of text, and describe how the text should be formatted on the page
(Sperberg-McQueen and Burnard).
An example of HTML:

<h1>Chapter One: IAAs</h1>

In this example, the <h1> tags are recognized by HTML renderers to be a
certain sized heading. The text that is contained within the tags is displayed by
means of specific formatting, and the <h1> tag is the largest and most prominent
of all of the established heading tags. There are a number of recognized tags, all
of which can be found on the website for the World Wide Web Consortium, which
develops specifications and guidelines for languages such as HTML (World Wide
Web Consortium, et al.).
HTML is the language in which all of the content (instructional and
curriculum) is written in the Mentor. Each page that is presented to the resident
is a separate HTML page. All of the pages contain text, some contain movies
and images. Each HTML page is rendered in Internet Explorer.
3.1.3 Extensible Markup Language
XML, like HTML, is also a markup language derived from SGML.
However, it is different in its application. XML can be considered a meta-


language, or a language that describes the relationship between words, or tags.
Whereas HTML has a predefined set of tags that all HTML renderers recognize,
XML allows users to create their own set of tags. Meaningful tags make the XML
document inherently much more understandable. In order to be rendered, XML
files can refer to DTD files, or Document Type Definitions. These files are the
true meta-information files, as they describe what each tag means, and how it
should be displayed (Walsh).
XML allows for a flexibility of data that would otherwise need to be hard
coded into individual pages. Ordinarily, in order to move to the next page in a
series or to go to a previous one, a link has to be included on a page. This link
contains a defined filename; the corresponding page is then loaded. In order to
change the order of pages, the appropriate forward link must be changed to
redirect to the new page, and the previous link of the following page needs to
be adjusted to point to the newly inserted page. The newly inserted page must
also point forward and back correctly. If many pages are created or
rearranged, this can get tedious and can lead to problems of broken links if one
of these steps is accidentally omitted. The Mentors XML documents, on the
other hand, are formatted and used in a way such that one file, in this case
Lesson.xml, contains a list of all of the content pages for the Mentor. An XML
parser is utilized in the code of the Mentor in order to go from one page to the
next. If a new page needs to be inserted, a line is added in the XML file in the
appropriate place. If one needs to be removed, that line is removed from the file.
No hard links are required, and the problem of broken links is solved.
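In code, this scheme amounts to parsing the lesson list once and navigating by index. The sketch below illustrates the idea with hypothetical names, and with a regular expression standing in for the Mentor's actual XML parser:

```javascript
// Sketch of XML-driven page navigation (hypothetical names; a regular
// expression stands in for a real XML parser).
var lessonXml =
  '<lesson num="0" name="Introduction" file="./Html/Learn_Parts_Intro.html">' +
  '<lesson num="1" name="The Mentor" file="./Html/Learn_Parts_Mentor.html">' +
  '<lesson num="2" name="The Simulator" file="./Html/Learn_Parts_Simulator.html">';

// Build an ordered list of content pages from the lesson file.
function parseLessonFiles(xml) {
  var files = [];
  var re = /file="([^"]+)"/g;
  var match;
  while ((match = re.exec(xml)) !== null) {
    files.push(match[1]);
  }
  return files;
}

// Moving forward is just index arithmetic over the parsed list; inserting or
// removing a line in the XML reorders the pages with no links to repair.
function nextPage(files, current) {
  var i = files.indexOf(current);
  return (i >= 0 && i + 1 < files.length) ? files[i + 1] : null;
}
```

Reordering pages then means reordering lines in the XML file; no page-to-page links need to be edited.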


The use of XML also allows for the future goal of having one Mentor
which can accompany any number of simulators. The core code of the Mentor
functions the same way, but the specific content is dictated by which XML files
are loaded. The XML files then point to specific content files. This way, instead
of having an entirely separate instance of the Mentor for each of the simulators,
the Mentor simply loads the appropriate data files for the particular simulator.
An example of XML:

<lesson num = "0" name = "Introduction" file =
"./Html/Learn_Parts_Intro.html">
<lesson ... file =
"./Html/Learn_Parts_Mentor.html">
<lesson ... file =
"./Html/Learn_Parts_Simulator.html">
<lesson num = "3" name = "IAAs" file =
"./Html/Learn_Parts_IAAs.html">
<lesson ... file =
"./Html/Learn_Using_Intro.html">
<lesson num = "3" name = "The Tools" file =
"./Html/Learn_Using_Tools.html">


3.1.4 JavaScript
JavaScript, contrary to its name, is not a Java application. It is a dynamic
scripting language developed by Netscape. A dynamic language refers to one in
which objects are created at runtime, or when a page is displayed. JavaScript
was written to resemble Java and C++, to aid in familiarity of use (Ginda).
JavaScript is commonly used online in situations where buttons change their
appearance when they are moused over, or to pop up new windows. It is
common practice for users to create JavaScript scripts, such as popup windows
or dropdown menus, which can be incorporated freely into others' code. Usually
the only stipulation is that the end user gives credit to the person who created the
JavaScript script initially. The Mentor utilizes JavaScript to check quiz answers
and to change the appearance and the functionality of buttons such as the "next"
button.
An example of JavaScript:

<script language="JavaScript1.2" src="styles/definition.js">
</script>

<h1>The Shoulder: </h1>
<h2>Functional Anatomy 2</h2>

<div id="object1" style="position:absolute; background-color:FFFFDD;
color:black; border-color:black; border-width:70; visibility:show;
left:125px; top:-100px; z-index:+1" onmouseover="overdiv=1;"
onmouseout="overdiv=0; setTimeout('hideLayer()',50)">
pop up description layer
</div>

<p>The Scapulothoracic Articulation is stabilized by 6 extrinsic shoulder
muscles that originate from the axial skeleton and insert on the Scapula:</p>

<ul>
...
</ul>

<p>Using the mouse, rotate and examine the structures of the
shoulder surrounding the glenohumeral joint.
It is essential to understand how the bones of the joints relate to the surface of
the skin.</p>

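The quiz-checking and button-control roles of JavaScript mentioned above can be sketched as follows. The names here are hypothetical; the Mentor's actual scripts also update elements of the HTML page itself:

```javascript
// Sketch of quiz-answer checking (hypothetical names). Each question
// stores its correct choice; a handler compares the resident's answer.
var answerKey = { q1: "b", q2: "d" }; // correct choice per question

function checkAnswer(question, choice) {
  return answerKey[question] === choice;
}

// The "next" button is enabled only once every question has been answered.
function nextButtonEnabled(answeredCount, totalQuestions) {
  return answeredCount === totalQuestions;
}
```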


C. Socket communication and simulator communication
#ifndef CHSSimulatorCommunication_h
#define CHSSimulatorCommunication_h
#include "CUT_Utils.h"
#include "CUT_Types.h"
#include "CUT_String.h"
using namespace chsUtilities;
#include "CUI_WIN32Console.h"
using namespace chsUI;
#include "CIV_TargetExaminerApp.h"
using namespace chsInventor;
//require no further data
tInt32 LEFT_RIGHT_ANATOMY_TOGGLE = 1;
tInt32 PHANTOM_CALIBRATED = 2;
tInt32 SWITCH_SCOPE_DEGREE = 4;
tInt32 CARTILAGE_FORCE = 6;
//require additional data
tInt32 SCOPE_PORTAL = 8;
//arthroscopy
tInt32 NEEDLE_IN_BODY = 9;
tInt32 EXCESSIVE_FORCE_ON_STRUCTURE = 10;
tInt32 SLOW_VELOCITY = 11;
tInt32 REDIRECTION = 12;
tInt32 CHANGE_NEEDLE_GAUGE = 13;
tInt32 SUCCESS_TEST = 16;
tInt32 TEST_BEGIN = 17;
tInt32 TEST_END = 20;
tInt32 RESET_ALL_VARS = 30;
tInt32 UPDATE_MODELS = 32;
tInt32 INITIALIZED_MENTOR = 64;
//only sent by simulator
tInt32 PLAY_SOUND = 3;
tInt32 SWITCH_TOOLS = 128;
tInt32 UPDATE_SCENE = 256;
tInt32 PRESSED_SWITCH = 512;
tInt32 INITIALIZED_SIM = 1024;
tInt32 GHOST_ERROR = 2048;
tInt32 TOUCHED_FEMURTIBIA = 4096;
class cCHSSimulatorCommunication
{
public:
    cCHSSimulatorCommunication();
    ~cCHSSimulatorCommunication();
    tVoid closeAllSockets();
    tVoid receiveMessage();
    tVoid addListener(); //to listen for occasional messages from mentor
    tVoid addSwitchListener(); //to listen to switch being pressed
    tVoid initializeSendingSocket();
    tVoid initializeListenerSocket();
    tVoid setupSendingSocket();
    tVoid setupListeningSocket();
    bool testForMessage(tVoid *data, tInt32 datalen, tBool wait) const;
    cString getModels();
    cString getSceneInitialization();
    //send to Mentor
    tVoid sendUpdateScene( SbMatrix scopeMatrix, SbMatrix probeMatrix,
        float flexion, float varus_valgus, int msgNum); //this is all transforms,
        //flexion, etc.
    tVoid sendInitializeScene( cString Anatomy, int left_right, int
        toolMedialSide, int scopeDegree);
    tVoid sendToolSwitchSides();
    tVoid sendTouchedFemurTibia();
    tVoid sendCartilageForce( float cartForce);
    tVoid sendSwitchPressed();
    tVoid sendSuccessTest();
    tVoid sendGhostError( cString errorMsg );
    tVoid sendPlaySound( cString fileName);
    tVoid sendExcessiveForceOnStructure();
    tVoid sendNeedleInBody();
    tVoid sendSlowVelocity();
    tVoid sendRedirection();
    char buffer[20];
    long bufferAsLong[2];
    double bufferAsDouble[2];
    long tempLong;
    char FAR *servername;
    SOCKADDR_IN AAddr;
    int LPort;
    int SPort;
    int SendAddrSize;
    unsigned long ul;
    bool modelsReceived;
    bool fullString;
    tChar msg[64];
    bool resetHapticsFromError;
    bool changeNeedleGauge;
    bool addNewModel;
    bool testHasBegun;
    bool resetAllVars;
    int listenerSocket;
    int sendingSocket;
    int remoteSocket;
    float mScopePortal[3];
    float mProbePortal[3];
    cString mModelList;
    cString *mSceneList;
    tBool mNewMessageAlert;
    int mNeedleGauge;
    int mNewModel;
    //socket variables
    int Lret; // variable telling size of received packet
    int Sret;
    struct sockaddr_in la; // socket address information
    struct sockaddr_in sa; // socket address information
    struct hostent *h;
    cString mHostName;
    cString mHostIPAddr;
    tULong mHostMachineAddr;
    tBool mIsHost;
};
inline cCHSSimulatorCommunication::cCHSSimulatorCommunication(){
    servername = "192.168.1.11"; //arthrocentesis mentor shuttle
    LPort = 5478;
    SPort = 6634;
    SendAddrSize = sizeof(AAddr);
    ul = 1;
    fullString = true;
    mNewMessageAlert = false;
    resetHapticsFromError = false;
    changeNeedleGauge = false;
    addNewModel = false;
    testHasBegun = false;
    resetAllVars = false;
}
inline cCHSSimulatorCommunication::~cCHSSimulatorCommunication(){
    closeAllSockets();
}
inline cString cCHSSimulatorCommunication::getModels( ) {
    return mModelList; //stub; a non-void function must return a value
}
inline tVoid cCHSSimulatorCommunication::initializeSendingSocket() {
    // Set up the socket
    hostent *host = NULL; // Host Info Structure
#ifdef _WIN32 // Initialize the Windows Socket DLL.
    WSADATA data;
    if( WSAStartup( MAKEWORD( 1,1), &data ))
    {
        ERRORMACRO( "ERROR: WSAStartup unsuccessful", kAbort);
    }
#endif
    // Find the local host name
    unsigned long address;
    address = inet_addr(servername);
    h = gethostbyname(servername);
    if (h == NULL){
        exit(0);
    }
}
inline tVoid cCHSSimulatorCommunication::initializeListenerSocket() {
    // Set up the socket
    listenerSocket = socket(AF_INET, SOCK_STREAM, 0);
    if (listenerSocket == INVALID_SOCKET){
        CHS_CONSOLE.writeFormat(1,3,"invalid listening socket, %d",
            WSAGetLastError());
    }
}
inline tVoid cCHSSimulatorCommunication::closeAllSockets(){
    if( listenerSocket >= 0 )
    {
        ::closesocket( listenerSocket);
        listenerSocket = -1;
    }
    if( sendingSocket >= 0 )
    {
        ::closesocket( sendingSocket);
        sendingSocket = -1;
    }
}
inline tVoid cCHSSimulatorCommunication::setupSendingSocket(){
    mHostIPAddr = servername;
    mHostMachineAddr = inet_addr( servername);
    sendingSocket = socket( AF_INET, SOCK_STREAM, IPPROTO_TCP );
    sockaddr_in raddr; // Socket Address of Remote Machine
    memcpy( &raddr.sin_addr, (char*)h->h_addr, h->h_length ); // Specify IP Address
    raddr.sin_port = htons( SPort); // Specify Port to connect
    raddr.sin_family = AF_INET; // Specify the address family as Internet (IP)
    if( (::connect( sendingSocket, (sockaddr*)&raddr, sizeof( raddr))) ==
        SOCKET_ERROR)
    {
        CHS_CONSOLE.writeFormat(1,5,"invalid connection of sending socket, %d", WSAGetLastError());
    }
} // (closing brace missing in the original listing)
inline tVoid cCHSSimulatorCommunication::setupListeningSocket() {
    // Fill out Socket Address Structure
    sockaddr_in raddr, laddr; // Socket Address of Local Machine
    laddr.sin_port = htons( LPort); // Specify Port to connect
    laddr.sin_family = AF_INET; // Specify the address family as Internet (IP)
    laddr.sin_addr.s_addr = INADDR_ANY; // Bind to any/all interfaces.
    // Bind Socket to local address
    if( ::bind( listenerSocket, (sockaddr*) &laddr, sizeof( laddr)) < 0 )
    {
        //close_sockets();
        //ERRORMACRO( "Bind unsuccessful", kContinue);
        CHS_CONSOLE.writeFormat(1,3,"invalid BIND");
    }
    // Listen for one connection
    if( ::listen( listenerSocket, 1 ) < 0 )
    {
        CHS_CONSOLE.writeFormat(1,3,"invalid LISTEN");
    }
    // Accept the remote socket from client
    tInt32 fromlen = sizeof( raddr); // Size of remote address structure
    if( (remoteSocket = ::accept( listenerSocket, (sockaddr*) &raddr,
        &fromlen )) < 0)
    {
        CHS_CONSOLE.writeFormat(1,3,"invalid ACCEPT..raddr %d %d", raddr, fromlen);
    }
    int ret;
    do{
        ret = ioctlsocket(remoteSocket, FIONBIO, (unsigned long FAR*) &ul);
    } while (ret == SOCKET_ERROR);
}
inline bool cCHSSimulatorCommunication::testForMessage( tVoid *data, tInt32
    datalen, tBool wait) const
{
    fd_set readfds;
    FD_ZERO( &readfds);
    FD_SET( remoteSocket, &readfds);
    struct timeval timeout;
    timeout.tv_sec = 0;
    timeout.tv_usec = 100;
    if( ::select( remoteSocket, &readfds, NULL, NULL, &timeout) <= 0 )
    {
        // CHS_CONSOLE.writeFormat(1,3,"no SELECT, %d", WSAGetLastError());
    }
    if( (::recv( remoteSocket, (tChar*)data, datalen, 0)) == datalen ){
        return true;
    }
    else
    {
        return false;
    }
}
inline tVoid cCHSSimulatorCommunication::receiveMessage() {
    tChar msg[64]="";
    tInt32 command = 0;
    tInt32 mCommand = 0;
    tInt32 dataSize = 0;
    if( testForMessage( msg, sizeof( msg ), false))
    {
        mNewMessageAlert = true;
        //get the command
        memcpy( &mCommand, msg, sizeof( tInt32 ));
        if( !IsComputerBigEndian())
        {
            SwapBytesInt32( mCommand);
        }
        CHS_CONSOLE.writeFormat(1, 18, "receiving command %d", mCommand);
        switch (mCommand) {
            //left/right anatomy toggle
            case 1:
                CHS_CONSOLE.writeFormat(1, 19, "TOGGLING ANATOMY");
                break;
            case 2:
                CHS_CONSOLE.writeFormat(1, 19, "Phantom Calibrated");
                break;
            case 4:
                CHS_CONSOLE.writeFormat(1, 19, "Change Scope Degree");
                break;
            case 8:
                //get new scope portal from message
                memcpy( &mScopePortal[0], msg + 4 , sizeof(tFloat));
                memcpy( &mScopePortal[1], msg + 16 , sizeof(tFloat));
                memcpy( &mScopePortal[2], msg + 28 , sizeof(tFloat));
                if (!IsComputerBigEndian())
                {
                    SwapBytesFloatArray( mScopePortal, 3);
                }
                break;
            case 13:
                memcpy( &dataSize, msg + 4, sizeof( tInt32 ));
                if( !IsComputerBigEndian())
                {
                    SwapBytesInt32( dataSize);
                }
                memcpy( &mNeedleGauge, msg + 8, sizeof( dataSize));
                if( !IsComputerBigEndian())
                {
                    SwapBytesInt32( mNeedleGauge);
                }
                CHS_CONSOLE.writeFormat(1, 19, "data size is %d, Change needle Gauge: %d", dataSize, mNeedleGauge);
                changeNeedleGauge = true;
                break;
            case 17:
                testHasBegun = true;
                CHS_CONSOLE.writeFormat(1,5, "Received test has begun message");
                break;
            case 20:
                testHasBegun = false;
                CHS_CONSOLE.writeFormat(1,5, "Received test has ended message");
                break;
            case 30:
                resetAllVars = true;
                CHS_CONSOLE.writeFormat(1,6, "Received Reset All Vars message");
                break;
            case 32:
                memcpy( &dataSize, msg + 4, sizeof( tInt32 ));
                if( !IsComputerBigEndian())
                {
                    SwapBytesInt32( dataSize);
                }
                memcpy( &mNewModel, msg + 8, sizeof( dataSize ));
                if( !IsComputerBigEndian())
                {
                    SwapBytesInt32( mNewModel);
                }
                CHS_CONSOLE.writeFormat(1, 17, "data size is %d, New Model: %d", dataSize, mNewModel);
                addNewModel = true;
                break;
            case 2048:
                //a ghost error has been received and user has put haptics
                //back into the reset state.
                resetHapticsFromError = true;
                break;
            default:
                break;
        }
    }
}
//send to Mentor
tVoid cCHSSimulatorCommunication::sendUpdateScene( SbMatrix scopeMatrix,
    SbMatrix probeMatrix, float flexion, float varus_valgus, int msgNum){
    tFloat* sceneData = new tFloat[35]; //35 b/c we're now sending num messages
    sceneData[0] = scopeMatrix[0][0];
    sceneData[1] = scopeMatrix[0][1];
    sceneData[2] = scopeMatrix[0][2];
    sceneData[3] = scopeMatrix[0][3];
    sceneData[4] = scopeMatrix[1][0];
    sceneData[5] = scopeMatrix[1][1];
    sceneData[6] = scopeMatrix[1][2];
    sceneData[7] = scopeMatrix[1][3];
    sceneData[8] = scopeMatrix[2][0];
    sceneData[9] = scopeMatrix[2][1];
    sceneData[10] = scopeMatrix[2][2];
    sceneData[11] = scopeMatrix[2][3];
    sceneData[12] = scopeMatrix[3][0];
    sceneData[13] = scopeMatrix[3][1];
    sceneData[14] = scopeMatrix[3][2];
    sceneData[15] = scopeMatrix[3][3];
    sceneData[16] = probeMatrix[0][0];
    sceneData[17] = probeMatrix[0][1];
    sceneData[18] = probeMatrix[0][2];
    sceneData[19] = probeMatrix[0][3];
    sceneData[20] = probeMatrix[1][0];
    sceneData[21] = probeMatrix[1][1];
    sceneData[22] = probeMatrix[1][2];
    sceneData[23] = probeMatrix[1][3];
    sceneData[24] = probeMatrix[2][0];
    sceneData[25] = probeMatrix[2][1];
    sceneData[26] = probeMatrix[2][2];
    sceneData[27] = probeMatrix[2][3];
    sceneData[28] = probeMatrix[3][0];
    sceneData[29] = probeMatrix[3][1];
    sceneData[30] = probeMatrix[3][2];
    sceneData[31] = probeMatrix[3][3];
    sceneData[32] = flexion;
    sceneData[33] = varus_valgus;
    sceneData[34] = (float)msgNum;
    //Send the tool position
    //first send data type
    tUInt32 dataType = UPDATE_SCENE;
    send(sendingSocket, (char*)&dataType, sizeof(tUInt32), 0);
    //CHS_CONSOLE.writeFormat(1,2, "sending command: %d", dataType);
    //then send data size
    tUInt32 dataSize = 35*sizeof(float);
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
    //CHS_CONSOLE.writeFormat(1,3, "sending dataSize: %d", dataSize);
    //then send data
    send(sendingSocket, (char*)&sceneData[0], dataSize, 0);
    delete[] sceneData; //release the buffer (missing in the original listing)
}
tVoid cCHSSimulatorCommunication::sendInitializeScene(cString Anatomy, int
    left_right, int toolMedialSide, int scopeDegree){
}
tVoid cCHSSimulatorCommunication::sendToolSwitchSides(){
}
tVoid cCHSSimulatorCommunication::sendSwitchPressed(){
    tInt32 mCommand = PRESSED_SWITCH;
    CHS_CONSOLE.writeFormat(1,16, "Sending switch pressed %i", mCommand);
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendTouchedFemurTibia(){
    tInt32 mCommand = TOUCHED_FEMURTIBIA;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendCartilageForce( float cartForce){
    tInt32 mCommand = CARTILAGE_FORCE; //original listing used TOUCHED_FEMURTIBIA, an apparent copy-paste error
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = sizeof( float);
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
    //then send data
    send(sendingSocket, (char*)&cartForce, dataSize, 0);
}
tVoid cCHSSimulatorCommunication::sendSuccessTest(){
    tInt32 mCommand = SUCCESS_TEST;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendGhostError( cString errorMsg){
    tInt32 mCommand = GHOST_ERROR;
    CHS_CONSOLE.writeFormat(1,1,"errorMsg %s", *errorMsg);
    //send message
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = errorMsg.size();
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
    //then send data
    send(sendingSocket, (char*)&errorMsg[0], dataSize, 0);
}
tVoid cCHSSimulatorCommunication::sendPlaySound( cString fileName){
    tInt32 mCommand = PLAY_SOUND;
    CHS_CONSOLE.writeFormat(1,2,"Mentor play: %s", *fileName);
    //send message
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = fileName.size();
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
    //then send data
    send(sendingSocket, (char*)&fileName[0], dataSize, 0);
}
tVoid cCHSSimulatorCommunication::sendExcessiveForceOnStructure(){
    tInt32 mCommand = EXCESSIVE_FORCE_ON_STRUCTURE;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendNeedleInBody(){
    tInt32 mCommand = NEEDLE_IN_BODY;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendSlowVelocity(){
    tInt32 mCommand = SLOW_VELOCITY;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
tVoid cCHSSimulatorCommunication::sendRedirection(){
    tInt32 mCommand = REDIRECTION;
    send(sendingSocket, (char*)&mCommand, sizeof( tInt32), 0);
    //then send data size
    tUInt32 dataSize = 0;
    send(sendingSocket, (char*)&dataSize, sizeof(tUInt32), 0);
}
#endif


D. AAOS Faculty Checklist for Evaluating Surgical Performance on a
Patient
American Academy of Orthopaedic Surgeons
Diagnostic Arthroscopy General Criteria:
Rate each of the following criteria using the rating scale noted.
Examinee:
Examiner:
Date:
Did the surgeon obtain proper visualization of the following
structures? (Use the rating scale below to determine the surgeon's rating
for each action. Score each bulleted item.)
1 (Severe or Not at all)   2 (Mild)   3 (Moderate)   4 (Good)   5 (Excellent)
Criteria Rating Weight
1 The time it took to perform the complete diagnostic examination of the knee, (minutes) 1: Procedure not completed. 2: Procedure completed >25 min. 3: Procedure completed 15-25 min. 4: Procedure completed 11 -<15 min. 5: Procedure completed <11 min. 6.2
2 There was iteration of movement. 1: There was gross iteration of movement. 2: There was moderate iteration of movement. 3: There was mild iteration of movement. 4: There were a few iteration movements during the examination, but otherwise effectively carried out. 5: There was no iteration of movement. 6.2


3 The surgeon was gentle with the use of the arthroscope and instruments. 1: The surgeon caused significant intra-articular injury with the arthroscope and instruments. 2: The surgeon caused moderate intra-articular injury with the arthroscope and instruments. 3: The surgeon caused mild intra-articular injury with the arthroscope and instruments. 4: The surgeon caused minor intra-articular injury with the arthroscope and instruments. 5: The surgeon was completely gentle with the use of the arthroscope and instruments. 7.0
4 The diagnostic sweep of the knee joint was conducted in the correct order according to pre-established criteria. 1: Four or more parts of the exam were out of order. 2: Three parts of the exam were out of order. 3: Two parts of the exam were out of order. 4: One part of the exam was out of order. 5: The sweep of the knee was conducted in the correct order and properly visualized. 3.4
5 The diagnostic examination of the knee was thorough and complete with and without the probe. 1: Less than 70% of examination accomplished. 2: Between 70% and 80% accomplished. 3: Between 80% and 90% accomplished. 4: Between 90% and 95% accomplished. 5: Between 95% and 100% accomplished. 7.0
6 The skill at triangulation between arthroscope and introduced instrument. 1: The skill at triangulation between arthroscope and introduced instrument was poor, with the surgeon occasionally spending more than one minute locating the introduced instrument. 2: The skill at triangulation between arthroscope and introduced instrument was fair, with the surgeon occasionally spending more than 20 seconds locating the introduced instrument. 3: The skill at triangulation between arthroscope and introduced instrument was good, but occasionally the surgeon spent more than 15 seconds locating the introduced instrument. 4: The skill at triangulation between arthroscope and introduced instrument was good. 5: The skill at triangulation between arthroscope and introduced instrument was excellent. 7.0
7 The 30 degree arthroscope was aligned properly in order to maximally visualize the structure that he/she was examining. 1: The 30-degree arthroscope was poorly aligned, and visualization was poor. 2: The 30-degree arthroscope was moderately poorly malaligned, and visualization was not adequate. 3: The 30-degree arthroscope was mal-aligned frequently and visualization of the anatomic structures was incomplete and only fair. 4: The 30-degree arthroscope was occasionally aligned poorly, but visualization of anatomic structures was adequate. 5: The 30-degree arthroscope was aligned properly in order to maximally visualize anatomic structures. 5.4
8 The orientation of the arthroscopic video camera was correct. 1: The orientation of the arthroscopic video camera was poor, and visualization was poor. Camera is consistently out of plane with resulting loss of any effective visualization. 2: The orientation of the arthroscopic video camera was moderately poor, and visualization was not adequate. Camera is consistently out of plane, significantly affecting visualization of the compartments. 3: The arthroscopic video camera was malaligned frequently and visualization of the anatomic structures was incomplete and only fair. Camera is intermittently out of plane resulting in altered visualization of the compartments. 4: The arthroscopic video camera was occasionally aligned poorly, but visualization of anatomic structures was adequate. Camera is occasionally out of plane but compartments are seen well. 5: The arthroscopic video camera was aligned properly in order to maximally visualize anatomic structures. Camera is in plane consistently. 6.2
9 The tip of the arthroscope was positioned correctly throughout the examination. 1: The tip of the arthroscope was grossly malpositioned during the examination. 2: The tip of the arthroscope was moderately malpositioned, and anatomic structures were moderately incompletely visualized. 3: The tip of the arthroscope was mildly malpositioned, and anatomic structures were incompletely visualized. 4: The tip of the arthroscope was not quite positioned correctly, but anatomic structures were completely visualized. 5: The tip of the arthroscope was positioned correctly throughout the examination. 6.8
10 The attending surgeon was required to intervene during the procedure. 1: Attending surgeon intervened 3 or more times. 2: Attending surgeon intervened 2 times. 3: Attending surgeon intervened once. 4: Attending surgeon gave verbal guidance during the procedure. 5: Attending surgeon did not have to intervene during the diagnostic examination. 6.5
Diagnostic Arthroscopy Specific Criteria:
Did the surgeon obtain proper visualization of the following
structures? (Use the 7 point scale to determine the surgeon's rating
for each action. Score each bulleted item.)
1 2 3 4 5 6 7
Poor Fair Average Good Excellent
Structure Score Weight
1 Suprapatellar pouch:
In flow cannula visualized 3.4
1: Cannula not visualized.
2: Scope orientation incorrect, not in suprapatellar pouch, cannula partially visualized. 3: Scope orientation incorrect, in suprapatellar pouch, cannula partially visualized. 4: Scope orientation correct, cannula partially visualized. 5: Scope orientation correct, cannula fully visualized.
Suprapatellar plica (if present) 1: Suprapatellar plica not visualized. 2: Scope orientation incorrect, suprapatellar plica partially visualized. 3: Scope orientation incorrect, suprapatellar plica partially visualized. 4: Scope orientation correct, suprapatellar plica partially visualized. 5: Scope orientation correct, suprapatellar plica visualized completely. 1.0
Medial parapatellar plicae (if present) 1: Medial parapatellar plica not visualized. 2: Scope orientation incorrect, medial parapatellar plica inadequately visualized. 3: Scope orientation incorrect, medial parapatellar plica partially visualized. 4: Scope orientation correct, medial parapatellar plica partially visualized. 5: Scope orientation correct, suprapatellar plica visualized completely. 2.6
2 Patello-femoral joint visualization: Patellar surface 1: Patellar surface not seen. 2: Scope orientation incorrect, less than 50% patellar surface seen. 3: Scope orientation incorrect, less than 75% patellar surface seen. 4: Scope orientation correct, less than 75% patellar surface seen. 5: scope orientation correct, 100% patellar surface seen. 6.4
* Articular surface of trochlear sulcus 1: Articular surface of trochlear sulcus not seen. 2: Scope orientation incorrect, less than 50% trochlear articular surface seen. 3: Scope orientation incorrect, less than 75% trochlear articular surface seen. 4: Scope orientation correct, less than 75% trochlear articular surface seen. 5: Scope orientation correct, 100% trochlear articular surface seen. 6.6
* Lateral femoral side of the trochlea 1: Lateral femoral side of trochlear groove not seen. 2: Scope orientation incorrect, less than 50% lateral femoral side of trochlear groove seen. 3: Scope orientation incorrect, less than 75% of lateral femoral side of trochlear groove seen. 4: Scope orientation correct, less than 75% lateral femoral side of trochlear groove seen. 5: Scope orientation correct, 100% of lateral femoral side of trochlear groove seen. 5.6


* Medial femoral side of the trochlea 1: Medial femoral side of trochlear groove not seen. 2: Scope orientation incorrect, less than 50% medial femoral side of trochlear groove seen. 3: Scope orientation incorrect, less than 75% of medial femoral side of trochlear groove seen. 4: Scope orientation correct, less than 75% medial femoral side of trochlear groove seen. 5: Scope orientation correct, 100% of lateral femoral side of trochlear groove seen. 5.2
* Patello-femoral articulation at the appropriate degree of flexion 1: Patello-femoral joint visualization not seen. 2: Scope orientation incorrect, less than 50% patello-femoral joint seen. 3: Scope orientation incorrect, less than 75% of patello-femoral joint seen. 4: Scope orientation correct, less than 75% patello-femoral joint seen. 5: Scope orientation correct, 100% patello-femoral joint seen. 6.2
3 Medial recess: look for loose bodies, adhesions, spurs, synovitis: 1: Medial recess not visualized. 2: Scope orientation incorrect, less than 50% medial recess seen. 3: Scope orientation incorrect, less than 75% of medial recess seen. 4: Scope orientation correct, less than 75% medial recess seen. 5: Scope orientation correct, 100% medial recess seen. 4.2
4 Medial compartment visualization: Anterior aspect of the medial femoral condyle 1: Anterior aspect of medial condyle not seen. 2: Scope orientation incorrect, less than 50% anterior aspect medial condyle seen. 3: Scope orientation incorrect, less than 75% of anterior aspect medial condyle seen. 4: Scope orientation correct, less than 75% anterior aspect medial condyle seen. 5: Scope orientation correct, 100% anterior aspect medial condyle seen. 4.4


Posterior aspect of medial femoral condyle, viewed at 90 degrees flexion 1: Posterior aspect of medial condyle not seen. 2: Scope orientation incorrect, flexion less than 70, less than 50% posterior aspect of medial condyle seen. 3: Scope orientation incorrect, flexion angle correct, less than 75% of posterior aspect of medial condyle seen. 4: Scope orientation correct, flexion angle correct, less than 75% posterior aspect of medial condyle seen. 5: Scope orientation correct, flexion angle correct, 100% posterior aspect of medial condyle seen. 6.6
Medial tibial plateau 1: Medial tibial plateau not seen. 2: Scope orientation incorrect, flexion angle incorrect (>50 flexion), less than 50% medial tibial plateau seen. 3: Scope orientation incorrect, flexion angle correct (20-30flexion), less than 75% of medial tibial plateau seen. 4: Scope orientation correct, flexion angle correct, less than 75% medial tibial plateau seen. 5: Scope orientation correct, flexion angle correct, 100% medial tibial plateau seen. 7.0
Appropriate valgus force applied to knee at between 20- 30 degrees flexion 1: Flexion angle >50, no valgus force applied. 2: Flexion angle >50, mild valgus force applied. 3: Flexion angle >50, moderate valgus force applied. 4: Flexion angle correct, inadequate valgus force applied. 5: Flexion angle correct, adequate valgus force applied. 6.8
Arthroscope viewing posteriorly (not up, down, anteriorly) 1: Flexion angle >50, scope oriented anteriorly. 2: Flexion angle >50, scope oriented up or down. 3: Flexion angle >50, scope oriented posteriorly. 4: Flexion angle correct, scope oriented anteriorly, up, or down. 5: Flexion angle correct, scope oriented posteriorly 6.8
* Posterior horn of medial meniscus 1: Flexion angle >50, scope oriented anteriorly, insufficient valgus force applied, posterior horn not seen. 2: Flexion angle >50, scope oriented up or down, insufficient valgus force applied, posterior horn partially seen. 3: Flexion angle >50, scope oriented posteriorly, moderate valgus force applied, posterior horn partially seen. 4: Flexion angle correct, scope oriented anteriorly, up, or down, moderate valgus force applied, posterior horn partially seen. 5: Flexion angle correct, scope oriented posteriorly, adequate valgus force applied, posterior horn completely seen (if within anatomic limits). 7.0
* Mid-third of the medial meniscus 1: Flexion angle >50, scope oriented anteriorly, insufficient valgus force applied, mid- third of meniscus not seen. 2: Flexion angle >50, scope oriented up or down, insufficient valgus force applied, mid- third of meniscus partially seen. 3: Flexion angle >50, scope oriented posteriorly, moderate valgus force applied, mid-third of meniscus partially seen. 4: Flexion angle correct, scope oriented anteriorly, up, or down, moderate valgus force applied, mid-third of meniscus partially seen. 5: Flexion angle correct, scope oriented posteriorly, adequate valgus force applied, mid-third of meniscus completely seen. 7.0
Anterior horn of the medial meniscus 1: Anterior horn not seen. 2: Scope orientation incorrect, less than 50% anterior horn seen. 3: Scope orientation incorrect, less than 75% anterior horn seen. 4: Scope orientation correct, less than 75% anterior horn seen. 5: Scope orientation down or anterior, anterior horn clearly seen (within anatomic limits). 2.8
5 Intercondylar notch visualization: Scope moved over top of ligamentum mucosum 1: Scope not passed over the ligamentum mucosum or the ligamentum mucosum is not visualized. 2: The ligamentum is visualized but the camera/scope is poorly orientated and the scope is not passed over it. 3: The ligamentum is visualized and either the scope/camera is poorly orientated or the scope is not passed over it. 4: The ligamentum is visualized, the scope/camera orientation is good but the scope is not passed over it. 5: The ligamentum is visualized, the scope/camera is oriented correctly and the scope is passed over it with smooth movements. 4.2
ACL 1: ACL is not seen. 2: Scope orientation incorrect, less than 25% of ACL seen. 3: Scope orientation incorrect, less than 75% of ACL seen. 4: Scope orientation correct, less than 75% of ACL seen. 5: Scope orientation correct, ACL completely seen. 6.8
PCL 1: PCL is not seen. 2: Scope orientation incorrect, less than 25% of PCL seen. 3: Scope orientation incorrect, less than 75% of PCL seen. 4: Scope orientation correct, less than 75% of PCL seen. 5: Scope orientation correct, PCL completely seen. 4.4
6 Lateral compartment visualization:
Anterior aspect of the lateral femoral condyle (weight 4.8)
1: Anterior aspect of lateral condyle not seen.
2: Scope orientation incorrect, less than 50% of anterior aspect of lateral condyle seen.
3: Scope orientation incorrect, less than 75% of anterior aspect of lateral condyle seen.
4: Scope orientation correct, less than 75% of anterior aspect of lateral condyle seen.
5: Scope orientation correct, 100% of anterior aspect of lateral condyle seen.
Posterior aspect of the lateral femoral condyle, viewed at 90 degrees flexion (weight 5.6)
1: Posterior aspect of lateral condyle not seen.
2: Scope orientation incorrect, flexion less than 70, less than 50% of posterior aspect of lateral condyle seen.
3: Scope orientation incorrect, flexion angle correct, less than 75% of posterior aspect of lateral condyle seen.
4: Scope orientation correct, flexion angle correct, less than 75% of posterior aspect of lateral condyle seen.
5: Scope orientation correct, flexion angle correct, 100% of posterior aspect of lateral condyle seen.
Lateral tibial plateau (weight 7.0)
1: Lateral tibial plateau not seen.
2: Scope orientation incorrect, less than 50% of lateral tibial plateau seen.
3: Scope orientation incorrect, less than 75% of lateral tibial plateau seen.
4: Scope orientation correct, less than 75% of lateral tibial plateau seen.
5: Scope orientation correct, 100% of lateral tibial plateau seen.
* Appropriate varus force applied to knee at between 20-30 degrees flexion or figure-4 position (weight 6.8)
1: No varus force applied.
2: Mild varus force applied, incorrect angle of flexion.
3: Moderate varus force applied, incorrect angle of flexion.
4: Moderate varus force applied, correct angle of flexion.
5: Adequate varus force applied, correct angle of flexion.
Arthroscope viewing posteriorly (not up, down, or anteriorly) (weight 6.2)
1: Scope oriented anteriorly.
2: (Between 1 and 3.)
3: Scope oriented up or down.
4: (Between 3 and 5.)
5: Scope oriented posteriorly.
* Posterior horn of lateral meniscus (weight 7.0)
1: Posterior horn not seen.
2: Scope oriented anteriorly, insufficient varus force applied, posterior horn only partially seen.
3: Scope oriented anteriorly, up, or down, moderate varus force applied, posterior horn only partially seen.
4: Scope oriented posteriorly, moderate varus force applied, less than 75% of posterior horn seen.
5: Scope oriented posteriorly, adequate varus force applied, posterior horn completely seen (if within anatomic limits).
* Mid-third of the lateral meniscus (weight 7.0)
1: Mid-third of meniscus not seen.
2: Scope oriented anteriorly, insufficient varus force applied, mid-third of meniscus partially seen.
3: Scope oriented anteriorly, up, or down, moderate varus force applied, mid-third of meniscus partially seen.
4: Scope oriented posteriorly, moderate varus force applied, mid-third of meniscus partially seen.
5: Scope oriented posteriorly, adequate varus force applied, mid-third of meniscus completely seen.
Anterior horn of the lateral meniscus (weight 6.8)
1: Anterior horn of lateral meniscus not seen.
2: Scope oriented posteriorly, insufficient varus force applied, anterior horn of meniscus partially seen.
3: Scope orientation incorrect, less than 75% of anterior horn seen.
4: Scope orientation correct, less than 75% of anterior horn seen.
5: Scope oriented down or anterior, anterior horn completely seen.
7 Lateral gutter: look for loose bodies, spurs, adhesions, synovitis, popliteus tendon (weight 5.2)
1: Lateral gutter not visualized.
2: Scope orientation incorrect, less than 50% of lateral gutter seen.
3: Scope orientation incorrect, less than 75% of lateral gutter seen.
4: Scope orientation correct, less than 75% of lateral gutter seen.
5: Scope orientation correct, 100% of lateral gutter seen.
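Each checklist item pairs a 1-5 rating with a weight (2.2-7.0), which implies that an overall section score is some weighted combination of the item ratings. The thesis does not spell out the aggregation formula, so the function name and the weighted-average rule below are assumptions; this is a minimal sketch, not the simulator's actual scoring code.

```python
def weighted_score(items):
    """Combine per-item (rating, weight) pairs into one score.

    Each rating is on the rubric's 1-5 scale; each weight is the
    item's listed importance (e.g. 7.0 for the mid-third of the
    meniscus). ASSUMPTION: the rubric does not state this formula;
    a weighted average is one plausible aggregation.
    """
    total_weight = sum(w for _, w in items)
    return sum(r * w for r, w in items) / total_weight

# Example: three lateral-compartment items rated 4, 5, and 3
# with the weights listed above (4.8, 5.6, 7.0) -> about 3.92.
overall = weighted_score([(4, 4.8), (5, 5.6), (3, 7.0)])
```

Dividing by the total weight keeps the result on the same 1-5 scale as the individual ratings, so a trainee's aggregate can be read against the same descriptors.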
Diagnostic Arthroscopy Probing Criteria:
Did the surgeon obtain proper visualization of the following structures? (Use the 7-point scale to determine the surgeon's rating for each action. Score each bulleted item.)
Rating scale: 1 (Poor) through 7 (Excellent), with Fair, Average, and Good in between.
Each criterion below lists descriptions for scores 1-5 and carries a weight.
1 Introduction of 18-gauge needle:
* 18-gauge needle introduced parallel to the medial tibial plateau, close to the superior surface of the anterior horn of the medial meniscus (weight 3.8)
1: Scope orientation incorrect, or difficulty visualizing needle; incorrect angle of introduction of needle.
2: Scope orientation incorrect, needle impales femoral condyle or tibial plateau.
3: Needle angle not quite correct, does not impale femoral condyle or tibial plateau.
4: Scope orientation incorrect, needle introduced parallel to tibial plateau, close to superior surface of anterior horn of medial meniscus.
5: Scope orientation correct, needle introduced parallel to tibial plateau, close to superior surface of anterior horn of medial meniscus.
* 18-gauge needle passed to posterior horn of medial meniscus (weight 3.8)
1: Scope orientation incorrect, or difficulty visualizing needle; incorrect angle of introduction of needle.
2: Scope orientation incorrect, needle impales femoral condyle or tibial plateau.
3: Needle angle not quite correct, does not impale femoral condyle or tibial plateau, cannot reach superior surface of posterior horn of medial meniscus.
4: Scope orientation incorrect, needle introduced parallel to tibial plateau, touches superior surface of posterior horn of medial meniscus.
5: Scope orientation correct, needle introduced parallel to tibial plateau, touches superior surface of posterior horn of medial meniscus.
2 Obturator, then nerve hook introduced (weight 2.2)
1: Scope orientation incorrect, difficulty visualizing obturator.
2: Scope orientation incorrect, obturator too high or too low in the joint.
3: Scope orientation incorrect, obturator introduced in proper position, some difficulty visualizing obturator.
4: Scope orientation incorrect, obturator easily visualized in proper position.
5: Scope orientation correct, obturator easily visualized in proper position.
3 Medial meniscus probing:
Posterior horn (weight 7.0)
1: Saw meniscus incompletely, no probing.
2: Saw posterior horn incompletely, probed ineffectively.
3: Saw posterior horn well, probed ineffectively.
4: Saw posterior horn well, probed well enough to demonstrate gross tear.
5: Saw posterior horn well, and probed posterior horn superior and inferior surfaces efficiently.
Mid-third inferior and superior surfaces (weight 7.0)
1: Saw mid-third incompletely, no probing.
2: Saw mid-third incompletely, probed ineffectively.
3: Saw mid-third well, probed ineffectively.
4: Saw mid-third well, probed well enough to demonstrate gross tear.
5: Saw mid-third well, probed superior and inferior surfaces of mid-third efficiently.
Anterior horn (weight 2.4)
1: Saw poorly, no probing.
2: Saw poorly, not probed effectively.
3: Saw well and probed at center, not peripherally.
4: Saw well, used probe to demonstrate superior surface.
5: Saw well, used probe for assistance, saw full anterior surface.
4 Probing articular surfaces:
Medial femoral condyle (weight 2.8)
1: Either the medial condyle was not visualized or the probe was not used to touch the surface.
2: The camera/scope was oriented poorly and the probing was done in a rough or limited way.
3: Either the camera/scope was oriented poorly, or the probe was used in a rough way over a small area.
4: The scope and camera were oriented correctly and the probe was used to touch the surface, but either the probe use was rough or the probing was done over a small area.
5: The scope was oriented correctly, the camera was oriented correctly, the surface was visualized well, and the surface was touched with the probe using smooth, non-invasive strokes over the extended and flexed surfaces.
Medial tibial plateau (weight 2.8)
1: Either the medial tibial plateau was not visualized or the probe was not used to touch the surface.
2: The camera/scope was oriented poorly and the probing was done in a rough or limited way.
3: Either the camera/scope was oriented poorly, or the probe was used in a rough way over a small area.
4: The scope and camera were oriented correctly and the probe was used to touch the surface, but either the probe use was rough or the probing was done over a small area.
5: The scope was oriented correctly, the camera was oriented correctly, the surface was visualized well, and the surface was touched with the probe using smooth, non-invasive strokes over the anterior and posterior portions of the surface.
5 Probing: PCL and PCL synovium (weight 3.0)
1: PCL is either not seen or the probe is not used to touch the ligament or covering synovium.
2: The scope/camera is not oriented correctly and the probe is not used effectively to touch and pull on the PCL and covering synovium.
3: Either the scope/camera is not oriented correctly or the probe is not used effectively to touch and pull on the PCL and covering synovium.
4: The scope/camera is oriented correctly, but the probe is used only briefly to touch the PCL without pulling and testing the tissue.
5: The camera/scope is oriented correctly and the probe is used to touch and pull on the PCL and covering synovium.
6 Arthroscope and probe passed over ligamentum mucosum (weight 4.0)
1: Neither the scope nor the probe is passed over the ligamentum mucosum, or the ligamentum mucosum is not visualized.
2: The ligamentum is visualized, but the camera/scope is poorly oriented and the scope and/or probe are not passed over it.
3: The ligamentum is visualized, and either the scope/camera is poorly oriented or the scope and/or probe are not passed over it.
4: The ligamentum is visualized and the scope/camera orientation is good, but either the probe or the scope is not passed over it.
5: The ligamentum is visualized, the scope/camera is oriented correctly, and the scope and the probe are passed over it with smooth movements.
7 Probing: ACL (weight 6.8)
1: ACL is not visualized or not probed.
2: Scope/camera is poorly oriented and the ligament is touched but not pulled to test for integrity.
3: Scope/camera is properly oriented, but the ligament is ineffectively probed.
4: ACL is visualized and scope/camera orientation is correct, but the ligament is only touched, not pulled and tested for integrity.
5: ACL is visualized, scope/camera is oriented correctly, and the ligament is probed and pulled from the lateral and medial side.
* Lachman test (weight 3.8)
1: Lachman test is not performed.
2: Lachman test is performed, but scope/camera orientation is poor, flexion angle is wrong, the test is performed poorly (too fast, too little force), and the ACL is probed ineffectively.
3: Lachman test is performed, but two of the following are done poorly: scope/camera orientation, flexion angle, performance of the test, efficiency of probing.
4: Lachman test is performed, but either scope/camera orientation is improper, or orientation is proper but the flexion angle is wrong or the probing is poor.
5: Lachman test is performed between 20-40 deg. with proper scope/camera orientation, and the ACL is probed efficiently.
8 Probing lateral meniscus:
Posterior horn inferior and superior surfaces (weight 7.0)
1: Saw the posterior horn incompletely, no probing.
2: Saw posterior horn incompletely, probed ineffectively.
3: Saw posterior horn well, probed ineffectively.
4: Saw posterior horn well, probed well enough to demonstrate gross tear.
5: Saw posterior horn well, and probed posterior horn superior and inferior surfaces efficiently.
Mid-third lateral meniscus inferior and superior surfaces and popliteus hiatus (weight 7.0)
1: Saw mid-third incompletely, no probing.
2: Saw mid-third incompletely, probed ineffectively.
3: Saw mid-third well, probed ineffectively.
4: Saw mid-third well, probed well enough to demonstrate gross tear.
5: Saw mid-third well, probed superior and inferior surfaces of mid-third efficiently.
Anterior horn (weight 5.8)
1: Saw poorly, no probing.
2: Saw poorly, not probed effectively.
3: Saw well and probed at center, not peripherally.
4: Saw well, used probe to demonstrate superior surface.
5: Saw well, used probe for assistance, saw full anterior surface.
9 Probing articular surfaces:
Lateral femoral condyle (weight 3.2)
1: Either the lateral condyle was not visualized or the probe was not used to touch the surface.
2: The camera/scope was oriented poorly and the probing was done in a rough or limited way.
3: Either the camera/scope was oriented poorly, or the probe was used in a rough way over a small area.
4: The scope and camera were oriented correctly and the probe was used to touch the surface, but either the probe use was rough or the probing was done over a small area.
5: The scope was oriented correctly, the camera was oriented correctly, and the articular surface was seen and probed well through the range of motion from the sulcus to 90 deg of flexion.
Lateral tibial plateau (weight 3.2)
1: Either the lateral tibial plateau was not visualized or the probe was not used to touch the surface.
2: The camera/scope was oriented poorly and the probing was done in a rough or limited way.
3: Either the camera/scope was oriented poorly, or the probe was used in a rough way over a small area.
4: The scope and camera were oriented correctly and the probe was used to touch the surface, but either the probe use was rough or the probing was done over a small area.
5: The scope was oriented correctly, the camera was oriented correctly, the surface was visualized well, and the surface was touched with the probe using smooth, non-invasive strokes over the anterior and posterior portions of the surface.
10 Posteromedial compartment:
* Probe placed under posterior horn of medial meniscus (weight 3.0)
1: Attempted, but probe never touched posterior horn.
2: Scope looking anteriorly, probe placed in the vicinity of the posterior horn of the medial meniscus.
3: Probe placed over the superior surface of the posterior horn of the medial meniscus.
4: After 2 or more attempts, the probe was placed under the posterior horn of the medial meniscus.
5: Viewing from the front of the joint, the probe was visualized and placed directly under the posterior horn of the medial meniscus.
* Posterior horn of medial meniscus probed, including menisco-synovial junction, with valgus force at 30-60 deg. flexion (weight 3.6)
1: Either scope not in posteromedial compartment or probe not brought into view.
2: Probe under posterior horn of medial meniscus, superior surface not probed.
3: More than one attempt to get the probe in view, posterior horn poorly seen and inadequately probed.
4: Probe brought into view, but scope not oriented to view between 90-180.
5: Probe efficiently brought into view, posterior horn of medial meniscus probed with the scope viewing between 90-180.
Posteromedial compartment visualized (weight 3.4)
1: Surgeon unable to enter posteromedial compartment, either by directly visualizing the passage or by feeling one's way into the posteromedial compartment with an obturator in the scope sheath.
2: Scope in the passage to the posteromedial compartment, but not in the compartment, and less than 50% of the compartment seen.
3: Scope in posteromedial compartment, improper orientation of scope (not looking medially).
4: Scope oriented looking medially, medial compartment partially visualized with mild obstruction from medial femoral condyle.
5: Scope oriented looking medially, medial compartment visualized with minimal obstruction from medial femoral condyle.
Arthroscope rotated to view PCL (weight 5.4)
1: PCL not seen.
2: Scope too far in the medial compartment, or not far enough, and PCL only partially seen.
3: PCL partially seen, but scope oriented looking medially.
4: Scope oriented looking laterally, but not entire inferior one half of PCL visualized.
5: Scope oriented looking laterally, pulled back slightly from posteromedial compartment; PCL origin on tibia and tibial one half of PCL visualized.
11 Posterolateral compartment:
Scope moved to anteromedial portal, varus position, to visualize posterolateral compartment (weight 5.4)
1: Attempt to visualize posterolateral compartment with scope still in anterolateral portal, or scope moved to anteromedial portal but posterolateral compartment not located.
2: Scope moved to anteromedial portal, but posterolateral compartment not located.
3: Scope moved to anteromedial portal, but posterolateral compartment only partially visualized because of poor scope position.
4: Scope in anteromedial portal, directed to posterolateral compartment, not visualizing between 180-270.
5: Scope in anteromedial portal, directed to posterolateral compartment, visualizing between 180-270.
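A virtual mentor that walks a trainee through this checklist needs the rubric in a machine-readable form. One possible encoding is sketched below; the class and field names are hypothetical (the thesis does not describe its internal data structures), and only two of the five level descriptions are shown for brevity.

```python
from dataclasses import dataclass


@dataclass
class ChecklistItem:
    """One scored item from the diagnostic sweep checklist.

    ASSUMPTION: this structure is illustrative only; the thesis
    does not specify how the rubric is stored in software.
    """
    name: str
    weight: float            # importance, as listed in the rubric
    levels: dict             # score (1-5) -> description of performance


# Example item, with the weight taken from the rubric above and
# abbreviated level descriptions (levels 2-4 omitted here).
pcl_probing = ChecklistItem(
    name="Probing: PCL and PCL synovium",
    weight=3.0,
    levels={
        1: "PCL not seen, or probe not used to touch ligament or synovium",
        5: "Camera/scope oriented correctly; probe used to touch and pull "
           "on the PCL and covering synovium",
    },
)
```

Storing the level descriptions alongside the score lets the mentor both grade a performance and explain to the trainee what the next score level would have required.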
The arthroscope must be accurately oriented to view the anatomy correctly and completely. If the anatomy is viewed with the scope inaccurately oriented, a reduction of 1 point is made.
Efficiency is a rating factor for each item included on the checklist. It is acceptable for a study participant to make more than one attempt to complete a step in the sweep; however, if repeated attempts are needed to finally visualize and/or probe a landmark, the rating should be reduced by at least 1 point.
When probing the condyle, representative areas of the weight-bearing surface should be probed, including medial, central, and lateral.
A systematic approach is desired to correctly complete the diagnostic sweep.
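The deduction rules above (1 point off for an inaccurately oriented scope, at least 1 point off for repeated attempts) can be sketched as a clamped adjustment to a base item rating. The function name, the attempts threshold, and the fixed 1-point repeat penalty are assumptions: the rubric says "at least 1 point" and does not define exactly when retries become "repeated," so this sketch treats one retry as acceptable and penalizes a third or later attempt.

```python
def adjusted_rating(base, misoriented=False, attempts=1):
    """Apply the rubric's deductions to a base 1-5 item rating.

    ASSUMPTIONS: the repeat penalty is exactly 1 point (the rubric
    says "at least 1"), and "repeated attempts" means 3 or more
    (one retry is explicitly acceptable). The result is clamped
    to the scale minimum of 1.
    """
    rating = base
    if misoriented:
        rating -= 1          # scope inaccurately oriented: -1 point
    if attempts > 2:
        rating -= 1          # repeated attempts: at least -1 point
    return max(1, rating)


# Example: a 5-rated step viewed with the scope misoriented -> 4
example = adjusted_rating(5, misoriented=True)
```

Clamping at 1 keeps every adjusted value on the rubric's own scale rather than letting stacked deductions produce a score the checklist cannot express.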
REFERENCES
"Bruce Tognazzini." Wikipedia, the Free Encyclopedia. 14 Feb 2006.
Ginda, Robert. "What is JavaScript?" 9 September 2004.
Harold, Elliotte Rusty. "Why Java's Hot." The Java Developer Resource. 2000.
Hewett, Baecker, Card, Carey, Gasen, Mantei, Perlman, Strong, and Verplank. ACM SIGCHI Curricula for Human-Computer Interaction. 1996.
"Human-Computer Interaction." Wikipedia, the Free Encyclopedia. 15 March 2006.
"Instructional Technology/E-Learning." Wikibooks, the Open-Content Textbooks Collection. 7 August 2005.
"Learning Management System." Wikibooks, the Open-Content Textbooks Collection. 28 Feb 2006.
Nielsen, Jakob. "Ten Usability Heuristics." Online post.
Nielsen, Jakob. "Usability 101: Introduction to Usability." Alertbox. Online newsletter. 25 August 2003.
Richtel, Matt. "Making Web Sites More Usable Is Former Sun Engineer's Goal." The New York Times on the Web. 13 July 1998.
Rosson, Mary Beth, and John M. Carroll. Usability Engineering: Scenario-Based Development of Human-Computer Interaction. San Francisco: Morgan Kaufmann, 2002.
Schildt, Herbert. C++: The Complete Reference. Berkeley: Osborne McGraw-Hill, 1991. pp. 257-267.
Sperberg-McQueen, C.M., and Lou Burnard, eds. "A Gentle Introduction to SGML." Guidelines for Electronic Text Encoding and Interchange. 1994.
Sun Microsystems. The Java Language Environment. White paper. 1997.
Tognazzini, Bruce. "First Principles of Interaction Design." AskTog online column.
Vaas, Lisa. "The E-Training of America." PC Magazine, December 2001.
Walsh, Norman. "A Technical Introduction to XML." Website. 3 October 1998.
World Wide Web Consortium, Massachusetts Institute of Technology, European Research Consortium for Informatics and Mathematics, Keio University, eds. HyperText Markup Language Homepage. Online post. 2002.