Citation
The evaluation of management information systems

Material Information

Title:
The evaluation of management information systems: Artemis as a management information system for the space station program
Creator:
Whittington, Harold W
Publication Date:
Language:
English
Physical Description:
xxii, 408 leaves : illustrations, forms ; 28 cm

Subjects

Subjects / Keywords:
Management information systems -- Case studies ( lcsh )
Space stations ( lcsh )
Management information systems ( fast )
Space stations ( fast )
Genre:
Case studies. ( fast )
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )

Notes

Bibliography:
Includes bibliographical references (leaves 300-317).
General Note:
Submitted in partial fulfillment of the requirements for the degree, Doctor of Philosophy in Public Administration, Graduate School of Public Affairs.
General Note:
School of Public Affairs
Statement of Responsibility:
by Harold W. Whittington.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
23066120 ( OCLC )
ocm23066120
Classification:
LD1190.P86 1989D .W447 ( lcc )

Full Text
THE EVALUATION OF MANAGEMENT
INFORMATION SYSTEMS: ARTEMIS AS A
MANAGEMENT INFORMATION SYSTEM FOR
THE SPACE STATION PROGRAM
by
Harold W. Whittington
B.S., East Texas State College, 1961
M.S., East Texas State College, 1963
M.S., The University of Southern California, 1967
A thesis submitted to the
Faculty of the Graduate School of Public Affairs
of the
University of Colorado in partial fulfillment
of the requirements for the degree of
Doctor of Philosophy in Public Administration
Graduate School of Public Affairs
1989


Whittington, Harold Wayne (Ph.D., Public
Administration)
The Evaluation of Management Information Systems:
Artemis as a Management Information System for
the Space Station Program
Thesis directed by Professor Sam Overman
On January 20, 1984, President Ronald
Reagan directed NASA to develop a permanently
manned space station within a decade. Soon after
the announcement NASA decided to implement an
all-inclusive technical and management
information system, referred to as TMIS, for the
Space Station Program. A subset of TMIS is a
business-oriented project control system (PCS)
called Artemis. Artemis has been in existence
for a decade and has had extensive use within the
Department of Defense (DOD) and the petrochemical
industry. By early 1987 Artemis had been in use
in the Space Station Program for more than two
years, yet no significant evaluation of its
effectiveness had been performed.
Pre- and post-implementation evaluations of
management information systems, such as Artemis,


present a theoretically difficult problem for
evaluation researchers. Any MIS depends upon
complex relationships between system performance,
the functions performed and the levels at which
they are performed, and the people who use the
MIS.
These three ingredients formed the
theoretical framework for an evaluation of the
effectiveness of Artemis as a Space Station MIS.
Statistical in approach and supported by data
acquired through written comments and follow-up
interviews, the evaluation was based upon
pair-wise comparison of the three ingredients
mentioned above.
It was concluded that Artemis is an
effective, but somewhat costly, MIS. It is as
effective for program planning as for management
control, a conclusion which contradicts a
proposition upon which the research was
initiated. Programmer/Analysts are more inclined
to look favorably upon Artemis than are top
managers and users, presumably because they are
less intimidated by its complexity. While it is
considered somewhat costly and there are some
concerns with regard to the timeliness of


responses associated with the use of the system,
it is considered an MIS capable of producing
large quantities of high quality information.
The broader implications of this research
deal with the use of evaluation research in the
post-installation evaluation of MIS's. This
approach appears very appropriate in such
situations. It also appears that the three
research ingredients are those, or certainly
among those, which best scope the parameters
against which MIS's should be evaluated.
The form and content of this abstract are
approved. I recommend its publication.




This thesis for the Doctor of Philosophy
degree by
Harold W. Whittington
has been approved for the
Graduate School
of Public Affairs
by
Date




The programmed computer is tireless, persistent,
and rapid in generating both meaningful and
meaningless results.
Edward A. Tomeski, August, 1975
Information means less than it says and systems
is an amorphous term, but a remarkable
metamorphosis occurs when the two are joined
together. The "information system" emerges as a
tidy and finite entity.
Ida Hoos, June, 1971


ACKNOWLEDGMENTS
The significant factors in this endeavor
have not been the three "r's", but the three
"f's", family, faculty, and friends. I would
like to acknowledge my thanks and appreciation to
all three.
The one person to whom I am most grateful
is my wife Dottie, who never doubted that I could
finish this dissertation, even when I was certain
I could not. While other wives were going to
dinner and movies, she was going to the library
to pay fines on my overdue books. To Mike, my
oldest son, who taught my youngest son Matt to
pitch while I was learning about the virtues of
paradigms, I am most grateful. To Steve, my
middle son who would announce, "the bear is in"
when I was writing and "the bear is out" when I
was occasionally available for parental things, I
would like to say that I appreciate your sense of
humor and patience. One of my fervent hopes is
that when I grow up I am as mature and in control
as Matt, my youngest son, who probably lost the
most of me during the PhD years.


Dr. Sam Overman assumed an important role
in the lives of the NASA Doctoral students.
Rather than remaining in Denver and allowing us
to fail on our own terms, he came to Houston to
ensure that we succeeded on his terms. Sam was
the single most important ingredient in the
completion of this dissertation. It was Dr. Jay
Shafritz, with his Philadelphia charm, who guided
us all through the course work and got us into
the position to write dissertations. He knocked
the edges off our egos and added rigor to our
thinking. I want to thank Dr. Bob Gage and Dr.
Mark Emmert for serving on my committee, and my
practitioner and friend of twenty years, Dr. Hum
Mandell.
The final "f's" go to my friend and
long-time neighbor, Dr. Vance Etnyre of the
University of Houston at Clear Lake, who assisted
me in both the overall approach to the research
and also in the analysis of the data, and to Dr.
Mike Hanna of the University of Houston at Clear
Lake who helped me understand the use of SAS and
the statistics behind it.
I would like to thank Dee Reid, Pat Durst,
Pete Padron and Ruby Talley, who typed my


dissertation, and finally I would like to thank
Dr. Scott Hamilton, a man I have never met, but
whose earlier work on post-installation reviews
cemented my thinking on the feasibility of this
research project.


CONTENTS
CHAPTER
I. INTRODUCTION............................ 1
Background ........................... 3
Statement of Research Objectives ..... 7
Research Problem ..................... 7
Approach ............................ 10
The Space Station Program ........... 13
Technical and Management
Information Systems (TMIS) .......... 18
Artemis ............................. 22
Summary ............................. 26
II. EVALUATING MANAGEMENT INFORMATION
SYSTEMS................................. 31
Introduction ........................ 31
Management Information Systems ...... 32
Evaluating Management Information
Systems ............................. 43
Levels of Management Function ....... 55
Planning Theories and
Concepts ........................ 62
Organizing Theories
and Concepts ................... 74
Management Control Theories
and Concepts ..................... 79
Operational Control Theories
and Concepts ..................... 85
Communication Theories and
Concepts ......................... 88


Types of Information ............... 93
Position in the Organization........ 101
The Research Framework ............. 110
Research Proposition ............... 113
Summary .......................... 115
III. DESIGN AND METHODOLOGIES............... 126
Research Design ..................... 126
Research Framework .................. 126
Population .......................... 128
Sample .............................. 128
The Survey Instrument ............... 131
Follow-up Interviews ................ 135
Data Analysis ....................... 135
The Case: Artemis ................... 136
Scheduling ....................... 140
Network Analysis ................. 145
Performance Measurement .......... 146
Management Graphics .............. 148
Other ............................ 150
IV. SUMMARY OF RESULTS..................... 157
Overall Analysis of Variance ........ 157
Adjusting the Sample ................ 159


Evaluation of Artemis by Level of
Management Function ................ 162
Planning ........................ 162
Organizing ...................... 171
Management Control ............. 174
Operational Control ............. 181
Communication ................... 186
Summary of Evaluation by Level of
Management Function ................ 196
Evaluation of Artemis by Type of
Information ........................ 199
Quality ......................... 203
Quantity ........................ 207
Cost ............................ 211
Time ............................ 221
Summary of Evaluation by Type of
Information ........................ 226
Evaluation of Artemis by Position in
the Organization ................... 228
Top Managers .................... 231
Users ........................... 241
Programmer/Analysts ............. 245
Implementors .................. 255
Summary of Results ................. 263


V. CONCLUSIONS .......................... 272
Artemis is an Effective
Space Station MIS .................. 273
Position in the Organization
Does Influence Opinion with
Regard to MIS Effectiveness ........ 274
The Types of Functions Performed
by a MIS are not Sufficient
Discriminators in the Evaluation
of MIS's ........................... 277
Quality, Quantity, Cost and Time
are viable, and often Conflicting,
Parameters to be used in MIS
Evaluation ......................... 278
Management and Organizational
Acceptance is the Most Important
Factor in MIS Acceptance ........... 280
Users Always Win ................... 283
Cost is a Relative Term in
MIS Evaluation ..................... 285
Why are Post Installation
Evaluations Rare? .................. 287
Difficulty of Evaluations ....... 287
Lack of Motivation .............. 288
MIS Enhancements ................ 289
Validity of Evaluation Research
for MIS Evaluation ................. 290
Broad Implications of the Research ..... 292
Epilog ................................. 296


BIBLIOGRAPHY ...................... .......... 300
APPENDIX
A. SURVEY INSTRUMENT ...................... 319
B. DETAILED DISCUSSIONS OF THE SPACE
STATION PROGRAM, TMIS AND ARTEMIS ...... 326
The Space Station Program ........... 329
The Space Station Organization ...... 340
The Space Station Technical and
Management Information System
(TMIS) .............................. 353
Artemis ............................. 365
Scheduling ....................... 370
Network Analysis ................. 374
Performance Measurement .......... 375
Management Graphics .............. 377
Other ............................ 381
Program Management in NASA.......... 382
Phase A (The Conceptual Phase) .. 387
Phase B (The Definition Phase) .. 392
Phase C (The Design Phase) ....... 398
Phase D (The Development Phase) . 401
Phase E (The Operations Phase) .. 402
Summary ............................. 403


TABLES
TABLE
1-1. Major Processes Supported by TMIS .... 20
2-1. Post Installation Review Activities
and Involvement of Salient Groups .... 49
2-2. Evaluation Activities Associated with
Performing Post Evaluation Results ... 50
4-1. Distribution of Responses ........... 161
4-2. Summary of Response Data by Means of
Functions ........................... 163
4-3. Evaluation of Artemis by Level of
Management Function Scheffe's Test
for Variation........................ 164
4-4. Evaluation of Artemis as a Space
Station Management Information
System Planning.................... 166
4-5. Evaluation of Artemis as a Space
Station Management Information
System Organizing ................... 176
4-6. Evaluation of Artemis as a Space
Station Management Information
System Management Control .............. 178
4-7. Evaluation of Artemis as a Space
Station Management Information
System Operational Control ............. 185
4-8. Evaluation of Artemis as a Space
Station Management Information
System Communication .............. 190
4-9. Evaluation of Artemis as a Space
Station Management Information
System Type of Information Versus
Level of Management Function......... 200
4-10.Evaluation of Artemis as a Space
Station Management Information
System Type of Information Versus
Position in the Organization.............. 201


4-11. Evaluation of Artemis by Type of
Information Scheffe's Test for
Variation ........................... 202
4-12. Summary of Response Data Evaluation
of Artemis as a Space Station
Information System Calculation of
Overages Quality .................... 204
4-13. Evaluation of Artemis as a Space
Station Management Information
System Quantity ..................... 208
4-14. Evaluation of Artemis as a Space
Station Management Information
System Cost ......................... 212
4-15. Summary of Interview Data Cost ...... 222
4-16. Evaluation of Artemis as a Space
Station Management Information
System Time ......................... 224
4-17. Evaluation of Artemis by Position
in the Organization Scheffe's Test
for Variation ....................... 230
4-18. Evaluation of Artemis as a Space
Station Management Information
System Position in the
Organization versus Level of
Management Function ................. 232
4-19. Evaluation of Artemis as a Space
Station Management Information
System Position in the
Organization versus Type of
Information ......................... 233
4-20. Evaluation of Artemis as a Space
Station Management Information
System Top Manager .................. 234


4-21.Evaluation of Artemis as a Space
Station Management Information
System Top Manager ..................... 236
4-22.Evaluation of Artemis as a Space
Station Management Information
System Top Manager ..................... 237
4-23.Evaluation of Artemis as a Space
Station Management Information
System - Users ..................... 246
4-24.Evaluation of Artemis as a Space
Station Management Information
System - Users .................... 247
4-25.Evaluation of Artemis as a Space
Station Management Information
System - Users ..................... 248
4-26.Evaluation of Artemis as a Space
Station Management Information
System - Programmer/Analysts ....... 253
4-27.Evaluation of Artemis as a Space
Station Management Information
System - Programmer/Analysts ....... 256
4-28.Evaluation of Artemis as a Space
Station Management Information
System - Programmer/Analysts ....... 257
4-29.Evaluation of Artemis as a Space
Station Management Information
System - Implementor ............... 264
4-30.Evaluation of Artemis as a Space
Station Management Information
System - Implementor ............... 265
4-31.Evaluation of Artemis as a Space
Station Management Information
System - Implementor ............... 266
4-32.Summary of Results .................. 213


FIGURES
FIGURE
1-1. Research Framework .................... 8
1-2. Space Station Schematic .............. 15
1-3. Overall Structure of Space Station ... 16
1-4. Space Station Program Organization ... 17
1-5. Space Station TMIS ................... 19
1-6. Space Station Artemis Network ........ 23
2-1. Comparison of Evaluation Approaches .. 46
2-2. Standard Network Drawing ............. 65
2-3. Space Station Master Schedule ........ 67
2-4. Work Breakdown Structure ............. 69
2-5. Integration of Planning and Control .. 72
2-6. Examples of Activities in a
Business Organization Included in
Major Framework Headings ............. 80
2-7. Four Elements of Management Control .. 84
2-8. The Traditional Management
Control Process ...................... 86
2-9. Research Framework .................. 111
3-1. Research Design ..................... 127
3-2. Space Station Master Schedule ....... 142
3-3. Space Station Program Management .... 143
3-4. Space Station Product Assurance
Long Range Plan...................... 144


3-5. Sample Network....................... 147
3-6. JSC Level C Cumulative FY87 Totals .. 149
3-7. Space Station Program/Projects
Office .............................. 151
3-8. Resources Required Project-Video
Camera................................ 152
4- 1. Evaluation of Artemis as a Space
Station Management Information
System - Total by Function........... 165
4-2. Evaluation of Artemis as a Space
Station Management Information
System - Planning.................... 167
4-3. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Planning................................... 170
4-4. Evaluation of Artemis as a Space
Station Management Information
System - Organization ............... 175
4-5. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Organizing................................. 177
4-6. Evaluation of Artemis as a Space
Station Management
Information System Function of
Management Control ........................ 179
4-7. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Management Control ........................ 182
4-8. Evaluation of Artemis as a Space
Station Management Information
System Function of Operations
Control ................................... 187


4-9. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Operational Control ................. 188
4-10. Total Received vs Total Dist. FY88 .. 192
4-11. Total Received by 3 Digit UPN ....... 193
4-12. Evaluation of Artemis as a Space
Station Management Information
System Function of Communication .... 197
4-13. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Communication ....................... 198
4-14. Evaluation of Artemis as a Space
Station Management Information
System Quality ...................... 205
4-15. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Quality ............................. 206
4-16. Evaluation of Artemis as a Space
Station Management Information
System Quantity ..................... 209
4-17. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Quantity ............................ 210
4-18. Evaluation of Artemis as a Space
Station Management Information
System Cost ......................... 214
4-19. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Cost ................................ 215


4-20. Evaluation of Artemis as a Space
Station Management Information
System Time ......................... 225
4-21. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Time ................................ 227
4-22. Evaluation of Artemis as a Space
Station Management Information
System Top Managers by Function ..... 238
4-23. Evaluation of Artemis as a Space
Station Management Information
System Top Managers by Type of
Information ......................... 239
4-24. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Top Managers ........................ 240
4-25. Evaluation of Artemis as a Space
Station Management Information
System Users by Function ............ 249
4-26. Evaluation of Artemis as a Space
Station Management Information
System Users by Type of
Information ......................... 250
4-27. Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Users ............................... 251
4-28. Evaluation of Artemis as a Space
Station Management Information
System Programmer/Analysts by
Function ............................ 258
4-29. Evaluation of Artemis as a Space
Station Management Information
System Programmer/Analysts by
Type of Information ................. 259


4-30.Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Programmer/Analysts ...................... 260
4-31.Evaluation of Artemis as a Space
Station Management Information
System Implementors by Function ... 267
4-32.Evaluation of Artemis as a Space
Station Management Information
System Implementors by Type of
Information............................... 268
4-33.Evaluation of Artemis as a Space
Station Management Information
System Frequency Distribution -
Implementors.............................. 269


CHAPTER I
INTRODUCTION
The following pages present a
theoretically-grounded approach to MIS evaluation
and the results of an evaluation of an existing
business-oriented Management Information System
(MIS) called Artemis. At the time this research
was conducted, the system had been used
operationally for two years by the National
Aeronautics and Space Administration (NASA) on
the Space Station Program. It is part of a very
large, integrated technical and management
information system (TMIS) which, as implied by
its name, is (or is hoped to become) an
all-inclusive MIS for the Space Station Program.
Artemis, and several other elements of
TMIS, were purchased to meet the immediate needs
of a program which was announced rather suddenly
by the President during the State of the Union
speech in 1984. The literature is replete with
discussions of evaluations of MIS's as part of an
overall process of formulation, implementation
and evaluation, but there is a scarcity of


information on post-installation evaluations as
unique processes, i.e., evaluations that were not
planned at the time of MIS acquisition. This is
a notable void in the research literature, since
so many Government programs are based upon
responding to a threat, opportunity, etc., in
less than a leisurely manner.
The existence within a large, highly
technical organization such as NASA of a complex
MIS which was acquired to solve an immediate need
presented a unique evaluation opportunity (i.e.,
complex organization, all-inclusive MIS in the
embryo stage, purchased without the benefit of a
detailed evaluation). The opportunity was
realized through structured evaluation research,
using expert opinions from a large group of
people very familiar with Artemis, e.g., users,
managers, programmers. The evaluation explored
the relationship among the people who use,
manage, etc., the system, the functions such as
planning performed by the system and types of
information, or parameters, against which the MIS
could be measured, e.g., quality, cost (of the
MIS). These relationships have been discussed in


literature, but have not formed the basis for
evaluation designs. Indeed, the evaluation of
MIS's is a relatively immature science. This is
especially true with regard to post-installation
evaluations. The research was oriented toward
the support, or lack thereof, of specific
research propositions which can be found in
Chapter 3. Once specific results were determined
for a specific MIS (Artemis), general conclusions
were drawn with regard to the applicability of
the methodology to the evaluation of all MIS's.
As any introduction should, this one
scopes the content of the dissertation;
consequently, the following paragraphs synopsize
what can be found in great detail in the
following chapters. The Space Station Program,
TMIS and Artemis will be briefly discussed, as
will the research design, approach, etc.
Finally, the entire chapter will be summarized.
Background
Tonight I am directing NASA to develop a
permanently manned space station, and to
do it in a decade.
Ronald Reagan
State-of-the-Union
January 25, 1984 (1)


The above statement provided the National
Aeronautics and Space Administration (NASA) with
its most clear objective since President John F.
Kennedy's famous statement "to land a man on the
moon in this decade and return him safely to
earth".(2)
Shortly after the State-of-the-Union
address, NASA intensified its already existing
efforts on a manned Space Station. At the time
of this research, there were hundreds of civil
servants and contractors actively working on
Space Station preliminary design; the fiscal year
1987 budget for Space Station exceeded $400
million, and the fiscal year 1988 budget was
projected to approach $500 million.(3)
In early 1984 a concept was developed for
the management of Space Station technical and
management information. The system upon which
this concept was based was named TMIS (Technical
and Management Information System) and in early
1987 was in its earliest stages of development.
TMIS (pronounced "Tea Miss") is a NASA-wide
system of electronic information exchange that
supports all activities associated with the Space
Station Program, including those of Government
and contractor personnel, international partners,
and users and customers of the Space Station.(4)
It is an all-inclusive Space Station information
system, or will be when its development and
implementation are completed.
Both the technical and the management
subsystems of TMIS contain several elements,
e.g., the management subsystem includes
scheduling, budgeting, configuration management
and numerous other non-technical elements. It is
with one of the elements within the management
subsystem that this research is concerned. In
November of 1984 a project control system (PCS)
called Artemis was selected as an interim system
to assist management in integrating program costs
and schedules, and in aiding in assessing the
overall progress of the Space Station Program.
By the end of 1986 Artemis was being used
by most organizations within the program for
technical planning, scheduling, network analysis,
and the developmental stages of performance
measurement.
All of the original elements of TMIS,
including Artemis, were acquired as interim


systems in order to allow NASA to operate until
a final TMIS was selected in the summer of 1987.
(Actually, the TMIS project will not be completed
until the early 1990's). At the time this
dissertation was written, the process for
selecting the final TMIS was underway, and, as
one of the incumbent systems, Artemis must
certainly be a candidate for the final project
control system. If it is selected, perhaps the
reasons for such a decision can be found in the
following research. Similarly, if it is
rejected, the reasons might be found in the
following pages, for Artemis has both its
supporters and detractors. The population
composed of these supporters and detractors, and
the three elements discussed above, i.e., Space
Station, TMIS, and Artemis, form the foundation
upon which this research was based.
The following paragraphs will discuss all
of the elements of the research. Initially, a
statement of the research objective will be
presented, after which the research approach will
be discussed. Following this there will be brief
discussions of Space Station, TMIS, and Artemis,
with the discussions being limited to that


information necessary to scope the research.
Very detailed discussions can be found in
Appendix B.
Statement of Research Objective
It is the overall objective of this
research to make a significant contribution to
the emerging body of literature on the use of
large, integrated technical and management
information systems to plan and control programs,
and on the applicability of evaluation research
to the post-installation evaluation of MIS's.
The specific objective of the research is to
evaluate the effectiveness of Artemis as a MIS
for use on the Space Station Program.
Research Problem
There were both theoretical and practical
questions to be answered during this research.
From a theoretical perspective, the interest was
in knowing what the relationships are within and
between levels of management function, types of
information, and position in the organization as
they relate to the evaluation of a MIS (see
Figure 1-1). Specific research questions were
asked:


[FIGURE 1-1: RESEARCH FRAMEWORK. A matrix
relating levels of management function (planning,
organizing, management control, operational
control, communication) to position in the
organization (top managers, users, implementors,
programmer/analysts).]


1. Does position in the organization
influence perceptions of the types of information
required?
2. Do managers, users, etc., have different
functional uses of a MIS?
3. Are quality, quantity, cost and time
valid parameters for MIS evaluation, and of these
which are the most significant?
4. Which MIS functions, e.g., planning,
control, are most important, and how are they
related to position in the organization and type
of information?
5. What other factors, other than those
contained in the research framework, are
significant in MIS evaluation?
The practical research question was:
Would Artemis be an effective MIS for project
planning and control in the Space Station
Program? At the outset it was hoped that there
would be some theoretical extrapolations possible
as this practical question was answered,
specifically dealing with the further use of
evaluation research for the evaluation of
management information systems.


Approach
At the outset of the research it was accepted
that the work would be difficult and tedious.
According to Hamilton and Chervany, evaluation of
an MIS's effectiveness is difficult due to its
multidimensionality, its quantitative and
qualitative aspects, and the multiple, and often
conflicting, evaluating viewpoints.(5) This
difficulty is probably the greatest with a system
such as Artemis which, as will be discussed in
Appendix B, is among the most complex of its type
in existence.
Lucas supports the position of Hamilton
and Chervany with his proposition that MIS's are
difficult to evaluate because of the
uncontrollable environment in which most systems
operate.(6)
Most evaluations seem to dwell upon
evaluation of either user satisfaction or system
operating time (e.g., Srinivasan in 1985;
Chandler in 1982).(7,8) However, the amount of
use a system experiences might reflect that it is
a cumbersome system. User satisfaction is
difficult to quantify, and is dependent upon
factors such as quality of user training,
communication links, etc.
There remains a general problem with
regard to the approach to be used in evaluating
the effectiveness of any MIS, especially when a
post-installation evaluation is required. While
the literature is full of examples of evaluations
of MIS's, most of the evaluations were conducted
as part of an integrated formulation-implementation-evaluation
process. As mentioned earlier,
there is a dearth of information on post-installation
evaluations as discrete activities.
Authors such as Scott Hamilton have addressed the
effectiveness of post-installation reviews
(PIRs) in substantial detail, others in less
detail; however, much of the work in this area
has been behavioral in nature.(9)
probably, as Lucas proposes, resistance to
post-installation evaluations on the parts of
designers and users of the MIS, who are relieved
just to have the system implemented.(10)
As mentioned earlier, Artemis was
baselined as an interim Space Station PCS in
September of 1984. By early 1986, it was
operational across the Space Station program and
there were, at that time, between fifty and one
hundred people in the Space Station program
working with or on Artemis. In addition, there
were dozens of others who used it on other NASA
programs such as Space Shuttle and Space
Telescope. These people provided the resources
necessary, i.e., the population, for conducting
the evaluation research discussed below.
The overall research approach involved the
development of a theoretical framework for the
evaluation of any Management Information System
(MIS), and the specific application of this
framework to the evaluation of Artemis as a PCS
for the Space Station program. The framework
developed is displayed in Figure 1-1. There are
three elements within the framework: the people
who use, manage, etc., the PCS (based upon their
position in the organization); functions such as
planning and control which must be performed
(referred to as levels of management function);
and parameters against which the PCS should be
evaluated (referred to as types of information).
Each of these is broken into greater detail in
Figure 1-1.
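As a purely illustrative sketch (the dimension labels are taken from Figure 1-1 and the surrounding discussion; the code itself is not part of the original research), the cross-product of the three framework elements can be enumerated as follows:

```python
from itertools import product

# The three dimensions of the research framework (Figure 1-1).
positions = ["top managers", "users", "implementors", "programmer/analysts"]
functions = ["planning", "organizing", "management control",
             "operational control", "communication"]
info_types = ["quality", "quantity", "cost", "time"]

# Each cell pairs a position, a management function, and a type of
# information against which the PCS can be evaluated.
cells = list(product(positions, functions, info_types))
print(len(cells))  # 4 x 5 x 4 = 80 evaluation cells
```

Enumerating the cells this way makes the scope of the evaluation explicit: every respondent position is asked about every function and every type of information.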


Opportunities for discrete post-installation
evaluations are rare. The
circumstances surrounding NASA's decision to
implement TMIS and Artemis in response to an
immediate need presented such an opportunity; the
opportunity was grasped and became the basis for
this dissertation's research.
The approach used to answer these
questions fell into the overall area of
evaluation research. A survey instrument was
developed based upon the three research elements
in Figure 1-1, as well as other factors such as
experience with the system. The instrument was
pretested, finalized (see Appendix A), and sent
to a group of Artemis experts, each of whom was
asked to evaluate the effectiveness of Artemis.
The results were accumulated, analyzed, and were
synopsized (see Chapter 4) and conclusions drawn
(see Chapter 5).
The Space Station Program
A detailed discussion of the Space Station
program can be found in Appendix B. The
following paragraphs contain only that summary
information necessary to understand and
appreciate the size of the undertaking and the
information needs of the people who will be using
whatever PCS/MIS is chosen.
Figure 1-2 is a schematic of the current
Space Station concept.(11) This configuration
will have a development cost of between $12 and
$15 billion in 1987 dollars and will require
approximately ten years to develop.(12)
The Space Station is obviously very complex
technically, as the cost and the length of time
required to complete it suggest, but it is even
more complex from an organizational perspective,
a factor which will be a substantial driver in
the selection of the final MIS/PCS (i.e., the
management portion of a management information
system is greatly influenced by the complexity of
the organization).
There are three levels of management
within the Government, and at least two
significant levels of contractor activity. In
addition, major pieces of hardware (costing
several billion dollars) will be built by the
Europeans, Japanese, and Canadians.(13) Figure
1-3 displays the overall organization of the
Space Station, while Figure 1-4 is an
organizational chart for the U. S. Government


FIGURE 1-2
SPACE STATION SCHEMATIC


[FIGURE 1-3: OVERALL STRUCTURE OF SPACE STATION.
Element labels: program direction; program
management; program engineering; major core
elements; servicing facility; major systems;
co-orbiting platform; crew activities; launch;
space operations; commercial development; science
applications; mobile service center; pressurized
module; exposed facility; experiment module;
science and technology research; polar platform;
man-tended free-flyer; fluid physics; life
sciences; materials science.]


FIGURE 1-4
SPACE STATION PROGRAM ORGANIZATION


portions of the program.(14,15) There are nine
major Government organizations at nine locations,
at least that many significant levels of
contractor management, and the three
international organizations mentioned above.
Collectively, this is perhaps the largest group
of major participants ever to contribute to a
program. There is tremendous interaction among
the different organizations, e.g., major hardware
deliveries between contractor and Government
organizations. In order to fully appreciate the
difficulty of managing the Space Station program,
and consequently the difficulty in managing
information, Appendix B should be read carefully.
Technical and Management Information System
(TMIS)
The capabilities of TMIS are well
illustrated in Figure 1-5 and Table 1-1.(16)
While some of the terms on the charts are not
defined, suffice it to say that TMIS is a massive
undertaking. There are centralized and
decentralized elements; mainframe computers,
minicomputers and microcomputers; engineers and
technicians; existing systems and new systems;
relational and hierarchical data bases;


FIGURE 1-5
SPACE STATION TMIS (PARTICIPANTS)


MANAGEMENT PROCESSES: planning; budgeting;
documentation; configuration management;
scheduling/project management; policy
development; international relations; performance
management; program review; customer relations;
acquisition; administrative contract management;
external affairs; administration.
TECHNICAL PROCESSES: requirements analysis;
technical analysis; design; design review;
cost/financial analysis; prototyping; technical
contract management; implementation/integration;
interface control; test and verification;
operations; maintenance; inventory management;
training.
TABLE 1-1
MAJOR PROCESSES SUPPORTED BY TMIS


computer-aided design systems and simple
scheduling systems, etc. TMIS is the epitome of
the all-encompassing systems deemed impractical
by Ackoff and Dearden.(17,18) It also seems to
have been at least a modest success during the
first two years of its existence. Recall that
only interim TMIS systems have been procured to
date.
As mentioned earlier, a final TMIS
contractor was in the process of being selected
at the time this dissertation was being written.
During the period 1984-1987, NASA used two very
powerful contractors to assist in the
implementation and management of the interim TMIS
system. The Mitre Corporation and
Booz, Allen & Hamilton were instrumental in the
implementation of the initial phase of TMIS.
Perhaps the fact that not one but two contractors
of this type were required demonstrates the
complexity and difficulty of the
system.
TMIS development is currently estimated to cost
approximately $350 million and to require seven
years.(19,20)
The purpose of the previous paragraphs was
to adequately demonstrate the complexity and


22
all-inclusive nature of TMIS in order to scope
the probability of success in its implementation.
A detailed description of TMIS can be found in
Appendix B.
Artemis
Artemis is a computer language and
application system designed for project
management. It is a fourth generation computer
language which is interactive and conversational
and supports English-style expressions and
commands.(21) It was developed in the 1970's in
Britain by Metier, Ltd., under the name Apollo.
The name was changed to Artemis (Apollo's sister
in Greek mythology) when it was introduced into
the United States because there was already a
United States corporation which produced Apollo
computers. Metier was recently purchased by
Lockheed, one of the largest aerospace companies
in the United States.
As shown in Figure 1-6, Artemis resides on
a Hewlett-Packard 1000 minicomputer at the
Johnson Space Center. While a mainframe version
of Artemis is available, there was no intention
at the time this dissertation was written to
migrate to the larger machine until the final


[FIGURE 1-6: SPACE STATION ARTEMIS NETWORK. An
HP1000 minicomputer at Johnson Space Center,
Houston, Texas, serves: Site 1, Space Station
Work Package #1 at Marshall Space Flight Center,
Huntsville, Alabama; Site 2, Space Station Work
Package #2 at Johnson Space Center, Houston,
Texas; Site 3, Space Station Work Package #3 at
Goddard Space Flight Center, Greenbelt, Maryland;
Site 4, Space Station Work Package #4 at Lewis
Research Center, Cleveland, Ohio; Site 5, Space
Station Projects Office at Kennedy Space Center,
Cape Canaveral, Florida; and Site 6, Space
Station Business Management Office at NASA
Headquarters, Washington, D.C.]


Space Station Project Control System was selected
in 1987. The Artemis system will be managed by
the Space Station Program Office which was
discussed earlier (see Figure 1-3), and will be
distributed throughout the program locations.
There are 28 ports on the HP1000 to which
user hardware can be connected. At the present
time the following organizations are connected to
the Artemis system:
NASA Headquarters,
Washington, D.C.
Space Station Program Office,
Reston, Virginia
Marshall Space Flight Center,
Huntsville, Alabama
Johnson Space Center,
Houston, Texas
Goddard Space Flight Center,
Greenbelt, Maryland
Lewis Research Center,
Cleveland, Ohio
The above locations have dedicated
attachments to the HP1000. There is also a
limited amount of dial-in capability which allows
organizations with proper credentials to remotely
access the system on a first-come, first-served
basis.
Two of the three elements of Artemis have
been discussed: the software (basic Artemis and
related applications software) and the host
computer, the HP1000. The approved hardware
configuration for remote sites is as follows:
a. Two IBM PCXT's which can be used as
remote terminals to the host computer (the
Hewlett-Packard 1000), down-loaded from the host
and operated stand-alone, or operated stand-alone
using special PC (Personal Computer) software.
b. Two copies of Artemis software which
allow the remote sites to communicate with the
host computer and also allow the IBM PCXT's to
operate without the host. Purists would say that
only this software can properly be called
Artemis.
c. One Hewlett-Packard plotter which can
produce schedules, networks, pie-charts, etc., up
to 11" x 17". It can be used with the host
computer or with the PCXT.
d. One Hewlett-Packard plotter used to draw
large 3' x 5' networks. At this time it can only
be used with the host computer.
e. One high quality, 132 character printer.
f. Associated communication hardware and
manuals.
Artemis is uniquely a project control
system (PCS) and is not used for computer-aided
design, engineering data base management, etc.
It is used for most business applications, and is
arguably the most powerful PCS in the world.
Artemis can be used to develop 30,000 activity
networks, perform detailed performance
measurement, and integrate cost and schedules at
almost any level of detail.(22) It is one of
the three or four systems currently capable of
handling the large amounts of data planned for
the Space Station Program. It was designed to
satisfy the cost and schedule integration
criteria of the Department of Defense, and is
best used on very large programs requiring
rigorous control.
Summary
NASA is in the preliminary stages of
design on a very large, highly complex Space
Station program. It is both technically and
organizationally complex, involving both national
and international organizations. An
all-inclusive technical and management
information system, referred to as TMIS, is
currently being procured by NASA to assist it in
the formulation, implementation, and control of
the Space Station. One of the several TMIS
subsystems is a business-oriented system called
Artemis. Artemis is referred to as a project
control system (PCS), a term which, in NASA, is
used to describe the general area of business
management (cost, schedules, performance
management, etc.).
Artemis was selected as an interim PCS in
the fall of 1984, and had been used across the
Space Station program for approximately two years
at the time this research was conducted. There
is a group of managers, users, etc., who now have
a great deal of expertise in the use of the
system. These experts form the population for an
evaluation of Artemis as the final PCS for the
Space Station Program (a final selection will be
made in the spring of 1987).
A survey instrument was developed,
pre-tested, modified, and sent to a large sample
of this population. Returned questionnaires were
tabulated and cross-tabulated. Many of the
responders were interviewed in order to better
use the data they provided. Responders were
classified according to their position in the
organization as managers, users, programmer/
analysts, or implementors. A detailed discussion
of the rationale behind these classifications can
be found later in this dissertation. The levels
of management function against which Artemis was
evaluated were planning, organizing, management
control, operational control, and communication.
The types of information used in the evaluation
were quality, quantity, cost, and time. The
reasons for selecting these functions and
parameters will be discussed later.
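The tabulation and cross-tabulation of returned questionnaires described above can be sketched as follows; the respondents, ratings, and grouping here are invented solely for illustration and do not reproduce the actual survey data:

```python
from collections import defaultdict

# Hypothetical returned questionnaires: (position in the organization,
# management function evaluated, Likert-style rating from 1 to 5).
responses = [
    ("manager", "planning", 4),
    ("manager", "management control", 5),
    ("user", "planning", 3),
    ("user", "operational control", 2),
    ("implementor", "planning", 5),
    ("programmer/analyst", "communication", 4),
]

# Cross-tabulate ratings by position in the organization.
by_position = defaultdict(list)
for position, function, rating in responses:
    by_position[position].append(rating)

for position in sorted(by_position):
    ratings = by_position[position]
    print(f"{position}: mean rating {sum(ratings) / len(ratings):.1f}")
```

The same grouping step, keyed instead on function or type of information, yields the other cross-tabulations the research design calls for.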


REFERENCES
1. Congressional Record, U.S. Government
Printing Office, January 23, 1984, Vol. 130,
No. 3.
2. Arnold S. Levine, Managing NASA in the
Apollo Era (Washington, D.C.: Scientific
and Technical Information Branch, National
Aeronautics and Space Administration, 1982),
p. 3.
3. Brian Reynolds, Space Station Reconsidered
(Houston, Texas: NASA Lyndon B. Johnson
Space Center, Undated), p. 26.
4. Booz, Allen & Hamilton, Inc., Technical
and Management Information System (TMIS)
Phase I Users Guide (Houston, Texas:
National Aeronautics and Space
Administration, Johnson Space Center, 1986),
p. 1-1.
5. Scott Hamilton and Norman L. Chervany,
"Evaluating Information System
Effectiveness - Part I: Comparing Evaluation
Approaches", MIS Quarterly, Vol. 5, No. 3,
p. 55.
6. Henry C. Lucas, Jr., "Performance and the
Use of an Information System", Management
Science, Vol. 21, No. 8, April 1975.
7. Ananth Srinivasan, "Alternative Measures of
Systems Effectiveness: Associations and
Implications", MIS Quarterly, Vol. 9, No. 3,
September 1985.
8. John S. Chandler, "A Multiple Criteria
Approach for Evaluating Information
Systems", MIS Quarterly, Vol. 6, No. 1,
March 1982.
9. Scott Hamilton, Post Installation Evaluation
of Information Systems: An Empirical
Investigation of the Determinants for Use in
Post Installation Reviews (82-11482)
(University of Minnesota, 1981).
10. Ibid., Lucas.


11. NASA Lyndon B. Johnson Space Center,
Baseline Configuration Document JSC 30255
(Houston, Texas: Space Station Program
Office, January 1987), p. 2-17.
12. Houston Post, February 4, 1986, p. 1.
13. National Aeronautics and Space Admini-
stration, "Partner Elements for Preliminary
Design", S-86-01475A (August 1984).
14. National Aeronautics and Space
Administration, Space Station (Washington,
D.C.: Space Station Program Office, October
1986).
15. Ibid.
16. Donald J. Andreotta, Current Task
Definition: TMIS IOC 1 Functional
Requirements and Data Architecture
(Washington, D.C.: Space Station Program
Office, Undated), p. SP5/031A.
17. Russell L. Ackoff, "Management
Misinformation Systems", Management
Science, Vol. 14, No. 4, (December 1967),
pp. B147-B156.
18. John Dearden, "MIS is a Mirage", Harvard
Business Review, Vol. 50, No. 1,
(January-February 1972), pp. 90-99, as
reprinted in V. Thomas Dock, Vincent P.
Luchsinger, and William R. Cornett, MIS: A
Managerial Perspective (Chicago: Science
Research Associates, Inc., 1977), p. 288.
19. Thomas L. Moser, "Space Station Program
Status Presentation to Space Station
Management Council" (Washington, D.C.:
National Aeronautics and Space Admini-
stration, March 20, 1987), p. 055-4168.
20. National Aeronautics and Space
Administration, TMIS Master Plan
(Washington, D.C.: Space Station Program
Office, Level A', February 13, 1987), p.
2-7.
21. Metier Management Systems, Managing Projects
with Artemis (May 1985), p. 1-1.
22. Metier Management Systems, Distributive
Processing for Project Management (October
1984), pp. 53-55.


CHAPTER 2
EVALUATING MANAGEMENT INFORMATION SYSTEMS
Introduction
As stated in Chapter 1, the first step in
the research approach was the development of a
theoretical framework for the evaluation of a
management information system. To do so
properly, the theory of MIS's and of the
research ingredients discussed in Chapter 1 was
explored. The following paragraphs provide such
an exploration. As the pertinent theory is
reviewed, the theoretical framework mentioned
above will be constructed.
Because of the wide range of subjects,
boundaries must be placed around and partitions
placed within the candidate areas for theory
review. The following three areas will be
discussed in detail in this chapter.
1. Management information systems will be
discussed at length, emphasizing the definitions,
needs for, and limitations of MIS's. The
feasibility of large systems such as TMIS will be
discussed, as will the applicability of the term
MIS to an element of TMIS such as Artemis.
2. Pertinent literature on approaches to
evaluating MIS's will be addressed, emphasizing
post-installation evaluations.
3. A review of the literature pertinent to
the research design and proposition(s) will be
presented, emphasizing literature pertinent to
the three elements within the research framework,
i.e., level of management function, position
within the organization and type of information.
The following paragraphs will follow the
above outline in the development and presentation
of the theory review. This will leave
unaddressed the subject of management in NASA, a
critical element in the foundation of this
research. A quite detailed discussion of this
subject can be found in Appendix B.
Management Information Systems
Whenever managers or technical persons
connected with electronic data processing
(EDP) get together and are confronted with the
term "management information systems", the first
question they ask is how the term should be
defined.(1)


This statement was made sixteen years ago
by G. W. Dickson and John Simmons and still
applies today. It is certainly pertinent to this
research since the specific topic of this
research is the effectiveness of Artemis as a
Space Station Management Information System, not
planning system, control system or scheduling
system. The definition of MIS, including its
characteristics, uses, limitations, etc., is an
essential ingredient in this research.
When defining a term such as MIS, it is
usually desirable to attack the definition
hierarchically. It is also desirable to discuss
what an MIS is not. The following paragraphs
will define MIS's in increasing levels of detail
and will provide appropriate exclusions from the
definitions.
In essence, a successful MIS will contain
only the information pertinent for decision
making, and will present the information in a
meaningful form and in a timely manner.(2)
Information is data which has been processed into
a form that is meaningful to the recipient.(3)
Among the better definitions of MIS is the
literal and graphical one presented by Murdick
and Ross and displayed below:
MIS: MANAGEMENT INFORMATION SYSTEM
(1) Management makes decisions regarding
planning, organizing, and controlling.
(2) Information consists of orderly selected
data used for making decisions.
(3) Systems for the integration of all company
activities through the exchange of
information.(4)
Source: Robert G. Murdick and Joel E.
Ross, Introduction to Management Information
Systems (Englewood Cliffs, New Jersey:
Prentice-Hall, Inc., 1977), p. 8.
Further investigation of the Murdick and
Ross information shows that there is a large
Operational Information System which is the
foundation upon which the MIS exists. As stated
in the preceding illustration, MIS is an
integrated, company-wide information system, much
as TMIS is projected to be an integrated,
program-wide information system.
It is possible that this definition
excludes Artemis since it contains the words "all
company activities". As stated before, Artemis
is essentially a Business Management Information
System (BMIS), part of a large MIS called TMIS.


There is a substantial amount of literature that
supports this broad definition of MIS. Wendler
defines an MIS as a system which is all
encompassing, one which views the company as an
entity and the MIS as a system which incorporates
the recording and processing of work throughout
the company.(5) By this definition Artemis is
probably not an MIS since it does not include,
nor have the ability to manipulate directly, much
of the technical information in the Space Station
Program. It also does not include nor does it
have direct access to the configuration
management data base or numerous other sources of
information within the program.
Couger states that the purpose of an
information system is to capture and generate all
data pertinent to the firm's operation, and to
process the data in the most efficient manner,
utilizing management sciences to the fullest
extent possible.
timely information as required by each level of
management for optimum execution of its
fundamental objectives.(6) This definition is
both vertically and horizontally all inclusive in
that the MIS must "capture and generate all data
pertinent to the firm's operation" and "produce
concise and timely information as required for
each level of management." It is this definition
of MIS that has drawn criticism in the past,
criticism which will be discussed later in this
chapter.
There is a substantial amount of data
which supports the definition of Artemis as an
MIS. Forming a bridge between the all-inclusive
definitions discussed above and those more
appropriate to this research is the work of Noble
Deckard. He states that the "total management
information system will encompass the total
management information needs".(7) While it is
not stated, it is at least implied that a MIS can
take the following shape:
[Diagram: a single top-level MIS composed of
numerous lower-level MIS's.]
Each lower level MIS is an integral part
of an all-inclusive MIS.


Again, this work is probably the bridge
between those who believe in the all-encompassing
MIS and those who discredit the concept as
infeasible. The above concept is completely
compatible with the concept of TMIS discussed
earlier and is consistent with the paradigm of
Ein-Dor and Segev discussed later.
Emery and Sprague view the MIS as a
decentralized collection of subsystems consisting
of one or more computers, a definition apparently
consistent with the TMIS/Artemis relationship and
the Deckard arrangement.(8) Perhaps it is the
definition of one of the three words in MIS,
specifically system, that substantiates the
position that the decentralization of MIS's is
appropriate. Supporting this is Anthony who
states that "an effective MIS is a collection of
information systems".(9) Finally, it is the two
authors upon whose work much of this research is
based who provide the framework for defining
Artemis as an MIS.
Ein-Dor and Segev present a paradigm with
regard to the definition of MIS's. According to
these two authors, there are two classes of
definitions of MIS's.(10) The first one, MIS (in
capital letters), encompasses all of the
organization's activities, and is probably
representative of the kind of system perceived by
the people who originated the concept of TMIS.
The second classification, the mis (in small
letters), contains only management information.
Within the Space Station Program, TMIS is an MIS,
while Artemis is a mis.
Having established that Artemis qualifies
as an MIS (or mis), it is now meaningful to
investigate why it is important to have an MIS at
all, whether Artemis or any other system. There
are succinctly stated quantitative reasons for
MIS's. However, perhaps the best reason for
having an MIS is organizational and behavioral in
nature.
The ownership of information is the ownership of
power.(11) Information is a resource;
consequently, the owner of the MIS has a source
of income and power.(12) As a resource,
information is not all that different from land,
labor, and capital.(13) Hodge, Fleck, and Honess
expand upon these points by providing the
following comparison of resources and related
processes:
RESOURCE        PROCESS
People          Organization
Cash            Budgeting
Data            Information system(14)
If the implementation of MIS's actually
results in increases in organizational power and
in equivalent resources, this is a very strong
argument for such implementation. However, such
increases might not be so important to an
organization such as NASA, which does not have to
show a profit nor compete quite so heavily for
its overall position within its environment.
What would seem important is the ability to make
sound and timely decisions on a program with a
great deal of technical uncertainty. Many of the
justifications for MIS's are based upon enhancing
decision making and reducing uncertainty.
Nichols addresses both of these factors (as
well as increasing equivalent resources) as
follows:
The primary purpose of a management
information system is to expose significant
relationships that will decrease
uncertainty in organizational decision
making with a corresponding increase in
the utilization of organizational
resources.(15)


Both Dickey and Anthony are supportive of
this statement. They propose that the purpose of
a MIS is to provide timely and accurate data
when, where, and in the form that they are
needed.(16,17) It is interesting to note that
all three of these sources emphasize decision
making. The increase in the number and size of
organizations, as well as the limited time for
decision making and the influence of technology,
are addressed by McLeod as he presents the
following four reasons that MIS's are needed:
1. An increase in the number of
organizations.
2. An increase in the size of existing
organizations.
3. Increasing complexity of technology.
4. Lack of time to make decisions.(18)
The lack of time for decision making is a
recurring theme in the discussions of the needs
for MIS's. Freeman considers the short time
between the emergence of a problem and the time
required for its solution as one of the prime
reasons for an MIS.(19) He also presents two
other reasons which are especially pertinent to
NASA at this particular time. They deal with
overall efficiency and a resulting need for
MIS's.
1. Changes in the public attitude toward the
administration of business.
2. Problems in dealing with government
regulated, multinational organizations.(20)
NASA operates in public view more than
perhaps any other public organization and
contracts with some of the largest companies in
the world. If the above reasons are truly
reasons for needing MIS's, they certainly apply
to NASA.
Of all the reasons for having MIS's, a
simple combination of two reasons provided
by Anthony and Aron is among the best: the
function of the MIS is to make the manager
smarter without a blizzard of paperwork.(21,22)
In a few pages twenty-one reasons from
twenty-three sources have been presented
supporting the need for Artemis (since Artemis is
a MIS). While perhaps not as abundant as the
supporters of MIS, there is a substantial group
of detractors. While none of the following
sources specifically addressed large planning and
control systems such as Artemis, it certainly
fits the profile of the systems implied, i.e., a
large system used to control people in a variety
of functions.
The debate over the benefits of MIS's is
not all that different from the one that existed
for almost a century over the benefits of
scientific management.(23) One of the most
vocal critics is John Dearden, who wrote an
article in the Harvard Business Review entitled
"MIS is a Mirage".(24) According to Dearden,
information within a company is not sufficiently
homogenous to allow for the development of a
company-wide MIS. It is also not practical,
according to Dearden, to centralize the MIS and
retain any degree of acceptance. Actually, it is
debatable whether these limitations apply to Artemis,
which is a decentralized business management
information system. It is certainly a large
decentralized system, however, and does not
easily fit within the definition of the MIS as a
mosaic of several small specialized mis's as
presented by Emery and Sprague.(25)
Regardless of whether MIS's are practical,
desirable, required, dehumanizing, etc., NASA has
many, one of which is Artemis. Having committed
to these MIS's, NASA has a responsibility to
evaluate them. The following paragraphs deal
with current literature on evaluating MIS's.
Evaluating Management Information Systems
The difficulty in adequately evaluating
MIS's was discussed briefly in the introduction,
a difficulty which does not obscure the need,
however. There have been attempts to both
qualitatively and quantitatively evaluate MIS
effectiveness, the more applicable (to this
research) of which will be discussed in the
following paragraphs.
John S. Chandler discusses a multiple
criteria approach for evaluating information
systems in an article in the March 1982 edition
of MIS Quarterly. He proposes that information
systems can be evaluated from two perspectives:
the domain of the system (in this case Artemis),
and the domain of the users (in this case,
hopefully, the people to whom the questionnaires
were sent).(26) In the computer system domain,
performance is measured in terms of resource
utilization, cost (one of the types of data
evaluated in this research) and efficiency.
Within the user domain, throughput, reliability,
and response time (another of the types of
information evaluated in this research) are
common measures.
Chandler discusses an evaluation in three
stages: system evaluation, user goal evaluation
and design evaluation. He essentially models the
operation of the information system, developing
algorithms to be used in measuring the
differences between parameters such as user
expectation and actual system performance. Both
of these parameters can be measured in terms of
time and cost. An algorithm used to measure the
achievement of a user goal could be expressed as
follows:
D(i) = G(i) - Σj Rj(i), where
D(i) is the discrepancy between
G(i), the user's performance goals, and
Σj Rj(i), the summation of actual performance
measures.
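Chandler's goal-discrepancy measure can be
sketched in a few lines of code. The goal value,
the individual performance measures, and their
interpretation below are hypothetical
illustrations, not figures from Chandler's
article.

```python
# Sketch of Chandler's goal-discrepancy measure: D(i) = G(i) - sum_j Rj(i).
# All numbers are hypothetical illustrations, not data from the article.

def discrepancy(goal, actual_measures):
    """Return D(i): the user's performance goal minus the summed
    actual performance measures observed for that goal."""
    return goal - sum(actual_measures)

# Example: a user goal of 10 (say, reports produced per week) against
# three observed performance measures that together sum to 8.5.
d = discrepancy(10.0, [3.0, 2.5, 3.0])
print(d)  # a positive D(i) means the system falls short of the goal
```

A negative result would indicate that actual
performance exceeded the user's goal.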
In essence, there is some similarity
between this approach and the one used in this
research. The location of the entry on the
Likert scale by the individual responders
reflects their opinion of the actual performance
of Artemis (which they observed) and their
expectations of the system, whether implicit or
explicit.
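A minimal sketch of how Likert-scale
questionnaire entries of the kind described here
might be aggregated by respondent group. The
five-point scale, the group names, and the
response values are illustrative assumptions,
not the dissertation's actual data.

```python
# Hypothetical aggregation of Likert-scale responses (1 = strongly
# disagree ... 5 = strongly agree) by respondent group. The groups and
# scores are illustrative, not the dissertation's survey data.
from statistics import mean

responses = {
    "managers":     [4, 5, 3, 4],
    "users":        [3, 3, 4, 2],
    "implementors": [5, 4, 4, 5],
}

# Mean scale position per group: a rough summary of each group's
# opinion of actual performance relative to expectations.
group_means = {group: mean(scores) for group, scores in responses.items()}
for group, avg in group_means.items():
    print(f"{group}: {avg:.2f}")
```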
In September of 1981 an article entitled
"Evaluating Information System Effectiveness" was
published in the MIS Quarterly. The article was
written by Norman L. Chervany, Chairman of the
Management Sciences Department of the University
of Minnesota, and Scott Hamilton, a Senior
Project Manager of COMSERV Corporation. They
proposed an evaluation based upon the
establishment of objectives (of the MIS), and an
assessment of the performance of the MIS.
Performance can be measured in terms of both how
well it met these objectives, and how well it met
the standards of good practice in doing so.
Figure 2-1 presents a comparison of various
evaluation approaches.(27) An evaluation can be
conducted for each cell of the matrix.
Hamilton and Chervany followed their
September article with one in December of 1981.
In it they presented data very pertinent to this
dissertation research, and similar to the work of
Ein-Dor and Segev upon which this research was
based. They propose that there are four groups


[Figure 2-1 is a matrix crossing a hierarchy of
objectives (resource investment; production
capability; resource consumption; information
system; information and support provided; use
process and user performance; organizational
performance) with means to measure the
accomplishment of those objectives (quality
assurance review, compliance audit, budget
performance review, productivity measurement,
performance evaluation, service level monitoring,
user attitude survey, post-installation review,
and cost/benefit analysis).]
FIGURE 2-1.- COMPARISON OF EVALUATION APPROACHES.
SOURCE: SCOTT HAMILTON AND NORMAN L. CHERVANY, "EVALUATING INFORMATION SYSTEM EFFECTIVENESS -
PART I: COMPARING EVALUATION APPROACHES", MIS QUARTERLY, VOL. 5, NO. 4, DECEMBER 1981.
of people who are involved in MIS development and
operation. These are:
1. User personnel
2. MIS development personnel (probably
similar to implementors)
3. Management personnel
4. Internal audit personnel(28)
Performance measures can be assigned to
each of these groups; however, the goals of each
of these are in conflict to some extent. For
instance, users want high quality and timely
output; developers want the system to meet cost,
technical and schedule goals; audit personnel
emphasize reliability, etc.
Both of the articles by Hamilton and
Chervany seem to validate the approach used in
this dissertation research, especially the
December article. Of more pertinence is the
dissertation research performed by one of these
authors, Dr. Scott Hamilton. The title of the
dissertation was Post Installation Evaluation of
Information Systems: An Empirical Investigation
of the Determinants for Use of Post Installation
Reviews.(29) Dr. Hamilton developed the
framework for evaluation which forms Tables 2-1
and 2-2. Both the parameters proposed and the
categories of personnel involved seem similar
enough to those in this dissertation research to
validate the approach. Of all the literature
surveyed, this seems the most pertinent, and also
the most succinct in its presentation of material
in the area of post-installation evaluation.
There are several other sources which
address certain of the research ingredients and
seem to validate parts of the research approach.
If cost is indeed a viable parameter for
measuring effectiveness, cost/benefit analyses
would seem an appropriate method for evaluation.
There are numerous sources which contain
discussions of cost/benefit analyses. The review
of theory associated with the research parameter
"type of information" will provide additional
support for the research approach used in this
dissertation.
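One common form of the cost/benefit analyses
mentioned above is a discounted comparison of
costs and benefits over the system's life. The
sketch below uses a net-present-value
calculation; the cash flows and discount rate
are hypothetical, not drawn from any source
discussed here.

```python
# Hypothetical discounted cost/benefit sketch for an MIS: net present
# value of (benefits - costs) per year at a fixed discount rate.

def npv(rate, net_flows):
    """Discount each year's net flow (benefits minus costs) back to
    year zero and sum the results."""
    return sum(flow / (1 + rate) ** year for year, flow in enumerate(net_flows))

# Year 0: purchase and installation cost; years 1-3: net benefits.
flows = [-100_000, 40_000, 45_000, 50_000]
result = npv(0.10, flows)
print(round(result, 2))  # a positive NPV favors the investment
```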
While it would take some creativity to
directly relate the methodology used in this
dissertation to that discussed by Powers, Adams
and Mills, there are demonstrably some
similarities. These gentlemen propose the review
of existing systems in order to determine what


(PIR involvement spans planning and organizing
the PIR, performing the PIR, and reviewing the
PIR.)
Responsibility for decision to perform PIR:
Head of MIS unit; Head of user unit
Responsibility for performing PIR:
MIS project leader; Project Steering Committee;
System Review Committee; Internal Audit manager
Responsibility for active participation on PIR
task force:
Project team members; Uninvolved MIS personnel;
User personnel; Internal audit personnel
Responsibility for passive participation as
sources of information:
User personnel; User management; MIS development
personnel; MIS operations personnel
Responsibility for review of evaluation findings:
User management; MIS management; Senior
Management; Internal audit management
TABLE 2-1
POST INSTALLATION REVIEW ACTIVITIES AND
INVOLVEMENT OF SALIENT GROUPS


Evaluation of Area of Concern /
Examples of Evaluation Activities

Actual development and maintenance process
followed for the system:
o Accuracy of estimates for budget and schedule
o Compliance to development budget
o Compliance to development schedule
o Problems and successes in actual development process followed
o Compliance to maintenance budget
o Problems and successes in actual maintenance process followed
o Performance of MIS development personnel
o Participation of user personnel
o Support by management personnel

System quality with respect to meeting the design
specification:
o Accuracy and completeness of original design specification
o Compliance to design specification
o Problems and successes in actual operation of system
o Compliance to standards; e.g., for programming documentation

Adequacy and completeness of controls for the
system:
o Processing error controls (e.g., using test data)
o Input validation
o Error correction controls
o Access and security controls
o Backup and recovery provisions

TABLE 2-2
EVALUATION ACTIVITIES ASSOCIATED WITH PERFORMING
POST INSTALLATION REVIEW


Information provided by the system:
o Quality of information provided (e.g., accuracy, error rates)
o Timeliness of information provided (e.g., response times, turnaround times)
o Content of information provided (e.g., relevance)
o Presentation form of information provided (e.g., system interface and report formats)

Support provided for the system:
o Amount and quality of user training and user guides
o Quality of relationships between user and MIS personnel
o Performance of support personnel

Effect of system on users and user organizational
performance:
o Perceptions of system adequacy
o Actual usage of system and procedures
o Effects of system on primary and secondary users (e.g., job satisfaction)
o Effects on data handling procedures
o Effects on operation decision making process

Actual economic payoff of the system with respect
to prior expectations:
o Tangible cost savings and benefits
o Perceived and intangible costs and benefits

TABLE 2-2 (CONTINUED)
EVALUATION ACTIVITIES ASSOCIATED WITH PERFORMING
POST INSTALLATION REVIEW
should be continued in the existing system, and
what should be replaced. They propose the use of
a multidisciplinary project team which develops a
functional model of the MIS, determines the
objectives of the system, identifies critical
documentation, and conducts interviews at every
level of the organization.(30) To some extent,
this approach was used in the research to be
discussed in the following chapters. The project
teams are comprised of a project leader, systems
analysts, user managers, and user operating
personnel. Both the use of interviews and the
composition of the project teams seem to be
consistent with the methodology discussed in
Chapters 1 and 3.
Four to six months after the system has
been installed and operating, a post-
implementation review should be conducted.(31)
Standard systems analysis tools are used,
including interviews with users and operations
personnel. Among the parameters considered is
cost.
Powers, Adams and Mills discuss another
approach to evaluation which has some pertinence
to this research. The approach is called
prototyping. Prototyping is a specialized
systems development approach in which
operational, working systems are created
virtually on a real-time basis.(32) This is
generally accomplished through the purchase of
existing applications software. The benefit of
prototyping is that it emphasizes the system
hardware and software. The weakness is that the
objectives of the organization must be tailored
to some extent to the capabilities of the
prototyped system.
When taken together, (i.e., the
prototyping of a system composed of existing
hardware and software; the post-implementation
review by a project team comprised of managers,
users, analysts, and operating personnel; the use
of interviews; the inclusion of cost as an
evaluation parameter), the elements discussed by
Powers, Adams and Mills scope the research upon
which this dissertation is based. At no time,
however, are all of these combined in such a
manner that a research process is defined.
Weatherbie also discusses an approach
which seems to support the methodology used in
this dissertation. Essentially existing systems
can be evaluated with regard to the degree that
they support decision making. The evaluation is
based upon information gathering, and consists of
the following four steps:
1. Review of Documentation: reviewing
recorded specifications that describe the
objectives, procedures, reports produced, and
equipment used in the information system.
2. Observation: watching the MIS in
operation to note and record facts about its
operation.
3. Interviews: meeting with individuals or
groups to ask questions about their roles in, and
their use of, the MIS.
4. Questionnaires (which might be open-ended
or closed-ended): submitting questions in printed
form to individuals to gather information on
their roles in, and use of, the MIS.(33)
The above paragraphs provide a relatively
in-depth review of the literature on MIS
evaluation with an emphasis on post-installation
evaluation. The following pages also present
some information on evaluations, especially in
the section on the evaluation parameter "type of
information". None of the sections within this
chapter are truly independent. A decision had to
be made as to where literature sources would be
included. On more than one occasion, such
sources were considered to be more appropriate in
other sections; consequently, they were placed in
these sections.
Levels of Management Function
As displayed in Figure 1-1, the levels of
management function against which Artemis will be
evaluated are:
1. Planning
2. Organizing
3. Management Control
4. Operational Control
5. Communicating
The above list of functions is a
combination of those presented by Johnson, Kast,
and Rosenzweig in The Theory and Management of
Systems and Robert Anthony in Planning and
Control Systems: A Framework for Analysis. The
functions contained in each of these works will
be discussed in the following paragraphs, as will
the rationale behind their consolidation.
Johnson, Kast, and Rosenzweig listed and
defined the following four basic functions:
Planning The managerial function of
planning is one of selecting the organizational
objectives and the policies, programs,
procedures, and methods for achieving them. The
planning function is essentially one of providing
a framework for integrated decision making and is
vital to every man-machine interface.
Organizing The organizing function helps
to coordinate people and resources into a system
so that the activities they perform lead to the
accomplishment of system goals. This managerial
function involves the determination of the
activities required to achieve the objectives of
the enterprise, the departmentalization of these
activities, and the assignment of authority and
responsibility for their performance. Thus the
organizing function provides the interconnection,
or intertie, between the various subsystems and
the total organizational system.
Control The managerial function of
control is essentially that of assuring that the
various organizational subsystems are performing
in conformance to the plans. Control is
essentially the measurement and connection of
activity of the subsystems to assure the
accomplishment of the overall plan.
Communication The communication function
is primarily one of the transfer of information
among decision centers in the various subsystems
throughout the organization. The communication
function also includes the interchange of
information with the environmental forces.(34)
There are several observations which can
be made with regard to these definitions. First
of all, planning, organizing, and control are
referred to as managerial functions in both the
above definition and in the research framework,
i.e., level of management function. This raises
the question as to whether the opinions of
managers should be weighed more heavily when
evaluating data gathered from the questionnaire
than those of people at the different levels in
the organization. This weighting should not be
necessary since functions will be individually
evaluated by managers, users, and implementors.
Also of interest is that part of the
definition of planning which states that "the
planning function ... is vital to every
man-machine interface." This bears on the
evaluation of one man-machine system (e.g.,
Artemis) for its applicability as a MIS for a
man-machine system of another type (Space
Station).
Robert Anthony suggests that the failure
or success of any organization will depend upon
its ability to accomplish three key activities:
strategic planning, management control, and
operational control.(35) By comparing the levels
of management function against which Artemis will
be evaluated (Figure 1-1) and the list provided
by Johnson, Kast, and Rosenzweig, one difference
can be noted. In Figure 1-1 control has been
divided into two parts, specifically divided into
management control and operational control.
These are, of course, the final two of the three
activities listed by Robert Anthony. The
construction of Figure 1-1 can be displayed as
follows:
Johnson. Kast
and Rosenzweig Figure 1-1
Planning Planning
Organizing Organizing
Control Management
Control
Operational
Control
Robert Anthony
Strategic Planning
Management
Control
Operational
Control
Communicating
Communicating
Anthony defines his two categories of
control as follows:
Management control: The process by which
managers assure that resources are obtained and
used effectively and efficiently.
Operational control: The process of
assuring that specific tasks are carried out
effectively and efficiently.(36)
Two questions must be addressed at this
time. These are as follows:
1. Why, among all of the sources which
postulate the basic functions of an organization,
were Anthony and Johnson, Kast, and Rosenzweig
chosen?
2. What are, within pertinent literature,
other potential classifications of functions?
A quick survey of the table of contents of
The Theory and Management of Systems reveals why
this is such a suitable source. Part II deals
with weapon-system management, automation, and
numerical control.(37) All three of these
characteristics apply to Space Station
development. Weapon systems are usually high
technology, multibillion dollar developments
which last for several years.
Rhocrematics is defined by Johnson, Kast,
and Rosenzweig as the science of managing
material flow, embracing the basic functions of
producing and marketing as an integrated system
and involving the selection of the most effective
combination of subfunctions such as transporting,
processing, handling, storing, and distributing
goods.(38) Automation and control are, of
course, what Artemis and management information
systems are all about.
Part III of the Johnson, Kast, and
Rosenzweig book is devoted to management science,
PERT/PEP (Program Evaluation and Review
Technique/Program Evaluation Procedure), Systems
Design, and People and Systems. All of these are
directly applicable to the evaluation of Artemis
as an MIS for the Space Station Program. Artemis
is often incorrectly called a PERT system (a much
too narrow term for Artemis).
The Theory and Management of Systems is a
classic reference in the area of Systems
Management and its application to a very complex
program such as the Space Station is clear.
The reason for the selection of Anthony's
two categories of control to further delineate
the Johnson, Kast, and Rosenzweig category of
"control" is not so clear. As mentioned earlier,
Johnson, Kast, and Rosenzweig state that "the
managerial function of control is essentially
that of assuring that various organizational
subsystems are performing in conformance to the
plans." Organizations are hierarchies, and
control at the top of the hierarchy is different
than that at the bottom of the hierarchy. There
is a substantial amount of literature which
discusses the needs at different organizational
levels, and in doing so supports the two
categories of control provided by Robert Anthony.
The need for control at different levels
of the organization is depicted in the following
exhibit. (39)
[Exhibit: a managerial-hierarchy pyramid showing
the percentage of time spent in planning and
control at each level of the hierarchy.]
Source: Gerald E. Nichols, "On the Nature of
Management Information", April 1969, pp.
9-13, as reprinted in V. Thomas Dock, Vincent
P. Luchsinger and William R. Cornett, MIS: A
Managerial Perspective (Chicago: Science
Research Associates, Inc., 1977), p. 71.
The above information reflects significant
differences within the term "control". At least
one of the evaluation parameters selected is
involved in the Ein-Dor and Segev comparison.
The time span, and presumably the required
response time, differs between management control
and operational control. The day-by-day need for
operational control information would definitely
drive system design, implementation, and
operation. Of course, it is possible that an MIS
is not needed for operations control if the
amount of judgment required actually is "none".
According to Aron, the purpose of a MIS is to
assist in decision making under conditions of
uncertainty.(40) Perhaps a MIS is not needed for
operational control as defined by Anthony and
Ein-Dor and Segev.
Planning Theory and Concepts
Phased Project Planning (PPP) within NASA
is discussed in great detail in Appendix B. From
this discussion two concepts emerge. First,
planning within NASA has a technical orientation
(i.e., networking, the use of hierarchical tools
such as work breakdown structures, etc.), and is
so related to control that these two functions
are inseparable. These two concepts scope the
following theory review. The initial paragraphs
will discuss technical planning, and in doing so,
relate Artemis to this function. The latter
paragraphs will relate planning to control. The
overlap with the subsequent theory review of
control is inevitable and beneficial since
planning and control do, indeed, overlap.
In the foreword to his book, Managing
High-Technology Programs and Projects, Russell D.
Archibald briefly discusses the two programs
which best reflect the successful use of
technical planning systems, the Navy's Polaris
Program in the late 1950's, and NASA's Apollo
Program in the 1960's.(41) He goes on to say
that, to be most effective, project management
requires the availability of methods and systems
for integrated project planning and control.(42)
Integrated planning and control will be discussed
later in this section. Methods and systems of
technical planning will be discussed below.
An example of the type of planning network
developed within Artemis is shown in Figure 2-2.
Dozens, if not hundreds, of planning networks
have been developed for the Space Station
Program. Each is so large that it can only be
displayed on a 3 feet by 5 feet sheet of paper.
The technique used for these is a method which
deterministically estimates the durations of
program activities based upon one mean value per
activity.(43) The fact that Artemis is based
upon a deterministic methodology is considered a
drawback by many NASA operations researchers
since the means of activities in a highly complex
program such as Space Station cannot be closely
estimated. Both Apollo and Polaris used a
probabilistic system for planning.(44,45) The
system used was PERT (Program Evaluation and
Review Technique). PERT provides a stochastic
approach to planning networks and, according to
Steiss, is designed to deal with large-scale
projects characterized by:
1. Unclear objectives.
2. Multiple and/or overlapping management
responsibilities.


SAMPLE NETWORK
CERV PHASE B SCHEDULE
[Planning network graphic]
FIGURE 2-2
3. Relatively high levels of uncertainty as
to time requirements and costs.
4. Complex problems of logistics.
5. Problems of sufficient complexity to
justify the use of the computer.(46)
While it appears that NASA might have made
a mistake in selecting a CPM system such as
Artemis for program planning, there are two
mitigating factors which are pertinent to this
discussion. NASA has dropped the requirement for
probabilistic scheduling, permitting the use of
single activity times in planning networks.(47)
Secondly, Metier has developed an applications
software package called PAN which allows for
probabilistic analyses of the deterministically
developed Artemis networks.
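The difference between a single deterministic
duration estimate and PERT's probabilistic
three-point estimate can be sketched as follows.
The formula is PERT's standard weighting; the
activity durations are hypothetical.

```python
# PERT's classic three-point estimate versus a single deterministic
# duration. Durations (in weeks) are hypothetical.

def pert_expected(optimistic, most_likely, pessimistic):
    """PERT expected duration: te = (a + 4m + b) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_std_dev(optimistic, pessimistic):
    """PERT standard deviation of the duration: (b - a) / 6."""
    return (pessimistic - optimistic) / 6

deterministic = 12.0                 # one mean value per activity (CPM style)
te = pert_expected(8.0, 12.0, 22.0)  # same activity, skewed pessimistic
print(te, pert_std_dev(8.0, 22.0))
```

With a pessimistic tail, the PERT expectation
exceeds the single most-likely estimate, which is
why operations researchers prefer it for highly
uncertain programs.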
Also discussed earlier was the use of
Artemis to develop planning schedules. The
schedule shown in Figure 2-3 is a typical Gantt
bar chart, denoting current status and planned
activities and events.(48) Artemis can be used
to develop Gantt charts directly, or can develop
schedules directly from planning networks.
Considering the cost of Artemis (see the
discussion of "cost" in Chapter 4), using it for
the development of simple schedules is not cost
effective. Developing easily understood
schedules from complex planning networks is
certainly a cost effective use of the system.
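Deriving schedule dates from a planning network
amounts to a forward pass through the activity
dependencies. The sketch below is a generic
CPM-style forward pass, not Artemis's actual
algorithm, and the activities and durations are
hypothetical.

```python
# Forward pass through a hypothetical activity network: each activity's
# earliest start is the latest finish among its predecessors. The output
# is the start/finish data a Gantt bar chart would display.
activities = {           # name: (duration_in_weeks, predecessors)
    "Requirements": (4, []),
    "Design":       (6, ["Requirements"]),
    "Build":        (8, ["Design"]),
    "Test":         (3, ["Build"]),
    "Docs":         (5, ["Design"]),
}

schedule = {}

def finish(name):
    """Compute (and cache) the earliest start/finish of an activity."""
    if name not in schedule:
        duration, preds = activities[name]
        start = max((finish(p) for p in preds), default=0)
        schedule[name] = (start, start + duration)
    return schedule[name][1]

for name in activities:
    finish(name)
for name, (start, end) in schedule.items():
    print(f"{name}: weeks {start} to {end}")
```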
Work Breakdown Structures are discussed
and displayed in Appendix B. Metier has
developed a WBS software program for use with
Artemis, an example of which can be seen in
Figure 2-4. It is a relatively expensive program
(approximately $10,000), but will graphically
display a WBS of thousands of elements, and in
doing so display horizontal and vertical
relationships. The value of a WBS cannot be
overestimated. Steiss speaks to its value and
addresses four of the five functional elements of
this research. He states:
A WBS can provide the manager with an
important tool for the integration of
planning and control functions at various
responsibility and authority levels within
an organization. It also serves as an
important mechanism for the
communication of project-management
information.(49)
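The hierarchical roll-up that makes a WBS useful
for integrating planning and control can be
sketched as a small tree. The element names and
costs below are hypothetical, not the Space
Station Program's actual WBS.

```python
# Hypothetical WBS fragment: costs entered at the lowest-level elements
# roll up through the hierarchy, tying each responsibility level to the
# levels above and below it. Costs are in $M and purely illustrative.
wbs = {
    "1 Space Station": {
        "1.1 Structure": {"1.1.1 Truss": 40, "1.1.2 Modules": 120},
        "1.2 Power":     {"1.2.1 Arrays": 60},
    }
}

def rollup(node):
    """Sum leaf costs upward; returns the total for this element."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node

print(rollup(wbs))  # total program cost rolled up from the leaf elements
```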
Archibald discusses the Project Breakdown
Structure (PBS), which he acknowledges to be the
same as the WBS. He states:
The PBS thus becomes a framework of the
project that enables all management
information (from various systems and
sources) to be correlated and summarized
for planning and control functions.(50)
The Archibald statement not only discusses
the use of the WBS (PBS) for planning and for
the integration of planning and control, it
addresses the correlation and summation of
management information. The statement thus
relates to a great portion of the scope of this
research and, taken with the earlier quote from
Steiss, certainly seems to present a prima facie
case for Artemis as a MIS for planning.
Finally, Artemis, through the use of an
independent applications software program
developed by Metier, can be used for resource
allocation. The software program, CMS (Cost
Management System) can be used to assign
resources to each planning activity. The result
is a funding plan which relates levels of funding
to periods of time.(51)
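The resource-to-activity assignment just
described can be sketched as a roll-up of
activity costs into funding by time period. The
activities, periods, and amounts below are
hypothetical stand-ins, not CMS's actual data
model.

```python
# Hypothetical funding plan: resources assigned to each planning
# activity are summed by the period in which they are spent.
from collections import defaultdict

assignments = [
    # (activity, period, cost in $K) -- illustrative values only
    ("Design", "FY90-Q1", 250),
    ("Design", "FY90-Q2", 300),
    ("Build",  "FY90-Q2", 400),
    ("Build",  "FY90-Q3", 500),
]

funding_by_period = defaultdict(int)
for activity, period, cost in assignments:
    funding_by_period[period] += cost

for period in sorted(funding_by_period):
    print(f"{period}: ${funding_by_period[period]}K")
```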
The previous paragraphs discussed
technical planning and the use of Artemis in such
an effort. The following ones discuss the
relationship of planning and control. This
relationship has already been introduced in the
WBS definition of Archibald and Steiss.
Literature contains numerous other discussions of
this relationship.
In 1965 Robert Anthony wrote a book
entitled Planning and Control Systems: A
Framework for Analysis. In it Anthony portrays
both planning and control as elements in the same
continuum, with the absolute amount of each
depending upon the circumstances and the time
within an activity.(52) Anthony is not unique in
his perception of the relationship between
planning and control. Essentially, Zmud
considers both of these functions to be part of
the same process.(53)
Figure 2-5 displays Zmud's integrated
process.(54) The important factor associated
with Zmud's process is that it is discussed in
the context of management information systems in his
book Information Systems in Organizations.
Artemis is a very complex information system
which is used to develop planning networks which
can be baselined and used for control.
[Figure 2-5 is a flowchart in which plans drive
organizational activities; performance indicators
are compared against targets; a controllable
deviation triggers corrective action, while an
uncontrollable one forces revision of the plans
or controls.]
FIGURE 2-5.- INTEGRATION OF PLANNING AND CONTROL.
SOURCE: ROBERT W. ZMUD, INFORMATION SYSTEMS IN ORGANIZATIONS
(GLENVIEW, ILLINOIS: SCOTT, FORESMAN AND COMPANY, 1977), p. 119.

The above philosophies are supported by
McNichols, who states, "Management cannot devise
a control system as an independent function; it
must be part of the sequential process of
strategic planning".(55)
Obviously planning is a process which
exists throughout the life of the program.
Reinharth, Shapiro, and Kallman state that there
are at least three types of planning: strategic
planning, tactical planning, and operational
planning.(56) Planning occurs at all times
within an organization's existence. These three
authors address a number of the elements of this
research. Among these relationships are the
following:
1. At the operational level there is an
overlap of implementation (of plans) and
control.(57)
2. Planning brings direction and organization
out of chaos.(58)
3. A successful planning system must
communicate a well-defined plan of reasonably
attainable goals.(59)
Reinharth, Shapiro, and Kallman present an
integrated picture of the four basic ingredients
of the research, i.e., planning, organizing,
control, and communication.
The previous paragraphs have demonstrated
that Artemis can be used for both planning and
for the integration of planning and control.
These two capabilities are probably synergistic
since there is evidence that planning and control
are part of the same process. Additional support
for this proposition will be provided within the
discussion of the theory and concepts of
management control and operational control.
Organizing Theory and Concepts
It is the definition of organizing
presented earlier by Johnson, Kast, and
Rosenzweig which will bound this theory review.
That definition is as follows:
The organizing function helps to
coordinate people and resources into a
system so that the activities they perform
lead to the accomplishment of system
goals. This managerial function involves
the determination of the activities to
achieve the objectives of the enterprise,
the departmentalization of these
activities, and the assignment of authority
and responsibility for their performance.
Thus the organizing function provides the
interconnection, or intertie, between the
various subsystems and the total
organization system.
Much of the earlier discussions of
Planning, and for that matter the following
discussions of management control are pertinent