Citation
Metascience, economics, and the methodology of scientific research programs

Material Information

Title:
Metascience, economics, and the methodology of scientific research programs : an exposition and critique
Creator:
Watson, Bruce
Publication Date:
1990
Language:
English
Physical Description:
vii, 122 leaves : illustrations ; 29 cm

Subjects

Subjects / Keywords:
Economics -- Methodology ( lcsh )
Research ( lcsh )
Economics -- Methodology ( fast )
Research ( fast )
Genre:
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )

Notes

Bibliography:
Includes bibliographical references (leaves 116-121).
General Note:
Submitted in partial fulfillment of the requirements for the degree, Master of Arts, Department of Economics.
Statement of Responsibility:
by Bruce Watson.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
23379213 ( OCLC )
ocm23379213
Classification:
LD1190.L53 1990m .W37 ( lcc )

Full Text
METASCIENCE, ECONOMICS, AND THE METHODOLOGY
OF SCIENTIFIC RESEARCH PROGRAMS:
AN EXPOSITION AND CRITIQUE
by
Bruce Watson
B.A., University of Denver, 1983
A thesis submitted to the
Faculty of the Graduate School of the
University of Colorado in partial fulfillment
of the requirements for the degree of
Master of Arts
Department of Economics
1990


This thesis for the Master of Arts degree by
Bruce Watson
has been approved for the
Department of Economics
by
Steven G. Medema
Date


Watson, Bruce (M.A., Economics)
Metascience, Economics and the Methodology of Scientific
Research Programs: An Exposition and Critique
Thesis directed by Professor Suzanne W. Helburn
This thesis examines the relationship between
economics and Imre Lakatos' methodology of scientific
research programs (MSRP). It first defines the four
primary functions of scientific methodology: demarcation
of science from non-science, evaluation of theories,
provision of processual guides for the practice of
science, and "rational reconstruction" of scientific
history. Each function is discussed in terms of how it
is approached in the main methodological traditions:
instrumentalism, conventionalism and falsificationism.
MSRP is then profiled as an attempt to overcome the
limitations of earlier methodologies in performing the
four functions of metascience. This is followed by an
exposition of six examples of its application to
economics. These are categorized as "intensive" or
"extensive," depending on how broadly they define the
unit of analysis, or research program, and how precisely
they specify the hard core of that program. It is
evident that economic methodologists have often gone well
beyond the original Lakatosian construct, in some cases
articulating what amounts to a "neo-Lakatosian" system.


Finally, MSRP is assessed as an economic
methodology. It is argued that as a system which
embodies sophisticated falsificationist epistemology,
MSRP provides useful insights into economic theory.
However, as an attempt to establish a rigid framework to
analyze specific theories, MSRP has proved much less
satisfactory.
The form and content of this abstract are approved. I
recommend its publication.
Signed,


CONTENTS
Figures......................................vii
CHAPTER
1. INTRODUCTION: THE RESURGENCE OF
ECONOMIC METHODOLOGY ......................... 1
2. THE FUNCTIONS OF METHODOLOGY...................9
Problems of Definition ...................... 9
Demarcation Criteria ....................... 11
Theory Evaluation .......................... 15
Processual Guides .......................... 23
Rational Reconstruction .................... 27
Methodological Monism vs.
Methodological Pluralism ................... 38
3. THE DEVELOPMENT AND STRUCTURE OF MSRP ... 42
Evolution of Lakatos' Epistemology .... 42
Structure of MSRP............................47
Criticisms of MSRP...........................53
4. APPLICATIONS OF MSRP TO ECONOMICS.............60
Intensive Applications ..................... 63
The Human Capital Research Program ... 63
The Neoclassical Theory of Demand ... 70
The Neoclassical Theory of Production . 74
Extensive Applications ..................... 81
Marxian Analysis as a Research Program . 82
The Neo-Walrasian Program ............... 86


Neoclassical Economics and
Core Demi-Core Interactions ............. 91
5. ASSESSMENTS..........................103
Economists' Affinity for MSRP...............103
MSRP Concepts and the Concept of MSRP ... 109
BIBLIOGRAPHY ...................................... 117



FIGURES
Figure
1. Interrelationships between
methodological functions .................... 26
2. Basic Lakatosian model
of a research program..........................94
3. Core demi-core model...........................95
4. The disciplinary structure of economics . 97


CHAPTER I
INTRODUCTION:
THE RESURGENCE OF ECONOMIC METHODOLOGY
The publication of Mark Blaug's Methodology of
Economics in 1980 marked the beginning of a period of
renewed interest in economic methodology. Following the
so-called "assumptions controversy" and the debate
kindled by Friedman's 1953 essay "The Methodology of
Positive Economics," both of which had spent themselves
by the late 1960s, methodological discussion had settled
into a predictable pattern. Work in the field was
typically confined to reconstructions of the
methodological positions of leading figures in the
history of economic thought, although there were echoes
of the disputes of the preceding decade.
All of that was to change in the '80s. Suddenly,
methodology was back in vogue. Books and articles
treating methodological issues were again widely read and
discussed. A measure of the field's resurgence is
provided by the steady increase in the number of
methodology papers published during the decade. As a
percentage of total articles listed in category 030,
"History of Economic Thought and Methodology," in the
Journal of Economic Literature, subcategory 036,
"Methodology," increased from only 10% in 1980 to over
17% by 1989.
Several reasons for this resurgence may be advanced.
Perhaps the simplest assigns to methodology the role of
what might be termed "paradigm consolidation." In this
view, each major period of theory building in the
discipline has been followed by a search for
methodological roots. The theoretical constructs which
arose during the period are thereby given greater
coherence and scope, as well as a more rigorous
justification in terms of what are held to be the
indispensable canons of science.
Hence the methodological work of Cairnes, Mill and
Senior served to establish the scientific bona fides of
the classical tradition. J. N. Keynes and Lionel Robbins
adapted that tradition to provide a firmer foundation for
the marginalist revolution, while Terence Hutchison
proposed an alternative ground for that same revolution.
Postwar methodology, from this perspective, reflects the
fact that new theoretical formulations have arisen in
tandem with new economic realities. The F-twist, and the
host of other methodological schemas from Lakatos's to
rhetoric, merely confirm economics as a mature science in
which metascientific discourse is inevitable, if not
always important.
An alternative explanation of recent methodological
controversy takes a less serene view. The '70s and '80s
were widely perceived as a period of crisis for
economics. The old macroeconomic verities, embodied in
the neoclassical-Keynesian synthesis, fell increasingly
into disrepute. Inherited theoretical structures seemed
consistently wanting in the face of successive rounds of
"exogenous" price shocks and stagflation. Alfred
Eichner (1986, 4) expressed a prevalent view:
". . economic theory . .offers no solution to the
central problems that politicians and the public alike
are impatient to solve." Such censure was not limited to
politicians and the public. Leading theoreticians like
Wassily Leontief and Herbert Simon voiced disapproval of
the state of the profession and the paradigm which
dominates it.
Much of the criticism that emerged from this crisis
was methodological. In a spate of books and articles
with titles such as What's Wrong With Economics? and Why
Economics Is Not Yet A Science, critics located the
source of the discipline's maladies primarily in the way
economists build and test (or don't test) their theories.
Their strictures portrayed a theoretic regimen which
placed too high a premium on mathematical abstraction,
deductive logic, and assumptions derived from intuition
and introspection. Leontief, in his presidential address
to the AEA, denounced what he viewed as a "preoccupation
with imaginary, hypothetical, rather than with
observable reality." (Leontief 1971, 3) Herbert Simon
articulated a similar criticism and offered an historical
explanation:
Economists don't develop the same empirical
approaches in theorizing as in other social
sciences and in the natural sciences because we
had . . . the great fortune--or maybe it was a
misfortune--to have inherited a really neat
theory. Anybody with the slightest
mathematical bent can't help but view
neoclassical general equilibrium theory . . . as
a very, very beautiful thing. . . . We don't want
to depart very far from such a powerful
deductive theory. Economists are very
reluctant to recognize and accept facts in the
real world that seem to fly in the face of that
beautiful theory, or undermine its basic
assumptions. (Simon 1986, 21)
Criticism of economics as a discipline almost always
centers on the dominant paradigm, neoclassical economics,
and almost always involves an explicit or implicit
critique on methodological grounds. Theoretical results
are inadequate or irrelevant, it is alleged, because the
neoclassical method of theory construction is
fundamentally flawed.
As the legitimacy of the prevailing conceptual
structure comes under attack, procedural support is
sought both for its overthrow and for its defense. The
superiority of rival theories is often said by their
proponents to derive, at least partially, from their
superior methodological underpinnings. Thus, defenders
of orthodoxy must either uphold their original
methodology as the most workable (or only possible) way
to proceed, or have recourse to alternative
methodological formulations which can be made to provide
some justification for the way in which their theories
are constructed.
Such alternative methodologies have been readily
available in the '70s and '80s. Indeed, yet another
cause of the revival in methodological scholarship in
economics could be the availability of new "imports" from
the philosophy of science. Beginning with the
publication of T.S. Kuhn's Structure of Scientific
Revolutions in 1962, controversy and innovation in the
history and philosophy of science have been a prominent
feature of the intellectual landscape. In particular,
the debate between Kuhnians and Popperians has spawned
numerous alternative methodologies and given new scope
and relevance to older ones. Sociology of knowledge
theories, which originated in the work of Karl Mannheim
in the 1930s, acquired a new importance in the light of
Kuhn's ideas. In the social sciences, the possible
significance of ideological considerations became a topic
of explicit methodological analysis in the late '60s,
particularly as a component of an overall radical
assessment of the aims and impact of orthodox thought.
Popperians scrambled to reinterpret the basic
elements of falsificationism so as to render it immune
from Kuhnian attack. Problematic as they were,
falsificationist tenets had been the "standard" against
which scientific practice, as well as alternative
methodologies, were judged. The indisputable fact that
these tenets were generally at odds with the actual
history of science was a disturbing anomaly for most
Popperians, if not for Popper himself. Of what value is
a purely normative, hortatory methodology? Shouldn't
methodology be able to account for the way science is
actually done? Are normative considerations even
appropriate? Questions such as these provided the
impetus for the work of several "post-Popperians," and
especially for the ideas of Imre Lakatos. Their work
acquired added significance by virtue of the fact that
methodologists like Kuhn and Feyerabend called into
question not only Popper's "standard" methodology, but
the proper role of methodology itself. These
developments will be discussed in greater detail in the
next chapter.
For economists, the new ideas readily available from
philosophy of science furnished new ways to consider
their discipline--its methods of theoretical discovery
and justification, even its claim to the status of a
science. Imports from other areas such as literary
criticism afforded still more opportunities.
Deconstructionist analysis of economic concepts, unheard
of a decade ago, is now a growing area of scholarship. A
related development is the articulation of the "rhetoric
school" of McCloskey and Klamer. This diversity of
methodological approaches in economics clearly owes much
to work in other fields.1
Imre Lakatos' methodology of scientific research
programs is a philosophy of science import which has
found numerous applications in economics. This thesis
examines some of these, and attempts to assess the
overall usefulness of Lakatos' methodology in
understanding the assumptions, mode of inquiry and
structure of the discipline. Chapter II defines the four
primary functions of scientific methodology: demarcation
of science from non-science, evaluation of theories,
provision of processual guides for the practice of
science, and "rational reconstruction" of scientific
history.
1 Such diversity also owes something, at least, to
the ingenuity of economists who see in applications of
these ideas a virtually unlimited source of journal
articles.
MSRP is then profiled in Chapter III as an attempt
to overcome the limitations of earlier methodologies in
performing these four functions. This is followed by an
exposition in Chapter IV of six examples of its
application to economics. These are categorized as
"intensive" or "extensive," depending on how broadly they
define the unit of analysis, or research program, and how
precisely they specify the hard core of that program. It
is evident that economic methodologists have often gone
well beyond the original Lakatosian construct, in some
cases articulating what amounts to a "neo-Lakatosian"
system.
Finally, MSRP is assessed as an economic
methodology. It is argued that as a system which
embodies sophisticated falsificationist epistemology,
MSRP provides useful insights into economic theory.
However, as an attempt to establish a rigid framework to
analyze specific theories, MSRP has proved much less
satisfactory.


CHAPTER II
THE FUNCTIONS OF METHODOLOGY
This chapter offers a cursory survey of some of the
major concerns of metascience which are of particular
relevance to economics. It is designed to place these
issues in historical context, and provide some indication
of the problems which methodological scholarship must
face in any discipline. In addition it provides the
necessary background for the discussion of Lakatos, and
applications of his methodology to economics, which
follows in Chapters III and IV.
Problems of Definition
Much confusion exists about the definition of the
word "methodology." In general parlance, it is often
used as a (perhaps more impressive sounding) synonym for
"method," although, properly considered, the two terms
denote completely different levels of analysis of
scientific activity. Fritz Machlup decries what he calls
the "debasing of the term"; he cites as evidence of
malapropism talk of "methodology" by a statistician
. . . trying to explain the procedures and
techniques he used in collecting, arranging,
and analyzing his numerical data. . . . (Machlup
1978, 7-8)
This is to be contrasted with usage by
. . . discriminating writers [who] resisted the
new fashion and, modestly and appropriately,
wrote 'Technical Notes' for their discussions
of scope and method, definitions, weighting and
estimating procedures, standard errors, the
process of sample selection, and similar
problems, without any incantation of
'Methodology'. (Machlup 1978, 8)
Methodology, properly conceived, is a branch of
logic and epistemology, and is not concerned with the
development or application of a particular research or
experimental technique. Shapere, while noting the
ambiguity involved between the words method and
methodology, reserves the latter term for:
. . . the conceptual steps [involved in
scientific procedures] as explicitly expounded
by scientists and philosophers, and the latter
as philosophically explicated or reconstructed.
In this last case methodology is no longer
merely descriptive but is partly legislative.
(Shapere 1978, 580, n. 11)
This last nomothetic role of methodology is perhaps its
most controversial aspect. As will be seen below, it
enters into methodological discussion at several levels.
Having attended to semantic preliminaries, the
question remains as to what, precisely, methodology is
supposed to do. Given its status as a logical and
epistemic enterprise, what function does it serve in
science? This question has had different answers in
different epochs. However, in twentieth-century
philosophy of science, methodology has been assigned
basically four functions: Demarcation of "science" from
"non-science"; establishment of criteria for evaluating
theories; provision of processual guides to the practice
of science; and the "rational reconstruction" of
scientific history.2 The first three are usually deemed
to embody the normative dimension of methodology; the
last captures its descriptive aspect.
In brief, methodology should answer five questions:
What is science? Which theory in a given field is the
best in light of current evidence? On which theory
should scientists in a given field work? How should they
work on it? How have scientists worked on theories,
especially those which are acknowledged to have been the
best theories, in the past?
Demarcation Criteria
Enunciation of demarcation criteria has been a
pivotal concern in almost all modern methodological
discourse. The perceived need to designate some theories
as scientifically acceptable and others as "unscientific"
due to their origin and/or structure is a pervasive
warrant for methodological scholarship. Clearly, the
most recent wellspring of this preoccupation is the
logical positivists' demarcation between meaningful and
meaningless propositions.3 The former included both
analytic propositions, those which are true by
definition, and synthetic statements, those which derive
from observable reality. Synthetic statements, however,
were deemed by logical positivists to be meaningful only
if they were, in principle at least, capable of empirical
verification. Otherwise, synthetic propositions were
considered meaningless, and as such had no place in
"scientific" thought.
2 This list obviously exhibits the influence of the
logical positivists and their critic/emender, Karl
Popper. For better or worse, these philosophers
undeniably set the agenda for modern philosophy of
science.
This basic dichotomy served as the point of
departure for virtually all subsequent philosophers of
science. The numerous problems with this view provided
fertile ground for dissent and reformulation. Karl
Popper, in particular, translated the words "meaningful"
and "meaningless" into "scientific" and "unscientific,"
and sought less problematic grounds for distinguishing
between the latter two. Additional impetus for Popper
was provided by the rise of theories, such as
psychoanalysis and Marxism, which he found especially
wanting in terms of epistemological rigor. How could
such ideas be unambiguously relegated to the area outside
the pale of true science? Popper's specific answer to
this question will be sketched below. But the question
itself, or at least discussion of the demarcation issue,
has not been confined to philosophers working in the
Popperian tradition.
3 Obviously, the search for historical antecedents of
the demarcation criterion can reach as far back as Kant
and Hume, but the most immediate source, the work which
almost defines twentieth-century methodology, is logical
positivism/empiricism.
In essence, the articulation of any demarcation
criterion between science and non-science seeks to
establish a basic ontology, and in due course, a
distinctive epistemology based on this ontology. Most
modern philosophies of science posit an "objective"
reality, i.e. a reality independent of perception. The
question then becomes: How can this reality best be
apprehended? Formulation of a demarcation criterion
presumes that there is something fundamentally distinct
about a "scientific" answer to this question. It
stipulates certain elemental requirements that such an
answer must satisfy. Moreover, it confers on any answer
which meets these requirements the status of "objective"
knowledge which transcends the vagaries of individual
experience.
Hence, any coherent demarcation standard involves a
judgment concerning what is unique about science. In the
work of Kuhn, and even more so in Feyerabend, this
uniqueness is social and institutional. "Scientific"
explanations are only propositions which carry the
sanction of a distinct social group, viz. scientists.
For Feyerabend, there is no reason to assume that ideas
espoused by this segment of society correspond any more
closely to reality than the ideas of any other social
group. In this view, demarcation is entirely a
sociological exercise.
Philosophers like Popper and Lakatos, however,
postulate science as a distinctive process, not as a
social unit or institution, or even a body of knowledge.
For scientific inquiry to proceed, statements about the
world must be categorized in some consistent, often
hierarchical, way. For example, what is the status of
perception and the "facts" which are based on perception?
Can they be taken as basic, i.e. as the foundation of all
other knowledge? Can perceived "facts" be prescinded
from the complex of intellectual constructs which are
called theories? Answers to these questions stake out
the territory of science, its unique domain and
prerequisites. Certain theory structures can be shown to
be "scientific" if they are proposed, developed and
accepted or rejected in conformity with an accepted
methodological canon. But this is only one component of
the normative dimension of scientific methodology.
Theory Evaluation
Another normative component is concerned with the
comparison and judgment of two or more rival theories.
Based on the demarcation criterion, a theory is expected
to conform to certain requirements in order to be
considered "scientific." While these requirements are
both necessary and sufficient for a given theory to be
preferred over a "non-scientific" proposition, they are
necessary but not sufficient for a theory to be preferred
over another "scientific" theory. Additional criteria
must be specified, and theories compared to determine
which is more in conformity with those criteria. Hence,
an inter-theoretic standard is essential if preference
for one theory structure is to be logically justified.
This is a major point of contention between Popperian, or
Popper-influenced methodologists, and methodological
relativists.4 The latter deny the existence of a pan-
theoretical norm which can serve as a touchstone to
evaluate theories. Theories are, in this sense,
radically incommensurable--incapable of appraisal on
terms other than those which are internal to the theories
themselves. Because of this, in the relativist view,
methodology as an evaluative enterprise is doomed to
failure. Theories simply cannot be compared in terms of
a single normative criterion.
4 Included in this category are such proponents of
the "strong program" in the sociology of knowledge school
as David Bloor, Bruno Latour and Steve Woolgar. Of
course, Kuhn was often accused of metascientific
relativism, but a close reading of his work, especially
his later work (including the modifications in the second
edition of his Structure of Scientific Revolutions) makes
this charge difficult to sustain. However, it is fair to
say that Kuhn was not much interested in the normative
dimension of methodology. It is also undeniable that the
sociology of knowledge theorists owe a great deal to
Kuhn, and their ideas show some affinity to basic Kuhnian
concepts.
Karl Popper, on the other hand, attempted to develop
what might be called a "calculus of comparison." His
version of falsification depended on the existence of a
theory which was demonstrably superior to the theory
being "falsified." Hence, it is vital that two competing
theories be contrasted, and the basis of that contrast
lies, for Popper, in the notion of "empirical content."
This term is usually taken to denote the set of
statements which could serve as falsifiers of a given
theory. The bolder or riskier a theory is, the larger is
its associated set of potential falsifiers, and thus the
larger its empirical content. The title of one of
Popper's books, Conjectures and Refutations, sums up much
of his approach to metascience. Hypotheses can be as
daring as the scientist wishes; in fact, the bolder the
better. But, these conjectures, or rather their
implications, are then compared with observed phenomena
to assess their accuracy and viability. If a "round" of
observation fails to refute them, they are said by
Popper, not to have been proved, but rather to have been
corroborated.
Given two rival theories which seek to account for
roughly the same phenomena, the better theory is the one
which has excess empirical content over its rival,
assuming that some of that content has also been
corroborated. But the notion of empirical content is
difficult to pin down in quantitative terms. Obviously,
the number of statements which could act as a theory's
potential falsifiers is infinite. Popper proposed two
ways around this problem. The first entails the notion
of "basic statements," i.e. empirically derived
observation statements. In some instances, we can
compare the "degree of universality" and "degree of
precision" of the basic statements that constitute the
set of a theory's potential falsifiers. If the set of
such falsifiers of theory T1 has a higher degree of
universality or precision than those of theory T0, T1 is
held to have the greater empirical content (Popper 1959,
121-123).
The second means of comparing empirical content
involves the idea of "dimensionality." The dimension of
a theory is given by the least number of statements which
would be required to refute it. By this criterion, a
statement of lower dimension is more falsifiable than one
of higher dimension, and hence has more empirical
content. Therefore, of two rival theories, the one that
has the lower dimension is preferable. (Popper, 1959, pp.
126ff.)
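A stylized example may make the idea of dimension
concrete; it follows Popper's own geometrical illustration
of planetary orbits and is offered here only as an
expository sketch. Any three observed positions are
compatible with some circle, so at least four basic
statements are needed to refute the claim that an orbit is
circular; five positions are required to fix an ellipse,
so at least six basic statements are needed to refute the
elliptical claim. Writing d(T) for the least number of
basic statements required to refute theory T,

\[
d(\text{``the orbit is a circle''}) = 4
\;<\;
d(\text{``the orbit is an ellipse''}) = 6,
\]

so the circular hypothesis, having the lower dimension, is
the more falsifiable of the two and hence the one with the
greater empirical content.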
In the development of his brand of falsificationism,
Popper confronted what has come to be known as the Duhem-
Quine thesis. This proposition stems from the ideas of
Pierre Duhem, contained in his The Aim and Structure of
Physical Theory (1906), and further elaborated by the
logician W. O. Quine in his essay "Two Dogmas of
Empiricism" (1951). The "weak variant" of this thesis,
due mostly to Duhem, asserts the impossibility of testing
a single hypothesis.5 Inescapably, hypotheses come in
clusters; to isolate only one out of the matrix of
conjectures, theories, etc. in which it is enmeshed is an
empirical, if not a logical, absurdity. Thus, whether
one attempts to confirm (prove) or refute (falsify) a
given proposition by testing it, one invariably tests a
cluster of ancillary propositions as well. The most that
can be said of the results of such a test is that the
results tend to confirm or disconfirm that cluster, that
conjunction of hypotheses which includes the target
proposition. As Quine pointed out, a corollary of this
interpretation is that any theory can be "saved" from
falsification by sufficient adjustments to one or more of
the auxiliary hypotheses. Lakatos notes (Lakatos 1978a,
96-97) that this constitutes a severe challenge to
"dogmatic" falsificationism, but not to "methodological
falsificationism." (Cf. below, 45-46)
5 Lakatos devotes the concluding pages of his
"Falsification and the Methodology of Scientific Research
Programmes" to a discussion of Duhem-Quine, both in its
weak and strong versions. (Lakatos 1978a, 96-101) For
its relevance to economics (and MSRP), cf. Heijdra and
Lowenberg (1986) and Cross (1982).
Quine's extension of Duhem's ideas constitutes the
so-called "strong version" of the Duhem-Quine thesis.
The substance of his argument is that any alteration in
any hypothesis within a given cluster may contradict
another hypothesis, possibly well-established, in a
totally different area of knowledge. Hypotheses are
linked, not only in particular clusters related by the
phenomena or groups of phenomena they purport to explain,
but also in the entire structure of knowledge as a whole.
Science, in this view, is an intricate fabric; pull one
thread and the whole may start to unravel.
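The logical crux of the Duhem-Quine thesis can be put
schematically; the notation below is supplied purely for
exposition and is not Duhem's or Lakatos' own. If a target
hypothesis H, together with auxiliary assumptions
A_1, . . ., A_n, entails an observational prediction O,
then an adverse observation refutes only the conjunction:

\[
(H \wedge A_1 \wedge \cdots \wedge A_n) \rightarrow O,
\qquad
\neg O \;\Rightarrow\; \neg (H \wedge A_1 \wedge \cdots \wedge A_n).
\]

Logic alone does not say whether the blame falls on H or
on some A_i; that latitude is what permits the
"adjustments" Quine describes. The strong variant simply
widens the field from which the A_i may be drawn to the
whole body of accepted knowledge.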
The strong version of Duhem-Quine seemingly negates
any attempts at falsification of any theory whatever. If
all tests are "global" in the sense that ". . . each test
is a challenge to the whole of our knowledge . . ."
(Lakatos 1978a, 98), then every negative result would
imply that science must start from scratch. Predictably,
this version of Duhem-Quine has been used by proponents
of methodological anarchism to maintain the claim that
normative metascience is, at best, a kind of
philosophical hoax. However, Lakatos avers that even in
its strong variant, Duhem-Quine poses no problem for a
"sophisticated" methodological falsificationism. Indeed,
he developed MSRP in part to circumvent the more nihilist
aspects of Duhem-Quine, and to preserve a semblance of
Popperian methodological appraisal.
The normative aspect of metascience, so prominent in
Popper's methodology, highlights a concomitant question:
What is the goal or aim of science? Although the
question is normally a part of methodological analysis,
it is not inevitably so. Hands points out that in his
early philosophy Popper had developed
. . a set of rules for the game of science
(rules for demarcating science from nonscience,
rules for correctly playing the scientific
game, and a criterion for successful play or
progress) without providing an ultimate aim or
purpose for playing the game. (Hands 1990, 4)
Most earlier metascience conceived the ultimate aim of
science to be the discovery of truth. A particular
metascientific prescription was then judged in terms of
the likelihood that it would further this aim. The
notion of "truth" became such a philosophical casus belli
that most methodologists and logicians from the 1930s on
couched their views on the goal of science in other
phrases; for Popper, such a phrase was "verisimilitude,"
or truth-likeness. However, as Hands demonstrates,
Popper adopted verisimilitude as the object of scientific
inquiry only in 1959-60, after his acquaintance with the
work of Alfred Tarski.6
If the link between methodology and putative
scientific aims is severed, Lakatos observed that "The
rules of the game, the methodology, stand on their own
feet; but these feet dangle in the air without
philosophical support." (Lakatos 1978a, 154) The
ontological/epistemological role of metascience,
discussed above, is fulfilled only when there is some
posited reality as a referent.
6 Tarski had attempted to resolve problems in the
discussion of "truth" by relying on metalanguages. The
details of his argument are well beyond the scope of this
paper. Briefly, Tarski asserted that the truth of a
statement made in a certain theoretic "language" can only
be assessed by a separate and distinct metalanguage.
It is only in these terms that the concept of
progress in science has any meaning; progress must be
defined with reference to some ultimate end. If the aim
of science is greater approximation to the truth, i.e. to
objective reality, then theories may be compared on their
truth-likeness, their ability to account for and explain
ever larger sets of observed phenomena. This idea was
the essence of Popper's notion of verisimilitude, and
some comparable view is common to most metascientists,
including Lakatos, who were influenced by Popper.
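Popper's comparative notion can be given set-theoretic
form; the formulation below is his own, from Conjectures
and Refutations, and is reproduced here only as a point of
reference. Writing Ct_T(T) and Ct_F(T) for the classes of
true and false consequences of a theory T, theory T_2 has
greater verisimilitude than theory T_1 just in case

\[
\mathrm{Ct}_T(T_1) \subseteq \mathrm{Ct}_T(T_2)
\quad \text{and} \quad
\mathrm{Ct}_F(T_2) \subseteq \mathrm{Ct}_F(T_1),
\]

with at least one of the inclusions proper: the better
theory says more that is true and no more that is false.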
Yet this is not the only way in which reality can
act as referent to theory. Instrumentalism eschews, at
least initially, any concern with explanation. Empirical
phenomena enter only as "checks" against predictions made
on the basis of theory; the theory whose predictions are
closest to what actually happens (or, in the case of
retrodiction, closest to what actually happened) is held
to be superior. It should be noted that there are many
varieties of instrumentalism, each defined by the goal
which theory is conceived to foster. Hence, theories can
be prediction "tickets," or can be designed primarily to
facilitate policy recommendations. In the latter case,
theoretical progress is assessed on more pragmatic
criteria: How well do the policies which issue from a
given theory work? What is absent in most
instrumentalist metascience is the requirement that
theory explain. Explanation enters in some
instrumentalist formulations only in the sense that
predictive power can be taken as evidence of explanatory
power; a theory which predicts better might do so because
it is closer to the "truth."7
In a purely conventionalist metascience, reality
enters only as a set of observations to be aggregated
into convenient sets. Conventionalist science, like
instrumentalist, surrenders any goal of truth or truth-
likeness. The primary criterion of theory evaluation is
serviceability: Which theoretical formulation provides
the simplest, most "elegant" way to organize certain
data? As will be seen below, conventionalism has played
a major role in modern philosophy of science, as well as
in recent economic methodology.
Processual Guides
The word processual is used here advisedly. As the
above discussion about the difference between "method"
and "methodology" should make clear, the latter does not
offer specific procedural instructions. Rather, the
focus is on the scientific process, i.e. the process of
theory construction, appraisal and acceptance or
elimination.
7 It is sometimes argued that Kuhn's ideas come close
to instrumentalism in this sense: Contrary to Popper,
Kuhn denied that theoretical success was any indication
of truth-likeness. Scientific advance does not mean that
scientists are getting closer to objective reality. It
merely means that theories are developed which have
greater "internal consistency" or which are preferable
for their "simplicity, fruitfulness", etc. (Kuhn 1970,
185, 199)
This function of metascience has two elements. In
terms of the five basic questions of methodology listed
above, these elements constitute answers to the
questions: On which theory should scientists in a given
field work? and How should they work on it? Although
potentially related to appraisal criteria, these
questions are logically distinct from such criteria.
John Watkins (1989) points out that general agreement
within the scientific community on what, at a given time,
is the best theory does not automatically mean that
scientists working in a field should abandon all others.
Indeed, it can be argued that scientific inquiry requires
theoretical diversity. Competition between rival
theories is beneficial, as long as there is some overall
assessment as to which theory is "ahead" at a given time.
As will be discussed below, this idea is a salient
feature of the later versions of Imre Lakatos'
methodology, and one of the most controversial.
Obviously, there are several other possible
interrelations between normative methodological roles.
The following figure gives a rough idea of these
interrelations in the four major metascientific systems
discussed above. A "+" indicates that a methodology is
concerned with the issue designated; i.e. it establishes
criteria for answering the relevant methodological
question. A "-" denotes that, at least in general, the
methodology offers no such criteria. Equals signs are
placed between issues for each methodology to indicate
methodological interrelationships. For example, an "="
between process and evaluation signifies that a
methodology establishes some definite processual
criteria, and that theories are judged by that
methodology based on their conformity with those
criteria. Column headings are as follows: The "Why"
column indicates each methodology's view of the general
aim of scientific activity; "Process" denotes whether or
not a methodology lays down processual rules, i.e.,
whether it has an answer to the "how" question; "Eval."
specifies the presence or absence of appraisal criteria a
methodology might offer; and "Selection" designates that
a methodology includes some dictate as to when scientists
should abandon a particular theory or theory structure.
Method.          Why                  Process  Eval.  Selection
Popperian        Truth-likeness          +   =    +   =    +
Lakatosian       Truth-likeness          +   =    +
Conventionalism  Best org. of data       -
Instrumentalism  Prediction or           -        +   =    +
                 policy, etc.
Figure 1: Interrelationships between methodological
functions
Analysis of one row may help to make the table
clearer. In Popper's falsificationist metascience, and
in later work which bears more or less affinity to
Popperian ideas, one observes that advice on how to
proceed, appraisal, and decisions about when to abandon
theories are all a part of the methodology. Further,
they are all linked. A crude synopsis of this linkage
would be something like:
Theory should be constructed in accordance with the
tenets of falsificationist methodology--Production
of testable hypotheses; attempts at falsification;
elimination on refutation; continued testing if not
refuted.
Theory T' has survived numerous attempts at
falsification. Theory T has not. T' also has
excess empirical content over T. Therefore T' is to
be preferred to T.
Scientists should cease work on T and work on T',
continuing to try to falsify it, and adding other
testable auxiliary hypotheses.
As in any attempt to reduce complex and diverse
ideas to a simple schema, numerous caveats are in order.
Each tradition of the four has a myriad of interpreters
and hence disparate interpretations. Undoubtedly, some
of these interpretations would not fit into the pattern
assigned above. Some brands of conventionalism, for
example, incorporate elements of falsificationist views
regarding appraisal. Some varieties of instrumentalism
afford definite processual advice, often based on their
distinctive view of the aims of scientific inquiry. All
the above thumbnail sketch is meant to indicate are
tendencies within each metascientific tradition, with the
ultimate intention of clarifying the normative functions
of scientific methodology.
Rational Reconstruction
Demarcation criteria, processual rules, appraisal
criteria and selection guidelines are all prescriptive;
they are all concerned with how science should be done.
They were seemingly articulated with little regard to how
science is actually done, how theories have actually been
formulated, tested, accepted or eliminated. Yet, as will
be discussed below, the connection between the
prescriptive and descriptive dimensions of metascience is
effected by a process called "rational reconstruction."
This term first entered methodological discourse
through the work of Rudolf Carnap. He wished to give a
rigorous account of the origin of empirical knowledge, to
ground observation in a precise theory of sensation,
perception and cognition. For this, he developed an
abstract model of the relevant psychological processes.
He described this model as a rational reconstruction of
actual mental activity, a "schematized description of an
imaginary procedure . . . which would lead to essentially
the same results as the actual psychological process."
(Carnap 1963, 16)
The term was then adopted by Hans Reichenbach in his
Experience and Prediction (1938). Reichenbach wanted to
consider epistemology purely from the standpoint of its
"internal relations," relations which "... belong to
the content of knowledge . . ." in contrast to "external
relations," which ". . . combine knowledge with utterances
of another kind which do not concern the content of
knowledge." (Reichenbach 1938, 4) Internal relations are
the proper object of epistemology, while external
relations lie in the sphere of sociology. Rational
reconstruction denotes the attempt to give an account of
knowledge entirely from the standpoint of its internal
relations, i.e. the logical structure of the ideas which
constitute it:
What epistemology intends is to construct
thinking processes in a way in which they ought
to occur if they are to be ranged in a
consistent system; or to construct justifiable
sets of operations which can be intercalated
between the starting-point and the issue of
thought-processes, replacing the real
intermediate links. Epistemology thus
considers a logical substitute rather than real
processes. For this logical substitute the
term rational reconstruction has been
introduced . . . (Reichenbach 1938, 5)
The idea of rational reconstruction and the distinction
between internal and external relations plays a major
role in Lakatos' methodology of scientific research
programs. Lakatos' use of the term can best be
understood by substituting the words "philosophy of
science" for "epistemology," and "theory construction"
for "thought processes" in the above quotation. He was
among the first methodologists to ask whether, and how, a
coherent, rational account can be given of the actual
history of science.
Lakatos' interest in this question was prompted by
Kuhn's discussion of scientific change in The Structure
of Scientific Revolutions. Nineteenth century
inductivist notions of knowledge as a cumulative entity,
painstakingly built up through patient observation and
inductive generalization, had been thoroughly discredited
by precursors of the logical positivists, and by
conventionalist/instrumentalists like Duhem. Questions
about scientific history and its relation to methodology
had not interested the logical positivists themselves, or
even Popper. For many years, scholarship in this area
languished.
However, the publication of Kuhn's book put the
issue on center stage.8 Kuhn sought to account for those
episodes in the history of science when fundamental
change occurs, when one paradigm is supplanted by
another. These instances he likened to a gestalt switch
in perspective. But as an historian of science, he was
also aware of the extraordinary continuity in scientific
development; true revolutions are rare. Contrary to
basic falsificationist tenets, theories exhibit
remarkable tenacity. They are not surrendered
immediately on the appearance of some empirical anomaly
or some adverse experimental result.
The problem of accounting for the long periods of
scientific conformity (which Kuhn termed "normal
science") while still explaining revolutionary episodes
induced Kuhn to fuse external with internal
methodological components. Following Polanyi, Kuhn now
admitted sociological and psychological factors into
metascientific discussion, and the sociology of knowledge
school which developed after Kuhn followed this line to
its logical extreme--only sociological factors were held
to be important in analyses of scientific history and
practice. This notion is also prominent in the work of
methodological "anarchists" like Feyerabend.
8 What follows is not intended to give anything like
an adequate account of Kuhn's views. That is obviously
not germane to the major focus of the discussion.
Instead, it is enough to describe the overall import of
Kuhn's ideas, and the basic problems which gave rise to
them.
One of Lakatos' most notable contributions to
metascience was the articulation of a model of rational
reconstruction which sought to preserve the primacy of
internal factors. Indeed, he saw his methodology as the
only practical competitor to Kuhn's, viewing the latter
as banishing science to the "... realm of the (social)
psychology of discovery" where "Scientific change is a
kind of religious change." (Lakatos 1978a, 9) The
ultimate meaning of Kuhn, according to Lakatos (and
Popper), was that science is at base an irrational
enterprise.9
With the methodology of scientific research programs
(hereafter MSRP), Lakatos endeavored to reconnect
metascientific prescription with historical description.
Influences derived from the inherent "logic" of
theoretical development itself were to be used as the
basic explanans of scientific change. These influences
alone, Lakatos held, can make it "rational" for
scientists to shift their attention and allegiance from
one set of theories, or research program, to another.
Regardless of the extra-logical (or sociological) factors
which might be operative, scientific history must be
reconstructed so as to abstract from such adventitious
elements, and expose the rational essence of science.10
9 For a recent indictment of irrationalism, cf. Stove
(1982). Surprisingly, Stove broadens the charge to
include not only Kuhn and Feyerabend, but Lakatos and
even Popper as well.
Before concluding this section on rational
reconstruction, an important distinction made in many
methodological discussions of scientific history should
be noted, viz. that between the context of discovery and
the context of justification. This dichotomy was
introduced by the philosopher John Herschel in 1830 to
sustain the idea that "the. procedure used to formulate a
theory is strictly irrelevant to the question of its
acceptability." (Losee 1980, 115-116) It was later
employed by Reichenbach in the formulation of his ideas
about rational reconstruction.
10 Some scholars have suggested that Lakatos' idea of
rational reconstruction often amounts to the actual
misrepresentation of scientific history: "There need be
no resemblance whatever between the 'internal' account so
constructed and the actual exigencies of the case under
examination." (Laudan 1977, 169)
The heart of the distinction lies in its radical
dissociation of the process by which theories are
"discovered" or devised from the process by which they
are formalized, tested, and amended in the light of
empirical evidence. In the context of discovery
. . . anything and everything may be thrown
in: Kepler's mysticism, Newton's alchemy,
Kekule's dreams, Haldane's politics, Keynes's
moral views. There can be a million and one
influences, intellectual, financial, emotional,
social, cultural, political, subjective, and
objective, which lead scientists to come up
with the sort of thoughts they do. (O'Hear
1989, 55)
By contrast, the context of justification is the realm in
which conjectures are made to conform to the canons of
scientific rationality. Hence, scholars who accept the
contextual distinction normally confine methodological
discourse to the context of justification. In this view,
discovery is a "black box," and the myriad influences
which enter into it can best be discerned (if at all) by
psychology, anthropology or sociology. It is only when a
hypothesis has become "objective" or "World 3" knowledge,
in Popper's sense, that it can be subjected to the
scrutiny of metascientific analysis.11
11 Popper's 3 Worlds concept is now widely used,
although it has engendered much philosophical debate. It
was first developed in his Objective Knowledge: An
Evolutionary Approach in 1972. World 1 is the material
world, the world of our sensory perception; World 2 is
the world of thoughts (but not thought), the domain of
totally subjective mental processes; World 3 is the
sphere of "objective" knowledge, which contains the
products of World 2. World 3 continually evolves under
the impetus of logic, criticism, and methodology. It is
the realm of "thought".
Clearly, the distinction between the two contexts
parallels the distinction between internal and external
historical factors detailed above. Rational
reconstruction, then, is concerned with portraying
science solely in the province of the context of
justification, where only internal factors need be
considered.
There is a significant corollary to the contextual
distinction: Scientific theories must be judged on the
basis of methodological criteria which are equally
applicable to all theories. Historical provenance is not
a legitimate ground for appraisal:
Even if, as is sometimes claimed, the spirit of
capitalism created a climate in which men would
naturally seek to quantify, analyse, and
exploit nature, it does not follow that all the
theories produced in this context are not true.
Whether they are found wanting or not will be
determined by their ability to predict the
course of nature, and not by the desires or
beliefs of capitalists, entrepreneurs, mystics,
or social historians. (O'Hear 1989, 56)
Hence, the fact that some of Darwin's ideas were
consonant with a prevailing nineteenth century view of
competitive struggle and "survival of the fittest" does
not explain their acceptance at the time as theories of
biology, let alone their continued favor among biologists
and natural historians (O'Hear 1989, 212).
Not surprisingly, critics of the divorce of
discovery from justification have come primarily from the
sociology of knowledge and "irrationality of science"
schools. Feyerabend, in particular, has argued that the
processes of hypothesis formation and evaluation are
inextricably linked. Those working within a paradigm are
predisposed to see the world in a certain way, and this
will determine not only what hypotheses are developed,
but also how they are evaluated and compared. There is
no empyrean height from which the scientific process can
be viewed "objectively."
The contextual dichotomy has obvious implications
for economics. Far more than in natural science,
theories in economics can be seen as especially sensitive
to the social and political milieu in which they develop.
One of the clearest explications of the problem in
economics is contained in Schumpeter's presidential
address to the American Economic Association (Schumpeter
1949). He accepts the basic contextual dichotomy, but
distinguishes between the processes of discovery and
justification in a way which still allows for creative
interplay between the two. Moral and political
considerations undoubtedly influence the economists'
selection of problems, and the theories which are
proposed to deal with them. However, it is still
possible to maintain an objective stance when evaluating
theories, or even when following theories through to
their logical conclusions:
To investigate facts or to develop tools for
doing so is one thing; to evaluate them from
some moral or cultural standpoint is, in logic,
another thing, and the two need not conflict.
Similarly, the advocate of some interest may
yet do honest analytic work, and the motive of
proving a point for the interest to which he
owes allegiance does not in itself prove
anything for or against this analytic work.
Examples abound in which economists have
established propositions for the implications
of which they did not have any sympathy.
(Schumpeter 1949, 346)
Perception of social phenomena inevitably involves
what Schumpeter called a "pre-scientific" analytic act:
"This mixture of perceptions and prescientific analysis
we shall call the research worker's Vision or Intuition."
(Schumpeter 1949, 351) This is the realm in which
ideology is legitimately operative. Lakatos would later
christen this realm the "hard core," i.e. those
assumptions which are prerequisite to theorizing and are
not subject to question or test.
Schumpeter also conceives the context of
justification in much the same way that Lakatos portrays
work in the "protective belt":
This work consists in picking out certain facts
rather than others, in pinning them down by
labeling them, in accumulating further facts in
order not only to supplement but in part also
to replace those originally fastened upon, in
formulating and improving the relations
perceived--briefly, in 'factual' and
'theoretical' research that go on in an endless
chain of give and take, the facts suggesting
new analytic instruments (theories) and these
in turn carrying us toward the recognition of
new facts. (Schumpeter 1949, 252)
According to Schumpeter, interaction between ideology and
theory is the catalyst of progress in economics. The
"pre-scientific vision" provides the motif for the
context of discovery, while in the context of
justification this vision is fashioned into hypotheses
which can be checked against, and revised in light of,
actual economic experience.12 Far from being an
impediment to truly scientific inquiry in economics,
ideology is a blessing in disguise:
That prescientific cognitive act which is the
source of our ideologies is also the
prerequisite of our scientific work. No new
departure in any science is possible without
it. . . . though we proceed slowly because of
our ideologies, we might not proceed at all
without them. (Schumpeter 1949, 358)
12 It is important to note that Schumpeter does not
use the phrases "context of discovery" and "context of
justification". Indeed, little of his writing contains
explicit reference to general works in metascience.
However, it is clear that he held views that paralleled
those of many contemporaneous philosophers of science,
and even some logical positivists. His leanings in this
regard, especially in work expounding and defending
Pareto's methodology, were particularly distasteful to
members of the Austrian school from which he sprang.
(Cf. Lachmann 1982, 34)
Methodological Monism vs. Methodological Pluralism
The issue of monism vs. pluralism is operative in
economics at two levels. First comes the question of
whether there is one overarching methodology which can
apply with equal validity to both physical and social
science. Assuming a negative answer to the first query,
the issue arises as to whether economists should adopt
only one methodological stance, or whether several are
needed to assure the greatest scope and diversity within
the discipline.
The first question has a long and sometimes
convoluted history in economic thought. (Cf. Blaug
1980b, 46-52) But there has usually been a consistent
reply. One can distinguish a tradition of methodological
dualism stretching back to the work of Mill and Cairnes,
and continued by Robbins, Mises, etc. While
metascientists generally have held to the belief that
there can be only one methodology which is applicable to
all sciences, physical, natural, and social, economic
methodologists have argued that the sciences for which
human behavior is the primary datum are fundamentally
different, and hence require different metascientific
criteria.
Blaug identifies two principal sources of this
attitude in the discipline. The oldest he dubs the
Verstehen doctrine. (Blaug 1980b, 47) Originating in
neo-Kantian philosophy, Verstehen denotes the
"understanding" to which the social science theorist is
privy by virtue of his humanity. Introspection and
empathy are sources of knowledge available to the social
scientist which are obviously not accessible to the
theorist in, say, physics or chemistry. This difference
in the ground of knowledge mandates a difference in the
way theories are constructed in social science, which, in
turn, mandates a different methodology to validate those
theories.
A newer version of the same belief springs from
Wittgenstein's ideas about meaning and intentionality in
human behavior. Essentially, this gives rise to the
notion that, since human action is guided by meaning
structures and rules, any understanding of that action
presupposes that one is a party to those same rules.
Explanation in terms of empirically derived, correlative
causal laws is meaningless. As Blaug comments, this
position
ultimately blends into the old . . . Verstehen
doctrine; both are subject to the same
criticism that we are offered no
interpersonally testable method of validating
assertions about rule-governed behavior. (Blaug
1980b, 49)
Blaug goes on to demonstrate the connection between
the Verstehen doctrine and methodological individualism
(1980b, 49ff). He notes that both, taken to their
logical conclusion, imply the impossibility of
macroeconomic theory. In spite of this, he is willing to
retain methodological individualism as "a heuristic
postulate which should guide economic theory as much as
possible."
The two major proponents of methodological monism in
economics have been Terence Hutchison and Milton
Friedman. The ideas of both will be discussed in more
detail below. Karl Popper, while holding to a monist
position in most cases, thought that economics was an
exception. Indeed, he suggested a methodology for the
discipline which is strikingly similar to Lakatos' MSRP.
More paradoxical still, Lakatos had serious reservations
about the applicability of MSRP to economics.
Even granting the case for the uniqueness of
economics, the question remains as to whether the science
needs just one (albeit unique) metascientific structure,
or whether economists should allow, or even encourage,
methodological pluralism. The case for the latter is
made most forcefully by Bruce Caldwell in Beyond
Positivism (1982). The entirety of his argument is
beyond the scope of the present paper, but, in brief, he


suggests that the theoretical diversity one finds in the
discipline necessitates methodological diversity as well.
There is no one metascientific view which can comprehend
such disparate schools as Marxian economics, neoclassical
thought, and Austrian theory. Each has something to
contribute, a singular perspective to offer on the
economic dimension of human experience. Because of this,
each must be judged on terms which are the most
appropriate for that perspective.


CHAPTER III
THE DEVELOPMENT AND STRUCTURE OF MSRP
One of the most widely applied methodologies in
economics has been the methodology of scientific research
programs (MSRP) developed by Imre Lakatos.
Superficially, it appears to represent a major break with
the mainstream methodological tradition in the
discipline. Indeed, as mentioned above, Lakatos himself
had reservations about its usefulness outside physical
science. Yet it is precisely there that it has been most
influential.
An understanding of the reasons for MSRP's adoption
by economists, and the structure of the MSRP framework
itself, presupposes an understanding of the
epistemological problems it was designed to solve.
Moreover, consideration of these problems is the only way
to assess whether, on its own terms, MSRP has proved
useful in their resolution. What follows, then, is a
sketch of Lakatos' own "rational reconstruction" of the
history of the philosophy of science.
Evolution of Lakatos' Epistemology
Over half of Lakatos' seminal essay, "Falsification
and the Methodology of Scientific Research Programmes,"


is devoted to a synopsis of the history of epistemology.
Lakatos took as his point of departure the demise of
"justificationism" (which can be read as a synonym for
verificationism), the doctrine that knowledge must be
proved in order to be considered knowledge. Mill had
systematized methodological precepts which asserted that
true knowledge could arise, and be validated, with either
inductive or deductive logic. All the momentous
scientific achievements of the previous three centuries
were held to be testimony to the power of logic to
establish, and authenticate with certainty, propositions
about the world.
However, with the invention of non-Euclidean
geometries and the overthrow of Newtonian mechanics in
the late nineteenth and early twentieth centuries, the
metascientific principles of justificationism, at least
in its pristine, Millian form, were cast in serious
doubt. According to Lakatos, there were three major
responses to this situation. The first was an attitude
of perpetual skepticism, which he took to be the
intellectual progeny of conventionalism and the
progenitor of methodological irrationalism/anarchism.
The second sought to retain justificationism by combining
it with probability theory. Even if certain knowledge was
no longer possible, highly probable knowledge could stand


in its stead. Popper deserves the credit, according to
Lakatos, for demonstrating that probabilistic
justificationism was no more tenable than any other
variety. The "problem of induction" precludes assignment
of probabilities to knowledge just as it thwarts claims
of certitude based on inductive generalization.
To Lakatos, falsificationism represents the only
valid alternative to these other responses to the
overturn of justificationism. It avoids the prime
"logical error" inherent in verificationism.13 However,
falsificationism also comes in several forms. Dogmatic
falsificationism hinged on the independent ontological
status of "facts" which were held to be the infallible
basis of epistemology. In this view, theorizing and
fact-gathering are totally distinct scientific
activities, although science proceeds from the validation
of the former by the latter. Theories, once "falsified"
by hard fact, are to be eliminated entirely.
13The logical problem is usually illustrated by
likening it to the deductive fallacy of affirming the
consequent: p implies q; q; therefore p.
Falsificationism, on the other hand, involves the
logically legitimate operation of modus tollens, or
denying the consequent: p implies q; not q; therefore
not p. Hence, one can never establish the truth of any
proposition by adducing any finite number of presumed
implications. (This is the essence of the "problem of
induction") One can, however, falsify a proposition by
pointing to the absence of just one implication.
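In schematic form (my rendering, not Lakatos' own
notation), the contrast between the two inference
patterns is:

\[
\text{affirming the consequent (invalid):}\quad
\frac{p \rightarrow q, \;\; q}{\therefore\, p}
\qquad\qquad
\text{modus tollens (valid):}\quad
\frac{p \rightarrow q, \;\; \neg q}{\therefore\, \neg p}
\]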


But this approach had three crippling problems.
First, it still depends on the "infallibility" of
observed facts, i.e. their complete independence of any
theoretical or interpretive structure. Second, it
ignores the historical continuity of science, which
Kuhn's work so forcefully describes. Third, it confronts
the strictures of the Duhem-Quine thesis which asserts
that no single proposition can be decisively refuted by
experimental evidence.
Hence, the next incarnation of falsificationist
methodology, associated with Popper's early work, sought
to avoid at least the first and third problems by
adopting the stance of "fallibilism"facts are no longer
to be regarded as completely autonomous, completely
independent of theory or preconceptions. Fallibilism was
consistent with what Lakatos calls an "activist" view of
science, the recognition that theories are human products
which are constructed and evaluated by the conscious
decisions of scientists. They are not dictated by
supposedly sovereign facts. With the embrace of this
idea, a definite conventionalist strain entered
falsificationism.
Lakatos terms this school "naive methodological
falsificationism," and the word "methodological" is the
key: Theories are not passively refuted by objective


fact; falsification is a matter of active human judgment,
embodied in a methodology, that the evidence does not
support a given scientific statement. Because of this,
the Duhem-Quine thesis is effectively circumvented -- there
is no contention that a single hypothesis can ever be
conclusively falsified simply by examining the evidence.
Whether or not any theory is "falsified" depends
ultimately on the judgment of the scientists working on
it that it is not well corroborated vis-a-vis another
theory.
This "activist" epistemology has two variants,
conservative and revolutionary. Philosophers influenced
by Kantianism took what Lakatos dubs the conservative
view -- preconceptions and the theoretical structures built
upon them are persistent and, once entrenched, scientific
thought is forever doomed to be circumscribed by them.
The revolutionary activist view, which Lakatos ascribes
to both himself and Popper, insists that theories, as the
products of human thought, are subject to control and
change. This attitude, called "sophisticated
methodological falsificationism," enables one to come to
terms with the actual history of science, and dispenses
with the second problem of dogmatic falsificationism
detailed above. Lakatos identifies elements of this
approach in Popper, but argues that Popper's thought lies


mainly in the tradition of naive methodological
falsificationism.
Sophisticated falsificationism allows an even
greater role for conventionalism. Falsification is held
to occur, not at the level of one theory, but in terms of
a group of related theories concerned with the same
phenomenon. Thus, there is a major broadening of the
unit of analysis. Further, scientific history explicitly
enters the analysis; the methodologist is concerned with
the evolution of a theoretical system -- how it has
responded over time in the face of theoretical and
observational anomalies. Most important, falsification
is predicated not only on the existence of such
anomalies, but also on the existence of a rival cluster
of theories which can explain some of them, and give rise
to new implications as well.
Structure of the MSRP
Lakatos developed his methodology of scientific
research programs to embody the tenets of sophisticated
falsificationism. This enabled him to claim that MSRP
avoids the dire implications of the "strong" Duhem-Quine
thesis, detailed above. Alterations to any part of the
fabric of knowledge, even if these changes are


inconsistent with other parts, are perfectly acceptable
if they represent "progressive" problem shifts:
The sophisticated falsificationist allows any
part of the body of science to be replaced but
only on the condition that it is replaced in a
'progressive' way, so that the replacement
successfully anticipates novel facts. . The
direction of science is determined primarily by
human creative imagination and not by the
universe of facts which surrounds us. (Lakatos
1978a, 99)
Although MSRP was designed expressly to facilitate
rational reconstruction of the history of science, there
is an implicit prescriptive dimension as well.
Processual and appraisal criteria are articulated which
clearly apply to current as well as past research
programs. In essence, what Lakatos tried to do was
conjoin Popperian normative metascience (highly modified)
with more purely descriptive Kuhnian metascience in order to
actualize sophisticated falsificationist epistemology.
As mentioned above, sophisticated falsificationism
operates at the level of research programs, not
individual theories:
It is a succession of theories and not one
given theory which is appraised as scientific
or pseudo-scientific. But the members of such
series of theories are usually connected by a
remarkable continuity which welds them into
research programmes. This continuity -- reminiscent
of Kuhnian 'normal science' -- plays
a vital role in the history of science. .
(Lakatos 1978a, 47)


Lakatos gives several examples of such research programs
including Newton's theory of gravitation, Prout's
hypothesis about atomic weight, and quantum mechanics.
It is important to note in this context that all such
examples are from the physical sciences; further, all
programs are defined by particular phenomena for which
they seek to account. This aspect of Lakatos' use of the
concept will be of some importance when looking at
applications of MSRP to economics.
The defining feature of a research program is its
"hard core," certain propositions which are segregated
and held to be immune to attempts at falsification or
even modification. The methodological device which
effects this impunity is the "negative heuristic," an
injunction to keep hard core propositions inviolate. For
example, "In Newton's programme the negative heuristic
bids us to divert the modus tollens from Newton's three
laws of dynamics and his law of gravitation." (Lakatos
1978a, 48)
The "positive heuristic" of a program is basically
its research agenda together with general instructions on
how to pursue that agenda. It is under the aegis of the
positive heuristic that the sacrosanct propositions of
the hard core give rise to the refutable theories of the
"protective belt." The latter is the constellation of


theories which form the "working hypotheses" of day-to-
day scientific life. In this regard, the protective belt
closely parallels Kuhn's notion of "normal science," with
its "puzzle solving" activity, i.e. its attempts to work
out minor theoretical inconsistencies and empirical
anomalies, and to achieve wider applicability for the
program. It should be noted that Lakatos viewed the
positive heuristic as set out at the start of a research
program. Likely empirical anomalies are anticipated at
the beginning, and even the order in which they are dealt
with is specified:
. .it should not be thought that yet
unexplained anomalies. .are taken in random
order. .The order is usually decided in the
theoretician's cabinet, independently of the
known anomalies. (Lakatos 1978a, 49, emphasis
added)
The prescriptive functions of methodology take shape
in Lakatos' conception of progressive and degenerating
problem shifts. A program can be said to be progressive
in two senses: theoretically and empirically.
Theoretical progressivity lies in a program's ability to
predict novel facts, phenomena not anticipated or
explained by any other program. This is obviously a
restatement of Popper's idea of "excess empirical
content." The least that should be asked of a program is
theoretical progress in this sense:


. .we must require that each step of a
research programme be consistently content-
increasing: that each step constitute a
consistently progressive theoretical
problemshift. (Lakatos 1978a, 50)
Ad hoc formulations such as 'Theory T holds true except
in cases X and Y', where X and Y are observed anomalies,
are illegitimate and indicative of a degenerating problem
shift.
Empirical progressivity occurs when some of the
novel facts predicted theoretically are actually
observed. Progress at this level is not likely to be as
consistent as progress in theory construction:
". . the programme as a whole should. .display an
intermittently progressive empirical shift." (Lakatos
1978a, 49)
Lakatos summarizes the relation between the various
components of the MSRP as follows:
. . [the hard core] is 'irrefutable' by the
methodological decision of its proponents:
anomalies must lead to changes only in the
'protective' belt of auxiliary, 'observational'
hypotheses and initial conditions. (Lakatos
1978a, 48)
The incorporation of conventionalist elements into MSRP
is accomplished through the idea of the irrefutability of
the hard core in response to the negative heuristic:
The idea of 'negative heuristic' of a
scientific research programme rationalizes
classical conventionalism to a considerable
extent. . (Lakatos 1978a, 49)


By virtue of this, the persistence of theories in the
face of apparent refutations can be viewed as a rational,
completely justifiable part of scientific endeavor;
indeed, it accords well with much of the history of
science. Lakatos even seems to argue that this
conventionalist practice is a sine qua non of scientific
progress:
The positive heuristic of the programme saves
the scientist from becoming confused by the
ocean of anomalies. . the scientist's
attention is riveted on building his models
following instructions which are laid down in
the positive part of his programme. He ignores
the actual counterexamples, the available data
. . .Thus the methodology of scientific
research programmes accounts for the relative
autonomy of theoretical science: a historical
fact whose rationality cannot be explained by
the earlier falsificationists. (Lakatos 1978a,
50, 52)
In other words, MSRP facilitates rational reconstruction
of science as it has actually been practiced. More
factors can be considered as internal (cf. above, 28ff)
and fewer need be deemed outside the realm of
metascientific analysis.
Lakatos' views on the relation between methodology
and scientific history are set out in his 1971 essay
"History of Science and Its Rational Reconstructions."
(Lakatos 1978a, 102-138) Reconstruction is therein
discussed as a process of mathematical modeling. A good
methodology will allow for the inclusion of more


historical variables as endogenous; less resort is
necessary to "external" or exogenous factors. Further,
methodologies can be judged, at least in part, by how
well they retrodict past scientific ventures, especially
the "best gambits" or Kuhnian "exemplars" in a field
those theories, experiments, etc., which are widely
accepted within the field as some of its premiere
accomplishments. To continue the "model" analogy,
methodologies are to be judged by the "goodness of fit"
between the rational reconstruction and actual history.
Exogenous forces have always existed; no methodology will
have an R² approaching 1. But Lakatos believed that MSRP
was the one metascientific structure which could
"rationalize" those episodes in scientific history which
other methodologies wrote off as irrational, "bad"
science, or wholly within the context of discovery and
hence, inexplicable.
Criticisms of MSRP
It is not surprising that a methodology as
innovative as MSRP should have been subjected to
extensive critique. In substance, MSRP tries to blend
two very different approaches to metascience: Popper's
prescriptive falsificationism and Kuhn's descriptive
conventionalism. The issue of this coupling has


generally been rejected by both parents. Strictures have
been directed at virtually every component, but only some
of these will be discussed here. More specific
criticisms of the use of MSRP in economics will be
reviewed in the next chapter.
Much of the criticism of MSRP over the years has
centered on the concept of the hard core. For a
construct which is so central in his methodology, Lakatos
says tantalizingly little about it. It can be
interpreted simply as a set of assumptions used in the
articulation of theories, yet this seems not to capture
the power and cohesion which the core is supposed to
impart to programs. Applications of the concept in
economics often take the core to be ideological in
origin, to spring from Schumpeter's pre-scientific
vision. But this reading is at odds with Lakatos' own
use of the concept. As Fulton has pointed out, every
example of a hard core that Lakatos gives is a very
narrow, clearly delineated set of propositions, not a
grand metaphysical world view (Fulton 1984, 187-8).
This ambiguity about the precise nature of the hard
core is compounded by the question of just how "hard" it
is. Lakatos, in a footnote containing one of the very
few explicit discussions of the concept, raises the


possibility that its constituent propositions can be
quite labile:
The actual hard core of a programme does not
actually emerge fully armed like Athene from
the head of Zeus. It develops slowly, by a
long, preliminary process of trial and error.
In this paper this process is not discussed.14
(Lakatos 1978a, 48, n.4)
Recall that the hard core is the motor force of a
research program; it provides the program's confidence,
direction and verve. Its propositions are the sanctum
sanctorum of a program, completely off limits to attempts
at falsification. How can this description be reconciled
with the quoted notion of a fluid hard core? What is the
impetus behind its "slow development"? If it is seen as
evolving, when can one say that its evolution has ceased?
On such questions Lakatos is silent.
Alan Musgrave (1976) raises other objections to the
concept: It is descriptively inaccurate, and
prescriptively untenable. Lakatos' prime example of a
hard core is Newton's theory of gravitation, especially
the inverse square law. Now it is perfectly true,
Musgrave comments, that this theory was sustained in the
face of empirical anomalies for over two centuries.
Newtonian scientists, on encountering some observations
contrary to the theory's predictions, usually sought the
14It might be noted that the process is not discussed
in any other paper, either.


source of the difficulty in an auxiliary proposition;
e.g. that there was some hitherto unseen force (an
undiscovered planet, for example) which accounted for the
discrepancy. The majority of times, this strategy was
successful. Significant discoveries were made as a
result (the planet Neptune being the most spectacular),
and Newton's theory seemed to have been completely
vindicated.
However, each time a new anomaly was encountered,
there were some researchers, with unimpeachably
"Newtonian" credentials, who contemplated modifications
to Newton's law. Indeed, some questioned its usefulness
and continued acceptance. Hence, the idea that
scientists working in the Newtonian program had, by fiat,
rendered certain portions of that program unfalsifiable
is not historically accurate:
. .to reconstruct the history of Newton's
theory in this way [i.e. in line with MSRP]
would be to falsify it. Some Newtonians at
some points did take a methodological decision
to retain Newton's laws unchanged in the face
of an anomaly -- and some did not. The former
were usually successful, and the latter were
not. But it is only by ignoring half of the
actual history that it can be made to fit
Lakatos's methodology. (Musgrave 1976, 466)
Musgrave thus demonstrates (in great detail) that the
foremost example of a hard core which Lakatos himself
adduces does not conform to his own definition, a set of


propositions "... 'irrefutable' by the methodological
decision of its proponents. ." (Lakatos 1978a, 48)
William Berkson echoes and amplifies Musgrave's critique.
Scientists most certainly rely on assumptions; theory
would be impossible without them. However, that set of
assumptions is likely to be quite diverse even for
scientists within the same research program. Berkson
discusses one of physics' "best gambits," field theory,
and finds that
Every scientist had his own "hard core," and
sometimes more than one, which shows that the
'cores' are not exactly hard in that they were
protected from change at almost all cost.
Scientists actually, contra Kuhn and Lakatos,
have had enough intellectual independence to
make up their own minds about what to take as
fundamental, and enough independence to change
their minds also, or to keep an open mind and
try different alternative ideas as fundamental.
(Berkson 1976, 52)
As far as the prescriptive facet of the concept is
concerned, Musgrave remarks that it gives dubious advice.
The question of how, exactly, the hard core propositions
are to be chosen, or how they originate, is nowhere
addressed by Lakatos. Presumably, this is a problem to
be relegated to the context of discovery, and hence not
amenable to methodological dictates. But unless there is
some guiding principle for selection of hard core
components, "... his methodological rule, stated
generally gives carte blanche to any group who want to
erect some pet notion into a dogma." (Musgrave 1976,
465) The role of methodology in laying down demarcation
criteria and processual guidelines seems to be
significantly compromised under MSRP.
Another methodological function, the articulation of
appraisal criteria, also seems to be vitiated in Lakatos'
system. This provides a second focus for critics of
MSRP. In its 1970 incarnation ("Falsification. . ."),
MSRP honored its Popperian lineage; progressive programs
were to be strictly preferred over those which could be
shown to be degenerating. Ad hoc propositions were to be
taken as evidence of such degeneration. Scientists would
"rationally" abandon a research program which had "run
out of steam" and turned into a motley collection of
hypotheses modified solely to accommodate successive
rounds of refutation.
However, in a paper titled "Replies to Critics,"
published in 1971, much of this is qualified or abandoned
outright. Lakatos disavows any processual guidelines as
an aim of his methodology, and also surrenders any clear
criteria to determine when a research program should be
abandoned, or even when it is irretrievably degenerative.
Programs which have long been in decline, he insists, may
become progressive once again. Hence, it may be just as
"rational" to pursue one program as another. Appraisal


criteria may still exist, but barely. They are deprived
of any operational significance. Feyerabend hailed this
version of Lakatos' methodology as a capitulation to his
methodological anarchism.
A third area of criticism of MSRP concerns the exact
role of the positive heuristic. While spelled out in
greater detail than the hard core, the positive heuristic
is still left vague and imprecise. In particular, the
notion that the positive heuristic is articulated at the
outset of a research program, before its actual
development and encounter with contradictory data, is
difficult to envision. Certainly, a research agenda
could be so specified, at least in broad outline. But
the idea that the positive heuristic anticipates
particular anomalies and seeks to defuse them at the
outset is contrary both to scientific history and common
sense.


CHAPTER IV
APPLICATIONS OF MSRP TO ECONOMICS
Many Lakatosian constructs have percolated into
economics, at least at the level of semantics. Even
economists who are hostile to MSRP itself (such as
Boland) commonly refer to research programs or hard
cores. These concepts, in particular, seem to provide a
useful shorthand for recurrent issues in economic
methodology. But such uses can hardly be said to
constitute applications.
According to estimates by Neil de Marchi and others,
there have been in excess of ninety genuine applications
of Lakatosian methodology to economics. MSRP has been
employed to analyze or rationally reconstruct everything
from Marx to marketing (Blaug 1980a; Leong 1985).
Further, the interpretations of what constitutes
Lakatosian metascience are as diverse as the fields to
which it has been applied. Researchers have adapted MSRP
to fit their own conceptions of the proper role of
economic methodology, as well as to address the specific
problems in which they are interested. Many of these
adaptations are at some variance with Lakatos' own
formulation. This is, perhaps, inevitable, and may well


enhance our understanding of the discipline in ways which
slavish adherence to MSRP could not: "Bringing Lakatos
. . to economics promises to yield new insights;
bringing economics to Lakatos promises little more than a
series of artificially generated congruencies." (de
Marchi Forthcoming, 25)
The first applications of MSRP to economics came out
of a conference organized by Spiro Latsis in 1974. de
Marchi comments on the high expectations that were
engendered by that meeting, and the sense of excitement
that participants brought to the sessions. MSRP
represented a package of new methodological tools, the
first since Popperian ideas of the '30s, which showed
great promise in sorting out the numerous competing
schools and theories in the discipline. It could also be
seen as rationalizing economics, as providing a
justification for the way economists work. This aspect
of the use of MSRP in economics will be explored in the
next chapter. Also postponed until Chapter V is a
general assessment of MSRP -- what economists have said
about it and its use in analyzing their discipline.
The purpose of the present chapter is to give a
sketch of some of the more notable applications of MSRP
within economics. This sketch is organized principally
around one issue: the characterization of what


constitutes a research program in economics. In some
applications, this characterization closely approximates
Lakatos' own description of a program as a set of
sequentially related ideas flowing from a common,
specific, precise and carefully delimited set of hard
core propositions. Fulton points out that "Lakatos'
illustrations set out hard cores involving highly
technical concepts and indeed specific laws or axioms."
(Fulton 1984, 191) Applications in this vein I shall
call "intensive."
Even in early applications, however, some economists
conceived of research programs in a fundamentally
different way (e.g. Blaug 1976). In their studies,
"research program" is used as a synonym for "school of
thought" in the broadest possible sense. In this view,
the hard core contains the ideological underpinnings of a
comprehensive approach to economics. It serves to define
a distinctive world view. Such applications I shall term
"extensive." Many applications along this line are
avowedly at odds with the original Lakatosian constructs,
and articulate what amounts to a neo-Lakatosian
methodology.


Intensive Applications
The Human Capital Research Program
Perhaps the best examples of what are here termed
intensive applications of MSRP are provided in Blaug's
Methodology of Economics. The seven studies contained in
Part III of that book are devoted to research programs
which are narrow, clearly defined fields within
neoclassical economics. These include the theory of
consumer behavior, theory of the firm and the Heckscher-
Ohlin theory of international trade. The analysis
contained in chapter 13, devoted to a study of human
capital theory, best exemplifies an application which is
entirely consonant with the original Lakatosian scheme.15
Blaug notes that human capital theory is a "perfect
example" of a research program, inasmuch as it is not
reducible to one single theory. The human capital
program seeks to account for such disparate phenomena as
consumer expenditure on health and education, workers'
job search strategies, worker migration and salary
15As mentioned above, Blaug also used Lakatos in
"extensive" ways. Indeed, Fulton remarks that Blaug's
(1976) seems to construe all of orthodox economics since
Smith as one research program. (Fulton 1984, 191) Blaug
reconciles these two approaches by referring to his more
"intensive" applications as relating to "subprograms" of
the larger classical-neoclassical program. The next
section will indicate how this distinction in fact gives
rise to a neo-Lakatosian approach.


differentials. It has even spawned another research
program, Becker's theory of the family.
Analysis of human capital's hard core begins with
the proposition of methodological individualism, "the
view that all social phenomena should be traced back to
their foundation in individual behavior." (Blaug 1980b,
227) This is supplemented by the notion that individuals
have different rates of time preference with regard to
present versus future pecuniary returns. An individual's
feasible consumption set contains items, like education,
which can be viewed essentially as investments.
Consumption of such goods and services can then be
treated in the same way as investment project analysis;
the present values of benefits and costs are compared to
ascertain if the net present value is positive or
negative. The final hard core proposition maintains that
the "investment" will be undertaken by an individual if
the net present value is positive, foregone if it is
negative.16
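The decision rule embodied in this last proposition can
be stated compactly. In standard investment-appraisal
notation (an illustrative rendering, not Blaug's own
symbols), an individual undertakes an investment in, say,
additional schooling whenever

\[
NPV \;=\; \sum_{t=0}^{T} \frac{B_t - C_t}{(1+\rho)^t} \;>\; 0,
\]

where \(B_t\) and \(C_t\) are the expected benefits and
(direct plus indirect) costs accruing in year \(t\) and
\(\rho\) is the individual's rate of time preference; the
investment is foregone when the sum is negative.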
This core allows the articulation of protective belt
theories which determine the scope of the program as well
16Blaug notes that a minor change in the human
capital core, viz. assuming that the "decision maker is a
household rather than an individual", gave rise to the
"economics of the family" research program. (Blaug 1980,
226)


as its research agenda. Concerning education, a typical
protective belt proposition would be that
the demand for postcompulsory education is
responsive both to variations in the direct and
indirect private costs of schooling and to
variations in the earnings differentials
associated with additional years of schooling.
(Blaug 1980b, 226)
"Indirect private costs" capture the opportunity cost
aspects of an educational "investment"; they measure
primarily the short-term earnings or leisure forfeited
while in school.
Obviously, this agenda represents a considerable
change of focus from the pre-human capital view that
education could be treated as simply a consumer "good,"
its demand being thus determined by tastes, incomes,
costs, etc. as is the demand for any other service. As
the human capital program has evolved, two main versions
have emerged with regard to the economics of education.
The first attempts, on the basis of the hard core and
protective belt components detailed above, to predict
actual enrollments at the post-secondary level. The more
far-reaching version seeks to estimate the number of
entrants into particular fields of study, based on the
projected earnings for future practitioners in those
fields.
Clearly, educational decisions have a social as well
as a private dimension. Therefore, an area of protective


belt theorizing has concerned the social rate of return
on educational investment. This in turn gave rise to a
new protective belt proposition which constitutes a
social investment criterion:
resources are to be allocated to years of
schooling and to levels of education so as to
equalize the marginal, "social" rate of return
on educational investment, and . this
equalized yield . should not fall below the
yield on alternative private investments.
(Blaug 1980b, 228)
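The criterion can be rendered symbolically (again an
illustration, not Blaug's notation): public resources are
to be allocated across schooling levels \(k = 1, \ldots, K\)
so that

\[
r^{s}_{1} \;=\; r^{s}_{2} \;=\; \cdots \;=\; r^{s}_{K} \;\geq\; r^{p},
\]

where \(r^{s}_{k}\) denotes the marginal "social" rate of
return to level \(k\) and \(r^{p}\) the yield on
alternative private investments.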
Whether, in fact, societies tend to allocate public
funds in this manner is, of course, a testable
proposition. A finding that they do not would
necessitate adjustments to the criterion stipulated, or
if no satisfactory adjustment could be made and
observational anomalies continued to pile up, the
criterion would have to be abandoned. But other parts of
the program would remain intact; damage from contrary
empirical evidence would have been localized. The same
is true with the other testable program components
(school enrollments, etc.). It is in this sense that the
"belt" of auxiliary propositions surrounding the hard
core is "protective."
Outlined above is only one facet, education, of the
human capital program. The protective belt abounds with
applications of the basic hard core propositions to other


areas; health, on-the-job training, etc. Blaug comments
that, taken together, such applications comprise
an almost total explanation of the determinants
of earnings from employment; it predicts
declining investments in human capital
formation with increasing age and hence
lifetime age-earnings profiles that are concave
from below. (Blaug 1980b, 231)
Human capital theory's positive heuristic thus directs
researchers to specify and estimate "earnings functions"
to determine if such exogenous variables as ability and
demographic characteristics are less important than
education and work experience, as the theory would
predict.
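A typical specification of such an earnings function is
the familiar semi-logarithmic form (offered here only as
an illustration; Blaug does not write one out):

\[
\ln W_i \;=\; \beta_0 + \beta_1 S_i + \beta_2 X_i + \beta_3 X_i^{2} + \gamma' Z_i + \varepsilon_i,
\]

where \(W_i\) is earnings, \(S_i\) years of schooling,
\(X_i\) labor market experience, and \(Z_i\) a vector of
ability and demographic controls. The program expects the
schooling and experience coefficients to dominate those
on \(Z_i\).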
This leads to the question of program performance.
It will be recalled that the Lakatosian appraisal
criterion contains two major elements: theoretical and
empirical progressivity. Blaug insists that these
elements are entirely relative, i.e. they are meaningful
only in conjunction with a parallel assessment of a
competing program. In terms of the economics of
education, human capital theory's closest competitor,
according to Blaug, is the "screening hypothesis" or
"credentialism," which in turn is a component of a larger
program, labor market segmentation theory.
The theoretical productivity of the human capital
program is legend. By dint of the sheer number of
phenomena which it has brought within its compass, it has


generated scores of auxiliary hypotheses which have
predicted novel facts, i.e. increased its empirical
content. Hence, the program must be judged as
theoretically progressive (Blaug 1980b, 232).
Empirical progressivity is another matter. Blaug
observes signs of degeneration. The theory's account of
individuals' demands for education has yet to be well
corroborated. Moreover, while the program purports to
offer normative prescriptions about the allocation of
public funds to education, it does not provide an
explanation of those allocations already in place.
Perhaps most damaging:
. . its rate-of-return calculations
repeatedly turn up significant differences in
the yields of investment in different types of
human capital, but its explanation of the
distribution of earnings nevertheless goes on
blithely assuming that all rates of return to
human capital are equalized at the margin.
(Blaug 1980b, 238)
Further evidence of degeneration is the program's
tendency to advance ad hoc assumptions when confronted by
a recalcitrant data set.
The program which derives from the so-called
screening hypothesis fares little better. Any
description which would do justice to credentialism is
well beyond the scope of this paper. Basically, it
posits a positive correlation between earnings and
education, as does human capital theory, but asserts that


this results from employers' use of educational
credentials as a screening device to determine which
applicants are most likely to perform well on the job.
Blaug notes that "... the difference between the two
explanations is . . . that of discovering whether schools
produce or merely identify those attributes that
employers value." (Blaug 1980b, 236) However, instead of
focusing on specific educational practices, which gets at
the true substance of the dispute, researchers in both
programs usually resort to data on labor markets to
support their contentions:
But no market test is likely to discriminate
between human capital and screening
explanations, because the question is not
whether schooling explains earnings, but rather
why it does. (Blaug 1980b, 236)
In fact, Blaug suggests that the two programs are not
really competitors; they are complements. In a situation
of this sort, Lakatosian comparisons are difficult if not
inappropriate. This is compounded by the fact that the
human capital program is much more comprehensive than
credentialism. Due to these factors, Blaug is able to
offer no systematic appraisal of the human capital
research program vis-a-vis credentialism. He concludes
his Lakatosian reconstruction with the prediction that
its fate is likely to be absorption into some "richer,


still more comprehensive view of the sequential life
cycle choices of individuals." (Blaug 1980b, 239)
The Neoclassical Theory of Demand
The 1974 Nafplion colloquium, whose papers were
published as Latsis (1976), contained four Lakatosian
studies of putative research programs in economics. The
one which provides the best example of "intensive"
reconstruction is Coats' survey of the pre-Hicksian
neoclassical theory of consumer behavior (Coats 1976).
In this essay, the hard core and positive heuristic are
spelled out much more clearly and concisely than in the
study by Blaug reviewed above. It is also a prime
example of rational reconstruction, concentrating on an
early incarnation of an ongoing research program.
According to Coats, the hard core of this program
had hardened sufficiently by the end of the 1920s to
allow the following characterization. There are eight
constituent propositions: (Condensed from Coats 1976,
53)
(1) Economic theory is by nature abstract, general,
and concerned primarily with static analysis;
(2) In light of (1), economic assumptions should be
simple, uniform and constant; realism and
testability are not necessary;
(3) Consumers seek to maximize total utility;


(4) Consumers' incomes are limited;
(5) Consumers' wants are unlimited; consumption of
a given commodity is subject to diminishing
marginal utility after some point;
(6) Consumers have perfect knowledge of relevant
market conditions;
(7) Consumption is based on rational calculation of
choices between alternative consumption
possibilities;
(8) A consumer's choices are independent of other
consumers' choices.
These hard core propositions are combined with the
following five instructions which make up the program's
positive heuristic: (Coats 1976, 54)
(i) Construct static models;
(ii) Minimize the number of postulates, especially
psychological postulates;
(iii) Develop general theories;
(iv) Concentrate analysis on price; ignore
questions about the origin of wants, the
stability of preferences, etc.
(v) Allow casual empiricism to dictate
appropriate theoretical reinterpretations
when anomalies arise.
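Hard core propositions (3), (4), and (5), for instance,
translate into the familiar constrained maximization
problem (stated here in modern notation for concreteness,
not in Coats' own terms):

\[
\max_{x}\; U(x) \quad \text{subject to} \quad p \cdot x \leq m,
\qquad \frac{\partial U}{\partial x_i} > 0, \quad
\frac{\partial^2 U}{\partial x_i^2} < 0 \;\text{ beyond some point},
\]

where \(x\) is the consumption bundle, \(p\) the price
vector, and \(m\) the consumer's limited income.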


Coats is aware that his specification is arguable.
Yet he feels it facilitates methodological assessment of
the program's strengths and weaknesses, as well as its
comparison with a rival program. This opposing research
effort had its origins in American pragmatism and in
early twentieth century psychology. It attempted to
construct a "realistic" picture of human motives and the
behavior to which they led. Coats characterizes this
approach as emphasizing the "active, dynamic, and
constructive aspects of human behavior," in contrast to
the economist's essentially static and reactive view
(Coats 1976, 48).
Turning to appraisal, Coats departs somewhat from a
strictly Lakatosian framework. He provides a list of
"criteria of good theories" gleaned, he says, from
economic methodology past and present. These are:
(Based on Coats 1976, 56)
(a) consistency;
(b) simplicity;
(c) generality;
(d) fruitfulness; i.e., adaptability and extensibility;
(e) theoretical manageability in terms of extant techniques;


(f) congruence with reality, i.e. capacity to
explain existing empirical observations; and
(g) testability; falsifiability.
Defenders of orthodox economics, Coats contends, placed
great stress on items (a) through (e), whereas proponents of
the rival "psychologistic" program were preoccupied with
(f) and (g). This difference in normative methodological
emphasis eventually resulted in neoclassical demand
theory becoming a degenerative program. Coats remarks
that
The orthodox programme was not progressing
empirically, and in certain respects it was
degenerating theoretically as economists
reinterpreted the established theory, and added
fresh terminological refinements and ad hoc
assumptions in an effort to protect it from
refutation. (Coats 1976, 58)
This degeneration finally resulted in the replacement of
the old program, first by indifference curve analysis and
then by revealed preference theory. But even as orthodox
demand theory was degenerating, psychology-based theories
of decision and choice were progressing. That program
has consequently matured such that it is again a
competitor to the new economic orthodoxy in demand
theory. Coats quotes Herbert Simon to the effect that the
expansion of the scope of consumer theory in recent
decades has been its downfall; it is simply incapable of
dealing with the new problems and phenomena it


encounters. In Lakatosian terms, it has again become a
degenerating program.17
Coats concludes his study by noting that MSRP was
developed using Newtonian physics as its exemplar. He
sees no comparable exemplar in economics, now or in the
recent past. This he takes as an argument, not for the
irrelevance of MSRP to economics, but rather for due
caution in economics applications. Rational
reconstruction of theory giving primacy to internal
factors is what Coats sees as MSRP's particular forte.
"To the historian, MSRP is essentially a practical tool
which will ultimately be judged by its results. At this
early stage of its application it seems to possess
considerable promise." (Coats 1976, 61) The extent to
which that promise has been realized will be discussed in
the next chapter.
The Neoclassical Theory of Production
G. Fulton has been cited several times in preceding
sections as a leading proponent of intensive applications
of MSRP. Relying on Lakatos' own examples, Fulton
contends that the most useful conception of "research
program" is as a sequence of theories relating to a
17Simon's hope, of course, is for a new composite
program, made up of elements from both economics and
psychology.


specific set of phenomena. In this sense, a research
program is quite distinct from a school of thought. A
given school, (e.g. neoclassical, Marxian, or
institutional economics) contains numerous
"interconnected but distinct research programmes."18
(Fulton 1984, 195)
Specification of a program's hard core must, of
course, be consistent with this conception. Again
pointing to Lakatos' examples, Fulton insists that
authentic hard core propositions are precisely
formulated, resembling mathematical axioms. He quotes
Leijonhufvud to this effect:
. . hard core propositions have been through
the purgatory of incorporation in (more or
less) rigorous formal models; the language in
which they are formulated has been honed to
fulfill requirements of mathematical
consistency with other propositions; they are
precise. (Fulton 1984, 192; quote from
Leijonhufvud 1976, 71-72)
The positive heuristic, in this view, is primarily
concerned with how to build models. It enumerates the
deductive possibilities latent in the axiomatic hard
core. Consequently, the protective belt theories to
which this heuristic gives rise are chiefly of two kinds:
mathematical refinements, reformulations, and extensions
18Fulton also concedes (1984, p. 195, n.11) that "Of
course, certain parts [of a school] may not fit into a
research programme framework".


of the hard core; and econometric models which concretize
hard core elements.
Fulton proceeds to analyze neoclassical production
theory as an illustration of his view of the proper use
of MSRP. In delineating the hard core, he first renders
the propositions in verbal form, then in mathematical:
(Condensed from Fulton 1984, 197)
(1) A production function exists for each firm. It
is defined (following Ferguson 1969, 7) as 'a
single-valued mapping from input space into
output space in as much as the maximum
attainable output for any stipulated set of
inputs is unique': Y = f(x), where Y is a
given output and x is the input vector;
(2) Diminishing marginal returns: f_i > 0 for all i,
f_ii < 0 for all x_i > x̄_i; and
(3) Continuous substitutability of inputs.
These standard components of production theory give rise
to the following positive heuristic:
(1) Construct models based on profit maximization
or cost minimization;
(2) Specify market conditions for firms, consumers
and inputs such that a determinate equilibrium
ensues -- usually, perfect competition in all
markets;


(3) Create models on the aggregate level analogous
to micro-level models; and
(4) Extend model applicability where possible by
introduction of such real world conditions as
imperfect markets, time, and heterogeneous
capital.
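Directives (1) and (2), for example, license models of
the familiar competitive form (a minimal sketch in my own
notation, not Fulton's):

\[
\max_{x}\;\pi \;=\; p\,f(x) - w \cdot x
\quad\Longrightarrow\quad
p\,f_i(x) \;=\; w_i \quad \text{for all } i,
\]

that is, a price-taking firm chooses its input vector so
that the value of each input's marginal product equals
that input's price.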
Obviously, not all of the propositions in the positive
heuristic are unique to production theory; Fulton admits
that several are shared among many neoclassical programs.
It is (3) and (4), he asserts, which dictate this
particular program's research agenda and thus serve to
help define the program itself. Moreover, contrary to
Lakatos, Fulton does not see how directives such as (3)
and (4) can be given at a program's inception; they take
shape only as the program develops.19
Appraisal of the neoclassical theory of production
is, according to Fulton, a straightforward matter.
Theoretical progressivity is evidenced by its refinement
into a coherent theory of production and distribution
which has issued in the marginal productivity hypothesis
to account for distributive shares. Further progress has
stemmed from its ability to incorporate imperfectly
competitive markets into the analysis. Fulton traces the
19Compare the criticism in the same vein by Alan
Musgrave (1976), noted in Chapter III above.


program's development from Wicksteed to Hicks and on
through the vintage capital models developed in the late
'50s and early '60s (mostly in the course of the
Cambridge capital controversy). By allowing for
heterogeneous capital, vintage models represented a major
increase in empirical content. In particular, Solow's
vintage capital model: "... appears to present a
testable version of the capital-embodiment hypothesis,
and indeed various empirical tests have been attempted."
(Fulton 1984, 201)
A potential rival program developed with the work of
Johansen, who was among the first to construct a vintage
capital model. By dropping the assumption of smooth
substitutability (hard core proposition (3)), and later
adopting a novel concept of the production function
itself, Fulton contends that Johansen could no longer be
considered to be working within the old program.
When it comes to the question of empirical
progressivity, the situation is more difficult to assess.
Having addressed the issue elsewhere (in his unpublished
doctoral dissertation), Fulton merely comments that he
finds little evidence of empirical progress. Few of the
theoretical refinements seem to be able to produce
corroborated hypotheses.


There is one other feature of Fulton's paper that
should be discussed. Following Leijonhufvud, he
introduces the notion of "presuppositions" into
Lakatosian methodology. These are the "... informal
and (in modelling contexts) implicit propositions" which
are distinct from the more precise formulations in the
hard core. (Fulton 1984, 192) Fulton sees the
conflation of "presupposition" and "hard core" as a major
source of confusion in "extensive" applications of MSRP.
Alluding to such applications, Fulton insists:
They are dealing with two separate concepts.
In the analysis of neoclassical economics as
a single research programme, the hard core thus
consists exclusively of presuppositions which
can only be stated imprecisely, and so
different neoclassical economists will give
different expressions to them. . Lakatos,
however, saw no problem in setting out the hard
core because he viewed it as a stage further on
from . .presuppositions. He saw it in the
context of a certain series of theories as set
out in rigorous models. . (Fulton 1984, 192)
Presuppositions are described as "quite woolly 'grand
generalities' somewhat in the nature of cosmological
beliefs." (Leijonhufvud 1976, 76) One is again reminded
of the "context of discovery" and Schumpeter's "pre-
scientific vision." It is worth cataloguing those ideas
which Fulton takes to be the presuppositions of the
neoclassical theory of production: (Fulton 1984, 195-
196)


(1) Economics is concerned with the allocation of
scarce resources among competing ends;
(2) Economic theories are necessarily abstract,
general, and based on deductive models;
(3) Economic theory is grounded in the behavior of
individual agents whose actions are independent
of other agents;
(4) Agents possess perfect knowledge of the
relevant aspects of their situation;
(5) Agents act rationally;
(6) Agents maximize;
(7) Economic theory is concerned with equilibrium
states and transitions between these states;
(8) The normative/positive distinction;
economics is a positive science; and
(9) Theories are best cast in rigorous mathematical
form.
As will be discussed in the next section, this list is
strikingly similar to what other economists take to be
the neoclassical hard core. While it is true that these
statements are much more general than the kind of
propositions Lakatos identifies as "hard core," no
support for a distinction between presuppositions and
hard core is to be found in his formulation of MSRP.
However, the distinction does serve as a convenient point


of departure for the discussion of what I have called
"extensive" applications of MSRP.
Extensive Applications
Metaphysical factors or presuppositions are an
inevitable part of social science; the quest for a value-
free, "positive" science of society can itself be seen as
a metaphysical predisposition.20 It is not surprising,
then, that methodologies developed primarily for the
physical sciences occasion diverse interpretations when
applied outside that realm. This is especially true of a
methodology like MSRP, many of whose categories are
somewhat vague and imprecise. One is left with a fairly
intricate classificatory scheme with no clear basis of
classification. Confusion over what component belongs
where, indeed, over the proper unit of metascientific
analysis, is inevitable.
Rather than seeing philosophical presuppositions as
external to the hard core, "extensive" applications of
MSRP take them to be the substance of the core. A
research program is then defined in much broader strokes.
It becomes more on the order of a world view, a
20As above, in Fulton's inclusion of the idea as one
of the presuppositions of production theory. For the
history of the positive/normative distinction in
economics, see Blaug 1980b, 129-156.


comprehensive system of beliefs and ideas which may even
transcend conventional disciplinary boundaries. To
varying degrees, this is a common denominator in the
three extensive applications profiled below.
Marxian Analysis As A Research Program
Marxian economics has been the subject of several
analyses in terms of MSRP (Blaug 1980a; Glass and Johnson
1988). These studies generally specify the hard core
and/or positive heuristic solely in terms of Marx's
economics, ignoring the wider dimension of Marx's
thought. Blaug concedes the inadequacy of this view:
Marxian economics is embedded in a wider
scientific research programme, comprising
sociological, political, and even
anthropological theories, all knitted together
by an over-arching theory of historical change,
and as such it must be appraised in the round
as a complex of interconnected theories.21
(Blaug 1980a, 2)
One study that appraises it as such a complex is Suzanne
Helburn's "Marx's Research Program." (1986) The view
adopted in that paper is that Marxian thought represents
a coherent whole, of which economics is only one
21Blaug goes on, however, to justify his exclusive
focus on Marxian economics as follows: ". . . I must
insist that the larger Marxian research programme cannot
be understood except in terms of its extensive economic
content. . we, as economists, need to remind the world
that Marx devoted nearly twenty years of his mature
output to questions of economic theory, and that the
economic aspects of Marxism are the only ones which he
himself polished to anything like a finished state."


component. Consideration of this component in isolation
from the rest of the program (or, worse still, regarding
it as coextensive with the entire program) misconstrues
the program's theoretical substance and vitiates its
political intent. Hence, her study articulates hard core
and positive heuristic for Marxian theory, taken in the
widest possible sense.
"Marx's Research Program" does not specify hard core
propositions as explicitly as do the other studies
surveyed in this chapter. However, the following list
represents an attempt to extract more exact core
propositions from the paper's general discussion of
Marxism's hard core: (Helburn 1986, 73-84)
(1) Historical Materialism
-Human needs and productive powers develop as a
process of social evolution involving the
development of the social division of labor
which increases human productivity and
interdependence.
-Historical development is a progression of
epochs, each distinguished by a particular mode
of production, which in turn is based on the
level of technology, division of labor, and
class relations.
-Social change is propelled by class conflict.
(2) The Nature of Capitalism
-Theory of alienation: Class divisions are
antithetical to the satisfaction of "truly
human" needs.


-Capitalist development is dominated by
capitalist control over production to
accumulate capital.
-The commodity is a unity of opposites -- value
in use and value in exchange.
-Labor theory of value.
-Capitalism is inherently contradictory because
of the requirements of "expanded reproduction."
(3) Theory of Revolution
-Proletarian revolution is necessary to
overthrow capitalist rule.
(4) Epistemology
-Essentialism: The aim of science is to get
behind appearance and apprehend essence
-Dialectical method
-Cognition through praxis
In her discussion of hard core category (4), Helburn
asserts that Marxian theory posits an indissoluble link
between how the world is known and what is known about
the world. Inclusion of a determining epistemology in
the hard core is a distinctive feature of Marxian
thought,22 one which would not be captured in a narrower,
more "intensive" construction of the core.
The positive heuristic is formulated in terms of
what "the researcher must discover and his theory must
expose": (Helburn 1986, 85)
(1) the exploitive nature of capitalist production;
(2) the forms of class struggle; and
(3) the fetishistic character of bourgeois
economics, that is the preoccupation with
exchange value which. .captures only
appearances.

22Epistemological prescriptions could also be seen as
an important part of the hard core of Austrian economics.
The paper goes on to discuss how this agenda was pursued
by Marx himself in each of the volumes of Capital, and
also in his outline of the evolution of capitalist social
relations through the three stages of capitalist
development.
Helburn notes that both Popper and Lakatos were
relentless critics of Marxian analysis. Detractors have
focussed on inconsistencies in Marx's applications of the
labor theory of value, and on the apparent falsification
of several of Marx's predictions. In particular, the
predictions of the increasing immiseration of the
proletariat, of a falling rate of profit, and of
proletarian revolution are often cited as examples of
empirical content that has not been historically
corroborated. The Marxian research program must still be
judged as both theoretically and empirically progressive,
Helburn contends, in that the problem shifts to which
some of these anomalies gave rise have served to
highlight novel facts, some of which have been borne out
by data. For example, labor market segmentation theories
cannot be considered as mere ad hoc adjustments, but
constitute a "healthy and productive reaction" to
perceived failures in Marxian analysis. (Helburn 1986,
90)
The Neo-Walrasian Program
Perhaps the most systematic "extensive" application
of MSRP to economics has been the work of E. Roy
Weintraub. He has carefully interpreted the history of
general equilibrium analysis in the Lakatosian framework,
and his study represents the most thorough rational
reconstruction of economic theory to date. The version
of that reconstruction from which the following sketch is
taken is contained in his article "Appraising General
Equilibrium Analysis." (Weintraub 1985)
Weintraub's book Microfoundations: The
Compatibility of Microeconomics and Macroeconomics posits
the existence of two primary research programs in
economics: the Keynesian program, and the neo-Walrasian.
"Neo-Walrasian" is adopted in preference to
"neoclassical" since Weintraub feels the latter
designation has been applied in such an indiscriminate
manner as to become meaningless.
The neo-Walrasian program is defined by the
following hard core: (Weintraub 1985, 26)
(1) There exist economic agents;
(2) Agents have preferences over outcomes;
(3) Agents independently optimize subject to
constraints;
(4) Choices are made in interrelated markets;
(5) Agents have full relevant knowledge; and
(6) Observable economic outcomes are coordinated,
so they must be discussed with reference to
equilibrium states.
These propositions are insulated by the following
elements of the negative heuristic: (Weintraub 1985, 26)
(1) Do not construct theories in which irrational
behavior plays any role;
(2) Do not construct theories in which equilibrium
has no meaning; and
(3) Do not test the hard core propositions.
The program itself is pursued in accordance with the two
prime directives of the positive heuristic: (Weintraub
1985, 26)
(1) Construct theories in which economic agents
optimize; and
(2) Construct theories that make predictions about
changes in equilibrium states.
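To see concretely what a belt theory obeying both
directives looks like, consider a deliberately minimal
illustration (added here; it is not one of Weintraub's
own examples): a single market in which consumers with
Cobb-Douglas preferences over a good and money optimize
subject to a budget constraint, while supply is fixed.
Writing I for income, α for the expenditure share implied
by the utility function, x̄ for the fixed supply, and p
for the price, optimization yields the demand αI/p, and
market clearing determines the equilibrium price and its
response to an income change:
\[
\frac{\alpha I}{p^{*}} = \bar{x}
\quad\Longrightarrow\quad
p^{*} = \frac{\alpha I}{\bar{x}},
\qquad
\frac{\partial p^{*}}{\partial I} = \frac{\alpha}{\bar{x}} > 0 .
\]
The demand side is derived from agent optimization, the
conclusion is a prediction about a change in an
equilibrium state, and at no point is any hard core
proposition itself put to the test.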
Weintraub gives a succinct and cogent statement of the
nature of hard core propositions: "They are tenets,
overriding assumptions, that by the definition of the
program are taken as 'givens' by those who work in the
program." (Weintraub 1985, 26) Accordingly, neo-
Walrasian theorists are taken aback by queries as to
whether optimization by agents can actually be observed,
or whether it can be demonstrated that they have perfect
knowledge of the relevant facts of their situation.
Within the program, these questions simply do not arise.
But the hard core given above was only established
in the early 1950s. Taking a cue from comments by
Lakatos (1978a, 48 n.4) and Leijonhufvud (1976, 79),
Weintraub suggests that the series of refinements,
amendments, and extensions of general equilibrium
analysis from the 1930s to the 1950s constitute the
"hardening of the hard core" of neo-Walrasian economics.
The program's increasing sophistication and rigor can be
traced through Schlesinger's restatement of the Walras-
Cassel equations in terms of inequalities, continuing
through the existence proofs of Wald and Koopmans, and
finally the use of theorems from combinatorial topology
by von Neumann, Arrow, Debreu and McKenzie. Although not
mentioned specifically in Weintraub's paper, later work
by Gale and Nikaido, as well as the numerous attempts to
relax some of the assumptions of GE theory, can be seen
as part of an ongoing "hardening" process. Weintraub
sees this process as a series of theoretically
progressive problem shifts, in the Lakatosian sense.
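The first step in this sequence, Schlesinger's
restatement, can be glossed briefly (the notation is
added here for illustration and is not Weintraub's).
Where the Walras-Cassel system required every resource i
to be fully employed, Schlesinger's amendment demands
only
\[
\sum_{j} a_{ij} x_{j} \le r_{i},
\qquad
w_{i} \ge 0,
\qquad
w_{i}\Bigl(r_{i} - \sum_{j} a_{ij} x_{j}\Bigr) = 0,
\]
where a_{ij} is the amount of resource i required per
unit of good j, x_j the output of good j, r_i the
endowment of resource i, and w_i its price: a resource in
excess supply commands a zero price. This move to
inequalities and complementary slackness is what the
existence proofs that followed then built upon.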
Appraisal of the neo-Walrasian research program
presents special problems, according to Weintraub. These
stem from the fact that the hardening of the core
proceeds as an extended exercise in applied mathematics,
not as a sequence of empirically meaningful, or testable,
hypotheses in the conventional metascientific sense.
Because of this, a dualistic approach to evaluation of
neo-Walrasian economics is necessary. Developments in
the hard core must be appraised using criteria
"appropriate for gauging mathematical progress."
(Weintraub 1985, 34) Weintraub suggests that a framework
such as that of Lakatos' Proofs and Refutations would be
useful in this regard.23 Progress in the protective belt
can be assessed in more traditional ways by reference to
corroborated empirical content. Examples Weintraub gives
of neo-Walrasian belt theories which can be evaluated in
this way are human capital theory and demand theory.
23Proofs and Refutations (1976) is the published
version of Lakatos' PhD dissertation. In it, he attempts
a reconstruction of the history of the so-called
Descartes-Euler conjecture that for any polyhedron, the
number of vertices minus the number of edges plus the
number of faces equals 2. The view of mathematics which
results from this reconstruction is refreshingly
different from the traditional "Euclidean" view of a
rigid deductive structure based on undefined axioms.
Many economists have been attracted to Proofs as an
alternative source of economic methodology.
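A worked check of the conjecture for two familiar
solids, added here for illustration:
\[
\text{cube: } 8 - 12 + 6 = 2, \qquad
\text{tetrahedron: } 4 - 6 + 4 = 2 .
\]
Lakatos' concern is with how the conjecture is defended
and refined when confronted with apparent
counterexamples, not with the arithmetic itself.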
While it remains true to the formal categories of MSRP,
Weintraub's use of that methodology evidently goes well
beyond Lakatos' original conception in two ways.
First, Weintraub takes advantage of the ambiguity in
Lakatos' idea of "hard core." It will be recalled that
Lakatos allowed for the "hardening" process which is so
prominent in Weintraub's reconstruction; however, the
latter implies that the core never can be considered as
fully formed: "The refinement of the hard core, the
hardening as it were, was not even completed by 1954. It
continues today." (Weintraub 1985, 36) Weintraub
further suggests that the core and the positive heuristic
evolve jointly. He cites rational expectations as an
example of a protective belt theory which results from a
reinterpretation of core concepts:
If the core proposition "agents optimize" circa
1954 is now "packed" with the concept that one
of the objects of the optimizing choice is the
set of expectations of the future values of the
choice variables, then the hard core supports
the theories of rational expectations.
(Weintraub 1985, 36, n. 23)
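One conventional way of writing this "packing" (the
notation is added here and is not Weintraub's own) is to
treat the forecast of a future value, say a price
p_{t+1}, as itself an object of optimal choice: the
forecast minimizing expected squared error given the
information set Ω_t, whose solution is the
model-consistent conditional expectation,
\[
p^{e}_{t+1}
= \arg\min_{\hat{p}}\;
E\!\left[\left(p_{t+1} - \hat{p}\right)^{2} \mid \Omega_{t}\right]
= E\!\left[\,p_{t+1} \mid \Omega_{t}\,\right].
\]
Expectation formation is thereby subsumed under the core
proposition that agents optimize, rather than appended as
an auxiliary assumption.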
Core propositions are thus in a constant state of flux:
"The terms of the core functioned in 1900 in a way that
hardly resembled the way they functioned in the 1960s."
(Weintraub 1985, 36) The second innovation in
Weintraub's approach lies in his assertion that economics
requires two standards of appraisal, one for developments
in the hard core and one for the constructs of the
protective belt. As will be discussed in the next
chapter, such interpretations of MSRP are at some
variance with Lakatos' own prototype and examples. Also
considered below are recent developments in GE theory
which may have obviated the need for two sets of
appraisal criteria.
Neoclassical Economics and
Core Demi-Core Interactions
If other "extensive" applications of MSRP point the
way toward modifications of Lakatosian constructs the
better to apply them to economics, the work of Joseph
Remenyi represents a complete reformulation of Lakatos'
ideas to that same end. Remenyi takes all of
neoclassical economics as the unit of appraisal, the
"research program." (Remenyi 1979) Contrary to
Weintraub's vision of a dynamic hard core, Remenyi
locates all the program's activity in the protective
belt. The hard core is once again considered as a fairly
constant set of "metaphysical" or axiomatic propositions.
In the protective belt, however, are located all of
the subdisciplines which are part of mainstream
economics: industrial organization, international trade,
public finance, etc. Each of these possesses its own
"demi-core" which directs research in that specialty just
as the hard core governs the work of the overall program.
Demi-cores are not immutable. Some may disappear over
time, or be absorbed into one or several others, while
completely new demi-cores come into being. Certain demi-
cores may go through periods of comparative inactivity;
Remenyi mentions welfare economics in this regard.
Others (e.g. the economics of "primitive societies") pass
entirely into other disciplinary research programs (e.g.
anthropology).
Regardless of their ultimate fate, however, demi-
cores invariably arise via the workings of the
discipline's positive heuristic. As such, they must all
be fundamentally consistent with, indeed, be miniature
expressions of, the program's (i.e. economics') hard
core. Deviations from the core encounter what Remenyi
calls the "oversight principle." (Remenyi 1979, 35)
This principle operates through two factors: the EH (or
errant hypothesis) factor and the IR (or institutional
response) factor. EH responses are primarily
academic; new subdisciplinary theories which challenge
hard core propositions are immediately critiqued and the
source of their "error" is determined. The IR factor
comes into play in the attempt to cordon off heterodox
theorists from institutional support: jobs, funding,
journal space. It is obvious that Remenyi's "oversight
principle" is akin to Kuhn's "invisible college."
Remenyi proposes two other neologisms: the "bravado
impulse" and "absorptive reaction." The former provides
the impetus for extensions of the hard core's
applications. It countenances such innovations as are
necessary to increase the range of phenomena brought into
the program's domain, providing, of course, that none of
the innovations violate the negative heuristic.
Absorptive reaction denotes ". . . the natural tendency
to absorb into an SRP all core-supporting facts and
knowledge. . . ." (Remenyi 1979, 36)
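Remenyi's nested structure can also be summarized
schematically. The sketch below is an illustrative
rendering only; the names and the propositions it
contains are placeholders rather than Remenyi's own
wording, and the "oversight principle" is left as an
abstract admission step.

    # Schematic sketch of Remenyi's modified research program structure.
    # Illustrative only: names and propositions are placeholders.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class DemiCore:
        subdiscipline: str           # e.g. "international trade"
        propositions: List[str]      # directs research within the specialty
        active: bool = True          # demi-cores may lapse, merge, or migrate

    @dataclass
    class ResearchProgram:
        hard_core: List[str]                      # near-constant axiomatic tenets
        demi_cores: List[DemiCore] = field(default_factory=list)

        def admit(self, demi: DemiCore) -> None:
            # "Oversight principle": only core-consistent demi-cores gain
            # institutional support; the consistency test is left abstract here.
            self.demi_cores.append(demi)

    mainstream = ResearchProgram(
        hard_core=["agents optimize", "outcomes are equilibrium outcomes"])
    mainstream.admit(DemiCore("public finance",
                              ["tax analysis proceeds from optimizing agents"]))
    mainstream.admit(DemiCore("international trade",
                              ["trade patterns reflect comparative advantage"]))

Nothing in the sketch enforces the EH or IR responses; it
simply records where, on Remenyi's account, each kind of
proposition resides.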
One of the strengths of Remenyi's approach is that
it provides a compact way to visualize the branches of
mainstream economics. He develops his schema in steps
through a series of diagrams, some of which are
reproduced below. Figure 2 represents his view of the
Lakatosian model of a scientific research program.24
The position of the hard core and protective belt in the
diagram is self-explanatory. The outer ring symbolizes
the domain of real world phenomena (Popper's World I), as
24Remenyi deems the figure to represent what he calls
a modified form of MSRP. An earlier drawing, showing just
the hard core and protective belt, has all arrows
(indicating the direction of influence) flowing outward
from the hard core; that constitutes a distortion of
Lakatos' conception, being closer to a purely
conventionalist view of the structure of science, which
Lakatos' own view certainly is not. Figure 2 better
represents Lakatos' formulation.