Citation
Implementation as organizational learning capability

Material Information

Title:
Implementation as organizational learning capability: a case study of Colorado's standards-based reform policy
Creator:
Bailey, James Andrew
Place of Publication:
Denver, CO
Publisher:
University of Colorado Denver
Publication Date:
Language:
English
Physical Description:
292 leaves ; 28 cm.

Subjects

Subjects / Keywords:
Education -- Standards -- Case studies -- Colorado ( lcsh )
Educational change -- Case studies -- Colorado ( lcsh )
Organizational learning -- Case studies -- Colorado ( lcsh )
Education and state -- Case studies -- Colorado ( lcsh )
Education and state ( fast )
Education -- Standards ( fast )
Educational change ( fast )
Organizational learning ( fast )
Colorado ( fast )
Genre:
Case studies. ( fast )
bibliography ( marcgt )
theses ( marcgt )
non-fiction ( marcgt )
Case studies ( fast )

Notes

Thesis:
Thesis (Ph.D.)--University of Colorado at Denver, 2000. Educational leadership and innovation.
Bibliography:
Includes bibliographical references (leaves 264-292).
General Note:
School of Education and Human Development
Statement of Responsibility:
by James Andrew Bailey.

Record Information

Source Institution:
University of Colorado Denver
Holding Location:
Auraria Library
Rights Management:
All applicable rights reserved by the source institution and holding location.
Resource Identifier:
45535839 ( OCLC )
ocm45535839

Full Text
IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: A CASE STUDY OF COLORADO'S STANDARDS-BASED REFORM POLICY by
James Andrew Bailey B.A., Kansas Wesleyan University, 1985 M.A., University of Northern Colorado, 1986
A thesis submitted to the University of Colorado at Denver in partial fulfillment of the requirements for the degree of Doctor of Philosophy Educational Leadership and Innovation
2000


This thesis for the Doctor of Philosophy degree by
James Andrew Bailey has been approved by
Alan Davis
Date


Bailey, James Andrew (Ph.D., Educational Leadership and Innovation) Implementation as Organizational Learning Capability: A Case Study of Colorado's Standards-Based Reform Policy
Thesis directed by Assistant Professor Nancy M. Sanders
ABSTRACT
Research about educational reform has found that most reforms are not fully implemented, and there is tremendous variation in interpretation and implementation across sites. A major factor explaining variation in implementation is organizational capacity to understand, interpret and carry out the reforms in specific, local contexts. Research in private sector settings indicates that organizational learning orientations and facilitating factors are key to organizational capacity for implementing change. Using a model of organizational learning, this dissertation provides three comparative case studies of school districts selected to represent implementation variation within the same state policy context. The study investigates district organizational learning orientations and facilitating factors in implementing standards-based education reform. An organizational learning framework illuminates the complexity of implementation at the local level and highlights implementation as a process that is interactive with the surrounding policy environment, developmental in learning about the meaning of reform, and related to specific learning orientations and facilitating factors at the local level. Standards-based reform looked different and meant different things in each district because each interpreted state policy differently, used different learning orientations to understand and interpret the policy, and had different facilitating factors present. Each district was implementing the reform in different degrees of depth and effectiveness. The districts with strong interpretive mechanisms identified through the model show evidence of progressive changes in schools and classrooms regardless of their specific interpretations of state policy. The study indicates that attention to organizational learning contributes to research and practice about improving district organizational capacity.
This abstract accurately represents the content of the candidate's thesis. I recommend its publication.
Signed


DEDICATION
I dedicate this thesis to my family, whose support and persistence during this process were never-ending. I especially thank my wife for supporting my efforts and frequently going beyond the call of duty to give me time to finalize the writing of this thesis. I am forever grateful.
I would also like to dedicate this thesis to my parents, who both gave me a love of learning and knowledge from a young age.


ACKNOWLEDGMENT
This analysis was partly supported by National Science Foundation grant #REC 9905548, Nancy M. Sanders, Principal Investigator.
My sincere thanks go to the following people: district personnel who gave their time willingly and openly; David Garretson and my transcriber, who worked diligently to fix my tapes; and my committee, who supported these ideas and rescheduled their time to help me meet deadlines. Last, I wish to thank my advisor, Dr. Nancy Sanders, for her patience and dedication in seeing this project through even in some very trying times.


CONTENTS
Figures............................................... xi
Tables ............................................... xiii
CHAPTER
1. INTRODUCTION....................................... 1
Colorado's Policy Context...................... 5
The General Problem............................ 8
Background of the Problem.................. 12
Policy as Pedagogy......................... 13
Capacity Studies........................... 14
Theoretical Framework............................ 17
Specific Problem and Research Questions.......... 19
Research Questions......................... 20
Methodology for the Study........................ 20
Structure of the Dissertation.................... 23
2. REVIEW OF THE LITERATURE............................. 24
The Nature of Ideas in Policy.................... 26
Policy Implementation............................ 28
Rational Frameworks for
Examining Implementation................... 28
Conflict and Exchange Framework............ 30
Synthesis Frameworks....................... 34
Social Construction of Meaning Framework... 35
Organizational Learning and Sensemaking.......... 40
Capacity Studies................................. 44


Capacity as Organizational Resources
and Influence................................. 46
Capacity as Economic Capital.................. 54
Capacity as Cognition......................... 57
Relation to Organizational Learning........... 59
Organizational Learning Capability and Orientations. 61
Conclusion.......................................... 66
3. RESEARCH DESIGN......................................... 69
Site Selection and Sampling ........................ 74
Case Descriptions............................. 77
Data Collection..................................... 79
Interviews.................................... 80
Document Analysis............................. 82
Observations.................................. 83
Data Analysis....................................... 84
Categorization Strategies..................... 85
Contiguity-based Relations................... 106
Validity and Limitations of the Study.............. 107
Conclusion......................................... 110
4. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE
OF LEARNING ORIENTATIONS................................ 112
Acquisition........................................ 113
Knowledge Source............................. 114
Learning Focus............................... 119
Learner Focus................................ 122


Use of Data.................................... 125
Interpretation....................................... 127
Interpretive Orientation....................... 128
Interpretive Mechanism......................... 130
Dissemination........................................ 131
Dissemination Mode............................. 132
Knowledge Reserve.............................. 134
Utilization.......................................... 137
Learning Scope................................. 137
Value Chain................................ 142
Conclusion........................................... 144
5. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE OF FACILITATING FACTORS..................................... 146
Acquisition.......................................... 147
Scanning .................................. 147
Performance Gap................................ 149
Policy......................................... 151
Concern for Measurement........................ 155
Organizational Curiosity....................... 157
Interpretation....................................... 160
Involved Leadership............................ 160
Leadership Cognition........................... 162
Focus.......................................... 164
Systems Perspective............................ 165
Dissemination........................................ 169
Climate of Openness............................ 170


Continuous Education ................... 173
Utilization....................................... 176
Multiple Advocates.......................... 177
Accountability.............................. 178
Resources................................... 182
Conclusion........................................ 188
6. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING VARIATION AS
LOCAL MEANING.......................................... 190
Constructing Local Interpretations of
Standards-Based Reform Policy..................... 191
Case 1: Midplains........................... 191
Case 2: River Valley........................ 196
Case 3: Front Range......................... 201
Variation in Meaning of Standards-Based Reform.. 208
Meaning of Standards-Based
Reform in Midplains......................... 208
Meaning of Standards-Based
Reform in River Valley.......................211
Meaning of Standards-Based
Reform in Front Range....................... 216
Conclusion........................................ 222
7. ORGANIZATIONAL LEARNING CAPABILITY
IN STANDARDS-BASED REFORM: CONCLUSIONS
AND IMPLICATIONS FOR POLICY............................ 226
Implications for Interaction of
Organizational Learning Capability................ 228
Interaction of Learning Orientations........ 230
Differences in Facilitating Factors......... 235


Implications for Systemic Reform Policy in
Engaging Organizational Learning Capability........ 243
Influence of Policy on the Three Cases....... 243
Challenges for Policy as Pedagogy............ 247
Implications for the Organizational Learning
Capability Model on Practice....................... 249
The Model's Utility in Understanding LEAs as Learning Organizations.................... 250
The Model's Utility for Enhancing
Learning Capability.......................... 253
Conclusion......................................... 256
APPENDIX
A. Interview Protocol..................................... 260
BIBLIOGRAPHY..................................................... 264


FIGURES
Figure
4.1 Knowledge Source Rubric........................ 114
4.2 Learning Focus Rubric............................... 119
4.3 Learner Focus Rubric................................ 122
4.4 Use of Data Rubric.................................. 125
4.5 Interpretive Orientation Rubric..................... 128
4.6 Interpretive Mechanism Rubric....................... 130
4.7 Dissemination Mode Rubric........................... 132
4.8 Knowledge Reserve Rubric............................ 134
4.9 Learning Scope Rubric............................... 137
4.10 Value Chain Rubric.................................. 142
5.1 Degree of Use of Scanning........................... 147
5.2 Degree of Use of Performance Gap.................... 149
5.3 Degree of Use of Policy............................. 151
5.4 Degree of Concern/Use of Measurement................. 155
5.5 Degree of Organizational Curiosity.................. 158
5.6 Degree of Involved Leadership....................... 160
5.7 Degree of Leadership Cognition...................... 163
5.8 Degree of Focus..................................... 164
5.9 Degree of Systemic Perspective/Alignment............ 165
5.10 Degree of Climate of Openness....................... 170
5.11 Degree of Continuous Education...................... 174
5.12 Degree of Multiple Advocates........................ 177


5.13 Degree of Accountability............................ 179
5.14 Degree of Aligned Resources ........................ 182
7.1 The Interaction of Midplains Learning Orientations 232
7.2 The Interaction of River Valley's
Learning Orientations...................................... 233
7.3 The Interaction of Front Range's
Learning Orientations...................................... 234
7.4 The Interaction of Facilitating Factors and
Learning Orientations in Midplains....................... 237
7.5 The Interaction of Facilitating Factors and
Learning Orientations in River Valley.................... 240
7.6 The Interaction of Facilitating Factors and
Learning Orientations in Front Range..................... 242


TABLES
Table
2.1 Organizational Learning Capability Model............. 63
3.1 Case Summaries....................................... 77
3.2 Informants Interviewed............................... 81
3.3 Coding Rules for Acquisition......................... 86
3.4 Coding Rules for Interpretation...................... 87
3.5 Coding Rules for Dissemination....................... 88
3.6 Coding Rules for Utilization......................... 88
3.7 Coding Rules for Knowledge Source.................... 90
3.8 Coding Rules for Learning Focus...................... 91
3.9 Coding Rules for Learner Focus....................... 91
3.10 Coding Rules for Dissemination Mode.................. 92
3.11 Coding Rules for Knowledge Reserve................... 93
3.12 Coding Rules for Learning Scope...................... 93
3.13 Coding Rules for Value Chain Focus................... 94
3.14 Coding Rules for Facilitating Factors Under Acquisition 95
3.15 Coding Rules for Facilitating Factors Under Dissemination 97
3.16 Coding Rules for Facilitating Factors Under Utilization 98
3.17 Coding Rules for Facilitating Factors Not Specific to a
Learning Process..................................... 99
3.18 Coding Rules for Use of Data............................ 100
3.19 Coding Rules for New Facilitating Factors Under
Acquisition.......................................... 101
3.20 Coding Rules for Interpretive Mechanism................. 101


3.21 Coding Rules for Interpretive Orientation.................... 102
3.22 Coding Rules for Emergent Facilitating Factors Under
Interpretation............................................... 103
3.23 Coding Rules for Emergent Facilitating Factors Under
Utilization.................................................. 104
3.24 Organizational Learning Capability Model for Local
Education Agencies........................................... 106
3.25 Interrater Reliability Percentages........................... 109
6.1 Summary of Midplains Learning Orientations and
Facilitating Factors....................... 191
6.2 Summary of River Valley's Learning Orientations and
Facilitating Factors.......................... 196
6.3 Summary of Front Range's Learning Orientations and
Facilitating Factors......................................... 201
7.1 Differences in Learning Orientations and
Facilitating Factors......................................... 228


CHAPTER 1
INTRODUCTION
Until recently, the majority of educational policy in the United States was left to local decision-making processes. While most states have constitutional authority over public schooling, most of this authority was delegated to local education agencies (LEAs). This delegation of authority was especially true in matters of curriculum and instruction (Spillane, 1993), with the assumption that local educators were in a better position to decide what and how to teach students. Since the early 1970s, with the rise of both federal and state aid to LEAs, however, a proliferation of educational policy dealing with curriculum and instructional issues has been passed in an attempt to increase student achievement. This is especially apparent since 1983, when A Nation at Risk was published. This seminal work on the dangers of a failing public educational system led many states to emphasize policies that dealt with curriculum and instructional matters.
The 1980s, from a state policy perspective, saw great emphasis put on increasing graduation requirements, improving teacher quality and basic skills testing (Goertz, Floden & O'Day, 1995). These policies, however, did little to change the content or instructional methods of coursework and were contradictory in that they called for increased content and thinking but often mandated minimal, basic skills testing. These contradictory notions led to little professional learning and the inability to use this type of reform policy (Goertz et al.) for enhanced achievement.
More recently, a more systematic and consistent approach to educational policy has emerged. The concept of systemic reform as articulated by Smith & O'Day (1991) assumes that if all parts of the system align around specific outcomes for student learning, student achievement on these outcomes will increase. The parts of the system that have to be aligned or organized to achieve ambitious student outcomes or standards for learning include: assessment systems that focus on measuring these ambitious standards, professional learning focused on the standards and assessments with new instructional methods, teacher certification that models the type of learning promoted, and restructured governance systems to promote increased student achievement. Similar to policies in the 1980s, systemic reform attempts to change the central practices or core technology of schooling, teaching and learning. Systemic reform, however, attempts to change the entire policy system in a more coherent fashion based on goals for student results and new conceptions of learning. By 1998, the power and logic of this idea led to 48 states passing their own versions of systemic reform policy (Massell, 1998).


Recent research on systemic reform policy in numerous states, however, suggests that the alignment of educational policy is not as rational as set forth in theory. For instance, Jennings and Spillane (1996), in their study of systemic reform policy in South Carolina, found that unequal forms of capacity led to different interpretations of policy. At the state level, Massell (1998) also found that states have interpreted the intent of systemic reform quite differently and have attempted various capacity-building strategies, with varying degrees of success, to help local education agencies (LEAs) enact these policies. Last, Quality Counts (1999), a yearly analysis of standards-based reform in all 50 states, reported that 48 of the 50 states have passed legislation dealing with systemic reform. However, state accountability systems vary widely, state standards and assessments vary greatly in form and substance, and only 14 states provide any form of incentive to improve performance. Similarly, Quality Counts reports that due to this variability in interpreting systemic reform around accountability, only two states come close to having all the components of a complete accountability system. According to Quality Counts, the principles for a statewide accountability system include having standards, balanced assessments that measure these standards, rewards and consequences for performance, and data and information from performance measures that schools and publics can understand. In conjunction, a real accountability system includes report cards that summarize the performance of individual schools; public rating systems of schools that target low performers; targeted assistance for school improvement; rewards based on performance; and authority to close or take over schools that do not improve (p. 10). The two states that have these elements, North Carolina and Texas, posted the largest average gains on the National Assessment of Educational Progress from 1990 to 1996.
This recent policy trend raises many questions about the role of states, LEAs and the relationship between state policy and local practice (Spillane, 1994). Is the line of implementation between state policy and LEAs any different from earlier policy attempts that showed local implementation to be problematic? Because this new form of state policy attempts to change curriculum, instruction and professional development, how have districts interpreted these policies? Similarly, how have LEAs gone about making sense of a multitude of these policies in an internally coherent fashion? Last, what role does the LEA play in systemic reform in light of the tremendous amount of change and learning required by new systemic reform policies? I take up these questions in the following dissertation through comparative case studies of how three Colorado school districts responded to state systemic reform policy.


Colorado's Policy Context
In the 1990s, the state of Colorado, as in most other states, became much more active in curriculum and instructional policy aimed at increasing student achievement. However, because of the state's long history of local control, state initiatives were always designed to let LEAs determine how they would implement policy while pushing for more ambitious pedagogy by LEAs. Colorado's version of systemic reform began in 1993 with the passage of HB 93-1313, which called for the state to develop model content standards as achievement goals for all students. LEAs were required to adopt their own version of standards that either met or exceeded the state version. Although local standards were to meet or exceed state models, the Colorado Department of Education had no mechanism to evaluate locally adopted standards. This legislation also called for both the state and LEAs to measure achievement of these standards by aligning tests to adopted standards. The legislation required districts to provide professional development and ensure that curriculum aligned with standards. Additional legislation changed the ways in which teachers were certified, moving to a licensure system with testing requirements for teachers and administrators entering the field.
Since that date, many other policies have been legislated. For instance, in 1996 HB 96-1139, the Colorado Basic Literacy Act, was passed. This legislation does not allow third grade students to go past third grade if they do not have the literacy skills to succeed at higher levels. Most recently, HB 98-1267 saw new requirements for LEA and school accreditation. This legislation requires LEAs to contract with the state to focus on outputs or indicators of how well students are achieving adopted content standards, community satisfaction, and a host of other indicators that focus on the results of the local educational system. Through a contractual agreement, this legislation strengthens local control over how to use standards and assessment at a local level, but holds LEAs accountable for raising achievement on state and local tests.
Last, within all of these policies has been the Colorado State Assessment Program (CSAP), which was mandated in HB 93-1313 to measure student achievement of state standards across all schools in Colorado. Content areas and grades were initially designated in separate legislation in 1995. Due to resource constraints, the initial testing occurred in the spring of 1997, with only fourth grade reading and writing being assessed. The spring of 1998 continued fourth grade assessment but added third grade reading for the literacy act. HB 98-1267 changed the testing schedule for CSAP, which will continue to add content areas and grades through 2001, including 10th grade assessment of reading/writing and math to determine eligibility for graduation.


With the passage of SB 00-186 in the spring of 2000, grades 3-10 will take the CSAP annually. These scores will be used for school report cards.
Colorado's policy context also includes substantial charter school legislation aimed at giving parents choice, and inadequate funding resources. Attempts to use surplus revenue to pay for construction costs were soundly defeated in the 1998 election. Similarly, Quality Counts (1999) gave the state of Colorado a grade of F for its adequacy of funding because it ranked 49th of 50 states in education spending per pupil, and the state has failed to keep funding levels equalized with inflation. Colorado spent $4,941 per student in 1998, while New Jersey, rated first, spent $8,436 per student. Similarly, Colorado showed a -11% change in inflation-adjusted education spending per student from 1987-1998, while New Jersey showed a +23% change in the same time period (Quality Counts, p. 120).
With the proliferation of state instructional policy, state-wide testing and inadequate funding in Colorado, districts have implemented standards-based reform and system alignment differently. Because many of these policies are dramatic in their intent and consequences for students but not directive in how to implement them, the role of the LEA has increased as a conduit for information and interpretation of these new instructional policies. The ambiguity in many of these policies requires local districts to interpret what they mean, determine how they fit into local values, and coordinate the state policy efforts with local improvement and policymaking. In essence, the implementation of these policies is, as Yanow (1996) describes, not an instrumental or rational process, but a part of the policy process that focuses on the meaning of these policies and the processes by which those meanings are communicated to and read by various audiences (p. 9).
Analyzing implementation through an interpretive lens suggests multiple ways to interpret policy depending upon the local context and local capacity. Recent studies by the Colorado Department of Education (1997) suggest a large range of progress among Colorado districts in developing local assessments, changing instruction, and providing the necessary learning opportunities to become standards-based. Similarly, resource issues such as time, staff development and funds point to the unequal progress among city, suburban, outlying city and rural districts. These differences in how policies are interpreted, together with the disparity of resources, result in the general problem of variation in implementation.
The General Problem
Variation in the implementation of educational policy has been an issue since the RAND studies of the 1970s (McLaughlin, 1987). Recent research about standards-based reform policy has also found variation in the implementation and enactment of standards policies (Massell, Kirst & Hoppe, 1997). Jennings and Spillane (1996) suggest that variation is due to local needs and contexts. This is especially true in specific curricular areas like math, led by NCTM curriculum and instructional standards, which call for greater degrees of mathematical power for all students but which run counter to highly ingrained institutional patterns. Implementation problems, as McLaughlin (1987) suggests, are never solved but evolve through multi-stage, iterative processes. The failure of past implementation models to explain local responses to systemic reform begs that we look at new models and ways to analyze implementation as an interactive process versus a linear one. This is especially important in the current systemic reform initiatives because they define the policy context through instructional policy, which is complex, extensive and reaches the core practices of teaching and learning that are more open to varied interpretations (c.f. Spillane, 1993, 1994).
To fully realize the potential for this complex and systemic type of policy at a local level, implementation relies on local capacity (Jennings & Spillane, 1996; Corcoran & Goertz, 1995). Capacity in systemic reform studies has been identified in various ways as organizational resources, economic capital, and cognitive knowledge and beliefs, but no common definition exists. Similarly, while many components have been identified to enable various forms of capacity, no connection to how these enable LEAs to interpret or learn from state policy has been defined.
Knapp (1997) suggests that by only looking at systemic reform through past implementation knowledge we are bound to misunderstand the dynamics of this type of policy. Past implementation knowledge does not take us deeply into the meaning of capacity as a construct, where it comes from, or how policy itself contributes to that development process (p. 251). Knapp also suggests that too easily contextual conditions in the variety of case examples studied so far are treated as static givens, rather than dynamic, developmental features of an environment that is itself evolving (p. 252).
To enhance the capacity of a system to meet more demanding requirements, four methods are commonly utilized: 1) improving the performance of workers through professional training; 2) adding additional resources in the forms of money, materials, personnel or technology; 3) restructuring the way work is organized; or 4) restructuring how services are delivered (Goertz et al., p. 110).
Capacity in common terms has mainly dealt with improving teacher learning and professional development relative to standards-based reform policy. The problem of capacity, however, does not rest merely on improving teaching skills and knowledge. This so-called training model, without the other capacity-increasing elements, is incompatible with current systemic reform. Training alone does not allow teachers, schools or districts opportunities to learn, experiment, consult and evaluate their practice (Little, 1993) relative to the higher cognitive demands placed on students and teachers.
Organizational learning may be an important way to construct an understanding of policy implementation related to interpretation, learning capability and capacity. Between the intent of systemic reform and classroom practice exist district and school contexts which mediate the influence and meaning of standards-based reform on teaching and learning. Goertz et al. (1995) found that individual capacity interacts and is interdependent with organizational capacity in many respects. Similarly, Knapp (1996) suggests that the unfolding patterns of systemic reform can be seen from the perspective of organizational learning. This perspective, seen as an organization's cognitive capability to acquire, interpret, disseminate and utilize new knowledge (Huber, 1991), seems to be the gap in understanding the implementation of this type of policy. How local education agencies (LEAs) learn about and from systemic reform policy suggests a new approach to understanding the factors that enable organizational learning, policy implementation and their relationship. Therefore, this study will investigate how LEAs learn about and interpret state-mandated standards-based reform policy from an organizational learning perspective. This study will also investigate what factors and orientations in LEAs contribute to organizational learning about this type of policy.
Background of the Problem
Systemic reform policies that focus more on the core technology of schools are open to more varied interpretations because of the demands for different learning outcomes, equity and new institutional patterns. Compared to traditional educational policies which tended to focus more on the structure and organization of schools, systemic reform policies require learning in all parts of the LEA. Spillane (1993) suggests that because these types of policies may increase the activity of LEAs, new questions arise about the relationship between state policy and local practice, and this demands new ways to explore this relationship. A promising approach to explore the relationship is to examine how systemic reform policy and organizational capability exist in a reciprocal relationship based on learning. To understand this relationship, the ambitious changes in instructional policy can be viewed as constructivist forms of learning (Cohen & Barnes, 1993) required by teachers, principals and central office personnel. This theoretical perspective argues that if the idea of a policy is potentially worthwhile to implementors, its success will depend on the degree and
quality of reconstructing beliefs, knowledge and action in individuals and organizations, which is in effect learning through actual practice (Fullan, 1992, p. 66).
Cohen and Barnes (1993), Fuhrman and others (1988), and Yannow (1991, 1996) have also alluded to the use of learning theory as the next orientation for policy implementation research. Other authors (Elmore and Fuhrman, 1994; Goertz, Floden & O'Day, 1995) have agreed, suggesting that the ambitious instructional reforms of state policy are vulnerable to the capacities of individuals and organizations to learn from policy. Support for this perspective also comes from the latest implementation studies, including research on the relationships among policy, practice, and capacity.
Policy as Pedagogy
Similar to the latest theoretical perspective for studying implementation (Cohen and Barnes, 1993; Fuhrman and others, 1988; Yannow, 1991, 1996), educational policy and practice studies from the Center for Teacher Learning at Michigan State University and the Consortium for Policy Research in Education at Michigan University have used the constructivist learning metaphor to explore the relationships between policy and practice. This line of research uses cognitive psychology as its theoretical framework and investigates the educative nature of policies to see how local
practitioners, schools and districts construct an understanding of specific policies. Field studies using this framework also suggest a tenuous relationship between policy and individual practice, mediated by prior beliefs, experience and the school organization (Wiemers, Wilson, Peterson and Ball, 1990). Similarly, Ball, Cohen, Peterson and Wilson (1994) examined how differences among local teachers, schools and school districts influenced the response to state policy and mediated the original intent of the policy. Their findings suggest that sources of variance in local instructional guidance are substantial. These variations placed limitations on the ability to capitalize on reform ideas and to increase professional knowledge.
Capacity Studies
Other research in this field looked in more depth at the interactions and capacity needs among the intent of instructional policy, the implementing organizational unit and individual teachers to ascertain how learning from policy can occur. Spillane (1993), for instance, looked at how different districts in Michigan responded to state reading policy. He found that state policy and its meaning are affected by the notions central administrators hold about curriculum and instruction. If central administrators hold traditional pedagogical views, they limit the resources and opportunities for
teachers to learn from these policies. Jennings and Spillane (1996), in their study of South Carolina policy for at-risk learners, found that unequal capacity at the local school site affects learning opportunities for practitioners. Their study raises questions about the effects of policy when it requires local school systems with unequal capacity to implement the same ambitious reform. Similarly, Spillane and Thomson (1997) argued that capacity has to be rethought in light of the amount of learning required of all educators by current instructional reform.
Focusing on Local Education Agencies (LEAs), they found that where ambitious instructional reform was taking place, three interrelated dimensions commonly used by economists were evident: human capital, social capital and financial resources. These dimensions of capacity helped local LEA leaders to help others learn.
The Consortium for Policy Research in Education (CPRE, 1995, 1996), in its studies of systemic reform from a learning perspective, found similar interrelationships between individual capacity and organizational capacity. These studies identified elements of teacher capacity necessary for instructional change and showed how they depended highly upon organizational capacity to learn from and make sense of more ambitious instructional policy. Organizational capacity, defined as vision, collective commitment, knowledge or access to knowledge, resources and
organizational structures conducive to learning, was found to interact with individual capacity in a number of ways. High-capacity schools had these organizational elements in place, which allowed individual learning by teachers to occur and allowed teachers to pursue reform ideas together.
As these studies of standards-based reform policy suggest, the local school and district organization have a strong influence on teachers' ability to learn from and implement policy in their classrooms. However, none captures the interrelationships of all of these dimensions of capacity in the complex and systemic manner suggested by organizational sociologists such as Scott (1992) or Weick (1995). In order to understand the interrelationship between policy and individual practitioners, it is also necessary to examine how organizations mediate this learning and make sense of their external environments. This perspective on capacity, seen as the organizational capability to learn, marks the gap in our understanding of the implementation of this type of policy. As Knapp (1996) suggests:
In these terms, the patterns of implementation and effect are largely a story of incomplete professional and organizational learning occurring as the learners (teachers, administrators, curriculum coordinators, staff developers) encounter the often limited teaching (by those who make or promulgate reform policies, as well as by policy itself) and learning opportunities (created by, or around, reform policies). The longer term prospects for the policy's success depend in large measure on the quality of the reform policy's pedagogy and learning resources over time.
This then suggests that the idea of capacity as organizational learning capability needs to be examined as an important issue in the implementation of educational policy.
Theoretical Framework
This study posits that systemic reform policy and organizational learning exist in a reciprocal relationship. This exploratory assertion means the study will use organizational learning, nested within cognitive learning and interpretation, as its theoretical framework. To understand this relationship, the ambitious changes in curriculum and instructional policy will be viewed as constructivist forms of learning required of a local district or school organization (Cohen & Barnes, 1993). Learning by a local school organization, viewed through the conceptual framework of organizational learning, will be defined as the capacity or processes within an organization to maintain or improve performance based on experience (DiBella & Nevis, 1998). To say that learning has occurred means that new knowledge has come into an organizational system, has been interpreted, has been disseminated or transferred, and is used (DiBella & Nevis, p. 28). In this sense, implementation becomes a change in both the cognitive structures and the behaviors of individuals and the organizations in which they work.
More specifically, this study will use Huber's (1991) conceptual framework, in which organizational learning comprises the cognitive processes an organization uses to acquire, interpret, disseminate and utilize knowledge. To say that learning has occurred means that all four of these stages occur. This framework also differentiates between
individual and organizational learning in that learning is said to be organizational when:
- new skills, attitudes, values and behaviors are created or acquired over time;
- what is learned becomes the property of a collective unit; and
- what is learned remains within the organization even if individuals leave.
Huber's framework was used by DiBella & Nevis (1998) to define organizational learning capability. The Organizational Learning Capabilities framework depicts organizational learning in the private sector through seven orientations to organizational learning and ten facilitating factors necessary for organizational learning to occur. Although this research was done in business settings, the framework can be translated into educational organizations.
Together, these seventeen elements provide a way to profile an organization's learning capability. Learning orientations describe how learning occurs and what is learned based on an organization's culture and core competence (DiBella & Nevis, 1998, p. 24). These orientations exist on bi-polar continua and help distinguish stylistic variations in organizational learning. Facilitating factors specify elements that promote learning; they are based on best practices and common processes and may correspond to what is currently being defined as capacity. These factors enable learning to occur, but do not determine an organization's orientation to learning.
Research using this framework began in 1992 to determine how and why organizations learn and was supported by the Center for Organizational Learning at the Massachusetts Institute of Technology. Initial research used field-based case study methodology to describe organizational learning and grounded theory to draw emergent constructs about how and why organizations learn. Businesses studied included Centegra Health Systems, Fiat Auto, Motorola, AT&T, British Petroleum, Exxon Chemical and others. Constructs derived in these contexts were then tested using written case notes and validated through research in other companies. Further research has been conducted to develop an Organizational Learning Inventory for specific industries and to ascertain how enhancing learning orientations and facilitating factors affects various change initiatives.
Specific Problem and Research Questions
Variation in the implementation of educational policy has been a common theme and problem since the earliest implementation studies. Success in implementing various policies has been credited to high capacity and opportunities to learn, but no single definition or model has been proposed, and no relationships have been drawn between capacity and its role in organizational learning for school districts. No studies to date have explored the relationship between state systemic reform policies
and local districts relative to organizational learning capability. Therefore, this study will analyze variation as a problem of differences in organizational learning capability among school districts.
Research Questions
Districts implementing Colorado's version of systemic reform provide fertile ground to begin exploring, defining and understanding the dynamic interrelationship of capacity as organizational learning. Questions central to this study include:
1. Using DiBella and Nevis' (1998) description of organizational learning based on Huber (1991): (a) how did LEAs learn about standards-based reform policies; and (b) what orientations did LEAs use to learn about and implement current state-mandated standards-based reform policy?
2. What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies?
3. Using learning orientations and facilitating factors, what interpretations were constructed by Local Education Agencies (LEAs) about state-mandated standards-based reform policies?
4. How do differences in learning orientations and facilitating factors explain variation in the interpretation and implementation of standards-based reform policies?
Methodology for the Study
According to McLaughlin (1987), policy effects are complex, sometimes hidden or invisible, often unanticipated or nominalistic (p. 175). Because of this, McLaughlin
suggests that the analysis of policy implementation move away from positivistic models toward models of social learning. Therefore, the methodology for this study will be a comparative qualitative case study. Case study methodology is appropriate for this study because it provides an intensive description and analysis of a single unit or bounded system (Merriam, 1998, p. 19) such as a school district. DiBella and Nevis (1998) suggest that qualitative case studies are the most appropriate method because of the complexities and subtleties of capturing learning processes in organizations.
The school district is the organizational unit of analysis for two reasons. First, district response to state policy has been identified as a critical but neglected area of implementation research. Second, the literature poses many questions concerning district response, capacity, and the interrelationship of organizational learning and implementation at this level. Capacity is especially important in states like Colorado that have a decentralized educational governance system and state mandates for ambitious, systemic reform that is highly susceptible to capacity issues (Elmore and Fuhrman, 1994). Therefore, this study will use Colorado's state systemic reform policies as a context to study district capacity as organizational learning capability and to determine what orientations or factors as perceived by
district personnel in each of Huber's four areas contribute to LEAs' interpreting and learning from standards-based reform policy.
Three districts (LEAs) in Colorado were chosen as sites to study because of their contrasting sizes and reputations for implementing standards-based education reform. Two were nominated as representing a relatively high level of implementation as confirmed through documents, training provided for teachers and administrators, and serving as resources for other districts. One was nominated as a contrasting case, representing a relatively low level of implementation and confirmed using the same criteria.
Two primary forms of data analysis are used in this study. Categorizing (Maxwell, 1996) is used to determine the types of learning orientations and facilitating factors in relation to DiBella and Nevis' (1998) organizational learning capabilities framework. Contiguity-based relations is the second analysis tool; this method uses connection strategies to link codes in order to show relationships.
Analysis of the data will proceed in four stages. The first stage will code the data according to the four major categories in Huber's description of organizational learning and the existing constructs in the DiBella and Nevis model. The second stage will code the data for respondents' attributions of organizational learning to particular facilitating factors. In the third stage,
responses that are not adequately accounted for in the framework and model will be analyzed to address its applicability and appropriateness for understanding school district organizational learning and policy implementation. Last, learning orientations and facilitating factors will be analyzed together to understand how they interact.
Structure of the Dissertation
This thesis explores how three districts used varying organizational learning capabilities to learn about and from state systemic reform policy. Chapter Two provides a review of the literature about standards-based systemic reform, research on the capacity to implement complex educational reform, and the theoretical framework. Chapter Three describes the methods, sample and data analysis. Chapter Four describes the variation in learning orientations among the three school districts. Chapter Five describes variation in the strength and use of facilitating factors among the three school districts. Chapter Six explores the interaction of learning orientations and facilitating factors and how this interaction contributed to different meanings for standards-based reform in each district. Chapter Seven explores the implications for understanding implementation as organizational learning capability through a preliminary model of how organizational learning orientations and facilitating factors appear to influence organizational learning about systemic reform. Chapter Seven also considers implications for state policy to enhance organizational learning.
understand the policy and practice relationship, and to look at how capacity enables that relationship.
While we now know more about the variables that affect the implementation of policy (O'Toole (1986), for instance, found over three hundred separate variables in reviewing one hundred studies of implementation), there still seems to be a loose relation between policy and practice in educational settings (Cohen and Spillane, 1992). Why does this problem still exist after almost a quarter century of implementation research? Initial research showed a lack of capacity and will at local implementation sites, which influenced policy outcomes (Odden, 1991, p. 1). Later research on regulatory measures built into policy to mandate outcomes also showed weak links in policy implementation efforts (Odden, 1991, p. 1). From this early research, general schools of thought focusing on different levels of analysis (top-down and bottom-up) emerged that have helped shape our understandings of implementation. Other schools of thought that try to rectify the inherent conflict between these two schools and their levels of analysis have also emerged (see Nakamura and Smallwood, 1980, for instance). However, the implementation literature also poses many unresolved issues and offers conflicting opinions. This chapter will suggest that to advance our understanding of implementation and the relationship between state policy and local practice, another conceptual framework
must be used. Second, this chapter will examine this new conceptual framework from the capacity perspective and consider how organizational learning and capacity may provide a new way to examine the implementation of educational policy.
The Nature of Ideas in Policy
The nature of implementation research, as in all domains, has undergone ideological changes as understandings have emerged. Beginning with the top-down versus bottom-up debate, implementation study has been supported by research into policy formation and resulting practice (McDonnell, 1991). Analysts have asked what the original intent of the policy was, how the outcomes at local sites differ, and what factors explain the differences between intent and outcome.
Kingdon's (1984) concept of ideas supporting policy and implementation proposes two dimensions for understanding policy and implementation. First, goals, or those agreed-upon conceptions of what is desirable for individuals, groups or the society as a whole, exist as solutions to policy problems. These goals, such as improving student achievement, teacher performance, standards and numerous others, are in essence the content of implementation, or the lessons to be learned. The second concept includes theories about how the world works. This
concept contains those cause-and-effect beliefs about social phenomena, including policy instruments, people's behavior, regulations and others. These beliefs lead to the processes of implementation, or how the content of policy is put into place and learned. Put another way, it is the pedagogy of policy.
While both concepts are crucial in formulating educational policy, the second part of Kingdon's concept carries more importance for implementation. The goals of educational policy will continually change, but the process of implementation as learning is critical in that it is the means of accomplishing the ends of both policy goals and continual organizational development (Fullan, 1992, p. 66). If the idea of the goal is potentially good, its success will depend on the degree and quality of change in individuals and organizations that is learned through actual practice (Fullan, 1992, p. 66).
If we begin to see implementation from a learning or socially constructed meaning framework, we can begin to see how systemic reform policy and organizational capacity exist in a reciprocal relationship. To understand this relationship, changes in instructional policy can be viewed as constructivist forms of learning (Cohen & Barnes, 1993) required of teachers, principals and central office personnel to implement the intentions. A constructivist framework theorizes that learning in this sense means all members and parts of the educational system need to
construct meaning about the policy in order to act and for implementation to occur. The system also needs to unlearn (Hedberg, 1981) past patterns and behaviors that are being replaced by new ones. Although a gap exists in the empirical study of organizational learning, support for this rationale evolves from research in four fields: policy as pedagogy, organizational learning as sensemaking, capacity studies and organizational learning orientations. Each of these fields has moved, or has begun to move, toward using constructivist forms of learning to understand organizational phenomena. Together they may fill the gap in our understanding of responses to systemic reform policy by local educational agencies (LEAs).
Policy Implementation
Rational Frameworks for Examining Implementation
Early implementation research focused on a rational framework that posited that goals can be set at a central location and, through various controls, achieved (Scott, 1992). Rational frameworks also focus on the structures of organizational settings and their role in purposeful activity centered on goal achievement. In implementation research, this framework has led to the top-down
school of thought that treats policy makers as the central actors. This framework also focuses on the factors these central actors control that can affect fidelity to the goals of the policy (Pressman and Wildavsky, 1973; Van Meter and Van Horn, 1976; Mazmanian and Sabatier, 1981). This conceptual framework for implementation primarily focused on reducing uncertainty through monitoring, sanctioned interpretations (Ford and Ogilvie, 1996), structures and the ability to structure the implementation (Mazmanian and Sabatier, 1989).
As one way to study implementation, the top-down rational approach faces many problems. As a linear approach based on achieving certain goals, it must have a place from which to begin. The first criticism, then, is that the top-down rational approach begins from the statutory language and fails to take into account actions prior to legislation (Matland, 1995, p. 147). Second, top-down policy analysis fails to take into account the political nature of policies, instead treating implementation as a mere administrative practice guided by clear objectives and direction. Third, top-down rational models look at local implementors as vehicles and possible impediments to policy success and focus on policy makers as the key actors. Last, Fox (1990) argues that by focusing only on a rational view of implementation, we fail to address the false dichotomy of the objective versus the subjective in implementation. This leaves us questioning the reality of what we
analyze. The shortcomings of this perspective take us to the next evolutionary framework in implementation research: the conflict and exchange framework.
Conflict and Exchange Framework
A second major perspective for studying implementation came from the famous RAND studies of the mid-1970s that focused on the implementors themselves. This perspective, commonly known as the bottom-up perspective, used exchange or conflict theory to explain how implementors mutually adapt policy to fit their needs (McLaughlin, 1987). This framework argued, in contrast to rational models, that a more comprehensive view of implementation can be gained by looking through the perspective of the local site, target population or service deliverers and the conflicts and exchanges that occur. Exchange theory as a conceptual framework for implementation suggests that the how and why of human decision making is based upon seeking some reward in social transactions. Exchange theory posits that people weigh alternatives with respect to costs, benefits and the amount of information they possess. Exchange theory also asserts that exchange not only serves the needs of individuals but shapes and constrains the collective development of social systems such as organizations (LeCompte & Preissle, 1993, p. 130).
Conflict theory concerns itself with power and contradictions in social systems. Although somewhat different in their approaches, most notably in their view of humans within social systems, exchange and conflict theories are quite rational in their theories of how the world works. Therefore, while the overall perspective of bottom-up models is different, some measure of rationality abounds. Models and theories in this framework mainly come from the school of bottom-up research and focus on transactions among individuals and between levels. Hjern and Hull (1982), Sorg (1978), Elmore (1978), and Hall and Loucks (1978) all developed models that suggested this bottom-up understanding of implementation.
The most famous of the bottom-up studies were done by Berman and McLaughlin (1974, 1975, 1977, 1978). In the RAND studies of 1974-1978, entitled Federal Programs Supporting Educational Change, Berman and McLaughlin studied four major federal programs in eighteen states at 293 local sites (McLaughlin, 1991, p. 143). The original intent of the studies was to examine how federal policy stimulated and spread educational innovation, and how temporary funds were used to support new practices. These policies, as did most policies of the time, assumed a relatively direct relationship between federal policy inputs, local responses, and program outputs (McLaughlin, 1991, p. 144). Berman's and McLaughlin's findings suggested something else, however.
The RAND studies found that federal policy did have a major role in prompting local school districts to undertake projects aligned with federal guidelines, but these projects were mostly pro forma. The findings suggested that adoption of a project consistent with federal guidelines did not ensure successful implementation, and even if a project was successfully implemented there was no assurance that the program would continue without federal funds. The RAND findings suggested, from a conflict and exchange framework, that the consequences of federal policies depended primarily on local factors and not on federal guidelines or funding levels (McLaughlin, 1991, p. 145). In examining the local factors that affected the outcomes of innovations, the RAND study reached five conclusions:
- The educational methods of innovations did not matter as much as how they were carried out.
- Resources alone did not guarantee successful implementation or continuation.
- Project scope needed to be ambitious enough to generate interest and involvement but not so large as to require too much too soon from the implementing system.
- The active commitment of district and site leadership from the very beginning was essential.
- Local implementation strategies for innovations dominated the outcomes of federally supported projects (McLaughlin, 1991, p. 146).
The study also found effective and ineffective local strategies for implementation. Ineffective strategies failed because they did not provide ongoing teacher support and training, did not include teachers in development, and signaled a mechanistic role for teachers. These ineffective strategies included such things as
reliance on outside consultants, one-shot training and comprehensive system-wide projects.
In contrast, the RAND study found that strategies that promoted mutual adaption, or the exchange of ideas for benefit, were effective. These strategies included such things as extended training, teachers' participation in decisions, regular meetings focused on practical issues, local development of project materials, and principals' participation in training (McLaughlin, 1991, p. 146). In essence, as a bottom-up theory of the implementation of educational innovation, the conflict between macro-level objectives and micro-level realities led to conflict over power, resources and benefits, which in turn led to mutual adaption of the intent of the policy.
The conflict and exchange framework, housed mainly within the bottom-up approach to implementation, places more emphasis on describing the factors that have caused difficulty in reaching policy goals at local sites, and has therefore led to few explicit policy recommendations (Matland, 1995, p. 149). As an approach to analyzing implementation, this framework has been criticized on four major counts. First, by placing a large extent of policy meaning in local actors' hands, it neglects or underspecifies the role of policy makers. Second, this framework overemphasizes local factors, while resources and access to an implementing arena may be determined centrally and can substantially affect policy outcomes (Matland,
1995, p. 150). Third, conflict and exchange analyses that focus only on participants at the local level fail to take into account the broader social, legal and economic factors that structure the perceptions, resources and participation of those actors (Fritz, Halpin & Power, 1994, p. 56).
Last, because most implementation studies focus on questions of macro-level concerns as compared to micro-level realities, most bottom-up studies can still be considered top-down theories. Therein, the conceptual frameworks of exchange and conflict theory still assume rationality in top-down needs and concerns.
Synthesis Frameworks
As implementation research and understanding have grown, so have attempts to reconcile the differences and problems of the top-down and bottom-up approaches by combining the two conceptual frameworks. By looking at a rational framework for analysis of the policy itself and an exchange or conflict model for implementation, these hybrid models attempt to explain the how of public policy implementation. Third-generation models of implementation take kernels of truth from both top-down, rational models and bottom-up exchange and conflict models. They look for a variety of responses and strategies depending upon either the policy (Matland, 1995), the policy arena (Nakamura & Smallwood, 1980; Miller,
1990), the coalitions involved (Sabatier, 1989) or the conditions under which one model may be more appropriate (Berman, 1980; Timar, 1989). In answer to the question of which model is most appropriate, synthesis models rely on context. Most synthesis models combine lists of variables without exploring the theoretical implications (Matland, 1995). Therefore, these models do little for either policy makers or micro-level implementors.
Social Construction of Meaning Framework
If we believe that changing the culture of the school, the nature and profession of teaching, and the nature of the curriculum we offer students is the key (Fullan, 1992, p. 352) to sustained educational change, what then are the implications for policy and implementation studies? Social construction of meaning as a conceptual framework analyzes the constructed nature of social meaning and reality by removing the subject-object dichotomy (LeCompte & Preissle, 1993). Instead of being the subjects of policy, teachers become actors who construct meaning through social interaction. This frame, in contrast to a more rational view of the world, assumes that meaning is always changing but always based on interpretations according to local contexts. The social construction of meaning on an organizational
level also shares major concepts with cybernetics or systems theory in that
information matters considerably in the construction of meaning.
Returning to the introduction, in which Fullan (1992) suggested that implementation and change were simply a matter of putting ideas, practices, or activities into place, we can now see that this can happen yet have little impact. For true improvement to occur, Fullan also suggests that change and implementation require understanding meaning beyond surface detail. Fullan states:
The key to understanding the particular worth of particular changes, or to achieving desired changes, concerns what I call the problem of meaning. One of the most fundamental problems in education today is that people do not have a clear, coherent sense of meaning about what educational change is for, what it is, and how it proceeds. . . . What we need is a more coherent picture that people who are involved in or affected by change can use to make sense of what they and others are doing. (p. 4)
Fullan goes on to say:
Neglect of the phenomenology of change, that is, how people actually experience change as distinct from how it might have been intended, is at the heart of the spectacular lack of success of most social reforms. . . . Solutions must come through the development of shared meaning. (pp. 4-5)
Goertz, Floden, and O'Day (1995) see the construction of meaning as the greatest challenge for newer, systemic forms of policy:
The first and most critical challenge evident . . . is also the most difficult to realize in a system as large and bureaucratic as is public education in the United States. It is to place learning at the front and center of all reform efforts, not just improved learning for students but also for the system as a whole and for those who work in it. For if the adults are not themselves learners, and if the system does not continually assess and learn from practice, then there appears little hope of significantly improving opportunities for all of our youth to achieve to the new standards.
This brings us to the role of a different conceptual framework in understanding implementation as learning, and in understanding policy and its implications for supporting individual and organizational learning.
The latest wave of implementation research uses a cognitive perspective to understand the dynamics of the interaction between policy and implementors. For instance, Fullan (1992) discusses the nature of any school reform, including policy implementation, as making meaning out of the effort. Others (Cohen & Barnes, 1993; Fuhrman and others, 1988; Yanow, 1991) have also alluded to learning theory as the next orientation for policy implementation research. This theoretical perspective argues that if the idea of a policy is potentially worthwhile to implementors, its success will depend on the degree and quality of reconstructing beliefs, knowledge, and action in individuals and organizations, which is in effect learning through actual practice (Fullan, 1992, p. 66). Other authors (Elmore & Fuhrman, 1994; Goertz, Floden, & O'Day, 1995) have agreed, suggesting that the ambitious instructional reforms of state policy depend upon the capacities of individuals and organizations to learn from policy.
Foremost among the research that uses this perspective, the Educational Policy and Practice Studies, conducted during the early 1990s as a joint effort between Michigan State University's National Center for Teacher Learning and the Consortium for Policy
Research in Education at the University of Michigan, have also used the constructivist learning metaphor to explore the relationships between policy and practice. This line of research examines the educative nature of policy and how local practitioners, schools, and districts construct an understanding of policy. For instance, Cohen and Barnes (1993) found that policy seen as pedagogy is didactic in nature and, much like common K-12 instructional practice, does little to engage teachers. Therefore, very little learning occurs from most policy efforts unless policy can be designed to be educative in nature. Field studies using this framework also suggest a tenuous relationship between policy and individual practice.
Wiemers, Wilson, Peterson, and Ball (1990), in their study of California's policy effort for systemic change in mathematics, found that learning did occur for individual teachers. This learning, however, was mediated by prior beliefs, experience, and the school organization. Similarly, Ball, Cohen, Peterson, and Wilson (1994) examined how differences among local teachers, schools, and school districts influenced the response to state policy and mediated the original intent of the policy. Their findings suggest that sources of variation in local instructional guidance are substantial, including different instructional preferences, different subject matter preferences, and different levels of local educator knowledge. These variations placed limitations on the ability to capitalize on reform ideas and the ability to
increase professional knowledge. Different school district understandings of policy also led to different messages about the policy being given to schools. Finally, Spillane and Jennings (1997), in their study of the alignment of instructional policy and ambitious literacy pedagogy, found that while policy alignment strategies in LEAs were effective in changing surface-level aspects of teaching, these strategies may be less effective in changing the more difficult dimensions of classroom practice: task and discourse.
As stated in the introduction, implementation viewed from a social construction of meaning framework is not something to be done, but something to be learned. Once again, Fullan (1992) says it eloquently:
. . . the crux of change involves the development of meaning in relation to a new idea, program, reform, or set of activities. But it is individuals (author's emphasis) who have to develop new meaning, and these individuals are insignificant parts of a gigantic, loosely organized, complex, messy social system that contains myriad different subjective worlds. (p. 92)
Implementation in this framework would exist not as mechanical actions to put centrally defined details into practice, but as a combination of factors interacting to help individuals and organizations make sense of and learn from their actions. Policy formulation in this conceptual framework would, as Cohen and Barnes (1993) suggest, be educative in nature, helping to develop organizational capacity to acquire knowledge, share it, and interpret its meaning so that it becomes part of organizational
memory. In one sense, policy would become the curriculum from which school organizations learn. This requires that we understand how organizations learn.
Knapp (1997) also suggests a growing need for different theoretical orientations in studying systemic reform. Past implementation research does not take us deep into the meaning of constructs like capacity, how it develops or where it comes from, or into the features of the context surrounding the implementation process. Past implementation research also ignores the evolution of the policy ideas themselves. Organizational learning, therefore, has been offered as a way to understand implementation as large-scale construction of meaning.
Organizational Learning and Sensemaking
As noted in Chapter One, studies of systemic reform policy have pointed to the strong influence of the local school and district organization on teachers' ability to learn from and implement policy in their classrooms. In order to understand the interrelationship between policy and individual practitioners, it is also necessary to examine how organizations mediate this learning and make sense of their external environments. These organizational influences can be understood by reviewing two similar lines of inquiry: organizational sensemaking and organizational learning. This perspective helps us understand policy as something to be learned, and the
processes both individuals and organizations use to learn from and about the policy. Key ideas from an organizational learning perspective relative to policy include learning opportunities, resources for learning, support for learning, and the storage of new knowledge and processes in organizational memory. Another key idea is that:
. . . the collective enterprise of a school, production firm, symphony orchestra, or other organization possesses qualities that are greater than the aggregate of individuals within it, and that there is some identifiable learning that can be associated with the aggregate as a whole. (Cook & Yanow, 1995)
In both lines of research, organizations can be seen as cognitive structures that actively understand the worlds in which they exist. Sproull (1981), for instance, focuses on the behavioral processes necessary for implementation by understanding the information processing of organizations. In her theory, Sproull suggests that school districts respond to federal regulation in an active process organized around four elements: 1) processes by which organizational attention is captured, 2) processes by which meaning about external stimuli is constructed, 3) processes by which response repertoires are invoked, and 4) processes by which behavioral directives or guides for action are communicated (p. 457).
Expanding the idea of organizations as interpretation systems, Daft and Weick (1984) propose a model of different interpretation modes. The model is based, first, on an organization's assumptions about the environment, analyzable or unanalyzable, and second, on the degree of organizational intrusiveness into the environment, passive or active. Daft and Weick also include other organizational processes, such as scanning, interpretation, and strategy and decision-making processes, to help explain organizations as interpretation systems. Taken together, the model proposes four interpretation types: undirected viewing, enacting, conditioned viewing, and discovering. Lotto and Murphy (1990) and Stein (1997) use this conceptual framework to study the sensemaking of schools and of new Title I policy, respectively.
The field of organizational learning also helps us understand how organizations, as collectives of individuals, learn. Although many debates rage in this field of inquiry and little formal research exists to date, attention to organizational learning implies a central concern with how action patterns take shape within and are shaped by the stream of experience (Cohen & Sproull, 1996, p. xiii). Therefore, the more cognitive, process-oriented approaches should be examined to explore how organizational learning affects implementation. For instance, Weick (1990) describes organizations as bodies of thought and sets of thinking practices that shape organizational variables. Kim (1993) developed a model showing how individual mental models transfer, through learning, to organizational learning in either complete or incomplete cycles mediated by organizational routines. Argyris and
Schon (1978) similarly describe two levels of organizational learning. Single-loop learning is behavioral, in that organizations use common responses or routines to correct problems. Double-loop learning is cognitive, in that the underlying assumptions of the organization are modified to generate different responses. Duncan and Weiss (1979) developed a middle-ground theory, suggesting that organizational learning is an active process in which organizational members develop knowledge about action-outcome relationships and the effect of the environment on these relationships.
More recently, other authors have begun to view learning organizations as a collection of disciplines (Senge, 1990): personal mastery, shared vision, mental models, team learning, and systems thinking. Others (Roberts & Kleiner, 1999) have framed organizational learning as dealing primarily with systems theory but argue that there are multiple ways to understand organizations as systems, including open systems, social systems, dynamic systems, process systems, or living systems. Last, Huber's (1991) model provides the most parsimonious organizational learning theory, using four constructs to explain how organizations learn: knowledge acquisition, information distribution, information interpretation, and organizational memory.
By using this perspective to understand the implementation of systemic reform, we are able to understand how local education agencies, as organizations, attend to, make sense of, and cognitively construct the meaning of new reform policies. While some authors suggest that the pedagogy of the policy itself can be blamed for the initial failure of systemic reform, an organizational learning perspective attends to local contexts and processes and to how they construct greater learning opportunities that lead to both cognitive and behavioral changes. However, organizational learning, like all perspectives, leaves unanswered questions. Namely, what leads LEAs to act on what they have learned? How do LEAs actually learn about and from policy, and what contextual factors enable organizational learning to occur? Last, if we accept that capacity is a major part of any implementation perspective, how is capacity related to organizational learning?
Capacity Studies
Many policy researchers who study systemic reform have used the concept of capacity to help explain why standards succeed or fail in the classroom. For instance, Knapp (1997) suggests that a lack of capacity led to uneven policy attempts in Montana, Delaware, Connecticut, and California under the National Science Foundation's Statewide Systemic Initiatives. Researchers who use capacity as an
explanatory framework for implementation research, though, have not come to any common definition or understanding of capacity, because capacity has been defined from a variety of perspectives. Capacity has generally been defined as:
the ability to receive, hold, or absorb information
the ability to learn or retain knowledge
the ability to do something
the quality of being suitable for or receptive to specified treatment
Capacity studies use many of these definitions and isolate dimensions such as resources, training, organizational structure, policy, and interaction. Capacity studies also use more nebulous terms, such as collective enterprise or cognitive capacity, to explain the factors shaping local response. Most capacity theories fit under one of three main perspectives: capacity as organizational resources and influence, capacity as a form of economic capital, or capacity as individual cognitive constructions and beliefs. However, no comprehensive interactive model of how or why capacity leads to overall changes in classroom practice or district action exists to date. Nor are there studies to date that differentiate among the facilitating factors, processes, results, or cultural conditions that enable learning and become self-reinforcing over time. Each major element that has been identified as capacity, therefore, may help enable
implementation. However, no clear understanding has been developed that ties these multiple elements to large-scale organizational learning.
Capacity as Organizational Resources and Influence
Many studies have defined capacity as the organizational resources and influences upon which an individual's ability to learn from policy and change practice depends. For instance, Spillane and Thompson (1997), Newmann, King, and Rigdon (1997), and Smith (1997) all point to the role of resources in enabling other elements of capacity. Commonly cited resources include the quantity and quality of money, staffing, focused time, equipment, and materials.
Other research in this field has looked more in-depth at the interaction among the intent of instructional policy, the organizational unit, and individual teachers to ascertain how learning from policy can occur. For instance, Newmann and others (1997), Stokes (1998), Wechsler and Friedrich (1997), and Cohen (1995) all point to capacity as a collective enterprise or as collegial learning communities. This research suggests that schools with this form of capacity are more receptive to ambitious reforms. These schools contain higher degrees of social resources, shared commitment and collaboration, clearer purposes, a supportive climate, and a commonality of beliefs leading to high degrees of internal accountability. This
research also points out that this form of capacity makes it easier to understand newer forms of teaching and learning.
Similarly, Knapp (1997) suggests that teachers, as the targets of any policy instrument, must overcome the contexts and conditions that mediate individual learning. Knapp identified conditions in his study of systemic reform that influence practice: professional relationships, contact with knowledge bases, knowledge opportunities, influences that act as the medium of communication, and the social organization or culture of a school or LEA. Similar to Knapp's notion of capacity as knowledge opportunities and access to knowledge, McDonnell and Choisser (1997) and Smith (1997), in their studies of the influence of new state assessments on schools and teachers, also found that teachers' knowledge construction depended upon both the school and LEA organizations to offer technical support and a supportive climate. These two studies also found that training alone was not enough to help teachers change practice; change also depended upon the school and district removing barriers and incoherence in internal practice.
The Consortium for Policy Research in Education (1996), in its studies of systemic reform from a learning perspective, also found similar interrelationships between individual capacity and organizational capacity. CPRE (Goertz and
others, 1995) defined capacity as the ability of the educational system to help all students meet higher standards. These studies identified elements of teacher capacity necessary for instructional change, including knowledge, skills, dispositions, and views of self. These elements depend strongly upon organizational capacity to learn from and make sense of more ambitious instructional policy. Organizational capacity, defined as vision, collective commitment, knowledge or access to knowledge, resources, and organizational structures conducive to learning, was found to interact with individual capacity in a number of ways. High-capacity schools had these organizational elements in place, allowing individual teachers to learn and to pursue reform ideas. These high-capacity schools worked to enhance teacher capabilities not only through access to knowledge but also through cultural norms, outside coaching relationships, and a strong professional community of practice. Sykes (1990) reached similar conclusions earlier, suggesting that organizational control structures limited teachers' ability to learn from policy, and that institutional messages about schools limited the learning necessary for implementation.
Spillane and others (1995), in their study of the local policy system and how it affects math and science education in Michigan, found that organizational influences strongly affected changes in math and science education. Most notably, this research
found that an LEA's connection with an outside network for its teachers strongly influenced change in practice. The LEA's commitment and disposition to maintain the change, plus the collaborative atmosphere within the district, were found to be major components of capacity.
Schlechty (1997) defined capacity from an organizational influence perspective as the ability of a school district to identify and maintain a clear and compelling focus on the future, maintain a constant direction, and act strategically. While only hypothetical in nature, Schlechty's view is that each level of the educational system should aim to build capacity in the level below. Capacity in this sense means being able to solve the problems that inhibit the improvement of achievement at the student level.
Bodilly and others (1998), in their evaluation of the implementation of the New American Schools models, found that certain organizational factors influenced the implementation of whole-school reforms. Using the elements of design in each model as the dependent variable in a case study approach, researchers used a 0-4 scale to rate the degree of implementation in 40 separate schools. Through interviews and survey data, they found that factors including the process of selection, school receptivity or climate for reform, degree and stability of outside consultation, forms of intervention (whole-staff training, extensive professional development, use of
facilitators, quality checks, materials, and overall support), poverty level of the school, teacher mobility, and student-teacher ratios, as independent variables, influenced the degree of implementation. Researchers in this study also found that school district support, including leadership support, a culture of trust and cooperation between the school and district, the level of autonomy for schools, the level of resource support, and the effects of assessment and accountability packages, also greatly influenced the degree of implementation of whole-school reform models.
In evaluations of the National Science Foundation's Statewide Systemic Initiative (SSI), Zuckerman, Shields, Adelman, Corcoran, and Goertz (1998) found that eight common implementation strategies were used to build capacity in the 25 states involved. These strategies included: supporting teacher professional development, developing or disseminating instructional materials, supporting model schools, aligning state policy, creating an infrastructure to support reform, funding local initiatives, reforming higher education and the preparation of teachers, and mobilizing public and professional opinion. Each of these strategies showed some impact in each of the 25 involved states. However, those states that had the largest impact on student achievement and changes in teacher behavior used intensive teacher development as well as significant investments in instructional materials and
resources (Zuckerman and others, 1998, pp. vii-x). This evaluation also found that capacity building at the state, local, and school levels depended on multiple supports, including teacher networks, regional assistance centers, technology infrastructure, and improved processes for selecting instructional materials.
Finally, Massell (1998) studied the interaction between states and LEAs and the strategies states used to build capacity at the local level in eight states known for standards-based reform. This research was the first to study how state policy itself helps to build capacity in LEAs. Massell defined capacity as the elements necessary to support effective instruction, and her study suggests that numerous nested contexts act to support effective instruction in classrooms. In these eight states, Massell found seven areas of capacity essential for improving teaching and learning. First, classroom capacities, or those elements that directly influence learning, included teachers' knowledge and skills, students' motivation and readiness to learn, and curriculum materials for students and teachers. Second, school, district, and state organizational capacities were found to provide educational direction and leadership as well as access to resources and knowledge. Capacity elements at these levels included the quality and types of people supporting the classroom, the quantity and quality of interaction within and among
organizational levels, material resources, and the organization and allocation of school and district resources.
Massell also found that, despite their differences, these eight states shared four common capacity-building strategies: first, building external infrastructure to provide professional development and technical assistance, including regional institutions, professional networks, professional associations, and stronger ties with higher education; second, setting professional development and training standards for inservice and preservice education; third, providing curriculum materials through such avenues as curriculum frameworks, resource banks, and the support of effective programs in resource allocations; and fourth, organizing and allocating resources through linkages to accountability systems or site-based decision making.
Last, Massell found substantial evidence that state policymakers were paying attention to building local capacity through new strategies, including locating assistance closer to schools, creating professional networks, providing strong curriculum guidance, and adopting professional development standards. Despite these findings, Massell also found potential capacity problems that these eight states had not addressed: the limited capacity of state departments of education, limited understanding of the performance
data that was supposed to drive all systems to improve, the limited scope of capacity building aimed at low-performing schools while ignoring middle-performing schools, and the need for continuity in capacity building during periods of conflict. Massell also found that a serious problem in most states was the lack of incentives necessary to engage people in capacity building. Incentives for following professional development standards, improving teacher training institutions, pursuing professional development, holding students to high standards, and engaging in school improvement planning processes were found to be lacking.
In sum, those who define capacity from an organizational resource and influence perspective suggest that capacity is chiefly a matter of allocating the right resources, developing the right type of culture to influence individual learning, or having access to the latest knowledge base. This perspective, however, leaves many questions. For instance, how do these elements interact, how is this form of capacity developed over time, and how is it maintained? Nor is it clear from these studies how resource levels correlate with changes in classroom practice. Last, it is unclear from this perspective how individuals and collective units use these forms of capacity to make sense of large systemic reform efforts.
Capacity as Economic Capital
Economic capital and production theories have been used by researchers as another way to conceive of capacity. If we conceive of capital as something acquired or developed over time (Coleman, 1990), then spending this capital to buy implementation becomes another way to conceptualize capacity. Coleman (1990) explains that three forms of capital are available in any organization to help it pursue its goals: physical capital, or available material goods; human capital, or the skills and knowledge acquired by individuals; and social capital, or the social relations among the people in an organization. In Coleman's theory, all three forms of capital help facilitate the particular production function of an organization. To change, however, Coleman theorizes that social capital becomes the most important form to draw upon. Note that social capital is very similar to the concepts of capacity as social organization, collective capacity, collective enterprise, or collegial learning communities discussed in the previous section. Those schools or districts with more capital to spend can purchase more of the intended policy changes. How this works in actual practice, and how capital is developed, is not as clearly delineated. Nor is it clear where social capital in particular rests or how it leads to enhanced efficiency or production.
Spillane and Thompson (1997) argue that capacity has to be rethought in light of the amount of learning required of all educators by current complex reforms. Focusing on local education agencies (LEAs), they found that where ambitious instructional reform was taking place, three interrelated dimensions were evident: human capital, social capital, and financial resources. They also suggested that successful enactment of reform policy depended on local LEA leaders helping others learn by promoting and using these three interrelated dimensions. Spillane and Thompson, therefore, define capacity as innate elements that exist within LEAs to engage individuals in learning. They also argue that the development of human capital depends critically on the development and exploitation of social capital. How this process works, how social capital leads to enhanced human capital, and how each acts to influence large-scale organizational learning, though, remain unclear.
Similarly, Corcoran and Goertz (1995) define capacity from another economic perspective: as production or instructional capacity. In their definition, a system's capacity functions as a set of three variables: human resources; resources such as time, money, organizational arrangements, incentives, and materials; and the instructional culture. Corcoran and Goertz also suggest that capacity is limited in most districts and schools because our knowledge of what works is limited and because
teachers' access to learning opportunities is limited. This in turn affects a faculty's ability to collaborate around collective effort. If the product of an educational system is high-quality instruction, then the intellectual ability, knowledge, and skills of teachers require an instructional culture focused on collaborative learning around common problems of practice. While this definition clearly begins to define capacity as an interactive model, there is no empirical basis for it, nor is there a clear understanding of how these elements interact to influence change in instructional practice.
Purely economic analyses of the relationship between resources, spending, and school and student performance also show little evidence to support economic capital as the only resource necessary for capacity. Hanushek (1997), in a meta-analysis of over 400 studies of student achievement, found that there is not a strong or consistent relationship between student performance and school resources, at least after variations in family inputs are taken into account. Monk (1998) found similar results in a study of a New York policy requiring students who want to graduate from high school to pass five Regents examinations by 2003. While the number of students participating in Regents exams increased and inflationary increases in spending were evident in the state from 1992-1996, no increases in spending for new personnel were evident, nor were resulting increases in passing percentages.
In sum, economic perspectives on capacity ignore the dynamic nature of organizations as open systems and the role of learning as capacity. While capital resources are necessary in any educational setting, the evidence suggests that such resources alone cannot improve student or school performance.
Capacity as Cognition
A final way capacity has been defined is through the perspective of cognitive capacity. Drawing on recent work in organizational theory that emphasizes the role of cognition in human activity, many authors have explored the cognitive capacity of teachers and administrators, and cognition as an organizational process, relative to systemic reform.
Educator knowledge and beliefs have been an important part of later implementation studies. How individuals access, interpret, and make sense of ambitious reforms depends on many things and can influence the allocation of resources and learning opportunities for others. Spillane (1993), for instance, looked at how two districts in Michigan responded to state reading policy. He found that state policy and its meaning are affected by the notions central administrators hold about curriculum and instruction; if central administrators hold traditional pedagogical views, they limit the resources and opportunities for teachers to learn from these policies. Similarly, Jennings and Spillane (1996), in their study of South Carolina
policy for at-risk learners found that unequal capacity at the local school site affects
learning opportunities for practitioners. Their study raises questions about the effects
of policy when it requires local school systems with unequal capacity to implement
the same ambitious reform. Similarly, Sanders (1998) found that LEAs in which key personnel had or acquired deep understanding and clear beliefs about the implications of standards for the classroom were the most effective at garnering and using multiple kinds of resources to support teachers' work. In her work on Colorado's SSI efforts, cognitive capacity was defined as:
Depth of knowledge and beliefs in the organization about the reforms and strategic use of knowledge and beliefs to obtain important organizational resources that teachers and others in the system need to fully implement standards (Sanders, 1998, p. 2).
From an organizational learning perspective, Elliott (1998) suggests that the internal capacity of schools and school districts in relation to intervention depends on organizational cognition, which he defines as non-routine learning. In this sense, non-routine organizational learning depends upon four cognitive dimensions that add to the cognitive capacity of the organization to learn. These dimensions include:
- the approach to the organizational dialectic, which includes the deep questioning of conflicting norms, purposes and the fundamental quality of collective interactions;
- the approach to knowledge, which is a deliberate search, interpretation and experimentation for purposes of both refinement and reorientation;
- the approach to task, which is the interaction with the fundamental norms of teaching, school structure and collective interaction that leads to the development of a collective mind or capacity; and
- the approach to learning, which is a conscious awareness of the learning culture as continuous, collective, connected, deliberate and transformative.
These cognitive processes are thus enabled by factors that both precede and follow them, including resources and the ability of a school or LEA to switch to these non-routine learning processes when necessary.
In sum, the ideas of cognitive capacity help us further define capacity as a process of learning, constructing beliefs, and interpreting reform, while better defining where both individual and organizational understanding are stored. However, by itself cognitive capacity does not explain how this capacity is developed, how resources influence its attainment, or how this cognitive capacity becomes the collective understanding necessary for deep implementation of standards.
Relation to Organizational Learning
In sum, capacity has been defined as many things from diverse perspectives. If systemic reform requires greater knowledge in all parts of the LEA, as some suggest is the logical extension of the reform's central principles, then the capacity of the system can be identified as the whole system's ability to learn or retain knowledge, the faculty to do something, or the receptivity to treatment from policy. Therefore, we must turn to organizational learning as a way to begin to define and understand capacity on a large scale because of the similarities between organizational learning and capacity. These similarities include:
- A focus on both individual and collective learning
- Enabling resources
- Mediation of organizations on individuals and individuals on organizations
- A collective sense of focus, purpose and social resources that enables learning
- Processes within the organization that help facilitate and store the interpretation of the learning that occurred
To date, no capacity definition or model has used a dynamic, interactive systemic model to describe how the various elements interact or to define how the LEA as an organization does learn. Therefore, there exists a need for a new theoretical perspective to understand organizational implementation and capacity if constructivism is used as the overarching theory in understanding systemic reform. Using Huber's (1991) model for organizational learning (knowledge acquisition, information distribution, information interpretation, and organizational memory), we can begin to illustrate how capacity and its various elements may link to organizational learning through a conceptual framework of organizational learning capabilities.
Organizational Learning Capability and Orientations
Organizational learning has suffered from a lack of coherent theory. Some authors have viewed organizational learning as process; others have seen it as an end result. Still others, like Leithwood, Leonard and Sharratt (1998) and Scribner, Cockrell, Cockrell and Valentine (1999), who have studied organizational learning in educational settings, have begun to understand organizational learning differently. They see it as the relationship among impetus, mediating variables that enable learning on a collective basis, and the eventual changes in both cognitive structures and behaviors. Similarly, Toft-Everson, Jesse and Burger (1997), in their study of organizational learning, suggest that the organizational context into which educational changes are placed determines the receptivity to the proposed changes and improvements. This receptivity as local contextual factors was studied in seven LEAs. Toft-Everson and others found that organizational learning was a result of the culture of the school or LEA, organizational processes, managerial power and control, functions of leadership, communication processes, management of change processes, and stakeholder engagement. They also concluded that there is an essential relationship between organizational learning and organizational change, and that educational organizations can improve their capacity to learn by improving those attributes that lead to learning.
Last, in his seminal review of organizational learning, Cousins (1996) also suggests that the influences on organizational learning come from two sources: the context within which learning capacity is embedded and the conditions and factors associated with the environment within which the organization exists (p. 608). Cousins classifies the local context as consisting of four components that are similar to many ideas discussed in the previous section on capacity. These components include:
- ecology, or the relationships between the human and physical or material aspects of the organization
- milieu, or the psycho-social dimensions of persons, groups and their characteristics
- social system, or the patterned relationships among persons and groups
- culture, or the belief system, values, cognitive structures and meanings shared among organization members (Cousins as discussed in Toft-Everson and others, 1997)
Because organizational learning is now being analyzed as both process and outcome, Huber's model has been expanded to better understand differences in organizational learning. This model may help us understand how LEAs successful in learning about and implementing standards-based reform differ from other LEAs. This midground of understanding capacity as organizational learning capability uses the ideas developed by Nevis, DiBella and Gould (1995) through organizational learning orientations. Modifying Huber's constructs of knowledge acquisition,
sharing, and utilization, Nevis, DiBella and Gould (1995) explored successful businesses to understand organizations as learning systems and to determine how their learning differed. Picking companies that had reputations for organizational learning, they used a field study approach and grounded analysis technique to develop their concept of organizational learning capability. Their findings suggest that while all organizations learn, they have different learning orientations. Orientations describe how, what, and where learning occurs and are based on culture, experience and core competence. Organizations also possess different normative, facilitating factors or processes that affect how easy or hard it is for learning to occur. These factors are based on best practices and common processes for learning. Using Huber's constructs, the framework for looking at organizational learning orientations allows organizations to be studied as separate entities but to be compared for their different styles and responses to learning (see Table 2.1 below).
Table 2.1 Organizational Learning Capability Model

Learning Orientations (each a continuum)
1. Knowledge source:       internal ........ external
2. Content-process focus:  content ......... process
3. Knowledge reserve:      personal ........ public
4. Dissemination mode:     formal .......... informal
5. Learning scope:         incremental ..... transformative
6. Value-chain focus:      design .......... deliver
7. Learning focus:         individual ...... group

Facilitating Factors
1. Scanning imperative          6. Continuous education
2. Performance gap              7. Operational variety
3. Concern for measurement      8. Multiple advocates
4. Organizational curiosity     9. Involved leadership
5. Climate of openness         10. Systems perspective
Seen in this manner, organizations have an organizational learning system that can be described as learning orientations or style and facilitating factors or those elements within an organization that promote learning. The assumptions in this conceptual framework are that organizations are learning systems, and that learning in organizations is a systems-level phenomenon that is a constant function of any organization (Nevis and others, 1995). The conceptual framework of organizational learning capabilities also assumes that an effective learning organization constantly pursues an enhanced knowledge base allowing the development of competencies for either incremental or transformational changes. In these instances, there is assimilation and utilization of knowledge and some kind of integrated learning system to support such actionable learning (Nevis and others, 1995, p. 74).
Although much of the work with organizational learning capabilities has been done in business organizations, DiBella & Nevis (1998) suggest that this framework can be adapted to specific industries or contexts. For instance, after developing the orientations for general business, DiBella & Nevis used the general orientations to study health care systems. While they found some similarities, the orientations had to be modified and adapted to fit both the culture of the industry and the way learning occurred. While no work to date has utilized this framework in schools or school districts, the purpose of this framework is to increase awareness and understanding of
the practices and elements that contribute to learning in any industry so that these capabilities may be increased. This dissertation applies the framework to educational settings to investigate whether and how orientations and facilitating factors can be used to describe organizational learning in local education agencies.
In sum, organizational learning capability is a new theoretical framework that utilizes social construction of meaning to understand collective learning in various industries. It is an appropriate framework for examining both implementation and capacity issues in LEAs because it focuses on how organizations actually learn while examining how facilitating factors influence learning orientations. In this sense, it may encompass many elements from the capacity studies to show how resources enable individual and collective learning, and how orientations as cultural artifacts also enable learning. It is also an appropriate framework in that the learning orientations or learning styles are descriptive and therefore do not require value judgments. This allows us to look at how differences in learning orientations among LEAs explain different ways of interpreting standards-based reform and therefore to better understand variation. Last, this framework allows for modification according to industry.
Conclusion
As the struggle to understand educational policy implementation continues, Elmore and Fuhrman (1994) suggest that the big ideas of policy are "vulnerable to the capacities of the people and institutions that implement them" (p. 9). They go on to suggest that since the current reforms reach deep into the core technologies of schooling, they are more vulnerable to capacity problems. Their solutions parallel the main thesis of this dissertation: implementation of ambitious reforms is "a problem of knowledge development and learning" (p. 9).
This chapter showed that early implementation research on educational policy did little to help us understand how large-scale policy implementation could occur. As new forms of policy took shape, new conceptual frameworks were used to understand how the ideas of these policies were put in place and modified according to local contexts. The latest form of educational policy, systemic reform or standards-based reform, focuses more heavily on the core technology of teaching and learning. Therefore, other conceptual frameworks such as organizational learning have been promoted as more appropriate ways to understand implementation within local contexts. These frameworks rely on large conceptual theories like constructivist learning and socially constructed meaning. These frameworks and theories have been promoted as the newest way to understand policy implementation as learning and
large-scale construction of meaning since standards-based reform relies heavily on the learning in all parts of the educational system.
In conjunction with organizational learning, this chapter discussed the relationship between the ideas of capacity as organizational resources and influences, as economic capital, and as cognition in enabling this large-scale organizational learning. While most research in this area suggests capacity comprises the enabling elements that lead to organizational implementation or learning, very little agreement exists as to what capacity means relative to organizational learning or how the elements of capacity interact to enable organizational learning.
Last, if we wish to understand what capacity means relative to systemic reform using organizational learning as a filter for interpretation, organizational learning capability and orientations offer a way to understand the local context of learning. This framework allows modifications appropriate to particular organizations and combines both descriptive orientations of learning and normative, facilitating factors. It helps pull together the diversity in capacity research to focus not just on defining elements, but also on differing orientations to learning. This allows LEAs and their responses to systemic reform to be studied as separate entities but compared for their different styles of and processes for learning in an interactive manner.
In sum, organizational learning and organizational learning capabilities allow further evolution in understanding educational policy implementation and capacity. Systemic reform, as change in the core technology of teaching, learning and school organization, demands an extraordinary amount of learning and rethinking of common school practices. Focusing on how LEAs respond to systemic reform blurs the dichotomy that has traditionally been drawn between state policy and the local context and demands a more substantive understanding of large-scale collective learning, enabling elements, and the interaction between the two. Using organizational learning orientations and capabilities to do this, I explore the response of three LEAs to Colorado's version of systemic reform in order to offer alternative explanations for the relationship between state policy and LEA implementation.
CHAPTER 3
RESEARCH DESIGN
This study explored how three Local Education Agencies responded to Colorado's standards and assessment policies by examining the dynamic interrelationship of orientations and factors leading to organizational learning.
Central questions for this study included:
1. Using DiBella and Nevis's (1998) description of organizational learning based on Huber (1991): (a) How did LEAs learn about standards-based reform policies, and (b) what orientations did LEAs use to learn about and implement current state-mandated standards-based reform policy?
2. What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies?
3. Using learning orientations and facilitating factors, what interpretations were constructed by Local Education Agencies (LEAs) about state-mandated standards-based reform policies?
4. How do differences in learning orientations and facilitating factors explain variation of interpretation and implementation of standards-based reform policies?
In order to answer these questions, the analysis of policy implementation must move away from simple cause-and-effect positivistic models and begin to use models of social learning afforded us through the lens of organizational learning capabilities. To look at the relationship between an LEA's capacity, organizational learning capabilities, and its degree of enactment of state policy, three steps must be undertaken to understand this learning. First, the LEA's organizational learning capabilities must be discovered through its learning orientations and use of facilitating factors to describe the interplay between the policy and organizational features. Second, how an LEA enacts a policy helps us understand how the policy is interpreted through learning opportunities (Cohen and Hill, 1998) afforded by the policy and other knowledge opportunities and disseminated as learning opportunities within an LEA. Seen in this way, implementation may equate to collective learning by personnel in an LEA. Enactment also allows for comparisons between LEAs and how contextual differences in organizational learning capabilities may lead to variation in implementation. Last, the two previous steps must be analyzed to understand the dynamic relationship between how an LEA as an organization learns, what it learns, how these lessons are interpreted, and how that affects the enactment of state policy.
A general research tradition to accomplish these steps requires a comparative, qualitative case study approach to unravel how LEAs learn from, make sense of, and put in place the lessons of standards-based reform through state policy. A case study approach is most appropriate for this type of policy research for three major reasons. First, according to Olson (as quoted in Merriam, 1998), a case study: first, helps illustrate the complexities of a situation to show how numerous factors contributed to the problem in question; second, has the advantage of both hindsight and the present; third, shows the influence of personalities and their individual cognition on the issue; fourth, can show the influence of the passage of time on the issue; fifth, obtains information from a wide variety of sources; sixth, spells out differences of opinion and suggests how these differences have influenced the result; and seventh, presents information in a variety of ways and from the viewpoints of different groups (pp. 30-31). Similarly, as Merriam goes on to explain, the heuristic quality of a case study is suggested by these aspects: case studies have the ability to explain why an innovation worked or failed to work, and can evaluate, summarize and conclude, thus increasing their potential applicability (p. 31).
Second, educational policy research about the implementation of reform demands by its very nature that we understand certain relationships. These relationships include interactions between the intent of the policy, the implementing agency, and the knowledge generated by this interaction. Case study traditions allow us to better understand the role of policy in generating this knowledge. For instance, Stake (1981) suggests that the knowledge gained from case study research is different than traditional research because: first, it is more concrete and not abstract; second, it is more contextual and therefore allows us to see how context and policy interact; third, it is more developed by reader interpretation, which leads to better generalization when new data are added to the old; and fourth, it is based more on reference populations, which again helps transfer the knowledge and generalize it to the reference populations.
Third, the majority of research done with organizational learning, organizational learning capabilities, implementation research and capacity studies uses case study as the primary research methodology for particular reasons. First, case studies allow the how, why and meaning questions to be answered more easily. Second, case study is a particularly good design if the phenomenon being studied is a process, as in the case of implementation research. Case studies in implementation research help us to understand processes of events, projects and programs and to discover contextual characteristics that will shed light on an issue (Sanders, 1981 in Merriam, 1998, p. 33). Third, case study design helps elicit the boundaries of the phenomena under question and helps explain how the success of a policy or program is due to contextual features. Last, because the policy in question interacts with the local LEAs in numerous dimensions, case study research allows each of these dimensions to be fully explored for their dynamic interplay.
In this study, the implementation of standards-based reform policy has three
hypothesized dimensions that can be studied most appropriately by using case study methodology. These dimensions include LEAs as organizational learning processes,
temporal processes associated with the policy, and sustained processes as a result of implementation as learning. Each dimension adds to the rich understanding of the implementation process of state policy. This study, however, will focus primarily on the district and its learning processes as the unit of analysis and attempt to interweave the other case dimensions into a composite understanding of how districts as organizations acquired, interpreted, disseminated and utilized knowledge of standards-based reform.
Case study research uses various methodologies depending on research aims and
questions. For this study, qualitative forms of methodology are most appropriate
because as Maxwell (1996) states: qualitative research can develop explanatory
conclusions and theories, and rule out potential validity threats to these conclusions
and theories (p. 1). However, as Maxwell goes on to explain, qualitative
methodologies use a different logic from ones employed in traditional experimental
and correlational designs. Qualitative methods in policy and organizational research
focus on process theory versus variance theory. As Maxwell (1996) states:
Process theory, in contrast, deals with events and the processes that connect them; it is based on an analysis of the causal processes by which some events influence others. Process explanation, since it deals with specific events and processes, is much less amenable to statistical approaches. It lends itself to the in-depth study of one or a few cases or a small sample of individuals, and to textual forms of data that retain the contextual connections between the events. (p. 2)
In sum, a qualitative case study allows us to see how LEAs as organizations mediate the influence of state policy through their organizational learning capabilities. The remainder of this chapter details more fully how case sites were selected and who acted as informants at each site. This chapter also describes data collection and data analysis, and ends with a discussion of validity and limitations of the study.
Site Selection and Sampling
As an attempt to redefine capacity from an organizational learning capability perspective, this research study used a comparative case study of three school districts. This number allowed for cross-case comparisons and greater fidelity to the organizational learning capability model in developing descriptors for educational settings. Districts were chosen as the main unit of analysis since they exist as the main flow-through agency for most state policy, and because, as Elmore, Siskin and Carnoy (1998) suggest, it is important to examine educational policy questions from the perspective of school and district experiences, rather than from external policies that purport to influence schools alone.
Districts were selected by purposeful, theoretical, or criterion-based sampling. The intent of this study was not to develop highly generalized findings, but to begin to explore organizational learning capability in LEAs. Site selection was not meant
to be representative, but specific to the research questions. Therefore, specific criteria were needed to select sites or cases to study so that I could discover the organizational learning orientations and facilitating factors of LEAs and their relationship to implementing standards-based reform. Criteria for case selection included the following:
First, cases were selected by reputation for substantial evidence, or lack thereof, of investment, progress, and success in implementing standards-based reform. Only by studying the contrast between successful and unsuccessful implementation can the relationship between the policy and organizational learning begin to be extracted. As Knapp (1997) suggests, successful implementation and/or progress may define the LEA's receptivity to the reform ideas and ability to sustain them over time, which suggests higher capacity within these districts. This was done by asking for recommendations from field-based consultants with the Colorado Department of Education (CDE), analyzing documents from CONNECT, a statewide math and science initiative based on systemic reform principles, analyzing GOALS 2000 local improvement grant requests, and using researcher knowledge of districts' reputations. State assessment data from 1997-1999 were also analyzed as a proxy measure for districts making progress in becoming more standards-based by noting increases in percentages of students meeting proficiency levels.
Second, to facilitate the expansion of capacity and organizational learning capability theory, it seemed important to look for variance in response to the policy. This allowed fidelity to the organizational learning capability model as a way to understand differences in learning orientations and to understand variation as range in response. Differences in response also suggest greater, but different, learning and high capacity that can be a manifestation of greater attention by actors within these sites to structure greater learning opportunities (Knapp, 1997, p. 254). Variation in this sense means what a district chooses to use as its main entry point to implement standards-based reform. This includes elements such as assessment, professional development, curriculum alignment, or parent and community involvement. This was done through consulting with CDE representatives, analyzing district implementation or accreditation plans, analyzing CONNECT documents, and through using researcher experience and knowledge of LEAs in the state.
Third, two districts were selected as matched demographic pairs to provide
partial control for geographic location, size, and resources available for implementation. This again allowed fidelity to the OLC model and an understanding of capacity as a multitude of factors beyond financial or personnel issues. A third district was chosen as a cross-comparison case. This district, while in geographic proximity to the others, differed in size, available resources, and community
demographics. This provided a way to contrast typical organizational features (size, resources, demographics) and their role in organizational learning.
Last, districts were selected by reputation. That is, in order to examine implementation as OLC, cases had to be chosen that would help enhance theory for better understanding organizational learning capability. Therefore, districts with reputations for being learning organizations, or for having unique processes or methods for implementation, could help illuminate organizational learning capability in new ways. Again, a third case was chosen that did not have this reputation to provide a contrasting case. This was done through consultation with representatives of CDE and the Association of Colorado Education Evaluators, and by analyzing the number and types of presentations by districts at regional and state meetings dealing with standards-based reform.
Case Descriptions

Table 3.1 Case Summaries

District Name   Location              Students   Schools   Central Office Administrators
Midplains       North Central         2,635      4         3
River Valley    North Central         14,161     27        8
Front Range     Central Front Range   18,397     35        11
Case 1: Midplains School District. The Midplains School District educates 2,635 students and has 195 full-time employees. Midplains is located in the north central part of the state and borders the other two districts. Midplains exhibited a general orientation toward interpreting standards-based reform as new content and spent the majority of its resources on purchasing new curriculum materials. Because of its geographical proximity and lack of state reputation, Midplains was chosen as a contrasting case. At the time of this study, central office administration consisted of a superintendent, a director of staff development and curriculum, and an assistant superintendent for auxiliary services.
Case 2: River Valley School District. The River Valley School District educates 14,161 students in 27 separate buildings. River Valley is also located in the north central part of the state and enjoys a reputation as a leader in standards-based reform. River Valley had a long history of reform and exhibited an orientation toward interpreting standards-based reform as data-driven instruction and assessment to raise achievement. Because of its size and reputation, River Valley was chosen as one of the matched demographic pair of districts. At the time of this study, central office consisted of a superintendent, an assistant superintendent, the director of elementary curriculum (who was also designated as the director of accountability to deal with new state accreditation laws), director of secondary curriculum, director of
assessment, director of human resources, director of state programs and early childhood programs, and a director of business and auxiliary services.
Case 3: Front Range School District. The Front Range School District educates 18,397 students in 35 separate buildings. At the time of this study, central office administration housed a superintendent, two assistant superintendents for instruction and business services, directors of elementary (pK-5) and secondary (6-12) education, a director of human resources, a director of staff development, and numerous TOSAs (teachers on special assignment) charged with implementing special initiatives.
Front Range borders both Midplains and River Valley but is located closer to the Denver metro area than the other two. Front Range, while not possessing the overall reputation of River Valley, is well known for its efforts in literacy and science reform. Front Range exhibited an interpretation of standards-based reform based around differentiated instruction. Because of its size and reputation, Front Range was also chosen as one of the matched demographic pair of districts.
Data Collection
Data for this study were collected through three primary methods: structured interviews, district documents, and observations of meetings. In addition, early case descriptions of learning processes were developed and discussed in an
interactive process during follow-up interviews developed specifically around organizational learning capabilities. This process allowed for both validation of researcher insight and development of ideas relative to a learning profile. All previous research on organizational learning capabilities (DiBella and Nevis, 1995, 1998) has used these forms and methods of data collection to understand various learning styles for distinct industries. Data collection occurred during the spring and summer of 1999. Follow-up interviews occurred in the early fall of 1999.
Interviews
Sampling for interviews was determined by who had primary responsibility for the enactment of standards-based education in each district. Initial lists of interviewees were designated prior to site access, and a snowball method was used to identify others who needed to be interviewed. In all districts, superintendents were interviewed for their views of overall implementation, resource allocation, and the role of the external environment in building organizational learning capability.
In all, 24 people were interviewed (see Table 3.2 below) in sessions lasting approximately one hour apiece, depending on the time each informant had available. Some informants were interviewed more than once, depending upon needed follow-up or the informant's depth of understanding. All interviews were tape recorded with the permission of the informant.
Table 3.2: Informants Interviewed
Position Held By Informant                Midplains   River Valley   Front Range
Superintendent                            X           X              X
Assistant Superintendent                  NA          X              X
Staff Development/Curriculum Personnel    X           NA             X
Director of Assessment                    NA          X              X
Human Resource Personnel                  NA          X              X
Director of Elementary Education          NA          X              X
Director of Secondary Education           NA          X              X
Elementary Principal                      X           X              X
Middle School Principal                   X           X              X
High School Principal                     X           X              X
Semi-structured interviews were developed to ensure comparable data across all case sites. These interviews were primarily structured around processes of enacting standards-based education using Huber's (1991) framework of organizational learning. The interviews were also structured utilizing prior notions of organizational
learning capabilities, and prior notions regarding elements of capacity that may be related to organizational learning. Because this research was exploratory (organizational learning capabilities had not previously been used as a guiding framework in education), questions were designed to be as open-ended as possible. This allowed informants the opportunity to address whatever salient issues they deemed appropriate.
All interview protocols were designed to elicit, first, an understanding of how a district was attempting to enact standards-based reforms and the depth of this enactment; second, an understanding of each district's organizational learning orientation, or how each acquired, interpreted, disseminated, and utilized knowledge relative to standards-based education; and third, what facilitating factors helped each stage of the organizational learning process. A similar interview protocol was used for each district (see Appendix A) to ensure similar understandings across all cases and to help better define capacity from an organizational learning perspective. All interview data were transcribed as individual interviews and entered into a database using ATLAS.ti software.
Document Analysis
At all three case sites, I collected district-level documents pertaining to the implementation and enactment of standards-based education. These documents
included such things as each district's implementation plan submitted to the Colorado Department of Education, accreditation plans, policies regarding standards-based education, reports, curriculum documents, professional development information, specific grants, instructional materials, and any pertinent resource allocation memos or resource analyses relative to standards-based education. These documents served not only as information for analysis, but also helped guide further interview questions.
State-level data were also collected for each case site. Information on state-level CSAP (Colorado Student Assessment Program) data over each year of the test gave an idea of district improvement efforts and their success, and of how well district efforts aligned with state expectations.
Observations
Where possible, I observed district-led professional development opportunities and meetings dealing with the organization or enactment of standards-based reforms. Early in gaining access to each case site, I ascertained whether such observations were possible and attempted to attend as many as possible. These observations allowed me to better understand people's conceptions of standards-based reforms, and to see how individuals as part of a larger organization access opportunities to acquire,
interpret, disseminate, and utilize new knowledge. These opportunities also allowed access to issues and ideas not afforded in interview sessions, and offered different perspectives on each district's organizational learning capability that could be pursued in subsequent interviews. Observational data, therefore, afforded me the opportunity not only to gain a deeper understanding of each district's theory in action (Argyris & Schon, 1993), but also to ground and validate interview data.
In all, I attended one principals' meeting in the Midplains School District, an assessment conference and a superintendent's cabinet meeting in the River Valley School District, and two days of an instructional conference on differentiated instruction in the Front Range School District.
Data Analysis
According to McLaughlin (1987), the conceptual and instrumental challenges in implementation analysis lie in models of multi-level and multi-actor complexities (p. 177). These third-generation analyses must also use new methods to elicit cognitive patterns of action by individuals and organizations. To do this, this study analyzed all data in two ways based on the assumptions underlying the overall research questions: categorizing and connecting. Categorizing, or basing analysis on similarities in data (Maxwell & Miller, 1998, p. 1), helped determine what
organizational learning orientations and facilitating factors existed in districts. Connecting, or synthesizing data based on relationships, helped determine how learning orientations and facilitating factors influenced the overall meaning of the reform policies.
As in most qualitative research, data collection and analysis in this study occurred simultaneously. Data went through preliminary analysis using summary forms, initial coding forms, and initial categorizing or systems-model mapping. This interaction of data collection and data analysis enabled me to refine interview questions, test initial hypotheses, and gather additional information as needed. After preliminary analysis, data were transcribed, put into a case database using ATLAS.ti software, and coded according to the processes described below.
Categorization Strategies
Analysis of interview, document, and observational data using categorizing strategies included the following four coding phases. These phases attempted to answer questions specific to Huber's theoretical framework and the DiBella and Nevis model of organizational learning capabilities, and to investigate themes that were not well explained by the framework and model.
Phase One. The first phase of analysis was to code the data according to the four major categories in Huber's description of organizational learning. Using Huber's operational definitions (Tables 3.3, 3.4, 3.5, and 3.6 below), responses were read again and coded for each category within cases using ATLAS.ti software. Responses were then compared across cases to ensure common use of the codes. Responses that did not fit the categories were analyzed to determine the limits of the framework and its application to public educational organizations. This helped answer the first part of dissertation question one: Using Huber's (1991) description of organizational learning, how did LEAs learn about state-mandated standards-based reform policy? More specifically, using Huber's operational definitions, exemplars and data samples emerged that helped to anchor rules for coding and to support the validity and reliability of the codes.
Acquisition Definition: How organizations gain new knowledge through the experiences of their own personnel or indirectly through the experiences of other organizations.

Table 3.3: Coding Rules for Acquisition

Exemplars: McREL; BOCES; consultants; university classes; central office personnel; staff development personnel; teacher/principal leaders; assessment data; outside research

Data Samples:
"...our curriculum adoption includes a research piece that evaluates what we're currently doing."
"I brought the whole concept of book talk and having dialog based on research."
"We have probably the most comprehensive choice list in the state and we have the 5 models of staff development."
"Actually we called some universities and finally ended up with McREL."
"We took our administrators and some of our teachers through SBE training. Developed through our Northern Colorado BOCES."
Interpretation Definition: Both the processes used to give meaning and the specific meaning given to new ideas and knowledge by an organization.

Table 3.4: Coding Rules for Interpretation

Exemplars: Improving our district; improving quality of curriculum; student achievement; writing standards; using data to make decisions; linking data to district decisions; designing units; buying materials

Data Samples:
"...so the strategies that we used was we looked at first of all the research around instruction around content area and look at the state model standards. And we started talking about so what is the difference between the traditional instructional system and the standards-based system..."
"Our occupational education program, there were standards and benchmarks set in everything. So again we started putting money aside because what happened was that teachers knew they had to answer these questions and get to a certain point before they could come and ask for materials."
"So, we actually created proficiencies or assessments that kids have to pass at the high school level in order to graduate in language arts, math, and science."
"There was a whole framework of, this is not a fad that's coming through that we're going to jump on the bandwagon. This is not the latest staff development craze. This is something we believe in our heart of hearts is good for kids."
"If people implement standards really from the standpoint of creating an objective of what a student should know and be able to do and they don't impact the instructional strategies then we might as well not even have standards."
Dissemination Definition: How an organization transmits knowledge and meaning to its personnel.
PAGE 1

IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: A CASE STUDY OF COLORADO'S STANDARDS-BASED REFORM POLICY by James Andrew Bailey B.A., Kansas Wesleyan University, 1985 M.A., University ofNorthern Colorado, 1986 A thesis submitted to the University of Colorado at Denver in partial fulfillment of the requirements for the degree of Doctor of Philosophy Educational Leadership and Innovation 2000

PAGE 2

This thesis for the Doctor of Philosophy degree by James Andrew Bailey has been approved by Alan Davis Date

PAGE 3

Bailey, James Andrew (Ph.D., Educational Leadership and Innovation) Implementation as Organizational Learning Capability: A Case Study of Colorado's Standards-Based Reform Policy Thesis directed by Assistant Professor Nancy M. Sanders ABSTRACT Research about educational reform has found that most reforms are not fully implemented, and there is tremendous variation in interpretation and implementation across sites. A major factor explaining variation in implementation is organizational capacity to understand, interpret and carry out the reforms in specific, local contexts. Research in private sector settings indicates that organizational learning orientations and facilitating factors are key to organizational capacity for implementing change. Using a model of organizational learning, this dissertation provides three comparative case studies of school districts selected to represent implementation variation within the same state policy context. The study investigates district organizational learning orientations and facilitating factors in implementing standards-based education reform. An organizational learning framework illuminates complexity of implementation at the local level and highlights implementation as a process that is intercative with the surrounding policy environment, developmental in learning about the meaning of reform, and related to specific learning orientations and facilitating factors at the local level. Standards-based reform looked different and meant different things in each district because each interpreted state policy differently, used different learning orientations to understand and interpret the policy, and had different facilitating factors present. Each district was implementing the reform in different degrees of depth and effectiveness. 
The districts with striong interpretive mechanisms identified through the model show evidence of progressive changes in schools and classrooms regardless of their specific interpretations of state policy. The study indicates that attention to organizational learning contributes to research and practice about improving district organizational capacity. This abstract accurately represents the content of the candidate's thesis. I recommend its publication. Signed N(jVy M. Sanders iii

PAGE 4

DEDICATION I dedicate this thesis to my family whose support and persistence during this process was never ending. I especially thank my wife for supporting my efforts and frequently going beyond the call of duty to give me time to finalize the writing of this thesis. I am forever grateful. I would also like to dedicate this thesis to my parents who both gave me the love of learning and knowledge from a young age.

PAGE 5

ACKNOWLEDGMENT This analyses was partly supported by the National Science Foundation grant# REC 9905548, Nancy M. Sanders, Principal Investigator. My sincere thanks goes to the following people: district personnel who gave their time willingly and openly; David Garretson and my transcriber who worked diligently to fix my tapes; and my committe who supported these ideas and rescheduled their time to help me meet deadlines. Last, I wish to thank my advisor, Dr. Nancy Sanders, for her patience and dedication in seeing this project through even in some very trying times.

PAGE 6

CONTENTS Figures.......................................................................................... x1 Tables ........ .................. ... ... ................... ... ....... ...... ..... ...... ...... ..... xu1 CHAPTER 1. INTRODUCTION ..................... ............................. ................ ... 1 Colorado's Policy Context . . . . . . . . . . . . . . . . . . . . . . . . 5 The General Problem......................................................... 8 Background ofthe Problem................................... 12 Policy as Pedagogy .. .. . . .. . . . . . . . .. . . . .. . . .. .. .. 13 Capacity Studies . . .. . .. .. . .. . . . .. . .. .. ... . .. . . . . ... 14 Theoretical Framework ... .. .. .. . ... .. ....... ... . .. ... . .. . .. . .. . .. 17 Specific Problem and Research Questions:........................ 19 Research Questions .. .......... ..... ...... ...... ... ................ 20 Methodology for the Study ................. ............ ..... ............... 20 Structure ofthe Dissertation............................................... 23 2. REVIEW OF THE LITERATURE ............................................ 24 The Nature ofldeas in Policy ... .......................................... 26 Policy Implementation .. ... ................ ...... .. ............ ............. 28 Rational Frameworks for Examining Implementation ........ ..... ...... .. ... .. . ... ... 28 Conflict and Exchange Framework . .. . .. . . . . .. . . 3 0 Synthesis Frameworks ............................................ 34 Social Construction of Meaning Framework......... 35 Organizational Learning and Sensemaking ....................... 40 Capacity Studies................................................................ 44 vi

PAGE 7

Capacity as Organizational Resources and Influence .. .. . . . .. . .. . .. . .. . . . . . . . . . . . . . .. 46 Capacity as Economic Capital ..... ....................... C 'ty c 't' apac1 as ogru 10n .......................................... 54 57 Relation to Organizational Learning..................... 59 Organizational Learning Capability and Orientations . . . 61 Conclusion......................................................................... 66 3. RESEARCH DESIGN............ .................................................. 69 Site Selection and Sampling .............................................. 74 Case Descriptions................................................... 77 Data Collection.................................................................. 79 Interviews............................................................... 80 Document Analysis .. .. .......... ............. .. .. .. .. ........ ... 82 Observations .......................................................... 83 Data Analysis . .... ...... ..... .. . .. . . . .. .. .. .. . .. .. ..... .. .. . . . . . .. 84 Categorization Strategies ....................................... 85 Contiguity-based Relations .. .. . ...... ........ ... . ........ 1 06 Validity and Limitations of the Study............................... 107 Conclusion......................................................................... 110 4. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE OF LEARNING ORIENTATIONS .......................................... 112 Acqws1t1on . . . . . .. . . . . . . . .. .. . . . . . . . . . . . . . .. . .. .. . . 113 Knowledge Source................................................ 114 Learning Focus.......... .......................................... 119-Learner Focus ....................................................... 122 vii

PAGE 8

Use of Data........................................................... 125 Interpretation..................................................................... 127 Interpretive Orientation .. . .. . .. .. . .. . .. . .. . .. .. ... .. 128 Interpretive Mechanism .......... .............................. 130 Dissemination .... .. .. .. .. .. .. .. .. .. .. .. .. .. ... .. .. .. . .. .. .. .. .. .. .. .. .. .. 131 Dissemination Mode .............. .... ........................... 132 Knowledge Reserve.............................................. 134 Utilization . .. . .. . . . . . .. . . .. . .. .. .. . . . .. .. .. . . . .. . .. .. . . . 13 7 Learning Scope .. .... .... ..... ... .... .. .. .. .. .. .. .. .. .... .. .. .. 13 7 Value Chain.......................................................... 142 Conclusion........................................................................ 144 5. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE OF FACILITATING FACTORS............................................. 146 Acquisition ... . ... . . . . .. . . . .. . .. .. .. ... . . . .. .. . . . . .. . . .. .. 14 7 Scanning . . . . .. .. .. . . .. . . . .. . . . . . . . . . .. . .. . . .. .. 14 7 Performance Gap .. .. .. . . .. ... .. .. . . . .. .. .. .. .. .. .. .. .. . . 149 Policy.................................................................... 151 Concern for Measurement.................................... 155 Organizational Curiosity .. .. .. .. .... . .. .... . .. . .... . .... 157 Interpretation .. . .. . . . . ... .. .. .. .. . .. . . . .. .. .. .. .. .. . .. ... ... .. 160 Involved Leadership ..... ..... .......... ..... ....... ............. 160 Leadership Cognition .. . ... ... .. .. .. .. ... .. . .... .. .. . ...... 162 Focus..................................................................... 164 Systems Perspective .. .. ... . . .. .... . .. . .. .. .. . . ... ... . 165 Dissemination . . .. . . . . . . . . .. .. .. .. . . .. .. .. .. . 
. . .. .. . . .. . . . .. 169 Climate of Openness .... . .. . . .. . .. . . . .. . . .. .. . . .... 170 viii

PAGE 9

Continuous Education . .. .. .. .. .. .. . . . . . . .. .. .. .. . .. 173 Utilization.. ....................................................................... 176 Multiple Advocates .... .. .. .. .. .. .. .. .. ...... .. .... .. .. .. .. .... .. 177 Accountability .............................................. ......... 178 Resources .... .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. 182 Conclusion......................................................................... 188 6. IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING VARIATION AS LOCAL MEANING................................................................... 190 Constructing Local Interpretations of Standards-Based Reform Policy ....................................... 191 Case 1 : Mid plains .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. 191 Case 2: River Valley............................................ 196 Case 3: Front Range .. .. .. .. .. .. .. .. . . .. .. .. .. .. .. .. .. .. .. .. .. 201 Variation in Meaning of Standards-Based Reform........... 208 Meaning of Standards-Based Reform in Mid plains .. ...... .. .. .. .................. .... .. .. .... 208 Meaning of Standards-Based Reform in River Valley ............................................ 211 Meaning of Standards-Based Reform in Front Range ............................................ 216 Conclusion ....... ..... ................... .............. ..................... ......... 222 7. ORGANIZATIONAL LEARNING CAPABILITY IN STANDARDS-BASED REFORM: CONCLUSIONS AND IMPLICATIONS FOR POLICY ......................................... 226 Implications for Interaction of Organizational Learning Capability..................................... 228 Interaction ofLearning Orientations ....................... 230 Differences in Facilitating Factors .......................... 235 ix

PAGE 10

APPENDIX Implications for Systemic Reform Policy in Engaging Organizational Learning Capability.................. 243 Influence of Policy on the Three Cases . . .. ... . .. .. 243 Challenges for Policy as Pedagogy . . . . . . . . . . . 24 7 Implications for the Organizational Learning Capability Model on Practice ...... .. . ... . . ... ... .. . . ... .. .... 249 The Model's Utility in Understanding LEA's as Learning Organizations.................................... 250 The Model's Utility for Enhancing Learning Capability.............................................. 253 Conclusion........................................................................ 256 A. Interview Protocol . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260 BIBLIOGRAPHY ................................................... .................... ...... ..... 264 X

PAGE 11

FIGURES Figure 4.1 Knowledge Source Rubric .. .. ... . . . . .. .. ... . .. . .. . ...... 114 4.2 Learning Focus Rubric ............................................... 119 4.3 Leamer Focus Rubric................................................. 122 4.4 Use of Data Rubric.................................................... 125 4.5 Interpretive Orientation Rubric................................. 128 4.6 Interpretive Mechanism Rubric................................. 130 4.7 Dissemination Mode Rubric...................................... 132 4.8 Knowledge Reserve Rubric....................................... 134 4.9 Learning Scope Rubric.............................................. 137 4.10 Value Chain Rubric................................................... 142 5.1 Degree of Use of Scanning . . . . . . . . . . . . . . . . . . . . 14 7 5.2 Degree ofUse of Performance Gap........................... 149 5.3 Degree ofUse of Policy............................................. 151 5.4 Degree of Concern/Use ofMeasurement ................... 155 5.5 Degree of Organizational Curiosity........................... 158 5.6 Degree oflnvolved Leadership.................................. 160 5.7 Degree of Leadership Cognition................................ 163 5.8 Degree of Focus......................................................... 164 5.9 Degree of Systemic Perspective/Alignment.............. 165 5.10 Degree of Climate of Openness . . .. . . . .. .. .... ... .. .. .. 170 5.11 Degree of Continuous Education............................... 174 5.12 Degree of Multiple Advocates ................................... 177 xi

PAGE 12

5.13 Degree of Accountability ........................................... 179 5.14 Degree of Aligned Resources .................................... 182 7.1 The Interaction of Midplains' Learning Orientations 232 7.2 The Interaction of River Valley's Learning Orientations ................................................ 233 7.3 The Interaction of Front Range's L 0. eanung nentatlons ................................................ 234 7.4 The Interaction of Facilitating Factors and Learning Orientations in Midplains ........................... 237 7.5 The Interaction of Facilitating Factors and Learning Orientations in River Valley ....................... 240 7.6 The Interaction of Facilitating Factors and Learning Orientations in Front Range ...................... 242 xii

PAGE 13

TABLES Table 2.1 Organizational Learning Capability Model ... . .. . . .. .... .. 63 3.1 Case Summaries............................................................. 77 3 .2 Informants Interviewed .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. 81 3.3 Coding Rules for Acquisition........................................... 86 3.4 Coding Rules for Interpretation .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. 87 3.5 Coding Rules for Dissemination .. ............................ ....... 88 3.6 Coding Rules for Utilization............................................ 88 3. 7 Coding Rules for Knowledge Source .. .... .. .. .. .. .. .. .. .. .. .. 90 3. 8 Coding Rules for Learning Focus .. .. .. ...... ...... .. .. .. .. .. .. .. 91 3.9 Coding Rules for Leamer Focus..................................... 91 3 .1 0 Coding Rules for Dissemination Mode .. .. .. .. .. .. .. .. .. .. .. .. 92 3.11 Coding Rules for Knowledge Reserve .. .. .... .. .... .. .. .. .... ... 93 3.12 Coding Rules for Learning Scope .. .. . .. .. .. .. .. .. .. .. .. .. .. 93 3.13 Coding Rules for Value Chain Focus .. .. .. .. .. .. .. ...... .. .. .. .. 94 3.14 Coding Rules for Facilitating Factors Under Acquisition 95 3.15 Coding Rules for Facilitating Factors Under Dissemination 97 3.16 Coding Rules for Facilitating Factors Under Utilization 98 3.17 Coding Rules for Facilitating Factors Not Specific to a Learning Process .. .. .. .. . .. . . .. .. . . . .. .... ... .. .. .... . .. .. .. .. .. 99 3.18 Coding Rules for Use of Data .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. .. 1 00 3.19 Coding Rules for New Facilitating Factors Under Acquisition .. .. .. . .. .. . . .. . . . .. .. .. . . .. .. . . .. . . .. .. . . .. .. .. 1 01 3.20 Coding Rules for Interpretive Mechanism .. ... .. .... ... .. .. 101 xiii

PAGE 14

3.21 Coding Rules for Interpretive Orientation .... .. .. .. .... .. .. ... 102 3.22 Coding Rules for Emergent Facilitating Factors Under Interpretation .. .. .. .. . . . . . . . . . . . . . .. .. .. .. . .. . . . . . . . 1 03 3.23 Coding Rules for Emergent Facilitating Factors Under Utilization.................................................................... 104 3.24 Organizational Learning Capability Model for Local Education Agencies .. .. .. .. .. .... .... .. .... .. .. .. .. .. .... . .. .. .. ... 106 3.25 Interrater Reliability Percentages................................. 109 6.1 Summary of Mid plains' Learning Orientations and Facilitating Factors...................................................... 191 6.2 Summary of River Valley's Learning Orientations and Facilitating Factors...................................................... 196 6.3 Summary of Front Range's Learning Orientations and Facilitating Factors ...................................................... 201 7.1 Differences in Learning Orientations and Facilitating Factors ...................................................... 228 xiv

PAGE 15

CHAPTER 1 INTRODUCTION Until recently, the majority of educational policy in the United States was left to local decision making processes. While most states have constitutional authority over public schooling, most of this authority was delegated to local education agencies (LEA). This delegation of authority was especially true in matters of curriculum and instruction (Spillane, 1993) with the assumption that local educators were in a better position to decide what and how to teach students. Since the early 1970's with the rise of both federal and state aid to LEAs, however, a proliferation of educational policy dealing with curriculum and instructional issues has been passed in an attempt to increase student achievement. This is especially apparent since 1983 when a Nation At Risk was published. This seminal work on the dangers of a failing public educational system led many states to emphasize policies that dealt with curriculum and instructional matters. The 1980's, from a state policy perspective, saw great emphasis put on increasing graduation requirements, improving teacher quality and basic skills testing (Goertz, Floden & O'Day, 1995). These policies, however, did little to change the content or instructional methods of coursework and were contradictory in that they called for

PAGE 16

increased content and thinking but often mandated minimal, basic skills testing. These contradictory notions led to little professional learning and the inability to use this type of reform policy (Goertz et al.) for enhanced achievement. More recently, a more systematic and consistent approach to educational policy has emerged. The concept of systemic reform as articulated by Smith & O'Day ( 1991) assumes that if all parts of the system align around specific outcomes for student learning, student achievement on these outcomes will increase. The parts of the system that have to be aligned or organized to achieve ambitious student outcomes or standards for learning include: assessment systems that focus on measuring these ambitious standards, professional learning focused on the standards and assessments with new instructional methods, teacher certification that models the type of learning promoted, and restructured governance systems to promote increased student achievement. Similar to policies in the 1980's, systemic reform attempts to change the central practices or core technology of schooling, teaching and learning. Systemic reform,however, attempts to change the entire policy system in a more coherent fashion based on goals for student results and new conceptions of learning. By 1998, the power and logic of this idea led to 48 states passing their own versions of systemic reform policy (Massell, 1998). 2


Recent research on systemic reform policy in numerous states, however, suggests that the alignment of educational policy is not as rational as set forth in theory. For instance, Jennings and Spillane (1996), in their study of systemic reform policy in South Carolina, found that unequal forms of capacity led to different interpretations of policy. At the state level, Massell (1998) also found that states have interpreted the intent of systemic reform quite differently and have attempted various capacity-building strategies, with varying degrees of success, to help local education agencies (LEAs) enact these policies. Last, Quality Counts (1999), a yearly analysis of standards-based reform in all 50 states, reported that 48 of the 50 states have passed legislation dealing with systemic reform. However, state accountability systems vary widely, state standards and assessments vary greatly in form and substance, and only 14 states provide any form of incentive to improve performance. Similarly, Quality Counts reports that due to this variability in interpreting systemic reform around accountability, only two states come close to having all the components of a complete accountability system. According to Quality Counts, the principles for a statewide accountability system include having standards, balanced assessments that measure these standards, rewards and consequences for performance, and data and information from performance measures that schools and publics can understand. In conjunction, a real accountability system includes report cards that summarize the
performance of individual schools; public rating systems of schools that target low performers; targeted assistance for school improvement; rewards based on performance; and authority to close or take over schools that do not improve (p. 10). The two states that have these elements, North Carolina and Texas, posted the largest average gains on the National Assessment of Educational Progress from 1990 to 1996. This recent policy trend raises many questions about the role of states, LEAs and the relationship between state policy and local practice (Spillane, 1994). Is the line of implementation between state policy and LEAs any different from earlier policy attempts that showed local implementation to be problematic? Because this new form of state policy attempts to change curriculum, instruction and professional development, how have districts interpreted these policies? Similarly, how have LEAs gone about making sense of a multitude of these policies in an internally coherent fashion? Last, what role does the LEA play in systemic reform in light of the tremendous amount of change and learning required by new systemic reform policies? I take up these questions in the following dissertation through comparative case studies of how three Colorado school districts responded to state systemic reform policy.


Colorado's Policy Context

In the 1990s, Colorado, like most other states, became much more active in curriculum and instructional policy aimed at increasing student achievement. However, because of the state's long history of local control, state initiatives were always designed to let LEAs determine how they would implement policy while pushing for more ambitious pedagogy by LEAs. Colorado's version of systemic reform began in 1993 with the passage of HB 93-1313, which called for the state to develop model content standards as achievement goals for all students. LEAs were required to adopt their own version of standards that either met or exceeded the state version. Although local standards were to meet or exceed state models, the Colorado Department of Education had no mechanism to evaluate locally adopted standards. This legislation also called for both the state and LEAs to measure achievement of these standards by aligning tests to adopted standards. The legislation required districts to provide professional development and ensure curriculum aligned with standards. Additional legislation changed the ways in which teachers were certified, moving to a licensure system with testing requirements for teachers and administrators entering the field. Since that date, many other policies have been legislated. For instance, in 1996 HB 96-1139, the Colorado Basic Literacy Act, was passed. This legislation does not
allow third-grade students to advance beyond third grade if they do not have the literacy skills to succeed at higher levels. Most recently, HB 98-1267 established new requirements for LEA and school accreditation. This legislation requires LEAs to contract with the state to focus on outputs or indicators of how well students are achieving adopted content standards, community satisfaction, and a host of other indicators that focus on the results of the local educational system. Through a contractual agreement, this legislation strengthens local control over how to use standards and assessment at a local level, but holds LEAs accountable for raising achievement on state and local tests. Last, within all of these policies has been the Colorado State Assessment Program (CSAP), which was mandated in HB 93-1313 to measure student achievement of state standards across all schools in Colorado. Content areas and grades were initially designated in separate legislation in 1995. Due to resource constraints, the initial testing occurred in the spring of 1997 with only fourth-grade reading and writing being assessed. The spring of 1998 continued fourth-grade assessment but added third-grade reading for the literacy act. HB 98-1267 changed the testing schedule for CSAP, which will continue to add content areas and grades through 2001, including 10th-grade assessment of reading/writing and math to determine eligibility for graduation.


With the passage of SB 00-186 in the spring of 2000, grades 3-10 will take the CSAP annually. These scores will be used for school report cards. Colorado's policy context also includes substantial charter school legislation aimed at giving parents choice, and inadequate funding resources. Attempts to use surplus revenue to pay for construction costs were soundly defeated in the 1998 election. Similarly, Quality Counts (1999) gave the state of Colorado a grade of F for its adequacy of funding because it was ranked 49th of 50 states in education spending per pupil, and the state has failed to keep funding levels equalized with inflation. Colorado spent $4,941 per student in 1998 while New Jersey, rated as first, spent $8,436 per student. Similarly, Colorado showed a -11% change in inflation-adjusted education spending per student from 1987-1998 while New Jersey showed a +23% change in the same time period (Quality Counts, p. 120). With the proliferation of state instructional policy, statewide testing and inadequate funding in Colorado, districts have implemented standards-based reform and system alignment differently. Because many of these policies are dramatic in their intent and consequences for students but not directive in how to implement them, the role of the LEA has increased as a conduit for information and interpretation of these new instructional policies. The ambiguity in many of these policies requires local districts to interpret what they mean, determine how they fit
into local values, and coordinate the state policy efforts with local improvement and policymaking. In essence, the implementation of these policies is, as Yanow (1996) describes, not an instrumental or rational process, but a part of the policy process that "focuses on meaning of these policies and the processes by which those meanings are communicated to and read by various audiences" (p. 9). Analyzing implementation from an interpretive lens suggests multiple ways to interpret policy depending upon the local context and local capacity. Recent studies by the Colorado Department of Education (1997) suggest a large range of progress among Colorado districts around developing local assessments, changing instruction, and providing the necessary learning opportunities to become standards-based. Similarly, resource issues such as time, staff development and funds point to the unequal progress between city, suburban, outlying-city and rural districts. These differences in how policies are interpreted, together with the disparity of resources, result in the general problem of variation in implementation.

The General Problem

Variation in the implementation of educational policy has been an issue since the RAND studies of the 1970s (McLaughlin, 1987). Recent research about standards-based reform policy has also found variation in the implementation and enactment of
standards policies (Massell, Kirst & Hoppe, 1997). Jennings and Spillane (1996) suggest that variation is due to local needs and contexts. This is especially true in specific curricular areas like math, led by NCTM curriculum and instructional standards, which call for greater degrees of mathematical power for all students, but which run counter to highly ingrained institutional patterns. Implementation problems, as McLaughlin (1987) suggests, are never solved but evolve through multi-stage, iterative processes. The failure of past implementation models to explain local responses to systemic reform demands that we look at new models and ways to analyze implementation as an interactive process rather than a linear one. This is especially important in the current systemic reform initiatives because they define the policy context through instructional policy which is complex, extensive and reaches the core practices of teaching and learning that are more open to varied interpretations (cf. Spillane, 1993, 1994). To fully realize the potential for this complex and systemic type of policy at a local level, implementation relies on local capacity (Jennings & Spillane, 1996; Corcoran & Goertz, 1995). Capacity in systemic reform studies has been identified in various ways as organizational resources, economic capital, cognitive knowledge and beliefs, but no common definition exists. Similarly, while many components have
been identified to enable various forms of capacity, no connection to how these enable LEAs to interpret or learn from state policy has been defined. Knapp (1997) suggests that by only looking at systemic reform through past implementation knowledge we are bound to misunderstand the dynamics of this type of policy. Past implementation knowledge "does not take us deeply into the meaning of capacity as a construct, where it comes from or how policy itself contributes to that development process" (p. 251). Knapp also suggests that "too easily contextual conditions in the variety of case examples studied so far are treated as static givens, rather than dynamic, developmental features of an environment that is itself evolving" (p. 252). To enhance the capacity of a system to meet more demanding requirements, four methods are commonly utilized: 1) improving the performance of workers through professional training; 2) adding additional resources in the forms of money, materials, personnel or technology; 3) restructuring the way work is organized; or 4) restructuring how services are delivered (Goertz et al., p. 110). Capacity in common terms has mainly dealt with improving teacher learning and professional development relative to standards-based reform policy. The problem of capacity, however, does not rest merely on improving teaching skills and knowledge. This so-called training model without the other capacity-increasing elements is
incompatible with current systemic reform. Training alone does not allow teachers, schools or districts opportunities to "learn, experiment, consult and evaluate their practice" (Little, 1993) relative to higher cognitive demands placed on students and teachers. Organizational learning may be an important way to construct an understanding of policy implementation related to interpretation, learning capability and capacity. Between the intent of systemic reform and classroom practice exist district and school contexts which mediate the influence and meaning of standards-based reform on teaching and learning. Goertz et al. (1995) found that individual capacity interacts and is interdependent with organizational capacity in many respects. Similarly, Knapp (1996) suggests that the unfolding patterns of systemic reform can be seen from the perspective of organizational learning. This perspective, seen as an organization's cognitive capability to acquire, interpret, disseminate and utilize new knowledge (Huber, 1991), seems to be the gap in understanding the implementation of this type of policy. How local education agencies (LEAs) learn about and from systemic reform policy suggests a new approach to understanding factors that enable organizational learning, policy implementation and their relationship. Therefore, this study will investigate how LEAs learn about and interpret state-mandated standards-based reform policy from an organizational learning perspective. This study will
also investigate what factors and orientations in LEAs contribute to organizational learning about this type of policy.

Background of the Problem

Systemic reform policies that focus more on the core technology of schools are open to more varied interpretations because of the demands for different learning outcomes, equity and new institutional patterns. Compared to traditional educational policies, which tended to focus more on the structure and organization of schools, systemic reform policies require learning in all parts of the LEA. Spillane (1993) suggests that because these types of policies may increase the activity of LEAs, new questions arise about the relationship between state policy and local practice, and this demands new ways to explore this relationship. A promising approach is to examine how systemic reform policy and organizational capability exist in a reciprocal relationship based on learning. To understand this relationship, the ambitious changes in instructional policy can be viewed as constructivist forms of learning (Cohen & Barnes, 1993) required by teachers, principals and central office personnel. This theoretical perspective argues that if the idea of a policy is potentially worthwhile to implementors, its success will depend on the degree and
quality of reconstructing beliefs, knowledge and action in individuals and organizations, which is in effect learning through actual practice (Fullan, 1992, p. 66). Cohen and Barnes (1993), Fuhrman and others (1988), and Yanow (1991, 1996) have also alluded to the use of "learning theory" as the next orientation for policy implementation research. Other authors (Elmore & Fuhrman, 1994; Goertz, Floden & O'Day, 1995) have agreed, suggesting that the ambitious instructional reforms of state policy are vulnerable to the capacities of individuals and organizations to learn from policy. Support for this perspective also comes from the latest implementation studies, including research on the relationships among policy, practice and capacity.

Policy as Pedagogy

Similar to the latest theoretical perspective for studying implementation (Cohen & Barnes, 1993; Fuhrman and others, 1988; Yanow, 1991, 1996), educational policy and practice studies from the Center for Teacher Learning at Michigan State University and the Consortium for Policy Research in Education have used the constructivist learning metaphor to explore the relationships between policy and practice. This line of research uses cognitive psychology as its theoretical framework and investigates the educative nature of policies to see how local
practitioners, schools and districts construct an understanding of specific policies. Field studies using this framework also suggest a tenuous relationship between policy and individual practice, mediated by prior beliefs, experience and the school organization (Wiemers, Wilson, Peterson & Ball, 1990). Similarly, Ball, Cohen, Peterson and Wilson (1994) examined how the differences in local teachers, schools and school districts influenced the response to state policy and mediated the original intent of the policy. Their findings suggest that sources of variance in local instructional guidance are substantial. These variations placed limitations on the ability to capitalize on reform ideas and the ability to increase professional knowledge.

Capacity Studies

Other research in this field looked in more depth at the interaction and capacity needs among the intent of instructional policy, the implementing organizational unit and individual teachers to ascertain how learning from policy can occur. Spillane (1993), for instance, looked at how different districts in Michigan responded to state reading policy. He found that state policy and its meaning are affected by the notions central administrators hold about curriculum and instruction. If central administrators hold traditional pedagogical views, they limit the resources and opportunities for
teachers to learn from these policies. Jennings and Spillane (1996), in their study of South Carolina policy for at-risk learners, found that unequal capacity at the local school site affects learning opportunities for practitioners. Their study raises questions about the effects of policy when it requires local school systems with unequal capacity to implement the same ambitious reform. Similarly, Spillane and Thompson (1997) argued that capacity has to be rethought in light of the amount of learning required by all educators in relation to current instructional reform. Focusing on local education agencies (LEAs), they found that where ambitious instructional reform was taking place, three interrelated dimensions commonly used by economists were evident: human capital, social capital and financial resources. These dimensions of capacity helped LEA leaders to help others learn. The Consortium for Policy Research in Education (CPRE, 1995, 1996), in their studies of systemic reform from a learning perspective, found similar interrelationships between individual capacity and organizational capacity. These studies identified elements of teacher capacity necessary for instructional change and how they depended highly upon organizational capacity to learn from and make sense of more ambitious instructional policy. Organizational capacity, defined as vision, collective commitment, knowledge or access to knowledge, resources and
organizational structures conducive to learning, was found to interact with individual capacity in a number of ways. High-capacity schools had these organizational elements in place, which allowed individual learning by teachers to occur and allowed teachers to pursue reform ideas together. As these studies of standards-based reform policy suggest, the local school and district organization have a strong influence on teachers' ability to learn from and implement policy in their classrooms. However, none capture the interrelationships of all of these dimensions of capacity in a complex and systemic manner suggested by organizational sociologists such as Scott (1992) or Weick (1995). In order to understand the interrelationship between policy and individual practitioners, it is also necessary to examine how organizations mediate this learning and make sense of their external environments. This perspective on capacity, seen as the organizational capability to learn, seems to be the gap in understanding the implementation of this type of policy. As Knapp (1996) suggests: In these terms, the patterns of implementation and effect are largely a story of incomplete professional and organizational learning occurring as the learners (teachers, administrators, curriculum coordinators, staff developers) encounter the often limited teaching (by those who make or promulgate reform policies, as well as by policy itself) and learning opportunities (created by, or around, reform policies). The longer term prospects for the policy's success depend in large measure on the quality of the reform policy's "pedagogy" and learning resources over time. This then suggests that capacity as organizational learning capability needs to be seen as an important issue in the implementation of educational policy.


Theoretical Framework

This study posits that systemic reform policy and organizational learning exist in a reciprocal relationship. This exploratory assertion suggests that this study will use organizational learning, nested within cognitive learning and interpretation, as the theoretical framework. To understand this relationship, the ambitious changes in curriculum and instructional policy will be viewed as constructivist forms of learning required by a local district or school organization (Cohen & Barnes, 1993). Learning by a local school organization, viewed through the conceptual framework of organizational learning, will be defined as the capacity or processes within an organization to maintain or improve performance based on experience (DiBella & Nevis, 1998). To say that learning has occurred means that new knowledge has come into an organizational system, has been interpreted, has been disseminated or transferred, and is used (DiBella & Nevis, p. 28). In this sense, implementation becomes a change in both cognitive structures and behaviors of individuals and the organizations in which they work. More specifically, this study will use Huber's (1991) conceptual framework that organizational learning is the cognitive processes an organization uses to acquire, interpret, disseminate and utilize knowledge. To say that learning has occurred means that all four of these stages occur. This framework also differentiates between
individual and organizational learning in that learning is said to be organizational when: new skills, attitudes, values and behaviors are created or acquired over time; what is learned becomes the property of a collective unit; and what is learned remains within the organization even if individuals leave. Huber's framework was used by DiBella and Nevis (1998) to define organizational learning capability. The Organizational Learning Capabilities framework depicts organizational learning in the private sector through seven orientations to organizational learning and ten facilitating factors necessary for organizational learning to occur. Although this research was done in business settings, the framework can be translated into educational organizations. Together, these seventeen elements provide a way to profile an organization's learning capability. Learning orientations describe how learning occurs and what is learned based on an organization's culture and core competence (DiBella & Nevis, 1998, p. 24). These orientations exist on a bipolar continuum and help distinguish stylistic variations in organizational learning. Facilitating factors specify elements that promote learning and are based on best practices and common processes, and may be what is currently being defined as capacity. These factors enable learning to occur, but do not determine an organization's orientation to learning.


Research using this framework began in 1992 to determine how and why organizations learn and was supported by the Center for Organizational Learning at the Massachusetts Institute of Technology. Initial research used field-based case study methodology to describe organizational learning and grounded theory to draw emergent constructs about how and why organizations learn. Businesses studied included Centegra Health Systems, Fiat Auto, Motorola, AT&T, British Petroleum, Exxon Chemical and others. Constructs derived in these contexts were then tested using written case notes and validated through research in other companies. Further research has been conducted to develop an Organizational Learning Inventory for specific industries and to ascertain how enhancing learning orientations and facilitating factors affects various change initiatives.

Specific Problem and Research Questions

Variation in implementation of educational policy has been a common theme and problem since the earliest implementation studies. Success in implementation of various policies has been credited to high capacity and opportunities to learn, but no single definition or model has been proposed, and no relationships have been drawn between capacity and its role in organizational learning for school districts. No studies to date have explored the relationship between state systemic reform policies
and local districts relative to organizational learning capability. Therefore, this study will analyze variation as a problem of differences in organizational learning capability for school districts.

Research Questions

Districts implementing Colorado's version of systemic reform provide fertile ground to begin exploring, defining and understanding the dynamic interrelationship of capacity as organizational learning. Questions central to this study include:

1. Using DiBella and Nevis' (1998) description of organizational learning based on Huber (1991), a) how did LEAs learn about standards-based reform policies; and b) what orientations did LEAs use to learn about and implement current state-mandated standards-based reform policy?
2. What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies?
3. Using learning orientations and facilitating factors, what interpretations were constructed by local education agencies (LEAs) about state-mandated standards-based reform policies?
4. How do differences in learning orientations and facilitating factors explain variation in interpretation and implementation of standards-based reform policies?

Methodology for the Study

According to McLaughlin (1987), "policy effects are complex, sometimes hidden or invisible, often unanticipated or nominalistic" (p. 175). Because of this, McLaughlin
suggests that the analysis of policy implementation move away from positivistic models to models of social learning. Therefore, the methodology for this study in general will be a comparative qualitative case study. Case study methodology is appropriate for this study because it provides an intensive description and analysis of single units or bounded systems (Merriam, 1998, p. 19) such as school districts. DiBella and Nevis (1998) suggest that qualitative case studies are the most appropriate methods and data to collect because of the complexities and subtleties of capturing learning processes in organizations. The school district is the organizational unit of analysis for two reasons. First, district response to state policy and implementation research has been identified as a critical but neglected area of study. Second, the literature poses many questions concerning district response, capacity, and the interrelationship of organizational learning and implementation at this level. Capacity is especially important in states like Colorado that have a decentralized educational governance system and state mandates for ambitious, systemic reform that is highly susceptible to capacity issues (Elmore & Fuhrman, 1994). Therefore, this study will use Colorado's state systemic reform policies as a context to study district capacity as organizational learning capability and to determine what orientations or factors, as perceived by
district personnel in each of Huber's four areas, contribute to LEAs interpreting and learning from standards-based reform policy. Three districts (LEAs) in Colorado were chosen as sites to study because of their contrasting sizes and reputations for implementing standards-based education reform. Two were nominated as representing a relatively high level of implementation, as confirmed through documents, training provided for teachers and administrators, and serving as resources for other districts. One was nominated as a contrasting case, representing a relatively low level of implementation and confirmed using the same criteria. Two primary forms of data analysis are used for this study. Categorizing (Maxwell, 1996) is used to determine the types of learning orientations and facilitating factors in relation to DiBella and Nevis' (1998) organizational learning capabilities framework. Contiguity-based relations is a second analysis tool; this method uses connection strategies to link codes to show relationships. Analysis of data will use four stages. The first stage of analysis will code the data according to the four major categories in Huber's description of organizational learning and existing constructs in the DiBella and Nevis model. The second stage of analysis will code the data about attribution by respondents to those facilitating factors which enabled organizational learning to occur. In the third stage of analysis,
responses that are not adequately accounted for in the framework and model will be analyzed to address the framework's applicability and appropriateness for understanding school district organizational learning and policy implementation. Last, learning orientations and facilitating factors will be analyzed together to understand how they interact.

Structure of the Dissertation

This thesis explores how three districts used varying organizational learning capabilities to learn about and from state systemic reform policy. Chapter Two provides a review of the literature about standards-based systemic reform, research on capacity to implement complex educational reform, and the theoretical framework. Chapter Three describes the methods, sample and data analysis. Chapter Four describes the variation in learning orientations among the three school districts. Chapter Five describes variation in the strength and use of facilitating factors among the three school districts. Chapter Six explores the interaction of learning orientations and facilitating factors and how this interaction contributed to different meanings for standards-based reform in each district. Chapter Seven explores the implications for understanding implementation as organizational learning capability through a preliminary model of how organizational learning orientations and facilitating factors appear to influence organizational learning about systemic reform. Chapter Seven will also consider implications for state policy to enhance organizational learning.


CHAPTER 2
REVIEW OF THE LITERATURE

Fullan (1992) asked, "What types of things would have changed if an innovation or reform were to become fully implemented?" (p. 66). The dynamic complexity of multilevel change in the educational system does not allow for a simple answer to his question. Does the change have to occur at the policy, school district, school organization, department, grade, classroom or individual level? Or do these changes have to occur at multiple levels in a mutually supportive fashion, as McLaughlin (1987) suggests? And do the changes have to look exactly alike to count as implementation? The literature to date has formulated many recommendations, and, according to some (Odden, 1991; Fuhrman, Clune & Elmore, 1988), implementation has occurred, but often the impact of policy and its outcomes is still questioned. The current agenda of implementation researchers is to find the causes, influences, variables or themes and develop models that enable optimal implementation, focusing on the type of policy, level of analysis and the relative complexity of implementing the policy (Matland, 1995). More specifically, implementation research has begun to utilize different conceptual perspectives to
understand the policy and practice relationship, and to look at how capacity enables that relationship. While we now know more about variables that affect the implementation of policy (O'Toole [1986], for instance, found over three hundred separate variables in reviewing one hundred studies on implementation), there still seems to be a loose relation between policy and practice in educational settings (Cohen & Spillane, 1992). Why does this problem still exist even after almost a quarter century of implementation research? Initial research showed a lack of capacity and will at the local implementation sites which influenced policy outcomes (Odden, 1991, p. 1). Later research on regulatory measures built into policy to mandate outcomes also showed weak links in policy implementation efforts (Odden, 1991, p. 1). From this early research, general schools of thought focusing on different levels of analysis (top-down and bottom-up) emerged that have helped shape our understandings of implementation. Other schools of thought that try to rectify the inherent conflict between these two schools and their level of analysis have also emerged (see Nakamura & Smallwood, 1980, for instance). However, the implementation literature also poses many unresolved issues and offers conflicting opinions. This chapter will suggest that to advance our understanding of implementation and the relationship between state policy and local practice, another conceptual framework

PAGE 40

has to be used. Second, this chapter will examine this new conceptual framework from the capacity perspective and examine how organizational learning and capacity may provide a new way to examine the implementation of educational policy.

The Nature of Ideas in Policy

The nature of implementation research, as in all domains, has undergone ideological changes as understandings have emerged. Beginning with the top-down versus bottom-up debate, implementation study has been supported by research into policy formation and resulting practice (McDonnell, 1991). Analysts have asked what the original intent of the policy was, how the outcomes at local sites differ, and what factors explain the differences between intent and outcome. Kingdon's (1984) concept of ideas supporting policy and implementation proposes two dimensions for understanding policy and implementation. First, goals, or those agreed-upon conceptions of what is desirable for individuals, groups, or the society as a whole, exist as solutions to policy problems. These goals, such as improving student achievement, improving teacher performance, improving standards, and numerous others, are in essence the content of implementation, or the lessons to be learned. The second concept includes theories about how the world works. This
concept contains those cause-and-effect beliefs about social phenomena, including policy instruments, people's behavior, regulations, and others. These beliefs lead to the processes of implementation, or how the content of policy is put into place and learned. Put another way, it is the pedagogy of policy. While both concepts are crucial in formulating educational policy, the second part of Kingdon's concept connotes more importance in relation to implementation. The goals of educational policy will continually change, but the process of implementation as learning is critical in that it is the means of accomplishing the ends of both policy goals and continual organizational development (Fullan, 1992, p. 66). If the idea of the goal is potentially good, its success will depend on the degree and quality of change in individuals and organizations that is learned through actual practice (Fullan, 1992, p. 66). If we begin to see implementation from a learning or socially constructed meaning framework, we can begin to see how systemic reform policy and organizational capacity exist in a reciprocal relationship. To understand this relationship, changes in instructional policy can be viewed as constructivist forms of learning (Cohen & Barnes, 1993) required of teachers, principals, and central office personnel to implement the intentions. A constructivist framework theorizes that learning in this sense means all members and parts of the educational system need to
construct meaning about the policy in order to act and for implementation to occur. The system also needs to unlearn (Hedberg, 1981) past patterns and behaviors that are being replaced by new ones. Although a gap exists in the empirical study of organizational learning, support for this rationale evolves from research in four fields: policy implementation (policy as pedagogy), organizational learning as sensemaking, capacity studies, and organizational learning orientations. Each of these fields has moved, or has begun to move, toward using constructivist forms of learning to understand organizational phenomena. They may be combined to fill the gap in our understanding of responses to systemic reform policy by local educational agencies (LEAs).

Policy Implementation

Rational Frameworks for Examining Implementation

Early implementation research focused on a rational framework that posited that goals can be set at a central location and, through various controls, these goals can be achieved (Scott, 1992). Rational frameworks also focus on the structures of organizational settings and their role in purposeful activity centered on goal achievement. In implementation research, this framework has led to the top-down
school of thought that focuses on policy makers as central actors. This framework also focuses on factors these central actors control that can affect fidelity to the goals of the policy (Pressman and Wildavsky, 1973; Van Meter and Van Horn, 1976; Mazmanian and Sabatier, 1981). This conceptual framework for implementation primarily focused on reducing uncertainty through monitoring, sanctioned interpretations (Ford and Ogilvie, 1996), structures, and the ability to structure the implementation (Mazmanian and Sabatier, 1989). As one way to study implementation, the top-down rational approach faces many problems. As a linear approach based on achieving certain goals, the top-down rational approach must have a place from which to begin. The first criticism, then, is that the top-down rational approach begins from the statutory language and fails to take into account actions prior to legislation (Matland, 1995, p. 147). Second, top-down policy fails to take into account the political nature of policies, instead treating implementation as a mere administrative practice guided by clear objectives and direction. Third, top-down rational models look at local implementors as vehicles for, and possible impediments to, policy success and focus on the policy makers as the key actors. Last, Fox (1990) argues that by focusing only on a rational view of implementation, we fail to address the false dichotomy of the objective versus the subjective in implementation. This leaves us questioning the reality of what we
analyze. The shortcomings of this perspective take us into the next evolutionary framework in implementation research: the conflict and exchange framework.

Conflict and Exchange Framework

A second major perspective for studying implementation came from the famous RAND studies of the mid-1970s that focused on the implementors themselves. This perspective, commonly known as the bottom-up perspective, used an exchange or conflict theory to explain how implementors mutually adapt policy to fit their needs (McLaughlin, 1987). This framework argued, in contrast to rational models, that a more comprehensive view of implementation can be gained by looking through the perspective of the local site, target population, or service deliverers and the conflicts and exchanges that occur. Exchange theory as a conceptual framework for implementation suggests that the how and why of human decision making is based upon seeking some reward in social transactions. Exchange theory posits that people weigh alternatives with respect to costs and benefits and the amount of information they may possess. Exchange theory also asserts that exchange not only serves the needs of individuals but shapes and constrains the collective development of social systems such as organizations (LeCompte & Preissle, 1993, p. 130).
Conflict theory concerns itself with power and contradictions in social systems. Although somewhat different in their approaches, most notably in their views of humans within social systems, exchange and conflict theories are quite rational in their theories of how the world works. Therefore, while the overall perspective is different in bottom-up models, some measure of rationality abounds. Models and theories in this framework mainly come from the school of "bottom-up" research and focus on transactions among individuals and between levels. Hjern and Hull (1982), Sorg (1978), Elmore (1978), and Hall and Loucks (1978) all developed models that suggested this bottom-up understanding of implementation. The most famous of the bottom-up studies were done by Berman and McLaughlin (1974, 1975, 1977, 1978). In the RAND studies of 1974-1978, entitled Federal Programs Supporting Educational Change, Berman and McLaughlin studied four major federal programs at 293 local sites in eighteen states (McLaughlin, 1991, p. 143). The original intent of the studies was to examine how federal policy stimulated and spread educational innovation, and how temporary funds were used to support new practices. These policies, as did most policies of the time, assumed a relatively direct relationship between federal policy "inputs," local responses, and program "outputs" (McLaughlin, 1991, p. 144). Berman and McLaughlin's findings suggested something else, however.
The RAND studies found that federal policy did have a major role in prompting local school districts to undertake projects that were aligned with federal guidelines, but these projects were mostly pro forma. Their findings suggested that adoption of a project consistent with federal guidelines did not ensure successful implementation, and even if a project was successfully implemented there was no assurance that programs would continue without federal funds. The RAND findings suggested, from a conflict and exchange framework, that the consequences of federal policies depended primarily on local factors and not on federal guidelines or funding levels (McLaughlin, 1991, p. 145). In examining the local factors that affected the outcomes of innovations, the RAND study reached five conclusions:

Educational methods of innovations did not matter as much as how they were carried out.

Resources alone did not guarantee successful implementation or continuation.

Project scope needed to be ambitious enough to generate interest and involvement but not so large as to require too much too soon from the implementing system.

The active commitment of the district and site leadership from the very beginning was essential.

Local implementation strategies for innovations dominated the outcome of federally supported projects (McLaughlin, 1991, p. 146).

The study also found effective and ineffective local strategies for implementation. Ineffective strategies were ineffective because they did not provide for ongoing teacher support and training, did not include teachers in development, and signaled a mechanistic role for teachers. These ineffective strategies included such things as
reliance on outside consultants, one-shot training, or comprehensive system-wide projects. In contrast, the RAND study found that strategies that promoted mutual adaptation, or the exchange of ideas for benefit, were effective. These strategies included such things as extended training, teachers' participation in decisions, regular meetings focused on practical issues, local development of project materials, and principals' participation in training (McLaughlin, 1991, p. 146). In essence, as a bottom-up theory of the implementation of educational innovation, the tension between macro-level objectives and micro-level realities led to conflict over power, resources, and benefits, which in turn led to mutual adaptation of the intent of the policy. The conflict and exchange framework, housed mainly within the bottom-up approach to implementation, places more emphasis on describing those factors that have caused difficulty in reaching policy goals at the local sites, and therefore has led to few explicit policy recommendations (Matland, 1995, p. 149). As an approach to analyzing implementation, this framework has been criticized on four major counts. First, by placing much of the meaning of policy in the local actors' hands, the role of policy makers is neglected or underspecified. Second, this framework overemphasizes local factors, while resources and access to an implementing arena may be determined centrally and can substantially affect policy outcomes (Matland,
1995, p. 150). Third, conflict and exchange analyses that focus only on the participants at the local level fail to take into account broader social, legal, and economic factors that structure the perceptions, resources, and participation of those actors (Fitz, Halpin & Power, 1994, p. 56). Last, because most implementation studies focus on questions of macro-level concerns as compared to micro-level realities, most bottom-up studies can still be considered top-down theories. Thus, the conceptual frameworks of exchange and conflict theory still assume rationality in top-down needs and concerns.

Synthesis Frameworks

As implementation research and understanding have grown, so have attempts to reconcile the differences and problems of the top-down and bottom-up approaches to implementation by combining the two conceptual frameworks. By looking at a rational framework for analysis of the policy itself and an exchange or conflict model for implementation, these hybrid models attempt to explain the how of public policy implementation. Third-generation models of implementation take kernels of truth from both top-down, rational models and bottom-up exchange and conflict models. They look for a variety of responses and strategies depending upon either the policy (Matland, 1995), the policy arena (Nakamura & Smallwood, 1980; Miller,
1990), the coalitions involved (Sabatier, 1989), or the conditions under which one model may be more appropriate (Berman, 1980; Timar, 1989). In answer to the question of which model is most appropriate, synthesis models rely on context. Most synthesis models combine lists of variables without exploring the theoretical implications (Matland, 1995). Therefore, these models do little for either policy makers or micro-level implementors.

Social Construction of Meaning Framework

If we believe that changing the culture of the school, the nature and profession of teaching, and the nature of the curriculum we offer students is the key (Fullan, 1992, p. 352) to sustained educational change, what then are the implications for policy and implementation studies? Social construction of meaning as a conceptual framework analyzes the constructed nature of social meaning and reality by removing the subject-object dichotomy (LeCompte & Preissle, 1993). Instead of being the subjects of policy, teachers become actors in constructing meaning through social interaction. This frame, in contrast to a more rational view of the world, assumes that meaning is always changing but always based on interpretations according to local contexts. The social construction of meaning on an organizational
level also shares major concepts with cybernetics, or systems theory, in that information matters considerably in the construction of meaning. Returning to the introduction, in which Fullan (1992) suggested that implementation and change are simply putting ideas, practices, or activities into place, we can now see that this can happen yet have little impact. For true improvement to occur, Fullan also suggests that change and implementation require understanding meaning beyond surface detail. Fullan states:

The key to understanding the particular worth of particular changes, or to achieving desired changes, concerns what I call "the problem of meaning." One of the most fundamental problems in education today is that people do not have a clear, coherent sense of meaning about what educational change is for, what it is, and how it proceeds ... What we need is a more coherent picture that people who are involved in or affected by change can use to make sense of what they and others are doing. (p. 4)

Fullan goes on to say:

Neglect of the phenomenology of change, that is, how people actually experience change as distinct from how it might have been intended, is at the heart of the spectacular lack of success of most social reforms ... Solutions must come through the development of shared meaning. (pp. 4-5)

Goertz, Floden and O'Day (1995) see the construction of meaning as the greatest challenge for newer systemic forms of policy:

The first and most critical challenge evident ... is also the most difficult to realize in a system as large and bureaucratic as is public education in the United States. It is to place learning at the front and center of all reform efforts, not just improved learning for students but also for the system as a whole and for those who work in it. For if the adults are not themselves learners, and if the system does not continually assess and learn from practice, then there appears little hope of significantly improving opportunities for all of our youth to achieve to the new standards.
This brings us to the role of a different conceptual framework in understanding implementation as learning and for understanding policy and its implications for supporting individual and organizational learning. The latest perspective on implementation uses a cognitive perspective to understand the dynamics of the interaction between the policy and the implementors. For instance, Fullan (1992) discusses the nature of any school reform, including policy implementation, as making meaning out of the effort. Others (Cohen and Barnes, 1993; Fuhrman and others, 1988; Yanow, 1991) have also alluded to the use of "learning theory" as the next orientation for policy implementation research. This theoretical perspective argues that if the idea of a policy is potentially worthwhile to implementors, its success will depend on the degree and quality of reconstructing beliefs, knowledge, and action in individuals and organizations, which is in effect learning through actual practice (Fullan, 1992, p. 66). Other authors (Elmore and Fuhrman, 1994; Goertz, Floden & O'Day, 1995) have agreed, suggesting that the ambitious instructional reforms of state policy are vulnerable to the capacities of individuals and organizations to learn from policy. Foremost among the research that uses this perspective, the Educational Policy and Practice Studies, done during the early 1990s as a joint effort between Michigan State University's National Center for Teacher Learning and the Consortium for Policy
Research in Education at the University of Michigan, have also used the constructivist learning metaphor to explore the relationships between policy and practice. This line of research views the educative nature of policy and how local practitioners, schools, and districts construct an understanding of policy. For instance, Cohen and Barnes (1993) found that policy seen as pedagogy is didactic in nature, much like common K-12 instructional practices, and does not engage teachers. Therefore, very little learning occurs from most policy efforts unless policy can be designed to be educative in nature. Field studies using this framework also suggest a tenuous relationship between policy and individual practice. Wiemers, Wilson, Peterson, and Ball (1990), in their study of California's policy effort for systemic math changes, found that learning did occur for individual teachers. This learning, however, was mediated by prior beliefs, experience, and the school organization. Similarly, Ball, Cohen, Peterson and Wilson (1994) examined how differences in local teachers, schools, and school districts influenced the response to state policy and mediated the original intent of the policy. Their findings suggest that sources of variation in local instructional guidance are substantial. These variations include different instructional preferences, different subject matter preferences, and different levels of local educator knowledge. These variations placed limitations on the ability to capitalize on reform ideas and the ability to
increase professional knowledge. Different school district understandings of policy also led to different messages about the policy being given to schools. Finally, Spillane and Jennings (1997), in their study of the alignment of instructional policy and ambitious pedagogy in literacy, found that while policy alignment strategies in LEAs were effective in changing surface-level aspects of teaching, these strategies may be less effective in changing difficult dimensions of classroom practice: task and discourse. As stated in the introduction, implementation from a social construction of meaning framework would suggest that implementation is not something to be done, but something to be learned. Once again, Fullan (1992) says it eloquently when he states:

... the crux of change involves the development of meaning in relation to a new idea, program, reform, or set of activities. But it is individuals (author's emphasis) who have to develop new meaning, and these individuals are insignificant parts of a gigantic, loosely organized, complex, messy social system that contains myriad different subjective worlds (p. 92).

Implementation would in this framework exist not as mechanical actions to put centrally defined details into practice, but as a combination of factors interacting to help individuals and organizations make sense of and learn from their actions. Policy formulation in this conceptual framework would, as Cohen and Barnes (1993) suggest, be educative in nature, helping to develop organizational capacity to acquire knowledge, share it, and interpret its meaning so that it becomes part of organizational
memory. In one sense, policy would become the curriculum for school organizations to learn. This requires that we understand how organizations learn. Knapp (1997) also suggests a growing need for different theoretical orientations in studying systemic reform. Past implementation research does not take us deep into the meaning of constructs like capacity, how it develops or where it comes from, or into the features of the context surrounding the implementation process. Past implementation research also ignores the evolution of the policy ideas themselves. Organizational learning, therefore, has been offered as a way to understand implementation as large-scale construction of meaning.

Organizational Learning and Sensemaking

As noted in Chapter One, studies on systemic reform policy have pointed to the strong influence of the local school and district organization on teachers' ability to learn from and implement policy in their classrooms. In order to understand the interrelationship between policy and individual practitioners, it is also necessary to examine how organizations mediate this learning and make sense of their external environments. These influences by organizations can be understood by reviewing two similar lines of inquiry: organizational sensemaking and organizational learning. This perspective helps us understand policy as something to be learned, and the
processes both individuals and organizations use to learn from and about the policy. Key ideas from an organizational learning perspective relative to policy include learning opportunities, resources for learning, support for learning, and the storage of new knowledge and processes in organizational memory. Another key idea is that:

... the collective enterprise of a school, production firm, symphony orchestra, or other organization possesses qualities that are greater than the aggregate of individuals within it, and that there is some identifiable learning that can be associated with the aggregate as a whole. (Cook & Yanow, 1995)

In both lines of research, organizations can be seen as cognitive structures that can be active in understanding the worlds in which they exist. Sproull (1981), for instance, focuses on the behavioral processes necessary for implementation by understanding the information processing of organizations. In her theory, Sproull suggests that school districts respond to federal regulation in an active process organized around four elements: 1) processes by which organizational attention is captured, 2) processes by which meaning about external stimuli is constructed, 3) processes by which response repertoires are invoked, and 4) processes by which behavioral directives or guides for action are communicated (p. 457). Expanding the idea of organizations as interpretation systems, Daft and Weick (1984) propose a model of different interpretation modes. This model is based on, first, an organization's assumptions about the environment: analyzable or
unanalyzable; and second, the amount of organizational intrusiveness in the environment: passive or active. Daft and Weick also include other organizational processes, such as scanning processes, interpretation processes, and strategy and decision making processes, to help explain organizations as interpretation systems. Taken together, the model proposes four different interpretation types: undirected viewing, enacting, conditioned viewing, and discovering. Similarly, Lotto and Murphy (1990) and Stein (1997) use this conceptual framework to study the sensemaking of schools and of new Title I policy, respectively. The field of organizational learning also helps us to understand how organizations, as collectives of individuals, learn. Although many debates rage in this field of inquiry and little formal research exists to date, attention to organizational learning implies a central concern with how action patterns take shape within, and are shaped by, the stream of experience (Cohen & Sproull, 1996, p. xiii). Therefore, the more cognitive, process-oriented approaches should be examined to explore how organizational learning affects implementation. For instance, Weick (1990) describes organizations as bodies of thought and sets of thinking practices that shape organizational variables. Kim (1993) developed a model showing how individual mental models transfer, through learning, to organizational learning through either complete or incomplete cycles affected by organizational routines. Argyris and
Schon (1978) in a similar fashion describe two levels of organizational learning. Single-loop learning is behavioral, in that organizations use common responses or routines to correct problems. Double-loop learning is cognitive, in that the underlying assumptions of the organization are modified to generate different responses. Duncan and Weiss (1979) developed a middle-ground theory by suggesting that organizational learning is an active process in which organizational members develop knowledge about action-outcome relationships and the effect of the environment on these relationships. Recently, other authors have begun to view learning organizations as a collection of disciplines (Senge, 1990) including personal mastery, shared vision, mental models, team learning, and systems thinking. Others (Roberts & Kleiner, 1999) have characterized organizational learning as dealing primarily with systems theory but argue that there are multiple ways to understand organizations as systems, including open systems, social systems, dynamic systems, process systems, or living systems. Last, Huber's (1991) model provides the most parsimonious organizational learning theory. Huber's model uses four constructs to explain how organizations learn: knowledge acquisition, information distribution, information interpretation, and organizational memory.
By using this perspective to understand the implementation of systemic reform, we are able to understand how local education agencies, as organizations, attend to, make sense of, and cognitively construct meaning from new reform policies. While some authors suggest that the pedagogy of the policy itself can be blamed for the initial failure of systemic reform, an organizational learning perspective attends to local contexts and processes and how they construct greater learning opportunities that lead to both cognitive and behavioral changes. However, organizational learning, like all perspectives, leaves unanswered questions. Namely, what leads LEAs to act on what they have learned, how do LEAs actually learn about and from policy, and what contextual factors enable organizational learning to occur? Last, if we accept that capacity is a major part of any implementation perspective, how is capacity related to organizational learning?

Capacity Studies

Many policy researchers who study systemic reform have used the concept of capacity to help explain why standards succeed or fail. For instance, Knapp (1997) suggests that a lack of capacity led to uneven policy attempts in Montana, Delaware, Connecticut, and California with the National Science Foundation's Statewide Systemic Initiatives. Researchers who use capacity as an
explanatory framework for implementation research, though, have not come to any common definition or understanding of capacity, because capacity has been defined from a variety of perspectives. Capacity has generally been defined as:

the ability to receive, hold, or absorb information

the ability to learn or retain knowledge

the ability to do something

the quality of being suitable for or receptive to specified treatment

Capacity studies use many of these definitions and isolate dimensions such as resources, training, organizational structure, policy, and interaction. Capacity studies also use more nebulous terms such as collective enterprise or cognitive capacity to explain factors shaping local response. Most capacity theories fit under one of three main perspectives: capacity as organizational resources and influence, capacity as a form of economic capital, or capacity as individual cognitive constructions and beliefs. However, no comprehensive interactive model of how or why capacity leads to overall changes in classroom practice or district action exists to date. There are also no studies to date that differentiate between facilitating factors, processes, results, or cultural conditions that enable learning and become self-reinforcing over time. Each major element that has been found to constitute capacity, therefore, may help enable
implementation. However, no clear understanding has been developed that ties multiple elements together around large-scale organizational learning.

Capacity as Organizational Resources and Influence

Many studies have defined capacity in terms of the organizational resources and influences upon which an individual's capacity to learn from policy and change practice depends. For instance, Spillane & Thompson (1997), Newmann, King & Rigdon (1997), and Smith (1997) all point to the role of resources in enabling other elements of capacity. Common among the resources often cited were the quantity and quality of money, staffing, focused time, equipment, and materials. Other research in this field looked more in-depth at the interaction of the capacity needs in the intent of instructional policy, the organizational unit, and individual teachers to ascertain how learning from policy can occur. For instance, Newmann and others (1997), Stokes (1998), Wechsler & Friedrich (1997), and Cohen (1995) all point to capacity as a collective enterprise or collegial learning communities. This research suggests that schools with this form of capacity are more receptive to ambitious reforms. These schools contain higher degrees of social resources, shared commitment and collaboration, clearer purposes, a supportive climate, and a commonality of beliefs, leading to high degrees of internal accountability. This
research also points out that this form of capacity makes it easier to understand newer forms of teaching and learning. Similarly, Knapp (1997) suggests that the teachers as the targets of any policy instnunent must overcome the contexts and conditions that mediate individual learning. Knapp (1997) identified conditions in his study of systemic reform that influence practice as professional relationships, contact with knowledge bases, knowledge opportunities, influences that acted as the medium of communication, and the social organization or culture of a school or LEA. Similar to Knapp's notion of capacity as knowledge opportunities and access to knowledge, McDonnell and Choisser (1997) and Smith (1997), in their studies of the influence of new state assessment on schools and teachers, also found that the knowledge construction of teachers interacted and was dependent upon both the school and LEA organizations to offer technical support and a supportive climate. These two studies also found that training alone was not enough to totally help teachers change practice but was dependent upon the school and district to remove barriers and incoherence in internal practice in order for teacher practice to change. The Consortium for Policy Research in Education (1996) in their studies of systemic reform from a learning perspective also found similar interrelationships between individual capacity and organizational capacity. CPRE (Goertz and 47
others, 1995) defined capacity as the ability of the educational system to help all students meet higher standards. These studies identified elements of teacher capacity necessary for instructional change, including knowledge, skills, dispositions, and views of self. These elements depend strongly upon organizational capacity to learn from and make sense of more ambitious instructional policy. Organizational capacity, defined as vision, collective commitment, knowledge or access to knowledge, resources and organizational structures conducive to learning, was found to interact with individual capacity in a number of ways. High capacity schools had these organizational elements in place, allowing individual learning by teachers to occur and reform ideas to be pursued. These high capacity schools worked to enhance teacher capabilities not only through access to knowledge but also through cultural norms, outside coaching relationships, and a strong professional community of practice. Sykes (1990) reached similar conclusions earlier. He suggested that organizational control structures limited teachers' ability to learn from policy, and that institutional messages about schools limited learning necessary for implementation. Spillane and others (1995), in their study of the local policy system and how it affects math and science education in Michigan, found that organizational influences strongly affected changes in math and science education. Most notably, this research
found that an LEA's connection with an outside network for its teachers strongly influenced change in practice. The LEA's commitment and disposition to maintain the change plus the collaborative atmosphere within the district were found to be major components of capacity. Schlechty (1997) defined capacity from an organizational influence perspective as the ability of a school district to identify and maintain a clear and compelling focus on the future, maintain a constant direction, and act strategically. While only hypothetical in nature, Schlechty believes that each level of the educational system should aim to build capacity in the level below. Capacity in this sense means being able to solve the problems that inhibit the improvement of achievement at the student level. Bodilly and others (1998), in their evaluation of the implementation of the New American Schools models, found that certain organizational factors influenced the implementation of whole school reforms. Using the elements of design in each model as the dependent variable in a case study approach, researchers used a 0-4 scale to rate the degree of implementation in 40 separate schools. Through interviews and survey data, they found that factors including the process of selection, school receptivity or climate for reform, degree and stability of outside consultation, forms of interventions (whole staff training, extensive professional development, use of
facilitators, quality checks, materials and overall support), poverty level of the school, teacher mobility, and student-teacher ratios as independent variables influenced the degree of implementation. Researchers in this study also found that school district support, including leadership support, a culture of trust and cooperation between the school and district, the level of autonomy for schools, the level of resource support, and the effects of assessment and accountability packages, also greatly influenced the degree of implementation of whole school reform models. In evaluations of the National Science Foundation's Statewide Systemic Initiative (SSI), Zuckerman, Shields, Adelman, Corcoran and Goertz (1998) found that eight common implementation strategies were used to build capacity in the 25 states involved. These strategies included: supporting teacher professional development, developing or disseminating instructional materials, supporting model schools, aligning state policy, creating an infrastructure to support reform, funding local initiatives, reforming higher education and the preparation of teachers, and mobilizing public and professional opinion. Each of these strategies showed some impact in each of the 25 involved states. However, those states that had the largest impact on student achievement and changes in teacher behavior used intensive teacher development as well as significant investments in instructional materials and
resources (Zuckerman and others, 1998, pp. vii-x). This evaluation also found that capacity as state, local and school influences depended on multiple forms including teacher networks, regional assistance centers, technology infrastructure, or improved processes for selecting instructional materials. Finally, Massell (1998) studied the interaction between states and LEAs and the strategies states used to build capacity at the local level in eight states known for standards-based reform. This research was the first to study how state policy itself helps to build capacity in the LEA. In this research, Massell defined capacity as the elements necessary to support effective instruction. Her study suggests that numerous nested contexts exist that act to support effective instruction in classrooms. In these eight states, Massell found seven areas of capacity that were essential for improving teaching and learning. First, classroom capacities, or those elements that directly influence learning, included such things as teachers' knowledge and skills, students' motivation and readiness to learn, and curriculum materials for students and teachers. Second, school, district and state organizational capacities were found to provide educational direction and leadership as well as access to resources and knowledge. Capacity elements at these levels included the quality and types of people supporting the classroom, the quantity and quality of interaction within and among
organizational levels, material resources, and the organization and allocation of school and district resources. Massell also found that despite their differences, these eight states shared four common capacity building strategies: first, building external infrastructure to provide professional development and technical assistance, including such things as regional institutions, professional networks, professional associations and stronger ties with higher education; second, setting professional development and training standards for inservice and preservice education; third, providing curriculum materials through such avenues as curriculum frameworks, resource banks, and support for effective programs in resource allocations; and fourth, organizing and allocating resources through linkages to accountability systems or site-based decision making. Last, Massell found substantial evidence that state policymakers were paying attention to building local capacity through new strategies. These strategies included such things as locating assistance closer to schools, creating professional networks, providing strong curriculum guidance and adopting professional development standards. Despite these findings, Massell also found potential capacity problems that these eight states had not addressed. These capacity problems included the limited capacity of state departments of education, limited understanding of the performance
data that was supposed to drive all systems to improve, the limited scope of capacity building aimed at low-performing schools while ignoring middle-performing schools, and the need for continuity in capacity building during periods of conflict. Massell also found that a serious problem in most states was the lack of incentives necessary to engage people in capacity building. Incentives for following professional development standards, improving teacher training institutions, pursuing professional development, holding students to high standards, and engaging in school improvement planning processes were found to be lacking. In sum, those who define capacity from an organizational resource and influence perspective suggest that capacity is a matter of allocating the right amount of resources, developing the right type of culture to influence individual learning, or having access to the latest knowledge base. This perspective, however, leaves many questions. For instance, how do these elements interact, how is this form of capacity developed over time, and how is it maintained? Nor is it clear from these studies how resources correlate with changes in classroom practice. Last, it is unclear from this perspective how individuals and collective units use these forms of capacity to make sense of large systemic reform efforts.
Capacity as Economic Capital

Economic capital and production theories have been used by researchers as another way to conceive of capacity. If we conceive of capital as something acquired or developed over time (Coleman, 1990), then "spending" this capital to "buy" implementation becomes another way to conceptualize capacity. Coleman (1990) explains that three forms of capital are available in any organization to help facilitate goals: physical capital, or available material goods; human capital, or the skills and knowledge acquired by individuals; and social capital, or the social relations among the people in an organization. In Coleman's theory, all three forms of capital help facilitate the particular production function of an organization. To change, however, Coleman theorizes that social capital becomes the most important form to draw upon. Note that social capital is very similar to the concept of capacity as social organization, collective capacity, collective enterprise or collegial learning communities discussed in the previous section. Those schools or districts with more capital to spend can purchase more of the intended policy changes. How this works in actual practice, and how capital is developed, is not as clearly delineated. Nor is it clear where social capital in particular rests or how it leads to enhanced efficiency or production.
Spillane and Thompson (1997) argue that capacity has to be rethought in light of the amount of learning required of all educators in relation to current complex reforms. Focusing on Local Education Agencies (LEAs), they found that where ambitious instructional reform was taking place, three interrelated dimensions were evident: human capital, social capital and financial resources. They also suggested that successful enactment of reform policy depended on local LEA leaders helping others learn by promoting and using these three interrelated dimensions. Spillane and Thompson, therefore, define capacity as innate elements that exist within LEAs to engage individuals in learning. They also argue that the development of human capital depends critically on the development and exploitation of social capital. How this process works, how social capital leads to enhanced human capital, and how each acts to influence large-scale organizational learning, though, are unclear. Similarly, Corcoran & Goertz (1995) define capacity from another economic perspective: as production or instructional capacity. In their definition, a system's capacity functions as a set of three variables: human resources; resources such as time, money, organizational arrangements, incentives and materials; and the instructional culture. Corcoran and Goertz also suggest that capacity is limited in most districts and schools because our knowledge of what works is limited and that
teachers' access to learning opportunities is limited. This in turn affects the effort of a faculty to collaborate around a collective effort. If the product of an educational system is high quality instruction, then the intellectual ability, knowledge and skills of teachers require an instructional culture focused on collaborative learning around common problems of practice. While this definition clearly begins to define capacity as an interactive model, there is no empirical basis for their definition; nor is there a clear understanding of how these elements interact to influence change in instructional practice. Pure economic analyses of the relationship between resources, spending, and school and student performance also show little evidence to support economic capital as the only resource necessary for capacity. Hanushek (1997) conducted a meta-analysis of over 400 studies of student achievement. His findings demonstrate that there is not a strong or consistent relationship between student performance and school resources, at least after variations in family inputs are taken into account. Monk (1998) found similar results in a study of a New York policy requiring that students wanting to graduate from high school pass five Regents examinations by 2003. While the number of students participating in Regents exams increased and inflationary increases in spending were evident in the state from 1992-1996, no increases in spending for new personnel were evident, nor were resulting increases in passing percentages evident.
In sum, economic perspectives on capacity ignore the dynamic nature of organizations as open systems and the role of learning defined as capacity. While capital resources are necessary in any educational setting, evidence suggests that such resources alone cannot influence student or school performance.

Capacity as Cognition

A final way capacity has been defined is through the perspective of cognitive capacity. Using recent work in organizational theory that emphasizes the role of cognition in human activity, many authors have explored the cognitive capacity of teachers and administrators, and cognition as an organizational process, relative to systemic reform. Educator knowledge and beliefs have been an important part of later implementation studies. How individuals access, interpret and make sense of ambitious reforms depends on many things and can influence the allocation of resources and learning opportunities for others. Spillane (1993), for instance, looked at how two districts in Michigan responded to state reading policy. He found that state policy and its meaning are affected by the notions central administrators hold about curriculum and instruction. If central administrators hold traditional pedagogical views, they limit the resources and opportunities for teachers to learn from these policies. Similarly, Jennings and Spillane (1996), in their study of South Carolina
policy for at-risk learners, found that unequal capacity at the local school site affects learning opportunities for practitioners. Their study raises questions about the effects of policy when it requires local school systems with unequal capacity to implement the same ambitious reform. Similarly, Sanders (1998) found that LEAs in which key personnel had or acquired deep understanding and clear beliefs about the implications of standards for the classroom were the most effective at garnering and using multiple kinds of resources to support teachers' work. In her work on Colorado's SSI efforts, cognitive capacity was defined as:

Depth of knowledge and beliefs in the organization about the reforms and strategic use of knowledge and beliefs to obtain important organizational resources that teachers and others in the system need to fully implement standards (Sanders, 1998, p. 2).

From an organizational learning perspective, Elliott (1998) suggests that the internal capacity of schools and school districts in relation to intervention depends on organizational cognition, which he defines as non-routine learning. In this sense, non-routine organizational learning depends upon four cognitive dimensions that add to the cognitive capacity of the organization to learn. These dimensions include: the approach to the organizational dialectic, which includes the deep questioning of conflicting norms, purposes and the fundamental quality of collective interactions; the approach to knowledge, which is a deliberate search, interpretation and experimentation for purposes of both refinement and reorientation; the approach to
task, which is the interaction with the fundamental norms of teaching, school structure and collective interaction that leads to the development of a collective mind or capacity; and the approach to learning, which is a conscious awareness of the learning culture as continuous, collective, connected, deliberate and transformative. These cognitive processes are enabled by factors that both precede and follow, including resources and the ability of a school or LEA to switch to these non-routine learning processes when necessary. In sum, the ideas of cognitive capacity help us further define capacity as a process of learning, constructing beliefs, and interpreting reform, while better defining where both individual and organizational understanding are stored. However, by itself cognitive capacity does not explain how this capacity is developed or how resources influence its attainment, nor does it explain how this cognitive capacity becomes the collective understanding necessary for deep implementation of standards.

Relation to Organizational Learning

In sum, capacity has been defined as many things from diverse perspectives. If systemic reform requires greater knowledge in all parts of the LEA, as some suggest as the logical extension of the central principles of the reform, then the capacity of the
system can be identified as the whole system's ability to learn or retain knowledge, the faculty to do something, or the receptivity to treatment from policy. Therefore, we must turn to organizational learning as a way to begin to define and understand capacity on a large scale because of the similarities between organizational learning and capacity. These similarities include:

A focus on both individual and collective learning
Enabling resources
Mediation of organizations on individuals and individuals on organizations
A collective sense of focus, purpose and social resources that enables learning
Processes within the organization that help facilitate and store the interpretation of the learning that occurred

To date, no capacity definition or model has used a dynamic, interactive systemic model to describe how the various elements interact or to define how the LEA as an organization learns. Therefore, there exists a need for a new theoretical perspective for understanding organizational implementation and capacity if constructivism is used as the large theory in understanding systemic reform. Using Huber's (1991) model for organizational learning (knowledge acquisition, information distribution, information interpretation, and organizational memory), we can begin to illustrate how capacity and its various elements may link to organizational learning through a conceptual framework of organizational learning capabilities.
Organizational Learning Capability and Orientations

Organizational learning has suffered from a lack of coherent theory. Some authors have viewed organizational learning as a process; others have seen it as an end result. Still others, like Leithwood, Leonard and Sharratt (1998) and Scribner, Cockrell, Cockrell and Valentine (1999), who have studied organizational learning in educational settings, have begun to understand organizational learning differently. They see it as the relationship among impetus, mediating variables that enable learning on a collective basis, and the eventual changes in both cognitive structures and behaviors. Similarly, Toft-Everson, Jesse and Burger (1997), in their study of organizational learning, suggest that the organizational context into which educational changes are placed determines the receptivity to the proposed changes and improvements. This receptivity, as local contextual factors, was studied in seven LEAs. Toft-Everson and others found that organizational learning was a result of the culture of the school or LEA, organizational processes, managerial power and control, functions of leadership, communication processes, management of change processes, and stakeholder engagement. They also concluded that there is an essential relationship between organizational learning and organizational change, and that educational organizations can improve their capacity to learn by improving those attributes that lead to learning.
Last, in his seminal review of organizational learning, Cousins (1996) also suggests that the influences on organizational learning come from two sources: the context within which learning capacity is embedded and the "conditions and factors associated with the environment within which the organization exists" (p. 608). Cousins classifies the local context as consisting of four components that are similar to many ideas discussed in the previous section on capacity:

ecology, or the relationships between the human and physical or material aspects of the organization
milieu, or the psycho-social dimensions of persons, groups and their characteristics
social system, or the patterned relationships among persons and groups
culture, or the belief system, values, cognitive structures and meanings shared among organization members (Cousins, as discussed in Toft-Everson and others, 1997)

Because organizational learning is now being analyzed as both processes and outcomes, Huber's model has been expanded to better understand differences in organizational learning. This model may help us understand how LEAs successful in learning about and implementing standards-based reform differ from other LEAs. This midground of understanding capacity as organizational learning capability uses the ideas developed by Nevis, DiBella and Gould (1995) through organizational learning orientations. Modifying Huber's constructs of knowledge acquisition,
sharing and utilization, Nevis, DiBella and Gould (1995) explored successful businesses to understand organizations as learning systems and to determine how their learning differed. Picking companies with reputations for organizational learning, they used a field study approach and grounded analysis technique to develop their concept of organizational learning capability. Their findings suggest that while all organizations learn, they have different learning orientations. Orientations describe how, what, and where learning occurs and are based on culture, experience and core competence. Organizations also possess different normative, facilitating factors or processes that affect how easy or hard it is for learning to occur. These factors are based on best practices and common processes for learning. Using Huber's constructs, the framework for looking at organizational learning orientations allows organizations to be studied as separate entities but to be compared for their different styles of and responses to learning (see Table 2.1 below).

Table 2.1 Organizational Learning Capability Model

Learning Orientation Continuum
1. Knowledge source: internal ............... external
2. Content-process focus: content ............... process
3. Knowledge reserve: personal ............... public
4. Dissemination mode: formal ............... informal
5. Learning scope: incremental ............... transformative
6. Value-chain focus: design ............... deliver
7. Learning focus: individual ............... group

Facilitating Factors
1. Scanning imperative
2. Performance gap
3. Concern for measurement
4. Curiosity
5. Climate of openness
6. Continuous education
7. Operational variety
8. Multiple advocates
9. Involved leadership
10. Systems perspective
Seen in this manner, organizations have an organizational learning system that can be described as learning orientations, or style, and facilitating factors, or those elements within an organization that promote learning. The assumptions in this conceptual framework are that organizations are learning systems, and that learning in organizations is a systems-level phenomenon that is a constant function of any organization (Nevis and others, 1995). The conceptual framework of organizational learning capabilities also assumes that an effective learning organization constantly pursues an enhanced knowledge base allowing the development of competencies for either incremental or transformational changes. In these instances, "there is assimilation and utilization of knowledge and some kind of integrated learning system to support such actionable learning" (Nevis and others, 1995, p. 74). Although much of the work with organizational learning capabilities has been done in business organizations, DiBella & Nevis (1998) suggest that this framework can be adapted to a specific industry or context. For instance, after developing the orientations for general business, DiBella & Nevis used the general orientations to study health care systems. While they found some similarities, orientations had to be modified and adapted to fit both the culture of the industry and the way learning occurred. While no work to date has utilized this framework in schools or school districts, the purpose of this framework is to increase awareness and understanding of
the practices and elements that contribute to learning in any industry so that these capabilities may be increased. This dissertation applies the framework to educational settings to investigate whether and how orientations and facilitating factors can be used to describe organizational learning in local education agencies. In sum, organizational learning capability is a new theoretical framework that utilizes the social construction of meaning to understand collective learning in various industries. It is an appropriate framework for examining both implementation and capacity issues in LEAs because it focuses on how organizations actually learn while examining how facilitating factors influence learning orientations. In this sense, it may encompass many elements from the capacity studies to show how resources enable individual and collective learning, and how orientations as cultural artifacts also enable learning. It is also an appropriate framework to use in that the learning orientations, or learning styles, are descriptive and therefore do not require value judgments. This allows us to look at how differences in learning orientations among LEAs explain different ways of interpreting standards-based reform and therefore to better understand variation. Last, this framework allows for modification according to industry.
Conclusion

As the struggle to understand educational policy implementation continues, Elmore and Fuhrman (1994) suggest that the big ideas of policy are vulnerable to the capacities of the people and institutions that implement them (p. 9). They go on to suggest that since the current reforms reach deep into the core technologies of schooling, they are more vulnerable to capacity problems. Their solutions parallel the main thesis of this dissertation: that implementation of ambitious reforms is a problem of knowledge development and learning (p. 9). This chapter discussed how early implementation research of educational policy did little to help us understand how large-scale policy implementation could occur. As new forms of policy took shape, new conceptual frameworks were used to understand how the ideas of these policies were put in place and modified according to local contexts. The latest form of educational policy, systemic reform or standards-based reform, focuses more heavily on the core technology of teaching and learning. Therefore, other conceptual frameworks such as organizational learning have been promoted as more appropriate ways to understand implementation within local contexts. These frameworks rely on large conceptual theories like constructivist learning and socially constructed meaning. These frameworks and theories have been promoted as the newest way to understand policy implementation as learning and
large-scale construction of meaning, since standards-based reform relies heavily on the learning in all parts of the educational system. In conjunction with organizational learning, this chapter discussed the relationship between the ideas of capacity as organizational resources and influences, as economic capital, and as cognition in enabling this large-scale organizational learning. While most research in this area treats capacity as the enabling elements that lead to organizational implementation or learning, very little agreement exists as to what capacity means relative to organizational learning or how the elements of capacity interact to enable organizational learning. Last, if we wish to understand what capacity means relative to systemic reform using organizational learning as a filter for interpretation, organizational learning capability and orientations offer a way to understand the local context of learning. This framework allows modifications appropriate to particular organizations and combines both descriptive orientations of learning and normative, facilitating factors. This framework helps pull together the diversity in capacity research to focus not just on defining elements, but also on differing orientations to learning. This allows LEAs and their responses to systemic reform to be studied as separate entities but to be compared for their different styles of and processes for learning in an interactive manner.
In sum, organizational learning and organizational learning capabilities allow further evolution in understanding educational policy implementation and capacity. Systemic reform, as changes in the core technology of teaching, learning and school organization, demands an extraordinary amount of learning and rethinking of common school practices. Focusing on how LEAs respond to systemic reform blurs the dichotomy that has traditionally been drawn between state policy and the local context and demands a more substantive understanding of large-scale collective learning, enabling elements, and the interaction between the two. Using organizational learning orientations and capabilities to do this, I explore the response of three LEAs to Colorado's version of systemic reform in order to offer alternative explanations for the relationship between state policy and LEA implementation.
CHAPTER 3

RESEARCH DESIGN

This study explored how three Local Education Agencies responded to Colorado's standards and assessment policies by exploring the dynamic interrelationship of orientations and factors leading to organizational learning. Central questions for this study included:

1. Using DiBella and Nevis' (1998) description of organizational learning based on Huber (1991): a) How did LEAs learn about standards-based reform policies; and b) What orientations did LEAs use to learn about and implement current state-mandated standards-based reform policy?

2. What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies?

3. Using learning orientations and facilitating factors, what interpretations were constructed by Local Education Agencies (LEAs) about state-mandated standards-based reform policies?

4. How do differences in learning orientations and facilitating factors explain variation in interpretation and implementation of standards-based reform policies?

In order to answer these questions, the analysis of policy implementation must move away from simple cause-and-effect positivistic models and begin to use models of social learning afforded us through the lens of organizational learning capabilities. To look at the relationship between an LEA's capacity, organizational learning
capabilities, and its degree of enactment of state policy, three steps must be undertaken to understand this learning. First, the LEA's organizational learning capabilities must be discovered through its learning orientations and use of facilitating factors to describe the interplay between the policy and organizational features. Second, how an LEA enacts a policy helps us understand how the policy is interpreted through learning opportunities (Cohen and Hill, 1998) afforded by the policy and other knowledge opportunities and disseminated as learning opportunities within an LEA. Seen in this way, implementation may equate to collective learning by personnel in an LEA. Enactment also allows for comparisons between LEAs and shows how contextual differences in organizational learning capabilities may lead to variation in implementation. Last, the two previous steps must be analyzed to understand the dynamic relationship between how an LEA as an organization learns, what it learns, how these lessons are interpreted, and how that affects the enactment of state policy. A general research tradition to accomplish these steps requires a comparative, qualitative case study approach to unravel how LEAs learn from, make sense of, and put in place the lessons of standards-based reform through state policy. A case study approach is most appropriate for this type of policy research for three major reasons. First, according to Olson (as quoted in Merriam, 1998), a case study: first, helps
illustrate the complexities of a situation to show how numerous factors contributed to the problem in question; second, has the advantage of both hindsight and the present; third, shows the influence of personalities and their individual cognition on the issue; fourth, can show the influence of the passage of time on the issue; fifth, obtains information from a wide variety of sources; sixth, spells out differences of opinion and suggests how these differences have influenced the result; and seventh, presents information in a variety of ways and from the viewpoints of different groups (pp. 30-31). Similarly, as Merriam goes on to explain, "the heuristic quality of a case study is suggested by these aspects ... case studies have the ability to explain why an innovation worked or failed to work ... and can evaluate, summarize and conclude, thus increasing its potential applicability" (p. 31). Second, educational policy research about the implementation of reform demands by its very nature that we understand certain relationships. These relationships include interactions between the intent of the policy, the implementing agency, and the knowledge generated by this interaction. Case study traditions allow us to understand better the role of policy in generating this knowledge. For instance, Stake (1981) suggests that the knowledge gained from case study research is different than traditional research because: first, it is more concrete and not abstract; second, it is more contextual and therefore allows us to see how context and policy interact; third,
it is more developed by reader interpretation, which leads to better generalization when new data are added to the old; and fourth, it is based more on reference populations that again help transfer the knowledge and generalize it to the reference populations. Third, the majority of research done with organizational learning, organizational learning capabilities, implementation research, and capacity studies uses case study as the primary research methodology for particular reasons. First, case studies allow the how, why, and meaning questions to be answered more easily. Second, case study is a particularly good design if the phenomenon being studied is a process, as in the case of implementation research. Case studies in implementation research help us to understand processes of events, projects, and programs and to discover contextual characteristics that will shed light on an issue (Sanders, 1981, in Merriam, 1998, p. 33). Third, case study design helps elicit the boundaries of the phenomena under question and helps explain how success of a policy or program is due to contextual features. Last, because the policy in question interacts with the local LEAs in numerous dimensions, case study research allows each of these dimensions to be fully explored for their dynamic interplay. In this study, the implementation of standards-based reform policy has three hypothesized dimensions that can be studied most appropriately by using case study methodology. These dimensions include LEAs as organizational learning processes,
temporal processes associated with the policy, and sustained processes as a result of implementation as learning. Each dimension adds to the rich understanding of the implementation process of state policy. This study, however, will focus primarily on the district and its learning processes as the unit of analysis and attempt to interweave the other "case dimensions" into a composite understanding of how districts as organizations acquired, interpreted, disseminated, and utilized knowledge of standards-based reform. Case study research uses various methodologies depending on research aims and questions. For this study, qualitative forms of methodology are most appropriate because, as Maxwell (1996) states, "qualitative research can develop explanatory conclusions and theories, and rule out potential validity threats to these conclusions and theories" (p. 1). However, as Maxwell goes on to explain, qualitative methodologies use a different logic from ones employed in traditional experimental and correlational designs. Qualitative methods in policy and organizational research focus on process theory versus variance theory. As Maxwell (1996) states:

Process theory, in contrast, deals with events and the processes that connect them; it is based on an analysis of the causal processes by which some events influence others. Process explanation, since it deals with specific events and processes, is much less amenable to statistical approaches. It lends itself to the in-depth study of one or a few cases or a small sample of individuals, and to textual forms of data that retain the contextual connections between the events. (p. 2)
In sum, a qualitative case study allows us to see how LEAs as organizations mediate the influence of state policy through their organizational learning capabilities. The remainder of this chapter will more fully detail how case sites were selected and who acted as informants at each site. This chapter will also describe data collection and data analysis, and end with a discussion of validity and limitations of the study.

Site Selection and Sampling

As an attempt to redefine capacity from an organizational learning capability perspective, this research study used a comparative case study of three school districts. This number allowed for cross-case comparisons and greater fidelity to the organizational learning capability model to come up with descriptors for educational settings. Districts were chosen as the main unit of analysis since they exist as the main flow-through agency for most state policy, and because, as Elmore, Siskin and Carney (1998) suggest, it is important to examine educational policy questions from the perspective of school and district experiences, rather than external policies that purport to influence schools alone. Districts were selected by purposeful, theoretical, or criterion-based sampling. The intent of this study was not to develop highly generalized findings, but to begin to explore organizational learning capability in LEAs. Site selection was not meant
to be representative, but specific to the research questions. Therefore, specific criteria were needed to select sites or cases to study so that I could discover the organizational learning orientations and facilitating factors of LEAs and their relationship to implementing standards-based reform. Criteria for case selection included the following: First, cases were selected by reputation for substantial evidence, or lack thereof, of investment, progress, and success in implementing standards-based reform. Only by studying the contrast between successful and unsuccessful implementation can the relationship between the policy and organizational learning begin to be extracted. As Knapp (1997) suggests, successful implementation and/or progress may define the LEA's receptivity to the reform ideas and ability to sustain them over time, which suggests higher capacity within these districts. This was done by asking for recommendations from field-based consultants with the Colorado Department of Education (CDE), analyzing documents from CONNECT, a statewide math and science initiative based on systemic reform principles, analyzing GOALS 2000 local improvement grant requests, and using researcher knowledge of districts' reputations. State assessment data from 1997-1999 were also analyzed as a proxy measure for districts making progress in becoming more standards-based by noting increases in percentages of students meeting proficiency levels.
Second, to facilitate the expansion of capacity and organizational learning capability theory, it seemed important to look for variance in response to the policy. This allowed fidelity to the organizational learning capability model as a way to understand differences in learning orientations and to understand variation as range in response. Differences in response also suggest greater, but different, learning and high capacity that can be a manifestation of greater attention by actors within these sites to structure greater learning opportunities (Knapp, 1997, p. 254). Variation in this sense means what a district chooses to use as its main entry point to implement standards-based reform. This includes elements such as assessment, professional development, curriculum alignment, or parent and community involvement. This was done through consulting with CDE representatives, analyzing district implementation or accreditation plans, analyzing CONNECT documents, and through using researcher experience and knowledge of LEAs in the state. Third, two districts were selected as matched demographic pairs to provide partial control for geographic location, size, and resources available for implementation. This again allows fidelity to the OLC model and understanding capacity as a multitude of factors beyond financial or personnel issues. A third district was chosen as a cross-comparison case. This district, while in geographic proximity to the others, differed in size, available resources, and community
demographics. This provided a way to contrast typical organizational features (size, resources, demographics) and their role in organizational learning. Last, districts were selected by reputation. That is, in order to examine implementation as OLC, cases had to be chosen that would help lead to enhanced theory in better understanding organizational learning capability. Therefore, districts with reputations for being learning organizations or for having unique processes or methods for implementation could help illuminate organizational learning capability in new ways. Again, a third case was chosen that did not have this reputation to provide a contrasting case. This was done through consultation with representatives of CDE and the Association of Colorado Education Evaluators, and by analyzing the number and types of presentations by districts at regional and state meetings dealing with standards-based reform.

Case Descriptions

Table 3.1 Case Summaries

District Name   Location              Number of Students   Number of Schools   Number of Central Office Administrators
Midplains       North Central         2,635                4                   3
River Valley    North Central         14,161               27                  8
Front Range     Central Front Range   18,397               35                  11
Case 1: Midplains School District. The Midplains School District educates 2,635 students and had 195 full-time employees. Midplains is located in the north central part of the state and borders the other two districts. Midplains exhibited a general orientation toward interpreting standards-based reform as new content and spent the majority of its resources on purchasing new curriculum materials. Because of its geographical proximity and lack of state reputation, Midplains was chosen as a contrasting case. At the time of this study, central office administration consisted of a superintendent, a director of staff development and curriculum, and an assistant superintendent for auxiliary services.

Case 2: River Valley School District. The River Valley School District educates 14,161 students in 27 separate buildings. River Valley is also located in the north central part of the state, and enjoys a reputation as a leader in standards-based reform. River Valley had a long history of reform, and exhibited an orientation toward interpreting standards-based reform as data-driven instruction and assessment to raise achievement. Because of its size and reputation, River Valley was chosen as one of the matched demographic pairs of districts. At the time of this study, central office consisted of a superintendent, an assistant superintendent, the director of elementary curriculum (who was also designated as the director of accountability to deal with new state accreditation laws), director of secondary curriculum, director of
assessment, director of human resources, director of state programs and early childhood programs, and a director of business and auxiliary services.

Case 3: Front Range School District. The Front Range School District educates 18,397 students in 35 separate buildings. At the time of this study, central office administration housed a superintendent, two assistant superintendents for instruction and business services, directors of elementary (PK-5) and secondary (6-12) education, a director of human resources, a director of staff development, and numerous TOSAs (teachers on special assignment) charged with implementing special initiatives. Front Range borders both Midplains and River Valley but is located closer to the Denver metro area than the other two. Front Range, while not possessing the overall reputation of River Valley, is well known for its efforts in literacy and science reform. Front Range exhibited an interpretation of standards-based reform based around differentiated instruction. Because of its size and reputation, Front Range was also chosen as one of the matched demographic pairs of districts.

Data Collection

Data for this study were collected through three primary methods: structured interviews, district documents, and observations of meetings. In addition, early case descriptions of learning processes were developed and discussed in an
interactive process during follow-up interviews developed specifically around organizational learning capabilities. This process allowed for both validation of researcher insight and development of ideas relative to a learning profile. All previous research on organizational learning capabilities (DiBella and Nevis, 1995, 1998) has used these forms and methods of data collection to understand various learning styles for distinct industries. Data collection occurred during the spring and summer of 1999. Follow-up interviews occurred in the early fall of 1999.

Interviews

Sampling for interviewing was decided by who had primary responsibility for the enactment of standards-based education in each district. Initial lists of interviewees were designated prior to site access, and a snowball method was used to identify others who needed to be interviewed. In all districts, superintendents were interviewed for their view of overall implementation, resource allocation, and the role of the external environment in building organizational learning capability. In all, 24 people were interviewed (see Table 3.2 below) in sessions lasting approximately one hour apiece, depending on the time each informant had available. Some informants were interviewed more than once depending upon needed follow-up or the informant's depth of understanding. All interviews were tape recorded with the permission of the informant.

Table 3.2 Informants Interviewed

Position Held by Informant               Midplains   River Valley   Front Range
Superintendent                           X           X              X
Assistant Superintendent                 NA          X              X
Staff Development/Curriculum Personnel   X           NA             X
Director of Assessment                   NA          X              X
Human Resource Personnel                 NA          X              X
Director of Elementary Education         NA          X              X
Director of Secondary Education          NA          X              X
Elementary Principal                     X           X              X
Middle School Principal                  X           X              X
High School Principal                    X           X              X

Semi-structured interviews were developed to ensure comparable data across all case sites. These interviews were primarily structured around processes of enacting standards-based education using Huber's (1991) framework about organizational learning. The interviews were also structured utilizing prior notions of organizational
learning capabilities, and prior notions regarding elements of capacity that may be related to organizational learning. Because the nature of this research was exploratory, in that organizational learning capabilities had not previously been used as a guiding framework in education, questions were designed to be as open-ended as possible. This allowed informants the opportunity to address salient issues that they deemed appropriate. All interview protocols were designed to elicit: first, an understanding of how a district was attempting to enact standards-based reforms and the depth of this enactment; second, an understanding of each district's organizational learning orientation, or how each acquired, interpreted, disseminated, and utilized knowledge relative to standards-based education; and third, what facilitating factors helped each stage of the organizational learning process. A similar interview protocol was used for each district (see Appendix A) to ensure similar understandings across all cases to help better define capacity from an organizational learning perspective. All interview data were transcribed as individual interviews and put into a database using ATLAS.ti software.

Document Analysis

At all three case sites, I collected district-level documents pertaining to the implementation and enactment of standards-based education. These documents
included such things as each district's implementation plan submitted to the Colorado Department of Education, accreditation plans, policies regarding standards-based education, reports, curriculum documents, professional development information, specific grants, instructional materials, and any pertinent resource allocation memos or resource analyses relative to standards-based education. These documents served not only as information for analyses, but also helped guide further interview questions. State-level data were also collected for each case site. Information on state-level CSAP (Colorado Student Assessment Program) data over each year of the test gave an idea of district improvement efforts and their success, and how well district efforts aligned with state expectations.

Observations

Where possible, I observed district-led professional development opportunities and meetings dealing with the organization or enactment of standards-based reforms. Early on in gaining access to each case site, I ascertained whether such observations were possible and attempted to attend as many as possible. These observations allowed me to understand better people's conceptions about standards-based reforms, and to see how individuals as part of a larger organization access opportunities to acquire,
interpret, disseminate, and utilize new knowledge. These opportunities also allowed access to issues and ideas not afforded in interview sessions. These observations allowed different perspectives on the district's organizational learning capability that could be pursued in subsequent interviews. Observational data, therefore, afforded me the opportunity not only to gain a deeper understanding of the district's theory in action (Argyris & Schon, 1993), but also to ground and validate interview data. In all, I attended one principals' meeting in the Midplains School District, an assessment conference plus a superintendent's cabinet meeting in the River Valley School District, and two days of an instructional conference on differentiated instruction in the Front Range School District.

Data Analysis

According to McLaughlin (1987), the conceptual and instrumental challenges in implementation analysis lie in models of multi-level and multi-actor complexities (p. 177). These third-generation analyses must also use new methods to elicit cognitive patterns of actions by individuals and organizations. In order to do this, this study analyzed all data in two ways based on the assumptions underlying the overall research questions: categorizing and connecting. Categorizing, or basing analysis on similarities in data (Maxwell & Miller, 1998, p. 1), helped determine what
organizational learning orientations and facilitating factors existed in districts. Connecting, or synthesizing data based on relationships, helped determine how learning orientations and facilitating factors influenced the overall meaning of the reform policies. As in most qualitative research, data collection and analysis in this study occurred simultaneously. Data went through preliminary analysis using summary forms, initial coding forms, and initial categorizing or systems model mapping. This interaction of data collection and data analysis enabled me to refine interview questions, test initial hypotheses, and gather additional information as needed. After preliminary analysis, data were transcribed and put into a case database using ATLAS.ti software and coded according to the processes described below.

Categorization Strategies

Data analysis of my interview, document, and other data using categorizing strategies included the following four coding phases. These phases attempted to answer questions specific to Huber's theoretical framework and the DiBella and Nevis model of organizational learning capabilities, and to investigate themes that were not well explained by the framework and model.
Phase One. The first phase of analysis was to code the data according to the four major categories in Huber's description of organizational learning. Using Huber's operational definitions (Tables 3.3, 3.4, 3.5, 3.6 below), responses were read again and coded for each category within cases using ATLAS.ti software. Responses were then compared across cases to ensure common use of the codes. Responses that did not fit the categories were analyzed to determine the limits of the framework and its application to public educational organizations. This helped answer the first part of dissertation question one: Using Huber's (1991) description of organizational learning, how did LEAs learn about state-mandated standards-based reform policy? More specifically, using Huber's operational definitions, exemplars and data samples emerged that helped to anchor rules for coding and to support the validity and reliability of the codes.

Acquisition Definition: How organizations gain new knowledge through the experiences of their own personnel or indirectly through the experiences of other organizations.

Table 3.3 Coding Rules for Acquisition

Exemplars: McREL; BOCES; consultants; university classes; central office personnel; staff development personnel; teacher/principal leaders; assessment data; outside research

Data Samples:
"our curriculum adoption includes a research piece that evaluates what we're currently doing."
"I brought the whole concept of book talk and having dialog based on research"
"We have the probably most comprehensive choice list in the state and we have the 5 models of staff development"
"Actually we called some universities and finally ended up with McREL"
"We took our administrators and some of our
teachers through SBE training. Developed through our Northern Colorado BOCES."

Interpretation Definition: Both the processes used to give meaning and the specific meaning given to new ideas and knowledge by an organization.

Table 3.4 Coding Rules for Interpretation

Exemplars: Improving our district; improving quality of curriculum; student achievement; writing standards; using data to make decisions; linking data to district decisions; designing units; buying materials

Data Samples:
"so the strategies that we used was we looked at first of all the research around instruction around content area and look at the state model standards."
"And we started talking about so what is the difference between the traditional instructional system and the standards based system"
"Our occupational education program, there were standards and benchmarks set in everything."
"So again we started putting money aside because what happened was that teachers knew they had to answer these questions and get to a certain point before they could come and ask for materials."
"So, we actually created proficiencies or assessments that kids have to pass at the high school level in order to graduate in language arts, math, and science."
"There was a whole framework of, 'this is not a fad that's coming through that we're going to jump on the bandwagon. This is not the latest staff development craze. This is something we believe in our heart of hearts is good for kids.'"
"If people implement standards really from the standpoint of creating an objective of what a student should know and be able to do and they don't impact the instructional strategies then we might as well not even have standards"

Dissemination Definition: How an organization transmits the knowledge and meaning to its personnel.
Table 3.5 Coding Rules for Dissemination

Exemplars: Professional development; replacement units; lists of standards; informational seminars; newsletters; staff meetings; discussions; district plans

Data Samples:
"Another way they gain it is through level meetings, secondary and elementary principals meeting."
"The main thing right now we have is causing a little bit of interest is that assuring the essentials document vision implementation plan."
"maybe publication of something whether it be a wall chart or something in their classroom where they could see the standards and benchmarks."
"We provided a variety of different workshops the teachers could go to."
"We created a plan where we have standards-based education trainers and they were used to facilitate our summer Professional Development Centers."

Utilization Definition: How knowledge and meaning is used to alter decisions, behavior, or culture in an organization.

Table 3.6 Coding Rules for Utilization

Exemplars: Percents of teachers using; classroom assessment; reporting; student understanding of requirements; school implementation/use

Data Samples:
"When kids start saying not what is my grade but what are the kinds of things I have to do or how can I meet proficient level, to increase their learning that's better than does it count."
"I would say 80% of our elementary and only 40% of our high school teachers use planners."
"And it was interesting because schools were at different points. We'd call on the schools but we were all going, doing things in a different way. Which I think was healthy but it wasn't always beneficial to us."
"One of the things that came out of this was accountability and how could we get the accountability. And what they decided a standards type system to bring that accountability in line."
"Establishing the big steps, six math standards and saying, 'We're going to test over all six,' forces us to integrate. I think it forces integration of the curriculum."

Phase Two. The second stage of analysis was to code the data relative to learning orientations within each particular district, using preexisting codes from DiBella and Nevis' model of learning orientations to determine fit. This helped answer the second part of dissertation question one: Using DiBella and Nevis' (1998) description of organizational learning, what orientations do LEAs use to learn about and implement current state-mandated standards-based reform policy? An orientation, according to the original DiBella and Nevis model (1998), means any consistent pattern for how an organization acquires, interprets, disseminates, and utilizes knowledge, and is primarily descriptive in nature. According to this model, however, general orientations also contain variations or approaches that fall along a continuum. These approaches, which are more stylistic in nature, can also be defined by exemplars that help sharpen the descriptions of the bi-polar orientations within an educational setting. These bi-polar orientations form a simple rubric upon which district responses can be categorized to help cross-case comparisons. A rubric was used to categorize overall district response rather than straight counts of codes because of the different numbers of people interviewed in each district and the difficulty in equating counts of a code to its depth among multiple respondents.
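To illustrate the counting problem that motivated the use of a rubric rather than raw code tallies, consider a minimal sketch (the district names and tallies below are hypothetical, not drawn from the study data): normalizing code counts by the number of respondents shows how a district with more raw mentions of a code can still show a lower per-respondent rate.

```python
# Illustrative sketch only: why raw code counts mislead when districts
# have different numbers of interview respondents. All figures hypothetical.

def code_rate(code_counts, n_respondents):
    """Return code frequencies normalized per respondent."""
    return {code: count / n_respondents for code, count in code_counts.items()}

# Hypothetical tallies of an "external source" acquisition code.
midplains = code_rate({"external_source": 12}, n_respondents=4)
river_valley = code_rate({"external_source": 18}, n_respondents=9)

# River Valley has more raw mentions (18 vs. 12) but fewer per respondent.
print(midplains["external_source"])     # 3.0
print(river_valley["external_source"])  # 2.0
```

Even this normalization, of course, cannot capture the depth of a code among respondents, which is why a qualitative rubric was preferred.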
This analysis was more fine grained to help begin understand variation by district case. Data were read, analyzed and coded using ATLAS.ti software using previously defmed learning under each of Huber's four areas. More specifically, using DiBella and Nevis' operational definitions, exemplars and data samples emerged that helped to anchor rules for coding and help the validity and reliability of those codes. Operational definitions shown were determined after initial analysis that some terms did not fit because of the model's original focus on business. In addition, some definitions were clarified after going through interrater reliability study because some codes seemed to overlap and could not differentiate data points. These are included in the following tables. Because ofthe model's original focus on business organizations, some operational definitions were changed to reflect the nature of educational organizations. In addition, those orientations that did not help differentiate district responses were eliminated. Learning orientations under acquisition Table 3. 7 Coding Rules for Knowledge Source l.Knowledg Operational Definition: Preference for where districts acquire new e Source knowledge Approach Internal. .................................... ...................................... External Exemplars Data analysis processes McREL Central office personnel Universities Teacher/ principal leaders BOCES School based problem solving CDE District staff development Consultants Research journals Data Samples "We have the administrative "I also believe that it's important to seminars and we have classes for send teachers and principals to teachers that teach teachers and professional development administrators how to use opportunities." 90

PAGE 105

that data, how to look at it and how "We're talking to Paula Rutherford, to analyze it." Rick "and what we've done is gotten into Stiggins, Bill and Anna Rourke. and its working successfully here is They've been, when we work with school by school, the team by team people, Tom Gusky, it's usually on an kind of intense training that last a ongoing school year or more. Where we can basis. The majority of those people really examine their own practices. work in here once every 2 months or once every month. Table 3.8 Coding Rules for Learning Focus 2.Learning Operational Definition: Preference for what knowledge districts Focus attempt to gain Approach Content ........ : ........................... .......................................... Process Exemplars Standards Assessment Curriculum Instructional planning Materials Data analysis Instructional methods Data Samples "we adopted a standards based math "But in learning to create better and curriculum so that better, technically better assessments." training is on that content so part of it "so there's nine years worth of work is process, the instruction of process, that put the high value on assessment and I would think as a tool that we can actually use as an most of it has just been in content." instructional tool, not just a measure "All they are is basic good for accountability." curriculum kinds of questions. "So that really talks about different There's nothing overly creative about instruction, talks about effective them but those questions actually learning strategies that all teachers ended up developing in a sense or have to know and be able to do well." defining what good curriculum should look like." Table 3.9 Coding Rules for Learner Focus 3. Learner Operational Definition: Preference for who learning is aimed at Focus Approach Individuals ............................... ........................................... 
Exemplars (Individuals): Teachers; Principals
Exemplars (Groups): Whole schools; District committees; Leadership groups
Data Samples (Individuals): "You would have some administrators and key teachers go to some of those BOCES standards sessions." "Because we realized that teachers didn't always have the content level knowledge themselves to be successful." "First it started with building principals and administrators to build a foundation with instructional conversations."
Data Samples (Groups): "Allocated some budget in the form of competitive grants for various schools to apply toward solving problems for achievement." "But it was a district conversation and groups came together, you know, pretty flexible grouping, grade level and then district level." "And we went through that and trained all of our staffs in our elementaries."

Learning orientations under interpretation: None were described in the original DiBella and Nevis model. DiBella and Nevis disagreed with Huber in that they believed that interpretation was an integrated element of all learning processes, and therefore organizations did not show any distinct orientations in this process.

Learning orientations under dissemination:

Table 3.10 Coding Rules for Dissemination Mode

4. Dissemination Mode
Operational Definition: Preference for how knowledge is shared
Approach: Formal-Information delivery ..................... Formal-Knowledge construction
Exemplars (Information delivery): Workshops; Consultants; Classes; Presentations; Meetings
Exemplars (Knowledge construction): Unit design; Data analysis; Facilitators; School-based problem solving; Collegial discussions
Data Samples (Information delivery): "We gathered input about proficiencies and then had feedback meetings with the public where teachers and parents could come to hear what the standards were." "A third way they gain it is through the leadership council which is the entire leadership team in the district." "And of course we had community people and those same teachers and administrators lead those facilitation groups and then we invited our accountability committee to participate in our district-wide accountability committee."
Data Samples (Knowledge construction): "We really have to have collegial and instructional conversations." "And to actually sit down and work on unit organizers or course organizers and basically just to share a wealth of knowledge that each of them brings to the table." "And each pair of sites were collaborating. They were producing material. It was your resource toolbox."


Table 3.11 Coding Rules for Knowledge Reserve

5. Knowledge Reserve
Operational Definition: Preference for how knowledge is documented
Approach: Explicit-informative link to classroom .......... Implicit-non-informative link to classroom
Exemplars (Explicit): Units; Replacement units; Assessment tasks; Achievement analysis protocols; Unit modification feedback
Exemplars (Implicit): Lists/notebooks of standards; Standardized test results; Textbooks; District plans; Curriculum maps
Data Samples (Explicit): "And all the achievement data now is tied to standards. So instead of saying, 'These are our reading scores,' we say, 'These are the scores on how our students are able to do this.' And we list it." "We also provide a toolbook for teachers. We have a whole tool kit for teachers that they can use for self-evaluation on standards. So that they can go through after they've done a lesson and they can self-evaluate. We have a notebook that, on different tools that teachers can use." "All units from the PDC are piloted and then feedback is given to improve it which is given to all at that level."
Data Samples (Implicit): "If you were to read the Assuring the Essentials document or would have a conversation about philosophically what it means to have a standards based system, there's not a lot you can argue with." "Yeah, we have some little booklets that are being reprinted to give people some ideas about what's expected and what the processes are." "Yeah, we have loads of notebooks with standards and benchmarks listed."

Learning orientations under utilization:

Table 3.12 Coding Rules for Learning Scope

6. Learning Scope
Operational Definition: Preference for the types of change for learning efforts relative to the intellectual quality of student learning experiences
Approach: Incremental ..................................................... Transformative
Exemplars (Incremental): Increase of basic skills; Textbooks
Exemplars (Transformative): Problem solving; Constructivist learning; Demonstrations of knowledge; NCTM Math Standards; Communication/Reasoning; Long-term projects
Data Samples (Incremental): "Standards are just about defining what we've always been doing." "We just needed to get better materials that match what the state standards ask us to do."
Data Samples (Transformative): "Standards by themselves don't do much. They need to be designed into big projects that ask students to find information, put it together, communicate it somehow. That's how you know if they learned it." "We have been focusing on the NCTM standards. You know getting them to work together, reason more, communicate the why of what they know rather than just pass a test of skills."

Table 3.13 Coding Rules for Value Chain Focus

7. Value-Chain Focus
Operational Definition: Preference for emphasis of learning investments
Approach: Things ............................................................. Human resources
Exemplars (Things): Textbooks; Workbooks; Programs; Classroom supplies
Exemplars (Human resources): Professional development; Incentives; Stipends
Data Samples (Things): "So again we started putting money aside because what happened was that teachers knew they had to answer these questions and get to a certain point before they could come and ask for materials." "When you see that over the last couple, well three years now, four hundred, five hundred thousand dollars set aside every year for materials."
Data Samples (Human resources): "So the support really comes from staff development directed allocations." "We spend a great deal of time and money on teachers."

Phase Three: The third stage of analysis was similar to phase two in that I coded the data from particular districts relative to preexisting codes of facilitating factors


from DiBella and Nevis' model to determine fit. This helped answer dissertation question two: What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies? Data were read, analyzed, and coded using ATLAS.ti software according to the previously defined facilitating factors under each of Huber's four areas. Facilitating factors, according to DiBella and Nevis (1998), are those personal, social or organizational resources or practices an organization uses to promote or enable learning. Using DiBella and Nevis' operational definitions for facilitating factors, exemplars and data samples emerged that helped to anchor rules for coding and to support the validity and reliability of those codes. These are included in the following tables. Because of the model's original focus on business organizations, some operational definitions were changed to reflect the nature of educational organizations. In addition, those factors that did not help differentiate district responses were eliminated.

Facilitating factors under acquisition:

Table 3.14 Coding Rules for Facilitating Factors Under Acquisition

1. Scanning
Operational Definition: Process for seeking information about the external educational and policy environment
Exemplars: Visits; State committees; National conferences
Data Samples: "We looked at first of all the research around instruction around content area and look at the state model standards." "And, as we had people that had direct job responsibilities with curriculum and instruction attend these meetings, started to come back, talk about what we were going to need to do in terms of standards, what was going to be the expectation from the state."

2. Performance Gap
Operational Definition: Shared perception of gap between current and desired student performance
Exemplars: CSAP data; Data charts
Data Samples: "When we look at our data we look at changes that we made, we look at the curriculum. Look at the number of kids that are being successful and there were kids that aren't being successful. It's pretty clear that we're not being successful with second language students." "And the other thing is that all of our achievement is put together, you know, across the district so they're looking for, you know, big disparities in scores and whatnot."

3. Concern for Measurement
Operational Definition: Shared perception about the need to define and attend to key factors relative to student achievement
Exemplars: Assessment plans; Achievement analysis protocols; Linking of multiple forms of data
Data Samples: "We're looking at the data and the data comes from the classroom, from the kids, and that is, comes from the teachers and what they're teaching in the classroom." "I think probably we've gotten better at understanding our assessments and using our assessments to inform instruction, help make instructional decisions. We've got better information on where children are."

4. Organizational Curiosity
Operational Definition: Shared perception about need for new ideas and experimentation
Exemplars: Data questioning; Research base; Self-correction
Data Samples: "And we started talking about so what is the difference between the traditional instructional system and the standards based system, what should be happening?" "And the question from us, from the principal, is what data do you have to support that's a good thing to do?" "But that doesn't mean we're not afraid to correct if we need to but I think because we ask tough questions that I think keeps us from making any big mistakes."


Facilitating factors under interpretation: None were described in the original DiBella and Nevis model.

Facilitating factors under dissemination:

Table 3.15 Coding Rules for Facilitating Factors Under Dissemination

5. Climate of Openness
Operational Definition: Shared perception of trust and open communication built on mutual relationships
Exemplars: Union leadership of reforms; Teachers as evaluators of content; Common dialogue; Visibility of leaders
Data Samples: "It's fascinating to them that teachers will sit around and talk about real issues. They're talking about instruction and assessment and not in negative ways." "It's like a lot of common sense and focus on kids and listening to people. It's amazing the ideas that people have." "We had been developing assessments in all of those areas and teachers were really involved in that process and so they were pretty comfortable with it." "And more important, I think, than the actual money, is the strong message that goes to teachers. This is important work. You are the experts who need to do it. And we value your time and expertise."

6. Continuous Education
Operational Definition: Resources and formats for high quality on-going learning
Exemplars: Summer institutes; Early release days; Clear philosophy; Research-driven models
Data Samples: "The other piece that we've been real clear about in the district is the whole staff development/implementation relationship and how staff development really need to follow what we call adult learning models." "Well again we've done a lot of work with national staff development and we use a lot of their models." "Professional Development Center, in giving teachers collegial time to work. To talk about, with each other, what it means and what it might look like. That's a major decision. Lots and lots of opportunities, too."

Facilitating factors under utilization:

Table 3.16 Coding Rules for Facilitating Factors Under Utilization

7. Operational Variety
Operational Definition: Shared perception for appreciating different methods and procedures
Data Samples: No clear data emerged to differentiate district responses.

8. Multiple Advocates
Operational Definition: Multiple champions for a cause exist at all levels
Exemplars: Community expectations; Teacher leaders; Building liaisons
Data Samples: "So that took part with teacher committee facilitated by content area coordinators." "So we would call together principal from every building and a couple teachers, include a teacher from every grade level, parents, some students, and we sat around and talked, learned about what needed to be done and developed a process." "At every step, community members and teachers and administrative teams that work together to establish standards, to create and revise and review tests. To work on graduation requirements."

Facilitating factors not specific to a learning process:


Table 3.17 Coding Rules for Facilitating Factors Not Specific to a Learning Process

9. Involved Leadership
Operational Definition: Leaders are personally and actively engaged in learning initiatives
Exemplars: Principals as staff developers; Superintendent as teacher; Principals attend summer workshops
Data Samples: "Then 2 years ago when DW came and had the vision that matched what we had been talking about, that gave us that much more information." "In the 2 years he's been here, he's made a dramatic impact on learning of all individuals about standards based systems." "We've done a good job of providing professional development in this area for administrators first and we've had leadership seminars." "I'm not sure, at the elementary level, that it can happen without them. I think that's different. Elementary principals tend to be instructional leaders."

10. Systems Perspective
Operational Definition: Shared perception about the interdependence of organizational parts and especially the linking of direction among all buildings within a district
Exemplars: Budgeting based on achievement; Design of professional development based on achievement; Data analysis that links achievement to evaluation in curriculum; Curriculum alignment
Data Samples: "So it's a combination of using the data, having the data, knowing how to use it, analyze it, but also, then also establishing the staff development framework to make it work, to make that change." "Looking at the whole philosophy as well because what we want to do is incorporate all of our efforts in human resources and I guess and again another area I mentioned is the teacher evaluation coupled with clear instructional goals." "I think it means that a clearly articulated statement of what kids should know and do drives everything. It means that every decision made in the district focuses on a common goal that says kids have to know and do X."


Phase Four: Last, responses that were not adequately accounted for in the framework and model were analyzed to address the model's applicability and appropriateness for understanding school district organizational learning and policy implementation. Using the literature on educational policy implementation and recent capacity research as a guide, the new constructs that emerged were coded and defined to help further answer dissertation questions one and two. New or redefined constructs for both learning orientations and facilitating factors that emerged in this step are included in the tables below.

Emergent learning orientations and facilitating factors under acquisition:

Table 3.18 Coding Rules for Use of Data

1. Use of Data
Operational Definition: Preference for the use of data as knowledge
Approach: Purposeful ...................................................... Casual
Exemplars (Purposeful): Analysis protocols; Reporting and planning; Linking of data to district processes; Data drives decisions; Benchmarking
Exemplars (Casual): Looking at results; Sharing with staff; Copies to all staff members; Coverage of curriculum
Data Samples (Purposeful): "And so reversing it we put student data on the table and started to learn how to look at it and not be as afraid of it. And then look for patterns and then to have it, step one, drive our school improvement." "And we're reaching a point now, this year, where we will do our first year of public reporting of reading and writing at the elementary level." "But the very first time I used it, she helped me set up the data analysis workshop, if you will, after CSAP results. And together we scripted what needed to happen."
Data Samples (Casual): "We always look at data, at the bottom line, and try to improve." "The tests let us know if we are covering the right things with our curriculum."

Table 3.19 Coding Rules for Emergent Facilitating Factors Under Acquisition

1. Policy
Operational Definition: Using the perception of the power of external policy as leverage for advancing an internal agenda
Exemplars: Good guy/bad guy scenario; The state says we have to; CSAP asks us to change
Data Samples: "And they also were visionaries and saw that that was something that was going to come from the state eventually that might help achievement." "There's always, you can always draw back and say, 'It's the law.' You know, it's the ultimate motivator. Which hasn't hurt us at all." "It (policies) was an opportunity for us to use staff development time to talk about what is truly important, what achievement looks like."

Emergent learning orientations and facilitating factors under interpretation:

Table 3.20 Coding Rules for Interpretive Mechanism

2. Interpretive Mechanism
Operational Definition: Preference for how the meaning of reform is linked to changes in student experiences in the classroom
Approach: Implicit .......................................................... Explicit
Exemplars (Implicit): Teachers are the experts; Coverage of standards; Change instruction; Processes for deciding standards
Exemplars (Explicit): Assessment as work/direction for classroom; Unit/replacement unit design; Data driven decision making; Changes in grading practices; Students understanding exit requirements


Data Samples (Implicit): "Teachers have always been the experts who know the content. They have always had standards." "And understanding first of all what standards based instruction is. To become standards-based really means teachers change instruction." "Well we started off with building instructional repertoire. We were looking at the research on teaching and learning as a whole and finding that there was lots of patterns and people doing the same old stuff." "I think we had to question our grading system as opposed to learning what we wanted, you know, what's the goal? Parents really believe they know what grades mean and they don't. Because we don't know what they mean."
Data Samples (Explicit): "Alignment of assessments to the curriculum based on district assessments in specific content area. So that piece is going on now, writing district assessment, giving the district assessments, doing alignment, doing the review and looking at the data comparing that to state assessments, district assessments and then individual classroom assessments." "They were divided up into groups which forced P through 12 teachers to be in every group so that it was a broad perspective of listening and thinking about what students should know and be able to do." "Our PDC works on instructional, course planners. We take a hard look at what the standard really asks for and how students would have to prove that to me as a teacher."

Table 3.21 Coding Rules for Interpretive Orientation

3. Interpretive Orientation
Operational Definition: Preference for why the ideas in reform policy matter
Approach: District processes ........................................... Student achievement
Exemplars (District processes): Improve curriculum; Add new programs; To be in compliance; Increase of technology usage
Exemplars (Student achievement): Higher rates of achievement; Student thinking; Student products
Data Samples (District processes): "We didn't acquire knowledge about what to do and how to do it. We decided what to do and how to do it. We really tried to improve our whole process for adopting standards that we could use for the future." "I'd like to say we would have done it without the policy but it forced us to move quicker."
Data Samples (Student achievement): "I mean all I can think about is higher expectations and achievement for kids." "Absolutely sitting on status quo is not what we do. And all kids achieve very well, traditionally always have. It's not like we were in this place where we had to change dramatically because we were bombing. That's not it." "Standards-based education is assuring that there's a level of performance. Not the difference between, different than a level of knowledge, OK? We are moving to performance. Demonstrated performance. Application of the knowledge rather than just seat time and passing grades, getting the grade."

Table 3.22 Coding Rules for Emergent Facilitating Factors Under Interpretation

3. Leadership Cognition
Operational Definition: Personal resource of the depth of what a leader understands about reform efforts to leverage orientations and resources. It also means the degree to which multiple leaders possess common cognitive maps about the reform.
Exemplars: Principal as a true instructional leader; Superintendent showing design of assessment plan; Principal's ability to use the design language
Data Samples: "That I was the first person that started talking about in large groups and with principals, well, what does that mean?" "I think we have some very strong personalities who totally understand standards based education here." "Well and I think they've mostly been out of the classroom with teachers for a long enough time that they don't see the necessity for a change in the way we deal with kids."

4. Focus
Operational Definition: A shared perception of the clarity of direction for learning efforts and district vision
Exemplars: Professional development looks similar year to year; 3-5 year plan focused on ....
Data Samples: "There's an expectation around that alignment so there's not a presentation from the district level to administrators and principals that doesn't talk about what our direction is." "But then there's also, I call it the instincts you have, that you don't want everything going all over the place but keep it focused and you're able to do that to get that sense that this is out here, we got to pull it back in, what is this area, so we're not all over the place." "Kind of constantly reiterating, 'This is what it looks like. This is where we are headed. Come with us.'"


Emergent learning orientations and facilitating factors under utilization:

Table 3.23 Coding Rules for Emergent Facilitating Factors Under Utilization

6. Accountability
Operational Definition: Formal and informal processes and shared perceptions used to assure use of reform ideas/new knowledge
Exemplars: Reporting; Evaluation; Achievement as part of principal evaluation; Required school improvement; CSAP pressure
Data Samples: "So even though they now close their doors, the accountability is so high around instruction and student achievement, which is a good thing, that that's going to I think delimit some of that teachers getting stuck on favorite topics." "Well I think the accountability awareness is incredibly helpful in a way so no one likes those tests, the tests, but they're certainly paying attention to them." "Pressure from the state, pressure from the building administrator. Pressure from, I would say in general, you know, it's their own personal beliefs if they want, they want kids to succeed." "We're the external monitor. And then we make a report to the board. We check off, we approve whether or not some of these do what they're supposed to do."

7. Resources
Operational Definition: What and how a district uses and leverages to help learning, including time, money and structures

In all, these four phases were used to categorize orientations and facilitating factors for each of the three cases and their interpretations. The complete model for understanding organizational learning capability is shown below in table 3.24. Differences in orientations and use of facilitating factors are explored in the case studies presented in chapters four, five and six.


Table 3.24 Organizational Learning Capability Model for Local Education Agencies

Acquisition
Learning Orientations (describe how learning occurs and what is learned):
1. Knowledge Source: Internal ..................................... External
2. Learning Focus: Content ......................................... Process
3. Learner Focus: Individual ....................................... Schools
4. Use of Data: Purposeful .......................................... Casual
Facilitating Factors (practices or conditions that promote and enable learning): Scanning; Performance gap; Policy; Concern for measurement; Organizational curiosity

Interpretation
Learning Orientations:
5. Interpretive Mechanism: Implicit ............................ Explicit to classroom experience
6. Interpretive Orientation: Processes ......................... Achievement
Facilitating Factors: Leadership cognition; Focus

Dissemination
Learning Orientations:
7. Dissemination Mode: Delivery ................................ Construction
8. Knowledge Reserve: Explicit link ........................... Implicit link
Facilitating Factors: Climate of openness; Continuous education

Utilization
Learning Orientations:
9. Learning Scope: Incremental ................................... Transformative
10. Value Chain: Things .............................................. Human
Facilitating Factors: Multiple advocates; Accountability; Resources

Not specific to a learning process: Involved leadership; Systems perspective

Contiguity-based Relations

A second way to analyze qualitative data is to use contiguity-based relations that help show connections in actual contexts (Maxwell & Miller, 1998, p. 1). Connection strategies link data to show both reciprocal influences and to develop theory. This


method of analysis helped answer dissertation question three: Using learning orientations and facilitating factors, what interpretations were constructed by Local Education Agencies (LEAs) about state-mandated standards-based reform policies? And question four: How do differences in learning orientations and facilitating factors explain variation in the interpretation and implementation of standards-based reform policies? These interrelationships are explored primarily in chapters six and seven.

Validity and Limitations of the Study

The methods I utilized during this research, like all research methods, have a number of limitations. A major limitation of this study was that the pool of interviewees came only from administrators. In order to deal with threats to causal inference (Maxwell, 1998, p. 4) in qualitative research, I undertook a number of exercises during data collection and analysis to minimize these limitations and to help make my conclusions more valid. Most of these strategies operate primarily by seeking evidence that either challenges the proposed explanation or that would rule out alternative explanations. First, the parochialism (Spillane, 1993, p. 232) of local accounts was offset by constantly referring back to previously reviewed literature in order to gain a wide variety of perspectives.


Second, by using multiple information sources and analysis methods through triangulation, I was able to reduce the risk of bias in this study. In this sense, bias was reduced not merely by adding more data, but by looking for multiple pieces of evidence that support either a category or a connection made through analysis. This explains why multiple informants and sources were used, so that accounts could be verified and collective understandings gained. Third, subjective response by the researcher is often a problem in qualitative research. Researchers' biases, values, beliefs and prior understandings before entering a site for data collection greatly influence how the research is carried out and how the collected data are analyzed. To avoid these problems, I undertook three methods to reduce researcher subjectivity. First, my data collection summary included a section to record my reactions, which I analyzed for creeping biases or to reconsider questions or analyses. Second, during interviewing I used a "member check" strategy in which feedback on analyses was sought from the people I was studying. Last, I systematically undertook an interrater reliability study to ensure the technical quality of my study, including the reliability, independence, and utility of codes. Using a group of teachers who were participating in a local graduate research class for an Educational Administration Master's program, I met with the professor and solicited volunteers. Three teachers volunteered, completing this work in lieu of their assigned class project.


To determine the interrater reliability of my codes and coding strategy, we undertook the following steps.

Step one: I explained the general problem of my dissertation and the theoretical models used as the framework to understand the problem.

Step two: I explained general coding strategies and used tables 3.2 through 3.23 to address coding rules.

Step three: Using 20 randomly selected codes, we used the general model of acquisition, interpretation, dissemination, and utilization to anchor our meaning. Next, each scorer was given another random 20 codes to score by themselves. Percentages of exact agreement were calculated and results are shown below. Specific problems were found with utilization that led to minor changes in definition.

Step four: Following the method in step three, codes for learning orientations were used. Results are shown below. Specific problems were found in differentiating interpretive orientation from learning focus and value chain that led to minor definition changes.

Step five: Following the method in steps three and four, codes for facilitating factors were used. Results are shown below. Specific problems were found with differentiating concern for measurement, performance gap, and accountability that led to minor definition changes.

Step six: Five random pages of codes were given that contained both learning orientations and facilitating factors. Pages were marked with chunks of text to code so that only one primary code would have to be attributed to each chunk. Scorers were told to note only one code per chunk of marked text, but to note if they thought the chunk could contain more than one code. Results are shown below, including the percentage of times raters agreed that a chunk could have contained more than one code. Specific problems dealt primarily with differentiating between orientations and facilitating factors for data and measurement that led to further definition clarifications.
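The exact-agreement percentages used in these steps amount to a simple ratio of matching codes to total codes scored. A minimal sketch of that calculation (the code labels and rater data below are hypothetical, not the study's actual codes) might look like:

```python
# Percent exact agreement between a rater and the original coder.
# Code labels here are hypothetical stand-ins for the study's codes.
def exact_agreement(original, rater):
    """Fraction of items on which a rater's code matches the original code."""
    assert len(original) == len(rater)
    matches = sum(1 for o, r in zip(original, rater) if o == r)
    return matches / len(original)

original = ["scanning", "performance gap", "policy", "focus", "scanning"]
rater_1 = ["scanning", "performance gap", "focus", "focus", "scanning"]

print(f"{exact_agreement(original, rater_1):.0%}")  # prints 80% (4 of 5 match)
```

Exact agreement is the simplest such index; it does not correct for chance agreement, which is why the definitional refinements between rounds mattered.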
Table 3.25 Interrater Reliability Percentages (agreement with original coding)

Phase                            Rater 1        Rater 2        Rater 3        Total
1 - General model                15/20 = 75%    18/20 = 90%    17/20 = 85%    50/60 = 83%
2 - Learning orientations only   13/20 = 65%    19/20 = 95%    18/20 = 90%    50/60 = 83%
3 - Facilitating factors only    16/20 = 80%    15/20 = 75%    14/20 = 70%    45/60 = 75%
4 - All codes together           32/43 = 74%    34/43 = 79%    38/43 = 88%    104/129 = 81%
    (agreement that a chunk
    could contain multiple
    codes)                       10/17 = 59%    6/17 = 35%     12/17 = 71%

Fourth, I undertook a wide variety of comparison methods to ensure my conclusions for categories and connections were valid. First, multiple case sites and cross-case comparisons allow for a greater degree of generalization. Also, all data analysis was compared to literature previously reviewed so that I considered all plausible explanations. Finally, as discussed above, using a case description for informants helped validate findings from an insider's perspective.

Conclusion

In sum, standards-based educational policy provides a unique context to study LEAs' organizational learning capability. To study the LEAs' role in learning from and about this type of policy, collective, social models of learning were used to orient the organizational learning literature to educational reform. To do this, I utilized a comparative, qualitative case study because of its strength in looking at phenomena such as implementation of educational policy, and its ability to look at both learning and temporal processes. Site selection and sampling were based on


substantial evidence and investment in implementing standards-based reform, variation in response to the policy, demographic pairings, and reputation of success. In all, three districts were chosen, two nominated as effective implementers and one as less effective. Data collection included structured interviews, documents, and observations. Data analysis occurred concurrently with data collection, and used both categorizing and connecting strategies relative to the research questions and the guiding conceptual framework. Last, various exercises were used to overcome threats to the validity of my findings. These included such tactics as referring back to the literature, using multiple sources for triangulation, soliciting feedback on data analysis, interrater coding checks, and using constant comparisons within and between cases for emergent findings.


CHAPTER 4

IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE OF LEARNING ORIENTATIONS

Standards-based reform policies in Colorado proposed ambitious changes in content, pedagogy and student achievement. State policy makers and Department of Education officials took a bold new step in venturing into instructional policy. Their ambitions ran from specifying what should be taught to modeling, through state-wide assessment, how student learning should look. These new policies proposed new ideas, and therefore implied learning for those charged with implementing them. Standards-based reform policies in the state of Colorado demanded considerable learning on the part of local educators. When passed, however, standards-based reform policies interacted with a deeply complex school district culture. While state policy makers and officials entertained certain notions about the meaning of standards-based reform, policies at the local level entered an environment where "the local slate is never clean" (Cohen and Ball, 1990, p. 333). District history and culture determined how a district acquired, disseminated, interpreted and utilized new ideas proposed in policy. Stylistic variation in learning orientations influenced what was learned and how it


was learned, or how districts in this study acquired, disseminated, interpreted and utilized ideas within state-mandated reform policy. These unique and stylistic variations of learning orientations will be contrasted through cross-case analysis of each LEA studied using the continuums discussed in the previous chapter. Continuums in this sense are not judgment based but allow a visual contrast along the continuum of emergent approaches. Districts were placed closest to the approach on the continuum that the coded data suggested. This chapter will primarily address the following question:

1. Using Dibella and Nevis' (1998) description of organizational learning based on Huber (1991): a) How did LEAs learn about standards-based reform policies; and b) What orientations did LEAs use to learn about and implement current state-mandated standards-based reform policy?

Acquisition

The acquisition of new knowledge for organizations refers to how organizations gain new knowledge through the experiences of their own personnel or indirectly through the experiences of other organizations. Orientations under acquisition included knowledge source, learning focus, learner focus, and use of data. Key differences between LEAs emerged across these orientations in this learning process that led to different understandings of standards-based reform.


Knowledge Source

Figure 4.1 Knowledge Source Rubric
Internal ......................................................................................... External
River Valley          Front Range          Midplains

Case 1 Midplains School District. The Midplains School District relied heavily on external knowledge sources to further their initial interpretation of standards-based reform as content. According to Dr. Rodriquez, Midplains' Superintendent:

... we felt it was important at different points to have people come in to give us some thoughts, some reminders, some suggestions ... we thought it was important to study the research, to listen to others but that doesn't mean you shouldn't also be a critical listener and then you look at yourselves and what we need and what do we know. (Rodriquez interview, March 1999)

While some teachers and administrators took part in local BOCES and CDE workshops, Midplains focused their resources for knowledge acquisition on hired curriculum consultants to help their interpretation of reform as content. Both Ms. Adams, Midplains' Staff Development and Curriculum Coordinator, and Dr. Rodriquez discussed the importance of the curriculum consultants:

And so we thought you know we got to make sure that they get the kind of help they (teachers) need. So in every one of those areas we hired a curriculum specialist. Now if it was a teacher in the district that wanted to take that on and thought they could commit to that, do quality work, we said ok. But in most areas there wasn't anybody that thought they could do that or wanted to do it. And so we hired folks from the outside who met certain criteria and credentials and they came in and helped serve as a facilitator and expert in a sense. (Rodriquez, Adams interview, March 1999)

In addition to hiring outside curriculum consultants, Midplains also contracted with Dr. Fenwick English. Dr. English is a well-known curriculum theorist and


helped Midplains develop the process for curriculum reform through long-distance seminars. Similarly, Midplains also hired McREL, one of the nine federally supported educational laboratories, to assist them in finding assessment tools to measure student achievement of standards. McREL was hired to:

... go out and do that and you know, work for us, bring back samples, bring back costs, bring back the advantages, disadvantages, that sort of thing, of the assessments. And then they came back and presented that information to each of the content areas and so these groups now, they would meet in these big groups of teachers P through 12, and they're looking at these assessments. (Rodriquez interview, March 1999)

While Midplains relied heavily on external knowledge sources in the beginning, administrators interviewed also alluded to the fact that more internal knowledge and ideas had begun to generate.

And then we have teachers. It's interesting especially through grade six, and we have a little bit in some other levels but especially through grade six, teachers have developed support groups and they'll reflect. And then they will go to the principal or even give one of us a call, you know we were just thinking what if, or we're facing this or is there a way we could do this. (Rodriquez interview, March 1999)

Case 2 River Valley School District. The River Valley School District relied heavily on internal knowledge sources to further their decade-long reform efforts. Following initial efforts to require higher achievement through proficiency testing for high school students, Mrs. Glitton, River Valley's first Director of Assessment, helped their local BOCES design initial training in understanding standards-based education. Mrs. Glitton also worked on the "Eagle Rock" team, a collection of educators from around the state who met at Eagle Rock School to help clarify


characteristics of a standards-based classroom and design professional development modules. However, this work fueled the drive to base further knowledge acquisition internally. According to Mrs. Jackson, River Valley's current Director of Assessment, this external work drove many internal questions and iterations of what standards-based reform would look like in River Valley. This internal dialogue, "Constantly reiterated this is what it looks like. This is where we're headed. Come with us" (Jackson interview, March 1999). From this beginning, River Valley used Mrs. Glitton as the primary trainer and spokesperson to train administrators, teachers and standards-based trainers who helped train each building faculty. According to high school principal Mrs. Williams:

That's what I think did it in this district to begin, because there's not one person who's been involved in any of our training who's been an outsider. I mean it seriously did start with Mrs. Glitton and probably there were a couple of elementary principals involved in, like I said, it's kind of mushroomed and kept growing. And so then what we've used in our building in best practice this session is teachers who really caught on to it, training and helping other teachers. And that's how we spent a lot of our staff development time. Also every one of our new teachers has a teacher mentor trained in standards so that no matter what college they came from they will have standards-based training ... Our whole staff development program revolves around peers working with peers which seems to be just really, our teachers love it. (Williams interview, March 1999)

In addition to early standards-based training run by teachers, River Valley developed their own PDC (summer Professional Development Center) run jointly by the teachers' union and district. The design of the PDC allowed teachers to work


with each other on instructional planning and assessment, and on common performance levels for student assessment. Last, because River Valley focused on achievement results, data became the internal source of knowledge. If, for instance, River Valley saw a decline in achievement in some area, district administrators designed a process of "triangulation" in which they analyzed course and unit organizers, teacher evaluations and student interviews to decide on specific interventions to raise achievement (Dr. Zeus, Human Resource Director, personal communication, October 1999).

Case 3 Front Range School District. The Front Range School District used a mix of external and internal knowledge sources to bring to view their multiple interpretations of standards-based reforms. For instance, prior to the arrival of their new superintendent, Dr. Wright, Front Range relied heavily on external knowledge sources to adopt standards and to begin implementing their reform efforts. For instance, when adopting standards:

... we decided that would be a review process so strategies that we used were we looked at first all of the research around instruction around content areas and looked at state model standards ... What does the research piece say for the projection of the future and so what do we need to revamp and re-look at as we looked at adopting new materials and curriculum. (Director of Secondary Education Grant interview, June 1999)


In addition, Front Range used an extensive list of outside consultants to help with their reform efforts. According to Front Range's Director of Staff Development, Mrs. Klein:

We're talking to Ruth Worman, Paula Rutheford, Rick Stiggins, Bill and Ann O'Rourke. They've been, when we work with people, Tom Guskey, it's usually on an ongoing basis. The majority of those people work in here once every two months or once every month. They don't just give you one shot then disappear. They, we have ongoing, a very much an ongoing relationship with them. (Klein interview, June 1999)

Front Range, however, also relied quite heavily on internal knowledge sources. For instance, district leaders including Superintendent Dr. Wright, Assistant Superintendent Dr. Hayes, and Director of Secondary Education Dr. Grant, along with Director of Elementary Education Mrs. Bench, discussed the ongoing leadership seminars for district leaders that were used to help acquire an understanding of "standards-based philosophy and instruction" (Wright interview, June 1999). The Research for Better Teaching initiative also relied on internal knowledge sources to improve teaching strategies in the classroom because it required schools to identify specific achievement problems and work as groups to find solutions. Similarly, Dr. Grant talked about the use of informal methods to help people better understand standards-based reforms:

I would say, and not being a smart alec, that I really brought that knowledge when I came. That I was the first person that started talking about in large groups and with principals well what does that mean? What does it mean to create standards-based systems? How does that look different? What are we doing to support it, not to support it? ... I brought the whole concept of book talks and having dialog based on research that we in turn applied to our building then to our district and to the work that we do. (Grant interview, June 1999)


In addition, the staff development courses in Front Range offered numerous topics, primarily on instruction, using various models for learning. However, when asked which knowledge source had had more impact on the district, Mrs. Klein and Dr. Grant offered contrasting beliefs:

I would say our knowledge acquisition is more external. (Klein interview, June 1999)

I think it's a combination of both, but I think that the larger majority of what creates a success is that internal piece. (Grant interview, June 1999)

Learning Focus

Figure 4.2 Learning Focus Rubric
Content .................................................................................................. Process
Midplains          Front Range          River Valley

Case 1 Midplains School District. In contrast to a learning focus based on instruction or assessment, Midplains focused their acquisition of learning around defining new content. The Midplains School District oriented all of their work for four years around what students should know or be able to do. In an attempt to influence what teachers taught in the classroom, 12 content groups were formed across all grades and schools to define standards for student knowledge. According to Dr. Rodriquez:

Then we answered those questions and went through again the content committees, the district wide committees, the Council, up to the Board of Education approving every curriculum work plan. (Rodriquez interview, March 1999)


By focusing acquisition of knowledge around content, schools in Midplains also began to adjust the design of courses so that all standards and benchmarks could get "covered". Mr. Bohag, middle school principal, alluded to the need for middle school teachers to begin teaching more integrated curriculum because one teacher could not cover it alone (Bohag interview, March 1999). Similarly, Mr. Wilson, high school principal, discussed changes in course sequencing and course offerings to help students meet new content demands.

We've gone through standards and benchmarks, written curriculum work plans. We have come up with specific classes at the high school, we have a different format now. For instance Freshman English, prerequisites for what they cover, what benchmarks fit in with that. We do that for all of the classes ... Math classes came up with new materials and started using them Algebra I, Algebra II. The classes beyond that were still fairly new and they decided to come up with a new transitional Algebra for kids coming up who couldn't handle the new content ... We also have a transitional class for English. They're starting to think rather than wait until they fail, let's try and help them out. Science classes are also using new materials this year and any new class has to meet standards we set. (Wilson interview, March 1999)

Case 2 River Valley School District. In contrast to a learning focus based on content, River Valley focused their acquisition of knowledge around the processes of instructional planning and assessment, and data analysis for teachers. River Valley oriented all of its staff toward using a different instructional planning process focused on operationalizing standards through performance-based assessment. According to Mrs. Jackson:

... We have spent a good nine years on putting a high value on assessment as a tool we can actually use. As an instructional tool, not just a measure for accountability. That's a whole different perspective. Instructional planning and assessment go hand in hand. Assessment is the other side of the coin. It's a way of organizing what was learned, and fortunately the two things match. (Jackson interview, March 1999)


An ingrained learning focus for River Valley also included processes for analyzing Colorado Student Assessment Program results. In addition, the process of data-driven instruction pervaded the district.

I mean, we are really working with our teachers on data driven instruction. They have to be able to show us artifacts to show that they are, indeed, using standards-based education in the classroom. That they have unit organizers, that they have course organizers, that they're using rubrics, that they have alternative assessments. So there's a real focus. (Williams interview, March 1999)

Case 3 Front Range School District. In contrast to a learning focus on content as in Midplains or instructional planning and assessment in River Valley, Front Range focused their knowledge acquisition on instructional strategies and differentiating instruction. Using philosophies and strategies previously discussed, Front Range concentrated the majority of its staff development efforts around instructional differentiation. According to Mrs. Klein, when first thinking about implementation, "... we really looked at the instructional piece of what we needed to be looking at relative to standards and achievement of students" (Klein interview, June 1999). This led to the design of the Research for Better Teaching initiative that offered teachers opportunities for summer workshops, study groups or research teams aimed at increasing instructional repertoires. As with many other orientations, Front Range showed the beginning of a transition toward other orientations. While Front Range initially focused learning on instruction, many in the instruction department also saw the need for more of a


content orientation and began the transition toward that orientation. For instance, Front Range had a statewide reputation for its middle school science curriculum. Likewise, according to Mrs. Bench and Dr. Grant, when modifying math curriculum:

We have had to do a fair amount of staff development because we realized that teachers didn't always have the content level themselves to be successful. That's particularly true in math, and right now at the elementary level we have most of our elementary teachers in the summer taking algebra classes. (Bench interview, June 1999)

Content training and development around just like in the math. We adopted a standards-based math curriculum so that training is on that content and instruction, so part of it is process, the instructional process, and I would think most of it has just been in content. (Grant interview, June 1999)

Learner Focus

Figure 4.3 Learner Focus Rubric
Individuals .................................................................................................. Groups
Front Range          River Valley          Midplains

Case 1 Midplains School District. In contrast to acquiring knowledge focused on individuals, Midplains' learning focus centered on groups. In this case, cross-district content groups jointly defined content for student learning. According to Mrs. Baker, elementary principal:

Well I'd say we used a participant structure. Most of our time was spent in groups. We had to, you know, begin with state standards, compared state and national and then sat down in the beginning with grade levels third and fourth grade so that would be building level and discussing how we, what does that look like for our level, how do we do that. But it was also a group conversation across the district, you know pretty flexible grouping, grade level and district level. (Baker interview, March 1999)


Similarly, in the design of the process Dr. Rodriquez alluded to the need to focus on groups as the focus for learning:

We took a day, and we had teachers go through a process to look at priorities, all teachers P through 12. They were divided up into groups which forced P through 12 teachers to be in every group so that it was a broad perspective of listening and thinking about what students should know and be able to do ... and it also was good to mix up people because it forced everybody to look at this and understand where we were headed. (Rodriquez interview, March 1999)

Case 2 River Valley School District. In contrast to Midplains, River Valley used a learning focus mixing both group and individual professional development methods. In the beginning, River Valley used standards-based trainers to train entire building faculties in the philosophy of standards-based education. According to Mr. Rich, elementary principal and accountability director:

We went through and trained all of our staffs. I'd say we probably had a 70-75 percent participation rate. There was a group of about 60 people trained to be SBE facilitators. They went through 6 or 7 days worth of training on what an SBE classroom was. And then they spent additional days planning training with their principals for their staffs. (Rich interview, March 1999)

In addition, River Valley's Board of Education approved an early release day for every Wednesday during the school year, primarily for individual schools to work on professional development issues specific to each site. District professional development needs which reinforced data-driven instructional ideas took one Wednesday per month (Superintendent Seneca interview, June 1999).


Many administrators also pointed to the benefits of individual professional development offered through the district's Professional Development Center (PDC). According to Mrs. Jackson:

PDC has been a huge help for teachers in giving teachers collegial time to work across different areas and schools. To talk with each other, what it means, what it might look like. That's a major resource, lots and lots of opportunities ... Many people return every year, it's their level of knowledge and understanding about assessments. It's going to take time to bring a critical mass up to speed but I see it happening. It's not exactly the idea of train the trainer. But the idea is for people to understand and help disseminate ideas. PDC was facilitated, planned by teacher leaders. (Jackson interview, March 1999)

The PDC, which paid teachers through credit, stipends or a combination, has also been used by principals to work on whole-school needs or issues. Dr. Zeus alluded to the flexible design of the PDC: "Sometimes you'd have two or three schools with their SBE trainers as they were called, who would come together and plan a learning focus for the whole school and plan that with principal input" (Zeus interview, March 1999).

Case 3 Front Range School District. Similar to River Valley, Front Range used a learning focus of both individual and group professional development methods, leaning more toward individual methods. According to Mrs. Klein, most administrators were trained in groups to share ideas about standards-based education and use a similar vocabulary. In contrast, teachers were offered more individual methods of learning so that they could learn in ways that fit their "time and style" (Klein interview, June 1999).


A transition was also occurring as more school-based group learning was promoted through Dr. Wright's entrepreneurial fund. This fund acted as a monetary incentive for whole-school action research to solve achievement problems. As part of Dr. Wright's intention to drive reform down to the school sites, these "learning projects" recognized an achievement problem, and through research would identify and implement solutions. The 1998-1999 school year saw three schools utilize this form of action research, and according to Dr. Wright seven more intended to participate in 1999-2000 (Wright interview, June 1999).

Use of Data

Figure 4.4 Use of Data Rubric
Purposeful .................................................................................................. Casual
River Valley          Front Range          Midplains

Case 1 Midplains School District. Another orientation that can lead to acquisition of knowledge focuses on the use of data. In contrast to a well-designed and purposeful method of analyzing data to show districts areas for growth, Midplains displayed a casual use of its data. Casual in this sense does not mean that district officials ignored data, only that the district did not use data as a feedback mechanism to improve performance. In fact, all officials interviewed talked at length about the need for more help in interpreting and using test results to improve the district or its


schools. When pressed about how her school used state assessment results to analyze needs in their curriculum, Mrs. Baker said:

Our writing scores were horrible, and we were kind of in la la land about it, you know denial. And when our scores came down, the thing that I used to motivate them was the idea that if you do what you've always done, you're going to get what you always got. And that's basically what we've been doing. (Baker interview, March 1999)

Case 2 River Valley School District. As previously discussed, the use of data as a purposeful activity to gain knowledge and make decisions was an almost universal learning orientation in River Valley. River Valley utilized designed protocols and processes to collect and make sense of data as feedback. For instance, all buildings analyzed CSAP data in a similar fashion and correlated results with curriculum and course organizers, and teacher evaluations. Similarly, reported embedded classroom assessment acted as an accountability mechanism for school, principal and teacher evaluation. Last, all data influenced decisions on professional development needs for individual schools and the district as a whole.

Case 3 Front Range School District. Many district administrators in Front Range mentioned that training in use of data had become a major focus for their district. For example:

We have administrative seminars, and we have classes for teachers that teach teachers and administrators how to use that data, how to look at it and how to analyze it ... Sometimes teachers suggest we should do something. And the question from us, from the principal, is what data do you have to support that's a good thing to do? So we're really pretty research based relative to what data do you have that you're looking at that says we need to try this that it works. (Grant interview, June 1999)


Then we went into more of the data analysis and not being afraid of it. I jokingly call it assessment is our friend because we used to design classes in assessment that no one came to. And so reversing it, we put student data on the table and started to learn how to look at it and not be as afraid of it. And then look for patterns and then to have it, step one, drive our school improvement. (Klein interview, June 1999)

In contrast to River Valley, however, Front Range's use of data had not become as pervasive in informing larger decisions, and again showed a transition from one orientation to another. Many Front Range administrators discussed a greater need to use data. For instance, Dr. Wright advocated the use of data to drive funding allocations for schools, but admitted that schools were probably two to three years from being able to do that. In addition to becoming more data driven, Dr. Wright also knew that a supporting structure such as an Assessment Center would have to become a reality. Until then, he suggested, "We just look at numbers and don't know what they mean" (Wright interview, June 1999).

Interpretation

The interpretation of knowledge refers to how LEAs helped their personnel make sense of and attend to the meaning of the state-mandated reforms. The importance of this part of the learning cycle for educators related to two major questions: First, what is the purpose for the reforms, and second, what should it look like in classroom practice. Orientations under interpretation included interpretive orientation


and interpretive mechanism. Key differences between LEAs also emerged across these orientations in this learning process that led to different understandings of standards-based reform.

Interpretive Orientation

Figure 4.5 Interpretive Orientation Rubric
Processes .............................................................................. Achievement
Midplains          Front Range          River Valley

Case 1 Midplains School District. A district's interpretive orientation refers to why the ideas in the policy matter. Midplains' interpretive orientation focused on improving key processes for defining standards and curriculum. As Dr. Rodriquez discussed, "the processes we designed were designed to be long lasting; we will continually use them for an on-going review of standards, curricula and assessments" (Rodriquez interview, March 1999). Multiple documents also support Midplains' orientation toward defining and integrating processes for defining standards and adopting curriculum. These documents included such items as an overall process for defining standards through on-going review, processes for meetings to define standards, processes for articulating and aligning curriculum with standards, and processes for defining and aligning district-wide assessments.


Case 2 River Valley School District. In contrast to Midplains, which focused on process as a way to adopt standards, River Valley demonstrated an interpretive orientation focused toward raising achievement. According to Dr. Seneca, "All of our processes are certainly focused on student achievement. We really think in terms of being aware of standards to being standards-based" (Seneca interview, June 1999). By focusing teachers not only on the content of standards but also on the processes of using data to raise achievement, River Valley made constant adjustments to help raise achievement. For instance, CSAP scores rose 17% through the district's processes. In another strategic process, district goals in specific content areas helped schools focus structures and strategies on common needs (Seneca communication, October 1999).

Case 3 Front Range School District. Similar to River Valley, Front Range had begun to orient their interpretation around raising achievement. As suggested by Dr. Wright when discussing Assuring the Essentials, "Everything we have to do focuses on raising achievement. We just choose to narrow our focus a little" (Wright interview, June 1999). Similarly, Dr. Hayes suggested an orientation toward achievement: "We have known for a long time, that maybe we could be doing more. Our students face a pretty competitive world, so we are almost forced to figure a way to raise achievement through standards" (Hayes interview, June 1999).


This focus on achievement had already made its way to the elementary schools, which had implemented different structures to intervene with students at risk of early literacy problems. Using a literacy lab concept that focused on multiple interventions and a high degree of parental involvement, all 18 elementary schools were given an extra FTE to act as a reading expert and literacy coach within each school (Bench interview, June 1999). High schools had also added summer schools and intervention programs to address growing numbers of at-risk students.

Interpretive Mechanism

Figure 4.6 Interpretive Mechanism Rubric
Implicit Connection to Classroom Practice .............................................................. Explicit Connection to Classroom Practice
Midplains          Front Range          River Valley

Case 1 Midplains School District. An interpretive mechanism refers to how teachers are shown how the meaning of reform is linked to changes in classroom practice. In Midplains, no formal mechanism existed to show educators how to make substantial changes in classroom pedagogy, curriculum, or work assigned to students. Instead, an implicit mentality existed that suggested teachers could take new curriculum materials and change their practice with little other training.

Case 2 River Valley School District. In contrast to Midplains, River Valley used an explicit interpretive mechanism. The Professional Development Center used


common instructional planners and models for assessment development to explicitly show teachers the changes needed in the work given to students. Teachers used this model and constructed new units and assessments in conjunction with other teachers and principals during summer work sessions, and met in follow-up sessions during the year to "debug" the work.

Case 3 Front Range School District. Similar to Midplains, Front Range did not use any formal mechanism to help teachers understand necessary changes in the intellectual quality of student work. By focusing on instructional differentiation, teachers were told that the content they gave students was meeting the intellectual quality of the state standards. Standards and student experiences around standards were only implicitly tied to classroom practice, leaving the interpretation to individual teachers.

Dissemination

The dissemination of knowledge for organizations refers to how organizations transmit the knowledge and meaning of reforms to their personnel. The importance of this process in the organizational learning cycle comes from the distinction between tacit knowledge and explicit knowledge. Tacit knowledge refers to individual or personal insight, intuition and abilities that have been constructed and

PAGE 146

explicitly linked to changes in classroom practice. Explicit knowledge refers to knowledge that can be shared and communicated, or information that has been delivered with an implicit link to classroom practice (Dibella and Nevis, 1998). Orientations under dissemination included dissemination mode and knowledge reserve. Key differences between LEAs also emerged across these orientations in this learning process that led to different understandings of standards-based reform.

Dissemination Mode

Figure 4.7 Dissemination Mode Rubric
Delivery ........................................................................................... Construction
Midplains   Front Range                                                          River Valley

Case 1 Midplains School District. As previously discussed, dissemination of knowledge in the Midplains School District occurred through a formal, designed process that focused on coupling content in 12 areas across all grades. Rather than use informal methods to dialogue and share knowledge of lessons gleaned over time with students and curriculum, Midplains relied heavily on planned, scripted processes to disseminate knowledge and information. More specifically, Midplains used a facilitated process in the beginning to engage and gather input from their community on essential learnings while disseminating the need for higher level outcomes from students. Similarly, Midplains used a formal
process based on explicit expectations, meeting formats, and public approval of curriculum work plans and materials to disseminate information about new curriculum requirements. According to Dr. Rodriquez, this process allowed for checks and balances along the way to make sure groups were not taken in some other direction (Rodriquez interview, March 1999).

Case 2 River Valley School District. Knowledge dissemination in River Valley occurred through three formal processes that focused on helping educators construct an understanding of the meaning of standards-based reforms. First, standards-based and curriculum area liaisons at each building modified standards and provided training for school faculties based primarily on unit or instructional design. Second, early release Wednesdays were used by individual buildings as a formal mechanism for building-based data analysis and to disseminate internal ideas and knowledge related to building needs. Last, the PDC acted as the primary district-wide method to disseminate ideas and tools for data-driven instruction based primarily on collegial discussions and unit and assessment design.

Case 3 Front Range School District. Knowledge in the Front Range School District was primarily disseminated through mixed oral and training modes. Informally, for instance, Dr. Wright presented his vision for the district through Assuring the Essentials to each building over lunch in an informal dialogue.
Similarly, district administrators used "reflective dialogue protocols" in leadership meetings to help shape their understandings. Training and summer workshops appeared to be the primary way that knowledge was formally disseminated, focused primarily on information delivery that district leaders hoped would transfer to classroom practice.

Knowledge Reserve

Figure 4.8 Knowledge Reserve Rubric
Implicit Link ..................................................................................... Explicit Link
Midplains   Front Range                                                      River Valley

Case 1 Midplains School District. Knowledge reserve refers to the preference for how knowledge is documented and the extent of its relation to instructional guidance for teachers. Through a formally designed process for adopting new content and materials, the Midplains School District focused on public documentation so that community and staff were well informed of the district's efforts. For instance, all work done by curriculum work groups was to use a similar format, share all drafts at all buildings, and store all documents in notebooks at central office to share with the public and board members. The expectations for work groups were also published in a comprehensive booklet that discussed processes for adopting standards and new materials. Last, results of the curriculum work groups' efforts, known as curriculum
work plans, were published and shared with teachers. However, Midplains' approach was implicit because there was no explicit link about how to use the information in classroom practice. It was assumed that teachers could transfer the lists into a workable curriculum for students.

Last, the Midplains School District used both internal and external communication processes to publicly document and disseminate information. A district newsletter entitled Schoolhouse News was used every month to answer questions of the work committees. In addition, any news items in this newsletter were shared with the local newspaper for community information. Again, however, most of this information did little to help teachers or parents understand changes in classroom practice required by higher level standards.

Case 2 River Valley School District. River Valley used a multitude of documents to share explicit knowledge of the link between standards, data, and changes in classroom practice. For instance, a document titled A Parent's Guide to Standards shared expectations for students in every grade across all content areas. This document also contained samples of student work that exemplified for parents what a standard looked like in practice. In addition, more formal documents developed for teachers listed not only standards and benchmarks but also materials and assessments that would help teachers meet these expectations. Data reports,
formally developed at both the school site and district level, shared multi-level assessment results with community members, school personnel, and administrators.

Tacit knowledge, or that knowledge that comes from trial and error and correction, was also publicly documented. While most tacit knowledge is kept by individuals, River Valley made an attempt to share school and teacher lessons through public documentation. In the initial stages of implementing standards-based reforms, principals who were working on similar problems wrote required lessons-learned documents about common achievement problems their staffs faced. These were compiled and published for all buildings to use. In addition, the PDC by its nature was an explicit link to classroom practice because the PDC required documentation of all course and unit organizers and assessment tasks to share throughout the district. Teachers who used an organizer or assessment evaluated their successes and problems so that organizers and assessments could be modified to help more teachers.

Case 3 Front Range School District. As previously discussed, the Assuring the Essentials document was a widely shared, public document that served to direct Front Range's implementation as it progressed in its reform efforts. However, this document did little to inform changes in classroom practice. Beyond this document, however, Front Range had done little to document and share its efforts. While
proficiency guides had been updated to include district standards, most administrators admitted that few people used them, and they provided little practical guidance in helping teachers understand standards-based instruction.

Utilization

The utilization of new knowledge for organizations refers to how new knowledge and meaning is used by an organization to alter decisions, behaviors, or culture. While new knowledge and ideas may be generated and disseminated in a school district, unless they are used the learning cycle remains incomplete (Dibella and Nevis, 1998). Previous orientations, as part of district culture, heavily influence utilization of knowledge as a culminating process. How, or if, knowledge is used depends on initial interpretations, reflects certain values of an organization, and indicates preferences for certain outcomes (Dibella and Nevis, 1998). Orientations under utilization included learning scope and value chain. Key differences between LEAs also emerged across these orientations in this learning process that led to different understandings of standards-based reform.

Learning Scope

Figure 4.9 Learning Scope Rubric
Incremental ............................................................................................ Transformative
Midplains   Front Range                                                             River Valley

Case 1 Midplains School District. Learning scope refers to the preference for the degree of change for learning efforts. In contrast to interpretations of standards-based reform as transformative experiences for students, such as the NCTM standards discussed previously, Midplains took a much more incremental approach to standards-based reform without realizing changes could be made in other areas. First, Midplains realized that change in curriculum was an ongoing process that had to occur:

This year I mean it's a time line. This year we do this, next year we do this, next year we do this, continually setting goals related to benchmarks and standards and not saying we're going to be done. (Baker interview, March 1999)

What we get done depends on the year. I mean we're still in the process. I don't think the process (of curriculum renewal) will ever get done. (Adams interview, March 1999)

Second, because Midplains focused so heavily on curriculum and materials, very few district personnel alluded to other facets that might have to be adapted to support changes in curriculum and higher standards. While all interviewed alluded to the necessary changes in instruction and assessment, none offered how that needed to look in the classroom or that it was occurring.

A notable exception was Mrs. Baker, elementary principal, who had begun to restructure her school around student needs and teacher instructional strengths. Because of the diversity of her student population and low achievement on state
assessment, Mrs. Baker took a much more transformative approach to the structure of her school. In her words:

So we have our defined reading time, whatever you want, our language arts block we call it communication. And that's where we have regular grade level instruction. I wouldn't even call it typical because strategies have changed. So grade level contained instruction. After that we regroup. Some kids need phonics instruction, we have kids who need comprehension instruction, we have kids who need ESL, we have kids who are doing great so they don't need remediation, and so we've formed literacy teams ... they are grade level teams ... Kids are grouped according to where they are and we're going to do this with this group of kids ... And everybody in the building is involved with that. Special Ed, Title I, you know, to reduce size. (Baker interview, March 1999)

Case 2 River Valley School District. In contrast to Midplains, where utilization was based on a different learning orientation, utilization in River Valley combined more standards-based ideas around transformative ideas for student learning. According to Mrs. Jackson:

It's not that they (teachers) are doing everything differently. It's that they are looking at things differently. And some of the techniques, many techniques, many of the methods used, instructional methods are methods they have always used to help kids learn. Some has changed, but a lot of it is just seeing things differently, using information better. But they are doing it with a different view. (Jackson interview, March 1999)

Evidence of utilization also came from classroom-embedded assessment data that had to be reported out by content areas as a form of accountability. Accountability for use of standards came from reporting classroom achievement data through the district office for publication after two years of piloting and revision. Again according to Mrs. Jackson:

And we're reaching a point now, this year, where we will do our first year of public reporting of reading and writing at the elementary level. It's a part of each school's accreditation plan. In addition to CSAP we will be reporting classroom achievement data. That's pretty amazing, so that's pretty high accountability. (Jackson interview, March 1999)

Similarly, many administrators suggested that accountability for utilization comes not only from student achievement data but also from students themselves:

And it's our expectation of kids that I can walk in and I can say, "What standards are you working on today?" and they ought to be able to tell me that. Kids knowing standards is one of our policies. Our other policies are, you know, standards are posted in the classrooms, and that you refer to them when you're teaching your lessons ... Now it may not be every day, but when you are starting a new unit of instruction, that you talk about what it is you want kids to know or do, what are the outcomes? And students should know from the first day of the unit what they're supposed to know and be able to do to meet the standard. Yeah, I mean you talk about rubrics around this building or you talk about course organizers or unit organizers, kids know exactly what you're talking about. (Williams interview, March 1999)

Most administrators alluded to the fact that more elementary than secondary teachers were utilizing standards-based reform elements, but that secondary teachers had not felt the influence of CSAP yet. Dr. Seneca and Mrs. Jackson estimated from informal surveys that 85% of elementary teachers were using district-defined elements of standards-based reform, 70% of middle school teachers, and about 50% of high school teachers.

This high degree of utilization resulted from orientations toward a transformational learning scope. Many district administrators considered their reform efforts incremental because of the pace of their reforms. Over the decade, however, River Valley had transformed their student experiences around higher cognitive outcomes and relevant curriculum in most classrooms. According to Dr. Seneca:

Absolutely sitting on status quo was not what we were after. And most of our kids achieved very well, traditionally always have. It's not that we were in this place where we had to change dramatically because we were bombing. Our community is quite conservative, but wanted to know how we were preparing students for the future. We stepped out and decided to be a leader
in this area. How to make curriculum more challenging, rigorous, and relevant at the same time was what we were after. (Seneca interview, June 1999)

District documents showed required transformative student experiences based around high achievement and content standards. For example, broad district goals, which had to be adapted to each school site, focused on relevance of standards, school-to-career experiences, and technology. All embedded district assessment tasks either asked students to construct responses or were performance-based and authentic in nature. Last, almost all unit organizers examined from the PDC required application of knowledge and the production of some product, usually through technology.

Case 3 Front Range School District. Because Front Range appeared to be in transition, evidence of district-wide utilization of reform knowledge was superficial at best, except for isolated projects. The learning cycle of acquiring, interpreting, and disseminating remained incomplete because of numerous interpretations, turnover in leadership, and projects that focused on isolated issues within the larger reform. According to some, the reforms had been around for 15 years (Bench interview, June 1999), so utilization was ongoing, while others suggested an unclear focus with too many competing goals (Hayes and Klein interview, June 1999). Still others had begun to see transfer of learning to classroom situations through more use of rubrics,
performance tasks, planning, and changes in teacher questioning (Randolph, Cone, and Grant interviews, June 1999). In all, however, little evidence suggested a complete learning cycle resulting in wholesale changes in classroom practice because of mixed orientations around utilization.

Front Range again exhibited a mixed orientation, trying to move from an incremental to a transformative learning scope. Many administrators discussed the enhancement of instructional capabilities, rather than more progressive forms of curriculum and assessment, as the way they were trying to raise achievement. In contrast, Dr. Wright's Assuring the Essentials vision, while not promoting progressive forms of curriculum, did promote a transformative approach to educating all students for essential skills and habits. These transformative approaches included requiring a certificate of initial mastery, redefining credits based on accomplishment, and entrepreneurial approaches for solving academic problems.

Value Chain

Figure 4.10 Value Chain Rubric
Things .............................................................................................. Human Resources
Midplains   Front Range                                                          River Valley

Case 1 Midplains School District. Value chain refers to the preference for emphasis of learning investments, or where school districts allocated their resources
when implementing standards-based reforms. While some money was spent on teacher training, substitutes for committee work, and external consultants, Midplains spent the majority of its resources on student materials. In essence, Midplains' value chain allocated the majority of its resources to purchasing new curriculum materials aligned with standards rather than professional development, extra teaching staff, or other means for implementing standards-based reforms. According to Dr. Rodriquez, the reflection of doing what is right for kids is where a district spends its money:

And when you look at our budget, I can honestly tell you we've invested in kids and invested in teachers. When you see over the last couple, well three years now, four hundred, five hundred thousand dollars set aside every year for materials. (Rodriquez interview, March 1999)

Case 2 River Valley School District. In contrast to Midplains, which emphasized learning investments around materials, River Valley emphasized allocation of resources around teacher training. While some money was spent on materials, the majority of resources over the past decade had been focused on giving teachers incentives to modify their practice. According to Dr. Seneca, "we allocate close to a quarter of a million dollars a year for the PDC and other professional development. We spend a great deal of time and money on teachers" (Seneca interview, June 1999). Over the course of the decade, according to Dr. Zeus:

You know, I imagine we're in the millions of dollars. From PDC to incentive plans to trainers. So I finally look at providing substitutes and some of those trainings as well as paying people, paying teachers to attend PDC for some of that training. We're willing to pay whatever it takes. We spent contingency funds just so we could keep the PDC going. I don't have the exact cost. I only know we've made a commitment and our board stands by us. (Zeus interview, March 1999)

Case 3 Front Range School District. Again, because of mixed orientations in other areas and a turnover in leadership, Front Range exhibited no clear value chain. Major resources had been allocated to early literacy personnel, professional development around differentiating instruction, and entrepreneurial funds for schools. Each of these allocations helped further signal the many competing interpretations of reform in the Front Range School District.

Conclusion

These three cases call into question the notion of a single LEA response to mandated state policy in Colorado. To say that all three districts did not learn about this policy is excessively simplistic. However, gross categorizations of these responses, such as adopting or non-adopting school districts, also ignore the complexity of LEAs as organizations. Instead, the eight Learning Orientations discussed in this chapter represent the critical dimensions that emerged from this study to describe or characterize how organizational learning about reform policy took place. In these three cases certain learning preferences emerged. Because of these preferences, what was learned and how it was learned differed among the three cases studied, leading to different degrees of utilization across each district. These orientations begin to give us another perspective on why variation exists between LEAs. Focusing primarily on Learning Orientations, however, leaves many questions about the LEA response
to mandated state policy unanswered. What causes or enables these orientations to become a primary determinant of what is learned and how it is learned in a district is also an important factor to consider. I consider this issue in the next chapter.

CHAPTER 5

IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING THE ROLE OF FACILITATING FACTORS

If the acquisition, interpretation, dissemination, and utilization of knowledge (Huber, 1991) act as a generic learning cycle defined by a particular organization's orientations, then facilitating factors help promote this learning. According to Dibella and Nevis (1998), facilitating factors are normative because the more these factors are present in an organization, the more opportunity exists for learning. "The ease and amount of learning depends on the strength of these factors. Collectively they determine learning potential within an organization" (p. 61). In each case, LEAs demonstrated certain facilitating factors that engaged and enabled organizational learning to occur. In some cases, LEAs possessed more, or better developed, facilitating factors that greatly determined their learning potential. This chapter will primarily address the following question:

2. What facilitating factors in each of Huber's four areas were perceived by district respondents to contribute to organizational learning about these policies?

Each of these factors is examined below under the headings of the generic learning cycle.

Acquisition

How districts acquired new knowledge surrounding standards-based reform in Colorado depended on each district's ability to use facilitating factors that promoted or enabled this part of the learning cycle. Facilitating factors that were found to enable acquisition of knowledge included scanning, performance gaps, state policies, concern for measurement, and organizational curiosity. Key differences between LEAs emerged across both use and strength of these facilitating factors that acted to promote different understandings of standards-based reform.

Scanning

This facilitating factor describes the process by which people in a school district gather information about practices, trends, and conditions outside their own district. This factor to some degree helps districts compare the quality of their district to that of others. This factor also helps districts become aware of movements both educationally and politically.

Figure 5.1 Degree of Use of Scanning
High ............................................................................................................. Low
River Valley   Front Range                                                    Midplains

Case 1 Midplains School District. In the case of the Midplains School District, scanning was not a high priority. Although district leaders were aware of state policy
around standards-based reform, they did not use methods to gather information about future policy directions or other districts' interpretations. While they did look outside their district for assessment, it was unclear how they used this information to inform practice.

Case 2 River Valley School District. In the case of River Valley, scanning took a high priority early in the standards-based reform era. First, prior to legislation, River Valley scanned their community for what they wanted from their educational system. Second, as legislative efforts began to take shape in the early 1990's, a conscious effort to be involved with defining rules and regulations for state policies took place. For instance, Mrs. Glitton, the Director of Assessment, was asked to participate on a state committee for developing the Colorado Student Assessment Program. From this work, she brought information back to River Valley and helped disseminate it within the district. Last, many district officials held offices in state organizations that helped influence policy or had access to information that could be disseminated within the district. For instance, two board members served as executive members of the Colorado Association of School Boards. Dr. Seneca, Superintendent, served on numerous statewide task forces to influence school funding.

Case 3 Front Range School District. Front Range showed little use of scanning beyond using external knowledge sources. In contrast to River Valley, whose
administrators spent considerable time on statewide groups to influence policy decisions, Front Range spent considerable time trying to figure out ways to protect their district from oncoming accreditation requirements (Hayes interview, June 1999). The only political scanning took place through Dr. Wright, who had been the head of a statewide organization for school leaders and who had spent time lobbying for this organization. This experience was used in recommendations for Assuring the Essentials.

Performance Gap

Being aware of performance gaps also can help facilitate learning on the part of a school district. According to Dibella and Nevis (1998), awareness of a performance gap, "either through analysis of performance shortfalls or a new vision, opens the door to learning by providing initial awareness that new knowledge needs to be generated or that something needs to be unlearned" (p. 66).

Figure 5.2 Degree of Use of Performance Gap
High ............................................................................................................. Low
River Valley   Midplains                                                    Front Range

Case 1 Midplains School District. In Midplains, an awareness of a performance gap existed, fueled primarily by state testing results. This awareness leveraged action and signaled the need for improvements, but only at the elementary level because state testing results had not affected the middle school or high school. For instance,
Mrs. Baker alluded to how results had raised school and community concern at her school:

Well it's raised the level of concern of not only staff people but also parents in a positive sense. We have more information to give to parents from the very beginning ... It's developed probably a sense of community of meeting the needs of kids ... What was the gap? I think the gap was huge, and that's why the message is constantly to evaluate and look at how to improve. We are far from the expectations of the state benchmarks and standards. We have some children there and we have some children in various places, but we have a lot of work to do. (Baker interview, March 1999)

Case 2 River Valley School District. Similar to Midplains, River Valley's understanding of a performance gap came from state CSAP scores and district criterion-referenced exams. Having spent considerable time during the decade understanding achievement gaps between gender and ethnic groups, River Valley continued to use similar processes to close gaps on CSAP and internal test scores. Because processes for identifying and addressing these gaps were already in place, little indicated that the gaps raised any major concern in River Valley.

Case 3 Front Range School District. Acknowledgment of a performance gap came from only one administrator. Dr. Grant discussed the needs of second language learners and how their needs had to be addressed differently to give them access to higher level courses. No other gaps or problems were mentioned even though CSAP scores had failed to rise significantly in the district over four years.

Policy

If standards-based policy is the curriculum to be learned, as some would suggest (Cohen and Barnes, 1993), then how districts attend to and use policy as a learning source can also influence the acquisition of knowledge.

Figure 5.3 Degree of Use of Policy
High ............................................................................................................. Low
River Valley   Midplains                                                    Front Range

Case 1 Midplains School District. The Midplains case showed more of a reactionary stance, or low use of state policy to enable learning. Due to turnover in leadership and the history of the school district, Midplains had not been moving toward a standards-based environment prior to passage of the legislation. In fact, Dr. Rodriquez suggested that they lagged behind many school districts: "So that fall (1995) we sat down and finally looked at the law, just kind of reflected where we were and decided we needed to get going" (Rodriquez interview, March 1999). Dr. Rodriquez also suggested the reactionary stance of her district:

If it hadn't been for, you know, the literacy and standards, you know, I'd like to say I wish we would have been smart enough to figure that out. I think we knew but I'm not sure it would have been as aggressive, so I'm not at all upset with the legislation. I'm embarrassed probably more than upset that we didn't do it on our own. (Rodriquez interview, March 1999)

Other iterations of standards-based reform policy, like the Colorado Basic Literacy Act, also showed that Midplains did not use policy to enable learning. For instance, Mrs. Baker, the elementary principal, suggested that up until state assessment data was published, the teachers would not acknowledge a problem with reading and writing achievement. Therefore, the policy forced them to begin looking at achievement. In her words:

I think it (the policies) lends credibility from a state and national standpoint. In other words, if it's happening in other places, therefore what we're doing is also headed in the right direction. (Baker interview, March 1999)

Case 2 River Valley School District. In contrast to Midplains, River Valley anticipated state policy, used it to promote an ongoing reform agenda, and may have actually helped influence the direction of state policy. Prior to passage of HB 93-1313, the original standards-based reform policy, River Valley required clearly defined proficiencies, testing, and accountability reporting. Many administrators interviewed alluded to already having many of the state policy elements in place, and when the bill was passed the changeover was not that difficult. Many administrators also alluded to the power the legislation gave them to continue to make reform efforts within their district:

And with those people who were still being reluctant there was always, you could always draw back and say, "It's the law, we have to do it." You know, it's the ultimate motivator, which hasn't hurt us at all. (Jackson interview, March 1999)

Because of their history with high-stakes accountability testing, River Valley administrators often testified or gave input into the original standards-based reform policies. Many central office administrators belonged to state organizations that were aware of pending legislative movements. Central office administrators used these organizations to help mold the organization's response through the experiences of River Valley. According to Mr. Rich: We were trying to anticipate things going on in the state so we wouldn't be surprised. we pushed hard to be on a lot of the committees because even if the legislation the rules and regs and usually not. So what you want is you want really good thinking people as part of that. I mean CDE always wanted someone from here to be a part of as much as possible ... You hardly know the total picture. You just come to where you know it and feel the next direction so you can take that back and tell people here's what's coming next. (Rich interview, March 1999) In addition, River Valley administrators also anticipated state policy because of an awareness of national movements concerning ideas like mastery learning and outcomes based education. We started this way back in the SO's early 90's. We were aware of an oncoming change after the release of A Nation at Risk. We just knew there had to be a different way so we began looking at mastery learning and outcome based education which by the way is really what standards-based refonn is about. We made some decisions to go that direction and didn't want the state to make us turn around so we made sure we always had a say in how policy was written through testifying, lobbying ... I also don't believe, however, the state has the capability of giving that kind of guidance. Policy in this state has always been translated differently. We just wanted some say in what the wording would be .. I'd say we didn't interpret the policy. I'd say the policy helped give us a few more answers. 
(Seneca interview, June 1999)

Although River Valley anticipated and influenced early reform policy, River Valley later showed a reactionary stance to state assessment policies. According to


most administrators, an emphasis on state testing took away some of the original intent of standards-based reform in their district.

You know, now that the state has put so much emphasis on assessment, CSAP being one, that it's changed some of the outlook of standards-based efforts. You know, it's a part of it, but how they want you to report it when you give a test, you know caused some frustration. (Zeus interview, March 1999)

Well, lo and behold, to get some teeth in this thing, the state decided to have a state testing program which we thought would be ok. So the first two you did it and you thought, 'Well the kids will be fine,' then you realized that their criteria for scoring is much different than anybody else's criteria ... and it was frightening. Those fourth grade kids, and it was ugly and we were treated unfairly. So as that evolved, that at least started to tell us at least we will always be treated unfairly. (Rich interview, March 1999)

Case 3 Front Range School District. In contrast to both Midplains and River Valley, Front Range exhibited mixed use of state policy to influence learning. For instance, Mrs. Bench suggested that because the district had a 15-year history of proficiencies, the changeover to standards was not that difficult. In contrast, both Dr. Hayes and Dr. Wright were adamant that state policy was under-resourcing the district's efforts, and therefore Front Range would only focus on a few, essential skills "no matter what the state demands we do" (Wright interview, June 1999). While both Dr. Wright and Dr. Hayes praised the intent of legislation like the Colorado Basic Literacy Act, they also suggested that some policies had forced them to do extra work in areas they had already addressed:

Oh, we had been dealing with literacy issues for about 10 years, and then the state comes in and says you have to do what we had been doing anyway. We are addressing literacy issues.
We use a lot of different methods, most of our elementary teachers are trained in various interventions, but now we have to report that? Why? (Hayes interview, June 1999)


Concern for Measurement

A concern for measurement describes the extent to which systems and their leaders are concerned about and utilize measurement as a part of the learning process (DiBella and Nevis, 1998). A concern for measurement as a facilitating factor in education emerged as a shared perception about the need to define and attend to key factors relative to student achievement. Student achievement data as information can also enable acquisition of an enhanced knowledge base.

Figure 5.4 Degree of Concern/Use of Measurement
High ............................................................................... Low
River Valley     Front Range     Midplains

Case 1 Midplains School District. In Midplains, a concern for measurement existed but on an ordinary and common level that limited their ability to use data as a form of knowledge. In Midplains, a concern for measurement meant simply defining the forms of assessment to be used as achievement measures of standards. Instead of alluding to achievement data, all respondents discussed at length how assessment was an integral part of standards-based reform but did not mention use of data as a learning tool.

... I think it's really important that we continue to use assessment as the driving force for revising the curriculum, for developing our staff development plans, for, you know, using it as those marks along the way about how we're doing. (Rodriquez interview, March 1999)

... Assessment? Well we do grades five through 12. Every grade level, and this is performance assessment. Not the regular standardized tests. And then at grades two, three and four we are


doing some more, more limited testing there because again grades three and four are doing state testing. They all do CSAP. (Adams interview, March 1999)

On a smaller scale, however, the elementary school in Midplains was forced to look at data to improve. According to Mrs. Baker:

It (CSAP data) has given us something to say besides I really believe that your students need to do this. We have a law now that says students must, and it's forced us to focus and become more data driven in that we do a lot of assessment and look at the results and know for pete's sake it's January and they haven't grown? We're forced, we have more data to look at how kids are doing, how well they're doing and what we need to do differently. (Baker interview, March 1999)

Case 2 River Valley School District. In contrast to Midplains, which had an ordinary and common concern for measurement, River Valley's concern emerged as a key facilitating factor in engaging the district's learning capabilities. In River Valley, a concern for measurement permeated every interview and meant that data was used as both an accountability and learning tool. This emphasis came from both Dr. Seneca, who was known as a "data guy who loves charts and graphs" (Williams interview, March 1999), and Mrs. Glitton, who understood the need for better classroom assessment "before it was popular" (Jackson interview, March 1999). This emphasis on data primarily linked achievement and classroom assessment data to determine professional development needs and resource allocations. Data as an informational tool was advocated at classroom, school and district levels, and processes were being developed to link the levels together as a planning tool. "Data-driven instruction" as a decision-based planning tool for teachers was especially


visible and embedded in district documents including teacher performance standards. Last, a concern for measurement surfaced in documents from the Professional Development Center whose main focus was to show teachers how to translate standards into assessment tasks.

Case 3 Front Range School District. Similar to Midplains, Front Range showed an ordinary and common concern for measurement. Because of an emphasis on instruction, assessment and data, while discussed, did not appear as a major concern in the district. The assessments that were used were all standardized, including district performance-based assessments for writing, social studies and science at the secondary levels, norm-referenced tests at grades 3, 6 and 9, and state CSAP testing. No processes for utilizing this data as knowledge existed other than publishing the data. Very little emphasis had been placed on classroom assessment and teacher judgment because of the amount of training involved (Klein interview, June 1999). Likewise, schools were not required to report achievement because no district-wide assessment system existed.

Organizational Curiosity

Organizational curiosity as a facilitating factor in education refers to the shared perception about the need for new ideas and experimentation. Driven primarily by


other facilitating factors in this learning process, organizational curiosity emerged as the ability to question assumptions about the educational system and use processes to self-correct or find problems within the system.

Figure 5.5 Degree of Organizational Curiosity
High ............................................................................... Low
River Valley     Front Range     Midplains

Case 1 Midplains School District. Similar to the original research on organizational learning capabilities, a weak sense of organizational curiosity was seen in Midplains. Because of their focus on content, Midplains spent a lot of time and resources recreating their traditional system based on harder outcomes and materials. Very little discussion occurred around other questions in the system that might allow achievement to increase.

Case 2 River Valley School District. In contrast to Midplains, River Valley exhibited a strong sense of organizational curiosity fueled by a reliance on data and focus on raising achievement. From the mid-1980's, the community and school district worked in tandem to envision what a successful school system could look like. This led to calls for higher cognitive outcomes and more workforce preparation. This call also led to a frequent examination of practice at all levels fueled by the use of data as indicators of success and information for improvement. For instance, in the mid-1990's, River Valley began in earnest to align curriculum around district


standards and found that many teachers used similar topics and standards at many grade levels (White interview, March 1999). By surveying teachers and publishing overlaps, teachers were almost forced to think "outside of the box," which led to the development of the district's PDC (Seneca interview, June 1999). Organizational curiosity also emerged in River Valley around the idea of risk taking. Common solutions to solve achievement problems did not easily occur because of student demographics, school cultures, and school leaders' understanding (Seneca interview, June 1999). Therefore, district officials encouraged principals and teachers to take risks in finding solutions that worked for a particular school or group of students and document these (Williams interview, March 1999).

Case 3 Front Range School District. Similar to Midplains, Front Range exhibited a minimal amount of organizational curiosity but was in transition. Organizational curiosity was fueled by Dr. Grant who, many people noted, often liked to discuss with administrators and teachers what a standards-based system could look like and what needed to change to make it happen (Hayes interview, June 1999). A transition to more organizational curiosity was also fueled by Dr. Wright's entrepreneurial fund that allowed schools to apply for money to solve their most pressing achievement problems. According to Dr. Wright, "People want to find and solve problems when there is an incentive attached and they know the funds will not blow away after a


year" (Wright interview, June 1999). However, only three schools took advantage of these funds the first year of their existence.

Interpretation

While interpretive learning orientations will be discussed in the next chapter, how districts interpreted standards-based reform also depended on the district's ability to use facilitating factors that enabled this part of the learning cycle. Facilitating factors that were found to enable interpretation of these reforms included involved leadership, leadership cognition, focus, and a systems perspective. Key differences between LEAs emerged across both use and strength of these facilitating factors that acted to promote different understandings of standards-based reform.

Involved Leadership

Leadership as a facilitating factor describes multiple leaders within an educational system as actively engaged and involved with learning initiatives and in ensuring that a learning environment is maintained.

Figure 5.6 Degree of Involved Leadership
High ............................................................................... Low
River Valley     Midplains     Front Range


Case 1 Midplains School District. In Midplains, Dr. Rodriquez was seen as the impetus behind standards-based reform. From her work as assistant superintendent in charge of curriculum and instruction to her current job as superintendent, Dr. Rodriquez exhibited a sense of involved leadership. According to Mr. Bohag, "What we will do this year, and what we will do next year are a common message from her based on what we are trying to accomplish" (Bohag interview, March 1999). Because of the district's size, most principals also showed a personal and active engagement in reform and learning initiatives, especially at the elementary level.

... I think one way is that principals are the major part of the curriculum. We are the backbone of the curriculum committee, and so we're constantly aware. We know where we stand in content areas ... There are some days you just go well I can't be the curriculum coordinator but you know what's going on and what needs to be done. I mean you do feel a responsibility for it. I know what needs to be done in reading and in PE ... So in our administrative meetings, one or the other of us are saying we haven't done this or that. (Baker interview, March 1999)

Case 2 River Valley School District. In River Valley, Mrs. Glitton and Dr. Seneca were seen as the driving force behind standards-based reform. Mrs. Glitton was described as the vision behind the ongoing use of data as an accountability tool and as a tool for learning on multiple levels. Dr. Seneca, although trained as an expert in school finance, understood the role of data and accountability in raising student achievement. However, multiple advocates for standards-based reform, including the board of education, helped promote standards-based reforms in River Valley.


Case 3 Front Range School District. Due to turnover in key leadership positions in Front Range, leadership as a facilitating factor for learning and implementing standards-based reforms had been limited. Even though he was widely known as an expert in reform in the state, Dr. Wright had only been superintendent for nine months at the time of this study. Dr. Grant, who had only been in the district for three years, was leaving to become an assistant superintendent in a Texas school district. Mrs. Bench had only been in her position for a little over two years. This turnover brought little continuity to the reform ideas within the district. In addition, because of a reliance on central office administrators to develop and promote reform ideas, multiple advocates from the teacher or principal ranks were not involved.

Leadership Cognition

The ability of leaders to understand the different requirements of standards-based reform has emerged as a factor that can greatly enhance implementation of these reforms (Sanders, 1998; Spillane, 1998). Leadership cognition refers to the personal resources of leaders and the depth with which they understand the reform efforts to leverage learning orientations and resources. Leadership cognition also means the degree to which multiple leaders possess common cognitive maps about the reform.


Figure 5.7 Degree of Leadership Cognition
High ............................................................................... Low
River Valley     Front Range     Midplains

Case 1 Midplains School District. While leaders in Midplains exhibited involvement and activity within the reforms, it was not as evident that they understood the nature of the reforms or what else needed to happen. Instead they all discussed a nebulous vision of better curriculum or higher standards, or, in evaluating teachers, asked what standards had been covered.

Case 2 River Valley School District. Leaders interviewed in River Valley frequently alluded to constant discussions about understanding standards-based reforms, analyzing data, and marshaling resources for instructional improvements. In addition, teacher leaders who ran the PDC also worked with principals, helping them understand the translation of standards into units of study and assessments. Helping building leaders understand standards-based reform was also taken on by central office personnel who spent fifty percent of their time in training and working with building administrators (White and Jackson interviews, March 1999). However, some central district administrators pointed to the lack of secondary principals who truly understood standards-based education as an inhibiting factor in greater secondary reforms.


Case 3 Front Range School District. In Front Range, most administrators discussed the ongoing leadership seminars and meetings Front Range used to help administrators understand these reforms. However, because of multiple interpretations and meanings used in the district, most school principals understood the reforms through their immediate director's interpretation. For instance, Mr. Cone, high school principal, and Mr. Randolph, middle school principal, discussed Dr. Grant's leadership as looking at changes in grading practices. Mrs. Bortz, elementary principal, alluded to Mrs. Bench and her leadership in focusing time for elementary schools through the time allocation study. However, no similar understanding across leaders in Front Range emerged.

Focus

A clear focus or vision for reform efforts also emerged as another, minor factor that helped enhance the systems perspective.

Figure 5.8 Degree of Focus
High ............................................................................... Low
River Valley     Midplains     Front Range

Case 1 Midplains School District. Midplains, as discussed earlier, possessed a clear focus for what standards-based reform meant, which helped them focus their efforts. This focus on content was clearly seen and articulated by district leaders as the main point of their work.


Case 2 River Valley School District. In River Valley, student achievement of standards worked as the common focus for the district.

They (schools) are connected with that one focus point. The core of it all is everything we do is made to improve student achievement. That's what ties it together. Everything at the next, if you have to layer it, everything from there is tied to standards, student standards. (Seneca interview, June 1999)

Case 3 Front Range School District. Even though many district administrators alluded to having a systemic focus for their reform efforts, many unconnected projects pointed to a fragmented approach.

Systems Perspective

A systems perspective recognizes the interdependence and alignment among organizational elements. In the standards-based or systemic reform era, this factor has been promoted as critical for districts in recognizing the relations between standards, assessment and instruction along with district-school relations. However, because of different interpretations and focus, a systems perspective emerged with different meanings in each case.

Figure 5.9 Degree of Systemic Perspective/Alignment
High ............................................................................... Low
River Valley     Front Range     Midplains


Case 1 Midplains School District. In Midplains, a systems perspective emerged primarily focused on articulation of content from grade to grade. Dr. Rodriquez mentioned this when she said:

I know for us in this district, and I'm sure this is true in many others, there was not that clear articulation from one level of instruction to another. And so our goal really was to work in P-12 units to get that articulation from one level to another in a continuous way. (Rodriquez interview, March 1999)

How well this perspective was utilized K-12, though, was not as evident. Mr. Wilson alluded that articulation at his level was still problematic in that his teachers were still set on blaming students for a lack of achievement.

Teachers still say 'I can go to my room, close the door and do whatever I want.' They think it's the kid's fault. I think that, or it's the parents' fault, or it's the middle school's fault. (Wilson interview, March 1999)

Dr. Rodriquez also noted that while articulation was a key factor in standards-based reform, people's beliefs and understandings about content articulation were probably closer across buildings rather than across the district.

Case 2 River Valley School District. In River Valley, a clear and coherent systems perspective focused school and student achievement of district standards through processes using data. Speaking about the idea of a systems perspective, Mrs. Jackson stated:

I think it means a clearly articulated statement of what kids should know and do drives everything. It drives instruction. It drives selection of material. It drives achievement data, how we measure it. They've done it cyclical but it is more systemic. It means that every decision made in our district, how we spend money, how we spend teacher time, how we allocate FTE. Everything focuses on a common goal that says kids have to know and do X. (Jackson interview, March 1999)


Because student achievement of district standards formed the center of the system, all planning, improvement and other initiatives had to show the relation between improvements and achievement. For instance, the School to Career initiative in the district developed as the way to help teachers show students the relevancy of standards. On a district level, changes in achievement were examined by looking not only at achievement data but also at teacher evaluations, curriculum alignment tools, and spending patterns on materials and professional development. At a school level, the idea of "data coherence" meant each school analyzed its own achievement problems and utilized school and district resources to improve achievement relative to strategic district directions (Rich interview, March 1999).

River Valley also used a systems perspective around standards through assessment. Using a criterion-referenced assessment system, all students were tested frequently to see if they were meeting district standards. Teachers helped choose tests and items that aligned with district standards. In addition, River Valley spent considerable time helping teachers understand the connection between standards and classroom assessment through the PDC.

Last, the River Valley School District aligned their performance appraisal system to the idea of performance standards for teachers based on principles of standards-based education adopted by the district. According to Dr. Seneca, "If we know what


we want students to know and do, by logic we should align what we want teachers to know and do" (Seneca interview, June 1999). River Valley's teacher appraisal system consisted of ten performance standards. These standards appraised teachers on, among other things, their ability to show evidence of course and unit mapping based around standards, and to enhance the probability of raising achievement by looking at data. According to Dr. Zeus:

Every professional in this district has or will have performance standards that focus on how they can help raise achievement. Teacher evaluations should be based on performance standards. Principal evaluation should be based on performance standards. The superintendent's evaluations should be tied to performance standards. And if we could ever get to the point, the point where school boards, they would have certain performance standards that would be assessed ... Every department, every employee has to be asked what do you contribute to student achievement, what is your part in this? (Zeus interview, March 1999)

Case 3 Front Range School District. Unlike River Valley, which had a clear focal point in content standards, Front Range used multiple projects and meanings to engage their reform efforts. For instance, Dr. Grant alluded to the chronological dimension of her district's reform efforts and not the interrelationships among the various levels when she explained:

I think it's very systemic. And one of the reasons it's become very systemic is because we have a plan that frames it. We've done the elementary piece. We're going to do the middle school piece next year and the high school is down the road. So it's all a part of a plan; it's all a goal that we're working towards that people are aware of so I think that's good.

Others, however, pointed to the need to become more systemic. According to Dr. Wright:


I am a student of change and have read a lot about systemic reform and what that means, and I think it is darn hard to get it done. Schools and districts for so long have been able to be autonomous little islands without a lot of connection. By writing Assuring the Essentials, part of my goal was to show people we had to become more focused around learning, to become more systemic in our approach. We have started little projects here and there, but we need to pull them all together. (Wright interview, June 1999)

And Mr. Green, Director of Human Resources:

Looking at the whole philosophy (of standards) as well because what we want to do is incorporate that philosophy in all of our efforts in human resources and I guess another area I mentioned is teacher evaluation ... What we've gotten to is developing questions (for interviewing), sharing those questions, but we need to go beyond that and look at some of the other human resource issues like evaluation, incentives, merit pay. (Green interview, June 1999)

Dissemination

The dissemination of new ideas and knowledge surrounding standards-based reform again depended heavily on the district's ability to use facilitating factors that promoted or enabled this part of the learning cycle. Facilitating factors that were found to enable dissemination of knowledge included a climate of openness and continuous education. Key differences between LEAs again emerged across both use and strength of these facilitating factors that acted to promote different understandings of standards-based reform.


Climate of Openness

A climate of openness refers to the permeability of communication boundaries and what Spillane (1997) refers to as social capital. This facilitating factor is the degree to which people trust one another and can utilize the trust inherent within these relationships to communicate and agree upon direction for a school district.

Figure 5.10 Degree of Openness
High ............................................................................... Low
River Valley     Midplains     Front Range

Case 1 Midplains School District. From the initial design of Midplains' implementation plan for standards-based reform, a clear communication process assured people's understanding of what was happening. Involvement of teachers and community, and the design of the adoption process, was all designed to communicate expectations and openness. Social trust among administrators, teachers and community members was also paramount as Midplains began their form of standards-based reform. According to Dr. Rodriquez:

But you know in a smaller school district in a sense, you know, we're a family and we can beat each other up and we can be nice to each other but we still see each other every day. We better figure out how to make this work. (Rodriquez interview, March 1999)


Mrs. Adams also discussed the importance of social trust in implementing standards-based reforms:

You'll hear teachers say we are in this together because we haven't gone around blaming each other and what we've done, I know this is a word we all use, but teachers have really been empowered. They're the ones that decided what we should teach, how do we teach it, what the best way to assess is, what are the best materials for kids. It's their decision. It was not top down. All we said is you answer the questions and then we just said this is the process we'll follow and make sure we can live with it. (Adams interview, March 1999)

However, a $1.2 million budget shortfall one year, which resulted in a reduction of 14 classified staff, and a student walk-out over a lack of ESL services also hampered social trust.

Have you had your whole classified staff come to board meetings and holler and shout and involve community people? And so you know I think it's painful but it happens but we still have to stay focused on the fact it is about achievement in kids. As hard as it is, we have to and we were lucky that we were able to do that. As a result of all of that we even had a lawsuit go on about 14 of our classified staff said the budget deficit was made up and we lied about it and it's a political thing. But the good news was the other 150 didn't. They went along with it, we moved on. We all, we go to and I think it's made us better and stronger, most of us, and older. (Rodriquez interview, March 1999)

Case 2 River Valley School District. A climate of openness, or using social capital (Spillane, 1999) to promote standards-based reform, emerged in River Valley as another strong facilitating factor that helped advance the district's organizational learning. This aggregate factor emerged from other factors within the district.
First, because the district's reforms centered on achievement, they struck at the "heart of teachers' intentions" (Zeus interview, March 1999). Second, because a


common message existed about achievement and use of data to support all decisions, teachers knew any new idea was not a fad (Seneca interview, June 1999). In addition, because of this common message or focus for the district, a culture based on standards has developed. "There's a real high comfort level in teachers. That's how they are introduced to the system. They know that's how we do things here" (Jackson interview, March 1999). Teachers were also made to feel valued both through monetary incentives and professional treatment.

That paradigm shift had to occur. For a culture of individuals who now thinks about what is it I want kids to know and be able to do has to happen first and it can only happen at the classroom level. ... And more important, I think, than actual money is the strong message that goes to teachers. This is important work. You are the experts who need to do it. And we value your time and expertise plus we're going to pay you for it. ... We are valuing teachers not just as leaders but as partners in all of this. (Seneca interview, June 1999)

Because these factors created a climate of openness in River Valley, the district was chosen as one of six exemplary sites by the National Education Association's Center for Innovation in 1998. River Valley was chosen because of its "shared accountability system," and the NEA "commended the district's teacher union, district leadership, teachers and Board for building a climate of collaboration, trust and respect in the district that allowed these changes to occur" (River Valley press release, October 30, 1998).


Case 3 Front Range School District. In contrast to both Midplains and River Valley, a climate of openness was not as apparent in Front Range. While district administrators in Midplains and River Valley discussed working with teachers and trusting their judgment in formulating responses to reforms, many Front Range administrators alluded to changes that needed to be made throughout the district.

So I think those are extra efforts that need to be made. We made some good efforts there, but we need to do more. And it's more than just Dr. Wright saying here's what we need to do, or I'll come and make a presentation. And so all of administrators understanding what needs to happen, telling parents and community members about this, about the benefits of it. And frankly teachers need to be more involved ... I think for the most part it's individuals' willingness to make some drastic changes in how we do business. (Green interview, June 1999)

They (teachers) would tell you more time. But it's a combination of time, I hate the cliche, but I think it has to do with their attitude being open to it and new knowledge. I think we have to bump our present mental model on what school is and some, some of us learn better by seeing it work in other places and there's so few good models out there. It makes it hard. Some of us learn by talking it through with colleagues. But too often when you're in a situation, they can't reinvent themselves, and I've watched that ... Some people are there and we see that. We're starting to hear excited people talking about what they did. There's a number of these independent projects that are showing people it's possible. It's set the stage now a critical mass is beginning to accept it. More people just need to accept it and do it. (Klein interview, June 1999)

Continuous Education

Continuous education is the internalization of a commitment to lifelong learning at all levels of an organization.
According to Dibella and Nevis (1998), the achievement of a high level of continuous education requires work settings that support learning of all kinds (p. 71). Much of the educational change literature


alludes to the role of professional development as a key in changing teacher behaviors and increasing student achievement. However, common forms of professional development such as one-day training have been faulted for their lack of impact. To truly have continuous education means an organization utilizes multiple ways for people to learn both inside and outside the organization, and supports high quality learning through providing necessary resources.

Figure 5.11 Degree of Continuous Education
High ............................................................................................................. Low
River Valley, Front Range, Midplains

Case 1 Midplains School District. In this sense, Midplains had a low degree of continuous education, relying mainly on their focus on content development as professional development and relying heavily on outside workshops for teachers' continuous education. The one concentrated avenue for continuous learning in Midplains occurs through an ESL endorsement program offered by a local university and taken by 40 teachers in the district. All other continuous education is decided by individuals for movement on Midplains' salary schedule.

Case 2 River Valley School District. Continuous education also emerged as an important and critical facilitating factor for implementing standards-based reform in River Valley. This factor helped drive River Valley's orientation of internal knowledge sources. River Valley primarily relied on four forms of


continuous education. First, all new teachers in the district participate in a required three-year induction program that helped them build their skills around the ten teacher performance standards. This induction system modeled a standards-based environment because teachers have to demonstrate proficiency on a performance standard in order to move to the next one. Second, once teachers move past the probationary stage, growth plans have to be differentiated by teacher developmental stage. To receive performance pay, teachers have to show behavioral changes in the goal area they selected and move along a continuum of performance on the chosen standard. Third, River Valley utilized an early release day every Wednesday afternoon during the school year for school-based professional development. These early release days addressed a school's particular achievement goal, and most schools used whole school formats such as looking at student work, lesson studies or book studies to develop solutions to achievement problems (Jackson interview, March 1999). Last, River Valley designed an ongoing summer institute, the Professional Development Center, to help teachers collegially discuss what standards look like in practice. First begun in 1995, the PDC has occurred every summer since.

Case 3 Front Range School District. Continuous education, focused primarily on individuals, emerged as one of the most utilized and influential facilitating factors in


Front Range. Using the differentiation of instruction as their major emphasis, the Front Range staff development department utilized numerous models for internal learning and numerous external consultants the past five years while moving toward more school-based models. During the time of this study, 150 district teachers had participated in a week-long workshop on differentiating instruction. The previous year, Front Range had also begun their Research for Better Teaching network. This network consisted of 70 teachers who met monthly around "common problems around instructional practice" (Klein interview, June 1999). Teachers who shared similar concerns would group together, read research, and develop solutions that they would try in classrooms and analyze the results. Similarly, Dr. Wright's entrepreneurial fund was staffed by personnel in the staff development department who helped schools in numerous ways including meeting facilitation, data analysis, research or training.

Utilization

How districts utilized new knowledge, ideas, and interpretations surrounding standards-based reform also depended on the district's ability to use facilitating factors that promoted or enabled this part of the learning cycle. Facilitating factors that were found to enable utilization included multiple advocates,


accountability, and resources. Key differences between LEAs emerged across both use and strength of these facilitating factors, especially in the area of accountability, that acted to promote different degrees of utilization.

Multiple Advocates

Multiple advocates refers to the existence of multiple champions for a cause at all levels of an organization. In this study, multiple advocates emerged primarily as having all levels, including central administration, building administration, and teachers, advocating the same understanding or goals for reform.

Figure 5.12 Degree of Multiple Advocates
High ............................................................................................................. Low
River Valley, Midplains, Front Range

Case 1 Midplains School District. Along with district leaders, multiple advocates in the Midplains School District helped lead the process of defining standards and choosing new materials. These advocates primarily came from the teaching force, but as discussed earlier also came from the outside as consultants and community members. Each of these advocates went through training in various processes, and according to Dr. Rodriquez (interview, March 1999) the participant structure helped add to the quality control of the process and end product.


In addition, Mrs. Baker noted that teacher leaders began to emerge both from the reform process and as a result of a clearer achievement picture offered by state testing:

... there's some healthy respect now for our teacher leaders, you know, that they are closest to what happens in the classroom. They're teachers. They need to have a say in what goes on. It comes down to the respect we have for that person's abilities ... I think some days super people can do anything. (Baker interview, March 1999)

Case 2 River Valley School District. Multiple advocates for standards-based reform, including the board of education, helped promote standards-based reforms in River Valley. For instance, teacher liaisons led the first training in buildings, and teacher leaders led the summer PDC. In addition, building principals and central office administrators worked across the district in various training roles to help the district's ongoing reform efforts.

Case 3 Front Range School District. Advocates for reform in Front Range existed primarily at the central administration level. Little evidence suggested that principals or teachers were involved with decision making or helping decide the direction of reforms in Front Range.

Accountability

Accountability, defined as a factor in which people understand their responsibility to follow through with the organizational direction or to be able to answer for their


actions, helps the utilization of knowledge. This factor emerged both from the literature and from data as numerous people across districts discussed the importance of being more accountable for student learning. However, only River Valley had any true accountability mechanism in place to enable utilization.

Figure 5.13 Degree of Accountability
High ............................................................................................................. Low
River Valley, Midplains, Front Range

Case 1 Midplains School District. In Midplains, accountability emerged as something everybody acknowledged as important, but no clear accountability structures were evident or anticipated. For instance, Dr. Rodriquez discussed the importance of evaluating teachers based on the principles of a standards-based classroom, as did all other principals. But all acknowledged that a policy basing teacher evaluation on student performance would be difficult to bring about. Dr. Rodriquez also discussed the possibility of using a pool of extra money as an incentive for teachers to implement standards-based reforms, but this had not occurred due to the volatility of the issue (Rodriquez interview, March 1999). Accountability in Midplains emerged more as a sense of responsibility for better student performance. For instance, Dr. Rodriquez talked about having state and community people "watching us every day whether we like it or not, so I'm holding


us to high standards" (Rodriquez interview, March 1999). Similarly, Mrs. Baker talked about accountability as an ill-defined pressure:

We feel pressure to get better. Pressure from the state, pressure from the building administrator. Pressure from, I would say in general, you know it's their own personal beliefs that they want kids to perform better. They want, and I'm speaking about our building, they want this building to succeed. We are a community, you know, and our success is measured by the state, but there is no incentive. (Baker interview, March 1999)

Case 2 River Valley School District. Accountability as the responsibility for follow through and utilization of district direction and knowledge occurred on four levels in River Valley. First, at a district level, the board of education and central office leaders set high but achievable goals for student progress every year, with part of the superintendent's salary tied to meeting these goals. Second, building principals were also required to set achievement goals and report annual progress in person to the board of education as well as the school's community. A principal's performance appraisal consisted primarily of raising achievement, and they too could be awarded performance incentives for meeting goals in their buildings. Third, as previously discussed, teacher performance standards based on student achievement and professional growth formed the core of teacher accountability. Last, student accountability existed on two levels. The first required students to know what standards were required of them on units and to graduate. In this way, standards and


accountability acted as a learning tool. The second required students to pass competency testing at the high school level to receive a diploma.

Case 3 Front Range School District. Accountability as a facilitating factor in Front Range surfaced in two rudimentary ways. First, the district had completed a time allocation study at the elementary level as part of Assuring the Essentials. This study, done by elementary teachers, decided how much time should be spent teaching the essentials, and what would constitute evidence for student success. According to Dr. Grant:

... Now that's one thing about the Assuring the Essentials that actually the elementary planning meeting on time allocation are good. But that talks about accountability to what teachers need to be teaching in their classrooms relative to the curriculum. So even though they now close their doors, the accountability is so high around instruction and student achievement which is a good thing I think is going to delimit some of that teachers getting stuck on favorite projects. (Grant interview, June 1999)

Second, an initial start into assessment and use of data had been made by district administrators for accountability purposes. As explained by Mrs. Klein and Dr. Grant:

Well, I think the accountability awareness (from CSAP) is incredibly helpful in a way. So no one likes those tests, but they're certainly paying attention to them. (Klein interview, June 1999)

Oh, CSAP is just one small piece. The assessment accountability piece is multiple. We believe that assessments, standardized district assessments, state assessments, national assessments, they all give us a different piece of information so we use multiple assessments to look at achievement of students.
I don't know if that's the main, that is, that's truly part of it, probably a major part of the accountability for teachers because when we, as we become more and more data driven, we're looking at the data and the data comes from the classroom from the kids and that is, comes from the teachers and what they're teaching in the classroom. So I would say, yeah, that probably is a major portion of it. (Grant interview, June 1999)


However, no teacher or school evaluation or structural elements like merit pay or reporting existed to heighten the accountability in Front Range.

Resources

Resources as a facilitating factor refers to how school districts allocate or use time, money and structures for learning initiatives. Increases in resources alone would not be enough to increase organizational learning capabilities without a clear and direct relationship to enhancing a learning orientation.

Figure 5.14 Degree of Resources
High ............................................................................................................. Low
River Valley, Midplains, Front Range

Case 1 Midplains School District. Midplains used their resources very clearly to gain people time to decide new outcomes for student learning and new materials. Changes in structures as a resource were less evident.

Time: Early in Midplains' learning and implementation efforts, extra time was allocated for curriculum work. Every Monday afternoon for a year, students were sent home early so teachers could work on their curriculum teams for adopting curriculum work plans. More recently, regularly scheduled inservice days (4 per year) were allocated for curriculum work. Extra time was also granted teachers and


curriculum teams to finalize their work through hiring of substitute teachers for a length of time.

Money was specifically allocated for three major purposes in the implementation of standards-based reforms. First, outside curriculum facilitators were hired to help write curriculum work plans at the cost of $80,000 (Rodriquez interview, March 1999). Second, approximately $300,000 to $500,000 per year since 1997 was allocated for the purchase of new curriculum materials. Third, incidental costs for substitutes, travel and laptops for curriculum work teams were also allocated. Specific money for professional development came from individual building budgets. Because of mismanagement of money by a previous superintendent, Midplains inherited a $1.2 million deficit. However, to fund many of their ESL initiatives, an ongoing Federal grant has been written and funded for a total of $2.3 million.

Structures as a means to allocate time, personnel and efforts have not been as thoroughly used in the Midplains School District to enable learning initiatives. Most structures set in place again dealt with how curriculum standards and materials were decided. However, the district was beginning to work on the issue of merit pay as an incentive for teachers to implement the standards-based reforms and had already set aside $100,000 for this structure.


Some schools also began to change their structures to meet the demands of standards-based reform. The elementary school put in place literacy blocks in which grouping patterns were decided by student need. The elementary school has also begun to utilize a reading lab that reallocated Title I reading personnel based on need.

Case 2 River Valley School District. To support standards-based reforms, River Valley made a conscious effort to align resources with the learning needs of both students and teachers. River Valley used their resources very clearly to gain people time and incentives to improve achievement problems. Structural changes were also deliberately used to support district and school reforms. As previously discussed, River Valley modified their weekly schedule to allow teachers more professional development time.

Our board made a real big commitment of time and money to make this happen. Make it work well. Our early release Wednesdays where teachers have the time, the number one thing they need to make the changes. And a pretty big commitment from our community in allowing us to do that. We really didn't have an angry mob of parents demanding we not do it. We explained it and they went for it. (Jackson interview, March 1999)

Money in River Valley was specifically allocated for major purposes in their implementation of standards-based reforms. First, because of their reliance on achievement data, River Valley spent money for testing materials and scoring of assessments. Second, the summer Professional Development Center budgeted


$250,000 a year to pay for teachers' stipends, materials, and lead teachers as consultants. Because of its importance, district leaders have refused to cancel the PDC even though its costs have come from the district's contingency funds the past two years. Last, annual allocations to buildings for achievement paid for professional development or programs to enhance achievement (Seneca interview, June 1999). Beyond these immediate costs, Dr. Zeus also alluded to others:

Right now, I imagine we're in the millions of dollars spent on all of this. I look not only at the PDC and buildings but providing substitutes for testing days and other meetings and some of those trainings as well as paying teachers. I don't have an exact cost. I don't know that we've ever exactly done a cost effective analysis of anything, we've just kind of moved forward. (Zeus interview, March 1999)

Structures in River Valley helped allocate knowledge, solve achievement problems, and give people incentives to perform at higher levels. All content standard areas were led by teacher leaders and building liaisons who had PDC training. The content area liaison's role was to disseminate all information and changes in standards to buildings and to disseminate achievement solutions. A second structure previously examined was River Valley's structure for professional development that relied on induction, building-based plans and the PDC. Third, structures at a school level helped solve achievement problems. Elementary schools, for instance, have instituted more schoolwide Title I programs utilizing regrouping strategies. High schools have all implemented block scheduling to allow more time


for combining academic and career-based activities. River Valley's new high school will also use structure to help solve achievement problems. This new school will use houses or academies to personalize the high school education of its students. Last, River Valley designed a different incentive structure for its teachers to promote continual growth. Using a modified pay-for-performance plan as its base, River Valley teachers have to receive an acceptable evaluation on all performance standards to receive a salary increment. Teachers also have to show demonstrable changes in a teaching behavior from their growth plan for further incentive payment. They can also get incentive pay for work outside regular classroom activities.

Case 3 Front Range School District. Because Front Range was in a transition toward Assuring the Essentials strategies, many issues surfaced around the need for more and different uses of resources. A need for more time emerged as critical in Front Range in order to better understand the suggested reforms. However, no changes were being promoted to find ways to gain more time. According to Dr. Grant:

I think one issue has to be time. We don't have the time to give to teachers to have the dialogue in teams, in grade levels and at schools to really talk about student work. To have the standard, to put the student work on the table and say now is this what we are expecting? Is this really grade level? What should we be doing differently? What do we need to be doing? I mean time is really a factor in professional development dialog and we need to change the structure of our days I believe to include that time. Because I don't believe there, there's not any more time in the day. There isn't. (Grant interview, June 1999)


Similarly, Mrs. Klein alluded to the need for more time and teachers' unwillingness to put in extra time without incentive:

Well the structure of the calendar is a hot one here. Length of school days. And the number of days that you teach. Whether professional development should be on your own time or our time. All, all those kinds of things. Sometimes you get crabby people that if you put in one more assessment that they have to grade, then you'd better be willing to pay. (Klein interview, June 1999)

Money and the need for more state support for mandated policies emerged as a key factor in Front Range. According to Dr. Hayes, "We've probably, conservatively, spent ten million dollars this decade on these reforms which did not come from the state" (Hayes interview, June 1999). Similarly, Dr. Wright argued, "It's not that we don't see these policies as important, but it is pretty hard deciding on a raise for teachers versus hiring a consultant" (Wright interview, June 1999).

Money that was earmarked for implementation was specifically allocated for four purposes. First, $250,000 was designated for Dr. Wright's entrepreneurial fund for learning initiatives. Second, significant staff allocations were given to elementary schools to enhance their literacy programs, costing close to $500,000. Third, more money was allocated to staff development to enhance staff capabilities. Last, extra personnel within the Department of Instruction to handle assessment and professional development added extra budget considerations. In addition, if Assuring the Essentials was completely implemented, an additional $1.8 million would have to be redirected (Wright interview, June 1999). Extra money for particular


projects, like middle school science improvement for instance, came from grant sources.

No specific structures were evident other than the Elementary Time Allocation Policy. This policy mandated daily time allotments for elementary teachers around the essentials of reading, writing, math and technology. Leftover time during the day could be used for other subjects, and whole school schedules had to be built around academic literacy blocks. Assuring the Essentials, however, promoted many structural changes to support learning efforts including student progress policies, funding, accountability, and programs.

Conclusion

Facilitating Factors in the original Dibella and Nevis model (1998) are the practices or conditions that promote and enable learning within organizations. The presence or absence of these factors in LEAs determined the efficiency and effectiveness of promoting learning in all phases of the learning cycle. None of the districts studied could be considered competent in all facilitating factors. In aggregate, these factors help show why organizational learning occurred or not. However, facilitating factors alone did not guarantee that useful learning occurred. In some instances, facilitating factors were present but did not enable a deep


understanding of standards-based reform. What then did determine the depth of learning about standards-based reform? I turn to this issue in the next chapter.


CHAPTER 6

IMPLEMENTATION AS ORGANIZATIONAL LEARNING CAPABILITY: UNDERSTANDING VARIATION AS LOCAL MEANING

Understanding variations in learning orientations and use or strength of facilitating factors in LEAs begins to help unravel why districts developed different meanings for standards-based reform. However, understanding these orientations and factors alone does not show how these elements interacted to shape these varied meanings. How LEAs acquired, interpreted, disseminated, and utilized reform ideas in part depended upon how learning elements interacted over time and were affected by facilitating factors. To understand these interactions, this chapter will primarily address the following questions:

3. Using learning orientations and facilitating factors, how did Local Education Agencies (LEAs) construct local interpretations of state-mandated standards-based reform policies?

4. How do differences in learning orientations and facilitating factors explain variation of interpretation and implementation of standards-based reform policies?


Constructing Local Interpretations of Standards-Based Reform Policy

To fully understand what standards-based reform meant in the three LEAs studied, the interaction of learning orientations and facilitating factors in each district must first be examined. A summary table of learning orientations and facilitating factors is provided for each case. How learning orientations and facilitating factors interacted for each LEA studied will be examined below.

Case 1: Midplains

Table 6.1 Summary of Midplains' Learning Orientations and Facilitating Factors

Acquisition
  Learning Orientations (describe how learning occurs and what is learned):
    1. Knowledge Source: Internal .......................................X.... External
    2. Learning Focus: Content ..X......................................... Process
    3. Learner Focus: Individual .....................................X.. Schools
    4. Use of Data: Purposeful .....................................X Casual
  Strong Facilitating Factors (practices or conditions that promote and enable learning):

Policy Interpretation
  Learning Orientations:
    5. Interpretive Mechanism: Implicit .X......................................... Explicit to classroom experience
    6. Interpretive Orientation: Processes .X....................................... Achievement
  Strong Facilitating Factors: Leadership cognition

Dissemination
  Learning Orientations:
    7. Dissemination Mode: Delivery .X........................................ Construction
    8. Knowledge Reserve: Explicit link ..................................X Implicit link
  Strong Facilitating Factors: Climate of openness

Utilization
  Learning Orientations:
    9. Learning Scope: Incremental .X.................................. Transformative
    10. Value Chain: Things ..X......................................... Human Resources


  Strong Facilitating Factors: Resources

While many people in the Midplains School District eventually assisted in the process of deciding standards, curriculum work plans and aligned materials, one key central administrator crafted the meaning of standards-based reform as content: Dr. Mary Rodriquez. Dr. Rodriquez was hired as assistant superintendent in the early 1990's:

At the time I was appointed assistant superintendent and we changed the whole, all of the responsibilities for the assistant superintendent. They used to be a person who was just in charge of operations and personnel. And when the former superintendent left, he realized that standards were coming and that we had done nothing and most people were two or three years into it, and here we're just kind of getting started. And so the position of the assistant superintendent was changed to include curriculum, instruction and assessment. (Rodriquez interview, March 1999)

From that vantage point, Dr. Rodriquez began to examine how to adopt standards and how to engage the staff and community in the process. When she ascended to the superintendency in the Midplains School District, Dr. Rodriquez began a process intending to generate higher level outcomes for curriculum. These outcomes were to be based around staff and community engagement, articulation between grades, and quality control measures to assure better content.

In interpreting the intent and meaning of standards-based reform, Dr. Rodriquez first began with designing a process for interpretation intending to engage school personnel and community members:

So that fall (1995) we sat down and looked at the law, just kind of reflected on where we were and decided that we needed to pull together a group of people to develop the process. So we


would call together a principal from every building and a couple of teachers, include a teacher from every grade level, parents, some students, and we sat around and talked, learned about what needed to be done and developed a process. (Rodriquez interview, March 1999)

These brainstorming sessions eventually ended up in the design of community forums, with invitations sent out to parents, staff, community members, and even senior citizens. The community forums were designed to elicit responses from the community on one simple question: what should students know and be able to do when they graduate from Midplains School District? Although district leaders did not expect a huge turnout, 400 community residents turned out to answer that question for core content areas and graduation standards, or those skills that cut across all content areas. Because teachers did not attend these meetings, a similar process was used to engage them:

... we didn't have enough teachers attending though. That really bothered us. We thought they would come up and attend during the evening. And so we thought this isn't good, so we took a day and had teachers do the very same process, all teachers P through 12 and they were real good. They were divided into groups which forced P through 12 teachers to be in every group so that it was a broad perspective of listening and thinking about what students should know and be able to do. (Rodriquez interview, March 1999)

Using input from the community forums, a second process, based on articulation, was developed. This process was focused on assuring a tighter coupling between grades, levels and teachers. This secondary process, which took the greater part of two years, was also highly focused on the content to be taught under each standard in


all areas addressed by the Midplains curriculum. Designed as an iterative process, articulation processes allowed teachers a voice in what students should know and do while helping all levels see the connection between standards, content and classroom practice. According to Dr. Rodriquez:

They (teachers and committees) had to make sure they were going through the process, their work was always sent back to all the teachers because there was always a teacher from every class or grade level. And it was their responsibility to take the work back to their grade level discussions and also across the grade level discussions, and then we built committees that were forced to look if a building has like for example has a preschool through grade 2 then those grade 2 reps had to talk to the grade 3 reps over here across campus to make sure they could live with what the next group had, next group of teachers were responsible for ... so we have this cross discussion going on all the time to make sure teachers P through 12 so they all got a sense of what their articulation piece would look like. (Rodriquez interview, March 1999)

This process defined expectations and methods, and articulated essential questions that helped further interpret the meaning of standards-based reform as a content-based interpretation:

... we learned, not only do you have a process in place but we set parameters, your expectations about what everybody should do, what's acceptable, what the outcome should look like. Because if you don't people will go all over the place and you won't get answers to your questions. Plus you have to set the parameters so that people understand it's not like a strong sub group of the committee that they can lead a group in some other direction that really takes us off the path. (Rodriquez interview, March 1999)

The questions that helped define the process were used to fill in a curriculum work plan. According to Ms. Adams, current director of staff development and curriculum, the questions designed into the curriculum work plan are "basic good curriculum kinds of questions" (Adams interview, March 1999). These questions included such things as what are the grade level benchmarks? What knowledge,


skills, abilities and processes are needed to reach the benchmark? What are the essential concepts on which to assess students at the district level? What is the best way to assess the essential concepts on district-wide assessments? What instructional strategies can a teacher use? What is the availability of ready-made district-level assessments which align with the essential concepts? (Midplains Curriculum Work Plan Overview, 1997). But in answering those questions, they (content committees) had to come back to the same process again and again. They met with their teachers, P through 12, and committees answering those questions ... What they were doing was developing curriculum. (Adams interview, March 1999) A third process developed in the Midplains School District was designed to enhance quality control. At the outset, Dr. Rodriquez knew a formal process was needed both for the interpretation and for giving the process credibility (Baker interview, March 1999). Therefore, the Council for Curricular Excellence was established as an oversight group for quality control. Headed by Dr. Rodriquez, this group gave final approval of both standards and curriculum materials before board approval and acted with great authority and autonomy. Each content area committee was required to bring its finalized curriculum work plans to the Council for Curricular Excellence in a formal presentation. As explained by Dr. Rodriquez: So they (Council) would critique the work as well and sometimes they'd pass it forward to the board and other times they say you know you need to go back and really rethink this and did you answer all the questions, we heard people say this, now would you go back and check this input you got, make sure we're reflecting on all those. (Rodriquez interview, March 1999)


Table 6.2 Summary of River Valley's Learning Orientations and Facilitating Factors

Learning Phase: Acquisition
Learning Orientations (Describe how learning occurs and what is learned)
1. Knowledge Source: Internal .X........ External
2. Learning Focus: Content ........X. Process
3. Learner Focus: Individual ....X..... Schools
4. Use of Data: Purposeful .X........ Casual
Strong Facilitating Factors (Practices or conditions that promote and enable learning): Scanning, Policy, Concern for measurement, Involved leadership, Systems perspective

Learning Phase: Interpretation
Learning Orientations
5. Interpretive Mechanism: Implicit ........X Explicit (to classroom experience)
6. Interpretive Orientation: Processes ........X. Achievement
Strong Facilitating Factors: Leadership cognition

Learning Phase: Dissemination
Learning Orientations
7. Dissemination Mode: Delivery ........X. Construction
8. Knowledge Reserve: Explicit link X........ Implicit link
Strong Facilitating Factors: Climate of openness, Continuous education, Multiple advocates

Learning Phase: Utilization
Learning Orientations
9. Learning Scope: Incremental ........X. Transformative
10. Value Chain: Things ........X.. Human Resources
Strong Facilitating Factors: Accountability, Resources

The River Valley School District engaged different learning orientations and facilitating factors to determine its own unique meaning of standards-based reform. These orientations and factors included a history of achievement and student


accountability, strong leadership in assessment, involvement of teachers from the start, the design of professional development as interpretation, and the alignment of resources and policies to support the district's philosophy. Each of these processes and elements and their iterative stages were built upon one common ideal: high student achievement. Seen over time, these processes and elements helped make River Valley's meaning of standards-based reform coherent and aligned. First, a long history of high achievement and student accountability for achievement preceded state policy. From a community call for student accountability in the mid-1980's, River Valley implemented a proficiency-based high school diploma. This diploma required students to show certain performance levels in content areas like reading, writing, oral communication, mathematics, science and swimming in order to graduate. Later in the decade, students took criterion-referenced tests in reading, writing and science in other grades. Purchased from outside sources, these tests assured a required level of high achievement for graduation and helped evaluate curriculum programs within the district. Second, because of River Valley's history of criterion-referenced testing for high schools, strong leadership in the assessment department emerged that helped couple assessment to instructional planning in the classroom. According to Dr. Seneca, Mrs. Glitton, former director of assessment, envisioned a major role for classroom


assessment in a standards-based school district. In early work with River Valley's local BOCES, Mrs. Glitton worked with other visionary leaders in the state to help interpret what standards-based reform could mean in the classroom and its relation to assessment and data. Similarly, because River Valley had used criterion-referenced testing as an accountability mechanism since the mid-1980's, standards-based reform as data interpretation reinforced schools' practice of attending to data. By 1998, the district reported classroom, school and district, and state level assessment data as a form of accountability. Schools were responsible for understanding major discrepancies between any of the levels. Third, because of River Valley's history with proficiencies, the adoption of standards happened more easily (Seneca interview, June 1999). River Valley's curriculum liaison structure had been in place for some time, and when standards evolved through policy, revisions quickly met or exceeded state expectations. River Valley first officially adopted proficiencies in 1990 as a result of the outcomes-based education movement of the 1980's. Once state model content standards were made available, district teams only had to compare and make minor revisions, with final adoption occurring in June of 1996. This allowed for the gradual phase-in of essential areas over time. In 1996-1997, River Valley phased in reading and writing standards while the other content areas had to be phased in over the next four years. Secondary


schools phased in all first wave areas including reading, writing, mathematics, science, history and geography in 1996-1997. Because River Valley adopted original content standards six years ahead of state requirements, involvement of teachers from the start focused on training of standards-based building liaisons. These liaisons, as well as administrators, learned to help teachers understand philosophical differences between traditional systems and standards-based systems. As a natural follow-up to defining expectations for student learning, standards-based training liaisons not only helped disseminate understanding of these reforms. They were also seen by their colleagues as partners with administrators in interpreting the classroom version of standards-based reform. All faculties were trained in characteristics of standards-based classrooms, led by building administrators, central office personnel, and teacher liaisons. In essence, all faculties received the same clear message about the direction of the district. According to Mrs. Jackson: I'm thinking about early on. There was this overriding message that came from every interaction we had, that we are changing things to improve or enhance student learning. There was this whole framework of this is not a fad that's coming through that we're going to jump on the bandwagon. This is not the latest staff development craze. This is something we believe in our hearts of hearts is good for kids. (Jackson interview, March 1999) Fourth, teachers primarily designed professional development around an interpretive focus on the philosophy of a standards-based classroom, instructional planning and


assessment, leading to a common district understanding. The major interpretation of River Valley's standards-based philosophy came from this professional development, which linked standards to classroom processes through assessment. The summer Professional Development Center (PDC) came about from discussions between Mrs. Glitton, standards-based liaisons, district principals, and central administrators. Knowing that the philosophy of a standards-based classroom would only help teachers so far, multiple advocates designed the PDC as a way for teachers to interpret standards through the design of course organizers, instructional units, and assessments. Using common instructional organizers, teachers learned design steps of a standards-based unit and the development of multiple assessment tasks that would measure achievement of standards. Through 1999, the PDC has been attended by over 70% of all teachers in River Valley and has been administered by teachers (Jackson interview, March 1999). Administrators attended as facilitators and listeners to help teachers think through the implications of their designs. By using the PDC as the on-going form of professional development, teachers received the strong message that proper design and attention to data is an important part of standards-based reform. Last, alignment of resources and policies helped show teachers that their input was valued and that the district was serious about implementing standards-based


reform in classrooms. Teacher evaluation, for instance, aligned with the district's standards-based reform efforts through redesign in 1996. Ten teacher standards used the language of the district's philosophy, including the use of standards in planning, use of assessment data, and differentiating instruction to enhance the probability of student achievement. Required teacher growth plans differentiated levels of performance, different types of goals, and different resources according to need. All growth goals in River Valley had to deal with impacting student achievement in some content area. Similarly, principal, superintendent, and central office evaluation revolved primarily around how achievement improved. Planning processes also aligned resource allocations through analyzing the relation between district achievement needs, professional development needs of teachers, and individual school needs.

Case 3: Front Range

Table 6.3 Summary of Front Range's Learning Orientations and Facilitating Factors

Learning Phase: Acquisition
Learning Orientations (Describe how learning occurs and what is learned)
1. Knowledge Source: Internal ....X..... External
2. Learning Focus: Content ........X. Process
3. Learner Focus: Individual .X........ Schools
4. Use of Data: Purposeful ........X Casual
Strong Facilitating Factors (Practices or conditions that promote and enable learning): Organizational curiosity


Learning Phase: Interpretation
Learning Orientations
5. Interpretive Mechanism: Implicit X........ Explicit (to classroom experience)
6. Interpretive Orientation: Processes ....X.... Achievement
Strong Facilitating Factors: (none listed)

Learning Phase: Dissemination
Learning Orientations
7. Dissemination Mode: Delivery .X........ Construction
8. Knowledge Reserve: Explicit link ........X Implicit link
Strong Facilitating Factors: Continuous education

Learning Phase: Utilization
Learning Orientations
9. Learning Scope: Incremental ....X..... Transformative
10. Value Chain: Things ........X. Human Resources
Strong Facilitating Factors: Resources

I think one of the important things is that it really has to be a common thread, a common understanding. Really have to have a collegial and instructional conversations. Because we think we all understand it now and we all have very different perceptions and unless we have agreement about where we're going and what we're doing, we're not going to see much change. (Grant interview, June 1999) The Front Range School District also used different processes through their learning orientations and facilitating factors to come up with their separate, unique but multiple meanings of standards-based reform. Multiple meanings in Front Range were also caused by leaders' different assumptions about the current educational system that did not merge into an agreed-upon mental map. Accompanied by turnover in leadership, Front Range exemplified what Spillane (1998) referred to as a lack of internal homogeneity around meaning. Taken together, this interaction offers a much different picture than the other two cases.


First, an attempt at construct-driven reform and assessment preceded state policy. Front Range, under a different director of secondary education in the late 1980's, used the theories of outcome-based education to attempt to implement a proficiency-driven curriculum. Under the acronym of CCO's (Common Core Objectives), Front Range took three years developing CCO's for every grade level and course. According to Mrs. Bench, current Director of Elementary Education: ... where we got into trouble with CCO's was when we asked teachers to begin to assess all of those objectives individually. We had a system called ALA's (Authentic Learning Assessments) which teachers had to administer for every objective, grade them, keep track of them for every student. It eventually collapsed under its own weight. Teachers about rebelled. (Bench interview, June 1999) Second, a strong instruction and staff development department developed after the CCO era that focused efforts on instruction and away from a focus on content or assessment. After the collapse of the CCO's and some budgetary problems, Front Range restructured their Department of Instruction, doing away with content coordinators. While Front Range developed a process for adopting content standards in the mid-1990's that involved teachers, new leadership in the department of instruction helped focus learning efforts in the district primarily on instructional strategies for teachers through the new department structure. This new leadership team included Mrs. Bench as the Director of Elementary Education hired in 1997, Dr. Grant as the Director of Secondary Education hired in 1996, and Mrs. Klein who was hired as Director of Staff Development in 1995. In conjunction with Dr. Hayes,


the Assistant Superintendent of Instruction, this core group focused district efforts toward differentiating instruction as a way to affect achievement. According to Dr. Grant: ... if we don't impact the instructional strategies and the structures that inhibit students from learning, then we might as well not even have standards ... standards I would say it's supposed to look like success for all students, about differentiating structures in the classrooms and instruction going on in the classrooms that provide the opportunity for all students to be successful. (Grant interview, June 1999) Similarly, Mrs. Klein discussed the original intent of standards-based education in their department: Well, we started off with building instructional repertoire. We were looking at the research on teaching and learning as a whole and finding that there was lots of patterns and people doing the same old stuff. We're trying to come up with a way of bumping people in, probably their most comfortable way that instructional strategies could change. (Klein interview, June 1999) Although members of the department used a similar language around differentiating instruction, they supported different initiatives based on a foundation of instruction. For instance, Dr. Grant supported a new vision for grading as part of instruction while Mrs. Bench supported a focus on early literacy and new classroom assessment. Dr. Hayes, on the other hand, focused away from the work of schools and classrooms, "leaving that to the experts", and instead focused his efforts on designing strategies to protect the district from state intrusion.


Third, a new superintendent arrived in the fall of 1998 with a different vision for narrowing the focus of achievement in Front Range to buffer oncoming state accreditation requirements. Dr. Wright, according to many on his staff, understood the "socio-political" atmosphere (Hayes interview, June 1999) around standards-based reforms much better than his predecessor. In contrast to the previous superintendent, who took a hands-off role on instruction (Grant interview, June 1999), Dr. Wright understood the need for focusing the district's reform efforts. When he arrived in the district: ... what I found were standards, a few assessments, not much data, a lot of emphasis on instruction and a lot of fearful teachers and principals. They knew this accreditation was coming, they had faced CSAP and not a lot of support around that was evident. Our achievement was okay, but there didn't seem to be an overall direction. I talked a lot to our board and our departments and everybody seemed to look at things a little bit differently. I decided to write an overall document to focus our efforts. But as I wrote this I wanted to make it clear that without extra resource support, we couldn't focus on everything at once. That's why I called it Assuring the Essentials. What is it that all students have to know and be able to do to be successful. If you do too much you lose your staff and nothing gets done very well ... I wrote it myself, revised it a lot with board and staff input and that's the direction we are heading. (Wright personal communication, November 1999) Assuring the Essentials acted as a focal point for Front Range principals and schools and was mentioned frequently in discussions with district administrators. This document called for assuring the essentials in literacy, numeracy, civility and technology, or what Dr. Wright called the "foundation of standards" that would help focus schools and protect them from too many state requirements.
This document also focused the use of district resources around supporting processes for assessment


and data, learning projects, and developing a different way for students to progress through the system based around a certificate of initial mastery in eighth grade. This idea held merit for many district administrators. During the 1998-1999 school year, Dr. Wright personally presented the document to every faculty in the Front Range School District plus numerous parent and civic groups. According to many, this document was the only major interpretive process undertaken around standards-based policy in the district: "I don't know that there was any purposeful conversation about the policy. We've just always assumed it was meant to increase learning for more students" (Klein interview, June 1999). Although the district had adopted standards and planned to design more classroom assessments, the earlier failure of CCO's demanded a different approach. Assuring the Essentials foremost narrowed content down to targeted, essential skills. According to Mrs. Bench: And our district standards called proficiencies and we have begun to target specific proficiencies with sanctions and kind of to say to teachers these are the areas that you will be assessing. Some of the rest of this is not unimportant but it's probably only if you got time, so we tried to, tried to work pretty closely with teachers to make sure .. And of course we still are. (Bench interview, June 1999) In addition to these processes and elements, many district administrators questioned many of the assumptions upon which the educational system relied. These "informal dialogues" (Dr. Grant interview, June 1999) also helped Front Range to interpret their reform efforts, although many challenged different


assumptions. This strong challenge of assumptions led to further multiple interpretations and multiple initiatives that were not necessarily linked to the larger vision for reform. For instance: I think we had to question covering content and not teaching students. I think we had to question our rating system, grades as opposed to learning we wanted ... We had to question separation of academic, evaluation of standards versus the evaluation of nonachievement kinds of things responsibility, timeliness and all of that. (Dr. Grant interview, June 1999) One assumption of education is ranking high or low, rank all the kids from high to low and here's the A group, here's the F group, basically challenging them, get all kids to this minimal level or proficient level. More standardization of the curriculum ... I guess another thing we had to question was whether all of our kids need to go to college and is that really our goal to have X% of the kids go on to college or what are we preparing for? (Mr. Green interview, June 1999) I think some people asked really hard questions. For example which 20 percent do you think that's ok that they don't get it. They're not literate. Are you going to pick them out? I think it was things that were flung at them that really challenged their values ... It could at any time come from any of these groups. Superintendents yes, directors yes ... I mean they look at our data and say that ok with you? Or Dr. Hayes, or Dr. Grant will say which 30 percent are you, are you going to discard? (Mrs. Klein interview, June 1999) Standards, I think, ultimately rely on students' language. If they can't read or write, they won't be able to read or understand textbooks they have to read in high school. In my mind, our early literacy efforts are the most important thing we do. We had to challenge some of our ideas about what literacy looked like, how it was taught, and what we did with students who were not succeeding. (Mrs.
Bench interview, June 1999) There are many assumptions I think we challenged. One was the whole notion of time and structures that drives a certain percentage of students away from school. A second was how much would our community accept. Saying we want all kids to succeed smack right in the face of this whole notion of individual effort. (Dr. Wright interview, June 1999) I think we always assumed education had less to do with schools and all to do with kids. If they want it bad enough they will get it. We have to understand the influence the bigger picture has on what we teach and how we should be teaching it. (Dr. Hayes interview, June 1999)


Variations in Meaning of Standards-Based Reform

Because of the different interactions of learning orientations and facilitating factors within each LEA, distinct and varied meanings for standards-based reform emerged in each district. Again, whether meaning led to learning orientations or the opposite, variation of implementation was driven by learning choices districts made. Distinct meanings for each LEA are examined below.

Meaning of Standards-Based Reform in Midplains

The processes of gaining community input, finalizing curriculum work plans, and gaining formal approval for plans and materials by the Council for Curricular Excellence helped shape the meaning of standards-based reform in Midplains. Because of the way the "learning" process was shaped by central administrators, the content of the standards became the meaning attached to the reform policies. Assessment, progressive notions of teaching and learning, or raising student achievement received little attention. In addition, while the three-step process was designed to adopt standards and new materials, it became clear that this process came to be known as standards-based reform. These subtle, yet powerful, messages were delivered to teachers in various ways through the use of learning orientations and facilitating factors.


First, because the Midplains School District had limited resources and few central office personnel, strategies to focus the efforts of teachers on content eliminated the need for extensive staff development on differences between standards-based classrooms and more traditional classrooms. According to Dr. Rodriquez: Because of the way we designed our process involving teachers we have not had to do a lot of training on what standards-based education looks like. Because they've been involved in the whole process of staff development, developing them saved us later in staff development funds because they already understood this. (Rodriquez interview, March 1999) Second, the scope and nature of the standards adoption process signaled to teachers that standards encompassed the current curriculum taught at all levels, while changes in instruction or school structures were not associated. As explained by Mrs. Baker, elementary principal: I think another thing that's made our standards and benchmarks really a little different maybe than some other schools that we're proud of, that there's a scope and sequence that goes all the way preschool through grade 12. And that was, came out of making sure all areas including occupational education had standards ... (Baker interview, March 1999) In addition: I think they knew we did those a little bit different than maybe some schools in that we didn't concentrate on that first tier the state talked about. We figured we're teaching all of those content areas. We've got to decide if we're teaching to them, they must be important. And if they're not, we've got to give it up, and is there something new that has to be added, and we haven't thought about it. And so we did everything, you name it we did it all. Our occupational education program, there were standards and benchmarks set in everything. (Rodriquez interview, March 1999) We needed it to be preschool through 12.
We had too many definitions of what was important, too many ways of doing things. We were different, the middle school was different, you know. There was no consistency. (Baker interview, March 1999)


Third, decisions made within content areas also signaled content as the important meaning of standards-based reform. For instance, additional requirements were added for graduation: And out of that came some real unusual things. I think one in particular that stands out to me is that there was a language requirement. Every student will be able to read, write and speak English and one other language. (Rodriquez interview, March 1999) In addition, modifications to middle school and high school math curriculum offerings accommodated this perception of standards-based reforms. However, the influence of content in every area as necessary was not seen as beneficial by some: ... to continue to move forward. Not that we're ever going to be done. To continually look at delivery strategies. Because this is a phenomenal amount of curriculum and especially for elementary primary school. It's phenomenal what's expected and so delivery might have to change. Integrated strategies have to be used. There's no other way we can get through content ... I think as an elementary person, when you have content standards and benchmarks in every subject area for every student preschool through twelfth grade, I think we're done ... I think because it's driven by content oriented people, it is beyond possibility. (Baker interview, March 1999) Fourth, other decisions also signaled the importance of a district-wide set of standards and curriculum, including standardized assessment to measure achievement of adopted standards in grades 5-12. According to Mrs. Adams, Midplains did not want to look at classroom assessment first. "We wanted to see the big picture first and back into the classroom assessments" (Adams interview, March 1999). This decision pulled teachers away from more progressive forms of assessment and


pedagogy, and signaled that more traditional forms of assessment were the correct way to measure student achievement. Additionally, the overall sequence of decision making including standards, benchmarks, curriculum materials and district-wide assessment also signaled changes in content as the primary intent of reform. Last, and probably the most powerful, signal to Midplains teachers that standards-based reform meant different content came from the allocation of resources to the process. Curriculum experts from outside the district helped facilitate work groups, and work groups were compensated for time spent working on curriculum work plans. Similarly, from 1995-1997 all district inservice time was allocated for articulation and discussion of work plans, and materials review. According to Dr. Rodriquez: I think that the reflection of whether you've restructured yourself and are doing what's right for kids is when you're looking at how you're spending your money. And when we look at our budget, I can honestly tell you that we've invested in kids and teachers. When you see over the last couple, well three years now, four hundred, five hundred thousand dollars set aside every year for materials. And we've gotten the job done. There's only one group that hasn't ordered their materials yet because they haven't passed the test of their peers about a quality curriculum. (Rodriquez interview, March 1999)

Meaning of Standards-Based Reform in River Valley

I think it (standards-based reform) means that a clearly articulated statement of what kids should know and do drives everything. It drives instruction. It drives selection of materials. It drives achievement data, how we measure it. They've done it linearly, it's more cyclical. It means that every decision made in the district is based on data-information. How we spend money, how we spend teacher time, how we allocate FTE. Everything focuses on a common goal


that says kids have to know and do X. That all requires really good data. (Jackson interview, March 1999) In contrast to the Midplains School District, which focused meaning on the content, River Valley focused their meaning on the processes of using internal data to gain knowledge and feedback to continually improve student achievement of content standards. Known district-wide as "data-driven instruction," these learning orientations and facilitating factors encouraged an evolving philosophy to emerge over a decade that aided the district's learning processes and a sense of shared meaning across the district. This powerful message occurred in River Valley in three ways. First, a clear focus on acceptable achievement levels existed for schools and students. This focus on achievement, not always present, has evolved over time. As explained by Mr. Rich, elementary principal, director of elementary curriculum and accountability: Standards-based education is assuring that there's a level of performance. That there's a difference in the level of knowledge, OK? We are moving to performance. Demonstrated performance. Application of knowledge rather than just seat time and passing grades ... We recognize that there's a certain amount of instruction that needs to be put into an organization where the children have an opportunity to apply it. So we've moved it, which, if somebody really wanted to look at it, it's very, very like John Dewey. Standards-based education is also causing us to have a more rigorous approach to basic reading, writing, arithmetic at least at the elementary level. These laws are insuring that things that should happen in good education, happen. And our district is saying we want to make sure this happens. There are some problems, but overall it's been, well we are able to sit down with kids individually, we're going to make sure they're at grade level. (Rich interview, March 1999)
In addition, River Valley used multiple initiatives concurrently to help raise achievement and help teachers understand the relationship between achievement and standards-based reforms. According to Mrs. Williams, a high school principal in the district:

At the secondary level, we are trying to intertwine standards with School to Careers. School to Careers kind of gives the relevance for high standards. Everybody will go to work at some time or another. Some will go right away, some in four or five years. But everybody will work and they need to know what that will take, both academically and with other skills ... Standards-based also looks a lot at careers. Actually they start at the eighth grade level. And we have a very systematic way of activities and things that happen at every single grade level to reinforce a career path for a student or to say, "well you know, you're kind of following along these two. Kind of, you seem to have aptitudes and interests in these areas." And then doing a lot of the, we're just doing a lot more mentorships and externships with kids. (Williams interview April 1999)

Similarly, in River Valley a pervasive belief existed that all students could learn. Competency levels for core areas in high school have been set and mandated as a requirement for graduation for all students but the most severely disabled. According to Mrs. Jackson:

I think we have, in a lot of places in this district challenged the traditional belief that some kids can't learn. And I think most of us, not only say, but believe all kids can learn. (Jackson interview March 1999)

In addition, as a part of the district's mission and focus on accountability for achievement, schools hold primary responsibility for achievement. According to Dr. Seneca:

Our principals are primarily evaluated on how well their students achieve relative to where they begin.
We have quite a few of our low SES schools outperforming our high SES schools because they have a better idea on where they need to head ... Our schools are connected by one common focus point. The core of it all is everything we do is made to improve student achievement.
That's what ties it together. Everything at the next, if you have to layer it, everything from there is tied to standards, student standards. (Seneca interview, June 1999)

Second, River Valley preached an instructional planning model that focused learning away from textbooks as the main curriculum source, to a focus on standards, performance assessment and student understanding. This planning model ultimately helped teachers "close the circle" (Seneca interview, June 1999) and look at student achievement as the driving force for instruction. According to Mrs. Jackson:

The other thing I think we did that I think has been crucial in the evolution is that the first huge staff development effort focused on instructional planning. How is that different in a standards-based classroom than traditional classrooms? We started where teachers live every day ... And then we logically move into ok now we need to assess differently and then we need to report things differently. That paradigm shift has to happen. For an individual culture of teachers who now thinks about, "What is it I want kids to know and do?" has to happen first, and it can only happen in a classroom. (Jackson interview, March 1999)

According to many administrators interviewed, this instructional planning also impacted student learning because learning targets became clearer and students were able to analyze their own work.

We tell kids you have to understand this. Without having to use the exact language. And then our application of the assessments is their way to show us. They know this. When you go to a first grader and ask why are you doing this, they say "I'm doing this because" maybe not the exact standard but you get the idea. They'd also say, "I should have this in it to be good." They know what is expected up front. The rubrics help clarify and kids are recognizing the differences up front. (Rich interview, March 1999)
Similarly, Mrs. Williams, high school principal, alluded to the impact of instructional planning on high school students.

For most classes, kids, you can walk up to them and say what are you doing, why are you doing this and they can point to the instructional planner teachers have given them or some will say because we are working on this standard. Most can point to specific things they have to do to graduate and say "I'm working toward this." That's really powerful. (Williams interview, March 1999)

Third, the parlance of results shaped further meaning of standards-based reform in River Valley as data-driven instruction. From the better use of assessment as data within the classroom to large scale use of data for resource decisions, data-driven frequently emerged as a common philosophy within the district. Many administrators discussed the need for and use of assessment data at the classroom and school level.

What I do see is an increasing sophistication at the classroom level. Not just in instructional planning because our teachers are doing that pretty well. But in learning to create better and better technically better assessments. (Jackson interview, March 1999)

I think we have probably gotten much better at understanding our assessments and using our assessments to inform better instruction, help make instructional decisions. We have better information on where they are at so we can meet their next learning needs. See if there's anything that the school can learn and we've gotten better processes rather than just taking the previously published assessment and adding it all up and saying "Good. This is our average." We're now looking at it in a different way and we're sharing the assessments. (Rich interview, March 1999)

The whole vision is that data would drive, not only instruction, but would drive our decisions in what's best for kids ... A critical component was how do we get teachers to be better at analyzing data.
How should they use assessment data, not only test score data, but maybe it's performance portfolio data. Analyze things better so they get some type of information from that to use. (Zeus interview, March 1999)
In addition, many administrators discussed the use of data as a way to better inform larger district decisions:

I think in our district, SBE means making sure that you've got a minimum that covers the standards and a maximum that goes beyond. The ability to use assessment data to figure how well you do in your curriculum. And compare that against the implementation, where your adjustments need to be. Data also gives us an indication of where the next training has to go. And hopefully the motivation of the people to feel that, to design training for each person's level of need. (Rich interview, March 1999)

The emphasis has really been at the elementary level to look at data and intervene earlier. I think in the past it's always been on the end result at the high school level. But I think they're starting to see what's happening to kids that, if at the third grade level they can't read or write, especially with reading, that I mean can pinpoint those kids at, you know as early as kindergarten who are going to struggle. That's where a lot of data helps. It shows us where we need to place an emphasis. If we don't get them by the time they reach high school, it's damn tough to teach them to read. (Williams interview, March 1999)

Ultimately, we can only use the information we have. In the past we relied so much on input information not related to student achievement. As a district, we can only be focused on achievement data if that is our primary mission. We have to use credible data to "close the circle". We have to use data and access important data to plan, design intervention and training, and to ultimately influence achievement. We have to also as a group look at and attach some meaning to our data. What does it mean, what does it suggest? We can no longer deny the validity of the amount of data we have before us. It speaks volumes and we have ignored it for too long.
(Seneca interview, June 1999)

Meaning of Standards-Based Reform in Front Range

In contrast to Midplains and River Valley, who both had a fairly focused meaning for reform, Front Range presented meanings that were not internally homogenous (Spillane, 1998). These meanings dealt with student achievement, instruction, the Assuring the Essentials vision, and numerous projects. Furthermore, an ongoing dialogue around prescribed reform through state-mandated policy received attention.
Taken together, these meanings addressed many of the problems that district administrators saw, but their diversity did not encourage a coherent meaning or strategy for organizational learning. Front Range was clearly a district in transition with no clear meaning for their efforts. First, student achievement as the basis for the district's reform efforts existed as a point of agreement for many district administrators but without any relation to necessary structures or elements to focus large scale changes. As explained by Dr. Grant: "I would say it's supposed to look like success for all students" (Grant interview, June 1999). Additionally, this meaning was addressed by Mrs. Klein:

Well it's supposed to mean that there's less of an achievement gap for kids. There is a, this is less of a place of haves and have-nots. That we really do care about what the state is telling us in that we alter our instruction and our goals to make sure that all kids have, have an opportunity to be successful at high levels ... We translate it to more learning for more kids more of the time. (Klein interview, June 1999)

Moreover, even though Front Range disagreed with too much intrusion from the state, Dr. Hayes suggested:

The emphasis on the state's part for higher achievement, I agree with. We need to help more kids all the time read well and write well, do math better. I have always been an advocate of raising achievement. These policies help give us a little bit of leverage, but we have, I hope, always focused on achievement. (Hayes interview, June 1999)

Second, as another meaning for standards-based reforms, instruction offered many district administrators a way to help teachers understand how to raise achievement
without having to convert other classroom practices. According to Dr. Grant, before they could understand standards-based reforms, all teachers in the district needed "... an understanding first of all what standards-based instruction is" (Grant interview, June 1999). In contrast to River Valley's encompassing view of data-driven instruction, Front Range specifically meant isolated teaching strategies. According to Dr. Grant:

If people implement standards really from the standpoint of just creating an objective of what a student should know and be able to do and they don't impact the instructional strategies and the structures that inhibit students from learning, then we might as well not even have standards. (Grant interview, June 1999)

In addition, Mrs. Bench stated:

We have been working on instruction for five, six years now. We found that many teachers did not have the repertoire they needed when they face a class of diverse learners. All the assessment and high level standards in the world won't matter if teachers could not help different learning styles or second language learners. (Bench interview, June 1999)

Similarly, Mrs. Klein discussed the role of instruction as a major direction for her district:

Well, we started off with building instructional repertoire. We were looking at the research on teaching and learning as a whole and finding there was lots of patterns and people doing the same old stuff. We're trying to come up with a way of bumping people in, probably their most comfortable way that instructional strategies could change. And then tying it into standards and assessment, and we did it just like that. Actually the whole time. Where we're not having as much focus yet is probably the curricula mapping. I'd say we need to work on that as a step. (Klein interview, June 1999)

Third, the Assuring the Essentials document helped further shape the diversity of meaning surrounding standards-based reform within the Front Range School District.
As a charge to Dr. Wright from the Board of Education, Assuring the Essentials proposed a "continuous improvement model of education anchored in standards-based education methodology" (p. 2). As an implementation plan, this document acknowledged the all-encompassing nature of changing an educational system based in the past on sorting students into groups by ability. The document also suggested that standards-based reform had to be more than merely adopting standards and having students take performance assessments. According to the document's introduction:

The focus must be on students acquiring essential knowledge, skills, information, mastering technology, and developing desirable character attributes. This, in addition to pushing higher achievement expectations for the general education program and bringing truly creative learning opportunities for those highest achieving students in the District, is a formidable task. To make these changes a reality, we must accept the need to focus our instructional delivery systems, accountability systems and rethink the culture that contributes to higher standards for teaching and learning. It begs the need for each school to embrace an entrepreneurial attitude. (pp. 2-3)

According to Dr. Wright:

My theory on change is to crawl and listen. I threw many of these ideas out early in a similar document. The Board then asked me for a more formal proposal. I tried to go as slow as necessary, but once I got into it, it kind of took on its own life. I truly believe if you are going to change a whole system you had better have a good direction without being overly prescriptive. I know many of these ideas will be changed over time which is ok. I kind of believe you throw out a pebble you get some ripples. You throw out a lot of pebbles and who knows what will happen.
(Wright interview, June 1999)

Because of these beliefs, Assuring the Essentials proposed a comprehensive approach to standards-based education in the Front Range School District. This approach was based on the concept that to assure essentials, the district had to do a few things well because of the limited resource environment in which it existed. This
document fundamentally changed the direction of the district's efforts. First, it promoted more of a focus on essential content while suggesting instructional skills could not be isolated from what was being taught. Second, it promoted the school as the center of reform and placed emphasis for support and capacity building on the district. Last, the approach also offered an answer or buffer for the district to oncoming accreditation requirements by the state by requiring schools to focus only on essential standards. Based on increasing the probability of preparing all students for a challenging future, Assuring the Essentials proposed to:

Shift a fundamental purpose and increase the overall quality in student performance in the Front Range School District. To create DISTINCTION (emphasis in original) in educational programs, processes and events in the District by: Guaranteeing that each and every student develops the capacity to access the world of learning and meet the challenges of living and working in a complex society by being an effective reader, clear communicator, proficient in mathematical processes, calculation and problem solving, skillful in technological applications and be able to demonstrate the ability to act responsibly consistent with common, prevailing American values. To design and/or redesign educational programs, practices, events and requirements which increase the focus on more rigorous standards for student academic performance; allowing students multiple opportunities to meet those standards without premature judgment of failure; and pushing the upper limits for students with special academic talents. (Assuring the Essentials, p. 6)

In order to create this type of distinction, Assuring the Essentials offered a "conceptual construct" (p. 10) for areas that needed to be addressed. Assuring the Essentials proposed to strengthen the general education program including process
elements, product elements, and capacity building elements that help the plan "adapt" over time to not overwhelm the system. Taken together, Assuring the Essentials provided district personnel with a comprehensive vision for standards-based reform based on focused content coupled with instruction, better use of data, intensive support for school-based problem solving and a philosophy of higher achievement for more students. Although because of its extensive nature and relative newness, many could only discuss it in general terms:

The main thing right now we have is causing a little bit of interest is that Assuring the Essentials document, vision, implementation plan that Dr. Wright's put out. (Klein interview, June 1999)

That (Assuring the Essentials) kind of pulls everything together in a nice package that says here's the framework, here's the support, here's the expectations around creating this standards-based system. (Grant interview, June 1999)

Fourth, in contrast to district level administrators who focused their meaning of reform around instruction or Assuring the Essentials, building principals were more focused on isolated projects proposed as by-products of central administrators' attempts to implement change. For instance, Mr. Randolph, middle school principal, discussed at length his school's experiment with Unitmaker software as the meaning of standards-based reform that had been proposed by central office:

Standards means, at least in this building the ability to plan with and track achievement of standards so we can report them. We are piloting a piece of software called Unitmaker for the district. We have had quite a time with it. Not working, needing extra supplies we didn't know
we would need. About halfway through the year, Dr. Wright said we could stop using it if it created too much frustration. Quite a few did, but we are going to try it again next year. (Randolph interview, June 1999)

In addition, Mrs. Bortz, elementary principal, and Mr. Cone, high school principal, discussed a current discussion of grading as another meaning of the reforms at their level:

We have been piloting a new report card at our level for about two years. We look at all of our proficiencies and mark these for parents. We are trying to figure out how to track these for reporting purposes to the district. That's what we are really trying to change as a district. (Bortz personal communication, June 1999)

My staff has discussed Assuring the Essentials with Dr. Wright and read through it. We need to see what direction it's heading. Standards in this school has really taken on the issue of grading. Dr. Grant has really helped us think about some of those issues. Is it fair to average, should we include non-academic factors? All of those issues we're studying. I'm not sure how they fit with the superintendent's plan at this point. We have standards, and I know most of my staff uses them to teach, but we really focused on grading this past year. (Cone interview, June 1999)

Conclusion

Because of the interactions of different learning orientations and use and strength of facilitating factors, different meanings for standards-based reform policy emerged in each LEA studied. Past implementation research has argued that variation has to be an acceptable outcome of any state policy and practice relationship. However, very few explanatory propositions for variation have helped show why this proposition may be true. By using the organizational capability perspective,
existence of variation can begin to be explained by analyzing how differences in learning orientations and facilitating factors interact around meaning. This meaning in one sense is the variation in what policy implementation looked like across all three cases. For instance, one way to understand the Midplains case is that state policy had a significant effect, and to some degree it might have. Every content area but one in Midplains had new standards and materials for student learning as of spring, 1999. But focusing on the role of state policy appeared to limit the depth of learning by administrators and their ability to use facilitating factors to engage various learning orientations. By utilizing some factors while negating others, administrators in Midplains interpreted standards-based reform around certain learning orientations that definitely focused teacher attention, but little suggested that this interpretation influenced changes in classroom practice.

In contrast, River Valley utilized strong facilitating factors to engage the organization's learning capability around a meaning of improving achievement through data-driven instruction. These factors included such things as leadership cognition, concern for measurement, scanning, resources, and multiple advocates. These factors in turn affected learning orientations such as internal knowledge sources, a learning focus on processes of assessment, and an explicit interpretive mechanism. These factors and orientations began before state policy, but were
strengthened by their passage. Again, state policy alone negates the role of administrators and their ability to use facilitating factors to engage particular learning orientations around a coherent meaning. River Valley showed significant tendencies toward the progressive pedagogical notions, and their utilization in classrooms, suggested in state policy documents. However, it is not clear how much policy influenced the River Valley interpretation of these reforms, or whether River Valley meaning influenced standards-based reform for the state.

Last, Front Range was clearly a district in transition toward a more coherent meaning for reform offered through Assuring the Essentials. Front Range, however, at the time of this study clearly exhibited a non-monolithic approach to their reforms. This was not to say that Front Range did not have a reputation for excellence in addressing early literacy problems or middle school hands-on science, only that Front Range's response to standards-based policy was not internally homogenous. District administrators sent principals and teachers an array of different and often conflicting messages about what the reforms meant. These mixed meanings came from differences in the leaders' understanding of these reforms caused by turnover, position, department, and prior beliefs. In addition, a focus on improved achievement through modifying instruction without a clear interpretive mechanism for understanding changes in content or assessment gave teachers and principals
signals that very little needed to change. Efforts to make the reforms more coherent through a complete learning cycle were as yet incomplete as Front Range focused on unrelated projects and buffering schools from state intrusion.

Understanding variation as differences in meaning driven by differences in organizational learning capability further complicates the relationship between state policy and local practice. This complicated relationship has many implications for understanding districts as learning organizations and for the ability of state policy to engage a district's learning capability. I take these issues up in the final chapter.
CHAPTER 7

ORGANIZATIONAL LEARNING CAPABILITY IN STANDARDS-BASED REFORM: CONCLUSIONS AND IMPLICATIONS FOR POLICY

The patterns of implementation and effect are largely a story of incomplete professional and organizational learning occurring as the learners (teachers, administrators, curriculum coordinators, staff developers) encounter the often limited teaching (by those who make or promulgate reform policies, as well as by policy itself) and learning opportunities (created by, or around reform policies). The long term prospects for the policy's success depend in large measure on the reform policy's "pedagogy" and learning resources over time (Knapp, 1996).

Variation in implementing educational policy has been a problem since the earliest implementation research in the mid-1970s (Berman & McLaughlin, 1977). Policy makers and implementation researchers have struggled to understand the dynamic interactions among policy, local context and culture, resources and variable responses by local education agencies (LEAs). These problems continue to persist in this era of systemic reform. The perspective used in this study suggests organizational learning as a way to understand the relationship between state policy and local variation that is different from rational implementation perspectives. In an organizational learning perspective, variation is a problem of meaning.
Like all learners, LEAs learned from the reforms in distinctly different ways, depending upon the existing context and culture. What was learned and how lessons from the policy message were interpreted depended upon the capacity of LEAs to make sense of the policy, described here as organizational learning capability. Some may argue that implementation did not occur in some of the LEAs studied. Looking at the organizational learning capability in each LEA showed that a great deal of activity took place in all three districts related to standards-based reform ideas. But the reform meant different things due to differences in the LEAs' organizational learning orientations and facilitating factors.

Viewing the LEAs as learning organizations with different learning capabilities rather than rational actors in implementing agencies leads to three major conclusions about the problem of variation in the policy and practice relationship. The first conclusion suggests that how the elements of an LEA's organizational learning capability interact to drive variation of meaning is as important to understand as the differences in the elements. A second conclusion suggests that these state policies alone may not have enough "educative" power to engage organizational learning capacity or changes in this capacity. The third implication suggests that to improve the learning of LEAs in an era of systemic reform, functional and interactive models
of how this learning occurs are needed. Each of these conclusions is examined below.

Implications for Interactions of Organizational Learning Capability

Considering the three cases, I argue that variation in implementing state mandated educational policy occurred because of differences in meaning coupled with different learning orientations and uses of facilitating factors (see Table 7.1 below).

Table 7.1: Differences in Learning Orientations and Facilitating Factors

Learning Orientations | Midplains | River Valley | Front Range
Knowledge Source | External | Internal | Mixed
Learning Focus | Content | Process-assessment | Process-instruction
Learner Focus | Groups | Mixed | Individual
Use of Data | Casual | Purposeful | Casual
Interpretive Mechanism | Implicit | Explicit | Implicit
Interpretive Orientation | Processes | Achievement | Mixed
Dissemination Mode | Delivery | Construction | Delivery
Knowledge Reserve | Implicit | Explicit | Implicit
Learning Scope | Incremental | Transformative | Mixed
Value Chain | Things | Human Resources | Human Resources

Strong Facilitating Factors:
Midplains: Leadership cognition; Resources; Policy; Accountability
River Valley: Leadership cognition; Involved leadership; Concern for measurement; Scanning; Resources; Multiple advocates; Continuous education; Climate of openness
Front Range: Continuous education; Resources; Organizational curiosity; Climate of openness; Systems perspective; Policy

Weak Facilitating Factors:
Midplains: Scanning; Systems perspective; Concern for measurement; Performance gap; Organizational curiosity; Accountability
River Valley: Performance gap; Systems perspective
Front Range: Involved leadership; Concern for measurement; Scanning; Accountability; Policy focus

Meaning of Standards-Based Reform:
Midplains: Higher level content
River Valley: Raising achievement through data-driven instruction
Front Range: Various (differentiating instruction, grading practices, Assuring the Essentials)

How each of the dimensions of organizational learning capability was used in the organizational learning cycle, as these cases showed, led to deep or superficial reform. Organizational learning capabilities in each LEA suggested that the learning around state policy was much deeper in some LEAs because of more complete learning cycles around coherent meaning using clear learning orientations. In addition, the differences in organizational resources or facilitating factors interacted with these learning orientations to influence different meanings in the districts and the depth to which districts could learn about the policies (see Table 7.1). Consequently, although state policy-makers may have had one notion for standards-based reform, these notions were shaped by local school district leaders as they attended to and interpreted them within their local context. This study found no single, unique orientation for how LEAs approached changing practice in response to
standards-based reform policies. Instead, this study found that distinct organizational learning capability elements interacted differently, which helped determine unique meaning within each LEA. Therefore, one major conclusion for this study indicates that LEAs are complex, interactive systems of meaning that greatly influence the "learning" messages of policy for local educators. The interactive nature of the organizational learning capability is examined below to show how similar elements interacted differently for each LEA. To do this, system dynamic models will be used. System dynamic models (Senge, 1990) or causal networks (Miles & Huberman, 1994) display the most important variables in a study and use arrow diagrams to show the relation among them.

Interaction of Learning Orientations

If policy trickles through multiple interpretive screens, then how a district gets all actors to interpret reform matters in understanding variation across and within districts. How each district helped its staff make sense of the reforms depended on a defined learning focus, knowledge source, learner focus, dissemination mode, value chain commitment, and strength of interpretive mechanism used. In addition, these orientations were supported by messages about interpretive orientation and learning
scope. However, differences in how these orientations interacted help exemplify the complexity of the policy and practice relationship.

Midplains. In Midplains, a strong learning orientation in acquiring knowledge from external knowledge sources helped the district improve their processes for adopting new content. Because of these strong orientations around content and improving adoption processes, Midplains showed a casual orientation toward use of data, suggesting to teachers that coverage of new materials was most important. This led to utilization of materials but an incremental understanding of the deeper intent of the reforms. The processes for adopting standards acted as the only formal interpretive mechanism that did not explicitly link to changes in classroom experiences. Dissemination in Midplains revolved around a knowledge reserve based on an implicit link to classroom practice, and dissemination as delivery through training of groups who met to adopt new standards and materials. This too signaled teachers the importance of reform as content. Midplains used a defined learning focus around content supported by a value-chain commitment to purchase new materials as its interpretive mechanism that signaled to teachers standards meant new content and textbooks. Using a system dynamic model, these orientations interacted as follows:


[Figure 7.1: The Interaction of Midplains's Learning Orientations. Recoverable diagram elements include Interpretive Orientation: Process and Knowledge Source: External; the arrow diagram itself was not preserved.]

River Valley. The scenario was very different in River Valley. River Valley's interpretive orientation toward the use of assessment and data led to an internal knowledge source based on purposeful and planned uses of data to improve achievement. River Valley relied almost exclusively on its own internal training to provide its teachers and principals with a common interpretive mechanism for understanding the role of assessment in using standards. This interpretive mechanism, the Professional Development Center, also helped promote a transformative learning orientation within the district, leading to high utilization.

Dissemination in River Valley used a different form of knowledge reserve that helped teachers make sense of the reforms. The knowledge reserve consisted of designed units and assessments that had been piloted and modified. This knowledge was publicly shared throughout the district. Formal dissemination based on the construction of new knowledge through instructional planning also helped promote a


common understanding of assessment and data as standards-based reforms. River Valley used a defined learning focus of assessment processes, supported by a value-chain commitment to professional development that sustained the Professional Development Center. This professional learning gave teachers an explicit interpretive mechanism that would transfer more easily into classrooms. Using a system dynamic model, these orientations interacted as follows:

[Figure 7.2: The Interaction of River Valley's Learning Orientations. Recoverable diagram elements include Learning Scope: Transformative, Value Chain, Interpretive Orientation: Achievement, and Dissemination Mode; the arrow diagram itself was not preserved.]

Front Range. In Front Range, a transition of leadership had begun to change the learning orientations within the district, so orientations were mixed. For instance, Front Range focused a significant amount of staff development around an orientation toward instruction based on individuals' learning. However, newly allocated incentives focused more on whole-school problems and group learning. Front Range also showed a mixed learning scope. Many administrators still alluded to the need for


changes in instruction as a way to raise achievement, while a new public document promoted more progressive reforms for the district. Because of new ideas surrounding standards-based reform, content took on more importance. For instance, Front Range actively pursued standards-based reform ideas in certain areas like literacy and science. Front Range did not have a formal interpretive mechanism. Through its focus on individual professional development, Front Range signaled to teachers that differentiating instruction, and not content or assessment, would raise achievement. In essence, Front Range showed internal variation around how it learned about standards-based reforms. Using a system dynamic model, these orientations interacted as follows:

[Figure 7.3: The Interaction of Front Range's Learning Orientations. Recoverable diagram elements include Scope: Mixed, Value Chain, Learning Focus: Instruction, Knowledge Source: Mixed (HR), Interpretive Orientation: Mixed, Dissemination Mode: Delivery, Knowledge Reserve: Implicit, Interpretive Mode: Implicit, Learner Focus: Individuals, and Use of Data: Casual; the arrow diagram itself was not preserved.]

Summary: Comparing Midplains, River Valley, and Front Range's learning about state-mandated reform policy through their learning orientations highlights the


complex, interactive reasons for local variation. Differences in learning orientations signaled to teachers different meanings of standards-based reforms. In some instances, these orientations helped complete a learning cycle toward utilization in classrooms, while other signals did not. These different learning orientations help explain variation of policy interpretation at the district level.

Differences in Facilitating Factors

In this study, facilitating factors stood out as especially important for engaging and supporting specific learning orientations. Important facilitating factors included scanning, performance gap and a concern for measurement, leadership cognition, climate of openness, policy, continuous education, systems perspective, accountability, and allocation of resources. Facilitating factors and how they interacted with each LEA's learning orientations are synthesized below.

Midplains. Midplains made use of a few strong facilitating factors to enable its learning orientations. For instance, as in all three cases, the leader's cognition as a facilitating factor in understanding reform greatly influenced how other factors and orientations were used for the organization's learning. In Midplains, Dr. Rodriquez brought a deep understanding of curriculum to her role as superintendent. Because of her background and knowledge of curriculum, she understood


standards-based reform as adoption of new standards and materials. Because of her role, her personal resources (background, knowledge) interacted with organizational resources (culture, resources) to influence what was learned and how it was learned within her district.

For instance, Midplains used organizational resources to influence its learning orientations and the depth to which reforms were utilized. Resources as studied included financial resources, human resources, and structural resources. Access to and use of all forms of organizational resources has been supported as a major factor in implementation and policy research. Midplains, however, used resources in a particular way that enabled a certain meaning. The district budgeted close to $500,000 for materials alone, but without accompanying staff development efforts, different instruction was unlikely to occur. Similarly, because personnel resources were limited, the depth of changes was somewhat limited. In Midplains only two central administrators worked on standards-based reforms; the meaning was coherent within the district, but with little change to experiences for students.

Another strong facilitating factor in Midplains was social capital, or a climate of openness, that shaped what message could be disseminated. Midplains exhibited open communication and a sense of trust between administrators and teachers. This allowed the dissemination of an understanding of standards-based


reform as content. Teachers trusted that the administration's interpretation was correct, especially since it did not challenge them to think any differently about other aspects of their practice. In this way, too much trust in the leader's cognition limited more progressive notions.

In contrast, Midplains also had weak facilitating factors that limited the depth of its learning. Midplains employed no formal scanning process and therefore could not anticipate what would be required by the state. Midplains had not yet developed a deep concern for measurement to deepen its knowledge of the success of its reform efforts, or any sense of a performance gap. Last, Midplains used no formal mechanisms for holding schools and teachers accountable for student achievement. Accountability in Midplains at the time of this study consisted of finalizing all content-area standards and material selections within a specified time period. Using a system dynamic model, how the facilitating factors enabled learning orientations looks like this:

[Figure 7.4: Interaction of Facilitating Factors and Learning Orientations in Midplains. Recoverable diagram elements include Knowledge Source: External, Interpretive Orientation: Process, Learning Focus: Content, and Resources; the arrow diagram itself was not preserved.]


River Valley. In contrast to Midplains, River Valley used many more facilitating factors to enable deeper learning. First, scanning and a concern for measurement, including an understanding of performance gaps, helped River Valley acquire knowledge at a much deeper level than the other two districts. As discussed in the case itself, River Valley anticipated the policy because it had formal scanning processes for understanding which national and state movements would impact the district. Understanding the outcomes-based movement helped River Valley gain an earlier understanding of the systemic reform movement that came from state policy. Similarly, the district's leaders were committed to serving in organizations that could influence state policy. This helped bring information back to the district prior to legislative passage, where it could be discussed, influenced, interpreted, and disseminated.

A concern for measurement also helped River Valley acquire knowledge. River Valley's use of data helped facilitate the need for better achievement data at both the district and classroom levels, which led to common data analysis processes at all levels. This information was used not only as an accountability tool, but also as a decision-making tool. Using data to drive decisions and instruction was seen as a similar process at the classroom, school, and district levels. This use of data as a facilitating factor helped engage learning and an understanding of performance gaps and of whether professional


development efforts made any difference in achievement. This also helped create a true systems perspective.

Similarly, in River Valley, leadership cognition greatly influenced the district. Dr. Senneca and Mrs. Glitton brought a different set of personal resources based around the idea of using data to make decisions. Their understanding influenced an internal knowledge orientation, resources for professional development based around assessment and data, and instructional guidance for teachers based around assessment. These mental models (Senge, 1990) were shared by all administrators in River Valley and helped drive a system-wide perspective on the importance of data in the district.

River Valley also showed a climate of openness and trust that had developed within the district over the history of its reforms. Union officials, teachers, and administrators worked closely together to learn from one another around a common focus of clear targets for student learning and a shared sense of accountability for achievement. This too allowed the leaders' cognition to be easily shared and disseminated through the PDC, which was supported by increases in achievement.

River Valley, in contrast to the other two districts, had designed multiple accountability mechanisms to enhance the utilization of its interpretation of standards-based reform. These included a district-wide assessment system, school


reporting requirements, an aligned district and school improvement process based on data, a teacher performance and evaluation system based on raising student achievement, and a modified pay-for-performance plan. These accountability mechanisms highly influenced a shared meaning and utilization of the district's version of standards-based reforms. Using a system dynamic model, how the facilitating factors enabled learning orientations looks like this:

[Figure 7.5: Interaction of Facilitating Factors and Learning Orientations in River Valley. Recoverable diagram elements include Interpretive Orientation: Achievement, Leadership Cognition, Concern for Measurement/Performance Gap, and Systems Perspective; the arrow diagram itself was not preserved.]

Front Range. In contrast to the other two districts, Front Range exhibited the weakest set of facilitating factors, which limited its organizational learning cycles. For instance, Front Range had scanned the national environment and accepted the outcomes-based movement in the late 1980s. After the failure of their instructional guidance system, however, they did not connect that movement to the ideas within


the state policy and were caught off guard by new accreditation requirements. Front Range also did not use a concern for measurement to deepen its knowledge of the success of its reform efforts, nor did it acknowledge any performance gap.

As a key facilitating factor, Front Range administrators possessed different personal resources and understandings of standards-based reforms, which led to separate and disjointed projects that were not connected within a systems perspective. Even though directors in the Department of Instruction discussed instruction as important, each meant instruction as a different project supported by its own separate budget. This was reinforced by their strong use of continuous education, which was, however, highly focused on instructional differentiation. In addition, Front Range did not exhibit a climate of openness. Many administrators discussed the lack of trust between teachers and administrators that limited the district's ability to develop a shared understanding of the reform or move toward more progressive efforts.

While resources could be considered one of Front Range's strongest facilitating factors, how they were used led to a sense of fragmentation. Front Range allocated close to ten million dollars for reform efforts over the decade, but too many ideas and a sense of mistrust were not overcome by this amount of money. In addition, Front Range used many more central office administrators to work on these reforms


without an accompanying shared understanding. So while the availability of financial and human resources influenced people's ability to respond to the policy, any organizational resource was simultaneously shaped by individuals' perceptions of how best to use these resources. Using a system dynamic model, how the facilitating factors enabled learning orientations looks like this:

[Figure 7.6: Interaction of Facilitating Factors and Learning Orientations in Front Range. Recoverable diagram elements include Climate of Openness, Scope: Mixed, Continuous Education, Learning Focus: Instruction, Knowledge Source: Mixed (HR), Interpretive Orientation: Mixed, Dissemination Mode: Delivery, Knowledge Reserve: Implicit, Interpretive Mode: Implicit, Learner Focus: Individuals, and Use of Data: Casual; the arrow diagram itself was not preserved.]

Summary: Past implementation research has supported various forms of capacity in support of reform. In this study, numerous facilitating factors in various forms (personal, resource, perceptions) stood out as important for enabling organizational learning. An especially important finding centered on the use of resources. How financial resources are used depends on how individuals decide to use them. Financial resources, while facilitating or constraining dissemination and utilization initiatives, do not design the learning nor determine the district's interpretive orientation.


Similarly, having extra people or personnel resources has been suggested as an organizational resource that can lead to enhanced implementation efforts. Again, however, individual perceptions, if not aligned or shared, do not necessarily help. In the three cases studied, more was not always better.

Implications for Systemic Reform Policy in Engaging Organizational Learning Capability

The second part of the policy-practice relationship implies that state policy has educative power. If the Local Education Agency acts as the learner, as this study suggests, then reform initiatives in general and state policies in particular entail teaching. Using this perspective, two questions about reform efforts and state policies as a teaching mechanism arose in this study: First, was state policy influential and educative in nature in the three cases studied? Second, can policy act as pedagogy? Findings from this study suggest a second major conclusion: namely, that this policy by itself was not educative enough to change local practice, but that it did engage processes within the LEA.

Influence of Policy on the Three Cases

If state policies were designed to engage local districts in Colorado in learning, one would suspect that engagement would focus specifically around the


ideas in policies, and an increase in activity at the local site would be evident. In this sense, policy would have acted more like a learning orientation in the acquisition stage, in that it would have demonstrated a defined knowledge source, learning focus, and learner focus. While all three cases did show an increase in activity, they exhibited very different approaches toward the state's policy initiatives because of how each district interpreted the reforms.

Midplains showed a flurry of activity around adopting standards and materials as new content. Midplains exhibited a reactionary approach toward policy, having ignored it until central administrators decided they had to at least adopt standards as required by state law. River Valley, on the other hand, showed an anticipatory approach toward state policy, having put many of the reform elements in place five years before the passage of the first piece of legislation. In some regards, this study also showed that River Valley, because of a high degree of scanning, may have actually influenced the design of state policy through its understanding of national reform movements. Last, Front Range showed a mixed approach toward the influence of policy, often finding ways to fight against state intrusion and buffer the Front Range schools from policies that did not fit with their own reform efforts.

Viewing policy as a curriculum to be taught and learned at the local site highlights the fact that state policy never enters a clean slate. Local context highly


determined what could be learned from state policy. What could be learned from state policy, as previously discussed, depended upon a district's previous reform history, the leaders' understanding of the reforms, their trust in state initiatives, and the sources from which they learned about the reforms, among others. While reform activity had increased in each district over the past five years, what was learned and how it was learned was due more to local interpretations than to the consistency or guidance given in state policy.

The one notable exception was the Colorado State Assessment Program (CSAP), which was used as a policy mechanism to signal to local educators what the state meant by standards-based reform. While many administrators in this study did not like CSAP and the pressure brought on by its administration, many did allude to the influence of these assessments in helping clarify state expectations. In this regard, CSAP acted like an interpretive mechanism for districts, but its influence on local curriculum, assessment, and instruction is not as clear.

This state policy in Colorado did engage the learning in the three cases studied to some degree, mainly through state testing. Therefore, a second conclusion from this study is that policy-makers at both the state and local levels may wish to understand that ambitious reform policies require ambitious learning for local educators. In Colorado, little was done by the state to directly teach local districts what the reforms


meant or should look like in practice. In this sense, policy by itself was a weak mechanism for engaging learning by the local districts. How best to design policy to facilitate this ambitious learning, and whether and how this would improve its reception at the local site, remains unclear. This study provides a beginning means to understand this issue. Opportunities for learning would have to be provided that suggested a clear and consistent message about reform and attended to the leadership cognition and facilitating factors discussed earlier to build the LEA's internal capacity. In addition, a clear learning orientation from the state would need to be employed.

On the other hand, state policy did offer local districts a "good guy-bad guy" scenario that may have been useful in engaging local reform efforts. In this sense, policy acted like a facilitating factor. In all three cases, local administrators alluded to the fact that the power of state policy was not necessarily in its ability to teach, but in its vagueness and accountability. Many district leaders pointed to the benefits of a vague policy that could be interpreted in local contexts but that also gave them a reason to push teachers to attend to test scores and achievement as important. More than one administrator noted that for once teachers and administrators had a common enemy, the state, and that they could use the policy to push their own reform efforts. In this sense, policy acted less like a curriculum to be learned and


more like an accountability tool to be used and fit to local contexts. In this sense, policy could be considered influential as local educators began to fear what the state would do if test scores did not rise.

Challenges for Policy as Pedagogy

Viewing the local district's response to state-mandated policy from a teaching and learning perspective suggests that the optimism of systemic reform may need to be tempered. I argue, based on the cases within, that local variation may be as common as variation between classrooms in the same school building. Policy alone, without authority, power, consistency, and prescriptiveness as key signals for LEAs (Clune, 1998), is not a clear or strong enough motivator or teacher to change local contexts. The cases within revealed that the local context is fraught with prior history, different understandings of reform efforts, and different capabilities that mediate between the original intent of policy and what it looks like in practice. If state policy always depends upon the local site to interpret it, then variation will be the norm, because the local district also acts as a policy-making unit trying to give clear direction to schools and teachers who often get conflicting messages from their school districts.

In addition, understanding policy as pedagogy also poses other problems for policy-makers. On the one hand, it is critical that policy take into account local


particulars in order to increase the efficiency and utilization of policy at the local site. On the other hand, the particulars are so different for each local site that it seems impossible to design state policy that takes all local sites into account. According to Spillane (1993), the tension between specificity and universality is endemic to the relationship between state policy and local-level policy making. State policy-makers work at high aggregate levels and only see the average or extreme cases while trying to solve the general problems of practice. Each local administrator works in a situation shaped by specific circumstances endemic to local customs, history, and cultures that work to shape local practice.

If these three cases showed that facilitating factors are related to better organizational learning, how can policy enhance these factors so that they influence the local organization's ability to learn? There are no simple answers here. To enhance leaders' cognition, initiatives that challenge administrators' existing beliefs and knowledge about instruction could be a place to start. Similarly, extra resources for acquiring long-term professional development focused around a particular learning orientation, coupled with better accountability designs, could also constitute better "teaching" at the state level. However, this raises questions about the capacity of the state department of education. With multiple school districts at different developmental levels and with different needs, differentiating policy and teaching for each local


site seems a huge task requiring substantial resources. In essence, policy-makers have to try to balance their obligation to all local districts by gearing policies toward the average, while knowing that capabilities and needs vary widely. This is not to say that local districts cannot learn from policy, or that they have not. Rather, I wish to come full circle and note that the policy and practice relationship is dependent upon both teaching and learning. Just as great teachers are made better by great learners, great policy is made better by great local capabilities.

Implications of the Organizational Capability Model for Practice

State policy and the intentions of state policy-makers are not irrelevant in our efforts to understand the interaction between policy and local practice. Trying to separate the clarity and authority of state policy without attending to the local context, though, is problematic. Because the educative power of policy as a mechanism itself is weak, a third major implication suggests that enhancing LEAs as learning organizations may help reform efforts. Therefore, a major question for practice concerns the utility of the Organizational Learning Capability model. Does the model help us better understand the dynamic, interactive nature of LEAs as major determiners of policy success? And can this model help LEAs improve their learning capability in an era of higher accountability and reform?


The Model's Utility in Understanding LEAs as Learning Organizations

The Organizational Learning Capability model provides us with an understanding of the way a district orients its collective understanding of the meaning of reforms. Some organizational theorists (see, for instance, Argyris & Schon, 1978; Berger & Luckman, 1966; Bougon, 1983; Weick, 1994, 199; Senge, 1990) discuss the nature of an organization's shared mental models. The Organizational Learning Capability model allows us to understand whether those mental models exist and how they came to emerge. For instance, in both Midplains and River Valley, a collective, shared understanding of the reforms was evident due to the interpretive mechanisms used, while no such understanding was clear in Front Range.

This model shows utility in understanding the relations between policy and LEA practice because of its broad scope, oriented around all phases of the organizational learning cycle. For instance, River Valley, for all practical purposes, had the greatest success with its reforms because it exhibited clearly defined orientations in all parts of the learning cycle. It could also be argued that because of its focus on assessment and data as an internal knowledge source, River Valley constantly engaged the organizational learning cycle through acquiring, interpreting, disseminating, and utilizing new knowledge. In contrast, it was not clear that the utilization part of the learning cycle was as strong in Front Range or Midplains. Midplains exhibited


utilization of its interpretation as adopting new content. Because utilization was oriented at an organizational level, little evidence suggested that it transferred to classroom practice. In simpler terms, learning was more cyclical in River Valley and more linear in the other two districts studied. River Valley utilized constant feedback mechanisms to adapt its practices and make decisions. Therefore, the utility of this model is that it allows us to better understand LEAs as learning organizations and to diagnose reform and implementation through complete learning cycles. If organizational learning is cyclical, as many would suggest, then the ability of this model to help orient reform efforts within stages of organizational learning cycles may prove beneficial.

This model also helps us understand the issue of capacity in LEAs. Capacity has been defined in many ways, including resources, training, structures, or cognitive capacity. Capacity has also been perceived as organizational resources and influence, a form of economic capital, or individual cognitive constructions and beliefs. This model combines all of these definitions under the heading of facilitating factors and shows that all of these elements of capacity may matter in implementing large-scale standards-based reform. However, this model also shows that capacity alone is not sufficient. How capacity develops and enables organizational learning over time might prove most conducive to reform efforts. For instance, Front Range actually spent the


most financial resources and had the most human resources to help in its reform efforts. However, a lack of cognitive capacity and of a climate of openness led to disjointed learning. River Valley, in contrast, could be considered a low-capacity district because it did not use the same amount of money or have the same amount of personnel as Front Range. However, River Valley targeted its resources toward specific learning opportunities for its staff and drew upon other favorable conditions, such as a climate of openness and shared accountability, to put its reforms in place in a much deeper way. Understanding a district's organizational learning capability thus also allows us to understand better how capacity, as facilitating factors, engages and enables organizational learning to determine the depth of reform efforts.

While this study does not purport to develop a complete theory, certain interactions did emerge as important. For instance, in the area of interpretation, leadership cognition highly influenced the LEA's knowledge orientation. In the area of acquisition, scanning and a concern for measurement influenced learning focus. In dissemination, a climate of openness and focused continuous education influenced a district's learning scope and learning focus. In utilization, multiple advocates and accountability influenced both learning scope and value-chain commitment.


In addition, facilitating factors also influenced other factors. For instance, this study found that leadership cognition influenced the allocation of resources and accountability structures. Taken together, the interaction of facilitating factors and learning orientations allows us to understand the dynamic nature of reform in these three cases.

Understanding the complexity of the LEA as an interpretive, learning agency through which state policy has to navigate requires new interactive models not previously proposed. Although characteristics of both state policy and the LEA's context all contribute to explaining variation, understanding policy and context alone is not sufficient. Synthesizing these elements in an interactive manner provides a much richer and deeper understanding of the LEA. Using the Organizational Learning Capability model in this way posits that how a district uses its facilitating factors to engage organizational learning matters significantly. Therefore, building a district's learning capability may demand that leaders pay more attention to developing this capability before beginning down the path of reform. I turn to these implications next.

The Model's Utility for Enhancing Learning Capability

If the Organizational Learning Capability model does help us understand LEAs and variation among school districts in implementing standards-based reform, then


other complementary questions arise. First, can the elements in the model be enhanced by LEAs to build better organizational learning? Second, how would districts use this model to enhance their organizational learning? Third, what would be the resultant outcomes for student achievement? Last, how would the organizational learning capability of a district change, or have to change, over time to meet changes in the state policy environment? While all of these questions demand further empirical study, certain conjectures can be drawn from the data in this study. Similar to Dibella and Nevis' work in multiple organizational settings, enhancing organizational learning ability in school districts may focus around using the model for analysis, enhancing the basic elements of organizational learning, and understanding changes at each stage for better organizational performance.

First, the Organizational Learning Capability model could be used by local school districts to analyze both problems and needs in putting reforms into place. For instance, using the model could help develop a district's learning profile and point out areas needing more effort or time. Especially important would be the analysis of facilitating factors at all levels and in all groups to see if there were discrepancies. In addition, by analyzing the learning stages in temporal terms, districts could analyze where the learning breaks down or which area needs
strengthening. Last, by understanding the interaction of facilitating factors, districts could analyze if and when facilitating factors do actually enable learning orientations to enhance organizational learning. Second, the organizational learning capability model could be used by districts to understand the basic elements of organizational learning and their role in learning about and from reform efforts. From this study, what emerges as basic includes such things as a learning focus including groups or individuals, a learning focus built around a particular orientation and reason for reform, an explicit interpretive mechanism, a systems perspective, involved leadership, and accountability structures. In addition, a strong interpretive mechanism seems to be basic in that it helps teachers transfer the ideas to classroom practice. Understanding the basics of the organizational learning capability model also would help districts understand the developmental and cyclical nature of the organizational learning cycle, which would help focus learning in the necessary stages depending upon the temporal dimensions of the reform. Last, after districts began to understand the basics and how to analyze their own learning capabilities, the organizational learning capability model could help districts make changes at each stage to enhance organizational learning. DiBella and Nevis (1998) refer to this as the macro metacognitive ability of organizations to choose
particular orientations according to the changes needed. For instance, during the acquisition stage districts may switch from an external to an internal knowledge source, or from a process to a content focus, depending on their interpretation of need. The dissemination phase could also be changed over time from a focus on formal knowledge to more of a focus on tacit knowledge, or knowledge gained from actual practice. Last, utilization, the stage least understood in the learning cycle, could be shifted by using multiple advocates to help design previous learning stages or through shifting the value-chain focus to signal different priorities. The Organizational Learning Capability model by design does not point to any ideal learning profile or suggest that any orientations are better than others. Rather, it tells us that what is right for one organization or unit may not be appropriate for another. It does, however, suggest that the model can be used to analyze a district's learning capability, to enhance the basic facilitating factors, and to understand an integrated strategy necessary during a district's learning cycle. Each of these implications suggests further empirical study.

Conclusion

This study initially set out to understand how local districts responded to state-mandated standards-based policies in the state of Colorado. Using the Organizational
Learning Capability model (DiBella & Nevis, 1998), I analyzed how the model could be redefined to explain variation in local response to state-mandated reform policy. Based on the three cases studied, I argue that no easy answers emerge to explain the relationship between state policy and local response and implementation. Findings suggest that all three cases used facilitating factors and learning orientations interactively to make their own interpretations of what these reforms meant and put them into practice according to those interpretations. If anything, state policy heightened the local response, but the extent of the policy's influence depended upon how local leaders understood the reforms and how they engaged the organizational learning cycles for principals and teachers within the district. Each district was reforming, but with different degrees of depth surrounding its own interpretations. Significant among the findings is that districts with a strong and ongoing interpretive mechanism can influence progressive changes in schools and classrooms no matter what the change attempted. As an example, River Valley used a process of curriculum planning and assessment to operationalize the standards for their teachers, which clearly transferred to classroom practice. Furthermore, this study found that, in contrast to most implementation research, the complexity of local response is interactive, developmental, and temporal. Most
research treats local districts as static implementors or followers of state mandates. This research suggests that local variation is due more to the relationship between policy, facilitating factors, and local learning orientations. Standards-based reform looked different and meant different things in Midplains, River Valley, and Front Range because each used policy differently, applied facilitating factors differently, and employed different orientations in learning about state-mandated reform. In sum, while more research needs to be done around the issues raised here, using the Organizational Learning Capability model to explain local response to reform policy suggests that local capability matters tremendously. While each of these case studies was a brief snapshot in time, the model clearly points to differences in how the districts learned about the reforms and what mattered in their learning. This study points to a different conception for understanding capacity and what may be needed to further understand the obstacles to deeper educational reform. The design of state policy to engage district learning is an area that demands much more attention. Overall, it was quite clear that new state policy did little to diminish the active role of the local district in reform, and that while reform may look different, learning had occurred in each case. The depth of this learning, however, varied considerably among the three cases. Deficits found in earlier
implementation research led to the need for different perspectives in understanding this variation.
APPENDIX A
INTERVIEW PROTOCOL FOR DISTRICT INFORMANTS

I. General Implementation History/Strategies
A. What is your overall vision for SBE? How is it supposed to look?
B. Tell me a little about the whole history of standards-based reform in your district. How did you begin; where did you start? At what levels did you begin, and what interface strategies between levels did you use? Who was instrumental in getting SBE off the ground?
Follow-up Probes
C. How have you implemented SBE? What has been the most effective strategy, and why? What has influenced those strategies? How has SBE been modified or adjusted over time? What planning or problem-solving strategies help remedy problems?
D. What sort of district infrastructure did you have to create to support the implementation of and learning about SBE?
E. How did you get people to attend to and understand the differences between SBE and traditional forms of education? How did the state policies help you with this? What problems, if any, were encountered in gaining acceptance of SBE by administration, staff, students, and community?
F. How have you tried to achieve coherence in your implementation strategies? How is SBE administered in this district?
G. What elements of the whole policy have been most helpful to your district?
H. What has been the cost to the district in terms of human, social, and fiscal resources for effective implementation? Where did you allocate most of your resources? How much time do people devote to responsibilities connected with SBE?
I. How did you develop the capacity in your district to implement SBE? What kinds of capacities do personnel and schools need to implement SBE?
J. What accountability strategies have you used to hold people accountable for implementation? What incentives have you used?
K. Has SBE been implemented at every site as planned? Why has this occurred?

II. Meaning of Standards-based Reform
A. What does standards-based education mean in your district? How is this defined and communicated? What are the major elements or theories of action that
you have used to design SBE reforms? How were policies and/or strategies designed to support these ideas?
B. Does this view of SBE encourage a particular view of teaching and learning? [Probe for: is this shared widely, why this vision, how shared and disseminated] What does this look like in practice?
Follow-up Probes
C. What effect does this meaning, philosophy, etc. have on current policies and practices within your district?
D. What influenced your philosophy of SBE? What main ideas did you dissect from policy as important? Why?
E. What elements do you think people most attend to? Why?
F. How has the district helped people make sense of or understand SBE?
G. What elements have contributed most greatly to teacher, student, school, and district success? Why?
H. What differences have you seen?
I. What else could your district be doing to complete this vision?

III. Knowledge Acquisition
A. Orientation Questions
1. How did your district learn about standards-based education? How did you acquire your knowledge about how and what to do? Why this method?
2. What groups helped your district?
3. What element of SBE did you begin with, and why?
Follow-up Probes
B. Facilitating Factors: What enabled this acquisition of knowledge?
1. Organizational Resources
a. To what extent did you gather or learn about SBE in other districts, states, etc.? How did this occur?
b. When you began learning about SBE, how much time was spent on defining and measuring key factors for implementation? How were these metrics used internally? How was this information fed back to the district, schools, etc.?
2. Mental Models
a. What was seen as the gap between the actual and desired state for SBE?
b. How willing was/is the district to try the new ideas surrounding SBE? How curious was/is the district to see how the parts of SBE are different? How willing was the district to look at processes, policies, structures, etc. when implementing SBE?

IV. Interpretation
A. Orientation Questions
1. When implementing SBE, what sort of things about schooling did you know would have to change? How did you come to these conclusions? How were these understandings communicated?
2. Would you describe the intent of your implementation plan as incremental or transformative relative to student learning/experiences?
Follow-up Probes
B. Facilitating Factors
1. Organizational Resources
a. What processes did you use to help people interpret and make sense of SBE? How long has it taken for people to understand differences in SBE?
2. Mental Models
a. How closely aligned do you think people's beliefs and understandings are on SBE in your district? What has led or not led to that?

V. Dissemination
A. Orientation Questions
1. How have the ideas, processes, and requirements of SBE in your district been disseminated? E.g., how have principals, teachers, parents, students, etc. gained information and understandings about SBE?
2. How have the understandings, learnings, processes, etc. about SBE in the district been documented? How have these been shared?
Follow-up Probes
B. Facilitating Factors
1. Organizational Resources
a. What sorts of professional development opportunities have been offered by the district?
b. What structures in the district have helped people learn about SBE?
2. Mental Models
a. How open have personnel been in discussing and learning about SBE in your district? How has this been encouraged or not? How have problems with implementation been shared and resolved?
b. What conflicts have arisen as a district with implementing SBE? How have these been handled?

VI. Utilization
A. Orientation Questions
1. Where has most of your learning emphasis been placed? Why there?
2. Has learning been preferred for improving existing capabilities or for the development of new competencies?
3. How has your district tried to achieve a systems perspective? E.g., in what ways have problems and solutions been seen in terms of systemic relationships and connections between levels, schools, policies, and practices?
Follow-up Probes
B. Facilitating Factors
1. Organizational Resources
a. How has the district leadership been involved with implementing SBE? How are they actively engaged? Have principals been asked to perform differently in their leadership roles? Who else would you consider a champion of SBE, who advocates in the district? Why?
b. Have teachers and schools been allowed autonomy in the process of SBE?
c. What accountability processes and policies have been built in for the utilization of SBE?
2. Mental Models
a. Beyond district-level personnel, how are ideas and methods advanced? What allows this to occur?
REFERENCES

Acker, J. (1992). Gendering organizational theory. In A.J. Mills & P. Tancred (Eds.), Gendering organizational analysis (pp. 248-260). London: Sage Publications.
Argyris, C. (1992). On organizational learning. Cambridge, MA: Blackwell Publishers.
Argyris, C. (1993). Knowledge for action. San Francisco: Jossey-Bass.
Argyris, C., & Schon, D.A. (1978). Organizational learning: A theory of action perspective. Reading, MA: Addison-Wesley.
Argyris, C., & Schon, D.A. (1978). Structural-interorganizational approach: A sociological and political science perspective. In C. Argyris (Ed.), Organizational learning: A theory of action perspective. Reading, MA: Addison-Wesley.
Baker, E., & Linn, R.L. (1997). Emerging educational standards of performance in the United States [Technical report]. Los Angeles: CRESST.
Baldersheim, H., & Stav, P. (1993). Reforming local government policymaking through organizational learning and experimentation: The case of Norway. Policy Studies Journal, 21(1), 104-114.
Ball, D.L. (1990). Reflections and deflections of policy: The case of Carol Turner. Educational Evaluation and Policy Analysis, 12(3), 263-275.
Ball, D.L., & Cohen, D.K. (1994). Understanding state efforts to reform teaching and learning: School districts and state instructional policy. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Bardach, E. (1971). The implementation game: What happens to a bill after it becomes a law. Cambridge, MA: The MIT Press.
Bartunek, J., & Lacey, J. (1992). Social cognition in organizational change: An insider-outsider approach. Journal of Applied Behavioral Science, 28(2), 204-223.
Bateson, G. (1972). Steps to an ecology of mind. New York: Ballantine Books.
Berger, P., & Luckman, T. (1966). The social construction of reality: A treatise in the sociology of knowledge. Garden City, NY: Doubleday & Company.
Berman, P. (1978). The study of macro- and micro-implementation. Public Policy, 26(2), 157-183.
Berman, P. (1980). Thinking about programmed and adaptive implementation: Matching strategies to situations. In H. Ingram & D. Mann (Eds.), Why policies succeed or fail. Beverly Hills, CA: Sage.
Berman, P., & McLaughlin, M.W. (1974). Federal programs supporting educational change, Vol. I: A model of educational change. Santa Monica: RAND Corporation.
Berman, P., & McLaughlin, M.W. (1977). Federal programs supporting educational change, Vol. VII: Factors affecting implementation and continuation. Santa Monica: RAND Corporation.
Bingham, J. (1998, February 21). Schools must be more accountable. Denver Post.
Bingham, J. (1999, January 8). Schools get lower marks. Denver Post, pp. 1A, 15A.
Bodilly, S., Keltner, B., Purnell, S.W., Reichardt, R.E., & Schuyler, G.L. (1998). Lessons from New American Schools' scale-up phase: Prospects for bringing designs to multiple schools [NAS series report]. Santa Monica, CA: RAND Corporation.
Borman, K., Cookson, P.W., Sadovnik, A.R., & Spade, J.Z. (1996). Implementing educational reform: Sociological perspectives on educational policy. Norwood, NJ: Ablex Publishing Corporation.
Bougon, M. (1983). Uncovering cognitive maps: The Self-Q technique. In G. Morgan (Ed.), Beyond method: Strategies for social research (pp. 173-188). London: Sage Publications.
Brown, J., & Duguid, P. (1994). Organizational learning and communities-of-practice: Toward a unified view of working, learning and innovation. In H. Tsoukas (Ed.), New thinking in organizational behavior (pp. 165-187). Oxford: Butterworth-Heinemann Ltd.
Brown-Easton, L., & Koehler, P.H. (1996). Arizona's educational reform: Creating and capitalizing on the conditions for policy development and implementation. In M.B. Kane & R. Mitchell (Eds.), Implementing performance assessment: Promises, problems, and challenges (pp. 161-182). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Burstein, L., McDonnell, L.M., Van Winkle, J., Ormseth, T.H., Mirocha, J., & Guiton, G. (1995). Validating national curriculum indicators. Santa Monica, CA: RAND.
Burtless, G. (Ed.). (1996). Does money matter? The effect of school resources on student achievement and adult success. Washington, D.C.: Brookings Institution.
Bushe, G.R., & Shani, A.B. (1991). Parallel learning structures: Increasing innovation in bureaucracies. Reading, MA: Addison-Wesley.
Cavanagh, R.F., & Dellar, G.B. (1997, March). Toward a model of school culture. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Chawla, S., & Renesch, J. (Eds.) (1995). Learning organizations: Developing cultures for tomorrow's workplace. Portland, OR: Productivity Press.
Cibulka, J.G., & Derlin, R.L. (1998). Authentic education accountability policies: Implementation of state initiatives in Colorado and Maryland. In R.J.S. Macpherson (Ed.), The politics of accountability: Educative and international perspectives (pp. 79-92). Thousand Oaks, CA: Corwin Press, Inc.
Clarke, M. (1996). Learning as systems change. Unpublished paper, Denver: University of Colorado-Denver.
Clune, W. (1987). Institutional choice as a theoretical framework for research on educational policy. Educational Evaluation and Policy Analysis, 9(2), 117-132.
Clune, W.H. (1990). Three views of curriculum policy in the school context: The school as policy mediator, policy critic, and policy constructor. In J.E. Talbert, M.W. McLaughlin, & N. Bascia (Eds.), The contexts of teaching in secondary schools (pp. 256-270). New York: Teachers College Press.
Clune, W. (1993). Systemic educational policy: A conceptual framework. In S.H. Fuhrman (Ed.), Designing coherent educational policy: Improving the system. San Francisco: Jossey-Bass.
Clune, W. (1998). Toward a theory of systemic reform: The case of nine NSF statewide systemic initiatives [Research monograph]. University of Wisconsin-Madison: National Institute for Science Education.
Cohen, D.K. (1990). Revolution in one classroom. Educational Evaluation and Policy Analysis, 12(3).
Cohen, D. (1994, April). Listening to the music of reform. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.
Cohen, D.K. (1995). What is the system in systemic reform? Educational Researcher, 24(9), 11-17, 31.
Cohen, D. (1996). Standards-based school reform: Policy, practice and performance. In H.F. Ladd (Ed.), Holding schools accountable (pp. 99-122). Washington, D.C.: Brookings Institution.
Cohen, D.K., & Ball, D.L. (1990, Fall). Policy and practice: An overview. Educational Evaluation and Policy Analysis, 12(3), 347-353.
Cohen, D.K., & Ball, D.L. (1990, Fall). Relations between policy and practice: A commentary. Educational Evaluation and Policy Analysis, 12(3), 249-256.
Cohen, D.K., & Barnes, C.A. (1993). Conclusions: A new pedagogy for policy. In D.K. Cohen & M.W. McLaughlin (Eds.), Teaching for understanding: Challenges for policy and practice (pp. 240-275). San Francisco: Jossey-Bass.
Cohen, D.K., & Barnes, C.A. (1993). Pedagogy and policy. In D.K. Cohen & M.W. McLaughlin (Eds.), Teaching for understanding: Challenges for policy and practice (pp. 207-239). San Francisco: Jossey-Bass.
Cohen, D.K., & Spillane, J.P. (1992). Policy and practice: The relation between governance and instruction. Review of Research in Education, 18. Washington, D.C.: American Educational Research Association.
Cohen, D.K., & Hill, H.C. (1998). State policy and classroom performance: Mathematics reform in California. University of Pennsylvania: Consortium for Policy Research in Education.
Cohen, D.K., & Hill, H.C. (1998). Instructional policy and classroom performance: The mathematics reform in California (CPRE Research Report Series, RR-39). Philadelphia: Consortium for Policy Research in Education.
Cohen, D., & Loewenberg Ball, D. (1999, June). Instruction, capacity, and improvement (CPRE Research Report Series, RR-43). Philadelphia: Consortium for Policy Research in Education.
Cohen, D., & Loewenberg Ball, D. (1998). Capacity for reform. In CPRE Research Agenda [Online]. Available: www.upenn.edu/gse/cpre/docs/resrch.
Cohen, M. (1991). Individual learning and organizational routines. Organization Science, 2(1).
Cohen, M., & Sproull, L.S. (Eds.). (1996). Organizational learning. Thousand Oaks, CA: Sage Publications.
Coleman, J. (1990). Foundations of social theory. Cambridge, MA: Belknap Press.
Colorado. (1995). CPRE Teacher Professional Development Profiles [Online].
Colorado Department of Education. (1997). Moving toward standards in Colorado classrooms. Denver: Author.
Consortium for Policy Research in Education. (1996). Public policy and school reform: A research summary [Monograph]. Philadelphia: University of Pennsylvania Graduate School of Education.
Cook, S., & Yanow, D. (1995). Culture and organizational learning. In M.D. Cohen & L.S. Sproull (Eds.), Organizational learning (pp. 430-459). Thousand Oaks, CA: Sage Publications.
Corcoran, T.B. (1995). Helping teachers teach well: Transforming professional development. CPRE Policy Briefs.
Corcoran, T., & Fuhrman, S. (1998). Going to scale: Building effective infrastructure. In CPRE Research Agenda [Online]. Available: www.upenn.edu/gse/cpre/docs/resrch.
Corcoran, T.B., & Goertz, M. (1995). Instructional capacity and high-performance schools. Educational Researcher, 24(9), 27-31.
Corcoran, T., Shields, P.M., & Zucker, A. (1998). The SSIs and professional development of teachers [Evaluation of NSF's Statewide Systemic Initiatives (SSI) program]. Menlo Park, CA: SRI International.
Cousins, J. (1996). Understanding organizational learning for educational leadership and school reform. In K.A. Leithwood (Ed.), International handbook of educational leadership and administration (pp. 589-652). The Netherlands: Kluwer Academic.
Cousins, J., & Earl, L.M. (1992, Winter). The case for participatory evaluation. Educational Evaluation and Policy Analysis, 14(4), 397-418.
CPRE Policy Brief. (1995, December). Building capacity for educational reform. Philadelphia: Consortium for Policy Research in Education.
Cummings, L., & Staw, B.M. (1990). Information and cognition in organizations. Greenwich, CT: JAI Press.
Daft, R., & Huber, G. (1987). How organizations learn: A communications framework. In Research in the sociology of organizations (Vol. 5, pp. 1-36). Greenwich, CT: JAI Press.
Daft, R., & Lengel, R.H. (1990). Information richness: A new approach to managerial behavior and organization design. In L.L. Cummings & B.M. Staw (Eds.), Information and cognition in organizations. Greenwich, CT: JAI Press.
Daft, R., & Weick, K.E. (1994). Toward a model of organizations as interpretation systems. In H. Tsoukas (Ed.), New thinking in organizational behavior (pp. 70-90). Oxford: Butterworth-Heinemann Ltd.
Darling-Hammond, L. (1990). Instructional policy into practice: The power of the bottom over the top. Educational Evaluation and Policy Analysis, 12(3), 233-241.
Darling-Hammond, L. (1993). Reframing the school reform agenda. Phi Delta Kappan, pp. 753-761.
Darling-Hammond, L. (1998, January-February). Teachers and teaching: Testing policy hypotheses from a national commission report. Educational Researcher, 27(1), 5-15.
Darling-Hammond, L., & Lieberman, N. (1992). Restructuring in policy and practice. New York: National Center for Restructuring Education, Schools, and Teaching, Teachers College, Columbia University.
Darling-Hammond, L., & Sykes, G. (Eds.). (1999). Teaching as the learning profession: Handbook of policy and practice. San Francisco: Jossey-Bass.
Davenport, T., DeLong, D.W., & Beers, M.C. (1998, Winter). Successful knowledge management projects. Sloan Management Review, pp. 43-57.
David, J. (1987). Improving education with locally developed indicators. Philadelphia: Consortium for Policy Research in Education.
DeGreene, K. (1993). A systems-based approach to policymaking. Boston: Kluwer Academic Publishers.
DiBella, A., & Nevis, E.C. (1998). How organizations learn: An integrated strategy for building learning capability. San Francisco: Jossey-Bass.
DiBella, A., & Nevis, E.C. (1996). Understanding organizational learning capabilities. Journal of Management Studies, 33(3), 361-379.
Dodgson, M. (1993). Organizational learning: A review of some literatures. Organization Studies, 14.
Dole, S., & Prestine, N.A. (1997, April). Constructivist school change: Building cognition, culture, and communities of practice. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Douglas, M. (1986). How institutions think. Syracuse, NY: Syracuse University Press.
Duncan, R., & Weiss, A. (1979). Organizational learning: Implications for organizational design. In B.M. Staw (Ed.), Research in organizational behavior. Greenwich, CT: JAI Press.
Elliott, R. (1998). Building internal capacity: The interaction of school district interventions with organizational learning processes in schools. Unpublished doctoral dissertation, Ontario Institute for Studies in Education/University of Toronto.
Elmore, R. (1978). Organizational models of social program implementation. Public Policy, 26(2), 185-228.
Elmore, R. (1979-80). Backward mapping: Implementation research and policy decisions. Political Science Quarterly, 94(4), 601-616.
Elmore, R.F. (1995). Structural reform and educational practice. Educational Researcher, 24(9), 23-26.
Elmore, R.F. (1995). Teaching, learning, and school organization: Principles of practice and the regularities of schooling. Educational Administration Quarterly, 31(3), 355-374.
Elmore, R.F. (1996). Commentary: School reform, teaching, and learning. Journal of Education Policy, 11(4), 499-504.
Elmore, R., Abelmann, C.H., & Fuhrman, S.H. (1996). The new accountability in state education reform: From process to performance. In H.F. Ladd (Ed.), Holding schools accountable: Performance-based reform in education (pp. 65-98). Washington, D.C.: Brookings Institution.
Elmore, R.F., & Fuhrman, S.H. (1994). Governing curriculum: Changing patterns in policy, politics, and practice. In R.F. Elmore & S.H. Fuhrman (Eds.), The governance of curriculum: The 1994 ASCD yearbook (pp. 1-10). Alexandria, VA: ASCD.
Elmore, R.F., Peterson, P.L., & McCarthey, S.J. (1996). Restructuring in the classroom: Teaching, learning, and school organization. San Francisco: Jossey-Bass.
Elmore, R., Siskin, L., & Carnoy, M. (1998). Accountability for results. In CPRE Research Agenda [Online]. Available: www.upenn.edu/gse/cpre/docs/resrch.
Fiol, C.M., & Lyles, M.A. (1985). Organizational learning. Academy of Management Review, 10(4), 803-813.
Firestone, W.A. (1996). Images of teaching and proposals for reform: A comparison of ideas from cognitive and organizational research. Educational Administration Quarterly, 32(2), 209-235.
Firestone, W.A., & Herriott, R.E. (1984). Multisite qualitative policy research: Some design and implementation issues. In D.M. Fetterman (Ed.), Ethnography in educational evaluation. Beverly Hills: Sage Publications.
Firestone, W.A., Mayrowetz, D., & Fairman, J. (1997). State testing and instructional reform: Middle school mathematics in Maine and Maryland. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Firestone, W.A., Mayrowetz, D., & Fairman, J. (1998, Summer). Performance-based assessment and instructional change: The effects of testing in Maine and Maryland. Educational Evaluation and Policy Analysis, 20(2), 95-113.
Floden, R., & Goertz, M. (1995). Capacity building in systemic reform. Phi Delta Kappan, 77(1), 19-21.
Ford, C., & Ogilvie, D.T. (1996). The role of creative action in organizational learning and change. Journal of Organizational Change Management, 9(1), 54-67.
Fox, C. (1991). Implementation research. In D.J. Palumbo & D.J. Calista (Eds.), Implementation and the public policy process: Opening up the black box (pp. 199-212). New York: Greenwood Press.
Friedlander, F. (1983). Patterns of individual and organizational learning. In S. Srivastva and Associates (Eds.), The executive mind: New insights in managerial thought and action. San Francisco: Jossey-Bass.
Fitz, J., Halpin, D., & Power, S. (1994). Implementation research and education policy: Practice and prospects. British Journal of Educational Studies, 42(1), 53-69.
Fuhrman, S., Clune, W.H., & Elmore, R.F. (1988). Research on education reform. Teachers College Record, 90(2), 237-257.
Fuhrman, S. (Ed.) (1993). Designing coherent educational policy: Improving the system. San Francisco: Jossey-Bass.
Fuhrman, S., & O'Day, J. (1996). Rewards and reform: Creating educational incentives that work. San Francisco: Jossey-Bass.
Fullan, M. (1992). Causes/processes of implementation and continuation. In M. Fullan & S. Stiegelbauer (Eds.), The new meaning of educational change (pp. 65-90). New York: Teachers College Press.
Fullan, M. (1990). Change practices in secondary schools: Toward a more fundamental agenda. In J. Talbert, N. Bascia, & M.W. McLaughlin (Eds.), The contexts of teaching in secondary schools. New York: Teachers College Press.
Ginsberg, R., & Berry, B. (1998). The capability for enhancing accountability. In R.J.S. Macpherson (Ed.), The politics of accountability: Educative and international perspectives (pp. 43-61). Thousand Oaks, CA: Corwin Press, Inc.
Gioia, D.A., & Sims, H.P. (1990). The thinking organization. San Francisco: Jossey-Bass.
Glennan, T. (1998). New American Schools after six years [NAS series report]. Santa Monica, CA: RAND Corporation.
Goertz, M., Floden, R.E., & O'Day, J. (1995). Studies of education reform: Systemic reform. Volume I: Findings and conclusions (CPRE Research Report Series #35A). Philadelphia: Consortium for Policy Research in Education.
Goertz, M., Floden, R.E., & O'Day, J. (1995). Studies of education reform: Systemic reform. Volume II: Case studies (CPRE Research Report Series #35B). Philadelphia: Consortium for Policy Research in Education.
Goertz, M., Floden, R.E., & O'Day, J. (1995). Studies of education reform: Systemic reform. Volume III: Technical appendix, research design and methodology (CPRE Research Report Series #35C). Philadelphia: Consortium for Policy Research in Education.
Goertz, M., & Massell, D. (1998). A case study of Connecticut's SSI (CONNSTRUCT), 1991-1996. SSI case studies, Cohort 1: Connecticut, Delaware, Louisiana and Montana. Menlo Park, CA: SRI International.
Goggin, M.L., Bowman, A.O., Lester, J.P., & O'Toole, L.J. (1991). Studying the dynamics of public policy implementation. In D.J. Palumbo & D.J. Calista (Eds.), Implementation and the public policy process: Opening up the black box (pp. 181-198). New York: Greenwood Press.
Gould, J., & Bornstein, M. (1997, Spring). Building capacity for systemic reform at the high school level. Journal of Staff Development, 18(2), 14-18.
Grant, S. (1998). Reforming reading, writing and mathematics: Teachers' responses and the prospects for systemic reform. Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Hanushek, E. (1997). Assessing the effects of school resources on student performance: An update. Educational Evaluation and Policy Analysis, 19(2), 141-164.


Hargreaves, A. (1996). Transforming knowledge: Blurring the boundaries between research, policy and practice. Educational Evaluation and Policy Analysis, 18(2), 105-122.
Hedberg, B. (1981). How organizations learn and unlearn. In C. Nystrom & W. Starbuck (Eds.), Handbook of organizational design (pp. 8-27). London: Oxford University Press.
Herman, J. (1997). Large-scale assessment in support of school reform: Lessons in the search for alternative measures [CSE Technical Report 446]. Los Angeles: CRESST.
Herriott, S.C., Levinthal, D.A., & March, J.G. (1985). Learning from experience in organizations. American Economic Review, 75, 298-302.
Hess, G. (1999, Spring). Understanding achievement (and other) changes under Chicago school reform. Educational Evaluation and Policy Analysis, 21(1), 67-83.
Heydebrand, W. (1983). Organization and praxis. In G. Morgan (Ed.), Beyond method: Strategies for social research (pp. 306-319). London: Sage Publications.
Hirsch, E., Koppich, J.E., & Knapp, M.S. (1998, December). What states are doing to improve the quality of teaching: A brief review of current patterns and trends [A CTP Working Paper]. University of Washington: Center for the Study of Teaching and Policy.
Huber, G. (1991). Organizational learning: The contributing processes and the literatures. Organization Science, 2(1), 88-115.
Jennings, N. (1996). Interpreting policy in real classrooms: Case studies of state reform and teacher practice. New York: Teachers College Press.
Jennings, N.E., & Spillane, J.P. (1996). State reform and local capacity: Encouraging ambitious instruction for all and local decision making. Journal of Education Policy, 11(4), 465-482.


Johnson, B. (1997). Reconsidering the educational restructuring process: An exercise in retrospective sense-making. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Johnston, R.C., & Sandham, J.L. (1999, April 14). States increasingly flexing their policy muscle. Education Week, pp. 1, 19-21.
Kahne, J. (1996). Reframing educational policy: Democracy, community and the individual. New York: Teachers College Press.
Kanstoroom, M., & Finn, C.E. (Eds.) (1999). New directions: Federal education policy in the twenty-first century. New York: The Thomas B. Fordham Foundation in cooperation with the Manhattan Institute for Policy Research.
Kaplan, G.R., & Usdan, M.D. (1992, May). The changing look of education's policy networks. Phi Delta Kappan, pp. 664-683.
Kelley, C. (1997, Spring). Teacher compensation and organization. Educational Evaluation and Policy Analysis, 19(1), 15-28.
Kim, D. (1993, Winter). The link between individual and organizational learning. Sloan Management Review, pp. 37-50.
Kim, D. (1994). Managing organizational learning cycles. In K.T. Wardman (Ed.), Reflections on creating learning organizations (pp. 41-50). Cambridge, MA: Pegasus Communications.
King, J.A., Morris, L.L., & Fitz-Gibbon, C.T. (1987). How to assess program implementation. Newbury Park: Sage Publications.
Klein, J. (1989). Parenthetic learning in organizations: Toward the unlearning of the unlearning model. Journal of Management Studies, 26, 291-308.
Knapp, M.S. (1996). Between systemic reforms and the mathematics and science classroom: The dynamics of innovation, implementation and professional learning. National Science Foundation.


Knapp, M.S. (1997, Summer). Between systemic reform and the math science classroom: The dynamics of innovation, implementation and professional learning. Review of Educational Research, 67(2), 222-266.
Koretz, D., Mitchell, K., Baron, S., & Keith, S. (1996). Perceived effects of the Maryland school performance assessment program [CSE Report 409]. Los Angeles: CRESST.
Labaree, D. (1999, May 19). The chronic failure of curriculum reform. Education Week, pp. 42-45.
Ladd, H. (1996). Holding schools accountable: Performance-based reform in education. Washington, D.C.: Brookings Institution.
LeCompte, M.D., & Preissle, J. (1993). Ethnography and qualitative design in educational research. Orlando, FL: Academic Press.
Lee, J. (1997, Spring). State activism in education reform: Applying the Rasch model to measure trends and examine policy coherence. Educational Evaluation and Policy Analysis, 19(1), 29-44.
Leithwood, K., Leonard, L., & Sharratt, L. (1998, April). Conditions fostering organizational learning in schools. Educational Administration Quarterly, 34(2), 243-276.
Levin, H. (1993). Accelerating the system. Accelerated Schools, 3(1), 16-22.
Levin, H. (1998). Educational performance standards and the economy. Educational Researcher, 27(4), 4-10.
Levinthal, D. (1991). Organizational adaptation and environmental selection: Interrelated processes of change. Organization Science, 2(1), 140-155.
Levitt, B., & March, J.G. (1988). Organizational learning. In J. Blake & W.R. Scott (Eds.), Annual review of sociology (pp. 319-340). Palo Alto, CA: Annual Reviews.


Linde, C. (1997). How institutions use narratives to remember. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.
Linn, R. (1999). Standards-based accountability: Ten suggestions [CRESST Policy Brief]. Los Angeles: National Center for Research on Evaluation, Standards and Student Testing.
Linn, R., & Baker, E. (1998, Fall). School quality: Some missing pieces [The CRESST Line]. Los Angeles: National Center for Research on Evaluation, Standards and Student Testing.
Little, J. (1993). Teachers' professional development in a climate of educational reform. Educational Evaluation and Policy Analysis, 13(2), 129-151.
Little, J.W., & McLaughlin, M.W. (1993). Teachers' work: Individuals, colleagues and contexts. New York: Teachers College Press.
Lotto, L., & Murphy, J. (1990). Making sense of schools as organizations: Cognition and sensemaking in schools. In Advances in educational administration, Vol. 1, Part B (pp. 201-240). Greenwich, CT: JAI Press.
Loveless, T. (1998, Spring). Uneasy allies: The evolving relationship of school and state. Educational Evaluation and Policy Analysis, 20(1), 1-8.
Malen, B., & Knapp, M. (1997). Rethinking the multiple perspectives approach to education policy analysis: Implications for policy-practice connections. Journal of Education Policy, 12(5), 419-445.
March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71-87.
March, J., & Olsen, J.P. (1975). The uncertainty of the past. European Journal of Political Research, 3, 147-171.
Marion, R. (1999). The edge of organization: Chaos and complexity theories of formal social systems. Thousand Oaks, CA: Sage Publications.


Marks, H.M., & Seashore-Louis, K. (1997, Fall). Does teacher empowerment affect the classroom? The implications of teacher empowerment for instructional practice and student academic performance. Educational Evaluation and Policy Analysis, 19(3), 245-275.
Massell, D. (1998). State strategies for building capacity in education: Progress and continuing challenges. CPRE Research Report Series, RR-41. Philadelphia: Consortium for Policy Research in Education.
Massell, D. (1998). State strategies for building local capacity: Addressing the needs of standards-based reform. Philadelphia: Consortium for Policy Research in Education.
Massell, D. (1998, July). State strategies for building local capacity: Addressing the needs of standards-based reform. CPRE Policy Briefs, RB-25. Philadelphia: Consortium for Policy Research in Education.
Massell, D., Kirst, M., & Hoppe, M. (1997). Persistence and change: Standards-based systemic reform in nine states. CPRE Policy Briefs, RB-21. Philadelphia: Consortium for Policy Research in Education.
Matland, R.E. (1995). Synthesizing the implementation literature: The ambiguity-conflict model of policy implementation. Journal of Public Administration Research and Theory, 5(2), 145-174.
Maxwell, J. (1993). Gaining acceptance from participants, clients and policy makers for qualitative research. In D.M. Fetterman (Ed.), Speaking the language of power: Communication, collaboration and advocacy (pp. 105-115). Falmer Press.
Maxwell, J. (1996). Using qualitative research to develop causal explanations. Working Papers: Harvard Project on Schooling and Children.
Maxwell, J. (1998). Designing a qualitative study. In L. Bickman & D.J. Rog (Eds.), The handbook of applied social research methods (pp. 69-100). Thousand Oaks, CA: Sage Publications.
Maxwell, J. (1998). Integrating quantitative and qualitative research designs. Unpublished paper, George Mason University.


Maxwell, J.A., & Miller, B.A. (1997). Categorizing and connecting as components of qualitative data analysis. Unpublished paper, George Mason University.
Maxwell, J.A., Bashook, P.G., & Sandlow, L.J. (1986). Combining ethnographic and experimental methods in educational evaluation: A case study. In D. Fetterman & M.A. Pitman (Eds.), Educational evaluation: Ethnography in theory, practice and politics (pp. 121-143). Beverly Hills: Sage Publications.
Mayer, D. (1999, Spring). Measuring instructional practice: Can policymakers trust survey data? Educational Evaluation and Policy Analysis, 21(1), 29-45.
Mazmanian, D.A., & Sabatier, P.A. (1981). Effective policy implementation. Lexington, MA: Lexington Books.
Mazmanian, D.A., & Sabatier, P.A. (1989). Implementation and public policy with a new postscript. Lanham, MD: University Press of America.
McDonnell, L. (1991). Ideas and values in implementation analysis: The case of teacher policy. In A.R. Odden (Ed.), Education policy implementation (pp. 241-258). Albany: SUNY Press.
McDonnell, L. (1997). The politics of state testing: Implementing new student assessments [CSE Report 424]. Los Angeles: CRESST.
McDonnell, L.M., & Choisser, C. (1997). Testing and teaching: Local implementation of new state assessments [CSE Technical Report 442]. Los Angeles: CRESST.
McLaughlin, M.W. (1976). Implementation as mutual adaptation: Changes in classroom organization. In W. Williams & R.F. Elmore (Eds.), Social program implementation (pp. 167-180). New York: Academic Press.
McLaughlin, M.W. (1987). Learning from experience: Lessons from policy implementation. Educational Evaluation and Policy Analysis, 9(2), 171-178.


McLaughlin, M. (1991). The RAND change agent study: Ten years later. In A.R. Odden (Ed.), Education policy implementation (pp. 143-156). Albany, NY: SUNY Press.
McLaughlin, M.W. (1993). What matters most in teachers' workplace context? In J.W. Little & M.W. McLaughlin (Eds.), Teachers' work: Individuals, colleagues and contexts. New York: Teachers College Press.
McLaughlin, M.W., & Shepard, L.A. (1995). Improving education through standards-based reform. Stanford, CA: The National Academy of Education.
McLaughlin, M.W., & Oberman, I. (1996). Teacher learning: New policies and new practices. New York: Teachers College Press.
McLaughlin, M.W., Talbert, J.E., & Bascia, N. (1990). The contexts of teaching in secondary schools. New York: Teachers College Press.
Meindl, J.R., Stubbart, C., & Porac, J.F. (Eds.) (1996). Cognition within and between organizations. Thousand Oaks, CA: Sage Publications.
Merriam, S. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass.
Meyer, J.W., & Rowan, B. (1977). Institutionalized organizations: Formal structure as myth and ceremony. In W.W. Powell & P.J. DiMaggio (Eds.), The new institutionalism in organizational analysis (pp. 41-62). Chicago: University of Chicago Press.
Miles, M., & Huberman, A.M. (1994). Qualitative data analysis: An expanded sourcebook. Newbury Park: Sage Publications.
Miller, H. (1990). Weber's action theory and Lowi's policy types in formulation, enactment and implementation. Policy Studies Journal, 18(4), 887-905.
Monk, D. (1998, Winter). Resource and pupil performance implications of increased high school graduation requirements. UCEA Review, XXXIX(3), 1-2, 5.
Morgan, G. (1986). Images of organization. London: Sage Publications.


Murnane, R.J., & Levy, F. (1996). Teaching to new standards. In S.H. Fuhrman & J. O'Day (Eds.), Rewards and reform: Creating educational incentives that work (pp. 257-293). San Francisco: Jossey-Bass.
Nakamura, R.T., & Smallwood, F. (1980). The politics of policy implementation. New York: St. Martin's Press.
Nevis, E., DiBella, A., & Gould, J.M. (1995). Understanding organizations as learning systems. Sloan Management Review, pp. 73-85.
Newmann, F.M., King, M.B., & Rigdon, M. (1997). Accountability and school performance: Implications for restructuring schools. Harvard Educational Review, 67(1), 41-74.
Nicolini, D., & Meznar, M.B. (1995). The social construction of organizational learning: Conceptual and practical issues in the field. Human Relations, 48(7), 727-746.
Norman, R. (1985). Developing capabilities for organizational learning. In J.M. Pennings and Associates (Ed.), Organizational strategy and change (pp. 217-248). San Francisco: Jossey-Bass.
Nystrom, P.C., & Starbuck, W.H. (1984, Spring). To avoid organizational crisis, unlearn. Organizational Dynamics, pp. 53-65.
Odden, A.R. (1991). Education policy implementation. Albany: SUNY Press.
Odden, A.R. (1991). The evolution of education policy implementation. In A.R. Odden (Ed.), Education policy implementation. Albany: SUNY Press.
Odden, A. (1998, September). Creating school finance policies that facilitate new goals. CPRE Policy Briefs, RB-26. Philadelphia: Consortium for Policy Research in Education.


Ogawa, R. (1996). The case for organization in highly institutionalized settings. Paper presented at the annual meeting of the American Educational Research Association, New York.
Olson, L. (1998, February 11). The push for accountability gathers steam. Education Week, pp. 1, 12.
Patton, M. (1987). How to use qualitative methods in evaluation. Newbury Park: Sage Publications.
Peterson, P.L. (1990). The California study of elementary mathematics. Educational Evaluation and Policy Analysis, 12(3), 257-261.
Preskill, H., & Torres, R.T. (1999). Evaluative inquiry for learning in organizations. Thousand Oaks, CA: Sage Publications.
Pressman, J.L., & Wildavsky, A.B. (1973). Implementation. Berkeley, CA: University of California Press.
Prestine, N. (1995). Crisscrossing the landscape: Another turn at cognition in educational administration. Educational Administration Quarterly, 3(1), 134-147.
Prestine, N.A., & Bowen, C. (1993). Benchmarks of change: Assessing Essential School restructuring efforts. Educational Evaluation and Policy Analysis, 15(3), 289-319.
Quality counts '99: Rewarding results and punishing failure. (1999). Education Week, vol. XVIII. Bethesda, MD: Education Week & Pew Charitable Trust.
Ravitch, D. (1995). Debating the future of American education: Do we need national standards and assessment? Washington, D.C.: Brookings Institution.
Ravitch, D. (1995). National standards in American education: A citizen's guide. Washington, D.C.: Brookings Institution.
Ravitch, D. (Ed.) (1998). Brookings papers on education policy. Washington, D.C.: Brookings Institution.


Ravitch, D. (Ed.) (1999). Brookings papers on education policy. Washington, D.C.: Brookings Institution.
Roberts, C., & Kleiner, A. (1999). Five kinds of systems thinking. In Peter Senge (Ed.), The dance of change (pp. 137-149). New York: Currency Doubleday.
Rose, R. (1993). Lesson-drawing in public policy: A guide to learning across time and space. Chatham, NJ: Chatham House.
Rotherham, A. (1999). Toward performance-based federal education funding: Reauthorization of the Elementary and Secondary Education Act [21st Century Schools Project]. Washington, D.C.: Progressive Policy Institute.
Rowan, B. (1991). Commitment and control: Alternative strategies for the organizational design of schools. In Review of research in education (pp. 352-389). New York: AERA.
Rowan, B. (1996). Standards as incentives for instructional reform. In S.H. Fuhrman & J. O'Day (Eds.), Rewards and reform: Creating educational incentives that work (pp. 195-225). San Francisco: Jossey-Bass.
Schein, E. (1985). Defining organizational culture. In J.M. Shafritz & J.S. Ott (Eds.), Classics of organization theory (pp. 490-502). Belmont, CA: Wadsworth Publishing.
Schein, E. (1993, Winter). How can organizations learn faster? Sloan Management Review, pp. 85-92.
Schein, E. (1996). Organizational learning: What is new? Available: http://learning.mit.edu/res/wp/10012.html.
Schlechty, P. (1997). Inventing better schools: An action plan for educational reform. San Francisco: Jossey-Bass.
Schmidt, W.H., & Prawat, R.S. (1999, Spring). What does the Third International Math and Science Study tell us about where to draw the line in the top-down versus bottom-up debate? Educational Evaluation and Policy Analysis, 21(1), 85-91.


Schneider, S.C., & Angelmar, R. (1993). Cognition in organizational analysis: Who's minding the store? Organization Studies, 14(3), 347-374.
Schon, D. (1994). Teaching artistry through reflection-in-action. In H. Tsoukas (Ed.), New thinking in organizational behavior (pp. 235-249). Oxford: Butterworth-Heinemann Ltd.
Schon, D.A., & McDonald, J.P. (1998). Doing what you mean to do in school reform [Occasional paper series]. Brown University: Annenberg Institute for School Reform.
Schon, D. (1983). Organizational learning. In G. Morgan (Ed.), Beyond method: Strategies for social research (pp. 114-128). London: Sage Publications.
Scheirer, M.A., & Griffith, J. (1990). Studying micro implementation empirically: Lessons and dilemmas. In D.J. Palumbo & D.J. Calista (Eds.), Implementation and the public policy process: Opening up the black box (pp. 163-180). New York: Greenwood Press.
Schwartzman, H. (1992). Ethnography in organizations. Newbury Park: Sage Publications.
Schwille, J. (1983). Teachers as policy brokers in the context of elementary school mathematics. In Teaching and educational policy (pp. 370-391). New York: Longman Press.
Scott, W. (1992). Organizations: Rational, natural and open systems. Englewood Cliffs, NJ: Prentice Hall.
Scott, W. (1995). Institutional effects on organizational structure and performance. In W. Scott (Ed.), Institutions and organizations. Thousand Oaks, CA: Sage Publications.
Scott, W. (1995). Institutions and organizations. Thousand Oaks: Sage Publications.
Scott, W.R., & Meyer, J.W. (1994). Institutional environments and organizations: Structural complexities and individualism. Thousand Oaks, CA: Sage Publications.


Scribner, J.P., Sunday-Cockrell, K., Cockrell, D.H., & Valentine, J.W. (1999, February). Creating professional communities in schools through organizational learning: An evaluation of a school improvement process. Educational Administration Quarterly, XXXV(1), 130-160.
Seidman, I. (1991). Interviewing as qualitative research. New York: Teachers College Press.
Senge, P. (1992). The fifth discipline: The art and practice of the learning organization. New York: Doubleday Currency.
Senge, P., & Kleiner, A. (1999). The dance of change. New York: Currency Doubleday.
Senge, P.M., Roberts, C., Ross, R.B., Smith, B.J., & Kleiner, A. (1994). The fifth discipline fieldbook: Strategies and tools for building a learning organization. New York: Doubleday.
Shrivastava, P. (1983). A typology of organizational learning systems. Journal of Management Studies, 20(1), 7-28.
Shrivastava, P. (1984). Organizational frames of reference. Human Relations, 37, 795-807.
Simon, H.A. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125-134.
Skrtic, T. (1996). Special education and student disability as organizational pathology: Toward a metatheory of school organization and change. In T.M. Skrtic (Ed.), Disability and democracy: Reconstructing special education for postmodernity (pp. 190-273). New York: Teachers College Press.
Smith, M.L. (1997). Reforming schools by reforming assessment: Consequences of the Arizona student assessment program: Equity and teacher capacity building [CSE Report 425]. Los Angeles: CRESST.


Smith, M.L., Heinecke, W., & Noble, A.J. (1999). State assessment becomes political spectacle: Part I: Introduction to policy stories and policy studies. Teachers College Record [Online]. Available: www.tcrecord.org.
Smith, M.L., & O'Day, J. (1991). Systemic school reform. In S.H. Fuhrman & B. Malen (Eds.), The politics of curriculum and testing (pp. 233-268). London: The Falmer Press.
Smithson, J.L., & Porter, A.C. (1994). Measuring classroom practice: Lessons learned from efforts to describe the enacted curriculum-the Reform Up Close study. Philadelphia: Consortium for Policy Research in Education.
Smylie, M.A., Lazarus, V., & Brownlee-Conyers, J. (1996). Instructional outcomes of school-based participative decision making. Educational Evaluation and Policy Analysis, 18(3), 181-198.
Sorg, J.D. (1978). A theory of individual behavior in the implementation of policy innovations (Doctoral dissertation, The Ohio State University, 1978). University Microfilms International, 7902230.
Spillane, J.P. (1993). Interactive policy-making: State instructional policy and the role of the school district. Unpublished doctoral dissertation, Michigan State University.
Spillane, J. (1994). How districts mediate between state policy and teachers' practice. In R.F. Elmore & S.H. Fuhrman (Eds.), The governance of curriculum (pp. 167-185). Alexandria, VA: Association for Supervision and Curriculum Development.
Spillane, J. (1998, February). A cognitive perspective on the role of the local educational agency in implementing instructional policy: Accounting for local variability. Educational Administration Quarterly, 34(1), 31-57.
Spillane, J. (1998, Spring). State policy and the non-monolithic nature of the local school district: Organizational and professional considerations. American Educational Research Journal, 35(1), 33-63.


Spillane, J.P., & Jennings, N.E. (1997, Spring). Aligned instructional policy and ambitious pedagogy: Exploring instructional reform from the classroom perspective. Teachers College Record, 98(3), 448-481.
Spillane, J., & Thompson, C.L. (1997, Summer). Reconstructing conceptions of local capacity: The local education agency's capacity for ambitious instructional reform. Educational Evaluation and Policy Analysis, 19(2), 185-203.
Spillane, J.P., & Zeuli, J.S. (1999, Spring). Reform and teaching: Exploring patterns of practice in the context of national and state mathematics reforms. Educational Evaluation and Policy Analysis, 21(1), 1-27.
Spillane, J.P., Peterson, P.L., Prawat, R.S., Jennings, N.E., & Borman, J. (1996). Exploring policy and practice relations: A teaching and learning perspective. Journal of Education Policy, 11(4), 431-440.
Spillane, J.P., Thompson, C.L., Lubienski, C., Jita, L., & Reiman, C.B. (1995). The local government policy system affecting math and science education in Michigan: Lessons from nine school districts. East Lansing: Michigan State University and the National Science Foundation.
Sproull, L. (1981). Response to regulation: An organizational process framework. Administration & Society, 12(4), 447-470.
Stake, R. (1995). The art of case study research. Newbury Park: Sage Publications.
Stecher, B.M., Barron, S., Borko, H., & Wolf, S. (1997). Important features of state assessment systems from the local perspective: Interim report [CSE Technical Report 472]. Los Angeles: CRESST.
Stecher, B.M., Barron, S., Kaganoff, T., & Goodwin, J. (1998). The effects of standards-based assessment on classroom practices: Results of the 1996-97 RAND survey of Kentucky teachers of mathematics and writing [CSE Technical Report 482]. Los Angeles, CA: CRESST.
Stein, S. (1997, March). Signifying policy: The local meaning of ESEA Title I. Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.


Stiegelbauer, S. (1996). Change has changed: Implications for implementation of assessments from the organizational change literature. In M.B. Kane & R. Mitchell (Eds.), Implementing performance assessment: Promises, problems and challenges (pp. 139-159). Mahwah, NJ: Lawrence Erlbaum Associates, Publishers.
Stokes, L. (1997). Short-term policy support for long-term school change: A dilemma for reform-minded practitioners. Journal of Education Policy, 12(5), 371-384.
Sugarman, B. (1997). Notes toward a closer collaboration between organizational theory, learning organizations and organizational learning in the search for a new paradigm. Available: http://learning.mit.edu/res/kr/Sugarman.html.
Sykes, G. (1990, Fall). Organizing policy into practice: Reactions to the cases. Educational Evaluation and Policy Analysis, 12(3), 243-247.
Talbert, J. (1996). Primacy and promise of professional development in the nation's education reform agenda: Sociological views. In K.M. Borman (Ed.), Implementing educational reform: Sociological perspectives on educational policy (pp. 283-312). Norwood, NJ: Ablex Publishing Corporation.
Talbert, J., & McLaughlin, M.W. (1993). Understanding teaching in context. In D. Cohen & M.W. McLaughlin (Eds.), Teaching for understanding: Challenges for policy and practice (pp. 207-239). San Francisco: Jossey-Bass.
Thompson, C., & Spillane, J. (1994). The state policy system affecting science and math education in Michigan [Technical report]. East Lansing: Michigan Partnership for New Education and National Science Foundation.
Timar, T.B. (1989). A theoretical framework for local responses to state policy: Implementing Utah's career ladder program. Educational Evaluation and Policy Analysis, 15(1), 329-341.
Toft-Everson, S., Burger, D., & Jesse, D. (1997, June). Organizational learning and development literature review report. Mid-continent Regional Educational Laboratory.


Toft-Everson, S., Burger, D., & Jesse, D. (1997, December). Site visit report. Mid-continent Regional Educational Laboratory.
Torrance, H. (1993). Combining measurement-driven instruction with authentic assessment: Some initial observations of national assessment in England and Wales. Educational Evaluation and Policy Analysis, 15(1), 81-90.
Tucker, M., & Codding, J.B. (1998). Standards for our schools: How to set them, measure them and reach them. San Francisco: Jossey-Bass.
Tyack, D., & Cuban, L. (1995). Tinkering toward utopia: A century of public school reform. Cambridge: Harvard University Press.
Ulrich, D., Jick, T., & Von Glinow, M.A. (1994). High impact learning: Building and diffusing learning capability. Organizational Dynamics.
Van Meter, D.S., & Van Horn, C.E. (1975). The policy implementation process: A conceptual framework. Administration and Society, 6(4), 445-487.
Wardman, K. (1994). Reflections on creating learning organizations. Cambridge, MA: Pegasus Communications.
Wechsler, M.E., & Friedrich, L.D. (1997). The role of mediating organizations for school reform: Independent agents or district dependents. Journal of Education Policy, 12(5), 385-401.
Weick, K. (1979). Social psychology of organizing. Reading, MA: Addison-Wesley.
Weick, K. (1990). Cognitive processes in organizations. In L.L. Cummings & B.M. Staw (Eds.), Information and cognition in organizations. Greenwich, CT: JAI Press.
Weick, K. (1991). The non-traditional quality of organizational learning. Organization Science, 2(1), 116-124.
Weick, K. (1994). Cartographic myths in organizations. In H. Tsoukas (Ed.), New thinking in organizational behavior (pp. 211-220). Oxford: Butterworth-Heinemann Ltd.


Weick, K. (1996). Sensemaking in organizations. Thousand Oaks, CA: Sage Publications.
Wiemers, N. (1990, Fall). Transformation and accommodation: A case study of Joe Scott. Educational Evaluation and Policy Analysis, 12(3), 297-308.
Wilson, S. (1990, Fall). A conflict of interest: The case of Mark Black. Educational Evaluation and Policy Analysis, 12(3), 309-326.
Wirt, F., & Kirst, M.W. (1997). The political dynamics of American education. Berkeley, CA: McCutchan Publishing Corporation.
Woods, P., & Wenham, P. (1995). Politics and pedagogy: A case study in appropriation. Journal of Education Policy, 10(2), 119-141.
Yanow, D. (1991). Tackling the implementation problem: Epistemological issues in implementation research. In D.J. Palumbo & D.J. Calista (Eds.), Implementation and the public policy process: Opening up the black box (pp. 213-228). New York: Greenwood Press.
Yanow, D. (1996). How does a policy mean? Interpreting policy and organizational actions. Washington, D.C.: Georgetown University Press.
Yin, R. (1993). Applications of case study research. Newbury Park: Sage Publications.
Yin, R. (1994). Case study research: Design and methods (2nd ed.). Newbury Park: Sage Publications.
Younis, T. (1990). Implementation in public policy. Aldershot, UK: Dartmouth Press.
Zucker, A., Shields, P.M., Adelman, N.E., Corcoran, T.B., & Goertz, M.E. (1998). A report on the evaluation of the National Science Foundation's statewide systemic initiatives (SSI) program [An REC-sponsored report on evaluation]. Arlington, VA: National Science Foundation.


Zucker, L. (1987). Institutional theories of organization. In Annual review of sociology (pp. 443-464). Annual Reviews, Inc.