Year 1 Baseline External Evaluation Report

North Carolina Investing in Rural Innovative Schools (NC iRIS)
March 2013

Submitted to Dennis Davis, North Carolina New Schools
March 19, 2013

5900 Summit Ave., #201
Browns Summit, NC 27214
www.serve.org
NORTH CAROLINA INVESTING IN RURAL INNOVATIVE
SCHOOLS (NC iRIS)
YEAR 1 EXTERNAL EVALUATION REPORT
Prepared by:
Dr. Julie Edmunds
Dr. Laura Gould
Mr. Bryan Hutchins
Ms. Megan Thompson
SERVE Center at UNCG
Gateway University Research Park - Dixon Building
5900 Summit Avenue, #201
Browns Summit, NC 27214
(800) 755-3277
Contact:
Dr. Julie Edmunds, Program Director
336-574-8727
[email protected]
Submitted to:
Dennis Davis, North Carolina New Schools
Copyright © Notice
Copyright © 2013. The material within this report may not be reproduced or replicated
without written permission from SERVE Center at the University of North Carolina at
Greensboro.
For permission, contact: Julie Edmunds at [email protected]; 336-574-8727
Suggested citation:
Edmunds, J. A., Gould, L. F., Hutchins, B. C., & Thompson, M. (2013). North Carolina
Investing in Rural Innovative Schools (NC iRIS): Year 1 Baseline External Evaluation
Report. Greensboro, NC: The SERVE Center, University of North Carolina at
Greensboro.
Disclaimer:
The opinions expressed in this report reflect those of the authors and do not represent
the views or opinions of other individuals within the SERVE Center, the University of
North Carolina at Greensboro, or North Carolina New Schools.
Background Information about the SERVE Center
The SERVE Center at the University of North Carolina at Greensboro (UNCG) is a
university-based research, development, dissemination, evaluation, and technical
assistance center. Its mission is to support and promote teaching and learning
excellence in the K-12 education community.
Since its inception in 1990, SERVE has been awarded over $200 million in contracts and
grants. It has successfully managed 14 major awards including four consecutive
contracts for the Regional Educational Laboratory for the Southeast (REL-SE) funded by
the Institute of Education Sciences (IES) at the US Department of Education (USED) and
four awards from USED for the National Center for Homeless Education (NCHE). In
addition, past SERVE awards include a five-year Technology Grant for Coordinating
Teaching and Learning in Migrant Communities, three consecutive contracts as the
Eisenhower Consortium for Mathematics and Science Education for the Southeast, and
two consecutive Regional Technology in Education Consortium grants.
At the national level, SERVE operates the National Center for Homeless Education
(NCHE), USED’s technical assistance and information dissemination center in the area of
homeless education. NCHE uses state-of-the-art technology for web communication and
online professional development and for supporting state coordinators of homeless
education, local program coordinators, educators, parents, and advocates in all 50
states and in 15,000 school districts.
In addition to national-level NCHE activities, SERVE currently conducts research studies
and evaluations under grants and contracts with federal, state, and local education
agencies. Examples of SERVE’s grant-funded research work include two federally funded
studies of the impact of early college high schools. Contract work includes evaluations of
two Investing in Innovation (i3) projects, the Winston-Salem/Forsyth County Magnet
Program in North Carolina, the Guilford County Schools teacher incentive program
(Mission Possible), the USED-funded Bridges to Early Learning Project in South Carolina,
and North Carolina’s Race to the Top Initiative. The Program Evaluation Standards,
Second Edition (The Joint Committee on Standards for Educational Evaluation, 1994) and
the Guiding Principles for Evaluators (American Evaluation Association, 2004) guide the
evaluation work performed at the SERVE Center.
Table of Contents
Background Information About the SERVE Center ............................................................ iii
Executive Summary..............................................................................................................1
Section I: Introduction and Overview ................................................................................ 7
Section II: Evaluation Design ............................................................................................. 10
Section III: Program Implementation................................................................................ 13
School Recruitment ..................................................................................................... 13
Integrated System of Supports .................................................................................... 14
Postsecondary Partnerships and College Course Funding ........................................... 20
Activities to Influence District and State Context ........................................................ 22
Section IV: Results ............................................................................................................. 24
School-Level Characteristics ......................................................................................... 24
Baseline Outcome Data ................................................................................................ 25
Baseline Survey Data ..................................................................................................... 28
School-level Implementation ........................................................................................ 33
Section V: Lessons Learned, Conclusions and Recommendations ................................... 35
References ........................................................................................................................ 39
Appendix A: Performance Indicators ................................................................................ 40
Appendix B: Methodology ................................................................................................ 46
Appendix C: Design Principle Rubric ................................................................................. 73
Appendix D: NC iRIS Implementation Survey ................................................................... 88
North Carolina Investing in Rural Innovative Schools (NC iRIS): Year 1
Baseline External Evaluation Report
Executive Summary
North Carolina Investing in Rural Innovative Schools (NC iRIS) is designed to increase the
number of students who graduate from high school and are prepared for enrollment
and success in postsecondary education. The project seeks to blend high school and
college by applying strategies from the successful early college high school model to a
total of 18 traditional high schools located in rural, low-wealth districts.
NC iRIS is managed by North Carolina New Schools (NC New Schools), which also
supports the early college model. According to NC New Schools, the critical components
of NC iRIS include a set of services that are intended to support implementation of a
whole-school reform model emphasizing the creation of a college-preparatory school
environment through six Design Principles. The services provided include: 1.) a series of
professional development activities centered around implementation of the six Design
Principles; 2.) on-site leadership coaching for administrative teams on the Design
Principles; 3.) on-site instructional coaching on the Design Principles, emphasizing the
Common Instructional Framework; 4.) funding for college credit courses for students;
and 5.) assistance in developing partnerships with postsecondary institutions. As a result
of these services, each school is expected to implement six Design Principles that
represent characteristics of an effective high school. These Design Principles, as
articulated by NC New Schools, are as follows: 1.) ensuring that students are ready for
college; 2.) instilling powerful teaching and learning in schools; 3.) providing high
student/staff personalization; 4.) redefining professionalism; 5.) creating leadership
that develops a collective vision; and 6.) implementing a purposeful design in which
school structures support all of the above principles. A primary emphasis of the program
will be increasing the number of students who participate in college credit-bearing
courses while in high school.
This report includes results from the evaluation of the first year of NC iRIS. The
evaluation uses mixed methods to examine the implementation and impact of the
project. The impact of the project will be determined through a quasi-experimental
study in which student outcomes for NC iRIS schools will be compared to a matched set
of comparison high schools. The evaluation will study implementation through the use
of surveys, observations, and site visits.
The Year 1 report focuses on the implementation of the program and on presenting
baseline data for the Cohort 1 schools. A summary of the results is presented below,
organized by the implementation and impact evaluations.
Implementation Evaluation: At the end of the first year of the project, North Carolina
New Schools (NC New Schools) was mostly on-track for accomplishing the goals of the
NC iRIS project, although there were delays in some areas. NC New Schools staff
completed a variety of activities as described below.
By the fall of 2012, a total of 18 schools had agreed to participate in the project. Five
schools (Cohort 1) started receiving services in Year 1 and will continue receiving them
through the third year of the project. The eight schools in Cohort 2 are scheduled to
receive services starting in Year 2 through Year 4 and the five schools in Cohort 3 will
receive services in Years 3-5. The starting date for Cohort 1 schools was later than the
program staff would have liked, which resulted in challenges in obtaining buy-in,
scheduling professional development and coaching, and starting college courses. The NC
New Schools staff have recognized this and have already started working with Cohort 2
schools as of January 2013.
A core component of the program is an Integrated System of Supports, which includes
professional development, leadership coaching, and instructional coaching, all centered
on building participants’ knowledge and expertise around the Design Principles. Table 1
shows the extent to which the project is on-track relative to these supports.
Table 1: Professional Development and Coaching Services Provided

| Service | Targeted Number of Days Expected to be Provided | Actual Number of Days Provided | Average Days of School Participation | Range of Participation (lowest to highest level) |
|---|---|---|---|---|
| Professional Development | 13 | 18 | 12 | 4-15.5 |
| Leadership Coaching | 9 | 4 | 4 | 3-6 |
| Instructional Coaching | 34 | 24 | 24 | 7-45.5 |
These data show that NC New Schools provided more than the anticipated amount of
professional development days. Actual participation rates varied by school. One school
participated in as few as 4 days of professional development while two schools
participated in 15.5 days of professional development.
NC New Schools was expected to provide an average of 9 days of on-site leadership
coaching by the end of January 2013. The actual average number of days was 4, with
only one school receiving more than half of the targeted number of visits. For
instructional coaching, NC New Schools was expected to provide an average of 34 days.
There was an average of 24 days of coaching provided with schools ranging from a low
of 7 days to a high of 45.5 days. NC New Schools is developing a plan to ensure that
those schools that have received fewer days of services receive the full number of visits
by the end of the year or over the summer.
In terms of the content of the visits, leadership coaches worked with the principal
primarily on leadership development and on improving instruction. The instructional
coaches also focused on improving instruction. An analysis of the coaches’ reports
showed that the leadership coaches reported focusing on the Ready for College Design
Principle approximately two-thirds of the time and the instructional coaches focused on
it one-third of the time.
Another core aspect of NC iRIS is the provision of college courses. During the first year,
NC New Schools staff supported the development of postsecondary partnerships that
would allow for these college classes. North Carolina's new College and Career
Readiness legislation posed a challenge: it restricted which students could take
community college courses and did not permit students to take college courses in 10th
grade. NC New Schools staff worked to obtain a waiver to allow students to take one
course in 10th grade; they have also been identifying alternative sources of college
courses for schools to access if necessary. NC New Schools staff have also been helping
districts establish Memoranda of Understanding (MOUs) to formalize the relationship
between the district and the community college. As of the end of Year 1, one MOU had
been finalized and the remaining three were still under development.
No students took college courses in the first semester of the intervention; however,
students did begin taking courses in the second semester. Results for this will be
included in the Year 2 evaluation report.
The NC New Schools staff also undertook specific activities to influence the state and
district context. They engaged in a process called the Design Principle Rubric Review,
during which the NC New Schools staff spent time observing in a school. They then met
with the district superintendent and a team from the school to review the observational
and other data and work with the school to develop a plan to meet the Ready for
College Design Principle. As of the end of January, they had completed reviews for four
out of the five schools. NC New Schools staff saw this as a powerful learning experience
and are looking to implement it with all of their programs.
Finally, NC New Schools hired a community development coordinator who met with
community stakeholders in the four Cohort 1 districts to build an understanding of the
need for students to be ready for college and to build support for NC iRIS.
Impact Evaluation: In Year 1, the evaluation focused on collecting baseline data for the
participating schools, as well as qualitative data concerning implementation of the
program in the schools.
NC iRIS Cohort 1 schools are located in the targeted rural, low-wealth counties. An
analysis of the demographics of the schools showed that the Cohort 1 NC iRIS schools
are smaller in size than the state average and have a higher proportion of students in
poverty. They also have lower percentages of students who are minority and English
Language Learners than the state averages; this is because the Cohort 1 schools were
located in areas of the state with lower minority populations.
NC iRIS Cohort 1 schools are starting with outcomes that suggest a relatively low
emphasis on college. In particular, before beginning NC iRIS, the schools' enrollment in
advanced courses (Advanced Placement, International Baccalaureate, or dual credit
courses) was approximately half the state average. Table 2 shows the baseline levels of
core outcomes for the Cohort 1 schools, compared to the state average.
Table 2: Baseline Outcomes

| School | On-Time Grad Rate Avg (2008-2012) | Percent Students Enrolled in Algebra 2 (2011) | Percent Enrollments in AP/IB/College Credit Courses (2012) | Attendance Rate | Avg. Pass Rate Core Subjects (EOC Composite) |
|---|---|---|---|---|---|
| Treatment Group Mean | 74.0% | 15.9% | 2.7% | 94.5% | 75.0% |
| State Average | 75.3% | 20.0% | 5% | 94.5% | 81.4% |
The evaluation team administered a baseline survey to all Cohort 1 schools; this survey
was designed to examine implementation of the program’s Design Principles. The
baseline survey data suggest that the schools are relatively well-functioning schools that
also have room to grow, particularly in the areas of college readiness expectations,
rigorous instruction, and personalization.
Although there are not yet quantitative data relative to implementation or outcomes,
we do have initial qualitative data on implementation from coaches’ reports and
interviews. According to the coaches, school staff have begun trusting the coaches
more, a critical first step in opening the staff up to assistance. Although it is early in the
project, coaches also report that some teachers are beginning to change their instruction
and that some schools are showing an increase in activities emphasizing college
readiness.
Conclusions and Recommendations
The data from the first year of NC iRIS implementation indicate that the program is
generally on-track for meeting its goals, although there were some delays and there are
areas in which it could be modified to strengthen its potential impact. The results so far
from the evaluation have led to some recommendations for the program staff to
consider as they move forward:
 The program staff has recognized the need to begin working with the schools
earlier than the end of the school year and has already started working with
Cohort 2 schools. This earlier contact will give schools more time to understand
and buy into the program and should allow the coaching services to start earlier.
During these early contacts, the NC iRIS staff should work with the schools to
ensure that they have a very clear understanding of the project and of its goal:
to help more students become college ready. For the school, this will involve
trying to create an environment that is more supportive of college, using
instructional strategies that will help prepare students to succeed in college
courses, and providing early access to college courses. To help highlight the
need for change during these conversations, it might be useful to share the
school's data on the percentage of students taking advanced courses, as
compared to the state average.
 Given the multiple expectations placed on teachers, including the Common Core,
NC iRIS staff and coaches should clearly present how the NC iRIS services mesh
with and will help in the implementation of these other initiatives.
 The professional development opportunities provided by NC New Schools are
clearly focused on the Design Principles. The emphasis in these sessions has
historically been on serving NC New Schools' partner small schools, particularly
their early colleges. Comprehensive high schools, such as those in the NC iRIS
network, have different issues and would benefit from at least some
programming more explicitly tailored to their needs. NC New Schools has made
an excellent step in this direction by planning an onsite visit to a school district
in Texas that has been implementing a similar effort for several years. NC iRIS
staff should explore additional ways of tailoring the professional development
more explicitly to the needs of comprehensive high schools.
 The instructional and leadership coaching services have been primarily focused
on the Leadership and Powerful Teaching and Learning Design Principles. Given
that Ready for College is seen as the most important Design Principle for NC
iRIS, NC iRIS staff may want to consider working with the coaches to determine
ways to emphasize the Ready for College Design Principle in their visits. This can
be driven by the school's plan developed as part of the Design Principle Rubric
Review.
North Carolina Investing in Rural Innovative Schools (NC iRIS): Year 1
Baseline External Evaluation Report
Section I: Introduction and Overview
North Carolina Investing in Rural Innovative Schools (NC iRIS) is designed to increase the
number of students who graduate from high school and are prepared for enrollment
and success in postsecondary education. The project seeks to blend high school and
college by applying strategies from the successful early college high school model
(Edmunds, Bernstein, Unlu, Glennie, Willse, et al., 2012; Edmunds, Willse, Arshavsky, &
Dallas, in press) to a total of 18 traditional high schools that are located in rural,
low-wealth districts.
NC iRIS is managed by North Carolina New Schools (NC New Schools), which also
supports the early college model. According to NC New Schools, the critical components
of NC iRIS include a set of services that are intended to support implementation of a
whole-school reform model emphasizing the creation of a college-preparatory school
environment through six Design Principles. The services provided include: 1.) a series of
professional development activities centered around implementation of the six Design
Principles; 2.) on-site leadership coaching for administrative teams on the Design
Principles; 3.) on-site instructional coaching on the Design Principles, emphasizing the
Common Instructional Framework; 4.) funding for college credit courses for students;
and 5.) assistance in developing partnerships with postsecondary institutions. As a result
of these services, each school is expected to implement six Design Principles that
represent characteristics of an effective high school. These Design Principles, as
articulated by NC New Schools, are as follows: 1.) ensuring that students are ready for
college; 2.) instilling powerful teaching and learning in schools; 3.) providing high
student/staff personalization; 4.) redefining professionalism; 5.) creating leadership
that develops a collective vision; and 6.) implementing a purposeful design in which
school structures support all of the above principles. A primary emphasis of the program
will be increasing the number of students who participate in college credit-bearing
courses while in high school.
These critical components are then designed to lead to the following specific outcomes
as articulated in the NC iRIS grant proposal:
 Over 21,442 students will be impacted over the five-year grant period;
 The 4-year cohort graduation rate will increase an average of 10 percentage
points across the 18 project schools by the end of the fifth year of the grant
program;
 Schools will increase the percentage of students successfully completing Algebra
1 by the end of ninth grade an average of 10 percentage points by the end of the
second year of implementation;
 At least 50% of students who experience four years in project schools will
successfully complete at least 21 units of college credit;
 90% of project schools and LEAs will continue implementation of the New
Schools' Design Principles and early college high school strategies with active
participation in the New Schools Network after the completion of three years of
IS4 (Integrated System of Supports) services;
 North Carolina will enact legislation and policy changes to expand access to
college courses for high school students.
Figure 1 on the next page is a pictorial representation of the program’s core
components and the expected changes in school- and student-level outcomes. This
logic model guides the evaluation design, which is a mixed method evaluation
examining the implementation and impact of the model.
This report includes information for the first 13 months of the project—January 1, 2012
through January 31, 2013. Section II of the report provides an overview of the
methodology used by the evaluation. Section III summarizes the activities undertaken
during the first year of the project, participation in those activities, and initial
perceptions about the quality and utility of the activities. Section IV provides baseline
data for the key outcomes that will be examined as part of the evaluation. Finally,
Section V summarizes lessons learned from the first year and provides conclusions and
recommendations to assist in future implementation.
Figure 1: NC iRIS Logic Model
Section II: Evaluation Design
The NC iRIS evaluation uses mixed methods to examine the impact and implementation
of NC iRIS. This section provides a brief overview of the evaluation design. A more
detailed evaluation plan can be found in Appendix B.
Sample
The schools in the study are located in rural, low-wealth counties throughout North
Carolina. The school’s entire student population will be participating in the portion of
the intervention that focuses on the six Design Principles. A subset of students,
approximately half of the school’s enrollment, will be targeted to participate in college
credit classes while in high school. The target population for the college credit courses
includes students:
 Who would be the first in their family to complete postsecondary education;
 Who are at risk of dropping out of high school; and
 Who are members of groups underrepresented in college, including low-income
and racial and ethnic minority students.
The treatment group sample will include a total of 18 comprehensive high schools that
will receive three years of services. A subset will receive services in Years 1-3; another
subset in Years 2-4; and the final subset in Years 3-5. Each school will be matched to up
to three comparison schools, bringing the total sample to 72 high schools. The
evaluation will also examine the impact of the model on schools' implementation of the
Design Principles.
Impact Study
The primary impact study uses a quasi-experimental design to assess the impact of the
NC iRIS Project on a core set of student outcomes.
Measures: The study will examine two core student outcomes as the primary outcomes
of the study: 1.) the taking and successful completion of college credit-bearing courses
(dual credit and AP) and 2.) graduation from high school. Additional student outcomes
to be examined include attendance, dropout and continued enrollment rates, and
enrollment and success in college preparatory courses. These data will come from data
collected by the North Carolina Department of Public Instruction and housed at the
North Carolina Education Research Data Center.
Analysis: The evaluation will examine the impact of the model on core student
outcomes using HLM (hierarchical linear modeling) analyses (Raudenbush & Bryk, 2002)
with students clustered in schools. The model will include appropriate student- and
school-level characteristics as covariates (see Appendix B).
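As an illustrative sketch only (the actual covariate sets and model specifications are given in Appendix B), a two-level model for a continuous outcome, with student i nested in school j, might take the following general form:

```latex
% Level 1 (students): Y_ij is the outcome for student i in school j,
% X_ij a vector of student-level covariates
Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + r_{ij}

% Level 2 (schools): Treatment_j indicates NC iRIS participation,
% W_j a vector of school-level covariates
\beta_{0j} = \gamma_{00} + \gamma_{01}\,\mathrm{Treatment}_{j} + \gamma_{02} W_{j} + u_{0j}
```

Here the coefficient of interest is \(\gamma_{01}\), the estimated difference in outcomes between NC iRIS and matched comparison schools after adjusting for covariates; binary outcomes such as graduation would use an analogous hierarchical logistic model.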
Evaluation of Implementation
The implementation evaluation will focus on two aspects of implementation: 1.) the
delivery of and participation in NC iRIS program services (what has been conceptualized
as “structural implementation” (Century, Rudnick, & Freeman, 2010)) and 2.) the
implementation of the Design Principles at the school level (this is similar to what has
been conceptualized as “instructional implementation” and “represent[s] the actions,
behaviors, and interactions that the user is expected to engage in when enacting the
intervention” (Century, et al., 2010, p. 205)).
Measures: The evaluation will examine the extent to which NC New Schools delivers the
services it promised to deliver and the extent to which schools participate in those
services at the level that was intended. In addition, the evaluation will examine the
quality and utility of those services. Data on service delivery and participation will be
collected from project records, including coaches’ reports and professional development
sign-in sheets, supplemented by interviews with staff and by data from site visits
conducted by the evaluators. Data on the quality of the services will be collected
through observations of the training conducted by the external evaluation team and through
feedback surveys.
The evaluation will also examine the extent to which the schools are implementing the
Design Principles using two primary instruments:
1.) An original survey that reports on the Design Principles. The evaluation team
used the Staff Implementation Survey from the Study of the Efficacy of Early
College High Schools as a base for the original survey. There are scales for all
relevant aspects of the Design Principles. Teaching and administrative staff in all
treatment schools will be asked to complete the survey as a baseline prior to
beginning the treatment. They will then complete the surveys annually. The
comparison schools will be administered the survey at baseline and at the same
time point as Year 2 of implementation for the treatment school with which they
are matched.
2.) The state-administered Teacher Working Conditions survey. The Teacher
Working Conditions survey (New Teacher Center, 2010) is administered
biennially to staff in all schools in the state and includes scales on teacher
empowerment and leadership.
We also collect data on implementation through reviewing coaches’ reports, interviews
with project and school staff, and site visits to selected schools.
Analysis: The evaluation will determine the level of fidelity of implementation of the NC
New Schools program services (the first column in the logic model—Figure 1). In
consultation with NC New Schools, the evaluation team has established formal
benchmarks for implementation. Relative to the coaching and professional development
services, fidelity of implementation will be assessed on whether 100% of professional
development services and 90% of coaching services are delivered as planned to schools,
and whether participants participated in professional development services at a rate of
at least 80%. Table 1 shows the expected level of services as would be recorded for a
specific school (in this case, M).
Table 1: Fidelity of Implementation of Services Provided to and Received by School M

| Services | Target Level | Adequate Adherence by NC New Schools | Adequate Dosage Received by School |
|---|---|---|---|
| Leadership Coaching | 20 days annually | 18 days provided | 18 days participated |
| Instructional Coaching | 81 days annually | 73 days provided | 73 days participated |
| Professional Development Services | Total of 22 days annually for different school staff | 22 days provided | 18 days participated |
Two other program services are considered as dichotomous measures of
implementation: the creation of an institution of higher education (IHE) partnership that
allows students to take college courses and the provision of a day of professional
development for district staff. The evaluation will note whether each is in place for each
school. Across all schools, the benchmark is 100% creation of IHE partnerships (because
the program will not work without them) and an 80% participation rate among district
staff in district professional development.
The final program service is provision of funds for college courses. As a measure of
fidelity of implementation, this will be treated as dichotomous (did the program provide
funds for students to take college credit courses or not?). Because the actual number of
courses that NC New Schools funds is dependent on the number of students who enroll
in college courses, number of courses supported will be considered as a student
outcome.
A total score for fidelity of implementation will be calculated by combining the level of
participation in all of the required activities (see Appendix B for more detail).
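The general shape of such a combined score can be sketched as follows. This is an illustrative calculation only, not the evaluation team's actual scoring procedure (which is detailed in Appendix B); the service names, day counts, and thresholds are assumptions modeled on the benchmarks described in this section:

```python
# Illustrative fidelity-of-implementation scoring, assuming the benchmarks
# described above: 100% delivery for professional development, 90% for
# coaching, and 80% participation for professional development. Each
# benchmark is marked met/not met, and the total score is the share met.

def benchmark_met(actual_days, target_days, threshold):
    """Return True if actual service days reach the threshold share of target."""
    return actual_days >= threshold * target_days

# Hypothetical data for one school, modeled on Table 1's structure:
# (name, target days, days delivered, delivery threshold,
#  days participated, participation threshold)
services = [
    ("Leadership Coaching", 20, 18, 0.90, 18, 0.90),
    ("Instructional Coaching", 81, 73, 0.90, 73, 0.90),
    ("Professional Development", 22, 22, 1.00, 18, 0.80),
]

results = {}
for name, target, delivered, d_thr, participated, p_thr in services:
    results[name] = {
        "adequate_adherence": benchmark_met(delivered, target, d_thr),
        "adequate_dosage": benchmark_met(participated, target, p_thr),
    }

checks = [met for r in results.values() for met in r.values()]
total_score = sum(checks) / len(checks)  # share of benchmarks met
```

With the hypothetical values above, every benchmark is met, so the school's total score is 1.0; a school missing coaching visits would see its score reduced proportionally.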
The services described above are designed to prepare the schools to implement the
Design Principles. School-level implementation of the Design Principles can be
considered both as an implementation measure and an immediate outcome. Given that
these Design Principles are characteristics of a good school that could be found in both
intervention and non-intervention schools, it is critical to understand implementation in
both situations. We do not have formal benchmarks for each of these. Instead, the
expectation is that treatment schools improve on these dimensions as compared to
baseline and as compared to the comparison schools. These data will be collected
primarily through the surveys administered to the staff of treatment and comparison
schools.
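As a minimal sketch of this comparison logic, using hypothetical survey scale means (the actual instruments are the Design Principle survey scales described above), the change in a treatment school can be contrasted with the average change in its matched comparison schools:

```python
# Illustrative comparison of survey-scale growth for one treatment school
# against its matched comparison schools. All scale values are hypothetical
# placeholders on an assumed 1-5 survey scale.

def gain(baseline, followup):
    """Change in a school's scale mean from baseline to follow-up."""
    return followup - baseline

treatment = {"baseline": 3.2, "year2": 3.8}
comparisons = [
    {"baseline": 3.3, "year2": 3.4},
    {"baseline": 3.1, "year2": 3.2},
]

treatment_gain = gain(treatment["baseline"], treatment["year2"])
comparison_gain = sum(
    gain(c["baseline"], c["year2"]) for c in comparisons
) / len(comparisons)

# A positive difference suggests the treatment school improved more on this
# scale than its matched comparisons did over the same period.
relative_gain = treatment_gain - comparison_gain
```

The design described above looks for exactly this pattern: treatment schools gaining relative to both their own baseline and their matched comparison schools.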
Section III: Program Implementation
In this section, we describe the implementation of the NC iRIS project activities during
year 1 of the grant, from January 1, 2012 through January 31, 2013. The description of
activities is organized by the categories represented in the first column of the program’s
logic model (Figure 1) with the addition of the action of school recruitment. In Year 1,
NC New Schools completed the following activities:
 Recruited a total of 18 schools to participate in the project;
 Provided an integrated system of supports, including 18 days of professional
development, an average of 4 days of leadership coaching, and an average of 24
days of instructional coaching to the first cohort of schools;
 Supported the development of postsecondary partnerships and began providing
access to college courses;
 Undertook activities to influence the state and district context, including district
professional development and community engagement activities.
School Recruitment
As part of their i3 proposal, NC New Schools had recruited a total of 18 schools in 10
rural and low-wealth districts. After the grant was awarded, six schools in two counties
had to drop out of the project because they were already being served by NC New
Schools under North Carolina’s statewide Race to the Top grant. In the spring of 2012,
the superintendent in another district decided not to continue with NC iRIS, which
meant that an additional three schools were no longer participating. Finally, one
additional school left the project in the early fall of 2012 because the original principal
had left and the new principal did not feel that he could take on any additional work.
This left only 8 of the original 18 schools from the application.
As a result, throughout the first half of the grant, NC New Schools staff had to recruit 10
schools to replace those that either could not participate (because of a conflict in
funding) or chose not to participate. They sought out schools in rural, low-wealth
counties that had successful early colleges in the district and that exhibited
interest in the program. By November 2012, the staff had successfully replaced the
schools that had left, resulting in a total of 18 schools that agreed to participate in the
project.
The original intent had been to serve the 18 schools for three years in three equal
cohorts of six schools each. Thus, Cohort 1 would have consisted of six schools that
received services in Years 1-3 of the project, Cohort 2 would have been six schools that
received services in Years 2-4, and the remaining six schools in Cohort 3 would have
received services in Years 3-5. Because NC New Schools had to recruit
replacement schools, the distribution among the school cohorts changed. The current
plan is to serve five schools starting in Year 1, an additional eight schools starting in Year 2, and the
final five schools starting in Year 3. This will allow the program to serve all 18 schools by
the end of the project. Table 2 shows the scheduled implementation by cohort.
Table 2: Number of Schools Participating by Project Year

Cohort     Year 1   Year 2   Year 3   Year 4   Year 5
One           5        5        5
Two                    8        8        8
Three                           5        5        5
Integrated System of Supports
The NC iRIS project includes what NC New Schools calls an "Integrated System of
Supports" (IS4) to assist the schools in their implementation of the NC iRIS model. These
supports include professional development organized by NC New Schools, onsite
leadership coaching, and onsite instructional coaching and support. Each activity is
discussed separately below.
Professional Development. NC New Schools supports a large network of schools,
including early colleges, STEM-focused high schools, redesigned high schools, and NC
iRIS schools. Throughout the year, NC New Schools offers a series of professional
development sessions designed to build understanding of the Design Principles and the
Common Instructional Framework. NC iRIS schools could avail themselves of any of
these offerings, although a specific set of experiences was considered important for the
schools to attend. During the 2012-2013 school year, each school is expected to have
one or more representatives attend 22 days of professional development, with full
implementation considered to be 18 days.
NC New Schools began working with the NC iRIS schools in June 2012. The first professional
development event was their annual Summer Institute, a four-day conference that includes
participation from all schools in the New Schools’ network. For the NC iRIS schools, the
sessions included an orientation introducing the schools to the project and some
structured team planning time. There was also the opportunity for all participants to
choose from a variety of concurrent sessions that centered on topics related to the
Design Principles and the Common Instructional Framework. Staff from four of the five
Cohort 1 schools attended the Summer Institute.
In July 2012, New Schools offered LEAD, a three-day principal professional development
session focusing on leadership skills. Principals from two of the schools attended LEAD.
In September 2012, there was a two-day New Principal Institute and a two-day New
Teacher Institute. Offered to new principals and teachers in all schools in the NC New
Schools’ network, these sessions are designed to introduce participants to the Design
Principles and the Common Instructional Framework. The principals also visited a school
and observed how instructional rounds—a framework for conducting peer
observations—were conducted. All five schools sent representatives to the New
Principal Institute, but only two of the schools sent representatives to the New Teacher
Institute.
Also in September, there were regional meetings of the Leadership Innovation Network,
during which principals learned about leadership strategies and had an opportunity to
share with each other. Staff from three schools participated in this.
In October 2012, staff had the opportunity to make study visits to selected
schools. The purpose of these study visits was to participate in a focused instructional
rounds model and to see the Common Instructional Framework in action. Staff from all
five schools participated in this activity. A member of the evaluation team observed
this training and found it to be well designed overall, rating it as "Accomplished, Effective
Professional Development."
Two-day regional sessions entitled “Common Practices Symposium” were held in
October and November of 2012. These sessions were available to all members of the
New Schools network and included breakout sessions on a variety of topics, such as
“The Silver Bullets to Achieve Rigor and College Readiness” and “Shared Leadership and
Collaboration.” Three of the five schools sent representatives to these professional
development opportunities.
Also in October and November of 2012, NC New Schools offered regional one-day
sessions for counselors and early college liaisons. These sessions included information
on applying to college and financial aid. A series of follow-up webinars was also offered.
Only one of the schools attended the face-to-face sessions, but three additional schools
attended the webinars.
It should be noted that, with the exception of NC iRIS-specific discussions at Summer
Institute, all of the professional development was offered to all of the schools in the
New Schools network. At the end of April, there will be a NC iRIS-specific professional
development, where the participating schools will visit a district in Texas that is
implementing similar approaches in traditional high schools.
Each school is offered at least 22 days of professional development from June 2012
through May 2013; attending 18 days is considered full implementation of the program.
Because of reporting deadlines, this report includes data through the end of January
2013, representing 7/12 of the year (including the summer). As a result, for
purposes of this report, fidelity of implementation for professional development is
considered met if NC New Schools offered at least 13 days of professional development
and if schools participated in at least 10 of those days. Table 3 reports whether those
levels have been reached by NC New Schools and by the schools themselves
(pseudonyms are used for the schools).
Table 3: Fidelity of Implementation—Professional Development

Organization (a)      Number of Days Expected   Number of Actual Days   FOI Rating
NC New Schools                 13                       18                100%
Marks                          10                       12                100%
Jefferson                      10                       15.5              100%
Lincoln                        10                        8                 80%
Grant                          10                       15.5              100%
Roosevelt                      10                        4                 40%
Overall Average (b)            10                       12                 87%

(a) With the exception of NC New Schools, all names are pseudonyms.
(b) The school average is calculated by taking the mean of each school's FOI rating, not the average number
of days provided.
Table 3 shows that NC New Schools has delivered more than the number of days of
professional development that they would have been expected to deliver by this point
during the school year. Three out of the five schools have met full implementation, with
the remaining two schools at 80% and 40% respectively. The school with very low
participation is also facing challenges with principal buy-in, according to the coaches.
Instructional Coaching. Each school in the NC New Schools network receives services
from experienced educators who have the knowledge, experience, and skills to work with
school staff and who understand and are committed to NC New Schools' mission,
vision, and support system. The instructional coaches emphasize both implementing the
Design Principles and working with the teaching staff, individually and collectively, to
improve their skills in using the Common Instructional Framework (CIF). The number of
days a coach spends in a given school is driven by the size of the school, at three
days per teacher for each year of the grant. Table 4 shows the number of days expected to
be provided during the entire year, the number of days that would be considered full
implementation by the end of January (90% of half of the target number of days), and
the number of days actually provided.
Table 4: Fidelity of Implementation—Instructional Coaching

Organization          Target # of Days   Implementation Target   Number of     FOI Rating
                      Annually           (by end of 1/13)        Actual Days
Marks                       87                  39                   7            18%
Jefferson                   81                  36                  33            92%
Lincoln                     54                  24                  19            79%
Grant                      105                  47                  45.5          97%
Roosevelt                   57                  26                  17.5          67%
Overall Average (a)         77                  34                  24            71%

(a) The school average is calculated by taking the mean of each school's FOI rating, not the average number
of days provided.
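The pro-rated targets and ratings in Table 4 follow directly from the stated rule (full implementation by the end of January is 90% of half the annual target). The following is a sketch, with the rounding conventions assumed: targets rounded to the nearest whole day and ratings to the nearest whole percent.

```python
# Sketch reproducing the Table 4 fidelity-of-implementation figures.
# Assumption: the mid-year target is 90% of half the annual target, rounded
# to the nearest whole day, and the FOI rating is actual/target as a percent.

def implementation_target(annual_days):
    """Expected days by the end of January: 90% of half the annual target."""
    return round(annual_days * 0.5 * 0.9)

def foi_rating(actual_days, target_days):
    """Fidelity rating as a whole-number percent."""
    return round(100 * actual_days / target_days)

# Jefferson's instructional coaching: 81 annual days, 33 delivered by 1/13.
target = implementation_target(81)  # 36 days
rating = foi_rating(33, target)     # 92 percent
```

Under these assumptions the sketch reproduces each school's Table 4 figures; the leadership coaching targets in Table 6 appear to use a slightly different rounding convention for one school.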
Table 4 shows that instructional coaching has been provided at a level that is, overall,
approximately 71% of full implementation as of the end of January 2013. Two of the
schools are at or close to full implementation, with an additional school at
close to 80%. One school has received only two-thirds of the visits it is expected to receive, while
another school has received less than one-fifth of the visits it is supposed to receive. The
school with 67% of instructional coaching visits is also the school with little principal
buy-in. Interestingly, the school with 18% of instructional coaching visits does not suffer
from a lack of buy-in. The principal of that school was chosen as Principal of the Year and
was out of the school for much of the fall, making it hard to schedule both instructional
and leadership coaching visits. NC iRIS staff are developing strategies for providing
that school with the number of visits it was promised; one possibility
is to provide coaching activities around planning over the summer.
The instructional coaches generally began their work in each school by meeting with the
principal and determining the focus of their efforts—which aspect of the Common
Instructional Framework they should work on with the staff. They then provided
professional development sessions on the chosen topic to either the whole staff or to
small groups, following up with one-on-one work or with grade-level professional
learning teams.
Analysis of the instructional coaches’ reports showed that instructional coaches
provided a variety of activities. The majority of reports referenced the provision of
schoolwide professional development, meeting with teachers, and classroom
observations. Table 5 shows the activities completed by the coaches with a sample
description of each activity.
Table 5: Activities Completed by Instructional Coaches

Meeting with teachers (70% of reports). Sample description: "I asked [teacher] about the
previous day's session on LG roles and the opportunity for their use with group work in
her Biology class. She discussed the use of stations in her classroom. I suggested that
roles could potentially play a part in the work with stations. I shared some suggestions
and offered to collect some websites on genetics and evolution that could be used in
that type of lesson."

Schoolwide/Formal Professional Development (59% of reports). Sample description:
"Presented Writing to Learn PD during planning periods. This PD predominantly focuses
on low-stakes writing opportunities and how those can be incorporated more fluidly
into each classroom. During most of the presentations, discussion was lively and
revolved around students' writing skills and how to strike the balance between having
students write to learn and having students write using perfect grammar and
conventions."

Classroom observations (50% of reports). Sample description: "Visited in [teacher's]
English I class. Students were writing an initial draft of the introduction and first 2
paragraphs of an argumentative essay: Is it better to live in the big city or small town?
… I am continuing a discussion with [the teacher] on trying to connect the reading with
the writing, vocabulary and grammar being learned in the classroom. After some
discussion, [the teacher] is willing to co-plan and co-teach a lesson using literacy
groups/collaborative group work after our next professional development. I hope to
make some progress by connecting these in our co-planning."

Lesson Planning (35% of reports). Sample description: "A seasoned English teacher that
had not signed up for me to work with her asked me during the day to work with her
during her planning period. We co-planned ways to utilize low stakes writing more in
her classroom. She had several protocols that she uses quite often with her students,
but was requesting new ways to obtain more student engagement in the process. We
planned three options for her to use. I was able to observe one she selected to use right
away with her students. We had time to follow up this observation and have a post
conference."

Meeting with Principal (34% of reports). Sample description: "[The] principal and I met
about the staff development for the spring semester and he approved the dates set up
by [other coach]. He was very supportive of the work of both coaches. [The] assistant
principal discussed with us the use of instructional rounds in the freshman academy.
She also invited us to the freshman academy meetings and gave us the dates of these
meetings."

Modeling instruction (20% of reports). Sample description: "Math [teacher] and I
planned a lesson which included collaborative group work. I modeled the use of CGW
using Tower building and then the students were kept in the groups and the teacher
began a new lesson. We both circulated among the students. I saw that an EC student
was actively engaged with the other students. Prior to this the student did not interact
with the rest of the class. He had a better understanding of the concept because he was
able to discuss the concept with the members of his group."
The instructional coaches also reported working closely with the leadership coaches,
providing feedback so that the leadership coach could reinforce their work in meeting
with the principals.
As noted in the descriptions in Table 5, the content focus of the instructional coaches’
visits was primarily centered on the Powerful Teaching and Learning Design Principle,
with 87% of their reports referencing this principle, and on Leadership, with 89% of the
reports referencing this Design Principle. Seventy-six percent of the reports included
information on the Professionalism Design Principle. The remaining Design Principles
were referenced in less than 40% of the reports: both Ready for College and
Personalization were referenced in only 37% of reports and Purposeful Design was
referenced in only 33% of the reports.
Within the Teaching and Learning Design Principle, the instructional coaches supported
the use of the Common Instructional Framework. Collaborative Group Work was the
instructional practice upon which the coaches most commonly focused (mentioned in
54% of reports). In interviews, the coaches said that group work was often chosen
because it was one of the most accessible practices. One of the coaches said that they
started with Collaborative Group Work because "it's the fastest way to a student-centered classroom, in the principal's mind." The second most commonly emphasized
practice was Questioning (mentioned in 39% of the reports). The remaining CIF
strategies were mentioned as follows: Classroom Talk (20%); Writing to Learn (17%);
Scaffolding (11%); and Literature Circles (9%).
Leadership Coaching. Principals at NC New Schools' schools are supported by a
leadership coach who conducts regular on-site visits. Leadership coaches are
experienced school leaders who have worked in NC New Schools' schools or other
schools with a focus on innovation. NC New Schools identifies the major responsibilities
of these coaches as follows:
1. Establishing trusting relationships with the principal and school staff;
2. Building understanding of the NC New Schools Design Principles and best practices;
3. Identifying specific needs for support and assistance related to successfully
implementing the model;
4. Identifying potential obstacles to success, while helping develop strategies to
eliminate them and ensure support for initiatives within the scope of NC New Schools'
expectations; and
5. Guiding and focusing school leaders on innovation, reflective practice, and the
strategic planning process to ensure that all students in the school will graduate
prepared for college and work.
Each school receives approximately 15-25 days of leadership coaching a year, depending
on the size of the school.
Table 6 shows the number of days expected to be provided during the entire year, the
number of days that would be considered full implementation by the end of January
(90% of half of the expected number of days), and the number of days actually provided.
Table 6: Fidelity of Implementation—Leadership Coaching

Organization          Target # of Days   Implementation Target   Number of     FOI Rating
                      Annually           (by end of 1/13)        Actual Days
Marks                       20                   9                   4            44%
Jefferson                   20                   9                   3            33%
Lincoln                     15                   7                   4            57%
Grant                       25                  12                   6            50%
Roosevelt                   15                   7                   3            43%
Overall (a)                 19                   9                   4            45%

(a) The school average is calculated by taking the mean of each school's FOI rating, not the average number
of days provided.
Leadership coaching has been provided at a level that was overall less than 50% of the
levels that would represent full implementation; only one school has received more
than 50% of the expected days.
The leadership coach we interviewed indicated that the support she provides varies
according to what the principal needs. Over the summer, the principals completed a
leadership descriptor form that was used to guide some of the leadership supports. For
example, in one school she is working on helping the principal learn to delegate more
effectively, while in another school she is working on providing feedback, and in a third
she is helping the principal do more effective teacher observations. The leadership
coaches also work with the principal on programmatic aspects of NC iRIS such as
scheduling visits or preparing for the Design Principle Rubric Review Process.
The leadership coaches provided summary reports for each visit. An analysis of the
reports found that, as expected, all of their visits involved meeting with the principal. In
one of the reports, the coach also reported meeting with teachers and doing schoolwide professional development.
In terms of the content focus of the visits, reports included references to the following
Design Principles:
 62% referenced the College Ready Design Principle;
 75% referenced Powerful Teaching and Learning;
 50% referenced Personalization;
 88% referenced Professionalism;
 100% referenced Leadership; and
 38% referenced Purposeful Design.
Postsecondary Partnerships and College Course Funding
A core part of NC iRIS is the development of college partnerships, which are designed to
provide students access to college courses among other college readiness activities. The
provision of college courses and the establishment of college partnerships have faced a
challenge in a North Carolina law that took effect in January of 2012, entitled Career and
College Promise.
Career and College Promise provides for three pathways of community college courses
for high school students. 1.) The College Transfer Pathway gives high school juniors and
seniors access to up to 44 credits that can transfer to a four-year college or university.
Students in this pathway must have a GPA of at least 3.0 and must meet certain testing
criteria. 2.) The Career and Technical Education Pathway gives high school juniors and
seniors access to courses in Career and Technical Education clusters that can lead to
certification. Students must have a GPA of 3.0 or the principal's recommendation, and
they must have taken course prerequisites. 3.) The third pathway allows students in
early college high schools to take courses starting in 9th grade. This pathway does not
apply to students in NC iRIS because they attend traditional high schools.
NC iRIS faced two challenges because of this legislation. The first was that it prohibited
students below the 11th grade from taking college courses. Through their advocacy
work, NC New Schools was able to obtain a legislative waiver for NC iRIS schools,
allowing students in 10th grade to take one community college course. The second
challenge was that the target population of NC iRIS schools includes students who are
underrepresented in college and who might not meet the eligibility criteria for the
College Transfer Pathway. As a result, NC New Schools staff are seeking out alternative
providers of online college courses and have found two options that they will make
available to schools.
In addition to their efforts to change policy at the state level, NC New Schools staff have
been working with local community colleges and districts to develop their partnerships.
Each district and community college is in the process of developing a Memorandum of
Understanding (MOU) that will delineate policies such as the courses offered to students
and tuition and textbook reimbursement. For the four districts with Cohort 1 schools,
one MOU is complete; the other three are still in development.
The most significant aspect of the partnerships is the offering of college courses for
students. The goal of the project is to have students at the following levels taking
college credit-bearing courses: 15% of the entire student population averaging 1 course
in Year One, 30% of the population averaging 3 courses in Year Two, and 50% of the
population averaging 3 courses in Year Three. Table 7 shows the target number of
students by school.
Table 7: Number of Students Expected to Take College Courses

Organization     School Enrollment   Target # of Students (15%)
Marks                  738                    111
Jefferson              630                     94
Lincoln                431                     65
Grant                  854                    128
Roosevelt              425                     64
Overall Mean           616                     92
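The Year One targets in Table 7 are 15% of each school's enrollment. As a small sketch (assuming targets are rounded to the nearest whole student):

```python
# Sketch of the Table 7 Year One targets: 15% of each school's student body
# is expected to enroll in at least one college credit-bearing course.
# Assumption: targets are rounded to the nearest whole student.

def target_students(enrollment, rate=0.15):
    return round(enrollment * rate)

for school, enrollment in [("Marks", 738), ("Lincoln", 431), ("Grant", 854)]:
    print(school, target_students(enrollment))  # Marks 111, Lincoln 65, Grant 128
```

The Year Two and Year Three goals raise the rate to 30% and 50% of the population, respectively, which the same function expresses by changing the `rate` argument.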
College credit courses began being offered in the spring of 2013, in part because of the
negotiations concerning Career and College Promise and in part because some of the
Cohort 1 schools did not begin receiving NC iRIS services until the summer, by which
point fall student schedules had already been completed.
Activities to Influence District and State Context
District-based Professional Development. In NC iRIS, NC New Schools introduced a new
process entitled the Design Principle Rubric Review. As part of this process, NC New
Schools staff spent time in each NC iRIS school observing, examining data and talking
with staff. NC New Schools staff then met with the superintendent, any relevant
district leaders, and a team from each school to discuss how the school was doing
relative to the standards articulated in the Design Principle Rubric (see Appendix C). By
the end of January, four of the five reviews had been completed.
NC New Schools staff describe this meeting as a non-threatening process during which
the school staff explore the different data collected and determine for themselves the
priority areas on which they need to focus. They believe that it allows the district and
school staff to develop a common understanding of what NC iRIS is trying to accomplish.
NC New Schools staff have found this process so valuable that they are planning to
replicate it across their other programs.
Because this process was a pilot and also seen as potentially sensitive, the evaluation
team did not conduct any observations of the review process. We plan, however, to do
so in Year 2 in districts that agree to our presence.
Community Development. In October 2012, NC New Schools hired a community
development coordinator for NC iRIS. He articulates the main goal of his work as
“building a sense of urgency in community members around the need for students to be
ready for college and career.” He does this by educating local elected officials,
community members, business communities, and local stakeholders.
By the end of October, this coordinator had begun holding meetings with individual
stakeholders in the Cohort 1 districts. These stakeholders were identified by the
district’s superintendent as community organizations, individuals, or businesses that
have been active in working with the schools. The coordinator also met with county
commissioners and school board members. Finally, he identified additional individuals
through research, reading local newspapers and contacting people who were frequently
mentioned as being involved with the schools.
The coordinator held community meetings in two counties, one with the local Rotary
Club and one with a “Workforce Partners Group,” a group of businesses partnering with
schools. During these meetings, the coordinator explained the rationale behind NC iRIS,
the goals of the project, and the changes that the community could expect to see in the
school.
The coordinator hopes to be able to identify core organizations in each county that can
adopt NC iRIS as an initiative. For example, the Chamber of Commerce in one of the
counties has been supportive of schools but has not had a specific focus for its efforts.
According to the coordinator, the Chamber is considering putting its support behind
NC iRIS.
The coordinator also has been attempting to build community awareness by getting
stories published in local newspapers. As of the end of January, stories had been
published in papers in three of the four counties in which there are Cohort 1 schools.
Moving forward, the evaluation will consider ways to measure the impact of the
community development work. Following are some possible measures of success to
consider:
 Degree to which local stakeholders support the program and are actively
engaged in making sure that it succeeds. Specific indicators could include
involvement of different parties in aspects of NC iRIS, or the extent to which
local agencies commit to providing support.
 Business involvement in the school. This could be examined by determining
whether there was an increase in supports provided by businesses to the
schools, including funding, tutoring support, opportunities for teacher
externships, internships, or field trips, or times in which businesses and teachers
work together on activities for students.
 Extent to which there is local commitment to continue funding the project. The
ultimate goal of the community involvement work is sustainability of NC iRIS. As
a result, we can examine the extent to which organizations within a county or
the county itself have committed funds to continue supporting the project.
Section IV: Results
This section summarizes the background characteristics of the Cohort 1 schools and
provides baseline data on the core outcomes that will be used to assess the impact of
NC iRIS.
Overview of Findings
 NC iRIS Cohort 1 schools are located in the targeted rural, low-wealth counties.
Over half of their students receive free and reduced-price lunch.
 NC iRIS Cohort 1 schools are starting with outcomes that suggest a relatively low
emphasis on college. In particular, before beginning NC iRIS, the schools offered
approximately half as many college credit-bearing courses as the state average.
 Baseline survey results suggest that the schools are relatively well-functioning
schools that also have room to grow, particularly in the areas of college readiness
expectations, rigorous instruction, and personalization.
 Students began enrolling in college courses in January 2013.
 According to the coaches, school staff have begun trusting the coaches more.
Coaches also report that some teachers are beginning to change their instruction and
that some schools are showing an increase in activities emphasizing college readiness.
School-level Characteristics
The targeted population for this intervention is schools that are in rural, low-wealth
counties. All of the schools met this criterion. Table 8 shows the demographic
characteristics of the individual schools that have been served in Cohort 1. The table
shows that the schools in the sample differ somewhat from the average school in North
Carolina; this is not unexpected as the schools are located in rural areas. The Cohort 1
schools are generally on the smaller end of high schools. All of the schools have over
half of their students receiving free and reduced-price lunch, higher than the state
average. On average, a quarter of the student body is a member of an
underrepresented minority but there is a wide range, with one school having only 2%
and another having 47% of its population as minority. This is less than the state average
and is driven partly by the location of three of the schools in rural Western North
Carolina, which has a lower minority population. The schools also, on average, had the
same or lower teacher turnover than the average school in the state.
Table 8: Background Characteristics of Cohort 1 Schools (all data from 2011-2012)

Treatment        Student      Percent Students   Percent Underrepresented   Percent   Teacher
School           Enrollment   in Poverty         Minority Students          ELL       Turnover
Lincoln             431          54.3%               29.5%                   3.3%      11.1%
Roosevelt           425          68.9%               46.6%                   1.8%      14.7%
Jefferson           630          54.8%                2.4%                   1.6%      14.0%
Marks               738          61.8%               24.5%                   1.0%      11.4%
Grant               854          56.9%               15.8%                   4.9%       6.6%
Mean                615          59.3%               23.8%                   2.6%      11.6%
State Average       829          47.8%               45.6%                   5.0%      14.8%
Baseline Outcome Data
The evaluation has identified a set of eight outcomes that we will examine to determine
the impact of the program. These outcomes fall into four domains: 1.) college credit-bearing course-taking; 2.) graduation; 3.) college preparatory course enrollment; and 4.)
staying in school. The outcomes in the first two domains are considered confirmatory
outcomes, those that represent the ultimate impact of the intervention. The outcomes
in the second two domains are more exploratory in nature and are designed to track
progress toward the longer-term confirmatory outcomes.
Domain: College credit-bearing course-taking.
1. Percent of students who have enrolled in at least 1 college credit-bearing course
by the end of 11th grade. A primary goal of the intervention is to increase the
number of students who have access to college credit-bearing courses. This
measure is therefore designed to look at the percentage of the student body
that is given access to these courses. For purposes of this study, we are looking
at any course that has the potential to bear college credit, including Advanced
Placement, International Baccalaureate and dual enrollment courses.
2. Average number of college credit-bearing courses students have taken and
passed by the end of 12th grade. The previous measure speaks to access; this
measure tries to capture the depth of students' experiences with college credit
through the number of courses successfully completed. NC iRIS has a goal of
having at least 50% of students successfully complete at least 21 college
credits. Students will be identified as having taken either AP, IB, or dual
enrollment courses. Passing the course will be defined as receiving a grade of C
or higher, which is the level accepted for college course transfer by UNC Chapel
Hill. To ensure comparability among courses, we will use course grades for AP
and IB courses as well, even though, for those programs, credit is awarded only to
students who pass the AP or IB exams.
Domain: Graduation.
3. Cohort graduation rate. NC iRIS has a goal of increasing the graduation rate by
10 percentage points by the end of the fifth year. For this outcome, we will use
the four-year cohort graduation rate calculated by the North Carolina
Department of Public Instruction (NCDPI).
The exploratory outcomes are listed below.
Domain: College preparatory course-taking and success.
4. College preparatory course-taking. This measure looks at the proportion of
students taking a core set of college preparatory courses at the 9th grade level.
The courses to be examined include those that would ensure that a student is
on-track for entrance into the University of North Carolina system. In 9th
grade, these courses include English I and at least one college preparatory
mathematics course (Algebra I, Geometry, Algebra II, Integrated Math I).
Because it is extremely challenging for students who are off-track for college in
9th grade to catch up (Finkelstein & Fong, 2008), we will examine the
percentage of students taking these courses as a measure of the extent to
which the school provides access to courses needed for college to a wide
range of students.
5. College preparatory course success. This measure is very closely related to the
previous measure and is the percentage of students taking and succeeding in
English I and at least one college preparatory math course in the 9th grade.
Successful completion will be defined as passing the course with a grade of C
or higher. While the previous measure speaks to access, this measure of
successful course completion captures both access and success in school and
does not penalize schools that are expanding access to more students. The
anticipated impact is at least 10 percentage points on both course-taking and
course success by the second year of the intervention.
Domain: Staying in school.
6. Attendance. Student attendance has been positively associated with progress in
school (Lee & Burkham, 2003); changes in student attendance are therefore
seen as a reliable indicator of students’ likelihood of remaining in school. The
evaluation will examine the number of days that a student is absent from
school. The intervention is expected to result in a reduction of two days of
absence.
7. Dropout. This measure examines the dropout rate for each school. Students in
the dropout file are students who either completed a form indicating that they
are dropping out of school or had the school indicate that they dropped out.
Students who are not listed in the dropout file are considered not to have
dropped out.
8. Continued enrollment in school. Because our experience with North Carolina's
data indicates that the dropout data are not always complete (Edmunds,
Bernstein, Unlu, Glennie, Smith, et al., 2012), the evaluation will also look at the
proportion of students who remain enrolled in school in each year.
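As an illustration only, the two confirmatory course-taking outcomes defined above could be computed from a student-level course file along the following lines. The field names (student_id, course_type, grade) and course-type codes are hypothetical and are not drawn from the actual NCDPI or NC iRIS data layouts; this is a sketch of the calculation logic, not the evaluation's implementation.

```python
# Hypothetical sketch of the two confirmatory outcome calculations.
# Field names and codes are invented for illustration.

CREDIT_BEARING = {"AP", "IB", "DUAL"}   # course types that can bear college credit
PASSING = {"A", "B", "C"}               # C or higher, the UNC Chapel Hill transfer threshold

def pct_with_credit_course(records, students):
    """Outcome 1: percent of students enrolled in at least one college
    credit-bearing course (by the end of 11th grade in the real analysis)."""
    enrolled = {r["student_id"] for r in records
                if r["course_type"] in CREDIT_BEARING}
    return 100.0 * len(enrolled) / len(students)

def avg_courses_passed(records, students):
    """Outcome 2: average number of credit-bearing courses taken and
    passed (grade of C or higher) per student."""
    passed = [r for r in records
              if r["course_type"] in CREDIT_BEARING and r["grade"] in PASSING]
    return len(passed) / len(students)

# Tiny illustrative file: student 1 took an AP course (passed) and a dual
# enrollment course (failed); student 2 took only a regular course.
records = [
    {"student_id": 1, "course_type": "AP",   "grade": "B"},
    {"student_id": 1, "course_type": "DUAL", "grade": "D"},
    {"student_id": 2, "course_type": "REG",  "grade": "A"},
]
students = [1, 2]
print(pct_with_credit_course(records, students))  # 50.0
print(avg_courses_passed(records, students))      # 0.5
```

Note how the two measures diverge by design: the first counts any exposure to a credit-bearing course, while the second credits only successful completions, so a school could raise access without raising the second measure.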
Cohort 1 is only in its first year of receiving services. As a result, we do not yet have data on the outcomes above, but we are able to present baseline data for some of them. We do not have baseline data for those outcomes that are calculated using student-level data; we have tried, however, to identify proxy school-level outcomes that can provide an indication of where a school stands. The specific outcomes for which we have baseline data include the following:
 For college credit-bearing course-taking, we have the percent of course
enrollments in AP/IB/dual credit courses;
 For graduation, we have baseline data for the final outcome of the four-year
cohort graduation rate;
 For college preparatory course-taking, we present the percent of students
enrolled in Algebra II.¹ Because a single grade-level cohort makes up roughly a
quarter of a four-year high school's enrollment, a rate of approximately 25%
would suggest that close to all of the students in the school are taking the math
needed for college;
 For staying in school, we have school-level attendance data; and
 We also include the EOC composite test scores, although these are not a target
outcome of the program.
Table 9 presents the baseline data for the outcomes listed above.
¹ At the school level, the data provide the percentage of students taking a specific exam in a given year but do not break this out by students' grade level. Therefore, because we cannot identify the percentage of 9th graders taking Algebra I from school-level data (and upperclassmen often take it), the percentage of students taking Algebra II is a better school-level proxy for college preparatory course-taking.
Table 9: Cohort 1 Baseline Data

Treatment        On-Time Grad   Percent Students    Percent Enrollments in   Attendance   Avg. Pass Rate
School Name      Rate Avg       Enrolled in         AP/IB/College Credit     Rate         Core Subjects
                 (2008-2012)    Algebra II (2011)   Courses (2012)                        (EOC Composite)
Lincoln          80.6%          14.5%               1.3%                     95.4%        72.5%
Roosevelt        73.3%          14.3%               1.5%                     94.8%        77.0%
Jefferson        73.0%          13.0%               4.9%                     94.4%        73.6%
Marks            68.9%          21.0%               4.6%                     94.7%        68.8%
Grant            74.2%          16.5%               1.2%                     93.3%        83.2%
Treatment
Group Mean       74.0%          15.9%               2.7%                     94.5%        75.0%
State Average    75.3%          20.0%               5.0%                     94.5%        81.4%
Table 9 shows that the schools in the treatment group are below the state average on almost all of the baseline outcome measures, with the exception of attendance. Taken as a whole, these data suggest that the schools in NC iRIS are starting with a lower emphasis on college-going than the state average. In particular, three of the treatment schools appear to have extremely limited enrollment in college credit-bearing courses. Additionally, four out of the five schools are below the state average in the percentage of students taking Algebra II. Academic performance and graduation rates are also below the state averages.
As noted in Section III, the grant calls for at least 15% of the student population to have taken at least one college course in Year 1. By the end of December 2012, no students had taken any college courses. Students did enroll in college courses in January 2013; we will report on this in the Year 2 report.
Baseline Survey Data
The staff at each treatment school completed surveys designed to measure the
implementation of core components of the model (a copy of the survey is in Appendix
D). This section presents baseline results for the treatment schools, organized by each of
the Design Principles.
Ready for College. For this Design Principle, the survey asked questions about course-taking expectations for students, college-going expectations, and activities completed to get students ready for college.
According to the survey, all of the treatment schools had fewer than 50% of their
students taking honors courses and four of the five schools reported that fewer than
75% of their students were on-track for college. The schools also generally reported
different 9th grade course-taking expectations for the below-grade level students,
primarily in the area of math and foreign languages. Only two of the schools expected
that 9th grade students would take foreign language courses. Table 10 presents the
course-taking expectations for both sets of students.
Table 10: Percent of Schools Reporting On-Grade-Level (Non-Remedial) Course-Taking for 9th Graders

Subject            Below-grade-level   On-grade-level
                   9th grader          9th grader
English            100                 100
Mathematics        40                  100
Science            100                 100
Social Sciences    80                  100
Foreign Language   0                   40
The survey also asked the schools to report on whether they offered specific courses for
college credit (including AP and dual credit). The only course offered by all schools for
potential college credit was Calculus. Table 11 presents the results for this question.
Table 11: Courses Offered by Participating Schools (Percent of Schools)

Course                                   Offered for   Offered for Dual Credit,
                                         HS Credit     College Credit, or AP
Algebra I, Geometry, Algebra II          100           0
Integrated Mathematics I, II, and III    40            0
Pre-Calculus and Trigonometry            100           40
Calculus (AB and/or BC)                  60            100
Statistics                               0             60
Advanced Functions and Modeling          100           0
Biology                                  100           20
Chemistry                                100           20
Earth/Environmental Science              100           0
Physical Science                         100           0
Physics                                  80            40
English                                  100           60
Civics and Economics                     100           0
World History                            100           20
US History                               100           80
Other Social Science                     80            20
Visual and Performing Arts               100           20
Foreign Language                         100           0
Career and Technical Education           100           40
There were also two scales measuring the school's college-going culture. The school mean on the College-going Expectations scale was 2.6 out of 4. The scale asked questions such as the extent to which the faculty expect every student to go to college or the extent to which the vision of the school is tied to preparing every student for college. A mean of 2.6 fell midway between the "disagree" and "agree" points on the scale. The survey also asked schools to indicate the level of student participation in different college-going activities. The mean on that scale was 3.8 out of 5, which placed the average percentage of students receiving different services at below 50%.
Powerful Teaching and Learning. For this Design Principle, the survey asked school staff
to report on their use of specific teaching strategies in four primary areas: use of the
Common Instructional Framework, Rigorous Instruction, Assessment, and College
Readiness Strategies. Table 12 presents the baseline results for each of these scales.
Table 12: Powerful Teaching and Learning Scales

Indicator: Common Instructional Framework (overall mean 3.7)
Sample question: How frequently have you asked students to explain their thinking?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day

Indicator: Rigorous Instruction (overall mean 3.2)
Sample question: How frequently have you asked students to research information?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day

Indicator: Assessment (overall mean 3.6)
Sample question: How frequently have you provided models or exemplars so students could see high quality work?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day

Indicator: College Readiness Skills (overall mean 3.4)
Sample question: How frequently have you taught students note-taking skills and/or note-taking strategies?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day

Indicator: Assessment (overall mean 3.6)
Sample question: How frequently have you used the following assessments? Essays.
Response scale: 1=Not at all used; 2=Seldom used; 3=Used occasionally; 4=Used often; 5=Used very often
These results suggest that, on average and at baseline, teachers were using the targeted
instructional practices somewhere between once a month and once a week.
Personalization. The survey looked at three indicators of personalization: the quality of
staff-student relationships, the type and frequency of academic support provided to
students, and the extent of communication with parents. Table 13 presents the results
from these three scales.
Table 13: Personalization Scales

Indicator: Staff-Student Relationships (overall mean 2.8)
Sample question: Every student in this school is known well by at least one staff member.
Response scale: 1=Not true at all; 2=Somewhat true; 3=Mostly true; 4=Entirely true

Indicator: Academic and Affective Support (overall mean 2.2)
Sample question: To what extent are the following services provided at your school? Advisories/Seminar
Response scale: 1=Not offered; 2=Available but not mandated; 3=Mandated only for students who need it (may be available for others) or mandated for everyone

Indicator: Communication with Parents (overall mean 2.9)
Sample question: How frequently have you provided feedback to parents regarding assignment completion?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day
Professionalism. The scales concerning professionalism include questions focused on
collaboration, professional development, teacher involvement in decision-making, and
the extent to which teachers feel responsible for students’ success. Table 14 presents
the baseline results for the scales related to Professionalism.
Table 14: Professionalism Scales

Indicator: Collaboration (overall mean 3.3)
Sample question: How frequently do you work with or communicate with other school staff on the following: Lesson or unit planning?
Response scale: 1=Never; 2=A few times this year; 3=Once or twice a month; 4=Once or twice a week; 5=Almost every day

Indicator: Responsibility for Student Success (overall mean 3.1)
Sample question: School staff act as if they are responsible for students' learning, even if the students are not in their classes.
Response scale: 1=None of the staff; 2=A few of the staff; 3=Most of the staff; 4=All of the staff

Indicator: Participation in Professional Development (overall mean 2.8)
Sample question: How much professional development have you received in the following areas in the past year? The content you teach
Response scale: 1=None; 2=A single presentation; 3=Multiple sessions; 4=Multiple sessions with on-site follow-up

Indicator: Teacher Involvement in Decision-making (overall mean 2.2)
Sample question: How involved are teachers in the decision-making process in the school?
Response scale: 1=Not involved at all; 2=Involved in mostly minor decisions; 3=Involved in minor and some major decisions; 4=Involved in most major decisions
Leadership. For the Leadership Design Principle, the survey included two scales. One scale asked questions about whether there was a common vision for the school. The mean on that scale was 3.0 out of 4, indicating that participants agreed that their school had a clear mission and vision.
The second scale examined the extent to which the leadership team exhibited targeted
leadership behaviors, such as monitoring instruction on a regular basis. The mean score
for this scale was 3.1 out of 4, indicating that the respondents on average agreed that
the leadership team was engaging in these targeted behaviors.
This suggests that, overall, these schools have a relatively strong leadership
environment upon which the NC iRIS project can build.
Purposeful Design. Purposeful Design is the Design Principle focused on the structures of the school. The survey questions asked about things such as regularly scheduled time for collaboration and the extent to which the schools have partnerships with other organizations. At baseline, 44% of respondents indicated that there were regularly scheduled times in their school for professional development and collaboration. Results also show that there is some support from local colleges and universities, although that support is limited. Table 15 shows the support received.
Table 15: Support Received from Colleges and Universities

Support Received           Percent of Respondents
Financial support          8.3
Provide internships        23.6
Mentor or tutor            25.5
Serve as guest speakers    63.1
Provide equipment          12.7
Teach classes or courses   44.6
Provide other resources    42.0
School-level Implementation
Although we only have baseline data for most of the quantitative outcomes, we do have
some initial qualitative data to suggest how the schools are doing in terms of
implementation. To provide an early sense of implementation, we have analyzed the
instructional and leadership coaches’ reports and the interviews conducted with the
instructional coaches, one leadership coach and NC New Schools staff.
The first year of implementation in the Cohort 1 schools, as described by the coaches,
reinforces the common wisdom that change takes time. When asked what changes the
schools have made, the coaches highlighted that they believe that the school staff are
becoming more trusting and open to the coaches’ help. One instructional coach said,
…allowing someone outside to come in their classrooms; I think that was a big
piece of what we had to do. To be able to be a nonthreatening set of eyes, that
we don’t report back to the principal, we don’t critique them, but we’re actually
having a professional conversation centered around teaching. I think a lot of
teachers did not have that experience prior to this and that they were so used to
being ready to be criticized that this was a new experience.
In agreement, another coach commented that the teachers and schools need time to
recognize the value in the coaching services before they can make any changes.
And I think if nothing else, there’s some teachers that, at least in my one-on-one
experience at both schools, who have been excited about the opportunity of a
reflective partner, of somebody just to reflect on their own practice with. That, I
think, is a big first step towards opening up all of those other doors to
instructional change and co-teaching and co-planning, that just demonstrating
that little bit of value in that kind of professional attention to their classroom,
which many teachers don’t get even in their outside evaluations or in their
principal observations. So, demonstrating the value of just reflection, I think for
a lot of the teachers, is a good first step towards moving to instructional
change…
In addition to seeing an increasing openness to the process, coaches did indicate some
small, noticeable changes in the schools. For example, one coach commented that she
has seen more classrooms with the students seated in groups; “…even doing that is a
signal that they’re starting to adopt and recognize the changes…”
Other coaches noted that they have seen some visible changes in the emphasis on college-going in some of the schools. For example, at Roosevelt, the teachers added signs by their doors with information about where they went to college and also put college pennants on the walls. At Grant, staff have a 30-minute block during which they have started targeting college-readiness skills such as ACT test-taking.
Although schools have only been working with the program for one semester, there is
already a sense that implementation varies by school. In those schools where the
leadership has clearer instructional expectations and is supportive of the coaching work,
the coaches report that the staff is more receptive to their services. In one school in
particular, the principal does not appear to value the professional development and
coaches’ work; as a result, many of the staff are seen as substantially resistant to
change.
NC New Schools coaches and staff saw complacency as another reason for potential
variability in implementation. In one school, the staff saw little need to change because
they believed that they were doing quite well. Another school had experienced high
growth in the past two years, which made the principal nervous about trying anything
new related to instruction.
A final insight regarding implementation: There is a general sense that there are many
expectations already placed on teachers—including the introduction of Common Core
standards—and the NC iRIS work needs to be conceptualized within the context of
those broader expectations. One coach describes it this way:
Several teachers welcomed us into their classrooms as an opportunity to view
their students at work and discuss how to change classroom instruction through
use of the CIF. There are multiple initiatives drawing on teachers’ time and
resources including implementing CRISS and Common Core curriculum
integration and beginning using Haiku websites for their classrooms. Teachers
state feeling overwhelmed by the expectation of designing new lessons and units
as well as integrating the technology and the CIF. Teachers do not currently see a
connection between the use of Common Core/Essential Standards and the CIF as
a pedagogical delivery model; they only see these initiatives as layers of
expectations with no fundamental relationship.
In the next section, we summarize the lessons learned from the first year of program
implementation and provide conclusions and recommendations for consideration in the
second year.
Section V: Lessons Learned, Conclusions and Recommendations
This section summarizes lessons learned by the NC New Schools staff as they have been
implementing the project, as well as conclusions and recommendations arising out of
the data collected for the evaluation.
Lessons Learned
Interviews with the staff suggested several lessons learned from the first year of
implementation of NC iRIS.
The first lesson regards timing: the project would have benefited from an earlier start. NC New Schools staff recognized that starting with the Cohort 1 schools in June was too late, and they began working with Cohort 2 schools in January. This allows the staff to work with the schools to develop a clearer understanding of NC iRIS before the project kicks into high gear. It also allows schools to do things that require advance planning, such as scheduling summer professional development and incorporating college courses into their fall schedules.
As a related lesson learned, several coaches spoke of the need to be very clear with the
schools about what participation in NC iRIS means. The coaches reported that some
schools had an initial understanding that the program was only about college courses
and not necessarily about changing teaching and learning.
Everyone who was interviewed agreed that one of the key lessons learned was the
importance of leadership. As one of the coaches said,
I think that [when] the leadership has the aligned vision with us, those schools
…seem to be going to be jumping ahead, and those without that strong
leadership or that strong vision of leadership seem to be a little slower to [get]
on board, not just teachers, but kind of the faculty as a whole.
In addition, the staff believed that better results were obtained when they worked with
both the superintendent and the principal together.
A final lesson learned concerned the Design Principle Rubric Review Process, which was a new process implemented for NC iRIS. The NC New Schools staff saw this as an extremely powerful learning process and are planning to implement it in their other programs.
Conclusions
The data from the first year of NC iRIS implementation indicate that the program is
generally on-track for meeting its goals, although there were some delays and there are
areas in which the program could be modified to strengthen its potential impact.
NC New Schools faced challenges early on when they had to find 10 new schools to
participate in the program. This resulted in a smaller number of schools being served in
Year 1 than they had originally planned. They were able, however, to recruit additional
schools to participate and the program is intending to serve the full 18 schools included
in the proposal. The Cohort 1 schools currently being served appear to be schools that
can benefit from the program; they are located in rural, low-wealth counties and have
data that suggest an overall lower emphasis on college readiness.
The need to recruit new schools and the hiring of project staff also resulted in the
Cohort 1 schools getting a later start than initially anticipated. Four of the five Cohort 1
schools were able to attend the Summer Institute in June 2012, but this was the first
real exposure that they had to the program. As a result, the early coaching visits in the
fall of 2012 were centered on working with the staff to develop a shared understanding of NC iRIS. As noted elsewhere, the NC iRIS staff recognized that an earlier start would be beneficial, and they began meeting with the leadership of Cohort 2 schools in January 2013, a full six months earlier than with the Cohort 1 schools.
NC New Schools is on-track for providing the number of professional development days
that it intended for the first academic year of program implementation. NC New
Schools offers many professional development opportunities to all of the schools in
their network, many of whom are small early colleges. These opportunities are
designed to build knowledge and understanding of the different Design Principles.
Although all of them focus on the core model components, to date, very few of these
opportunities have been tailored for the unique challenges of implementing early
college strategies in a comprehensive high school. NC iRIS staff have recognized this and
have designed some NC iRIS-specific opportunities for Year 2 of the grant. They are also
working with other NC New Schools staff to modify existing professional development
opportunities to be more appropriate for comprehensive high schools.
Instructional coaching is essentially on-track for three of the five Cohort 1 schools.
Leadership coaching, on the other hand, has been provided at a rate of less than 50% of
the targeted days. NC New Schools staff are exploring ways to reach their coaching goals
with all of the schools. Both the instructional and leadership coaches have been focused
on developing trust with the school staff, a slow process that does appear to be
experiencing some success. Both the instructional and leadership coaches report
focusing primarily on the Design Principles of Powerful Teaching and Learning and
Leadership. Ready for College, which is the key Design Principle for the grant, has
received less attention, according to the coaches’ reports, although it was emphasized
in the Design Principle Rubric Review process.
Because of new legislation, NC New Schools staff faced significant early challenges in
providing access to NC community college courses to some of their target population of
students. They were able to obtain a partial legislative waiver for NC iRIS schools. In
addition, they have identified other providers of online courses for their students to
take. For these reasons and because of the initial delay in working with Cohort 1
schools, no students took college courses in the first semester; however, students are
taking college courses in the second semester.
Baseline outcome data collected for the schools suggest that the schools are starting
with an overall lower emphasis on college readiness than the state as a whole. The
baseline survey data suggest that the schools are relatively well-functioning schools that
also have room to grow, particularly in the areas of college readiness expectations,
rigorous instruction, and personalization.
According to interviews and coaches’ reports, the program services provided to date
have resulted in some changes in the school. The most significant reported change is an
increased level of trust between the NC iRIS coaches and the school staff. A level of
trust is a necessary precursor to teachers and administrators making significant changes
in their schools. There are also some specific changes occurring in some teachers’
classrooms. The coaches reported seeing more concrete evidence of an emphasis on
college readiness, such as college pennants on walls.
Recommendations
The lessons learned and conclusions drawn have led to some recommendations for the
program staff to consider as they move forward:
 The program staff have recognized that they need to begin working with the
schools earlier than the end of the school year. This earlier contact with the
schools will give schools more time to understand and buy into the program and
will hopefully allow the coaching to start earlier. During these early contacts, the
NC iRIS staff should work with the schools to ensure that they have a very clear
understanding of the project, and that the goal of the project is to help more
students become college ready. For the school, this will involve trying to create
an environment that is more supportive of college, using instructional strategies
that will help prepare students to succeed in college courses, and providing early
access to college courses. During these conversations, to help highlight the need
for change, it might be useful to share the school’s data on the percentage of
students who are taking advanced courses in the school, as compared to the
state average.
 Given the multiple expectations placed on teachers, including Common Core, NC
iRIS staff and coaches should clearly present how the services provided by NC
iRIS mesh with and will help in implementation of these other initiatives.
 The professional development opportunities provided by NC New Schools are
clearly focused on the Design Principles. The emphasis in these sessions has
historically been on serving NC New Schools’ partner small schools, particularly
their early colleges. Comprehensive high schools, such as those in the NC iRIS
network, have different issues and would benefit from at least some
programming more explicitly tailored to their needs. NC New Schools has made
an excellent step in this direction by planning an onsite visit to a Texas school
district which has been implementing a similar effort for several years. NC iRIS
staff should explore additional ways of tailoring the professional development
more explicitly to the needs of comprehensive high schools.
 The instructional and leadership coaching services have been primarily focused
on the Leadership and Powerful Teaching and Learning Design Principles. Given
the fact that Ready for College is seen as the most important Design Principle for
NC iRIS, NC iRIS staff may want to consider working with the coaches to
determine ways in which they can emphasize the Ready for College Design
Principle in their visits. This can be driven by the school’s plan coming out of the
Design Principle Rubric Review process.
Next Steps
The focus of the first year of the evaluation has been on collecting baseline data for the
schools, putting data collection procedures in place, developing a detailed analysis plan,
and developing a deeper understanding of the program. In Year 2, the evaluation will
undertake the following activities:
 Submit a final analysis plan to meet the NC iRIS GPRA indicators related to
evaluation;
 Collect and analyze baseline data for Cohort 2 schools and survey data for Cohort
1 schools;
 Collect data from schools on their perception of the quality and utility of the
services provided;
 Develop student-level datasets to allow for the analysis of impacts of the
program;
 Observe professional development opportunities;
 Begin site visits to schools to explore issues associated with implementation; and
 Develop measures and begin collecting data to assess the impact of the program
on the community context.
References
American Evaluation Association. (2004, July). Guiding principles for evaluators. Retrieved August 25, 2010, from http://www.eval.org/publications/guidingprinciples.asp
Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199-218.
Edmunds, J. A., Bernstein, L., Unlu, F., Glennie, E., Smith, A., & Arshavsky, N. (2012).
Keeping students in school: Impact of the early college high school model on
students’ enrollment in school. Paper presented at the Society for Research on
Educational Effectiveness.
Edmunds, J. A., Bernstein, L., Unlu, F., Glennie, E., Willse, J., Smith, A., et al. (2012). Expanding the start of the college pipeline: Ninth grade findings from an experimental study of the impact of the Early College High School Model. Journal of Research on Educational Effectiveness, 5(2), 136-159.
Edmunds, J. A., Willse, J., Arshavsky, N., & Dallas, A. (in press). Mandated engagement:
The impact of early college high schools. Teachers College Record.
Finkelstein, N. D., & Fong, A. B. (2008). Course-taking patterns and preparation for
postsecondary education in California’s public university systems among minority
youth. (Issues & Answers Report, REL 2008–No. 035). Washington, DC: U.S.
Department of Education, Institute of Education Sciences, National Center for
Education Evaluation and Regional Assistance, Regional Educational Laboratory
West.
Lee, V. E., & Burkam, D. T. (2003). Dropping out of high school: The role of school organization and structure. American Educational Research Journal, 40(2), 353-393.
New Teacher Center. (2010). Validity and reliability of the North Carolina Teacher
Working Conditions Survey. Santa Cruz, CA: University of California at Santa Cruz,
New Teacher Center.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage Publications.
Appendix A: Performance Indicators
GPRA Measures
GPRA 1
Performance Measure: Number of students served
Measure Type: GPRA
Target: 4,444 (raw number)
Actual Performance: 3,078 (raw number)
Explanation of progress. According to the proposal, an estimated 4,444 students were expected to be served during Year 1; the actual number was 3,078. There are two reasons for the shortfall. First, after the proposal was awarded, a number of schools included in the application were unable to participate, some because they were participating in Race to the Top. The new schools that are participating have lower enrollment than the schools originally slated to receive services. Second, six schools were planned for Year 1; however, only five of those six actually participated. This will be made up in Year 2 of the grant, when eight schools will participate instead of the six originally planned. The number of students served is calculated as the full enrollment of participating NC iRIS schools. NC iRIS provided professional development to the entire school (teachers and staff) via instructional coaching. Each teacher brings the resulting knowledge, experiences, and skills to all students within the school. All teachers are trained in using the Common Instructional Framework (CIF), which includes six instructional strategies. Teachers are expected to integrate these strategies so that all students are challenged to read, write, think, and talk in every class, every day.
GPRA 4
Performance Measure: Cost per student
Measure Type: GPRA
Target: $715 (raw number)
Actual Performance: $140 (raw number)
Explanation of progress. The explanation above gives the actual number of students served in Year 1. Dividing total Year 1 expenditures of $432,238 by this number (3,078) yields the actual cost per student of approximately $140. The cost per student will increase in Year 2 as students begin taking college courses. Adjustments are also recommended to the Year 2-5 budgets to expand the support system to include an NC iRIS liaison position that helps ensure student success in high school and in college. This support is included in the Early College system of support and has been seen as having a positive impact on student success and college completion.
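The arithmetic behind the reported figure can be checked directly; this minimal Python sketch simply reproduces the division described above:

```python
# Check the reported Year 1 cost-per-student figure: total expenditures
# divided by the actual number of students served.
total_expenditure = 432_238  # Year 1 expenditures in dollars (from the report)
students_served = 3_078      # actual number of students served in Year 1

cost_per_student = total_expenditure / students_served
print(round(cost_per_student))  # prints 140, matching the reported figure
```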
Goal 1 – Improved student outcomes
Goal 1A
Performance Measure: By the end of Year 3, 18 schools will have been served.
Measure Type: Project-specific
Target: 18 (raw number)
Actual Performance: 5 (raw number)
Explanation of progress. As noted above, a total of six schools were expected to be served starting in Year 1. Because a number of schools were ineligible or unwilling to participate, NC iRIS staff needed to recruit a subset of new schools. As a result, only five schools were able to participate in the first year. However, eight schools are participating in the second year, which will bring the total served in the first two years to 13. Five schools are expected to be served in Year 3, bringing the total to the target of 18.
Goal 1B
Performance Measure: Each school will receive 15-25 days of leadership coaching annually, depending on the size of the school.
Measure Type: Project-specific
Target: 9 (raw number)
Actual Performance: 4 (raw number)
Explanation of progress. The number of coaching days expected in a year depends on the size of the school. On average, the schools were expected to have received 9 days of leadership coaching by the end of January. By that point, they had received an average of 4 days, with a low of 3 days in one school and a high of 6 days in another.
Goal 1B
Performance Measure: Each school will receive 3 days of instructional coaching annually for each staff member in the school.
Measure Type: Project-specific
Target: 34 (raw number)
Actual Performance: 24 (raw number)
Explanation of progress. The average number of instructional coaching days depends on
the number of teachers in the school. By the end of January, schools should have
received on average 34 days’ worth of coaching. The average number of days of
coaching received was 24, with a low of 7 days in one school and a high of 45.5 in
another. NC iRIS staff are planning to provide the balance of days owed to each school
during the school year or in the summer months.
Goal 1B
Performance Measure: NC New Schools will provide 22 days of professional development annually.
Measure Type: Project-specific
Target: 13 (raw number)
Actual Performance: 18 (raw number)
Explanation of progress. All NC iRIS schools are able to participate in all of the professional development offered by NC New Schools to its networks. As of the end of January 2013, school staff should have been offered a total of 13 days of professional development. NC New Schools actually provided 18 days, more than the target number. Not all schools participated in all of the events; on average, schools participated in 12 days of professional development.
Goal 1B
Performance Measure: Schools will increase the percentage of students successfully completing Algebra 1 by the end of ninth grade by an average of 10 percentage points by the end of the second year of implementation.
Measure Type: Project-specific
Target: 10 percentage points
Actual Performance: —
Explanation of progress. A goal of the grant is also to expand access to college preparatory courses. Instructional coaches will work with math teachers to implement engaging strategies that will increase the number of students experiencing success in Algebra 1. To measure this goal, the evaluation team will establish a student-level dataset that allows measurement of courses taken by year. Baseline data for this measure will be reported in the Year 2 report.
Goal 1C
Performance Measure: At the end of Year 3, 50% of students will be taking at least 3 college credit-bearing courses.
Measure Type: Project-specific
Target: 50%
Actual Performance: 0%
Explanation of progress. One of the primary goals of the project is to increase the number of students taking college credit-bearing courses. At the end of Year 3 of services, each school is expected to have 50% of its students taking at least three college credit-bearing courses. NC New Schools is partnering with ten local community colleges and two universities in the state of North Carolina to offer college credit courses on campus and online for high school students. Because of challenges faced in establishing postsecondary partnerships, college courses were not available to students in the first semester. Students have, however, begun taking college courses in the second semester. These numbers will be included in the Year 2 annual report.
Goal 1B
Performance Measure: By the end of Year 5, there will be a 10 percentage point increase in graduation rates.
Measure Type: Project-specific
Target: 10 percentage points
Actual Performance: —
Explanation of progress. By the end of the project, the graduation rate for each school
is expected to improve by 10 percentage points. The evaluation team has established
baseline levels for each school and will track progress toward this outcome. The first
time point at which progress will be measured will be upon completion of the second
year of a school’s participation in NC iRIS. Those results will be included in the Year 4
evaluation report.
Goal 2 – Build capacity for sustainable implementation
Goal 2a: Design Principle Rubric Review (DPRR)
Performance Measure: Each school district will participate in a Design Principle Rubric Review visit in Year 1 of implementation.
Measure Type: Project-specific
Target: 5 (raw number)
Actual Performance: 4 (raw number)
Explanation of progress. The purpose of this activity is to explore the quantitative data as it relates to the "Ready for College" Design Principle: creating a college-going culture in the school in which all students graduate ready for college. This process helps the team better understand where the school is on the Design Principle continuum and set appropriate goals. The team looks at aggregated data to determine how well students are doing in relation to students in the district and the state, what trends in student performance are observed, and what the data reveal about the existing college-going culture at the school. Staff from NC New Schools, district central office staff, and school staff jointly participate in this process. The process includes, but is not limited to, the following:
1. Predict and identify expectations, assumptions, and predictions related to the quantitative data to be examined.
2. Examine the individual data using guiding questions.
3. Assess progress relative to the Design Principle, using supporting data.
4. Identify trends in light of the College Ready Design Principle.
5. Reflect on the process.
Goal 3 – Create platform to support large-scale expansion
Performance Measure: Communicate lessons learned from the project to stakeholders.
Measure Type: Project-specific
Target: —
Actual Performance: —
Explanation of progress. By March 15 of each program year, North Carolina New Schools
will have submitted a report on implementation to the State Board of Education, the
State Board of Community Colleges, Office of the Governor, and the Joint Legislative
Education Oversight Committee. This report will provide a summary of annual
performance and evaluation of the project.
Appendix B: Methodology
This section includes the Analysis Plan submitted to the U.S. Department of Education in October 2012. A revised plan will be submitted in May 2013.
Intervention: NC iRIS is designed to increase the number of students who graduate from
high school and are prepared for enrollment and success in postsecondary education.
The critical components of the NC iRIS Project include a set of services that are intended
to support implementation of a whole-school reform model emphasizing the creation of
a college-preparatory school environment through six Design Principles. The services
provided include: 1.) a series of professional development activities centered around
implementation of the six Design Principles; 2.) on-site leadership coaching for
administrative teams on the Design Principles; 3.) on-site instructional coaching on the
Design Principles, emphasizing the Common Instructional Framework; 4.) funding for college credit courses for students; and 5.) assistance in developing partnerships with
postsecondary institutions. As a result of these services, each school is expected to
implement six Design Principles that represent characteristics of an effective high
school. These Design Principles, as articulated by the NCNSP, are as follows: 1.) Ensuring
that students are ready for college; 2.) Instilling powerful teaching and learning in
schools; 3.) Providing high student/staff personalization; 4.) Redefining professionalism;
5.) Creating leadership that develops a collective vision; and 6.) Implementing a
purposeful design in which school structures support all of the above principles. A
primary emphasis of the program will be increasing the number of students who
participate in college credit-bearing courses while in high school.
Primary Study: The primary impact study uses a quasi-experimental design to assess the
impact of the NC iRIS Project on a core set of student outcomes. The treatment group
sample will include a total of 18 comprehensive high schools that will receive three
years of services. A subset will receive services in Years 1-3; another subset in Years 2-4;
and the final subset in Years 3-5. Each school will be matched to three comparison
schools, bringing the total sample to 72 high schools. One sub-study will examine the
impact of the model on schools’ implementation of the Design Principles.
The schools in the study are located in rural, low-wealth counties throughout North Carolina. Each school’s entire student population will participate in the portion of the intervention that focuses on the six Design Principles. A subset of students will be targeted to participate in college credit classes while in high school. The target population for the college credit courses includes students:
• who would be the first in their family to complete postsecondary education;
• who are at risk of dropping out of high school; and
• who are members of groups underrepresented in college, including low-income and racial and ethnic minority students.
Additionally, the project currently plans to serve students who do not qualify for normal
dual enrollment status under North Carolina statute2.
Expected Outcomes: The study will examine two core student outcomes as the primary outcomes of the study: 1.) enrollment in college credit-bearing courses (dual credit and AP) and 2.) graduation from high school. Additional student outcomes to be examined include attendance, dropout and continued enrollment rates, and enrollment and success in college preparatory courses. These outcomes will be measured using data collected by the North Carolina Department of Public Instruction and housed at the North Carolina
Education Research Data Center. In addition, the study will use original survey data to
examine the extent to which the schools implement the six Design Principles, as
compared to schools in the comparison group.
Evaluation of Implementation: The implementation evaluation will focus on the NC iRIS
Project’s critical components: 1.) the delivery of and participation in program services
(what has been conceptualized as “structural implementation” (Century, Rudnick, &
Freeman, 2010)) and 2.) the implementation of the Design Principles at the school level
(this is similar to what has been conceptualized as “instructional implementation” and
“represent[s] the actions, behaviors, and interactions that the user is expected
to engage in when enacting the intervention” (Century, et al., 2010, p. 205)). The
implementation of the Design Principles can also be seen as proximal outcomes and
thus will not be considered as measures of fidelity of implementation.
Relative to the coaching and professional development services, fidelity of implementation will be assessed on whether 100% of professional development services and 90% of coaching services are delivered as planned to schools, and whether school staff participate in professional development services at a rate of at least 80%. (For coaching, the benchmarks for delivery of services and participation in services are the same because, if NCNSP actually provides the service, the coach is on-site working with the school and the school is participating in the service.)
Two other program services are considered as dichotomous measures of
implementation: the creation of an IHE partnership that allows students to take college
courses and provision of a day of professional development for district staff. The evaluation will record whether each of these is in place for each school. Across all schools, the benchmark
is 100% creation for IHE partnerships (because the program will not work without them)
and 80% participation rate among district staff in district professional development.
2 North Carolina law now requires that students who participate in dual credit courses be in 11th or 12th
grade, have a GPA of at least 3.2, and meet certain testing requirements. The NC New Schools Project has
received a waiver from these requirements for this project.
The final program service is provision of funds for college courses. As a measure of
fidelity of implementation, this will be treated as dichotomous (did the program provide
funds for students to take college credit courses or not?). Because the actual number of
courses that NCNSP funds is dependent on the number of students who enroll in
college courses, the number of courses supported will be considered a student outcome.
A total score for fidelity of implementation will be calculated by combining the level of
participation in all of the required activities (see below for more detail).
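One way such a combined score could work is sketched below. This is an illustrative assumption only: the component names, the sample delivery rates, and the equal-weight averaging are hypothetical, not the evaluation team's actual scoring formula.

```python
# Hypothetical sketch of a combined fidelity-of-implementation score.
# Component names, example rates, and equal weighting are illustrative
# assumptions, not the evaluation's actual formula.

def fidelity_score(delivered: dict, benchmarks: dict) -> float:
    """Average each component's delivery rate against its benchmark,
    capping each component at 1.0 (full fidelity)."""
    ratios = []
    for component, benchmark in benchmarks.items():
        rate = delivered.get(component, 0.0)
        ratios.append(min(rate / benchmark, 1.0))
    return sum(ratios) / len(ratios)

# Benchmarks from the plan: 100% PD delivery, 90% coaching delivery,
# 80% participation in PD. Delivery rates below are made-up examples.
benchmarks = {
    "pd_delivered": 1.00,
    "coaching_delivered": 0.90,
    "pd_participation": 0.80,
}
delivered = {"pd_delivered": 1.00, "coaching_delivered": 0.70, "pd_participation": 0.92}

score = fidelity_score(delivered, benchmarks)
```

Capping each ratio at 1.0 means exceeding a benchmark (as with the 92% participation rate here) does not offset a shortfall elsewhere.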
All of these data on service delivery and participation will be collected from project
records, including coaches’ reports and professional development sign-in sheets,
supplemented by interviews with staff and by data from site visits conducted by the evaluators.
The services described above are designed to prepare the schools to implement the
Design Principles. School-level implementation of the Design Principles can be
considered both as an implementation measure and an immediate outcome. Given that
these Design Principles are characteristics of a good school that could be found in both
intervention and non-intervention schools, it is critical to understand implementation in
both situations. We do not have formal benchmarks for each of these. Instead, the
expectation is that treatment schools improve on these dimensions as compared to
baseline and as compared to the comparison schools. These data will be collected
primarily through surveys administered to the staff of treatment and comparison
schools.
Independence of Evaluation: The evaluation is designed and conducted by the external
evaluator. Although NCNSP recruited the treatment schools, the comparison schools are
being identified by the evaluation team. The evaluation team will collect all of the
outcome data and the majority of the implementation data. The evaluation will use
implementation data collected by the developer, including project records, selfassessments completed by schools and reports completed by project coaches. All data
will be analyzed by the evaluation team and the developer will have no authority over
the final content of the reports.
1. Implementation Evaluation
1.1. Logic model for the intervention
The NC iRIS Project is designed to increase the number of students who graduate from
high school prepared for and enrolling in postsecondary education. The model uses a
comprehensive approach designed to affect most aspects of the schooling experience,
with a particular emphasis on creating a college-going culture through early exposure to
college classes. Figure 1 presents the overall logic model, which is described in more
depth in this section.
Figure 1: NC iRIS Logic Model
The specific activities that the NC iRIS project will undertake are described in more
depth here while the outcomes are described in the next sections.
The North Carolina New Schools Project (NCNSP) is the NC iRIS project grantee and is
the developer of the NC iRIS School Model and the provider of the supports to help
schools implement the model. The NC iRIS School Model is both comprehensive and
complex and is designed to influence the basic design of the school. The services
provided are similarly comprehensive and multi-faceted.
Services Provided. To help schools implement the Design Principles, NCNSP has
developed a set of comprehensive professional development supports—the Integrated
System of School Support Services (IS4). The core components of the IS4 services
include:
Instructional Coaching: Each school in the NCNSP network receives services from
experienced educators who have knowledge, experience, and skills in working with
school staffs and who understand and are committed to NCNSP’s mission, vision, and
support system. The instructional coaches emphasize both implementing the Design
Principles and working with the teaching staff individually and collectively to improve
their skills in using the Common Instructional Framework (CIF). The number of days each
coach spends in a given school is driven by the size of a school, with 3 days/teacher for
each year of the grant.
Leadership Coaching and Professional Development: Administrative teams at NCNSP
schools are supported by a leadership coach who conducts regular on-site visits.
Leadership coaches are experienced school leaders who have worked in NCNSP schools
or other schools with a focus on innovation. The NCNSP identifies major responsibilities
of these coaches as follows: 1.) Establishing trusting relationships with the principal and
school staff; 2.) Building understanding of the NCNSP Design Principles and best
practices; 3.) Identifying specific needs for support and assistance related to successfully
implementing the model; 4.) Identifying potential obstacles to success, while helping
develop strategies to eliminate them and assure support for initiatives within the scope
of NCNSP expectations; and 5.) Guiding and focusing school leaders on innovation,
reflective practice and the strategic planning process to ensure that all students in the
new school will graduate prepared for college and work. In addition, the leadership
coaches design and deliver regular professional development services to principals in
regional groups called Leadership Innovation Networks (LINs) focused on issues related
to the Design Principles. Each school receives approximately 20-25 days of leadership
coaching a year, depending on the size of the school.
Teaching for Results Professional Development: Each year, teachers in the treatment
schools will take part in a series of intensive professional development activities that
sustain their focus on instruction, academic rigor and professional learning
communities. Sessions are designed to build understanding of different aspects of the
Design Principles and the Common Instructional Framework. During the school year,
there are also sessions that involve visits to peer schools in which teachers use a medical
“rounds” model to improve their practice collaboratively. These Peer Visits provide a
basis for using the “rounds” model in each school, with the goal of helping teachers
work as professional peers, providing critical feedback and learning from each other
through facilitated classroom observations. Each school is expected to send teams to
participate in approximately 22 days of professional development a year. The
composition of the teams and the number of participants may vary depending on the
focus of the professional development.
NCNSP Program Staff Support: Each school is also assigned an NCNSP staff member who
serves as a portfolio manager, coordinating the delivery of integrated supports and
acting as a primary point of contact with NCNSP.
In addition to the IS4 professional development services (which are provided to all of
the schools with whom NCNSP works, including existing early colleges), there is also a
set of services unique to the NC iRIS Project. These are described below.
Funding for College Courses. One of the core goals of the NC iRIS Project is to increase
the number of students taking college credit courses. As noted above, the aim is that
15% of a school’s enrollment has completed one college course by the end of Year 1; in
Year 2, 30% of the population will have completed two college courses; and in Year 3,
50% of the population will have completed three college courses.
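To make these benchmarks concrete, they can be translated into student counts for a school of a given size; the enrollment figure below is hypothetical, not an actual school's:

```python
# Translate the Year 1-3 college-course benchmarks (15%, 30%, 50% of
# enrollment completing 1, 2, and 3 college courses) into student counts.
# An enrollment of 600 is a made-up example, not an actual NC iRIS school.
enrollment = 600
targets = {1: 0.15, 2: 0.30, 3: 0.50}  # project year -> share of enrollment

counts = {year: round(enrollment * share) for year, share in targets.items()}
print(counts)  # {1: 90, 2: 180, 3: 300}
```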
NCNSP will partially reimburse colleges directly for expenses associated with tuition for
students ($100 for each community college course) and for college textbooks.
Assistance in Developing Partnerships with Institutions of Higher Education (IHEs).
NCNSP staff plan to work closely with the school sites to ensure that each participating
school has an IHE that can provide access to college courses and that can also support
activities (such as college campus visits) to help create a more college-going culture. The
district and the IHE will collaboratively develop a formal agreement that delineates roles
and responsibilities.
Documenting the implementation of the services listed above is a core part of the
evaluation.
NCNSP is also providing a set of services designed to influence the context. These
services are new to the model and are not as solidly conceptualized; as a result, they will
not be a primary emphasis of the evaluation.
Actions to Influence the Context. NCNSP is also planning to provide additional supports
that are intended to create a more supportive context for the schools attempting to
implement this model. They intend to provide professional development with the
district, to work in educating the community, and to disseminate results throughout the
state. These strategies are new to NCNSP and have not yet been fully articulated.
All of the activities listed above are intended to help the school implement the Design
Principles of the NC iRIS School Model.
NC iRIS School Model. The NC iRIS School Model is based on the Early College High
School Model that has been implemented successfully in the past. The critical
components of the model include the Design Principles, the Common Instructional
Framework, and early access to college courses. These components are the content of
the services and are what schools are expected to implement. Each is discussed in more
detail below.
Design Principles: There are six design principles that represent the characteristics of
effective schools. As defined by NCNSP (and as included in the School-level Outcomes
section of the logic model), these Design Principles (North Carolina New Schools Project,
2011) are:
1. Ready for College: Innovative high schools are characterized by the pervasive,
transparent, and consistent understanding that the school exists for the purpose
of preparing all students for college and work, and recognizes that, in the 21st
century, the skills to succeed in post-secondary education and in viable
employment are the same. They maintain a common set of high standards for
every student to overcome the harmful consequences of tracking and sorting.
2. Require Powerful Teaching and Learning: Innovative high schools are
characterized by the presence of commonly held standards for high quality
instructional practice. Teachers in these schools design instruction that ensures
the development of critical thinking, application, and problem-solving skills often
neglected in traditional settings.
3. Personalization: Staff in innovative high schools understand that knowing
students well is an essential condition of helping them achieve academically.
These high schools ensure that staff leverage knowledge of students in order to
improve learning.
4. Redefined Professionalism: The responsibility to the shared vision of the
innovative high school is evident in the collaborative, creative, and leadership
roles of all adult staff in the school. The staffs of these schools take responsibility
for the success of every student, hold themselves accountable to their
colleagues, and are reflective about their roles.
5. Leadership: Staff in NCNSP schools work to develop a shared mission for their
school and work actively as agents of change, sharing leadership for improved
student outcomes in a culture of high expectations.
6. Purposeful Design: Innovative high schools are designed to create the conditions
that ensure the other five design principles are evident. The organization of time,
space, and the allocation of resources ensure that these best practices become
common practice.
The Design Principles are operationalized more fully in the school-level outcomes
section.
Common Instructional Framework: The Common Instructional Framework includes a set of six instructional strategies that all teachers in the school are expected to implement. These strategies are part of the Powerful Teaching and Learning Design Principle but receive particular emphasis in the professional development and instructional coaching. The instructional
strategies include: 1.) collaborative group work; 2.) frequent opportunities to write in
classrooms; 3.) literacy groups focused on understanding content texts; 4.) effective
questioning; 5.) scaffolding or clearly connecting to students’ prior knowledge; and 6.)
classroom talk.
Early Access to College Courses: A core aspect of the model is creating a college-going
culture (the Ready for College Design Principle) by providing students early access to
college courses, starting in 10th grade. The current expectation is that 15% of a school’s
enrollment has completed one college course by the end of Year 1; in Year 2, 30% of the
population will have completed two college courses; and in Year 3, 50% of the
population will have completed three college courses.
The extent to which schools are able to implement the Design Principles will be a key
focus of the study. Specifically, we will examine whether these components are present
in the treatment schools at a higher level than in the comparison schools (treating them
as proximal outcomes). We will also explore the extent to which implementation varies
across treatment schools and attempt to identify services or characteristics associated
with different levels of implementation (treating them as implementation measures).
1.1.1. i3 teacher and school outcomes
As articulated in Figure 1, the extensive professional development and coaching
provided by NCNSP are intended to support schools in implementing the Design
Principles (including the Common Instructional Framework and increased access to
college classes). Thus, each school is expected to make progress towards strong
implementation of the Design Principles. As also shown in Figure 1, these Design
Principles can be grouped in three different categories, each of which is discussed
below. Specific indicators for each of the Design Principles are shown in Table 1 below.
Students’ School Experiences. Three of the Design Principles emphasize directly
changing students’ experiences. As described above, these three include Ready for
College, Powerful Teaching and Learning, and Personalization. Ready for College focuses
on creating a college-going culture including: enrolling more students in a college
preparatory course of study; early access to college courses; and college awareness
activities. Powerful Teaching and Learning focuses primarily on the Common
Instructional Framework, Rigorous and Relevant Instruction, and the use of multiple
assessments in classrooms. Personalization emphasizes the creation of quality student-staff relationships and the provision of academic and affective supports for students.
School Staff Experiences. Two of the Design Principles focus on the work of the
educators in the school. These Design Principles are designed to change teacher and
leader practice in such a way that the staff are able to support implementation of the
first three Design Principles. Professionalism emphasizes collaboration, ongoing
professional learning, empowerment of teachers, and developing a sense of collective
responsibility for students. Leadership is reflective of the need to create a collective and
distributed vision for the school focused on student learning.
Structures. Purposeful Design is the final Design Principle and it is intended to get
schools to develop structures (schedules, allocation of resources, etc.) that support
implementation of the other five Design Principles.
The final immediate/school-level outcome that is articulated on the logic model is
community and district support for the school. This outcome has not been fully
conceptualized by NCNSP and the evaluation team will continue to work with them to
define it further.
The Design Principles are fairly broad. To help schools operationalize them, NCNSP
has developed a rubric that articulates expectations and indicators for each of the
Design Principles. The evaluation team has used this rubric as a base for establishing
indicators to measure the extent of implementation of each Design Principle. Table 1
presents the specific indicators.
Table 1: Indicators of School-Level Implementation

Ready for College (main concept: a college-going culture exists throughout the school)
--number of students taking college preparatory courses
--existence of college awareness activities
--faculty expectations for college-going among students
--assistance navigating admissions/financial aid
--students taking college credit-bearing courses (includes AP)

Powerful Teaching and Learning (main concept: all students experience high quality instruction)
--use of Common Instructional Framework
--use of rubrics and formative assessment strategies
--allowance for student input into activities

Personalization (main concept: all students are known and have necessary supports)
--high quality staff-student relationships
--use of specific academic/affective support strategies

Professionalism (main concept: teachers are encouraged to learn and collaborate around students)
--collaboration around instruction and student learning
--teachers feel responsible for student success
--participation in professional development
--teacher involvement in decision-making

Leadership (main concept: leaders develop a shared culture of high expectations)
--existence of a common vision
--high level of expectations among faculty

Purposeful Design (main concept: structures are in place to support the other Design Principles)
--scheduled time for teacher collaboration
--student supports embedded within the school day
--strong IHE partnership

1.1.2.
i3 student outcomes
Implementation of the Design Principles is intended to lead to improved student
outcomes—some of which are conceptualized as intermediate and some of which are
conceptualized as long-term. Specific outcomes that will be examined include the
following:
1. College preparatory course-taking and success. One of the main goals of the
project is to increase the college readiness of students. A core part of this is enrolling
students in the courses required for college entrance and helping them succeed in those
courses. As a result, this study looks at the proportion of students taking and succeeding
in a core set of college preparatory courses. The courses we will examine include those
that are required for entrance into the University of North Carolina system (e.g., four
years of English, four years of college preparatory math, etc.).
The evaluation will look at two outcomes for each course. The first will be the
percentage of a given grade of students who are taking the course; this serves as a
measure of access and of the extent to which schools are providing opportunities for
students to get ready for college. The second will be successful course completion, or the
percentage of a given grade of students who took and passed the course. In courses for
which there is a North Carolina-mandated End-of-Course (EOC) exam, a passing score on
the exam will be used. In courses for which there are no EOC exams, students’ final
grades will be used. This second measure of successful course completion captures both
access and success in school and does not penalize schools that are expanding access to
more students. The anticipated impact is at least 10 percentage points on both course-taking
and course success by the second year of the intervention.
The three Design Principles related to student experiences (Ready for College,
Powerful Teaching and Learning and Personalization) are expected to most directly
impact this outcome.
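To make the distinction between the two course outcomes concrete, the sketch below computes both rates for a small hypothetical grade-level roster. The record fields (`took_course`, `passed`) are illustrative, not actual NCDPI variable names.

```python
# Illustrative computation of the two course-level outcomes for one course
# and one grade. Field names and data are hypothetical, not NCDPI variables.

students = [
    {"took_course": True,  "passed": True},
    {"took_course": True,  "passed": False},
    {"took_course": False, "passed": False},
    {"took_course": True,  "passed": True},
]

n = len(students)

# Outcome 1 (access): share of the grade enrolled in the course.
taking_rate = sum(s["took_course"] for s in students) / n

# Outcome 2 (access + success): share of the grade that took AND passed.
# The full grade is the denominator, so expanding access is not penalized.
completion_rate = sum(s["took_course"] and s["passed"] for s in students) / n

print(taking_rate, completion_rate)  # 0.75 0.5
```

Note that a school enrolling more students in the course can only raise the first rate, and it raises the second whenever the added students pass.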
2. College credits accrued while in high school. One of the key goals of the
program is to increase the number of students receiving college credit while in high
school. We will examine the proportion of students taking dual credit or Advanced
Placement courses. These data are also available through NCERDC. The Ready for
College Design Principle is expected to most directly impact college enrollment,
although we do anticipate that the Powerful Teaching and Learning and Personalization
Design Principles will have a direct impact on students’ success in these courses. We
believe that enrollment in these college courses will also be influenced by a school’s
success at getting more students enrolled in a college preparatory course of study.
3. Attendance. Student attendance has been positively associated with progress
in school (Lee & Burkham, 2003). Changes in student attendance are therefore seen as a
reliable indicator of students’ likelihood of remaining in school. The evaluation will
examine the number of days that a student is absent from school. The intervention is
expected to result in a reduction of 2 days of absence. We theorize that the
Personalization Design Principle will have the most direct connection to attendance but
we also believe that Ready for College and Powerful Teaching and Learning will
contribute.
4. Staying in school. The intervention is designed to keep more students in
school and on track for graduation. As a result, the evaluation will look at the number of
students who drop out. Because we have found that the dropout data are not always
complete, we will also look at the proportion of students who remain enrolled in school
in each year. By Year 2 of implementation in a school, the intervention is expected to
result in an increase of 5 percentage points per year in the proportion of students who
remain enrolled in school. Similar to attendance, we theorize that the Personalization
Design Principle will have the most direct connection to staying in school but we believe
that the Ready for College and Powerful Teaching and Learning Design Principles will
also contribute. We also believe that attendance is an intermediate outcome that is
strongly related to this longer term outcome of remaining in school.
5. Graduation from high school. Given that the grant program will last only five
years, the evaluation will be able to examine graduation rates only for a limited group of
students (those students who were in 9th and 10th grade in Year 1 of the project). Based
on results being obtained by the early colleges, the intervention is expected to result in
an increase of 10 percentage points on graduation rates for those students who
experience four years in an implementation school. As articulated in Figure 1, increased
graduation rates are designed to be a direct outgrowth of more students remaining in
school.
1.2.
Research questions for evaluation of implementation
The specific questions guiding the implementation evaluation are as follows:
1. Fidelity of Implementation: To what extent did NCNSP deliver the services as
intended and to what extent did the schools receive those services? What
variability in service delivery occurred across sites?
2. What was the quality and perceived utility of the NCNSP services?
3. School-level Implementation: How did schools’ implementation of the Design
Principles change over time? How did this compare to the presence of the Design
Principle characteristics in the comparison schools?
4. Which students participated in college credit courses? How were they selected?
How did student characteristics in treatment schools compare to the
characteristics of college-credit enrolled students in control schools?
5. What specific assistance was provided to schools to help them develop IHE
partnerships? What partnerships have been established? What do they look like?
6. What services were provided to help create a supportive context in the district
and the community? What lessons have been learned from those activities?
1.3.
Measuring fidelity of implementation
The implementation evaluation will focus on two aspects of implementation: 1) the
delivery of and participation in program services (adherence and exposure) and 2) the
implementation of program components at the school level (the latter can also be
seen as proximal outcomes and will not be considered measures of fidelity of
implementation). Fidelity of implementation will be assessed at the school level, given
that different teams of school staff may participate in professional development
activities. The target benchmark will be whether 100% of professional development
services are delivered as planned to schools and whether participants take part in
those services at a rate of at least 80% (see Table 2 below). The 80% participation rate
was chosen by the developer as representing its perception of full participation.
For coaching, the benchmark for NCNSP and the school is the same because, if NCNSP
actually provides the service, the coach is on-site working with the school and the school
is participating in the service. For professional development services, NCNSP has
planned to provide all of these services to the schools in their varying networks
(including NCiRIS); however, the school may choose not to participate in individual
offerings. A participation rate of 80% is thus considered full fidelity for schools.
Because the number of days of coaching differs depending on the size of the school,
each participating school will have an individual benchmark for participation. Below is a
sample table for School M.
Table 2: Fidelity of Implementation of Services Provided to and Received by School M

Services | Target Level | Adequate Adherence by NCNSP | Adequate Dosage Received by School
Leadership Coaching | 20 days annually | 18 days provided | 18 days participated
Instructional Coaching | 81 days annually | 73 days provided | 73 days participated
Professional Development Services | Total of 22 days annually for different school staff | 22 days provided | 18 days participated
Two other core components of the program are considered as dichotomous measures
of implementation: the creation of an IHE partnership that allows students to take
college courses and provision of a day of professional development for district staff. It
will be indicated whether these are in place for each school. Across all schools, the
benchmark is 100% creation for IHE partnerships (since the program will not work
without them) and 80% participation rate in district professional development.
The final core component of implementation is provision of funds for college courses.
This amount is dependent upon the number of students taking college courses, which is
a student-level outcome of the program. As such, as a measure of fidelity of
implementation, this will be treated as dichotomous (did the program provide funds for
students to take college credit courses?). The number of courses funded by NCNSP will
be considered as a student outcome.
An overall fidelity of implementation score will be generated as follows. Each school will
receive a score from 0-10 on each core component, with 10 representing full
implementation and 0 representing no implementation. Each core component will be
weighted equally, giving a total possible score of 50 for full implementation.
Participation in the different professional development and coaching services will be
rated based on the percentage completed relative to the full implementation
benchmarks. For example, a school that participated in 80-100% of the professional
development activities would receive a score of 10 on that component. A school that
participated in 50% of the professional development activities would receive a score of
6.3 (50% divided by the 80% full-implementation benchmark, scaled to the 10-point
component). The dichotomous variables will receive scores of 0 or 10. All scores will be
added to create a single score.
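The scoring rule above can be sketched as follows. This is a minimal illustration, assuming three continuous service components and two dichotomous ones as described in the text; the function names and inputs are ours, not NCNSP's.

```python
# Sketch of the fidelity-of-implementation scoring rule described above.
# The 80% benchmark and the 0-10 component scale come from the text;
# function names and inputs are illustrative.

FULL_IMPLEMENTATION_BENCHMARK = 0.80  # 80% participation counts as full fidelity

def component_score(participation_rate):
    """Score one continuous component on a 0-10 scale: participation at or
    above the 80% benchmark earns 10; below it, the score is pro-rated."""
    return min(participation_rate / FULL_IMPLEMENTATION_BENCHMARK, 1.0) * 10

def dichotomous_score(in_place):
    """Dichotomous components (e.g., an IHE partnership) score 0 or 10."""
    return 10.0 if in_place else 0.0

def overall_score(continuous_rates, dichotomous_flags):
    """Equally weighted sum of component scores (max 50 for five components)."""
    return (sum(component_score(r) for r in continuous_rates)
            + sum(dichotomous_score(f) for f in dichotomous_flags))

# A school at 50% of planned PD participation scores 6.25 (reported as 6.3
# in the text); 80% or more earns the full 10.
print(component_score(0.50), component_score(0.85))
```

A school fully implementing all five components would thus score 50, matching the maximum described in the text.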
For implementation of the Design Principles, we do not have formal benchmarks.
Instead, the expectation is that treatment schools improve on these dimensions as
compared to baseline and as compared to the comparison schools. These data will be
collected through surveys administered to the staff of treatment and comparison
schools. More detail is provided under the school-level outcomes section.
1.4.
Data collection plan
The following types of data will be collected on implementation. At the end of this
section, Table 3 aligns the data sources, sample, and timeline with the research
questions.
Project records. NCNSP requires schools to sign in at the professional
development activities in which they participate. These records will be collected from
NCNSP and analyzed to document each school’s participation in professional
development activities.
Coaches’ reports and interviews. Each time coaches visit their respective schools, they
are required to complete structured reports that include the date visited, the services
provided as aligned with one or more of the Design Principles, and any feedback the
coaches have about a school’s progress. NCNSP will forward these reports to the
evaluation team as they are submitted. These reports will provide data to measure
implementation by allowing us to assess delivery of and participation in the Leadership
and Instructional Coaching. We will analyze these reports to look at number and content
of visits and to help describe the process of implementation. We will supplement this
with semi-structured interviews conducted annually with the coaches to understand a
school’s implementation of the Design Principles.
Interviews with NCNSP staff. We will interview NCNSP staff annually to make sure we
understand the services that have been provided, including all professional
development activities, coaches’ activities, professional development for the district,
and community development activities. We will also interview the NCNSP Portfolio
Managers who work with the schools to obtain data on the formation of IHE
partnerships.
Site visits. We will identify four schools that can illuminate specific issues relative to
implementation. The current thinking is that these schools will be ones that are
beginning to make significant changes in their schools as a result of the model. We will
then visit these schools once a year starting in Year 2. During these visits, we will
interview the administrative team, school staff who have been actively participating in
NC iRIS services, students who have been participating in college credit courses, and
district staff. We will also conduct structured observations in classrooms that have been
working with instructional coaches and in support activities that have been created as a
result of the project. Finally, we will also observe a coaching visit during the site visit.
Professional development surveys. After each professional development activity,
participants are asked to complete an online survey on the perceived quality and utility
of the professional development. Because NCNSP already asks participants to complete
these surveys, NCNSP will provide the responses to the evaluation team.
Professional development observations. Each year, the evaluation team will conduct
structured observations of selected professional development experiences. We will
emphasize observing P.D. sessions that are intended to provide the most direct and
targeted support for the NC iRIS schools. We will assess the P.D. sessions on indicators
of quality and relevance for the participants.
NCDPI data. NCDPI collects data on student course-taking from the schools. The level of
the course, including whether it is Advanced Placement or dual credit, is noted in the
data. Also included are demographic data for each student. These data are linked and
de-identified for use by researchers by the North Carolina Education Research Data
Center. Although these data will be used primarily for outcome analyses, we will also
use them to examine the characteristics of students who are participating in college
credit courses. These data will be analyzed starting in the first year in which a school is
participating in the intervention.
School Activity Survey. Each year, we will ask participating school teams to reflect on
and describe the implementation over the past year. The survey will ask participants to
describe what they have done, including the development of IHE partnerships.
School Self-Assessments. Each year, during the Summer Institute, schools complete a
self-assessment rubric that is centered on the Design Principles and that allows them to
identify areas of improvement. We will collect copies of these self-assessments.
Table 3: Data Collection Plan
(Each row lists: data source | sample | timeline)

1. Delivery and receipt of services
--Project records (sign-in sheets) | All participating schools | Ongoing
--Interviews with NCNSP staff | Relevant NCNSP personnel | Annually
--Coaches’ reports | Coaches for treatment schools | As received
--Site visit interviews | Staff in schools identified for visits | Annually starting in Year 2

2. Quality and utility of services
--P.D. surveys | All participants | After each training
--P.D. observations | Selected core PD sessions | Annually
--Site visit interviews | Staff in schools identified for visits | Annually starting in Year 2

3. School-level implementation of Design Principles
--Staff surveys | Treatment and comparison schools | Baseline at beginning of training; each year for treatment schools; every other year for comparison schools
--Coaches’ reports | Treatment schools only | Ongoing
--Interviews with coaches | Coaches for treatment schools | Annually
--School self-assessments | Treatment schools | Annually, each year of participation in the intervention
--Site visit interviews | Staff in schools identified for visits | Annually starting in Year 2

4. Characteristics of students participating in college credit courses; process of selection of students
--NCDPI data | Students in treatment and comparison schools | Annually starting in first year of implementation
--NCNSP staff interviews | Portfolio managers | Annually
--Interviews with coaches | Coaches for treatment schools | Annually
--School Activity Survey | Treatment schools | Annually
--Site visit interviews | Staff in schools identified for visits | Annually starting in Year 2

5. Development of IHE partnerships
--NCNSP staff interviews | Portfolio managers | Annually
--Interviews with coaches | Coaches for treatment schools | Annually
--School Activity Survey | Treatment schools | Annually
--Site visit interviews | Staff in schools identified for visits | Annually starting in Year 2

6. Services to create a supportive context
--Project records (sign-in sheets) | All participating schools | Ongoing
--Interviews with NCNSP staff | Relevant NCNSP personnel | Annually
--Site visit interviews | District staff in schools identified for visits | Annually starting in Year 2
2. Outcome Evaluation
Table 4: Impact studies described in Chapter 3

Chapter | Title | Notes
4.1 | Outcome Study 1 | Impact of the program on core student outcomes (e.g., college-credit course-taking and graduation) using a quasi-experimental comparison of students receiving 3 years of the program vs. students not receiving the program
4.1.6 | Sub-Study 1 | Impact of the program on school-level implementation of the Design Principles
2.1. Primary Outcome Study
The primary outcome study will use a quasi-experimental design to assess the impact of
the model on a core set of student outcomes. The sample will include a total of 18
comprehensive high schools that will receive three years of NC iRIS services. A subset
will receive services in Years 1-3; another subset in Years 2-4; and the final subset in
Years 3-5. Each school will be matched to up to three comparison schools, bringing the
total sample to approximately 72 high schools.
2.1.1.
Research questions
The general research question that motivates the outcome study design and analysis
plan is:
To what extent does participation in the NC iRIS Project result in improved student
outcomes, including increased course-taking and success, higher student attendance,
reduced dropout rates, and increased graduation rates?
Within this general question we specify two Primary Research Questions:
1. To what extent does participation in three years of NC iRIS implementation
result in improved student enrollment in college-credit bearing courses (dual
credit and AP)?
2. To what extent does participation in three years of NC iRIS implementation
result in improved student graduation from high school?
Secondary Research Questions include:
3. To what extent does participation in three years of NC iRIS implementation
result in improved student attendance?
4. To what extent does participation in three years of NC iRIS implementation
result in reduced student dropout rates and increased continued-enrollment rates?
5. To what extent does participation in three years of NC iRIS implementation
result in improved student enrollment and success in college preparatory
courses?
2.1.2.
Control (or comparison) conditions
The comparison condition will include 3 schools for every 1 intervention school, with the
comparison schools selected from the same pool as the intervention schools (that is,
comprehensive public high schools located in low-income, rural districts in North
Carolina that enroll between 350-1500 students). The comparison schools will not be
receiving any of the professional development services that are part of the NC iRIS
intervention.
In order to select appropriate comparison schools, we will collect baseline data on all
public high schools in North Carolina. These baseline data will be taken from publicly
available data collected annually by the North Carolina Department of Public
Instruction. The baseline data provide annual school-level averages of student-level
outcomes for all five primary and secondary student outcomes outlined above and
serve as the best available basis, given study resources, for selecting comparison
schools. This baseline dataset will include school-level data for the student outcomes in
this study for all intervention and potential comparison schools over the 5 years prior to
NC iRIS implementation.
We will then use propensity score matching (PSM) and tests of baseline equivalence to
select comparison schools that exhibit a similar baseline pattern on key student
outcomes. Specifically, we will choose comparison schools that are equivalent (within
.25 SD on each primary and secondary student outcome of interest) 1) at baseline and 2)
over the 5 years prior to NC iRIS implementation (given that an interrupted time series
analysis is planned to estimate this whole-school reform impact).
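The .25 SD equivalence criterion can be expressed as a simple standardized-difference check, sketched below. The function name, arguments, and example values are illustrative of the rule, not code or data from the study.

```python
# Sketch of the baseline-equivalence screen: a comparison school is retained
# only if, for each outcome, its baseline mean falls within 0.25 pooled SDs
# of the intervention school's mean. Names and values are illustrative.

def baseline_equivalent(treat_mean, comp_mean, pooled_sd, threshold=0.25):
    """True if the standardized baseline difference is within the threshold."""
    if pooled_sd == 0:
        return treat_mean == comp_mean
    return abs(treat_mean - comp_mean) / pooled_sd <= threshold

# Example: baseline graduation rates of 0.80 vs. 0.78 with a pooled SD of
# 0.10 differ by 0.2 SD (equivalent); 0.80 vs. 0.50 differ by 3 SD (not).
print(baseline_equivalent(0.80, 0.78, 0.10), baseline_equivalent(0.80, 0.50, 0.10))
```

In the study this check would be applied to each of the five primary and secondary outcomes, both at baseline and across the 5 prior years.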
In addition to matching schools and establishing baseline equivalence on student-level
outcomes, we will also collect baseline data on our proximal outcomes of interest:
school-level implementation of Design Principles. Both intervention and comparison
schools will be given a baseline survey on the Design Principles and followed for the
same time interval. While it is not possible to choose comparison schools based on
baseline equivalence of these Design Principles, these surveys will allow us to assess
comparison condition equivalence on school-level implementation outcomes at baseline
and follow-up to inform outcomes in Sub-Study 1.
2.1.3.
Sample selection
2.1.3.1. Selection of study schools
The 18 intervention schools in the study were recruited by NCNSP from the list of rural,
low-income counties in North Carolina (based on 2011 fiscal year eligibility criteria for
the rural low-income school program). All public comprehensive high schools in those
counties that enroll between 350-1500 students were considered as potential
participating schools. Schools had to be willing to participate in the intervention.
For each of the 18 intervention schools, the evaluation team will identify up to 3
comparison high schools using propensity score matching (PSM). The same criteria used
to select intervention schools will be used to select the potential pool of comparison
schools (that is, comprehensive public high schools located in low-income, rural districts
in North Carolina that enroll between 350-1500 students). Schools will be matched in 3
successive years, given the staggered entry design. That is, intervention schools will be
matched to 3 comparison schools the year prior to receiving the intervention.
We will use PSM to match schools on all five primary and secondary outcomes of
interest as well as theoretically relevant school-level characteristics including school
size, percent of minority students underrepresented in college, and percent of low
income students. To do so, we will use school-level data collected annually from NC DPI.
Given that our impact analyses will use an interrupted time series approach, we will
match on historical patterns of baseline outcomes (previous 5 years) and 1-year
pre-intervention school characteristics. That is, for each of the five outcomes of interest, we
will estimate both the mean across the 5 baseline years and the slope of a best-fit
line across those 5 years (to capture direction and magnitude of change). For relevant
school characteristics, we will use the most recently available year of data. In all, 13
covariates will be used to match schools and to try to achieve “good” covariate balance
across experimental and selected control schools. We will use various PSM techniques
as outlined and recommended by leaders in the field (Stuart, 2010; Stuart & Rubin,
2007) to try to achieve the best balance possible, given available data.
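To illustrate the matching step alone (not the propensity-score estimation, which would fit a model such as a logistic regression on the 13 covariates), the sketch below performs 3:1 nearest-neighbor matching without replacement on already-estimated scores. All school IDs and score values are invented.

```python
# Illustrative 3:1 nearest-neighbor matching on an estimated propensity
# score. Score estimation is omitted; IDs and scores below are invented.

def match_comparison_schools(treat_scores, pool_scores, k=3):
    """For each intervention school, select up to k comparison schools with
    the closest propensity scores, matching without replacement."""
    available = dict(pool_scores)  # school_id -> estimated propensity score
    matches = {}
    for school_id, score in treat_scores.items():
        nearest = sorted(available, key=lambda s: abs(available[s] - score))[:k]
        matches[school_id] = nearest
        for s in nearest:
            del available[s]  # each comparison school is used at most once
    return matches

treatment = {"T1": 0.62, "T2": 0.35}
pool = {"C1": 0.61, "C2": 0.64, "C3": 0.30, "C4": 0.58,
        "C5": 0.41, "C6": 0.33, "C7": 0.90}
matches = match_comparison_schools(treatment, pool)
print(matches)  # {'T1': ['C1', 'C2', 'C4'], 'T2': ['C6', 'C3', 'C5']}
```

Matching without replacement, as here, keeps the comparison group non-overlapping; other PSM variants (with replacement, caliper matching) trade that off against closer matches.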
2.1.3.2. Selection of study teachers
Because aspects of this reform touch the entire school, all school staff will have some
exposure to the services. Leadership coaches will work with the school’s administrative
team. Each school will work with the NCNSP staff to identify a subset of teachers who
will participate more intensively in the instructional coaching and the off-site
professional development. This subset of teachers may vary depending on the needs of
the school.
2.1.3.3. Identifying eligible students for the study
Eligible students include the entire student population within participating and
comparison schools, given all students are exposed to the whole-school reform
components of the NC iRIS program. However, a subset of students will be targeted to
participate in a specific component of the NC iRIS program: college credit classes while
in high school. Specifically, students who are the first in their family to complete
postsecondary education, who are at risk of dropping out of high school, who are
members of groups underrepresented in college (low-income and racial and ethnic
minority students), or who do NOT qualify for normal dual enrollment status under NC
statute will comprise a sub-sample of students eligible for that specific intervention
component.
As currently conceptualized by the program, students who would be eligible to
participate in North Carolina’s Career and College Ready program (which provides free
college courses) would not be eligible to participate in this program.
In the sample for our outcome analyses, we will include those students who are in 9th
and 10th grade in the first year of the project’s implementation in a school. This will
allow us to document graduation rates of students who have participated in the project
for at least three years.
2.1.3.4. Expected Sample Sizes
The expected study sample size (for both intervention and comparison conditions)
includes approximately 72 high schools in 30-40 districts, and an estimated 28,000
students. Participating students will be drawn from 3 cohorts of 9th and 10th graders.
Table 5 shows the anticipated sample by Cohort.
Table 5: Estimated Sample Size

Cohort (start year) | # of Treatment Schools | # of Comparison Schools | # of Students
2012-2013 | 5 | 15 | 8,000
2013-2014 | 8 | 24 | 12,000
2014-2015 | 5 | 15 | 8,000
2.1.3.5. Documenting attrition
This study is a quasi-experimental design that uses administrative data for core student
outcomes. As a result, attrition is not considered a concern. However, we will keep
detailed logs of when and why treatment schools choose to stop receiving the
intervention, if that occurs.
2.1.4.
Data collection for the student outcomes
All data used for student outcomes come from records collected annually from
schools by the North Carolina Department of Public Instruction. These data are
housed at the North Carolina Education Research Data Center at Duke University,
where they are cleaned, de-identified, and linked to longitudinal student records.
1. College preparatory course-taking and success. This study looks at the proportion
of students taking and succeeding in a core set of college preparatory courses. The
courses to be examined include those that are required for entrance into the
University of North Carolina system (e.g., four years of English, four years of college
preparatory math, etc.).
The evaluation will look at two outcomes for each course. The first will be the
percentage of a given grade of students who are taking the course and serves as a
measure of access and the extent to which schools are providing opportunities for
students to get ready for college. The second will be successful course completion or
the percentage of a given grade of students who took and passed the course. In
courses for which there is a North Carolina-mandated End-of-Course (EOC) exam, a
passing score on the exam will be used. In courses for which there are no EOC exams,
students’ final grades will be used. This second measure of successful course
completion captures both access and success in school and does not penalize schools
that are expanding access to more students. The anticipated impact is at least 10
percentage points on both course-taking and course success by the second year of
the intervention.
2. Attendance. The evaluation will examine the number of days that a student is
absent from school. The intervention is expected to result in a reduction of two days
of absence.
3. Dropout and continued enrollment in school. Because our experience with North
Carolina’s data indicates that the dropout data are not always complete, the
evaluation will also look at the proportion of students who remain enrolled in school
in each year. By Year 2 of implementation in a school, the intervention is expected to
result in an increase of 5 percentage points in the proportion of students who remain
enrolled in school, consistent with findings in the previously cited experimental study.
4. College credits accrued while in high school. The evaluation will examine the
proportion of students receiving college credit.
5. Graduation from high school. The evaluation will be able to examine graduation
rates only for the students who were in the 9th and 10th grades in Year 1 of the
project. Based on results being obtained by the early colleges, the intervention is
expected to result in an increase of 10 percentage points on graduation rates for
those students who experience four years in an implementation school.
2.1.5.
Student outcomes
The following table outlines student-level outcomes to be measured in the current
study.
Table 6: Outcomes
(Each row lists: name of instrument (and subtest) | instrument reference | normed or state test? | test-retest reliability | internal consistency | inter-rater reliability | explanations, notes, comments)

NCDPI Graduate Survey | NA | No | NA | NA | NA | NCDPI Graduate Survey is the authoritative list of graduates
Course identified as dual credit or AP-level | NA | No | NA | NA | NA | Level of courses is available through NCDPI
Courses required for entrance into UNC system | NA | No | NA | NA | NA | Course-taking is collected by NCDPI
Performance on state-mandated assessment or teacher grades | NCDPI | Yes | | | | Exam scores and grades are collected by NCDPI
Days absent | NA | No | NA | NA | NA | Days absent are collected by NCDPI
Dropout Database | NA | No | NA | NA | NA | Students are recorded as dropouts and sent to DPI
Student identified as enrolled in public school | NA | No | NA | NA | NA |
2.1.6.
Statistical analysis of outcomes for students
The primary analytic framework for examining the primary and secondary student
outcomes of interest will be computing the standardized mean difference (Hedges' g,
for continuous outcomes) or the odds ratio (for dichotomous outcomes) between the
intervention and control groups after three years of NC iRIS implementation, controlling
for relevant baseline covariates (including any baseline student outcomes that differ by
more than .05 SDs between conditions). We will use HLM (hierarchical linear modeling)
analyses (Raudenbush & Bryk, 2002) that account for the nesting of students within
schools, using SAS Proc Mixed (for continuous outcomes) and SAS %GLIMMIX (for binary
outcomes). HLM is the preferred method for analyzing data from studies with school-level
assignment such as this one (Institute of Education Sciences, n.d., p. 45). We will
estimate an impact for each of the two primary student outcomes of interest separately
(college credit course-taking and graduation rates). Because each of these outcomes
represents a different domain, it will not be necessary to apply a statistical correction to
adjust for multiple tests.
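To make the effect-size computation concrete, the standardized mean difference with Hedges' small-sample correction can be sketched as follows. The report's analyses use SAS; this standalone Python function is an illustrative simplification only, omitting the covariate adjustment and the nesting of students within schools:

```python
import math

def hedges_g(treat, ctrl):
    """Standardized mean difference (Hedges' g) between two groups.

    Applies the small-sample correction factor J = 1 - 3 / (4*df - 1),
    where df = n1 + n2 - 2.
    """
    n1, n2 = len(treat), len(ctrl)
    m1, m2 = sum(treat) / n1, sum(ctrl) / n2
    # pooled standard deviation from the two unbiased sample variances
    v1 = sum((x - m1) ** 2 for x in treat) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in ctrl) / (n2 - 1)
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled             # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)  # small-sample correction
    return d * j
```

In the full analysis, the raw group means would be replaced by covariate-adjusted means from the fitted multilevel model.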
A secondary analytic framework will be a comparative interrupted time series (ITS)
analysis, a recommended technique for assessing the impact of whole-school reforms
when random assignment is not feasible (Bloom, 2003). The rationale is that the most
reliable estimate of what would happen in the absence of the program is a trend (not a
point-in-time) estimate. Specifically, we will compare the historical pattern of each
student outcome in intervention schools in the 5 years prior to the start of the
intervention to the pattern of each student outcome in those same schools in the 3 years
following the start of intervention implementation. The difference between the observed
outcome levels and the levels projected from the baseline trend is referred to as a
"deviation from the baseline." We will then conduct a second interrupted time series
analysis for the matched comparison schools. The difference between the deviations
from the baseline in the intervention schools and the deviations from the baseline in the
comparison schools will represent the estimated impact of the intervention.
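The difference-in-deviations logic of the comparative ITS design can be sketched in a few lines. The functions below are illustrative only (the actual analysis would use a fitted statistical model with uncertainty estimates, not this simple least-squares trend projection):

```python
def baseline_deviation(pre, post):
    """Average deviation of post-period outcomes from the linear trend
    fitted to the pre-period (baseline) outcomes."""
    n = len(pre)
    xs = range(n)
    xbar = sum(xs) / n
    ybar = sum(pre) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, pre))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    # project the baseline trend into the post-intervention years
    projected = [intercept + slope * (n + t) for t in range(len(post))]
    return sum(obs - proj for obs, proj in zip(post, projected)) / len(post)

def its_impact(treat_pre, treat_post, comp_pre, comp_post):
    """Comparative ITS estimate: the intervention schools' deviation from
    baseline minus the comparison schools' deviation from baseline."""
    return (baseline_deviation(treat_pre, treat_post)
            - baseline_deviation(comp_pre, comp_post))
```

For example, with 5 baseline years and 3 post-intervention years per group, a positive `its_impact` value would indicate that intervention schools departed from their historical trend by more than the comparison schools did.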
2.1.7. Data collection for school-level outcomes
The second study focuses on implementation of the Design Principles at the school
level. As noted in Table 1, we have developed a set of indicators that represent
implementation of the Design Principles. Data on the indicators will be collected from
three primary sources:
• Extant administrative data collected annually by NCDPI.
• A baseline survey developed to measure implementation of the Design Principles. Both treatment and comparison schools will complete the survey prior to receipt of services. Treatment schools will then respond to the survey each year, and comparison schools will respond every two years. These scales are adapted from a survey administered as part of the IES-funded Early College High School studies.
• The Teacher Working Conditions Survey. This survey is administered biennially to all schools in North Carolina. It includes scales relevant to the quality of leadership, professional development, and participation in decision-making. Reliability of all scales is 0.84 or higher; content, construct, and predictive validity have been assessed (New Teacher Center, 2010).
2.1.8. School-level outcomes
Table 7 summarizes the indicators and data sources for each of the Design Principles.
Table 7: Indicators of School-Level Implementation
(The Design Principles Survey is administered at baseline and annually for treatment schools, and at baseline and every two years for comparison schools; NCDPI administrative data are collected annually.)

Ready for College
--number of students taking college preparatory courses (NCDPI administrative data)
--existence of college awareness activities (Design Principles Survey)
--faculty expectations for college-going among students (Design Principles Survey)
--assistance navigating admissions/financial aid (Design Principles Survey)
--students taking college credit-bearing courses, including AP (NCDPI administrative data)

Powerful Teaching and Learning
--use of Common Instructional Framework (Design Principles Survey)
--use of rubrics and formative assessment strategies (Design Principles Survey)

Personalization
--allowance for student input into activities (Design Principles Survey)
--quality of staff-student relationships (Design Principles Survey)
--use of specific academic/affective support strategies (Design Principles Survey)

Professionalism
--collaboration around instruction and student learning (Design Principles Survey)
--extent to which teachers feel responsible for students' success (Design Principles Survey)
--participation in professional development (Design Principles Survey; Teacher Working Conditions Survey, administered in 2012, 2014, and 2016)
--extent of teacher empowerment (Design Principles Survey; Teacher Working Conditions Survey)

Leadership
--existence of a common vision (Design Principles Survey)
--level of expectations among faculty (Design Principles Survey)
--perceived quality of leadership (Design Principles Survey; Teacher Working Conditions Survey)

Purposeful Design
--scheduled time for teacher collaboration (Design Principles Survey)
--student supports embedded within the school day (Design Principles Survey)
--strong IHE partnership (Design Principles Survey)

2.1.9. Statistical analysis of school-level outcomes
To assess the implementation of the Design Principles at the school level, an average
score will be calculated for each principle from the Design Principles and Teacher
Working Conditions surveys. Scores on each Design Principle will be compared both to
the school's scores at baseline and to the scores of schools that have not received the
treatment. This comparison will be done using a multiple regression analysis that
includes a dummy variable for receiving the treatment and school-level covariates,
including school size, percent free and reduced-price lunch, percent minority, and
composite score on the end-of-course exams. We will also conduct multi-level analyses
that use the Design Principles scores as site-level variables to explore possible
variation in impact across sites.
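The treatment-dummy regression described above can be illustrated with a minimal ordinary least squares sketch. This generic normal-equations solver is not the report's method (the actual analysis would use standard statistical software with the full covariate set); it only shows how a treatment indicator enters the design matrix:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination and partial pivoting.

    X is a list of rows, each beginning with 1 for the intercept;
    y is the list of outcomes."""
    k = len(X[0])
    # build X'X and X'y
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # forward elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta
```

Each row of X would take the form [1, treatment_dummy, school_size, pct_frl, ...] (covariate names here are hypothetical); the fitted coefficient on the treatment dummy then estimates the covariate-adjusted treatment-control difference in Design Principle scores.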
References
Bloom, H. S. (2003). Using "short" interrupted time-series analysis to measure the
impacts of whole-school reforms. Evaluation Review, 27(1), 3-49.
Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of
implementation: A foundation for shared language and accumulation of
knowledge. American Journal of Evaluation, 31(2), 199-218.
Institute of Education Sciences. (n.d.). What Works Clearinghouse: Procedures and
standards handbook. Washington, DC: Institute of Education Sciences, U.S.
Department of Education.
Lee, V. E., & Burkam, D. T. (2003). Dropping out of high school: The role of school
organization and structure. American Educational Research Journal, 40(2), 353-393.
New Teacher Center. (2010). Validity and reliability of the North Carolina Teacher
Working Conditions Survey. Santa Cruz, CA: University of California at Santa Cruz,
New Teacher Center.
North Carolina New Schools Project. (2011). Design principles. Retrieved October 2,
2011, from http://newschoolsproject.org/our-strategy/design-principles
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and
data analysis methods. Thousand Oaks, CA: Sage Publications.
Stuart, E. A. (2010). Matching methods for causal inference: A review and a look forward.
Statistical Science, 25(1), 1-21.
Stuart, E. A., & Rubin, D. B. (2007). Best practices in quasi-experimental designs:
Matching methods for causal inference. In J. Osborne (Ed.), Best practices in
quantitative social science. Thousand Oaks, CA: Sage Publications.
Appendix C: Design Principles Rubric
Design Principles
Overview
North Carolina New Schools partners with local school districts and higher education institutions to help secondary schools become nimble,
rigorous and focused institutions that graduate every student prepared for college, careers and life. NC New Schools’ goal is to spark and
support deep instructional change by purposefully and dramatically rethinking traditional secondary schools’ organization to
promote more effective teaching and learning. Our essential premise is straightforward: to improve public secondary schools everywhere,
individual schools must be encouraged and assisted to invent and implement more effective means of serving students. The successes that these
schools achieve must be sustained, their processes supported, and their new structures for success replicated.
Design Principles
Each child in every school is entitled to achieve high academic and affective outcomes. To that end, the following six design principles for partner
schools are non-negotiable for all involved in leading secondary school transformation:
• Ready for College: Partner schools are characterized by the pervasive, transparent, and consistent understanding that the school exists for the purpose of preparing all students for college and work. They maintain a common set of high standards for every student to overcome the harmful consequences of tracking and sorting.
• Require Powerful Teaching and Learning: Partner schools are characterized by the presence of commonly held standards for high-quality instructional practice. Teachers in these schools design rigorous instruction that ensures the development of critical thinking, application, and problem-solving skills often neglected in traditional settings.
• Personalization: Staff in partner schools understand that knowing students well is an essential condition of helping them achieve academically. These schools ensure adults leverage knowledge of students in order to improve student learning.
• Redefine Professionalism: Evident in partner schools are the collaborative work orientation of staff, the shared responsibility for decision making, and the commitment to growing the capacity of staff and schools throughout the network.
• Leadership: Staff in partner schools work to develop a shared mission for their school and work actively as agents of change, sharing leadership for improved student outcomes in a culture of high expectations for all students.
• Purposeful Design: Partner schools are designed to create the conditions that ensure the other five design principles: ready for college, powerful teaching and learning, personalization, leadership and redefined professionalism. The organization of time, space, and the allocation of resources ensures that these best practices become common practice.
Design Principle 1: Ready for College
Course of Study
College Ready Skills (High School)
College Credit
Beginning
Early Steps
Growing Innovations
New Paradigms
Students are tracked according to
past performance into regular and
honors level courses.
All students are given the option to
take at least one honors course.
All courses are taught at the honors
level where applicable in the NC
SCOS.
Every student graduates with the
minimum admissions standards for
the UNC system schools.
Students are allotted time to receive
academic assistance (i.e. regular
scheduled meeting with staff
member, tutorials).
During and after school support is
scheduled on an individual basis,
determined by student performance
and data.
Schools implement the academic
supports necessary for every student
to succeed in the university
prep/future ready core curriculum.
Schools revise high school
experience of current students
based on data collected from college
going graduates.
A curriculum integrating but not limited to self-advocacy, note-taking skills, study skills, research skills, written and oral communication, self-monitoring, and time management (college ready skills) exists.
Students are unaware of college
resources available to them, e.g.
study groups, tutoring center,
library and office hours.
College ready skills are
implemented throughout the
curriculum.
Opportunities for students to
practice college ready skills are
provided via exhibitions,
presentations (project graduation)
to authentic audiences.
Every student experiences a curriculum that requires self-advocacy, note-taking skills, study skills, research skills, written and oral communication, self-monitoring, and time management.
Some students use college
resources.
Every student learns how to make
effective and efficient use of college
resources.
Students develop a four/five year
plan during the freshman year.
Students review their four/five year
plan occasionally with a staff
member.
Selected students enroll in some
college classes.
Most students enroll in some college
classes, selected by interest only.
Each student has a well-defined
four/five year plan that is
continually monitored and updated
to ensure completion of an AA or AS
degree, or transferable college
credit.
All students are enrolled in and
complete college classes with
transferable credit.
Every student is an advocate for
their own learning, seeking
opportunities for personal growth
and success in the college
environment.
Every student is accepted into a four
year institution with credits earned
fully recognized. Students’
acceptance to college is celebrated.
Every student graduates high school
with both a high school diploma and
a two year degree or 64 hours of
transferable credit.
Design Principle 1: Ready for College (continued)
College Going Culture
Beginning
Early Steps
Growing Innovations
New Paradigms
Students and families are invited to
orientation/open house at the home
base community college or
university.
Brochures and literature about
financial planning and scholarships
are available for students.
Students are given the opportunity
to participate in optional college
visits.
Multiple mandatory visits to four
year institutions take place
throughout the year.
Every student and their family visit
multiple IHE campuses throughout
the year.
Families are invited to presentation
about FAFSA, CFNC, scholarships,
and the college admissions process.
Families are supported through
FAFSA and scholarship application
processes on site.
Schools provide support for every
student and family for college
admissions and financial aid,
including scholarship applications.
Some students take the SAT at some
point.
Students have access to take the
PSAT and SAT in sequence and on
time.
Teachers post information about
their college(s) and invite students
to discuss their college experience.
Frequent conversations exist
between students and teachers with
a focus on attending and graduating
college.
Students are given multiple
opportunities to prepare for and
participate in the PSAT/SAT/ACT.
Students explore the internet and
investigate possible institutions
based on their interests. Teachers
and students talk daily about
acquiring tangible goals in order to
go to college. Conversations focus
on which college to attend not
whether to go to college.
Every student takes the PSAT and
SAT/ACT in sequence and on time.
A list of institutes of higher
education is posted in the
counselor’s office. Displays
throughout the school highlight
colleges. Students aspire to attend
college.
Every student completes a formal
process through which they are
supported by staff in applying to
and being accepted at multiple
IHEs.
Design Principle 2: Require Powerful Teaching and Learning
Instruction
Curriculum
Beginning
Early Steps
Teachers plan using a variety of
resources but without reference to
local, state or national standards or
without consideration of
appropriate pacing.
Teachers teach the North Carolina
Standard Course of Study at an
appropriate pace.
Content is course-specific.
Teachers relate the content from
other courses to connect learning for
students and incorporate literacy
and problem solving instruction
within each content area or
discipline.
There are limited learning activities
outside of classroom experiences.
Some teacher-directed learning
activities enrich classroom curricula
for some students.
Instruction meets the learning needs
of some, but not all, students.
There is limited use of technology
for instruction.
Teachers regularly adapt resources
and instruction to address learning
differences in their students.
Teachers integrate and use
technology in their instruction.
Teachers provide limited
opportunities for students to work in
groups.
Teachers organize student learning
teams and teach the importance of
cooperation and collaboration.
Growing Innovation
New Paradigms
Teachers plan instruction around
“big ideas” that are mapped to
multiple standards and to 21st
century skills (e.g. leadership, ethics,
accountability, adaptability,
initiative, communication,
collaboration, social responsibility,
wellness, entrepreneurship).
Teachers relate the content to other
disciplines and school theme (if
applicable) to deepen understanding
and connect learning for students,
across each school year as well as
from year to year. Teachers further
promote global awareness and its
relevance to the subjects they teach.
All students participate in
purposeful and varied co-curricular
learning opportunities that support
college and work readiness and
school theme (if applicable).
Teachers create structures for
personalized learning and teach
students to make informed choices.
Teachers know when and how to use
technology to maximize students’
development of critical-thinking and
problem-solving skills.
Students identify problems – in
their own lives, in their
communities, and in the world –
and design projects mapped to state
and national standards across
disciplines.
Teachers organize learning teams
deliberately and teach students how
to create and manage their own
teams.
Students synthesize relevant
knowledge and skills from their
cumulative experience to design and
communicate thoughtful solutions
to increasingly sophisticated,
authentic problems. In themed
schools, authentic problems relate to
school theme.
Students design and lead a wide
range of co-curricular learning
opportunities that support college
and work readiness, service learning
and school theme (if applicable).
Students create their own learning
plans with guidance and support
from the teacher.
Students help each other use
technology to learn content, think
critically, solve problems, discern
reliability, use information,
communicate, innovate and
collaborate.
Students effectively organize and
manage their own learning teams.
Design Principle 2: Require Powerful Teaching and Learning (continued)
Instruction
Assessment
Beginning
Early Steps
Growing Innovation
New Paradigms
Teacher talk dominates instruction.
Teachers communicate effectively
with all students. Teachers help
students articulate thoughts and
ideas clearly and effectively.
Teachers teach students how to
communicate effectively with each
other and set up classroom practices
that require them to do so.
Collaboration and discussion among
students is pervasive.
Students help each other exercise
and communicate sound reasoning,
understand connections, make
complex choices, and frame,
analyze, and solve problems.
Students clarify ideas and other
students’ work during whole-class
discussions and small group work.
Students ask each other to justify
their thinking.
Students are reading, writing,
thinking and talking in every
classroom every day, without
explicit teacher direction, to advance
collective and individual
understanding of core skills and
concepts.
Students participate in the
development of the criteria for
successful demonstration of
meaningful learning outcomes.
There is limited use of student
engagement strategies.
All teachers adopt a common
instructional framework to make
instruction more engaging for all
students and to ensure a coherent
and consistent student learning
experience.
Teachers facilitate students reading,
writing, thinking and talking daily to
develop a deep understanding of
core academic concepts.
Teachers post learning objectives.
Teachers communicate learning
outcomes and the criteria for
success and assess progress daily.
Teachers routinely share rubrics
with students that clearly
communicate meaningful learning
outcomes and criteria for success.
Teachers monitor progress
throughout each lesson.
Teachers’ use of a narrow range of
assessment strategies limits their
understanding of students’
knowledge and skills.
Teachers employ varied assessment
strategies that elicit student thinking
related to learning outcomes.
Teachers have a more complete
understanding of students’
knowledge and skills.
In addition to a wide range of teacher-designed assessment strategies, teachers use protocols for peer- and self-assessment aligned to learning outcomes and criteria. Teachers have a comprehensive understanding of students' knowledge and skills.
Students exercise choice in
determining how to demonstrate
learning outcomes. Teachers and
students share a comprehensive
understanding of each student’s
knowledge and skills.
Teachers primarily use assessments
to assign grades and/or control
behavior.
Teachers provide instructional
interventions based on data from
assessments.
Teachers provide timely, targeted
opportunities for students to learn
and demonstrate particular
outcomes based on data from
assessments.
Students monitor their progress on
learning outcomes and engage in
multiple, varied opportunities to
learn and demonstrate outcomes.
Teachers provide limited feedback
to students and/or parents
regarding student progress.
Feedback is limited to grades and/or
assignment completion.
Teachers regularly provide feedback
to students and parents regarding
progress on specific learning
outcomes.
Teachers provide feedback to
students and parents that clearly
communicates students' strengths
and specific guidance for continued
development relative to learning
outcomes.
Teachers and students have ongoing
communication regarding progress
toward learning outcomes and next
steps.
Students regularly report strengths and
plans for continued development
relative to learning outcomes to parents.
Design Principle 3: Personalization
Affective and Academic Supports
Beginning
Early Steps
Growing Innovation
New Paradigms
There is an advisory or seminar
course for every grade level that
provides students with affective and
academic supports based on
students’ personal learning plans
and other data.
A schedule is in place in which
school staff and college staff from
any higher education partners meet
regularly to discuss students’
progress. Data is used to identify
and implement the necessary
supports for students.
Some planning for implementation
of advisories/seminars exists.
Advisory courses are provided for
some grade levels.
Advisory/seminar courses with well-developed curricula exist for every grade level.
There is a plan to develop
relationships with the students,
staff, and community partners and
any higher education partners.
A systemic plan is followed in which
each student is assigned to a
teacher-advisor. The school
counselor also serves as an advisor
and assists students with their
academic and affective needs.
Some online courses are available
for students.
A variety of online courses are
available which students may take
based on their academic needs.
Advisories, personal learning plans,
AVID or other school-wide
strategies are used to know students
and their academic and affective
needs well. In addition, staff
members meet regularly during
scheduled times to discuss students’
academic and affective needs.
There is a wide range of online
courses available to students based
on their personal interests and
academic needs.
There is a plan to develop academic
support programs in order to
maximize student growth.
Some academic supports are in
place such as a summer bridge
program and tutoring session times
available before and after school.
Academic support programs are in
place during the summer and
before, during, and after the school
day such as tutoring sessions and
academic support labs.
The school provides a wide range of
high school and college courses that
allow students to be self-initiated
and self-paced. Supports are
available that help students to
complete these courses at a high rate
of success.
The school schedule provides varied
opportunities for students to obtain
additional supports through
extended blocks, looping, tutoring
and summer programs.
Design Principle 3: Personalization
Adult/Student Relationships
Beginning
Early Steps
Growing Innovation
New Paradigms
A welcome letter is sent to incoming
freshmen.
Staff members visit the homes of
incoming freshmen.
Staff members visit the homes of
incoming freshmen and new
students to welcome them and begin
developing positive relationships.
Staff members visit the homes of
every student annually in order to
maintain positive parent-school
relationships and discuss the needs
and progress of students.
Some teachers meet occasionally to
discuss the needs and progress of
students.
There is a plan to develop a school
schedule that provides time for
teachers to meet at least once a week
to discuss the needs and progress of
students.
All teachers meet weekly by grade
level or subject area to discuss the
needs and progress of students.
Teachers are grouped by students
and meet during scheduled common
planning times daily to discuss
student needs and develop supports.
Every student has a teacher-advisor.
Teacher-advisors meet with their
assigned students once a month.
Teacher-advisors meet with their
assigned students weekly to review
their academic progress.
During informal conversations,
students state that they feel their
teachers care about them.
Data gathered from sources such as
student surveys indicate that a
majority of students feel known and
cared for by the adults in their
school.
A school wide survey of the student
body indicates that at least 95% of
students surveyed indicate that they
feel known, respected and cared for.
Teacher-advisors meet with assigned students at least once a week to review their progress and provide academic and affective supports as needed. Teacher-advisors loop to advance with students as they move through high school and review the students' personal education plans in order to ensure successful completion.
Data from surveys of students and
parents indicate that at least 98% of
both populations feel that the adults
in the school care, know, and
respect them.
The school distributes newsletters
or other forms of communication to
provide updates and information
frequently.
School newsletters and
communications are provided in a
language other than English.
School newsletters and
communications are translated into
every language represented in the
student population.
The school website, blogs, tweets,
social media sites and newsletters
are translated into every language
represented in the school and are
made available. Parents, students
and other community members are
also involved in submitting
information for the newsletters and
communications.
Design Principle 4: Redefine Professionalism
Shared Responsibility and Collaborative Decision Making
Collaborative Work Orientation
Beginning
Early Steps
Growing Innovations
Principals observe teachers.
Teachers observe their peers in
practice.
Teachers observe their peers in
practice for the purpose of giving
and receiving feedback for revision
and improvement.
Staff meetings and/or common
planning opportunities model
inquiry among adults.
Staff attends staff meetings and/or
common planning opportunities, as
appropriate.
Staff meetings and/or common
planning opportunities model
collaboration among adults.
Teachers work independently.
Staff collaborates with peers and, at
times, share expertise for
professional learning and improved
practice.
School implements district
protocols for recruitment, interview
and hiring processes.
Principal includes one or more staff
in recruitment, interview, and
hiring processes for their specific
school.
Teachers supervise advisories.
Teachers are developed as teacher-counselors through a common research-based approach to student development.
Teachers lead advisories that provide
consistent guidance and support,
including the development of personal
learning plans and support for
emotional, social and academic needs.
Students are organized into
advisory groups and/or project
teams.
Peer connections are promoted
through advisory groups and/or
project teams.
Professional development
opportunities are offered to support
youth development.
Teachers share strategies for
engaging challenging students.
Students have an overt and clearly
delineated mechanism for
participating in student
development and school success.
Teachers collaboratively create
flexible solutions for engaging
challenging students.
Principals make decisions related to
school-wide issues and teachers
make decisions related to classroom
issues.
Staff has some input into school
decisions, including the selection of
representatives to decision-making
bodies.
Staff regularly collaborate with
peers, share expertise, and hold
themselves accountable for
professional learning and improved
practice.
Principal and staff collaborate on
recruitment, interview, and hiring
processes to ensure alignment with
the school’s mission.
All staff work together to make
decisions that advance the mission
of the school.
New Paradigms
All staff members solicit peer
feedback in order to advance their
own practice.
Staff is engaged in inquiry around
their practice through sharing their
work, student work and
professional dilemmas for feedback
and support.
Staff regularly collaborate with
peers, share expertise, and hold
themselves and their peers accountable
for professional learning and
improved practice.
Staff, parents, and students
collaborate on recruitment,
interview and hiring processes to
ensure alignment with the school’s
mission.
All adults in the school assume
responsibility for youth
development and each student’s
success.
Students assume responsibility for
positive school and community
engagement that contributes to
citizenship.
The school actively engages families
regarding successes and challenges
that their child faces and works with
families to arrive at successful
solutions.
Individuals from all constituent
groups are engaged in and can
clearly articulate the school
decision-making process and the
avenues for participation.
Design Principle 4: Redefine Professionalism (continued)
Sustainability
Beginning
Early Steps
Growing Innovations
New Paradigms
Knowledge Capture &
Exchange
Staff maintains personal classroom
and instructional resources and
units of study.
All staff share resources and units of
study with school-based peers.
All staff post and use resources and units of study in a shared best-practice library.
Networking
Assigned teachers participate in
scheduled NCNSP network events.
Teachers initiate participation in
scheduled NCNSP network events.
Staff participates in peer networks
for the purpose of giving and
receiving feedback to advance
specific practices.
All staff routinely vet individual and shared resources and units of study posted in a shared best-practice library with school-based peers and with peers across the NCNSP network.
Staff convene and regularly network
with peers, employers, and experts
beyond the school.
Communication
Staff members sometimes speak in
support of the school with internal
and external stakeholders.
Staff members routinely speak in
support of the school with internal
and external stakeholders.
Staff members speak with
confidence about collectively made
decisions with internal and external
stakeholders.
Staff members speak with
confidence to stakeholders about
collectively made decisions and
their alignment with the school’s
mission and vision.
Capacity Building
School participates in NCNSP-provided professional development.
Schools have a mechanism for
disseminating resources and
materials garnered from NCNSP
professional development
experiences.
Schools secure resources and
professional development
experiences aligned with the
school’s mission and vision and
NCNSP Design Principles.
Schools have a systematic, internal
process for the on-boarding and
development of new staff and cross-training, capacity building, and
continued acculturation of existing
staff aligned with NCNSP Design
Principles.
Design Principle 5: Purposeful Design
Autonomous Governance
Beginning
Early Steps
Growing Innovations
New Paradigms
A full-time principal has been
named and essential staff has
been hired.
Adequate instructional and
support staff members have been
hired; neither their time nor the
principal's is divided among
other schools.
The school has a unique school
code and a preliminary school
budget has been prepared.
The principal and staff meet to
review and discuss the school
budget.
The principal, instructional staff,
and support staff meet consistently
to discuss scheduling and hiring
decisions as well as other school
operation items in order to make
decisions that will best meet
students’ needs. The principal and
staff members have significant
autonomy from undesirable staff
transfers and district level
professional development mandates.
The principal and staff meet
frequently to discuss the school
budget and make revisions as
necessary.
The school has established an
identity and theme.
The school actively advertises
its identity and theme
and visits middle schools to
recruit its target population.
The school is autonomous in making
curriculum decisions related
to the school's
identity and theme.
The principal and staff members work
as a team in which distributed
leadership is used and everyone is
actively involved in key areas of
decision making. The principal and
staff meet during scheduled, specific
times at least once a week and use their
autonomy from district mandates to
make decisions and solve problems to
create unique instructional designs to
meet student needs.
The principal and staff meet on a
scheduled basis to review the school
budget and make any necessary
revisions. The budget is revised as
necessary to make decisions that
exemplify a flexible use of resources in
the best interests of students.
The school reaches out to local, state,
national and global organizations to
deepen the connection between the
school’s identity, theme, and real world
applications.
The district office is aware of the
separate professional
development requirements set
forth by the NCNSP.
The district office has waived
required attendance at some of
the district level professional
development.
The district office and the school
partner are involved in the decision-making
process regarding the
attendance of the principal and staff
for selected professional
development events.
The principal and staff have complete
autonomy in making decisions
regarding attendance at district level
professional development events. The
principal and staff attend and
implement all NCNSP mandated
professional development. They also
work together to identify and develop
any additional professional
development needed.
Design Principle 5: Purposeful Design
Student Recruitment and
Selection*
School Sustainability
Beginning
Early Steps
Growing Innovations
New Paradigms
There is a detailed budget plan
one fiscal year prior to the
current fiscal year.
There is a detailed five year budget
plan for the duration of initial
funding.
There is a detailed budget plan to
ensure program sustainability beyond
initial funding.
There is a detailed budget plan to
ensure program sustainability beyond
initial funding that incorporates
business and community partner
support as well as other stakeholders.
The school is recognized in the
community.
The school fosters relationships
with business and community
partners for financial support,
community service opportunities,
job shadowing opportunities and
participation in school projects
during the school year.
The school develops business and
community partnerships for
continuous financial support,
community service opportunities,
internship opportunities, and
participation in school projects that
connect to and influence decision
making.
The school develops business and
community partnerships for ongoing
financial support, community service
opportunities and participation in school
projects that connect to and influence
decision making. Business and
community partners are also used to
provide opportunities that connect
students to real world learning
experiences.
The school is well recognized by
the school district and local
education partners.
The school has scheduled meetings
with the school district and higher
education partner to discuss
sustainability of the school.
Short and long range plans for
development and sustainability of the
school are available and supported by the
school district and higher education
partner.
A plan for the sustainability of the
school is embedded within the
outlook of the school district and
higher education partner.
Recruitment materials for the
school are available.
Recruitment materials are aligned
with NCNSP guidelines for the
specific model.
Recruitment materials are aligned
with NCNSP guidelines for the
specific model in at least two
languages.
Recruitment materials aligned with
NCNSP guidelines for the specific
model are available online in multiple
languages.
A school admissions policy
exists.
The school admissions policy is well
defined and non-selective.
Some faculty members participate during
middle school recruitment visits and
presentations. The staff participates
during the selection process. A rubric for
targeted recruitment focused on the
design principles is implemented.
Faculty, staff, students and
community leaders assist the
principal by participating during the
selection process. A rubric for
targeted recruitment and student
interviews are used in the process.
The principal recruits in middle
schools.
The principal and staff recruit in
middle schools.
The principal, staff and students
explicitly reach out to
underrepresented parent and
community groups.
A community approach involving the
staff, parents, students, civic leaders and
business leaders is executed to reach out
to underrepresented parent and
community groups.
Design Principle 5: Purposeful Design
Collaborative Work Orientation
Beginning
Teachers work independently.
Early Steps
Growing Innovations
New Paradigms
Staff members collaborate with
peers and, at times, share expertise
for professional learning and
improved practice.
Staff regularly collaborates with
peers, shares expertise, and holds
itself accountable for
professional learning and improved
practice. A common planning time
has been established as part of the
master schedule.
Staff regularly collaborates with
peers, shares expertise, and holds
itself and peers accountable for
professional learning and improved
practice.
Shared Mission
Change Agent
Focus on Powerful Teaching & Learning
Design Principle 6: Leadership
Beginning
The principal employs tools to create
a mission and vision for the school.
The principal completes the School
Improvement Plan and NCNSP Self-assessment.
Early Steps
Growing Innovations
The principal ensures that the
school's identity actually drives
decisions and informs the culture of
the school.
Staff members work together to
make decisions that advance the
mission of the school and foster
understanding among constituent
groups.
The principal ensures alignment of
the school’s vision with the
implementation of evidence-based
strategies to improve student
performance.
The principal and staff creatively
seek opportunities to build new and
unique connections between the
school and the community.
The principal acts as a catalyst to
seek new solutions and encourages
risk-taking in meeting individual
student needs with potentially
beneficial outcomes.
The principal proactively develops
partnerships with district and
institutions (e.g., higher education
colleagues) to the benefit of school
and students.
The principal expects teachers to be
a part of a professional learning
community.
The principal convenes staff working
groups to identify instructional
trends across campus.
The principal leads discussions
about standards based upon
research and best practice.
The principal safeguards
instructional and professional time
in the school day.
The principal monitors instruction
in classrooms daily for full
implementation of the Common
Instructional Framework and
provides relevant and targeted
feedback to teachers.
The principal makes data available
to staff for review and reflection.
The principal holds staff accountable
for full implementation of the
Common Instructional Framework
and for continuous learning and
professional development.
The principal allows teachers to take
risks in meeting students’ needs.
The principal collects or receives
data.
The principal facilitates
conversations with staff about the
use of data to improve school
performance through systematic
collection, analysis and goal setting.
New Paradigms
Staff members engage in a dynamic
process of continuous re-examination and refinement of the
mission of the school in order to
grow the school’s direction based on
previous successes and challenges.
Staff assumes ownership for the
development of new solutions to
meet school and individual student
needs.
The principal contributes to
leadership within the district and
across the NCNSP network to
advance an innovative educational
agenda for all students in North
Carolina.
Staff members assume ownership of
problem identification, solution
generation and strategy
implementation.
Staff collaborates with peers, shares
expertise, and holds itself and peers
accountable for professional
learning and improved practice.
Staff members adopt an action
research orientation that includes
the collection of data points,
analysis, and goal setting as a result
of data review.
The principal makes decisions
related to school-wide issues.
The principal leads all committees
and work groups.
The principal develops partnerships
with staff to the benefit of the school
and students.
Culture of High
Expectations
Collaborative Work Environment
Shared Leadership
Design Principle 6: Leadership
Beginning
The principal believes that all
students are capable, with
appropriate supports, of succeeding
in a challenging learning
environment.
The principal believes that all staff
members, with support, are capable
of creating a rigorous and
challenging learning environment
for all students.
Early Steps
Growing Innovations
The principal demonstrates evidence
of high expectations for all students
that eliminates tracking.
The principal holds staff accountable
for ensuring the success of each
student.
New Paradigms
Staff holds peers accountable for
ensuring the success of each student.
The principal demonstrates evidence
of high expectations for all staff that
include routine conversations with
staff regarding school standards for
rigorous and challenging learning
environments for all students.
The principal seeks input from staff
into decisions made at the school,
including active recruitment of
diverse representatives on school
decision-making bodies.
Staff collaborates with peers, shares
expertise, and holds itself and peers
accountable for the design and
implementation of rigorous and
challenging learning environments
for all students.
Individuals from all constituent
groups are engaged in and can
clearly articulate the school
decision-making process and the
avenues for participation.
The principal actively encourages
teacher leadership through
traditional school-based leadership
opportunities, including
department/grade level leaders,
School Improvement Chair, etc.
The principal proactively develops
relationships with students, families
and community partners.
The principal holds teachers
accountable for full engagement in
the design and implementation of
rigorous and challenging learning
environments for all students.
The principal establishes a clear
collaborative decision-making
process so that all staff work
together as appropriate to make
decisions that advance the mission
of the school.
The principal promotes staff
participation in district and external
leadership opportunities and
enables staff to lead school-based
conversations about those
experiences.
Teachers, parents, and community
members actively participate in the
development of the School
Improvement Plan, the NCNSP Self-assessment, and other school plans.
The principal expects and empowers
teacher leadership through the
establishment of clearly defined and
promoted leadership deployment
pathways.
The principal empowers staff and
school community to assume
ownership of problem identification,
solution generation and strategy
implementation.
The principal designs a schedule and
process that includes common
planning opportunities.
Staff share instructional practices,
lessons learned, and current
challenges with peers during
common planning opportunities.
Practice is made public through the
use of school-wide rounds and peer
school review, which includes both
internal and external peer
observation and feedback.
Staff members routinely engage in
quality assurance processes such as
school-wide rounds, peer school
review, and collaborative student
work reviews to improve
instructional practices within
specific classrooms and across the
school.
Appendix D: NC iRIS Implementation Survey
Validating Early College Strategies: Staff Survey
UNIVERSITY OF NORTH CAROLINA AT GREENSBORO
CONSENT TO ACT AS A HUMAN PARTICIPANT--SURVEY
What is the study about?
This is a research project. The purpose of this project is to evaluate the implementation and impact of a
program called Validating Early College Strategies. We want to understand how schools are implementing
specific policies and practices and whether those change over time.
Why are you asking me?
We are asking you to participate in this study because you work in a school that is receiving services
or in a school that is similar to a school receiving services.
What will you ask me to do if I agree to be in the study?
We will ask you to complete an anonymous survey.
Is there any audio/video recording?
No, there is no audio or video recording.
What are the dangers to me?
The Institutional Review Board at the University of North Carolina at Greensboro has determined that
participation in this study poses minimal risk to participants. If you have any concerns about your rights, how you
are being treated or if you have questions, want more information or have suggestions, please contact Eric Allen in
the Office of Research Compliance at UNCG toll-free at (855)-251-2351. Questions, concerns or complaints
about this project or benefits or risks associated with being in this study can be answered by the study’s director,
Julie Edmunds, who may be contacted at (336) 574-8727 or at [email protected]
Are there any benefits to society as a result of me taking part in this research?
Information collected through the evaluation will help inform how to make high schools better.
Are there any benefits to me for taking part in this research study?
There are no direct benefits to participants in this study.
Will I get paid for being in the study? Will it cost me anything?
Your school will receive $1,000 once at least half of its staff have completed the survey. It won't cost you
anything to participate.
How will you keep my information confidential?
This online survey is anonymous. The data will be entered into a database kept on a password protected
computer. For the online surveys, absolute confidentiality of data provided through the Internet cannot be
guaranteed due to the limited protections of Internet access. Please be sure to close your browser when finished
so no one will be able to see what you have been doing.
What if I want to leave the study?
You have the right to refuse to participate or to withdraw at any time, without penalty. If you do withdraw, it
will not affect you in any way. If you choose to withdraw, you may request that any of your data which has
been collected be destroyed unless it is in a de-identifiable state.
What about new information/changes in the study?
If significant new information relating to the study becomes available which may relate to your willingness to
continue to participate, this information will be provided to you.
Voluntary Consent by Participant:
By clicking “I agree” at the bottom of this form, you are agreeing that you have read, or it has been read to you,
and you fully understand the contents of this document and willingly consent to take part in this study.
All of your questions concerning this study have been answered. You are also agreeing that you are 18 years of
age or older and are agreeing to participate. You may print a copy of this form for your records.
I AGREE (goes to survey)
I DON’T AGREE (goes to thank you page)
School: __________________
Date: ________________
Your school is participating in a project led by the North Carolina New Schools Project. This survey is designed
to measure your school experiences in areas that the project is designed to influence. We will use this
information to describe what schools are doing. We also hope to connect this information to student outcomes
and determine which aspects of the program are most critical. As a result, we ask you to be very honest in
reporting what is actually happening in your school.
Please do your best to answer questions based on your knowledge; if there is a question you absolutely cannot
answer, please skip that question.
We will also share a summary of the results of this survey with your individual school for school improvement
planning. However, the results will not be broken out by position. As a result, this survey is anonymous and will
not be traced back to you.
Thank you very much for your time.
For comparison group:
Your school is participating in a study designed to understand the implementation of a specific reform effort.
Your school is not participating in this reform effort but your school is similar to other schools that are. This
survey is designed to measure your school’s experiences in a variety of areas that are targeted by the reform we
are studying. We will use the survey information to understand if the reform is working. If it is working, we
want to understand which aspects are most critical. As a result, we ask you to be very honest in reporting what
is actually happening in your school.
Please do your best to answer questions based on your knowledge; if there is a question you absolutely cannot
answer, please skip that question.
We will also share a summary of the results of this survey with your individual school for use in your school
improvement planning. However, the results will not be broken out by position. As a result, this survey is
anonymous and will not be traced back to you.
Thank you very much for your time.
1. What is your role in this school? (Please choose the ONE that most applies.)
○ Administrator (go to Q2)
○ Counselor (go to Q2)
○ Teacher (skip to College Readiness)
○ Support Staff (skip to College Readiness)
○ Other __________________________ (skip to College Readiness)
2. Below is a list of courses. Please identify the kinds of courses that would be on a typical class schedule for
two sets of first-time 9th grade students: those students who are below grade level and those students who are on
grade level. (In cases of a structured sequence of courses or a bridge course leading to a higher level course in
the same year, please mark the highest level course a typical student could expect to take in 9th grade.)
(NOTE: In the online survey, the respondent is prompted to choose from a drop-down menu the appropriate
level of course for each type of student.)
a. English: Remedial
English/English I or a higher
course
b. Mathematics: Introductory
Mathematics/ Algebra I or
Integrated Mathematics I or
higher
c. Science: Biology, a Physical
Science, or Earth/Environmental
Science/ No science
d. Social Sciences: World History,
Civics and Economics, or US
History/No Social Studies
e. Foreign Language: Foreign
language/ No foreign language
(Two columns: A below-grade-level 9th grader would have: / An on-grade-level 9th grader would have:)
3. Below is a list of courses. Please indicate whether your school does not offer, regularly offers for high
school credit, or offers for dual credit, college credit, or AP credit each of these courses (including online
courses). (Regularly offered courses do not have to be offered every semester. Do not include courses that
students can take on their own time. If courses are offered for both high school and AP/College credit, please
select both options.)
a. Algebra I, Geometry, Algebra II
b. Integrated Mathematics I, II, and
III
c. Pre-Calculus and Trigonometry
d. Calculus (AB and/or BC)
e. Statistics
f. Advanced Functions and Modeling
g. Biology
h. Chemistry
i. Earth/Environmental Science
j. Physical Science
k. Physics
l. English
m. Civics and Economics
n. World History
o. US History
p. Other Social Science (2 additional
options):_________
q. Visual and Performing Arts
r. Foreign Language
s. Career and Technical Education
(Response options: Not offered / Offered for HS Credit / Offered for Dual Credit, College Credit, or AP. An
additional column asks for the estimated percentage of students taking the class for college credit or AP.)
4. This year, what percentage of your students (Mark one for each question.):
(Scale: Less than 25% / 25-50% / 50-75% / 75-99% / 100%)
a. Were enrolled in honors
courses?
b. Were on track to meet
minimum admission standards
for the university system?
College Readiness
The first set of questions concerns activities related to college readiness.
5. How much do you agree with the following statements? (Please choose the ONE that most applies.)
a. The faculty in this school expects every
student to go to college.
b. Most teachers in this school believe that, if
given enough support, all students can
successfully complete college preparatory
courses.
c. The faculty at the school explicitly and
purposefully focuses on college expectations.
d. The faculty at the school focuses on
specific activities that lead to college
acceptance.
e. The vision of this school is tied to
preparing every student for college.
f. The school does activities designed to get
all students to think of themselves as college
students.
(Scale: Strongly Disagree / Disagree / Agree / Strongly Agree)
6. Please estimate the percentage of students for whom the school provides the following services. (Mark one
for each question.)
(Scale: 0% / Less than 25% / 25-50% / 50-75% / Greater than 75%)
a. Advising on courses to take
to get ready for college
b. Advising on choosing college
classes
c. College exam preparation
(COMPASS, Accuplacer,
SAT/PSAT, ACT)
d. Advising on skills students
need in college
e. Discussions with college
faculty about expectations in
college
f. Tours of college campuses
g. Advising parents about
college admissions and financial
aid
h. Helping students through the
college admissions process.
i. Helping students through the
financial aid process
Other: ____________________
Teaching and Learning
The following questions concern curriculum and instruction in your school.
7. To what extent does your school have a common vision for instruction?
○ No common vision; everyone teaches the way he/she likes
○ Some staff share a common vision for instruction but others don't
○ There is a common vision that drives major instructional decisions for all staff
8. This question asks you to report on your instructional practices. Note: If you are an administrator or
counselor, please answer this question relative to the teaching practices of most teachers in your school
(Mark one for each question.)
This school year, how frequently have you… (Scale: Never / A few times this year / Once or twice a month /
Once or twice a week / Almost every day)
a. Asked students to solve
problems based on life outside of
school?
b. Let students decide on the
projects or research topics
they will work on?
c. Had students decide how to
work on their assignments or
projects (e.g., read on their own,
do research in the library)?
d. Had students work together on
projects or assignments?
e. Emphasized making
connections between what goes
on inside and outside of school?
f. Made connections between
what’s covered in your class and
what’s covered in other classes?
g. Asked students to defend their
own ideas or point of view in
writing or in a discussion?
h. Asked students to write more
than 5 pages on a topic?
i. Asked students to explain their
thinking?
j. Asked students to apply what
they have learned to solve a new
problem?
k. Asked students to engage in in-depth discussions about what they
have read or learned?
This school year, how frequently
have you…
l. Asked students to research
information?
m. Asked students to do a formal
oral presentation?
n. Asked students to form
and test a theory or hypothesis?
o. Asked students to analyze
or interpret documents or data?
p. Asked students to do a formal
oral presentation?
q. Had students create or add
to a portfolio of their work?
r. Expected students to take
detailed notes on a lecture or
presentation?
s. Worked with students on
time management and study
skills?
t. Asked students to communicate
what they had learned in writing?
u. Asked students to read difficult
or complex texts?
v. Used rubrics to grade students’
work?
w. Explained your expectations
for an assignment up front?
x. Given students feedback or
comments on their work before
they turned it in for a grade?
y. Provided models or exemplars
so students could see high quality
work?
z. Taught students note-taking
skills and/or note-taking
strategies?
aa. Provided a syllabus and had
students use it for planning their
work?
(Scale: Never / A few times this year / Once or twice a month / Once or twice a week / Almost every day)
8. (continued) This question asks you to report on your instructional practices. Note: If you are an
administrator or counselor, please answer this question relative to the teaching practices of most teachers
in your school (Mark one for each question.)
This school year, how frequently have you… (Scale: Never / A few times this year / Once or twice a month /
Once or twice a week / Almost every day)
bb. Asked students to reflect on
their learning?
cc. Asked students to assess their
own work?
dd. Asked students to assess their
peers' work?
9. This question asks you to report on your use of different assessments. Note: If you are an administrator or
counselor, please answer this question relative to the assessment practices of most teachers in your
school. (Mark one for each question.)
How frequently have you used the following types of assessments? (Scale: Not at all used / Seldom used /
Used occasionally / Used often / Used very often)
a. Multiple choice tests
b. Essays
c. Open-ended written responses
other than essays (such as graphic
organizers, etc.)
d. Projects
e. Oral presentations
f. Formative assessments to guide
instruction
g. Informal checks on student
understanding (observations,
questioning)
10. This question asks you to report on your communication with parents. Note: If you are an administrator
or counselor, please answer this question relative to the teaching practices of most teachers in your school
(Mark one for each question.)
This school year, how frequently have you provided feedback to parents… (Scale: Never / A few times this
year / Once or twice a month / Once or twice a week / Almost every day)
a. Regarding grades?
b. Regarding assignment
completion?
c. Regarding progress on specific
learning outcomes?
d. That clearly communicate
students' strengths?
e. With specific guidance for
continued development relative to
learning outcomes?
Personalization
The next set of questions focuses on aspects of the school related to staff-student relationships and to affective
and academic support for students.
11. Please mark the extent to which the following statements about relationships in this school are true.
The faculty in this school believes that all
students can do well.
a. Faculty members at this school have given
up on some students.
b. Every student at this school is known well
by at least one staff member.
c. The family and home life of each student
is known to at least one faculty member in
this school.
d. Faculty members follow up when students
miss their classes.
e. Faculty members respect all the students in
this school.
f. Students respect all the faculty members in
this school.
g. Important communication messages for
parents are translated to different languages.
h. Peer connections are promoted through
advisory groups or project teams.
i. Peer mediation programs help solve
student conflict.
j. Staff interact on a regular basis with
students’ parents or guardians.
k. Staff in this school care whether or not
students come to school.
l. Other:_________________
(Scale: Not true at all / Somewhat true / Mostly true / Entirely true)
12. To what extent are the following services offered at your school?
a. Advisories/Seminar
b. Sessions to help improve
general academic skills such
as study skills
c. Tutoring connected to a
specific class
d. Summer orientation or
bridge sessions for entering
students
e. Personalized education
plans
f. Sessions or classes to help
students cope with social or
emotional issues
g. Credit Recovery Courses
h. Online Courses
i. Intensive E-O-C prep
activities
j. Community Service
(Response options: Not offered / Available but not mandated / Mandated only for students who need it (may
be available for others) / Mandated for everyone. If offered, respondents also estimate the % of students who
participate.)
13. If students get extra help or other services, when does this happen? (Please mark all that apply.)
○ During the school day
○ Before the school day
○ After the school day
○ On weekends
○ During vacations (summer, breaks)
14. Does the school provide transportation for students if they need to get services outside of regular school
hours? (Please select only one answer.)
○ Yes, the school provides transportation.
○ No, the school does not provide transportation.
○ Students do not receive services outside of regular school hours.
Professionalism
This set of questions concerns issues such as decision-making, collaboration, and professional development.
15. How frequently do you work with or communicate with other school staff on the following: (Mark one for
each question.)
(Scale: Never / A few times this year / Once or twice a month / Once or twice a week / Almost every day)
a. Lesson or unit planning
b. Logistical issues (ex. planning
field trips, ordering materials, etc.)
c. Student behavior
d. Assessments
e. Peer observations & feedback
f. Content learning
g. Instruction/instructional strategies
h. Individual student needs
i. Sharing resources and units
j. Using research or data to improve
instruction
16. To what extent are the following activities built into your school schedule? (Mark one for each question.)
(Scale: Regularly scheduled time / Scheduled as needed / Occurs when people take the initiative)
a. Joint planning or collaboration
b. Professional development
17. Please mark the proportion of school staff for whom the statements below are true.

Response scale: None of the staff / A few of the staff / Most of the staff /
All of the staff

a. School staff act as if they are responsible for students’ learning, even if the students are not in their classes.
b. Staff in this school really believe every child can learn.
c. School staff meet regularly (formally or informally) to discuss how to meet the needs of students.
d. At this school, staff enforce a common set of rules and regulations that enables us to handle disciplinary problems successfully.
e. If a student doesn’t want to learn, staff here give up.
f. School staff believe that good teaching is more important to students’ engagement in schoolwork than is their home environment.
g. If a student doesn’t learn something the first time, staff here will try other approaches until the student does learn.
h. Staff in this school believe that students’ success or failure is primarily due to factors out of the school’s control.
i. Staff in this school feel responsible for making sure that students don’t drop out.
j. At this school, staff are able to create a safe and inclusive atmosphere even in the most difficult classes.
18. How much professional development have you received in the following areas in the past year?

Response scale: None / A single presentation / Multiple sessions / Multiple
sessions with on-site follow-up / Not Applicable

a. The content you teach
b. Instructional strategies in your content area
c. General instructional strategies
d. Management or organizational practices (including Critical Friends Groups, leadership practices, etc.)
Other: ______________
19. How involved are the following groups in the decision-making process in the school?

Response scale: Not involved at all / Involved in minor and mostly minor
decisions / Involved in some major decisions / Involved in most major
decisions

a. Teachers
b. Students
c. Members of the college community
d. Parents
e. Community members
20. How involved are the following groups in the recruiting, interviewing, and hiring process in the school?

Response scale: Not involved at all / Involved in some aspects / Involved in
most aspects / Significantly involved in all aspects

a. Teachers
b. Students
c. Members of the college community
d. Parents
e. Community members
Purposeful Design
The following questions concern the structure of your school, interactions with the district, and the nature of
any partnerships your school may have.
21. (Principals only) To what extent does the school have autonomy in the following areas?

Response scale: None / A little / A fair amount / A lot

a. Hiring
b. Firing
c. Budgets
d. Participation in district-level professional development
22. To what extent do the district’s requirements and supports align with the school’s goals and needs?
○ None
○ A little
○ A fair amount
○ A lot
23. Schools often have partnerships with different members of the community. How do the members of your community contribute to your school? (Please select all that apply.)

Groups: Parents (including PTA) / Businesses / Local colleges or
universities / Other members of the community

a. Financial support
b. Provide internships
c. Mentor or tutor
d. Serve as guest speakers
e. Provide equipment
f. Teach classes or courses
g. Provide other resources
24. How supportive do you feel the community is of this school and its vision?
○ Not at all supportive
○ A little supportive
○ Fairly supportive
○ Very supportive
25. (Principals only) Please list any major grants you receive for initiatives in your school (e.g., Golden Leaf technology grants).

26. (Principals only) Please list and briefly describe any school-level interventions or other key school improvement efforts occurring in your school.
Leadership
27. Please indicate the extent to which you agree with the following statements about your school.

Response scale: Strongly Disagree / Disagree / Agree / Strongly Agree

a. The school has a clear mission and vision that drives key decisions in the school.
b. Staff members work together to continuously review and evaluate the implementation of the vision.
28. Please indicate the extent to which you agree with the following statements about the leadership team at your school.

Response scale: Strongly Disagree / Disagree / Agree / Strongly Agree

The leadership team:
a. Leads discussions about standards of curriculum and instruction.
b. Monitors instruction on a regular basis.
c. Provides feedback on a regular basis.
d. Facilitates discussions centered on using data to improve performance.
e. Creates an environment where all staff are responsible for student learning.
f. Expects that all staff will work to ensure that students learn.
g. Communicates high expectations for all students.
h. Celebrates successes with all staff.
Demographic Information
Please tell us a bit about your background.
29. Number of years of experience in education: _______
30. Number of years in current role at any school (as administrator, counselor or faculty): __________
31. Number of years in current role at the current school: __________
THANK YOU FOR YOUR TIME!
North Carolina Investing in Rural Innovative Schools (NC iRIS)
NC iRIS Liaison
These are intended to be minimal guidelines and may be expanded in each
district depending upon unique school circumstances and needs.
Primary Responsibility:
The NC iRIS liaison will develop positive relationships between the higher
education partner and the LEA, bridging gaps and keeping communication lines
open among all parties. As the chief advocate for and support to the NC iRIS
high school, the liaison will devote 80 percent or more of his or her time
directly to serving students and meeting the program needs of the NC iRIS
high school. The remaining 20 percent of the liaison’s time will be spent on
duties such as securing classroom space, scheduling and overseeing college
placement testing, coordinating vertical teaming between faculties, and
participating on various joint high school and higher education partner
committees as appropriate.
Possible roles of the NC iRIS liaison:
Coordination of Schedules
• Assist sites in developing four or five year plans of study combining
secondary and postsecondary coursework to ensure the attainment of an
associate’s degree; two years of transferable college credit; online courses
or any pathway opportunity.
• Assist in the development of schedules that support gradual transitions to
increasing levels of independence in postsecondary coursework.
• Work with secondary and postsecondary staff to ensure coordination of
high school and college schedules that allow access to required
coursework.
• Assist secondary staff with the registration of students in postsecondary
courses.
• Synchronize planning with the college, so that high school schedules are
developed in time to fit into the college cycles for room allocation and
instructor assignment (for both high school and college classes needed by
the NC iRIS school). Ensure that all details have been worked out for days
when the college is not in session but the high school is.
• Help to plan summer orientation, with focus on needs for space and types
of activities.
• Coordinate the use of college facilities.
Structure
Policy Development and Coordination (this work forms the basis for the
College/LEA Memorandum of Understanding and includes, but is not limited to,
the following)
• Assist secondary planning team in developing policies for awarding
high school credits for college coursework.
• Develop policy statements for students in areas such as:
o Attendance
o Data collection
o Grading policies and procedures
o Behavior and code of conduct
o Parking
o Access to college facilities and resources (library, computer labs,
etc.)
o Inclement weather coordination
• Develop and facilitate policies and procedures for placement of
students into postsecondary courses
o Placement test scores
o EOC test scores
o Portfolio development
o Alternative assessments
• Participate in appropriate working committees.
Curriculum Development and Coordination, such as, but not limited to:
• Facilitate curriculum planning between secondary and postsecondary
subject areas to reduce redundancy and maximize collaboration
• Facilitate cooperative planning to align secondary and postsecondary
expectations
• Work with postsecondary department heads to develop special topics
courses for high school students
• Coordinate between college and high school staff to identify
professional development that is appropriate or deemed necessary
Guidance, Support, and Advocacy, such as, but not limited to:
• Support NC iRIS students and families by providing information
related to colleges, financial aid, and other information as needed.
• Connect NC iRIS students to student life on campus through special
events such as Black History Month and student council.
• Connect special services at the college with the NC iRIS school. This
includes coordinating disability services for NC iRIS students in college
courses with college instructors and arranging for advising programs to
be developed.
• Educate the college and the community at large about NC iRIS.
• Participate in as many parent high school activities as possible.