
Teacher Turnover, Composition and Qualifications in the Year-Round School Setting

Jennifer Graves, Steven McMullen and Kathryn Rouse

Abstract

We estimate the effects of year-round school (YRS) calendars on teacher turnover and teacher qualifications for the state of California, finding that YRS results in diminished teacher education and experience. This result is notable as previous research finds negative academic impacts of YRS in California. As context for our findings, we use comparisons with North Carolina, where research has found neutral academic impacts for the same calendar. While we find that schools in both locations hire more teachers to accommodate the calendar, teacher qualifications do not decrease in North Carolina. Our results are therefore consistent with, and can partly explain, evidence on the impact of YRS on student achievement. Additionally, as YRS is implemented in more affluent areas in North Carolina and in disadvantaged populations in California, we use matched samples to show that student demographics do not explain the teacher impacts we find for California.

JEL Classification: I21; J45

Acknowledgements

This author gratefully acknowledges support from the Spanish Ministry of Science and Innovation through grants ECO2013-44920-P and ECO2017-82882-R.

Appendix

A “Teacher Turnover, Composition and Qualifications in the YRS Setting”

In this Appendix, we first discuss and present results from several robustness tests of our preferred estimation results for California. Then, we provide detail on the data and empirical strategies used for the comparison analyses conducted for Wake County, NC.

B Robustness of California Results

In this section, we provide a variety of evidence regarding the robustness of our main findings for California. First, Table 7 reports regressions of lagged school characteristics, including teacher characteristics, on an indicator for a change to a multi-track YRS calendar in the following year in California. This test provides supporting evidence that there is no notable time-varying selection into multi-track YRS that could bias our estimates. Nonetheless, the estimations in the main paper include both school fixed effects and school-specific time trends to account for potential time-varying selection. Table 8 reports the main estimation results without any school fixed effects or school trends, and Table 9 reports the same specifications with school fixed effects added, but still no time trends. These two tables show how the estimates change as the estimation becomes stricter in terms of the identifying variation used.
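As a concrete illustration of how this falsification test can be set up, the sketch below builds a next-year multi-track indicator from a school-by-year panel and regresses it on current school and teacher measures with school and year fixed effects, mirroring column (2) of Table 7. All file and variable names (schoolid, mtyr, and the listed characteristics) are hypothetical placeholders rather than the actual names in our data.

```stata
* Minimal sketch of the Table 7 falsification test; all names are placeholders.
* ssc install reghdfe    // user-written command for high-dimensional fixed effects
use ca_school_panel.dta, clear                     // hypothetical school-by-year panel

* Indicator for switching to a multi-track YRS calendar in the following year;
* the last observed year for each school is left missing because no lead exists.
bysort schoolid (year): gen byte mtyr_next = (mtyr[_n+1] == 1 & mtyr == 0) if _n < _N

* Regress the next-year change on current school and teacher characteristics,
* with year and school fixed effects and robust standard errors as in Table 7.
reghdfe mtyr_next styr traditional charter pct_asian pct_black pct_hispanic   ///
    pct_other pct_frpm pct_male computers_ps internet_ps                      ///
    n_teachers pct_overtime pct_masters avg_yrs_teach avg_yrs_district        ///
    pct_credentialed pct_special_cond pct_cert_sped pct_cert_biling,          ///
    absorb(schoolid year) vce(robust)
```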

Table 7:

Regressions of current school measures on an indicator for a change to a multi-track YRS (MTYR) calendar in the following year, California.

Dependent variable: indicator for a change to a multi-track YRS calendar in the following year. Columns (1) and (2) report two specifications; teacher variables are reported in column (2) only. Entries are coefficients with robust standard errors in parentheses.

Variable | (1) | (2)
Single-track YRS calendar | 0.0445*** (0.00781) | 0.0457*** (0.00781)
Traditional calendar | 0.0504*** (0.00560) | 0.0515*** (0.00568)
Charter school | −0.00116* (0.000685) | −0.00149 (0.00187)

Student variables (percent on 0–1 scale)
Percent Asian | −0.00502 (0.00636) | −0.00680 (0.00635)
Percent black | 0.0102 (0.00632) | 0.00768 (0.00627)
Percent Hispanic/Latino | −0.0142*** (0.00307) | −0.0151*** (0.00308)
Percent other race | −0.00135 (0.00193) | −0.00195 (0.00191)
Percent on FRPM | 1.96 × 10^−5** (9.48 × 10^−6) | 1.83 × 10^−5* (9.50 × 10^−6)
Percent male | −0.000239 (0.00132) | −0.000612 (0.00132)
Computers/student | 0.000151 (0.000108) | 0.000202* (0.000115)
Internet connections/student | −0.000243 (0.000437) | −0.000381 (0.000432)

Teacher variables (percent on 0–1 scale)
Number of teachers | — | 5.37 × 10^−5 (3.39 × 10^−5)
Percent working overtime | — | −7.18 × 10^−6 (2.16 × 10^−5)
Percent with a Masters or PhD | — | 2.67 × 10^−6 (1.00 × 10^−5)
Average years teaching | — | 5.18 × 10^−5 (6.19 × 10^−5)
Average years in district | — | −0.000313*** (7.22 × 10^−5)
Percent fully credentialed | — | −5.94 × 10^−5** (2.56 × 10^−5)
Percent hired on special conditions | — | 5.62 × 10^−6 (2.10 × 10^−5)
Percent certified special education | — | −1.59 × 10^−5 (1.06 × 10^−5)
Percent certified bilingual | — | 2.39 × 10^−5* (1.44 × 10^−5)

Constant | −0.0370*** (0.00508) | −0.0312*** (0.00538)
Observations | 81,727 | 81,727
R-squared | 0.129 | 0.130

Robust standard errors in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All regressions include year and school fixed effects.

The main findings for the effects of YRS on teacher composition in California, reported in Table 2 of the paper, are generally robust to variations in estimation. Table 10 reports the same specifications excluding the time-varying controls. Table 11 reports the main estimation run on a balanced panel of schools: due to new school construction over the period studied, along with some changes in the reporting of teacher variables in the later years of our sample, the number of schools varies across years in the main estimation sample used in the paper. Both of these checks show that the core findings of the paper remain. Most robust are the reductions in teaching experience resulting from YRS, while the estimates for lower average teacher education are more sensitive to the sample used.

In drawing comparisons between the findings for California and the findings for North Carolina discussed in Section 6 of the paper, we also test whether the timing of calendar changes could explain the differences in results between the two locations. If it did, the divergent findings would be driven by our choices of sample years rather than by truly different effects in the two locations. We test this by limiting the much longer sample of years used for California to the same years used in the WCPSS estimation. Results are reported in Table 12. Estimates are largely the same as those presented in Table 2 for California using the full sample: we find an increase in the number of teachers, more teaching overtime, and reduced experience levels. We also see same-signed evidence of lower education levels, though further from significance at conventional levels, and estimates on other credentials become insignificant. While some estimates change, the general findings for California, as well as the main differences in findings between California and WCPSS, do not appear to be driven simply by the different timing of calendar changes or the specific sample of schools used in the two locations.
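For concreteness, the sample restrictions behind Tables 11 and 12 could be imposed as in the sketch below, which reuses the hypothetical panel and variable names from the earlier sketch; the reghdfe call mirrors the table notes (school fixed effects, school-specific linear trends, year effects, and clustering by school).

```stata
* Minimal sketch of the sample restrictions used for Tables 11 and 12 (placeholder names).
use ca_school_panel.dta, clear

* Table 11: balanced panel of schools observed in every year from 1998 through 2008
* (assumes the panel spans exactly those 11 calendar years).
bysort schoolid: gen n_years = _N
keep if n_years == 11

* Table 12 instead keeps the full set of schools but only the WCPSS sample years:
* keep if inrange(year, 2006, 2010)

* Main specification: school FE, school-specific linear trends, year FE, clustered SEs.
reghdfe n_teachers mtyr styr pct_asian pct_black pct_hispanic pct_other ///
    pct_frpm pct_male computers_ps internet_ps charter,                 ///
    absorb(schoolid year schoolid#c.year) vce(cluster schoolid)
```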

C Data and Empirical Approach for Wake County, NC

Previous studies of YRS document differences in the estimated impact of calendar changes on student outcomes across settings (Graves, McMullen, and Rouse 2013). In order to explore the connection between teacher effects and the student impact of YRS, we offer similar estimation results for a sample from the Wake County Public School System (WCPSS) in Wake County, North Carolina. We demonstrate that, in this alternative setting, we do not see the same detrimental impact on teacher qualifications. This section of the Appendix provides a brief description of the North Carolina study location and a detailed discussion of the data and empirical approach used in the analyses underlying the estimation results presented in Table 3 of the paper.

The WCPSS currently serves over 150,000 students, making it the largest district in the state of North Carolina and the 16th largest in the United States. Enrollment in the WCPSS has grown substantially over the last few decades and is expected to increase by 40,000 students by 2022.[18] The YRS calendar was first implemented in the WCPSS in 1989. Since its adoption, use of the YRS calendar slowly increased in prevalence until the 2007–2008 school year, when the school system converted 22 schools from traditional to YRS calendars and ordered that all new schools be opened on a YRS calendar. In Wake County, NC, YRS calendars are multi-track: each school has four tracks of students, at least one of which is "tracked out" at any point in time. Moreover, Wake County uses only the 45-15 multi-track model of YRS, in which students rotate through 45 days in school followed by 15 days out of school.

The one-time large-scale calendar conversion in 2007–2008 more than doubled the number of schools using the calendar. The YRS policy change was a response to crowding created by rapid population growth. The school system selected the 22 schools that would switch calendars based on their level of crowding; the schools themselves had no choice over the conversion. Because the change was imposed on the selected schools, it was met with strong opposition from parents and was eventually contested before the state Supreme Court, which upheld the school system's policy. The policy environment surrounding the WCPSS and the mandatory nature of the calendar assignments create a natural experiment that can be exploited along with longitudinal data and fixed effects.

Data for WCPSS come from the North Carolina Education Research Data Center (NCERDC), a data center created in 2000 through a collaborative effort between Duke University and the North Carolina Department of Public Instruction. We combine these data with publicly available school-level information from WCPSS on demographics, achievement, and crowding. Because YRS calendars are only used in elementary and middle schools in Wake County, we eliminate high schools from our analysis.[19] In Figure 4, we show the growth in use of the YRS calendar in Wake County over the period of study. Each year corresponds to the spring of the academic year. In 2006, only about 13% of Wake's elementary and middle schools operated on a YRS calendar. The largest change occurred in 2007–2008, when the proportion of YRS schools increased from roughly 16% to a little over 34%. Since 2008, the share of schools operating on the schedule has increased slightly, to roughly 38%.
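As a minimal sketch of how the school-by-year analysis file might be assembled (file, identifier, and variable names here are illustrative rather than the actual NCERDC or WCPSS names):

```stata
* Minimal sketch of assembling the Wake County school-by-year file (placeholder names).
use ncerdc_wcpss_schools.dta, clear               // school-level aggregates built from NCERDC
merge 1:1 schoolid year using wcpss_public.dta    // public WCPSS demographics, achievement, crowding
keep if _merge == 3
drop _merge

* YRS calendars are used only in elementary and middle schools, so drop high schools.
drop if school_level == "High"

* Share of elementary/middle schools on a YRS calendar by year (compare with Figure 4).
tabulate year, summarize(mtyr)
```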

Because the calendar conversions in 2007–2008 were largely implemented to counteract crowding, YRSs tend to differ from their traditional-calendar counterparts. In Table 13, we present descriptive statistics of student, school and teacher variables by calendar type. Multi-track YRSs in Wake County have a higher percentage of white students and a lower percentage of students on the free and reduced-price meals program. Average reading achievement scores suggest there is little difference across calendar types; however, the passing rate on math exams is slightly higher in YRSs. There are also fewer crimes and long-term suspensions at YRSs. Because of their increased capacity to house students, YRSs are less crowded than their traditional-calendar counterparts and have more teachers.

While, on average, YRSs have more teachers, many teacher characteristics are very similar across calendar types. Average teacher experience at both YRS and traditional-calendar schools is around 12 years. YRSs appear to have a larger proportion of teachers with 4–10 years of experience, while traditional schools have higher proportions of both less experienced (0–4 years) and more experienced (11+ years) teachers. We do see a higher number of National Board Certified teachers in YRSs. Teachers in YRSs also earn roughly $2,000 more than their traditional-calendar counterparts.

Because our dataset includes all North Carolina public schools, we are also able to observe whether a teacher stays in his or her current school, moves to a new public school within the district, moves to a public school in North Carolina outside of WCPSS, or exits the sample entirely. We use this information to construct school-level aggregate measures of mobility, which appear in Panel D of Table 13. The mobility statistics suggest turnover is slightly lower in YRSs, a pattern that persists across all of the mobility measures.
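Because the NCERDC data cover all North Carolina public schools, teacher moves can be classified by destination before being aggregated to the school level. The sketch below illustrates one way to do this; the teacher-level file, identifiers, and variable names are hypothetical placeholders.

```stata
* Minimal sketch: classify teacher transitions by destination, then aggregate to schools.
use nc_teacher_panel.dta, clear                    // hypothetical teacher-by-year panel
bysort teacherid (year): gen next_school   = schoolid[_n+1]     // assumes numeric identifiers
bysort teacherid (year): gen next_district = districtid[_n+1]

gen byte stay        = (next_school == schoolid) if !missing(next_school)
gen byte move_within = (next_school != schoolid & next_district == districtid) if !missing(next_school)
gen byte move_out    = (next_district != districtid) if !missing(next_school)
gen byte exit_nc     = missing(next_school)        // leaves the NC public school sample;
                                                   // the final sample year must be excluded,
                                                   // since no next-year observation exists

* Collapse to school-by-year shares, the mobility outcomes summarized in Panel D of Table 13.
collapse (mean) stay move_within move_out exit_nc, by(schoolid year)
```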

Because of the natural experiment that mandated calendar conversions in WCPSS, specifications including year and school fixed effects are likely to address selection concerns and allow for estimation of a causal effect of YRS calendars on teacher outcomes. To see this visually, we plot teacher measures before and after the large-scale calendar conversion to show that there do not appear to be systematic time-varying differences between changers and non-changers. Figure 5 compares the number of teachers and average years teaching from 2006 to 2010 between schools that remain traditional across the entire period and the 22 converting schools that were switched to YRS in 2007–2008 (labeled 2008). The evidence in Panel A shows that the number of teachers, while different in levels between the two groups, remained relatively constant across the two types of schools during our sample period. Panel B shows the average number of years teaching by calendar type; likewise, the trends do not appear notably different prior to the 2007–2008 calendar conversion.
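A Figure 5-style comparison of raw means can be produced along the following lines; converter and always_traditional are hypothetical indicators for the 2007–2008 converters and the schools that remain traditional throughout.

```stata
* Minimal sketch of a Figure 5-style plot of raw means by year and converter status.
preserve
keep if converter == 1 | always_traditional == 1
collapse (mean) n_teachers avg_yrs_teach, by(year converter)
twoway (connected n_teachers year if converter == 0)                      ///
       (connected n_teachers year if converter == 1),                     ///
       xline(2008)                                                        ///
       legend(order(1 "Traditional throughout" 2 "Converted in 2007-08"))
restore
```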

To estimate the impact of YRS on teacher characteristics, qualifications and turnover, we begin by estimating the following general linear function at the school level[20]:

(2)   Y_{st} = α MTYR_{st} + δ S_{st} + φ_s + γ_t + ε_{st}

where Y_{st} is the outcome of interest (e.g. percent of licensed teachers, percent of highly experienced teachers, percent teacher turnover), MTYR_{st} is an indicator variable equal to one if school s operates on a multi-track year-round schedule at time t, S_{st} is a vector of school-level characteristics of school s at time t, φ_s is a school fixed effect, γ_t is a year fixed effect, and ε_{st} is an error term. This specification allows us to estimate the impact of YRS calendars on the composition of the stock of teachers. Our coefficient of interest is α, the coefficient on MTYR_{st}.
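To make the specification concrete, eq. (2) can be estimated with a high-dimensional fixed-effects routine such as reghdfe, which the California tables also use. The sketch below assumes a school-by-year file with hypothetical outcome and control names, and it applies the school-level clustering used in our California specifications.

```stata
* Minimal sketch of eq. (2) for WCPSS: school and year fixed effects, clustered SEs.
* Outcomes might include pct_licensed, pct_experienced, or pct_turnover (placeholder names),
* and the controls listed here are illustrative school-level characteristics.
reghdfe pct_turnover mtyr pct_frpm pct_black pct_hispanic pct_white ///
    crowding books_ps crimes_ps,                                    ///
    absorb(schoolid year) vce(cluster schoolid)
```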

It is important to note that YRS could have a differential impact on mobility according to the destination to which the teacher is moving. For example, we might reasonably assume that distaste for a YRS calendar might make it more likely that a teacher would make an in-district move than an out-of-district move, because the cost of making an in-district move would be lower. While a YRS calendar could conceivably cause a teacher to move to another North Carolina public school outside of Wake County, this decision might more likely be driven by something like a household move to another part of the state. Similarly, the impact of YRS on the decision to exit public school teaching – whether due to retirement or for some other reason (e.g. new career, move to another state, etc.) – is likely to be different than its impact on the decisions to move to another Wake school or to another school within North Carolina. To address this possibility, following Goldhaber, Gross, and Player (2011), we also estimate specifications using types of teacher transitions as alternative outcomes: the percent of teachers moving to a new school in the district, the percent moving to a school outside of the district and the percent leaving the NC system.
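Estimation by destination type then amounts to re-running the same specification with each mobility share as the dependent variable, for example:

```stata
* Minimal sketch: re-estimate eq. (2) for each destination-specific mobility share
* (placeholder outcome and control names, as above).
foreach y in pct_move_within pct_move_out pct_exit_nc {
    reghdfe `y' mtyr pct_frpm pct_black pct_hispanic pct_white ///
        crowding books_ps crimes_ps,                           ///
        absorb(schoolid year) vce(cluster schoolid)
}
```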

An additional concern one might have in comparing estimates across the two locations is that the identifying variation in NC comes exclusively from switches to YRS, while the identifying variation in CA largely comes from switches away from YRS, back to a traditional calendar. However, the temporary nature of YRS use in California makes the switch back to a traditional calendar, in most cases, a subsequent event driven by the school having switched to the calendar in the first place. In other words, the types of schools switching away from a YRS calendar are unlikely to be a highly selected subset of, or to differ substantially from, those that switched to a YRS calendar (before our data begin).

In addition to the leads and lags of mean teacher variables presented in Figure 5, and as further evidence regarding the potential for time-varying selection, we present an event-study-style graph from our main estimation (eq. (1)) that includes dummy variables denoting the time until and time after a YRS calendar change. This analysis is presented for North Carolina in Figure 6. We present evidence using those schools that changed from a traditional calendar to a multi-track YRS calendar as part of the large-scale calendar conversion in the 2007–2008 school year (referred to as 2008). Panels A and B of Figure 6 present estimates and confidence intervals for the outcome of the number of teachers in the school. As the time interactions are for schools changing in 2008, t = −2 is 2006, t = −1 is 2007, t = 0 is 2008, t = 1 is 2009 and t = 2 is 2010. The (positive) effect in the change year has been centered at zero to make comparisons across years straightforward. In Panel A, the comparison group includes both traditional calendar schools that do not change calendar and always-YRS schools that do not change in the observed time period (plus a few new schools that open during the period as year-round, and a few that change away from year-round). While Panel A therefore includes the full sample of schools used in estimation, Panel B restricts the comparison group for the 2008 converters to only those traditional calendar schools that do not experience a calendar change.
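The event-study estimates plotted in Figure 6 can be obtained by interacting an indicator for the 2008 converters with year dummies, normalizing on the conversion year; the sketch below shows one standard way to do this with placeholder names (converter2008, always_traditional).

```stata
* Minimal sketch of a Figure 6-style event study (placeholder names).
* converter2008 = 1 for schools switched to multi-track YRS in 2007-08; year 2008 is the base.
reghdfe n_teachers ib2008.year#1.converter2008 pct_frpm pct_black ///
    pct_hispanic pct_white crowding books_ps crimes_ps,           ///
    absorb(schoolid year) vce(cluster schoolid)

* Panel B/D-style comparison group: 2008 converters versus never-changing traditional schools.
* reghdfe ... if converter2008 == 1 | always_traditional == 1, absorb(schoolid year) vce(cluster schoolid)

* The interaction coefficients and confidence intervals can then be plotted (e.g. with coefplot).
```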

As one can see from the graphs, there does not appear to be any strong pre-trend. There does, however, appear to be a small spike in the year before the calendar conversion. This jump does not appear to reflect a general trend pre-dating the calendar change, but rather is more consistent with an anticipatory policy impact. If that is the case (though the estimate for t = −1 is only marginally statistically significant, and only in Panel B), then one might interpret the estimates on the number of teachers for North Carolina as underestimates (we estimate only a 4% increase in teachers in North Carolina compared to a 9% increase in California, relative to their respective means). In Figure 6, Panels C and D, comparable estimates for mean experience do not show a strong pre-trend either. However, in the reduced sample (Panel D), the estimate for t = −2 is statistically significant and positive. If one were to interpret this as a decreasing pre-trend, then our estimates of no change in teacher experience (reported in Table 3) would likely be downward biased. Regardless of whether the estimates for North Carolina (Table 3) are interpreted directly or as underestimates, any concerns about pre-trends do not explain the differences in effects found for our two locations. For California, we find additional teacher hiring with diminished experience (and possibly education levels), while for North Carolina we find additional teacher hiring with no change (or possibly positive effects) in teacher qualifications.

Figure 4:

Percent of Wake County Public Schools on each calendar type by year.

Note: Sample sizes are 116 schools in 2006, 121 schools in 2007, 122 schools in 2008, 126 schools in 2009 and 133 schools in 2010.

Figure 5:

Plot of mean teacher outcome measures surrounding the 2007–2008 calendar change year in Wake County, North Carolina.

Notes: Panel A shows a line graph through the mean number of teachers by year for schools that change calendar in the 2007–2008 school year compared to schools that remain on a traditional calendar through the entire sample. Panel B shows the same comparison for the mean of school average teaching experience. Since the large-scale calendar conversion in WCPSS happened in the 2007–2008 school year (2008), 2006 and 2007 on the graphs are years before the calendar change, while 2008 through 2010 are years after the calendar change. These graphs include all observations in each year, including new schools that open during the sample years.

Figure 6:

Estimates by year for North Carolina (WCPSS) calendar changes to multi-track YRS in the 2007–2008 school year (t = 0).

Notes: Estimates plotted in these graphs come from estimation including controls, year effects and school fixed effects, plus interactions of a dummy for schools that convert to multi-track year-round in the 2007–2008 school year with year effects. The coefficients and confidence intervals for those interactions are plotted in each graph, with the main effect for 2007–2008 centered at zero. Panels B and D restrict schools to those that convert to a multi-track YRS in the 2007–2008 school year and use as a comparison group only schools that remain traditional for the entire sample period. Panels A and C use the full sample, including other schools that do not change and new schools.

Table 8:

Estimation results for California, without school fixed effects or school-specific time trends.

Dependent variables, columns (1)–(5): (1) Number of teachers; (2) Percent working overtime; (3) Percent with Masters or PhD; (4) Average years teaching; (5) Average years in district.

Multi-track YRS calendar | 9.144*** (0.599) | −0.584*** (0.149) | −3.406*** (0.696) | −1.547*** (0.116) | −1.440*** (0.106)
Single-track YRS calendar | −0.249 (0.984) | −0.437 (0.275) | 11.06*** (1.013) | −0.272 (0.217) | 0.0450 (0.220)
Observations | 85,120 | 85,120 | 85,120 | 85,120 | 85,120
R-squared | 0.108 | 0.022 | 0.049 | 0.102 | 0.106

Dependent variables, columns (6)–(9): (6) Percent full credentials; (7) Percent special conditions; (8) Percent cert. special educ.; (9) Percent cert. bilingual educ.

Multi-track YRS calendar | 0.125 (0.369) | −0.308 (0.384) | −2.328*** (0.188) | 2.003*** (0.495)
Single-track YRS calendar | 4.125*** (0.400) | −3.769*** (0.385) | 3.138*** (0.558) | 4.022*** (0.818)
Observations | 85,120 | 85,120 | 85,120 | 85,120
R-squared | 0.906 | 0.249 | 0.213 | 0.170

Robust standard errors, clustered at the school level, in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All specifications include year effects and the following school control variables: student racial and gender composition, percent of students eligible for free and reduced price meals, computers and internet connected devices per student, and whether the school is a charter school.

Table 9:

Estimation results for California, including school fixed effects, but no school-specific time trends.

Dependent variables, columns (1)–(5): (1) Number of teachers; (2) Percent working overtime; (3) Percent with Masters or PhD; (4) Average years teaching; (5) Average years in district.

Multi-track YRS calendar | 7.506*** (0.329) | 0.237 (0.161) | −3.579*** (0.589) | −1.681*** (0.115) | −1.787*** (0.105)
Single-track YRS calendar | 0.0124 (0.461) | 0.781 (0.520) | 1.904* (1.052) | −0.196 (0.204) | −0.211 (0.161)
Observations | 85,120 | 85,120 | 85,120 | 85,120 | 85,120
R-squared | 0.960 | 0.396 | 0.705 | 0.670 | 0.705

Dependent variables, columns (6)–(9): (6) Percent full credentials; (7) Percent special conditions; (8) Percent cert. special educ.; (9) Percent cert. bilingual educ.

Multi-track YRS calendar | −2.869*** (0.466) | 2.886*** (0.471) | −0.549*** (0.199) | 0.644 (0.452)
Single-track YRS calendar | 3.063*** (0.673) | −3.061*** (0.671) | 0.00978 (0.612) | −0.244 (0.782)
Observations | 85,120 | 85,120 | 85,120 | 85,120
R-squared | 0.946 | 0.572 | 0.651 | 0.534

Robust standard errors, clustered at the school level, in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All specifications include year effects and the following school control variables: student racial and gender composition, percent of students eligible for free and reduced price meals, computers and internet connected devices per student, and whether the school is a charter school.

Table 10:

Estimation results for California including both school fixed effects and school-specific time trends, with no time-varying controls.

Dependent variables, columns (1)–(5): (1) Number of teachers; (2) Percent working overtime; (3) Percent with Masters or PhD; (4) Average years teaching; (5) Average years in district.

Multi-track YRS calendar | 6.182*** (0.430) | 0.655** (0.302) | −1.681*** (0.479) | −0.829*** (0.0932) | −0.780*** (0.0827)
Single-track YRS calendar | 0.0617 (0.436) | 1.361** (0.694) | −0.689 (0.650) | −0.497*** (0.172) | −0.478*** (0.133)
Observations | 84,743 | 84,743 | 84,743 | 84,743 | 84,743
R-squared | 0.979 | 0.469 | 0.821 | 0.812 | 0.841

Dependent variables, columns (6)–(9): (6) Percent full credentials; (7) Percent special conditions; (8) Percent cert. special educ.; (9) Percent cert. bilingual educ.

Multi-track YRS calendar | 9.365*** (2.046) | −1.280*** (0.410) | −0.516 (0.361) | −0.388 (0.664)
Single-track YRS calendar | 17.36*** (2.785) | −1.055* (0.600) | 1.852** (0.815) | 1.170 (1.022)
Observations | 84,743 | 84,743 | 84,743 | 84,743
R-squared | 0.498 | 0.733 | 0.590 | 0.524

Robust standard errors, clustered at the school level, in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All specifications are estimated using the Stata command reghdfe, which allows for estimation with a large dummy variable set as well as that dummy variable set interacted with a linear trend. All specifications include year effects.

Table 11:

Estimation results for California including both school fixed effects and school-specific time trends (see Table 2), balanced panel of schools only.

Dependent variables, columns (1)–(5): (1) Number of teachers; (2) Percent working overtime; (3) Percent with Masters or PhD; (4) Average years teaching; (5) Average years in district.

Multi-track YRS calendar | 4.702*** (0.456) | 0.462** (0.189) | −0.414 (0.553) | −0.694*** (0.0928) | −0.648*** (0.0790)
Single-track YRS calendar | −0.441 (0.607) | −0.302 (0.426) | 1.052 (0.850) | −0.232 (0.164) | −0.235* (0.131)
Observations | 56,089 | 56,089 | 56,089 | 56,089 | 56,089
R-squared | 0.986 | 0.554 | 0.856 | 0.850 | 0.868

Dependent variables, columns (6)–(9): (6) Percent full credentials; (7) Percent special conditions; (8) Percent cert. special educ.; (9) Percent cert. bilingual educ.

Multi-track YRS calendar | 0.806* (0.428) | −1.191** (0.467) | 0.282 (0.212) | 1.331* (0.773)
Single-track YRS calendar | 0.585 (0.593) | 0.0987 (0.618) | −0.156 (0.382) | −1.202 (1.290)
Observations | 56,089 | 56,089 | 56,089 | 56,089
R-squared | 0.779 | 0.767 | 0.730 | 0.665

Robust standard errors, clustered at the school level, in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All specifications are estimated using the Stata command reghdfe, which allows for estimation with a large dummy variable set as well as that dummy variable set interacted with a linear trend. All specifications include year effects and the following school control variables: student racial and gender composition, percent of students eligible for free and reduced price meals, computers and internet connected devices per student, and whether the school is a charter school. The sample has been limited to only those schools that are observed in each year between 1998 and 2008.

Table 12:

Estimates for California in Table 2, repeated for the same sample years as used for WCPSS, 2006–2010.

Dependent variables, columns (1)–(5): (1) Number of teachers; (2) Percent working overtime; (3) Percent with Masters or PhD; (4) Average years teaching; (5) Average years in district.

Multi-track YRS calendar | 3.343*** (0.493) | 0.911*** (0.308) | −1.263 (1.121) | −0.335** (0.138) | −0.274** (0.115)
Single-track YRS calendar | −0.943** (0.409) | 0.724 (0.598) | −0.297 (0.852) | −0.482** (0.229) | −0.345** (0.171)
Observations | 33,573 | 33,573 | 33,573 | 33,573 | 33,573
R-squared | 0.993 | 0.741 | 0.917 | 0.918 | 0.934

Dependent variables, columns (6)–(9): (6) Percent full credentials; (7) Percent special conditions; (8) Percent cert. special educ.; (9) Percent cert. bilingual educ.

Multi-track YRS calendar | −0.365 (0.400) | −0.302 (0.545) | 0.232 (0.601) | −2.728 (1.796)
Single-track YRS calendar | −0.486 (0.475) | 0.732 (0.626) | 0.0152 (0.718) | −1.912 (2.068)
Observations | 33,573 | 33,573 | 33,573 | 33,573
R-squared | 0.989 | 0.755 | 0.880 | 0.791

Robust standard errors, clustered at the school level, in parentheses. ***p < 0.01, **p < 0.05, *p < 0.1.

Notes: All specifications are estimated using the Stata command reghdfe, which allows for estimation with a large dummy variable set as well as that dummy variable set interacted with a linear trend. All specifications include year effects and the following school control variables: student racial and gender composition, percent of students eligible for free and reduced price meals, computers and internet connected devices per student, and whether the school is a charter school.

Table 13:

Wake County, NC student, school and teacher characteristics, summarized separately by calendar type.

Each row reports: Traditional, mean (SD), N | Multi-track YR, mean (SD), N | t-stat of difference^b.

A. Student variables
Percent students, FRPM | 0.359 (0.141), N=441 | 0.270 (0.158), N=176 | 0.423
Percent black | 0.294 (0.140), N=441 | 0.202 (0.142), N=176 | 0.465
Percent Hispanic | 0.128 (0.074), N=441 | 0.128 (0.083), N=176 | −0.001
Percent white | 0.475 (0.160), N=441 | 0.560 (0.188), N=176 | −0.347
Percent of students passing end-of-grade reading tests | 0.777 (0.135), N=441 | 0.774 (0.131), N=176 | 0.013
Percent of students passing end-of-grade math tests | 0.770 (0.096), N=441 | 0.839 (0.091), N=176 | −0.524

B. School variables
Average daily attendance | 0.955 (0.008), N=441 | 0.957 (0.006), N=176 | −0.209
Annual yearly progress | 0.420 (0.494), N=441 | 0.534 (0.500), N=176 | −0.163
Students/instructional computer | 3.191 (0.988), N=441 | 3.620 (0.932), N=176 | −0.316
Percent crowding, includes mobile classrooms | 101.924 (14.650), N=441 | 88.634 (15.076), N=176 | 0.632
Books/student | 17.218 (5.668), N=441 | 17.089 (5.244), N=176 | 0.017
Crimes per 100 students | 0.512 (0.763), N=441 | 0.261 (0.641), N=176 | 0.252
Number of long-term suspensions | 0.329 (0.744), N=441 | 0.108 (0.392), N=176 | 0.263
Percent poverty | 0.422 (0.168), N=441 | 0.307 (0.172), N=176 | 0.476

C. Teacher characteristics
Number of teachers | 52.095 (13.345), N=441 | 56.761 (12.396), N=176 | −0.256
Percent teacher turnover | 0.229 (0.092), N=441 | 0.184 (0.078), N=176 | 0.369
Percent of teachers, lateral entry | 0.018 (0.025), N=345 | 0.011 (0.020), N=102 | 0.206
Percent of teachers, 0–4 years experience | 0.231 (0.094), N=441 | 0.215 (0.083), N=176 | 0.123
Percent of teachers, 4–10 years experience | 0.322 (0.081), N=441 | 0.358 (0.072), N=176 | −0.331
Percent of teachers, 11+ years experience | 0.448 (0.113), N=441 | 0.427 (0.105), N=176 | 0.132
Percent of teachers, advanced degrees | 0.298 (0.082), N=441 | 0.300 (0.082), N=176 | −0.017
Percent of teachers, fully licensed | 0.965 (0.042), N=441 | 0.981 (0.031), N=176 | −0.294
Percent of classes taught by highly qualified teachers | 0.980 (0.044), N=441 | 0.989 (0.032), N=176 | −0.157
Number of teachers, National Board Certified | 5.849 (3.694), N=437 | 9.540 (6.075), N=176 | −0.519
Annual salary^a | 37,881 (9,783), N=14,479 | 39,890 (10,441), N=7,059 | −0.140
Years of experience^a | 12.519 (8.689), N=14,479 | 12.104 (8.412), N=7,059 | 0.034
Percent of teachers with Masters^a | 0.344 (0.475), N=14,479 | 0.337 (0.473), N=7,059 | 0.011
Percent of teachers with advanced degree or PhD^a | 0.006 (0.079), N=14,479 | 0.006 (0.076), N=7,059 | 0.004

References

Anderson, D. M., and M. B. Walker. 2015. "Does Shortening the School Week Impact Student Performance? Evidence from the Four-Day School Week." Education Finance and Policy 10 (3): 314–349. doi:10.1162/EDFP_a_00165.

Angrist, Joshua D., and Jorn-Steffen Pischke. 2009. Mostly Harmless Econometrics: An Empiricist's Companion. Princeton, NJ: Princeton University Press. doi:10.1515/9781400829828.

Barnes, G., E. Crowe, and N. Schaefer. 2007. The Cost of Teacher Turnover in Five School Districts: A Pilot Study. New York, NY: National Commission on Teaching and America's Future.

Bellei, C. 2009. "Does Lengthening the School Day Increase Students' Academic Achievement? Results from a Natural Experiment in Chile." Economics of Education Review 28 (5): 629–640. doi:10.1016/j.econedurev.2009.01.008.

Blankenship, T. 1984. "Update: These School Systems Swear by the Four-Day School Week because Students Work Harder and Face Fewer Distractions." The American School Board Journal 171 (8): 32–33.

Buddin, R., and G. Zamarro. 2009. "Teacher Qualifications and Student Achievement in Urban Elementary Schools." Journal of Urban Economics 66: 103–115. doi:10.1016/j.jue.2009.05.001.

California Department of Education (CDE). 2014. Year-Round Education Program Guide. Accessed February 28, 2014. http://www.cde.ca.gov/ls/fa/yr/guide.asp.

Clotfelter, Charles T., Helen F. Ladd, and Jacob L. Vigdor. 2007. "Teacher Credentials and Student Achievement: Longitudinal Analysis with Student Fixed Effects." Economics of Education Review 26 (6): 673–682. doi:10.1016/j.econedurev.2007.10.002.

Cooper, Harris, Jeffrey C. Valentine, Kelly Charlton, and April Melson. 2003. "The Effects of Modified School Calendars on Student Achievement and on School and Community Attitudes." Review of Educational Research 73 (1): 1–52. doi:10.3102/00346543073001001.

Croninger, R., J. King Rice, A. Rathbun, and M. Nishio. 2007. "Teacher Qualifications and Early Learning: Effects of Certification, Degree, and Experience on First-Grade Student Achievement." Economics of Education Review 26: 312–324. doi:10.1016/j.econedurev.2005.05.008.

Daneshvary, Nasser, and T. M. Clauretie. 2001. "Efficiency and Costs in Education: Year-Round versus Traditional Schedules." Economics of Education Review 20: 279–287. doi:10.1016/S0272-7757(00)00010-8.

Darling-Hammond, L. 2000. "Teacher Quality and Student Achievement: A Review of State Policy Evidence." Education Policy Analysis Archives 8 (1): 1–44. http://epaa.asa.edu/epaa/v8n1/. doi:10.14507/epaa.v8n1.2000.

Donis-Keller, C., and D. L. Silvernail. 2009. Research Brief: A Review of the Evidence on the Four-Day School Week. Portland, ME: Center for Education Policy, Applied Research and Evaluation, University of Southern Maine.

Gandara, Patricia. 1992. "Extended Year, Extended Contracts: Increasing Teacher Salary Options." Urban Education 27 (3): 229–247. doi:10.1177/0042085992027003002.

Gilpin, Gregory. 2017. School Capacity, Calendar Conversions, and Teachers' Secondary Employment. Unpublished manuscript. Bozeman, MT: Montana State University–Bozeman.

Goldhaber, D., B. Gross, and D. Player. 2011. "Teacher Career Paths, Teacher Quality, and Persistence in the Classroom: Are Public Schools Keeping Their Best?" Journal of Policy Analysis and Management 30: 57–87. doi:10.1002/pam.20549.

Graves, J. 2010. "The Academic Impact of Multi-Track Year-Round School Calendars: A Response to School Overcrowding." Journal of Urban Economics 67: 378–391. doi:10.1016/j.jue.2009.11.004.

Graves, J. 2011. "Effects of Year-Round Schooling on Disadvantaged Students and the Distribution of Standardized Test Performance." Economics of Education Review 30: 1281–1305. doi:10.1016/j.econedurev.2011.04.003.

Graves, J. 2013a. "School Calendars, Child Care Availability and Maternal Employment." Journal of Urban Economics 78: 57–70. doi:10.1016/j.jue.2013.07.004.

Graves, J. 2013b. "The Effects of School Calendar Type on Maternal Employment across Racial Groups: A Story of Child Care Availability." American Economic Review 103 (3): 279–283 (Papers and Proceedings). doi:10.1257/aer.103.3.279.

Graves, J., S. McMullen, and K. Rouse. 2013. "Multi-Track Year-Round Schooling as Cost Savings Reform: Not Just a Matter of Time." Education Finance and Policy 8: 300–315. doi:10.1162/EDFP_a_00097.

Guin, K. 2004. "Chronic Teacher Turnover in Urban Elementary Schools." Education Policy Analysis Archives 12 (42): 1–30. Retrieved July 9, 2018 from http://epaa.asu.edu/epaa/v12n42/. doi:10.14507/epaa.v12n42.2004.

Hanushek, Eric A., John F. Kain, Daniel M. O'Brian, and Steven G. Rivkin. 2005. "The Market for Teacher Quality." NBER Working Paper 11154. National Bureau of Economic Research. doi:10.3386/w11154.

Hanushek, Eric A., John F. Kain, and Steven G. Rivkin. 2004. "Why Public Schools Lose Teachers." Journal of Human Resources 39 (2): 326–354. doi:10.2307/3559017.

Haser, S. G., and I. Nasser. 2003. "Teacher Job Satisfaction in a Year-Round School." Educational Leadership 60: 65–67.

Hincapie, D. 2016. "Do Longer School Days Improve Student Achievement? Evidence from Colombia." IDB Working Paper Series (Social Sector, Education Division), IDB-WP-679. doi:10.18235/0000268.

Jensen, V. M. 2013. "Working Longer Makes Students Stronger? The Effects of Ninth Grade Classroom Hours on Ninth Grade Student Performance." Educational Research 55 (2): 180–194. doi:10.1080/00131881.2013.801244.

Jepsen, C., and S. Rivkin. 2009. "Class Size Reduction and Student Achievement." Journal of Human Resources 44: 223–250. doi:10.3368/jhr.44.1.223.

Kneese, C. 2000. Teaching in Year-Round Schools. ERIC Digest. Washington, DC: ERIC Clearinghouse on Teaching and Teacher Education. http://ericae.net/edo/ed449123.htm.

Koki, S. 1992. Modified School Schedules: A Look at the Research and the Pacific. Honolulu, HI: Pacific Region Educational Lab. Retrieved July 9, 2018 from ERIC Document Reproduction Service, ED 024707. https://eric.ed.gov/?id=ED354630.

Kreitzer, A., and G. Glass. 1993. Policy Considerations in Conversion to Year-Round Schools (Policy Brief No. 1). Tempe, AZ: Education Policy Studies Laboratory, College of Education, Arizona State University. http://www.gvglass.info/papers/yrs.html.

Loeb, S., L. Darling-Hammond, and J. Luczak. 2005. "How Teaching Conditions Predict Teacher Turnover in California Schools." Peabody Journal of Education 80: 44–70. doi:10.1207/s15327930pje8003_4.

McMullen, S. C., K. Rouse, and J. Haan. 2015. "The Distributional Effects of the Multi-Track Year-Round Calendar: A Quantile Regression Approach." Applied Economics Letters 22: 1188–1192. doi:10.1080/13504851.2015.1016204.

McMullen, S. C., and K. E. Rouse. 2012a. "School Crowding, Year-Round Schooling and Mobile Classroom Use: Evidence from North Carolina." Economics of Education Review 31: 812–823. doi:10.1016/j.econedurev.2012.05.005.

McMullen, S. C., and K. E. Rouse. 2012b. "The Impact of Year-Round Schooling on Academic Achievement: Evidence from Mandatory School Calendar Conversions." American Economic Journal: Economic Policy 4: 230–252. doi:10.1257/pol.4.4.230.

Merino, B. 1983. "The Impact of Year-Round Schooling: A Review." Urban Education 18: 298–316. doi:10.1177/004208598301800303.

Mitchell, D., and R. Mitchell. 2005. "Student Segregation and Achievement Tracking in Year-Round Schools." Teachers College Record 107: 529–562. doi:10.1177/016146810510700401.

National Association for Year Round Education (NAYRE). 2006. "Creating More Time for Learning: Symposia Powerpoint Presentation." Accessed August 24, 2012. www.nayre.org/2006conference.htm.

National Association for Year Round Education (NAYRE). 2007. "Statistical Summaries of Year-Round Education Programs 2006–2007." Accessed August 24, 2012. www.nayre.org/STATISTICAL%20SUMMARIES%20OF%20YRE%202007.pdf.

Nelson, S. R. 1983. An Evaluation of Sheridan County School District Alternative Schedule, 1982–83. Portland, OR: Northwest Regional Education Lab.

Rivkin, S., E. Hanushek, and J. Kain. 2005. "Teachers, Schools and Academic Achievement." Econometrica 73: 415–458. doi:10.1111/j.1468-0262.2005.00584.x.

Rockoff, J. 2004. "The Impact of Individual Teachers on Student Achievement: Evidence from Panel Data." American Economic Review 94: 247–252. doi:10.1257/0002828041302244.

Sanders, W., and J. Rivers. 1996. "Cumulative and Residual Effects of Teachers on Future Student Academic Achievement." Research Progress Report. Knoxville, TN: University of Tennessee Value-Added Research and Assessment Center.

Shields, C. M., and S. L. Oberg. 2000. Year Round Schooling: Promises and Pitfalls. Boston, MA: Scarecrow Press.

Skinner, R. R. 2014. Year Round Schools: In Brief. Report 43588. Washington, DC: Congressional Research Service.

Smith, A. 2011. "Are Year-Round Schools a Viable Option for Improving Student Achievement, Combating Summer Learning Loss in Disadvantaged Youth, Controlling Expenses, and Reducing Teacher Burnout?" Diss., University of North Carolina at Chapel Hill, Chapel Hill, NC.

Thompson, Paul. 2018. "Effects of Four-Day School Weeks on Achievement: Evidence from Oregon." Working paper, Department of Economics, Oregon State University. doi:10.2139/ssrn.3390191.

von Hippel, P. T. 2016. "Year-Round School Calendars: Effects on Summer Learning, Achievement, Parents, Teachers, and Property Values." In The Summer Slide: What We Know and Can Do about Summer Learning Loss, edited by K. Alexander, S. Pitcock and M. Boulay, Chapter 13. New York: Teachers College Press.

Worthen, B. R., and S. W. Zsiray. 1994. What Twenty Years of Educational Studies Reveal about Year-Round Education. Chapel Hill, NC: North Carolina Educational Policy Research Center.

Published Online: 2018-07-21

© 2018 Walter de Gruyter GmbH, Berlin/Boston
