Letters to the Editor

Capital District Business Review

The annual ranking of seventy-nine schools in the capital region and the supporting stories (April 29-May 5 issue) deserve credit for relating the Business Review effort to important educational issues surrounding property taxation and the upcoming Report Cards on school districts. It is quite an improvement over earlier attempts to rationalize district ratings by pointing to a software program used in another part of the state or to helping citizens buy homes in the best districts.

However, as a source cited in the articles, I must make the following comments about my own work and how it relates to your rating approach.

1) In my work, the seventy-nine districts of the capital region are part of a larger 649-district study focused upon the implications of trying to document Regents academic productivity and to speculate about organizational problems in implementing a statewide reform based upon an all-Regents emphasis. The discussion has been underway throughout the spring semester as part of a post-master's graduate seminar at the University at Albany. Readers with World Wide Web access and an interest in the actual seminar format can find detailed information at http://www.albany.edu/~dkw42/links.html. My ratings of Regents performance in all seventy-nine capital region districts, and the ratings of the top eighty-six Regents districts statewide, are available at the web site.

During the semester we compared last year's Business Review rating of capital region districts to the results of my rating method. Your rating included four Regents courses and diploma production as part of nine measures, while I assess Regents diploma production for eight years (since 1987-88) and each of nine Regents subjects for four years (since 1990-91). I use an assessment threshold based upon the percent of students actually taking the year-end test and an 80 percent passing rate in tandem, while you use the State Education Department percent passing as an estimate of average enrollment and a 100-point calculation of nine measures. You ask districts with variances for their scores, while my method removes courses with variances from the calculation but notes their existence. As for the direct comparison of my work with the Business Review effort to rank districts in the capital region, the actual relationship of our assessments for all seventy-nine districts is a seventy percent overlap for the most recent 1994-1995 information, and essentially the same relationship holds between a four-year aggregation of your rankings and my total accumulated rating.
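To make the threshold idea concrete, the sketch below scores a hypothetical district the way the paragraph above describes: a Regents subject earns credit only when the test-taking rate and the passing rate clear their cutoffs in tandem. The 80 percent passing rate comes from the text; the taking-rate cutoff and all of the district figures are invented purely for illustration.

```python
# Illustration only: the letter states an 80 percent passing threshold but does
# not give the exact taking-rate cutoff, so the 70 percent below is an assumption.

def meets_threshold(pct_taking, pct_passing,
                    taking_cutoff=70.0, passing_cutoff=80.0):
    """A subject earns credit only when both rates clear their cutoffs."""
    return pct_taking >= taking_cutoff and pct_passing >= passing_cutoff

# Hypothetical district: subject -> (percent taking year-end test, percent passing)
subjects = {
    "English": (85.0, 88.0),
    "Global Studies": (62.0, 91.0),    # fails on taking rate
    "Sequential Math I": (78.0, 74.0), # fails on passing rate
}

points = sum(1.0 for taking, passing in subjects.values()
             if meets_threshold(taking, passing))
print(f"Regents points this year: {points} of {len(subjects)}")
```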

What this difference suggests for the top-scoring districts under each scheme is presented below. North Colonie continues to be the best Regents-performing district, with Voorheesville second. For my grand total based upon multiple-year performance, Niskayuna could conceivably tie Voorheesville for second place if the three variances operating in Regents subjects were counted as points. Bethlehem is clearly the fourth district in the region and Burnt Hills-Ballston Lake fifth. In comparing last year to the accumulated totals, both Mechanicville and Shenendehowa would have scored 2.5 Regents points for 1994-1995, but Mechanicville would remain the sixth most Regents-productive district. Queensbury, Averill Park (with two variances) and Guilderland could claim to be the most improved of the top-performing districts.

For 1994-1995 Data Only                             Business Review 4-Yr Total & My Grand Total
Business Review (of 900)  My Data (10.0 possible)   Business Review (of 3600)   My Data (38 possible)
Voorheesville (794)       North Colonie (8.0)       Voorheesville (3040)        North Colonie (26.0)
Averill Park (785)        Voorheesville (6.0)       Averill Park (3015)         Voorheesville (19.5)
North Colonie (775)       Queensbury (5.0)          North Colonie (2951)        Niskayuna (16.5)***
Niskayuna (767)           Averill Park (tie 4.5)    Averill Park (2943)         Bethlehem (15.0)*
Shenendehowa (758)        Bethlehem (tie 4.5)       Shenendehowa (2921)         Burnt Hills (13.0)
Burnt Hills (744)         Burnt Hills (4.0)         Burnt Hills (2911)          Mechanicville (12.0)
Chatham (739)             Argyle (tie 3.0)          Niskayuna (2907)            Averill Park (11.0 tie)**
Guilderland (738)         Guilderland (tie 3.0)     Queensbury (2850)           Queensbury (11.0 tie)
Bethlehem (734)           Schodack (tie 3.0)        Lake George (2811)          North Warren (9.5)
Queensbury (734)          Bolton (tie 3.0)          Chatham (2794)              Guilderland (9.0 tie)**
Salem (733)               Fort Edwards (tie 3.0)    Bethlehem (2790)+           Chatham (9.0 tie)*
Cambridge (729)           Duanesburg (tie 3.0)      Cambridge (2784)            Shenendehowa (8.5 tie)*
Mechanicville (729)       -                         Schodack (2780)             Schodack (8.5 tie)
Schodack (716)            -                         Mechanicville (2777)        Bolton (8.0 tie)
Johnsburg (716)           -                         East Greenbush (2777)       Windham-Ashland (8.0 tie)
Lake George (708)         -                         Scotia-Glenville (2723)     Cambridge (8.0 tie)
East Greenbush (706)      -                         Salem (2709)                Fort Edwards (8.0 tie)
Galway (703)              -                         South Colonie (2702)        Lake George (6.5 tie)
Argyle (703)              -                         Johnstown (2702)            Galway (6.5 tie)
Schalmont (702)           -                         Windham-Ashland (2665)      Brunswick (8.5 tie)

+ indicates a recalculation of original Business Review ratings for the first three years.

* indicates the number of approved variances in Regents subjects for 1993-94 or 1994-95.

Each of our differences in methodological approach has serious implications for how a district is assessed in secondary Regents performance. Such questions need to be addressed in serious conversations about district Report Cards and the full implementation of the Regents curriculum by the year 2003. Your article cited me as stating that the "horse race" among top districts for the best rankings has been the predominant form of political "heat" shown so far, and that "to date, bottom districts have voiced little objection" to being rated. Let me clarify that. Excellent districts sitting next to one another are intensely competitive and have communities that expect each one to be #1. Academic performance comparisons often become another form of interscholastic sports or of trying for Olympic gold. This was true of the capital region's response to previous Business Review ratings and of Long Island's response last fall to a Newsday piece about my method. While such "horse racing" is bound to continue, a betting person would argue that the most political "heat" generated in the next few years, when an all-Regents curriculum is implemented statewide, will not be centered in these jurisdictions. My analysis of 649 districts, for example, identified over one hundred districts that failed to meet thresholds in any Regents measure during all the years studied. Thirteen of the seventy-nine capital region districts are in such a situation. Certainly, the "horse race" to meet state-level implementation demands in such situations will generate "heat" about Regents performance at least as fierce as the present volatility among top contenders. Further, the nature of that "heat" will not be delimited by choosing English or any other particular subject to start statewide improvement and avoid "school administrator wrath."

In terms of pure methodology, I do believe secondary school Regents productivity should be kept as separate and clear-cut as possible. Some of the measures you include in the Business Review composite are more legitimately descriptor measures that may or may not vary systematically with the dependent variable of Regents performance. Those relationships should be the primary concern of your research agenda, not the assumption of what makes a package of nine measures. In that light, the quote attributed to me that the Business Review method (with its inclusion of dropout and pupil/teacher ratio) was a better predictor of a district's ability to prepare students for Regents exams than the SUNY Buffalo effort to mesh student achievement with socio-economic features is a bit misleading. Both approaches are wrong to roll Regents performance into some "composite" or "compensation" measure and pretend it becomes a better explanatory indicator by such action.

Academic measures from middle and elementary school performance; organizational measures such as dropout rate, suspension rate, or the percent of money spent on teaching; community wealth capacity measured by census poverty index, free and reduced lunch eligibility, or combined wealth ratio; and community wealth effort measured by tax rate per $1,000 or full assessment per enrolled pupil are only descriptors that might vary systematically with Regents performance. In any particular set of districts, such possible relationships need to be examined as researchable hypotheses about what actual patterns can be described in terms of statistical significance. We spent a semester doing just such analyses and, yes, last year's Business Review rating did do a better job of "fitting" with the 1993-1994 Regents diploma output than the Buffalo rating effort did. Readers who wish to see the regression and analysis of variance results arguing that last year's Business Review ratings vary with free and reduced lunch eligibility, full valuation of real property per enrolled pupil, and the 1992-1993 Regents diploma percent in the capital region can find them at http://www.albany.edu/~dkw42/t21.html.
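For readers who want a sense of what that kind of regression looks like, here is a minimal sketch in Python. The predictors are the three named above (free and reduced lunch eligibility, full valuation per enrolled pupil, and prior-year Regents diploma percent), but every number in it is invented for illustration; the actual results are at the URL just cited.

```python
# Illustrative sketch only: hypothetical figures for a handful of districts.
import numpy as np

# Predictors: free/reduced lunch eligibility (%), full valuation per enrolled
# pupil (in $000s), and 1992-1993 Regents diploma percent.
X = np.array([
    [12.0, 310.0, 48.0],
    [35.0, 180.0, 31.0],
    [ 8.0, 420.0, 55.0],
    [22.0, 250.0, 40.0],
    [40.0, 150.0, 27.0],
])
# Hypothetical Business Review composite ratings (of 900) for the same districts.
y = np.array([780.0, 640.0, 805.0, 710.0, 605.0])

# Ordinary least squares with an intercept term.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coef
ss_res = np.sum((y - predicted) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print("coefficients (intercept, lunch, valuation, diploma %):", coef.round(2))
print("R-squared:", round(1 - ss_res / ss_tot, 3))
```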

In summary, the method questions behind many issues of practical importance often turn on the ultimate reason data are collected and presented in the first place. The Business Review wants to rank capital region districts each year. My purpose remains to assess the implications of demanding a common reform expectation of Regents performance against a commonly understood meaning of actual Regents behavior. My data about district performance were used to estimate how districts of similar or equivalent Regents performance could be developed into networks for helping each other in Regents-driven reform. The graduate seminar at Albany considered the practical aspects of achieving such cooperation when implementing a statewide reform agenda driven by Regents expectations. I hope to continue such conversations.

David Wiles

Professor of Educational Administration and Policy Studies

Date sent: April 30, 1996

cc: Tim Aurentz, Claire Hughes