Wednesday, October 10, 2007

School Beat: Hammann Report: Summary of Conclusions from Data and Possible Recommendations

To: Heidi Carvin, Brian Cashore, Leslie Ferrell, Lou Havlik, Randy Keister, Vicki Lecy-Luebke, (remaining staff members of 2003 Academic Excellence Team), Marissa Morstad, Krista Jones, Linda Rehfeldt, Marilyn Brink, Deb Arnold, Bob Flaherty (non-discrimination self-evaluation report team), Michael Pierick, Dennis Hatfield, Mike Larson, Tina Rossmiller, Art Phillips, Jeff Herbers (Evansville School Board).

Cc: Jamie Gillespie, Kitty Verkuilen, Deb Fritz, Sarah Champeau, Teresa Ellison.

From: Melissa Hammann

Subject: Comments regarding: Pupil Nondiscrimination Self-Evaluation Report, March 2007

Excel File Attachments: wkcedata, wkcehist and actvwkce

I agree that self-evaluation regarding non-discrimination in district policy and action is an important ongoing process. However, I am concerned about the emphasis placed on superficial policy compliance and statistical evaluations. Such statistics are of marginal value without proper context and without careful application toward determining root causes where apparent inequity exists. Furthermore, I feel our focus should be centered on the nearly universal loss in proficiency between Grade 8 and Grade 10 WKCE results, as well as the gender inequities in these scores. Downward trends in the WKCE data since 2002 are also a cause for concern. In addition, Evansville's consistently low ACT results compared to the statewide results are particularly troubling, since our small district has to help our students outperform the state average for them to be considered competitive. These data illustrate a problem in our district which, in the nine years we have resided here, has yet to be adequately addressed. The problem persists even though the Academic Excellence Committee began working on this specific issue in the fall of 2001.

Let's first address scholarship award distributions by gender, an item of concern in the report. On page 10, at first blush, it appears that scholarship awards are highly skewed toward young women, who accounted for 66% of the awards in 2006, down from 82% in 2004. The first question that leaps to mind is, "What percentage of college-bound seniors from Evansville are girls and what percentage are boys?" Evaluating these statistics is meaningless without that piece of information. Should the scholarship award gender distribution reflect the class demographics or the college-bound student demographics? The latter, to be sure. Also, the selection criteria for the scholarships are not spelled out in this report. If grades are the only criterion, and the girls are outperforming the boys on grades by 2:1, then that too needs to be factored into the discussion, and your 66/34 distribution makes sense. The gender ratio of the senior class also bears on the statistics. For example, in 2004-2005, the overall student body was 52.3% boys and 47.7% girls, while the scholarship distribution for college-bound seniors was 18% male and 82% female. Based on demographics, your scholarship award ratio is even more skewed than represented. The gender breakdown for seniors only was not provided for that year, so that is as far as the analysis can go at this point. The percentage of each gender in the class who took the ACT also factors into this discussion. Beginning at a dismal 29% of the boys and a whopping 81% of the girls in the 99-00 graduating class, we have improved the boys' statistic to 53% taking the ACT in 05-06. However, the girls' participation rate has dropped from 81% to 65% in the class of 05-06. Probably more important than the number of scholarships is the dollar value of the scholarships. How does the money distribute by gender when corrected for the college-bound gender ratio? Are the scholarships distributed over a fairly large number of students, or are a select few students cleaning up? Finally, shouldn't we be most concerned that scholarships go to the students who most deserve and need them, regardless of gender?
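
To make the comparison concrete, here is a small Python sketch of the calculation I am suggesting: the girls' share of awards one would expect from the college-bound population versus the share actually awarded. The counts in it are invented placeholders, not district figures.

def award_shares(college_bound_girls, college_bound_boys, awards_girls, awards_boys):
    # Returns (expected girls' share, actual girls' share) as percentages.
    expected = 100.0 * college_bound_girls / (college_bound_girls + college_bound_boys)
    actual = 100.0 * awards_girls / (awards_girls + awards_boys)
    return expected, actual

# Made-up example: 45 college-bound girls and 40 college-bound boys,
# with 33 awards going to girls and 17 to boys.
expected, actual = award_shares(45, 40, 33, 17)
print(f"Expected girls' share: {expected:.0f}%, actual girls' share: {actual:.0f}%")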

There is another section on page 10 of the report entitled "Other Forms of Recognition." The report seems to take pride in the fact that these data are more balanced toward an "equal" gender distribution. However, the middle school 2005-2006 data are unbalanced from the perspective of extracurricular activity demographics. How can it be egalitarian to award 51% of the recognition awards to boys when only 41% of the extracurricular participants are boys? The expected award distribution would be 41% boys and 59% girls. The high school award distribution for the same year (45% male/55% female) is much closer to the extracurricular participant distribution (43% male/57% female). Once again, of greater concern is the distribution of the funds used to support the extracurricular activities of boys and girls. It is doubtful that 57% of the funds allocated for extracurricular activities at the high school are funneled to activities with predominantly female participants.

Now let's focus on the WKCE results. The three years of data provided in the Self-Evaluation report illustrate my concern about the loss of proficiency between grades 8 and 10. One immediately wonders whether this is a persistent problem in our district or related to one class. The DPI website lists WKCE results going back to 2002, with data parsed by gender and year, and statewide results are available for comparison purposes. Using these data, one can expand to three sets of results in which Grade 8 data in year x are directly compared to Grade 10 data in year x+2, so that essentially the same students are being evaluated. Attached with this document are the tables generated using DPI data (filename: wkcedata). Per a DPI disclaimer, WKCE data prior to 2002 are not comparable to 2002 and later, nor are they available on the DPI website.
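
As a rough illustration of that matching, here is a short Python sketch, using invented placeholder records rather than the actual wkcedata contents, that pairs Grade 8 results in year x with Grade 10 results in year x+2 for the same cohort.

# Each record: (year, grade, subject, percent scoring proficient or advanced).
# These values are placeholders for illustration only.
records = [
    (2002, 8,  "reading", 88.0),
    (2004, 10, "reading", 80.0),
    (2003, 8,  "reading", 90.0),
    (2005, 10, "reading", 84.0),
]

def cohort_change(records, subject, grade8_year):
    # Change in percent proficient + advanced from Grade 8 in year x
    # to Grade 10 in year x + 2 (a negative result is a loss).
    g8 = next(p for (y, g, s, p) in records
              if y == grade8_year and g == 8 and s == subject)
    g10 = next(p for (y, g, s, p) in records
               if y == grade8_year + 2 and g == 10 and s == subject)
    return g10 - g8

print(cohort_change(records, "reading", 2002))  # -8.0 with these sample numbers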

Please review this Excel file with the attached bar charts while reading this part of the report. Tabs labeled reading, language arts, math, science, and social studies contain the raw data spreadsheets. There are bar charts entitled reading0204all, reading0305all, reading0406all, and so on for each subject that nicely illustrate the "corrected loss" I discuss below (15 charts in total). There are also 10 charts describing the proficiency loss for each subject by gender, entitled mathlossboys and the like. Finally, there is a complete table of WKCE historical data in this file for reference purposes, entitled HistoricWKCE. I have matched scales on the graphs where feasible for easier comparison.

There are many ways to interpret this type of data. First, one can make a simple comparison between grade 8 and grade 10 results for Evansville students in the same class (for example, 2002 grade 8 results compared to 2004 grade 10 results). Using this technique, 22 of the 30 sets of Evansville data represented in the tables show a loss of proficiency, with magnitudes ranging from –1% to –16% ("prof. loss" on the charts). This made me consider that the sensitive timing of these grades may require some correction for the overall reduction in scholastic performance due to general adolescent influences. Hence the inclusion of the state data on the charts, where one may note that 20 of the 30 measures of advanced plus proficient students also showed decreasing proficiency. If one then "corrects" the Evansville data by offsetting its decreases with the decreases seen at the state level, the news is still not good for Evansville. I did the calculations both as direct offsets and as offsets calculated as a percent of the starting value, since Evansville scores are typically higher than state averages. The results shifted slightly, but the magnitude of the losses did not change meaningfully, so the charts you see represent direct-offset data. Half of the measurements (15 of 30) still show a loss of proficiency even after the corrections. Five of those measurements indicate boys' loss of proficiency (2005 grade 10 language arts, 2005 grade 10 math, 2005 grade 10 science, and 2004 and 2006 grade 10 social studies). Ten of the measurements indicate girls' loss of proficiency (2004-2006 grade 10 reading, 2004-2006 grade 10 language arts, 2005 grade 10 math, 2005 and 2006 grade 10 science, and 2004 grade 10 social studies).
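
For anyone who wants the "correction" spelled out, here is a minimal Python sketch of the two offset methods, using hypothetical numbers rather than values from the charts; the function names are mine, not anything in the workbook.

# ev8/ev10 are Evansville grade 8 and grade 10 percent proficient + advanced;
# st8/st10 are the corresponding state percentages. All values are placeholders.

def corrected_loss_direct(ev8, ev10, st8, st10):
    # Evansville change minus the state change, point for point.
    return (ev10 - ev8) - (st10 - st8)

def corrected_loss_proportional(ev8, ev10, st8, st10):
    # State change expressed as a fraction of its starting value,
    # then applied to Evansville's (higher) starting value.
    state_rate = (st10 - st8) / st8
    return (ev10 - ev8) - state_rate * ev8

# Hypothetical example: Evansville drops 88 -> 80 while the state drops 75 -> 70.
print(corrected_loss_direct(88, 80, 75, 70))        # -3.0
print(corrected_loss_proportional(88, 80, 75, 70))  # roughly -2.1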

This exercise led me to another, more historical way of looking at the data. As useful as the bar charts were, I could only look at one two-year snapshot at a time. The Excel file wkcehist contains a spreadsheet of the historical data (HistoricWKCE) and bar charts (READHISTGR8, READHISTGR10, etc.) that illustrate my concern with a general, somewhat alarming pattern of reduced proficiency over time, in addition to the Grade 8 to Grade 10 loss. Each chart shows results for one subject over the 2002-2006 period, broken down by gender and with state results for comparison. The Grade 8 data show a more precipitous decline over time in the percentage of students scoring proficient plus advanced; since there is already a loss from grade 8 to grade 10, it is some relief that an equivalent degradation was not evident in the grade 10 data as well.

Both ways of evaluating the data illustrate gender gaps that are disheartening in this day and age. Of particular concern was the 31% gap between Grade 8 male and female language arts scores in 2002 (favoring girls). There is also the usual gender bias in math and science: an 11% Grade 10 math gap in 2006 (favoring boys) and a consistent 5-17% science gap in all years (favoring boys). A few notable data sets defied the traditional gender biases, but overall, the Evansville data reflect them.

One could point out that Evansville WKCE data are universally higher than the state data, sometimes by more than 30%. Why are you making such a big deal about this, Melissa? Because Evansville's superior performance on the WKCE does not predict superior ACT performance at all, and one could argue that the ACT IS the gold standard of measurement. If our superior WKCE performance translated to 30% higher ACT scores, all would be well and this report would not have seen the light of day. Instead, it seems that we rest on our laurels and bask in the glory of WKCE superiority while ACT mediocrity descends around us. We are failing our students where the rubber hits the road, when they are counting on us to help them get into the colleges of their choice.

Please refer to the Excel file actvwkce while reading this section of the report. The tab ACT vs. WKCE contains the raw data spreadsheet that collates three sets of year x Grade 10 WKCE data with year x+1 ACT results. I originally intended to use year x+2 ACT results as the comparison data, but students are counseled to take the ACT in their junior year for early admission, and most do. There are four charts for each year: Evansville boys, Evansville girls, state boys, and state girls (entitled Ev Boys 2002, Ev Girls 2002, St Boys 2002, St Girls 2002, etc.). One can get a sense of trends from these charts, but it is clear that high WKCE results do not predict good ACT performance. It appears that the improvement referred to in the 2003 Academic Excellence Committee Report has not been sustained. Widely disparate WKCE scores often result in the same ACT score, and high WKCE scores do not predict excellent performance in that subject on the ACT. The most striking example was the 2004 WKCE Reading score/2005-2006 ACT comparison. 92% of the Grade 10 boys scored proficient or advanced, and so did 92% of the girls. But while the girls' ACT Reading score the following year was an impressive 24.8 (versus 22.5 for state girls), the boys scored only 22.1, exactly the same as the state boys, only 72% of whom scored Proficient + Advanced on their grade 10 WKCE.
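
The collation itself is straightforward. Here is a brief Python sketch of the pairing, using invented placeholder figures and an assumed layout rather than the actual structure of actvwkce: year x Grade 10 WKCE results are matched with year x+1 ACT results for the same cohort.

# Placeholder figures for illustration only.
wkce_gr10_reading = {2002: 85.0, 2003: 88.0, 2004: 90.0}   # percent proficient + advanced
act_reading_mean = {2003: 22.5, 2004: 23.0, 2005: 23.5}    # mean ACT reading score

# Pair year x Grade 10 WKCE with year x + 1 ACT (the same students, tested as juniors).
for year, wkce in wkce_gr10_reading.items():
    if year + 1 in act_reading_mean:
        print(f"Grade 10 WKCE {year}: {wkce}%  ->  ACT {year + 1}: {act_reading_mean[year + 1]}")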

Again, one is curious to put the ACT results into context with prior-year data. The spreadsheet of historic ACT data from 2000 to 2006 is entitled Historical ACT Data. Some generalizations can be made from the last five charts, which follow the ACT results for each subject and the composite, by year and gender, state and local. They are entitled ACTREAD, ACTENGLISH, etc. The good news is that the girls' scores have increased across the board, especially in reading and language arts. The girls' improvement in math and science is particularly encouraging. However, the Evansville girls' highest math score is still lower than our boys' lowest math score, although for the last two years the girls have finally exceeded the state girls' average ACT score. The girls' highest science score has finally caught up with the boys' score, but the boys continue to stay flat, below the state boys' science score. The flat male ACT performance is widespread, especially in the areas where they could use some bolstering: reading and language arts. It also appears that our boys are losing ground, very slightly, in science. Their math scores still nudge just under the state boys' average math ACT. The girls' remarkable ACT performance last year brought our composite score above the state average for the first time ever. This is good news to be celebrated, but with caution. These are the ACT results for this year's graduating class, which by every measure has outperformed the classes above and below it. As one reviews all of these data, one gets a feel for a "class ability" that factors into the standardized test results. One could classify the class of 06-07 as downright brainiacs, evident from their grade 8 testing results forward. It may be unrealistic to expect continued ACT improvement of this magnitude in coming years.

What seems evident is that we need to engage our boys academically the way we have engaged our girls. We also need to be realistic in our assessment of the coming grades, whose grade 8 and grade 10 WKCE scores were far lower than those of this year's graduating class, to whom the excellent ACT scores can be attributed. While one cannot really predict ACT performance from WKCE results, general trends can be inferred. If the students who scored 24 and 25 on Reading and English (language arts) on last year's ACT had 100% scoring proficient + advanced on the Grade 8 WKCE, and subsequent classes have performed at 75-90%, chances are that ACT scores will drop back to more historically expected values. The trick is to stop this slide and continue the improvement.

While I was reviewing all of these data to prepare this report, the block schedule report was released, which surveyed all stakeholders regarding the relatively recently implemented block schedule. That report claims that our WKCE and ACT scores have improved since the block schedule's implementation. This claim is disputable. The girls' ACT performance has improved, impressively so in Reading and English, but the last impressive data point may simply reflect a very smart graduating class. The girls' Math and Science scores are still below Evansville boys' scores, which are a smidgen below the state boys'. Evansville boys' ACT scores have not improved with the block schedule. The bar charts in wkcehist speak for themselves as to the WKCE trends, and they are not in the upward direction. We may perform better than the state average, but our scores are not improving over time.

It occurred to me after reading the report that perhaps the block schedule dovetails better with the average female learning style (the ability to sit for longer periods, and a greater affinity for the interactive styles and cooperative environment fostered by a block schedule, as outlined in the overview report). This in turn has produced steady improvement in the girls' historically weak ACT areas of Math and Science and brought their Reading and English scores to impressive heights. Conversely, it is possible that the male learning style has not meshed as well with the block schedule. The boys' performance in their "problem" areas of Reading and English has remained unremarkable, and their strength areas of Math and Science remain flat. Some tweaks to the block schedule may be necessary to bring up the boys' reading and language arts scores and help them soar in Math and Science. Speaking to their passions and harnessing for Reading and English the same strengths they exhibit in Math and Science could foster a veritable explosion in Evansville ACT scores.

Respectfully Submitted,


Melissa Hammann

Evansville School Board
