Film trivia: the movie The Graduate has only one mention of an undergraduate major, and it belongs to the character who is not a graduate. Mrs. Robinson intended to major in art history, but left college early. The movie contrasts her unrealized ambitions with the promise of Benjamin (Dustin Hoffman), who has just completed an unspecified degree and has only to decide which of the many roads to opportunity he wants to travel. Simply being the eponymous Graduate is enough to confer considerable potential.
One of the current mantras of education reform is to give students the academic skills to be The Graduate, with the opportunity to follow any one of several professional paths. And rightly so, for the modern economy is ruthlessly demanding of ever-greater skills and abilities, and many entry-level jobs now require analytical thought and problem solving commensurate with advanced education. But while more and more students are attending college, the number who major in the areas that hold the most future promise is essentially unchanged. We are getting kids into college, but dropping them off without a map.
The value of a college degree is the focus of a recent report from Georgetown University titled “The College Payoff.” Over the last decade, the earnings premium between a high school diploma and a bachelor’s degree has widened, so that on average and over a lifetime, a bachelor’s degree is now worth $2.8 million. But the report also found an increasing emphasis on what someone studies and which occupation they pursue.
I recently came across — in, of all places, an essay on tax policies for capital gains — a topic I think resonates in any discussion of education reform: the Fallacy of Chesterton’s Fence.
I like fallacies. As a somewhat directionless undergraduate philosophy major, I lost interest in the Heidegger seminar, but I became increasingly entranced by basic logic and understanding how people think. Fallacies are potholes in rational thought. Learn to recognize them and you are better able to avoid them. Help other people see them and you are more likely to find consensus.
The short version of the Fallacy of Chesterton’s Fence is this: don’t ever take down a fence until you know why it was put up. Simple enough. However, particularly as it relates to education reform, the long version is worth reading:
“In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle […]. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, ‘I don’t see the use of this; let us clear it away.’ To which the more intelligent type of reformer will do well to answer: ‘If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.’”
Additional attention to English Language Learner (ELL) students is unquestionably a good thing. Particularly given the large percentages of ELL students both in Denver and across Colorado, there can be no doubt that this is a critical issue. There is simply not enough concerted attention on how schools support ELL students — and especially on specific strategies at both the district and school level to see what is most effective.
What there should not be, however, is opinion substituting for fact.
A recent discussion on these pages does exactly that. The claim is that any attempt at quantitative assessment — through state and district tools such as School Performance Frameworks, or representation on sites like ColoradoSchoolGrades.com — unjustly punishes schools with high percentages of ELL students.
These kids, so the theory goes, don’t learn as fast as their non-ELL peers, and schools that enroll more of them will always do worse on academic growth. And growth percentiles are the primary driver in most assessments. By holding all schools equally accountable for the academic growth of their students, as one member of the Denver school board put it, these systems are shamefully guilty of:
“accountability blinders that punish schools and kids for their English-proficiency differences by trying to lump them all into the same bucket as native and fluent English speakers”
Well, there is a blindness here, but it’s not the assessments. It’s us. Conventional wisdom dictates that including scores from ELL students will depress academic growth — and I’ll admit that I believed it as well (although to a lesser extent than some). I doubt I’m the only one. But we are all mistaken, and this perspective could be Exhibit A in the blind acceptance of opinion and conjecture at the expense of data.
A coalition of 18 different organizations (including several with which I am affiliated) has worked together to build a simple website that grades all of Colorado’s public K-12 schools. The coalition undertook this effort for a simple reason: the words we use to describe things matter. A lot.
And it is an unfortunate truth that, when talking about our schools, many professional educators, administrators, and bureaucrats speak sideways, slantways, and askew. They do not — and will not — speak straight.
It takes a simple idea to remind us how immune we have grown to basic logic and common sense. Hence the website, where one can look up any school in the state and find that most familiar of languages: letter grades.
Using the Colorado Department of Education’s own data, the site gives each school a single letter grade, as well as additional data on academic proficiency, academic growth, primary subjects, and student demographics. Not a bad start.
The need to use language well is hardly a new concern. While most Orwellian references point to a single novel, for me this issue echoes Orwell’s 1946 essay “Politics and the English Language.” Reviewing some particularly offensive linguistic black holes, Orwell names two primary faults: staleness of imagery, and lack of precision.
Alan asked bloggers for their thoughts on the election as part of a post-mortem. I find this rearview-mirror perspective usually boringly obvious after the fact, as it is far easier to ascribe cause once one knows the effect. So I’m sending in my quick thoughts in advance of the election results (although they will probably be published afterwards). If I am wrong, it will be painfully obvious; if I am right, you’ll have to trust me that this was written early.
First is Prop 103. I thought the best summary was provided by Eric Sondermann. I just don’t see how this passes, and I doubt it will be at all close. The initiative never had enough high-powered backers or an effective coalition, and it faced a strong economic headwind. All true criticisms, but I also think a proposal this unspecific about how the money would be used would have trouble even in a different economic climate.
Most taxpayers, with justification, see education as a black hole where money enters and little changes. This proposition exacerbated that perception, and it has all the mechanics of a feel-good measure that many people could support even in its doom, without having to engage in the far harder work of crafting something specific that could have drawn broader support. “We tried” will be the mantra — but a try that was designed never to be a serious threat. More’s the pity.
Most interesting will be the district-specific vote tallies as a precursor to future bonds. Denver voters approved a $454M bond back in 2008 by a 2:1 margin, while voters in nearby Jeffco rejected a similar proposal. The appetite for Prop 103 will be an imprecise but early indication of bond-issue potential in 2012, so watch for the district-by-district tally — particularly since suspense on the initiative itself is unlikely.
A+ Denver issued a new brief yesterday (and full disclosure – I helped crunch some of the numbers). It’s oddly not available (as of now) on their website, but it’s worth a look, so I’m posting here: SPF by District Report 10.12
A+ decided to take the 2011 School Performance Framework (SPF) and divide the schools by the five school board member districts. It’s an interesting exercise (and they include some handy maps and graphs). Recall that there are five levels of school performance on the SPF (from best to worst, the corresponding color codes are blue, green, yellow, orange, and red) and five geographic member districts (the other two seats are at large).
Here is part of what they found:
Jeannie Kaplan and Andrea Merida, two sitting members of Denver’s board of education, published this Op-Ed last Friday. Its genesis, they tell us, is in their conversations with Denver parents. “We are listening,” they write, “and are calling for the truth about how neighborhood schools perform.”
The specific call follows a few paragraphs later:
“But the data are clear that neighborhood middle schools are exceeding the growth expectations of the Denver Plan. These schools are actually performing better than the district average, including all the newer schools.”
Well, no. Pretty to think so. But not true. Traditional middle schools in Denver are lagging the district average for academic growth, not leading it. And more often than not, their students graduate 8th grade lacking the basic skills necessary to be successful in high school and beyond.
We are now over five years into the Denver Plan and a serious civic conversation about public education. So perhaps we might raise the bar just a little: It should not be okay for elected school board members to selectively distort performance data, and then use it as a basis to recommend where parents should send their children to school. And that is exactly what these board members are doing.
The recent Westword article on Denver North High School’s manipulation of its graduation rates, the belief that “juking the stats” likely extends beyond a single school, and a sage comment at the end of Alan’s post wondering what other Denver high schools were affected all indicate that this is a topic where rhetoric might benefit from a closer relationship with data.
At its crux, the question is whether graduation rates tell us something meaningful about how district schools are performing academically. And it sure looks like they do — but not in the way one might have hoped.
For what the North debacle — and an earlier, related controversy over Lincoln High School — brings into question is twofold. First, does a high school diploma signify a reasonable, baseline level of student achievement? And second, is the rise in DPS’s graduation rate spread evenly throughout the district, or is it being used by some schools to mask a lack of academic rigor and proficiency?
To answer the first question, we need to see if there is a pervasive gap — particularly at certain schools — between a school’s graduation rate and the ability of its alums to read, write, and do math at grade level. As one teacher at North commented for the Westword article, are we reaching a point where someone could say, “Oh, they went to North? They’ll give a diploma to anyone” — and for how many schools might this be an issue?
So here is a quick graph comparing respective 2010 graduation rates (data here) and 2010 average proficiency rates* (from CDE’s schoolview.org) at a number of notable, open-enrollment DPS high schools.
The red line indicates the trend. Schools above the line have more graduates with solid academic skills; those below the line have more graduates who lack basic proficiency. How far a school sits from the line shows the size of the gap: well above the line all but guarantees a close correlation between graduation and at least a base level of academic ability; well below the line increases the likelihood that a diploma bears little relation to academic skills.
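For readers curious about the arithmetic behind the graph, the trend line is an ordinary least-squares fit, and each school’s “gap” is its vertical distance (residual) from that line. Here is a minimal sketch in Python; the school names and percentages below are made up purely for illustration and are not the actual 2010 DPS figures.

```python
# Hypothetical (graduation rate %, proficiency %) pairs -- illustrative
# numbers only, NOT the actual 2010 DPS data.
schools = {
    "School A": (50.0, 20.0),
    "School B": (62.0, 35.0),
    "School C": (70.0, 28.0),
    "School D": (81.0, 52.0),
    "School E": (90.0, 55.0),
}

xs = [grad for grad, _ in schools.values()]
ys = [prof for _, prof in schools.values()]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Ordinary least-squares slope and intercept for the trend line.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# A school's gap is its residual from the line:
# positive -> above the line (graduates more likely to be proficient),
# negative -> below the line (a diploma says less about academic skills).
gaps = {name: prof - (slope * grad + intercept)
        for name, (grad, prof) in schools.items()}

for name, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{name}: gap {gap:+.1f} points")
```

One property worth noting: because the line is a least-squares fit, the gaps across all schools sum to zero by construction, so the measure is inherently relative — it flags schools that over- or under-perform the district-wide pattern, not any absolute standard.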