New Research Confirms…Everything We Already Believe

As many predicted, the worlds of research and journalism have changed with the advent of the internet and the explosion of social media. Gone are the days when research studies were mainly published via journals and extensive peer review processes. The pace of news has accelerated, as has the pace of consumption.

This new reality has its benefits. Research findings are no longer locked in obscure journals. More diverse journalistic views appear in more outlets. Yet we are also paying the price for this ubiquitous information. Findings and headlines are now available to support just about any position on any educational topic, and there is almost no way to understand which have merit and which do not. In the context of the broader vitriol-filled environment of education reform, people have come to use studies and their associated news stories as a way to confirm their own assumptions and as weapons in ideological warfare, rather than as new knowledge to inform and challenge.


Shifts in the news business make the research problem worse. Some news outlets have been better than others about reporting on controversial education research, but few have been immune to inaccurate, attention-grabbing headlines or unbalanced reporting. Compelling research headlines can take on a life of their own, even without the help of mainstream media. Fact and nuance are increasingly treated as inconveniences that get in the way of action and agendas.

Recent examples illustrate the challenge. Last March, Gary Orfield’s Civil Rights Project at UCLA published a report on suspension rates in charter schools. The press release headline screamed, “Study Finds Many Charter Schools Feeding the School-to-Prison Pipeline.” It was quickly picked up by scores of bloggers and tweeters across the country who didn’t need to see more than the headline to confirm their views that charter schools systematically exclude students and employ militaristic discipline tactics. Even the author of the report took to Twitter to decry a charter school problem. The problem was that, even on a close reading, the report provided no evidence that high suspension rates are more common in charter schools than in district-run schools, which is what the headline implied. As CRPE pointed out, the methodology also failed to pass muster with experts.

Just one month later, when Education Cities and Great Schools released the Education Equality Index (EEI)—a measure of the income-based achievement gap—national and local charter advocates asserted that charter schools were “leading the way.” But the EEI was not designed to assess whether charter schools were more likely than district schools to close gaps. In fact, there was nothing in the EEI to assert or support such claims. When a methodological limitation in the state scores and rankings was caught by another researcher post-publication, the organizations quickly issued a clarification (and they’ve recently convened a technical advisory panel of independent researchers to help strengthen the EEI’s methodology). By that time, however, states, cities, and advocates had already shouted inaccurate self-congratulatory news across the airwaves.

Studies that have not gone through a peer review process are published almost weekly. Unless readers have a degree in statistics and want to wade through technical appendices, they must simply trust that the findings have merit. In the medical field, publishing findings this way is considered dangerous and irresponsible, yet in education it’s accepted and even defended.

There is no going back to how research and journalism played out before the internet came on the scene, and few would want to. We must now adapt to the new realities so the public can still find meaning and knowledge, data and evidence. We could start with a concept borrowed from the medical field: Standards of Care. Doctors are ethically bound to apply the most current, evidence-based treatments to a given medical condition. They can lose their license or be sued if an objective observer finds they violate published Standards of Care for, say, a hernia. Journalists and researchers should be bound by similarly clear criteria. Though it would be impossible to impose the same type of external accountability doctors face from a medical review board, transparency would be a good first step.

Here’s how it could work in education research:

In fields outside of education, analysts publish their entire analytic process, including the code behind their analyses and all of the data “runs” (the analyses they tried but didn’t publish). Statistical software like R makes it easy for researchers to track and publish their entire process. Scientific journals commonly require this level of transparency so that studies can be replicated—something researchers rarely, if ever, attempt in education.
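To make that practice concrete, here is a minimal, hypothetical sketch (written in Python rather than R, purely for illustration) of what “publishing every run” can look like: a fixed random seed, and a log file that records every analysis attempted, not just the one that makes the headline. The data, comparisons, and file names below are invented for the example, not drawn from any study discussed here.

```python
"""A minimal, hypothetical sketch of a "publish every run" analysis log.

Illustrative only: the data, comparisons, and file names are invented,
not drawn from any study discussed in this piece.
"""
import json
import random
import statistics
from datetime import datetime, timezone

RUN_LOG = "analysis_runs.jsonl"  # every attempted analysis gets appended here


def log_run(description, result):
    """Record one analysis attempt, whether or not it ends up in the report."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "description": description,
        "result": result,
    }
    with open(RUN_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")


def main():
    random.seed(2016)  # fixed seed so others can reproduce the exact numbers

    # Invented data: per-school suspension rates by sector.
    schools = [
        {
            "sector": random.choice(["charter", "district"]),
            "suspension_rate": random.uniform(0.0, 0.25),
        }
        for _ in range(200)
    ]
    charter = [s["suspension_rate"] for s in schools if s["sector"] == "charter"]
    district = [s["suspension_rate"] for s in schools if s["sector"] == "district"]

    # Run 1: difference in mean rates (the kind of comparison a headline rests on).
    log_run("charter minus district, mean suspension rate",
            statistics.mean(charter) - statistics.mean(district))

    # Run 2: difference in medians, a robustness check that is logged either way.
    log_run("charter minus district, median suspension rate",
            statistics.median(charter) - statistics.median(district))

    print(f"Logged 2 runs to {RUN_LOG}")


if __name__ == "__main__":
    main()
```

Anyone with the script and the log can rerun the same code, reproduce the same numbers, and see both the published comparison and the ones that were tried and set aside.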

Philanthropies like the Bill & Melinda Gates Foundation and the Broad Foundation, whose grantees are often accused of simply toeing their lines, could combat that perception by insisting that all grantees adhere to replicable methods when possible and commit to making them available to other researchers who are interested in replicating their studies. And though peer review can have its own problems, funders should ask whether there will be rigorous quality assurance and stringent peer review processes, at minimum. New approaches to speedy but still rigorous peer review processes are being used in other fields.

Funders, including the federal government, should also be willing to fund studies that attempt to replicate prior research and meta-analyses that report on findings over time across rigorous studies. These are not headline grabbers, but they are critical. The field of psychology was rocked recently when only about a third of 100 published psychological experiment findings could be replicated, and then again when the replication studies themselves were challenged. Not all studies are created equal. We should not treat them as such.

Journalists could also help reinforce these standards if editors adopted a standard protocol for reporting on new studies. A publicly available list of vetted experts on different topics could be a central resource, and reporters could be expected to use a standard quality control and transparency protocol. The Education Writers Association could help promote and inform such efforts.

With increased pressure from funders and journalists to demonstrate rigorous methodology, researchers may be less inclined to rush to be “first to publish” or to chase headlines without merit, and journalists may be less inclined to be “first to the story” on research and data without adequate due diligence.

As a research organization, CRPE is not immune to these kinds of errors. We employ a peer review process and try to be transparent and balanced. We’re not in a position to judge whether we’re doing enough. Part of the point here is that we all have to proactively engage external checks on our own work, especially considering how our research is likely to be used in the larger political landscape.  

The public should always be skeptical consumers of information, but they will forever rely on third parties to help separate the wheat from the chaff when it comes to research and reporting. We need to step it up on that front, and these simple actions would go a long way. Doctors can be sued or can lose their license if they fail to meet professional standards of care. It seems reasonable to ask researchers to at least be transparent about their methods and review processes and to allow others to try to replicate their results. Better outcomes and more equitable public education are not well served by innuendo and anecdote.
