Anya Kamenetz has a piece up on NPR about learning analytics, highlighting Purdue’s Course Signals as its centerpiece. She does a good job of introducing the topic to a general audience and raising some relevant ethical questions. But she missed one of the biggest ethical questions surrounding Purdue’s product—namely, that some of its research claims are likely false. In particular, she repeats the following claim:
Course Signals…has been shown to increase the number of students earning A’s and B’s and lower the number of D’s and F’s, and it significantly raises the chances that students will stick with college for an additional year, from 83% to 97%. [Emphasis added.]
Based on the work of Mike Caulfield and Al Essa summarized in the link above, it looks like that latter claim is probably the result of selection bias rather than a real finding. So who is at fault for this questionable claim being repeated without challenge in a popular venue many months after it has been convincingly challenged?
For starters, Purdue is. They have never responded to the criticism, despite confirmed awareness of it (they were contacted both by us and by Inside Higher Ed), and despite the fact that they apparently continue to make money from sales of the product through a licensing deal with Ellucian. And the uncorrected paper is still available on their web site. This is unconscionable.
Anya clearly bears some responsibility too. Although it's easy to assume from the way the article is written that the dubious claim was repeated to her in an interview by Purdue researcher Matt Pistilli, she confirmed for me via email that she took the claim from the previously published research paper and did not discuss it with Pistilli. Given that this is her central example of the potential of learning analytics, she should have interrogated it a little more, particularly since she had Matt on the phone. Mike Caulfield also commented to me that any claim of such a dramatic increase in year-to-year retention should automatically be subject to additional scrutiny.
I have to put some blame on the higher ed press as well. Inside Higher Ed covered the story (and, through them, so did the Times Higher Education). In fact, Carl Straumsheim actually advanced the story a bit by putting the question to researcher Matt Pistilli (who gave a non-answer). The Chronicle of Higher Education did not cover it, despite having run a puff piece on Purdue's claims the same day that Mike Caulfield wrote his original piece challenging the results. It is very clear to Phil and me that we are read by the Chronicle staff, in part because they periodically publish stories that have been obviously influenced by our earlier coverage. Sometimes without attribution. I don't care that much about the credit, but if they thought Purdue's claims were newsworthy enough to cover in the first place, then they should have done their own reporting on the fact that those claims have been called into question. If they had been more aggressive in their coverage, then the mainstream press reporters who find Course Signals would be more likely to find the other side(s) of the story as well. Outside of IHE, I'm having trouble finding any coverage, never mind any original reporting, in the higher ed or ed tech press.
I have a lot of respect for news reporters in general, and I think that most people grossly underestimate how hard the job is. I think highly of Anya as a professional. I like the reporters I interact with most at the Chronicle as well. Nor will I pretend that we are perfect here at e-Literate. We miss important angles and get details wrong our fair share of the time. For example, I doubt that I would have caught the flaw in Purdue's research if Mike hadn't brought it to my attention. But collectively, we have to do a better job of providing critical coverage of topics like learning analytics, particularly at a time when so much money is being spent and our entire educational system is starting to be remade on the premise that this stuff will work. And there is absolutely no excuse whatsoever for a research university to not take responsibility for its published research on a topic that is so critical to the future of universities.