Ed Tech Evaluation Plan: More problems than I initially thought

Late last week I described the new plan from the US Department of Education (ED) and its Office of Educational Technology (OET) to “call for better methods for evaluating educational apps”. Essentially, the ED is seeking proposals for new ed tech evaluation methods so that it can share the results with schools – helping them evaluate specific applications. My argument [updated DOE to be ED]:

Ed tech apps by themselves do not “work” in terms of improving academic performance. What “works” are the pedagogical innovations and/or student support structures that are often enabled by ed tech apps. Asking whether apps work is looking at the question inside out. The real question should be “Do pedagogical innovations or student support structures work, under which conditions, and which technology or apps support these innovations?” [snip]

I could see that for certain studies you could use the ED template and accomplish the same goal inside out (define the conditions as specific pedagogical usage or student support structures), thus producing valuable information. What I fear is that the pervasive assumption embedded in the program setup, asking over and over “does this app work”, will prove fatal. You cannot put technology at the center of understanding academic performance.

Upon further thought, as well as prompting from comments and private notes, this ED plan has even more problems than I initially thought.


US Department of Education: Almost a good idea on ed tech evaluation

Richard Culatta from the US Department of Education (DOE, ED, never sure of the proper acronym) wrote a Medium post today describing a new ED initiative to evaluate ed tech app effectiveness.

As increasingly more apps and digital tools for education become available, families and teachers are rightly asking how they can know if an app actually lives up to the claims made by its creators. The field of educational technology changes rapidly with apps launched daily; app creators often claim that their technologies are effective when there is no high-quality evidence to support these claims. Every app sounds world-changing in its app store description, but how do we know if an app really makes a difference for teaching and learning?

He then describes the traditional one-shot studies of the past (control group, control variables, a year or so of study, get results) and notes:

This traditional approach is appropriate in many circumstances, but just does not work well in the rapidly changing world of educational technology for a variety of reasons.

The reasons?


68 Percent of Statistics Are Meaningless, Purdue University Edition

I don’t know of any other way to put this. Purdue University is harming higher education by knowingly peddling questionable research for the purpose of institutional self-aggrandizement. Purdue leadership should issue a retraction and an apology.

We have covered Purdue’s Course Signals extensively here at e-Literate. It is a pioneering program, and evidence does suggest that it helps at-risk students pass courses. That said, Purdue came out with a later study that is suspect. The study in question claimed that students who used Course Signals in consecutive classes were more likely to see improved performance over time, even in courses that did not use the tool.

Mike Caulfield looked at the results and had an intuition that the finding was actually an artifact of selection bias: students who stuck around long enough to take courses in consecutive semesters were, by definition, the students who stick around, so they would accumulate more Course Signals courses and show better outcomes whether or not the tool had any effect. Al Essa ran a mathematical simulation and confirmed Mike’s intuition that Purdue’s results could be the product of selection bias alone. Mike wrote up a great explainer here on e-Literate that goes into all the details.

If there was indeed a mistake in the research, it was almost certainly an honest one. Nevertheless, Purdue had an obligation to re-examine the research in light of the new critique. After all, the school was getting positive press from the research and had licensed the platform to SunGard (now Ellucian). Furthermore, as a pioneering and high-profile foray into learning analytics, Course Signals was getting a lot of attention and influencing future research and product development in the field. We needed a clearer answer regarding the validity of the findings.
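To see why the selection-bias critique has teeth, here is a minimal null-model simulation in the spirit of Al Essa’s (the structure and every parameter below are my own illustrative assumptions, not his actual model or Purdue’s data). Course Signals has zero effect in this model; each student simply faces the same fixed per-semester dropout risk. Grouping students by how many Signals-enabled courses they happened to take still manufactures a large apparent graduation-rate advantage for the 2+ group:

```python
import random

random.seed(42)

# Null model: Course Signals has NO effect on anything.
N_STUDENTS = 100_000
N_SEMESTERS = 8          # length of a simulated academic career
DROPOUT_RATE = 0.15      # per-semester chance of leaving, same for everyone
COURSES_PER_SEMESTER = 4
P_SIGNALS = 0.25         # chance that any given course is Signals-enabled

results = []  # (signals_courses_taken, graduated)
for _ in range(N_STUDENTS):
    signals_courses = 0
    graduated = True
    for _ in range(N_SEMESTERS):
        if random.random() < DROPOUT_RATE:
            graduated = False
            break
        # Enrolled students take courses; some happen to use Signals.
        signals_courses += sum(
            random.random() < P_SIGNALS for _ in range(COURSES_PER_SEMESTER)
        )
    results.append((signals_courses, graduated))

# Group students the way the study did: by how many Signals-enabled
# courses they ended up taking over their whole career.
for label, in_group in [("0-1 Signals courses", lambda k: k < 2),
                        ("2+ Signals courses ", lambda k: k >= 2)]:
    group = [g for k, g in results if in_group(k)]
    print(f"{label}: {sum(group) / len(group):.1%} graduated "
          f"({len(group):,} students)")
```

The gap appears because accumulating two or more Signals courses usually requires surviving multiple semesters; the exposure variable is itself a measure of persistence, which is exactly Mike’s point.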

Despite our calls here on the blog, our efforts to contact Purdue directly, and the attention the issue got in the academic press, Purdue chose to remain silent. Our sources informed us at the time that Purdue leadership was aware of the controversy surrounding the study and made a decision not to respond. Keep in mind that the research was conducted by Purdue staff rather than faculty. As a result, those researchers did not have the cover of academic freedom and were not free to address the study on their own without first getting a green light from their employer. To make matters more complicated, none of the researchers on that project work at Purdue anymore. So the onus was on the institution to respond. They chose not to do so.

That was bad enough. Now it has become clear that Purdue is actively promoting the questionable research. In a piece published today in Education Dive, Purdue’s “senior communications and marketing specialist” Steve Tally said:

the initial five- and six-year raw data about the impact of Signals showed students who took at least two Signals-enabled courses had graduation rates that were 20% higher. Tally said the program is most effective in freshman and sophomore year classes.

“We’re changing students’ academic behaviors,” Tally said, “which is why the effect is so much stronger after two courses with Signals rather than one.” A second semester with Signals early on in students’ degree programs could set behaviors for the rest of their academic careers.

It’s hard to read this as anything other than a reference to the study that Mike and Al challenged. Furthermore, the mention of “raw data” suggests that Purdue has made no effort to control for the selection bias in question. Two years after the study was challenged, Purdue has not responded, has not looked into the issue, and continues to use the study to promote the image of the university.
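Controlling for the bias would not require anything exotic, either. One simple check is to fix the exposure window to the first semester, which every student starts, so that exposure no longer depends on how long a student persisted. Here is the same hypothetical null model as above (again, my made-up parameters, not Purdue’s data) with that one change; the apparent advantage disappears:

```python
import random

random.seed(42)

N_STUDENTS = 100_000
N_SEMESTERS = 8
DROPOUT_RATE = 0.15
COURSES_PER_SEMESTER = 4
P_SIGNALS = 0.25

rows = []  # (Signals courses in semester 1 only, graduated)
for _ in range(N_STUDENTS):
    # Exposure is fixed before any dropout can occur: every student
    # enrolls in a first-semester schedule.
    sem1_signals = sum(
        random.random() < P_SIGNALS for _ in range(COURSES_PER_SEMESTER)
    )
    graduated = all(
        random.random() >= DROPOUT_RATE for _ in range(N_SEMESTERS)
    )
    rows.append((sem1_signals, graduated))

for label, in_group in [("0-1 Signals courses in semester 1", lambda k: k < 2),
                        ("2+ Signals courses in semester 1 ", lambda k: k >= 2)]:
    group = [g for k, g in rows if in_group(k)]
    print(f"{label}: {sum(group) / len(group):.1%} graduated")
```

Any claim built on the real data would need some control of this kind before graduation-rate comparisons between Signals takers and non-takers mean anything; “raw data” by construction has none.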

This is unconscionable. If an academic scholar behaved that way, she would be ostracized in her field. And if a big vendor like Pearson or Blackboard behaved that way, it would be broadly vilified in the academic press and academic community. Purdue needs to come clean. They need to defend the basis on which they continue to make claims about their program the same way a scholar applying for tenure at their institution would be expected to be responsible for her claims. Purdue’s peer institutions likewise need to hold the school accountable and let them know that their reputation for integrity and credibility is at stake.


Challenge Of Student Transition Between Active And Passive Learning Models

Last week the Hechinger Report profiled an innovative charter school in San Diego called High Tech High (insert surfer jokes here) that follows an active, project-based learning (PBL) model. The school doesn’t use textbooks, and it doesn’t base its curriculum on testing. The question the article asks is whether this approach prepares students for college.

As a result, for [former HTH student Grace] Shefcik, college – with its large classes and lecture-based materials – came as a bit of a shock at first. At the University of California, Santa Cruz, she is one of more than 15,000 undergraduates, and her assignments now usually consist of essays and exams. At High Tech High, Shefcik had just 127 students in her graduating class, allowing her to form close relationships with peers and teachers.

The premise of the article is that PBL prepares students for life but maybe not for college. Grace described the big difference between high school, with its constant feedback and encouragement, and college, where you rarely get feedback. Other students described their frustration at not knowing how to study for tests once they got to college.

After a recent screening of “Most Likely to Succeed” at the New Schools Summit in Burlingame, California, High Tech High CEO Larry Rosenstock told an audience, “We actually find that many of our students find themselves bored when they get to college.”


Reuters: Instructure has filed for an IPO expected later this year

Reuters has been on a breaking-news roll with ed tech lately. This time it is about Instructure filing for an initial public offering (IPO).

Instructure is planning an initial public offering later this year that could value the education software company at $500 million to $800 million, according to people familiar with the matter.

Instructure, based in Salt Lake City, has hired Morgan Stanley (MS.N) and Goldman Sachs (GS.N) to help prepare for the IPO, which has been filed confidentially, the people said. They requested anonymity because the news of the IPO was not public.

Under the Jumpstart Our Business Startups Act, new companies that generate less than $1 billion in revenue can file for IPOs with the U.S. Securities and Exchange Commission without immediately disclosing details publicly.

Instructure has long stated its plans to eventually IPO, so the main question has been one of timing. Now we know that it is late 2015 (assuming the Reuters story is correct, though they have been quite accurate with similar stories).
