Previously, the best data available on total student counts came from the Babson Survey Research Group's annual survey (known prior to 2012 as the Sloan survey). This is the survey that tracks the total number of students taking at least one course online. When I talked to the researchers earlier this year, they mentioned that they hoped the new IPEDS data would ‘put them out of business’. I hope this comment was half in jest, as their survey measures much more than total student counts, and they have very useful longitudinal data. I have asked BSRG for an updated statement on their plans but have not yet received a response.
Some additional notes on the data:
According to the IPEDS data, 5.5 million (26%) of degree-seeking students in the US took at least one online course in Fall 2012, which is significantly fewer than reported by Babson (6.7 million / 32% for Fall 2011 data). Keep in mind the different methodologies involved – IPEDS collects data reported directly by colleges and universities, while Babson is based on a representative survey; IPEDS measures “distance education”, while Babson measures “online courses”. I had hoped this year’s Babson survey would explain the differences, but for now I’ll just note them.
The new Babson survey came out yesterday morning, and I was disappointed to find that it did not address or acknowledge the differences in the data. With the new survey data (p. 15, using Fall 2012 data just like IPEDS), they came to a dramatically different conclusion [emphasis added]:
There were 412,000 more online students in fall 2012 than in fall 2011, for a new total of 7.1 million students taking at least one online course. This year-to-year change represents the smallest numeric increase in the past five years. The growth rate of 6.1 percent in students taking at least one online course also represents the lowest percentage increase since these reports began tracking online enrollments.
The difference between 5.5 million and 7.1 million is quite significant - Babson's estimate from the survey is 29% higher than the IPEDS data. The Babson survey is the most widely-quoted source for answering the question "how many students take online courses in the US", and these differences are important for policy makers and planners. And as a note, the Babson survey also looks specifically at degree-granting institutions (see p. 33 of the survey report).
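As a quick arithmetic check on that 29% figure (a minimal sketch in Python; the two totals are the Fall 2012 numbers cited above):

```python
# Relative difference between the two Fall 2012 estimates
ipeds = 5_500_000   # IPEDS: students taking at least one distance-ed course
babson = 7_100_000  # Babson: students taking at least one online course

gap = babson - ipeds               # absolute gap: 1,600,000 students
pct_higher = gap / ipeds * 100     # Babson estimate relative to IPEDS
print(f"Babson is {pct_higher:.0f}% higher than IPEDS")  # → 29%
```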
While the initial articles in the Chronicle and Inside Higher Ed did not cover this disparity, Steve Kolowich at the Chronicle did talk to Jeff Seaman (one of the report authors) and got a statement in this morning's post:
But how many American students are taking at least one online course right now?
The answer, according to the latest figures from the Babson Survey Research Group, is about 7.1 million.
Or is it?
For the last decade, researchers and journalists have relied on the Babson group and its annual survey to measure the scale and growth of online higher education in the United States. With backing from the Sloan Consortium and others, the Babson surveyors have been taking the temperature of online education in the United States since 2002, when they estimated that 1.6 million students were taking at least one online course.
The article goes on to answer the question:
So which number is correct?
The lower one, probably. The Education Department data are more likely to be accurate, “given that they are working from the universe of all schools,” says Mr. Seaman by email. [snip]
The reporting requirements for the department “are such that I would always trust their numbers over ours,” he wrote. “However, I still believe that the trends we have reported for the past 11 years are very much real.”
It's good to see this follow-up from Steve and Jeff's response.
I certainly agree that the differences between the Babson survey and IPEDS data point out the challenge we have had with inconsistent definitions (what criteria should trigger a course being classified as online versus face-to-face or hybrid) as well as with inconsistent data collection by colleges and universities. Jeff makes a great point that the new IPEDS requirements will force institutions to officially track the data in student systems (rather than shadow systems) and to use a consistent definition.
I think that the Babson Survey Research Group should have acknowledged these differences publicly along with the release of the report, as we need confidence in the impartiality of our data collection. Even with the IPEDS data, the Babson survey remains valuable for the attitudinal data and the consistent trend data it provides. But kudos to them for the quick explanation.
What we still need is guidance on how to translate the data in an understandable way. One possible solution is for Babson to do a one-time adjustment based on a new bias factor, going back to 2002, to produce a corrected time series. While there would likely be media confusion for a short time, we would benefit from consistent data in the long term.
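To illustrate what such a one-time adjustment could look like, here is a minimal sketch assuming a constant bias factor derived from the single overlapping year (Fall 2012). This is my own illustration, not anything Babson has proposed, and the pre-2012 figure shown is just the reported Fall 2011 total; the simple constant-ratio assumption is the key simplification.

```python
# Hypothetical one-time bias adjustment: rescale the Babson time series
# by the IPEDS/Babson ratio from the one year both sources cover.
babson_series = {2011: 6_700_000, 2012: 7_100_000}  # reported totals

# Bias factor from the overlapping Fall 2012 data point
bias = 5_500_000 / 7_100_000  # IPEDS count / Babson count, ~0.77

adjusted = {year: round(count * bias) for year, count in babson_series.items()}
print(adjusted[2012])  # Fall 2012 lands exactly on the IPEDS figure: 5500000
```

The same rescaling would apply back to 2002, preserving the shape of the trend (the growth rates Seaman stands behind) while anchoring the levels to the IPEDS count.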
The Babson survey is very useful - and will remain so - even with the release of IPEDS data. We need to have confidence in the data, however, and this clarification is important. I am in contact with one of the Babson authors and expect to be able to provide a deeper explanation soon.
And no, there aren't 7.1 million US higher ed students taking at least one online course. There are closer to 5.5 million as of Fall 2012.