In the fall of 2011 I made the following argument:
We need more transparency in the LMS market, and clients should have access to objective measurements of the security of a solution. To paraphrase Michael Feldstein’s suggestions from a 2009 post:
- There is no guarantee that any LMS is more secure just because its vendor says it is
- Customers should ask for, and LMS vendors should supply, detailed information on how the vendor or open source community has handled security issues in practice
- LMS providers should make public a summary of vulnerabilities, including resolution time
I would add to this call for transparency that LMS vendors and open source communities should share information from their third-party security audits and tests. All of the vendors that I talked to have some form of third-party penetration testing and security audits; however, how does this help the customer unless this information is transparent and available? Of course this transparency should not include details that would advertise vulnerabilities to hackers, but there should be some manner to be open and transparent on what the audits are saying. [new emphasis added]
Inspired by fall events and this call for transparency, Instructure (maker of the Canvas LMS) decided to hold a public security audit using a white hat testing company, where A) the results of the testing would be shared publicly, and B) I would act as an independent observer to document the process. The results of this testing are described in two posts at e-Literate and in a post at Instructure.
Instructure has kept up the process, this year with a crowdsourcing twist:
What was so special about this audit? For starters, we partnered with Bugcrowd to enlist the help of more than 60 top security researchers. To put that number in context, typical third-party security audits are performed by one or two researchers, who follow standard methodologies and use “tools of the trade.” Their results are predictable, consistent, and exactly what you’d want and expect from this type of service. This year, we wanted an audit that would produce “unexpected” results by testing our platform in unpredictable ways. And with dozens of the world’s top experts, plus Bugcrowd’s innovative and scrappy crowdsourcing approach, that’s exactly what we got.
So while last year’s audit found six issues, this year’s process unearthed a startling 59. (Yeah, you read that right. Fifty-nine.) Witness the power of crowdsourcing an open security audit.
The blog post goes on to state that all 59 issues have been fixed with no customer impacts.
I harp on this subject not just to congratulate Instructure on keeping up the process, but to maintain that the ed tech world would benefit from transparent, open security audits. Back in 2011 there were ed tech executives who disagreed with the approach of open audits.
There are risks, however, to this method of public security testing. Drazen Drazic, the managing director of Securus Global, indicated that in talking to people around the world through security-related social networks, no other companies have chosen to use an independent observer for this testing. This is not to argue that no one should do it, but clearly we are breaking new ground here and need to be cautious.
One downside of public security assessments is that publicizing results can actually increase the likelihood that vulnerabilities will be exploited. As one executive from a competing LMS put it to me, we need to focus on security consistently, not as a once-a-year exercise. Any public exposure of vulnerabilities can increase the chance of hackers exploiting them, so the trick is not to disclose specific pathways to exploitation. In our case, I described the category of each vulnerability found, and I avoided disclosing any information on the critical and high-risk vulnerabilities until after they had been remediated. Still, this is a tricky area.
Two competing LMS vendors have criticized these tests as a marketing ploy that could be dangerous. In their opinion, student and client data is best protected by keeping the testing process out of the public domain. I cannot speak for Instructure’s motivations regarding marketing, but I did want to share these criticisms.
We are now in the fourth year of Instructure providing transparent security audits, and I would note the following:
- The act of publicizing the results has not in fact enabled hackers to exploit the security vulnerabilities identified.
- While I am sure there is marketing value to this process, I would argue that the primary benefits have been enhanced security of the product and, more importantly, better information for the institutions evaluating or already using Canvas.
I repeat my call for more ed tech vendors to follow this type of process. I would love to cover similar stories.