By Phil Hill
Instructure has engaged Securus Global to test the Canvas LMS product for security vulnerabilities. Instructure also invited me to be an independent observer – participating in the process and independently reporting on the testing and Instructure’s response to any vulnerabilities identified. Part 1 of this series of posts described the concept. Part 2 gave a mid-term update, describing the process involved and initial results. Part 3 described the full results of the security assessment. In this final post on the experience I’d like to address two subjects – my own impressions of the testing, and a call for more LMS vendors to follow suit and make their security testing more transparent.
As described in part 3, the risk assessment found 10 vulnerabilities – 1 critical, 1 high, 4 moderate and 4 low risk – in the Canvas LMS system. I do not have a basis to judge the relative number of vulnerabilities found compared to Instructure’s competitors, as there is not an industry-specific standard on the depth and extent of penetration testing, but by all appearances the Canvas LMS system is a well-designed, generally secure application. I base this judgment on two factors:
- Instructure has been able to remediate 7 of the 10 vulnerabilities in less than a month and a half since the testing was completed. Additionally, the critical item was remediated within 24 hours and the high item was remediated within 2 weeks. The remaining 3 items include 2 moderate risk vulnerabilities and 1 low risk vulnerability. See part 3 for additional details.
- In the report, Securus Global summarized the findings with the observation that “It is our impression that CANVAS is generally a secure application and that the issues found can quickly be remediated”, and that “None of these issues are associated with major application flaws that are difficult to remediate”. Indeed, this impression by Securus has largely been validated by the subsequent remediations.
I cannot state how these results for Canvas compare to competitive LMS solutions, as no other vendor has been this public with their testing results. It is worth noting that there was one critical and one high risk vulnerability found. Securus Global has stated that roughly 95% of systems going through penetration testing end up with at least one critical item; however, that figure spans many industries, including financial systems.
When talking to Instructure staff, they appeared to be surprised by the existence of the critical item, given their history of internal security audits and automated testing. In other words, Securus Global found vulnerabilities that Instructure had been unable to find. As Josh Coates, CEO of Instructure, related to me, it is a classic engineering lesson that another set of eyes looking at your system will inevitably find issues that the developers missed – if you are too close to the problem, you often can't see the issue. This is especially true when the 3rd party includes experts in their domain, such as full-time ethical hackers. Furthermore, by having the process done in public through an independent observer, Josh stated that Instructure somewhat cornered themselves – it would not look good to avoid fixing the important issues. While this is not to say that Instructure would not have addressed the issues in a private test, the public process ensured that Instructure put great emphasis on the follow-up to the testing results.
Due to this third-party security testing done in a public manner, the Canvas LMS is now more secure than it would have been without the testing.
Transparency in Security Testing – A Call for Other LMS Vendors
One aspect of this security testing that I would like to focus on is transparency of the process. Should we (the higher education and K-12 community) or should we not have more transparency in the security testing of the enterprise systems we rely upon to support the academic mission? This is not an easy question.
In my experience, system security has been too easily swept under the rug in the LMS world – at least within education markets. Most schools go through the motions of asking about security during Request for Proposal (RFP) processes, but by and large, they just ask generic questions which the vendors answer in the proposals and follow-up meetings. The net result is that it is up to the vendors to describe how thorough their security testing is. These descriptions are run through sales & marketing groups as part of the proposal process, and the end result is vague, non-verifiable answers.
A minority of schools take the additional step of performing their own security audits, sometimes including their own penetration testing, which is commendable. This situation is superior to relying on RFP responses alone, but I see two main problems with this approach.
- The testing is typically done by institutional IT staff and not by security professionals – hackers, in other words. As Instructure found out, real hackers can find vulnerabilities that most IT staff and engineers cannot find.
- The testing results are not public, therefore only the institution in question benefits from the testing. The higher education LMS market does not benefit from security evaluations in the same way that it benefits from other parts of public evaluations – features & functions, and even pricing.
Neither choice is likely to produce real understanding of the security vulnerabilities of a particular LMS, yet these systems are mission-critical to the university, and they house some of our most sensitive data.
Argument For More Transparency
In my opinion, the LMS market (both higher education and K-12) would benefit from making security testing results more open. Consider what the LMS market knows about Instructure based on this public testing.
- We know the number and risk-level of security vulnerabilities found by a qualified, independent security testing firm after 2 and a half weeks of penetration testing with access to source code.
- We know the response times for Instructure to remediate 7 of the 10 identified vulnerabilities.
- We have insight into the trade-offs that Instructure makes to determine whether and how to fix security issues.
- Any institution has these results available to support their decision-making.
I would argue that this is more information than is available to the vast majority of institutions making LMS decisions using traditional methods of security inquiries. The market in general would benefit if other LMS vendors followed suit and agreed to 3rd party security assessments using an independent observer.
To be fair, other LMS vendors also use 3rd party security audits, but what is not known is the nature of those audits, what the results are, and what the response times are for the vendor to address the vulnerabilities found.
Argument Against More Transparency
There are risks, however, to this method of public security testing. Drazen Drazic, the managing director of Securus Global, indicated that in talking to people around the world through security-related social networks, no other companies have chosen to use an independent observer for this testing. This is not to argue that no one should do it, but clearly we are breaking new ground here and need to be cautious.
One downside of public security assessments is that the act of publicizing results can in fact increase the likelihood that vulnerabilities would be exploited by hackers. As one executive from a competitive LMS put it to me, we need to focus on security consistently and not as a once-a-year exercise. Any public exposure of vulnerabilities can increase the likelihood of hackers exploiting those vulnerabilities, so the trick is to not disclose specific pathways to exploitation. In our case, I described the category of vulnerability found, and I avoided disclosing any information on the critical and high-risk vulnerabilities until after they had been remediated. Still, this is a tricky area.
Two competitive LMS vendors have criticized these tests as a marketing ploy that could be dangerous. In their opinion, student and client data is best protected by keeping the testing process out of the public domain. I cannot speak for Instructure’s motivations regarding marketing, but I did want to share these criticisms.
While there are valid arguments as to the risks of more transparent security testing, I believe the benefits outweigh the risks. The main change I would make to this type of testing is that I would separate the reporting from the testing by several months. I did find myself going back and forth on when to disclose the testing results, and a simple solution is to delay any reporting for 2–3 months after the tests are complete. I would also argue that no critical or high risk vulnerabilities should be described until after they have been remediated.
Regarding the criticism about using security testing as marketing, I do not see the problem. Yes, Instructure may use these results from a marketing perspective, but these are real testing results that shed real insight into their system. In my opinion, that is a type of marketing that adds value to the customers and market in general. In fact, it will likely put pressure on other LMS vendors to disclose more information to clients.
There are real arguments on both sides, however. What are your opinions? Here are a few questions that could be addressed in the comments.
- Do you agree that most current RFP processes do not result in real insight into the security of systems and remediation practices under consideration? Are there examples that can be shared publicly?
- Do you agree that the market would benefit from security testing with independent observers, or is it better to keep these results out of the public domain?
- What alternate suggestions do you have to improve the level of institutional insight into system security and remediation practices, while not jeopardizing client data?