I wasn't planning on writing this post, but I've become aware of several recent conversations that have led me to the conclusion that it would be useful to get this out.
For people who adopt software, trying to judge the value of so-called "standards support" in a product can be an incredibly frustrating experience. Standards implementations often fail to live up to their promises and, worse, it can be very hard to tell before installing and running the software whether the "standards support" it supposedly provides will actually meet your needs.

There are several reasons for this. First, creating a standard is hard. You have to think of all the different ways people might want to use the technology in all kinds of environments, and then find a common approach that can meet all of those needs while still working with existing software packages that are very different from each other. For example, PeopleSoft Campus Solutions and SunGard Banner are both SISs, but their architectures and data models differ substantially. The same is true among the various LMSs. If you have a standard that is supposed to represent data from any SIS to any LMS (as, for example, the IMS Learning Information Services standard does), then you have to come up with a representation of that data that maps to the different data models and works with the different architectures. This is always hard, and it's harder on some software developers than on others. For example, if your system doesn't already have robust, generic ways to interact with web services, it will take more work for you to support a standard that is built on web services.
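To make the mapping problem concrete, here is a minimal sketch of what a standard's shared representation has to do. Everything in it is hypothetical: the field names and record layouts are invented for illustration and are not taken from Campus Solutions, Banner, or the LIS specification itself.

```python
# Illustrative sketch only: two hypothetical SISs store the same enrollment
# fact in different shapes, and a shared representation has to absorb both.

def to_common_enrollment(record: dict, source: str) -> dict:
    """Map a source-specific enrollment record to one shared shape."""
    if source == "sis_a":  # hypothetical SIS with flat fields
        return {
            "person_id": record["emplid"],
            "section_id": record["class_nbr"],
            "role": "student" if record["stdnt_enrl"] else "instructor",
        }
    if source == "sis_b":  # hypothetical SIS with nested fields
        return {
            "person_id": record["member"]["pidm"],
            "section_id": record["crn"],
            "role": record["member"]["role"].lower(),
        }
    raise ValueError(f"unknown source: {source}")

a = to_common_enrollment(
    {"emplid": "123", "class_nbr": "4567", "stdnt_enrl": True}, "sis_a")
b = to_common_enrollment(
    {"member": {"pidm": "123", "role": "STUDENT"}, "crn": "4567"}, "sis_b")
print(a == b)  # True: both map to the same canonical record
```

The hard part of standards design is exactly this translation layer, multiplied across every vendor's data model, which is why it costs real engineering effort to do well.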
Further complicating the picture is that, while it is almost always in vendors' interest to say that they support a standard, it often isn't in their interest to do the hard work of actually supporting the standard, especially in the short term. Standards tend to reduce the total amount of money that customers spend on integration, which generally means that somebody is going to make less money. This hits companies that depend on service revenues harder than it hits companies that mostly sell product because at least with the product you can build the cost of the standards implementation into your license fees. But the truth of the matter is that, most of the time, everybody makes less money after a standard has been implemented, at least in the long run. This is because a standard, by definition, commodifies the function it is standardizing. A capability cannot be a differentiator if everybody does it. Therefore, while you might be able to continue to charge for the capability, particularly if you implement it very well and early, customers will, over time, increasingly expect it to be an affordable part of the package rather than a major expense (including the ongoing expense of maintenance that comes with a consulting-based custom integration).
The net result is a situation where it is both technically easy and financially convenient for the outcome of standards implementation to be nothing more than one more meaningless bullet point that vendors can add to their glossy product brochures. But it doesn't have to be that way. You can hold your vendors accountable for delivering the actual value that the standards promise, if you know the right questions to ask.
For starters, learn what it means for software to "comply" with the particular standard. Using a standard and complying with it are not the same thing, and there can be nuances. For example, the Learning Information Services (LIS) standard, like many of the newer IMS standards, has something called an application profile (also sometimes called a conformance profile), which is essentially the subset of the total specification that developers must implement in order to ensure interoperability. At the moment, LIS has only one application profile. So for LIS, ask your vendor whether they support, or intend to support, all mandatory elements of that profile. If they don't, then they don't really support the standard and can't guarantee interoperability. Oracle's implementation of LIS (called SAIP) supports all mandatory elements. Beyond the mandatory elements, there are optional ones. Ask which of those the vendor supports and which they don't; you may even want a list. These elements provide additional functionality beyond the bare minimum, some of which may be very important to your institution. Details matter. SAIP supports many, but not all, of the optional elements in the LIS application profile.
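The mandatory/optional distinction is simple set logic, and the vendor conversation above can be sketched as a checklist. The element names below are hypothetical placeholders, not the actual contents of the LIS application profile.

```python
# Hypothetical element names; a real application profile defines its own
# lists of mandatory and optional service elements.
MANDATORY = {"readPerson", "readCourseSection", "readMembership"}
OPTIONAL = {"readGroup", "replaceResult"}

def check_claimed_support(supported: set) -> dict:
    """Summarize a vendor's claimed support against the profile."""
    return {
        # Conformant only if every mandatory element is covered.
        "conformant": MANDATORY <= supported,
        "missing_mandatory": sorted(MANDATORY - supported),
        "optional_covered": sorted(OPTIONAL & supported),
    }

report = check_claimed_support({"readPerson", "readCourseSection", "readGroup"})
print(report["conformant"])        # False: a mandatory element is missing
print(report["missing_mandatory"])  # ['readMembership']
```

The point of the sketch is the asymmetry: any gap in the mandatory set means the product does not comply, no matter how many optional extras it supports.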
Second, find out whether the vendor has added extensions to the standard that are required for integration. Extension isn't always a bad thing, which is why many standards explicitly provide extension mechanisms. However, extensions that break the minimal interoperability the standard promises are always a bad thing. Oracle has not extended LIS in SAIP and has committed to supporting the finalized, plain vanilla LIS application profile, with no extensions required to achieve interoperability, shortly after that final version comes out.
Third, find out if there are any official conformance tests and, if so, whether the software you're evaluating has passed them. LIS doesn't have an official conformance test yet, although one is planned. Basic Learning Tools Interoperability (BLTI), another IMS standard, does have a conformance test. If a software package supposedly supports BLTI but hasn't passed the test, that would be a red flag.
Finally, find out whether the software has been tested against multiple implementations. As much as a standard can lower the barriers to interoperability, there is no better test, either of the standard itself or of its implementation, than seeing whether integration works as advertised with different products. If a supposedly standards-compliant product has only been tested to integrate with one other product, then you can't have confidence that it has achieved the true interoperability the standard promises. Oracle has completed LIS integration testing with Sakai, Moodle, and Schools on Facebook, is testing integration with another product now, and has plans to test with more in the near future.