Just as CDs and DVDs play on any player that supports the format, a SCORM-conformant course should run and track on any SCORM-conformant Learning Management System (LMS).
In practice, SCORM provides a framework that promotes, but does not guarantee, successful interoperability. There are many ways conformant content can fail to interoperate.
In my experience, the two most common causes are:
- interpreting results differently than the specification intends
- infrastructure issues (bugs, character encoding) that interfere with communication between course and LMS
Result Interpretation Issues
When completion of a course is a requirement, a completion report should identify who has satisfied that requirement. However, many LMSs report completion independently of a pass or fail condition. This makes reporting much more cumbersome, because tracking and interpreting test results requires far more granular data, with correspondingly heavier processing and data loads. A typical workaround is to build the course around an embedded test that must be passed before the course reports completion. The tradeoff is that no granular information is tracked at the level of the test itself.
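The workaround above can be sketched against the SCORM 1.2 runtime API. In the sketch below, `MockApi` is a hypothetical stand-in for the LMS-provided API adapter (real content discovers it as `window.API`); `cmi.core.lesson_status` and `cmi.core.score.raw` are genuine SCORM 1.2 data model elements, but the surrounding logic is illustrative only, not any particular authoring tool's output.

```typescript
// Sketch: report pass/fail as the completion status, so the LMS's
// completion report doubles as a pass/fail report.

interface ScormApi {
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
}

// Hypothetical stand-in for the LMS-provided adapter (window.API in SCORM 1.2).
class MockApi implements ScormApi {
  store: Record<string, string> = {};
  LMSSetValue(element: string, value: string): string {
    this.store[element] = value;
    return "true";
  }
  LMSCommit(_param: string): string {
    return "true";
  }
}

// Only mark the lesson "passed" when the embedded test meets the mastery
// score; otherwise mark it "failed" so completion implies a pass.
function reportResult(api: ScormApi, score: number, masteryScore: number): string {
  const status = score >= masteryScore ? "passed" : "failed";
  api.LMSSetValue("cmi.core.score.raw", String(score));
  api.LMSSetValue("cmi.core.lesson_status", status);
  api.LMSCommit("");
  return status;
}
```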
One of my responsibilities at Kodak WWLD was to troubleshoot SCORM tracking issues.
In the course of working through several SCORM integration issues a pattern emerged that inspired me to create a standard method for solving them. A worst-case scenario that occurred a couple of years ago highlights the steps and principles embedded in the method:
Scenario: Courses created with a popular authoring tool fail to track properly on the LMS. The authoring tool provides SCORM via a black box implementation: its functions and output files can't be modified. The LMS is provided as a service and can't be changed on the customer end either.
1. Test on a neutral platform and try to reproduce the issue; focus on the content or the LMS depending on the result.
2. Contact the vendor with the results and have the vendor reproduce them on the neutral platform, so both sides agree on the facts and the vendor commits to partnering on a solution.
3. Work through causation (keep asking why until the answer is detailed enough to reveal the nature of the problem), then build an action plan around the change most likely to solve the issue.
In my scenario there was no obvious step I could take directly: modifying the black box would raise compatibility issues with future versions, add steps to every build, and still not solve the underlying problem. The LMS provider wanted proof that their system was the root cause, but troubleshooting capabilities on the customer end were very limited.
To complete Step 1 of the process I ran the content in the SCORM Test Suite from ADL. This is the same software ADL uses to test for SCORM conformance – no one will argue with its results, and it saves a results log I could send to the vendor. Everything checked out fine in that environment.
Step 2: Even though it already looked like the problem was on the LMS, I went to the authoring tool vendor and they agreed to verify my results. This would help me deal with the LMS vendor, who I knew would insist on these steps.
I then went to the LMS provider with the test results. They agreed to audit a session we arranged on their LMS, and we compared its results against the test suite's.
I arrived at Step 3 quickly because the LMS vendor acknowledged that the content behaved differently on their system than the test suite. They went through their logs and worked out the root cause on their end. They found that the authoring tool sent unescaped text data to the LMS that included a special character that the LMS interpreted as an end-of-file marker.
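A root cause like this can be guarded against on the content side by escaping string data before it is sent to the LMS. The sketch below is illustrative, not the authoring tool's or LMS's actual code; it assumes both sides agree on URI-style percent-encoding, which replaces control characters (such as whatever byte a parser might treat as an end-of-file marker) with safe ASCII sequences.

```typescript
// Sketch: defensively escape string data before passing it to LMSSetValue,
// assuming the reading side applies the matching decode.

// Percent-encode anything outside the unreserved ASCII set, including
// control characters that an LMS parser might misinterpret.
function escapeForLms(raw: string): string {
  return encodeURIComponent(raw);
}

// Reverse the encoding when reading the value back from the LMS.
function unescapeFromLms(stored: string): string {
  return decodeURIComponent(stored);
}
```

Because the escaped text is plain printable ASCII, nothing in it can collide with a terminator or delimiter on the LMS side, and the round trip recovers the original string exactly.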
The LMS vendor's product is used by many customers, so they are very reluctant to change anything for just one. In this case the problem could affect any and all of their customers, so it was in their interest to make the changes that solved our issue.
While it may read like a simple way to solve problems, this example required many meetings, phone calls, and almost daily email updates. I became known as “the anklebiter” in my department along the way, but the process ensured that everything was widely communicated and very professional.
To learn more about SCORM and the test suite, go to the ADL website.