A response to a W3C priorities blog
This blogpost is a comment on this interesting white paper
by Crispin Weston and Pierre Danet about setting W3C priorities. I first thought to comment on the site itself, but my comment became rather lengthy, so it seemed more logical to post it here. Since coming to work in the United Kingdom I have not really been involved in standards for education, but the topic triggered my previous experiences. As I did my fair share of ‘standards’ work from quite early on, 2004-ish (SCORM packages, programming the SCORM module in Moodle, assessment standards for maths via SURF), I thought it would be good to comment on the blog in more detail: not only because I disagree with some of it, but more importantly because I think any new development in this area *must* learn from previous experiences. I thought long and hard before writing this piece because I don’t want to come across, or be dismissed, as someone against innovation per se. But I must admit that even when I lived in the Netherlands, I did not feel people really wanted much innovation.
The context of the document
I hope that some of the claims and associations made in the first paragraph(s) will be reworded or evidenced better. At the moment, one sentence combining underperforming education, PISA, MOOCs and elite courses seems far-fetched. I also wonder whether the statement that it’s a good time for 1:1 because touch interfaces are easy to use isn’t a bit too ‘easy’. On evidence, I would also say that if we run with the assumption that technology can do both good and bad, there might not be general evidence of the impact of technology on learning outcomes because technology is a means to an end. Like any tool, it can be used correctly or incorrectly.
The authors of the piece see three key barriers to the emergence of a dynamic ed-tech market (why the word ‘market’?):
- The lack of interoperability of systems and content that would allow different sorts of instructional software to work together in an integrated environment.
- The lack of discoverability of innovative new products.
- The institutional conservatism of many state-run education systems, which are often resistant to innovation and uncomfortable with the use of data as a management tool.
I certainly agree that interoperability (or the lack thereof), which *is* the main topic of the article, is a big barrier (some more comments later on). The second one, discoverability, is not really defined, but if it leads to the conclusion that the W3C would be a good organisation to connect different players, then that’s fine as well. However, the article then primarily emphasises how the W3C should “work with governments that are prepared to address the institutional conservatism to be found in their own formal education system”. This is in line with the third barrier the authors define, adding that these systems “are often resistant to innovation”. I think such statements are unwarranted and rather unproductive. They also neglect the role, at least on the topic of adoption of standards and interoperability, of companies that, in my view, have systematically undermined standards over the last decades, either intentionally or through mismanagement and neglect. In my view this thread runs from as early as HTML 3.2 (Netscape vs Internet Explorer) to the current ePub3, Kindle and iBooks endeavours.
The paper then moves on to a requirements analysis. In general terms this section is fine, and I certainly see a lot of good in Analytics and Games. I do, however, miss some analysis. How are these aspects effective in “other businesses”? Why is that the case? What characteristics make it so? What is to say it would work in education? And, crucially, why and how would you adopt a whole paradigm? Using this to argue that little has been done, and subsequently to propose a way forward, has a bit of a ‘light touch’.
What I do find interesting and appropriate is the conclusion that interoperability between the two strands, let’s concisely call them analytics and content, is an important aspect. So although I think the analysis falls short, good interoperability would do no harm and might even serve as a catalyst. But that’s not something new, of course. Getting a standard adopted is ‘a different cookie’ (sorry, a Dutch joke). I also like the ambition to outsmart the proprietary market and be there first.
Having used SCORM myself, and even having modded the SCORM module in Moodle so that it made more use of the SCORM specification, I think the lessons drawn from it are incomplete. One only needs to look at software that can create SCORM packages, like Reload/Weload and Xerte, but also commercial packages, to see that it has been possible to make rich content. So I’m not really sure that a lack of standardisation and tools is why it has not taken off. When I extended the SCORM module, most users did not really care; they just wanted a simple scoring mechanism. The same holds now: current adaptive systems are mainly multiple choice and scores, and the game mechanisms I see are mainly points, badges and leaderboards. To me, that indicates users might not really want more sophistication (yet). I understand this might be seen as a chicken-and-egg issue, i.e. when we finally *can* make sophisticated content it *will* happen. Perhaps this is true (although history tells us otherwise), but it certainly would be smart to analyse the situation more deeply. Not least with regard to the role of the current edtech industry, which, in my view, has sometimes frustrated attempts to formulate standards.
This also brings me to a general frustration: millions have been spent on attempts to write specifications for standards and, even worse, metadata. Over the years this has resulted in thousands of pages of documentation. Why will it be different this time? I feel that before setting out on yet another such journey, that question needs to be answered extensively. The description of the SCORM standard shows that we are dealing with experts. Given what I said previously, I think some reasons for SCORM’s waning matter more than others. Apart from asking ourselves what those factors were, we also need to ask how they will be prevented this time. I also wonder whether there is still any scope in assessment standards like QTI. A thread running through almost all standards, in my view, is mismanagement and obstruction by organisations and enterprises. If the W3C leads the process, that is at least a strong start. To what extent the W3C can confront the largest enterprises in the world, I don’t know.
A second risk remains hardware and software. Hardware and software become obsolete or deprecated. Every time this happens we are told there are good reasons for it, often inefficiency or security (e.g. Java, but that is also, again, mismanagement and perhaps personal feuds), but in any case: who is to say this won’t happen again? In my opinion it ties in again with the corporate aspect. The W3C should be strong enough to address it.
SCORM was under-utilized
From a technical point of view I have always thought the CMI had not been used as well as possible. I agree that this was partly because of the SCORM specification, but it was also down to unfamiliarity with it, for example the suspend data element (cmi.suspend_data). A package could communicate quite a lot of information through this field, which needed two things: (1) packages that would create suspend data, and (2) a VLE that would actually read that state. I remember being involved in the Galois Project, a maths algebra project, where we tried to do just that. The former was done by creating a system that could produce algebra SCORM packages which used the suspend data for storage. The latter indeed had to be obtained by reprogramming a VLE, which I did for Moodle’s SCORM module. The annotated SCORM module was posted on the Moodle forums. As said previously, most people simply did not see the point in doing such a thing. Now, this was just a hack, but it did lead me to believe that there is more going on in the education world, in that technology is (and probably should be) linked to the preferred didactics/pedagogy. So: maybe we don’t even need a lot of sophistication. Why am I saying this? I think it would be good to use some Rapid Application Development here: get some working stuff out there first, rather than slave for years on specs and metadata.
Having said all that, I do think, given the developments, that a follow-up to SCORM is needed, and that it is warranted for the W3C to look into something like that. It is smart, I think, to take connectivity rather than content as the theme, to make it more W3C. It also provides a good reason to include analytics. The fact that the authors mention privacy and data protection shows awareness of the politics involved in such an initiative. So overall I think this is a good initiative, but I ask attention for the following:
- Traction and commitment from enterprise. How do we prevent the process being frustrated?
- Get technology working quickly rather than engaging in endless specification and metadata mashing.
- Promote a more sophisticated use of technology as well.
- Either refrain from sweeping statements about ‘conservatism’ in education and focus only on interoperability, or get more evidence for the various claims (I doubt you will find it).