Christian Bokhove

…wonderful life

A response to a W3C priorities blog

This blogpost is a comment on this interesting white paper by Crispin Weston and Pierre Danet about setting W3C priorities. I first thought to comment on the site, but my response became rather lengthy, so it seemed more logical to post it here. Since coming to work in the United Kingdom I have not really been involved in standards for education, but the topic brought back my previous experiences. From quite early on, around 2004, I did my fair share of ‘standards’ work (SCORM packages, programming the SCORM module in Moodle, assessment standards for maths via SURF), so I thought it would be good to comment on the blog in more detail, not only because I disagree with some of it, but more importantly because I think any new development in this area *must* learn from previous experiences. I thought long and hard before writing this piece because I don’t want to come across, or be dismissed, as someone who is against innovation per se. But I must admit that even when I lived in the Netherlands, I did not feel people really wanted too much innovation.

The context of the document

I hope that some of the claims and associations made in the first paragraph(s) will be reworded or better evidenced. At the moment, one sentence combining underperforming education, PISA, MOOCs and elite courses seems far-fetched. I also wonder whether the statement that it’s a good time for 1:1 because touch interfaces are easy to use isn’t a bit too ‘easy’ itself. On evidence, I would also say that, if we run with the assumption that technology can do both good and bad, there might not be general evidence of the impact of technology on learning outcomes precisely because technology is a means to an end. Like any tool, it can be used well and badly.

Three barriers

The authors of the piece see three key barriers to the emergence of a dynamic ed-tech market (why the word ‘market’?):
  • The lack of interoperability of systems and content that would allow different sorts of instructional software to work together in an integrated environment.
  • The lack of discoverability of innovative new products.
  • The institutional conservatism of many state-run education systems, which are often resistant to innovation and uncomfortable with the use of data as a management tool.
I certainly agree with interoperability (or lack thereof), which *is* the main topic of the article, as being a big barrier (some more comments later on). The second one, discoverability, is not really defined, but if it leads to the conclusion that the W3C would be a good organisation to connect different players, then that’s fine as well. However, the article then primarily emphasizes how the W3C should “work with governments that are prepared to address the institutional conservatism to be found in their own formal education system.” This is in line with the third barrier the authors define, adding that these systems “are often resistant to innovation”. I think such statements are not warranted and rather unproductive. It also neglects the role, at least on the topic of adoption of standards and interoperability, of companies that, in my view, have in the last decades systematically undermined standards, either intentionally or through mismanagement and neglect. In my view this thread runs from as far back as HTML 3.2 (Netscape vs Internet Explorer) to the current ePub3, Kindle and iBooks endeavours.

Requirements

The paper then moves on to a requirements analysis. In general terms this section is fine, and I certainly see a lot of good in Analytics and Games. I do, however, miss some analysis there. How are these aspects effective in “other businesses”? Why is that the case? What characteristics make it so? What is to say it would work in education? And, crucially, why and how would you adopt a whole paradigm? Using this to argue that little has been done, and subsequently to propose how to go forward, has a bit of a ‘light touch’.

Interoperability

What I do find interesting and appropriate is the conclusion that interoperability between the two strands (let’s concisely call them analytics and content) is an important aspect. So although I think the analysis falls short, I think there would be no harm in having good interoperability; it could even serve as a catalyst. But that’s not something new, of course. Getting a standard adopted is a ‘different cookie’ (sorry, a Dutch joke). I also like the ambition to outsmart the proprietary market and be there first.
Having used SCORM myself, and even having modded the SCORM module in Moodle so that it made more use of the SCORM specification, I think the lessons to learn from it are not complete. One only needs to look at software that can create SCORM packages, like Reload/Weload and Xerte, but also commercial packages, to see that it has been possible to make rich content. So I’m not really sure whether it is the lack of standardisation and tools that explains why it has not really taken off. When I extended the SCORM module, most users did not really care; they just wanted a simple scoring mechanism. The same holds now: when I look at current adaptive systems they are mainly multiple choice and scores. When I look at game mechanisms they are mainly points, badges and leaderboards. To me, that indicates users might not really want more sophistication (yet). Now, I understand this might be seen as a chicken-and-egg issue, i.e. when we finally *can* make sophisticated stuff, it *will* happen. Perhaps this is true (although history tells us otherwise), but it certainly would be smart to analyze the situation more deeply, not least with regard to the role of the current edtech industry, which, in my view, has sometimes frustrated attempts to formulate standards.
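For readers who never opened one of these packages: a SCORM package is essentially a zip with an imsmanifest.xml at its root describing the content. A minimal SCORM 1.2 manifest looks roughly like the sketch below (the identifiers and file names here are made up for illustration):

```xml
<manifest identifier="com.example.algebra" version="1.0"
          xmlns="http://www.imsproject.org/xsd/imscp_rootv1p1p2"
          xmlns:adlcp="http://www.adlnet.org/xsd/adlcp_rootv1p2">
  <organizations default="ORG-1">
    <organization identifier="ORG-1">
      <title>Algebra exercises</title>
      <item identifier="ITEM-1" identifierref="RES-1">
        <title>Solving linear equations</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <!-- scormtype="sco" marks content that talks to the LMS run-time API -->
    <resource identifier="RES-1" type="webcontent" adlcp:scormtype="sco"
              href="index.html">
      <file href="index.html"/>
    </resource>
  </resources>
</manifest>
```

Tools like Reload or Xerte generate exactly this kind of structure, which is why rich content has long been technically possible.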
This also brings me to a general frustration: millions have been spent on attempts to write specifications for standards and, even worse, metadata. Over the years this has resulted in thousands of pages of documentation. Why will it be different this time? I feel that before setting out on yet another such journey, that question needs to be answered extensively. The description of the SCORM standard shows that we are dealing with experts. Given what I said previously, I think some reasons for SCORM’s waning matter more than others. Apart from asking ourselves what those factors are, we also need to ask ourselves how they will be prevented this time. I also wondered whether there is still any scope in assessment standards like QTI. A thread running, in my view, through almost all standards is the mismanagement and frustration by organisations and enterprises. If W3C leads the process, that is at least a strong start. To what extent W3C can confront the largest enterprises in the world, I don’t know.
A second risk remains hardware and software. Both become obsolete or deprecated. Every time this happens we are told there are good reasons for it, often inefficiency or security (e.g. Java, but that is also, again, mismanagement and perhaps personal feuds), but in any case: who’s to say this won’t happen again? In my opinion it certainly ties in again with the corporate aspect. The W3C should be strong enough to address it.

SCORM was under-utilized

From a technical point of view I have always thought the CMI data model had not been used as well as possible. I agree that this was partly because of the SCORM specification itself, but it was also down to unfamiliarity with it, for example the ‘suspendState’ option. A package could communicate quite a lot of information through this field, needing two things: (1) packages that would create a suspend state, and (2) a VLE that would actually read that state. I remember being involved in the Galois Project, a maths algebra project, where we tried to do just that. The former was done by creating a system that could produce algebra SCORM packages which utilized the suspend state for storage. The latter indeed had to be obtained by reprogramming a VLE, which I did for Moodle’s SCORM module. The annotated SCORM module was shared on the Moodle forums. As said previously, most people simply did not see the point in doing such a thing. Now, this was just a hack, but it did lead me to believe that there’s more going on in the education world, in that technology is (and probably should be) linked to the preferred didactics/pedagogy. So maybe we don’t even need a lot of sophistication. Why am I saying this? I think it would be good to use some Rapid Application Development here: get some working stuff out there first, rather than slave for years on specs and metadata.
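To make the two-sided requirement concrete, here is a hypothetical sketch of how a SCO could round-trip rich state through the SCORM 1.2 run-time API (the field the post calls ‘suspendState’ is `cmi.suspend_data` in the SCORM 1.2 data model). The `MockScormApi`, the `ExerciseState` shape and the example values are all invented for illustration; in a real course the LMS would expose the API object as `window.API`.

```typescript
// The SCORM 1.2 run-time API surface a SCO discovers on the window.
interface ScormApi {
  LMSInitialize(arg: string): string;
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: string): string;
  LMSFinish(arg: string): string;
}

// Minimal in-memory stand-in for an LMS-provided API adapter (hypothetical).
class MockScormApi implements ScormApi {
  private data: Record<string, string> = {};
  LMSInitialize(_: string): string { return "true"; }
  LMSGetValue(element: string): string { return this.data[element] ?? ""; }
  LMSSetValue(element: string, value: string): string {
    this.data[element] = value;
    return "true";
  }
  LMSCommit(_: string): string { return "true"; }
  LMSFinish(_: string): string { return "true"; }
}

// Hypothetical algebra-exercise state, richer than a bare score.
interface ExerciseState {
  step: number;
  expression: string;
  attempts: number;
}

function saveState(api: ScormApi, state: ExerciseState): void {
  // SCORM 1.2 caps cmi.suspend_data at 4096 characters, so keep it compact.
  api.LMSSetValue("cmi.suspend_data", JSON.stringify(state));
  api.LMSCommit("");
}

function restoreState(api: ScormApi): ExerciseState | null {
  const raw = api.LMSGetValue("cmi.suspend_data");
  return raw ? (JSON.parse(raw) as ExerciseState) : null;
}

const api = new MockScormApi();
api.LMSInitialize("");
saveState(api, { step: 3, expression: "2x + 5 = 11", attempts: 2 });
const restored = restoreState(api);
console.log(restored?.expression); // "2x + 5 = 11"
api.LMSFinish("");
```

The point of the Galois Project hack was exactly this split: the package side writes the state, but nothing happens unless the VLE side also parses it instead of treating the field as an opaque blob.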

Conclusion

Having said all that, I do think, given the developments, that a follow-up for SCORM is needed, and also that it is warranted for W3C to look into something like that. It is smart, I think, to take the theme of connectivity rather than content, to make it more W3C. It also provides a good reason to include analytics. The fact that the authors mention privacy and data protection acknowledges awareness of the politics involved in such an initiative. So overall I think this is a good initiative, but I ask attention for the following:
  • Traction and commitment from enterprise. How do we prevent the process from being frustrated?
  • Get technology working quickly rather than endlessly mashing specifications and metadata.
  • Promote a more sophisticated use of technology as well.
  • Either refrain from sweeping statements about ‘conservatism’ in education and focus only on interoperability, or get more evidence for the various claims (I doubt you will find it).

6 comments on “A response to a W3C priorities blog”

  1. Crispin Weston
    February 28, 2015

    Hello Christian,

    Thanks very much for the helpful and thoughtful comments.

    I think that some of your comments on the early part of the paper reflect the fact that we were trying to keep the whole thing short. We certainly could have expanded it if we had felt that we would have kept the attention of our readers. So the message in the first paragraph is really very simple: “education needs the improvements that edtech has to offer”.

    I agree with your comments on companies that have undermined standards – but I also think that companies will behave in a way that maximises profits, and so the blame for bad behaviour really falls on the policemen (i.e. the regulators) and those who draw up the procurement specifications (normally, government). I could write at great length about how these processes have been mishandled in the past. You sum it up pretty well with the word “neglect”.

    Similarly on “Requirements” – I might come back and say more about this another day. Basically, companies track transactions and processes with complex IT systems – think of a supermarket logistics system. And as Professor Diana Laurillard has argued, education is a highly transactional business, yet no similar approach to process management is followed. It’s a high-level point, of course: one cannot pursue the analogy between teaching and selling cabbages too far – but at the end of the day, setting an assignment, monitoring student progress, collecting those essays and checking their compliance with the deadline, handing them out again for redrafting, getting the students to comment on each other’s work – is all a highly complex administrative task that quickly drowns most teachers.

    In your section on SCORM…

    I did a survey in 2009 for UK agency Becta which showed that commercial suppliers had very little interest in QTI. I don’t really like calling it an “assessment” format as that places a lowest common denominator definition on “assessment”. It is basically a multiple-choice format, and assessment is much richer than that. In fact I think that practice is vital to teaching and learning and so long as practice is monitored and tracked (and why wouldn’t you want to do that as a teacher?) then I see no real difference between teaching and assessing.

    So why do people get stuck on multiple choice?

    Partly because developing rich digital interactions is complex and expensive. You don’t like the word “market”. And although I use the word in its broadest sense to mean a nexus between suppliers and consumers of digital resources (also used in its widest sense to include software) – I personally do not think that the rich interactive content that we need in this space is going to be produced by anyone other than commercial suppliers with significant amounts of money to invest. I am happy to be proved wrong, but that is my view at the moment. I think there will be an important niche for OER in the application of authoring tools to more curriculum-specific content – but I very much doubt the OER community is going to produce the underlying software we need.

    So why haven’t the investors come along? And why do we seem to have such an anti-competitive market in the bits of software that people do spend money on, like school MIS systems?

    In many ways, I think it is because we have had monopoly buyers (government), and monopoly buyers prefer to deal with monopoly suppliers rather than lots of fiddly little people. So I think decentralising procurement will also help decentralise supply – and decentralised supply (if it is connected by robust interoperability standards) enables innovation.

    Another problem is the overhead imposed by the regulatory environment of a government-run service. The only suppliers able to sell serious amounts of textbooks in the UK are those that are formally endorsed by the awarding bodies – which is another source of the anti-competitive market.

    You say that the SCORM model was underused. I completely agree. Part of the problem was poor certification. How can someone have a VLE which says it is SCORM compliant but which does not support the suspendState field? Answer: probably because it is an Open Source system bodged together by the maths lecturer in his spare time. If it is a commercial system, then it is because of some incompetent bureaucrat who certified the system without doing any proper tests on it (I referred Becta to the European Commission in 2007 for doing exactly that); several Local Authority tenders were cancelled and Becta’s approval rating slumped from 50% to 20%, but the damage had still been done. We need transparent and credible badging.

    You complain that there has been too much emphasis on specs development, which has been wasted because none of it has been implemented. Again, I entirely agree. Developing specs without implementation is a complete waste of time, not only because an unimplemented set of specs serves no purpose, but also because an unimplemented set of specs is an untested set of specs, which is certain not to be worth the paper it is written on. ISO/IEC JTC1/SC36, the International Standards Organisation committee for IT in Learning, Education and Training, routinely creates documents which it calls International Standards and which will cost you £150 to read, even though they have never been implemented or tested by a soul.

    So any project in W3C, if I have my way, will be focused on specs development that is done hand-in-hand with implementation. This is also the attitude of W3C, which I think is healthy.

    So why should the implementers turn up to implement? Good question – perhaps they won’t. But the best chance of getting the implementers to turn up is to ensure that governments, who are the people who control most ed-tech markets, ensure that support for robust, proven interoperability is recognised and rewarded in their markets (again, in the widest sense of the word – and they can do that e.g. by a strong badging scheme which does not only *certify* products but also *advertises* them). Because one reason that suppliers (or Open Source communities, if you believe that they will be the ones to lead these developments – I do not want to pre-judge anything here) do not come up with more innovative digital solutions is that these are not being demanded by the education system. And that is what I mean by conservative institutions: we have to stimulate interest and demand. That will in part be a chicken-and-egg thing – people will demand more edtech when they start to see good edtech that can really help them in the classroom.

    So I think the phrases in the first section of our paper, which you did not particularly like, are not really so important. What matters is what we suggest should be done, i.e. specs development tied to development and implementation. And on that, I think we are closely aligned.

    Do come along and contribute to the group (next call, next Friday, 4pm CET) – the more people with a working knowledge of SCORM, the better in my view. Email me if interested – see the blog.

    And thanks again for your helpful comment. Best, Crispin.

    • cbokhove
      February 28, 2015

      Thanks Crispin. I agree most is aligned, and also see how such a piece should not be bogged down in details. I think I will be mailing.

  2. Hello,

    Regarding discoverability, I thought it was more about allowing “Education Learning Objects” to be easily searchable on the web platform. Something which is also quite important for books and ebooks. The web is so big now that this “discoverability concept” becomes a challenge.

    The undermining of standards, especially in Europe, is real, you’re right, but it may be changing a little now. There is also a common confusion between norms and standards. Standards are the best way to cooperate within an industry to make collective progress. Norms are more public-oriented.

    On requirements, you’re right. How to progress is a good question. This is something we faced with W3C to make them aware that it is a strategic subject.

    Totally agree with you on the idea that “If W3C leads the process, that is at least a strong start”. That was our purpose !

    On the speed of standardization, I love the idea that we could RAD standards. That is something I see inside IDPF, where the standardization process is quite quick and pragmatic. Thanks to Bill McCoy on that.

    As Crispin proposed, please join our taskforce !

    Cheers,

    Pierre

    • cbokhove
      March 1, 2015

      Hi Pierre,

      Thanks for your comments. These all seem very sensible. Only re discoverability: aha, so that is what is meant. I can see that it is important, but I hope it might be augmented by smart search algorithms and recommender systems, rather than spending ‘too much’ time discussing taxonomies and metadata constructions. I hated how some projects spent hundreds of hours trying to agree on all the (was it Dublin Core?) metadata details.

      Cheers,

      Christian

      • You’re absolutely right. Discoverability does not mean having metadata talking shops only; it is much more. +1!

  3. Pingback: ResearchED 2015 | Christian Bokhove


Information

This entry was posted on February 28, 2015 in Education.