The Department for Education, in collaboration with other authorities in the Education, Skills and Children’s Services sector, has since at least 2009 been developing its own set of data-related standards under the management of the Information Standards Board (ISB).

According to its own website (http://data.gov.uk/education-standards/standards-adoption), “The core business of the ISB, supported by the Technical Support Service (TSS), is to successfully embed standards within the Education, Skills and Children’s Services (ESCS) system in England.”

The ISB has two “approved” statuses for its published standards – “Approved: Recommended” and “Approved: Adopted”.  On the ISB website today, there are 269 “Recommended” standards, and zero “Adopted” standards.

So why, in five years, has the ISB failed in its core aim of issuing standards which are actually being used?

We believe the answer lies in the approach taken by the ISB in producing and publishing these standards.  Let’s take as our benchmark the work of the Internet Engineering Task Force (IETF), which is responsible for producing a large number of widely-deployed technical standards.  To quote from “The IETF Standards Process” (RFC 2026):

The goals of the Internet Standards Process are:

  • technical excellence;
  • prior implementation and testing;
  • clear, concise, and easily understood documentation;
  • openness and fairness; and
  • timeliness.

Assuming we agree that these are all good goals to have when producing a standard, let’s assess the performance of the ISB against those goals.

1) Technical excellence

It could be argued that technical excellence should be relatively low on the list of priorities for the ISB (its aim is simply to produce usable standards), but there is a minimum requirement that the work be “fit for purpose”.  Unfortunately there have been numerous examples of documents published as “Approved” by the ISB which are simply not fit for purpose.

One of the foundations of the ISB data standards is the Business Data Architecture Data Types document.  This was first published on 20th September 2013 as version 4.0 and was so riddled with errors that that version is not even archived on the ISB website!

It contains statements such as the recursive definition of a date as “A string providing date information and hence containing following values – year, month and date”, with each of those three components in turn defined to consist of a single digit (0 to 9).  It also defines a type called “Simple_Integer” as “A simple unsigned string of numeric values.”, and another type called “Integer” with the simpler definition of “Signed numeric value”.

It was simply not possible to implement any of the definitions provided in this document without ignoring large parts of it and trying to guess what the document author intended.
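
For contrast, the datatype library built into XML Schema – the same technology used for the ISB’s own XML representation – already covers these concepts.  The following is a minimal sketch only; the element names are invented for illustration and are not taken from the BDA Data Types document:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative sketch only: element names are invented, not taken
         from the BDA Data Types document. -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

      <!-- A date already carries year, month and day; there is no need
           to re-define what a digit is. -->
      <xs:element name="DateOfBirth" type="xs:date"/>

      <!-- The idea behind "Simple_Integer": an unsigned whole number. -->
      <xs:element name="PupilCount" type="xs:nonNegativeInteger"/>

      <!-- The idea behind "Integer": a signed whole number. -->
      <xs:element name="AdjustmentValue" type="xs:integer"/>

    </xs:schema>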

When defining controlled lists of values for common concepts such as language, country and currency, the ISB have quite rightly decided to adopt existing ISO standards in those areas.

However, the ISB have fundamentally misunderstood the purpose of those ISO standards, which are titled “Language Codes – ISO 639”, “Country Codes – ISO 3166” and “Currency Codes – ISO 4217” – that is, they are standard lists of *codes* for languages, countries and currencies.

In trying to fit in with other controlled lists, the ISB have broken perfectly good existing standards by using the textual descriptions (e.g. Spanish, Spain and Euro) instead of the codes defined by the standards.  The problem arises when an ISO code remains the same but its textual description changes.
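
To illustrate the difference, here is a minimal sketch of a controlled list carried as ISO codes rather than descriptions – the type and element names, and the selection of ISO 639-1 values, are ours for illustration.  The wording attached to a code can change without breaking any data that carries the code:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch only: names and the selection of values are illustrative. -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

      <xs:simpleType name="LanguageCode">
        <xs:restriction base="xs:string">
          <xs:enumeration value="en"/> <!-- English -->
          <xs:enumeration value="es"/> <!-- Spanish (also described as Castilian) -->
          <xs:enumeration value="cy"/> <!-- Welsh -->
        </xs:restriction>
      </xs:simpleType>

      <xs:element name="FirstLanguage" type="LanguageCode"/>

    </xs:schema>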

This was raised as an issue with ISB in May 2014 but no visible action has been taken.

Regarding the many other controlled lists and data documents, there is no way to reliably assess their fitness for purpose without attempting to implement them in the real world.  Which brings us to…

2) Prior implementation and testing

The ISB have stated that it is not their role to perform testing – the ISB propose standards and rely on stakeholders to implement them.  However, given that no ISB standard has been through an implementation process and moved to “Adopted” status, this approach is obviously not working.

The ISB standards are also heavily dependent on each other, adding layer upon layer upon layer.  When mistakes are found in documents that many other documents depend on (such as the BDA Data Types document), or improvements to those documents are suggested, the suggested changes are rejected as impossible to implement because of the many other standards that would be affected.

If no stakeholders have implemented any of these standards then, as well as leaving the standards unproven, it raises the obvious question of whether they were needed in the first place.

The motivation for much of the ISB’s standards work has been the DfE Data Transformation Programme, which is described as consisting of the “Data Exchange” and “School Performance Data Programme” projects, neither of which is currently being actively pursued.

However, there is still value in the work undertaken by the ISB, especially in the production of controlled lists which can be used regardless of the over-arching data model.  Instead of investing in the production of further standards, the ISB should stop and invest in its own testing and implementation of at least the core standards and their XML representation.

To quote a saying often attributed to Albert Einstein, “In theory, theory and practice are the same. In practice, they are not.”

3) Clear, concise, and easily understood documentation

The documents published by the ISB are far from being clear, concise and easily understood.

The ISB staff who develop the ISB data model use the ERwin software to view, develop and maintain the model that sits behind all of the published ISB documents.

However, the “other side” of the process (organisations wishing to develop systems conforming to the ISB standards) have none of that functionality, and are simply provided with a set of static PDF documents, often containing 90% boilerplate text and 10% actual content.

There is also an XML Schema (xsd) file which partially specifies the XML representation of ISB data.  In order to obtain the full specification of this XML representation, an implementer is required to refer to the PDF documents.  No examples of XML files have been published.

To quote from the W3C, “XML Schemas express shared vocabularies and allow machines to carry out rules made by people. They provide a means for defining the structure, content and semantics of XML documents.”

The XML Schema provided by ISB defines the structure of an ISB XML file, but as a matter of policy declines to use the language of XML Schema to define the content and semantics of such files.

The main example of this is the failure to incorporate the controlled lists into the Schema, but it also extends to the absence of any documentation of the semantics of the structures being defined.  Instead, a user of the schema is referred to the many PDF documents as the definitive source of documentation.

Given that XML will be the main method of transferring ISB data between systems, it is vital to ISB’s success that the documentation on the XML format is both accessible and unambiguous.  The XML Schema language should be fully used to aid in this task, not ignored.
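
As a sketch of what “fully used” might look like – the type, element and values below are invented for illustration, not taken from the published ISB schema – both the controlled list and its semantics can live inside the schema itself, where a validator can enforce them and a human can read them:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative sketch only: names and values are invented,
         not taken from the published ISB schema. -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">

      <xs:simpleType name="SchoolPhase">
        <xs:annotation>
          <xs:documentation>
            The phase of education a school provides.  The permitted
            values are enforced by the enumeration below; the
            human-readable semantics sit here rather than in a
            separate PDF.
          </xs:documentation>
        </xs:annotation>
        <xs:restriction base="xs:string">
          <xs:enumeration value="PRIMARY"/>
          <xs:enumeration value="SECONDARY"/>
          <xs:enumeration value="ALL_THROUGH"/>
        </xs:restriction>
      </xs:simpleType>

      <xs:element name="Phase" type="SchoolPhase">
        <xs:annotation>
          <xs:documentation>
            A validator can now reject an invalid value automatically –
            machines carrying out rules made by people, as the W3C
            intended.
          </xs:documentation>
        </xs:annotation>
      </xs:element>

    </xs:schema>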

It seems clear that the information published by the ISB needs to be stored in a structured way (e.g. in a database), and that the content should be accessible via an interactive website.  In addition, the same database of information could be used to automatically generate PDF documents similar to those now created manually, or to create an XML Schema which, as the W3C intended, would “allow machines to carry out rules made by people”.

4) Openness and fairness

In 2014, an independent review of ISB was undertaken, which included contributions from stakeholders outside the ISB and its constituent organisations.  However, the findings of this review have not been made public, not even to those who contributed time to the review. This is particularly shocking given this Administration’s emphasis on “transparency”.

5) Timeliness

It is difficult to judge the timeliness of the work of the ISB given that there is no apparent desire to implement any of the published standards.

3 thoughts on “THE INFORMATION STANDARDS BOARD”

  • 24 February 2015 at 9:23 am

    So let me get this straight: according to the ISB there’re only 10 days in a month, 10 months in a year and 10 years, well… at all?

    They should hurry up. They’ve only got five years left to finish. Not much room for ROI!

    Better make hay while the sun shines.

  • 26 February 2015 at 12:27 pm

    Spot on! The use of coding lists is much more flexible (language not an issue and mapping responsibility falls to the end-user not the standards provider) and more efficient (smaller files for transfer and less disk space for storage).

  • Pingback: THE INFORMATION STANDARDS BOARD, SLIGHT RETURN | dataday01@sda
