National Resilience Capability Survey

An inauspicious beginning

For more than twenty-five years I was a civil servant, usually in charge of delivering something substantial, so I thought I knew how this would go. Not so.

At the time the Cabinet Office awarded the contract to SDA, the directing policy team was undergoing a seismic change of personnel, the legacy system we were to replace had already been decommissioned, and two years had passed since the previous data collection, so many in the field had moved on as well. It was our job to introduce our SCROLL survey software into this extreme state of flux.

The job

Cabinet Office’s Civil Contingencies Secretariat (CCS) are responsible for the National Resilience Programme, which aims to increase the UK’s capability to respond to and recover from civil emergencies and provides advice on preparing for a crisis. CCS needed a solution that would enable them to collect, process and analyse resilience and readiness data from national and local responder organisations as well as utilities and Local Authorities. The data is classified at least “Official Sensitive”, so the solution had to be locked down tight. SDA’s comprehensive ISO 27001:2013 certification was only the starting point!

Engaging the client

After we’d negotiated the security capsules in the lobby of 70 Whitehall – and been relieved of all our electronic accoutrements – we were taken to COBRA 2 for our initial briefing and workshop. Here we began to flesh out the details of the assignment. The shopping list was extensive. These were a few of the must-haves:

  • Guided development of hundreds of separate survey questions;
  • 25 different questionnaires delivered simultaneously to 800 diverse respondent groups during a six-week window;
  • Multiple users in an organisation involved in completing each instrument;
  • Different access levels for different users across questionnaire development, data entry and outputs;
  • Local administrative functions for each participating organisation;
  • Multiple views of survey data, from pre-set reports to complete flexibility in describing variables and granularity;
  • Facilities to compare individual members of demographic groups with the aggregate performance of the respective group (see the sketch below);
  • Benchmark reports across different levels of aggregation.

The list went on. And on.
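To give a flavour of what the comparison and benchmarking requirements imply, here is a minimal sketch in Python. This is not SCROLL itself: the organisations, groups and scores are invented, and the real system works across far more dimensions and respondents. It simply sets one organisation's readiness score against the average for its responder group and the national average.

```python
# Minimal sketch (not SCROLL code): hypothetical readiness scores,
# comparing one organisation against its group and national aggregates.
from statistics import mean

# Hypothetical responses: (organisation, responder group, readiness score 0-100)
responses = [
    ("Org A", "Local Authority", 72),
    ("Org B", "Local Authority", 58),
    ("Org C", "Utility", 81),
    ("Org D", "Utility", 66),
    ("Org E", "Emergency Service", 90),
]

def group_average(group):
    """Aggregate score for one demographic (responder) group."""
    return mean(score for _, g, score in responses if g == group)

def national_average():
    """Aggregate score across every respondent: the top level of aggregation."""
    return mean(score for _, _, score in responses)

def benchmark(org):
    """Compare a single organisation with its group and national benchmarks."""
    name, group, score = next(r for r in responses if r[0] == org)
    return {
        "organisation": name,
        "score": score,
        "group": group,
        "group_average": round(group_average(group), 1),
        "national_average": round(national_average(), 1),
    }

print(benchmark("Org B"))
# {'organisation': 'Org B', 'score': 58, 'group': 'Local Authority',
#  'group_average': 65.0, 'national_average': 73.4}
```

The same pattern, repeated over whatever grouping the analyst chooses, is what sits behind the benchmark reports at different levels of aggregation.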

Making progress

It became clear that though the responder community had huge subject-matter expertise, much of it on the front line, it needed to be complemented by an equal measure of survey savvy: in other words, they knew what they needed to find out, but they didn’t altogether know how to ask. And while the ‘must-haves’ were a given, the ‘shoulds’, ‘coulds’ and ‘woulds’ were still in the eyes of the beholders. SDA gently suggested that we might be best used to arbitrate between conflicting priorities, drawing on our deep knowledge of our discipline and our product. At the end of the workshop we proposed two important variations:

  • That CCS delegate the survey design and implementation to us;
  • That we manage the entire technical service, which would be more efficient.

Our public sector colleagues agreed, very quickly, and it seemed – to me at least – that our offer had evoked a metaphorical sigh of relief. We began work in earnest.

A question of security

Cabinet Office is the home of the Government Digital Service (GDS), so it was unsurprising that the project was run according to agile principles. This was fine by us: SDA have always subscribed to requirements/prototype/iterate in all their many guises. One big deal, though, was the GDS Service Assessment. Ours majored on security. We took the assessors through our Information Security Management System; we showed them our audits, our testing regimes, our certificates and our third-party certifications. Still they appeared uncertain.

Finally we suggested that we could install SCROLL on a virtual machine in their own datacentre. Absolutely, they said. Another deep sigh of relief.

Going live

Our guiding imperative was to launch on time: this was an absolute. And honestly, we were always on the front foot. Where our client had questions, we provided answers. Where they had problems, we provided solutions. Where they had deadlines, we met them. Come the day, though, a final flurry of amendments was presented to us, and they were non-negotiable. Of course, we went the extra mile. That’s what we do.

Who cared about the weekend anyway?

Does that all sound difficult? It was. Was it unpleasant? No. It was exhilarating. When you throw yourself into something, the barriers just come down. We all did a great job.

Lessons learned

  • Technology is easy. Organisations aren’t.
  • Never underestimate complexity.
  • Be prepared to lead. Sometimes from behind.
  • Focus on delivery.