Rapid delivery of robust, responsive applications

    By Simon de Timary, Head of Innovation at BJSS

    COVID-19 has completely transformed our way of life, putting significant pressure on people and businesses around the world.

    The same is true of our government, which is trying to anticipate the needs and reactions of the public and adjust policies accordingly. These new policies often require new digital services to be delivered quickly and cost-effectively.

    But we should not sacrifice quality on the altar of speed. Any new service must remain secure, robust, scalable and, where applicable, accessible to all intended users, as per GDS guidelines.

    Accelerating GDS deliveries - a case study

    BJSS has been delivering digital services to the government for almost 10 years, with significant transformative implementations across the NHS, the Home Office and the DVSA, to name just a few. During this time we have been involved in over 100 public sector deliveries and successfully passed over 30 GDS assessments. That experience has helped us master the GDS process and pass quality gates at the first time of asking.

    We recently delivered a brand-new service in collaboration with the Department for Transport (DfT) and Highways England (HE) to help bridge owners find and procure temporary bridge solutions. We went through all three GDS phases (discovery, alpha and beta) and passed the beta evaluation within nine weeks.

    Our delivery was split into:

    • a lean start-up three-week sprint to validate assumptions and design the right solution
    • six one-week sprints to deliver the working solution into live

    We organised a show & tell at the end of every week to showcase our progress, review our objectives and revalidate the agreed vision.

    Our three-week sprint: Discovery and alpha

    Highways England had already identified the need for a new service and the product owner had a general idea of what the service should do and how. This limited the objective of our discovery to validating our original assumptions with the end-users.

    [Image: map of survey respondents]

    Week one - discovery

    We started with a kick-off workshop with all key stakeholders from DfT and HE to set out ways of working, identify risks and challenges, and define a shared vision. We fleshed out the product owner’s initial ideas, transforming them into user journeys and personas.

    In that first week alone, we’d spoken to three end-users and obtained 49 responses to the online survey we’d sent out on Tuesday. All interactions with users were conducted online, which allowed us to interview a large number of people.

    The feedback gathered – while challenging some of our initial assumptions – clearly validated the need for the service, which gave us enough confidence to move into alpha.

    Weeks two & three - alpha

    We synthesised our findings from the first week into a service blueprint, which then allowed us to start creating the designs of the system. Our developers had spent the first week setting up a fit-for-purpose development environment and designing the target architecture based on the information we had at the time. This enabled them to start implementing pages as they were designed, and we had a first coded prototype to demo at the end of week two. We also presented our target architecture to the architecture governance team, which enabled us to change elements of our solution (such as the database technology) without compromising the delivery timelines.

    In week three, we were testing a partial solution with end-users, gathering feedback and valuable insights into their needs and frustrations that simple interviews would not yield. On that Friday, the mid-way GDS review replaced our weekly show & tell. Using our experience, we worked to help the assessor evaluate our delivery against the 14 criteria set out in the GDS Service Standard. This ensured that we were compliant where we needed to be, had a clear path to compliance where work remained, and had a clear supporting rationale where compliance was not required.

    GDS reviews are notoriously hard to pass, so even though we were confident in the quality of our work and preparation, we were relieved when the assessor told us we were on the right track.

    Beta

    We had ambitious objectives for our beta phase: hardening and finishing our prototype to meet production standards, and building the management dashboard for HE's admin users. Our architecture-first approach enabled us to build upon the alpha and saved precious time going into beta.

    We continued to improve the temporary bridge service and conducted multiple user tests every week, regularly asking our experienced colleagues for feedback. At the end of week five, we agreed on a backlog freeze. Any new feature requests were added into a backlog for continuous improvement post-release, thus enabling the team to focus on going through the long list of remaining tickets and bug fixes.

    In week nine, we conducted the GDS beta assessment and, having worked closely with the assessor throughout our delivery, we received sign-off two days later.

    Key to Success

    This delivery was challenging. We faced issues with deployment, integrations, performance and the underlying technology.

    Yet, we managed to deliver a full end-to-end GDS service in under nine weeks. Several factors led to our success including:

    • our proven experience of GDS deliveries;
    • the full commitment and support from the key stakeholders at DfT and HE;
    • our resolute approach to agile;
    • our efficient use of collaboration tools such as Miro, Zoom and Office 365;
    • appropriate trade-offs between creating reusable assets and writing throw-away code;
    • a determination to test early, regularly and to iterate our solution;
    • open communication with all stakeholders including the GDS assessors;
    • a strong, shared awareness of responsibilities and accountability, established in the kick-off workshop and reiterated in the agile ceremonies;
    • show & tells, which proved an invaluable communication tool, helped gather stakeholder support, involved the GDS assessors regularly before the final assessment, and gave everyone time to share their views on progress;
    • agile ceremonies, show & tells and assessments all conducted remotely, with all key stakeholders invited and in attendance;
    • all user research and testing were conducted remotely, enabling further reach, accelerating the discovery and saving precious time and energy.

    Our experience in delivering government digital services has given us the confidence to deliver at pace, without compromising process and quality. During this unprecedented time, the demand to deliver solutions quickly and safely is increasing, and at BJSS we're well-positioned to help. Feel free to contact us at info@bjss.com to discuss your requirements.