Big programme design and the willing suspension of disbelief involved
I haven’t written a lot about my time at Capita but I wanted to capture some of the thinking that I did in my last year there, as it’s work that I am drawing on a lot at the moment and hopefully it will help people understand where I am coming from on this. I’m also keen to connect with other people who are working in this space, as it’s one of my curiosities for the year.
I was lucky to be working with some really great people in Capita Consulting, with huge amounts of skill and experience in the more ‘traditional’ approaches to programme management, who were nonetheless very open to (and tolerant of!) my much more digital mindset. 2017 was a very turbulent year for Capita (!) and I am not sure what the legacy of that work is or will be, but for me it was a chance to spend some time thinking about the question of how to scale agile, and also a chance to really learn how big programmes of work operate — something that has been hugely useful to me. With that in mind I moved into the Capita Consulting business to look at two main things:
- What could our digital transformation ‘offer’ look like?
- How could we deliver (big) agile programmes, not just agile projects or products? And the question that sits behind this: how do you create a change approach that reflects the fact that the context around the work you are doing is always changing?
More on the digital transformation work perhaps at a later date; this piece is about big programmes. For clarity: when I say ‘big programmes’ I am talking about organisational-scale change work which consists of multiple workstreams and requires (or is expected to have) significant corporate governance. These programmes will be designed to deliver on major strategic objectives, be it cost saving or transformation — or, most often, a combination of both.
Whenever I have been involved with Big Programmes — usually in local government, either as part of the delivery or as an observer — I have been struck by the willing suspension of disbelief that participants have to sign up for. The first suspension of disbelief is the idea that you can plan a two or three year programme and deliver certainty of process and outputs from the start. This seems like madness to me — especially when you contrast it with an agile approach, which makes no long-term promises about how you will get there but does make a stronger commitment to outcomes instead of outputs. But this is not just a ‘go agile’ battle cry; where agile approaches seem to struggle is in their intersection with corporate accountability and longer-term planning, as well as when they are forced to interact with legacy IT teams and systems.
Clearly ‘agile’ does scale — but most examples of agile at scale are in very ‘digital’ businesses like Spotify, where it’s an agile culture, not just an agile method bolted onto an organisation that is still organised around the paradigm of the industrial society, as so many organisations still are.
This brings the second suspension of disbelief — particularly around digital transformation programmes: how can you possibly expect to create a more digital organisation if your change programme is being delivered in an analogue way? It’s a bit of a New Year’s cliché, but yes — we do need to be the change we want to see.
The work I was doing at Capita was looking at how we could create a blended programme design that delivered both on agility and on the corporate needs and expectations around these huge and strategically significant pieces of work. In the Capita context it was also about something that a third-party implementation partner could contract against — and this point about partnering is something I will probably return to later in the year. Some of the corporate expectations around a big programme need to be adjusted — the ritual of the programme board or the formality of the steerco, for example — but others, such as the expectations about accountability and planning, need to be met in some way.
Where we landed with the design was something that had four big differences from ‘traditional’ programme structures:
- We brought the idea of a backlog into the programme design (via the hopper) and made the design function more integrated/multidisciplinary and iterative than is usual in a Prince 2 type set up. The aim of this was to embed an agile mindset in the programme and to bring things through based on their value and not just because the team running the show think a particular order is logical.
- We drew a bigger distinction between the ‘engines’ that drive the programme — such as comms/engagement (people in the diagram) or knowledge/insight — and the actual packages of work (in this diagram the solutions). In this model the engines are responsible for sense making and continuity while the solutions move through the programme pipeline from ideas going into the hopper and then through the validation and design process into delivery — and sometimes back again as the process is designed to iterate.
- We built much more discipline around benefit realisation into the design process — everything that went through the hopper was tagged for potential benefit types, with the goal being that solutions included the generation of the data needed to measure their impact. We also did work around expanding the range of value types that a programme could target — that list was designed for a local government environment, and one of the jobs for my team this year is to look at what a similar family of benefits/outcomes could look like for CRUK.
- We separated the governance into procedural (programme and measurement) and strategic (purpose and outcomes) in order to create different spaces for the different conversations that need to be had with the rest of the organisation. These are: 1) Procedural: is this programme doing what it said it would do, on time and on budget? 2) Strategic: is this programme achieving the outcomes we hoped for, or do we need to redirect its purpose? We did this because we had observed too many ‘decision’ forums being bogged down in procedural stuff and never getting the chance to talk about the strategic issues — we wanted a parity of esteem between these different conversations. We also wanted the purpose of the programme to be treated as a living statement — something the organisation could regroup around to check that this large piece of work was actually doing something useful.
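The hopper mechanics described above — candidate solutions tagged with benefit types and pulled through on value rather than a ‘logical’ order — can be sketched as a small data model. This is a minimal illustration only; all of the names, benefit types and fields here are my assumptions, not the actual programme design:

```python
from dataclasses import dataclass, field

# Illustrative benefit types for a local-government context; the actual
# family of value types used in the programme design is not listed here.
BENEFIT_TYPES = {"cashable_saving", "service_quality",
                 "staff_experience", "risk_reduction"}

@dataclass
class HopperItem:
    """A candidate solution sitting in the programme hopper."""
    name: str
    benefit_tags: set = field(default_factory=set)   # value types it targets
    estimated_value: float = 0.0  # rough score for ordering, not a promise
    measures: list = field(default_factory=list)     # data it must generate

    def is_ready(self) -> bool:
        # An item only progresses to validation/design once it is tagged
        # with benefit types AND names the data needed to measure impact.
        return bool(self.benefit_tags & BENEFIT_TYPES) and bool(self.measures)

def prioritise(hopper):
    """Pull items through on value, not on a fixed delivery order."""
    return sorted((item for item in hopper if item.is_ready()),
                  key=lambda item: item.estimated_value, reverse=True)
```

The point of the sketch is the `is_ready` gate: a solution cannot leave the hopper until it says both what kind of value it targets and how that value will be measured.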
Here is one of the later iterations of the design (culled from conference materials):
The reasoning behind these changes — and much of the backstory of the debate we had to get here — was about acknowledging the fact that design is an iterative process that needs to respond to new information as it comes in, but also about removing **the third suspension of disbelief — that the process of delivering the programme is as important as the work that is being delivered.** It’s much easier to create a perfect programme team with a lovely comms plan; it’s much more difficult to actually make the change happen in the real world at the edges of the programme.
There is lots more to test in the design as a whole and one of the reasons for this post is to see if other people have already looked at aspects of this thinking to get some ideas about what does (and doesn’t) work.
We never really got to try this out in its entirety, though we did test the hopper and the benefit types with some live client work, with some really encouraging results (there is LOADS to talk about in the hopper — especially about prioritisation and how to pick what to progress, but that is another post entirely).
I am still very attached to the hopper. At the start I assumed that the benefit realisation aspect was going to be the most alien to me — however, three things about the hopper and the subsequent process really shifted my thinking on this:
- When we got into it and viewed it as a data problem it became much more meaningful
- The breaking out of the different value types really helped the team unpack some of the hidden work in the examples we were working with — the things you pursue because they are ‘Good Things’ but have no mechanism to measure because they are not financial outcomes
- Benefit measurement is deeply agile — i.e. the pursuit of value — but it is also where you see the advantages of properly disciplined research and analysis methods. I love research and analysis methods!
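Treating benefit measurement as a data problem, as the first bullet above suggests, can be sketched very simply: every reading becomes a record against a solution and a benefit type, so non-financial ‘Good Things’ get measured in exactly the same way as savings. All of the solutions, metrics and numbers here are invented for the example:

```python
from collections import defaultdict

# Hypothetical observations: (solution, benefit_type, metric, value).
# First reading per metric acts as the baseline, later ones as follow-ups.
observations = [
    ("laptop rollout", "cashable_saving", "support_calls_per_week", 120),
    ("laptop rollout", "cashable_saving", "support_calls_per_week", 85),
    ("laptop rollout", "staff_experience", "satisfaction_score", 6.1),
    ("laptop rollout", "staff_experience", "satisfaction_score", 7.4),
]

def realised_change(records):
    """Compare the first (baseline) and latest reading for each metric."""
    series = defaultdict(list)
    for solution, benefit, metric, value in records:
        series[(solution, benefit, metric)].append(value)
    return {key: values[-1] - values[0] for key, values in series.items()}

changes = realised_change(observations)
```

Once the data is shaped this way, the financial and non-financial benefit types sit in the same table and can be reported with the same discipline.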
Here at CRUK we are testing the separation of the delivery work from the engines of change on our Future of Work programme — the principle is working for us but we need to give it a bit of time to see whether it has the effect that we want it to have. One of the critical success measures for me will be how the change engines transition at the end of the programme. While the packages of work (for example ‘laptop rollout’) will finish, things like the knowledge engine will transition into our UX function, as its role is to hold the insights we are gathering about our users in the programme.
One of our other major programmes is a refresh of our legacy IT estate (yes — CRMs and the like), and the team had already decided to build that around the idea of a backlog of work packages (candidates) that we prioritise from, which already gives us a much greater degree of agility. The areas we are now thinking about are governance and accountability: both how we make the iterative design process effective and how we make sure that we meet the right organisational expectations.
I wanted to end this piece by recapping those three suspensions of disbelief:
- The idea that you can plan a two or three year programme and deliver certainty of process and outputs from the start
- How can you possibly expect to create a more digital organisation if your change programme is being delivered in an analogue way?
- Creating a programme structure sets up the risk that the process of delivering the programme becomes as important as the work that is being delivered.
As anyone who has read my stuff before knows, I am hugely interested in how change happens — and at their essence these big programmes should be about change. When I speak to leadership teams, this is why they develop these things — to make change happen — but the programmes so often fail to meet any of the expectations they are set up with, and we need to ask why.
With this in mind, over the last few years I have found it remarkable to see programme designs that constrain organisational change into a single workstream rather than running it like a golden thread through the whole thing. These three statements of disbelief came from this observation, and from a desire to create a change vehicle that was both getting stuff done on time and on budget and also addressing the bigger, more complicated question of whether that delivery is actually making a difference.
I have much more delivery experience from my time as a CEX than I do as a consultant, and this perhaps colours my observations of programme delivery work (especially the observation about the programme becoming an end in itself), but these observations are borne out by many conversations and discussions I have had with leadership teams, usually in the context of a piece of digital leadership work, who are trying to deliver major change in their organisations.
To state the obvious, change is hard. To create meaningful organisational change you need to design a programme of work that has a number of different qualities:
- it puts purpose as a central tenet of the work and something that you return to, to check that you are on the right track
- it needs to embody the kind of future you are trying to deliver
- it needs to deliver value incrementally and do so in a way which builds the organisational inclination and ability to change
- it needs to, over time, transition into ‘how we do things round here’ rather than existing until the end of time as a separate entity
- it needs to be open and transparent to corporate scrutiny. The organisation has to be able to trust it
- it needs to be agile and adaptive to changes in the organisational context and not just continue on its tracks irrespective of what is happening around it
You hear a lot about change fatigue — and I am not surprised — you see people trying to do their ‘business as usual’ work being battered by big strategic initiatives. Major programmes take up a huge amount of energy; you have to tie them to your strategic goals and deep into your organisational structures and narratives, or the two will be forever pulling against each other.
Originally published at www.curiouscatherine.info on January 3, 2019.