Programme reviews: a cornerstone of effective delivery or a means to an end?
Programme reviews are something most of us in the project management profession are familiar with, whatever their frequency. Whilst the recipe rarely varies all that much, the results are often quite inconsistent.
Conducting regular programme reviews is generally considered a routine and practical component of delivering a programme. Regular progress reviews provide an opportunity to reach a wide range of stakeholders and to share the ‘headlines’, as well as to secure support from the sponsor, in order to achieve the programme aims.
The APM Body of Knowledge, 7th edition states that “regular reviews between relevant parties support transparency and the early identification of any issues”. However, having attended my fair share of programme reviews, in a variety of capacities, the quality and content varies significantly. Why is this the case when the purpose and aims of the reviews are generally well understood?
In this blog, I will explore a few of the ingredients of a good review, as well as some of the challenges (or opportunities) that lie ahead of us as professionals.
Principally, a programme review exists to ensure that the scope and/or deliverables are progressing in line with the baseline plan in terms of time, cost and quality, and it is attended by most of the key business functions. The agenda usually includes a general status report, with any notable achievements or issues (often in the form of a successes, opportunities, failures and threats or SOFT report), along with all notable aspects of performance, such as schedule adherence, cost performance, cash flow, risk, resources and quality assurance.
An oft-used phrase in our profession is 'integrated', something my Programme Management SIG colleague Nigel Beecroft explored here. Perhaps we could consider the very nature of a programme review to be integrated, as it brings together:
- all aspects of programme performance;
- the views and contributions of all business functions and tiers of the value chain; and
- the client and the contractor.
In my experience the more integrated the review is, the more confidence the sponsor, customer and other stakeholders can take away from the review. This is demonstrated by:
- the coherency of each component of the report (and supporting data);
- the alignment of the programme team; and
- the recognition of relevant external influences.
Corporate cultures and professional preference will influence how programme reviews are conducted in your organisation. Two interesting questions to explore in this regard are:
1. Is the programme review data-led or opinion-led?
During my career I have experienced both ends of the scale. Whilst there is room for qualitative context, surely this alone can no longer provide a reasonable level of assurance? As the volume of data being produced by programmes continues to grow, along with powerful tools to analyse and interpret it, surely we have now reached a point where quantitative measures will always outweigh opinion and judgement, so long as the data is reliable?
2. Is the review challenging or nurturing?
This is a bit trickier to balance, and I find it useful to apply Eric Berne's Transactional Analysis model to the sliding scale of behaviours that we have probably all seen during programme reviews. The model includes the parent (nurturing or critical), the adult and the child (adaptive or free).
Have you seen your sponsor, or a senior leader, adopt the role of a 'critical parent', chastising the underperforming team? Or perhaps you have observed the 'nurturing parent', who offers a more encouraging and coaching style? I vividly recall one experience as a programme manager: I was leading a struggling programme and had been on the receiving end of a 'wire brushing' from the vice president. The more I was criticised, the more I fought back and defended myself and the team. The review had gone beyond 'adult' discussion; I had become the 'adaptive child' and the vice president the 'critical parent'.
The aim at a programme review is to keep the discussion in the 'adult' state: we use the data to inform our opinions and make rational judgements. Programme reviews are about providing help and support, not about scoring points or looking clever.
Over the past 12 months, we have all had to adapt our styles and adjust to remote working. I have seen that some organisations have reduced the duration of programme reviews but increased the number of informal stand-up reviews, in a more Agile-style approach. The pandemic has had an impact on all of us, and it will be interesting to see how much of that impact is here to stay. Nonetheless, in my opinion, an effective programme review plays a key role in the timely delivery of outcomes and/or benefits, irrespective of your organisation's industry or culture.
So, a recipe for an effective programme review, one that is a cornerstone of effective delivery rather than a means to an end, must include these key ingredients: reliable data; engagement from all programme actors; and a sponsor providing a fair balance of challenge and support.
1 comment
Hi James, the short answer is... no... are you kidding? Levity aside, consider what a programme is. Even the "simplest" programmes by their nature consist of multiple projects and possibly multiple phases, which may overlap, run in parallel or even have pauses. Programmes evolved to address the levels of uncertainty, complication, complexity, duration, organisational culture challenge and so on that project management was never "designed" for and crumbled beneath. That holds even in these "agile" days, when many foolishly and erroneously believe that being agile means leaving stuff out or being somewhat laissez-faire. I have tried, but I cannot conceive of the rationale or benefits in not having regular reviews, even if they are replaced by something less formal.

Consider also governance. If you want to give a CFO apoplexy... hmm, not a bad idea... suggest only informally reviewing the progress of a 2-4 year, £300m defence programme. What you CAN do, especially if being agile is a goal (and yes, it's possible even in civils, defence or even nuclear engineering), is consider what regular reviews are "just enough" for [a] control and [b] keeping stakeholders "happy".

To summarise (trying not to sound like Goldilocks and the three bears): not having regular reviews for a programme is crazily too little; strangling it through too many is wasteful, expensive and irritating; having just enough is... just right. The fun question then is: what is just enough, and how do you decide?