Author Archives: Mark Ritchie

About Mark Ritchie

Deputy Director and Head of Project Services in Information Services Applications Division at the University of Edinburgh

Simple Project Portfolio Management

Typically we have many more ideas for projects than we have resources to execute them. We need some simple processes to determine what projects to take on.

Agreeing priorities for your projects when you have multiple different stakeholders can be difficult – comparing the relative priorities of new and in-flight projects can be even more challenging. Do nothing and you may miss opportunities to optimise benefits from your project portfolio. Become too interventionist and you risk continually stopping and starting projects which makes no one happy. If you introduce too complex a process then you may become over bureaucratic and create a project governance industry at your institution. It can be a minefield!

Here are some suggestions that we’ve adopted at the University of Edinburgh to provide some simple Project Portfolio Management capabilities.

Suggestion 1 – Assign all your projects an Overall Priority when you bring them into the portfolio

We assign each new project or proposal one of the following Overall Priorities as part of our planning process:

CRITICAL (Priority 1) – typically failure to progress these projects will:

  • Have unacceptable impact on the University community, e.g. affect a large number of students, staff, visitors or alumni
  • Involve unacceptable additional costs, e.g. the costs of providing an alternative solution or for additional goods and/or services which would not otherwise be required
  • Place the business of the University at unacceptable level of risk, e.g. failure to meet a legislative requirement or deliver a contracted service
  • Cause reputational damage to the University

HIGH (Priority 2) – typically failure to progress these projects will:

  • Have significant impact on the University community
  • Involve significant additional costs
  • Place the business of the University or an organisational unit at significant risk
  • Cause significant reputational damage to Information Services, our partner or another organisational unit

MEDIUM (Priority 3) – typically failure to progress these projects will:

  • Impact on the University community
  • Involve additional costs
  • Place the business of an organisational unit at risk
  • Cause possible reputational damage to Information Services, our partner or another organisational unit

Due to sustained levels of demand for projects we haven’t needed to define a LOW (Priority 4) classification for a number of years!

The initial Overall Priority allocation for each project is based on the judgement of the Project Sponsor and is reviewed by our management team as part of the project take-on process. Overall Priority allocations for projects in our portfolio are reviewed at least quarterly by our senior management team. Any proposal to change Overall Priority for a project is discussed with the Project Sponsor prior to the change being made.

There is an agreed mechanism whereby a Project Sponsor can request a change in Overall Priority at any time – again with appropriate senior management oversight to adjudicate on requests. Overall Priority can also be affected by the stage the project is at. For example, we may want to protect a project that is very near to going LIVE by giving it a higher priority. Projects with a longer lead time may start off with a lower Overall Priority and be stepped up later.

Setting Overall Priority for all the projects in our portfolio enables us to identify the most important projects that we are currently working on. When a schedule or resource conflict arises and we have to choose between projects we use the current Overall Priority of each project to guide our decision making.
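The priority-driven tie-break described above can be sketched in a few lines. The data model and project names here are purely illustrative, not part of our actual tooling:

```python
# Hypothetical sketch: resolving a resource conflict by Overall Priority.
# Lower number = higher priority, matching the CRITICAL/HIGH/MEDIUM scheme above.
PRIORITIES = {"CRITICAL": 1, "HIGH": 2, "MEDIUM": 3, "LOW": 4}

def resolve_conflict(projects):
    """Order projects so the highest-priority project wins the contested resource."""
    return sorted(projects, key=lambda p: PRIORITIES[p["priority"]])

portfolio = [
    {"name": "Timetabling Upgrade", "priority": "MEDIUM"},
    {"name": "GDPR Compliance", "priority": "CRITICAL"},
    {"name": "Portal Refresh", "priority": "HIGH"},
]

for p in resolve_conflict(portfolio):
    print(p["name"], p["priority"])
# GDPR Compliance CRITICAL
# Portal Refresh HIGH
# Timetabling Upgrade MEDIUM
```

Because `sorted` is stable, projects sharing an Overall Priority keep their original order, which is where a secondary measure such as the Portfolio Score would come in.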

Suggestion 2 – Score all your projects when you bring them into the portfolio

We score our project proposals to define our portfolio in the first place, i.e. to determine which projects to take into the portfolio. This is a really valuable technique for identifying which projects to work on and why. We also use Portfolio Score to compare between projects with the same Overall Priority.

Our approach is to score our projects against a few simple factors. We then use these Portfolio Scores to prioritise projects. We keep the scoring simple and score projects at the outset, as they come into the portfolio, rather than more frequently.

Our Portfolio Scoring for each project is based on the following weighted factors:

  • Alignment with University Strategic Plan: 0 = Not Aligned, 1 = Somewhat Aligned, 2 = Fully Aligned, i.e. of clear strategic importance (Weighting = x3)
  • Risk of not doing the project: High = 3, Medium = 2, Low = 1, None = 0 (Weighting = x2)
  • Benefit to cost ratio: >2 = 2, 1-2 = 1, <1 or Not Stated = 0 (Weighting = x2)
  • Time to deliver tangible benefit: <1 Year = 3, 1-2 Years = 2, >2 Years = 1 (Weighting = x1)

For the benefit to cost ratio we compare the direct financial benefits to be delivered by the project with the costs of delivering the project, including any associated service costs, over a period of four years.
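As a worked example, the weighted scoring above can be expressed as a small function. The function and parameter names are ours for illustration, not part of any official tooling:

```python
# Sketch of the weighted Portfolio Score: each factor score is multiplied
# by its weighting (x3, x2, x2, x1 as listed above) and the results summed.
WEIGHTS = {"alignment": 3, "risk": 2, "benefit_cost": 2, "time_to_benefit": 1}

def portfolio_score(alignment, risk, benefit_cost, time_to_benefit):
    """alignment: 0-2, risk: 0-3, benefit_cost: 0-2, time_to_benefit: 1-3."""
    factors = {
        "alignment": alignment,
        "risk": risk,
        "benefit_cost": benefit_cost,
        "time_to_benefit": time_to_benefit,
    }
    return sum(WEIGHTS[name] * score for name, score in factors.items())

# A fully aligned project with a high risk of not doing it, a benefit/cost
# ratio above 2, and tangible benefit inside a year:
# 3*2 + 2*3 + 2*2 + 1*3 = 19
print(portfolio_score(2, 3, 2, 3))  # 19
```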

We could add more factors but we are wary of double counting and adding unnecessary complexity. The factors can be weighted differently, e.g. to value strategic alignment more highly than benefit, depending on our requirements at the time. It's obviously important, however, to use the same weightings for all current projects so that the scoring is directly comparable between projects!

We use the Portfolio Scores to decide which projects to take forward and to inform portfolio-level decisions, e.g. in the event of resource or schedule conflicts, based on both Portfolio Score and Overall Priority.

Suggestion 3 – Establish portfolio level project governance

You need some form of overall governance approach to use these tools effectively. If you are deciding between projects for a single business area then it's easy. If you are delivering projects for partners across the University then more senior management involvement will be needed.

At the University we’ve found that involving the senior management team, e.g. Heads of College and Support Groups, in defining the initial project portfolio is essential. After that, and as long as senior management understand the process, they will typically be happy to trust the Head of IS or Head of PMO to oversee operational decisions with escalation only required in exceptional circumstances.

We need to keep the senior team informed of current portfolio status and do this through quarterly status reports which include Overall Priority and current RAG status. For larger projects we also have local governance, typically a Project Board or Steering Group, to help ensure that the impact of decisions is understood and informs what we do.

I hope the above suggestions are helpful. Please let me know what you think!

Mark Ritchie
Deputy Director and Head of Project Services

Project Management in IS Applications FAQ Part 1 – Tools and Metrics

In my role as Head of Project Services I’m often asked about what tools and approaches we use for project management in IS Applications. These questions have become more frequent since I became Chair of the UCISA Project and Change Management Group in 2014 (see https://www.ucisa.ac.uk/groups/pcmg.aspx).

This is the first of an occasional series of posts in which I’ll try to provide answers to the most frequently asked questions. This set of questions covers aspects of project management tools and metrics and was developed in response to questions posed by Trinity College Dublin.

1 – What tool do you use for Project Portfolio Management (PPM)?

We manage projects and resources in ASTA PowerProject (http://www.astadev.com/). We capture actual project effort for IS Applications Division staff using the web-based time recording tool provided with ASTA.

A complete time record is submitted by each person in IS Applications Division each week which includes their effort on projects, services, annual leave, absences and other activities. ASTA PowerProject is a specialist tool and most of our staff only access the tool via the web-based Time Recording interface. Our full ASTA users are Project Managers, Programme Managers, Portfolio Managers and Resource Managers within our various teams.

2 – What tool do you use for project/portfolio reporting? (in terms of metrics/trends)

We capture actual project staff effort in ASTA PowerProject (http://www.astadev.com/). We export this data to Excel and our formal reports on projects, services and resources are all driven from the ASTA data.

3 – Do you have a repository for PM documentation? If so, what do you use?

Our repository is the Drupal-based web application at www.projects.ed.ac.uk. We also use Confluence wiki and SharePoint for sharing project information and managing Project Boards and User Groups.

4 – How do you measure the success of the PMO/Governance structure? And would you have any metrics you could share on this?

Currently our measurement is mainly around projects delivered – we need to do more with outcomes and benefits.

Our projects are grouped into programmes and portfolios. We have an annual planning process to identify projects (see https://www.projects.ed.ac.uk/planning/area/apps/15-16). The outcome of the annual planning process is an agreed set of projects to be delivered in the coming year. We know how many projects are expected to be completed in the year ahead for each programme and portfolio. As part of the planning process we set a budget for our staff effort on each project and can roll this up to set a budget for each programme and portfolio. We also capture start, planning, delivery and closure milestones for each project.

We use these measures to report metrics and KPIs including:

  • Number of projects delivered in a period by programme and portfolio
  • Number of project days delivered in a period by programme and portfolio
  • Number of projects delivered as a % of the expected number of projects to be delivered in a period by programme and portfolio
  • Number of project days delivered as a % of the expected number of project days to be delivered in a period by programme and portfolio
  • Number of key project milestones (start, planning, delivery, closure) achieved in a period by programme and portfolio
  • Number of key project milestones (start, planning, delivery, closure) achieved as a % of the expected number of key project milestones to be delivered in a period by programme and portfolio
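A minimal sketch of computing one of these KPIs, the percentage of projects delivered per programme. The project records and programme names here are hypothetical:

```python
from collections import defaultdict

# Hypothetical per-project records as they might be exported from a PPM tool.
projects = [
    {"programme": "Student Systems", "delivered": True},
    {"programme": "Student Systems", "delivered": False},
    {"programme": "HR & Finance", "delivered": True},
]

def delivery_rate_by_programme(projects):
    """Percentage of planned projects delivered, grouped by programme."""
    totals, done = defaultdict(int), defaultdict(int)
    for p in projects:
        totals[p["programme"]] += 1
        if p["delivered"]:
            done[p["programme"]] += 1
    return {prog: 100.0 * done[prog] / totals[prog] for prog in totals}

print(delivery_rate_by_programme(projects))
# {'Student Systems': 50.0, 'HR & Finance': 100.0}
```

The same grouping pattern extends naturally to project days, milestones achieved, or any of the other counts listed above.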

As we have a budgeted effort for each project we can compare budgets with actuals on a project, programme, portfolio or overall basis. This is great – but we need to be wary of becoming overly fixated on costs rather than project outcomes.

We capture a RAG status for each project based on very simple rules:

GREEN – expected to deliver agreed scope on time and in budget
AMBER – emerging risks or issues mean we may not deliver agreed scope on time and in budget
RED – will not deliver agreed scope on time and in budget (intervention required)
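These rules are simple enough to encode directly. A sketch, with hypothetical boolean inputs a Project Manager would assert at reporting time:

```python
# Hypothetical encoding of the RAG rules above.
def rag_status(on_track, emerging_risks):
    """GREEN: expected to deliver; AMBER: emerging risks or issues;
    RED: will not deliver agreed scope on time and in budget."""
    if not on_track:
        return "RED"
    return "AMBER" if emerging_risks else "GREEN"

print(rag_status(on_track=True, emerging_risks=False))   # GREEN
print(rag_status(on_track=True, emerging_risks=True))    # AMBER
print(rag_status(on_track=False, emerging_risks=True))   # RED
```

Keeping the rules this simple is deliberate: the status must mean the same thing on every project report for the portfolio-level counts to be comparable.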

Our primary interest is in the RAG status for individual projects. For large programmes and portfolios however we occasionally report counts and percentages of projects by RAG status.

We publish annual reports for each portfolio which contain some of this data. You can view reports of portfolio outcomes in 2013/14 in the portfolio home pages on the Projects Web Site.

5 – Any thoughts on developing these metrics?

The thing we are all looking for is the correlation between project outcomes and things which we can influence or control. We all want answers to questions such as:

  • are we more or less successful with small, medium, large or extra large projects?
  • are we more or less successful on projects with shorter duration/elapsed time?
  • are we more or less successful on projects of different types e.g. Software Development, IT Infrastructure, Software Package Implementation, Business Case/Options Appraisal and Procurement?
  • …..

This requires us to regularly review project outcomes against various measures including project size, duration and type as part of our continuous improvement processes.

References

We’ve also contributed to best practice project and change management resources published by UCISA.

Mark Ritchie
Deputy Director and Head of Project Services