People don’t really learn lessons unless they have experienced them first-hand, so why is this any different?
The logical conclusion to this argument is that we shouldn’t capture and analyse lessons because the experience resides within the individual. Project delivery performance will be shaped by the range of experience that the organisation happens to have within the team and whatever good/bad habits they have acquired. If the team hasn’t encountered a specific set of circumstances before, they will find their own way through them like an explorer carving a route through the jungle, ignoring the experience of thousands of explorers who have gone before.
There is a rich seam of research that helps to explain why lessons don’t stick. Much of this is about being able to process the knowledge, relate to it, motivate individuals to apply it and understand cause and effect. For example, we all know that stakeholder management is important, but do we really understand the consequences if we get it badly wrong? History provides us with examples of projects where costs have escalated by millions of dollars because of a failure to grasp the fundamentals, but we don’t always take the time to understand them. History also provides us with examples of where it has gone well and what good practice looks like. We’ll help you to identify and apply these insights.
I’ve been delivering projects for years, and lessons learned systems have never delivered any tangible value. What magic sauce does Projecting Success bring?
The management team at Projecting Success have led projects, programmes and portfolios with a total value of nearly $5 billion. We have first-hand experience of lessons learned methods and know that they rarely work. Our approach is different and relies heavily on the following key factors:

  • An industry-leading dataset of lessons, integrated into a data model using the latest graph-based data analytics tools. This enables us to identify and extract evidence-based insights and correlations that wouldn’t previously have been possible (a rough sketch of the idea follows this list).
  • By pooling this data and these insights, we should be able to develop a crystal ball to help predict future lessons or, as a minimum, identify key areas of focus.
  • Deep experience of trying to implement lessons learned and realising that it’s a very difficult recipe to get right. For us, the key elements are senior leadership and vision, the quality of the input data, pooling data to identify trends and insights, segmenting lessons according to the exploitation route, and identifying actionable insights.
  • Implementing a knowledge management approach aligned to the segmentation of lessons. For example, we have learned that there is little benefit in adding more and more guidance, forums and insight on risk management when the real issue is priorities and the motivation to manage risk effectively, rather than a lack of technical competence.
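To make the graph-based idea more concrete, here is a minimal sketch in Python using networkx. The node kinds (project, lesson, theme), edge labels (RAISED, RELATES_TO) and the query are hypothetical illustrations under our own assumptions, not the actual data model or tooling.

```python
# Minimal illustrative sketch of a graph-based lessons data model.
# Node kinds and edge labels below are hypothetical examples.
from collections import Counter
import networkx as nx

G = nx.MultiDiGraph()

# Projects, lessons and themes as typed nodes
G.add_node("PRJ-001", kind="project", sector="infrastructure")
G.add_node("LL-042", kind="lesson", summary="Engage landowners before design freeze")
G.add_node("THEME-STAKEHOLDERS", kind="theme", name="stakeholder management")

# Edges capture provenance and classification
G.add_edge("PRJ-001", "LL-042", key="RAISED")
G.add_edge("LL-042", "THEME-STAKEHOLDERS", key="RELATES_TO")

# A simple cross-project query: which themes recur across the most projects?
theme_counts = Counter()
for project in (n for n, d in G.nodes(data=True) if d["kind"] == "project"):
    themes = set()
    for _, lesson, edge in G.out_edges(project, keys=True):
        if edge == "RAISED":
            themes.update(
                theme
                for _, theme, e in G.out_edges(lesson, keys=True)
                if e == "RELATES_TO"
            )
    theme_counts.update(themes)

print(theme_counts.most_common(3))  # e.g. [('THEME-STAKEHOLDERS', 1)]
```

With many projects and lessons loaded, the same traversal surfaces which themes recur most often across the portfolio, which is the kind of correlation a flat lessons log struggles to reveal.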
If I share my data with you, will some of my less favourable lessons get into the public domain, or will some of my commercial best practice be shared with competitors?
  • Data integrity and security are incredibly important to us. Without them, we wouldn’t be able to operate.
  • We segment our dataset into publicly available data, data that stakeholders are willing to share, and data that must be protected. For protected data, we roll up the records to provide insights and trends, but the nature of the lesson, its origin and its specifics remain confidential and are only releasable to the company that shared it with us (see the sketch below).
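To illustrate the roll-up, here is a minimal Python sketch. The field names, sensitivity tiers and helper functions are hypothetical and stand in for the actual schema and access controls.

```python
# Minimal sketch of rolling up lessons into trends while keeping protected
# specifics confidential. Field names and sensitivity tiers are hypothetical.
from collections import Counter

lessons = [
    {"theme": "stakeholders", "sensitivity": "public",    "origin": "Org A", "detail": "..."},
    {"theme": "stakeholders", "sensitivity": "protected", "origin": "Org B", "detail": "..."},
    {"theme": "risk",         "sensitivity": "shared",    "origin": "Org C", "detail": "..."},
]

def roll_up(records):
    """Aggregate by theme only, dropping origin and detail so that nothing
    attributable leaves the protected tier."""
    return Counter(r["theme"] for r in records)

def releasable_to(company, records):
    """Return full records only where the tier allows it: protected lessons
    go back solely to the company that contributed them."""
    return [
        r for r in records
        if r["sensitivity"] != "protected" or r["origin"] == company
    ]

print(roll_up(lessons))                 # Counter({'stakeholders': 2, 'risk': 1})
print(releasable_to("Org B", lessons))  # Org B sees its own protected lesson
```

The point of the design is that everyone benefits from the aggregated trends, while the attributable detail of a protected lesson never leaves the organisation that contributed it.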