When analytics projects go wrong

By Hugh Miller
Principal
17 September 2018


Many organisations have seen big-ticket analytics projects turn into a morass of expense, confusion and lacklustre results. While some mistakes are unavoidable, there are ways to make the best of such experiences.

When studying at university, I was fortunate enough to have lunch with a senior academic from one of Australia’s largest medical research institutions. I impertinently asked him why so many of the big-ticket research projects (cancer, brain science, genetics etc.) that attracted significant funding often underdelivered on their promised findings. He gave my question more time than it deserved. He believed that there was an element of a sales cycle in big medical research projects; to secure funding and high-quality students you had to aim big, knowing that actual progress would likely be more modest. But continuity was vital, so that over time you could build incrementally towards great things.

This sales cycle exists for big analytics projects too; I’ve seen and heard about many cases where a big bold vision to transform an organisation’s use of data has woefully underdelivered. In some cases the results are so bad that a project is abandoned altogether. What are we to make of this? Is there a better way?

Here are some thoughts.

  • Do not ignore incrementalism: There is nothing wrong with a big bold vision, particularly if it’s the best way to gain stakeholder buy-in. However, the reality is that most organisations see incremental improvements in their capability and results over time. If you’re evaluating a potential new project, try to ensure that it builds on what you’ve already got and that, at worst, it will still deliver some useful incremental progress for next time. This means injecting a dose of realism into the vision.
  • Your hard problems will often remain hard problems: Sometimes part of the justification for a big analytics project is to fix a deficiency in the current setup; for example, the ability to better manage customers with complex product holdings. In many cases, if these were easy problems to solve then they would already have been solved. Radically changing an IT system or an analytics solution might just repackage the issue into a different form.
  • Use consultants effectively: Consultants are a fixture of the analytics landscape and their advice is often sought for big change projects (disclosure: I am one!). However, they will usually not have the same end-to-end view of the business as internal teams, and they will not have to pick up the pieces if things go wrong. Over-reliance on consultants or lack of knowledge transfer in a project creates risk for an organisation.
  • Laying good groundwork is important: Much activity these days is done under the umbrella of ‘agile’ management and minimum viable products. Such approaches aim to get a working solution out quickly and fill in the details later. Agile management has been hugely valuable in breaking down previously monolithic projects with giant Gantt charts. But there is a risk of going too far; sometimes there is value in laying the foundations (good data flows, scalable infrastructure etc.) rather than racing ahead to the final product. While less sexy, such groundwork can still be sold if it aligns well with longer-term analytics ambitions.
  • When a project goes wrong, look for the value: Even if a project ends up in the scrapheap, there will usually be progress that can be scavenged. Some of this might be higher-level learnings (e.g. discovering a weakness in an analytics platform, which can guide future decisions). Other pieces could be lower level, such as new code or models that solve a smaller problem and can be re-purposed. While sifting through a failed project can be painful, a blanket deletion can be worse.

Not every analytics project will be a success, just like not all medical research yields spectacular cures. But both have the virtue that there is always a new opportunity, particularly when you appropriately learn from the past.

As first published by Actuaries Digital, 17 September 2018

