
On 20 October, Hitachi Solutions sponsored the Government Data Summit, and I was delighted to attend alongside two colleagues from our data team. The event itself was well positioned, with a good audience and relevant keynote speakers with whom we were able to engage. 

For me, it was a key learning opportunity, and I have pulled together the most important takeaways from both the event and the conversations I had. Despite not being involved in the roundtable discussions, I learnt a lot from speaking with different suppliers and listening in on the panel sessions, which delved deeper into the influences on data, the challenges it faces, and its prospects.  

The importance of data 

I was particularly interested in a discussion around AI; it provided an insightful evaluation of the dependency between data and AI with the speaker asserting: 

“Storytelling is important but what story we tell is really important”. 

The speaker likened talking with other data scientists to ‘therapy’, because the premise of data is so misunderstood amongst different audiences. He suggested that the desire for AI often overshadows the underlying importance of the datasets that drive the need for AI in the first place. His point supports one of my favourite quotes: 

Without good data, if you use AI, you just make mistakes with greater confidence

Judson Althoff
EVP

Interestingly, it was highlighted that “perfectionism should not hinder a decent dataset”, suggesting a fine line between ‘bad’ data and ‘good enough’ data. This was an interesting theme throughout the day, including how best to leverage data to build the bigger picture and achieve a broader data strategy across government.  

Data strategy 

Many speakers emphasised the importance of a robust data strategy, which can be grouped into the following five mission statements: 

  1. Unlocking the value of data across the economy 
  2. Securing a pro-growth and trusted data regime 
  3. Transforming government’s use of data to drive efficiency and improve public services 
  4. Ensuring the security and resilience of the infrastructure on which data relies 
  5. Championing the international flow of data 

As part of this, a data strategy must articulate standards, data quality, data management and stewardship, as well as a longer-term vision of where you want to go with the data. This brings me to short-termism: the need to deliver quick results while retaining the capacity to support long-term requirements. In a fast-paced world where an estimated 2.5 quintillion bytes of data are generated each day, one must ‘design for tomorrow and build for now’ by harnessing the power of data with a long-term strategy in mind. It is reassuring to see that departments recognise the value of a data strategy and seem acutely aware of what they do and don’t know, which can help propel them up the maturity curve.  

Obstacles to data 

Notwithstanding the advancements made with data, it was interesting to note the obstacles it faces across government and the barriers to its value being recognised. This is the most competitive market for data talent we have ever known, with a multitude of new roles required across different skill areas. The scarcity of data talent, combined with an abundance of data, points towards information overload and has led to a ‘reactive’ rather than a ‘proactive’ approach to data. And whilst we should encourage communities to think beyond the boundaries of their department, they are constrained by departmental funding. Budgets cannot match expectations, which are heightened by differing perspectives: people with different expertise view things differently, and so have different visions of how to deliver the best value from data.   

All in all, these are only some of the many key takeaways from my time at the Government Data Summit. It proved an invaluable experience and offered a good insight into not only what the future of data looks like, but how we get there.