Visceral Business


Data as an emerging utility

Photo by Ricardo Gomez Angel on Unsplash

During Data Connect 2023, as part of her excellent session, a Taxonomist at the Home Office asked the question:

‘Is a hot dog a sandwich?’

From a taxonomy point of view, it’s a conundrum. But at the same time, it’s worth acknowledging that standardisation and category management have been established as business as usual in sectors such as food retail for decades.

In food retail, a hot dog in its prepared state would be classified as hot food, because placement in store defines ranging, and hot food belongs in heated cabinets. The component parts for hot dog self-assembly would sit in the bakery, canned meats and condiments aisles of a supermarket. Importantly, they would be easily discoverable that way.
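The context-dependent classification described above can be sketched in a few lines of code. Everything here is illustrative: the category names, the rules and the idea of keying on a product's state are assumptions for the sake of the example, not any real retail standard.

```python
# Hypothetical sketch of context-dependent classification, as in food retail:
# the same product concept lands in different categories depending on its state.
# All products, states and category names below are invented for illustration.

RANGING_RULES = {
    # (product, state) -> where it is ranged in store
    ("hot dog", "prepared"): "hot food counter",
    ("hot dog buns", "ambient"): "bakery",
    ("frankfurters", "canned"): "canned meats",
    ("mustard", "ambient"): "condiments",
}

def classify(product: str, state: str) -> str:
    """Return the ranging category for a product in a given state."""
    try:
        return RANGING_RULES[(product, state)]
    except KeyError:
        raise ValueError(f"No ranging rule for {product!r} in state {state!r}")

print(classify("hot dog", "prepared"))      # hot food counter
print(classify("hot dog buns", "ambient"))  # bakery
```

The point of the sketch is that classification is only stable once the context (here, the product's state) is made an explicit part of the taxonomy, which is exactly what the ‘hot dog’ question exposes.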

When looked at in this context, both central and local government organisations are only just beginning to grapple with how fragmented data and data taxonomies are.

Today, data is a mission-critical asset in everything we do, and we all depend on data quality to function well. There’s only one direction of travel, especially with AI and algorithmic management: data is going to become ever more deeply embedded in the fabric of our lives.

Essentially, data is a major utility, one of universal dimensions, waiting to come fully onstream. And we have history we can learn from when looking at how to bring a major utility online successfully.

Let’s go back nearly 200 years, to 1826, when different railways employed varying track gauges. That led to the inconvenience of ‘gauge breaks’ wherever tracks of different widths met. For all the investment made in the railways at the time, people couldn’t go far without gauge breaks being an issue.

As a solution, a standard gauge, known as the ‘Stephenson gauge’ (1,435 mm, or 4 ft 8½ in), became widely accepted worldwide. It was this gauge that enabled seamless interconnection and interoperability, and the creation of the rail networks we take for granted today.
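As a quick sanity check on the two figures quoted above, the imperial measurement converts to the metric one:

```python
# Check that 4 ft 8 1/2 in corresponds to the 1,435 mm Stephenson gauge.
MM_PER_INCH = 25.4

inches = 4 * 12 + 8.5          # 56.5 inches
millimetres = inches * MM_PER_INCH

print(round(millimetres, 1))   # 1435.1, conventionally stated as 1,435 mm
```

The two definitions agree to within a tenth of a millimetre, which is the whole point of a standard: everyone building to either figure produces interoperable track.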

The development of data as a utility follows the same principle, as does, for example, the development of electrical networks, where standard protocols have been critical to establishing electricity as an asset and to generating reliable energy.

Using data to develop intended outcomes successfully also has parallels with how we navigate road infrastructure and travel.

It wasn’t until the late 1950s that the road transport network was established, but today, with digital connectivity built in to how and where we travel, navigating from ‘A’ to ‘B’ using a well-designed information architecture is something we take completely for granted.

So, my question is, why aren’t we doing this with data?

There are, of course, significant practical obstacles to overcome.

The difference between data as a utility and the utilities of road, rail and power is that data hasn’t ever been centrally controlled and nor should it be.

Because of how rapidly digital capability has developed over the last decade, all organisations have developed data and digital strategies rapidly and in parallel to one another. So far, there hasn’t been an opportunity to aggregate findings and learn from them.

But then, with data, is that even advisable? Data blind spots are a liability, and iterative development is not a risk we should want to take. We need a singular, well thought-out, strategic approach.

Photo by Alexander Schimmeck on Unsplash

Data is highly granular and uber complex. There are millions of classifications, standards and taxonomies to consider, I hear you say.

But data is our digital DNA, and just like sequencing the human genome, formatting data to a universal standard is what will enable us to realise our full data-enabled potential.
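What ‘formatting data to a universal standard’ means in practice can be sketched with a minimal example: two organisations holding the same fact in different shapes, mapped onto one agreed schema. The field names, record shapes and target schema below are all invented for illustration (the UPRN is a real UK property identifier, but its use here is an assumption, not a reference to any actual system).

```python
# Illustrative sketch only: two organisations record the same property fact
# in different shapes; a shared target schema makes the records interoperable.
# All field names and both source formats are hypothetical.

def from_org_a(record: dict) -> dict:
    """Map organisation A's export onto the shared schema."""
    return {
        "uprn": record["property_ref"],
        "postcode": record["post_code"].replace(" ", "").upper(),
        "recorded_on": record["date"],
    }

def from_org_b(record: dict) -> dict:
    """Map organisation B's export onto the shared schema."""
    return {
        "uprn": record["UPRN"],
        "postcode": record["Postcode"].replace(" ", "").upper(),
        "recorded_on": record["captured"],
    }

a = from_org_a({"property_ref": "100023336956",
                "post_code": "sw1a 1aa",
                "date": "2023-10-01"})
b = from_org_b({"UPRN": "100023336956",
                "Postcode": "SW1A1AA",
                "captured": "2023-10-01"})

assert a == b  # once normalised, the two records agree
```

The mapping functions are trivial; the hard, valuable work is agreeing the target schema, which is exactly the standardisation argument being made here.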

I think we need to establish a general acknowledgement that, consciously or unconsciously, we are on a path towards the standardisation of data as an emerging utility because, as with any utility, standardisation is an inevitability.

Thinking of data as an emerging utility is not just about idealism, it is a practical step to take. It is a catalyst to developing effective, universal data architectures and infrastructures.

I’ve seen how information and organisational architectures affect and influence particular outcomes throughout my professional career. All organisations have operational and process architectures, for example, that shape how they get the products and services they provide to market, as well as how they handle the interactions with customers and associated networks. If these aren’t considered as part of organisational culture, the organisation will underperform and be less agile.

Farming data management out to third parties without acknowledging how this helps or hinders emerging data infrastructure is not a long-term option. If Decentralised Autonomous Organisations (DAOs) are emerging, enabled by digital technology, then data interoperability becomes even more essential.

Central and local government have perhaps the strongest motivation to ensure we can handle technology, code and data, rather than be conditioned or subsumed by them.

Getting to a joined-up approach takes time, of course, but I think the moment has come to accelerate what we’re doing. Shocking though Marc Andreessen’s recent Techno-Optimist Manifesto might have seemed, I do agree with one point: it’s time to build.

We need to take things up a level. My extensive work on the Connected Housing and Social Charity Studies over the years has involved examining numerous organisations simultaneously and conducting high-level analyses to understand their preferred strategies.

These studies have played a pivotal role in fostering a widespread understanding of digital best practices within the industry and have established active learning mechanisms with continuous feedback loops.

Typically, such work is overseen by regulatory bodies because coordinated effort is necessary. The task now is to turbo-charge these organisations and develop new networked capabilities, in support of developing data as an asset for the benefit of us all: a new utility operating at full capacity.