Open data initiatives embraced by governments around the world promise major economic and social benefits, but the process must extend beyond simply releasing gigabytes of raw data.
Each year, governments collect vast quantities of information as a by-product of programs undertaken and services provided. The resulting data sets cover everything from population demographics and health records to business and trade trends.
As part of a growing belief in the benefits of open data, these massive data stores are being made available to the private sector and general public. It’s been shown that such openness can lead to insights and advances that would previously have been impossible.
Business consultants McKinsey & Co estimate that open data initiatives could unlock more than US$3 trillion in economic value by stimulating innovation and improving decision making.
In Australia, the Federal Government is pursuing an open data agenda. Communications Minister Malcolm Turnbull has acknowledged that, to be useful, data released must be free to use, easily discoverable and machine-readable.
Just last week the Minister launched a study, The Open Data 500, to explore how Australian organisations are using government data sets to generate new business, develop new products and services, improve business operations or create social value.
However, to truly deliver the promised benefits, governments must do more than simply open the data floodgates. They need to invest in platforms that enable intelligent analysis of that data.
An example is the popular online tool offered by the Australian Bureau of Statistics. The tool allows users to create tables, maps and graphs using data from a diverse range of official sources—primarily census data but also statistics on topics such as migration, healthcare, crime, education and employment. Users can slice and dice data sets and combine statistics from different areas to create unique insights.
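Combining statistics from different official sources typically comes down to joining machine-readable tables on a shared key and deriving a new measure. The sketch below illustrates the idea with pandas; the region names and figures are hypothetical, invented for the example, not real ABS data.

```python
import io
import pandas as pd

# Hypothetical extracts of the kind such a tool exposes: regional
# population counts and regional employment figures. All values are
# illustrative only.
population_csv = io.StringIO(
    "region,population\n"
    "North,120000\n"
    "South,95000\n"
    "East,60000\n"
)
employment_csv = io.StringIO(
    "region,employed\n"
    "North,70000\n"
    "South,50000\n"
    "East,30000\n"
)

population = pd.read_csv(population_csv)
employment = pd.read_csv(employment_csv)

# Combine the two machine-readable data sets on their shared key...
combined = population.merge(employment, on="region")

# ...and derive a statistic that neither data set contains on its own.
combined["employment_rate"] = combined["employed"] / combined["population"]

print(combined.sort_values("employment_rate", ascending=False))
```

This is the kind of "slice and dice" operation the tool performs behind the scenes; it only works when the released data is machine-readable with consistent keys across sources.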
By giving users the power to ask their own questions, this tool makes it easy for businesses, researchers, policy makers and the general public to find the information they need to make better, fact-based decisions. Whether it is a franchise weighing up the best location for its next store or a government department deciding which projects to fund, making data available in an accessible way benefits everyone.
Despite such successes, Australia has been portrayed as an open data laggard. The label resulted from the nation being ranked 10th in the Open Data Barometer report published by the World Wide Web Foundation.
Yet such rankings do not tell the full story. The fact that other countries have released more data sets does not mean those data sets are in a format from which meaningful insights can be extracted. Measuring 'openness' by the sheer volume of data released is misleading.
Instead, attention should be focused on how easy it is to analyse the information and derive value from it. Large tables of statistics are one thing, but gaining valuable insights from them is something else.
For Australia to derive the most benefit from its open data initiatives, governments and agencies must be prepared to invest in the technology required to make data as accessible as possible.
Rather than being confronted with gigabytes (or potentially terabytes) of raw data, users should be able to manipulate and analyse it quickly and easily. That way, attention and resources can be focused on gaining insights rather than on the cumbersome mechanics of handling large volumes of data.
Streamlining access and use of open data in this way will also encourage more people and organisations to take advantage of the opportunities it offers. The result will be insights and advances that will deliver on the true promise of open data.
Steve Hulse is chief executive at Space Time Research