Open data is a nebulous concept. What does it actually mean? Openness is generally considered to be a good thing. And we all know data is valuable. So open data must be double plus good, right?

We tend to get confused about what’s “open” and what’s not (hardly surprising when few of us read the terms and conditions on anything). As Tim Berners-Lee pointed out at the fifth ODI Summit yesterday, most of us don’t realise we shouldn’t be using Google Maps on event invitations, because that data is copyright Google (he recommends we use OpenStreetMap instead).

At the same time as being trigger-happy with other people’s copyrighted data, we’re even more foolhardy with our own. What we really don’t want is what Sir Tim calls “promiscuous data” – that’s personal data which goes off in all sorts of directions we don’t want it to.

The Open Data Institute believes that open data is the glue society needs. It is campaigning to establish data as "an infrastructure, not a commodity". If we all share data and collaborate, we'll save ourselves billions of pounds annually. But if we're individually confused about what we should and shouldn't share, the companies and organisations currently managing our data for us are even more conflicted.

Very little appears to have actually changed since Carole Cadwalladr uncovered the Facebook/Cambridge Analytica scandal back in March. Global technology companies are still moving fast and breaking things. Just this week, there's new concern around how Google will use DeepMind's health data, which contains records of 1.6m NHS patients.

"Data and value" was the focus of the ODI Summit yesterday. And "value" was the word most under scrutiny: what sort of value do we want to prioritise, financial or social?

Here are five key themes that emerged:

1. Our inadequate regulatory framework

Catherine Miller (doteveryone – above, centre) argued that we urgently need an Office for Responsible Technology. Claire Melamed (Global Partnership for Sustainable Development) went one step further, saying that privacy should be enshrined as a fundamental human right:

“We have a right to clean air and clean water – but we don’t have to clean our own air and water, we place responsibility on the state to provide that. We have governance in every other area of human rights, why not for privacy?”

2. The need for proper design thinking

According to Andrew Eland (DeepMind – above, left), people need to start thinking about the downstream consequences of what they're building: not just about the end user, but about "what if my product is wildly successful, what will the implications be?"

There's a lack of trust because companies like Google and Facebook have behaved in an untrustworthy way, said Catherine Miller. They say they "just want to build good products", but they need to think carefully about what their definition of "good" is. Good for whom? "The real test should be, is this something you could tell your mother about, and not be ashamed?"

3. Making it happen

The open data movement is having the same conversations now as the open source movement ten years ago, said Patrick McGarry (Dataworld). Businesses need to appreciate what a community of practice can help them do: “A hundred thousand eyes is better than any amount of time you can spend with the same five people.”

Monzo and Co-op Digital were cited as good examples of businesses sharing data in the open. Engineering firm Arup has been a pioneer both in creating apps to gather data and in its willingness to collaborate. Catherine Brien (The Guardian) said organisations were often reluctant to share, but that rallying different companies around a common cause worked well.

Meanwhile, the ODI announced two brand new collaborations: first, it will work on a series of open data projects with the government's new Office for AI. And second, it will work with Lloyd's Register Foundation to improve safety in built infrastructure. Richard Clegg (MD of Lloyd's Register Foundation) said we need to imagine a world where all the owners of critical infrastructure (e.g. bridges and roads) are able to openly share data.

4. The invisible long tail

The problem with personal data being shared publicly (e.g. via social media) and then passing into the ownership of private, for-profit companies is that the implications aren't felt or realised straight away.

Catherine Miller pointed out that people may generally feel social media has improved their lives personally, yet still worry about its wider impact on society and democracy. If your data has been scraped, you won't necessarily see a direct impact on your own life, but socially the impact is massive.

5. In the future we’ll marvel at how naive we were

I liked BBC CTO Matthew Postgate’s description of “social” as an evolutionary digital step after the mainframe and desktop but before artificial intelligence and machine learning.

Tim Berners-Lee said that in future we'll happily pay for the safekeeping of our data, the way we've started paying for professionally produced news rather than reading too much that is free (and fake).

This is just one aspect: the “social” web as we know it is clearly set to change completely. In hindsight, we’ll all be less careless with our data (I’ve heard it likened to the early days of electricity where Edwardians used to think high voltage electric tablecloths were a good idea).

The feeling is that private companies have had free rein for far too long. Now the pendulum must swing back the other way. To use a well-worn adage, it's time for us, as individuals and as a society, to take back control.