Science & Technology Committee: The Big Data Dilemma: Panel 1

On Tuesday 17 November 2015 the Science & Technology Committee, of which I am a member, took oral evidence for its inquiry into the balance between the benefits and risks of big data. Here is the exchange with the first panel of witnesses:

 

Valerie Vaz: We touched on the EU. It took a court case before we got the right to be forgotten and delete our history. The EU is looking at new regulations, and some of the highlights are the right to be forgotten and that consent should be explicit. All those enactments will hopefully be put in train. What is your view about them? Do you think they go far enough, or do they go too far?

 

Professor Montgomery: I have already alluded to the concern of people in the health community that there are projects that rely on the integration of data and which people would like to see happen because it would give them more responsive and effective health services. We need to bear in mind how much of the healthcare we give is not properly validated—we do not know as much as we should about it—and how much of the evidence we have to support healthcare is based on trials in somewhat artificial contexts. One of the promises of big data and the integration of health records is that we might learn what happens in the real world when people are offered particular treatments or drugs. There is some concern that we need a proper debate about the gains as well as the risks involved. There is certainly a perception that the weight placed on specific consent as the key tool might remove a lot of opportunities that people would like a chance to talk about; too rigid a tick-box compliance approach might rule out things that could be controlled better in other ways. There is a series of concerns which I am sure will be represented to you by other witnesses.


Valerie Vaz: Is that specifically about the EU regulations?

 

Professor Montgomery: There is a very significant concern about the impact it has on health data registries of various sorts, particularly in relation to care for children, where people are trying to work out whether they will have to re-consent. This is mostly historic data. It is not the gathering of new data, so you do not have the opportunity to seek a new consent; you have to go back and contact people who may not remember or may not care. We need to weigh whether or not it is something people want to see, so we need a lot more conversation about it.


Valerie Vaz: People may not have consented, even if they are children. That is the key concern.

 

Professor Montgomery: Their parents may have consented on their behalf; they may have consented thinking that their consent was abandoning any further involvement, because they were quite happy to put a blood sample into a process. We do not know the answer to those questions.


Valerie Vaz: I was right in the middle of the inquiry on care.data, so I know the difficulties. Some of the data was released without consent and sold off. What is your view of the EU regulations?

 

Chris Combemale: It is quite difficult to have a final view because at the moment there are three versions of the legislation: the Parliament version, the Commission version and the Council of Ministers version. They all differ quite considerably on key clauses. We think the Council of Ministers version strikes a better balance between risk and principles versus being prescriptive, and in a fast-changing world things that allow risk-based principles to be incorporated are better. On issues like profiling, we think the Commission and Council versions are both better than the Parliament version because they restrict limitations on profiling to automated profiling that would have a legal effect on the person, whereas the Parliament version asks for consent to every profiling action that might be taken, which from a consumer and marketing point of view would be quite difficult. We know that consumers want relevance; they want products and services that match their purchase history. I think it is right to restrict the limitations on profiling to things that have a legal effect and are automated: for instance, automatic approval of mortgage applications without the intervention of a loan officer. It is highly sensible that you cannot make an automated decision that has a legal effect without a human being in some way reviewing it, but the Parliament version goes too far. I do not want to take you through every key clause, but I do not think they have the balance quite right yet. The negotiations are ongoing. We do not really know, because it is behind closed doors, exactly what the final version will be, but from what we understand there is movement towards a better balance than some of what existed in the Parliament version.

 

Renate Samson: Likewise. There is still a long way to go. I am trying to get my head around it all. A lot of good and interesting conversations are taking place. I am particularly enthused to see a conversation about privacy by design or privacy by default. That is hugely important as we move into the internet of things. Considering privacy and security at the very start of research and development does not happen now; it is often an afterthought. You have a great idea and then you think, “Oh, crumbs. This might happen.” It needs to be the first thing. It should be seen not as a negative but as an opportunity for great innovation. Security does not have to be a bad thing. Security is a good thing, so I am encouraged that it is on the agenda. Likewise, there is still a lot of discussion to go; the trilogue process is not over. I am sure it will not be 100% perfect, but we are definitely thinking more about the citizen.

 

Professor Montgomery: You made a point about care.data. I am not sure that was care.data, because care.data was never implemented, but in the piece of work Sir Nick Partridge did for the Health and Social Care Information Centre the particular thing that struck me was not that there were no governance processes in place at the initial transfer—contracts were in place—but that there was no ability to see whether those contracts were being honoured. The difficulty was that, looking just at the point of collection, it appeared to be a robust system, but in the way it was implemented we could not tell whether those agreements were honoured. We can speculate that they probably were not honoured in some cases. It is part of our thinking that just focusing on the point at which I agree or do not agree is not enough to protect my interests; there needs to be more than that.


Valerie Vaz: Can I take it a bit further? There is the idea that we have very good data protection—we have the seven pillars—but maybe the US does not have as robust a system as we do. How do you see the new regulations impacting on, say, companies that do not operate here? They may have some form of base in the EU but they are not based here. How will that impact on companies in different parts of the world?

 

Renate Samson: That is a very good question that is tricky to answer right now in light of the recent Safe Harbour situation. I hope conversations are taking place with regard to mutual legal assistance treaties. I understand there are provisions in the conversations about the new data regulations that will look at the US and organisations that engage with Europe. It is being discussed, but I am not in a position to expand on that.