This is the output of the real‑time captioning taken during the IGF 2014 Istanbul, Turkey, meetings. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.
>> DAVID GROSS: Oh, there it is, very good. Good morning, everyone. Thank you very much for joining us for what undoubtedly will be the highlight of your IGF. This is a panel on mobile, trust and privacy. It's important to do this early in the morning so you don't get too excited as a result of some of these issues. This is the third time we've met, and our distinguished panel will answer the question of, among other things, what has happened in the intervening years to help extend the information society's interest in being able to deal with data privacy and mobile services while at the same time protecting our rights and interest in privacy.
We've got a terrific panel. What I've asked all of our panelists to do is to give a short presentation, only four or five minutes, so it's not even really a presentation, of a couple of points that they specifically want to make sure they have gotten across to you all, points they think are particularly important, timely and the like. I'll ask each of them to give a very brief introduction about their role within their organisation and its relevance to this topic, and then we're going to be opening up for questions. We want to make this as interactive as possible, so feel free not only to ask questions but, if you want to make statements or share views and the like, please do so as well.
Since the GSMA has organised this and Pat Walshe is the driver, I'll ask Pat to begin and we'll go straight down the dais. Pat?
>> PAT WALSHE: David wants us to be controversial. Not everything I say will be (?) so please don't write me off. So, privacy. Just a couple of stats. There are 7.1 billion mobile connections globally at the moment and 3.6 billion unique mobile subscribers. Now, bear in mind, when figures are bandied about like 1 billion Facebook users or Google users, that's one company, and the 7.1 billion connections are connected by 800 companies that operate in 219 countries. So privacy is an extremely complex matter. I'm also going to challenge whether or not privacy law can ever protect people's privacy. We did research on 11,500 people across a number of countries, and the documents are at the back of the room. It's very clear to us that trust really matters: 82% of people want to know what data is being taken from their devices and they want to be able to permission that, and 60% of those surveyed, just in the area of location alone, want consistent rules to apply. Think of this consistent bit when I talk in a few minutes.
So for me privacy mobile, everybody is moving to mobile, businesses are waking up to mobile, Google, Facebook, everybody is saying it's mobile first, and that's true. Governments want to use mobile to provide the key means by which you will access government services, for example, and it's absolutely terrific. I've seen mobile deliver so many opportunities to people around the world.
But mobile is very different. Mobile has traditionally been regulated by a precautionary principle, and this began in the mid‑'90s, when it was felt that there were harms that could materialize in the sector, so it was regulated by particular rules. Yet we have to ask ourselves, I think, in a world of converged services, whether such an approach is really applicable today, because my privacy is affected in many different ways in the context in which I use my mobile: not only by the operating system, but by the hardware manufacturer, by the platforms, by the app developers, et cetera. This is a key issue, and I think what affects the ability to provide a privacy protective experience in mobile is the fact that not only is mobile affected by omnibus data protection laws where they exist, but there are also telco‑specific rules, there are license conditions, there are codes of conduct. It's actually really fragmented.
If I look, for example, at mobile, some of the abilities to provide privacy protective experiences are hampered by license conditions, some of which might require an operator to obtain prior approval to deploy bulk encryption on their networks, some of which require them by law to maintain obligations to lawfully intercept communications. We may have registration programs. One country issued a direction to operators to deal with the theft and loss of handsets. We have a system whereby we can block those handsets from being used, using the IMEI, but in this particular country a direction was issued whereby mobile operators have to send names, addresses and national identity numbers to a government database.
So while we make every effort to improve trust and privacy, it's sometimes distorted by regulation, and I do find that regulation is distorting a market in privacy, and it's distorting the market in data as well. There was a workshop yesterday that talked about privacy engineering, and that's really important, because for me, and I'm aware David is one of the best lawyers, the law will never do this. It won't do it.
So I am currently trying to do some initiatives in countries where there are no laws. So it's as much a cultural challenge in those countries, convincing companies to do the right thing, to be responsible, and to establish the right standards and approaches to this. And so privacy engineering is a key concept. We hear about privacy by design, but that's still a bit sort of woolly for too many people, but privacy engineering seems to be something tangible. How do we engineer for privacy both in law and in hardware, in software, in platforms, and in apps, because all of these come together.
One final thing if I may. This is the IGF, and privacy on a mobile device is affected by telco regulators, what they do. They have a say in how privacy is regulated by data protection, privacy commissioners, who have a say, and by justice departments, but I don't see a forum by which all of these different ‑‑ and I'm not going to use the word "interested parties" are brought together to discuss these things in an open and transparent manner, and I'll talk about some things we're doing in solutions later.
>> AUDIENCE: (?)
>> The theory is that privacy is important; as your survey data indicated, it's extraordinarily important to virtually everyone. Are we seeing, and if not, why are we not seeing, more competition amongst app providers, carriers and others to provide greater privacy protection for consumers as a marketing tool in order to get more customers?
>> PAT WALSHE: I recently gave a talk to a group of people about moving from privacy as a chore to privacy as a value, because privacy is a chore for users, for citizens. It's a chore internally. As a chief privacy officer, when you're dealing with your business officers they see you not as a CPO but as a PPO, a project prevention officer. So again you have that challenge, but there's very real value here. You mentioned platforms and app stores. Actually, some app stores won't allow certain privacy protective apps. So it's not proper to suggest the market can always take care of this issue.
>> DAVID GROSS: Alex?
>> ALEXANDRINE PIRLOT DE CORBION: As part of this panel we were asked to highlight some of the challenges and some of the solutions, or at least the work that we're doing to try and address those challenges. First of all, Privacy International would like to thank the organizers for asking us for the second time to be on this panel. Mobile telephony has brought so much power to individuals, enabling them to speak, to engage in Civil Society and in politics, to learn, and to reach out to a wider world. But this power hasn't only been taken up by individuals, unfortunately. Governments and industry have used mobiles to gain insight into individuals, their lives, their friends, their networks, their families. I'm going to quote a former vice president from Google, who once said that a mobile phone has ears, eyes, a skin, and knows your location: eyes, because you never see one that doesn't have a camera; ears, because they all have a microphone; skin, because there are now touch screens; and GPS allows it to know your location. But what that doesn't say is that the phone can betray your information without your knowledge, and this is an increasing concern to users, as Pat mentioned in his presentation.
Before getting into more of the details, I wanted to give a broad outline of the context in which we're having this discussion, in terms of the legal framework. Some of the time there isn't even a legal framework, either because there's no data protection law in place or because the rule of law is not respected in some countries. Nevertheless, we now see data protection laws in around 101 countries, and I think South Africa was the last one, in November, to adopt a data protection law. But there's still a lack of legal frameworks around domestic communication surveillance practices and policies.
The second point in terms of the legal framework I wanted to mention because it's increasingly becoming common to adopt data retention policies, and our argument is that if there were a strong, robust and respected data protection law, there wouldn't be a need for a data retention policy to complement it.
Then going into more of the details, I wanted to highlight two sets of actors that I think are the main ones engaging in this sector. First we've got the industry, and by industry I mean telecommunication, Internet and network providers, information and technology and service providers, (?) device developers. Looking at them as a broad actor, we can agree that they have different responsibilities and obligations, and they have different purposes and different business models, but at the end of the day all of these actors are collecting, using, storing and sharing users' information. And this is why we categorize them together.
We welcomed the approach and the new initiatives from the industry to publish transparency reports, but that's not enough. Transparency is not enough. Transparency is the final step of a multi‑step process that includes pushing back on surveillance laws with the same vigor that industry is pushing back on privacy laws. The second step is protecting users by collecting less: there's an increasing amassment of metadata that is not necessary for companies to deliver their services to users. Companies should also push back against government requests, and we've seen some companies doing that.
The third one is to protect and follow data protection law, and to be transparent about the data protection laws under which they're operating.
The fourth one is more on a technical level, and I think we shouldn't forget it: to secure the platforms and services on which these are operating. And the final one, like I said, and it is in this order that it should happen, is to be transparent, including about the legal framework, but also specifically about the relationship that firms, and the industry in general, have with law enforcement and national security requests.
The last actor I wanted to mention, and this can be controversial because most people might expect the discussion to be about the industry and companies themselves, is the state. I want to bring in the issues that we've been having for a long time, and more particularly and more visibly in the last 18 months or so. Industry can do what they can for customers, and they should continue to do so, but we know, and we've seen from the Snowden revelations, that increasingly there have been external interception technologies employed by states, and by that I mean telecommunication networks might not even know that the state is interfering and gaining access to their networks.
The NSA tapped directly into Internet and network service providers' systems and infrastructure to obtain data. They hacked into Google's communication infrastructure, as well as others', and this is something that we're challenging in the U.K. courts at the moment with other Civil Society Organizations. And the point I wanted to make by mentioning the role of the state, in terms of the way people engage with mobile technologies, is that this goes well beyond interception of communication, and what will this do to the trust people have in using these systems? This is really something that we have to address together, because there are bigger adversaries.
And in terms of solutions to addressing this problem, we suggest the 13 principles, and I know they've come up several times this week, and they might seem like an abstract contribution from Civil Society, but for us it was a way of consolidating all the protections that currently exist on data protection, privacy and surveillance, and gathering them into a list of principles that are approachable and could be implemented by all actors. So we can go into more detail.
>> DAVID GROSS: Terrific. Before going to Amos, who has joined us remotely, and knowing how tentative all of these connections are, let me note that Pat would like to jump in. What we're going to try to do is get as many of the panelists to ask each other questions as well, and Pat is chomping at the bit to comment about something Alex said.
>> PAT WALSHE: I'm not going to touch the government stuff; we'll debate that later. But something about data protection laws. Many of the countries in which our member companies operate don't have data protection laws, and this is a cross‑ecosystem issue. How does Privacy International feel industry can come together and agree what standards should be applied? Because after five years, to me it's like pushing a rock up a hill. Actually it's not a rock, it's a boulder the size of a house.
>> ALEXANDRINE PIRLOT DE CORBION: No, definitely, it was one of my points for later. No, no, it's fine. I mean, it's definitely one of the challenges we're facing, but we really feel that that is the starting point, to establish laws, and we work with different partners across the world, including CIS, who is present on this panel, and I hope Sunil will back me up in a minute in his presentation. For Civil Society there are milestones to reach, and advocating for data protection laws is one of them. It is a long process, and we haven't achieved it in all of the countries, but it's a starting point to raise awareness about what the issues are. For example, something that we might start using, and I know some of our partners are thinking of it, is the EU directive and regulation on data protection. This includes a provision whereby EU Member States will not be able to do business with countries that don't provide sufficient standards of data protection.
So I mean, it's all political, but that's something that we know people will start using in advocacy to convince their governments that there's a need for it, and it's maybe more from an economic perspective, but it will benefit the protection of privacy.
>> DAVID GROSS: Terrific. Amos, if you can hear us, and it looks like you might be able to hear us ‑‑
>> AMOS MANASSEH: Yeah, I can.
>> DAVID GROSS: Terrific. We have a ‑‑ this group has a very bad track record of doing remote access here. It is the Internet Governance Forum here. Amos, can you give us four or five minutes and be prepared to answer scathing questions?
>> AMOS MANASSEH: Of course, yeah. So just a brief introduction. I work with Axiata, which is an Asian group, and we have telcos in five different countries, Malaysia, Indonesia, Cambodia, Sri Lanka and Bangladesh, and interests in telcos in other countries, in India and Singapore. So we have to cover seven regulatory environments. Much of the way telcos work is relatively normalized, but some of it isn't, specifically in terms of local considerations for data protection. The project I'm working on at the moment is group wide, (?) an API layer. So I'm going to attack this from a slightly different angle.
I've got a serious practical problem, not only with the fundamentals of privacy and data protection, but also with the local nuances. I'm struggling from a technical point of view with those nuances, trying to know what one might need to do in Indonesia, which might be different than in Malaysia. And I'll go into detail in a minute. But before I get into the specific issues, I think one of the things that we in the telco space see is how difficult it is to operate alongside the over‑the‑top commercial environment when that environment is seemingly (?), and therefore, you know, is aggressively commercially driven, while the environment which we're in, where the aggressive (?), actually makes it quite difficult to keep up.
So one of the problems that we have specifically in this respect, where you can find telcos are pushing the limit to some extent, is often to do with the commercial pressures of keeping the revenues coming in on a month by month basis. So I think from time to time you'll find that telcos are pushing the envelope; issues around (?) and the way they've been misused over the years are a good example of that.
So I think, you know, that's one of the issues that we struggle with. And I think there is potentially an opportunity for telcos to regulate themselves back into the game in that respect, because I think there is a requirement for privacy to be more controlled, and there's an opportunity for telcos to take the high ground, if you like. I'd certainly want to suggest (?) going in that direction, and I think that they're good ones.
But on the other side of the coin, there are a lot of organisations who favor putting together services to help customers take control of their identities and of how their data is registered and on what basis, and there are some companies actually putting together services that will allow customers to potentially benefit from it, actually make money from (?) in a controlled way, and I think it's a commercial opportunity. In the end I think this is going to be commercially driven to a great extent. So I think that's one of the issues that we face.
I think that from a technical point of view we may find that, how can I put it, some of the new services that come up (?) will be driven by local (?). An example of some of the issues I'm facing: the services I'm putting together now are cloud based, but some of the data cannot be (?), so we're having to come up with hybrid solutions to be sure we comply with regulations, keeping any data that is generated from the customer in country (?) when it goes into the cloud mediation mode, if you like, and that's quite a challenge.
One of the other challenges is that some of the service providers that we use are based in countries which have very draconian laws related to being able to access other people's data. That causes us a problem, and we're actually having to consider where a company is based when we're thinking about which (?) to use for services that may be cloud based. So that's another issue that we're facing at the moment. It's a fairly contentious one, and it's fairly current, but it's becoming a procurement consideration when it relates to data or customer privacy.
My personal feeling, and this is something (?) I talk about quite a lot within my role, is that this subject is an opportunity for telcos. We have an opportunity to do something about this, take the high ground, and potentially start to help protect the customer in their interfacing with various (?). So that's pretty much where we're coming from.
>> DAVID GROSS: Thank you very much, Amos. Let me ask you, you talk a little bit about potential for taking the high ground for the telcos and so forth. Does that mean that you also think that there's opportunity for competition in this space, that is, between and among telcos as well as competition with OTTs and others?
>> AMOS MANASSEH: I think one of the things that telcos realise, that they recognize, is that there are certain points beyond which it's going to be very difficult to go. Telcos are fundamentally an ordinary, integral part of everybody's life. There are one or two areas in which I think telcos can improve that position, and one of those is, in fact, customer data. But I think it needs to be across all telcos, a raising of the bar of the mobile brand, (?) telcos competing against each other in that particular area.
I just think that when you get a mobile phone you should know that it is a very secure device, a device that can help protect your privacy, a device through which you can get your own data, and I think that is a service telcos should be providing generically across the board. I also think there's going to be significant competition from OTTs to try to take that space for themselves, but I think it should be a natural fit for telcos. So, you know, there is a trust (?) associated with your friendly local operators, and telcos by nature have the ability to roam; that's an advantage, because you've got players that are fairly large and global in nature but have a local presence, or the ability to act, you know, in a very local way.
So again, I think it's quite a natural fit, but I also think customers would benefit tremendously if that were the case, (?) need to be a separation. I think that, in the context of those things which you'd like to attribute to yourself, attributes, this is something that is going to become a very important part of people's lives.
>> DAVID GROSS: Thank you very much. Sunil?
>> SUNIL ABRAHAM: Good morning. My name is Sunil Abraham and I work with the Centre for Internet & Society in India. Five years ago we had no idea about the importance of privacy, but thanks to a partnership with Privacy International today we are best known for our work around privacy and data protection.
What I will try and do, in telegrammatic fashion, is quickly address all the questions that were posed to the panel. The first question is: how can citizens in both developed and developing countries benefit from the responsible use of mobile data? Telcos, as you know, are sources of big data, and what could be done around the use of that big data? I will use one example to illustrate. LIRNEasia got anonymized data from all telcos in Sri Lanka and were able to test (?) whether people from North Colombo travelled to South Colombo as part of their daily life, and whether the opposite was also true, so that the transportation system in Colombo could be engineered more appropriately based on this divide.
Second question: what are the key emerging issues and challenges of a mobile, hyperconnected world? I think more and more people accept that privacy versus national security is a false dichotomy; what we need to do is optimize both the national security imperative and the privacy imperative. At CIS we are working on one solution, which is encrypted logs at the telco and ISP level, and key escrow for every single mobile user with the privacy commissioner or some other Human Rights Commission. There are also very interesting papers, for example a paper by Dr. M.M. Oberoi on how to design a fishing expedition with minimum impact on privacy. Third question: how can we ensure secure and trusted identities on‑line?
Here I will use the example of the Indian UID project. It's still in some cases based on biometric standards. The trouble with using biometric standards is that they are irrevocable, and the authentication and identification are proprietary. We have been advocating a shift to digital signatures, which could also be on the mobile phone. There we would have revocable authentication factors, completely open software and standards, and a decentralized topology.
What needs to be done to ensure consumers are able to access services in private, trusted ways? Apart from the obvious answers, such as open hardware and free and open software, it would be excellent in the post‑Snowden era if we could address all the concerns around cryptographic standards that depend on elliptic curves by getting governments across the world to throw $10 billion at the problem and create market incentives for investigating, researching and fixing cryptographic standards.
What are the different roles for law, industry and self‑regulation in enhancing trust? The way Pat has posed the problem, he says that no law will do it, and that many countries don't have data protection or privacy law. And now I'm going to commit a sin as per the norms on this panel. I'm going to use the bad word: the multi‑stakeholder model. (laughter) In my view the multi‑stakeholder model is nothing but consultative self‑regulation. It is nothing else, and what it does is allow industry the opportunity to pre‑empt regulation through hard law, to mitigate harm and to uphold the public interest. We have an example of this happening in India. Since the government in India is not willing to discuss the privacy bill as it is being developed, not yet willing to place it in the public domain for debate, we have come up with a shadow bill, and we are running a shadow policy consultation process. And since we are occupying this vacuum in the policy space, more and more industry is beginning to assume that the real bill is going to look like our bill, and most recently we got calls from a telecom industry consortium that wants to have a discussion with us on this fake or fraudulent bill. And if they self‑regulate, that reduces the rationale for extensive and perhaps inappropriate overregulation by the government. Thank you.
>> DAVID GROSS: Terrific. Talking about what's going on in India is one example. How much of this is being driven by what is happening domestically in India, as you point out, things like the privacy bill and the like, versus those OTT services from companies outside that have no connection to India but yet provide services to people in India? That's a challenge; how do you see that playing out?
>> DAVID GROSS: Let me warn everybody in the audience that I'm going to go to our next speaker for our next statement or questions.
>> TITI AKINSANMI: Hi, everybody, my name is Titi Akinsanmi. I work on policy and government relations for Google across sub‑Saharan Africa, and I am going to be controversial by putting the very controversial issues right out there.
I want to make a couple of very clear statements so that, if you take nothing else away once we've started the debates, you're very clear on where we stand as Google. One is the fact that the free flow of information is important; it is culturally, economically and socially critical. The other is the fact that free expression is critical to who we are as Google.
I will speak briefly on surveillance and on the right to be forgotten, and end on a couple of notes. But before I go on, as part of my being slightly controversial, I'm really curious to see that there's nobody from the telcos on this panel. It would have been interesting to give them some space to engage with a lot of the questions that have been thrown at them.
On surveillance: first, the laws around the world currently do not, in general, allow governments to access private information, but they need to be a lot stronger than they currently are. Second, we think it's important to explain how often, and how, governments are able to use the laws at their disposal to gain access to end users' personal information. Currently that is not very clear.
Third, lawful access must be under due process. We're aware that the MLATs don't currently meet that standard very well, but that's something we're working on. Why am I speaking about this in the mobile context? At the end of the day, having a lot of access on‑line through traditional media is exponentially different once you bring in the mobile factor, and I see personally that that begins to open up a lot more conversations. On my continent, for example, I think very few countries actually have data protection laws.
Where there are no data protection laws or data protection agencies, it is more difficult for us to do our work and protect our end users. On the right to be forgotten, we think it's an impossible position. It puts us in a very, very difficult position, having to balance an individual's right against the public's right to information. We certainly disagree with it, but we are implementing it because, again, we have to operate within the rules of the law. We are implementing it in a very focused way, within the confines of a very broad and very vague ruling. We are doing our best to be transparent. I hear very clearly from Alex that transparency is the last step of the process, and we have tried to follow through on every single one of the steps that she mentioned. The concern, though, is that the private sector should not be left to live through this entire process by itself; it should be a collaboration across the ecosystem.
Where was I now? Okay. To finish up, because I want to take the questions: one, the question of whether we will act responsibly, and acting responsibly sometimes means that some constituents will feel they have been left out or that we shouldn't be doing what we need to do. Second is the fact that we don't have all the answers. Google certainly doesn't have all the answers, and we are very clear about that. Incidentally, in my head I said, you're supposed to be able to Google everything and find all the answers. Unfortunately we don't have all the answers, but we're working with a range of parties so we can get to that point a bit more. I'm going to leave it open so we can have more discussion.
>> DAVID GROSS: Sunil?
>> SUNIL ABRAHAM: Since Google doesn't have all the answers, I thought I would provide them with some. (laughter) Apparently ‑‑
>> TITI AKINSANMI: Thank you.
>> SUNIL ABRAHAM: Apparently people think it is difficult to optimize privacy and free speech, more particularly the transparency component of free speech. As an engineer I have reduced this to a mathematical formula: transparency requirements should be directly proportionate to power. In both privacy law and transparency law the phrase "public interest" exists, and this test allows us to distribute protection or requirements across society where there are deep power asymmetries. So it should not be difficult to think that through. I think Google in some ways is trying only to adhere to the letter of what the European court has said rather than the spirit, and in some ways trying to demonstrate that it won't work at all by implementing it poorly. But if Google takes seriously that both free speech and privacy are equally important rights, then the letter of the law tells us in some way how those rights have to be balanced. Thank you.
>> DAVID GROSS: Let me assure the audience that despite the fact there's been an equation used today there will be no examination before we leave. Before going to Pat, let me ask a question to the panel that was just raised. We have the right to be forgotten coming out of Europe, and yet Sunil was just commenting on that generally. Do we see that as a trend globally or is this a one‑off, just a European phenomenon?
>> PAT WALSHE: So the Hong Kong privacy commissioner said that Asia needs to consider a similar right to be forgotten. So it's having repercussions around the world. It's not a right to be forgotten, really. Let's not use that term. It's a right to have your data deleted, and in the U.K. you've had such a right under the existing Data Protection Act for many, many years. It's just a bit more challenging in Google's context. But I wanted, if I may, to touch on something Titi said which is that laws in general don't permit access to private sector data. That may be the case for Google and Facebook and others, but it's not the case for telcos.
Generally around the world there are laws, license conditions and other things that place obligations on mobile operators to provide assistance. In relation to transparency, to pick up a point made by Titi and Alex: in the example that I mentioned earlier, where a particular Ministry has issued a direction against mobile operators requiring them to send to a third party data such as the IMEI, addresses, dates of birth, national identity cards, et cetera, there's no overarching law governing the investigative powers of the state in relation to access to that data. So in terms of transparency reporting, that obligation should be on the states and not on anybody else, because this data is required to be sent for every customer.
>> DAVID GROSS: Titi, do you want to ‑‑
>> TITI AKINSANMI: Yes. On the right to be forgotten, I actually wanted to say that we have seen a proliferation, not just in Europe with the ruling, not just in Asia, but also in Latin America and, interestingly, in a few African countries, where laws were recently loosely defined to amount to a right to be forgotten. Now, like we said, it is about finding the balance between the freedoms and the responsibilities, and what we are saying is that we should not be put in the position of finding that balance alone. Rather it should be a collective effort. Otherwise we are going back to ‑‑ I've been in these conversations around governance for around 14 years now. We're going back to the place where intermediaries are being placed in the role of the judge, the judiciary, the prosecutor, the advocate, and that should not sit in the hands of the private sector.
>> DAVID GROSS: Any other comments before we go to the audience? Sunil?
>> SUNIL ABRAHAM: I completely agree with what she said both on free speech and on privacy. (?) should not adjudicate and make the decision. But on the other question that you asked, on whether this is a global concern: perhaps not. In a country like India, where we don't have sufficient basic privacy and data protection measures or law in place, moving to some of what we might consider rather advanced rights is perhaps premature. It might lead to too much regulation of the industry, and that might not be appropriate.
>> DAVID GROSS: Let me look to the audience for some questions. Chris, I think you looked like you had one. Let me get you a microphone here.
>> AUDIENCE: So this is sort of in combination mostly for Pat and Alexandrine, but others are welcome to jump in. I appreciate the point on the market here and, Pat, your sort of theory that law has its limits. I would say isn't there a role at minimum in policing honesty of consumer facing claims of privacy and sort of corresponding to that, a minimum of transparency in making those claims, just because both of these, if they're not strong enough, really limit the effect of market forces.
And then segueing into government issues, where market concepts really aren't salient at all because you can't realistically choose your governments: there is a long history of collaboration between telcos and governments. I can accept the premise there's value in that. That doesn't mean there don't need to be legal limits on scope and substance. Alexandrine, your point about transparency coming last I think is relevant in trying to get to substantive checks, but to me it also has to come first, because in democratic societies we all too often don't even have insight into what the relationship between the telcos and the governments is, and so we can't effectively push for change on that. So I would love thoughts on that.
>> DAVID GROSS: Who wants to begin? Looks like Pat already has his finger on the button. Go ahead, Pat.
>> PAT WALSHE: Yeah, so I should have qualified it. When I talk about law, I'm thinking in terms of consumer law and transparency law, some of the usual things we have to deal with, because some of that law is also premised on (?) outmoded concept, but in today's connected world that's increasingly irrelevant. You don't need ‑‑ to have consequences for me. In relation to government, let me make it clear this is my view and I'm not speaking on behalf of the GSMA: there is a need for transparent laws. There's a need to set the expectations of citizens as to the circumstances under which their privacy will be infringed, and there need to be very clear, justified, proportionate laws on the investigative powers of the state. There are other things being done that sit in ambiguous areas of the law. The law should be explicit about whether certain capabilities and technologies can or cannot be used.
So that's my view. Please don't quote me as the GSMA, but that's my view. I should say that view is based on ‑‑ not just because I was the chief officer with mobile operators, but I used to run teams that did government disclosure work. So I have two hats. And it is very clear to me that there is a need to address this issue.
>> DAVID GROSS: Other comments?
>> AMOS MANASSEH: I wouldn't mind making a comment here.
>> DAVID GROSS: Go ahead. Amos.
>> AMOS MANASSEH: I think ‑‑ I think that the degree to which governments and telcos collaborate (?) but it's also high because of the degree to which telcos are regulated in the first place and the products that telcos present the customers are a result of that collaboration and/or those regulations.
The second thing is that telcos traditionally maintain ridiculously large customer (?), which most of the OTTs simply don't. So customers have huge (?) and can actually change what telcos do. I would suggest it's almost impossible to do that (?).
The third thing I wanted to say is I just don't buy self‑regulation. And that's just a pragmatic statement. I just don't buy it, because at the end of the day the market ‑‑ people will do what they can get away with within the limits of the law. So I just think that at some point we're going to find ourselves in a position where there is some central regulatory entity, whatever it is, that can help solve this problem.
>> DAVID GROSS: All right. Well, Amos just went the other way, from multi‑stakeholder to government regulation. We have a comment over here. Are there any other comments? Sunil, did you have a comment? Alex, did you have a comment?
>> ALEXANDRINE PIRLOT DE CORBION: To respond to Chris, the way we presented it was to have the transparency report as sort of the last step, to show that there is a multi‑step process, and I was happy to hear that Google takes that approach as well. But no, definitely there needs to be transparency from the beginning, and the way we see the element of transparency is to have access to the laws that regulate the relationship between telcos and the states, because at the moment we're engaging in an environment where a lot of the relationships are done through secret laws and secret interpretations, and that is not law.
>> DAVID GROSS: Very good.
>> My name is Luis Bennett from the BCS. I think there's a danger that we might be confusing privacy, which is about revealing personal information only to those you choose, with data protection, which is about protecting personal data that an organisation has been entrusted with, and I don't think these are the same things at all. In the survey that you produced, you quite rightly said that most users don't do anything about their privacy settings, although they want privacy, and that is because all parts of the industry seem to have much too complex privacy policies that are unreadable to the ordinary person. If the industry self‑regulated by having privacy by default as a setting, this would go a long way to engendering trust in consumers.
I also think that some companies, and I'm afraid I include Google here, think that privacy is about sharing things between the individual and the company you first gave the information to, which it's not.
>> DAVID GROSS: Comment? Obviously we're going to go to Titi because she's calling out Google here.
>> TITI AKINSANMI: Yes, 100%. And we're okay with being called out. I think pre‑Snowden a lot of the approach, take the example of Gmail, was about giving the end user the opportunity to choose either to encrypt their mail or not encrypt it. Post the Snowden revelations we took that choice away and made sure that once you're sending an email it's encrypted. On the one hand is the fact that end users, consumers, whatever you want to call them, are not necessarily as aware as they should be about the extent to which their information can be accessed. So a lot of the work we have done, and have ramped up, is making sure that our end users are a lot more aware.
The point around making the privacy ‑‑ the terms and conditions, the terms of service a lot more clear and simple, such that a regular end user can understand them, is also something we're taking on board, and we're putting simplified conditions in place. But I wanted to track back a bit more to your point about the entire ecosystem: sometimes we forget that even if there's regulation, even if industry is doing that which is right, if the end user is not aware of the power of the technology in their hands, they will still make wrong choices.
So there's a lot more education that can be done for the end user who adopts technology. We're putting mobile phones in the hands of people who are living in remote areas, and their core concern is being able to use them for economic reasons. Top of their list is not necessarily privacy, instinctively, so how can we best put that in place? And that role cannot always be played by industry but sometimes by Civil Society. So I think we need to be able to find that balance as well to address that.
>> AMOS MANASSEH: Isn't the point not making a business out of people's wrong choices?
>> DAVID GROSS: I'm sorry, Amos, could you repeat that?
>> AMOS MANASSEH: I said, isn't the point not making a business out of people's wrong choices?
>> DAVID GROSS: All right. Right. Sunil, you wanted to comment?
>> SUNIL ABRAHAM: Just to reference some research from the Philippines, which basically tested a variety of respondents by sending the standard message, which is, can I make friendship with you? Kind of characterizing the message. And what was found is that people at the bottom of the pyramid with limited social networks would like to make more friends, would like to sacrifice their right to privacy for the right to publicity, in a sense, and as you go to the top of the pyramid and people already have extensive social networks, then they begin to value the right to privacy over the right to publicity, in a sense. So it is almost impossible for a social network to address this very diverse set of concerns.
Before I end I'd like to point out that Professor Rohan Samarajiva is in the room, along with the work from his think tank on using CDRs to do planning ‑‑ perhaps he could tell us more.
>> DAVID GROSS: Rohan, he was talking about your Colombo project. Any more comments on that? Pat?
>> PAT WALSHE: The difference between data protection and privacy is that if you do this from a compliance perspective, complying with data protection laws, you'll never address privacy. And this is as much about designing for trust as it is for anything else. And in that research I don't think any person would have thought it was about data protection. The people who were responding thought it was about their privacy. And what we find around the world is that even regulators and policy makers are basing decisions without even conducting research into how people feel these days. And as I said, on data protection you might have a piece of law that defines personal data as something from which you can identify someone, but then you have a bunch of technical data and capabilities that, in context, and when merged with this data, can reveal a lot about an individual.
For example, it could put you outside a mosque, or it could show whether you go and buy something in a halal store. You can begin to infer things about people from this data, even though that's not personal data. And one of the things we have is that while Google may have that data as well, it's not regulated as telco data is. My point is, what is it we're seeking? What are the privacy outcomes we want for individuals, and what are the risks and harms that we are trying to mitigate? Because the precautionary principle doesn't work.
For innovation, we have to find a different way, and I know Rohan is sat at the back and wants to come into this. We've been working on using mobile big data; it's unfortunate I couldn't get to Rohan's event. Going back even further, who remembers the earthquake in Haiti? Mobile data was used to understand where people migrated to, to save lives. It was used to understand and help prevent the spread of cholera, because an aid worker took it into Haiti, where cholera wasn't known before that.
So mobile data has huge social benefit. It has huge benefit to some of the most pressing public policy objectives of our time and we have to find a way to use it in wise, responsible, trusted ways, and that's partly what Rohan has been doing.
>> DAVID GROSS: Any comments or questions? Rohan is already coming to the front here. (laughter) You've been called out.
>> AUDIENCE: Okay. Thank you for this opportunity. Is this on?
>> DAVID GROSS: Try it again. Here, switch.
>> AUDIENCE: I run ‑‑ I'm the Chair of an organisation called Learn (?); it's a regional think tank working primarily on ICT policy and regulatory issues. And I happened, in my previous academic life, to have worked on privacy issues for about a decade, primarily on transaction‑generated information, as we used to call it; a guy called Thomas McManus coined the term. I think the correct term is transaction‑generated data. So what we did ‑‑ as you know, there's a lot of buzz about big data everywhere: Google data, bank information, payment information, et cetera. In the developing world, datafied data sets are very rare. So if you're looking at supermarket information or credit card information you're going to get 5% of the population, but the only comprehensive data set that covers the poor is mobile big data, and that is our focus ‑‑ my organisation's.
So on an experimental basis, for pilot investigation, we obtained historical data sets from multiple mobile operators in Sri Lanka to address public policy concerns such as those affecting transportation policy, urban planning and so on. I think I saw that he's here, but believe me, in developing Asia the traffic jams are something to behold. Entire cities are locked down by the independent actions of individual people, and nobody moves.
So I think it is very important that we develop solutions for this, not on the basis of building more roads but on the basis of using the potential of data. So we have done this research, and I think ‑‑ I don't have the slides with me because this is all about visualization. You have to show the slides. If anybody is interested it is all on our Web. We have presented it. It's publicly available on the Web. You can look at it.
The part that I think is of relevance to this discussion is that in the process we developed a draft document, draft guidelines for the responsible use of mobile big data by third parties for public purposes, and we are in the process of consulting with multiple parties, particularly those who have direct authority over this data resource, which is the mobile companies. We are involved in discussions with mobile operators from four countries at the CXO level, and I know that the CEOs are directly looking at this document. And instead of trying to put this data into the straitjacket of the inform‑and‑consent model, which I am fully informed about and which I believe was developed for credit reporting and tick boxes, and which will not apply to transaction‑generated information, we are trying to develop a set of remedies, a set of guidelines, to remove or address the harms that could actually be caused by this data.
Now, I will stop at that point, except I want to make one response to what Pat said, which is that in this kind of situation, if people think that from anonymized historical data you can identify who's standing outside the halal store, it just shows they really haven't worked with the data. We are talking about millions of records a day. We are talking about the law of large numbers. We are talking about the fact that the base stations give rather imprecise location information. I'm not talking about GPS smartphones. They give rather imprecise location information, and I can barely differentiate between a pedestrian‑congested public transportation hub and another place where the cars are stuck in traffic half a kilometer away, given the data.
So why I bring this up is that it's very, very important for anybody who's going to try to regulate or come up with rules regarding big data to actually get their hands dirty. This is not for people sitting in air‑conditioned rooms trying to spin hypothetical scenarios. There were things that I thought were problems, were feasible, were doable, that I found could not be done once we looked at the data. I am not saying that what we have seen is static, that it will remain forever, but there are certain limitations to what the data can yield, and we have to be aware of the technical capabilities before we try to legislate on this subject. Thank you.
>> DAVID GROSS: Terrific. Pat, you need to respond.
>> PAT WALSHE: I do need to respond, because context is always important here. The halal example is different from what you were talking about, where mobile call detail records, CDRs, are being used in an anonymized way. (?) did fantastic work, and they used it for seven different reasons and released it to researchers around the world under strict conditions and rules. That's different: Rohan is talking about anonymized data. In the example I gave, the data is held with a lot of other data. But when we consider that governments have access to some of this data and, you know, that it can be used to differentiate and distinguish between individuals, then these are things that we can't just ignore; they have to be addressed.
In relation to the anonymized CDRs, the UN has just established an advisory group to look at what rules should be in place. I sit on that, and a number of technically competent regulators from around the world also sit on it, and hopefully you'll see movement, and hopefully Rohan will be part of that. Because the thing I notice, and it concerns me, is that we don't have a standardized approach. When we talk about anonymized CDRs, Rohan might have a different view to the Singapore authorities.
There's a wonderful presentation on the Web site about visualizing big data, mobile data: when it rains, for example, what happens? What happens to taxi movements? But there are no standards in the anonymization being deployed, and that's one of the conditions we need to apply.
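The kind of coarse, thresholded aggregation of anonymized CDRs described above can be sketched in a few lines. The field names, zone mapping, salt and threshold below are invented for illustration; they are not any operator's actual schema or any standard the speakers endorse.

```python
import hashlib
from collections import Counter

# Hypothetical, simplified CDR records: (subscriber_id, hour, cell_id).
SALT = "rotate-this-salt-daily"  # salted hashing limits trivial re-identification

def pseudonymize(subscriber_id: str) -> str:
    """One-way hash of the subscriber ID so device counts can be
    de-duplicated without retaining the raw identifier."""
    return hashlib.sha256((SALT + subscriber_id).encode()).hexdigest()[:16]

def aggregate_by_zone(cdrs, cell_to_zone, k=5):
    """Count distinct devices per coarse zone and hour, suppressing any
    zone-hour below a minimum count (a basic k-anonymity-style threshold)."""
    seen = set()
    counts = Counter()
    for subscriber_id, hour, cell_id in cdrs:
        zone = cell_to_zone.get(cell_id, "unknown")
        key = (pseudonymize(subscriber_id), hour, zone)
        if key not in seen:  # one count per device per zone-hour
            seen.add(key)
            counts[(hour, zone)] += 1
    # Suppress small cells: releasing a zone-hour with very few devices
    # risks singling individuals out.
    return {kh: n for kh, n in counts.items() if n >= k}

# Toy data: base stations bts-1 and bts-2 both map to transport hub "hub-A".
cdrs = [("alice", 8, "bts-1"), ("bob", 8, "bts-2"), ("alice", 8, "bts-2"),
        ("carol", 8, "bts-1"), ("dan", 8, "bts-1"), ("erin", 8, "bts-1"),
        ("frank", 8, "bts-1")]
zones = {"bts-1": "hub-A", "bts-2": "hub-A"}
print(aggregate_by_zone(cdrs, zones, k=5))  # → {(8, 'hub-A'): 6}
```

The point of the sketch is Rohan's: at base-station granularity, individual records dissolve into zone-level flows, and the suppression threshold is exactly the kind of condition for which, as Pat notes, no common standard yet exists.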
>> ALEXANDRINE PIRLOT DE CORBION: To go back and comment on confusing privacy and data protection: I appreciate that there is a difference, and there's a difference in the debates. The point I wanted to make is that because of the way the data is used, and the power that data has to give an insight into people's private lives, their communications, the way they engage in society, a failure to protect that data can interfere with the right to privacy. So that's where we make the connection.
>> DAVID GROSS: Any other comments? All right.
>> AUDIENCE: Very briefly, I think that the time now is for innovation. It is not for global standardization. Somebody trying to set global one‑size‑fits‑all rules at this time is going to be deadly. We have to allow this process to go on for a little while: look at what is feasible, what is practical in particular contexts, then codify, rather than impose top‑down, globally standardized rules.
>> DAVID GROSS: I'm walking to the back to get your next question. Let me ask one as well. We've been talking about data and the challenges and the tremendous opportunities associated with it. I would note that in the United States there was a report out of the White House that John Podesta helped work on and draft, together with PCAST, and it said the consent piece should probably be done at the back end instead of the front end, because of the unknown ways in which data would be used. That's one extreme. On the other extreme, though, is individual information. I don't know if this is true; I tried to test it: that you could go on to Google and put your phone number in and actually see where you have been over some period of time. Now, unfortunately for me, I have a BlackBerry phone, which provides no information about anything to anybody, so I couldn't tell if that worked. Is that something that's actually true?
>> TITI AKINSANMI: I do know that if you are signed in and you put in your phone numbers, you can definitely at least identify where ‑‑ what information has been shared currently with us. But it's only as far as you have allowed it, as an end user.
>> DAVID GROSS: So it's not open to the public then?
>> TITI AKINSANMI: As far as I know, no, it is not. Except somebody ‑‑ and we do have people like that, or institutions like that, that have really good skills and can do that some other way. But certainly not as Google.
>> DAVID GROSS: Very good. Question in the back? Should be on.
>> AUDIENCE: Does it work? Yeah. Malte Spitz is my name. I visualized my mobile phone (?) data and put it online. I'm always asking myself, why does the CDR have to have 32 different fields of information? When we talk about strengthening privacy, I think we have to talk about the amount of information that is there at the beginning. So I think we should talk about systems, about how to lower the amount of metadata that isn't necessary, because then we also lower the amount of information that can be shared with states and used by the companies themselves. I just don't know why all these different fields have to be there. I think this is a system from the past, and I think we have to talk about a system of data minimization here.
>> DAVID GROSS: Thank you very much for that question. Pat?
>> PAT WALSHE: Okay. So yes, in a call data record there can be significant pieces of metadata. It's quite detailed, and it's quite complex, and that's borne out of the fact that, you're quite correct, networks need to know where your device is in order to ensure your device connects to the network. Some of the data in there just gives you a network error code, for example. I know because I used to have to go to court and explain these things; some of that is important. When a customer calls up and says, I can't understand why my call keeps dropping, or, I called X and it doesn't connect, this information is really important in helping you resolve customer queries, and it's not the case that mobile operators retain that data forever.
In fact, one company I worked for would keep that in live form for 90 days, after which it was anonymized. You need that. Can you imagine: a customer calls and says, why does this keep happening to me? And we say, we'd love to tell you, but we can't. Operators have obligations to provide quality of service, and customers want to know, why have I been charged this extra money for extra gigabytes? So data really is important. You're right, though: there needs to be an approach which ensures each of those pieces of metadata is tagged in a way that they begin to drop off and are erased when they're not needed for these commonly accepted purposes, or where operators aren't otherwise required to retain the data under data retention legislation, for example.
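The tag‑and‑expire approach Pat outlines, which also answers the data‑minimization point raised from the floor, could look something like this. The field names and retention periods are invented for the example and are not any operator's actual policy.

```python
from datetime import datetime, timedelta

# Illustrative per-field retention policy: each CDR field is tagged with
# how long its purpose (billing, troubleshooting, fraud) justifies keeping it.
RETENTION_DAYS = {
    "calling_number": 90,   # billing queries and disputes
    "called_number":  90,
    "duration":       90,
    "cell_id":        30,   # network troubleshooting (dropped-call queries)
    "error_code":     30,
    "imei":            7,   # fraud checks only
}

def minimize(record: dict, created: datetime, now: datetime) -> dict:
    """Drop every field whose retention window has lapsed, and any field
    with no declared purpose at all (data minimization by default)."""
    age = (now - created).days
    return {field: value
            for field, value in record.items()
            if field in RETENTION_DAYS and age <= RETENTION_DAYS[field]}

cdr = {"calling_number": "+4415550000", "cell_id": "bts-17",
       "imei": "490154203237518", "error_code": "Q850-16",
       "undeclared_field": "should never survive"}
created = datetime(2014, 9, 1)
# After 45 days, only the 90-day billing fields remain.
print(minimize(cdr, created, created + timedelta(days=45)))
```

The design point is that every field must carry a declared purpose to survive at all, so the 32‑field CDR shrinks automatically as each purpose expires, rather than relying on a single blanket retention period.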
>> DAVID GROSS: Any other ‑‑ Alex?
>> ALEXANDRINE PIRLOT DE CORBION: I need to comment on that as well. Like Pat was saying, it's important that every piece of information is regulated ‑‑ sorry, that the way each piece of information is used is regulated and known, known by the user as well: why certain pieces of information are requested. The danger I was mentioning earlier is that with this mass of metadata, can you really control how each piece of information is used if it's correlated with others? So that's one point.
I think the other point I wanted to make was that Pat was mentioning improving services when you use your device, but the point that's important to make as well is that all the time, whether you're using your data or not, you're connecting to the wireless system. That means that even when you're not using your mobile phone, you could be traced, and that's something to think about as well.
>> DAVID GROSS: Any other comments or questions from the floor? Let me ask sort of a basic question. We have operated on the assumption that privacy is extraordinarily important to consumers, that trust is the central focus here. We have survey data from the GSMA that would support that, and yet, as a couple of comments from the floor and elsewhere have indicated, if you look at the way in which consumers and customers actually operate, it may not support that. People seem to click immediately on through the endlessly long series of waivers and informed consent. Even if it were shorter, it's not clear, at least to me, that people would not click through. They usually want the service, and if you don't click you're not going to get that service that is of value. What makes us think that there's a real issue here? Do we see lack of adoption? Do we see competition where companies are saying, come to me, and are being much more successful because of their privacy attributes? Or do we have something other than the fact that people say the probably obvious: that yes, privacy is important, but not necessarily when it comes at a cost? Alex?
>> ALEXANDRINE PIRLOT DE CORBION: We would never argue that this is the fault of the user. And I think consumers and users in general are becoming more aware of the privacy implications. I think there's still a lack of knowledge in terms of the power of data and what can come out of the use of their data, and that's something that we're working on across the world, not just in developing countries. There's an assumption that in developing countries people have less knowledge of these things than in the West, so yeah, that's something that's really important.
And in terms of the privacy policies, that's something as well, the message we're trying to put across to industry, and it's the same with the laws. These need to be accessible to people and it's not just about clicking and knowing what it means, but that's something that falls on industry to take into account.
>> DAVID GROSS: Any other comments? Sunil?
>> SUNIL ABRAHAM: I have often heard the argument that the sharing economy tells us that users don't value privacy: look at the extent to which they share information, and look at the extent to which they allow these platforms to harvest personal information. But if this were completely true, then in the sharing economy users would be constantly posting their passwords, saying, here's the password to my Facebook and Twitter account. They obviously don't. There's a line somewhere. That line varies based on a variety of circumstances, but that line does exist.
And to perhaps take it to another domain, such as the regulation of food, consumers love fast food. They're perhaps aware of the implications it has for their health, just as many consumers love tobacco, and there are a variety of things that consumers do. But when the provision of fast food or cigarettes happens through large corporations and the potential for harm is large, then even if consumers don't understand all the regulatory complexity, it is the business of the state to uphold public interest and to mitigate harm to human rights by ensuring that it is appropriately regulated. Thank you.
>> DAVID GROSS: Pat, you look like you want to ‑‑
>> PAT WALSHE: You could ask, because if you want your device to set the date and time automatically based on the network you connect to, that's great, but there are other things in there to support it: vehicle navigation systems and advertising systems. So one must ask them what ‑‑ and there are 18 million of these devices, so that's a lot of big data. But when you take people through this process, they're like, I didn't realize. And they switch things off, or say, that should have been my choice. It does matter, and it's a barrier to engagement.
>> DAVID GROSS: Sunil raised this a little bit earlier in his example about friendship. Are these issues truly universal, or do we see some significant differences regionally and culturally? One difference that's sort of famous and has been with us in this discussion for a long time is age. The young are very quick to give up their privacy. The older you are, perhaps the more experienced you are, the less likely it seems you are to give up your privacy in some of those interests. How universal is this, and is it one size fits all? And if not, how tailored should this be?
>> TITI AKINSANMI: I'm just going to comment really quickly with a bit of a practical example, and I'm not wearing my Google hat. I'm a Nigerian by birth, and I did (?) recently on the Boko Haram issue, and I asked a wide range of end users: if you had a choice to give up some of your privacy, some of your data, for the terrorists to be located by government, what would your choice be? And I was very impressed ‑‑ I don't know whether positively or negatively ‑‑ to find that even among those most highly educated and most aware of the implications of giving up your data and of allowing government to surveil you, to follow you, every single person said yes, take away some of my freedoms. Just get rid of this.
And sometimes I think as industry, as government, we're not aware that people are willing to give up a bit more to be able to feel safer. So it's finding that balance again. It also goes back to that. Finding that balance between freedom and responsibility at any point in time.
Putting the Google hat back on, really quickly: inasmuch as there are a lot of choices and complexities around these mobile phones, the ones who are usually able to afford the really complex devices, with the very long terms of service, et cetera, are the ones who usually should be a bit more aware of the rights they give up. And I'm going to end with saying this again: without (?) the end users ‑‑ at the foot of the end user, we as industry certainly need to do more to make them aware of what they have consented to, 100%. But at the end of the day responsibility for such freedom will lie with (?).
>> DAVID GROSS: Very good. Sunil?
>> SUNIL ABRAHAM: So you talked about age, and I had earlier referenced class. It is as if users are deploying personal information as currency to climb the attention economy. So that is the objective. The moment they are where they want to be in the attention economy, their view changes dramatically. Then they're no longer willing to pay in personal information for additional attention.
So it is very hard for us either to formulate policy or to configure technology that allows for this movement, for this sudden transition from somebody who wants to divulge personal information to somebody who wants to conserve personal information. But this argument of cultural relativism unfortunately is used by those who don't want to be regulated, to delay regulation, and what we have done through historical research is identify privacy norms in (?) law, Islamic law, precolonial law in India. So it isn't as if there is so much diversity that we can't somehow implement universal principles like necessary and proportionate. There are some things that seem to transcend all contexts, but there is also huge difference, even within a single country, within religious groups, et cetera.
So the configuration is a multi‑layered regulatory system: there is self‑regulation, there is co‑regulation, and even when it comes to regulation by the state, I believe the law should be high‑level, mostly focusing on principles, but then you have the office of the regulator that hopefully can examine each case and give guidance on an ongoing basis. So what we don't want is overly engineered privacy law, which prevents us from addressing the relativism question.
>> DAVID GROSS: Let me ask sort of a follow‑up, if I could, from what Sunil was just talking about. How do you all view the role of government and Intergovernmental Organizations in this space? Pat, you referenced early on in our discussion the fact that there's no single place for all the relevant governmental interests ‑‑ telecoms regulators, those who are involved in privacy‑related issues within governments, those who do national security within governments ‑‑ to come together globally to have these discussions about the trade‑offs, the opportunities and the potential harms. Is this a space, do you all believe, that each country needs to regulate? Does each country need to call for self‑regulation? Or is this an area for which we need global standards? Pat?
>> PAT WALSHE: Well, to take Rohan's point, I'm not calling for global standards. We can have ‑‑ no, I didn't. We can also ‑‑ we can have interoperability, and ‑‑ as Amos said ‑‑ when you're an Asian operator it's important that you have some consistency to drive efficiency, and some of that comes down to approaches to how you anonymize the data. And after all, the data belongs to their customers, whose trust they need to keep.
I think, David, yes, there is a need to bring those together. I'm not sure which is the right forum, though, but I will give it some more thought.
>> AMOS MANASSEH: I think there's another sort of quite interesting angle to this, because I think the concept of engagement with, you know, the legal environment that we live in (?) country is relative to age. So the attention economy is a very nice way of putting it. The young ones are trying to get on, and they tend to give things away (?); as you settle down and get a bit older, you tend to protect the environment you're in and start making decisions about your privacy.
I think, from the point of view of regulation and government organisations getting together, potentially globally or regionally, to make certain decisions, a very strong role could be defining different data sets, because there's a certain data set that exists from the day you're born, and there's other data which you add to on an ongoing basis as you move on. Perhaps there's a differentiation between those data sets ‑‑ health records, government identity number, those sorts of things. They're consistent, pretty much, and perhaps those are the ones that would be a good baseline, and, you know, the free‑for‑all happens (?).
Where the problem is ‑‑ I mean, I have never given my passport number away as freely as in Asia. It's quite extraordinary how often you need to give it away. I'd never given my passport out before. Over here you need to give your passport number for almost any transaction, in which case that number is held by all sorts of different organisations that shouldn't necessarily have it, but perhaps that's where the differentiation (?).
>> DAVID GROSS: Very good. We've come to the end of our session. Let me ask if there are any other comments from our panelists on this or any other subject before we close. Alex?
>> ALEXANDRINE PIRLOT DE CORBION: To respond to a comment that was made about security, sacrificing some of our freedoms and rights to get security, the problem at the moment in the environment that we have is that now we're giving up all of our freedoms and all of our rights for a sense of feeling secure that's not even guaranteed. And that's definitely a problem that we need to start addressing, because people have ‑‑ are living under the assumption that if they give up their rights, then they'll get security, but that's not the case at the moment.
>> DAVID GROSS: Any other comments?
>> TITI AKINSANMI: I just wanted to end by clearly stating the fact that freedom comes with responsibility, and finding that balance can only be achieved if every person ‑‑ and I'm doing my very best not to mention that word ‑‑ if every player ‑‑ so Sunil is the only guilty one in this space (laughter) ‑‑ if every player from industry, from government, the end user, Civil Society, the technical community ‑‑ is able to better understand the driving force of the other side and why they take the positions they take.
I think on our part, as Google, we have taken it on ourselves to improve the technical capability, to find that balance and also ‑‑ I want to use the word "aggressively" ‑‑ to ensure that our end users are very much knowledgeable and aware of the technologies available to them.
>> DAVID GROSS: Very good. Let me sum up, I think, very briefly by saying that once again we've established that this is an extraordinarily important subject. It's very rich. It's changing rapidly. The big data discussion is fundamentally different and new as compared to when we started this discussion. The mobility aspects of it make everything that much more complicated and that much more personal for each and every one of us. I think we've accomplished one of the great goals of the IGF in terms of process. We've been able to have the remote access work properly. That's an accomplishment. I want to thank the GSMA for putting this panel together. Pat, thank you, and Yiannis, thank you for all your efforts in doing that, and I think we should give a round of applause to our terrific panel.