Check out the third episode of our podcast series, TwentyTwenty; Your Podcast for (Un)Precedented Time, here! Find us also on Spotify, Apple Podcasts and Google Podcasts. For those who wish to read along, a transcript of the episode can be found below.
Elizabeth Dykstra-McCarthy: Unprecedented times. Such has been the clarion call of this year. But just how unprecedented are these times? Whilst a global pandemic might not be an annual issue, many of the crises we are facing, from the erosion of civil liberties to debt crises, or from fears around technology to the shifts in geopolitics as major powers flex their muscles, are the products of trends and concerns we know only too well, which a state of global emergency has amplified. Even the pandemic itself was no surprise to those in the know. So what are these trends? Where have they come from? And what kind of path can we expect them to put us on?
This is TwentyTwenty Vision, and I'm Elizabeth Dykstra-McCarthy. This podcast, brought to you by Foreign Brief in partnership with the Fletcher School, is about how this year has accelerated global trends, and how a state of global crisis has made them that much more visible.
EDM: We take for granted our tech-powered world. The degree to which our lives operate in the virtual sphere would have been unconscionable, borderline dystopian, at the turn of the millennium. For better or for worse, the COVID-19 pandemic has catalyzed a shift to ubiquitous digitality. Tech titans, Amazon, Netflix, Google and the like, have arguably been the biggest winners of the last six months. Amazon stock has appreciated 60% since March. Netflix added 10 million new subscribers in the second quarter of this year. Zoom, the poster child of pandemic success, predicts earnings upwards of 1.8 billion US dollars for the year, double what it predicted in March. With our foot on the gas of virtualization, we would do well to ponder the consequences of the heightened role of the internet in our everyday lives. We have less control over our personal data today than ever before. With information about your shopping habits, website visits and geolocation, private actors are able to paint robust pictures of our lives, interests and personalities. And as the age-old aphorism goes, "With knowledge comes power." So, just how vulnerable are we? Countries around the world are grappling with unprecedented questions of security, and are trying to establish legal regimes to ensure any semblance of personal privacy before control over data slips from their fingers. In Europe, the General Data Protection Regulation, or GDPR, protects individuals' right to access, delete or correct data, as well as to opt out of processing at any time. These laws only scratch the surface of data privacy issues. But as big tech grows bigger, we are creating a new sort of privilege: the privilege of privacy.
Josephine Wolff: It’s always been true to some extent that money can buy certain types of privacy. But I think with digital privacy, you again want to be really careful about setting up a system where people who have fewer resources, by default, are losing all of that information about themselves or losing that privacy much faster.
EDM: Dr. Josephine Wolff is an assistant professor of cybersecurity at the Fletcher School of Law and Diplomacy. An expert in international Internet governance and cybersecurity policy, there’s no one better to help us make sense of the unsettling reality of data ownership that lies ahead.
JW: So I think the question of what it would mean to own your data is, essentially, that data belongs to you, nobody can use it, nobody can collect it without your permission, and you can take that permission back at any time you want. And that’s very much not the world we live in, there are some people who think that would be a much better world. I think, ideally, what you would have ownership over, what you would have autonomy over, is not all of your data. Because there are actually a lot of companies you probably want to have your data so they can provide you with useful services that you like using. But instead a sense that you have some understanding of what data is being collected about you and you have some ability to revoke that collection.
EDM: But why should we care about our data in the first place? If it doesn’t have any value to me, why should it have value to others? And what should it matter to me if others want to buy it?
JW: First of all, you have to start out with the question of whether it has any value to you. Which is a complicated one, because I think in many cases, our data does have value to us. It just depends on how it's being used, and by whom, to provide that value. So the reasons that you care about data privacy are not necessarily that you think nobody should ever have access to any of your data under any circumstances, but more that you sometimes feel you want some control over who might have access to certain kinds of information about you.
EDM: It’s obvious why we’d want to protect some data such as that pertaining to children or personal health, but much of the rest seems mundane, such as where I go or what I buy, read or watch.
JW: And that’s the kind of data where each individual piece feels a little bit irrelevant, right? You don’t care so much if somebody knows you are on Amazon, looking at ice cube trays. But putting all of that together with where you live and where you go, and where you work and who you see and who you’re friends with, that can become a really rich picture. And there are a lot of different ways that picture can be used.
EDM: Many see this virtual picture as creepy, even dystopian. You think about a trip to Bali, and you see adverts for flights to Bali. You get engaged, then mysteriously begin seeing adverts for the wedding venue.
JW: It can really freak people out, the sense of sort of like, was my computer listening to me? Part of what’s so disconcerting about it is the sense that you have no idea where the information to show you that ad came from. And I think first of all, that lack of transparency is very disconcerting to a lot of people. The sense of like, what pieces of information about me have been put together to craft this profile? And I don’t think that’s paranoia. I think that’s a very valid feeling of, I want to have some control over what the world knows about me and what public profile is being presented about me. And I think a lot of sort of the efforts we’ve seen towards data privacy regulation have been really focused on trying to provide users with some stronger sense of autonomy and control. Look, you’re gonna have some ability to say I don’t want this data to be shared, or I want this data to be deleted, or at the very least, I want to see what data this company has collected about me. And a lot of that I think, is intended to sort of help people craft their own privacy rules, and not try to make a one size fits all privacy regime.
EDM: It might not be the website itself that’s using this information. The website is likely selling that information to third parties. These actors can use this aggregated data, a virtual portrait, to target you with ads. That’s not such an issue for the ice cube trays, but it becomes concerning when those ads are political and tailor-made to connect with you and influence your beliefs. Conversely, such robust data biographics can be used to filter who can see what. When applied to job advertisements, property listings, or even dating sites, this information can be, and has been, used to discriminate.
JW: To decide should you be approved for a credit card, should you be approved for a loan. So there are a lot of sort of very real world applications that some of this data can have on decisions that are being made for us or about us.
EDM: In effect, every piece of collected data is integrated into models to create projections about your personality. Are you trustworthy? Do you splurge unnecessarily? If so, on what? And all of this data can be used by companies to make decisions about engaging with you as a consumer. But this isn’t just targeted advertising for ice cube trays, we also have to consider government and other private uses of data.
JW: So for instance, we’ve seen software being used to determine whether people get bails or how high their bails are set in the criminal justice system…Cases in China where there are ethnic minorities whose data and genetic information is being used to profile them and restrict their rights in various ways.
EDM: Reports of governments, such as those in China and Belarus, undertaking surveillance against their own citizens have flooded news streams.
JW: Often the US side feels like there are many more restrictions on government surveillance and many fewer on corporate surveillance. That doesn’t mean there aren’t very valid concerns about government surveillance. And you saw a lot of this after the Snowden leaks, a lot of worries about the ways in which the United States government in particular was collecting data at very large scales, from telephone companies, from overseas networks, and being able to sort of store these massive quantities of data and analyze them in various ways to collect intelligence.
EDM: And this doesn’t stop at Snowden. Concerns surrounding government data surveillance have been rife in the Black Lives Matter protest movement. On August 7th, BLM activist and organizer Derek Ingram found himself besieged in his apartment by dozens of officers, armed with tactical gear, shields, and dogs. Ingram has been charged with second degree assault. The NYPD admitted to using facial recognition software, which compares still pictures with surveillance videos to identify Ingram and track him down months later. The still pictures used by the NYPD are suspected to have come from Ingram’s Instagram account. Politics aside, this case should elicit neo-Orwellian fear: the availability of our personal data, much of which, like pictures, we voluntarily upload, makes surveillance far easier for Big Brother than even Orwell could have imagined.
JW: I think there are other parts of the world, when we look at, say, the European Union, we see something pretty different, where there’s a much more aggressive regulatory regime on corporate collection of data under the GDPR and not necessarily as clear-cut or restrictive a set of rules on how governments can use this data.
EDM: It’s no surprise that fears surrounding privacy are growing. According to a Pew Research poll conducted in November 2019, 64% of Americans are concerned about the ways in which the government is using their data. That figure stands at 79% when the question is applied to companies. For their part, more than six in 10 EU respondents are concerned about public authorities storing their data. Mounting anxieties have not yielded productive, sustainable solutions. This largely stems from the policy quagmire in which lawmakers find themselves. Not only does every country seek varying degrees of privacy, convenience and trust, so does every citizen. Whether international, domestic, public or private, there is no one-size-fits-all solution.
JW: Given the way that a lot of data breach and data security regulation has gone in the United States, which is very much state by state, I think that you’re going to see in the US a lot of fragmentation within the country in terms of regulatory regimes.
EDM: There’s also a danger of data privacy issues exacerbating socio-economic inequalities. People who rely on cheaper or free online services are more likely to have their data collected and distributed. The US Senate Commerce Committee found that companies collect data about low-income individuals to better target them for payday loans, high-interest mortgages, and for-profit education opportunities. So what policies should we implement to combat this trend?
JW: What are those baseline privacy protections that we think everybody deserves? And how do we codify them in regulation? Where you say, look, even if you can’t afford to pay for these additional protections, there are some protections written into the law that are there for you. And I think some of the ones that that make the most sense to me have to do with transparency about when your data is being collected and shared with third parties, some degree of control over certain types of data like location data about when apps can collect it. And I think that kind of decision making control is a big part of what I would aspire to. Which is not owning my own data, but having transparency and insight into what data about me has been collected, who it’s been shared with, and the ability to make some decisions about that.
EDM: In a way, we do have some modest protections in our data sharing. And when I say that, I’m specifically thinking of instances where I’ve consented to share my location with Google, or agreed to terms and conditions. This has led to granular controls on browsers that allow us to choose what data a website can collect about us, in what may look like 1,000 checkboxes a day to agree to basic functionality.
JW: How do you provide that information to users in a way that they, A: understand it without having to read a lot of really dense text. And B: actually feel like they have some options, right, that they have some more control than just like yes or no. Like, you can collect all this data, or no, I won’t visit your website. And I think that’s a space where there’s actually a lot more room for thinking about policy alternatives. The question, and I don’t have a great answer to it, is what do you do sort of as it gets more complicated, when it’s not just one kind of clear cut data, but actually, this website is tracking everywhere that you move your maps and everywhere that you sort of linger for a moment and how long you spend reading each line of text and your eye movements as you read it. All of those then become a longer and longer list that, as you say, I don’t want to sit there and say like, yes, it can do this, no, it can’t do that, yes, it can do this, if there are 100 things on that list. And so I think we need to get a little bit better at sort of designing the categories that all of these types of data fall into, and then educating people about what those categories mean, so that they feel really empowered to make those decisions for themselves.
EDM: We can make legislation around what we can or can’t consent to. However, there’s an added layer of where that data goes, who uses it, and for what purpose. Each additional question requires another layer of consent, another set of possibilities to answer. In 2014, there was a high-profile incident in which Uber employees used God View, software that displays all the Uber users in a city, to track passengers, turning it on celebrities, journalists, or even their exes. In this case, permission was given to the platform to access location data, but God View certainly fell outside the reasonable bounds of users’ authorization. Even if we have options to opt in or opt out of sharing specific pieces of data, that doesn’t mean we can give informed consent around how, or if, the data will be misused or accessed.
JW: That’s a really important question, is sort of how much information you actually get about how these companies are using your data. And if you look at, say, the current privacy policies, a lot of them, many of which were rewritten to be GDPR compliant specifically to tell people like, What purpose am I collecting your data for? But a lot of them will say things like, we’re collecting your data to improve our services and sell ads. That could be an advertisement for something totally innocuous, or that could be an advertisement being designed by Cambridge Analytica in the lead up to the 2016 election. And you might do those two things very, very differently. When we say people deserve to know how their data is being used and who it’s being used by, how specific is that? Right? Is that just like, we’re using this for, you know, marketing and advertising purposes? Or do you actually get to see who it’s going to, and if you get to see who it’s going to do even really want to take the time to read down the whole list and figure out who everybody on it is?
EDM: And aware isn’t always informed. Merely possessing the facts on how data could be used isn’t enough. We need to understand it.
JW: Right, if you had access to that list in 2015 and 2016, and Cambridge Analytica had been on there, would you even have known enough to care?
EDM: And this spins right back to the earlier problem: the information about who has your data and why must be comprehensible to the layman, who doesn’t have an advanced degree in data privacy and isn’t up to date on the ins and outs of data sharing.
JW: And I don’t think we’ve gotten that balance right yet. I don’t think we know how to provide people with autonomy and control in this space without subjecting them to a lot of little windows that they find incredibly irritating and not at all beneficial.
EDM: We’re living in the digital age. It’s not enough, either, to simply have national policy and draw the line at that. Unable as we are to agree or legislate on national rules, these questions have seismic geopolitical implications. Sometimes it feels as if every data skirmish in 2020 has a wary world leader on one side of the battlefield and China on the other.
EDM: Right now, we’re in the throes of the TikTok showdown. First the Indian, and now the US, government is afraid that data collected by TikTok could quite easily be compromised, held captive to the Chinese Communist Party, potentially laying the roots for disinformation campaigns that serve Beijing’s interests. Of course, this isn’t the first time that disinformation campaigns have wreaked havoc in the US, nor are Chinese companies the only ones harvesting this data. There is, however, a salient difference between Facebook collecting and using your data versus TikTok collecting and potentially sharing your data with the Chinese government.
JW: How much do you trust Facebook and Google? Do you trust them more or less than the Chinese government? And I think the short answer is that they have very different agendas. Right? So the things you might be mistrustful about with Facebook or Google collecting your data are likely to be very different things from the things that you would worry about if the Chinese government was collecting your data. And so I think it really depends on sort of who you are, and what risks you’re worried about.
EDM: If data privacy and protection is going to be one of the next great human rights struggles, as Amnesty International predicts it will be, then we may yet see these fractures in digital rights across the world. Not just in the importance of access to digital infrastructure, but in what happens to us online, who is vulnerable to digital attacks, how society is vulnerable to mass data harvesting, and how we can respond to the panoply of threats, both public and private, national and international. Somehow, our roster of policies needs to protect each individual with a way to determine what or who is a threat to them, be it a company, their own government or a foreign government. People should be able to know where their data is going and have some control over what is being shared.
EDM: The European Union has been grappling with these questions about data protection and data security since 2013, when Edward Snowden leaked highly classified information about the scope of the US National Security Agency’s surveillance of foreign nationals and US citizens. Prior to the leak, Brussels endorsed the Safe Harbor decision, which allowed companies to transfer data from the EU to the US without prosecution. The Snowden leaks disproved a key assumption of the Safe Harbor decision: that the US provided safeguards for personal information. Snowden ignited European fears about data protection but, more importantly, provoked louder calls for data sovereignty and European control. With the adoption of the GDPR in 2016, individuals gained control over their data through the regulation of transfers of personal data outside the EU. More recently, Brussels has taken even greater steps to address data sharing and data privacy issues, with new regulations and policies related to Facebook and AI.
Peter Chase: Now, everything is digital. The economy is digital. And it’s not just the economy. It’s how we do our politics. It’s how we do our media. It’s how we do our culture.
EDM: Here to talk to us more about the EU’s views on data security is Peter Chase, a senior fellow at the German Marshall Fund, where he focuses on the transatlantic economy with particular attention to trade, investment, digital, and energy policies.
PC: The Europeans have focused on digitalising their economy, the digital services initiative, the digital single market, because they’ve seen bringing digital technologies into their economy as a major way of promoting their competitiveness, promoting productivity, promoting government services, health care, energy, all of these things that can be improved with the use of digital technologies.
EDM: But, digitising the economy also presents problems vis-a-vis data protection, something with which Europe is trying to grapple.
PC: Europeans wanted strong data protection laws within Europe. In part because of the history of some people, of many countries here with authoritarian governments and governments intruding too much. And part of the concern about advertising and the way companies have used data.
EDM: And it was Snowden’s revelation, specifically that US data was not as well protected as previously believed, that really spurred the data security debate.
PC: That revelation was something that went primarily to the issue of data protection and privacy. It wasn’t as much security, it was more, what is the government doing? Is government intruding into our private space in a way that it shouldn’t be able to?
EDM: In July, the European Court of Justice ruled to invalidate the EU-US Privacy Shield, in many ways a response to fears that foreign intelligence could track the data of European citizens with impunity. Snowden gave evidence that US companies, Apple, Microsoft, Facebook, Google and Yahoo, were disclosing information to the US government for surveillance.
PC: Everyone understands that other countries, like China and Russia, have even fewer protections for data.
EDM: So it seems to me that the EU has three major fears: cybersecurity, data protection, and information control.
PC: The first, most serious level, is cybersecurity attacks. Very kinetic attacks against the infrastructure of a government, and Europeans have direct experience with that, with attacks from Russia in Estonia and other places. The Europeans have been doing quite a lot to try to improve that, including building up their own cybersecurity agency, where all of the member states share their ability to address kinetic attacks.
EDM: “Cyberspace has been a means to terrorist groups to push their propaganda for Russians to interfere in our democratic elections, for Chinese misinformation and IP theft, for Iranian supported terrorist groups, and aggressive use of the COVID-19 pandemic as a cover for exploitation operations.” That’s a quote from the head of Strategic Command the British military, General Sir Patrick. “If this was an airwar,” he said, “it would be the Blitz.” In November they will likely announce a new cyber force to strengthen British cyber campaigns, such as those deployed against ISIS. Cyber attacks are commonly viewed as the most threatening of data security concerns. However, not all issues of cyber security are militarized.
PC: Right now people are a little bit more concerned about the attacks that are more insidious. Whether it’s illegal speech, hate speech, political speech is illegal in some European countries, or if it’s simply disinformation, a campaign driven by foreign or domestic actors to make people doubt the institution.
EDM: And those high profile examples of its use include the Russia misinformation campaign used to meddle in the 2016 US presidential election and the Brexit vote.
Amid public pressure to crack down on the spread of misinformation to sway November’s presidential election, Facebook has agreed to limit political advertising.
Recently, the concerns regarding China’s data usage have come to the forefront with the emergence of Huawei, the world’s largest seller of smartphones and 5G technology. The US and several other countries have accused the company of violating international sanctions and stealing intellectual property. Washington is also concerned that Huawei will be forced to hand over data to the Chinese government or that the company’s 5G infrastructure has backdoors that would allow Beijing to launch a cyber attack or commit cyber espionage. While the US has pushed for allies to exclude the Chinese company from their 5G rollouts, EU countries, save the UK, have yet to follow suit.
PC: Europeans have not been as much concerned about the Chinese gathering personal data of Europeans, because I don’t think they feel that they’ve been a target. But they have been concerned about the way the Chinese, particularly in the last six months, have been aggressively engaging in the information space in Europe.
EDM: Could we chalk up these concerns to some sort of a phobia against Chinese technology? Great technological superpowers appear to be on the outs, while Chinese technology, which is more affordable, has been proliferating across Africa, Asia, and more and more in Europe. Do these concerns surrounding China and its tech stem from unease in the introduction of a new major player, and with it a shifting balance of power, rather than any real threat?
PC: The Europeans were sensitized. First by the things that the Americans did, second by the things that the Russians did, so the Chinese are the new kids on the block, which doesn’t mean that they’ve forgotten about the others.
EDM: So European states don’t see Chinese technology as the only threat. Tech concerns surrounding the US and Russia still hold. But as the adage goes, money talks, and certainly Chinese services, such as Huawei’s 5G tech, are cheaper than those of competitors by an estimated 20 to 30%.
PC: Commercially, European companies that are involved in 5G, their major competitors are in fact Huawei, which is a Chinese company. And so there’s also an economic dimension about this sense of competition and sense of threat.
EDM: The new European Commission has emphasized technological sovereignty as a target for 2050. But what does that mean? How do Europeans define technological sovereignty? And why do they believe they need it?
PC: I wish I knew what it meant. And I think that most of the European officials I talked to wish that they really knew what it meant. There’s a feeling among a lot of European politicians and policymakers, as well as a lot of European businesses, that the lead on the digitalization and the digital revolution has not really been with Europe. The perception that the big guys are outside, either American or Chinese in this case, not so much the Russians, is something that makes people in Europe uncomfortable, that they are either dependent on someone who’s not, “European,” or that they’re just losing their competitiveness. For most Europeans, the idea is, we want more European capability in this space, we want more European recognition, we want more European giants. They are not saying that they want to have their own Facebook or their own Google. In fact, what the Europeans have been saying much more is that Europe is a leader in industrial machinery, in services. There’s a lot of digital technology associated with that. What Europe should do is build the infrastructure and the capability to take advantage of the application of the technology.
EDM: This process of ensuring the data is as protected in the US, or wherever, as it is in Europe, has been complicated by new laws.
PC: European Court of Justice has now made it much more difficult to say, “You have to be as democratic as we are, you have to have a judiciary that ensures that there is no unwarranted government intrusion into the use of the data in order for us to transfer the data there.”
EDM: Even in the US, which is fairly transparent and democratic, this is difficult because of constitutional blockages that require you to get warrants or subpoenas. In countries like China or Russia, where these processes are less transparent, this becomes even more difficult.
PC: So what does the EU do? Do they actually prohibit the transfer of data to those countries? That would be extremely difficult, because it would mean that a lot of economic commercial relations could fall apart. So they wanted to increase the protection of data overseas, whether or not they’ve overstepped their capability to do that is, I think, an open question. My own gut instinct is they have, which will mean that they’ll have to, in reality, not implement their own law as strictly as they interpreted.
EDM: To tackle these concerns around data transfers, the EU has recently proposed a new initiative: GAIA-X. The project, set up as a Belgium-based nonprofit, seeks to encourage EU firms to store their data with European alternatives to American and Chinese data storage giants, known as hyperscalers, such as Amazon Web Services and Alibaba. Though not a cloud service in and of itself, GAIA-X would link the cloud hosting services of various companies, enabling EU businesses to move their information freely, while ensuring its protection under European data processing laws. This model would allow for smaller cloud platforms to be integrated and compete in the market. Companies must apply to join GAIA-X and commit to key principles, namely, protecting and following European data sharing provisions. Companies participating in GAIA-X are able to maintain ownership of their data, as they can choose what they wish to share and dictate for what purposes said data can be used. In essence, the project’s raison d’être is to ensure the protection of European information and foster stronger sovereignty. So first, let me ask, why do you think the EU is pushing to regulate access to data, rather than the quantity of data that is collected?
PC: There’s already a lot of data about each of us as individuals and what we do. And I think we’ve reached a point where we as citizens need to be much, much, much more aware of what our governments seek to do with the information.
EDM: Here an interesting conflict arises. GAIA-X is, in effect, asking the Chinese, Americans, and more to share their data with the EU, but the EU is simultaneously saying it doesn’t want its data to be stored overseas. How can you explain this?
PC: So they’ve got this technological sovereignty thing, but I think what they’re just really trying to do is to build a truly European capability in some of these areas.
EDM: So whilst it seems natural to want to grow an industry, this does smack a little of protectionism.
PC: If I want to operate in Europe, I have to obey European laws. If I want to sell a product in Europe, it has to meet European standards. If I want to sell a service in Europe, whether it’s banking or cloud computing, it has to meet European standards. They’re trying to insist that a provider of artificial intelligence services based in Israel, facial recognition, whatever the case may be, that if it’s providing those services in Europe, it has to be certified as meeting European rules. I get that, but it’s going to be very difficult for them to police.
EDM: What are some practical manifestations of the latter issue?
PC: I want to watch a sports game. And I’m living in a country where only one company is allowed to provide that sports game. They have territorial rights. So if you’re in the territory, theoretically, you shouldn’t be able to get to it. But if you take a VPN and you go to a Russian site, you can download it from there. Okay, that’s something that most people understand because most people understand the use of VPNs to get around these things that try to enforce copyright. But you can take that idea and use it for anything, including things like facial recognition. It would be illegal to take a lot of physical biometric data out of Europe and send it to a third country to process. It would be pretty difficult to police.
EDM: If there might be issues with policing companies providing services outside of the cloud, I can imagine there are going to be issues with convincing companies to freely share data within the cloud.
PC: How do you regulate? Do you regulate access to data? We are moving to a world that’s going to be of connected cars. If they’re playing the radio, the company that’s provided you that car can probably tell that they’re playing the radio and what stations you’re listening to. They can probably tell where the car is. There are lots of sensors, and that’s going to get much more intrusive, because companies are going to say, “I’ll be able to fix your car while you’re driving.” Data associated with something like a car, that’s really big. If a car goes over a pothole, it can tell that there’s a pothole. The question is, should that data be shared with the municipal government, which can then pinpoint a pothole that needs to be repaired? So the Europeans know that the digital economy is based upon data. So I want as much of that data that BMW collects, I want as much of it as possible to be freely accessible by anyone who can think of how to use the data in a way that adds value. Right now, it’s all going into BMW. BMW’s saying, “Well, wait a minute, I can think of what to do with that. And I can think of how to monetize using all that data. Why should I share it with someone else?”
EDM: On the horizon looms a potential clash. Corporations want to monetize data and sell it to whomever is willing to pay. The EU, through GAIA-X, wants certain data to be accessible and free for all to use. This is going to complicate the regulation process and might well affect the feasibility of GAIA-X in the long term. In an increasingly online-centered world, our data represents us: our likes, our beliefs, and even our trustworthiness. Companies and governments have been able to use data unrestrained thus far. If we want to create a more just, equal, and secure society, this needs to change. Regulating data use is a challenging prospect. It’s hard to regulate across borders when the internet itself is borderless, but it’s a challenge governments are determined to take on. Although how well they’ll fare is yet to be seen.
PC: It’s going to get worse with the next generation and the next generation, who extend the capabilities in this sort of architecture of repression. You realize that you might be willing to accept any risk.
EDM: This episode of TwentyTwenty was researched and written by Manisha Vepa and Sulagna Basu. Max Klaver was the associate producer, and the executive producer and presenter was me, Elizabeth Dykstra-McCarthy. Thanks for listening and, until next week, goodbye.