The breathless pace of development means data protection regulators must be prepared for another scandal like Cambridge Analytica, says Wojciech Wiewiórowski, the EU’s data watchdog.
Wiewiórowski is the European data protection supervisor, and he’s a powerful figure. His role is to hold the EU accountable for its own data protection practices, monitor the cutting edge of technology, and help coordinate enforcement across the union. I spoke with him about the lessons we should learn from the past decade in tech, and what Americans need to understand about the EU’s data protection philosophy. Here’s what he had to say.
What tech companies should learn: That products must have privacy features designed into them from the start. However, “it’s difficult to convince the companies that they should take on privacy-by-design models when they have to deliver very fast,” he says. Cambridge Analytica remains the best lesson in what can happen if companies cut corners when it comes to data protection, says Wiewiórowski. The company, which became one of Facebook’s biggest publicity scandals, had scraped the personal data of tens of millions of Americans from their Facebook accounts in an attempt to influence how they voted. It’s only a matter of time until we see another scandal, he adds.
What Americans need to understand about the EU’s data protection philosophy: “The European approach is connected with the purpose for which you use the data. So when you change the purpose for which the data is used, and especially if you do it against the information that you provide people with, you are in breach of law,” he says. Take Cambridge Analytica. The biggest legal breach was not that the company collected data, but that it claimed to be collecting data for scientific purposes and quizzes, and then used it for another purpose: mostly to create political profiles of people. This is a point made by the data protection authorities in Italy, which have temporarily banned ChatGPT there. The authorities claim that OpenAI collected the data it wanted to use illegally, and did not tell people how it intended to use it.
Does regulation stifle innovation? This is a common claim among technologists. Wiewiórowski says the real question we should be asking is: Are we really sure that we want to give companies unlimited access to our personal data? “I don’t think that the regulations … are really stopping innovation. They are trying to make it more civilized,” he says. The GDPR, after all, protects not only personal data but also trade and the free flow of data across borders.
Big Tech’s hell on Earth? Europe is not the only one playing hardball with tech. As I reported last week, the White House is mulling rules for AI accountability, and the Federal Trade Commission has even gone as far as demanding that companies delete their algorithms and any data that may have been collected and used illegally, as happened to Weight Watchers in 2022. Wiewiórowski says he is happy to see President Biden call on tech companies to take more responsibility for their products’ safety and finds it encouraging that US policy thinking is converging with European efforts to prevent AI risks and put companies on the hook for harms. “One of the big players on the tech market once said, ‘The definition of hell is European legislation with American enforcement,’” he says.
Read more on ChatGPT
The inside story of how ChatGPT was built from the people who made it