Newsletter (issue no. 5) - The importance of the term “processing purpose” for privacy
June 2022
Welcome back to a new issue of Recoding Tech’s newsletter. Together with the website, each recoding.tech newsletter offers relevant analyses and highlights of the policy discussions, research, and news from the past month that are reshaping the rules for Big Tech.
In this month’s issue:
Featured topic
The importance of the term “processing purpose” for privacy: An interview with Dr. Johnny Ryan, Senior Fellow at the Irish Council for Civil Liberties, on the Digital Markets Act
Updates on how governments are recoding tech
Courts in the U.S. preserved the ability of social media platforms to moderate content for now
The state of California advanced an Age-Appropriate Design Code to protect children
The Federal Trade Commission found that Twitter deceptively used account security data to target ads
The E.U. opened an investigation into Apple Pay and released a report on dark patterns and a plan for access to social media data by independent researchers
Highlights from new research, policy papers, and news/commentary
Privacy researchers expose problems with current and proposed laws
What happens when governments limit the ability of platforms to moderate content?
Recent papers examine solutions for regulating Big Tech
Addressing the spread of disinformation
New revelations from the Facebook Papers
Featured topic
The importance of the term “processing purpose” for privacy: An interview with Dr. Johnny Ryan
The Digital Markets Act establishes new regulations for large platforms acting as “gatekeepers.” The law has undergone numerous changes as it moved from a proposal by the European Commission (EC), to negotiations between the EC and the E.U. Parliament, to a final agreement. In such a complex law, the devil, as they say, is always in the details. The law includes a provision, 5(1)a, meant to prevent platforms like Google from combining and cross-using personal data across the company’s many services. But the specific language could create a considerable loophole.
To discuss the particular language in the Act, I spoke to Dr. Johnny Ryan, a Senior Fellow at the Irish Council for Civil Liberties and a Senior Fellow at the Open Markets Institute.
Read more on recoding.tech.
Updates on how governments are recoding tech
Recoding.tech is tracking existing and proposed laws and regulations, along with government investigations and litigation from across the U.S. and Europe. Here are the new additions and updates for May 2022. You can view all the actions being tracked on recoding.tech using our law & regulation tracker.
United States. Two decisions by federal courts in the U.S. have, for the moment, preserved the ability of social media platforms to moderate content, including removing content and suspending users. The first decision, by the 11th Circuit Court of Appeals, upheld an earlier decision by a federal district court to prevent a State of Florida law targeting social media platforms from going into effect. That law, SB 7072, banned social media platforms from de-platforming candidates for political office and required that they establish more transparent and consistent content moderation policies and procedures to comply with the State’s Deceptive and Unfair Trade Practices Act.
In the second decision, the Supreme Court stepped in to reverse an appeals court decision that reinstated a Texas law prohibiting social media companies from censoring (i.e., banning, demonetizing, or otherwise restricting) user content based on the viewpoint of the user or another person, the viewpoint represented in the user's expression, or a user's geographic location in Texas. The Supreme Court’s decision only preserves a preliminary injunction on the law. It is still up to a district court to determine the law's constitutionality and whether it complies with existing federal laws, including Section 230. The Florida law will also go back to the district court to determine its legality. One or both cases are likely to make their way back to the Supreme Court for a final decision, the outcome of which is increasingly uncertain given the 5-4 split on upholding the injunction against the Texas law.
In addition, the California Age-Appropriate Design Code Act passed the California State Assembly and is under discussion in the State Senate. The act is modeled on the Age Appropriate Design Code enacted in the United Kingdom and would require businesses that offer online services, products, or features likely to be accessed by children to comply with specified requirements. Finally, the Federal Trade Commission charged Twitter with deceptively using account security data to target ads.
European Union. The European Commission informed Apple of its preliminary view that the company abused its dominant position in markets for mobile wallets on iOS devices. The Commission argued that, by limiting access to a standard technology used for contactless payments with mobile devices in stores (Near-Field Communication (NFC), or “tap and go”), Apple restricts competition in the mobile wallets market on iOS. The European Commission’s Directorate-General for Justice and Consumers also released a report finding that dark patterns are prevalent across popular websites and apps. Finally, the European Digital Media Observatory’s Working Group on Platform-to-Researcher Data Access released a new report that clarifies how platforms may provide data access to independent researchers in a GDPR-compliant manner.
Did we miss something for June? Email hello@recoding.tech with recommendations.
Highlights from new research, policy papers, and news/commentary
Recoding Tech curates a collection of academic and civil society research, policy papers, investigative journalism, and op-eds. These articles illuminate what’s wrong with Big Tech’s platforms and business models, debate policy options that could address the problems, and make recommendations for government action. Below are some highlights from May, organized by topic. You can explore the entire collection in our library.
Privacy researchers expose problems with current and proposed laws. The month of May should be a wake-up call for policymakers on the issue of digital privacy. First, a group of researchers found that thousands of popular websites collected users’ email addresses from web forms before the forms were even submitted and without consent. Then the Irish Council for Civil Liberties (ICCL) released a report on the real-time bidding system that facilitates targeted advertising across the internet. It estimates that the system “tracks and shares what people view online and their real-world locations 294 billion times in the U.S. and 197 billion times in Europe every day.” Finally, Human Rights Watch released a report investigating education technology products that connected kids to virtual learning during the Covid-19 pandemic. The report found that 89 percent of the products reviewed engaged in data practices “that put children’s [privacy] rights at risk, contributed to undermining them, or actively infringed on these rights.”
Leaky Forms: A Study of Email and Password Exfiltration Before Form Submission — USENIX Security’22
The Biggest Breach: ICCL Report on the Scale of Real-Time Bidding Data Broadcasts in the U.S. and Europe — Irish Council for Civil Liberties
How Dare They Peep into My Private Life? Children’s Rights Violations by Governments That Endorsed Online Learning During the Covid-19 Pandemic — Human Rights Watch
Remote Learning Apps Shared Children’s Data at a ‘Dizzying Scale’ — Washington Post
How Private Are Learning Platforms Used by Texas Schools? Here’s What a Study Found — Fort Worth Star-Telegram
Technology Used by Educators in Abrupt Switch to Online School Shared Kids’ Personal Information, Investigation Shows — The Globe and Mail
In addition, commentary from the LA Times and the Washington Post paints a grim picture of the public’s ability to control their digital privacy or make informed decisions about how their data is collected and used. In response to this reality, Zeynep Tufekci makes the case in the NY Times for a more comprehensive approach to protecting privacy. Finally, a report from the Bank for International Settlements, which acts as a bank for central banks, makes a case for a “data governance system that restores control of data to the consumers and businesses generating it.”
I Hid My Pregnancy from the Internet so I Know: Online Privacy Is Nearly Impossible — Los Angeles Times
I Tried to Read All My App Privacy Policies. It Was 1 Million Words — Washington Post
We Need to Take Back Our Privacy — The New York Times
The Design of a Data Governance System — Bank for International Settlements
What happens when governments limit the ability of platforms to moderate content? As discussed above, federal courts in the U.S. have, for the moment, preserved the power of social media platforms to moderate content, including removing content and suspending users. An article in Slate’s Future Tense discussed how the 5th Circuit Court of Appeals' decision to reinstate the Texas social media law would make social media even more harmful. Similarly, a commentary in The Daily Beast adds that both laws would likely require social media sites to host mass shooting videos. A blog post from EFF discusses the role of the First Amendment in the decisions by courts to uphold injunctions against the Texas and Florida laws. Finally, an article in a Fordham University law journal discusses the legal thinking that could shape the Supreme Court’s review of both laws.
The 5th Circuit’s Reinstatement of Texas’ Internet Censorship Law Could Break Social Media — Slate
Florida and Texas’ ‘Free Speech’ Social Media Laws Would Require Sites to Host Mass Shooting Videos — The Daily Beast
11th Circuit’s Ruling to Uphold Injunction Against Florida’s Social Media Law Is a Win Amid a Growing Pack of Bad Online Speech Bills — Electronic Frontier Foundation
Failed Analogies: Justice Thomas’s Concurrence in Biden v. Knight First Amendment Institute — Fordham Intellectual Property, Media and Entertainment Law Journal
Recent papers examine solutions for regulating Big Tech. There is no shortage of problems resulting from Big Tech’s platforms that require new thinking and policy action. Two papers discuss the role of antitrust in addressing one of these problems: Big Tech’s power over markets. Another article, in the Emory Law Journal, argues for a change to Section 230 that would “empower courts to issue injunctive relief, directing content platforms that enable intimate privacy violations to remove, delete, or otherwise make unavailable intimate images, real or fake, that were hosted without written permission.” Finally, a journal article discusses the legal challenges of regulating social media’s algorithms.
The Bipartisan Consensus on Big Tech — Emory Law Journal
A New Antitrust Framework to Protect Mom and Pop from Big Tech — Journal of the National Association of Administrative Law Judiciary
Privacy Injunctions — Emory Law Journal
Algorithms and Misinformation: The Constitutional Implications of Regulating Microtargeting — Fordham Intellectual Property, Media and Entertainment Law Journal
Addressing the spread of disinformation. Why is disinformation so difficult to combat on social media? According to one recent paper, one reason is that misinformation is easier to read and more emotional. The researchers find that misinformation content, on average, is ten times more likely to rely on negative sentiment and about one-third more likely to appeal to a reader’s morality. A second paper finds that questionable news sources were more responsive to the public’s concerns, doubts, and fears about COVID than general news production, supplying disinformation that catered to that demand. Researchers also found that a person’s susceptibility to misinformation is less explained by inadequate analytical thinking and more by their existing biases and partisan views. Finally, an article in Science Advances finds that credibility labels on social media content and websites, provided by NewsGuard’s web extension, didn’t change users' consumption of low-quality news sources or their misperceptions about the Black Lives Matter movement.
The Fingerprints of Misinformation: How Deceptive Content Differs from Reliable Sources in Terms of Cognitive Effort and Appeal to Emotions — Humanities and Social Sciences Communications
The Supply and Demand of News during COVID-19 and Assessment of Questionable Sources Production — Nature Human Behaviour
Susceptibility to Misinformation Is Consistent across Question Framings and Response Modes and Better Explained by Myside Bias and Partisanship than Analytical Thinking — Judgment and Decision Making
News Credibility Labels Have Limited Average Effects on News Diet Quality and Fail to Reduce Misperceptions — Science Advances
New revelations from the Facebook Papers. The leak of the Facebook Papers last October continues to yield new examples of the company’s harmful practices and decisions. A review of the leaked documents by Gizmodo found that Facebook held back a News Feed update designed to staunch the flow of hoaxes, fake news, and other disinformation in order to avoid accusations of liberal bias from political leaders. A second report from the outlet examines internal discussions of Facebook’s recommendation algorithms, including a presentation finding that the platform’s “ranking encourages the sharing of fewer meaningful posts” while allowing “bad content to spread farther due to the costless accumulation of friends.”
Facebook Killed News Feed Fix for Fear of Conservative Backlash: Docs — Gizmodo
The Facebook Papers: The Algorithms That Control Your News Feed — Gizmodo
Did we miss something for June? Email hello@recoding.tech with recommendations.
About us
Recoding Tech is a Reset-supported initiative. Reset is engaged in programmatic work on technology and democracy. It seeks to change the way the internet enables the spread of news and information so that it serves the public good over corporate and political interests — ensuring tech companies once again work for democracy rather than against it.