
Yesterday, the Federal Trade Commission (FTC) announced the creation of a new task force to monitor competition in technology markets. Given the inadequacies of federal antitrust enforcement over the past generation, we welcome the new task force and reiterate our suggestions for how regulators can better protect technology markets and consumers.
Citing the 2002 creation of a task force that reinvigorated antitrust scrutiny of mergers, and ongoing hearings on Competition and Consumer Protection, FTC Chairman Joe Simons said, “[I]t makes sense for us to closely examine technology markets to ensure consumers benefit from free and fair competition.” Bureau Director Bruce Hoffman noted that “[t]echnology markets, which are rapidly evolving and touch so many other sectors of the economy, raise distinct challenges for antitrust enforcement.”
We could not agree more.
Unfortunately, antitrust enforcement in the U.S. has become strangled in an outmoded economic doctrine that fails to recognize the realities of today’s Internet. We recently submitted comments to the FTC explaining a few key ways to strengthen antitrust enforcement and enable it to better protect competition, the marketplace, and consumer welfare.
Increasingly, consumers “pay” for the services they use online not in dollars, but with their data, which companies then use without compensation to enable targeted advertising. Given that these services are nominally “free” to consumers, it makes no sense to evaluate consumer welfare solely on the basis of price.
The fetish with price among antitrust regulators originated with a group of economists known as the Chicago School. Their stated goal was to ground antitrust in empiricism. But the empirical measures they adopted have grown dramatically underinclusive, and their theories make little sense in the context of today’s corporate Internet.
In particular, the most salient “cost” paid by consumers to tech companies is often not a price that we pay, but rather the data that we provide, as well as our agency and autonomy in the face of corporate advertising and platform censorship.
In the advertising context, firms monetize user data by selling the privilege of reaching those users to third parties. Because the third parties—not the users themselves—are paying the price of advertising, a price-focused measure of consumer welfare essentially ignores crucial externalities that should inform antitrust analysis.
In addition, platform censorship harms users in a dimension unrelated to price. Arbitrary filters—sometimes driven by perceived national security concerns, and just as often by narrow corporate interests like extreme copyright enforcement—often remove speech from the Internet. Users dissatisfied with one service’s practices should be able to migrate to alternative platforms, but that presumes a competitive marketplace that is almost nonexistent on today’s Internet.
Federal antitrust regulators should consider these very real costs to consumers when they evaluate proposed mergers, acquisitions, and anti-competitive behavior by companies leveraging longstanding and entrenched monopolies in particular digital markets.
Several corporate behemoths dominate today’s Internet, each of which tends to wield monopoly power in at least one particular segment. Facebook’s share of advertising revenues among social networks in the United States is over 79%, while Google enjoys similar dominance over search tools, Amazon over cloud data infrastructure, Microsoft over operating systems, and Apple in device manufacturing.
Among the features of the contemporary marketplace that entrench these monopolists are network effects. Put simply, their value corresponds to their number of established users, and the size of their user bases represents a barrier to entry among potential competitors.
One of the features that inhibit user choice is the refusal of corporate platforms to allow interoperability. In other contexts, consumers dissatisfied with a service can choose a competing one. But in the context of social media, the established content that a user has generated serves as inertia, increasing the transaction cost of migrating to alternative services, especially those that have not yet established comparable network effects.
Platforms do not benefit from this inertia merely passively. Rather, they actively prevent users from migrating—and prevent third parties from developing tools that would help empower users—in at least two ways. First, companies have enforced overbroad legal claims under the Computer Fraud and Abuse Act. Second, they have expansively interpreted the authority granted to them by user agreements, which are legally suspect under traditional contract law principles as contracts of adhesion lacking any opportunity for negotiation or modification.
To address the realities of today’s digital economy, regulators and courts must finally begin to consider harms to consumers beyond price, including corporate platform censorship.
At the same time that antitrust regulators and courts developed an unsustainable, myopic interpretation of consumer harm, they also sharply limited one of the strongest levers in antitrust law for guarding competition: the “essential facilities” doctrine. It has been applied in cases ensuring that railroads could access bridges over rivers even when their competitors owned the bridges and that advertisers could run ads in newspapers even when the newspaper might prefer to exclude them in retaliation for those advertisers also buying ads in other advertising mediums.
When a firm wielding monopoly power refuses to allow access to a resource that other firms cannot duplicate, courts can apply the essential facilities doctrine. On the one hand, leveraging a firm’s unique infrastructure might seem like a normal way of doing business. Seen from another perspective, this kind of activity preys on consumers—and competition—by preventing rivals from emerging and forcing users to settle for the first mover.
Applications of the essential facilities doctrine might appear aggressive, but applying the doctrine need not impose the kinds of obligations that constrain common carriers. Indeed, common carrier restrictions on social networks would risk imposing harms on speech. In contrast, recognizing essential facilities claims by competitors hampered by an anticompetitive denial of access would promote a diversity of approaches to content moderation and to other platform conduct (such as predatory uses of the Computer Fraud and Abuse Act) that harms users. Essential facilities claims would also encourage the development of new social media platforms and expand competition.
We have argued that the FTC should consider harms to consumers beyond price manipulation, and the essential facilities doctrine, to inform and revive its enforcement of antitrust principles. We anticipate making similar arguments to the Department of Justice (DOJ), and before courts evaluating potential claims in the future. And we hope the new task force, through its work monitoring technology markets, helps focus federal regulators at both the FTC and DOJ on these opportunities.
Properly understood, and liberated from the constraints of an outmoded economic theory that defers to the abuses of corporate monopolies, antitrust laws can be a crucial tool to protect the Internet platform economy—and the billions of users who use it—from the dominance of companies wielding monopoly power.
Update, 2:35 p.m.: The coalition of groups behind Privacy for All has grown since the time of publication. This update reflects the latest count.
Privacy is a right. It is past time for California to ensure that the companies using secretive practices to make money off of our personal information treat it that way.
EFF has for years urged technology companies and legislators to do a better job at protecting the privacy of every person. We hoped the companies would realize the value of meaningful privacy protections. Incidents such as the Cambridge Analytica scandal and countless others proved otherwise.
Californians last year took an important step in the right direction, by enacting the California Consumer Privacy Act (CCPA). But much work remains to be done. “Privacy for All,” a bill introduced today by Assemblymember Buffy Wicks, builds on the CCPA’s foundation. It promises to give everyone the rights, knowledge, and power to reclaim their own privacy.
Californians have an inalienable, constitutional right to privacy. But the scale and secrecy of corporate monetization of our personal information has outpaced the state’s ability to enforce that fundamental right. Privacy for All improves on the CCPA by ensuring that companies cannot punish someone for exercising their right to privacy by imposing a higher price or inferior service. Privacy is not a right reserved for the rich.
Privacy for All also establishes a crucial power to protect our privacy: the right to act as our own privacy enforcers. With a private right of action, Privacy for All ensures that every person can go to court to hold companies accountable when they violate the law and refuse to respect our rights.
When it comes to protecting our own privacy, consumers are at a huge disadvantage. Companies know what they collect, how they use it, and who they share it with. Consumers usually do not.
This knowledge gap has harmful effects. Without knowing where their information goes, people have been unable to exert control over its distribution, sale, and use. There is no way for them to know, for example, that a company has given their information—their zip code, their race, their restaurant preferences—to a firm that uses this information to determine their mortgage rate or credit limit. Seniors with dementia have no way to know when their name ends up on a data broker’s list.
The CCPA increases the consumer’s right to know. Privacy for All strengthens this right, and makes sure that everyone can learn what information companies have shared and who it’s been shared with.
A cornerstone of data privacy is the consumer’s power to decide what a company may do with their data. The CCPA empowers consumers to opt-out of sale of their personal information.
Privacy for All would improve the CCPA by making sure that companies that share data, as well as those that sell it, are required to get opt-in consent to do so. Privacy for All would make sure that the law covers all the ways personal information is shared in the modern digital world, including in ways people may not expect. That returns privacy power to the people.
EFF proudly stands with 30 other privacy and civil rights organizations behind Privacy for All and its commitment to protecting our fundamental right to privacy. Companies have broken their promises to do better when it comes to privacy. Scandals and breaches have shown, time and again, that letting companies dictate privacy policy hurts everyone.
California lawmakers and Governor Gavin Newsom have already made clear that privacy is a vital right for the people of this state. It’s time for California's legislators to take the lead once again and ensure Privacy for All.
San Francisco—The Electronic Frontier Foundation (EFF) is standing with Californians demanding more control over their personal data by supporting the Privacy For All bill, which requires tech companies to get users’ permission before sharing and using their private information.
“All eyes are on California, which has taken the lead nationwide in passing a historic consumer privacy bill at a time when people across the country are outraged by the privacy abuses they read about every day,” said EFF Legislative Counsel Ernesto Falcon. “Privacy For All improves on the existing privacy law so that consumers can control who gets access to their data and how the data is being used.”
Privacy For All was introduced in Sacramento today by Assemblymember Buffy Wicks and has the support of a broad coalition of 14 consumer advocacy groups, including the ACLU, Common Sense Kids Action, Consumer Federation of America, and Privacy Rights Clearinghouse.
Privacy For All
“When it comes to control of their personal information, Californians are at the mercy of companies who enrich themselves at the expense of our privacy,” said Lee Tien, senior staff attorney at EFF. “Privacy For All improves that imbalance of power and gives consumers the opportunity to block companies from secretly sharing and using their personal information.”
For more on Privacy For All:
https://www.eff.org/deeplinks/2019/02/its-atime-california-guarantee-privacy-all
For more on CCPA:
https://www.eff.org/deeplinks/2018/12/california-lawmakers-defend-and-strengthen-california-consumer-privacy-act
For more on data privacy:
https://www.eff.org/deeplinks/2018/12/data-privacy-scandals-and-public-policy-picking-speed-2018-year-review