By Winny Sun
In 2013, whistleblower and former CIA employee Edward Snowden delivered the shocking revelation that the National Security Agency had been spying on the phone and internet activities of millions of Americans. Just a few years later, the 2018 Cambridge Analytica scandal brought the issue of online privacy under the spotlight once again. The data of more than 87 million Facebook profiles was harvested and used by the political consulting firm Cambridge Analytica to develop strategies for the 2016 Trump presidential campaign. Incidents like these have raised the societal importance of online privacy in the US. In fact, a 2017 Pew Research Center study found that half of Americans believe their personal data is less secure today than it was five years ago.
In 2018, the European Union adopted the General Data Protection Regulation (GDPR) to address similar concerns in Europe. The GDPR comprises a set of strict data protection rules, requiring companies to obtain users' permission before collecting their information and to strip identities from collected data, among other things. Since the EU law took effect, discussion about creating a similar federal privacy law in the US has sprung up. But as politicians and academics battle over whether and how to regulate privacy, individual states have already begun pushing forward their own plans. The California Consumer Privacy Act (CCPA), signed into law in 2018 and scheduled to take effect in 2020, will require companies to delete sensitive information at customers' request, among other demands. The CCPA is one of the nation's strictest privacy regulations to date. Yet California lawmakers are not satisfied and want to make the act even stronger. They have introduced a ballot initiative that would add new rights, such as allowing people to prevent sensitive information like race, health, and financial data from being used in advertising. California is slated to take up the initiative in the upcoming November election.
Elsewhere in the country, things are more complicated. New York State's approach, embodied in the proposed New York Privacy Act, was intended to be even bolder than California's, yet the bill failed to pass amid industry lobbying and a lack of support in the NY Senate. The Trump administration reportedly wants a federal privacy law, but senators cannot agree on which rights to include. For example, should consumers be allowed to directly sue businesses that violate the new federal law? While politicians hold differing opinions, the majority of tech companies have reached a consensus. A group of 51 large companies, including Amazon, IBM, and Salesforce, recently signed a letter to US congressional leaders asking for a federal consumer data privacy law. For these companies, regulatory certainty is the highest priority: complying with numerous individual state privacy laws is simply too difficult and expensive.
One major benefit of having a federal privacy law is that it would simplify the existing legal framework. Currently, the US has a patchwork of confusing and sometimes conflicting federal and state laws. At the federal level, laws such as the Cable Communications Policy Act and the Health Insurance Portability and Accountability Act (HIPAA) exist, but each protects only a narrow category of people or data. The Cable Communications Policy Act protects the personal information of cable TV subscribers, while HIPAA prevents patients' sensitive health records from being released without their knowledge. There are also wide discrepancies across state laws, which differ in the harshness of their rules and the extent of the rights they cover. Wyoming and Mississippi are among the worst states for online privacy, according to an analysis by the research agency Comparitech; the two states lack basic laws barring employers from intruding on employees' personal information, for example.
Faced with this legal muddle, a federal law that reduces inconsistencies and overlaps would cut compliance costs and improve revenues. A 2019 Cisco report found that businesses that complied with the European GDPR experienced greater consumer trust and shorter sales delays. But it is not only businesses that benefit: consumers would also find it easier to exercise their privacy rights, assuming a unified law is simpler to interpret. Another major benefit of enacting a federal law is that it would help US companies compete in the international market. Earlier this year, Google received a hefty $57 million fine for GDPR violations; the French privacy authority accused the search giant of a lack of transparency in how it collects personal data and creates personalized ads. Regardless of US policy, Silicon Valley tech giants must comply with strict European regulations anyway, and stricter regulations at home would make that compliance easier.
Privacy advocates in the US hope that a national privacy act, if enacted, would mimic the GDPR: imposed uniformly across the nation yet composed of strict requirements. But the US and Europe are fundamentally different, with the latter placing a heavier emphasis on privacy in general. During the Second World War, the Nazis and their secret police were notorious for invading individual privacy, an experience that ingrained pro-privacy attitudes among Europeans. Additionally, unlike the US Constitution, the European Union's Charter of Fundamental Rights explicitly includes rights to the protection of privacy and personal data. For historical as well as legal reasons, then, Europe imposes much stricter privacy regulations on tech companies. By contrast, America's long-standing preference for free markets and competition could water down any potential federal law.
But even if lawmakers eventually decide to adopt a federal privacy law, it likely will not come to fruition until important questions are answered. For example, which agency should enforce the law? Currently, the Federal Trade Commission (FTC), the Department of Commerce, and the Department of Health and Human Services (HHS) each oversee some aspect of data privacy. Additionally, what counts as "personal data"? The meaning of this term has grown increasingly complex, and states often define it in different ways. A new privacy law will need to define the term explicitly, stating what "personal data" encompasses and to what extent companies may use it. Clashes between businesses and consumers are likely: companies want a narrow interpretation of personal data to minimize their regulatory exposure, while consumers and privacy rights activists want a more comprehensive definition. Another controversial question is whether the new federal law would preempt state laws. In other words, once a federal law passes, may individual states still impose their own privacy acts?
Setting these questions aside, one of the trickiest aspects of establishing privacy regulations in the 21st century is striking a delicate balance between consumer data protection and innovation. Artificial intelligence poses privacy challenges, yet it has delivered solutions to numerous health and financial problems. For example, machine learning has succeeded in predicting Alzheimer's disease from brain scans six years before patients were clinically diagnosed. It has also proven accurate at assessing consumer and business credit risk by sifting through massive quantities of financial data. Interestingly, despite the privacy concerns it raises, AI can also help people protect their personal data: a program developed by researchers at the University of Wisconsin-Madison and the University of Michigan helps people interpret websites' complex data protection policies by translating the terms into simple graphs and color codes.
Privacy laws may also harm business innovation by dampening healthy competition and preventing small businesses from thriving. A report prepared for the California attorney general's office found that, compared to big companies, smaller companies may bear a disproportionately large share of compliance costs; small companies with fewer than 20 employees may incur up to $50,000 in privacy compliance costs.
Instituting a national privacy law is no easy task, as the current political limbo illustrates. But given society's rapid digitization and people's growing concerns about their safety online, the issue of privacy will sooner or later command national attention. When that time comes, lawmakers will need to take extra care to ensure that a new federal law advances privacy while maintaining a healthy competitive ecosystem.