Stay-at-Home, Videoconferencing, and a Baptism of Fire for the California Consumer Privacy Act

Michael P. Goodyear | April 22, 2020

The novel coronavirus COVID-19 has rapidly become one of the worst public health crises in U.S. history. Yet this is a critical moment not only for health, but also for privacy. With stay-at-home orders in forty-two states, as well as Washington, D.C., Puerto Rico, and Guam, collaborative technological services—such as video conferencing, file sharing, mobile apps, and video games—have taken on an even more prominent role in our social and work lives. The increasing importance of these technologies reflects the equally increasing power these services hold over our privacy, both at work and at home. Sensitive corporate information, facts about individuals’ well-being, and personal conversations between loved ones are being communicated via these platforms in greater quantities than ever before. Proper privacy protections have never been more important for U.S. citizens, yet federal data privacy law remains a patchwork of sector-specific laws that fail to cover large swaths of entities that process personal information. Instead, it is largely left to the states to adopt consumer privacy laws; California was the first to do so with its California Consumer Privacy Act (CCPA), which took effect in January 2020.

The sole comprehensive consumer privacy act in the United States, the CCPA is now the primary shield against poor data processing practices by online collaborative platforms, foremost among them Zoom Video Communications (Zoom). Zoom’s soaring popularity led to greater scrutiny of its data processing and security practices, with findings ranging from sending personal information to Facebook to allowing uninvited attendees to break into private calls, a practice known as “Zoom-bombing.” On March 30, 2020, just a month after the first CCPA case ever was filed, a class action suit against Zoom was brought in the U.S. District Court for the Northern District of California. The class action complaint alleges that Zoom violated the CCPA by failing to provide consumers with adequate notice of its personal information processing practices and by failing to implement reasonable security measures to prevent the unauthorized disclosure of non-encrypted personal information. The CCPA explicitly requires entities collecting consumers’ personal information to fully disclose the collection of such information and how it will be used, including with which third parties it is shared. Businesses processing personal data must also implement and maintain reasonable security measures to prevent the inadvertent disclosure of such information. The CCPA would therefore seem to be a promising constraint on poor data protection practices, both in the conscious misuse of data and in the maintenance of insufficient security safeguards.

These rights are standard for data protection regimes around the world and are part and parcel of the European Union’s comprehensive data protection law, the General Data Protection Regulation. Yet the CCPA is still untested in court, and such protections are new to the U.S. legal regime. The courts’ early interpretations of key terms such as “disclosure” and “reasonable security measures” will be critical. The Northern District of California’s ruling in this case, or any settlement between the class and Zoom, will be of the utmost importance for future CCPA litigation and for reining in poor consumer data privacy practices. Although the CCPA protects only California residents, the interstate nature of the Internet would likely lead data processing entities to conform to the highest data privacy standard in the United States. Given the vastly increased use of data-collecting online platforms during COVID-19 stay-at-home orders, an enormous amount of personal data is on the line. Since California is currently the only state with a comprehensive consumer privacy act, the stakes of the Zoom litigation are particularly high: if the CCPA proves to be a paper tiger, U.S. residents will be left nearly defenseless, with only a patchwork of sector-specific federal and state laws to provide data protection.

Michael P. Goodyear is a graduating 3L at the University of Michigan Law School, where he is the editor-in-chief of the Michigan Technology Law Review. His previous and upcoming law journal articles can be found on his SSRN site here. After graduation, Michael will work as a litigation associate at a large law firm in New York.