Information contained in this publication is intended for informational purposes only and does not constitute legal advice or opinion, nor is it a substitute for the professional judgment of an attorney.
On November 22, 2024, the California Privacy Protection Agency (CPPA) formally proposed new regulations implementing the California Consumer Privacy Act (CCPA). Although the CCPA itself and previous CCPA regulations largely ignored employers, the proposed regulations focus heavily on the employment context. The centerpiece of the proposed regulations is three new sets of rules on risk assessments, cybersecurity audits, and automated decisionmaking technology (ADMT).
Of these, the risk assessment and ADMT rules would impose particularly heavy burdens on employers. Covered employers would have to perform and document detailed risk assessments for certain uses of HR data, particularly the use of ADMT, and then provide copies of those risk assessments to the CPPA. In addition, the regulations would closely regulate the use of ADMT by employers. California applicants, employees, and independent contractors would have the right to opt out of most uses of ADMT for employment decisions or for systematic observation of their behavior. In addition, covered businesses would be required to provide detailed information about their use of ADMT to California residents. Although the new cybersecurity audit requirements do not apply as directly to covered businesses in their capacity as employers, employers should take note. The rules on cybersecurity audits provide insight on the data security measures that the CPPA likely considers compliant with the CCPA’s requirement for “reasonable” data security.
Context for the proposed regulations
The CCPA provides comprehensive protections for the personal information of California residents and applies to most businesses of at least medium size that do business in California.1 In addition to highly burdensome data protection provisions, the statute created a new privacy agency, the CPPA, and authorized it to issue regulations on 22 topics.2 The first set of regulations, issued on March 29, 2023, covered, at least in part, 12 of these topics. The new set of regulations covers four more topics, including new rules on insurance companies (which are not relevant to employers), and makes some amendments to the existing set of regulations. A few remaining topics—for example, further defining the statutory terms “precise geolocation” and “specific pieces of personal information”—have not yet been addressed.
The CPPA now has until November 22, 2025, to submit the new regulations to California’s Office of Administrative Law for approval, which is the final step prior to official publication. The deadline for public comments on the proposed regulations is January 14, 2025.
ADMT Regulations
The ADMT regulations focus heavily on the use of ADMT by employers by regulating common use cases for ADMT in the workplace. Employers that use ADMT in these covered use cases would be required to: (1) provide a detailed pre-use notice, (2) add disclosures to their privacy policies, (3) ensure vendor contracts include required provisions, (4) comply with requests for access and requests to opt out, and (5) in certain cases, validate functioning for the intended purpose, ensure no discrimination occurs based on protected categories, and implement policies, procedures, and training to ensure such functioning and lack of discrimination. The right to opt out would likely pose the steepest stumbling block for employers because employers typically adopt ADMT to streamline processes, such as hiring or performance monitoring. For many employers, the obligation to create exceptions for each opt-out request may negate the advantages of using ADMT.
Covered Uses of ADMT
The regulations define ADMT very broadly as “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.”3 The definition explicitly excludes some common technologies, such as “calculators, databases, spreadsheets, or similar technologies.”4
The proposed ADMT rules apply in three circumstances of importance to employers. First, they apply to the use of ADMT to make “significant decisions,” which includes using these technologies to substantially facilitate decisions on hiring, allocation of work, compensation, promotion, demotion, suspension, and termination.5 Second, the regulations apply to the use of ADMT to perform “extensive profiling,” which includes using automated processing of data collected through extensive observation to analyze or predict the characteristics of applicants, employees, and independent contractors.6 Third, these rules would apply to the use of personal information to train ADMT for purposes including “significant decisions” and profiling.7
Pre-Use Notice and Privacy Policy
Prior to using ADMT in a covered manner, the business must provide California residents with a “pre-use notice” covering the following topics:
- The specific purpose for using ADMT;
- How the ADMT works, including the logic and intended output;
- The right to opt out, the right to access information about the ADMT, and how to submit the request; and
- That the business may not retaliate against California residents for exercising their rights.8
In addition, the regulations would require businesses to add information about ADMT to their privacy policies, including that they use ADMT for a covered use, the right to opt out, and how to exercise that right.9 In contrast to the pre-use notice, which is new, the CCPA already requires that businesses post a privacy policy “online.”10 Therefore, businesses may have to revise their existing privacy policies to cover ADMT.
The Right to Opt Out, Exceptions, and Evaluation of Functionality and Discrimination
The proposed regulations grant California residents the right to opt out of ADMT for the covered uses. The proposed rules provide four exceptions of relevance to employers, each of which is described below, but all are quite narrow.11 As written now, the right to opt out would apply to most covered uses of ADMT unless the ADMT were “necessary to achieve and used only” for the purpose at issue or the business offered a right of human appeal.
Human appeal exception
First, the right to opt out would not apply where ADMT is a key factor in hiring, allocation of work, compensation, promotion, demotion, suspension, and termination (“significant decisions”) if the employer provides the individual with the right to appeal to a human reviewer with the authority to overturn the decision. However, for employers that implement ADMT largely to streamline processes and save human labor, this exception offers little relief from the right to opt out.12
Security, fraud prevention, or safety exception
Second, there is no right to opt out when the “ADMT is necessary to achieve, and is used solely for, … security, fraud prevention, or safety purposes.”13 Unfortunately for businesses, the “necessary to achieve” requirement may swallow this exception because ADMT would rarely offer the only way to achieve security, fraud prevention, or safety purposes. Moreover, the security exception applies to protection of personal information only, which appears to omit trade secrets and other confidential business information. Also, the fraud prevention exception does not appear to cover prevention of fraud against customers, vendors, or affiliated companies.
Significant decisions exception
Third, the proposed regulations give businesses an exemption from the right to opt out of the use of ADMT for decisions about hiring, allocation, assignment of work, and compensation.14 Like the security, fraud prevention, and safety exception above, however, this exception applies only where the ADMT is “necessary to achieve, and is used solely for” these purposes. In addition, the business must evaluate whether the ADMT works for the intended purpose and does not discriminate based on protected classifications, and implement policies, procedures, and training to ensure this functionality and non-discrimination.
Work profiling exception
Fourth, the right to opt out would not apply when the ADMT is “necessary to achieve, and is used solely for,” profiling to assess the individual’s ability to perform, or actual performance, at work.15 However, because the rules do not provide an exception to the right to opt out over the use of ADMT for promotion, demotion, suspension, or termination decisions (as opposed to hiring, allocation, assignment of work, and compensation), it seems that California residents might still have the right to opt out if the performance assessment were used as a key factor in those decisions.
Right of access
California residents would gain the “right to access ADMT,” which means a right to obtain, upon request, certain information about the covered uses of ADMT regarding the requesting individual. This information would include: the specific purposes for using the ADMT, the ADMT outputs, and details about how the ADMT worked, such as logic, key parameters, and output statistics to help the requesting individuals understand how their output compared to the output for other individuals. Providing this highly individualized information likely would increase the cost, and decrease the efficiency, of using ADMT for employers.
Risk Assessments
At first blush, the proposed regulations on risk assessments appear to add significant burdens to a covered employer’s CCPA compliance efforts. However, it is likely that the requirement to conduct risk assessments would apply mainly in circumstances involving ADMT or artificial intelligence (AI).16 Where the risk assessment obligations do apply, however, they would require a detailed, written assessment of about 30 elements, annual submission to the CPPA, and regular updates.
When businesses must conduct a risk assessment
The proposed regulations would require that covered employers conduct a risk assessment before processing personal information in a manner that presents a “significant risk to consumers’ privacy.”17 Processing activities that meet that standard are those that involve: (1) selling and sharing personal information; (2) processing sensitive personal information; (3) using ADMT for a significant decision concerning the consumer or for extensive profiling; or (4) processing personal information to train ADMT or AI. Notably, the final two categories encompass all uses of ADMT covered by the proposed ADMT regulations. As a result, any use of ADMT covered by the ADMT rules would also require a risk assessment under the risk assessment rules.
Covered employers rarely sell or share personal information of their workforce. With respect to processing sensitive personal information, the proposed regulations carve out processing sensitive personal information “solely and specifically for purposes of administering compensation payments, determining and storing employment authorization, administering employment benefits, or wage reporting as required by law.”
In addition, section 1798.121(d) of the CCPA provides that sensitive personal information collected or processed without the purpose of inferring characteristics about the consumer is not subject to the CCPA’s rules for sensitive personal information and is instead treated as personal information. This exception should eliminate any situations not covered by the quoted exception because employers rarely, if ever, collect sensitive personal information for purposes of inferring characteristics about employees.
Requirements of a risk assessment
Risk assessments would be a burdensome undertaking, requiring the assessment of some 30 elements18 to determine whether the risks to consumers’ privacy from the processing of personal information outweigh the benefits to the consumer, the business, and others from that same processing. If the risk to the consumers’ privacy outweighs the benefits, the covered employer is prohibited from processing personal information for that activity. Risks to the consumers’ privacy may include unauthorized access, destruction, use, modification, or disclosure of personal information and discrimination based on a protected class, among other things. Covered businesses must review the risk assessment at least once every three years and update the assessment as necessary.
Submission of the risk assessment to the CPPA
For many businesses, the requirement to submit the risk assessment to California authorities may raise the most concern. This requirement could expose detailed information about a business’s internal processes and policies to regulatory scrutiny. Covered businesses would have to submit risk assessment materials to the CPPA within 24 months from the risk assessment’s effective date. These materials would consist of: (1) a form Certification of Conduct signed by the employer’s highest-ranking executive responsible for oversight of the company’s risk assessment compliance; (2) an abridged version of the risk assessment on a form provided by the CPPA; and (3) an unabridged version of the risk assessment. Thereafter, the covered business must annually submit the foregoing risk assessment materials. Finally, the proposed regulations would require that a covered business provide its unabridged risk assessment within 10 business days of a request from the CPPA or state attorney general.
Cybersecurity Audits
As the thresholds for the cybersecurity audit are currently drafted, only large data processors or data sellers would be required to perform the highly burdensome and comprehensive written cybersecurity audit. However, even companies that would not be required to conduct the cybersecurity audit should review the factors because they likely indicate what the CPPA considers to satisfy the CCPA’s requirement that a covered business implement “reasonable security procedures and practices.”19 For many companies, HR data is their most sensitive personal information, and businesses could face regulatory scrutiny and liability if that information were compromised.
Which businesses must conduct a cybersecurity audit
The cybersecurity audit requirement applies to covered businesses that: (1) derive 50 percent or more of their annual revenues from selling California residents’ personal information or disclosing it to third parties for cross-context behavioral advertising or (2) had annual gross revenues of over $25 million, as adjusted by the CPPA, and, in the preceding calendar year, either (a) processed the personal information of at least 250,000 California residents or households or (b) processed the sensitive personal information of at least 50,000 California residents.
Requirements of a cybersecurity audit
The cybersecurity audit must be conducted annually by an independent, objective auditor and must document 13 highly specific components, each including multiple subcomponents. These include, for example, encryption of personal information at rest and in transit, cybersecurity training, and code reviews. Although the components resemble other cybersecurity standards, such as ISO 27001 and the Center for Internet Security’s recommended controls, they would not map directly to those standards. Nevertheless, covered businesses potentially may use audits conducted under other cybersecurity standards to meet the requirement. To do so, however, the business must clear a significant hurdle: the business “must specifically explain how the cybersecurity audit, assessment, or evaluation that it has completed meets all of the requirements” of the CCPA cybersecurity audit.20
Finally, like the risk assessment, the cybersecurity audit implicates senior leadership in compliance. The cybersecurity audit must be reported to the board of directors or, if there is no board, to a high-ranking executive.21 A member of the board or the executive must not only sign and date the audit, but also sign a certification of audit completion and submit it to the CPPA.22
Changes to the existing regulations
In addition to creating the new ADMT, risk assessment, and cybersecurity audit rules, the proposed regulations would revise the existing regulations in several relatively minor ways for employers. For example, these proposed regulations would require a covered business, when rejecting a request to exercise data rights, to inform the requesting individual that they can file a complaint with the CPPA and California attorney general and provide links to these agencies’ website complaint forms. As another example, the revisions would require provisions in vendor contracts obligating the vendor to assist the covered business in complying with the regulations on ADMT, risk assessments, and cybersecurity audits.23
See Footnotes
1 For more information on what businesses are covered businesses, please see Anna Park, Zoe Argento, and Philip Gordon, Substantial New Privacy Obligations for California Employers: The California Privacy Rights and Enforcement Act of 2020 Passes at the Polls, Littler Insight (Nov. 5, 2020). For more coverage of the CCPA, please see our website: California Privacy Rights Act of 2020 | Littler Mendelson P.C.
2 Cal. Civ. Code §§ 1798.185, 1798.199.10.
3 Proposed Text of Regulations, Oct. 2024 § 7001(f) [hereinafter “Proposed Regulations”].
4 Id.
5 § 7200.
6 Id.
7 Id.
8 § 7220(c).
9 § 7011(e)(2).
10 Cal. Civ. Code § 1798.135(a)(5).
11 Proposed Regulations § 7221.
12 § 7221(b)(2).
13 § 7221(b)(1).
14 §§ 7221(b)(3), (4).
15 § 7221(b)(5).
16 The regulations define “artificial intelligence” broadly as “a machine-based system that infers, from the input it receives, how to generate outputs that can influence physical or virtual environments.” § 7001(c).
17 § 7150(a).
18 At top level, the risk assessment must cover nine elements, but these elements include multiple sub-elements: (1) the specific purpose for processing personal information; (2) the categories of personal information to be processed and whether they include sensitive personal information, and for ADMT or AI uses, the actions taken or any planned actions to maintain the quality of personal information processed by the ADMT or AI; (3) operational elements of processing as outlined in the regulations; (4) the specific benefits to the business, the consumer, or others from the processing of personal information; (5) the specific negative impacts to the consumers’ privacy associated with the processing; (6) safeguards the business plans to implement to address the negative impacts identified; (7) whether it will initiate the processing subject to the risk assessment; (8) the contributors to the risk assessment; and (9) the date the assessment was reviewed and approved, and the names and positions of the individuals responsible for the review and approval. Proposed Regulations §7152.
19 See Cal. Civ. Code 1798.100(e).
20 Proposed Regulations § 7121(f).
21 § 7122(h).
22 § 7124.
23 § 7051(a)(5).