President Biden’s plan to nominate Alvaro Bedoya to the U.S. Federal Trade Commission (FTC) sends a clear message that the agency and other federal regulators will further intensify their focus on consumer privacy matters, especially within technology companies.
Frequently described as a “privacy hawk,” Bedoya is the founding director of the Center on Privacy and Technology at Georgetown University Law Center, where his research has examined facial recognition systems and many other privacy issues. “Bedoya also isn’t shy about calling out Big Tech,” reports TechCrunch’s Taylor Hatmaker. “In a New York Times op-ed a few years ago, he took aim at Silicon Valley companies giving user privacy lip service in public while quietly funneling millions toward lobbyists to undermine consumer privacy.”
As regulators and legislators consider new approaches to addressing consumer privacy, CISOs and their colleagues in technology and consumer products companies that use personal data should reconsider how they balance data privacy risk management against the need for speed. Developing a privacy risk analysis capability is a useful way to address consumer privacy issues without impeding the rapid pace at which the company ships new products and updates to existing offerings.
Many technology companies update their products on a weekly, if not daily, basis. And many, though certainly not all, of those updates pose potential data privacy risks. The trick is to quickly and reliably identify which new products and/or updates can give rise to privacy risks and then mitigate those risks, while letting updates that have little or no impact on consumer data privacy zip through the filter.
When a company pushes ten small pieces of code to production, each should pass through a privacy assessment process that evaluates data privacy risks as quickly and as unobtrusively as possible. When one of those updates merely moves a button from one side of a user's screen to the other, however, that change likely does not warrant a comprehensive risk impact assessment.
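The triage idea above can be sketched in a few lines of Python. This is a minimal illustration, not a real assessment tool: the `Change` fields and the three review tiers are assumptions chosen to show how cosmetic updates pass through while data-touching changes are routed to review.

```python
from dataclasses import dataclass

# Hypothetical change descriptor; the field names are illustrative,
# not drawn from any specific release-management tool.
@dataclass
class Change:
    description: str
    touches_personal_data: bool = False
    alters_data_flows: bool = False
    adds_third_party_sharing: bool = False

def privacy_triage(change: Change) -> str:
    """Route a change to the lightest review tier that still covers its risk."""
    if change.adds_third_party_sharing:
        return "full-assessment"   # new external data flow: deepest review
    if change.touches_personal_data or change.alters_data_flows:
        return "quick-review"      # personal data involved: fast check
    return "no-review"             # cosmetic change: zips through the filter

# A cosmetic UI tweak passes through; a data-touching change is flagged.
assert privacy_triage(Change("move a button across the screen")) == "no-review"
assert privacy_triage(Change("log device location", touches_personal_data=True)) == "quick-review"
```

The point of the sketch is the ordering: the filter checks the highest-risk signal first, so a single change that both moves a button and adds a data flow still lands in the stricter tier.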
A privacy risk analysis rests on a set of enabling activities:
- Involve your engineers: The vast majority of data privacy missteps are unintentional, and many of these unintentional mistakes occur as a second-order impact of a change that otherwise poses minimal direct risk. A slight change to an in-car app's algorithm may have the unintended effect of letting a third party track the vehicle. These unexpected data privacy impacts are precisely why software engineers should help build the privacy risk inventory and analyze whether a product update might trigger a privacy risk. Engineers can walk their data privacy and security colleagues through discussions along the lines of "if we make this update, the consumer will do X, which will cause Y and Z to happen."
- Mitigate and document: When an update carries a privacy risk, the privacy team must determine what actions will reduce that risk to an acceptable level. And that mitigation should be documented. If the car app update might let an unauthorized third party track someone's vehicle, the team should document the technical and procedural controls put in place to prevent that outcome.
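The mitigate-and-document step amounts to keeping a structured record tying each identified risk to its controls. A minimal sketch, using the article's in-car tracking example; the `MitigationRecord` fields and control wording are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative mitigation record: one identified risk, the controls applied,
# and who signed off. Field names are assumptions for this sketch.
@dataclass
class MitigationRecord:
    risk: str
    controls: list[str]
    approved_by: str
    logged_on: date = field(default_factory=date.today)

record = MitigationRecord(
    risk="In-car app update could let a third party track the vehicle",
    controls=[
        "Rotate vehicle telemetry identifiers per trip",              # technical control
        "Require privacy sign-off on telemetry schema changes",       # procedural control
    ],
    approved_by="privacy-team",
)
assert len(record.controls) == 2
```

Keeping records in a structured form, rather than in scattered tickets, is what lets the team show a regulator later which controls existed when a given update shipped.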
While developing and refining a privacy risk analysis requires a fair amount of upfront work and ongoing collaboration, its design is relatively straightforward. Yet many organizations lack this capability, not because of its intricacy, but due to other common hindrances, such as:
- Compliance myopia: Some organizations overfocus on compliance-related privacy risks while neglecting risks that arise from using consumer data in ways that depart from the stated policy and/or other promises that the company has previously made to consumers. Compliance risks are important, of course, but many types of privacy risks lurk beyond the reach of regulations.
- Reacting instead of reaching out: In some cases, privacy teams only get involved in the software development lifecycle by invitation – when a member of the development team thinks a change might create a data privacy problem. The reality is that far more changes in the development lifecycle have potential privacy impacts than many development teams realize.
- Ineffective filtering: Without a comprehensive list of privacy risks, too many software and product changes with privacy risk implications slip through the filter without being mitigated.
A sound privacy risk analysis program helps companies overcome those obstacles without interfering with the roll-out of new functionality and new products.