Safe at Any Speed: Formalizing Consumer Privacy Risk Management

President Biden’s plan to nominate Alvaro Bedoya to the U.S. Federal Trade Commission (FTC) sends a clear message that the agency and other federal regulators will further intensify their focus on consumer privacy matters, especially within technology companies.

Frequently described as a “privacy hawk,” Bedoya is the founding director of the Center on Privacy and Technology at Georgetown University Law Center, where his research has examined facial recognition systems and many other privacy issues. “Bedoya also isn’t shy about calling out Big Tech,” reports TechCrunch’s Taylor Hatmaker. “In a New York Times op-ed a few years ago, he took aim at Silicon Valley companies giving user privacy lip service in public while quietly funneling millions toward lobbyists to undermine consumer privacy.”

As regulators and legislators consider new approaches to addressing consumer privacy, CISOs and colleagues in technology and consumer products companies that use personal data should reconsider how they’re balancing their management of data privacy risks and the need for speed. Developing a privacy risk analysis capability is a useful way to address consumer privacy issues without impeding the rapid pace at which the company puts forth new products and updates of existing offerings.

Many technology companies update their products on a weekly, if not daily, basis. And many, though certainly not all, of those updates pose potential data privacy risks. The trick is to quickly and reliably identify which new products and/or updates can give rise to privacy risks and then mitigate those risks, while letting updates that have little or no impact on consumer data privacy zip through the filter.

When a company releases new code to production, that code should be subject to a privacy assessment process that evaluates data privacy risks as quickly and as unobtrusively as possible. However, when an update merely moves a button from one side of a user's screen to the other, that change likely does not need a comprehensive privacy impact assessment.

A privacy risk analysis consists of three enabling activities:

  • Inventory privacy risks: Start by developing a long list of the privacy risks the company confronts, tied to its specific offerings. When developing this list, keep in mind that data privacy issues often arise when a company begins using data in ways it has not before. A platform, app or product update might contain a new feature that relies on geolocation data to track customers, for example. If the privacy policy states that the company does not use geolocation data to track consumers, that update warrants additional scrutiny. In addition to strengthening the identification and mitigation of privacy risks, this inventory supports efficiency: it is much quicker to assess whether a software update might trigger one or more of 50 or 100 defined privacy risks than to start the assessment from scratch each time.
  • Involve your engineers: The vast majority of data privacy missteps are unintentional, and many of these mistakes occur as a second-order effect of a change that otherwise poses minimal direct risk. A slight change to an in-car app's algorithm, for instance, may have the unintended effect of letting a third party track the vehicle. These unexpected impacts are precisely why software engineers should help build the privacy risk inventory and analyze whether a product update might trigger a privacy risk. Engineers can walk their data privacy and security colleagues through discussions along the lines of: "If we make this update, the consumer will do X, which will cause Y and Z to happen."
  • Mitigate and document: When an update triggers a privacy risk, the privacy team must determine what actions will reduce that risk to an acceptable level, and that mitigation should be documented. If the car app update might let an unauthorized third party track someone's vehicle, the team should document the technical and procedural controls put in place to prevent that outcome.
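The inventory-and-filter approach described above can be sketched as a simple triage step in a release pipeline. This is a minimal illustration, not a reference to any particular tool; the risk names, update fields, and predicates below are hypothetical stand-ins for a company's own defined risks:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Update:
    """A hypothetical description of a product change entering review."""
    description: str
    data_categories: set = field(default_factory=set)  # e.g. {"geolocation"}
    new_data_use: bool = False  # uses consumer data in a way not done previously

# The defined privacy risk inventory: risk name -> predicate over an Update.
# In practice this would hold the 50 or 100 risks the company has identified.
RISK_INVENTORY: dict = {
    "geolocation tracking not covered by privacy policy":
        lambda u: "geolocation" in u.data_categories,
    "novel use of personal data departing from stated policy":
        lambda u: u.new_data_use,
}

def assess(update: Update) -> list:
    """Return the defined risks this update triggers; an empty list means
    the update can zip through the filter without a full assessment."""
    return [name for name, triggered in RISK_INVENTORY.items()
            if triggered(update)]

button_move = Update("Move button to the other side of the screen")
geo_feature = Update("New feature tracking customers by location",
                     data_categories={"geolocation"}, new_data_use=True)

print(assess(button_move))   # [] -> fast-track, no defined risk triggered
print(assess(geo_feature))   # two risks -> route to mitigation and documentation
```

The point of the sketch is the shape of the process: a fixed inventory of predicates lets most changes pass through in milliseconds, while the few that match are routed to the privacy team for mitigation and documentation.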

While developing and refining a privacy risk analysis requires a fair amount of upfront work and ongoing collaboration, its design is relatively straightforward. Yet many organizations lack this capability, not because of its intricacy, but due to other common hindrances, such as:

  • Compliance myopia: Some organizations overfocus on compliance-related privacy risks while neglecting risks that arise from using consumer data in ways that depart from the stated policy and/or other promises that the company has previously made to consumers. Compliance risks are important, of course, but many types of privacy risks lurk beyond the reach of regulations.
  • Reacting instead of reaching out: In some cases, privacy teams get involved in the software development lifecycle only by invitation – when a member of the development team thinks a change might create a data privacy problem. The reality is that far more changes in the development lifecycle have potential privacy impacts than many development teams realize.
  • Ineffective filtering: Without a comprehensive list of privacy risks, too many software and product changes with privacy risk implications slip through the filter without being mitigated.

A sound privacy risk analysis program helps companies overcome those obstacles without interfering with the roll-out of new functionality and new products.

To learn more about our privacy consulting services or our CISO Next program, contact us.


Jeffrey Sanchez

Managing Director
Security and Privacy
