The illusion of protection and SB 5062 (the Bad Washington Privacy Act)

Last updated: March 11.

Arcs giving the appearance of a spiral, but in reality concentric circles on a black, white, and gray background.
It's not actually a spiral. It's an optical illusion.

The weak, industry-backed Bad Washington Privacy Act (SB 5062) has passed the Senate and is heading to the House.  SB 5062's January Senate hearing featured plenty of sharp criticism, including examples of how it would not help people and communities who are being harmed by data abuse.

Civil rights and community groups have given similar feedback on previous iterations of the bill for years.  So have consumer groups; see Consumer Reports policy analyst Maureen Mahoney's op-ed Washington Privacy Bill needs stronger safeguards for consumers and Consumer Federation of America Director of Consumer Protection and Privacy Susan Grant's We Need Real Privacy Protection in the States, not the Washington Privacy Act's Illusion of Privacy.  The Attorney General's Office has made it clear that it thinks the bill's enforcement needs to be strengthened.

Alas, bill sponsor Sen. Reuven Carlyle rejects these perspectives, describing them as "ideologically pure politics that present perfect as the enemy of the good."  

But just how good is the Bad Washington Privacy Act?  

Let's start by looking at some of the weaknesses in SB 5062's current draft.  Next, I'll go into a case study of real-world privacy abuse to see how much the Bad Washington Privacy Act would really help; then look briefly at other scenarios.  I'll include references to specific sections of the current draft (2SSB 5062) for those who want to get deep into the details.    An appendix has suggestions for improvements to SB 5062, some of which are straightforward, others more controversial.

Loopholes and exemptions that neuter protection

A grid of grey lines and white dots on a black background.  Most people also see flashing black dots, which don't actually exist in the image.
The dots aren't actually flashing. It's an illusion.

Let's start with a few of what the Attorney General's Office described as "broad exceptions that would permit industry to sidestep the very consumer rights and obligations created by this bill” in their testimony on last year's bad bill.  Susan Grant of Consumer Federation of America was just as blunt in SB 5062's January hearing, citing "loopholes and definitional problems that essentially neuter the protections the bill is supposed to provide."  

For example:

  • Section 102 (2), clauses a b c d e f g h i j k l m include loopholes that completely exempt government agencies, student data covered by FERPA or the state user privacy in education rights act, health data covered by HIPAA or other legislation, financial data covered by Gramm-Leach-Bliley, and a whole lot more.   Update: amendments have now added airlines to the list, so now there's also a clause n.   SB 5062, now with more neutering!
  • Companies like Google, Facebook, and Amazon also get a big carve-out from  SB 5062's rules about using data for targeted advertising.  The definitional problem here is Section 101 (33) (a), which excludes advertising "based on activities within a controller's own websites or online applications."
  • Institutions of higher education have a five-year exemption as well, thanks to Section 403, which covers "non-profits and institutions of higher learning."  Washington is justifiably proud of its excellent universities, colleges, and community colleges, so I can see why the basic idea has a lot of support ... but Section 403 also exempts for-profit colleges that, as Tressie McMillan Cottom says, "target and thrive off of inequality."

Another potentially-neutering loophole is the "right to cure" in Section 112 (4), which gives corporations 30 days after the AG sends them a letter to "cure" any alleged violations.  Many companies post inaccurate descriptions of the information they share.  If they correct it within 30 days of a letter from the AG, is there a violation?  What if a company shares location data without opt-in consent, but stops if and when the AG puts them on notice?  The current version of SB 5062 is silent on whether these would be considered a cure.

And there are also very broad exemptions for cooperating with law enforcement agencies.  Section 110 (c) allows cooperation concerning conduct or activity that "may violate" federal laws or regulations.  Section 110 (g) allows cooperation for preventing, detecting, and investigating those responsible for any illegal activity.  Of course, companies holding data should comply with legal processes; but it bears repeating: SB 5062 does not require a warrant, subpoena, or court order for either of these two exceptions.

The illusion of enforceability

Four squares, each with a grid of hundreds of small hearts in the background and a large heart in the center.  The top left and bottom right squares have yellow backgrounds and small blue hearts; the larger heart in the center looks pink.  The top right and bottom left squares have blue backgrounds and small yellow hearts; the larger heart looks orange.
The large hearts are actually the same color in all four boxes. It's an illusion.

This year's Bad Washington Privacy Act does have some improvements over Sen. Carlyle's previous bad privacy bill.  For example, thanks to what Sen. Carlyle describes as the "constructive insistence of the Attorney General"  (who repeatedly testified that the 2019 and 2020 bills were unenforceable), a "per se" clause has been added.

Then again, the AGO has also repeatedly testified that giving them the sole authority to enforce the bill is a bad idea, and that "a Private Right of Action, along with AGO enforcement is the best policy for consumers."  Many civil rights and consumer organizations have said similar things.  The AGO and civil rights and consumer organizations repeated this in SB 5062's January hearing and in the February Ways & Means hearing.  Apparently, though, they haven't been insistent enough to be heard yet.

One of the reasons a private right of action is such a big deal is that SB 5062 only allocates a pittance for AGO enforcement.  The fiscal note allocates only $700,000 annually for the next two years – enough for 3.6 full-time equivalent (FTE) employees – and then decreases the allocation in future years.  The AGO projects doing only three investigations per year, which isn't a lot ... and good luck to those 1.2 FTEs taking on Facebook, Amazon, or Google!

Ireland, by contrast, budgets $23 million per year – despite having a population smaller than Washington state.  Even tiny Luxembourg spends $8 million per year!  Despite these much larger allocations, European data protection enforcement has been held back by lack of resources.  Just last week, an EU commission called for the European Commission to sue the Irish DPC for failing to enforce the GDPR.

Section 113 says that receipts from civil penalties can be used to offset costs ... but the fiscal note projects $0 in receipts from civil penalties through 2027.

And keep in mind that Facebook and Google have been breaking Washington state law on election advertising for years without the AGO being able to stop them. Why do we expect AG-only enforcement to be more successful with privacy?   Indivisible Plus Washington asked Sen. Carlyle this on Twitter but as far as I know he hasn't responded yet.

There's a good discussion of the private right of action and right to cure, along with many other topics, in Common Sense's Privacy Nuts and Bolts: How Washington lawmakers can protect our digital privacy, featuring UW's Ryan Calo, Jennifer Lee of ACLU of Washington, Stacey Gray of Future of Privacy Forum, and Maureen Mahoney of Consumer Reports Advocacy.

Would SB 5062 prevent ICE and the military from using information from Muslim prayer apps?

During the hearing Brianna Auffray of CAIR Washington said  "SB 5062 would allow for continued unconsented data collection and sale that will put Muslim and immigrant communities at serious risk of harm."

As Ms. Auffray's use of the word "continued" implies, this kind of data collection and sale is already occurring.   Last November, Joseph Cox reported in Motherboard that popular apps including Muslim Pro sold location data to X-Mode and other data brokers – who then sold it to contractors who worked for the military.  Then, just a few weeks ago, new reporting by Cox identified another set of apps, including the Salaat First prayer time app, that sold location data to Predicio, a different data broker – who then sold it to a contractor that has a history of working with ICE.

When the news about Muslim Pro first broke, Zahra Billoo (a civil rights attorney and executive director of CAIR’s San Francisco Bay Area chapter) described it as feeling like "a betrayal."  Majlis Ash-Shura, a leadership council that represents 90 New York state mosques, sent a notification urging people to delete Muslim Pro, citing “safety and data privacy.”  After the Salaat First news, Bay Area attorney Sara Mostafavi said she was "disturbed, appalled, but not surprised."

Hopefully we all agree this is the kind of thing that strong privacy legislation should hold companies accountable for!  

A screen shot of the Salaat First app for prayer times

And indeed, SB 5062 has some clauses that seem like they stop this kind of abuse:

  • For sensitive data, including location (Section 101 (31)), SB 5062 requires opt-in consent before sharing (Section 107 (8)).  
  • Sec. 107 (4) says that a controller can't share data for a different purpose than it was collected without getting additional consent.  
  • Sec. 107 (6) prohibits processing of data on the basis of religion in a way that unlawfully discriminates against the consumer.

But when you look more closely the answer isn't so clear at all.

First of all, there's the law enforcement exceptions 110 (c) and (g) I mentioned earlier.  ICE could well argue that they're investigating illegal activity, or activity that "may violate" federal laws or regulations; in which case, companies could share data they've collected for a different purpose.

Also, SB 5062 exempts government agencies (Section 102 (2) (a)).  So if ICE, the military, a fusion center, or a law enforcement agency acquires the data claiming that it's for another purpose, and then uses it for profiling, there's no accountability.

Even if one of the corporations involved has shared data inappropriately, they may not have violated SB 5062.  Remember, Section 112 (4) gives them 30 days (after the AG gives them a letter) to "cure" the violation.  So suppose they hadn't asked people to opt-in to sharing location data but did it anyway.  If they stop this unauthorized sharing within 30 days of getting a notice from the AG, would that be enough to get them off the hook?  That's a good question.  In a Twitter thread about an earlier version of this post, Stacey Gray of Future of Privacy Forum suggested that some things can't be cured.  But that's not what the bill's text currently says.

As for Section 107 (6)'s anti-discrimination clause, companies could argue that everybody's data gets processed the same way, so the processing isn't "on the basis of" religion.  And is the discrimination here actually "unlawful"?  So that's not likely to help either.

With all these uncertainties, and limited resources, would the AGO even bring a case?  If not, then the people harmed by this abuse wouldn't have any way to pursue accountability – there's no private right of action (Section 111).  What if city or county governments want to protect their residents?  Sorry, they're not allowed to (Section 112 (1)).

Variations on a theme

Overlapping disks with a geometric pattern in green, pink, and red
The disks aren't actually rotating. It's an illusion.

After diving deep into this one example it's useful to think briefly about some hypothetical variations.

Suppose the data broker and contractor, or the app writer and data broker, are affiliates?  Then SB 5062 doesn't apply (Section 101 (30) (b) (iii)).

What if one of those sleazy for-profit colleges decides to "monetize" its Muslim students' data by selling it for a purpose the students didn't consent to?  Section 403 applies.  At least for the next five years, there's no violation.

Or, suppose an app shares sensitive info like location with an airline for targeted marketing and advertising ... and the airline instead uses it to flag Muslim users for extra security checking.  Stacey Gray suggests that SB 5062 will help here because only a small percentage of users are likely to opt in to using their location for targeted advertising and marketing.  Frankly, I'm skeptical (most of the people I know will opt in to get targeted marketing like discounts or early access to events they're likely to be interested in) but suppose she's right.  It doesn't matter!  Airlines are exempt (Section 102 (2) (c)) so they don't actually have to ask people to opt in!

And while this particular case study focuses on Muslims, they're far from the only ones getting harmed today.   For example:

  • ICE gets data from utility companies without consent.
  • During the January hearing, Emilie St. Pierre said that SB 5062's opt-out approach did not protect victims of online harassment, stalking, and identity theft.
  • Non-binary, genderqueer, gender-nonconforming, and agender people get less protection because the definition of "sensitive data" in Section 101 (31) doesn't include gender.

The examples in Harms of Data Abuse also include privacy abuses affecting LGBTQIA+ dating app users; cell phone users; abortion patients; emergency room patients; Black, Indigenous, and People of Color; transgender and nonbinary individuals; migrant children and their families; political protesters; children and teenagers; and menstrual cycle tracking app users.  How effective will SB 5062 be at protecting them from similar abuses?

Even to the extent it might, just how much impact can the AGO's 3.6 FTEs really have?

Strong protection ... or just an illusion?

An optical illusion, looking somewhat like an eye, where curved lines radiating out from a center black dot appear to be moving
The curved lines aren't actually moving. It's an illusion.

Unsurprisingly, SB 5062's supporters think much more highly of the bill's consumer protections than I do.  In his op-ed, Sen. Carlyle describes the protections as "unambiguous".  During the hearing the Washington Technology Industry Association, the CEO of a Seattle online gaming company, and Microsoft used the words "strong", "very strong", and "strongest" to describe its consumer protections.   Tech companies also appreciate that the bill as currently written allows them to "operate with increased predictability" and assert confidently that it will increase consumer trust.   What's not to like?

In reality, though, the Bad Washington Privacy Act's weaknesses encourage even well-intentioned app vendors and data brokers to share data without worrying much about the consequences of violating people's privacy.  After all, most of the time nobody even notices if data's being shared somewhere people don't expect.  If and when people do notice, the bill may well have a loophole that covers it – and all you have to do is stop sharing the data and you've quite possibly "cured" the problem!  And if not, no worries, the AG's 3.6 FTEs have a lot to do, and you'll probably have them badly outnumbered if they do go after you.

And of course some big tech companies, app vendors, and data brokers are ... not always well-intentioned.

So when you hear the Bad Washington Privacy Act's supporters talk about SB 5062's "strong consumer protection", take it with a grain of salt.

Is it really strong protection... or just an illusion?

An optical illusion in which a static image appears to be moving
The image isn't actually moving. It's an illusion.

Appendix: the People's Privacy Act and potential improvements for SB 5062

The People's Privacy Act, a community-driven, people-centric alternative to the Bad Washington Privacy Act, takes a radically different approach.  The People's Privacy Act gives rights to people, not just consumers.  It was designed from the ground up in collaboration with the civil liberties and community groups in the Tech Equity Coalition, focusing on protecting people who are harmed by data abuses.  It's got a private right of action, and allows city and county attorneys to pursue claims against companies that harm Washingtonians.

It would be great if the legislature simply adopted the People's Privacy Act.  Rep. Shelley Kloba introduced it into the House as HB 1433, and it has bi-partisan co-sponsorship, but it has not yet gotten a hearing.  

So another path to strong privacy protection is to improve SB 5062.  Let's face it, history is not encouraging here.  In 2019 and 2020, the Senate passed weak, corporate-friendly bills sponsored by Sen. Carlyle.  The House listened to privacy advocates and community groups, and passed significantly stronger versions to protect consumers.  Negotiations between the chambers collapsed.  Our privacy as Washingtonians remained unprotected.

I'm sure I'm not the only person who will be really upset if that happens again.

Still, SB 5062 is the bill that's on the table now.  So it's worth a try.  Fortunately, the years of testimony and feedback from civil rights and consumer groups, and in some cases specific language from the People's Privacy Act, highlight quite a few opportunities.

Here are some possible improvements.   Some of these are very straightforward, others likely to be more controversial.    

Give rights to people, not just consumers, for example by reframing Section 103.

Remove loopholes and definitional problems.   For example:

  • Replace the exemptions for student, financial, health care (the Section 102 (2), clauses a b c d e f g h i j k l m I talked about in my testimony) with language from HB 1433 Section 11 (2) and (3), which basically says this new legislation applies where it provides stronger privacy protections for individuals than existing law and the federal laws do not preempt state laws.  Also, I realize airlines just got added, but they could certainly be removed again!
  • Include sites and apps like Google, Facebook, and Amazon within the definition of targeted advertising by removing Section 101 (33) (a).
  • Remove the warrantless law enforcement exceptions in Sections 110 (c) and (g).
  • Remove the right to cure in Section 112 (4), as California did in the recently-passed CPRA.  A right to cure drains AGO resources, and creates a perverse incentive for companies to ignore the law until they’re notified that they’re breaking it.
  • Remove or at least tighten the five-year exemption for non-profits and institutions of higher education in Section 403.  As currently written, this exemption even applies to for-profit colleges, who as Tressie McMillan Cottom says "target and thrive off of inequality."

Get rid of the confusing mix of opt-out and opt-in (a classic dark pattern!) and require that companies always get consent before using people's data (Section 107 (8)).  Opt-in creates positive incentives for companies -- they need to make it easy for users to understand the benefits and consent.  Opt-out, by contrast, gives companies incentives to make it hard for users to withdraw their consent.  Opt-out is especially problematic for disabled users (many websites do not work well for people using screenreaders or other assistive technology) and people who prefer languages other than English (most websites and apps only have English-language opt-out pages).

Add a private right of action, as the AGO and other groups have consistently requested, by removing section 111 and instead explicitly using language similar to the People's Privacy Act Section 10 (1).

Allow city attorneys and county prosecutors to enforce the law by removing the word "solely" from Section 112 (1) and adding language from People's Privacy Act Section 10 (3).

Allow for stronger local legislation, by removing the preemption clause (Section 114), and potentially replacing it with the People's Privacy Act Section 11 (1).

Split out the timely and important topic of protecting Covid-19 related data (Parts 2 and 3) into a separate bill, as several people suggested during the hearing.  New York's Contact Tracing Privacy Bill: A Promising Model, from the Brennan Center, discusses this in more detail.

That's a lot.  Still, politics is the art of the possible.  Last year, the House ITED committee strengthened the bad Senate bill substantially, and it was further strengthened on the House floor.  This year, it's going to the House Civil Rights & Judiciary committee instead; hopefully, they'll be even more attuned to the civil rights issues.  The optimistic scenario is that SB 5062 is strengthened enough in committee that it becomes the Pretty Good Washington Privacy Act, and then strengthened further still on the floor.

Of course, tech companies will still push back hard – and politically, it's a lot easier for legislators to pass a bad bill and claim it's better than it is.  So it's hard to know how things will work out.  Will we get real privacy protection, the illusion of protection, or once again nothing at all?  Time will tell!



Edit history:

The initial version of this post, in late January, focused primarily on the case study, with an emphasis on targeted advertising and marketing.

In early February, in response to some very helpful feedback from Stacey Gray on Twitter (who disagreed with some of the original analysis), as well as several people pointing out that I had skipped some basics, I added the new section on loopholes and reworked the analysis of the case study.

On February 6-7, I added new sections on enforceability and "variations on a theme" ... plus a couple more illusions!   And on February 9, I incorporated some of the fiscal analysis from the Ways & Means hearing.

In early March, I added a discussion of law enforcement exceptions, included a few more examples in the "variations on a theme" section, updated the bill's status, and added the References section.
