Last week's post, The stakes are high: Washington state privacy, facial recognition, and automated decision making legislation 2021, discussed several bills backed by the Tech Equity Coalition, a group of civil liberties and civil rights-focused organizations and individuals working to hold technology companies accountable – as well as one bill we oppose.
The Washington legislative session is in high gear, so this week there are significant updates on two of the bills:
- SB 5116, Sen. Bob Hasegawa’s bill regulating government use of automated decision making systems, had a hearing at the State Government and Elections committee featuring a lot of great testimony in support of the legislation.
- SB 5062, the Bad Washington Privacy Act, got a “do pass” recommendation from the Senate Environment, Energy, and Technology (EET) committee. Since Sen. Carlyle is the EET chair, this wasn't particularly surprising – even though there was a lot of very negative feedback about SB 5062 from Tech Equity Coalition members and others at its hearing last week.
Rounding out the list, Rep. Shelley Kloba is expected to introduce the People's Privacy Act in the House this week. And SB 5104, the facial recognition moratorium (also sponsored by Sen. Hasegawa), is still waiting for a hearing in the EET committee.
There's a lot more to say about all of this legislation. I'll return to SB 5062, the Bad Washington Privacy Act, in a separate post. Read on for a deeper dive into SB 5116.
First, though, here's a video with some important context: The Fight for the Future: Organizing The Tech Industry, featuring Dr. Timnit Gebru of Black in AI, Dr. Alex Hanna from the Ethical AI team at Google, Charlton McIlwain of NYU and the Center for Critical Race and Digital Studies, Dr. Safiya Umoja Noble of UCLA and the Center for Critical Internet Inquiry (C2i2), labor organizer Adrienne Williams, and Meredith Whittaker of NYU and AI Now. Once again, this isn't directly about the legislation, but it's very important recent context for the wave of tech worker activism.
From the description:
In December of 2020 Google fired Timnit Gebru, the co-lead of their Ethical Artificial Intelligence Team, after she refused to accept their attempted censorship of her co-authored article questioning the ethics and environmental impact of largescale AI language models. The termination sparked a new wave of organizing among Tech workers who quickly mobilized to defend Gebru against the corporate giant’s efforts to silence criticism of a key part of their business model. This organizing—following on the heels of the walk-outs against defense contracts and preceding this month’s announcement that Google workers have formed a union—offers important lessons about workers’ power within one of capitalism’s most profitable and important sectors.
And now, back to the legislative update.
SB 5116: Automated Decision Systems
"Automated decision system" means any electronic software, system, or process designed to automate, aid, or replace a decision-making process that impacts the welfare or rights of any Washington resident, and that would otherwise be performed by humans.
– SB 5116, Accountability and Transparency Standards for Automated Decision Systems
Automated decision systems play an increasingly important role in our society. Unfortunately, many of these systems embed biases against Black and Indigenous people, trans and non-binary people, disabled people, non-native English speakers, people of color, and many other groups.
What happened to Nijeer Parks is a clear example of the harms automated decision systems can cause. First, he was arrested for a crime he didn’t commit – because an automated facial recognition decision system misidentified him. Then, he was jailed because New Jersey’s no-bail automated decision system decided he was a risk.
SB 5116 sets standards for fairness and accountability, imposes transparency requirements, and prohibits government agencies from developing or using automated decision systems to discriminate against people. It’s an important step forward against these kinds of abuses. I’m delighted that the legislature is considering it — and I’m thrilled that Sen. Patty Kuderer, who represents my district, is one of the co-sponsors.
Some great testimony!
SB 5116’s hearing also included discussions of several other bills, some of which had hundreds of people signed up to testify, so committee chair Sen. Sam Hunt limited oral testimony to only six people, for one minute each. The good news is that this meant the SB 5116 section of the hearing wrapped up in time for us to watch Lady Gaga at the Inauguration!
The Bill Report includes a short summary of the very positive testimony at the hearing (as well as background and a very readable description of the bill). If you’ve got about 15 minutes, check out the video of the hearing – SB 5116 is the first topic. Here's my somewhat-longer summary:
- Jay Cunningham, a PhD student and Ethical AI researcher at the University of Washington, made an especially powerful case, and had the best soundbite of the hearing: “What we need in AI is less artificial and more human.”
- Jennifer Lee of ACLU-Washington cited the Arkansas healthcare algorithm that cut benefits to disabled people.
- Hillary Haden of the Washington Fair Trade Coalition noted this would be the first bill of its kind and would set a sound standard for transparency and fairness – especially important since Washington is justifiably looked to as a tech leader.
- Ben Winters of the Electronic Privacy Information Center brought national perspectives – and emphasized the timeliness of the issue.
- Witnesses from the Washington Association of Sheriffs and Police Chiefs and the Internet Association-Washington agreed with the goals of the legislation and the need for it, although they also wanted to make sure there are no unintended consequences for systems that are in standard use today.
Several written testimonies were also submitted in support of the bill. A few highlights:
- Alka Roy (founder of the Responsible Innovation Project and a technology and AI expert) noted that bills like these are needed "to redefine innovation to include responsibility and accountability" – and highlighted the importance of independent and periodic audits since automated systems continue to be trained on new data and evolve.
- Shasta Willson, a software engineer who's spent over two decades working in the industry, discussed how, unless care is taken to avoid bias, "these systems simply reproduce the disparities they would most idealistically remove."
- I noted that the requirements for an algorithmic accountability report should be straightforward for any responsible systems vendor to supply. Conversely, if a vendor doesn’t have this information available, "it’s a sign that they are not applying industry-standard best practices, and so their system is very likely to have significant biases."
I’m not sure where the written testimony is available on the legislature’s site — I’ll update this page with more testimony (and hopefully a link to a repository somewhere) as I find it.
There's an opportunity here!
I agree with pretty much everything the people I've quoted above have brought up ... and one aspect I especially want to highlight is the timeliness of this legislation. Stories like Nijeer Parks', or the Stanford algorithm that didn’t allocate Covid vaccines to front-line workers, are increasing awareness of the risks and harms of automated decision systems. At the same time, there's the broader context of calls for accountability, ethics, and justice for tech companies and in the AI community – and the wave of tech worker activism.
Washington is well-positioned here, with a lot of great AI researchers and allies who have been pushing hard for these principles of accountability, fairness, transparency, and justice within the AI community. It's also an advantage that many legislators are already familiar with these issues from last year's AI profiling bill and multiple years of discussion of various facial recognition bills.
So as I said in The stakes are high, there's a real chance for Washington to be a leader here.
Then again, it's early days, so we shall see. Stay tuned for more!