#2 No Invasive Police Tech
Ban on Advanced Policing Technologies
City of Bellingham Initiative No. 2021-02
AN ORDINANCE OF THE CITY OF BELLINGHAM, WASHINGTON REGARDING A PROHIBITION ON THE CITY’S ACQUISITION AND USE OF ADVANCED POLICING TECHNOLOGIES
WHEREAS, police surveillance and actions based on predicted crimes may threaten the privacy of all of us, and they have historically been used to intimidate and oppress certain communities and groups more than others, including those that are defined by a common race, ethnicity, religion, national origin, income level, sexual orientation, or political perspective.
WHEREAS, in recent years, some Police Departments across the United States have adopted advanced policing technologies, including facial recognition and predictive policing technologies.
WHEREAS, whenever possible, decisions relating to advanced policing technology should occur with strong consideration given to the impact that such technologies may have on civil rights and civil liberties.
WHEREAS, many facial recognition and predictive policing technologies used by municipal police departments rely on proprietary algorithms, which cannot be subjected to full public scrutiny.
WHEREAS, a 2020 study by the U.S. National Institute of Standards and Technology (NIST) of two hundred facial recognition algorithms concluded that they have high rates of false-positive identifications for Black people.
WHEREAS, recognizing the emerging need to protect the public safety, privacy, and civil rights of their residents, a growing number of local governments have adopted laws that prohibit the use of facial recognition and other biometric surveillance technology, and more than half a dozen U.S. cities have passed bans on the government use of facial recognition.
WHEREAS, police departments employing predictive police technology elsewhere in the United States have not demonstrated a commitment to transparency and public accountability.
WHEREAS, the application of predictive policing technology may violate the constitutional requirement that police possess reasonable suspicion before stopping individuals.
WHEREAS, Bellingham voters find that advanced policing technologies currently lack the protections needed to adequately safeguard the rights and liberties of all people.
WHEREAS, the propensity for facial recognition technology and predictive policing technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.
NOW, THEREFORE, THE CITY OF BELLINGHAM DOES ORDAIN:
Section 1. Purpose of this Article
The purpose of this Article is to ensure equal treatment of all people and to protect the privacy and civil liberties of all people in Bellingham.
Section 2. Definitions
For the purposes of this Article, the following terms have the following meanings:
- “City of Bellingham” means any department, agency, bureau, and/or subordinate division of the City of Bellingham.
- “City of Bellingham official” means any person or entity acting on behalf of the City of Bellingham, including any officer, employee, agent, contractor, subcontractor, or vendor.
- “Facial recognition” means an automated or semi-automated process that assists in identifying or verifying an individual, or that captures information about them, based on the physical characteristics of their face. Facial recognition as used here is synonymous with “facial surveillance.”
- “Facial recognition technology” means any computer software or application that performs facial recognition, including any software or application used to recognize faces. Facial recognition technology as used here is synonymous with “facial surveillance technology.”
- “Predictive policing technology” means software that is used to predict information or trends about crime or criminality in the present or future, including but not limited to the characteristics or profile of any person(s) likely to commit a crime, the identity of any person(s) likely to commit a crime, the locations or frequency of crime, or the person(s) impacted by predicted crime. Such software typically uses algorithms to sort through large data sets.
Section 3. Prohibition on the City’s Acquisition or Use of Facial Recognition Technology
It shall be unlawful for the City of Bellingham or any City of Bellingham official to:
- Obtain, retain, store, possess, access, use, or collect:
  - any facial recognition technology; or
  - any data or information derived from a facial recognition technology or other use of facial recognition;
- Enter into a contract or other agreement with any third party for the purpose of obtaining, retaining, storing, possessing, accessing, using, or collecting, by or on behalf of the City of Bellingham or any City of Bellingham official:
  - any facial recognition technology; or
  - any data or information derived from a facial recognition technology or other use of facial recognition; or
- Issue any permit or enter into a contract or other agreement that authorizes any third party to obtain, retain, store, possess, access, use, or collect:
  - any facial recognition technology; or
  - any data or information derived from a facial recognition technology or other use of facial recognition.

The inadvertent or unintentional obtainment, retention, storage, possession, access, use, or collection of any information obtained from facial recognition technology by the City of Bellingham or any City of Bellingham official shall not be a violation of this Section, provided that:
- the City of Bellingham or any City of Bellingham official did not request or solicit the obtainment, retention, storage, possession, access, use, or collection of such information;
- a designated City of Bellingham official logs such obtainment, retention, storage, possession, access, use, or collection;
- a designated City of Bellingham official publishes that information on the City Council’s website within thirty (30) days or in the agenda for the next regular meeting of the City Council, provided that such a report shall not include any personally identifiable information or other information the release of which is prohibited by law; and
- a designated City of Bellingham official notifies, by registered mail, any and all persons identified as a result of such inadvertent or unintentional obtainment, retention, storage, possession, access, use, or collection within thirty (30) days of its discovery.
Section 4. Use of Facial Recognition Technology in Criminal or Civil Proceedings is Unlawful
No data or information that is obtained, retained, stored, possessed, accessed, used, collected, or derived from any facial recognition technology or other use of facial recognition in violation of Section 3, and no evidence derived therefrom, may be used by the City of Bellingham or City of Bellingham officials as evidence in any trial, hearing, or other proceeding in or before any court, grand jury, department, officer, agency, regulatory body, legislative committee, or other authority.
Section 5. Retention of Data or Information Obtained in Violation of Section 3 is Unlawful
- Any data or information that is obtained, retained, stored, possessed, accessed, used, collected, or derived in violation of Section 3 shall be considered unlawfully obtained and shall be deleted upon discovery.
- In the event that any data or information on an individual is obtained, retained, stored, possessed, accessed, used, collected, or derived in violation of Section 3, that individual shall be notified by registered mail within thirty (30) days of the violation.
Section 6. Prohibition on the City’s Acquisition or Use of Predictive Policing Technology
It shall be unlawful for the City of Bellingham or any City of Bellingham official to:
- Obtain, retain, store, possess, access, use, or collect:
  - any predictive policing technology; or
  - any data or information derived from a predictive policing technology or other use of predictive policing; or
- Issue any permit or enter into a contract or other agreement that authorizes any third party to obtain, retain, store, possess, access, use, or collect:
  - any predictive policing technology; or
  - any data or information derived from a predictive policing technology or other use of predictive policing.

The inadvertent or unintentional obtainment, retention, storage, possession, access, use, or collection of any information obtained from predictive policing technology by the City of Bellingham or any City of Bellingham official shall not be a violation of this Section, provided that:
- the City of Bellingham or any City of Bellingham official did not request or solicit the obtainment, retention, storage, possession, access, use, or collection of such information;
- a designated City of Bellingham official logs such obtainment, retention, storage, possession, access, use, or collection;
- a designated City of Bellingham official publishes that information on the City Council’s website within thirty (30) days or in the agenda for the next regular meeting of the City Council, provided that such a report shall not include any personally identifiable information or other information the release of which is prohibited by law; and
- a designated City of Bellingham official notifies, by registered mail, any and all persons identified as a result of such inadvertent or unintentional obtainment, retention, storage, possession, access, use, or collection within thirty (30) days of its discovery.
Section 7. Use of Predictive Policing Technology in Criminal or Civil Proceedings is Unlawful
No data or information that is obtained, retained, stored, possessed, accessed, used, collected, or derived from any predictive policing technology or other use of predictive policing in violation of Section 6, and no evidence derived therefrom, may be used by the City of Bellingham or City of Bellingham officials as evidence in any trial, hearing, or other proceeding in or before any court, grand jury, department, officer, agency, regulatory body, legislative committee, or other authority.
Section 8. Retention of Data or Information Obtained in Violation of Section 6 is Unlawful
- Any data or information that is obtained, retained, stored, possessed, accessed, used, collected, or derived in violation of Section 6 shall be considered unlawfully obtained and shall be deleted upon discovery.
- In the event that any data or information on an individual is obtained, retained, stored, possessed, accessed, used, collected, or derived in violation of Section 6, that individual shall be notified by registered mail within thirty (30) days of the violation.
Section 9. Enforcement
- Any person injured by a violation of this Article may institute proceedings for relief, including a writ of mandate, in any court of competent jurisdiction to enforce this Article.
- An action instituted under this paragraph may be brought against the City of Bellingham, if necessary to effectuate compliance with this Article (including to expunge information unlawfully obtained, retained, stored, possessed, accessed, used, collected, or derived thereunder) or to redress injury suffered by an individual through violation of this Article.
- Prior to the initiation of any legal proceeding, the City of Bellingham shall be given written notice of the alleged violation(s) and an opportunity to correct such alleged violation(s) within ninety (90) days of receipt of the notice.
- If the alleged violation(s) is substantiated and subsequently cured, a notice shall be posted in a conspicuous space on the City’s website that generally describes the corrective measure(s) taken to address the violation(s).
- Notice to the injured party shall also be provided by registered mail, or by electronic substitute if consented to by the injured party.
Section 10. Exceptions and Safe Harbors
Nothing in this Article shall be construed to:
- conflict with the Constitution of the United States, the Constitution of the State of Washington, or any state or federal law;
- prohibit the use of an automated or semi-automated process for the purpose of redacting a recording for release or disclosure to protect the privacy of a subject depicted in the recording;
- prohibit the use of facial recognition or a similar biometric technique on privately owned consumer devices for personal use or security, or for commercial use or security; or
- prohibit the use of facial recognition technology by the City of Bellingham or City of Bellingham officials in managing secure entry or access to restricted buildings, rooms, or other secure spaces, devices, or things, provided that:
  - any data or information derived from such a system is only obtained, retained, stored, possessed, accessed, used, or collected with the knowledge and consent of any person authorized for such entry or access; and
  - no data or information derived from such a system about any persons not authorized for such entry or access may be obtained, retained, stored, possessed, accessed, used, or collected for any purposes other than those listed herein.
Section 11. Severability
The provisions of this Article are severable. If any provision of this Article or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
Frequently Asked Questions
Why should we ban Facial Recognition Technology (FRT)?
First of all, what is it? Facial recognition technologies give governments, companies, and individuals the power to spy on us wherever we go, enabling the persistent tracking of our faces everywhere there are networked cameras – whether operated by corporations that contract the technology out, by law enforcement, or both. This creates the ability to track us in great detail, including at protests, political rallies, places of worship, and more – chilling our democratic rights. After all, we cannot leave our faces at home. The use of facial recognition technology exacerbates the already disproportionate surveillance and criminalization of targeted communities. In the past year alone, we learned of at least three Black men who were wrongfully arrested and jailed because facial recognition software matched them to crimes they did not commit. It was only after lawsuits were brought in those cases that the use of facial recognition technology in those wrongful arrests was even disclosed.
Will banning facial recognition technologies (FRT) take away the tools cops need to be able to ‘do their jobs’?
Short answer: no, banning FRT will not prevent cops or city employees from doing their jobs. Proponents of these technologies want to make them sound non-threatening, necessary, and helpful; yet no solid data supports that stance. Analyses of facial recognition and predictive policing programs have found them to be ineffective, expensive, and incredibly harmful to marginalized communities. Bellingham should spend money addressing the things that really prevent crime: access to housing, good food, and equity for all. Facial recognition technologies, such as Clearview AI, have a demonstrated pattern of misidentifying both people of color and women. A study done by MIT found that one such technology had a 0.8% error rate on light-skinned men and a 34.7% error rate on dark-skinned women (MIT Study). There have been multiple misidentifications and wrongful arrests, and the technology is consistently abused by ICE and Border Patrol agents in conjunction with local institutions (ACLU Article), which is not something we think Bellingham should support.
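To see why even a seemingly small error rate matters, here is a minimal arithmetic sketch (the numbers are hypothetical, not figures from the studies cited above) of how false positives multiply when one face is searched against a large gallery:

```python
# Illustrative only: hypothetical numbers, not from any cited study.
# Even a "low" false-positive rate yields many wrong matches when a
# probe image is compared against a large watchlist or gallery.

def expected_false_matches(gallery_size: int, false_positive_rate: float) -> float:
    """Expected number of incorrect matches for a single search."""
    return gallery_size * false_positive_rate

# One face searched against 100,000 entries at a 0.1% false-positive rate:
print(expected_false_matches(100_000, 0.001))  # → 100.0

# If the error rate is 10x higher for one demographic group (as some
# audits have reported), so is the expected number of wrong matches:
print(expected_false_matches(100_000, 0.01))   # → 1000.0
```

At scale, most matches returned can be wrong, and any demographic gap in error rates translates directly into a gap in wrongful matches.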
What is Predictive Policing Technology (PPT)?
Like facial recognition, predictive policing technologies (PPT) further entrench systemic racism in a self-replicating cycle. Because predictive policing tools use massive amounts of historical crime data to forecast where future criminal activity will take place, they are better at predicting biased policing than at predicting actual crime. The claim is that PPTs benignly import past crime data to help law enforcement target policing activities to “hot spots” of criminal activity. But let’s be clear about what that really means: police can continue to target, profile, and over-police the same individuals and communities they have in the past. It’s no secret that policing in our country is deeply biased, and this fact has plagued our nation for years. Policing has always skewed dramatically against low-wealth, immigrant, and communities of color, as well as queer/LGBT communities. Because the past crime data that goes into these technologies is already biased, it – of course – generates results that are also biased. Put another way: garbage in, garbage out. That biased data is then manipulated by authorities and creates further inaccuracies that lead to more over-policing of our neighbors of color (see the Brookings Institution article on predictive policing). And that, as we know, can be incredibly harmful, and even deadly. Currently, there is little to no transparency with the public on the exact algorithms and elements contained in the PPT that law enforcement agencies can use – and that is just part of what makes them dangerous. Companies and agencies can draw from all kinds of sources to build their predictive and facial recognition software, including scraping data from people’s Facebook profiles, personal blogs, geolocation data that can track exactly where a person is at any given time, financial logs – you name it.
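The “garbage in, garbage out” feedback loop described above can be sketched in a few lines of code. This is a toy simulation with hypothetical numbers, not a model of any real system: both neighborhoods have identical underlying crime, but one starts with more recorded crime because it was over-policed in the past.

```python
# Illustrative simulation (hypothetical numbers): how "hot spot"
# allocation turns a biased historical record into a feedback loop.
# Both neighborhoods have the SAME true crime rate; A merely starts
# with more recorded crime because it was over-policed in the past.

TRUE_CRIME_RATE = 10         # identical underlying crime in both places
DETECTION_PER_PATROL = 0.05  # fraction of crime recorded per patrol unit
BASE_PATROLS = 8
HOT_SPOT_BONUS = 4           # extra patrols sent to the "predicted" hot spot

recorded = {"A": 60.0, "B": 40.0}  # biased starting data

for year in range(5):
    hot_spot = max(recorded, key=recorded.get)  # the tool's "prediction"
    for hood in recorded:
        patrols = BASE_PATROLS + (HOT_SPOT_BONUS if hood == hot_spot else 0)
        # What gets RECORDED depends on patrol presence, not on true crime
        recorded[hood] += TRUE_CRIME_RATE * DETECTION_PER_PATROL * patrols

print(recorded)  # → {'A': 90.0, 'B': 60.0}: the gap widened from 20 to 30
```

Because the tool keeps “confirming” its own biased starting data, the recorded-crime gap grows every year even though the true crime rates never differ.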
Black, Indigenous, People of Color, queer and LGBT, and immigrant members of our community; activists and social justice organizers; and journalists are all at risk if we do not enact a ban on these invasive technologies. Clearview has continued to collect data and build ever-scarier and creepier tools. If we don’t act now, we may well lose our chance to rein in the civil-rights-violating practices that are inherent to this kind of technology being employed against us in our daily lives. We believe Bellingham residents deserve a future that is not controlled by the greedy whims of large corporations and agencies that have demonstrated the harms of their deep institutional bias time and time again. We need to pass Initiative 2 to protect our rights and civil liberties.
More Info
Only a ban, not weak regulations, can protect Bellingham Residents.
Only a ban on racially biased and powerful facial recognition technology can protect the civil rights, civil liberties, and safety of Bellingham residents. Washington state’s facial recognition law (SB 6280) is insufficient and allows continued use of this deeply flawed technology. SB 6280 only regulates 3 uses of FRT while many other uses are freely allowed. The bill requires a warrant or court order only for “ongoing surveillance,” “persistent tracking,” and “real-time or near real-time identification.” This means that agencies can use FRT to surveil entire crowds at football stadiums, places of worship, or even on public street corners.
Bellingham is well within its jurisdiction and capability to go beyond a bill that was never supported by civil liberties proponents. SB 6280 has no preemption provision.
FRT and PPT fuel abuse of power
Law enforcement and big corporations have unlimited power to threaten, harass, harm, detain, inconvenience, and profile people they decide to target based upon predictions and guesses – which we know are prone to human error and bias – instead of actual material evidence of having already committed a crime.
When combined with the local, state, and national databases police already use, as well as social media monitoring, drone surveillance, and stingray (cell-site simulator) devices, FRT and PPT give police unprecedented power to track and target people.
You won’t know when it’s being used on you
Currently, there is no way to know whether law enforcement is targeting or stopping someone due to a facial match or a prediction made by predictive software. Most folks who’ve been unjustly stopped due to FRT or PPT have only discovered that fact after bringing a lawsuit.
Surveillance always disproportionately harms marginalized communities
Past governmental responses to white violence, such as the Oklahoma City bombing and the Columbine shooting, have disproportionately harmed marginalized communities. Government agencies often use crises as an excuse to erode our civil rights and harm our communities.
Examples: 9/11, Patriot Act, Guantanamo Bay, Japanese internment.
FRT and PPT contribute to recidivism & the further targeting of folks who have served their time.
These technologies contribute to re-incarceration (or recidivism) by continuing to enable the close targeting of those who have served their time for their crime. With the use of these incredibly invasive technologies, these individuals may never be able to be free of the ongoing harassment and surveillance of their lives by law enforcement. This is yet another violation of rights, and makes the rebuilding of one’s life after imprisonment nearly impossible. Given how many people of color are unjustly arrested and jailed in the first place, this leads to a never-ending cycle of bias in our carceral system.
Facial recognition and predictive policing are bad whether or not they are accurate.
The technology is not accurate, but even if it were, it would still pose huge threats to our democracy and constitutionally protected rights.
Facial Recognition inaccuracies:
- While facial recognition harms our democracy regardless of its inaccuracies, it is still extremely inaccurate and biased. Marginal improvements do not eliminate bias.
- It is up to 100 times more likely to misidentify Black or Asian faces than white faces. As with most unjust or wrongful stops, the harm and violence such misidentifications create often goes without remedy.
- Black women, in particular, are misidentified at significantly higher rates.
- This technology is even less reliable when identifying transgender individuals and entirely inaccurate when used on non-binary people. Mismatches at airports and borders also create great risk of harm and harassment for non-binary and trans individuals.
- Robert Julian-Borchak Williams, Nijeer Parks, and Michael Oliver – Black men who were all wrongfully arrested and jailed because face surveillance software matched them to crimes they did not commit.
Crime forecasting threatens our constitutional rights:
- Predictive policing systems seek to forecast crime even before it happens, undermining our Fourth Amendment right against “unreasonable searches and seizures” by the police.
- Even though predictive policing tools are inaccurate and biased, they make it easier for the police to claim that individuals meet the “reasonable suspicion” standard, justifying stops even if no crime has taken place.
- Predictive policing tools rely on data derived from a long history of discriminatory policing, replicating biased police practices and reinforcing over-policing of communities of color.
Lack of transparency and accountability = due process concerns
- We don’t know what data goes in and what decisions or recommendations come out.
- There is no way to correct inaccurate or discriminatory decisions (e.g., Robert Julian-Borchak Williams being wrongfully arrested due to a false facial recognition match in Detroit, MI).
- FRT and PPT create what we might refer to as a “tech-washing” of biased policing patterns. How could a wrongful stop, arrest or assault on a civilian by a law enforcement officer be viewed as biased, if it can be blamed on an algorithm? This poses huge barriers to holding law enforcement accountable for harms they may inflict.
FRT does not work well on children & Bellingham does not need it to find missing children
There are many studies showing that FRT does not work well on children or elderly people (Dec 2019 NIST study).
FRT does not stop human and child trafficking but instead enables mass surveillance of the public & sex workers
The use of facial recognition with the purported purpose of halting human trafficking will not be effective in stopping human and child trafficking (face recognition does not work well on children—see above), but could instead enable mass surveillance of the general public, eroding everyone’s privacy and civil liberties, while further marginalizing and harming sex workers, who are disproportionately from LGBTQ, BIPOC and immigrant communities.
Launching mass surveillance efforts to address human trafficking may do much more harm than good, because human trafficking is often conflated with consensual sex work. This inaccurate conflation not only does not help legitimate victims of human trafficking, but also harms the diverse sex worker community.
Major flaws in data
- Communities that are already over-policed yield higher crime data, which predictive policing technologies then leverage into more over-policing – a never-ending data loop that all but guarantees inequity in community safety.
- Police are likely to make mistakes using these flawed and biased technologies - opening up possibilities for injury, harm, corruption and cover up, and/or very expensive lawsuits against the city.
- Algorithms can’t best human emotional intelligence or relational understanding – they are very likely to make flawed predictions based on the limited and overly objective set of facts (data) they are written and machine-taught to process. This is the human-versus-robot dilemma!
Dangers of Data retention
- The data these tools employ and collect is used by federal agencies, but local police are the ones collecting it.
- Police shouldn’t be able to use biased technology, or the data generated by that technology. We don’t want Bellingham police to ask other agencies / companies to use harmful technology on their behalf.
- Using discriminatory policing data simply leads to more discrimination in communities that are already harmed by over-policing.
FRT & PPT enable an all-seeing view of people’s travels in real time, with very real concerns for the harm this may create.
- Stalking of people by law enforcement.
- Many cases of intimate partner abuse by law enforcement.
- Geolocation tracking aspects of FRT and PPT give law enforcement an intimate portrait of people’s lives – from where they live and work, to what religion they practice, the health clinic they visit, and the family and friends they associate with. This has an incredible chilling effect on people’s freedom of movement and association, as well as the potential to chill free speech by creating conditions in which people are afraid to attend public assemblies.
Key exceptions to this ban
It’s important to point out that Initiative 2 will ban the use of these invasive technologies, and the data they collect, by the City of Bellingham for at least two years. However, pressing “STOP” on this tech at the City level does not prevent its use by state and federal agencies.
↑ Back to top
First, of all - what is it? Facial Recognition Technologies give governments, companies, and individuals the power to spy on us wherever we go, enabling the persistent tracking of our faces everywhere there are networked cameras available to either corporations who contract the tech out, law enforcement, or both. This creates the ability to track us in great detail, including at protests, political rallies, places of worship, and more – chilling our democratic rights. After all, we cannot leave our faces at home. The use of facial recognition technology exacerbates the already disproportionate surveillance and criminalization of targeted communities. In this past year alone, we learned of at least 3 Black men who were wrongfully arrested and jailed because facial recognition software matched them to crimes they did not commit. And, it was only after lawsuits were brought in those cases, that the use of facial recognition technology in those wrongful arrests was even disclosed.
Short answer: no, banning FRT will not prevent cops or city employees from doing their jobs. Proponents of these technologies want to make them sound non-threatening, necessary, and helpful; yet no solid data supports that stance. Analysis of facial recognition and predictive policing programs have found them to be ineffective, expensive, and incredibly harmful to marginalized communities. Bellingham should spend money addressing the things that really prevent crime: access to housing, good food, and equity for all. Facial Recognition Technologies, such as Clearview AI, have a demonstrated impact of misidentifying both people of color and women. A study done by MIT found that the technology had a 0.8% error rate on light skinned men and a 34.7% error rate on dark skinned women (MIT Study). There have been multiple misidentifications and wrongful arrests, and the technology is consistently abused by ICE and Border Patrol agents in conjunction with local institutions (ACLU Article), which is not something we think Bellingham should support.
Like facial recognition, predictive policing technologies (PPT) further entrench systemic racism in a self-replicating cycle. Because predictive policing tools use massive amounts of historical crime data to forecast where future criminal activity will take place, they are more accurate at predicting biased policing rather than predicting actual crime. The claim is that PPTs benignly import past crime data to help law enforcement target policing activities to “hot spots” of criminal activity. But, let’s be clear about what that really means: police can continue to target, profile and over-police the same individuals and communities it has in the past. It’s no secret that policing in our country is deeply biased, and this fact has plagued our nation for years. Policing has always dramatically skewed against low-wealth immigrant and communities of color, as well as queer/lgbt communities. Because the past crime data that goes into these technologies is already biased, it – of course – generates results that are also biased. Put another way: garbage in, garbage out. That biased data is then manipulated by authorities and creates further inaccuracies that lead to more over-policing of our neighbors of color (see Brooking Institute Article on Predictive Policing). And that, as we know, can be incredibly harmful, and even deadly. Currently, there is little to no transparency with the public on the exact algorithms and elements contained in the PPT that law enforcement agencies can use– and that is just part of what makes them dangerous. Companies and agencies can draw from all kinds of sources to build their predictive and facial recognition software, including scraping data from people’s facebook profiles, personal blogs, geo-location data to be able to track exactly where a person is at any given time, financial logs, you name it. 
Black, Indigenous, People of Color, Queer and LGBT, and Immigrant members of our community; activists and social justice organizers; and journalists are all at risk if we do not enact a ban on these invasive technologies. Clearview has continued to collect data and build ever-scarier and creepier tools. If we don’t act now, we may well lose our chance to reign in these civil rights violating practices that are inherent to this kind of technology being employed against us in our daily lives. We believe Bellingham residents deserve a future that is not controlled by the greedy whims of large corporations and agencies that have demonstrated the harms of their deep institutional bias time and time again. We need to pass Initiative 2 to protect our rights and civil liberties.
Only a ban, not weak regulations, can protect Bellingham residents.
Only a ban on racially biased and powerful facial recognition technology can protect the civil rights, civil liberties, and safety of Bellingham residents. Washington State’s facial recognition law (SB 6280) is insufficient and allows continued use of this deeply flawed technology. SB 6280 regulates only three uses of FRT, while many other uses remain freely allowed. The bill requires a warrant or court order only for “ongoing surveillance,” “persistent tracking,” and “real-time or near real-time identification.” This means that agencies can still use FRT to surveil entire crowds at football stadiums, places of worship, or even on public street corners. Bellingham is well within its jurisdiction and capability to go beyond a bill that was never supported by civil liberties advocates: SB 6280 contains no preemption provision, so cities remain free to enact stricter protections.
FRT and PPT fuel abuse of power
These technologies give law enforcement and large corporations sweeping power to threaten, harass, harm, detain, inconvenience, and profile the people they decide to target based on predictions and guesses, which we know are prone to human error and bias, rather than on actual material evidence that a crime has been committed. When combined with the local, state, and national databases police already use, as well as social media monitoring, drone surveillance, and stingray (cell-site simulator) devices, FRT and PPT give police unprecedented power to track and target people.
You won’t know when it’s being used on you
Currently, there is no way to know whether law enforcement is targeting or stopping someone due to a facial match or a prediction generated by predictive software. Most folks who’ve been unjustly stopped due to FRT or PPT have only discovered that fact after bringing a lawsuit.
Surveillance always disproportionately harms marginalized communities
Past governmental responses to white violence, such as the Oklahoma City bombing and the Columbine shooting, have disproportionately harmed marginalized communities. Government agencies often use crises as an excuse to erode our civil rights and harm our communities. Examples: 9/11, Patriot Act, Guantanamo Bay, Japanese internment.
FRT and PPT contribute to recidivism & the further targeting of folks who have served their time.
These technologies contribute to re-incarceration (recidivism) by enabling continued close targeting of people who have already served their time. Under these incredibly invasive technologies, these individuals may never be free from ongoing police harassment and surveillance of their lives. This is yet another violation of rights, and it makes rebuilding one’s life after imprisonment nearly impossible. Given how many people of color are unjustly arrested and jailed in the first place, this perpetuates a never-ending cycle of bias in our carceral system.
Facial recognition and predictive policing are bad whether or not they are accurate.
The technology is not accurate, but even if it were, it would still pose huge threats to our democracy and constitutionally protected rights.
Facial Recognition inaccuracies:
- While facial recognition harms our democracy regardless of its accuracy, it remains extremely inaccurate and biased. Marginal improvements do not eliminate bias.
- Facial recognition is up to 100 times more likely to misidentify Black or Asian faces than white faces. As with most unjust or wrongful stops, the harm and violence these errors create often go without remedy.
- Black women, in particular, are misidentified at significantly higher rates.
- This technology is even less reliable when identifying transgender individuals and is entirely inaccurate when used on non-binary people. Mismatches at airports and borders also create serious risk of harm and harassment for non-binary and trans individuals.
- Robert Julian-Borchak Williams, Nijeer Parks, and Michael Oliver are Black men who were all wrongfully arrested and jailed because face surveillance software matched them to crimes they did not commit.
Crime forecasting threatens our constitutional rights:
- Predictive policing systems seek to forecast crime even before it happens, undermining our Fourth Amendment right against “unreasonable searches and seizures” by the police.
- Even though predictive policing tools are inaccurate and biased, they make it easier for the police to claim that individuals meet the “reasonable suspicion” standard, justifying stops even if no crime has taken place.
- Predictive policing tools rely on data derived from a long history of discriminatory policing, replicating biased police practices and reinforcing over-policing of communities of color.
Lack of transparency and accountability = due process concerns
- We don’t know what data goes in or what decisions and recommendations come out.
- There is no way to correct inaccurate or discriminatory decisions (e.g., Robert Julian-Borchak Williams being wrongfully arrested in Detroit, MI due to a false facial recognition match).
- FRT and PPT create what we might call a “tech-washing” of biased policing patterns. How could a wrongful stop, arrest, or assault on a civilian by a law enforcement officer be viewed as biased if it can be blamed on an algorithm? This poses huge barriers to holding law enforcement accountable for the harms they may inflict.
FRT does not work well on children & Bellingham does not need it to find missing children
Multiple studies show that FRT does not work well on children or elderly people (see, e.g., the December 2019 NIST study).
FRT does not stop human and child trafficking but instead enables mass surveillance of the public & sex workers
The use of facial recognition with the purported purpose of halting human trafficking will not be effective in stopping human and child trafficking (face recognition does not work well on children—see above), but could instead enable mass surveillance of the general public, eroding everyone’s privacy and civil liberties, while further marginalizing and harming sex workers, who are disproportionately from LGBTQ, BIPOC, and immigrant communities. Launching mass surveillance efforts to address human trafficking may do much more harm than good, because human trafficking is often conflated with consensual sex work. This inaccurate conflation not only fails to help actual victims of human trafficking, but also harms the diverse sex worker community.
Major flaws in data
- Communities that are already over-policed yield higher crime data, which predictive policing technologies then leverage into even more over-policing, a never-ending data loop that all but guarantees inequity in community safety.
- Police are likely to make mistakes using these flawed and biased technologies - opening up possibilities for injury, harm, corruption and cover up, and/or very expensive lawsuits against the city.
- Algorithms cannot match human emotional intelligence or relational understanding; they are very likely to make flawed predictions based on the limited, context-free set of facts (data) they are written and machine-trained to process. This is the human-versus-robot dilemma!
Dangers of Data retention
- The data these tools employ and collect is used by federal agencies, but local police are the ones collecting it.
- Police shouldn’t be able to use biased technology, or the data generated by that technology. We don’t want Bellingham police to ask other agencies or companies to use harmful technology on their behalf.
- Using discriminatory policing data simply leads to more discrimination in communities that are already harmed by over-policing.
FRT & PPT enable an all-seeing view of people’s travels in real time, with very real concerns for the harm this may create.
- Stalking of individuals by law enforcement officers.
- Many cases of intimate partner abuse by law enforcement.
- The geolocation tracking aspects of FRT and PPT give law enforcement an intimate portrait of people’s lives: where they live and work, what religion they practice, the health clinics they visit, and the family and friends they associate with. This has a profound chilling effect on people’s freedom of movement and association, as well as the potential to chill free speech by creating conditions in which people are afraid to attend public assemblies.
Key exceptions to this ban
It’s important to point out that Initiative 2 will ban the use of these invasive technologies, and the data they collect, by the City of Bellingham for at least two years. However, pressing “STOP” on the City’s use of this tech does not prevent state and federal agencies from continuing to use it.