Citizen Lab (Verified account)

@citizenlab

Research & development at the intersection of cyberspace, global security & human rights. Munk School of Global Affairs & Public Policy, University of Toronto

Toronto
Joined April 2009

Tweets


  1. Pinned Tweet
    Sep 1

    NEW REPORT: To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada

  2. Retweeted
    16 hours ago

    This is a brilliant op-ed by my co-author on today's new report. Key point: the human rights threat that algorithmic policing poses to marginalized groups makes it even more vital that we revive the Charter s. 15 right to equality:

  3. For more details, including recommendations for governments and police services, read the full report.

  4. Learning from these lessons, Canada has the opportunity to adopt human-rights-focused criminal justice reform. Today, the Citizen Lab and its research partner published a first-of-its-kind report that focuses on the human rights impacts of algorithmic policing technology in Canada.

  5. In 2020, the Chicago Police Department decommissioned a 10-year experimental predictive policing program. The reasons for cancelling Chicago’s program included the “unreliability” of the technology & “negative consequences” related to arrests that never resulted in convictions.

  6. In 2020, the Chief of Police in Detroit reported that facial recognition technology fails to correctly identify people “96% of the time.”

  7. But there are more important questions than cost. The human rights dangers surrounding the technology are glaring given its susceptibility to bias and inaccuracy. And lessons have been learned in the United States.

  8. Creating and maintaining police algorithms is resource intensive, requiring publicly funded human capital, tech support, and/or commercial licensing fees.

  9. These are only a few examples of the many algorithmic policing technologies under development or in use in Canada. But the true cost of experimentation with algorithmic policing is probably much higher.

  10. And the York Regional Police Service budgeted $1.68 million in 2019 to pay for a facial recognition system.

  11. The City of Calgary reportedly paid $1.4 million to Palantir Technologies Inc. to give the Calgary Police Service access to its software over three years, a figure that doesn't include amounts paid to the other technology vendors the police service is also known to use.

  12. The Saskatchewan Police Predictive Analytics Lab was reportedly expected to receive nearly $1 million over two years for its predictive policing program in Saskatoon.

  13. Exactly how much money Canadian police services have poured into experimentation with algorithmic technology is not fully known. But some details are available:

  14. Los Angeles, one of the first cities to become a testing ground for predictive policing technology, reportedly spent over USD $17 million on the program between 2010 and 2015.

  15. These technologies are not designed to redress the root causes of crime. This alone is reason to abandon public investments in such technology. But at a time when police budgets in Canada are being closely re-examined, understanding the financial costs of this tech takes on new importance.

  16. Researchers are sounding alarm bells about its human rights dangers. In known cases, it has already led to wrongful arrests of racialized individuals. Algorithms tend to replicate existing patterns of policing, including racial disparities; a toy simulation of this feedback loop appears after the thread below.

  17. Police services around the world—and now in Canada—are starting to use algorithmic technology in ways that the public has not seen before. Algorithmic policing technology is controversial. A thread on the true cost of this tech ⬇️

  18. Sep 1

    As a result, these communities are the most likely to be negatively impacted by the use of algorithmic policing technologies, and are at the greatest risk of having their constitutional and human rights violated.

  19. Sep 1

    These technologies’ algorithms are generally trained on biased historical police data that reflect anti-Black racism, anti-Indigenous racism, and discrimination against racialized individuals and communities more broadly.

  20. Sep 1

    Produced in collaboration with a partner organization, this report examines predictive policing and algorithmic surveillance technologies used by Canadian police services, concluding that they present several human rights and constitutional law issues.

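The feedback loop the thread describes, in which systems trained on biased records reproduce and reinforce existing patterns of policing, can be made concrete with a small simulation. The sketch below is illustrative only: the districts, rates, and counts are hypothetical and are not drawn from the report. It shows how a hotspot-style system that sends patrols wherever recorded incidents are highest can turn a small disparity in historical records into a large one, even when two districts' true incident rates are identical.

    # Hypothetical sketch of the feedback loop described in the thread.
    # Two districts have identical true incident rates, but district A starts
    # with slightly more *recorded* incidents because it was historically
    # patrolled more heavily. All names and numbers are made up.
    import random

    random.seed(0)

    true_rate = {"A": 0.05, "B": 0.05}   # identical underlying rates
    recorded = {"A": 12, "B": 10}        # slightly biased historical records

    for day in range(1000):
        # Hotspot-style allocation: patrol the district with the most records.
        target = max(recorded, key=recorded.get)
        # Incidents in the unpatrolled district go unobserved, so only the
        # patrolled district's count can grow, despite the equal true rates.
        if random.random() < true_rate[target]:
            recorded[target] += 1

    print(recorded)  # e.g. {'A': 62, 'B': 10}: district A now looks far "riskier"

Because the system only ever looks where it has looked before, the recorded data drift further from reality each round; this is the dynamic behind the thread's point that algorithms replicate, rather than correct, historical disparities in policing.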
