Citizen Lab Verified account

@citizenlab

Research & development at the intersection of cyberspace, global security & human rights. Munk School of Global Affairs & Public Policy, University of Toronto

Toronto
Joined April 2009

Tweets


  1. Pinned Tweet
    Sep 1

    NEW REPORT: To Surveil and Predict: A Human Rights Analysis of Algorithmic Policing in Canada

  2. Retweeted
    17 hours ago

    This is a brilliant op-ed by my co-author on today's new report. Key point: the human rights threat of algorithmic policing to marginalized groups makes it even more vital we revive Charter s15 right to equality:

  3. 19 hours ago

    For more details, including recommendations for governments and police services, read the full report.

  4. 19 hours ago

    Learning from these lessons, Canada has the opportunity to adopt human rights-focused criminal justice reform. Today, the Citizen Lab and published a first-of-its-kind report that focuses on the human rights impacts of algorithmic policing technology in Canada.

  5. 19 hours ago

    In 2020, the Chicago Police Department decommissioned a 10-year experimental predictive policing program. The reasons for cancelling Chicago’s program included the “unreliability” of the technology & “negative consequences” related to arrests that never resulted in convictions.

  6. 19 hours ago

    In 2020, the Chief of Police in Detroit reported that facial recognition technology fails to correctly identify people “96% of the time.”

  7. 19 hours ago

But there are more important questions than cost. The human rights dangers surrounding the technology are glaring given its susceptibility to bias and inaccuracy. And lessons have been learned in the United States.

  8. 19 hours ago

    Creating and maintaining police algorithms is resource intensive, requiring publicly-funded human capital, tech support, and/or commercial licensing fees.

  9. 19 hours ago

    These are only examples among many of the algorithmic policing technologies under development or in use in Canada. But the true cost of experimentation with algorithmic policing is probably much higher.

  10. 19 hours ago

    And the York Regional Police Service budgeted $1.68 million in 2019 to pay for a facial recognition system.

  11. 19 hours ago

The City of Calgary reportedly paid $1.4 million to Palantir Technologies Inc. for the Calgary Police Service over three years, a figure which doesn’t include amounts paid to other technology vendors that the police service is also known to be using.

  12. 19 hours ago

The Saskatchewan Police Predictive Analytics Lab was reportedly expected to receive nearly $1 million over two years for their predictive policing program in Saskatoon.

  13. 19 hours ago

    Accurate figures about how much money Canadian police services have poured into experimentation with algorithmic technology are not fully known. But some details are available:

  14. 19 hours ago

In Los Angeles, one of the first cities to become a testing ground for predictive policing technology, the city reportedly spent over $17 million USD between 2010 and 2015 on the program.

  15. 19 hours ago

    They're not designed to redress the root causes of crime. This alone is reason to abandon public investments in such technology. But at a time when police budgets in Canada are being closely re-examined, understanding the financial costs of this tech takes on new importance.

  16. 19 hours ago

    Researchers are sounding alarm bells about its human rights dangers. It has already led to wrongful arrests of racialized individuals in known cases. Algorithms tend to replicate existing patterns of policing, including racial disparities in policing.

  17. 19 hours ago

    Police services around the world—and now in Canada—are starting to use algorithmic technology in ways that the public has not seen before. Algorithmic policing technology is controversial. A thread on the true cost of this tech ⬇️

  18. Sep 1
  19. Sep 1

    As a result, these communities are the most likely to be negatively impacted by the use of algorithmic policing technologies, and are at the greatest risk of having their constitutional and human rights violated.

  20. Sep 1

    These technologies’ algorithms are generally trained on biased historical police data that include anti-Black racism, anti-Indigenous racism, and discrimination against racialized individuals and communities more broadly.

  21. Sep 1

    In collaboration with , this report examines predictive policing and algorithmic surveillance technologies used by Canadian police services, concluding that they present several human rights and constitutional law issues.

