Tag: “AI”

AI, Fairness and Bias
Dr. Jutta Treviranus participated in the Sight Tech Global 2020 panel "AI, Fairness and Bias" on December 2.

We Count Badges
We Count badges enabled people to showcase their proficiency in the growing fields of AI, data systems, and inclusive data practices, as well as other skills.

Environmental Scan: Assessing Inclusionary Practice in Canadian Data Services
An overview of key findings from the Data Service Provider portion of the environmental scan conducted by the We Count team in May 2020.

Ensuring Equitable Outcomes from Automated Hiring Processes
Automated hiring and matching algorithms, implemented by major corporations such as LinkedIn and Amazon, can be positioned in the wider context of machine learning/AI-driven automated processes that support the infrastructure of society. These systems inevitably result in inequitable outcomes.

Environmental Scan: Addressing Inclusionary Practice in Canadian AI Firms
An overview of key findings from the AI Firm portion of the environmental scan conducted by the We Count team in May 2020.

Environmental Scan: Canadian Postsecondary Education and AI Ethics
An overview of key findings from the Postsecondary Education portion of the environmental scan conducted by the We Count team in May 2020.

Ensuring Equitable Outcomes from Automated Hiring Processes: An Update
In this article, Antranig considers the problem in the context of corporate apologies for technological practice, and of initiatives such as data feminism that seek to transfer power from privileged groups to those at the margins of society.

Disability Bias in AI-Powered Hiring Tools Co-Design
In May, we completed our second set of co-design sessions with the Equitable Digital Systems (EDS) project, which explores how to make digital systems more inclusive for persons with disabilities in the workplace.

Exploring Bias in Hiring Tools
This video presentation examines how AI technology is used in recruitment and selection, its implications for candidates with disabilities, and questions of accessibility and diversity.

Pluralistic Data Infrastructure
The pluralistic data infrastructure supports communities in taking collective ownership of data that relates to them and in curating that data's relationships with data from other sources.