Tag: “Disability”
A Future Date Conference
Jutta Treviranus shares her keynote presentation from the Future Date Conference.
A Clusive Success Story
Clusive is a free, flexible, adaptive and customizable learning environment. It is a web application for students and teachers that addresses access and learning barriers present in digital and open learning materials.
AI, Fairness and Bias
Dr. Jutta Treviranus participated in the Sight Tech Global 2020 panel "AI, Fairness and Bias" on December 2.
Accessible Canada, Accessible World Conference
Organized by IDRC, in partnership with Concordia, the Accessible Canada, Accessible World conference on May 27-28, 2024, is an international event dedicated to advancing accessibility and inclusive design. This conference is an opportunity to co-create resources and to share knowledge and lessons learned in advancing accessibility and inclusion. We invite you to join us in this collaborative effort, and to ensure that the global accessibility agenda is guided by people with disabilities. Together, we can make Canada and the world more accessible and inclusive for everyone.
We Count Badges
We Count badges enabled people to showcase their proficiency in the growing fields of AI, data systems, and inclusive data practices, as well as other skills.
Continuing Our Work During COVID-19
A message from IDRC Director Jutta Treviranus about continuing our mission during COVID-19.
COVID-19 Vaccination Centre Data Monitor: Accessibility Map Demonstration
A prototype COVID-19 data monitor created to show how data gaps can be addressed, providing a way to find accessibility information that is not included in existing data sets.
Disability Bias in AI-Powered Hiring Tools Co-Design
In May, we completed our second set of co-design sessions for the Equitable Digital Systems (EDS) project, which explores how to make digital systems more inclusive for persons with disabilities in the workplace.
Ensuring Equitable Outcomes from Automated Hiring Processes
Automated hiring and matching algorithms, implemented by major corporations such as LinkedIn and Amazon, can be positioned in the wider context of automated processes that use machine learning/AI algorithms and support the infrastructure of society. These systems inevitably result in inequitable outcomes.
Ensuring Equitable Outcomes from Automated Hiring Processes: An Update
In this article, Antranig considers this problem in the context of corporate apologies for technological practice, and of initiatives such as data feminism that seek to transfer power from privileged groups to those at the margins of society.