BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Global Forum of Communities Discriminated on Work and Descent - GFoD - ECPv6.2.9//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:Global Forum of Communities Discriminated on Work and Descent - GFoD
X-ORIGINAL-URL:https://globalforumcdwd.org
X-WR-CALDESC:Events for Global Forum of Communities Discriminated on Work and Descent - GFoD
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20260308T020000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20261101T020000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20260318T113000
DTEND;TZID=America/New_York:20260318T124500
DTSTAMP:20260405T013433Z
CREATED:20260220T211101Z
LAST-MODIFIED:20260220T211101Z
UID:10108-1773833400-1773837900@globalforumcdwd.org
SUMMARY:Algorithmic Bias\, Gender Justice\, and Descent-Based Discrimination: Ensuring AI Works for All Women and Girls
DESCRIPTION:Gender discrimination remains deeply rooted in many societies and is often experienced through intersectional forms of inequality shaped by factors such as descent\, race\, ethnicity\, socio-economic status\, and digital access. As artificial intelligence becomes increasingly embedded across sectors\, these intersecting inequalities risk being replicated and amplified through algorithmic systems. \nTools like risk prediction algorithms\, forecasting-based law enforcement\, automated welfare allocation\, and online judicial processing are transforming decisions on entitlements\, aid delivery\, and legal remedies. Without global guidelines\, openness\, redress mechanisms\, or rights protections\, these technologies frequently amplify entrenched gender disparities\, embedding biases from past records\, enforcement histories\, and bureaucratic routines. \nFor women and girls from communities discriminated against on work and descent (CDWD)—including Dalit\, Roma\, Quilombola\, Haratine\, Burakumin\, and other similarly affected groups—the risks are particularly severe. These communities already experience structural discrimination\, over-policing\, exclusion from services\, biased judicial treatment\, and chronic under-reporting of violence. When AI models are trained on biased data reflecting caste-\, ethnicity-\, or descent-based prejudices\, the resulting systems risk: \n\nDisproportionately flagging CDWD youth as “high-risk”\nIntensifying surveillance of CDWD neighbourhoods\nAutomating exclusion from welfare or social-protection schemes\nMisclassifying or deprioritizing CDWD women’s cases of violence\nReinforcing discriminatory employment and labour-market barriers\n\nThis session delves into AI’s influence on women’s and girls’ pathways to justice\, emphasizing compounded vulnerabilities from overlapping discriminations. It will also provide a timely platform for discussion ahead of the Working Group’s forthcoming thematic report to be presented to the Human Rights Council in June 2026. \nObjectives\n\nAnalyse how current uses of AI in governance\, justice systems\, and policing reproduce gender\, caste\, ethnicity\, and descent-based inequalities.\nHighlight lived experiences of women and girls from CDWD communities regarding AI-driven exclusion\, surveillance\, and barriers to justice.\nDiscuss safeguards and regulatory frameworks needed to ensure AI strengthens—rather than undermines—women’s access to justice and protection from violence.\nExamine opportunities for AI to expand legal empowerment and support for survivors when human-rights-based digital governance is applied.\n\nTo join this event\, please register here.\nPartners\nPermanent Mission of Albania to the UN New York\nUN Working Group on Discrimination Against Women and Girls (WGDAWG)\nThe Global Forum of Communities Discriminated on Work and Descent (GFoD)\nThe Inclusivity Project (TIP)\nEuropean Union Delegation to the United Nations in New York
URL:https://globalforumcdwd.org/event/algorithmic-bias-gender-justice-and-descent-based-discrimination-ensuring-ai-works-for-all-women-and-girls/
LOCATION:Conference Room E\, UN Headquarters\, New York\, United States
ATTACH;FMTTYPE=image/png:https://globalforumcdwd.org/wp-content/uploads/2026/02/Flyer_updated.png
END:VEVENT
END:VCALENDAR