"Mitigating bias" — Ascend Magazine (York University), by sandramc
https://www.yorku.ca/ascend/article/mitigating-bias/

As artificial intelligence (AI) advances – particularly large language models (LLMs), which are increasingly integrated into social, governmental and economic systems – discriminatory stereotypes and biases persist. These prejudices reflect and reinforce historical and systemic inequalities embedded in the massive datasets that models like OpenAI's Generative Pre-trained Transformer (GPT) and Google's Gemini learn from. York University […]