"Analyzing Demographic Bias in Artificially Generated Facial Pictures" — The Persona Blog
By Jim Jansen (https://persona.qcri.org/blog/author/jim-jansen/)
https://persona.qcri.org/blog/analyzing-demographic-bias-in-artificially-generated-facial-pictures/

Artificial generation of facial images is increasingly popular, with machine learning achieving photo-realistic results. Yet there is a concern that the generated images might not fairly represent all demographic groups. This has implications for persona development when approaching the goal of generating the facial pictures for the persona profiles automatically. In research led by Joni Salminen, […]