{"id":3385,"date":"2021-03-25T11:56:50","date_gmt":"2021-03-25T11:56:50","guid":{"rendered":"https:\/\/escp.eu\/thechoice\/?p=3385"},"modified":"2021-03-30T14:34:29","modified_gmt":"2021-03-30T13:34:29","slug":"just-like-us-machines-have-biases-but-this-can-change","status":"publish","type":"post","link":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/","title":{"rendered":"Just Like Us, Machines have Biases. But This Can Change."},"content":{"rendered":"\n<p class=\"has-drop-cap drop-cap h-fs-20 h-fw-600 h-mb-20\">Artificial intelligence forces humans to face their own cognitive biases. The way of conceiving and programming machines is subjective, based on the experiences and social environment of the programmer. <\/p>\n\n\n\n<p>As AI is now ubiquitous in both our personal and professional lives, it can have a major impact on society, especially concerning discrimination. Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>A<\/strong><strong> <\/strong><strong>Data-based<\/strong><strong> <\/strong><strong>Revolutionary<\/strong><strong> <\/strong><strong>Technology<\/strong><strong><\/strong><\/h3>\n\n\n\n<p>What is&nbsp;artificial&nbsp;intelligence&nbsp;and&nbsp;how&nbsp;is&nbsp;it&nbsp;biased?&nbsp;Imagine&nbsp;a&nbsp;situation&nbsp;in which there is a need to develop a programme that is capable of determining whether an animal is a dog or a cat, based on a photo. In the past, developers would code \u201cif\u201d, \u201cwhen\u201d and \u201celse\u201d statements using criteria such as height,&nbsp;fur,&nbsp;and color to determine the nature of the species in the photo. 
Today, as <a href=\"https:\/\/www.linkedin.com\/in\/sonia1abecassis1le1lan\/?originalSubdomain=fr\" target=\"_blank\" rel=\"noreferrer noopener\">Sonia Abecassis Le Lan<\/a> of IBM explains, artificial intelligence instead uses sets of data provided to the machine to teach it what a cat or a dog should look like. From thousands of photos of cats and dogs, the machine learns to recognise the two species through the similarities between the photos. It solves the problem statistically, answering with the solution that has the highest probability of being the right one. It could, for example, conclude that a photo has a 95% probability of representing a dog. This is called machine learning, and it is the foundation of artificial intelligence.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>An Impressive Yet Flawed Technology<\/strong><\/h3>\n\n\n\n<p>Since machine learning algorithms are fed large amounts of data, their predictions are subject to bias when that data lacks diversity or relevance. A facial recognition algorithm, for example, may fail to correctly identify different faces if its database does not include sufficiently diverse entries. 
<a href=\"https:\/\/www.linkedin.com\/in\/jeremy-patrick-schneider\" target=\"_blank\" rel=\"noreferrer noopener\">Jeremy Patrick&nbsp; Schneider<\/a> from&nbsp;IBM&nbsp;Interactive&nbsp;compares&nbsp;<mark>AI to a child&nbsp;who only&nbsp;knows what&nbsp;it&nbsp;has been&nbsp;taught:&nbsp;if&nbsp;the&nbsp;child&nbsp;has&nbsp;lived&nbsp;his whole life in a&nbsp;room,&nbsp;it&nbsp;will&nbsp;only know&nbsp;how to behave in that one&nbsp;room.<\/mark> When going outside for&nbsp;the first time,&nbsp;the child will not&nbsp;be able to react accordingly&nbsp;to&nbsp;various situations,&nbsp;because of&nbsp;his&nbsp;lack of&nbsp;experience.&nbsp;The&nbsp;scope of information that&nbsp;is&nbsp;fed&nbsp;into&nbsp;the machine&nbsp;is&nbsp;thus&nbsp;important in&nbsp;order&nbsp;to&nbsp;deal with&nbsp;multiple situations that&nbsp;might&nbsp;occur.&nbsp; Another problem linked&nbsp;with&nbsp;artificial intelligence,&nbsp;according&nbsp;to&nbsp;Schneider, is&nbsp;that the&nbsp;technology&nbsp;has spread very quickly, not giving enough time for scholars, politicians and others to thoroughly and rigorously&nbsp;test&nbsp;it&nbsp;and&nbsp;implement regulations&nbsp;to restrict biases.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>From<\/strong><strong> <\/strong><strong>Humans<\/strong><strong> <\/strong><strong>to<\/strong><strong> <\/strong><strong>Machines:<\/strong><strong> <\/strong><strong>How<\/strong><strong> <\/strong><strong>Our<\/strong><strong> <\/strong><strong>Biases<\/strong><strong> <\/strong><strong>Affect<\/strong><strong> <\/strong><strong>AI<\/strong><strong><\/strong><\/h3>\n\n\n\n<p>The technology itself is not the one to be blamed, the humans who created it are. Every human being&nbsp;has unconscious&nbsp;biases that&nbsp;are&nbsp;developed through&nbsp;their&nbsp;life:&nbsp; their culture, education, and experiences combine together to create cognitive biases. Machines merely reproduce them. 
<a href=\"https:\/\/www.linkedin.com\/in\/micka%C3%ABl-dell-ova-he-him-his-3435b260\" target=\"_blank\" rel=\"noreferrer noopener\">Mickael Dell\u2019ova<\/a> at Ubisoft gives the example of a well-intentioned colleague who wanted to make an inclusive video&nbsp;game&nbsp;by adding&nbsp;a&nbsp;lesbian character as the main protagonist of their triple A grand strategy game. His colleague thought it would be inclusive if the lesbian character had stereotypical short hair and a Perfecto leather jacket. This cartoonish representation, despite the colleague\u2019s best intentions,&nbsp;shows a&nbsp;clear&nbsp;unconscious bias. Such biases are mostly accidental, but being aware of them could help reduce their frequency.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>If machine bias is ignored, products will be biased as well.<\/p><\/blockquote>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>The<\/strong><strong> <\/strong><strong>Changes<\/strong><strong> <\/strong><strong>That<\/strong><strong> <\/strong><strong>Can<\/strong><strong> <\/strong><strong>Be Made<\/strong><\/h3>\n\n\n\n<p>The first improvement that could be made is <mark>to give artificial intelligence access to&nbsp;diversity&nbsp;when&nbsp;fed with&nbsp;data.&nbsp;<\/mark> <a href=\"https:\/\/www.linkedin.com\/in\/ir%C3%A8ne-balm%C3%A8s-9b853999\/\" target=\"_blank\" rel=\"noreferrer noopener\">Ir\u00e8ne Balm\u00e8s<\/a>&nbsp;underlines&nbsp;that&nbsp;algorithm designers need to carefully select the data given to AI and to clean it of bias through multiple&nbsp;tests&nbsp;and&nbsp;checks. More&nbsp;diversified&nbsp;teams&nbsp;would&nbsp;be&nbsp;a&nbsp;real&nbsp;asset&nbsp;in&nbsp;bias detection. Team diversification is only one solution. Educating the teams by providing training on ethical matters can help curb biases. 
As Balm\u00e8s says, it is unfortunate that there is not more communication between the fields of engineering and social science. Technology has always been questioned from a philosophical perspective; where artificial intelligence is concerned, it is fundamental to think about the ethical issues as well and to include the political perspective. Finally, to help build diverse teams, the recruiting process should be reviewed, from the job description down to the hiring phase.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">What To Remember<\/h3>\n\n\n\n<p>Machines have biases. They merely reproduce the cognitive biases of the algorithm designers, biases that are deeply rooted in our society because of systemic discrimination. But this can change with more diversity and inclusion initiatives in the technology industry. We would like to take this as an opportunity to reflect on our own judgments and ways of thinking, and on how they can affect others around us.<\/p>\n\n\n\n<p>The above article has been derived from a webinar organised by <a href=\"https:\/\/www.lgbt-talents.com\/\" target=\"_blank\" rel=\"noreferrer noopener\">LGBT Talents<\/a>, with the following speakers: <a href=\"https:\/\/www.linkedin.com\/in\/sonia1abecassis1le1lan\/?originalSubdomain=fr\" target=\"_blank\" rel=\"noreferrer noopener\">Sonia Abecassis Le Lan<\/a>, <a href=\"https:\/\/www.linkedin.com\/in\/ir%C3%A8ne-balm%C3%A8s-9b853999\/\" target=\"_blank\" rel=\"noreferrer noopener\">Ir\u00e8ne Balm\u00e8s<\/a>, <a href=\"https:\/\/www.linkedin.com\/in\/jeremy-patrick-schneider\" target=\"_blank\" rel=\"noreferrer noopener\">Jeremy Patrick Schneider<\/a> and <a href=\"https:\/\/www.linkedin.com\/in\/micka%C3%ABl-dell-ova-he-him-his-3435b260\" target=\"_blank\" rel=\"noreferrer noopener\">Micka\u00ebl 
Dell\u2019ova<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.  <\/p>\n","protected":false},"author":1,"featured_media":3387,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[14],"tags":[95,27],"class_list":["post-3385","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tomorrow-choices","tag-inclusion-and-diversity","tag-tech","category-14","description-off"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v25.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Just Like Us, Machines have Biases. But This Can Change. - The Choice by ESCP<\/title>\n<meta name=\"description\" content=\"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\" \/>\n<meta property=\"og:locale\" content=\"en_GB\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Just Like Us, Machines have Biases. But This Can Change. 
- The Choice by ESCP\" \/>\n<meta property=\"og:description\" content=\"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\" \/>\n<meta property=\"og:site_name\" content=\"The Choice by ESCP\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/ESCPbs\/\" \/>\n<meta property=\"article:published_time\" content=\"2021-03-25T11:56:50+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2021-03-30T13:34:29+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"556\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"The Choice Team\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@ESCP_bs\" \/>\n<meta name=\"twitter:site\" content=\"@ESCP_bs\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Jenny Lim &amp; Claire Pl\u00e9\" \/>\n\t<meta name=\"twitter:label2\" content=\"Estimated reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"4 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\"},\"author\":{\"name\":\"The Choice 
Team\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/d777675eb749ba0781e5ac5b056ea5c3\"},\"headline\":\"Just Like Us, Machines have Biases. But This Can Change.\",\"datePublished\":\"2021-03-25T11:56:50+00:00\",\"dateModified\":\"2021-03-30T13:34:29+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\"},\"wordCount\":1021,\"publisher\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/#organization\"},\"image\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg\",\"keywords\":[\"#InclusionDiversity\",\"#Tech\"],\"articleSection\":[\"Tomorrow's Choices, Today\"],\"inLanguage\":\"en-GB\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\",\"url\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\",\"name\":\"Just Like Us, Machines have Biases. But This Can Change. 
- The Choice by ESCP\",\"isPartOf\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg\",\"datePublished\":\"2021-03-25T11:56:50+00:00\",\"dateModified\":\"2021-03-30T13:34:29+00:00\",\"description\":\"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.\",\"breadcrumb\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#breadcrumb\"},\"inLanguage\":\"en-GB\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-GB\",\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage\",\"url\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg\",\"contentUrl\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg\",\"width\":1280,\"height\":556,\"caption\":\"Feature photo by Magele\/AdobeStock\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/escp.eu\/thechoice\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Just Like Us, Machines have Biases. 
But This Can Change.\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#website\",\"url\":\"https:\/\/escp.eu\/thechoice\/\",\"name\":\"The Choice by ESCP\",\"description\":\"The new media dedicated to choice makers\",\"publisher\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/escp.eu\/thechoice\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-GB\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#organization\",\"name\":\"ESCP Business School\",\"url\":\"https:\/\/escp.eu\/thechoice\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-GB\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice-main-image-logo.jpg\",\"contentUrl\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice-main-image-logo.jpg\",\"width\":1200,\"height\":800,\"caption\":\"ESCP Business School\"},\"image\":{\"@id\":\"https:\/\/escp.eu\/thechoice\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/ESCPbs\/\",\"https:\/\/x.com\/ESCP_bs\",\"https:\/\/www.instagram.com\/escpbs\/\",\"https:\/\/www.linkedin.com\/school\/escp-business-school\/\",\"https:\/\/www.youtube.com\/c\/ESCPBusinessSchool\",\"https:\/\/fr.wikipedia.org\/wiki\/ESCP_Business_School\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/d777675eb749ba0781e5ac5b056ea5c3\",\"name\":\"The Choice 
Team\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-GB\",\"@id\":\"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice_avatar.png\",\"contentUrl\":\"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice_avatar.png\",\"caption\":\"The Choice Team\"},\"sameAs\":[\"https:\/\/escp.eu\/thechoice\"],\"url\":\"https:\/\/www.linkedin.com\/in\/jenny-lim-escpeurope\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Just Like Us, Machines have Biases. But This Can Change. - The Choice by ESCP","description":"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/","og_locale":"en_GB","og_type":"article","og_title":"Just Like Us, Machines have Biases. But This Can Change. 
- The Choice by ESCP","og_description":"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI programming.","og_url":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/","og_site_name":"The Choice by ESCP","article_publisher":"https:\/\/www.facebook.com\/ESCPbs\/","article_published_time":"2021-03-25T11:56:50+00:00","article_modified_time":"2021-03-30T13:34:29+00:00","og_image":[{"width":1280,"height":556,"url":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg","type":"image\/jpeg"}],"author":"The Choice Team","twitter_card":"summary_large_image","twitter_creator":"@ESCP_bs","twitter_site":"@ESCP_bs","twitter_misc":{"Written by":"Jenny Lim &amp; Claire Pl\u00e9","Estimated reading time":"4 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#article","isPartOf":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/"},"author":{"name":"The Choice Team","@id":"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/d777675eb749ba0781e5ac5b056ea5c3"},"headline":"Just Like Us, Machines have Biases. 
But This Can Change.","datePublished":"2021-03-25T11:56:50+00:00","dateModified":"2021-03-30T13:34:29+00:00","mainEntityOfPage":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/"},"wordCount":1021,"publisher":{"@id":"https:\/\/escp.eu\/thechoice\/#organization"},"image":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage"},"thumbnailUrl":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg","keywords":["#InclusionDiversity","#Tech"],"articleSection":["Tomorrow's Choices, Today"],"inLanguage":"en-GB"},{"@type":"WebPage","@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/","url":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/","name":"Just Like Us, Machines have Biases. But This Can Change. - The Choice by ESCP","isPartOf":{"@id":"https:\/\/escp.eu\/thechoice\/#website"},"primaryImageOfPage":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage"},"image":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage"},"thumbnailUrl":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg","datePublished":"2021-03-25T11:56:50+00:00","dateModified":"2021-03-30T13:34:29+00:00","description":"Since human bias is at the source of the discrimination, a way to reduce it may be to teach the engineers who are doing the AI 
programming.","breadcrumb":{"@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#breadcrumb"},"inLanguage":"en-GB","potentialAction":[{"@type":"ReadAction","target":["https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/"]}]},{"@type":"ImageObject","inLanguage":"en-GB","@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#primaryimage","url":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg","contentUrl":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/magele-adobestock.jpg","width":1280,"height":556,"caption":"Feature photo by Magele\/AdobeStock"},{"@type":"BreadcrumbList","@id":"https:\/\/escp.eu\/thechoice\/tomorrow-choices\/just-like-us-machines-have-biases-but-this-can-change\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/escp.eu\/thechoice\/"},{"@type":"ListItem","position":2,"name":"Just Like Us, Machines have Biases. 
But This Can Change."}]},{"@type":"WebSite","@id":"https:\/\/escp.eu\/thechoice\/#website","url":"https:\/\/escp.eu\/thechoice\/","name":"The Choice by ESCP","description":"The new media dedicated to choice makers","publisher":{"@id":"https:\/\/escp.eu\/thechoice\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/escp.eu\/thechoice\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-GB"},{"@type":"Organization","@id":"https:\/\/escp.eu\/thechoice\/#organization","name":"ESCP Business School","url":"https:\/\/escp.eu\/thechoice\/","logo":{"@type":"ImageObject","inLanguage":"en-GB","@id":"https:\/\/escp.eu\/thechoice\/#\/schema\/logo\/image\/","url":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice-main-image-logo.jpg","contentUrl":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice-main-image-logo.jpg","width":1200,"height":800,"caption":"ESCP Business School"},"image":{"@id":"https:\/\/escp.eu\/thechoice\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/ESCPbs\/","https:\/\/x.com\/ESCP_bs","https:\/\/www.instagram.com\/escpbs\/","https:\/\/www.linkedin.com\/school\/escp-business-school\/","https:\/\/www.youtube.com\/c\/ESCPBusinessSchool","https:\/\/fr.wikipedia.org\/wiki\/ESCP_Business_School"]},{"@type":"Person","@id":"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/d777675eb749ba0781e5ac5b056ea5c3","name":"The Choice Team","image":{"@type":"ImageObject","inLanguage":"en-GB","@id":"https:\/\/escp.eu\/thechoice\/#\/schema\/person\/image\/","url":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice_avatar.png","contentUrl":"https:\/\/escp.eu\/thechoice\/wp-content\/uploads\/the-choice_avatar.png","caption":"The Choice 
Team"},"sameAs":["https:\/\/escp.eu\/thechoice"],"url":"https:\/\/www.linkedin.com\/in\/jenny-lim-escpeurope\/"}]}},"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/posts\/3385","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/comments?post=3385"}],"version-history":[{"count":0,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/posts\/3385\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/media\/3387"}],"wp:attachment":[{"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/media?parent=3385"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/categories?post=3385"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/escp.eu\/thechoice\/wp-json\/wp\/v2\/tags?post=3385"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}