{"id":4411,"date":"2020-08-04T12:32:05","date_gmt":"2020-08-04T04:02:05","guid":{"rendered":"https:\/\/people.utm.my\/razman-ayop\/?p=4411"},"modified":"2020-08-04T12:33:01","modified_gmt":"2020-08-04T04:03:01","slug":"this-algorithm-might-make-facial-recognition-obsolete","status":"publish","type":"post","link":"https:\/\/people.utm.my\/razman-ayop\/this-algorithm-might-make-facial-recognition-obsolete\/","title":{"rendered":"This Algorithm Might Make Facial Recognition Obsolete"},"content":{"rendered":"<div class=\"sc-157agsr-0 krcHQb\">\n<aside class=\"ynugv2-2 fpPzXA\">\n<div class=\"sc-101yw2y-4 QUUqu\">\n<div id=\"sidebar_wrapper\" class=\"ynugv2-3 kxHOoc\">\n<div class=\"js_newsletter-container\"><a href='https:\/\/gizmodo.com\/this-algorithm-might-make-facial-recognition-obsolete-1844591686' class='small-button smallsilver' target=\"_blank\">Link<\/a><\/div>\n<\/div>\n<\/div>\n<\/aside>\n<div class=\"sc-11qwj9y-1 hmBpsa\">\n<div class=\"js_starterpost\">\n<div class=\"sc-1fofo4n-0 cdfDtI\">\n<div class=\"sc-83o472-1 bAZwNf\">\n<div class=\"sc-83o472-0 iYQxuq\">\n<div class=\"sc-1jc3ukb-0 kLkUqM\">\n<div class=\"sc-1jc3ukb-2 fUsAEy\">\n<div class=\"sc-1jc3ukb-3 hOuZKZ\">\n<div class=\"sc-1mep9y1-0 sc-1ixdk2y-0 fCQpxO\"><a class=\"sc-1out364-0 hMndXN js_link\" href=\"https:\/\/gizmodo.com\/author\/swodinsky\" data-ga=\"[[&quot;Permalink meta&quot;,&quot;Author click&quot;,&quot;https:\/\/kinja.com\/swodinsky&quot;]]\">Shoshana Wodinsky<\/a><\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"r43lxo-0 hEDDLA js_post-content\">\n<figure class=\"sc-1eow4w5-1 dhDQnh align--bleed js_lazy-image js_marquee-assetfigure\" data-id=\"uj2hrvj4kqippfvphhht\" data-recommend-id=\"image:\/\/uj2hrvj4kqippfvphhht\" data-format=\"png\" data-width=\"866\" 
data-height=\"485\" data-lightbox=\"true\" data-recommended=\"false\">\n<div class=\"sc-1eow4w5-3 lktKQM image-hydration-wrapper\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full\" src=\"https:\/\/i.kinja-img.com\/gawker-media\/image\/upload\/c_scale,f_auto,fl_progressive,pg_1,q_80,w_800\/uj2hrvj4kqippfvphhht.png\" width=\"800\" height=\"448\" \/><\/div><figcaption class=\"sc-7s1ndr-0 cqfyTB no-caption\">Graphic:\u00a0University of Chicago\u00a0(<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"http:\/\/sandlab.cs.uchicago.edu\/fawkes\" target=\"_blank\" rel=\"noopener noreferrer\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;External link&quot;,&quot;http:\/\/sandlab.cs.uchicago.edu\/fawkes&quot;,{&quot;metric25&quot;:1}]]\">Fair Use<\/a>)<\/figcaption><\/figure>\n<p class=\"sc-77igqf-0 bOfvBY\">In 2020, it\u2019s worth assuming that every status update and selfie you upload online can eventually make its way into the hands of an obscure data-mining\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/gizmodo.com\/how-to-track-the-tech-thats-tracking-you-every-day-1843908029\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;Internal link&quot;,&quot;https:\/\/gizmodo.com\/how-to-track-the-tech-thats-tracking-you-every-day-1843908029&quot;,{&quot;metric25&quot;:1}]]\">third party<\/a>, into the hands of\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/gizmodo.com\/your-phone-is-a-goldmine-of-hidden-data-for-cops-heres-1843817740\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;Internal link&quot;,&quot;https:\/\/gizmodo.com\/your-phone-is-a-goldmine-of-hidden-data-for-cops-heres-1843817740&quot;,{&quot;metric25&quot;:1}]]\">national authorities<\/a>, or both.<\/p>\n<p class=\"sc-77igqf-0 bOfvBY\">On the flip side, being aware of exactly how shitty these companies are has prompted a lot of folks to come up with new, creative ways to slip out of this sort of surveillance. 
And while some of these methods\u2014like, say, wearing\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/newatlas.com\/computers\/face-masks-block-facial-recognition-technology-nist-study\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;External link&quot;,&quot;https:\/\/newatlas.com\/computers\/face-masks-block-facial-recognition-technology-nist-study\/&quot;,{&quot;metric25&quot;:1}]]\">masks<\/a>\u00a0or loading up on\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/www.codastory.com\/authoritarian-tech\/london-facial-recognition-facepaint\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;External link&quot;,&quot;https:\/\/www.codastory.com\/authoritarian-tech\/london-facial-recognition-facepaint\/&quot;,{&quot;metric25&quot;:1}]]\">face paint<\/a>\u00a0could\u00a0<em>theoretically<\/em>\u00a0keep your photos from being pilfered, you\u2019re also left with photos that don\u2019t look much like you at all. 
But now, a team from the University of Chicago has come up with a much subtler tactic that still effectively fights back against these sorts of snooping algorithms.<\/p>\n<p class=\"sc-77igqf-0 bOfvBY\">The system is called \u201cFawkes\u201d\u2014an homage to the Guy Fawkes mask that\u2019s become\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/gizmodo.com\/a-guy-fawkes-mask-for-every-skin-tone-gender-culture-740175233#!\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;Internal link&quot;,&quot;https:\/\/gizmodo.com\/a-guy-fawkes-mask-for-every-skin-tone-gender-culture-740175233#!&quot;,{&quot;metric25&quot;:1}]]\">somewhat synonymous<\/a>\u00a0with the aptly named online collective Anonymous\u2014and the Chicago team started working on it at the tail end of last year as a way to thwart companies like\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"https:\/\/gizmodo.com\/shady-face-recognition-firm-clearview-ai-says-its-left-1844287344\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;Internal link&quot;,&quot;https:\/\/gizmodo.com\/shady-face-recognition-firm-clearview-ai-says-its-left-1844287344&quot;,{&quot;metric25&quot;:1}]]\">Clearview AI<\/a>\u00a0that compile their face-filled databases by scraping public posts.<\/p>\n<p class=\"sc-77igqf-0 bOfvBY\">\u201cIt is our belief that Clearview.ai is likely only the (rather large) tip of the iceberg,\u201d the team\u00a0<a class=\"sc-1out364-0 hMndXN sc-145m8ut-0 kVnoAv js_link\" href=\"http:\/\/sandlab.cs.uchicago.edu\/fawkes\/#press\" target=\"_blank\" rel=\"noopener noreferrer\" data-ga=\"[[&quot;Embedded Url&quot;,&quot;External link&quot;,&quot;http:\/\/sandlab.cs.uchicago.edu\/fawkes\/#press&quot;,{&quot;metric25&quot;:1}]]\">wrote<\/a>. 
\u201cIf we can reduce the accuracy of these models to make them untrustworthy, or force the model\u2019s owners to pay significant per-person costs to maintain accuracy, then we would have largely succeeded.\u201d<\/p>\n<p class=\"sc-77igqf-0 bOfvBY\">See, when a facial recognition system like Clearview\u2019s is trained to recognize a given person\u2019s appearance, that recognition happens by connecting one picture of a face (e.g., from a Facebook profile) to another picture of a face (e.g., from a passport photo) and finding similarities between the two photos. According to the Chicago team, this means not only finding matching facial geometry, hair color, or moles, but also picking up on invisible relationships between the pixels that make up a digital picture of that face.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>Shoshana Wodinsky Graphic:\u00a0University of Chicago\u00a0(Fair Use) In 2020, it\u2019s worth assuming that every status update and selfie you upload online can eventually make its way into the hands of an obscure data-mining\u00a0third party, into the hands of\u00a0national authorities, or both. 
On the flip side, being aware of exactly how shitty these companies are has prompted [&hellip;]<\/p>\n","protected":false},"author":24954,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[1],"tags":[],"class_list":["post-4411","post","type-post","status-publish","format-standard","hentry","category-variety"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/posts\/4411","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/users\/24954"}],"replies":[{"embeddable":true,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/comments?post=4411"}],"version-history":[{"count":2,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/posts\/4411\/revisions"}],"predecessor-version":[{"id":4413,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/posts\/4411\/revisions\/4413"}],"wp:attachment":[{"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/media?parent=4411"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/categories?post=4411"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/people.utm.my\/razman-ayop\/wp-json\/wp\/v2\/tags?post=4411"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}