
The challenges of biometrics

Biometrics covers all the systems that identify a person from their body shape, their voice, or even the way they walk or smell, although the latter are not yet as widespread as the former. It is becoming increasingly common to unlock a mobile phone with a fingerprint or with facial or voice recognition. This data is considered “sensitive” under the European General Data Protection Regulation and therefore cannot be used unless its owner has given consent or unless public security is at stake. Many applications we now use on our mobile phones – to add a pig’s nose or donkey’s ears to a photo, for example – can collect our biometric data. “I don’t use these applications. Those who do should read the conditions of use very carefully so that they know what they are consenting to and what happens to the images,” insists Jordi Soria from the Catalan Data Protection Authority.

The issue of facial recognition is especially controversial. The most famous case is that of Clearview, which is involved in international lawsuits because it collects videos and images from social media, creating a huge database that can then be sold to governments and their security and military forces. “If you upload a photo of yourself, they will then take it without your consent, and from then on they could identify you through video surveillance systems with artificial intelligence,” says Soria. The Ukrainian defence ministry has said that it is using this controversial system to help identify deceased people as well as Russian soldiers.

It is clear that some kind of regulation is needed to counter this kind of Big Brother able to see everything. A proposed regulation on artificial intelligence is currently going through the European Parliament with the idea of banning the use of these techniques, “except in specific cases that have to do with public safety, such as investigating crimes,” according to Soria, who says the aim is to prevent the indiscriminate use of such technology.
