Facewatch, a major biometric security company in the UK, is in hot water after its facial recognition system caused a major snafu: it wrongly identified a 19-year-old woman as a shoplifter.

  • cheesepotatoes@lemmy.world · 7 months ago

    You lack imagination. What happens when the system mistakenly identifies someone as a violent offender and they get tackled by a bunch of cops, likely resulting in bodily injury?

      • PM_Your_Nudes_Please@lemmy.world · 7 months ago

        I mean, the article points out that the lady in the headline isn’t the only one who has been affected; a dude was falsely detained by cops after they parked a facial recognition van on a street corner and grabbed anyone who was flagged.

      • blusterydayve26@midwest.social · 7 months ago

        That’s not very reassuring; we’re still only one computer bug away from that situation.

        Presumably she wasn’t identified as a violent criminal only because the facial recognition system hadn’t associated her double’s record with that particular crime. The system is perfectly capable of attaching any set of crimes to a face; it’s not like you get a whole new face for each possible crime. So we’re still one computer bug away from seeing that outcome.
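        To make the mechanism concrete, here’s a minimal sketch of a generic embedding-distance watchlist lookup (plain Python; the records, embeddings, and threshold are all invented for illustration, not Facewatch’s actual system). The point: a match returns whatever offences happen to be attached to the winning record, so a single misidentification surfaces the whole record, violent flags included.

        ```python
        import math

        # Hypothetical watchlist: one embedding per enrolled face, plus whatever
        # offences staff attached to that record. This is the generic shape of
        # an embedding watchlist, not Facewatch's real data model.
        WATCHLIST = {
            "record_1042": {
                "embedding": [0.12, 0.87, 0.45, 0.31],
                "offences": ["shoplifting"],
            },
            "record_2913": {
                "embedding": [0.11, 0.88, 0.44, 0.33],
                "offences": ["assault", "shoplifting"],  # violent flag attached
            },
        }

        MATCH_THRESHOLD = 0.99  # assumed cosine-similarity cut-off


        def cosine_similarity(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            norm_a = math.sqrt(sum(x * x for x in a))
            norm_b = math.sqrt(sum(x * x for x in b))
            return dot / (norm_a * norm_b)


        def lookup(probe):
            """Return (record_id, offences) for the best match above threshold."""
            best_id, best_sim = None, 0.0
            for record_id, record in WATCHLIST.items():
                sim = cosine_similarity(probe, record["embedding"])
                if sim > best_sim:
                    best_id, best_sim = record_id, sim
            if best_id is not None and best_sim >= MATCH_THRESHOLD:
                return best_id, WATCHLIST[best_id]["offences"]
            return None


        # An innocent shopper whose embedding happens to land near both records:
        probe = [0.115, 0.875, 0.445, 0.32]
        match = lookup(probe)
        if match:
            record_id, offences = match
            # Which offences get flagged depends entirely on which record wins
            # the similarity contest, not on anything the shopper actually did.
            print(f"ALERT: matched {record_id}, flagged for {offences}")
        ```

        Nothing in the face itself distinguishes “shoplifting record” from “assault record”; swap which enrolled embedding sits closest and the violent flag fires instead.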

      • cheesepotatoes@lemmy.world · 7 months ago

        No, it wouldn’t be. The base circumstance is the same: the software misidentified a subject. The severity and context will vary from incident to incident, but the root cause is the same - false positives.

        There’s no process in place to prevent something like this from going very, very badly. It’s random chance that this time it was only a false positive for theft. Until there’s a legislative obligation (such as legal liability) forcing the company to create procedures and processes for identifying and reviewing false positives, it’s only a matter of time before someone gets hurt. (See the sketch at the end of this comment for what such a review process might look like.)

        You don’t wait for someone to die before you start implementing safety nets. Or rather, you shouldn’t.
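        As promised above, a minimal sketch of the kind of false-positive safety net legislation could mandate (the thresholds, names, and queue are illustrative assumptions, not any deployed system): matches in an uncertain band go to a human reviewer before anyone is approached, and every alert is logged so false-positive rates can actually be measured.

        ```python
        from dataclasses import dataclass, field

        HIGH_CONFIDENCE = 0.9995  # assumed: above this, staff are alerted at once
        REVIEW_BAND = 0.99        # assumed: between the two bounds, a human must
                                  # confirm the match before anyone is approached


        @dataclass
        class Alert:
            record_id: str
            similarity: float
            status: str = "pending"


        @dataclass
        class ReviewQueue:
            pending: list = field(default_factory=list)
            audit_log: list = field(default_factory=list)

            def route(self, alert):
                """Decide what happens to a match before anyone acts on it."""
                if alert.similarity >= HIGH_CONFIDENCE:
                    alert.status = "escalated"          # still logged for audit
                elif alert.similarity >= REVIEW_BAND:
                    alert.status = "needs_human_review"
                    self.pending.append(alert)          # held until a person confirms
                else:
                    alert.status = "discarded"
                self.audit_log.append(alert)            # every alert is auditable, so
                return alert.status                     # false-positive rates can be measured


        queue = ReviewQueue()
        print(queue.route(Alert("record_2913", 0.9992)))  # needs_human_review
        print(queue.route(Alert("record_1042", 0.9999)))  # escalated
        ```

        The audit log is the part legislation would actually bite on: without a record of every alert, including the discarded ones, there’s no way to prove a false-positive rate or hold anyone liable for it.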