New research shows driverless car software is significantly more accurate with adults and light-skinned people than with children and dark-skinned people.

    • PetDinosaurs@lemmy.world · 1 year ago

      It’s not a discriminatory bias, nor even one that much can really be done about.

      It’s purely physics.

      Is it harder to track smaller objects or larger ones? Smaller, always.

      Is it harder for an optical system to track something darker? In any natural scene, universally yes.

      In engineering terms: smaller and darker individuals have less signal. Less signal means a lower probability of detection, even if your training data is perfectly sampled, and even if you’re not using ML at all.

      It’s the same reason a stealth bomber is harder to track than a passenger plane. Less signal.
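
      To put rough numbers on “less signal”, here’s a toy sketch (my own illustration, nothing from any real AV pipeline; the camera and detector parameters are invented) of how apparent size and contrast feed into a detection score:

      ```python
      import math

      def detection_probability(height_m, distance_m, contrast,
                                focal_px=1000.0, k=0.002, threshold=1000.0):
          """Crude stand-in for a detector's response to one pedestrian.

          focal_px, k and threshold are invented shape parameters,
          not values from any real system.
          """
          # Apparent size in the image scales as focal_px * height / distance.
          height_px = focal_px * height_m / distance_m
          # "Signal" ~ apparent pixel area weighted by contrast with the background.
          signal = (height_px ** 2) * contrast
          # Logistic squash: more signal -> higher probability of detection.
          return 1.0 / (1.0 + math.exp(-k * (signal - threshold)))

      # Same camera, same 30 m range: the smaller, lower-contrast target scores far lower.
      print(detection_probability(1.8, 30, 0.8))  # tall adult, high contrast -> ~0.98
      print(detection_probability(1.1, 30, 0.4))  # small child, low contrast -> ~0.28
      ```

      The exact curve is made up, but the direction isn’t: shrink the apparent area or the contrast and the score drops, no matter how the detector was trained.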

      Edit: so many people are missing the point. Yes, they are likely using training data that is not statistically ideal, but this is a well-known problem. The people who do this stuff, myself included, care about it and are aware of it.

      The next problem is that it’s extremely hard to quantify the equality we’re after.

      Ask a person their “race” and what do you get? Black? Asian? White?

      Borderline useless.

      There are strategies, but they are beyond the scope of this discussion.

      • stopthatgirl7@kbin.social (OP) · 1 year ago

        I’m sure that will be of great comfort to any dark-skinned person or child who gets hit.

        If those are known, expected issues? Then they had better program around them before putting driverless cars out on the road, where dark-skinned people and children are not theoreticals but realities.

  • Endomlik@reddthat.com · 1 year ago

    Seems this will always be the case. Small objects are harder to detect than large ones, and high-contrast objects are easier to detect than low-contrast ones. Even if detection gets 1000x better, both will still be true (a toy sketch below illustrates why). Do you introduce artificial error to make things fair?

    Repeating the same comment from a crosspost.
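
    As a sketch of that point (signal values invented, arbitrary units): a uniform improvement multiplies both signals by the same factor, so the relative gap between the easy target and the hard one never closes.

    ```python
    # Hypothetical signal strengths for a large/high-contrast pedestrian
    # versus a small/low-contrast one, in arbitrary units.
    large, small = 2880.0, 538.0

    for gain in (1, 10, 1000):
        # A uniform sensitivity gain scales both signals equally,
        # so the ratio between them is unchanged.
        print(gain, (large * gain) / (small * gain))
    ```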

    • kibiz0r@midwest.social · 1 year ago

      All the more reason to take this seriously and not disregard it as an implementation detail.

      When we, as a society, ask: Are autonomous vehicles safe enough yet?

      That’s not the whole question.

      …safe enough for whom?

      • Mac@mander.xyz · 1 year ago

        Also, what is the safety target? Human drivers are extremely unsafe. Are we looking for any improvement, or are we looking for perfection?

        • kibiz0r@midwest.social · 1 year ago

          This is why it’s as much a question of philosophy as it is of engineering.

          Because there are things we care about besides quantitative measures.

          If you replace 100 pedestrian deaths due to drunk drivers with 99 pedestrian deaths due to unexplainable self-driving malfunctions… Is that, unambiguously, an improvement?

          I don’t know. In the aggregate, I guess I would have to say yes…?

          But when I imagine being that person in that moment, trying to make sense of the sudden loss of a loved one and having no explanation other than watershed segmentation and k-means clustering… I start to feel some existential vertigo.

          I worry that we’re sleepwalking into treating rationalist utilitarianism as the empirically correct moral model — because that’s the future that Silicon Valley is building, almost as if it’s inevitable.

          And it makes me wonder, like… How many of us are actually thinking it through and deliberately agreeing with them? Or are we all just boiled frogs here?

      • lefixxx@lemmy.world · 1 year ago

        The first impression I got is that driverless cars are worse at detecting kids and black people than human drivers are.

        • stopthatgirl7@kbin.social (OP) · 1 year ago

          I think it’s less that the headline is misleading and more that it can be misinterpreted. I read it the way the article details later: worse at spotting kids and dark-skinned folks than adults and light-skinned people. But it can be ambiguous.

  • Hazdaz@lemmy.world · 1 year ago

    This is Lemmy, so immediately 1/2 the people here are ready to be race-baited and want to cry RaCiSm1!!1!!

  • huginn@feddit.it · 1 year ago

    I wonder what the baseline is for the average human driver spotting those same people. I expect it’s higher than the learning algo, but by how much?

  • WorseDoughnut 🍩@lemdro.id · 1 year ago

    Unfortunate but not shocking to be honest.

    Only recently did smartphone cameras get better at detecting darker-skinned faces in software, and that was something they had probably been working towards for a decent while. It’s not all that surprising that other camera tech would have to play catch-up in that regard as well.