New research shows driverless car software is significantly more accurate at detecting adults and light-skinned people than children and dark-skinned people.

    • PetDinosaurs@lemmy.world · edited · 1 year ago

      It’s not a discriminatory bias, nor is it really one that anything can be done about.

      It’s purely physics.

      Is it harder to track smaller objects or larger ones? Smaller, always.

      Is it harder for an optical system to track something darker? In any natural scene, yes, universally.

      In engineering terms, smaller and darker individuals produce less signal. Less signal means a lower probability of detection, even if your training data is perfectly sampled, and even if you’re not using ML at all.

      It’s the same reason a stealth bomber is harder to track than a passenger plane. Less signal.
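
      To make the “less signal” point concrete, here’s a toy back-of-the-envelope sketch in Python. It is not any real perception stack, and every constant in it is invented: it just assumes the signal from a pedestrian scales with apparent size (falling with the square of distance) times contrast against the background, and that a fixed signal-to-noise threshold decides detection.

          # Toy model, illustration only: signal ~ apparent pixel area * contrast,
          # and the detector "fires" when signal-to-noise clears a fixed threshold.
          NOISE_FLOOR = 50.0     # hypothetical sensor/scene noise level
          SNR_THRESHOLD = 5.0    # hypothetical: detect only if SNR >= this

          def detection_margin(height_m: float, contrast: float, distance_m: float) -> float:
              """Return SNR divided by the threshold; >= 1.0 means the toy detector fires."""
              apparent_area = (height_m / distance_m) ** 2 * 1e6  # crude pixel-area proxy
              signal = apparent_area * contrast
              return (signal / NOISE_FLOOR) / SNR_THRESHOLD

          # Same camera, same 50 m distance: smaller stature and lower contrast each
          # shrink the margin, and together they push it under the threshold.
          for label, height_m, contrast in [
              ("adult, high contrast", 1.75, 0.6),
              ("adult, low contrast",  1.75, 0.3),
              ("child, high contrast", 1.10, 0.6),
              ("child, low contrast",  1.10, 0.3),
          ]:
              m = detection_margin(height_m, contrast, distance_m=50.0)
              print(f"{label:22s} margin = {m:.2f} -> {'detected' if m >= 1.0 else 'missed'}")

      The exact numbers are meaningless; the point is that the two effects compound, and the smallest, lowest-contrast case is the first one the toy detector loses.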

      Edit: so many people are missing the point. Yes, they are likely using training data that is not statistically ideal, but this is a well-known problem. The people who do this stuff, myself included, care about it and are aware of it.

      The next problem is that it’s extremely hard to quantify the equality we’re after.

      Ask a person their “race” and what do you get? Black? Asian? White?

      Borderline useless.

      There are strategies, but they are beyond the scope of this discussion.

      • stopthatgirl7@kbin.social (OP) · 1 year ago

        I’m sure that will be of great comfort to any dark-skinned person or child that gets hit.

        If those are known, expected issues? Then they had better program around them before putting driverless cars out on the road, where dark-skinned people and children are not theoreticals but realities.