• off_brand_@beehaw.org · 1 year ago

    Anything that evaluates people, regardless of whether it sees the face. Hiring AI has a racist bent because non-racist training data doesn't exist. I think it was Amazon who kept trying to fix this, but even after scrubbing names and pictures it was still an issue.

    • Sotuanduso@lemm.ee · 1 year ago

      That one is weird to me. Was it somehow inferring race? Or was it favoring colleges with racist admissions? Not really sure how else it could get racist without access to names and pictures.

      • Bldck@beehaw.org · 1 year ago

        There are many indicators of race in hiring. For example, these characteristics could give a pretty good idea of an applicant's race:

        1. Name
        2. College or university
        3. Clubs or fraternities
        4. Location (address, not city)

        While an ML program wouldn't explicitly decide "don't hire black men," it can end up reinforcing existing policies that are racist, since these programs are trained on the past success of a company's existing employees.

        For example, a company may never have hired anyone from an HBCU. An applicant who attended an HBCU would therefore be scored negatively by the hiring program, simply because its training data contains no successful hires from one.
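        To make the mechanism concrete, here's a minimal sketch with fabricated toy data ("School A"/"School B" are hypothetical stand-ins for the HBCU example, not anything from Amazon's actual system). It shows a model trained on biased historical hiring outcomes learning to penalize a school feature, even though race never appears as an input:

        ```python
        # Toy illustration of proxy discrimination in a hiring model.
        # Race is never a feature, but the historical labels encode it.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 1000

        # Feature 0: years of experience.
        # Feature 1: attended School A (1) vs School B (0).
        experience = rng.uniform(0, 10, n)
        school_a = rng.integers(0, 2, n)

        # Biased historical outcomes: past recruiters never hired
        # anyone from School B, regardless of experience.
        hired = ((experience > 5) & (school_a == 1)).astype(int)

        X = np.column_stack([experience, school_a])
        model = LogisticRegression().fit(X, hired)

        print("weight on experience:", model.coef_[0][0])
        print("weight on school:    ", model.coef_[0][1])
        # The school weight comes out strongly positive for School A
        # (i.e., negative for School B), so the model reproduces the
        # historical exclusion as if it were a merit signal.
        ```

        Scrubbing names and photos doesn't help here: the school column alone carries the bias forward.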