• Amaltheamannen@lemmy.ml · 4 months ago

    And how do we know you didn’t crop out an instruction asking for diversity?

    Either that, or it’s a side effect of trying to reduce training-data bias.

  • Eddyzh@lemmy.world · 4 months ago

    It is ridiculous. However, how can we know you didn’t first instruct it to show only dark skin? Or that you didn’t select these from many examples that showed something else?

    • stoneparchment@possumpat.io · 4 months ago

      It’s also like, I guess I would prefer it to make mistakes like this if it means it is less biased towards whiteness in other, less specific areas?

      Like, we know these models are dumb as rocks. We know that they are imperfect and that they mirror the biases of their trainers and training data, and that in American society that means bias towards whiteness. If the trainers are doing what they can to prevent that from happening, whatever, that’s cool… even if the result is some dumb stuff like this sometimes.

      I also don’t think it’s a problem for the user to specify race if it matters? Like “a white queen of England” is a fine thing to ask for, and if it isn’t specified, the model will include diverse options even if they aren’t historically accurate. No one gets bent out of shape if the outfits aren’t quite historically accurate, for example.

      • ji59@kbin.social · 4 months ago

        The problem is that these answers are hugely incorrect, and if a child learning about the history of England saw this, they would come away with the false impression that England was always diverse.
        The same goes for a recent post where people who know nothing about Scottish history could conclude from the images that half of Scotland’s population in the 18th century was black.
        So from my perspective these images are just completely wrong, and it should be fixed.
        Also, if you want diversity, what about handicapped people?

        • groet@feddit.de · 4 months ago

          Repeat after me:

          “Current AI is not a knowledge tool. It MUST NOT be used to get information about any topic!”

          If your child is learning Scottish history from AI, you have failed as a teacher/parent. This isn’t even about bias, just about what an AI model is. It’s not even supposed to be correct; that’s not what it is for. It is built to appear as correct as the things it has been trained on. And as long as there are two opinions in the training data, the AI will gladly make up a third.