
I see Google’s deal with Reddit is going just great…

    • @milicent_bystandr@lemm.ee · 6 points · 1 month ago

      I think ‘hallucinating’ means when it makes up the source/idea by (effectively) word association that generates the concept, whereas here it’s repeating a real source.

      • @PersonalDevKit@aussie.zone · 6 points · edited · 1 month ago

        Couldn’t that describe 95% of what LLMs do?

        It is a really good autocomplete at the end of the day; it’s just that sometimes the autocomplete gets it wrong.

        • @milicent_bystandr@lemm.ee · 3 points · 1 month ago

          Yes, nicely put! I suppose ‘hallucinating’ describes when, to the reader, it appears to state a fact, but that fact doesn’t actually represent anything in the training data.

    • Echo Dot · -3 points · 1 month ago

      Well, it’s referencing something, so the problem is the dataset, not an inherent flaw in the AI.

      • David Gerard (mod) · 15 points · 1 month ago

        I’m pretty sure that referencing this indicates an inherent flaw in the AI.

        • Echo Dot · -6 points · 1 month ago

          No, it represents an inherent flaw in the people developing the AI.

          That’s a totally different thing. The concept isn’t flawed; the people implementing the concept are.

            • @ebu@awful.systems · 11 points · edited · 1 month ago

            “Of course, this flexibility that allows for anything good and popular to be part of a natural, inevitable precursor to the true metaverse, simultaneously provides the flexibility to dismiss any failing as a failure of that pure vision, rather than a failure of the underlying ideas themselves. The metaverse cannot fail, you can only fail to make the metaverse.”

            – Dan Olson, The Future is a Dead Mall

      • @Ultraviolet@lemmy.world · 13 points · edited · 1 month ago

        The inherent flaw is that the dataset needs to be both extremely large and vetted for quality with an extremely high level of accuracy. That can’t realistically exist, and any technology that relies on something that can’t exist is by definition flawed.