• bitofhope@awful.systems · 1 year ago

    LLMs are so notoriously terrible at telling truth from lies that "AI hallucination" is a household phrase at this point, for better or for worse. But surely they work even better when asked to rate the truthfulness of things that are not in their corpus to begin with.

  • sc_griffith@awful.systems · 1 year ago

    this is such a funny grift. hope ceos are torturing themselves over whether the random noise interpreters will like them. imagine an exec staring in the mirror repeating a line over and over to develop the right intonation to fool ai

    • Send_me_nude_girls@feddit.de · 1 year ago

      Just train a model with your voice and never speak a real word on your own ever again. Call it voice purists. It's going to happen.

      • sc_griffith@awful.systems · 1 year ago

        I'm sorry, but that won't help your earnings call. As soon as you give it a few microseconds of voice data, the model will simulate your life from first principles and find out your company is fucked. you think the ai is going to throw that information away? every exquisite subvocal pang of agony will be reproduced. there's only one thing to do. the only way out is through. show up so blitzed out on coke you don't even know you're in an earnings call. you have to do it. it's called charging the fucking machine gun nest man. our grandparents knew about this before they got all wrapped up in this tech shit. that's what they taught you in world war two. they didn't even know what a phone was back then. can you imagine? that's fucking wild man. and now you have chatgpt and it's smarter than half the people I know. that's fucking wild. life! chatgpt. how do I buy a machine gun

  • sue_me_please@awful.systems · 1 year ago (edited)

    What about an AI that can tell if that cute candidate our startup hired will sleep with me or if she'll just lie and say yes and then tell HR?

    And while we're at it can we make an LLM that will force my kids to call me?

  • locallynonlinear@awful.systems · 1 year ago

    We need to filter out people who exhibit voice stress, because no one likes a person with the humility to take uncertainty seriously.

  • self@awful.systems · 1 year ago

    is there a pseudoscience that VCs and promptfans aren’t trying to turn into a startup? we’ve got medical woo everywhere, AI startups are essentially repackaging everything from race science to mediums into a bullshit product, and now we’ve got this superstitious crap. there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

    • David Gerard@awful.systems (OP, mod) · 1 year ago

      there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

      perfect

  • froztbyte@awful.systems · 1 year ago

    I'd register fuckedcompany.ai, but I happened to discover some years back that .ai doesn't allow "fuck" in the domain name. goddamn tyranny

    but there's some real revivalist potential for fuckedcompany in all this dogshit

  • swlabr@awful.systems · 1 year ago

    Random thought: earnings calls are like streams. Buying/selling stock is subbing/unsubbing. Asking questions is superchatting/donating with a message. AI sentiment analysis is crazed fans hyperanalysing the stream to confirm whatever conspiracy they have about the streamer.

    NB: i don’t partake in stream culture