saw this kicking around on the lobsters frontpage

IBM also includes a DPU for accelerating IO, along with an on-board AI accelerator

ah yes. an AI accelerator. for a chip that goes into a system that’ll quite possibly have a lifespan measured in decade-partials. in environments so extremely up to date with the bleeding edge of technology that they are absolutely not losing their programmers to retirement. an AI accelerator for that. makes total sense.

imagine being the poor engineers who had to spec that out, design it, and get it to actually exist. never mind the awe-inspiring disregard for reality it takes for some management fuckhead(s) to have “steered” this

Funny enough, I asked why IBM had different terminology compared to the rest of the industry. They said IBM came up with the terminology first, and later on the industry adopted different terminology. It was all lighthearted and funny.

ah yes! funny! haha. not at all some weird insular shit from the same company that runs a worlds-apart platform with a control grip so strong it makes larry ellison check if they’re infringing on anything actionable…

  • skillissuer@discuss.tchncs.de

    that ai accelerator will be used to spit out believable-looking but subtly fucked-up cobol. that’s job security with a side of eldritch horrors

    • froztbyte@awful.systemsOP

      haha I did try to ponder what the “LLMs, but cobol” corp world would look like but I couldn’t really think of anything particularly good, although this might be one

      the other I thought of is:

      1. bank: “we have an innovative AI/ML/adaptive-fraud-response(/other bullshit term of art here) research team, leading the way with novel research in methodologies for detecting {…}”
      2. dept researchers throw some papers and jupyter notebooks over the wall
      3. suits up top say “okay now make it work”
      4. some poor bastard has to figure out how to glue cobol and jupyter together on systemz in some way that doesn’t get them fired or risk millions/minute
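
      to make step 4 concrete: one hypothetical shape of that glue is a tiny scoring shim wrapping whatever came out of the notebooks, which the cobol side can then call over HTTP (CICS web support, a raw socket, whatever). everything below is made up, stdlib-only python, and not anyone’s actual stack:

          import json
          from http.server import BaseHTTPRequestHandler, HTTPServer

          def score(features: list[float]) -> float:
              """stand-in for whatever model the researchers threw over the wall"""
              return min(1.0, sum(abs(f) for f in features) / 100.0)  # made-up risk heuristic

          class ScoreHandler(BaseHTTPRequestHandler):
              def do_POST(self):
                  # read the JSON body the caller sends, e.g. {"features": [1.0, 2.0]}
                  length = int(self.headers.get("Content-Length", 0))
                  features = json.loads(self.rfile.read(length)).get("features", [])
                  body = json.dumps({"risk": score(features)}).encode()
                  self.send_response(200)
                  self.send_header("Content-Type", "application/json")
                  self.send_header("Content-Length", str(len(body)))
                  self.end_headers()
                  self.wfile.write(body)

          if __name__ == "__main__":
              # made-up port; listen wherever the mainframe side can reach
              HTTPServer(("0.0.0.0", 8080), ScoreHandler).serve_forever()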

      I’m sure it’ll all go perfectly well

  • BurgersMcSlopshot@awful.systems

    I remember trying to get a not-stupid description of things like AI accelerators and NPUs; if I understand correctly, isn’t it mostly “processes matrices of integers in fewer clock cycles”?
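
    In code terms, that description comes out to roughly this (a sketch only: numpy standing in for the dedicated silicon, and the scales and sizes are made up):

        import numpy as np

        # the NPU-ish primitive: quantize floats to int8, do the matrix
        # multiply-accumulate in integer arithmetic (int32), then rescale.
        def int8_matmul(a_fp, b_fp, scale_a, scale_b):
            a_q = np.clip(np.round(a_fp / scale_a), -128, 127).astype(np.int8)
            b_q = np.clip(np.round(b_fp / scale_b), -128, 127).astype(np.int8)
            acc = a_q.astype(np.int32) @ b_q.astype(np.int32)  # integer MAC
            return acc.astype(np.float32) * (scale_a * scale_b)  # dequantize

        # toy comparison against the plain float result
        rng = np.random.default_rng(0)
        act, w = rng.normal(size=(4, 8)), rng.normal(size=(8, 3))
        approx = int8_matmul(act, w, scale_a=0.05, scale_b=0.05)
        print(np.max(np.abs(approx - act @ w)))  # small quantization error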

  • froztbyte@awful.systemsOP

    the press release (archive) says:

    featuring its first advanced on-processor chip AI accelerator for inferencing

    For instance, our AI-driven fraud detection solutions are designed to save clients millions of dollars annually. With the introduction of the AI accelerator on the Telum processor, we’ve seen active adoption across our client base. Building on this success, we’ve significantly enhanced the AI accelerator on the Telum II processor

    if I’m reading this correctly, it’s on-die in the telum ii, but was a separate thing (like a co-processor or an add-in card or something) previously?

    the usecase sooooort of makes sense, but I’m still skeptical about part of it, because this seems awfully like it’d be limited over time by changes in how one does such tasks (e.g. if a new preferred inferencing method comes out that doesn’t quite fit the chip pattern). but also “our AI-driven fraud detection solutions” - ah.
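
    for what it’s worth, the “inferencing” in that fraud pitch presumably means scoring each transaction synchronously with a small fixed model, where per-call latency matters more than batch throughput (which is the argument for putting it on-die). a toy sketch, every weight/feature/threshold made up, numpy standing in for the accelerator:

        import numpy as np

        # made-up "trained" weights for a tiny two-layer scoring model
        rng = np.random.default_rng(42)
        W1 = rng.normal(size=(4, 8)).astype(np.float32)
        W2 = rng.normal(size=(8,)).astype(np.float32)

        def fraud_score(txn_features):
            """feature vector in, single risk score in [0, 1] out"""
            hidden = np.maximum(txn_features @ W1, 0.0)          # ReLU layer
            return float(1.0 / (1.0 + np.exp(-(hidden @ W2))))   # sigmoid output

        # the score gets checked inside the transaction path, so every call has to be fast
        txn = np.array([120.0, 3.2, 0.0, 1.0], dtype=np.float32)  # made-up features
        if fraud_score(txn) > 0.9:
            print("hold transaction for review")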

    guess it’ll be interesting to see how this shit sits in 10y or something.