• RandomLegend
    3 months ago
    1. Yes, I know, but CUDA is faster.
    2. Ollama is for LLMs; Stable Diffusion is for images (rough sketch below).
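
    To make the distinction concrete, here is a rough Python sketch, assuming a local Ollama server on its default port (11434) and the Hugging Face diffusers package; the model names are placeholders, not recommendations. On a ROCm build of PyTorch the "cuda" device string targets AMD GPUs as well, which is why the image part also runs without an Nvidia card.

    ```python
    import requests
    import torch
    from diffusers import StableDiffusionPipeline

    # Ollama: text generation with an LLM via its local HTTP API.
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama3",  # placeholder model name
            "prompt": "Explain VRAM in one sentence.",
            "stream": False,
        },
    )
    print(resp.json()["response"])

    # Stable Diffusion: image generation via diffusers.
    # On a ROCm build of PyTorch, "cuda" addresses AMD GPUs too.
    pipe = StableDiffusionPipeline.from_pretrained(
        "stable-diffusion-v1-5/stable-diffusion-v1-5",  # placeholder repo id
        torch_dtype=torch.float16,
    ).to("cuda")
    image = pipe("a red bicycle leaning against a brick wall").images[0]
    image.save("bicycle.png")
    ```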
    • Possibly linux
      3 months ago

      I’m aware; I wanted to point out that AMD isn’t totally useless for AI.

      • RandomLegend
        3 months ago

        Oh, it definitely isn’t.

        Everything I need runs, and I finally don’t run out of VRAM so easily 😅