I got an additional 32 GB of RAM at a low, low cost from someone. What can I actually do with it?

  • zkfcfbzr@lemmy.world · 22 days ago
    I have 16 GB of RAM and recently tried running local LLM models. Turns out my RAM is a bigger limiting factor than my GPU.
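    That tracks with some back-of-the-envelope math. A rough sketch (the bytes-per-parameter figures and the ~20% runtime overhead are assumptions, not measured values):

    ```python
    # Rough RAM estimate for running a quantized LLM locally:
    # parameter count * bytes per parameter, plus an assumed ~20%
    # overhead for the KV cache and runtime buffers.
    def model_ram_gb(params_billion: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
        bytes_total = params_billion * 1e9 * (bits_per_param / 8)
        return bytes_total * overhead / 1e9

    # A 7B model at 4-bit quantization fits comfortably in 16 GB,
    # but the same model at full 16-bit precision would not.
    print(model_ram_gb(7, 4))   # ~4.2 GB
    print(model_ram_gb(7, 16))  # ~16.8 GB
    ```

    So with 16 GB total (minus the OS and whatever else is running), you're mostly limited to quantized small-to-mid models, which matches the experience above.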

    And, yeah, Docker's always taking up 3-4 GB.