daggermoon@lemmy.world to No Stupid Questions@lemmy.world · 22 days ago
What can I actually do with 64 GB of RAM?
zkfcfbzr@lemmy.world · 22 days ago
I have 16 GB of RAM and recently tried running local LLM models. Turns out my RAM is a bigger limiting factor than my GPU. And, yeah, Docker's always taking up 3-4 GB.
nutsack@lemmy.dbzer0.com · 22 days ago
VRAM would help even more, I think.
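A rough sketch of why RAM (or VRAM) fills up fast with local LLMs: the weights alone take roughly parameter count times bytes per parameter, before counting the KV cache and runtime overhead. The helper below is a back-of-the-envelope estimate, not from the thread; the 13B model size and byte-widths are illustrative assumptions.

```python
# Hypothetical estimate: memory needed just to hold an LLM's weights.
# Excludes KV cache, activations, and runtime overhead, so real usage
# is higher -- this only shows the order of magnitude.

def model_weight_gib(params_billions: float, bytes_per_param: float) -> float:
    """GiB required for the raw weights alone."""
    return params_billions * 1e9 * bytes_per_param / 2**30

# Example: a 13B-parameter model at fp16 (2 bytes/param)
# vs. 4-bit quantized (0.5 bytes/param).
print(f"13B fp16:  {model_weight_gib(13, 2):.1f} GiB")
print(f"13B 4-bit: {model_weight_gib(13, 0.5):.1f} GiB")
```

At fp16 the weights alone already exceed 16 GB of system RAM, which matches the experience above; 4-bit quantization is what makes such models feasible on modest hardware.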