• M34L@alien.top · 1 year ago

It’s for AI acceleration. In AI training (and, with LLMs, inference), VRAM is basically a hard limit on the size of the model you can fit on the GPU. 20GB is enough to train some small LLMs; 10GB is not. A rough back-of-envelope estimate below shows why.
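As a minimal sketch of that back-of-envelope math: with mixed-precision Adam training, each parameter costs roughly 16 bytes (fp16 weights and gradients, plus fp32 master weights and two fp32 optimizer moments), before counting activations. The 16 bytes/parameter figure and the example model sizes are assumptions for illustration, not from the original comment.

```python
# Rough VRAM lower bound for training with mixed-precision Adam.
# Per parameter: 2 B fp16 weights + 2 B fp16 gradients
#              + 4 B fp32 master weights + 8 B fp32 Adam moments = 16 B.
# Activation memory is ignored, so real usage is higher.

BYTES_PER_PARAM_TRAINING = 16

def training_vram_gb(num_params: float) -> float:
    """Approximate minimum VRAM (GB) to train a model with num_params parameters."""
    return num_params * BYTES_PER_PARAM_TRAINING / 1024**3

for params in (0.5e9, 1e9, 3e9):
    print(f"{params / 1e9:.1f}B params -> ~{training_vram_gb(params):.0f} GB VRAM")

# 0.5B params -> ~7 GB   (fits in 10 GB)
# 1.0B params -> ~15 GB  (fits in 20 GB, not in 10 GB)
# 3.0B params -> ~45 GB  (needs a bigger card, multiple GPUs, or offloading)
```

So a ~1B-parameter model already needs around 15 GB just for weights, gradients, and optimizer state, which is why a 20GB card can train small LLMs while a 10GB card cannot.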