Assuming the training software could be run on the hardware and that we could distribute the load as if it was 2023, would it be possible to train a modern LLM on hardware from 1985?
You can't even address that much memory on 16-bit systems :)
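A rough back-of-the-envelope sketch in Python makes the gap concrete (the 7B-parameter fp16 model size is just an assumption for illustration; real 1985 machines varied, but 20-bit real mode on the 8086 tops out at 1 MiB):

```python
# Addressable memory on 1985-era CPUs vs. what a modern LLM needs
# just to hold its weights. Model size is an illustrative assumption.

KIB = 1024
MIB = 1024 * KIB
GIB = 1024 * MIB

flat_16bit = 2**16        # flat 16-bit address space: 64 KiB
real_mode_20bit = 2**20   # 8086 real mode, 20-bit physical addresses: 1 MiB
protected_24bit = 2**24   # 80286 protected mode, 24-bit addresses: 16 MiB

params = 7_000_000_000    # assumed 7B-parameter model
bytes_per_param = 2       # fp16 weights
weights_bytes = params * bytes_per_param

print(f"16-bit flat address space : {flat_16bit / KIB:>10,.0f} KiB")
print(f"8086 real mode (20-bit)   : {real_mode_20bit / MIB:>10,.0f} MiB")
print(f"80286 protected (24-bit)  : {protected_24bit / MIB:>10,.0f} MiB")
print(f"7B-param fp16 weights     : {weights_bytes / GIB:>10,.1f} GiB")
print(f"Shortfall vs. 20-bit space: {weights_bytes / real_mode_20bit:,.0f}x")
```

So even before you worry about compute or distributing the load, the weights alone exceed the 8086's entire physical address space by a factor of more than ten thousand.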