It’s all about the models and training, though. People who think ChatGPT 3.5/4 can write their legal papers get tripped up because it confabulates (‘hallucinates’) when it hasn’t been thoroughly trained on a subject. If you fed every legal case from the past 150 years into a model, it would be far more effective.