How many of these books will just be total garbage nonsense, written just so they could fulfill a prearranged quota?
Now the LLMs are filled with a good amount of nonsense.
Just use the LLM to make the books that the LLM then uses, what could go wrong?
Someone’s probably already coined the term, but I’m going to call it LLM inbreeding.
I suggested this term in academic circles, as a joke.
I also suggested “hallucinations” ~3-6 years ago, only to find out it had ALSO been suggested in the 1970s.
Inbreeding, lol
The real term is synthetic data
but it amounts to about the same
In computer science, garbage in, garbage out (GIGO) is the concept that flawed, biased or poor quality (“garbage”) information or input produces a result or output of similar (“garbage”) quality. The adage points to the need to improve data quality in, for example, programming.
There was a research article applying this 1970s computer science concept to LLMs. It was published in Nature and hit major news outlets. Basically, they repeatedly trained GPT on its own output for a couple of generations, until the model degraded terribly. Sounded obvious to me, but seeing it happen on the www is painful nonetheless…
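You can see the same degradation in a toy setting without any LLM at all. The sketch below (a hypothetical illustration, not the paper’s actual setup) repeatedly fits a Gaussian to samples drawn from the previous generation’s fit; the tails get lost and the spread collapses over generations, which is the statistical core of the “model collapse” result:

```python
import random
import statistics

# Toy model-collapse demo (assumption: a Gaussian stands in for the model,
# fitting mean/stdev stands in for training, sampling stands in for generation).
random.seed(0)

def fit(samples):
    # "Train": estimate mean and stdev from the data.
    return statistics.mean(samples), statistics.stdev(samples)

def generate(mu, sigma, n):
    # "Generate": draw synthetic data from the fitted model.
    return [random.gauss(mu, sigma) for _ in range(n)]

data = generate(0.0, 1.0, 20)  # generation 0 trains on real data
sigmas = []
for generation in range(1000):
    mu_hat, sigma_hat = fit(data)
    sigmas.append(sigma_hat)
    # Each new generation trains ONLY on the previous one's synthetic output.
    data = generate(mu_hat, sigma_hat, 20)

print(f"stdev at gen 0: {sigmas[0]:.4f}, at gen 999: {sigmas[-1]:.2e}")
```

With the deliberately small sample size per generation, the estimated spread drifts toward zero, i.e. the “model” forgets the variety in the original distribution even though each individual generation looks plausible.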
It’s quite similar to another situation known as data incest
Soylent AI? Auto-infocannibalism
It can only go right because corporations must be punished for trying to replace people with machines.
That would be terrible because they are two of the best academic publishers in the humanities.
And they expect you to do this for free?
Do they not have to pay for the privilege? Or is this not referring to academic publishing? (It’s not super clear, but context indicates academic?)
If it is that makes it even worse. Academic publishers need to be abolished.
Nah, they get “Exposure”!
/s
Anyone who reviews for the major publishers is part of the problem.
For profit corporations don’t deserve your volunteer work.
And yet if you aren’t a reviewer it makes your CV look worse.
Agreed that you should have some kind of “service” on your CV, but reviewing is pretty low impact. And if you want to review, you can choose something other than the predatory publishers.
Such as? They’re all predatory just to varying degrees.
deleted by creator
Maybe check the other comments.
Feed the LLM with LLM-generated books. No resentment at all!
Jfc that’s gross
So what you’re saying is, don’t beat the targets because fuck those guys. Understood.
What’s the academic terminology for “go pound sand”?
Soylent Green is a lie anyway. You’d need to “soylentify” half the population every year to feed the other half if it were the only source of calories.
No, the point is that they’re just recycling the dissidents they were going to murder anyway.
Honestly sometimes I feel like I’m the only one on Lemmy who likes AI
AI absolutely has its benefits, but it’s impossible to deny the ethical dilemma in forcing writers to feed their work to a machine that will end up churning out a half-assed version that also likely has some misinformation in it.
And will likely take their professions
I don’t think so, at least not for a little while. Big corporations will surely try to market it that way, but we’ve already seen how badly AI can shit the bed when it feeds on its own content.
It will take some people’s professions. People who write clickbait articles, schlock product reviews, pulp romance novels, and the things modern Hollywood describes as scripts might be out of a job.
Quality novels, hard-hitting journalism, and innovative storytelling of all sorts are outside the capability of LLMs and might always be. There’s a world where nearly all run-of-the-mill writing is done by LLMs, but truly original works will always be made by people.
At the end of the day, though, if a person can’t out-write an AI they might be in the wrong line of work.
Remember! It’s not AI hallucinations, it’s simply bullshit!
AI as a technology is fascinating and can be extremely useful, especially in places like the medical field. AI as a product in its current state is nothing more than dystopian plagiarism.
I like AI. But I’m not sure I like the way we use it if it’s only to meet shareholders’ expectations or to be a tool for greedy people. What is your opinion concerning the way we seem to use AI in academic research?
Found the black and white only guy.