Honestly sometimes I feel like I’m the only one on Lemmy who likes AI
AI absolutely has its benefits, but it’s impossible to deny the ethical dilemma in forcing writers to feed their work to a machine that will end up churning out a half-assed version that also likely contains some misinformation.
And will likely take their professions
I don’t think so, at least not for a while. Big corporations will surely try to market it that way, but we’ve already seen how badly AI can shit the bed when it feeds on its own content.
It will take some people’s professions. People who write clickbait articles, schlock product reviews, pulp romance novels, and the things modern Hollywood describes as scripts might be out of a job.
Quality novels, hard-hitting journalism, and innovative storytelling of all sorts are outside the capability of LLMs and might always be. There’s a world where nearly all run-of-the-mill writing is done by LLMs, but truly original works will always be made by people.
At the end of the day, though, if a person can’t out-write an AI, they might be in the wrong line of work.
Remember! It’s not AI “hallucinations,” it’s simply bullshit!
AI as a technology is fascinating and can be extremely useful, especially in places like the medical field. AI as a product in its current state is nothing more than dystopian plagiarism.
I like AI. But I’m not sure I like the way we use it if it’s only to meet shareholders’ expectations or to be a tool for greedy people. What is your opinion concerning the way we seem to use AI in academic research?
Found the black and white only guy.