I have said it before and I will say it again: I am not a big fan of AI. I see it being relied upon by people who lack artistry and creativity, and frankly, it saddens me.
A friend asked me to write a song with him recently. I finished a very nifty first couplet, and after he saw it he told me that he was going to have ChatGPT do his part. I never got back to him. No interest in working with machine intelligence.
I should point out that AI is part of some of the processing tasks in Photoshop, and I have had to use it at times when there is no other way, but I use it as sparingly as I can. I won't generate fantastic backgrounds that AI thinks are cool.
I think that my biggest problem with AI is seeing the little proviso warnings that the information it provides you might not be accurate. That scares the hell out of me. And our willingness to accept this lack of veracity and precision in our research is equally if not more troubling.
There is an interesting article on the subject at Nature today: "Hallucinated citations are polluting the scientific literature. What can be done?" Nature suggests that tens of thousands of publications from 2025 might include invalid references generated by AI.
Earlier this year, computer scientist Guillaume Cabanac received a notification from Google Scholar that one of his publications had been cited in a paper published in the International Dental Journal. That was unexpected, because his research on spotting fabricated papers doesn’t typically intersect with dentistry. “I was very surprised to see that I couldn’t recognize my own reference,” says Cabanac, who is based at the University of Toulouse in France.
The title in the citation resembled that of a preprint he had posted in 2021 and never published formally, but the journal was listed as Nature and the DOI — the unique identifier assigned by publishers and preprint repositories — did not lead to the original preprint. “I got very concerned,” adds Cabanac, who immediately suspected that the citation had been hallucinated by artificial intelligence.
This is just one example of a rapidly growing problem. Surveys and related studies have shown that researchers are increasingly using large language models (LLMs) to help to conduct literature searches, write manuscripts and format bibliographies. And sometimes, these models generate non-existent academic references.
I prefer to generate my hallucinations and my research the old-fashioned way, thank you. Don't be lazy: research and verify and write and create your own output.