this post was submitted on 28 Apr 2026
109 points (78.2% liked)
Technology
If they used AI, then I consider them to have lost all credibility.
I really struggle to see the point of posts like this. It is an interesting article about an interesting topic.
For me it's not only that they used AI for the writing; it's that they didn't care to review, recheck, or polish it before releasing it to the public. So my effort in consuming it will be reciprocal.
Oh so you want them to do all that and gather all the data and do it themselves for free? What a dumb comment.
I've run a honeypot for the last month and the data is near-identical to this. It's definitely credible.
If you publish something online, that's also a responsibility. If they don't want that, then they could just have made a comment somewhere "yo, i've had this container online on port 22 and this is what happened, yolo".
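For context on what "had this container online on port 22" involves: a minimal honeypot is just a listener that logs who connects and what they send first. This is a hypothetical sketch, not the commenter's actual setup or the article's methodology; real honeypots like Cowrie emulate a full SSH session, while this only captures the client's opening bytes. The function name and parameters are made up for illustration.

```python
import socket

def run_honeypot(host="0.0.0.0", port=2222, max_conns=1):
    """Accept TCP connections, logging each client's address and first bytes.

    A real deployment would bind port 22 (needs root) and run indefinitely;
    max_conns keeps this sketch finite. Port 2222 is an arbitrary stand-in.
    """
    events = []
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        for _ in range(max_conns):
            conn, addr = srv.accept()
            with conn:
                conn.settimeout(2.0)
                try:
                    # SSH clients speak first with a version banner,
                    # e.g. b"SSH-2.0-OpenSSH_9.6", so one recv is telling.
                    banner = conn.recv(256)
                except socket.timeout:
                    banner = b""
                events.append({"ip": addr[0], "first_bytes": banner})
    return events
```

Pointing a scanner (or any SSH client) at the port populates `events` with one record per connection; aggregating those records over a month is the kind of data the article's charts would be built from.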
Same for open source software btw.
Yes, that is what 90% of the internet has been about since it became a thing. Doing everything for profit turns everything into shit.
The issue with using AI is that the author doesn't openly disclose the use at the beginning of the paper.
Yes, I know this particular write-up isn't for official submission to an academic journal, but sharing methodology is important.
I would have no problem with AI-assisted writing IF the author credited the service used and, where applicable, included the prompts used.
It should be similar to documenting any sourced material. It's not just about giving credit where credit is due. It's also about accountability.
Why is this necessary? Does this add anything at all to the conversation?
Ah, well then. Problem solved. Someone on the internet said it's credible, therefore it must be credible. Tell ya what - when you create a webpage to display your data and then provide an analysis of said data, I'll consider you credible. Until then, though, you are just some short-tempered, rude, anonymous voice shouting into the void.
Near-identical doesn't make it valuable. Plausible but incorrect is still incorrect. AI creates plausible and credible but incorrect data.
The plausibility and credibility are like a honeypot for your confidence. You read it, understand it, and come to believe it. But it was false all along. You think you learned things; you actually learned nothing.
Sounds like AI
I initially disagreed, but after actually reading the post, I'm with you. If only the article's text was generated, and not the data or graphs, then I don't see why the whole thing would be written off. I mean, it's really sad seeing people offload their writing to AI, but I still found it interesting.
Yeah I hate the slop but the data is good.