this post was submitted on 18 Aug 2025
1130 points (99.0% liked)
Technology
74292 readers
6451 users here now
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related news or articles.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
- Check for duplicates before posting, duplicates may be removed
- Accounts 7 days and younger will have their posts automatically removed.
Approved Bots
founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
view the rest of the comments
Is there nightshade but for text and code? Maybe my source headers should include a bunch of special characters that then give a prompt injection. And sprinkle some nonsensical code comments before the real code comment.
There are glitch tokens but I think those only effect it when using it.
I think the issue is that text uses comparatively very little information, so you can't just inject invisible changes by changing the least insignificant bits - you'd need to change the actual phrasing/spelling of your text/code, and that'd be noticable.
Maybe like a bunch of white text at 2pt?
Not visible to the user, but fully readable by crawlers.
If a bot can't read it, nor can a visually impaired user.
Well if it's a prompt injection to fuck with llms you don't want any users having to read it anyway, vision impaired or no.
You missed my point. A prompt injection to fuck with LLMs would be read by a visually impaired user's screen reader.