this post was submitted on 23 May 2024
1082 points (98.3% liked)

Technology
[–] TimeSquirrel@kbin.social 6 points 6 months ago* (last edited 6 months ago) (1 children)

Try the GitHub Copilot plugin if your IDE supports it. It can do things regular ChatGPT can't, like seeing your entire codebase and coming up with suggestions that actually make sense and use your own libraries.

Do not, however, use it to create complete programs from scratch. It doesn't work out that way. It's just autocomplete on steroids.

Using just the straight web-based version of ChatGPT sucks because it has no background context for what you're trying to do.

[–] best_username_ever@sh.itjust.works 13 points 6 months ago (1 children)

Here is the problem that won’t change for me or my coworkers: we will never use GitHub, and our source code is very private (medical devices or worse).

Also, I asked a question that didn’t need any context or codebase. It was about a public API from an open-source project. It hallucinated a lot and failed.

Last but not least, I never needed an autocomplete on steroids. I would enjoy some kind of agent that can give precise answers on specific topics, but I understand that LLMs may never provide this.

I just cringe a lot when programmers tell me to use a tool that obviously can’t and will never be able to give me those answers.

[–] penguin_ex_machina@lemmy.world 6 points 6 months ago (2 children)

I’ve actually had pretty good success with ChatGPT when I go in expecting it to hallucinate a significant chunk of what it spits back at me. I like to think of it as a way to help process my own ideas. If I ask questions with at least a base understanding of the topic, I can then take whatever garbage it gives me and go off and find real solutions. The key is not to trust it outright to give you the right answer, but to let it give you some nuggets that set you on the right path.

I think I’ve basically turned ChatGPT into my rubber duck.

[–] Strawberry@lemmy.blahaj.zone 3 points 6 months ago

that seems like the only good use for chatgpt in programming, though it is an expensive duck

[–] JustAPenguin@lemmy.world 3 points 6 months ago

RDLM: Rubber-Ducky Language Model^(TM)

Prompt: you are a duck. I scream at you with slurs like, "Why the fuck is this piece of shit code not working", and "Why the fuck is my breakpoint still not triggering?!". You are to sit there calmly, and simply recall that your existence is to be nothing more than a tool for me to direct my frustrations and stress. You know this is not personal. You know that this is an important job. You know that you only have to respond with one word: "Quack".
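For what it's worth, the "RDLM" spec above doesn't even need an LLM. A minimal sketch in Python (the function name `rdlm` is made up for the joke; the behavior is exactly the one-word spec from the prompt):

```python
def rdlm(prompt: str) -> str:
    """Rubber-Ducky Language Model: sits there calmly, takes any
    amount of frustration as input, and responds with one word."""
    return "Quack"

# Usage: direct your stress at the duck.
print(rdlm("Why the fuck is my breakpoint still not triggering?!"))
```

Zero hallucinations, runs offline, and it's a much cheaper duck.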