this post was submitted on 25 Aug 2024
327 points (92.5% liked)

Technology
[–] Nighed@sffa.community 24 points 2 months ago* (last edited 2 months ago) (4 children)

I'm going to call BS on that unless they are hiding some new models with huge context windows...

For anything that's not boilerplate, the prompt you have to type for the AI ends up longer than just writing the code yourself.

Also, if the behaviour or variable you need is similar to something common, it will stubbornly refuse to do what you want.

[–] mozz@mbin.grits.dev 13 points 2 months ago (3 children)

Have you ever attempted to fill one of those monster context windows up with useful context and then let the model try to do some useful task with all the information in it?

I have. Sometimes it works, but often it’s not pretty. Context window size is the new MHz, in terms of misleading performance measurements.

[–] floofloof@lemmy.ca 7 points 2 months ago

I find there comes a point where, even with a lot of context, the AI just hasn't been trained to solve the problem. At that point it will cycle you round and round the same few wrong answers until you give up and work it out yourself.

[–] Nighed@sffa.community 1 points 2 months ago

To actually answer your question - yes, but the only time I actually find it useful is for tests; for everything else it's usually iffy and takes longer.

Intelligently loading the context window could be the next useful trick.

[–] Nighed@sffa.community 1 points 2 months ago

I think that giving the LLM an API to fetch additional context, and making it more of an agent-style process, will give the biggest improvement.

Let it request the interface for the class you're using; let it request the code for that extension method you call. I think that would solve a lot, but I still see a LOT of instances where it randomly calls the wrong class or method names.
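A minimal sketch of what that could look like in Python (the tool name, schema shape, and wiring here are my own assumptions, not any particular vendor's API): the agent loop exposes a `get_source` tool, and whenever the model emits a call to it, the loop runs it and feeds the real source back into the context so the model sees actual signatures instead of guessing them.

```python
import inspect

def get_source(qualified_name: str) -> str:
    """Hypothetical tool: return the source code of a class or function,
    given a dotted name like "json.JSONDecoder", so the model can read
    real method signatures instead of hallucinating them."""
    module_name, _, attr = qualified_name.rpartition(".")
    module = __import__(module_name, fromlist=[attr])
    return inspect.getsource(getattr(module, attr))

# The schema the LLM would be handed (generic function-calling style):
TOOL_SPEC = {
    "name": "get_source",
    "description": "Fetch the source of a class or function by qualified name.",
    "parameters": {
        "type": "object",
        "properties": {"qualified_name": {"type": "string"}},
        "required": ["qualified_name"],
    },
}

# When the model emits {"name": "get_source",
#                       "arguments": {"qualified_name": "json.JSONDecoder"}},
# the agent loop executes it and appends the result to the conversation:
source = get_source("json.JSONDecoder")
print(source.splitlines()[0])  # the real class definition line
```

An IDE-integrated version would resolve names through the project's own language server rather than Python's `inspect`, which is exactly where the language-specific work comes in.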

This would also require much more in-depth (and language-specific!) IDE integration though, so I foresee a lot of price hikes for IDEs in the near future!