Good thing that's not the case then.
CompassRed
It's not about stupid or smart. It's a tool, not a person. If you don't get the same results that other people get with the same tool, then what could possibly be the problem other than how the person is using the tool?
No, it's not. It doesn't have intention. It's literally just a tool. If you don't get the results you expect with a tool when other people do get those results, then the problem isn't the tool.
The symptoms you describe are caused by bad prompting. If an AI is providing over-complicated solutions, nine times out of ten it's because you didn't constrain your problem enough. If it's referencing tools that don't exist, then you either haven't specified which tools are acceptable or you haven't provided the context required for it to find them. You may also be expecting too much of the AI. You can't expect it to do everything for you. You still have to do almost all the thinking and engineering if you want a quality project - the AI is just there to write the code. Sure, you can use an AI to help you learn how to be a better engineer, but AIs typically don't make good high-level decisions. Treat AI like an intern, not like a principal engineer.
It's not the same, and you kinda answered your own question with that quote. Consider what happens when an object defines both __bool__ and __len__. It's possible for __len__ to return 0 while __bool__ returns True, since __bool__ takes precedence - in which case the truthiness of the instance would not depend at all on the value of len().
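A minimal sketch of the point above (the class name is just illustrative): when both dunders are defined, Python consults __bool__ and never falls back to __len__.

```python
class AlwaysTruthy:
    """Defines both __len__ and __bool__; __bool__ takes precedence."""

    def __len__(self):
        # "Empty" by length...
        return 0

    def __bool__(self):
        # ...but still truthy, because this wins over __len__.
        return True


obj = AlwaysTruthy()
print(len(obj))   # 0
print(bool(obj))  # True
```

So `if obj:` and `if len(obj):` disagree here, which is exactly why the two checks aren't interchangeable.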
Python and Java are barely comparable. I adore both languages equally and use them about the same amount at work. They are just different tools better suited to different tasks.
I don't recall any socialized courier or food delivery services.
Must just be a skill issue.