I feel like anyone who would use ChatGPT or whatever to write code trusts computers too much to be an effective programmer.
Someone once said, “That’s the problem with computers: they do exactly what you tell them to.” With the advent of “AI,” we’re swiftly breaking ground on changing that.
So that's the thing. Computers may do exactly what you tell them, but they also don't do things you don't tell them to do. A chat AI can never possibly understand all the weird edge cases and error checking you might need, and it'll never fully understand the process flow the way you've laid it out in your head. That's why when you work on a big team in IT you often spend hours and hours creating flowcharts and entity-relationship database diagrams and technical requirements documents and thinking about how the infrastructure will scale and getting into fights with your data architect about how it's fucking impossible to code for "future business requirements" that you don't know about.
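To make that concrete, here's a toy sketch of what I mean (the function name and the specific edge cases are my own made-up example, not from any real spec): the "obvious" version a chatbot hands you straight from the prompt, versus the version that encodes the messy knowledge that only lives in your head or your flowcharts.

```python
def parse_discount_naive(value: str) -> float:
    """What you asked for: turn a discount string like '15%' into 0.15."""
    return float(value.strip("%")) / 100


def parse_discount(value: str) -> float:
    """What production actually needs: the edge cases nobody put in the prompt."""
    # Blank cells from the CSV export, or a None that leaked out of the database.
    if value is None or not value.strip():
        raise ValueError("empty discount")
    # Handle stray whitespace like '15 %' as well as plain '15%'.
    cleaned = value.strip().rstrip("%").strip()
    try:
        pct = float(cleaned)
    except ValueError:
        # Someone typed 'N/A' into the spreadsheet three years ago.
        raise ValueError(f"not a number: {value!r}")
    # A 150% coupon will quietly break billing downstream.
    if not 0 <= pct <= 100:
        raise ValueError(f"discount out of range: {pct}")
    return pct / 100
```

The naive one works right up until the first messy export, which is the whole point: the prompt never mentioned the blank cells or the 'N/A' strings, because that knowledge only ever existed in somebody's head.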