A fun thing about seeing someone post code that's AI-generated is going from "well, that's syntactically correct Python that probably does what it says" to "actually, I strongly doubt anything about this web-scraping code works correctly," which, when I think about it, is an interesting reaction to have: I implicitly trust code that I think a human actually wrote more than AI-generated code.
Not that it's a wrong reaction; I just presume that someone writing the code has actually done enough to examine the underlying data structure being scraped. That's not necessarily true!
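To make that concrete, here's a sketch of the kind of scraper I mean (the selectors and function are hypothetical, made up purely to illustrate): it reads as perfectly reasonable code, but it only works if the page really is laid out the way it assumes.

```python
# Hypothetical example: plausible-looking scraping code that quietly
# assumes the page's structure instead of verifying it.
import requests
from bs4 import BeautifulSoup

def scrape_titles(url: str) -> list[str]:
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    # Assumes every article lives in <div class="post"> with an <h2 class="title"> inside.
    # If the real markup is structured differently, this silently returns an empty list.
    return [h2.get_text(strip=True) for h2 in soup.select("div.post h2.title")]
```

Only someone (or something) that has actually looked at the page can know whether `div.post h2.title` matches anything at all.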
it's not, but there's still the fact that a human might have
it's perfectly fair to assume that an LLM hasn't, because it can't
an LLM cannot work with intentionality
