All of Web Developer Twitter is absolutely shitting itself because it's convinced GPT is going to make software developers redundant. No idea if this guy's joking but look at this ridiculousness:
Honestly the scariest thing about ChatGPT's coding ability is how many people currently building the websites we use every day rate it credible competition to their skills.
I've had a play with ChatGPT and it's honestly really impressive, but by that I mean it's very impressive that a computer can so perfectly recreate the experience of hand-holding a very junior developer through comparatively simple tasks. Obviously it will continue to develop, but I don't see this approach reaching a point where it can usefully replace a human developer.
(Incidentally I find it extremely funny that people are asking it to generate TypeScript. If a computer is writing your code, why not just have it write vanilla JavaScript and save yourself a build step?)
Example: I asked it to generate a login form with some code to stop a user from copy-pasting into the password field, and it did. Its code was correct, and looked like it would work. But a good developer would not have written that code, because stopping users copy-pasting into the password field is a bad idea: it breaks password managers and pushes people towards shorter, weaker passwords. The AI was able to turn the instruction into code, but it wasn't able to push back on the inappropriate instruction.
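Its output looked something like this (reconstructed from memory, so treat it as a sketch rather than its exact code):

// Reconstructed sketch, not ChatGPT's exact output
const passwordInput = document.querySelector('#password')

// Swallow paste events so nothing can be inserted into the field
passwordInput.addEventListener('paste', (event) => {
  event.preventDefault()
})

Which is exactly what you'd want, if what you wanted wasn't a terrible idea.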
Then I asked it a harder problem. I was extremely impressed by the speed and quality of its code, but it hadn't quite understood what I meant. After a clarification it wrote better code, but that code seemed inefficient. Another prompt fixed the inefficiency, but the output wasn't in a nice format, and that it just wasn't able to fix. It got very close at one point, but its next iteration went back to a previous broken attempt.
And even when it did fix a bug, I was the one who had to find it. A user can tell you 'hey man your website says "undefined" all over', but getting from there to the root cause somewhere in the codebase? I don't think that's an issue a language model will ever be able to address, at least not one that costs less to run than a human being. Some problems actually require you to understand them.
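To make that concrete, here's an invented example of the kind of thing I mean, where the symptom shows up nowhere near the cause:

// Invented example: an upstream API renamed `name` to `fullName`,
// so user.name is now undefined. Nothing throws, nothing logs.
fetch('/api/user/42')
  .then((res) => res.json())
  .then((user) => {
    // Renders "Hello, undefined", several layers away from the
    // change that actually caused it
    document.querySelector('#greeting').textContent = `Hello, ${user.name}`
  })

There's no error and no stack trace, just a wrong string in the UI. Good luck pasting that into a prompt.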
Ultimately I think it comes down to this: before the computer can write good code, I need a way to succinctly and precisely explain to it what I want it to do. And we already have a way of doing that. It's called me writing the code.
And all of this is cute fun and games but my job is not "write cute code snippets to play with the robot", it's maintaining a system that manages a database of sensitive data about real people. So I asked the AI to "write a node api to allow a customer to get their data from a SQL database in a multi-tenant system". It wrote this:
const express = require('express')
const app = express()
const bodyParser = require('body-parser')
const db = require('your-database-library')

// Use body-parser to parse incoming request bodies
app.use(bodyParser.json())

// GET request to retrieve customer data
app.get('/customer/:id', (req, res) => {
  // Retrieve the customer ID from the URL parameter
  const customerId = req.params.id

  // Connect to the database
  db.connect()

  // Query the database to retrieve the customer data
  db.query(`SELECT * FROM customers WHERE id = ${customerId}`, (err, results) => {
    // Handle any errors that occurred during the query
    if (err) {
      res.status(500).json({ error: err.message })
      return
    }

    // Return the customer data to the client
    res.json(results[0])
  })

  // Close the database connection
  db.end()
})

// Start the server
app.listen(3000, () => {
  console.log('Server listening on port 3000')
})
Even after I got it to fix the SQL injection vulnerability, you can still just request any customer's data you want: nothing checks that the caller is actually allowed to see the customer they asked for. This is worse than useless for real-world applications. Even if you just use it as a time-saving VSCode plugin for a human dev, all you've done is replace the relatively easy job of writing code with the much harder job of analysing code. You'll spend longer and end up with worse code.
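For the record, a competent human would write something more like this. (A sketch with some assumptions of mine: a mysql2-style driver, and a hypothetical requireAuth middleware that has already authenticated the caller and attached req.auth.tenantId.)

const express = require('express')
const mysql = require('mysql2')
const app = express()

app.use(express.json())

const pool = mysql.createPool({ /* connection config */ })

// requireAuth is hypothetical: middleware that authenticates the
// caller and attaches req.auth.tenantId to the request
app.get('/customer/:id', requireAuth, (req, res) => {
  // Parameterised query: no string interpolation, so no injection.
  // Scoped to the caller's tenant: you can only read your own data.
  pool.query(
    'SELECT * FROM customers WHERE id = ? AND tenant_id = ?',
    [req.params.id, req.auth.tenantId],
    (err, results) => {
      // And don't leak err.message to the client like the AI version did
      if (err) return res.status(500).json({ error: 'internal error' })
      if (results.length === 0) return res.status(404).json({ error: 'not found' })
      res.json(results[0])
    }
  )
})

app.listen(3000)

None of which the model volunteered, because none of it was in the prompt.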
I guess where I'm ending up on this is: GPT is already capable of writing basic code and will only improve, but we could get all those same improvements much more easily and reliably by creating good open-source app frameworks and high-level languages. If the goal is to empower humans to quickly specify behaviour and have the computer turn it into runnable code, then it isn't a groundbreaking new development, it's just a compiler that works in the most obtuse and preposterous way imaginable. And while it will absolutely get better and better with time, so will normal compilers and the languages they work on — and I just can't fathom how it could possibly ever pull ahead.
I don't know. Maybe this will age really badly. But I don't think so.
