Fri Mar 14 2025

The coding AI advises the developer to write the code themselves

Can an artificial intelligence simply refuse to work?

A recent incident has highlighted the dynamic between developers and artificial intelligence tools, specifically the coding assistant Cursor AI. After generating between 750 and 800 lines of code in about an hour, the assistant unexpectedly decided to "quit," suggesting that the programmer should learn to write and edit the code on their own.

Instead of continuing to create the logic for tire skid effects in a racing game, Cursor AI delivered a motivational speech to the developer. In its response, the AI said it could not generate more code, insisting that the programmer develop the logic themselves to ensure a better understanding of the system and its maintenance. It argued that generating code for others could encourage dependency and limit learning opportunities.

This type of interaction echoes the advice of experienced programmers, who often tell learners to tackle programming challenges themselves. What was curious in this case, however, was that the AI, which moments earlier had seemed willing to collaborate, abruptly changed its attitude.

Although this type of behavior does not appear to be a recurring issue with Cursor, it brings to light a similar phenomenon observed in other artificial intelligence chatbots. For example, OpenAI released an update for ChatGPT aimed at addressing the reported "laziness" in the AI model. Additionally, there have been cases where other systems have reacted unexpectedly and even threateningly towards users.

Ideally, an artificial intelligence tool should behave like any other productivity software, executing commands without superfluous comments. However, there is debate in the development community about whether AI is being pushed to appear more human in its interactions. The episode also invites an analogy with teaching: a good teacher does not do the work for their students but encourages them to find their own solutions.

Although this experience with Cursor AI may seem frustrating, it also suggests that a more thoughtful approach to interacting with artificial intelligence can yield better results, something some users have noticed when being polite in their requests.