Using AI to write a Python CGI
2025-09-20 | #ai #cgi #python | @Acidus
Last week, I added a CGI to my capsule that would display the current moon phase.
In the footer, you will see a fun line I include in most of my projects, varying the emoji:
Made with 🤖 and ❤️ by Acidus
In this case, the 🤖 was a bit of a joke. Because this CGI is written in Python. And I don't know Python. I created the entire CGI using ChatGPT 5.
For me, this was a perfect project to use AI on because:
- This is a small, knowable project.
- I could articulate most of what I wanted, but I was interested in how the AI could suggest and shape ideas.
- I knew there would be a large amount of iteration, and I wanted to see how the AI coped with the back-and-forth.
Who did what?
During this process, I didn't feel like a developer at all. I felt much more like a Product Manager. I would tell the AI what I wanted, see what it came back with, and give it feedback and suggestions. Only occasionally, if there was some small change I wanted to make, like adding a footer, would I actually "write" any code. It was just easier to make the change myself than to ask for it. And even then, I wasn't really writing anything: I would add or edit a print statement, reorder the output, or tweak the Gemtext formatting.
Surprising Experiences
I had several interesting experiences that surprised me during this project. The first was that the very first CGI it created worked and was pretty good. There were some minor polish items I would have added as a human developer, but as a 1.0, it was perfectly cromulent. That's pretty impressive.
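The generated code isn't reproduced here, but the heart of a moon-phase CGI fits in a few lines. This is my own illustrative sketch, not ChatGPT's output: it assumes a mean synodic month and a fixed reference new moon, which is approximate, since real lunar cycles vary by several hours.

```python
#!/usr/bin/env python3
# Illustrative sketch of a moon-phase CGI, not the actual generated code.
# Assumes a mean synodic month and one reference new moon (approximate).
from datetime import datetime, timezone

SYNODIC_MONTH = 29.53058867  # mean days between new moons
REFERENCE_NEW_MOON = datetime(2000, 1, 6, 18, 14, tzinfo=timezone.utc)

PHASES = [
    "New Moon", "Waxing Crescent", "First Quarter", "Waxing Gibbous",
    "Full Moon", "Waning Gibbous", "Last Quarter", "Waning Crescent",
]

def phase_info(now=None):
    """Return (days into the current cycle, nearest named phase)."""
    now = now or datetime.now(timezone.utc)
    days = (now - REFERENCE_NEW_MOON).total_seconds() / 86400
    age = days % SYNODIC_MONTH
    index = int((age / SYNODIC_MONTH) * 8 + 0.5) % 8  # round to one of 8 phases
    return age, PHASES[index]

if __name__ == "__main__":
    age, name = phase_info()
    # A Gemini CGI writes the response header, then the Gemtext body
    print("20 text/gemini\r")
    print("# Moon Phase")
    print(f"Phase: {name}")
    print(f"Moon age: {age:.1f} days into the {SYNODIC_MONTH:.2f}-day cycle")
```

A linear approximation like this drifts against the real Moon, which is exactly the kind of thing that has to be sanity checked against a reference site.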
Another surprising thing was how helpful it was to be able to ask it non-developer questions that it would just know the answers to (to some reasonable approximation). For example, I very quickly started asking about astronomy topics like Moonrise and Moonset times, Super Moons (when a Full Moon happens while the Moon is closest to the Earth), and illumination vs the phase of the Moon. If I were a PM working with a human developer and I asked these questions, the human could perhaps make a guess, but it is unlikely they would know the answer with any certainty. They would have to get back to me later, slowing things down. Having someone who is roughly a junior developer, and who also has Wikipedia memorized, makes for fast conversations.
It goes beyond knowledge as well. When I decided I wanted to include some images of the moon, ChatGPT found a suitable image with appropriate Creative Commons licensing from Pixabay. But all the moon phases were in a single PNG image. I just asked ChatGPT to slice it into 8 different images for me, and it did it. I then asked it for the ImageMagick command to auto-crop them down, and got it. This made adding images to the CGI a roughly 5-min operation.
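The slicing step can be sketched with standard ImageMagick geometry operators. The sprite layout (8 phases in one row) and the filenames here are my assumptions, not the actual Pixabay files; the block below generates a stand-in image so the commands can run on their own, and skips gracefully if ImageMagick 7 isn't installed.

```shell
# Hypothetical sketch: filenames and an 8-tiles-in-one-row layout are assumed.
command -v magick >/dev/null 2>&1 || { echo "ImageMagick not installed"; exit 0; }

# Stand-in for the downloaded sprite sheet (8 tiles of 100x100 in one row)
magick -size 800x100 gradient: moon_phases.png

# Slice the single wide PNG into 8 equal tiles, one per phase
magick moon_phases.png -crop 8x1@ +repage phase_%d.png

# Auto-crop each tile: trim uniform borders and reset the canvas offset
magick mogrify -trim +repage phase_*.png
```

The `8x1@` geometry tells `-crop` to divide the image into an 8-column, 1-row grid rather than cut a fixed-size region.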
Also, I was surprised at how many extra ideas it came up with for related information to add. Some of these I would not have considered myself, including:
- An ASCII progress bar for where the Moon is in its cycle
- The traditional monthly names for each Full Moon like "Harvest Moon"
- Moon phase emoji art
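The progress-bar idea is simple enough to sketch. This is my own illustration, not the generated code; it maps the Moon's age in days onto a fixed-width bar, with the bar characters chosen arbitrarily:

```python
# Illustrative ASCII progress bar for the lunar cycle (not the CGI's code).
# `age` is days since the last new moon.
SYNODIC_MONTH = 29.53058867  # mean length of a lunar cycle in days

def cycle_bar(age, width=20):
    """Render the Moon's position in its cycle, e.g. [#####...............] 25%"""
    fraction = (age % SYNODIC_MONTH) / SYNODIC_MONTH
    filled = round(fraction * width)
    return "[" + "#" * filled + "." * (width - filled) + f"] {fraction:.0%}"

print(cycle_bar(7.4))   # roughly first quarter
print(cycle_bar(14.8))  # roughly full moon
```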
I didn't really notice the limits of ChatGPT until the end.
- Once I actually started testing the results against the data on Moon cycle websites, I found some errors. There was an off-by-one error due to accumulated rounding, and the illumination percentage and the cycle progress percentage were being conflated.
- At one point, I asked it to make an illumination progress bar just like the lunar cycle progress bar. ChatGPT did this by just adding another progress bar next to the lunar cycle progress bar. I already had a line for "Illumination" near the top of the CGI's output. A human would have understood that any illumination progress bar should go next to the existing illumination information. This was funny because ChatGPT did the logical thing but not the intuitive thing.
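The conflation is an easy mistake to make: cycle progress is linear in time, while illumination follows a cosine. A hedged sketch of the distinction (my own illustration, using the standard illuminated-fraction formula, not the CGI's code):

```python
# Cycle progress vs illumination: two different percentages.
import math

SYNODIC_MONTH = 29.53058867  # mean lunar cycle length in days

def cycle_progress(age):
    """Fraction of the synodic cycle elapsed -- linear in time."""
    return (age % SYNODIC_MONTH) / SYNODIC_MONTH

def illumination(age):
    """Illuminated fraction of the disc -- follows a cosine, not a line."""
    phase_angle = 2 * math.pi * cycle_progress(age)
    return (1 - math.cos(phase_angle)) / 2

# One quarter of the way through the cycle, the Moon is half illuminated,
# which is exactly where the two percentages visibly diverge:
age = SYNODIC_MONTH / 4
print(f"cycle progress: {cycle_progress(age):.0%}")  # 25%
print(f"illumination:   {illumination(age):.0%}")    # 50%
```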
Lies
The only time ChatGPT seemed confused and gave me outright wrong information was while I was writing this blog post. I asked ChatGPT to look at the timestamps of the chat and work out how long it took to create the CGI. It told me 10+ hours. When I asked it to list every prompt along with the actual timestamp of when I asked it, it gave me a much longer span for today, even though I wrote this on the evening of September 11th. I can look at the creation and modification times of the `moon.py` file: I started around 10:30 p.m. and finished around 1 a.m., so I spent ~2.5 hours on this. I have no idea why ChatGPT cannot work out when I asked what I asked.
Results and lessons
I'm pretty happy with the results. Overall, I spent 2.5 hours working with ChatGPT to pretty completely explore a problem and build out a solution.
- I wouldn't be comfortable asking ChatGPT to create something that I didn't know how to troubleshoot. I don't know Python, but the problem is simple enough that I can look at the math and the output and sanity check it.
- I would consider asking ChatGPT or another coding AI like Claude to help me write more complex programs (like helping with Kennedy) only in a language I already knew very well. In that sense, I treat it like a very junior developer and I'm doing a code review. This strikes me as a good use for it.
- I found ChatGPT pretty refreshing, like pairing with someone or being part of a team. It came up with approaches and ideas that surprised me, even as a fairly senior technologist. While many of them were not good (Zodiac signs, overly technical astronomical details, etc.), enough were that I was surprised and glad I asked. Since I now do so much of my programming by myself, as a hobby, this is a nice addition.
I have shared the entire ChatGPT conversation below: