Building FrameFit: AI-Powered macOS App in 3 Hours
I have a problem that many technical writers and product managers will recognize: inconsistent screenshots. When you’re creating documentation, having windows at different sizes makes everything look unprofessional. One image shows a browser at 1200px wide, the next at 1400px, and suddenly your work looks messy. I needed a tool to fix this.
I found Burke Holland’s ResizeMe, which was close, but I wanted something more tailored to my specific workflow. So, I decided to build my own solution: FrameFit. The challenge I set for myself was to build a fully functional macOS app in just three hours. With the help of AI, I did it.
This is the story of that three-hour journey, the good, the bad, and the surprising ways AI is changing how we build software.
The First Step: Defining the Vision with AI
I started my journey with CommandCode.ai, an AI development tool I was beta testing. My first step wasn’t writing code, but writing a clear prompt. I asked cmd to build the app with a simple set of requirements:
“Create me a macOS app that will be used to resize application windows to a preset of sizes and also allow to set custom presets. The application is inspired by the https://github.com/burkeholland/ResizeMe project.”
I also specified my preferred tech stack, what the tool calls “tastes”: TypeScript, React, and the requirement that all app references use the name “FrameFit”. I decided on Tauri for the underlying framework, which lets you build desktop applications with web technologies.
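To make the requirements concrete, here is roughly the preset model I had in mind for the frontend. This is my own illustration of the idea, not the code the AI generated; the type and default list names are mine:

```ts
// Illustrative sketch: a preset is just a named width/height pair,
// and user-defined presets get appended to this built-in list.
export interface Preset {
  name: string;
  width: number;
  height: number;
}

export const DEFAULT_PRESETS: Preset[] = [
  { name: "Docs", width: 1280, height: 720 },
  { name: "Full HD", width: 1920, height: 1080 },
  { name: "Laptop", width: 1440, height: 900 },
];
```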
The First Roadblock: When the AI Stumbles
The AI completed the job, but the initial output wasn’t perfect. I was met with a couple of build errors. This is where the real collaboration began.
The first error was a fundamental one: nothing from Tauri seemed to be working correctly. It was a frustrating start. After some investigation, I realized the AI-generated setup was flawed. I solved it by creating a default “Hello World” Tauri app myself and feeding that clean structure back to the AI. It learned from this, and we were past the first hurdle.
Next, a more subtle issue appeared with the tauri-plugin-store, a plugin for persistent data storage. The configuration generated by the AI was for an older version of the plugin, creating a version mismatch that broke the build.
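Once the versions lined up, persisting custom presets through the store plugin is only a few lines. The following is a hedged sketch against the Tauri v2 JavaScript API (@tauri-apps/plugin-store); the file name, key, and helper names are my own, not necessarily what ended up in FrameFit:

```ts
import { load } from "@tauri-apps/plugin-store";

// Illustrative preset shape; names are mine.
interface Preset {
  name: string;
  width: number;
  height: number;
}

// Open (or create) a JSON-backed store in the app's data directory, once.
const storePromise = load("framefit-settings.json", { autoSave: false });

export async function saveCustomPresets(presets: Preset[]): Promise<void> {
  const store = await storePromise;
  await store.set("customPresets", presets);
  await store.save(); // explicitly flush to disk
}

export async function loadCustomPresets(): Promise<Preset[]> {
  const store = await storePromise;
  return (await store.get<Preset[]>("customPresets")) ?? [];
}
```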
This back-and-forth process was crucial. The AI wasn’t a magic wand; it was a “copilot” (there’s a reason GitHub/Microsoft named theirs “Copilot”) that needed guidance and correction.
The Debugging Dance: A Tale of Two Errors
With the initial setup fixed, I moved on to the UI. The AI-generated UI was visually “perfect,” with a nice grid for presets and inputs for custom dimensions, all styled with Tailwind CSS. But the functionality was broken. This led to two interesting debugging stories.
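For context, the UI in question was roughly this shape. The sketch below is my own reconstruction (component, prop, and Tailwind class names are mine), not the generated code:

```tsx
// PresetGrid.tsx -- illustrative sketch of the preset grid.
interface Preset {
  name: string;
  width: number;
  height: number;
}

export function PresetGrid({
  presets,
  onSelect,
}: {
  presets: Preset[];
  onSelect: (preset: Preset) => void;
}) {
  return (
    <div className="grid grid-cols-3 gap-2 p-4">
      {presets.map((preset) => (
        <button
          key={preset.name}
          className="rounded-lg border px-3 py-2 text-sm hover:bg-gray-100"
          onClick={() => onSelect(preset)}
        >
          {preset.name} ({preset.width}x{preset.height})
        </button>
      ))}
    </div>
  );
}
```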
1. The Hardcoded Screen Resolution
I asked the AI to implement a feature to center the app window on the screen. Visually, it failed spectacularly, placing the window at the top-right or bottom-right of my screen. I did a quick code review and immediately spotted the problem: the AI had hardcoded the screen resolution to 1920x1080. My screen was much larger, so the coordinates were completely off.
I didn’t need to fix this myself. I simply pointed out the issue to the AI and asked it to use the current screen’s resolution instead of a hardcoded value. It understood and corrected the logic.
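The corrected approach is straightforward with Tauri’s window APIs. Here is a sketch, assuming the Tauri v2 JavaScript API (@tauri-apps/api); the function name is mine:

```ts
import { getCurrentWindow, currentMonitor } from "@tauri-apps/api/window";
import { PhysicalPosition } from "@tauri-apps/api/dpi";

// Center FrameFit's own window on whatever monitor it is currently on,
// instead of assuming a 1920x1080 screen.
export async function centerOnCurrentMonitor(): Promise<void> {
  const win = getCurrentWindow();
  const monitor = await currentMonitor();
  if (!monitor) return; // no monitor info available

  const winSize = await win.outerSize();
  const x = monitor.position.x + Math.round((monitor.size.width - winSize.width) / 2);
  const y = monitor.position.y + Math.round((monitor.size.height - winSize.height) / 2);
  await win.setPosition(new PhysicalPosition(x, y));
}
```

Tauri also exposes a center() helper on the window object, which makes the manual math unnecessary for the app’s own window; I show the calculation here because the same idea applies when positioning other apps’ windows later on.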
2. The Cryptic AppleScript Failure
The core of the app relies on AppleScript to control and resize application windows. The AI’s first attempt failed with a cryptic error:
AppleScript error: 402:406: execution error: System Events got an error: Can’t get desktop 1 of application process "Ghostty". (-1728)

I fed this error back to the AI. It provided a fix, which then led to a new error:

AppleScript error: 127:131: execution error: System Events got an error: Can’t get size of desktop 1. (-1728)

The AI fixed that one too, but then the positioning was incorrect. At this point, I recognized the pattern. Just like with the UI centering issue, the AppleScript was likely using hardcoded screen dimensions. I prompted the AI to use the dynamic screen resolution, and once I pointed it at the correct files to modify, everything finally clicked into place.
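The working shape is roughly this: query the real monitor size on the frontend and hand it, together with a templated AppleScript, to a backend command that shells out to osascript. Everything below is a hedged sketch; the command name run_applescript and the Rust side it implies are assumptions, not FrameFit’s actual code:

```ts
import { invoke } from "@tauri-apps/api/core";
import { currentMonitor } from "@tauri-apps/api/window";

// Resize and center another application's frontmost window via AppleScript.
// Assumes a backend command named "run_applescript" that executes the script.
export async function resizeAppWindow(
  appName: string,
  width: number,
  height: number
): Promise<void> {
  const monitor = await currentMonitor();
  const screenWidth = monitor?.size.width ?? width;
  const screenHeight = monitor?.size.height ?? height;

  const x = Math.max(0, Math.round((screenWidth - width) / 2));
  const y = Math.max(0, Math.round((screenHeight - height) / 2));

  // Target the process's front window directly; "desktop 1" is not something
  // System Events understands, which is what the earlier errors complained about.
  const script = `
    tell application "System Events"
      tell process "${appName}"
        set position of front window to {${x}, ${y}}
        set size of front window to {${width}, ${height}}
      end tell
    end tell
  `;

  await invoke("run_applescript", { script });
}
```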
The 3-Hour Verdict: AI as an Accelerator
In just under three hours, FrameFit was a living, breathing, and, most importantly, functional macOS app. The journey wasn’t seamless, but the AI’s involvement was transformative. It handled the boilerplate, generated complex logic, and, even when it made mistakes, it provided a solid foundation that we could debug and refine together.
This experience taught me that AI doesn’t replace the developer. It empowers them. My ability to spot a hardcoded value or understand the context of a cryptic error message was essential. The AI wrote the code, but I provided the direction, the oversight, and the critical thinking to guide us to the finish line.