Autonomous Agent · Computer Vision · macOS

Friday

The AI that controlled your Mac.

September 2025 – January 2026 · Solo project

An autonomous AI agent for macOS

Friday wasn't a chatbot or a voice assistant. It was an autonomous agent that could see the screen, move the mouse, type on the keyboard, and complete complex tasks on its own. No APIs, no cloud, no shortcuts. It used computer vision and chain-of-thought reasoning to perceive the display and decide what to do next, exactly like a human sitting at a computer. Except faster, and it never got tired.

100% on device · 0 APIs used · V2 final version

See the screen, understand it, act on it

Friday continuously screenshotted the display, ran OCR and vision models on it, and built a mental model of everything visible: buttons, text fields, menus, windows. From that understanding, it decided what to do next.
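A minimal sketch of what that perception step might look like in Python, using pyautogui for capture and pytesseract for OCR (both appear in the stack below). The ScreenElement structure and the confidence threshold are illustrative assumptions, not Friday's actual state model:

```python
# Sketch: screen perception. Capture the display, OCR it, and return a
# structured list of visible text elements. ScreenElement is illustrative.
from dataclasses import dataclass

import pyautogui
import pytesseract
from pytesseract import Output


@dataclass
class ScreenElement:
    text: str
    x: int       # left edge, in pixels
    y: int       # top edge, in pixels
    width: int
    height: int

    @property
    def center(self) -> tuple[int, int]:
        return (self.x + self.width // 2, self.y + self.height // 2)


def perceive_screen(min_confidence: float = 60.0) -> list[ScreenElement]:
    """Screenshot the full display and OCR every visible text element."""
    screenshot = pyautogui.screenshot()  # PIL Image of the whole screen
    data = pytesseract.image_to_data(screenshot, output_type=Output.DICT)
    elements = []
    for i, text in enumerate(data["text"]):
        if text.strip() and float(data["conf"][i]) >= min_confidence:
            elements.append(ScreenElement(
                text=text,
                x=data["left"][i], y=data["top"][i],
                width=data["width"][i], height=data["height"][i],
            ))
    # Caveat: on Retina displays the screenshot is in physical pixels while
    # the mouse APIs use points, so coordinates may need scaling by 0.5.
    return elements
```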

It could click, type, scroll, switch apps, and chain multi-step workflows across different programs. For complex tasks, it used chain-of-thought reasoning to break the problem down and execute each step in sequence.
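The action side maps naturally onto pyautogui's input primitives. A sketch of what that dispatch could look like; the action dictionary shape is an assumption, not Friday's real action format:

```python
# Sketch: synthetic input on macOS via pyautogui. The action dict shape
# is assumed for illustration.
import pyautogui

pyautogui.PAUSE = 0.2  # short pause after each call so the UI keeps up


def execute(action: dict) -> None:
    kind = action["kind"]
    if kind == "click":
        pyautogui.click(action["x"], action["y"])
    elif kind == "type":
        pyautogui.write(action["text"], interval=0.03)  # human-ish typing speed
    elif kind == "scroll":
        pyautogui.scroll(action["amount"])  # positive scrolls up
    elif kind == "switch_app":
        pyautogui.hotkey("command", "tab")  # cycle to another app
    else:
        raise ValueError(f"unknown action kind: {kind}")
```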

The most interesting behavior was tool delegation. Friday understood which app was best for which job. Need to write something? It opened ChatGPT. Build a presentation? Gamma. Design work? Canva. Research? Perplexity. It picked the right tool, just like you would.
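One plausible implementation of that delegation is a simple task-to-tool table. The mapping below and the launch mechanics (browser for web tools, `open -a` for native apps) are assumptions about how this could work, not Friday's actual routing logic:

```python
# Sketch: tool delegation. Route a task category to the tool best suited
# for it. The table itself is illustrative.
import subprocess
import webbrowser

TOOL_FOR_TASK = {
    "writing": "https://chat.openai.com",
    "presentation": "https://gamma.app",
    "design": "https://www.canva.com",
    "research": "https://www.perplexity.ai",
}


def delegate(task_category: str) -> None:
    target = TOOL_FOR_TASK.get(task_category)
    if target is None:
        return  # no dedicated tool; handle the task directly
    if target.startswith("http"):
        webbrowser.open(target)  # web tools open in the default browser
    else:
        subprocess.run(["open", "-a", target], check=True)  # native macOS app
```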

1. Screenshot: capture the full screen at regular intervals
2. OCR + Vision: extract all text, identify UI elements
3. State Model: build a structured representation of what is on screen
4. Reasoning: chain-of-thought to decide the next action
5. Execution: mouse click, keyboard input, scroll, or app switch
6. Verification: re-capture, confirm the action succeeded, continue
↺ loop
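Glued together, the loop might read like the sketch below. `ask_llm` is a placeholder for whatever on-device reasoning call Friday actually made, and `perceive_screen` and `execute` come from the sketches above:

```python
# Sketch: the perception -> reasoning -> action -> verification loop.
# ask_llm stands in for the on-device chain-of-thought model call.
import json
import time


def run_agent(goal: str, ask_llm, max_steps: int = 50) -> None:
    for _ in range(max_steps):
        elements = perceive_screen()  # steps 1-3: see the screen, model it
        state = [{"text": e.text, "center": e.center} for e in elements]
        prompt = (
            f"Goal: {goal}\nScreen: {json.dumps(state)}\n"
            'Reply with one JSON action, or {"kind": "done"} when finished.'
        )
        action = json.loads(ask_llm(prompt))  # step 4: reason about next move
        if action["kind"] == "done":
            return
        execute(action)   # step 5: act
        time.sleep(0.5)   # step 6: let the UI settle; the next iteration
        #                   re-captures the screen and verifies the result
```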


Two versions, both fully on device

Friday went through two major iterations. No external servers or cloud processing in either.

V1

Proof of concept. Basic screen reading, simple action execution, linear reasoning. Minimal error handling, but it proved the core idea: an AI could perceive a GUI and interact with it through synthetic input.

V2

The real version. Deeper reasoning chains that planned several steps ahead, robust error recovery, smarter action planning for complex tasks. The perception pipeline got significantly more accurate too.
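What "robust error recovery" could mean in practice: verify after each action and retry with backoff when the screen doesn't match expectations. A hedged sketch reusing the earlier helpers; the expect_text post-condition is an assumption:

```python
# Sketch: verification with bounded retries, one plausible shape for
# V2-style error recovery. expect_text is an assumed post-condition.
import time


def act_and_verify(action: dict, expect_text: str, retries: int = 3) -> bool:
    """Execute an action, then confirm expect_text appears on screen."""
    for attempt in range(retries):
        execute(action)
        time.sleep(0.5)  # give the UI time to update before re-capturing
        visible = {e.text for e in perceive_screen()}
        if any(expect_text in t for t in visible):
            return True
        time.sleep(1.0 * (attempt + 1))  # back off before retrying
    return False
```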

The real takeaway

Friday proved that an AI brain could perceive and act on a complex environment in real time. Not through APIs or pre-programmed scripts. It literally saw the interface and interacted with it the way a human would. That's a fundamentally different approach from most AI tools, which work behind the scenes through programmatic interfaces.

If a brain can control a screen by seeing and acting on it, what happens when the environment is not a screen but a physical robot body? That question became the foundation of 20n.

The perception · reasoning · action loop that powered Friday is the same loop that powers biological organisms. Friday ran it in a digital environment; the natural next step was to run it in the physical world.

The concept was proven. The real frontier was elsewhere.

Controlling a screen was the proof of concept. But a screen is a controlled, predictable, 2D environment. A physical robot body deals with gravity, friction, inertia, and a world that doesn't pause when you stop to think. That's where the real difficulty is, and that's where the real impact would come from. So Nick0 went all in on robotics. Friday's core architecture (the perception · reasoning · action loop, the continuous state monitoring) became the conceptual foundation of 20n Research Laboratory. Friday was not abandoned. It graduated.

Built with

Intentionally minimal. Just what's needed to capture the screen, understand it, reason about it, and act on it.

Python · Computer Vision · OCR (Tesseract) · pyautogui · LLM Reasoning · Screenshots · macOS

Friday is no longer active. The website now serves as a transition page pointing to 20n, the project that grew directly out of Friday's core ideas.