I believe I decided to build my web apps at the right time1, because if I did it now, it would take two or three times longer due to the current credit consumption rate enforced by Anthropic. I made a little tweak to my dashboard web app this morning, and I’m already at 36% for the current session. It’s really that bad. I wouldn’t pay $200 a month to get more credits.


  1. In the first three months of 2026. ↩︎

This morning, I realized that managing open issues and bugs on GitHub is advantageous: the more detailed each open issue is, the more effectively it serves as a prompt when imported into Claude Code to start a new bug-fixing session. Claude Code can also close the issue at my request and link it to a specific GitHub commit.
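The issue-as-prompt idea can be sketched in a few lines against the public GitHub REST API. This is a minimal illustration, not my actual setup: the `fetch_issue` helper and the prompt wording are assumptions, and the owner/repo/issue values would be placeholders.

```python
import json
import urllib.request


def issue_to_prompt(issue: dict) -> str:
    """Turn a GitHub issue payload into a bug-fixing prompt for Claude Code."""
    return (
        f"Fix the following bug (GitHub issue #{issue['number']}):\n\n"
        f"Title: {issue['title']}\n\n"
        f"{issue.get('body') or ''}"
    )


def fetch_issue(owner: str, repo: str, number: int) -> dict:
    """Fetch a single issue from the GitHub REST API (no auth needed for public repos)."""
    url = f"https://api.github.com/repos/{owner}/{repo}/issues/{number}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

The more context the issue body carries (steps to reproduce, expected behavior, stack traces), the less back-and-forth the resulting Claude Code session needs.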

I’ve been testing the latest release of the Claude Desktop app, and I must say, more than ever, I prefer integration to splitting features across many different apps. I’m also leaving the CLI behind for now.

For all the pending issues across my different Claude Code projects and web apps, I just realized I could (and probably should) use GitHub for issue tracking instead of Craft. Additionally, I could develop a web app to monitor and manage all open issues across my repositories!

Today, I tested Ollama and Locally AI more extensively on my M2 MacBook Air, and it was quite demanding on the machine. It’s no surprise that a serious local AI setup requires an Apple M5 Pro or M5 Pro Max with at least 16 or 24 GB of RAM. My M4 Pro Mac mini has 24 GB, and I could use it remotely through an SSH session. This experimentation puts any plans to replace my aging M2 MacBook Air into perspective.
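Using the Mac mini remotely could work by forwarding Ollama’s default HTTP port (11434) over SSH, e.g. `ssh -L 11434:localhost:11434 user@mac-mini.local`, then querying it as if it were local. A minimal sketch of the client side, assuming Ollama’s `/api/generate` endpoint and a placeholder model name:

```python
import json
import urllib.request

# Tunneled from the Mac mini via: ssh -L 11434:localhost:11434 user@mac-mini.local
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    """Request body for Ollama's /api/generate endpoint (non-streaming)."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(model: str, prompt: str) -> str:
    """Send a prompt to the (tunneled) Ollama server and return the response text."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

This way the MacBook Air only carries the tunnel and the HTTP client, while the Mac mini’s 24 GB does the actual inference.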

Anthropic Rebuilds Claude Code Desktop App Around Parallel Sessions:

Anthropic has released a redesigned Claude Code experience for its Claude desktop app, bringing in a new sidebar for managing multiple sessions, a drag-and-drop layout for arranging the workspace, and more.

I’ve been testing the new fat client and found it familiar yet overwhelming; it’s taking the shape of a full IDE. Monolithic clients apparently aren’t in people’s favor these days, but I prefer this approach over separate apps.

Creating an email summarizer was simpler than expected. However, Claude AI struggled significantly with email decoding and data extraction. While one might assume these processes are well-documented and easy, that doesn’t seem to be the case. Additionally, setting up a new Gmail account proved to be unreliable. I encountered numerous errors at various stages, making me question whether the process actually succeeded. You’d expect Google to handle this smoothly, but unfortunately, I wasn’t so lucky.
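Part of why email decoding trips up an LLM-driven pipeline is that a raw message can be multipart, base64- or quoted-printable-encoded, and declare arbitrary charsets. Python’s standard `email` library handles all of that; here is a minimal extraction sketch (the function name is my own, and a real workflow would pass the result on to the summarization step):

```python
from email import message_from_bytes, policy


def extract_plain_text(raw: bytes) -> str:
    """Decode a raw RFC 5322 message and return its plain-text body.

    The modern `email.policy.default` parser transparently handles multipart
    messages, base64/quoted-printable transfer encodings, and charsets.
    Falls back to the HTML part if no plain-text part exists.
    """
    msg = message_from_bytes(raw, policy=policy.default)
    body = msg.get_body(preferencelist=("plain", "html"))
    return body.get_content() if body is not None else ""
```

Feeding Claude the already-decoded text, rather than asking it to reason about MIME structure, sidesteps most of the struggles I ran into.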

Here’s the overall workflow.

With my blog’s new custom design in place, I’m considering my next project: developing an email summarizer using Gmail, n8n workflows, Claude AI, and Discord. I don’t plan to use this often, but I do get some emails I wish could be summarized simply by forwarding them to a dedicated email address.

Inoreader announced support for third-party AI providers for article summarization. Anthropic and OpenAI are supported, among others. Just enter your own API key and voilà! But there is a big catch: even on Pro plans, you need an add-on upgrade to enable this. That I don’t understand, because in this scenario Inoreader is delegating the LLM calls to the user’s own API key, so they incur no additional costs. I find this perplexing, to say the least. Or I might be missing something. I hope someone at Inoreader will catch this comment.