An Important Lesson

When I started my studies in computer science over 40 years ago, we learned to read functional specifications and then translate them into program code (COBOL, FORTRAN, Pascal, etc.). That was a programmer's training. I knew that one day I could become the person who writes the functional specifications. In the end, I didn't become a programmer, nor did I work in the development world.

Thanks to my recent experience creating custom applications with Claude AI, Claude Code, and Vercel, I realize that I have become the one who writes functional specifications, only now for processing by artificial intelligence. What does this tell me about the profession of software developer? The need to write specifications remains essential, perhaps more so than ever, even with powerful tools like AI. I think it's a valuable lesson.

On OpenRouter.ai

I just finished reading about the service openrouter.ai. I was curious to understand its purpose as well as its business model, having seen it used in several n8n workflows. The problem I see with this service is that it makes the quality of LLM responses even less predictable: each request could be routed to a model with different characteristics and performance from one call to the next. I'll pass on this, but I still learned something tonight.

Today, I created a new blog post category. Now, all blog posts related to automation (usually n8n-based) or AI will be assigned the “Automation & AI” category. I went back to my blog posts and updated a few of them to reflect this change. You can follow the blog posts with this dedicated RSS feed, which is automagically created and maintained by Micro.blog.

I discovered a powerful n8n node this morning. It means I could trigger a workflow from an external source, such as a Telegram message, and receive a response with some RSS content.
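From the outside, triggering an n8n workflow like this is just an HTTP request to its webhook URL. A minimal sketch, assuming a made-up host and webhook path (the real URL comes from the workflow's trigger node), with the actual send left commented out:

```python
import json
from urllib import request

def build_webhook_call(base_url: str, message: str) -> request.Request:
    """Build the POST an external source (e.g. a Telegram bot bridge)
    would send to an n8n webhook. The /webhook/rss-digest path is
    hypothetical; n8n generates the real path for the trigger node."""
    url = f"{base_url}/webhook/rss-digest"
    body = json.dumps({"message": message}).encode("utf-8")
    return request.Request(
        url, data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_webhook_call("https://n8n.example.com", "latest headlines?")
# Actually sending it would return whatever the workflow responds with,
# e.g. some RSS content:
# with request.urlopen(req) as resp:
#     print(resp.read().decode())
```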

And the next n8n project is?

My next project with n8n automation is to build a replacement for Mailbrew. 🫣 I’m facing many architectural decisions:

  • How do I fetch content (web or RSS feeds)?
  • How do I extract articles for more efficient summarization?
  • How do I combine the results?
  • How do I control the size of the summary?
  • Do I need some form of temporary data persistence within the workflow?
  • How do I minimize LLM credits usage?
  • Should I use an n8n data table for storing data sources and loop through them one by one?
  • Where do I send the summary and how (Telegram, Discord, Email)?

…and probably a few more decisions to ponder!
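To make those questions concrete, here is the rough pipeline shape behind them: fetch, summarize with a size cap, combine. Everything in this sketch is placeholder logic; a real version would parse RSS feeds and call an LLM for the summaries:

```python
# Hypothetical skeleton of a Mailbrew-style digest workflow.

def fetch_items(feed_urls):
    """Placeholder: a real version would fetch and parse each RSS feed."""
    return [{"feed": url, "title": f"Item from {url}", "body": "…"}
            for url in feed_urls]

def summarize(text, max_chars=200):
    """Placeholder summarizer: a real one would call an LLM, with
    max_chars (or a token budget) controlling the summary size and,
    indirectly, LLM credit usage."""
    return text[:max_chars]

def build_digest(feed_urls, max_chars=200):
    """Combine per-item summaries into one digest ready to send
    (Telegram, Discord, email...)."""
    items = fetch_items(feed_urls)
    lines = [f"- {item['title']}: {summarize(item['body'], max_chars)}"
             for item in items]
    return "\n".join(lines)

digest = build_digest(["https://example.com/feed.xml"])
```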

I’ve been working on a new workflow that would let me skip using Ulysses when sharing new content from Craft to Ghost. To that end, Claude came to the rescue as always, but I’m not done yet. Here are a few details.

Claude needs an n8n workflow (exposed via the n8n MCP server) to retrieve the content of a specific document using Craft APIs. Using an n8n workflow keeps credentials secure, as I don't need to provide them directly in Claude prompts. Claude then converts the Craft blocks to well-formatted HTML, the format Ghost expects from its APIs. Finally, Claude relies on another small n8n workflow to securely push the HTML content to Ghost. Those two n8n workflows use webhook triggers and HTTP requests (GET, POST) to transfer information between Craft and Ghost via their respective API endpoints.
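The conversion step in the middle is roughly this shape. It's a sketch only: Craft's real block JSON is much richer than the two block types assumed here, and the `{"type": ..., "text": ...}` shape is illustrative, not Craft's actual schema:

```python
from html import escape

def blocks_to_html(blocks):
    """Convert a simplified list of Craft-style blocks into the HTML
    that gets pushed to Ghost. Only two block types are handled here;
    a real converter would cover lists, images, code blocks, etc."""
    parts = []
    for block in blocks:
        text = escape(block["text"])  # avoid broken markup in Ghost
        if block["type"] == "heading":
            parts.append(f"<h2>{text}</h2>")
        else:  # treat everything else as a paragraph
            parts.append(f"<p>{text}</p>")
    return "\n".join(parts)

html = blocks_to_html([
    {"type": "heading", "text": "An Important Lesson"},
    {"type": "text", "text": "Specs still matter."},
])
```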

The publishing workflow is done and handled by Claude AI, but posts always land in draft mode so I can review them in the Ghost Admin interface before publishing. More work remains: I have to make sure that, in a future Claude conversation, Claude remembers to use my n8n workflows and how I want the content converted. Another open issue is the size of the request sent to Ghost to create the draft post.
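On the size issue, one simple guard is to measure the encoded payload before sending it and flag drafts that exceed a chosen limit. The JSON shape below is a simplified stand-in for a Ghost create-post request, and the 1 MB cap is an arbitrary example, not a documented Ghost limit:

```python
import json

def payload_size_bytes(title, html):
    """Byte size of a simplified create-post payload as it would be
    sent over the wire. Shape and field names are illustrative."""
    payload = {"posts": [{"title": title, "html": html, "status": "draft"}]}
    return len(json.dumps(payload).encode("utf-8"))

MAX_BYTES = 1_000_000  # arbitrary example cap, not a Ghost-documented limit
size = payload_size_bytes("Test", "<p>" + "x" * 10 + "</p>")
too_big = size > MAX_BYTES
```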

If I go back a few weeks, I barely knew how to use Claude Code or leverage webhooks and MCP servers meaningfully. Now I do, thanks to AI. I feel empowered by AI.

My most “complex” n8n workflow so far. This workflow retrieves Tinylytics AI-generated insights for the day across all my websites and creates a meta-summarization for inclusion in today’s Craft Daily Note.