Testing Micro.blog Bookmarking Feature

For the first time today, I properly tested Micro.blog’s bookmarking feature. I don’t know if this is a popular feature among MB users, but I wonder if I should find a place for MB bookmarks in my workflow. Let me walk through a typical workflow.

So, I start reading an article in my now-favourite RSS reader, Inoreader. I decide to open the source website and use the bookmarklet to save the page to my MB bookmarks. Within a few minutes, MB creates a readable archive of the article, stripped of all the noise. Think of it as an MB version of Instapaper.

I open the newly created archive and start reading. When I select an interesting or particularly valuable passage in the browser, MB shows a subtle overlay titled “Highlight”. I click on it, and sure enough, the text gets highlighted. But that’s not all.

MB can display a list of all my highlights. If I find a highlight that I want to create a linkpost for, I simply click the “New post” button underneath it. And voilà, I can start writing my linkpost right there.

Moreover, MB offers a simple way to save a bookmark by entering the article’s URL into the provided field at the top of the “Bookmarks” section on the MB website. Very handy.

Bookmarks can be embedded in a blog post too. Just click “Embed” underneath a specific bookmark.

The only downside, for now, is the lack of data portability: bookmarks and highlights can’t be saved or exported outside MB.

The bookmarking feature is part of the Premium subscription tier.

Coming out of another rabbit hole…

👨‍💻 I’ve been extensively testing Inoreader recently, and I have to say that as much as I like the service, I find its selection of third-party integrations seriously lacking.

Inoreader supports many third-party services like Blogger, Telegram, Buffer, Evernote, LinkedIn, Hootsuite, Pocket, Google Drive, Instapaper, OneNote, Hatena Bookmarks and Dropbox.

It’s certainly a long list of services, but the problem is that I don’t use any of them. I recently cancelled Buffer and Pocket. I’m surprised to see Blogger but not WordPress or Ghost. Who’s using Hootsuite these days?

I wish Raindrop.io or Notion were supported; after all, both of these services offer APIs. Too bad, because better integrations often lead to more efficient workflows.

Building something around tags, IFTTT and RSS could unlock some form of automated workflow. For example, tagging an article in Inoreader would add it to a custom RSS feed; an IFTTT applet monitoring that feed could then create an entry in Things 3. The latter part is a challenge, though: IFTTT can’t talk to Things, but it can talk to Google Sheets.
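The tagging half of that idea can be sketched with a tiny shell filter. To be clear, the feed sample and tag name below are made up for illustration; a real Inoreader tag feed has its own URL and markup:

```shell
# A made-up stand-in for what a tag-filtered RSS feed might contain.
cat > tagged-feed.xml <<'EOF'
<rss><channel>
<item><title>A great read on RSS</title><category>things-inbox</category></item>
<item><title>Unrelated post</title><category>misc</category></item>
</channel></rss>
EOF

# Keep only the items carrying the tag an automation would watch for,
# then pull out their titles.
grep '<category>things-inbox</category>' tagged-feed.xml |
  sed -E 's|.*<title>([^<]*)</title>.*|\1|'
# → A great read on RSS
```

In a real setup, IFTTT’s RSS trigger would do this filtering for you; the point is only that a tag turns into a machine-readable signal something else can act on.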

Nothing is perfect I guess. 🤷🏻‍♂️

AI Comes to WordPress… Who Knew

The pervasiveness of AI is starting to look troubling at best. This week as I was heading to my WordPress admin page, I got this message at the top.

Who knew? I was curious so I read the official “AI Engine” plugin page on WordPress.org. Here’s something that I found dubious.

Five-star reviews only so far. I won’t have the pleasure of testing this plugin or implementing any of this on my website, as I’m getting ready to move off WordPress this year. Now, who’s next, Ghost?

The official plugin page can be found here.

A Typical Morning Rabbit Hole

I head to Micro.blog and start reading my timeline. I read someone’s post about note-taking apps, mentioning how great Bear Notes is. I switch to the Bear Notes website. After watching the introduction video, I head to the website’s blog section. Bear Notes certainly looks great. One particular blog post looks at how Bear Notes can work with other apps like Things 3 and Readwise. I wonder how I could use Bear Notes in my workflow. I’m reminded that Readwise highlights can be exported as Markdown files. I head to the Readwise website and give it a try. I play with the export options for a while and export all my highlights, more than 1,400 in total. Easy and quick. Next, I wonder if I could import them into Craft. Yes, I can. What about importing them into Ulysses? Yes, I could. Then I remember that Ulysses, my go-to writing app, is great. And I forget about Bear Notes.

Written and posted using Ulysses on my M1 MacBook Air.

The Computer is Wrong — Stay Critical & Curious

This evening, using an Apple Shortcuts shortcut called Clips, I imported “Apple in 2022: The Six Colors Report Card” by Six Colors into Craft. It’s a 25,000-word article that I started reading on my iPad, within Craft. Then, fifteen minutes into my reading, I asked Craft’s assistant to summarize the Mac section. It came up with this summary:

“In 2022, Apple released the M2 MacBook Air, Mac Studio, and Studio Display. The hardware in the Mac lineup is powerful and impressive, and the M2 MacBook Air has become a favorite of Mac users due to its light weight, zero noise, and long battery life, as well as its ability to handle workloads that used to require a pro-level desktop. macOS Ventura has some great features, but there is still backsliding represented by bugs that have gone unfixed and apps that don’t work well. While many users are disappointed that an expected Mac Pro did not arrive, Apple is still praised for their strong hardware lineup and the Mac Studio, which might be a success except for the long shipping delays.”

Because I’m an Apple pundit and read a lot about Apple in general, I find this summary “plausible”. But I also read the entire Mac section of the Six Colors report. Otherwise, how could I really know? This is where Matt Birchler’s article about ChatGPT, “The Computer is Wrong”, comes into view: it’s fun to play with ChatGPT or any derivative service, but staying critical and curious is still mandatory these days.

Eternally Unsatisfied With My Reading Apps

I’ve been a News Explorer user for a long time. It’s a lesser-known RSS reader compared to Reeder and the other big names. It’s really good, but it’s missing a few things that keep bugging me: there is no web version, no filtering feature, and no text highlighting.

Yesterday, I started testing Inoreader and Feedbin. Both seem like good RSS readers, but neither is satisfying. In fact, I’m never satisfied with anything when it comes to RSS readers and reading applications or services in general. It’s been going on forever.

Read-later apps are unsatisfying for me, too. None of Instapaper, Pocket, Matter, or Readwise’s Reader satisfies my needs. Reader is too busy and still immature; Matter is nice, but some things, like tag handling, don’t scale well.

The perfect combination of a read-later function with an RSS reader doesn’t exist. If I were twenty years younger, I would write my own.

Highly Troubling—Ops Are Taking Over Apple, My Friends

Don’t bother reading too much into the latest Apple financial numbers. They’re not too bad. What you should be paying attention to is this:

Apple is eliminating one of its most high-profile executive positions. According to a new report today, Apple is eliminating the role of “industrial design chief” as part of a broader shake-up. This role was once held by Jony Ive, and most recently held by Evans Hankey.

More specifically:

Under this new structure, the design team will report to Apple’s chief operating officer Jeff Williams. Source: Apple is eliminating its iconic ‘industrial design chief’ position

This comment by one of the 9to5Mac staff members is not reassuring at all:

I think it’s important to keep in mind, however, that Williams has been involved with the design team for several years at this point. Hankey has reported to Williams since 2019. The difference now is that the middle ground between Williams and the rest of the design team is being removed.

Maybe Hankey saw this coming and couldn’t adhere to this new direction. Here’s my take: ops are taking over Apple, and design is no longer the top priority. It is utterly troubling to read rumours of Williams possibly replacing Cook, which looks like more of the same, if you ask me. Maybe Williams has some design experience, but not from first-hand involvement. Troubling.

My Taxi Ride to The Past

I recently took a taxi ride from the airport, as Ubers were unavailable or plagued with longer-than-usual delays. We were directed to the traditional taxi lines; I couldn’t use an app on my iPhone to call a taxi instead.

Boy, it was a trip to the past. The taxi driver had no Google Maps or Waze open to find his way, only his memory and his knowledge of the city. The taxi meter was an old, ugly box installed on the dash, partially blocking his view.

It was disorienting not to get any feedback about how long the trip would take or what the best route alternatives were along the way, and not to have a driver reputation score.

You would think that Uber kicking the butt of taxi companies would push them to evolve the customer experience and get their shit together to build something competitive, but no. They seem to have given up a long time ago.

My message to taxi companies: enjoy the ride while it lasts.

Integrating Adobe Enhance Voice Tech Into My Video Production Workflow — In Search of a Solution

I don’t know if anyone knows about this free web tool by Adobe: Enhance Voice (link), but it is really impressive (@MattBirchler knows about it). Here is what I’d like to do: find a way to integrate this tool into my video production workflow.

I’m producing YouTube videos with ScreenFlow (my YouTube channel). So far, I’m OK with the results, but I think my voice, and the sound in general, could be improved (I’m using a Blue Yeti microphone, and Adobe Enhance Voice is really impressive).

So, how can I:

  1. Do my recording sessions as usual
  2. Do my video montage as usual
  3. Extract the audio track
  4. Use Adobe Enhance Voice to re-process the audio track
  5. Replace the audio track in my ScreenFlow document
  6. Export the final video

Steps 3 and 5 are not possible in the current release of ScreenFlow. Any suggestions for tools I could use instead?
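One candidate I haven’t tried for this (so treat it as a sketch, not a tested solution) is the free command-line tool ffmpeg: it can extract an audio track and later mux the enhanced track back in while copying the video stream untouched. Roughly:

```shell
# Generate a short test clip as a stand-in for the exported video
# (this first command exists only so the example is self-contained).
ffmpeg -y -loglevel error \
  -f lavfi -i testsrc=duration=2:size=320x240:rate=10 \
  -f lavfi -i sine=frequency=440:duration=2 \
  -shortest -pix_fmt yuv420p original.mp4

# Step 3: extract the audio track as an MP3 to feed Adobe Enhance Voice.
ffmpeg -y -loglevel error -i original.mp4 -vn -c:a libmp3lame audio.mp3

# (Step 4 happens in the browser; pretend audio.mp3 came back enhanced.)
cp audio.mp3 enhanced.mp3

# Step 5: replace the audio, copying the video stream as-is (-c:v copy).
ffmpeg -y -loglevel error -i original.mp4 -i enhanced.mp3 \
  -map 0:v -map 1:a -c:v copy -shortest final.mp4
```

Because `-c:v copy` leaves the video stream untouched, the replacement step involves no re-encoding of the picture.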

Here’s what I know or already use:

  • Permute allows for easy conversion of audio files, including turning video files into audio-only versions.
  • QuickTime Player can export the audio track only out of a video file.
  • I know how to use iMovie.
  • I’m a happy user of Audio Hijack.
  • I don’t really want to get rid of ScreenFlow. LumaFusion, Final Cut Pro, etc., could maybe do the job here, but I’d be OK with a simple utility that can easily replace the audio track instead.

This question has been posted to Telestream’s ScreenFlow forum.


Update #1: corrected a few typos and added a solution using iMovie. Here it is.

  1. Do my recording sessions as usual
  2. Do my video montage as usual and export the video
  3. Extract the audio track using Permute in .MP3 format
  4. Use Adobe Enhance Voice to re-process the audio track
  5. Convert the resulting .WAV file into .MP3
  6. Launch iMovie and create a new Project
  7. Import the produced video in step 2
  8. Detach the audio track and delete it
  9. Add the enhanced version of the audio track
  10. Export the final video using iMovie’s share option

Voilà!

Update #2: there is a major issue with this process: the video and audio drift out of sync over time, even though both files are the same duration. This is not something easy to fix. Back to the drawing board. 😒

Thanks for Paying Attention

There’s this question that keeps popping up in my mind since I’ve become more active on Micro.blog: why am I getting way more interactions with others on Micro.blog compared to Twitter? What am I doing differently? I write about the same subjects, albeit maybe more frequently. I have a few possible explanations.

First, Twitter is full of bots. Twitter is a dumpster. I suspect many people or organizations are simply cross-posting stuff on Twitter without real human beings behind the content. I did exactly that myself via Buffer for a few years. Optimizing exposure by scheduling posts at the “right” time was the idea. A bot worked for me.

Second, and this is probably the main reason: the algorithmic timeline. The Twitter engine is tuned to generate higher engagement. The more you engage, the higher the probability that your content will appear on people’s timelines. If you’re well-known, again, the higher the likelihood that you will make it into others’ timelines.

I’m not well-known. I didn’t engage that much with others. Both made me a near-nobody on Twitter. So I didn’t get exposure, hence the lack of engagement with my content.

Third, there is just too much noise on Twitter to get noticed. My content competes against the rest of the Twittersphere. My content was noise for others, hence the lack of feedback, comments, and interactions.

Here on Micro.blog? Night and day. I’m not a star, far from it. But I get a sense that some people are paying attention.

Thanks for that anyway. 🤗