-
9to5Mac reporting on a short Financial Times article about a silent war between Apple and Google:
Former Apple engineers say that Apple still holds a grudge over how Android allegedly copied iOS, and is steadily working to remove Google from the iPhone. Source: Apple is engaged in a ‘silent war’ against Google, claim engineers | AppleInsider
If there is one thing I’d like to see Apple do, it’s this: Apple (oops, Siri) Search for the web. No user tracking, no ads, and full integration within Apple’s ecosystem. Imagine the possibilities.
-
Adam Mosseri, Instagram’s boss, spoke about the controversy growing on his platform:
We definitely have a number of photographers who have been upset. I want to be clear: though we are leaning into video, we still value photos. Photos will always be a part of Instagram. Source: Instagram showed people too many videos last year, admits Adam Mosseri - The Verge
If you like photography, go somewhere other than Instagram. It focuses on user engagement, not photography. There are so many great platforms these days built for photography enthusiasts.
-
For those who don’t know me, I love photography. I’m an amateur photographer myself. Today, I want to share a link about an inspiring photographer named Adrian. If you like the B&W style, consider subscribing to his work (he’s on YouTube too).
-
I’m super happy to see Ivory finally out in the open. I was on the beta, downloaded the official release, and played with it a bit. It’s really a great client. For now, I’ll let things settle down and see how Tapbots delivers on its promise. I’m focusing on Micro.blog for now, but I’m curious about how the Mastodon ecosystem evolves. I don’t want a new Twitter in disguise. Micro.blog in its current form is a very good compromise.
-
I’m seriously warming up to Readwise Reader. I get the feeling that I’ll be able to build a database of references and notes while I’m reading. The tight integration with Readwise is also a plus. It’s really geeky stuff for power readers, which I’m probably not. I’m waiting for the filtered-view builder; they are working on a simplified, more visual version. Filtering the feed’s content by removing things like “Deal” or “Special Deal” would help cut some of the noise. I’m also keeping an eye on the mobile app.
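To give a rough idea of the kind of rule I have in mind, here is a minimal sketch of a title-based keyword filter in Python. Everything in it is hypothetical: the item structure, the `title` field, and the `NOISE_KEYWORDS` list are illustrations only, not anything Reader actually exposes today.

```python
# Hypothetical sketch of the kind of filtered view I'd like in Readwise Reader:
# hide feed items whose title contains noisy marketing keywords.

NOISE_KEYWORDS = ["deal", "special deal", "sponsored"]  # assumed examples

def is_noise(title: str) -> bool:
    """Return True when the title matches one of the unwanted keywords."""
    lowered = title.lower()
    return any(keyword in lowered for keyword in NOISE_KEYWORDS)

def filtered_feed(items: list[dict]) -> list[dict]:
    """Keep only the items worth reading; 'title' is an assumed field name."""
    return [item for item in items if not is_noise(item.get("title", ""))]

# Example with made-up feed items:
feed = [
    {"title": "Deal: 50% off a gadget"},
    {"title": "A thoughtful essay on photography"},
]
print(filtered_feed(feed))  # only the essay remains
```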
-
Sometimes I’m reminded that I wish I had picked “Digital Citizen” instead of “Numeric Citizen” as my “nom de plume”. In English, “Digital” carries more meaning than “Numeric”. Am I correct? 🧐
-
My Taxi Ride to the Past
I recently took a taxi from the airport because Uber rides were unavailable or plagued with longer-than-usual delays. We were directed to the traditional taxi lines. I couldn’t use an app on my iPhone to call a taxi instead.
Boy, it was a trip to the past. The driver had neither Google Maps nor Waze open to know where to go, only his memory and his knowledge of the city. The meter was this old, ugly box installed on the dash, partially blocking his view.
It was disorienting not to get any feedback about how long the trip would take or what the best route alternatives were along the way, and not to have a driver reputation score.
You would think that Uber kicking the taxi companies’ butts would push them to evolve the customer experience and get their shit together to build something competitive, but no. They seem to have given up a long time ago.
My message to taxi companies: enjoy the ride while it lasts.
-
More details are emerging about Apple’s rumoured headset… and this is troubling…
Using the headset will “feel familiar to Apple users,” with an interface that is close to identical to the look of an “iPhone” or an “iPad”. There will be a Home screen with app icons that can be rearranged, as well as customizable widgets. Source: Apple’s Mixed Reality Headset to Feature iOS-Like Interface, Advanced Hand Tracking, and Will Work as Second Display for Mac - MacRumors
Oh boy… I would expect a brand-new metaphor for a brand-new form factor. Otherwise, by using a familiar interface, it’s as if Apple is locked into its own creation.
Likely to be named “Reality Pro,” the headset will be able to switch between augmented reality and virtual reality. Augmented reality will overlay virtual objects on the real world, while virtual reality is an entirely virtual environment that shuts out the wearer’s surroundings. Augmented reality functions will work through a pass-through mode that will use the exterior cameras on the headset, and swapping between AR and VR will be done with a Digital Crown-like control knob.
“Reality Pro”!!!?? Someone must be kidding about that one, right? What an awful name! What about “viewPod”? Or at least something less “Macintosh Performa 620”-sounding, please.
For those who wear glasses, Apple will provide custom lenses that are able to sit within the enclosure, and Apple is expecting users to wear AirPods to get an audio experience on par with the visual experience that the headset provides, though it will have built-in speakers.
I do.
As previously rumored, the headset will have an external battery pack to prevent it from overheating on a user’s face due to the high-end Mac chips used for the device. The battery is approximately the size of two iPhone 14 Pro Max models stacked on top of one another, and it will power the headset for around two hours. An external battery will allow users to swap one battery and charge another to use the device for a longer period of time.
No. Just no. It’s a proof of concept sold to consumers.
-
For 2023, Apple is reportedly working on a larger 15-inch MacBook Air to join its Mac lineup. The new MacBook Air will feature the upcoming M3 chip, according to Bloomberg’s Mark Gurman. The 15-inch MacBook Air is expected to feature the same design as the current 13.6-inch model but with a larger display and longer battery life thanks to the efficiency of the M3 chip and the inclusion of a larger battery. Source: What’s Next for the Mac: M3 iMac, 15-Inch MacBook Air, Mac Pro, and More - MacRumors
The possible 15-inch MacBook Air makes a lot of sense from a product-line perspective (a non-pro machine with high portability but a larger screen). Where I disagree with Gurman is on the possibility of the device getting an M3 chip. That is way too early. The rest of the product line is still on the just-released M2. When the Mac Pro finally makes the switch, the M3 becomes a higher probability. In other words, I expect the 15-inch MacBook Air to get the same M2 chip as the current, smaller model.
-
OK, mind blown. I went to see the latest Avatar movie today and was blown away. It’s probably the most beautiful, entertaining, touching, impressive, and well-balanced movie I have ever seen. I’m just in awe of the ingenuity and craftsmanship that went into it. Wow.
-
So, the only way to set a profile image in IFTTT is to use one of the following services?? Not possible to upload a picture. My profile image is now blank because I deleted my connection to Twitter. Weird.
-
On the web, I prefer Matter, but on the iPad, I think I prefer Readwise Reader. In particular, I prefer the customizable home screen of the latter, but I prefer the reading experience of the former. That’s how it is today. Tomorrow? Who knows. 🤷🏻‍♂️
-
Spent most of the day in front of a computer. Thankfully, I learned quite a bit about video production. I saved $400 by not buying Final Cut Pro, and I produced my first video of 2023.
-
I’ve been watching a few beginner videos about Final Cut Pro because I’m searching for ways to improve my video production workflow. The new problem is this: how do I export recording segments out of ScreenFlow so I can import them into Final Cut Pro? There is currently no easy way to do this automatically. The process is manual but documented on a Telestream forum. I’m not so sure that using Final Cut Pro would be a boon to my workflow. Back to square one, but I learned a few things about Final Cut Pro and ScreenFlow export options. It’s a good thing in the end, as I won’t find a reason to upgrade to an M2 Pro Mac mini. 🤣
-
I’ve been testing Continuity Camera with my iPhone 13 Pro as the video and audio source for recording with ScreenFlow. I must say that I was very impressed and positively surprised.
First, the video quality is excellent, and I get Center Stage as a bonus. ScreenFlow sees it as a standard camera source. Second, with the Voice Isolation option, I no longer hear background noise from around the house, which is cool because I can stop asking people to be less noisy. Now, if only I could integrate Adobe Enhance Voice into my workflow…
Solution found (link)!
I may have a Blue Yeti microphone for sale soon!
-
Integrating Adobe Enhance Voice Tech Into My Video Production Workflow – In Search of a Solution
I don’t know how many people know about this free web tool by Adobe: Enhance Voice (link), but it is really impressive (@MattBirchler knows about it). Here is what I’d like to do: find a way to integrate this tool into my video production workflow.
So, I’m producing YouTube videos with ScreenFlow (my YouTube channel). So far, I’m OK with the results, but I think my voice, and the sound in general, could be improved (I’m using the Blue Yeti microphone, but Adobe Enhance Voice is really impressive).
So, how can I:
- Do my recording sessions as usual
- Do my video montage as usual
- Extract the audio track
- Use Adobe Enhance Voice to re-process the audio track
- Replace the audio track in my Screenflow document
- Export the final video
Steps 3 and 5 are not possible in the current release of ScreenFlow. Any suggestions for tools I could use instead? (A rough sketch of what I mean by these two steps follows below.)
Here’s what I know or already use:
- Permute allows for easy conversion of audio files, including converting video files into the audio-only version.
- QuickTime Player can export the audio track only out of a video file.
- I know how to use iMovie.
- I’m a happy user of Audio Hijack.
- I don’t really want to get rid of ScreenFlow. LumaFusion, Final Cut Pro, etc., could maybe do the job here, but I would be OK with a simple utility that can easily replace the audio track instead.
This question has been posted to the Screenflow Telestream forum.
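In the meantime, here is the rough sketch I mentioned of steps 3 and 5, done outside ScreenFlow with ffmpeg driven from Python. It assumes ffmpeg is installed (e.g. via Homebrew), the file names are placeholders, and it works on a video already exported from ScreenFlow rather than on the ScreenFlow document itself, so treat it as an illustration, not a ScreenFlow feature.

```python
# Sketch of steps 3 and 5 using ffmpeg (assumed installed, e.g. via Homebrew).
# File names are placeholders for illustration only.
import subprocess

EXPORTED_VIDEO = "montage.mp4"          # video exported from ScreenFlow (step 2)
EXTRACTED_AUDIO = "voice.wav"           # audio to send to Adobe Enhance Voice (step 3)
ENHANCED_AUDIO = "voice-enhanced.wav"   # file returned by Adobe Enhance Voice (step 4)
FINAL_VIDEO = "montage-enhanced.mp4"    # final video with the replaced audio (steps 5-6)

# Step 3: pull the audio track out of the exported video as an uncompressed WAV.
subprocess.run([
    "ffmpeg", "-y", "-i", EXPORTED_VIDEO,
    "-vn", "-acodec", "pcm_s16le", EXTRACTED_AUDIO,
], check=True)

# Step 5: put the enhanced audio back, copying the video stream untouched.
subprocess.run([
    "ffmpeg", "-y", "-i", EXPORTED_VIDEO, "-i", ENHANCED_AUDIO,
    "-map", "0:v", "-map", "1:a",
    "-c:v", "copy", "-c:a", "aac",
    "-shortest", FINAL_VIDEO,
], check=True)
```

Because the video stream is copied as-is and the enhanced WAV is muxed back directly, there is no extra re-encoding pass involved.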
Update #1: corrected a few typos and added the solution using iMovie. Here’s the solution.
- Do my recording sessions as usual
- Do my video montage as usual and export the video
- Extract the audio track in .MP3 format using Permute
- Use Adobe Enhance Voice to re-process the audio track
- Convert the enhanced .WAV file back into .MP3
- Launch iMovie and create a new Project
- Import the video produced in step 2
- Detach the audio track and delete it
- Add the enhanced version of the audio track
- Export the final video using iMovie’s share option
Voilà!
Update #2: there is a major issue with this process: the video and audio drift out of sync over time, even though both files are of the same duration. This is not easy to fix. Back to the drawing board.
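If anyone wants to chase the drift, here is a small diagnostic sketch, again with placeholder file names and assuming ffprobe (bundled with ffmpeg) is installed: it compares the exact audio duration and sample rate of the exported video and the enhanced file. A mismatch there would point at the MP3 round-trip or the iMovie re-encode rather than at the enhancement itself. It only narrows the problem down; it is not a fix.

```python
# Diagnostic sketch: compare audio duration and sample rate of two files.
# Assumes ffprobe (bundled with ffmpeg) is installed; file names are placeholders.
import json
import subprocess

def media_info(path: str) -> dict:
    """Return the container duration (seconds) and the first audio stream's sample rate."""
    result = subprocess.run([
        "ffprobe", "-v", "error",
        "-select_streams", "a:0",
        "-show_entries", "format=duration:stream=sample_rate",
        "-of", "json", path,
    ], capture_output=True, text=True, check=True)
    data = json.loads(result.stdout)
    return {
        "duration": float(data["format"]["duration"]),
        "sample_rate": int(data["streams"][0]["sample_rate"]),
    }

for name in ["montage.mp4", "voice-enhanced.wav"]:
    info = media_info(name)
    print(f"{name}: {info['duration']:.3f} s at {info['sample_rate']} Hz")
```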
-
Genuine Questions About The War in Ukraine
Here are a few questions that pop up in my mind when reading the news about the war in Ukraine.
-
How do they estimate Russian casualties? They are reported at around 500-800 KIA per day. That’s a lot. Is this number inflated for propaganda purposes?
-
Why is Putin not saying, “Any country sending arms to help Ukraine is, in fact, declaring war against Russia”? What would be the implications of such a declaration? Putin seems to be keeping a low profile. So why is he not more vocal against the West?
-
Why is the West not sending military aid to Moldova to put pressure on the 1,200 Russian soldiers in Transnistria? How is it possible for Russia to supply them with the required resources?
-
Why are tanks needed so much more than other types of weapons? Are fighter planes effective? What role does aviation play in Ukraine? Is it making a difference?
-
How many Ukrainian soldiers are killed each day? Why don’t we see these numbers as often as the Russian KIA figures?
So many unanswered questions.
-
Google’s official announcement of incoming layoffs:
I have some difficult news to share. We’ve decided to reduce our workforce by approximately 12,000 roles. We’ve already sent a separate email to employees in the US who are affected. In other countries, this process will take longer due to local laws and practices. Source: A difficult decision to set us up for the future
And a comment from Gruber:
There are numerous reasons the tech industry wound up at this layoffpalooza, but I think the main reason is that the biggest companies got caught up in a game where they tried to hire everyone, whether they needed them or not, to keep talent away from competitors and keep talent away from small upstarts (or from founding their own small upstarts). These big companies were just hiring to hire, and now the jig is up. Source: Daring Fireball
Here’s my view on this. Google is not alone; Microsoft and Meta announced major layoffs too. I’ve been working in IT for over thirty years, and I have never seen a situation where we had so much difficulty finding and hiring new people. Big companies are competing for great talent not only with each other but are also draining talent from smaller companies. It’s very difficult to compete in this context.
I think what is happening is not as catastrophic as it sounds. We will see a redistribution of the workforce across the industry. A lot of talent is being freed up from the big players and is now available to smaller companies where management is more sound and the financial posture is in better shape.
-
Matter is officially a paid service (if I want all the goodies). Meanwhile, my queue is full of unread articles, so that should be an excellent indicator of whether or not to subscribe.
-
Thanks for Paying Attention
There’s this question that keeps popping up in my mind since I’ve become more active on Micro.blog. Why am I getting way more interactions with others on Micro.blog compared to Twitter? What am I doing differently? I write about the same subjects, albeit maybe more frequently. I have a few possible explanations.
First, Twitter is full of bots. Twitter is a dumpster. I suspect many people and organizations are simply cross-posting stuff to Twitter without real human beings behind the content. I did exactly that myself via Buffer for a few years. The idea was to optimize exposure by scheduling posts at the “right” time. A bot worked on my behalf.
Second, and this is probably the most likely reason: the algorithmic timeline. The Twitter engine is tuned to generate higher engagement. The more you engage, the higher the probability that your content will appear in people’s timelines. If you’re well-known, again, the higher the likelihood that you will make it into the timelines of others.
I’m not well-known. I didn’t engage that much with others. Both made me a near-nobody on Twitter. So I didn’t get exposure, hence the lack of engagement with my content.
Third, there is just too much noise on Twitter to get noticed. My content competes against the rest of the Twittosphere. My content was noise for others, hence the lack of feedback, comments, and interactions.
Here on Micro.blog? Night and day. I’m not a star, far from it. But I get a sense that some people are paying attention.
Thanks for that anyway.