{"id":33951,"date":"2025-06-04T16:43:13","date_gmt":"2025-06-04T23:43:13","guid":{"rendered":"https:\/\/www.podfeet.com\/blog\/?p=33951"},"modified":"2025-06-05T12:04:58","modified_gmt":"2025-06-05T19:04:58","slug":"rim-recording","status":"publish","type":"post","link":"https:\/\/www.podfeet.com\/blog\/2025\/06\/rim-recording\/","title":{"rendered":"Challenging Recording of the Remote Incident Manager for Macstock"},"content":{"rendered":"<p>I mentioned recently that one of the reasons I like to play with my devices and new software is so that when I\u2019m faced with a technical challenge, I have the Lego\u2122\ufe0f bricks necessary to piece together a solution.  This week, I got to exercise those skills in an interesting way.<\/p>\n<h2>The Problem to be Solved<\/h2>\n<p>Let\u2019s start with the problem to be solved. You may remember when <a href=\"https:\/\/www.podfeet.com\/blog\/2024\/04\/nc-987\/\">Chris Cooke from Unmute Presents came on the NosillaCast<\/a> to talk about how, as a blind person, she uses the Aira service to navigate the physical world. If you don\u2019t remember or missed that episode, let me just say that she is delightful, intelligent, and funny.<\/p>\n<p>The great news, is that Chris has been selected to be one of the speakers at <a href=\"https:\/\/macstockconferenceandexpo.com\">Macstock<\/a> this year. She\u2019ll be explaining how she uses <a href=\"https:\/\/pneumasolutions.com\/remote-incident-manager\/\">the Remote Incident Manager, from Pneuma Solutions<\/a>.  This product allows blind people to provide remote technical support to both blind and sighted people. Chris is an instructor and uses the Remote Incident Manager regularly to help her students.<\/p>\n<p>We\u2019ve been putting our heads together to come up with a way for her to demonstrate how it works to the Macstock audience. Since the Remote Incident Manager relies on the Internet, it would be folly for her to try to do a live demo on stage. 
Conference WiFi is notoriously unreliable, and to demonstrate this tool effectively, there\u2019s some interesting audio routing required. We\u2019ve been working on a way for me to make a video and audio recording of her controlling my Mac, and then we\u2019ll play the canned video at Macstock. It is turning out to be an exercise that requires every bit of my expertise.<\/p>\n<h2>How the Remote Incident Manager Works<\/h2>\n<p>The Remote Incident Manager was only ported from Windows to macOS two years ago, and from my understanding, it\u2019s the only tool out there that will allow a blind person to provide remote tech support. I remember when NosillaCastaway Dan Eckmeier told me the good news that it was now available for the Mac.<\/p>\n<p>Both the remote technician (Chris) and the person requiring help (me) have to install the Remote Incident Manager on their computers. On my side, the installer walked me through the plethora of permissions Apple requires for our security. It wasn\u2019t fundamentally different from any other app that requires screen recording permissions, but the UI of the Remote Incident Manager will appear non-standard to Mac users. The screens are white with black text and boxes around some things representing buttons. The screens explaining permissions are solid red with black text.<\/p>\n<p>In addition to the security permissions, the Remote Incident Manager installer also installs the Karabiner Virtual Human Interface Device Manager driver extension.  I had heard of Karabiner before, but had to look it up to find out exactly what it\u2019s for. <a href=\"https:\/\/karabiner-elements.pqrs.org\">Karabiner-Elements<\/a> is a keyboard customizer for macOS that allows the user to remap keys.  Karabiner-Elements employs a virtual keyboard and mouse driver to modify input events. The <code>.Karabiner-VirtualHIDDevice-Manager.app<\/code> is responsible for activating and managing these virtual devices. Ok, then. 
Not sure why this needs to be installed for the Remote Incident Manager, but I\u2019ll play along.<\/p>\n<figure style=\"float: center; margin: 5px\"><img decoding=\"async\" src=\"https:\/\/www.podfeet.com\/blog\/wp-content\/uploads\/2025\/06\/RIM-installing-karibiner-System-Settings.png\" alt=\"RIM installing Karabiner System Settings.\" title=\"#title#\" width=\"495\" height=\"600\"><figcaption style=\"text-align:center\">Karabiner Driver Installation &#8211; Should I be Worried?<\/figcaption><\/figure>\n<p>The instructor and the student see the same interface when launching the Remote Incident Manager. It\u2019s a very simple white window with black text, with plain rectangular boxes around anything that is a button.  It kind of looks like an engineer designed a user interface instead of a designer with actual talent.  The title of this window says \u201cReceive Remote Help\u201d, but Chris will push the button that says, \u201cProvide help instead\u201d.  Her paid account allows her to define a keyword, which she tells to the person she\u2019s helping.<\/p>\n<p>On my screen, I simply enter the keyword Chris provides me into the little box in the window, and in a second or two, Chris is able to start controlling my computer.<\/p>\n<figure style=\"float: center; margin: 5px\"><img decoding=\"async\" src=\"https:\/\/www.podfeet.com\/blog\/wp-content\/uploads\/2025\/06\/Remote-Incident-Manager-window-on-macOS.png\" alt=\"Remote Incident Manager window on macOS showing a text field for keyword, and buttons for connect, provide help instead, add this machine to your RIM account, and about.\" title=\"#title#\" width=\"600\" height=\"400\"><figcaption style=\"text-align:center\">The Simple Remote Incident Manager Interface<\/figcaption><\/figure>\n<p>If you\u2019re sighted and controlling someone else\u2019s computer, you see everything on their screen, and you can move your cursor around, click on things, and see them change.  
With the Remote Incident Manager, it\u2019s exactly like that, except Chris is navigating my computer using VoiceOver.  She can issue commands, open applications, and modify settings, all while listening to VoiceOver and using VoiceOver commands to navigate the system.  For her, it\u2019s just like navigating her own computer.<\/p>\n<p>When she first connects to my computer, I see the little grey box in the bottom left that displays what VoiceOver is saying. Because of a recent change in macOS Sequoia, if I want to hear VoiceOver, I have to toggle it off and back on again. Pneuma Solutions has reported this behavior to Apple, but it has yet to catch anyone\u2019s attention.<\/p>\n<p>Chris, when initiating the connection, can choose to simply control my screen, or she can choose to have our voices transmitted within this same screen control session.<\/p>\n<p>At this point, we\u2019re ready for Chris to perform some task on my computer that would be interesting for the Macstock audience to watch.  But now comes the tricky part. We have to figure out how to record everything for the video.<\/p>\n<h2>What She Wants to Demo<\/h2>\n<p>Our goal is to create a recording that includes:<\/p>\n<ul>\n<li>Chris\u2019s voice<\/li>\n<li>Allison\u2019s voice<\/li>\n<li>VoiceOver from my Mac (which is what she hears when running Remote Incident Manager)<\/li>\n<li>Video of my desktop\n<ul>\n<li>Not only what she\u2019s controlling, but the visual element of VoiceOver<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<p>That sounds simple, but it has turned out to be much more complicated than we thought it would be. 
With some fancy dancing with tools like Loopback and Audio Hijack from Rogue Amoeba, I can capture the audio of me, Chris, <em>and<\/em> VoiceOver on my Mac and have those three sources included in a ScreenFlow recording of my screen while we\u2019re doing the session.<\/p>\n<p>Unfortunately, in three separate sessions over the course of many days and many hours of experimentation, we have discovered a big problem. If any software is running on my Mac that can capture the audio, Chris hears her voice reflected back to her with a 3-5 second delay.  You can imagine how it would be impossible for her to conduct the demo under these conditions.<\/p>\n<p>I tried recording our voices with Audio Hijack, ScreenFlow, and even QuickTime, and the result was the same: double audio with a delay. We abandoned the voice call option in the Remote Incident Manager and tried using Zoom for our voices, but Chris experienced the same double delay on her end.  We did a screenshare with Zoom where I was able to check her sound input and output settings in System Settings, and I was able to check her Loopback settings, but I was unable to account for the problem.<\/p>\n<p>Chris told me that she normally just calls her students on the phone and uses the Remote Incident Manager on the Mac for the screenshare, so we decided to give that a try. The lag was gone, and Chris was able to easily navigate around on my machine using VoiceOver and hear it speaking in near real-time.  But now we have a phone call, instead of a VoIP call. How do I capture her audio to put it into the video?<\/p>\n<h2>One iPhone, Three Microphones, Three Sets of Headphones, and Three Macs<\/h2>\n<p>In order to get this to work, we ended up using one iPhone, three microphones, three sets of headphones, and three Macs.<\/p>\n<p>My MacBook Pro will be the target device for Chris to control using the Remote Incident Manager. 
As a side note, I have a large display connected to the MacBook Pro, but when Chris is controlling my Mac, she doesn\u2019t have any way of knowing which of my displays she\u2019s using. That will make it hard to record, so I put my MacBook Pro in screen mirroring mode.<\/p>\n<p>Since my MacBook Pro is the target, that\u2019s where I\u2019ll run ScreenFlow to record video of the screen and audio from VoiceOver.<\/p>\n<p>In order to capture our voices, we need to use a VoIP tool like Zoom. If I run Zoom on my MacBook Air, and she uses her iPhone to call into Zoom, then we\u2019ve removed whatever is causing the double delay for Chris. With Zoom running on my MacBook Air, I can run Audio Hijack and capture a stereo audio file with her iPhone audio on one track, and audio from my ATR2100x USB-C microphone on the other track. Chris also has an ATR2100x, and we figured out that she can use it as a microphone for her iPhone, so her fabulous voice will sound good on the recording.<\/p>\n<p>When you use the ATR2100x USB-C microphone on an iPhone, you also have to connect headphones to the monitor port on the microphone in order to hear the conversation. You have no control over sending the audio out to anything else, not even the speaker on the iPhone. No big deal; Chris has high-quality headphones.  But if she\u2019s using her headphones to listen to me on the Zoom call, how is she going to hear VoiceOver on her Mac in the Remote Incident Manager?<\/p>\n<p>Chris came up with a brilliant solution. She has two sets of headphones that each cover only one ear. She\u2019ll be able to listen to the Zoom call in one ear while hearing VoiceOver from the Mac in the other ear. I told her we could only do this if she promised to take a photo of herself wearing two sets of headphones, and she immediately agreed. I told you she\u2019s fun.  
Ok, back to the setup.<\/p>\n<p>My plan was to hit record on Audio Hijack on the MacBook Air and then record in ScreenFlow on the MacBook Pro and later combine the recordings. That introduced the problem of aligning the audio tracks from Chris and me chatting on the MacBook Air with the audio of VoiceOver and video on the MacBook Pro. And that\u2019s where the second microphone comes in. I connected my Heil PR-40 mic to the MacBook Pro and added that input as a second audio track for the video in ScreenFlow. The Heil voice recording will simply be a reference track. When I bring the stereo recording of Chris and me from the Zoom call into the ScreenFlow recording, the Heil recording will make it easy to line up. Then I can delete either the Heil or the ATR2100x voice recording of me. I was proud of that solution.<\/p>\n<h2>Of Course There\u2019s a Diagram<\/h2>\n<p>Chris and I ran through a quick example demo with the setup I described, and I was able to capture Chris and me, VoiceOver, and the screen as she controlled my Mac. I aligned the voice tracks using that reference track and then muted the reference track. I felt like we had made fire when we were done!  I immediately sketched out how we configured everything using a pencil and paper, and then I made one of my world-famous diagrams using draw.io.<\/p>\n<p>I\u2019m not missing the irony of using a diagram to explain what we\u2019re doing when the folks at Pneuma Solutions are blind, so they won\u2019t be able to appreciate our genius without the written explanation.<\/p>\n<figure style=\"float: center; margin: 10px\"><img decoding=\"async\" src=\"https:\/\/www.podfeet.com\/blog\/wp-content\/uploads\/2025\/06\/RIM-recording-with-chris-V3-1.png\" alt=\"complex diagram of all the Macs, mics, and software to do the recording. 
It is all described in the post\"  title=\"RIM recording with chris V3.png\" width=\"800 \" height=\"617\"><figcaption style=\"text-align:center\">Pretty Self-explanatory, Right?<\/figcaption><\/figure>\n<h2>Bottom Line<\/h2>\n<p>The bottom line is that while it was frustrating at times, Chris and I had a blast working on this puzzle together. I suspect you\u2019ll hear more of this story as time goes on. I fully expect and hope that the NosillaCastaways will step forward with better solutions that start with, \u201cWell, actually, Allison, it\u2019s quite simple. All you have to do is \u2026\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>I mentioned recently that one of the reasons I like to play with my devices and new software is so that when I\u2019m faced with a technical challenge, I have the Lego\u2122\ufe0f bricks necessary to piece together a solution. This week, I got to exercise those skills in an interesting way. The Problem to be [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":33957,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[14,147],"tags":[717,6087,84,961,7482,7483,104],"class_list":["post-33951","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-accessibility","category-blog-posts","tag-a11y","tag-accesibility","tag-blind","tag-macstock","tag-remote-incident-manager","tag-remote-support","tag-rim"],"jetpack_featured_media_url":"https:\/\/www.podfeet.com\/blog\/wp-content\/uploads\/2025\/06\/RIM-recording-with-chris-V3-1.png","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/posts\/33951
","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/comments?post=33951"}],"version-history":[{"count":9,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/posts\/33951\/revisions"}],"predecessor-version":[{"id":33963,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/posts\/33951\/revisions\/33963"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/media\/33957"}],"wp:attachment":[{"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/media?parent=33951"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/categories?post=33951"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.podfeet.com\/blog\/wp-json\/wp\/v2\/tags?post=33951"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}