Allison interviews Sandeep Arya from Naqi Logix about their advanced, non-invasive control platform that uses wearable devices to enable hands-free, voice-free interaction with digital systems. The platform is designed as an “Invisible User Interface,” allowing users to control technology without screens, cameras, touch, or speech.
Naqi Logix’s neural earbuds detect subtle biological signals such as jaw movements, facial micro-gestures, and muscle impulses. AI algorithms translate these signals into intentional commands, enabling intuitive control of computers, smart devices, robotics, AR/VR systems, and mobility equipment.
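Naqi has not published its SDK, so the following Python sketch is purely illustrative of the pipeline described above; every name in it (the gesture labels, GESTURE_TO_COMMAND, classify_gesture) is a placeholder, and the "classifier" is a toy energy threshold standing in for a trained model.

```python
# Illustrative only: Naqi Logix's SDK is not public, so every identifier
# here is hypothetical. The sketch shows the pipeline the summary describes:
# a window of raw EMG samples -> a gesture label -> a device command.
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class Command:
    device: str  # e.g. "wheelchair", "tv", "pc"
    action: str  # e.g. "turn_left", "select", "click"

# Hypothetical mapping from recognized micro-gestures to commands.
GESTURE_TO_COMMAND = {
    "jaw_clench":   Command("pc", "click"),
    "head_tilt_l":  Command("wheelchair", "turn_left"),
    "head_tilt_r":  Command("wheelchair", "turn_right"),
    "double_blink": Command("tv", "select"),
}

def classify_gesture(emg_window: Sequence[float]) -> str:
    """Stand-in for the AI model: label one window of EMG samples.

    A real system would run a trained classifier here; this stub just
    thresholds signal energy to keep the example self-contained."""
    energy = sum(x * x for x in emg_window) / max(len(emg_window), 1)
    return "jaw_clench" if energy > 0.5 else "idle"

def to_command(emg_window: Sequence[float]) -> Optional[Command]:
    """Translate an EMG window into a command, or None for idle/unknown."""
    return GESTURE_TO_COMMAND.get(classify_gesture(emg_window))

print(to_command([0.9, 0.8, 0.7, 0.9]))  # Command(device='pc', action='click')
print(to_command([0.01, 0.02, 0.01]))    # None
```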
Naqi’s platform emphasizes broad compatibility and extensibility. It includes software tools, APIs, and a configuration hub that allow developers and manufacturers to integrate the control system into custom hardware and software environments across consumer, enterprise, and industrial use cases.
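Those APIs are likewise not public, so the stub below only suggests, with invented names (NaqiHubStub, subscribe, dispatch), the kind of subscribe-and-dispatch surface such a configuration hub might expose to a device maker:

```python
# Invented integration surface: a hub routes decoded commands to handlers
# that developers register for their own hardware. None of these names come
# from Naqi's actual API; they illustrate the integration pattern only.
from typing import Callable, Dict, List

class NaqiHubStub:
    """Toy stand-in for a configuration hub routing decoded commands."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[str], None]]] = {}

    def subscribe(self, device: str, handler: Callable[[str], None]) -> None:
        """Register a handler to receive actions decoded for one device."""
        self._handlers.setdefault(device, []).append(handler)

    def dispatch(self, device: str, action: str) -> None:
        """Forward a decoded action to every handler for that device."""
        for handler in self._handlers.get(device, []):
            handler(action)

hub = NaqiHubStub()
hub.subscribe("wheelchair", lambda action: print(f"wheelchair -> {action}"))
hub.dispatch("wheelchair", "turn_left")  # prints: wheelchair -> turn_left
```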
A major focus of the technology is accessibility, providing new ways for people with mobility or speech limitations to interact independently with technology. The system has received significant industry recognition, including CES Best of Innovation awards and inclusion in TIME’s Best Inventions list, highlighting its impact and potential.
Learn more at https://www.naqilogix.com/
Transcript of Interview:
Allison: You know, I love accessible tech, and I had to stop by the Naqi Logix booth to talk to Sandeep Arya about something really interesting here. It looks like a headphone, but it’s going to be a lot more than that.
Sandeep: You are absolutely right. This is definitely more than a headphone. This is a neural earbud, which converts a normal earbud into something truly magical, because it lets you control any device around you, whether it’s a computer, a mobile phone, a robotic arm, or a wheelchair. And you do that without touching them, without speaking to them, without looking at any camera and, where applicable, without looking at a screen.
Allison: So it’s all through your ear canal that it’s figuring this out?
Sandeep: It’s not the ear canal; it’s electromyography, EMG signals. It’s actually picking up your subtle head movements and the micro-gestures from your face and converting them into commands.
Allison: Okay, so say I’ve got some disability where I’m missing a lot of motor function, and maybe I can’t speak. You’re saying that with this, I’m not even having to turn my head and I’d be able to control a computer. How does that happen?
Sandeep: Yeah, so the inspiration came from disability itself.
Allison: I’m sorry, it came from what?
Sandeep: From a person who is disabled. A friend of the inventor is quadriplegic, and he was considering a brain implant. He told our inventor, Dave Segal, “You are my last stop before I consider a brain implant.” That’s how this all started. That person can’t move anything below his neck, so we pick up the micro-gestures and subtle head movements and convert those into commands. So even if you cannot move your arms and you are in a wheelchair, you can still control your wheelchair. You can still control a computer, or pick up a phone call, which you couldn’t before. And for controlling a television, you don’t need a remote control; you can control your entire television just with the earbud.
Allison: Okay, so we have an earbud here. Can you hold that in your hand?
Sandeep: Yeah. Yep.
Allison: So this is the earbud. It looks like a normal earbud, but with a black box on the outside.
Sandeep: No, the black box is nothing; that’s just because this is a prototype and we have not commercialized it yet. The key, the magic, is the sensors. The sensors, the silver parts, are what pick up the signals.
Allison: Okay, I’m going to describe this. There’s the rubber ear tip that goes into your ear like a normal earbud, but then there are flat sensors on either side and in the sort of winglet that a lot of headphones have to hold them in place.
Sandeep: Yeah, yeah. Those sensors are the ones picking up any kind of micro-gesture that we make. Our AI engine then takes that into consideration, determines that it is intentional, and converts it into commands.
Allison: Wow. So this makes me think: what if I started laughing? Would it all of a sudden start rolling my wheelchair, or make my PC do something?
Sandeep: Great question, and that is exactly why the AI we have built in is so robust. We have also built something called the invisible user interface, which is like a Rubik’s Cube and also lets you switch between devices. Right now our AI is robust enough that we are at 95-percent-plus accuracy, so we are not worried about unintentional false inputs. We are able to take care of that.
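For readers curious what this kind of intent gating can look like in software: Naqi’s AI is proprietary, so the sketch below is an assumption-laden illustration, combining a classifier confidence floor (echoing the 95-percent figure Sandeep cites) with a short hold-time debounce so that a laugh or a stray twitch does not fire a command. The class name, threshold, and labels are all invented.

```python
# Hypothetical intent gate, not Naqi's algorithm: a gesture label only
# fires once it has been held confidently for a minimum duration.
import time
from typing import Optional

CONFIDENCE_FLOOR = 0.95  # echoes the "95% plus accuracy" from the interview
HOLD_SECONDS = 0.3       # gesture must persist briefly to count as intentional

class IntentGate:
    def __init__(self) -> None:
        self._label: Optional[str] = None  # gesture currently being held
        self._since: float = 0.0           # when the hold started

    def update(self, label: str, confidence: float,
               now: Optional[float] = None) -> Optional[str]:
        """Return the label only once it is confidently held long enough."""
        now = time.monotonic() if now is None else now
        if confidence < CONFIDENCE_FLOOR or label == "idle":
            self._label = None  # low confidence or rest: reset the gate
            return None
        if label != self._label:
            self._label, self._since = label, now  # new gesture: start timing
            return None
        return label if now - self._since >= HOLD_SECONDS else None

gate = IntentGate()
print(gate.update("jaw_clench", 0.97, now=0.0))    # None (just started)
print(gate.update("jaw_clench", 0.98, now=0.4))    # "jaw_clench" (held)
print(gate.update("laugh_twitch", 0.40, now=0.5))  # None (low confidence)
```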
Allison: Wow. Now in the background, and I’m not sure if he’s been on camera while we’ve been recording, there’s been a gentleman with his hands folded across his chest to show, basically, “nothing up my sleeve.” And he’s been controlling a PC in the background, right?
Sandeep: That is correct. When people see these kinds of demos, they think something is going on behind the scenes. So Dave Segal has purposefully folded his arms and is holding them there so that people know he’s not using anything; it’s nothing but the earbud.
Allison: So, just like you said, he’s very subtly tilting his head. He is looking around.
Sandeep: Yes. Yep.
Allison: But he’s not talking. He’s not gesturing in any way.
Sandeep: Yes, that is absolutely correct.
Allison: So how close is this to a real product?
Sandeep: Well, as you can see, the innovation started quite some time ago, and we have gone through a journey: TIME magazine, CES a couple of times, an Edison Award, and now the Best of Innovation award here this year. We have reached a point where this is definitely ready for commercialization, and we are already working with partners; that is the whole idea of Best of Innovation. For consumers, it is supposed to go out sometime this year.
Allison: It could be this year.
Sandeep: Yes.
Allison: Really? Okay. Well, if people are interested in this product, where would they go to find out more?
Sandeep: Well, they can come to our website to learn more, and they can also contact us at any time.
Allison: And the website is where?
Sandeep: Yeah, right here: NaqiLogix.com.
Allison: N-A-Q-I-L-O-G-I-X dot com.
Sandeep: That is correct.
Allison: Very good. Thank you very much for your time. I hope this works; it looks really fantastic. It looks like it does.
Sandeep: Oh, it definitely does, Allison, and hopefully we’ll have you try it out sometime. Thank you so much.
Allison: That would be great. Maybe we’ll see you next year and you’ll have it for me to try.
Sandeep: Oh, of course. You are welcome to try it this year as well; we just don’t have the demo right here.
Allison: Very good. Very good. Thank you very much for your time.
Sandeep: Likewise. Thank you so much. Thank you.
