Mycroft serving blind and partially-sighted people

This is my first post, and I hope I’ve read enough here to avoid being charged with failing to do my homework.

I’m aiming to build an ‘appliance’ for a partially-sighted friend that will enable her to listen to the UK ‘Freeview’ (DVB-T) broadcast TV and radio channels. It needs to be voice-operated, with enough text-to-speech feedback to select and play the required channel.

I started with a borrowed Pi 3B but couldn’t get Picroft past its insistence on setting up Wi-Fi that was already working, so I restarted with Raspbian and added Mycroft on top. I’m using a Plantronics headset to interact with Mycroft, a USB tuner, and amplified PC speakers for the DVB-T (broadcast) sound. By using the headset I hope to stop the command mic picking up too much of the broadcast being played through the speakers. Further down the line, a better solution than the headset might be a physical button that, when pressed, mutes the speakers and activates a table mic, perhaps implemented with some GPIO work. MPlayer in slave mode is doing what I want, and my aim is to use a named pipe to control MPlayer from Mycroft, brokered by a suitable skill.
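In case it helps anyone attempting something similar, here is a minimal sketch of the MPlayer side as I understand it: start MPlayer in slave mode reading a named pipe, then write slave-mode commands (`loadfile`, `pause`, etc.) into that pipe. The FIFO path and the `BBC1` channel name are purely illustrative; real channel names come from your channels.conf.

```python
import os
import subprocess

# Hypothetical FIFO path; any writable location will do.
FIFO = "/tmp/mplayer_fifo"

def slave_command(cmd, *args):
    """Format one MPlayer slave-mode command line (e.g. loadfile, pause, quit)."""
    return " ".join([cmd, *map(str, args)]) + "\n"

def send(cmd, *args, fifo=FIFO):
    """Write a slave-mode command to the FIFO that MPlayer is reading."""
    with open(fifo, "w") as f:
        f.write(slave_command(cmd, *args))

def start_player(fifo=FIFO):
    """Create the FIFO if needed and start MPlayer listening on it."""
    if not os.path.exists(fifo):
        os.mkfifo(fifo)
    # -slave reads commands from the FIFO instead of the keyboard;
    # -idle keeps MPlayer alive between channel changes.
    return subprocess.Popen(
        ["mplayer", "-slave", "-idle", "-input", "file=" + fifo]
    )

# Usage (assumes a working channels.conf; "BBC1" is an illustrative name):
# start_player()
# send("loadfile", "dvb://BBC1")
# send("pause")
```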

To recap: I have a working Mycroft front end and a DVB-T back end, and need to insert the skill layer between them, preferably by adapting something nearly ready-made that outputs shell commands. I’ve read about several such skills. My Python skills are zero, but I was a programmer in a past life, so I’m not entirely clueless. A couple of questions for the community, then:

  1. Which Mycroft skill or skills do you suggest I should ‘borrow’ and adapt to the above purpose?

  2. Has anyone else been working on applications for the blind/partially sighted using Mycroft? Seems an obvious use but I can’t find anything, here or elsewhere. As an aside, the Royal National Institute for the Blind (RNIB) website showcases a media player for the blind - special buttons, Braille remote control and other whizzy stuff - but it costs about a thousand US dollars. I’m hoping to build something that does a slightly different job and a lot more besides (via the other Mycroft skills) for rather less.

I’d appreciate the benefit of your experience and wisdom. Thanks in anticipation.


Welcome @fifthager, great to have you here, and excellent first post - you have certainly done your homework. Your project objectives are commendable. Assistive technology is so prohibitively priced, when it’s often not particularly expensive to produce - a captive market, I suspect.

The Pi 3B+ unfortunately won’t yet work with our Raspbian Jessie Lite-based Picroft image, although there have been some substantial community efforts to that end. It’s something we’re working on - covered in this recent blog post.

In terms of the Skill layer to send raw bash commands, I think this Skill from my colleague @forslund would be a good starting point. Unfortunately I’m not aware of anyone else in our developer community working in this space, and I’m likely to be the person who would know :frowning:

What I am aware of are other technologies in the assistive space; it might also be worth checking out OpenBCI, who make open-source brain-computer interfaces - that might be another option as well.

Please don’t hesitate to let me know how I can assist further.

With kind regards,


Many thanks for your helpful and encouraging reply, Kathy. I will follow your advice and start with a good look at “cmd_skill”. I accept that there’s a world of difference between a hobbyist’s hit-and-miss lash-up and a robust, polished commercial product, but many visually impaired people are supported by IT-savvy folk who can follow a reasonably clear and reliable ‘tech recipe’ - parts list, software list, instructions and necessary extras - and who can also provide ongoing maintenance. I’m aiming to produce such a recipe, and I’ll let you know if I succeed, or at least learn something worth sharing.
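For anyone following along, this is roughly the kind of adaptation I have in mind: a plain-Python lookup from a spoken channel name to an MPlayer slave-mode command, with the Mycroft wiring sketched in comments. The intent file name, dialog name, FIFO path and channel table are all illustrative guesses, not tested code.

```python
# Hypothetical channel table: spoken name -> MPlayer dvb:// URL.
# The names after dvb:// must match entries in ~/.mplayer/channels.conf.
CHANNELS = {
    "bbc one": "dvb://BBC1",
    "radio 4": "dvb://BBCR4",
}

def channel_command(spoken_name):
    """Map a spoken channel name to an MPlayer slave-mode command, or None."""
    url = CHANNELS.get(spoken_name.lower().strip())
    return "loadfile {}\n".format(url) if url else None

# Inside an adapted Skill, the handler might look something like this
# (following the usual MycroftSkill pattern; names are illustrative):
#
# from mycroft import MycroftSkill, intent_handler
#
# class FreeviewSkill(MycroftSkill):
#     @intent_handler('watch.channel.intent')
#     def handle_watch(self, message):
#         cmd = channel_command(message.data.get('channel', ''))
#         if cmd:
#             with open('/tmp/mplayer_fifo', 'w') as fifo:
#                 fifo.write(cmd)
#             self.speak_dialog('switching')   # spoken feedback
#         else:
#             self.speak("Sorry, I don't know that channel")
```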