Dave's brain dump

Hi Guys,

I have been doing a lot of thinking, since discovering Mycroft earlier this year, about how one might apply it. Sure, I have it running at home and I have written a few skills. My daughter has been looking at a school project based on developing ideas for products that would help someone who is elderly or disabled. The root of this idea is traceable back to my Mother, who fitted both of these categories.

To be honest, when I first discovered Mycroft I couldn’t see how it could have helped my mother at that stage of her life. Earlier on, then perhaps yes. At the end of last month she decided it was time to go on the next big adventure and have a closer look at the stars. A relief for her and for us, to be honest. What this has done is leave my Father alone, and while he says he is alright, I am not convinced that long periods of solitude are healthy. Sadly he rejects any idea of leaving the house to find company or entertainment.

So here’s my thinking. Could Mycroft become a companion? Not only in an autonomous way, but also able to link to other Mycroft systems to allow either direct or indirect communication, with myself, for example. I have already considered doing some skill research to allow Mycroft to lead a conversation: it asks you something and responds to your answer. It could announce something based on your previous activities, such as a television programme that is due to start. Obviously this opens up a multitude of opportunities to develop the system.

My first challenge is to get a computer-based device into his house, including connecting him to the internet, so it is a steep hill that I climb. It seems clear to me that the Mark 1 or 2 would be ideal. I would set up a VPN from my house to his so that I could remotely administer the device. Easily done. Convincing him to actually have “an infernal machine”, as he often refers to anything resembling a computer - that’s another problem.

Anyway, those are my thoughts for the moment. The hot weather has departed the UK now and this weekend we are going to get a lot of rain, so I will hopefully get some time to architect some skill software and have a play about with some ideas.



Hi @Darmain,

Interesting thoughts.

For inter-Mycroft communication you could use the IRC Skill. I use it to communicate between Mycroft and an Android phone (using IRC Radio to read out the messages). For the IRC server I run miniircd on the Mycroft (picroft) but there are many options.
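As a rough sketch of the plumbing involved - assuming miniircd’s default port of 6667, and with the host, nick and channel names invented for illustration - delivering a one-off message over IRC needs only Python’s standard library:

```python
import socket

def irc_lines(nick, channel, message):
    """Build the raw IRC commands needed to register a nick,
    join a channel and deliver a single message."""
    return [
        f"NICK {nick}\r\n",
        f"USER {nick} 0 * :{nick}\r\n",
        f"JOIN {channel}\r\n",
        f"PRIVMSG {channel} :{message}\r\n",
    ]

def send_notice(host, nick, channel, message, port=6667):
    """Connect to an IRC server (e.g. miniircd) and send one message."""
    with socket.create_connection((host, port)) as sock:
        for line in irc_lines(nick, channel, message):
            sock.sendall(line.encode("utf-8"))

# e.g. send_notice("192.168.1.50", "mycroft-dad", "#family", "Kettle is on")
```

A real skill would keep the connection open and handle the server’s PING/PONG keepalives, but this shows how little protocol is needed for simple one-way notifications.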

For conversing directly with Mycroft, I sort of do that with the RSS Reader Skill. I’ve modified it so that it can read the full article content and periodically asks whether it should continue reading or do something else (repeat the last bit, move on to the next item, or stop). The conversation’s a bit one-sided, but it definitely feels engaging. I’ve been listening to some of the Reuters News RSS feeds, but it could be done with any RSS feed (the code may need tweaking to get the full content).
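For anyone curious, the two pieces - pulling items out of a feed and reading them back a few sentences at a time so the skill can pause and ask whether to continue - can be sketched with just the standard library. The helper names here are mine, not the actual skill’s code:

```python
import xml.etree.ElementTree as ET

def rss_items(rss_xml):
    """Extract (title, description) pairs from an RSS 2.0 document."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title", ""), item.findtext("description", ""))
            for item in root.iter("item")]

def read_in_chunks(text, sentences_per_chunk=3):
    """Yield the article a few sentences at a time, so the skill can
    pause between chunks and ask whether to continue, repeat or stop."""
    sentences = [s.strip() for s in text.split(". ") if s.strip()]
    for i in range(0, len(sentences), sentences_per_chunk):
        yield ". ".join(sentences[i:i + sentences_per_chunk])
```

The sentence splitting is deliberately naive; a real skill would want something more robust before feeding the chunks to the TTS engine.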

I’ve been contemplating setting this up for my Grandmother, as she spends most of her time alone as well. Unfortunately she is similarly averse to technology.

I miss the hot weather.




Hi @Darmain, we’re very sorry to hear of the passing of your Mother - our condolences. Losing a parent is never easy, even when that passing is a release from suffering.

Your Father’s situation is becoming more common as our populations age, coupled with the scarcity of residential places available at aged care facilities - and so your suggestions are very timely.

Currently, it’s not possible for Mycroft Devices to do remote communications over a protocol such as SIP or XMPP - although this is something that others have explored. There are libraries available that might work for inter-device communication, such as suggested in this Skill Suggestion.

The Companion / interactivity piece is a little harder to achieve. At the moment, Mycroft’s Skills - as you’ve probably identified from your own exploration - are based on Intents, and on taking action in response to Intents. Dialog trees in Mycroft are somewhat limited, and are generally closed-ended. In speech recognition, open-ended dialog - such as

“What are your thoughts on Immanuel Kant?”


“Who do you sympathize more with in Moby Dick - Ishmael or the whale?”

are much harder to achieve - as the artificial intelligence has to be trained both on the corpus of knowledge surrounding the questions - i.e. “literature by Herman Melville” or “German philosophers of the 18th Century” - and to recognise common dialog trees around that corpus -

“I disagree with Kant’s view that there is a moral categorical imperative; ethics is much more nuanced than that”.

Of course, it is possible for Mycroft to have very basic interactions - such as

“How are you feeling today?”

but they in no way approach the level of rapport or trust or emotional depth that I would ascribe to “companionship”. Some relationships perhaps, but not true “companionship” :wink:
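For illustration, a closed-ended dialog of that kind can be modelled as a small lookup table, where every user answer must match one of a fixed set of branches. This is just a toy sketch, not Mycroft’s actual Skill API:

```python
# A closed-ended dialog tree: each state has a prompt and a fixed set
# of recognised answers, which is what makes it "closed" rather than
# open-ended conversation.
DIALOG_TREE = {
    "start": ("How are you feeling today?",
              {"good": "glad", "bad": "sorry"}),
    "glad": ("Glad to hear it. Shall I read the news?", {}),
    "sorry": ("Sorry to hear that. Would some music help?", {}),
}

def step(state, answer=None):
    """Return (prompt, next_state); unrecognised answers re-prompt."""
    prompt, branches = DIALOG_TREE[state]
    if answer is None:
        return prompt, state
    nxt = branches.get(answer.lower(), state)
    return DIALOG_TREE[nxt][0], nxt
```

Anything outside the enumerated branches (“I feel Kantian today”) simply falls back to the same prompt, which is exactly the limitation described above.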

Cynicism aside, I think that more broadly where this is going is toward the evolution of the personal assistant into the personal companion. In education, we rate the difficulty of a learning exercise using something called Bloom’s Taxonomy. Personal assistants exist at the lowest level of this classification - able to handle very primitive tasks.

A true ‘Personal companion’ would be able to synthesize, interpret, posit, assert and debate. And be a true ‘companion’.

However I think we’re definitely a few years - maybe 2-5 years - away from that at an industry level.


@KathyReid Wow, thank you so much for your response. That was truly a pleasure to read. It is clear that your intelligence and your grasp of the subject you work on are both advanced, making you a huge asset to the development of Mycroft, as well as to holding together the team, be it local or interested parties like myself. It is good to talk to you!

With regard to the “Mycroft companion”: I agree that to morph Mycroft into a condition that could reflect that of a true companion is no mean feat. However, although I am a chartered engineer, I certainly couldn’t have described why this is “no mean feat” as well as you did. I am sure that many will relate to the Tony Stark / Jarvis example, where it is clear there is a defined relationship between two sentient beings, albeit one carbon based and the other supposedly silicon based. To develop an algorithm that could arrive at sending “It’s always a pleasure to watch you work” to the speech synthesiser, without it being predefined in some vocabulary bank or triggered by some skill - well, having skills self-develop is one thing, but this boils down to an intellectual response that could clearly beat the Turing Test. We are a long way from that, and I doubt the eventual solution would be able to run on a Raspberry Pi either.

So what can be achieved? I think that is what we all need to pitch in and experiment with. I am certainly game.

Looking at some of your other comments: you said it was not possible for Mycroft devices to achieve remote communications. I saw a video on YouTube of a mother asking her Mycroft to connect to the Mycroft unit in her daughter’s room. I understood that feature was already available? With regard to my father’s situation, my plan was to first get him on the internet (probably the most difficult step, to be honest!), then set up a VPN between our houses. That way he appears on a local subnet of my LAN. The twofold advantage is that his Mycroft and mine would be on the same network, and I can remotely administer his unit using SSH.

Finally, thank you for your kind words about my Mum. It has been a long time coming. We almost lost her in 2014 but she pulled through and we were granted four more years, for which we are grateful. Her degenerative conditions are being researched using the platform “Folding@home”. My two big servers are fully committed to distributed folding work unit tasks. I started this several years ago on smaller machines, then stepped up the efforts after Mum’s diagnosis, and now I will continue in Mum’s memory. One side effect is that it is never cold in my Man Cave.

So, more brain bashing has been going on here. Bouncing ideas off a family member.

Hardware - Dad won’t like a computer. He’s not getting a computer; he’s getting a Mycroft Mark ‘n’. So I’m looking into that. There is one eBay member who is selling Mark 1s in the UK right now. I could get one of those. Alternatively I could wait for the Mark 2 to start shipping in December. Or I could buy an RPi3, a mic, a speaker and a box to put it in, which would probably end up costing as much as a Mark 1. Is a Mark 1 a good plan right now? It’s not cheap.

Another topic was what Dad can use Mycroft for. Ah, right, here’s an idea. I do his shopping for him. He phones on a Friday and reads out his handwritten list. I then feed this into Google Keep on the PC and access it on my phone once I’m pushing the trolley about. There are over 100 items already on the list, and it’s rare he wants anything that isn’t already on it. All I have to do is find the item and untick it. So, how about a skill that allows him to add and remove items from this list, plus read the list back? I’m liking this. The first thing I need is an API into Google Keep, which they don’t have. It seems there are unofficial ones about, but that seems to involve web page scraping. Okay, not beaten yet.
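While Google Keep has no official API, a first sketch of the skill’s backend could simply keep the list in a local JSON file that the skill reads and writes. All the names here are invented for illustration:

```python
import json
from pathlib import Path

class ShoppingList:
    """A local, file-backed shopping list that a skill could expose
    through 'add X', 'remove X' and 'read my list' intents."""

    def __init__(self, path="shopping_list.json"):
        self.path = Path(path)
        self.items = (json.loads(self.path.read_text())
                      if self.path.exists() else [])

    def add(self, item):
        if item not in self.items:
            self.items.append(item)
            self._save()

    def remove(self, item):
        if item in self.items:
            self.items.remove(item)
            self._save()

    def read_back(self):
        """Text for the TTS engine to speak."""
        if not self.items:
            return "The list is empty."
        return "The list has: " + ", ".join(self.items)

    def _save(self):
        self.path.write_text(json.dumps(self.items))
```

Because the state lives in a plain file, the same list could later be synced to a phone (or to a proper API like SimpleNote’s) without changing the skill’s intent handlers.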

Hi Dave, you’re very welcome - we aim to foster this sort of thinking because we see it as vital to a rich, nuanced and inclusive debate about where machine learning and artificial intelligence are going.

It’s definitely possible for Mycroft devices to achieve remote communications, however often the devices are behind internal routers and firewalls, so direct internet access is usually restricted, particularly in corporate or college environments. I don’t have a lot of experience running Mycroft devices behind a VPN - but essentially it would work just like another router - if you can SSH to the unit then it’s possible that you could control the Message Bus on that Device - but we don’t have any Skills available at the moment which implement this.

Ah, yes - I’m part of the World Community Grid, so am familiar with Folding@Home. Much respect for that and similar projects - harnessing minute amounts of computing power across the globe for major societal benefit. It’s one of the major benefits of advanced and high performance computing - we’re able to solve problems on a scale never imagined - a little like machine learning! Respect.

Best, Kathy


@KathyReid I already have a VPN running between my home and that of my in-laws. Each house is on a different subnet; however, if I SSH to a system on the other side of the VPN using its own IP address, the routers do their magic and make the connection. The machine might as well be in the same room as me. I wouldn’t want to go peer-to-peer over the open internet - that would be asking for trouble.

A bit of progress to report on three fronts.

  1. I have been exploring the app “SimpleNote” to use in place of Google Keep. SimpleNote has an API. I normally start with isolated Python scripts to perfect the dataflow. Right now I am having trouble with me Tuples. I think I can get some cream for that. Seriously, I will suss it out.

  2. I may have made progress with Dad. Unbelievably, he isn’t rejecting a Mycroft system outright. Yarp, I am surprised, but striking while the iron is hot - let’s see what progress can be made here.

  3. Mindful of needing a neat little personal assistant solution, as a computer at Dad’s just isn’t happening, I had been looking about. There were two Mark 1s on the UK eBay site, claiming to be new and from the same seller. Converting to US dollars, he wanted $157 a unit. I had a bit of a haggle, based on the Mark 2s coming out in a few months, and he agreed to let me have them at $130 each. I hope I haven’t bought a pup!

Just on the secondhand Mark 1s - I’d strongly recommend putting a new Micro SD card in them when you get them, and burning a new Mark 1 image on to them. The Micro SD card is the component with the highest rate of failure overall.

I understand them to be “First hand”. For what I paid I certainly hope they are. Will be with me in a day or so. Then we will know.

My Mark 1s are in the house. They are brand new (well, never unboxed before, anyway). Very nice presentation! Of course, a lid came off fairly quickly. Is that an RPi3 I see? Most impressive electronics and casework. Very neat indeed.

Power up proved a bit of a problem as the power packs are for US sockets. Happily I had a suitable UK version that got the first unit awake.

Anyway, I couldn’t get the WiFi working as it couldn’t see my access point. For now it’s running on a cabled LAN. I registered it, upon which it went mute…

The OS was in a bit of a pickle. Python 3.4 was broken with unmet dependencies, so a bit of surgery with apt-get sorted that. Then came 115 updates to Jessie. Mycroft was at version 0.9.0. After a fair amount of time it brought itself up to version 18.08.

I have a screen and keyboard connected, obviously, and discovered the mycroft-cli-client. I feel it necessary to report that this is presenting a glitching screen, as I saw on the PC version and reported in another post.

However, he is alive!!!

I then managed to install one of my own skills. That got working pretty quickly so a lot of success being had here tonight. Yay!


Fantastic progress! Yes, 0.9.0 is very old for the operating system, hence my previous advice to burn a new Micro SD card with a fresh Mark 1 image, as this helps with some of the operating system updates.

The mycroft-cli-client display glitch is one that we thought we’d ironed out, so any further advice on that, such as a screenshot, would be helpful.

If needed, we also have documentation available on manually configuring WiFi. These instructions are for Picroft, but they apply equally to the Mark 1.

It’s also worth noting which channels your SSID is broadcasting on; Mark 1 cannot connect to Channels 12 or 13.

Last night I instructed a Mycroft to wake me at 07:30. It didn’t work. Fortunately I had the sense to rely on my normal clock. I found Mycroft asleep and it wouldn’t wake up. Furthermore, its eyes have remained orange since it was upgraded. So, tonight the latest image is coming down to the PC and the top is off the Mycroft to get the SD card out.

Good progress. Both units have been reflashed. The first one is fully updated; the second is updating at the time of writing. It is getting late now, near 23:45. It is quite amusing that both units have assumed the sleeping presentation. I have an update to go on to the American Male post, which is most bizarre.


Hi Dave,

I love the idea. I think the best possible improvement is in the intent parsing. Although true reflection is not going to be feasible anytime soon, we can make the “forward path” - as in “what do you really mean” - more conversational and intuitive.

The basic idea is to do away with a static compiled decision tree. If there are skills that have similar keywords/phrases a built-in skill asks for clarification: “Would you like to know the inside temperature or the outside temperature?” You could also build in some statistical modeling for certain paths to become more likely and skip straight to the answer (e.g. what local means for weather conditions). This means the intent parser will be “stateful”; it will have some notion of where it is in an ongoing conversation towards an answer supplied by a skill.
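A minimal sketch of that idea - a stateful parser that asks a clarifying question when several skills claim the same keyword, rather than picking one by a static decision tree. This is toy code, not the real intent parser:

```python
class ClarifyingParser:
    """Stateful intent parsing sketch: when several skills match the
    same keywords, ask a clarifying question instead of guessing."""

    def __init__(self, registrations):
        # registrations: {skill_name: set of keywords it handles}
        self.registrations = registrations
        self.pending = None  # candidate skills awaiting clarification

    def parse(self, utterance):
        words = set(utterance.lower().split())
        if self.pending:
            # We asked a question last turn; try to match the answer.
            chosen = [s for s in self.pending if s.lower() in words]
            self.pending = None
            if chosen:
                return ("intent", chosen[0])
        matches = [s for s, kws in self.registrations.items()
                   if words & kws]
        if len(matches) == 1:
            return ("intent", matches[0])
        if len(matches) > 1:
            self.pending = matches
            return ("ask", "Did you mean " + " or ".join(matches) + "?")
        return ("unknown", None)
```

The `pending` attribute is what makes it stateful: the parser remembers it is mid-conversation, so the next utterance is interpreted as an answer rather than a fresh command. Statistical weighting of the branches could be layered on top.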

Some caveats: I haven’t looked at the intent parser in great detail yet. This idea came about while programming a skill which got me thinking about how to make skill development and integration easier. Really, the skill developer should not have to think too hard about possible conflicts in the intent parser.

I have been working on a PC with Ubuntu installed, which works quite well - I have the Mark II on order. One solution might be to get a tablet or suchlike that runs Linux, install Mycroft and Skype, create a Skype skill (https://askubuntu.com/questions/5284/how-to-call-a-number-from-command-line-with-skype#5443), and you’d have something where “Call Dave” could bring you up on the screen.

I’d be happy to brainstorm some more.



A very interesting point there @Sn0wbl1nd. I have three skills in use - not necessarily polished, but usable. One allows me to hear reports from my weather station. Here a problem was apparent: the word “Weather” is “owned” by Mycroft’s own weather skill, which is also useful. So I have used the word “Station” as my skill trigger word. I tried to use “Weather station” but this was always intercepted. :slightly_smiling_face:

Yes, exactly. Now imagine you could have Mycroft say: “Do you want the weather from the station or from the built-in skill?” The differentiating labels could change, and could even take on the role of ad-hoc sub-tree intent parsing. If you have enough conflicting skills you are basically having little conversations. The differentiating labels could come from the intent phrases reported with the skill, or from a separate list.

For the moment I am not doing any work on this, but I’d be happy to contribute. My python skills are improving. :wink:
