We need your help again! We’re looking for feedback from anyone who has used Mycroft in the past, or wants to use it in the future. If you’re reading this and that doesn’t include you – then I have no idea why you’re here… #awks…
Over the last few months we have been polishing all the base Skills – both their voice and graphical interfaces. The focus during this time has been on each Skill working as expected in isolation. However, we also know that real-world usage isn't that straightforward. Conversations with other humans often jump between different contexts, and interacting with a voice assistant is no different.
When you speak to a voice assistant, you might start by playing some music. Set a timer for the pot on the stove. Find out how many teaspoons are in a tablespoon. Then want everything to shut up while you focus on the key moment of your delicious creation. Each of these is likely handled by a different Skill – your music service, the Timer Skill, Wolfram Alpha, and the Volume Skill. If each of these Skills is designed to operate only on its own, these interactions quickly become disjointed, come into conflict with each other, and ultimately can become confusing for the human trying to focus on their food.
For a seamless interaction experience that does what you want, and otherwise gets out of the way, we need to think through all the ways people might interact with Mycroft as a whole. Importantly, this includes how Mycroft will behave when we diverge from what's called the “happy path” in software development. When things go wrong, how do we recover gracefully? What should Mycroft do if the user doesn’t hear what was said? If the user makes a mistake or changes their mind – how do we help them get back on track?
As you might imagine, when you start digging into “how should anything interact with anything” – it quickly spawns discussions, disagreements, new scenarios, and an endless list of new questions. To help us work through these questions, and to ensure the interaction of Skills reflects the needs and wants of all Mycroft users – we need your help.
To unravel this enormous puzzle and all its possible pathways, we want to:
- Understand how you would expect Mycroft to behave in different circumstances
- Consider ways we might be able to group or generalize those expectations
- Design a technical implementation to meet these rules
- Implement the design and gather more feedback
When you put it in four dot points, it sounds so simple – and we want to make it just as simple to have your say. To begin with, we want to understand how you want Mycroft to behave. To explore this, we’re going to be releasing a series of videos showing some example interactions with a Mark II unit.
For each of these videos, give it a thumbs up if it’s spot on, and a thumbs down if it misses the mark. Each video will also have its own questions that we’d love your opinion on in the comments. And if you have something to say that doesn’t fit one of the questions, we still want to hear it.
For the first video we’d like feedback on, some questions to consider:
- Does the video reflect the way you would want your Mycroft unit to behave?
- If not, how would you expect it to be different?
- Are there other situations where you would expect the same or similar behaviour?
- Are there situations where you wouldn’t want this behaviour?