Page display and reader (Recipe skill)

Skill Name: page-display-and-reader-skill

Most applicable to the Mycroft Mark II or any other implementation with a screen, but a screen is not strictly required.

User story:

_As a person who cooks, I want this Skill to display recipes on-screen and read the text aloud so that I can cook without putting my hands on a phone or tablet._

Additional user stories: any activity where it may be difficult to use your hands to scroll through directions online, e.g. knitting or crocheting while needing to see the pattern and instructions, or building a plastic model (handling glue, etc.) while needing to see the directions.

Could be useful to anyone who is visually impaired.

What third party services, data sets or platforms will the Skill interact with?

Data could come from a web page, a plain text file, or a PDF file.
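A rough sketch of how the Skill might pull plain text out of those three sources. The libraries used here (requests, beautifulsoup4, pypdf) are only one possible choice and are not part of the proposal:

```python
# Sketch only: one way to turn a URL, text file, or PDF into plain text
# for display and text-to-speech. Library choices are assumptions.
import requests
from bs4 import BeautifulSoup
from pypdf import PdfReader


def extract_text(source: str) -> str:
    """Return the readable text of a URL, .txt file, or .pdf file."""
    if source.startswith(("http://", "https://")):
        html = requests.get(source, timeout=10).text
        # Strip tags and collapse whitespace so TTS gets clean sentences.
        return " ".join(BeautifulSoup(html, "html.parser").get_text().split())
    if source.lower().endswith(".pdf"):
        reader = PdfReader(source)
        return "\n".join(page.extract_text() or "" for page in reader.pages)
    # Fall back to treating the source as a plain text file.
    with open(source, encoding="utf-8") as f:
        return f.read()
```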

Are there similar Mycroft Skills already?

See the MycroftAI/mycroft-skills repository on GitHub for a list. If so, how could they be combined?

What will the user Speak to trigger the Skill?

Show document
Show document {name}
Page up
Page down
Next page
Previous page
Back
Forward
First page
Last page
Read page
List documents in my library
Load document {name}
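A minimal sketch of how a few of these phrases could be wired up as intent handlers in a Skill. The `.intent` file names and the `load_document` helper are hypothetical; only the general MycroftSkill and GUI calls come from the standard skill API, and this is illustrative rather than a working implementation:

```python
from mycroft import MycroftSkill, intent_file_handler


class PageReaderSkill(MycroftSkill):
    def __init__(self):
        super().__init__()
        self.pages = []      # text of the loaded document, split into pages
        self.current = 0     # index of the page currently shown

    @intent_file_handler("show.document.intent")   # "show document {name}"
    def handle_show_document(self, message):
        name = message.data.get("name")
        self.pages = self.load_document(name)      # hypothetical helper
        self.current = 0
        self.show_current_page()

    @intent_file_handler("next.page.intent")       # "next page" / "page down"
    def handle_next_page(self, message):
        self.current = min(self.current + 1, len(self.pages) - 1)
        self.show_current_page()

    @intent_file_handler("read.page.intent")       # "read page"
    def handle_read_page(self, message):
        if self.pages:
            self.speak(self.pages[self.current])

    def show_current_page(self):
        if self.pages:
            # On a device with a display this renders the page text.
            self.gui.show_text(self.pages[self.current])

    def load_document(self, name):
        # Placeholder: look the name up in the Skill's "library" (see the
        # settings sketch further down) and split it into screen-sized pages.
        return []


def create_skill():
    return PageReaderSkill()
```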

What phrases will Mycroft Speak?

Mycroft would read the text content of the page.
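One way to keep long pages manageable is to speak the page sentence by sentence rather than as one long utterance, which also gives natural break points for a stop command. A small sketch, with a deliberately naive sentence splitter:

```python
import re


def read_page(skill, page_text: str):
    # Split on sentence-ending punctuation followed by whitespace.
    for sentence in re.split(r"(?<=[.!?])\s+", page_text):
        if sentence:
            # wait=True blocks until TTS has finished this sentence.
            skill.speak(sentence, wait=True)
```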

What Skill Settings will this Skill need to store?

See https://mycroft.ai/documentation/skills/skill-settings/ for more information

Other comments?

_Loading or specifying documents as Skill configuration settings seems cumbersome. Perhaps there needs to be a “library” of documents or URLs, saved to Mycroft via the Skill configuration._
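A rough sketch of how such a library could be kept in the Skill's settings and exposed through the "list documents" and "load document" phrases above. The `library` settings key, its name-to-URL structure, and the intent file names are assumptions for illustration only:

```python
from mycroft import MycroftSkill, intent_file_handler


class LibrarySkill(MycroftSkill):
    @intent_file_handler("list.documents.intent")  # "list documents in my library"
    def handle_list_documents(self, message):
        # Assumed layout: {"banana bread": "https://example.com/recipe"}
        library = self.settings.get("library", {})
        if library:
            self.speak("Your library contains: " + ", ".join(library.keys()))
        else:
            self.speak("Your library is empty.")

    @intent_file_handler("load.document.intent")   # "load document {name}"
    def handle_load_document(self, message):
        name = message.data.get("name")
        location = self.settings.get("library", {}).get(name)
        if location:
            self.speak(f"Loading {name}.")
            # Hand the URL or path off to the document loader sketched earlier.
        else:
            self.speak(f"I couldn't find {name} in your library.")
```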

If you're looking for a skill for recipes, there is one available that displays and speaks recipes on the screen: GitHub - AIIX/food-wizard (Get Popular Food & Cooking Recipes On The Go, Mycroft Recipes Skill).