This application demonstrates the methods Saiy exposes to allow developers of other Android applications to use its features.
The project is built using Java 7 and the Android SDK (compile API 26, minimum API 16).
In Android Studio, import it as a new project, either from version control or from the downloadable zip.
There is a direct dependency on the Saiy Library project. Once that project has compiled, you'll need to add the generated aar file as a module to the main project, as described here.
The project is licensed under the GNU Affero General Public License V3. This is a copyleft license. See LICENSE
The license grant is not for Saiy's trademarks, which include the logo designs. Saiy reserves all trademark and copyright rights in and to all Saiy trademarks.
Copyright © 2017 Saiy® Ltd.
I still need to clarify the most appropriate way to apply the GNU Affero General Public License and will revisit this very soon. Any suggestions are welcome.
- Nuance Developers: Text to Speech, Speech to Text, NLU
- Microsoft Cognitive Services: Text to Speech, NLU, Translate API
- IBM Bluemix: Speech to Text, NLU
- Wit: Speech to Text, NLU
- Google Cloud Speech: Speech to Text
- Google Chromium Speech: Speech to Text
- API.AI: Speech to Text, NLU
Here you can choose from the available providers for Text to Speech, Speech to Text and your NLP backend.
To test almost all of these providers, you will need to enter your related credentials in the SaiyConfiguration class. Only store them here for testing purposes. As you would expect, you'll also need to create your own project for any NLP cloud provider you wish to use.
I hope it goes without saying that any keys or secrets sent via Saiy to connect to an external API will be disposed of immediately after the request and never stored.
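As a rough sketch of what storing test credentials might look like (the field names below are hypothetical placeholders, not the actual constants in the project's SaiyConfiguration class):

```java
// Illustrative only: hypothetical test credentials held in a single class.
// Never ship real keys in a release build.
public final class SaiyConfiguration {

    public static final String NUANCE_APP_KEY = "your-nuance-app-key";
    public static final String MICROSOFT_SUBSCRIPTION_KEY = "your-microsoft-key";
    public static final String GOOGLE_CLOUD_API_KEY = "your-google-cloud-key";
    public static final String WIT_SERVER_TOKEN = "your-wit-token";

    private SaiyConfiguration() {
        // static holder only
    }
}
```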
Here you can register a specific command with the main Saiy application, which will be forwarded directly to your app to process. Think of it as registering a voice intent. The example given shows how Spotify would register commands.
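To illustrate the receiving side only, a forwarded command might be handled along these lines (the class and the intent extra key below are hypothetical placeholders, not the library's actual API):

```java
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;

// Minimal sketch of handling a command forwarded from Saiy.
// The extra key is a hypothetical placeholder.
public class CommandReceiverActivity extends Activity {

    private static final String EXTRA_COMMAND_TEXT = "extra_command_text";

    @Override
    protected void onCreate(final Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        final Intent intent = getIntent();
        if (intent != null && intent.hasExtra(EXTRA_COMMAND_TEXT)) {
            handleCommand(intent.getStringExtra(EXTRA_COMMAND_TEXT));
        }

        finish();
    }

    private void handleCommand(final String command) {
        // Route the recognised phrase to the relevant feature of your app,
        // e.g. "play my discover weekly" could start playback.
    }
}
```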
Here you can see how user interaction undertaken via Saiy can be used by your app to create an interactive experience.
Once you get to grips with the above and have created a half-decent NLP backend (which won't take long), you'll be able to handle your users asking FAQs via Saiy, instead of them reading through a mass of information. By giving Saiy your custom context on each request, such as 'FragmentShare', it will be easy for you to process your user asking 'Where is the option for Instagram?' and allow Saiy to inform them that 'it will be available in the next release'. You get the idea...
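As a sketch of that idea (the class and method below are hypothetical; only the 'FragmentShare' context and the example question and answer come from the scenario above):

```java
import java.util.Locale;

// Hypothetical sketch: map a custom context plus the user's question to a
// reply that Saiy can speak back. Not part of the library API.
public final class FaqResolver {

    public static String resolve(final String context, final String utterance) {
        if ("FragmentShare".equals(context)
                && utterance.toLowerCase(Locale.US).contains("instagram")) {
            return "The option for Instagram will be available in the next release";
        }

        return "Sorry, I don't have an answer for that yet";
    }

    private FaqResolver() {
    }
}
```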
Additionally, you could construct a voice tutorial, or ask them which theme they prefer, etc. - all of which will keep them engaged.
Please use the Stack Overflow tag for compilation-related questions and errors.
For code issues and crashes, please open an issue.
For discussion, please use the XDA development thread for now.