Google Assistant

Post your questions and help other users.

Moderator: Martin

Mar
Posts: 49
Joined: 20 May 2015 23:24
Location: Germany

Google Assistant

Post by Mar » 12 Feb 2017 14:20

Hey,

Martin, it would be great if Automagic could interact with the Google Assistant. For example, you say something, get a previously defined answer, and then handle your speech input with Automagic.
IFTTT is already supported by the Assistant: you can handle many different inputs, including variable placeholders. I've attached two screenshots showing this.
I know the Assistant is officially unlocked by Google only for its own phones, but there are ways to activate it on non-Google phones too, which is what I did using Android N-ify. Moreover, Google is currently working on unlocking it for every Android device.
Although IFTTT is already supported, I can't get it to communicate with Automagic. I've seen some ways to interact with Tasker, but I don't want to download a whole set of apps just for this; one app would be fine.
In IFTTT there is an action called Maker, which I thought would be suitable. The description says:

Integrate other services on IFTTT with your DIY projects. You can create Applets that work with any devices or apps that can make or receive a web request (aka webhooks). See how others are using the Maker service and share your project at hackster.io. For information on triggering events, go to your Maker service settings and then the listed URL (web) or tap your username (mobile).

Is it possible to send such a web request to Automagic, Martin?
That would probably be the easiest way, but I haven't been able to connect it to Automagic so far.

Any solutions?
If not, it would be nice if you could add this feature, Martin. It would make life easier for all of us.
Mar
Attachments
Screenshot_20170212-145637.png (91.49 KiB) Viewed 33794 times
2017 Feb 12 14-58-10.png (195 KiB) Viewed 33794 times

Martin
Posts: 4468
Joined: 09 Nov 2012 14:23

Re: Google Assistant

Post by Martin » 12 Feb 2017 14:54

Hi,

Unfortunately integrating Google Assistant in Automagic is quite complex and likely requires additional web servers, so it's probably not something I'll find the time for in the next few months.
Maybe someone will build a plugin to integrate Assistant into automation tools.

I'm not using IFTTT so I have no idea what the Maker action supports, but it can likely only send requests to other web servers, not directly to a mobile device. You could try to send the request to a service like Pushbullet, which will forward it to your device(s).
Maybe IFTTT can also show a notification directly on the Android device. You could intercept this notification with Automagic, execute a flow and then remove the notification again.
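
Roughly something like this (untested since I'm not using IFTTT; the notification text is an assumption about what the IFTTT app posts):

- Trigger: Notification on Statusbar Displayed (restricted to the IFTTT app, notification text containing your keyword)
- Action: execute whatever you want, then remove the notification again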

Regards,
Martin

mathieson
Posts: 51
Joined: 13 Aug 2013 18:16

Re: Google Assistant

Post by mathieson » 15 Feb 2017 16:11

Someone DID build a plugin for Tasker!

I haven't tried it yet (it's on my to-do list), but here it is:
https://www.xda-developers.com/autovoic ... to-use-it/

Marucins
Posts: 4
Joined: 02 Aug 2017 06:45

Re: Google Assistant

Post by Marucins » 17 Oct 2017 15:35

What has changed in Automagic and Google Assistant?

emaeee
Posts: 19
Joined: 10 Nov 2017 22:37

Re: Google Assistant

Post by emaeee » 10 Nov 2017 23:58

Marucins wrote:What has changed in Automagic and Google Assistant?
Mar wrote:Hey,

...
Any solutions?
If not, it would be nice if you could add this feature, Martin. It would make life easier for all of us.
Mar
I'm using one of the latest Google Assistant versions (7.12.24.21), and after trying AutoVoice and Tasker it now seems that the easier and more reliable way to trigger actions on specific spoken text is the "UI Event" trigger built into Automagic itself. Since the Google Now -> Google Assistant server switch, things have become harder and more annoying with the above plugin.

Flow:

- Trigger: UI Event (component scrolled, restricted to the Google Assistant package)
- Action: Control UI (sentence1 = getTextById(....); you can find the id by showing the overlay control window after asking Google Assistant something -> tap the text you spoke and you'll get the id)
- Condition: Expression (variable sentence1 matches "my keywords")
- If condition is true -> execute my action.

For example, say you want to kill the Facebook app.
Automagic gets triggered after you say "Ok Google, kill Facebook app", because the window appears and the text scrolls.
Automagic catches the spoken keywords and, if they match your expression ("kill Facebook app"), the action gets executed.
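
In short, the two parts look roughly like this (the view id below is only a placeholder; use the one you find with the overlay control window, and adjust the keywords):

Control UI (script):
sentence1 = getTextById("com.google.android.googlequicksearchbox:id/example_text");

Expression (condition):
matches(sentence1, ".*kill Facebook app.*")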

I don't know if it works on bigger screens, because the text needs to scroll up in the Google Assistant window for the trigger to fire.

As soon as I test it more extensively and improve the flow, I'll update my post.

EDIT: One problem I noticed is that getTextById might catch the previously spoken command, because Google Assistant works like a chat: when the new command scrolls in, the previous messages (which have the same id) are shown above it.
This could be avoided by using getText(x,y) instead of getTextById, or in other ways.
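
For example (the coordinates are only placeholders and depend on your screen and layout):

sentence1 = getText(200, 600);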

EDIT2: Martin, do you have any suggestion on how to restrict the getTextById action so it catches only the last user chat message?
I don't know if you use it, but here's a video that shows the chat flow in Google Assistant: https://www.youtube.com/watch?v=w1at-jSLcUU

emaeee
Posts: 19
Joined: 10 Nov 2017 22:37

Re: Google Assistant

Post by emaeee » 20 Nov 2017 18:56

Martin wrote:Hi,

Unfortunately integrating Google Assistant in Automagic is quite complex....

Regards,
Martin
Hello Martin, I know you are busy with the new Google Play rules (and as a user I'm worried too), but do you have any suggestions regarding my previous post?
I'm using the trigger "text scrolled" and Control UI getText(x,y) to intercept spoken commands in the Google Assistant window, but sometimes it fails (probably because it depends on the timing between the trigger firing and the position of the command in the window: if getText runs a bit too late, the text has already scrolled up too far and nothing is caught).
The getTextById function would work well, but it often catches previous messages that appear in the same window, because user commands all have the same id and are shown in a chat-like window while you speak. It generally catches the sentence at the top (the command spoken in the previous session).

Is there a way to choose between, or restrict to, elements with the same id? Do you have any suggestions?

thanks

Martin
Posts: 4468
Joined: 09 Nov 2012 14:23

Re: Google Assistant

Post by Martin » 20 Nov 2017 20:40

Hi,

I'm not really using the assistant so I'm not sure if I can be of any help.
You could try to use the function getTextInActiveWindow() to get all the text and then use a regular expression or the substring function to extract the last response from the assistant.
I think the plugin AutoVoice could also work (but I haven't tried it).

Regards,
Martin

emaeee
Posts: 19
Joined: 10 Nov 2017 22:37

Re: Google Assistant

Post by emaeee » 20 Nov 2017 21:59

Martin wrote:Hi,

I'm not really using the assistant so I'm not sure if I can be of any help.
You could try to use the function getTextInActiveWindow() to get all the text and then use a regular expression or the substring function to extract the last response from the assistant.
I think the plugin AutoVoice could also work (but I haven't tried it).

Regards,
Martin
Thanks for the help, I'll try it that way.

Unfortunately, since the Now -> Assistant switch, AutoVoice needs some tricks to detect commands, and those tricks are not yet available for some non-English languages.

emaeee
Posts: 19
Joined: 10 Nov 2017 22:37

Re: Google Assistant

Post by emaeee » 22 Nov 2017 15:42

I created a working flow to intercept Google Assistant commands in order to execute custom actions.
Automagic detects the standard Google Assistant prompt: "Can I help you?", "How can I help?", "Do you need help?" (find the exact wording for your own language). This prompt appears every time you say "Ok Google" but disappears from the chat history, so it cannot be caught by mistake.
Automagic then finds the command that is spoken after this message (using an index variable).

- Trigger: UI Event, component scrolled (restricted to the Google Assistant package)
- Action: Control UI functions ->
getTextInActiveWindow (to get all the text that appears in the Google Assistant chat);
indexOf (to get the index of the command the user says after the default Google intro "How can I help you?" / "Can I help you?"; you must adjust this sentence to your language);
substring (to get the actual command string starting at the index found previously).
- Condition: Expression (substring contains "my keywords")
- If condition is true -> execute my action and close the Google Assistant window

Three variables are used; the last one (the substring) is the command spoken by the user together with Google's answer after it (if Google uses your keywords in its answer, Automagic might catch them as if you had said them). I used the "contains" function, but you can use other functions to decide whether the spoken command should trigger the action.
Sometimes the Control UI action throws an error/exception when it catches the wrong window or when the Google window is closing, so I added a Sleep action for this case (so you don't see the Automagic error notification).

Here's an example flow to start GPS when you say "start GPS" after Google Assistant asks "How can I help you?".
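
Roughly, the script in the Control UI action looks like this (the intro sentence and variable names are just examples; adjust them to your language and keywords):

all_text = getTextInActiveWindow();
index = indexOf(all_text, "How can I help you?") + length("How can I help you?");
command = substring(all_text, index);

and the Expression condition is simply:

contains(command, "start GPS")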

Martin, as always, you are really helpful.
For now the trick works; let's hope that some day Google will provide an open API for custom commands (including offline).
Attachments
flow_StartGPS_20171122_163108.xml
(4.13 KiB) Downloaded 1185 times

emaeee
Posts: 19
Joined: 10 Nov 2017 22:37

Re: Google Assistant

Post by emaeee » 11 Dec 2017 08:49

UPDATE: After one or two updates :roll: the Google app stopped informing accessibility services about the scrolling, so the trigger no longer worked.
New trigger, based on logcat events:

- Trigger:
Command Output: logcat -c; logcat -v time *:D
matches regex: .*EngineCallback.*|.*startMicroDetector.* (the first event appears in online mode; in offline mode the second one must be used)
- Action: Control UI functions ->
getTextInActiveWindow() (to get all the text that appears in the Google Assistant chat);
indexOf (to get the index of the command the user says after the default Google intro "How can I help you?" / "Can I help you?"; you must adjust this sentence to your language);
substring (to get the actual command string starting at the index found previously).
(Look in "Functions" to understand the steps above better.)
- Condition: Expression
substring contains "my keywords"
- If condition is true -> close the Google Assistant window and execute my action

I use two flows that enable/disable the above flow when the Google app starts/stops (App Task Started, App Task Ended), so the Command Output trigger does not consume battery.
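
Roughly (check the actions list for the exact name of the enable/disable action in your Automagic version):

- Trigger: App Task Started (Google app) -> Action: enable the Assistant flow above
- Trigger: App Task Ended (Google app) -> Action: disable it again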

If needed, you can search for other logcat events in case these change in future versions (the ones above are tested and fast).
