I am trying to make Google Now accept custom commands and send an Intent to my app when a particular query is made.
I did this successfully using Tasker and Autovoice, but I want to do the same without using these apps.
I found this link to the documentation, which describes handling common intents, but that did not fulfill my requirement.
I also tried the Voice Interaction API provided by Google, which is almost the same thing, but this did not help.
Has anyone here achieved this without using other apps like Commander, Autovoice or Tasker?
Tap on Settings (the second-to-last option) at the bottom, and under Google Assistant select Settings. On the following page, tap the Assistant tab and select Routines. Google Assistant will already have some commands ready for you to use; by pressing the blue button at the bottom-right you can create your own.
Google Now does not currently 'accept' custom commands. The apps you mention use an AccessibilityService 'hack' to intercept the voice command, or, for rooted devices, the Xposed framework.
They either then act upon them, simultaneously killing Google Now, or ignore them and allow Google to display its results as usual.
For many reasons, this is a bad idea:
Disclaimer complete! Use at your own risk....
You need to register an AccessibilityService in the manifest:
<service
android:name="com.something.MyAccessibilityService"
android:enabled="true"
android:label="@string/label"
android:permission="android.permission.BIND_ACCESSIBILITY_SERVICE" >
<intent-filter>
<action android:name="android.accessibilityservice.AccessibilityService" />
</intent-filter>
<meta-data
android:name="android.accessibilityservice"
android:resource="@xml/accessibilityconfig" />
</service>
And add the config file to res/xml:
<accessibility-service
xmlns:android="http://schemas.android.com/apk/res/android"
android:accessibilityEventTypes="typeWindowContentChanged"
android:accessibilityFeedbackType="feedbackGeneric"
android:accessibilityFlags="flagIncludeNotImportantViews"
android:canRetrieveWindowContent="true"
android:description="@string/accessibility_description"
android:notificationTimeout="100"
android:settingsActivity="SettingsActivity"/>
You can optionally add:
android:packageNames="xxxxxx"
or extend the functionality by adding further event types:
android:accessibilityEventTypes="typeViewTextSelectionChanged|typeWindowContentChanged|typeNotificationStateChanged"
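For example, since the service below only listens for events from the Google search app, you could set android:packageNames="com.google.android.googlequicksearchbox" (the same package the service checks against) to avoid receiving unrelated events.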
Include the following AccessibilityService example class:
/*
* Copyright (c) 2016 Ben Randall
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.your.package;
import android.accessibilityservice.AccessibilityService;
import android.support.annotation.NonNull;
import android.support.annotation.Nullable;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;
/**
* @author benrandall76 AT gmail DOT com
*/
public class MyAccessibilityService extends AccessibilityService {
private final boolean DEBUG = true;
private final String CLS_NAME = MyAccessibilityService.class.getSimpleName();
private static final String GOOGLE_VOICE_SEARCH_PACKAGE_NAME = "com.google.android.googlequicksearchbox";
private static final String GOOGLE_VOICE_SEARCH_INTERIM_FIELD = "com.google.android.apps.gsa.searchplate.widget.StreamingTextView";
private static final String GOOGLE_VOICE_SEARCH_FINAL_FIELD = "com.google.android.apps.gsa.searchplate.SearchPlate";
private static final long COMMAND_UPDATE_DELAY = 1000L;
private long previousCommandTime;
private String previousCommand = null;
private final boolean EXTRA_VERBOSE = false;
@Override
protected void onServiceConnected() {
super.onServiceConnected();
if (DEBUG) {
Log.i(CLS_NAME, "onServiceConnected");
}
}
@Override
public void onAccessibilityEvent(final AccessibilityEvent event) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent");
}
if (event != null) {
switch (event.getEventType()) {
case AccessibilityEvent.TYPE_WINDOW_CONTENT_CHANGED:
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: checking for google");
}
if (event.getPackageName() != null && event.getPackageName().toString().matches(
GOOGLE_VOICE_SEARCH_PACKAGE_NAME)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: checking for google: true");
Log.i(CLS_NAME, "onAccessibilityEvent: event.getPackageName: " + event.getPackageName());
Log.i(CLS_NAME, "onAccessibilityEvent: event.getClassName: " + event.getClassName());
}
final AccessibilityNodeInfo source = event.getSource();
if (source != null && source.getClassName() != null) {
if (source.getClassName().toString().matches(
GOOGLE_VOICE_SEARCH_INTERIM_FIELD)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: className interim: true");
Log.i(CLS_NAME, "onAccessibilityEvent: source.getClassName: " + source.getClassName());
}
if (source.getText() != null) {
final String text = source.getText().toString();
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: interim text: " + text);
}
if (interimMatch(text)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child: interim match: true");
}
if (commandDelaySufficient(event.getEventTime())) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandDelaySufficient: true");
}
if (!commandPreviousMatches(text)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandPreviousMatches: false");
}
previousCommandTime = event.getEventTime();
previousCommand = text;
killGoogle();
if (DEBUG) {
Log.e(CLS_NAME, "onAccessibilityEvent: INTERIM PROCESSING: " + text);
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandPreviousMatches: true");
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandDelaySufficient: false");
}
}
break;
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child: interim match: false");
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: interim text: null");
}
}
} else if (source.getClassName().toString().matches(
GOOGLE_VOICE_SEARCH_FINAL_FIELD)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: className final: true");
Log.i(CLS_NAME, "onAccessibilityEvent: source.getClassName: " + source.getClassName());
}
final int childCount = source.getChildCount();
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: childCount: " + childCount);
}
if (childCount > 0) {
for (int i = 0; i < childCount; i++) {
final String text = examineChild(source.getChild(i));
if (text != null) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child text: " + text);
}
if (finalMatch(text)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child: final match: true");
}
if (commandDelaySufficient(event.getEventTime())) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandDelaySufficient: true");
}
if (!commandPreviousMatches(text)) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandPreviousMatches: false");
}
previousCommandTime = event.getEventTime();
previousCommand = text;
killGoogle();
if (DEBUG) {
Log.e(CLS_NAME, "onAccessibilityEvent: FINAL PROCESSING: " + text);
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandPreviousMatches: true");
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: commandDelaySufficient: false");
}
}
break;
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child: final match: false");
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: child text: null");
}
}
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: className: unwanted " + source.getClassName());
}
if (EXTRA_VERBOSE) {
if (source.getText() != null) {
final String text = source.getText().toString();
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: unwanted text: " + text);
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: unwanted text: null");
}
}
final int childCount = source.getChildCount();
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: unwanted childCount: " + childCount);
}
if (childCount > 0) {
for (int i = 0; i < childCount; i++) {
final String text = examineChild(source.getChild(i));
if (text != null) {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: unwanted child text: " + text);
}
}
}
}
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: source null");
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: checking for google: false");
}
}
break;
default:
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: not interested in type");
}
break;
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "onAccessibilityEvent: event null");
}
}
}
/**
* Check if the previous command was actioned within the {@link #COMMAND_UPDATE_DELAY}
*
* @param currentTime the time of the current {@link AccessibilityEvent}
* @return true if the delay is sufficient to proceed, false otherwise
*/
private boolean commandDelaySufficient(final long currentTime) {
if (DEBUG) {
Log.i(CLS_NAME, "commandDelaySufficient");
}
final long delay = (currentTime - COMMAND_UPDATE_DELAY);
if (DEBUG) {
Log.i(CLS_NAME, "commandDelaySufficient: delay: " + delay);
Log.i(CLS_NAME, "commandDelaySufficient: previousCommandTime: " + previousCommandTime);
}
return delay > previousCommandTime;
}
/**
* Check if the previous command/text matches the current text we are considering processing
*
* @param text the current text
* @return true if the text matches the previous text we processed, false otherwise.
*/
private boolean commandPreviousMatches(@NonNull final String text) {
if (DEBUG) {
Log.i(CLS_NAME, "commandPreviousMatches");
}
return previousCommand != null && previousCommand.matches(text);
}
/**
* Check if the interim text matches a command we want to intercept
*
* @param text the intercepted text
* @return true if the text matches a command false otherwise
*/
private boolean interimMatch(@NonNull final String text) {
if (DEBUG) {
Log.i(CLS_NAME, "interimMatch");
}
return text.matches("do interim results work");
}
/**
* Check if the final text matches a command we want to intercept
*
* @param text the intercepted text
* @return true if the text matches a command false otherwise
*/
private boolean finalMatch(@NonNull final String text) {
if (DEBUG) {
Log.i(CLS_NAME, "finalMatch");
}
return text.matches("do final results work");
}
/**
* Recursively examine the {@link AccessibilityNodeInfo} object
*
* @param parent the {@link AccessibilityNodeInfo} parent object
* @return the extracted text or null if no text was contained in the child objects
*/
private String examineChild(@Nullable final AccessibilityNodeInfo parent) {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild");
}
if (parent != null) {
for (int i = 0; i < parent.getChildCount(); i++) {
final AccessibilityNodeInfo nodeInfo = parent.getChild(i);
if (nodeInfo != null) {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: nodeInfo: getClassName: " + nodeInfo.getClassName());
}
if (nodeInfo.getText() != null) {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: have text: returning: " + nodeInfo.getText().toString());
}
return nodeInfo.getText().toString();
} else {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: text: null: recurse");
}
final int childCount = nodeInfo.getChildCount();
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: childCount: " + childCount);
}
if (childCount > 0) {
final String text = examineChild(nodeInfo);
if (text != null) {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: have recursive text: returning: " + text);
}
return text;
} else {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: recursive text: null");
}
}
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: nodeInfo null");
}
}
}
} else {
if (DEBUG) {
Log.i(CLS_NAME, "examineChild: parent null");
}
}
return null;
}
/**
* Kill or reset Google
*/
private void killGoogle() {
if (DEBUG) {
Log.i(CLS_NAME, "killGoogle");
}
// TODO - Either kill the Google process or send an empty intent to clear current search process
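// See the EDIT at the end of this answer: sending an empty ("") search
// intent to the Google package is one way to clear the current search.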
}
@Override
public void onInterrupt() {
if (DEBUG) {
Log.i(CLS_NAME, "onInterrupt");
}
}
@Override
public void onDestroy() {
super.onDestroy();
if (DEBUG) {
Log.i(CLS_NAME, "onDestroy");
}
}
}
I made the class as verbose and indented as possible, so it's hopefully easier to follow.
It does the following: it listens for window content changes from the Google search app, extracts the recognised text from the interim and final search views, and checks that text against the hard-coded example commands.
To test, enable the Service in the Android Accessibility Settings and speak one of the hard-coded phrases ("do interim results work" / "do final results work"). The above will demonstrate the extracted text/command from both hard-coded views. If you don't restart Google Now, the command will still be detected as interim.
Using the extracted voice command, you need to perform your own language matching to determine whether it is a command you are interested in. If it is, you need to prevent Google from speaking or displaying its results. This is achieved by killing Google Now, or sending it an empty voice search intent containing flags that should clear/reset the task.
You will be in a race-condition doing this, so your language processing needs to be pretty smart, or pretty basic....
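If you go beyond the hard-coded test phrases, the matching logic can start simply. Here is a minimal sketch that could replace the interimMatch()/finalMatch() bodies; the matchCommand helper and the command phrases are just illustrative placeholders, not part of the service above:
// Illustrative only: a naive matcher against a fixed set of commands.
// Requires java.util.Arrays, HashSet, Locale and Set imports.
private static final Set<String> COMMANDS = new HashSet<>(
        Arrays.asList("turn on the lights", "start my workout"));

private boolean matchCommand(@NonNull final String text) {
    // Lower-case and trim the recognised text before comparing, as the
    // recogniser output can vary in capitalisation and spacing.
    final String normalised = text.toLowerCase(Locale.US).trim();
    for (final String command : COMMANDS) {
        if (normalised.contains(command)) {
            return true;
        }
    }
    return false;
}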
Hope that helps.
EDIT:
For those asking, to 'kill' Google Now, you either need to have permission to kill processes, or send an empty ("") search intent to clear the current search:
public static final String PACKAGE_NAME_GOOGLE_NOW = "com.google.android.googlequicksearchbox";
public static final String ACTIVITY_GOOGLE_NOW_SEARCH = ".SearchActivity";
/**
* Launch Google Now with a specific search term to resolve
*
* @param ctx the application context
* @param searchTerm the search term to resolve
* @return true if the search term was handled correctly, false otherwise
*/
public static boolean googleNow(@NonNull final Context ctx, @NonNull final String searchTerm) {
if (DEBUG) {
Log.i(CLS_NAME, "googleNow");
}
final Intent intent = new Intent(Intent.ACTION_WEB_SEARCH);
intent.setComponent(new ComponentName(PACKAGE_NAME_GOOGLE_NOW,
PACKAGE_NAME_GOOGLE_NOW + ACTIVITY_GOOGLE_NOW_SEARCH));
intent.putExtra(SearchManager.QUERY, searchTerm);
intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_CLEAR_TOP
| Intent.FLAG_ACTIVITY_RESET_TASK_IF_NEEDED);
try {
ctx.startActivity(intent);
return true;
} catch (final ActivityNotFoundException e) {
if (DEBUG) {
Log.e(CLS_NAME, "googleNow: ActivityNotFoundException");
e.printStackTrace();
}
} catch (final Exception e) {
if (DEBUG) {
Log.e(CLS_NAME, "googleNow: Exception");
e.printStackTrace();
}
}
return false;
}
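In killGoogle() above, you could then call something like googleNow(getApplicationContext(), "") to fire the empty search intent and clear the current search; treat that as a sketch of how the two snippets fit together rather than tested code.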
Not what you want to hear, but the current version of the API does not allow custom voice commands:
From https://developers.google.com/voice-actions/custom-actions
Note: We are not accepting requests for Custom Voice Actions. Stay tuned to Voice Actions - Google Developers and +GoogleDevelopers for product updates.