docs/html/training/articles/assistant.jd +159 −147

@@ -11,110 +11,92 @@ page.article=true

<div id="tb">
<h2>In this document</h2>
<ol>
  <li><a href="#assist_api">Using the Assistant</a>
  <ol>
    <li><a href="#source_app">Source app</a></li>
    <li><a href="#destination_app">Destination app</a></li>
  </ol>
  </li>
  <li><a href="#implementing_your_own_assistant">Implementing your
  own assistant</a></li>
</ol>
</div>
</div>

<p>
  Android 6.0 Marshmallow introduces a new way for users to engage with apps
  through the assistant.
</p>

<p>
  Users activate the assistant with a long press on the Home button or by
  saying a <a href=
  "{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase</a>.
  In response, the system opens a top-level window that displays contextually
  relevant actions.
</p>

<p>
  The Google App implements the assistant overlay window through a feature
  called Now on Tap, which works with the Android platform-level
  functionality. The system allows the user to select an assistant app, which
  obtains contextual information from your app using Android’s Assist API.
</p>

<p>
  This guide explains how Android apps use Android's Assist API to improve
  the assistant user experience.
</p>

<h2 id="assist_api">Using the Assistant</h2>

<p>
  Figure 1 illustrates a typical user interaction with the assistant. When
  the user long-presses the Home button, the Assist API callbacks are invoked
  in the <em>source</em> app (step 1). The assistant renders the overlay
  window (steps 2 and 3), and then the user selects the action to perform.
  The assistant executes the selected action, such as firing an intent with a
  deep link to the (<em>destination</em>) restaurant app (step 4).
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image01.png">
  <p class="img-caption" style="text-align:center;">
    Figure 1. Assistant interaction example with the Now on Tap feature of
    the Google App
  </p>
</div>

<p>
  Users can configure the assistant by selecting <strong>Settings &gt; Apps
  &gt; Default Apps &gt; Assist &amp; voice input</strong>. Users can change
  system options such as accessing the screen contents as text and accessing
  a screenshot, as shown in Figure 2.
</p>

<p>
  From there, the assistant receives the information only when the user
  activates assistance, such as when they tap and hold the Home button
  (shown in Figure 1, step 1).
</p>

<div id="assist-input-settings" style="float:right;margin:1em;max-width:300px">
  <img src="{@docRoot}images/training/assistant/image02.png">
  <p class="img-caption" style="text-align:center;">
    Figure 2. Assist &amp; voice input settings
  </p>
</div>

<h3 id="source_app">Source app</h3>

<p>
  To ensure that your app works with the assistant as a source of information
  for the user, you need only follow <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best
  practices</a>. This section describes how to provide additional information
  to help improve the assistant user experience, as well as scenarios that
  need special handling, such as custom Views.
</p>

<h4 id="share_additional_information_with_the_assistant">Share additional
information with the assistant</h4>

<p>
  In addition to the text and the screenshot, your app can share other
  information with the assistant. For example, your music app can choose to
  pass current album information so that the assistant can suggest smarter
  actions tailored to the current activity.
</p>
@@ -122,13 +104,13 @@ page.article=true

<p>
  To provide additional information to the assistant, your app provides
  <em>global application context</em> by registering an app listener and
  supplies activity-specific information with activity callbacks as shown in
  Figure 3:
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image03.png">
  <p class="img-caption" style="text-align:center;">
    Figure 3. Assist API lifecycle sequence diagram
  </p>
</div>

@@ -136,43 +118,42 @@ page.article=true

<p>
  To provide global application context, the app creates an implementation of
  {@link android.app.Application.OnProvideAssistDataListener} and registers
  it using {@link
  android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener)
  registerOnProvideAssistDataListener()}. To provide activity-specific
  contextual information, the activity overrides {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle)
  onProvideAssistData()} and {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()}. The two activity methods are called
  <em>after</em> the optional global callback is invoked. Because the
  callbacks execute on the main thread, they should complete <a href=
  "{@docRoot}training/articles/perf-anr.html">promptly</a>. The callbacks are
  invoked only when the activity is <a href=
  "{@docRoot}reference/android/app/Activity.html#ActivityLifecycle">running</a>.
</p>

<h5 id="providing_context">Providing context</h5>

<p>
  When the user activates the assistant, {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle)
  onProvideAssistData()} is called to build a full {@link
  android.content.Intent#ACTION_ASSIST} Intent with all of the context of the
  current application represented as an instance of the {@link
  android.app.assist.AssistStructure}. You can override this method to place
  anything you like into the bundle to appear in the {@link
  android.content.Intent#EXTRA_ASSIST_CONTEXT} part of the assist intent.
</p>
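<p>
  As an illustration of the two callbacks described above, the following
  sketch registers a global listener and overrides the activity callback. The
  class names, extra keys, and values here are hypothetical examples, not API
  constants:
</p>

```java
import android.app.Activity;
import android.app.Application;
import android.os.Bundle;

// Sketch: register a global assist-data listener in an Application subclass.
public class MusicApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        registerOnProvideAssistDataListener(new OnProvideAssistDataListener() {
            @Override
            public void onProvideAssistData(Activity activity, Bundle data) {
                // Runs on the main thread for every activity; keep it fast.
                data.putString("com.example.music.APP_STATE", "browsing");
            }
        });
    }
}

// Sketch: supply activity-specific context; the bundle contents appear
// under EXTRA_ASSIST_CONTEXT in the ACTION_ASSIST intent.
class AlbumActivity extends Activity {
    @Override
    public void onProvideAssistData(Bundle data) {
        super.onProvideAssistData(data);
        data.putString("com.example.music.CURRENT_ALBUM", "Album Title");
    }
}
```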
<h5 id="describing_content">Describing content</h5>

<p>
  Your app can implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()} to improve the assistant user experience by
  providing references related to the content of the current activity. You
  can describe the app content using the common vocabulary defined by
  <a href="https://schema.org" class="external-link">Schema.org</a> through a
  JSON-LD object. In the example below, a music app provides structured data
  to describe the music album that the user is currently viewing:
</p>

<pre class="prettyprint">
@@ -191,127 +172,158 @@ public void onProvideAssistContent(AssistContent <strong>assistContent</strong>)
</pre>

<p>
  You can also improve the user experience with custom implementations of
  {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()}, which can provide the following benefits:
</p>

<ul>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setIntent(android.content.Intent)">Adjusts
  the provided content intent</a> to better reflect the top-level context of
  the activity.</li>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setWebUri(android.net.Uri)">Supplies
  the URI</a> of the displayed content.</li>
  <li>Fills in {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData)
  setClipData()} with additional content of interest that the user is
  currently viewing.</li>
</ul>

<p class="note">
  <strong>Note: </strong>Apps that use a custom text selection implementation
  likely need to implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()} and call {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData)
  setClipData()}.
</p>
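<p>
  Two of these customizations might be combined as in the following sketch;
  the activity name, URI, and clip text are placeholder values:
</p>

```java
import android.app.Activity;
import android.app.assist.AssistContent;
import android.content.ClipData;
import android.net.Uri;

public class AlbumActivity extends Activity {
    @Override
    public void onProvideAssistContent(AssistContent assistContent) {
        super.onProvideAssistContent(assistContent);
        // A web URI that deep-links to the content currently on screen.
        assistContent.setWebUri(
                Uri.parse("https://example.com/albums/album-title"));
        // Additional content of interest, such as the selected track title.
        assistContent.setClipData(
                ClipData.newPlainText("track", "Track Title"));
    }
}
```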
<h4 id="default_implementation">Default implementation</h4>

<p>
  If neither the {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle)
  onProvideAssistData()} nor the {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()} callback is implemented, the system still
  proceeds and passes the automatically collected information to the
  assistant unless the current window is flagged as <a href=
  "#excluding_views">secure</a>. As shown in Figure 3, the system uses the
  default implementations of {@link
  android.view.View#onProvideStructure(android.view.ViewStructure)
  onProvideStructure()} and {@link
  android.view.View#onProvideVirtualStructure(android.view.ViewStructure)
  onProvideVirtualStructure()} to collect text and view hierarchy
  information. If your view implements custom text drawing, override {@link
  android.view.View#onProvideStructure(android.view.ViewStructure)
  onProvideStructure()} to provide the assistant with the text shown to the
  user by calling {@link
  android.view.ViewStructure#setText(java.lang.CharSequence)
  setText(CharSequence)}.
</p>

<p>
  <em>In most cases, implementing accessibility support enables the assistant
  to obtain the information it needs.</em> To implement accessibility
  support, observe the best practices described in <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">Making Applications
  Accessible</a>, including the following:
</p>

<ul>
  <li>Provide {@link android.R.attr#contentDescription
  android:contentDescription} attributes.</li>
  <li>Populate {@link android.view.accessibility.AccessibilityNodeInfo} for
  custom views.</li>
  <li>Make sure that custom {@link android.view.ViewGroup ViewGroup} objects
  correctly <a href=
  "{@docRoot}reference/android/view/ViewGroup.html#getChildAt(int)">expose</a>
  their children.</li>
</ul>
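<p>
  For a view that implements custom text drawing, the override described
  above might look like the following sketch; <code>TickerView</code> is a
  hypothetical custom view:
</p>

```java
import android.content.Context;
import android.view.View;
import android.view.ViewStructure;

// Hypothetical view that draws its text directly to the canvas.
public class TickerView extends View {
    private CharSequence tickerText = "";

    public TickerView(Context context) {
        super(context);
    }

    @Override
    public void onProvideStructure(ViewStructure structure) {
        super.onProvideStructure(structure);
        // Report the text this view renders so the assistant can read it.
        structure.setText(tickerText);
    }
}
```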
<h4 id="excluding_views">Excluding views from the assistant</h4>

<p>
  To handle sensitive information, your app can exclude the current view from
  the assistant by setting the {@link
  android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} layout
  parameter of the {@link android.view.WindowManager}. You must set {@link
  android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} explicitly
  for every window created by the activity, including dialogs. Your app can
  also use {@link android.view.SurfaceView#setSecure(boolean) setSecure()} to
  exclude a surface from the assistant. There is no global (app-level)
  mechanism to exclude all views from the assistant. Note that {@link
  android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} does not
  cause the Assist API callbacks to stop firing. The activity that uses
  {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} can
  still explicitly provide information to the assistant using the callbacks
  described earlier in this guide.
</p>
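<p>
  Setting the flag might look like the following sketch for a hypothetical
  activity that displays sensitive content:
</p>

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.WindowManager;

// Hypothetical activity whose contents should never reach the assistant.
public class PaymentActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Marks this window secure; repeat for every window the activity
        // creates. For a dialog, use:
        // dialog.getWindow().addFlags(WindowManager.LayoutParams.FLAG_SECURE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
                WindowManager.LayoutParams.FLAG_SECURE);
    }
}
```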
<p class="note">
  <strong>Note: </strong>For enterprise accounts (Android for Work), the
  administrator can disable the collection of assistant data for the work
  profile by using the {@link
  android.app.admin.DevicePolicyManager#setScreenCaptureDisabled(android.content.ComponentName,
  boolean) setScreenCaptureDisabled()} method of the {@link
  android.app.admin.DevicePolicyManager} API.
</p>

<h4 id="voice_interactions">Voice interactions</h4>

<p>
  Assist API callbacks are also invoked upon <a href=
  "{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase
  detection</a>. For more information, see the <a href=
  "https://developers.google.com/voice-actions/" class="external-link">Voice
  Actions</a> documentation.
</p>

<h4 id="z-order_considerations">Z-order considerations</h4>

<p>
  The assistant uses a lightweight overlay window displayed on top of the
  current activity. Because the user can activate the assistant at any time,
  don't create permanent <a href=
  "{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">system
  alert</a> windows that interfere with the overlay window, as shown in
  Figure 4.
</p>
<div style="">
  <img src="{@docRoot}images/training/assistant/image04.png">
  <p class="img-caption" style="text-align:center;">
    Figure 4. Assist layer Z-order
  </p>
</div>

<p>
  If your app uses <a href=
  "{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">system
  alert</a> windows, remove them promptly because leaving them on the screen
  degrades the user experience.
</p>

<h3 id="destination_app">Destination app</h3>

<p>
  The assistant typically takes advantage of deep linking to find destination
  apps. To make your app a potential destination app, consider adding
  <a href="{@docRoot}training/app-indexing/deep-linking.html">deep
  linking</a> support. The matching between the current user context and deep
  links or other potential actions displayed in the overlay window (shown in
  step 3 in Figure 1) is specific to the assistant’s implementation. For
  example, the Google App uses deep linking and <a href=
  "https://developers.google.com/app-indexing/" class="external-link">Firebase
  App Indexing</a> in order to drive traffic to destination apps.
</p>
<h2 id="implementing_your_own_assistant">Implementing your own assistant</h2>

<p>
  You may wish to implement your own assistant. As shown in <a href=
  "#assist-input-settings">Figure 2</a>, the user can select the active
  assistant app. The assistant app must provide an implementation of {@link
  android.service.voice.VoiceInteractionSessionService} and {@link
  android.service.voice.VoiceInteractionSession}, as shown in <a href=
  "https://android.googlesource.com/platform/frameworks/base/+/marshmallow-release/tests/VoiceInteraction/"
  class="external-link">this <code>VoiceInteraction</code> example</a>. It
  also requires the {@link
  android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The
  assistant can then receive the text and view hierarchy represented as an
  instance of the {@link android.app.assist.AssistStructure} in {@link
  android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle,
  android.app.assist.AssistStructure,android.app.assist.AssistContent)
  onHandleAssist()}. It receives the screenshot through {@link
  android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap)
  onHandleScreenshot()}.
</p>
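<p>
  A minimal skeleton of such an implementation might look like the following
  sketch. The class names are placeholders, and the service must also be
  declared in the manifest with the
  <code>android.permission.BIND_VOICE_INTERACTION</code> permission:
</p>

```java
import android.app.assist.AssistContent;
import android.app.assist.AssistStructure;
import android.content.Context;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.service.voice.VoiceInteractionSession;
import android.service.voice.VoiceInteractionSessionService;

// Entry point the system binds to when this app is the active assistant.
public class MySessionService extends VoiceInteractionSessionService {
    @Override
    public VoiceInteractionSession onNewSession(Bundle args) {
        return new MySession(this);
    }
}

class MySession extends VoiceInteractionSession {
    MySession(Context context) {
        super(context);
    }

    @Override
    public void onHandleAssist(Bundle data, AssistStructure structure,
            AssistContent content) {
        // Walk the view hierarchy captured from the source app.
        if (structure != null) {
            for (int i = 0; i < structure.getWindowNodeCount(); i++) {
                AssistStructure.ViewNode root =
                        structure.getWindowNodeAt(i).getRootViewNode();
                // ... inspect root.getText(), root.getChildAt(n), etc. ...
            }
        }
    }

    @Override
    public void onHandleScreenshot(Bitmap screenshot) {
        // Optional: analyze the screenshot of the source activity.
    }
}
```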
docs/html/training/articles/assistant.jd +159 −147 Original line number Original line Diff line number Diff line Loading @@ -11,110 +11,92 @@ page.article=true <div id="tb"> <div id="tb"> <h2>In this document</h2> <h2>In this document</h2> <ol> <ol> <li><a href="#assist_api">Using the Assist API</a> <li><a href="#assist_api">Using the Assistant</a> <ol> <ol> <li><a href="#assist_api_lifecycle">Assist API Lifecycle</a></li> <li><a href="#source_app">Source app</a></li> <li><a href="#source_app">Source App</a></li> <li><a href="#destination_app">Destination app</a></li> <li><a href="#destination_app">Destination App</a></li> </ol> </ol> </li> </li> <li><a href="#implementing_your_own_assistant">Implementing your <li><a href="#implementing_your_own_assistant">Implementing Your own assistant</a></li> Own Assistant</a></li> </ol> </ol> </div> </div> </div> </div> <p> <p> Android 6.0 Marshmallow introduces a new way for users to engage with apps Android 6.0 Marshmallow introduces a new way for users to engage with apps through the assistant. through the assistant. The assistant is a top-level window that users can view to obtain </p> contextually relevant actions for the current activity. These actions might include deep links to other apps on the device.</p> <p> <p> Users summon the assistant with a long-press on the Home button or by saying Users activate the assistant with a long press on the Home button or by saying a the {@link android.service.voice.AlwaysOnHotwordDetector keyphrase}. In <a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase</a>. response to the long-press, the system opens a top-level window that displays In response, the system opens a top-level window that displays contextually contextually relevant actions for the current activity. These potential relevant actions. actions might include deep links to other apps on the device. 
</p> </p> <p> <p> This guide explains how Android apps use Android's Assist API to improve the Google App implements the assistant overlay window through a feature called assistant user experience. Now on Tap, which works with the Android platform-level functionality. The system allows the user to select an assistant app, which obtains contextual information from your app using Android’s Assist API. </p> </p> <h2 id="assist_api">Using the Assist API</h2> <p> <p> The example below shows how Google Now integrates with the Android assistant This guide explains how Android apps use Android's Assist API to improve the assistant using a feature called Now on Tap. user experience. <p/> </p> </p> <h2 id="assist_api">Using the Assistant</h2> <p> <p> The assistant overlay window in our example (2, 3) is implemented by Google Figure 1 illustrates a typical user interaction with the assistant. When the user long-presses Now through a feature called Now on Tap, which works in concert with the the Home button, the Assist API callbacks are invoked Android platform-level functionality. The system allows the user to select in the <em>source</em> app (step 1). The assistant renders the overlay window (steps 2 and 3), the assistant app (Figure 2) that obtains contextual information from the and then the user selects the action to perform. The assistant executes the selected action, <em>source</em> app using the Assist API which is a part of the platform. such as firing an intent with a deep link to the (<em>destination</em>) restaurant app (step 4). </p> </p> <div> <div> <img src="{@docRoot}images/training/assistant/image01.png"> <img src="{@docRoot}images/training/assistant/image01.png"> <p class="img-caption" style="text-align:center;"> <p class="img-caption" style="text-align:center;"> Figure 1. Assistant interaction example with the Now on Tap feature of Figure 1. 
Assistant interaction example with the Now on Tap feature of Google Now the Google App </p> </p> </div> </div> <p> <p> An Android user first configures the assistant and can change system options Users can configure the assistant by selecting <strong>Settings > Apps > Default Apps > such as using text and view hierarchy as well as the screenshot of the Assist & voice input</strong>. Users can change system options such as accessing current screen (Figure 2). the screen contents as text and accessing a screenshot, as shown in Figure 2. </p> <p> From there, the assistant receives the information only when the user activates assistance, such as when they tap and hold the Home button ( shown in Figure 1, step 1). </p> </p> <div style="float:right;margin:1em;max-width:300px"> <div id="assist-input-settings" style="float:right;margin:1em;max-width:300px"> <img src="{@docRoot}images/training/assistant/image02.png"> <img src="{@docRoot}images/training/assistant/image02.png"> <p class="img-caption" style="text-align:center;"> <p class="img-caption" style="text-align:center;"> Figure 2. Assist & voice input settings (<em>Settings/Apps/Default Figure 2. Assist & voice input settings Apps/Assist & voice input</em>) </p> </p> </div> </div> <h3 id="assist_api_lifecycle">Assist API Lifecycle</h3> <h3 id="source_app">Source app</h3> <p> Going back to our example from Figure 1, the Assist API callbacks are invoked in the <em>source</em> app after step 1 (user long-presses the Home button) and before step 2 (the assistant renders the overlay window). Once the user selects the action to perform (step 3), the assistant executes it, for example by firing an intent with a deep link to the (<em>destination</em>) restaurant app (step 4). 
</p> <h3 id="source_app">Source App</h3> <p> <p> In most cases, your app does not need to do anything extra to integrate with To ensure that your app works with the assistant as a source of information for the user, the assistant if you already follow <a href= you need only follow <a href= "{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best "{@docRoot}guide/topics/ui/accessibility/apps.html">accessibility best practices</a>. This section describes how to provide additional information practices</a>. This section describes how to provide additional information to help improve the assistant user experience, as well as scenarios, such as to help improve the assistant user experience as well as scenarios custom Views, that need special handling. that need special handling, such as custom Views. </p> </p> <h4 id="share_additional_information_with_the_assistant">Share additional information <h4 id="share_additional_information_with_the_assistant">Share Additional Information with the Assistant</h4> with the assistant</h4> <p> <p> In addition to the text and the screenshot, your app can share In addition to the text and the screenshot, your app can share <em>additional</em> information with the assistant. For example, your music other information with the assistant. For example, your music app can choose to pass current album information, so that the assistant can app can choose to pass current album information so that the assistant can suggest smarter actions tailored to the current activity. suggest smarter actions tailored to the current activity. 
<p>
  To provide additional information to the assistant, your app provides
  <em>global application context</em> by registering an app listener and
  supplies activity-specific information with activity callbacks, as shown in
  Figure 3:
</p>

<div>
  <img src="{@docRoot}images/training/assistant/image03.png">
  <p class="img-caption" style="text-align:center;">
    Figure 3. Assist API lifecycle sequence diagram
  </p>
</div>

<p>
  To provide global application context, the app creates an implementation of
  {@link android.app.Application.OnProvideAssistDataListener} and registers it
  using {@link
  android.app.Application#registerOnProvideAssistDataListener(android.app.Application.OnProvideAssistDataListener)
  registerOnProvideAssistDataListener()}.
  To provide activity-specific contextual information, the activity
  overrides {@link android.app.Activity#onProvideAssistData(android.os.Bundle)
  onProvideAssistData()} and {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()}.
</p>
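<p>
  As a minimal sketch of the registration step described above (the
  <code>MyApplication</code> class name and the listener body are hypothetical,
  not part of the Assist API), an app might register the global listener in its
  {@link android.app.Application} subclass:
</p>

<pre class="prettyprint">
public class MyApplication extends Application {
    &#64;Override
    public void onCreate() {
        super.onCreate();
        // Hypothetical global listener: contributes app-wide context
        // whenever the user activates the assistant.
        registerOnProvideAssistDataListener(
                new OnProvideAssistDataListener() {
                    &#64;Override
                    public void onProvideAssistData(Activity activity, Bundle data) {
                        // Add app-wide context here, for example
                        // the user's signed-in state.
                    }
                });
    }
}
</pre>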
<p>
  The two activity methods are called <em>after</em> the optional global
  callback is invoked. Because the callbacks execute on the main thread, they should
  complete <a href="{@docRoot}training/articles/perf-anr.html">promptly</a>.
  The callbacks are invoked only when the activity is <a href=
  "{@docRoot}reference/android/app/Activity.html#ActivityLifecycle">running</a>.
</p>

<h5 id="providing_context">Providing context</h5>

<p>
  When the user activates the assistant,
  {@link android.app.Activity#onProvideAssistData(android.os.Bundle)
  onProvideAssistData()} is called to build a full {@link
  android.content.Intent#ACTION_ASSIST} Intent with all of the context of the
  current application represented as an instance of the {@link
  android.app.assist.AssistStructure}. You can override this method to place
  anything you like into the bundle to appear in the
  {@link android.content.Intent#EXTRA_ASSIST_CONTEXT} part of the assist intent.
</p>
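<p>
  For example, an activity could supply extra context like this (the bundle keys
  and values below are app-defined, hypothetical names, not part of the API):
</p>

<pre class="prettyprint">
&#64;Override
public void onProvideAssistData(Bundle data) {
    super.onProvideAssistData(data);
    // App-defined keys (hypothetical); these extras surface in the
    // EXTRA_ASSIST_CONTEXT part of the assist intent.
    data.putString("com.example.music.ALBUM_TITLE", "Album Title");
    data.putInt("com.example.music.TRACK_NUMBER", 7);
}
</pre>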
<h5 id="describing_content">Describing content</h5>

<p>
  Your app can implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()}
  to improve the assistant user experience by providing content-related references
  for the current activity. You can describe the app content using the
  common vocabulary defined by <a href="https://schema.org" class="external-link">Schema.org</a>
  through a JSON-LD object. In the example below, a music app provides
  structured data to describe the music album that the user is currently
  viewing:
</p>
<pre class="prettyprint">
&#64;Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);

    try {
        // Describe the current album as Schema.org structured data.
        String structuredJson = new JSONObject()
                .put("@type", "MusicRecording")
                .put("@id", "https://example.com/music/recording")
                .put("name", "Album Title")
                .toString();
        assistContent.setStructuredData(structuredJson);
    } catch (JSONException e) {
        // Handle the malformed-JSON error.
    }
}
</pre>

<p>
  You can also improve the user experience with custom implementations of
  {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()},
  which can provide the following benefits:
</p>

<ul>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setIntent(android.content.Intent)">
  Adjusts the provided content
  intent</a> to better reflect the top-level context of the activity.</li>
  <li><a href="{@docRoot}reference/android/app/assist/AssistContent.html#setWebUri(android.net.Uri)">
  Supplies the URI</a> of the displayed content.</li>
  <li>Fills in {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData) setClipData()} with
  additional content of interest that the user is currently viewing.</li>
</ul>

<p class="note">
  <strong>Note: </strong>Apps that use a custom text selection implementation likely need to
  implement {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()} and call {@link
  android.app.assist.AssistContent#setClipData(android.content.ClipData)
  setClipData()}.
</p>
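<p>
  Continuing the music-app example, an implementation might combine these calls
  as follows (the URI and clip label below are illustrative values, not values
  the API requires):
</p>

<pre class="prettyprint">
&#64;Override
public void onProvideAssistContent(AssistContent assistContent) {
    super.onProvideAssistContent(assistContent);
    // Point the assistant at a web version of the current content
    // (illustrative URI).
    assistContent.setWebUri(
            Uri.parse("https://example.com/music/album/current"));
    // Attach the item the user is currently viewing.
    assistContent.setClipData(
            ClipData.newPlainText("album", "Album Title"));
}
</pre>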
<h4 id="default_implementation">Default implementation</h4>

<p>
  If neither the {@link
  android.app.Activity#onProvideAssistData(android.os.Bundle) onProvideAssistData()}
  nor the {@link
  android.app.Activity#onProvideAssistContent(android.app.assist.AssistContent)
  onProvideAssistContent()}
  callback is implemented, the system still proceeds and passes the
  automatically collected information to the assistant unless the current
  window is flagged as <a href="#excluding_views">secure</a>.
  As shown in Figure 3, the system uses the default implementations of {@link
  android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()}
  and {@link
  android.view.View#onProvideVirtualStructure(android.view.ViewStructure)
  onProvideVirtualStructure()} to
  collect text and view hierarchy information. If your view implements custom
  text drawing, override {@link
  android.view.View#onProvideStructure(android.view.ViewStructure) onProvideStructure()}
  to provide the assistant with the text shown to the user by calling {@link
  android.view.ViewStructure#setText(java.lang.CharSequence) setText(CharSequence)}.
</p>

<p>
  <em>In most cases, implementing accessibility support enables the
  assistant to obtain the information it needs.</em> To implement accessibility support,
  observe the best practices described in <a href=
  "{@docRoot}guide/topics/ui/accessibility/apps.html">Making Applications
  Accessible</a>, including the following:
</p>

<ul>
  <li>Provide {@link android.R.attr#contentDescription
  android:contentDescription} attributes.</li>
  <li>Populate {@link
  android.view.accessibility.AccessibilityNodeInfo} for custom views.</li>
  <li>Make sure that custom {@link android.view.ViewGroup ViewGroup} objects correctly
  <a href="{@docRoot}reference/android/view/ViewGroup.html#getChildAt(int)">expose</a>
  their children.</li>
</ul>

<h4 id="excluding_views">Excluding views from the assistant</h4>

<p>
  To handle sensitive information, your app can exclude the current view from the assistant
  by setting the {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} layout parameter of the {@link android.view.WindowManager}. You must set
  {@link android.view.WindowManager.LayoutParams#FLAG_SECURE FLAG_SECURE} explicitly for
  every window created by the activity, including dialogs. Your app can also use {@link
  android.view.SurfaceView#setSecure(boolean) setSecure()} to exclude a surface from the
  assistant. There is no global (app-level) mechanism to exclude all views from the
  assistant. Note that {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} does not cause the Assist API callbacks to stop firing.
</p>
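<p>
  The following sketch shows one way to set the flag in an activity before its
  view hierarchy is attached (the layout resource name is a placeholder):
</p>

<pre class="prettyprint">
&#64;Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // Exclude this window's contents from the assistant (and from
    // screenshots). Repeat for every window the activity creates,
    // including dialogs.
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE);
    setContentView(R.layout.activity_main);
}
</pre>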
<p>
  The activity that uses {@link android.view.WindowManager.LayoutParams#FLAG_SECURE
  FLAG_SECURE} can still explicitly
  provide information to the assistant using the callbacks described earlier in
  this guide.
</p>

<p class="note">
  <strong>Note: </strong>For enterprise accounts (Android for Work), the administrator
  can disable the collection of assistant data for the work profile by using the {@link
  android.app.admin.DevicePolicyManager#setScreenCaptureDisabled(android.content.ComponentName,
  boolean) setScreenCaptureDisabled()} method of the {@link
  android.app.admin.DevicePolicyManager} API.
</p>

<h4 id="voice_interactions">Voice interactions</h4>

<p>
  Assist API callbacks are also invoked upon
  <a href="{@docRoot}reference/android/service/voice/AlwaysOnHotwordDetector.html">keyphrase
  detection</a>. For more information, see the
  <a href="https://developers.google.com/voice-actions/" class="external-link">Voice
  Actions</a> documentation.
</p>

<h4 id="z-order_considerations">Z-order considerations</h4>

<p>
  The assistant uses a lightweight overlay window displayed on top of the
  current activity.
</p>
<p>
  Because the user can activate the assistant at any time,
  don't create permanent <a
  href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
  system alert</a> windows that interfere with the overlay window, as shown in Figure 4.
</p>

<div style="">
  <img src="{@docRoot}images/training/assistant/image04.png">
  <p class="img-caption" style="text-align:center;">
    Figure 4. Assist layer Z-order
  </p>
</div>

<p>
  If your app uses <a
  href="{@docRoot}reference/android/Manifest.permission.html#SYSTEM_ALERT_WINDOW">
  system alert</a> windows, remove them promptly because leaving them on the
  screen degrades the user experience.
</p>

<h3 id="destination_app">Destination app</h3>

<p>
  The assistant typically takes advantage of deep linking to find destination apps. To make
  your app a potential destination app, consider adding <a href=
  "{@docRoot}training/app-indexing/deep-linking.html">deep linking</a> support. The matching
  between the current user context and deep links or other potential actions displayed in the
  overlay window (shown in step 3 in Figure 1) is specific to the assistant's
  implementation. For example, the Google App uses deep linking and <a href=
  "https://developers.google.com/app-indexing/" class="external-link">Firebase App
  Indexing</a> in order to drive traffic to destination apps.
</p>

<h2 id="implementing_your_own_assistant">Implementing your own assistant</h2>

<p>
  You may wish to implement your own assistant. As shown in
  <a href="#assist-input-settings">Figure 2</a>, the user can select the active assistant
  app. The assistant app must provide an implementation of {@link
  android.service.voice.VoiceInteractionSessionService} and {@link
  android.service.voice.VoiceInteractionSession}, as shown in <a href=
  "https://android.googlesource.com/platform/frameworks/base/+/marshmallow-release/tests/VoiceInteraction/" class="external-link">
  this <code>VoiceInteraction</code> example</a>. It also requires the {@link
  android.Manifest.permission#BIND_VOICE_INTERACTION} permission. The assistant can then
  receive the text and view hierarchy represented as an instance of the {@link
  android.app.assist.AssistStructure} in {@link
  android.service.voice.VoiceInteractionSession#onHandleAssist(android.os.Bundle,
  android.app.assist.AssistStructure,android.app.assist.AssistContent) onHandleAssist()}.
  It receives the screenshot through {@link
  android.service.voice.VoiceInteractionSession#onHandleScreenshot(android.graphics.Bitmap)
  onHandleScreenshot()}.
</p>
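<p>
  A minimal sketch of the two required classes might look like the following
  (the <code>MySessionService</code> and <code>MySession</code> names are
  hypothetical; a real assistant must also declare the service in its manifest
  with the {@link android.Manifest.permission#BIND_VOICE_INTERACTION} permission):
</p>

<pre class="prettyprint">
public class MySessionService extends VoiceInteractionSessionService {
    &#64;Override
    public VoiceInteractionSession onNewSession(Bundle args) {
        // Create a new session each time the user invokes the assistant.
        return new MySession(this);
    }
}

public class MySession extends VoiceInteractionSession {
    public MySession(Context context) {
        super(context);
    }

    &#64;Override
    public void onHandleAssist(Bundle data, AssistStructure structure,
            AssistContent content) {
        // Inspect the source app's text and view hierarchy here.
    }

    &#64;Override
    public void onHandleScreenshot(Bitmap screenshot) {
        // Process the screenshot of the source activity here.
    }
}
</pre>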