Handling Touch Events

Most screen-enabled Alexa devices are actually touch screen devices. Echo Show devices, for example, are able to respond to touch gestures much like tablets. You can take advantage of this by handling touch events in your APL-enabled skill.

To handle touch events, wrap one or more components in your APL template with the TouchWrapper component. For example, suppose that the skill has a welcome APL template that is displayed when the user launches the skill. And suppose that in the welcome template, you have an Image component that renders the Star Port 75 logo. Before you add the TouchWrapper, that Image component might be defined like this:

 {
   "type": "Image",
   "width": "340dp",
   "height": "100%",
   "source": "https://starport75.dev/images/SP75_lg.png",
   "align": "center"
 }

Now, let’s suppose that you want to add touch capability to the welcome screen such that when the user touches the logo, a marketing sound bite about Star Port 75 is played. The first thing you’ll need to do is wrap the Image component with a TouchWrapper like this:

 {
   "type": "TouchWrapper",
   "item": {
     "type": "Image",
     "width": "340dp",
     "height": "100%",
     "source": "https://starport75.dev/images/SP75_lg.png",
     "align": "center"
   },
   "onPress": {
     "type": "SendEvent",
     "arguments": [
       "starport75Logo"
     ]
   }
 }

Here, the Image component we started with is now the value of the TouchWrapper component’s item property. It will still be rendered the same as before, but now it will also react to touch. How it reacts is defined by the onPress property.

The onPress property uses the SendEvent command to send an event to the skill’s fulfillment backend with a single argument, “starport75Logo”. We’ll use this argument in a handler’s canHandle() function to handle touch events for this specific component.
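When the user taps the logo, the SendEvent command results in a new request to your skill. Here’s a trimmed sketch of what that request envelope might look like; fields such as requestId, timestamp, and source are omitted, and the exact contents will vary by device and locale:

 {
   "request": {
     "type": "Alexa.Presentation.APL.UserEvent",
     "locale": "en-US",
     "arguments": [
       "starport75Logo"
     ]
   }
 }

Notice that the arguments array carries whatever values the SendEvent command listed, which is what lets one handler distinguish its TouchWrapper’s events from any others.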

A touch event handler is much like an intent request handler, in that it has both canHandle() and handle() functions. If canHandle() returns true, then handle() will handle the event. The key difference between a touch event handler and other kinds of handlers is in how the canHandle() evaluates the request type. Instead of looking for an IntentRequest, LaunchRequest, or some other kind of request we’ve seen already, a touch event handler checks for requests whose type is “Alexa.Presentation.APL.UserEvent”.

For example, look at the canHandle() function in the following touch event handler:

 const Alexa = require('ask-sdk-core');

 const LogoTouchHandler = {
   canHandle(handlerInput) {
     return Alexa.getRequestType(handlerInput.requestEnvelope)
             === 'Alexa.Presentation.APL.UserEvent'
         && Alexa.getRequest(handlerInput.requestEnvelope)
             .arguments.includes('starport75Logo');
   },
   handle(handlerInput) {
     return handlerInput.responseBuilder
       .speak(handlerInput.t('WELCOME_TOUCH'))
       .withShouldEndSession(false)
       .getResponse();
   }
 };

 module.exports = { LogoTouchHandler };

The canHandle() function checks for a request type of “Alexa.Presentation.APL.UserEvent”, indicating that this is a touch event from the user. It also checks that the request arguments contain “starport75Logo”. This ensures that this handler will only handle events from the TouchWrapper we created on the welcome screen and not any other touch events that may be defined elsewhere in our skill’s visual user interface.

As for the handle() function, it’s not much different from the handle() function of the request handlers we’ve already written. It simply speaks the message defined in languageStrings.js under the key “WELCOME_TOUCH” and leaves the session open so that the skill will continue listening for utterances about planetary travel.
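For this handler to ever be invoked, it must be registered with the skill builder along with the skill’s other request handlers. Assuming the handler is exported from a module named LogoTouchHandler.js (a file name chosen here for illustration), the registration in the skill’s entry point might look something like this:

 const Alexa = require('ask-sdk-core');
 const { LogoTouchHandler } = require('./LogoTouchHandler');

 exports.handler = Alexa.SkillBuilders.custom()
   .addRequestHandlers(
     LogoTouchHandler
     // ...along with the skill's existing launch and intent handlers
   )
   .lambda();

Keep in mind that registration order matters: the first handler whose canHandle() returns true wins, so a specific handler like this one should be registered ahead of any catch-all handlers.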

Speaking of the languageStrings.js module, here’s the new “WELCOME_TOUCH” entry to support the touch event handler:

 WELCOME_TOUCH:
   'Star Port 75 Travel, your source for ' +
   '<audio src="soundbank://soundlibrary/scifi/' +
   'amzn_sfx_scifi_small_zoom_flyby_01"/> ' +
   'out-of-this-world adventures.',
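In case it helps to see that entry in context, here’s a sketch of how it might sit in languageStrings.js, assuming the module follows the i18next-style layout used by many ASK SDK samples, with locale keys wrapping a translation object (other entries elided):

 module.exports = {
   en: {
     translation: {
       WELCOME_TOUCH:
         'Star Port 75 Travel, your source for ' +
         '<audio src="soundbank://soundlibrary/scifi/' +
         'amzn_sfx_scifi_small_zoom_flyby_01"/> ' +
         'out-of-this-world adventures.'
       // ...the skill's other message entries
     }
   }
 };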

Touch events give users another way to interact with your skill alongside voice. But remember that Alexa skills are voice-first user interfaces. It’s important that you design the visual interface of your skill to complement the voice interface, not replace it.
