
Interactive Canvas – Build Visual Immersive Games for Google Assistant

Sachin Kumar
April 16, 2023
12 MIN READ


Learn how to build a sample Action/App for the Google Assistant using Interactive Canvas for an immersive and interactive game experience.

In this tutorial, we will learn how to build a sample Action/App for the Google Assistant using Interactive Canvas for an immersive and interactive game experience.

Update: Some of the features in this tutorial have since been deprecated by Google.

Introducing Interactive Canvas

Interactive Canvas is a framework built on the Google Assistant that allows developers to add a visual, immersive experience to conversational Actions.

When to use Interactive Canvas in your Actions

  • Create full-screen visuals
  • Create custom animations and transitions
  • Do data visualization
  • Create custom layouts and GUIs
  • Implement video playback (videos are not yet fully supported, but may still play in Interactive Canvas)

Note: At this time (July 2019), Google is only approving Actions that are gaming experiences.

Build Immersive Games for Google Assistant using Interactive Canvas

How it Works


Interactive Canvas connects your conversational Action to an interactive web app so that your users can interact with your visual user interface through voice or touch.

Four Components to an Action that uses Interactive Canvas

  • Custom Conversational Action: An Action that uses a conversational interface to fulfill user requests. Actions that use Interactive Canvas operate in the same fundamental way as any conversational Action, but use immersive web views (HtmlResponse) to render responses instead of rich cards or simple text and voice responses.
  • Web app: A front-end web app with customized visuals that your Action sends as a response to users during a conversation. You build the web app with web standards like HTML, JavaScript, and CSS. The interactiveCanvas API lets your web app communicate with your conversational Action.
  • interactiveCanvas: The JavaScript API that you include in the web app to enable communication between the web app and your conversational Action.
  • HtmlResponse: A response that contains the URL of the web app and the data to pass to it (see the sketch after this list).
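
To make these pieces concrete, here is a minimal sketch of a fulfillment handler sending an HtmlResponse; the hosting URL is a placeholder and the intent name and data values are purely illustrative.

// A minimal sketch of fulfillment sending an HtmlResponse (names and values are illustrative).
const {dialogflow, HtmlResponse} = require('actions-on-google');
const app = dialogflow();

app.intent('Default Welcome Intent', (conv) => {
  conv.ask('Welcome! What color should the triangle be?');
  conv.ask(new HtmlResponse({
    url: 'https://your-project-id.firebaseapp.com', // placeholder for your hosted web app
    data: {
      tint: 0x00FF00, // initial state consumed by the web app's onUpdate callback
    },
  }));
});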

Getting Started with Interactive Canvas

Now that we understand the fundamentals behind the Interactive Canvas framework, let's start building one.

Create a New Project

  • Go to the Actions on Google Console and create a new project.
Interactive Canvas – Build Immersive Games for Google Assistant – Actions on Google Console
  • Click on Games & fun card.
Interactive Canvas – Build Immersive Games for Google Assistant – Games and fun
  • After that click on the Conversational card.
Interactive Canvas - Conversational Card
  • At this point, we will be taken to an Overview screen as shown below. We will come back to this later.
Interactive Canvas – Actions on Google Console
  • Click on Deploy at the top, then scroll down to the bottom of the Directory information page, select the checkbox for Interactive Canvas, and save.
  • On the top navigation, click on Overview again and, under Build your Action, click on Add Action(s).
  • Click Add your first action.
Add Action – Interactive Canvas – Build Immersive Games for Google Assistant
  • After that under Built-in intents, select Play game and click Get started in Dialogflow.

Configure Dialogflow Agent

The above action will open up the Dialogflow console, where you will enter your agent name and click on the Create button.

  • On the Intents page, click on Default Welcome Intent.
  • At the bottom of the page, click Fulfillment to expand the section.
  • Click Enable webhook call for this intent.
  • These are just the basic configuration steps if you want to build the agent yourself. However, for the purpose of the demo, we can import an agent that already has the intents and webhook settings configured. So, go to the next section to import an agent into Dialogflow.
  • If you are new to Dialogflow, I encourage you to check out this tutorial.
Dialogflow welcome intent

Import Agent in Dialogflow

  • Click the settings gear icon ⚙ > Export and Import > Restore from zip, using the agent.zip in this sample’s directory.
  • This performs all the steps we did before and adds the extra intents used by the sample Action we will run.
Import Agent in Dialogflow
RESTORE Agent in Dialogflow

Fulfillment

Dialogflow provides an option called Fulfillment where you link your webhook. Fulfillment tells your Action what to do when user requests are made. With Interactive Canvas, your fulfillment also tells the Assistant to render the web app you created and provides updates to data, which your web app custom logic uses to make changes to your web app.
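
Concretely, the webhook here is a Cloud Function for Firebase that hosts the conversational logic. Below is a minimal skeleton of what that function might look like; it is a sketch, and dialogflowFirebaseFulfillment is simply the conventional export name for a Dialogflow webhook.

// A minimal webhook skeleton (a sketch of, e.g., index.js in the functions directory).
const functions = require('firebase-functions');
const {dialogflow} = require('actions-on-google');

const app = dialogflow({debug: true});

// Intent handlers (welcome, color, pause, and so on) get registered on `app` here.

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);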

From the left navigation, click on Fulfillment and then enable Webhook.

Set up the webhook.

Note: If you run into permission issues, run the Firebase commands with sudo.

  • Firebase Deployment: On your local machine, go to the functions directory and run the following commands from the terminal.

Furthermore, make sure to replace {PROJECT_ID} with yours. You can find it in the Dialogflow console under Settings > General tab > Project ID.

npm install
npm install actions-on-google@preview
firebase deploy --project {PROJECT_ID}

Also, make a note of the Function URL and copy it. We will need it in the next step.

  • Now, return to the Dialogflow console, select Fulfillment > Enable Webhook > set URL to the Function URL returned by the deploy command > SAVE.
Interactive Canvas - Fulfillment and Webhook - Dialogflow

Test the Action using the Actions on Google Simulator

Select Integrations > Integration Settings under Google Assistant > Enable Auto-preview changes > Test to open the Actions on Google simulator, then say or type Talk to my test app.

Interactive Canvas - Google Assistant - Integrations

Finally, this opens up the Actions on Google simulator where you can test the app. So, go ahead and initiate the Action using “Talk to my test app”. Try changing the color of the spinning triangle or pausing the spin.

Check your Google Permission Settings

To test the app, we will need to enable the necessary permissions. So, go to the Activity Controls page and sign in with your Google credentials if prompted.

Ensure that the following permissions are enabled by sliding the toggle and selecting TURN ON for each of these cards:

  • Web & App Activity
  • Device Information
  • Voice & Audio Activity

Lastly, close the Activity Controls page and return to the Actions on Google Console to test the app.

Interactive Canvas – Google Assistant – Demo

By default, we will notice some logs shown in the top right. This debug overlay is there to help us understand the callbacks being made. We can hide it by uncommenting the code as shown below.

Uncomment the code below the comment /* Uncomment below to disable the debug overlay */ in public/css/main.css.

Interactive Canvas - css - debug

Once we make the changes, deploy to Firebase again as before. Once it is deployed, test the app again in the simulator. We can also test the Action on a Smart Display, as shown below. Notice that the debug overlay is no longer showing.

Interactive Canvas Demo

Congrats! You have successfully run your Action for Google Assistant using Interactive Canvas.

How does it all work? Let’s Understand

Architecture:

It is strongly recommended to use a single-page application architecture. This allows for optimal performance and supports continuous conversational UX.

HTML

The HTML file defines how your UI looks. It also loads the JavaScript for your web app, which facilitates communication between your Action and Interactive Canvas. The sample's index.html looks roughly like the outline below (the PixiJS version shown is illustrative).

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <title>Immersive Canvas Sample</title>
    <!-- Load the Interactive Canvas JavaScript API -->
    <script src="https://www.gstatic.com/assistant/interactivecanvas/api/interactive_canvas.min.js"></script>
    <!-- Load PixiJS, which renders the spinning triangle -->
    <script src="https://cdnjs.cloudflare.com/ajax/libs/pixi.js/4.8.7/pixi.min.js"></script>
    <!-- Custom styles, including the debug overlay rules -->
    <link rel="stylesheet" href="css/main.css">
  </head>
  <body>
    <!-- Container that main.js renders into -->
    <div id="view" class="view"></div>
    <!-- Web app custom logic -->
    <script src="js/main.js"></script>
  </body>
</html>

Web app custom logic and Communication

Firstly, the communication between the web app and the fulfillment happens through the Interactive Canvas API. Callbacks provide a way for you to respond to information or requests from the conversational Action, while the methods provide a way to send information or requests to the intent fulfillment.

Caution: The interactiveCanvas APIs are attached to the window object. If you're using a front-end framework to develop your web app, you'll need access to window to set up the API.

Call interactiveCanvas.ready(callbacks); in your web app (main.js below) to initialize and register callbacks.

// main.js
const view = document.getElementById('view');

// initialize rendering and set correct sizing
const renderer = PIXI.autoDetectRenderer({
  antialias: true,
  width: view.clientWidth,
  height: view.clientHeight,
});
view.appendChild(renderer.view);

// center stage and normalize scaling for all resolutions
const stage = new PIXI.Container();
stage.position.set(view.clientWidth / 2, view.clientHeight / 2);
stage.scale.set(Math.max(renderer.width, renderer.height) / 1024);

// load a sprite from a svg file
const sprite = PIXI.Sprite.from('triangle.svg');
sprite.anchor.set(0.5);
sprite.tint = 0x00FF00; // green
stage.addChild(sprite);

let spin = true;
// register interactive canvas callbacks
const callbacks = {
  onUpdate(data) {
    console.log('onUpdate', JSON.stringify(data));
    if ('tint' in data) {
      sprite.tint = data.tint;
    }
    if ('spin' in data) {
      spin = data.spin;
    }
  },
};
interactiveCanvas.ready(callbacks);

// toggle spin on tap of the triangle
sprite.interactive = true;
sprite.buttonMode = true;
sprite.on('pointerdown', () => {
  spin = !spin;
});

// code to be run on each frame
let last = performance.now();
const frame = () => {
  // calculate time differences for smooth animations
  const now = performance.now();
  const delta = now - last;

  // rotate the triangle only if spin is true
  if (spin) {
    sprite.rotation += delta / 1000;
  }

  last = now;

  renderer.render(stage);
  requestAnimationFrame(frame);
};
frame();
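
The callbacks above cover what the web app receives. To send something back to the conversational Action, the interactiveCanvas API also exposes methods such as sendTextQuery(). As a small sketch that builds on the code above, the tap handler could trigger the Action's pause intent instead of only toggling the local spin flag (the query text 'pause' is assumed to match the imported agent):

// A sketch only: send a text query to the Action when the triangle is tapped.
// In the sample above, the tap handler just toggles the local spin flag.
sprite.on('pointerdown', () => {
  interactiveCanvas.sendTextQuery('pause')
    .then((state) => {
      // Resolves to a string such as 'SUCCESS' or 'BLOCKED'
      console.log('sendTextQuery state:', state);
    });
});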

HTML Responses

The method for communicating your web app URL and state is to send an HtmlResponse through your webhook. So, when you send an HtmlResponse, the following steps are executed:

  • The fulfillment of the matched intent sends an HtmlResponse to the device.
  • The device uses the URL in the HtmlResponse to load the web app.
  • The data JSON payload is passed to the web app in a callback.
  • Your conversational Action sends a new HtmlResponse to send updates or load new states.
    Note: If the value for the url field changes in subsequent fulfillment responses, the web app reloads. If the same value for url is provided or if url is excluded as part of a subsequent response, the existing web app is kept on the device.

An HtmlResponse is included in your intent-specific fulfillment and updates the state of variables within your web app’s custom logic.

So, in the sample app we just ran, the fulfillment has handlers for intents such as pause and the color changes, and each handler sends an HtmlResponse that updates the value of spin or tint.

...
app.intent('pause', (conv) => {
  conv.ask(`Ok, I paused spinning. What else?`);
  conv.ask(new HtmlResponse({
    data: {
      spin: false,
    },
  }));
});
...
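
As a complementary sketch, a color-change handler would send a new tint value in the same way; the intent name color, its parameter, and the tint mapping below are assumptions about the imported agent rather than its exact implementation.

// A sketch of a color-change handler; the intent name, parameter, and
// tint mapping are assumed rather than taken from the sample verbatim.
app.intent('color', (conv, {color}) => {
  const tints = {red: 0xFF0000, green: 0x00FF00, blue: 0x0000FF};
  conv.ask(`Ok, I changed the color to ${color}. What else?`);
  conv.ask(new HtmlResponse({
    data: {
      tint: tints[color] || 0xFFFFFF, // fall back to white for unknown colors
    },
  }));
});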

Restrictions

Take the following restrictions into consideration as you develop your web app:

  • No cookies, local storage, geolocation, camera usage, popups
  • Origin is set to null for AJAX
  • Stay under the 200 MB memory limit
  • 3P Header takes up upper portion of screen
  • No styles can be applied to videos
  • Only one media element may be used at a time
  • No HLS video
  • Assets must accept requests from null origins

What’s Next

Now that we have a basic understanding of Interactive Canvas and how to use the framework to build interactive games for the Google Assistant, we can start customizing and building meaningful, fun games of our own. So do watch out for the upcoming tutorials on how we can build more.

Conclusion:

To conclude, Interactive Canvas is an amazing framework built on the Google Assistant that allows us to add visual, immersive experiences to conversational Actions. The framework opens the door to many meaningful, immersive, and interactive conversational experiences.

Furthermore, check out some of my other tutorials.

More Resources and Content