Last Updated: 4/11/2022, 6:25:12 AM

# Telemetry

# Overview

The AccelByte Game Telemetry service is a scalable event data ingestion pipeline for your game that lets you send events into a designated streaming pipeline through an HTTP-based REST API. With this service, you can collect and analyze player data and use it to inform future design decisions. We also maintain a list of AccelByte Telemetry Events, grouped by service name, that you can use as a reference.

# How it Works

The diagram below gives a basic overview of how the Game Telemetry service works:

telemetry

The Game Telemetry SDK sends events to be tracked into the Game Telemetry pipeline. After that, the Game Telemetry pipeline sends events to Kafka. Kafka Connect then pulls the events from Kafka and passes them to the designated data warehouse and data visualization tools.

# Set Up the Data Warehouse

To be able to receive events and the player actions log, you have to set up a data warehouse. The Game Telemetry service supports several different data warehouses by default, and AccelByte can work with you to integrate a data warehouse that we don’t yet support.

# Redshift

We’ll create a connector for you in Kafka Connect to pull event data from Kafka and write the data to Redshift. Nested data is flattened in the Game Telemetry pipeline before being pushed into Kafka, and each payload attached to an event is converted into Redshift columns. You can see the payload for each event in the list of AccelByte Telemetry Events.

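Flattening happens inside the Game Telemetry pipeline itself, but the idea can be illustrated with a small sketch. The snippet below is not the pipeline’s actual code; the underscore separator and the sample payload are assumptions used only to show how nested JSON keys end up as flat column names.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// flatten walks a nested event payload and joins nested keys into a single
// column name so that every leaf value can map to its own Redshift column.
// The underscore separator is an assumption made for illustration.
func flatten(prefix string, in map[string]interface{}, out map[string]interface{}) {
	for key, value := range in {
		column := key
		if prefix != "" {
			column = prefix + "_" + key
		}
		if nested, ok := value.(map[string]interface{}); ok {
			flatten(column, nested, out)
			continue
		}
		out[column] = value
	}
}

func main() {
	raw := []byte(`{"player_id":"abc","position":{"x":10.5,"y":3.2},"weapon":"bow"}`)

	var payload map[string]interface{}
	if err := json.Unmarshal(raw, &payload); err != nil {
		panic(err)
	}

	columns := map[string]interface{}{}
	flatten("", payload, columns)

	// Prints column/value pairs such as position_x=10.5 and position_y=3.2.
	for column, value := range columns {
		fmt.Printf("%s=%v\n", column, value)
	}
}
```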
# Integration Requirements

AccelByte can integrate the Game Telemetry pipeline with Redshift for you. To do so, you’ll need to send us the following information:

  • Redshift credentials
  • Redshift database name
  • Redshift table name

After we input that information into Kafka Connect, the connector will be created and the pipeline will automatically send Game Telemetry data to Redshift.

# S3

# Integration Requirements

AccelByte can integrate the Game Telemetry pipeline with S3 for you. To do so, you’ll need to send us the following information:

  • AWS Credentials
  • Bucket name

After we input that information into Kafka Connect, the pipeline will automatically send Game Telemetry data to your S3 bucket.

# Snowflake

If you use Snowflake as your data warehouse, you can set it up in the Admin Portal by following the steps below.

IMPORTANT

To connect Snowflake to AccelByte Cloud, first complete the required prerequisites in the Snowflake application.

# Connect Snowflake to AccelByte Cloud

  1. In the publisher namespace of the Admin Portal, go to the 3rd Party Integrations section and open the Analytics & Data Monitoring menu.

    telemetry

  2. In the Analytics & Data Monitoring menu, click Configure Now.

    telemetry

  3. The Add Snowflake Configuration form appears. Fill in the fields with the following information:

    telemetry

    • Enter the Snowflake username of the user that will be responsible for this data in the Snowflake Username field.

    • Enter your Snowflake account ID in the Account ID field.

      NOTE

The Snowflake Account ID is the prefix before snowflakecomputing.com in your Snowflake account URL. For example, if your Snowflake URL is rsa42644.us-east-1.snowflakecomputing.com, the account ID is rsa42644.us-east-1.

    • Enter the destination database’s name in the Database Name field.

    • Enter the schema of the database used for the integration in the Schema field.

    • Choose how you want to store the data in the Model field. There are two model options:

      • Single: This model will store the data in one table for all AccelByte system events. If you use the Telemetry service, two tables will be created (one for Telemetry events and one for AccelByte system events).
      • Mapping: This model will store the data by mapping each event's topic name into a table. One table will be created for each topic name. Snowflake stages and pipes will also be created as tables.

    When you’re done, click Continue.

  4. The Connect to Snowflake form will appear along with a Public Key. Copy the Public Key and paste it into your Snowflake Worksheet to set up the connection.

    telemetry

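    In Snowflake, a public key is typically registered against a user with a statement such as `ALTER USER <snowflake_username> SET RSA_PUBLIC_KEY='<public key>';`, run from a Worksheet by a role that is allowed to modify that user. See Snowflake’s key-pair authentication documentation for the exact syntax.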
    NOTE

    This step must be completed before you can proceed to the following steps.

  5. Click the Connect button. You will be redirected to the Analytics & Data Monitoring page. Your configuration may still be in progress. If so, you can refresh the page to update the status of your configuration.

    telemetry

  6. Once your configuration is successful, the Integration Status field will change to Active and the Connection Status field will change to Connected.

    telemetry

    If your configuration fails, the Integration Status field will change to Deactivated and the Connect button will appear in the Connection Status field. You can click the Connect button to try reconnecting, click the Edit button to edit the configuration, or click Remove to remove the configuration.

    NOTE

    If your configuration fails, make sure all the prerequisites on the Snowflake side have been satisfied. Check the IMPORTANT box at the top of Snowflake’s documentation to learn more.

    telemetry

  7. If your Snowpipe is missing data after you change the Snowflake Credentials, or data has disappeared due to a connection issue, you can use Backfill to run past scheduled pipelines and fill in the missing data.

    telemetry

  8. The backfilling process will run in the background. Refresh the page to see the latest configuration status.

    telemetry

    Once backfilling has been completed successfully, a confirmation will appear on the configuration page.

    TIP

    If the backfilling fails, recheck your connection and ensure that your Snowflake configuration is still correct.

  9. Once the integration is successful, if you want to regularly update the RSA Key to ensure connection security, you can generate a new RSA Key by clicking Generate New RSA Key next to Connection Status on the configuration page.

    telemetry

    After you generate a new key, the Snowflake integration will be disconnected and you will need to reconnect Snowflake.

# Edit Snowflake Configuration

  1. In the publisher namespace of the Admin Portal, go to the 3rd Party Integrations section and open the Analytics & Data Monitoring menu.

    telemetry

  2. In the Analytics & Data Monitoring menu, click the Edit Configuration button.

    telemetry

  3. The configuration must be disconnected before it can be edited, so the Disconnect Snowflake form appears. Click Continue to disconnect the configuration and proceed to the editing form.

    telemetry

  4. The Edit Snowflake Configuration form appears. Fill in the following information:

    • Enter the Snowflake username of the user that will be responsible for this data in the Snowflake Username field.

    • Enter your Snowflake account ID in the Snowflake Account ID field.

      NOTE

      The Snowflake Account ID is the prefix before snowflakecomputing.com in your Snowflake account URL. For example, if your Snowflake URL is rsa42644.us-east-1.snowflakecomputing.com, the account ID is rsa42644.us-east-1.

    • Enter the destination database’s name in the Database Name field.
    • Enter the schema of the database used for the integration in the Schema field.
    • Choose how you want to store the data in the Model field. Choose Single if you want to store all of the data in a single table. Choose Mapping if you want to store the data in a different table for each topic.

    When you’re done, click Continue.

    telemetry

  5. The Connect to Snowflake form appears along with a public key. Copy the provided public key and paste it into your Snowflake configuration script.

    telemetry

    IMPORTANT

    You must complete this step before moving on.

  6. After setting up the public key, click the Connect button. You’ll be redirected to the Analytics & Data Monitoring page. Your configuration may still be in progress; you can refresh the page to see the latest configuration status.

    telemetry

  7. Once your configuration is successful, the Integration Status field will change to Active and the Connection Status field will change to Connected.

    telemetry

    If your configuration fails, the Integration Status field will change to Deactivated and the Connect button will appear in the Connection Status field. You can click the Connect button to try reconnecting, click the Edit button to edit the configuration, or click Remove to remove the configuration.

    NOTE

    If your configuration fails, make sure the Snowflake user ID you entered earlier has the ability to create a table in the provided database.

    telemetry

# Send Event

With our Game Telemetry service, you can track and log players' actions within the game by sending events to the game’s streaming pipeline. Tracking these actions requires a player session, so you must enable your players to create an account in your game before you can use this feature.

# Preparation Before Sending Events

By default, telemetry events are sent to a game’s streaming pipeline as a batch after a set amount of time. This interval should be configured before telemetry events are sent. Any events that need to be sent immediately should also be added to the immediate event list. Here is an example of how to configure your telemetry events.

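Because the exact configuration calls depend on which SDK you use, the Go-style sketch below uses hypothetical names; it only illustrates the two settings involved: a batch interval after which buffered events are flushed, and a list of event names that bypass the batch and are sent immediately.

```go
package telemetry

import "time"

// TelemetryConfig mirrors the two settings described above. The field and
// type names are hypothetical; the real SDK exposes its own configuration API.
type TelemetryConfig struct {
	// BatchInterval is how long events are buffered before the batch is sent.
	BatchInterval time.Duration
	// ImmediateEvents lists event names that skip the buffer and are sent at once.
	ImmediateEvents []string
}

// DefaultConfig is an example configuration: flush the batch every minute,
// but always send login/logout events immediately.
var DefaultConfig = TelemetryConfig{
	BatchInterval:   time.Minute,
	ImmediateEvents: []string{"player_logged_in", "player_logged_out"},
}

// IsImmediate reports whether an event name should bypass batching.
func (c TelemetryConfig) IsImmediate(eventName string) bool {
	for _, name := range c.ImmediateEvents {
		if name == eventName {
			return true
		}
	}
	return false
}
```

With a configuration like this, routine gameplay events are grouped into one request per interval, while the listed events are delivered as soon as they occur.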
# Send Event using SDK

The following function signature can also be used to send an event. The event data will be displayed in your data warehouse, where you’ll be able to customize its visualization.

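The exact signature differs per SDK, but it generally takes the event namespace, the event name, an arbitrary payload, and success/error callbacks. The Go-style declaration below is a hypothetical stand-in shown for illustration, not the SDK’s actual API.

```go
package telemetry

// SendTelemetryEvent is a hypothetical signature shown for illustration only;
// consult the SDK reference for the real function name and parameters.
func SendTelemetryEvent(
	eventNamespace string, // game namespace the event belongs to
	eventName string, // e.g. "player_killed"
	payload map[string]interface{}, // arbitrary JSON-serializable event data
	onSuccess func(), // invoked when the event is accepted by the pipeline
	onError func(err error), // invoked when sending fails
) {
	// The real implementation is provided by the Game Telemetry SDK.
}
```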
# Send Event using API

To send an event to the designated streaming pipeline, follow these steps.

  1. Use the Protected Save Events endpoint: POST /game-telemetry/v1/protected/events.

  2. Fill in the Request Body.

    • Input the EventNamespace with the game namespace.
    • Input the EventName with the name of the event, e.g. player_killed or mission_accomplished.
    • Input the Payload with an arbitrary JSON object describing the event. Here are some examples of the payload for different use cases:
      • If you want to track player login and logout, you can use the following reference for your request body.
      • If you want to track when matchmaking starts and ends, you can use the following reference for your request body.
      • If you want to detect a player’s active status on the server, you can use the following reference for your request body.
    • The Event Timestamp will be generated automatically by the Game Telemetry service in ISO 8601 format, e.g. 2021-01-05T05:04:08.775Z.
    • The Event ID will be generated automatically by the Game Telemetry service.

    Upon a successful request, the event will be sent to the designated streaming pipeline for tracking. A request sketch based on these fields is shown below.

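As a rough end-to-end illustration of the request described above, the sketch below posts one event to the endpoint. The base URL, environment variable names, field casing, and the use of an event array are assumptions; check the API reference for the exact schema, and obtain the access token from IAM beforehand.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

// TelemetryEvent mirrors the request body fields described above. Field casing
// and whether the endpoint expects a single object or an array may differ per
// API version; check the API reference for the exact schema.
type TelemetryEvent struct {
	EventNamespace string                 `json:"EventNamespace"`
	EventName      string                 `json:"EventName"`
	Payload        map[string]interface{} `json:"Payload"`
}

func main() {
	baseURL := os.Getenv("AB_BASE_URL")         // e.g. https://<your-environment>
	accessToken := os.Getenv("AB_ACCESS_TOKEN") // token obtained from IAM

	events := []TelemetryEvent{{
		EventNamespace: "mygame",
		EventName:      "player_killed",
		Payload:        map[string]interface{}{"killer": "player-1", "victim": "player-2"},
	}}

	body, err := json.Marshal(events)
	if err != nil {
		panic(err)
	}

	req, err := http.NewRequest(http.MethodPost, baseURL+"/game-telemetry/v1/protected/events", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "Bearer "+accessToken)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// A 2xx status indicates the event was accepted into the streaming pipeline.
	fmt.Println("status:", resp.Status)
}
```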
# Connecting Custom Services to Game Telemetry using the Server SDK

# SDK Initialization

Before using the Game Telemetry service from the SDK, you will need to initialize your server-side SDK to ensure that you are authorized and able to perform create, read, update, and delete actions.

# Golang SDK Initialization

Before using the Game Telemetry service from the Golang SDK, you will need to initialize the SDK by following the steps below:

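The initialization steps themselves come from the Golang Server SDK’s documentation. As a rough sketch of what they accomplish, the server authenticates against IAM with the OAuth2 client-credentials grant and keeps the resulting access token for later Game Telemetry calls; the endpoint path and environment variable names below are assumptions shown for illustration.

```go
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
	"os"
	"strings"
)

// tokenResponse holds the part of the IAM token response used here.
type tokenResponse struct {
	AccessToken string `json:"access_token"`
	ExpiresIn   int    `json:"expires_in"`
}

func main() {
	baseURL := os.Getenv("AB_BASE_URL")   // e.g. https://<your-environment>
	clientID := os.Getenv("AB_CLIENT_ID") // IAM client with Game Telemetry permissions
	clientSecret := os.Getenv("AB_CLIENT_SECRET")

	form := url.Values{}
	form.Set("grant_type", "client_credentials")

	req, err := http.NewRequest(http.MethodPost, baseURL+"/iam/v3/oauth/token", strings.NewReader(form.Encode()))
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth(clientID, clientSecret)
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var token tokenResponse
	if err := json.NewDecoder(resp.Body).Decode(&token); err != nil {
		panic(err)
	}

	// The access token is attached as a Bearer token on subsequent
	// Game Telemetry requests; the SDK handles this bookkeeping for you.
	fmt.Println("token expires in", token.ExpiresIn, "seconds")
}
```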
Once completed, you can use the Golang SDK to create, read, update, or delete Game Telemetry data from your serverless app.

# Send Events

Use the following code to send events into designated streaming pipelines:

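The real call comes from the Golang Server SDK; the snippet below is only a stand-in with hypothetical type and method names, showing the shape of the call: an authenticated client submits one or more events into the pipeline.

```go
package telemetry

// The names below are hypothetical stand-ins; the real Golang Server SDK ships
// its own game telemetry service wrapper and request types (see the SDK docs).

// TelemetryEvent mirrors the request body fields described in the API section.
type TelemetryEvent struct {
	EventNamespace string
	EventName      string
	Payload        map[string]interface{}
}

// EventSender represents whichever authenticated client the SDK gives you
// after initialization.
type EventSender interface {
	// ProtectedSaveEvents corresponds to the "Protected Save Events" endpoint;
	// the method name here is an assumption.
	ProtectedSaveEvents(events []TelemetryEvent) error
}

// sendGameplayEvents submits a batch of gameplay events into the pipeline.
func sendGameplayEvents(client EventSender) error {
	return client.ProtectedSaveEvents([]TelemetryEvent{{
		EventNamespace: "mygame",
		EventName:      "mission_accomplished",
		Payload: map[string]interface{}{
			"mission_id":       "m-001",
			"duration_seconds": 732,
		},
	}})
}
```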
# Retrieve a Player’s Total Playtime

Use the following function to retrieve a player's total playtime in Steam for a specific game. This endpoint will also store the player's total playtime in the service cache. Set the player's Steam account to public so the Game Telemetry service can retrieve their total playtime data.

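The concrete function and route come from the SDK/API reference; the sketch below only shows the shape of the call with hypothetical names, assuming you already have an authenticated client and the player's Steam ID.

```go
package telemetry

// PlaytimeClient is a hypothetical stand-in for the authenticated Game
// Telemetry client created during SDK initialization.
type PlaytimeClient interface {
	// GetTotalPlaytime is a hypothetical method name; the real call returns the
	// player's total Steam playtime for the game and caches it in the service.
	GetTotalPlaytime(steamID string) (hours float64, err error)
}

// lookupPlaytime fetches a player's total playtime via the client.
func lookupPlaytime(client PlaytimeClient, steamID string) (float64, error) {
	hours, err := client.GetTotalPlaytime(steamID)
	if err != nil {
		return 0, err // e.g. the Steam profile is private or the ID is unknown
	}
	return hours, nil
}
```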
# Update a Player’s Total Playtime

Use the following function to update a player's total playtime in Steam for a specific game in the service cache.
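
As with retrieval, the real function name comes from the SDK reference. This hypothetical sketch only shows that the call writes a playtime value for a given player and Steam ID back into the service cache.

```go
package telemetry

// PlaytimeUpdater is a hypothetical stand-in for the authenticated client.
type PlaytimeUpdater interface {
	// UpdateTotalPlaytime is a hypothetical method name; the real call stores
	// the player's latest total playtime in the Game Telemetry service cache.
	UpdateTotalPlaytime(userID, steamID string, totalPlaytimeHours float64) error
}

// refreshPlaytime pushes a freshly retrieved playtime value into the cache.
func refreshPlaytime(client PlaytimeUpdater, userID, steamID string, hours float64) error {
	return client.UpdateTotalPlaytime(userID, steamID, hours)
}
```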