Analytics Data Connector

Last updated on August 9, 2024
Note

Data warehouse integrations are not available on the AGS Shared Cloud tier.

Overview

The AccelByte Gaming Services (AGS) Analytics Data Connector sends service game telemetry and custom game telemetry events to your own data warehouse. The Data Connector supports several destination solutions, such as Amazon S3, Amazon Redshift, and Snowflake. This article describes the AGS Analytics Data Connector in detail.

Activate Data Connector

To activate the Data Connector and begin ingesting data into your in-house analytics solution, contact your AccelByte representative.

Note

The AGS Analytics Data Connector has limitations compared to the AIS Data Connector, which is a flexible, self-service solution. For a comparison of the two Data Connectors, refer to Data Connector Comparison.

S3

The AGS Analytics Data Connector supports outputting directly to Amazon S3 buckets. The data is structured into text files containing JSON representations of each event, which can then be processed by your in-house analytics systems.
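
As an illustration of consuming these files, the minimal sketch below parses one downloaded file, assuming each line holds one JSON-encoded event (newline-delimited JSON); if your files instead contain a single JSON document, use json.load on the file object.

```python
import json

def read_events(path: str) -> list[dict]:
    """Parse one event file, assuming one JSON event per line."""
    events = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines
                events.append(json.loads(line))
    return events

events = read_events("yourNamespace.dev.matchmaking+0+0000000001.json")
print(f"parsed {len(events)} events")
```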

S3 integration requirements

AccelByte can integrate the game telemetry pipeline with S3 for you. For us to do so, you'll need to send us the S3 bucket name and the region that you want events written to. AccelByte will then provide an IAM policy (an S3 policy script) representing your AccelByte connector. Once you attach this policy to your S3 bucket, the configuration is complete.
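
AccelByte supplies the actual policy and principal, so the following is only a rough sketch of what attaching such a policy with boto3 might look like; the role ARN and the statement contents are placeholders, not the real connector policy.

```python
import json
import boto3

# Placeholders -- AccelByte provides the actual policy and principal.
BUCKET = "your-bucket-name"
CONNECTOR_ROLE_ARN = "arn:aws:iam::123456789012:role/accelbyte-connector"

# A minimal write-only statement; the real policy from AccelByte may differ.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccelByteConnectorWrites",
            "Effect": "Allow",
            "Principal": {"AWS": CONNECTOR_ROLE_ARN},
            "Action": ["s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```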

S3 key concepts

To make your integration seamless, you should understand the following key concepts.

Flush configuration

You can configure flushing behavior when you request AGS Data Connector activation. Two settings control when buffered events are written out (a sketch of the semantics follows the list):

  1. Flush Interval: The time interval, in minutes, at which data is periodically streamed to your data warehouse.
  2. Flush Size: The maximum number of events per file streamed into your data warehouse.
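
The illustration below assumes the connector writes a file whenever either threshold is reached first, which is typical of Kafka Connect-style S3 sinks; it is an illustrative sketch, not the connector's actual code.

```python
import time

class EventBuffer:
    """Flushes when EITHER the flush size is reached OR the flush
    interval elapses -- whichever comes first (assumed semantics)."""

    def __init__(self, flush_size: int, flush_interval_minutes: float):
        self.flush_size = flush_size
        self.flush_interval = flush_interval_minutes * 60  # seconds
        self.events: list[dict] = []
        self.last_flush = time.monotonic()

    def add(self, event: dict) -> None:
        self.events.append(event)
        elapsed = time.monotonic() - self.last_flush
        if len(self.events) >= self.flush_size or elapsed >= self.flush_interval:
            self.flush()

    def flush(self) -> None:
        if self.events:
            print(f"writing {len(self.events)} events to one warehouse file")
            self.events.clear()
        self.last_flush = time.monotonic()
```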

S3 file and path format

The S3 file and path format in the AGS Data Connector cannot be customized. If you need file and path format customization, see the AIS Data Connector.

  1. JSON Formatted Files: the files that AccelByte sends to you will be in .json format.
  2. Default S3 Paths: events are written using the following S3 path formats:

Service Telemetry Path

your-bucket-name/justice-event/realm/dev/topics/yourNamespace.dev.matchmaking/year=2024/month=07/day=03/hour=10/yourNamespace.dev.matchmaking+0+0000000001.json

Custom Telemetry Path

your-bucket-name/game-telemetry/realm/dev/topics/analytics_game_telemetry.dev.yourNamespace.yourEventName/year=2024/month=07/day=17/hour=07/minute=00/analytics_game_telemetry.dev.yourNamespace.yourEventName+0+0000000001.json
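
Because the layout is fixed, you can locate files by building a partition prefix. Below is a minimal sketch with boto3, using placeholder bucket and topic names taken from the example path above.

```python
import boto3

# List the custom telemetry files written for a single hour, using the
# fixed partition layout shown above. Bucket and topic are placeholders.
BUCKET = "your-bucket-name"
PREFIX = (
    "game-telemetry/realm/dev/topics/"
    "analytics_game_telemetry.dev.yourNamespace.yourEventName/"
    "year=2024/month=07/day=17/hour=07/"
)

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        print(obj["Key"], obj["Size"])
```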

Redshift

AccelByte can create a connector for you in Kafka Connect that pulls event data from Kafka and writes it to your Redshift cluster.

Redshift integration requirements

AccelByte can integrate the game telemetry pipeline with Redshift for you. For us to do so, you'll need to send us the following information:

  • Redshift credentials
  • Redshift database name

After we input that information into Kafka Connect, the connector will be created and the pipeline will automatically send game telemetry data to Redshift.

Redshift key concepts

Data Format: Nested data is flattened in the game telemetry pipeline before being pushed into Kafka. Each field of an event's payload is converted into a Redshift column.
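
As an illustration of this kind of flattening (the pipeline's exact column-naming scheme is not documented here, so the underscore separator is an assumption):

```python
def flatten(payload: dict, parent: str = "", sep: str = "_") -> dict:
    """Join nested keys into a single column name so each leaf value
    can map to one Redshift column. Separator is an assumption."""
    flat = {}
    for key, value in payload.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name, sep))
        else:
            flat[name] = value
    return flat

event = {"userId": "u-1", "match": {"mode": "ranked", "score": {"kills": 7}}}
print(flatten(event))
# {'userId': 'u-1', 'match_mode': 'ranked', 'match_score_kills': 7}
```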

Snowflake

Due to the complexity of the configuration, AccelByte provides a self-service UI for managing your Snowflake integration. You can set up a connection to AGS in the AGS Admin Portal by following the steps below.

Snowflake prerequisites

To connect Snowflake to AGS, ensure you have completed the required Snowflake-side prerequisites first.

Connect Snowflake to AGS

  1. In your studio namespace of the AGS Admin Portal, go to Analytics > Analytics & Data Monitoring.

  2. Click Configure Now.

  3. On the Add Snowflake Configuration form that appears, fill in the fields with the following information:

    1. Enter the Snowflake username of the user that will be responsible for this data in the Snowflake Username field.

    2. Enter your Snowflake account ID in the Snowflake Account ID field.

      The Snowflake Account ID is the prefix before snowflakecomputing.com in your Snowflake account URL. For example, if your Snowflake URL is rsa42644.us-east-1.snowflakecomputing.com, then the account ID is rsa42644.us-east-1. Ensure you use the correct Account ID format.

    3. Enter the destination database's name in the DB Name field.

    4. Enter the schema of the database used for the integration in the Schema field.

    5. Enter the pipeline you want as the source of the data in the Data Pipe Source field. You can enter more than one pipeline.

    6. Choose how you want to store the data in the Model field. There are two model options:

      • Single: This model stores the data in one table for all AGS system events. If you use AGS Telemetry, two tables will be created (one for Telemetry events and one for AGS system events).
      • Mapping: This model stores the data by mapping each event's topic name to a table; one table will be created per topic name. Snowflake stages and pipes will also be created for each table.
  4. Click Next. The Connect to Snowflake form will appear along with a Public Key. Copy the Public Key and paste it into your Snowflake worksheet to set up the connection (a sketch of registering the key appears after these steps).

    Note

    This step must be completed before you can proceed to the following steps.

  5. Click Connect and you will be redirected to your Snowflake configuration details page. Set the Integration Status toggle to ACTIVE. Your Connection Status will change to CONNECTED; if it still says IN PROGRESS, refresh the page to update it.

    Once your configuration is successful, the Integration Status field changes to Active.
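
The Public Key from step 4 is registered on the Snowflake user with an ALTER USER statement in your worksheet. As a hedged illustration, the sketch below does the same thing through the snowflake-connector-python package; the credentials, user name, and key value are placeholders, and altering a user typically requires a privileged role such as SECURITYADMIN.

```python
import snowflake.connector

# Placeholder: the key copied from the Admin Portal (truncated here).
PUBLIC_KEY = "MIIBIjANBgkqhkiG..."

conn = snowflake.connector.connect(
    user="ADMIN_USER",               # placeholder admin user
    password="***",
    account="rsa42644.us-east-1",    # your Snowflake account ID
)
try:
    # Equivalent of pasting the statement into a Snowflake worksheet.
    conn.cursor().execute(
        f"ALTER USER integration_user SET RSA_PUBLIC_KEY='{PUBLIC_KEY}'"
    )
finally:
    conn.close()
```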

Note

If your configuration fails, the Integration Status field will change to Deactivated and the Connect button will appear in the Connection Status field. You can click the Connect button to try reconnecting, click the Edit button to edit the configuration, or click Remove to remove the configuration. Ensure you have met all the Snowflake prerequisites for this configuration.

Backfill Snowflake data

If your Snowpipe is missing data after you change your Snowflake credentials or the data has disappeared due to a connection issue, you can use the backfill feature to run past scheduled pipelines and fill in the missing data.

Ensure you have activated and started the Snowflake Warehouse before using the backfill feature.

  1. In the studio namespace of the AGS Admin Portal, go to Analytics > Analytics & Data Monitoring.

  2. Click Backfill to make the Backfill form appear.

  3. In the Time Range option, choose All Time to retrieve missing data across the entire range, or choose Partial and enter a Start Date and End Date to retrieve missing data for a specific period.

  4. When you're finished, click Submit.

The backfilling process will run in the background. Refresh the page to see the latest configuration status. You can also cancel the backfilling process by clicking Cancel Backfilling Process.

Once backfilling has been completed successfully, a confirmation will appear on the configuration page.

If backfilling fails, check your connection and ensure that the user you specified when creating your Snowflake configuration has a default warehouse set; you can verify this with the DESCRIBE USER command in Snowflake. Also confirm that the default warehouse is running by checking the Warehouses menu in Snowflake.
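
If you prefer to run these checks programmatically, the sketch below uses snowflake-connector-python to read the user's DEFAULT_WAREHOUSE property and list warehouse states; all identifiers are placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="ADMIN_USER", password="***", account="rsa42644.us-east-1"
)
try:
    cur = conn.cursor()
    # DESCRIBE USER returns one row per property; find DEFAULT_WAREHOUSE.
    for row in cur.execute("DESCRIBE USER integration_user"):
        if row[0] == "DEFAULT_WAREHOUSE":
            print("default warehouse:", row[1])
    # SHOW WAREHOUSES includes each warehouse's state (e.g., STARTED).
    for row in cur.execute("SHOW WAREHOUSES"):
        print("warehouse:", row[0], "state:", row[1])
finally:
    conn.close()
```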

Once the integration is successful, you can periodically rotate the RSA Key to keep the connection secure by clicking Generate New RSA Key next to Connection Status on the configuration page. Doing so disconnects the Snowflake integration, and you will need to reconnect Snowflake.

Edit the Snowflake configuration

  1. In the studio namespace of the Admin Portal, go to Analytics > Analytics & Data Monitoring.

  2. Click Edit Configuration.

  3. Before editing the configuration, you must first disconnect Snowflake. Click the option to do so and the Disconnect Snowflake form will appear. Click Continue to confirm the disconnection and proceed to the editing form.

  4. Edit your configuration as desired on the Edit Snowflake Configuration form.