Analytics Data Connector
Data warehouse integrations are not available on AGS Shared Cloud tier.
Overview
The AccelByte Gaming Services (AGS) Analytics Data Connector provides the ability to send service game telemetry and custom game telemetry events to your own data warehouse. The Data Connector supports several destinations, such as Amazon S3, Amazon Redshift, and Snowflake. This article describes the AGS Analytics Data Connector in detail.
Activate Data Connector
To activate the Data Connector and begin ingesting data into your in-house analytics solution, contact your AccelByte representative.
The AGS Analytics Data Connector has limitations compared to the AIS Data Connector, which is a flexible, self-service solution. For a comparison of the two Data Connectors, refer to Data Connector Comparison.
S3
The AGS Analytics Data Connector supports outputting directly to Amazon S3 buckets. The data is structured into text files containing JSON representations of each event, which can then be processed by your in-house analytics systems.
S3 integration requirements
AccelByte can integrate the game telemetry pipeline with S3 for you. For us to do so, you'll need to send us the S3 bucket name and the region that you wish events to be written to. AccelByte will provide an IAM policy (an S3 policy script) representing your AccelByte connector. Once you add this policy to your S3 bucket's security settings, configuration will be complete.
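AccelByte supplies the exact policy document, so the following is only an orientation sketch: a bucket policy statement of roughly this shape, attached with boto3, would allow a connector role to write into your bucket. The role ARN, action list, and bucket name are all placeholders or assumptions, not AccelByte's actual values.

```python
import json

import boto3

# Hypothetical sketch: the actual policy document comes from AccelByte.
# The role ARN below is a placeholder, and the action list is an assumption
# based on what an S3 sink connector typically needs in order to write objects.
CONNECTOR_ROLE_ARN = "arn:aws:iam::111111111111:role/accelbyte-connector"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccelByteConnectorWrites",
            "Effect": "Allow",
            "Principal": {"AWS": CONNECTOR_ROLE_ARN},
            "Action": ["s3:PutObject", "s3:GetBucketLocation", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::your-bucket-name",
                "arn:aws:s3:::your-bucket-name/*",
            ],
        }
    ],
}

# Attach the policy to the bucket you shared with AccelByte.
s3 = boto3.client("s3", region_name="us-east-1")  # use the region you provided
s3.put_bucket_policy(Bucket="your-bucket-name", Policy=json.dumps(policy))
```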
S3 key concepts
To make your integration seamless, you should understand the following key concepts.
Flush configuration
You can configure flush behavior when requesting AGS Data Connector activation.
- Flush Interval: The time interval in minutes that the data should be periodically streamed to your data warehouse.
- Flush Size: The maximum number of events per file that will be streamed into your data warehouse.
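To illustrate how these two settings interact, here is a minimal sketch of the buffering semantics (not AccelByte's implementation): events accumulate until either the flush size is reached or the flush interval elapses, whichever comes first, and each flush produces one output file.

```python
import time

# Illustrative sketch only: a buffer is written out when it reaches FLUSH_SIZE
# events, or when FLUSH_INTERVAL_MINUTES has elapsed since the last flush,
# whichever comes first.
FLUSH_SIZE = 1000
FLUSH_INTERVAL_MINUTES = 5

buffer = []
last_flush = time.monotonic()

def write_file_to_warehouse(events):
    # Stub for illustration: in the real pipeline this writes one file
    # (e.g., a .json object in S3) containing the buffered events.
    print(f"flushing {len(events)} events")

def maybe_flush(event):
    global last_flush
    buffer.append(event)
    interval_elapsed = time.monotonic() - last_flush >= FLUSH_INTERVAL_MINUTES * 60
    if len(buffer) >= FLUSH_SIZE or interval_elapsed:
        write_file_to_warehouse(buffer)  # one output file per flush
        buffer.clear()
        last_flush = time.monotonic()
```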
S3 file and path format
The S3 file and path format in the AGS Data Connector cannot be customized. See AIS Data Connector for a solution that supports file and path format customization.
- JSON Formatted File: the files that AccelByte will send to you will be in `.json` format.
- Default S3 Path: you will have the following S3 path formats:

Service Telemetry Path:

```
your-bucket-name/justice-event/realm/dev/topics/yourNamespace.dev.matchmaking/year=2024/month=07/day=03/hour=10/yourNamespace.dev.matchmaking+0+0000000001.json
```

Custom Telemetry Path:

```
your-bucket-name/game-telemetry/realm/dev/topics/analytics_game_telemetry.dev.yourNamespace.yourEventName/year=2024/month=07/day=17/hour=07/minute=00/analytics_game_telemetry.dev.yourNamespace.yourEventName+0+0000000001.json
```
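Since the path layout is fixed, downstream jobs can locate a partition by prefix. Below is a minimal sketch, using boto3 and assuming each delivered `.json` file contains one JSON event per line (verify this against your own delivered files), that reads one hour of custom telemetry events:

```python
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "your-bucket-name"

# Read all custom telemetry events for one hour partition. The prefix mirrors
# the default path format shown above; adjust namespace, topic, and date parts.
prefix = (
    "game-telemetry/realm/dev/topics/"
    "analytics_game_telemetry.dev.yourNamespace.yourEventName/"
    "year=2024/month=07/day=17/hour=07/"
)

events = []
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=prefix):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
        # Assumption: one JSON event per line; verify against your files.
        for line in body.decode("utf-8").splitlines():
            if line.strip():
                events.append(json.loads(line))

print(f"Loaded {len(events)} events")
```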
Redshift
AccelByte can create a connector for you in Kafka Connect to pull event data from Kafka and write the data to your Redshift cluster.
Redshift integration requirements
AccelByte can integrate the game telemetry pipeline with Redshift for you. For us to do so, you'll need to send us the following information:
- Redshift credentials
- Redshift database name
After we input that information into Kafka Connect, the connector will be created and the pipeline will automatically send game telemetry data to Redshift.
Redshift key concepts
- Data Format: Nested data will be flattened in the game telemetry pipeline before being pushed into Kafka. Each field of the payload attached to the events will be converted into a Redshift column.
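The exact flattening scheme is internal to the pipeline, but the general idea can be sketched as follows, assuming nested keys are joined with a separator into one column name per leaf value:

```python
# Hypothetical sketch of payload flattening (the pipeline's actual scheme may
# differ, e.g., in its separator): nested keys are joined into a single column
# name so each leaf value maps to one Redshift column.
def flatten(payload: dict, parent: str = "", sep: str = "_") -> dict:
    columns = {}
    for key, value in payload.items():
        name = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            columns.update(flatten(value, name, sep))
        else:
            columns[name] = value
    return columns

event = {"userId": "u-123", "stats": {"kills": 7, "match": {"mode": "ranked"}}}
print(flatten(event))
# {'userId': 'u-123', 'stats_kills': 7, 'stats_match_mode': 'ranked'}
```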
Snowflake
Due to the complexity of the configuration, AccelByte provides a self-service UI to manage your Snowflake integration. You can set up a connection to AGS in the AGS Admin Portal by following the steps below.
Snowflake prerequisites
To connect Snowflake to AGS, you will need to have completed the following prerequisites:
- Create a Snowflake account.
- Create a role. You must have the ACCOUNTADMIN role granted.
- Create a new database or use an existing one.
- Grant the minimum required access to the role you created (an illustration is sketched after this list).
- Grant the role you created to the user who will be responsible for this database.
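AccelByte will confirm the exact privilege set for your integration. As an illustration only, a Snowpipe-style integration typically requires grants along the following lines; the sketch runs them through the snowflake-connector-python package, and every name in it (ACCELBYTE_ROLE, AGS_DB, AGS_USER, and so on) is a placeholder.

```python
import snowflake.connector

# Placeholders: substitute your own role, database, schema, and user names.
# The privilege list is an assumption based on typical Snowpipe integrations;
# confirm the exact minimum set with AccelByte before running.
statements = [
    "GRANT USAGE ON DATABASE AGS_DB TO ROLE ACCELBYTE_ROLE",
    "GRANT USAGE ON SCHEMA AGS_DB.PUBLIC TO ROLE ACCELBYTE_ROLE",
    "GRANT CREATE TABLE ON SCHEMA AGS_DB.PUBLIC TO ROLE ACCELBYTE_ROLE",
    "GRANT CREATE STAGE ON SCHEMA AGS_DB.PUBLIC TO ROLE ACCELBYTE_ROLE",
    "GRANT CREATE PIPE ON SCHEMA AGS_DB.PUBLIC TO ROLE ACCELBYTE_ROLE",
    "GRANT ROLE ACCELBYTE_ROLE TO USER AGS_USER",
]

conn = snowflake.connector.connect(
    user="ADMIN_USER", password="...", account="rsa42644.us-east-1",
    role="ACCOUNTADMIN",  # ACCOUNTADMIN is required per the prerequisites
)
try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()
```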
Connect Snowflake to AGS
In your studio namespace of the AGS Admin Portal, go to Analytics > Analytics & Data Monitoring.
Click Configure Now.
On the Add Snowflake Configuration form that appears, fill in the fields with the following information:
Enter the Snowflake username of the user that will be responsible for this data in the Snowflake Username field.
Enter your Snowflake account ID in the Snowflake Account ID field.
The Snowflake Account ID is the prefix before `snowflakecomputing.com` in your Snowflake account URL. For example, if your Snowflake URL is `rsa42644.us-east-1.snowflakecomputing.com`, then the account ID is `rsa42644.us-east-1`. Ensure you use the correct Account ID format.
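If you prefer to derive the ID programmatically rather than by eye, a small helper (assuming the standard `<account_id>.snowflakecomputing.com` URL shape shown above) could look like:

```python
# Derive the account ID by stripping the snowflakecomputing.com suffix.
# Assumes the URL shape shown above; requires Python 3.9+ for removesuffix.
def account_id_from_url(url: str) -> str:
    host = url.removeprefix("https://").split("/")[0]
    return host.removesuffix(".snowflakecomputing.com")

print(account_id_from_url("rsa42644.us-east-1.snowflakecomputing.com"))
# rsa42644.us-east-1
```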
Enter the destination database's name in the DB Name field.
Enter the schema of the database used for the integration in the Schema field.
Enter the pipeline you want as the source of the data in the Data Pipe Source field. You can enter more than one pipeline.
Choose how you want to store the data in the Model field. There are two model options:
- Single: This model will store the data in one table for all AGS system events. If you use AGS Telemetry, two tables will be created (one for Telemetry events and one for AGS system events).
- Mapping: This model will store the data by mapping each event's topic name into a table. One table will be created for each topic name. Snowflake stages and pipes will also be created along with the tables.
Click Next. The Connect to Snowflake form will appear along with a Public Key. Copy the Public Key and paste it in your Snowflake worksheet to set the connection.
Note: This step must be completed before you can proceed to the following steps.
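The Admin Portal provides the exact text to run. For context, Snowflake's standard key-pair authentication mechanism attaches the public key to a user with an ALTER USER statement; the sketch below assumes that mechanism and uses placeholder names and a truncated key.

```python
import snowflake.connector

# Paste the Public Key copied from the Admin Portal. Snowflake expects a bare
# base64 string (no PEM header/footer or line breaks).
PUBLIC_KEY = "MIIBIjANBgkq..."  # truncated placeholder

conn = snowflake.connector.connect(
    user="ADMIN_USER", password="...", account="rsa42644.us-east-1",
    role="ACCOUNTADMIN",
)
try:
    # Standard Snowflake key-pair authentication: attach the public key to the
    # user that AGS will connect as (placeholder name).
    conn.cursor().execute(
        f"ALTER USER AGS_USER SET RSA_PUBLIC_KEY='{PUBLIC_KEY}'"
    )
finally:
    conn.close()
```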
Click Connect and you will be redirected to your Snowflake configuration details page. Set the Integration Status toggle to ACTIVE. Your Connection Status will be changed to CONNECTED. If your Connection Status still says IN PROGRESS, you can refresh the page to update it.
Once your configuration is successful, the Integration Status field will be changed to Active.
If your configuration fails, the Integration Status field will change to Deactivated and the Connect button will appear in the Connection Status field. You can click the Connect button to try reconnecting, click the Edit button to edit the configuration, or click Remove to remove the configuration. Ensure you have met all the Snowflake prerequisites for this configuration.
Backfill Snowflake data
If your Snowpipe is missing data after you change your Snowflake credentials or the data has disappeared due to a connection issue, you can use the backfill feature to run past scheduled pipelines and fill in the missing data.
Ensure you have activated and started the Snowflake Warehouse before using the backfill feature.
In the studio namespace of the AGS Admin Portal, go to Analytics > Analytics & Data Monitoring.
Click Backfill to make the Backfill form appear.
If you want to retrieve missing data from the entire range of your backfill data, choose All Time in the Time Range option. If you want to retrieve missing data for a specified time, choose Partial in the Time Range option and enter the Start Date and End Date for this time range.
When you're finished, click Submit.
The backfilling process will run in the background. Refresh the page to see the latest configuration status. You can also cancel the backfilling process by clicking Cancel Backfilling Process.
Once backfilling has been completed successfully, a confirmation will appear on the configuration page.
If backfilling fails, check your connection and ensure that the user you set when creating your Snowflake configuration has a default warehouse set; you can verify this with the DESCRIBE USER command in Snowflake. Also check that the default warehouse is running by looking at the Warehouse menu in Snowflake.
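If you prefer to script the check, both properties can be inspected with standard Snowflake commands (DESCRIBE USER and SHOW WAREHOUSES); the user and account names below are placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    user="ADMIN_USER", password="...", account="rsa42644.us-east-1",
)
cur = conn.cursor()

# DESCRIBE USER lists user properties, including DEFAULT_WAREHOUSE.
for row in cur.execute("DESCRIBE USER AGS_USER"):  # placeholder user name
    if row[0] == "DEFAULT_WAREHOUSE":
        print("default warehouse:", row[1])

# SHOW WAREHOUSES includes each warehouse's state (e.g., STARTED or SUSPENDED).
for row in cur.execute("SHOW WAREHOUSES"):
    print(row[0], row[1])  # name, state

conn.close()
```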
Once the integration is successful, if you want to regularly update the RSA Key to ensure connection security, you can generate a new RSA Key by clicking Generate New RSA Key next to Connection Status on the configuration page. If you do this, the Snowflake integration will be disconnected and you will need to reconnect Snowflake.
Edit the Snowflake configuration
In the studio namespace of the Admin Portal, go to Analytics > Analytics & Data Monitoring.
Click Edit Configuration.
Before editing the configuration, you must first disconnect Snowflake. Click the option to do so and the Disconnect Snowflake form will appear. Click Continue to confirm the disconnection and proceed to the editing form.
Edit your configuration as desired on the Edit Snowflake Configuration form.