
Create an S3 connector

Last updated on July 10, 2024

Introduction

You can create and configure an S3 connector in the AccelByte Gaming Services (AGS) Admin Portal to ensure seamless integration between your data streaming workflows and Amazon S3.

Prerequisites

  • Access to the AGS Admin Portal.
  • An S3 bucket in your AWS account. This is where the streamed data will be stored. Refer to the Amazon Web Services (AWS) documentation to learn how to create an S3 bucket.
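If you prefer to create the bucket with a script rather than the AWS console, the following is a minimal sketch using the AWS SDK for Python (boto3). The bucket name and region are placeholders; substitute your own values.

    import boto3

    # Placeholders: replace with your own bucket name and AWS region.
    BUCKET_NAME = "my-ags-telemetry-bucket"
    AWS_REGION = "us-west-2"

    s3 = boto3.client("s3", region_name=AWS_REGION)

    # Outside us-east-1, S3 requires an explicit LocationConstraint.
    s3.create_bucket(
        Bucket=BUCKET_NAME,
        CreateBucketConfiguration={"LocationConstraint": AWS_REGION},
    )
    print(f"Created bucket {BUCKET_NAME} in {AWS_REGION}")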

Create an S3 connector

  1. On the Admin Portal sidebar, go to ADD-ONS > AIS > Data Connector.

    Image shows navigating to the Analytics Connector page

  2. On the Data Connector page, click the Create Connector button. The Create Connector page appears.

  3. In the General Configuration section, fill in the required information:

    • Platform: select AWS S3 from the dropdown.
    • Connector Name: type in a name for the connector. Note that after the connector is created, a randomized set of numbers, preceded by a dash, will be appended to the name you provided. For example, "AAAConnect" becomes "AAAConnect-123456" after the connector is created.
    • Bucket Name: type in the name of the S3 bucket.
    • AWS Region Code: type in the AWS region where the S3 bucket has been created.
    • S3 Path Format: add the path where the data in the S3 bucket will be stored. For more information on the path format, refer to the S3 path format section in the Configure the S3 path format article.
    • Partition Duration: specify, in minutes, how data will be partitioned into S3 objects. You can set a duration from one up to 60 minutes.
    • Flush Interval: set the maximum time interval, in milliseconds, after which buffered data is written into S3. The flush interval range is between one and 15 minutes (60,000 to 900,000 milliseconds).
    • Flush Size: set the maximum number of events that will be written into S3. The flush size range is between 100 and 1000.
    note

    Data is sent as soon as either the flush interval or the flush size condition is met, whichever is reached first, as illustrated in the sketch below.
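    To make the interplay between Flush Interval and Flush Size concrete, here is a purely illustrative Python sketch of the "whichever comes first" rule. This is not AccelByte code; the connector applies this logic server-side, and all names and values below are placeholders.

      import time

      FLUSH_SIZE = 500             # events (allowed range: 100 to 1000)
      FLUSH_INTERVAL_MS = 300_000  # milliseconds (allowed range: 1 to 15 minutes)

      buffer = []
      last_flush = time.monotonic()

      def write_to_s3(events):
          # Placeholder: the real connector writes the batch as one S3 object.
          print(f"Flushing {len(events)} events to S3")

      def buffer_event(event):
          """Buffer an event and flush when either limit is reached, whichever comes first."""
          global last_flush
          buffer.append(event)
          elapsed_ms = (time.monotonic() - last_flush) * 1000
          if len(buffer) >= FLUSH_SIZE or elapsed_ms >= FLUSH_INTERVAL_MS:
              write_to_s3(buffer)
              buffer.clear()
              last_flush = time.monotonic()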

  4. Click Next to go to the Data Filtering section.

  5. In the Data Filtering section, fill in the required information:

    • Event: choose which event type will be the source:
      • Game Telemetry: Custom telemetry events that are sent from game clients (Custom Telemetry).
      • AccelByte Event: System-generated events from AccelByte services (Service Telemetry).
    • Filter: click the Add Filter button. The Add Filter pop-up form appears. You can add specific namespaces and Kafka topics that you want to stream from the source services, or select all namespaces and/or all topics. This ensures that only relevant data is transferred. Click Add to create and save the filter; a conceptual sketch of how such a filter behaves follows this step.
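    The following is a conceptual Python sketch of how a namespace/topic filter narrows the stream. The structure, names, and values are illustrative assumptions, not the AGS API; the actual filter is configured entirely in the Add Filter pop-up.

      # Illustrative filter: "*" stands for "all namespaces" or "all topics".
      connector_filter = {
          "namespaces": ["mygame-prod"],           # example namespace
          "topics": ["playerJoined", "matchEnd"],  # example Kafka topics
      }

      def event_passes(namespace: str, topic: str, flt: dict) -> bool:
          """Return True if an event's namespace and topic match the filter."""
          ns_ok = flt["namespaces"] == "*" or namespace in flt["namespaces"]
          topic_ok = flt["topics"] == "*" or topic in flt["topics"]
          return ns_ok and topic_ok

      print(event_passes("mygame-prod", "playerJoined", connector_filter))  # True
      print(event_passes("mygame-dev", "playerJoined", connector_filter))   # False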
  6. Click Next to go to the Bucket Policy Script Configuration section.

  7. In the Bucket Policy Script Configuration section, copy the policy template and apply it to your S3 bucket in AWS. To learn how to add bucket policies in AWS, refer to the AWS documentation. A scripted example is sketched at the end of this step.

    important
    • Ensure that you apply the policy before clicking Save in this section. Otherwise, the connector will fail to activate.
    • If you don't want to activate the connector yet, you can save it as a draft: click Save without applying the policy in AWS. An error message will appear stating that the connection failed. Click Do It Later to save the connector as a draft; you can activate it at a later time.

    Image shows the Bucket Policy Script Configuration form
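    If you manage the bucket with the AWS SDK rather than the console, applying the copied policy could look like the following Python (boto3) sketch. The policy body below is only a generic placeholder; paste the exact template copied from this section instead.

      import json
      import boto3

      BUCKET_NAME = "my-ags-telemetry-bucket"  # placeholder

      # Generic placeholder statement; replace with the template from the Admin Portal.
      policy = {
          "Version": "2012-10-17",
          "Statement": [
              {
                  "Sid": "AllowConnectorWrites",
                  "Effect": "Allow",
                  "Principal": {"AWS": "arn:aws:iam::123456789012:role/example-role"},  # placeholder principal
                  "Action": ["s3:PutObject"],
                  "Resource": f"arn:aws:s3:::{BUCKET_NAME}/*",
              }
          ],
      }

      s3 = boto3.client("s3")
      s3.put_bucket_policy(Bucket=BUCKET_NAME, Policy=json.dumps(policy))
      print(f"Bucket policy applied to {BUCKET_NAME}")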

  8. After applying the bucket policy script to your S3 bucket in AWS, go back to the AGS Admin Portal and click Save. The Admin Portal will establish the connection between the connector and the S3 bucket in your AWS account. The details page of the connector will then appear, with the connector status set to "ACTIVE."

    Image shows the successful activation of the connector
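    Once the connector is active and events start flowing, you can optionally confirm that objects are being written by listing the bucket under your configured path prefix. A minimal sketch follows; the bucket name and prefix are placeholders.

      import boto3

      BUCKET_NAME = "my-ags-telemetry-bucket"  # placeholder
      PATH_PREFIX = "game-telemetry/"          # placeholder: your configured S3 path format prefix

      s3 = boto3.client("s3")
      response = s3.list_objects_v2(Bucket=BUCKET_NAME, Prefix=PATH_PREFIX, MaxKeys=10)
      for obj in response.get("Contents", []):
          print(obj["Key"], obj["Size"])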