Archiving Event Data

The deltaDNA platform supports the automatic archiving of event data to Amazon Web Services (AWS) using Amazon S3 buckets.

Once your S3 bucket has been configured within the platform, CSV files containing the event data for your game will be delivered automatically.

Bucket Configuration

Follow the AWS guide on creating a bucket to set up the bucket that will store your exported event data.

After creating your bucket, you will need to configure its permissions so that deltaDNA can export to it. To do this, open your bucket, browse to Permissions > Bucket Policy, and enter the bucket policy JSON, replacing <BUCKET> with the name of your bucket:
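As an illustration only, a policy of this kind typically takes roughly the following shape. The account ID in the Principal and the exact list of actions shown here are placeholder assumptions, not the real deltaDNA values, so use the policy supplied by deltaDNA rather than this sketch:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DeltaDNAEventExport",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<DELTADNA-ACCOUNT-ID>:root"
      },
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListBucket",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::<BUCKET>",
        "arn:aws:s3:::<BUCKET>/*"
      ]
    }
  ]
}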

Ensure that the permissions are entered correctly, otherwise the export process will fail.

Next, you must configure your deltaDNA game to start exporting to this bucket. Browse to the Game Details page for your game by accessing the Game Management page and clicking the ‘Edit game details’ button next to your game, as detailed below.

Finally, enter your bucket name in the S3 Bucket Name field, ensuring that it is entered correctly so that the export process can succeed.

S3 Bucket Name

After you have completed the bucket configuration steps, your historical data will be moved to the new bucket and any new export data will be stored in the bucket specified.

There are a few things to take into consideration when exporting events to Amazon S3:

  • Historical data will be moved to the new bucket within roughly 24 hours – if the bucket is deleted after this takes place then the data will be lost
  • We recommend enabling server-side encryption on the bucket you are configuring to receive event data (a sketch of one possible encryption configuration is shown after this list) – more information can be found on the Amazon S3 documentation site
  • Any data going forward will be exported straight to the bucket provided – again, if the bucket is deleted then this data will be lost
  • Once the bucket is set up and receiving exports correctly do not adjust permissions or settings as this will cause future exports to fail
  • Using AWS and S3 storage will incur a commercial charge – for more information regarding pricing, visit the AWS pricing guide – note that the number of PUT requests made will likely match the number of export files stored in your existing S3 archive over a given time period
  • In support of our efforts to be GDPR compliant by 25th May 2018, we will be moving towards customer-owned storage of archive files, and all customers will need to have an S3 bucket configured within the platform if they wish to access archived event data
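Regarding the server-side encryption recommendation above, one way to enable default encryption is to apply an encryption configuration to the bucket, for example via the S3 console or the s3api PutBucketEncryption operation. The following is a sketch only, assuming the SSE-S3 (AES256) algorithm; your organisation may prefer SSE-KMS instead:

{
  "Rules": [
    {
      "ApplyServerSideEncryptionByDefault": {
        "SSEAlgorithm": "AES256"
      }
    }
  ]
}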