Delivery Options

Placer offers several options for accessing its POI analytics insights, exported as CSV files on a daily, weekly, or monthly basis.
Customers can choose one of the following delivery options:

  • Customer-owned bucket
    • Google Cloud Storage (GCS)
    • Amazon Web Services (AWS) S3
    • Microsoft Azure
  • Placer bucket directory allocated for the customer (available for GCP and AWS only)
  • Temporary storage owned by Placer, shared via an email link.

Amazon Web Services (AWS) S3

Customers can decide whether to have the data feed pushed into their own bucket or to pull the data from a Placer-owned bucket.

Requesting Placer bucket set up

If you prefer to pull data from a Placer-owned bucket, contact your customer success manager. Once your bucket is ready, Placer will share with you the required credentials and the path to access your bucket.

Share your S3 bucket with Placer

  1. Create or reuse an S3 bucket to receive the export files.
  2. Create a user (or reuse an existing user's permissions) and provide your Placer customer success manager with the following credentials:
    AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY
  3. The user needs the following permissions:
    "s3:PutObject",
    "s3:GetObject",
    "s3:DeleteObject",
    "s3:ListBucket",
    "s3:GetBucketLocation"

For your convenience, here is an example user policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [                
                "arn:aws:s3:::{{bucket goes here}}",
                "arn:aws:s3:::{{bucket goes here}}/*"
            ]
        }
    ]
}
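If you script your bucket setup, the policy above can also be generated programmatically. A minimal sketch; `build_placer_policy` and the example bucket name are illustrative, not part of Placer's tooling:

```python
import json

# The permissions listed in step 3 above.
REQUIRED_ACTIONS = [
    "s3:PutObject",
    "s3:GetObject",
    "s3:DeleteObject",
    "s3:ListBucket",
    "s3:GetBucketLocation",
]

def build_placer_policy(bucket_name: str) -> dict:
    """Build the user policy covering the bucket itself and all objects in it."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": REQUIRED_ACTIONS,
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",       # bucket-level actions (ListBucket, GetBucketLocation)
                    f"arn:aws:s3:::{bucket_name}/*",     # object-level actions (Put/Get/DeleteObject)
                ],
            }
        ],
    }

if __name__ == "__main__":
    # "my-export-bucket" is a placeholder; substitute your own bucket name.
    print(json.dumps(build_placer_policy("my-export-bucket"), indent=4))
```

Note that both the bucket ARN and the `/*` object ARN are required: `s3:ListBucket` applies to the bucket, while the object actions apply to its contents.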

Why do we require these bucket permissions?

In short, Placer needs to write the export files (s3:PutObject), verify the uploads (s3:GetObject, s3:ListBucket, s3:GetBucketLocation), and replace outdated files (s3:DeleteObject).

Test your bucket policy by downloading and running the placer_s3_export_test.py test script as follows (replace the placeholders with the relevant parameters):

python placer_s3_export_test.py <bucket-name> <access-key> <access-secret>

Any missing user-policy permissions will be listed in the script's response.

If there are no issues listed, you are all set! Once enabled, Placer's data will be uploaded to the desired bucket.

Google Cloud Storage (GCS)

Requesting Placer bucket set up
If you prefer to pull data from a Placer-owned bucket, contact your customer success manager. Once your bucket is ready, Placer will share with you the required credentials to access it.

Share your GCS bucket with Placer

  1. Create a Google Cloud Storage bucket.
  2. Create a new service account (or use an existing one) and provide your Placer account manager with the credentials file (JSON) associated with the account.
  3. Grant the 'Storage Admin' role on the bucket to the service account.
  4. All done! Once enabled, Placer's data will be uploaded to the desired bucket.
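Before sending the credentials file, you can sanity-check that it looks like a standard service-account key. A minimal sketch, assuming the standard Google key-file fields; `check_service_account_key` is an illustrative helper, not Placer tooling:

```python
import json

# Core fields present in a standard Google service-account key file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_key(raw: str) -> list:
    """Return a list of problems found in a service-account key JSON string.

    An empty list means the file looks like a valid service-account key.
    """
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return ["file is not valid JSON"]
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - data.keys())]
    if data.get("type") not in (None, "service_account"):
        problems.append("'type' should be 'service_account'")
    return problems
```

For example, passing an empty JSON object (`"{}"`) reports every required field as missing, while a key file downloaded from the Google Cloud console should produce no problems.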

Microsoft Azure sync

We currently support only export to a customer-owned Microsoft Azure container.
To sync exports with Microsoft Azure, you will need the following in your Azure account:

  1. A dedicated Azure container (create one or use an existing container)
  2. A role with the Microsoft.Storage/storageAccounts/listkeys/action permission
  3. The bucket information to provide to Placer:
  • Storage account name
  • Storage account key
  • Path to store the data feed (container name and internal path)
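Many Azure tools accept the storage account name and key combined into a single connection string. A minimal sketch of assembling one, assuming the standard core.windows.net endpoint suffix; `build_connection_string` is an illustrative helper, not Placer tooling:

```python
def build_connection_string(account_name: str, account_key: str) -> str:
    """Assemble a standard Azure Storage connection string from account credentials."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )

if __name__ == "__main__":
    # Placeholder values; substitute your real account name and key.
    print(build_connection_string("mystorageaccount", "<account-key>"))
```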

Email

Placer can grant you access to your data feed via a temporary storage link delivered to email recipients of your choice.
Contact your customer success manager to request this option.

Note: An additional option for receiving the data feed (available for some data feeds only) is an SFTP connection. We strongly recommend choosing one of the other options listed above, due to the robust security and high availability offered by the cloud providers.