Delivery Details

On this page, you will find information about the various methods available for receiving your data feeds and about how the folders containing the deliverables are organized.

Delivery options

Placer offers several options for accessing its POI analytics insights, exported as CSV files. Customers can choose one of the following delivery options, depending on which delivery options are available for each feed type.

Delivery option          Available for feeds
Customer-owned buckets   Multi-metrics chain/venue, Migration trends, Brand dominance, Industry trends, Premium export, Origins
Placer bucket            Multi-metrics chain/venue, Migration trends, Brand dominance, Industry trends, Premium export, Origins
Email                    Multi-metrics chain/venue, Migration trends, Premium export
SFTP                     Multi-metrics chain/venue, Brand dominance, Industry trends
Snowflake                Multi-metrics chain/venue, Migration trends, Premium export

Customer-owned bucket

Amazon Web Services (AWS) S3

Customers can choose to have the data feed pushed into their own bucket or to pull data from a Placer-owned bucket.

Share your S3 bucket with Placer

  1. Create or reuse an S3 bucket for receiving the export files
  2. Create a user (or share existing user permissions) and provide your Placer customer success manager with the following credentials:
    AWS_ACCESS_KEY_ID
    AWS_SECRET_ACCESS_KEY
  3. The user needs the following permissions:
    "s3:PutObject",
    "s3:GetObject",
    "s3:DeleteObject",
    "s3:ListBucket",
    "s3:GetBucketLocation"

This is an example of a user policy:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [                
                "arn:aws:s3:::{{bucket goes here}}",
                "arn:aws:s3:::{{bucket goes here}}/*"
            ]
        }
    ]
}


  4. Test the bucket policy by downloading and running the placer_s3_export_test.py test script as follows:
    (replace the placeholders with the relevant parameters)
python placer_s3_export_test.py <bucket-name> <access-key> <access-secret>

Missing user-policy permissions will be listed in the script's output. Once the required permissions are enabled, Placer's data will be uploaded to the desired bucket.
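
For reference, the following is a minimal sketch of the kind of checks such a test script could perform, written with boto3; it is an assumption for illustration, and the actual placer_s3_export_test.py may behave differently.

# Illustrative sketch only; the real placer_s3_export_test.py may differ.
import sys

import boto3

bucket, access_key, secret_key = sys.argv[1], sys.argv[2], sys.argv[3]
s3 = boto3.client("s3", aws_access_key_id=access_key, aws_secret_access_key=secret_key)

# Exercise each permission required by the user policy above.
s3.get_bucket_location(Bucket=bucket)                                         # s3:GetBucketLocation
s3.list_objects_v2(Bucket=bucket, MaxKeys=1)                                  # s3:ListBucket
s3.put_object(Bucket=bucket, Key="placer_permission_test.txt", Body=b"test")  # s3:PutObject
s3.get_object(Bucket=bucket, Key="placer_permission_test.txt")                # s3:GetObject
s3.delete_object(Bucket=bucket, Key="placer_permission_test.txt")             # s3:DeleteObject
print("All required permissions are in place.")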

Google Cloud Storage (GCS)

Share your GCS bucket with Placer

  1. Create a Google Cloud Storage bucket
  2. Create a new Service Account (or use an existing one) and provide your Placer account manager with the credentials file (JSON) associated with the account
  3. Grant the newly created Service Account the 'Storage Admin' permission on the bucket
  4. All done! Once enabled, Placer's data will be uploaded to the desired bucket.
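
As an illustration only, a quick way to confirm that the Service Account can write to the bucket is a short check with the google-cloud-storage client; the bucket name and key file path below are placeholders.

# Illustrative check only; bucket name and key file path are placeholders.
from google.cloud import storage

client = storage.Client.from_service_account_json("service-account-key.json")
bucket = client.bucket("your-export-bucket")

# Upload, read back, and delete a small test object.
blob = bucket.blob("placer_permission_test.txt")
blob.upload_from_string("test")
print(blob.download_as_bytes())
blob.delete()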

Microsoft Azure sync

Placer currently supports exports only to customer-owned Microsoft Azure containers.
To sync your exports with Microsoft Azure, follow these steps in your Microsoft Azure account:

  1. Create a dedicated Azure container (or use an existing one)
  2. Use a role that has the Microsoft.Storage/storageAccounts/listkeys/action permission
  3. Provide Placer with the following storage information:
  • Storage account name
  • Storage account key
  • Path to store the data feed (container name and internal path)
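
For illustration, a connection built from these details could look like the following sketch using the azure-storage-blob package; the account name, key, and container below are placeholders.

# Illustrative sketch only; account name, key, and container are placeholders.
from azure.storage.blob import BlobServiceClient

account_name = "yourstorageaccount"
account_key = "your-storage-account-key"
container_name = "placer-exports"

service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
container = service.get_container_client(container_name)

# List whatever has been delivered under the configured internal path.
for blob in container.list_blobs(name_starts_with="placer-analytics/"):
    print(blob.name)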

Requesting Placer bucket setup

Contact your customer success manager if you are interested in pulling data from a Placer-owned bucket. Once your bucket is ready, Placer will share with you the required credentials and the path for accessing your bucket.
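
As a rough example, pulling the delivered files from a Placer-owned S3 bucket with the shared credentials could look like the sketch below; the bucket name, path, and credentials are placeholders.

# Illustrative pull only; bucket name, prefix, and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<shared-access-key>",
    aws_secret_access_key="<shared-access-secret>",
)

bucket = "<placer-shared-bucket>"
prefix = "<shared-path>/"

# Download every object under the shared path.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        local_name = obj["Key"].replace("/", "_")
        s3.download_file(bucket, obj["Key"], local_name)
        print("Downloaded", obj["Key"])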

Email

Placer can grant you access to your data feed via a temporary storage link delivered to email recipients of your choice.
Contact your customer success manager to request this option.

SFTP

An additional option for receiving data feeds (available for some of the data feeds only) is an SFTP connection. Due to the robust security and high availability offered by all cloud providers, we strongly recommend choosing one of the other options listed above instead.
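
If SFTP is nonetheless the chosen option, a pull from the SFTP server could look like the following sketch using paramiko; the host, credentials, and remote path are placeholders, not actual connection details.

# Illustrative SFTP pull only; host, credentials, and remote path are placeholders.
import paramiko

transport = paramiko.Transport(("<sftp-host>", 22))
transport.connect(username="<user>", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)

# Download everything under the delivery folder.
remote_dir = "/placer-analytics"
for name in sftp.listdir(remote_dir):
    sftp.get(f"{remote_dir}/{name}", name)
    print("Downloaded", name)

sftp.close()
transport.close()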

Snowflake

Some of Placer's data feeds are available for access via your Snowflake account. Contact your customer success manager to set up a Snowflake view for you.
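
For illustration only, querying such a view from Python could look like the sketch below using the snowflake-connector-python package; the account, credentials, and view name are placeholders, and the actual view Placer sets up may be named differently.

# Illustrative query only; connection details and view name are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<your-account-identifier>",
    user="<your-user>",
    password="<your-password>",
    warehouse="<your-warehouse>",
    database="<shared-database>",
    schema="<shared-schema>",
)

cur = conn.cursor()
cur.execute("SELECT * FROM <placer_shared_view> LIMIT 10")  # placeholder view name
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()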

Delivery folders organization - V3

📘

Relevancy note

This section is relevant for Chain/Venue multi-metrics and Premium export.

  1. The file path will be as follows:
    /placer-analytics/SCHEMA/EXPORT_TYPE/DATE_STRING/ENTITY_TYPE/FILE_TYPE
SCHEMA
  Values: (1) multi-metric, (2) bulk-export, (3) visits-by-region
  Description: The path also includes the export SCHEMA. Currently only "multi-metric" (the default for new exports) and "single-metric" (deprecated, still used for existing exports) are supported.

EXPORT_TYPE
  Values: daily, weekly, monthly, weekly-daily, monthly-daily, monthly-weekly
  Description: EXPORT_TYPE is a combination of the frequency and aggregation defined for the feed. For example, a weekly export (generated once a week) with a daily-level aggregation (each line represents a single day) is defined as "weekly-daily". If that export is defined with a weekly-level aggregation (each line represents a single week's aggregated metrics), the definition is "weekly".
  Note: Different feeds support different aggregation/frequency options.

DATE_STRING
  Values: Date in the format YYYY-MM-DD
  Description: The feed's generation date.

ENTITY_TYPE
  Values: chain, property, cbg
  Description: Each exported file represents an entity, and that entity has an ENTITY_TYPE.

FILE_TYPE
  Values: metrics, metadata, region_metadata
  Description: Under each ENTITY_TYPE folder, the files are split based on the file type:
  • Metrics files
  • POI metadata files
  • Regions metadata files
  2. Files are GZip compressed
  3. If the feed is configured to include an explicit list of entities (venues/complexes/billboards), the files will be located under the property folder; if the configuration was for a list of chains (whether or not their sub-entities are included), the files will be located under the chain folder
  4. Folder structure example:
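
The following listing is an illustrative reconstruction based on the path template above; the schema, export type, date, and file names are placeholders, and actual deliveries may differ.

placer-analytics/
  multi-metric/
    weekly-daily/
      2024-01-10/
        chain/
          metrics/
            <chain metrics file>.csv.gz
          metadata/
            <chain metadata file>.csv.gz
        property/
          metrics/
            <venue metrics file>.csv.gz
          metadata/
            <venue metadata file>.csv.gz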

Delivery folders organization - V4

📘

Relevancy note

This section is relevant for Chain/Venue multi-metrics, Premium export, and Origins feeds.

  1. The file path will be as follows:
    /placer-analytics/SCHEMA/EXPORT_TYPE/DATE_STRING/FILE_TYPE/
SCHEMA
  Values: (1) multi-metric, (2) bulk-export, (3) visits-by-region
  Description: The path also includes the export SCHEMA. Currently only "multi-metric" (the default for new exports) and "single-metric" (deprecated, still used for existing exports) are supported.

EXPORT_TYPE
  Values: daily, weekly, monthly, weekly-daily, monthly-daily, monthly-weekly
  Description: EXPORT_TYPE is a combination of the frequency and aggregation defined for the feed. For example, a weekly export (generated once a week) with a daily-level aggregation (each line represents a single day) is defined as "weekly-daily". If that export is defined with a weekly-level aggregation (each line represents a single week's aggregated metrics), the definition is "weekly".
  Note: Different feeds support different aggregation/frequency options.

DATE_STRING
  Values: Date in the format YYYY-MM-DD
  Description: The feed's generation date.

FILE_TYPE
  Values: metrics, metadata, regions_metadata
  Description: The files are split into folders based on the file type:
  • Metrics files
  • POI metadata files
  • Regions metadata files
  2. Files are GZip compressed
  3. File partitioning optimization is employed
  4. Folder structure example:
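
Again for illustration only, a V4 delivery constructed from the path template above might look like the following; the schema, export type, date, and partitioned file names are placeholders.

placer-analytics/
  multi-metric/
    monthly/
      2024-02-06/
        metrics/
          part-00000.csv.gz
          part-00001.csv.gz
        metadata/
          part-00000.csv.gz
        regions_metadata/
          part-00000.csv.gz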

Delivery Scheduling

📘

Relevancy note

This section is relevant for Chain/Venue multi-metrics, Premium export, and Origins feeds.

Please find below the delivery schedule for each feed by its delivery frequency:

Feed                         Frequency    Delivery Scheduling
Chain/venue multi-metrics    Daily        Each day at around 7 am EST
                             Weekly       Every Wednesday
                             Monthly      On the 6th of each month
Premium export               Weekly       Every Thursday
                             Monthly      On the 6th of each month
Origins                      Monthly      On the 10th of each month

Delivery Completion Indication

To receive notifications upon the completion of the entire delivery and the availability of files for retrieval, Placer offers various notification choices:

  1. .success file - This empty file is the final file delivered; its arrival signals that all deliverables are ready for retrieval (a polling sketch follows after this list). You can find the file's location within the various file structures described above.
  2. Amazon Simple Notification Service - you can learn more about it here.
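
For the .success file option, a minimal polling sketch is shown below, assuming an S3 delivery; the bucket name and .success key path are placeholders and will depend on your configured file structure.

# Illustrative polling only; bucket and key path are placeholders.
import time

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "<your-export-bucket>"
success_key = "placer-analytics/multi-metric/monthly/2024-02-06/.success"  # placeholder path

# Wait until the empty .success marker appears, then start retrieving files.
while True:
    try:
        s3.head_object(Bucket=bucket, Key=success_key)
        print("Delivery complete; files are ready for retrieval.")
        break
    except ClientError:
        time.sleep(300)  # check again in five minutes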