This is your API key, used by developers when accessing the Appuri Event Sink API to transfer data to our Data Warehouse. Full documentation is available under Event API · Appuri.
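As a rough sketch of what an event submission might look like, the snippet below builds (but does not send) an authenticated HTTP request. The endpoint URL, payload shape, and placement of the key in a Bearer header are illustrative assumptions, not the actual Event API contract; consult the Event API documentation for the real schema.

```python
import json
import urllib.request

# Placeholders -- substitute your real API key and the endpoint
# from the Event API documentation.
API_KEY = "your-api-key"
ENDPOINT = "https://example.appuri.com/events"  # hypothetical URL

def build_event_request(event_type, properties):
    """Build (but do not send) an HTTP request carrying one event.

    The JSON body shape here is an assumption for illustration only.
    """
    body = json.dumps({"type": event_type, "properties": properties}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            # Assumption: the key is sent as a Bearer token.
            "Authorization": "Bearer " + API_KEY,
        },
        method="POST",
    )

req = build_event_request("purchase", {"amount": 1200})
```

Sending the request (for example with `urllib.request.urlopen(req)`) would transfer the event to the sink.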
These are the credentials used to connect directly to your data warehouse via SQL. You will be provided with a single username and password (the analyst user), which you can use to establish a connection from your choice of SQL client.
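To illustrate, here is a small helper that assembles a libpq-style connection URL from the pieces shown on your credentials page. The hostname and database name below are placeholders, not real values; 5439 is Redshift's default port. Note that the password is percent-encoded so that special characters do not break the URL.

```python
from urllib.parse import quote

def redshift_dsn(host, port, database, user, password):
    """Assemble a PostgreSQL-style connection URL for a SQL client.

    Redshift speaks the PostgreSQL wire protocol, so most clients
    accept a DSN of this form. Host and database are placeholders.
    """
    return "postgresql://{u}:{p}@{h}:{port}/{db}".format(
        u=quote(user),
        p=quote(password, safe=""),  # encode '/', '@', etc. in the password
        h=host,
        port=port,
        db=database,
    )

dsn = redshift_dsn(
    "example.redshift.amazonaws.com",  # placeholder host
    5439,                              # Redshift default port
    "warehouse",                       # placeholder database
    "analyst",
    "s3cr3t/pass",
)
```

Paste the resulting URL into your SQL client's connection dialog, or pass it to a driver that accepts DSN strings.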
Please note that, for security purposes, external connections are disabled by default. You will need to add your IP address to the whitelist in order to gain access. This applies to any service that uses these credentials, including Jobs and other cloud services.
You can use these credentials to place temporary data into Amazon S3. This is useful when creating Jobs, and may be required for certain custom integrations. Using these credentials, you can read and write files in your Amazon S3 scratch space. Please refer to the Amazon Simple Storage Service (S3) documentation for more information.
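In practice an AWS SDK (such as boto3) handles request signing for you, but as an illustration of how these credentials authenticate S3 requests, the sketch below derives the AWS Signature Version 4 signing key from a secret access key. The secret, date, and region values are placeholders.

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key, date_stamp, region, service="s3"):
    """Derive the AWS Signature Version 4 signing key.

    Follows the documented HMAC-SHA256 chain:
    kDate -> kRegion -> kService -> kSigning.
    """
    def sign(key, msg):
        return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

    k_date = sign(("AWS4" + secret_key).encode("utf-8"), date_stamp)
    k_region = sign(k_date, region)
    k_service = sign(k_region, service)
    return sign(k_service, "aws4_request")

# Placeholder inputs -- use your real secret key and the request date/region.
key = sigv4_signing_key("EXAMPLE_SECRET", "20240101", "us-east-1")
```

The resulting 32-byte key is what signs each request's string-to-sign; an SDK performs this derivation transparently whenever you read or write an object.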
Here, you can scale your Redshift cluster up and down as you run into storage limits. Please note that you will be charged for each additional node you add to the cluster; if you are not the individual who handles billing for your account, please consult them before scaling.
This allows you to configure the automatic deletion of historical data, to preserve disk space in your Redshift cluster. This is particularly useful if you ingest large amounts of data and historical activity is no longer relevant to your analysts after several months.
You will be able to choose a period of time, in days, to retain data; any event data older than this period will be deleted permanently from our systems.
Please note that this action is irreversible and will result in the irrecoverable loss of data. Please make this decision after careful consideration of your current storage costs, projected utilization, and analyst requirements. Do not hesitate to reach out and let us know if you have any questions.
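To make the retention window concrete, the sketch below computes the cutoff date for a hypothetical 180-day retention period; events dated before the cutoff would be permanently deleted. The function and the period shown are illustrative only, not part of the product.

```python
from datetime import date, timedelta

# Hypothetical retention period for illustration.
RETENTION_DAYS = 180

def retention_cutoff(today, retention_days=RETENTION_DAYS):
    """Return the earliest event date that is still retained.

    Any event older than this date falls outside the retention
    window and would be removed.
    """
    return today - timedelta(days=retention_days)

cutoff = retention_cutoff(date(2024, 7, 1))
```

For example, with a 180-day period evaluated on 2024-07-01, any event dated before 2024-01-03 would be deleted.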