Class AsyncDataExportClient


  • public class AsyncDataExportClient
    extends java.lang.Object
    • Constructor Detail

      • AsyncDataExportClient

        public AsyncDataExportClient​(ClientOptions clientOptions)
    • Method Detail

      • downloadReportingDataExport

        public java.util.concurrent.CompletableFuture<java.lang.Void> downloadReportingDataExport​(DownloadReportingDataExportRequest request)
        Download the data from a completed reporting data export job.

        📘 Octet header required

        You will have to specify the header Accept: application/octet-stream when hitting this endpoint.

      • downloadReportingDataExport

        public java.util.concurrent.CompletableFuture<java.lang.Void> downloadReportingDataExport​(DownloadReportingDataExportRequest request,
                                                                                                  RequestOptions requestOptions)
        Download the data from a completed reporting data export job.

        📘 Octet header required

        You will have to specify the header Accept: application/octet-stream when hitting this endpoint.
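Both overloads return a `java.util.concurrent.CompletableFuture<java.lang.Void>`, so completion and failure are handled by composing on the returned future. The sketch below shows that pattern with a stand-in future in place of a real `client.downloadReportingDataExport(request)` call (the client and request construction are omitted here):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.atomic.AtomicBoolean;

public class DownloadCallback {
    // Attaches completion handling to a CompletableFuture<Void>, such as the
    // one returned by downloadReportingDataExport, and returns the composed future.
    static CompletableFuture<Void> onDone(CompletableFuture<Void> download, Runnable after) {
        return download
            .thenRun(after)
            .exceptionally(ex -> {
                System.err.println("download failed: " + ex);
                return null;
            });
    }

    public static void main(String[] args) {
        AtomicBoolean done = new AtomicBoolean(false);
        // Stand-in for client.downloadReportingDataExport(request):
        CompletableFuture<Void> download = CompletableFuture.runAsync(() -> {});
        onDone(download, () -> done.set(true)).join();
        System.out.println("completed: " + done.get());
    }
}
```

In a service, prefer composing further on the returned future rather than calling `join()`, which blocks the calling thread.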

      • create

        public java.util.concurrent.CompletableFuture<DataExport> create​(CreateDataExportRequest request)
        To create your export job, you need to send a POST request to the export endpoint https://api.intercom.io/export/content/data.

        The only parameters you need to provide are the start and end dates of the range you want exported.

        🚧 Limit of one active job

        You can only have one active job per workspace. You will receive an HTTP status code of 429 with the message Exceeded rate limit of 1 pending message data export jobs if you attempt to create a second concurrent job.

        ❗️ Updated_at not included

        The timeframe only includes messages sent during the period, not messages that were merely updated during it. For example, if a message was updated yesterday but sent two days ago, you would need to set created_at_after to a date before the message was sent to include it in your export job.

        📘 Date ranges are inclusive

        Requesting data for 2018-06-01 until 2018-06-30 will get all data for those days including those specified - e.g. 2018-06-01 00:00:00 until 2018-06-30 23:59:59.

      • create

        public java.util.concurrent.CompletableFuture<DataExport> create​(CreateDataExportRequest request,
                                                                         RequestOptions requestOptions)
        To create your export job, you need to send a POST request to the export endpoint https://api.intercom.io/export/content/data.

        The only parameters you need to provide are the start and end dates of the range you want exported.

        🚧 Limit of one active job

        You can only have one active job per workspace. You will receive an HTTP status code of 429 with the message Exceeded rate limit of 1 pending message data export jobs if you attempt to create a second concurrent job.

        ❗️ Updated_at not included

        The timeframe only includes messages sent during the period, not messages that were merely updated during it. For example, if a message was updated yesterday but sent two days ago, you would need to set created_at_after to a date before the message was sent to include it in your export job.

        📘 Date ranges are inclusive

        Requesting data for 2018-06-01 until 2018-06-30 will get all data for those days including those specified - e.g. 2018-06-01 00:00:00 until 2018-06-30 23:59:59.
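Because the range is inclusive of both endpoints, a date window runs from the first second of the start day through the last second of the end day. A small sketch of computing those bounds with java.time (whether CreateDataExportRequest takes Unix timestamps or another date representation is an assumption; check the request builder for the exact field types):

```java
import java.time.LocalDate;
import java.time.ZoneOffset;

public class ExportRange {
    // First second of the given day, as a Unix timestamp (UTC).
    static long startOf(LocalDate day) {
        return day.atStartOfDay(ZoneOffset.UTC).toEpochSecond();
    }

    // Last second of the given day: start of the next day minus one second.
    static long endOf(LocalDate day) {
        return day.plusDays(1).atStartOfDay(ZoneOffset.UTC).toEpochSecond() - 1;
    }

    public static void main(String[] args) {
        long createdAtAfter = startOf(LocalDate.of(2018, 6, 1));   // 2018-06-01 00:00:00 UTC
        long createdAtBefore = endOf(LocalDate.of(2018, 6, 30));   // 2018-06-30 23:59:59 UTC
        System.out.println(createdAtAfter + " .. " + createdAtBefore);
    }
}
```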

      • find

        public java.util.concurrent.CompletableFuture<DataExport> find​(FindDataExportRequest request)
        You can view the status of your job by sending a GET request to the URL https://api.intercom.io/export/content/data/{job_identifier}, where {job_identifier} is the value returned in the response when you first created the export job. See the Export Job Model for more detail.

        🚧 Jobs expire after two days

        All jobs that have completed processing (and are thus available to download from the provided URL) have an expiry limit of two days from when the export job completed. After this, the data will no longer be available.

      • find

        public java.util.concurrent.CompletableFuture<DataExport> find​(FindDataExportRequest request,
                                                                       RequestOptions requestOptions)
        You can view the status of your job by sending a GET request to the URL https://api.intercom.io/export/content/data/{job_identifier}, where {job_identifier} is the value returned in the response when you first created the export job. See the Export Job Model for more detail.

        🚧 Jobs expire after two days

        All jobs that have completed processing (and are thus available to download from the provided URL) have an expiry limit of two days from when the export job completed. After this, the data will no longer be available.
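The SDK builds this GET request for you; for illustration, the raw request shape behind find() can be sketched with the JDK's java.net.http types. The "xyz1234" identifier is a placeholder for the value returned when the job was created, and authentication headers (which the SDK adds from its ClientOptions) are omitted:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ExportStatusRequest {
    // Builds the raw GET request for the job-status endpoint; jobIdentifier
    // is the value returned in the create-job response.
    static HttpRequest statusRequest(String jobIdentifier) {
        return HttpRequest.newBuilder()
            .uri(URI.create("https://api.intercom.io/export/content/data/" + jobIdentifier))
            .GET()
            .build();
    }

    public static void main(String[] args) {
        System.out.println(statusRequest("xyz1234").uri());
    }
}
```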

      • download

        public java.util.concurrent.CompletableFuture<java.lang.Void> download​(DownloadDataExportRequest request)
        When a job has a status of complete, and thus a filled download_url, you can download your data by hitting that provided URL, formatted like so: https://api.intercom.io/download/content/data/xyz1234.

        Your exported message data will be streamed continuously back down to you in a gzipped CSV format.

        📘 Octet header required

        You will have to specify the header Accept: application/octet-stream when hitting this endpoint.

      • download

        public java.util.concurrent.CompletableFuture<java.lang.Void> download​(DownloadDataExportRequest request,
                                                                               RequestOptions requestOptions)
        When a job has a status of complete, and thus a filled download_url, you can download your data by hitting that provided URL, formatted like so: https://api.intercom.io/download/content/data/xyz1234.

        Your exported message data will be streamed continuously back down to you in a gzipped CSV format.

        📘 Octet header required

        You will have to specify the header Accept: application/octet-stream when hitting this endpoint.
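The raw request behind download() is a GET against the job's download_url with the required Accept header. A minimal sketch using the JDK's java.net.http types (auth headers, which the SDK supplies, are omitted; the URL below is the example one from the description):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ExportDownloadRequest {
    // Builds a GET request for the download_url with the required
    // Accept: application/octet-stream header. The response body is a
    // gzipped CSV stream; wrap it in a GZIPInputStream when reading.
    static HttpRequest downloadRequest(String downloadUrl) {
        return HttpRequest.newBuilder()
            .uri(URI.create(downloadUrl))
            .header("Accept", "application/octet-stream")
            .GET()
            .build();
    }

    public static void main(String[] args) {
        HttpRequest req = downloadRequest("https://api.intercom.io/download/content/data/xyz1234");
        System.out.println(req.headers().firstValue("Accept").orElse(""));
    }
}
```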