Retrieve historical property, unit, and floorplan data using asynchronous batch jobs. This workflow processes large data exports in the background and notifies you when results are ready for download in your preferred format (CSV, JSONL, or Parquet). It is designed for data engineers, analysts, and portfolio managers who need to:
  • Export time-series property data for trend analysis and reporting
  • Build historical datasets for machine learning models
  • Create custom dashboards with historical performance metrics
  • Sync property data into data warehouses or business intelligence tools
  • Analyze rent growth, occupancy trends, and market dynamics over time

Prerequisites

  • Your API key (Bearer token)
  • Property IDs for the properties you want data for
  • Date range for the data export (start_date and end_date)
If you need an API token, see Getting started → “Get an API token”.

1) Create a batch job

Submit a batch job request specifying the report type, output format, date range, and property IDs.
curl -X POST \
  -H "Authorization: Bearer <your_api_key>" \
  -H "Content-Type: application/json" \
  -d '{
    "report_type": "property",
    "output_format": "jsonl",
    "start_date": "2025-01-01",
    "end_date": "2025-12-31",
    "property_ids": [99001416, 99001470, 99004737],
    "callback_url": "https://your-domain.com/webhook"
  }' \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs"
Parameters:
  • report_type: Type of data to export - "property", "units", or "floorplans"
  • output_format: File format - "csv", "jsonl", or "parquet"
  • start_date / end_date: Date range in YYYY-MM-DD format
  • property_ids: Array of property IDs (use Property Lookup to find IDs)
  • callback_url (optional): Webhook URL to receive notification when job completes
The response includes a job_id that you’ll use to check status and download results:
{
  "job_id": "ed49987b-6539-4e9e-8a63-2bcd811ef0ec",
  "status": "submitted",
  "report_type": "property",
  "output_format": "jsonl",
  ...
}
Jobs are processed asynchronously. The initial status will be "submitted" and will transition to "succeeded", "failed", or "cancelled".
If you provide a callback_url, you’ll receive a POST request to that URL when the job completes. The POST body will contain the same job information as the Get Batch Job Status endpoint, including the final status and any error messages.
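The same request can be issued programmatically. The sketch below is illustrative, not part of the API client: it validates the job parameters client-side (catching the 400-level errors described under troubleshooting before a request is made) and shows where the POST would go. The endpoint URL is copied from the curl example; the helper name is our own.

```python
import re

# Endpoint from the curl example above.
API_URL = "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs"
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def build_job_payload(report_type, output_format, start_date, end_date,
                      property_ids, callback_url=None):
    """Validate inputs client-side and return the JSON body for job creation."""
    if report_type not in ("property", "units", "floorplans"):
        raise ValueError(f"unknown report_type: {report_type}")
    if output_format not in ("csv", "jsonl", "parquet"):
        raise ValueError(f"unknown output_format: {output_format}")
    if not (DATE_RE.match(start_date) and DATE_RE.match(end_date)):
        raise ValueError("dates must be in YYYY-MM-DD format")
    if not property_ids:
        raise ValueError("property_ids must be a non-empty array")
    payload = {
        "report_type": report_type,
        "output_format": output_format,
        "start_date": start_date,
        "end_date": end_date,
        "property_ids": list(property_ids),
    }
    if callback_url:
        payload["callback_url"] = callback_url
    return payload

# To submit (requires the third-party `requests` package and a valid API key):
# import requests
# resp = requests.post(API_URL,
#                      headers={"Authorization": "Bearer <your_api_key>"},
#                      json=build_job_payload("property", "jsonl",
#                                             "2025-01-01", "2025-12-31",
#                                             [99001416, 99001470, 99004737]))
# job_id = resp.json()["job_id"]
```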

2) Check job status

Poll the job status endpoint to monitor progress. Check the status field to determine when the job is complete.
curl -H "Authorization: Bearer <your_api_key>" \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs/ed49987b-6539-4e9e-8a63-2bcd811ef0ec"
Status values:
  • "submitted": Job is queued for processing
  • "succeeded": Job completed successfully - ready to download
  • "failed": Job failed - check error_message field
  • "cancelled": Job was cancelled before completion
Poll periodically until the status reaches a terminal state ("succeeded", "failed", or "cancelled"). For large jobs (many properties or long date ranges), expect longer processing times.
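A polling loop with exponential backoff keeps request volume low while a job runs. This is a minimal sketch: the function names and the backoff parameters are our own choices, and `get_status` is any callable you supply that returns the current status string (for example, by wrapping a GET to the status endpoint shown above).

```python
import time

TERMINAL_STATES = {"succeeded", "failed", "cancelled"}

def backoff_delays(base=5, cap=60, factor=2):
    """Yield an exponentially growing polling delay, capped at `cap` seconds."""
    delay = base
    while True:
        yield delay
        delay = min(delay * factor, cap)

def wait_for_job(get_status, base=5, cap=60, max_polls=120):
    """Poll until the job reaches a terminal state and return that state.

    `get_status` is a zero-argument callable returning the job's current
    status string, e.g. one that GETs .../bulk_api/jobs/<job_id> and
    reads the "status" field from the JSON response.
    """
    delays = backoff_delays(base, cap)
    for _ in range(max_polls):
        status = get_status()
        if status in TERMINAL_STATES:
            return status
        time.sleep(next(delays))
    raise TimeoutError("job did not reach a terminal state within the polling budget")
```

Treat "failed" and "cancelled" as stop conditions too; polling only for "succeeded" would loop until timeout on a failed job.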

3) Download results

When the job status is "succeeded", download the results file. The endpoint returns a redirect to a pre-signed S3 URL valid for 7 days.
curl -L -H "Authorization: Bearer <your_api_key>" \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs/ed49987b-6539-4e9e-8a63-2bcd811ef0ec/results" \
  -o property_data.jsonl
Use the -L flag to follow the redirect to the S3 URL. The file contains your exported data in the format you specified.
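Once downloaded, a JSONL export is one JSON object per line and can be parsed with the standard library alone. A small reader sketch (the function name and file path are illustrative):

```python
import json

def read_jsonl(path):
    """Parse a JSONL export: one JSON object per non-empty line."""
    records = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:  # tolerate trailing blank lines
                records.append(json.loads(line))
    return records

# records = read_jsonl("property_data.jsonl")
```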

List all jobs

View all batch jobs for your account with pagination support.
curl -H "Authorization: Bearer <your_api_key>" \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs?page=1&per_page=50"
This returns a list of jobs ordered by creation date (newest first) with pagination metadata. Useful for:
  • Tracking multiple concurrent exports
  • Finding job IDs for jobs created earlier
  • Monitoring job success rates and errors
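When an account has more jobs than fit on one page, you can walk the paginated listing with a simple generator. This sketch makes no assumptions about the response envelope: `fetch_page(page, per_page)` is any callable you supply that returns the list of jobs for that page (e.g. wrapping a GET to the jobs endpoint with the `page` and `per_page` query parameters shown above).

```python
def iter_jobs(fetch_page, per_page=50):
    """Yield every job across all pages of the listing.

    `fetch_page(page, per_page)` must return the list of jobs for
    that page; an empty or short page ends the iteration.
    """
    page = 1
    while True:
        jobs = fetch_page(page, per_page)
        if not jobs:
            return
        yield from jobs
        if len(jobs) < per_page:  # last (partial) page
            return
        page += 1
```

If your client exposes the pagination metadata directly (total pages or a next-page link), prefer that over the short-page heuristic used here.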

Cancel a running job

Cancel a job that’s still processing if you no longer need the results.
curl -X DELETE \
  -H "Authorization: Bearer <your_api_key>" \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs/b7b05e3d-e582-4c04-acb9-e1662ecf40f8"
Once cancelled, a job cannot be resumed. You’ll need to create a new job to get the data.

Report types and data included

Property reports (report_type: "property"):
  • Daily snapshots of property-level metrics including rent, occupancy, inventory, concessions
  • Physical characteristics, ownership, management, and demographics
  • Bedroom-level breakdowns for rent, availability, and days on market
Unit reports (report_type: "units"):
  • Individual unit-level data with rent, availability, concessions, and unit characteristics
  • Floorplan details, square footage, and amenities
  • Lease terms and pricing history
Floorplan reports (report_type: "floorplans"):
  • Aggregated metrics by floorplan (bedroom/bathroom configuration)
  • Unit count, availability, and pricing by floorplan type
  • Historical trends for specific unit mixes

Choosing an output format

  • CSV: Best for Excel, Google Sheets, or simple SQL imports. Human-readable.
  • JSONL: Best for modern data pipelines, streaming ingestion, and APIs. One JSON object per line.
  • Parquet: Best for analytics platforms (Snowflake, BigQuery, Databricks). Columnar format with compression.
Parquet files are typically 5-10x smaller than CSV for the same data and query faster in analytics tools.

Example workflow: Monthly data sync

Many teams use batch jobs to maintain a data warehouse with monthly updates:
  1. On the 1st of each month, create a job for the previous month’s data
  2. Use callback_url to trigger your ETL pipeline when the job completes
  3. Load the downloaded file into your data warehouse
  4. Update dashboards and reports with fresh data
# Automated monthly export (example for January 2025)
curl -X POST \
  -H "Authorization: Bearer <your_api_key>" \
  -H "Content-Type: application/json" \
  -d '{
    "report_type": "property",
    "output_format": "parquet",
    "start_date": "2025-01-01",
    "end_date": "2025-01-31",
    "property_ids": [/* your portfolio property IDs */],
    "callback_url": "https://your-etl.example.com/webhooks/apartmentiq"
  }' \
  "https://data.apartmentiq.io/apartmentiq/api/v1/bulk_api/jobs"

Best practices

  • Batch requests efficiently: Request data for multiple properties in a single job rather than creating separate jobs per property
  • Use webhooks: Set callback_url to avoid polling - you’ll receive a notification when the job completes
  • Download promptly: Pre-signed download URLs expire after 7 days
  • Handle failures gracefully: Check the error_message field if a job fails and retry with adjusted parameters
  • Monitor quota: Large exports count against your export quota limits

Errors and troubleshooting

Common error scenarios:
  • 400 - Invalid parameters: Check that start_date and end_date are in YYYY-MM-DD format and property_ids is a non-empty array
  • 401 - Unauthorized: Verify your bearer token is valid and not expired
  • 403 - Forbidden: Ensure you have permissions to create batch jobs and access the specified properties
  • 404 - Job not found: The job_id may be incorrect or the job may belong to a different account
See Errors for more details.
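In client code, these scenarios map naturally to distinct recovery actions. The dispatch below is a sketch of that mapping; the action names are our own, and the retry-on-5xx suggestion is a general HTTP convention rather than documented behavior of this API.

```python
def classify_error(status_code):
    """Map a batch-jobs API error status to a suggested client action."""
    actions = {
        400: "fix_request",    # invalid parameters: check date format, property_ids
        401: "refresh_token",  # bearer token invalid or expired
        403: "check_access",   # missing permission for jobs or these properties
        404: "check_job_id",   # wrong job_id, or job belongs to another account
    }
    if status_code in actions:
        return actions[status_code]
    if 500 <= status_code < 600:
        return "retry_with_backoff"  # transient server error (general convention)
    return "inspect_response"
```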