- Export time-series property data for trend analysis and reporting
- Build historical datasets for machine learning models
- Create custom dashboards with historical performance metrics
- Sync property data into data warehouses or business intelligence tools
- Analyze rent growth, occupancy trends, and market dynamics over time
Prerequisites
- Your API key (Bearer token)
- Property IDs for the properties you want data for
- Date range for the data export (start_date and end_date)
1) Create a batch job
Submit a batch job request specifying the report type, output format, date range, and property IDs.
- `report_type`: Type of data to export - `"property"`, `"units"`, or `"floorplans"`
- `output_format`: File format - `"csv"`, `"jsonl"`, or `"parquet"`
- `start_date` / `end_date`: Date range in YYYY-MM-DD format
- `property_ids`: Array of property IDs (use Property Lookup to find IDs)
- `callback_url` (optional): Webhook URL that receives a notification when the job completes
The response includes a `job_id` that you’ll use to check status and download results.
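The request can be sketched in Python with the standard library. The base URL and the `/batch-jobs` path are placeholders, not the documented endpoint - substitute the values from your API reference:

```python
import json
import urllib.request

API_BASE = "https://api.example.com/v1"  # placeholder base URL - use your provider's host


def build_job_request(report_type, output_format, start_date, end_date,
                      property_ids, callback_url=None):
    """Assemble and validate the batch job payload before sending it."""
    if report_type not in {"property", "units", "floorplans"}:
        raise ValueError(f"unsupported report_type: {report_type}")
    if output_format not in {"csv", "jsonl", "parquet"}:
        raise ValueError(f"unsupported output_format: {output_format}")
    if not property_ids:
        raise ValueError("property_ids must be a non-empty array")
    payload = {
        "report_type": report_type,
        "output_format": output_format,
        "start_date": start_date,   # YYYY-MM-DD
        "end_date": end_date,       # YYYY-MM-DD
        "property_ids": property_ids,
    }
    if callback_url:
        payload["callback_url"] = callback_url
    return payload


def create_job(api_key, payload):
    """POST the payload to the batch jobs endpoint and return the parsed response."""
    req = urllib.request.Request(
        f"{API_BASE}/batch-jobs",  # hypothetical path - check your API reference
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Validating parameters client-side catches the most common 400 errors (bad format names, empty `property_ids`) before a request is spent.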
Jobs are processed asynchronously. The initial status will be "submitted" and will transition to "succeeded", "failed", or "cancelled".
2) Check job status
Poll the job status endpoint to monitor progress. Check the `status` field to determine when the job is complete.
"submitted": Job is queued for processing"succeeded": Job completed successfully - ready to download"failed": Job failed - checkerror_messagefield"cancelled": Job was cancelled before completion
3) Download results
When the job status is "succeeded", download the results file. The endpoint returns a redirect to a pre-signed S3 URL that is valid for 7 days.
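A download sketch in Python, with a hypothetical endpoint path. It resolves the redirect manually so the Bearer token is sent only to the API host and never forwarded to S3, where the pre-signed query string alone authorizes the fetch:

```python
import shutil
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Surface the 3xx response instead of following it automatically."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None


def output_filename(job_id, output_format):
    """Pick a local filename whose extension matches the requested format."""
    if output_format not in {"csv", "jsonl", "parquet"}:
        raise ValueError(f"unsupported output_format: {output_format}")
    return f"{job_id}.{output_format}"


def download_results(api_key, job_id, output_format):
    """Resolve the redirect to the pre-signed S3 URL, then fetch the file."""
    req = urllib.request.Request(
        f"https://api.example.com/v1/batch-jobs/{job_id}/download",  # hypothetical path
        headers={"Authorization": f"Bearer {api_key}"},
    )
    opener = urllib.request.build_opener(NoRedirect)
    try:
        opener.open(req)
        raise RuntimeError("expected a redirect to a pre-signed URL")
    except urllib.error.HTTPError as err:
        if err.code not in (301, 302, 303, 307, 308):
            raise
        signed_url = err.headers["Location"]
    dest = output_filename(job_id, output_format)
    with urllib.request.urlopen(signed_url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out)
    return dest
```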
If you download with curl, pass the -L flag to follow the redirect to the S3 URL. The file contains your exported data in the format you specified.
List all jobs
View all batch jobs for your account with pagination support. This is useful for:
- Tracking multiple concurrent exports
- Finding job IDs for jobs created earlier
- Monitoring job success rates and errors
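A listing sketch, again with a hypothetical path; the pagination parameter names (`page`, `per_page`) are assumptions to verify against your API reference. The filter helper supports the monitoring use case above:

```python
import json
import urllib.request
from urllib.parse import urlencode


def list_jobs(api_key, page=1, per_page=50):
    """List batch jobs for the account, one page at a time."""
    query = urlencode({"page": page, "per_page": per_page})
    req = urllib.request.Request(
        f"https://api.example.com/v1/batch-jobs?{query}",  # hypothetical path
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def failed_jobs(jobs):
    """Filter a page of job records down to failures for monitoring."""
    return [job for job in jobs if job.get("status") == "failed"]
```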
Cancel a running job
Cancel a job that’s still processing if you no longer need the results.
Report types and data included
Property reports (report_type: "property"):
- Daily snapshots of property-level metrics including rent, occupancy, inventory, concessions
- Physical characteristics, ownership, management, and demographics
- Bedroom-level breakdowns for rent, availability, and days on market
Unit reports (report_type: "units"):
- Individual unit-level data with rent, availability, concessions, and unit characteristics
- Floorplan details, square footage, and amenities
- Lease terms and pricing history
Floorplan reports (report_type: "floorplans"):
- Aggregated metrics by floorplan (bedroom/bathroom configuration)
- Unit count, availability, and pricing by floorplan type
- Historical trends for specific unit mixes
Choosing an output format
- CSV: Best for Excel, Google Sheets, or simple SQL imports. Human-readable.
- JSONL: Best for modern data pipelines, streaming ingestion, and APIs. One JSON object per line.
- Parquet: Best for analytics platforms (Snowflake, BigQuery, Databricks). Columnar format with compression.
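Once downloaded, CSV and JSONL exports can be read with the standard library alone; this sketch dispatches on the format you requested (Parquet is columnar and needs a dedicated reader such as pyarrow, so it is deliberately left out):

```python
import csv
import io
import json


def iter_records(raw_bytes, output_format):
    """Yield one dict per exported record, dispatching on the file format."""
    if output_format == "csv":
        # Header row becomes the dict keys; all values arrive as strings.
        yield from csv.DictReader(io.StringIO(raw_bytes.decode("utf-8")))
    elif output_format == "jsonl":
        # One JSON object per line; types survive (numbers stay numbers).
        for line in raw_bytes.decode("utf-8").splitlines():
            if line.strip():
                yield json.loads(line)
    else:
        raise ValueError(f"use a columnar reader (e.g. pyarrow) for {output_format}")
```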
Example workflow: Monthly data sync
Many teams use batch jobs to maintain a data warehouse with monthly updates:
- On the 1st of each month, create a job for the previous month’s data
- Use `callback_url` to trigger your ETL pipeline when the job completes
- Load the downloaded file into your data warehouse
- Update dashboards and reports with fresh data
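Computing the previous month’s `start_date` and `end_date` is the only fiddly step in this schedule (month lengths vary); a small helper handles it:

```python
from datetime import date, timedelta


def previous_month_range(today):
    """Return (start_date, end_date) ISO strings covering the previous calendar month."""
    first_of_this_month = today.replace(day=1)
    last_of_prev = first_of_this_month - timedelta(days=1)   # steps back into last month
    first_of_prev = last_of_prev.replace(day=1)
    return first_of_prev.isoformat(), last_of_prev.isoformat()
```

Run on the 1st of the month, `previous_month_range(date.today())` yields the exact YYYY-MM-DD pair the job request expects, including leap-year Februaries.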
Best practices
- Batch requests efficiently: Request data for multiple properties in a single job rather than creating separate jobs per property
- Use webhooks: Set `callback_url` to avoid polling - you’ll receive a notification when the job completes
- Download promptly: Pre-signed download URLs expire after 7 days
- Handle failures gracefully: Check the `error_message` field if a job fails and retry with adjusted parameters
- Monitor quota: Large exports count against your export quota limits
Errors and troubleshooting
Common error scenarios:
- 400 - Invalid parameters: Check that `start_date` and `end_date` are in YYYY-MM-DD format and `property_ids` is a non-empty array
- 401 - Unauthorized: Verify your bearer token is valid and not expired
- 403 - Forbidden: Ensure you have permissions to create batch jobs and access the specified properties
- 403 - Forbidden: Ensure you have permissions to create batch jobs and access the specified properties
- 404 - Job not found: The job_id may be incorrect or the job may belong to a different account
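These scenarios split cleanly into two groups for retry logic: the 4xx errors above describe problems with the request itself (parameters, auth, permissions, wrong `job_id`) that no retry will fix, while 5xx responses and 429 rate limits are transient. A minimal classifier:

```python
def should_retry(status_code):
    """Decide whether a failed batch job request is worth retrying.

    429 (rate limited) and 5xx (server-side) errors are transient;
    other 4xx errors need the request fixed first, not repeated.
    """
    if status_code == 429:
        return True
    return 500 <= status_code < 600
```

Pair this with exponential backoff between attempts, and surface non-retryable errors (with their `error_message`) to the operator instead of looping on them.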