Explore the engine that turns messy file dumps into structured Postgres assets. Every feature is designed to prevent data loss and eliminate engineering glue code.
Stop writing custom parsers for every new file format. Content Atlas streams data directly from AWS S3, processing massive files in fixed-size chunks so memory usage stays flat no matter the file size.
Upload a single .zip and we'll auto-detect schemas for every file inside, keeping grouped data mapped together.
Network blip? No problem. Our chunked streamer resumes exactly where it left off.
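A minimal sketch of the idea (our own illustration, not the production streamer): read in fixed-size chunks and checkpoint the byte offset after each one, so a retry restarts from the last committed byte rather than the beginning.

```javascript
// Illustrative sketch only: a chunked reader that tracks a resume
// offset. Restarting with the saved offset continues exactly where
// the last successful chunk ended.
function* chunkedRead(buffer, chunkSize, startOffset = 0) {
  let offset = startOffset;
  while (offset < buffer.length) {
    const end = Math.min(offset + chunkSize, buffer.length);
    yield { offset, chunk: buffer.slice(offset, end) };
    offset = end; // checkpoint: persist this to resume after a blip
  }
}
```

With S3, the same pattern maps onto HTTP Range requests (`bytes=start-end`), so no chunk needs to be re-downloaded after a resume.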
Content Atlas creates a unique "fingerprint" for every file structure. Map a provider's format once, and future uploads are matched automatically—even if the column order changes.
We analyze data types (Integers, ISO Dates, Booleans) and build optimized Postgres tables on the fly.
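As a toy illustration of the kind of analysis involved (the real analyzer is certainly more thorough), per-value type detection can be as simple as:

```javascript
// Toy sketch: classify a raw CSV string as one of the types the
// engine detects. Real inference also samples many rows and handles
// nulls, locales, and mixed columns.
function inferType(value) {
  if (/^-?\d+$/.test(value)) return 'integer';
  if (/^\d{4}-\d{2}-\d{2}$/.test(value)) return 'iso_date';
  if (/^(true|false)$/i.test(value)) return 'boolean';
  return 'text';
}
```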
Rename columns, cast types, or ignore fields without altering the original source file in S3.
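A hypothetical mapping config illustrating the idea (field names are ours, not documented API): transformations live in the mapping, and the S3 object is never rewritten.

```javascript
// Hypothetical config shape, for illustration only: rename, cast, or
// drop columns at import time; the original file in S3 is untouched.
const mapping = {
  columns: {
    'First Name':     { rename: 'first_name' },
    'Signup':         { rename: 'signup_date', cast: 'date' },
    'Internal Notes': { ignore: true },
  },
};
```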
Silent drops are the enemy of data integrity. We provide row-level and file-level detection with explicit error reporting, so you know exactly why a record wasn't imported.
We calculate a SHA-256 hash of every uploaded file. If the same file is uploaded twice, we flag it immediately, before processing begins.
Define unique keys (e.g., Email + Date) in your mapping config. Rows violating these constraints are diverted to an error log, not discarded.
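The divert-don't-discard rule can be sketched like this (our illustration; the key fields are examples):

```javascript
// Illustrative sketch: rows violating the composite unique key are
// collected with an explicit error message, never silently dropped.
function partitionByUniqueKey(rows, keyFields) {
  const seen = new Set();
  const accepted = [];
  const errors = [];
  for (const row of rows) {
    const key = keyFields.map((f) => row[f]).join('\u0000');
    if (seen.has(key)) {
      errors.push({ row, error: `duplicate key: ${keyFields.join(' + ')}` });
    } else {
      seen.add(key);
      accepted.push(row);
    }
  }
  return { accepted, errors };
}
```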
Download a generated CSV containing only the failed rows, complete with specific error messages for each line.
Content Atlas isn't just a UI—it's a headless ingestion engine. Integrate imports directly into your own SaaS product or internal tools using our REST API.
// Trigger an async import job
const response = await fetch('https://api.consuly.ai/v1/ingest', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer sk_live_...',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    source_url: 's3://my-bucket/leads_q3.csv',
    schema_id: 'sch_882910',
    options: {
      deduplicate: true,
      notify_webhook: 'https://myapp.com/hooks/done'
    }
  })
});

const { job_id } = await response.json();
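When the job finishes, the notify_webhook URL receives a callback. The payload shape below ({ job_id, status }) is our assumption for illustration; check the API reference for the actual schema.

```javascript
// Hypothetical webhook handler: decide what to do when an import job
// finishes. Payload fields are assumed, not documented.
function handleIngestWebhook(rawBody) {
  const event = JSON.parse(rawBody);
  if (event.status === 'failed') {
    // e.g. fetch the generated error CSV for the failed rows
    return { action: 'download_error_csv', jobId: event.job_id };
  }
  return { action: 'none', jobId: event.job_id };
}
```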