Prequel replicates data by moving it directly from one database to another, while traditional ETL tools replicate it by scraping APIs.
Efficient, fault-tolerant data transfers
Defensively engineered
Our transfer process is designed to handle edge cases that arise when working with a wide range of databases. For instance, our platform supports eventual consistency by default. When our system initiates a transfer, it creates a look-back window to capture any data that might arrive out of order.
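A look-back window of this kind can be sketched as follows. This is a minimal illustration, not Prequel's actual implementation; the window length and function names are assumptions.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical look-back window: when starting a new transfer, re-scan
# rows updated within LOOK_BACK of the last successful sync, so rows
# that arrived out of order (eventual consistency) are still captured.
LOOK_BACK = timedelta(minutes=15)  # illustrative value

def incremental_window(last_synced_at: datetime, now: datetime) -> tuple[datetime, datetime]:
    """Return the [start, end) range of updated_at values to re-read."""
    start = last_synced_at - LOOK_BACK
    return start, now

start, end = incremental_window(
    datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc),
    datetime(2024, 1, 1, 13, 0, tzinfo=timezone.utc),
)
# The window starts 15 minutes before the last sync, so a row written
# late with an earlier timestamp is still picked up on the next pass.
```

Re-reading a small overlap costs a few duplicate rows per transfer, which an upsert absorbs harmlessly, in exchange for never missing late-arriving data.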
Data integrity checks
One of Prequel's core promises is the guarantee that data in the source will be accurately replicated in the destination. Every row will be transferred, no row will be dropped, and the destination will get all changes to existing rows. Data integrity checks compare the data in the source to the data in the destination.
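One common way to compare source and destination without shipping every row twice is to compare compact per-table fingerprints. The sketch below is an assumption for illustration, not Prequel's actual check.

```python
import hashlib

# Hypothetical integrity check: compare a row count plus an
# order-independent checksum of row keys between source and destination.
def table_fingerprint(keys):
    """Return (row_count, checksum); XOR of hashes ignores row order."""
    count, acc = 0, 0
    for key in keys:
        digest = hashlib.sha256(str(key).encode()).digest()
        acc ^= int.from_bytes(digest[:8], "big")
        count += 1
    return count, acc

source = table_fingerprint([1, 2, 3])
destination = table_fingerprint([3, 2, 1])  # same rows, different order
assert source == destination  # matching fingerprints: no dropped rows
```

Because the checksum is order-independent, the two sides can scan their tables in whatever order is cheapest and still produce comparable fingerprints.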
Webhooks & alerts
Use webhooks to receive events from Prequel. Webhook endpoints created using the POST /webhooks endpoint can subscribe to any of Prequel's Event Types and configure delivery via HTTPS, Slack, or PagerDuty.
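A request to POST /webhooks might be assembled as below. The field names ("url", "event_types", "channel") and the event type strings are illustrative assumptions; consult the API reference for the actual schema.

```python
import json

# Hypothetical helper that builds the JSON body for Prequel's
# POST /webhooks endpoint. Field names are assumptions for illustration.
def build_webhook_payload(url: str, event_types: list[str], channel: str = "https") -> str:
    payload = {
        "url": url,                  # where events are delivered
        "event_types": event_types,  # which Event Types to subscribe to
        "channel": channel,          # e.g. "https", "slack", "pagerduty"
    }
    return json.dumps(payload)

body = build_webhook_payload(
    "https://example.com/prequel-events",
    ["transfer.succeeded", "transfer.failed"],  # illustrative event names
)
```

The resulting body would then be sent with any HTTP client, authenticated with your API credentials.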
Leaders trust Prequel
Prequel sends billions of rows of data every month.
Data is replicated using ephemeral workers, which read data from the source and then upsert that data into the destination. Data is reformatted in flight so that it's compatible with the destination. No data remains on Prequel's servers once the data is transferred.
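The read-reformat-upsert loop above can be sketched as follows. The reformatting rule and key names are hypothetical; the point is that an upsert inserts new rows and overwrites changed ones, so nothing is lost or duplicated.

```python
# Hypothetical sketch of the transfer loop: an ephemeral worker reads
# rows from the source, reformats them in flight for the destination,
# and upserts them keyed on a primary key. The worker holds no data
# once the batch is written.
def reformat(row: dict) -> dict:
    """Illustrative in-flight conversion, e.g. renaming a column."""
    out = dict(row)
    out["synced_id"] = out.pop("id")
    return out

def upsert(destination: dict, rows: list[dict], key: str = "synced_id") -> None:
    """Insert new rows or overwrite existing rows with the same key."""
    for row in rows:
        destination[row[key]] = row

dest: dict = {}
upsert(dest, [reformat({"id": 1, "v": "a"}), reformat({"id": 2, "v": "b"})])
upsert(dest, [reformat({"id": 1, "v": "a2"})])  # a change to an existing row
# dest now holds two rows, with row 1 reflecting the latest value "a2"
```

In a real destination warehouse the same effect is typically achieved with a MERGE or INSERT ... ON CONFLICT statement rather than a Python dictionary.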
Deployment
Multiple deployment options
Prequel can be deployed in your own infrastructure, on our cloud, or in a dedicated private cloud, depending on your requirements.