Webhooks
Crawlio supports webhooks for notifying your server when important events occur, such as the completion of crawl or scrape jobs. With webhooks, you can automate your workflows, trigger pipelines, or log job outcomes in real time.
Setup Webhooks
To start receiving webhook events:
- Go to the Dashboard → Settings → Webhooks.
- Enter your server endpoint (must support HTTPS).
- Copy your Signing Secret, which is used to verify the authenticity of incoming payloads.
Tip: Each user has a unique signing secret. Keep this secret safe and do not expose it publicly.
Events
Crawlio sends webhook events for the following job types:
| Type | Description | Payload Contents |
| --- | --- | --- |
| crawl | Sent when a crawling job completes | crawlId, status, type |
| scrape | Sent when a scraping job completes | Full job result data |
| batch-scrape | Sent when a batch scrape finishes processing | batchId, status, type |
Payload Structure
All webhook payloads follow the structure shown in the examples below:
Example Payload: scrape
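A scrape event delivers the full job result. The snippet below is an illustrative sketch only; the field names shown here, including type, status, and result, are assumptions rather than a guaranteed schema.

```json
{
  "type": "scrape",
  "status": "completed",
  "result": {
    "url": "https://example.com",
    "html": "<html>...</html>"
  }
}
```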
Example Payload: crawl
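Per the events table, a crawl completion payload includes crawlId, status, and type. The values below are illustrative, and the crawlId format is an assumption.

```json
{
  "type": "crawl",
  "status": "completed",
  "crawlId": "3f2b1c9e-example-id"
}
```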
Example Payload: batch-scrape
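Per the events table, a batch-scrape completion payload includes batchId, status, and type. Again, the values are illustrative.

```json
{
  "type": "batch-scrape",
  "status": "completed",
  "batchId": "7a8d4e2f-example-id"
}
```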
Authenticating Webhooks
Each webhook request is signed using your unique signing secret and the HMAC SHA-256 algorithm.
Signature Header
Each request includes an X-Signature header:
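For example (the hex encoding of the digest shown here is an assumption, not a confirmed format):

```
X-Signature: 4c1f9a2e6b7d3c8f0a5e9d4b2c7f1a6e3d8b0c5f9a2e7d4b1c6f0a3e8d5b9c2f
```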
Signature Generation
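A minimal Node.js sketch, assuming the signature is a hex-encoded HMAC SHA-256 digest of the raw request body:

```js
const crypto = require('crypto');

// Compute the expected signature for a raw request body (string or Buffer),
// using the signing secret from the Crawlio dashboard.
function computeSignature(rawBody, signingSecret) {
  return crypto
    .createHmac('sha256', signingSecret)
    .update(rawBody)
    .digest('hex');
}
```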
Verify this signature on your server to ensure the request came from Crawlio.
Handling Webhooks
Here's a simple Node.js/Express example of how to handle and verify a webhook:
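The sketch below is one possible implementation. It assumes the X-Signature header carries a hex-encoded HMAC SHA-256 digest of the raw request body, and the endpoint path and environment variable name are placeholders; adjust them to match your setup.

```js
const express = require('express');
const crypto = require('crypto');

const app = express();
// Placeholder environment variable name for your Crawlio signing secret.
const SIGNING_SECRET = process.env.CRAWLIO_SIGNING_SECRET;

// Capture the raw body so the signature is computed over the exact bytes received.
app.use(express.json({
  verify: (req, res, buf) => { req.rawBody = buf; }
}));

app.post('/webhooks/crawlio', (req, res) => {
  const received = req.get('X-Signature') || '';
  const expected = crypto
    .createHmac('sha256', SIGNING_SECRET)
    .update(req.rawBody)
    .digest('hex');

  // Use a constant-time comparison to avoid timing attacks.
  const valid =
    received.length === expected.length &&
    crypto.timingSafeEqual(Buffer.from(received), Buffer.from(expected));

  if (!valid) {
    return res.status(401).send('Invalid signature');
  }

  // Handle the event based on its type (crawl, scrape, batch-scrape).
  console.log('Received webhook:', req.body.type, req.body.status);
  res.status(200).send('OK');
});

app.listen(3000);
```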
Testing Webhooks
To test your webhook integration:
- Use a tool like Webhook.site or ngrok to expose a local server.
- Trigger test jobs from the Crawlio Dashboard and verify receipt and handling.
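If you want to exercise your handler before wiring up real jobs, you can also simulate a delivery locally. The sketch below posts a signed test payload to the handler above; the endpoint path, payload fields, and signature format are the same assumptions used in the earlier examples.

```js
const crypto = require('crypto');

const SIGNING_SECRET = process.env.CRAWLIO_SIGNING_SECRET;
const body = JSON.stringify({ type: 'crawl', status: 'completed', crawlId: 'cr-test-123' });

// Sign the exact bytes that will be sent, mirroring the server-side check.
const signature = crypto.createHmac('sha256', SIGNING_SECRET).update(body).digest('hex');

// Node 18+ provides a global fetch.
fetch('http://localhost:3000/webhooks/crawlio', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json', 'X-Signature': signature },
  body,
}).then((res) => console.log('Handler responded with', res.status));
```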
Summary
| Feature | Supported |
| --- | --- |
| Event Types | crawl, scrape, batch-scrape |
| Signature Headers | Yes |
| Custom Payloads | Yes (by event type) |