# Runbook: Reviews Count Mismatch Between Channel and Reviewku Dashboard
| Service | Reviewku x Tripla Review |
|---|---|
| Owner Team slack handle | @bnl-dev-bali |
| Team's Slack Channel | #bnl-teams-b |
## Table of Contents
- [[#Important Links]]
- [[#1. Triage]]
- [[#2. Decision Point]]
- [[#3. False Alarm]]
- [[#4. True Incident]]
- [[#4.1. Recover the System]]
- [[#4.2. Clean up]]
## Important Links
| Alert | `` |
|---|---|
| Webscraper Dashboard | Webscraper Dashboard URL |
| Reviewku Worker Logs | Worker Service Logs URL |
| Review API | https://review-api.bookandlink.com/h/ws/notif |
## 1. Triage
Goal: Determine whether the mismatch is caused by scraping failure, webhook delivery failure, or queue processing failure.
### Step A - Validate Scraping Result
- Log in to the Webscraper dashboard.
- Go to Jobs menu.
- Type property ID in the search field.
- Click Inspect.
Verify the following values are 0:
- Failed pages
- Empty pages
- No value pages
If any value > 0 → scraping likely failed.
Next, go to Data Preview tab and ensure these fields are NOT empty:
- `name`
- `date`
- `title`
- `pros` (or `rev`, depending on channel)
- `rating`
If data preview is empty → scraping invalid.
### Step B - Validate Webhook Delivery
- Go to the Details tab.
- Copy the Job ID (example: `39138680`).
- Go to API → Webhook Logs.
- Paste the Job ID.
Confirm:
- Status Code: `200`
- Message: `OK`

The response body should be:

```json
{"action taken":"Processing Data","custom id":"booking.com-2235-production","job id":"39138680","site map id":"1135196","status":"finished"}
```

Custom ID format: `{channel-name}-{propID}-{environment}`
If Status ≠ 200 → webhook failed.
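The custom ID can be assembled and sanity-checked in the shell before comparing it against webhook logs. A minimal sketch; the channel, property ID, and environment values below are examples (only `production` appears in this runbook):

```shell
# Assemble a custom_id in the documented {channel-name}-{propID}-{environment} format.
CHANNEL='booking.com'   # example channel name
PROP_ID=2235            # example property ID
ENV=production          # only "production" appears in this runbook
CUSTOM_ID="${CHANNEL}-${PROP_ID}-${ENV}"
echo "$CUSTOM_ID"       # booking.com-2235-production

# Reject malformed IDs before pasting them into a webhook payload:
# anything, a dash, digits, a dash, a lowercase environment name.
printf '%s' "$CUSTOM_ID" | grep -Eq '^.+-[0-9]+-[a-z]+$' && echo "format ok"
```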
### Step C - Validate Queue Processing
Go to Reviewku Worker Service Logs.
Search using property ID.
If you find log lines like:

```
Message received: map[channel:traveloka event:download-scraped-data from:2026-02-20 property_id:3065 scraping_id:39138585 to:2026-02-20 trigger:daily_cron]
SUCCESS: 2026/02/20 00:22:09 webScraper.go:674: Successfully inserted 27 Traveloka reviews with property ID 3065 into the main table. Failed: 0. Already exists: 13
```
Then queue processing is successful.
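Searching the worker logs by property ID needs nothing more than `grep`. A sketch over a locally saved log file (the file name is illustrative; the log line is the example from above):

```shell
# Write a sample worker log line to a local file for demonstration.
cat > worker.log <<'EOF'
SUCCESS: 2026/02/20 00:22:09 webScraper.go:674: Successfully inserted 27 Traveloka reviews with property ID 3065 into the main table. Failed: 0. Already exists: 13
EOF

# Filter the log for a given property ID.
PROPERTY_ID=3065
grep "property ID ${PROPERTY_ID}" worker.log
```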
## 2. Decision Point
- IF scraping result invalid (failed pages > 0 or preview empty) ➡️ Go to: [[#4. True Incident]]
- IF webhook response not 200 OK ➡️ Go to: [[#4. True Incident]]
- IF worker logs show successful insertion and the review count still mismatches ➡️ Go to: [[#4. True Incident]]
- IF scraping valid, webhook 200, and worker inserted successfully ➡️ Go to: [[#3. False Alarm]]
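The decision table above can be sketched as a small shell script. The flag names and example values are illustrative placeholders you would fill in from the results of Steps A–C:

```shell
# Triage outcome sketch. Set these flags from Steps A-C (example values below).
FAILED_PAGES=0          # Step A: failed / empty / no-value page counters
PREVIEW_EMPTY=no        # Step A: is the data preview empty?
WEBHOOK_STATUS=200      # Step B: webhook status code
WORKER_INSERTED=yes     # Step C: SUCCESS log present?
MISMATCH_PERSISTS=no    # does the count still mismatch after all checks?

if [ "$FAILED_PAGES" -gt 0 ] || [ "$PREVIEW_EMPTY" = yes ]; then
  OUTCOME="true incident: scraping failed"
elif [ "$WEBHOOK_STATUS" -ne 200 ]; then
  OUTCOME="true incident: webhook failed"
elif [ "$WORKER_INSERTED" != yes ] || [ "$MISMATCH_PERSISTS" = yes ]; then
  OUTCOME="true incident: queue or insert problem"
else
  OUTCOME="false alarm: likely caching or duplicate filtering"
fi
echo "$OUTCOME"
```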
## 3. False Alarm
If:
- All scraping failure counters (failed, empty, no-value pages) are 0
- Data preview valid
- Webhook status 200
- Worker inserted successfully
Then mismatch likely due to:
- Duplicate filtering
- Already exists count
- Channel UI caching delay
Actions:
- Wait 10–15 minutes.
- Refresh Reviewku dashboard.
- Compare with channel again.
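An apparent mismatch may simply mean the channel total equals inserted plus already-existing reviews. A quick arithmetic check against the SUCCESS log line (the line content is the example from Step C):

```shell
# Parse the inserted and already-exists counts out of a worker SUCCESS line.
LINE="Successfully inserted 27 Traveloka reviews with property ID 3065 into the main table. Failed: 0. Already exists: 13"
INSERTED=$(printf '%s' "$LINE" | sed -n 's/.*Successfully inserted \([0-9]*\) .*/\1/p')
EXISTS=$(printf '%s' "$LINE" | sed -n 's/.*Already exists: \([0-9]*\).*/\1/p')
echo "scraped total: $((INSERTED + EXISTS))"   # compare this against the channel's count
```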
Post in Slack:

```
Review count mismatch investigated.
Scraping and queue processing successful.
Likely caching or duplicate filtering.
Monitoring.
```
## 4. True Incident
Mismatch caused by one of:
- Scraping failure (captcha blocked)
- Webhook not delivered
- Queue handler failure
- Data not inserted
Primary objective: Restore review synchronization.
### 4.1. Recover the System
#### Case 1 - Scraping Failed (Captcha Blocking)
**Diagnostic Steps**
- Failed pages > 0
- Empty pages > 0
- Data preview empty

**Remediation Plan**
- Log in to the Webscraper dashboard.
- Go to Jobs.
- Search the property ID.
- Click Inspect.
- Open the Continue tab.
- Change the proxy to another available proxy.
- Click Continue Scraping.

**Verification**
- Failed pages = 0
- Data preview populated
#### Case 2 - Webhook Failed
**Diagnostic Steps**
- Webhook status ≠ 200
- No processing response

**Remediation Plan**
Re-trigger manually via terminal:

```shell
curl -X POST 'https://review-api.bookandlink.com/h/ws/notif' \
  -A 'webscraper.io/v1' \
  -H 'Content-Type: application/x-www-form-urlencoded' \
  -H 'Signature: faebe19220e9cb22d40e9280e4083ae4f1749dc9a84267fbadd44096cf88cbbc' \
  -d 'scrapingjob_id={scrapingJobID}&status=finished&sitemap_id={sitemapID}&sitemap_name={sitemapName}&custom_id={channelName}-{propID}-production'
```

Values are found under Jobs → Scraping job details → Detail tab:
- Job ID → `scrapingJobID`
- Sitemap ID → `sitemapID`
- Sitemap → `sitemap_name`

Expected response:

```json
{"action taken":"Processing Data","custom id":"expedia-64-production","job id":"39140478","site map id":"953592","status":"finished"}
```

**Verification**
- Response returns `"Processing Data"`
- Worker logs show insert success
#### Case 3 - Queue Handler Failure
**Diagnostic Steps**
- No SUCCESS log in the Worker Service

**Remediation Plan**
- Confirm the webhook was successfully sent.
- Restart the Reviewku Worker service if necessary.
- Re-trigger the webhook using the curl command above.

**Verification**
Worker logs show:

```
Successfully inserted X reviews with property ID Y into the main table.
```
### 4.2. Clean up
- Confirm review count now matches channel.
- Check for duplicate insertions.
- Monitor logs for 30 minutes.
- Post a Slack update:

  ```
  Review synchronization restored.
  Scraping, webhook, and queue processing verified.
  System stable.
  ```
- Document recurring captcha cases if frequent.