feat(connector): [31e88f42] Implement robust de-duplication at ingress

This commit addresses the issue of duplicate jobs being created by the SuperOffice connector.

The root cause was identified as a race condition: SuperOffice would send multiple webhooks in quick succession for the same entity, producing multiple identical jobs in the queue.

The solution involves several layers of improvement:
1.  **Ingress De-duplication:** The  now checks for existing  jobs for the same entity *before* adding a new job to the queue. This is the primary fix and prevents duplicates at the source.
2.  **DB Schema Enhancement:** The  table schema in  was extended with an  column to allow for reliable and efficient checking of duplicate entities.
3.  **Improved Logging:** The log messages in  for job retries (e.g., when waiting for the Company Explorer) have been made more descriptive to avoid confusion and false alarms.
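Points 1 and 2 above can be sketched together as a pending-job duplicate check backed by a de-duplication column. This is a minimal illustration, not the connector's actual code: the `JobQueue` class, `jobs` table, `entity_id` column, `PENDING` status, and the `PrimaryKey` payload field are all assumed names.

```python
import sqlite3


class JobQueue:
    """Minimal sketch of a job queue with ingress de-duplication.

    All schema and payload names here are illustrative assumptions,
    not the connector's real schema.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS jobs ("
            "  id INTEGER PRIMARY KEY,"
            "  event_type TEXT NOT NULL,"
            "  entity_id TEXT NOT NULL,"  # column added to make dedup checks cheap
            "  status TEXT NOT NULL DEFAULT 'PENDING')"
        )

    def is_duplicate_pending(self, event_type, payload):
        # A job is a duplicate if a PENDING job already targets the
        # same entity for the same event type.
        entity_id = str(payload.get("PrimaryKey", ""))
        row = self.db.execute(
            "SELECT 1 FROM jobs"
            " WHERE event_type = ? AND entity_id = ? AND status = 'PENDING'"
            " LIMIT 1",
            (event_type, entity_id),
        ).fetchone()
        return row is not None

    def add_job(self, event_type, payload):
        self.db.execute(
            "INSERT INTO jobs (event_type, entity_id) VALUES (?, ?)",
            (event_type, str(payload.get("PrimaryKey", ""))),
        )
        self.db.commit()
```

With this shape, the webhook handler can call `is_duplicate_pending` before `add_job`, so a second identical webhook arriving during the race window is skipped instead of enqueued.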
2026-03-09 12:50:32 +00:00
parent f35a702216
commit 65aaff3936
3 changed files with 61 additions and 12 deletions


@@ -34,6 +34,11 @@ async def receive_webhook(request: Request, background_tasks: BackgroundTasks):
     event_type = payload.get("Event", "unknown")
+    # --- DEDUPLICATION AT INGRESS (Added March 2026) ---
+    # Before adding a job, check if an identical one is already pending.
+    if queue.is_duplicate_pending(event_type, payload):
+        return {"status": "skipped_duplicate"}
+
     # Add to local Queue
     queue.add_job(event_type, payload)
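The hunk above boils down to a check-then-enqueue pattern. As a self-contained sketch with an in-memory stand-in for the queue (the `Queue` class and the `PrimaryKey` payload field are assumptions for illustration, not the connector's real implementation):

```python
class Queue:
    """In-memory stand-in for the connector's job queue (illustrative only)."""

    def __init__(self):
        self.pending = []

    def is_duplicate_pending(self, event_type, payload):
        # A duplicate is a pending job with the same event type and entity key.
        key = (event_type, payload.get("PrimaryKey"))
        return any((e, p.get("PrimaryKey")) == key for e, p in self.pending)

    def add_job(self, event_type, payload):
        self.pending.append((event_type, payload))


queue = Queue()


def receive_webhook(payload):
    # Mirrors the diff: skip before enqueueing if a duplicate is pending.
    event_type = payload.get("Event", "unknown")
    if queue.is_duplicate_pending(event_type, payload):
        return {"status": "skipped_duplicate"}
    queue.add_job(event_type, payload)
    return {"status": "queued"}
```

Replaying the same webhook twice enqueues exactly one job; the second call short-circuits with `skipped_duplicate`.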