fix: deduplicate jobs from getJobsForWorker to prevent parallel execution of same job #110

Open

oharbo wants to merge 1 commit into SimonErm:master from oharbo:fix/deduplicate-jobs-from-getJobsForWorker

Conversation

oharbo commented Mar 16, 2026

Problem

When a worker has concurrency > 1 and only one job is in the queue, the native getJobsForWorker(name, count) can return the same job multiple times to fill the requested count. The JS layer then runs that job in parallel on multiple workers, causing:

  • One execution may fail (e.g. file not found, compression error)
  • Another may succeed
  • Inconsistent state: same task id appears as both FAILED/REMOVED and UPLOAD OK
  • onQueueFinish receives duplicate entries in executedJobs
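The effect described above can be sketched with a minimal self-contained simulation. getJobsForWorkerSim and the JobResult shape are illustrative stand-ins, not the library's actual API: they only model a native call that pads its result with the same job when the queue is shorter than the requested count.

```typescript
interface JobResult {
  id: string;
  outcome: "OK" | "FAILED";
}

// Illustrative stand-in for the native getJobsForWorker(name, count):
// with one queued job and count = 2, it returns that job twice.
function getJobsForWorkerSim(queued: string[], count: number): string[] {
  const out: string[] = [];
  for (let i = 0; i < count; i++) {
    out.push(queued[i % queued.length]);
  }
  return out;
}

// Both copies run in parallel; one may fail while the other succeeds,
// so executedJobs ends up with two conflicting entries for the same id.
const executedJobs: JobResult[] = getJobsForWorkerSim(["job-1"], 2).map(
  (id, i): JobResult => ({ id, outcome: i === 0 ? "FAILED" : "OK" })
);
```

Here executedJobs holds two entries for "job-1" with conflicting outcomes, matching the inconsistent FAILED/UPLOAD OK state reported above.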

Solution

Deduplicate jobs by id before execution. Keep only the first occurrence of each job id.
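A minimal sketch of that deduplication, keeping the first occurrence of each id. The Job shape and the dedupeJobsById name are illustrative, not the PR's actual identifiers:

```typescript
interface Job {
  id: string;
  workerName: string;
  payload: string;
}

// Drop any job whose id has already been seen, preserving order,
// so each job is handed to at most one worker per fetch.
function dedupeJobsById(jobs: Job[]): Job[] {
  const seen = new Set<string>();
  return jobs.filter((job) => {
    if (seen.has(job.id)) {
      return false;
    }
    seen.add(job.id);
    return true;
  });
}

// If the native layer returns the same job twice to fill concurrency = 2,
// only one copy survives:
const fromNative: Job[] = [
  { id: "job-1", workerName: "upload", payload: "{}" },
  { id: "job-1", workerName: "upload", payload: "{}" },
];
console.log(dedupeJobsById(fromNative).length); // 1
```

A Set keeps the check O(1) per job, and filtering (rather than sorting or grouping) preserves the order in which the native layer returned the jobs.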

Reproduction

  1. Register a worker with concurrency: 2
  2. Add a single job to the queue
  3. Observe the job being executed twice in parallel (e.g. via logs or onQueueFinish with 2 entries for the same job id)

