
fix(token): add batch keys endpoint to prevent 429 rate limit on bulk copy#4084

Open
ImogeneOctaviap794 wants to merge 1 commit into QuantumNous:main from ImogeneOctaviap794:fix/batch-copy-token-keys-rate-limit

Conversation


ImogeneOctaviap794 commented Apr 4, 2026

Problem

When users batch-copy 50+ token keys from the token management page, the frontend fires N individual POST /api/token/:id/key requests simultaneously via Promise.all. The CriticalRateLimit middleware (20 requests per 20 minutes per IP) rejects most of them with HTTP 429, which breaks the batch-copy feature entirely.

Solution

Add a single POST /api/token/batch/keys endpoint that accepts an array of token IDs and returns all keys in one request, eliminating the N-request problem entirely.

Changes (5 files, +63 / -6)

model/token.go: Add GetTokensByIdsAndUserId for batch DB query
controller/token.go: Add BatchGetTokenKeys handler (max 100 IDs)
router/api-router.go: Register POST /api/token/batch/keys route
web/src/helpers/token.js: Add fetchTokenKeysBatch calling the batch endpoint
web/src/hooks/tokens/useTokensData.jsx: Rewrite batchCopyTokens to use a single batch request
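Of these, GetTokensByIdsAndUserId carries the security-relevant part of the change: only tokens matching both a requested ID and the authenticated user's ID come back. A minimal sketch of that semantics, using an in-memory filter and a trimmed-down Token struct in place of the real SQL query (both are illustrative assumptions, not the PR's code):

```go
package main

import "fmt"

// Token keeps only the fields the batch endpoint needs; the real
// model in model/token.go has more columns.
type Token struct {
	Id     int
	UserId int
	Key    string
}

// getTokensByIdsAndUserId sketches the intended semantics of the
// PR's GetTokensByIdsAndUserId: a token is returned only when its ID
// is in the request AND it belongs to the authenticated user, so IDs
// belonging to other users are silently dropped. The real version
// runs this as one SQL query (WHERE id IN (...) AND user_id = ?);
// the in-memory filter here is for illustration only.
func getTokensByIdsAndUserId(all []Token, ids []int, userId int) []Token {
	want := make(map[int]bool, len(ids))
	for _, id := range ids {
		want[id] = true
	}
	var out []Token
	for _, t := range all {
		if want[t.Id] && t.UserId == userId {
			out = append(out, t)
		}
	}
	return out
}

func main() {
	all := []Token{
		{Id: 1, UserId: 7, Key: "aaa"},
		{Id: 2, UserId: 8, Key: "bbb"}, // owned by a different user
		{Id: 3, UserId: 7, Key: "ccc"},
	}
	got := getTokensByIdsAndUserId(all, []int{1, 2, 3}, 7)
	fmt.Println(len(got)) // token 2 is filtered out by the ownership check
}
```

Note that requesting an ID you don't own is not an error here: the token simply doesn't appear in the result, which avoids leaking whether the ID exists.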

Security

  • Retains CriticalRateLimit + UserAuth middleware on the new endpoint
  • Only returns keys owned by the authenticated user (WHERE user_id = ?)
  • Hard cap of 100 IDs per request
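Taken together, the non-empty check and the 100-ID cap amount to a small guard that runs before any database work. A hedged sketch in Go, with the function name and error messages invented for illustration (the actual handler in controller/token.go may differ):

```go
package main

import (
	"errors"
	"fmt"
)

// maxBatchIds mirrors the hard cap described in the PR.
const maxBatchIds = 100

// validateBatchIds sketches the input checks BatchGetTokenKeys is
// described as performing: reject an empty ID list and anything over
// the 100-ID cap before touching the database.
func validateBatchIds(ids []int) error {
	if len(ids) == 0 {
		return errors.New("ids must not be empty")
	}
	if len(ids) > maxBatchIds {
		return fmt.Errorf("at most %d ids per request", maxBatchIds)
	}
	return nil
}

func main() {
	fmt.Println(validateBatchIds(nil) != nil)              // rejected: empty
	fmt.Println(validateBatchIds(make([]int, 101)) != nil) // rejected: over cap
	fmt.Println(validateBatchIds([]int{1, 2, 3}) == nil)   // accepted
}
```

The cap keeps a single request's DB work bounded; as the review below notes, a client selecting more than 100 tokens has to split its request accordingly.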

Before vs After

Copy 50 keys: before, 50× POST /api/token/:id/key; after, 1× POST /api/token/batch/keys
Rate-limit consumption: before, 50 (triggers 429); after, 1 (well within the limit)

Summary by CodeRabbit

  • New Features
    • Added batch token key retrieval API endpoint supporting up to 100 tokens per request
    • Optimized multi-token operations to use single API call instead of individual requests
    • Implemented input validation to prevent invalid batch requests

Commit message

When users batch-copy 50+ token keys, the frontend previously fired N
individual POST /api/token/:id/key requests simultaneously. This quickly
exhausted the CriticalRateLimit (20 req/20min per IP), causing most
requests to fail with HTTP 429.

Solution: add a single POST /api/token/batch/keys endpoint that accepts
an array of token IDs and returns all keys in one request.

Backend:
- model: add GetTokensByIdsAndUserId for batch query
- controller: add BatchGetTokenKeys handler (max 100 IDs)
- router: register POST /api/token/batch/keys with CriticalRateLimit

Frontend:
- helpers/token.js: add fetchTokenKeysBatch calling batch endpoint
- useTokensData.jsx: rewrite batchCopyTokens to use single batch request

coderabbitai bot commented Apr 4, 2026

Walkthrough

A new batch token key fetching feature is introduced across the stack. The backend adds a /api/token/batch/keys endpoint that accepts multiple token IDs, validates input constraints (non-empty, max 100 IDs), and returns a keyed map of token IDs to their full keys. The frontend switches from individual per-token requests to this single batch request.

Changes

Backend API Endpoint (controller/token.go, model/token.go, router/api-router.go)
Added BatchGetTokenKeys handler with input validation (non-empty list, max 100 IDs). New model function GetTokensByIdsAndUserId queries the database for tokens matching both the requested IDs and the authenticated user. Route registered as POST /api/token/batch/keys with rate limiting.

Frontend Batch Fetching (web/src/helpers/token.js, web/src/hooks/tokens/useTokensData.jsx)
Added fetchTokenKeysBatch helper for batch key requests. Updated batchCopyTokens to replace per-token fetching with a single batch request, reducing overhead and merging the response into the resolved keys map.

Sequence Diagram

sequenceDiagram
    participant Client as Browser Client
    participant Helper as Token Helper
    participant API as API Controller
    participant DB as Database
    participant Hook as useTokensData Hook

    Hook->>Helper: fetchTokenKeysBatch([id1, id2, ...])
    Helper->>API: POST /api/token/batch/keys<br/>{ ids: [...] }
    API->>API: Validate ids (non-empty, ≤100)
    API->>DB: Query tokens WHERE<br/>id IN (...)<br/>AND user_id = ?
    DB-->>API: [Token, Token, ...]
    API-->>Helper: { success, data: {<br/>keys: { id1: key1, ... } } }
    Helper-->>Hook: { [id1]: key1, [id2]: key2, ... }
    Hook->>Hook: Merge into resolvedTokenKeys
    Hook->>Hook: Generate copy content<br/>from keysMap[token.id]
    Hook-->>Client: Copy-to-clipboard

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 Hop, hop! The tokens fly in batches now,
No more one-by-one to slow us down—bow wow!
From browser to database, a single leap,
Keys map in hand, our secrets to keep. ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 22.22%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (2 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title 'fix(token): add batch keys endpoint to prevent 429 rate limit on bulk copy' accurately describes the primary change: adding a batch endpoint to resolve rate-limiting issues during bulk token copy operations.




coderabbitai bot left a comment


Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
web/src/hooks/tokens/useTokensData.jsx (1)

412-425: ⚠️ Potential issue | 🟠 Major

Handle selections larger than 100 to avoid a hard failure.

batchCopyTokens currently sends all selected IDs in one request, but the backend rejects payloads with more than 100 IDs, so a large multi-select copy fails entirely.

💡 Proposed fix (chunk requests by backend limit)
-      const ids = selectedKeys.map((token) => token.id);
-      const keysMap = await fetchTokenKeysBatch(ids);
-      setResolvedTokenKeys((prev) => ({ ...prev, ...keysMap }));
+      const ids = selectedKeys.map((token) => token.id);
+      const BATCH_SIZE = 100;
+      const keysMap = {};
+      for (let i = 0; i < ids.length; i += BATCH_SIZE) {
+        const chunkIds = ids.slice(i, i + BATCH_SIZE);
+        const partial = await fetchTokenKeysBatch(chunkIds);
+        Object.assign(keysMap, partial);
+      }
+      setResolvedTokenKeys((prev) => ({ ...prev, ...keysMap }));
       let content = '';
       for (const token of selectedKeys) {
         const fullKey = keysMap[token.id];
         if (!fullKey) continue;
         if (copyType === 'name+key') {
           content += `${token.name}    sk-${fullKey}\n`;
         } else {
           content += `sk-${fullKey}\n`;
         }
       }
+      if (!content) {
+        showError(t('未找到可复制的令牌密钥'));
+        return;
+      }
       await copyText(content);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@web/src/hooks/tokens/useTokensData.jsx` around lines 412 - 425, The current
logic sends all selectedKeys' ids to fetchTokenKeysBatch at once which fails for
>100 ids; modify the code to chunk ids into batches of <=100, call
fetchTokenKeysBatch for each chunk (sequentially or with Promise.allSettled),
merge each returned keysMap into a single aggregate (then call
setResolvedTokenKeys(prev => ({...prev, ...aggregate}))), and build the content
from the aggregate before calling copyText; use the existing symbols
selectedKeys, fetchTokenKeysBatch, setResolvedTokenKeys, copyType, and copyText
and ensure errors in individual chunks are handled/ignored so the overall copy
doesn't hard-fail.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Run ID: 4314b5c4-ec16-4f32-a7f3-5a439e948685

📥 Commits

Reviewing files that changed from the base of the PR and between bb5b9ea and cf0822d.

📒 Files selected for processing (5)
  • controller/token.go
  • model/token.go
  • router/api-router.go
  • web/src/helpers/token.js
  • web/src/hooks/tokens/useTokensData.jsx
