
(experiment Feat): Pack Playwright traces into the upload bundle #1078

Draft
EliSchleifer wants to merge 1 commit into `main` from `claude/playwright-trace-streaming-j6fvC`

Conversation

@EliSchleifer
Member


Discovers Playwright `trace.zip` archives referenced as
`[[ATTACHMENT|<path>]]` lines in the JUnit `<system-out>` payload, computes
the same SHA-256 identity hash the bundle ingester uses
(`file|className|parentName|name|variant`), and packs each archive into
the existing `bundle.tar.zstd` at `traces/<identity_hash>.zip`. No new
CLI flag — auto-detection only, per the PRD.

The server-side ingester
(https://github.com/trunk-io/trunk2/pull/3741) recomputes the same
identity hash, matches each archive to a `test_case_id`, and re-uploads
to S3 with a 7-day retention. This commit completes the contract on the
producer side.

Implementation
- New `bundle::traces` module exposes the wire-level contract:
  `compute_trace_identity_hash`, `extract_attachment_paths`,
  `is_trace_archive_path`, `trace_archive_name`, `DiscoveredTrace`,
  and the `traces/` prefix and `.zip` suffix. Both sides of the
  contract import or duplicate these — changing the hash format breaks
  every bundle a previous CLI produced.
- `BundlerUtil::with_traces` attaches a list of `DiscoveredTrace`
  alongside `bep_result`. `make_tarball` writes each archive at
  `traces/<hash>.zip`. Trace packing is best-effort: a missing source
  file or `tar.append_file` error is logged and skipped so a single
  bad attachment can never sink the upload. Duplicate identity hashes
  (e.g. retried test cases pointing at the same trace) collapse to a
  single tarball entry.
- `discover_traces_from_junit_parser` walks the parsed `JunitParser`
  reports, scans each `<system-out>` for `[[ATTACHMENT|...]]` markers,
  filters to `.zip` files, computes the identity hash from the test
  case's tuple, and resolves each path: absolute → as-is, relative →
  next to the JUnit XML, then under `repo_root`. Misses are warned
  and skipped.
- `generate_internal_file` and `generate_internal_file_from_bep` now
  return `InternalFileResult { bundled_file, validations, traces }`.
  `upload_command::run_upload` threads the discovered traces into
  `BundlerUtil::with_traces`.
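
The identity-hash pre-image from the first bullet can be sketched as follows. This is a hypothetical reconstruction — the argument names, order, and missing-field handling are assumptions — and the real `compute_trace_identity_hash` SHA-256 hashes the pipe-joined string and hex-encodes the digest:

```rust
// Hypothetical sketch of the string the SHA-256 identity hash is computed
// over (file|className|parentName|name|variant). Field handling here is an
// assumption; the real compute_trace_identity_hash hashes and hex-encodes it.
fn identity_hash_input(
    file: &str,
    class_name: &str,
    parent_name: &str,
    name: &str,
    variant: Option<&str>,
) -> String {
    // A missing variant is treated the same as an empty one
    // ("missing-field equivalence" in the unit tests below).
    format!(
        "{file}|{class_name}|{parent_name}|{name}|{}",
        variant.unwrap_or("")
    )
}

fn main() {
    let a = identity_hash_input("tests/login.spec.ts", "chromium", "Login", "logs in", None);
    let b = identity_hash_input("tests/login.spec.ts", "chromium", "Login", "logs in", Some(""));
    // Absent and empty variant must hash identically, or producer and
    // ingester disagree on the archive's identity.
    assert_eq!(a, b);
    assert_eq!(a, "tests/login.spec.ts|chromium|Login|logs in|");
}
```

Because both the CLI and the server recompute this from the test case's tuple, any change to the joining scheme is a wire-format break, which is why the module centralizes it.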
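The duplicate-collapsing behavior from the second bullet can be sketched like this; `DiscoveredTrace` is a stand-in with an assumed field, and the real `make_tarball` additionally skips (with a logged warning) sources that are missing or fail to append rather than failing the upload:

```rust
use std::collections::HashSet;

// Stand-in for the real DiscoveredTrace; the field name is an assumption.
struct DiscoveredTrace {
    identity_hash: String,
}

// Duplicate identity hashes collapse to a single traces/<hash>.zip entry.
fn tarball_entries(traces: &[DiscoveredTrace]) -> Vec<String> {
    let mut seen = HashSet::new();
    traces
        .iter()
        .filter(|t| seen.insert(t.identity_hash.clone()))
        .map(|t| format!("traces/{}.zip", t.identity_hash))
        .collect()
}

fn main() {
    // Two distinct hashes plus one duplicate (e.g. a retried test case
    // pointing at the same trace archive).
    let traces = vec![
        DiscoveredTrace { identity_hash: "aaa".into() },
        DiscoveredTrace { identity_hash: "bbb".into() },
        DiscoveredTrace { identity_hash: "aaa".into() },
    ];
    assert_eq!(
        tarball_entries(&traces),
        vec!["traces/aaa.zip", "traces/bbb.zip"]
    );
}
```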
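The marker scan from the third bullet might look like the following sketch, under the assumption that each `[[ATTACHMENT|<path>]]` marker sits on its own line; the real `extract_attachment_paths` may accept other placements, and path resolution (absolute, next to the XML, then `repo_root`) happens in a later step:

```rust
use std::collections::HashSet;

// Hypothetical sketch of scanning <system-out> text for Playwright
// attachment markers; filters to .zip archives and dedupes repeats.
fn extract_attachment_paths(system_out: &str) -> Vec<String> {
    let mut seen = HashSet::new();
    let mut paths = Vec::new();
    for line in system_out.lines() {
        let line = line.trim();
        // Garbage tolerance: lines that are not well-formed markers are ignored.
        let Some(rest) = line.strip_prefix("[[ATTACHMENT|") else { continue };
        let Some(path) = rest.strip_suffix("]]") else { continue };
        // Only .zip archives are trace candidates; repeated markers collapse.
        if path.ends_with(".zip") && seen.insert(path.to_string()) {
            paths.push(path.to_string());
        }
    }
    paths
}

fn main() {
    let out = "some log line\n\
        [[ATTACHMENT|test-results/login/trace.zip]]\n\
        [[ATTACHMENT|screenshot.png]]\n\
        [[ATTACHMENT|test-results/login/trace.zip]]\n\
        [[ATTACHMENT|truncated";
    assert_eq!(
        extract_attachment_paths(out),
        vec!["test-results/login/trace.zip"]
    );
}
```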

Tests
- Unit tests in `bundle::traces`: hash determinism, hex format, variant
  separation, missing-field equivalence, attachment extraction with
  dedupe and garbage tolerance, archive name format.
- Roundtrip tests in `bundle::bundler`: a tarball with three
  `DiscoveredTrace` entries (two distinct + one duplicate) yields
  exactly two `traces/*.zip` entries with the right bytes; a
  `DiscoveredTrace` with a non-existent source path is skipped, not
  fatal.

Caveats
- `cargo fmt` shows pre-existing drift in unrelated crates
  (`codeowners`, `bazel-bep`); this commit only formats files it
  touches and leaves the rest alone.
- `cargo clippy --all-targets -- -D warnings` is blocked by
  pre-existing lints in `codeowners` and `bazel-bep` generated proto
  code; clippy on `bundle` and `trunk-analytics-cli` lib targets is
  clean.
@trunk-io

trunk-io Bot commented Apr 26, 2026

Merging to main in this repository is managed by Trunk.

  • To merge this pull request, check the box to the left or comment `/trunk merge` below.

After your PR is submitted to the merge queue, this comment will be automatically updated with its status. If the PR fails, failure details will also be posted here.

@trunk-staging-io

trunk-staging-io Bot commented Apr 26, 2026


@codecov-commenter

codecov-commenter commented Apr 26, 2026

Codecov Report

❌ Patch coverage is 82.68156% with 62 lines in your changes missing coverage. Please review.
✅ Project coverage is 82.00%. Comparing base (365c962) to head (5f0c3ff).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| cli/src/context.rs | 50.00% | 57 Missing ⚠️ |
| bundle/src/bundler.rs | 98.36% | 2 Missing ⚠️ |
| cli/src/upload_command.rs | 90.47% | 2 Missing ⚠️ |
| bundle/src/traces.rs | 99.00% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1078      +/-   ##
==========================================
+ Coverage   81.73%   82.00%   +0.26%     
==========================================
  Files          69       70       +1     
  Lines       14905    15242     +337     
==========================================
+ Hits        12183    12499     +316     
- Misses       2722     2743      +21     


@EliSchleifer changed the title from "(Feat): Pack Playwright traces into the upload bundle" to "(experiment Feat): Pack Playwright traces into the upload bundle" on Apr 26, 2026
@trunk-io

trunk-io Bot commented Apr 26, 2026

