(experiment Feat): Pack Playwright traces into the upload bundle #1078

Draft

EliSchleifer wants to merge 1 commit into main from
Conversation
Discovers Playwright `trace.zip` archives referenced as `[[ATTACHMENT|<path>]]` lines in the JUnit `<system-out>` payload, computes the same SHA-256 identity hash the bundle ingester uses (`file|className|parentName|name|variant`), and packs each archive into the existing `bundle.tar.zstd` at `traces/<identity_hash>.zip`. No new CLI flag — auto-detection only, per the PRD.

The server-side ingester (https://github.com/trunk-io/trunk2/pull/3741) recomputes the same identity hash, matches each archive to a `test_case_id`, and re-uploads to S3 with a 7-day retention. This commit completes the contract on the producer side.

### Implementation

- New `bundle::traces` module exposes the wire-level constants: `compute_trace_identity_hash`, `extract_attachment_paths`, `is_trace_archive_path`, `trace_archive_name`, `DiscoveredTrace`, and the `traces/` prefix and `.zip` suffix. Both sides of the contract import or duplicate these — changing the hash format breaks every bundle a previous CLI produced.
- `BundlerUtil::with_traces` attaches a list of `DiscoveredTrace` alongside `bep_result`. `make_tarball` writes each archive at `traces/<hash>.zip`. Trace packing is best-effort: a missing source file or `tar.append_file` error is logged and skipped, so a single bad attachment can never sink the upload. Duplicate identity hashes (e.g. retried test cases pointing at the same trace) collapse to a single tarball entry.
- `discover_traces_from_junit_parser` walks the parsed `JunitParser` reports, scans each `<system-out>` for `[[ATTACHMENT|...]]` markers, filters to `.zip` files, computes the identity hash from the test case's tuple, and resolves each path: absolute → as-is, relative → next to the JUnit XML, then under `repo_root`. Misses are warned about and skipped.
- `generate_internal_file` and `generate_internal_file_from_bep` now return `InternalFileResult { bundled_file, validations, traces }`. `upload_command::run_upload` threads the discovered traces into `BundlerUtil::with_traces`.
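The marker-scanning step described above can be sketched in plain Rust. This is std-only and illustrative: the function names mirror the `bundle::traces` exports, but the bodies are assumptions about the behavior the PR describes (dedupe, garbage tolerance, `.zip` filtering), not the shipped implementation.

```rust
use std::collections::HashSet;

/// Illustrative sketch: pull `[[ATTACHMENT|<path>]]` paths out of a
/// `<system-out>` payload, deduplicating and tolerating garbage lines.
fn extract_attachment_paths(system_out: &str) -> Vec<String> {
    let mut seen = HashSet::new();
    let mut paths = Vec::new();
    for line in system_out.lines() {
        // Only lines that form a complete marker count; anything else
        // (including a truncated marker) is silently ignored.
        if let Some(rest) = line.trim().strip_prefix("[[ATTACHMENT|") {
            if let Some(path) = rest.strip_suffix("]]") {
                if !path.is_empty() && seen.insert(path.to_string()) {
                    paths.push(path.to_string());
                }
            }
        }
    }
    paths
}

/// Only `.zip` archives are candidate Playwright traces.
fn is_trace_archive_path(path: &str) -> bool {
    path.ends_with(".zip")
}

fn main() {
    let out = "retrying...\n[[ATTACHMENT|traces/t1.zip]]\n\
               [[ATTACHMENT|traces/t1.zip]]\n[[ATTACHMENT|shot.png]]\n\
               [[ATTACHMENT|broken";
    let all = extract_attachment_paths(out);
    let traces: Vec<_> = all.iter().filter(|p| is_trace_archive_path(p)).collect();
    println!("{all:?} -> {traces:?}");
}
```

A duplicate marker (a retried test re-emitting the same path) collapses here already; duplicate identity hashes are additionally collapsed at tarball-write time.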
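The three-step path resolution (absolute as-is, then beside the JUnit XML, then under `repo_root`) could look like the following sketch. The injected `exists` closure is a hypothetical testing seam, not part of the described API; the real code presumably checks the filesystem directly.

```rust
use std::path::{Path, PathBuf};

/// Illustrative only: resolve an attachment path in the order the PR
/// describes. `exists` stands in for a filesystem check so the logic can
/// be exercised without real files.
fn resolve_trace_path(
    attachment: &str,
    junit_xml_dir: &Path,
    repo_root: &Path,
    exists: impl Fn(&Path) -> bool,
) -> Option<PathBuf> {
    let p = Path::new(attachment);
    if p.is_absolute() {
        // Absolute paths are taken as-is, or the trace is a miss.
        return exists(p).then(|| p.to_path_buf());
    }
    // Relative: first look next to the JUnit XML file...
    let beside_xml = junit_xml_dir.join(p);
    if exists(&beside_xml) {
        return Some(beside_xml);
    }
    // ...then fall back to the repository root. A miss returns None,
    // which the caller logs as a warning and skips.
    let under_root = repo_root.join(p);
    exists(&under_root).then_some(under_root)
}

fn main() {
    let only = |target: &'static str| move |p: &Path| p == Path::new(target);
    let hit = resolve_trace_path(
        "traces/t.zip",
        Path::new("/repo/results"),
        Path::new("/repo"),
        only("/repo/traces/t.zip"),
    );
    println!("{hit:?}");
}
```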
### Tests

- Unit tests in `bundle::traces`: hash determinism, hex format, variant separation, missing-field equivalence, attachment extraction with dedupe and garbage tolerance, archive name format.
- Roundtrip tests in `bundle::bundler`: a tarball with three `DiscoveredTrace` entries (two distinct + one duplicate) yields exactly two `traces/*.zip` entries with the right bytes; a `DiscoveredTrace` with a non-existent source path is skipped, not fatal.

### Caveats

- `cargo fmt` shows pre-existing drift in unrelated crates (`codeowners`, `bazel-bep`); this commit only formats files it touches and leaves the rest alone.
- `cargo clippy --all-targets -- -D warnings` is blocked by pre-existing lints in `codeowners` and `bazel-bep` generated proto code; clippy on the `bundle` and `trunk-analytics-cli` lib targets is clean.
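The "missing-field equivalence" test above suggests the hash input treats an absent field and an empty field identically. A minimal std-only sketch of building that pipe-joined `file|className|parentName|name|variant` input (the real `compute_trace_identity_hash` then runs SHA-256 over it; the hashing is omitted here to stay dependency-free, and this helper is hypothetical):

```rust
/// Hypothetical helper: build the pipe-joined identity-hash input,
/// assuming a missing field collapses to the empty string (which is what
/// would make `None` and `Some("")` hash identically).
fn identity_hash_input(
    file: Option<&str>,
    class_name: Option<&str>,
    parent_name: Option<&str>,
    name: Option<&str>,
    variant: Option<&str>,
) -> String {
    [file, class_name, parent_name, name, variant]
        .into_iter()
        .map(|f| f.unwrap_or(""))
        .collect::<Vec<_>>()
        .join("|")
}

fn main() {
    // Two retried cases with the same tuple produce the same input, hence
    // the same identity hash, hence a single tarball entry.
    let a = identity_hash_input(Some("login.spec.ts"), None, Some("auth"), Some("logs in"), None);
    let b = identity_hash_input(Some("login.spec.ts"), Some(""), Some("auth"), Some("logs in"), Some(""));
    println!("{a} == {b}: {}", a == b);
}
```

Because both producer and ingester hash this exact string, any change to the field order or separator is a wire-format break, which is why the PR calls these values wire-level constants.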
Codecov Report

❌ Patch coverage is

Additional details and impacted files:

    @@            Coverage Diff             @@
    ##             main    #1078      +/-   ##
    ==========================================
    + Coverage   81.73%   82.00%    +0.26%
    ==========================================
      Files          69       70        +1
      Lines       14905    15242      +337
    ==========================================
    + Hits        12183    12499      +316
    - Misses       2722     2743       +21

View full report in Codecov by Sentry.