[3/n] allow JSON schemas relative to the out dir #1301
sunshowers wants to merge 4 commits into main from
Conversation
Created using spr 1.3.6-beta.1 [skip ci]
```rust
/// The path to the spec file.
path: LitStr,
/// Where to resolve the path relative to.
relative_to: RelativeTo,
```
should this be `#[serde(default)]`, and should `RelativeTo` derive `Default`?
Maybe? That's what I had at first, but I figured if you're using the struct style in the first place, you're probably intending to specify relative_to.
I think I was thrown off by a description of a default variant of `RelativeTo`.
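For concreteness, the `#[serde(default)]` suggestion above would hinge on `RelativeTo` having a default variant. A minimal stdlib-only sketch (the variant names here are assumptions for illustration, not the crate's actual ones):

```rust
// Hypothetical sketch: if `RelativeTo` derived Default with a #[default]
// variant, the macro input could mark the field #[serde(default)] and an
// omitted `relative_to` would fall back to that variant.
#[allow(dead_code)]
#[derive(Debug, Default, PartialEq)]
enum RelativeTo {
    // Resolve relative to the invoking crate's manifest directory
    // (assumed to be the sensible default).
    #[default]
    ManifestDir,
    // Resolve relative to the build script's output directory.
    OutDir,
}

fn main() {
    // With #[serde(default)] on the field, a missing `relative_to`
    // would deserialize to this value.
    assert_eq!(RelativeTo::default(), RelativeTo::ManifestDir);
    println!("default is {:?}", RelativeTo::default());
}
```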
progenitor-macro/src/lib.rs (Outdated)

```rust
    relative_to: RelativeTo,
}

impl SpecSource {
```
Could this be `impl syn::Parse for SpecSource`?
Interesting idea, and definitely better than what I did. I keep forgetting how syn works, haha. Might be worth putting this pattern somewhere in a list of serde_tokenstream recipes.
Changed it to this plus `lookahead1` as commented below. I did a bunch of testing on span attributions, and the big thing is that you cannot have the `MacroSettings` struct use `ParseWrapper<SpecSource>` directly. Instead, it must be a `TokenStreamWrapper` that is later parsed.
(related, we don't have a directory of failing inputs to test span attributions for progenitor with -- is it worth building that?)
```diff
 openapiv3 = "2.2.0"
 proc-macro2 = "1.0"
 progenitor-impl = { version = "0.12.0", path = "../progenitor-impl" }
 quote = "1.0"
 schemars = "0.8.22"
 serde = { version = "1.0", features = ["derive"] }
 serde_json = "1.0"
 serde_yaml = "0.9"
-serde_tokenstream = "0.2.0"
+serde_tokenstream = "0.2.2"
 syn = { version = "2.0", features = ["full", "extra-traits"] }
```
it seems like all/most of these should be `{ workspace = true }`. I'm not sure why they're not.
Yeah they should. Should I do this as a followup?
either way; I'm happy to do it myself later
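For reference, the cleanup being discussed would hoist versions into the workspace root and reference them from the member crate. A sketch with only illustrative entries (the actual workspace manifest may differ):

```toml
# In the workspace root Cargo.toml:
[workspace.dependencies]
quote = "1.0"
serde = { version = "1.0", features = ["derive"] }
serde_tokenstream = "0.2.2"
syn = { version = "2.0", features = ["full", "extra-traits"] }

# Then in progenitor-macro/Cargo.toml:
# [dependencies]
# quote = { workspace = true }
# serde = { workspace = true }
# serde_tokenstream = { workspace = true }
# syn = { workspace = true }
```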
```rust
// Note this is a TokenStreamWrapper and not a ParseWrapper<SpecSource>. The
// latter loses span information.
```
- Is that an intrinsic property of `ParseWrapper`, or is it fixable?
- How much do we care? We use `ParseWrapper` in several other places.
It's an artifact of `D::Error::custom` being called here: https://docs.rs/serde_tokenstream/latest/src/serde_tokenstream/ibidem.rs.html#59 -- serde flattens custom errors into their `Display` implementation, which loses span information. The span used ends up being the span of the entire value associated with the key (so for `spec`, the entire `{ path = ..., relative_to = ... }`).
Can this be fixed? Maybe we could use a side channel similar to the WRAPPER_TOKENS thread local in ibidem.rs. Worth investigating.
This probably doesn't matter for scalars like `ParseWrapper<syn::Ident>`. It only matters for compound structures like `SpecSource`. But I see that we have:

```rust
pre_hook: Option<ParseWrapper<ClosureOrPath>>,
pre_hook_async: Option<ParseWrapper<ClosureOrPath>>,
post_hook: Option<ParseWrapper<ClosureOrPath>>,
post_hook_async: Option<ParseWrapper<ClosureOrPath>>,
```

which would all be impacted.
oxidecomputer/serde_tokenstream#221 fixes this. Once that lands and a new serde_tokenstream version is out, we can change this code to use `ParseWrapper` directly. (Note that `ParseWrapper` isn't totally compatible with the error collector/diagnostic sink proc macro pattern where we collect all errors into a common place, but that is a pre-existing limitation.)
In upcoming work related to RFD 634 we're going to need to read JSON files out of the build script output directory. Add support for that.