
feat: include recipe parameter details in Summon load context (#7288) #7297

Draft

michaelneale wants to merge 1 commit into main from fix-7288

Conversation

@michaelneale
Collaborator

candidate fix for #7288

Include full recipe parameter information (key, type, requirement,
description, default, options) in the Summon extension's load output.

Previously, when an agent used load() to discover or load recipe sources,
parameter details were omitted — the agent could only guess parameters from
interpolated variable names in the recipe content. This led to unnecessary
trial-and-error when invoking recipes via delegate().

Changes:
- Add 'parameters' field to Source struct to carry RecipeParameter data
- Populate parameters when scanning recipe files and loading subrecipes
- Include a '### Parameters' section in to_load_text() with full details
- Show parameter summaries in discovery listing (load with no args)

Fixes #7288
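
For orientation, here is a rough sketch of the Source change described above (the parameters field and the RecipeParameter type come from this PR; the other field types and the import path are assumptions):

// Sketch only - not the actual definition from the diff.
use crate::recipe::RecipeParameter; // path assumed, mirroring crate::recipe::SubRecipe below

struct Source {
    name: String,
    kind: String,
    description: String,
    content: String,
    // New: full parameter metadata (key, type, requirement, description,
    // default, options) for recipe sources.
    parameters: Option<Vec<RecipeParameter>>,
}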
    &self,
    sr: &crate::recipe::SubRecipe,
) -> (String, Option<Vec<RecipeParameter>>) {
    if let Some(desc) = &sr.description {
@jcwilk

If I'm understanding the code correctly, I think the pre-existing short circuit was here mostly to avoid a file lookup for performance reasons (or maybe for testing, so you can pass in a value without needing the files to all be reconfigured?) - so if we need to do the file lookup for the parameters anyway, does the short circuit still serve a purpose?

If so, maybe it would make more sense to put this condition inside the pre-existing Ok(recipe_file)...Ok(recipe) block, since the two paths mostly share the same code - perhaps keeping the same "prioritize the in-memory recipe data and fall back to the file-stored values" behavior with something like:

let description = sr
    .description
    .as_ref()
    .cloned()
    .unwrap_or_else(|| recipe.description.clone());

? Apologies if I'm overlooking something.

let param_names: Vec<&str> = params.iter().map(|p| p.key.as_str()).collect();
if !param_names.is_empty() {
    let params_str = param_names.join(", ");
    desc = format!("{} (params: {})", desc, params_str);
@jcwilk

Is it right to keep injecting param data into the description if we're now passing the parameters back separately and formatting them explicitly down in handle_load_discovery?

"## {} ({})\n\n{}\n\n### Content\n\n{}",
self.name, self.kind, self.description, self.content
"## {} ({})\n\n{}{}\n\n### Content\n\n{}",
self.name, self.kind, self.description, params_section, self.content
@jcwilk

Does it still make sense to return the full content of a recipe in to_load_text? My understanding is that delegate uses a different flow, so when the recipe is actually "executed" it won't depend on this data flow. The content of a recipe may be substantial, so it might make sense to keep it "private" - much like an MCP server, where you hopefully have a description and a list of parameters but don't need the actual source code to invoke it. Especially now that we're describing parameters explicitly, it's no longer necessary to surface them via the content (and where they're interpolated, etc.).
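
For concreteness, a content-free variant of to_load_text along those lines might look roughly like this (a sketch only - the field names follow the snippets in this PR, the exact formatting is an assumption):

// Hypothetical: describe the source without embedding the recipe content.
fn to_load_text(&self) -> String {
    let params_section = match &self.parameters {
        Some(params) if !params.is_empty() => params
            .iter()
            .map(|p| format!("- {}: {}", p.key, p.description))
            .collect::<Vec<_>>()
            .join("\n"),
        _ => "None".to_string(),
    };
    format!(
        "## {} ({})\n\n{}\n\n### Parameters\n\n{}",
        self.name, self.kind, self.description, params_section
    )
}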

source.name,
truncate(&source.description, 60)
));
if let Some(params) = &source.parameters {
@jcwilk

What do you think about consolidating the "discovery" parameter presentation (here) with the "load" presentation (above, currently inlined in to_load_text)? Perhaps a shared function could take a flag for whether it should truncate param descriptions. Aside from truncating the description, which might hypothetically be long, it seems like we'd mostly want the same information available in both with respect to options, defaults, etc.

@jcwilk

here's one I've been playing with, which doesn't currently truncate at all:

fn format_parameters_section(parameters: Option<&[RecipeParameter]>) -> String {
    match parameters {
        None | Some([]) => "  Parameters: None".to_string(),
        Some(params) => {
            let mut out = String::from("  Parameters:\n");
            for p in params {
                let req = match &p.requirement {
                    RecipeParameterRequirement::Required => "required",
                    RecipeParameterRequirement::Optional | RecipeParameterRequirement::UserPrompt => {
                        "optional"
                    }
                };
                let type_str = match &p.input_type {
                    RecipeParameterInputType::Select => {
                        if let Some(ref opts) = p.options {
                            format!("select: {}", opts.join(", "))
                        } else {
                            "select".to_string()
                        }
                    }
                    other => format!("{}", other),
                };
                let default_suffix = p
                    .default
                    .as_ref()
                    .map(|d| format!(", default: \"{}\"", d))
                    .unwrap_or_default();
                let bullet = format!(
                    "{} ({}, {}{}): {}",
                    p.key, req, type_str, default_suffix, p.description
                );
                out.push_str(&format!("    • {}\n", bullet));
            }
            out.pop();
            out
        }
    }
}

It ends up as something like:

• goose-self-test - A comprehensive meta-testing recipe
  Parameters:
    • test_phases (optional, string): Which test phases to run
• hello-world - A sample recipe
  Parameters: None

The nesting is kind of awkward, but I was going for something tight and neat while staying very explicit/clear about what's being specified where.

My understanding of where the specifics of these output formats come from is hazy at best, so apologies if it seems like I'm imposing guesswork on you here - just trying to share some tentative findings in case they're helpful.

@michaelneale
Collaborator Author

That nesting is nowhere near as bad as I have seen elsewhere - it is fine!

@jcwilk

jcwilk commented Feb 18, 2026

Apologies for the deluge of comments, still getting my head around Rust/Goose best practices - and thank you for jumping on my issue so quickly 🙌 🙌

@michaelneale
Collaborator Author

No worries @jcwilk - the issue isn't totally clear to me, but based on what you observed, are you able to try this branch to see if it makes that bit better? (Then I will decide whether to keep it or not.)

Thanks in advance!

@michaelneale
Collaborator Author

Might tag @tlongwell-block, who had the vision around "summon" consolidation... for any thoughts.

@jcwilk

jcwilk commented Feb 19, 2026

@michaelneale sure thing! I'll try it when I get home - at the rust/goose meetup at block hq at the moment 😅 looking forward to seeing what y'all have in the pipes

@jcwilk

jcwilk commented Feb 19, 2026

Okay, I made a somewhat contrived recipe that executes a compiled binary (i.e., goose can't simply read the source code to figure out how the binary works). The binary does nothing except verify that the passed parameters match allowed values, throwing an unhelpful error if they do not. Here's the recipe:

version: 1.0.0
title: API Client Generator
description: Generate a client library for your API specification.
instructions: |-
  Run the api-client-generator script. It expects exactly five positional arguments in order.
  Pass each parameter value as the corresponding argument (first parameter = first argument, second = second, and so on).
  The script is at ./api-client-generator

  1. language: {{ language }}
  2. style: {{ style }}
  3. async_mode: {{ async_mode }}
  4. output_path: {{ output_path }}
  5. spec_format: {{ spec_format }}
prompt: Execute the script by feeding the ordered parameters in as CLI arguments.
activities: []
parameters:
- key: language
  input_type: select
  requirement: required
  description: Target language
  options:
  - typescript
  - python
  - go
- key: style
  input_type: select
  requirement: required
  description: Code style
  options:
  - class-based
  - functional
  - minimal
- key: async_mode
  input_type: boolean
  requirement: optional
  description: Use async/await patterns
  default: 'true'
- key: output_path
  input_type: string
  requirement: required
  description: Where to write the generated client
- key: spec_format
  input_type: select
  requirement: required
  description: API spec format
  options:
  - openapi3
  - openapi2

Using gpt-4o, I start a new chat with just developer+summon and tell it "Run the api generator recipe with reasonable values using summon's load/delegate tooling". It loads the recipe (the discovery-based load doesn't include the select value options) and then runs the subagent via delegate. The subagent attempts to run the script a few times with the bogus values it got from the parent agent and then gives up. If I paste in the full recipe yaml, it's able to call delegate successfully with valid parameters.

I realize we're getting into a bit of an edge case here, but it seemed to me that if goose already has a solid spec for storing and validating recipe parameters, we might as well let the agent fully leverage it. But maybe it's a balancing act against keeping the loads lean too?
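
(For reference: if the load output surfaced the full parameter details - say, via something like the format_parameters_section sketch above - the listing for this recipe would come out roughly as below, assuming the source shows up under a name like api-client-generator. That would give the parent agent everything it needs to pass valid values to delegate without ever seeing the recipe source.)

• api-client-generator - Generate a client library for your API specification.
  Parameters:
    • language (required, select: typescript, python, go): Target language
    • style (required, select: class-based, functional, minimal): Code style
    • async_mode (optional, boolean, default: "true"): Use async/await patterns
    • output_path (required, string): Where to write the generated client
    • spec_format (required, select: openapi3, openapi2): API spec format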

@jcwilk

jcwilk commented Feb 19, 2026

In case you want the binary, just compile with rustc api-client-generator.rs -o api-client-generator

//! API client generator - expects 5 positional args: language style async_mode output_path spec_format

use std::env;
use std::process::exit;

fn fail() -> ! {
    eprintln!("failed due to parameters!");
    exit(1);
}

fn main() {
    let args: Vec<String> = env::args().skip(1).collect();
    if args.len() != 5 {
        fail();
    }

    let (lang, style, async_mode, output_path, spec_format) = (
        &args[0],
        &args[1],
        &args[2],
        &args[3],
        &args[4],
    );

    let valid_lang = ["typescript", "python", "go"].contains(&lang.as_str());
    let valid_style = ["class-based", "functional", "minimal"].contains(&style.as_str());
    let invalid_combo = lang == "go" && style == "class-based";
    let valid_async = ["true", "false"].contains(&async_mode.as_str());
    let valid_format = ["openapi3", "openapi2"].contains(&spec_format.as_str());

    if !valid_lang || !valid_style || invalid_combo || !valid_async || output_path.is_empty()
        || !valid_format
    {
        fail();
    }

    println!(
        "OK: {} {} {} {} {}",
        lang, style, async_mode, output_path, spec_format
    );
}
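
(As a quick sanity check of the validation logic above: an invocation like ./api-client-generator typescript functional true ./out openapi3 should print the OK line, while any value outside the allowed sets - or the go + class-based combination - exits with the unhelpful "failed due to parameters!" error.)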
