feat: add AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN telemetry event #4383
base: main
Conversation
To do this properly is hard, and we decided to let it go for now and instead use telemetry that is correct, even though it means missing the conceptual 'acceptance rate' as experienced by the end user of autocomplete.
Track when a suggestion is shown to the user for the first time. This helps count only the requests that were actually displayed to users.

- Add AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN to TelemetryEventName enum
- Add captureUniqueSuggestionShown method to AutocompleteTelemetry
- Track shown suggestions using a Set with text+prefix+suffix as key (see the sketch below)
- Integrate tracking in GhostInlineCompletionProvider for both cache hits and LLM responses
- Add comprehensive tests for the new telemetry event
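A minimal sketch of the Set-based deduplication this commit describes, assuming hypothetical surrounding types; only the event name, the method name, and the text+prefix+suffix key come from the commit message, everything else is illustrative:

```typescript
// Sketch only: AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN and captureUniqueSuggestionShown
// are from the commit message; CaptureFn and the field names are assumptions.
type CaptureFn = (event: string, properties: Record<string, unknown>) => void

class AutocompleteTelemetry {
	// Keys of suggestions already reported, so each suggestion is counted once.
	private shownKeys = new Set<string>()

	constructor(private capture: CaptureFn) {}

	captureUniqueSuggestionShown(
		suggestion: { text: string; prefix: string; suffix: string },
		source: "llm" | "cache",
	) {
		// text + prefix + suffix identifies a suggestion at a given location.
		const key = `${suggestion.prefix}\u0000${suggestion.text}\u0000${suggestion.suffix}`
		if (this.shownKeys.has(key)) return // already counted
		this.shownKeys.add(key)
		this.capture("AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN", {
			suggestionLength: suggestion.text.length,
			source,
		})
	}
}
```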
- Add shownToUser flag to FillInAtCursorSuggestion
- Update findMatchingSuggestion to return isFirstTimeShown flag
- Simplify AutocompleteTelemetry.captureUniqueSuggestionShown signature
- Remove ID generation and Set tracking logic
- Update tests to match new implementation
Force-pushed from fc9c864 to 411d811
Only count suggestions as 'unique shown' if they were visible for at least 300ms, filtering out suggestions that flash briefly when typing quickly.

- Add firstShownAt and uniqueTelemetryFired fields to FillInAtCursorSuggestion
- Add MIN_VISIBILITY_DURATION_MS constant (300ms)
- Add updateVisibilityTracking() helper function (sketched below)
- Add shouldFireUniqueTelemetry flag to MatchingSuggestionWithFirstTimeFlag
- Add comprehensive tests for visibility duration tracking
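A rough sketch of the visibility-duration check described here; the constant and field names come from the commit message, the surrounding logic is an assumption (and the next commit replaces this approach with timer-based tracking):

```typescript
// Sketch: MIN_VISIBILITY_DURATION_MS, firstShownAt, and uniqueTelemetryFired are
// named in the commit message; the helper's exact behavior is assumed.
const MIN_VISIBILITY_DURATION_MS = 300

interface FillInAtCursorSuggestion {
	text: string
	firstShownAt?: number
	uniqueTelemetryFired?: boolean
}

// Returns true when the suggestion has been visible for at least 300ms
// and has not been counted yet.
function updateVisibilityTracking(suggestion: FillInAtCursorSuggestion, now = Date.now()): boolean {
	if (suggestion.firstShownAt === undefined) {
		suggestion.firstShownAt = now // first time the suggestion is displayed
		return false
	}
	if (suggestion.uniqueTelemetryFired) return false
	if (now - suggestion.firstShownAt >= MIN_VISIBILITY_DURATION_MS) {
		suggestion.uniqueTelemetryFired = true
		return true
	}
	return false
}
```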
…metry

The previous implementation relied on subsequent calls to findMatchingSuggestion to check if 300ms had elapsed since the suggestion was first shown. This approach was flawed because there's no guarantee the function would be called again - if the user just stares at the suggestion without typing, no telemetry would fire.

This fix implements proper timeout-based tracking (sketched below):

1. When a suggestion is shown, start a 300ms timer
2. After 300ms, check if the same suggestion is still being displayed
3. If yes, fire AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN telemetry
4. If no (different suggestion or dismissed), don't fire

Changes:

- Added VisibilityTrackingState interface to track current suggestion
- Added startVisibilityTracking() and cancelVisibilityTracking() methods
- Modified findMatchingSuggestion to return suggestionKey instead of shouldFireUniqueTelemetry
- Removed firstShownAt and uniqueTelemetryFired from FillInAtCursorSuggestion type
- Updated tests to reflect the new implementation
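A condensed sketch of the timer-based approach; VisibilityTrackingState, startVisibilityTracking, cancelVisibilityTracking, suggestionKey, and the firedUniqueTelemetryKeys Set are named in the commit messages, while the class wrapper and callback wiring are assumptions:

```typescript
// Sketch of timeout-based tracking: start a 300ms timer when a suggestion is
// shown and fire telemetry only if the same suggestion is still displayed.
interface VisibilityTrackingState {
	suggestionKey: string
	timer: ReturnType<typeof setTimeout>
}

class VisibilityTracker {
	private state: VisibilityTrackingState | undefined
	private firedUniqueTelemetryKeys = new Set<string>()

	constructor(private onVisibleLongEnough: (key: string) => void) {}

	startVisibilityTracking(suggestionKey: string) {
		this.cancelVisibilityTracking() // a new suggestion replaces the previous one
		const timer = setTimeout(() => {
			// Still the current suggestion after 300ms and not yet counted.
			if (this.state?.suggestionKey === suggestionKey && !this.firedUniqueTelemetryKeys.has(suggestionKey)) {
				this.firedUniqueTelemetryKeys.add(suggestionKey)
				this.onVisibleLongEnough(suggestionKey)
			}
		}, 300)
		this.state = { suggestionKey, timer }
	}

	cancelVisibilityTracking() {
		if (this.state) {
			clearTimeout(this.state.timer)
			this.state = undefined
		}
	}
}
```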
The isFirstTimeShown flag and shownToUser property were computed but never actually consumed by any code. The visibility-based telemetry system uses suggestionKey and the firedUniqueTelemetryKeys Set for deduplication instead.

Removed:

- isFirstTimeShown property from MatchingSuggestionWithVisibilityKey interface
- shownToUser property from FillInAtCursorSuggestion interface
- All logic computing and returning isFirstTimeShown in findMatchingSuggestion()
- Related test assertions and test data
Summary
Track when a suggestion is shown to the user for the first time. This helps count only the requests that were actually displayed to users.
Changes
- Add `AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN` to `TelemetryEventName` enum in `packages/types/src/telemetry.ts`
- Add `captureUniqueSuggestionShown` method to `AutocompleteTelemetry` class
- Integrate tracking in `GhostInlineCompletionProvider` for both cache hits and LLM responses
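Roughly, the enum addition and the call site could look like the sketch below; the enum member, file path, and method name come from the list above, while the string value and the wiring function are assumptions:

```typescript
// In packages/types/src/telemetry.ts (sketch): the member name is from the PR
// description, the string value is an assumption.
export enum TelemetryEventName {
	// ...existing events
	AUTOCOMPLETE_UNIQUE_SUGGESTION_SHOWN = "Autocomplete Unique Suggestion Shown",
}

// Provider-side wiring (sketch): report the suggestion for both cache hits and
// fresh LLM responses; onSuggestionShown is a hypothetical helper.
function onSuggestionShown(
	telemetry: {
		captureUniqueSuggestionShown: (
			s: { text: string; prefix: string; suffix: string },
			source: "llm" | "cache",
		) => void
	},
	suggestion: { text: string; prefix: string; suffix: string },
	fromCache: boolean,
) {
	telemetry.captureUniqueSuggestionShown(suggestion, fromCache ? "cache" : "llm")
}
```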
Event Properties

- `languageId`: The programming language of the file
- `modelId`: The model used for the suggestion
- `provider`: The provider used for the suggestion
- `suggestionLength`: Length of the suggestion text
- `source`: Either "llm" (new from LLM) or "cache" (retrieved from cache)

Testing
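The commits mention comprehensive tests for the new event; a minimal sketch of what such a test could look like, assuming vitest with fake timers and the hypothetical VisibilityTracker from the sketch above (not the actual test suite):

```typescript
// Test sketch only: assumes the VisibilityTracker sketch above is in scope.
import { describe, it, expect, vi } from "vitest"

describe("unique suggestion telemetry", () => {
	it("fires once after the suggestion stays visible for 300ms", () => {
		vi.useFakeTimers()
		const fired: string[] = []
		const tracker = new VisibilityTracker((key) => fired.push(key))

		tracker.startVisibilityTracking("prefix|text|suffix")
		vi.advanceTimersByTime(300)

		expect(fired).toEqual(["prefix|text|suffix"])
		vi.useRealTimers()
	})

	it("does not fire for a suggestion replaced before 300ms", () => {
		vi.useFakeTimers()
		const fired: string[] = []
		const tracker = new VisibilityTracker((key) => fired.push(key))

		tracker.startVisibilityTracking("first")
		vi.advanceTimersByTime(100)
		tracker.startVisibilityTracking("second") // replaces the first suggestion
		vi.advanceTimersByTime(300)

		expect(fired).toEqual(["second"])
		vi.useRealTimers()
	})
})
```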