
Move AIFunction parameter schematization from parameter level to function level. #5826

Open — wants to merge 3 commits into main from feature/function-metadata
Conversation

@eiriktsarpalis (Member) commented on Jan 29, 2025:

Fix #5655.

@dotnet-comment-bot (Collaborator) commented:

‼️ Found issues ‼️

| Project | Coverage Type | Expected | Actual |
| --- | --- | --- | --- |
| Microsoft.Extensions.AI.Ollama | Line | 80 | 77.94 🔻 |
| Microsoft.Extensions.AI.Ollama | Branch | 80 | 79.73 🔻 |
| Microsoft.Extensions.Caching.Hybrid | Line | 86 | 77.7 🔻 |
| Microsoft.Gen.MetadataExtractor | Line | 98 | 57.35 🔻 |
| Microsoft.Gen.MetadataExtractor | Branch | 98 | 62.5 🔻 |

🎉 Good job! The coverage increased 🎉
Update MinCodeCoverage in the project files.

| Project | Expected | Actual |
| --- | --- | --- |
| Microsoft.Extensions.AI.AzureAIInference | 91 | 92 |
| Microsoft.Extensions.AI.Abstractions | 83 | 85 |
| Microsoft.Extensions.AI | 88 | 89 |

Full code coverage report: https://dev.azure.com/dnceng-public/public/_build/results?buildId=933837&view=codecoverage-tab

@eiriktsarpalis force-pushed the feature/function-metadata branch from 0a3b415 to ec8d3ab on February 3, 2025 18:39
@eiriktsarpalis marked this pull request as ready for review on February 3, 2025 18:41
@eiriktsarpalis requested a review from a team as a code owner on February 3, 2025 18:41
@dotnet-comment-bot (Collaborator) commented:

‼️ Found issues ‼️

| Project | Coverage Type | Expected | Actual |
| --- | --- | --- | --- |
| Microsoft.Extensions.AI.Ollama | Line | 80 | 78.24 🔻 |
| Microsoft.Gen.MetadataExtractor | Line | 98 | 57.35 🔻 |
| Microsoft.Gen.MetadataExtractor | Branch | 98 | 62.5 🔻 |

🎉 Good job! The coverage increased 🎉
Update MinCodeCoverage in the project files.

| Project | Expected | Actual |
| --- | --- | --- |
| Microsoft.Extensions.AI | 88 | 89 |
| Microsoft.Extensions.AI.AzureAIInference | 91 | 92 |
| Microsoft.Extensions.AI.Abstractions | 83 | 85 |

Full code coverage report: https://dev.azure.com/dnceng-public/public/_build/results?buildId=938548&view=codecoverage-tab

@dotnet-comment-bot (Collaborator) commented:

‼️ Found issues ‼️

| Project | Coverage Type | Expected | Actual |
| --- | --- | --- | --- |
| Microsoft.Extensions.AI.Ollama | Line | 80 | 78.11 🔻 |
| Microsoft.Gen.MetadataExtractor | Line | 98 | 57.35 🔻 |
| Microsoft.Gen.MetadataExtractor | Branch | 98 | 62.5 🔻 |

🎉 Good job! The coverage increased 🎉
Update MinCodeCoverage in the project files.

| Project | Expected | Actual |
| --- | --- | --- |
| Microsoft.Extensions.AI.AzureAIInference | 91 | 92 |
| Microsoft.Extensions.AI | 88 | 89 |
| Microsoft.Extensions.AI.Abstractions | 83 | 85 |

Full code coverage report: https://dev.azure.com/dnceng-public/public/_build/results?buildId=939708&view=codecoverage-tab

@RussKie added the area-AI label on Feb 4, 2025
@eiriktsarpalis requested review from stephentoub and a team on February 5, 2025 15:36
Review comment on the new function-level Schema property (code excerpt):

/// When no schema is specified, consuming chat clients should assume the "{}" or "true" schema, indicating that any JSON input is admissible.
/// </para>
/// </remarks>
public JsonElement Schema
@eiriktsarpalis (Member, Author) commented:
@stephentoub one consequence of making the schema self-contained and never-null is that leaf clients no longer need to consult AIFunctionParameterMetadata or AIFunctionReturnParameterMetadata. Neither they nor the AIFunctionMetadata.GetParameter helper function is used in product code today.

While I can see the informational value in exposing this metadata, it is all entirely redundant today and might in fact completely diverge from both the schema and the actual requirements of the underlying AIFunction.InvokeAsync method. It is also forcing more boilerplate on users implementing AIFunction types of their own. Is it something we could consider removing?
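
To make the proposal concrete, here is a minimal sketch of what a single, function-level, self-contained schema could look like. The GetWeather example, the schema shape, and the fallback handling are illustrative assumptions, not code from this PR.

```csharp
using System.Text.Json;

// Hypothetical example: one JSON schema document describes the entire function
// signature (all parameters under "properties"), instead of a separate metadata
// object per parameter.
JsonElement schema = JsonDocument.Parse("""
    {
      "title": "GetWeather",
      "description": "Gets the current weather for a location.",
      "type": "object",
      "properties": {
        "location": { "type": "string", "description": "City name." },
        "unit": { "type": "string", "enum": ["celsius", "fahrenheit"] }
      },
      "required": ["location"]
    }
    """).RootElement;

// A leaf chat client can forward the schema verbatim to the model provider.
string toolDefinition = schema.GetRawText();

// When no schema is supplied, the doc comment above says clients should assume
// the permissive "{}" (or "true") schema, i.e. any JSON arguments are admissible.
JsonElement permissiveFallback = JsonDocument.Parse("{}").RootElement;
```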

Another member replied:

How would SK layer on top if it lacked that information?

Is there not still value, or scenarios, in having the .NET type information for parameters exposed, so that another consumer of the AIFunction can use it to understand what .NET types the function actually expects? It seems like this is inching back towards having the arguments dictionary be strongly typed as JsonElement.

And as we spoke about offline, we've seen folks take the return schema information and add it into function call results to help the LLM better understand what the return information represents. Unless we incorporate the return schema into the function schema, we'll be cutting off that possibility, no?

I'm all for simplifying, but I worry this would be cutting off useful information.

@SteveSandersonMS?
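
As a rough illustration of the pattern described above (attaching return-schema information to a function call result so the model understands what the value represents), here is a hedged sketch; the helper name and JSON shape are hypothetical and not part of Microsoft.Extensions.AI.

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Nodes;

// Hypothetical helper: bundle a function's raw return value with its return
// schema so the model has context about what the value means.
static JsonNode WrapResultWithSchema(JsonNode? result, JsonElement returnSchema) =>
    new JsonObject
    {
        ["result"] = result,
        ["resultSchema"] = JsonNode.Parse(returnSchema.GetRawText())
    };

// Example usage with made-up values:
JsonElement returnSchema = JsonDocument.Parse("""
    { "type": "number", "description": "Temperature in degrees Celsius." }
    """).RootElement;

JsonNode enriched = WrapResultWithSchema(JsonValue.Create(21.5), returnSchema);
Console.WriteLine(enriched.ToJsonString()); // sent back to the model as the tool result
```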


Successfully merging this pull request may close the following issue:

- Microsoft.Extensions.AI.AIFunctionMetadata should allow to be initialized with a json schema string

4 participants