Should Content property ever come back null with FinishReason=ChatFilter with no exception thrown? #10345

Answered by SergeyMenshykh
pamims asked this question in Q&A

Hi @pamims, I think the reason you see output tokens consumed is that the LLM reasoned over the prompt and produced a result, but the result was then filtered out by the Azure OpenAI Service content filtering system. You can identify which filter was triggered by using the GetResponseContentFilterResult extension method from the Azure.AI.OpenAI.Chat namespace:
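
Here is a minimal sketch of that check, assuming the Semantic Kernel Azure OpenAI connector (which surfaces the underlying OpenAI.Chat.ChatCompletion via InnerContent) and the Azure.AI.OpenAI 2.x SDK. The chatService and chatHistory parameters are illustrative, the extension method is marked experimental so the AOAI001 diagnostic is suppressed, and the exact category property names may vary slightly by SDK version:

```csharp
#pragma warning disable AOAI001 // the content filter extensions are experimental in Azure.AI.OpenAI

using System;
using System.Threading.Tasks;
using Azure.AI.OpenAI.Chat;               // GetResponseContentFilterResult extension method
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using OpenAI.Chat;

public static class ContentFilterInspection
{
    public static async Task PrintResponseFilterAsync(IChatCompletionService chatService, ChatHistory chatHistory)
    {
        ChatMessageContent result = await chatService.GetChatMessageContentAsync(chatHistory);

        // The Azure OpenAI connector exposes the raw SDK response via InnerContent.
        if (result.InnerContent is ChatCompletion completion)
        {
            var filter = completion.GetResponseContentFilterResult();

            // Each category reports whether it triggered the filter and at what severity.
            Console.WriteLine($"Hate:     filtered={filter.Hate?.Filtered}, severity={filter.Hate?.Severity}");
            Console.WriteLine($"SelfHarm: filtered={filter.SelfHarm?.Filtered}, severity={filter.SelfHarm?.Severity}");
            Console.WriteLine($"Sexual:   filtered={filter.Sexual?.Filtered}, severity={filter.Sexual?.Severity}");
            Console.WriteLine($"Violence: filtered={filter.Violence?.Filtered}, severity={filter.Violence?.Severity}");
        }
    }
}
```

Whichever category comes back with Filtered set to true explains why Content is null and the finish reason indicates filtering even though output tokens were billed.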

CC: @RogerBarreto

Answer selected by sophialagerkranspandey
Category: Q&A
Labels: .NET (Issue or Pull requests regarding .NET code), triage
3 participants