
I'm using Microsoft.Extensions.AI to run queries against several Ollama models that I have installed. I have added custom functions (of type AIFunction) by creating a ChatOptions instance and passing it into CompleteAsync() when calling the models.
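For reference, my setup looks roughly like the following sketch. The tool name, model name, and endpoint are placeholders, not my real ones; I'm using the Microsoft.Extensions.AI.Ollama client and wrapping it with the function-invocation middleware:

```csharp
using System.ComponentModel;
using Microsoft.Extensions.AI;

// Illustrative tool; my real functions are similar in shape.
[Description("Gets the current weather for a city")]
static string GetWeather(string city) => $"Sunny in {city}";

// Ollama running locally; model name is a placeholder.
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.1");

// Enable automatic invocation of tools the model requests.
client = new ChatClientBuilder(client)
    .UseFunctionInvocation()
    .Build();

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetWeather)]
};

var response = await client.CompleteAsync("What's the weather in Paris?", options);
Console.WriteLine(response.Message.Text);
```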

I know that some models do not support custom functions. However, among those that definitely do, some return raw JSON in the response body instead of normal text, and they do so regardless of whether my custom function was actually invoked.

How can I fix this?
