Adding temperature when using Semantic Kernel and Ollama connector in C#
I have this code that works:
public ChatOllama(string? model = null, string? systemPrompt = null, Uri? endpoint = null, TimeSpan? timeout = null)
{
    // Fall back to the defaults for anything the caller did not supply.
    SystemPrompt = systemPrompt ?? SystemPrompt;
    string ollamaModel = string.IsNullOrEmpty(model) ? DefaultModel : model.ToLower().Trim();
    Uri ollamaEndpoint = endpoint ?? DefaultUri;
    TimeSpan ollamaTimeout = timeout ?? DefaultHttpTimeout;

    // HttpClient the Ollama connector will use to reach the server.
    this.ollamaEndpoint = new HttpClient
    {
        Timeout = ollamaTimeout,
        BaseAddress = ollamaEndpoint,
    };

    // Register the Ollama chat-completion connector and resolve the chat service.
    var builder = Kernel.CreateBuilder();
    builder.Services.AddOllamaChatCompletion(
        ollamaModel,
        httpClient: this.ollamaEndpoint
    );
    var kernel = builder.Build();
    m_chatService = kernel.GetRequiredService<IChatCompletionService>();

    History.AddSystemMessage(SystemPrompt);
}
How can I set the Ollama temperature when connecting to Ollama using Microsoft's Semantic Kernel library?
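
One possible approach (a sketch, not verified against your exact package versions): the Microsoft.SemanticKernel.Connectors.Ollama package exposes an OllamaPromptExecutionSettings class with a Temperature property, and you pass an instance of it on each request to the chat service rather than at connector registration time. The member names below (m_chatService, History) mirror the constructor above; the 0.2f value is just an illustrative choice.

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.Ollama;

// Ollama-specific request options; Temperature here assumes the
// OllamaPromptExecutionSettings type shipped with the Ollama connector.
var settings = new OllamaPromptExecutionSettings
{
    Temperature = 0.2f, // lower = more deterministic output
};

History.AddUserMessage("Hello!");

// Pass the settings with each call to the chat service.
ChatMessageContent reply = await m_chatService.GetChatMessageContentAsync(
    History,
    executionSettings: settings);

History.AddAssistantMessage(reply.Content ?? string.Empty);

If your connector version predates OllamaPromptExecutionSettings, the base PromptExecutionSettings type's ExtensionData dictionary is the usual fallback for provider-specific options, though whether the Ollama connector reads a "temperature" entry from it depends on the connector version.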