Description
When using a locally hosted model (llama 3.1) with the OpenAI backend, the temperature hardcoded by the Mattermost plugin (value 1.0) is too high. The model tends to invent new words and produce incorrect sentences in foreign languages. Using a temperature of 0.5 solves the issue.
It would be nice to be able to set the temperature alongside the other model parameters; a rough sketch of what that could look like follows.
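This is not the plugin's actual code, just a minimal sketch assuming the github.com/sashabaranov/go-openai client and a hypothetical `completeWithTemperature` helper, showing how a temperature taken from the plugin's model parameters could be passed through to the OpenAI-compatible backend instead of a hardcoded 1.0:

```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

// completeWithTemperature sends a chat completion request using a
// caller-supplied temperature instead of a hardcoded value.
// baseURL, apiKey, model, and temperature are assumed to come from the
// plugin's model configuration (hypothetical fields, for illustration only).
func completeWithTemperature(ctx context.Context, baseURL, apiKey, model, prompt string, temperature float32) (string, error) {
	cfg := openai.DefaultConfig(apiKey)
	cfg.BaseURL = baseURL // e.g. a locally hosted llama 3.1 server exposing the OpenAI API

	client := openai.NewClientWithConfig(cfg)
	resp, err := client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
		Model:       model,
		Temperature: temperature, // 0.5 avoided the invented words described above
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: prompt},
		},
	})
	if err != nil {
		return "", err
	}
	return resp.Choices[0].Message.Content, nil
}

func main() {
	// Example values; the endpoint, model name, and temperature would be
	// read from the plugin's settings in practice.
	out, err := completeWithTemperature(context.Background(),
		"http://localhost:8080/v1", "unused", "llama-3.1", "Summarize this thread.", 0.5)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(out)
}
```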