
Why Does My AI Reply to My Threads Instead of Posting Directly in the Channel? #158

Open
CyberXUDD opened this issue Mar 21, 2024 · 5 comments

Comments

@CyberXUDD

[screenshot]

According to https://github.com/mattermost/mattermost-plugin-ai/blob/master/docs/usage.md, the AI can post messages directly in the channel:
[screenshot]

@CyberXUDD

These are my settings:
[screenshots]

@esethna

esethna commented Mar 28, 2024

Thanks @CyberXUDD. It responds in a thread since it's a response to the original query, and that helps keep the context of conversations with the AI bot contained. I hear the frustration, though, since the RHS doesn't pop open when you message the AI bot in a DM. I'll take this feedback back to the team. In the meantime, I'd also suggest interacting with the AI bot through the icon on the right-hand side. We'd welcome any more feedback on it!

[screenshot]

@azigler

azigler commented Mar 29, 2024

@esethna In this case, could the user turn off Collapse Reply Threads to have all of the threaded messages show in the root channel? They would still be threaded, but visually they would be in the main channel.

@CyberXUDD

My initial feedback indeed lacked thorough consideration. I think a more accurate description would be as follows:

Since Mattermost channels are created around topics, it can be assumed that the discussions within a channel all relate to a single theme. Therefore, in the context of a channel, the AI assistant can treat all members of the channel as the same "input source". The AI doesn't need to differentiate who is expressing opinions; it just needs to reply based on the context of the channel's content itself.

This is the ideal experience, but I do recognize there are potential pitfalls. When a channel has thousands or tens of thousands of messages, it's uncertain whether the AI can still follow the context smoothly, because channel chat history can easily grow to a very large scale.

So I believe the current approach of replying in separate threads within group chats is more viable. I look forward to the PM team bringing more advanced designs.

Regarding DMs, the whole experience would indeed be smoother if the AI could reply directly in the conversation, but that also poses issues. Since there is only one DM channel with the bot by default, presenting every message as a single running conversation becomes cumbersome when I want to switch topics. The current reply method, though cumbersome, ensures that each conversation is a new topic and avoids interference between them.

I may not have thought things through carefully when I first gave feedback. Thanks to both of you for your responses; I will keep thinking about whether there are better solutions and will share any further feedback.

@esethna

esethna commented Apr 5, 2024

The AI doesn't need to differentiate who is expressing opinions; it just needs to reply based on the context of the channel's content itself.

100% agree here, and that's something we're working toward. You pointed out the challenge, which is that channels may have hundreds of thousands of messages, so we need a way to first search for the messages relevant to the query and then provide only those as context to the LLM. In the interim, interacting with the AI in a thread offers a way to provide all of the context back to the LLM, since threads won't have as large a number of messages as channels.
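To illustrate the search-then-prompt idea described above, here is a minimal sketch in Go. It is not the plugin's actual code: `Message`, `searchRelevantMessages`, and `buildPrompt` are hypothetical names, the keyword match stands in for a real semantic or embedding search, and the call to the configured LLM backend is left out.

```go
package main

import (
	"fmt"
	"strings"
)

// Message is a minimal stand-in for a channel post.
type Message struct {
	User string
	Text string
}

// searchRelevantMessages is a hypothetical retrieval step: instead of sending
// the entire channel history to the model, keep only the messages that match
// the query. A real implementation would likely use embedding/semantic search.
func searchRelevantMessages(history []Message, keyword string, limit int) []Message {
	var relevant []Message
	for _, m := range history {
		if strings.Contains(strings.ToLower(m.Text), strings.ToLower(keyword)) {
			relevant = append(relevant, m)
			if len(relevant) == limit {
				break
			}
		}
	}
	return relevant
}

// buildPrompt packs only the retrieved messages plus the user's question into
// the prompt, keeping the context small no matter how large the channel is.
func buildPrompt(relevant []Message, question string) string {
	var b strings.Builder
	b.WriteString("Relevant channel messages:\n")
	for _, m := range relevant {
		fmt.Fprintf(&b, "%s: %s\n", m.User, m.Text)
	}
	fmt.Fprintf(&b, "\nUser question: %s\n", question)
	return b.String()
}

func main() {
	history := []Message{
		{User: "alice", Text: "The deploy failed on step 3 again"},
		{User: "bob", Text: "Anyone up for lunch?"},
		{User: "carol", Text: "The deploy token expired, that's why step 3 fails"},
	}

	// Retrieve a small, relevant slice of the channel, then build the prompt
	// that would be sent to whichever LLM backend is configured.
	relevant := searchRelevantMessages(history, "deploy", 10)
	fmt.Print(buildPrompt(relevant, "Why did the deploy fail?"))
}
```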
