Welcome to TotaUI, the next version of my React-based web interface for running local Large Language Models (LLMs) through Ollama. This updated version improves on the original TotaUI, making it easier and faster to chat with your local models.
Here’s how to get it up and running on your machine:
```bash
git clone https://github.com/dhaneshdutta/totaUI-v2.git
cd totaUI-v2
```
You’ll need Node.js installed, plus Ollama installed and running locally so the app has a model server to talk to. Once you have those, install the dependencies:
```bash
npm install
```
To launch the web interface locally:
```bash
npm start
```
This will open the app in your browser at http://localhost:3000. You’re ready to interact with your local LLM!
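Under the hood, a UI like this talks to Ollama’s local HTTP API, which listens on port 11434 by default. As a rough sketch — the exact endpoint and options TotaUI uses may differ, and the model name `llama3` here is just an example of something you might have pulled with `ollama pull` — a chat request against Ollama’s `/api/chat` endpoint looks like this:

```javascript
// Build a request body for Ollama's /api/chat endpoint.
// The model name and options are illustrative, not TotaUI's actual defaults.
function buildChatRequest(model, messages) {
  return {
    model,         // e.g. "llama3" — any model pulled via `ollama pull`
    messages,      // [{ role: "user" | "assistant" | "system", content: "..." }]
    stream: false, // set true to receive the reply as streamed chunks
  };
}

// Send the request to a locally running Ollama server and return the reply text.
async function chat(model, messages) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```

If a request fails, the usual culprit is that the Ollama server isn’t running or the model hasn’t been pulled yet.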
Here’s what the UI looks like in action:
Feel free to open issues or suggest new features. I’m open to collaboration!
That’s it! If you have any questions or run into any issues, don’t hesitate to reach out. Enjoy chatting with your local models! 😊