# CandleMist

Fullstack chatbot built in Rust with Candle, Leptos, Actix, Tokio and TailwindCSS. Uses quantized Mistral 7B Instruct v0.1 GGUF models.
<div align="center"> <img src="assets/image.png" width="45%"> <img src="assets/image1.png" width="45%"> </div>
- A simple chatbot built using Rust, in both the frontend and the backend.
- Made using `candle`, `leptos`, `actix`, `tokio` and TailwindCSS.
- Uses quantized Mistral 7B Instruct v0.1 GGUF models.
## Credits
- This is a fork of MoonKraken/rusty_llama by Code to the Moon.
- This chatbot uses Mistral GGUF models and the `huggingface/candle` framework, which includes the `candle-transformers` crate, whereas the original uses GGML models and the `rustformers/llm` crate.
- The frontend has some aesthetic changes, but the overall structure is the same.
- Colours are from the Tokyo Night colorscheme.
## Setup Instructions

### Rust Toolchain

You'll need the nightly Rust toolchain, the wasm32-unknown-unknown target, and the `trunk` and `cargo-leptos` tools:
```sh
rustup toolchain install nightly
rustup target add wasm32-unknown-unknown
cargo install trunk cargo-leptos
```
### Hardware

- For CUDA, add the `cuda` feature for `candle-core` in `Cargo.toml`:

  ```toml
  candle-core = { git = "https://github.com/huggingface/candle.git", version = "0.6.0", optional = true, features = ["cuda"] }
  ```

- For Metal, add the `metal` feature for `candle-core` in `Cargo.toml`.
- For Intel's oneAPI Math Kernel Library, add the `mkl` feature for `candle-core` in `Cargo.toml`.
- For Accelerate, add the `accelerate` feature for `candle-core` in `Cargo.toml`.
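As a sketch, the non-CUDA variants swap only the feature name; for example, the Metal dependency line would be (same git coordinates and version as the CUDA line above):

```toml
candle-core = { git = "https://github.com/huggingface/candle.git", version = "0.6.0", optional = true, features = ["metal"] }
```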
### Model

- Download any Mistral 7B Instruct v0.1 GGUF model (see Tested Models) and set the environment variable `MODEL_PATH` in `.env`.
- Download `tokenizer.json` (see Tokenizer) and set the environment variable `TOKENIZER_PATH` in `.env`.
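For example, a `.env` might look like this (the file paths are placeholders for wherever you saved the model and tokenizer you downloaded):

```sh
MODEL_PATH=./models/mistral-7b-instruct-v0.1.Q4_K_M.gguf
TOKENIZER_PATH=./models/tokenizer.json
```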
### TailwindCSS

- Install TailwindCSS with `npm install -D tailwindcss`.
### Run

```sh
git clone https://github.com/ShettySach/CandleMist.git
cd CandleMist
npx tailwindcss -i ./input.css -o ./style/output.css
cargo leptos serve --release
```

- In your browser, navigate to http://localhost:3000/
> **NOTE**: You can modify parameters such as temperature, seed, top-k, top-p, max history and max response length, as well as the chat template, in `src/api.rs`.
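To give a feel for what the temperature and top-k parameters do, here is a minimal, self-contained sketch (not CandleMist's actual code, which delegates sampling to `candle-transformers`): logits are divided by the temperature, softmaxed, and all but the k most likely tokens are masked out before sampling.

```rust
/// Illustrative only: temperature scaling + top-k filtering over raw logits.
/// Assumes k >= 1 and temperature > 0. Ties at the k-th probability may keep
/// slightly more than k tokens, which is fine for a sketch.
fn top_k_probs(logits: &[f32], temperature: f32, k: usize) -> Vec<f32> {
    assert!(k >= 1 && temperature > 0.0);
    // Temperature scaling: lower values sharpen the distribution.
    let scaled: Vec<f32> = logits.iter().map(|l| l / temperature).collect();
    // Softmax (subtract the max for numerical stability).
    let max = scaled.iter().cloned().fold(f32::NEG_INFINITY, f32::max);
    let exps: Vec<f32> = scaled.iter().map(|l| (l - max).exp()).collect();
    let sum: f32 = exps.iter().sum();
    let mut probs: Vec<f32> = exps.iter().map(|e| e / sum).collect();
    // Top-k: zero out everything below the k-th largest probability,
    // then renormalise so the kept tokens sum to 1.
    let mut sorted = probs.clone();
    sorted.sort_by(|a, b| b.partial_cmp(a).unwrap());
    let threshold = sorted[k.min(sorted.len()) - 1];
    for p in probs.iter_mut() {
        if *p < threshold {
            *p = 0.0;
        }
    }
    let total: f32 = probs.iter().sum();
    probs.iter_mut().for_each(|p| *p /= total);
    probs
}

fn main() {
    // With k = 2, only the two most likely tokens keep nonzero mass.
    let p = top_k_probs(&[2.0, 1.0, 0.0, -1.0], 1.0, 2);
    println!("{:?}", p);
}
```

Lowering the temperature below 1.0 makes the model more deterministic; raising it (or increasing top-k/top-p) makes responses more varied.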