# DuaOS

Tryna give back differently this Ramadan.
Match your intent (problem, current du'a, or goal) to Allah's Names and Hadith via semantic search, then refine your du'a in a Prophetic style.
**Stack:** Next.js (App Router), Tailwind CSS, shadcn/ui, Supabase (pgvector), Mastra, OpenAI (gpt-4o-mini + text-embedding-3-small + Whisper for voice).

**Voice input (web):** Speech-to-text uses OpenAI Whisper via `/api/transcribe`. An optional "Arabic" toggle (ع) sends `language: "ar"` to Whisper for better Arabic transcription. For a future React Native (mobile) app, you can use TarteelAI/voice for on-device or online recognition and call the same DuaOS backend APIs (search, refine, transcribe).
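As a sketch of how the Arabic toggle maps to the Whisper request, the helper below builds the transcription parameters. The route internals aren't shown in this README, so the function and field names here are illustrative; only `language: "ar"` and the `whisper-1` model come from the setup described above.

```typescript
// Illustrative helper: maps the UI's Arabic toggle (ع) to the optional
// `language` hint sent with the Whisper transcription request.
type TranscribeOptions = { arabic?: boolean };

function buildTranscriptionParams(opts: TranscribeOptions): {
  model: string;
  language?: string;
} {
  const params: { model: string; language?: string } = { model: "whisper-1" };
  if (opts.arabic) params.language = "ar"; // ISO-639-1 code Whisper expects
  return params;
}
```

The route would pass these parameters, together with the uploaded audio file, to OpenAI's audio transcription endpoint.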
## Architecture

```mermaid
flowchart LR
  subgraph frontend [Frontend]
    UI[Next.js App Router]
    Shadcn[shadcn/ui]
    UI --> Shadcn
  end
  subgraph backend [Backend]
    SearchAPI["/api/search"]
    RefineAPI["/api/refine"]
  end
  subgraph data [Data and AI]
    Supabase[(Supabase pgvector)]
    Mastra[Mastra Dua Agent]
    OpenAI[OpenAI gpt-4o-mini and embeddings]
  end
  UI --> SearchAPI
  UI --> RefineAPI
  SearchAPI --> OpenAI
  SearchAPI --> Supabase
  RefineAPI --> Mastra
  Mastra --> OpenAI
  RefineAPI --> Supabase
```
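Conceptually, the search path embeds the query with text-embedding-3-small and asks pgvector (via the `match_documents` RPC) for the nearest stored embeddings. For intuition only, here is a dependency-free TypeScript sketch of that ranking step; the real work happens inside Postgres, and real embeddings have 1536 dimensions:

```typescript
// Toy in-memory version of what match_documents does in pgvector:
// score every stored embedding against the query embedding and
// return the k best matches by cosine similarity.
type Doc = { id: string; embedding: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(query: number[], docs: Doc[], k: number): { id: string; score: number }[] {
  return docs
    .map((d) => ({ id: d.id, score: cosine(query, d.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```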
## Setup

1. Clone and install

   ```bash
   npm install
   ```

2. Environment

   Copy `.env.example` to `.env` (or `.env.local`). Next.js only loads files whose names start with `.env` (e.g. `.env`, `.env.local`). Set:

   - `SUPABASE_URL` – Supabase project URL (server-only; never exposed to the client)
   - `SUPABASE_SERVICE_ROLE_KEY` – Supabase service role key (server-only)
   - `OPENAI_API_KEY` – from OpenAI
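For reference, a `.env` along these lines (placeholder values, not real keys):

```bash
# .env: placeholders only; use your project's real values
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_ROLE_KEY=your-service-role-key
OPENAI_API_KEY=your-openai-api-key
```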
3. Supabase

   - Create a project at database.new.
   - In the SQL Editor run:

     ```sql
     CREATE EXTENSION IF NOT EXISTS vector;
     ```

   - Run the schema: paste the contents of `supabase/schema.sql` into the SQL Editor and run.
4. Seed data

   Curated only (Names + curated hadith + Quran du'as):

   ```bash
   npm run seed
   ```

   Full Quran & Hadith (download repos, chunk, then seed):

   - Download: `npm run download-data` – fetches the full Quran (Arabic, risan/quran-json) and Sahih Bukhari (4thel00z/hadith.json) into `scripts/data/raw/`.
   - Chunk: `npm run chunk-data` – builds `scripts/data/quran-chunks.json` and `hadith-chunks.json` (Quran: one verse per chunk; Hadith: one per hadith, split only if very long).
   - Seed: `npm run seed` – if chunk files exist, seeds from them (full Quran + full downloaded Bukhari); otherwise uses the curated `scripts/hadiths.json` and `scripts/quran-duas.json`.
   - Important: all Quran ayas and the full downloaded hadith are indexed only when the chunk files exist before seeding.

   Licensing: the full Quran is from risan/quran-json (CC-BY-SA 4.0); the full hadith are from 4thel00z/hadith.json (GPL-3). See docs/DATA_SOURCES.md for use and attribution.

   The seed script uses `.env` (Node 20+ `--env-file`) and requires the table and RPC from step 3.
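The "split only if very long" rule for hadith can be pictured with a sketch like the one below. The actual `chunk-data` implementation may differ; the sentence-boundary split and `maxLen` threshold here are assumptions for illustration.

```typescript
// Hypothetical sketch of length-bounded chunking: keep a hadith whole
// unless it exceeds maxLen, then split on sentence boundaries, packing
// consecutive sentences into chunks that stay under the limit.
function chunkText(text: string, maxLen: number): string[] {
  if (text.length <= maxLen) return [text]; // common case: one chunk per hadith
  const chunks: string[] = [];
  let current = "";
  for (const sentence of text.split(/(?<=[.!?])\s+/)) {
    if (current && current.length + 1 + sentence.length > maxLen) {
      chunks.push(current);
      current = sentence;
    } else {
      current = current ? `${current} ${sentence}` : sentence;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```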
5. Run dev

   ```bash
   npm run dev
   ```

   Open http://localhost:3000.
## Deploy to Vercel

1. Push the repo to GitHub and connect it in Vercel.
2. In the project Settings → Environment Variables, add:
   - `OPENAI_API_KEY`
   - `SUPABASE_URL`
   - `SUPABASE_SERVICE_ROLE_KEY`
3. Deploy. All secrets are server-only; the frontend never receives API keys or Supabase credentials.
## Scripts

- `npm run dev` – start the Next.js dev server
- `npm run build` – production build
- `npm run seed` – clear and seed `spiritual_assets` with Names + hadith + Quran (full if chunk files exist, curated fallback otherwise)
- `npm run download-data` – download full Quran + Bukhari raw data
- `npm run chunk-data` – build Quran/hadith chunk files from raw data
- `npm run data:pipeline` – run download + chunk + seed
## Project structure

- `src/app/page.tsx` – Search hub (Problem / Refine / Goal) and Refined du'a + Ledger
- `src/app/api/search/route.ts` – Embed query, call `match_documents`, return name + hadith list + quran (rate-limited, validated)
- `src/app/api/refine/route.ts` – Mastra Dua agent stream (rate-limited, validated)
- `src/app/api/transcribe/route.ts` – Whisper transcription (optional `language`, e.g. `ar` for Arabic)
- `src/lib/validation.ts` – Zod schemas and input limits for API routes
- `src/lib/rate-limit.ts` – In-memory rate limiting by IP
- `src/mastra/agents/dua-agent.ts` – DuaOS refiner agent (gpt-4o-mini)
- `scripts/names-of-allah.json` – 99 Names with intent tags
- `scripts/seed-spiritual-assets.ts` – Embed and insert into Supabase
- `supabase/schema.sql` – Table and `match_documents` RPC
## Security

- No secrets in the frontend: the app only calls relative URLs (`/api/search`, `/api/refine`). All credentials are read server-side from env; nothing sensitive is in the client bundle.
- Input validation: request bodies are validated with Zod (type, length, trim). The search query and refine input have max lengths to prevent abuse and stay within model limits.
- SQL injection: the database is only called via a parameterized RPC (`match_documents` with a vector and typed params). User text is never concatenated into SQL.
- Rate limiting: in-memory limits by IP: 20 requests/min for search, 10/min for refine. Responses use `429 Too Many Requests` and `Retry-After` when exceeded. On multi-instance or Edge deployments, limits are per instance. For production at scale, use a shared store (e.g. Upstash Redis): set `UPSTASH_REDIS_REST_URL` and `UPSTASH_REDIS_REST_TOKEN` and switch `src/lib/rate-limit.ts` to use `@upstash/ratelimit` when those env vars are present.
- Error messages: API responses return generic messages (e.g. "Search failed.") and do not leak internal details or stack traces.
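The in-memory limiter described above can be approximated with a fixed-window counter per IP. This is a sketch, not the actual `src/lib/rate-limit.ts`; the class and method names are assumptions.

```typescript
// Fixed-window rate limiter keyed by IP. State lives in process memory,
// which is why limits are per instance on multi-instance deployments.
class FixedWindowLimiter {
  private windows = new Map<string, { count: number; start: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed, false if it should get a 429.
  allow(ip: string, now: number = Date.now()): boolean {
    const w = this.windows.get(ip);
    if (!w || now - w.start >= this.windowMs) {
      this.windows.set(ip, { count: 1, start: now }); // start a new window
      return true;
    }
    if (w.count >= this.limit) return false; // over the limit for this window
    w.count += 1;
    return true;
  }
}
```

For the limits above, search would use something like `new FixedWindowLimiter(20, 60_000)` and refine `new FixedWindowLimiter(10, 60_000)`.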
## Personal Ledger

Refined du'as can be saved to the Personal Ledger (stored in `localStorage`). No auth in this MVP.
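A sketch of how the `localStorage` persistence might look; the key name and entry shape are assumptions, not the app's actual schema. The store is injected through a small interface so the same logic works in tests without a browser.

```typescript
// Hypothetical ledger persistence. `StringStore` matches the subset of
// the Web Storage API this code needs, so window.localStorage fits it.
type LedgerEntry = { text: string; savedAt: number };

interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const LEDGER_KEY = "dua-ledger"; // assumed key name

function saveDua(store: StringStore, entry: LedgerEntry): LedgerEntry[] {
  const raw = store.getItem(LEDGER_KEY);
  const ledger: LedgerEntry[] = raw ? JSON.parse(raw) : [];
  ledger.push(entry);
  store.setItem(LEDGER_KEY, JSON.stringify(ledger));
  return ledger;
}
```

In the browser this would be called as `saveDua(window.localStorage, entry)`.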
