What are System Prompts?
System prompts tell the AI enhancement provider how to process your transcribed text. They control grammar correction, formatting, tone, and structure.

Default Enhancement
Both Stenox Cloud and local providers come with fine-tuned enhancements that we continue to refine. No configuration needed — Stenox works great out of the box.

Custom prompts are optional for general use. However, profiles with custom prompts are powerful for users who want specific behavior per app — like professional tone in email clients or technical formatting in code editors.
Customizing Your Prompt
1
Open Profile Settings
Settings → Profiles tab → Select your profile → Click Edit
2
Enable LLM Enhancement
Toggle on Enable LLM Enhancement in the Post-Processing (LLM) section
3
Find System Prompt
Scroll to the System Prompt field — labeled “Instructions for the AI on how to format text”
4
Edit Prompt
Enter your custom instructions in the text box
5
Save Profile
Click Save Profile to apply changes
Example Prompts
Professional Email
Casual Notes
Technical Documentation
Meeting Notes
Creative Writing
Medical/Legal (Minimal)
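The tab labels above each correspond to a full prompt. As a starting point, prompts along these lines work well for each context (illustrative examples, not Stenox's built-in defaults):

```text
# Professional Email
Rewrite as a professional, formal email. Fix grammar and punctuation,
use complete sentences, and keep a polite, concise tone.

# Casual Notes
Lightly clean up the text. Fix obvious errors but keep my casual,
conversational voice. Make minimal changes.

# Technical Documentation
Rewrite as clear technical documentation. Use precise wording,
keep technical terms as-is, and format with markdown where helpful.

# Meeting Notes
Format as meeting notes: bullet points, one item per point,
with action items listed separately at the end.

# Creative Writing
Preserve my original phrasing and voice. Only fix spelling,
grammar, and punctuation. Do not rewrite sentences.

# Medical/Legal (Minimal)
Make minimal changes. Fix only clear transcription errors.
Do not change terminology, names, or numbers.
```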
Prompt Tips
Be specific about tone
Instead of just “improve this”, specify the tone you want:
- “professional and formal”
- “friendly and conversational”
- “technical and precise”
- “casual and relaxed”
Specify what NOT to change
Tell the AI what to preserve:
- “Keep technical terms as-is”
- “Don’t change proper nouns”
- “Preserve my original phrasing when possible”
Include formatting instructions
Request specific formatting:
- “Format as bullet points”
- “Add paragraph breaks”
- “Use markdown headers”
- “Keep as a single paragraph”
Set boundaries
Control how much the AI changes:
- “Make minimal changes” — Light touch
- “Polish and improve” — Moderate editing
- “Significantly rewrite for clarity” — Heavy editing
Prompt Patterns
Minimal Editing
Moderate Editing
Heavy Editing
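The three patterns above correspond to increasingly aggressive prompts. Illustrative versions of each (examples, not the exact built-in patterns):

```text
# Minimal Editing
Make minimal changes. Fix only grammar, spelling, and punctuation.
Preserve my original phrasing.

# Moderate Editing
Polish and improve the text. Fix errors, smooth awkward phrasing,
and add paragraph breaks, but keep my meaning and tone.

# Heavy Editing
Significantly rewrite for clarity. Restructure sentences and
paragraphs as needed while preserving the original meaning.
```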
Per-Profile Prompts
Different contexts need different prompts:

| Profile | Prompt Style | Example Use |
|---|---|---|
| Work Email | Professional, polished | Client communication |
| Slack | Casual, brief | Team chat |
| Documentation | Technical, clear | Docs and guides |
| Notes | Minimal changes | Personal capture |
| Creative | Preserve voice | Blog posts, writing |
Provider Differences
Different AI providers interpret prompts differently:

| Provider | Behavior |
|---|---|
| OpenAI GPT-4o | Follows complex instructions well, nuanced |
| Google Gemini | Good at formatting, fast responses |
| Groq LLMs | Fast, good for simpler prompts |
| MLX (Local) | Works best with simple, direct prompts |
For local MLX models: Keep prompts simple and direct. Complex multi-step instructions may not work as well as with larger cloud models.
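As a concrete example of that advice, here is one way a multi-step prompt might be simplified for a local model (illustrative only):

```text
# Likely too complex for a small local model:
First detect the type of content. If it is an email, apply a formal
tone; otherwise keep it casual. Add markdown headers at topic changes
and summarize anything over 200 words.

# Better for MLX:
Fix grammar and punctuation. Use a professional tone.
Keep the original meaning.
```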
Testing Your Prompts
After changing a prompt:

- Test with typical input — Dictate something you’d normally say
- Check the output — Does it match your expectations?
- Iterate — Refine the prompt until it works consistently
- Edge cases — Test with unusual input to catch issues
Troubleshooting
AI changes too much
Add constraints, for example: “Make minimal changes. Only fix grammar and punctuation. Preserve my original phrasing.”
AI doesn't change enough
Be more directive, for example: “Significantly rewrite for clarity. Restructure sentences as needed.”
Wrong formatting
Be explicit about the format you want, for example: “Format as bullet points with one idea per point.” Or request the opposite: “Keep everything as a single paragraph.”
Prompt works with one provider but not another
Different models have different capabilities. Simplify prompts for local MLX models, or create separate profiles for different providers.