AI Expert Prompts Backfire, Study Finds
24 Mar
Summary
- Asking AI to 'act as an expert' can decrease result reliability.
- Persona prompts may hinder knowledge tasks like math and coding.
- Comprehensive prompts with context improve AI output quality.

Researchers have found that prompting AI to adopt an 'expert' persona may not enhance result reliability and could even impair performance on knowledge-intensive tasks like mathematics and coding. These persona prompts seem to trigger an instruction-following mode, diverting the AI from accurate fact recall.
Rather than over-engineering prompts with personas, it is more effective to give the AI comprehensive instructions that supply sufficient context and the tools it needs. This lets the model act more autonomously and produce better output. Developers are also advised against trying to exploit AI biases through prompt engineering.
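To make the contrast concrete, here is a minimal sketch of the two prompting styles; the function names and prompt wording are illustrative, not taken from the study:

```python
# Hypothetical sketch contrasting a persona-style prompt with a
# context-rich prompt. Neither function calls a real model; they
# only build the prompt string that would be sent.

def persona_prompt(task: str) -> str:
    # Persona style: asks the model to role-play an expert,
    # which the study suggests can hurt knowledge-intensive tasks.
    return f"You are a world-class expert. {task}"

def context_prompt(task: str, context: list[str], tools: list[str]) -> str:
    # Context-rich style: states the task plainly, supplies the
    # relevant facts, and lists the tools the model may use.
    parts = [f"Task: {task}", "Context:"]
    parts += [f"- {c}" for c in context]
    parts.append("Available tools: " + ", ".join(tools))
    return "\n".join(parts)

prompt = context_prompt(
    "Fix the failing unit test in utils.py",
    context=["Test output: AssertionError on line 42",
             "utils.py targets Python 3.12"],
    tools=["run_tests", "read_file"],
)
print(prompt)
```

The second form carries no role-play instruction at all; everything in it is information the model can actually use.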
New approaches such as PRISM let the AI compare its outputs with and without a persona, learning to apply one only when it helps. Reasoning models tend to benefit more from longer context, while instruction-tuned models are more sensitive to personas. Ultimately, a clear task and relevant context matter most.
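PRISM's internals are not described in the article; a minimal sketch of the underlying idea, with a hypothetical scoring function standing in for whatever quality signal such a system would use, might look like:

```python
from typing import Callable

def choose_prompt(task: str,
                  model: Callable[[str], str],
                  score: Callable[[str], float],
                  persona: str = "You are an expert.") -> str:
    """Run the task with and without a persona prefix and keep
    whichever prompt produced the higher-scoring answer.
    Ties favor the plain prompt, since the persona adds no value."""
    plain = task
    with_persona = f"{persona} {task}"
    if score(model(with_persona)) > score(model(plain)):
        return with_persona
    return plain

# Stub model and scorer for illustration only: this fake model
# ignores the persona prefix, so both answers score the same and
# the plain prompt wins the tie.
fake_model = lambda prompt: "answer"
fake_score = lambda answer: float(len(answer))

print(choose_prompt("Compute 2+2.", fake_model, fake_score))
```

A production system would amortize this comparison over many tasks rather than paying for two model calls every time, but the selection logic is the same.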




