Engineering Prompts

Separating Instruction from Data
Large language models are easily confused if you're not clear about what you want from them. We'll look at a simple way of dealing with this, and briefly explore the challenges of "prompt injections".

Marcel Salathé
May 10, 2023

In a well-known story, a passenger entered a self-driving taxi and requested, "get me to the airport as quickly as possible." The taxi raced off, seemingly out of control; by the time it arrived a few minutes later, the reckless driving had made the passenger vomit all over the vehicle's interior. The trip also racked up numerous speeding and red-light violations, which ultimately put the robo-taxi out of business.

This amusing tale highlights the importance of clear communication with AI systems. When you talk to a person, a lot of context can remain unsaid; large language models, however, will often miss that context unless you provide it explicitly.

© 2025 Marcel Salathé