Exercise caution when building off LLMs
Summary
Large Language Models (LLMs) have attracted widespread interest since ChatGPT’s release, and organisations are rapidly integrating them into services. The NCSC cautions that our understanding of LLMs is still ‘in beta’: models and vendor offerings change quickly, behaviours can be unpredictable, and specific vulnerabilities (notably prompt injection) allow user-supplied data to override intended instructions. The blog advises treating LLM-based components with caution, avoiding delegation of critical tasks (for example, financial transactions) to LLMs without strong safeguards, and designing system architectures that assume worst-case LLM behaviour while following emerging research on mitigations.
Key Points
- LLM APIs and models evolve rapidly; dependencies may break or providers may disappear.
- LLMs show behaviours beyond classical ML; they can exhibit more general capabilities that are not yet fully understood.
- Prompt injection is a practical risk: models may not reliably distinguish instructions from data and can be manipulated by adversaries.
- Traditional security testing is insufficient — testing LLM-powered apps may require social-engineering-style approaches to reveal weaknesses.
- Architect systems assuming the ‘worst case’ of what an LLM component might do; avoid granting LLMs authority over critical operations like payments.
- Research into mitigations is ongoing and there are no surefire fixes yet — proceed with caution and monitor developments.
Author style
Punchy: This is a succinct, authoritative warning from NCSC. If you’re responsible for product design, risk or security, the article amplifies why you shouldn’t rush LLMs into critical workflows — read the detail and act on the guardrails it suggests.
Why should I read this?
Quick take: it’s a solid reality check. LLMs are exciting but flaky — we skimmed it so you don’t have to. If your project uses LLMs for customer-facing features, automation or money moves, this piece flags real attack paths and sensible precautions you need to know now.
Source
Source: https://www.ncsc.gov.uk/blog-post/exercise-caution-building-off-llms