Tokenization
Tokenization replaces sensitive data with non-sensitive tokens that map back to the original.
Updated: 2026-03-06
Definition
Tokenization swaps sensitive values (such as card numbers) for surrogate tokens, and keeps the token-to-original mapping in a secure vault.
It reduces exposure because downstream systems can store and process tokens instead of the original data.
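As a rough sketch (not a production design), the example below models the vault as an in-memory dictionary. The TokenVault class and its method names are illustrative; a real deployment would back the mapping with a hardened, access-controlled datastore rather than a Python dict.

```python
import secrets

class TokenVault:
    """Illustrative in-memory token vault (a real vault would be a
    hardened, access-controlled datastore, not a dict)."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}

    def tokenize(self, sensitive_value: str) -> str:
        # Generate a random surrogate with no mathematical relationship
        # to the original value, then record the mapping in the vault.
        token = "tok_" + secrets.token_urlsafe(16)
        self._token_to_value[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Reversal is only possible via a vault lookup.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")  # sample card number
print(token)                    # e.g. tok_3q2xK9... (random surrogate)
print(vault.detokenize(token))  # original value, recoverable only via the vault
```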
Key points
- Reduces exposure of sensitive data
- Uses a secure mapping/vault
- Different from encryption: a token has no mathematical relationship to the original value, so it cannot be reversed without a vault lookup (encryption, by contrast, is reversible with the key)
Common mistakes
- Storing the vault mapping insecurely.
- Confusing tokenization with hashing or encryption (the hashing contrast is sketched below).
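To make the hashing contrast concrete, here is an illustrative snippet using Python's standard hashlib. A hash is deterministic and one-way, with no vault to recover the original, which is exactly why it is not tokenization.

```python
import hashlib

card = "4111 1111 1111 1111"  # sample card number
digest = hashlib.sha256(card.encode()).hexdigest()

print(digest)  # the same input always yields the same digest (deterministic)
# There is no detokenize() here: a hash is one-way by design, and a
# deterministic digest of a low-entropy value like a card number can be
# brute-forced from a lookup table, so hashing is not a substitute for
# vault-based tokenization.
```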