Final answer:
Tokenization replaces sensitive data with a non-sensitive token; encryption transforms data into ciphertext that only the correct key can reverse; and data masking hides the original data behind fictitious content. Option 1 is the correct choice, as it describes each process according to its function in securing data.
Step-by-step explanation:
Tokenization, Encryption, and Data Masking are three different methods of securing sensitive data (a small code sketch contrasting them follows the list).
- Tokenization is the process of replacing sensitive data with a non-sensitive equivalent, referred to as a token, that has no extrinsic or exploitable meaning. The token is mapped back to the original sensitive data only through the tokenization system.
- Encryption is the process of using an algorithm to transform the original sensitive data into an unreadable form, known as ciphertext, which can only be deciphered with the correct key.
- Data Masking is the process of hiding the original data with modified content (characters or other data), known as fictitious data or a mask.
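A minimal Python sketch contrasting the three techniques is below. It is only an illustration, not part of the question: it assumes the third-party `cryptography` package for Fernet symmetric encryption, and the `token_vault`, `tokenize`, `detokenize`, and `mask` names are hypothetical helpers chosen for this example.

```python
# Illustrative sketch of the three techniques; helper names are hypothetical.
import secrets
from cryptography.fernet import Fernet  # assumes the "cryptography" package is installed

# --- Tokenization: substitute a meaningless random token for the value and
# --- keep the mapping in a vault so only the tokenization system can reverse it.
token_vault = {}  # token -> original value (would be a secured datastore in practice)

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)   # random stand-in with no exploitable meaning
    token_vault[token] = value     # mapping lives only inside the vault
    return token

def detokenize(token: str) -> str:
    return token_vault[token]      # lookup is the only way back to the original

# --- Encryption: transform the value into ciphertext, reversible only with the key.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt(value: str) -> bytes:
    return cipher.encrypt(value.encode())

def decrypt(ciphertext: bytes) -> str:
    return cipher.decrypt(ciphertext).decode()

# --- Data masking: overwrite most of the value with fictitious characters;
# --- the original cannot be recovered from the masked output alone.
def mask(value: str, visible: int = 4) -> str:
    return "*" * (len(value) - visible) + value[-visible:]

if __name__ == "__main__":
    card = "4111111111111111"
    tok = tokenize(card)
    enc = encrypt(card)
    print("Token:     ", tok, "->", detokenize(tok))
    print("Ciphertext:", enc[:16], "... ->", decrypt(enc))
    print("Masked:    ", mask(card))
```

The sketch highlights the core distinction: the token and the ciphertext can be reversed (through the vault or with the key, respectively), while the masked value is simply fictitious data standing in for the original.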
Therefore, the correct answer is that tokenization converts sensitive data into a non-sensitive token, encryption converts data into a secret code (ciphertext), and data masking replaces sensitive data with fictitious data. This corresponds to option 1 in the provided choices.