Exam CCSP topic 1 question 174 discussion

Actual exam question from ISC's CCSP
Question #: 174
Topic #: 1

What strategy involves replacing sensitive data with opaque values, usually with a means of mapping it back to the original value?

  • A. Masking
  • B. Anonymization
  • C. Tokenization
  • D. Obfuscation
Suggested Answer: C

Comments

kns20
Highly Voted 3 years, 2 months ago
Masking protects data in use. Tokenization is typically used to protect data at rest. Obfuscation scrambles sensitive data. Anonymization permanently replaces sensitive data with a substitute value.
upvoted 8 times
...
Ahbey_911
Highly Voted 3 years, 10 months ago
Tokenization is correct.
upvoted 5 times
...
Muhammadk007
Most Recent 1 week, 1 day ago
Selected Answer: C
The correct answer is C. Tokenization. Explanation: Tokenization involves replacing sensitive data with opaque, non-sensitive substitutes called tokens. These tokens can be mapped back to the original values using a secure token vault or lookup table, but they have no intrinsic meaning or value outside the specific system. This method is commonly used for credit card processing and other sensitive data management scenarios.
upvoted 1 times
...
TraceSplice
8 months ago
Selected Answer: C
C is correct - Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference (i.e. identifier) that maps back to the sensitive data through a tokenization system.
upvoted 1 times
...
Lee_Lah
9 months, 2 weeks ago
Selected Answer: C
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data through a tokenization system.
upvoted 2 times
...
JohnnyBG
9 months, 2 weeks ago
Selected Answer: A
Masking is the answer. Tokenisation means the value is exchanged for a token that you can convert back to the original value in another DB.
upvoted 1 times
...
akg001
2 years, 6 months ago
Selected Answer: C
C. Tokenization
upvoted 2 times
...
EdwardLeeBurtle
2 years, 10 months ago
Poorly worded question.
upvoted 2 times
...
HCL
4 years ago
Why aren't Masking or Obfuscation the answers? Algorithmic substitution in data masking allows regeneration of the real data too.
upvoted 1 times
evilwizardington
3 years, 9 months ago
Data masking is not always intended to allow regeneration of the original data; only a few techniques (such as algorithmic substitution) are. In contrast, tokenization is always intended to map back. The key word here is: mapping.
upvoted 7 times
...
...
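To make the vault-based mapping described in several comments above concrete, here is a minimal, illustrative Python sketch (not from the exam material): a random, opaque token stands in for the sensitive value, and the only way back to the original is a lookup in the vault.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault for illustration only.
    A real tokenization system keeps the vault in a hardened,
    access-controlled data store."""

    def __init__(self):
        self._token_to_value = {}   # token -> original sensitive value
        self._value_to_token = {}   # original value -> token (idempotent tokenize)

    def tokenize(self, value: str) -> str:
        # Reuse the existing token for a known value; otherwise mint a new
        # random token with no mathematical relationship to the value.
        if value in self._value_to_token:
            return self._value_to_token[value]
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Mapping back is only possible through the vault itself.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")   # hypothetical card number
print(token)                       # opaque value, safe to store downstream
print(vault.detokenize(token))     # original recovered only via the vault
```

The token itself carries no exploitable information, which is exactly why the question's phrase "with a means of mapping it back to the original value" points to tokenization rather than anonymization or obfuscation.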
Community vote distribution: A (35%), C (25%), B (20%), Other