Exam SY0-601 topic 1 question 751 discussion

Actual exam question from CompTIA's SY0-601
Question #: 751
Topic #: 1

In which of the following scenarios is tokenization the best privacy technique to use?

  • A. Providing pseudo-anonymization for social media user accounts
  • B. Serving as a second factor for authentication requests
  • C. Enabling established customers to safely store credit card information
  • D. Masking personal information inside databases by segmenting data
Suggested Answer: C

Comments

Theoreign
Highly Voted 1 year, 5 months ago
Selected Answer: C
This question was on my exam, I passed, and I scored 800. I took the exam on 03/28/2024. Believe!
upvoted 13 times
meister13
Highly Voted 1 year, 7 months ago
Selected Answer: C
tokenization = almost always credit cards (for the exam)
upvoted 7 times
scholarbust
Most Recent 1 year, 1 month ago
Selected Answer: C
I love credit card debt, and if you don't, vote C.
upvoted 1 times
durel
1 year, 1 month ago
Selected Answer: C
C for sure.
upvoted 1 times
LuckyAro
1 year, 6 months ago
Selected Answer: D
D is the correct answer
upvoted 2 times
kong345
1 year, 7 months ago
Selected Answer: C
It's C, guys.
upvoted 4 times
ComPCertOn
1 year, 7 months ago
Selected Answer: C
It's only C.
upvoted 1 times
ProdamGarazh
1 year, 7 months ago
Selected Answer: C
It's absolutely about credit card storage. Tokenization replaces data elements with a token, or substitute value. A tokenization system retains both the token and the original value. Tokenization is commonly used with credit cards.
upvoted 2 times
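A minimal sketch in Python of what that comment describes, assuming a simple in-memory vault (the `CardVault` class and the `tok_` prefix are illustrative only, not any specific provider's API):

```python
import secrets

class CardVault:
    """Illustrative tokenization vault: it retains the token-to-PAN
    mapping, so the merchant's own systems only ever store the token."""

    def __init__(self):
        self._vault = {}  # token -> original primary account number (PAN)

    def tokenize(self, pan: str) -> str:
        # The token is a random surrogate with no mathematical
        # relationship to the PAN (unlike encryption or hashing).
        token = "tok_" + secrets.token_hex(16)
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map the token back to the real value.
        return self._vault[token]


vault = CardVault()
token = vault.tokenize("4111111111111111")
print(token)                    # e.g. tok_3f9a... -- safe to keep with the customer record
print(vault.detokenize(token))  # 4111111111111111 -- recoverable only through the vault
```

This is why C fits: the established customer's card number lives only in the tokenization provider's vault, and the merchant stores just the token for future charges.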
chimz2002
1 year, 7 months ago
Selected Answer: D
I'm changing my vote to D, "Masking personal information inside databases by segmenting data." Tokenization is the process of exchanging sensitive data (personal information) for nonsensitive data called "tokens" that can be used in a database or internal system without bringing it into scope. All tokenization is segmentation, but not all segmentation is tokenization. Tokenization is a special type of segmentation where we segment the entire text into words, as opposed to sentences or phrases.
https://www.cio.com/article/403692/data-tokenization-a-new-way-of-data-masking.html#:~:text=Tokenization%20masks%20or%20substitutes%20sensitive,data%20is%20called%20a%20token.
https://www.tokenex.com/blog/what-is-tokenization/
upvoted 1 times
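For contrast with answer D, a hedged sketch of the difference the comment above touches on (the helper names `mask_pan` and `tokenize_pan` are made up for illustration): masking irreversibly hides most of a value for display, while tokenization substitutes a reversible surrogate whose mapping is kept in a vault.

```python
import secrets

def mask_pan(pan: str) -> str:
    """Masking: hide all but the last four digits.
    The original value cannot be recovered from the masked output."""
    return "*" * (len(pan) - 4) + pan[-4:]

def tokenize_pan(pan: str, vault: dict) -> str:
    """Tokenization: substitute a random surrogate and keep the
    mapping in a vault so the original can be retrieved later."""
    token = "tok_" + secrets.token_hex(8)
    vault[token] = pan
    return token

vault = {}
pan = "4111111111111111"
print(mask_pan(pan))             # ************1111 -- display-safe, not reversible
print(tokenize_pan(pan, vault))  # tok_... -- reversible only via the vault mapping
```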
chimz2002
1 year, 7 months ago
Selected Answer: C
right answer
upvoted 1 times
Community vote distribution: A (35%), C (25%), B (20%), Other