Exam AWS Certified AI Practitioner AIF-C01 topic 1 question 162 discussion

What is tokenization used for in natural language processing (NLP)?

  • A. To encrypt text data
  • B. To compress text files
  • C. To break text into smaller units for processing
  • D. To translate text between languages
Suggested Answer: C 🗳️

Comments

Rcosmos
1 week, 6 days ago
Selected Answer: C
Tokenization is a fundamental step in natural language processing (NLP). It consists of splitting text into smaller parts called tokens, which can be words, syllables, short phrases, or even subwords or characters, depending on the model. These tokens are then used by AI models (such as LLMs) to analyze, understand, and generate text.
upvoted 1 time
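
The answer follows from what tokenization does in practice. As a minimal sketch (not any particular library's tokenizer), a word-level tokenizer in Python can be written with a regular expression; production LLMs typically go further and split text into subword tokens with schemes such as BPE or WordPiece, but the principle is the same: the text is broken into smaller units before processing.

    import re

    def simple_tokenize(text):
        # Split into runs of word characters or single punctuation marks.
        # Illustrative only; real LLM tokenizers use subword schemes (BPE, WordPiece).
        return re.findall(r"\w+|[^\w\s]", text)

    tokens = simple_tokenize("Tokenization breaks text into smaller units.")
    print(tokens)
    # ['Tokenization', 'breaks', 'text', 'into', 'smaller', 'units', '.']

Note that none of the other options match this behavior: tokenization neither encrypts (A), compresses (B), nor translates (D) text.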
Community vote distribution: A (35%), C (25%), B (20%), Other