
Exam NCA-GENL topic 1 question 3 discussion

Actual exam question from NVIDIA's NCA-GENL
Question #: 3
Topic #: 1

In large-language models, what is the purpose of the attention mechanism?

  • A. To measure the importance of the words in the output sequence.
  • B. To determine the order in which words are generated.
  • C. To capture the order of the words in the input sequence.
  • D. To assign weights to each word in the input sequence.
Suggested Answer: D
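For context on why D is the suggested answer: attention computes a softmax over similarity scores, producing one weight per input position, and returns a weighted sum of the inputs. A minimal sketch of scaled dot-product self-attention (the variant used in Transformers; the array values and names here are purely illustrative, using NumPy):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query scores every key; softmax turns the scores into
    weights over the input positions; the output is a weighted sum
    of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n_queries, n_keys)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row sums to 1
    return weights @ V, weights

# Toy example: 3 input "words", embedding dimension 4 (random values).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(x, x, x)      # self-attention
print(w.sum(axis=-1))  # one weight per input word; rows sum to 1
```

Note that the weights are assigned over the *input* sequence positions (answer D); word order itself is handled separately, by positional encodings, which is why C is incorrect.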

Comments

Puppy2716
1 week, 5 days ago
Selected Answer: A
Attention assigns weights that reflect the importance of each part of the sequence.
upvoted 1 times
Community vote distribution
A (35%)
C (25%)
B (20%)
Other
