Which of the following best describes Word2vec?
Correct Answer: C
Explanation: Word2vec is a technique that learns dense vector embeddings of words by training a shallow network to predict words from their surrounding context (skip-gram) or a word from its context (CBOW); words used in similar contexts end up with similar vectors.
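To make the skip-gram idea concrete, here is a minimal Python sketch of how Word2vec-style (center, context) training pairs are extracted from a token sequence. The function name and window default are illustrative; the actual embedding training (the neural network and negative sampling) is omitted.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs, the training signal for skip-gram Word2vec."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "cat", "sat"], window=1)
# → [("the", "cat"), ("cat", "the"), ("cat", "sat"), ("sat", "cat")]
```

Each pair asks the model to predict a context word given the center word; doing this over a large corpus is what pushes similar words toward similar vectors.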
In the context of language models, what does an autoregressive model predict?
Correct Answer: A
Explanation: An autoregressive language model predicts the next token in a sequence, conditioned on all of the tokens that precede it.
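The next-token idea can be sketched with the simplest possible autoregressive model, a bigram counter: it conditions only on the single previous token rather than the full prefix, but the prediction target is the same. Function names and the toy corpus are illustrative.

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count next-token frequencies conditioned on the previous token."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev):
    """Autoregressive step: predict the most likely next token given the context."""
    return counts[prev].most_common(1)[0][0]

tokens = "the cat sat on the mat the cat ran".split()
model = train_bigram(tokens)
predict_next(model, "the")  # → "cat" ("the" is followed by "cat" twice, "mat" once)
```

A real LLM replaces the count table with a neural network over the entire preceding context, but the training objective is still next-token prediction.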
In large language models, what is the purpose of the attention mechanism?
Correct Answer: D
Explanation: The attention mechanism lets the model weigh the relevance of every other token when computing the representation of each token, so it can focus on the parts of the input that matter most for the current prediction.
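A minimal Python sketch of scaled dot-product attention for a single query makes the weighing explicit: scores are dot products with each key, softmax turns them into weights that sum to 1, and the output is the weight-averaged values. The vectors here are toy inputs for illustration.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for one query vector."""
    d = len(query)
    # Relevance of each key to the query, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Output is the attention-weighted average of the value vectors
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

out = attention([1.0, 0.0],                       # query
                [[1.0, 0.0], [0.0, 1.0]],          # keys
                [[10.0, 0.0], [0.0, 10.0]])        # values
# out leans toward the first value, whose key matches the query
```

Because the query aligns with the first key, the first value dominates the output; tokens whose keys are irrelevant to the query contribute little.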
In the transformer architecture, what is the purpose of positional encoding?
Correct Answer: C
Explanation: Self-attention is permutation-invariant, so positional encoding injects information about each token's position into its embedding, letting the model distinguish token order.
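The original transformer used fixed sinusoidal positional encodings; a short Python sketch of one position's encoding shows the idea (even dimensions use sine, odd dimensions cosine, with wavelengths growing geometrically across dimensions). The function name is illustrative, and learned positional embeddings are a common alternative.

```python
import math

def positional_encoding(pos, d_model):
    """Sinusoidal positional encoding for one position: sin on even dims, cos on odd dims."""
    pe = []
    for i in range(d_model):
        # Wavelength grows geometrically from 2*pi to 10000*2*pi across dimension pairs
        angle = pos / (10000 ** ((2 * (i // 2)) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

positional_encoding(0, 4)  # → [0.0, 1.0, 0.0, 1.0]
```

Each position gets a distinct vector, which is added to the token embedding so that otherwise order-blind self-attention can tell positions apart.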