Certified Generative AI Engineer Associate Exam: Topic 1, Question 17 Discussion

Actual exam question from Databricks's Certified Generative AI Engineer Associate
Question #: 17
Topic #: 1

A Generative AI Engineer is tasked with developing a RAG application that will help a small internal group of experts at their company answer specific questions, augmented by an internal knowledge base. They want the best possible quality in the answers, and neither latency nor throughput is a major concern given that the user group is small and willing to wait for the best answer. The topics are sensitive and the data is highly confidential, so, due to regulatory requirements, none of the information may be transmitted to third parties.
Which model meets all of the Generative AI Engineer's needs in this situation?

  • A. Dolly 1.5B
  • B. OpenAI GPT-4
  • C. BGE-large
  • D. Llama2-70B
Suggested Answer: D

Comments

jothi28021951
2 days, 19 hours ago
Selected Answer: D
The engineer needs "the best possible quality in the answers." Llama2-70B is one of the largest and most capable open-source language models available and is known for producing high-quality outputs, so the answer is the Llama2-70B model.
upvoted 1 times
...
adaine
3 days, 4 hours ago
Selected Answer: D
D is the correct answer.
  • A) Dolly 1.5B is too small, so it won't give the best answer quality.
  • B) OpenAI GPT-4 is high quality, but sending confidential data to a third-party API violates the regulatory requirement.
  • C) BGE-large is an embedding model, so it isn't suitable for answering questions.
  • D) Llama2-70B is large enough to provide high-quality answers. Even though it is slower than smaller models, the question says the users don't mind the latency, and it can be self-hosted so there is no third-party data transmission (a rough sketch of this self-hosted setup follows below).
upvoted 1 times
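To make the reasoning above concrete, here is a minimal sketch of a fully self-hosted RAG pipeline. It is an illustration, not a Databricks reference solution: the Hugging Face model IDs, the toy knowledge base, the prompt format, and the answer() helper are all assumptions. The point is only that BGE-large handles retrieval (embeddings) while Llama2-70B generates the answer, with both running locally so no confidential data reaches a third party.

```python
# Minimal self-hosted RAG sketch (illustrative assumptions throughout).
import numpy as np
from sentence_transformers import SentenceTransformer
from transformers import pipeline

# BGE-large only produces embeddings for retrieval -- it cannot generate answers
# (which is why option C alone does not satisfy the requirement).
embedder = SentenceTransformer("BAAI/bge-large-en-v1.5")

# Llama2-70B (chat variant) generates the answer. The weights run locally, so no
# confidential data is sent to a third-party API (unlike option B).
generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-2-70b-chat-hf",  # gated model; requires accepting Meta's license
    device_map="auto",
)

# Hypothetical internal documents standing in for the real knowledge base.
knowledge_base = [
    "Policy X requires quarterly audits of all vendor contracts.",
    "System Y stores encrypted customer records on-premises only.",
]
doc_vectors = embedder.encode(knowledge_base, normalize_embeddings=True)

def answer(question: str) -> str:
    # Retrieve the most similar document; dot product equals cosine similarity
    # because the embeddings were normalized above.
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    context = knowledge_base[int(np.argmax(doc_vectors @ q_vec))]
    prompt = f"Use the context to answer.\nContext: {context}\nQuestion: {question}\nAnswer:"
    # return_full_text=False returns only the generated answer, not the prompt.
    out = generator(prompt, max_new_tokens=256, do_sample=False, return_full_text=False)
    return out[0]["generated_text"]

print(answer("How often must vendor contracts be audited?"))
```

In practice the same pattern could run on Databricks-managed infrastructure (for example, model serving endpoints inside the workspace), which preserves the no-third-party constraint while simplifying the hosting of a 70B-parameter model.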
...
AtulYadav
1 week ago
Selected Answer: C
The correct answer is BGE-large, which is C.
upvoted 1 times
...
Community vote distribution: A (35%), C (25%), B (20%), Other