Practice Exam: Oracle Cloud Infrastructure Generative AI Professional
- 1. In the simplified workflow for managing and querying vector data, what is the role of indexing?
- 3. In which scenario is soft prompting more appropriate than other training styles?
- 3. Which statement is true about Fine-tuning and Parameter-Efficient Fine-Tuning (PEFT)?
- 4. When does a chain typically interact with memory in a run within the LangChain framework?
- 5. What do prompt templates use for templating in language model applications?
- 6. What does a cosine distance of 0 indicate about the relationship between two embeddings?
- 7. What does accuracy measure in the context of fine-tuning results for a generative model?
- 8. What is the purpose of Retrievers in LangChain?
- 9. Which is a characteristic of T-Few fine-tuning for Large Language Models (LLMs)?
- 10. Which statement is true about string prompt templates and their capability regarding variables?
- 11. Which LangChain component is responsible for generating the linguistic output in a chatbot system?
- 12. How does the temperature setting in a decoding algorithm influence the probability distribution over the vocabulary?
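As a study aid for question 1, here is a minimal sketch of what the indexing step does in a vector-data workflow: it stores embeddings in a structure built for fast similarity lookup, so queries can be answered by nearest-neighbor search rather than re-embedding the corpus. The `VectorIndex` class below is hypothetical and uses brute-force cosine ranking; production systems use approximate structures such as HNSW or IVF.

```python
import math

class VectorIndex:
    """Hypothetical in-memory vector index (brute-force, for illustration only)."""

    def __init__(self):
        self.entries = []  # list of (doc_id, embedding) pairs

    def add(self, doc_id, vector):
        # Indexing: store the embedding keyed by its document id.
        self.entries.append((doc_id, vector))

    def query(self, vector, k=1):
        # Querying: rank stored embeddings by cosine similarity to the query.
        def sim(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb)

        ranked = sorted(self.entries, key=lambda e: sim(vector, e[1]), reverse=True)
        return [doc_id for doc_id, _ in ranked[:k]]

idx = VectorIndex()
idx.add("doc-a", [1.0, 0.0])
idx.add("doc-b", [0.0, 1.0])
print(idx.query([0.9, 0.1]))  # → ['doc-a']
```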
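For question 6, a worked example may help: cosine distance is 1 minus cosine similarity, so a distance of 0 means the two embeddings point in the same direction (maximum similarity), regardless of their magnitudes. A plain-Python sketch (not an OCI or LangChain API):

```python
import math

def cosine_distance(a, b):
    """Cosine distance = 1 - cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

# v2 is a scaled copy of v1: same direction, different magnitude.
v1 = [0.1, 0.2, 0.3]
v2 = [0.2, 0.4, 0.6]
print(cosine_distance(v1, v2))  # ≈ 0.0 → identical orientation
```

Note that the distance ignores vector length: only orientation matters, which is why scaling an embedding does not change its cosine distance to others.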
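For question 12, the mechanics can be sketched directly: temperature divides the logits before the softmax, so T < 1 sharpens the distribution (the most likely token dominates) and T > 1 flattens it (probability spreads across more tokens). A self-contained sketch using the standard softmax formula:

```python
import math

def softmax_with_temperature(logits, temperature=1.0):
    """Apply softmax after dividing logits by the temperature.

    T < 1 sharpens the distribution; T > 1 flattens it; T = 1 leaves it unchanged.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, 0.5))  # sharper: top token dominates
print(softmax_with_temperature(logits, 2.0))  # flatter: probabilities closer together
```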