Pass the NVIDIA-Certified Associate (NCA-GENL) questions and answers with CertsForce

Viewing page 2 of 3
Viewing questions 11-20
Question #11:

Which of the following best describes Word2vec?

Options:

A. A programming language used to build artificial intelligence models.
B. A statistical technique used to analyze word frequency in a text corpus.
C. A deep learning algorithm used to generate word embeddings from text data.
D. A database management system designed for storing and querying word data.


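Word2vec's skip-gram variant learns embeddings by predicting the context words around each center word. A minimal pure-Python sketch of how its (center, context) training pairs are generated; the function name and window size are illustrative, not from any specific library:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in Word2vec's skip-gram model."""
    pairs = []
    for i, center in enumerate(tokens):
        # Context = words within `window` positions of the center word
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs("the cat sat on the mat".split(), window=1)
```

The model then trains a small neural network on these pairs, and the learned input-layer weights become the word embeddings.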
Question #12:

Which metric is commonly used to evaluate machine-translation models?

Options:

A. F1 Score
B. BLEU score
C. ROUGE score
D. Perplexity


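BLEU scores a candidate translation by modified n-gram precision against a reference, scaled by a brevity penalty. A simplified single-reference sketch of the core computation (real implementations such as `nltk.translate.bleu_score` handle multiple references and smoothing):

```python
import math
from collections import Counter

def modified_precision(candidate, reference, n):
    """Clipped n-gram precision: candidate n-gram counts capped at reference counts."""
    cand = Counter(tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1))
    ref = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
    clipped = sum(min(c, ref[g]) for g, c in cand.items())
    total = sum(cand.values())
    return clipped / total if total else 0.0

def bleu(candidate, reference, max_n=2):
    """Geometric mean of n-gram precisions times a brevity penalty (simplified)."""
    precisions = [modified_precision(candidate, reference, n) for n in range(1, max_n + 1)]
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    bp = 1.0 if len(candidate) > len(reference) else math.exp(1 - len(reference) / len(candidate))
    return bp * math.exp(log_avg)
```

The clipping step is what stops a degenerate candidate like "the the the" from scoring well just by repeating a common reference word.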
Question #13:

Which metrics would you use to evaluate the performance of a RAG workflow in terms of the accuracy of responses generated in relation to the input query? (Choose two.)

Options:

A. Generator latency
B. Retriever latency
C. Tokens generated per second
D. Response relevancy
E. Context precision


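Response relevancy and context precision are the accuracy-oriented metrics here; latency and tokens per second measure speed, not answer quality. Context precision can be sketched as the fraction of retrieved contexts that are actually relevant to the query. This is a deliberately simplified version of what evaluation frameworks such as Ragas compute (they also weight hits by rank):

```python
def context_precision(retrieved_contexts, relevant_contexts):
    """Fraction of retrieved contexts that are relevant to the query (simplified)."""
    if not retrieved_contexts:
        return 0.0
    hits = sum(1 for c in retrieved_contexts if c in relevant_contexts)
    return hits / len(retrieved_contexts)

# Hypothetical example: 2 of 3 retrieved chunks are relevant
score = context_precision(["doc_a", "doc_b", "doc_c"], {"doc_a", "doc_c"})
```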
Question #14:

Imagine you are training an LLM with billions of parameters, and your training dataset is significantly larger than the available RAM in your system. Which of the following would be an alternative?

Options:

A. Using GPU memory to extend the RAM capacity for storing the dataset, moving the dataset in and out of the GPU, possibly over the PCIe bus.
B. Using a memory-mapped file that allows the library to access and operate on elements of the dataset without needing to fully load it into memory.
C. Discarding the excess data and pruning the dataset to the capacity of the RAM, resulting in reduced latency during inference.
D. Eliminating sentences that are syntactically different but semantically equivalent, possibly reducing the risk of the model hallucinating, since it is trained to get to the point.


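The memory-mapped approach in option B can be sketched with Python's standard `mmap` module: the OS pages data in on demand, so slicing the map touches only the pages needed rather than loading the whole file. The file name and contents below are illustrative stand-ins for a real dataset:

```python
import mmap
import os
import tempfile

# Illustrative stand-in for a dataset file too large to hold in RAM
path = os.path.join(tempfile.mkdtemp(), "corpus.bin")
with open(path, "wb") as f:
    f.write(b"tokenized training data " * 1000)

with open(path, "rb") as f:
    # Map the whole file; no bytes are read until we actually touch them
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    first_record = mm[:9]    # pages in only the start of the file
    total_bytes = mm.size()
    mm.close()
```

Libraries such as NumPy (`np.memmap`) and Hugging Face Datasets (Arrow-backed) use the same mechanism under the hood to train on datasets larger than RAM.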
Question #15:

You are working with a data scientist on a project that involves analyzing and processing textual data to extract meaningful insights and patterns. There is not much time for experimentation, and you need to choose a Python package for efficient text analysis and manipulation. Which Python package is best suited for the task?

Options:

A. NumPy
B. spaCy
C. Pandas
D. Matplotlib


Question #16:

Which of the following best describes the NeMo Guardrails platform?

Options:

A. Ensuring scalability and performance of large language models in pre-training and inference.
B. Developing and designing advanced machine learning models capable of interpreting and integrating various forms of data.
C. Ensuring the ethical use of artificial intelligence systems by monitoring and enforcing compliance with predefined rules and regulations.
D. Building advanced data factories for generative AI services in the context of language models.


Question #17:

In Exploratory Data Analysis (EDA) for Natural Language Understanding (NLU), which method is essential for understanding the contextual relationship between words in textual data?

Options:

A. Computing the frequency of individual words to identify the most common terms in a text.
B. Applying sentiment analysis to gauge the overall sentiment expressed in a text.
C. Generating word clouds to visually represent word frequency and highlight key terms.
D. Creating n-gram models to analyze patterns of word sequences such as bigrams and trigrams.


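N-gram models (option D) capture the word-order context that single-word frequency counts miss. A minimal bigram/trigram counter in pure Python:

```python
from collections import Counter

def ngrams(tokens, n):
    """Slide a window of size n over the token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the cat sat on the mat".split()
bigram_counts = Counter(ngrams(tokens, 2))
trigram_counts = Counter(ngrams(tokens, 3))
```

Inspecting the most common bigrams and trigrams during EDA reveals collocations and phrase patterns that a bag-of-words view would flatten away.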
Question #18:

Why might stemming or lemmatizing text be considered a beneficial preprocessing step in the context of computing TF-IDF vectors for a corpus?

Options:

A. It reduces the number of unique tokens by collapsing variant forms of a word into their root form, potentially decreasing noise in the data.
B. It enhances the aesthetic appeal of the text, making it easier for readers to understand the document's content.
C. It increases the complexity of the dataset by introducing more unique tokens, enhancing the distinctiveness of each document.
D. It guarantees an increase in the accuracy of TF-IDF vectors by ensuring more precise word-usage distinction.


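The vocabulary shrinkage described in option A is easy to see with a toy stemmer. `naive_stem` below is a hypothetical illustration only; production pipelines would use something like NLTK's PorterStemmer or spaCy's lemmatizer before computing TF-IDF:

```python
def naive_stem(word):
    """Hypothetical toy stemmer: strips a few common English suffixes."""
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

docs = ["the model trains quickly", "we trained the model", "training the models"]
raw_vocab = {w for d in docs for w in d.split()}
stemmed_vocab = {naive_stem(w) for d in docs for w in d.split()}
```

Here "trains", "trained", and "training" all collapse to "train", so the TF-IDF matrix has fewer, denser columns, and the shared signal across documents is no longer split over three variant tokens.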
Question #19:

Which of the following contributes to the ability of RAPIDS to accelerate data processing? (Pick the 2 correct responses.)

Options:

A. Ensuring that CPUs are running at full clock speed.
B. Subsampling datasets to provide rapid but approximate answers.
C. Using the GPU for parallel processing of data.
D. Enabling data processing to scale to multiple GPUs.
E. Providing more memory for data analysis.


Question #20:

Which of the following is a parameter-efficient fine-tuning approach that can be used to fine-tune LLMs in a memory-efficient fashion?

Options:

A. TensorRT
B. NeMo
C. Chinchilla
D. LoRA


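LoRA (option D) freezes the pretrained weight matrix W of shape d x k and learns only a low-rank update BA, with B of shape d x r and A of shape r x k, where the rank r is much smaller than d and k. The arithmetic below uses hypothetical layer sizes to show why this is memory-efficient:

```python
d, k, r = 4096, 4096, 8        # hypothetical hidden sizes and LoRA rank

full_params = d * k            # parameters updated by full fine-tuning
lora_params = d * r + r * k    # trainable parameters in the B and A factors
fraction = lora_params / full_params
```

Because optimizer states (e.g. Adam moments) are kept only for trainable parameters, training memory shrinks roughly in proportion to this fraction.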