The nn module in MindSpore provides the essential building blocks for constructing neural networks (see the code sketch below), including:
C. Optimizers, such as Momentum and Adam, which adjust the model's weights during training.
D. Loss functions, such as MSELoss (Mean Squared Error Loss) and SoftmaxCrossEntropyWithLogits, which measure the gap between predicted and actual values.
The other options are incorrect because:
A. Hyperparameter search modes (such as GridSearch and RandomSearch) belong to dedicated model training and tuning tools, not to the nn module.
B. Model evaluation metrics (such as F1 Score and AUC) are likewise provided by separate evaluation functions or libraries outside the nn module.
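For reference, here is a minimal sketch of how the nn optimizers and loss functions above are typically wired together. The network definition, layer sizes, and hyperparameter values are illustrative assumptions, not part of the question, and the training dataset is assumed to exist elsewhere.

```python
import mindspore.nn as nn
from mindspore import Model

# Illustrative network; the layer sizes are arbitrary assumptions.
class SimpleNet(nn.Cell):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Dense(32, 64)
        self.relu = nn.ReLU()
        self.fc2 = nn.Dense(64, 10)

    def construct(self, x):
        return self.fc2(self.relu(self.fc1(x)))

net = SimpleNet()

# Loss function from mindspore.nn (option D).
# nn.MSELoss() would be the choice for a regression task instead.
loss_fn = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')

# Optimizer from mindspore.nn (option C).
# nn.Adam(net.trainable_params(), learning_rate=0.001) is another option.
optimizer = nn.Momentum(net.trainable_params(), learning_rate=0.01, momentum=0.9)

# Wrap everything into a Model for training; `train_dataset` is assumed to exist.
model = Model(net, loss_fn=loss_fn, optimizer=optimizer)
# model.train(epoch=1, train_dataset=train_dataset)
```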
HCIA-AI References:
AI Development Framework: Detailed coverage of MindSpore’s nn module, its optimizers, and loss functions.
Introduction to Huawei AI Platforms: Explains various MindSpore features, including network construction and training.