Huawei HCIP - AI EI Developer V2.5 Exam H13-321_V2.5 Question # 6 Topic 1 Discussion

Question #: 6
Topic #: 1

Which of the following statements about the multi-head attention mechanism of the Transformer are true?


A.

The dimension of each head is calculated by dividing the original embedding dimension by the number of heads before concatenation.


B.

The multi-head attention mechanism captures information about different subspaces within a sequence.


C.

Each head's query, key, and value are obtained through a shared linear transformation.


D.

The concatenated output is fed directly back into the multi-head attention mechanism.


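For readers weighing the options, below is a minimal sketch of multi-head attention in PyTorch (an illustrative implementation, not code from the exam or from Huawei's materials; all names and sizes are assumptions). It shows the three points the options hinge on: each head's dimension is the embedding dimension divided by the number of heads, the queries, keys, and values come from separate learned projections rather than one shared transformation, and the concatenated head outputs go through a final linear projection rather than being fed back into the attention block.

```python
import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Minimal multi-head self-attention sketch (illustrative only)."""

    def __init__(self, embed_dim: int, num_heads: int):
        super().__init__()
        assert embed_dim % num_heads == 0, "embed_dim must be divisible by num_heads"
        self.num_heads = num_heads
        # Per-head dimension = embedding dimension / number of heads (cf. option A).
        self.head_dim = embed_dim // num_heads
        # Separate learned projections for Q, K, and V, not one shared
        # transformation (cf. option C).
        self.w_q = nn.Linear(embed_dim, embed_dim)
        self.w_k = nn.Linear(embed_dim, embed_dim)
        self.w_v = nn.Linear(embed_dim, embed_dim)
        # Output projection applied to the concatenated heads (cf. option D).
        self.w_o = nn.Linear(embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        batch, seq_len, embed_dim = x.shape

        # Project, then split into heads: (batch, num_heads, seq_len, head_dim).
        def split(t: torch.Tensor) -> torch.Tensor:
            return t.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

        q, k, v = split(self.w_q(x)), split(self.w_k(x)), split(self.w_v(x))

        # Scaled dot-product attention within each head's subspace (cf. option B).
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        attn = torch.softmax(scores, dim=-1)
        context = attn @ v  # (batch, num_heads, seq_len, head_dim)

        # Concatenate the heads, then apply the final linear projection.
        context = context.transpose(1, 2).reshape(batch, seq_len, embed_dim)
        return self.w_o(context)

# Example usage with illustrative sizes.
x = torch.randn(2, 10, 512)                      # (batch, seq_len, embed_dim)
mha = MultiHeadAttention(embed_dim=512, num_heads=8)
print(mha(x).shape)                              # torch.Size([2, 10, 512])
```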