
Huawei Exam H13-321 Topic 1 Question 84 Discussion

Actual exam question for Huawei's H13-321 exam
Question #: 84
Topic #: 1

The bag-of-words model is the earliest text vectorization method, with words as the basic processing unit. Which of the following is not a shortcoming of the bag-of-words model?

Suggested Answer: C
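
For anyone brushing up on the concept before reading the discussion, here is a minimal, illustrative Python sketch (not from the exam material; the toy corpus and helper name are made up) of how a bag-of-words vector is built: each document is mapped to word counts over a shared vocabulary, the vector length equals the vocabulary size (which is why dimensionality can blow up on large corpora), and word order is discarded.

from collections import Counter

# Toy corpus: each document is a plain string.
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "the mat sat on the cat",   # same words as the first doc, different order
]

# Vocabulary: one vector dimension per distinct word in the corpus.
vocab = sorted({word for doc in docs for word in doc.split()})

def bag_of_words(doc):
    """Map a document to its word-count vector over the shared vocabulary."""
    counts = Counter(doc.split())
    return [counts[word] for word in vocab]

vectors = [bag_of_words(doc) for doc in docs]

print("vocabulary:", vocab)            # vector length == vocabulary size
for doc, vec in zip(docs, vectors):
    print(vec, "<-", doc)
# The first and third documents get identical vectors despite different
# word order, showing that order information is not retained.

Real projects usually rely on a library vectorizer rather than hand-rolled counting, but the underlying idea is the same.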

Contribute your Thoughts:

Ozell
21 days ago
You know, I'm starting to think option D, 'order information cannot be retained', might also be a valid answer. The bag-of-words model completely ignores the order of words, which can be important for understanding context and meaning. That seems like a pretty significant shortcoming to me.
upvoted 0 times
Temeka
23 days ago
Haha, 'dimensional disaster'? What a great term! I'm definitely going to remember that one. But yeah, I think I'm leaning towards option B as the answer. The bag-of-words model is all about surface-level word frequencies, so the semantic gap is a pretty fundamental limitation.
upvoted 0 times
Beatriz
24 days ago
Ooh, good point, Trina. The 'dimensional disaster' is a well-known issue with the bag-of-words model. Though I suppose you could argue that the lack of semantic awareness (option B) is also a pretty big limitation. Tough choice!
upvoted 0 times
Trina
25 days ago
Hmm, I'm not so sure. Option C, 'dimensional disaster', could also be a valid answer. The bag-of-words model can create really high-dimensional feature vectors, which can be computationally expensive and lead to overfitting. That seems like a more relevant drawback to me.
upvoted 0 times
Moira
27 days ago
Yeah, I agree with Leana. The bag-of-words model is all about counting word frequencies without any consideration for meaning or context. So option B definitely doesn't apply. I'm leaning towards that as the answer.
upvoted 0 times
Leana
29 days ago
The bag-of-words model is a pretty basic text vectorization technique, so I'm not surprised it's on the exam. I think option B, 'there is a semantic gap', is not a characteristic of the bag-of-words model. The bag-of-words model doesn't really capture semantic relationships between words, so there's no semantic gap to worry about.
upvoted 0 times
