In self-attention, the context of a given word is computed from its neighboring words by summing their value vectors, weighted by the attention each word receives relative to that given word.

a) True
b) False

1 Answer


Final answer:

a) True. Self-attention in NLP computes the context of a word as an attention-weighted sum of the value vectors of the surrounding words.

Step-by-step explanation:

The statement given in the question is true. In natural language processing (NLP) models, self-attention computes the context of a word from the other words in the sequence. The attention mechanism assigns a weight to each word based on its relevance to the given word (typically via a softmax over query-key similarity scores), and these weights are then used to form a weighted sum of the value vectors, producing the context representation of the word.
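The weighted-sum computation described above can be sketched in a few lines of NumPy. This is a minimal illustration of scaled dot-product self-attention with toy 4-dimensional embeddings; for simplicity, the queries, keys, and values are the embeddings themselves (real models apply learned projection matrices first).

```python
import numpy as np

def self_attention(X):
    """Return one context vector per row of X (n_words x d).

    Queries, keys, and values are all taken to be X itself here
    (identity projections), purely for illustration."""
    d = X.shape[1]
    # Relevance of every word to every other word (scaled dot product)
    scores = X @ X.T / np.sqrt(d)
    # Softmax over each row: attention weights sum to 1 per word
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    # Context = attention-weighted sum of the value vectors
    return weights @ X

# Three toy word embeddings
X = np.array([[1.0, 0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0, 0.0]])
context = self_attention(X)
print(context.shape)  # one 4-dimensional context vector per word
```

Each row of `context` is a convex combination of all the word vectors, which is exactly the "summing up the word values" the statement refers to.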
