Final answer:
Self-attention in NLP computes a context vector for each word as a weighted sum of the value vectors of all the words in the sequence, with the weights produced by an attention mechanism.
Step-by-step explanation:
The statement given in the question is true. In natural language processing (NLP) models, self-attention computes the context of a word by taking a weighted sum over the value vectors of the words in the sequence. The attention mechanism assigns a weight to each word based on its relevance to the word being processed: each word's query vector is compared against every word's key vector, the resulting scores are normalized (typically with a softmax), and the normalized weights are used to sum up the value vectors into the word's context representation. Note that in standard self-attention these weights cover all words in the sequence, not only immediate neighbors.
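The steps above can be sketched in a few lines of NumPy. This is a minimal, hypothetical single-head example (toy random embeddings and projection matrices, not a real model): queries are compared to keys, scores are softmax-normalized, and the weights sum up the value vectors into one context vector per word.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.
    X: (seq_len, d_model) word embeddings; Wq, Wk, Wv project to d_k dims."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Each row of `weights` scores every word's relevance to one query word
    weights = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
    # Context for each word: weighted sum of ALL words' value vectors
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8          # a toy 4-"word" sentence
X = rng.normal(size=(seq_len, d_model))  # stand-in word embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))

context, weights = self_attention(X, Wq, Wk, Wv)
print(context.shape)         # one context vector per word: (4, 8)
print(weights.sum(axis=-1))  # each row of attention weights sums to ~1
```

Because the softmax rows sum to 1, every context vector is a convex combination of the value vectors, which is exactly the "summing up the values using attention" described in the answer.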