Let S = {u, v} be a linearly independent set. Prove that {u + v, u - v} is linearly independent.

1 Answer


Answer with explanation:

  • It is given that {u, v} is a linearly independent set.

This means that for any constants a, b, if:

au + bv = 0

then a = b = 0.
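As a quick numerical illustration of this definition (the vectors below are hypothetical, chosen only for demonstration), a pair of vectors is linearly independent exactly when the matrix having those vectors as columns has full rank:

```python
import numpy as np

# Hypothetical example pair in R^2 (any linearly independent u, v would do).
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

# u and v are linearly independent iff the 2x2 matrix with u, v as columns
# has rank 2, i.e. a*u + b*v = 0 forces a = b = 0.
rank = np.linalg.matrix_rank(np.column_stack([u, v]))
print(rank)  # → 2
```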

  • Now we are asked to prove that:

{u+v,u-v} is a linearly independent set.

Suppose there exist constants c, d such that:

c(u+v)+d(u-v)=0

To show: c=d=0

By the distributive property, this can be rewritten as:

cu + cv + du - dv = 0

Grouping the terms with the same vector:

cu+du+cv-dv=0

i.e.

(c+d)u+(c-d)v=0

Since u and v are linearly independent vectors, both coefficients must vanish:

c + d = 0    (1)

and c - d = 0, i.e. c = d    (2)

Substituting equation (2) into equation (1), we have:

2c=0

i.e. c=0

and then by equation (2) we have:

d=0
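The two equations above form a small linear system in c and d; as a numerical sanity check (a NumPy sketch, not part of the proof), solving that system confirms the only solution is the trivial one:

```python
import numpy as np

# Coefficient matrix of the system c + d = 0, c - d = 0 in the unknowns c, d.
A = np.array([[1.0, 1.0],
              [1.0, -1.0]])
b = np.zeros(2)

# The unique solution of A @ [c, d] = 0 is c = d = 0.
c, d = np.linalg.solve(A, b)
print(c, d)  # both are zero
```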

Hence, the only solution is c = d = 0.

We conclude that the set {u + v, u - v} is linearly independent.
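As a final check (again using hypothetical example vectors, not part of the proof), forming u + v and u - v from a concrete independent pair and testing the rank shows the transformed pair is also independent:

```python
import numpy as np

# Hypothetical linearly independent pair in R^2.
u = np.array([2.0, 1.0])
v = np.array([1.0, 3.0])

# The transformed pair from the proof.
w1 = u + v  # [3, 4]
w2 = u - v  # [1, -2]

# {u+v, u-v} is independent iff the 2x2 matrix of the pair has full rank.
rank = np.linalg.matrix_rank(np.column_stack([w1, w2]))
print(rank)  # → 2
```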

answered by Trenki
