Let B = {b₁, b₂, b₃} be a basis for a vector space V and let T : V → ℝ² be a linear transformation with the property shown below. Find the matrix for T relative to B and the standard basis for ℝ².

asked by Bitifet (8.6k points)

1 Answer


Answer:

The matrix for T relative to B and the standard basis of ℝ² is the 2×3 matrix whose columns are the images of the basis vectors:

M = [ T(b₁)  T(b₂)  T(b₃) ]

Explanation:

Write any x in V in B-coordinates as x = c₁b₁ + c₂b₂ + c₃b₃, so that [x]_B = (c₁, c₂, c₃). By linearity,

T(x) = c₁T(b₁) + c₂T(b₂) + c₃T(b₃),

which is exactly the matrix-vector product [T(b₁) T(b₂) T(b₃)] [x]_B. So the matrix for T relative to B and the standard basis of ℝ² has the images T(b₁), T(b₂), T(b₃) as its columns. Substituting the specific images given in the property referenced in the question (not reproduced here) yields the numerical 2×3 matrix.
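Since the specific values of T(b₁), T(b₂), T(b₃) are not reproduced in the question, here is a minimal Python/NumPy sketch with hypothetical images, just to illustrate how the matrix is assembled and how it acts on B-coordinates:

```python
import numpy as np

# Hypothetical images of the basis vectors (the original problem's values
# are not shown above); suppose T(b1) = (3, -5), T(b2) = (0, 1), T(b3) = (2, 4).
T_b1 = np.array([3, -5])
T_b2 = np.array([0, 1])
T_b3 = np.array([2, 4])

# The matrix for T relative to B and the standard basis of R^2 has these
# images as its columns, giving a 2x3 matrix.
M = np.column_stack([T_b1, T_b2, T_b3])
print(M)
# [[ 3  0  2]
#  [-5  1  4]]

# Check: if x = 1*b1 + 2*b2 - 1*b3, then [x]_B = (1, 2, -1) and
# T(x) = M @ [x]_B gives T(x) in the standard basis.
coords_B = np.array([1, 2, -1])
print(M @ coords_B)   # [ 1 -7]
```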

answered by Barnes (8.4k points)