In a 3-dimensional 2-class classification problem, the following training sample vectors are given:

D₁ = { (12, 7, 13)ᵀ, (8, 10, 7)ᵀ, (10, 11, 9)ᵀ, (7, 12, 13)ᵀ, (11, 9, 10)ᵀ }
D₂ = { (1, 4, 5)ᵀ, (4, 6, 6)ᵀ, (1, 7, 5)ᵀ, (2, 8, 7)ᵀ, (3, 2, 5)ᵀ }

For this dataset, determine:

  1. the transform vector w that reduces the data dimension to 1 dimension, and
  2. the projections of all the data samples in the resulting 1-dimensional space.

1 Answer

Final answer:

To reduce the 3-dimensional, 2-class data to 1 dimension, choose a transform vector w and project every training sample onto it; each sample's dot product with w is its coordinate in the resulting 1-dimensional space.

Step-by-step explanation:

To reduce the data dimension to 1, we project each 3-dimensional training sample onto a single direction w. For a two-class problem, a common choice of w is Fisher's linear discriminant, w ∝ S_W⁻¹(m₁ − m₂), where m₁ and m₂ are the class means and S_W is the within-class scatter matrix; this direction maximizes the separation of the projected class means relative to the within-class spread. Once w is chosen, the projection of each sample is its dot product with w. Here's how:

  1. Normalize the transform vector w to make it a unit vector.
  2. Calculate the dot product of each data sample with the normalized transform vector w.

The projection of a data sample x onto the normalized transform vector w is:

Projection of x = x · w = x1 * w1 + x2 * w2 + x3 * w3
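As a concrete illustration, the sketch below applies these two steps to the sample vectors from the question. It uses Fisher's linear discriminant (mentioned above) as one possible way to obtain w; the variable names and the use of NumPy are my own choices, and other valid choices of w would give different projections.

    import numpy as np

    # Training samples from the question: rows are samples, columns are the 3 features.
    D1 = np.array([[12, 7, 13], [8, 10, 7], [10, 11, 9], [7, 12, 13], [11, 9, 10]], dtype=float)
    D2 = np.array([[1, 4, 5], [4, 6, 6], [1, 7, 5], [2, 8, 7], [3, 2, 5]], dtype=float)

    # Class means.
    m1, m2 = D1.mean(axis=0), D2.mean(axis=0)

    # Within-class scatter matrix S_W = S_1 + S_2 (Fisher's linear discriminant).
    S1 = (D1 - m1).T @ (D1 - m1)
    S2 = (D2 - m2).T @ (D2 - m2)
    SW = S1 + S2

    # Fisher direction w proportional to S_W^{-1} (m1 - m2), then step 1: normalize to unit length.
    w = np.linalg.solve(SW, m1 - m2)
    w = w / np.linalg.norm(w)

    # Step 2: project every sample onto w with a dot product.
    proj1 = D1 @ w   # 1-dimensional projections of the class 1 samples
    proj2 = D2 @ w   # 1-dimensional projections of the class 2 samples

    print("w =", w)
    print("class 1 projections:", proj1)
    print("class 2 projections:", proj2)

If the two classes are well separated in the original space, the printed class 1 and class 2 projections should fall into two clearly separated ranges on the real line, which is the point of reducing to one dimension with a discriminant direction.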

answered by Igor Kanshyn