Suppose a feature has the following values: x1=10, x2=12, x3=15, x4=20.
1) Compute scaled x2 using normalization.
2) Compute scaled x2 using standardization.


1 Answer


Final answer:

Normalization rescales data to a 0 to 1 range, while standardization rescales data to have a mean of 0 and a standard deviation of 1. For x2 = 12, normalization uses the minimum and maximum of the data set and gives 0.2, whereas standardization uses the data set's mean and standard deviation and gives roughly -0.52 (with the sample standard deviation).

Step-by-step explanation:

The student is asking how to compute the normalized and standardized values of a given feature, specifically for the value x2=12, given a set of values (x1=10, x2=12, x3=15, x4=20). Normalization typically involves rescaling the values of a feature so they range from 0 to 1. Standardization, on the other hand, involves rescaling data to have a mean of 0 and a standard deviation of 1 (also known as z-score normalization).

To normalize x2, we use the formula:

normalized_x2 = (x2 - min(x)) / (max(x) - min(x))

Where min(x) = 10 and max(x) = 20 are the minimum and maximum values in the set. Plugging in: normalized_x2 = (12 - 10) / (20 - 10) = 0.2.
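
As a quick check, here is a minimal Python sketch of the min-max computation (using NumPy, which is an assumption; the original answer names no library):

    import numpy as np

    # Feature values from the question
    x = np.array([10.0, 12.0, 15.0, 20.0])

    # Min-max normalization rescales every value into the [0, 1] range
    x_norm = (x - x.min()) / (x.max() - x.min())

    print(x_norm)     # [0.  0.2 0.5 1. ]
    print(x_norm[1])  # 0.2, the normalized x2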

For standardization, the formula is:

standardized_x2 = (x2 - mean) / standard_deviation

Where mean = (10 + 12 + 15 + 20) / 4 = 14.25 is the average of x1 through x4, and standard_deviation is the sample standard deviation of these values, roughly 4.35. Plugging in: standardized_x2 = (12 - 14.25) / 4.35 ≈ -0.52.
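
Similarly, a minimal sketch of the z-score computation (again assuming NumPy; ddof=1 selects the sample standard deviation the answer refers to, while ddof=0 would use the population version and give about -0.60 instead):

    import numpy as np

    # Feature values from the question
    x = np.array([10.0, 12.0, 15.0, 20.0])

    mean = x.mean()       # 14.25
    std = x.std(ddof=1)   # sample standard deviation, about 4.35

    x_std = (x - mean) / std
    print(x_std[1])       # about -0.52, the standardized x2

Libraries differ on this choice; scikit-learn's StandardScaler, for instance, divides by the population standard deviation, which is why its output for x2 would be closer to -0.60.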
