Final answer:
We've confirmed that the given function is a probability density function: it is non-negative everywhere and the total area under it is 1. The expected value is E(Yi) = 1/2. The sample mean \(\bar{Y}\) is unbiased if \(E(\bar{Y})\) equals the parameter it estimates, and it is consistent because its variance shrinks to zero as the sample size grows, so \(\bar{Y}\) converges in probability to that parameter.
Step-by-step explanation:
Checking a Probability Density Function (PDF) and Finding Expectations
To verify that a function f(y) is a probability density function, we need to check two main properties:
- The function must be non-negative for all y in its domain.
- The total area under the function over its domain must be equal to 1.
Given f(y) = 1 for y in [0, 1] and f(y) = 0 otherwise (the "-1" in the statement is evidently a transcription error, since a PDF cannot be negative), the function is non-negative on [0, 1]. Integrating f(y) from 0 to 1 gives an area of 1, so both conditions for a PDF are satisfied.
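Written out, both checks are immediate:

\[
f(y) = 1 \ge 0 \ \text{for } 0 \le y \le 1,
\qquad
\int_{0}^{1} f(y)\,dy = \int_{0}^{1} 1\,dy = \big[\,y\,\big]_{0}^{1} = 1.
\]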
To find E(Yi), the expected value of Yi, we compute the integral of y * f(y) over the interval [0, 1]. This gives us E(Yi) = 1/2.
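Spelled out (the variance is computed as well, since the consistency argument below uses it):

\[
E(Y_i) = \int_{0}^{1} y \cdot 1\,dy = \left[\tfrac{y^2}{2}\right]_{0}^{1} = \tfrac{1}{2},
\qquad
\operatorname{Var}(Y_i) = \int_{0}^{1} y^2\,dy - \left(\tfrac{1}{2}\right)^{2} = \tfrac{1}{3} - \tfrac{1}{4} = \tfrac{1}{12}.
\]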
An unbiased estimator is one whose expected value equals the parameter it estimates. The target parameter is garbled in the problem as \(/+1\) (possibly a typo for \(\lambda + 1\)); whatever it is, the estimator \(\bar{Y}\) is unbiased for it exactly when \(E(\bar{Y})\) equals that parameter.
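Assuming \(\bar{Y} = \frac{1}{n}\sum_{i=1}^{n} Y_i\) denotes the sample mean (the usual reading of this estimator), linearity of expectation gives

\[
E(\bar{Y}) = \frac{1}{n}\sum_{i=1}^{n} E(Y_i) = E(Y_1) = \frac{1}{2},
\]

so with the PDF as given, \(\bar{Y}\) is an unbiased estimator of \(E(Y_1) = 1/2\).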
A consistent estimator converges in probability to the parameter it estimates as the sample size n goes to infinity. For an unbiased estimator, a sufficient condition (via Chebyshev's inequality) is that its variance tends to 0. Here \(\operatorname{Var}(\bar{Y}) = \operatorname{Var}(Y_1)/n = \tfrac{1}{12n} \to 0\) as \(n \to \infty\), so \(\bar{Y}\) is a consistent estimator.
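As an illustration only (not part of the original problem), here is a minimal Monte Carlo sketch, assuming the Yi are drawn from Uniform(0, 1): the sample means concentrate around 1/2, and their empirical variance tracks the theoretical value 1/(12n).

```python
import numpy as np

rng = np.random.default_rng(0)

# For each sample size n, draw 5000 independent samples of size n
# from Uniform(0, 1) and compute the sample mean of each.
for n in [10, 100, 1000, 10000]:
    means = rng.uniform(0.0, 1.0, size=(5000, n)).mean(axis=1)
    # The means should center on 1/2, and their spread should
    # match the theoretical variance Var(Ybar) = 1/(12n).
    print(f"n={n:>5}  avg(Ybar)={means.mean():.4f}  "
          f"var(Ybar)={means.var():.6f}  1/(12n)={1/(12*n):.6f}")
```

As n grows, var(Ybar) shrinks toward 0, which is exactly the behavior the consistency argument relies on.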