19 December, 16:08

A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?

Answers (1)
  1. 19 December, 19:40
    The standard error of the estimate is 0.1σ, i.e., one tenth of the standard deviation of the distribution.

    Step-by-step explanation:

    The standard error of the mean of a sample, σₓ, is related to the standard deviation of the distribution, σ, by the relation

    σₓ = σ / (√n)

    n = sample size = 100

    σₓ = σ / (√100)

    σₓ = (σ/10) = 0.1σ

    Hence, the standard error in estimating the mean is 0.1σ, i.e., one tenth of the standard deviation of the distribution.
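
    A quick simulation illustrates the result. This is a minimal sketch in Python: the question excerpt doesn't specify the underlying distribution, so a standard normal (σ = 1) is assumed purely for illustration; any distribution with finite variance would show the same σ/√n behaviour.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Assumption: the distribution isn't given in the excerpt,
    # so we use a standard normal (sigma = 1) for illustration.
    sigma = 1.0
    n = 100          # sample size, as in the question
    trials = 10_000  # number of simulated samples

    # Draw many samples of size n and record each sample mean.
    sample_means = rng.normal(loc=0.0, scale=sigma, size=(trials, n)).mean(axis=1)

    # The spread of the sample means should match sigma / sqrt(n) = 0.1 * sigma.
    print("theoretical SE:", sigma / np.sqrt(n))        # 0.1
    print("empirical SE:  ", sample_means.std(ddof=1))  # ~0.1
    ```

    The empirical standard deviation of the 10,000 sample means comes out close to 0.1, matching σ/√100 = 0.1σ.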