
Question

Let's go to the movies: A random sample of 42 Hollywood movies made in the last 10 years had a mean length of 111.7 minutes, with a standard deviation of 13.8 minutes.
Part 1 of 2 (a) Construct a 90\% confidence interval for the true mean length of all Hollywood movies made in the last 10 years. Round the answers to at least one decimal place. A 90\% confidence interval for the true mean length of all Hollywood movies made in the last 10 years is \square < \mu < \square.

Studdy Solution
Construct the confidence interval. First compute the standard error of the mean:
\text{SEM} = \frac{s}{\sqrt{n}} = \frac{13.8}{\sqrt{42}} \approx 2.13
With df = n - 1 = 41 degrees of freedom, the t-table (df = 40 row) gives the critical value t^* = 1.684 for a 90\% confidence level.
\text{Margin of Error (ME)} = t^* \times \text{SEM} = 1.684 \times 2.13 \approx 3.58
Calculate the lower and upper bounds of the confidence interval:
\text{Lower bound} = \bar{x} - \text{ME} = 111.7 - 3.58 \approx 108.1
\text{Upper bound} = \bar{x} + \text{ME} = 111.7 + 3.58 \approx 115.3
The 90% 90\% confidence interval for the true mean length of all Hollywood movies made in the last 10 years is:
\boxed{108.1} < \mu < \boxed{115.3}
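As a quick check, the same interval can be computed numerically; here is a minimal sketch using SciPy's t distribution (assuming `scipy` is available — the hand solution above used the table value t* = 1.684 from the df = 40 row, while `t.ppf` uses the exact df = 41):

```python
from math import sqrt
from scipy.stats import t

n, xbar, s = 42, 111.7, 13.8      # sample size, sample mean, sample std. dev.
sem = s / sqrt(n)                  # standard error of the mean, ~2.13
t_star = t.ppf(0.95, df=n - 1)     # two-sided 90% CI -> 0.05 in each tail
me = t_star * sem                  # margin of error, ~3.58
lower, upper = xbar - me, xbar + me
print(round(lower, 1), round(upper, 1))  # 108.1 115.3
```

The tiny difference between t* = 1.684 (df = 40 table row) and the exact df = 41 value disappears after rounding to one decimal place, so both approaches give 108.1 < μ < 115.3.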
