The 2006 General Social Survey asked a large number of people how much time they spent watching TV each day. The mean number of hours was 3.46 with a standard deviation of 2.13. Assume that in a random sample of 31 teenagers the sample standard deviation of daily TV time is 2.24 hours, and that the population of TV-watching times is normally distributed. At the 1% significance level, can you conclude that the population standard deviation of TV-watching times for teenagers differs from 2.13?
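
The question calls for a two-sided chi-square test of a single standard deviation: H0: sigma = 2.13 against H1: sigma != 2.13, using the statistic (n - 1)s^2 / sigma0^2 with n - 1 = 30 degrees of freedom. Below is a minimal sketch of one way to carry out the test; it assumes scipy is available (the problem itself names no tools), and the variable names are illustrative.

```python
from scipy.stats import chi2

sigma0 = 2.13   # hypothesized population SD (from the GSS figure)
n = 31          # sample size (teenagers)
s = 2.24        # sample SD of daily TV time, in hours
alpha = 0.01    # significance level

# Test statistic for a single variance: (n - 1) * s^2 / sigma0^2
chi2_stat = (n - 1) * s**2 / sigma0**2

# Two-sided critical values with n - 1 = 30 degrees of freedom
lower = chi2.ppf(alpha / 2, df=n - 1)
upper = chi2.ppf(1 - alpha / 2, df=n - 1)

print(f"test statistic : {chi2_stat:.3f}")
print(f"critical region: below {lower:.3f} or above {upper:.3f}")

# Reject H0 only if the statistic lands in either tail
if chi2_stat < lower or chi2_stat > upper:
    print("Reject H0: the SD for teenagers differs from 2.13 hours.")
else:
    print("Fail to reject H0: no evidence the SD differs from 2.13 hours.")
```

Working the numbers by hand gives 30 * (2.24^2) / (2.13^2) ≈ 33.18, which falls between the 1% two-tailed critical values for 30 degrees of freedom (roughly 13.79 and 53.67), so under these assumptions the null hypothesis would not be rejected.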