MS 150 Statistics spring 2010 assessment

College-mandated assessment report format

Review of performance: MS 150 spring 2010. Sixty-four students were enrolled in the course. Submitted by Dana Lee Ling.

n | SLO | Program SLO | I, D, M | Reflection/comment
1 | Identify levels of measurement and appropriate statistical measures for a given level | define mathematical concepts, calculate quantities, estimate solutions, solve problems, represent and interpret mathematical information graphically, and communicate mathematical thoughts and ideas. | M | 47 of 64 students were successful on this SLO based on an item analysis of the comprehensive final examination
2 | Determine frequencies, relative frequencies, creating histograms and identifying their shape visually | | M | 34
3 | Calculate basic statistical measures of the middle, spread, and relative standing | | M | 47
4 | Perform linear regressions finding the slope, intercept, and correlation; generate predicted values based on the regression | | M | 44
5 | Calculate simple probabilities for equally likely outcomes | | M | 46
6 | Determine the mean of a distribution | | M | 48
7 | Calculate probabilities using the normal distribution | | M | 45
8 | Calculate the standard error of the mean | | M | 40
9 | Find confidence intervals for the mean | | M | 35
10 | Perform hypothesis tests against a known population mean using both confidence intervals and formal hypothesis testing | | M | 29
11 | Perform t-tests for paired and independent samples using both confidence intervals and p-values | | M | 29

In the table above, n is the outline outcome number. As noted below, outline outcomes five, six, and seven were not directly tested on the final examination; the data for those outcomes is based on in-class testing.

Assessment report

Performance in MS 150 Statistics was measured by quizzes and tests throughout the term. A comprehensive final examination consisting of forty-nine fill-in-the-blank questions was administered; forty-four of the questions mapped back to a course level student learning outcome. Performance against the outline proposed in 2008 has been measured for the past four terms and provides a basis for comparison across terms. The following table indicates the proportion of students who correctly answered questions for each outcome on the outline over five terms. The data derives from an item analysis of the final examination, except for outcomes five, six, and seven. Those three outcomes are foundation material for outcomes eight through eleven and are tested during the term, so the table below uses quiz data from the term to report on success for outcomes five, six, and seven.

n | Students will be able to... | Sp 08 | Fa 08 | Sp 09 | Fa 09 | Sp 10
1 | Identify levels of measurement and appropriate statistical measures for a given level | 0.78 | 0.82 | 0.94 | 0.96 | 0.74
2 | Determine frequencies, relative frequencies, creating histograms and identifying their shape visually | 0.83 | 0.84 | 0.75 | 0.79 | 0.55
3 | Calculate basic statistical measures of the middle, spread, and relative standing | 0.90 | 0.80 | 0.86 | 0.93 | 0.74
4 | Perform linear regressions finding the slope, intercept, and correlation; generate predicted values based on the regression | 0.80 | 0.79 | 0.63 | 0.69 | 0.70
5 | Calculate simple probabilities for equally likely outcomes | 0.67 | NA | 0.82 | 0.59 | 0.73
6 | Determine the mean of a distribution | NA | NA | 0.77 | 0.52 | 0.75
7 | Calculate probabilities using the normal distribution | 0.71 | NA | 0.61 | 0.45 | 0.71
8 | Calculate the standard error of the mean | 0.84 | 0.84 | 0.87 | 0.91 | 0.64
9 | Find confidence intervals for the mean | 0.66 | 0.85 | 0.67 | 0.73 | 0.55
10 | Perform hypothesis tests against a known population mean using both confidence intervals and formal hypothesis testing | 0.36 | 0.56 | 0.55 | 0.55 | 0.46
11 | Perform t-tests for paired and independent samples using both confidence intervals and p-values | 0.55 | 0.63 | 0.43 | 0.56 | 0.46
12 | Go beyond outline | NA | 0.33 | 0.16 | 0.37 | 0.44
PSLO | define mathematical concepts, calculate quantities, estimate solutions, solve problems, represent and interpret mathematical information graphically, and communicate mathematical thoughts and ideas. | 0.72 | 0.76 | 0.69 | 0.71 | 0.62

Sp refers to spring terms, Fa to fall terms; the digits that follow are the last two digits of the calendar year. Performance marked NA indicates that the material was not directly assessed on the final examination and alternative assessment data was not recorded in that term. Quiz-based averages for chapters 5, 6, and 7 were not included in the overall average for any term because performance on in-class quizzes differs from performance on the final examination.
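
The per-outcome data above derive from an item analysis of the final examination in which each question is mapped back to an outline outcome. The sketch below illustrates one way such a tabulation could be computed; the question-to-outcome mapping and the student responses are placeholders for illustration, not the actual examination data.

```python
# Hypothetical sketch of a per-outcome item analysis of a final examination.
# The mapping and responses below are illustrative placeholders only.
from collections import defaultdict

# question number -> outline outcome number (illustrative subset)
question_to_outcome = {1: 1, 2: 1, 3: 2, 4: 2, 5: 3}

# student -> {question number: 1 if correct, 0 if wrong}
responses = {
    "student_a": {1: 1, 2: 1, 3: 0, 4: 1, 5: 1},
    "student_b": {1: 0, 2: 1, 3: 1, 4: 0, 5: 1},
}

correct = defaultdict(int)   # correct answers tallied per outcome
attempts = defaultdict(int)  # total answers tallied per outcome

for answers in responses.values():
    for question, result in answers.items():
        outcome = question_to_outcome[question]
        correct[outcome] += result
        attempts[outcome] += 1

for outcome in sorted(attempts):
    print(f"Outcome {outcome}: {correct[outcome] / attempts[outcome]:.2f}")
```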

The 95% confidence interval for the overall mean is 64% to 76%. This term saw a statistically significant drop in performance against this confidence interval for the mean. This collapse in learning is atypical for a mature course. Excluding the quiz-based performance data for outline items 5, 6, and 7, performance fell for every single outline item.
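
The reported interval can be reproduced under the assumption that it is a t-based 95% confidence interval on the five term-level program SLO averages in the table above; the sketch below shows that calculation, though the exact method originally used is an assumption.

```python
# Sketch of a 95% t-based confidence interval for the mean of the five
# term-level program SLO averages (0.72, 0.76, 0.69, 0.71, 0.62).
# Assumes this is how the reported 64% to 76% interval was obtained.
from statistics import mean, stdev
from math import sqrt

averages = [0.72, 0.76, 0.69, 0.71, 0.62]
n = len(averages)
x_bar = mean(averages)
se = stdev(averages) / sqrt(n)       # standard error of the mean
t_crit = 2.776                       # t(0.975, df = 4)
lower, upper = x_bar - t_crit * se, x_bar + t_crit * se
print(f"{lower:.2f} to {upper:.2f}")  # approximately 0.64 to 0.76
```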

The final five questions on the final examination are from material beyond the outline. The students are unaware that this material will appear: it is not covered in the textbook and it is deleted from the posted finals that students use to study for the examination, so the material is a complete surprise to the students. The point of this material is noted on the final examination itself: "One intention of any course is that a student should be able to learn and employ new concepts in the field even after the course is over."

The 44% success rate of students on this material is an increase over last term's rate, which was itself double the rate of the term before. This value appears to be highly variable from term to term.

Making inferences

Questions that required an inference, an interpretation of a result, had only a 45% success rate this term, down from 55% the previous term; that 55% was itself an increase from 40% the term before. The current term-on-term drop reflects the overall drop in performance seen in spring 2010.

Projects

For the fourth term in a row, students were asked to put together a basic statistical research project. The number of projects completed improved with the addition of an earlier (fourth week of class) initial written commitment to a project idea. Writing on the projects also showed improvement, but this was not specifically analyzed this term.

Attendance

A separate report on absenteeism in the problematic 8:00 section was published earlier in the term.

As noted in the above report, attendance was problematic for the 8:00 section. The average number of absences by section is noted in the following table. This table, however, provides data only for those students who completed the term. Seven students withdrew from the 8:00 section, two from the 9:00 section, and three from the 10:00 section. Withdrawals from the 8:00 section were predominantly due to absences. None of the withdrawn students is included in the average number of absences data below.

Section | Average number of absences
8:00 | 8.29
9:00 | 7.46
10:00 | 5.52
Overall | 6.93

In statistics a late counts as a third of an absence. Since I run attendance as a point per day, a late would leave 0.6666... of the day's point; I actually enter 0.7, which technically makes a late 30% of an absence. Arriving extremely late (half way through the period) reduces the day to 50% of a point.
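
As a quick check of the arithmetic, a minimal sketch of the daily attendance point under this policy; the function name and status labels are illustrative, not the actual gradebook fields.

```python
# Daily attendance point under the policy described above.
# One point per day; a late would nominally leave 1 - 1/3 = 0.666...,
# which is entered as 0.7; arriving half way through the period earns 0.5.
def attendance_points(status: str) -> float:
    points = {"present": 1.0, "late": 0.7, "extremely_late": 0.5, "absent": 0.0}
    return points[status]

print(1 - 1/3)                    # 0.666..., entered as 0.7 in practice
print(attendance_points("late"))  # 0.7, so a late costs 30% of an absence
```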

Performance on the final examination by section inversely paralleled the absence table data.

Section | Average on the final examination
8:00 | 0.66
9:00 | 0.68
10:00 | 0.76
Overall | 0.71

Note: The overall average of 71% is higher than the 62% average because the 62% is based on an item analysis in which an item is either 100% right or 0% wrong; the item analysis treats each of the forty-nine questions as an all-or-nothing result. The average on the final, however, is out of 56 possible points: some questions are worth more than one point and allow partial credit. Question 13 (a three column table) and question 14 (a chart) are examples of multiple point questions.
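
The gap between the two averages can be seen with a small sketch; the point values and scores below are invented purely to show the difference between all-or-nothing item scoring and points-based scoring with partial credit, and are not actual examination data.

```python
# Why an all-or-nothing item analysis average can sit below a points-based
# average that allows partial credit. Values are invented for illustration.

# (points earned, points possible) for three questions answered by one student
scores = [(1, 1), (2, 3), (0, 1)]

# Item analysis: a question counts only if it is entirely correct
item_rate = sum(1 for earned, possible in scores if earned == possible) / len(scores)

# Points-based average: partial credit on multi-point questions counts
points_rate = sum(earned for earned, _ in scores) / sum(possible for _, possible in scores)

print(f"item analysis: {item_rate:.2f}")    # 0.33
print(f"points based:  {points_rate:.2f}")  # 0.60
```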

Although the data suggests a potential connection between attendance and performance on the final examination, an xy scattergraph of individual attendance versus individual final examination scores shows no correlation: the relationship is random.

[Figure: xy scattergraph showing no correlation between attendance and performance on the final examination]
[Figure: linear regression trend line, attendance (not absences) versus performance in the course]
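
A sketch of the correlation check described above, assuming paired per-student attendance points and final examination fractions; the five pairs shown are placeholders, not the actual class data.

```python
# Sketch of the attendance versus final examination correlation check.
# The paired lists below are placeholders, not the actual per-student data.
from statistics import correlation, linear_regression  # Python 3.10+

attendance = [42.0, 39.3, 45.0, 40.7, 44.0]   # attendance points earned
final_score = [0.66, 0.71, 0.58, 0.80, 0.69]  # fraction of 56 possible points

r = correlation(attendance, final_score)
slope, intercept = linear_regression(attendance, final_score)
print(f"r = {r:.2f}, slope = {slope:.3f}, intercept = {intercept:.2f}")
# An r near zero would match the "no correlation" result reported above.
```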

Note that three students who were still enrolled in the course did not sit the final examination. These three students were absent and none of them contacted the instructor; all three had been absent for the final few weeks of the term, roughly back to spring break.

Grade distributions

Although not a direct assessment of learning, grade distributions do have meaning when coupled with in-course assessment that is aligned with the student learning outcomes on the outline. Each question on the in-course quizzes and tests can be mapped back to a student learning outcome on the outline. The bulk of the questions are valued at one point, with certain multiple-entry tables and charts being worth up to five points. Grades are then calculated from the points attained. While this is not matrix-level assessment that permits one to say, "Johnny can do x," I would argue the result is grades that have underlying meaning in terms of student learning. The course grade distribution by section is shown in the following chart.

[Figure: course grade distribution by section]

There appears to be a clear differential in performance by section, with sections later in the day performing better. The factors that may underlie this result are not immediately obvious to this author. A selection effect has been proposed which posits that students who are more organized and self-disciplined registered before less organized students, filling the preferred later sections first. This theory would suggest that the 8:00 section, which fills last, would tend to collect weaker students.

With a debt of gratitude to the SIS programmer, the above theory can be tested. The following table looks at the percentage performance on the final examination against the registration date of the enrolled student.

Date | Number of students | Average on final
11/16/09 | 15 | 0.72
11/17/09 | 10 | 0.75
11/18/09 | 8 | 0.71
11/19/09 | 6 | 0.85
11/20/09 | 10 | 0.60
11/23/09 | 1 | 0.80
01/05/10 | 7 | 0.75
01/06/10 | 4 | 0.50
01/11/10 | 1 | 0.75
01/12/10 | 5 | 0.73
Overall | 67 | 0.71

There is no distinct pattern in the data. Where the number of students is less than five, one cannot draw any statistically solid conclusion. The only data point that appears potentially low is the Friday 20 November set of ten students. That date includes one of the three students who did not sit the final examination, which has a strong impact on the average. One would be hard pressed to argue that there is a pattern whereby earlier registrants performed more strongly as measured by the comprehensive final examination.

The above analysis does not leave much room for a coherent explanation of the poor performance at 8:00, which was a contributing factor to the overall drop in performance.