A study of mathematical learning among nine students in MS 100 and MS 101, Fall 1997 to Spring 1998.
On 01 September 1997 students in two sections of MS 100 College Algebra taught by Dana Lee Ling were given a pre-test covering MS 100 level algebraic knowledge. On 05 December 1997 these same students were given the same test as a post-test. On 16 January 1998 students in Dana Lee Ling's section of MS 101 Algebra and Trigonometry were given the same test. Near the end of the Spring term, on 24 April 1998, these MS 101 students took the test again. Nine students took the test four times, twice in MS 100 and twice in MS 101. The following study looks at the performance of these nine students in the four sittings of the test.
The nine students who completed the test four times share one common academic feature: all of them successfully passed MS 100 in the Fall of 1997. As a result, these findings cannot be generalized to the broader student population. These nine students are students who have succeeded in mathematics at the College. Because each of these students took the test four times, there also exists the possibility of a practice effect.
The administration of the test in the Spring of 1998 was originally intended to look at the question of whether students "lose" knowledge over the Christmas break. Anecdotal evidence existed of students passing a prerequisite course and then appearing to be completely unprepared for the subsequent course the very next term.
A separate question concerned the theory that students learn one level of mathematics only once they have progressed to the next higher level of mathematics. If this were true, then one might expect improved performance on the test between the January and April tests. During this time the students were in the next higher course, while the test covered only material from the prior course.
The test consisted of 25 four-option multiple choice questions covering material in the MS 100 College Algebra course. The test was written by the Chair of the Department of Mathematics and Natural Sciences, Stephen Blair. The instructor of the sections which took the test, Dana Lee Ling, did not see the test nor know of its contents until after the final administration of the test on 24 April 1998.
Due in part to the small n of only nine students, there is no statistically significant change at the 0.05 level in the overall average score for all nine students between consecutive testings. On the Fall 1997 pre-test the average for the nine students was 8.2 out of 25; in December the average was 10.1; in January, 11.1; and in April, 9.9. The only pair of testings that attains statistical separation for these nine students is the Fall 1997 pre-test versus the Spring 1998 pre-test.
Table of student averages for the nine students on the four sittings of the test.
|Fall 1997 pre||Fall 1997 post||Spring 1998 pre||Spring 1998 post|
|8.2||10.1||11.1||9.9|
The lack of a statistically significant separation in the students' averages from December to January argues that there is no significant loss of knowledge over the Christmas break. On the contrary, the statistically significant separation of the January scores from the September scores suggests that students had learned new material and had retained it over the break.
The highest average, 11.1, represents a score of only 44%. The most "significant" statistic may be that, at their best, these nine students never mastered more than an average of 44% of the material on the test.
A best-fit slope was obtained for each student, numbering the tests consecutively to generate x-values (see the table). Six students show a positive slope to their test averages and three students show a negative slope. The positive slopes suggest that these six students did improve over the course of the year. As an exploratory exercise the slopes were checked against various in-class performance measures for correlations. No strong correlations were found. The largest value of R² found was 0.18 for the slope versus the number of homework assignments. In-class test scores were even less predictive of performance on the test.
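The per-student best-fit slope can be computed by ordinary least squares against the test numbers 1 through 4. A minimal Python sketch follows; the four-sitting score list is a hypothetical example, not an actual student's data.

```python
from statistics import mean

def best_fit_slope(scores):
    """Least-squares slope of scores against test number 1, 2, 3, ..."""
    xs = range(1, len(scores) + 1)
    x_bar, y_bar = mean(xs), mean(scores)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, scores))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical scores on the four sittings for one student.
print(best_fit_slope([8, 10, 11, 10]))  # → 0.7, a mildly positive slope
```

A positive slope indicates improvement across the four sittings; a negative slope, decline.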
Consider the following partial table of answers to the questions by a hypothetical student "x" where a 1 means the student answered the question correctly and a 0 means the answer was wrong:
|Student||Question||97 pre||97 post||98 pre||98 post||Sum||Pattern||Category|
|x||1||0||1||1||1||3||0111||SA|
|x||2||1||1||1||1||4||1111||AK|
|x||3||0||0||0||0||0||0000||NL|
A series of binary patterns can be formed. Patterns such as "0111" suggest that the student did not originally know the material, learned how to perform the problem, and retained that knowledge. Patterns such as "1111" suggest that the student knew that question before taking MS 100 and continued to retain the knowledge of how to answer that question. The pattern "0000" indicates that the student never knew how to correctly answer that question and never did learn over the course of the year. The following categories were then constructed for these patterns:
Always Knew (AK): 1111
Never Learned (NL): 0000
Strictly Ascending (SA) "learned and retained": 0001, 0011, 0111
Strictly Descending (SD) "possibly knew it and then lost the knowledge": 1000, 1100, 1110
Somewhat Random (SR): all other patterns.
Obviously patterns such as 0001 and 1000 may simply be random results and may not indicate learning or forgetting, hence the use of the ascending and descending terminology.
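The five categories above amount to a simple rule on the four-character pattern string, which can be sketched in Python as follows:

```python
def classify(pattern):
    """Classify a four-sitting right/wrong pattern such as "0111"."""
    if pattern == "1111":
        return "AK"  # Always Knew
    if pattern == "0000":
        return "NL"  # Never Learned
    # Strictly ascending: 0s followed only by 1s (0001, 0011, 0111)
    if pattern == "0" * pattern.count("0") + "1" * pattern.count("1"):
        return "SA"
    # Strictly descending: 1s followed only by 0s (1000, 1100, 1110)
    if pattern == "1" * pattern.count("1") + "0" * pattern.count("0"):
        return "SD"
    return "SR"  # Somewhat Random: everything else

print(classify("0111"), classify("1010"))  # → SA SR
```

Applying this rule to each of the nine students on each of the 25 questions yields the counts in the table below.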
The following table counts the number of each pattern type present. Nine students answered 25 questions, so the total number of answer patterns is 9 × 25 = 225.
|Pattern type||Count||Percent|
|Always Knew (AK)||28||12%|
|Strictly Ascending (SA)||35||16%|
|Strictly Descending (SD)||20||09%|
|Never Learned (NL)||72||32%|
|Somewhat Random (SR)||70||31%|
One third of the patterns are "0000," a pattern that suggests the students never knew how to answer the question and never learned. The test is considered to be well aligned with the MS 100 College Algebra curriculum, as is the course itself. It is this author's opinion that the high rate of this pattern represents a real failure to learn and not a misalignment between the measurement instrument and the material in the course. Clearly, however, what is thought to be taught and what is actually learned do not yet coincide.
Another one third of the patterns are classifiable as having a good chance of simply being random: the student guessed correctly on some occasions, or knew the answer only at one time and then forgot how to perform that problem subsequently.
Twelve percent of the material the students knew all along and managed to retain. If one accepts the interpretation of the patterns given here, then the mathematics sequence can take credit for communicating knowledge on only 16% of the test material. Coupled with the lack of a statistically significant improvement in the average, the resulting conclusion has to be that there is little long-term learning and retention occurring for these nine students.
The following table looks at overall strengths and weaknesses across all four sittings for all nine students. For example, question number one was answered correctly 21 times. Theoretically a question could be answered correctly 9 × 4 = 36 times. On average a question was answered correctly 14.3 times, with a standard deviation of 5.6, hence 21 correct answers is more than one standard deviation above the mean (1.20 standard deviations). Question number one can be classified as a relative strength for the students. The most common pattern for question one was "1111," the "always knew" pattern, but the distribution of patterns (not shown) was fairly even: the negative Z pattern mode indicates that the pattern was not strongly present. A positive Z pattern mode indicates the pattern was strongly present. The Z pattern mode uses the average of the pattern counts and the standard deviation to calculate the relative prevalence of a pattern. At the end of this document is a table with the actual distributions by question.
|Question||Correct||Z||Pattern mode||Z pattern mode||Question text|
|1||21||1.20||AK||-1.1||Which is not the equation of a line?|
|2||19||0.84||SR||-0.2||What is the slope of the line (graph given)?|
|3||13||-0.23||SR||-0.2||What is the vertex of y = -2(x + 3)² - 4|
|4||7||-1.30||NL||1.6||What is the equation of (graph of abs fcn)?|
|5||17||0.49||SA||-1.1||What is the inverse of (linear function given)?|
|6||20||1.02||SA||-0.2||The zeros of the function f(x) = x³ - 3x² - 10x are...|
|7||7||-1.30||NL||0.7||If f(1) = -2 and f(2) = 5 the Int. Med. Value thm tells us...|
|8||18||0.66||SR||1.6||What is the remainder of (polynomial division prob. given)?|
|9||14||-0.05||NL||-0.2||Which equation has degree 5 and roots at x=-1, 2, and 5?|
|10||7||-1.30||NL||1.6||The following graph represents (even, odd, both, neither) fcn?|
|11||17||0.49||NL||-1.1||If f(x) = ... and g(x) = ... then what is f(g(x))?|
|12||18||0.66||SR||0.7||If f(0) = 3 and f(2) = 11 and f(x) is a line, then what is f(5)?|
|13||15||0.13||SR||0.7||(Solve surface area algebraically to get radius)|
|14||9||-0.94||NL||0.7||The translation of a function f(x) = ... 3 units left is given by...|
|15||13||-0.23||SR||-0.2||If f(x) = x² and g(x) = x+2 then f(x) = g(x) for (values given)|
|16||16||0.31||NL||-1.1||The following is a sketch of (quartic graph given)|
|17||6||-1.48||NL||-0.2||A quad. eqn. f(x)=ax²+bx+c with discriminant = 0 has how many roots?|
|18||25||1.91||AK||-1.1||The solution set to 3 - 4y < 9 is|
|19||10||-0.76||NL||-1.1||What is the domain of (eqn. with sqrt. in it)|
|20||23||1.56||SA||-1.1||"Twice a number is two less than the number multiplied by itself" is written as the equation...|
|21||16||0.31||SR||0.7||Simplify (negative exponent simplification problem)|
|22||17||0.49||SD||-1.1||Simplify (fractional polynomial simplification problem)|
|23||16||0.31||NL||-1.1||If x = 2i - 3 is one complex root of a poly. the other root must be...|
|24||5||-1.66||NL||0.7||If f(x) = 1/x then f -1(x) is equal to:|
|25||8||-1.12||NL||1.6||Find vertex -3t² + 6t +4|
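The Z values in the table above can be reproduced from the correct-answer counts alone. The Python sketch below uses the 25 counts from the table and the sample standard deviation, which matches the 14.3 mean and 5.6 standard deviation cited earlier.

```python
from statistics import mean, stdev

# Correct-answer counts per question across all four sittings (from the table).
counts = [21, 19, 13, 7, 17, 20, 7, 18, 14, 7, 17, 18, 15, 9, 13,
          16, 6, 25, 10, 23, 16, 17, 16, 5, 8]

m, s = mean(counts), stdev(counts)  # about 14.28 and 5.60

def z_score(count):
    """How far a question's correct count sits from the mean, in SDs."""
    return (count - m) / s

print(round(z_score(21), 2))  # question 1 → 1.2 standard deviations above the mean
```

Questions more than one standard deviation above the mean are relative strengths; those more than one below, relative weaknesses.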
Only three questions have Strictly Ascending patterns as their mode. The pattern "0000" dominates 12 of the 25 questions. Mathematics instructors at the College may want to look more closely at the above table and draw their own interpretations of the data. Due to the small number of students involved, drawing conclusions from the above table would be speculative and subject to individual interpretation. For example, the Strictly Descending mode of problem 22 might be due to the use of MathView software to simplify algebra problems, software that was not made available to the students during the administration of these tests. But a look at the actual distribution (AK=2, SA=2, NL=2, SD=3) indicates that the pattern distribution does not strongly support the "unlearning" of this material. The "big picture" is a general lack of mastery: an average of 14.28 out of 36 is only 40%. Whether a given question or topic was "learned" relative to other topics may not be as important as a broader failure to master the material on the whole, as evidenced, for example, by the dominance of Never Learned patterns over Strictly Ascending patterns.
The lack of correlation between the test and in-class performance measures is disturbing. It suggests that our traditional in-class measures do not measure longer-term performance gains or losses well. In-class measures seem to be a snapshot in time that does not relate closely to retention.
There is another possible explanation for the lack of improvement in the averages and for the apparent evidence of a lack of learning: the instructor is at fault. This is in part why the instructor undertook this study only in his own classes; there exists the option of slaying the messenger by blaming the instructor. Had other instructors been involved, this report might be seen as an attack on them.
Although the number of students is small, there is still disturbing evidence of a general lack of mastery of the material. This lack of mastery may be at the core of the high repeat rates seen in the mathematics courses.
Work within Title III on mathematics is drawing to a close. To date there has been no strong evidence for a measurably beneficial impact of technology. The students indicate that they believe the technology is helping them master mathematics. Instructors are modifying courses to more effectively utilize technology. The technology has created more positive attitudes, but it has not made a dramatic impact on the rates of success for students in mathematics at the College.
There are units within the College that have shown great success, units such as the Health Care Opportunities Program, Talent Search Program, and the Intensive English Program. Discussions at various times with the leadership of these programs suggest that some keys to their success are their sense of common community, sense of group identity, and individualized attention for members. Adopting these "best practices" models would likely require a restructuring of education at the College well beyond the scope of a single department. Structures like coherent cohort grouping and common classing would have to be put in place. Discussion of these ideas is beyond the scope of this report, but will certainly come in future discussions.
Dana Lee Ling
01 June 1998