Teagle Report: Where Are We?



Teagle Report

Department of Mathematics

Knox College

Introduction

Our department set out to understand and assess three common goals of liberal education: quantitative literacy, critical thinking, and civic engagement. Although all of us had a top-level understanding of these goals, we struggled to understand them at a deeper level, one at which we might actually develop instruments to measure them. In grappling with their meaning, we first tried to understand these goals in the broad context of general education and then focused on what they meant to us as a department. Then we faced the question of how to measure them.

Quantitative Literacy

Definitional Issues

The first issue we faced in discussing the assessment of quantitative literacy was developing a working definition of the subject. Fortunately, quantitative literacy has been an ongoing area of interest for mathematicians: over the past decade the Mathematical Association of America (MAA) has established a Special Interest Group (“SIGMAA QL”) devoted to the discussion of quantitative literacy and has produced several volumes dealing with the subject, the most recent of which (Current Practices in Quantitative Literacy) was published in 2006. More details of these and other initiatives may be found at the URL .

While we began with a broad overview of the MAA's previous work in quantitative literacy, the document Quantitative Reasoning for College Graduates: A Complement to the Standards soon became our primary focus. The report was produced by the Subcommittee on Quantitative Literacy Requirements of the MAA's Committee on the Undergraduate Program in Mathematics (CUPM) and, as the report itself notes, it “sets a standard for a quantitatively literate college graduate and suggests reasonable means for the achievement of that standard”. An online version of the report can be found at the URL . The goals of the report are quite broad: in four sections, it presents a rationale for why educated citizens should be quantitatively literate, develops a definition of quantitative literacy, discusses strategies for implementing a curriculum focused on quantitative literacy, and concludes with a discussion of how to assess the success of such a curriculum. What impressed us most about the report was that it established a relatively concrete definition of a quantitatively literate individual:

In short, every college graduate should be able to apply simple mathematical methods to the solution of real-world problems. A quantitatively literate college graduate should be able to:

1. Interpret mathematical models such as formulas, graphs, tables, and schematics, and draw inferences from them.

2. Represent mathematical information symbolically, visually, numerically, and verbally.

3. Use arithmetical, algebraic, geometric and statistical methods to solve problems.

4. Estimate and check answers to mathematical problems in order to determine reasonableness, identify alternatives, and select optimal results.

5. Recognize that mathematical and statistical methods have limits.

In an appendix to the report available at the URL , this definition is fleshed out even further with a specific list of mathematical topics that address these skills, divided into the broad mathematical subject areas of arithmetic, geometry, algebra, statistics, and “other”. This topic list (taken verbatim from Appendix B of the report) became the basis of our assessment tool for looking at the quantitative literacy of Knox students:

ARITHMETIC

• estimation

• percentage change

• use of calculator: rounding and truncation errors; order of operations.

GEOMETRY

• measurement: units and conversion of systems; length; area; volume.

• “familiar” shapes: rectangles, triangles, circles, cubes, cones, cylinders, spheres, the Pythagorean relationship.

• angles: slopes of lines; parallel and perpendicular lines; right angles; similarity.

• complex shapes: approximation by “familiar” shapes; solution region for a system of linear inequalities in the plane.

ALGEBRA

• linear equations: equations in one unknown; systems in two unknowns; methods of solution.

• proportionality

• graphs and tables: constructing; reading, interpreting; extrapolating from; the notions of direct and indirect variation.

• simple exponents: roots and powers; products and quotients with a common base.

• concept of function: constructing discrete and continuous functions; graphical representation of functions; zeros of functions.

STATISTICS

• experimental probability: counting; mutually exclusive and independent events.

• graphical displays of data: pie and bar charts; frequency polygons; visual impact of scale changes.

• central tendency and spread: comparison of data sets using mean, median, mode and standard deviation, quartile deviation, range; percentile rank.

• the idea of correlation: measuring and evaluating the relationship between two variables.

• common sources of error: sampling error; misinterpreting averages or probabilities; invalid comparison distributions; statistical significance; statistical “proof”.

• random sampling: the count-recount technique; polls; lotteries; fair representation.

• linear fit: comparison of fit of two lines to a data set.

• quality control: the binomial distribution.

• simulation

• confidence intervals*: interval estimates; the standard error of the mean.

OTHER

• exponential change

• rates: comparison of average rates of change.

• models

• algorithms: sequential thinking; construction; relationship to formulas.

• optimization: the notions of maxima and minima of functions with or without constraints; graphical and computational methods for finding them; simple analytic methods, such as completing the square for quadratic polynomials.

• linear programming*: systems of equations in two variables with a linear objective function.

• scheduling*

• networks*

Here an asterisk denotes a subject area which was considered to be “less essential”.

Designing an Assessment Tool

The Quantitative Reasoning for College Graduates report provided us both with a concrete definition of quantitative literacy and a specific set of skills which a quantitatively literate person should possess. It also indicated to us that we would need to design a specific tool for measuring quantitative literacy rather than using existing measures of student assessment, since none of our courses addressed the whole array of topics covered in the list and more advanced courses in mathematics typically addressed these topics even less than introductory courses. Thus, the challenge for us was to develop an assessment tool which tested for these skills as thoroughly as possible. It was clear to us that we could assess student quantitative literacy by asking them a series of questions in an exam or quiz about the topic list in this report. However, we also set ourselves several additional constraints. First, the tool should not be time-consuming to implement. In particular, it couldn't consume a lengthy amount of class time or demand an extraordinary amount of faculty resources. Our concern was that neglecting either one of these would provide a substantial disincentive toward continuing to use it for assessment beyond the time frame of the grant. The downside to this approach was that any assessment tool we designed to fit these constraints couldn't possibly be inclusive of the entire topic list in the Quantitative Reasoning for College Graduates report.

The second major constraint in designing an assessment tool was the need to include realistic questions that students could conceivably encounter in their daily lives beyond college. The Quantitative Reasoning for College Graduates report was quite emphatic that the list of topics it presented needed to be “laced with good applications” in order to give a good measure of student quantitative literacy. We therefore wanted to develop questions that were not the straightforward template problems students would see in a mathematics textbook. Instead, these questions needed to be embedded in a context that was both realistic and discursive, sometimes with information that was irrelevant to the solution of the problem.

Our first task in designing the exam was to generate a list of problems satisfying these conditions. We used our own experience in applying relatively elementary mathematics to real-world situations to arrive at these questions. As a first pass, we came up with about a dozen such questions. Further topics were suggested during a break-out session at a Teagle-sponsored workshop on assessment at Beloit College. This pool was then whittled down to five questions covering a wide range of topics from the Quantitative Reasoning for College Graduates report. We limited the quiz to five questions because we felt the questions would be time-consuming for the students, and we were concerned that student interest would wane if the quiz required an extraordinary time commitment. The resulting “quantitative literacy quiz” follows:

Quantitative Literacy Quiz

1. One definition of inflation is “the persistent rise in the general price level as measured against a standard level of purchasing power.” In short, because of inflation a dollar tomorrow is generally worth less than a dollar today. One common measure of inflation is the Consumer Price Index (CPI), which measures “changes in the prices paid by urban consumers for a representative basket of goods and services.” The table below shows the December-to-December percentage increase in the CPI for several recent years:

Year % Increase in CPI

2002 2.4%

2003 1.9%

2004 3.3%

2005 3.4%

Suppose that you earned $30,000 per year in 2002. How much would you need to earn in 2006 to have the same “purchasing power” as you did in 2002?
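For the reader's reference, one natural solution compounds the listed annual increases against the 2002 salary; deciding which years to include is itself part of the problem's interpretive challenge. A minimal sketch, assuming all four listed rates are applied:

```python
# Sketch of one reading of problem 1: compound all four listed
# December-to-December CPI increases against the 2002 salary.
rates = [0.024, 0.019, 0.033, 0.034]  # CPI increases for 2002-2005

salary = 30000.0
for r in rates:
    salary *= 1 + r  # prices (and the required salary) rise by r each year

print(round(salary, 2))  # about $33,436 under this reading
```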

2. A driver needs to take a large group of people on a week-long, 1700-mile trip. The group is too large to fit in one car, so the driver has two options: (1) rent a large passenger van that holds the entire group, which costs $700/week and gets 14 mpg; or (2) use his own car, which gets 27 mpg, and rent a smaller car, which costs $300/week and gets 30 mpg. If gas costs $3.00 per gallon, which is the better financial alternative?
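The comparison in problem 2 reduces to totaling rental and fuel costs for each option, with both cars in the second option driving the full route. A sketch of the arithmetic:

```python
# Problem 2: total cost = rental fee + (miles / mpg) * price per gallon.
miles, gas = 1700, 3.00

van_cost = 700 + (miles / 14) * gas                           # one large van
two_car_cost = (miles / 27) * gas + 300 + (miles / 30) * gas  # own car + rental

print(round(van_cost, 2), round(two_car_cost, 2))  # 1064.29 vs. 658.89
```

So the two-car option is the better financial alternative by roughly $400.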

3. A common strategy among bicyclists who ride long distances is to begin the ride going into the wind so that the wind will help to “push” them on the way back. A “cyclocomputer” attached to the bicycle records total miles traveled during the trip, total time spent traveling, and the average speed for the entire trip. On a ride from Galesburg to Oquawka, Illinois, the cyclocomputer displayed the following information when the cyclist arrived in Oquawka:

Distance traveled: 42.556 miles

Time spent traveling: 2:30:30 (2 hours, 30 minutes, and 30 seconds)

Average speed: 17.0 miles per hour

After the cyclist arrived back in Galesburg (via a shorter route), the cyclocomputer displayed the following:

Distance traveled: 77.068 miles

Time spent traveling: 4:04:36

Average speed: 18.9 miles per hour

If the bicyclist traveled at an average speed of 17.0 mph during the first leg of the trip and at an average speed of 18.9 mph over the entire trip, what was the cyclist's average speed during the return leg?
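The point of problem 3 is that average speeds do not simply average: the return-leg speed must be recovered from the leftover distance and time. A sketch of the computation from the cyclocomputer readings:

```python
# Problem 3: recover the return leg from the cumulative readings.
out_dist, total_dist = 42.556, 77.068   # miles
out_time = 2 + 30 / 60 + 30 / 3600      # 2:30:30 expressed in hours
total_time = 4 + 4 / 60 + 36 / 3600     # 4:04:36 expressed in hours

return_speed = (total_dist - out_dist) / (total_time - out_time)
print(round(return_speed, 1))  # about 22.0 mph
```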

4. A homeowner needs to replace a concrete driveway whose shape is given below:

The volume of concrete delivered to a customer is measured in cubic yards (three feet by three feet by three feet). Currently, each cubic yard of concrete costs about $75. If the concrete in the driveway is to be poured to a depth of four inches, what will be the cost of the concrete needed to replace the current driveway?
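Since the driveway figure is not reproduced here, the area below is only a hypothetical placeholder; the sketch shows the unit conversions the problem requires:

```python
# Problem 4: convert square feet and a 4-inch depth into cubic yards.
area_sq_ft = 540.0        # hypothetical area; the actual shape is in the figure
depth_ft = 4 / 12         # four inches expressed in feet
volume_cu_yd = area_sq_ft * depth_ft / 27  # 27 cubic feet per cubic yard

cost = volume_cu_yd * 75  # at $75 per cubic yard
print(round(cost, 2))     # 500.0 for this hypothetical area
```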

5. Plutonium is a naturally occurring element which has been mined and used extensively in the development of nuclear weapons. However, there are serious concerns about the link between exposure to plutonium and cancer incidence rates, so scientists have developed an index which measures a person's exposure to plutonium. In a study of cancer rates in communities near a plutonium processing plant, scientists compared this exposure index with cancer mortality rates (in deaths per 100,000 residents) and found the following data:

If you had to predict the cancer mortality rate for an exposure index of 5.0, approximately what would that mortality rate be?

The “wide range of topics” assertion was checked by creating a table with one row for each “essential” topic area in the Quantitative Reasoning for College Graduates report and one column for each question, and placing a check in a box if we felt the question significantly addressed the subject area. The resulting table for the five questions used in the quiz is as follows:

Our concern was that our “Quantitative Literacy Quiz” would eventually get stale. The idea behind the table was that it would allow us to create future iterations of the quiz without having to start from scratch each time. After establishing a pool of potential questions and entering them into the table, we could create new quizzes by picking and choosing questions from the pool, using the table to ensure that we covered a wide range of topics.

Administering and Assessing the Quantitative Literacy Quiz

The quantitative literacy quiz was administered in six different mathematics classes over the course of two terms. We felt it would be burdensome and impractical to administer the quiz to every student at Knox, or even to all students in mathematics courses, but we felt we could obtain a representative selection of Knox students by administering the quiz in several mathematics sections each term. Since Knox has a quantitative literacy requirement that students generally satisfy by taking a lower-level mathematics course, we felt that administering the quiz to selected classes would yield a good cross-section of the Knox student body. In addition to the Math 121 “Mathematical Ideas” course, a default course for students looking to satisfy the quantitative literacy requirement, we administered the quiz in calculus-level courses, which are often taken by students with a stronger interest in mathematics and the sciences, and in 300-level courses generally taken only by mathematics majors. Beyond testing the quantitative literacy of Knox students overall, we wanted to see whether quantitative literacy levels differed among different types of students, and these courses gave us three categories for examining the results:

• Math 121 (Mathematical Ideas) Students: Math 121 is the lowest-level mathematics course we offer that satisfies the college's quantitative literacy requirement. Students often self-select into this course because they dislike mathematics or feel they are not good at it. The quiz was administered in two sections of Math 121.

• Math 152/205 (Calculus-level) Students: The calculus sequence appeals both to science and mathematics majors and to students desiring to satisfy the quantitative literacy requirement. Our hypothesis was that calculus-level students would score better than Math 121-level students on the quiz. The quiz was administered in one section of Math 152 (Calculus II) and one section of Math 205 (Calculus III).

• Math 300/327 (Major-level) Students: These courses are taken almost exclusively by students with an interest in mathematics. Our hypothesis was that these students would score better than students in either of the other categories. The quiz was administered in one section of Math 300 (Mathematical Structures) and one section of Math 327 (Mathematical Finance).

In each class, the five-question quiz was given as a take-home assignment that was to be returned at the next class meeting. Students were allowed to use calculators and were expected to submit solutions which demonstrated their work on each problem. They were also instructed that they could not seek outside help for the problems. The quizzes from all sections were graded by one faculty member to assure uniformity in standards. To ease the burden of grading, each problem was simply awarded a “1” or a “0” depending on whether or not the student had the problem substantially correct. Thus, a student could receive a maximum of five points on the quiz.

A total of 86 students returned the quiz: 53 students in Mathematical Ideas, 20 calculus-level students, and 13 major-level students. The average score on the quiz was 2.895. The overall distribution of scores among the 86 students was as follows:

When the scores were separated out by category (Mathematical Ideas vs. Calculus-level vs. Major-level) the distribution of scores was:

The break-out shows a clear distinction between the categories. Among students in Math 121 the average score was 2.51; among calculus-level students the average score was 3.15; among major-level students the average score was 4.08.
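As a consistency check, the overall average can be recovered as the enrollment-weighted mean of the three category averages (the small discrepancy with the reported 2.895 comes from the rounding of the category averages):

```python
# Weighted mean of the category averages, using the reported group sizes.
counts = {"Math 121": 53, "Calculus": 20, "Major": 13}
means = {"Math 121": 2.51, "Calculus": 3.15, "Major": 4.08}

total = sum(counts[g] * means[g] for g in counts)
overall = total / sum(counts.values())
print(round(overall, 3))  # about 2.896, matching the reported 2.895
```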

The results of the quiz led us to a discussion of how the results could be used to assess the level of quantitative literacy among Knox students. We felt that each of the problems on the quiz was accessible to students with an understanding of high school level mathematics. The students were also given plenty of time to complete the problems, and they were allowed the use of a calculator if necessary. Allowing for one error or omission, our feeling was that a score of four or five merited a designation of “Quantitatively Literate”. Using this definition, this quiz suggests that overall about 34% of Knox students would qualify as quantitatively literate. Breaking these scores out into our different categories of students, about 17% of students in Math 121 would qualify as quantitatively literate; about 50% of students in calculus-level courses would qualify as quantitatively literate; and about 78% of students in mathematics major-level courses would qualify as quantitatively literate.

Further Observations and Directions for Future Research

• We need to stress that the percentages of “Quantitatively Literate” students we arrived at should be taken with a grain of salt. There was some discussion of whether students who scored three or higher on the quiz should receive the designation; this would clearly have produced more impressive results. Our final decision to set the cut-off at four was based on our instincts as mathematicians rather than on any correlation with known standards for quantitative literacy. A longer quiz would also provide a more reliable result, but we were genuinely concerned about the commitment this would impose on us and our students in future years, after the grant had expired.

• To build our confidence in these scores, we also need to establish better incentives to make sure students take the quantitative literacy quiz as seriously as they should. In most of the courses, the reward for completing the quiz was an additional credit on the homework component of the course grade. Unfortunately, students do not always take the homework component of the course as seriously as they should, and so lower scores may reflect a lack of effort rather than a lack of ability. However, significantly increasing the relative point value of the quiz seems unethical given that most of the courses in which we administered the exam have little to do with the material discussed in the quiz. Increasing the point value of the quiz may also provide an incentive for students to discuss the quiz among themselves or obtain help from others, and this would also detract from the assessment value of the quiz.

• While we were heartened that students who take major-level courses score better than other Knox students on our quiz, we were left wondering whether this reflected the innate interests and abilities of the students or the impact of taking more college-level mathematics courses. As of yet, we have no ideas for how to test this.

• Our discussions on quantitative literacy have had an impact on the mathematics department curriculum. Originally, our “Mathematical Ideas” course, which we offer for students intending to satisfy the quantitative literacy requirement in our curriculum, was essentially a mathematics appreciation course. It was designed primarily to give students with a real aversion to mathematics an appreciation for the importance of mathematics in cultures throughout history. However, as our understanding of quantitative literacy solidified, we realized that we could make “Mathematical Ideas” an applications-based course that attempted to address many of the topic areas identified in the Quantitative Reasoning for College Graduates report. This year the quantitative literacy quiz was administered to students at the beginning of the course. One direction for further research would be to administer a “before” and “after” quantitative literacy quiz in this course to determine if our new approach is contributing to the quantitative literacy of Knox students who decide to take the course.

Civic Engagement

Definitional Issues

We struggled more with the meaning of civic engagement for our majors than with the other two goals. Does civic engagement equate to applying learning outside the classroom? Does civic engagement mean working at a position, such as an internship, that advances personal aspirations, or does it demand a more humanitarian activity? That is, does civic engagement equate to activities that show a sense of social responsibility by identifying and addressing issues of public concern (as opposed to activities directed toward advancing one's personal agenda)? Since this educational goal was not as easy to quantify or test as the other two, we concluded that our assessment methods needed to be experience- and narrative-driven. Thus, we felt that reflective documents and self-reported survey results would be good sources of information.

At Knox we are fortunate to have an Experiential Learning requirement, defined broadly as a project, completed after the first year of study, that involves the application of knowledge. Such projects may be humanitarian or non-humanitarian. Since the Experiential Learning requirement asks each student to write a short description of and reflection on the chosen experience, we will use these documents as our primary basis for assessing civic engagement among our majors.

Surveying our majors as well as our alumni will help us understand the extent to which our majors are civically engaged both while in college and later as citizens. Since civic engagement is a lifelong habit, we plan to administer our surveys to alumni periodically. We believe that broad survey questions like:

• In what ways do you contribute to the world through civic engagement?

 

• To what community or communities do you lend your talents or resources?

 

will provide us with useful information.

Finally, the NSSE survey has several items that bear on civic engagement. A comparison of first-year and senior survey results could provide important information that can be tracked over time.

Relevant NSSE Questions

• In your experience at your institution during the current school year, about how often have you done each of the following? (1-never, 2-sometimes, 3-often, 4-very often)

o participated in a community-based project as part of a regular course

o worked with faculty members on activities other than coursework (committees, orientation, student life activities, etc.)

• During the current school year, about how often have you done each of the following? (1-never, 2-sometimes, 3-often, 4-very often)

o attended an art exhibit, gallery, play, dance, or other theater performance

o participated in activities to enhance your spirituality

• Which of the following have you done or do you plan to do before you graduate from your institution? (1-have not decided, 2-do not plan to do, 3-plan to do, 4-done)

o practicum, internship, field experience, co-op experience, or clinical assignment

o community service or volunteer work

• About how many hours do you spend in a typical 7-day week doing each of the following? (1-0 hrs, 2-1-5 hrs, 3-6-10 hrs, 4-11-15 hrs, 5-16-20 hrs, 6-21-25 hrs, 7-26-30 hrs, 8-more than 30 hrs)

o participating in co-curricular activities (organizations, campus publications, student government, fraternity or sorority, intercollegiate or intramural sports, etc.)

• To what extent has your experience at this institution contributed to your knowledge, skills, and personal development in the following areas? (1-very little, 2-some, 3-quite a bit, 4-very much)

o voting in local, state, or national elections

o developing a personal code of values and ethics

o contributing to the welfare of your community

Critical Thinking

Definitional Issues

In its broadest terms, critical thinking is the ability to gather relevant information, analyze and synthesize it, and draw valid conclusions from it. The Council for Aid to Education, which administers the CLA exam, has a good handle on many aspects of critical thinking. From their template institutional reports we cull the following:

Students are expected to evaluate evidence by:

1. Determining what information is or is not pertinent

2. Distinguishing between fact and opinion

3. Recognizing limitations in the evidence

4. Spotting deception and holes in the arguments of others

Students are expected to analyze and synthesize the evidence by:

1. Presenting his/her own analysis of the data

2. Breaking down the evidence into its component parts

3. Drawing connections between discrete sources of data

4. Attending to contradictory or inadequate information

Students are also expected to draw conclusions by:

1. Constructing cogent arguments rooted in data rather than speculation

2. Selecting the strongest set of supporting evidence

3. Avoiding overstated or understated conclusions and suggesting additional information to complete the analysis

Such statements, and others from the literature on learning and teaching, may be used as a starting point for deliberations about critical thinking in the disciplines. Although different disciplines will have different slants on what definitional statements mean for them, they nevertheless provide a good starting point for deliberations.

Our next step was to look at our curriculum for the particular places where such breakthroughs in critical thinking occur. We examined common core courses, sophomore bridge courses, off-campus experiences, and senior research projects. We will put in place non-intrusive checks, such as collecting student written work, tabulating results on special examination questions, or recording faculty perceptions of student research presentations, to measure progress in critical thinking. We will also ask the students themselves which parts of their major education helped them most in improving their critical thinking; if a particular experience is mentioned often, that experience will be studied more carefully.

