The primary reason many students do not succeed in the [traditional math] course is that they do not actually do the problems. As a population, they generally do not spend enough time with the material, and this is why they fail at a very high rate.
Course redesign team at Kingwood Community College
Throughout the 1990s, many people saw information technology as a silver bullet that could solve many of higher education's problems, among them the need to improve learning outcomes and control the ever-upward trajectory of higher education costs. The term “silver bullet” connotes a direct and effortless solution to a problem. Unsurprisingly, the integration of technology and higher education has been neither direct nor effortless, but now we can say with certainty that technology can be used to address both learning and cost problems simultaneously.
One of our most persistent learning problems is the dismal record of student performance in developmental and college-level mathematics at our two- and four-year institutions. But we now know how to improve learning outcomes and student success rates in math at a lower cost than that of traditional instruction—and we can prove it. While not effortless, the solution is as close to a silver bullet as one can get in the complex world of teaching and learning.
Course redesign is the process of re-conceiving whole courses (rather than individual classes or sections) to achieve better learning outcomes at a lower cost by taking advantage of the capabilities of information technology. NCAT has 11 years of experience in conducting large-scale redesign projects in mathematics that do just that. Thirty-seven institutions have been involved, and most have redesigned more than one course either during the project period or afterwards. Collectively, NCAT math redesigns have affected more than 200,000 students to date.
Course redesign is not about putting courses online. It is about rethinking the way we deliver instruction, especially large-enrollment core courses, in light of the possibilities that technology offers.
Redesigns in mathematics at NCAT partner institutions have:
increased the percentage of students successfully completing a developmental math course by 51 percent on average (ranging from 10 to 135 percent) while reducing the cost of instruction by 30 percent on average (from 12 to 52 percent), and
increased the percentage of students successfully completing a college-level math course by 25 percent on average (from 7 to 63 percent) while reducing the cost of instruction by 37 percent on average (from 15 to 77 percent).
In addition to measuring course-completion rates and cost reduction, all NCAT projects compare the learning outcomes of students taught in the traditional format with those of students in the redesigned course. This is done by 1) running parallel sections of the course in the two formats or 2) comparing baseline data from a traditional course with a later redesigned version of the course and looking at differences in outcomes. Assessment techniques include comparing the results of common final examinations, common questions or items embedded in examinations or assignments, pre/post-tests, and final grades when the same assignments, tests, and final exams are used and graded using the same criteria.
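In rough terms, each comparison reduces to computing the same summary statistics for students in the two formats. The sketch below uses invented scores and grades purely for illustration (these are not NCAT data, and the function name is ours):

```python
from statistics import mean

def success_rate(grades, passing=("A", "B", "C")):
    """Fraction of students earning C or better."""
    return sum(g in passing for g in grades) / len(grades)

# Invented common-final scores from parallel sections (illustration only)
traditional = [62, 71, 55, 80, 67, 49, 73]
redesigned = [70, 78, 66, 85, 72, 61, 79]
print(f"mean gain on common final: {mean(redesigned) - mean(traditional):.1f} points")

# Invented final grades, assuming the same assignments and grading criteria
print(success_rate(["A", "B", "C", "D", "F", "C"]))  # 4 of 6 students pass
```

The same pattern applies to the baseline before/after comparison; only the source of the two samples changes.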
From working with large numbers of students, faculty and institutions, NCAT has learned what works and what does not work in improving student achievement in mathematics. The underlying principle is simple: Students learn math by doing math, not by listening to someone talk about doing math. Interactive computer software, personalized on-demand assistance, and mandatory student participation are the key elements of success.
The emporium model (named after what the model's originator, Virginia Tech, called its initial course redesign) has been implemented in various ways. Some institutions have large computer labs; others have small ones. At some institutions, students spend a required number of hours in the lab at any time that it is open. At others, instructors meet with students in the lab or in a classroom at scheduled hours. Each institution makes design decisions in the context of the constraints it faces. What is critical is the pedagogy: eliminating lecture and using interactive computer software combined with personalized, on-demand assistance.
Core Principles: Why Is the Emporium Model So Successful?
The emporium model has been successful for four reasons:
Students spend the bulk of their course time doing math problems rather than listening to someone talk about doing them.
Mathematics software has been evolving over the last five years, providing more reliable scoring and a better interface for students and instructors. By working with an instructional software package such as ALEKS, Hawkes Learning Systems, or MyMathLab, students are able to spend more time on task than when they listen to a lecture. Students quickly become comfortable with the technology; they especially like the instant feedback they receive when working on problems and the guided solutions that are available when they do not get a correct answer.
The opinion of the faculty leaders who have successfully redesigned math courses is unanimous: students do not learn math by going to lectures. The reason most success rates in math are so low, we believe, is because the three-lectures-per-week approach is simply not appropriate for introductory mathematics courses.
Students spend more time on things they don't understand and less time on things they have already mastered.
In the traditional lecture format, some students are bored because others' questions result in repetition of material they have already mastered, while other students feel overwhelmed by the amount of material covered in each lecture. In contrast, instructional software packages—which include interactive tutorials, computational exercises, videos, practice exercises, and online quizzes—can support auditory, visual, and discovery-based learning preferences.
Through diagnostic assessments for each student, areas of needed practice can be highlighted and individualized study plans developed. When students understand the material, they can move quickly through it and demonstrate mastery. When they get stuck, they can ask for examples or step-by-step explanations and take more time to practice.
Students get assistance when they encounter problems.
Traditional models increase the likelihood that students will get discouraged and stop doing the work because they have no immediate support and don't want to admit before fellow students that they do not understand. So they often do not get answers to the questions they have. In addition, homework problems are typically hand-graded and returned days after the students have made mistakes. By the time they see the graded homework, they are not sufficiently motivated to review their errors.
The emporium model helps students in a variety of ways. Instant feedback lets students review their errors at the time they make them and immediately get assistance from online tutorials and guided solutions, as well as from fellow students. In several of the math emporia, computer stations are arranged in pods of four to six to encourage collaboration. Moreover, instructors, graduate teaching assistants, and/or peer tutors are available to provide individual assistance. Any problem areas are addressed on an individual basis during lab time.
Students are required to do math.
Course redesign succeeds when students participate in scheduled learning activities, yet 30 percent or more may fail to do so. Some institutions have been more successful than others in addressing the issue of non-participating students. Redesign projects have found that students will complete lab activities and homework if participation is required and course points are given for doing so. Students participate more, score higher, and spend longer on course activities when credit is at stake.
At the University of Alabama, the 3.5-hour-per-week attendance requirement that was in place during the fall 2000 semester was eliminated in spring 2001. Student attendance in the lab declined significantly, and there was an appreciable increase in the number of students who stopped taking tests.
In fall 2001 the requirement was reinstated. Students received course credit for lab time and were penalized if their efforts fell short of the requirement. They were also given the opportunity to erase failing grades on tests by spending a minimum of 10 additional hours in the lab completing assessments on the materials covered by the test. Those changes led to a significant improvement in student performance.
Some institutions recognize that giving course points for attendance increases student engagement and learning but are hesitant to do so because they think it will inflate grades. To determine what effect giving attendance credit had on final grades, Alabama analyzed the grades of 3,439 students in five courses during fall 2005.
Attendance credit had no effect on the grades of 86.8 percent of the students. For 4.5 percent of the students, that credit increased their grade by a plus (e.g., the grade went from a C to a C+). For 0.5 percent, attendance credit allowed them to pass the course. For 1 percent, the credit caused them not to pass the course, and for 7.3 percent, it decreased their grade by a minus (e.g., went from a C to a C-). Thus, the argument that giving attendance credit inflates grades is not supported by the data.
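In code, Alabama's analysis amounts to recomputing each student's letter grade with and without the attendance points and tallying how often the letter changes. A minimal sketch, with invented whole-letter cutoffs (the actual Alabama scale also included plus and minus grades, which is where most of the observed shifts occurred):

```python
# Simplified whole-letter cutoffs, for illustration only
CUTOFFS = [(90, "A"), (80, "B"), (70, "C"), (60, "D"), (0, "F")]

def letter(score):
    """Map a numeric course score to a letter grade."""
    return next(g for cut, g in CUTOFFS if score >= cut)

def credit_effect(base_score, credit_points):
    """How attendance credit changes a student's final letter grade, if at all."""
    before = letter(base_score)
    after = letter(base_score + credit_points)
    return "no change" if before == after else f"{before} -> {after}"

print(credit_effect(85, 3))  # 85 and 88 are both B: no change
print(credit_effect(68, 3))  # credit lifts a D to a C
```

Run over a full roster, a tally of these outcomes yields a distribution like the one Alabama reported.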
For some faculty, getting rid of classroom meetings implies abandoning the human-interaction side of a classroom and conjures up images of students working alone. Nothing could be further from what happens in a well-designed math emporium. On-demand, personalized assistance is a hallmark of the emporium model. At all institutions using it, personal assistance is available far in excess of that offered in traditional courses.
Academics also tend to confuse courses that use instructional software—online tutorials, homework, and quizzes—with self-paced online courses. Leaving students on their own to do computer homework without on-demand tutoring support is a recipe for disaster. A laissez-faire, unstructured, open-entry/open-exit model simply does not work. Students need sufficient structure within a well-articulated set of requirements to succeed.
These core principles have evolved over the last decade based on NCAT's experience in working with hundreds of faculty members and thousands of students. While the basic idea of the emporium was first conceived at Virginia Tech, the original model has been modified and extended in a variety of ways as described below.
Four Stages of Innovation
In conducting redesign programs, NCAT's approach has been first to establish a set of broad parameters (e.g., redesign the whole course, use instructional technology, reduce cost, modularize the curriculum) and then let experimentation bloom within them. From this iterative process, a number of redesign solutions have emerged—some anticipated, some not. NCAT has extracted lessons learned (models, principles, techniques) from these experiences and refined the parameters over the past 11 years.
Stage I: Experimentation
NCAT's Program in Course Redesign (PCR), funded by the Pew Charitable Trusts from 1999 to 2003, demonstrated that it is possible to improve student learning while reducing instructional costs in higher education. NCAT asked the 30 participating institutions to redesign large-enrollment introductory courses and in the process improve learning, reduce costs, and use technology. Beyond that, they had a blank slate.
Among the seven PCR mathematics redesigns (five at four-year institutions and two at two-year institutions), the most influential was the math emporium at Virginia Tech (VT), which was based on the idea that the best time to learn mathematics is when the student wants to do so rather than when the instructor wants to teach.
In its redesign of a linear algebra course, VT replaced all class meetings with a learning resource center (the emporium) featuring online materials and on-demand, personalized assistance. Multiple sections (38 sections of ~40 students each) were combined into one 1,500-student section, replacing duplicative lectures, homework, and tests with collaboratively developed online materials. As the course moved away from the three-contact-hours-per-week norm, the emporium significantly expanded the instructional assistance available to students: it is now open twenty-four hours a day, seven days a week.
VT's model allows students to choose when to access course materials, what types of learning materials to use, and how quickly to work. It is heavily dependent on instructional software, including interactive tutorials, computational exercises, electronic hyper-textbooks, practice exercises, solutions to frequently asked questions, and online quizzes. Modularized online tutorials present course content with links to a variety of additional learning tools: streaming-video lectures, lecture notes, and exercises. Navigation is interactive; students can choose to see additional explanations and examples along the way. Online weekly practice quizzes replace weekly homework. A server-based testing system generates large databases of questions, and grading and record-keeping are automatic.
The emporium is staffed by a combination of faculty, graduate teaching assistants (GTAs), and peer tutors. Instead of spending time preparing lectures or grading homework and tests, these instructors devote time to responding directly to each student's specific, immediate needs—directing them to resources from which they can learn and coaching them how to proceed. By creating a triage response team, the emporium increases the number of contact hours for students while greatly decreasing the cost per hour for that contact.
In the redesigned linear algebra course's first iteration, overall performance on a common final exam was similar to that of the traditional format. But the percentage of students completing the course by achieving grades of D or better improved, from an average of 80.5 percent in the two fall semesters immediately preceding the redesign to an average of 87.25 percent (a statistically significant improvement) in the subsequent four fall semesters.
Virginia Tech produced savings of about $53 per student (the cost went from $77 to $24), a 69 percent reduction. Today VT teaches more than 20 courses in the emporium. By teaching multiple math courses in its facility, VT can share instructional person-power among courses, significantly reducing the cost of teaching these additional courses.
Stage II: Modification
Following the successes achieved at VT, the University of Alabama and the University of Idaho–Moscow replicated the model in order to determine whether it would be successful with students who are significantly less prepared to study mathematics than those at VT. Alabama redesigned Intermediate Algebra, a developmental math course with a traditional success rate of 40 percent. Idaho redesigned two pre-calculus courses taken by those not prepared for regular college mathematics courses.
Alabama and Idaho made several key modifications to the original model:
Mandatory attendance. VT follows an open-attendance model, whereas Alabama and Idaho mandate attendance to ensure that students spend sufficient time on task. Alabama requires students to spend a minimum of three and a half hours per week in its Mathematics Technology Learning Center (MTLC); Idaho requires them to spend three hours per week in its lab.
Weekly group meetings. VT eliminated all class meetings. Alabama, by contrast, requires students to attend a thirty-minute group session each week, allowing instructors to follow up in areas where testing has identified weaknesses; Idaho assigns students to groups of 40 to 50 according to their majors so that particular applications can be emphasized, and the groups meet once a week to coordinate activities and discuss experiences and expectations. Both universities believe that the group activities help build community among students and between students and instructors.
Smaller facilities. VT's math emporium holds 500 workstations as well as other specialized spaces and equipment. Alabama's MTLC contains 240 computers, plus rooms for tutorial activities. Idaho's lab, the Polya Learning Center, contains 72 computers in pods of four, designed for as many as three students to work together at a single monitor. VT's emporium is open 24/7, Alabama's MTLC for 71 hours per week, and Idaho's Polya Learning Center for 86 hours per week.
Commercial software. When VT conducted its initial redesign, no instructional software was available to support its linear algebra course, so the university wrote its own. From the beginning, Alabama and Idaho used commercial software.
The University of Alabama
In 2000, the University of Alabama redesigned Intermediate Algebra, a pre-general-studies course enrolling 1,500 students each year, in order to address poor student performance. Nearly 60 percent of the students in the fall 1999 traditional course earned a D, F, or W grade, and students often needed to take the course two or three times before passing. The course redesign involved the development of a student-centered, computer-assisted course structure that allowed the individual student to focus on his or her questions and difficulties. The software provided quick feedback to students, an instant assessment of skills, and a steady flow of information to instructors and tutors.
Tests in the redesigned course were derived from those in the traditional course, and similar criteria were used for the assignment of grades; therefore, changes in grade distribution reflected changes in student learning.
The sum of A and B grades was significantly higher for the redesigned than for the traditional course in the fall semesters. Student success rates (C or better) increased from 40.6 percent in the fall 1999 traditional course to 78.8 percent in fall 2003. Students completing Intermediate Algebra in both course formats were also tracked into subsequent math courses. Students in the first course in a two-course pre-calculus sequence who had taken Intermediate Algebra in the redesigned format significantly out-performed students who took it in the traditional format.
The redesign reduced the cost per student from approximately $122 to $86, a 30 percent savings. Alabama subsequently replicated this redesign success in Pre-calculus Algebra and continues to offer both courses in the redesigned format.
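The percent savings reported here and throughout follow the same arithmetic: the difference between traditional and redesigned cost, divided by the traditional cost. A quick check against the Alabama figures above (the function name is ours):

```python
def pct_reduction(before, after):
    """Percent cost reduction from a traditional to a redesigned course."""
    return round(100 * (before - after) / before)

print(pct_reduction(122, 86))  # Alabama Intermediate Algebra: 30 percent
```

The same calculation applies whether the inputs are per-student costs or total course budgets.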
In addition to demonstrating that the emporium model could succeed among those less prepared to engage in college study, important lessons were learned from the Alabama and Idaho projects about increasing success among low-income students, students of color, and working adults. At Alabama, the success rate (grades of C– or better) for African-American freshmen in the redesigned course was substantially higher than for white freshmen, despite the fact that the African-American students were less prepared when they entered the course (on a math placement exam, 20 percent of Caucasian freshmen scored less than 200, versus 41 percent of African Americans). In fall 2000, 71.4 percent of African-American freshmen were successful, versus 51.8 percent of Caucasian freshmen; in fall 2001, it was 70 percent versus 65.3 percent.
The University of Idaho
In 2000, the University of Idaho redesigned two pre-calculus courses enrolling a total of 2,428 students. These lecture courses had traditionally met three times per week in sections of ~50 students taught by lecturers and graduate students. Out-of-class assistance was provided by a tutoring center.
The university moved all structured learning activity to the Polya Learning Center, where students receive just-in-time assistance from instructors and undergraduate assistants. Instructors also meet students in a once-a-week group that focuses on student problems and builds community among students and instructors.
With comparable examinations and assignments used in each situation, student success rates (grades of C or better) in Intermediate Algebra increased from 59 percent in the traditional format to 75 percent in the redesigned format and in Precalculus from 68 to 75 percent.
In the redesign, total faculty preparation hours were reduced from 4,609 to 3,347, a decrease of 27 percent. Interaction time more than doubled, increasing from 2,846 hours in the traditional format for all sections to 6,178 hours in the redesigned format. Training time for the redesigned course is higher, since the course involves a greater diversity of personnel. One faculty member coordinates the course and a lab manager supervises personnel in the lab.
The new active-learning model reduced the total cost of offering both courses from approximately $338,000 to $235,000, a reduction of 31 percent. In the traditional format, the cost per student ranged from $110 to $176 across the two courses; in the Polya Learning Center, it is about $97 for both. Idaho continues to offer precalculus courses in the redesigned format.
At Idaho, Hispanic students who were part of the College Assistance Migrant Program (CAMP) historically had been unsuccessful in math courses. During the fall 2002 semester, however, students in the redesigned Intermediate Algebra course had an unprecedented 80 percent pass rate, compared to a prior 70 percent rate—indeed, these students surpassed the success rate for the entire course population. At both universities, the key was ensuring that students spent sufficient time on task and received personal assistance when needed.
Stage III: Replication
NCAT's next national program, the FIPSE-funded Roadmap to Redesign, or R2R (2003–2006), introduced a streamlined course-redesign methodology and widened course-redesign adoption to 20 additional institutions. Of those, seven were in mathematics at both the developmental and college levels (all at four-year institutions).
Based on our experience in the PCR, NCAT identified five models for course redesign and five principles of successful course redesign that had produced improved student learning and reduced instructional costs. We began to ask redesign teams to select a redesign model and explain how they planned to embody the five principles within it. We also developed other planning resources to support course redesign, such as four models for assessing student learning, cost-reduction strategies, and five critical implementation issues.
At that point in our history, we did not prescribe a model to new redesign teams for several reasons. First, we wanted each redesign team to own their redesign plan by making choices as they went through the planning process. Second, we were interested in seeing variations on previous redesigns in different disciplines. Third, we wanted to see new models emerge as we worked with greater numbers of institutions and disciplines.
By the end of R2R, we had learned that certain models seemed to be appropriate to certain disciplines. For example, all of the foreign language projects that we worked with chose the replacement model, in which they moved grammar instruction, practice exercises, testing, writing, and small-group activities to the online environment and used in-class time for developing and practicing oral communication skills. That model has also been the model of choice in English composition.
In the PCR and R2R, redesign projects in math used both the replacement model, where typically half of the traditional class time was retained and half was replaced by lab time, and the emporium model. We learned that the latter consistently produced spectacular gains in student learning and impressive reductions in instructional costs, whereas the former did not.
The most significant occurrence in mathematics redesign during this period was the application of the emporium-model variation developed at Alabama and Idaho to college-level courses at Louisiana State University and the University of Missouri–St. Louis. The mathematical preparation of students at these institutions fell somewhere between that of VT's strong math students and that of the Alabama and Idaho developmental-math students. The results achieved at both institutions confirmed the validity of the model.
Similarly, the University of Missouri–St. Louis redesign of College Algebra increased student success rates (grade of C or better) from 50 percent to 80 percent while reducing cost by 30 percent.
Louisiana State University
In 2004, Louisiana State University (LSU) redesigned College Algebra, a three-credit course enrolling 4,900 students annually. Like Alabama and Idaho, the university moved all structured learning activity to a lab modeled after the Virginia Tech math emporium. Instructors also met students in a once-a-week focus group.
Learning outcomes were measured by comparing medians on a common final exam. Final-exam medians for traditional fall sections ranged from 70 percent to 76 percent. After redesign, the final-exam median in fall 2006 was 78 percent, the highest ever achieved. In the traditional format, final exams were graded by individual instructors, and partial credit was allowed. In the redesigned format, final exams were group-graded, which meant that grading was more consistent across sections, and partial credit was not allowed, yet exam medians were higher.
The success rate (grades of C or better) for College Algebra in the five years prior to the redesign had averaged 64 percent. The goal of the redesign was to maintain this rate and possibly raise it over time. In fall 2006, students had the lowest-ever drop rate of 6 percent and an A-B-C rate of 75 percent.
The redesign produced cost savings by serving the same number of students with half the personnel used in the traditional model. Section size stayed at 40–44 students, but the number of class meetings each week went from three to one. The redesigned format allowed one instructor to teach twice as many students as in the traditional format without increasing class size or workload, because the class only met once a week and no hand-grading was required. While the cost of adding tutors in the learning center and increased time for coordination and systems administration reduced the net savings, the redesign reduced the cost-per-student from $121 to $78, a 36 percent savings.
LSU has also redesigned its trigonometry and pre-calculus courses using the emporium model and continues to offer all courses in the redesigned format.
Stage IV: Expansion, 2006–2010
NCAT's 2006–2010 System- and Statewide Redesign Initiatives expanded the number of course redesigns to an additional 54 projects in Arizona, Maryland, Mississippi, New York, and Tennessee. Of those, 13 were in mathematics. We continued to gain more experience and to refine and clarify our redesign principles and models based on the results of these 13 projects.
The most significant occurrence in this period was the successful replication of the emporium model at two community colleges in Tennessee. In partnership with NCAT, the Tennessee Board of Regents (TBR) launched a Developmental Studies Redesign Initiative in 2006 to reform its remedial and developmental math and English curriculum. The goal was to develop and implement a more effective and efficient assessment and delivery system that would increase completion rates for students, reduce the amount of time that they spent in remedial and developmental courses, and decrease the amount of money that students spent to take developmental courses.
The redesign projects at Cleveland State Community College (CSCC) and Jackson State Community College (JSCC) were consecutive winners, in 2009 and 2010, of the Community College Futures Assembly's prestigious Bellwether Award. The award is given to cutting-edge programs that other colleges would find worthy of replicating.
SIX MODELS FOR COURSE REDESIGN
- Supplemental: Add to the current structure and/or change the content
- Replacement: Blend face-to-face with online activities
- Emporium: Move all classes to a lab setting
- Fully Online: Conduct all (or most) learning activities online
- Buffet: Mix and match according to student preferences
- Linked Workshop: Replace developmental courses with just-in-time workshops
FIVE PRINCIPLES OF SUCCESSFUL COURSE REDESIGN
- Redesign the whole course.
- Encourage active learning.
- Provide students with individualized assistance.
- Build in ongoing assessment and prompt (automated) feedback.
- Ensure sufficient time on task and monitor student progress.
FOUR MODELS FOR ASSESSING STUDENT LEARNING
Establish the method of obtaining data:
- Parallel sections (traditional and redesign)
- Baseline “before” (traditional) and “after” (redesign)
Choose the measurement method:
- Comparisons of common final exams
- Comparisons of common content items selected from exams
- Comparison of pre- and post-tests
- Comparisons of student work using common rubrics
COST-REDUCTION STRATEGIES
Identify the enrollment profile of the course: stable or growing?
Choose the labor-savings tactic(s) that will allow you to implement the chosen strategy with no diminution in quality.
- Substitute coordinated development and delivery of the whole course and shared instructional tasks for the individual development and delivery of each section.
- Substitute interactive tutorial software for face-to-face class meetings.
- Substitute automated grading of homework, quizzes, and exams for hand grading.
- Substitute course management software for human monitoring of student performance and course administration.
- Substitute interaction with other personnel for one-to-one faculty/student interaction.
Choose the appropriate cost-reduction strategy.
- Each instructor carries more students by increasing the size or number of sections for the same workload credit.
- Change the mix of personnel from more expensive to less expensive.
- Do both simultaneously.
FIVE CRITICAL IMPLEMENTATION ISSUES
- Prepare students (and their parents) and the campus for changes in the course.
- Train instructors, graduate teaching assistants, and undergraduate peer tutors.
- Ensure an adequate technological infrastructure to support the redesign as planned.
- Achieve initial and ongoing faculty consensus about the redesign.
- Avoid backsliding by building ongoing institutional commitment to the redesign.
The two Tennessee community colleges each made several key modifications to the model used by four-year institutions:
A “fixed” or “fixed/flexible” version of the emporium. In all versions, mandatory attendance (e.g., a minimum of three hours weekly) in a computer lab or classroom ensures that students spend sufficient time on task and receive on-demand assistance when they need it. At four-year institutions, a flexible version of the emporium has predominated: while a minimum number of lab hours is mandatory, those hours may be completed at the student's convenience. JSCC implemented a fixed version in which instructors meet with student cohorts in the lab at scheduled times. CSCC developed a fixed/flexible version—that is, the three mandatory hours working with software are a combination of one fixed meeting in a computer classroom, one flexible hour in the lab, and one additional hour spent working from anywhere (e.g., from home).
Modularization. The Tennessee community colleges redesigned multi-course sequences and introduced modularization as an additional innovation. Both CSCC and JSCC replaced the three-course developmental math sequence with a modularized curriculum mapped to the competencies originally required in the three courses. Students must complete one module satisfactorily before moving on to the next, and a student who does not finish every required module in one semester begins the next semester with the first module not yet completed. These multi-entry, multi-exit opportunities and individualized pacing give students more frequent opportunities for successful completion and more time to focus on deficiencies. Students can progress through content modules at a faster or slower pace, depending on the amount of time they need to master the module content.
Mastery learning. Both institutions combined a modularized curriculum with a mastery-based learning strategy. Before students can move from one homework assignment to the next, they are required to demonstrate mastery (70 percent at CSCC and 80 percent at JSCC). After all homework for a module is completed, students take a practice test as many times as needed.
Once ready, students take an online proctored post-test that comprises 70 percent (CSCC) or 75 percent (JSCC) of the module score. Unsuccessful students can ask for help before retaking the test. The remaining portion of the module score comes from attendance, notebooks, and homework; the total module score must be at least 75 for the student to complete the module.
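The scoring rules just described amount to a simple weighted calculation. The figures below (70 percent homework mastery and a post-test worth 70 percent of the module score, per CSCC, with a total of at least 75 required to complete the module) come from this article; the function names and the assumption that the remaining 30 percent is a single participation score are illustrative.

```python
# A minimal sketch of CSCC's module-scoring rules, assuming the
# non-post-test credit (attendance, notebooks, homework) can be
# rolled into one participation score on a 0-100 scale.

def homework_mastered(homework_score: float, threshold: float = 70.0) -> bool:
    """Students advance to the next assignment only after
    demonstrating mastery (70 percent at CSCC)."""
    return homework_score >= threshold

def module_score(post_test: float, participation: float) -> float:
    """The proctored post-test counts for 70% of the module score
    at CSCC; attendance, notebooks, and homework make up the rest."""
    return 0.70 * post_test + 0.30 * participation

def module_complete(post_test: float, participation: float) -> bool:
    """The total module score must be at least 75 to complete the module."""
    return module_score(post_test, participation) >= 75.0

# Example: 80 on the post-test with full participation credit
# yields 0.7 * 80 + 0.3 * 100 = 86, which completes the module.
print(module_complete(80, 100))
```

Note how the mastery rule and the completion threshold interact: a student who struggles on the post-test can retake it after getting help, rather than carrying the deficiency into the next module.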
At Jackson State, average post-test scores increased by 15 points and final grades of C or better increased by 44 percent; instructional costs were reduced by 20 percent.
NCAT is now replicating what was learned from the Tennessee program in a major national program, Changing the Equation. Thirty-eight community colleges are redesigning their developmental math sequences (all sections of all developmental courses offered) using the emporium model and commercially available instructional software. Each redesign will modularize the curriculum, allowing students to progress through the developmental course sequence at different rates. Institutions are piloting their redesign plans in spring 2011 and will fully implement them in fall 2011. Collectively, these 38 redesigns will affect more than 120,000 students annually.
We continued to apply what we learned from Tennessee in other state-based programs. An example from the Mississippi Course Redesign Initiative, conducted from 2007–2010 in partnership with the Mississippi Institutions of Higher Learning, illustrates the success of the emporium model at historically black colleges and universities.
Cleveland State Community College
In 2007, Cleveland State Community College (CSCC) redesigned its three developmental math courses, enrolling over 1,200 students annually. At CSCC, students are required to spend three hours each week in a combination of one fixed meeting, one flexible hour in the lab, and one additional hour spent working from anywhere. The one-hour class meetings are held in small computer classrooms (20 computers) where students work online and instructors help them individually, review their progress, and assist with their action plans for the coming week.
CSCC assessed student learning outcomes by comparing common content items from departmental final exams. In Basic Math, the percentage of items answered correctly increased from 73.3 percent to 86.2 percent; in Elementary Algebra, from 70.3 percent to 86.2 percent; and in Intermediate Algebra, from 77.3 percent to 90.1 percent. Prior to the redesign, an average of 55 percent of students taking any developmental math course earned a final grade of C or better. After the redesign, 72 percent earned an A, B, or C.
Before the redesign, the completion rate of developmental students in subsequent college-level courses was 71 percent, compared to a completion rate of 70 percent for other students. After the redesign, their completion rate was 81 percent, compared to 70 percent for other students.
An independent Tennessee Board of Regents (TBR) Office of Academic Affairs research study concluded, “A student in a redesign course is twice as likely to receive a grade that would allow them to move on to the next course than a student in a course before the redesign. The redesign format also has a strong, positive impact on success in the next course. As in the pre-college course, a student entering the next course is twice as likely to receive a grade of A, B, C, P, or S as a student from a course before the redesign.”
CSCC's redesign has reduced the cost of developmental math by 20 percent and produced an annual savings of more than $50,000 while reducing class size from 24 to 18. Faculty productivity rose by 23 percent. The average student load per faculty member went from 106 to 130. This increased productivity enabled the department to eliminate the use of adjunct faculty members while increasing course offerings. Faculty members are now expected to teach 10–11 sections, work 8–10 hours in the lab, and handle 150+ students each semester.
During the same period, CSCC also redesigned three college-level courses (College Algebra, Finite Math and Introductory Statistics) with similarly impressive results. CSCC's intention is to offer 95 percent of departmental offerings in the redesigned format.
Alcorn State University
In 2008, Alcorn State University redesigned College Algebra, a large-enrollment course of ~600 students, using the emporium model.
Students in the redesigned course performed significantly better than those in the traditional format: the average score on mid-term and final exams of the fall 2008 traditional sections was 55.89, while that of the fall 2009 redesigned sections was 66.16.
The redesigned course decreased the cost per student from $278 to $184, a 34 percent savings achieved by reducing the number of sections offered annually from 16 to eight and increasing section size from ~38 to 75. The number of faculty teaching the course was reduced from eight to four.
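The reported savings follow directly from the figures in the text. A quick back-of-the-envelope check (the per-student costs and section numbers are from the article; everything else is simple arithmetic):

```python
# Alcorn State figures from the article: cost per student
# before and after the College Algebra redesign, in dollars.
cost_before, cost_after = 278, 184
savings_rate = 1 - cost_after / cost_before
print(f"savings: {savings_rate:.0%}")  # prints "savings: 34%"

# The section figures imply roughly constant annual enrollment:
# 16 sections of ~38 students before, 8 sections of 75 after.
sections_before, size_before = 16, 38
sections_after, size_after = 8, 75
print(sections_before * size_before, sections_after * size_after)  # 608 600
```

In other words, the savings come almost entirely from halving the number of sections while holding total enrollment steady, not from serving fewer students.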
The savings have been used to strengthen the math major program. Each of the instructors is now able to teach a section of another course, and senior faculty are able to teach more upper-level courses, increasing the variety of courses available to math majors. The Alcorn State team is now redesigning Intermediate Algebra as well.
Similarly, Mississippi Valley State University (MVSU) redesigned Intermediate Algebra, which enrolls ~500 students annually. Students receiving passing grades (A-C) increased from 36 percent to 49 percent. At first glance, a student success rate of 49 percent might seem unimpressive. It is important to remember, however, that this rate was achieved during MVSU's first semester of full implementation of the redesign and that math redesigns at other institutions steadily improved their success rates over time. For example, the University of Alabama's initial improvement was from 40 to 50 percent; their success rate now hovers around 80 percent. MVSU's redesign decreased the number of sections offered from 17 to eight, producing a 24 percent savings.
During 2006–2009, NCAT's FIPSE-funded Colleagues Committed to Redesign program (C2R) expanded course-redesign adoption to 28 additional institutions, 10 of which were in mathematics (six at four-year institutions and four at two-year institutions). Again, those who “followed the rules” succeeded.
For example, the University of Central Florida (UCF) redesign of College Algebra, enrolling more than 4,100 students, increased the student success rate (C or higher) from 65 percent to 74 percent and mean exam performance from 63 percent to 81 percent while decreasing costs by 30 percent. The UCF team has gone on to redesign Intermediate Algebra and Pre-Calculus. The redesign of Intermediate Algebra, enrolling 2,760 students, at Santa Fe College (SFC) increased the percentage of students scoring a grade of C or better on a common final exam from 59 percent to 78 percent while reducing costs by 26 percent. SFC is also redesigning Prep Pre-Algebra, Elementary Algebra, Integrated Arithmetic and Algebra, and College Algebra.
By 2009, we began to insist that any math redesign proposal in an NCAT program use the emporium model.
Core Principles: Why Has the Emporium Model Been Sustained?
In a June 9, 2008 Inside Higher Ed article, Vincent Tinto declared, “We must stop tinkering at the margins of institutional life, stop our tendency to take an ‘add-on’ approach to institutional innovation, … stop marginalizing our efforts and in turn our academically under-prepared students, and take seriously the task of restructuring what we do.”
Most reformers in mathematics are simply tinkering at the margins without a clear vision of how to create significant and sustainable change. NCAT and its partner institutions have proven that redesigning both developmental and college-level math using the emporium model results in dramatic increases in student success and reductions in instructional costs. Furthermore, we have done so with very large numbers of students over a ten-year period. Institutions like Virginia Tech and the Universities of Alabama and Idaho have taught thousands of students for a decade in this new mode. NCAT redesigns have moved well beyond the experimentation stage; they have been both scaled and sustained.
We believe that the following characteristics of redesign directly contribute to that scalability and sustainability and are key differentiators between NCAT redesigns and other reform efforts in math education.
Whole-course redesign conducted by teams of faculty and administrators. Innovations in higher education frequently fail because they are dependent upon a single champion—a risk-taking, creative faculty member or administrator who is trying to create change within the institution. If that champion leaves the institution or changes positions within it, there goes the innovation. “Random acts of progress,” as Bill Graves has called them, frequently produce good results but rarely lead to sustained change.
In contrast, NCAT course-redesign teams include many faculty and administrators who follow a redesign plan that is fully supported by the entire department. In each NCAT redesign, the whole course rather than a single class or section is the target of redesign. In contrast to traditional courses, where each instructor typically does his or her own thing, redesigned courses are consistent in content, coverage, assessment, and pedagogy across all sections. The redesign becomes “institutionalized,” making the innovation relatively impervious to individual shifts in personnel. A collective commitment to redesign the whole course is key to sustainability.
Proven methods of integrating technology and learner-centered pedagogy. Innovations in higher education that focus on materials creation rather than how the materials are used frequently fail. Successful course redesign that improves student learning while reducing instructional costs is heavily dependent upon high-quality, commercially available learning materials such as ALEKS, Hawkes Learning Systems, or MyMathLab, which play a central role in engaging students with course content. Faculty members who incorporate commercially available materials are able to focus on pedagogical and organizational issues rather than on materials creation, adaptation, and maintenance. Redesign teams can also rely on commercial providers for training, support, and software maintenance.
But it's not the software itself that's critical to success; it's the way the software is used. Most attempts to use technology in mathematics reform are simply “add-ons” to an otherwise unchanged instructional process. Students continue to meet in traditional classroom settings with teacher-led activities at fixed times and places, and technology is used as a supplement, typically outside of class as homework, and often as a suggestion rather than a requirement.
NCAT redesigns make student use of software coupled with on-demand, individualized assistance a centerpiece of their pedagogical strategies. Rather than leaving it up to individual instructors to decide whether and how to use instructional software, these redesigns coordinate the efforts of all course instructors so that all students receive a uniform, high-quality learning experience.
Cost reduction as an integral part of the redesign. Unfortunately, many innovations in higher education rely on internal or external grant funding simply to exist rather than using that funding to support the transition to a sustainable model. Increased student success may be temporarily achieved due to the extra resources provided by the grant, but when the funding ends, so does the innovation.
In contrast, in every successful NCAT redesign, the cost of offering the course is reduced. Institutions that have increased learning at a reduced cost have no motivation to return to a less-successful, more-expensive approach. Each redesign includes sustainability in its plan from the outset, and no new resources are needed on a recurring basis to sustain the redesign.
In a 1994 Educom Review article, Robert C. Heterick, Jr., former president of Educom, wrote:
Lord Kelvin once made the observation, “If you can measure that of which you speak and express it in numbers, you know something about your subject; but if you cannot measure it, your knowledge is of a very meager and unsatisfactory kind.” If he is correct, then our knowledge about how, and to what extent, the use of information technology in teaching and learning affects outcomes—both learning and cost—is meager indeed.
One of our continuing tasks must be to measure, hypothesize, and finally formalize theories about how technology applies to the educational enterprise. One of our great failings as a community consists of relying too heavily on the anecdotal and not doing the hard work of “proving” our concepts through meticulous measurement and theory building.
If, as many of us are already convinced, information technology will be the lever that dramatically repositions the learning enterprise in our society, then we have a truly formidable task ahead of us in selling a significant reallocation of institutional resources away from personal mediation and toward technology mediation. Absent well-documented measurements of how much learning for how much resource, we can expect to see a continuation of the pursuit of academic quality that is indifferent to cost.
NCAT's work over the past 11 years has been inspired by these ideas and has tried to live up to this standard.
The implications of the outcomes produced by the institutions cited in this article are clearly substantial for colleges and universities around the country. By putting students first and organizing their redesigns around the individual needs of students rather than the convenience of institutions, these pioneering institutions are making a major contribution to improving the ways in which all of us help students succeed in college and move more rapidly to degree completion.
I recently wrote to John Squires, the project leader at Cleveland State, to congratulate him and his colleagues on their outstanding work. I said, “You guys are the poster children for how to do the right thing! You should be really proud.” John's response: “Much of what we did is simply follow the NCAT playbook.”
That playbook is the product of the hard work and dedication of many extraordinary faculty and staff around the country who are showing the way to address one of our country's most vexing academic problems. The message is simple: Students learn math by doing math, not by listening to someone talk about doing math.
Carol A. Twigg (ctwigg@theNCAT.org) is president and CEO of the National Center for Academic Transformation (NCAT), an independent, not-for-profit organization dedicated to the effective use of information technology to improve student learning outcomes and reduce costs in higher education. Since 1999, NCAT has conducted four national and five state-based course redesign programs, producing more than 120 large-scale redesigns that achieve quality enhancements as well as cost savings. Participating institutions include research universities, comprehensive universities, private colleges, and community colleges. Course redesigns focus primarily, but not exclusively, on large-enrollment, introductory courses in multiple disciplines, including 16 in the humanities, 60 in quantitative subjects, 23 in the social sciences, 15 in the natural sciences, and 6 in professional studies.