Thirteen years ago, 276 bachelor's-granting colleges and universities inaugurated a new approach to assessing college quality by participating in the first national administration of the National Survey of Student Engagement (NSSE). The timing was right. Policymakers were growing increasingly impatient with an ongoing yet unsustainable pattern of cost escalation, skepticism was building about how much students were learning in college, and regional accreditors were ratcheting up their demands on colleges and universities to adopt assessment for purposes of improvement.
Meanwhile, higher education's leaders were frustrated by the crude metrics dominating the discourse about college quality. It's been said that a dean at one of those early-adopting institutions enthusiastically proclaimed: “Finally, a test I actually want to teach to!”
NSSE introduced a simple yet effective reframing of the quality question: ask undergraduates about their educationally purposeful experiences. It incorporated several important design principles:
Emphasize behaviors that prior research found to be positively related to desired learning outcomes.
Emphasize actionable information—behaviors and experiences that institutions can influence.
Standardize survey sampling and administration to ensure comparability between institutions.
Provide participating institutions with comprehensive reports detailing their own students' responses relative to those at comparison institutions, plus an identified student data file to permit further analysis by the institution.
NSSE was administered to first-year students and seniors, opening a window on quality at these “bookends” of the undergraduate experience. In addition to reporting item-by-item results, the project created summary measures in the form of five “Benchmarks of Effective Educational Practice” that focused attention on key dimensions of quality in undergraduate education: level of academic challenge, active and collaborative learning, student-faculty interaction, enriching educational experiences, and supportive campus environment.
The new survey caught on fast. Annual participation now numbers 600–700 institutions, for a cumulative total of more than 1,500 colleges and universities in the US and Canada. What started as a bold experiment in changing the discourse about quality and improvement in undergraduate education—and providing metrics to inform that discourse—is now a trusted fixture in higher education's assessment landscape.
High rates of repeat participation offer compelling testimony of the project's value. Of the first group of 276, 93 percent administered the survey in NSSE's tenth year or later.
The Web-based survey is now offered as a census of first-year students and seniors, permitting disaggregated analyses by academic unit or demographic subgroup. In 2013, some 1.6 million undergraduates were invited to complete the survey, providing both valuable information for more than 620 participating campuses and a comprehensive look at student engagement across a wide variety of institutions.
The 2013 administration marks the first major update of the survey since its inception. In the following pages, we summarize what we've learned over NSSE's first 13 years, why we're updating the survey, and the new insights and diagnostic possibilities these changes represent. Although NSSE's companion surveys, the Faculty Survey of Student Engagement (FSSE) and the Beginning College Survey of Student Engagement (BCSSE), are incorporating parallel changes, here we focus on the changes to NSSE.
What We've Learned
Both NSSE and the Community College Survey of Student Engagement (CCSSE) have collected data for more than a decade now. A host of large-scale studies (the Wabash National Study of Liberal Arts Education [WNSLAE], NSSE's Connecting the Dots, and CCSSE's validation studies) connect engagement data to indicators of success.
The Wabash Study Connects Learning and Engagement
WNSLAE—a large-scale multi-institutional, multi-method study conducted by the Wabash College Center of Inquiry in the Liberal Arts—has found evidence of positive connections between students' experiences and their learning and development. Using a pre-test/post-test design, WNSLAE tested students on six broad outcomes of liberal education—critical thinking and problem solving, inclination to inquire and orientation toward lifelong learning, intercultural effectiveness, leadership, moral reasoning, and personal well-being—and connected these to information about the student experience, including measures of engagement from NSSE (Blaich & Wise, 2011).
Adjusting for the average pre-test scores of entering students, all but one of the NSSE benchmarks were positively associated with one or more outcomes after the first year of college, averaged at the institutional level (Pascarella, Seifert, & Blaich, 2010). Deep approaches to learning—for example, coursework that emphasizes higher-order cognitive tasks such as synthesis and evaluation, asks students to integrate diverse perspectives and ideas from different courses, and encourages reflective learning—positively affected the development of moral reasoning in first-year students (Mayhew, Seifert, Pascarella, Nelson Laird, & Blaich, 2012).
What's more, meaningful discussions with faculty and peers outside of the classroom during the first year of college appeared to stimulate a desire to engage in cognitive activities (Padgett et al., 2010). The Wabash studies show how core elements of liberal education connect to outcomes such as intercultural effectiveness, lifelong learning, psychological well-being, and socially responsible leadership (Seifert et al., 2008).
NSSE and CCSSE Connect Engagement and Success
In their “Connecting the Dots” study, NSSE researchers found significant positive, though modest, relationships between engagement and both grades and persistence to the second year, after controlling for a wide range of pre-college variables. And engagement had stronger effects on first-year grades and persistence to the second year for underprepared and historically underserved students (Kuh, Cruce, Shoup, Kinzie, & Gonyea, 2008)—the very populations most in need of improved outcomes.
CCSSE researchers analyzed three large multistate data sets in order to document the relationships in the two-year sector between engagement and indicators of success such as grades, credit-hour accumulation, persistence, and degree attainment. They found significant positive associations between student engagement and those outcomes, supporting the proposition that student engagement is related to success in that sector as well (McClenney & Marti, 2006).
Researchers have also found a correlation between specific dimensions of engagement and retention. Students at bachelor's-granting institutions were more likely to return to college if they participated in high-impact and co-curricular activities (Kuh, 2008), as were community college students who engaged in collaborative learning, were challenged academically, spent substantial time on task, and had interactions with faculty (McClenney & Marti, 2006).
Some Key Questions Answered
Do students invest enough time in their studies?
According to accounts of time use in NSSE, the average full-time college student studies only about half of the traditional expectation of two hours of study time for each hour of class (NSSE, 2011). But study time varies widely by discipline. For example, engineering students study five hours more, on average, than their peers in business and the social sciences. And faculty no longer hold to the two-to-one rule of thumb: results from FSSE suggest that the study time they typically expect of students is only slightly higher than what students themselves report (NSSE, 2011).
Can students be classified by engagement type? Another promising development in engagement research is a renewed interest in student typologies. Typological methods may be a useful approach to understanding patterns of student engagement and how they relate to success, especially since most of the variability in student engagement is between students rather than between institutions. Hu and McCormick (2012) used WNSLAE data to develop a typology of first-year students representing different engagement patterns, and they examined how outcomes varied across the types.
Of the seven groups identified, two polar opposites were the “disengaged” and the “maximizers.” The average disengaged student scored well below the mean on all engagement measures, while the typical maximizer was well above average across the board. Not surprisingly, disengaged students showed significantly smaller gains than maximizers on the four WNSLAE outcomes examined, as well as lower first-year GPAs, perceived learning gains, and persistence to the second year than their most-engaged peers.
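To make the idea of an engagement typology concrete, here is a minimal sketch that standardizes each engagement measure across students and applies a simple threshold rule to flag the two polar patterns. This is only an illustration: Hu and McCormick used a formal cluster analysis, and the toy scores and 0.5-standard-deviation cutoff below are invented.

```python
from statistics import mean, stdev

def zscores(values):
    """Standardize one engagement measure across students."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def classify(student_z, cut=0.5):
    """Crude pattern rule: 'disengaged' students sit well below the mean
    on every measure, 'maximizers' well above (illustrative only)."""
    if all(z <= -cut for z in student_z):
        return "disengaged"
    if all(z >= cut for z in student_z):
        return "maximizer"
    return "mixed"

# Toy data: rows are students, columns are engagement measures.
scores = [[10, 12], [50, 55], [90, 93]]
cols = [zscores([row[j] for row in scores]) for j in range(len(scores[0]))]
labels = [classify([cols[j][i] for j in range(len(cols))])
          for i in range(len(scores))]
```

A real typology would let the groupings emerge from the data rather than from fixed cutoffs, which is why cluster-analytic methods identified seven groups rather than three.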
What distinguishes engaging institutions? A decade ago, NSSE's Documenting Effective Educational Practice (DEEP) project described the conditions for student learning and success at high-performing institutions. Twenty DEEP colleges and universities shared six predominant features:
A “living” mission and a “lived” educational philosophy
An unshakeable focus on student learning
Clearly marked pathways to student success
Environments adapted for educational enrichment
An improvement-oriented campus culture
Shared responsibility for educational quality and student success
Revisiting these institutions in 2010, DEEP researchers found that their retention and graduation rates were still strong, and several had improved. NSSE scores were also strong, and the six features remained a focus of their commitment to student outcomes.
Several practices had taken on even greater importance: data-informed decision-making, the ethic of “positive restlessness,” collaboration between academic and student affairs, and campus leaders' work to increase faculty and staff understanding of the conditions for student success (Kuh, Kinzie, Schuh & Whitt, 2011).
Is there evidence of improvement? From its inception, one of NSSE's core objectives has been to inform institutional improvement efforts. So from the project's early days, we have collected examples of how institutions use their engagement results, featuring scores of examples in our annual reports and more recently in our Lessons from the Field series and in a searchable database on our Website.
Results from more than 200 institutions with at least four NSSE administrations showed that more than two-fifths had a significant positive trend in at least one engagement measure for first-year students; 28 percent did so for seniors. Only a handful had significant negative trends—an asymmetry indicating that the positive trends were unlikely to be due to chance variation (NSSE, 2009). A more recent analysis involving more than 400 institutions and a longer span of years confirmed the earlier finding—indeed, the proportions with positive trends went up (NSSE, 2012).
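The kind of trend screening described above can be sketched in a few lines: fit a least-squares slope to each institution's scores across administrations and flag the positive slopes. NSSE's actual analyses also tested whether each slope was statistically significant; the institution names and scores below are hypothetical.

```python
def trend_slope(years, scores):
    """Ordinary least-squares slope of an engagement measure over time."""
    n = len(years)
    mx, my = sum(years) / n, sum(scores) / n
    return (sum((x - mx) * (y - my) for x, y in zip(years, scores))
            / sum((x - mx) ** 2 for x in years))

# Hypothetical institutions, each with four administrations.
trends = {
    "College A": trend_slope([2004, 2006, 2008, 2010], [48.0, 49.5, 51.0, 52.8]),
    "College B": trend_slope([2004, 2006, 2008, 2010], [55.0, 54.5, 54.0, 53.5]),
}
improving = [name for name, b in trends.items() if b > 0]
```

In practice one would also compute a standard error for each slope and count only those trends that clear a significance threshold, which is what separates real improvement from year-to-year noise.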
Preliminary findings from follow-up investigations suggest that the positive trends are the result of several factors, including intentional efforts by the institutions, an institutional commitment to improving undergraduate education, attention to data that reveal a need for improvement, and faculty or staff interest in improving undergraduate education. Few institutions identified “national calls for accountability” or “mandates from governing, state, or legislative boards” as motivating factors (McCormick, Kinzie, & Korkmaz, 2011).
And contrary to conventional wisdom about the types of institutions where change is possible, positive trends were detected across the spectrum of institutions—not just at small, private, residential colleges (see examples in figure 1).
Figure 1. Positive Trends in First-Year Active & Collaborative Learning at Four Institutions
In NSSE's brief history, we have learned much about the conditions that foster student success. But like all teenagers, NSSE is changing. We elaborate on the changes, and on their potential to advance the conversation about student engagement and educational quality, in the next section.
An Updated NSSE
Given NSSE's wide adoption, the adage “Don't mess with success” would seem to apply. Yet the changing context of higher education, lessons from NSSE's first decade, and new research findings from projects such as WNSLAE argue for an update—as do the increasing importance of high-quality assessment data and the need for NSSE to remain relevant to current issues and concerns.
The first few years of the NSSE project witnessed a series of modifications and refinements. The calculation of NSSE benchmarks also changed from aggregate institution-level measures to student-level measures, enabling intra-institutional comparisons of subpopulations (e.g., among academic units or demographic subgroups).
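The practical payoff of student-level scoring can be shown with a toy sketch: once each student carries a benchmark score, within-institution subgroup comparisons reduce to a simple group-by. The unit names and scores below are invented, and NSSE's actual benchmark computation involves rescaling items to a common metric before averaging.

```python
from collections import defaultdict
from statistics import mean

def subgroup_means(records):
    """Average a student-level engagement score within each subgroup."""
    groups = defaultdict(list)
    for group, score in records:
        groups[group].append(score)
    return {g: mean(s) for g, s in groups.items()}

# Hypothetical student-level records: (academic unit, benchmark score).
records = [("Arts", 52.0), ("Arts", 48.0),
           ("Engineering", 60.0), ("Engineering", 58.0)]
by_unit = subgroup_means(records)
```

Under the older institution-level calculation, only a single campus-wide number existed, so comparisons like this were not possible.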
Beginning in 2005, however, the project adopted a policy of continuity. This kept the survey largely unchanged, enabling institutions to track their results over time. During this time we focused on enhanced reporting and services for NSSE users, making it easier to assess trends, examine results by major-field groups, use results for accreditation, and so on. We also continued to analyze survey properties and performance, collected input from users about valued items and recommended changes, and carried out research and development to inform a future revision. The development model was akin to the concept of “punctuated equilibrium” from evolutionary biology: a long period of stability, followed by a burst of change.
The update process began in NSSE's tenth anniversary year—2009. In contrast to NSSE's initial development—carried out with generous but time-limited startup funding from the Pew Charitable Trusts and a small but dedicated staff—the “NSSE 2.0” development effort benefited from an extended timetable, a deep pool of test institutions, and a large research staff, advantages of the mature project's scale.
The multi-year process involved consulting with campus users and experts from the field, reviewing recent literature, conducting research, gathering ideas from our advisory board and other interested partners, analyzing the psychometric properties of the current survey and several years of experimental questions, conducting focus groups and cognitive interviews with students, and carrying out two years of pilot testing and analysis. Nearly 80 institutions participated in the development effort, whether by administering pilot instruments or hosting cognitive interviews of students.
This work—which would have been impossible without such collaboration—yielded valuable insights while surfacing difficult challenges and choices. The update had four goals:
Develop new measures related to effective teaching and learning,
Refine existing measures,
Improve the clarity and applicability of survey language, and
Update terminology to reflect current educational contexts.
A guiding principle was to maintain NSSE's signature focus on diagnostic and actionable information related to effective educational practice. That resulted in one of the most significant transitions introduced with the updated survey: the shift from the familiar five NSSE Benchmarks to a new set of ten “Engagement Indicators,” nested within broad themes that echo the Benchmarks.
While the NSSE Benchmarks had high face validity, were simple to recite, and provided an easily digested overview of the results, many users reported that their value for institutional improvement was limited: They lacked specificity about where to concentrate improvement efforts. In addition, because we drew the Benchmarks from only about half of NSSE's engagement-related questions, institutional users sometimes neglected other valuable information (for example, items that assessed reflective learning). The new Engagement Indicators combine high face validity with a more coherent framework and specific measures for the improvement of teaching and learning (see the box on the next page).
The new Engagement Indicators—such as higher-order learning, collaborative learning, learning strategies, discussions with diverse others, and supportive environment—provide faculty members, department chairs, deans, provosts, and presidents with new opportunities to dig into results and formulate plans to increase the prevalence of effective educational practices. In addition, their strong psychometric properties will be useful in supplemental analyses, including ones about student subgroups within institutions. Some of the most helpful insights that participating institutions have gained from their NSSE data have resulted from “looking within”—examining the considerable variability in student engagement that occurs within institutions rather than between them.
For example, discovering that first-generation students are significantly less engaged with faculty than their non-first-generation peers or that students in different disciplines participate at vastly different rates in high-impact practices such as service-learning or research with faculty can suggest immediate and specific action. While many institutions have already achieved this level of NSSE data use, we believe that the new measures will help many more make the vital transition from data to information to action.
Emerging Areas of Interest
With each major update of the NSSE survey comes a chance to explore emergent areas of interest in higher education. For example, the 2013 update includes new measures of quantitative reasoning, perceptions of effective teaching practice, and collaborative learning activities, mirroring increasing interest in deep approaches to learning and high-impact practices. The update also introduces a new menu of topical modules to permit deeper exploration of topics of wide interest to colleges and universities.
Quantitative Reasoning
Most colleges and universities identify quantitative literacy as an important outcome in today's information age—as important as reading and writing have been for generations. In Mathematics and Democracy (2001), Steen and colleagues argue that quantitative literacy “empowers people by giving them tools to think for themselves, to ask intelligent questions of experts, and to confront authority confidently” and assert that such skills are “required to thrive in the modern world” (p. 2).
Quantitative literacy, defined as the “ability to understand and use numbers and data in everyday life” (Madison, 2003, p. 3), has as its pedagogical antecedent quantitative reasoning: the skills needed to evaluate, support, and critique arguments using numerical information. Because all students need to develop these skills, quantitative-reasoning experiences should not be limited to students in science, technology, engineering, and mathematics (although we can expect these majors to have deeper and more frequent exposure).
After testing a larger set of quantitative-reasoning items, NSSE selected three for inclusion in the 2013 survey. Taken together, these questions provide a basic measure of quantitative-reasoning activity for both first-year students and seniors.
During the current school year, about how often have you done the following? (Response options: Very often, Often, Sometimes, Never)
Reached conclusions based on your own analysis of numerical information (numbers, graphs, statistics, etc.)
Used numerical information to examine a real-world problem or issue (unemployment, climate change, public health, etc.)
Evaluated what others have concluded from numerical information
Effective Teaching Practices
The literature has established a relationship between effective teaching practices and desired outcomes in both course-specific learning and general measures of cognitive growth (see Pascarella & Terenzini, 2005, for a summary of this evidence), in addition to positive links with persistence (Braxton, Bray, & Berger, 2000). Analyses of WNSLAE data also document a positive relationship between effective teaching and learning gains: Students learn more when their instructors are organized and prepared, give clear instructions, use examples or illustrations to convey difficult points, and provide prompt and detailed feedback (WNSLAE, n.d.).
Informed by these important findings, the 2013 NSSE survey includes the following teaching-related questions:
During the current school year, to what extent have your instructors done the following? (Response options: Very much, Quite a bit, Some, Very little)
Clearly explained course goals and requirements
Taught course sessions in an organized way
Used examples or illustrations to explain difficult points
Provided feedback on a draft or work in progress
Provided prompt and detailed feedback on tests or completed assignments
Collaborative Learning
The updated NSSE survey emphasizes collaborative learning in courses by asking students how often they seek the help of, and explain course material to, other students, as well as how often they engage with their peers in study and coursework. Working with others to puzzle through problems, engage in creative tasks, and reach reasoned conclusions exposes students to different perspectives that, when weighed against their own views, can shape new understandings. When instructors and institutions create environments where peers help each other learn, other students become legitimate resources for cognitive growth.
In Academically Adrift, Arum and Roksa (2011) included time spent studying with peers in a measure of “social engagement,” which they found to be negatively related to gains in generic skills. (Their social engagement measure also included time spent at fraternities or sororities.) We believe the new NSSE items more specifically target intentional and productive out-of-class intellectual engagement.
During the current school year, about how often have you done the following? (Response options: Very often, Often, Sometimes, Never)
Asked another student to help you understand course material
Explained course material to one or more students
Prepared for exams by discussing or working through course material with other students
Worked with other students on course projects or assignments
Deep Approaches to Learning
The learning sciences have increased our understanding of the tasks and experiences that confer lasting educational benefits. The construction, transformation, and application of knowledge, for example, are typically more effective than rote memorization.
Some researchers thus make a distinction between deep and surface-level processing (see Tagg, 2003). Building on several years of NSSE research on “deep approaches to learning” (Nelson Laird, Shoup, Kuh, & Schwarz, 2008), new Engagement Indicators will provide insight into students' experiences with higher-order, reflective, and integrative learning.
One set of NSSE items asks students how much their coursework emphasizes analysis, synthesis, judgment, and application (corresponding to the list in Benjamin Bloom et al.'s Taxonomy of Educational Objectives). A second set relates to students' revision of previously held views, consideration of others' perspectives on a topic or issue, and integration of knowledge from multiple sources.
High-Impact Practices
In 2007, when asked what one thing institutions could do to increase student engagement and success, NSSE's founding director George Kuh offered a challenge: “Make it possible for every student to participate in at least two high impact activities during their undergraduate program, one in the first year, and one later related to their major field” (NSSE, 2007).
Enriching educational opportunities such as learning communities, service learning, research with a faculty member, study abroad, internships, and culminating senior experiences are labeled “high impact” because of their positive effect on student learning and development. These experiences call on students to invest considerable time and effort, facilitate out-of-class learning, engage students meaningfully with faculty, encourage interaction with people unlike themselves, and provide frequent feedback on performance. Students often describe their participation in these activities as life changing.
Participation in high-impact practices, previously combined with other activities under the Enriching Educational Experiences benchmark, will now be summarized separately in NSSE reporting, including both activity-specific patterns and total exposure among first-year and senior students.
Drilling Down with Topical Modules
The current NSSE survey asks a few questions about a wide range of educationally purposeful experiences—it offers broad but not deep coverage. Another change with NSSE 2013 is the introduction of optional topical modules—short question sets that focus on institutions' special interests or strategic priorities—all of which have been extensively field-tested.
Examples include experiences with writing, diversity, technology, advising, and civic engagement. Several of these were developed in collaboration with individuals or organizations with special expertise, such as the American Association of State Colleges and Universities, the Association of American Colleges and Universities, the Council of Writing Program Administrators, and EDUCAUSE.
Challenges and Opportunities for Institutions, Leaders and Faculty
Improving quality in undergraduate education to foster learning and success for all students is a vital imperative for US higher education. The challenge that this presents to institutional leaders, faculty, and staff demands meaningful measures of, and concerted action to enhance, educational effectiveness.
Many campuses have implemented improvement initiatives leading to gains in student engagement (McCormick, Kinzie, & Korkmaz, 2011), while others struggle to draw clear lessons from their assessment data or to formulate concrete improvement priorities. Campuses that truly use NSSE—as distinguished from merely participating in it—know that the receipt of data files and reports is only the beginning of a process of sharing and making meaning of results, identifying priorities for action, formulating concrete action plans, implementing those plans, and circling back to assess their impact.
But there are obstacles to doing this. Complicating institutions' capacity to use data about the student experience is the growing problem of survey fatigue on college campuses. Pleas for students to complete surveys and sit for learning assessments have proliferated with increasing calls for evidence-based decisions, the rise of accountability regimes, and the advent of low- and no-cost Web-based survey tools. These voluntary contributions of time typically confer no concrete benefit on the students, whose compliance is eroding steadily.
NSSE-participating institutions have responded to survey fatigue in a variety of ways, including better coordination of surveys, greater control over student email lists, creative promotional campaigns and incentives, and follow-up efforts signaling to students that they take assessment findings seriously. But concerns about survey fatigue constrain how much new content can be added to the survey and necessitate the elimination of some items to make room for it.
The growth of online education represents another challenge. It has altered the landscape of undergraduate education and consequently motivated several changes to NSSE. Question wording that might have implied a physical classroom setting has been modified to be more neutral with respect to mode of delivery.
This change enhances both the survey's relevance for online learners and our ability as researchers to share findings about online education. The need to explore the quality of learning online and in physical classrooms is vital as institutions make decisions about investing more in online education and related faculty development.
The Past As Prologue
NSSE has accomplished a lot in its short history. It has been a useful tool for a wide array of educators (faculty members, deans, assessment specialists, and teaching center staff, as well as provosts and presidents), helping them gain valuable insights into complicated questions of quality while avoiding the reductionist logic of rankings.
NSSE's greatest strength is arguably its ability to stimulate serious conversations about what colleges and universities are doing well and where improvement is needed. Under the right conditions, those conversations lead to deeper inquiry, action, and improvement.
We are excited by the opportunities presented by the updated NSSE survey and grateful for the extensive collaboration that made it possible. An ever-expanding research base; a deliberate, rigorous, and collaborative development process; and a continued emphasis on diagnostic and actionable information provide a strong foundation for renewed attention to student engagement and the serious work of improving educational quality.
1. Arum, R., & Roksa, J. (2011). Academically adrift. Chicago, IL: University of Chicago Press.
2. Blaich, C. F., & Wise, K. S. (2011, April). The Wabash National Study: The impact of teaching practices and institutional conditions on student growth. Paper presented at the annual meeting of the American Educational Research Association.
3. Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York, NY: Longmans, Green.
4. Braxton, J. M., Bray, N. J., & Berger, J. B. (2000). Faculty teaching skills and their influence on the college student departure process. Journal of College Student Development, 41, 215–227.
5. Hu, S., & McCormick, A. C. (2012). An engagement-based student typology and its relationship to college outcomes. Research in Higher Education, 53, 738–754.
6. Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Washington, DC: Association of American Colleges and Universities.
7. Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects of student engagement on first-year college grades and persistence. Journal of Higher Education, 79, 540–563.
8. Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2011). Fostering student success in hard times. Change, 43(4), 13–19.
9. Madison, B. L. (2003). The many faces of quantitative literacy. In B. L. Madison & L. A. Steen (Eds.), Quantitative literacy: Why numeracy matters for schools and colleges (pp. 3–6). Princeton, NJ: National Council on Education and the Disciplines.
10. Mayhew, M. J., Seifert, T. A., Pascarella, E. T., Nelson Laird, T. F., & Blaich, C. (2012). Going deep into mechanisms for moral reasoning growth: How deep learning approaches affect moral reasoning development for first-year students. Research in Higher Education, 53, 26–46.
11. McClenney, K. M., & Marti, C. N. (2006). Exploring relationships between student engagement and student outcomes in community colleges: Report on validation research. Retrieved from www.ccsse.org/publications
12. McCormick, A. C., Kinzie, J., & Korkmaz, A. (2011, April). Understanding evidence-based improvement in higher education: The case of student engagement. Paper presented at the annual meeting of the American Educational Research Association.
13. National Survey of Student Engagement. (various years). NSSE annual results. Reports cited in this article may be viewed or downloaded from nsse.iub.edu.
14. Nelson Laird, T. F., Shoup, R., Kuh, G. D., & Schwarz, M. (2008). The effects of discipline on deep approaches to student learning and college outcomes. Research in Higher Education, 49(6), 469–494.
15. Padgett, R. D., Goodman, K. M., Johnson, M. P., Saichaie, K., Umbach, P. D., & Pascarella, E. T. (2010). The impact of college student socialization, social class, and race on need for cognition. New Directions for Institutional Research, 145, 99–111.
16. Pascarella, E. T., Seifert, T. A., & Blaich, C. (2010). How effective are the NSSE benchmarks in predicting important educational outcomes? Change, 42(1), 16–22.
17. Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research (Vol. 2). San Francisco, CA: Jossey-Bass.
18. Seifert, T. A., Goodman, K. M., Lindsay, N., Jorgensen, J. D., Wolniak, G. C., Pascarella, E. T., & Blaich, C. (2008). The effects of liberal arts experiences on liberal arts outcomes. Research in Higher Education, 49(2), 107–125.
19. Steen, L. A. (Ed.). (2001). Mathematics and democracy: The case for quantitative literacy. Washington, DC: National Council on Education and the Disciplines.
20. Tagg, J. (2003). The learning paradigm college. Boston, MA: Anker.
21. Wabash National Study of Liberal Arts Education. (n.d.). High-impact practices and experiences from the Wabash National Study. Retrieved from www.liberalarts.wabash.edu/storage/High-Impact_Practices_Summary06.01.09.pdf
Alexander C. McCormick (email@example.com) is associate professor of Educational Leadership and Policy Studies at Indiana University Bloomington, where he teaches in the Higher Education and Student Affairs program. When he joined the faculty at Indiana University in 2008, he succeeded George Kuh as NSSE's director, having served as technical consultant to the project in its early years.
Robert M. Gonyea (firstname.lastname@example.org) is associate director of the Indiana University Center for Postsecondary Research, with special responsibility for research and data analysis for NSSE and its affiliated surveys.
Jillian Kinzie (email@example.com) is associate director of the Indiana University Center for Postsecondary Research, with special responsibility for NSSE's outreach and service efforts through the NSSE Institute for Effective Educational Practice.