Assessing learning in higher education draws on various course-specific indicators, such as instructor-designed tests, tasks, projects, and instructor perceptions. Outcomes and standards, whether locally designed or administered through national or state agencies and regional accrediting bodies, call for indicators that assess institutional well-being, especially as it relates to information literacy, problem solving in small groups, critical thinking, written and spoken communication, and broadly conceived notions of what it means to be liberally educated. Accordingly, college administrators agree to ongoing assessments that profile students and clarify the changing needs of the institution. Teaching in a culture of on-demand information necessitates such ongoing review.
The National Survey of Student Engagement (NSSE) assesses five benchmarks of effective educational practice linked to academic success: level of academic challenge, active and collaborative learning, student-faculty interactions, enriching educational experiences, and supportive campus environment (NSSE, 2005). Mark and Boruff-Jones (2003) argue that NSSE is underused as an assessment instrument for correlating higher education learning goals with student engagement. The survey instrument, both valid and reliable, provides information for comparing different student groups. By analyzing an institution's NSSE results against responses from students at similar colleges, instructors gain insight into students' perceptions of their learning. These perceptions, when compared with students' responses over time and examined in relation to other information about learning, provide a more comprehensive view of institutional learning outcomes.
The authors first mapped the Information Literacy (IL) Standards onto a Ladder of Abstraction aligned with Bloom's Taxonomy of Cognitive Development (Weiss, Corso, & Kelly, 2005/2006). After examining syllabi from across the curriculum, they identified courses that address the IL Standards and plotted each on the Ladder of Abstraction. They then gathered data from the 2004, 2005, and 2006 NSSE surveys. Using an analysis of variance, the authors identified fifteen NSSE statements that showed statistically significant differences (p < 0.05) between freshmen and seniors at other similar institutions, and between freshmen and seniors at Experimental College, to measure growth or development areas within the institution.
As a result of this study, the authors conclude that NSSE data, examined alongside other data, are useful for assessing the Information Literacy Standards in undergraduate education. Results indicate that students at this institution show growth (p < 0.05) in three areas of cognitive development. However, responses by seniors at Experimental College, when compared with seniors at other institutions, indicate a need for greater emphasis in one area of cognitive development. Results from the analysis of the ACRL standards, Bloom's Taxonomy, the Ladder of Abstraction, and three years of NSSE surveys portray sustainable learning at the institution. Use of these four pragmatic assessment instruments provides a sharper focus for institutional improvement that, when shared with faculty members, can become the impetus for curricular development and faculty growth.
Keywords: NSSE, Information Literacy, Assessment, Bloom's Taxonomy, Institutional Assessment, Ladder of Abstraction, Curricular Mapping, Cognitive Growth, Engagement, Active Learning, Collaborative Learning
Associate Professor of English and Communication & Media Arts; Coordinator of Writing, Division of Arts and Sciences, Neumann College, Aston, PA, USA
Professor, Division of Arts and Sciences, Neumann College, Aston, PA, USA