Progress Report from Summer 2010
MEMO
TO: Julie Peacock, VPAA
FROM: Mary Woestman, Co-chair Student Learning Assessment Committee
DATE: August 10, 2010
RE: Progress Report covering May 20 to August 10, 2010
A. Visit to Mohawk Valley Community College
On June 22, Kelli and I travelled to MVCC to visit Norayne Rosero, their Assessment Liaison, and Mark Radlowski, Director of Institutional Research and Analysis. We sent them the following list of questions before the visit so that Norayne would know what topics were most important to us:
1. What structure are you using for student assessment and how did you decide on a structure?
2. What are the strengths and weaknesses of your current process?
3. In hindsight, what did you miss or what mistakes did you make early on that we should avoid?
4. How did you go about getting buy-in from faculty and other stakeholders/constituents? How long did that take?
5. How are your feedback loops designed?
6. Are you using a database for documentation?
7. How do you share data and the resulting analysis with constituencies?
8. How are assessment of student learning and institutional assessment tied together?
MVCC’s Institutional Effectiveness Council (IEC) has the primary responsibility for assessment. The council is co-chaired by the Director of Institutional Research and the Assessment Liaison (who is a faculty member), and includes a faculty member from each division (five “Learning Centers”), a student, and four members appointed by the President and other administrators at the VP level. Historically, those appointments have included the VP for Administrative Services, the VP for Student Services, the VP for Instruction, the President’s executive assistant, and the Registrar.
The Student Learning Assessment Subcommittee of the IEC is also headed by the Assessment Liaison and includes six members of the IEC as well as five other faculty members hand-picked by department chairs or Center Deans. The Council structure was designed to facilitate communication within the assessment group itself and also across campus. Two weaknesses of the structure have become apparent: there is not enough student involvement, and the whole enterprise is too dependent on Mark and Norayne. That dependence undercuts their credibility, to the point that Norayne sometimes feels the need to bring in outside experts to validate their decisions. In theory, the Assessment Liaison is elected each year from “within the Center,” but Norayne has been the liaison since its inception, with no term limits. It sounds to me as though she has allowed the campus to saddle her with the burden of constantly cajoling people (nagging? begging?) into doing the necessary tasks of assessment.
I think we could avoid this pitfall by having bylaws similar to those of our Curriculum Committee, with term limits and rotating participation. We also need close coordination with the Professional Development Coordinator and the TRC.
The IEC has chosen the core indicators for the institution. Performance indicators are required at the department, division and institutional levels. There is a standard template for data collection, and the data is available on the campus O: drive. All of this is under the IR Director. His department consists of two full-time assistants and one part-time assistant. The two full-time staff members are responsible for database maintenance; one handles evaluation coordination for things like Perkins grants and IE, and the other handles finance and records management. The part-time person does all of the surveys for the campus.
As for faculty buy-in, it is still an ongoing process. Norayne does most of the work of educating and working with faculty. She runs a New Faculty Institute that meets monthly, and she creates reminder handouts for faculty-wide distribution (e.g., Top Ten Questions about Assessment, Course Outline and Syllabus Design, MVCC Assessment Fast Facts, General Education Quick Reference Guide). There is also a website to “provide historical and current information on institutional effectiveness activities at MVCC.” The Activities Timelines are available on the site (from 2005 to 2008, anyway), as well as an Assessment Handbook and various internal and external links. Norayne provided us with handouts including:
- template for units to report their assessment activities, results and actions taken
- program review process
- information on rubrics and holistic grading
- Information Literacy/Management Competencies and assessment instrument
- 2010-2011 Strategic Plan Initiatives
- Institutional Effectiveness and Assessment Plan
- A cross-comparison of Collegewide competencies, MVCC Gen Ed competencies and SUNY GERs
I think one of the most important things Norayne said to us was that everyone at the college needs help understanding why assessment is necessary. The standard answer is to show accountability and to work toward improvement. But the answer that seems most meaningful to faculty is that assessment results in deep student learning.
B. Thoughts from summer reading – Themes for beginning
Besides the visit to MVCC, Kelli and I have both been reading extensively (bibliography attached). As we start the new semester, I think there are a couple of themes that we need to keep in mind and use to begin engaging the community in working on assessment.

One is the question of why we do it. How we answer this will drive what our process becomes. Besides the tension between accountability and improvement, there is a tension between a scientific orientation toward specific activities and demonstrated behaviors and a more developmental, descriptive process in which assessment not only shows what has been learned but also helps determine the path of future exploration, a path that may not be completely determined in advance. There is yet another tension between assessment used to evaluate individual performance and assessment of aggregate results. The best resolution of these polarities isn’t necessarily a hybrid: each discipline and each facet of student learning will need the considered judgment of the appropriate professionals to determine how to assess performance.

As we attempt to design a process that meets the needs of all levels of the institution, we will of necessity work from the individual point of contact (classroom, office, gym, club, LAC, etc.) outward to the department, division and institutional levels. The good news is that many of us are already doing valid and helpful assessments that can continue to be used. They will be our building blocks if we can bring them to the surface, make them visible, and connect them together.
Another important theme is the idea of assessment becoming an integral part of all we do. Assessment is not meant to be an “evaluative stance that emphasizes checking up on results… [but] an emphasis on assuming active and collective responsibility for fostering student attainment” (Banta, p. 24). Our assessment structure needs to be a permanent fixture of planning and governance at the college. Thus all of our planning, from strategic initiatives and budget considerations to course policies and syllabi, needs to have assessment as an integral part of the plan. As a community, we need to understand that this is not something being imposed on us from outside. It is imposed on us by our own values and sense of integrity. We need to be cognizant of the effects of our efforts, and fix whatever isn’t working.
Newer and younger faculty, especially those who have taken recent university courses in education, could be a great source of energy and may already be inculcated with the culture of assessment. We should try to find them.
Bibliography
Suskie, Linda. (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). San Francisco, CA: Jossey-Bass.
Walvoord, Barbara E. (2010). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education (2nd ed.). San Francisco, CA: Jossey-Bass.
Banta, Trudy W., et al. (2002). Building a Scholarship of Assessment. San Francisco, CA: Jossey-Bass.
Middle States Commission on Higher Education. (2006). Characteristics of Excellence in Higher Education: Eligibility Requirements and Standards for Accreditation (12th ed.). Philadelphia, PA: Middle States Commission on Higher Education.
Middle States Commission on Higher Education. (2005). Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Philadelphia, PA: Middle States Commission on Higher Education.