Hello, My Name Is

I just got back on campus after spending time at the Association for Institutional Research (AIR) Forum in Denver. I had the opportunity to hear lots of great sessions from my colleagues and learn more about their institutions. You might expect me to blog this week about some of those great sessions, but you would be wrong. When we checked in at the forum, we were given a preprinted name badge and a group of stickers to adhere to a blank space under our name. The point of the stickers was for us to “personalize” our name badges to better reflect the diversity of experience present at the conference. I chose ones like “I <3 statistics,” “Learning Outcomes,” and the ever-popular “Accreditation.” Some of my colleagues chose even more interesting stickers, like “Data Diva.” I came really close to selecting that one but didn’t think I had quite earned the title of “Diva.” Not yet… Continue reading Hello, My Name Is

Serving Two Masters

There is a well-known Bible verse in the Book of Matthew that says, “No one can serve two masters.” I was thinking about that verse recently while talking about assessment with some colleagues. We were discussing a department’s assessment plan for the coming academic year, and I was repeatedly asked if the accrediting body “would be okay with it.” Assessment has the unfortunate role of trying to serve two masters. The first and primary role of assessment is to measure student learning in the classroom, program, and college, and to use this information to inform curricular changes with the goal of increasing student learning. The second master that has begun creeping into my conversations with faculty is accountability. Will this satisfy our accrediting body? Will the state board think this is okay? Does this meet the requirements? It is hard to meet the needs of two masters whose expectations and requirements are so fundamentally different. So how does academia address… Continue reading Serving Two Masters

Assessment Stories

“Storytelling is about connecting to other people and helping people to see what you see.” – Michael Margolis. I ran across this quote recently as I was looking for interesting quotes to put into a presentation I was working on. It didn’t fit my presentation, but it did make me think about telling a story. This past January, the professional development week that precedes the semester at JCCC included a panel of faculty who told their stories of assessment. The panelists were wonderfully candid in sharing insights about their experiences, and I was thrilled to hear all the different stories of how assessment had evolved in their disciplines. Unfortunately, not many people attended the session. I thought the stories the faculty told about their respective assessment journeys were powerful and compelling. So, living in a digital age, I asked the faculty from the panel to come to the studio and tell their stories again – to the camera. Under… Continue reading Assessment Stories

OOA Over the Summer…

Over the summer, we were still busy in the Office of Outcomes Assessment… The OOA hosted a faculty workshop with Dr. Tom Angelo on “CATs and COLTs: Classroom Assessment Techniques and Collaborative Learning Techniques.” Dr. Angelo has consulted on teaching, assessment, and learning improvement in 17 countries and throughout the United States for more than 60 higher education associations/systems and more than 250 postsecondary institutions. Internationally, Tom has been awarded fellowships from the Fulbright Program (Italy), the Calouste Gulbenkian Foundation (Portugal), the Carrick Institute for Learning and Teaching in Higher Education (Australia), and the Higher Education Research and Development Society of Australasia. He has authored or co-authored five books and more than thirty-five articles and chapters. His best-known publication is Classroom Assessment Techniques: A Handbook for College Teachers, 2nd Edition, with more than 100,000 copies in print. The truly great news is that he is coming back to JCCC in the Spring as the keynote speaker for the 5th Annual… Continue reading OOA Over the Summer…

Excellence in Outcomes Assessment Award Winners

Congratulations to our winners: Patty Titus, Jane Zaccardi, David Luoma, Ginny Radom, and Connie Reischman. “To ensure that our students are prepared to meet the challenges of the future, we are committed to continuous program improvement.” This attitude is the defining feature of the team that will receive the Excellence in Outcomes Assessment award for 2013-14. Their project looked at the students’ ability to collaborate respectfully as part of an emphasis on professionalism for their practical nursing students. Key components of the project included curriculum mapping to identify objectives specific to their project, a qualitative survey, and a faculty-designed rubric to evaluate aspects of professionalism during clinical practicum and simulation experiences. The most crucial component of the assessment work involved the changes that the project brought to the practical nursing curriculum. Some of the changes made based on the data include behavioral objectives regarding professionalism that faculty incorporated into simulation experiences for the students. Curriculum changes… Continue reading Excellence in Outcomes Assessment Award Winners

Reporting Assessment Data

Reporting & Using Assessment Results

Assessment results are meant to improve teaching and learning as well as inform planning and decision making. Results of assessment activities can highlight successes such as: better alignment of the curriculum with desired outcomes; creation of useful rubrics; development of explicit standards and corresponding samples of student work; and evidence that students are meeting or exceeding learning expectations.

Elements to include in an assessment report, with things to consider for each:
- The outcome(s) that was addressed: Is this a department/course-level or a general education assessment?
- The type of data that was collected and the timeframe for collection: Was this a pilot or a department-wide assessment? Was the data collected in the Fall or Spring semester, or over the course of the academic year? If sampling was used, how was the sample collected?
- Who submitted data: How many students were used? How were they selected?
- How the student work was evaluated: Describe the scoring mechanism (rubric, dichotomous responses, scaled responses). Continue reading Reporting Assessment Data
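If your department keeps these records electronically, a simple structured record can help make sure no element is skipped. The sketch below is my own illustration in Python, not an official OOA template; every field name and placeholder value is hypothetical.

```python
# Hypothetical sketch of an assessment report record that mirrors the
# elements listed above. Field names and values are placeholders only.
assessment_report = {
    "outcome": "<learning outcome that was addressed>",
    "level": "course",             # department/course-level vs. general education
    "scope": "pilot",              # pilot vs. department-wide
    "timeframe": "Fall semester",  # Fall, Spring, or full academic year
    "sampling": "<how the sample was collected, if sampling was used>",
    "students_included": 0,        # how many students were used
    "selection_method": "<how students were selected>",
    "evaluation": "rubric",        # rubric, dichotomous, or scaled responses
}

# A quick completeness check before the report is submitted:
# flag anything left empty, zero, or still holding a placeholder.
missing = [
    key for key, value in assessment_report.items()
    if value in ("", None, 0) or (isinstance(value, str) and value.startswith("<"))
]
print("Elements still to fill in:", missing or "none")
```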

Does This Rubric Make My Assessment Look Big?

In my office, rubrics are frequently a topic of conversation. Usually I am meeting with a faculty member or a department chair about an assessment planned for the coming semester, and the conversation usually comes around to needing a rubric to evaluate a student performance. So in this article I am sharing the basics about rubrics: something to get you started if you are currently a non-rubric user, or to help you improve the rubrics you currently use. Rubrics, at their most basic, are a tool used by faculty to help in the task of assessing student learning. Rubrics can be holistic or analytic, general or task-specific. Rubrics can assist faculty in assigning grades, or can be used for collecting assessment data. Holistic vs. Analytic Rubrics: Holistic rubrics provide a single score based on the overall performance of a student on a specific task or assignment. Advantages: this type of rubric… Continue reading Does This Rubric Make My Assessment Look Big?
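To make the holistic versus analytic distinction concrete, here is a small sketch of my own; the criteria, point scales, and scores are invented for illustration and are not drawn from any JCCC rubric.

```python
# Illustration only: criteria names, scales, and scores are hypothetical.

# Analytic rubric: each criterion is scored separately (here, 1-4),
# which shows students and faculty where the strengths and gaps are.
analytic_scores = {
    "thesis": 4,
    "evidence": 3,
    "organization": 4,
    "mechanics": 2,
}

def analytic_total(scores):
    """Combine per-criterion scores into an overall score."""
    return sum(scores.values())

# Holistic rubric: a single overall judgment on one scale
# (e.g., 1 = beginning ... 4 = exemplary).
holistic_score = 3

print("Analytic detail:", analytic_scores)
print("Analytic total:", analytic_total(analytic_scores), "out of 16")
print("Holistic score:", holistic_score, "out of 4")
```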