From: David Hubert
Sent: Friday, February 11, 2011 3:51 PM
To: Academic Administrators; Full Time Faculty
Cc: Academic Administrative Assistants; Tom Zane; Deneece Huftalin; Cynthia Bioteau; Lisa Hubert; Kati Lewis
Subject: Assessment Plan for College-wide Learning Outcomes
Hello Everyone:
Earlier this week, the Quality Higher Education Council re-adopted a slightly revised version of an Assessment Plan for College-wide Outcomes that it had originally adopted in February of 2010. This document is attached.
The Highlights:
- We need to assess College-wide learning outcomes, but we want to do so without adding an additional assessment burden on top of the program-level assessment that we are already doing.
- The best way to meet the need expressed in the first bullet is to...
- Create a schedule so that everyone at the College knows what College-wide learning outcome is being assessed in a given year.
- Coordinate our program assessments so that programs make sure to assess program-level outcomes that correspond to the College-wide learning outcome that is scheduled to be assessed that year.
- As much as is possible, coordinate the rubrics that programs use to examine student work.
- Conduct a meta-analysis of the assessment data that organizes the various program results into common criteria.
- Layer on top of the meta-analysis other relevant data: a holistic look at the Gen Ed ePortfolios of graduating students, CCSSE data, graduating student survey data, etc.
- We will identify 8-10 “pathfinder departments” that will work this May and June to develop their rubrics (from a common starting point, but reflecting disciplinary differences), signature assignments*, and sampling methodologies. Departments wishing to volunteer should contact David Hubert and Tom Zane. (Nursing, I’ve got you on the list already)
- The work of the pathfinder departments will be broadly shared, and this Fall all other departments will use it as a set of models from which they can draw when developing their own plans.
- We are assessing “Effective Communication” in the cycle that begins this summer, and all departments must be ready to collect data no later than the Spring of 2012. The attached plan has two important assessment starting points that most departments will find useful: They are rubrics from the Association of American Colleges and Universities dealing with written and oral communication.
Clear as mud? I am happy to field your questions, but please read the attached plan before firing off an email, because it is quite detailed.
Thank you,
David Hubert
Dean of General & Developmental Education
Salt Lake Community College
801.957.4280
* I’m using “signature assignment” to refer to the student work that will form the basis of the direct assessment the program is doing. It may take the form of the signature assignment that students in Gen Ed courses are asked to put—and reflect upon—in their ePortfolios, or it may not. Those kinds of decisions are the prerogative of individual department chairs and faculty.
_______________________________________________________________________________
February 11, 2011

The SLCC Assessment Office has decided to assess the 1020 language courses during Spring 2011, using the ACTFL OPIc. Currently, tests are available for Arabic, Chinese, French, German, Russian, and Spanish. (ACTFL = American Council on the Teaching of Foreign Languages; OPIc = Oral Proficiency Interview via Computer.) The tests run 20-30 minutes for each student.
Please feel free to access the OPIc Demo Site at your convenience to try out the OPIc: http://info.actfltesting.org/OPIcDemoEN/testeeCD.html (This site does not require a login password.)

SLCC may also do this during more than one semester, so if you are in LANG 1010 right now, you might care to become familiar with the test.
The demo shows you the testing procedure:
- The demo has an avatar who asks you questions in English
- It conducts a preliminary survey of each student in order to generate questions based on one’s interests (However, the demo only has one set of questions, no matter what you put on your survey).
- It shows you how to turn on each question so you can hear it.
- It shows you how to record your answer.
- It gives you a sample test, but it does not rate your recorded answers.
_______________________________________________________________________________
Assessment Plan for College-Wide Learning Outcomes—Adopted February 8, 2011
Program Assessment—Each program is responsible for assessing discipline-specific knowledge, skills, and abilities at least every other academic year. In addition, each program is responsible for assessing college-wide learning outcomes two through five (where applicable to the program outcomes) at least once every four years. Programs are required to do so during the scheduled targeted assessment (see below).
General Studies/General Education Assessment—The ePortfolio Director conducts an annual review of a sample of ePortfolios from graduating General Studies majors. The review will document the extent of evidence students use to address CWLOs #2-5.
Institution-Level: Targeted Assessment Studies—Following a cyclical schedule that allows programs to implement instructional changes in light of assessment results, the ePortfolio Director uses the same data gathered during the program and General Studies/General Education assessments described above to create a detailed study pertaining to the targeted learning outcome. The schedule:
- #2: Effective Communication
- Spring Term 2012—Gather Data
- May 2012—Programs Analyze Data using Rubrics
- Summer 2012—ePortfolio Director Writes Overall Assessment Report
- Fall 2012—Begin to Implement Changes
- Follow-up Data Gathering and Analysis in Spring 2016
- #3: Quantitative Literacy
- Spring Term 2013—Gather Data
- May 2013—Programs Analyze Data using Rubrics
- Summer 2013—ePortfolio Director Writes Overall Assessment Report
- Fall 2013—Begin to Implement Changes
- Follow-up Data Gathering and Analysis in Spring 2017
- #4: Critical Thinking
- Spring 2014—Gather Data
- Summer 2014—Programs Analyze Data using Rubrics
- Summer 2014—ePortfolio Director Writes Overall Assessment Report
- Fall 2014—Begin to Implement Changes
- Follow-up Data Gathering and Analysis in Spring 2018
- #5: Civic Engagement and/or Working Professionally and Constructively with Others
- Spring 2015—Gather Data
- Summer 2015—Programs Analyze Data using Rubrics
- Summer 2015—ePortfolio Director Writes Overall Assessment Report
- Fall 2015—Begin to Implement Changes
- Follow-up Data Gathering and Analysis in Spring 2019
Visualizing Assessment of College-wide Learning Outcomes at SLCC

Institution-Level CWLO Assessment: 4-year cycle of targeted assessment of one outcome per year (effective communication, quantitative literacy, critical thinking, and civic engagement/working professionally with others).

Program Assessment:
- Program outcomes assessed by chair and faculty.
- Annual assessment of a program outcome that speaks directly to the CWLO being targeted by Institution-level Assessment in a given year.
- Direct assessment of student work, which may/may not take the form of work submitted in ePortfolios.
- Informs improvements of faculty pedagogy, program curriculum, and departmental professional development.

Gen Ed Assessment:
- General Education Assessment annually of CWLO (#2-5).
- Sample consists of ePortfolios of graduating General Studies majors.
- Holistic ePortfolio rubric used to assess the evidence in student ePortfolios.
- Conducted by Dean of Gen and Dev Ed with paid faculty reviewers.
- Informs Gen Ed Committee, FTLC and departmental professional development.

Institution-level Assessment Report combines the following:
- Results of Program and Gen Ed Assessment.
- Graduating Student Survey data.
- CCSSE data.
- Informs QHEC initiatives, Executive Cabinet, FTLC and departmental professional development, and Innovation Grant RFP.
Example: Spring 2012

From the Chair and faculty’s perspective:
- Because of the pre-established schedule, the Chair and faculty know that Effective Communication is the targeted CWLO for this year. [The department will likely have other program-specific learning outcomes it wants assessed as well.]
- Assessment Design: The Chair and faculty design assessment(s) that target student performance on Effective Communication that is/are relevant to the program’s own learning outcomes. This assessment must involve direct assessment of student work using a rubric that meets basic standards set by the Quality Higher Education Council.
- Assessment Implementation:
- The department gathers student assignments during Fall of 2011 or the late Spring of 2012. The assignments might be signature assignments that students have posted in their ePortfolios, or the department may decide that some other method of accessing student assignments works better.
- In Spring 2012, the Chair organizes a small group of faculty to apply the rubric to student work, and then the Chair reports the results to the Office of Outcomes Assessment. Such reporting should be complete by the end of May.
From the Dean’s perspective:
- Deans need to see that Departments complete their targeted assessment projects and report their data.
From the ePortfolio Director’s perspective:
- The Director will work with the Outcomes Assessment Coordinator to gather the assessment data pertaining to Effective Communication reported by each Department.
- Each summer the Director will collect the results of Program and Gen Ed Assessment and write an Institution-level Assessment Report identifying strengths and weaknesses with respect to the elements of Effective Communication assessed by programs, to the extent of evidence captured by the annual review of ePortfolio signature assignments in the Gen Ed assessment, and to other indirect data on Effective Communication such as CCSSE and Graduating Student Survey results.
- The Director will forward the Institution-level Assessment Report to the QHEC, the Executive Cabinet, and the general College community before Welcome Back in August.
From the Provost’s perspective:
- The Provost will use the academic administrative structure to see that the loop is closed, starting with the QHE Council and its initiatives, but also including FTLC and departmental professional development opportunities, Innovation Grant RFP process, changes in curricula, and other means.
Timeline for this Coming Year:

Step One (immediately)—Announce and distribute this assessment plan.

Step Two (as soon as possible)—Obtain a definitive statement from the Curriculum Committee regarding the expectations embodied in the levels of performance standards in the rubrics we use (see bottom of page 5 and top of page 6).

Step Three (by beginning of April)—Identify 8-10 “pathfinder” departments that will lead the way in developing assessment plans focusing on Effective Communication. This will include identifying signature assignments, developing a sampling methodology, and writing assessment rubrics starting from a common source.

Step Four (May and June)—Pathfinder departments will work with the Outcomes Assessment Coordinator and/or the ePortfolio Director to complete their responsibilities as sketched in Step Three. This work will serve as a set of models for other departments.

Step Five (Fall Semester)—All other departments will use the work of the pathfinder departments to develop their own signature assignments, sampling methodologies, and assessment rubrics. Pathfinder department chairs and faculty will make themselves available to support the other departments, and the Outcomes Assessment Coordinator and the ePortfolio Director will also work with them.

Step Six (First day of Spring term)—All departments will be ready to collect data during the spring term on Effective Communication and other program outcomes. Departments that are ready early can collect data during the fall term.

Step Seven (by end of Spring term)—All departments will collect data.

Step Eight (by June 1st)—All departments will turn in to the Outcomes Assessment Office the results of their assessment.

Step Nine (Summer)—ePortfolio Director will conduct a meta-analysis of program assessments of Effective Communication and write a report for the Quality Higher Education Council.

Step Ten (Convocation and afterwards)—The assessment report will be broadly shared through various venues. The QHEC will make recommendations for college-wide response, and individual programs will decide how to respond, what interventions (if any) are needed, and when to follow up on the efficacy of those interventions.

Step Eleven (Continuous)—Meanwhile, all departments will need to start the cycle over again for the next scheduled learning outcome.
The Importance of Semi-Standard Rubrics

The integrity of this plan to assess College-Wide Student Learning Outcomes depends heavily on SLCC faculty and administrators across the College coming to a common understanding about the design of the rubrics they will use to assess student work. While we want to avoid a one-size-fits-all approach, we do want to ensure the following:
- When common rubrics can be used across disciplines, they should be used.
- When different rubrics are used across disciplines, they should start from a common source, be constructed according to the same basic standards, and reflect a common understanding regarding the levels of student performance.
To encourage various programs to use common rubrics, the QHEC will publicize common starting points, such as the AAC&U VALUE rubrics for written and oral communication at the end of this document.

With respect to the second bullet above, the QHEC should enforce certain standards with respect to rubrics that programs design for themselves. One standard would be to have the rubric contain four levels of student performance, as indicated in the diagram below and the example following this proposal.[1] With this setup, for example, a department could designate average criterion performance scores of 2.5 to 2.9 as meriting attention and scores below 2.5 as meriting special or immediate attention.
| Effective Comm Criteria | 4 Exceeds Expectations | 3 Meets Expectations | 2 Below Expectations | 1 Well Below Expectations |
| --- | --- | --- | --- | --- |
| Criterion A | | What does a 3 mean? | | What does a 1 mean? |
| Criterion B | | | | |
| Criterion C | | | | |
| (and so on) | | | | |
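As an illustration of the thresholding idea described above the diagram, here is a minimal sketch (in Python) of how a department might flag each criterion from its average score on the 1-4 scale. The criterion names, scores, and exact cutoffs are invented for illustration and reflect one possible reading of the example, not a prescribed rule.

```python
# Minimal sketch of the thresholding example above: average criterion
# scores from 2.5 up to (but not including) 3.0 merit attention, and
# scores below 2.5 merit special or immediate attention. The criterion
# names and averages below are invented for illustration.

def flag(average: float) -> str:
    if average < 2.5:
        return "merits special or immediate attention"
    if average < 3.0:
        return "merits attention"
    return "meets or exceeds expectations"

averages = {"Criterion A": 3.4, "Criterion B": 2.7, "Criterion C": 2.2}
for criterion, score in averages.items():
    print(f"{criterion}: {score:.1f} -> {flag(score)}")
```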
Another standard would be to use as many common criteria within the rubric as is feasible, which will facilitate meta-analysis across disciplines and programs. Departments should build their rubrics from a common starting point, such as the Written Communication Rubric appended to this plan. Departments are encouraged to use the same criteria headings but tailor their specific meanings to better address the manner in which each criterion manifests itself in different programs. Having said that, departments may choose to drop criteria that are not relevant and add others that are important to the assessment of written communication in their programs.
A final standard would be to achieve a consensus regarding the approximate expectations embodied in each level of performance, regardless of the learning outcome being assessed. In the example above, does “meets expectations” mean that the student’s work met the faculty’s expectations of performance on that particular assignment, or does it mean that the student’s work met the faculty’s expectations of what should be expected of a student who is about to graduate from SLCC? That is an important distinction, and we would be much more confident in the validity of our meta-analysis if an authoritative body like the Gen Ed Committee or the Curriculum Committee came to an agreement about this.
As long as these basic standards are respected, departments would be free to design rubrics as they see fit. For example, please consult the written communication rubric on page 8 of this document. Imagine that Departments A and B both see the relevance of having students’ written work adhere to the “genre and disciplinary conventions” embodied in that criterion of the rubric. For Department A, the language in the cells specifying performance levels from “well below expectations” to “exceeds expectations” will focus on the writing conventions that are most pertinent to their program(s), while Department B will focus on the conventions that are most pertinent to their program(s). Those kinds of differences are essential to the validity of program-level assessment, but are largely tangential to the meta-analysis that will look at the ability of students to adhere to “genre and disciplinary conventions” across the College.
The Importance of Standard Reporting Methods

Finally, it is important for departments to report their results in a standard fashion. We suggest that results be reported as follows:
- A copy of the assignment(s) and the rubric used to assess student work.
- A copy of the rubric with the cell descriptors empty, replaced by:
- The percent of students in the sample who placed into each cell.
- The number of students in the sample who placed into each cell.
- The average score for each criterion in the rubric (see the sketch following this list).
- The sampling methodology used by the department.
- The overall number of student assignments in the sample.
- The program’s plans to respond to the results.
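To make the cell counts, percentages, and criterion averages requested above concrete, the fragment below tabulates a set of raw rubric scores. It is only a sketch: the criterion names and sample scores are invented and are not part of the plan or of any department's data.

```python
from collections import Counter

# Invented sample: each student's rubric score (1-4) for each criterion.
scores = {
    "Context and Purpose": [4, 3, 3, 2, 3],
    "Content Development": [3, 3, 2, 2, 4],
    "Claims and Evidence": [2, 3, 1, 2, 3],
}

for criterion, ratings in scores.items():
    counts = Counter(ratings)          # number of students placed in each cell
    n = len(ratings)                   # number of student assignments in the sample
    average = sum(ratings) / n         # average score for the criterion
    print(criterion)
    for level in (4, 3, 2, 1):
        count = counts.get(level, 0)
        print(f"  level {level}: {count} students ({100 * count / n:.0f}%)")
    print(f"  average: {average:.2f}  (n = {n})")
```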
What Will the Overall Assessment Report Look Like?

We envision a straightforward report with the following sections:
- Introduction
- Methods
- Effective Communication: Strengths of SLCC’s Students
- Direct measures
- Indirect measures
- Effective Communication: Weaknesses of SLCC’s Students
- Direct measures
- Indirect measures
- Effective Communication: Unclear Results
- Recommendations
- Follow-up (Appended later when recommended interventions have taken hold and new data have been collected)
How will the meta-analysis be conducted?

It would proceed as follows:
- Start by sorting the dimensions of Effective Communication as defined by SLCC’s programs by looking at the various criteria in the rubrics used across the College to assess student work.
- Group those dimensions together (e.g., all the criteria that speak to dimensions like “supporting claims with evidence” or “mechanics of standard English”).
- Examine the data for each dimension of Effective Communication across the disciplines, coming to a determination as to whether the data indicate strengths, weaknesses, or are too muddy to support a conclusion (a sketch of this grouping and judging follows the list).
- Write a short analysis of each dimension in the strengths and weakness categories, and explain why those dimensions in the “muddy” category are there.
- With respect to weaknesses, make recommendations that might help the College better serve students on the dimensions of Effective Communication.
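The sketch below shows one way the grouping and judging steps above might be organized, assuming each program reports an average score per rubric criterion. Every program name, criterion label, dimension mapping, score, and threshold in it is an invented illustration rather than part of the plan.

```python
# Illustrative sketch only: the program names, criterion labels, dimension
# groupings, scores, and thresholds below are invented assumptions.

program_results = {
    "Program X": {"Claims and Evidence": 2.4, "Control of Syntax and Mechanics": 3.1},
    "Program Y": {"Supporting Claims with Evidence": 2.6, "Mechanics": 2.9},
    "Program Z": {"Use of Evidence": 3.2, "Standard English Mechanics": 2.3},
}

# Steps 1-2: group differently named rubric criteria under common dimensions.
dimension_map = {
    "Claims and Evidence": "supporting claims with evidence",
    "Supporting Claims with Evidence": "supporting claims with evidence",
    "Use of Evidence": "supporting claims with evidence",
    "Control of Syntax and Mechanics": "mechanics of standard English",
    "Mechanics": "mechanics of standard English",
    "Standard English Mechanics": "mechanics of standard English",
}

grouped = {}
for criteria in program_results.values():
    for criterion, score in criteria.items():
        grouped.setdefault(dimension_map[criterion], []).append(score)

# Step 3: judge each dimension as a strength, a weakness, or too muddy to call.
for dimension, dim_scores in grouped.items():
    average = sum(dim_scores) / len(dim_scores)
    spread = max(dim_scores) - min(dim_scores)
    if spread > 0.8:                 # programs disagree widely
        verdict = "muddy"
    elif average >= 3.0:
        verdict = "strength"
    else:
        verdict = "weakness"
    print(f"{dimension}: mean {average:.2f}, spread {spread:.2f} -> {verdict}")
```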
Written Communication Rubric[2]

Levels of performance: 4 = Exceeds Expectations; 3 = Meets Expectations; 2 = Below Expectations; 1 = Well Below Expectations.

Context and Purpose for Writing (includes considerations of audience, purpose, and the circumstances surrounding the writing task(s)):
- 4 Exceeds Expectations: Demonstrates a thorough understanding of context, audience, and purpose that is responsive to the assigned task(s) and focuses all elements of the work.
- 3 Meets Expectations: Demonstrates adequate consideration of context, audience, and purpose and a clear focus on the assigned task(s) (e.g., the task aligns with audience, purpose, and context).
- 2 Below Expectations: Demonstrates awareness of context, audience, purpose, and the assigned task(s) (e.g., begins to show awareness of audience’s perceptions and assumptions).
- 1 Well Below Expectations: Demonstrates minimal attention to context, audience, purpose, and to the assigned task(s) (e.g., expectation of instructor or self as audience).

Content Development:
- 4 Exceeds Expectations: Uses appropriate, relevant, and compelling content to illustrate mastery of the subject, conveying the writer’s understanding, and shaping the whole work.
- 3 Meets Expectations: Uses appropriate, relevant, and compelling content to explore ideas within the context of the discipline and shape the whole work.
- 2 Below Expectations: Uses appropriate and relevant content to develop and explore ideas through most of the work.
- 1 Well Below Expectations: Uses appropriate and relevant content to develop simple ideas in some parts of the work.

Genre and Disciplinary Conventions (formal and informal rules inherent in the expectations for writing in particular forms and/or academic fields):
- 4 Exceeds Expectations: Demonstrates detailed attention to and successful execution of a wide range of conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, formatting, and stylistic choices.
- 3 Meets Expectations: Demonstrates consistent use of important conventions particular to a specific discipline and/or writing task(s), including organization, content, presentation, and stylistic choices.
- 2 Below Expectations: Follows expectations appropriate to a specific discipline and/or writing task(s) for basic organization, content, and presentation.
- 1 Well Below Expectations: Attempts to use a consistent system for basic organization and presentation.

Claims and Evidence:
- 4 Exceeds Expectations: Makes definite claims that are always supported by credible evidence and skillful argumentation.
- 3 Meets Expectations: Makes concrete claims that are usually supported by credible evidence and solid argumentation.
- 2 Below Expectations: Makes claims that are sometimes supported by evidence and argumentation.
- 1 Well Below Expectations: Makes claims that are often unsupported.

Control of Syntax and Mechanics:
- 4 Exceeds Expectations: Uses graceful language that skillfully communicates meaning to readers with clarity and fluency, and is virtually error-free.
- 3 Meets Expectations: Uses straightforward language that generally conveys meaning to readers. The language in the work has few errors.
- 2 Below Expectations: Uses language that generally conveys meaning to readers with clarity, although writing may include some errors.
- 1 Well Below Expectations: Uses language that sometimes impedes meaning because of errors in usage.
AAC&U VALUE Rubric for Oral Communication

Levels of performance: 4 = Exceeds Expectations; 3 = Meets Expectations; 2 = Below Expectations; 1 = Well Below Expectations.

Organization:
- 4 Exceeds Expectations: Organizational pattern (specific introduction and conclusion, sequenced material within the body, and transitions) is clearly and consistently observable and is skillful and makes the content of the presentation cohesive.
- 3 Meets Expectations: Organizational pattern (specific introduction and conclusion, sequenced material within the body, and transitions) is clearly and consistently observable within the presentation.
- 2 Below Expectations: Organizational pattern (specific introduction and conclusion, sequenced material within the body, and transitions) is intermittently observable within the presentation.
- 1 Well Below Expectations: Organizational pattern (specific introduction and conclusion, sequenced material within the body, and transitions) is not observable within the presentation.

Language:
- 4 Exceeds Expectations: Language choices are imaginative, memorable, and compelling, and enhance the effectiveness of the presentation. Language in presentation is appropriate to audience.
- 3 Meets Expectations: Language choices are thoughtful and generally support the effectiveness of the presentation. Language in presentation is appropriate to audience.
- 2 Below Expectations: Language choices are mundane and commonplace and partially support the effectiveness of the presentation. Language in presentation is appropriate to audience.
- 1 Well Below Expectations: Language choices are unclear and minimally support the effectiveness of the presentation. Language in presentation is not appropriate to audience.

Delivery:
- 4 Exceeds Expectations: Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation compelling, and speaker appears polished and confident.
- 3 Meets Expectations: Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation interesting, and speaker appears comfortable.
- 2 Below Expectations: Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) make the presentation understandable, and speaker appears tentative.
- 1 Well Below Expectations: Delivery techniques (posture, gesture, eye contact, and vocal expressiveness) detract from the understandability of the presentation, and speaker appears uncomfortable.

Supporting Material:
- 4 Exceeds Expectations: A variety of types of supporting materials (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) make appropriate reference to information or analysis that significantly supports the presentation or establishes the presenter's credibility/authority on the topic.
- 3 Meets Expectations: Supporting materials (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) make appropriate reference to information or analysis that generally supports the presentation or establishes the presenter's credibility/authority on the topic.
- 2 Below Expectations: Supporting materials (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) make appropriate reference to information or analysis that partially supports the presentation or establishes the presenter's credibility/authority on the topic.
- 1 Well Below Expectations: Insufficient supporting materials (explanations, examples, illustrations, statistics, analogies, quotations from relevant authorities) make reference to information or analysis that minimally supports the presentation or establishes the presenter's credibility/authority on the topic.

Central Message:
- 4 Exceeds Expectations: Central message is compelling (precisely stated, appropriately repeated, memorable, and strongly supported).
- 3 Meets Expectations: Central message is clear and consistent with the supporting material.
- 2 Below Expectations: Central message is basically understandable but is not often repeated and is not memorable.
- 1 Well Below Expectations: Central message can be deduced, but is not explicitly stated in the presentation.
[1] Why four? Many existing rubrics have four levels of student performance, including notably the AAC&U’s VALUE rubrics that are targeted at many of the CWLOs present at SLCC.

[2] This is a modified version of the VALUE Rubric for Written Communication, published by the Association of American Colleges and Universities.
_________________________________________________________________________________
Humanities Faculty Discussion, Tuesday, April 26, 2011

If the assessment doesn’t matter to me and my students, don’t do it.

This year, identify what we faculty are already doing in assessment.

Humanities has the most difficult time converting to measurable assessment because the parameters are so varied.

His daughter is learning Navajo and Shoshone.
2. b. Tom’s pet peeve: teaching versus learning; learning is better.
2. c. Use the listed parameters.
2. d. Clarity for stakeholders: make sure ‘your mom’ can understand the Outcomes.
3. Big bullets; rule of thumb: no more than five words after each bullet. Do this fast; this is not necessarily the table of contents of the textbook.
5. a. KNOW:
5. b. KNOW HOW: Know the steps of analyzing mediums of art.
5. c. SHOW HOW:
5. d. DO: The student can analyze four different mediums of art.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
Jennifer Bauman’s part

Become more effective at writing our own LOs and incorporating them into our own rubric / assessment.
- Rubrics help students understand what quality means and how to achieve it.
- Make your standards explicit and clear if you give them out in advance.

Students learn how to learn.

- Makes the instructor’s mind and values transparent to a student.
- Have your students help you build the assessment rubric.
- A rubric is a form of scholarship and also a teaching tool in which you expose yourself as a scholar: you are qualified to judge work in this area.
Includes one’s personal teaching philosophy and teaching methods; establishes standards and a level of quality.

One’s syllabus should also be an artifact of scholarship.