Assess Evidence

Tracking, Pre-Post Tests, Surveys and Focus Groups

Administrative offices come in many forms, and will therefore explore many different types of evidence to determine whether their goals are being met. It is possible, however, to group the types of evidence, and their associated assessment tools, by the kind of question you are asking.

Type of Question and Appropriate Tools

  • How many or how much of something (time, people, services): track totals, averages, and percentages
  • Whether a population has learned something (knowledge, skills): conduct pre-post tests
  • How a population feels about or experiences something: conduct surveys, interviews, and focus groups

Data tracking is a way to keep an eye on how your program is running by recording measures at set points in time and comparing those measures over time.

For example, a tutoring program might want to track the number of students coming in each day (output), the number of times an individual student comes in (output), and any changes in that student's GPA (impact). At the end of the academic year, the program can report the total number of students who came in, the average number of visits per student, and the percentage of students whose GPA increased over the academic year.

To do this kind of assessment, it is best to use some sort of software that allows you to easily calculate these types of summary measures. Most often, people use a spreadsheet program (e.g. Microsoft Excel), but you can also use more sophisticated database software that lets you pull customizable reports based on the information you are interested in (e.g. Microsoft Access).
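As a minimal sketch, the same summary measures can also be calculated with a short script. The example below assumes a hypothetical CSV export of visit records with made-up column names (student_id, gpa_start, gpa_end); a spreadsheet can produce the same figures with built-in formulas.

```python
import csv
from collections import Counter

# Hypothetical export of tutoring-center visit records,
# one row per visit, with columns: student_id, visit_date, gpa_start, gpa_end
with open("tutoring_visits.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Number of visits recorded for each student
visits_per_student = Counter(row["student_id"] for row in rows)

total_students = len(visits_per_student)
average_visits = sum(visits_per_student.values()) / total_students

# Percentage of students whose GPA increased over the year
gpa_by_student = {
    row["student_id"]: (float(row["gpa_start"]), float(row["gpa_end"]))
    for row in rows
}
improved = sum(1 for start, end in gpa_by_student.values() if end > start)
pct_improved = 100 * improved / total_students

print(f"Total students served: {total_students}")
print(f"Average visits per student: {average_visits:.1f}")
print(f"Percent with GPA increase: {pct_improved:.0f}%")
```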

Pre-post tests are a way to measure whether your population knows more about something, or is better able to do something, as a result of participating in your program. In many ways, this type of assessment is similar to the student learning assessment found on the academic side, but instead of measuring the effectiveness of an academic course or curriculum, you're measuring the effectiveness of a training program.

Pre-post tests can take many forms, depending on what you are looking to measure. However, there are two critical elements to remember:

  1. The questions on the pre-test should match exactly the questions on the post-test, and
  2. Each test should be labeled with an identifier unique to each participant.

So, for example, if you are assessing a staff training program and you ask participants three questions about compliance regulations when they start the training, you should ask the same three questions when the training ends. And if you ask for participant identifiers, you will not only ensure that the same set of people took both tests, but will also be able to calculate fancier measures of programmatic impact, such as paired t-tests.
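As a rough illustration, the sketch below runs a paired t-test on hypothetical pre- and post-test scores that have already been matched by participant identifier (the scores are invented for the example, and the scipy library does the statistical work):

```python
from scipy import stats

# Hypothetical scores on the three compliance questions, matched by
# participant identifier (same participant at the same position in both lists).
pre_scores  = [1, 2, 1, 0, 2, 1, 3, 2]   # correct answers out of 3, before training
post_scores = [2, 3, 2, 2, 3, 2, 3, 3]   # correct answers out of 3, after training

# Paired t-test: did scores change significantly for the same set of people?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)

mean_gain = sum(post_scores) / len(post_scores) - sum(pre_scores) / len(pre_scores)
print(f"Mean gain: {mean_gain:.2f} questions")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```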

Surveys, interviews and focus groups are used to gather information that is harder to quantify—things like feelings, experiences, and perceptions. All three are built upon asking people questions about themselves. Though it's not a hard and fast rule, surveys generally occur on paper or online and tend to have more close-ended questions (pick among choices), while interviews generally occur face-to-face or on the phone and tend to have more open-ended questions (no defined choices). Focus groups are like group interviews, in which people can respond to the facilitator or to each other.

The more open-ended your questions are, the richer your data will be, offering new perspectives that might not have surfaced if you had given people a pre-defined set of choices. However, open-ended data will require additional analysis, such as qualitative coding, to say anything definitive about your question. Most often, people use survey software to create a survey (e.g. Qualtrics, available for free to the Hunter community).
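As a brief sketch of that difference, the example below tallies a close-ended item directly and then uses a crude keyword codebook to stand in for the qualitative coding a reader would actually do on open-ended answers. All item names, responses, and codes here are hypothetical.

```python
from collections import Counter

# Close-ended item: tallying the pre-defined choices is straightforward.
satisfaction = ["Very satisfied", "Satisfied", "Satisfied", "Neutral", "Very satisfied"]
print(Counter(satisfaction))

# Open-ended item: responses must first be coded into themes.
# A simple keyword match stands in for a real coding pass done by a person.
responses = [
    "The tutor helped me understand the assignment",
    "Hard to find an open appointment slot",
    "Scheduling was confusing but the staff were friendly",
]
codebook = {
    "instruction": ["tutor", "understand", "explain"],
    "scheduling": ["appointment", "schedule", "scheduling", "slot"],
    "staff": ["staff", "friendly"],
}
theme_counts = Counter(
    theme
    for text in responses
    for theme, keywords in codebook.items()
    if any(word in text.lower() for word in keywords)
)
print(theme_counts)
```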

Remember! Matching your tool to your question is very important. If you do not make this match, you will spend a lot of time gathering evidence that does not answer any of the questions you wanted it to answer.

Hunter's Technology Resource Center is a great place to get started with any of the software described above.

Continue to Close the Loop
