Yale Center for Teaching and Learning

Suggestions for Designing an Inviting Assessment Approach

There is not always one “right” method for answering a given question. Greater awareness of, and respect for, your participants’ time and perspectives will help you select the best option for your work. The concepts below can guide your choice. Details on specific methods, such as survey design, interview protocols, and observation, can be found [here], or reach out to us directly [contact]. Regardless of the specific method you select, consider the following suggestions.

Match the “tone” of your assessment to the interactions of your program. Does the method you are considering reflect the types of interactions you have had, and hope to have, with participants? 

Why does this matter? The choices you make about assessing your program are part of how participants experience it. Designed thoughtfully, sharing input can feel like an invitation to participants rather than an obligation. The design communicates the project’s values and goals. It is one way you show participants what “matters” to you or the program.

Example. A faculty member was planning to send a 20-item Likert survey at the end of a discussion-based student training experience. Upon reflection, she felt this method might be discordant with the training experience to that point. She decided instead to ask the students four brief questions: “What was a highlight of the experience? What was the most valuable knowledge or skill you learned during the training? What was most challenging about the training? What would you recommend we do differently in future trainings like these?”

Have a data storage and analysis plan. Do you know what form the data will take when you receive them? Do you know how you will organize, analyze, interpret, and report the data?

Why does this matter? Collecting data without a clear plan for cleaning and analyzing them can leave you with the dreaded “Data Pile.” Curious investigators can produce a ton of data and then be stuck with a complex file or set of files. Knowing how your data will be exported and stored will confirm that you are able to analyze and interpret them according to your research questions. Subtle changes in how data are output can have unexpected impacts on data interpretation. Test out how your data will be exported and stored, and avoid the “collect it all and sort it out later” approach! If you cannot describe how you will analyze a given item or question, consider whether the item is needed or how it ties back to one of your original goals.
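One way to pressure-test this before launch is to pair a pilot export with a written analysis plan. The sketch below, in Python with pandas, is only illustrative: the file name survey_pilot.csv and the item names are hypothetical placeholders for your own instrument. It flags exported columns with no planned analysis and planned items missing from the export.

```python
# A minimal sketch of a pre-launch "analysis plan" check; the file
# name and column names below are hypothetical placeholders.
import pandas as pd

# Map each survey item to the analysis it feeds; items with no entry
# here are candidates for removal before launch.
analysis_plan = {
    "q1_satisfaction": "mean + distribution, reported by cohort",
    "q2_highlight": "open coding into themes",
    "q3_recommend": "top-box percentage",
}

pilot = pd.read_csv("survey_pilot.csv")

# Flag exported columns with no planned analysis, and planned items
# missing from the export (e.g., renamed by the survey tool).
unplanned = [c for c in pilot.columns if c not in analysis_plan]
missing = [c for c in analysis_plan if c not in pilot.columns]
print("Columns with no analysis plan:", unplanned)
print("Planned items missing from export:", missing)
print(pilot.dtypes)  # confirm numeric items actually exported as numbers
```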

Example. A staff member sent an online survey to multiple departments asking about participation in local volunteer activities. Using “skip logic” in the survey, people were able to skip items that were not relevant to their experiences. While this made answering the survey easier for participants, it left the staff member with hundreds of mostly blank columns that needed to be merged in complex ways to answer the research question. The approach may have been warranted, but this unanticipated data processing step pushed the project back by a couple of weeks.
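Skip logic often splits one underlying question into a separate column per branch. The sketch below shows one way to coalesce such branched columns back into a single analysis column; the column names are hypothetical placeholders, and it assumes each respondent saw exactly one branch. Running a merge like this on a pilot export surfaces the extra processing step while the survey can still be restructured.

```python
# A minimal sketch of coalescing branched ("skip-logic") columns into
# one analysis column; the data and column names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "respondent": [1, 2, 3],
    # The same underlying question, split by branch in the export:
    "volunteer_hours_staff": [5.0, None, None],
    "volunteer_hours_faculty": [None, 2.0, None],
    "volunteer_hours_student": [None, None, 8.0],
})

branch_cols = [
    "volunteer_hours_staff",
    "volunteer_hours_faculty",
    "volunteer_hours_student",
]

# Each respondent answered exactly one branch, so backfilling across
# the branch columns and taking the first yields one usable column.
df["volunteer_hours"] = df[branch_cols].bfill(axis=1).iloc[:, 0]
print(df[["respondent", "volunteer_hours"]])
```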

Specify whether the data will be confidential or anonymous. Do participants know whether identifying data will be collected and who will see it?

Why does this matter? Clarifying whether any identifying data will be collected, who will have access to them, and how they will be reported respects participants’ ability to consent and builds the trust they need to share answers candidly. As terms, confidential and anonymous are meaningfully different. Confidential means that some information being collected could potentially be linked back to a specific individual and that this information will be protected by the researchers. Anonymous means that no information will be collected that could be traced back to a specific individual. In both cases, summarizing comments and data points before sharing them can help protect people from being inadvertently (or even erroneously) identified.

Example. A director of undergraduate studies (DUS) sends graduating students a survey about the beneficial and challenging experiences they have had during their time in the department. It is not clear who will have access to the data or whether identifying information is collected through the survey. Some students are hesitant to provide candid feedback about particular experiences, knowing that the DUS may see their results with their names attached. This lowers response rates and limits generalization from the responses. Consequently, the department’s ability to identify and respond to areas of challenge is limited.

Demographic information. Only ask for personal characteristics such as race, ethnicity, gender, or citizenship status when you have a clear need for these data to answer a specific question and/or when defining the sampled population is necessary.

Why does this matter? Given the small sample sizes and heterogeneity of many higher education studies, the more defining personal characteristics you collect, the higher the likelihood that a specific person can be identified in the data. Individuals from underrepresented groups in a small sample face a greater risk of identification. These questions can also convey to participants that you are expecting differences based on these factors, which can activate stereotype-threat responses under some conditions.

Example. Faculty in a medical school department were surveyed about their mentorship. Although the survey did not collect names, email addresses, or IP addresses, it did ask for gender, race, and age bracket. There were only three female-identifying Black individuals in the department and only one in the younger age bracket. Without intending to, the staff member who sent out the survey can now identify a particular person’s responses.
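One way to catch this risk before reporting is to count respondents in every combination of demographic categories and suppress small cells. A minimal sketch in Python with pandas follows; the file name mentorship_responses.csv, the column names, and the threshold of five are all hypothetical placeholders, not a fixed standard.

```python
# A minimal sketch of a "small cell" check before reporting; the file
# name, columns, and threshold are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("mentorship_responses.csv")

# Count respondents in every combination of demographic categories.
cell_sizes = (
    df.groupby(["gender", "race", "age_bracket"])
      .size()
      .reset_index(name="n")
)

# Any combination below a minimum size risks identifying an individual;
# suppress or aggregate those cells before sharing results.
MIN_CELL = 5
print(cell_sizes[cell_sizes["n"] < MIN_CELL])
```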

Balance data needs with participants’ time. Where possible, respect participants by asking for only what is necessary and honor their time by using their data responsibly and responsively. 

Why does this matter? Requesting data from your participants often means requesting their time. Groups within higher education are often sampling from the same community of people. Being thoughtful about whether and when to collect data, from whom, and how to share findings back can reduce the burden on these communities and improve perceptions of educational assessment within higher education.

Example. A tutoring group wants to survey students who have used its service over the past semester. Since the group wants to collect data before the end of the semester, it decides to launch a 15-minute survey right as classes end. This is the same time when students are being asked to complete course evaluations and end-of-semester surveys, and to pack up to return home. Response rates are low, which limits interpretation of the program’s impact.

Be responsive. Share findings and intended actions with participants and, when possible, let them know how their data are being used.  

Why does this matter? Collecting data relies on members of a community investing their effort and perspective. To keep this relationship healthy and receive the most robust responses, show how you are responding to findings whenever possible. Participants express frustration when they invest time providing input and never hear how the data were used. This frustration increases when similar data are requested from the same community members without any interim communication of findings or responses.

Example. A department wants to survey first-year graduate students about how their first year went. Students are asked to complete a 15-minute reflection form sharing the high and low points of the year. In addition to positive responses, students share a few consistent challenges they would like to see addressed. The department meets to discuss the feedback but does not share the findings, or the actions it plans to take, with the students. Students are unsure how their input was used and may be less likely to complete future requests.

Want more information? Reach out!

To learn more about how the EPA team can support your assessment, please contact assessment.poorvu@yale.edu.