Designing a feedback survey
This project applies content design principles to survey design, focusing on clear language, logical flow, and user-centered presentation to improve the completion rates and data quality of a feedback survey for the European Union Center.
Background
Challenges
Open-ended questions
I worked on this project for the European Union Center, an academic and research center affiliated with the University of Illinois. The center offers courses and workshops and hosts several events throughout the academic year to promote understanding of the European Union, both on campus and in the local community. The European Union Center aimed to create and distribute a survey to get feedback on the events organized throughout the academic year. However, surveys from previous years had suffered from low participation rates and limited data quality. I used content design principles to develop a new survey focused on clarity, engagement, and usability, with the goal of increasing response rates and collecting more reliable data.
Low discoverability
Long question list
Lack of incentives
Methodology and tools
I started this project by creating a flowchart in Whimsical to represent the structure and logic of the survey. The idea was to show users only questions about the events they could have attended.
I then compiled and reviewed all the questions that had been asked in previous surveys. Some questions were left out and others were kept, though I changed the wording in most cases. I ended up with three questions for each event type (15 questions in total).
As a next step, I developed the survey in Qualtrics, following the Whimsical flowchart and the final list of questions.
I then ran a pilot test with a small group of users to determine how long it would take them to complete the survey.
Lastly, the survey was shared in a separate email (instead of as part of the weekly newsletter), obtaining higher response rates than in previous years.
Survey design
Cover page
The title mentions the goal of the survey and the incentive.
Estimated completion time is mentioned to encourage participation.
The button text mentions the value of clicking the button ("share my thoughts") instead of the action ("start the survey").
Survey design
Sample background questions
Survey design
Sample event-related questions
Accessibility
I took accessibility best practices into consideration when designing the survey.
Thank you
Lack of incentives
None of the previous surveys reviewed included an incentive to encourage participation. Although not required, incentives can motivate users to complete the survey without leaving any questions unanswered.
Open-ended questions
An analysis of previously used surveys showed that they relied heavily on open-ended questions. Open-ended questions collect qualitative data, which is better captured with other methods, such as focus groups or semi-structured interviews. This could have contributed to the low participation rates.
Tooltip
One of the questions referenced the Brown Bag Series. However, some users were unfamiliar with that term during the pilot test. For that reason, a tooltip was added, so that users could see a brief definition when hovering over it.
Low discoverability
Previous surveys had been shared in a newsletter the European Union Center emailed weekly. Although this newsletter reached professors, students, alumni, and some community members (the target groups for the survey), burying the survey inside it could have made it hard to find. This likely contributed to the low participation rates as well.
Long question list
Previous surveys included a long list of questions, shown all at once. Research has shown that this can lead to high abandonment rates, leaving many questions unanswered. This could have discouraged many users from completing the survey or led them to provide low-quality responses.
Age range
This question asks respondents to select an age range. To prevent confusion, each range ends with a number ending in 4, and the next begins with a number ending in 5—for example, "25–34". Using overlapping ranges like "25–35" and "35–45" could be ambiguous for someone aged 35, as they might fit into both.
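The non-overlapping pattern described above can be sketched in code. The specific cutoffs below are assumptions for illustration, not necessarily the exact ranges used in the survey:

```python
# Non-overlapping age ranges: each bucket ends in 4 and the next starts in 5,
# so any age falls into exactly one bucket (unlike 25-35 / 35-45 schemes).
AGE_RANGES = [(18, 24), (25, 34), (35, 44), (45, 54), (55, 64), (65, 120)]

def age_bucket(age: int) -> str:
    """Return the label of the single bucket containing `age`."""
    for low, high in AGE_RANGES:
        if low <= age <= high:
            return f"{low}-{high}"
    return "Prefer not to say"  # fallback for out-of-range input
```

With these buckets, a 35-year-old respondent matches "35-44" and nothing else, so the ambiguity described above cannot arise.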
Avoiding bias
Only one open-ended question was retained from previous surveys, but it was reworded to reduce bias. The original version—“What else could we improve in future conferences to enhance the experience (e.g., refreshments)?”—included an example that could steer respondents toward specific topics like food or drinks. By removing the example, the revised question encourages more diverse and unbiased feedback.
Question grids
It is common to group questions using a Likert scale in a grid format. However, these were avoided, since they are hard to parse and are not always displayed properly on all devices.
Likert scales
Likert scales were used for some questions to replace former open-ended questions. A 5-point Likert scale was chosen for several reasons:
- It has a "neutral" option.
- It does not force users to make a "good"/"bad" choice.
- It offers more granular responses than a 4-point scale while being less cognitively taxing than a 6-point scale.
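The reasoning above can be made concrete with a small sketch. The labels are common Likert wording, not necessarily the survey's exact text:

```python
# A 5-point Likert scale: an odd number of points gives a true neutral
# midpoint, so respondents are not forced toward "good" or "bad".
LIKERT_5 = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neither agree nor disagree",  # the "neutral" option
    4: "Agree",
    5: "Strongly agree",
}

def is_neutral(score: int) -> bool:
    """An odd-length scale has a midpoint; 4- or 6-point scales do not."""
    return score == (len(LIKERT_5) + 1) // 2
```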
Role selection
Since some events are restricted to a specific group of attendees, the users' response to this question determines what other questions they see later in the survey.
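The branching described above can be sketched as a mapping from role to visible question blocks. The role names and block names below are hypothetical placeholders, not the survey's actual wording:

```python
# Display logic: the respondent's role determines which event-related
# question blocks appear later in the survey.
QUESTION_BLOCKS_BY_ROLE = {
    "student":   ["lectures", "workshops", "conferences"],
    "professor": ["lectures", "conferences"],
    "alumni":    ["conferences"],
    "community": ["public_events"],
}

def visible_blocks(role: str) -> list[str]:
    """Return the question blocks shown to a respondent with this role."""
    return QUESTION_BLOCKS_BY_ROLE.get(role, [])
```

In Qualtrics this kind of routing is configured through the survey's display logic rather than written as code; the sketch only illustrates the underlying rule.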
Content design for surveys
iratihurtado
Created on June 25, 2025