Writing Good Questions

In October, I was lucky enough to attend the very first “Undergraduate Research Practices Workshop” hosted by CLIR and run by Nancy Fried Foster from the University of Rochester. One of our topics was “Asking Good Questions,” and during the discussion I offered up the process we use here at Columbia. I hope some of you find our approach helpful, too.

This process is used at Columbia University Libraries to facilitate discussions about information needs in assessment projects. Any question that staff may want to ask students or faculty, whether via a survey, focus group, interview, or ethnographic study, can be developed using this process. Many assessment projects start with “let’s do a survey.” A survey is a useful tool for gathering information, and we use them often, but it’s not always the best tool for the job. Whatever methodology you decide on, it’s important to have good questions so that you collect relevant data. We could ask any number of “interesting” questions, but we want to be sure to prioritize our information needs, and this process facilitates that. Assessment supports decision making at CUL, and I try to make that relationship as direct as possible.

Another reason to write good questions is to make our assessment tools more “usable.” There’s often anxiety about the length of a survey. In my opinion, it’s not the length but the usability of the survey that matters. If you have 20 questions that are relevant to the survey taker and easy to understand and answer, you’ll get a higher response rate than you would with 5 poorly worded, leading, or confusing questions. We consistently get pretty high completion rates on our surveys, which hints at the general usability of the questions we write.

Step One: Using the “Project Team Brainstorming Activity” chart below, identify what you know and what you don’t know about the project at hand. I call these items, simply, “Knowns” and “Unknowns.” Before we gather information from our users, it’s important to be sure that we don’t already have data on hand that may answer our questions. Knowns can be anything from budget information, website usage statistics, or gate count statistics to general limitations of the project or things you know you will do during the project. Unknowns can be anything that team members want to know about the project or about the user population you’re working with. Team members should be encouraged to be exhaustive in their brainstorming; this is our opportunity to ask all of our questions! They shouldn’t worry at all about the phrasing of a question at this point, since this is a brainstorming activity.

I usually give team members a paper worksheet (like the one below) before the meeting, and ask them each to brainstorm on their own, in preparation for a group brainstorming session.

At the group brainstorming session, start with the Knowns, and then move on to the Unknowns. Team members share the items they came up with on their own, and then the group brainstorms further items together.

Usually, as the Assessment Librarian, I facilitate the brainstorming session and record all of the ideas on flip chart paper.

Step Two: Transfer all of the Unknowns to the “Writing Good Questions” document.


Step Three: Write an Information Need statement for each Unknown. A good way to do this is to write “I want to know” statements for each Unknown, as recommended by Nancy Fried Foster. For example, an Unknown might be “Should the new science library be open 24/7?” The corresponding Information Need would be “I want to know if undergraduate science students need library services overnight.” There may be multiple Information Needs for each Unknown. (This isn’t an exact science! The goal is to clarify the questions you brainstormed, to identify the exact information you need, and help you write a more clear question.)

Step Four: This is a good time for the project team to prioritize the information needs – what do you need to know now for the project at hand, and what would simply be nice to know? (You could add a “Priority” column to the worksheet.)

Step Five: Assign the appropriate audience and methodology (survey, focus group, interview, observation, ethnographic study, etc.) to each Information Need. Some questions will be appropriate for undergraduates, some just for faculty. You may want to use different methodologies for different populations.

Step Six: Based on the Information Need, audience, and methodology, write the text for each question you will ask. Try to be as clear as possible. Play around with the wording, perhaps writing two or three options for each Information Need, before choosing the final question text. I like to do this with a partner, so we can challenge each other’s word choices and find the “best” way to ask a question. One useful test is to ask: “If we ask this question, people will give us this information. Will that help us?”

Step Seven: Test the questions, rewrite them as necessary, then test, rewrite, test, rewrite, test, rewrite… I usually send the questions to a couple of student workers who will not be participating in the study. I ask them to tell me if any of the questions are unclear, confusing, use terms they don’t understand, or don’t allow them to give the answer they want to express. I also send the questions to colleagues who have experience with research and a critical eye. At some point, you’ll need to get the study under way, so don’t spend too much time over-thinking the questions. A couple of rounds of testing will catch the red-flag issues; when the feedback you’re getting from colleagues starts to sound nitpicky, your questions are probably ready to go!

I encourage everyone to join the Anthrolib listserv run by Nancy Fried Foster at the University of Rochester.

For more information on our approach to assessment at Columbia, take a look at the materials available from the 2009 ACRL conference workshop Assessment Project Management in the Real World. I’d love to hear how others approach question writing – please post a comment and share your experience!