3.5.2 Wiki surveys

Wiki surveys enable new hybrids of closed and open questions.

In addition to asking questions at more natural times and in more natural contexts, new technology also allows us to change the form of the questions. Most survey questions are closed: respondents choose from a set of choices written by researchers. It's a process that one prominent survey researcher calls "putting words in people's mouths." For example, here's a closed survey question:

“This next question is on the subject of work. Would you please look at this card and tell me which thing on this list you would most prefer in a job?”

  1. High income;
  2. No danger of being fired;
  3. Working hours are short, lots of free time;
  4. Chances for advancement;
  5. The work is important, and gives a feeling of accomplishment.

Now here’s the same question asked in an open form:

“This next question is on the subject of work. People look for different things in a job. What would you most prefer in a job?”

Although these two questions appear quite similar, a survey experiment by Howard Schuman and Stanley Presser (1979) revealed that they can produce very different results: nearly 60% of the responses to the open question fell outside the categories offered in the closed version (Figure 3.7).

Figure 3.7: Results from Schuman and Presser (1979). Responses are quite different depending on whether the question is asked in closed or open form.

Although open and closed questions can yield quite different information and both were popular in the early days of survey research, closed questions have come to dominate the field. This domination is not because closed questions have been proven to provide better measurement, rather it is because they are much easier to use; the process of coding open-ended questions is complicated and expensive. The move away from open questions is unfortunate because it is precisely the information that the researcher did not know ahead of time that can be the most valuable.

In some research I’ve done with Karen Levy, we tried to create a new kind of survey question that combines the best features of both open and closed questions (Salganik and Levy 2015). That is, it enables researchers to learn new information as in an open question, and it yields easy-to-analyze data as in a closed question. Inspired by online systems driven by user-generated content, of which Wikipedia is an exemplar, we called our system a wiki survey. By combining the characteristics of Wikipedia and a traditional survey, we hoped to create a new way of asking questions.

The data collection process in a wiki survey is illustrated by a project we did with the New York City Mayor’s Office in order to integrate residents’ ideas into PlaNYC 2030, New York’s citywide sustainability plan. To begin the process, the Mayor’s Office generated a list of 25 ideas based on their previous outreach (e.g., “Require all big buildings to make certain energy efficiency upgrades,” “Teach kids about green issues as part of school curriculum”). Using these 25 ideas as “seeds,” the Mayor’s Office asked the question “Which do you think is a better idea for creating a greener, greater New York City?” Respondents were presented with a pair of ideas (e.g., “Open schoolyards across the city as public playgrounds” and “Increase targeted tree plantings in neighborhoods with high asthma rates”), and asked to choose between them (Figure 3.8). After choosing, respondents were immediately presented with another randomly selected pair of ideas. Respondents were able to continue contributing information about their preferences for as long as they wished, either by voting or by choosing “I can’t decide.” Crucially, at any point, respondents were able to contribute their own ideas, which—pending approval by the Mayor’s Office—became part of the pool of ideas to be presented to others. Thus, the question that participants receive is simultaneously open and closed.

Figure 3.8: Interface for a wiki survey (Salganik and Levy 2015).
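The data-collection loop described above—a growing pool of ideas, randomly sampled pairs, votes, and moderated respondent contributions—can be sketched in a few lines. This is a minimal illustration, not the actual All Our Ideas implementation; the class and method names are invented for this sketch.

```python
import random

class WikiSurveySketch:
    """Toy model of the wiki-survey loop: pairwise votes over a pool of
    ideas that respondents can extend, pending moderator approval."""

    def __init__(self, seed_ideas):
        self.ideas = list(seed_ideas)   # pool starts with the researcher-written seeds
        self.votes = []                 # recorded as (winner, loser) pairs

    def next_pair(self):
        """Return a randomly selected pair of distinct ideas from the pool."""
        return random.sample(self.ideas, 2)

    def record_vote(self, winner, loser):
        """Respondent chose `winner` over `loser` (skipping is also allowed
        in the real system via 'I can't decide')."""
        self.votes.append((winner, loser))

    def add_idea(self, idea, approved):
        """A respondent-contributed idea joins the pool once approved."""
        if approved and idea not in self.ideas:
            self.ideas.append(idea)
```

A respondent session, then, is just a loop: draw a pair with `next_pair`, record a vote (or a skip), and occasionally submit a new idea that—if approved—future respondents will see.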

The Mayor’s Office launched its wiki survey in October 2010 in conjunction with a series of community meetings to obtain resident feedback. Over about four months, 1,436 respondents contributed 31,893 responses and 464 new ideas. Critically, 8 of the top 10 scoring ideas were uploaded by participants rather than being part of the set of seed ideas from the Mayor’s Office. And, as we describe in our paper, this pattern appears in many wiki surveys. In other words, by being open to new information, researchers are able to learn things that would have been missed using more closed approaches to asking.
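To see how "top scoring" ideas can be identified from pairwise votes, here is a deliberately naive scoring rule: rank each idea by the share of its matchups it won. The actual deployment uses a model-based estimate described in Salganik and Levy (2015); this win-rate sketch only conveys the basic intuition.

```python
from collections import defaultdict

def win_rates(votes):
    """Score each idea by the fraction of its pairwise matchups it won.
    `votes` is a list of (winner, loser) pairs. A naive stand-in for the
    statistical model used in the real system."""
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for winner, loser in votes:
        wins[winner] += 1
        appearances[winner] += 1
        appearances[loser] += 1
    return {idea: wins[idea] / appearances[idea] for idea in appearances}

# Hypothetical votes over three ideas (labels invented for illustration):
votes = [("tree planting", "schoolyards"),
         ("tree planting", "curriculum"),
         ("schoolyards", "curriculum")]
scores = win_rates(votes)  # tree planting: 1.0, schoolyards: 0.5, curriculum: 0.0
```

One reason a simple win rate is not enough in practice: ideas uploaded late appear in fewer matchups, against a non-random mix of opponents, which is exactly why the paper's model-based scores are needed.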