In defense of the sanctimonious women's studies set || First feminist blog on the internet

Online Surveys Should Have Comments Systems

I might be blogging too much, but when I was solicited to take an online college survey, I truly wished there were comment boxes available for me to flesh out my answers. I took the damn thing sort of like I take a quiz on what 70s glam rock icon I am, all the while knowing that the only answers I could give were going to put me in a little box too small for my preferences.

Most of the answers I had to give on school life were completely predicated on me being a single mother at a massive land grant university. My time and attention are limited; of course I’m not seeking out places on intramural sports teams and “frequently” joining campus activities. I do what I can, which I indicated to the best of my ability, but nowhere was I able to say that although my university has plenty of things going on, I don’t have the wherewithal to participate.

After all the campus life business, the political questions started: Do you consider yourself far left, left, center, right, far right? Let me think. I clicked through to the next page and my mouth gaped open. Forgive me for being a media-skeptical blogger, but I can only imagine how this research could be framed and used against the “liberal academic elite.”

Oh, how I wish I could have had a little box to add clarifications and critique the wording of the questions [click for screenshot].

See the first question, “There is too much concern in the courts for the rights of criminals.” Is this survey seeking out hard conservatives and liberals among college undergraduates? I clicked Disagree Somewhat because I think the rights of all, including convicted criminals, are an important human rights standard to set, even if those concerns can be used by defense lawyers to detract from the rights of victims, especially victims of violent crimes seeking justice.

Another question, “If two people really like each other, it’s all right for them to have sex even if they’ve only known each other for a short time.” I clicked Agree Somewhat because I think any two consenting adults have the right to have sex in pretty much any way they want; they don’t even have to know or like one another, and as long as it’s safe and consensual you’re a-okay with me. Bad question, poor available answers.

It seems as though surveys like this are set up for college students to display their poor understanding of the Constitution, or at least to typecast us as wild pipe dreamers. At the very least I wish for the kinds of surveys that give an opportunity to place my answers within a context other than my study habits, alcohol use, and name and address. We are not all cut from the same cloth.


5 thoughts on Online Surveys Should Have Comments Systems

  1. I’m not an expert in survey research design, but I have worked for the last year and a half with a team writing a national survey that is actually in the field right now… so I have been in the position of having to actually try and write survey questions and I’ll say this much: It’s friggin hard.

    The thing that I had to learn was that the point of survey questions isn’t to actually find out what any one individual thinks. You can’t do that with a survey; you have to do that with an in-depth interview or a focus group. The point is to collect enough answers from enough different people (usually different in as many ways as possible) to see if there are non-random differences between groups of people. And only then do you have something to say, but importantly, you shouldn’t say “Women think X and men think Y” (since you don’t actually know what X and Y necessarily are) – but rather “Women and men seem to think differently on X” – where X is the question itself, with all its ambiguous glory.

    That said, the problems you note are pretty much right on.

    First off, what you note in the “if two people really like each other” question shows a classic question design problem, in that the question has two prompts – the degree to which people like each other and the amount of time they have known each other. For data analysis purposes, this is a pretty useless question without more items that break this apart. Just as you note, you can’t tell what the respondent is saying from their response. Bad design.

    Now, the first one you note certainly is ambiguous, and I really hate the way it’s written – but there is a defense of intentionally ambiguous questions in two ways. First, if the ambiguous answer, when crossed against another variable, like race, sex, or income, has statistically reliable differences, then it’s interesting even though it’s ambiguous. But, of course, we don’t actually know why it’s interesting if we don’t actually know what the item is testing.

    The second defense has to do with times in which the ambiguous prompt is logically ambiguous, but not necessarily culturally ambiguous. That is, often statements that have multiple interpretations are used constantly in public, and are recognizable and meaningful to respondents even if they are internally unclear. You use them on the survey kind of because you have to use them.

    Of course, you still have no idea what they necessarily mean when you get to the stage of analyzing your results, since people can always argue, “That’s not what the respondents meant.” Of course, find me a question that measures an attitude that you can’t actually do this with and I’ll show you a boring question.

    But the real problem is the basic methodology. If users self-select in choosing to take the survey, it’s amazingly limited data. Borderline useless for generalization.

    You can do some things to make it better, like draw a random sample from within the self-selected population, and ask a hell of a lot of demographic questions which can be used to build weights for individual cases. But at the end of the day, people who choose to spend their time finding and taking surveys online are, the argument goes, basically different people than those who don’t. And most likely in meaningful ways. For example they 1) have internet access and a computer, 2) that access is not controlled by someone else, 3) they have the free time to take the online survey, and 4) they want to take the survey.

    This last problem doesn’t just plague internet surveys, but pretty much any kind of ethical survey that allows respondents to, well, not be respondents. There are entire books written about survey non-response and what it does to data. And it’s not good.

    But anybody who makes an argument about the general population based on data they drew from a web survey can be pretty much dismissed out of hand. And likely will be by most survey researchers.
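
    [The “crossed against another variable” idea above can be sketched in a few lines of Python. This is an illustration only – the counts below are made up, and a real analysis would use something like scipy.stats.chi2_contingency rather than a hand-rolled statistic.]

    ```python
    def chi_square(table):
        """Pearson chi-square statistic for a 2D contingency table
        (rows = respondent groups, columns = answer options)."""
        row_totals = [sum(row) for row in table]
        col_totals = [sum(col) for col in zip(*table)]
        grand = sum(row_totals)
        stat = 0.0
        for i, row in enumerate(table):
            for j, observed in enumerate(row):
                # Expected count under the null of no group difference.
                expected = row_totals[i] * col_totals[j] / grand
                stat += (observed - expected) ** 2 / expected
        return stat

    # Hypothetical counts for one ambiguous item, split by some demographic:
    # columns are Agree Strongly, Agree Somewhat, Disagree Somewhat, Disagree Strongly.
    table = [
        [40, 35, 15, 10],   # group A
        [15, 20, 35, 30],   # group B
    ]

    stat = chi_square(table)
    # With (2-1)*(4-1) = 3 degrees of freedom, the 5% critical value is ~7.81.
    print(f"chi-square = {stat:.2f}, reliable difference: {stat > 7.81}")
    ```

    [If the statistic clears the critical value, the groups answer the item differently more than chance would allow – even though, as the comment says, you still don’t know what the item was actually measuring.]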
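
    [The weighting fix mentioned above – using demographic questions to build weights for individual cases – can also be sketched. The group names, population shares, and responses below are all invented for illustration; this is the simplest post-stratification scheme, weighting each respondent by (population share of their group) / (sample share of their group).]

    ```python
    from collections import Counter

    # Assumed population composition (invented for this sketch).
    population_share = {"under_30": 0.40, "30_and_over": 0.60}

    # Each respondent: (demographic group, answer to some yes/no item).
    # This self-selected sample over-represents the under-30 group.
    sample = (
        [("under_30", "yes")] * 50 + [("under_30", "no")] * 20 +
        [("30_and_over", "yes")] * 10 + [("30_and_over", "no")] * 20
    )

    n = len(sample)
    sample_share = {g: c / n for g, c in Counter(g for g, _ in sample).items()}
    weights = {g: population_share[g] / sample_share[g] for g in population_share}

    # Unweighted vs weighted estimate of the "yes" rate.
    unweighted = sum(1 for _, a in sample if a == "yes") / n
    weighted = sum(weights[g] for g, a in sample if a == "yes") / n
    print(f"unweighted yes-rate: {unweighted:.2f}, weighted: {weighted:.2f}")
    ```

    [The unweighted estimate is pulled toward the over-represented group; the weights partially correct for that, though they can’t fix the deeper self-selection problems the comment describes – no weight repairs the fact that the people who never showed up may differ in ways your demographics don’t capture.]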

  2. Yah, those college surveys were always very, very annoying. And the political questions always seemed rather inappropriate. Okay, I can see why a college administration would like to be able to see if there’s problem drinking going on, but why is it necessary to know this other stuff?

    The absolute best example of terrible survey wording I ever ran across, however, was:

    “How many times in the past month have you not had a drink of alcohol, in order to:
    (1) Study
    (2) Go to class
    …list continues”

    Ponder that for a moment. How many times have I not done something over the past month? Hmm. I didn’t drink when I got up this morning. That’s one. Then, I had to go to breakfast, and I didn’t have a beer with breakfast. That’s two. Damn. There must be a THOUSAND times that I didn’t drink alcohol because I was doing something else! Impressive.

  3. The same problems exist within phone surveys, with the added problem that the person giving the survey can mess with the data. I worked for three years doing phone surveys, and I can’t tell you the number of co-workers who would coach the respondent on what to answer, or just put something down for them, in order to make their own jobs easier (with damn good reason).

    People who are really dumb are annoying to do surveys with, because they won’t understand the questions. People who are really smart are annoying to do surveys with, because, like you, they don’t want to give an inaccurate answer within the defined fields. It’s the vast, mediocre majority that survey-takers like talking to.

  4. “People who are really dumb are annoying to do surveys with, because they won’t understand the questions. People who are really smart are annoying to do surveys with, because, like you, they don’t want to give an inaccurate answer within the defined fields. It’s the vast, mediocre majority that survey-takers like talking to.”

    Ah, phone surveys. I almost always have answers to even the most focused questions that booger up the poor survey-taker on the other end of the line. E.g., Are you a Democrat, a Republican, or an Independent? (Answer: A Green). What religion do you belong to? (Answer: Unitarian Universalist). And then when you get into the questions where I can either guess the thrust behind the survey, or, as you say, dislike giving an overly simplistic answer to a question about a highly complicated issue… Luckily, most of the survey-takers I’ve interacted with were more amused than annoyed by my yatterings.

  5. People who want to play around and give real answers to the questions are fun as long as they’re nice about it and don’t get angry about the fact that you can’t accept anything other than the canned responses. Of course, if you’ve had a long shift or need to just get the survey done and get on to the next complete, they can be a huge pain in the ass, too.

    I’m so glad I’m done with that job.
