Friday, October 26, 2012

Establishing Rapport


Telephone survey respondents sometimes tell me that they will take my survey even though they do not usually do surveys.  Experience has given me some skill at persuading people to answer survey questions, but my voice probably deserves most of the credit.  I have a deep, authoritative-sounding voice.

One reason many people decline to participate in telephone surveys is the prospect of personal or hot-button questions.  When we read the introduction to a questionnaire, we tell people that the survey is confidential and that their responses will be reported in the aggregate, not individually.  Respondents often do not hear this, and when they do, they judge it less from the words than from the interviewer’s tone of voice.  Some respondents ask direct questions about the confidentiality of a survey.  An interviewer who can answer such questions without getting flustered can usually establish rapport with the respondent.

Establishing rapport with a survey respondent matters not simply for getting the respondent to participate at all, but also for getting honest answers to hot-button and personal questions.  If a respondent perceives that an interviewer is truly neutral, the respondent will give honest answers even when those answers may be considered socially unacceptable.  If we do a survey about household financial management, we want accurate information about gambling habits and credit card debt.  We ask demographic questions about age, income, marital status and education level so that we can make sure we get a representative sample of a population.  We usually ask these questions at the end of the questionnaire.  An interviewer who establishes rapport at the beginning of the call is less likely to have a respondent refuse to answer these personal questions.

Sometimes it is too easy to establish rapport.  Some respondents love to talk.  Several years ago I worked on a survey in which we asked women about their experiences at the maternity center of a local hospital.  One woman went into great detail about childbirth, but was offended that I asked about her annual household income.  Some respondents want to talk about anything but the questions on the survey; they view the call as an opportunity to get something off their chest.  The interviewer’s responsibility in such situations is to keep the call on track by continually bringing the respondent back to the questions.  This often requires tact and diplomacy, which are also part of establishing rapport.

Monday, October 22, 2012

Probing for Illuminating Information


My work as a market research interviewer helps organizations to be more competitive and efficient.  Businesses commission market research studies to learn how to add value to their product or service.  They may also conduct market research to uncover problems.  Market research studies are also done so that public relations workers can learn how to put the best spin on a message.

One of my best memories is of a client telling us that the information we had gathered was illuminating.  To get illuminating data, an interviewer must be able to probe well.  Probing means asking questions that are not written in the questionnaire in order to clarify vague or general responses to open-ended questions.  We also probe with questions such as “What else?” or “Why else?” to draw as much information as possible from a respondent.

Probing well requires quick thinking and the ability to read a respondent.  An interviewer must not ask leading questions.  For example, a respondent might say that the most important issue for local public officials to address is education.  One person who says that might be thinking about teacher salaries.  Another might be thinking about outdated textbooks.  Still another might be thinking about dropout rates.  If the interviewer asks “Do you mean the high dropout rate?” the interviewer is asking a leading question.  The respondent might mean the dropout rate, but they might mean something else, and if so, we could lose that information: many respondents will just say “Uh, yeah” rather than correct the interviewer.

A stock probing question is “What do you mean?”  This question usually gets a respondent to be more specific, but sometimes a respondent will simply define a word that they used, and such a response is rarely useful.  A few years ago I did a telephone interview about an upcoming election.  I asked a respondent how he would vote in a race for a seat in the United States House of Representatives.  He told me how he would vote.  The follow-up question was “Why?”  The respondent told me that he would vote for his candidate because the other candidate was a slacker.  I could tell that this guy would give me a definition of the word “slacker” if I asked what he meant, so I asked “What gives you that impression?”  He talked about missed votes and failure to answer letters and phone calls.  This is the kind of information clients need in order to understand why people make the choices that they do.

Another good probing question is “Why is that important to you?”  This usually gets at information that is not revealed by agree/disagree statements or demographic information.  It must be asked appropriately, however.

John C. Stevens
Saperstein Associates
(614) 261-0065

Friday, October 12, 2012

Neutrality in Political Polls


A Facebook friend who lives in Florida recently posted about a polling call that his wife answered.  The interviewer asked if she planned to vote for Barack Obama in the upcoming election.  When she said no, the interviewer said they hoped she would change her mind and ended the call.

I commented on my friend’s post that this did not sound like a legitimate poll.  It was really a canvassing call.  A real poll would not ask a leading question.  Where I work, we even go so far as to rotate the names of the candidates in a race so that half of the respondents hear the name of the Democratic candidate first, and half of the respondents hear the name of the Republican candidate first.  That is one way that we work to reduce bias in polling.
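For readers who like to see the mechanics, here is a minimal sketch in Python of how that rotation might be scripted.  The even/odd split on an interview ID is my own assumption for illustration; our actual system may assign the rotation differently.  The wording echoes the ballot question quoted later in this post.

    def ballot_question(interview_id):
        # Alternate the candidate order so that, across the sample, each
        # candidate's name is read first in half of the interviews.
        choices = ["Barack Obama, the Democrat", "Mitt Romney, the Republican"]
        if interview_id % 2 == 1:  # odd-numbered interviews lead with the Republican
            choices.reverse()
        return ("If the election were held today, who would you vote for "
                "President of the United States, {}, or {}?".format(*choices))

    # Interviews 0 and 1 produce the two versions of the question quoted below.
    print(ballot_question(0))
    print(ballot_question(1))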

Respondents often ask our interviewers who is sponsoring a survey.  When we conduct political polls we often do not tell the interviewers the name of the client.  We instruct them to tell respondents that interviewers are intentionally not told who is sponsoring a survey, so that the sponsor’s identity cannot bias or influence anyone’s answers.  We have done several polls for the Columbus Dispatch.  At the end of the call we ask respondents if a reporter from the Dispatch may call them to discuss their answers.  On those polls, interviewers are told to reveal the sponsor only after the respondent has completed the survey.

When I brief interviewers on a new telephone survey project, I like to remind them that they are free to have opinions about candidates and issues, but that they must keep those opinions to themselves when interviewing respondents.  Whether an interviewer asks:

“If the election were held today, who would you vote for President of the United States, Barack Obama, the Democrat, or Mitt Romney, the Republican?”

OR

“If the election were held today, who would you vote for President of the United States, Mitt Romney, the Republican, or Barack Obama, the Democrat?”

the interviewer should not emphasize either choice.  If the interviewer asks a respondent whether they agree or disagree with a political position, the interviewer needs to read the question in such a way that the respondent cannot tell which side the interviewer takes.  We occasionally have respondents ask interviewers how they feel about issues.  Interviewers are permitted to discuss their opinions only after the respondent has answered all of the survey questions, and even then they must keep it brief.  We also remind interviewers not to say okay or uh-huh after a respondent answers a question; these utterances can be interpreted as approval or disapproval of an answer.

As far as I know, CBS News, Gallup, Quinnipiac University and Rasmussen train their pollsters in maintaining neutrality.  These organizations have an interest in making sure that the data they collect is accurate.  The larger the phone room, the more difficult it may be to make sure that each interviewer is asking questions in a neutral manner.  Errors in sampling may also skew polling results slightly.  That is why a margin of error is always reported with survey results.  For the most part, however, polling organizations have neither a Democratic nor a Republican agenda.
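For a sense of where that margin of error comes from: for a simple random sample, it is the half-width of a confidence interval around a reported percentage.  Here is a minimal sketch in Python; this is my own illustration, not anything we run in the phone room.

    import math

    def margin_of_error(sample_size, proportion=0.5, z=1.96):
        # Half-width of a 95% confidence interval for a proportion.
        # proportion=0.5 is the worst case, which is what published
        # margins of error usually assume; z=1.96 is the 95% critical value.
        return z * math.sqrt(proportion * (1 - proportion) / sample_size)

    # A poll of 1,000 respondents carries a margin of error of roughly
    # plus or minus 3.1 percentage points.
    print(round(100 * margin_of_error(1000), 1))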

John C. Stevens
Saperstein Associates
(614) 261-0065

Wednesday, October 10, 2012

Reading Questions Verbatim


Art requires creativity and innovation based on a foundation of discipline.  A musician must master scales before he or she can improvise.  A painter must understand the color spectrum and geometric forms to create compelling abstract art.  An interviewer needs to be able to read verbatim in order to gather useful information.

Several years ago I devised a mock survey questionnaire to use for screening applicants for telephone interviewer positions.  Research and my own experience had shown that interviewers need continual supervision and coaching to consistently read survey questionnaires verbatim.  I needed something that would help me hire interviewers who already had the ability to read verbatim.

Reading survey questions exactly as they are written is important because changing the wording of a question can change its meaning:
  •  Could you tell your boss to go jump in the lake?
  •  Would you tell your boss to go jump in the lake?
  •  Should you tell your boss to go jump in the lake?

These three questions are very similar, but they are different questions.  I could tell my boss to go jump in the lake; I am physically able to utter those words.  Would I?  That depends on the answer to the last question.  If an interviewer says “could” instead of “would,” the interviewer has rewritten the question.  If some respondents hear “could” instead of “would,” we get answers to different questions when we think we are getting answers to the same question.  Our analysis of opinions would be inaccurate.

The interviewers I supervise generally do a good job of reading verbatim, but they are human beings.  If I could record myself reading survey questions verbatim and then let a voice recognition system call people so they could hear the questions, I could reduce interviewer error.  Reducing interviewer error would help us deliver more accurate information to our clients while reducing the amount of money we spend on supervising and coaching interviewers to read verbatim.

An interviewer needs to be a robot when reading questions and a human being when listening to answers.  A voice recognition system would need an artificial intelligence component to be able to probe vague answers to open-ended questions.  Until then, we will need disciplined interviewing artists.

John C. Stevens
Saperstein Associates
(614) 261-0065
jstevens@sapersteinassociates.com