Showing posts with label survey. Show all posts

Friday, June 17, 2016

Couldn't do a day in retail

For the last couple of months I’ve been working for Maritz Customer Experience, an international market research firm. I’m glad to have the chance to work at home, doing work that I like. Switching from a very small firm to a very large one has taken a little adjustment. The bigger adjustment has been in the topics I interview people about.

When I worked for Saperstein Associates I did polling about local politics and interviewed educators about educational materials. We moved from project to project. I’ve worked on the same project at Maritz since I started at the end of March. I interview people about their customer service experience after they’ve visited one of the client’s stores.

Respondents often tell me that I did a good job interviewing them, but that doesn’t always make me feel good. That’s because I remember getting a call as a supervisor from a respondent who told me that she appreciated that one of the interviewers I supervised did a good job explaining the questions. In case you don’t know, an interviewer should never explain questions to a respondent. It introduces bias to the survey.

Interviewing respondents about their customer service experience often reminds me of the time that I remarked about one of Saperstein’s clients that “She couldn’t do a day in retail.” Those words taught me a hard lesson in the words of Jesus, who said, “For with what judgment ye judge, ye shall be judged: and with what measure ye mete, it shall be measured to you again.” (Matthew 7:2, KJV)


I like to keep this verse in mind when I interview cantankerous respondents who give low ratings to the people who work in the client’s stores. I’m grateful that I wasn’t the one who had to sell those respondents something, or deal with their problems.

If you apply to Maritz, please be sure to tell them that I referred you.

Monday, March 7, 2016

Pretesting Surveys




A regular client asked me to do a couple of pretests in the last few months. Pretesting used to be much more common among polling organizations than it is now. The idea is to do about a dozen interviews using the final draft of the questionnaire. We time each interview to calculate the average length. The client expects me to report any errors in the language or logic of the questionnaire. They also expect me to report any problems with the sample.


I feel fortunate that I have a client who thinks of not pretesting as a penny-wise and pound-foolish proposition. Even if he does not need to rewrite questions, he finds pretest data helpful for accurate proposals and budgets.  We always learn something from a pretest.
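To give a sense of the arithmetic, here is a sketch of how pretest timings might feed into a budget estimate. Every number here is made up for illustration; none of it comes from an actual client project.

```python
# Hypothetical timings, in minutes, for a dozen pretest interviews
timings = [11.5, 9.0, 12.2, 10.8, 13.1, 9.7, 11.0, 12.5, 10.2, 11.8, 9.9, 12.0]

average_length = sum(timings) / len(timings)

# Rough interviewing-cost estimate for the full study, assuming
# invented figures: 400 completes at an interviewer wage of $15/hour.
completes = 400
wage_per_hour = 15.00
estimated_hours = completes * average_length / 60
estimated_cost = estimated_hours * wage_per_hour

print(f"Average interview length: {average_length:.1f} minutes")
print(f"Estimated interviewing cost: ${estimated_cost:,.2f}")
```

A real proposal would also budget for dialing time, refusals, and callbacks, but even this simple average is better than guessing the length of an interview from the page count of the questionnaire.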


One thing I like about getting on the phone to do a pretest is that we often learn about hot-button issues we didn’t know about. Surveying people about a school levy, park levy or library levy sounds dry, but individual questions sometimes provoke excited responses on educational policy or environmental issues. When this happens, a pretest provides a good excuse to let a respondent go off topic. Doing so can illuminate issues that have not been addressed by either side of a levy issue.


Pretesting also provides an opportunity for me to evaluate my interviewing skills. Respondents let me know pretty quickly if I read questions too quickly or too slowly.  The better I get at listening, the earlier I learn which probes yield useful information for the client.


I would be glad to have a chance to help other opinion research or market research organizations with pretests. If your organization still does them, please let me know.


John C. Stevens

(614) 772-2332

Friday, October 26, 2012

Establishing Rapport


Telephone survey respondents sometimes tell me that they will do my telephone survey, but they don’t usually do surveys.  Experience has given me some skill at getting people to answer survey questions, but my voice probably deserves most of the credit.  I have a deep and authoritative-sounding voice.

One reason that many people do not participate in telephone surveys is personal or hot-button questions.  We tell people when we read the introduction to a questionnaire that the survey is confidential.  We tell them that their responses will be reported in the aggregate and not individually.  Respondents often do not hear this.  If they do hear it, they perceive it more from the tone of voice of the interviewer.  Some respondents ask direct questions about the confidentiality of a survey.  If an interviewer can answer such questions without getting flustered, the interviewer can usually establish rapport with the respondent.

Establishing rapport with a survey respondent is important not simply to get them to participate at all, but also to get them to give honest answers to hot-button and personal questions.  If a respondent perceives that an interviewer is truly neutral, the respondent will honestly give answers that may be considered socially unacceptable.  If we do a survey about household financial management, we want accurate information about gambling habits and credit card debt.  We ask demographic questions about age, income, marital status and education level so that we can make sure that we get a representative sample of a population.  We usually ask these questions at the end of the questionnaire.  An interviewer who can establish rapport at the beginning of the call is less likely to have a respondent refuse to answer these personal questions.

Sometimes it is too easy to establish rapport.  Some respondents love to talk.  Several years ago I worked on a survey in which we asked women about their experiences at the maternity center of a local hospital.  One woman went into great detail about childbirth, but was offended that I asked about her annual household income.  Some respondents want to talk about anything but the questions on the survey.  They view the call as an opportunity to get something off their chest.  The interviewer’s responsibility in such situations is to keep the call on track by continually bringing the respondent back to the questions.  This often requires tact and diplomacy, which is also part of establishing rapport.  

Friday, October 12, 2012

Neutrality in Political Polls




A Facebook friend who lives in Florida recently posted about a polling call that his wife answered.  The interviewer asked if she planned to vote for Barack Obama in the upcoming election.  When she said no, the interviewer said they hoped she would change her mind and ended the call.

I commented on my friend’s post that this did not sound like a legitimate poll.  It was really a canvassing call.  A real poll would not ask a leading question.  Where I work, we even go so far as to rotate the names of the candidates in a race so that half of the respondents hear the name of the Democratic candidate first, and half of the respondents hear the name of the Republican candidate first.  That is one way that we work to reduce bias in polling.

Respondents often ask our interviewers who is sponsoring a survey.  When we conduct political polls we often do not tell the interviewers the name of the client.  We tell them to tell respondents that interviewers are intentionally not told who is sponsoring a survey, so as not to bias or influence anyone’s answers.  We have done several polls for the Columbus Dispatch.  At the end of the call we ask respondents if a reporter from the Dispatch may call them to discuss their answers.  When we do these polls, we tell interviewers to tell respondents who is sponsoring the survey after they have completed the survey.

When I brief interviewers on a new telephone survey project I like to remind them that they are free to have opinions about candidates and issues, but to keep those opinions to themselves when interviewing respondents.  If an interviewer asks:

“If the election were held today, who would you vote for President of the United States, Barack Obama, the Democrat, or Mitt Romney, the Republican?”

OR

“If the election were held today, who would you vote for President of the United States, Mitt Romney, the Republican, or Barack Obama, the Democrat?”

The interviewer should not emphasize either choice.  If the interviewer asks a respondent if they agree or disagree with a political position, the interviewer needs to read the question in such a way that the respondent cannot tell if the interviewer agrees or disagrees with the position.  We occasionally have respondents ask interviewers how they feel about issues.  They are permitted to discuss their opinions only after the respondent has answered all of the survey questions, and then to keep it brief.  We remind interviewers not to say okay or uh-huh after a respondent answers a question.  These utterances can be interpreted as approval or disapproval of answers to questions.

As far as I know, CBS News, Gallup, Quinnipiac University and Rasmussen train their pollsters in maintaining neutrality.  These organizations have an interest in making sure that the data they collect is accurate.  The larger the phone room, the more difficult it may be to make sure that each interviewer is asking questions in a neutral manner.  Errors in sampling may also skew polling results slightly.  That is why a margin of error is always reported with survey results.  For the most part, however, polling organizations have neither a Democratic nor a Republican agenda.
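For readers who wonder where that margin of error comes from, here is the standard formula for a simple random sample, sketched in Python. The sample size of 800 is just a typical figure for a statewide poll, not a number from any particular survey:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% confidence margin of error for a simple random sample.

    Uses the worst case p = 0.5, which gives the widest interval;
    z = 1.96 is the standard normal value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(800)
print(f"+/- {moe * 100:.1f} percentage points")
```

This is why a poll of 800 voters is usually reported as accurate to within about 3.5 percentage points, and why doubling the sample size does not cut the margin of error in half.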

John C. Stevens
Saperstein Associates
(614) 261-0065

Friday, September 7, 2012

Robot Interviewers


A few years ago I conducted telephone interviews for a business-to-business market research project.  I called librarians to ask their opinions of a proposed new software product for libraries.  One woman told me very politely after I read the introduction that it sounded like an interesting topic to discuss and that she would not mind taking the time, but that she had recently agreed to do a telephone survey that turned out to be an obscene phone call.  She said that she was very sorry, but that she would have to decline.  I had been trained to overcome objections, but this was a new one for me.  I had to mark the response for that library as refused to participate and move on.

The very next day, on the same project, another woman told me that the survey I had called her about sounded interesting, but that she had recently agreed to do a survey that turned into an obscene phone call.  This time I was quicker on the uptake.  I asked the woman if we could have a female interviewer call her.  She told me “Well, since you asked that question, go ahead.”  I was able to complete the interview with her.  I guess some obscene callers have a thing for librarians.

This is one of many anecdotes that I need to keep in mind as I investigate the use of speech recognition technology to collect information for market research and public opinion research telephone surveys.  My motivation for this investigation is not to reduce payroll costs, although that is a consideration.  Nor is my motivation to eliminate the headaches involved in supervising human interviewers who either do not want to work or who do not pay attention to instructions.

My motivation for investigating the use of speech recognition technology for telephone interviewing is that I think robot interviewers could get more accurate information for our clients.  Some of the vendors of systems that use speech recognition technology for market research interviewing use this as a selling point.  They say that it is actually an advantage for a respondent to know that they are being interviewed by a robot because the respondent is more likely to give an honest opinion. 

I am thinking specifically about a survey we did earlier this year.  We called registered voters throughout Ohio to ask them their opinions of proposed legislation regarding animals.  The survey had questions about regulating the ownership of exotic animals, the treatment of chickens on factory farms, and whether the penalty for cockfighting should be a felony instead of a misdemeanor.  I wondered at the time whether people would give honest answers to these questions or if they would give what they considered to be socially acceptable answers.  Who is going to say that they are in favor of cockfighting?  A person might say that if they knew that their answers would be kept confidential and if they did not have to say it to a human interviewer.

We will most likely not use speech recognition technology for telephone surveys until the technology advances quite significantly.  The technology can now be used for simple surveys that have yes/no or agree/disagree questions.  It can skip a question based on an earlier answer when appropriate.  It can record answers to open-ended questions, but cannot probe those responses.  A robot interviewer would not know when and when not to ask “Why?”  The speech recognition systems used in customer service applications can understand what a customer is saying well enough to route a call to a human CSR and can even schedule a reservation, but cannot actually help a customer resolve a billing discrepancy.
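To show what I mean by skip logic, here is a toy example of a table-driven questionnaire. The question wording is invented for illustration; no real survey instrument looks exactly like this:

```python
# Each question names the next question to ask for each answer.
# This is the kind of branching an automated system can already
# handle without any human judgment.
SURVEY = {
    "q1": {"text": "Do you own any exotic animals?",
           "next": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "Would you support a permit requirement for owners?",
           "next": {"yes": "q3", "no": "q3"}},
    "q3": {"text": "Should the penalty for cockfighting be a felony?",
           "next": {"yes": None, "no": None}},
}

def run_survey(answers):
    """Walk the questionnaire, skipping questions as the logic dictates.

    `answers` maps question IDs to the respondent's yes/no answers.
    Returns the list of question IDs actually asked.
    """
    asked, q = [], "q1"
    while q is not None:
        asked.append(q)
        q = SURVEY[q]["next"][answers[q]]
    return asked

# A respondent who owns no exotic animals skips q2 entirely:
print(run_survey({"q1": "no", "q3": "yes"}))  # ['q1', 'q3']
```

What the table cannot encode is the judgment call a human interviewer makes when an open-ended answer deserves a follow-up probe, which is exactly the gap I described above.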

Another consideration as I investigate this issue is that researchers and their clients may rely less on telephone polling in the future.  Google claims that its Google Surveys can provide data that is statistically representative of a population.  If so, this will remove an obstacle to doing more research via the Internet.

These are my impressions of speech recognition technology so far.  I would like to find out if I am incorrect.  Future posts on this blog will cover what I learn about speech recognition technology and artificial intelligence, as well as successful interviewing techniques as they apply to concepts in market research and public opinion research.  I will also be looking at how big data can be used for market research and public opinion research, and more generally at how automation replaces human workers.  Please feel free to direct me to sources of information or share your own stories about interviewing.

John C. Stevens
jstevens@sapersteinassociates.com