Book contents
- Frontmatter
- Contents
- Figures
- Tables
- Acknowledgements
- 1 Introduction
- 2 Basic statistics and probability
- 3 Basic issues in surveys
- 4 Ethics of surveys of human populations
- 5 Designing a survey
- 6 Methods for conducting surveys of human populations
- 7 Focus groups
- 8 Design of survey instruments
- 9 Design of questions and question wording
- 10 Special issues for qualitative and preference surveys
- 11 Design of data collection procedures
- 12 Pilot surveys and pretests
- 13 Sample design and sampling
- 14 Repetitive surveys
- 15 Survey economics
- 16 Survey implementation
- 17 Web-based surveys
- 18 Coding and data entry
- 19 Data expansion and weighting
- 20 Nonresponse
- 21 Measuring data quality
- 22 Future directions in survey procedures
- 23 Documenting and archiving
- References
- Index
17 - Web-based surveys
Published online by Cambridge University Press: 05 June 2012
Introduction
In the past few years the internet has emerged as an interesting mechanism by which to undertake surveys. Its appeal lies both in the low cost (no interviewers are needed, nor survey personnel to stuff envelopes; there are no postage costs; and data are recorded directly to computer storage, eliminating the need for manual data entry by survey staff) and in the relative ease with which the survey can be set up (Schonlau, Fricker, and Elliott, 2002). Web surveys, like postal surveys, have the advantage that respondents can choose their own time to fill out the survey, rather than feeling pressured to respond when an interviewer shows up at the door or calls on the telephone. Like computer-assisted surveys, Web surveys also offer the ability to build in various error-checking procedures, so that respondents can be asked to correct potentially conflicting responses or responses that are out of range.

Another distinct advantage of Web surveys is that complex skip patterns can be embodied within the survey, while their complexity is hidden completely from the respondent. In this respect, the Web survey shares an advantage with any interviewer-administered survey, in which a skilled interviewer can follow complex skips without the respondent being aware that answers to certain questions lead to certain other questions or segments of the survey. However, the Web survey is a self-administered survey, which in almost any other form would require guidance on how to skip certain questions and go to certain other questions. Because a specific respondent to a Web survey is never aware of the other possible questions that may be included in the total survey design but that were not relevant to this respondent, the Web survey has the advantage over other self-administered surveys that the survey task may appear much smaller than it would in a paper version.
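The hidden skip patterns and range checks described above can be sketched in code. The following is a minimal illustration, not taken from the book: the question identifiers, skip rules, and validator are invented for the example. Each question carries a rule that names the next question based on the answer, so the respondent only ever sees the questions relevant to them.

```python
# Hypothetical sketch of skip logic and range checking in a web survey.
# Question IDs and rules are invented for illustration.

def next_question(current_id, answer, skip_rules):
    """Return the id of the next question to show, applying any skip rule."""
    rule = skip_rules.get(current_id)
    if rule:
        return rule(answer)
    return None  # no rule: end of this branch of the survey

# Example rules: respondents who do not own a car never see the
# car-type question, and are routed straight to transit use.
skip_rules = {
    "Q1_owns_car": lambda a: "Q3_car_type" if a == "yes" else "Q4_transit_use",
    "Q3_car_type": lambda a: "Q4_transit_use",
}

def validate_age(value):
    """Range check of the kind a web survey can enforce at entry time."""
    return value.isdigit() and 0 < int(value) < 120
```

A respondent answering "no" to Q1 is routed directly to Q4 and is never aware that Q3 exists, which is the sense in which the complexity is hidden.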
However, on the negative side, the response rates to internet surveys are reported by several researchers as being lower than those of postal surveys, which is a matter of some concern, because postal surveys generally experience lower response rates than other methods (Jones and Pitt, 1999; Leece et al., 2004; McDonald and Adam, 2003). In their study, Jones and Pitt achieved a response rate of 19 per cent with an e-mail and internet survey, while achieving 72 per cent with a postal survey, with three reminders sent in each case. In the survey by Leece et al., the response rate by the internet was 45 per cent, while that of the postal survey was 58 per cent, a statistically significant difference. McDonald and Adam achieved a response rate of 46 per cent by post, but 21 per cent by the internet. The postal response rates of these three surveys are at the high end of postal response rates in general, suggesting that these were well-designed surveys. The internet response rates are thus quite low, at 19 to 45 per cent, with two of the three studies achieving internet response rates of only around 20 per cent. It should be noted that these surveys were all conducted more than five years ago and that internet connection speeds, computer capabilities, and penetration of the internet have all increased markedly since then. In current work (2010) by the author, the response rate to an internet survey is running at around 32 per cent, without deducting the number of potential respondents whose e-mail notifications proved to be incorrect and so did not reach the intended respondent. This suggests a slightly higher response rate to an internet survey, but still only in the same range as a postal survey.
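The point about deducting undeliverable e-mail notifications can be made concrete with a small calculation. The counts below are invented for illustration, not the author's data; they show how the same number of completed surveys yields a higher response rate once invitations that never reached anyone are removed from the denominator.

```python
# Illustrative arithmetic only: the counts are hypothetical, not drawn
# from the surveys cited in the text.

def response_rate(completed, invited, undeliverable=0):
    """Completed responses as a share of invitations that reached a respondent."""
    reached = invited - undeliverable
    return completed / reached

invited = 1000        # e-mail invitations sent
completed = 320       # completed surveys (about 32% before adjustment)
undeliverable = 100   # invitations that bounced and reached no one

raw = response_rate(completed, invited)                      # 0.32
adjusted = response_rate(completed, invited, undeliverable)  # about 0.356
```

Whether undeliverable invitations belong in the denominator is a reporting decision; the text's figure of around 32 per cent is the unadjusted (more conservative) form.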
Collecting, Managing, and Assessing Data Using Sample Surveys, pp. 385-400. Publisher: Cambridge University Press. Print publication year: 2012.