A number of issues affecting online research methods were considered for the design of the online study. This section focuses on how such issues affected this study and how they were addressed.
To ensure that participants had access to the required technology and the confidence to use it, an online study should be kept as easy as possible for respondents to access and complete (Dillman, 2000). The virtual environment of the survey should also be familiar to the respondents (Mann and Stewart, 2000). Pursuing these priorities combined the advantages of email- and web-page-based surveys. First, all communication between the researcher and the participants took place through text-based email messages, which was convenient for respondents because it required no facilities or expertise beyond those used in their day-to-day email communication. Second, a website containing a form was used to collect the survey data. This avoided typical problems of email-based surveys, such as respondents selecting several answers when only one choice is required, deleting questions accidentally, or altering their format (Mann and Stewart, 2000). It also provided a visually attractive interface that appeared identical to all respondents, was easy to complete and submit, and returned data in a completely predictable and consistent (coded) format, making automated processing and analysis by the researcher possible.
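Because every submission arrived in the same coded format, responses could be tallied mechanically. The sketch below illustrates the idea of such automated processing; the field names, answer codes, and sample rows are hypothetical, not those of the actual survey:

```python
import csv
import io
from collections import Counter

# Hypothetical sample of coded submissions; field names and answer
# codes are illustrative only, not the actual survey's.
sample = """respondent_id,group,q1,q2
001,adopter,4,2
002,non-adopter,3,5
003,adopter,4,1
"""

def tally(field, rows):
    """Count how often each answer code occurs for one question field."""
    return Counter(row[field] for row in rows)

rows = list(csv.DictReader(io.StringIO(sample)))
print(tally("q1", rows))  # Counter({'4': 2, '3': 1})
```

With answers constrained to fixed codes by the web form, this kind of frequency count needs no manual cleaning of free-text responses.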
Another challenge may arise if participants perceive the notification email as coming from an unknown sender (Faught et al., 2004). Although participants were familiar with this research and had previously agreed to collaborate further, earlier communication with them had been conducted by postal mail. It was therefore possible that the first email informing participants about the second phase of the fieldwork would seem unfamiliar, that their email clients would filter the message as “junk”/“spam” mail, or that they would simply delete the message before reading it. These problems are analogous to the “wastebasket problem” in mail surveys, and the researcher needed to be aware of this issue and work to avoid filters and the delete button. To mitigate this risk, the researcher’s university email system was used, so that the sender’s address identified a UK university and messages passed through the university’s mail server, increasing the reliability of message handling. Moreover, messages intentionally contained only text-based information, with no graphic elements or attachments, reducing the risk of some email clients blocking them as potentially dangerous.
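The text-only approach described above can be sketched with Python’s standard email library; the addresses, subject line, and wording below are placeholders, not the actual study correspondence:

```python
from email.message import EmailMessage

# Illustrative sketch only: addresses, subject, and body text are
# placeholders, not the actual study correspondence.
msg = EmailMessage()
msg["From"] = "researcher@university.ac.uk"
msg["To"] = "participant@example.com"
msg["Subject"] = "Invitation: second phase of the study"

# set_content() with a plain string produces a single text/plain part,
# with no HTML alternative, images, or attachments of the kind that
# spam filters commonly flag.
msg.set_content(
    "Dear participant,\n\n"
    "Thank you for agreeing to take part in the next phase of the study.\n"
)

print(msg.get_content_type())  # text/plain
print(msg.is_multipart())      # False
```

Sending such a message through the institution’s own mail server, rather than composing it as multipart HTML, keeps it in the plain-text form least likely to be filtered.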
7.4.1.1. An online survey
Prior to developing the web form and sending contact messages to the participants, a survey implementation strategy was devised.
First, a tracking document was created using spreadsheet software and included information about the participants, the messages sent to them, the messages received from them, and the overall progress of the survey.
Second, contact messages to be sent were prepared and produced using group mail software. These messages included an invitation letter to participate in the online study, a thank you message for completing the survey (to be sent individually or in small groups after receiving the responses), and a template for an apology message in the event of technical problems (to be modified and addressed individually in each case).
Third, participants were split into two groups for sending the contact messages: the pilot group (21), and the rest of the participants (130).
After these preliminary tasks, the web page containing the questionnaire was designed. Separate sets of questions were produced for the “adopter” and “non-adopter” groups. Once the contents were ready, the questionnaire was implemented in HTML following guidelines for good web design and principles for constructing and implementing web surveys. The research methods literature highlights that “Internet surveys need to be designed with the less knowledgeable, low-end computer user in mind” (Dillman, 2000, p. 377), and that it is important to design with both computer and questionnaire logic in mind: “Meshing the demands of questionnaire logic and computer logic creates a need for instructions and assistance, which can easily be overlooked by the designer who takes for granted the respondent’s facility with computer and web software. […] The building of such instructions takes on the same level of importance as writing the survey questions” (idem).
Some of the design principles for web surveys discussed by Dillman (idem) were applied to this web survey as follows:
• “Introduce the Web questionnaire with a welcome screen that is motivational, emphasizes the ease of responding, and instructs respondents about how to proceed to the next page” (p. 377). This was achieved by including welcoming, motivational, and instructional statements in the contact email, which also contained a link to the web address where the survey was located.
• “Provide a PIN number for limiting access only to people in the sample” (p. 378). Since the invitation to participate was sent privately to the participants, password-protected access was considered unnecessary; moreover, for ease of use, it was judged that a login process could complicate access to the survey or deter some respondents from completing it.
• “Present each question in a conventional format similar to that normally used on paper self-administered questionnaires” (p. 379). Questions and scales were reproduced in a similar way to a paper questionnaire, although taking advantage of colouring, layout, and shading features offered by HTML format.
• “Restrain use of color so that figure/ground consistency and readability are maintained, navigational flow is unimpeded, and measurement properties of questions are maintained” (p. 382). As mentioned above, HTML features were used, but colouring and bold type were applied sparingly so that they enhanced the readability of the questions. Moreover, all questions presenting a list of items against a scale were laid out in resizable tables to ensure that proportions and consistency were maintained.
• “Avoid differences in the visual appearance of questions that result from different screen configurations, operating systems, browsers, partial screen displays, and wrap-around text” (p. 385). All text used relative font sizes, so it could be enlarged or reduced with the text-size options available in visual browsers, and a flexible page layout allowed pages to resize automatically for different window sizes and screen resolutions, avoiding wrap-around effects regardless of the computer or software participants used to display the web survey.
• “Provide skip directions in a way that encourages marking of answers and being able to click to the next applicable question” (p. 394). This principle was applied several times by giving explicit instructions to click on a link that forwarded the respondent to the next applicable question (e.g. “please click here to go to next section”).
The final version of the survey web page is reproduced in Appendix 2, and was made available online for the duration of the study through the university web servers.