Academic journal article e-Journal of Business Education and Scholarship Teaching

Selecting a Free Web-Hosted Survey Tool for Student Use

Article excerpt

Introduction

The Internet provides scholars and students a channel through which to solicit information using web-based surveys. Gone are the days when knowledge of web authoring software, Hypertext Markup Language (HTML) and scripting software acted as an effective barrier against those interested in using online surveys. Today's online survey products present users with a remarkable array of features to help facilitate the design, implementation and analysis of survey data. The purpose of this study is to help marketing educators select a free web-hosted survey tool for student use by examining indirect (two website evaluation scales) and direct (web-hosted vendor ranking and student project completion times) measures. The benefits of this study are (a) offering students 24/7 access to manage their survey instrument(s) and respondent data from any computer, thereby engaging them in a higher order of online learning (National Center for Educational Statistics, 2005; Ronsisvalle & Watkins, 2005), and (b) offering instructors guidance on which web-hosted survey tool to use.

Literature Review

Studies addressing online populations make extensive use of online surveys, offering scholars new challenges in applying traditional survey research methods to the study of online behavior and Internet use (Andrews, Nonnecke, & Preece, 2003). There are upwards of 125 providers of online survey tools (Ajeebo, 2009; VCL Components, 2009), ranging from those offering an online survey as part of a larger suite of services (Moodle), to PC survey software (Lime Survey), to web-hosted survey software (e.g., Vovici, Snap Surveys). For studies addressing the advantages and disadvantages of online surveys, see Evans and Mathur (2005) or Van Selm and Jankowski (2006). In a nutshell, the obstacles facing online surveys center on sample frame specification, with the likelihood of ending up with a non-probability sample (Kay & Johnson, 1999; Wolter-Warmerdam et al., 2003), the difficulty of determining the response rate (Andrews et al., 2003; Manfreda et al., 2006), and the likelihood of lower response rates (33% versus 56%) for online versus paper surveys (Nulty, 2008). In contrast, online (versus paper) surveys enhance a project's efficiency by eliminating data entry and administrative chores and offering a 'just-in-time' service (Watt et al., 2002), as well as avoiding the need to administer surveys in class (Dommeyer et al., 2004), especially given the preponderance of web-based surveying for course and teaching evaluation (Seal & Przasnyski, 2001). Online surveys offer access to Internet-savvy samples (Aoki & Elasmar, 2000), large samples at low cost (Weible & Wallace, 1998) and rapid replies (Schmidt, 1997; Taylor, 2000). Access to large online samples offers a way to reduce sampling error (Babbie, 1990; Sills & Song, 2002), together with high statistical power and access to participants in geographically distant areas (Birnbaum, 2004). As such, online research is likely to grow in popularity and use as investigators devise methods to overcome its shortcomings.

Given that most technology performance failures are behaviorally based (Henderson & Divett, 2003), the Technology Acceptance Model (Davis, 1989; Davis & Venkatesh, 1996), developed specifically to predict who is most likely to accept new technology in a workplace environment, is applied to this study to answer the question 'which free web-hosted survey software would students most likely adopt?' The Technology Acceptance Model (TAM), based on the Theory of Reasoned Action (Ajzen & Fishbein, 1980), is the primary model used here to measure student adoption interest (how and when) in the online vendor-hosted technology. TAM has been used extensively over the past 20-plus years to explain user acceptance and use of technology (Chuttur, 2009), including evaluation of students' attitudes towards technology use for coursework (Edmonds, Thorpe & Conole, 2012). …
