Academic journal article Entrepreneurship: Theory and Practice

The Power and Effects of Entrepreneurship Research

Article excerpt

This study summarizes and analyzes average statistical power and effect sizes in empirical entrepreneurship research. Results show that statistical power was higher than expected, and was particularly high in studies employing archival measures. Statistical power has also increased over time. Effect sizes were higher than expected, a finding that remained consistent for different levels of analysis and across multiple sub-domains. We discuss these findings, compare them to related disciplines, and draw implications for the design of future studies.


Statistical power and effect sizes of empirical studies are important components of a discipline's research methodology (Maxwell, 2004). Attention to statistical power and effect sizes strengthens the statistical validity of empirical studies; neglecting them limits the conclusions that can be drawn from the research (Scandura & Williams, 2000). When statistical inference testing is the dominant mode of analysis, common practice calls for surveying and calculating the statistical power and average effect sizes of empirical research particular to the field (Chase & Chase, 1976; Maxwell; Rossi, 1990).

Reviews of statistical power and effect sizes have played important roles in informing and developing several social science disciplines such as marketing (Sawyer & Ball, 1981), accounting (Borkowski, Welsh, & Zhang, 2001), and psychology (Clark-Carter, 1997; Rossi, 1990). Reviews of this type have also been completed for a variety of subdisciplines within management, including international business (Brock, 2003), industrial/organizational psychology (Mone, Mueller, & Mauland, 1996), and strategy (Mazen, Hemmasi, & Lewis, 1987).

To date, however, statistical power and effect sizes associated with empirical entrepreneurship research have not been assessed and reported. This is an important omission in the entrepreneurship domain's published record for several reasons. First, a macro understanding of statistical power can point to where the field is and ought to be heading with respect to certain aspects of research design. Howard, Maxwell, and Fleming (2000) note that if aggregate statistical power is too low, the resulting body of evidence will contain results that appear contradictory (see also Boyd, Gove, & Hitt, 2005a; Ferguson & Ketchen, 1999). Conversely, if empirical research is characterized by high power, scholars may wish to consider the effect size of their research questions, because at high power even small effects occurring in the population can be statistically significant (Thompson, 2006).
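The dependence of power on both effect size and sample size can be made concrete with a short calculation. The sketch below is a generic illustration (not drawn from the article): it approximates the power of a two-sided, two-sample test under a normal approximation, where `d` is the standardized effect size (Cohen's d) and the function name is ours.

```python
from math import sqrt

from scipy.stats import norm


def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample test
    (normal approximation) for standardized effect size d."""
    z_crit = norm.ppf(1 - alpha / 2)
    noncentrality = d * sqrt(n_per_group / 2)
    # Probability of landing in either rejection region
    return norm.cdf(noncentrality - z_crit) + norm.cdf(-noncentrality - z_crit)


# Cohen's "medium" effect (d = 0.5) with 64 subjects per group yields
# roughly the conventional 0.80 power benchmark.
print(round(two_sample_power(0.5, 64), 2))

# At very large samples, even a "small" effect (d = 0.1) is detected
# with high probability, illustrating Thompson's (2006) point.
print(round(two_sample_power(0.1, 2000), 2))
```

The second call shows why high-powered designs force researchers to ask whether a statistically significant effect is also practically meaningful.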

Second, a review of statistical power allows comparison in this dimension of a focal domain to other domains within social science. As an academic discipline strives to establish itself as a unique entity, it is important to demonstrate that researchers are held to the same level of methodological rigor as colleagues in related disciplines (Harrison & Leitch, 1996). Statistical power is a critical methodological dimension on which scholars evaluate research involving statistical inference tests (Boyd, Gove, & Hitt, 2005b; Cohen, 1988). Hitt, Boyd, and Li (2004, p. 15) note that "inattention to statistical power is one of the greatest barriers to advancing the strategic management paradigm," and we suggest the same may be said of entrepreneurship.

Third, a review of effect sizes within entrepreneurship research can facilitate meta-analytic rationale in study design (Thompson, 2005). Scholars conducting statistical inference tests ought to explicitly invoke prior effect sizes for their research stream and relationships under consideration when planning their studies (Kline, 2004). This is not simply limited to replication studies; even groundbreaking research should be designed and placed in the context of the effects of prior related literature (Harris, 1991; Henson, 2006). …
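The planning practice described above amounts to an a priori sample-size calculation: take an effect-size estimate from the prior literature, fix the significance level and target power, and solve for the required n. A minimal sketch under a normal approximation follows; the effect size d = 0.3 is purely illustrative, and the function name is ours.

```python
from math import ceil

from scipy.stats import norm


def n_per_group(d, alpha=0.05, power=0.80):
    """Per-group sample size for a two-sided, two-sample test
    (normal approximation) at standardized effect size d."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)
    return ceil(2 * ((z_alpha + z_power) / d) ** 2)


# Suppose prior studies in the research stream report d around 0.3:
print(n_per_group(0.3))  # → 175 per group for 0.80 power at alpha = .05
```

Inverting the power relationship this way ties study design directly to the accumulated effect-size record of the field, which is what a review of effect sizes makes available.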
