Trustworthiness, Credibility, and Soundness: A Vision for Research in the Journal of Mental Health Counseling

Article excerpt

It is a privilege to have been named the new associate editor of research for the Journal of Mental Health Counseling (JMHC), and an honor to have the opportunity to work with my esteemed colleagues Dr. James Rogers and Dr. Heather Trepal. As I transition into my new role, I would also like to acknowledge the dedicated work of my predecessor, Dr. Mercedes Schneider.

During his time as associate editor of research, Dr. Rogers wrote, "Research is important to the identity, health, and vitality of the profession of mental health counseling" (2002, p. 195). Eight years later, his words seem more relevant than ever. Increasingly, professional counselors must justify the efficacy of their programs and demonstrate that they are using evidence-based best practices. They can no longer simply assert that their counseling is effective; they must provide data to demonstrate their efficacy with clients.

For the past three years I have served as an editorial board member for JMHC. During that time, I have been encouraged by the quality of the research submissions I have reviewed, and pleased by the increasing number of submissions that employ research methodology.

Although the quality of submissions has been generally encouraging, I have also noticed recurring problems. As incoming associate editor, I feel a need to address some of the more common challenges associated with conducting quality research and to provide basic guidelines for research-based submissions.

Suggestions for Quantitative Submissions

Dr. Schneider's editorial (2009) reviewed basic guidelines for quantitative submissions. I would add to her thoughts by expanding upon the idea of reporting significance in quantitative research. Although sound research is certainly important, it is equally important that the research and its results have meaning for the journal readership. In their Journal of Counseling & Development (JCD) editorial, Trusty, Thompson, and Petrocelli (2004) noted the discrepancy between those who typically publish in JCD (doctoral-level professionals) and the journal's intended audience (master's-level counselors). Because the primary audience for JMHC is master's-level counselors, I concur with the suggestions of Trusty et al., especially describing the significance of findings for counseling practice and presenting findings in ways readers can easily understand.

Potential JMHC contributors might find it helpful in designing and implementing credible quantitative research to consult these resources: Hancock and Mueller's recent publication (2010) provides detailed information on numerous quantitative methodologies; Granello's editorial (2007) in Counselor Education and Supervision provides general guidelines for the preparation of quantitative manuscripts; Prieto (2005) and Schneider (2009) each discuss creating credible research for publication in JMHC; and Trusty et al. (2004) provide guidelines for reporting effect size information.

Suggestions for Qualitative Submissions

I believe strongly that qualitative submissions grounded in strategies to ensure trustworthiness and credibility are also important. As a reviewer for JMHC, I have unfortunately encountered qualitative submissions that lacked those elements. As an instructor, I am often frustrated when I hear students express the mistaken belief that qualitative research is somehow easier because it does not involve statistical tests or mathematical ability. Those of us who identify as qualitative researchers take exception to the idea that our research is less difficult simply because it does not involve computation. True qualitative research requires the researcher to demonstrate a thorough understanding of the paradigms that guide it. For example, a common mistake in JMHC qualitative submissions is applying the constant comparison method of data analysis within methodologies other than grounded theory. …
