Benchmarking 10 Major Canadian Universities at the Divisional Level: A Powerful Tool for Strategic Decision Making: Proulx Reports on the Continuing, Decade-Long Exchange of Data and Benchmarking among Canada's Most Research-Intensive Universities
Proulx, Roland, Planning for Higher Education
It has taken almost 30 years for universities to borrow from the corporate world and integrate the concepts, methodologies, and logistics of various quantitative and qualitative evaluative processes (such as evaluation, assessment, and total quality management [TQM]) into institutional planning. It has taken even more time--beginning circa 1980--for performance indicators, strategic planning, benchmarking, and ranking to gain broad acceptance. A rapid review of the recent history and evolution of benchmarking illustrates the exponential growth of its use to compare and rank universities: more than 40 countries (2) now have national or regional university rankings, including
* America's Best Colleges published by U.S. News & World Report (see, for example, U.S. News & World Report 2010).
* Maclean's University Rankings produced by the Canadian magazine Maclean's (see, for example, Dwyer 2008).
* The University Guide published by the Guardian in the United Kingdom.
* The CHE University Rankings produced by the Centre for Higher Education Development in Germany.
* Six other international university ranking and league table systems that compare and rank world universities, such as the Academic Ranking of World Universities produced by Shanghai Jiao Tong University in China and the World University Rankings edited by The Times Higher Education Supplement in the United Kingdom.
Very clearly, a change has occurred in university culture: benchmarking is now widely used throughout the world.
This cultural innovation necessarily has affected university institutional research activities. At one time, institutional research offices simply produced facts and figures that were collected and published as a "fact book," primarily for descriptive purposes. Starting in the early 1980s, data and metrics began to be related to other purposes such as quality improvement, strategic planning, and accountability. These data were then compared to metrics produced by peer institutions. Benchmarking has since contributed to more policy-oriented institutional research studies and has demonstrated the rich possibilities for the use of data analysis and reporting.
It was in this context that a consortium of 10 Canadian research-intensive universities launched a data exchange program in 1999 to share information that could be used to identify and evaluate the best practices of each institution and to help each institution position itself strategically to achieve its mission. One part of the program was devoted to collecting departmental-level academic data (instructional and financial) from these 10 institutions.
This project built on two previous studies by the consortium that were experimental and limited in focus. In 2001-2002 and 2002-2003, data for six and 12 academic departments, respectively, were collected. In 2003-2004, the goal was more comprehensive: between 30 and 35 academic departments (figure 1) were benchmarked using 24 variables (figure 2) in comparisons based on selected indicators. This article presents the data from 2003-2004 as a case study to illustrate the purpose and methodology (process, variables, indicators, and ratios) of benchmarking. In addition, the article presents the results of this exercise and describes the multiple uses made of the data generated by the program.
Figure 2 Variables and Definitions

Section 1--Faculty

* FTE Tenured/Tenure-Track Faculty: Full-time and part-time (converted to FTE) tenured and tenure-track faculty from all funding sources. Filled positions only. Joint appointments must be prorated. Individuals with duties outside the department, such as vice presidents, deans, and associate deans, should be excluded for the duties they assume outside the faculty/department but prorated for their work within the faculty/department. The count should exclude non-tenured/tenure-track staff. Sabbaticals and leaves of absence should be included.
* Non-Tenured/Tenure-Track Full-Time Faculty: Non-tenured/tenure-track full-time faculty from all funding sources. Individuals with duties outside the department, such as vice presidents, deans, and associate deans, should be excluded.
* Other Teaching in $: Total annual expenditures on all part-time non-tenured/tenure-track faculty from all funds. Excludes full-time staff and teaching assistants. If $ are to be converted to FTEs, the stipends for half courses (3-credit courses) are to be reported on a separate sheet.
* Teaching Assistants in $: Total annual expenditures for teaching assistants from all sources.
* Average Hourly Rate for Teaching Assistants: Preferably the departmental average hourly rate for teaching assistants.

Section 2--Students (students taught by the faculty, whether registered in the programs of the department or in programs outside the department)

* Undergraduate FTE Students: Based on the fiscal year (summer-fall-winter). One FTE = 30 credit hours, 10 half courses, or 5 full courses.
* Master's, Diploma, Doctorate (FTE): Based on fall headcount of degree registrants in programs administered by the department. FTEs = full-time headcount + (part-time headcount / 3).

Section 3--Staff

* Operating Funded Staff (FTE): FTE staff paid from the operating fund. Excludes leaves of absence unless paid.
* All Other Staff $: Total annual expenditures on all other staff not included above, from all funds. Research assistants are excluded and not reported. Postdoctoral fellows are excluded but are reported for information on a separate sheet.

Section 4--Degrees Awarded (According to Institution's Annual Count)

Undergraduate (baccalaureate), professional (e.g., MD, VMD, OD, PharmD), master's (research and professional), doctorate, graduate diploma/certificate.

Section 5--Total Direct Operating Expenses: Expenditures from the General Purpose Operating Funds Incurred Only by the Department

Each of the following should be determined by actual $ spent, taken from financial statements rather than budgetary documents:

* Faculty Salaries
* Staff Salaries
* Teaching Assistant Salaries
* Benefits (central benefits must be estimated)
* Other Expenses
* Cost Recoveries
* Total Divisional Expenses: Total of the six subcategories above.

Section 6--Total Three-Year Average Research Expenditures (Operating)

From financial statements (actual $ spent during the time period). Capital and equipment, Canada Research Chairs, and Canada Foundation for Innovation (CFI) funds should be included. Mention should be made if the figure is not a three-year average.
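The FTE conventions defined in sections 1-3 of figure 2 reduce to simple arithmetic. A minimal sketch (the function names and sample figures are illustrative, not part of the G10DE templates):

```python
def undergraduate_fte(credit_hours):
    """One undergraduate FTE = 30 credit hours (10 half courses or 5 full courses)."""
    return credit_hours / 30

def graduate_fte(full_time_headcount, part_time_headcount):
    """Graduate FTEs = full-time headcount plus one-third of the part-time headcount."""
    return full_time_headcount + part_time_headcount / 3

# A department teaching 5,400 undergraduate credit hours, with 40 full-time
# and 18 part-time graduate registrants at the fall count:
print(undergraduate_fte(5400))  # 180.0
print(graduate_fte(40, 18))     # 46.0
```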
History and Purpose of the Benchmarking Initiative
In 1999, an informal group of presidents from Canada's 10 most research-intensive universities (3) voluntarily agreed to create a data exchange consortium to be known as the Group of Ten Data Exchange (G10DE). The G10DE comprised the directors of the institutional research offices from each of the G10 institutions and was modeled after Australia's Group of Eight (Go8, see www.go8.edu.au/about-us) and the Association of American Universities Data Exchange (AAUDE, see www.utexas.edu/academic/ima/aaude). The purpose of the G10DE is to support the presidents in the development and enhancement of the distinctive missions of the member institutions through the formal and informal exchange of data sets and information.
Among the many reasons for creating the G10DE and its various programs, three were particularly important:
* There was a paucity of common data on Canadian postsecondary institutions for use in benchmarking comparable institutions. The data that were available were centrally produced by Statistics Canada (Canada's national statistical agency) and were considered to be of poor quality both then and, according to the Institute for Higher Education Policy (2009), now.
* The 10 universities were dissatisfied with the Maclean's University Rankings, which they believed used inadequate and unreliable criteria to assess and rank medical/doctoral universities.
* The success of The Delaware Study of Instructional Costs and Productivity (Middaugh 2001; Middaugh, Graham, and Shahid 2003) provided a model analytical tool to use in benchmarking teaching workloads, instructional costs, and productivity by academic discipline, which the G10DE could eventually have used to create its own program.
The G10DE program produced three sets of data. First, since 2000 the group has produced the annual "President's Fact Book" for institutional-level data, which included information about students and their progress, teaching staff, institutional resources, libraries, capital resources, and research revenues and allowed each participating institution to compare data and performance from an institutional perspective. Second, the G10DE provided, by broad field disciplinary divisions, regularly updated comparative data on rates of completion and time to completion of graduate students (known as the "G10/G13 Data Exchange Time to Completion Study"). (4) Finally, in October 2003 the G10DE produced an analysis of the data of the three federal granting councils (known as the "Federal Granting Council Research Data Analysis [Phase I]"). All studies conducted and published by the G10DE are restricted to member institutions.
All three studies described above produced data sets at the institutional and field-study levels. However, the G10DE's growing experience in benchmarking institutional data sets convinced the participating presidents that comparisons between academic peer groups should be extended to comparable disciplines, departments, and programs. They insisted on completing a benchmarking exercise and producing reports at the departmental level. The Delaware Study (Middaugh, Graham, and Shahid 2003) pointed in that direction and offered a successful example of benchmarking based on academic discipline.
Subsequent research confirmed that departmental reporting from peer institutions offers the most and, indeed, the only reliable, customized means of comparison to allow performance analysis and the finding of best practices (Burke 2005; Skogstad 2003). The evolution of the various ranking systems has since confirmed the validity and robustness of benchmarking at the subinstitutional level:
University rankings in Germany and Italy are now being done at the field-of-study level; in both the Academic Ranking of World Universities (by Shanghai Jiao Tong University) and the World University Rankings (by The Times Higher Education Supplement), it is now possible to disaggregate rankings by broad fields of study. Within Canada, while almost no data is publicly available at the field-of-study level, the G13 [formerly G10] group shares an astonishing wealth of such indicators. Indeed, the effort that the G13 has made to ensure the complete comparability of the data is remarkable, and, for that reason, its success--and also the indicators it uses--would make an excellent starting point for a common Canadian data set (Junor, Kramer, and Usher 2006, p. 11).
The Framework for Departmental-Level Benchmarking
When the presidents of the 10 universities decided to move forward with benchmarking, (5) they created a formal study group to guide, direct, and monitor the work. Each university appointed one representative, typically chosen from the Office of Institutional Research and/or the Planning Office, to serve primarily as a liaison. The 10 representatives elected a chair and a co-chair and appointed a central inter-institutional caretaker who would serve as the project manager of the program.
The study group was tasked with developing guidelines and a protocol and providing an annual report to the presidents. The benchmarking program required a rigorous framework comprising the program goals; the comparison of all institutional and divisional data concerning missions and areas of study; the development of commonly agreed-upon definitions of terms to support valid comparisons across institutions; the choice of the standard study areas (disciplinary areas, departments, or programs); and the collection of consistent, reliable, and fully validated data. The result of this work was a report titled "Instructional and Financial Data at the Divisional Level."
External collaborative benchmarking and information disclosure protocol. The G10DE clearly identified two major goals: to share information on the best practices of each institution and to help each participating institution position itself strategically to achieve its mission. It follows, therefore, that the G10 benchmarking exercise can be defined as an example of external collaborative benchmarking (i.e., between institutions that are not viewed as competitors).
Such a definition has a considerable impact on the information disclosure protocol signed by all participating institutions. The protocol reads, in part:
The results of the research produced by the G10DE are primarily intended for three purposes:
1. advocacy representing the best interests of all member institutions
2. internal management and decision making purposes
3. benchmarking for the benefit of governing boards of member universities with a mandate to provide stewardship and accountability for the institutions they govern.
The overarching principle in interpreting these guidelines will be that members of the G10 and the G10DE will act in the best interests of all members of the group. The statistics and research results will not be used for promotional purposes of any single university. A primary consideration in all issues of disclosure of statistical information will be the avoidance of public comparisons which could damage the reputation of a member institution. (G10DE Information Disclosure Protocol 2003, p. 1)
Steps in the benchmarking exercise. Given the nature of the G10DE program and the purpose and complexity of the benchmarking exercise, much discussion was required to clarify and define the measures to be collectively approved by the study group and subsequently used by each individual institution. The development of the benchmarking exercise entailed nine steps and took almost three years to complete.
1. Identify the G10DE consortium members. This preliminary step focused on the identification and choice of peer institutions. This is a basic step in any benchmarking exercise (Teeter and Christal 1986-1987). It was easy in this case due to the nature of the agreed-upon exchange program, which was built on the relationships among 10 institutions sharing similar missions, visions, and degree programs.
2. Select the variables. The variables and the corresponding data upon which the peer institutions agreed to be compared had to reflect core academic criteria, refer to current institutional databases or to other public non-university databases (whether governmental or private), and be accessible for definition purposes (Proulx 2007). Each variable was discussed and tested for its academic relevance and accessibility. Twenty-four academic variables were selected and organized under six criteria that represented core academic features: faculty, students, staff, degrees, operating expenses, and research expenditures (see figure 2).
3. Develop a rigorous set of definitions. As the author notes in his discussion of benchmarking,
The establishment of clear, coherent, and operational definitions is fundamental, is time-consuming, and cannot be achieved without complete collaboration. From our experience and understanding of the benchmarking process we believe that it is only through a back-and-forth process that definitions may be tested to ensure that they harmonize with the institutional definitions, refer to available data that could be verified, and be agreed upon by all peer institutions. (Proulx 2007, p. 78)
In the case of the G10DE, testing and refining the workability of each definition was particularly arduous and took most of the time dedicated to the refining process.
4. Test the methodology. Months of discussion and pilot testing were required to ensure that the methodology worked. The methodology was tested first with six and later with 12 academic departments representing different fields of study. This step ended with the selection of 24 variables and 30 academic departments. Expanding the number of academic units and disciplinary fields covered helped to test the coherence and workability of all the definitions.
5. Identify the key performance indicators. The 24 variables produced provided basic information for comparative purposes. Other G10DE comparative studies, such as those that analyzed time to completion by broad disciplinary field and the research data of the federal granting council, provided additional relevant information. But the identification of key performance indicators at the departmental level that would allow institutions to benchmark their own performance was considered by the G10DE study group and the presidents to be the major operational tool needed for strategic thinking and planning (Alstete 1995; Vlasceanu, Grunberg, and Parlea 2004). Consequently, selecting the 10 key performance indicators (figure 3) was viewed as a requirement for completing the ongoing planning processes within the various academic departments (Bryson 1995; Middaugh 2001; Norris and Poulton 2008). Of course, given the data collected for the 24 basic variables, additional key performance indicators could have been produced according to the needs and strategic goals of the member institutions.
6. Select the academic departments/programs to be benchmarked. A list of the departments and programs offered by at least five of the 10 universities was circulated among the institutions. Each had to choose the disciplinary divisions on which they could and would report. A template sorting the potential divisions was sent to each university for review and decision. Figure 1 presents the list of all disciplinary fields, departments, and sections (using the Classification of Instructional Programs [CIP] code) that were offered by at least five member universities and notes the divisions that had been accepted in the previous rounds and those the group would unanimously add or exclude.
7. Collect data using templates. Templates listing all divisions accepted by the G10 and the 24 agreed-upon variables were sent to the 10 institutional caretakers for completion. The interinstitutional caretaker received the completed template from each university and developed a preliminary analysis, which included testing the discrepancies and anomalies that appeared at first sight or that needed explanation. The need for a contextual explanation led to adding a section that would capture the main features of the departmental profile: students, degrees, budgets, and faculty (see figure 4, section 8).
8. Review and correct the data. Once the data were gathered by the interinstitutional caretaker, a Microsoft Excel file comprising the data for all variables for each department and for each institution was sent to each institutional caretaker for validation. Each caretaker was invited to consult with the Office of Institutional Research at his or her university. The deans and chairs of the academic departments were asked to validate both their own data and the data of those same departments at other universities with which they were familiar. Comparing the data (as well as the set of indicators and the disciplinary picture defined by the profile section) among institutions helped detect both anomalies and suspect data and thus provided additional validation. The interinstitutional caretaker gathered the comments and suggestions for corrections reported by the 10 institutional caretakers.
9. Produce a final report. Once all corrections, additions, and suggestions were sent to the interinstitutional caretaker and reflected in appropriate modifications to the data, the caretaker produced a final report. The report included eight sections: data (sections 1-6), key performance indicators (section 7), and departmental profiles (section 8). Figure 4 illustrates for one department the data reported by the 10 universities. (6)
Using the G10DE Data
Member institutions participating in the G10DE benefit from the availability of comparable data, metrics, profiles, and performance indicators that can be used to evaluate academic practices and to identify better or best practices to help the institution improve. As noted, the users of the data understand that the identified practices are not necessarily absolute, ultimate examples or patterns, but are instead the identified best approaches to specific situations, as institutions and programs obviously vary greatly in constituency and scope. In this study, instructional and financial data at the divisional level were gathered to help participating institutions enhance the quality of their departmental curricula, support the periodic assessment and evaluation of departments, facilitate strategic departmental planning, and promote internal and external accountability.
Example: searching for a better or best practice. To search for a better or best practice, a department at one university may use comparable data to analyze its performance relative to the same department at other universities. To illustrate, six performance indicators (PIs) were selected from figure 4, section 7, to create figure 5. Figure 5 presents six PIs for Department X at the 10 universities, along with the group average and a minus/plus (-/+) 20 percent range around that average.
To facilitate the analysis and identification of best practices, the PIs are presented in an ordered sequence (Vlasceanu, Grunberg, and Parlea 2004):
* input value (instructional cost per FTE student)
* throughput values (FTE students taught/full-time faculty; other teaching $/full-time faculty)
* output values (percentage of graduate students; graduate degrees/graduate FTEs; research expenditures $/tenure-track faculty member).
The analysis of these six PIs may lead to suggestions of best practices and indicate directions for further investigation by identifying relevant questions.
If Department X at University 2 is interested in improving (increasing) its graduate degrees/graduate FTEs ratio (an outcome improvement), it can compare its ratio with those of the nine other universities' departments. This comparison suggests two possible causes: too many graduate students admitted and/or too few graduate students completing degrees. Although the percentage of graduate students in Department X at University 2 is lower than the group average, it is within the -/+ range, so too many graduate students may not be the problem. However, it is notable that the departments with higher graduate degrees/graduate FTEs ratios also have larger research expenditures/tenure-track faculty member. It could be concluded that departments with more research support may be better able to support their graduate students through the degree process, thereby reducing dropout rates and resulting in more completed degrees. The overall performance of departments at Universities 1, 4, 5, 6, 7, and 9 supports this initial conclusion.
This tentative conclusion must be verified by studying other indicators, supplemented by qualitative measures such as the department's established reputation. If further investigation supports the initial conclusion, University 2 can consider two strategic responses. One response is to reduce the number of graduate students. This may not be a good solution since Department X at University 2 already has one of the lowest percentages of graduate students.
Another response is to create incentives for departmental faculty to more actively seek external research funding that can provide financial support for graduate students through the completion of their degrees. This can be achieved through two reinforcing strategies. First, the department can recruit additional faculty members who will actively seek external research funding. Second, noting that the measure FTE students taught/full-time faculty is relatively high, the department can hire lecturers and part-time faculty to alleviate the comparatively heavy teaching loads to give current faculty more time to write research grant proposals.
Similarly, if Department X at University 10 is interested in reducing the FTE students taught/full-time faculty ratio, it can review the PIs for the other nine departments. The departments at Universities 3, 4, 5, and 7 have the most advantageous FTE students taught/full-time faculty ratios. Of these four universities, only University 3 has a lower instructional cost per FTE student, falling near the -20 percent range, while Universities 4, 5, and 7 have much higher instructional costs that fall outside the +20 percent range. While University 10 may be able to achieve its strategic goal by reducing the number of students admitted and/or increasing the number of faculty members, there is a potential negative side effect: the probability of increased instructional costs per FTE student. This negative effect might be offset to some degree by improving faculty productivity through online teaching or other technology-aided methods of delivering instruction. Again, further investigation will be required before strategic planning decisions can be made.
How benchmarking data can be used. Conducting a benchmarking exercise creates expectations that the findings will be used. Some of the most common ways in which benchmarking findings are or could be used are described below, although many of these examples have never been publicized due to the privacy guidelines of the G10DE protocol.
* Identify problem areas, diagnose weaknesses and strengths, and formulate plans to improve quality (Taylor and Massey 1996). It is noteworthy that benchmarking has become a powerful environmental scanning tool used in strategic planning. Departmental planning by the G10 institutions has systematically used the data and indicators listed in the G10DE.
* Support the periodic assessment and evaluation of academic departments through internal and external benchmarking exercises and peer review. Internal data must be complemented by comparisons with peer departments. For example, the University of Montreal systematically uses the data and indicators of the G10DE program in all of its departmental peer review processes.
* Inform budget requests and reallocate resources to better meet departmental needs and expectations or to justify financial investment.
* Secure funding from university administration or external funding agencies.
* Enhance productivity through improvements in tracking, graduation rate, the quality of the learning process, the balance between undergraduate and graduate students, the number of teaching assistants, and research productivity. For example, the University of Western Ontario (2007) used many of the G10DE indicators in the document Performance and Activity Indicators, the university's annual report to the Board of Governors. The G10DE data was also used by the University of Toronto to produce its institutional plan, Performance Indicators for Governance (University of Toronto 2008).
* Facilitate departmental strategic and operational planning by helping departments position themselves strategically and articulate goals and targets for ongoing planning and budgetary processes. For example, the Faculty of Arts and Sciences of the University of Montreal (27 departments) has established its strategic plan using, among other sources, the divisional-level financial and instructional data from the G10DE program.
* Identify areas for additional research and benchmarking and explore further contacts and information. Bibliometric data and indicators (i.e., publications, citations, field and journal publication impact) are being explored with the collaboration of Observatoire des Sciences et des Technologies (OST, see www.ost.uqam.ca/) at the University of Quebec (Lacroix 2007).
* Render accountability, both internally and externally. Both the University of Montreal and the University of Western Ontario have produced accountability reports using the G10DE data and indicators.
It will take several years to determine the full impact of the G10 data exchange, including whether it has contributed to changes that enhance performance at each participating institution.
Concluding Remarks: Lessons Learned
The G10 exchange program on instructional and financial data has resulted in consistent and reliable benchmarks that have been used--and will continue to be used--in diverse and creative ways to guide institutional planning and improvement. The 2003-2004 study discussed in this article was the first complete benchmarking study at the departmental level produced in Canada.
There are a number of lessons to be learned from the Canadian benchmarking process:
* The fundamental question--what should be measured?--should be carefully explored before undertaking a benchmarking program (CHERI/HEFCE 2008). The key to developing effective benchmarking measures is to choose peer institutions that share the same vision and use comparable measures aligned with strategic objectives. The answer to this question affects the choice of the data set (e.g., basic academic data), the appropriate definitions (e.g., tenured/tenure-track faculty), and the data sources (e.g., departmental budgetary report).
* The establishment of clear and coherent operational definitions is fundamental to success. This is a time-consuming process that requires the development of effective collaborative partnerships supported by top leadership at the participating universities. Refining the definitions related to the G10 exchange program took two years and two pilot studies to complete. The process involved a sequence of tests to verify that the definitions were in harmony with those of the individual institutions and to obtain final agreement from all 10 institutional participants.
* The validation of data is crucial since this ensures the accuracy and reliability of the process. In the case of the G10 exchange program, this involved an initial validation by the G10 interinstitutional caretaker followed by a second validation by each institutional caretaker, who was responsible both for checking the accuracy of the data entered within the overall template by the G10 caretaker and for confirming the validity of the data reported. The interinstitutional comparison of data and the production of sets of graphs for each of the 24 variables for each department helped considerably to detect invalid data and trace anomalies visually.
* Benchmarking is a valuable management tool only if the benchmarking partners understand the data and what they mean. Without proper definitions, comparisons become a "Tower of Babel." In addition, data (particularly performance indicators) must be viewed in context to be used appropriately. Data are like the pieces of a puzzle, with each piece contributing to the emergence of the full picture--the full understanding of how to effect desired changes. Also, although data and indicators point toward a particular direction, they are not static norms or standards, but rather a dynamic set of references or pointers to the best practices and processes.
* Very little benchmarking at the departmental level has been attempted internationally; the variables, data, and indicators used in the present study apply mainly to North American universities, and those used by the G10DE refer mostly to national criteria. However, an emerging trend involves comparing national and world universities not only at the institutional level but also at the level of disciplinary fields with comparable profiles and practices. For example, The Times Higher Education Supplement (2007), the Academic Ranking of World Universities (Shanghai Ranking Consultancy 2008), and the Centre for Higher Education Development (Zeit Online n.d.) now rank universities by broad disciplinary field. In 2009, the Academic Ranking of World Universities (see www.arwu.org/#) completed a new ranking by subject field (chemistry, physics, mathematics, computer science and engineering, and economics). Some of the performance indicators from the Canadian benchmarking exercise could be used in such international comparisons. The six world university ranking and league table systems, which focus mainly on bibliometric measures, might suggest the development of a more global benchmarking program that adds indicators on publications, citations, and publication impact at the field/departmental level. The Canadian firm Research Infosource (2008), which has already refined its ranking methodology, works along these lines, and the Observatoire des Sciences et des Technologies is working with what is now the G13DE to help member institutions meet the bibliometric challenge at the departmental level (Lacroix 2007).
* The experience of the G10DE benchmarking exercise indicates very convincingly that ranking, comparing, and measuring academic peer groups at any level must include comparable departments and programs. Michael Batty (2003) is probably right to assume that if an entire university is world-class, then at least half of its departments must be equivalently ranked. It has become clear that universities worldwide must be compared at the subinstitutional level.
* One of the methodological problems encountered in the world university ranking systems is the absence of a rigorous benchmarking typology. Lang and Zha (2004) rightly point out that "league tables and rankings [are] the most evident and accessible manifestations of benchmarking" (p. 1). What is ranking if not the result of a benchmarking exercise that is standardized, weighted, and sorted in rank order (Vlasceanu, Grunberg, and Parlea 2004)? Benchmarking must then be considered a necessary and structural step in an overall ranking process. Its absence has been, in the opinion of the author, one of the major problems encountered in ranking systems and procedures at all levels: regional, national, and international (Proulx 2007). The G10DE benchmarking typology could serve as a reference for future work (Proulx 2007).
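The visual anomaly detection described in the validation step above can also be approximated in code. The following is a minimal, hypothetical sketch (the function name, data values, and 50 percent threshold are illustrative, not part of the G10DE procedure) that flags departmental values deviating sharply from the peer median:

```python
from statistics import median

def flag_anomalies(values, tolerance=0.5):
    """Flag departmental values that deviate from the peer median by more
    than the given relative tolerance (a hypothetical 50% threshold)."""
    m = median(values.values())
    return {univ: v for univ, v in values.items()
            if m and abs(v - m) / m > tolerance}

# Hypothetical research-expenditure-per-faculty figures; "U4" contains a
# plausible data-entry error (a dropped digit) that the check should surface.
data = {"U1": 260828, "U2": 175575, "U3": 288837, "U4": 28419}
print(flag_anomalies(data))  # {'U4': 28419}
```

In practice, a flagged value would be returned to the institutional caretaker for verification rather than corrected automatically.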
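The claim that a ranking is simply a benchmarking exercise that has been standardized, weighted, and sorted can be made concrete. A minimal sketch follows, with hypothetical indicator values and weights; min-max standardization is used here as one common choice, not necessarily the method of any particular league table:

```python
def rank_institutions(indicators, weights):
    """Produce a ranking from benchmarking data: standardize each indicator
    to a 0-1 scale (min-max), apply weights, sum, and sort in rank order."""
    keys = list(weights)
    lo = {k: min(d[k] for d in indicators.values()) for k in keys}
    hi = {k: max(d[k] for d in indicators.values()) for k in keys}

    def score(d):
        return sum(weights[k] * (d[k] - lo[k]) / (hi[k] - lo[k]) for k in keys)

    return sorted(indicators, key=lambda u: score(indicators[u]), reverse=True)

# Hypothetical indicators and weights for three institutions
data = {
    "A": {"research": 90, "teaching": 60},
    "B": {"research": 70, "teaching": 95},
    "C": {"research": 50, "teaching": 50},
}
print(rank_institutions(data, {"research": 0.6, "teaching": 0.4}))  # ['B', 'A', 'C']
```

Note how the weighting, not the raw data, decides the order of A and B; this is precisely why a ranking without an explicit benchmarking typology can mislead.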
This article has described a process that was used successfully by 10 Canadian research universities to develop collaborative data sharing. This approach to benchmarking can provide the basis for more innovative approaches. The concepts, definitions, and methodology developed here may be further refined and validated, the goals further clarified, the indicators more carefully chosen, the selection of peer universities revisited, and the whole process improved.
Alstete, J. W. 1995. Benchmarking in Higher Education: Adapting Best Practices to Improve Quality. ERIC Digest (ED402800). Retrieved March 22, 2010, from the World Wide Web: www.ericdigests.org/1997-3/bench.html.
Batty, M. 2003. World Class Universities, World Class Research: What Does It All Mean? Environment and Planning B: Planning and Design 30 (1): 1-2.
Broderick, M., and M. England. n.d. A Study on Completion Rates and Time to Completion of Graduate Students: Methodology Adopted by the G10 Data Exchange. Retrieved March 22, 2010, from the World Wide Web: aaude.org/documents/public/case-studies/2004-g10-ttd.ppt.
Bryson, J. M. 1995. Strategic Planning for Public and Nonprofit Organizations. Rev. ed. San Francisco: Jossey-Bass.
Burke, J. C. 2005. Closing the Accountability Gap for Public Universities: Putting Academic Departments in the Performance Loop. Planning for Higher Education 34 (1): 19-28.
Centre for Higher Education Research and Information and Higher Education Funding Council for England. 2008. Counting What is Measured or Measuring What Counts? League Tables and Their Impact on Higher Education Institutions in England. Bristol: Higher Education Funding Council for England. Retrieved March 22, 2010, from the World Wide Web: www.hefce.ac.uk/pubs/hefce/2008/08_14/#exec.
CHERI/HEFCE 2008. See Centre for Higher Education Research and Information and Higher Education Funding Council for England.
Dwyer, M. 2008. Our 18th Annual Rankings. Macleans.ca, December 19. Retrieved March 22, 2010, from the World Wide Web: oncampus.macleans.ca/education/2008/12/19/our-18th-annual-rankings/.
G10DE Information Disclosure Protocol. 2003. Preamble. Unpublished document.
Institute for Higher Education Policy. 2009. Issue Brief: Impact of College Rankings on Institutional Decision Making: Four Country Case Studies. Washington, DC: Institute for Higher Education Policy. Retrieved March 22, 2010, from the World Wide Web: www.ihep.org/assets/files/publications/g-l/ImpactofCollegeRankings.pdf.
Junor, S., M. Kramer, and A. Usher. 2006. Apples to Apples: Towards a Pan-Canadian Common University Data Set. Virginia Beach, VA/Winnipeg, MB: Educational Policy Institute. Retrieved March 22, 2010, from the World Wide Web: www.educationalpolicy.org/pdf/Apples_to_Apples_CommonDataSet_061027.pdf.
Lacroix, R. 2007. Bibliometric Evaluation of G-13 Universities Departments. Paper presented at the Observatoire des Sciences et des Technologies 10th Anniversary Colloquium, Montreal.
Lang, D. W., and Q. Zha. 2004. Comparing Universities: A Case Study Between Canada and China. Higher Education Policy 17 (4): 339-54.
Middaugh, M. F. 2001. Understanding Faculty Productivity: Standards and Benchmarks for Colleges and Universities. San Francisco: Jossey-Bass.
Middaugh, M. F., R. Graham, and A. Shahid. 2003. A Study of Higher Education Instructional Expenditures: The Delaware Study of Instructional Costs and Productivity (NCES 2003-161). Washington, DC: U.S. Department of Education, National Center for Education Statistics. Retrieved March 22, 2010, from the World Wide Web: http://nces.ed.gov/pubs2003/2003161.pdf.
Norris, D., and N. Poulton. 2008. A Guide to Planning for Change. Ann Arbor, MI: Society for College and University Planning.
Proulx, R. 2007. Higher Education Ranking and Leagues Tables: Lessons Learned from Benchmarking. Higher Education in Europe 32 (1): 71-82.
Research Infosource. 2008. Canada's Top 50 Research Universities. Retrieved March 22, 2010, from the World Wide Web: www.researchinfosource.com/media/2008-top50-sup.pdf. (Lists from 2001 to 2007 are also available at www.researchinfosource.com/top50.shtml.)
Shanghai Ranking Consultancy. 2008. Academic Ranking of World Universities-2008. Retrieved March 22, 2010, from the World Wide Web: http://www.arwu.org/ARWU2008.jsp.
--. 2009. Academic Ranking of World Universities Ranking Resources. Retrieved March 22, 2010, from the World Wide Web: www.arwu.org/resources.jsp.
Skogstad, E. 2003. Using Benchmarking Metrics to Uncover Best Practices. APQC white paper. Houston, TX: American Productivity and Quality Center.
Taylor, B. E., and W. F. Massey. 1996. Strategic Indicators for Higher Education 1996: Vital Benchmarks and Information to Help You Evaluate and Improve Your Institution's Performance. Princeton, NJ: Peterson's.
Teeter, D. J., and M. E. Christal. 1986-1987. Establishing Peer Groups: A Comparison of Methodologies. Planning for Higher Education 15 (2): 8-17.
The Times Higher Education Supplement. 2007. THES-QS World University Rankings 2007. Retrieved March 22, 2010, from the World Wide Web: www.timeshighereducation.co.uk/hybrid.asp?typeCode=142&pubCode=1&navcode=118.
University of Toronto. 2008. Performance Indicators for Governance 2007: Measuring Up. Toronto: University of Toronto. Retrieved March 22, 2010, from the World Wide Web: www.provost.utoronto.ca/Assets/Provost+Digital+Assets/Provost/assets/PDF+Version+-+2007PI.pdf.
University of Western Ontario. 2007. Performance and Activity Indicators: Annual Report to the Board of Governors. London, ON: University of Western Ontario. Retrieved March 22, 2010, from the World Wide Web: www.ipb.uwo.ca/documents/2007_performance_indicator.pdf.
U.S. News & World Report. 2010. Best Colleges 2010. Retrieved March 22, 2010, from the World Wide Web: colleges.usnews.rankingsandreviews.com/college/index.html.
Vlasceanu, L., L. Grunberg, and D. Parlea. 2004. Quality Assurance and Accreditation: A Glossary of Basic Terms and Definitions. Papers on Higher Education series. Bucharest: UNESCO-CEPES. Retrieved March 22, 2010, from the World Wide Web: http://www.cepes.ro/publications/pdf/QA&A%20Glossary.pdf.
Zeit Online. n.d. CHE University Ranking 2009/10. Retrieved March 22, 2010, from the World Wide Web: http://ranking.zeit.de/che10/CHE_en.
(1.) This article is based on a paper (revised in 2009) originally presented at the 27th EAIR (The European Higher Education Society) Forum held at the University of Latvia in Riga, Latvia in August 2005. The paper was subsequently presented at the Society for College and University Planning North Atlantic Regional Conference held at the University of Massachusetts Boston in March 2007.
(2.) For a review of the countries having a national benchmarking and ranking process, consult Shanghai Ranking Consultancy (2009).
(3.) The 10 original members were Laval University, University of Montreal, McGill University, Queen's University, University of Toronto, McMaster University, University of Waterloo, University of Western Ontario, University of Alberta, and University of British Columbia. The G10 has now expanded to include three additional universities (University of Calgary, University of Ottawa, and York University) and is now called the G13.
(4.) For a presentation of a "G13 Data Exchange Time to Completion Study" consult Broderick and England (n.d.).
(5.) There are many definitions of benchmark, benchmarking, and best practices. The author refers in this article to the definitions given by Vlasceanu, Grunberg, and Parlea (2004):
Benchmark: A standard, a reference point, or a criterion against which the quality of something can be measured, judged, and evaluated, and against which outcomes of a specified activity can be measured. The term, benchmark, means a measure of best practice performance. The existence of a benchmark is one necessary step in the overall process of benchmarking. (p. 24)
Benchmarking: [A standardized method] that involves comparisons of processes, practices, and performances with similar institutions of a larger group of institutions in the same field that are not immediate competitors. (p. 27)
Best practice: A superior method or an innovative process involving an actual accepted range of safe and reasonable practices resulting in the improved performance of a higher education institution or programme, usually recognized as "best" by other peer organizations. A best practice does not necessarily represent an absolute, ultimate example or pattern, the application of which assures the improved performance of a higher education institution or programme; rather, it has to do with identifying the best approach to a specific situation, as institutions and programmes vary greatly in constituencies and scope. (p. 29)
(6.) G10 university names are suppressed in accordance with the data exchange protocol. However, on each table, the numerical code in the top row represents a specific university.
Roland Proulx was a professor of Biblical and Ancient Near Eastern Studies at the University of Montreal before becoming the first director of the institution's planning office, a position he held for 20 years. He served two terms as secretary of the board of directors of the Society for College and University Planning (SCUP) and was co-founder and vice-chair of the G10 Data Exchange Program. He has written many papers and reports related to planning, evaluation, performance indicators, benchmarking, and strategic intelligence. He is regularly invited as a speaker, lecturer, reviewer, and facilitator in many international settings and serves on the International Expert Group on world university rankings. He has been nominated as Professional of the Year 2009 in Education/Global Higher Education by Strathmore's Who's Who Registry & Global Network for Outstanding Professionals. As emeritus, he presently acts as special adviser for institutional planning and strategic intelligence to the vice-provost and vice-rector for planning and to the vice-rector for international affairs of the University of Montreal.
Figure 1 Departments Accepted and Excluded
[The original table cannot be fully reconstructed from this text version: it lists, by CIP code, the divisions/units covered by the exchange and marks each as previously agreed, an agreed addition, or excluded, but the column markings are lost. The units listed are Architecture (4.02); Urban, Community, and Regional Planning (4.03); Communications (9.01); Computer Science (11.07); Education (13.01); Engineering: various departments (14.01), including aerospace, aeronautical, and astronautical engineering, agricultural/biological engineering and bioengineering, biomedical/medical engineering, materials engineering, nuclear engineering, petroleum engineering, and industrial engineering; Chemical Engineering (14.07); Civil Engineering (14.08); Electrical and Electronic Engineering (14.10); Mechanical Engineering (14.19); Foreign Languages and Literatures: various departments (16.00), including East Asian, Slavic, Germanic, Romance, and classical languages, literatures, and linguistics; Food and Nutrition Studies (19.05); Law and Legal Studies (22.01); Francais/English (23.01); Library Science (25.00); Biology: various departments (26.01), including molecular biology, botany/plant biology, biology, zoology/animal biology, and ecology, evolution, systematics, and population biology; Biochemistry (26.02); Mathematics/Statistics (27.01); Physical Education/Kinesiology (31.05); Philosophy (38.01); Religion/Religious Studies (38.02); Physics, including astronomy and astrophysics and atmospheric sciences and meteorology (40.08); Chemistry (40.05); Earth Sciences/Geology and Related Sciences (40.06); Psychology (42.01); Social Work (44.07); Anthropology (45.02); Economics (45.06); Geography (45.07); History (54.01); Political Science (45.10); Sociology (45.11); Fine Arts and Art (50.07); Music (50.09); Nursing (51.16); Pharmacy (51.20); Public Health, all programs (51.22); Rehabilitation Sciences: various departments (51.02/51.23), including physical therapy, occupational therapy, and speech language pathology/communication disorders; Dentistry, all programs (51.04); and Business/Management, consolidation of all programs (52.01).]
Figure 3 Selected Key Performance Indicators
1. Instructional cost per FTE student
2. FTE students taught/full-time faculty
3. Other teaching $/full-time faculty
4. Other teaching $/FTE undergraduates
5. Research expenditures $/tenure-track faculty
6. Teaching assistants $/full-time faculty
7. Teaching assistants $/FTE undergraduates
8. Staff salaries/FTE student
9. Percentage of graduate students
10. Graduate degrees/graduate FTEs
Figure 4 G10DE: Exchange of Instructional and Financial Data 2003-2004, Department X
[The full table cannot be reliably reconstructed from this text version. For each of the 10 universities and the group average, it reports eight sections: 1 Faculty (FTE tenure-track faculty; nontenure-track/tenure-track full-time faculty; other teaching $; teaching assistants $; average hourly rate for teaching assistants); 2 Students (undergraduate, master's, diploma, and doctoral FTEs); 3 Staff (operating-funded staff FTE; all other staff $); 4 Degrees (undergraduate, master's, doctorate, graduate diploma/certificate); 5 Operating budget (faculty, staff, and teaching assistant salaries; benefits; other expenses; cost recoveries; total divisional expenses); 6 Research expenditures (operating); 7 Indicators and ratios (the key performance indicators of figure 3); 8 Profiles (student, degrees, budget, and faculty profiles, in percentages).]
Figure 5 Performance Indicators, Department X (universities 1-10, followed by the G10 average and the - / + 20% corridor)
* Instructional cost per FTE student: $10,221; $5,997; $5,577; $14,591; $12,536; $5,165; $9,630; $5,832; $5,604; $6,657. Average: $6,049; corridor: $4,839 to $7,259.
* FTE students taught/full-time faculty: 34.6; 27.8; 21.7; 15.6; 15.2; 32.2; 19.9; 30.0; 28.1; 27.0. Average: 25.6; corridor: 20.5 to 30.8.
* Other teaching $/full-time faculty: $6,006; $0; $1,370; $877; $6,258; $13,428; $3,619; $3,567; $14,377; $3,024. Average: $4,633; corridor: $3,706 to $5,559.
* Percentage of graduate students: 14.5%; 12.7%; 11.2%; 37.9%; 34.7%; 14.6%; 23.3%; 11.4%; 11.3%; 9.4%. Average: 15.3%; corridor: 12.24% to 18.35%.
* Graduate degrees/graduate FTEs: 21.7%; 18.5%; 17.7%; 24.4%; 26.0%; 21.6%; 25.0%; 17.5%; 21.5%; 14.1%. Average: 21.3%; corridor: 17.1% to 25.6%.
* Research expenditures $/tenure-track faculty: $260,828; $175,575; $288,837; $284,196; $279,237; $258,516; $418,399; $230,271; $260,561; $179,326. Average: $233,794; corridor: $187,035 to $280,552.
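The - / + 20% corridor reported in figure 5 is straightforward arithmetic around the peer-group average. A minimal Python sketch follows, using the published figure 5 values for one indicator; the corridor function is illustrative only, not part of the G10DE toolkit:

```python
def corridor(avg, band=0.20):
    """Return the -20% / +20% corridor around the peer-group average,
    as used in figure 5 to judge departmental positioning."""
    return avg * (1 - band), avg * (1 + band)

# 'FTE students taught/full-time faculty' for universities 1-10 (figure 5)
values = {1: 34.6, 2: 27.8, 3: 21.7, 4: 15.6, 5: 15.2,
          6: 32.2, 7: 19.9, 8: 30.0, 9: 28.1, 10: 27.0}
avg = 25.6  # published G10 average for this indicator

low, high = corridor(avg)  # roughly 20.5 and 30.7
outside = sorted(u for u, v in values.items() if not low <= v <= high)
print(outside)  # universities whose teaching load falls outside the corridor
```

Departments outside the corridor are not automatically judged deficient; as the article stresses, such indicators are pointers to be interpreted in context.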
Publication information: Article title: Benchmarking 10 Major Canadian Universities at the Divisional Level: A Powerful Tool for Strategic Decision Making: Proulx Reports on the Continuing, Decade-Long Exchange of Data and Benchmarking among Canada's Most Research-Intensive Universities. Contributors: Proulx, Roland - Author. Journal title: Planning for Higher Education. Volume: 38. Issue: 4. Publication date: July-September 2010. Page number: 20+. © 2009 Society for College and University Planning. COPYRIGHT 2010 Gale Group.