"Why don't technology managers want our analyses'? And what can we do about it?" This was the title of our effort to contrast the perspectives of analysts and managers in a 2001 SCIP (Society of Competitive Intelligence Professionals) presentation (1). The issue struck a chord with both sides. Analysts feel frustrated that their hard work is under-appreciated: managers feel frustrated that they're not getting the information they want, when riley need it.
Empirical technology analyses can take many forms, including competitive technological intelligence and technology forecasting, foresight, roadmapping, and assessment (2). Such analyses can aid various technology managers and professionals, including CIOs, R&D managers, new product development managers, operations managers, IP managers, strategic planners, and the Executive Suite.
And that leads us to our questions: Why do these empirical technology analyses not play a stronger role in technology management? And what can we do to enhance this role?
Utilizing Research Knowledge
As technology analysts in the mid-1990s, we were bubbling with enthusiasm for our version of empirical technology analyses (3). We thought that technology managers would quickly grasp the value and develop an insatiable appetite. Instead, after some hundred studies (conducted for companies, agencies and universities), we recognized something was wrong.
Accordingly, we turned to the National Science Foundation for support to research the issue. The resulting three-year project identified serious impediments to knowledge flow. The Center for Innovation Management Studies (CIMS) supported a follow-on study that elicited 32 case experiences (4). This article reports what we learned.
We are not alone in discovering that utilization of technology analyses in decision-making is not automatic. Colleagues in management science and operations research, policy analysis, statistics, program evaluation, and patent analysis have expressed similar concerns. "Knowledge Utilization" can claim to be a field unto itself, with journals dating back several decades (5). Issue, content, context, and presentation all affect the utility of findings (6-9).
Decision-makers trust familiar sources. In general, knowledge derived from empirical analyses is less familiar, and consequently less relied upon, than tacit knowledge from a respected colleague. A 1999 Harvard Business Review article (10) distinguished use of codified knowledge, stored in databases, from personalized knowledge, delivered face-to-face. Surprisingly, technology managers seem to utilize empirical analyses
Getting the Analyses You Need
Our conceptualization of what affects the utility of empirical technology analyses keys on two roles: the prime user (i.e., the manager/customer) and the technology analyst. Each may actually involve multiple persons. In particular, "the user" may involve a sequence of people who authenticate and filter findings that contribute in various ways to complex business decision processes. We label the technology analysis findings as Technology Information Products (TIPs) to convey the notion that these are deliverables, and they can take multiple forms. Our sole evaluation criterion is the extent to which decision-makers gain value from the TIP--we don't address validity or other considerations.
Guided by this conceptualization, we digested the lessons from our case studies and experiences to yield eight factors that can affect your getting TIPs that make a difference in your decision-making. The Table, next page, presents these in the form of a "TIP-sheet." We now discuss each of the factors.
1. Know Thy Analysts--Interact with your analysts to recognize their strengths and figure out how to compensate for their weaknesses (e.g., particular technical bent, communication skills). Don't allow a reclusive analyst to "toss the report over the transom" to you. …