An Object and Performance Framework for Implementation of Web-Based Knowledge Sharing Technology

Any e-organization, whether commercial or governmental, requires knowledge management support in order to achieve optimal performance. Many of the technologies that serve the operations of such organizations can also support knowledge management to facilitate efficient knowledge sharing and reuse. Thus, e-organizations should be at the forefront in the use of knowledge management. This paper examines systems of knowledge management used in large organizations. The limitations of traditional organizational schemes are examined, including the tie to the traditional pre-digital knowledge unit, the multi-page document. An action research approach is taken towards the question of how to improve upon traditional approaches, using the available technology in conjunction with approaches arising from organizational research. A new framework is described in which knowledge is packaged into objects and classified by organizational performance roles and goals. A prototype implementation of the framework was developed in order to test its feasibility. Evaluation of the prototype suggests that the system could result in a more intuitive organizational framework that enables workers to obtain appropriate knowledge support in a timely manner without the need for extensive search, and that it facilitates greater reuse and sharing of knowledge.

Key words: Knowledge management, Learning organizations, Performance improvement, Knowledge objects, Action research

1 Introduction

One of the most important parts of any organization, especially those that are or aspire to be e-organizations, is the knowledge that employees acquire and apply to tasks during their working lives [20], [33]. Knowledge management (KM) has arisen as the study of both organizational and technological approaches to support the necessary knowledge distribution. The increase in businesses using distributed, virtual, and remote employees [1] has further emphasized the need to incorporate technology to connect people and knowledge. Included in this trend is the increasing globalization of corporations and the need to develop cross-cultural perspectives on knowledge. There is a greater need for different organizations and businesses to communicate and share knowledge across organizational boundaries. This includes internal boundaries between different functional units and external boundaries to collaborating organizations and contractors that tend to have different knowledge cultures.

There has also been an increasing recognition of the social dimension of knowledge creation and management, and criticism of the traditional approach to KM [18], [27]. The so-called "water cooler" effect, in which important knowledge is transmitted among people in social spaces, is often cited as an example of this. In this trend we are specifically dealing with what is known as ''tacit'' knowledge [41]. The term refers to informally communicated knowledge that is difficult to record in a formally structured document. Recently, there has been much hype around Web 2.0 technologies that emphasize informal knowledge and resource sharing of the kind seen in collaborative sharing sites such as Flickr and YouTube. The involvement of end-users in the categorization and review of such content [21] is an important aspect that deserves to be considered within new approaches to KM.

It could be argued that many approaches to KM have not caught up with these trends. This paper will examine the trends and their implications for the design of KM support. It will consider the problem of knowledge silos in organizations and the need for a new framework for dealing with knowledge in organizations, one organized around the needs of the organization and its individual workers. The central question that motivates this examination is the potential for developing an alternative model for digitally organizing knowledge: a model that is more user-centered and better facilitates knowledge reuse and sharing.

KM is often characterized by the development of databases, repositories, and digital libraries that store the documented ''explicit'' knowledge [41] of the employees of organizations. They are often seen as being tied to one specific organization and utilize a cataloguing scheme customized to that specific organization. To retrieve information and knowledge relevant to their job functions and activities, employees would browse or search a KM database to find relevant documents. They would then browse the documents to identify items of knowledge within them that look relevant to their current needs. The employees may start their own collections of relevant documents either in the form of printouts or bookmarks, which they will organize with their own classification schemes. This latter activity has developed into a new area of research referred to as personal information management [42]. Employees will use their personal knowledge collections to supplement the knowledge they have internalized through learning, and they will seek to add to the collections when they encounter problems where their internalized knowledge and personal knowledge collections are not sufficient. An important additional source of knowledge is other workers, who can help create shortcuts to solutions through their own internalized knowledge, collections, or experience of searching the formal sources of knowledge.

While knowledge management systems (KMSs) have advanced over the years (e.g., the incorporation of methods for handling tacit knowledge), most applications are still structured as searchable databases containing many pieces of explicit knowledge (mainly in the form of documents). A number of problems can be identified in the operation of KMSs:

1. The Redundancy Problem: Knowledge is often duplicated in different repositories of knowledge or in different documents within the same repository. These redundancies can confuse the knowledge seeker and increase the time and effort in locating and making sense of the available knowledge.

2. The Collaboration Problem: The use of technology is also often conceived in terms of individual support rather than group support. For example, Paraponaris found that KM technology used in multinationals often lacks collaboration space [34].

3. The Access Problem: KM usually requires users to pull the information from the systems rather than have the system push it to users when they need it. Having some element of context sensitivity in computer support applications is seen as important.

4. The Categorization Problem: The terminology used within documents and in the catalogue can vary and be inconsistent with the end-user's frame of reference. In some organizations specialized terminologies and acronyms gain prevalence, and terminology can vary across cultures (e.g., cell phone vs. mobile phone). This can make searching for and interpreting documents difficult.

5. The Rationale Problem: There is not always supporting or validating information (i.e., the supporting rationale, or empirical data) directly linked to the knowledge in a KMS.

Many of the deficiencies noted above are related to the lack of user-centered design [31] in the construction of KMSs. The designer's conception of a user is often that of an individual who needs to be fed officially sanctioned knowledge from the organizational hierarchy. Treating people as passive recipients of technology has not benefited companies in the way they might have anticipated [28], [29]. A user-centered design approach seeks a greater understanding of how people operate in their day-to-day work; it seeks to mold the technology into facilitating people in reaching their goals.

One example of research seeking to take a more user-centered approach is the work of Ye and Fischer [45]. They have investigated using digital tracking of work processes to identify the need to push the right knowledge to the right users at the right times. Employees do not always know when they need information, and they may not know what search terms to use to find a relevant knowledge object. Creating computer systems that can monitor work facilitated through computers can enable active KM that determines a need on behalf of the employee.

Researchers in several areas are investigating how the process of seeking knowledge can be made more efficient. There is substantial research into dealing with conceptual and terminological differences in knowledge representations through the development and use of knowledge ontologies [12]. Ontology research seeks ways to cross-reference different terms and concepts, enabling software to distinguish the different meanings implicit in document titles and content. The development and use of metadata standards and tools is another important trend in this regard [26]. Knowledge systems often include rating and recommendation facilities to help users determine the relevance and quality of a particular knowledge source [36]. All this research seeks to guide users to the most appropriate knowledge and reduce the time they spend searching.
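The terminological cross-referencing that ontology research aims at can be illustrated with a deliberately simplified sketch. The synonym map below stands in for a full ontology, and all terms and document identifiers are invented for illustration; they are not drawn from any system described in this paper.

```python
# Hypothetical sketch: a small synonym map (a stand-in for a full ontology)
# normalizes query terms before searching a document index, so that
# "cell phone" and "mobile phone" retrieve the same documents.

SYNONYMS = {
    "cell phone": "mobile phone",
    "cellphone": "mobile phone",
    "mobile": "mobile phone",
}

def normalize(term: str) -> str:
    """Map a term to its canonical form, if the ontology knows one."""
    cleaned = term.lower().strip()
    return SYNONYMS.get(cleaned, cleaned)

def search(index: dict[str, list[str]], query: str) -> list[str]:
    """Return document IDs indexed under the canonical form of the query."""
    return index.get(normalize(query), [])

index = {"mobile phone": ["doc-17", "doc-42"]}
print(search(index, "Cell Phone"))    # both variants resolve to the same documents
print(search(index, "mobile phone"))
```

A production ontology would of course handle polysemy and concept hierarchies rather than flat synonymy, but the principle of resolving surface terms to shared concepts before retrieval is the same.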

All of the above research involves refining and supplementing existing frameworks of KM. The research can bring some improvement to the existing model of KMS organization, search, and storage. It does not address the fundamental model for KMS, which is essentially derived from the pre-digital age, and is based on a relatively small number of authors and publishers who determine the importance of knowledge and how it should be packaged within documents and other media. The packages are then arranged in collections that are ordered by specialist librarians according to organizational frameworks that they determine to be logical. However, the digital domain allows mass authorship, mass publishing, mass review, and multiple co-existing levels of packaging and cataloguing. These phenomena are at the heart of what some people refer to as Web 2.0 [43]. This potential has yet to be fully investigated in research and there is a need for alternative theoretical frameworks that facilitate this approach to knowledge.

2 Methodological approach to framework development

In examining the concept of a framework, Wong and Aspinwall [44] argue that a distinction should be made between KM frameworks and KM implementation frameworks. They note that this distinction is already established in the general information systems (IS) literature, where there are frameworks focused on conceptual understanding of IS [4], [32] and frameworks guiding implementation [6], [22]. Likewise in the KM literature, it is possible to identify frameworks that seem to address the question of 'what is,' and to identify implementation frameworks that address the 'how to'. An early example of a 'what is' framework is Nonaka and Takeuchi [30]; there are now a number of such frameworks, which are reviewed in Rubenstein-Montano et al. [38].

Wong and Aspinwall [44] provide a thorough review of implementation frameworks in KM. They argue that the development of implementation frameworks provides a strong foundation and theoretical underpinning to guide organizations in the successful use of KM ideas. They note that although there are currently few examples of implementation frameworks, they can be categorized into three types. The first type is a 'systems approach' that uses diagrams or visual representations to provide a model of what a system should incorporate (e.g., [24]). The second type is the 'step' approach, which outlines a set of steps to be followed in the implementation process (e.g., [46]). The third is a hybrid that combines the systems and step approaches.

In this paper the implementation framework can be categorized as a 'systems approach' that uses visual representation. It is a framework that offers a new perspective on how knowledge should be packaged and catalogued. Unlike much other work on frameworks, in addition to a description of the framework, a software prototype that serves as a working model for the system is created and subjected to a qualitative evaluation.

In order to better understand the existing model prevalent in large organizations and to provide grounding for the development of a new framework, an action research approach has been adopted [13]. Action research is one of a number of qualitative methods of research that have gained both increasing acceptance and importance in computer research, which has been traditionally dominated by quantitative and empirical methods. Avison et al. [2] note that the particular strength of action research is that "research informs practice and practice informs research synergistically"; they note five information systems development methods that have resulted from this research approach (e.g., the MULTIVIEW development methodology [3]).

There are several models for action research [35]. The most basic model involves three simple steps: identifying a problem, developing an intervention that is seen as resolving the problem, and then evaluating the success of the intervention. In relation to technology, software prototyping is the approach that best fits the action research model, given the cost of a full-scale intervention and the uncertainty of success. In action research the emphasis is on the study of the problem situation and the innovativeness of the intervention. It is seen as an important approach in the study of real-world problems that do not easily lend themselves to empirical investigations in a laboratory. In particular, it is important that the evaluation be done in the context of real-world practice; evaluation with real-world practitioners on real-world problems is preferable to a purely artificial lab-based evaluation. It is important to obtain people's reaction to the intervention and not merely test whether it is technically functional.

The three steps of action research are not necessarily done in a linear sequence; an iterative approach is more usual. The research reported in this paper took place over a period of three years from 2003 to 2006, during which there was continuous investigation of current practice and development of prototypes. Three working prototypes were developed and evaluated prior to the construction of the final prototype described in this paper. The work was funded through the U.S. Army Research Office and managed by the U.S. Army Training and Support Center. Meetings were held every three months with a review group of up to ten practitioners from both the military and industry, who validated our investigations and facilitated access to practitioners.

In identifying the nature of the problem, visits were undertaken to ten separate U.S. military/government units and commercial contractors that analyze and share knowledge about operational performance. Typically these units are called in when there is perceived to be an inadequacy in operations that needs to be addressed. Examples of this kind of situation include increased debt problems among recruits, high dropout rates in specialist courses, or poor operation of communications equipment despite having trained operators. The units would be charged with investigating the root causes of the problem and recommending solutions to overcome them. Similar units and specialist individuals are likely to operate in all organizations, and there is a growing consultancy industry built around analyzing human performance problems in organizations [37].

During the course of our research we had access to thirty detailed analysis reports that covered non-classified performance problems within the military. The methods used in the problem analysis (e.g., gap analysis) and many of the problems dealt with are common to any large organization (e.g., maintaining air conditioning, developing leadership courses); relatively few were specific to the military. In addition to the methods, we were particularly interested in studying how knowledge derived from analysis was communicated, stored and archived.

The project involved close collaboration with the U.S. Coast Guard's Human Performance Center, the U.S. Navy's Performance Center, and the project's main sponsor, the U.S. Army Training and Support Center. In addition, several specialist meetings and conferences were attended to investigate how knowledge was shared and accessed. Access was provided to both the Army Knowledge Online and the Navy Knowledge Online systems. The author also participated in meetings of the Advanced Distributed Learning initiative. This organization was in the process of framing solutions to the sharing of electronic content within the more specific area of e-learning, which in its general definitions encompasses knowledge as well as traditional course content.

After analyzing the problem and drawing on research from a number of areas (including metadata, digital libraries, software engineering, and organizational analysis) a new theoretical framework for implementing Web-based systems for knowledge collection and organization was developed and a concrete example constructed in the form of a fully operational software prototype. This enabled the investigation of the framework's potential in actual practice. This prototype was evaluated both in lab-based usability testing and in a field test involving experienced professionals working on a real project with the U.S. Coast Guard.

3 Problem Situation: Knowledge Sharing in Large Multi-faceted Organizations

From the investigations of current knowledge sharing in the U.S. government/military sector, it was apparent that much knowledge tends to be communicated in traditional documents; in particular, PowerPoint presentations containing large numbers of slides are a surprisingly prevalent form of knowledge communication. These are traditional documents that are, in fact, packages of a number of separate pieces of knowledge having more or less relevance to different types of readers. The organizational structure of the document is often determined by the type of study that generated the knowledge or the type of professional that led its creation. Thus research, operational analysis, performance analysis, white papers, and strategy documents will result in different types of titles, packaging, and internal headings. This makes it difficult to identify needlessly duplicated or interrelated knowledge within different documents created by different departments in large organizations. Those seeking to extract knowledge from a document must understand the perspective of the document creator as much as the domain dealt with in the document. There are likely to be redundancies and inefficiencies resulting from this. Object packaging, an alternative to document-level packaging, may mitigate this problem.

When documents containing knowledge are put on the Web, problems locating knowledge can ensue. Separate branches of the U.S. military have developed solutions to this problem utilizing centralized online stores, in particular Army Knowledge Online. This system is primarily document based and is ordered according to a structural analysis of the document categories. On more than one occasion we were told about documents relevant to the project and then had to obtain their exact navigational paths, as we were not able to retrieve them through navigation or search. The number of documents, and their organization in terms specific to the organizational culture, makes it difficult for newcomers or outsiders to access and share in the knowledge contained in the system. Once documents were found there was no standard organizational structure, so finding the same information in documents from different units required understanding the unique organizational structure used in each document. In one instance, crucial information was missed on several readings, only to be discovered in an appendix.

Many organizations employ a variety of professionals who are trained in different departments of universities and exist within separate units of the organizations. These professionals often evolve their own ways of communicating and use independent methods, terminology, and tools. Figure 1 illustrates this phenomenon with three possible professional groups.

Ideally, if a problem occurs in a large organization (e.g., during the design of a new product for an aircraft manufacturer) all groups that are involved in providing human support would look at the problem together and create a single shared analysis in a standard data format. More usually, the solutions developed are delivered in separate digital formats in separate systems that employees must negotiate. Thus, e-learning might be available in learning management systems and other knowledge only available through traditional classes. Documentation may be available in a central digital repository or KM system, and it may also be scattered as links throughout a variety of Internet and intranet web pages. Employees must negotiate the separate systems in order to compile the knowledge that they need to deal with day-to-day operational problems.

Perhaps a more important problem for efficiency and effectiveness in organizations is the potential existence of vertical silos of problem handling within organizations. Vertical silos (see Figure 2) exist when groups who determine the need for a new solution (e.g., equipment, training, and procedure changes), groups who design and construct it, and groups who evaluate its effectiveness work independently of each other. Analysis is a key to breaking down silos, as it lays the groundwork for both the solution and its evaluation. When these three processes are not connected, solutions may be developed without a clear rationale (provided by quality analysis), and subsequently widely used without justification (provided by careful evaluation). Connecting the relevant knowledge requires a common standard reference point and a single IT system to support the whole development lifecycle.

In working over a number of years with the U.S. Department of Defense and specialist units in performance analysis, it was noted that the KM problems referred to above were resulting in great inefficiencies [16]. It was apparent that individuals all over the country were working to improve various aspects of the same performance. These performance analysts operated in isolation and shared little knowledge, thereby creating a sub-optimal organization. Sub-optimizations arose both from a lack of knowledge sharing and from the duplicate creation of knowledge (e.g., solving the same problems), along with the duplicate development of solutions to those problems. For example, the Navy, Coast Guard, and Army run their own fleets of ships, all of which encounter similar performance problems. Yet their problem analysis units work in isolation and use completely separate KM systems. Such inefficiencies not only occur across organizational boundaries, but also across time. For example, at a conference on government knowledge sharing, a Coast Guard analyst reported going over the reports on previous major hurricane disasters and noting that very little knowledge carried forward, with each report containing the same analysis and conclusions as the previous ones.

4 Developing a New Approach: An Implementation Framework and Web-based Software Prototype

From the study of the problem situation it was clear that an efficient electronic means of organizing and sharing knowledge had to move beyond purely digitized versions of the traditional document. Research into disaggregation or granularization is relevant here [7], [8]. Document disaggregation and granularization require dismantling documents into digital objects, which are more specific, standardized, and easily searchable than complete documents. A given document may cover a range of separate issues and findings, and have knowledge components that are important to some users hidden among many irrelevant knowledge components. Knowledge components in some documents may also have important relationships to components in other documents, even though the title, the purpose, or the majority of the content may not seem relevant. Document granularization requires ensuring that each individual digital object does not lose meaning when removed from the context of the overall document. At the same time, granularization allows the implicit meaning of each object to be made explicit through object-level metadata. Each object will need to be fully contextualized in its own right, such that any given object can be linked back to, or re-assembled in, the context of the original document.
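The requirements above — object-level metadata, links back to the source document, and the ability to re-assemble objects in original order — can be sketched as a simple data structure. This is an illustrative schema only, not the one used in the paper's prototype; all field names are invented.

```python
# Illustrative sketch (not the paper's actual schema): a granular knowledge
# object carries its own metadata so it stays meaningful outside its source
# document, plus a back-link and position for re-assembly in context.
from dataclasses import dataclass, field

@dataclass
class KnowledgeObject:
    object_id: str
    title: str
    body: str
    source_document: str                 # link back to the original document
    position_in_source: int              # allows re-assembly in document order
    keywords: list[str] = field(default_factory=list)
    related_objects: list[str] = field(default_factory=list)  # cross-document links

def reassemble(objects: list[KnowledgeObject], document: str) -> list[KnowledgeObject]:
    """Recover one document's objects, in their original order."""
    return sorted(
        (o for o in objects if o.source_document == document),
        key=lambda o: o.position_in_source,
    )
```

The `related_objects` field is what allows the cross-document relationships described above to be made explicit, rather than left implicit inside document prose.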

In the digital learning domain the move from large packages to collections of reusable and sharable objects is apparent in the U.S. military's Advanced Distributed Learning Initiative's Sharable Content Object Reference Model, or SCORM (Site 1). SCORM has gained some acceptance as a more general learning technology standard. The main goal of SCORM was to minimize inefficiencies in the development of military e-learning systems. Before this initiative, when the Navy bought a new helicopter it would also build a sophisticated e-learning system to support training of the people who would operate and support the helicopter. If the Army then bought the same helicopter, despite the Army version varying only 10%-20% from the Navy version (perhaps merely being a different color), the Army would purchase a completely separate e-learning system that most likely utilized incompatible technology. SCORM provides technical standards to allow interchange of digital learning content. The object approach requires that training systems be divided into a number of well-defined components. Under this scheme, if the Navy buys an e-learning system for its helicopter which has, for example, 1,000 SCORM objects, then the Army should be able to reuse the majority of these and develop only the 100 or so that cover the unique aspects of its equipment. The idea behind SCORM is being extended to a number of other areas. For example, the technical documentation community is developing standards to enable similar ideas of reuse and sharing (Site 2).
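Once content is componentized into identifiable objects, the reuse arithmetic in the helicopter example reduces to simple set operations over object identifiers. The sketch below uses invented identifiers and assumes, purely for illustration, that 900 of the Army's 1,000 required objects already exist in the Navy's catalogue.

```python
# Hypothetical illustration of object-level reuse: given the Navy's
# catalogue of object IDs and the set the Army's variant needs, only
# the set difference must be newly developed.

navy_catalogue = {f"obj-{i:04d}" for i in range(1000)}    # 1,000 existing objects
army_needs = {f"obj-{i:04d}" for i in range(100, 1100)}   # 1,000 needed, 900 shared

reused = army_needs & navy_catalogue       # objects the Army can take as-is
to_develop = army_needs - navy_catalogue   # the unique ~10% to build new

print(len(reused), len(to_develop))        # prints: 900 100
```

The point is that reuse becomes computable only after both services describe their content with a shared object granularity and identifier scheme, which is precisely what SCORM's standards aim to provide.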

A test bed has been created to merge these separate standards for e-learning and technical documentation, given that there is often a great deal of overlap in the knowledge they capture (Site 3). This approach is an attempt to break down the horizontal silos that often exist between those who create solutions to support the performance of organizations and solution providers in separate organizational units. Both SCORM and S1000D (the technical documentation standard referred to above) deal with knowledge in a specific delivery context: training or technical documentation. Although merging the knowledge representations of these solution providers removes one set of horizontal silos, there are still a number of others in play (e.g., human factors, equipment procurement), and there is also still the problem of vertical silos to deal with. There is scope for a more general framework, not tied to a specific knowledge delivery context, that includes all forms of knowledge relevant to a specific problem. In order for such a framework to work, there must be a more generic approach to organizing knowledge.

The emerging field of performance improvement (Site 4) seeks to break down the "silo" approach to problem solving in organizations based on organizational research. It is an interdisciplinary field sharing many concerns with organizational psychology and operations research. Performance improvement advocates a solution-neutral, whole systems approach to studying performance with a single front-end performance analysis that is not tied to a specific solution type. Performance improvement views a lack of appropriate knowledge and the need for learning as just one of many causes of performance deficiency [19], [39]. Often problems lie in the information flow to employees, the tools and equipment used, and the consequences established for good and poor performance [5], [11], [19], [39]. For this reason, performance analysts who seek to solve organizational performance problems conduct thorough analyses of the factors affecting employee performance and often recommend multi-faceted solutions. Performance improvement consists of five major activities:

1. Identifying an organization's goals and defining what determines successful attainment

2. Analyzing data to identify gaps between the desired attainment and the current state

3. Determining possible causes of such gaps

4. Selecting the appropriate solutions to address those causes

5. Evaluating the extent to which the chosen solutions are effective in achieving attainment of the goal
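The gap-identification core of the activities above (steps 1-3) can be sketched in a few lines: compare desired attainment against current measurements and surface the shortfalls worth analyzing for causes. The metric names and figures below are invented for illustration and are not from the paper.

```python
# A minimal sketch (assumed metrics, not from the paper) of performance
# gap analysis: for each goal with a defined success threshold, report
# how far current attainment falls short. Higher values are assumed better.

def find_gaps(goals: dict[str, float], current: dict[str, float]) -> dict[str, float]:
    """Return goal -> shortfall for every goal not yet attained."""
    return {
        goal: target - current.get(goal, 0.0)
        for goal, target in goals.items()
        if current.get(goal, 0.0) < target
    }

goals = {"on_time_delivery_pct": 95.0, "first_pass_yield_pct": 98.0}
current = {"on_time_delivery_pct": 88.5, "first_pass_yield_pct": 99.1}
print(find_gaps(goals, current))   # only the delivery goal shows a gap
```

Steps 3-5 (causes, solutions, evaluation) are qualitative and collaborative, which is why the framework described later emphasizes recording rationale alongside such quantitative gaps.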

Clark and Estes [9] describe an example of a performance analysis effort. They present a case study illustrating that data-driven analysis leads to better solutions for performance improvement. The study describes a computer hardware manufacturing company that was experiencing a decrease in productivity and an increase in assembly mistakes/damaged goods. A new assembly process was established to address these problems and until production data revealed otherwise, most people thought it was a smooth transition. In reality, no performance gains were achieved. The client suggested that the lack of increased productivity was due to inadequate training during the transition to the new production process and recommended that employees receive training. Upon further analysis, it was determined that the problem was not employee knowledge or skills. Data collected through focus groups, document reviews, and interviews indicated that workers were not receiving critical parts for the new assembly process. In addition, there was a lack of communication between employees (regarding production activities). After the implementation of process and human performance-oriented solutions, errors were reduced by 43 percent, productivity increased by 31 percent, and on-time deliveries increased by 46 percent. The suggested training solution would have wasted resources, and without a thorough analysis, performance would have likely remained unchanged. This example highlights the importance of conducting quality performance analyses.

An important concept used in performance improvement is that of alignment [25]. This involves establishing the strategic goals for the organization and then ensuring that process, team, and individual goals are aligned to the organizational goals. While organizations often spend a lot of time defining high-level goals in the form of a mission statement, it is not always made clear to employees how what they do specifically contributes to the attainment of those goals. In an aligned organization, each individual and team will have clear work goals measured by clear metrics. Individuals will be able to trace how their personal goals contribute to team and process goals, and how these in turn contribute to the attainment of the organizational goals.
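The traceability described above can be modeled as a simple goal hierarchy in which each goal records its parent. The goal names below are invented for illustration; the point is only that an individual goal can be followed upward through team and process goals to the organizational goal.

```python
# Hedged sketch of goal alignment: each goal points to its parent goal,
# so any individual goal can be traced up the hierarchy. All goal names
# here are hypothetical.

parent = {
    "reduce rework on line 3": "improve line 3 throughput",     # individual -> team
    "improve line 3 throughput": "cut plant assembly costs",    # team -> process
    "cut plant assembly costs": "be the lowest-cost producer",  # process -> organizational
}

def trace_alignment(goal: str) -> list[str]:
    """Return the chain from a goal up to the top-level organizational goal."""
    chain = [goal]
    while chain[-1] in parent:
        chain.append(parent[chain[-1]])
    return chain

print(trace_alignment("reduce rework on line 3"))
```

A goal that produces a chain of length one is unaligned, which is exactly the condition an alignment exercise is meant to expose.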

This process of alignment and goal identification forms a useful and unexplored mechanism for knowledge organization within KM systems. In addition, a visual modeling technique based on UML Use Cases [14] is one means of visualizing the specific performance roles and goals active in an organization and how they align with higher-level process and organizational goals. Based on the study of the SCORM approach, with its emphasis on granular objects to facilitate reuse, and of the performance-oriented approach to analyzing problems, a new framework for implementing KM is recommended. The defining features of this framework can be outlined as follows:

* A human performance orientation to problem solving. Thinking on performance problems should be focused on outcomes and not be dominated by those trained in a particular solution type.

* Object thinking throughout the process. Knowledge of problem analysis, design, and evaluation should be standardized, componentized, and shared in the same way as envisaged for digitized support solutions (e.g., SCORM objects).

* Collaboration. Problems should be solved by cohesive teams (including actual performers) supported by collaborative information environments. Web-based collaboration enables careful scrutiny at the early analysis stage.

* Visual modeling. Graphical models can illustrate concepts and relationships among performance roles, goals, processes, and unit missions. These form the basis for collaborative brainstorming and an organizational structure for reusable knowledge.

* Rationale management. Decision points in a process are justified by a rationale that describes which alternatives were considered and the criteria on which decisions were made.

* Configurability. In general, performance improvement practices are likely to benefit from giving those involved the freedom to use the skills they have learned through experience rather than locking them into a specific methodology. In a configurable system there is flexibility in methods but standardization of the knowledge storage format.

* Integrated system of IT support for the entire lifecycle. Many IT 'systems' are available that support individual stages of the performance improvement process (e.g., authoring, learning management, KM, HR systems). What is needed is an integrated system of systems to support the continuous improvement of performance. In particular, such a system needs to include support for up-front analysis and evaluation, along with repositories of reusable and sharable knowledge to support greater efficiency in the performance improvement process.

4.1 Web-based Software Prototype Based on the Framework

Figure 3 illustrates an example diagram embedded in the Net-centric Performance Improvement (Net-PI) System [16], [40]. The prototype was intended to provide a proof of concept for the framework described in the previous section. The Net-PI prototype demonstrates how performance analysis, support, and evaluation knowledge can be delivered directly to relevant individuals by cataloguing all knowledge around a model of performance goals within the organization. The specific example represents the goals of someone occupying the role of "operator" for a specific piece of equipment. Job titles and high-level goals such as "complete routine maintenance inspections" are broken down into more specific goals, each with a few clearly defined performance goals associated with it. A multimedia walkthrough of this system can be found in the Projects area at Site 5.

Three types of users are envisaged for this system: knowledge specialists, performers, and stakeholders. Specialists include analysts, solution developers, and evaluators, who will create the core knowledge objects within the system. Knowledge will be founded on the initial work of the analysts, who will create new projects that incorporate analysis of roles and performance goals. Performers are representatives of the roles being analyzed; they cannot create knowledge objects, but they can provide commentary on the validity of the knowledge. (Is the analysis correct? How well do the solutions work?) Stakeholders include the supervisors of the performers and those who are dependent on the services the performers provide. Their main role is to provide commentary on the effectiveness of the performers and the support provided to them.

Each goal in the system is associated with objects contained in stores. At the core of the Net-PI system is a database with three interrelated stores:

1. The first store is of problem analysis knowledge. This is in the form of analysis objects, each of which includes a definition of a work goal, measures of its attainment, possible causes of problems, and recommended solutions (training, new equipment, change in methods) that can help an individual achieve the goal with greater success (see table 1). A collection of objects can be linked into a more comprehensive report, but they are indexed and accessible separately within the system.

2. The second store is a collection of digital solutions (or documentation of non-digital solutions) that aims to enhance the attainment of a specific performance goal. The actual solutions are not necessarily stored in the same database, but a link to where they are stored is provided.

3. The third store is evaluation knowledge. This is in the form of evaluation objects, each of which includes the results of a study that measures the attainment of the relevant performance goal with or without the use of one of the solutions contained in the second store.

All three of these data stores are interlinked and can be augmented by threaded discussions about the quality and implications of each object. The discussion is open to performers and stakeholders in the particular role and goal being analyzed, supported, and evaluated. This, in effect, creates a collection of tacit knowledge related to the core formalized performance knowledge (the analysis object), and uses a single understandable organizational framework to guide the respective people to the appropriate knowledge. All workers charged with contributing to given performance goals access knowledge relevant to their goals directly in the knowledge system. They do not have to find the knowledge by searching and sifting through documents that cover general areas of performance and may or may not contain knowledge relevant to their specific goals.
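The structure of the three interlinked stores and their attached discussions might be sketched as follows. The class names and fields here are hypothetical illustrations of the structure described above, not the prototype's actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AnalysisObject:
    """Store 1: problem analysis knowledge for one role/goal pair."""
    object_id: str
    role: str                         # e.g., "operator"
    goal: str                         # the work goal analyzed
    measures: List[str]               # how goal attainment is measured
    causes: List[str]                 # possible causes of performance problems
    solution_ids: List[str]           # recommended solutions (store 2)
    discussion: List[str] = field(default_factory=list)  # threaded comments

@dataclass
class SolutionObject:
    """Store 2: a digital solution (or documentation of a non-digital one)."""
    object_id: str
    analysis_id: str                  # the analysis that established the need
    location: str                     # link to where the solution is actually stored
    discussion: List[str] = field(default_factory=list)

@dataclass
class EvaluationObject:
    """Store 3: a study of goal attainment with or without a given solution."""
    object_id: str
    solution_id: str                  # the solution evaluated
    results: str                      # measured attainment of the performance goal
    discussion: List[str] = field(default_factory=list)

# Objects reference each other by ID, so the full chain can be reassembled
analysis = AnalysisObject("A1", "operator", "Complete routine inspection",
                          ["inspections per week"], ["missing parts"], ["S1"])
solution = SolutionObject("S1", analysis_id="A1",
                          location="https://example.org/learning-object/1")
evaluation = EvaluationObject("E1", solution_id="S1",
                              results="error rate reduced after adoption")
```

Because the discussion threads hang off each object rather than off a monolithic report, the tacit commentary stays attached to the specific piece of formal knowledge it concerns.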

The framework provides mechanisms for knowledge sharing across unit and organizational boundaries. Common goals that extend across different units or rely on cross organization collaboration can be shared. Thus, picking up on the helicopter example used earlier, maintenance technicians in the Army and Navy can access the same analysis describing optimum performance and have a single link to the learning objects and documentation to assist them in attaining their maintenance goals.

Figure 4 provides an architectural overview of the Net-PI prototype. It is predominantly intended to break down the silos illustrated in Figures 1 and 2. Net-PI packages the key lessons from analysis and research into digital objects that are organized by operational roles and goals. Analysts and researchers can submit a new analysis object each time they study someone in a specific role attempting to achieve a specific performance goal. In doing this, the system can check for prior analysis objects and alert the user to any they can build upon or reuse. The system addresses the problems outlined in the introduction as follows:

1. The Redundancy Problem: In this system, rather than having analysis and evaluation knowledge distributed in separate documents in separate stores, all analysis knowledge must be registered on one system according to the performance role it addresses rather than the study type or the unit that initiated the knowledge development. Likewise, all suggested solutions for enhancing a particular performance goal must be related to the analysis that determined its need and any subsequent evaluation that provides data as to its efficacy.

2. The Collaboration Problem: Collaboration is an integral part of the Net-PI approach. The community of stakeholders in the particular role/goal (trainers, practitioners, supervisors, and researchers) can easily find relevant knowledge and discuss, critique, or utilize the knowledge as required.

3. The Access Problem: Rather than search for information relevant to a role and goals, users will register into the system based on the roles and goals they perform within the organization. This places knowledge access in a work performance context, with a customized view of the available knowledge that is relevant to users' personal performance goals. Various technologies could be used in conjunction with Net-PI to allow the direct push of information to users. For example, given knowledge of an individual's work role and location (determined by GPS or RFID) it would be possible to customize menus on a mobile device to access knowledge relevant to the activities he or she is likely to be carrying out.

4. The Categorization Problem: Roles and goals, unlike job and unit titles, are often consistent across similar organizations. For example, radio operator is a common role that will be taken by a large number of people with many different job titles in all branches of the government service. It is also possible that knowledge mappings and ontologies may be easier to create for more granular units of knowledge than they would be for many-faceted analysis documents that cover a range of different issues.

5. The Rationale Problem: All solutions must be registered in the same location, creating a one-stop center where the relevant performers can identify solutions. With the interlinking illustrated in Figure 4, employees can, if required, see the associated objects containing the rationale for why a solution was developed (analysis) and its validation in the form of any evaluation studies carried out. In addition, community commenting and rating are linked to each object, giving additional information to those who could potentially use the solutions.
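A minimal sketch of how indexing by role and goal could address the redundancy and access problems together: submissions are keyed by (role, goal), so the system can both alert an analyst to prior work and give a performer a customized view of relevant knowledge. The class and method names are illustrative assumptions, not the prototype's API.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

class KnowledgeRegistry:
    """Index analysis objects by (role, goal) rather than by unit or study type."""

    def __init__(self) -> None:
        self._index: Dict[Tuple[str, str], List[str]] = defaultdict(list)

    def register(self, role: str, goal: str, object_id: str) -> List[str]:
        """Register a new analysis object; return prior objects the analyst
        could build upon or reuse (the redundancy check)."""
        key = (role.lower(), goal.lower())
        prior = list(self._index[key])
        self._index[key].append(object_id)
        return prior

    def view_for(self, role: str, goal: str) -> List[str]:
        """A performer's customized view: everything registered for their
        role and goal, with no searching or sifting through documents."""
        return list(self._index[(role.lower(), goal.lower())])

registry = KnowledgeRegistry()
registry.register("Radio Operator", "Transmit status report", "A1")
prior = registry.register("radio operator", "transmit status report", "A2")
print(prior)  # ['A1'] -- the second analyst is alerted to prior work
print(registry.view_for("Radio Operator", "Transmit status report"))  # ['A1', 'A2']
```

A production system would of course need a controlled vocabulary of roles and goals rather than raw string matching, but the indexing principle is the same.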

Figure 5 gives an overview of the software architecture used in the prototype. It is based on the model-view-controller pattern. The knowledge model is built around three database repositories; each object in the repositories has a unique identifier and references the unique identifiers of its related objects. Thus, an analysis object would reference any performance support (e.g., a learning object) that was created to solve the identified problem, and each performance support object would reference any evaluation carried out upon it. The system reuses components from shared services; for example, log-on and collaboration features are provided by Microsoft SharePoint collaboration services. These are used in conjunction with custom-created services such as diagram creators. What the user sees (as illustrated in Figure 3) is only one view of the system. Figure 5 illustrates that it is possible to have completely customized views of the same data stores; in this case, Navy and Coast Guard front ends are illustrated. Although the views and services may differ, the underlying shared data is the same.
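The view/model separation described above can be sketched as two organization-specific views rendering the same shared store differently. This is an illustrative reconstruction of the pattern only; the prototype itself built its views on SharePoint and custom services, and the function names below are invented for the example.

```python
from typing import Dict

# Shared model: one store of knowledge objects keyed by unique identifier
shared_store: Dict[str, Dict[str, str]] = {
    "A1": {"role": "maintenance technician", "goal": "Inspect engine cooling system"},
}

def navy_view(object_id: str) -> str:
    """A Navy-branded rendering of a shared knowledge object."""
    obj = shared_store[object_id]
    return f"[NAVY] {obj['role'].title()}: {obj['goal']}"

def coast_guard_view(object_id: str) -> str:
    """A Coast Guard rendering of the same underlying data."""
    obj = shared_store[object_id]
    return f"[USCG] Goal '{obj['goal']}' for role '{obj['role']}'"

# Different presentation, identical underlying data
print(navy_view("A1"))
print(coast_guard_view("A1"))
```

Because both views read from one store, an update to the shared analysis object is immediately visible to both organizations, which is what enables the cross-boundary sharing discussed above.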

5 Evaluation

Two levels of evaluation were carried out on the prototype. Formative usability testing occurred during development, and summative field testing was completed using the full system on a real project being conducted by U.S. Coast Guard personnel. The evaluation aimed to determine the applicability and appropriateness of the prototype, covering both its visual modeling and the associated user support.

The formative evaluation involved three data collection methods: one-on-one tests, observation, and a post-test questionnaire. Data was collected from twelve participants: three professional analysts from the U.S. Coast Guard, two graduate students with experience in professional-level analysis, and seven undergraduate students. The undergraduate students served to represent a performer or stakeholder who did not have specialist knowledge of the analysis methodology behind the system; we wanted to establish that the system was usable without a high degree of specialist knowledge. Although the number of participants seems relatively small, it has long been held that the majority of usability problems can be identified with a small sample. Some have argued this number to be as low as five; an empirical study by Faulkner places the optimal number between 10 and 20 [17].

During the one-on-one tests, the moderator reviewed the purpose of the evaluation, described the background of the prototype, and explained the scenario to each participant. Once a participant fully understood the objectives, he or she received the same set of operational tasks to complete with the prototype, entering data derived from a real military Front-end Analysis project. The tasks involved the following activities:

* Login

* Entering Project Details

* Building an Analysis object

* Generating a report

* Making the object ready for reuse

All the test subjects were able to complete the operational tasks they were given. After each one-on-one review of the prototype, each participant completed a questionnaire with evaluation probes, the results of which can be seen in table 2. This was intended to measure whether their perception of the experience of using the software conformed to what was observed on the test. It is desirable that the software not only be efficient to use, but that the user feels comfortable in using it.

The evaluation revealed several categories of findings. New users had few problems with the prototype. The nonpractitioners (undergraduate students) also had very little trouble using the prototype even though they had no prior knowledge of what the prototype was for and regardless of whether they had prior experience with performance analysis. Overall, the results were good and few problems had to be corrected.

There were a number of interface elements that were improved due to the results of the tests. Users had some trouble identifying submenu bars because they did not stand out from other text on the page. For example, on more than one occasion, users spent several minutes studying the page for the data entry option they wanted. The prototype allows users to enter only five data elements at a time, which was limiting. To add more data, they had to save their work, select the data element again, and then click an "add" button. Because several software engineers worked on the prototype, there was a degree of inconsistency throughout the prototype. For example, one of the submenu bars informed the user that data was available to them while the other submenu bar did not. These problems were corrected on the version of the prototype that went forward into field testing.

For field testing a research associate from the project shadowed two professional analysts from the U.S. Coast Guard's Performance Technology Center (PTC) as they used the prototype throughout a real project analyzing the maintenance needs of a new boat engine that was going into service (Honda BF225 FEA Project). The main objective was to determine the relevance of the prototype and the likelihood of its adoption. The prototype's main function was to facilitate knowledge reuse and sharing, but if the process of knowledge capture was more onerous than current methods, analysts would be resistant to its use. The two PTC analysts working on the FEA project had worked on numerous PTC projects.

Data collection involved three methods: individual review, observation, and questionnaires. Individual review consisted of PTC analysts reviewing the prototype and providing feedback. Evaluators used questionnaires to collect data regarding each analyst's reactions to the prototype. Observation consisted of project research associates watching the PTC analysts as they conducted a focus group of Honda BF225 project stakeholders, in which accomplished performers (APs) organized and validated the goals and tasks involved in the engine's maintenance. Finally, to gauge the PTC analysts' reactions, a questionnaire was sent to those who worked on the Honda BF225 project regarding the compatibility of the prototype with their knowledge acquisition process.

Field testing is often difficult because it can be disruptive to the normal work process. The goal was to collect meaningful information on prototype viability and efficacy while minimizing both disruptions to Coast Guard analysts and demands on PTC time and resources. The field evaluation scope included demonstrating proof-of-concept on an actual project. The output of this project was not intended to be a production-level application. However, one result of the field evaluation is that stakeholders are able to more clearly envision what features or feature refinements a production level application should include, thereby making adoption of the concepts behind the prototype more likely.

First, the analysts commented that the prototype's visual models are useful for summarizing data graphically and for navigation. Additional inquiry could provide an even more helpful visual model such as a graphic linkage of recommendations to findings, which could help analysts identify causal patterns, enabling them to predict recommendations earlier in the analysis and make more accurate diagnoses.

A second feature identified as making adoption more likely is the integration of third party software, which was done in the Generic and Navy prototypes. The Coast Guard uses a synchronous text-based discussion tool called GroupSystems to collect performer data and to reach consensus on major goals and tasks. The version of GroupSystems used by the Coast Guard at the time of the study exports data as Microsoft Word documents, but not in XML. Future versions are likely to adopt XML, which will enable data converters to extract relevant data and automatically transfer it to the Net-PI system. Without such conversion, manual transcription would be required. The lack of interoperability of different tools is a common problem in the IT industry. The wider adoption of the XML standard is likely to reduce this problem in the future [10]. In other parts of the prototype we were able to demonstrate successful integration of third-party software features. In particular, the prototype's collaboration services were derived from those provided by Microsoft's SharePoint.
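If a future GroupSystems version exported session data as XML, a small converter could map it into Net-PI analysis fields, avoiding manual transcription. The element names in the sample below are purely hypothetical, since no such export schema existed at the time of the study.

```python
import xml.etree.ElementTree as ET
from typing import Dict, List

# Hypothetical export format -- element and attribute names are assumptions
# for illustration only, not a real GroupSystems schema.
SAMPLE_EXPORT = """
<session topic="Honda BF225 maintenance">
  <goal name="Change engine oil">
    <task>Drain old oil</task>
    <task>Replace oil filter</task>
  </goal>
</session>
"""

def extract_goals(xml_text: str) -> List[Dict[str, object]]:
    """Pull validated goals and their tasks out of a (hypothetical) XML export."""
    root = ET.fromstring(xml_text)
    goals = []
    for goal in root.findall("goal"):
        goals.append({
            "goal": goal.get("name"),
            "tasks": [task.text for task in goal.findall("task")],
        })
    return goals

print(extract_goals(SAMPLE_EXPORT))
# [{'goal': 'Change engine oil', 'tasks': ['Drain old oil', 'Replace oil filter']}]
```

The extracted goal/task structure maps directly onto the role-and-goal organization of the Net-PI stores, which is why an XML export would remove the manual transcription step.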

A third identified feature that would make adoption more likely is the creation and integration of customized screens for collecting raw analysis data. The Coast Guard FEA process is based on a specific methodology (based on Harless [23]). This process relies on a number of data collection worksheets. The worksheets were not recreated as additional screens in the prototype because of copyright restrictions; however, it is likely that the Coast Guard, having purchased the worksheets, could reproduce them as Web-based forms. They could also reproduce them as Microsoft Word, Excel, or InfoPath forms that generate XML data for use by the prototype.

Comments about the current prototype indicate PTC analysts would find a fully developed system based on it useful in the field, since it streamlines their data collection and automatically generates reports. It also provides digital storage of project data, enabling analysts to compile, share, and reuse analysis knowledge. The seamless integration of Harless worksheets, GroupSystems data, and the features in the FSU-LSI prototype could provide a start-to-finish tool for Coast Guard analysts covering the entire analysis process: project planning, collection of raw data, diagnosis, and storage of analysis knowledge (in a granular form) for optimal sharing and reuse. The same structure that is used in performance analysis subsequently serves as the access structure for the performers themselves.

6 Conclusion

Having completed this work, we are now looking at how the underlying architecture has more general applicability for knowledge sharing. Rather than facilitating knowledge sharing in large organizations such as governments and militaries, we are interested in how it can facilitate large networks of independent small organizations. In particular, we have looked at networks of small non-profit organizations and networks of globally distributed usability test laboratories [15].

The efficient and effective collection and dissemination of knowledge necessary for workers to continuously improve their activities is a very important issue for all organizations. KM has been constrained by the traditional approaches to communicating knowledge in documents with limited distribution (published in specific journals or unit specific document repositories). The results of any study only affect operations if relevant people are aware of the document's publication, where to obtain it, and how to extract the relevant information and convert it into operational activities (e.g., a change in training procedures).

The Internet provides a means of radically re-conceiving the way knowledge is created and disseminated within organizations. In addition to the constant need for new research and operational analysis, new approaches to organizing, vetting, and channeling the resulting knowledge to those who need it are required. This paper has described one such approach based on a performance improvement/object framework. The Net-PI prototype demonstrates the feasibility of creating new KM systems using this scheme. Because it granularizes knowledge and organizes it by organizational performance goals, rather than organizational structure, it can facilitate easier browsing for performance support and the use of push technologies. It can facilitate sharing across organizational boundaries where performance goals are shared (e.g., ship maintenance in the Army, Navy, and Coast Guard) and within organizational boundaries (e.g., among training, human resources, and documentation professionals). The evaluation of the prototype suggests that such a system can be integrated into the current work practices of those analyzing performance and developing knowledge sources to support workers. It promises to support the work of the analyst while eliminating unnecessary duplication of both analysis and solution development, both within and across organizational boundaries.

Websites List

Site 1: Advanced Distributed Learning

Site 2: S1000D

Site 3: S1000D SCORM Testbed

Site 4: International Society for Performance Improvement (ISPI)

Site 5: Knowledge Communities Research Group (KCRG)



References

[1] J. Austin and L. Garnier, The virtual office: A behavior engineering model (BEM) perspective, Performance Improvement Quarterly, vol. 11, no. 4, pp. 7-21, 1998.

[2] D. Avison, F. Lau, M. Myers, and P. A. Nielsen, Action research, Communications of the ACM, vol. 42, no. 1, pp. 94-97, 1999.

[3] D. E. Avison, A. Wood-Harper, R. Vidgen, and J. A. Wood, A further exploration into information systems development: The evolution of Multiview 2, Information Technology & People, vol. 11, no. 2, pp. 124-139, 1998.

[4] C. J. Bacon and B. Fitzgerald, A systemic framework for the field of information systems, Database for Advances in Information Systems, vol. 32, no. 2, pp. 46-67, 2001.

[5] J. S. Bailey and J. Austin, Productivity in the workplace, in Finding Solutions to Social Problems: Behavioral Strategies for Change (M. A. Mattaini and B. A. Thyer, Eds.). Washington, DC: American Psychological Association, 1996, pp. 179-200.

[6] S. J. Barnes and D. Targett, A framework for strategic information systems implementation in the United Kingdom health sector, Topics in Health Information Management, vol. 19, no. 4, pp. 62-74, 1999.

[7] A. P. Bishop, Document structure and digital libraries: How researchers mobilize information in journal articles, Information Processing & Management, vol. 35, no. 3, pp. 255-279, 1999.

[8] A. P. Bishop, L. J. Neumann, S. L. Star, C. Merkel, E. Ignacio, and R. J. Sandusky, Digital libraries: Situating use in changing information infrastructure, Journal of the American Society for Information Science, vol. 51, no. 4, pp. 394-413, 2000.

[9] R. Clark and F. Estes, Turning research into results: A guide to selecting the right performance solutions. Atlanta, GA: CEP Press, 2002.

[10] F. P. Coyle, XML, Web Services, and the Data Revolution. Boston: Addison Wesley, 2002.

[11] A. C. Daniels, Performance Management: Improving Quality Productivity through Positive Reinforcement. Tucker, GA: Performance Management Publications, 1989.

[12] J. Davies, D. Fensel, and F. V. Harmelen, Eds., Towards the Semantic Web: Ontology-driven Knowledge Management. Chichester, England: John Wiley & Sons, Ltd, 2003.

[13] R. M. Davison, M. G. Martinsons, and N. Kock, Principles of canonical action research, Information Systems Journal, vol. 14, no. 1, pp. 65-86, 2004.

[14] I. Douglas, Adapting use cases for human performance modeling, in Handbook of Visual Languages for Instructional Design: Theories and Practices (L. Botturi and T. Stubbs, Eds.). Hershey PA: Information Science Reference, 2008, pp. 210-225.

[15] I. Douglas, Testing object management (TOM): A prototype for usability knowledge management in global software development, in Usability and Internationalization, Part I (N. Aykin, Ed.), HCII 2007, LNCS 4559, pp. 297-305, 2007.

[16] I. Douglas, M. Wright, and C. Nowicki, Communicating performance knowledge among the services, in Proceedings of Inter-service/Industry Training, Simulation and Education Conference (I/ITSEC), Orlando, FL, 2004, 1630, pp. 1-10.

[17] L. Faulkner, Beyond the five-user assumption: Benefits of increased sample sizes in usability testing, Behavior Research Methods, Instruments, & Computers, vol. 35, no. 3, pp. 379-383, 2003.

[18] G. Fischer and J. Ostwald, Knowledge management: Problems, promises, realities, and challenges, IEEE Intelligent Systems, vol. 16, no. 1, pp. 60-72, 2001.

[19] T. F. Gilbert, Human Competence: Engineering Worthy Performance. Washington, DC: International Society for Performance Improvement, 1996.

[20] B. Hall, Human capital management, Training, vol. 41, no. 3, pp. 16-17, 2004.

[21] T. Hammond, T. Hannay, B. Lund, and J. Scott, Social bookmarking tools: A general review, D-Lib Magazine, vol. 11, no. 4, 2005. [Online]. Available:

[22] H. R. Hansen, Conceptual framework and guidelines for the implementation of mass information systems. Information & Management, vol. 28, no. 2, pp.125-142, 1995.

[23] J. H. Harless, An Ounce of Analysis. Falls Church, VA: Harless Performance Guild, 1969.

[24] C. W. Holsapple and K. D. Joshi, Knowledge management: A threefold framework, The Information Society, vol. 18, no. 1, pp. 47-64, 2002.

[25] R. Kaufman, H. Oakley-Brown, R. Watkins, and D. Leigh, Strategic Planning for Success: Aligning People, Performance, and Payoffs. San Francisco: Jossey-Bass/Pfeiffer, 2003.

[26] V. Malaxa and I. Douglas, A framework for metadata creation tools. Interdisciplinary Journal of Knowledge and Learning Objects, vol. 1, pp. 151-162, 2005.

[27] Y. Malhotra, Toward a knowledge ecology for organizational white-waters, presented at the Knowledge Ecology Fair 98: Beyond Knowledge Management, 1998. [Online]. Available:

[28] R. McAdam and S. McCreedy, A critique of knowledge management: Using a social constructionist model, New Technology, Work and Employment, vol. 15, no. 2, pp. 155-168, 2000.

[29] T. Newcombe, Knowledge management: New wisdom or passing fad? Government Technology, 1999. [Online]. Available:

[30] I. Nonaka, and H. Takeuchi, The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation. New York: Oxford University Press, 1995.

[31] D. Norman and S. W. Draper (Eds.), User-Centered System Design: New Perspectives on Human-Computer Interaction. Hillsdale, NJ: Lawrence Earlbaum Associates, 1986.

[32] B. O'Donovan and D. Roode, A framework for understanding the emerging discipline of information systems, Information Technology and People, vol. 15, no. 1, pp. 26-41, 2002.

[33] A. Papmehl, Accounting for knowledge, CMA Management, vol. 78, no. 1, pp. 26-28, 2004.

[34] C. Paraponaris, Pathways of relevance: Exploring inflows of knowledge into subunits of multinational corporations, Organization Science, vol. 14, no. 4, pp. 440-459, 2003.

[35] P. Reason and H. Bradbury, Handbook of action research. London: Sage, 2001.

[36] P. Resnick and H. R. Varian, Recommender systems, Communications of the ACM, vol. 40, no. 3, pp. 56-58, 1997.

[37] D. G. Robinson and J. C. Robinson, Performance Consulting. San Francisco: Berrett-Kohler, 1996.

[38] B. Rubenstein-Montano, J. Liebowitz, J. Buchwalter, D. McCaw, B. Newman, K. Rebeck, and the Knowledge Management Methodology Team, A systems thinking framework for knowledge management, Decision Support Systems, vol. 31, no. 1, pp. 5-16, 2001.

[39] G. A. Rummler and A. P. Brache, Improving Performance: How to Manage the White Space on the Organization Chart. San Francisco: Jossey-Bass, 1995.

[40] J. Sasson and I. Douglas, A conceptual integration of performance analysis, knowledge management and technology: From concept to prototype, The Journal of Knowledge Management, vol. 10, no. 6, pp. 81-99, 2006.

[41] F. A. Starke, B. Dyck, and M. K. Mauws, Coping with the loss of an indispensable employee: An exploratory case study, The Journal of Applied Behavioral Science, vol. 39, no. 2, pp. 208-28, 2003.

[42] J. Teevan, W. Jones, and B. B. Bederson (Eds.), Special issue: Personal information management, Communications of the ACM, vol. 49, no. 1, 2006.

[43] L. Tredinnick, Web 2.0 and Business: A pointer to the intranets of the future? Business Information Review, vol. 23, no. 4, pp. 228-234, 2006.

[44] K. Y. Wong and E. Aspinwall, Knowledge management implementation frameworks: A review, Knowledge and Process Management, vol. 11, no. 2, pp. 93-104, 2004.

[45] Y. Ye and G. Fischer, Supporting reuse by delivering task-relevant and personalized information, in Proceedings of 24th International Conference on Software Engineering, Orlando, FL, 2002, pp. 513-523.

[46] S. M. Yusof and E. Aspinwall, Total quality management implementation frameworks: Comparison and review, Total Quality Management, vol. 11, no. 3, pp. 281-294, 2000.

[Author Affiliation]

Ian Douglas1

1 Learning Systems Institute, Florida State University,

Received 13 January 2008; received in revised form 13 June 2008; accepted 27 November 2008

