International Journal of Information Management 28 (2008) 3–11
doi:10.1016/j.ijinfomgt.2007.10.002

The information audit: Methodology selection

Steven Buchanan (corresponding author: steven.buchanan@cis.strath.ac.uk), Forbes Gibb
Graduate School of Informatics, University of Strathclyde, 26 Richmond Street, Glasgow G1 1XH, UK

Abstract

This paper considers the comprehensiveness, applicability, and usability of four commonly cited information audit methodologies. Comprehensiveness considers the conceptual, logical, and structural completeness of each methodological approach. Applicability considers the scope of each approach, and the ability to tailor the approach to individual organisational requirements. Usability considers the perceived ease with which each approach can be adopted and applied. A methodological baseline has also been established, which provides a reusable framework to guide future methodology selection, and for developing an individual or tailored approach to the information audit.
© 2007 Elsevier Ltd. All rights reserved.

Keywords: Information audit; Information resource management; Information strategy; Information systems

1. Introduction

This is the second paper in a series of three providing a comprehensive review of the information audit (IA). This paper considers methodological approaches, while the previous paper (Buchanan & Gibb, 2007) reviewed role and scope, and the final paper (Buchanan & Gibb, 2008) presents and discusses evidence from the field. The primary purpose of this paper is to deconstruct commonly cited methodologies in order to assist auditors with methodology selection and/or development (by identifying the core building blocks which are used to form the various methods), and to consider their relative applicability. Cognisant of the fact that our own methodology is included, we have striven to compare rather than critique, and to analyse rather than review, with appraisal of the usability of individual methodologies largely limited to third-party commentary.

2. Information audit methods

Methodological origins have been extensively reviewed by Barker (1990), Buchanan and Gibb (1998), and Botha and Boon (2003) and will not be repeated here. Instead, the authors will focus on a review of commonly cited methodologies: Burk and Horton (1988), Orna (1990, 1999), Buchanan and Gibb (1998), and Henczel (2001). It is noted that Wood (2004) has also recently proposed a methodology, but Wood focuses more on definition and key considerations than on explicit methodological process (Carlisle, 2005); consequently, it has not been included. A further consideration is those methodologies which are associated with related standards, such as ISO 15489 for records management; taking ISO 15489 as an example, we view records management as a subset of content management, which we in turn consider a subset of the information audit (as per the scope matrix proposed in our first paper in this series). While ISO 15489 would be a key standard for a content-oriented information audit, it would not provide the overarching methodological framework for the information audit, which is the focus of this paper.

2.1. Burk and Horton (1988)

InfoMap, developed by Burk and Horton (1988), was arguably the first detailed IA methodology developed for widespread use. In contrast to its predecessors (Best, 1985; Gillman, 1985; Henderson, 1980; Quinn, 1979; Reynolds, 1980; Riley, 1976; Worlock, 1987), the methodology provides a step-by-step process to discover, map, and evaluate an organisation's information resources.
There are four main stages:

1. Survey: the organisation's information resource base is identified through staff interviews and questionnaires.
2. Cost/value: identified information resources are measured using cost and value ratios.
3. Analysis: corporate-level resources are identified through mapping of individual information resources to the structure, functions, and management of the organisation.
4. Synthesis: the organisation's information resources are confirmed, along with their strengths and weaknesses relative to the objectives of the organisation.

The main purpose of InfoMap is the discovery and inventory of an organisation's information resources, something for which it still has application today; for instance, asset classification and control is a key activity in ISO 17799 (2005). However, a noted limitation of InfoMap is that limited organisational analysis is conducted, largely due to its predominantly bottom-up approach (Buchanan & Gibb, 1998). Burk and Horton (1988) do highlight the importance of links back to business plans and goals, but there are no explicit steps, tools, or techniques to identify and evaluate this relationship, nor to consider wider organisational and environmental issues. As a consequence, IA findings can lack vital organisational context (Underwood, 1994).

2.2. Orna (1990, 1999)

In contrast to InfoMap, and perhaps in response to the limitations noted above, Orna's top-down approach places greater emphasis on the importance of organisational analysis. While InfoMap focuses on the inventory of information resources, Orna's approach identifies both information resources and information flow. Also, while the end product of InfoMap is essentially the inventory, the end product of Orna's more progressive approach is a corporate information policy. Initially consisting of four stages, the methodology was later expanded to ten to include pre- and post-audit steps:

1. Analyse the information implications of key business objectives: conducting a high-level preliminary review to confirm strategic and operational direction.
2. Ensure support and resources from management: obtaining senior management commitment to the audit.
3. Get support from people in the organisation: obtaining wider organisational commitment.
4. Plan the audit: project planning, team selection, and tools and techniques selection.
5. Finding out: identifying information resources and information flow, including high-level cost and value assessments.
6. Interpreting the findings: analysis of findings based on current state versus target state.
7. Presenting the findings: reporting on the audit.
8. Implement changes: establishment of information policy and realisation of audit recommendations.
9. Monitor effects: measuring change.
10. Repeat the audit cycle: establishing the audit as a regular exercise.

While notable for placing emphasis on the importance of organisational analysis and for introducing the mapping of information flow, Orna's original approach was identified as lacking some of the practical tools and techniques required to carry out several of the steps (Buchanan & Gibb, 1998; Nickerson, 1991); however, Orna does provide further examples and practical insights in later publications (Orna, 1999, 2004).
2.3. Buchanan and Gibb (1998)

Buchanan and Gibb developed a top-down approach similar to Orna's (1990), but with some expanded stages and a more comprehensive IA toolset. The tools and techniques recommended, largely drawn from established management disciplines, were selected by Buchanan and Gibb on the basis of their widespread adoption and hence likely familiarity to practitioners. Their methodology has five main stages:

1. Promote: communicating the benefits of the audit, ensuring commitment and cooperation, and conducting a preliminary survey of the organisation.
2. Identify: top-down strategic analysis followed by identification of information resources and information flow.
3. Analyse: analysis and evaluation of identified information resources and formulation of action plans.
4. Account: cost/value analysis of information resources.
5. Synthesise: reporting on the audit and development of the organisational information strategy.

The comprehensive IA toolset was a notable contribution to IA methodologies. However, the methodology, which was described within the constraints of a journal article, could be considered more framework than prescriptive methodology, lacking the depth of instruction provided by its more extensive handbook-based peers.

2.4. Henczel (2001)

Henczel provides a methodology similar in approach to both Orna (1999) and Buchanan and Gibb (1998), drawing from both. There are seven stages:

1. Planning: audit planning and preparation, and submission of a business case for approval to proceed.
2. Data collection: development of an information resource database and its population through survey.
3. Data analysis: structured analysis of the data collected.
4. Data evaluation: interpretation of data and formulation of recommendations.
5. Communicating recommendations: reporting on the audit.
6. Implementing recommendations: establishment of an implementation programme.
7. The information audit as a continuum: establishing the audit as a regular, cyclical process.

Given that Henczel adopts an approach similar to both Orna and Buchanan and Gibb, it is not surprising that the main strengths of the methodology constitute several of the combined strengths of both. That is, the method promotes top-down strategic and organisational analysis, includes both an inventory of information resources and the mapping of information flow, and draws on several of the tools and techniques proposed by both Orna and Buchanan and Gibb. However, the methodology has been criticised for lacking practical guidance (Webster, 2001).

3. Comparison of IA methods

In order to conduct a comparative analysis of the four IA methods discussed in the previous sections, three measures have been identified and applied:

- Comprehensiveness: the conceptual, logical, and structural completeness of each methodological approach.
- Applicability: the applicability and scope of each approach, and the ability to tailor the approach to individual organisational requirements.
- Usability: the perceived ease with which the method can be adopted and applied.

3.1. Comprehensiveness

A challenge in attempting to assess the relative comprehensiveness of each of the four IA methodologies is that, given that there is no standard, agreed methodological approach, there exists no master reference model nor independent guide to the methodological stages of an IA.
While it can be debated whether or not a standard IA approach is required, the authors suggest that a "methodological baseline", against which individual IA methodologies could be compared and contrasted for comprehensiveness and relative completeness, is essential, as this provides a framework to guide methodology selection and to support the development of tailored or individual approaches.

In one of the first examples of work in this area, Dalton (1999) proposed a set of common stages through examination of the published literature, with individual stages of IA methodologies identified and then extracted for tabulation, analysis, and comparison. However, a limitation of Dalton's model, for the purposes of this study, is that the IA methodologies that he compared and contrasted are not explicitly identified.

Botha and Boon (2003) also conducted a comparison of IA methods in pursuit of common stages. They adopted Barker's (1990) classification of methods as a framework, and then identified common stages for the IA methods they associated with three of Barker's five classifications: operational advisory, geographical, and hybrid audits (the other two classifications are cost–benefit and management information). Botha and Boon's findings are more usable for the purposes of this study than Dalton's, but Barker's classifications have not been widely adopted for anything other than historical classification of IA models and origins, and include several IA models (particularly those associated with operational advisory approaches) which are no more than summary guidelines or have been superseded by the IA methodologies explored in this study. Further, the level of the identified stages across models is not consistent. For example, for the operational advisory model, "send memos" and "analysis" are both considered phases, yet the latter is a significantly greater undertaking than the former. The distinction between "geographical" and "hybrid" audits is also debatable, as the former refers to information mapping, which can easily be included in the latter. One further limitation for the purposes of this study is that while Botha and Boon included Orna (1990) and Buchanan and Gibb (1998) in their investigation, they did not include Burk and Horton (1988), Orna's (1999) revised methodology, or Henczel (2001). Limitations aside, however, the models do provide useful guidance for the development of a "methodological baseline", particularly the hybrid model, which, due to its broader remit and definition, is considered by the authors to be the most applicable to the IA methodologies within the scope of this study (Botha and Boon mapped Buchanan and Gibb (1998) to the hybrid model). Nevertheless, the omission of Burk and Horton (1988), Orna (1999), and Henczel (2001) limits the usefulness of the model for the purposes of this study.

This paper is in part a response to this absence of a contemporary and comprehensive analytical framework which could be used to evaluate the IA methodologies which have been placed in the public domain. The approach adopted was similar to both Dalton's and Botha and Boon's, but focused upon the four IA methodologies selected for this study. The first step was to compile a master list of IA stages from the four IA methodologies. The master list was compiled, firstly, through identification and listing of each of the discrete stages/tasks offered by each respective approach and, secondly, through grouping of identical activity and removal of any duplication.
The final step was consideration of additional stages/tasks which might be required but were not found within the existing methodologies; however, none were identified.

The output of this exercise was a high-level, generic baseline of seven methodological stages, summarised as follows:

- Setup: project planning, preparation of the business case, endorsement, organisational communication, and preliminary analysis;
- Review: strategic analysis (internal and external), organisational (cultural) analysis;
- Survey: survey of information users, identification and inventory of information resources, mapping of information flow;
- Account: cost, business benefit and/or value of information resources;
- Analyse: analysis of findings;
- Report: production and dissemination of IA findings and recommendations; and
- Guide: organisational information management policy and/or information strategy development, implementation of recommendations, establishment of the IA as a cyclical process, and monitoring and control.

These stages were then compared and contrasted with Botha and Boon's hybrid model, as illustrated in Fig. 1. Apart from variation in terminology (largely through grouping of different terms from different methodologies), the models are quite similar; however, there are two notable variations:

- Botha and Boon's "plan" stage maps to both "setup" and "survey". Both Buchanan and Gibb and Henczel include initial planning stages associated with overall setup, and later planning stages associated with preparation for data gathering (in what is referred to here as the "survey" stage).
- Botha and Boon's "compile" stage maps to both "report" and "guide". This acknowledges that Orna, Buchanan and Gibb, and Henczel have individual stages/steps both for production of the information audit report and for ongoing activity such as strategy and policy development, and establishing the IA as a cyclical process.

Each of the four IA methods was then mapped (approximately) to this methodological baseline, as illustrated in Table 1, which shows the relative comprehensiveness of each of the respective IA methods. Burk and Horton's approach can be seen to be focused on core IA tasks, lacking stages for initial setup, strategic and organisational review, and post-audit policy and/or strategy development. Buchanan and Gibb's approach lacks an initial setup stage but is otherwise similar to both Orna and Henczel. Orna and Henczel are very similar, with only minor variation between them (for example, Orna conducts a preliminary review prior to setup in order to guide the latter). Finally, Burk and Horton adopt a now largely discounted bottom-up approach, while Buchanan and Gibb, Orna, and Henczel all adopt top-down approaches.

It is important to note that this comparison of individual approaches, while usefully illustrating relative comprehensiveness and commonality, should not be considered conclusive: it is an approximate, high-level mapping which does not fully identify or assess how well each methodology addresses each stage, which is why applicability and usability are also considered in the later sections.
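By way of illustration, the baseline and two of the stage mappings can be expressed as a minimal data structure, as in the Python sketch below. This is purely illustrative: the structure and helper function are ours and not part of any published IA methodology; the stage numbers simply reproduce those listed in Sections 2.1 and 2.2 (cf. Table 1).

```python
# Illustrative sketch only: the baseline stages from Section 3.1 expressed as a
# simple structure, with two of the four methodologies mapped to it using the
# stage numbers listed in Sections 2.1 and 2.2 (cf. Table 1).

BASELINE = ("Setup", "Review", "Survey", "Account", "Analyse", "Report", "Guide")

# Each methodology maps baseline stage -> its own stage number(s);
# an absent key means no explicit corresponding stage.
BURK_AND_HORTON = {"Survey": [1], "Account": [2], "Analyse": [3], "Report": [4]}
ORNA = {
    "Review": [1], "Setup": [2, 3, 4], "Survey": [5],
    "Analyse": [6], "Report": [7], "Guide": [8, 9, 10],
}

def gaps(method):
    """Return the baseline stages for which a methodology has no explicit stage."""
    return [stage for stage in BASELINE if stage not in method]

print(gaps(BURK_AND_HORTON))  # ['Setup', 'Review', 'Guide']
print(gaps(ORNA))             # ['Account']
```

Expressed in this way, the baseline serves as a simple checklist of stages when comparing methodologies or assembling a tailored approach.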
On a final note, while identifying the relative comprehensiveness of the respective methodologies is of value, the authors believe that the true value of this exercise has been not so much the comparison itself as the methodological baseline identified from it, for this provides a reusable framework for assessing completeness of approach (albeit at a high level), not just when considering adoption of an existing IA methodology, but also when developing an individual or tailored approach (by identifying the core stages/steps).

3.2. Applicability

Applicability, in this context, refers to the ability of the IA to meet the broad spectrum of organisational requirements. As Botha and Boon noted in their own review (referring to Robertson, 1994), an IA methodology should not "limit organisations in the execution of information audits, but rather guide them in terms of elements to investigate and tasks to include". Two criteria are proposed to measure this capability:

- Application: the ability of the method to address each of the elements and perspectives of an IA as defined by the information audit scope matrix (Buchanan & Gibb, 2007).
- Flexibility: the ability of the method to be tailored to any of the above elements and perspectives, and to the required depth and breadth.

[Fig. 1. Identifying an IA methodological baseline: Botha and Boon's (2003) hybrid model stages (Promote, Cost, Analysis, Collect, Plan, Define, Compile) mapped to the proposed baseline (Setup, Review, Survey, Account, Analyse, Report, Guide).]

3.2.1. Application

In the previous paper in this series (Buchanan & Gibb, 2007), it was proposed that there are four elements and three perspectives to the role and scope of an IA. The elements were derived from Earl's (2000) information strategy taxonomy: management, technology, systems, and content. They provide the auditor with the flexibility to focus on discrete types of information resources, dependent upon organisational requirements. The perspectives provide a second dimension, allowing the information audit to be scoped not just according to information resource type(s), but also by one or more desired organisational views. These dimensions of IA role and scope come together in the scope matrix illustrated in Fig. 2.

Fig. 2 also illustrates the approximate capability of each IA methodology to address each of these elements and perspectives. Burk and Horton (1988) can be seen to have the narrowest application of the four methods, capable of addressing each of the four elements but only from the "resource" perspective (being predominantly bottom-up in approach and lacking steps for organisational analysis). Buchanan and Gibb (1998), Orna (1999), and Henczel (2001) all have identical application: capable of addressing all four elements from both a strategic and a resource perspective. The lack of process application capability is the key observation from this exercise, a gap shared by all four methods. While Buchanan and Gibb, Orna, and Henczel all include the mapping of information flow within their respective methodologies (which, to a degree, draws parallels with process modelling), none includes process modelling as an explicit task or activity (although both Buchanan and Gibb and Henczel briefly refer to process modelling as an option). Burk and Horton do not include information flow, their approach being based on tabular mapping of information resources to organisational units.
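By way of illustration, this coverage can be summarised as a minimal sketch of the scope matrix; the representation below is ours and purely illustrative, with the coverage sets reflecting only the approximate assessment given above and in Fig. 2.

```python
# Illustrative sketch only: the IA scope matrix of Fig. 2 (elements x perspectives)
# expressed as sets of covered cells, reflecting the approximate assessment above.

ELEMENTS = ("management", "technology", "systems", "content")
PERSPECTIVES = ("strategic", "process", "resource")

def cells(perspectives):
    """All (element, perspective) cells covered for the given perspectives."""
    return {(e, p) for e in ELEMENTS for p in perspectives}

COVERAGE = {
    "Burk & Horton": cells(["resource"]),  # bottom-up, resource view only
    "Buchanan & Gibb": cells(["strategic", "resource"]),
    "Orna": cells(["strategic", "resource"]),
    "Henczel": cells(["strategic", "resource"]),
}

# The process perspective is the gap shared by all four methods.
covered = set.union(*COVERAGE.values())
assert cells(["process"]).isdisjoint(covered)
```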
In the view of the authors, this lack of a process perspective is a key omission, as the process perspective transcends the limitations of a relatively static functional view by focusing not on organisational structure, but on the dynamic relationship between information resources, information flow, and business tasks and activity (Gibb, Buchanan, & Shah, 2006). Further, the process models potentially generated by this perspective provide significant opportunity for achieving synergy and integration with related activity, such as business process modelling, information system architecture, and the early stages of information systems development, particularly if similar modelling conventions are adopted (Buchanan & Gibb, 2007). This synergy is key to extending the future value of the IA.

[Fig. 2. IA scope matrix: the elements (management, technology, systems, content) set against the perspectives (strategic, process, resource), showing the approximate coverage of Burk & Horton (1988), Buchanan & Gibb (1998), Orna (1999), and Henczel (2001).]

Table 1
A relative comparison of IA methodologies

                   Setup   Review   Survey   Analyse   Account   Report   Guide
Burk & Horton      –       –        1        3         2         4        –
Buchanan & Gibb    –       1        2        3         4         5        –
Orna               2–4     1        5        6         –         7        8–10
Henczel            1       –        2        3–4       –         5        6–7

Note: Sections 2.1–2.4 provide the individual stage names corresponding to the stage numbers above.

3.2.2. Flexibility

The authors propose that there are two dimensions to flexibility: firstly, the ability to remove or refine methodological stages/tasks according to the defined requirement; and secondly, the ability to adapt to the required organisational scope. Both aspects, it is reasonable to assume, would be addressed within initial IA "setup" activity. Both Orna (1999) and Henczel (2001) provide explicit setup stages, while Burk and Horton (1988) and Buchanan and Gibb (1998) do not (see Table 1). The authors of all four approaches indicate that their methodology could be tailored to individual requirements, but none includes explicit guidelines.

There is similarly limited guidance for establishing organisational scope. Both Orna and Henczel provide some guidance, but Burk and Horton and Buchanan and Gibb, although referring to organisational scope, do not. Orna recommends a project-oriented approach to the IA, beginning with a project where information is of high strategic value and has high potential for adding value. Other recommended criteria are that there should be clear and pragmatic boundaries, potential for "quick wins", and information-aware staff. Henczel states that "The scope of an initial (or first generation) information audit should definitely be the entire organisation", but also states that this could be preceded by a pilot project of a selected business unit (by type of information, or by operational or functional level), as preparation for the full IA. As an absolute rule, the authors consider this an unrealistic goal, particularly when consideration is given to complex corporate and/or federated organisational structures: it is almost always desirable to audit an entire organisation, but it is not always practical, and not always necessary, as the focus or priority may be individual business units or processes.

3.3. Usability

ISO 9241-11 (1998) defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use". User satisfaction, it can be argued, is at the heart of usability.
This is typically measured by evaluating how effectively and efficiently user requirements have been met (e.g. fitness for purpose), and also through evaluation of the overall user experience (e.g. ease of use).

3.3.1. Empirical evidence

It has previously been noted that there is a general lack of available information audit case studies which have explicitly tested methods as part of their respective audits (Botha & Boon, 2003). Both Orna and Henczel include case studies with their respective methodologies, but the cases are provided as examples, without scope for methodological critique or usability testing. Further, usability should ideally be tested by an external party (Open Group, 2003). It is therefore deemed more useful to identify and discuss case studies from the field. However, of the identified case studies, the majority (Dubois, 1995; Garratt & Du Toit, 2002; Guenther, 2004; Haynes, 1995; Langley, Seabrooks, & Ryder, 2003; Soy & Bustelo, 1999; Tali & Mnjama, 2004; Theakston, 1998; Wood, 2005) do not make reference to a specific adopted IA methodology, while those that do (Booth & Haines, 1993; Jones, 2005; Lamoral, 2001; Lubbe & Boon, 1992) make only brief mention and provide no methodological critique or feedback on usability. All four IA methods reviewed for the purposes of this study appear to receive equal attention, all being regularly cited, with no one method distinguishing itself as the "preferred approach" of auditors. This limited evidence from the field supports Botha and Boon's observation that "more methodologies need to be tested in practice".

A notable observation from this review of existing case studies is that the preferred approach appears to be a tailored one, based upon and referring to established IA methods but predominantly built around standard everyday research tools such as questionnaires and interviews. This "simplified" approach suggests that existing IA methods may be too complex and/or not readily adoptable (for example, Soy and Bustelo (1999) refer to "complicated methodologies"). However, this is difficult to gauge accurately, as the majority of cases simply do not provide enough detail.

3.3.2. Skills requirement

Beginning with the skills required to conduct the IA, the authors of all four approaches acknowledge that a multidisciplinary approach is required, and that the skillset required is considerable, drawing from several disciplines beyond the natural boundaries of most information professionals. For example, Buchanan and Gibb (1998) identify:

- Project management
- Strategic analysis
- Systems analysis
- Statistics
- Accountancy

Of course, some of these can be shared across a team, but the requirement remains considerable, particularly for the primary auditor. The basic information skills requirement is broadly similar across all four methods; however, the more specialised skill requirements are not, with some significant variance across methods, largely dictated by individual approaches to the "review" and "account" stages (see Table 1). These key variances are as follows:

- Burk and Horton (1988) have no strategic or organisational analysis steps, and use simple cost/value ratios rather than formal accounting methods for the account stage (though they do advise that the accounting practices of the parent organisation should be adopted).
- Buchanan and Gibb (1998) include in-depth strategic analysis steps and a formal accounting stage.
- Orna (1999) also includes a degree of strategic analysis, but with the emphasis more on organisational analysis (structure, management philosophy, etc.). Orna (1999) also includes cost activity but, in contrast to Buchanan and Gibb's formal accounting approach, recommends simple cost/value measures based upon Burk and Horton's (1988) approach.
- Henczel (2001) includes strategic analysis steps based on Buchanan and Gibb (1998), but adopts a simpler approach to "account", focused on the high-level costs associated with IA recommendations.

In summary, Buchanan and Gibb require the broadest skillset, drawing extensively from both the strategic management and accountancy disciplines. Orna and Henczel are next, but adopt a simpler, less in-depth approach, particularly for the "account" stage. Burk and Horton demand the narrowest skillset of the auditor, but this is largely due to the narrow scope of application.

The second, related consideration when discussing the skills requirement is who the intended user might be. All authors, not surprisingly, pitch their respective methodologies at information professionals. Notably, Buchanan and Gibb, in acknowledgement of the broad skillset required to conduct an audit, recommend that it be led by a senior information professional, a distinction not made by the other authors. None specify audit experience as a prerequisite of the auditor role, with both Orna and Henczel pitching their approach at the first-time auditor (Burk and Horton and Buchanan and Gibb do not discuss this). While audit experience would of course be of benefit, this acknowledges that experienced information auditors are currently few in number and that pragmatic decisions will have to be made when approaching audits for the first time, particularly with regard to scope (e.g. by beginning small). A further related consideration is whether the auditor should be internal or external: both Orna and Henczel promote internal resourcing (although Henczel does recommend external help with data analysis), while Burk and Horton and Buchanan and Gibb are both neutral on this point, each identifying the relative strengths and weaknesses of internal versus external auditors.

3.3.3. Tools support

Closely related to the skillset requirement are the tools and techniques provided in support of the IA. As with the skillset requirement, the tools and techniques required to support the IA process are broad ranging, covering (but not limited to) strategic and organisational analysis, data gathering and analysis, information flow and process modelling, systems analysis, cost/value accounting, and reporting and presentation. The following paragraphs discuss the tools support provided by each of the respective IA methodologies.

When introduced, Burk and Horton's (1988) methodology was the most comprehensive available for identifying and defining an organisation's information resources. From a tools support perspective, the methodology provided an extremely useful template for information resource entity (IRE) capture (referred to as an "inventory data form"), identifying and detailing all the data fields required to build an inventory of information resources. The methodology also included tables and weightings for determining cost/value (focused on value, with reference to recommended methods for costing, which Burk and Horton acknowledged as a problematic area).
However, while the IRE template has proven useful, and has been adopted and refined by both Orna and Buchanan and Gibb, the rating methods for assessing and ranking IRE values have been described as unclear (Buchanan & Gibb, 1998). Notably, the methodology was also supported by InfoMapper software, a database application designed to provide a purpose-built inventory system. However, Barclay and Oppenheim (1994), in a trial conducted at Trainload Coal, concluded that the software was inflexible, cumbersome, and of limited value, with both the authors and the participating organisation concluding that it would have been simpler to have adapted an existing commercial database application. One further toolset limitation with Burk and Horton relates directly to the comprehensiveness of the methodology. As previously noted, the methodology focuses on identification of an organisation's IREs and associated information mapping, with no stages included for organisational analysis or for strategy and/or policy development. This is a methodological limitation, particularly with regard to organisational analysis. The implications for toolset support are simple: if the scope/comprehensiveness of the methodology is narrow, then so too will be the corresponding tools and techniques provided or suggested to support it.

Toolset limitations were also identified with Orna's (1990) methodology. Nickerson (1991) and Buchanan and Gibb (1998) both highlighted the need for a more comprehensive set of tools and techniques to support several of the stages of the methodology, with Buchanan and Gibb drawing particular attention to a lack of tools and techniques to support the initial organisational analysis steps which frame the methodology. However, while still not comprehensive for all stages, Orna does provide further illustrative examples in later publications (Orna, 1999, 2004).

Buchanan and Gibb (1998) provide a more comprehensive toolset to support the audit by utilising existing tools and techniques drawn predominantly from the business and management science disciplines. Their approach was to detail the purpose and tasks for each IA stage, and to provide references to appropriate tools and techniques which could be utilised to complete the stage (rather than detailed guidance/examples). In this way, tools support is evident for all stages of their IA methodology. Notably, the methodology also provides a meta-model for mapping the relationship from business strategy to information strategy to information resources (subsequently adopted by Henczel, 2001), and identifies three established methods from accounting which could be applied to the problematic area of costing/valuing information resources. However, there is one notable toolset limitation: although process modelling is identified as an option, no process modelling guidance or tools are provided.

Henczel's (2001) approach is similar to both Orna (1999) and Buchanan and Gibb (1998), drawing extensively from both. Unfortunately, Henczel's methodology has also been criticised for lacking practical guidance; as Webster (2001) noted, "At times it is too vague and evades detail by claiming much depends on each organisation's culture". There is limited guidance for organisational analysis and information flow modelling, and a lack of detail on the recommended weightings and scales.
Henczel also refers to process modelling but, like Buchanan and Gibb (1998), provides no tools or guidelines. Buchanan and Gibb's (1998) information resource database approach is recommended, but Henczel does not address Buchanan and Gibb's (1998) own concerns regarding the potential complexity of this approach. Finally, allocation of costs to information resources is mentioned as a further step during the data analysis stage, but only brief guidelines are provided.

4. Conclusions

Comparative review of methodological comprehensiveness illustrated that Burk and Horton (1988) lack stages for initial setup, strategic and organisational review, and post-audit policy and/or strategy development. Buchanan and Gibb (1998) lack an initial setup stage but are otherwise similar to both Orna (1999) and Henczel (2001); all three provide relatively comprehensive methodologies. Notably, Burk and Horton adopt a now largely discounted bottom-up approach, while Buchanan and Gibb, Orna, and Henczel all adopt top-down approaches.

Guidelines for setting and managing scope are limited across all four IA methods. The recommendation is for more detailed guidance based upon the proposed IA scope matrix (see Fig. 2). With regard to scope, Burk and Horton have the narrowest application of the four methods, largely restricted to a resource orientation (see Fig. 2), while Buchanan and Gibb, Orna, and Henczel are all capable of both strategic and resource orientation. None has a process orientation, which is considered a key methodological limitation. The recommendation is for this to be incorporated into future IA methodologies.

Buchanan and Gibb (1998) provide the most comprehensive IA toolset, with tools and techniques listed and recommended for each of their stages and respective steps, but their methodology also requires the broadest range of skills. Both Orna and Henczel lack practical tools and techniques for some steps, but adopt a simpler, less in-depth approach, requiring a slightly narrower skillset. Burk and Horton provide useful templates and require the narrowest skillset of all four, but the limited scope and applicability of the method largely negate these benefits. All four methods lack tools/techniques for process modelling. The recommendation is for process modelling tools and techniques to be incorporated into IA toolsets, aligned with those used for related activity such as business process modelling and information system requirements engineering. Achieving this synergy will extend the value of the IA.

In conclusion, each of the four IA methodologies selected for this review has its own respective strengths, weaknesses, and applicability. None lacks purpose or could be described as unusable. Selection of an appropriate methodology should be based upon two key factors: firstly, the organisational requirement and the resulting IA scope and orientation required; and secondly, the skills and experience of the auditor, and the corresponding toolset support required. The above review should assist with this selection process.

References

Barclay, K., & Oppenheim, C. (1994). An evaluation of InfoMapper software at Trainload Coal. Aslib Proceedings, 46(2), 31–42.
Barker, R. L. (1990). Information audits: Designing a methodology with reference to the R&D division of a pharmaceutical company. Department of Information Studies, Occasional Publications Series No. 8. Sheffield: University of Sheffield. pp. 5–14, 27–34.
Best, D. (1985). Information mapping: A technique to assist the introduction of information technology in organizations. In B. Cronin (Ed.), Information management: From strategies to action (pp. 75–94). London: Aslib.
Booth, A., & Haines, M. (1993). Information audit: Whose line is it anyway? Health Libraries Review, 10, 224–232.
Botha, H., & Boon, J. A. (2003). The information audit: Principles and guidelines. Library Review, 53(1), 23–38.
Buchanan, S. J., & Gibb, F. (1998). The information audit: An integrated strategic approach. International Journal of Information Management, 18(1), 29–47.
Buchanan, S. J., & Gibb, F. (2007). The information audit: Role and scope. International Journal of Information Management, 27(3), 159–172.
Buchanan, S. J., & Gibb, F. (2008). The information audit: Theory versus practice. International Journal of Information Management, in press. doi:10.1016/j.ijinfomgt.2007.09.003.
Burk, C. F., & Horton, F. W. (1988). InfoMap: A complete guide to discovering corporate information resources. Englewood Cliffs, NJ: Prentice-Hall.
Carlisle, D. K. (2005). Resource review: Conducting an information audit. Information Management Journal, 39(2), 68–69.
Dalton, P. (1999). Investigating information auditing. Library & Information Research News, 23(74), 45–50.
Dubois, C. P. R. (1995). The information audit: Its contribution to decision making. Library Management, 16(7), 20–24.
Earl, M. J. (2000). In D. A. Marchand, T. H. Davenport, & T. Dickson (Eds.), Mastering information management (pp. 16–22). London: Financial Times Prentice Hall.
Garratt, O., & Du Toit, A. (2002). Accountability and demonstration of the value of information services in South African law firms. Aslib Proceedings, 55(3), 130–137.
Gibb, F., Buchanan, S., & Shah, S. (2006). An integrated approach to process and service management. International Journal of Information Management, 26, 44–58.
Gillman, P. L. (1985). An analytical approach to information management. The Electronic Library, 3(1), 56–60.
Guenther, K. (2004). Conducting an information audit on your intranet. Online, 28(5), 46.
Haynes, D. (1995). Business process reengineering and information audits. Managing Information, 2(6), 30–32.
Henczel, S. (2001). The information audit: A practical guide. London: K. G. Saur.
Henderson, H. L. (1980). Cost effective information provision and the role for the information audit. Information Management, 1(4), 7–9.
ISO 9241-11. (1998). Ergonomic requirements for office work with visual display terminals (VDTs)—Part 11: Guidance on usability. Geneva: International Organization for Standardization.
ISO 17799. (2005). Information technology—Security techniques—Code of practice for information security management. Geneva: International Organization for Standardization.
Jones, H. (2005). Risking knowledge management: An information audit of risk management activities within the Hobart City Council. Library Management, 26(6/7), 397–407.
Lamoral, D. (2001). An evaluation of information provision at the Institute for Commercial Forestry Research, South Africa: The findings of an information audit. Journal of Librarianship and Information Science, 33(4), 177–190.
Langley, E. A., Seabrooks, J., & Ryder, D. (2003). Information audit as a holistic approach: A case study. Available from http://www.sla.org/division/dst/LangleySLA062003.pdf.
Lubbe, W. F., & Boon, J. A. (1992). Information audit at a university. South African Journal of Library and Information Science, 60(4), 214–223.
Nickerson, G. (1991). Book review: Practical information policies: How to manage information flow in organisations. Database, 14(6), 86.
Open Group. (2003). Evaluating usability. Available from http://www.opengroup.org/openbrand/testing/testprocs/pd4.pdf.
Orna, E. (1990). Practical information policies: How to manage information flow in organisations. Aldershot: Gower.
Orna, E. (1999). Practical information policies (2nd ed.). Aldershot: Gower.
Orna, E. (2004). Information strategy in practice. Aldershot: Gower.
Quinn, A. V. (1979). The information audit: A new tool for the information manager. Information Manager, 1(4), 18–19.
Reynolds, P. D. (1980). Management information audit. Accountants Magazine, 84(884), 66–69.
Riley, R. H. (1976). The information audit. Bulletin of the American Society for Information Science, 2(5), 24–25.
Robertson, G. (1994). The information audit: A broader perspective. Managing Information, 1(4), 34–36.
Soy, C., & Bustelo, C. (1999). A practical approach to information audit: Case study. Managing Information, 6(9), 30–38.
Tali, M., & Mnjama, N. (2004). Information audit at the Southern African Development Community (SADC) Secretariat. Library Management, 25(4/5), 199–207.
Theakston, C. (1998). An information audit of National Westminster Bank UK's learning and resource centres. International Journal of Information Management, 18(5), 371–375.
Underwood, P. G. (1994). Checking the net: A soft-systems approach to information auditing. South African Journal of Library and Information Science, 62(2), 59–64.
Webster, M. (2001). A guide to information audits. Information World Review, 66.
Wood, A. (2005, June 23–25). St Helena Hospice information audit report executive summary. In Implementation of quality systems and certification of biomedical libraries conference, EAHIL workshop, Palermo.
Wood, S. (2004). Information auditing: A guide for information managers. Available from http://www.freepint.com/shop/report.
Worlock, D. R. (1987). Implementing the information audit. Aslib Proceedings, 39, 255–260.

Steven Buchanan is an Information Systems Lecturer in the Graduate School of Informatics, University of Strathclyde. He has carried out extensive consultancy work and research in the areas of information strategy, enterprise architecture, information systems, and information audits. He has worked across Europe and throughout Australasia for a number of public and private sector organisations, spanning telecommunications, finance, education, government, and microelectronics.

Forbes Gibb is a Professor of Information Science in the Graduate School of Informatics, University of Strathclyde. He has been involved in several major EU-funded research projects (SIMPR, STAMP, AUTOSOFT, MIND) and teaches in the areas of information strategy, service management, and content management.