self_assessment_toolkit.doc


Transcript of self_assessment_toolkit.doc


University of London Computer Centre
20 Guilford Street
London WC1N 1DZ
0207 692 1345
www.ulcc.ac.uk

Ed Pinsent, Project Manager ([email protected])

Assessing Institutional Digital Assets

The AIDA self-assessment toolkit


AIDA self-assessment toolkit - © 2008 University of London Computer Centre

REVISION HISTORY

First draft released 16 May 2008
Revisions following QA by Kevin Ashley 23-29 May 2008
Additions by Patricia Sleeman 21-28 May 2008
Additions by Jim Jamieson 10 June 2008
Draft released on website 11 June 2008
QA by Colin Love of Technology leg 30 June 2008
Additions made to Technology leg 01 July 2008
Final QA by Kevin Ashley 02 July 2008


Introduction

Recognising current institutional capabilities is an essential prerequisite for taking effective decisions about how an institution creates, manages, stores and preserves its assets. Likewise, understanding future requirements is necessary to enable an institution to decide whether specific actions need to be taken in regard to particular assets, or when and how it is desirable to improve on its current capabilities. The actions an institution can take will be determined, in part, by its institutional readiness and maturity in relation to digital asset management.

The AIDA project intends to help you with these decisions, linking technical awareness services with information management knowledge.

Background

The Assessing Institutional Digital Assets (AIDA) Project is managed by the University of London Computer Centre, and funded under the 1/07 JISC Capital Programme Call, to develop and test digital asset management tools in a variety of institutional settings.

Contents

Introduction
Background
Contents
1. What are digital assets?
2. What is digital asset management?
3. Introduction to the AIDA self-assessment toolkit
4. Who should use this self-assessment toolkit?
5. How do I use this toolkit?
5A: FAQs about using the toolkit
6. What's next?
Assessment 1: Organisation Leg
Assessment 2: Technology Leg
Assessment 3: Resources Leg
APPENDIX: Cornell's Five Stages

Stage 1: Acknowledge
Stage 2: Act
Stage 3: Consolidate
Stage 4: Institutionalise
Stage 5: Externalise


1. What are digital assets?

Digital assets are any form of salient information that plays a role in your Institution's efficiency and effectiveness. If managed properly, they can maximise efficiency, productivity and profitability.

They are held in digital form. They might have been made on a computer (born digital). Or they could have been digitised - eg a book / periodical scanned and made into a PDF; a collection of 35mm slides converted to TIFF files; magnetic audio tapes converted to FLAC files.

They include, but are not limited to, the following:

Reports
Audio
Digitised books or periodicals
Scientific data
Moving image
Publicity material
Institutional records
Audiovisual
Emails
Research outputs
Websites
Databases
Distance Learning Courses
Image repositories
e-Learning Objects
Geographic information systems
Licensed e-journal files

They could be stored (sometimes permanently) in an archive, a digital library, or an Institutional Repository. Or they could be kept for short to medium term for business reasons, then disposed of according to a records management schedule. Quite possibly, they are not being managed at all; sometimes assets can be sitting on someone’s desktop filestore, or languishing on a disk in a drawer.

They may be both shared and shareable. They could have reusable content that can support both short-term and long-term use. On the other hand, some of them may contain confidential or sensitive information that means sharing has to be managed and secure.

Although the word ‘asset’ can imply some sort of financial value, this self-assessment toolkit (along with most institutions) takes a wider view. Digital objects can be thought of as assets because they help defend the value of other things (as evidence for patent claims, for instance), because they are needed for regulatory compliance, because they have intellectual value, or because they meet some other organisational need.

They probably won't all be 'static': their value, the use made of them, their currency, importance and contextual significance may all change over time, affecting the status of your assets.

2. What is digital asset management?

Digital asset management is the systematic management of digital data, such as text, image, audio, and video files, so that the institution can understand its requirements for the assets, and ensure that those requirements are being met in the most effective manner. Requirements may include reuse,


commercial exploitation, or availability for audit purposes, amongst others. Effective asset management can increase the value and utility of the assets: if something must be kept for one reason, then it may be available for other uses, even if those other uses would not in themselves cause the asset to be retained.

With the increasing dependency on digital information, institutions are recognising that effective management of these assets is critical. Asset management includes, but is not limited to, digital preservation. It should also include activities such as selection, appraisal, retention and destruction, and concern itself with the continuity and maintenance of assets. Digital asset management is the broadly scoped challenge of deploying highly valued repositories for digital information.

For example: Two institutions, faced with the decision to retain the same type of material, may well reach the same conclusion about the importance and period of retention. But the actions which result may well be different, and will depend on the resources and technologies available to each institution and their relative degree of integration with relevant institutional workflows. Institution A may be able to utilise format-specific expertise in its institutional repository, which already has extensive holdings of a similar type; institution B may well decide to outsource some or all preservation actions, since it has no existing expertise in this area and does not expect to acquire significant holdings of this type in the medium-term future. On the other hand, if institution B anticipated a large growth in holdings of this type in the coming years, its best course of action might be to acquire the necessary expertise and resources to be able to deal with the problem itself.

Managers of digital collections often struggle with common challenges and questions. These include but are not limited to:

Selection of hardware and hardware limitations.
Managing the persistence of assets within the data repository.
Cataloguing (metadata) processes.
Control and security of digital assets.
Rights management.
Content creation.
Interoperability among digital libraries.
Long-term funding.

One of the assumptions lying behind AIDA is that this variation in appropriate solutions is commonplace. Many tools and strategies already exist to cope with different aspects of the asset management problem, but they aren’t all appropriate in every setting. AIDA therefore aims to allow you to evaluate your institution against a recognised capability scale, and then suggests appropriate actions based on that evaluation. This workbook deals with the first of those tasks: self-evaluation, or assessment. Later products from AIDA will deal with the second task, of recommending courses of action.


3. Introduction to the AIDA self-assessment toolkit

This is a toolkit to enable your institution to perform a self-assessment of its capacity, state of readiness, and overall capability for digital asset management.

This is not an audit. It's about measurement, not improvement; you don't need to supply 'evidence' that you are doing anything. But you do need to know that evidence exists, that you can get hold of it, and where you can get hold of it. (If you can't find any evidence at all, then that in itself may tell us something).

Although the toolkit is weighted and scored, there's no such thing as a 'bad score'.

4. Who should use this self-assessment toolkit?

Our approach to asset management seeks to bring together the best elements of the expertise of records managers, librarians, data curators, repository managers and others in developing an institution-wide approach to asset management.

What this means is that probably no single person can complete all parts of the toolkit, and additional help will be required. IT staff such as systems administrators, webmasters, and IT managers may also contribute.

5. How do I use this toolkit?

It's structured as a set of simple elements, each one describing an aspect of digital asset management. Each element is positioned on one of five Stages, indicating a stage of maturity or development. Stage 1 is the least developed, Stage 5 is the most developed. Each stage is outlined with a short and very general description, backed up with additional indicators and exemplars.

The process is spread over three discrete areas. These three 'legs' will tell us what we need to know about your Institution's assets from an Organisational, Technology and Resources point of view.

These Five Stages (Acknowledge, Act, Consolidate, Institutionalise and Externalise) and the Three Legs are based on the Cornell University maturity model, originally designed to assess an Institution's readiness for digital preservation. The AIDA project has adapted the Cornell model to apply to digital asset management. See the attached Appendix for further characterisation of the Five Stages.

Along with the typical characteristics, we try to provide exemplars or indicators of practice. Using these indicators, it's then up to you to evaluate your own institution, and identify your own practice. You might be well advanced in one


area, but require a lot of effort to improve practice in another. Only you know the priorities of your institution and the resources available to you.

For each element, try to locate your institution at the stage which best describes the place you're working in. Please record your decisions using the assessment scorecard provided. There is space allocated on the form for any additional observations or notes you may wish to make.

It would also help if you could complete the profile information on the first page of the scorecard. In particular, give us some indication of the scope of the assets, or asset collections, that have been included in your assessment.

5A: FAQs about using the toolkit

Aren't these elements and descriptions a bit simplistic and general?

Yes. We should add a note of caution about the tool and its elements. They are very simplified and should not be interpreted too literally. The descriptions given are intended to support communication between different groups who are encouraged to take a multi-disciplinary and systematic approach to analysis and self-assessment.

What if I'm not sure where to put myself?

There may not be an exact match, and you may find it difficult to choose between two stages. Don't spend too much time hovering between a Stage 2 and a Stage 3. You should be able to recognise yourself 'instantly' from one of the short descriptions or exemplars/indicators. If you can't do that, then it's not that you're in the wrong - it means that the AIDA team has not done its job properly.

Also, don't spend any time looking for information or documentary proof. However, if you have to admit that the evidence is not available and your answer would just be based on a general 'feeling', then score yourself as a Stage 1 for that element.

This exercise is taking me three days to complete. Am I doing something wrong?

The process should not take too much time. The information you need to complete the exercise should be readily available to you. If it isn’t, then it may as well not exist for the purposes of this assessment exercise. We'd be very surprised if one person could do all this by themselves. Ideally, we want a mix of people who are information professionals (archivist, records manager, IPR and copyright expert, Information Manager, Knowledge Manager, Librarian) and IT professionals.

What about these 'Possible supporting sources'? I haven't got any of those. Even if I had, it would take me a long time to find copies.


This is not an audit. AIDA is not expecting you to produce any evidence, or official documentation, to back up your assertion that your policies are at Stage 4, for example. The 'supporting sources' are just there as prompts, to describe the sort of place where the element in question might be documented in some way.

6. What's next?

At the end of this stage of the process, you'll have a completed scorecard which you should send back to the AIDA Project team for assessment.

At this stage we'll also ask you for some feedback on the process:

Was it a straightforward exercise?
Could you understand what the process was trying to do?
Did we omit anything important?
Were you able to 'recognise' yourself in the indicators?
Were any of the elements not a good fit for what you are actually doing?

We'll then send you a picture of where you sit on the 'five stages' chart, showing your institutional readiness for digital asset management, based on what you've told us about your current Organisational and Technological Infrastructure, and Resources Framework.

Later project outputs will use this to provide the following information:

Recommended actions and suggestions for how you can move forward to subsequent stages in the five stages path

Reassurance about potential threats which can be ignored (because the assets involved have a short useful lifetime, for example)

A further toolkit for analysis of your assets based on simple classification, and on formats

Recommendations for digital asset management based on current use and future use, including suggestions for making decisions about retention periods and future growth patterns across the entire lifecycle


Assessment 1: Organisation Leg

Organisation Element 01: Mission statement

Level of implementation

Stage 1: No mission statement relating to the management / creation of digital assets.

Stage 2: Digital asset management is on the agenda.

Stage 3: A mission statement is in development.

Stage 4: Mission statement is written and fully reflects institutional commitment to ownership of digital assets, their management and their creation.

Stage 5: Mission statement is published, accessible to others, and externally recognised.

Indicators / exemplars

Stage 1: Institution relies on departments to identify and manage their own digital assets. Low awareness amongst staff and misunderstandings common.

Stage 2: Issues are discussed at senior management level, recognising issues that will arise as a result of changing practice. Staff are being consulted and options explored.

Stage 3: Mission statement exists, but is not widely communicated within the institution.

Stage 4: Statement is communicated internally.

Stage 5: Mission statement is used and copied as an exemplar of good practice nationally or internationally.

Possible supporting sources

Written mission statement. Legal or legislative mandate. Regulatory requirements.

Explanatory notes

We recognise that it is unlikely that your Institution will have signed up to a single stand-alone policy or mission statement on digital asset management. Any statements may be scattered across several places, and even then any guidance relating to the creation, storage or value of assets may only be implied rather than made explicit. Besides the supporting sources, other suggested sources include: records management policy; digital library principles and guidelines; IR agreements, terms and conditions of use; archivist's collection and preservation policies; webmaster's terms and conditions of website use; e-Learning Object repository policies; also any institutional or department-wide policies governing information management.


Organisation Element 02: Policies and procedures

Level of implementation

Stage 1: Policies and procedures are implied or not yet written.

Stage 2: Policies are still in development, or are very project-specific. Procedures support projects and local pockets of activity.

Stage 3: Policies are written and implemented. Policies support programmes for asset management.

Stage 4: Policies are specific to the institution. They are implemented and vetted by senior management. Policies and procedures are integrated.

Stage 5: Policies are embedded in institutional practice.

Indicators / exemplars

Stage 1: Departments follow ad-hoc procedures in regard to asset management.

Stage 2: Policy is being discussed at senior management level. Some Departments may have local procedures, but they are not centralised and there are variances.

Stage 3: Policies are department-specific. Local procedures exist in defined areas.

Stage 4: Fully developed policy documents and procedure manuals for all aspects of asset management. Evidence exists of senior support for these policies.

Stage 5: See explanatory notes.

Possible supporting sources

Policy documents. Procedure manuals. Rules, protocols, handbooks, workflows.

Explanatory notes

For this element, it's assumed that the policies and procedures could apply to any and all aspects of digital assets and their entire life cycle. Hence written policies could exist for (e.g.) creation, digitisation, storage, transfer, management, cataloguing, metadata creation, retrieval, sharing, workflow, distribution, records management, destruction, retention, archiving, deposit, and preservation. However, Institutions at Stage 5 may find that their methods for doing these things are so advanced and embedded in institutional practice that their control is not made explicit through written policies. Alternatively, the institution's activities may be embedded in national or international ventures where policy is set at a higher level. This does not constitute a lack of policies.


Organisation Element 03: Policy review

Level of implementation

Stage 1: No mechanisms for policy review.

Stage 2: Initiation of policy review groups to comment on drafts. Departments review their local procedures.

Stage 3: Policy review groups are established and meeting on a regular basis.

Stage 4: Policies are embedded in institutional mechanisms and so are regularly renewed, updated and developed.

Stage 5: Policies are reviewed in line with external standards (such as ISO 9000), or are externally managed and reviewed.

Indicators / exemplars

Stage 1: Departments review their practices in ad-hoc fashion, or not at all.

Stage 2: Departments have established review processes, but they are unstructured and uncoordinated.

Stage 3: Reviews take place on a sporadic basis.

Stage 4: Cyclical processes take place on a regular basis.

Stage 5: Institution undergoes regular external audits to ensure quality of its policies.

Possible supporting sources

Review cycle. Policy review and update records.


Organisation Element 04: Asset management and sharing

Level of implementation

Stage 1: No cohesive strategy exists. No strategy or support for sharing.

Stage 2: Shared storage is available and sharing is encouraged, but retrieval is not always easy.

Stage 3: Managed storage available, based on local/departmental needs.

Stage 4: Storage is centrally managed.

Stage 5: Digital repository allows sharing and reuse, and external access.

Indicators / exemplars

Stage 1: Individual silos of digital assets kept in ad-hoc storage. No cross-departmental awareness of assets.

Stage 2: A leave of absence or illness of staff results in certain assets being unretrievable.

Stage 3: Sharing enabled across some boundaries. Asset management takes place in defined areas.

Stage 4: Systems allow centralised management of assets. Wider sharing is possible. Re-use and repurposing is widespread.

Stage 5: Systems allow centralised management of assets. Wider sharing with external stakeholders is possible. External re-use and repurposing is widespread.

Possible supporting sources

Organisational charts. Workflow models.

Explanatory notes

See also TECHNOLOGY 02, Appropriate technologies, and RESOURCES 03, Technological resources allocation.

One of the hallmarks of digital asset management is sharing. Sharing allows the systematic management of digital data, so that they can be reused and re-purposed. It aims to maximise the value of assets by facilitating easy storage and retrieval while protecting and, at times, enhancing their utility. In some Institutions, this may be happening if you have an IR project underway; see TECHNOLOGY 11.


Organisation Element 05: Asset continuity

Level of implementation

Stage 1: No evidence of contingency planning.

Stage 2: Contingency plans are considered when a crisis occurs.

Stage 3: Departmental contingency plans, for assets that require continuity and protection, are in process of consolidation.

Stage 4: Formal plans exist for risk management, contingency planning, and succession planning, where such plans are suitable for valuable assets or assets in need of protection.

Stage 5: Risk management and exit strategy is in place for all external partnerships and dependencies.

Indicators / exemplars

Stage 1: Asset owners cannot identify appropriate successors or arrangements should the need arise. Assets very likely to be unprotected.

Stage 2: Loss or damage to assets has occurred at least once. Breach of confidentiality has taken place at least once.

Stage 3: No formal centralised plan as yet, but asset owners can point to indicators that would form the basis of a plan.

Stage 4: Institution can demonstrate a level of contingency planning that is appropriate to management of its asset collections.

Stage 5: Consortium or partnership can demonstrate a level of contingency planning that is appropriate to management of its asset collections.

Possible supporting sources

Contingency plans. Risk management strategies and plans. Evidence of security failures.

Explanatory notes

This element refers to the continued availability of digital assets. It is appropriate to apply this line of enquiry to those assets that require continued availability; not all assets do. For example, for institutional records which are scheduled for destruction next year, a forthcoming lack of support for their file format is not a concern. If the material needs to be kept for 10, 20 or 100 years, then presumably you will be thinking of some form of action.

Asset continuity is about the protection and assurance of that availability, and confidentiality. We're looking for a level of planning that is appropriate to the assets. 'No plan' is acceptable, if the assets that are in scope don't need continuity.

See also RESOURCES 04, Risk analysis; and TECHNOLOGY 03, 04, 05.


Organisation Element 06: Audit trail of activities and use

Level of implementation

Stage 1: Asset creation practices are not documented. No audit trail of user behaviour.

Stage 2: Asset creation practices are documented in some departments. Local audit trails of user activity in some departments.

Stage 3: Departments have local creation standards by which they are able to prove certain things of use to the organisation.

Stage 4: Institution has a documented history of important stages in the operation of its asset management life cycle.

Stage 5: Audit activity is embedded in the Institution's workflow and may be virtually invisible. Collaborative commitment to transparency and accountability.

Indicators / exemplars

Stage 1: Some departments are not transparent in their actions, or do not feel they are accountable to others.

Stage 2: Some departments may be sharpening their audit procedures in response to a particular event.

Stage 3: Audit trails of user activity exist but are not yet centralised.

Stage 4: The history can be used to generate reports to meet targets or comply with certain external requirements.

Stage 5: Policy has some legal (statutory and contractual) requirements for which audit records are used to demonstrate compliance. Evidence is accessible to external stakeholders.

Possible supporting sources

Audit trail records. Policies, procedures, and results of changes. Asset retention strategy document. Evidence of audit trails meshing with other policies or audits (eg for records management or archival management).

Explanatory notes

Assuming you are technically capable of recording and auditing your activities, this element addresses how you use that information to demonstrate certain things about your activities, and about the behaviour of users in your organisation. For AIDA, you are not required to present any evidence or reports; we only want to know the degree to which you are capable of generating such audit information. See also RESOURCES 05, which relates to financial audit and transparency.


Organisation Element 07: Monitoring and feedback from users

Level of implementation

Stage 1: No mechanisms for relating user experiences to asset collections.

Stage 2: Limited and finite project to assess or monitor the value of local assets.

Stage 3: Departmental programmes underway to assess the value and use being made of their assets. Awareness is being raised.

Stage 4: Monitoring mechanisms are in place to ensure that assets are valued and used at appropriate levels.

Stage 5: Collaborative mechanisms ensure that best use is being made of the asset collections, both internally and externally.

Indicators / exemplars

Stage 1: Value of digital asset collections not known; the necessity of collecting or maintaining them is not being verified.

Stage 2: One-time audit of a department reveals large amount of resource is being wasted on a useless asset collection.

Stage 3: Departments gaining awareness of the value of their assets. Some user evaluation programmes are being set up.

Stage 4: Institution has a coherent view of the value of its assets. Regular user evaluation of assets is taking place.

Stage 5: External user evaluation is taking place.

Possible supporting sources

Audit trails. Formal or informal feedback records.

Explanatory notes

Assets are created with end users in mind. The Institution needs some mechanism whereby it relates the user experience to the actual assets. User evaluation will be an indicator here. For example, an Institution's journal subscription and purchasing policy may be based on the assumption that the University is interested in regularly acquiring assets in a specific subject area of research. But circumstances may change, and unless you have a measurement mechanism you could be wasting time, money and effort in generating and maintaining assets that are not really needed.


Organisation Element 08: Metadata management

Level of implementation

Stage 1: Metadata creation is not managed.

Stage 2: Academic staff are empowered to create basic metadata.

Stage 3: Cataloguing and archival skills available to support programmes for metadata creation.

Stage 4: Metadata is embedded in institutional workflow, using appropriate standards.

Stage 5: Metadata meets externally recognised standards and is integrated across systems.

Indicators / exemplars

Stage 1: No guidance to staff for metadata creation. No policy for external assets' metadata.

Stage 2: Metadata schemas are being invented ad-hoc by individuals or lone departments.

Stage 3: Metadata conforms to standards of interoperability.

Stage 4: Internal and externally-acquired assets are governed by appropriate guidance.

Stage 5: Metadata management is flexible and capable of expanding to meet the requirements of all stakeholders and partnerships.

Possible supporting sources

Written procedures and guidance for all aspects of the management of metadata. Written procedures for managing the metadata of externally-acquired assets.

Explanatory notes

See also TECHNOLOGY 10.

This element is about metadata creation policies and whether they are working. Institutional assets are often not tagged with as much management metadata as we would like. When completing this element, consider the value of improved object or collection metadata (such as retention periods or expected use), which may help you to recognise the value of metadata which is created or applied as part of normal workflow.

Take external acquisitions into account. External assets may lack metadata. Metadata may need conversion, or it may need adding to. The risk is that metadata could be lost, or duplicated, if you lack a policy for its management. External acquisitions may require a mixed policy, one which allows for the addition of new metadata and the management of existing metadata.


Organisation Element 09: Contractual agreements

Level of implementation

Stage 1: No consistent mention of asset management in agreements.

Stage 2: Best practice is hard to align with actual staff practices.

Stage 3: Obligations for asset management creation are defined. Some alignment with team-working and cross-departmental practices.

Stage 4: Detailed contractual agreements in place for all departments. Comprehensive deposit guidelines and transfer agreements in place. Agreements aligned with team working. Room for negotiation.

Stage 5: Consortial agreement or other collaborative contracts in place.

Indicators / exemplars

Stage 1: Staff rarely use contracts or agreements when creating digital assets.

Stage 2: 'Gentleman's agreements'; too many implicit understandings in contracts.

Stage 3: Most staff are aware of how agreements affect their work.

Stage 4: All staff work within a comprehensive contractual framework.

Stage 5: All staff work within an agreed collaborative framework.

Possible supporting sources

Contracts. Agreements. Related guidelines. Deposit and transfer guidelines.


Organisation Element 10: Intellectual Property Rights and rights management

Level of implementation

Stage 1: No clarity on ownership or rights associated with digital assets.

Stage 2: Basic awareness of departmental ownership, but rights management issues are still very implicit.

Stage 3: Programme of awareness-raising across Departments is underway.

Stage 4: Institutional confidence in integrity of rights information.

Stage 5: Formal shared agreements between organisations (internal and external) to establish clear ownership of assets.

Indicators / exemplars

Stage 1: Assets may exist where the rights are unknown or have expired.

Stage 2: Departments still unsure of which assets they are responsible for. Rights management not fully understood. Expiry period is not clearly understood.

Stage 3: Basic understanding of ownership responsibilities and rights issues.

Stage 4: Mechanisms are in place for detecting expiration of rights, where needed.

Stage 5: Rights management works through collaboration with partners.

Possible supporting sources

IR deposit agreements.

Explanatory notes

This element is concerned with the Institution's overall understanding of who owns the rights on its assets, and when these rights expire. Its main concern is with copyright and IPR for assets deposited in an IR, but general "ownership" issues are of interest too, if the assets are corporate records.


Organisation Element 11: Disaster planning and business continuity

Level of implementation

Stage 1: No written disaster plan. No business continuity plan.

Stage 2: Mini-disaster triggers awareness that a disaster plan is needed.

Stage 3: Written disaster plan is in development. Written business continuity plan is in development.

Stage 4: Disaster plan is embedded within institutional practice. Business continuity plan is embedded within institutional practice.

Stage 5: Disaster plan is shared with external partners.

Indicators / exemplars

Project manager for disaster and continuity has been appointed.

Plans are reviewed regularly. Plans are communicated to all staff. Plans have gained organisational acceptance.

Possible supporting sources

ISO 17799 certification. Disaster and recovery plans. Service continuity plan. Business continuity plan. Documentation linking roles with activities.

Explanatory notes

This element refers to the creation and organisational acceptance of the disaster plan and/or business continuity plan. See TECHNOLOGY 09 for implementation and testing of these plans.


Assessment 2: Technology Leg

Technology Element 01: Technological infrastructure

Level of implementation

Stage 1: Infrastructure does not allow or encourage good asset management.

Stage 2: Infrastructure meets local/departmental needs, but it is difficult to do anything outside narrow institutional boundaries.

Stage 3: Infrastructure allows or encourages sharing of assets.

Stage 4: Technological infrastructure is adequate to sustain asset management across the institution. Technological infrastructure is well-supported.

Stage 5: Infrastructure is distributed and integrated with national and international elements of infrastructure.

Indicators / exemplars

Stage 1: Infrastructure is decentralised, or has too many disparate components. No formal asset list.

Stage 2: Gradual awareness that sharing certain resources is necessary for adequate operation of service.

Stage 3: Infrastructure is in process of being assessed in terms of suitability for asset management.

Stage 4: Infrastructure is sufficient for the organisation's needs and is supported using appropriate SLAs etc. Upgrades and enhancements are planned and budgeted for. Formal asset list exists.

Stage 5: Technical infrastructure is well organised and highly integrated. Links to appropriate external services.

Possible supporting sources

Technical architecture documentation. Software inventory. System documentation. Support contracts. Use of strongly community-supported software.


Technology Element 02: Appropriate technologies

Level of implementation

Stage 1: Technologies are mixed, decentralised, disparate, mismatched, or non-existent.

Stage 2: Realisation that appropriate hardware and software technologies need to be considered over the long term in relation to management of assets. Planning process begins.

Stage 3: Planning and identification of appropriate technologies according to specific needs.

Stage 4: Appropriate hardware and software technologies infrastructure for management, storage and access of assets across the entire institution.

Stage 5: Appropriate hardware and software technologies infrastructure for the collaborative management of assets.

Indicators / exemplars

Stage 1: Hardware and software are failing to support the creation and management of digital assets. Basic access needs are not met. Hardware and software is purchased ad-hoc in response to departments' requests. Incompatible software and hardware in place across the organisation. Many technologies are project-driven and over-specific.

Stage 2: Use of compatible software across multiple departments. Definition of requirements. Most hardware meeting individual user needs.

Stage 3: Institution is re-assessing its technology investment policy and infrastructure.

Stage 4: Infrastructure investment is planned, to meet the institution's needs.

Stage 5: Distributed and integrated technologies.

Possible supporting sources

Technology watch. Documentation of procedures. Designated community profiles. User needs evaluation. Hardware inventory.


Explanatory notes

See also ORGANISATION 04 and RESOURCES 03.

Your response to this element may depend on the sort of assets that are in scope. If considering open-access materials, does your institution have the appropriate technologies for allowing access to these assets at any time? If some form of management of assets is required, the appropriate technologies may be those which handle long-term retention, or offer increased protection and managed access for particularly valuable assets.


Technology Element 03: Ensuring availability and integrity

Level of implementation

Stage 1: No backup policy. Storage of digital assets allows existence of uncontrolled copies.

Stage 2: Backup and copying functions in place in response to specific threats of loss.

Stage 3: Managed regular backup is taking place based on local/departmental needs.

Stage 4: Backup function exists for all assets. Backup anticipates growth of asset collection. Other types of asset protection than backup exist, more suited to long-term retention.

Stage 5: Backup functions exist outside the organisation. Copying of assets takes place outside the organisation in a distributed way. Synchronisation is an integrated feature.

Indicators / exemplars

Stage 1: Lack of coherent backup policy has led to inaccessible or lost assets. Assets cannot always be located precisely; many ambiguous locations. Master copies not clearly identifiable.

Stage 2: Backup services are largely reactive. Multiple copies of assets may exist but no evidence of copy synching. Minor disaster leads to focus on ensuring synchronisation planning.

Stage 3: Backup services are pro-active. Institution is evolving different copying policies for different classes of digital assets.

Stage 4: Multiple copies managed and synched. Number and location of all copies is known. Storage program includes backup and offsite storage for backups. Storage program manages unchanging assets separately and recognises their existence.

Stage 5: Multiple copies managed and synched. Number and location of all copies is known. Storage program includes backup and offsite storage for backups.

Possible supporting sources

Documentation of what is being backed up and how often. Audit log/inventory of backups. Validation of completed backups. Disaster recovery plan. Testing of backups. Support contracts for hardware and software for backup mechanisms. Random retrieval tests. System tests. Location register/log of digital assets. Workflows. System analysis of how long it takes for copies to synchronize. Procedures/documentation of operating procedures related to updates and copy synchronization.


Explanatory notes

See also ORGANISATION 05.

Ensuring the availability and the integrity of assets involves activities such as backup, synchronisation, and management of the location of copies. Note that institutions at Stage 4 may be using methods other than backup to protect their assets that require long-term retention.

Copies are not just made for protection; copies can be made to optimise best use of the asset. For example, where there are multiple sites with the same data, copying can be a way of managing bandwidth and bringing the assets nearer to the users. Even so, 'master copies' still need to be identified. There is also the possibility of making multiple copies to protect valuable data. Where multiple copies exist, are they all in synch, and do you know that they are?
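
As an illustration only (not part of the AIDA toolkit), the following minimal Python sketch shows one way an institution might check that multiple copies of assets are in synch with a designated master copy, by comparing checksums. The directory paths are hypothetical placeholders; real storage programmes would typically rely on their repository or backup software for this.

import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def copies_in_sync(master_dir: Path, replica_dirs: list) -> list:
    """Compare every file under the master store against each replica.

    Returns human-readable discrepancies (missing or differing files);
    an empty list means all known copies match the master.
    """
    problems = []
    for master_file in master_dir.rglob("*"):
        if not master_file.is_file():
            continue
        relative = master_file.relative_to(master_dir)
        master_hash = sha256_of(master_file)
        for replica in replica_dirs:
            copy = replica / relative
            if not copy.exists():
                problems.append(f"{relative} is missing from {replica}")
            elif sha256_of(copy) != master_hash:
                problems.append(f"{relative} differs in {replica}")
    return problems

if __name__ == "__main__":
    # Hypothetical locations: one designated master store and two replicas.
    issues = copies_in_sync(Path("/srv/assets/master"),
                            [Path("/srv/assets/replica1"), Path("/backup/assets")])
    for line in issues:
        print(line)
    print("All copies in synch." if not issues else f"{len(issues)} discrepancies found.")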


Technology Element 04: Integrity of information

Level of implementation

Stage 1: Detection processes for data corruption, if any, are ineffective. Storage and other media untested.

Stage 2: Processes exist for detecting, avoiding and repairing loss, but they fall short of what is tolerated within the policy.

Stage 3: Policies and procedures are being developed, but not yet centralised.

Stage 4: Processes exist for detecting, avoiding and repairing loss. Storage program includes media testing program.

Stage 5: All incidents of data corruption or loss, and steps taken to repair/replace corrupt or lost data, are reported to the external or consortial administration.

Indicators / exemplars

Stage 1: Data corruption goes unnoticed.

Stage 2: Initiation of error checking and media testing and planning.

Stage 3: Assessment of storage media is underway. Detection processes becoming more pro-active.

Stage 4: Preventative detection checks taking place regularly.

Stage 5: The institution records, reports, and repairs where possible all violations of data integrity. System is capable of notifying system administrators of any logged problems.

Possible supporting sources

Documents that specify bit error detection and correction mechanisms used. Risk analysis. Error reports. Threat analyses.

Explanatory notes

See also ORGANISATION 05. There is also overlap with the Obsolescence element, TECHNOLOGY 05 (next). This element is about your capability for dealing with corruption (of bitstreams, or decaying storage media).
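
By way of illustration only, here is a minimal sketch of the kind of fixity checking this element has in mind: record a checksum for each asset once, then periodically re-compute and compare, so that silent corruption is detected rather than going unnoticed. The manifest filename and asset directory are hypothetical; established preservation tools and repository platforms provide equivalent functionality.

import hashlib
import json
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(asset_dir: Path, manifest: Path) -> None:
    """Record a digest for every file, so later corruption can be detected."""
    entries = {str(p.relative_to(asset_dir)): file_digest(p)
               for p in asset_dir.rglob("*") if p.is_file()}
    manifest.write_text(json.dumps(entries, indent=2))

def verify_manifest(asset_dir: Path, manifest: Path) -> list:
    """Re-compute digests and report files that are missing or have changed."""
    recorded = json.loads(manifest.read_text())
    problems = []
    for name, digest in recorded.items():
        target = asset_dir / name
        if not target.exists():
            problems.append(f"MISSING: {name}")
        elif file_digest(target) != digest:
            problems.append(f"CORRUPT OR CHANGED: {name}")
    return problems

if __name__ == "__main__":
    assets = Path("/srv/assets")               # hypothetical asset store
    manifest = Path("fixity_manifest.json")    # hypothetical manifest file
    if not manifest.exists():
        build_manifest(assets, manifest)
    for problem in verify_manifest(assets, manifest):
        print(problem)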


Technology Element 05: Obsolescence

Level of implementation

Stage 1: Institution suspects it may have some digital materials that cannot be mounted, read or accessed.

Stage 2: Survey of obsolescent materials is underway.

Stage 3: The institution has a good understanding of the obsolescence issues across many / all its departments.

Stage 4: Obsolescence is dealt with pro-actively, and there are contingencies in place for future eventualities arising from obsolescence.

Stage 5: Automated notification systems in place.

Indicators / exemplars

Stage 1: Institution is unaware of possible data losses or that data may be held on obsolete media or in obsolete file formats.

Stage 2: Loss of digital assets through obsolescence triggers awareness of the problem.

Stage 3: Awareness within the institution that hardware, software and media obsolescence issues are more effectively dealt with before obsolescence occurs. Action being taken to mitigate the effects of obsolescence.

Stage 4: Actions are taken for dealing with obsolete file formats, storage media, storage drives, hardware and software.

Stage 5: File format registries and open-source software are widely used.

Possible supporting sources

Explanatory notes

See also ORGANISATION 05.
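
Purely as an illustrative sketch of the Stage 2 'survey of obsolescent materials', the Python snippet below tallies the file extensions found across an asset store, so the results can be reviewed against a file format registry (such as PRONOM) to judge which formats may be at risk. The directory path is a hypothetical placeholder; dedicated format-identification tools go much further than this.

from collections import Counter
from pathlib import Path

def survey_formats(asset_dir: Path) -> Counter:
    """Count files by extension as a crude first pass at a format survey."""
    counts = Counter()
    for path in asset_dir.rglob("*"):
        if path.is_file():
            counts[path.suffix.lower() or "(no extension)"] += 1
    return counts

if __name__ == "__main__":
    # Hypothetical location of the collection being surveyed.
    for extension, count in survey_formats(Path("/srv/assets")).most_common():
        print(f"{extension:>16}  {count}")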


Technology Element 06: Changes to critical processes

Level of implementation

Stage 1: Processes are not sufficiently documented when changed.

Stage 2: Realisation of the importance of identifying critical processes as well as changes.

Stage 3: Beginnings of documentation of changes to critical processes and identification of appropriate responses.

Stage 4: Documented process to notify about changes and resulting actions.

Stage 5: Identification of appropriate external partners who may assist with change documentation.

Indicators / exemplars

No awareness of critical processes.

Changes to processes are recognised, but it is not known if they affect compliance with the Institution's mandatory policies.

Institution manages changes in processes in data management, access, archival storage, ingest, and security. It is known what changes were made and when they were made. Traceability makes it possible to understand what was affected by particular changes to the systems.

Possible supporting sources

Documentation of change management process. Comparison of logs of actual system changes to processes versus associated analyses of their impact and criticality.


Technology Element 07: Security of environment

Level of implementation

Stage 1: No systematic analysis of the information environment and no security. No access control.

Stage 2: Initial planning and analysis of the information environment.

Stage 3: Information environment is being examined in terms of its suitability for the security of digital assets.

Stage 4: Systematic analysis of the information environment takes place, to ensure security. Program includes access-controlled area for storage media.

Stage 5: Regular analysis addresses external threats and denial of service attacks.

Indicators / exemplars

Institution maintains a systematic analysis of such factors as data, systems, personnel, physical plant, and security needs.

Possible supporting sources

ISO 17799 / 27001 certification. Documentation describing analysis and risk assessments undertaken and their outputs. Logs from environmental recorders. Confirmation of successful staff vetting.


Technology Element 08: Security mechanisms

Level of implementation

Stage 1: Security needs are not defined. Networks may be insecure.

Stage 2: Security needs increasingly defined but not yet widely implemented.

Stage 3: Security mechanisms are developing and becoming more pro-active.

Stage 4: Processes are in place to address defined security needs.

Stage 5: Security needs are well defined and integrated, taking external dependencies into account.

Indicators / exemplars

Stage 1: No firewall on internet link. No definition of security role in job descriptions.

Stage 2: Firewall installed, but with basic configuration to protect key systems only.

Stage 3: Firewall policies are managed but not overly restrictive.

Stage 4: Firewall policy rules force deny-all. Policies are reviewed regularly.

Stage 5: Fully configured firewall in place. Logs are checked daily. System file integrity checking in place.

Possible supporting sources

ISO 17799 / 27001 certification. System control list. Risk, threat, or control analyses. Addition of controls based on ongoing risk detection and assessment.


Technology Element 09: Implementation of disaster recovery plan

Level of implementation

Stage 1: No written disaster plan or continuity plans.

Stage 2: Business continuity needs are increasingly defined but not yet expressed as planning documents. Disaster impact scenarios are being defined and threat analysis is underway.

Stage 3: Disaster and continuity plans are written, but awaiting implementation and testing.

Stage 4: Disaster plan is implemented and tested. Processes in place for disaster and business recovery.

Stage 5: Responsibilities for disaster recovery are shared with external partners.

Indicators / exemplars

Plans are tested regularly. All staff are aware of plan.

It is possible to identify who can assist externally in event of disaster. Plans are tested regularly.

Possible supporting sources

ISO 17799 certification. Disaster and recovery plans. Business continuity plan. Service continuity plan. Documentation linking roles with activities.

Explanatory notes

See also ORGANISATION 11.


Technology Element 10: Metadata creation

Level of implementation

Stage 1: Metadata creation is not automated, nor integrated with assets at their creation stage.

Stage 2: Tools for creating metadata are in development and being tested.

Stage 3: Most departments have technological capability for automated metadata creation, but the schemas are not centralised.

Stage 4: Automated metadata tools (eg metadata extraction, authoring) are available and are fully integrated with the asset management cycle across the Institution.

Stage 5: Automated metadata tools are available and are fully integrated with the asset management cycle across the Institution and its external partnerships.

Indicators / exemplars

Possible supporting sources

Explanatory notes

See also ORGANISATION 08. This element is concerned with your Institution's technological capacity to create metadata, and the capacity to automate that process.
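
As a hedged illustration of what automated metadata creation can mean in practice, the sketch below captures a few basic technical metadata fields (size, modification date, guessed MIME type) for each file as it enters a workflow. The field names, directory and output filename are illustrative only; production tools (format identification and metadata extraction utilities, or repository ingest workflows) capture far richer metadata.

import csv
import mimetypes
from datetime import datetime, timezone
from pathlib import Path

def basic_metadata(path: Path) -> dict:
    """Derive minimal technical metadata for one file."""
    stat = path.stat()
    mime, _ = mimetypes.guess_type(path.name)
    return {
        "identifier": str(path),
        "title": path.stem,                      # placeholder: filename used as title
        "format": mime or "application/octet-stream",
        "size_bytes": stat.st_size,
        "modified": datetime.fromtimestamp(stat.st_mtime, tz=timezone.utc).isoformat(),
    }

def catalogue(asset_dir: Path, out_csv: Path) -> None:
    """Write one metadata row per file, ready for review or repository ingest."""
    rows = [basic_metadata(p) for p in asset_dir.rglob("*") if p.is_file()]
    if not rows:
        return
    with out_csv.open("w", newline="") as handle:
        writer = csv.DictWriter(handle, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    # Hypothetical paths: scan an asset directory and write a metadata CSV.
    catalogue(Path("/srv/assets"), Path("asset_metadata.csv"))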


Technology Element 11: Institutional Repository

Level of implementation

Stage 1: Assets are not managed in a repository.

Stage 2: Need for repository identified. Planning for repository infrastructure started.

Stage 3: Repository infrastructure is being established. Managed programme is underway across the institution to protect the integrity of its digital assets.

Stage 4: Digital repository arrangements for managing asset collections are established in-house.

Stage 5: Consortial digital repository arrangements are in place.

Indicators / exemplars

Stage 1: Longevity of data could be at risk. Retrievability is not made easy. Institution has no IR.

Stage 2: Local projects put in place to assure protection of some departmental assets. Institution has IR at pilot stage.

Stage 3: Institution has IR but it can't accept or manage all known asset types, or manage appropriate levels of restriction.

Stage 4: Assets are protected; the longevity of the data is assured. Institution has IR capable of handling all its valuable assets with appropriate security levels.

Stage 5: Digital assets are secure, and retrievable by others.

Possible supporting sources

Explanatory notes

Not all Universities or HE/FE institutions have an Institutional Repository (IR), although JISC makes some assumptions about their use in the HE/FE sector. An Institutional Repository could be managed using DSpace, Fedora or ePrints, but it could equally be any centralised or shared repository that enables sharing of resources. Some characteristics of an IR might include managed access; automated metadata creation; remote submission; preservation.

Your response here will reflect the stages of availability and overall use made of the IR. Is it just a pilot? Can it take restricted materials? Can it take research data? Can it handle records and perform records management functions? If you don't have an IR at all, score yourself at Stage 1.


Assessment 3: Resources Leg

Resources Element 01: Business planning process

Level of implementation

Stage 1: Business plan not yet written.

Stage 2: A business plan that helps sustainability of assets is on the agenda.

Stage 3: A business plan is written and in process of implementation.

Stage 4: Processes are in place throughout the Institution to support sustainability of assets.

Stage 5: Business partnerships and consortial agreements help to support sustainability of assets.

Indicators / exemplars

Stage 1: Very low level of financial commitment.

Stage 2: Process for a business plan is being developed with help of key staff.

Stage 3: Objectives and strategies are being set up and all staff notified.

Stage 4: Sustainable funding for core programme areas. Income and assets are generated through services.

Stage 5: Income is generated through third-party partnerships and external grants. Distributed financial management between organisations may be possible.

Possible supporting sources

Business plan. Annual financial reports. Operating plans. Budgets. Financial forecasts.


Resources Element 02: Review of business plan

Level of implementation

Stage 1: No review mechanisms exist.

Stage 2: Initiation of business review groups. Departments review their local business procedures.

Stage 3: Business review groups established and meeting on a regular basis.

Stage 4: Institutional processes are in place to regularly review and adjust business plan.

Stage 5: Institution can demonstrate its responses to external audits.

Indicators / exemplars

Stage 1: Departments review their business plans in ad-hoc fashion, or not at all.

Stage 2: Departments have review processes, but they are unstructured and uncoordinated.

Stage 3: Reviews taking place sporadically.

Stage 4: Cyclical planning processes are performed annually.

Stage 5: Institution undergoes regular external audits to ensure quality of its policies.

Possible supporting sources

Business plan. Audit planning records and results. Financial forecasts. Recent audits.


Resources Element 03: Technological resources allocation

Level of implementation

Stage 1: Allocation of technological resources does not match the scale of the asset collections.

Stage 2: Some one-off asset projects are successful when seeking allocation of resources.

Stage 3: Long-term planning of resource allocation is taking place at senior level.

Stage 4: Policy and dedicated funds available for technology development.

Stage 5: Dedicated resources are shared in a common pool.

Indicators / exemplars

Stage 1: Policies are inflexible. It is rarely possible to obtain an increased allocation of (eg) network capacity or server capacity when needed.

Stage 2: Certain asset collections are better resourced than others; some may suffer from inadequate resource allocation, such as not enough storage space.

Stage 3: Institution is aware of disproportionate allocation and is pro-actively reassessing its allocation of technology resources.

Stage 4: Institution is fully capable of assigning the necessary technological resources to all its important and valuable asset collections. Technology watch is in place for emerging technologies and future needs are anticipated.

Stage 5: Distributed and integrated technologies are in place.

Possible supporting sources

Explanatory notes

See also ORGANISATION 04 and TECHNOLOGY 02.

This element is about having the accessible technological resources to hand when needed, for asset management.Your response could vary to reflect the degrees of severity of the problem, e.g. knowning whether more network or more disk capacity can be made available when needed. If the driving force behind technological resources is not related to the scale of the asset collection, then things could be problematic.One example is bandwith allocation always being based on the number of students who use it; an inflexible policy means you can never expand that allocation.


Resources Element 04: Risk analysis

Level of implementation
Stage 1: Risks associated with assets not yet identified.
Stage 2: Some departmental awareness of local risks attached to their asset management.
Stage 3: Institution has identified and documented some risk categories for its digital assets, but balance is not maintained.
Stage 4: Institution shows ongoing commitment to risk analysis. All staff aware of the risk management plan.
Stage 5: Risk analysis is based on international standards as well as incorporating external assistance. External dependencies increase the institutional risk appetite.

Indicators / exemplars
Stage 1: No formal risk analysis.
Stage 2: Mini-crisis will trigger local efforts at assessing possible localised risks.
Stage 3: Risk assessment becomes more cohesive and planned, using a model for predicting and planning.
Stage 4: Institution can demonstrate a level of risk analysis that is appropriate to management of its asset collections.
Stage 5: Consortium or partnership can demonstrate a level of risk analysis that is appropriate to management of its asset collections.

Possible supporting sources
Risk register. Technology infrastructure investment planning documents. Cost benefit analyses. Financial investment documents and portfolios. Licenses and contracts. Evidence of revision based on risk.

Explanatory notes
See also ORGANISATION 05 and TECHNOLOGY 03, 04, 05. There may also be some overlap with ORGANISATION 11 and TECHNOLOGY 09.


Resources Element 05: Transparency and auditability

Level of implementation
Stage 1: Financial practices are difficult to audit.
Stage 2: Local auditing of business activity in some departments.
Stage 3: All departments engaged and pertinent documentation is becoming available to all staff.
Stage 4: Financial practices are transparent and compliant across the Institution.
Stage 5: Practices are transparent and compliant and meet international standards.

Indicators / exemplars
Stage 1: Some departments are not transparent in their financial transactions, or do not feel they are accountable to others.
Stage 2: Documentation of practices and decision-making process in some departments is underway.
Stage 3: Audit trails of business activity are not yet centralised.
Stage 4: Institution can demonstrate a level of transparency that is appropriate to management of its asset collections.
Stage 5: Financial practices are audited by external third parties.

Possible supporting sources
Demonstrated dissemination requirements for business planning and practices. Examples of accounting and audit requirements, standards, and practice. Evidence of financial audits already taking place.


Resources Element 06: Funding

Level of implementation
Stage 1: Funding gap is recognised, but financial commitment to asset management is low.
Stage 2: Some short-term projects exist to address the funding gap.
Stage 3: Funding sources identified. Funding programmes are being established. Some funding and support beyond projects, but limited.
Stage 4: Funding inbuilt to core function of organisation according to requirements. Sustainable funding is in place, dedicated for maintenance of assets.
Stage 5: Institution uses outside sources of expertise for asset management (e.g. consultants and contractors).

Indicators / exemplars
Stage 1: Asset maintenance is not adequately supported.
Stage 2: Funding comes in as short-term, one-off grants or ad-hoc awards.
Stage 3: Funding sources are ongoing and regular, but still not sustainable.
Stage 4: Dedicated funds for technology development, replacement and upgrades. Sustainable funding for core activities and enhancement.
Stage 5: Institution is actively seeking further potential funding sources.

Possible supporting sources
Annual financial reports. Fiscal policies, procedures, protocols, requirements. Budgets. Financial analysis documents. Business plans. Evidence of active monitoring and preparedness.


Resources Element 07: Staff skills

Level of implementation
Stage 1: No dedicated staff for asset management.
Stage 2: Awareness-raising for departmental staff is taking place.
Stage 3: Staff skills across departments are being harmonised, in line with management-level support.
Stage 4: Appropriate staff are available for asset management. Organisational expertise exists. Technical expertise supports asset management.
Stage 5: Dependencies exist among a collaborative pool of staff, both internal and external.

Indicators / exemplars
Stage 1: Departmental staff lack skills to manage their assets. Staff lack awareness of the value of assets.
Stage 2: Skills gap is recognised; gap is being addressed through finite projects.
Stage 3: There is increasing consistency in the way that assets are created and managed by staff.
Stage 4: Funding enables the steady maintenance of core staff skills. Key staff posts can be filled quickly.
Stage 5: Looking outwards at skill sets of external organisations with a view to collaborative efforts.

Possible supporting sources
Staffing plan. Competency definitions. Job descriptions. Staff profiles and CVs.


Resources Element 08: Staff numbers

Level of implementation
Stage 1: Institution is understaffed, or there is a mismatch between the requirements of its mission statement and staff numbers.
Stage 2: Staff numbers increase slightly on an ad-hoc project basis.
Stage 3: Appropriate and sufficient staff with sustainable funding.
Stage 4: Appropriate numbers of staff are in place to support good asset management.
Stage 5: Appropriate numbers of staff to support good asset management and preservation of assets.

Indicators / exemplars
Stage 1: Little or no money available for recruiting additional staff.
Stage 2: Some departments, which recognise the value of their assets, are recruiting more staff as budgets allow.
Stage 3: Staff numbers across departments are managed within recruitment programmes, in line with management-level support.
Stage 4: Funding enables the Institution to keep staff numbers at a suitable level.
Stage 5: Staff numbers match external commitments and requirements.

Possible supporting sources
Organisation charts. Definitions of roles and responsibilities. Mission statement.


Resources Element 09: Staff development

Level of implementation
Stage 1: Inadequate investment in training. Staff skills becoming obsolete. Institution has no staff development programme.
Stage 2: Some departments recognise local development needs among their staff.
Stage 3: A co-ordinated institutional programme of development is underway across departments.
Stage 4: Institutional commitment to professional development / currency of skills and expertise. Institution has an appropriate professional development and training policy.
Stage 5: Staff development is heavily influenced by external requirements and commitments.

Indicators / exemplars
Stage 1: Skill sets are not a good fit for asset management requirements.
Stage 2: Identification of training needs takes place on an ad-hoc basis. Individual staff are sent on vocational training courses as time and budget allow.
Stage 3: Institution has not yet devised an appropriate professional development and training policy in support of its digital assets.
Stage 4: Staff skill sets are evolving in line with technological changes.
Stage 5: Looking at external organisations with a view to skill sharing and exchange.

Possible supporting sources
Professional development plans and reports. Staff appraisal reports. Performance goals. Training policy. Training requirements. Training expenditure. Certificates.


APPENDIX: Cornell's Five Stages
Adaptation of the Cornell maturity model for the management and creation of digital assets

These five Stages represent a maturity model which can be used as a benchmark for understanding where an institution currently stands; institutions will pass through each stage, however briefly. The Stages provide a way of communicating about asset management development, and they enable an institution to measure its progress towards programmatic goals for the creation and management of digital assets.
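
By way of illustration only (this is not part of the AIDA toolkit itself), the stage numbers produced by a self-assessment can be tabulated very simply. The short Python sketch below assumes a hypothetical score sheet in which each element of a leg is recorded against a stage from 1 to 5; the element names shown are taken from the Resources Leg above, but the data structure, the averaging and the printed summary are illustrative choices rather than anything prescribed by AIDA.

    # Minimal sketch, assuming scores are kept as {leg: {element: stage}}.
    # Element names are examples from the Resources Leg; the stage values are invented.
    from statistics import mean

    scores = {
        "Resources": {
            "Business planning process": 2,   # Resources Element 01
            "Funding": 3,                     # Resources Element 06
            "Staff skills": 2,                # Resources Element 07
        },
        # The Organisation and Technology legs would be recorded in the same way.
    }

    for leg, elements in scores.items():
        profile = "; ".join(f"{name}: Stage {stage}" for name, stage in elements.items())
        # A mean stage is just one possible summary; AIDA does not prescribe one.
        print(f"{leg} Leg: {profile} (average stage {mean(elements.values()):.1f})")

In practice the toolkit is completed as a written self-assessment supported by evidence; a tabulation like this is simply one convenient way of seeing the overall profile across the three legs.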

Stage 1: Acknowledge
Understanding that the management and creation of your digital assets is a local concern

It begins with the sense that the organisation's digital assets are someone else's responsibility.

The enormity of the problem is realised, but this leads to paralysis - nothing is done.

People hope that the future will take care of itself (misplaced optimism), or that there is a single 'magic bullet' (i.e. a single technological solution) which can solve everything.

Yet people in the organisation depend heavily on their digital assets. No ownership is recognised, either locally or at senior level.

Then access difficulties are encountered - an important asset cannot be found because it's unmanaged, or (even worse) it has been lost, deleted, or corrupted.

This can lead to acceptance, possible taking of ownership, and acknowledgment.

This stage involves someone 'saying yes'.

Key Indicators:

Organisational infrastructure: Often non-existent; policies are implicit, or very high-level

Technological infrastructure: Non-existent, or heterogeneous and decentralised; disparate elements

Resources: Generally low, finite, ad-hoc financial commitment


Stage 2: Act
Initiating projects for management or creation of digital assets

It begins with a response to a specific threat. This can often be very reactive in nature.

Management of digital assets is still perceived as something outside the mainstream of organisational functions.

There is still a tendency to view it exclusively as a technology problem, which can be 'fixed'.

Thus, a one-time fix is put in place for something which is actually an ongoing problem.

But the inadequacy of this approach is soon evident.

So, 'acting' may not be enough - but you can learn a lot through the action.

Key indicators:

Organisational infrastructure: Implicit policies, or expressed in general terms; increased evidence of commitment

Technological infrastructure: Project-specific; reactive; ad-hoc location. 'Islands of automation' are being created, which lack integration, are isolated and ineffective.

Resources: Often project-based and finite funding


Stage 3: Consolidate
Moving from projects to programs for managing digital assets

It begins with the recognition that others are doing this too - recognise external efforts, and external standards.

Realise that your projects are not sufficient - they are not compatible with long-term planning.

More explicit statements of recognition of the problem.

Activities turn into regular ongoing work, and separate projects are consolidated.

The programs you develop are seen as tied to the mission of the organisation, and it is recognised that they deserve support.

Key indicators:

Organisational infrastructure: Development of basic and essential policies

Technological infrastructure: Assessment of technology investment and requisite infrastructure; shift to pro-active mode. A hard look at what is needed to manage digital assets.

Resources: Some funding and support beyond projects, but limited


Stage 4: Institutionalise
Incorporate the larger environment

It begins when you realise that uncoordinated and finite approaches to managing your digital assets are too expensive to run, and are redundant.

You begin to map your activities to external standards and practices.

Build an integrated response to digital asset management problems, by coordinating across the entire organisation.

The institution develops internalised support.

Start saying no - recognise there are certain things you cannot do alone ("That database is too large and complex for us to manage by ourselves").

Key indicators:

Organisational infrastructure: Consistent, systematic management; comprehensive policy framework

Technological infrastructure: Technology planning anticipates needs; infrastructure investments are planned and implemented

Resources: Sustainable funding identified for core program areas and enhancement


Stage 5: Externalise
Embracing collaboration and dependencies with other institutions

It begins when you realise that your institutional programs, while necessary, are still not enough. You need to look beyond organisational boundaries.

You make formal and informal arrangements with others.

Integrated management approaches are coupled with risk management.

You make commitments and connections with other institutions that are also managing their assets, sharing knowledge and expertise.

Create dependencies; recognise that it is a collaborative effort, involving obligations. You can't do it all alone.

Key indicators:

Organisational infrastructure: Virtual organisations complement institutional ones; collaboration is an inherent feature of successful planning in any aspect of digital asset management. Funding applications are often more successful when made in collaboration.

Technological infrastructure: Distributed and highly integrated; extra-organisational features and services

Resources: Varying levels of investment, but sustainable funding; possibly distributed financial management between organisations, possibly from a single funding source.
