
THE 'RESOURCE LIFECYCLE': A CONCEPTUAL META-FRAMEWORK FOR NSDL PROGRAM EVALUATION

An NSDL Core Integration White Paper: Latest revision - March 2006

THIS IS A DRAFT DOCUMENT. BEFORE CITING, PLEASE CONTACT: Michael Khoo - mjkhoo@ucar.edu

1 Introduction

This White Paper outlines the macro-level framework - the 'resource lifecycle' - that is guiding the development of individual micro-level evaluations of different aspects of the NSDL program.

The meta-framework described in this paper is based on the concept of the digital resource. A digital resource is defined as any servable electronic file, retrieved by a user after a search of NSDL, including text files, pictures, films, animations, audio files, software applications, applets, etc. The resource lifecycle meta-framework describes the various stages in the 'lifecycle' of a resource, from the moment of creation through to the moment of educational use, and beyond to the moment of redesign and improvement. In doing so, it identifies appropriate points within this lifecycle at which NSDL's past and current capabilities may be evaluated.
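The breadth of this definition is easier to see when written down as a record. The following minimal Python sketch is illustrative only; the field names are assumptions made for this paper, not an NSDL schema:

    from dataclasses import dataclass, field

    @dataclass
    class DigitalResource:
        # Any servable electronic file retrieved by a user after a search of NSDL.
        url: str           # network location of the servable file
        media_type: str    # e.g. "text/html", "image/gif", "audio/mpeg"
        created: str       # date of creation
        stages_passed: list = field(default_factory=list)  # lifecycle stages completed so far

    # A resource at the moment of creation, with the whole lifecycle still ahead of it.
    resource = DigitalResource(url="http://example.org/tectonics.html",
                               media_type="text/html",
                               created="2006-03")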

This model provides a set of general guiding principles and research questions, which can then be tailored to individual contexts in order to generate specific research questions and strategies. It is not offered as the definitively correct or prescriptive description of NSDL, but rather as a flexible, reconfigurable template, within which various locally relevant evaluation activities can be planned and implemented.

2 Focus, outcomes and audience

The focus of the evaluation is on a program-level, rather than an individual project-level, evaluation of NSDL. The central aim of the evaluation will be to assess how NSDL works as a program, particularly with regard to the efficacy of NSDL's program-wide organizational communication, organizational knowledge processes, and organizational integration. The evaluation will thus look at how well the efforts of various NSDL projects are supported by, and in turn contribute to, the overall NSDL program.

The evaluation findings will be used to inform the NSDL Core Integration report due at NSF at the end of 2006. The findings will also be used to inform a series of smaller and more focused evaluation reports that will be released to the NSDL community in the intervening period.

3 What is a meta-framework and why is it necessary?

A meta-theory is a general theoretical framework that can be used to develop specific, individual, locally relevant theories (Giddens, 1984). In the context of NSDL's evaluation plan, an evaluation meta-framework is thus a general evaluation framework that NSDL can use to develop specific, locally relevant NSDL evaluation strategies.

An evaluation meta-framework is useful for NSDL because NSDL is organizationally complex, comprises a heterogeneous community, and requires a coherent evaluative narrative; a successful NSDL evaluation will have to address each of these three points.

Organizational Complexity
NSDL is a complex program that has funded over 200 projects covering a wide range of activities, including library architecture, databases and search engines, web site design and usability, resource creation, collection development, and community and outreach activities. Theoretically, therefore, NSDL can be modeled as a sociotechnical artifact (Bishop, Van House & Buttenfield, 2003). That is, NSDL can be modeled as a mixture of projects, technologies, organizations, people and practices that are connected, mutually constitutive, and emergent and evolving. As NSDL projects vary widely in form and function, no single evaluation strategy is suitable for and applicable to the whole of NSDL; an NSDL evaluation strategy will therefore have to be 'multi-faceted' (Marchionini, Plaisant & Komlodi, 2003), bringing to bear a range of evaluation strategies appropriate to a range of contexts and questions.

Heterogeneous Community
An NSDL evaluation also has to address the fact that NSDL is a distributed, federated and heterogeneous organization, which includes a wide range of personnel and 'communities of practice' (Wenger, 1998). Different communities of practice will have different forms of digital library knowledge that will be partly tacit and taken-for-granted by each group. They will have different definitions of NSDL as an organization and as a digital library, and as a consequence, an NSDL evaluation will have to address the concerns of and be tailored to the interests of these different groups.

Narrative Coherence
Finally, an NSDL evaluation report should paint a coherent and compelling picture of NSDL's past achievements and future goals. Here, Reeves et al. (2003) stress that digital library evaluation strategies and reports should, at a foundational level, be driven by questions that can inform and support the future development of that digital library. Evaluation should not just measure and describe what is there, but provide a clear pointer to what might be, particularly in the form of data that can inform future development strategies. An NSDL evaluation strategy should be able to address the complexity and heterogeneity outlined in the previous two points, and also synthesize a coherent narrative that is useful for all NSDL stakeholders. An evaluation meta-framework can provide such a bridge from complexity to coherence.

Using a small set of basic concepts to address NSDL's organizational complexity, an evaluation meta-framework can therefore tie multi-faceted evaluation efforts into a coherent narrative useful for future NSDL development efforts, and generate a coherent series of specific research questions that addresses major areas of NSDL activity.

4 Assessing NSDL's educational impact

A crucial question to be addressed in the CI evaluation work is the extent of NSDL's educational impact. At present, the classroom impact of digital libraries is understudied, and compared with research into digital library architectures, tools, and services, relatively little is known about how digital libraries and associated technologies are actually used in educational settings (e.g. Arms, 2005). There are, however, a number of theoretical and practical obstacles that stand in the way of developing an evaluation strategy that could remedy this deficit.

One significant obstacle is that the use of digital libraries is a complex phenomenon that is theoretically underdetermined. That is, we lack detailed understandings and models of all the variables associated with digital library use that would allow us to isolate and study the impact of the technologies themselves (Kozma & Quellmalz, 1995). While the quality of a library's resources and services is crucially important, to evaluate this in context we need to know about a wide range of attendant technological, economic and social contingencies, such as a school's intranet and bandwidth, the number and age of a school's computers, the presence or absence of technological support staff, the attitudes of teachers and the policies of school administrators towards new technologies, the impact of new and unfamiliar technologies on existing work patterns and practices, the impact of new educational policies on teaching practices, and so on. Further, we also need to know how these variables are related. Without a thorough understanding of all these issues, it will be impossible to design controlled experiments and instruments that can successfully isolate the impact of one variable amongst many (in this case, NSDL).

This is not to say that 'impact' studies are impossible, but rather that evaluation research in this direction will have to begin by identifying useful impact models and variables, rather than assuming that 'NSDL' and 'educational impact' exist as unambiguous and well-articulated phenomena that can be studied with relative ease.
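To illustrate the scale of the problem, the contingencies listed above can be sketched as a first-pass record of variables. The following Python sketch is hypothetical (the field names are assumptions drawn from the list above, not a validated instrument), and it deliberately leaves the relations between variables unmodeled:

    from dataclasses import dataclass

    @dataclass
    class SchoolContext:
        # First-pass contextual variables for a digital library 'impact' study.
        # Field names are illustrative; relations between fields are unmodeled.
        bandwidth_mbps: float        # school intranet and bandwidth
        computer_count: int          # number of computers
        computer_age_years: float    # average age of computers
        has_tech_support: bool       # presence or absence of support staff
        teacher_attitude: str        # e.g. "enthusiastic", "skeptical"
        admin_policy: str            # administrators' policy towards new technologies
        # ...further fields for work patterns, educational policies, etc.

Isolating NSDL's contribution would mean controlling for, or at least measuring, every such field and its interactions, before any instrument could attribute an outcome to the library itself.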

NSDL-CI evaluation strategies are therefore shifting the emphasis of the 'impact question' away from the macro- and towards the micro-level. This involves a corresponding shift in the unit of analysis away from macro-level educational metrics, such as test scores, and towards individual teachers and their micro-level practices. The 'impact question' then becomes not 'How has NSDL impacted test scores?', but rather 'How has NSDL impacted teaching and learning practices?', such as the use of technology in classrooms to demonstrate scientific concepts (Kozma & Quellmalz, 1995; Sumner & Marlino, 2004).

A central aim of the evaluation will therefore be to work towards the development of models of educational technology use that can describe and measure how educational technologies impact teacher and pupil practices, based on the understanding that those practices and variables are embedded in complex educational and technological environments (Oliver & Harvey, 2002).

5 NSDL's organizational complexity

A central task of the NSDL program has been to develop organizational structures to support distributed interdisciplinary teams in the creation and use of networked interoperable educational digital libraries. The organizational structures that have been developed so far have been complex, emergent, and distributed in time and space.

While pilot organizational structures were proposed at the start of the project, the final form of these structures was not pre-determined. In practice, NSDL's organizational structures have evolved over time, both in response to the lessons learned from earlier stages of the project, and also in response to the technological, pedagogical and organizational challenges that have emerged since the project's inception. Recently, these emerging organizational structures and relationships have begun to be replaced by more formal institutional relationships, as exemplified by the Memoranda of Understanding established with the Pathways Projects.

NSDL's organizational structures are also distributed in space. Some two hundred NSDL projects have been funded, located across the United States. These projects were and are linked together in a variety of face-to-face and electronic arenas, including the Annual Meeting, workshops, telephone conferences, e-mail lists, newsletters, and wikis. Levels of participation in these arenas vary greatly.

NSDL's distributed structure over both time and space means that the task of evaluating its organizational structure is not a straightforward one. Information and data are spread out amongst its member projects, a number of which are no longer operational. Further, NSDL's status as a grant- rather than contract-awarding institution means that there has, historically, been no direct obligation for funded projects to report progress and evaluation results to any central NSDL office. Projects are expected to submit final reports directly to NSF, but from NSDL CI's point of view it is unknown what these reports might contain, whether they contain any individual evaluation results, or even whether they were submitted at all. NSDL Core Integration does not have any automatic access to individual project policies, rubrics, metadata schemas, and so on.

One significant evaluation task will therefore involve contacting various NSDL projects and asking them systematically to codify and submit their organizational knowledge, lessons learned, etc., for collation and review. A second significant task will involve developing organizational models that can both account for and also evaluate how well NSDL's organizational components integrate and work together.

6 The 'resource lifecycle' evaluation meta-framework

The complex structure of NSDL raises a number of questions for NSDL evaluation.

To address these and other issues, the NSDL evaluation will be carried out within a meta-framework that models the basic social and technological landscape within which NSDL activities are conducted. This meta-framework permits the development of individual and specific frameworks that can evaluate different aspects of NSDL.

The meta-framework adopted for NSDL evaluation is the 'resource lifecycle' model. The model focuses on digital resources in general rather than digital libraries in particular, and the core feature of the model involves tracking a digital resource through various stages from the moment of creation through to the moment of use, and then beyond to the moment of redesign and improvement.

This meta-framework is an idealized model that permits the generation of more precise NSDL evaluation models and questions, including models that integrate disparate dimensions of NSDL such as resource quality, metadata quality, usability, and GUI quality. It thus provides coherence and focus for a range of diverse and heterogeneous evaluation concepts and practices.

The meta-framework identifies five basic areas of NSDL activity (see figure 1).


Figure 1: The resource lifecycle (summary version)


Four of these areas deal sequentially with the process of digital educational resource creation, collection, and use.

The fifth area deals with NSDL project activities aimed at developing, co-ordinating and reinforcing the knowledge and communication of NSDL as an organization, such as the Core Integration projects.

To provide a more fine-grained analysis, the model can be further broken down into 12 sub-stages, namely: resource creation; resource review; resource aggregation; item-level metadata; collection-level metadata; accessioning; NSDL outreach; NSDL web site; resource search/discovery; resource use/support; resource sharing; and resource feedback (see figure 2).


Figure 2: The resource lifecycle (expanded version)

In the expanded version, these sub-stages fall into two broad arcs: from resource creation through to NSDL's outreach activities, and from those outreach activities through to resource use and re-use.

Taken together, these stages constitute a 'production line' model, the successive stages of which involve transformations of the digital resource, in the process adding value and utility to that resource. For example, a resource that has been reviewed for pedagogical effectiveness, scientific accuracy, and technological functionality is more valuable than a resource that has not; a resource described by accurate metadata is more valuable than a resource that is not; a resource with metadata embedded within a powerful search and discovery tool is more valuable than one that is not; and so on. While the activities of NSDL projects may not directly cover all the stages of the resource lifecycle, all stages of the resource lifecycle do affect NSDL's activities in one way or another.
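To make the 'production line' reading concrete, the following Python sketch (with hypothetical function names, and a dictionary standing in for a resource record) treats three representative stages as successive transformations, each annotating the resource and so adding value:

    # A minimal 'production line' sketch; all names are illustrative.

    def review(resource):
        # Pedagogical, scientific, and technical review adds a quality signal.
        resource["reviewed"] = True
        return resource

    def add_metadata(resource):
        # Accurate item-level metadata makes the resource describable.
        resource["metadata"] = {"title": resource.get("title", "untitled")}
        return resource

    def index(resource):
        # Embedding the metadata in a search tool makes the resource discoverable.
        resource["indexed"] = True
        return resource

    def lifecycle_pipeline(resource):
        # Apply the transformations in lifecycle order.
        for stage in (review, add_metadata, index):
            resource = stage(resource)
        return resource

    resource = lifecycle_pipeline({"title": "Plate tectonics applet"})

Each function leaves the resource strictly more valuable than it found it, which is the sense in which the lifecycle's stages 'add value and utility'.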

This resource lifecycle meta-framework has several advantages for NSDL evaluation work. First, the model's stages provide useful conceptual boundaries that identify where evaluation efforts may be applied. Second, it provides a coherent overview of how disparate evaluation activities - such as webmetrics and ethnographic observation - may be integrated into and contribute towards an overall evaluation plan. Third, the model provides a framework for evaluating NSDL's organizational communication and knowledge processes, for identifying existing organizational knowledge, and for making recommendations for improving intra-organizational communication and knowledge processes.

Finally, the model extends the remit of NSDL evaluation to include areas towards which future NSDL development may be directed. These areas will be strategically important for NSDL as NSF embraces a cyberinfrastructure model, in which digital libraries can act as institutional and informational 'pipes' between scientists on the one hand and the classroom and society on the other. In this model, envisioning a digital library as merely one of a number of discrete network components connecting science and society unnecessarily limits the potential of digital libraries to develop digital services that (a) access science data and data sets, or (b) provide rich pedagogic experiences. In other words, in cyberinfrastructure terms, NSDL's future interests lie at least partly in extending its activities towards data exposure and classroom use; here the resource lifecycle model speaks directly both to NSF's interest in developing cyberinfrastructure as a digital pipeline/publishing model, and to NSDL's potential to supply such a model (figure 3).


Figure 3: NSDL as digital library vs. NSDL as cyberinfrastructure


7 Concerns with and limitations of the approach

The model is too resource-centric and thus not applicable to all NSDL projects

NSDL has funded a range of projects in different tracks, including collections, services, and targeted research. Many of these projects are not directly involved in resource creation and review, so how will an evaluation that focuses primarily on resources adequately assess the contributions of these tracks to NSDL?

The resource lifecycle model is not intended as a strict 'one-size-fits-all' description of NSDL projects and their activities, but as a heuristic that will guide the development of individual evaluation initiatives. The model concentrates not on the resource itself, but on the resource lifecycle. The focus is thus on examining the contributions that various projects make to the resource lifecycle, rather than specifically on the resources that they may (or may not) create. The design of outreach workshops and community services, for instance, is just as suitable a subject for the evaluation program as is the design of metadata schemas; here, the evaluation will deploy more integrated, program-oriented evaluation strategies and metrics.

Granularity and the unit of analysis

How exactly is a resource defined in order to track it through the lifecycle?

The purpose of the resource lifecycle is not strictly to define 'resource,' but to act as a focal point for the identification and implementation of evaluation questions appropriate to each stage of the lifecycle. For instance, in the early stages of the lifecycle, the focus will be on collecting resource review rubrics and evaluating the support NSDL provides to its members in this area. In the later stages of the lifecycle, the emphasis will be on evaluating outreach and workshop activities through questionnaires and surveys. Thus, what constitutes a resource, and the level of granularity at which it is defined, may vary through the lifecycle; the high-level aim of the evaluation is to ascertain how well NSDL as an organization provides an institutional framework within which resources can be valued, adopted, used, shared, and re-created.

The theoretically underdetermined nature of digital library use remains

The resource lifecycle model provides a useful framework for addressing the underdetermined nature of digital library use, but on its own it does not resolve the issue. The question of how to evaluate digital library use-in-context is not answered directly; it is deferred by the model, and re-emerges at the stage of use and re-use. Does the question of NSDL's educational impact therefore remain unanswered?

The evaluation will involve not just capturing 'views from the center' through surveys, web metrics, etc., but also the development of models and instruments for the study of teachers using NSDL collections and services in the classroom. As described above (section 4), these studies will focus on the impact of NSDL on the micro-level practices of teachers, with the question becoming not 'How has NSDL impacted test scores?', but rather 'How has NSDL impacted teaching and learning practices?' A significant outcome of the evaluation here will be the development of metrics that can assess these changes in practices. It is likely that these studies will be more resource-intensive, and will take longer to complete, than 'one shot' instruments such as questionnaires and lab-based usability tests.
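The difference in cost between the two kinds of instrument can be illustrated with a webmetric example. A simple 'view from the center', such as a count of resource requests, can be computed in a few lines; the sketch below is hypothetical and assumes an Apache common-format access log:

    from collections import Counter

    def resource_views(log_path):
        # Count requests per URL in an Apache common-format access log.
        # A cheap 'one shot' webmetric: it says which resources were fetched,
        # but nothing about how (or whether) they were used in a classroom.
        views = Counter()
        with open(log_path) as log:
            for line in log:
                parts = line.split()
                if len(parts) > 6:
                    views[parts[6]] += 1  # the requested path, in common log format
        return views

Practice-focused classroom studies, by contrast, cannot be reduced to a script in this way, which is one reason for their greater cost and duration.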

8 Conclusion

NSDL is a complex sociotechnical artifact that is composed of multiple and heterogeneous technologies, member organizations, policies, and practices. Any evaluation of NSDL will not only have to address this complexity, but will also have to be meaningful to a wide range of audiences, including project PIs and administrators, NSF managers, and users. To address these complexities, NSDL Core Integration has adopted a program evaluation model - the 'resource lifecycle' - that acknowledges this complexity, and also models it, providing a series of evaluation measuring points about which a coherent narrative concerning NSDL's performance as an organization and as a program may be woven.

9 References

Arms, W. 2000. Digital libraries. Cambridge, MA: The MIT Press.

Arms, W. 2005. A viewpoint analysis of the digital library. D-Lib Magazine 11 (7-8), July/August 2005.
http://dlib.org/dlib/july05/arms/07arms.html

Bishop, A., N. Van House, & B. Buttenfield. 2003. Digital library use: Social practice in design and evaluation. Cambridge, MA: The MIT Press.

Borgman, C. 2000. From Gutenberg to the global information infrastructure. Cambridge, MA: The MIT Press.

Giddens, A. 1984. The constitution of society. Cambridge: Polity Press.

Kozma, R., and E. Quellmalz. 1995. Issues and needs in evaluating the educational impact of the National Information Infrastructure. White Paper, Department of Education workshop, "The Future of Networked Technologies for Learning."
http://www.ed.gov/Technology/Futures/kozma.html

Marchionini, G., C. Plaisant, & A. Komlodi. 2003. The people in digital libraries: Multifaceted approaches to assessing needs and impact. In A. Bishop, N. Van House, & B. Buttenfield (eds.), Digital library use: Social practice in design and evaluation. Cambridge, MA: The MIT Press.

Oliver, M., and J. Harvey. 2002. What does 'impact' mean in the evaluation of learning technology? Educational Technology & Society, 5 (3).
http://ifets.ieee.org/periodical/vol_3_2002/oliver.html

NSDL. 2000-2005. NSF Program Solicitation.
http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf05545

Reeves, T., X. Apedoe, & Y. H. Woo. 2003. Evaluating digital libraries: A user-friendly guide.
http://dlist.sir.arizona.edu/398/

Sumner, T., and M. Marlino. 2004. Digital libraries and educational practice: A need for new models. Proceedings of the ACM/IEEE Joint Conference on Digital Libraries (JCDL 2004), Tucson, Arizona, pp. 170-178.

Wenger, E. 1998. Communities of practice: Learning, meaning and identity. Cambridge: Cambridge University Press.

10 Other Resources

Digital Library Use and Evaluation Studies

Adams, A., & A. Blandford. 2001. Digital libraries in a clinical setting: Friend or foe? Proceedings of the 5th European Conference on Digital Libraries, Darmstadt, Germany, pp. 213-224. Springer Lecture Notes in Computer Science, Vol. 2163. Berlin: Springer.

Adams, A., and A. Blandford. 2005. Digital libraries' support for the user's "information journey." Proceedings of JCDL 2005, Denver, Colorado, pp. 160-169. New York: ACM Press.

Kilker, J., & G. Gay. 1998. The social construction of a digital library. Information Technology and Libraries, 17(2), pp. 60-70.

Khoo, M. 2005. Tacit user and developer frames in user-led collection development: The case of the Digital Water Education Library. Proceedings of JCDL 2005, Denver, Colorado, pp. 213-222. New York: ACM Press.

Star, S., & K. Ruhleder. 1994. Steps towards an ecology of infrastructure: Complex problems in design and access for large-scale collaborative projects. Proceedings of CSCW 1994, Chapel Hill, North Carolina, pp. 253-264. New York: ACM Press.

Weedman, J. 1998. The structure of incentive: Design and client roles in application oriented research. Science, Technology, and Human Values, 23(3), pp. 315-345.

Online resources

Links to online DL research papers are available on the Online Resources page:
http://eval.comm.nsdl.org/cgi-bin/wiki.pl?OnlineDocs

11 Contact Information

For comments or for further information, please contact:

Michael Khoo, Evaluation Co-ordinator
NSDL Core Integration
University Corporation for Atmospheric Research
P.O. Box 3000
Boulder CO 80307-3000
USA

e-mail: mjkhoo@ucar.edu
tel: +1 303 497 2604
fax: +1 303 497 2933