Evaluating Digital Libraries on Many Levels: Library Use, User Needs, Accessibility Issues and Educational Impact



Susan Musante, American Society for Microbiology's MicrobeLibrary


Boots (Lillian) Cassel, Villanova University


Andrew Kirkpatrick, CPB/WGBH National Center for Accessible Media

Christine Walker, Advanced Technology Environmental Education Center

Chris Neuhaus, University of Northern Iowa

K. Ann Renninger, Swarthmore College

Session Description

As digital libraries have evolved, so have the metrics, tools, and reasons for performing evaluations. A number of NSDL projects have begun to evaluate facets of their digital libraries, using log analysis tools to track activity or combining structured interviews with online questionnaires to learn more about users' needs and preferences. One project is looking at ways to make all of NSDL more accessible to users and developers with disabilities. Co-presenters and attendees will share their experiences, learn about the theories underpinning current evaluation efforts, discuss how to use evaluation results, and identify potential collaborators or mentors.


Notes - Evaluating Digital Libraries on Many Levels: Library Use, User Needs, Accessibility Issues and Educational Impact


Notes from Session - Flora McMartin

Evaluating MicrobeLibrary on Many Levels: Library Use, User Needs, Accessibility Issues, and Educational Impact

Susan Musante

See presentation slides

Types of evaluation done by attendees
  • Library Use
  • User Needs
  • Impact

CITIDEL and the XML Log Standard

Lillian Cassel

See presentation slides

Accessibility in Digital Libraries

Andrew Kirkpatrick

See presentation slides

Digital Library Evaluation: Measuring Impact, Quantifying Quality, or Tilting at Windmills?

Christine Walker

See presentation slides

The Math Forum/Math Tools: Evaluation

K. Ann Renninger

See presentation slides
  • For a chronological list of evaluation activities, see Ann


Discussion/Questions


Weblog analysis: what have you done re: human visitors vs. machines and their apparent preferences for resources? How do we distinguish them?

There's a user ID in the weblogs; Google publishes all its IPs.

Lillian's tool could be used to sort this out.
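
(A minimal sketch of the kind of log filtering discussed here, assuming the common Apache "combined" log format. The crawler signatures below are illustrative assumptions only, not the contents of Lillian's tool or of any published bot list.)

import re
from collections import Counter

# Apache "combined" log line: IP, identity, user, [time], "request",
# status, bytes, "referer", "user-agent".
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

# Illustrative crawler signatures; a real list would be much longer and
# would also use published crawler IP ranges (e.g., Google's, as noted above).
BOT_AGENTS = ("googlebot", "slurp", "msnbot", "crawler", "spider")

def is_bot(agent):
    agent = agent.lower()
    return any(sig in agent for sig in BOT_AGENTS)

def resource_preferences(log_lines):
    """Count resource hits separately for humans and machines."""
    humans, machines = Counter(), Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        counter = machines if is_bot(m.group("agent")) else humans
        counter[m.group("path")] += 1
    return humans, machines

if __name__ == "__main__":
    with open("access.log") as f:
        humans, machines = resource_preferences(f)
    print("Top human-requested resources:", humans.most_common(5))
    print("Top crawler-requested resources:", machines.most_common(5))

Published crawler IP lists could be added as a second filter alongside the user-agent check.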

Business/industry: the Gartner Group did an analysis showing that an impact evaluation can cost 2-3 times more than the start-up cost of the original project. What percentage of the budget do you devote to evaluation?

Math Forum - <10% is devoted to evaluation. Much of it is integrated into making the site work.

Follow-up: NSF doesn't have a clue how much it really costs.

Math Forum - just getting the validity/reliability and indicators together takes a great deal of time: what is the relationship between the questions and the impact? Once this is under control, it doesn't take as much.

At Scout, we put in minimal amounts; most of the money goes to software engineers. NSF expects evaluation, sustainability, and outreach, but doesn't fund them. Luckily, users are very interested in giving feedback; they've had great responses.

Math Forum has been funded by many sources, and each evaluation for each grant program has contributed to the overall evaluation of the site. They've been able to bootstrap support for the evaluation process. A vision of their evaluation program as a whole has helped them create more reliable measures.

The scholarship of teaching (Tom Reeves): getting faculty involved, and getting doctoral students to focus their research on it, are two ways this is helpful.

Has anyone done user studies on what users think the value is - how much would they pay?

In Microbiology, they asked about registration and subscription. Registration was OK, but only a small number said subscriptions were OK; the sample was small, though, and the budget problems associated with the professional organization may force the issue. There will be a member/non-member fee: $50 for non-members, $25 for members. This will be a big experiment.

Math Forum - they don't charge. For the Problem of the Week, the large number of users of this service meant that those who wanted mentoring would have to pay a minimal fee. Users don't seem to understand that they still have everything available to them that they've always had, except the mentors. It was questionable how much people used/read the mentors' responses, and the staff time was cost-prohibitive.

Do people value something they don't pay for?

Math Forum - there was a huge number of responses; some of this was classroom work, which is not the purpose of the mentor program.

Susan - the NPR funding model.

For Boots - how effective is the metadata content?

Is it sufficient to respond to queries? We have browsing by categories; is the metadata sufficient to meet users' purposes?
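
(One low-cost way to start answering this question is to measure how completely records fill the fields that browsing and querying depend on. The sketch below is an illustration only: the field names and dictionary-shaped records are assumptions, not CITIDEL's actual schema.)

from collections import Counter

# Fields assumed (for illustration only) to drive category browsing and search.
REQUIRED_FIELDS = ("title", "subject", "description", "category")

def field_coverage(records):
    """Return the fraction of records that fill each required field."""
    filled = Counter()
    total = 0
    for record in records:
        total += 1
        for field in REQUIRED_FIELDS:
            if record.get(field):  # present and non-empty
                filled[field] += 1
    if not total:
        return {}
    return {field: filled[field] / total for field in REQUIRED_FIELDS}

# Two hypothetical metadata records.
records = [
    {"title": "Gram Staining", "subject": "microbiology", "category": "lab image"},
    {"title": "Fractions Tool", "description": "interactive applet"},
]
for field, frac in field_coverage(records).items():
    print(f"{field}: filled in {frac:.0%} of records")

Low coverage on a browse field is a quick signal that category browsing cannot serve its purpose, whatever the search interface does.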

Back to charging for sustainability

Look at the traditional library model: scanned articles, which cost money, are not used by people. People don't get the relationship between the library and what they find on the internet; they find it because of the library system and a CU IP. As soon as you put a price tag on it, users will go somewhere else.

Scout has 350,000 readers/week. It's so cheap to put out (it takes about $30K), but people won't pay for it. People say they can't live without it, yet they won't pay for it.

From the University of California: they are looking at branding at the record level, a big issue when you are providing free content. People don't value it in the same way.

Accessibility - you look at both the interface and the resources in the library, plus the metadata associated with the resource and recommendations for that.

This is a relatively new area; see the SALT site referenced on Andrew's slides for how to make e-learning systems accessible. Within that there's a set of recommendations. Both users and resources will need to be targeted.

Follow-up: if you are measuring the level of accessibility of the site, is this generalizable to the success of the site?

Tools are not designed to be generalizable; generalizing from them is questionable at best.

Dublin Core has a small working group on accessibility; its current status is unclear. Most useful is to go to users and have them use the site in front of the developers, e.g., with a screen reader. Student disability groups can also help.
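
(Automated checks are no substitute for watching a real user with a screen reader, but a quick scan for missing alt text is a common first step. This sketch uses only Python's standard library and is an illustration, not one of the tools or recommendations cited in the session.)

from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags that lack the alt attribute screen readers rely on."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "<no src>"))

page = '<p><img src="logo.gif"><img src="chart.png" alt="Usage chart"></p>'
checker = AltTextChecker()
checker.feed(page)
print("Images missing alt text:", checker.missing)  # ['logo.gif']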

Comments

Please enter any comments in the following format.
  • (commenters' initials) - month/day [comment date]
  • comment

(pac)-10/22
Is this the session where I heard that an article on a "Users' Guide to Evaluation of Digital Libraries" was going to be posted?



NSDL thanks DLESE for hosting the swikis for the NSDL Annual Meeting 2003.
