


Evaluating Digital Libraries on Many Levels: Library Use, User Needs, Accessibility Issues and Educational Impact



Susan Musante, American Society for Microbiology's MicrobeLibrary


Boots (Lillian) Cassel, Villanova University


Andrew Kirkpatrick, CPB/WGBH National Center for Accessible Media

Christine Walker, Advanced Technology Environmental Education Center

Chris Neuhaus, University of Northern Iowa

K. Ann Renninger, Swarthmore College

Session Description

As digital libraries have evolved, so have the metrics, tools, and reasons for performing evaluations. A number of NSDL projects have begun to evaluate facets of their digital libraries, using log analysis tools to track activity or combining structured interviews with online questionnaires to learn more about users' needs and preferences. One project is looking at ways to make all of NSDL more accessible to users or developers with disabilities. Co-presenters and attendees will share their experiences, learn about the theories underpinning current evaluation efforts, discuss how to use evaluation results, and identify potential collaborators or mentors.


Notes - Evaluating Digital Libraries on Many Levels: Library Use, User Needs, Accessibility Issues and Educational Impact


Notes from Session - Flora McMartin

Evaluating MicrobeLibrary on Many Levels: Library Use, User Needs, Accessibility Issues, and Educational Impact

Susan Musante

See presentation slides

Types of evaluation done by attendees

CITIDEL and the XML Log Standard

Lillian Cassel

See presentation slides

Accessibility in Digital Libraries

Andrew Kirkpatrick

See presentation slides

Digital Library Evaluation: Measuring Impact, Quantifying Quality, or Tilting at Windmills?

Chris Walker

See presentation slides

The Math Forum/Math Tools: Evaluation

K. Ann Renninger

See presentation slides


Discussion/Questions


Weblog analysis: with respect to preferences for resources, what have you done about human visitors vs. machines? How do we distinguish them?

There's a user ID in the weblogs; Google publishes all its IPs.

Could use Lillian's tool to sort this out.
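As a minimal sketch of the kind of filtering discussed here (the log format, user-agent signatures, and sample lines below are illustrative assumptions, not any presenter's actual tool), one common heuristic separates crawler hits from human visitors using the User-Agent field and requests for robots.txt:

```python
import re

# Common crawler signatures found in the User-Agent field.
# This list is illustrative; real bot detection uses much larger lists.
BOT_PATTERN = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

def is_bot(log_line: str) -> bool:
    """Flag a log line as machine traffic if its user-agent matches a
    crawler signature, or if it requests robots.txt (humans rarely do)."""
    return bool(BOT_PATTERN.search(log_line)) or "robots.txt" in log_line

def split_traffic(log_lines):
    """Partition log lines into (human, machine) lists."""
    humans, machines = [], []
    for line in log_lines:
        (machines if is_bot(line) else humans).append(line)
    return humans, machines

# Hypothetical lines in Apache combined log format.
sample = [
    '1.2.3.4 - - [22/Oct/2003:10:00:00] "GET /resource/42 HTTP/1.0" 200 512 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [22/Oct/2003:10:00:01] "GET /robots.txt HTTP/1.0" 200 64 "-" "Googlebot/2.1"',
]
humans, machines = split_traffic(sample)
# humans contains the Mozilla hit; machines contains the Googlebot hit.
```

User-agent matching only catches crawlers that identify themselves; cross-checking source IPs against published crawler IP ranges, as mentioned above, catches more.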

Business/industry - the garden group did an analysis showing that an impact evaluation can cost 2-3 times more than the start-up cost of the original project. What percentage of the budget do you devote to evaluation?

Math Forum - <10% devoted. Much is integrated into "making the site work."

Follow-up - NSF doesn't have a clue how much it really costs.

Math Forum - just getting the validity/reliability and indicators together takes a great deal of time: what is the relationship between the questions and the impact? Once this is under control, it doesn't take as much.

At Scout, we put in minimal amounts - most of the money goes to software engineers. Evaluation, sustainability, and outreach - NSF expects these, but doesn't fund them. Luckily, users are very interested in giving feedback - they've had great responses.

MF - has been funded by many sources; each evaluation for each grant program has contributed to the overall evaluation of the site. They've been able to bootstrap support for the evaluation process. A vision of their evaluation program as a whole has helped them create more reliable measures.

The scholarship of teaching (Tom Reeves) - getting faculty involved, and getting doctoral students to focus their research on it, are two ways this is helpful.

Has anyone done user studies on what users think the value is - how much would they pay?

In Microbiology, they asked about registration and subscription. Registration was OK, but only a small number said subscriptions were OK - and the sample was small. The budget problems associated with the professional organization may force the issue. There will be a member/non-member fee: $50 for non-members, $25 for members. This will be a big experiment.

Math Forum - they don't charge, but for the Problem of the Week, the large number of users of this service meant that if you wanted mentoring, you would have to pay a minimal fee. Users don't seem to understand that everything they've always had is still available to them, all except the mentors. It was questionable how much people used/read the mentors' responses. The staff time was cost-prohibitive.

Do people value something they don't pay for?

MF - there was a huge number of responses; some of this was classroom work, which is not the purpose of the mentor program.

Susan - the NPR funding model.

For Boots - effectiveness of the metadata content?

Is it sufficient to respond to queries? We have browsing by categories; is the metadata sufficient to meet users' purposes?

Back to charging for sustainability

Look at the traditional library model - scanned articles, which cost money, are not used by people. People don't see the relationship between the library and what folks "find on the internet" - they find it because of the library system and a CU IP. As soon as you put a price tag on it, users will go somewhere else.

Scout - has 350,000 readers/week. It's so cheap to put out - e.g., it takes $30K - but people won't pay for it. People say they can't live without it, yet they won't pay for it.

From Un. Ca. - they are looking at branding at the record level, a big issue when you are providing free content. People don't value it in the same way.

Accessibility - you look at both the interface and the resources in the library. There is metadata associated with the resource, and recommendations for that.

This is a relatively new area - see the SALT site referenced on Andrew's slides on how to make e-learning systems accessible. Within that there's a set of recommendations. Will need to target both users and resources.

Follow-up - if you are measuring the level of accessibility of the site, is this generalizable to the success of the site?

Tools are not designed to be generalizable - questionable at best.

Dublin Core has a small working group - not sure about its status. Most useful is to go to users and have them use the site in front of the developers, e.g., with a screen reader. Can also work with student disability groups.
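As a minimal illustration of the automated checks accessibility tools run (this sketch and its sample page are hypothetical; real tools and the user testing recommended above cover far more than this one rule), a checker might flag images with no alt text:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of every <img> tag that lacks an alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt.append(dict(attrs).get("src", "?"))

# Illustrative page: one image with alt text, one without.
page = '<p><img src="logo.gif" alt="NSDL logo"><img src="chart.gif"></p>'
checker = AltTextChecker()
checker.feed(page)
# checker.missing_alt -> ["chart.gif"]
```

A single-rule check like this is exactly why automated results don't generalize to the success of a site: it says nothing about whether the page makes sense through a screen reader, which only user testing reveals.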

Comments

Please enter any comments in the following format.

(pac)-10/22
Is this the session where I heard that an article on Users Guide to Evaluation of Digital Libraries was going to be posted?