EIESC meeting at JCDL 2004

Monday, June 7, 2004
5-7 pm
Palo Verde II room

Tentative Agenda

PI Survey Taskforce (Sarah Giersch)
Collections Assessment Metrics Taskforce (Judy Ridgway)
Eval. Case Studies Taskforce (Bethany Carlson)
Webmetrics Workshop (with TSC)
News/Updates - Annual Report, Annual Meeting, Policy Committee, Sustainability Committee
Project Descriptions (all)
Other(s)

EIESC Meeting at JCDL 2004, June 7: Notes

PI Survey Taskforce (Sarah Giersch)

See the PPT review of the history of the PI Survey. EIESC is moving to a survey that comes from the SCs. EIESC is seeking a volunteer from its own committee as well as a representative from the other SCs. There is a possibility of having some money to help with an annual survey, but if it becomes an ongoing committee responsibility, volunteer EIESC oversight will still be required. Graduate students should feel empowered to volunteer.

Collections Assessment Metrics Taskforce (Anita Coleman for Judy Ridgway)

Serves as a means for characterizing the NSDL MR (subject, audience, format, learning resource type). The taskforce discovered wide variations in consistency. It used human labor and two machine methods. Work is currently stopped because too much effort was expended by volunteers.

Tammy - there needs to be a mixed model where the EIESC does some of the work, such as setting parameters, but someone is paid to do the coding, testing, and running of the experiment.
Possibility of a future workshop on Collections Assessment, for example on setting parameters, and also on studying current collections assessment work; that last issue is probably a good one for the survey.
"Classification Approach"
Liz - she has a funded project along these lines and is writing a paper.
Tammy - the purpose was just to characterize the library, because we didn't know what was in it. Do we still want to focus on the question, "How are collections growing and evolving?" And if we do, what are the specifics of the question, and what do we think we can answer?
Sarah - ARL and their eMetrics work are grappling with the same issues.
Tammy - people wanted to answer the question of what is in the library and how it changes over time: indicate the breadth and depth of the library and feed back into collection development.
Flora - such feedback isn't being used to solicit or accept collections proposals.
Susan - what expectations can we set to work with Pathways projects and help answer these questions?
Tammy - maybe the taskforce can work with Pathways projects to set common standards. If so, we should deal with that now, this summer, to set expectations before awards are given.
Judy wants to continue with this.
Susan - most projects have a projected target, so maybe we just need to ask a different question to answer the growing-and-evolving piece better.

Evaluation Case Studies Taskforce (Anita Coleman for Bethany Carlson)

Following up on the Evaluating Educational Impact Workshop for lighthouses and birds of paradise. The taskforce is looking for volunteers to help collect case studies from projects that are already collecting them. [Mick Khoo has a case study from DWEL]

Webmetrics Workshop (Anita Coleman)

Held August 2-3 in Costa Mesa, just prior to the MERLOT meeting.
1. Information gathering (literature review, best practices)
2. Understanding the scope of logging and analysis, what is needed, and why
3. Rerunning the pilot study with the original 6 participants and a call for volunteers
4. Listing of projects developing tools
5. Sampling log data (at specific times in the year) from various projects

We might need to re-think #3 given our discussions on Pathways. A simplified version of the rubrics is needed. Let Casey know of your or others' interest.

News/Updates

Project Impact and Content Analysis of Project Websites both came out of EIESC work for the Annual Report. They looked at NSDL projects that have been funded since 2000. The sample that was handed out is a rough approximation; some projects do different things, e.g. workshops. A project impact website would reflect work that has been done and serve as a means of recruiting, by demonstrating what work is being done and what the NSDL community can contribute to.

Annual Meeting: Sarah is the EIESC rep. Please volunteer for the Program Advisory Sub-Committee if you want to help.

Policy Committee: Anita attended the PC meeting in February. Boots & Howard are the EIESC PC liaisons; Boots' first meeting was in February. The PC has been focusing on the Annual Meeting, the purposes of the PC, the PC's relationship to CI, and the Community Services SC. Any input can be given to the PC.

Sustainability Committee: The SSC is also trying to identify best practices (in sustainability). Ted Kuwana is doing that, if EIESC members are interested in the topic and want to contribute.
Flora - one of the interesting questions is "What is the NSDL collection worth?"
Liz - another interesting question is "What are other sustainability options besides NSDL program funding through NSF?"
Flora - earlier questions were all of the nature of what individual projects were doing.

Other Issues

The 4 Questions
Mary - we had the 4 questions, but one of the basic questions is "Who are the users?"
Tammy - who are the users of what? DLESE? NSDL?
Mary - start with basics: "Who are the users of NSDL.org?" Ask Pathways projects to report on this and provide a structured reporting mechanism. Pathways might do case studies of use and provide demographic data.
Representation of Pathways at the Webmetrics Workshop might be key. Or maybe have the workshop report available once Pathways awards are given, so that standards and guidelines are ready for them.
Tammy - part of our problem with answering the questions in the past is that there was too much diversity. We don't want to characterize NSDL by focusing on 4 nodes, but it's a good place to start. Focus in on a few things for Pathways and CI as a start; results may take longer than a year, but setting expectations and making preparations will help us determine if we can answer the question and how much effort it will take.

MathForum's Ideas for Possible Activities
Wes, Ann, and Steve developed some modest proposals.

1 - Development of an NSDL Evaluation Journal
Ellen - a DL evaluation journal is a good idea.
Tammy - we're cannibalizing other things; there are so many other sources that adding one more doesn't seem to be the answer.
Chris N - reverse the process by telling people about those existing sources. What about an evaluation track in D-Lib, or an "evaluation" issue of a DL journal? Bonnie Wilson of D-Lib likes for articles to include an aspect of evaluation.
Gene - works in progress could also be accepted.
Ellen - a yearly bibliography; annual additions to the Evaluation Annotated Bibliography would benefit the community. Is that a D-Lib or an EIESC effort?
Ellen - there are benefits to publishing it. Maybe one of the things we need to do is make the "User-Friendly Guide to Evaluating Digital Libraries" and the "Annotated Bibliography" more visible.
Anita - we want to add a Literature Review section to the project impact section of the website.
Susan - I can highlight any important information in the New Projects Orientation at the Annual Meeting.
Tammy - the Journal of Digital Libraries is actively looking for articles on evaluation. Tammy is the editor in charge of this, so if you have questions or want suggestions, ask her.

2 - Establish an evaluation support and feedback network
Ellen - I've been to an evaluators' meeting held as part of a larger meeting; maybe we should try that with NSDL and its annual meeting.
Gene - MathForum had success when they opened up evaluation. The full swat team, going out to projects, is another option.
**An Evaluation SIG or Panel at the Annual Meeting might provide an exchange forum**
Gene - many people don't seem to know what to do in regards to evaluation.
Chris N - people involved in evaluation are also looking for new ideas or different ways to do things, so experienced practitioners want the help as much as those new to evaluation. But it needs to be informal; the JCDL workshop was great, but too structured.
Susan - many projects tell me that their evaluation plans are "we're going to put some students in front of a website." Many projects are stuck in the mindset that all they need to evaluate is the usability of the website. An organized statement from this group about "beyond website evaluation" could inform other projects about the other evaluation issues to address.
Chris N - "Outcomes Assessment," "impact," or some other term besides "evaluation" might help convey the point.
Liz - so what's the middle point, because people can't be expected to apply the full force, and we want them to do more than they are doing now.

3 - Have professional development workshops for sharing evaluation approaches specific to virtual environments
NSDL is full of busy people, so there must be an incentive for doing the work; the committee can figure that one out.

Attendees

Laura Bartolo
Boots Cassel
Anita Coleman
Tina Finnerman
Sarah Giersch
Karen Henry
Ellen Hoffman
Susan Jesuroga
Casey Jones
Gene Klotz
Liz Liddy
Cathy Lowe
Mary Marlino
Dave McArthur
Flora McMartin
Chris Neuhaus
Youfen Su
Tammy Sumner
Chris Walker