
Educational Impact & Evaluation Standing Committee


This committee ensures that participatory and stakeholder evaluation principles are integrated into the design, development, and implementation of NSDL.

http://eduimpact.comm.nsdl.org


Wednesday, October 15, 2003, 2:30 pm

Location: Federal B

Agenda


2:30 - 2:45: Welcome and election of new officers


2:45 - 3:30: Presentation and Discussion of Outcomes of Educational Impact Workshop


3:30 - 3:50: Presentation and Discussion of Outcomes of the PI Survey

3:50 - 4:10: Update and Discussion of NSDL Collections Assessment Project

4:10 - 4:30: Update and Discussion of the EIESC Role in the production of the NSDL Annual Report

An important goal of this meeting is to get wider community feedback on the results and recommendations of the recently completed workshop on "Developing a Strategy for Evaluating the Educational Impact of NSDL."

Workshop participants were asked to make recommendations at three levels:

  • Recommendations for NSDL (Project PIs, Core Integration, Policy and Standing Committees)
  • Recommendations to NSF (including changes to the NSDL Program Solicitation, funding emphases, other cognizant programs)
  • Recommendations to the broader educational research and educational technology communities

Clearly, these recommendations have the potential to influence not only the direction of future NSDL projects but also the way that current projects are perceived as contributing to our understanding of educational impact. As such, it is vital that we receive your feedback as the results of this workshop are written up and moved forward.

An agenda and logistical information for the meeting are included below. We look forward to seeing all of you next week and moving forward on developing our collective vision for NSDL.

Tammy Sumner (Chair, EIESC)

Sarah Giersch (Co-Chair, EIESC)

Notes - Educational Impact & Evaluation Standing Committee


Wednesday, October 15, 2003, 1:30 pm - 3:40 pm

Washington, DC

Notes taken by Mimi Recker

1. NSDL annual/progress Report (Tammy Sumner)


Tammy discussed the role the Standing Committee (SC) took in the NSDL annual/progress report. She stressed the importance of reports for communication and as reference and outreach documents.

She also reminded the SC of the initial questions that have driven the work of the SC over the last two years, and how these fed well into the report:
  1. How are people using the library?
  2. How are collections growing?
  3. How are distributed library building and governance processes working?
  4. What processes are needed to answer these 3 questions?

Discussion questions:

Should the SC participate in this report-building process? How should volunteers be coordinated in a production process? How should other SCs get involved?

Recommendation:

The SC recognizes the need to pull its work together into a readable account that is disseminated to the necessary audiences. Whether it appears in the NSDL report is a separate issue. A more standardized process is needed for coordinating SC work with the Core Integration (CI) outreach group.

Mimi Recker will represent the EIESC in developing such processes with CI and Policy Committee.

2. NSDL Collections Assessment (Judy Ridgway)


Judy presented on the goals, methods, findings, and current status of the assessment study. See the working group's report on the EIESC site.

Discussion:

A key difficulty is that the libraries all use different controlled vocabularies in important collection- and item-level metadata fields (including resource type, subject, and audience). A support vector machine (SVM) machine-learning algorithm shows promise for inducing a crosswalk between these vocabularies, and work is ongoing in this area. Could this approach also improve search engine performance by providing query expansion?
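The crosswalk idea above can be thought of as a text-classification problem: train an SVM on one library's records labeled with its vocabulary terms, then apply it to another library's records to suggest mappings into those terms. The sketch below is only illustrative of that general approach, not the working group's actual method; the vocabulary terms, item descriptions, and use of scikit-learn are all assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy training data: item descriptions from Library A, labeled with
# A's controlled vocabulary for "resource type" (terms are hypothetical).
docs = [
    "interactive java applet simulating projectile motion",
    "applet for exploring wave interference",
    "lesson plan with classroom activities on ecosystems",
    "printable worksheet and lesson activities for fractions",
    "dataset of monthly stream flow measurements",
    "tabular dataset of seismic readings",
]
labels = ["simulation", "simulation", "lesson", "lesson", "data", "data"]

# Linear SVM over TF-IDF features; each class is one vocabulary term.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(docs, labels)

# Items from Library B, described under a different vocabulary; the
# classifier's predictions suggest crosswalk mappings into A's terms.
b_items = [
    "java applet demonstrating pendulum dynamics",
    "stream gauge measurement dataset",
]
print(model.predict(b_items))
```

In practice a crosswalk would be induced per metadata field (resource type, subject, audience), with the predicted terms reviewed by catalogers rather than applied automatically.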

Qualitative study:

Correlate what people are searching for with what is currently in the library; this can help focus collection development. The SVM machine-learning technique might again be useful here.

Recommendation: Judy should press forward!

3. 2003 NSDL PI survey (Sarah Giersch)


See Sarah's report on the EIESC web site.

Discussion:

Who defines expectations for participation and collaboration for NSDL projects? Should they be in the NSF RFP? Who defines collaboration? Who defines the incentives and benefits to collaborate? Is collaboration essential for library building? If so, it should appear in the RFP and should be measured. Should CI publish information on what they will support? Are there case studies of successful collaboration that can be disseminated? Do we want to do this survey again?

Recommendations:

  • a. Add explicit language to the NSF RFP to set expectations for collaboration in NSDL community building, and include it in the RFP review criteria. Tammy, Ann, Sarah, and Flora will work on the justification and language for this addition.
  • b. A discussion is needed on how to measure the quality of informal and formal collaboration.
  • c. Ask the EIESC group for feedback on the report; make clear which aspects of the report need feedback; identify resulting action items.

4. Elections


Flora McMartin briefed the committee on the election process. Sarah's and Tammy's terms end December 31; the term is two years.

Two volunteers for the slate:

Anita Coleman (University of Arizona) volunteers to serve as chair.
Laura Bartolo volunteers to serve as co-chair.

Nominations and seconds for the slate were received.

This will be followed by email elections.


Anita Coleman Information

I am an Assistant Professor in the School of Information Resources and Library Science, University of Arizona, Tucson, which I joined in 2001. I have worked on a couple of NSF digital library projects - the Alexandria Digital Library (ADEPT) and the NSDL Geo-Technical Rock and Water Resources (GROW) Digital Library. Additionally, I serve on the DLESE Steering Committee. My research and teaching interests revolve around information organization and information behaviors in the context of learning. From the White Paper in 2000, where five levels of evaluation were identified [URL: 2000 White Paper], to the change in the committee name from merely Evaluation to Educational Impact & Evaluation, and the goal of building a shared vision for evaluation as an integral activity in the NSDL through the development of metrics, instruments, and toolkits, the progress made is commendable. To be involved in continuing this work, as Chair, is both an honor and a challenge to which I look forward. Thanks for your consideration and support of my nomination.


5. Developing a strategy for evaluating the educational impact of the NSDL (Mary Marlino)


See Mary's presentation on the EIESC web site.

Discussion:

What about quantitative data? Some of these are available in the annual report, but collecting them requires coordinated library instrumentation, which is hard.

Recommendations:

Announce the existence of the evaluation bibliography and methods paper drafts to other SCs and in the WhiteBoard report.

A plan is needed to disseminate findings from evaluation efforts.
Tracking and evaluating library usage (not just at the NSDL portal) is important -- but hard.

6. Closing


A strong vote of appreciation was expressed to the outgoing SC chair and co-chair, Tammy Sumner and Sarah Giersch.

Meeting closed at 3:40 pm.


NSDL thanks DLESE for hosting the swikis for the NSDL Annual Meeting 2003.
