

Evaluation Practices Survey 2005

This one-page survey should take 3-5 minutes to complete. The results will be anonymized before being publicly posted. Please submit only one response per funded project. Responses are due by Friday, Sept. 30, 2005.

Thank you for your cooperation,

 - Sarah Giersch and Jim Dorward, Educational Impact and Evaluation Standing Committee



1. My project is being funded by the following NSDL track:
Core Integration
Services
Collections
Targeted Research
Pathways
Other:
 
2. The evaluation goals for my project can be characterized primarily as:
Process-Oriented (e.g., community building, educational outreach, collection building processes)
Usability and/or Accessibility (e.g., per ADA or W3C guidelines)
Usage tracking (via web metrics, profiles, etc.)
Collections assessment
Beliefs and attitudes about science
Educational Impact
Other:
 
3. The specific evaluation goals for my project are:
 
 
4. Please indicate the range that most closely approximates the number of FTEs devoted to evaluation in your project.
Less than .25 FTE
.26 to .50
.51 to .75
1 or more FTE
 
5. Do you currently use, or intend to use, an independent evaluator (outside the immediate staffing for your project)?
Yes
No
I don't know
 
6. Where do you currently go for information about evaluation and evaluation techniques?
We have this expertise in our office
A colleague or a friend
Other departments on campus (e.g., Education, Psychology, Human-Computer Interaction, etc.)
Evaluation
The web
Other:
 
7. What evaluation resources or assistance, if any, would you like NSDL to provide?
Workshops
Web-based information
Evaluation skills training at the NSDL Annual Meeting
 
8. What other resources do you need to assist you in planning and implementing evaluation activities for your project? Please specify.
 
 
9. Would you attend a session or workshop on evaluation at the NSDL Annual Meeting?
Yes
No
 
10. Please check topics of interest for an evaluation session or workshop (check all that apply):
System and web log analysis
Survey data
Focus groups
Interviews
Observation of learners or participants
Classroom observation
Performance assessment
Student/user journals or logs
Heuristic evaluation
Student work samples
Usability testing
Specific information about designing and using different data collection methods
Tools for designing evaluation plans (templates, checklists, examples, budgets)
Data analysis resources
Other:
 
Thank you for taking the time to fill out this survey.

