NSDL Annual Meeting 2007 - Webmetrics
Demo project: http://ia.usu.edu/viewproject.php?project=ia:4789 (student log-in: mimi)
Webmetrics and the Instructional Architect
Mimi Recker & Bart Palmer
Utah State University
- Objectives
- Google Analytics
- Database Analysis
- Artifact Analysis
- Lessons Learned
Abstract
In the approach taken by the Instructional Architect project, teachers are viewed as designers of learning activities who take advantage of the now-abundant supply of high-quality, free online learning resources, such as those provided by the NSDL. Our analytical approach then uses the online artifacts created by teachers as a data source for characterizing how they design with, use, and share online resources in educational contexts.
About the IA
The Instructional Architect (IA) is a tool that enables educators to easily access and acquire online resources, organize and adapt those resources into activities for their students, and make those new activities available to a variety of audiences.
Objectives of our Webmetrics
- A sense of site activity.
- What are teachers doing?
- What are students doing?
The IA is a service of the National Science Digital Library (NSDL).
Google Analytics
Process
- Paste script in template (6 lines)
- Registration
- Translate GA's business terminology into educational terms
- Identify initial goal conversions
- About 2 hours on first iteration
Outcomes / Demo
Google Analytics
- Confirmed student use (top content is "iamstudent")
- Confirmed differential use by students and teachers (bounce rate)
- Where are they coming from? (maps: World, US)
- Ebb and flow of summer and weekends (confirmed)
- Mostly direct traffic sources (bookmarks?)
Benefits:
- Two words: Eye Candy!
- Easy to implement
- Summary data that we did not have to invent
- Mapping without purchasing access to location database
- Emailed and exported reports (see the sketch below)
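The exported reports can be post-processed outside GA. A minimal sketch (assuming a hypothetical daily export named ia_visits.csv with Date and Visits columns; the real export layout may differ) that totals visits per ISO week, making the summer and weekend ebb and flow easy to spot:

```python
import csv
from collections import Counter
from datetime import datetime

def weekly_visits(path):
    """Total visits per ISO week from a (hypothetical) daily GA export."""
    totals = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            day = datetime.strptime(row["Date"], "%Y-%m-%d").date()
            year, week, _ = day.isocalendar()
            totals[(year, week)] += int(row["Visits"])
    return totals

if __name__ == "__main__":
    for (year, week), visits in sorted(weekly_visits("ia_visits.csv").items()):
        print(f"{year}-W{week:02d}: {visits}")
```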
Costs:
- Loss of detail: individual users cannot be tracked for further study (more privacy, but we have informed consent)
- Not possible to restructure the data (but filters work pretty well)
- Iterative process to define goals in economic terms: not a great fit, but not an insurmountable problem either
- Definitions of "hit" and "visit" are determined by Google (a true cost?)
Database Analysis
Process
- Decide what to store (iterative and tied to theory)
- Update/store events in the database
- Complex SQL queries to collate events into meaningful tables (see the sketch after this list)
- Scrubbing the data
- Analysis and interpretation
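A minimal sketch of the "complex SQL queries" step, using Python's built-in sqlite3 and a hypothetical page-view log (the IA's actual schema and table names are not shown here, so these are placeholders): raw hits are collated into a per-user activity summary.

```python
import sqlite3

# Hypothetical event log: one row per page view (user_id, page, hit_time).
SCHEMA = """
CREATE TABLE IF NOT EXISTS page_views (
    user_id  TEXT,
    page     TEXT,       -- e.g. 'my_resources', 'add_content', 'edit_project'
    hit_time TIMESTAMP
);
"""

# Collate raw hits into per-user totals: overall hits, distinct pages,
# project-editing hits, and first/last activity (useful for longitudinal slicing).
COLLATE = """
SELECT user_id,
       COUNT(*)                                               AS total_hits,
       COUNT(DISTINCT page)                                   AS distinct_pages,
       SUM(CASE WHEN page = 'edit_project' THEN 1 ELSE 0 END) AS edit_hits,
       MIN(hit_time)                                          AS first_seen,
       MAX(hit_time)                                          AS last_seen
FROM page_views
GROUP BY user_id
ORDER BY total_hits DESC;
"""

def collate(db_path="ia_log.db"):
    con = sqlite3.connect(db_path)
    con.executescript(SCHEMA)
    rows = con.execute(COLLATE).fetchall()
    con.close()
    return rows

if __name__ == "__main__":
    for row in collate():
        print(row)
```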
Outcomes
Example Workshop Data (live)
- Very detailed reports at the user level (e.g., user studies)
- Xin Mao's dissertation
- Workshop impact
- Found "Workshops in the Wild" (spontaneous, driven by users)
Benefits:
- Very Powerful
- Very Complex
- Ask very detailed questions about users (a single sample or longitudinally)
- Expand user understanding in conjunction with other data (e.g., survey data, artifact analysis, discussion board comments, observations)
- Track everything, decide what and how to analyze later.
Costs:
- Very powerful
- Very complex
- Very labor intensive
- Unclear how to compare user activity profiles longitudinally with such detailed and seasonal data (an example of the cost of power and complexity)
- Privacy issues
- Opened a can of SPAM!
Artifact Analysis
Process
- Tied to a specific theoretical framework ('Teaching as design')
- Identify design strategy and granularity of resources used
Outcomes
- Indication that resource granularity and design strategy are related (see the sketch after the table)
Design strategy by resource granularity:
| Resource granularity | Offload | Adapt | Improvise | Total % |
| Small | 8 | 6 | 7 | 30 |
| Medium | 13 | 7 | 16 | 51 |
| Large | 7 | 6 | 0 | 19 |
| Total % | 40 | 27 | 33 | |
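The "indication" noted above can be checked with a standard test of independence. Reading the cells as raw project counts (they sum to 70, which is consistent with the percentage margins), a chi-square test on the 3x3 table is one way to quantify the relationship. This is an illustrative sketch using SciPy, not the analysis reported in the talk:

```python
from scipy.stats import chi2_contingency

# Counts from the table above: rows = Small, Medium, Large granularity;
# columns = Offload, Adapt, Improvise design strategies.
observed = [
    [8, 6, 7],
    [13, 7, 16],
    [7, 6, 0],
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
# A small p-value supports a relationship between granularity and strategy,
# with a caveat: one cell is 0 and a few expected counts fall below 5,
# so the approximation is rough for a sample this small.
```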
Benefits:
- Impact on teacher professional development workshops
- Implications for DL collection policy
Costs:
- Not 100% reliable
- Very labor intensive
Lessons Learned: the three towers
Approaches vary in terms of ease of implementation and analysis, and quality and precision of answers.
There are challenges in triangulating masses of data, finding complementarity, and informing development (especially when such data contradict findings from other methods).
Three Objectives
- A sense of site activity
- Access ebbs in summer and weekends
- Site is doing what intended
- bounce rates of projects: ~85%, other pages: ~20%
- exit rates of projects: ~70%, other pages: ~7% - [GA]
- What are teachers doing?
- Using the site
- longest user time-on-page for "add content" and "my resources" - [GA]
- project edits are made in seconds - [GA]
- Lab use
- sequence/volume of hits - [DB & GA]
- IP address groups - [DB] (see the sketch at the end of this outline)
- In-class use?
- single hits that run through a project's links - [DB]
- often from the same IP as most of the edits - [DB]
- Resource use?
- using "medium" sized resources - [AA]
- mainly "offload" or "improvise" on activity design - [AA]
- What are students doing?
- They are using the site
- top content "iamstudent" - [GA]
- diverse IPs - [DB]
- short to long time-on-page [DB & GA]
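One of the [DB] signals above, IP address groups, lends itself to a simple heuristic sketch. Assuming a hypothetical list of (ip, user_id) hits pulled from the page-view log, grouping hits by /24 network prefix separates "many users on one lab network" from "one user on many networks". A rough illustration under those assumptions, not the project's actual analysis:

```python
from collections import defaultdict

def group_by_network(hits):
    """Group hits by /24 prefix: {prefix: set of user_ids seen there}.

    `hits` is an iterable of (ip, user_id) pairs, e.g. pulled from a
    page-view log like the one sketched under Database Analysis.
    """
    networks = defaultdict(set)
    for ip, user_id in hits:
        prefix = ".".join(ip.split(".")[:3])  # e.g. '192.0.2'
        networks[prefix].add(user_id)
    return networks

if __name__ == "__main__":
    # Illustrative sample hits (documentation IP ranges, made-up user ids).
    sample = [
        ("192.0.2.10", "student_1"),
        ("192.0.2.11", "student_2"),
        ("192.0.2.12", "teacher_a"),
        ("198.51.100.5", "teacher_b"),
    ]
    for prefix, users in sorted(group_by_network(sample).items()):
        kind = "lab/classroom?" if len(users) > 1 else "individual?"
        print(f"{prefix}.x: {len(users)} user(s) ({kind})")
```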