NSDL testing framework

Draft of 12 June 2002 by Dean B. Krafft and Jon Phipps

Introduction

Our initial test plan for the Core Integration portion of the NSDL consists of the overall elements described in the sections below: user interface, data integrity, component, operational, and security testing.

In each section, we list the types of tests that will be performed. This is a very early draft of the list of tests; contributions are welcome.

User Interface

Primary UI usability testing will be based on end-user testing organized by the Evaluation Working Group. Functional problems will be reported to the CI developers through Comm Portal trackers in a workspace set up for this purpose. The test plan needs to include:

The following elements of the system will be included in the usability testing:

Data integrity

Data integrity tests take two forms. Automated tests check a variety of counts and statistics as the data passes through the various stages of the system: ingest, the Metadata Repository, and search. These tests also confirm that XML and database conversions preserve the data. The second form is manual inspection and sampling of the data to verify that the metadata harvested from collections accurately represents the underlying resources.
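As one possible sketch of such an automated count check, the following JUnit/HTTPUnit test compares the number of records reported by hypothetical OAI-PMH interfaces for the ingest stage and the Metadata Repository. The URLs, the oai_dc prefix, and the simple string matching are illustrative assumptions only; a production test would also follow resumptionTokens to count complete result sets.

    import junit.framework.TestCase;
    import com.meterware.httpunit.WebConversation;
    import com.meterware.httpunit.WebResponse;

    public class RecordCountTest extends TestCase {

        // Hypothetical OAI-PMH endpoints; the real URLs depend on the deployment.
        private static final String INGEST_OAI =
            "http://ingest.example.org/oai?verb=ListIdentifiers&metadataPrefix=oai_dc";
        private static final String MR_OAI =
            "http://mr.example.org/oai?verb=ListIdentifiers&metadataPrefix=oai_dc";

        // The record count should be preserved as data moves from ingest into
        // the Metadata Repository.
        public void testRecordCountsMatch() throws Exception {
            WebConversation wc = new WebConversation();
            int ingestCount = countIdentifiers(wc.getResponse(INGEST_OAI));
            int mrCount = countIdentifiers(wc.getResponse(MR_OAI));
            assertEquals("record counts differ between ingest and the Metadata Repository",
                         ingestCount, mrCount);
        }

        // Crude count of <identifier> elements in a ListIdentifiers response; a
        // full test would follow resumptionTokens and use a real XML parser.
        private int countIdentifiers(WebResponse response) throws Exception {
            String xml = response.getText();
            int count = 0;
            for (int i = xml.indexOf("<identifier>"); i >= 0;
                     i = xml.indexOf("<identifier>", i + 1)) {
                count++;
            }
            return count;
        }
    }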

Components

Component-level testing is based on functional (or acceptance) testing at the available external interfaces. Test data typically enters the process through a special NSDL test collection. This collection provides both a fixed set of test cases that ensure data is uniformly maintained as the system is developed and expanded, and a growing set of test cases that exercise the new functionality being developed for the library.

Sample test data includes:

Component testing is primarily based on automated HTTPUnit test cases that access and verify the web pages the overall system generates in response to OAI and uPortal requests.
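For example, a minimal HTTPUnit check of the Metadata Repository's OAI interface might look like the following sketch. The base URL is a placeholder and the assertions are illustrative; real tests would verify the full structure of each response against the test collection.

    import junit.framework.TestCase;
    import com.meterware.httpunit.WebConversation;
    import com.meterware.httpunit.WebResponse;

    public class OaiIdentifyTest extends TestCase {

        // Hypothetical Metadata Repository OAI base URL.
        private static final String OAI_BASE = "http://mr.example.org/oai";

        // Verify that an OAI Identify request returns a response that names
        // the repository.
        public void testIdentify() throws Exception {
            WebConversation wc = new WebConversation();
            WebResponse response = wc.getResponse(OAI_BASE + "?verb=Identify");
            assertEquals("unexpected HTTP status", 200, response.getResponseCode());
            String body = response.getText();
            assertTrue("missing repositoryName element",
                       body.indexOf("<repositoryName>") >= 0);
        }
    }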

Component tests include:

Metadata Repository

Search

Authentication

User Interface

SDSC Archive Service

Operational

Operational testing verifies the integrity of the system under normal and abnormal operating conditions. One of the major challenges in developing reasonable operational tests for the NSDL is the great uncertainty about the expected system load. To establish an initial baseline, we inquired about the load on the Library of Congress catalog service. Currently, the LOC limits web catalog access to 275 simultaneous sessions; another 250 simultaneous Z39.50 users are allowed, for a total of 525 simultaneous external sessions. This may serve as a reasonable initial target for the NSDL uPortal service.

We currently plan to use a combination of JMeter (to impose a specific system load), HTTPUnit (to create and verify specific accesses), and JUnitPerf (to verify that individual tests complete within specified response times) for load testing of uPortal, the Metadata Repository, and the Search service.
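A minimal sketch of how these tools could be combined is shown below: an HTTPUnit functional test is wrapped in a JUnitPerf TimedTest and LoadTest. The search URL, the expected page content, the 50-user load, and the 5-second limit are placeholder assumptions, not measured requirements; JMeter would be used separately to impose background load while suites like this one measure response times.

    import junit.framework.Test;
    import junit.framework.TestCase;
    import com.clarkware.junitperf.LoadTest;
    import com.clarkware.junitperf.TimedTest;
    import com.meterware.httpunit.WebConversation;
    import com.meterware.httpunit.WebResponse;

    public class SearchLoadTest extends TestCase {

        // Hypothetical uPortal search URL; the real URL depends on the deployment.
        private static final String SEARCH_URL =
            "http://portal.example.org/search?q=photosynthesis";

        public SearchLoadTest(String name) {
            super(name);
        }

        // A single functional access, created and verified with HTTPUnit.
        public void testSearch() throws Exception {
            WebConversation wc = new WebConversation();
            WebResponse response = wc.getResponse(SEARCH_URL);
            assertEquals("unexpected HTTP status", 200, response.getResponseCode());
            assertTrue("no results section in page",
                       response.getText().indexOf("Results") >= 0);
        }

        // Wrap the functional test: 50 concurrent simulated users, and each
        // individual access must complete within 5 seconds.
        public static Test suite() {
            Test search = new SearchLoadTest("testSearch");
            Test timed = new TimedTest(search, 5000);
            return new LoadTest(timed, 50);
        }
    }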

Load testing (server and I/O)

System integration

Integration tests at the code level will be carried out by a daily Ant build, deployment, and “smoke test” suite run. Service-level integration testing will take place in conjunction with the functional component testing described above.
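A minimal sketch of such a smoke-test suite appears below, reusing the illustrative test classes sketched earlier in this document as placeholders for the real component tests. The daily Ant run would build and deploy the system and then execute this suite; any failure fails the nightly build.

    import junit.framework.Test;
    import junit.framework.TestSuite;

    // Aggregates a few fast, high-level checks into the suite run after the
    // daily build and deployment.
    public class SmokeTestSuite {

        public static Test suite() {
            TestSuite suite = new TestSuite("NSDL nightly smoke tests");
            suite.addTestSuite(RecordCountTest.class);       // data-integrity count check
            suite.addTestSuite(OaiIdentifyTest.class);       // Metadata Repository OAI interface
            suite.addTest(new SearchLoadTest("testSearch")); // one functional search access
            return suite;
        }
    }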

Disaster recovery and system failover

We will develop a full set of procedures and protocols for backups, system failover, and disaster recovery. Quarterly, we will test both full system rebuild and recovery from backup tape and system failover.

Security

Security of the systems will be based on a set of procedures and processes designed to protect them against known security hazards. Standard systems administration practices, including firewall filtering, lockbox system logging, monitoring vendor security bulletins, minimizing running services, and other similar steps, will be taken to maintain the security of the systems.