Usability Test Report Sample

Dublin Core

Title

Usability Test Report Sample

Subject

usability

Description

This document is an example of a usability test report for LIS 5472.

Creator

Marcia Mardis

Source

LIS 5472

Date

Spring 2018

Contributor

Marcia Mardis

Rights

CC BY

Relation

usability proposal
week 13 slides
usability questionnaire
week 12 slides

Format

file

Type

text

Text Item Type Metadata

Text

2018 Spring LIS 5472
Group 9
Group Members: Dr. M and Others

Class Examples Digital Library: Usability Test Report

Executive Summary with Critical Findings

The Class Examples Digital Library was designed to be an educational resource for students in LIS 5472. The target audience is MSI and MSIT students enrolled in the digital libraries course. While MSI and MSIT students are the primary audience, we also wanted to create a site that would be accessible to faculty and students at other universities, considering varying levels of experience and comfort with the Web. It was important to us to create a site that is accessible and usable, minimizing encountered errors to alleviate potential user frustrations. To accomplish these goals, our team developed a list of questions based on Nielsen’s 10 Usability Heuristics (Nielsen, 2005). These questions allowed us to view our site from an outside user’s perspective and gain insight into what improvements and enhancements should be made to future versions of our digital library.

The team conducted a usability analysis of our digital library using the Usability Questionnaire we created. The findings of each team member were then compiled to identify problematic areas in terms of usability. It is interesting to note that at times when one team member was satisfied and responded ‘yes’ to a question, the other responded ‘no’ and noted potential changes to be made. This allowed identification of both positive and negative aspects of the digital library from multiple perspectives, which resulted in useful recommendations for changes and enhancements to future versions of the site.

When considering the usability of our site, the team was particularly concerned with ensuring that users could see how the displayed class artifact fit with others in the class. Perrin (2017) emphasized the importance of helping users quickly and easily find what they need. Findings from the test revealed that users found the site to be well-organized and fairly simple to learn. In addition, users felt that pages provided information about the current item as well as related items. As we were pleased with the overall organization and content of the site, we attempted to identify ways to enhance and add additional features to improve the user experience even further. Some of these include adding new pages with helpful information, such as a site map or a vocabulary page, and altering the site’s theme and colors for increased visibility.

The test results revealed a critical issue with the searching feature of our site. It was noted that the search does not clearly identify when zero terms are found and no suggestions or alternative terms are provided. Currently, the search procedure leaves little room for user accidents or errors. This could potentially cause frustration for users unable to find what they need. To address this important issue, the bulk of our recommendations for improvement centered on enhancing the searching mechanism and browsing options to create a better search experience for the user.


Test Results

This section discusses some of the results from our Usability Heuristic Questionnaire, revealing both positive and negative aspects of our digital library. The quoted titles in bold are the ten heuristics developed by Nielsen (2005) that were used when creating our Usability Questionnaire. Two questions were developed for each heuristic, and team members were to check ‘yes’ or ‘no’ for each question when evaluating. Each section below briefly discusses the questions, the evaluator responses, and compiled notes relating to their answers. As there were two evaluators, it is noted when a question received a ‘yes’ from one evaluator and a ‘no’ from the other.

“Visibility of system status”: These two questions asked if all pages had a navigation bar with site identifying information and whether users could easily return to the homepage from any location within the site. The first question received a ‘yes’ response from both team members as a navigation bar that provides information and direction is present at the top and bottom of every page. The second question received a negative answer from both. The homepage is accessible by clicking on the main title banner or by clicking the “Home” link at the bottom of each page. However, a “Home” link is not prominently featured on any page so it is not necessarily ‘easy’ to return to the homepage.

“Match between system and the real world”: These questions related to how well our site uses clear and understandable vocabulary and whether the site is organized in a logical form that is easy to navigate and makes clear how a particular item relates to other items. The clarity of vocabulary received both a ‘yes’ and ‘no’ response from the two team members. It was noted that the vocabulary used throughout the site is generally clear and understandable. However, there is the potential that users may be less familiar with academic terms such as “proposal” or “activity” which are often used in the assignment descriptions and collection information. The question relating to logical site organization also received a mixed response of ‘yes’ and ‘no’ from the two team members. Generally the site is simple to navigate, but it was noted that an alphabetized directory of all the items in the collection is not available to users.

“User control and freedom”: The two questions asked if users could edit their own contributions and if they were provided with multiple viewing options for images. Both questions received affirmative answers from the evaluators. Users have the ability to edit any comments they leave on the site. In addition, users have the freedom to choose which method they would like to use to view an image. They can view the basic item from the item page, use the Zoom-it image viewer to zoom closer and see additional details, or click on the image itself to see the full-size JPEG file.

“Consistency and standards”: These questions related to standardization of page layouts and whether vocabulary was used consistently throughout the site. Both questions received ‘yes’ responses from the evaluators. The layout is uniform throughout the site thanks to the Omeka theme that provides a standardized layout of all pages. The vocabulary is generally consistent throughout the site, and the use of controlled vocabularies ensures that locations, image types, and animal names have standardized terminology.

“Error prevention”: The questions from this section asked whether all links were working and whether a contact form was provided for encountered errors. Both of these questions were answered affirmatively by the evaluators. All links the team members tested on the site are functioning properly at this time.
Users have frequent and prominent links throughout the site pages to contact site administrators to report problems or errors.

“Recognition rather than recall”: These two questions related to the information provided by the site. Users were asked whether links to the collections were visible on all pages and whether a general site information page was available. Evaluators answered both questions with ‘yes’ responses. Users are provided with prominent links to browse and search the collections on the navigation toolbar. A general
“About” and a “Frequently Asked Questions” page are provided to familiarize users with the site’s layout, terminology and other important information.


“Flexibility and efficiency of use”: The four questions asked about the availability of basic and advanced search options and the ease of browsing on the site. The first, second, and third questions received a ‘yes’ response from both evaluators. The site provides both advanced and basic searching options. The fourth question received both a ‘no’ and a ‘yes’ response. The site has a browsing feature, and users can search by tags for additional browsing capabilities. However, it was identified that the site lacks an alphabetized listing of assignments or a site map to further guide user discovery and browsing.

Recommendations:
1. Use tags to include more common terms for class items, such as “planning” for “proposal.”
2. Use tags to link pieces of the same content (e.g., “digitization activity,” “digitization proposal,” “digitization assignment”) so that a search of “Activity 11” will get all items related to digitization.
3. Create a controlled vocabulary of assignment types (activity, proposal, assignment).
4. Add links to the master item display page.
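Recommendations 1–3 amount to mapping each controlled assignment type to more common equivalent terms so that a search for either term retrieves the same items. The sketch below is only an illustration of that idea in Python; the synonym lists are invented for the example, and on an actual Omeka site this mapping would be realized through the tagging interface rather than code.

```python
# Hypothetical tag-synonym table: each controlled assignment type maps to
# more common terms a user might search for instead.
TAG_SYNONYMS = {
    "proposal": ["planning", "plan"],
    "activity": ["exercise", "task"],
    "assignment": ["homework", "submission"],
}

def expand_query(term):
    """Return the set of tags a search term should match.

    If the term is a controlled type or one of its synonyms, the whole
    synonym group is returned; otherwise the term matches only itself.
    """
    term = term.lower().strip()
    matches = {term}
    for canonical, synonyms in TAG_SYNONYMS.items():
        if term == canonical or term in synonyms:
            matches.add(canonical)
            matches.update(synonyms)
    return matches

print(sorted(expand_query("planning")))  # → ['plan', 'planning', 'proposal']
```

With such a table in place, tagging items with the canonical type is enough: a search for “planning” would also surface everything tagged “proposal.”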

“Aesthetic and minimalist design”: Considering the design of the site, the first question asked if the site used contrasting colors that were easy to view, while the second question asked if each page’s
information was concise and relevant to the page topic. The first question received both a ‘yes’ and a ‘no’ response. The site design features bold and bright colors that are visually pleasing and generally easy to read. However, the orange text for the un-clicked hyperlinks can be challenging to read on the white background. The second question also received both a ‘yes’ and a ‘no’ response. The information provided on each page is succinct and pertinent to the topic identified as the page subject. It was also noted that while browsing items in a collection, the “Featured Item” is a randomly selected animal that is not always a part of the collection being viewed.

“Help users recognize, diagnose, and recover from errors”: These two questions asked first if the system alerted users to search terms not found and second, if users were given additional search options when terms were not found. The first question received both a ‘yes’ and a ‘no’ response from the
evaluators. The system returns a message of “Browse Items (0) total” if the search term is not found; however, it was noted that this is not a particularly visible or obvious way of expressing the result to users. The second question also received a ‘yes’ and a ‘no’ response from the two evaluators. Users are provided with options to browse the collections if their term is not available; however, the system does not give any additional terms or suggestions.

“Help and documentation”: The final two questions asked whether searching tips were provided for new users and whether a help page was available for further assistance. Both questions received
affirmative responses from the evaluators. A “Searching Tips” page is provided to guide new users through the site’s search features. A “Help” page provides basic assistance and information for potential problems, as well as a “Contact Us” feature to report encountered errors or to request assistance.

Recommendations for Improvement

Based on the usability test conducted by the digital library team, we have developed a series of recommended improvements and enhancements for future versions of our digital library site. The primary recommendations relate to the browsing and search features of our site. As we are developing the site for users with varying levels of experience and comfort with the Web, providing simple searching and navigation are two of our biggest concerns.


There are several options for the user to return to the homepage; however, none of them are prominent links. This could complicate the process of returning ‘home’ if users are not aware of the links. To optimize the user’s comfort and provide the “easy exit” recommended by Nielsen (2005), we recommend including a link to the homepage on the navigation toolbar.

While the overall terminology and vocabulary is understandable and clear, we recommend adding a page defining some of the more academic terms used throughout the site, such as “assessments” or “introductions.” These terms, while appropriate for the site’s target user level, may not be as well known to some of our users. A page providing relevant terms and definitions will provide a resource for anyone unfamiliar with some of the more academic or course-specific terminology used on the site.

The site is easy to browse and provides a variety of features to encourage user discovery. To further enhance this, we recommend adding both a site map and an alphabetized listing of all items contained in the collection. These additions will provide more browsing options as well as another method of site organization for users who may be searching the site in a more linear manner.

The theme and color scheme of the current layout are generally adequate in terms of visibility. However when making future improvements, we recommend selecting or configuring a theme with some additional options for text coloration as some of the orange hyperlinks can be challenging to view.

The “Featured Item”, which showcases a randomly selected assignment from the collection, is a part of the current Omeka theme applied to the site and therefore does not appear to be customizable. Overall this is a positive feature as it promotes user discovery by highlighting various assignments in the collection.
However, we noticed the “Featured Item” appears on pages while browsing the collections. For example, one may see a featured bird on the side after clicking the link to “View the items in Digitization”. As this may be potentially confusing to users, we recommend researching ways to limit the “Featured Item” to only specific pages within the site.

The search feature does indicate that zero items are returned when a term is not found, which alerts users that their search was unsuccessful. However, we recommend making it more apparent that no results were found for the search term. The sample page provided by Neil (2009) of a humorous “Page Not Found” with helpful links and tips is an excellent example of how to proceed. Possible solutions include adding
“Sorry, we didn’t find anything with that term!” or another similar message to alert users to the problem. In addition to the previous issue, the system does not provide additional searching suggestions or take misspellings into account. We recommend enhancing the search mechanism to provide suggestions or to have increased sensitivity to spelling errors. A possible solution could be to integrate the search
suggestion with the “Page Not Found” message.
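One way to sketch the spelling-tolerant suggestion idea is to fuzzy-match the failed query against an index of known tags and titles. The snippet below uses Python’s standard-library difflib purely as an illustration; the vocabulary list and message wording are invented for the example, and an actual Omeka deployment would require a plugin or theme modification rather than this standalone code.

```python
import difflib

# Hypothetical index of searchable terms drawn from item tags and titles.
KNOWN_TERMS = ["digitization", "proposal", "activity", "assignment",
               "metadata", "usability", "preservation"]

def search_with_suggestions(query, index=KNOWN_TERMS):
    """Return exact matches, or a friendly no-results message with close alternatives."""
    hits = [t for t in index if query.lower() in t]
    if hits:
        return {"results": hits, "message": None}
    # No hits: offer near-misses to catch likely misspellings.
    suggestions = difflib.get_close_matches(query.lower(), index, n=3, cutoff=0.6)
    message = "Sorry, we didn't find anything with that term!"
    if suggestions:
        message += " Did you mean: " + ", ".join(suggestions) + "?"
    return {"results": [], "message": message}

# A misspelled query still yields a helpful pointer instead of a bare zero count.
print(search_with_suggestions("digitisation")["message"])
```

The key design point is that the explicit apology message and the suggestion list are generated together, which is exactly the integration of the “not found” message with search suggestions proposed above.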

These recommendations for improvement address the critical findings of our usability analysis. We believe the analysis allowed us to recommend some changes that will improve on the identified problem areas and further enhance the successful elements of our digital library. It is our hope that these improvements will create increased usability and a more successful user experience.


References

Nielsen, J. (2005). Ten Usability Heuristics. Retrieved from http://www.useit.com/papers/heuristic/heuristic_list.html

Neil, T. (2009). 6 Tips for a Great Flex UX: Part 5. Retrieved from http://designingwebinterfaces.com/6-tips-for-a-great-flex-ux-part-5

Reeves, T.C., Apedoe, X., & Woo, Y. (2005). Evaluating digital libraries: A user-friendly guide. University Corporation for Atmospheric Research; National Science Digital Library. Retrieved from http://www.dpc.ucar.edu/projects/evalbook/index.html