Summary: This article summarizes a usability evaluation of a university portal website. University faculty, staff, and student users were asked to complete representative search tasks and provide feedback on the portal's usability. Several user interface design issues were found to impact user performance, in terms of task success and perceived task difficulty, as well as overall satisfaction. From these results, recommendations are made for university portal design related to the default 'home' page, channel customization and configuration, and placement of user-specific functions.
Portal websites serve as gateways to large amounts of information and may be aimed at public or private audiences. Portal pages are often divided into subsections called portlets, or channels. For example, Yahoo.com serves as a public point of access to a multitude of information ranging from news and weather to movie reviews and music. Corporations and universities use private portal websites as secure points of access to employee or student information. In higher education, such a portal is sometimes viewed as the "intranet" complement to the university's public web presence. Through this portal, university staff may access employment information from Human Resources, faculty may access student class lists and enter grades, and students may access course materials, grades, and financial aid information. One major benefit of a university portal at the institutional level is the ability to access all of this information with a single sign-on (Olsen, 2002; Nielsen, 2008).
Many tools are being used to build enterprise portals. In 2000, the uPortal open source project was initiated in an attempt to standardize the content and services provided. This gave educational institutions access to shared services and information and saved "content/service providers from redundant work in developing a user interface and navigation" (Wheeler, 2004). In a recent review of 23 corporate and educational portals, Nielsen (2008) reports steady growth in portal usage, increased numbers of features offered, and increased collaboration within organizations.
While the idea of a single point of access for quick and easy access to critical academic information sounds idyllic, designers of such interfaces are faced with the challenges of presenting the material in a logical and usable manner. This becomes especially challenging as access to more information becomes available and the size of the portal interface increases.
This article reports the findings of a usability study of the Wichita State University portal. The evaluation was conducted before the portal went "live" in late 2007. Users from three primary user groups (faculty, staff, and students) were asked to complete a series of basic search tasks. The portal interface consisted of four tabs of information for students and five tabs for faculty and staff.
Representative participants were recruited from the university campus. One Pentium-class computer running Windows XP at 1024 x 768 screen resolution was used to access the university portal, which ran SunGard's Luminis Platform v3.3.3 (Figure 1). Participants were videotaped using a web camera, and the software program Morae™ 2.0 (TechSmith, 2007) captured the on-screen events of the website for each task. Morae™ was also used to gather performance data, including time on task and the number of pages navigated by each participant.
Figure 1. Sample page of the Wichita State university portal.
The tasks were representative of common activities conducted by each user group. These tasks included searches for both general and specific information within the portal interface. Table 1 shows a sample of the kind of information each user group was asked to find.
Table 1. Sample of tasks completed by each user group.
| Students | Staff | Faculty |
| --- | --- | --- |
| Final exam for class | W4 info | Teaching schedule |
| Financial aid status/summary | Health benefits | Class list |
| Transcript request status | Link to personal address | Dashboard office hours/customization |
| Final grades for spring 2007 | Department contact info | Final exam time for class you teach |
| Drop/add course | Sick leave accumulated | Place to enter final grades |
| Campus weekend events | Pay stubs for past 3 months | Phone contact for Payroll |
| Textbook for class | Map of the campus | Course number |
| Dept phone number | Dates for spring break 2008 | Faculty senate meeting time |
| Add tab for Google to portal | Customize Financial Advisor (Budget Officers) | Customize portal channels |
Participants were tested in the laboratory on an individual basis. They were asked to search for the information, one task at a time, and were allotted up to 5 minutes per task. After each task, participants were asked to rate its ease/difficulty on a 5-point scale.
The entire test session lasted 30-60 minutes, depending on the speed of the participant. Tasks were presented to each participant in random order to avoid order effects. Once testing on all tasks was complete, participants completed a satisfaction survey (Brooke, 1996) and were interviewed for overall comments. Performance data (success, task completion time) and subjective data (perceived task difficulty, satisfaction) were gathered for each participant.
Analysis of the satisfaction scale (maximum score = 100) showed that Staff users reported the highest level of overall satisfaction (M = 72.5, SD = 17.18), followed by Faculty (M = 55.0, SD = 29.2) and Students (M = 52.5, SD = 28.7), though the difference was not statistically significant (F(2,15) = 1.08, p > .05).
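For readers who want to check the statistic, the reported F value can be reproduced from the summary data alone. The sketch below is ours, not part of any statistics library, and it assumes six participants per group, which is implied by the reported degrees of freedom (F(2,15) gives N - k = 15 with k = 3 groups):

```typescript
// Reproduce a one-way ANOVA F statistic from group summary statistics.
// Assumed: n = 6 per group, inferred from the reported df F(2, 15).
interface GroupSummary { mean: number; sd: number; n: number; }

function oneWayAnovaF(groups: GroupSummary[]): number {
  const k = groups.length;
  const N = groups.reduce((s, g) => s + g.n, 0);
  const grandMean = groups.reduce((s, g) => s + g.n * g.mean, 0) / N;
  // Between-groups sum of squares: weighted squared deviations of group means.
  const ssBetween = groups.reduce((s, g) => s + g.n * (g.mean - grandMean) ** 2, 0);
  // Within-groups sum of squares: recovered from each group's standard deviation.
  const ssWithin = groups.reduce((s, g) => s + (g.n - 1) * g.sd ** 2, 0);
  return (ssBetween / (k - 1)) / (ssWithin / (N - k));
}

const f = oneWayAnovaF([
  { mean: 72.5, sd: 17.18, n: 6 }, // Staff
  { mean: 55.0, sd: 29.2,  n: 6 }, // Faculty
  { mean: 52.5, sd: 28.7,  n: 6 }, // Students
]);
console.log(f.toFixed(2)); // prints "1.08", matching the reported F(2,15) = 1.08
```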
The following summarizes the main issues identified in the usability study:
1. Default Tab
The default tab for ALL user groups in this portal was a tab entitled "Student Resources". While this worked well for the students, it was a problem for the faculty and staff, who also searched this tab first for nearly all of their information. Many did not realize until later in the study that there were multiple tabs of information available to them. In particular, they did not notice that there was a tab entitled "Faculty/Staff", which really should have been their starting point. The default "home" tab is important to the design of the portal because this is where first-time users start to build a mental representation of the site’s purpose and structure. In this study, each user group assumed that the first page displayed would be relevant to them. Nielsen (2008) refers to this as "role-based personalization" and suggests that this is the direction that all intranet applications should take.
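As one illustration of role-based personalization, the landing tab could be keyed to the user's role at sign-on. The sketch below is hypothetical: the tab names mirror this portal, but the `Role` type and `landingTab` function are our assumptions, not part of the Luminis platform.

```typescript
// Hypothetical role-based default tab selection. Tab names follow the study's
// portal; the Role type and selection logic are illustrative assumptions.
type Role = "student" | "faculty" | "staff";

const DEFAULT_TAB: Record<Role, string> = {
  student: "Student Resources",
  faculty: "Faculty/Staff",
  staff: "Faculty/Staff",
};

function landingTab(roles: Role[]): string {
  // Users may hold several roles (e.g., staff members taking classes);
  // prefer the employment role so faculty/staff do not land on student content.
  if (roles.includes("faculty") || roles.includes("staff")) {
    return DEFAULT_TAB[roles.includes("faculty") ? "faculty" : "staff"];
  }
  return DEFAULT_TAB["student"];
}

console.log(landingTab(["faculty"])); // "Faculty/Staff"
console.log(landingTab(["student"])); // "Student Resources"
```

With a mapping like this, each user group would start on a page relevant to them rather than a shared default.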
2. Class Management
Crucial class-maintenance tasks (i.e., posting grades, posting a syllabus, looking up a roster, and finding a class schedule) were rated as difficult and were completed less successfully by faculty. This was because several channels were dedicated to class maintenance (see Figure 2). Faculty did not understand the differences between the "Schedule", "Faculty Dashboard", and "Final Grade" channels. Each listed a subset of the classes taught by the faculty member, but each listed them in a different order that users did not find logical (users expected an ordering such as by term, alphabetical, or numerical). They did not realize that one channel was for grade assignment, one for class listings, and one for the daily schedule. Instead, users wanted a single channel from which all class maintenance activities could be conducted.
Figure 2. Faculty were confused why there were three channels that each had course listings in different orders.
3. Icon Representation
The use of icons throughout the Luminis portal was prevalent but not intuitive. Accessing information within the "Faculty Dashboard", editing that channel (see Figure 3), and posting final grades within "Final Grade Assignment" (see Figure 4) were particularly challenging tasks for faculty users. The icons were not recognizable as representing particular tasks, nor did participants recognize them as links to additional information. While pop-up informational tags appeared for each icon on mouse-over, most participants did not discover this and instead wanted to simply click on the course name. Figure 3 shows the icon links for editing the course listings in the channel, viewing a class roster, and viewing instructor information such as the syllabus, office hours, and email address. Likewise, Figure 4 shows the confusing icon links users needed to click to enter student grades.
Figure 3. Users did not notice the Edit icon (top) which was the way to configure the channel content. Most clicked on the course link rather than the other highlighted icons to manage their courses.
Figure 4. Faculty found the highlighted message icons confusing. They did not realize they needed to click these to enter final grades.
4. Portal Customization
Users in all three user groups had difficulty customizing the portal pages, including editing the information within a channel and changing the number and order of channels on a particular tab. Most users expected to drag and drop, or right-click, to move a channel. Instead, the portal interface required them to click on a customization link. Those who found the link were still unsuccessful because of the non-intuitive way channels had to be manipulated. Figure 5 shows a sample customization page. To move a channel, users had to click small arrow icons. When using the arrows to move a channel horizontally, the channel shifted to the next column as expected but dropped to the bottom of that column. Many users who reached this point in the task became disoriented when the channel seemingly disappeared. The arrows were also a very inefficient way to move a channel vertically, since each click moved it only one position, requiring many clicks and constant tracking by the user to get the channel to its new location.
One additional problem with portal customization was the default page of the customization area (see Figure 6). Most users assumed the customization link would open on the page they were currently viewing. Instead, it defaulted to the page last customized. As a result, some users completed the customization task without realizing they were changing the wrong page and were then confused when their changes did not appear on the actual portal page.
Figure 5. Requiring multiple clicks via the arrows to move channels was problematic, especially because channels dropped to the bottom of the next column when moved horizontally. Participants commonly lost track of where the channel was.
Figure 6. The customization default tab was not the tab the user was currently viewing.
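The arrow-based interaction can be contrasted with direct repositioning in a few lines of list logic. This is an illustrative model, not the portal's actual code: `moveByArrow` mimics the observed one-slot-per-click behavior (with a horizontal move dropping the channel to the bottom of the target column), while `dropAt` models the drag-and-drop behavior users expected.

```typescript
// Illustrative model of channel layout: each column is an ordered list of channels.
type Layout = string[][];

// Observed behavior: one click moves a channel a single slot vertically, and a
// horizontal move appends it to the BOTTOM of the adjacent column.
function moveByArrow(layout: Layout, col: number, row: number,
                     dir: "up" | "down" | "left" | "right"): [number, number] {
  const [channel] = layout[col].splice(row, 1);
  if (dir === "up" || dir === "down") {
    const newRow = dir === "up" ? Math.max(0, row - 1)
                                : Math.min(layout[col].length, row + 1);
    layout[col].splice(newRow, 0, channel);
    return [col, newRow];
  }
  const newCol = dir === "left" ? col - 1 : col + 1;
  layout[newCol].push(channel); // drops to the bottom: users lost track of it here
  return [newCol, layout[newCol].length - 1];
}

// Expected behavior: drag and drop places the channel exactly where it is released.
function dropAt(layout: Layout, fromCol: number, fromRow: number,
                toCol: number, toRow: number): void {
  const [channel] = layout[fromCol].splice(fromRow, 1);
  layout[toCol].splice(toRow, 0, channel);
}

const layout: Layout = [["Weather", "Campus News"], ["Email", "Calendar", "Grades"]];
// Moving "Weather" right with the arrows lands it at the BOTTOM of column 2,
// far from where the user expected it to appear.
console.log(moveByArrow(layout, 0, 0, "right")); // new position: column 1, row 3
```

A single drag-and-drop call replaces an arbitrary number of arrow clicks and keeps the channel under the user's pointer the entire time, which is why participants expected it.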
5. Facilitating User Scanning
When attempting to locate a department phone number, users correctly began their search on the Directory tab. Figure 7 shows that this tab had three different search fields, each in a different channel. Users often typed their search in the wrong field because they did not take the time to read the instructions for each channel. Instead, they quickly scanned the page for anything that looked like a search field. In this case, too many similar fields caused confusion and error.
Figure 7. Users often entered their search term in the first field they saw on this page. They did not take the time to figure out which search field to use.
Results from the usability analysis of the university portal revealed several usability issues impacting end-user performance and satisfaction. It is expected that these same issues may occur in other academic portals as well as in industry intranet websites. As a result, the following general recommendations are made:
- Default the portal to the page/content most appropriate for each user’s role. Having a single sign-on is only valuable to users if it is individualized to them.
- Allow common activities (e.g., all course management) to be conducted from a single channel.
- Use action-oriented icon links only if they are intuitive and easily recognized.
- Make customization of the portal pages easy and intuitive. Allow users to drag and drop channels to reposition.
- Provide a single point of search that is accessible from everywhere in the portal.
Providing a secure portal interface can be powerful and time-saving for its users if it is designed properly. Developers are encouraged to conduct usability testing to make sure the layout and intended functionality match users' expectations. Failure to do so may lead to increased user frustration, dissatisfaction, and negative perceptions of the university (Sikorski, 2006).
Brooke, J. (1996). SUS: A Quick and Dirty Usability Scale, in P. Jordan, B. Thomas, B. Weerdmeester, & I. L. McClelland (eds.), Usability evaluation in industry, (pp. 189-94). London, UK: Taylor & Francis.
Nielsen, J. (2008). Enterprise Portals Are Popping. Available online: http://www.useit.com/alertbox/portals.html
Olsen, F. (2002). The Power of Portals. The Chronicle of Higher Education. Available online: http://chronicle.com/free/v48/i48/48a03201.htm
Sikorski, M. (2006). Building employer credibility in corporate intranet portals. In Proceedings of the 13th European Conference on Cognitive Ergonomics: Trust and Control in Complex Socio-Technical Systems (pp. 49–54).
TechSmith (2007). Morae™ (Version 2.0) [Computer software]. Available online: http://www.techsmith.com/morae.asp
Wheeler, B. (2004). The Open-Source Parade. EDUCAUSE Review, 39(5), 68–69. Available online: http://connect.educause.edu/Library/EDUCAUSE+Review/TheOpenSourceParade/40503?time=1210624020