Frictionless Clients: High-Stakes Assessment Tests

High-stakes assessment testing is a big deal for both students and the industry. Nationwide, teachers and students know the stress of the increasingly important state educational assessments. Here in Massachusetts, it's the MCAS. These tests are long and complicated, and because they have traditionally been delivered on paper, they are inflexible in design and delivery. A new industry is evolving to conduct “e-assessments.” For a recent project, I had the opportunity to analyze a number of these products. Two of the main players in the electronic assessment space are the Smarter Balanced Assessment Consortium (SBAC) and the Partnership for Assessment of Readiness for College and Careers (PARCC).

SBAC

Smarter Balanced Screenshot

The objective is to convert every student assessment to an electronic one, delivered via computer. Wikipedia summarizes the case for this shift: “E-assessment is becoming widely used. It has many advantages over traditional (paper-based) assessment. The advantages include:

  1. Lower long-term costs

  2. Instant feedback to students

  3. Greater flexibility with respect to location and timing

  4. Improved reliability (machine marking is much more reliable than human marking)

  5. Improved impartiality (machine marking does not ‘know’ the students so does not favour nor make allowances for minor errors)

  6. Greater storage efficiency – tens of thousands of answer scripts can be stored on a server compared to the physical space required for paper scripts

  7. Enhanced question styles which incorporate interactivity and multimedia.”

During my research, I discovered that some of the companies creating these tests have little experience developing software applications; their backgrounds are in test design or courseware. The designers of these assessment applications follow few of the standards or conventions that would be familiar to the children taking the tests. The assessments are being designed to be given on a standard desktop computer with whatever browser the school system has in place. Tablets are also being considered, but the vendors are drawing the line at phone-sized devices.

The analysis I performed found many inconsistencies, both within the test itself and with the standard web applications most students will have encountered. Internal inconsistencies include the visual language: the choice of colors, and the lack of clear indicators of what is clickable and where the targets for the answers are. The wording and placement of buttons are also confusing, as though a different designer created each button and chose its own label. Inconsistencies with web standards include taking over the browser controls, so that Back/Forward and Zoom cannot be used; in addition, resizing the browser or its contents does not adjust or reflow the content so it reads easily. See the Frictionless Design Student Experience Analysis document for the details of the design review.
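To make “taking over the browser controls” concrete, here is a hypothetical TypeScript sketch of the kind of script such an application might run in the page. It is not drawn from SBAC’s or PARCC’s actual code; the specific event handlers, keys, and history trick are my own assumptions for illustration, using standard DOM APIs.

```typescript
// Hypothetical sketch of browser-control hijacking a design review would flag.

// Attempt to swallow Ctrl/Cmd + "+", "-", "=", "0" so the browser's
// keyboard zoom shortcuts no longer work for the student.
document.addEventListener("keydown", (event: KeyboardEvent) => {
  const zoomKeys = ["+", "-", "=", "0"];
  if ((event.ctrlKey || event.metaKey) && zoomKeys.includes(event.key)) {
    event.preventDefault(); // student can no longer enlarge the test content
  }
});

// Block Ctrl/Cmd + mouse-wheel zoom as well.
document.addEventListener(
  "wheel",
  (event: WheelEvent) => {
    if (event.ctrlKey || event.metaKey) {
      event.preventDefault();
    }
  },
  { passive: false } // required, otherwise preventDefault() is ignored for wheel events
);

// Neutralize the Back button by re-pushing the current page onto the
// history stack whenever the student tries to navigate away.
history.pushState(null, "", location.href);
window.addEventListener("popstate", () => {
  history.pushState(null, "", location.href);
});
```

The friendlier alternative is simply to leave the browser’s zoom and history alone and to build the layout with relative units so the content reflows when the student enlarges the text or resizes the window.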

Rather than releasing software that creates unnecessary challenges for the students taking the test, SBAC and PARCC should have user experience designers redesign their applications so the student will not stumble over a clumsy UI; the student should be focused on taking the test, and the tool should simply be the vehicle that guides them through the content. A heuristic analysis and a usability study may point out the problems in the current design; however, the question remains whether the design is the best one in the first place. Before diving headlong into e-assessments, vendors need to address these solvable design flaws so that our students can focus on the important test in front of them.

Smarter Balanced Student Experience Analysis
