
College Board

Competencies
Initial Architecture
User Research
Stakeholder Interviews
Interaction Design
Workflow Definition
Product Discovery
Asset Delivery

Reimagining ACES for Higher Ed

Partnership Context

How user-focused design helped an education leader transform their digital platform
The College Board provides tools and services that help higher education institutions make data-informed decisions about admissions, placement, and student success. One of those tools, the Admitted Class Evaluation Service (ACES), allows universities to submit data and receive studies that connect admissions factors—like test scores and GPA—to actual outcomes.
ACES is trusted by institutions nationwide. But its platform experience was falling behind. College Board was receiving consistent feedback from admissions teams and institutional researchers: the submission workflow was hard to follow, inconsistent, and easy to get wrong. Flying Circle was brought in to redesign the ACES experience—making the full submission process easier to navigate, more transparent, and more aligned with the real-world needs of higher education teams.

Challenge: Maintaining Context

Submission workflows were too rigid and generic
Users were dropped into a workflow without context. There was no sense of where they were in the process, what steps came next, or whether their data had been received. This lack of orientation led to frustration, delays, and, in many cases, incomplete submissions.

Solution

We rebuilt the submission experience around a persistent process navigation with visual progress indicators, so administrators could see at a glance which steps were complete, where they were in the workflow, what came next, and whether their data had been received.
ACES submission workflow with visual progress indicators
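To make the idea concrete, here is a minimal sketch, in TypeScript, of the kind of step model that can drive a progress indicator like this. The step names, types, and helper function are illustrative assumptions for this case study, not the actual ACES implementation.

```ts
// Hypothetical step model behind a submission progress indicator.
// Step names and statuses are illustrative, not the real ACES schema.
type StepStatus = "complete" | "current" | "upcoming";

interface SubmissionStep {
  id: string;
  label: string; // shown in the process navigation
  status: StepStatus;
}

// Derive each step's status from the user's current position,
// so the navigation always answers "where am I, and what's next?"
function buildProgress(steps: string[], currentIndex: number): SubmissionStep[] {
  return steps.map((label, i) => ({
    id: `step-${i}`,
    label,
    status: i < currentIndex ? "complete" : i === currentIndex ? "current" : "upcoming",
  }));
}

// Example: a five-step submission, currently on "Upload Data".
const progress = buildProgress(
  ["Study Setup", "Term Mapping", "Upload Data", "Review", "Submit"],
  2,
);
```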

Outcome

Dramatic Reduction in Submission Errors
Administrators now move through ACES with clarity and confidence. The process navigation quickly became the most praised feature in usability testing and significantly reduced reliance on email updates and help desk support.

Challenge: Confusing Connections

Simplifying Term Mapping
Each ACES study type—such as placement validation or predictive modeling—had different requirements, but the legacy interface treated them all the same. This forced users to interpret field labels and sequences that didn’t always apply, increasing confusion and error rates.
Term mapping interface showing calendar type selection

Solution

We replaced the table-heavy interface with a visual term-mapping tool. Users could now select from clear, illustrated calendar types—such as 2-term or 3-term systems—and the interface adjusted accordingly. Plain-language labels and descriptions made it easier to move forward with confidence.
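The sketch below shows, under stated assumptions, how a small calendar-type configuration could drive an interface that adjusts to the system the user selects. The calendar IDs, labels, and term names are hypothetical examples, not College Board's actual data.

```ts
// Hypothetical calendar-type options driving a visual term-mapping tool.
// IDs, labels, and term names are assumptions for illustration only.
interface CalendarType {
  id: string;
  label: string;       // plain-language label shown to the user
  description: string; // short explanation under the illustration
  terms: string[];     // the term slots the form will ask the user to map
}

const CALENDAR_TYPES: CalendarType[] = [
  {
    id: "semester",
    label: "2-term (semester) system",
    description: "Fall and Spring terms across the academic year.",
    terms: ["Fall", "Spring"],
  },
  {
    id: "trimester",
    label: "3-term (trimester) system",
    description: "Three terms of roughly equal length across the year.",
    terms: ["Term 1", "Term 2", "Term 3"],
  },
];

// Once the user picks a calendar type, only its terms need mapping,
// so fields that don't apply never appear in the form.
function termsToMap(selectedId: string): string[] {
  return CALENDAR_TYPES.find((c) => c.id === selectedId)?.terms ?? [];
}
```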

Outcome

Term mapping became a quick, intuitive step rather than a source of confusion. Submission errors decreased, and administrator satisfaction with this section dramatically improved during usability testing.

Challenge: Overwhelming Options

Managing Large Data Uploads
Submitting large datasets often meant working with hundreds of fields across multiple categories. The legacy interface presented everything at once with minimal structure, making it hard to see where errors occurred or whether data was valid.
Data dictionary upload interface with field validation

Solution

We introduced collapsible field groups, inline validation feedback, and visual field status indicators (green for complete, red for error, gray for unused). Uploads could now be reviewed step-by-step, with confidence-building previews and confirmations throughout.
Redesigned data submission with collapsible field groups
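As a rough illustration, the following TypeScript sketch models per-field status and group summaries of the kind that could power the indicators described above (green for complete, red for error, gray for unused). The field shape, validation rule, and group summary are assumptions, not the production logic.

```ts
// Hypothetical per-field status model used to color field indicators.
type FieldStatus = "complete" | "error" | "unused";

interface DataField {
  name: string;
  group: string; // collapsible group the field belongs to
  value?: string;
  required: boolean;
}

// Inline validation: compute a status as soon as a value is entered,
// so errors surface next to the field instead of after submission.
function validateField(field: DataField): FieldStatus {
  if (field.value === undefined || field.value === "") {
    return field.required ? "error" : "unused";
  }
  return "complete";
}

// Group summary shown on each collapsed header, e.g. "12 complete, 2 errors".
function summarizeGroup(fields: DataField[]): Record<FieldStatus, number> {
  const counts: Record<FieldStatus, number> = { complete: 0, error: 0, unused: 0 };
  for (const f of fields) counts[validateField(f)]++;
  return counts;
}
```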

Outcome

The redesigned data dictionary process gave users the control they needed. Institutions could now submit with fewer mistakes, delegate across teams more effectively, and avoid unnecessary resubmissions.

Challenge: Status Visibility

Tracking Study Progress
Once a study was submitted, users had no visibility into whether it was received, reviewed, or needed revisions. Most updates were delivered via email and lacked context, leaving users unsure of what to do next.
ACES study dashboard showing submission status cards

Solution

We designed a dashboard using card-based views for each active or completed study. Each card displayed the current state—Needs Feedback, In Review, Complete, or Draft—and included direct links to the next step or reviewer messages.
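A minimal sketch of the data behind such a status card is shown below. The state names come from the case study; the card fields and the action-labeling helper are illustrative assumptions rather than the actual ACES dashboard code.

```ts
// Hypothetical data model for a study status card on the dashboard.
type StudyState = "Draft" | "In Review" | "Needs Feedback" | "Complete";

interface StudyCard {
  studyId: string;
  title: string;
  state: StudyState;
  nextStepUrl?: string; // direct link to the next action, if any
  reviewerMessageCount: number;
}

// Choose the card's primary action from the study's state, so users
// always see the next step rather than just a status label.
function primaryAction(card: StudyCard): string {
  switch (card.state) {
    case "Draft":
      return "Continue submission";
    case "Needs Feedback":
      return card.reviewerMessageCount > 0 ? "View reviewer messages" : "Respond to feedback";
    case "In Review":
      return "View submission";
    case "Complete":
      return "Download study results";
  }
}
```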

Outcome

Users now have a reliable command center for all their ACES activity. The dashboard has reduced confusion, improved transparency, and brought a sense of control to institutions managing multiple ongoing submissions.