Coral.org site redesign | Becoming a more accessible site

Team
3 members

Duration
3 weeks
My roles
User Researcher, Visual & Interaction Designer

Tools
InVision, Miro, Figma
PROTOTYPE

Our Problem: Usability and navigation issues make it difficult to connect the user’s intentions with the site.

Our Solution: Redesign the website so that interested individuals can find essential information and support faster and more easily.

The Purpose: To simplify the process of donating and applying to volunteer, and to raise awareness of sustainability.

Coral Reef Alliance Redesign

Redesigned the Coral Reef Alliance website to boost donations and volunteer applications by enhancing usability and creating an experience as inspiring as their mission to save coral reefs.

I. Research & Empathy

Introduction

Coral reefs are some of the most valuable, diverse, and beautiful ecosystems on Earth. They host thousands of species of marine life, and millions of organisms are estimated to depend on these environments. Humans depend on these resources as well: reefs provide the plant and animal biodiversity used to develop essential drugs, buffer our shorelines from dangerous waves and storms, and support our commercial fisheries and local tourism.

Despite their exceptional economic and recreational value, coral reefs have been severely threatened by pollution, disease, and habitat destruction.

The Coral Reef Alliance (CORAL) is on a mission to save the world’s coral reefs by collaborating with communities to reduce direct threats to the reefs in ways that provide long-term benefits to people and wildlife.

Heuristic Evaluation

To begin, we ran a heuristic evaluation to understand how CORAL wanted to connect with its target users and support their goals. Reviewing the current website, we found it already aesthetically pleasing with a decent delivery of information. However, disorganized navigation and lengthy forms made the site very challenging to use.

Proto-Persona

This proto-persona helped jump-start our research objectives for the usability tests and interviews that followed.

Chad is our scuba-diving science teacher who wanted to raise awareness about coral reefs but did not know exactly how to do his part.

Current Website Usability Testing

Since we didn’t get a reply from the stakeholder, we wanted to widen our scope another way. Our team asked a few of our peers to test out the current CORAL website, with a few tasks in mind — learn where your donations go, donate, and volunteer.

Overall, the website was navigable and users liked the imagery. However, there were significant issues with knowing where you are in the site and with how tiresome it can be to read through long sections. Users said the site would be easier to navigate without the drawn-out information.

Competitor Analysis

Survey Data

Our survey questions focused on 3 topics:
•  Common sustainability knowledge
•  The thinking process behind donating
•  The thinking process behind volunteering

Interview Results

Our research questions revolved around 3 topics:
•  The user’s knowledge of a sustainable lifestyle
•  Their decision-making process in donating and volunteering
•  Their habits and experiences in doing so

With a total of 8 survey responses and 5 in-depth interviews, our notes yielded some interesting data:

(1) Users have some basic knowledge of coral reefs and what they provide to marine life, but know much less about the benefits to humans.
(2) They would like to follow a sustainable lifestyle when they can, but doing so can become costly and time-consuming.
(3) They are willing to donate and volunteer, but feel unable to because of limited money, time, and energy.

Affinity Diagram

The affinity diagram pooled similar behavioral patterns that highlighted users’ goals, pains, and gains. It shows that their major pain points stem from financial and physical constraints (such as not having money to donate or free labor to provide).

The key takeaway from this diagram, however, is users’ frustration with dull sections, lengthy donation forms, and demanding volunteer applications, all of which we aimed to resolve.

II. Definition

User Persona

User Insight

Problem Statement

Value Proposition

Storyboard

III. Ideation

Prioritization Matrix

The matrix reveals that our team wanted to focus on the clarity and simplicity of the donation and volunteering process. We highlighted simple features that can make a big impact on users, such as prominent call-to-action buttons and convenient, easy-to-fill-out forms. We also wanted to go into more detail about volunteer opportunities and responsibilities, as well as where donations go, to engage users and encourage them to sign up.

Card Sorting (Before)

This is the card sort of the current CORAL website. The blue tabs are the 6 primary pages in the main navigation bar, with many secondary pages flowing under them. Under Coral Reefs 101, many tertiary articles were hidden and simply dropped into one long, overwhelming list.

Card Sorting (After)

The redesigned card sort consolidated the primary pages into 4 main sections. Under Coral Reefs 101, we kept all the educational articles but organized them into categories and subcategories so they were easier to understand and navigate. We also put more focus on the main tasks of donating and volunteering by designating a new section for those two tasks.

Sitemap

For thoroughness, we established a new sitemap with 4 primary pages in the menu and the secondary and tertiary pages nested beneath them.

User Flow

The user flow focuses on 3 main tasks, highlighted by the yellow diamonds: donating, volunteering, and learning about a more sustainable lifestyle. It is designed so that whether or not the user takes a given action, they are guided to the next one, keeping the main tasks clearly in view.

IV. Development

Lo/Mid-fi Wireframes

Based on the research, we created mid-fi prototype v1 to test our ideas against Todd’s predefined user flow. A minimal UI with prominent call-to-action buttons ensures there is always something for the user to do next after completing a task, such as submitting the volunteer interest form, making a donation, or looking up ways to make informed everyday changes.

Usability Testing

Testing prototype v1 produced the following key findings:
•  The user flow felt smooth and clean.
•  NPS-style question 1, “How likely are you to lead a more sustainable lifestyle based on these suggestions, on a scale of 0 (very unlikely) to 10 (very likely)?”, averaged 7.25/10.
•  NPS-style question 2, “How likely are you to recommend this organization to your family or friends, on a scale of 0 (very unlikely) to 10 (very likely)?”, averaged 8.5/10. (A sketch of how such scores can be tallied follows this list.)
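The figures above are mean ratings; a conventional Net Promoter Score is instead reported as the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal TypeScript sketch of both tallies, using made-up ratings rather than our actual responses:

```typescript
// Summarize 0-10 ratings two ways: as a simple average (the 7.25 and 8.5
// figures above) and as a conventional Net Promoter Score.

function averageScore(ratings: number[]): number {
  return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
}

function netPromoterScore(ratings: number[]): number {
  const promoters = ratings.filter((r) => r >= 9).length;  // ratings of 9-10
  const detractors = ratings.filter((r) => r <= 6).length; // ratings of 0-6
  return ((promoters - detractors) / ratings.length) * 100;
}

// Illustrative ratings from four testers (not real data):
const recommendRatings = [9, 8, 8, 9];
console.log(averageScore(recommendRatings));     // 8.5
console.log(netPromoterScore(recommendRatings)); // 50
```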

Key feedback from the users:
•  The membership benefits available at $50+ donations (or $5+ monthly) were not clearly explained.
•  The “Live Sustainably” page was too long and wordy, which made the information feel not worth the time to read through.
•  The “Next” button should be more visible once a section is filled in.

V. Prototype

Hi-fi Wireframes

RWD Mobile to Desktop

Example 1: Home page
•  The hamburger menu expands into a header navigation bar.
•  The fish logo is replaced with the full-title version.
•  The “Donate” button is made prominent in the header across all web pages.
•  The title typography (H1, H2) is scaled up 1.5x to take advantage of the larger screen (a breakpoint sketch follows these examples).

Example 2: “Ways To Give Back” primary category
•  The modules for the secondary categories and the footer move from vertical to horizontal listing depending on the screen width.
•  The header image of the coral moves from above the title to the side for visual prioritization.

Example 3: “Take Action” card decks
•  A wider screen displays more cards.
•  The desktop view also adds a scroll bar, which is more intuitive with a mouse than the mobile swipe gesture.
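In a production build, the behavior in Example 1 would usually be handled with CSS media queries; the minimal browser sketch below just makes the breakpoint logic explicit. The 768px breakpoint, the element ids, and the CSS variable are placeholders for illustration, not CORAL’s actual markup.

```typescript
// Toggle between the mobile hamburger menu and the desktop header nav,
// and scale the H1/H2 typography up 1.5x on wide screens.
const desktopQuery = window.matchMedia("(min-width: 768px)"); // assumed breakpoint

function applyLayout(isDesktop: boolean): void {
  // Swap the hamburger menu for the full header navigation bar on wide screens.
  document.getElementById("hamburger-menu")!.hidden = isDesktop; // placeholder id
  document.getElementById("header-nav")!.hidden = !isDesktop;    // placeholder id

  // Titles read this variable, e.g. font-size: calc(2rem * var(--title-scale)).
  document.documentElement.style.setProperty("--title-scale", isDesktop ? "1.5" : "1");
}

applyLayout(desktopQuery.matches);
desktopQuery.addEventListener("change", (e) => applyLayout(e.matches));
```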

Final Thoughts & Plans

Through user research, it became apparent that visuals aren’t the only measure of a good website. Without a consistent, robust navigation flow and a well-defined sitemap, users like Todd can easily lose interest in the organization, even if they wholeheartedly believe in its mission.

Due to the constraints of the case study, we had to prioritize features we’d like to develop. Some of the next steps this project could take include:
•  Develop the “More Ways to Give/FAQs” secondary category, which provides resources and answers to common questions.
•  Feature CORAL’s organizational partners more prominently on the website, in the organization’s interest.
•  Build the user flow for Coral Reefs 101 as an interactive book format.

Case Study: Optimizing the Company Support Portal for Internal Employees, Improving Request Navigation and Efficiency

Conducted UX research and IA testing to uncover usability issues in the internal help portal, resulting in clear recommendations to simplify navigation, reduce friction, and increase efficiency for employees submitting service requests.
15 min. read

Disclaimer:
This case study is based on real research, insights, and design work; however, due to confidentiality agreements, certain details have been modified, generalized, or omitted to comply with non-disclosure agreements (NDAs). Any proprietary information, including specific company names, internal data, and strategic details, has been altered or anonymized while preserving the integrity of the design process and key learnings. The purpose of this case study is to illustrate the UX challenges, methodologies, and outcomes without disclosing sensitive or proprietary information.

UX Researcher: Martin Carpio + 2 UX Researchers
Client: Tech Systems Team
Target Users: Internal Employees

Challenge: This multi-method study aimed to address key issues that different user groups (internal requesters and internal intake teams) encountered when navigating and using the internal company help portal. The primary challenges included:
1. Users struggled to identify the correct service request type for their issues.
2. The service catalog was designed from a fulfiller’s perspective, making it difficult for end-users to navigate.
3. Second and third-hand complaints indicated usability issues, but no structured feedback was available.
4. A lack of governance and consistency in reporting system health.

Activities: User Interviews, Content Audit, Tree Testing (Information Architecture)

Background

In Q3 2024, the Tech Systems team sought to improve the internal help portal experience. The primary goal was to increase intake and resolution efficiency by 50% by optimizing ticket submission and service request processes. The UX Research team was tapped to investigate user pain points, audit the information architecture, and recommend solutions.

Key insights and recommendations
The research uncovered several critical usability issues within the internal service portal:
1. Complexity of Navigation: Users struggled to locate key features due to an unintuitive layout. Recommendation: streamline the menu structures for better accessibility.
2. Information Overload: Excessive content made it difficult for users to focus on completing tasks. Recommendation: introduce prioritization and clearer categorization to reduce cognitive load.
3. Weak Search Function: Search results were often irrelevant, leading users to browse manually. Recommendation: refine search parameters to return more accurate results.
4. Task Completion Bottlenecks: Certain workflows were inefficient, causing increased support requests. Recommendation: redesign key workflows, especially repeated ones, to reduce task completion time and improve usability.

Short-term Impact and Next Steps
- Expected 25% reduction in task completion time, leading to increased efficiency.
- Enhanced user satisfaction due to a more intuitive navigation structure.
- Future iterations will include A/B testing to validate design improvements before implementation.

The Process

To uncover usability challenges and identify opportunities for improvement, a mixed-method research approach was applied:

1. User Interviews
We invited employees across different roles to understand their frustrations and needs, interviewing participants from key user groups, including internal requesters, intake teams, and repeat users, to ensure a diverse range of perspectives. The interviews revealed frustrations with navigation, lack of transparency, and complexity in request submission.
- Over two weeks, we conducted one-on-one remote interviews with employees from various departments who frequently interacted with the internal service portal.
- Each session lasted about 45–60 minutes and was structured with a mix of open-ended questions to explore user behaviors, challenges, and expectations.
- We asked participants to walk us through their typical workflows, from submitting a request to receiving support, identifying pain points along the way.
- Stakeholders from the IT Team joined as observers to gain direct insights into user frustrations.

Findings from Interviews:
a. Users struggled to determine the right request category for their issues, leading to incorrect submissions and processing delays.
b. Many found the service catalog overwhelming, as it was designed with fulfillers (internal support teams) in mind rather than end-users.
c. There was a lack of feedback loops: users were unaware of where their request stood in the process.
d. Several users defaulted to emailing support teams directly rather than using the portal because of its perceived complexity.


2. Content Audit
We conducted a content audit to examine the structure and organization of the service catalog in the internal service portal. The goal was to identify inefficiencies in how information was presented and determine whether it aligned with user expectations. This highlighted structural issues, inconsistencies, and outdated elements contributing to inefficiencies.
- We systematically reviewed all service request types, analyzing naming conventions, categorization, descriptions, and redundancy in options.
- We documented inconsistencies in terminology, duplicate entries, and vague descriptions that caused confusion.
- We identified legacy services that were no longer relevant but still appeared in the system.
- We compared findings from the content audit with user feedback from interviews, validating issues such as difficulty in finding the right request type.

Findings from the Content Audit:
a. Duplicated and outdated service categories cluttered the system, making it difficult to navigate.
b. Overly technical language in request descriptions confused users, making it unclear whether they were selecting the correct option.
c. Certain high-traffic requests were buried deep in the catalog, requiring multiple clicks to access.
d. There was no standardized way to update and govern service catalog entries, leading to inconsistencies across teams.


3. Tree Testing
We assessed the effectiveness of the current information architecture and identified areas for improvement with a tree test, a usability method that evaluates how easily users can find information within a site’s structure. This validated the other methods’ findings, showing that restructuring the service catalog could significantly improve usability.
- We created a digital prototype of the current service catalog and ran tree tests using a usability testing platform.
- Participants were given tasks, such as “Find the service request type you would use to report a system outage.”
- We measured success rates, time taken to complete tasks, and the common paths users took to locate requests (tallied along the lines of the sketch after this list).
- After initial testing, we proposed a new categorization structure based on interview insights and conducted a second round of tree testing to compare performance.
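For illustration, here is a minimal TypeScript sketch of how per-task success rate and median completion time can be tallied from tree-test results. The TreeTestResult shape and the sample data are stand-ins, not the testing platform’s actual export.

```typescript
// Summarize tree-test results: percentage of successful completions and
// median time per task, so two rounds of testing can be compared.
interface TreeTestResult {
  task: string;
  success: boolean;         // participant ended on the correct node
  secondsToComplete: number;
}

function successRate(results: TreeTestResult[]): number {
  const successes = results.filter((r) => r.success).length;
  return (successes / results.length) * 100;
}

function medianTime(results: TreeTestResult[]): number {
  const times = results.map((r) => r.secondsToComplete).sort((a, b) => a - b);
  const mid = Math.floor(times.length / 2);
  return times.length % 2 ? times[mid] : (times[mid - 1] + times[mid]) / 2;
}

// Made-up results comparing round 1 (current catalog) with round 2 (proposed structure):
const round1: TreeTestResult[] = [
  { task: "Report a system outage", success: false, secondsToComplete: 95 },
  { task: "Report a system outage", success: true, secondsToComplete: 60 },
];
const round2: TreeTestResult[] = [
  { task: "Report a system outage", success: true, secondsToComplete: 40 },
  { task: "Report a system outage", success: true, secondsToComplete: 35 },
];
console.log(successRate(round1), medianTime(round1)); // 50 77.5
console.log(successRate(round2), medianTime(round2)); // 100 37.5
```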

Findings from Tree Testing:
a. Users struggled to locate the right request type quickly, often clicking through multiple irrelevant categories before finding the correct one.
b. Some high-priority services were buried under generic categories, making them hard to find.
c. The terminology in the navigation did not match user expectations, leading to misinterpretations of request categories.
d. The new structure we tested, which simplified categories and used more user-friendly terminology, resulted in a 30% improvement in task success rate.

🧑🏻‍💻 Any thoughts or comments? Let's chat! Connect through LinkedIn or drop me an email.
