Enhancing User Experience in Aid Planning:
Redesigning a Reporting Tool for Streamlined Aid Monitoring and Distribution Strategy.
Background
Scope
Project Timeline: 4 months
Role/Contribution: Lead Researcher, Designer, Data Analyst
Cross-functional Team: UX, Project Manager, Data Engineer
Methods: User Interviews, Usability Testing, A/B Testing, Desk Research
Tools: Figma, Miro, UserTesting, Tableau, Zoom
Users: 30+ humanitarian aid coordination leads and managers
Problem
Approaching three years of ongoing aid distribution on the frontlines in Ukraine, a server-based reporting tool relied on by monitoring and aid relief coordinators has been experiencing increasingly slow load times. Additionally, the team is split into two groups: one that relies heavily on the tool – “the experts” – and another that rarely or never uses it – “the novices”.
The project aims to establish this reporting tool as the go-to solution for all team members while updating its current features and data.
Goals
Fix Slow Loading Issues:
Identify and resolve the root cause of slow dashboard load times and ensure automation runs smoothly.
Onboard New Users:
Research how “novice” and “expert” users interact with the dashboard and implement findings to enhance their experience.
Enhance Time to Insight:
Test, iterate on the prototype, and launch a new dashboard to fully replace the existing one.
Process
Summary
As a 'UX Team of One', it was crucial for me to clearly communicate and integrate my design process with the rest of the team.
While all phases of the project, from research to ideation, played an important role in shaping the product, the testing and iteration phases were the most vital: the less formal testing and iteration sessions produced refinements that were key to our success.
Key Questions
“ How are users currently using the dashboard? Are they positioned to utilize the dashboard effectively? ”
“ What are some turn-offs for onboarding users as they transition to using a new tool for their insights? ”
“ How can we streamline the experience for new users while preserving the familiarity for existing users? ”
Insights & Results
Reduced Visual & Data Clutter
By combining targeted data sampling with visual elements organized into cohesive groups, we reduced users' time to insight. Gutter space established clear separation between sections, helping users quickly identify the most relevant parts of the dashboard.
Insights from usability testing revealed that reducing the number of elements on the screen reduces task completion time and helps with clarity. By aligning the layout with user workflows, we achieved a more intuitive, purpose-driven experience that improves the ease of finding critical information.
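The data-sampling side of this work can be sketched as a simple pre-aggregation step: collapsing raw delivery records into daily per-region totals so the dashboard loads a small summary table instead of every row. This is a minimal illustration, not the project's actual pipeline; the field names and values are assumptions.

```python
from collections import defaultdict

def preaggregate(records):
    """Collapse raw delivery records into daily per-region totals.

    Hypothetical sketch: field names ("date", "region", "units")
    are illustrative assumptions, not the real schema.
    """
    totals = defaultdict(int)
    for r in records:
        totals[(r["date"], r["region"])] += r["units"]
    # Emit one summary row per (date, region) pair, sorted for stable output.
    return [
        {"date": d, "region": reg, "units": u}
        for (d, reg), u in sorted(totals.items())
    ]

raw = [
    {"date": "2024-05-01", "region": "Kharkiv", "units": 120},
    {"date": "2024-05-01", "region": "Kharkiv", "units": 80},
    {"date": "2024-05-01", "region": "Odesa", "units": 60},
]
summary = preaggregate(raw)
# Three raw rows collapse into two summary rows; the dashboard
# only ever queries the much smaller summary table.
```

The same idea scales to whatever aggregation layer the reporting server uses: the dashboard's load time tracks the number of rows it has to fetch, not the number of raw records collected.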
Research-Driven Updates for Smoother Onboarding
Based on insights from user interviews, we identified that a lack of trust in the tool was the primary barrier to adoption. By understanding the key insights that strategists are looking for, we redesigned the information architecture to ensure each component of the dashboard suite directly addresses a core question. The subsequent features were then implemented to help strategists efficiently reach their insights.
To achieve our goal of transitioning users to the dashboard, we also conducted additional testing with each iteration of the prototype. This iterative process helped create a product more closely aligned with user goals while gradually introducing them to the dashboard’s features.
Tested Data Viz Best Practices to Reduce Cognitive Load
In addition to aligning the dashboards with the strategists’ mental models, we identified that certain chart types didn’t effectively address the users’ core questions. To streamline insights and make information retrieval more intuitive, we re-evaluated and redesigned the visual data displays. Each update was carefully tested to ensure it aligned with users’ workflows, allowing them to find answers quicker and with greater ease.
The most significant change was made to a 100% stacked bar chart, which was transformed into a butterfly chart. This adjustment allowed users to more quickly determine whether there were more survey responses in the "positive" or "negative" categories. Initially, there was significant pushback on the new chart layout; however, through time and communication, our stakeholders came to see why the change was needed, as it ultimately helped our users assess aid distribution urgency more quickly.
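The transformation behind the butterfly chart is mostly a data-prep step: negative responses are mirrored to the left of a shared axis so the two sides can be compared at a glance. A minimal sketch, with illustrative question names and counts (not the project's real survey data):

```python
def butterfly_rows(questions, positive, negative):
    """Prepare survey counts for a butterfly chart.

    Negative counts are stored as negative values so their bars
    extend left of a shared zero line. All inputs are hypothetical.
    """
    return [
        {"question": q, "positive": p, "negative": -n}
        for q, p, n in zip(questions, positive, negative)
    ]

rows = butterfly_rows(
    ["Food access", "Shelter", "Medical aid"],
    positive=[42, 35, 50],
    negative=[18, 30, 12],
)
# With a plotting library such as matplotlib, each row becomes two
# horizontal bars (e.g. ax.barh) at the same y position: one drawn
# from rows[i]["positive"], one from rows[i]["negative"].
```

Because both sides share one baseline, the dominant side of each question is visible without the mental subtraction a 100% stacked bar requires.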
Methodology
Problem-scoping Interview
Before the project kicked off, scoping interviews were conducted with each stakeholder and user to understand the current issues and define the project’s scope. These sessions allowed them to express concerns about their workflows, the existing product, and any other feedback that could help guide the project.
“ What are some issues you are facing while using this dashboard? ”
Recurring User Interviews
Throughout the project, we conducted recurring user interviews to identify pain points from both “novice” and “expert” users. These sessions enabled our team to evaluate whether each iteration effectively addressed those pain points.
“ What are the metrics of interest when you use this dashboard? ”
Moderated Usability Testing
To test the prototype before each iteration launched, we held moderated usability testing with users from both groups. Users were asked to give feedback while performing a set of tasks. This also allowed us to capture metrics such as task completion time and path deviations as key measures of success.
“ Do you find the addition of this feature helpful to your process? ”
A/B Testing
Between iterations of the prototype, we used A/B testing to compare updates across the two user groups. Leveraging our knowledge of the operation and the metrics to watch, the team cross-tested updated features between groups to gauge receptiveness.
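At its simplest, the comparison behind each A/B round is a difference in average task-completion time between variants. A minimal sketch with made-up timings (the real study's numbers and analysis are not shown here):

```python
import statistics

# Hypothetical task-completion times in seconds for two prototype
# variants from one A/B round. All values are illustrative.
variant_a = [210, 180, 240, 200, 190]
variant_b = [150, 160, 170, 140, 155]

mean_a = statistics.mean(variant_a)  # average time on the old layout
mean_b = statistics.mean(variant_b)  # average time on the updated layout
saving = mean_a - mean_b             # seconds saved per task with variant B
```

With small samples like these, a significance test (e.g. a t-test) would normally accompany the raw difference before declaring a winner.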
“ Which iteration found more success when addressing user needs? ”
Desk Research
Since the project as a whole revolved around charts, we conducted desk research on data visualization best practices to understand the factors and parameters of each chart type and whether it was best suited to quickly match the mental models of the users.
“ Which chart type helps users get to what they need effectively? ”
Challenges
Stakeholder Management
One of the challenges during the project was managing stakeholder expectations when one was adamant about keeping certain features that were taking up valuable screen space. This space could have been better utilized to display higher-priority charts, and balancing these differing views required careful stakeholder management.
We worked closely with the stakeholder to explain the impact of these decisions, aligning the project’s goals with the overall user needs and ensuring that the final design prioritized functionality and clarity.
Introducing Unfamiliar Features
While we researched and implemented the chart types that best aligned with the users' mental models, introducing these new chart types faced resistance from stakeholders because the charts departed from the users' usual routines.
To address this, we explained the rationale behind the design choices and changed the charts incrementally with each iteration to ease the learning curve for our users. Additionally, we provided detailed explanations upfront to ensure our users understood the changes.
Scope Creep
When the project kicked off, our main goals were to research and redesign the dashboard and resolve the underlying data issues. Some time into the project, however, it became clear that the monitoring team intended to make the dashboard the central tool for their operations.
Although this shift in direction introduced scope creep, we had enough time to revisit the research phase and adjust our scope to ensure the dashboard met the needs of both the “experts” and the “novices”.
Impact
2.5 minutes
average time saved
when users performed daily metric checks for the regions/communities of interest.
70%
decrease in dashboard load time, achieved through data sampling optimization and the implementation of design best practices.
100%
of users, both current and new, onboarded successfully and now use the dashboard to plan and monitor aid distribution.
Reflection
Designing for “Novices” and “Experts”
When designing for two groups of users, the “novices” and the “experts”, the product must strike a middle ground between familiarity and accessibility above all. For both groups, we allocated time and resources to fully explain the reasoning behind each change in order to address each group's unique needs.
To balance the team's initial pushback against the importance of making the product more usable, we concluded that changes should be rolled out incrementally over time to ease the learning curve for new features.
Iteration Promotes Improvement
Improvements come with iterations, and iterations never end. Whenever a new feature was pushed out and user interviews were conducted, it was often met with pushback because the feature did not meet expectations.
Our team went back to the drawing board after each testing phase to weigh the pros and cons of keeping or removing each feature. While some trade-offs were made, ensuring that our users could continue to use the product with ease was the highest priority, and that focus ultimately led us to success.
Document and Communicate
When our team first took on this project, we found ourselves with a lack of previous documentation to work from, especially regarding iteration history. Due to the fast-paced nature of the industry, where roles shift rapidly, there had never been much emphasis on spending time on documentation.
To support future projects and prevent similar setbacks, our team implemented a system of structured documentation, ensuring that future teams have a reliable foundation to build on. This approach helps maintain continuity, accelerates project onboarding, and supports a more consistent user-centered design process.
As the team member wearing multiple hats, it was imperative that I stay adaptable across my roles: conducting research, configuring data calculations, and even handling the visual aspects of the dashboard. Keeping user needs at the forefront, we carefully chose and implemented design elements that truly served the users. Although we leveraged industry best practices, we prioritized practical solutions that resonated with our users, ensuring that each design trade-off was user-centered and backed by our research findings.