NeuroMatch - EEG Analysis and Detection Software
Aug 2020 - Jun 2024 / Remote / UX design
What is it?
NeuroMatch is medical software that leverages the cloud, big data, and machine learning to identify neurological patterns in EEG (electroencephalogram) data, producing models that help physicians diagnose brain diseases.
Why is it needed?
The time-consuming nature of reading EEGs creates a significant roadblock to efficient, accessible neurological care. The NeuroMatch reading interface, detection tools, and trend models help doctors find the "needle in the haystack" in patient data: the evidence that supports a diagnosis faster and with greater accuracy. Big-picture overviews produced by pattern recognition expedite the reading process while allowing every step of a patient study to be completed in one ecosystem, remotely accessible anywhere, at any time.
How did I contribute?
As a junior UX designer during NeuroMatch's early alpha stage, I worked under the design lead to help develop the design system and interface through the preliminary stages of user research and competitor analysis.
Analyzed user studies to produce profiles that informed design decisions
Created components, layouts, wireframes, and prototypes in Figma
Managed and maintained the design library
Rapid startup pacing led to a drastic increase in my responsibilities.
Performed user interviews with technicians and physicians
Worked with the product manager, engineers, and science team to develop design requirements
Provided feature design proposals to the project manager and science teams
By the time the FDA approval process began, I was already fulfilling senior UX responsibilities and had become the point of contact for the entire product design.
Responsible for the entire design library from components to mockups
Onboarded new team members to the product and its design
Worked directly with product, science, and dev teams to plan and design features
Tools:
Design: Figma, Zeplin
Image: Photoshop, Illustrator
Video: After Effects, Premiere Pro, Frame.io
Project Management: JIRA, Confluence
Highlights:
I produced our first demo video of NeuroMatch beta features, and LVIS debuted NeuroMatch in a special lecture by the founder, Jinhyung Lee, at the 1st World Korean Scientists and Engineers Conference. The presentation was well received, leading to a commitment between LVIS and the city of Daegu to build a research center dedicated to AI-based brain disease research, with NeuroMatch deployed in multiple hospitals and medical institutions in the region.
Case Study: NeuroMatch
Problem Identification
System-Level Challenges (Macro)
Wider structural issues in neurology + healthcare
Delayed diagnoses due to lack of biomarker-based precision
FDA compliance hurdles preventing modern software/hardware updates
High cognitive burden from multi-system workflows
Organizational & Workflow Challenges (Meso)
Direct problems within hospital departments and teams
Use of separate software for patient records, EEG viewing, analysis, and reporting
Proprietary EEG tools from different hardware vendors with outdated UX
Custom hospital workarounds that result in inefficient manual workflows
User-Level Challenges (Micro)
Daily pain points experienced by physicians
Constant context switching between multiple platforms
High administrative load managing, annotating, and archiving data
Lack of parallel workflows for collaborative reporting
Frustration with non-intuitive interfaces and outdated UIs
No central place for tracking patient progress over time
User Discovery & Role Analysis
Conducted interviews + observations to understand role-specific workflows.
Mapped overlapping responsibilities, workflow stages, pain points.
Identified unique user types: Monitor, Technician, Physician.
Process Breakdown & Pain Points
Mapped out report production pipeline: Live monitoring → Data review → Report writing.
Discovered fragmented tool use (EEG reader, analysis tool, reporting tool all separate).
Barriers due to proprietary software, outdated interfaces, and FDA-bound limitations.
Admins often patched workflows with their own workarounds.
Design Direction & Strategic Focus
Establish seamless transitions between reading + reporting.
Prioritize clarity of user contributions, work ownership, and a collaborative workspace.
Leverage benefits of online platform: anywhere/anytime access, data transparency.
Built workflow diagrams to unify cross-team understanding (Design ↔ Dev ↔ Science).
Testing & Iteration
Summary:
Created interactive prototypes in Figma based on real user workflows.
Focused on signal visualization, annotation, and report editing flows for each role.
Ran validation sessions with actual hospital staff (techs, monitors, physicians) across multiple iterations.
Key Testing Focus Areas:
Speed of navigation for large signal data
Ease of annotation tools across devices
Report contribution clarity for multi-role teams
Outcomes:
Identified key friction points around annotation workflows and report ownership clarity.
Simplified complex UI components into role-aware tool views.
Collected qualitative feedback that shaped feature prioritization (e.g., toggling AI trend views, customizing EEG view).
Results & Impact
Reduced tool-switching across roles
Increased clarity of roles and user ownership in report flows
Alleviated apprehension about AI-driven tools through transparency and non-intrusive implementation

Reflection
This project taught me how to design within an extremely complex technical field, under strict regulatory systems, while keeping the user experience focused and efficient. It pushed me to become a better communicator and collaborator. Having been a part of the full UX design cycle from start to release, it was extremely validating to see the efforts of the LVIS team culminate in a major collaborative effort with the city of Daegu, Korea.