Project Overview
Client Requirements:
When our team was brought on by Dell Medical to assist in the development of MediLinker, the application lacked any UX design or research. The development team wanted us to help with the general UI/UX design of the application through market research and usability testing. They also wanted our assistance with the terms and idioms used to explain blockchain and other aspects of the application, with design elements to support patients and clinics as they interact with the application, and with the design of their credential system.
Our Methodology:
- We started with a client kickoff to gather as much context about the application as we could and to understand MediLinker's points of focus and known issues.
- We then performed a heuristic evaluation of the current iteration of the application, dividing its screens into scenarios based on user tasks.
- With that context in hand, we performed a competitive analysis of other applications and products.
- The heuristic evaluation and competitive analysis gave us enough information to make valuable design recommendations to the team for the next iteration of MediLinker.
- Once those design changes were implemented, we then took the new screens and created a prototype to perform usability testing.
- To recruit participants for our usability test, we created a screener based on MediLinker's target audience. We then created our moderator script and participant packets for the actual test sessions.
- We then analyzed the data we had collected from our usability testing both qualitatively and quantitatively.
- We used these insights to shape another round of recommendations and suggestions for MediLinker, which we presented to the team.
Background Research
Client Kickoff:
Our research process started with the client kickoff meeting. During the meeting, the dev team for MediLinker walked us through the concept of MediLinker, provided us with the current iteration of screens, briefed us on the backend and the interactions between the different stakeholders involved with MediLinker, and requested our help with the future development of the application.
My teammates and I took notes and compiled them as sticky notes in Figma. We then organized those notes into an affinity map with different categories and used that map as a reference throughout the project.
Link to affinity map (For better viewing experience) 🔗
Initial Designs
After the client kickoff meeting, the development team sent over the current iteration of MediLinker screens. We took these screens and organized them into different scenarios based on the general functionality of the application.
Scenario 0: Onboarding Process
- User is given a quick introduction as to what MediLinker is and its purpose.
- They are then prompted to create an account and a digital wallet.
Scenario 1: First Enrollment at Clinic
- Here the users are asked to enroll in a clinic.
- To do so, they must scan a QR code and make a "connection" with the clinic.
Scenario 2: Enrollment at a second clinic
- Here the users are asked to enroll in a second clinic.
- They are also prompted to share credentials with the second clinic.
Scenario 3: Sharing/Consenting Data Between Clinics
- Users are prompted to choose which clinic to share their information with.
Scenario 4: Patient changing information on Blockchain Wallet
- Here the user is trying to edit their personal information on their blockchain digital wallet.
Scenario 5: Patient consent to research project
- User is prompted to participate in a research study with a clinic.
- The user must provide consent to the research study through the app.
Scenario 6: Patient removing consent for research study
- Here the user is trying to revoke consent for the research study they no longer want to participate in.
Scenario 7: Medical Power of Attorney/Guardianship for geriatric and pediatric patients
- The user here is assumed to be a guardian for a geriatric / pediatric patient.
- They are supposed to switch profiles to the patient they are the guardian of and share the information requested by the clinic.
Heuristic Evaluation
After we had arranged the screens into different scenarios, we decided to perform a heuristic evaluation. We based the evaluation on Jakob Nielsen's 10 Usability Heuristics (https://www.nngroup.com/articles/ten-usability-heuristics/).
We each went through our assigned scenarios, commented on any heuristic violations we noticed, categorized them with severity ratings, and added any recommendations we had for addressing them.
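To keep that process consistent across scenarios, each violation can be logged with its scenario, heuristic, severity, and recommendation. The sketch below is a minimal Python illustration of how such a log and a severity tally might look; the field names, severity scale, and example entries are assumptions for illustration, not our actual template or findings.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Violation:
    scenario: str        # e.g. "Scenario 1: First Enrollment at Clinic"
    heuristic: str       # one of Nielsen's 10 usability heuristics
    severity: int        # 0 (not a problem) to 4 (usability catastrophe)
    description: str
    recommendation: str

# Illustrative entries only -- not findings from our actual evaluation.
log = [
    Violation("Scenario 1", "Visibility of system status", 3,
              "No feedback shown after scanning the clinic QR code.",
              "Show a pending-connection indicator."),
    Violation("Scenario 4", "User control and freedom", 2,
              "No way to cancel an edit to wallet information.",
              "Add a cancel action to the edit flow."),
]

# Tally violations by severity to help prioritize fixes.
for severity, count in sorted(Counter(v.severity for v in log).items(), reverse=True):
    print(f"Severity {severity}: {count} issue(s)")
```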
Competitor Analysis
To gain a better understanding of how others are solving the same problems MediLinker is trying to solve, we performed a thorough competitive analysis. We divided our competitors into two categories: direct and indirect competitors.
Direct Competitors
Initially, we struggled to identify who the direct competitors actually were, since the application was a hybrid of a digital wallet and a healthcare application. Therefore, for our list of direct competitors, we chose applications that dealt with healthcare records, digitizing documents, and digital wallets.
Link to direct competitors (For better viewing experience) 🔗
Indirect Competitors
We also looked at some competitors that didn't necessarily fit the category of a healthcare app or digital wallet. However, these apps are good models for specific features that MediLinker also has, or offer useful patterns for addressing some of the heuristic violations we saw in MediLinker.
Link to indirect competitors (For better viewing experience) 🔗
Competitive Matrices
Once we had analyzed all of the direct and indirect competitors, we created two competitive matrices to compare these applications and products against each other and against the current iteration of MediLinker.
Feature Matrix
The first competitive matrix was a feature matrix that focused on general information about each application, its features and functionality, market info, strengths and weaknesses, and commonalities with MediLinker.
Link to feature matrix (For better viewing experience) 🔗
Heuristic Matrix
The second competitive matrix focused more on the heuristics of these competitors so we could use them as inspiration to solve some of MediLinker's heuristic violations.
Link to heuristic matrix (For better viewing experience) 🔗
Initial Findings & Suggestions
We created a halfway report to present our key findings from the heuristic evaluation and competitive analysis. This report also contained recommendations for the next iteration of screens based on the data we had collected.
New Designs
In the next step, we collaborated with the design team on the next iteration of MediLinker, using the key findings from our report to recommend specific improvements to the design. This new design was higher fidelity and had more visual design elements. Once the designers handed the new screens off to us, we again organized them into scenarios; a conceptual sketch of the credential flow they describe follows the list.
Scenario 0: Onboarding Process
- The user receives an email prompting them to download and try MediLinker.
- They can either log in or create a new account.
- After which they are prompted to fill out a "Quick Fill Information" form.
Scenario 1: First Enrollment at Clinic
- Here the users are prompted to "connect" to a clinic.
- To do so, they must scan a QR code and request a "connection" with a clinic.
Scenario 2: Create a credential through primary clinic
- After the connection request is accepted by the clinic, the user is now asked to create a credential with them.
- The user is prompted to fill in details from their driver's license to create that credential.
Scenario 3: Send Attributes
- Users are now prompted to share specific information with their primary clinic.
- They are supposed to "send attributes" through the credentials they have created with the primary clinic.
Scenario 4: Manage and Remove a connection
- The user is trying to edit / report certain information they have shared with the clinic.
- After which they are removing their "connection" with that clinic.
Scenario 5: Update files
- The user's driver's license has expired. They are prompted to update the file.
- To update the file, they must re-enter their driver's license information and verify it with the clinic.
Scenario 6: Consent for research study
- The user is notified of a research study taking place at their clinic.
- They decide to take part in the study and give their consent.
Scenario 7: Medical Power of Attorney/Guardianship
- The user here is assumed to be a guardian for a geriatric / pediatric patient.
- They are supposed to switch profiles to the patient they are the guardian of.
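Taken together, these scenarios describe a connect → create credential → send attributes → remove connection flow. The sketch below models that flow with plain Python data structures purely as a conceptual illustration; the class and method names are hypothetical and do not represent MediLinker's actual blockchain backend.

```python
from dataclasses import dataclass, field

@dataclass
class Credential:
    name: str                   # e.g. "Driver's License"
    attributes: dict            # attribute name -> value
    expired: bool = False

@dataclass
class Connection:
    clinic: str
    shared_attributes: dict = field(default_factory=dict)

@dataclass
class Wallet:
    credentials: list = field(default_factory=list)
    connections: list = field(default_factory=list)

    def connect(self, clinic):
        """Scenario 1: scanning the clinic's QR code requests a connection."""
        conn = Connection(clinic)
        self.connections.append(conn)
        return conn

    def send_attributes(self, conn, credential, names):
        """Scenario 3: share only the requested attributes from a credential."""
        conn.shared_attributes.update({n: credential.attributes[n] for n in names})

    def remove_connection(self, conn):
        """Scenario 4: removing a connection withdraws what was shared with it."""
        self.connections.remove(conn)

# Walk through the flow with made-up data.
wallet = Wallet()
license_cred = Credential("Driver's License", {"name": "Pat Doe", "dob": "1990-01-01"})
wallet.credentials.append(license_cred)                 # Scenario 2: create a credential
clinic = wallet.connect("Primary Clinic")               # Scenario 1: connect
wallet.send_attributes(clinic, license_cred, ["name"])  # Scenario 3: send attributes
wallet.remove_connection(clinic)                        # Scenario 4: remove connection
```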
Usability Testing
With these new sets of screens, we were almost ready to perform usability tests. The goal of usability testing was to see whether the flows worked well and whether users could complete certain tasks and interact with the prototype successfully. We also wanted to gauge the priority of existing and desired features and functionality. Finally, we wanted to assess general sentiment toward the concept of the application, the labels and terminology being used, and the specific aspects of the application that the designers had flagged as potential points of confusion and wanted us to test.
Test Documents
Participant Screener
First, we had to create a screener script to help us recruit participants who fit the demographics we were looking for. We screened participants based on age, yearly clinical visits, technology usage, and blockchain knowledge.
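As a rough illustration of how those criteria could be combined (the real screener was a survey, and the thresholds below are placeholder assumptions rather than our actual cutoffs):

```python
def screen(age, yearly_clinic_visits, uses_smartphone_daily, knows_blockchain):
    """Placeholder screening logic; the thresholds here are illustrative only."""
    qualifies = age >= 18 and yearly_clinic_visits >= 1 and uses_smartphone_daily
    # Assumption for this sketch: blockchain familiarity was used to balance
    # the participant pool rather than to exclude anyone.
    segment = "blockchain-aware" if knows_blockchain else "blockchain-new"
    return {"qualifies": qualifies, "segment": segment}

print(screen(age=34, yearly_clinic_visits=2,
             uses_smartphone_daily=True, knows_blockchain=False))
```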
Moderator Script
For the actual test sessions, we created a moderator script that included a quick introduction to what we were trying to achieve, icebreakers, a pre-test questionnaire, a reference to the participant tasks, and a final post-test questionnaire.
Participant Packet
We then created a test packet for the participants to refer to during the test sessions. It consisted of the tasks they were supposed to perform with the prototype, a Likert scale to fill out after completing each task, and space for any additional comments.
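Data collected in this shape also lends itself to a simple quantitative summary per task (completion rate and mean post-task rating). The numbers below are made up and assume a five-point ease rating; this is only a sketch of the kind of tally that can be run, not our actual results.

```python
from statistics import mean

# Hypothetical per-task data: one (completed, ease_rating) tuple per participant,
# with ease rated on a 1-5 Likert scale (5 = very easy).
results = {
    "Enroll at first clinic": [(True, 4), (True, 5), (False, 2)],
    "Send attributes":        [(True, 3), (False, 2), (True, 4)],
}

for task, sessions in results.items():
    completion_rate = sum(done for done, _ in sessions) / len(sessions)
    avg_ease = mean(rating for _, rating in sessions)
    print(f"{task}: {completion_rate:.0%} completed, mean ease {avg_ease:.1f}/5")
```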
Prototype for testing
Using the new screens, I designed the prototype to simulate the real application experience as closely as possible. The prototype consisted of multiple states based on the success or failure of certain tasks, which were reflected in the information shown to the participant. There were also "fail-safes" and "skips" in case any of our participants got stuck or frustrated and we had to intervene to move them along the test.
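Conceptually, the branching worked like a small lookup from each task to the screen shown on success, on failure, or on a moderator-triggered skip. The sketch below only illustrates that structure; the task and screen names are hypothetical, and the actual prototype was built as linked design screens rather than code.

```python
# Hypothetical task -> outcome -> next-screen mapping for the prototype flow.
flows = {
    "scan_clinic_qr": {"success": "connection_pending",
                       "failure": "scan_error",
                       "skip":    "connection_pending"},
    "send_attributes": {"success": "attributes_shared",
                        "failure": "share_error",
                        "skip":    "attributes_shared"},
}

def next_screen(task, outcome):
    """Return the screen the participant sees after a given outcome."""
    return flows[task][outcome]

print(next_screen("scan_clinic_qr", "failure"))  # -> scan_error
```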
Takeaways
Better time management during test sessions
We encountered time management difficulties: some of our testing sessions went over the one-hour mark. While our participants didn't have an issue with it, we should never go over the allotted time; we either had to reduce the number of tasks or do a better job of moderating.
Moderate participants to ensure they don't get stuck
While we wanted to gain as much insight and gather as much feedback as we could from our participants, we needed to keep the test moving along and not let our participants get stuck on a task.
Notetaking and moderating simultaneously is hard
While this is something I got better at as I moderated more and more testing sessions, it was initially really hard for me to both take notes and moderate the test.
Need to share info on trends & patterns noticed during testing sessions with other moderators
For our testing process, we had decided to divide testing sessions between teammates based on availability. However, we didn't catch each other up on the observations made during those sessions, which left us unaware of some of the trends and problem areas we could have probed further. If we had shared that information, we would have been able to use our test sessions more efficiently to gain better insight into common issues and trends.