Our first ever all-virtual UXPA Boston Conference took place on October 23, 2020. Thank you to all who attended!
Research
Friday, October 23
 

9:15am EDT

Find it and Fix It: How a UX audit empowers you to identify your product’s top UX issues
UX professionals often struggle to understand why users aren’t using their product the way we intended. Usability testing only gets us so far. How can we gain more precise and actionable insight into what’s holding our users back from getting the most out of our work?

A UX audit can identify the most pressing issues facing the users of your product. In this session, you’ll learn how to synthesize data from different sources to create a multi-layered, multifaceted, and robust depiction of what problems need to be fixed and how to prioritize fixing them. Participants will leave this session ready to run a UX audit themselves or in partnership with stakeholders from different teams.

Speakers

Judy Xu

UX Researcher, HubSpot
Judy is a UX researcher at HubSpot, where she focuses on delivering user insights and recommendations for her teams. She enjoys using a blend of qualitative and quantitative research and statistical methods to find what matters most to users. Prior to HubSpot, she was at SimpliSafe... Read More →



Friday October 23, 2020 9:15am - 10:00am EDT
5 - Liberty (Career)

9:15am EDT

Is That Really Me? A Case Study in Measuring Emotional Engagement of Customers Using a Virtual Dressing Room in an e-Commerce Website
UX professionals have begun to focus on how to deliver a highly engaging customer experience. One of the key challenges is to accurately measure a wide range of customer emotions, including engagement, joy, frustration, trust, confidence, surprise, and disgust.

In my presentation I will share the results of a case study that measured customer emotions while participants used different virtual dressing rooms on three e-commerce websites. I will present biometric data from users, including eye tracking, facial expressions, and galvanic skin response (GSR), to show a more complete picture of the emotional user experience, which would otherwise be difficult to detect using analytics, market research, or traditional user research methods.

Speakers

William Albert

Executive Director, Bentley University User Experience Center
Bill Albert is Executive Director of the Bentley University User Experience Center. Albert brings more than 20 years of experience in UX research, design, and strategy to his role leading the center. He has expertise in qualitative and quantitative user research methods, service design... Read More →


Friday October 23, 2020 9:15am - 10:00am EDT
4 - Republic (Research & Panels)

9:15am EDT

Learn from the Best: How Experienced Professionals Moderate Usability Tests
Usability testing is by far the most widely used usability method. It’s important that we learn from experienced practitioners about this method so we can continue to improve our skills in usability testing and avoid common errors.

Through presentations and interactive quizzes based on video clips, this talk reports how 15 experienced usability professionals and two graduate students moderated usability tests. The purpose of the study was to investigate the approaches to moderation used by experienced professionals. Based on this study, the talk presents some of the characteristics that distinguish good and bad moderation.

Each moderator independently moderated three think-aloud usability test sessions of Ryanair.com, the website of a low-fare European airline. All moderators used the same six usability test tasks. The test sessions were video recorded so that both the participant and moderator were visible.

Key observations were identified by asking other study participants to review a random video from each moderator. Each video was reviewed by five to seven study participants. With this approach, the data, not a single person, author, organizer, or moderator, determines what the key observations are.

This study, the tenth in a series of Comparative Usability Evaluation (CUE) studies, documents wide differences in moderation approaches. The talk presents important issues in usability test moderation, including time management, building trust and rapport, pilot sessions, design discussions, and how to stay sharp.

Speakers

Rolf Molich

Owner, DialogDesign
Rolf Molich's main interests are usability evaluation, UX strategy for beginners, and UX certification. Rolf owns and manages DialogDesign, a tiny Danish usability consultancy. Rolf has worked with usability since 1984. Before that he worked as a successful software engineer. Rolf is the... Read More →


Friday October 23, 2020 9:15am - 10:00am EDT
1 - Back Bay A/B (Research)

10:15am EDT

Research Ready to Build: Compelling Artifacts that Speak Your Agile Team’s Language
Slides at: https://www.slideshare.net/uxforward/research-ready-to-build-compelling-artefacts-that-speak-your-agile-teams-language-239013572
Communicating design and research results in a way an Agile development team can best leverage is critical to achieving your vision of the user experience. Our teammates have their own favorite methods for making sense of information, which we can adopt to bridge communication gaps. Join our talk to learn several techniques UX designers and researchers can use to “speak the language” of business, development, and quality assurance (QA). Success (and failure) stories will illustrate how we have used our coworkers’ methods and vocabulary to engage teammates and deliver great experiences. We show how to make the key points stick with teammates by collaboratively translating them into a scope of work everyone can understand. Our examples range from tactically planning one sprint’s backlog, through a list of related capabilities for a product epic, to strategically planning a multi-year vision roadmap. You will walk away with processes, tips, and tricks for collaborating better with business, development, and QA, ultimately making research and design insights relevant and actionable for everyone.

Speakers

Joshua Ledwell

Principal Experience Designer, Autodesk
Josh Ledwell is an experience designer who creates efficient, satisfying, and delightful software workflows at Autodesk. He pioneered the Customers in Sprint Reviews collaboration method used by over a dozen development teams. Josh has a master’s degree in Human Factors and Information... Read More →

Devashree Desai

Senior Experience Designer, Autodesk
Devashree Desai is an experience designer at Autodesk in Boston. An architect and urban planner in her previous life, she went from designing physical spaces to designing digital spaces for users. She has a master’s degree in Interaction Design from Northeastern University.



Friday October 23, 2020 10:15am - 11:00am EDT
4 - Republic (Research & Panels)

10:15am EDT

What Did I Miss? The Hidden Costs of Deprioritizing Diversity
Characteristics like race, ethnicity, gender, and disability status—whether they are socially constructed, superficial, or highly visible—can have a significant impact on how we experience the world, and how the world experiences us. In my experience as a UX researcher, diversity is the first thing to vanish from the recruit when the going gets tough; I want to show you why that is not okay, and what you can do about it in your own practice. Researchers may already believe that diversity is important in theory, but this presentation will go further by demonstrating why it is imperative for a strong user research study, providing examples of what we miss when we do not insist on diversity, and offering talking points to help make the case to clients, stakeholders, and internal teams.

In this presentation, we will examine the problem space through a sociological lens and talk about what it means to live a different experience based on your demographics and characteristics. Then, I will discuss the results of a survey that asks about how researchers think about and practice diversity recruitment in their screeners; we will look at examples where opportunities were uncovered or completely missed because the researcher did or did not insist on retaining diversity criteria in the recruit. Finally, I will offer suggestions to help researchers make the case for including and retaining diversity criteria in their screeners to ensure that the participants they talk to are representative of the full spectrum of their intended audience.

Speakers

Megan Campos

Senior Experience Researcher, Mad*Pow
Megan is a Senior Experience Researcher at Mad*Pow in Boston, where she has worked across multiple verticals including finance, healthcare, and education to deliver user insights and provide strategic recommendations for her clients. Since her undergraduate years as a sociology major... Read More →



Friday October 23, 2020 10:15am - 11:00am EDT
1 - Back Bay A/B (Research)

11:15am EDT

Riding the synthesis wave: How to avoid drowning in your qualitative data
Ever finished an amazing UX research study, only to find yourself drowning in interview recordings and incomprehensible notes? When analyzing qualitative research, it’s tempting to take shortcuts, because the analysis process can be so time-consuming and hard. However, the rigor of your analysis can make or break the impact of your results and recommendations.

We’ll explore the common analysis pitfalls within UX research and illuminate techniques to improve your synthesis process. Learn how to mitigate the influence of your own bias and transform surface-level patterns into nuanced, meaningful insights to level up your research findings.

Speakers

Margot Lieblich

Research Lead, HubSpot
After stumbling into a design thinking workshop while working in healthcare IT, I found my passion in the field of UX and design research. Since that happy accident, I've gone on to complete my Master's in Human Factors and Information Design at Bentley University and currently work... Read More →



Friday October 23, 2020 11:15am - 12:00pm EDT
1 - Back Bay A/B (Research)

1:30pm EDT

Beyond the Binary: Design Principles for Gender-Identity Inclusion
Gender identity isn’t binary. But until now, our platform gave research participants two options: male or female. It wasn’t working for our users, our customers, our employees, or our company.

So we took a hard look at how our platform asked users about gender—then talked to ~70 people across the gender spectrum about how we could build a more inclusive experience for our non-cisgender users.

The research surfaced design principles for gender-identity inclusion—and insights for any company looking to move beyond the binary. The lead researchers will share:
  • How they designed the research—and why we launched it
  • What they found from the conversations, research and prototype testing
  • How you can now conduct more inclusive research
  • How you can use the insights to make your company more inclusive
Link to slides: https://docs.google.com/presentation/d/1pbQ8eYevqEEk707ldFdPXBYiedJXXN0X54ZeeC-xgSk/edit?usp=sharing

Speakers

Jess Mons

Director of Business Intelligence, dscout
Jess is the Director of Business Intelligence at the mobile research platform dscout. They have dedicated over eight years to telling stories with data. From their 3+ years spent as one of the company’s Lead Research Advisors, they also gained experience designing, executing, and... Read More →

Lindsey Brinkworth

Research Analyst, dscout
Lindsey is a Research Analyst with the Studio consulting team at dscout and considers herself to be a genuine ‘People Nerd:’ passionate about building empathy and understanding her participants. During her 3+ years with dscout, she has worked hand-in-hand with some of the world’s... Read More →


Friday October 23, 2020 1:30pm - 2:15pm EDT
1 - Back Bay A/B (Research)

2:30pm EDT

Enterprise Customer Insights Research - How to Make an Impact?
Customer research is typically hard for companies that build enterprise software, where the development cycle and customer feedback cycle can be long and it is relatively hard to find targeted customers to study. We want to share our stories of how we collaborate with other customer-facing departments within the company to schedule customer interview sessions, what kinds of research deliverables we provide, and how we communicate the results to influence product roadmap planning and feature prioritization.

1. Collaborate with other customer-facing departments to find customers
It is always hard to find enterprise software customers to talk to, so leveraging the help of other customer-facing departments is very efficient. Those departments can be Customer Success Managers, Account Managers, Pre-Sales Engineers, Support and Services, etc. Some best practices we’ve learned are:
  • Hold regular meetings between the PM+UX team and other teams.
Our PM/UX team holds monthly meetings with the PSE (Pre-Sales Engineer) and CSM (Customer Success Manager) teams, as well as biweekly feature-request review meetings with the support/services team. Building strong relationships with these teams and communicating often are very important.
  • Set specific goals for scheduling interview sessions
Every week, we aim to hold two one-on-one sessions with internal people and one customer interview session. Setting specific goals keeps us moving.
  • Leverage already scheduled customer meetings
We don’t want to overwhelm customers with extra meetings, so joining already scheduled meetings (such as regular customer review meetings, demos, and user trainings) is another good way to interact with customers quickly. Sometimes we can squeeze our interview into that session; other times, just listening to customers’ feedback on our demos or trainings gives great insights.
 
2. Customer research methods and deliverables
Though we’ve used various research methods such as interviews, surveys, and usability testing, the two most effective have been one-hour customer interview sessions and Salesforce customer case analysis, since they give us the most valuable and actionable insights.
Most of our customer interview sessions are done through web meetings, where we usually ask customers to share their screen, walk us through how they currently use our products, and describe what they are trying to accomplish. When possible, we also ask customers to share their own roadmaps. These details are extremely helpful for understanding customer goals and use cases. Typical deliverables we’ve provided are:
  • In-depth customer profile & use case report
  • Persona use case summary (once we have talked to enough customers in a particular job role, we summarize the patterns; note that we don’t provide a generic persona description but rather detailed use case patterns, which we’ve found more useful)
  • Experience journey map / task flow (this journey map includes not only how customers use our product, but also all the other products customers use and integrate with)
The deliverables from customer interview sessions have greatly helped the whole company better understand our customers, not just at the feature level, but in terms of WHY customers need to do certain kinds of tasks.
Another effective research deliverable is a summary of customer feature-request cases from Salesforce. Often, when we’ve heard many complaints about one problem area (from both internal teams and customers), we run a Salesforce report and do a deep analysis of all the cases. Our summaries have driven the product team to raise the priority of these problems so they get fixed quickly.


3. Communicate the results effectively
How to communicate the results and provide actionable insights also needs careful consideration, since people usually don’t have the patience to read a long slide deck. Two important best practices we’ve found are: provide a one-page summary and visualize the results as much as you can, so that even at a glance people can get what your report is trying to say. One last small tip: always include a screen capture of the one-page summary in your email, for those who don’t even want to download your report!

Speakers

Meng Yang

Manager of User Experience and Customer Insights, NetBrain
10+ years of experience in interaction design and user experience research. This is the fourth time my presentation has been accepted to UXPA Boston!



Friday October 23, 2020 2:30pm - 3:15pm EDT
1 - Back Bay A/B (Research)

3:30pm EDT

Boundary Maintenance for UX Researchers
Whether by nature or by training, user researchers spend time listening to others, learning, and caring about their needs. It’s an occupational hazard to take on the issues, burdens, and concerns of others. This becomes a problem when we get emotionally involved in what really belongs to someone else, or when others take advantage of our caring stance. This presentation offers practical advice on how to spend time listening to others’ stories while still maintaining our own healthy selves.

Speakers

Kris Engdahl

Principal User Researcher, Indigo Ag
Kris has been practicing and leading UX Research for over 20 years, working in different industries, including databases, healthcare, hospitality, and now agriculture. She's led teams and been a solo researcher. She is currently building a research practice at Indigo Ag.


Friday October 23, 2020 3:30pm - 4:15pm EDT
1 - Back Bay A/B (Research)
 

