Adaptive UX: LinkedIn
Improving the interaction flow of LinkedIn's job search through adaptive user interfaces.
LinkedIn helps users access opportunities by maximizing the potential of their professional identity and network. My team sought to understand how adapting the app's job search interface could improve navigation, anticipate user needs, and recover from errors in the system's reasoning.
I was involved in every stage of the project, from research to identifying design opportunities, and led development of the high-fidelity prototype.
Our team began by reading academic papers about adaptive artificial intelligence to understand the current capabilities of the technology, as well as key considerations that should be accounted for when blending AI with product design.
1. Being accurate is not enough: how accuracy metrics have hurt recommender systems
2. Principles of mixed-initiative user interfaces
3. Guidelines for Human-AI Interaction
Exploring Apps for Adaptation
My team mapped out the current user flows of 10 popular apps to get a better sense of where interface adaptability could be integrated.
As we began narrowing our scope, we conducted value and impact assessments based on the following criteria:
1. Size of value experienced by target users: How frequently do people perform this transaction, and how much would adaptation improve it?
2. Value for service providers: What incentives do service providers have for incorporating adaptive UX?
3. Potential for ML interaction and development: How likely can the adaptation be correctly triggered or inferred?
4. Risk of errors: How much would false positives or negatives impact the user’s experience with the app?
5. Types of possible adaptation: Would interactions primarily be proactive or reactive?
Our team decided to tackle LinkedIn's job search, given the wealth of opportunities to collapse the customised job results and filters pages (a flow that currently takes 10 screens and a minimum of 12 taps) and the value such an intervention would have for users.
We conducted think-alouds and semi-structured interviews with students to validate our assumptions about their pain points when searching for jobs on LinkedIn, and to learn what matters most to them during the job-hunting process.
Based on the feedback from our interviews, we created two personas that represent users on opposite ends of the job hunting spectrum. These personas helped to guide us through the process of creating adaptive designs that could benefit a wide range of users.
Identifying Design Opportunities
With new insights from our interviews, our team marked up the current LinkedIn interaction flow with identified and anticipated pain points, as well as ways to potentially alleviate those challenges. We then focused on how we could collapse screens through implicit and explicit feedback to the system.
We began by identifying possible points of intervention and potential features, and translating such ideas into initial prototypes.
For the novice generalist, we focused on creating an experience that could process explicit feedback through means such as liking and disliking jobs and applying company filters, allowing the system to adapt faster. We also wanted to introduce an element of serendipity with a recommender that would prompt such a user to consider options tangentially related to the job market they are interested in, which might still be a good fit for them.
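To make the explicit-feedback idea concrete, here is a minimal sketch of how like/dislike signals could nudge per-tag preference weights. The tags, learning rate, and function are illustrative assumptions for this case study, not LinkedIn's actual model:

```python
from collections import defaultdict

def update_preferences(prefs, job_tags, liked, rate=0.2):
    """Nudge per-tag preference weights after an explicit like/dislike.

    prefs: dict mapping tag -> weight in [-1, 1]
    job_tags: tags describing the job the user rated (hypothetical)
    liked: True for a like, False for a dislike
    """
    direction = 1.0 if liked else -1.0
    for tag in job_tags:
        # Move each tag's weight a fraction of the way toward
        # +1 (like) or -1 (dislike), so repeated signals compound
        # but a single tap never dominates the profile.
        prefs[tag] += rate * (direction - prefs[tag])
    return prefs

prefs = defaultdict(float)
update_preferences(prefs, ["design", "remote"], liked=True)
update_preferences(prefs, ["sales"], liked=False)
```

Because each signal moves weights only partway, the profile adapts quickly without letting one impulsive dislike erase earlier preferences.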
For the experienced specialist, we proposed a similar filtering strategy that would prioritize search results based on the weighted importance of job search specifications. This would allow the user to see more senior positions where they have strong connections, letting potential matches happen more quickly.
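The weighted prioritization could look something like the sketch below: each listing is scored against user-assigned importance weights and results are sorted by score. The criteria names and numbers are placeholders for illustration:

```python
def score_job(job, weights):
    """Score a job listing by the weighted importance of search criteria.

    job: dict of criterion -> fit value in [0, 1] (e.g. how well the
         seniority, connection strength, or location matches)
    weights: dict of criterion -> user-assigned importance
    """
    total = sum(weights.values()) or 1.0
    # Weighted average of per-criterion fit, normalized so the score
    # stays in [0, 1] regardless of how weights are distributed.
    return sum(weights[c] * job.get(c, 0.0) for c in weights) / total

weights = {"seniority": 0.5, "connections": 0.3, "location": 0.2}
jobs = [
    {"title": "Senior PM", "seniority": 0.9, "connections": 0.8, "location": 0.6},
    {"title": "Junior PM", "seniority": 0.2, "connections": 0.9, "location": 0.9},
]
ranked = sorted(jobs, key=lambda j: score_job(j, weights), reverse=True)
# With these weights, the senior role with strong connections ranks first.
```

Giving seniority the largest weight reflects the specialist persona: a junior role with better location fit still ranks below a senior role with strong connections.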
1. Ask for clarification when the user dismisses a job listing or company, to tune the model more precisely: Currently, when a user dismisses a job listing or company with a left swipe, the model doesn't learn why that listing or company was a poor fit. This could lead to an overfit model and a frustrating experience. We could address this by adding a pop-up after a user dismisses a result that asks for clarification, ensuring the model learns the right lesson.
2. Value for companies: Currently, profile data, user preferences, and system feedback dictate which jobs are shown to users; however, we are interested in exploring how our intervention could provide value for companies looking to hire new employees. Perhaps another match score criterion could be an initial assessment of mutual interest.
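The clarification idea in point 1 can be sketched as a small set of reason codes the pop-up might offer, each mapped to the single criterion it should down-weight. The reason codes, mapping, and step size are all hypothetical:

```python
# Hypothetical reason codes a post-dismissal pop-up might offer,
# each mapped to the one criterion the user's answer implicates.
REASON_TO_CRITERION = {
    "wrong_seniority": "seniority",
    "wrong_location": "location",
    "company_not_a_fit": "company",
}

def apply_dismissal(weights, reason, step=0.15):
    """Down-weight only the criterion the user cites, instead of
    penalizing every attribute of the dismissed listing (which is
    what makes blind left-swipes overfit)."""
    criterion = REASON_TO_CRITERION.get(reason)
    if criterion is None:
        return weights  # user skipped the pop-up; learn nothing
    weights[criterion] = max(0.0, weights.get(criterion, 0.0) - step)
    return weights

w = {"seniority": 0.5, "location": 0.3, "company": 0.2}
apply_dismissal(w, "wrong_location")
```

The key design choice is that an unanswered pop-up changes nothing: the model only updates when it knows *why* the dismissal happened.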