Beyond the Click: Elevating DoorDash’s personalized notification experience with GNN recommendation

June 25, 2024 | Nimesh Sinha

DoorDash has redefined the way users explore local cuisine. Our highly interactive notification system has been an integral part of this experience by not only keeping users updated about deliveries but also by acting as a pathway to personalized restaurant recommendations.

Our notifications are meticulously designed to be an essential line of communication that keeps both utility and personalization in mind. Because we believe each meal should be an exploration and an opportunity to discover new culinary delights, we leverage personalized notifications to open unexplored avenues of cuisine for users to try. With each successful experience, users gain a stronger sense that DoorDash understands their preferences and values, fostering loyalty and engagement.

In this blog post, we discuss how we deploy Graph Neural Network (GNN) models to create personalized restaurant recommendations to share with users through push, email, and hub notifications, as shown in Figure 1.

Real-world data often exhibits a rich, interconnected structure that can be naturally represented as a graph. A graph is a data structure consisting of two components: nodes, often called vertices, and edges. For instance, a simple graph G can be defined as G = (V, E), where V is the set of nodes and E is the set of edges between them. In the DoorDash context, users and restaurants can be nodes, and an order can be an edge between them.
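As a toy illustration of this graph view, the snippet below builds a tiny user-restaurant graph with networkx; the IDs and attributes are made up for the example and are not real DoorDash data.

```python
import networkx as nx

# Toy bipartite graph: users and restaurants are nodes, orders are edges.
# All IDs and attributes below are illustrative placeholders.
G = nx.Graph()
G.add_node("user_1", node_type="user", avg_order_value=24.5)
G.add_node("rest_1", node_type="restaurant", cuisine="thai")
G.add_node("rest_2", node_type="restaurant", cuisine="pizza")

# An order connects a user to a restaurant.
G.add_edge("user_1", "rest_1", edge_type="order", num_orders=3)
G.add_edge("user_1", "rest_2", edge_type="order", num_orders=1)

print(G.nodes(data=True))
print(G.edges(data=True))
```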

GNNs are a class of neural networks designed specifically to work with graph-structured data. They leverage the relationships represented by edges, along with the attributes of nodes, to perform a range of machine learning tasks such as node classification and edge (link) prediction. Used broadly by companies such as Pinterest, LinkedIn, UberEats, Airbnb, and Meta, GNNs can analyze relationships between products and users to create a recommendation framework based on past behavior and interactions.

Figure 1: Notification content generation and user experience

Why GNNs?

Without manual feature engineering, traditional machine learning models struggle to discern relationships between entities such as users and restaurants and edges such as orders. But because GNNs work with graphs that already model complex interactions and relationships, they can make recommendations based on the entire heterogeneous feature set of products and users. GNNs reduce some of the need for manual feature engineering by focusing on:

  • The interdependency of training examples that enhance each other;
  • Expressive and learnable aggregations; and 
  • Flexibility across tasks and training objectives.

GNN models also have the advantage of being able to learn collaborative filtering and combine it with node features and attributes to recommend new restaurants to try based on previous interactions. GNN models can transform and smooth node features by passing and transforming them across each node's neighborhood. The graph view of users and restaurants is powerful because it allows enriching an entity's representation with information from other connected entities. Moreover, this feature enrichment is learned and will be optimal for the given task.

DoorDash's Marketplace recommendation challenges, such as reaching inactive users through notifications and leveraging fragmented data, are ideally addressed by GNNs. GNNs excel at extracting personalized information by utilizing interconnected data from various sources such as users, restaurants, orders, and views, which makes it possible to generate more relevant recommendations even when user interaction is minimal.

How GNNs work

A GNN learns from graph data. Our DoorDash data is a rich source for graph creation, with nodes such as users and restaurants connected by edges such as orders and views, as shown in Figures 2 and 3 below.

Figure 2: Our DoorDash Data in a graph
Figure 3: An example of data collected and used to build graphs

GNN mechanism

Figure 4: The two pictures above illustrate node classification using a GNN. The left picture shows the node A that we want to classify; the right picture shows how information is passed across its neighbors to help classify node A. [Source: Professor Jure Leskovec's slides]

There are multiple components in the GNN models:

  • Message passing: In message passing, each node in the graph sends a message to its neighboring nodes, typically relaying the node's features, which are then processed by the receiving nodes.
  • Aggregation: Each node aggregates messages from neighboring nodes to update its representation. This step involves combining incoming messages in a meaningful way, such as summing or averaging them. The aggregated information is then used to update that node's features to capture the influence of its neighbors.
  • Stacking of layers: In a GNN, multiple layers of message passing and aggregation are stacked to capture more complex representations of the graph's nodes. Each layer in the stack takes node features from the previous layer, processes them through message passing and aggregation, then updates the node representations. As information passes through multiple layers, each layer learns to capture a different level of abstraction and relationships in the graph. The deeper the network, the more complex and abstract the features it can learn, allowing for more accurate predictions and better performance on graph-related tasks, as illustrated in the sketch following this list.
  • Learning objective: Ultimately, we define the task based on the learning objective, such as node classification, link prediction, graph prediction, or node clustering.
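To make these components concrete, here is a minimal, framework-free sketch of mean-aggregation message passing with two stacked layers. It is purely illustrative and not the production model; the toy graph, feature dimensions, and weight matrices are all made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_layer(node_feats, neighbors, w_self, w_neigh):
    """One message-passing layer: each node receives its neighbors' feature
    vectors as messages, aggregates them by averaging, and combines the
    result with its own transformed features."""
    out = np.zeros((node_feats.shape[0], w_self.shape[1]))
    for node, neigh_ids in neighbors.items():
        # Aggregation: mean of the neighbors' messages (their feature vectors).
        if neigh_ids:
            msg = node_feats[neigh_ids].mean(axis=0)
        else:
            msg = np.zeros(node_feats.shape[1])
        # Update: combine self features with the aggregated message, ReLU nonlinearity.
        out[node] = np.maximum(0, node_feats[node] @ w_self + msg @ w_neigh)
    return out

# Toy undirected graph with 3 nodes (edges 0-1 and 1-2) and 4-dim node features.
feats = rng.normal(size=(3, 4))
neighbors = {0: [1], 1: [0, 2], 2: [1]}

w1_self, w1_neigh = rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
w2_self, w2_neigh = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))

# Stacking two layers lets each node incorporate information from its 2-hop neighborhood.
h1 = gnn_layer(feats, neighbors, w1_self, w1_neigh)
h2 = gnn_layer(h1, neighbors, w2_self, w2_neigh)
print(h2.shape)  # (3, 8)
```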

Creating graph datasets

DoorDash uses multiple sources of data from user-restaurant interactions to build graphs and train GNNs. Heterogeneous graphs are created from one year's worth of collected data. The graph is defined based on relational tables, as shown in Figure 5 below: each table defines a node type, each row in a table is a node, and matched nodes define an edge.

Figure 5: Data collected and used to build graphs in tabular form showing attributes of nodes and edges between them
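One way to assemble such relational tables into a heterogeneous graph is sketched below, assuming PyTorch Geometric's HeteroData container; the node counts, feature dimensions, and edge name are placeholders rather than the actual schema.

```python
import torch
from torch_geometric.data import HeteroData

data = HeteroData()

# Each table defines a node type; each row becomes a node with a feature vector.
# Shapes here are placeholders: 1,000 users with 16 features, 500 restaurants with 32.
data['user'].x = torch.randn(1_000, 16)
data['restaurant'].x = torch.randn(500, 32)

# Matched rows define edges: edge_index[0] holds user indices and edge_index[1]
# holds the index of the restaurant each user ordered from.
order_edges = torch.tensor([[0, 0, 2],      # user ids
                            [10, 42, 7]])   # restaurant ids
data['user', 'orders', 'restaurant'].edge_index = order_edges

print(data)
```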

Model training

There are three broad components for GNN model training:

  1. Encoder: A GNN's encoder is responsible for encoding the graph structure and node features into a latent-space representation that captures the relational information and dependencies among the graph's nodes. We encode the features into user and restaurant embeddings.
  2. Model: The model lies at the core of a GNN, processing the latent node representations generated by the encoder to make predictions. It typically involves additional layers of neural networks that process the node embeddings and produce the desired output. We obtained high performance in our offline evaluation with ID-GNN (identity-aware GNN), which is designed to leverage the structural and identity-specific features of graph-structured data. In an ID-GNN, to embed a given node, an ego network centered at that node is first extracted; message passing is then applied, with messages from the center node and from the rest of the nodes computed using different sets of parameters.
  3. Prediction: Ultimately, a GNN uses the learned model to make predictions on unseen or new data. We defined our modeling task as link prediction between user and restaurant. The model component predicts the probability that a user will order from a given restaurant within the next seven days.

We train the GNN model with the above-mentioned components and retrieve both user and restaurant embeddings.
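As a simplified sketch of the prediction component: a common link-prediction head scores a user-restaurant pair as the sigmoid of the dot product of their embeddings. This decoder choice is an assumption made for illustration, not necessarily the exact production head, and the embeddings below are randomly generated stand-ins.

```python
import torch

def link_score(user_emb: torch.Tensor, rest_emb: torch.Tensor) -> torch.Tensor:
    """Probability that a user orders from a restaurant, modeled as the
    sigmoid of the dot product between their GNN embeddings."""
    return torch.sigmoid((user_emb * rest_emb).sum(dim=-1))

# Toy embeddings standing in for the output of a trained encoder.
user_emb = torch.randn(4, 64)   # 4 users, 64-dim embeddings
rest_emb = torch.randn(4, 64)   # 4 candidate restaurants, paired row by row

probs = link_score(user_emb, rest_emb)   # shape (4,), one probability per pair

# Training would typically minimize binary cross-entropy against observed orders
# (positive pairs) and sampled non-orders (negative pairs).
labels = torch.tensor([1., 0., 1., 0.])
loss = torch.nn.functional.binary_cross_entropy(probs, labels)
```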

Offline evaluation

We measure the performance of the GNN model against our production LightGBM model using evaluation metrics such as MAP@k and MRR@k, with values of k ranging from 1 to 10. The GNN model showed a significant lift in offline performance.
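For reference, the sketch below shows one way MRR@k and MAP@k can be computed from ranked recommendation lists; it is a generic implementation and is not taken from DoorDash's evaluation pipeline.

```python
def mrr_at_k(ranked_lists, relevant_sets, k):
    """Mean reciprocal rank of the first relevant item within the top k."""
    total = 0.0
    for ranked, relevant in zip(ranked_lists, relevant_sets):
        for rank, item in enumerate(ranked[:k], start=1):
            if item in relevant:
                total += 1.0 / rank
                break
    return total / len(ranked_lists)

def map_at_k(ranked_lists, relevant_sets, k):
    """Mean average precision over the top k recommendations."""
    total = 0.0
    for ranked, relevant in zip(ranked_lists, relevant_sets):
        hits, precision_sum = 0, 0.0
        for rank, item in enumerate(ranked[:k], start=1):
            if item in relevant:
                hits += 1
                precision_sum += hits / rank
        if relevant:
            total += precision_sum / min(len(relevant), k)
    return total / len(ranked_lists)

# Example: one user, a ranked list of restaurant IDs vs. the restaurant actually ordered from.
print(mrr_at_k([["r3", "r1", "r7"]], [{"r1"}], k=3))   # 0.5
print(map_at_k([["r3", "r1", "r7"]], [{"r1"}], k=3))   # 0.5
```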

Integrating GNN recommendations in the ecosystem

We send notifications to users once per day, so our ML system doesn't need to be online. We retrieve user and restaurant embeddings from the GNN model. Because we only want to send recommendation notifications for restaurants in a user's delivery area, we use geolocation to select restaurant candidates: based on geohash and the distance of restaurants from the user's address, we identify up to N of the closest restaurants within the delivery range of each user address. We retrieve restaurant candidates for the user's recently used addresses. During the ranking stage, we use GNN-based user and restaurant embeddings to compute the likely affinity between the user and the candidate restaurants for each address. As shown in Figure 6 below, we then apply post-processing filters to introduce freshness, remove restaurants with low ratings, and deduplicate restaurants, for instance by eliminating multiple locations of the same franchise. These restaurant recommendations are then sent to our users via our notification platform.

Figure 6: Recommendation flow
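A heavily simplified sketch of the ranking and post-processing stages is shown below. It assumes precomputed GNN embeddings and candidate metadata; the field names, rating threshold, and franchise-dedup rule are illustrative placeholders rather than the actual production logic.

```python
import numpy as np

def rank_candidates(user_emb, candidates, min_rating=4.0, top_n=10):
    """Score geo-filtered restaurant candidates by embedding affinity,
    drop low-rated ones, dedupe franchises, and return the top N IDs."""
    # Affinity = dot product between the user's and each restaurant's GNN embedding.
    scored = sorted(candidates, key=lambda c: -np.dot(user_emb, c["emb"]))
    seen_brands, ranked = set(), []
    for c in scored:
        if c["rating"] < min_rating or c["brand"] in seen_brands:
            continue  # post-processing filters: quality and franchise dedup
        seen_brands.add(c["brand"])
        ranked.append(c["id"])
        if len(ranked) == top_n:
            break
    return ranked

# Toy example: two candidates already filtered to the user's delivery area.
user_emb = np.random.randn(64)
candidates = [
    {"id": "r1", "brand": "thai_spot", "rating": 4.6, "emb": np.random.randn(64)},
    {"id": "r2", "brand": "thai_spot", "rating": 4.2, "emb": np.random.randn(64)},
]
print(rank_candidates(user_emb, candidates, top_n=5))
```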

Impact

After we sent personalized recommendations with the heading “Try Something New,” powered by GNN, we saw a 1.8% relative increase in input metrics such as push and hub engagement. We also observed a statistically significant lift in top-line metrics such as monthly active users. The impact spanned all user cohorts, validating that our users are satisfied with the quality of our recommendations.

Looking into the future

We plan to extend GNN applications to generate recommendations for more notification types, helping us reach our users with more diversified and unexplored content. We will continue to iterate our GNN model architecture to improve the personalization of recommendations for notifications.

Acknowledgments

Special thanks to Zidong Yang, Lei Xu, Yanjia Xu, Richard Tian, Mengyang Cui, and Juncao Li for making GNN possible at DoorDash.

About the Author

  • Nimesh Sinha

    Nimesh Sinha is a Machine Learning Engineer on the New Verticals team at DoorDash. His focus is on recommendations and personalization.
