Category Archives: engineering
Tackling technical challenges to build a global logistics platform
A growing tech company faces a variety of novel challenges when it wants to expand quickly and efficiently into new markets.
The Beginner’s Guide to Kotlin Coroutine Internals
When moving from a monolith to a microservices architecture, engineering teams often need to master a new programming paradigm.
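Under the hood, the Kotlin compiler rewrites each `suspend` function into continuation-passing style: the function becomes a state machine that is resumed via a continuation callback at each suspension point. The following Python analogue is a minimal sketch of that idea only; the names (`Continuation`, `show_orders`, etc.) are illustrative and are not Kotlin APIs.

```python
# Illustrative sketch of the CPS state machine that Kotlin generates for a
# suspend function. All names here are hypothetical, not real Kotlin APIs.

class Continuation:
    """Receives the result of a suspended computation and resumes the caller."""
    def __init__(self, resume):
        self.resume = resume  # called with the value produced at a suspension point

def fetch_user(cont):
    # Stands in for a suspending call; here it completes synchronously.
    cont.resume("user-42")

def fetch_orders(user, cont):
    cont.resume([f"order-1 for {user}", f"order-2 for {user}"])

def show_orders(outer_cont):
    # The compiler-generated state machine: `label` records which suspension
    # point to resume from, and locals survive across suspensions.
    state = {"label": 0, "user": None}

    def resume_with(value):
        if state["label"] == 0:
            state["label"] = 1
            fetch_user(Continuation(resume_with))       # first suspension point
        elif state["label"] == 1:
            state["user"] = value
            state["label"] = 2
            fetch_orders(state["user"], Continuation(resume_with))  # second
        else:
            outer_cont.resume(value)  # final result of show_orders

    resume_with(None)  # start the state machine

results = []
show_orders(Continuation(results.append))
# results now holds the order list produced after both "suspensions" resumed
```

In real Kotlin the suspending calls would return without blocking and the runtime would invoke the continuation later; the state-machine shape is the same.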
How DoorDash Quickly Spins Up Multiple Image Recognition Use Cases
DoorDash has rich image data collected by Dashers, our delivery drivers, which powers a number of image recognition use cases.
Improving Subgroup Analysis with Stein Shrinkage
DoorDash is often interested in knowing not only the average effect of an intervention, but also the more granular effect of that intervention on specific cohorts.
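Stein shrinkage pulls each cohort's raw estimate toward the grand mean, trading a little bias for a large variance reduction when many small subgroups are analyzed at once. A standard form of the James-Stein estimator (notation assumed here, not taken from the post) is:

```latex
\hat{\theta}_i^{\mathrm{JS}} \;=\; \bar{x} \;+\; \left(1 - \frac{(k-3)\,\sigma^2}{\sum_{j=1}^{k} (x_j - \bar{x})^2}\right)\,(x_i - \bar{x})
```

where $x_i$ is the raw estimate for cohort $i$, $\bar{x}$ is the grand mean across the $k$ cohorts, and $\sigma^2$ is the sampling variance; the noisier the raw estimates relative to their spread, the more each one is shrunk toward $\bar{x}$.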
Pioneering DoorDash’s Platform Evolution in Pittsburgh
Today, I’m excited to announce that DoorDash is building an engineering team in the “Steel City” – Pittsburgh, PA.
Building Frictionless MFA to Protect Against Account Takeovers
With the rise of digital accounts that enable impactful transactions, keeping them secure from unauthorized takeovers is becoming essential for any online business. With millions of regular users who can spend money and order food, account security is a top priority at DoorDash as well.
DoorDash 2021 Summer Intern Projects
DoorDash prides itself on offering an internship experience where interns fully integrate with Engineering teams and get the kind of real industry experience that is not taught in a classroom.
Eight Things We Learned from Implementing Payments in the DoorDash Android App
Effective implementation of payments in a mobile app requires precise attention to factors such as payment methods, the user experience, and fraud prevention.
How to Run Apache Airflow on Kubernetes at Scale
As an orchestration engine, Apache Airflow lets us quickly build pipelines in our data infrastructure.
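Running Airflow at scale on Kubernetes typically means switching to the Kubernetes executor, so that each task runs in its own pod and workers scale with the cluster. A minimal illustrative `airflow.cfg` fragment follows; the namespace and image values are assumptions, not DoorDash's actual configuration.

```ini
[core]
executor = KubernetesExecutor

[kubernetes]
namespace = airflow
; hypothetical worker image; point these at your own registry
worker_container_repository = my-registry/airflow-worker
worker_container_tag = 2.0.0
; clean up finished task pods so the cluster does not accumulate them
delete_worker_pods = True
```

With this executor, scaling is handled by the Kubernetes scheduler rather than a fixed pool of Celery workers.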