About LANCH
LANCH, the fastest-growing consumer company in DACH, is seeking a talented and motivated Data Engineer to join our dynamic team.
Founded in 2023 and headquartered in Berlin, LANCH partners with restaurants and top creators to launch delivery-first food brands such as Happy Slice pizza, Loco Chicken, and the new Korean-style Koco Chicken. Beyond virtual kitchens, we are rolling out a network of physical restaurants and retail brands (“Happy Chips”, “Loco Tortillas”) that already reach thousands of supermarkets. With €26 million in Series A funding (Feb 2025) behind us, our Tech & Data team is building the platforms - LANCH OS and the Partner Portal - that power everything from menu management to supply-chain automation.
The Role
We’re looking for our first Data Engineer to lay the foundations of LANCH’s end-to-end data platform. You’ll own everything that turns operational events into trusted, analysis-ready datasets - from real-time streaming and batch pipelines to the orchestration frameworks that keep them humming. Working hand-in-hand with product, engineering, and ops, you will design and implement the data infrastructure that powers menu optimisation, delivery routing, brand performance dashboards, and much more.
Key Responsibilities
- Architect and launch a scalable event-streaming platform (e.g., Pub/Sub, Kafka) that captures orders, logistics updates, and app interactions in real time; see the publishing sketch after this list.
- Build and maintain a modern Reverse ETL layer (e.g., Census, Hightouch) to push clean warehouse data back to internal applications such as our Partner Portal, LANCH OS, or our CRM; a sketch of the pattern follows below.
- Evolve our Airflow and ELT environment: modular DAG design, automated testing, CI/CD, observability, and cost-efficient GCP execution; a sample DAG also follows the list.
- Collaborate with backend engineers to instrument services for analytics & tracing; champion event naming conventions and schema governance.
- Set engineering standards - code reviews, documentation, security, and infra as code (Terraform) - that will scale as we 10x the team and data volume.
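To make the streaming bullet concrete, here is a minimal sketch of publishing an order event to Google Pub/Sub. The project ID, topic name, and event schema are illustrative assumptions, not our actual conventions.

```python
# Minimal Pub/Sub publisher sketch; project, topic, and schema are assumed.
import json

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("lanch-prod", "order-events")  # assumed names

event = {
    "event_type": "order_placed",  # hypothetical event schema
    "order_id": "ord_12345",
    "brand": "happy-slice",
    "total_cents": 1890,
}

# Pub/Sub payloads are bytes; attributes let subscribers filter without
# deserialising the body.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    event_type=event["event_type"],
)
print(future.result())  # message ID once the broker acknowledges the publish
```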
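Next, the reverse-ETL bullet, shown hand-rolled purely for illustration (in practice this would be Census or Hightouch configuration, not custom code); the query, CRM endpoint, and credentials are hypothetical placeholders.

```python
# Illustrative reverse-ETL loop: enriched warehouse rows pushed back to an
# operational tool. Table, endpoint, and auth are hypothetical placeholders.
import requests
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="lanch-prod")  # assumed project ID
rows = client.query(
    "SELECT partner_id, email, lifetime_orders FROM marts.partner_summary"
).result()

for row in rows:
    requests.post(
        "https://crm.example.com/api/contacts",       # hypothetical CRM endpoint
        json=dict(row),                               # BigQuery Row -> dict
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=10,
    )
```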
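And the orchestration bullet: a minimal sketch of the modular DAG style we have in mind, written with Airflow's TaskFlow API (2.4+); the pipeline, tables, and values are invented for the example.

```python
# Minimal TaskFlow DAG sketch; names and data are invented for illustration.
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False, tags=["elt"])
def daily_orders_elt():
    @task
    def extract() -> list[dict]:
        # Production would read from Pub/Sub or Postgres; stubbed here.
        return [{"order_id": "ord_12345", "total_cents": 1890}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Small, pure functions keep each task unit-testable.
        return [{**r, "total_eur": r["total_cents"] / 100} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # e.g. write to a BigQuery staging table for dbt to model downstream.
        print(f"loading {len(rows)} rows")

    load(transform(extract()))

daily_orders_elt()
```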
About You – what will make you thrive at LANCH
- 2+ years building data infrastructure in cloud environments.
- Professional experience in designing and developing ELT pipelines.
- Hands-on experience with at least one streaming technology (Pub/Sub, Kafka, Kinesis, Dataflow, ...).
- Fluent in Python for data processing; comfortable writing performant SQL (BigQuery dialect a plus).
- Proven track record orchestrating pipelines with Airflow (or Dagster, Prefect) and deploying via Docker & GitHub Actions.
- Product mindset: you enjoy sitting with ops teams or restaurant managers to translate fuzzy business challenges into robust data pipelines.
- Bias for action and ownership: you prototype quickly, measure impact, and iterate - yesterday’s idea should be today’s scheduled DAG.
- Collaborative communicator - fluent English; conversational German.
- Eager to work mostly on-site in our Berlin Prenzlauer Berg office.
Our Tech Stack
- Data Warehouse: BigQuery
- Transformation & Modelling: dbt, SQL
- Orchestration: Airflow
- Streaming / Messaging: Google Pub/Sub, Apache Kafka (greenfield)
- Backend & APIs: Python, FastAPI, SQLModel, PostgreSQL (see the sketch after this list)
- Infrastructure: GCP, Terraform, Docker, GitHub Actions
- Analytics & BI: Metabase, Pandas, Notebook-based exploration
- Reverse ETL: Census, Hightouch, ... (greenfield)
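For a taste of the backend side, here is a minimal FastAPI + SQLModel sketch; the Order model and endpoint are hypothetical, and SQLite stands in for PostgreSQL so the snippet runs standalone.

```python
# Hypothetical FastAPI + SQLModel endpoint; SQLite substitutes for PostgreSQL
# so the example is self-contained.
from fastapi import FastAPI
from sqlmodel import Field, Session, SQLModel, create_engine, select

class Order(SQLModel, table=True):
    id: int | None = Field(default=None, primary_key=True)
    brand: str
    total_cents: int

engine = create_engine("sqlite:///demo.db")  # production would use PostgreSQL
SQLModel.metadata.create_all(engine)

app = FastAPI()

@app.get("/orders")
def list_orders() -> list[Order]:
    # FastAPI uses the return annotation as the response model.
    with Session(engine) as session:
        return session.exec(select(Order)).all()
```

Run it locally with `uvicorn main:app --reload` (assuming the file is main.py).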
If parts of the stack are new to you, no worries - what matters most is your drive to learn fast and build data products that power thousands of meals a day.
If shaping the data foundation of a high-growth food tech startup excites you, we’d love to meet you.