Leethub: Curated tech jobs from FAANG and top companies worldwide.
Wheely

About Wheely

Luxury ride-hailing with a focus on privacy

🏢 Tech · 👥 201-500 employees · 📅 Founded 2010 · 📍 Brackenbury Village, London, UK · 💰 $43.1m · ⭐ 3.8

B2C · B2B · Marketplace · Transport · Mobility

Key Highlights

  • Founded in 2010, headquartered in London, UK
  • Available in London, Moscow, Paris, and Dubai
  • $43.1M raised in Series B funding
  • Thousands of certified chauffeurs driving under the Wheely brand

Wheely is a privacy-first luxury ride-hailing platform founded in 2010 and headquartered in Brackenbury Village, London. The company operates in major cities including Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeur services with a rigorous driver certification process, backed by $43.1M in Series B funding.

🎁 Benefits

Wheely offers a stock options plan, monthly credits for rides, generous training allowances, comprehensive healthcare benefits, and a daily lunch allowance.

🌟 Culture

Wheely prioritizes quality and customer satisfaction through a strict driver certification process, ensuring a premium experience for clients.


Data Engineer

Wheely • London, England, United Kingdom

Posted 3w ago · 🏢 Hybrid · Mid-Level · Data Engineer · 📍 London

Skills & Technologies

SQL · Python · Kafka · Airflow · Snowflake · dbt · Metabase · MLflow · Census

Overview

Wheely is seeking a Data Engineer to strengthen its Data Team by optimizing data integration pipelines and providing a seamless data experience across the organization. You'll work with technologies like SQL, Python, and Kafka. This role requires 3+ years of experience in data engineering or MLOps.

Job Description

Who you are

You have 3+ years of experience in Data Infrastructure, Data Engineering, or MLOps roles, with:

  • A strong understanding of analytical databases such as Snowflake, Redshift, and BigQuery, including troubleshooting and configuring them for optimal performance
  • Fluency in SQL and Python for managing and manipulating data
  • Experience deploying, configuring, and monitoring data pipelines, particularly with tools like Kafka and Airflow
  • A structured approach to data modeling, applying performance-tuning techniques to improve efficiency
  • Skill in containerizing applications and code with Docker and Kubernetes, essential for modern data workflows
  • The ability to identify performance bottlenecks, plus experience researching and integrating open-source technologies for data ingestion, modeling, and BI reporting
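As a toy illustration of the SQL-plus-Python work this paragraph describes (not part of the posting itself), the sketch below deduplicates change-stream records with a window function, using Python's stdlib sqlite3 module as a stand-in for a warehouse like Snowflake. The table and column names are invented for the example.

```python
# Hypothetical example: keep only the latest version of each record, the kind
# of dedup step needed when a CDC stream delivers multiple versions of a row.
# Uses stdlib sqlite3 in place of a real warehouse; schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rides (ride_id TEXT, city TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO rides VALUES (?, ?, ?)",
    [
        ("r1", "London", "2025-01-01"),
        ("r1", "London", "2025-02-01"),  # later version of the same ride
        ("r2", "Paris", "2025-01-15"),
    ],
)

# ROW_NUMBER() partitioned by the key, newest first, then keep rank 1.
latest = conn.execute(
    """
    SELECT ride_id, city, updated_at FROM (
        SELECT *, ROW_NUMBER() OVER (
            PARTITION BY ride_id ORDER BY updated_at DESC
        ) AS rn
        FROM rides
    ) WHERE rn = 1
    ORDER BY ride_id
    """
).fetchall()
print(latest)  # [('r1', 'London', '2025-02-01'), ('r2', 'Paris', '2025-01-15')]
```

In a real warehouse the same `ROW_NUMBER()` pattern would typically live in a dbt model rather than inline Python.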

Desirable

Experience with MLOps tools such as MLflow, and familiarity with BI reporting tools like Metabase and Observable, would be advantageous. You are comfortable working in a team environment and take a proactive approach to feature requests, bug fixes, and data quality issues. Your intermediate level of English allows you to communicate effectively with business users and data scientists.

What you'll do

As a Data Engineer at Wheely, you will:

  • Enhance the Data Team by implementing architectural best practices and optimizing low-level processes
  • Support the evolution of data integration pipelines built on technologies such as Debezium and Kafka, ensuring data flows seamlessly across the organization
  • Model data with dbt and work with database engines like Snowflake to create efficient data structures
  • Apply MLOps practices using tools like Airflow and MLflow, contributing to the deployment and monitoring of machine learning models
  • Address feature requests from business units, fix bugs, and resolve data quality issues to maintain the integrity of Wheely's data systems
  • Enforce code quality through automated testing and coding standards, ensuring the team delivers high-quality data solutions
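The automated data-quality testing mentioned above might look, in miniature, something like the sketch below: a pure-Python check that flags missing fields and duplicate keys in a batch before it is loaded. The field names and rules are invented for illustration and are not from the posting.

```python
# Hypothetical example: a minimal batch-level data-quality check of the sort
# that could run as an automated test in a pipeline. Schema is invented.
def check_batch(rows, key="ride_id", required=("ride_id", "city")):
    """Return a list of human-readable issues found in a batch of records."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        for field in required:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
        k = row.get(key)
        if k in seen:
            issues.append(f"row {i}: duplicate {key}={k!r}")
        seen.add(k)
    return issues

batch = [
    {"ride_id": "r1", "city": "London"},
    {"ride_id": "r1", "city": "Paris"},  # duplicate key
    {"ride_id": "r3", "city": ""},       # missing city
]
print(check_batch(batch))
```

In practice, checks like these are usually expressed as dbt tests or pipeline assertions rather than hand-rolled functions, but the shape of the logic is the same.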

What we offer

Wheely provides a supportive, in-person work culture while allowing flexible working hours and the option to work from home when needed. We offer a monthly credit for Wheely journeys, a lunch allowance, and professional development subsidies to help you grow in your career. You will receive top-notch equipment to do your best work, and a relocation allowance depending on your role level. Join us in building a platform that prioritizes user privacy while delivering five-star service across millions of rides in multiple cities.
