Wheely

About Wheely

Luxury ride-hailing with a focus on privacy

🏢 Tech · 👥 201-500 employees · 📅 Founded 2010 · 📍 Brackenbury Village, London, UK · 💰 $43.1M raised · ⭐ 3.8
B2C · B2B · Marketplace · Transport · Mobility

Key Highlights

  • Founded in 2010, headquartered in London, UK
  • Available in London, Moscow, Paris, and Dubai
  • $43.1M raised in Series B funding
  • Thousands of certified chauffeurs driving under the Wheely brand

Wheely is a privacy-first luxury ride-hailing platform founded in 2010 and headquartered in Brackenbury Village, London. The company operates in major cities such as Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeur services backed by a rigorous driver certification process.

🎁 Benefits

Wheely offers a stock options plan, monthly ride credits, generous training allowances, comprehensive healthcare benefits, and a daily lunch allowance.

🌟 Culture

Wheely prioritizes quality and customer satisfaction through a strict driver certification process, ensuring a premium experience for clients.

🌐 Website · 💼 LinkedIn · 𝕏 Twitter · All 38 jobs →

Data Engineer

Wheely • Nicosia, Cyprus

Posted 6h ago · 🏢 Hybrid · Mid-Level · Data Engineer · 📍 Nicosia

Skills & Technologies

SQL · Python · Kafka · Airflow · Snowflake · dbt · Metabase · MLflow · Census

Overview

Wheely is seeking a Data Engineer to enhance their Data Team by optimizing data integration pipelines and supporting business users. You'll work with technologies like SQL, Python, and Snowflake to ensure a seamless data experience.

Job Description

Who you are

You have 3+ years of experience as a Data Infrastructure Engineer, Data Engineer, or MLOps Engineer — you've successfully built and maintained data pipelines and have a strong understanding of data architecture. Your expertise includes analytical databases like Snowflake, Redshift, and BigQuery, and you're skilled in troubleshooting and optimizing these systems. You are fluent in SQL and Python, writing efficient queries and scripts to manipulate and analyze data. You have experience with data integration tools such as Debezium and Kafka, and you understand the importance of monitoring and maintaining data pipelines to ensure reliability and performance.
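To give a feel for the Debezium-and-Kafka integration work mentioned above: Debezium delivers each row change as a JSON envelope on a Kafka topic, and a consumer unpacks it before loading downstream. A minimal sketch in Python (the envelope fields follow Debezium's standard `payload` format; the sample event itself is invented):

```python
import json

def parse_debezium_event(raw: bytes) -> dict:
    """Extract the operation and row state from a Debezium change event.

    Debezium wraps each change in an envelope: payload.op is the operation
    ('c' create, 'u' update, 'd' delete, 'r' snapshot read), payload.after
    holds the new row state (absent for deletes, where payload.before is used).
    """
    payload = json.loads(raw)["payload"]
    op = payload["op"]
    return {
        "op": op,
        "row": payload["before"] if op == "d" else payload["after"],
        "ts_ms": payload.get("ts_ms"),  # connector processing time, if present
    }

# Invented (abbreviated) update event for a hypothetical rides table
event = json.dumps({
    "payload": {
        "op": "u",
        "before": {"id": 1, "status": "requested"},
        "after": {"id": 1, "status": "completed"},
        "ts_ms": 1700000000000,
    }
}).encode()

print(parse_debezium_event(event)["row"]["status"])  # completed
```

In practice the consumer side would read these bytes from a Kafka topic and the parsed rows would be staged into an analytical database, but the envelope handling is the same.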

You are familiar with data modeling techniques and apply performance tuning strategies to enhance data processing efficiency. Your experience with containerization technologies like Docker and Kubernetes enables you to deploy applications seamlessly. You have a proactive approach to identifying performance bottlenecks and are comfortable researching and integrating open-source technologies to improve data ingestion and reporting processes. Your communication skills allow you to collaborate effectively with business users and data scientists, ensuring that their data needs are met.

Desirable

Experience with ML Ops tools such as Airflow and MLflow is a plus, as is familiarity with BI reporting tools like Metabase and Observable. You are open to learning and adapting to new technologies, and you thrive in a collaborative environment where you can contribute to team success.

What you'll do

As a Data Engineer at Wheely, you will enhance the Data Team by implementing architectural best practices and optimizing data integration pipelines. You will support business users and data scientists by providing a seamless data experience, ensuring that they have access to high-quality data for their analyses. Your responsibilities will include deploying and configuring data pipelines using tools like Kafka and Airflow, as well as managing data modeling processes with dbt. You will work with analytical databases such as Snowflake to ensure efficient data storage and retrieval.

You will also be responsible for enforcing code quality and implementing automated testing to maintain high standards in data engineering practices. Collaborating with various business units, you will address feature requests, bug fixes, and data quality issues, ensuring that the data infrastructure meets the evolving needs of the organization. Your role will involve identifying performance bottlenecks and implementing solutions to enhance data processing efficiency.
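The automated testing described above often takes the form of dbt's built-in schema tests (`not_null`, `unique`). Stripped of the SQL they compile to, those checks reduce to something like this pure-Python sketch (the table and column names are invented for illustration):

```python
from collections import Counter

def not_null_violations(rows: list[dict], column: str) -> list[int]:
    """Indices of rows where `column` is missing or NULL (cf. dbt's not_null test)."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def unique_violations(rows: list[dict], column: str) -> list:
    """Values of `column` that appear more than once (cf. dbt's unique test)."""
    counts = Counter(row.get(column) for row in rows)
    return [value for value, n in counts.items() if n > 1]

# Invented sample of a rides table: one NULL fare, one duplicated key
rides = [
    {"ride_id": 1, "fare": 42.0},
    {"ride_id": 2, "fare": None},
    {"ride_id": 2, "fare": 17.5},
]

print(not_null_violations(rides, "fare"))   # [1]
print(unique_violations(rides, "ride_id"))  # [2]
```

A pipeline would run checks like these after each load and fail (or alert) on any violations, rather than letting bad rows flow into reporting.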

What we offer

Wheely provides a supportive work environment with an in-person culture while allowing flexible working hours and the option to work from home when needed. We offer an employee stock options plan, a relocation allowance, and a lunch allowance to ensure your well-being. Our comprehensive medical insurance includes dental coverage, and we provide best-in-class equipment to support your work. Additionally, we offer professional development subsidies to help you grow in your career. Join us at Wheely and be part of a team that values user privacy and strives to deliver five-star service across millions of rides in multiple cities.
