
Luxury ride-hailing with a focus on privacy
Wheely is a privacy-first luxury ride-hailing platform founded in 2010 and headquartered in Brackenbury Village, London. The company operates in major cities including Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeur services backed by a rigorous driver certification process. With $4...
Wheely offers a stock options plan, monthly credits for rides, generous training allowances, comprehensive healthcare benefits, and a daily lunch allo...
Wheely prioritizes quality and customer satisfaction by employing a strict driver certification process, ensuring a premium experience for clients. Th...

Wheely • London, England, United Kingdom
Wheely is seeking a Data Engineer to strengthen its Data Team by optimizing data integration pipelines and providing seamless data experiences. You'll work with technologies such as SQL, Python, and Kafka. This role requires 3+ years of experience in data engineering or MLOps.
You have 3+ years of experience in Data Infrastructure Engineer, Data Engineer, or MLOps Engineer roles and a strong understanding of analytical databases such as Snowflake, Redshift, and BigQuery, including troubleshooting and configuring them for optimal performance. You are fluent in SQL and Python, which lets you manage and manipulate data effectively. You have deployed, configured, and monitored data pipelines, particularly with tools like Kafka and Airflow, and you take a structured approach to data modeling, applying performance-tuning techniques to improve efficiency. You are also skilled at containerizing applications with Docker and Kubernetes, which is essential for modern data workflows. You can identify performance bottlenecks and have experience researching and integrating open-source technologies for data ingestion, modeling, and BI reporting.
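As a small illustration of the SQL-plus-Python fluency and performance-tuning skills described above, the sketch below uses SQLite as a lightweight stand-in for an analytical warehouse such as Snowflake or Redshift; the table, column names, and data are hypothetical, not Wheely's schema.

```python
import sqlite3

# SQLite stands in for an analytical database here; the "rides" table
# and its columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rides (id INTEGER PRIMARY KEY, city TEXT, fare REAL)")
conn.executemany(
    "INSERT INTO rides (city, fare) VALUES (?, ?)",
    [("London", 42.0), ("Paris", 35.5), ("Dubai", 60.0), ("London", 18.25)],
)

# Without an index, filtering by city forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT avg(fare) FROM rides WHERE city = ?", ("London",)
).fetchall()

# After adding an index, the engine can seek directly to matching rows --
# the kind of bottleneck diagnosis and fix the role calls for.
conn.execute("CREATE INDEX idx_rides_city ON rides (city)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT avg(fare) FROM rides WHERE city = ?", ("London",)
).fetchall()

print(plan_before)  # plan detail mentions a table SCAN
print(plan_after)   # plan detail mentions a SEARCH using the index
```

The same diagnose-then-index workflow applies in warehouse engines, though each has its own plan-inspection syntax.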
Experience with MLOps tools such as MLflow, and familiarity with BI reporting tools like Metabase and Observable, would be advantageous. You are comfortable working in a team and take a proactive approach to feature requests, bug fixes, and data quality issues. Your intermediate level of English allows you to communicate effectively with business users and data scientists.
As a Data Engineer at Wheely, you will strengthen the Data Team by implementing architectural best practices and optimizing low-level processes. You will support the evolution of data integration pipelines using technologies such as Debezium and Kafka, ensuring that data flows seamlessly across the organization. Your role will involve data modeling with dbt and working with database engines like Snowflake to create efficient data structures. You will also apply MLOps practices using tools like Airflow and MLflow, contributing to the deployment and monitoring of machine learning models. Your responsibilities will include addressing feature requests from business units, fixing bugs, and tackling data quality issues to maintain the integrity of our data systems. You will enforce code quality through automated testing and adherence to coding standards, ensuring that the team delivers high-quality data solutions.
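The data-quality and automated-testing responsibilities above can be pictured with a minimal, self-contained sketch: a batch-level quality gate that flags duplicates, missing fields, and invalid values before data moves downstream. The record shape, field names, and rules are hypothetical examples, not Wheely's actual checks.

```python
from dataclasses import dataclass

# Hypothetical record type for illustration; a real pipeline would
# validate rows pulled from Kafka or a staging table instead.
@dataclass
class Ride:
    ride_id: str
    city: str
    fare: float

def quality_issues(rides):
    """Return human-readable descriptions of data quality issues in a batch."""
    issues = []
    seen_ids = set()
    for ride in rides:
        if ride.ride_id in seen_ids:
            issues.append(f"duplicate ride_id: {ride.ride_id}")
        seen_ids.add(ride.ride_id)
        if not ride.city:
            issues.append(f"missing city for ride {ride.ride_id}")
        if ride.fare < 0:
            issues.append(f"negative fare for ride {ride.ride_id}")
    return issues

batch = [
    Ride("r1", "London", 42.0),
    Ride("r2", "", 18.0),       # missing city
    Ride("r2", "Paris", -5.0),  # duplicate id and negative fare
]
print(quality_issues(batch))
```

Checks like these are typically wired into automated tests or a pipeline step (e.g. an Airflow task or a dbt test) so that bad batches fail loudly rather than silently corrupting reports.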
Wheely provides a supportive work environment with an in-person culture while allowing flexible working hours and the option to work from home when needed. We offer a monthly credit for Wheely journeys, a lunch allowance, and professional development subsidies to help you grow in your career. You will receive top-notch equipment to do your best work, and a relocation allowance depending on your role level. Join us in building a platform that prioritizes user privacy while delivering five-star service across millions of rides in multiple cities.