
Luxury ride-hailing with a focus on privacy
Wheely is a privacy-first luxury ride-hailing platform founded in 2010 and headquartered in Brackenbury Village, London. The company operates in major cities including Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeur services backed by a rigorous driver certification process.
Wheely offers a stock options plan, monthly ride credits, generous training allowances, comprehensive healthcare benefits, and a daily lunch allowance.

Wheely • London, England, United Kingdom
Wheely is seeking a Data Engineer to strengthen its Data Team by optimizing data integration pipelines and supporting business users. You'll work with technologies such as SQL, Python, Kafka, and Snowflake to ensure a seamless data experience.
You have 3+ years of experience in Data Infrastructure, Data Engineering, or MLOps roles, with a strong foundation in analytical databases such as Snowflake, Redshift, and BigQuery. Your expertise includes deploying and monitoring data pipelines with tools like Kafka and Airflow, ensuring that data flows smoothly and efficiently across systems. You are fluent in SQL and Python, allowing you to manipulate and analyze data effectively. Your data modeling experience emphasizes a DRY, structured approach, and you apply performance tuning techniques to improve data accessibility and usability. You are also skilled in containerizing applications with Docker and deploying them on Kubernetes, which streamlines deployment processes. Your ability to identify performance bottlenecks keeps the data infrastructure robust and responsive to business needs.
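The SQL-plus-Python fluency described above might look like the following minimal sketch, which aggregates ride data in SQL and post-processes the result in Python. It uses only the standard-library sqlite3 module; the table and column names (rides, city, fare) are hypothetical examples, not Wheely's actual schema.

```python
import sqlite3

# In-memory database with a hypothetical rides table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rides (id INTEGER PRIMARY KEY, city TEXT, fare REAL)")
conn.executemany(
    "INSERT INTO rides (city, fare) VALUES (?, ?)",
    [("London", 85.0), ("Paris", 60.0), ("London", 120.0)],
)

# Aggregate in SQL, then shape the result with Python.
rows = conn.execute(
    "SELECT city, COUNT(*), AVG(fare) FROM rides GROUP BY city ORDER BY city"
).fetchall()
summary = {city: {"rides": n, "avg_fare": round(avg, 2)} for city, n, avg in rows}
print(summary)
# → {'London': {'rides': 2, 'avg_fare': 102.5}, 'Paris': {'rides': 1, 'avg_fare': 60.0}}
```

In a production setting the same pattern applies against a warehouse such as Snowflake, with the heavy aggregation pushed into SQL and Python reserved for orchestration and final shaping.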
Experience with researching and integrating open-source technologies related to data ingestion, data modeling, and BI reporting is a plus. You have a proactive mindset, always looking for ways to improve data quality and support business units with feature requests and bug fixes. Your intermediate level of English allows you to communicate effectively with team members and stakeholders.
As a Data Engineer at Wheely, you will strengthen the Data Team by implementing architectural best practices and optimizing low-level processes. You will support the evolution of data integration pipelines, using tools like Debezium and Kafka to ensure that data is accurately captured and processed. Your role will involve data modeling with dbt and working with database engines such as Snowflake to create efficient data structures. You will also handle ML Ops with Airflow and MLflow, ensuring that machine learning models are effectively integrated into the data pipeline. In addition, you will build BI reporting with Metabase and Observable, enabling business users to derive insights from data easily. You will support business units with feature requests, bug fixes, and data quality issues, ensuring that the data infrastructure meets the needs of the organization. Enforcing code quality, automated testing, and a consistent code style will also be part of your responsibilities, contributing to a high standard of work within the team.
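The Debezium-and-Kafka capture flow mentioned above can be sketched as folding a stream of change-data-capture events into a local replica. This is an illustrative toy only: the event shape loosely follows Debezium's envelope (op codes "c"/"u"/"d"), but in practice events would arrive via a Kafka consumer rather than a Python list, and the row contents are invented.

```python
# Minimal CDC sketch: apply change events to a key -> row mapping.
# Op codes follow Debezium's convention: "c" = create, "u" = update, "d" = delete.

def apply_events(table: dict, events: list) -> dict:
    """Fold a stream of change events into a replica table."""
    for ev in events:
        key = ev["key"]
        if ev["op"] in ("c", "u"):   # create / update: upsert the row
            table[key] = ev["after"]
        elif ev["op"] == "d":        # delete: drop the row if present
            table.pop(key, None)
    return table

# Hypothetical event stream for two rides (illustrative data).
events = [
    {"op": "c", "key": 1, "after": {"id": 1, "status": "requested"}},
    {"op": "u", "key": 1, "after": {"id": 1, "status": "completed"}},
    {"op": "c", "key": 2, "after": {"id": 2, "status": "requested"}},
    {"op": "d", "key": 2, "after": None},
]
replica = apply_events({}, events)
print(replica)
# → {1: {'id': 1, 'status': 'completed'}}
```

The design choice worth noting is that creates and updates are both handled as upserts, which makes event replay idempotent, a useful property when a Kafka consumer reprocesses a partition after a restart.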
Wheely provides a supportive in-person culture while allowing flexible working hours and the option to work from home when needed. You will receive a monthly credit for Wheely journeys, a lunch allowance, and professional development subsidies to help you grow in your career. We also offer a cycle-to-work scheme and top-notch equipment to ensure you have everything you need to succeed. Depending on your role level, a relocation allowance may be available to assist with your transition to our team. We value your personal information and ensure it is collected, stored, and processed in accordance with Wheely’s Candidate Privacy Notice.