
Luxury ride-hailing with a focus on privacy
Wheely is a privacy-first luxury ride-hailing platform founded in 2010 and headquartered in Brackenbury Village, London. The company operates in major cities including Moscow, St. Petersburg, Paris, and Dubai, providing high-quality chauffeur services backed by a rigorous driver certification process. With $4...
Wheely offers a stock options plan, monthly ride credits, generous training allowances, comprehensive healthcare benefits, and a daily lunch allowance.
Wheely prioritizes quality and customer satisfaction through a strict driver certification process, ensuring a premium experience for clients.

Wheely • Dubai, United Arab Emirates
Wheely is seeking a Data Engineer to strengthen its Data Team by optimizing data integration pipelines and delivering reliable, seamless data to users across the company. You'll work with technologies such as SQL, Python, Kafka, and Snowflake. The role requires 3+ years of experience in data infrastructure or related fields.
You bring:
- 3+ years of experience as a Data Infrastructure Engineer, Data Engineer, or MLOps Engineer, with a track record of troubleshooting and optimizing data systems.
- Hands-on expertise with analytical databases such as Snowflake, Redshift, or BigQuery, including configuring and monitoring deployments.
- Skill in building data pipelines with tools like Kafka and Airflow, ensuring smooth data flow and integration.
- A structured, efficient approach to data modeling, applying performance-tuning techniques that improve data accessibility and usability.
- Proficiency in containerizing applications and code with Docker and Kubernetes for scalable, efficient deployment.
- Fluency in SQL and Python for manipulating and analyzing data, as in the sketch below.
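To give a flavor of the pipeline work this role describes, here is a minimal sketch of a Python process that consumes events from Kafka and lands them in Snowflake. This is an illustration only, not Wheely's actual pipeline: the broker address, topic, table, and credentials are all hypothetical.

```python
# Minimal sketch: consume JSON events from Kafka and land them in Snowflake.
# Broker, topic, table, and credentials are illustrative placeholders only.
import json

from confluent_kafka import Consumer
import snowflake.connector

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # hypothetical broker address
    "group.id": "rides-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["rides.completed"])   # hypothetical topic name

conn = snowflake.connector.connect(
    account="my_account", user="loader", password="...",  # placeholder credentials
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cursor = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)          # wait up to 1s for the next message
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Land the raw event as a VARIANT row; downstream models reshape it.
        cursor.execute(
            "INSERT INTO raw_ride_events (payload) SELECT PARSE_JSON(%s)",
            (json.dumps(event),),
        )
finally:
    consumer.close()
    conn.close()
```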
As a Data Engineer at Wheely, you will:
- Strengthen the Data Team by applying architectural best practices and optimizing low-level processes.
- Evolve data integration pipelines built on tools like Debezium and Kafka, ensuring data flows seamlessly across systems.
- Model data with dbt and work with database engines such as Snowflake to maintain high data-quality standards.
- Support ML Ops with Airflow and MLflow, contributing to the deployment and monitoring of machine learning models (see the sketch after this list).
- Build BI reporting in Metabase and Observable, helping business users derive insights from data.
- Cover business units' feature requests, bug fixes, and data quality issues, ensuring data-related needs are met promptly.
- Enforce code quality, automated testing, and a consistent code style as part of your daily work, contributing to a robust and efficient data infrastructure.
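As one illustration of how these tools can fit together, the sketch below shows an Airflow DAG that rebuilds dbt models and then logs a training metric to MLflow. It is a hedged example under assumed names, not Wheely's actual setup: the DAG id, dbt selector, experiment name, and metric are all hypothetical.

```python
# Sketch of an Airflow DAG tying the posting's tools together: dbt for modeling,
# MLflow for experiment tracking. All names here are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def train_and_log():
    """Train a placeholder model and record its metric in MLflow."""
    import mlflow

    mlflow.set_experiment("eta-model")           # hypothetical experiment name
    with mlflow.start_run():
        mae = 42.0                               # stand-in for a real evaluation
        mlflow.log_metric("mae_seconds", mae)


with DAG(
    dag_id="daily_analytics",                    # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Rebuild the warehouse models in Snowflake via dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --select marts",   # hypothetical dbt selector
    )

    # Re-train and log the model once fresh marts are available.
    train = PythonOperator(task_id="train_model", python_callable=train_and_log)

    dbt_run >> train
```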
Wheely provides a competitive employee stock options plan, relocation allowance, and lunch allowance to support your transition and daily needs. We also offer comprehensive medical insurance, including dental coverage, ensuring your health and well-being are prioritized. You will receive best-in-class equipment to perform your job effectively and professional development subsidies to help you grow in your career. While we have an in-person culture, we understand the importance of flexibility, allowing for remote work when needed. Join us in redefining premium transportation and be part of a fast-growing scale-up that values exceptional talent and innovative solutions.