Leethub

Curated tech jobs from FAANG and top companies worldwide.

About Rackspace

Your partner in managed cloud solutions

🏢 Tech • 👥 5K–10K employees • 📅 Founded 1998 • 📍 San Antonio, Texas, United States

Key Highlights

  • Headquartered in San Antonio, Texas
  • Over 200,000 customers including BMW and NASA
  • $1.5B+ raised in funding
  • Approximately 7,000 employees worldwide

Rackspace Technology, Inc., headquartered in San Antonio, Texas, is a leading managed cloud computing company that provides services such as cloud migration, managed hosting, and multi-cloud solutions. With over 200,000 customers, including major brands like BMW and NASA, Rackspace has raised over $1.5 billion in funding.

🎁 Benefits

Employees enjoy competitive salaries, stock options, generous PTO policies, remote work flexibility, and comprehensive health benefits.

🌟 Culture

Rackspace fosters a customer-centric culture with a strong emphasis on service excellence and innovation in cloud technology.


Data Engineer (Infra/DevOps Focus)

Rackspace • Vietnam - Remote

Posted 2 months ago • 🏠 Remote • Mid-Level • Data Engineer • 📍 Vietnam
Apply Now →

Job Description

We are looking for a highly skilled Azure Data Engineer with expert knowledge of cloud infrastructure and DevOps automation. This hybrid role is responsible for designing, building, optimizing, and automating our entire end-to-end data platform within the Microsoft Azure ecosystem. The ideal candidate will ensure our data solutions are scalable, reliable, and deployed using modern Infrastructure as Code (IaC) and CI/CD practices.

Key Responsibilities

Data Platform Development & Engineering

  • Design & Implement ETL/ELT: Develop, optimize, and maintain scalable data pipelines using Python, SQL, and core Azure data services.
  • Azure Data Services Management: Architect and manage key Azure data components, including:
      • Data Lakes: provisioning and structuring data within Azure Data Lake Storage (ADLS Gen2).
      • Data Processing: implementing data transformation and analysis logic using Azure Data Factory (ADF), Azure Synapse Pipelines, and Azure Databricks (Spark/PySpark).
      • Data Warehousing: designing and optimizing the enterprise Data Warehouse in Azure Synapse Analytics (SQL Pool).
  • Data Modeling and Quality: Define and enforce data modeling standards and implement data quality checks within the pipelines.
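The data quality checks described above are typically small, reusable gates that run inside each pipeline stage. A minimal sketch in plain Python (column names and the null-rate threshold are hypothetical examples, not part of the role description):

```python
# Illustrative pipeline data-quality gate.
# Column names and the 5% null-rate threshold are hypothetical.

def check_quality(rows, required_columns, max_null_rate=0.05):
    """Return a list of human-readable violations for a batch of records."""
    violations = []
    if not rows:
        return ["batch is empty"]
    for col in required_columns:
        missing = sum(1 for r in rows if r.get(col) is None)
        null_rate = missing / len(rows)
        if null_rate > max_null_rate:
            violations.append(
                f"{col}: null rate {null_rate:.0%} exceeds {max_null_rate:.0%}"
            )
    return violations

batch = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},
    {"order_id": None, "amount": 5.00},
]
print(check_quality(batch, ["order_id", "amount"]))
```

In a real pipeline the same idea would be expressed over Spark DataFrames in Databricks, with violations failing the pipeline run rather than just printing.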

Cloud Infrastructure & DevOps Automation

  • Infrastructure as Code (IaC): Design, manage, and provision all Azure data resources (ADLS, Synapse, ADF, Databricks clusters) using Terraform or Azure Resource Manager (ARM) templates/Bicep.
  • CI/CD Implementation: Build and maintain automated Continuous Integration/Continuous Deployment (CI/CD) pipelines for all code (data, infrastructure, and application) using Azure DevOps or GitHub Actions.
  • Containerization & Compute: Use Docker and manage deployment environments with Azure Kubernetes Service (AKS) or Azure Container Instances (ACI) when required for data applications.
  • Monitoring, Logging & Security: Configure comprehensive monitoring and alerting using Azure Monitor and Log Analytics, and implement network security and access controls (RBAC) across the data platform.
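A common guardrail in IaC CI/CD pipelines like those above is a plan gate: parse the output of `terraform show -json <planfile>` and block the apply step if any resource would be destroyed. A minimal sketch (the resource addresses are hypothetical):

```python
import json

# Minimal CI gate: fail the pipeline if a Terraform plan destroys resources.
# Input follows the `terraform show -json` plan format; addresses are hypothetical.

def destructive_changes(plan_json: str) -> list:
    plan = json.loads(plan_json)
    return [
        rc["address"]
        for rc in plan.get("resource_changes", [])
        if "delete" in rc["change"]["actions"]
    ]

sample_plan = json.dumps({
    "resource_changes": [
        {"address": "azurerm_storage_account.datalake",
         "change": {"actions": ["no-op"]}},
        {"address": "azurerm_synapse_workspace.dw",
         "change": {"actions": ["delete", "create"]}},  # a replacement
    ]
})

doomed = destructive_changes(sample_plan)
if doomed:
    print("Blocked: plan destroys", doomed)
```

In Azure DevOps or GitHub Actions this would run as a step between `terraform plan` and `terraform apply`, exiting nonzero to stop the job.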

Required Skills & Qualifications

  • Azure Cloud: Strong hands-on experience designing and deploying end-to-end data solutions within the Azure ecosystem.
  • Programming: High proficiency in Python (including PySpark) and expert knowledge of SQL.
  • DevOps & IaC: Proven, production-level experience with Terraform (preferred) or ARM/Bicep for automating Azure infrastructure deployment.
  • CI/CD: Experience setting up CI/CD workflows using Azure DevOps Pipelines or GitHub Actions.
  • Data Tools: Deep working knowledge of Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
  • Orchestration: Experience with workflow orchestration tools such as Azure Data Factory or Apache Airflow.
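At their core, orchestrators such as ADF and Airflow run each task only after its upstream dependencies finish. The idea reduces to a topological ordering of a task graph, sketched here with Python's standard library (task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Toy model of orchestration: each task maps to the set of tasks it depends on.
# Task names are hypothetical; real orchestrators add retries, scheduling, etc.
dag = {
    "ingest_raw": set(),
    "clean": {"ingest_raw"},
    "load_warehouse": {"clean"},
    "refresh_dashboard": {"load_warehouse"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)
```

Airflow DAG definitions express the same dependency graph in Python, which is one reason Python proficiency and orchestration experience pair naturally in this role.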

Preferred Qualifications

  • Azure certifications such as Azure Data Engineer Associate (DP-203) or Azure DevOps Engineer Expert (AZ-400).
  • Familiarity with data governance tools such as Azure Purview.
  • Experience with real-time data ingestion using Azure Event Hubs or Azure Stream Analytics.
 
