Socket

About Socket

Simplifying blockchain integration for developers everywhere

🏢 Tech · 👥 11–50 employees

Key Highlights

  • Over 100 trusted wallets and apps, including Coinbase & MetaMask
  • Processed $10 billion+ in transaction volume
  • 7 million+ transactions across multiple blockchains
  • Headquartered in San Francisco, CA

Socket is pioneering the first Chain Abstraction protocol, allowing developers to seamlessly integrate with any app, user, and asset across various rollups and chains. Trusted by over 100 wallets and applications, including MetaMask, Coinbase, and OpenSea, Socket has processed more than $10 billion in transaction volume.

🎁 Benefits

Socket offers competitive salaries, equity options, flexible remote work policies, and generous PTO to support work-life balance.

🌟 Culture

Socket fosters a culture of innovation and collaboration, prioritizing engineering excellence and a remote-friendly environment that encourages creativity.


Senior Data Engineer

Socket • United States

Posted 1 day ago · Senior · Data Engineer · 📍 United States
Apply Now →

Skills & Technologies

PostgreSQL · Kafka · ClickHouse

Overview

Socket is hiring a Senior Data Engineer to build and maintain scalable data infrastructure. You'll work with technologies like PostgreSQL, Kafka, and ClickHouse to handle high-volume data streams. This role requires significant experience in data engineering and pipeline development.

Job Description

Who you are

You have 5+ years of experience in data engineering, with a strong background in designing and building scalable data pipelines. You understand the intricacies of data ingestion, processing, and transformation, and have a proven track record of working with high-volume event streams and historical data. Your expertise includes developing APIs that deliver analytics and trend reports, ensuring that data is accessible and reliable for both internal teams and external customers.

You are skilled in optimizing data storage and query performance, using systems like ClickHouse, Kafka, and PostgreSQL. You have experience implementing data quality monitoring to ensure accuracy and completeness across datasets. You thrive in collaborative environments and are comfortable working across the stack, from ingestion pipelines to analytics APIs.

You are customer-obsessed, always prioritizing the needs of users and striving to exceed their expectations. You take ownership of your work, are non-territorial about your role, and are willing to wear many hats to contribute to the team's success. You are proactive in seeking feedback and continuously improving your skills and processes.

Desirable

Experience with real-time analytics and event-driven architectures is a plus, and familiarity with data governance and security practices will help you excel in this role. Experience with cloud platforms and data warehousing solutions is also welcome.

What you'll do

In this role, you will design and build scalable data pipelines that ingest, process, and transform high-volume event streams and historical data. You will develop and maintain APIs that deliver analytics, trend reports, and drill-down capabilities to internal teams and external customers. Your work will ensure that data flows reliably and is accessible when teams need it.
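To make the "pipelines feeding analytics APIs" workflow concrete, here is a miniature sketch of the kind of rollup a trend-report endpoint might serve. The function name, event shape, and field names are illustrative assumptions, not Socket's actual schema:

```python
from collections import defaultdict
from datetime import datetime

def daily_event_trend(events):
    """Roll raw events up into a per-day count trend — the sort of
    aggregate a trend-report API would return to a dashboard.
    Each event is assumed to carry an ISO-8601 timestamp in "ts"."""
    counts = defaultdict(int)
    for e in events:
        day = datetime.fromisoformat(e["ts"]).date().isoformat()
        counts[day] += 1
    # Sort by day so the trend reads chronologically
    return dict(sorted(counts.items()))

events = [
    {"ts": "2025-01-01T09:00:00", "type": "download"},
    {"ts": "2025-01-01T17:30:00", "type": "download"},
    {"ts": "2025-01-02T08:15:00", "type": "scan"},
]
print(daily_event_trend(events))  # {'2025-01-01': 2, '2025-01-02': 1}
```

In production this aggregation would typically run inside the warehouse (e.g. as a ClickHouse query) rather than in application code; the sketch only shows the shape of the transformation.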

You will build robust infrastructure for data quality monitoring, ensuring accuracy and completeness across customer and artifact datasets. Your responsibilities will include optimizing data storage and query performance using systems like ClickHouse, Kafka, and PostgreSQL to support both real-time and batch use cases.
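One pattern behind "supporting both real-time and batch use cases" with a column store is micro-batching: columnar engines like ClickHouse favor large inserts, so stream consumers buffer rows and flush on a size or age threshold. The class below is a minimal, library-free sketch of that pattern (the `MicroBatcher` name and parameters are hypothetical, not a real ClickHouse or Kafka client):

```python
import time

class MicroBatcher:
    """Buffer incoming rows and flush them in batches — the pattern
    commonly used to load high-volume streams (e.g. from Kafka) into
    column stores, which perform best with large, infrequent inserts."""

    def __init__(self, sink, max_rows=1000, max_age_s=5.0):
        self.sink = sink            # callable that receives a list of rows
        self.max_rows = max_rows    # flush when the buffer reaches this size
        self.max_age_s = max_age_s  # ...or when the oldest row is this old
        self.buf = []
        self.first_ts = None

    def add(self, row, now=None):
        now = time.monotonic() if now is None else now
        if not self.buf:
            self.first_ts = now
        self.buf.append(row)
        if len(self.buf) >= self.max_rows or now - self.first_ts >= self.max_age_s:
            self.flush()

    def flush(self):
        if self.buf:
            self.sink(self.buf)
            self.buf = []
            self.first_ts = None

batches = []
b = MicroBatcher(batches.append, max_rows=3, max_age_s=60)
for i in range(7):
    b.add({"id": i})
b.flush()  # drain the partial tail batch
print([len(batch) for batch in batches])  # [3, 3, 1]
```

A real loader would add retry/backoff on failed flushes and a background timer so age-based flushes fire without waiting for the next `add` call.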

You will implement usage tracking, auditing, and event logging to enhance the overall data ecosystem. Collaboration with cross-functional teams will be essential as you work to ensure that data insights are effectively utilized to secure the software supply chain.
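As a rough illustration of the auditing and event-logging responsibility, here is a toy hash-chained audit record: each entry embeds a digest of the previous one, so tampering with history is detectable. Everything here (function name, fields, the `pkg:` resource format) is a hypothetical sketch, not Socket's logging scheme:

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_event(actor, action, resource, prev_hash=""):
    """Build a structured audit record plus a SHA-256 digest over it.
    Chaining each record to the previous digest makes the log
    tamper-evident: altering any entry breaks every later hash."""
    body = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "actor": actor,
        "action": action,
        "resource": resource,
        "prev": prev_hash,
    }
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body, digest

e1, h1 = audit_event("ci-bot", "scan", "pkg:lodash")
e2, h2 = audit_event("ci-bot", "publish", "pkg:lodash", prev_hash=h1)
print(e2["prev"] == h1)  # True
```

In practice such records would be appended to durable storage (or a Kafka topic) rather than returned, but the chaining idea is the same.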

What we offer

At Socket, you will be part of a mission-driven team that values innovation and collaboration. We offer a competitive salary and benefits package, along with opportunities for professional growth and development. You will have the chance to work with cutting-edge technologies and contribute to a product that helps organizations manage their open-source code securely. Join us in making a significant impact in the software development and security landscape.

Interested in this role?

Apply now or save it for later. Get alerts for similar jobs at Socket.

Apply Now →
Get Job Alerts