Jobgether is looking for a Staff Engineer - Data Platform and Lakehouse to provide architectural leadership and hands-on engineering for our core data infrastructure. In this role, you will design and implement a cloud-native platform that powers advanced analytics and AI applications.
What You'll Do
- Design, implement, and maintain scalable, secure data platforms on Databricks and AWS.
- Provide architectural leadership and ensure consistency, resilience, and performance across distributed data processing systems.
- Develop reusable data pipelines, workflows, and ETL/ELT processes using Databricks Workflows, Airflow, or AWS Glue.
- Translate business objectives into technical platform capabilities in collaboration with product and cross-functional teams.
- Support AI/ML initiatives, including feature engineering, model deployment, and real-time data processing.
- Drive adoption of data governance standards, including access control, metadata management, lineage, and compliance.
- Establish CI/CD pipelines and DevOps automation for data infrastructure.
- Evaluate and integrate emerging technologies to enhance development, testing, deployment, and monitoring practices.
What We're Looking For
- 15+ years of experience in software development, covering the full SDLC from design to deployment and support.
- Proven ability to design and implement cloud-native data architectures on Databricks and AWS.
- Deep expertise in Apache Spark, including performance tuning and distributed computing best practices.
- Advanced proficiency in Python and SQL, with solid software engineering foundations.
- Hands-on experience with Databricks Unity Catalog, Feature Store, Delta Live Tables, and data pipeline orchestration tools.
- Strong understanding of ETL/ELT design, data quality validation, observability, and monitoring practices.
- Experience supporting AI/ML workloads and SaaS product integrations.
- Strong communication and collaboration skills for working with engineers, product managers, and data scientists.
- Knowledge of data governance, security, compliance, and metadata management best practices.
- Strategic mindset with the ability to align technical decisions with business goals.
Nice to Have
- Experience with Azure or GCP.
- Experience with Monte Carlo.
Technical Stack
- Databricks, AWS, Apache Spark, Python, SQL
- Databricks Unity Catalog, Databricks Feature Store, Delta Live Tables
- Airflow, AWS Glue, Monte Carlo
Team & Environment
You will work closely with cross-functional teams, including product managers, engineers, and data scientists.
Benefits & Compensation
- Competitive salary range: $170,000–$190,000 plus bonus potential.
- Flexible working options, including fully remote or hybrid schedules in major metropolitan areas (NYC, Boston, Chicago).
- Comprehensive financial, health, and lifestyle benefits.
- Generous annual leave, floating holidays, volunteering days, and a birthday day off.
- Learning and development programs to support career growth.
- Collaborative and inclusive culture with DE&I initiatives and regular social/networking events.
- Employee Assistance Program providing wellbeing, counselling, legal, and financial support.
Work Mode
This role is hybrid and open to candidates in the United States.
Jobgether provides equal employment opportunities.