GEICO is seeking a Data Engineer with strong expertise in Azure Databricks. This role will focus on building, supporting, and administering scalable, high-performance data pipelines that power real-time and batch analytics for trading, risk, and operational use cases.
What You'll Do
- Design, develop, and maintain robust data pipelines using Azure Databricks, Confluent, Delta Live Tables (DLT), Spark, and Delta Lake to support trading and market data workflows.
- Independently study the existing data pipelines and enhance them, ensuring continuity, scalability, and performance improvements.
- Provide production pipeline support, including job monitoring, incident resolution, and performance tuning.
- Administer Databricks workspaces and Unity Catalog, including cluster configuration, job scheduling, access control, and workspace optimization.
- Build and maintain CI/CD pipelines using GitLab, enabling automated testing, deployment, and versioning of data engineering code.
- Follow and enforce best practices in code management, including modular design, code reviews, and documentation using GitLab workflows.
- Collaborate with fellow team members, business analysts, and data architects to understand data requirements and deliver high-quality solutions.
- Build reusable components and frameworks to accelerate development and ensure consistency across data platforms.
- Actively participate in Agile ceremonies (e.g., sprint planning, stand-ups, retrospectives) and contribute to continuous improvement of team processes.
What We're Looking For
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering.
- At least 2 years working with Azure Databricks.
- Strong proficiency in PySpark, SQL, and Python.
- Experience supporting production pipelines, including monitoring, alerting, and troubleshooting.
- Experience with GitLab CI/CD, including pipeline configuration, runners, and integration with cloud services.
- Familiarity with the financial capital markets domain, including market data feeds, order books, trade execution, and risk metrics.
- Proven ability to work effectively in Agile development environments.
Nice to Have
- Azure certifications (e.g., Azure Data Engineer Associate).
- Experience with real-time data processing using Kafka or Event Hubs.
Technical Stack
- Azure Databricks, Confluent, Delta Live Tables (DLT), Spark, Delta Lake
- PySpark, SQL, Python
- GitLab CI/CD
- Kafka, Event Hubs
GEICO is an equal opportunity employer and positively encourages applications from suitably qualified and eligible candidates regardless of gender, sexual orientation, marital or civil partner status, gender reassignment, race, colour, nationality, ethnic or national origin, religion or belief, disability or age.



