Oscilar is hiring a Senior Data Engineer to design, build, and maintain the core data infrastructure that powers our AI-driven decisioning and risk management platform. In this role, you will deliver highly reliable, scalable data pipelines and storage solutions that support real-time analytics and critical machine learning models.
What You'll Do
- Architect and implement scalable ETL and data pipelines spanning ClickHouse, Postgres, Athena, and diverse cloud-native sources to support real-time risk management and advanced analytics.
- Design, develop, and optimize distributed data storage solutions for high performance and reliability at scale, serving mission-critical fraud detection and compliance models.
- Drive schema evolution, data modeling, and advanced optimizations for analytical and operational databases, including sharding, partitioning, and pipeline orchestration.
- Own the end-to-end data flow: integrate multiple internal and external sources, enforce data validation and lineage, and automate and monitor workflow reliability.
- Collaborate cross-functionally with engineers, product managers, and data scientists to deliver secure, scalable solutions that enable fast experimentation and robust operationalization of new ML/AI models.
- Champion radical ownership by identifying opportunities and implementing innovative technical and process solutions within a fast-moving, remote-first culture.
- Mentor and upskill team members, cultivate a learning environment, and contribute to a collaborative, mission-oriented culture.
What We're Looking For
- 5+ years in data engineering, including architecting and operating production ETL/ELT pipelines for real-time, high-volume analytic platforms.
- Deep proficiency with ClickHouse, Postgres, Athena, and distributed data systems like Kafka; proven experience with both batch and streaming pipeline design.
- Advanced programming in Python and SQL, with expertise in workflow orchestration (Airflow, Step Functions), CI/CD, and automated testing for data.
- Experience in high-scale, low-latency environments and an understanding of security, privacy, and compliance requirements for financial-grade platforms.
- Strong communication, business alignment, and documentation abilities, capable of translating complex technology into actionable value for customers and stakeholders.
- Alignment with Oscilar’s values: customer obsession, radical ownership, bold vision, efficient growth, and unified teamwork with a culture of trust and excellence.
Nice to Have
- Experience integrating Kafka with analytics solutions like ClickHouse.
- Knowledge of event-driven architecture and streaming patterns like CQRS and event sourcing.
- Hands-on experience with monitoring tools like Prometheus, Grafana, or Kafka Manager.
- Experience automating infrastructure with tools like Terraform or CloudFormation.
- Proficiency with Postgres, Redis, ClickHouse, and DynamoDB, including data modeling, query optimization, and tuning for high-transaction workloads.
- Familiarity with encryption, role-based access control, and secure API development.
Technical Stack
- Databases: ClickHouse, Postgres, Athena, Redis, DynamoDB
- Streaming: Kafka
- Languages: Python, SQL, Java
- Orchestration & Infrastructure: Airflow, Step Functions, Terraform, CloudFormation
- Monitoring: Prometheus, Grafana, Kafka Manager
Team & Environment
You will work alongside industry veterans from Meta, Uber, Citi, and Confluent, and collaborate cross-functionally with engineers, product managers, and data scientists.
Work Mode
This is a remote-first position.
Oscilar is an equal opportunity employer.