Deeter Analytics is hiring its first Data Engineer to join a new algorithmic division. In this foundational role, you will own critical datasets end-to-end, designing and running the data backbone that powers quantitative trading. You will work directly with traders and researchers to transform messy external feeds into clean, reliable, and high-performance data products.
What You'll Do
- Architect cloud-native batch and streaming ELT pipelines for diverse sources; standardize, de-duplicate, and document data; define schemas and redundancy strategies.
- Stand up the core data platform including storage, orchestration, metadata catalog, CI/CD, IaC, and observability with a focus on simplicity and cost-awareness.
- Implement data quality checks and anomaly detection; maintain survivorship-bias-free histories while handling corporate actions and entitlements.
- Expose clean data via APIs, query layers, and shared libraries to produce "research-ready" datasets for fast backtests and production.
- Partner with quantitative analysts, data scientists, and software engineers to scope, prototype, and productionize new datasets quickly; own incident response and runbooks.
- Uphold security and access hygiene, including IAM/least-privilege, secrets management, and auditing.
What We're Looking For
- 1+ years building and operating production data pipelines or platforms.
- Strong skills in Python and SQL.
- Comfortable working on at least one major cloud provider (AWS/GCP/Azure).
- Experience with Docker and infrastructure-as-code tools like Terraform.
- Experience with orchestration tools such as Airflow, Prefect, or Dagster.
- Experience with distributed/batch compute (e.g., Spark/Dask/Beam), data warehouses/lakes, and columnar formats (e.g., Parquet/Delta/Iceberg).
- Experience with monitoring/observability (logs, metrics, traces) and cost management.
- Proven delivery for quantitative users or ML/DS teams; clear thinking, clean design, and pragmatic trade-offs.
Nice to Have
- Financial or time-series data experience (corporate actions, vendor entitlements/licensing), alternative data ingestion.
- Multimodal ETL experience (NLP/embeddings, transcription, basic image/video processing).
- Experience with dataset/version control and reproducibility (e.g., LakeFS/DVC) and research workflow tooling.
Technical Stack
- Languages: Python, SQL
- Infrastructure: AWS/GCP/Azure, Docker, Terraform
- Orchestration: Airflow/Prefect/Dagster
- Compute & Storage: Spark/Dask/Beam, Parquet/Delta/Iceberg
Team & Environment
You will be the first Data Engineer in a new algorithmic division, working directly with traders and researchers.
Work Mode
This is a fully remote position.
Deeter Analytics is an equal opportunity employer.