Join In All Media Inc as a Data Architect on Project Outpost, a high-visibility initiative focused on transforming the organization's evolving data environment into a structured, high-performance ecosystem. We are a fully remote, distributed team where engineering ownership and business alignment drive long-term innovation.
What You'll Do
- Define Data Strategy: Architect and implement the long-term data roadmap to transition the organization into a structured, high-performance ecosystem.
- Schema Design: Design and optimize complex data models for both transactional (operational) and analytical use cases.
- Performance Tuning: Lead database performance tuning and structural improvements to ensure high-speed query execution and system reliability.
- Identify Critical Insights: Use analytical intuition to independently identify business-critical data points and define clear analysis goals.
- Standardization: Establish and enforce organizational naming conventions, data governance practices, and structural consistency across the landscape.
- Automation: Utilize Python to create automation scripts for data exploration, cleaning, and landscape management.
- Cross-Functional Collaboration: Partner with Product, Backend Engineering, and business stakeholders to ensure data schemas support both business goals and application UX.
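To give a flavor of the Python automation work described above, here is a minimal sketch of a data-cleaning helper. The column names, cleaning rules, and function names are illustrative assumptions, not part of the role description; it uses only the standard library.

```python
import csv
import io
import re

def to_snake_case(name):
    """Normalize a column name to snake_case (naming-convention enforcement)."""
    name = re.sub(r"[^0-9a-zA-Z]+", "_", name.strip())          # spaces/symbols -> _
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)         # split camelCase
    return name.lower().strip("_")

def profile_and_clean(csv_text):
    """Read CSV text, normalize headers, trim whitespace, and count empty cells.

    Returns (rows, nulls): cleaned rows as dicts keyed by normalized headers,
    and a per-column count of empty cells for quick data profiling.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    headers = [to_snake_case(h) for h in reader.fieldnames]
    rows, nulls = [], {h: 0 for h in headers}
    for raw in reader:
        row = {}
        for old, new in zip(reader.fieldnames, headers):
            value = (raw[old] or "").strip()
            if not value:
                nulls[new] += 1
            row[new] = value
        rows.append(row)
    return rows, nulls
```

In practice a script like this would feed a profiling report or a staging load, but the shape, normalize names, trim values, count gaps, is the core of routine landscape management.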
What We're Looking For
- Advanced SQL Mastery: Expert-level proficiency in query design, CTEs, window functions, and complex aggregations.
- Data Modeling: Proven track record of designing scalable schemas for diverse use cases (Transactional vs. Analytical).
- Optimization Expertise: Deep knowledge of database performance tuning and structural improvements.
- Functional Python: Proficiency in using Python (loops, functions, dictionaries) for data exploration and automation tasks.
- Strategic Intuition: The ability to navigate ambiguity and independently identify how data structure impacts business outcomes.
- Self-Direction: Proven ability to manage complex data landscapes and architectural shifts without constant supervision.
- Fluent English: Strong communication skills for daily collaboration with international stakeholders.
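As a hedged sketch of the SQL fluency listed above, a CTE combined with a window function, the example below runs a hypothetical `orders` table against an in-memory SQLite database; the table and its columns are illustrative assumptions only.

```python
import sqlite3

# In-memory database with a hypothetical orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10.0), (1, 25.0), (2, 40.0);
""")

# The CTE aggregates spend per customer; the window function then
# ranks customers by total spend (needs SQLite 3.25+ for RANK() OVER).
query = """
WITH customer_totals AS (
    SELECT customer_id, SUM(amount) AS total_spent
    FROM orders
    GROUP BY customer_id
)
SELECT customer_id,
       total_spent,
       RANK() OVER (ORDER BY total_spent DESC) AS spend_rank
FROM customer_totals
ORDER BY spend_rank;
"""
rows = conn.execute(query).fetchall()
# rows -> [(2, 40.0, 1), (1, 35.0, 2)]
```

The same pattern, aggregate in a CTE, then rank or compare with a window function, scales from quick profiling queries to production analytical models.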
Nice to Have
- Cloud Warehousing: Specific experience with Snowflake or similar modern cloud data warehouse environments.
- Governance: Experience implementing formal data governance frameworks and metadata management.
- ETL/ELT Optimization: Hands-on experience optimizing specific data transformation pipelines to reduce latency.
- Modern Data Stack: Familiarity with tools like dbt or Airflow for managing the data lifecycle.
- LATAM Experience: Experience working in remote, distributed teams across Latin American time zones.
Technical Stack
- SQL
- Python
- Snowflake
- dbt
- Airflow
Benefits & Compensation
- Collaboration with US-based teams aligned to Central Time (CT)
- Opportunity to work on high-scale platforms with measurable product impact
- Remote, distributed work environment across LATAM
Work Mode
Candidates must overlap at least 4–5 hours daily with US-based stakeholders. Work is fully remote within LATAM regions in a globally distributed setup.
EEO