
v4c.ai is hiring a Data Engineer

About the Role

v4c.ai is hiring a Data Engineer to join our remote US team. You will help design, develop, and maintain data solutions on Databricks, enabling our clients and internal teams to process, transform, and analyze data effectively. You'll build reliable data pipelines and workflows in a collaborative environment.

What You'll Do

  • Collaborate with team members and stakeholders to understand data requirements and contribute to building scalable data pipelines and workflows in Databricks.
  • Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
  • Assist in optimizing data workflows for better performance, reliability, and cost-efficiency within Databricks environments.
  • Support the creation and maintenance of data models, tables, and integrations in cloud platforms (Azure, AWS, or similar).
  • Work closely with cross-functional teams (data analysts, scientists, and engineers) to deliver clean, accessible data for analytics and reporting.
  • Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
  • Stay curious about new Databricks features and data engineering trends to support ongoing improvements.
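The ETL/ELT work described above can be sketched in miniature. The snippet below is a hedged illustration only: it uses Python's built-in SQLite in place of a Databricks SQL warehouse, and the table, column, and field names are all hypothetical, not part of this role's actual stack.

```python
import sqlite3

# Extract: raw records as they might arrive from a source system
# (hypothetical fields, purely for illustration).
raw_events = [
    {"user_id": 1, "amount": "19.99", "region": " us-east "},
    {"user_id": 2, "amount": "5.00", "region": "US-EAST"},
    {"user_id": 1, "amount": "3.50", "region": "eu-west"},
]

# Transform: normalize types and string formats before loading.
cleaned = [
    (e["user_id"], float(e["amount"]), e["region"].strip().lower())
    for e in raw_events
]

# Load: SQLite stands in for a warehouse table here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", cleaned)

# A downstream analytics query over the loaded, cleaned data.
totals = conn.execute(
    "SELECT region, ROUND(SUM(amount), 2) FROM events "
    "GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('eu-west', 3.5), ('us-east', 24.99)]
```

In a Databricks environment the same extract-transform-load shape would typically be expressed with Spark DataFrames or SQL against Delta tables, but the pattern of normalizing data before it reaches analysts is the same.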

What We're Looking For

  • Bachelor’s degree in Computer Science, Data Science, Engineering, Information Systems, or a related field (or equivalent practical experience).
  • 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role (internships, co-ops, or academic projects with relevant tools count toward this).
  • Hands-on experience building basic data pipelines or transformations.
  • Proficiency in Python and SQL.
  • Basic understanding of cloud platforms such as Azure, AWS, or GCP (e.g., working with storage, compute, or data services).
  • Solid analytical and problem-solving skills with attention to detail and a focus on writing clean, maintainable code.
  • Strong communication skills and ability to work collaboratively in a remote team environment.
  • Eagerness to learn, take ownership of tasks, and grow within data engineering.

Nice to Have

  • Experience with Scala.

Technical Stack

  • Databricks
  • Python
  • SQL
  • Azure
  • AWS
  • GCP

Team & Environment

You'll be working within cross-functional teams that include data analysts, scientists, and engineers.

Work Mode

This is a fully remote position for candidates based in the United States.

Required Skills
Databricks, Python, SQL, Azure, AWS, GCP, Data Engineering, ETL, Data Pipelines, Cloud Platforms, Data Warehousing, Big Data
About v4c.ai

v4c.ai is a premier IT services consultancy specializing in Dataiku, the Universal AI platform, to drive strategic business transformation. We partner with organizations to accelerate their journey towards AI-driven success by offering a comprehensive suite of Dataiku and generative AI services.

Job Details

Category: Data
Posted: a month ago