Locations: Houston, Saint Louis, Chicago, Detroit, Atlanta, Boston, Minneapolis, or Denver (Remote, US) · Full-time · USD 92,118–202,730 per year

Perficient, Inc. is hiring a Databricks Solutions Architect

About the Role

Perficient, Inc. is hiring a Databricks Solutions Architect to serve as an expert in modern Data and AI platforms. This role combines deep technical expertise in Databricks and cloud infrastructure with client-facing presales, team leadership, and solution design to deliver best-fit architectural solutions.

What You'll Do

  • Partner with sales teams to lead technical presales conversations, including discovery sessions, solution design workshops, and customer presentations.
  • Translate business requirements into scalable Data and AI architectures leveraging Databricks across Azure, AWS, and GCP.
  • Design and implement end-to-end solutions including data ingestion, transformation, modeling, governance, and sharing.
  • Develop and deliver Proof-of-Concepts and technical demos showcasing advanced capabilities such as Delta Live Tables/Lakeflow, Unity Catalog, GenAI integrations, and ML/AI pipelines.
  • Integrate Databricks with cloud-native services and enterprise systems.
  • Architect analytical solutions that enable BI, ML, and AI-driven insights.
  • Create go-to-market materials, reusable solution accelerators, and reference architectures.
  • Define and package solution offerings aligned with industry use cases.
  • Provide architectural assessments, roadmaps, and best practices for modern data platforms.
  • Collaborate with Product Owners, SMEs, and cross-functional teams to align technical design with business objectives.
  • Support Agile delivery teams with technical guidance, sprint planning, and solution implementation.
  • Mentor engineers and consultants on Databricks and modern data architecture patterns.
  • Act as a thought leader by driving innovation, influencing technical direction, and evangelizing platform capabilities.

What We're Looking For

  • Minimum of 10 years of professional experience with Data Management and Cloud technologies including Enterprise Data Warehouses, Data Lakes and Lakehouses.
  • 3-5 years of professional experience with Databricks and Lakehouse Architecture.
  • 4-5 years of hands-on experience with big data technologies such as Spark, Hadoop, and Kafka.
  • Experience implementing, designing, and building solutions leveraging Databricks Unity Catalog, ML/AI and Workflows.
  • At least three Databricks project implementations covering the full lifecycle (discover, design, implement, and optimize).
  • At least 5 years architecting ETL/ELT solutions using commercial tools.
  • Proficiency with classic ML/AI, GenAI, and agentic AI solutions and architectures.
  • DevOps experience with Databricks CI/CD (Asset Bundles), Git, and Terraform.
  • Design and architecture experience supporting both batch and streaming workloads for big data pipelines.
  • At least 3 years of experience in a professional services company, consulting firm, or agency.
  • A solid understanding of delivery methodology and experience leading teams in implementing solutions according to the designed architecture.
  • Demonstrated ability to leverage AI tools to enhance productivity and streamline workflows.
  • Ability to elicit requirements and communicate clearly with non-technical individuals, development teams, and other project members.
  • Experience working with globally distributed teams and managing offshore teams.

Nice to Have

  • Master’s degree in computer science or related field.
  • Certification in Databricks (Champion or Professional), Apache Spark and/or Cloud Platforms (AWS/Azure).
  • Leadership and team management experience with the ability to provide strategic planning and oversight.
  • Experience leading customer workshop sessions to educate customers on the latest technology trends and best practices.

Technical Stack

  • Databricks, Spark, Hadoop, Kafka, Delta Live Tables/Lakeflow, Unity Catalog
  • Cloud Platforms: Azure, AWS, GCP
  • Cloud Services: Azure ADF, ADLS, Event Hub, AWS S3, Glue, Lambda
  • DevOps: Git, Terraform

Team & Environment

You will be part of Perficient's Data and Analytics Business Unit.

Benefits & Compensation

  • Total compensation range: $92,118 to $202,730.

Work Mode

This position is based in the United States.

Perficient, Inc. proudly provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, genetic information, marital status, amnesty, or status as a protected veteran in accordance with applicable federal, state and local laws.

Required Skills
Databricks, Spark, Hadoop, Kafka, Delta Live Tables, Lakeflow, Unity Catalog, Azure, AWS, GCP, Azure ADF, Data Lakehouse, ML/AI, Enterprise Data Warehouses
About Perficient, Inc.

Perficient is a leading global digital consultancy. We imagine, create, engineer, and run digital transformation solutions that help our clients exceed customers’ expectations, outpace competition, and grow their business.

Job Details

  • Department: Data and Analytics
  • Category: Data