Bangalore, Karnataka, India | On-site Employment

Qorvo is hiring a Data Systems Engineer

About the Role

Qorvo is looking for a Data Systems Engineer to join the EBS Reporting Team within our IT Enterprise Business Applications organization. In this role, you will lead the design, development, and management of our data infrastructure on the Databricks platform within an AWS government cloud. Qorvo is a place to innovate and shape the future of wireless communications, and we bring a commitment to excellence and growth to everything we do.

What You'll Do

  • Establish and grow a data engineering framework to ensure the reliability, scalability, quality, and efficiency of data pipelines, storage, processing, and integration.
  • Establish data pipelines to ingest and curate data containing SAP business content from S/4 to Databricks.
  • Improve, maintain, and execute Qorvo's data strategy, including governance, project prioritization, resourcing, and value delivery.
  • Follow the Medallion Architecture (Bronze, Silver, Gold) to logically organize data in a lakehouse.
  • Work effectively in an Agile Scrum environment.
  • Create technical, functional, and operational documentation for data pipelines and applications.
  • Use business requirements to drive the design of data solutions/applications and technical architecture.
  • Work with other developers, designers, and architects to ensure data applications meet requirements and performance, data security, and analytics goals.
  • Work with the test team to efficiently and effectively structure requirements, define test scenarios, and validate changes.
  • Anticipate, identify, track, and resolve issues and risks affecting delivery.
  • Coordinate and participate in structured peer reviews, walkthroughs, and code reviews.
  • Provide application/technical support.
  • Maintain and/or update technical and/or industry knowledge and skills through continuous learning activities.
  • Adhere to lean principles and standard processes to ensure continuous improvement.
  • Communicate clearly and effectively.
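The Medallion Architecture mentioned above organizes data into three progressively refined layers: Bronze (raw), Silver (validated and deduplicated), and Gold (business-level aggregates). The following is a minimal sketch in plain Python standing in for the PySpark/Delta transformations a real Databricks pipeline would use; the record fields and cleaning rules are hypothetical.

```python
# Minimal Medallion (Bronze -> Silver -> Gold) sketch.
# In a real pipeline each layer would be a Delta table written via PySpark;
# plain Python lists stand in here, and all field names are illustrative.

# Bronze: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"order_id": "1001", "amount": "250.0", "region": "EMEA"},
    {"order_id": "1002", "amount": "bad",   "region": "APAC"},  # invalid amount
    {"order_id": "1002", "amount": "125.5", "region": "APAC"},
    {"order_id": "1002", "amount": "125.5", "region": "APAC"},  # exact duplicate
]

def to_silver(rows):
    """Silver: typed, validated, deduplicated records."""
    seen, out = set(), []
    for r in rows:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop rows that fail type validation
        key = (r["order_id"], amount)
        if key in seen:
            continue  # drop exact duplicates
        seen.add(key)
        out.append({"order_id": r["order_id"], "amount": amount,
                    "region": r["region"]})
    return out

def to_gold(rows):
    """Gold: business-level aggregate (revenue per region)."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'EMEA': 250.0, 'APAC': 125.5}
```

The key design point is that each layer is derived from, and never overwrites, the one before it, so bad source data can always be reprocessed from Bronze.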

What We're Looking For

  • B.S. in Computer Science/Engineering or a relevant field.
  • 5+ years of experience in the IT industry.
  • 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure and functions.
  • Expert understanding of data warehousing concepts (dimensional/star schema, SCD Type 2, Data Vault, denormalized models), with experience implementing highly performant data ingestion pipelines from multiple sources.
  • Expert level skills with Python / PySpark and SQL.
  • Experience with CI/CD on Databricks using tools such as Unity Catalog, Jenkins, GitHub Actions, and Databricks CLI.
  • Experience integrating end-to-end Databricks pipelines to take data from source systems to target data repositories ensuring data quality and consistency.
  • Strong understanding of Data Management principles (quality, governance, security, privacy, life cycle management, cataloging).
  • Experience evaluating the performance and applicability of multiple tools against customer requirements.
  • Experience working within an Agile delivery/DevOps methodology to deliver proof of concept and production implementation in iterative sprints.
  • Experience with Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT).
  • Hands-on experience developing batch and streaming data pipelines.
  • Able to work independently.
  • Energetic and self-motivated, with a willingness to learn and openness to change.
  • Ability to work in a fast-paced, changing environment, collaborate with all levels of the organization, and cope with rapidly changing information.
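Of the warehousing patterns listed above, SCD Type 2 is the one most often implemented by hand: when a tracked attribute changes, the current dimension row is closed out and a new current row is appended, preserving history. A minimal sketch in plain Python follows (on Databricks this would typically be a Delta Lake MERGE); the table layout, column names, and change-detection rule are hypothetical.

```python
from datetime import date

# Minimal SCD Type 2 upsert sketch. Column names (customer_id, city,
# valid_from, valid_to, is_current) are illustrative only.

def scd2_upsert(dim_rows, incoming, today):
    """Apply one incoming record to a type-2 dimension and return the rows."""
    rows = list(dim_rows)
    current = next(
        (r for r in rows
         if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current is not None and current["city"] == incoming["city"]:
        return rows  # no change in the tracked attribute: nothing to do
    if current is not None:
        current["valid_to"] = today      # close the old version
        current["is_current"] = False
    rows.append({
        "customer_id": incoming["customer_id"],
        "city": incoming["city"],
        "valid_from": today,
        "valid_to": None,                # open-ended current row
        "is_current": True,
    })
    return rows

dim = [{"customer_id": 7, "city": "Pune",
        "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_upsert(dim, {"customer_id": 7, "city": "Bangalore"},
                  date(2024, 6, 1))
# dim now holds two rows: the closed Pune row and the current Bangalore row.
```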

Nice to Have

  • Master's degree preferred.
  • Experience with SAP ECC or S/4, AWS Redshift, Power BI.
  • Experience consuming CDS views from SAP S/4.

Technical Stack

  • Databricks, AWS, Azure, Python, PySpark, SQL
  • Unity Catalog, Jenkins, GitHub Actions, Databricks CLI
  • Delta Lake, Delta Sharing, Delta Live Tables (DLT)
  • SAP ECC, SAP S/4, AWS Redshift, Power BI

Team & Environment

You will be part of the EBS Reporting Team under Qorvo’s IT Enterprise Business Applications organization.

Work Mode

This is an onsite position based in Bangalore, India.

We are an Equal Employment Opportunity (EEO) employer and welcome all qualified applicants. Applicants will receive fair and impartial consideration without regard to any characteristics protected by applicable law.

Required Skills
Databricks, AWS, Azure, Python, PySpark, SQL, Unity Catalog, Data Warehousing, ETL, Jenkins, GitHub Actions, Databricks CLI
About Qorvo

Qorvo supplies innovative semiconductor solutions that make a better world possible. The company combines product and technology leadership, systems-level expertise and global manufacturing scale to solve customers' complex technical challenges, serving high-growth segments including consumer electronics, smart home/IoT, automotive, EVs, battery-powered appliances, network infrastructure, healthcare and aerospace/defense.

Job Details
Department: Data and Analytics
Category: data
Posted: 14 days ago