Data Platform DevOps Engineer


Cognizant - Hong Kong - Full-time

Salary: Competitive

Key Responsibilities:
  • Build, maintain, and optimize CI/CD pipelines for data, analytics, and ML assets using Azure DevOps or GitHub Actions, with artifact promotion across environments.
  • Automate Databricks and Fabric operations (clusters, jobs, workflows, Lakehouse/Warehouse, permissions, asset bundles) with Python/PowerShell and REST/CLI (a sketch follows this list).
  • Implement Infrastructure-as-Code for Azure, Databricks, and Fabric resources using Terraform or scripts, including secure networking, secrets, and policies.
  • Standardize release workflows for notebooks, packages, Delta tables, and semantic models with testing, approvals, and rollout/rollback strategies.
  • Embed governance and security controls leveraging Unity Catalog, Microsoft Purview, Key Vault, RBAC/ABAC, and secrets rotation.
  • Enable observability with Azure Monitor, Log Analytics, and Databricks/Fabric telemetry; define SLIs/SLOs, alerts, and on-call runbooks.
  • Drive cost efficiency via autoscaling, cluster policies, spot capacity, and job scheduling aligned to workload patterns.
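
As a hedged illustration of the Databricks automation bullet above: a minimal Python sketch that triggers a job run through the Jobs 2.1 REST API and polls it to completion. The workspace URL, token source, and job ID are hypothetical placeholders, not details from this role.

    # Minimal sketch: trigger a Databricks job run via the Jobs 2.1 REST API
    # and poll until it reaches a terminal state. All identifiers below
    # (workspace URL, job ID) are hypothetical placeholders.
    import os
    import time
    import requests

    HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace
    HEADERS = {"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"}

    def run_job(job_id: int) -> int:
        """Start a job run and return its run_id."""
        resp = requests.post(f"{HOST}/api/2.1/jobs/run-now",
                             headers=HEADERS, json={"job_id": job_id})
        resp.raise_for_status()
        return resp.json()["run_id"]

    def wait_for_run(run_id: int, poll_seconds: int = 30) -> str:
        """Poll the run until its life-cycle state is terminal; return the result state."""
        while True:
            resp = requests.get(f"{HOST}/api/2.1/jobs/runs/get",
                                headers=HEADERS, params={"run_id": run_id})
            resp.raise_for_status()
            state = resp.json()["state"]
            if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
                return state.get("result_state", state["life_cycle_state"])
            time.sleep(poll_seconds)

    if __name__ == "__main__":
        run_id = run_job(job_id=42)  # placeholder job ID
        print("Run finished with result:", wait_for_run(run_id))

In a pipeline, the same calls would typically run as a deployment-gate step, with the token pulled from Key Vault rather than a local environment variable.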
Required Experience:
  • Technical Experience: 6+ years in DevOps or Platform Engineering for data/AI on Microsoft Azure, including Databricks and Microsoft Fabric.
  • Expertise in building cloud automation with Python/PowerShell/Spark.
  • CI/CD: Hands-on experience with Azure DevOps or GitHub Actions, YAML pipelines, environment strategies, and artifact versioning for data and ML. DevSecOps experience integrating SonarQube and Veracode.
  • IaC & Automation: Proficiency with Terraform or Bicep, Python and/or PowerShell, and Databricks/Fabric CLIs and APIs.
  • Understanding of Data Platform concepts: Working knowledge of Delta Lake, Spark, ADLS Gen2, Data Factory/Fabric Data Factory, and Power BI integration.
  • Data Product & Governance domain understanding: Experience in developing and managing data products, including implementing governance frameworks, metadata management, data cataloging, lineage, and data quality controls.
  • Security & Governance: Experience implementing Unity Catalog/Purview lineage, access controls, secrets management, and compliant deployments (a rotation sketch follows this list).
  • Problem-Solving Skills: Proven ability to identify, troubleshoot, and resolve complex technical issues using automated solutions.
  • Communication & Documentation: Clear technical communication and the ability to support globally distributed teams.
  • Experience producing high-quality documentation: technical specifications, functional specifications, user playbooks.
  • Experience working with the SAFe Agile framework and driving design reviews, demos, and user training.
  • Experience in the retail, luxury, and consumer goods industries is preferred.
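
As a hedged sketch of the secrets-rotation work mentioned under Security & Governance: the Python below mints a fresh Databricks personal access token via the Token API and stores it as a new version of a Key Vault secret, using the azure-identity and azure-keyvault-secrets packages. The vault URL, workspace URL, and secret name are assumptions for illustration, not this team's actual setup.

    # Rotation sketch (all names are placeholders): use the current, still-valid
    # token to mint a replacement PAT, then write it to Key Vault as a new
    # secret version. A production rotation would also revoke the old token
    # (POST /api/2.0/token/delete) once consumers pick up the new version.
    import requests
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    VAULT_URL = "https://example-kv.vault.azure.net"             # placeholder vault
    HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace
    SECRET_NAME = "databricks-token"                             # placeholder secret

    def mint_token(current_token: str, lifetime_seconds: int = 7 * 24 * 3600) -> str:
        """Create a replacement PAT using the existing token's identity."""
        resp = requests.post(
            f"{HOST}/api/2.0/token/create",
            headers={"Authorization": f"Bearer {current_token}"},
            json={"lifetime_seconds": lifetime_seconds, "comment": "pipeline rotation"},
        )
        resp.raise_for_status()
        return resp.json()["token_value"]

    def rotate() -> None:
        client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())
        current = client.get_secret(SECRET_NAME).value  # latest stored version
        client.set_secret(SECRET_NAME, mint_token(current))

    if __name__ == "__main__":
        rotate()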
Education:
  • Bachelor's in Computer Science, IT, or related field.
  • Azure Cloud Certifications in related technologies are highly valued, e.g., DevOps Engineer Expert (AZ-400), Azure Solutions Architect Expert (AZ-305).
