Cloud Engineer
We have partnered with a leading financial services client who is looking for a Cloud Engineer with hands-on experience on the Azure cloud platform.
Responsibilities:
- Design & Management: Develop, provision, and oversee secure, scalable Azure Databricks platforms to facilitate enterprise-wide data transformation for insurance data workloads.
- Collaboration: Work alongside architects, engineers, and security teams to establish and implement robust infrastructure standards, ensuring reliable connectivity and seamless integration with legacy systems, cloud data sources, and third-party platforms.
- Infrastructure as Code: Utilize tools like Terraform to automate the provisioning and configuration of Azure and Databricks resources, aligning with DevOps best practices.
- Automation: Streamline deployment, monitoring, and incident response workflows using GitHub Actions to enhance consistency, traceability, and operational efficiency.
- Monitoring & Performance: Regularly assess platform health, resource utilization, and performance; anticipate scaling requirements and conduct tuning to ensure optimal operation of data pipelines and analytics workloads.
- Security & Compliance: Enforce adherence to enterprise and regulatory standards, including role-based access control (RBAC), managed identities, and encryption, as well as PDPO, GDPR, and insurance-specific regulatory requirements.
- Data Governance: Manage the integration of Informatica tools to support data governance efforts, including cataloguing, data lineage, and compliance checks.
- Documentation: Create and maintain comprehensive documentation of infrastructure architecture, network topology, security configurations, and operational runbooks for governance, audits, and handover processes.
- Troubleshooting: Identify and resolve infrastructure issues, perform root cause analysis, and drive continuous improvement for platform reliability.
- Continuous Learning: Stay updated on the latest Azure, Databricks, and DevOps features, recommending and implementing enhancements to optimize platform capabilities and cost-effectiveness in alignment with Hong Kong Life and General Insurance business priorities.
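To give candidates a concrete sense of the Infrastructure as Code responsibility above, a minimal Terraform sketch for provisioning an Azure Databricks workspace with the `azurerm` provider might look like the following (the resource group, workspace names, and `eastasia` region are illustrative placeholders, not the client's actual configuration):

```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

# Placeholder resource group for the data platform
resource "azurerm_resource_group" "data_platform" {
  name     = "rg-insurance-data-dev" # hypothetical name
  location = "eastasia"              # hypothetical region
}

# Premium SKU is typically chosen in regulated environments
# for features such as RBAC on workspace objects
resource "azurerm_databricks_workspace" "this" {
  name                = "dbw-insurance-dev" # hypothetical name
  resource_group_name = azurerm_resource_group.data_platform.name
  location            = azurerm_resource_group.data_platform.location
  sku                 = "premium"
}
```

In practice such a configuration would be extended with networking (VNet injection), tagging, and state-backend settings to meet the security and governance responsibilities listed above.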
Requirements:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 5+ years of hands-on experience in designing, deploying, and managing Azure cloud platforms, specifically supporting Azure Databricks for insurance data and analytics workloads.
- Technical Expertise: In-depth knowledge of provisioning, configuring, and managing Azure Databricks clusters and workspaces, including the handling of structured, semi-structured, and unstructured insurance data.
- Integration Skills: Proficiency in integrating Azure Data Factory and Azure Data Lake Storage Gen2 with Databricks for automated data workflows.
- Infrastructure as Code: Experience using tools like Terraform for automated deployment and configuration of Azure and Databricks services.
- Informatica Solutions: Familiarity with deploying and integrating Informatica solutions for effective metadata management, cataloguing, and governance.
- Security Knowledge: Strong understanding of platform security measures (RBAC, NSG, managed identities, Key Vault), monitoring, alerting, and cost optimization in regulated environments.
- CI/CD Automation: Hands-on experience with GitHub Actions for automating CI/CD pipelines related to platform and pipeline deployments.
- Incident Response: Proven ability in incident response, troubleshooting, and performance optimization for critical insurance data workloads.
- Communication Skills: Excellent documentation, collaboration, and communication skills to effectively support technical and business users within the insurance sector.
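As a sketch of the CI/CD automation requirement, a minimal GitHub Actions workflow that plans and applies Terraform on pushes to `main` could look like this (the workflow name, branch, and secret names are assumptions for illustration, not the client's actual setup):

```yaml
name: deploy-platform # hypothetical workflow name

on:
  push:
    branches: [main]

jobs:
  terraform:
    runs-on: ubuntu-latest
    env:
      # Service principal credentials assumed to be stored as repository
      # secrets; these are the standard auth variables read by the
      # azurerm Terraform provider
      ARM_CLIENT_ID: ${{ secrets.ARM_CLIENT_ID }}
      ARM_CLIENT_SECRET: ${{ secrets.ARM_CLIENT_SECRET }}
      ARM_SUBSCRIPTION_ID: ${{ secrets.ARM_SUBSCRIPTION_ID }}
      ARM_TENANT_ID: ${{ secrets.ARM_TENANT_ID }}
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
      - run: terraform plan -out=tfplan
      - run: terraform apply -auto-approve tfplan
```

Separating `plan` from `apply` (here via a saved plan file) supports the traceability and consistency goals described in the responsibilities above.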