Your New Company
Our client, one of the largest financial services (FS) institutions, is seeking a skilled Data Engineer to help build and optimize its next-generation Data and Analytics Platform for the Centre of Excellence.
Your New Role
- Build Scalable Data Pipelines: Develop distributed pipelines for processing large volumes of structured and unstructured data in batch and near-real-time.
- Transform and Enrich Data: Transform and enrich data to enable advanced analytics and visualization, delivered under a Continuous Delivery model.
- Task Coordination: Assist the Data Engineering Centre of Excellence (DE CoE) lead in managing offshore teams, clarifying requirements, and creating data models and data dictionaries.
- Stakeholder Collaboration: Work with business units, enablement teams, and system teams to ensure timely and high-quality task delivery.
- Optimize Performance: Automate and standardize processes to improve team efficiency using best practices.
What You'll Need
- 3+ years of experience in data pipeline development, ideally in Financial Services.
- Programming proficiency in Java, Scala, Python, or SQL.
- Experience with data lake table formats (Apache Hudi, Apache Iceberg, Delta Lake) and data processing and orchestration tools (Apache Spark, Apache Airflow).
- Strong knowledge of data security techniques such as encryption and anonymization.
- Familiarity with Agile methodologies, continuous integration, and automated releases.
- Hands-on experience with data validation in large-scale enterprise environments.
- Excellent communication and collaboration skills, with experience working in international teams.
What You Need to Do Now
Click 'apply now' to send your CV to cherry.ho@hays.com.hk, or reach out to Cherry Ho at +852 2230 7493 for a confidential conversation.