The Data Engineering Director plays a pivotal role in conceiving, designing, and implementing the cloud data infrastructure, pipelines, flows, integrations, and other data-fabric components that will serve as the foundation for the enterprise's next-generation Data Hub and Innovation Platform. These components will enable a state-of-the-art advanced analytics capability that includes MLOps and integrates with our traditional & low-code automation tools to enable enterprise-wide self-service data discovery and business process automation.
My Asset Management client is looking to fill this position.
- Track record of architecting and delivering modern cloud-native data lakes and pipelines of high complexity in an enterprise environment.
- Proficiency with Data Modeling and Master Data Management.
- Understands how to analyze, cleanse & standardize data, and how to implement a variety of data flows, ETL methods, and integrations.
- Proficiency with both traditional RDBMS-based systems, including data warehouses and marts, and more modern NoSQL and cloud-native big-data technology stacks such as document-oriented databases, Hadoop, Apache Spark, columnar data files (e.g. Parquet), etc.
GENERAL IT & SYSTEMS ENGINEERING:
- Demonstrated ability to understand, work with and deliver robust solutions in a variety of programming languages, frameworks, technology stacks, runtime environments, etc.
- Working knowledge of ML/AI concepts, workflows, and toolsets (RStudio, Jupyter Notebook, etc.), preferably in both their cloud-native and desktop deployments.
- Ability to whiteboard and collaboratively design solutions with both technical and business SMEs.
- Experience and proficiency with modern DevOps practices including TDD, CI/CD, etc., for both code and configuration changes.
- Experience and proficiency with REST APIs, service-oriented architectures (SOA) / microservices, virtualization, containerization (Docker/K8S), and serverless deployment architectures.
GENERAL CANDIDATE ATTRIBUTES:
- Has strong interpersonal skills and is able to communicate effectively and proactively with external and internal stakeholders.
- Understands Agile methods and ways of working.
- Has implemented data solutions on any of the three major public clouds (AWS, Azure, GCP).
- Familiar with Asian local markets and regulatory practices regarding enterprise data management.
- Is self-motivated, self-directed, and independent.
- Attentive to detail and able to master volatility, uncertainty, complexity, and ambiguity.
- Experienced in using business-analysis methods such as flowcharting, requirements capture, stakeholder analysis, use cases, brainstorming, solution prototyping, etc.
DOMAIN-SPECIFIC ATTRIBUTES:
- Working knowledge of asset management business processes and value streams strongly preferred.
- Experience working with data scientists, “quants,” and other advanced/scientific users of cutting-edge algorithms on large data sets, preferably in asset management or at least in other financial services.
QUALIFICATIONS / EXPERIENCE:
- Degree level or higher
- 7–10 years of technical experience showing increasing levels of responsibility, sophistication of solutions implemented, and ability to deliver
- Relevant certification(s) from one of the three cloud vendors listed above.
- Fluency in both written and spoken English