Temasek's Digital Technology team is seeking pioneers with the grit and passion to champion the end-to-end digital transformation of our company. Every day, we collaborate closely with our business teams to break new ground in leveraging digital technology, design, data science and artificial intelligence to further enhance Temasek's capabilities in the new digital era.
The Digital Technology team stays as nimble as possible, relentlessly pursuing cutting-edge technologies and best practices (Agile, Quality Engineering, Lean, Design Thinking, etc.) to build high-quality products and high-performing teams. Temasek's strong commitment to digital transformation creates tremendous opportunities for impact on the organization, the local ecosystem and even the world. We are looking for a "10x multiplier": an individual with insatiable intellectual curiosity and a heart for people, who can dramatically amplify the team's effectiveness.
If our ambition resonates with you, will you take the next step and make history with us?

Responsibilities
The AI Team works with the various project teams to deliver insights via data and machine learning.
The responsibilities of this role include:
- Ensuring that data pipelines are developed with best practices and deployed successfully into production for in-house applications.
- Leading key data pipeline architectural decisions for in-house applications.
- Developing ETL architecture standards, best practices and coding conventions.
- Optimizing performance of ETL jobs.

Requirements
- A bachelor's degree in Computer Science/Information Technology with a focus on Software Application Development, or a related field.
- A minimum of 5 years of hands-on experience deploying data pipelines in a production environment
- Experience working in a multi-disciplinary team of machine learning engineers, software engineers, product managers and subject domain experts
- High proficiency in SQL and relational databases
- High proficiency in at least one data pipeline orchestration tool (e.g. Airflow, Dagster)
- High proficiency in Python and related data libraries (e.g. pandas)
- Experience with Docker
- Experience with CI/CD tools like Jenkins
- Experience in Agile working environment
- Experience with cloud services, e.g. AWS (RDS, EKS, EMR, Redshift)
- Experience with Snowflake
- Experience deploying microservices with Flask or, preferably, FastAPI
- Experience with big data tools such as Spark, Hadoop and Kafka
- Experience with SQLAlchemy and Alembic libraries is a plus