Intern - ONT, Data & AI Engineering (Fall 2026, Jul to Dec)
Singapore, SG
SGX Internship Program
We are Asia’s leading and trusted securities and derivatives market infrastructure, operating equity, fixed income, currency and commodity markets to the highest regulatory standards. We also operate a multi-asset sustainability platform, SGX FIRST or Future in Reshaping Sustainability Together (sgx.com/first).

We are committed to facilitating economic growth in a sustainable manner, leveraging our roles as a key player in the ecosystem, a business, a regulator and a listed company. With climate action as a key priority, we aim to be a leading sustainable and transition financing and trading hub offering trusted, quality, end-to-end products and solutions.

As Asia’s most international, multi-asset exchange, we provide listing, trading, clearing, settlement, depository and data services, with about 40% of listed companies and over 80% of listed bonds originating outside of Singapore. We are the world's most liquid international market for the benchmark equity indices of China, India, Japan and ASEAN. In foreign exchange, we are Asia's leading marketplace and most comprehensive service provider for global FX over-the-counter and futures participants. Headquartered in AAA-rated Singapore, we are globally recognised for our risk management and clearing capabilities.
Be a part of this fast-paced world where your contributions count. SGX offers you the unique experience of gaining insights into the Exchange business and the finance value chain. You will be immersed in the respective specialist functions – at SGX, we believe that our interns are involved in real-time work from Day 1.
Note: Leave of Absence (LOA) may be required to take on the internship.
Operations and Technology Roles:
- Platform Engineering
- Data & AI Engineering
- Platform Delivery and Management
- Tech Service Management
- Strategy and Transformation
- Product Designs and Platforms
- Information Security
Data & AI Engineering
Brief Overview:
As one of the key innovation drivers at Singapore Exchange, the Data & AI Engineering unit plays a pivotal role in shaping the future of data-driven solutions. We are at the forefront of emerging technologies such as Enterprise Data Platform, Artificial Intelligence (AI), Generative AI (GenAI), and Cloud Computing. Our responsibilities span designing robust data engineering and machine learning architectures, ensuring data integrity, developing data dictionaries and governance models, and building advanced analytics and visualization tools. These efforts aim to transform the organization and foster a strong data-driven culture.
Learning Outcomes
The Data & AI Engineering unit is seeking a Data Engineering Intern with a passion for working with data, building ETL pipelines, curating data models and ensuring data quality. This internship offers an opportunity to learn what a production-grade setup looks like and how to design data structures and pipelines at scale. The selected intern will gain hands-on experience with our suite of products and services, learn about the data domain (e.g. trading and/or post-trade), build data pipelines and ETL processes, and develop end-to-end production pipelines that deliver tangible business impact across multiple real-world use cases.
Job Description
- Build scalable and reliable ETL pipelines and processes to ingest data from multiple sources
- Create curated data layers that enable users to self-serve
- Work with business users to understand data needs
- Build, test and deliver datasets to production
- Build internal tools and dashboards
Technical responsibilities
- Data Extraction, Transformation and Quality:
- Use kdb+ and the q language to query large datasets from the data warehouse.
- Perform data cleaning and transformation using q (for kdb+), SQL/SQLX or Python (pandas, NumPy, etc.).
- Implement reconciliation and data quality checks.
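By way of illustration, the cleaning and reconciliation tasks above might look like the following minimal pandas sketch. The table and column names (trade_id, trade_date, quantity) are hypothetical, not SGX's actual schema:

```python
import pandas as pd

def clean_trades(raw: pd.DataFrame) -> pd.DataFrame:
    """Basic cleaning: normalise types, drop bad rows and duplicates."""
    df = raw.copy()
    # Coerce malformed values to NaT/NaN so they can be filtered out
    df["trade_date"] = pd.to_datetime(df["trade_date"], errors="coerce")
    df["quantity"] = pd.to_numeric(df["quantity"], errors="coerce")
    df = df.dropna(subset=["trade_date", "quantity"])
    df = df.drop_duplicates(subset=["trade_id"])
    return df

def reconcile(source: pd.DataFrame, target: pd.DataFrame) -> dict:
    """Toy reconciliation: compare row counts and total quantity."""
    return {
        "row_count_match": len(source) == len(target),
        "quantity_match": source["quantity"].sum() == target["quantity"].sum(),
    }
```

A production job would add many more checks (per-key tolerances, late data, schema drift); this only shows the shape of the work.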
- Data pipelines:
- Develop and implement data pipelines, ETL and curated layers.
- Optimize data processing time and cost.
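The pipeline work above follows the classic extract-transform-load shape. A stand-alone sketch, using an in-memory SQLite table as a stand-in for the real warehouse and a hard-coded list as a stand-in for the real source:

```python
import sqlite3

def extract():
    # Stand-in for reading from a real source (file, API, kdb+ query, ...)
    return [
        {"symbol": "AAA", "price": "1.50"},
        {"symbol": "BBB", "price": "2.25"},
    ]

def transform(rows):
    # Cast types; a real job would also validate, enrich and deduplicate
    return [(r["symbol"], float(r["price"])) for r in rows]

def load(conn, rows):
    conn.execute("CREATE TABLE IF NOT EXISTS prices (symbol TEXT, price REAL)")
    conn.executemany("INSERT INTO prices VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract()))

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
```

Keeping the three stages as separate functions is what makes a pipeline easy to test, schedule and optimise independently.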
- Tooling & Automation:
- Automate workflows and implement CI/CD.
- Automate testing.
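Automated testing in this context often means data-quality checks that run on every pipeline execution or CI build. A minimal sketch, with hypothetical check names and a hypothetical trade_id field:

```python
def check_no_nulls(rows, field):
    # Every row must carry a value for the given field
    return all(r.get(field) is not None for r in rows)

def check_unique(rows, field):
    # No two rows may share a value for the given field
    values = [r[field] for r in rows]
    return len(values) == len(set(values))

def run_checks(rows):
    """Run all checks and return the names of any that failed."""
    results = {
        "no_null_ids": check_no_nulls(rows, "trade_id"),
        "unique_ids": check_unique(rows, "trade_id"),
    }
    return [name for name, ok in results.items() if not ok]
```

In CI, a non-empty failure list would fail the build before bad data reaches production.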
- Cloud Development (subject to project):
- Develop and deploy workflows on Google Cloud Platform (GCP) using services such as BigQuery, Cloud Storage, Cloud Run/Cloud Run Functions, Composer (Airflow) and Dataform.
- Business Understanding & Presentation:
- Translate technical findings into actionable data insights for business teams.
- Prepare clear and compelling presentations, ensuring non-technical stakeholders can easily understand the impact.
- Document design and solution in Confluence.
Knowledge and Skills
Must-Have:
- Highly driven, proactive, and a strong team player.
- Excellent interpersonal skills with strong written and verbal communication in English.
- Ability to multitask effectively and manage large datasets.
- Good understanding of database concepts.
- Proficiency in SQL.
Nice-to-Have (Advantageous):
- Experience with q/kdb+.
- Familiarity with Google Cloud Platform (GCP).
- Good understanding of financial markets.
- Familiarity with Git, Jira and Confluence.