Data Engineer – Capital Markets or Stock Exchange
Job Overview
- Date Posted: November 15, 2025
Job Description
Job ID: 411_2591884
Required Skills
- SQL
- Kafka
- Hadoop
- Spark
- MySQL
- Event streaming
- Metadata management
- Database optimization
- Capital markets domain knowledge
- Problem-solving
- Documentation
Job Type: Full-time
Job Summary:
Join our team as a Senior Data Engineer – Capital Markets and play a vital role in shaping the data infrastructure of a dynamic stock exchange environment. You will design, implement, and optimize robust data solutions that power advanced business intelligence and real-time insights for a leading financial services institution. Our async-first culture values written communication, enabling deeper focus and effective collaboration across the business.
Key Responsibilities
- Design, develop, and maintain scalable, secure data architectures and ETL pipelines to support business intelligence initiatives.
- Implement and optimize real-time event streaming frameworks and services using technologies such as Kafka.
- Manage and optimize relational and non-relational databases to ensure high performance and availability.
- Leverage big data technologies (Hadoop, Spark, Kafka) to process, analyze, and deliver insights from large volumes of capital markets data.
- Establish and maintain robust metadata management and data lineage practices, ensuring data quality and compliance.
- Collaborate closely with data scientists, business analysts, and stakeholders to understand and deliver on evolving data requirements.
- Lead cross-functional problem-solving efforts to resolve data integrity, reliability, and performance issues.
Required Skills and Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related discipline.
- 6-9 years of data engineering experience, including 2-5 years within a leading stock exchange (e.g., the Dubai or London exchanges).
- Strong command of SQL and expertise with database management systems (MySQL, PostgreSQL, MongoDB).
- Proficiency in developing end-to-end ETL pipelines and integrating multiple data sources.
- Hands-on experience with big data tools (Hadoop, Spark) and event streaming technologies (Kafka, Event Hub, etc.).
- Solid grasp of DevOps practices relevant to data engineering workflows.
- Exceptional problem-solving abilities and a meticulous approach to data quality and documentation.
Preferred Qualifications
- Prior experience working with or supporting data initiatives for capital markets or stock exchanges.
- Demonstrated expertise in creating and managing metadata repositories.
- Advanced knowledge of real-time data streaming and analytics platforms.