We are seeking a highly motivated and experienced Data Engineer to join our Strategic Data Democratization product development team. The ideal candidate will have strong skills in data modelling, data integration, and data warehouse design, as well as experience with ETL processes and data pipeline management. The Data Engineer will work closely with cross-functional teams to develop and maintain a scalable and high-performance data architecture that supports the needs of our business.
Key Responsibilities
1. Data Modelling
- Develop and maintain logical and physical data models for the data warehouse.
- Design and implement data structures that support reporting and analytics requirements.
- Collaborate with data analysts and business stakeholders to understand data requirements.

2. Data Integration
- Develop and maintain ETL processes to integrate data from various sources into the data warehouse.
- Optimize data integration processes for performance and reliability.
- Troubleshoot and resolve data integration issues.

3. Data Pipeline Management
- Design and maintain data pipelines that support real-time data ingestion and processing.
- Ensure data quality and consistency throughout the pipeline.
- Monitor and optimize data pipeline performance.

4. Data Warehouse Design
- Design and maintain the overall data warehouse architecture.
- Implement and maintain data governance policies and procedures.
- Ensure data security and privacy compliance.

5. Cross-Functional Collaboration
- Work closely with data analysts, business stakeholders, and other technical teams to understand data needs and requirements.
- Collaborate with software engineers to ensure that data integration and reporting requirements are incorporated into software development projects.
- Participate in project planning and estimation.

Requirements & Skills
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on data integration, data modelling, and data warehouse design.
- Proficiency in SQL and experience with relational databases such as Teradata, MSSQL, MySQL, PostgreSQL, or Oracle, as well as big data platforms such as Hadoop.
- Strong experience with ETL tools such as Apache NiFi, Talend, Informatica, or ODI.
- Experience with cloud and on-premise data platforms such as Oracle, AWS, Azure, or Google Cloud Platform.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Familiarity with the Agile SDLC.
Demonstrated experience as a Data Engineer must be evident in the candidate's CV.
Must be familiar with, and have hands-on experience in, Agile methodology.
Candidates should be able to define ETL requirements and filtration conditions.
ETL tool: Informatica or ODI? – Tool-agnostic ETL skills; ODI preferred.
Any special skills required? – Teradata, Hadoop, Hive, Spark, and FSLDM knowledge (to ingest data from source files into the FSLDM model), or basic ETL skills; financial-ecosystem knowledge preferred.