ETL with Hadoop Experience

Job Description

The ETL Hadoop data engineer will be responsible for analyzing business requirements and designing, developing, and implementing highly efficient, highly scalable ETL processes.

The candidate is required to perform daily project functions with a focus on meeting business objectives on time in a rapidly changing work environment, and should be able to lead and drive a globally distributed team to achieve those objectives.

Required Skills:

- 8+ years of hands-on experience working with ETL on Hadoop
- Knowledge of the various components of the Hadoop ecosystem (Hive, Impala, Spark) and experience applying them to practical problems
- Strong knowledge of relational databases such as Teradata, DB2, Oracle, and SQL Server
- Hands-on experience writing shell scripts on Unix platforms
- Experience with data warehousing, ETL tools, and MPP database systems
- Understanding of data models (conceptual, logical, and physical) and dimensional and relational data model design
- Analyze functional specifications and assist in designing potential technical solutions
- Identify data sources and work with source system teams and data analysts to define data extraction methodologies
- Good knowledge of writing complex queries in Teradata, DB2, and Oracle PL/SQL
- Maintain batch processing jobs and respond to critical production issues
- Communicate well with stakeholders on his or her proposals and recommendations
- Communicate status and risks regarding delivering the solution on time
- Strong experience with data analysis, data profiling, and root cause analysis
- Should be able to understand banking system processes and data flows
- Can work independently, and lead and mentor the team

Job Type: Contract
Job Location: Remote
Job Category: ETL, Informatica, OLTP, SQL

Apply for this position
