- Implement data warehouse, big data, and data lake solutions with strong data quality capabilities.
- Thorough understanding of Continuous Integration, Delivery and Deployment (CI/CD).
- Strong hands-on experience in data modelling and designing ETL pipelines & solutions.
- Integrate domain data knowledge into the development of data requirements.
- Look across multiple systems, understand the purpose of each, and define data requirements by system.
- Identify downstream implications of data loads/migrations (e.g., data quality, regulatory).
- Good understanding of implementing data governance and encryption solutions.
- Proficient in the AWS Glue ETL tool, with good knowledge of at least one other ETL tool (Informatica Cloud, Talend, etc.).
- Working experience with SQL and MPP databases (e.g., MySQL, Postgres, Redshift), implementing data models.
- Expertise in Python and Apache Spark, with working experience in projects involving both.
- Working experience with AWS analytics services such as EMR, Athena, Glue, and Data Pipeline.
- Technical knowledge of AWS services such as VPC, EC2, ECS, and S3, with hands-on use of these technologies in previous projects.
- Ability to leverage data assets to respond to complex questions requiring timely answers.
- Effective team-building and problem-solving abilities.
- Strong communication and interpersonal skills.
- Minimum of 6 years' working experience with the above technologies or projects.
- Bachelor's degree in Computer Science, Information Technology or equivalent, with 6 to 8 years of experience as a senior data engineer.
Salary: Not Disclosed by Recruiter
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Employment Type: Permanent Job, Full Time
Contact Company: Enormous IT Services Pvt Ltd
Address: Plot No. 27, # 102 Sagacity Heights, House No: 152, 1st Floor, Kavuri Hills, Phase 1, Madhapur