
Overview
Experience: 5+ years
Job Description:
Qualifications:
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Reasonable accommodations may be made to enable individuals with disabilities to perform essential functions.
· 5+ years of ETL experience.
· Experience with core Python programming for data transformation.
· Intermediate-level Python and PySpark skills: able to read, understand, and debug existing Python and PySpark code, and to write new code from scratch.
· Strong knowledge of SQL fundamentals and Snowflake experience; understands subqueries and can tune queries with execution hints to improve performance.
· Able to write SQL sufficient for most business requirements: pulling data from sources, applying rules to the data, and loading it into targets.
· Proven track record in troubleshooting ETL jobs and addressing production issues like performance tuning, reject handling, and ad-hoc reloads.
· Proficient in developing optimization strategies for ETL processes.
· Basic AWS technical support skills: able to log in, find existing jobs, and check run status and logs.
· Will run and monitor jobs via Control-M.
· Can create clear and concise documentation and communications.
· Can document technical specs from business communications.
· Ability to coordinate and aggressively follow up on incidents and problems, perform diagnosis, and provide resolution to minimize service interruption.
· Ability to prioritize and work on multiple tasks simultaneously.
· Effective in cross-functional and global environments to manage multiple tasks and assignments concurrently with effective communication skills.
· A self-starter who can work well independently and on team projects.
· Experienced in analyzing business requirements, defining the granularity, source to target mapping of the data elements, and full technical specification.
· Understands data dependencies and how to schedule jobs in Control-M.
· Experienced working at the command line in various flavors of UNIX, with a basic understanding of shell scripting in Bash and Korn shell.
Education and/or Experience:
· Bachelor of Science in Computer Science or equivalent
· 5+ years of ETL and SQL experience
· 3+ years of Python and PySpark experience
· 3+ years of Snowflake experience
· 3+ years of AWS and UNIX experience
· Preferred certifications:
o AWS Certified Cloud Practitioner (amazon.com)
o Python and PySpark certifications
Job Types: Full-time, Permanent
Pay: ₹1,000,000.00 - ₹2,000,000.00 per year
Schedule:
- Fixed shift
Application Question(s):
- How many years of total experience do you currently have?
- How many years of experience do you have in Python and PySpark?
- How many years of experience do you have in ETL?
- How many years of experience do you have in Snowflake?
- How many years of experience do you have in AWS?
- How many years of experience do you have in SQL?
- What is your current CTC?
- What is your expected CTC?
- What is your notice period/LWD?
- What is your current location?
Work Location: In person