INR 600,000 - 1,600,000 per year
Pune, Maharashtra, India
Information Technology
Full-Time
Fragma Data Systems
Overview
- Responsible for developing and maintaining applications with PySpark
- Contribute to the overall design and architecture of the applications developed and deployed.
- Performance tuning with respect to executor sizing and other environment parameters, code optimization, partition tuning, etc. (see the tuning sketch after this list).
- Interact with business users to understand requirements and troubleshoot issues.
- Implement Projects based on functional specifications.
- Good experience in PySpark, including DataFrame core functions and Spark SQL (see the PySpark sketch after this list).
- Good experience with SQL databases; able to write queries of fair complexity.
- Excellent experience in Big Data programming for data transformations and aggregations.
- Good understanding of ELT architecture: business-rules processing and data extraction from the Data Lake into data streams for business consumption.
- Good customer communication skills.
- Good analytical skills.
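The PySpark and ELT work listed above can be illustrated with a minimal sketch: reading raw data from a data lake, applying DataFrame transformations and a simple business rule, aggregating, and expressing the same aggregation with Spark SQL. The paths, column names, and business rule below are hypothetical placeholders, not taken from this posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt-example").getOrCreate()

# Extract: load raw orders from a (hypothetical) data lake location.
orders = spark.read.parquet("s3a://datalake/raw/orders/")

# Transform: apply an illustrative business rule and derive a net amount.
enriched = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("net_amount", F.col("amount") - F.col("discount"))
)

# Aggregate per customer using DataFrame core functions.
summary = (
    enriched
    .groupBy("customer_id")
    .agg(
        F.sum("net_amount").alias("total_spend"),
        F.count("*").alias("order_count"),
    )
)

# The same aggregation expressed with Spark SQL on a temporary view.
enriched.createOrReplaceTempView("enriched_orders")
summary_sql = spark.sql("""
    SELECT customer_id,
           SUM(net_amount) AS total_spend,
           COUNT(*)        AS order_count
    FROM enriched_orders
    GROUP BY customer_id
""")

# Load: write the curated result back for downstream business consumption.
summary.write.mode("overwrite").parquet("s3a://datalake/curated/customer_spend/")
```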
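For the performance-tuning point, a minimal sketch of the knobs mentioned (executor sizing, environment parameters, partition tuning) is below; the specific values and paths are illustrative assumptions, not figures from the posting.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("tuning-example")
    # Executor sizing: cores and memory per executor, plus off-heap overhead.
    .config("spark.executor.cores", "4")
    .config("spark.executor.memory", "8g")
    .config("spark.executor.memoryOverhead", "1g")
    # Shuffle partitions sized to the data volume rather than the 200 default.
    .config("spark.sql.shuffle.partitions", "400")
    # Adaptive query execution can coalesce small shuffle partitions at runtime.
    .config("spark.sql.adaptive.enabled", "true")
    .getOrCreate()
)

df = spark.read.parquet("s3a://datalake/raw/events/")

# Repartition by a key before a wide operation to spread work evenly,
# then coalesce before writing to avoid producing many small output files.
df = df.repartition(400, "event_date")
df.coalesce(50).write.mode("overwrite").parquet("s3a://datalake/curated/events/")
```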