Pune, Maharashtra, India
Manufacturing & Industrial
Full-Time
UST
Overview
Role Description
Hiring Locations: Chennai, Trivandrum, Kochi
Experience Range: 3 to 6 years
The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels.
Key Responsibilities
Development & Engineering:
- Design, code, test, and implement scalable and efficient data pipelines.
- Develop features in accordance with requirements and low-level design.
- Write optimized, clean code using Python, PySpark, SQL, and ETL tools.
- Conduct unit testing and validate data integrity.
- Maintain comprehensive documentation of work.
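As a hedged illustration of the development responsibilities above (a cleaning step plus a unit-test-style data-integrity check), here is a minimal sketch in plain Python; the `order_id`/`amount` schema and the cleaning rules are hypothetical examples, not taken from this posting, and production code would typically use PySpark or an ETL tool instead:

```python
# Minimal ETL-style transform with a data-integrity check.
# Schema and rules are hypothetical, for illustration only.

def clean_orders(rows):
    """Drop rows with a missing order_id and normalize amounts to floats."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):
            continue  # integrity rule: every order needs an id
        cleaned.append({"order_id": row["order_id"],
                        "amount": float(row.get("amount", 0))})
    return cleaned

# Unit-test-style validation of the transform
raw = [{"order_id": "A1", "amount": "19.99"},
       {"order_id": None, "amount": "5.00"},  # dropped: missing id
       {"order_id": "A2"}]                    # missing amount -> 0.0
result = clean_orders(raw)
assert len(result) == 2
assert result[0]["amount"] == 19.99
assert result[1] == {"order_id": "A2", "amount": 0.0}
```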
Monitoring & Support:
- Monitor dashboards, pipelines, and databases across assigned shifts.
- Identify, escalate, and resolve anomalies using defined SOPs.
- Collaborate with L2/L3 teams to ensure timely issue resolution.
- Analyze trends and anomalies using SQL and Excel.
Process & Compliance:
- Follow configuration and release management processes.
- Participate in estimation, knowledge sharing, and defect management.
- Adhere to SLA and compliance standards.
- Contribute to internal documentation and knowledge bases.
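The SQL-and-Excel trend analysis described above might look like this minimal sketch; the `runs` table, its values, and the 50%-of-average threshold are hypothetical, with SQLite standing in for a production warehouse:

```python
import sqlite3

# Hypothetical pipeline-metrics table; data and threshold are illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE runs (day TEXT, rows_loaded INTEGER)")
con.executemany("INSERT INTO runs VALUES (?, ?)",
                [("2024-01-01", 1000), ("2024-01-02", 980),
                 ("2024-01-03", 120)])  # sudden drop: candidate anomaly

# Flag days that loaded under 50% of the average volume
anomalies = con.execute("""
    SELECT day, rows_loaded FROM runs
    WHERE rows_loaded < (SELECT AVG(rows_loaded) * 0.5 FROM runs)
""").fetchall()
print(anomalies)  # the low-volume day surfaces for escalation per SOP
```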
Mandatory Skills:
- Strong command of SQL for data querying and analysis.
- Proficiency in Python or PySpark for data manipulation.
- Experience with at least one ETL tool: Informatica, Talend, Apache Airflow, AWS Glue, Azure Data Factory (ADF), or GCP Dataproc/Dataflow.
- Experience working with cloud platforms (AWS, Azure, or GCP).
- Hands-on experience with data validation and performance tuning.
- Working knowledge of data schemas and data modeling.
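Hands-on data validation against a known schema, as required above, can be sketched as follows; the expected fields and types are hypothetical:

```python
# Hypothetical schema check of the kind used in data validation.
EXPECTED = {"order_id": str, "amount": float}

def validate(row, schema=EXPECTED):
    """Return a list of field-level problems; an empty list means valid."""
    problems = []
    for field, ftype in schema.items():
        if field not in row:
            problems.append(f"missing {field}")
        elif not isinstance(row[field], ftype):
            problems.append(f"{field}: expected {ftype.__name__}")
    return problems

assert validate({"order_id": "A1", "amount": 9.5}) == []
assert validate({"order_id": "A1", "amount": "9.5"}) == ["amount: expected float"]
```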
Good to Have:
- Certification in Azure, AWS, or GCP (foundational or associate level).
- Familiarity with monitoring tools and dashboard platforms.
- Understanding of data warehouse concepts.
- Exposure to BigQuery, ADLS, or similar services.
Soft Skills:
- Excellent written and verbal communication in English.
- Strong attention to detail and analytical skills.
- Ability to work in a 24x7 shift model, including night shifts.
- Ability to follow SOPs precisely and escalate issues appropriately.
- Self-motivated and able to work with minimal supervision.
- Team player with good interpersonal skills.
Performance Measures:
- Timely and error-free code delivery.
- Consistent adherence to engineering processes and release cycles.
- Documented and trackable issue handling with minimal escalations.
- Certification and training compliance.
- High availability and uptime of monitored pipelines and dashboards.
Skills: SQL, Data Analysis, MS Excel, Dashboards