Panchkula, Haryana, India
Information Technology
Full-Time
ClearTrail Technologies
Key Responsibilities
- Lead the deployment, configuration, and ongoing administration of Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
- Maintain and monitor core components of the Hadoop ecosystem, including ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
- Take charge of the day-to-day running of Hadoop clusters using tools like Ambari, Cloudera Manager, or other monitoring tools, ensuring continuous availability and optimal performance.
- Manage and provide expertise in HBase and Solr clusters, including capacity planning and performance tuning.
- Perform installation, configuration, and troubleshooting of Linux Operating Systems and network components relevant to big data environments.
- Develop and implement automation scripts using Unix shell and Ansible to streamline operational tasks and improve efficiency.
- Manage and maintain KVM Virtualization environments.
- Oversee clusters, storage solutions, backup strategies, and disaster recovery plans for big data infrastructure.
- Implement and manage comprehensive monitoring tools to proactively identify and address system anomalies and performance bottlenecks.
- Work closely with database teams, network teams, and application teams to ensure high availability and expected performance of all big data applications.
- Interact directly with customers at their premises to provide technical support and resolve issues related to System and Hadoop administration.
- Coordinate closely with internal QA and Engineering teams to facilitate issue resolution within promised timelines.
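Day-to-day cluster administration of the kind described above typically involves scripted health checks. A minimal sketch in shell, assuming the standard `hdfs dfsadmin -report` output (its "Dead datanodes (N):" line may vary by Hadoop version); the function name and alert format are hypothetical:

```shell
#!/bin/sh
# Hypothetical health-check sketch: flag dead DataNodes by parsing the text
# produced by `hdfs dfsadmin -report`. The report text is passed in as an
# argument so the parser can be exercised without a live cluster.

check_dead_datanodes() {
  # $1: report text, normally captured via: report=$(hdfs dfsadmin -report)
  dead=$(printf '%s\n' "$1" | sed -n 's/^Dead datanodes (\([0-9]*\)).*/\1/p')
  dead=${dead:-0}   # no matching line means no dead-node section was printed
  if [ "$dead" -gt 0 ]; then
    echo "ALERT: $dead dead datanode(s)"
    return 1
  fi
  echo "OK: no dead datanodes"
  return 0
}
```

In practice a wrapper like this would be run from cron or a monitoring agent, with the non-zero exit status driving the alert.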
Skills & Qualifications
Experience: 5-8 years of strong individual contributor experience as a DevOps, systems, and/or Hadoop administrator.
Domain Expertise:
- Proficient in Linux Administration.
- Extensive experience with Hadoop Infrastructure and Administration.
- Strong knowledge and experience with SOLR.
- Proficiency in configuration management tools, particularly Ansible.
Big Data Ecosystem Components:
- Hortonworks, Cloudera, Apache Hadoop/Spark ecosystem deployments.
- Core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
- Cluster management tools such as Ambari and Cloudera Manager.
Scripting: Strong scripting skills in one or more of Perl, Python, or shell.
Infrastructure Management: Strong experience working with clusters, storage solutions, backup strategies, database management systems, monitoring tools, and disaster recovery planning.
Virtualization: Experience managing KVM virtualization environments.
Problem-Solving: Excellent analytical and problem-solving skills, with a methodical approach to debugging complex issues.
Communication: Strong communication skills (verbal and written), with the ability to interact effectively with technical teams and customers.
Education: Bachelor's or Master's degree in Computer Science, Computer Engineering, or a related field, or equivalent relevant work experience.
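The ecosystem components listed above are usually watched with lightweight probes; for ZooKeeper, the four-letter-word `ruok` command returns `imok` from a healthy member (on ZooKeeper 3.5+ it must be whitelisted via `4lw.commands.whitelist`). A sketch assuming `nc` and the default client port 2181; the host names and function names are hypothetical, and the probe command is injectable so it can be stubbed for testing:

```shell
#!/bin/sh
# Hypothetical ensemble health sketch: every ZooKeeper member must answer
# "imok" to the "ruok" four-letter-word command. ZK_PROBE lets a test (or a
# different transport) replace the default nc-based probe.

zk_probe() {
  # default probe: send "ruok" to the client port and read the reply
  printf 'ruok' | nc -w 2 "$1" 2181
}

zk_ensemble_healthy() {
  # $1: space-separated list of ensemble host names
  for host in $1; do
    resp=$(${ZK_PROBE:-zk_probe} "$host")
    if [ "$resp" != "imok" ]; then
      echo "FAIL: $host"
      return 1
    fi
  done
  echo "OK: all ensemble members healthy"
  return 0
}
```

For example, `zk_ensemble_healthy "zk1 zk2 zk3"` reports the first unhealthy member and exits non-zero, which slots directly into a cron job or monitoring check.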