Information Technology
Full-Time
JPMorganChase
Job Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer III at JPMorgan Chase within Consumer & Community Banking, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for implementing critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
- Utilize Python for data processing and transformation tasks, ensuring efficient and reliable data workflows.
- Implement data orchestration and workflow automation using Apache Airflow (an illustrative sketch follows this list).
- Deploy and manage containerized applications using Kubernetes (EKS) and Amazon ECS.
- Use Terraform for infrastructure provisioning and management, ensuring a robust and scalable data infrastructure.
- Develop and optimize data models to support business intelligence and analytics requirements.
- Work with graph databases to model and query complex relationships within data.
- Create and maintain interactive and insightful reports and dashboards using Tableau to support data-driven decision-making.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Implement AWS enterprise solutions, including Redshift, S3, EC2, Data Pipeline, and EMR, to enhance data processing capabilities.
- Work hands-on with Spark to manage and process large datasets efficiently.
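For illustration only, here is a minimal sketch of the kind of Airflow-orchestrated pipeline these responsibilities describe, using the TaskFlow API (assuming Airflow 2.4+); the DAG name, task bodies, and field names are hypothetical and not taken from this role.

```python
# Minimal sketch of a daily ETL DAG, assuming Airflow 2.4+ with the TaskFlow API.
# The DAG id, task logic, and field names below are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False, tags=["etl"])
def daily_customer_spend_etl():
    """Extract raw records, transform them in Python, and load the result."""

    @task
    def extract() -> list[dict]:
        # Placeholder extract step; a real pipeline would pull from S3 or a source system.
        return [{"customer_id": 1, "spend": "42.50"}, {"customer_id": 2, "spend": "17.00"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Normalize types so downstream analytics can aggregate reliably.
        return [{"customer_id": r["customer_id"], "spend": float(r["spend"])} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step; a real pipeline would write to Redshift or S3.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


daily_customer_spend_etl()
```

In practice, heavier transformations would typically be pushed down to Spark or EMR rather than run in-process; a Spark sketch follows the qualifications list below.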
Required Qualifications, Capabilities, and Skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Hands-on practical programming experience in Python, with basic knowledge of Java.
- Experience with Apache Airflow for data orchestration and workflow management.
- Familiarity with container orchestration platforms such as Kubernetes (EKS) and Amazon ECS.
- Experience with Terraform for infrastructure as code and cloud resource management.
- Proficiency in data modeling techniques and best practices.
- Exposure to graph databases and experience in modeling and querying graph data.
- Experience in creating reports and dashboards using Tableau.
- Experience with AWS enterprise implementations, including Redshift, S3, EC2, Data Pipeline, and EMR.
- Hands-on experience with Spark and managing large datasets (see the sketch after this list).
- Experience in implementing ETL transformations on big data platforms, particularly with NoSQL databases (MongoDB, DynamoDB, Cassandra).
- Familiarity with modern front-end technologies.
- Exposure to cloud technologies.
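As a sketch of the Spark and ETL experience listed above (assuming PySpark 3.x), with placeholder S3 paths and column names that are not details of this role:

```python
# Minimal PySpark batch-ETL sketch, assuming Spark 3.x; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

# Read raw transactions landed upstream (e.g., as Parquet in S3).
raw = spark.read.parquet("s3a://example-bucket/raw/transactions/")

# Filter bad records, derive a date column, and aggregate spend per customer per day.
daily_spend = (
    raw.filter(F.col("amount") > 0)
       .withColumn("txn_date", F.to_date("txn_timestamp"))
       .groupBy("customer_id", "txn_date")
       .agg(
           F.sum("amount").alias("total_spend"),
           F.count("*").alias("txn_count"),
       )
)

# Write the curated output partitioned by date for efficient downstream queries.
daily_spend.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3a://example-bucket/curated/daily_spend/"
)

spark.stop()
```

Partitioning the curated output by date keeps downstream Redshift or Tableau queries scanning only the slices they need, which is the usual design choice for daily batch loads.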