Bangalore, Karnataka, India
Information Technology
Full-Time
AKIRA Insights
Key Responsibilities
- Hands-on experience in data-related activities such as data parsing, cleansing, data quality definition, data pipelines, storage, and ETL scripts.
- Expert knowledge of AWS Data Lake implementation and support (S3, Glue, DMS, Athena, Lambda, API Gateway, Redshift).
- Experience with the programming languages Python, PySpark, and SQL.
- Hands-on experience with data migration.
- Experience consuming REST APIs using various authentication options within an AWS Lambda architecture.
- Orchestrating triggers, debugging, and scheduling batch jobs using AWS Glue, Lambda, and Step Functions.
- Hands-on experience with Redshift, including data models, storage, and writing effective queries.
- Understanding of AWS security features such as IAM roles and policies.
- Knowledge of DevOps tools and the CI/CD process.
- AWS certification is highly preferred.
Skills & Expertise :
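The responsibilities above include consuming REST APIs within an AWS Lambda architecture. A minimal sketch of what that can look like, using only the standard library; the endpoint URL, the `API_TOKEN` environment variable, and the bearer-token scheme are illustrative assumptions, not details from this posting:

```python
# Illustrative sketch: a Lambda handler that calls an external REST API
# with bearer-token authentication. Endpoint and token names are hypothetical.
import json
import os
import urllib.request

API_URL = "https://api.example.com/v1/records"  # hypothetical endpoint


def build_request(url: str, token: str) -> urllib.request.Request:
    """Attach a bearer token so the downstream API can authenticate the call."""
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )


def lambda_handler(event, context):
    """Entry point invoked by AWS Lambda; fetches records and returns them."""
    token = os.environ.get("API_TOKEN", "")  # typically injected via Lambda env vars
    request = build_request(API_URL, token)
    with urllib.request.urlopen(request, timeout=10) as response:
        body = json.load(response)
    return {"statusCode": 200, "body": json.dumps(body)}
```

In a real deployment the token would come from Secrets Manager or Parameter Store rather than a plain environment variable, and the function would be wired to a trigger (API Gateway, S3 event, or a Step Functions task).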
S3 (Simple Storage Service) :
- Data storage and partitioning, lifecycle policies, data replication, encryption at rest and in transit, data versioning, data archival, and data sharing.
- Creating Lambda functions, configuring event-driven triggers, monitoring, and integrating with other AWS services such as S3, Glue, API Gateway, and Redshift.
- Creating and managing ETL pipelines, Glue crawlers, job scheduling, and integration with S3 and Redshift.
- Creating tables, views, and stored procedures; parameter tuning; workload management; configuring triggers; and using Redshift Spectrum to query S3 data.
- Designing RESTful APIs, securing endpoints using IAM or Cognito, and throttling and logging API usage.
- Awareness of the existing VPC design: subnets, NAT gateways, peering, routing, and network ACLs for service creation.
- Configuring ALB/NLB for load distribution, health checks, sticky sessions, and SSL termination.
CloudTrail :
- Enabling auditing and compliance logging, managing trails, and integrating with CloudWatch and third-party SIEMs.
- Knowledge of SageMaker ML model training and deployment workflows, including managing notebooks and endpoints.
- Managing version control, branching strategies, and automating workflows for testing and deployment.
- Building pipelines for build-test-deploy automation, plugin management, parallel execution, and integrations with SCM and artifact repositories.
- Static code analysis, security vulnerability checks, technical debt reporting, integration into CI/CD.
- Understanding of the ML model lifecycle: data preprocessing, training, evaluation, deployment, and monitoring.
- Experience supporting Data Scientists and ML Engineers in deploying models to production.
- Familiarity with tools and workflows such as SageMaker, MLflow, and Airflow (optional), and with pipeline integration.
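The lifecycle stages named above (preprocessing, training, evaluation) can be sketched with a minimal, library-free example. The toy data and one-parameter model are purely illustrative; a production workflow would run these stages in SageMaker, MLflow, or Airflow:

```python
# Minimal, library-free sketch of three ML lifecycle stages:
# preprocessing, training, and evaluation. Data and model are toy examples.

def preprocess(xs: list[float]) -> list[float]:
    """Min-max scale features into [0, 1] so training is well-conditioned."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]


def train(xs: list[float], ys: list[float]) -> float:
    """Fit y = w * x by closed-form least squares (one-parameter model)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)


def evaluate(w: float, xs: list[float], ys: list[float]) -> float:
    """Mean squared error of the fitted model on the given data."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)


# Lifecycle walk-through on toy data where y = 2x exactly.
features = preprocess([10.0, 20.0, 30.0, 40.0])  # scaled into [0, 1]
targets = [2 * x for x in features]
w = train(features, targets)      # recovers a slope close to 2.0
mse = evaluate(w, features, targets)  # near zero on this exact data
```

Deployment and monitoring, the remaining stages, would wrap the fitted `w` in a serving endpoint (e.g., a SageMaker endpoint) and track prediction error over time.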