Manager - Data Architect - Hyderabad | Antal Tech Jobs

Manager - Data Architect - Hyderabad

Chennai, Tamil Nadu, India
Information Technology
Full-Time
Deloitte

Overview

Position Summary

Manager – Data Architect - Deloitte Support Services India Private Limited

Key Responsibilities/Job Duties:

Candidates should have 10–13 years of hands-on experience developing applications, leveraging the following skills:

  • Lead Data Engineering Team: Lead, mentor, and manage a team of 10+ data engineers, providing technical guidance, code reviews, and career development to foster a high-performing team.
  • Databricks Platform Ownership: Own the Databricks platform architecture and implementation, ensuring the environment is secure, scalable, and optimized for the organization’s data processing needs. Design and oversee the Lakehouse architecture leveraging Delta Lake and Apache Spark.
  • Unity Catalog Implementation: Implement and manage Databricks Unity Catalog for unified data governance. Ensure fine-grained access controls and data lineage tracking are in place to secure sensitive financial data and comply with industry regulations.
  • Cluster Provisioning & Policies: Provision and administer Databricks clusters (in Azure), including configuring cluster sizes, auto-scaling, and auto-termination settings. Set up and enforce cluster policies to standardize configurations, optimize resource usage, and control costs across different teams and projects.
  • Databricks SQL Optimization: Collaborate with analytics teams to develop and optimize Databricks SQL queries and dashboards. Tune SQL workloads and caching strategies for faster performance and ensure efficient use of the query engine.
  • Performance Tuning: Lead performance tuning initiatives for Spark jobs and ETL pipelines. Profile data processing code (PySpark/Scala) to identify bottlenecks and refactor for improved throughput and lower latency. Implement best practices for incremental data processing with Delta Lake, and ensure compute cost efficiency (e.g., by optimizing cluster utilization and job scheduling).
  • Data Solutions Collaboration: Work closely with application developers, data analysts, and data scientists to understand requirements and translate them into robust data pipelines and solutions. Ensure that data architectures support analytics, reporting, and machine learning use cases effectively.
  • DevOps Integration: Integrate Databricks workflows into the CI/CD pipeline using Azure DevOps and Git. Develop automated deployment processes for notebooks, jobs, and clusters (infrastructure-as-code) to promote consistent releases. Manage source control for Databricks code (using Git integration) and collaborate with DevOps engineers to implement continuous integration and delivery for data projects.
  • Data Governance & Security: Collaborate with security and compliance teams to uphold data governance standards. Implement data masking, encryption, and audit logging as needed, leveraging Unity Catalog and Azure security features to protect sensitive financial data.
  • Platform Innovation: Stay up to date with the latest Databricks features and industry best practices. Proactively recommend and implement improvements (such as new performance optimization techniques or cost-saving configurations) to continuously enhance the platform’s reliability and efficiency.
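As context for the cluster provisioning and policy duties above, a Databricks cluster policy is a JSON definition that constrains what users can configure. The sketch below is illustrative only; the limits and node types are assumptions, not values specified by this role.

```json
{
  "autotermination_minutes": {
    "type": "range",
    "maxValue": 60,
    "defaultValue": 30
  },
  "autoscale.max_workers": {
    "type": "range",
    "maxValue": 8
  },
  "node_type_id": {
    "type": "allowlist",
    "values": ["Standard_DS3_v2", "Standard_DS4_v2"]
  }
}
```

A policy like this enforces auto-termination, caps auto-scaling, and restricts VM sizes, which standardizes configurations and controls cost across teams, as the responsibility describes.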

Minimum Qualifications

  • Education & Experience: Bachelor’s degree or higher in Computer Science, Information Systems, or a related field. 7+ years of experience in data engineering, data architecture, or related roles, with a track record of designing and deploying data pipelines and platforms at scale.
  • Databricks & Spark Expertise: Significant hands-on experience with Databricks (preferably Azure Databricks) and the Apache Spark ecosystem. Proficient in building data pipelines using PySpark/Scala and managing data in Delta Lake format.
  • Cloud Platform: Strong experience working with cloud data platforms (Azure preferred, or AWS/GCP). Familiarity with Azure data services (such as Azure Data Lake Storage, Azure Blob Storage, etc.) and managing resources in an Azure environment.
  • SQL and Data Warehousing: Advanced SQL skills with the ability to write and optimize complex queries. Solid understanding of data warehousing concepts and performance tuning for SQL engines.
  • Job Optimization & Performance: Proven ability to optimize ETL jobs and Spark processes for performance and cost efficiency. Experience tuning cluster configurations, parallelism, and improving job runtimes and resource utilization.
  • Unity Catalog & Security: Demonstrated experience implementing data security and governance measures. Comfortable configuring Unity Catalog or similar data catalog tools to manage schemas, tables, and fine-grained access controls. Able to ensure compliance with data security standards and manage user/group access to data assets.
  • Leadership Skills: Experience leading and mentoring engineering teams. Excellent project leadership abilities to coordinate multiple projects and priorities. Strong communication skills to effectively collaborate with cross-functional teams and present architectural plans or results to stakeholders.
  • Problem-Solving: Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data pipeline issues quickly. Attention to detail in maintaining data quality and reliability across the platform.
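The Unity Catalog requirement above centers on fine-grained access control expressed in SQL. A hedged sketch follows; the catalog, schema, and group names are hypothetical and chosen only to illustrate the grant pattern.

```sql
-- Hypothetical names (finance, reporting, analysts, data-engineers) for illustration.
CREATE CATALOG IF NOT EXISTS finance;
CREATE SCHEMA IF NOT EXISTS finance.reporting;

-- Fine-grained access: analysts read, engineers manage.
GRANT USE CATALOG ON CATALOG finance TO `analysts`;
GRANT USE SCHEMA ON SCHEMA finance.reporting TO `analysts`;
GRANT SELECT ON SCHEMA finance.reporting TO `analysts`;
GRANT ALL PRIVILEGES ON SCHEMA finance.reporting TO `data-engineers`;
```

Grants at the catalog and schema level, combined with Unity Catalog's built-in lineage tracking, are the typical mechanism for the governance and compliance duties this role describes.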

Preferred Qualifications

  • Certification: Databricks Certified Data Engineer Professional (highly preferred) or Databricks Certified Data Engineer Associate. Equivalent certifications in cloud data engineering or architecture (e.g., Azure Data Engineer, Azure Solutions Architect) are a plus.
  • Financial Services Experience: Prior experience in the financial services industry or other highly regulated industries. Familiarity with financial data types, privacy regulations, and compliance requirements (e.g., handling PII or PCI data) is beneficial.
  • Advanced Education: Master’s degree in Computer Science, Data Engineering, or a related field is a plus.
  • Additional Big Data Technologies: Exposure to related big data and streaming tools such as Apache Kafka/Event Hubs, Apache Airflow or Azure Data Factory for orchestration, and BI/analytics tools (e.g., Power BI) is advantageous.
  • CI/CD & DevOps: Experience implementing CI/CD pipelines for data projects. Familiarity with Databricks Repos, Jenkins, or other CI tools for automated testing and deployment of data pipelines.
  • Performance and Cost Management: Proven strategies or projects where you significantly improved performance or reduced cloud costs (e.g., using spot instances, job scheduling, or improved query design) in a Databricks/Spark environment.

Tools & Technologies

  • Databricks Lakehouse Platform: Databricks Workspace, Apache Spark, Delta Lake, Databricks SQL, MLflow (for model tracking).
  • Data Governance: Databricks Unity Catalog for data cataloging and access control; Azure Active Directory integration for identity management.
  • Programming & Data Processing: PySpark and Python for building data pipelines and Spark Jobs; SQL for querying and analytics.
  • Cloud Services (Azure-focused): Azure Databricks, Azure Data Lake Storage (ADLS Gen2), Azure Blob Storage, Azure Synapse or SQL Database, Azure Key Vault (for secrets).
  • DevOps & CI/CD: Azure DevOps (Azure Pipelines) for build/release pipelines, Git for version control (GitHub or Azure Repos); experience with Terraform or ARM templates for infrastructure-as-code is a plus.
  • Other Tools: Project and workflow management tools (JIRA or Azure Boards), monitoring tools (Azure Log Analytics, Spark UI, or Databricks performance monitoring), and collaboration tools for documentation and design (Figma, Visio, Lucidchart, etc.).
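The DevOps tooling above can be tied together in an Azure Pipelines definition. This is a minimal sketch under stated assumptions: the notebook directory, workspace path, and variable names are hypothetical, and it uses the legacy `databricks` CLI's `workspace import_dir` command.

```yaml
# Hypothetical Azure Pipelines sketch: deploy Databricks notebooks on merge to main.
trigger:
  branches:
    include: [main]

pool:
  vmImage: ubuntu-latest

steps:
  - script: pip install databricks-cli
    displayName: Install Databricks CLI
  - script: databricks workspace import_dir notebooks /Shared/etl --overwrite
    displayName: Deploy notebooks
    env:
      DATABRICKS_HOST: $(databricksHost)    # assumed pipeline variables
      DATABRICKS_TOKEN: $(databricksToken)
```

In practice this step is often replaced by Databricks Asset Bundles or Terraform, which the posting lists as a plus for infrastructure-as-code.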

#EAG-Technology

Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development

At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive

At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips

From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 310870