Overview
Line of Service
Internal Firm Services
Industry/Sector
Not Applicable
Specialism
IFS - Information Technology (IT)
Management Level
Senior Associate
Job Description & Summary
At PwC, our people in integration and platform architecture focus on designing and implementing seamless integration solutions and robust platform architectures for clients. They enable efficient data flow and optimise technology infrastructure for enhanced business performance.
In enterprise architecture at PwC, you will focus on designing and implementing architectural solutions that align with the organisation's overall strategy and goals. Your work will involve understanding business products, business strategies and customer usage of products. You will be responsible for defining architectural principles, analysing business and technology landscapes, and developing frameworks to guide technology decisions and investments. Working in this area, you will have a familiarity with business strategy and processes, and experience in business solutions that enable an organisation's technology infrastructure. You will help to confirm that technology infrastructure is optimised, scalable, and aligned with business needs, enabling efficient data flow, interoperability, and agility. Through your work, you will communicate a deep understanding of the business and a broad knowledge of architecture and applications.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary:
A strong team player who will be part of the core technical architect team of PwC India, responsible for application/solution design, development, management, understanding of technical architecture constructs, and operational support.
The role requires 4-8 years of hands-on technical experience with a good understanding of the SDLC, integration architecture, Azure Cloud, data engineering, and Microsoft technologies, along with strong verbal and business communication skills. The candidate will be responsible for designing, building, and maintaining scalable data pipelines and systems to support our data-driven initiatives and ensure the availability and quality of our data, and will work closely with analysts and other stakeholders to implement robust data solutions that uphold data quality.
Role & Responsibilities:
- Collaborate with cross-functional teams to understand data requirements and translate them into technical specifications.
- Write clean, scalable code and Low Code/No Code solutions across cross-cutting technology platforms.
- Design and develop scalable, reliable, and performant data pipelines to process and manage large datasets.
- Create and manage data mapping and transformation routines to ensure data accuracy and consistency.
- Implement data ingestion pipelines from multiple data sources using Azure Data Factory.
- Implement and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Optimize and maintain existing data systems and infrastructure to improve efficiency and scalability.
- Ensure data quality and integrity by implementing robust data validation and monitoring processes.
- Develop and maintain documentation related to data architecture, processes, and workflows.
- Stay updated with the latest industry trends and best practices in data engineering and apply them to enhance our data systems.
- Assist in the integration of new data sources and tools into the existing data ecosystem.
- Provide support and troubleshooting for data-related issues as needed.
- Understanding of integration architecture.
- Handle operational support responsibilities.
- Attend daily scrum calls and update Azure DevOps (ADO) user stories.
- Familiarity with architecture patterns, APIs (REST, SOAP, RPC), XML and JSON formats
- Working experience on Azure Cloud data storage and processing.
- Proficiency in Azure Data Factory, Azure Storage, Azure SQL, Azure Functions, Azure Logic Apps, Azure App Services, Azure Networking, and Azure Key Vault integrations.
- Experience in ETL operations between on-premises/cloud sources and Azure SQL/Azure Blob Storage/Azure Data Lake Gen2/Azure Synapse Analytics using Azure Data Factory V2.
- Good hands-on experience writing Azure Data Factory pipelines that fetch data from different sources and load it into Azure Data Lake and Azure SQL Database, facilitating seamless data movement and transformation to support advanced analytics. Responsible for creating linked services, datasets, and pipelines in Azure Data Factory, building data flows to transform data in Azure, and scheduling triggers and notification jobs.
- Proficiency in programming languages such as Python, .NET, C#, TypeScript, JavaScript, and Java.
- Strong knowledge of SQL and experience with relational databases.
- Eagerness to learn on cutting-edge data projects that drive business insights and innovation.
- Excellent problem-solving skills and attention to detail.
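The data validation and monitoring responsibility above can be sketched in plain Python. This is a minimal, illustrative example only: the column names (`id`, `amount`, `currency`) and the rules themselves are hypothetical stand-ins for checks a real pipeline would define with stakeholders.

```python
# Hypothetical rule set for illustration; a real pipeline would enforce
# checks agreed with stakeholders (column presence, types, value ranges).
REQUIRED_COLUMNS = {"id", "amount", "currency"}


def validate_row(row: dict) -> list:
    """Return a list of validation errors for one record (empty list = valid)."""
    errors = []
    missing = REQUIRED_COLUMNS - row.keys()
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
    if "amount" in row:
        try:
            if float(row["amount"]) < 0:
                errors.append("amount must be non-negative")
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors


def partition_rows(rows):
    """Split records into (valid, rejected) so bad data never reaches the warehouse."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        (rejected if errs else valid).append((row, errs))
    return [r for r, _ in valid], rejected
```

In practice the rejected records (with their error lists) would be routed to a quarantine store and surfaced through monitoring, rather than silently dropped.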
Good to have Qualifications:
• Experience in data engineering or a related role, with a strong understanding of data architecture and data processing frameworks.
• Familiarity with Microsoft Fabric.
• Experience writing notebooks in Apache Spark pools using PySpark and Spark SQL to ingest data from the Data Lake into Blob Storage, with data quality checks and transformations.
• Understanding of Agile methodologies.
• Experience with big data technologies such as Hadoop, Pig, and Spark, including Databricks notebook creation with PySpark.
• Using PySpark to read CSV, Parquet, and JSON files, apply transformations, and load the results into SQL tables.
• Experience with data visualization tools or platforms.
• Knowledge of machine learning concepts and tools.
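As an illustration of the read-transform-load shape described in the qualifications above, here is a minimal sketch using only the Python standard library. This is not the PySpark/Azure implementation itself: `csv` and `json` stand in for Spark readers, an in-memory `sqlite3` database stands in for the Azure SQL sink, and the `sales` table and its columns are hypothetical.

```python
import csv
import io
import json
import sqlite3


def extract(csv_text: str, json_text: str):
    """Read records from CSV and JSON sources (stand-ins for Spark readers)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows += json.loads(json_text)
    return rows


def transform(rows):
    """Example transformation: normalise types and round amounts to 2 decimals."""
    return [{"id": int(r["id"]), "amount": round(float(r["amount"]), 2)} for r in rows]


def load(rows, conn):
    """Load the transformed records into a SQL table (sqlite3 as the sink)."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:id, :amount)", rows)
    conn.commit()
```

In a PySpark notebook the same shape would use `spark.read.csv(...)`/`spark.read.json(...)`, DataFrame transformations, and a JDBC write to the target database.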
Mandatory skill sets:
Data Engineering, Azure Data Factory, ETL, Azure SQL, Cloud Computing.
Preferred skill sets:
Essential Skills & Personal Attributes:
- Positive, "can-do" attitude towards colleagues, clients and problems alike.
- Team player.
- Lateral thinker.
- Inquisitive mind and capacity to delve into details.
- Works in an organized manner.
- Adheres to timelines.
Years of experience required:
4-8 years
Education qualification:
Bachelors
Shift Hours
IST
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor Degree
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
Azure Data Factory, Data Engineering
Optional Skills
Team Player
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date