
Overview
Role Description:
The data engineering role involves creating and managing the technological infrastructure of a data platform: being in charge of, or involved in, architecting, building, and managing data flows / pipelines, and constructing data stores (SQL, NoSQL), tools for working with big data (Hadoop, Kafka), and integration tools that connect sources and other databases.
Role Responsibilities:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into the related code
- Develop efficient code with unit testing and code documentation
Role Requirements:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.); a brief SQL sketch follows this list
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering, documentation processes, and performing unit testing
- Understanding and implementing QA and the various testing processes in the project
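
The SQL and data warehouse items above (analytical functions, change data capture, slowly changing dimensions) typically come together in change-detection queries of the kind sketched below. This is a minimal sketch for orientation only: the customer_stage and dim_customer tables, their columns, and the is_current flag are hypothetical, not part of this role description.

-- Use an analytical function to keep the latest staged record per customer,
-- then flag rows whose tracked attributes differ from the current dimension
-- row (the change-detection step behind a Type 2 slowly changing dimension).
WITH latest_stage AS (
    SELECT
        customer_id,
        customer_name,
        city,
        updated_at,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY updated_at DESC
        ) AS rn                                 -- 1 = most recent record per customer
    FROM customer_stage
)
SELECT
    s.customer_id,
    s.customer_name,
    s.city,
    s.updated_at
FROM latest_stage AS s
LEFT JOIN dim_customer AS d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE                  -- compare only against the open Type 2 row
WHERE s.rn = 1
  AND (
        d.customer_id   IS NULL                             -- brand-new customer
     OR d.customer_name IS DISTINCT FROM s.customer_name    -- tracked attribute changed
     OR d.city          IS DISTINCT FROM s.city
      );

The rows returned by a query like this would then drive the inserts (new current rows) and updates (closing out superseded rows) that maintain the dimension's history.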
Additional Requirements:
- Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring effective transformation and loading of data from diverse sources into the data warehouse or data lake (a brief DBT model sketch follows this list).
- Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
- Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
- Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
- Establish DBT best practices to improve performance, scalability, and reliability.
- Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
- Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
- Migrate legacy transformation code into modular DBT data models.
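
As a concrete reference point for the DBT responsibilities above, the sketch below shows the shape of one modular DBT model targeting Snowflake: an incremental materialization that deduplicates order events coming from an upstream staging model. The file path, the column names, and the stg_orders model are assumptions made for illustration, not details of this project.

-- models/marts/fct_orders.sql (hypothetical path and sources)
{{
    config(
        materialized = 'incremental',
        unique_key   = 'order_id'
    )
}}

WITH source_orders AS (

    SELECT
        order_id,
        customer_id,
        order_total,
        loaded_at
    FROM {{ ref('stg_orders') }}      -- upstream staging model (assumed)

    {% if is_incremental() %}
    -- on incremental runs, pick up only records newer than what is already
    -- in the target table; the `this` variable refers to fct_orders itself
    WHERE loaded_at > (SELECT MAX(loaded_at) FROM {{ this }})
    {% endif %}

)

SELECT
    order_id,
    customer_id,
    order_total,
    loaded_at
FROM source_orders
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY order_id
    ORDER BY loaded_at DESC
) = 1                                 -- keep only the latest version of each order

Running dbt run --select fct_orders would build this model incrementally; legacy transformation code is typically migrated by breaking it into small, testable models of this shape.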
#SeniorDataEngineer #DBTDeveloper #SnowflakeDeveloper #DBTJobs #SnowflakeJobs #ModernDataStack #SrDataEngineering #ETLDeveloper #DataTransformation #SQL #Python #Airflow #Azure #AWS #GCP #Fivetran #Databricks #ADF #Glue #CloudData
Job Types: Full-time, Permanent
Pay: ₹1,500,000.00 - ₹2,000,000.00 per year
Benefits:
- Provident Fund
Schedule:
- Day shift
Application Question(s):
- We are looking for candidates who can join immediately or within a 30-day notice period. Would you be able to join within 30 days from the date of offer? (Note: Job Location: Chennai)
- Are you experienced in building and maintaining scalable data pipelines using tools like DBT and Snowflake? Please briefly describe a project where you used these tools.
- How comfortable are you with writing advanced SQL queries, including procedures, analytical functions, and performance tuning? Can you share an example of a complex SQL transformation you've implemented?
- Have you worked on migrating legacy ETL/ELT processes into modular DBT models on cloud platforms such as AWS, Azure, or GCP? If yes, please provide a short overview of the approach you followed.
Work Location: In person