Location: Chennai (Multiple openings)
Experience: 4 years
We are looking for a skilled and experienced DBT-Snowflake Developer to join our client organization!
As part of the team, you will be involved in the implementation of the ongoing and new initiatives of the company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!
Role Description:
This data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines; constructing data stores (SQL and NoSQL); working with big data tooling (Hadoop, Kafka); and building integrations that connect source systems and other databases.
Required Experience:
- Proficient in basic and advanced SQL programming concepts (stored procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.; see the illustrative query after this list)
- Knowledgeable in Shell / PowerShell scripting
- Knowledgeable in relational databases, nonrelational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering and documentation processes and performing unit testing
- Understanding and implementing QA and testing processes within the project
- Knowledge of any BI tool is an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiatives
- Ability to adapt to a fast-paced Agile environment
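By way of illustration, here is a minimal sketch of the kind of SQL this role involves (table and column names are hypothetical), using an analytical window function to derive validity ranges for a Type 2 slowly changing dimension:

    -- Hypothetical example: derive effective-date ranges for a Type 2
    -- slowly changing customer dimension using the LEAD window function.
    select
        customer_id,
        customer_address,
        updated_at as valid_from,
        lead(updated_at) over (
            partition by customer_id
            order by updated_at
        ) as valid_to          -- a NULL valid_to marks the current record
    from raw_customer_changes;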
Additional Requirements:
• Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake, ensuring that data from diverse sources is effectively transformed and loaded into the data warehouse or data lake.
• Implement and manage data models in DBT, ensuring accurate data transformation and alignment with business needs.
• Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting (see the sample model after this list).
• Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
• Establish DBT best practices and processes to improve performance, scalability, and reliability.
• Expertise in SQL and a strong understanding of Data Warehouse concepts and Modern Data Architectures.
• Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
• Migrate legacy transformation code into modular DBT data models.
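To illustrate the kind of DBT work described above, here is a minimal sketch of a DBT model for Snowflake (the model, source, and column names are hypothetical placeholders, not part of the actual project):

    -- models/staging/stg_orders.sql (hypothetical model name)
    -- Materializes a cleaned, typed staging table in Snowflake from a raw source.
    {{ config(materialized='table') }}

    with source as (

        -- 'raw' / 'orders' are placeholder names declared in a dbt sources.yml
        select * from {{ source('raw', 'orders') }}

    ),

    renamed as (

        select
            order_id,
            customer_id,
            cast(order_total as number(12, 2)) as order_total,
            to_timestamp_ntz(ordered_at)       as ordered_at
        from source

    )

    select * from renamed

Downstream models would reference this table with {{ ref('stg_orders') }}, which is how DBT builds its dependency graph and orders the transformations it runs.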