
Data Engineer - Snowflake

Chennai, Hyderabad, Bangalore
Job Description
Job Title: Data Engineer – Snowflake

Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, our 2800+ tribe is solving problems that eventually impact the lives of millions globally. Our culture is modeled around expertise and respect with a team-first mindset. Headquartered in Silicon Valley, you’ll find our delivery centers across the globe and offices in multiple cities across India, the US, UK, Canada, and Singapore, including a substantial remote global workforce.
We’re Great Place to Work-Certified™. Working at Tiger Analytics, you’ll be at the heart of an AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.

Curious about the role? What would your typical day look like?
As a Data Engineer – Snowflake, you will have the opportunity to solve some of the most captivating data management problems. On a typical day, you might:
· Use Snowflake SQL to build sophisticated stored procedures, applying data warehouse and ETL best practices and demonstrating mastery of Snowflake data modeling and ELT.
· Develop and maintain data architecture and data models.
· Create standardized procedures for data flows and efficiently integrate Snowflake with third-party tools.
· Collaborate with data scientists, BI developers, and analysts to create custom data models and integrations with Snowflake.
· Write SQL queries and perform tuning, testing, and problem analysis.
· Develop Unix and Python scripts for Snowflake and work with Snowflake utilities.
· Gather and analyze system requirements.
Job Requirement
What do we expect? Skills that we’d love!
· 5+ years of experience working in a data warehousing system and a minimum of 1 year of experience building and implementing a full-scale data warehouse solution based on Snowflake.
· Enjoys developing Unix and Python scripts to extract, load, and transform data and to build OLAP queries.
· Practical knowledge of Snowflake features and tools such as Snowpipe, the Snowflake Python connector, SnowSQL, Tasks, Streams, Time Travel, the query optimizer, metadata management, data sharing, and stored procedures.
· Comprehensive knowledge of data warehouse/ODS concepts, ETL, and data modeling fundamentals.
· Passionate about working in Python; SnowPro certification is recommended.
· Enthusiasm for data warehousing, including OLTP, OLAP, dimensions, facts, and data modeling, as well as the ability to gather and evaluate system requirements.
· Deep exposure to a cloud data ecosystem (Azure, AWS, or GCP), along with working familiarity with an ETL tool (Informatica or SSIS) and data visualization tools (Tableau/Power BI).

You are important to us; let’s stay connected!
Every individual brings a different set of skills and qualities, so even if you don’t tick all the boxes for the role today, we urge you to apply; there might be a suitable role for you tomorrow.
We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.