
Snowflake - Senior Architect

Chennai, Hyderabad, Bangalore
Job Description
Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, our 4000+ tribe is solving problems that eventually impact the lives of millions globally. Our culture is modeled around expertise and respect with a team-first mindset. Headquartered in Silicon Valley, you’ll find our delivery centers across the globe and offices in multiple cities across India, the US, the UK, Canada, and Singapore, including a substantial remote global workforce.
We’re Great Place to Work-Certified™. Working at Tiger Analytics, you’ll be at the heart of an AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.
Curious about the role? What your typical day would look like?
As a Senior Architect, you will solve some of the most complex and captivating data management problems, enabling our clients to become data-driven organizations. You will seamlessly switch between the roles of Individual Contributor, team member, and Architect as each project demands, to define, design, and deliver actionable insights.
On a typical day, you might:
  • Collaborate with the clients to understand the overall requirements
  • Create a robust, extensible architecture to meet the client/business requirements
  • Translate business requirements into technical artifacts
  • Identify the right technology stack and tools that best meet the client's requirements
  • Design end-to-end solutions along with data strategy, including standards, principles, data sources, storage, pipelines, data flow, and data security policies
  • Collaborate with data engineers, data scientists, and other stakeholders to execute the data strategy
  • Perform data analysis to understand the relevant data sets
  • Design and create Data Models to cater to all the requirements
  • Identify and implement the right orchestration tool
  • Define the right security and access policies
  • Implement Snowflake best practices
  • Define the right data distribution/consumption pattern for downstream systems and consumers
  • Perform code reviews and ensure quality deliverables
  • Work with the DevOps team to define and establish a CI/CD process
  • Own end-to-end delivery of the project
  • Work closely with clients during UAT and ensure all client issues and concerns are addressed
  • Proactively identify performance bottlenecks and propose corrective actions
  • Design and develop reusable frameworks
  • Work with the Project Managers to prioritize work for each sprint to meet project milestones and deadlines
  • Closely monitor project progress and provide regular updates to the leadership teams on milestones, impediments, etc.
  • Guide and mentor team members, and create technical artifacts
  • Demonstrate thought leadership
Job Requirement

  • 13+ years of overall experience, including a minimum of 8 years as an Architect on analytics solutions and around 3 years with Snowflake
  • A minimum of 3 years of experience working with cloud platforms, and familiarity with public cloud architectures
  • Excellent understanding of database and data warehouse concepts
  • Solid experience in migration projects, migrating from on-premises systems to Snowflake
  • Comprehensive knowledge of Snowflake capabilities such as Snowpipe, SnowSQL, and Streams, a strong grasp of the Snowflake architecture, and awareness of Snowflake roles and user security
  • In-depth knowledge and experience in data migration from RDBMS to Snowflake cloud data warehouse
  • Experience implementing Snowflake integration with dbt and other modern data stack tools
  • Well-versed in incremental extraction and loading, both batch and streaming
  • Experience with Azure, AWS, or GCP
  • Ability to architect data solutions aligned with the strategic technology roadmap and emerging industry trends
  • Strong exposure to Data Modelling, Data Access Patterns, and SQL
  • Knowledge of cost management, infrastructure planning, and disaster recovery
  • Good experience in data governance and a solid understanding of DevOps practices
  • Understanding of Data Virtualization
  • Experience working on Big Data platforms
  • Ability to work closely with cross-functional teams
  • Excellent communication (verbal and written) and excellent interpersonal skills
You are important to us; let's stay connected!
Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable or unique role for you tomorrow.
We are an equal opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.
Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.