Required Experience, Skills & Competencies:
· Hands-on experience with:
Hadoop ecosystem - HDFS, Hive, Sqoop, Kafka, ELK Stack, etc.
Spark, Scala, Python, and core/advanced Java
NoSQL databases, e.g., HBase, Cassandra, MongoDB
GCP components relevant to building big data solutions
Good to know: Databricks, Snowflake
· Ability to develop and manage scalable Hadoop cluster environments
· Good understanding of data warehousing concepts, distributed systems, data pipelines, and ETL
· 3+ years of professional experience, with at least 2 years in big data engineering
Designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.