Required Experience, Skills & Competencies:
Hadoop ecosystem: HDFS, Hive, Sqoop, Kafka, ELK Stack, etc.
Spark, Scala, Python, and core/advanced Java
NoSQL databases, e.g. HBase, Cassandra, MongoDB
Relevant AWS or Azure components required to build big data solutions
Good to know: Databricks, Snowflake
Ability to develop and manage scalable Hadoop cluster environments
Good understanding of data warehousing concepts, distributed systems, data pipelines, and ETL
3+ years of professional experience with at least 2 years in big data engineering
Designation will be commensurate with expertise/experience. Compensation packages are among the best in the industry.