Location: Yerevan, Armenia
Category: information technology
Type: Full Time
Deadline: 22-Sep-19 12:00:00 AM
Salary:
Description
Responsibilities
- Writing Spark applications using Scala
- Scripting and writing Airflow DAGs with Python
- Interacting with Redshift (including Redshift Spectrum) and Athena using SQL, as well as Postgres or MySQL
- Building ETL jobs in SQL
- Leveraging AWS services to build modern cloud-based data solutions
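The "ETL jobs in SQL" responsibility can be sketched with a minimal, portable example. SQLite stands in here for the actual warehouse (Redshift and Athena use different SQL dialects), and the table and column names are hypothetical:

```python
# Minimal ETL-in-SQL sketch. SQLite stands in for the warehouse engine;
# table and column names are hypothetical examples.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw events land in a staging table.
cur.execute("CREATE TABLE staging_events (user_id INTEGER, amount REAL, status TEXT)")
cur.executemany(
    "INSERT INTO staging_events VALUES (?, ?, ?)",
    [(1, 10.0, "ok"), (1, 5.5, "ok"), (2, 3.0, "error"), (2, 7.0, "ok")],
)

# Transform + Load: aggregate valid rows into a reporting table, entirely in SQL.
cur.execute(
    """
    CREATE TABLE user_totals AS
    SELECT user_id, SUM(amount) AS total
    FROM staging_events
    WHERE status = 'ok'
    GROUP BY user_id
    """
)

rows = cur.execute("SELECT user_id, total FROM user_totals ORDER BY user_id").fetchall()
print(rows)  # [(1, 15.5), (2, 7.0)]
```

In a warehouse the same pattern is typically expressed as `CREATE TABLE AS SELECT` or `INSERT INTO ... SELECT` statements orchestrated by a scheduler such as Airflow.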
Required Qualifications
- Experience with Scala, or experience with Java and a willingness to switch to Scala
- Experience with Python or Java
- Experience with Spark
- Experience with various data formats (any of: Parquet, Avro, ORC, ProtoBuf)
- Experience with Hadoop and AWS data ecosystem (any of: Redshift, EMR, Glue/Hive, Athena/Presto, Zeppelin, etc.)
- Experience with ETL/ELT tools and concepts, data modeling, SQL, query performance optimization
- Experience with building stream processing applications using Kinesis or Kafka
- AWS certification is a plus
- DevOps experience is a plus
- Ability to work with a remote team
- Advanced written and verbal English communication skills
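The query-performance-optimization qualification can be made concrete with a small sketch: inspecting a query plan, adding an index, and confirming the plan changes from a full scan to an index search. SQLite again stands in for the warehouse engine, and all names are hypothetical:

```python
# Sketch of query performance work: inspect a plan, add an index, re-inspect.
# SQLite stands in for the warehouse engine; table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(i, i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index, the plan reports a full table scan.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before[0][3])  # e.g. "SCAN orders" (wording varies by SQLite version)

cur.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index, the planner switches to an index search.
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after[0][3])  # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

Columnar warehouses like Redshift optimize differently (sort keys and distribution styles rather than B-tree indexes), but the workflow of reading the plan and adjusting the physical design is the same.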
Benefits