Scala/Spark Developer with working knowledge in AWS Cloud technology
Job Responsibilities / Role
- The successful candidate will work on AWS infrastructure. They will need to assimilate information from many sources, and therefore require excellent data analysis skills.
- They will build ETL transformations using Scala/Spark.
- They will create Step Functions workflows, provision EMR clusters, run Scala/Spark jobs on those clusters, and develop Lambda functions.
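As a rough illustration of the kind of ETL work described above, the following is a minimal Scala/Spark sketch of a batch job that could run on an EMR cluster. The bucket paths, column names, and job name are hypothetical placeholders, not part of the role description.

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

object OrdersEtl {
  def main(args: Array[String]): Unit = {
    // On EMR, spark-submit supplies the cluster configuration;
    // the builder simply attaches to the provided runtime.
    val spark = SparkSession.builder()
      .appName("orders-etl") // hypothetical job name
      .getOrCreate()

    // Hypothetical raw-zone S3 path; real pipelines would take this as a parameter.
    val raw = spark.read
      .option("header", "true")
      .csv("s3://example-raw-bucket/orders/")

    // A typical transformation: filter, cast, and aggregate.
    val daily = raw
      .filter(F.col("status") === "COMPLETED")
      .withColumn("amount", F.col("amount").cast("double"))
      .groupBy(F.col("order_date"))
      .agg(F.sum("amount").as("total_amount"))

    // Hypothetical curated-zone S3 path.
    daily.write
      .mode("overwrite")
      .parquet("s3://example-curated-bucket/orders_daily/")

    spark.stop()
  }
}
```

In a setup like the one described, a Step Functions state machine would typically submit this job as an EMR step and hand off to Lambda functions for lightweight orchestration tasks.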
Mandatory skills
- Programming languages: Scala, with Apache Spark
- Cloud Technology Services in AWS: EMR, S3, Step Functions
- Databases: SQL Server, Oracle
- Experience with GitHub
- Working experience with Terraform modules to create and manage infrastructure (Glue Catalog, S3, code artifacts, etc.)
Nice-to-have skills
- Experience with JIRA and Confluence
- Excellent problem-solving, analytical, and technical troubleshooting skills.
- Experience in presenting and explaining complex issues and solutions to senior stakeholders in a clear and concise way.
- Strong teamwork and interpersonal skills; the ability to communicate and persuade at all levels; and the ability to establish positive working relationships.
- Strong experience in IT build development and support.
Qualifications: University degree in technology, preferably in computer science.