Cloud Engineer (Mid or Senior)

Location

Krakow, Poland

Contract type

Full-time

Hours

40

Job description

Our data engineering teams, together with data scientists, work on advanced systems that collect and process huge amounts of data, find patterns, and reason and act faster than humans, using machine learning algorithms that incrementally get smarter each time new data enters the system. We are a group of people passionate about the science 'behind the bet', always outcome-oriented and focused on solving real-world problems with data and advanced analytics.

Your role in the team

We are building a world-class Big Data platform that will give us the power to process streams of data and enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market.

As a Cloud Engineer for Big Data, you'll build the tools and processes for CI/CD pipelines that test and deploy the new Big Data platform. You'll apply DevOps practices and be active in all phases of the development life cycle. You will measure performance and costs and drive initiatives to optimise them. You will take care of infrastructure automation, containerisation, distributed component integration, security, log aggregation, monitoring, and troubleshooting Big Data platform issues together with data engineers. You will work mainly with AWS and may do the same work in our private cloud environment.

Skills needed:

Must haves:

  • Current hands-on experience with Amazon Web Services (AWS) and Kubernetes
  • Understanding of the Infrastructure as Code paradigm and hands-on experience with Terraform
  • Experience with Docker or another containerisation technology
  • Experience with CI/CD (preferably GitLab or Jenkins)

Nice to haves:

  • Experience with networking fundamentals, firewalls, load balancers, and cross-account communication in AWS
  • Good understanding of the cloud computing paradigm (distributed logging, service discovery, stateless applications, scaling, HA)
  • Experience with any scripting language (Bash, Python, etc.)
  • Experience building pipelines that automate application scaffolding, testing, building, auto-scaling, and integration

What we offer

  • The possibility to work with the latest cloud technologies and build a comprehensive Data Platform from scratch
  • Long-term flexible working practices: our employees can work from home up to 80% of the time, with 20% office time built in to ensure some face-to-face collaborative team time
  • Growth in both technical and leadership directions: it is your choice where you want to go
  • Flexible employment (B2B or employment contract, your choice)
  • Creative rights scheme
  • IT conferences, internal trainings, lunch-and-learn sessions, and an individual training budget

#datajobs