DevOps Engineer - Big Data Infrastructure

Location: Krakow, Poland

Seniority level: Associate

Workplace policy: Hybrid

Job description

Our data engineering teams, together with data scientists, are working on advanced systems that collect and process huge amounts of data, find patterns, and reason and act faster than humans, using machine learning algorithms that get incrementally smarter each time new data enters the system. We are a group of people passionate about the science 'behind the bet', always outcome-oriented and focused on solving real-world problems with data and advanced analytics.

Your role in the team  


We are building a world-class Big Data platform that will give us the power to process streams of data and enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market.

As a DevOps Engineer for Big Data Infrastructure, you'll build the infrastructure, tools and processes that enable other teams and end users to deliver business value. You will take care of deployment automation, containerisation, distributed component integration, security, log aggregation, monitoring and troubleshooting of a Big Data platform developed from scratch in-house. You will work mainly with the AWS cloud.

Skills needed:

Must-haves:

  • Current hands-on experience with Amazon Web Services (AWS) and Kubernetes
  • Understanding of the Infrastructure as Code paradigm and hands-on experience with Terraform
  • Experience with Docker or another containerisation technology
  • Experience with CI/CD (preferably GitLab or Jenkins)

Nice-to-haves:

  • Experience with networking fundamentals, firewalls, load balancers, and cross-account communication in AWS
  • Good understanding of the cloud computing paradigm (distributed logging, service discovery, stateless applications, scaling, HA)
  • Experience with any scripting language (Bash, Python, etc.)
  • Experience building pipelines that automate application scaffolding, testing, building, auto-scaling and integration

What we offer

  • Long-term flexible working practices - our employees can work from home, with some office time built in for face-to-face collaborative team time
  • The opportunity to work with the latest cloud technologies and build a comprehensive Data Platform from scratch
  • Growth in both technical and leadership areas - it is your choice where you want to go
  • Flexible employment (choose between B2B and an Employment Contract)
  • Creative rights scheme
  • IT conferences, internal training, lunch-and-learn sessions and an individual training budget

#datajobs