Cloud Engineer for Big Data


Location

Leeds, UK

Hours

37.5

Seniority level

Associate

Workplace policy

Hybrid

Job description

 

Our data engineering teams, together with data scientists, build advanced systems that collect and process huge amounts of data, find patterns, and reason and act faster than humans, using machine learning algorithms that get incrementally smarter each time new data enters the system. We are a group of people passionate about the science 'behind the bet', always outcome-oriented and focused on solving real-world problems with data and advanced analytics.

 

Your role in the team

 

We are building a world-class Big Data platform that will give us the power to process streams of data and enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market.

 

As a Cloud Engineer for Big Data, you'll build tools and processes for CI/CD pipelines that test and deploy the new Big Data platform. You'll work with DevOps practices and be active in all phases of the development life cycle. You will measure performance and costs, and drive initiatives to optimise them. You will take care of infrastructure automation, containerisation, distributed component integration, security, log aggregation, monitoring, and troubleshooting Big Data platform issues together with data engineers. You will work mainly with AWS and may do the same work in our private cloud environment.

 

Skills needed:

 

Must haves:

 

  • Current hands-on experience with Amazon Web Services (AWS) and Kubernetes
  • Understanding of the Infrastructure as Code paradigm and hands-on experience with Terraform
  • Experience with Docker or other containerisation technology
  • Experience with CI/CD (preferably GitLab or Jenkins)

 

Nice to haves:

 

  • Experience with networking fundamentals, firewalls, load balancers, and cross-account communication in AWS
  • Good understanding of the cloud computing paradigm (distributed logging, service discovery, stateless applications, scaling, HA)
  • Experience with any scripting language (Bash, Python, etc.)
  • Experience building pipelines that automate application scaffolding, testing, building, auto-scaling, and integration

 

What we offer

 

We'll Balance flexibility and performance together, giving you the tools you need to give us your best. And we'll offer you the chance to Belong, as part of a strong, winning team that spans the world.

 

We have welcomed our teams back to the office and have a balanced approach to office and home working. Our employees have the opportunity to work from home up to 80% of the time, with 20% office time built in to ensure we get some face-to-face collaborative team time - and the chance for a coffee and a catch-up!

 

We'll welcome you on board with an individual training budget to allow you to develop your skills. You'll enjoy 33 days' holiday including bank holidays (not to mention an extra day for your birthday), a rewarding bonus scheme, healthcare, an attractive pension package and a benefits scheme.

 

Plus, our season ticket loan and handy Metro Card loan will save you money getting to work and getting around town. And if you see the journey to work as part of your fitness regime, you’ll just love our Cycle to Work scheme.

 

#datajobs