Big Data Engineer (Mid or Senior)


Krakow, Poland

Job description

Our data engineering teams work together with data scientists on advanced systems that collect and process huge amounts of data, find patterns, and reason and act faster than humans, using machine learning algorithms that incrementally get smarter each time new data enters the system. We are a group of people passionate about the science 'behind the bet', always outcome-oriented and focused on solving real-world problems with data and advanced analytics.


Your role in the team


We are building a world-class Big Data platform that will give us the power to process streams of data and enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market. You will work with large volumes of data in both batch and streaming processing, and integrate our AWS-based platform with a range of internal and external systems. You will have exposure to every phase of the development cycle, from design through coding and testing to release into production.


Skills needed


  • You're an expert in data engineering and data quality.
  • You have strong skills in custom ETL design, implementation, and maintenance, and you've used SQL to handle large data sets and complex data transformations.
  • You have hands-on experience with schema design and dimensional data modelling; experience with dbt is a plus.
  • You have experience with programming languages, Python preferred.
  • You have worked in small, focused scrum teams delivering event-driven integrations across multiple teams.
  • You're experienced in working within an integration environment with testers to ensure end-to-end performance and resilience SLAs can be achieved.
  • You write well-designed, testable, efficient code that follows good coding standards.
  • You have an agile mindset and practise software development processes such as Scrum, Kanban, TDD, and BDD.
  • You have experience mentoring other engineers (if you're applying for a Senior role).


Nice to have:


  • Experience with cloud solutions for Big Data (Snowflake, GCP BigQuery, AWS Redshift).
  • Experience with data pipelines (e.g. Airflow) and stream processing (Kafka, Kinesis, Spark Streaming, Flink).



What we offer


  • Long-term flexible working practices: our employees can work from home, with some office time built in for face-to-face collaborative team time.
  • The possibility to work with the latest cloud technologies and build a comprehensive Data Platform from scratch.
  • Growth in both technical and leadership directions: it is your choice where you want to go.
  • Flexible employment (B2B or employment contract, your choice).
  • Creative rights scheme.
  • IT conferences, internal training, lunch-and-learn sessions, and an individual training budget.