Data Platform Engineer

Are you interested in being part of a data transformation? Do you want to build software used by tens to hundreds of developers in their day-to-day work? Our data team is building a rock-solid platform that improves and eases the work of data engineers, analysts, scientists, and developers alike. It ranges from small custom libraries to large infrastructure holding terabytes of data. You can be part of the whole initiative!

Our data platform team develops and manages a set of tools that together make one unified platform, ready to use by anyone in the company. The team focuses on providing software that automates many of the tedious tasks done by data engineers, scientists, and developers. You will be working on a job orchestration tool that leverages the Kubernetes engine to schedule tasks. It is all built on top of Airflow, with in-house software that reduces the time to production of any data pipeline. It covers the whole development lifecycle and simplifies every step – development, deployment, testing, and debugging. Developers can focus solely on their code, which usually transforms loads of data, and don't need to bother with concerns like deployment strategies.

Of course, while building our platform we need to make sure the tooling we provide still produces high-quality data. You can also be part of building the metadata service that collects all the metadata available in the company. We leverage this data to automatically monitor data quality and to automate change management as well.

We’re very keen on open-sourcing our work. You can check out our GitHub, which has tons of projects. A few from our team are Contessa (a data quality library) and pg2avro (a translator of Postgres types to Avro-compatible types).

If you are interested in working on such projects and pushing the limits of software and automation, you are the ONE we are looking for! If you love to experiment, build on top of technologies like Airflow, write Python apps, or do anything else from the open-source world, come see us!

A few examples of our Data Platform Engineers’ work:
Data workflow management: To manage our data loads for the analytics world, we use Apache Airflow. Airflow enables scheduling data-related workflows with a code-as-configuration model and a web frontend, driving the data routines that feed our data-provisioning customers.
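To illustrate the code-as-configuration idea, here is a plain-Python toy sketch (not actual Airflow API – the task names and pipeline are hypothetical): a workflow is declared as a dependency graph in code and executed in topological order.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical daily pipeline declared as code: task -> its upstream tasks
workflow = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run_order(dag):
    """Return the tasks in a valid execution order."""
    return list(TopologicalSorter(dag).static_order())

print(run_order(workflow))
```

In real Airflow, the same shape is expressed with a `DAG` object and operators, and the scheduler takes care of actually running each task on time.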

Real-time streaming infrastructure: To enable our analytics teams to move quickly, getting accurate data with minimal delay is a core focus of data provisioning & engineering. We are currently building out real-time infrastructure that allows for the easy development of streaming applications, including anomaly detection and forecasting.
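As a rough sketch of what streaming anomaly detection can look like (a rolling z-score over a fixed window; the function, window size, and threshold are illustrative assumptions, not our actual implementation):

```python
from collections import deque
from math import sqrt

def make_anomaly_detector(window=10, threshold=3.0):
    """Flag values more than `threshold` std deviations from the rolling mean.

    Detection only starts once the window is full, to avoid noisy
    statistics on the first few values.
    """
    history = deque(maxlen=window)

    def check(value):
        is_anomaly = False
        if len(history) == window:
            mean = sum(history) / window
            std = sqrt(sum((x - mean) ** 2 for x in history) / window)
            if std > 0 and abs(value - mean) / std > threshold:
                is_anomaly = True
        history.append(value)
        return is_anomaly

    return check

detector = make_anomaly_detector(window=10, threshold=3.0)
stream = [10, 11, 10, 12, 11, 10, 11, 12, 10, 11, 100]  # last value is a spike
flags = [detector(v) for v in stream]
print(flags)  # only the spike is flagged
```

In a real streaming setup the same check would run per event inside a consumer, with state kept per metric or per key.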

What will you do?

  • Write code (heh), probably in Python
  • Design and implement parts of the data platform, on projects like Airflow plugins, IAM components, self-service monitoring, a testing framework, CI deployment, etc.
  • Identify weak spots and refactor code that needs it during development
  • Optimize code and usage of 3rd party services for speed and cost-effectiveness
  • Regularly update and clearly communicate with the team about your progress and struggles
  • Take part in on-call / Slack duty
What do we expect?

  • 3+ years of full-time, industry experience in software development
  • Broad knowledge of different types of data storage engines – (non-)relational, row/column-oriented databases. Hands-on experience with at least two of them – e.g. Postgres, MySQL, BigQuery/Redshift, Elasticsearch
  • Experience with orchestration tools (Airflow is the best fit)
  • Advanced query language (SQL) knowledge
  • Working with batch and real-time data processing
  • Strong coding skills in Python (preferred) / Go / JS
  • Rigor in high code quality, automated testing, and other engineering best practices
  • Cloud Knowledge – Google Cloud (best fit), AWS, Azure
  • BS/MS in Computer Science or a related field (ideal)
We believe we are a fun bunch to work with; you’ll get to train in a fresh, global company, talk with people from around the world, and never get bored.

You’ll work at one of the most promising tech companies – awarded Forbes Startup of 2017, the Super-brand Award 2017, and a place in the Deloitte Technology Fast 50 (the fastest-growing technology companies in Central Europe) – and here is what we offer:

• We use our work time wisely and we have a flexible work schedule.
• We like to party and hang out together.
• We also enjoy common benefits, such as meal vouchers, a flexible benefits scheme, sick days, VIP Medical Care, flight vouchers, a free Multisport card, etc.
• Do, fail, learn – repeat! We understand that mistakes happen and we learn fast.
• Besides a fair salary, you can also look forward to quarterly bonuses dependent on our performance.
• Kids and parties are welcome in our offices.

Salary starting from 1 500 EUR gross, depending on relevant experience and skills.

Sounds interesting to you? Grab your chance and apply today!

Throughout the recruitment process and for some time after it’s finished, we’re going to process your Personal Data. You can find all the necessary information in our Privacy Policy available at

IT Recruiter: Katarína Daniš

LinkedIn profile

Location: Brno, Czech Republic


Employment type: Full-time
