Are you interested in being part of Kiwi.com's data transformation? Do you want to build software used by tens to hundreds of developers in their day-to-day jobs? Our data team is building a rock-solid platform that improves and eases the work of data engineers, analysts, scientists, and developers alike, ranging from small custom libraries to huge infrastructure holding terabytes of data. You can be part of the whole initiative!
Our data platform team develops and manages a set of tools that together make up one unified platform, ready to be used by anyone in the company. The team focuses on providing software that automates many of the tedious tasks done by data engineers, scientists, and developers. You will be working on a job orchestration tool that leverages the Kubernetes engine to schedule tasks. It is built on top of Airflow, with in-house software that reduces the time to production of any data pipeline. It covers the whole development lifecycle and simplifies every step: development, deployment, testing, and debugging. Developers can focus solely on their code, which usually transforms large volumes of data, without having to bother with things like deployment strategies.
Of course, while building our platform we need to make sure the tooling we provide still produces high-quality data. You can also be part of building the metadata service that collects all the metadata available in the company. We leverage this data to automatically monitor data quality and to automate change management.
We’re very keen on open-sourcing our work. You can check out our GitHub (https://github.com/kiwicom) with tons of projects. A few from our team: Contessa (a data quality library) and pg2avro (a translator of Postgres types to Avro-compatible types).
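For a flavor of what such a translator does, here is a minimal sketch of mapping a Postgres column to an Avro field. This is an illustration only, not pg2avro's actual API; the type table and the `column_to_avro_field` helper are assumptions for the example.

```python
# Illustrative sketch of Postgres-to-Avro type translation.
# NOT pg2avro's real API -- see https://github.com/kiwicom for the library.

# Common Postgres scalar types mapped to Avro primitive types.
PG_TO_AVRO = {
    "integer": "int",
    "bigint": "long",
    "real": "float",
    "double precision": "double",
    "boolean": "boolean",
    "text": "string",
    "varchar": "string",
}

def column_to_avro_field(name, pg_type, nullable=True):
    """Build an Avro record-field dict for one Postgres column.

    Nullable columns become an Avro union with "null", which is how
    Avro schemas conventionally express optional fields.
    """
    avro_type = PG_TO_AVRO.get(pg_type, "string")  # fall back to string
    return {
        "name": name,
        "type": ["null", avro_type] if nullable else avro_type,
    }

print(column_to_avro_field("booking_id", "bigint", nullable=False))
print(column_to_avro_field("note", "text"))
```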
If you are interested in working on such projects and pushing the limits of software and automation, you are the ONE we are looking for! If you love to experiment, build on top of technologies like Airflow, write Python apps, or do anything else from the open-source world, come see us!
A few examples of our Data Platform Engineers' work:
Data workflow management: To manage our data loads for the analytics world, we use Apache Airflow. Airflow enables scheduling data-related workflows with a code-as-configuration model and a web frontend, driving the data routines that feed our data-provisioning customers.
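The code-as-configuration idea can be shown with a toy sketch in plain Python (this is not Airflow's API, just the underlying concept): a workflow is declared in code as tasks plus their upstream dependencies, and the scheduler derives a valid execution order from that declaration.

```python
# Toy illustration of code-as-configuration: declare a DAG of tasks
# in code, then compute an execution order that respects dependencies.
# This mimics the concept behind Airflow; it is NOT Airflow itself.

def topological_order(deps):
    """Return an execution order for `deps`, a mapping of
    task -> set of upstream tasks that must run first."""
    order, done = [], set()

    def visit(task):
        if task in done:
            return
        for upstream in deps.get(task, set()):
            visit(upstream)
        done.add(task)
        order.append(task)

    for task in deps:
        visit(task)
    return order

# A tiny ETL "DAG" declared as data in code:
# transform waits for extract, load waits for transform.
dag = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
print(topological_order(dag))  # → ['extract', 'transform', 'load']
```

In real Airflow the same declaration lives in a Python DAG file, which is what makes pipelines reviewable, testable, and deployable like any other code.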
Real-time streaming infrastructure: To enable our analytics teams to move quickly, getting accurate data with minimal delay is a core focus of data provisioning & engineering. We are currently building out real-time infrastructure that allows for the easy development of streaming applications, including anomaly detection and forecasting.
What will you do?
- Write code (heh), probably in Python
- Design and implement parts of the data platform, on projects like Airflow plugins, IAM components, self-service monitoring, a testing framework, CI deployment, etc.
- Identify weak spots and refactor code that needs it during development
- Optimize code and usage of 3rd party services for speed and cost-effectiveness
- Regularly update and clearly communicate with the team about your progress and struggles
- Take part in on-call / Slack duty
What do we expect?
- 3+ years of full-time, industry experience in software development
- Broad knowledge of different types of data storage engines: (non-)relational, row- and column-oriented databases; hands-on experience with at least two of them, e.g. Postgres, MySQL, BigQuery/Redshift, Elasticsearch
- Experience with orchestration tools, Airflow being the best fit (nice to have)
- Advanced SQL knowledge
- Experience with batch or real-time data processing
- Strong coding skills in Python (preferred)
- Rigor in high code quality, automated testing, and other engineering best practices
- Cloud knowledge: Google Cloud (best fit), AWS, Azure (nice to have)
- BS/MS in Computer Science or a related field (ideal)
Why does it rock to work for us?
- We deploy immediately after a job is completed, not after months of QA.
- Do, fail, learn – repeat! We understand that mistakes happen and we learn fast.
- We decide which cutting-edge technologies are appropriate for the task.
- We code at hackathons and other competitions.
- We support the local technological community.
- We visit and speak at conferences and technological events worldwide.
- We use our work time wisely with a friendly vacation policy and work schedule.
- We also like to party and hang out together.
- We work, play, relax, workout and even nap in our offices.
- We’re a great team of passionate and fun-loving people from across the globe who you’ll love working with.
- We look forward to you joining our team-buildings and parties!
What do we offer?
• Besides a motivating salary, we offer quarterly bonuses dependent on the company's overall results and your own performance.
• We also enjoy benefits such as flexible working hours, 30 paid vacation days, sick days, private medical insurance, the Andjoy scheme, and an annual Smou bike-sharing subscription.
• Flight vouchers to celebrate your Kiwi.com anniversaries.
• Occasional work from home and/or work from our office in the cozy, central Sant Antoni neighborhood, in a Cloudworks coworking space where you can enjoy Kiwi.com's private office space, two common terraces with great views, discounts with lunch providers, common coworking activities (yoga, games, etc.), free refreshments, and showers.
• Hardware from Apple or Microsoft, based on your preference.
• Open-ended (permanent) contracts within a forward-thinking and ambitious company.
• Relocation package (including visa support).
• Dogs and parties are welcome in our offices.
Interested? Join us and hack the traditional ways of travel!
Kiwi.com is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, uniformed service, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law.