Senior Data Engineer - DataOps at Tidio
22K - 30K PLN / month
Wrocław, Poland
Salary: PLN 22 000 - 30 000 net (B2B) | PLN 18 000 - 25 000 gross (UoP). Location: remote work (with access to coworking spaces in your city).
We are a fast-growing tech company built by an experienced international team. Our product is a top-rated online platform that helps small and medium businesses grow sales through outstanding customer service. Our goal is to create a frictionless customer experience for individual users and, at the same time, help entrepreneurs worldwide grow their businesses by giving them access to a top-notch AI-driven tool.

Working at Tidio means impacting thousands of companies and millions of their users. But our clients are not the only ones who can grow with us. By joining Tidio, you can grow, too!

A few facts about us:
  • Our product is in the world's top 5 most popular live chat solutions, and our goal is to become no. 1. We were voted #10 on G2’s Top Customer Service Products for 2023.
  • The new Tidio AI feature answers up to 70% of customers’ questions in seconds and is available to users even on a free plan. It’s a real AI revolution! 🚀🤖
  • Every month, our widget is viewed by 510 million unique users, which is 6.2% of the global population. This means 27 million queries to our API daily and over 550k WebSocket connections in the peak time.
  • We currently employ over 180 fantastic people, including over 60 in the Engineering and Infrastructure teams. The plan is to grow the team even bigger soon.
  • In March 2022, we secured $25 million in a Series B investment round (read 👉 TechCrunch’s article to learn more).

Would you like to see what working with us looks like? Check out our #GrowWithTidio video 🎥 Want to know more about our Data Team?

In a nutshell, our Data Team delivers insights that help stakeholders navigate product strategy and maintain a healthy service. As data engineers, we collect, transform, and maintain data across multiple dimensions.
Our data platform is built on AWS with Infrastructure as Code, using Terraform and Ansible. We ingest data from MySQL databases and external APIs (using custom integrations or Fivetran). We use Snowflake as our Data Warehouse, fuelled by tailor-made ELT jobs orchestrated by Airflow. Our Data Analysts use Snowflake SQL (DBT soon) to access data, and then Python or R for manipulation and modelling. The dashboards displaying the data are built in Tableau.
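To give candidates a feel for the ELT pattern described above, here is a minimal, hypothetical sketch in plain Python. It is an illustration only, not Tidio's actual code: the real jobs read from MySQL, land data in Snowflake, and are orchestrated by Airflow, and every name below is made up. The key idea is that raw rows are loaded unchanged first, and transformation happens afterwards in the warehouse layer.

```python
# Illustrative ELT flow (hypothetical): extract raw rows, load them
# untouched into a staging area, then transform inside the "warehouse"
# (simulated here with plain Python instead of Snowflake SQL).

from datetime import datetime, date


def extract(source_rows):
    """Pull raw records from a source (e.g. a MySQL replica or an API)."""
    return list(source_rows)


def load(staging, rows):
    """Load raw rows as-is into a staging table (the EL before the T)."""
    staging.extend(rows)
    return len(rows)


def transform(staging):
    """Transform in the warehouse layer: aggregate events per day."""
    daily = {}
    for row in staging:
        day = row["ts"].date()
        daily[day] = daily.get(day, 0) + 1
    return daily


staging_table = []
raw_rows = [
    {"id": 1, "ts": datetime(2023, 5, 1, 9, 0)},
    {"id": 2, "ts": datetime(2023, 5, 1, 17, 30)},
    {"id": 3, "ts": datetime(2023, 5, 2, 8, 15)},
]

load(staging_table, extract(raw_rows))
events_per_day = transform(staging_table)
```

In a production setup, each of these steps would typically be a separate Airflow task so that failures can be retried independently and the raw staging data remains available for re-transformation.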
As a Senior DataOps Engineer, you will:
  • Build and maintain a solid, scalable data infrastructure using Containerisation and CI/CD best practices. Implement new and modify existing data ingestion and transformation flows (ETL/ELT).
  • Participate in solution design and peer code reviews to help establish and share DataOps best practices. Contribute to our internal data engineering library.
  • Build and maintain observability of the data infrastructure and monitor it to spot problems before they occur.
  • Build and maintain great, genuine working relationships, with elements of advising, coaching, and mentoring less experienced team members.
  • Build integrations for third-party data sources.
  • Represent DE Team across the organization (in front of stakeholders), on behalf of the DE Lead.
  • Help with the design and development of the data quality framework. Monitor data consistency and run data quality tests.
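Data quality tests like those mentioned in the last point often boil down to simple, automated checks run after each load. The sketch below is hypothetical (the column names, thresholds, and functions are invented for illustration, not taken from Tidio's framework), showing two common checks: a null-rate test and a freshness test.

```python
# Hypothetical data-quality checks a DataOps framework might run after
# each load: a null-rate check and a freshness check on a loaded table.

from datetime import datetime, timedelta


def null_rate(rows, column):
    """Fraction of rows whose `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)


def is_fresh(rows, ts_column, max_lag, now):
    """True if the newest row arrived within `max_lag` of `now`."""
    latest = max(r[ts_column] for r in rows)
    return now - latest <= max_lag


loaded_rows = [
    {"user_id": 1, "loaded_at": datetime(2023, 5, 2, 11, 50)},
    {"user_id": None, "loaded_at": datetime(2023, 5, 2, 11, 55)},
    {"user_id": 3, "loaded_at": datetime(2023, 5, 2, 12, 0)},
]
now = datetime(2023, 5, 2, 12, 5)

quality_report = {
    "user_id_null_rate": null_rate(loaded_rows, "user_id"),
    "fresh": is_fresh(loaded_rows, "loaded_at", timedelta(hours=1), now),
}
```

In practice, checks like these would be scheduled as downstream tasks of each ingestion job, with failed checks alerting the team before stale or incomplete data reaches dashboards.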
You are the perfect fit if you have:
  • At least 5 years of experience as a Data Engineer.
  • Working knowledge of Python. Hands-on coding and debugging experience using modern software delivery methods.
  • Expertise in ETL technologies and processes.
  • Strong understanding of the Apache Airflow orchestration tool.
  • Experience with Git (GitLab), Terraform and Ansible.
  • Ability to correctly understand business processes and translate business requirements into technical ones.
  • Strong understanding of solutions such as Distributed Systems, Database Systems (including NoSQL), Operating Systems, algorithms, and data structures.
  • Advanced SQL.
  • Working knowledge of OLAP systems such as Snowflake or BigQuery, and of table formats such as Apache Iceberg.
  • Working knowledge of Spark and Spark cluster setup.
  • Experience in cloud computing technologies, preferably AWS.
  • Working knowledge of containers (Docker), container orchestration systems (Kubernetes).
  • Drive for self-development – following the latest trends and publications in the field of your specialization on an ongoing basis.
  • Ability to speak English fluently.
You will earn extra points for:
  • Experience with DBT, GCP, bash scripting, Prometheus, Grafana.
  • Experience with Go or any other language used in data engineering.
  • Experience with Change Data Capture (CDC), a huge plus.
  • Experience with Queueing systems (RabbitMQ) and streaming (Kinesis/Kafka).
We would like to offer you:
  • A real impact on the company’s growth 🚀
  • Remuneration of PLN 22 000 - 30 000 net on a B2B contract, or PLN 18 000 - 25 000 gross on a contract of employment
  • Work with an experienced team that continually shares knowledge and is not afraid of testing new solutions;
  • Great development opportunities – company-supported courses and conferences;
  • Flexible working time – an optimum work-life balance is important!
  • Possibility to work 100% remotely, use one of our two offices in Poland, or book a coworking space in your city;
  • 26 days off guaranteed in a year;
  • Individual work tools – MacBook Pro, Dell screen, JBL headphones? You can tailor the equipment to your needs;
  • Sport & wellness benefit or its financial equivalent;
  • Private medical care or its financial equivalent;
  • Mental well-being program – individual therapy sessions and resources for employees;
  • Budget for 1:1 English language classes;
  • Free access to one of the most popular e-book/audiobook services;
  • Regular integration events (company-wide meetings, team events);
  • Discounts on Apple products;
  • Our famous onboarding bagels on your first day!

Would you like to meet other Tidioers in person? Make sure to visit one of our sites in Poland (more info here).
What happens when you send your CV?
  • Screening call with a recruiter about the position and the team [1h]
  • Interview with Michał (Head of Data Team) about your experience [1h]
  • A recruitment assignment
  • Call with the Data Engineering Team (feedback on the task with additional questions) [1h]
  • Offer and fireworks! 🎉
Don't hesitate and apply right away!

Diversity Statement
One of Tidio’s core values is to play fair. Therefore, we treat all candidates equally. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, or disability status. This means recruitment and selection of talent at Tidio is based only on individual merit and qualifications directly related to professional competence.
