Who are we looking for?
We are looking for an experienced data engineer to help build out our data platform in order to maximise the value we can bring to our customers.
By platform, we mean the toolkit & (data engineering) best practices that our professional services team can leverage when serving our customers. You’ll help ensure that professional services can get full value out of the data that we collect & standardise, as well as the data coming in from our customers.
You will thus act as a bridge between the engineering team, which uses data engineering & science to standardise and augment all the data we scrape, and our professional services team (consisting of data analysts, BI wizards, data engineers & scientists), which leverages that data.
What are you responsible for?
You will need to understand how the data we collect is processed within Daltix, as well as how it is used to serve our customers. With that perspective, we expect you to:
- Provide tooling (such as Python scripts or internal web applications built using Retool) that simplifies or even automates tasks performed by professional services.
- Model how we store data in the context of our professional services team. Concretely, this means that you will design & build data marts that professional services can use to serve our customers in a performant way.
- E.g. help design tables that perform optimally for client use cases.
- Design & implement ETLs to simplify the implementation of certain use cases we see repeated across our customers.
- E.g. Set up data pipelines and ETLs to help ingest client data in a way that fits professional services while also adhering to data engineering best practices.
- Administer our business intelligence tool, Looker, which includes designing & maintaining efficient LookML code that is used as a base for dashboards and Looks.
- Manage the data infrastructure (schedulers, computing frameworks, …) needed by professional services.
- Work with our full-stack web developers to make our data available to our customer-facing web applications.
What makes you successful in this role?
We’re looking for a candidate with the following qualities as a minimum:
- You speak fluent English, as you’ll work in an international team. Moreover, you are a strong communicator, both in writing and verbally.
- You’ll be part of a team, but we need you to be able to organise your own work; we are a small company after all.
- You should be comfortable with Python, SQL, Bash & Git, as our current (data) stack is built around them.
- You’ve been a data engineer (as in creating & managing SQL-based ETLs, modeling data, and creating/managing data infrastructure) in a professional environment before (at least 2 years of experience).
It’s even better when:
- You’ve worked in (data) startups or scale-ups before and have built data architectures/infrastructure from the ground up.
- You’ve managed a (cloud-based) data warehouse such as Snowflake, Redshift, BigQuery, …
- You’ve worked with AWS (or Azure or GCP) and have used Terraform to manage your infrastructure.
- You’ve worked with BI tools (Tableau, Qlik, ...), perhaps even newer ones such as Looker.
What we offer
- We are a young, entrepreneurial and fast-growing company; you will have the opportunity to directly help shape our future and have a positive impact on our clients’ business.
- You'll be part of an international team of experts in a scaling & still very dynamic startup.
- Flexible work arrangements (we are very remote friendly!) with a lot of autonomy in what you do and where you do it - our work philosophy centers on empowerment and we trust you to know your schedule and work when you feel most productive.
- You will be able to participate in relevant training to stay at the top of your field.
- An open company culture where we play as hard as we work.
- Health Insurance coverage.
- Occasional drinks and (COVID-19 proof) team events.
- You’ll get the chance to meet and work with industry professionals and help lift the company to the next level.