
Consultant in ML/Data Pipelines at Avaron AB

Avaron AB

Stockholms län, Stockholm

Previous experience is desired

30 days left to apply for the job

About the Company

Avaron AB is a growing consultancy focused on technology, finance, and business support. We match your expertise with the market's most interesting assignments, offering a platform where your professional development is central.

About the Assignment

You will join a team building and operating customer-facing recommendation and personalization solutions. The environment is centered around Google Cloud Platform (GCP) and modern data/ML engineering practices, with a strong focus on reliable pipelines, high code quality, and scalable delivery.

Job Description
  • Develop and maintain ML/data pipelines using Vertex AI pipelines/Kubeflow and orchestration tooling (Cloud Composer/Airflow); a minimal illustrative sketch follows this list.
  • Work with BigQuery and write efficient SQL to support analytics and ML use cases.
  • Build and maintain Python codebases with strong engineering standards (OOP, linting/type checks, testing, and CI/CD).
  • Collaborate with stakeholders and engineers to design robust data models and system architectures.
  • Contribute to code reviews and continuous improvement of engineering practices.
  • Manage access and permissions in GCP (IAM, service accounts) and use Data Catalog where relevant.
  • Support infrastructure and deployment practices aligned with Infrastructure as Code principles.
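
For illustration only (not part of the assignment description), the sketch referenced in the first bullet above: a minimal Vertex AI Pipelines / Kubeflow (KFP v2) pipeline with a single BigQuery-backed component. The project, region, dataset, and component names are hypothetical placeholders; a real pipeline would add further steps, tests, and IAM-scoped service accounts.

    # Minimal sketch, assuming kfp v2 and the google-cloud-aiplatform SDK are installed.
    # All names (project, region, table) are hypothetical placeholders.
    from kfp import compiler, dsl
    from google.cloud import aiplatform


    @dsl.component(base_image="python:3.11", packages_to_install=["google-cloud-bigquery"])
    def extract_features(source_table: str, features: dsl.Output[dsl.Dataset]):
        """Pull a small feature snapshot from BigQuery and write it out as CSV."""
        import csv
        from google.cloud import bigquery

        rows = bigquery.Client().query(f"SELECT * FROM `{source_table}` LIMIT 1000").result()
        with open(features.path, "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow([field.name for field in rows.schema])
            for row in rows:
                writer.writerow(list(row))


    @dsl.pipeline(name="recommendation-feature-pipeline")
    def feature_pipeline(source_table: str):
        extract_features(source_table=source_table)


    if __name__ == "__main__":
        # Compile the pipeline definition and submit it to Vertex AI Pipelines.
        compiler.Compiler().compile(feature_pipeline, "feature_pipeline.json")
        aiplatform.init(project="my-gcp-project", location="europe-north1")
        aiplatform.PipelineJob(
            display_name="recommendation-feature-pipeline",
            template_path="feature_pipeline.json",
            parameter_values={"source_table": "my_dataset.user_events"},
        ).submit()
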
Requirements
  • Hands-on experience with GCP, including Vertex AI pipelines/Kubeflow pipelines.
  • Experience with BigQuery and strong SQL skills.
  • Experience with Cloud Composer/Airflow.
  • Experience with IAM and service accounts.
  • Experience with Data Catalog.
  • Understanding of Infrastructure as Code concepts.
  • Strong Python skills, including OOP and best practices.
  • Experience with code quality tooling such as flake8, mypy, black, SonarQube, and pre-commit.
  • Strong testing practices (unit and end-to-end) and familiarity with pytest, fixtures, and unittest; see the testing sketch after this list.
  • Deep understanding of dbt (preferably in a GCP context).
  • Good Unix and shell skills.
  • Git proficiency (PR workflow and resolving merge conflicts).
  • Ability to create CI/CD pipelines using GitHub Actions.
  • Strong Docker knowledge.
  • Understanding of data modeling and system design.
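
As a purely illustrative example of the testing bar described above, here is a minimal pytest sketch using a fixture; the function under test, a score normalizer, is invented for the example and is not taken from the assignment.

    # Minimal pytest sketch; normalize_scores is a hypothetical helper used only
    # to show unit tests with fixtures, not code from the actual assignment.
    import pytest


    def normalize_scores(scores: dict[str, float]) -> dict[str, float]:
        """Scale raw recommendation scores into the 0..1 range."""
        top = max(scores.values())
        return {item: value / top for item, value in scores.items()}


    @pytest.fixture
    def raw_scores() -> dict[str, float]:
        # Shared test data supplied via a fixture instead of being repeated in each test.
        return {"item_a": 2.0, "item_b": 4.0, "item_c": 1.0}


    def test_scores_are_normalized_to_unit_range(raw_scores):
        normalized = normalize_scores(raw_scores)
        assert normalized["item_b"] == pytest.approx(1.0)
        assert all(0.0 <= value <= 1.0 for value in normalized.values())
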
Nice to have
  • Experience with Dataflow.
  • Experience with Kubernetes.
  • Experience building high-availability APIs.
  • Background in machine learning–based recommendation systems and personalization.
Application

Candidates are selected on a rolling basis, so we recommend that you apply as soon as possible.
