Databricks Platform Engineer (m/w/d)

Union Investment

Frankfurt am Main, Hesse, Germany
Published Feb 22, 2026
Full-time

Job Summary

As a Databricks Platform Engineer at Union Investment, you will play a pivotal role in the AI Competence Center, focusing on the development and maintenance of a Databricks Data Intelligence Platform. Your daily responsibilities include ensuring the availability, security, and scalability of Databricks workspaces using modern Infrastructure-as-Code (IaC) tools like Terraform and Terragrunt. You will manage cluster configurations, compute policies, and access controls while defining data catalogs and schemas via Unity Catalog. This role is highly technical, requiring the implementation and optimization of CI/CD pipelines through GitLab to automate software delivery. This position is particularly attractive for professionals seeking to work at the intersection of Asset Management and Artificial Intelligence within a stable, long-standing financial institution. The role offers a high degree of flexibility with mobile working options, a competitive salary range of €60,000 to €90,000, and comprehensive benefits including a complimentary Deutschlandticket.

Required Skills

Education

Completed degree in Computer Science, a comparable IT-related field of study, or equivalent professional qualification.

Experience

  • Several years of professional experience in software development and system operations using modern tech stacks.
  • Proven experience in the stable, secure, and cost-effective operation of applications on Microsoft Azure.
  • Demonstrated expertise in building and maintaining reusable Infrastructure-as-Code (IaC) templates, specifically with Terraform.
  • Practical experience working with Databricks, Delta Lake, and Spark.
  • Experience in implementing and improving CI/CD pipelines using GitLab.

Languages

Not specified

Additional

  • Location: Frankfurt am Main, Germany; full-time position.
  • Includes responsibility for application availability, security, and scalability.
  • Requires proficiency in Databricks Asset Bundles (DABs) and the Databricks SDKs for workspace configuration.