Research Associate – LLMs for Secure Code Generation (SMACKS Project), Reference IT/2603

Hochschule Esslingen

Esslingen am Neckar, Baden-Württemberg, Germany
Published Feb 18, 2026
Full-time
Fixed-term

Job Summary

This research position is part of the SMACKS project at Esslingen University of Applied Sciences, at the intersection of Machine Learning and IT Security. As a Research Associate, you will work on the architecture and training of Large Language Models (LLMs) for detecting and remediating vulnerabilities in software code. Your responsibilities include optimizing LLM architectures, researching scalable methods for handling large code contexts, and applying Explainable AI (XAI) techniques to interpret model behavior. You will also build and operate training pipelines on GPU cluster infrastructure and evaluate models against established benchmarks. The role offers the opportunity to pursue a doctorate (PhD) in a collaborative environment with modern ML toolchains. The position is based at the Flandernstraße campus and includes flexible working hours, home office options, and a fixed-term contract running until March 2030.

Required Skills

Education

An excellent Master's degree in Computer Science or a related field.

Experience

  • Professional experience in software engineering with Python and C++
  • Practical experience in training and fine-tuning LLMs using modern Machine Learning toolchains
  • Deep theoretical understanding of LLM architectures and their fundamentals
  • Experience in research and publication of scientific results
  • Demonstrated interest or experience at the intersection of generative AI and IT security

Languages

Not specified

Additional

  • Location: Esslingen (Campus Flandernstraße)
  • Fixed-term contract until March 31, 2030
  • Opportunity for doctoral studies (PhD)
  • Reference number IT/2603 must be cited in the application