Join Our Team!

We support students who want to apply for DAAD, CSC, and similar doctoral grants.

Open Position

📣 Research Assistant Opportunity: LLM & Knowledge Graphs 📚

We are excited to announce an opening for a Research Assistant position focused on Large Language Models (LLMs) and Knowledge Graphs, commencing in November. Join our dedicated team and be at the forefront of cutting-edge research in this rapidly evolving domain.

  • Role: Research Assistant

  • Start date: As soon as possible (flexible)

  • Commitment: 19 hours/week

Required Skills:

  • Basic understanding of transformer models such as BERT, GPT, and T5.

  • Familiarity with Knowledge Graphs and their integration with language models.

  • Familiarity with libraries such as Hugging Face’s Transformers and PyTorch (a minimal sketch follows after this list).

  • Hands-on experience with LangChain or a strong desire to learn.

  • A grasp of fine-tuning techniques and practices for large language models.

  • Strong analytical skills and attention to detail.
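
As a rough illustration of the baseline skill level we expect (this sketch is not part of the official posting; the model choice and example sentence are placeholders), a pre-trained transformer can be loaded with Hugging Face’s Transformers and queried for contextual embeddings in PyTorch:

    import torch
    from transformers import AutoTokenizer, AutoModel

    # Load a small pre-trained encoder (placeholder model choice)
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Tokenise an example sentence and run a forward pass without gradients
    inputs = tokenizer("Knowledge graphs complement language models.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual embedding per input token: (batch, sequence_length, hidden_size)
    print(outputs.last_hidden_state.shape)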

Key Responsibilities:

  • Assist in designing, implementing, and evaluating experiments related to LLM and Knowledge Graphs.

  • Dive deep into transformer architectures, understand their intricacies, and optimize their performance.

  • Work with LangChain, fine-tuning, and related tooling to ensure the smooth integration of LLMs into our existing systems (see the sketch below for the flavour of this work).
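
The following sketch, assuming only the Hugging Face Transformers library, gives an impression of one such integration task: verbalising knowledge-graph triples and using them as context for an LLM prompt. The triples, the model choice ("gpt2" as a small stand-in), and the question are illustrative placeholders, not project specifics.

    from transformers import pipeline

    # Toy knowledge-graph triples (subject, relation, object) -- illustrative only
    triples = [
        ("Marie Curie", "won", "Nobel Prize in Physics"),
        ("Marie Curie", "born_in", "Warsaw"),
    ]

    # Verbalise the triples so they can serve as prompt context
    context = " ".join(f"{s} {r.replace('_', ' ')} {o}." for s, r, o in triples)

    # Small placeholder model; any causal LM supported by the pipeline works
    generator = pipeline("text-generation", model="gpt2")
    prompt = f"Facts: {context}\nQuestion: Where was Marie Curie born?\nAnswer:"
    print(generator(prompt, max_new_tokens=20)[0]["generated_text"])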

Open Thesis

Determinantal Point Processes for Prompt Engineering for LLMs

The performance of a large language model (LLM) is sensitive to the way it is prompted. Automated prompt engineering methods aim to find suitable prompts for a given task by sampling several candidate prompts and evaluating them. Existing methods either do not generate sufficiently diverse sample prompts or rely on several meta-prompting tricks to achieve the desired results. In this thesis, we will use a prompt-selection method that directly optimises diversity and estimated performance by exploiting so-called determinantal point processes (DPPs). The thesis will involve comparing this technique to state-of-the-art prompt engineering methods such as PromptBreeder from DeepMind.
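
As a minimal sketch of the core idea (all scores, embeddings, and function names below are illustrative assumptions, not part of the thesis specification), a DPP-style selection can be run greedily over an L-ensemble kernel that couples per-prompt quality with pairwise similarity:

    import numpy as np

    def dpp_greedy_select(quality, embeddings, k):
        """Greedily pick k prompts that maximise the log-determinant of the
        L-ensemble kernel L = diag(q) @ S @ diag(q), trading off estimated
        quality (q) against diversity (via the similarity matrix S)."""
        # Cosine similarity between prompt embeddings
        E = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
        S = E @ E.T
        L = np.diag(quality) @ S @ np.diag(quality)

        selected = []
        for _ in range(k):
            best, best_gain = None, -np.inf
            for i in range(len(quality)):
                if i in selected:
                    continue
                idx = selected + [i]
                sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
                gain = logdet if sign > 0 else -np.inf
                if gain > best_gain:
                    best, best_gain = i, gain
            selected.append(best)
        return selected

    # Toy example: four candidate prompts, pick two
    quality = np.array([0.9, 0.85, 0.4, 0.8])
    embeddings = np.random.default_rng(0).normal(size=(4, 8))
    print(dpp_greedy_select(quality, embeddings, k=2))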

Requirements

  • Excellent, long-standing interest in and knowledge of mathematics

  • Good programming skills in Python; experience with PyTorch (optional)

Finished Thesis

Ali, Semab: Design and Development of Murphy System: Generating Meaningful Negative Samples for KGEs. Master's Thesis, University of Bonn, Germany, 2022.

Kumar, Abishek: Going Beyond the Paradigm of Knowledge Graph Embedding Models. Master's Thesis, University of Bonn, Germany, 2022.

Aykul, Can: Unveiling the Effect of using Moebius Transformations on Knowledge Graph Embeddings. Master's Thesis, University of Bonn, Germany, 2020.

Research Vacancies

Beyond the position advertised above, we currently have no open research vacancies, but we support students who want to apply for DAAD, CSC, and similar doctoral grants.

How to Apply

If interested, please send your CV to Dr. Sahar Vahdati: