The goal of the project was to convert requirements into knowledge graphs. To accomplish this we used a pre-trained LLM (OpenAI GPT-4o); training our own LLM was out of scope. We labelled our test set to get a grasp of the ontology (SPEC) used to split up sentences and to be able to correct the LLM's output. The LLM had to output JSON that conformed to the ontology. Other parts of the project were a simple dashboard website with an input field and an output field, where either JSON or a graph view could be selected; an auditor that gives feedback on the structure of a requirement (based on the INCOSE Guide to Writing Requirements); and lastly an OpenAPI specification so the client can easily import the functionality into AWS.
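To give an idea of the extraction step, the sketch below shows how a single requirement could be sent to GPT-4o with a request for JSON output. The prompt wording and the triple format are illustrative placeholders rather than the actual SPEC ontology, and the snippet assumes the official openai Python package (v1+) with an API key set in the environment.

    # Minimal sketch: ask GPT-4o to turn one requirement into knowledge-graph JSON.
    # The triple format in the prompt is a placeholder, not the real SPEC ontology.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    requirement = "The system shall log every failed login attempt within 1 second."

    response = client.chat.completions.create(
        model="gpt-4o",
        response_format={"type": "json_object"},  # force valid JSON output
        messages=[
            {
                "role": "system",
                "content": (
                    "Split the requirement into triples following the ontology. "
                    'Return JSON of the form {"triples": '
                    '[{"subject": ..., "relation": ..., "object": ...}]}.'
                ),
            },
            {"role": "user", "content": requirement},
        ],
    )

    print(response.choices[0].message.content)  # JSON string describing the graph

The returned JSON can then be rendered as a graph in the dashboard or passed on to the auditor.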
The client was enthusiastic about the project and was open to discussing and changing the original vision. Communication with the client went smoothly and any questions were answered quickly. Overall it was a pleasant experience.
We split the team across the parts of the project right from the beginning. We tried to separate the parts as much as possible to leverage expertise and minimize delays from switching between tasks. Everybody had a roughly equal workload. Two members took on the Scrum duties and worked together to keep the repository and Scrum board in order.