Hi,
I was trying to connect the LLM Application to our LLM server, which should be OpenAI-compatible. I can call that LLM directly from my PC, and I can start the wiki locally and install the LLM Application, but whenever it tries to connect to the LLM it gets an error as the response:
"An error occured: [{'type': 'missing', 'loc': ('body',), 'msg': 'Field required', 'input': None}]"
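The error looks like a validation error coming back from the server itself (pydantic/FastAPI style), as if the request body never arrived. Calling the server directly from my PC works fine, roughly like this (a minimal sketch; base URL, API key and model name are placeholders for our setup):

```python
import requests  # assuming the server exposes the standard /v1/chat/completions endpoint

# Placeholders: base URL, API key and model name are specific to our environment.
BASE_URL = "http://llm-server.example.com:8000"
API_KEY = "sk-placeholder"

response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "my-model",
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
print(response.status_code)
print(response.json())
```

So the server answers normal requests; it only complains when the LLM Application talks to it.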
Unfortunately there is no debug logging I could activate, so I tried downloading the source code and adding some debug messages to see what the application is really calling. The problem is that I had to make some changes just to get the code to build, because some XML files did not pass the validator. I managed that, but then, when trying to install the result from my local Maven repository, I got "Can't find descriptor for the component with type [interface org.xwiki.extension.handler.ExtensionHandler] and hint [xip]".
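As a stopgap, I've been considering pointing the LLM Application at a tiny local server that just logs whatever it receives, so I can at least see the outgoing request without rebuilding the extension (a rough sketch, nothing XWiki-specific; the port is arbitrary):

```python
# Throwaway server that prints whatever the LLM Application sends to it.
# Point the LLM Application's server URL at http://localhost:9999 for inspection.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class LoggingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", errors="replace")
        print(f"--- {self.command} {self.path}")
        print(dict(self.headers))
        print(body or "<empty body>")
        # Reply with minimal JSON so the caller does not hang waiting for a response.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"ok": True}).encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 9999), LoggingHandler).serve_forever()
```

That would only show me the raw request, though, not what happens inside the application.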
Is there a way I could debug the application?