I am trying to use an LLM so that users can ask XWiki questions and get responses based on the XWiki pages I add.
I set up a local model using the gemma3:1b LLM. The XWiki AI Chat works fine when no collection is added to the context.
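For reference, the model itself answers when queried directly (this sketch assumes it is served through Ollama on the default port, which is how I run gemma3:1b locally):

```python
# Minimal check that the local gemma3:1b model responds on its own,
# assuming it is served by Ollama at the default http://localhost:11434.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "gemma3:1b", "prompt": "Say hello in one sentence.", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # prints the model's answer
```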
I then added one wiki page (test) with a single line of content to a collection (My docs). After this, when I ask the XWiki AI Chat a question, I only get an empty response.
Any help is appreciated. Thanks in advance.