AI in XWiki brainstorming

Given the current rate of advancement in the field, I think it’s not unreasonable to take the generalization a step further. Building on ideas proposed earlier, along with some possible but not yet fully tested approaches:

Idea 44: XWiki personified

Enabling it to:

  • Talk (XWiki as a platform and/or the knowledge within)
    – Fine-tuning one of the many LLMs available today (or tomorrow) on the available documentation, code, forum discussions, etc., to imbue it with an understanding of what XWiki is, what it can do, and what users usually struggle with (e.g. via PEFT/LoRA)
    – Maybe provide a way for further fine-tuning on the knowledge within the wiki
    – Using vector databases for fast similarity checks and/or giving it access to Solr (check out libraries like FAISS)
  • See - images and maybe even auto-taken screenshots of any rendered page (see SAM)
  • Do (tasks)
    – Something akin to AutoGPT/BabyAGI/AgentGPT
    – Using frameworks like LangChain (or others) to give it access to the XWiki API as a tool (with user validation prior to executing a proposed set of actions)
  • Speak (optional - via TTS)
    – Providing a hands-free way of interacting with the platform
    – Things like Bark or Uberduck
  • Generate (content - based on the context of the wiki combined with prompts from the user)
  • Research (additional information gathering from external sources)
  • Auto-repair (to a certain extent)
    – Given access to logs and information about the general setup, plus maybe a bit of clever prompt engineering for guidance, it may be able to diagnose itself, propose solutions, and execute them (again using something like AutoGPT)
  • Code
  • Adopt contextual or selectable personalities
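To make the similarity-check idea above concrete, here is a minimal sketch of vector retrieval in plain Python. The page titles and embedding vectors are made-up placeholders; a real setup would produce vectors with an embedding model and store them in a vector library like FAISS (or index the text in Solr) instead of a dict:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy "index": page title -> embedding vector. In practice these vectors
# would come from an embedding model and live in FAISS or a vector DB.
index = {
    "Installing XWiki": [0.9, 0.1, 0.0],
    "Writing macros":   [0.1, 0.8, 0.3],
    "User management":  [0.0, 0.2, 0.9],
}

def nearest(query_vec, k=2):
    """Return the k page titles whose vectors best match the query vector."""
    ranked = sorted(index, key=lambda t: cosine(query_vec, index[t]),
                    reverse=True)
    return ranked[:k]

# A query vector close to the "Installing XWiki" embedding ranks it first.
print(nearest([0.85, 0.15, 0.05]))
```

The retrieved pages would then be pasted into the LLM’s prompt as context, which is what lets a fine-tuned (or even stock) model answer from the knowledge within the wiki.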

The main idea is to abstract away most of the complexity by creating a natural-language interface between the user and the platform, flattening the learning curve needed to use and manage an instance.
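The "propose actions, get user validation, then execute" loop could be sketched roughly like this. Everything here is a stand-in: the planner is a hardcoded stub where a real agent would prompt an LLM (e.g. via LangChain), and the REST paths are hypothetical examples of XWiki API calls rather than verified endpoints:

```python
# Sketch of an agent loop that proposes XWiki API actions and asks the
# user to confirm the whole plan before anything is executed.

def plan_actions(request):
    """Stand-in for an LLM planner mapping a request to API calls.
    A real implementation would prompt an LLM with the available tools."""
    if "rename" in request:
        return [("GET", "/rest/wikis/xwiki/spaces/Main/pages/Old"),
                ("PUT", "/rest/wikis/xwiki/spaces/Main/pages/New")]
    return []

def execute(actions, approve):
    """Run the proposed actions only if the user approves the plan."""
    if not approve(actions):
        return "aborted by user"
    results = []
    for method, path in actions:
        # Placeholder: a real agent would call the XWiki REST API here.
        results.append(f"{method} {path} -> ok")
    return results

plan = plan_actions("rename the Old page to New")
print(execute(plan, approve=lambda a: True))   # plan approved, runs both calls
print(execute(plan, approve=lambda a: False))  # plan rejected, nothing runs
```

The approval callback is the key safety piece: the model only ever proposes a plan, and nothing touches the wiki until a human signs off on it.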
