The point is not to explain it here but in the doc. It’s not useful to use vague terms that nobody will understand. Either we remove “Self-evaluation” or we detail what it means.
Now, reporting WCAG issues or categorizing them is not self-evaluation to me. I understand self-evaluation as the capability to track how good or how bad we are at reaching WCAG 2.2 AA, i.e. where we stand. I may be wrong, but the point is that this term needs to be defined.
I was just asking what process was changed in practice. From your answer, I gather that you didn’t really mean that any process was changed but more that the current processes are exposed.
Side note: I am proposing some process changes in my comments above
The Web Content Accessibility Guidelines (WCAG) defines requirements for designers and developers to improve accessibility for people with disabilities. It defines three levels of conformance: Level A, Level AA, and Level AAA.
XWiki is partially conformant with WCAG 2.2 level AA. Partially conformant means that some parts of the content do not fully conform to the accessibility standard.
Goals
The XWiki core Committers aim for XWiki to be fully conformant with WCAG 2.2 level AA.
Feedback
We welcome your feedback on the accessibility of XWiki. Please let us know if you encounter accessibility barriers on XWiki:
Accessibility of XWiki relies on the following technologies to work with the particular combination of web browser and any assistive technologies or plugins installed on your computer:
HTML
WAI-ARIA
CSS
JavaScript
Limitations and alternatives
Despite our best efforts to ensure accessibility of XWiki, there may be some limitations.
Add an Accessibility Manager role (and elect someone) → WIP, should be done soon
Write down somewhere how we include accessibility in our designs → We do not have formal design practices yet. There was a project to get one, but it wasn’t finalized. I decided to remove the design process affirmation from the accessibility statement proposal. We can add it back later when it becomes relevant.
Link to our development practices for WCAG → Done
Link to the accessibility automated tests → Done
Check if SonarQube has some accessibility rules and if so, link to them and consider enabling them for us → I checked and there are some. However I don’t know how to activate them properly. IMO this can wait a bit and we can update the accessibility statement once this is set up.
I added some detail (see the post above)
Reporting and categorizing is not self-evaluation in itself, but it allows us to observe its evolution over time. IMO it’s a good indicator of the state of accessibility. E.g. if at some point the number of accessibility issues gets to zero, it would be a good indicator that accessibility is better at that point than it ever was before. Let me know if you don’t agree with the details I added.
The other points you mentioned didn’t need much more discussion; I addressed them in the updated proposal above.
Interesting. I can help but the first step is to list which ones we would like to activate. The real question is whether this would be useful for us. If so, then we should start a thread proposal to activate some. It could be a duplicate of our current WCAG tests. However, the nice thing is that our WCAG tests are based on existing functional tests while these rules are static rules based on source files, so it could help find mistakes. It would be good to see if we fail those rules currently and whether the failures are real and not false positives.
If you’re interested in it, it would be nice to create a jira and plan it.
I agree that we can continue the work on the accessibility statement in parallel.
What I don’t like with this statement is that it’s too vague. A project that just started working on WCAG would be able to write exactly the same… We need to link to some measure of our progress on the compliance.
Same as above, we shouldn’t add links in the text. Links should be put on words and sentences. In this case:
I need to check the exact rules, but this will be faster and more convenient to check than integration tests (and contrary to integration tests, it’s trivial to spot where you might have broken something with your changes).
Created a ticket on jira for this topic
Removed all the See our
I’m not sure how to formulate this in a better way. In the template, they had three options: conformant, partially conformant, and not conformant. If we wanted to strictly respect the WCAG definition, it would be only conformant or not conformant. A nice thing with the accessibility statement is that it goes beyond this simple conformance status. A project that just started working on WCAG would not have much in its accessibility statement.
As for a metric for the progress of compliance, we don’t have many numbers that make sense without their context. One number that could make sense is the number of failing test categories from our automated tests. Eventually this will reach (and stay at) zero and we’ll need a new metric but at least that’s something we can use for now…
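As a rough illustration of that metric (the report format, file names, and the idea that each test suite maps to one WCAG category are my assumptions, not XWiki’s actual setup), a small script could count how many categories currently have failures in a JUnit-style XML report:

```python
# Hypothetical sketch: count failing WCAG test categories from a
# JUnit-style XML report. The mapping "one <testsuite> = one WCAG
# category" is an assumption made for illustration only.
import xml.etree.ElementTree as ET

def count_failing_categories(report_xml: str) -> int:
    """Return the number of test suites (categories) with at least one failure or error."""
    root = ET.fromstring(report_xml)
    return sum(
        1
        for suite in root.iter("testsuite")
        if int(suite.get("failures", "0")) + int(suite.get("errors", "0")) > 0
    )

# Example report with three categories, two of them failing.
sample = """
<testsuites>
  <testsuite name="color-contrast" failures="2" errors="0"/>
  <testsuite name="aria-labels" failures="0" errors="0"/>
  <testsuite name="keyboard-nav" failures="0" errors="1"/>
</testsuites>
"""
print(count_failing_categories(sample))  # prints 2
```

Tracking this single number over time would show progress toward zero failing categories, which is the point of the metric discussed above.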
The issue with counting the number of open issues is that not all accessibility issues are equal in severity. Moreover, the more refined the experience is for screen reader users, the more issues we can see. As an analogy, if you have a text that’s completely unreadable because of its color, you probably won’t notice and report the fact that it’s slightly off-center.
It was added by default in the template. From what I understand, it means that XWiki will not be accessible for tools (web browsers, screen readers, …) that do not support these technologies. Missing support for HTML or CSS is hard to imagine; however, I can see a new browser not fully supporting WAI-ARIA, and some users disabling JavaScript. We inform the user that if they disable JavaScript, XWiki will not have the expected level of accessibility.
I agree that the default wording here is quite roundabout, I reworked it to be more to the point.