Original question by: @iii@mander.xyz

On the one hand, I think OCR, text-to-voice, image-to-text, … have improved quite a lot.

On the other hand, more and more stuff is locked away in apps and JavaScript-blob websites, so I can imagine it’s harder for accessibility tools to get at the information.

But I’m just guessing. Do any of you know first or second hand?

  • Oh, I can definitely talk about this! I’m an FE engineer (10 years in industry) with expertise in accessibility (aka a11y), and I’ve made direct contributions to that space.

    The good news is that the modern web has very rich support for a11y. ARIA is the core technology for exposing web content to assistive tech; it’s well supported by all major browsers and handles both static and dynamic content. WCAG provides a foundational set of guidelines for accessible web, but the major players (such as Google) often have their own accessibility guidelines with stricter requirements than WCAG’s. Visual accessibility modes such as high contrast, dark mode, and zoom are also very well supported by the major OSes, and are configurable for your needs nowadays. The best screen reader, in my opinion, is VoiceOver; the worst is JAWS.

    Imo most problems arise at the user and implementation levels. Smaller companies and inexperienced devs may not care about making their websites accessible, which can lead to broken DOM trees that screen readers struggle to parse. Similarly, very few users provide alt text for images, which is what a screen reader announces in place of the image. The blame here is on the people rather than the technology. (There’s a small sketch of what decent ARIA and alt-text wiring looks like at the end of this comment.)

    One upcoming technology I’m actually excited about is AI-assisted screen readers, which can look at a website and fill in the blanks where ARIA struggles, like parsing images and graphs. They can also parse websites that lack accessibility information entirely. I’ve seen demos where you can even ask questions about the content, like asking for an explanation of the trend in a graph.

    Happy to answer any questions.
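
    To make the ARIA and alt-text points concrete, here’s a minimal sketch of the kind of wiring I mean. It assumes a browser environment; the element IDs, filename, and strings are made up for illustration, not taken from any real site.

    ```typescript
    // Disclosure widget: keep the visual state and the accessibility tree in sync
    // so a screen reader can announce "expanded" / "collapsed".
    const toggle = document.getElementById('faq-toggle') as HTMLButtonElement;
    const panel = document.getElementById('faq-panel') as HTMLElement;

    toggle.setAttribute('aria-controls', 'faq-panel'); // which region the button controls
    toggle.setAttribute('aria-expanded', 'false');     // current state, announced by screen readers
    panel.hidden = true;

    toggle.addEventListener('click', () => {
      const expanded = toggle.getAttribute('aria-expanded') === 'true';
      toggle.setAttribute('aria-expanded', String(!expanded));
      panel.hidden = expanded;
    });

    // Alt text: this string is what a screen reader reads in place of the image.
    const chart = document.createElement('img');
    chart.src = 'revenue-q3.png';                    // illustrative filename
    chart.alt = 'Bar chart: revenue grew 12% in Q3'; // illustrative description
    document.body.appendChild(chart);
    ```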

  • Lasherz@lemmy.world · 7 days ago

    Can’t speak much on that, but I can tell you that I worked on the computer of someone who used high contrast settings to help them see the screen better. After visiting 3 or 4 major websites, it was clear most don’t account for background GUI changes and will use less code rather than forcing their own colors, which leads to white-on-white or black-on-black text. Part of why I was there was that it was getting worse, so they got a 55" TV as a monitor, which let them go back to the standard Windows appearance and avoid all the nonsense.
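
    For what it’s worth, sites can detect that situation: the forced-colors media feature reports when something like Windows High Contrast is remapping the palette. A minimal sketch, assuming a browser that supports it (the "forced-colors" class name here is just illustrative):

    ```typescript
    // Detect forced-colors / high-contrast mode and react to changes, so the page
    // can defer to the system palette instead of ending up white-on-white.
    const forced = window.matchMedia('(forced-colors: active)');

    function applyForcedColors(active: boolean): void {
      // Toggle a hook class; the CSS behind it should rely on system color keywords
      // (Canvas, CanvasText, LinkText) rather than hard-coded white or black.
      document.documentElement.classList.toggle('forced-colors', active);
    }

    applyForcedColors(forced.matches);
    forced.addEventListener('change', (e) => applyForcedColors(e.matches));
    ```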

  • kimara@sopuli.xyz · 7 days ago

    Based on the WebAIM Million report, it could be said that things have improved marginally (https://webaim.org/projects/million/). Yes, maybe the tools people use are getting better, but the native experience isn’t improving much.

    From the report, the biggest accessibility pitfall is still poor contrast. It’s sad that users would need tools just to access information that’s only presented with poor contrast.
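
    For anyone curious what “poor contrast” means concretely, here’s a sketch of the WCAG 2.x contrast-ratio formula that reports like WebAIM’s check against (the example colors are made up):

    ```typescript
    // WCAG 2.x relative luminance and contrast ratio. Colors are [r, g, b] in 0-255.
    // AA requires a ratio of at least 4.5:1 for normal text and 3:1 for large text.
    type RGB = [number, number, number];

    function relativeLuminance([r, g, b]: RGB): number {
      const channel = (c: number): number => {
        const s = c / 255;
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
      };
      return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
    }

    function contrastRatio(a: RGB, b: RGB): number {
      const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
      return (hi + 0.05) / (lo + 0.05);
    }

    // Light grey text on a white background: roughly 2.96:1, which fails AA.
    console.log(contrastRatio([150, 150, 150], [255, 255, 255]).toFixed(2));
    ```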