I posted here a little while ago about a team at a Unity hackathon putting together a prototype of native screenreader support.
Turns out Unreal is further along: they've released screenreader functionality as an experimental feature in their current preview build. So while it's not a full release yet, one shouldn't be too far away.
Unity is the most popular game development tool and Unreal is the second most popular. If both of them manage to pull it off, it will have a profound and lasting impact on the blind accessibility of games. Not of gameplay (unless it's a UI-based game, like Football Manager or Hearthstone), but getting UI at least partially blind-accessible out of the box would obviously make a huge difference.
Here's what the changelog has to say about it:
"Slate Screen Reader Support (Experimental). Screen readers narrate an application’s UI to a user. Some examples of screen readers are: NVDA (NonVisual Desktop Access) and JAWS (Job Access With Speech) on Windows; and VoiceOver on iOS. UE4 now automatically sets up basic support in any application that enables the feature, and now developers can customize that basic support by modifying UMG properties. In this release, experimental implementations are in place for Windows and iOS."