2015-07-01 16:22:25

Please guys, support the project to make Unity Accessible.

Put your Vote to this feedback:
http://feedback.unity3d.com/suggestions … essibility

Unity would let us make audio games much more quickly and easily! Currently I am having trouble making a new account on their website.


2015-07-01 16:35:46

I will support this, but I can't promise it is possible. Unity uses 3D rendering in its user interface and, as it currently stands, there is no screen reader that can interact well with a 3D interface. So it is unlikely that this will work unless we can convince Unity to completely remodel their user interface, which is also unlikely. However, I'll support it.

"On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question." — Charles Babbage.


2015-07-01 18:29:31

I am also having problems creating an account.
I don't think it is that difficult for 2D games. We would just need access to some widgets in the games themselves, as well as to the editing items.


2015-07-02 05:25:12

They're rendering text as 3D objects, true... but I'd be amazed if they aren't using a render function that knows when it's rendering text. Either they have a render_text function, or a generic render function that gets the data from the text object. The former is easy to make screen reader accessible. The latter would be less efficient, but having the text show up in the accessibility layer whenever its geometry is accessed shouldn't be hard.
(Unless the geometry is accessed directly, rather than through methods, which would be odd for a big professional project like Unity.)

Alternatively, if there is a render(object) function, there could be a line like:
if (object instanceof Text) then render_accessibly((Text)object);

Even if building it in from the ground up would be better, the engine would need to be a horrible Eldritch kludge for accessibility not to be workable in some fashion. Excellent accessibility might be difficult to impossible, but we have people who stuck with Android in 2013 and I still use Noteworthy Composer, so mediocre accessibility would still be an enormous improvement.

(Ahem. <Insert rant about tactile displays here> )

Some of my games
Keep up to date by following @Jeqofire on twitter!
Ear Ninja?


2015-07-07 17:16:35

Unity 3D can also render 2D games, with an adapted 2D editor view. So it may be useful for some people smile


2016-11-16 01:10:39

Unity has added an accessibility option for their account creation, so now one can go add votes to this issue.
I posted on there about how it is possible to do code development almost 100% without any editor while still having access to Unity as a library and a collaboration tool.
Apparently there is a blind developer who has managed to get Unity working, so we'll see if he answers me on how to setup the project. I'll also ask around to see what other Unity developers I know say.


2016-11-19 13:12:14

I'd be very interested to hear how he manages that.


2016-12-01 01:48:35

Hi everyone, I would like to join the discussion on this.
I just gave the issue a few of my votes on the Unity Feedback site. I thought the list Frastlin compiled was excellent.

My attempt to use Unity with NVDA was pretty short-lived. The console, the inspector and the hierarchy windows all stay completely silent for me. Without access to these major parts, Unity is of course pretty limited.
Is this an NVDA only thing? What screen readers are you all using?


2016-12-01 15:46:31

Unity is pretty usable if you don't use the editor. I just created a template you can use to get started at:
https://github.com/frastlin/Screenreade … tyTemplate

Please let me know what you think.


2016-12-02 00:45:34 (edited by mikrima 2016-12-02 00:47:56)

Hi frastlin,

that's very cool, thank you for the template. I just downloaded it and gave it a try.
Would you be ok if I gave you a bit of feedback?

At least on my computer, the default save location for the log file is access-protected unless Unity is run in administrator mode. This might not be an issue for everyone, but I could imagine it throwing off some users.
Also, the log file belongs to a project; you wouldn't want it to mix with log files from other Unity projects. It makes sense to pick a default location inside the project's folder.
My suggestion would be to use the project's data path as the default.
The file would then be created in the Assets folder of the project. This is easy to find, you won't get an access denied error, and it would be a good default for any new project.
To set the path to that, you could simply add the following line right in the beginning of the constructor of MyDebugger:

LogPath = Application.dataPath + "/Unity_Logfile.txt";
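For reference, a minimal sketch of how such a file-based debugger could hook into Unity. The MyDebugger class layout here is my guess, not the template's actual code; Application.logMessageReceived is, however, the standard Unity callback for capturing log output:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch of a file-based debugger, not the template's actual code.
public class MyDebugger
{
    public string LogPath;

    public MyDebugger()
    {
        // Default to the project's Assets folder, as suggested above.
        LogPath = Application.dataPath + "/Unity_Logfile.txt";

        // This callback fires for Debug.Log, warnings, errors and
        // exceptions, including their stack traces.
        Application.logMessageReceived += OnLog;
    }

    void OnLog(string message, string stackTrace, LogType type)
    {
        // Append each entry so the file survives multiple play sessions.
        File.AppendAllText(LogPath, type + ": " + message + "\n" + stackTrace + "\n");
    }
}
```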

The most important issue is that the main scene has errors. The sample app doesn't run properly and throws an error.
This is due to some broken game objects in the scene. There is one object that is called "Missing Prefab" which has nothing on it. Then there are two objects with blank names. One of these has a missing script and a camera on it. The other has a sprite renderer with a missing sprite and another missing script.
In addition to the blank object containing a camera, there is also another camera object in the scene. Neither of the two cameras has the main camera tag. The one with the blank name is also disabled. Because of that, the script ExampleScript.cs crashes in line 26, when it tries to access the main camera.

Lastly, I wrote a quick and dirty Unity plugin that uses NVDA to read out the items in the scene hierarchy, and it works fairly well. I tried to use Tolk, but it wouldn't recognize NVDA, and the Jaws API isn't Unity-friendly because it is 32-bit. Maybe that is something worth integrating into your template project?


2016-12-02 08:46:00

Just to be clear, is this about making games created with Unity screen reader accessible, or about allowing screen reader users to use Unity for game creation? I'd be very interested in the latter.


2016-12-02 09:38:35

The latter.
The difficulty with Unity is that one needs to create objects and attach things to them, and most people do that visually in the editor. What this template gives you is the boilerplate code to load resources (sounds and pictures), create objects, and attach scripts to them.

mikrima, I will implement those changes! But to be honest, I don't know C# very well; I just put this template up because people wanted it and it shows one how to do it.
My brother is a Unity developer, so he gave me a template he uses and I stripped out all the visual aspects and put in audio stuff instead.
I really think the example file should be a small tutorial on objects, moving them and whatnot, basically giving one a crash course in Unity.


2016-12-02 13:24:03

Are the Unity developers bothering to implement this at all? I don't see this getting far without their cooperation.


2016-12-02 15:29:52

Read the readme and you'll understand.
This is not anything Unity needs to do; it is already possible. This template just shows how to do development in Unity without the editor. (Although the editor is needed to press Play, that is not too difficult to do.)


2016-12-02 18:01:07

frastlin, would you be ok if I cleaned up the scene and maybe the scripts a little and send it to you for review?

Here is my version of a cleaned up ExampleScript.cs (I also added a few extra comments)

using UnityEngine;
using System.Collections;

// This is an example script that is attached to an object in the MainScene after Play mode is entered.
// See the function called "Main" near the end of the file for further explanation.
public class ExampleScript : MonoBehaviour
{
    // This is a reference to an audio player used to play sound files
    AudioSource audioSource = null;

    // The following method is called as soon as this component is created, but before the Start function.
    void Awake()
    {
        // Debug.Log is like Console.WriteLine, but prints to the Unity console, as there is no regular console by default.
        Debug.Log("Hello world!");

        // Load the sound file
        var sound = Resources.Load<AudioClip>("Sounds/Death");

        // Add an audio player component to this game object and save a reference
        audioSource = gameObject.AddComponent<AudioSource>();

        // Tell the audio player which sound file to play
        audioSource.clip = sound;
    }

    // This method is called after Awake, but before the first Update.
    void Start()
    {
        // Nothing to do here in this example script
    }

    // The Update method is called every frame.
    void Update()
    {
        // Check whether the space key was pressed this frame
        if (Input.GetKeyDown(KeyCode.Space))
        {
            // Since the space key was pressed, tell the audio player to play the sound that was assigned to it
            audioSource.Play();
        }
    }
}

class ProjectMain
{
    // The following method is called automatically as soon as game mode is entered.
    // (Here this is assumed to be wired up via Unity's RuntimeInitializeOnLoadMethod attribute.)
    // It will create a new empty game object and attach an instance of the example script component to it.
    [RuntimeInitializeOnLoadMethod]
    public static void Main()
    {
        // Create a new game object with the name "Test Object"
        var obj = new GameObject("Test Object");

        // Add the Example Script component to it.
        // This will trigger the functions Awake, OnEnable, Start and Update to be called on that script.
        obj.AddComponent<ExampleScript>();
    }
}


2016-12-02 21:06:09

Perfect! I like it! I would suggest two things:
1. the debug comment should say something about the debugger.
2. I would like to see the main camera initialized in the script so it is a little more clear about how to move the player.

Are you going to send a PR my way? Or would you like me to just incorporate what you said?


2016-12-02 21:08:07

BTW, how are you finding the instructions? Is it working for you?


2016-12-02 23:21:26

Yes, PM is incoming soon.
I'll work the camera in there, no problem, and add extra commentary to the debug line.

I am fiddling with the instructions a little. With the change in the debugger script, step 4 of the installation instructions becomes optional. This is a good thing: the fewer steps required, the better. It's just less intimidating that way for those new to the whole thing.
The second step of the "Running Unity" instructions talks about a system window that might come up when opening the main scene. I am not sure what this is, I don't recall this happening.
This is just nit picking on a high level though!

On a more technical note: Unity actually compiles the source code every time you change it and then switch back to the editor window, not just when you press Ctrl+P. If you do press it, however, Unity will wait for the compilation to finish before actually starting the game.
Any compile errors end up in the console, so we have to find a way to make this known to the user. If there are compilation errors, Unity will not allow game mode to be started. There is no audio notification about this, only text rendered in the 3D viewport telling you to please fix the compile errors first.

Another bit of extra information: everything inside the Resources folder is packed into the final build when Unity creates it. This is important because Unity cannot check whether you are actually using all of the files in there. During development, lots of assets, from textures to sound files and music, are added to the project and then replaced, but the files are not always deleted. This can severely clutter up your build and make it really large.

The usual way to make sure that things that are not needed don't end up in the build is to put sounds and graphics in a different folder, and then use prefabs in the Resources folder that reference them. However, I have been playing around with this for a day now and cannot find a way to create proper prefabs without using the visual editor.
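The runtime side of that pattern is straightforward. A sketch, assuming a prefab asset saved under a Resources folder at Assets/Resources/Enemies/Orc.prefab (the path and names are made up for illustration):

```csharp
using UnityEngine;

public class SpawnExample : MonoBehaviour
{
    void Start()
    {
        // Load the prefab from any Resources folder; the path is relative
        // to Resources and omits the .prefab extension.
        var prefab = Resources.Load<GameObject>("Enemies/Orc");

        // One call creates a full copy of the prefab, including all of its
        // components and children.
        var instance = Instantiate(prefab);
        instance.transform.position = new Vector3(0, 0, 0);
    }
}
```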

Also I just realized that I didn't mention this earlier - I am sighted. I'm Michelle from icodelikeagirl.com, the one that's trying to build a plugin to make games created with Unity work with VoiceOver and TalkBack. I found this thread from the comment that you left under one of my blog posts.


2016-12-03 05:16:58

@frastlin: you do need some basic level of accessibility support, such as being able to read the console.


2016-12-03 10:27:46

Victorious, the template that is up on GitHub has a script that will output all the stack traces and debug errors to a text file located wherever you wish.

mikrima, wow! It is so awesome that you are helping with this; it is exactly what is needed. If you want, I can add you to the project so you don't need to send me pull requests.
I think a line can be added to the debug script to play a system sound when there is a bug.
Step 2 is about getting out of the project selection screen. That screen is completely inaccessible, so the instructions are there to tell whether one is in it. There is probably a better way to describe it, though.
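One way that "system sound on a bug" idea could be sketched: EditorApplication.Beep is a real editor-only call, but wiring it into a log callback like this is my own assumption, not anything that is in the template:

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Sketch: beep whenever an error or exception hits the console.
public static class ErrorBeeper
{
    [RuntimeInitializeOnLoadMethod]
    static void Hook()
    {
        Application.logMessageReceived += (message, stackTrace, type) =>
        {
            if (type == LogType.Error || type == LogType.Exception)
            {
#if UNITY_EDITOR
                // Editor-only: play the system beep so the user hears the bug.
                EditorApplication.Beep();
#endif
            }
        };
    }
}
```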


2016-12-03 13:37:58 (edited by mikrima 2016-12-03 13:38:24)

Hi frastlin,
Yes, I'd love to join the project.

I also added something that will monitor the console and speak up when there is a compile error, reading out the name of the script and the error message itself.

The code also monitors any attempt to enter Play Mode. If the game cannot be started, because there are compile errors, it will read out a message. That way the user isn't left in silence, wondering what is happening.

This currently only works with NVDA and uses Microsoft SAPI as a backup. I couldn't get Tolk to work with Unity properly yet.
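For anyone curious how talking to NVDA directly tends to look: NVDA ships a controller client DLL exposing a speakText entry point. A hedged P/Invoke sketch (it assumes nvdaControllerClient32.dll sits next to the executable, and is not necessarily how mikrima's plugin does it):

```csharp
using System.Runtime.InteropServices;

public static class NvdaSpeech
{
    // Entry points exported by NVDA's controller client library.
    [DllImport("nvdaControllerClient32.dll", CharSet = CharSet.Unicode)]
    static extern int nvdaController_testIfRunning();

    [DllImport("nvdaControllerClient32.dll", CharSet = CharSet.Unicode)]
    static extern int nvdaController_speakText(string text);

    // Returns true if NVDA was running and accepted the text;
    // on false, the caller can fall back to SAPI.
    public static bool Speak(string text)
    {
        if (nvdaController_testIfRunning() != 0)
            return false; // Non-zero means NVDA is not running.
        return nvdaController_speakText(text) == 0;
    }
}
```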


2016-12-03 17:01:40

That is awesome!
I would love to try this!
I would make the speech optional, as some people may just want the debug log in the folder.
I am worried, though, that if the debug message is spoken by the screen reader, the user isn't going to pick up the exact location on the first listen. Perhaps if it said the problem first, then the line number, it would be easier. But I oftentimes arrow through the debug logs so I can get the details. I need to try it first, though.
Eventually there should be a library one can use to speak through a screen reader on any OS, but this is totally a start!
It will be a lot faster developing knowing when there are errors rather than waiting for the script to start.
I sent you the collaborator invite, so you can push all these awesome updates!

I think what I should do next is actually make a simple game in Unity and test all these things.
What I was thinking of is a simple Go Fish game that uses recorded speech rather than a screen reader, so it can eventually be tested on every platform. I would also like it to be easy to add graphics, as I believe that will be the eventual use of Unity for blind developers: have the blind person write most of the game logic and sound, and a sighted person add in any animation and graphics.
The first questions I have are:
1. For the cards and deck, should I be using instances of GameObject for everything? Can I make a Card class that inherits from GameObject and then make 52 Card objects?
2. Are components just object attributes for game objects? If I have a value attribute in my Card class, should it be a Component object? When does one use a component versus just making attributes and methods in the object's class?
3. What exactly are children in Unity? I think of parents and children when I think of classes (like the Card class being a child of the GameObject class), but I don't think that is what Unity means by children. Are components made children, or is a whole object made a child?
4. How event-based is Unity? Should I be triggering events only through a tree, and adding and removing items from that tree based on different events? (This is kind of what children look like...)

Do you happen to know of any introduction to Unity that only uses code, not the inspector and GUI?
If not, that is what I eventually would like to create.


2016-12-03 19:44:38 (edited by mikrima 2016-12-03 19:49:45)

Great, thanks for the invite. I just pushed the updates. Give them a try when you get the chance.

I have to admit that I had to look up the rules to Go Fish again, because it's been such a long time.
It's definitely a fun game. But it is also a multiplayer game, so you will have to either deal with network code or create a simple AI for it.

Anyway, to your questions:
I have been told repeatedly that I have very limited skills in explaining things, so I apologize in advance.

1. GameObjects: Yes, definitely use individual game objects for each card. If the goal is to later add things like graphics or animation, you will need that separation, with each card being its own individual entity.
When you make your Card class, derive it from MonoBehaviour, like all components, and then add that component to your GameObject.
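A sketch of what that could look like for the cards (the class and field names are made up for illustration):

```csharp
using UnityEngine;

// A card is a plain component; the GameObject it sits on is the entity.
public class Card : MonoBehaviour
{
    public int Value;    // 1..13
    public string Suit;  // e.g. "Hearts"
}

public static class DeckBuilder
{
    public static Card CreateCard(int value, string suit)
    {
        // MonoBehaviours are never constructed with "new"; they are
        // attached to a game object instead.
        var obj = new GameObject(suit + " " + value);
        var card = obj.AddComponent<Card>();
        card.Value = value;
        card.Suit = suit;
        return card;
    }
}
```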

2. Components: Components are C# scripts that are attached to a game object. A game object is really nothing more than an empty entity that sits in space and does almost nothing other than having a position, rotation and scale. But you can attach scripts to it that give it more functionality; that is where practically ALL the functionality comes from. Scripts attached to a game object are called components, and game objects can have multiple components attached to them - usually they do. The scripts need to derive from MonoBehaviour; that is what causes the default functions, like Start and Update, to be called.

3. Children: The scene in Unity is hierarchical. That means game objects in the scene can have sub-objects attached to them. Those are also game objects, with their own components attached. But because they are children of another object, they are tied to its position. Simply put: if I put a piece of paper in my pocket and then walk into my yard, the piece of paper will also be moved into the yard. It didn't move on its own, and it didn't change its relative position to me (in other words, it is still in my pocket). But it goes where I go.
This has a lot of very helpful consequences for the graphics of a game. For example, one can make a game object and place it in the top half of the screen, then add a few children to it: one is a text on the screen, and there are two images of stars positioned just right and left of the text.
If all of these children have the same parent game object, then you can animate and move that parent around the screen and the text and the stars will move along with it. They won't change position relative to each other; the stars will always stay on the sides of the text.
Other than graphics, using children comes with other conveniences. If you delete the parent game object, all children are also deleted, which is VERY convenient. Also, if you hide the parent, all the children become invisible too - very handy if you want to disable your gameplay to bring up something like a pause menu. If a game object is inactive, or hidden, then the script components attached to it are no longer updated. Sounds stop playing, game logic stops, animations halt and so on. It's a very effective pause mechanism.
For your Go Fish game, I'd recommend putting the game logic script on a game object, and then creating all the cards as children underneath it.
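In code, the parenting and pause tricks described above come down to a couple of calls (a sketch; the object names are illustrative):

```csharp
using UnityEngine;

public class HierarchyExample : MonoBehaviour
{
    void Start()
    {
        var gameRoot = new GameObject("Game Logic");
        var card = new GameObject("Ace of Spades");

        // Make the card a child of the game logic object. From now on it
        // moves with its parent while keeping its position relative to it.
        card.transform.SetParent(gameRoot.transform);

        // Deactivating the parent pauses everything underneath it:
        // Update stops being called and sounds stop playing.
        gameRoot.SetActive(false);

        // Destroying the parent would also destroy all of its children:
        // Destroy(gameRoot);
    }
}
```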

4. Events: Unity is both very event-based and not very event-based at all. You have the standard functions that every component (aka MonoBehaviour) comes with, such as Awake, OnEnable, OnDisable, Start, Update, OnDestroy and such. But you can also register yourself as a listener for all kinds of events. You can also very easily create your own callback functions and have other scripts register with you to become listeners to that event. And then there is the option of sending an event message through the hierarchy of your scene, which means every component (aka script deriving from MonoBehaviour) that implements a function named like that event will get called.
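A minimal sketch of the "register your own listeners" flavor, using a plain C# event (all names here are made up for illustration):

```csharp
using System;
using UnityEngine;

public class GoFishGame : MonoBehaviour
{
    // Other scripts can subscribe to this to hear about drawn cards.
    public event Action<int> OnCardDrawn;

    public void DrawCard(int value)
    {
        // Notify every registered listener, if there are any.
        if (OnCardDrawn != null)
            OnCardDrawn(value);
    }
}

public class Announcer : MonoBehaviour
{
    void Start()
    {
        // Assumes both components sit on the same game object.
        var game = GetComponent<GoFishGame>();
        game.OnCardDrawn += value => Debug.Log("Drew a card worth " + value);
    }
}
```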

I know that there are probably close to a million tutorials for Unity out there, but I don't think I've come across one yet that doesn't use the Editor.

Personally, I think prefabs are the most powerful feature in Unity, aside from the general principle of using components for everything. And I haven't yet figured out how to create them without the Inspector - which is sadly completely inaccessible.


2016-12-03 22:27:19

Your explanations really help and seem clear to me.
I don't know the difference between a simple object and a prefab, but does this forum post help?
http://answers.unity3d.com/questions/86 … -to-a.html


2016-12-03 23:20:06

Yeah, I actually know that code. I should have written that while it is of course possible to create prefabs via script, I haven't found a GOOD way to create them without the Inspector. smile

When you create a game object and add one or more components to it, these component scripts often have public variables. A component that would move an object could for example expose a variable for the movement speed.
Every variable that is marked as public in the script is displayed in the Inspector, so that game designers can very comfortably change the parameters - without having to touch the code.
Once the designer has attached all the components he wants and set all variables to sensible values, he can save this GameObject setup. That is then called a prefab.
At runtime, instead of creating a GameObject via script from scratch, attaching all the components again and setting all the values yet again, a script can simply instantiate an instance of said prefab. One line of code, and Unity will create a GameObject with all the components as they were set up. This even includes children; prefabs can be quite complex.
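For completeness, the script-only route would go through the editor API. A hedged sketch using the PrefabUtility call from that era of Unity (this is editor-only code, so it belongs in an Editor folder; the menu item, paths and names are illustrative):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

public static class PrefabCreator
{
    // Adds a menu item that turns a configured GameObject into a prefab asset.
    [MenuItem("Tools/Create Example Prefab")]
    static void CreateExamplePrefab()
    {
        var obj = new GameObject("Enemy");
        obj.AddComponent<AudioSource>();

        // Save the object as a prefab asset; the path is relative to the project root.
        PrefabUtility.CreatePrefab("Assets/Resources/Enemy.prefab", obj);

        // Remove the temporary scene object now that the asset exists.
        Object.DestroyImmediate(obj);
    }
}
#endif
```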

Prefabs can for example be used for menu screens, with all buttons and labels etc already set up.
They can be used for enemies in a game, where the prefab will contain the character graphics, links to sound files, the AI scripts, a weapon object and whatnot.
There are about a million more examples - prefabs make game creation simply a LOT easier.

But without access to the Inspector, it's really hard to set any values on any of these variables.
It's a little frustrating too, because Unity already has wonderfully functioning keyboard navigation for the Inspector. I can use the arrow keys to go through all the exposed variables and all the components on a game object. I can see that the editor highlights the currently selected variable for me. But it doesn't forward this information to a screen reader. It's so close to being accessible, but then falls short.

I tried looking in the Unity code to find any way to retrieve what item in the inspector is currently being highlighted, so I could write code that sends that information to NVDA - but no luck.
