2020-03-28 17:44:30

Keep in mind, there's a lot that can be done in 2-D. Think roguelikes, tactical games like XCOM, etc. And I suspect Pitch Black, The Veil, etc. will all be 2-D. 2-D is far from a dead end, and there's so much that audio games haven't even touched. I still want to see a touchscreen-accessible roguelike, with explore-by-touch for investigating the map, combined with an Android-style swipe system: swipe up/down to cycle between categories of visible objects (doors, monsters, treasure, weapons, potions, etc.), then left/right to find the previous/next object in that category, with an audio icon playing a sound at each object's position so you can walk to it manually. To my knowledge, no one has done anything like that.
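The swipe layer itself is mostly just a pair of cyclic lists. Here's a rough C++ sketch of what I mean; every type in it is hypothetical, nothing comes from a real library:

    // Rough sketch of the swipe navigation idea. All names here are
    // hypothetical; nothing comes from a real library.
    #include <cstdio>
    #include <string>
    #include <vector>

    struct Object {
        std::string name;
        int x, y; // map coordinates, used for the audio icon
    };

    // One list of visible objects per category (doors, monsters, treasure...).
    struct Category {
        std::string name;
        std::vector<Object> objects;
    };

    struct Navigator {
        std::vector<Category> categories;
        int category = 0;
        int index = 0;

        void swipeVertical(int direction) { // up = -1, down = +1
            if (categories.empty()) return;
            int n = (int)categories.size();
            category = (category + direction + n) % n;
            index = 0;
            printf("%s\n", categories[category].name.c_str());
        }

        void swipeHorizontal(int direction) { // left = -1, right = +1
            auto& objs = categories[category].objects;
            if (objs.empty()) return;
            int n = (int)objs.size();
            index = (index + direction + n) % n;
            // Announce the object; a real version would also start looping
            // its audio icon at (x, y) so the player can walk to it.
            printf("%s at %d, %d\n", objs[index].name.c_str(),
                   objs[index].x, objs[index].y);
        }
    };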

2020-03-28 19:38:33

@24
As someone who has tried, this is not possible.  If you don't believe me, try it and get back to me.  You're not going to fix it with programming constructs.  Most of the issues with these things are your constraints going wrong for no obvious reason, and you can only debug constraints visually, by drawing the scene and watching what it's doing.

I tried to adapt ODE to 2D physics once.  Same problem.  It's hard to get everything you need turned off.

Whether the bodies are rigid or not doesn't change anything.

If you want to try non-tile-based simulations your answer is Box2D, which at least limits you to two dimensions.  You can turn off rotation there if you want, and that'll give you something sane to work with from an "I can only debug this by printing floats to the console" perspective.  There are a few other libraries like it that you might get something from.  But fully 3D audiogames are a pipe dream for the most part, and even if you cook up something, only a very tiny part of the community will be able to play it.
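For the record, turning rotation off in Box2D is one flag on the body definition.  A minimal sketch against the 2.4.x C++ API (older versions spell the header Box2D/Box2D.h):

    #include <box2d/box2d.h>
    #include <cstdio>

    int main() {
        b2World world(b2Vec2(0.0f, 0.0f)); // top-down game, so no gravity

        b2BodyDef def;
        def.type = b2_dynamicBody;
        def.position.Set(5.0f, 5.0f);
        def.fixedRotation = true; // the "turn rotation off" switch

        b2Body* body = world.CreateBody(&def);
        b2CircleShape circle;
        circle.m_radius = 0.5f;
        body->CreateFixture(&circle, 1.0f); // density 1

        // With rotation fixed, the angle stays put and the state you
        // care about is just two floats you can print every frame.
        for (int i = 0; i < 60; i++)
            world.Step(1.0f / 60.0f, 8, 3);
        b2Vec2 p = body->GetPosition();
        printf("%f %f\n", p.x, p.y);
    }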

Even Box2D requires being comfortable with transformation matrices, though, which has as a prerequisite being very comfortable with trigonometry and vector math, and unlike most things in this space there's no shortcut or way around it.  I have used Box2D and probably will again, since continuous collision detection is a hard problem, but that only works for me because I've got the math background to follow what's going on.  I don't know enough to know if you personally do, but for the most part this community as a whole doesn't.
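To give a taste of what I mean: the operation everything else builds on is taking a point in a body's local frame into world space, which is one 2D rotation plus a translation.  Box2D's GetWorldPoint does this for you, but you need to be able to follow the math underneath.  A bare-bones version:

    #include <cmath>
    #include <cstdio>

    // Take a point in a body's local frame into world space: one 2D
    // rotation followed by a translation. This is what the transform
    // matrix encodes, and what Box2D's GetWorldPoint computes.
    void worldPoint(float angle, float px, float py,
                    float lx, float ly, float* wx, float* wy) {
        float c = std::cos(angle), s = std::sin(angle);
        *wx = c * lx - s * ly + px;
        *wy = s * lx + c * ly + py;
    }

    int main() {
        float wx, wy;
        // A point 1 unit "forward" on a body at (10, 10) facing 90 degrees.
        worldPoint(3.14159265f / 2.0f, 10.0f, 10.0f, 1.0f, 0.0f, &wx, &wy);
        printf("%f %f\n", wx, wy); // prints roughly 10 11
    }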

But as @26 said, this is honestly not a problem.  Sighted people still make 2D games for themselves even now.  There's enough stuff that you can do without going 3D that you don't really need to be sad that you can't.

@25
Something like Godot is a better bet, but on the whole I'm not personally sold on this idea of collaborating with sighted colleagues.  The elephant in the room is that, often, you have to change gameplay to make it work for blind people.  Sometimes, as with Sequence Storm, that's not a big deal.  But games like Sequence Storm are few and far between, and even if you had the technology you'd probably have trouble getting a team together.

It would be nice if it happened, but I don't think it's realistic.  Anyone who is indie enough to want to collaborate on this kind of thing would be happy to use whatever tools, because inclusivity.  Anyone who is big enough to decide on the tech first, with no chance of changing it, almost certainly won't care, because they'll be considering money first, and "can we change the gameplay to be less fun for the sighted" is not what they want to be hearing.  Even if these tools are accessible, you'll be slower than them, and you won't be able to tell if you horribly broke the graphics, either.

It could happen, but the things that have to align to make it happen make this much less practical than it at first seems, even assuming the tools are accessible.

But even accessibility menus like Sequence Storm's introduce issues, broadly speaking.  As soon as you have a high score board, being unable to distinguish between someone making the game easier for accessibility and someone making the game easier as a way to cheat becomes a real problem.  "Sorry, you can't have any sort of reliable online scoring mechanism" and "Sorry, multiplayer might just randomly have one of the players turn on these 5 things for blind people" don't sell, and the consequence of shipping them anyway is that you've broken the balance of the game for the sighted players, who are 99.99% of the revenue.

So yeah, sure it can happen.  It has before, even.  But it's not as impactful as everyone likes to think.

@26
I'm being told that the APH Graphiti isn't dead after all and was at CSUN this year, so we might be getting a 2D braille display with touch sensitivity at a reasonable-ish price point soon.

Unfortunately, if you used it, you'd probably not have that many players.  And it's not portable, of course.  But it's almost large enough to flat-out adapt Rogue, and if they ever scale it up to 4x the size, you'll literally have an entire terminal's worth of 2D space to play with.

But even at the small size there's a lot of promise there--you could probably announce/provide audio and just use dots to indicate points of interest, or something along those lines.

I think Pitch Black is dead, also. There was some sort of big controversy about them not paying voice actors. I don't have the details on that though.

My Blog
Twitter: @ajhicks1992

2020-03-28 20:13:35

@27, what about someone like me, who has a little vision or can get some sort of help with it?
I don't mean just the constraints, but the whole thing (it didn't work with Box2D, but I tried Bullet and made some objects with it).
Also, Panda3D has a way to debug Bullet objects with nodes and node paths.
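From what I've read, Panda3D's debug node sits on top of Bullet's own btIDebugDraw hook, so in plain C++ you can implement it yourself and print the lines instead of drawing them.  An untested sketch:

    // Untested sketch: a btIDebugDraw that prints instead of drawing.
    #include <btBulletDynamicsCommon.h>
    #include <cstdio>

    class ConsoleDebugDrawer : public btIDebugDraw {
        int mode = DBG_DrawWireframe;
    public:
        void drawLine(const btVector3& a, const btVector3& b,
                      const btVector3&) override {
            printf("line (%f %f %f) -> (%f %f %f)\n",
                   a.x(), a.y(), a.z(), b.x(), b.y(), b.z());
        }
        void drawContactPoint(const btVector3&, const btVector3&, btScalar,
                              int, const btVector3&) override {}
        void reportErrorWarning(const char* msg) override { printf("%s\n", msg); }
        void draw3dText(const btVector3&, const char*) override {}
        void setDebugMode(int m) override { mode = m; }
        int getDebugMode() const override { return mode; }
    };

    // Usage, given an existing btDiscreteDynamicsWorld* world:
    //   ConsoleDebugDrawer drawer;
    //   world->setDebugDrawer(&drawer);
    //   world->debugDrawWorld(); // once per frame, after stepping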

2020-03-28 20:25:39

@28
Maybe with some vision, yes.  But it's still not going to lead to something playable without vision.  You might be able to get as far as 3D graphics, but your gameplay is going to be fundamentally 2D when you're done, unless you're writing only for other people with some vision.

My Blog
Twitter: @ajhicks1992

2020-03-28 20:33:36 (edited by Ethin 2020-03-28 20:58:55)

@26, I've always been fascinated by touch input. All the touch input systems I've seen give you X and Y positions, not gestures. How, then, do you figure out what gestures are being used? Or am I just looking at the wrong libraries?
Edit: OK, so SDL2 answers my question -- SDL_DollarGestureEvent.
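For anyone who finds this later, the rough flow is below. This is an untested sketch that assumes SDL is already initialized with a window; the event fields come from SDL2's SDL_DollarGestureEvent:

    #include <SDL.h>
    #include <cstdio>

    // Rough SDL2 $1 gesture flow. Call SDL_RecordGesture(-1) once to
    // start recording a template on all touch devices; after that,
    // matches arrive as SDL_DOLLARGESTURE events.
    void pollGestures() {
        SDL_Event e;
        while (SDL_PollEvent(&e)) {
            if (e.type == SDL_DOLLARRECORD) {
                printf("recorded template %lld\n",
                       (long long)e.dgesture.gestureId);
            } else if (e.type == SDL_DOLLARGESTURE) {
                // error is the distance from the best-matching template:
                // smaller is a closer match. x/y are the normalized center.
                printf("gesture %lld, error %f, center (%f, %f)\n",
                       (long long)e.dgesture.gestureId,
                       e.dgesture.error, e.dgesture.x, e.dgesture.y);
            }
        }
    }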

"On two occasions I have been asked [by members of Parliament!]: 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out ?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question."    — Charles Babbage.
My Github

2020-03-28 21:04:06

@30
The math for swipes is fairly straightforward: you look for touch down, you look for touch up, and you can get the velocity by comparing the times and the x/y coordinates.  Then you just set a velocity/direction threshold.

The OS will usually offer this for you as a built-in thing, and of course more complicated gestures are, well, more complicated, and you might want some filtering to get rid of jitter, but basic gesture recognition isn't actually as bad as it might at first seem.
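Spelled out in code, the swipe part fits on half a page.  A hypothetical sketch, not tied to any particular input library, with made-up thresholds you'd need to tune per device:

    #include <cmath>

    // Hypothetical swipe detector, not tied to any input library.
    // Feed it touch down/up events; it classifies four-way swipes.
    enum class Swipe { None, Left, Right, Up, Down };

    struct SwipeDetector {
        float downX = 0, downY = 0;
        double downTime = 0;

        void onTouchDown(float x, float y, double t) {
            downX = x; downY = y; downTime = t;
        }

        Swipe onTouchUp(float x, float y, double t) {
            float dx = x - downX, dy = y - downY;
            double dt = t - downTime;
            if (dt <= 0) return Swipe::None;
            // Velocity in pixels per second; the threshold is made up
            // and needs tuning per device.
            float vx = dx / (float)dt, vy = dy / (float)dt;
            const float minVelocity = 500.0f;
            if (std::fabs(vx) < minVelocity && std::fabs(vy) < minVelocity)
                return Swipe::None; // too slow: treat as a tap or drag
            // The dominant axis wins, which filters diagonal jitter.
            if (std::fabs(vx) > std::fabs(vy))
                return vx > 0 ? Swipe::Right : Swipe::Left;
            return vy > 0 ? Swipe::Down : Swipe::Up;
        }
    };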

My Blog
Twitter: @ajhicks1992

2020-03-28 22:08:56

@30 https://gitlab.com/lightsoutgames/godot … nReader.gd Search for "touch". I can't get a line number right now, but "touch_index" is a good place to start.

I don't know that this is a perfect algorithm, but I hacked it together on a small USB touchscreen, brought it to Android, and in both cases it felt pretty good to use. There's a lot that isn't handled--most widgets that need more interactivity than taps, for instance--but it's good enough to drive a basic button-driven UI on a touchscreen-only interface.