2018-05-13 08:46:15 (edited by magurp244 2018-05-13 23:20:25)

As reported on [Wired]: Famed robotics company Boston Dynamics will be releasing a commercial version of their dog-like robot SpotMini next year. The robot's frame will be modular and can accommodate various attachments; it can go up and down stairs, open doors, compensate for impacts, and navigate environments using stereoscopic cameras. There currently isn't any price listed, but it will be interesting to consider what sort of accessibility applications this could be used for.

-BrushTone v1.3.3: Accessible Paint Tool
-AudiMesh3D v1.0.0: Accessible 3D Model Viewer

2018-05-13 10:00:00

I've heard of experiments like this before, though. The problem as I see it, however, is that AI logic isn't quite there yet.

A robot could go around objects, and with satnav could likely find places (although given that satnav accuracy isn't quite what it should be right now, that could be interesting). Firstly, though, how would it spatially compensate for your size and movement? A guide dog obviously has to be looking out for your head as well as your feet, has to leave enough room to guide both you and the dog around objects, has to watch for sticking-out handles, etc. All of these require actual judgement and spatial logic on the dog's part, especially since the situations one encounters in daily life aren't necessarily consistent, e.g. roadworks.

Secondly, and most critically, dogs, like humans, can perform broad category judgements from fairly loose sets of criteria. For example, if I tell Reever in a shop "find the counter", it doesn't matter whether that counter is metal or glass, what height it is, whether it has a display of things on it or not, whether it has a cash register on it or not, or whether there is a person behind it. She can perform exactly the same sort of instant soft categorisation that a human can, putting together a large range of factors to interpret the idea of "counter"; she even associates the bar in a pub with the word "counter" as well.

She can do this with a lot of objects and even basic concepts. For example, she will find "home" even when we're staying in a hotel, going as far as taking me to the door of a hotel room, and will find "out", meaning the exit we came in by, aside from more mundane objects like "find the bus stop", "find the door", or "find the steps".

Judging by the way AI recognises objects, I don't think we're yet at the point where the basic pattern matching that is used could cope with these sorts of concepts, especially ones like "home".
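To put that in concrete terms, a minimal sketch of how a label-based recogniser handles a broad concept like "counter" might look like the following. The labels and the mapping here are entirely made up for illustration; the point is that the system only "understands" a concept to the extent that someone enumerated its variants in advance:

```python
# Hypothetical sketch: a classifier emits a fixed label string, and a broad
# concept like "counter" only matches if that exact variant was listed ahead
# of time. A dog's soft categorisation has no such enumerated list.

COUNTER_LABELS = {"shop counter", "glass counter", "cash register desk"}

def matches_concept(classifier_label, concept_labels):
    """Return True only if the classifier's label was anticipated in advance."""
    return classifier_label.lower() in concept_labels

print(matches_concept("Glass counter", COUNTER_LABELS))  # enumerated, so it matches
print(matches_concept("pub bar", COUNTER_LABELS))        # same concept to a dog, but unlisted
```

The second call fails even though a pub bar is, conceptually, a counter; that gap between exact labels and fuzzy concepts is exactly the judgement problem described above.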

It's an interesting idea, but to me guiding just seems too necessarily built on category judgements and logical assumptions to be something which AI could cope with.

What I do think would be a good idea is an improvement in things like satnav control and directions to places, with AI used to find routes and accurate satellite information to tell you where you are, since one thing a guide dog cannot do is read a map big_smile.

Of course, this is also leaving aside the fact that even if a robotic guide dog were developed, quite apart from all the practical concerns of what it would run on, how it would cope with different physical surfaces, etc., I'd be willing to bet the price would be so prohibitively, insanely expensive that nobody could afford one privately anyhow.

With our dreaming and singing, Ceaseless and sorrowless we! The glory about us clinging Of the glorious futures we see,
Our souls with high music ringing; O men! It must ever be
That we dwell in our dreaming and singing, A little apart from ye. (Arthur O'Shaughnessy 1873.)

2018-05-13 11:48:15

Well, while I'm someone who favours moving away from traditional methods of orientation for the blind towards modern technology helping us get around, I'm still staying traditional here: if I am going to get a guide dog, I will get a guide dog, not some robot doing the guiding.
At a meeting in Dresden in Germany we had one of those experiments around. It looked like one of those small lawn mowers we had as kids, with a joystick at the handle to control the thing. It seemed quite flimsy, and we were told that we shouldn't bump it against walls because the sensors would get damaged if we did.
Logical at first, but for a device used in the street with bumps, cobblestones and the like, this rather misses the point for me.
Also, the system was quite large and had a number of other faults. For me it's like the Aira service: an interesting thing, but who the hell would use it?

Greetings Moritz.

Listen here, do you want something from me or what? Because if not, then let me tell you: don't talk rubbish at me. And listen, while we're at it, keep your head down in the shaft, or I'll flip your fingernails inside out, you can believe me on that.

2018-05-13 23:02:32 (edited by Jeffb 2018-05-13 23:03:24)

I would be interested to see how well this robot guide dog works. I wouldn't use it without my cane, but I would try it if I could afford it. I have a feeling that it's going to be expensive. Also, if it had more cameras, it might be able to take in a lot more information.
My thought is that if we fast-forward 50 to 100 years, perhaps we would have actual sighted-guide androids. They could carry things for us as well as hold a conversation while walking us places. Although, by that point robots may be trying to destroy humanity.

Kingdom of Loathing name JB77

2018-05-13 23:17:47

Those are some excellent points Dark, and I don't think dogs are in any danger of losing their jobs in the near future, as they also make wonderful companions. Perhaps the title may be a bit misleading, as Boston Dynamics haven't talked about any accessibility applications or using it as an actual guide dog; they plan on deploying it in office buildings and construction sites initially, eventually moving into home applications. The robot weighs 66 pounds and can run 90 minutes on a single charge. They've also demoed the robot navigating obstacles and environments autonomously, and plan on allowing third-party scripting and attachments for further customization.

There's a video of the whole presentation [here] in which the developers provide commentary on the SpotMini, its capabilities, and their future goals.

-BrushTone v1.3.3: Accessible Paint Tool
-AudiMesh3D v1.0.0: Accessible 3D Model Viewer

2018-05-13 23:35:10

About the idea of robotic guide dogs: I think when that kind of technology comes about, there'll be no need to put it in a robot; we'll wear it ourselves in some way. Like a pair of smart glasses with AI good enough that they can safely help you navigate streets and buildings through a mixture of audio cues from sensors and spoken information. You'd probably still need a cane as an ultimate failsafe, but I'm sure this kind of technology will happen in the not-too-distant future.

2018-05-14 01:52:16

Such devices have floated around in development for a while, such as wrist-mounted audio canes or chest-mounted devices. There's one in particular called [After-Sight]; there's a long-running thread on its development on the Raspberry Pi forums [here]. It's equipped with The vOICe, along with object and facial recognition systems, and was slated to cost around $100 or less. Unfortunately, they've run into a bit of a snag with funding, as the Canadian government denied their application for charitable status. They haven't given up, but progress has slowed a bit lately due to other factors. You can, however, still assemble and build your own or talk to them about it; the specs and hardware listings are available on their site, if you're feeling adventurous.

Thinking more about the guide dog notion for SpotMini: the robot has stereo cameras all around its body, but they seem to have a view range up to around waist height, although it can look further up and down by moving its body. Many of the demonstrations involve the robot having been guided around the environment beforehand to build a map; otherwise it wouldn't know where to go, as it has no spatial GPS functions or recognition yet, although it does seem able to react to and bypass dynamic obstacles.
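For a sense of what navigating a pre-built map involves at its simplest, here's a toy sketch using breadth-first search over an occupancy grid. Real systems plan over SLAM-built maps with far more sophisticated algorithms, so this is purely illustrative; the grid and coordinates are made up:

```python
from collections import deque

# Illustrative only: once an environment has been mapped into a grid of
# free cells (".") and obstacles ("#"), finding a route is a solved
# problem. Building and updating that map is the hard part.
GRID = [
    "....#",
    "..#.#",
    "..#..",
]

def shortest_path(grid, start, goal):
    """Return the length of the shortest route from start to goal,
    or None if the goal is unreachable. Cells are (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None  # goal unreachable

print(shortest_path(GRID, (0, 0), (2, 4)))  # 6 steps around the walls
```

The contrast with the previous paragraph is the point: path-finding on a known map is easy; it's acquiring and maintaining the map in a changing street that the robot currently needs a guided walkthrough for.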

Given its progress, it could probably keep track of a user and their hands and navigate them around obstacles like door knobs and other dynamic hazards, but for head tracking it would probably need an added camera module for a wider field of view, or else it would need to be programmed to stop and occasionally look around at different elevations. Identifying locations like "home" or bus stops would probably be fairly easy with a GPS module or map data of those locations, and it can already identify and navigate stairs or chart its path into and out of a room. As Dark mentioned with category judgements, though, in a fixed setting it could probably be programmed to identify specific objects to a degree, but identifying things like counters and other contextual objects dynamically would probably prove far more problematic, unless it could tap into a database that labels and identifies local objects for it.
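As a rough sketch of why the waypoint side is the easy part, a GPS-based lookup of named places could be as simple as a table of coordinates plus a great-circle distance check. The names and coordinates below are invented for illustration:

```python
import math

# Hypothetical sketch: "find the bus stop" reduced to a lookup against
# stored GPS waypoints, the kind of thing a GPS module makes trivial.
WAYPOINTS = {
    "home": (51.4545, -2.5879),
    "bus stop": (51.4550, -2.5901),
}

def haversine_m(a, b):
    """Great-circle distance between two (lat, lon) points, in metres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(h))

def distance_to(target, position):
    """Metres from the current position to a named, pre-stored waypoint."""
    return haversine_m(position, WAYPOINTS[target])
```

Note that this only works because "bus stop" is a fixed, pre-recorded point; it's exactly the dynamic, unlabelled objects like counters that can't be reduced to a coordinate lookup.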

It would likely function best in established environments, like the office settings they plan to deploy it in, or even a public library. I could imagine a SpotMini being able to guide people around obstacles to a set location in the library, and depending on the organization, even retrieve books. Something else to consider for loftier uses could be its potential for interplanetary applications, or situations where flesh-and-blood dogs may not be well suited. Given its stereo cameras, maybe it could even serve as a mobile platform for vOICe navigation, or feed other sensory data to the user to help them navigate alongside it. Actually, considering it can be piloted as opposed to autonomous, that last idea sounds pretty neat, heh. Cost is still an open question as they haven't given a price, though it's worth considering that many training agencies cite the cost of training a guide dog as ranging from $30,000 to $35,000 or more, although some agencies absorb this cost for applicants.

-BrushTone v1.3.3: Accessible Paint Tool
-AudiMesh3D v1.0.0: Accessible 3D Model Viewer
