Saturday, July 28, 2018

Sonar for the visually impaired - Part 1 Theory

Projects / Project Swiftlet 
Original post date: 05/20/2015

This is a low-cost, portable ultrasonic range finder and sonar device to help a visually impaired person navigate indoors.

The goal is to design an easy-to-use, low-cost, lightweight device with a long battery life for the visually impaired. I am going to optimize the design for manufacturing.

With two ultrasonic range finders, one on each side of my head, giving audio/haptic feedback on the relative distances from the two sensors, I should be able to orient myself towards the centre of an indoor passage. The device can also detect stairs, and a proximity alert warns me when I am about to run into a wall.

There is also an analog sonar mode that plays back the ultrasonic echo at 1/20 speed. A similar research project and paper have been published. With training, this mode conveys more information about the surroundings than the range finder alone.

I believe the goal is realistic and within reach, yet it can change the lives of others for the better. This is a new area for me to learn and explore, so I'll try to document and share my thoughts and notes in the project logs.

Motivation and what I am trying to do:

I got this idea while watching Daredevil on TV. Our hero is a blind person with heightened senses who even fights crime in his spare time. But we live in reality and don't get superhuman senses from unknown chemicals.

On the other hand, a very crude form of echolocation might be within reach of a hacker. This is not meant to be a toy prop, but an experiment to see if I can design something low cost that would help a visually impaired person navigate indoors.

It is a serious problem: about 40 million people worldwide, and 1.3 million in the US, are legally blind. This is a personal project, currently funded by myself. When the project is ready for an alpha release, I'll present it and work with local groups in the area on a trial and, hopefully, production.

What this can do and how it is different than similar projects:

There are a few similar projects/designs that I came across, but mine follows a different design philosophy. I want tighter integration for better battery life and greater size and weight reductions than are possible with off-the-shelf modules. The design should also be intuitive and ergonomic for the end user.

Two rangefinders (one for each side) are mounted on a pair of glasses. As the head is turned, each rangefinder maps out the distances in front. With simple math, the device can figure out the amount of correction needed for the user to walk parallel to the hallway.


With simple geometry, the angle ϴ can be worked out. The device can then inform the user how much correction is needed to stay parallel to the wall.

tanϴ = w / (d1 - d2), so ϴ = arctan( w / (d1 - d2) )
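As a sanity check, the formula can be sketched in a few lines of Python (the function name and the example readings are mine, not from the original post; the geometry follows the figure above):

```python
import math

def heading_correction(d1, d2, w):
    """Angle between the user's heading and the hallway, from the two
    rangefinder readings d1, d2 (metres) and the sensor separation w
    (metres): theta = arctan(w / (d1 - d2)).  atan2 avoids a divide
    by zero when the two readings are equal."""
    return math.degrees(math.atan2(w, d1 - d2))

# example: left sensor reads 2.10 m, right sensor 1.90 m, sensors 0.15 m apart
print(round(heading_correction(2.10, 1.90, 0.15), 1))
```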

The device can detect the distance to, and the amount of drop in front of, e.g. a staircase. Since the head-mounted sensors sit at a roughly fixed height (unlike sensors held in the hand), detection is easier.

We can calculate the following from what we know or measure: the angle ϴ, measured by the accelerometer; the distance l, returned by the range finder; and h, the height of the transducer above the floor.

The depth of the stair below the sensor is h2 = l x cos ϴ, so the relative depth of the stair is h2 - h.
The horizontal distance to the step is d = l x sin ϴ.
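The same geometry in code (function name and example numbers are illustrative; ϴ is taken from the vertical, as in the formulas above):

```python
import math

def stair_geometry(l, theta_deg, h):
    """l: slant range from the rangefinder (m); theta_deg: tilt angle
    from the accelerometer, measured from vertical; h: transducer
    height above the floor (m).  Returns (relative drop of the stair,
    horizontal distance to the step)."""
    theta = math.radians(theta_deg)
    h2 = l * math.cos(theta)   # depth of the stair below the sensor
    d = l * math.sin(theta)    # horizontal distance to the step
    return h2 - h, d

# example: 2.0 m slant range, 40 degree tilt, sensor 1.5 m above the floor
drop, dist = stair_geometry(l=2.0, theta_deg=40.0, h=1.5)
```

A positive `drop` means the beam hit a surface below floor level, i.e. a step down.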

This type of information can be presented more easily by voice feedback.

One of the reasons for a head-mounted device is that voice feedback can be provided easily and discreetly. Voice feedback allows more complex information to be presented; e.g. the device should be able to tell the user the measured distance to a wall or an object.

The dual speakers or dual haptic feedback, driven by the difference between the two distances, help the person align their heading towards the centre of a hallway. When walking, the body naturally follows the direction of one's head. The absolute distances from the sensors can be used to implement a proximity alarm that warns the user that they are about to run into a wall or that a person is approaching.

The sampling alternates between the two range finders. This reduces the interference that would otherwise result between them, reduces peak power consumption, and allows both channels to share the same receiver circuit. The update rate can be lowered when the person is moving slowly or when the reflection is far away.
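A minimal sketch of the alternating scheme (the function names, the 3 m threshold and the 2x interval stretch are illustrative assumptions, not the actual firmware policy):

```python
import itertools

def ping_schedule(read_range, base_interval_ms=100):
    """Generator that alternates pings between the LEFT and RIGHT
    rangefinders so their ultrasonic bursts never overlap in the
    shared receiver, and stretches the update interval when the
    echo comes back from far away.  `read_range` stands in for the
    real driver call."""
    for side in itertools.cycle(("LEFT", "RIGHT")):
        distance = read_range(side)
        # far reflections need less frequent updates (assumed policy)
        interval = base_interval_ms * (2 if distance > 3.0 else 1)
        yield side, distance, interval

# fake driver for demonstration: fixed readings per side
fake = {"LEFT": 1.2, "RIGHT": 4.5}
sched = ping_schedule(lambda side: fake[side])
print([next(sched) for _ in range(4)])
```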

The unit is powered by a rechargeable Li-ion battery with a target battery life of weeks. Readily available AA alkaline/NiMH batteries can be used as an emergency replacement. The circuit is designed to minimize power consumption: unused circuits are shut down, and the unit can be placed in sleep mode if it remains stationary for a long time.

Design:

Unlike most of my projects, this is a new area for me. As such, the project logs document my learning process. I have compiled a TOC of the project logs and organized them below. At some later point, I might come back and document the implementation in a more conventional way, like my other projects. Until then, I hope they help readers understand my thinking process and provide background reference material.

Preliminary Design


  • Similar Projects/Products/Technical Publications
    http://www.popsci.com/ultrasonic-helmet-lets-anyone-see-bat
    >When the echoes rebound off objects, the sound waves travel into two bat-shaped ears -- called pinna -- that rest on either side of the helmet and help gauge the direction of the echo. Molded from clay, each pinna has an ultrasonic microphone embedded at the center. A computer program records the echoes and instantly slows them by a factor of 20.
    >Dropping the pace and the pitch makes the imperceptible ultrasonic echoes audible to the human ear. Sonic Eye wearers can then use the echo delay to judge distance or mentally track their surroundings (see video below).

    >Results: Naive subjects were able to make laterality and distance judgments, suggesting that the echoes provide innately useful information without prior training. Naive subjects were generally unable to make elevation judgments from recorded echoes. However, trained subjects demonstrated an ability to judge elevation as well. Conclusion: This suggests that the device can be used effectively to examine the environment and that the human auditory system can rapidly adapt to these artificial echolocation cues.



  • The iGlasses™ Ultrasonic Mobility Aid is a head-mounted device which enables more informed, confident, and efficient pedestrian travel. Objects in your path are detected by the ultrasonic sensors and communicated via gentle vibrations. As obstacles get nearer the frequency of the vibration will increase. The device is intended as a secondary mobility device to complement the traditional long cane or guide dog.

  • Detection Range: 0-3 meters (detection map)
  • Weight & Dimensions: 75 grams 6.75" x 5.75" x 2"
  • $96.10

Background info

  • TED Talk: "Daniel Kish: How I use sonar to navigate the world" He uses tongue-clicking noises to form a picture of his surroundings.
  • TED Talk: "David Eagleman: Can we create new senses for humans?" The brain can take in substituted senses through learning.

    After watching the two videos, I can't help but think that the raw sonar return signal might also be of use to the end user for getting more information about the surroundings. This is one of the things I'd like to explore in this project. Normally only the first echo is used to determine the distance to the closest object. Thanks to the tighter integration, the ADC in the ARM processor also has access to the received analog signal. Some post-processing is required to convert the return timing and the amplitude envelope of the sampled return signal into audio feedback within the hearing range. With training, this would provide more information than a simple range measurement. It looks like my hunch is correct, and this type of analog feedback would be useful in my device in addition to the rangefinder.
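    The idea can be sketched offline with NumPy (the sample rate, burst timing, and the rectify-and-smooth envelope method are my assumptions for illustration; the 1/20 slowdown factor is from the Sonic Eye work quoted above):

```python
import numpy as np

FS = 400_000    # ADC sample rate, Hz (assumed)
F_TX = 40_000   # transducer carrier frequency, Hz
SLOWDOWN = 20   # playback factor, per the Sonic Eye paper

def envelope(samples, win=FS // F_TX * 2):
    """Crude amplitude envelope: rectify, then moving-average
    over roughly two carrier cycles."""
    kernel = np.ones(win) / win
    return np.convolve(np.abs(samples), kernel, mode="same")

# synthesize a 1 ms echo burst arriving at t = 5 ms
t = np.arange(0, 0.01, 1 / FS)
echo = np.where((t > 0.005) & (t < 0.006),
                np.sin(2 * np.pi * F_TX * t), 0.0)

env = envelope(echo)
# "playing back" the captured samples at FS / SLOWDOWN drops the
# 40 kHz return into the audible range (40 kHz / 20 = 2 kHz) and
# stretches the echo delays by the same factor
playback_rate = FS // SLOWDOWN
```

    The envelope peak marks the echo arrival time; the slowed playback rate is all that is needed to make the burst itself audible.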

Licenses:

  • HaD Project page + youtube videos: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International
  • Hardware:
    -  Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International
    -  My current Eagle CAD license does not permit commercial use. Please contact me for commercial licensing.
  • 3D modelling:
    - My models: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International
    - Models from 3dcontentcentral:
       http://www.3dcontentcentral.net/3DContentCentral/Terms-of-use.aspx
    - Models from EagleUp: https://eagleup.wordpress.com/f-a-q/
  • Firmware:
    - My code: Mozilla Public License (very likely)
    - Keil RVDT Compiler/Environment generated code
    - ChibiOS: http://www.chibios.org/dokuwiki/doku.php?id=chibios:licensing:start

    - RT Core: GPL3. https://github.com/fabiobaltieri/ChibiOS/blob/master/exception.txt
      I am going to use it in unmodified form with static linking, which is explicitly permitted for non-GPL code under the GPL linking exception.

    -  HAL: Apache 2.0
    I might customize/alter the low level drivers (LLD) code, start-up code and release my modified code under the same license.
  • Freescale's sensor fusion library: License: BSD-3-Clause
    http://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=FRDM-KL26Z&fpsp=1&tab=Design_Tools_Tab


