JoWalk is an app designed to help visually impaired people detect objects or differences on the floor.
It uses AI to determine the predominant visual pattern in the camera image.
JoWalk has three modes of use, cycled through with a single tap on the screen:
the first gives you a haptic signal depending on the visual pattern detected by the camera;
the second compares the upper area of the rear camera image (which corresponds to the area ahead of you when the device is held horizontally with the screen facing up) with the center area, giving one haptic every time the detection runs and two haptics if the pattern recognized ahead differs from the one at the center (see the sketch below). The detection interval can be changed by swiping up and down on the screen and is shortened automatically when movement is detected. You can tilt your device to explore areas farther ahead.
The third mode lets you capture a reference image by swiping down and gives you haptic feedback whenever you point the camera at a similar image. After swiping down, you will feel a gentle haptic when the reference image is captured, about one second after the swipe is completed.
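For readers curious how mode 2's feedback could work, here is a minimal Swift sketch. The classifyPattern(_:) helper is hypothetical and stands in for the app's AI texture classifier; only the haptic logic (one pulse per detection, a second pulse when the pattern ahead differs from the center) follows the description above.

```swift
import UIKit

// Sketch of mode 2's comparison step, under the assumptions stated above.
final class PatternComparator {
    private let detectionPulse = UIImpactFeedbackGenerator(style: .light)
    private let differencePulse = UIImpactFeedbackGenerator(style: .heavy)

    /// Called once per detection cycle with crops of the current camera frame.
    func compare(upperCrop: CGImage, centerCrop: CGImage) {
        let aheadPattern = classifyPattern(upperCrop)   // pattern in the area ahead
        let centerPattern = classifyPattern(centerCrop) // pattern at the user's position

        // One haptic confirms the detection ran; a second one signals a change ahead.
        detectionPulse.impactOccurred()
        if aheadPattern != centerPattern {
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.15) {
                self.differencePulse.impactOccurred()
            }
        }
    }

    // Placeholder only: the real app uses an AI texture classifier.
    private func classifyPattern(_ crop: CGImage) -> String {
        return "unknown"
    }
}
```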
To allow the app to use haptics, go to Settings -> Sounds & Haptics -> System Haptics and toggle this option on. If your device does not support haptics you will receive acoustic feedback instead, but the app is designed around haptic feedback. If you want audio feedback, shake your device while in mode 1; shake again in mode 1 to return to silent mode.
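The shake-to-toggle behaviour maps naturally onto UIKit's motion events. The sketch below is illustrative only and assumes a hypothetical feedbackIsAudio flag; it is not the app's actual implementation.

```swift
import UIKit

// Minimal sketch: toggle between haptic and audio feedback on a shake in mode 1.
class ModeOneViewController: UIViewController {
    private var feedbackIsAudio = false // hypothetical flag

    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder() // required to receive motion (shake) events
    }

    override func motionEnded(_ motion: UIEvent.EventSubtype, with event: UIEvent?) {
        guard motion == .motionShake else { return }
        // Shake once for audio feedback, shake again to return to silent (haptic) mode.
        feedbackIsAudio.toggle()
    }
}
```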
NOTE: JoWalk is intended to be used in a safe environment. JoWalk does not detect every obstacle or subsidence, and its accuracy depends on lighting conditions. JoWalk uses AI to match areas, which means significant battery consumption for every frame analyzed; in mode 2 you can reduce this consumption by lowering the frame analysis rate, while mode 3 analyzes approximately two frames per second.
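As an illustration of how the mode 2 analysis rate could be throttled to save battery, here is a small sketch. The interval value and the deviceIsMoving parameter are assumptions for the example, not the app's real settings.

```swift
import Foundation

// Sketch of a frame-analysis throttle, under the assumptions stated above.
final class AnalysisThrottle {
    var interval: TimeInterval = 1.0          // seconds between analyses (user adjustable)
    private var lastAnalysis = Date.distantPast

    /// Returns true if enough time has passed to analyze another frame.
    func shouldAnalyze(now: Date = Date(), deviceIsMoving: Bool) -> Bool {
        // Movement shortens the effective interval so feedback stays responsive.
        let effectiveInterval = deviceIsMoving ? interval / 2 : interval
        guard now.timeIntervalSince(lastAnalysis) >= effectiveInterval else { return false }
        lastAnalysis = now
        return true
    }
}
```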
Image matching is based on the DTD (Describable Textures Dataset) by M. Cimpoi, S. Maji, I. Kokkinos, S. Mohamed, and A. Vedaldi.