How MindInventory Built a CoPilot for Visually Impaired People?


The Brief

  • Our client is a startup developing smart navigation harnesses worn over the shoulders, tailored for blind and visually impaired people as well as patients with hemispatial neglect. The AI-powered harness is equipped with super wide-angle cameras that power an obstacle detection and avoidance system capable of spotting head-level and ground-level obstacles such as holes, branches, traffic signs, bicycles, pedestrians, and more.

  • Upon detection, it generates intuitive 3D spatial sounds to warn the user about obstacles or pedestrians. In addition to these audio warnings, it also announces GPS navigation instructions. The harness comes with bone-conduction headphones so users receive non-intrusive sound feedback. The mobile app's user profile also includes a dedicated setting for blind users in wheelchairs, allowing them to customize the system based on their preferences.

The Challenge

  • Our client struggled to integrate AI/ML capabilities into an IoT device intended for real-time use, and the smart navigation harness posed significant technical challenges. These included ensuring seamless data flow and processing between the sensors (cameras), the IoT device (harness), and the AI processors, all while keeping energy consumption low. Achieving reliable Bluetooth connectivity between the harness and the mobile application across varying environments added to the complexity. Balancing the computational demands of AI/ML models for object detection and spatial sound generation against the physical limitations of wearable technology required innovative approaches to hardware design, software optimization, and energy management.

The Solution

  • Our team combined its technical expertise with a close analysis of these challenges to deliver the following solutions:

Context-aware Navigational Assistance

Our team developed advanced AI/ML algorithms to enable the harness to understand and interpret complex contexts within the user’s environment. Our solution focused on context-aware technology that assessed environments (e.g., indoors vs. outdoors, crowded vs. isolated spaces) and adjusted navigational assistance accordingly.
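To make the idea concrete, here is a minimal Python sketch of how scene context could be mapped to assistance settings. The class names, fields, and thresholds (SceneContext, AssistanceProfile, the indoor/outdoor ranges) are hypothetical and only illustrate the context-aware adjustment described above, not the production algorithm.

```python
from dataclasses import dataclass

@dataclass
class SceneContext:
    indoors: bool          # e.g. inferred from GPS signal quality or visual cues
    pedestrian_count: int  # people detected in the current camera frames
    ambient_noise_db: float

@dataclass
class AssistanceProfile:
    detection_range_m: float   # how far ahead obstacles trigger alerts
    alert_volume: float        # 0.0 - 1.0, scaled for bone-conduction output
    alert_rate_hz: float       # max spatial-sound cues per second

def select_profile(ctx: SceneContext) -> AssistanceProfile:
    """Pick navigation-assistance settings for the current environment."""
    if ctx.indoors:
        # Indoors: shorter look-ahead, gentler cues to avoid alert fatigue.
        profile = AssistanceProfile(detection_range_m=3.0, alert_volume=0.5, alert_rate_hz=1.0)
    else:
        # Outdoors: longer look-ahead for traffic, bicycles, branches.
        profile = AssistanceProfile(detection_range_m=7.0, alert_volume=0.7, alert_rate_hz=2.0)

    if ctx.pedestrian_count > 5:
        # Crowded space: raise the cue rate so nearby people are tracked continuously.
        profile.alert_rate_hz = min(profile.alert_rate_hz * 2, 4.0)

    # Louder environments need louder (but still capped) feedback.
    profile.alert_volume = min(1.0, profile.alert_volume + max(0.0, (ctx.ambient_noise_db - 60) / 100))
    return profile
```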

Predictive Path Optimization

We leveraged AI to predict and recommend optimal paths for navigation in real time, based on a comprehensive analysis of the user’s environment and historical path data. We used machine learning models that could predict potential hazards and changes in the surroundings before the user reached them, offering proactive alternative routes.
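The sketch below illustrates one way such predictive routing can work: a Dijkstra-style search in which predicted hazard probabilities inflate the cost of risky segments, so the planner proactively prefers a slightly longer but safer route. The graph, hazard figures, and risk_weight parameter are invented for illustration.

```python
import heapq

def safest_path(graph, hazard_prob, start, goal, risk_weight=5.0):
    """Shortest-path search where edge cost = distance + risk penalty.

    graph: {node: [(neighbor, distance_m), ...]}
    hazard_prob: {(node, neighbor): predicted probability of a hazard on that segment}
    """
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, cost
        for nxt, dist in graph.get(node, []):
            risk = hazard_prob.get((node, nxt), 0.0)
            new_cost = cost + dist + risk_weight * dist * risk
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(frontier, (new_cost, nxt, path + [nxt]))
    return None, float("inf")

# Example: the predicted hazard on the direct A->C segment pushes the route
# through B even though it is longer in pure distance.
graph = {"A": [("B", 40), ("C", 60)], "B": [("C", 30)]}
hazards = {("A", "C"): 0.8}  # e.g. a model predicting construction or crowding ahead
print(safest_path(graph, hazards, "A", "C"))  # -> (['A', 'B', 'C'], 70.0)
```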

Multimodal Feedback System

Beyond 3D spatial sounds, we explored and integrated a multimodal feedback system using AI. This allowed us to customize feedback based on the environment and user preferences, incorporating gentle vibrations for silent alerts and enhancing the safety and usability of the harness.
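As a rough illustration, the following sketch routes a single obstacle alert to audio and/or haptic channels based on the user's preference and the environment. FeedbackMode, the gain formula, and the quiet-environment fallback are assumptions made for the example, not the shipped logic.

```python
from enum import Enum

class FeedbackMode(Enum):
    AUDIO = "audio"    # 3D spatial sound via bone-conduction headphones
    HAPTIC = "haptic"  # gentle vibration for silent alerts
    BOTH = "both"

def dispatch_alert(obstacle_bearing_deg, distance_m, mode, quiet_environment):
    """Route one obstacle alert to the channels the user prefers."""
    cues = []
    if mode in (FeedbackMode.AUDIO, FeedbackMode.BOTH):
        # Spatialize the tone toward the obstacle; closer obstacles sound louder.
        cues.append({"channel": "audio",
                     "pan_deg": obstacle_bearing_deg,
                     "gain": max(0.2, 1.0 - distance_m / 10.0)})
    if mode in (FeedbackMode.HAPTIC, FeedbackMode.BOTH) or quiet_environment:
        # Silent spaces (library, meeting) fall back to vibration even in audio mode.
        cues.append({"channel": "haptic",
                     "intensity": max(0.2, 1.0 - distance_m / 10.0)})
    return cues

print(dispatch_alert(obstacle_bearing_deg=-30, distance_m=2.5,
                     mode=FeedbackMode.AUDIO, quiet_environment=True))
```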

AI-powered App Development Consultation

We guided our client throughout the project, starting with the selection of appropriate AI technologies and models for implementing computer vision features. Our design and development team then worked with the client to customize the mobile app, backed by research into the target users.

Real-time Data Processing and Analysis

To handle real-time data from the wide-angle cameras and other sensors efficiently, we enhanced the harness’s AI capabilities for immediate data processing and analysis. This involved optimizing algorithms for speed and efficiency, ensuring that users received timely and accurate navigational assistance without delays.
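A common pattern for this kind of real-time pipeline is to keep only the freshest camera frame and let inference run as fast as the hardware allows, so alerts never lag behind reality. The sketch below shows that pattern with a bounded queue; capture_frame, detect, and alert are hypothetical callables standing in for the camera, the detection model, and the spatial-sound output.

```python
import queue
import time

frames = queue.Queue(maxsize=2)  # small buffer: stale frames are dropped, not processed late

def camera_loop(capture_frame):
    """Producer: push the newest frame, discarding old ones if inference lags."""
    while True:
        frame = capture_frame()
        try:
            frames.put_nowait(frame)
        except queue.Full:
            frames.get_nowait()       # drop the oldest frame
            frames.put_nowait(frame)  # keep the freshest one
        time.sleep(1 / 30)            # ~30 fps camera

def inference_loop(detect, alert):
    """Consumer: run detection and emit alerts as fast as the hardware allows."""
    while True:
        frame = frames.get()
        for obstacle in detect(frame):
            alert(obstacle)

# In practice the two loops run on separate threads, e.g.:
# threading.Thread(target=camera_loop, args=(camera.read,), daemon=True).start()
# threading.Thread(target=inference_loop, args=(model.detect, speaker.play), daemon=True).start()
```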

Enhanced Object Recognition and Prediction Models

Our computer vision programmers created AI/ML models specialized in enhanced object recognition and predictive analytics. These models accurately detect a wider range of obstacles and predict potential hazards in the path of blind or visually impaired users and hemispatial neglect patients.
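Purely as an illustration of how such a detector might be served on-device, the sketch below runs a frame through an exported ONNX model and keeps confident detections of relevant obstacle classes. The model file name, class list, and output layout (boxes, scores, class IDs) are assumptions; real export formats vary.

```python
import numpy as np
import onnxruntime as ort  # assumes the trained detector was exported to ONNX

# Hypothetical class list; the real model covers holes, branches, traffic signs, etc.
OBSTACLE_CLASSES = {0: "pedestrian", 1: "bicycle", 2: "branch", 3: "hole", 4: "traffic_sign"}

session = ort.InferenceSession("obstacle_detector.onnx")  # placeholder model file
input_name = session.get_inputs()[0].name

def detect_obstacles(frame_chw: np.ndarray, min_score: float = 0.5):
    """Run one frame through the detector and keep confident, relevant boxes.

    Assumes the model returns (boxes[N,4], scores[N], class_ids[N]); actual
    output layouts depend on the export settings.
    """
    boxes, scores, class_ids = session.run(None, {input_name: frame_chw[None, ...]})
    results = []
    for box, score, cls in zip(boxes, scores, class_ids):
        if score >= min_score and int(cls) in OBSTACLE_CLASSES:
            results.append({"label": OBSTACLE_CLASSES[int(cls)],
                            "score": float(score),
                            "box": box.tolist()})
    return results
```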

UI/UX Design

On the UI/UX front, our designers focused on an experience tailored specifically for blind and visually impaired individuals: simplified navigation, menu structures that support easy interaction, and built-in accessibility features.

Seamless Bluetooth Connectivity

Our team focused on establishing effective communication between the mobile app and the Bluetooth Low Energy (BLE) hardware components. This connectivity is crucial for enabling users to configure the hardware and access various features through the mobile app.
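The mobile app itself would use the platform BLE APIs (CoreBluetooth on iOS, the Android BLE stack), but the GATT read/write flow looks roughly like the Python sketch below, written with the bleak library for brevity. The device address, settings characteristic UUID, and payload format are placeholders; only the Battery Level UUID is a standard one.

```python
import asyncio
from bleak import BleakClient  # cross-platform BLE library, used here for illustration

HARNESS_ADDRESS = "AA:BB:CC:DD:EE:FF"                   # placeholder device address
SETTINGS_CHAR = "0000aaaa-0000-1000-8000-00805f9b34fb"  # hypothetical settings characteristic
BATTERY_CHAR = "00002a19-0000-1000-8000-00805f9b34fb"   # standard Battery Level characteristic

async def sync_settings(volume: int, detection_range_m: int):
    """Push user-profile settings to the harness and read back its battery level."""
    async with BleakClient(HARNESS_ADDRESS) as client:
        payload = bytes([volume, detection_range_m])  # hypothetical 2-byte settings packet
        await client.write_gatt_char(SETTINGS_CHAR, payload)
        battery = await client.read_gatt_char(BATTERY_CHAR)
        print(f"Settings synced; harness battery at {battery[0]}%")

asyncio.run(sync_settings(volume=70, detection_range_m=5))
```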

The Impact

  • Following the launch of this innovative mobile app designed to complement the AI-powered smart navigation harness, we meticulously tracked user engagement and outcomes, uncovering remarkable impacts:
  • 75% improvement in obstacle avoidance efficiency
  • 60% reduction in navigation-related stress
  • 50% improvement in user satisfaction with GPS instructions
