about
Sight: Detect & Navigate is a LiDAR-based navigation assistant for blind and low-vision users. It requires an iPhone with a LiDAR scanner (iPhone 12 Pro and later Pro models).
The app helps users move with spoken guidance, haptic feedback, walking directions, and obstacle awareness. It uses the camera and LiDAR sensor for real-time environmental awareness, and location for walking routes through Apple Maps.
It is free to use, and core features such as object detection and LiDAR obstacle sensing work offline. Once a route is set in the Navigation tab, the entire app can function offline.
Camera and LiDAR processing happens on device. The app has no account, no ads, no analytics, no tracking, no in-app purchases, and no subscriptions.
Thank you for downloading the app!
privacy policy
Sight: Detect & Navigate ("the app") is built and operated by Kinshuk ("we", "us"). This policy explains exactly what information the app handles, where it goes, and what your rights are.
quick version
- Sight: Detect & Navigate does not have a server. Camera frames, scene descriptions, settings, and onboarding state stay on your device.
- We do not collect, store, sell, share, or transmit any personal data.
- We have no analytics, no advertising, no tracking, no third-party SDKs of any kind.
- The app is free, with no in-app purchases and no subscriptions.
what the app accesses (and why)
To work as a navigation aid for blind and low-vision users, Sight: Detect & Navigate uses several iOS APIs. Camera, LiDAR, haptic feedback, settings, and onboarding state are processed on your device only. Location is used in memory and may be sent to Apple only when you use Apple Maps search, routing, or reverse-geocoding features.
| Permission | What it does | Where the data goes |
|---|---|---|
| Camera + LiDAR | Detects nearby obstacles in real time using your iPhone Pro's depth sensor. | Stays on device. Frames are processed in memory and discarded — never saved, never uploaded. |
| Location (GPS) | Powers walking turn-by-turn navigation, Free Roam street callouts, and the location announcement when you shake for your current location. | Used in memory by Apple Maps' routing API and Apple's reverse-geocoding API. We do not store it. |
| Haptics | Vibration feedback for obstacle distance and turn confirmations. | No data involved. |
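On LiDAR-equipped iPhones, per-frame depth of the kind described above is typically read through ARKit's scene-depth API. A minimal sketch (this is illustrative, not the app's actual implementation; delegate wiring and error handling are trimmed):

```swift
import ARKit

// Minimal sketch: read LiDAR depth frames on-device via ARKit.
// Frames live only in memory; nothing here writes to disk or the network.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // sceneDepth requires a LiDAR-equipped device (iPhone Pro models).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        // depthMap is a CVPixelBuffer of per-pixel distances in meters,
        // processed in memory and discarded with the frame.
        _ = depthMap
    }
}
```

Because the depth buffer is consumed inside the frame callback and never copied out, it is discarded as soon as the next frame arrives, which matches the "processed in memory and discarded" behavior in the table.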
data we store on your device
The following is saved locally on your iPhone using Apple's SwiftData and UserDefaults:
- Your settings (voice on/off, haptic intensity, radar range, beacon enabled, callouts enabled, etc.)
- Onboarding completion flag.
You can erase all of it by deleting the app.
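Simple flags and sliders like these are typically persisted with plain UserDefaults calls. A sketch, assuming hypothetical key names (not the app's actual identifiers):

```swift
import Foundation

// Hypothetical keys — illustrative only, not the app's real identifiers.
let defaults = UserDefaults.standard
defaults.set(true, forKey: "voiceEnabled")           // voice on/off
defaults.set(0.8,  forKey: "hapticIntensity")        // haptic strength
defaults.set(5.0,  forKey: "radarRangeMeters")       // radar range
defaults.set(true, forKey: "hasCompletedOnboarding") // onboarding flag

// Reading a value back — everything lives in the app's local sandbox
// and is removed along with the app when you delete it.
let voiceOn = defaults.bool(forKey: "voiceEnabled")
```

UserDefaults data sits inside the app's sandbox, which is why deleting the app erases all of it.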
third-party services
The only network calls Sight: Detect & Navigate makes go to Apple's own services, on your behalf, as part of normal iOS APIs:
- Apple Maps (`MKLocalSearch`, `MKDirections`) — destination search and walking routes. Subject to Apple's Privacy Policy.
- Apple's reverse-geocoder (`MKReverseGeocodingRequest`) — converts your GPS fix to a street name for callouts.
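For reference, the two MapKit search-and-route calls above look roughly like this (the query string and handling are placeholders; the request goes to Apple's servers as part of the normal MapKit API, not to any app server):

```swift
import MapKit

// Destination search — the query text is sent to Apple Maps, not to us.
let searchRequest = MKLocalSearch.Request()
searchRequest.naturalLanguageQuery = "coffee shop" // placeholder query
MKLocalSearch(request: searchRequest).start { response, _ in
    guard let destination = response?.mapItems.first else { return }

    // Walking route from the current location to the chosen destination.
    let directionsRequest = MKDirections.Request()
    directionsRequest.source = MKMapItem.forCurrentLocation()
    directionsRequest.destination = destination
    directionsRequest.transportType = .walking
    MKDirections(request: directionsRequest).calculate { route, _ in
        // Route steps are announced aloud; nothing is persisted.
    }
}
```

In both calls the response is handled entirely in the completion closure, so route and search results exist only in memory for the duration of the session.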
We do not use Google, Facebook, Firebase, Mixpanel, Sentry, Crashlytics, or any analytics SDK. We do not have a backend at all.
children
Sight: Detect & Navigate is rated 4+ and is suitable for all ages. We do not knowingly collect any data from anyone — including children.
your rights
Because we don't collect or store anything off your device, there is nothing for you to request, export, correct, or delete from us. Deleting the app removes all local data instantly.
changes to this policy
If anything ever changes (for example, if a future version adds a feature that requires a server), we will update this page and bump the "Last updated" date. We will never silently start collecting data.
contact
If you have any questions, accessibility feedback, or concerns:
support@wisdominurmovement.com