The Problem
Communication isn’t always easy. For neurodivergent individuals, people experiencing anxiety, or anyone dealing with sensory overload, finding the right words—or any words at all—can feel impossible. In classrooms, meetings, or public spaces, asking for help can be overwhelming when you’re already struggling.
I wanted to create something simple: a way to say "I need help" without having to explain why, without drawing attention, without words.
That’s how Tihi was born.
What Is Tihi?
Tihi is a minimalist iOS app that lets you send silent signals to a support person with a single tap. No typing, no explaining, no verbal communication required. Just tap a button, and your support person gets an instant notification.
Think of it as a digital hand signal—discreet, immediate, and clear.

Built-in Signals
The app includes nine pre-set signals that cover common needs:
- Overwhelmed – When everything feels like too much
- Need Break – Time to step away
- Need Quiet – Sensory input is too much
- Need Help – General support request
- Can’t Focus – Struggling to concentrate
- Sensory Overload – Lights, sounds, textures overwhelming
- Need to Move – Body needs movement
- Need to Talk – Ready to communicate verbally
- I’m Good – Checking in positively
Plus a Custom Signal option for personalized messages like "Need water," "Too cold," or "Bathroom."

The Development Journey
Starting Simple: Local Communication
I started with the core concept: two devices, one sender, one receiver. The first version used MultipeerConnectivity for direct Bluetooth/WiFi communication between nearby devices. No internet, no servers, completely private.
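To give a sense of what this looks like in code, here is a rough sketch of the receiver side of Local Mode using MultipeerConnectivity. The class name, service type, and auto-accept behavior are illustrative, not Tihi's exact implementation:

```swift
import MultipeerConnectivity
import UIKit

// Rough sketch of the receiver in Local Mode: advertise a service, accept an
// invitation, and handle incoming signal payloads. The sender side would use
// MCNearbyServiceBrowser and invitePeer(_:to:withContext:timeout:) to join
// the session. The service type "tihi-signal" is a placeholder.
final class LocalSignalReceiver: NSObject, MCSessionDelegate, MCNearbyServiceAdvertiserDelegate {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session = MCSession(peer: peerID,
                                         securityIdentity: nil,
                                         encryptionPreference: .required)
    private lazy var advertiser = MCNearbyServiceAdvertiser(peer: peerID,
                                                            discoveryInfo: nil,
                                                            serviceType: "tihi-signal")

    var onSignal: ((String) -> Void)?

    func start() {
        session.delegate = self
        advertiser.delegate = self
        advertiser.startAdvertisingPeer()
    }

    // Auto-accept for the sketch; a real app would check a trusted-device list first.
    func advertiser(_ advertiser: MCNearbyServiceAdvertiser,
                    didReceiveInvitationFromPeer peerID: MCPeerID,
                    withContext context: Data?,
                    invitationHandler: @escaping (Bool, MCSession?) -> Void) {
        invitationHandler(true, session)
    }

    // Incoming signals arrive as small Data payloads.
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        let signal = String(decoding: data, as: UTF8.self)
        DispatchQueue.main.async { self.onSignal?(signal) }
    }

    // Remaining MCSessionDelegate requirements, unused in this sketch.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
}
```

Advertising and browsing the same service type is also what makes auto-discovery possible: nearby devices running the app simply show up, with no manual setup.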
The pairing process needed to be simple. I implemented three methods:
- QR Code pairing – Scan a code to connect instantly
- Auto-discovery – Find nearby devices automatically
- Recent connections – Quick reconnect to trusted devices

The Background Challenge
The first major hurdle: iOS background limitations. When a receiver’s phone locked, the connection would drop after about 10 minutes. This was frustrating—the whole point was to receive signals when you couldn’t actively monitor your phone.
I implemented several solutions:
- Keep-alive pings to maintain the connection (sketched after this list)
- Trusted devices that auto-reconnect when they come back in range
- Immediate notifications with triple haptic feedback
- Signal history so nothing gets lost
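To make the keep-alive idea concrete, here is a minimal sketch; the timer interval and payload are made up for illustration:

```swift
import MultipeerConnectivity

// Minimal sketch of a keep-alive: periodically push a tiny payload so the
// MultipeerConnectivity link doesn't go idle. This buys some reliability,
// but it cannot stop iOS from eventually suspending a backgrounded app.
final class KeepAlive {
    private var timer: Timer?

    func start(on session: MCSession, every interval: TimeInterval = 20) {
        timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { _ in
            guard !session.connectedPeers.isEmpty else { return }
            try? session.send(Data("ping".utf8),
                              toPeers: session.connectedPeers,
                              with: .unreliable)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```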
But the fundamental limitation remained: iOS suspends background apps. Local connections work great when both devices are active, but struggle with extended background use.

The iMessage Solution
This led to the second major feature: iMessage Mode.
By building an iMessage extension, Tihi can now send signals anywhere in the world through Messages. This solves both the distance problem and the background problem: iMessages are delivered by Apple's own infrastructure, so signals arrive even when Tihi isn't running in the foreground.
The implementation involved:
- Creating a TihiMessages extension with a signal picker interface
- Encoding signals as iMessage data (see the sketch after this list)
- Building a monitoring service in the main app to detect incoming signals
- Sharing data between the app and extension via App Groups
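Here is a rough sketch of that encoding step: the signal gets packed into an MSMessage URL that the receiving side decodes later. The query keys and caption are illustrative, not the app's exact format:

```swift
import Foundation
import Messages

// Sketch of sending a signal from the iMessage extension's
// MSMessagesAppViewController. The payload rides in the message URL.
extension MSMessagesAppViewController {
    func send(signal: String, in conversation: MSConversation) {
        var components = URLComponents()
        components.queryItems = [
            URLQueryItem(name: "signal", value: signal),
            URLQueryItem(name: "timestamp", value: ISO8601DateFormatter().string(from: Date()))
        ]

        let layout = MSMessageTemplateLayout()
        layout.caption = signal                      // e.g. "Need Break"

        let message = MSMessage()
        message.url = components.url                 // decoded by the receiving side
        message.layout = layout

        // Stages the message in the compose field; the user still taps send.
        conversation.insert(message) { error in
            if let error = error { print("Failed to insert message: \(error)") }
        }
    }
}
```

Because inserting only stages the message, the sender stays in control of what actually goes out, which fits the app's one-tap-but-deliberate model.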
Now users can choose:
- Local Mode for nearby, private, internet-free communication
- iMessage Mode for remote support anywhere
Design Philosophy
Throughout development, I maintained a few core principles:
Minimalism: Every screen serves a clear purpose. No clutter, no unnecessary features. Large touch targets, high contrast, monospace fonts for clarity.
Accessibility: Haptic feedback and visual indicators ensure the app works for everyone.
Privacy: No accounts, no cloud storage, no data collection. Communication happens directly between devices or through Apple’s encrypted iMessage.
Dignity: The app never makes you explain yourself. A single tap is enough.
Real-World Use Cases
The beauty of Tihi is its flexibility:
In Education: A student feeling overwhelmed can silently alert their teacher without disrupting the class. The teacher gets a notification and can respond appropriately.

At Home: A child experiencing sensory overload can signal their parent from another room. No need to find them or verbally explain.
In the Workplace: Someone with anxiety can discreetly let a colleague know they need support during a meeting.
Remote Support: A college student can signal their parent across the country via iMessage. An adult living independently can reach their support network anywhere.

Technical Stack
- SwiftUI for the entire UI
- MultipeerConnectivity for local device-to-device communication
- Messages Framework for iMessage extension
- UserDefaults with App Groups for data sharing (sketched below)
- UserNotifications for alerts
- CoreHaptics for feedback
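As one example of how these pieces fit together, here is a minimal sketch of App Group-backed sharing between the main app and the TihiMessages extension. The group identifier and key are placeholders, not the app's actual values:

```swift
import Foundation

// Sketch of a shared store that both the main app and the extension can reach
// through an App Group. The extension appends signals; the main app's
// monitoring service drains them.
enum SharedSignals {
    static let suiteName = "group.com.example.tihi"   // hypothetical App Group ID
    static let key = "pendingSignals"

    // Called from the extension after a signal is sent or received.
    static func append(_ signal: String) {
        let defaults = UserDefaults(suiteName: suiteName)
        var pending = defaults?.stringArray(forKey: key) ?? []
        pending.append(signal)
        defaults?.set(pending, forKey: key)
    }

    // Polled by the monitoring service in the main app.
    static func drain() -> [String] {
        let defaults = UserDefaults(suiteName: suiteName)
        let pending = defaults?.stringArray(forKey: key) ?? []
        defaults?.removeObject(forKey: key)
        return pending
    }
}
```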
Lessons Learned
1. iOS Background Limitations Are Real
No amount of clever coding can overcome iOS’s fundamental background restrictions. MultipeerConnectivity works great in the foreground but struggles in the background. The solution wasn’t technical—it was offering an alternative (iMessage Mode).
2. User Testing Is Everything
Early versions had complex pairing flows. Real users showed me they needed something simpler. QR code pairing became the primary method because it's instant and almost foolproof.
3. Privacy Matters
Users told me they appreciated the no-account, no-server approach. In a world of data collection, sometimes the best feature is what you don’t do.
What’s Next
Tihi is live on the App Store, and I’m gathering feedback from real users. Future possibilities include:
- Apple Watch support for even more discreet signaling
- Signal scheduling for predictable needs
- Pattern recognition to identify trends in signal history
- Customizable haptic patterns for different signals
But the core will always remain simple: one tap, instant communication, no words needed.
Note: Tihi is a communication tool and not a substitute for professional medical or therapeutic support.
About the Development Process
Total development time: ~3 months
- Month 1: Core local communication, UI design, signal system
- Month 2: Background reliability, trusted devices, connection improvements
- Month 3: iMessage extension, polish, TestFlight testing
Built solo using SwiftUI, with input from neurodivergent individuals and their support networks to ensure it meets real needs.
The hardest part wasn’t the code—it was understanding the problem deeply enough to build the right solution.

