By Arifin Ilham
In an era of increasingly unpredictable climate change, communities face extreme weather and unexpected disasters. From heat waves to heavy rains, tidal flooding to flash floods, disaster can strike at any time — a reminder that nature can no longer be predicted with certainty. Amid this uncertainty, self-reliance, built on awareness and preparedness, becomes the key to survival. In the effort to strengthen that self-reliance, technology emerges as an extension of every citizen’s senses, able to detect invisible natural signs and hear nature’s whispers earlier, enabling a faster response before disaster strikes.
For people with sensory disabilities such as those who are deaf or blind, technology acts as an additional sense that connects them with a wider world. Web-based early warning systems and applications with visualized information help the deaf understand disaster messages as if they were hearing nature’s signals through images and text. Meanwhile, for the blind, audio descriptions or alternative text accessed through screen readers deliver disaster information, becoming “eyes” that interpret nature’s messages through sound. Unfortunately, not all digital early warning systems are friendly to people with disabilities. As a blind person, I know how essential access to information through senses other than sight truly is. Technology as an early warning and rapid response tool must be able to “speak” in multiple sensory languages — it cannot rely on a single form of communication. In other words, systems must be inclusive, reaching the eyes and ears of every citizen without exception.
The development of inclusive technology for early warning systems can draw on the principles of the Web Content Accessibility Guidelines (WCAG), the W3C’s global standard for ensuring that digital content is accessible to everyone, including people with sensory disabilities such as hearing and vision impairments. Of its many principles, four are especially relevant in the context of disaster early warnings.
First, perceivable — information must be presented through multiple media. For the blind, text and visual elements should be supplemented with audio descriptions; for the deaf, audio and video content should carry captions or alternative text, with vibration notifications as danger cues that are not only seen but also felt.
Second, operable — meaning all system functions are easy to use, including simple navigation, clearly labeled buttons and icons, and voice commands for those who struggle to access visual interfaces.
Third, understandable — ensuring information is clear for everyone. Color codes in visual information such as disaster risk maps must be accompanied by textual explanations or audio descriptions. Video transcripts should be manually verified, as automatic systems often make mistakes and cause confusion. The inclusion of virtual sign language interpreters can further enhance equal access and clarify important messages.
Fourth, robust — meaning systems can run across various devices and assistive technologies, from screen readers to refreshable braille displays to sign language translation apps. The system should also work on low-spec devices to reach more people with limited access. With these principles, digital technology becomes an inclusive pair of eyes and ears for all citizens.
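The four principles above can be illustrated with a minimal sketch of a multi-channel alert broadcaster. Everything here is hypothetical rather than drawn from any of the apps discussed: the alert levels, channel names, and vibration timings are assumptions for illustration only. The idea is simply that one warning is rendered on several sensory channels at once, so that no single sense is a prerequisite for receiving it; a real system would hand the audio script to a text-to-speech engine and the vibration pattern to a device’s haptics API.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    level: str    # hypothetical severity: "advisory", "warning", or "danger"
    message: str  # plain-language body text

def render_text(alert: Alert) -> str:
    # Text channel: screen readers and braille displays can read this directly.
    return f"[{alert.level.upper()}] {alert.message}"

def render_audio_script(alert: Alert) -> str:
    # Audio channel: a script for a text-to-speech engine, spelling out
    # the severity in words instead of relying on color or layout.
    return f"Attention. {alert.level} alert. {alert.message}"

def render_vibration(alert: Alert) -> list:
    # Haptic channel: a pattern of (pulse, pause) durations in milliseconds;
    # more pulses signal higher severity, so danger is felt, not just seen.
    pulses = {"advisory": 1, "warning": 2, "danger": 3}.get(alert.level, 1)
    pattern = []
    for _ in range(pulses):
        pattern += [400, 200]
    return pattern

def broadcast(alert: Alert) -> dict:
    # Emit the same alert on every channel simultaneously, so no single
    # sense is required to perceive the warning.
    return {
        "text": render_text(alert),
        "audio": render_audio_script(alert),
        "vibration": render_vibration(alert),
    }
```

In this sketch, the “perceivable” principle is the parallel channels themselves, while “robust” would come from keeping each rendering plain enough (simple text, standard vibration patterns) to work with assistive technologies and low-spec devices.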
Indonesia already has various disaster early warning applications developed by national and local institutions. Some are widely known to the public, while others are more specialized for specific groups. From the perspective of a blind user like myself, each of these apps has its strengths and weaknesses in terms of accessibility. For example, the Info BMKG app provides weather, earthquake, and tsunami alerts. For the blind, it’s relatively accessible through screen readers, though visual graphs and maps without alternative text remain challenging. For the deaf, interactive visuals and text notifications are very helpful, though sign language support in educational videos is still limited.
Then there’s InaRISK Personal by BNPB, which maps disaster risk down to the village level. Its interactive maps help the deaf understand complex data, but for the blind, the lack of audio descriptions or alternative text makes the maps difficult to access. Meanwhile, the BasarnasApp facilitates emergency reports and SAR services, but its reliance on call centers is a major barrier for the deaf, who need alternative communication channels like chat or video calls with sign language interpreters.
In addition to national-scale apps, local innovations such as Jakarta Kini (Jaki) provide real-time notifications about river water levels, floodgates, and flooding. However, the interface can sometimes be too simple and not informative enough for users with disabilities. Joko Tingkir predicts tsunamis with audio and vibration alerts, while Difgandes assists vulnerable groups, including the elderly and disabled, in receiving help — and helps SAR teams with clear guidance during evacuations. Sipakedifa and Sivabel, on the other hand, focus on e-learning and evacuation systems tailored for people with disabilities.
The main advantage of these applications lies in their innovative design thinking and direct involvement of vulnerable groups. However, most remain regionally limited and still in development, meaning they are not yet widely available. Looking at the range of these applications, Indonesia already has a strong foundation for building inclusive early warning systems that enhance preparedness for all. General apps like Info BMKG and InaRISK Personal excel in data and reach, but still face accessibility challenges. Local innovations such as Sipakedifa and Sivabel stand out from the disability perspective but are not yet robust enough for emergency and rapid response.
As a blind person, my experience using these apps provides a different perspective. Digital technology is no longer just a tool for information or communication — it has become a second pair of eyes and ears, strengthening independence in facing climate change, even with limited senses. From early warning apps with interactive maps to mobile phone notifications that use sound and vibration, technology acts as a digital sensory system that captures signs of danger. Every piece of information becomes a call for safety that can be responded to faster, without waiting for external help that often faces many barriers.
The implementation of the Web Content Accessibility Guidelines (WCAG) principles in early warning and rapid response innovations benefits not only people with sensory disabilities like the deaf and blind but also other vulnerable groups in receiving important disaster messages. Children can absorb information better through visuals, while the elderly can understand messages more easily when presented in simple language. With inclusive design, all citizens can respond swiftly to natural phenomena. Digital technology, as an extension of human senses, enhances citizens’ ability to adapt to environmental changes while fostering awareness and preparedness.
This awareness shifts the stigma surrounding vulnerable groups — from being seen as those who always need help to becoming individuals capable of saving themselves independently. Thus, safety is no longer about competition — who knows first or who prepares faster — but about a shared right: safety as something accessible to all citizens who are aware, prepared, and equipped to face potential threats, even within limitations or far from rescue reach.