Future-Proofing Online Safety – A Practitioners’ Vision

Today, conversations and attitudes around technology rarely land in the moderate middle. It is rightfully easy to be awed by the acceleration of AI and excited about its potential. It is equally easy to be frustrated by the proliferation of risks and the lack of systematic solutions for handling them. So, what happens when a room full of experts whose careers focus on advancing digital development and protecting children and young people online, from organizations such as CARE, IREX, and Development Gateway, are asked to think about the future they would like to see?
This is the question that kicked off an unusual conversation hosted by IREX in February 2026. The responses are illuminating because they balance a clear-eyed understanding of risks and concerns with optimism and a pragmatic way forward for reducing barriers to the safe and productive use of new digital tools. The hopes and ambitions of participants point us toward an envisioned future where:
1. People are prepared to thrive
At the center of the vision is a shift away from basic access toward meaningful participation and agency. Skills, agency, and voice are prerequisites for human thriving.
- Young people are prepared to enter the world of work as it will exist in five years—not only with skills, but with clarity about where they add value and the confidence to exercise agency.
- We are no longer working on making sure people are connected; we are focused on helping communities use digital tools safely, productively, and on their own terms.
- Youth, including girls and young women, are actively listened to and meaningfully involved in shaping the digital futures they will inherit, including the design of products, policies, and norms.
2. Development closes gaps
Experts consistently pointed to a future where digital dividends are not limited by geography or ability.
- Rural communities have caught up and are fully able to benefit from digital opportunities.
- The most vulnerable populations—children, women, people with disabilities, people living in humanitarian crises, and others at heightened risk—are protected from harm and have digital tools that are designed to serve them.
- Digital development no longer reproduces offline inequalities but actively works to reduce them.
3. Safety is a public good
Rather than treating online harm as an individual or parental failure, experts envisioned a future where safety is collectively upheld.
- There is broad agreement that digital spaces can and should be free from harm to children and young people.
- Harms such as image-based abuse, online harassment, doxxing, and others that particularly target girls and women are neither expected nor inevitable features of digital life but addressable failures of design, governance, and accountability.
- Online and offline safety are recognized as societal priorities, not private burdens.
- Responsibility for safeguarding is no longer pushed down to parents and caregivers alone.
4. A mature, responsible tech sector prioritizes user safety
Experts described a future in which safety is embedded and serves as a baseline feature of innovation, not a reaction to harm.
- The tech industry has matured and operates under widely accepted safety standards.
- Responsible design is an expectation from users, regulators, and investors; it is not an exception.
- Safety-by-design is normalized across the lifecycle of digital products and services.
- Digital products serving vulnerable populations, such as those in humanitarian settings, displaced populations, and other high-risk groups, are held to explicit safety standards.
5. Governance and enforceable standards
Experts emphasized that governments must be equipped to act. Rules without implementation capacity create false confidence and uneven protection.
- Governments have not only standards for digital products and services, but also practical tools to help industry implement them.
- In humanitarian and fragile contexts, non-state actors, UN agencies, and humanitarian organizations that play a de facto governance role adhere to and enforce standards for the safe and responsible design of digital products and services.
- Regulatory approaches are nuanced and proportionate, favoring targeted interventions over blanket bans.
- Online safety is no longer politicized; it is treated as a technical, social, and child protection issue grounded in evidence.
6. A balanced distribution of power
Finally, many of the aspirations reflected a desire to rebalance who holds influence in digital ecosystems. Accountability follows power, and power must be directed to protecting public interests.
- Public interest actors are no longer structurally disadvantaged in negotiations over safety, data, and accountability.
- Crisis-affected communities, young people, rural communities, and others who use technology but previously had no say in its design are part of these negotiations.
- Customers and regulators expect safety, and industry responds accordingly.
Articulating what we want is an important milestone. It helps us recognize what stands in the way of this future and what we need to put in place today if these aspirations are to become reality in five years.
There are serious challenges to attaining these aspirations. While digital innovations move at what can seem like lightning speed, many of the prerequisites to human thriving identified above center on the ability of governments, civil society, and communities to stay on top of these developments and respond appropriately to them. No static education program can keep up with the pace of change. As a result, stakeholders are constantly rushing to respond to the latest technological innovations after they have already been launched, rather than being able to focus on building the digital futures they hope for. A new way of keeping track of digital progress and translating it into policy and enforcement is necessary.
Perceptions around regulation and advocacy for online safety are also a challenge. Online safety is often perceived negatively as an obstacle to innovation and economic growth. Online safety regulations are seen by many as stifling the development of new technology, and some tech firms fear that integrating safety measures will be costly or will be detrimental to the metrics that matter to them, such as user numbers or ad sales. These concerns commonly prevent both governments and tech companies from acting to strengthen the safety of digital products. Yet despite the widespread nature of these fears, there has been very little research assessing the actual cost or returns of online safety, or the cost of inaction. Without evidence that safety is a net benefit for all, tech companies treat it as an optional, premium feature, available only to those who can pay for it.
Finally, we have not fully translated market principles of supply and demand, or a customer-service orientation, into digital design. This is especially true of products and services designed to serve those who are not traditionally present in those feedback loops – children, youth, people with disabilities, displaced populations, rural communities, and others like them – who lack visibility as a "market segment". As a result, many well-intentioned digital products at best fail to work as they should, and at worst introduce additional harm to those who use them. Concerns are growing that safety is becoming something you pay for rather than the default, which means low-income communities bear the burden of harm as new technology is tested. This is not a design gap that can be closed by adding a consultation at the end of a product cycle. It requires designing with those at greatest risk. When we build for the most vulnerable first, we build better for everyone. Peer-reviewed studies and systematic reviews, in areas such as medical diagnosis, consistently show that newer, higher-capacity AI models are significantly more accurate than earlier versions; access to these higher-performing models is increasingly limited to paid or enterprise subscriptions, meaning a subscription is often a prerequisite for improved accuracy.
Articulating a shared vision is only the first step. Even within this short discussion, a path forward emerged that would help overcome these barriers and move us toward that future.
To arrive at the digital future we envision in five years, our biggest challenge is not to predict technological change, but to deliberately reshape the conditions under which digital products are built, governed, and adopted. The experts in the room converged on a small set of priorities that move beyond awareness-raising or reactive fixes and instead focus on durable, systemic change.
Treat online safety as shared infrastructure, not an individual burden
Shift safeguarding away from parents, caregivers, or users alone by sharing responsibility, and its costs, across industry, government ministries and regulators, and civil society.
Prove the value of safety
Invest in tools, metrics, and models that help governments and companies keep pace with technological change. Generate evidence on both the cost of digital harm and the cost of inaction, and embed safety expectations into funding criteria, procurement guidelines, and investment decisions, so responsible design is rewarded rather than penalized.
Reenvision public sector readiness
Use adult learning, individual and institutional incentives, and new instructional platforms and roles within governments to create continuous learning pathways, so that those who create and enforce policies can keep up with technological advancement and work together in a matrixed approach across government.
Bring in all customers as equally valued decision-makers in digital design
Move beyond one-off consultations toward standing market-driven and public-policy-driven mechanisms that keep youth, caregivers, survivors, people with disabilities, and rural users embedded in design, policy, and enforcement processes over time. The strongest, most resilient digital products are those built from the margins outward, starting with those who face the greatest risk and the least power. When safety works for them, it works for everyone. This principle should govern not only product design but also procurement criteria, funding conditions, and governance standards.
This is not a call to slow innovation. It is a call to align innovation with trust, accountability, and human thriving, especially for children and young people who have the most at stake and the least power. The future described here is achievable within five years, but only if safety is treated as a core feature of digital progress, not an afterthought.
Contact: Katya Vogt (kvogt@irex.org), Phoebe Bierly (pbierly@irex.org), and Dina Hanania (Dina.Hanania@care.org)
Copyright © 2026, IREX: International Research & Exchanges Board, Incorporated and CARE. All rights reserved.