Alice Munyua on Championing Africa’s Voice in Global Tech and AI Ethics
The African Innovators Series (TAIS): Tech, Data, and AI Changing the Game
Welcome to Issue #20 of TAIS, where every Friday we spotlight visionary changemakers reshaping Africa’s tech, data, and AI landscape, one breakthrough at a time.
In today’s issue, we spotlight Alice Munyua, a Kenyan digital rights advocate and AI ethics expert who is shaping how technology intersects with social justice across Africa. From her work advancing internet freedom and policy with global organizations to convening powerful networks that center African women in AI, Alice embodies a new wave of changemakers dedicated to reclaiming AI for inclusive, equitable futures. Through her leadership and advocacy, she’s not only influencing policy but transforming how African voices are heard in global tech conversations.

Q: Over the years, you’ve moved from national ICT policy and broadcasting to Mozilla Africa Mradi, and now to convening the African Women in AI Network. What core mission has guided you across these diverse roles, and how has your focus shifted along the way?
Collective Intelligence: Reclaiming AI for African Women, from Broadcasting to AI Justice
A: From the corridors of Vatican Radio to the founding of HerIntellect, my journey has always been rooted in one mission: ensuring that African voices, especially those of African women, are not only included but central to shaping our digital futures. I began at the intersection of technology and humanity, studying "social communications" at the Vatican’s Gregorian University while producing programmes for Vatican Radio’s Africa service. I was educated by Jesuits, who instilled in me a discipline of continuous learning, curiosity, and reflection. That discipline shaped my early projects like Radio Kwizera, which reimagined how radio could serve as a tool for peace and reconstruction in post-genocide Rwanda, where radio had been used to fuel the genocide.
My work evolved across geographies: from Malawi, where I helped develop Malawi’s Communications Policy, a community-based broadcasting unit (the Development Broadcasting Unit, DBU), and sixty radio listening clubs across rural areas; to Kenya, where I led the policy advocacy efforts that birthed the Kenya ICT Policy (2006). I also created the Kenya ICT Action Network (KICTANet) and laid foundations for liberalised telecommunications and the arrival of M-PESA, while serving as a board member and chair of the board’s technical committee at the Communications Commission of Kenya (then CCK). The thread that weaves through these roles is a commitment to inclusive governance. Whether as Vice Chair of the Internet Corporation for Assigned Names and Numbers (ICANN) Governmental Advisory Committee, as chair of the board of directors of Kenya’s country code top-level domain authority (KENIC), or as creator and leader of Mozilla Corporation’s Africa programme, the “Africa Mradi”, I have consistently worked to challenge extractive, exclusionary frameworks and champion digital development that reflects lived African realities.
One question I love to ask, and I asked it during the launch, was what should a truly African women-led internet and AI ecosystem look like?
It looks like trust. Like consent. Like care as infrastructure. It sounds like our languages. It respects our rhythms. It sees our struggles, pain and wins not as data noise but as critical herstory.
Q: When you launched the African Women in AI Network, what specific steps did you take to move from idea to execution? What tools, partnerships, or funding mechanisms helped get it off the ground?
From Burnout to Breakthrough
A: The African Women in AI Network/HerIntellect was not born from strategy decks, but from corporate burnout, rage, and love. After yet another panel in which African women were excluded from substantive AI conversations and reduced to ceremonial roles, I had had enough. I remembered all the times I had seen frameworks designed without us, with solutions that looked good on paper but failed us spectacularly in practice.
After Mozilla dropped nearly all of its global-majority-focused projects, I resigned and began to quietly reach out to like-minded people through hallway chats at conferences and summits, discovery calls, and mapping who was doing what. Over 100 women across 20+ countries emerged in our database. With seed support from my savings, a small budget from Mozilla’s Africa Mradi before it was shut down, and pro bono help from feminists and technologists, we launched in January 2025 in Nairobi, Kenya. What we lacked in structure, we made up for in trust and community. We launched in partnership with AFRALTI and Moringa School.
HerIntellect is not about “building for.” It is about “building with.” So peer mentorship circles, policy fellowships, teach-ins, and learning cohorts that center African women's knowledge are critical. These decentralised approaches will help create AI that is shaped by our lived experiences instead of abstract ethics or taglines like building “trustworthy AI” or “Open Source Public AI”.

I have also created another African women’s network, supporting women going through perimenopause and menopause transitions. It is called “The Second Act Circle”, a little more discreet and quiet, for obvious reasons. But I am hoping AI can be applied there as well.
I intend to prototype feminist, decentralized approaches to AI shaped by our lived experiences instead of abstract ethics.
Editorial commentary: Alice is building the African Women in AI Network/HerIntellect without venture capital or corporate backing. This matters because it changes what the network can become. Most AI ethics organizations need to keep their funders happy. They write reports about bias and host panels about inclusion. They rarely challenge the fundamental business models that create the problems they're supposed to solve.
Alice's network starts from a different position. It doesn't need to pretend that tech companies will regulate themselves or that existing governance structures just need minor tweaks. It can say what others can't: current AI development is extractive by design.
The network's emphasis on "lived experiences" over "abstract ethics" points to a different theory of change. Most AI governance efforts focus on principles and frameworks. Alice is betting that centering actual user experiences will produce better outcomes than debating theoretical harm.
Her willingness to keep experimenting with different levers of change is illuminating. Radio for post-genocide reconstruction. Policy frameworks for Kenya's digital infrastructure. Corporate diversity programs. Multilateral governance bodies. Each approach revealed both possibilities and constraints. The African Women in AI Network emerges from this accumulated knowledge.
But Alice isn't anti-institutional by ideology; her partnerships with AFRALTI and Moringa School show strategic engagement rather than wholesale rejection. She's creating parallel structures that can influence existing systems while maintaining independence.
The "Second Act Circle" for women experiencing perimenopause suggests Alice understands that technology justice requires addressing women's full experiences, not just policy advocacy.
The real test won't be whether the network grows or gets funding, but whether it can demonstrate that inclusive governance produces different technological outcomes at the scale and speed AI development demands. This means moving beyond critique to show what AI tools designed through peer mentorship circles and learning cohorts actually look like and whether they work better than committee-designed systems.
Q: In your Mozilla Mradi work, how did you ensure that community-led innovation wasn’t just a buzzword? What did co-design or participatory development look like on the ground, and what constraints surprised you?
A: Community Innovation in Practice: Mozilla Corporation's Africa Mradi was a “community-led innovation project.” That was not just a tagline; it was the blueprint. We started with the premise that innovation does not only happen in garages in the USA, Silicon Valley boardrooms, or high-budget builders’ labs and studios. In global-majority countries, it happens in Nairobi’s civic tech meetups, in Lagos-based data feminist collectives, and in the grassroots internet communities of Kibra, Uasin Gishu, and Kampala, among others.
The Africa Mradi’s “community-led innovation” became our compass. To move from rhetoric to reality, I made the following deliberate design choices:
Decentralised decision making: engaging in collaborative projects and partnering with several entities, among them Moringa School, AFRALTI, AU-NEPAD, Strathmore, SMART Africa, the ITU, the Government of Kenya, Nairobi and Uasin Gishu Counties, the Gladys Boss Foundation, and the Psychewell Essence Foundation, we co-designed programmes with grassroots innovators, not around them.
Funded care: We embedded transport stipends, childcare, and translation, and not just linguistic translation: we translated strategies, OKRs, KPIs, values, acronyms, and concepts, not as add-ons, but as infrastructure.
Redefined success: I partnered with TECTONA for Mozilla’s Africa Mradi and with Summit Strategies for KICTANet to monitor, evaluate, and document iterative learning, knowing that community timelines rarely fit neatly into capitalist timelines.
What most surprised me was Mozilla dismantling the project and laying off the foundation team and my programme manager as soon as I resigned. The Africa Mradi revealed the limits of Mozilla’s corporate ethics. Despite promising beginnings, most tech companies from the Global North behave as Mozilla’s leadership did: they cannot sustain the patience and humility required for genuine community partnerships and engagement in the global majority.
That rupture fueled my next chapter: creating the African Women in AI Network/HerIntellect, turning frustration into collective action.
Q: Building a women-centered AI pipeline in Africa often means working without formal infrastructure. What alternative models have worked in your experience, and why?
A: Alternative Models: What works in the absence of infrastructure. With limited formal pipelines, we turned to what we have always done so well: community, solidarity, and reciprocity. Models that have worked include:
Peer Mentorship: Reciprocal relationships that span technical coaching, cultural navigation, and collective learning/healing.
Learning Circles: Rotating, member led sessions on topics like feminist machine learning and data sovereignty.
Context-Rooted Fellowships: Like grants to rural-based, women-led startups, and the mental health AI project for Maasai girls in Kajiado, Kenya, in partnership with the Psychewell Essence Foundation, which merged indigenous knowledge and insight with tech to serve real needs without displacement.
What binds these, and us, together? A refusal to wait for permission, as demonstrated by GenZ women, who are creating successful, fulfilling projects and community while building lives away from the usual societal templates that have kept African women in a bind. They simply do not care anymore; they have already moved on, and society will have to deal with this new phenomenon. As a GenZ parent, I am so proud.
The Kenyan women’s “chama” model is another great example.
Editorial commentary: The speed of Mozilla's project dismantling after Alice's resignation exposes a fundamental structural problem: corporate "community-led innovation" initiatives depend entirely on internal champions. When champions leave, communities become expendable overhead. This dependency stems from how these programs are designed, as corporate social responsibility add-ons rather than core business functions.
Alice's "funded care" model reveals why this dependency exists. By making childcare, transportation, and time costs visible, she inadvertently showed how corporate programs use "merit-based" selection to filter for economic privilege while claiming neutrality. The participants who can afford these hidden costs appear more "qualified," creating a feedback loop that reinforces exclusion while maintaining plausible deniability about barriers.
This explains why GenZ women are bypassing permission structures entirely rather than fighting for inclusion, as Alice pointed out so well. They've observed how previous generations won seats at tables designed to eject them once their champions left. Building parallel systems is pragmatic recognition that corporate inclusion is structurally unstable.
The Kenyan women’s chama model, briefly mentioned by Alice, offers a great template precisely because it operates outside these corporate dependencies. These savings circles scale through trust networks and rotating leadership rather than external funding or permanent institutions. They survive because no single person's departure can destroy them. Whether AI governance could adopt similar mechanics (collective decision-making, mutual accountability, distributed leadership) remains to be tested, but it might produce more durable change than Silicon Valley's champion-dependent diversity initiatives.
Q: How do you translate community findings into influence at policy tables? Could you walk us through a moment where local research directly shaped a decision or national guideline?
From community to policy
A: Policy impact begins with knowing how to speak the languages: from on-the-ground, grassroots care to bureaucratic and diplomatic speak and strategy.
A key example is my submission to Kenya’s Data Protection Bill process and co-authoring a white paper that directly influenced Kenya’s 2019 Data Protection Act, embedding principles of inclusion, participation, decentralisation, and accountability.
In 2018, when I joined Mozilla, I led Mozilla’s pushback against Kenya’s invasive digital ID system, Huduma Namba. We supported strategic litigation led by several partners and went on to co-author a white paper proposing people-first identity frameworks. The Huduma Namba project was reviewed, including by the World Bank, which added a requirement for data protection prior to implementing such invasive digital ID projects, which, incidentally, are forced only on African countries, not the Global North. These wins were not just about timing or data. They were about framing: shifting the conversation from tech as a tool to tech as a terrain of power.
When I founded KICTANet in 2006, the aim was to support a multistakeholder approach to the Kenya ICT policy process. I had not anticipated that the network would grow beyond the need I created it for. I started with various working groups; the research working group informed the policy framework and translated community experiences into policy briefs that ended up shaping several policies, not just the Kenya ICT Policy 2006. We adopted this approach for subsequent policies, including the Freedom of Information Act and the Consumer Protection Act, as well as the creation of the ICT Board, now the ICT Authority, where I ensured that the then Permanent Secretary appointed several women onto the board on my recommendation. I was not waiting to be invited to the table; we brought our own.
This approach was also encapsulated in Kenya’s 2010 Constitution, which provides for public participation in any policy process.
At the global level, I convened and chaired the United Nations Internet Governance Forum held in Kenya in 2011. I introduced a day dedicated to helping policymakers understand internet governance and engage with other stakeholders, a practice that continues today. But because of gatekeeping, no one wants to admit that this was first introduced by an African woman, or that I was the first African woman ever to convene and chair the UN IGF.
These wins were not just about timing or data; they were about framing and taking up space without asking permission.
I learned that policy impact begins with knowing how to speak all these languages: grassroots, bureaucracy, diplomacy, multistakeholder processes, internet governance, strategy, and the maddening acronyms that try to make the tech world mysterious and exclusive. To date, KICTANet continues to translate community experiences into policy briefs that shape Kenya’s tech ecosystem.
Q. What metrics do you track that traditional donors or funders overlook but you know matter when measuring empowerment and inclusion in AI?
A: Traditional funders track outputs; I prefer to also track transformation:
Confidence Shifts: From “I am not sure” to “I am building something no one else is.”
Resilience: Who stays engaged through life’s disruptions
Ripple Influence: Who sets the agenda after attending, not just who shows up
Policy Echoes: Whose ideas end up in frameworks and briefs
Language: Which language is used, and how, to include or exclude
Psychological Safety: Do women feel safe enough to speak, fail, or rest? For example, one early mentee from Kenya, initially unsure about her role in AI, has gone on to co-lead a national digital rights campaign and recently coauthored a chapter on feminist data policy for an African Union consultation. That is empowerment in action.
These “invisible indicators” tell us whether we are shifting culture, not just hitting quotas, OKRs, or KPIs. The goal is not just skill building. It is thriving with dignity.
Q: You’ve worked across government, civil society, diplomacy and tech. What tactics (language, framing, coalitions) have proven most effective in getting gender equity or ethics in AI taken seriously?
A: Getting gender equity and ethics in AI taken seriously requires more than data. It demands translation across power centers, strategic storytelling, coalition building, and strategic litigation.
Framing ethics as infrastructure rather than ideology makes policymakers pay attention. When ethics is presented as essential digital infrastructure, like cybersecurity, policymakers tend to listen.
Translating values without dilution. Feminist principles become “data governance,” “user trust,” or “risk mitigation” when needed.
Building unexpected coalitions has been by far the most rewarding: from telecom regulators to Big Tech engineers, forming cross-sector alliances that disrupt silos.
Using real stories. A woman’s testimony about the algorithmic denial of a loan resonated with Kenyan policymakers more than charts and statistics.
What binds these together? A refusal to wait for permission.
Policy Influence from the Ground Up: Policy impact begins with knowing how to speak all these languages, from grassroots care and bureaucratic strategy to diplomacy. At KICTANet, we translated community experiences into policy briefs that shaped several policy frameworks. I intend to do the same with HerIntellect.
In 2018, I pushed back against Kenya’s invasive digital ID system. I co-authored a white paper proposing people-first identity frameworks, which influenced the 2019 Data Protection Act. These wins were not just about timing or data. They were about framing and shifting the conversation.
These strategies shift perceptions and move ethics from idealism to imperatives.
Editorial commentary: The standard narrative suggests grassroots advocates must learn to speak power's language to be heard. Alice flips this entirely by forcing power structures to engage with community realities and by making exclusion politically costly rather than convenient. Her approach to Kenya's digital ID system demonstrates this inversion. Rather than arguing against Huduma Namba on privacy grounds that policymakers could dismiss as theoretical, she reframed invasive identification as a structural barrier to digital participation. The result? The World Bank and other funders added data protection requirements, not because they suddenly cared about privacy, but because exclusion became a project risk they couldn't ignore.
This tactical approach extends to her metrics for measuring impact. While traditional funders track participation rates and skill-building outputs, Alice monitors power shifts. Who moves from uncertainty to agenda-setting, whose language gets adopted in policy frameworks, whether psychological safety increases enough for women to take risks. These indicators capture whether interventions change who gets to shape technology, not just who gets trained to use it.
The "unexpected coalitions" Alice builds (telecom regulators, Big Tech engineers, cross-sector alliances) work because they're constructed around shared institutional interests rather than shared values. Regulators want compliance frameworks, engineers want clear technical standards, civil society wants accountability mechanisms. Alice finds the intersection where these interests align with feminist outcomes, creating durable partnerships that survive ideological shifts.
This pragmatic approach explains why her policy influence has sustained across different political moments and institutional contexts. She's not trying to convert people to feminist principles; she's making feminist outcomes the most efficient path to goals they already have.
Q: Everyone talks about “ethical AI.” Can you describe a real case where you had to make a hard ethical call in a project?
A: In one AI health tool project, I discovered the training data had been scraped from online forums without consent. Though technically legal, it violated trust and safety.
I insisted we rebuild the dataset with proper community engagement and informed consent. We lost time, money, and even partners, but we preserved dignity.
Ethical AI is not theoretical to me and to many African women. It is also becoming a survival issue. Especially since the usual stereotypes and biases are quietly baked into AI algorithms. It is about choosing justice over convenience, even when it is unpopular. I would make the same call again.
Q. What’s a recent failure or misstep that taught you something essential about what African women need in the AI/data space?
A: I piloted a storytelling chat to help African women share their journeys, but did not anticipate the trauma it would surface. We lacked mental health support and scaffolding (modern, Western therapy and psychology, which, in my personal experience, can be harmful to African women), and a few participants have withdrawn, no longer engaging as actively, not for lack of value but for lack of emotional safety.
The lesson? Inclusion has to take many factors into consideration. It is not just about equal access; it is also about care. And ethical AI is not about checkbox fairness; it is about refusing to build on injustices, even when it is inconvenient.
Editorial Commentary: Alice's decision to rebuild the dataset rather than use scraped forum data demonstrates that ethical choices in AI development often require accepting losses that did not need to exist in the first place. She chose informed consent over convenience, knowing it would cost time, money, and partnerships. This was more about refusing to build on foundations of injustice, than it was about external constraints.
Her storytelling platform failure reveals a different challenge. Creating space for marginalized voices inevitably surfaces trauma that existing support systems aren't designed to handle. The platform didn't fail because it used the wrong frameworks, but because it succeeded in creating safety for stories that had been suppressed, without adequate scaffolding for the emotional impact.
Both examples point to the same insight. Ethical AI development requires infrastructure and patience that most tech projects don't budget for, whether that's proper consent mechanisms or trauma-informed community support.
Q: What country or initiative currently inspires you when it comes to inclusive AI governance in Africa? And what gaps do you think pan-African frameworks like the AU still haven’t addressed?
A: Rwanda’s Centre for the Fourth Industrial Revolution comes to mind and inspires me. They are attempting to centre community voices, digital public goods, and gender responsiveness. In South Africa, grassroots data justice movements continue to push the boundaries of consent and ethics.
The African Union’s strategies still fall short. Mostly written by external consultants, they prioritise competitiveness and harmonisation, often sidelining equity, gender, digital colonialism, the environment, historical factors, and the theft of African resources, among other issues. They do not reflect the lived experiences of Africans. And when you look closely, the African Union and most of the regional economic blocs are funded by foreign grants; the AU’s budget is largely provided by the European Union.

It is time we embed ethics as policy, not just principle: a continent-wide Feminist Digital Charter anchored in care, sovereignty, participation, accountability, inclusion, transparency, and justice. This Charter should sit alongside economic frameworks to ensure AI development and digital governance reflect African values and our realities.
Q: You’ve worked at the intersection of global and local. What strategies have helped elevate African voices in global AI conversations, and where do gatekeepers still exist?
A: Global AI spaces often tokenise African voices. My work at Mozilla, ICANN, and now HerIntellect is focused on shifting us from invited speakers to agenda setters.
I was recently invited by a South African event organiser who promised to pay for my flight and accommodation while demanding quite a lot of work without pay. This is the model most non-Africans tend to use when working with experts from the African region.
Another wanted access to all my networks, demanding that I introduce them to at least three to five people to help them with an endowment fund. The audacity and caucasity were unbelievable.
They are not used to African women prioritising ourselves, and as a result, we no longer get invited to these spaces; they invite African women who have assimilated into the white supremacy narrative and will participate in putting down their own.
But gatekeeping is much larger than the individual, and it persists: from racist algorithms to extractive philanthropy, with the most insidious gatekeeper being the skin colour hierarchy created and implemented by colonisers. This is the real gatekeeping. It is baked into AI algorithms, embedded in legal codes, and reinforced through policy, while disguised as grant applications, whispered in hiring and on conference panels, and embedded in education systems and in tech corporation mission statements that talk about equity (“the internet is a global resource open and accessible to all,” “one world, one internet,” “do no evil,” “move fast and break things”) but protect power and racial hierarchies. It lives in current AI models trained on anti-Blackness, and in academic institutions that reward proximity to whiteness while extracting knowledge from indigenous people and calling it discovery. It is in schools where African children are punished severely for speaking their local languages. It is in datasets designed never to track trust and care, only risks. It is in the “trust and safety” protocols that profile and exclude mostly African people under the guise of protection.
Many global AI initiatives, especially in multilateral spaces, still rely on large Northern think tanks to define priorities, with African experts brought in too late or tokenistically, and only if they agree with the prevailing ideologies.
Funding remains heavily concentrated in the Global North, and African-led work is often underfunded, under-cited, and subjected to higher scrutiny.
We have not yet convinced our African “rich uncles and aunties” to invest in the local tech ecosystem, so we still rely heavily on Global North funding to support our startups and innovation.
African women are now leading: from the Kenya ICT Action Network to advising tech platforms, publishing in top-tier journals, and building networks that no longer ask for permission.
We are not waiting to be “discovered”; we are documenting, building, archiving, and dreaming aloud on our own terms, with our own tools, in our own voices. And that, ultimately, is the future of AI I believe in: one where “global” does not mean Caucasian and inclusion does not mean the kind of assimilation that forces one to become someone they are not.
One of the most important questions regarding AI is how we can use it to remember our inherent intelligence. This is what necessitated the name change from “African Women in AI” to “HerIntellect”: it reads and sounds more like what we are really thinking about in relation to AI, which is that the future needs more coherent humans. AI is not an autonomous, discrete entity separate from humans; it is another disruptive innovation. It is not just about building robots; it is also about exploring what it takes to rebuild trust in technology.
So, what would AI created by African women look like? With the current state of the world, Africa needs to look inward with grave seriousness. To create a future where we all feel safe using AI made with and for us, we must start with a foundation of openness, trust, inclusivity, participation, and transparency: not passive usage, but African women as co-creators and decision makers. And we must continue to ask: who is AI for? Who gets to shape its future? And, crucially, who is left behind when we do not all have a seat at the table?
As so many Africans have said before me: “We must reimagine our political and economic systems. Reclaim our narratives. Rebuild from our own soil and stop copy-pasting from the West. No one is coming to save Africa; we have to save ourselves.”
Editorial commentary: The expectation that African experts should provide free labour, networks, and legitimacy to Northern initiatives while receiving little to no compensation is a continuation of colonial patterns in which African knowledge is extracted, repackaged, and valued only when filtered through non-African institutions.
The deeper problem is how technical systems embed these same hierarchies while claiming to be neutral. Alice’s account of “racist algorithms” and trust and safety protocols that profile and exclude mostly African people under the guise of protection exposes the machinery behind what is often presented as objective or necessary. This is gatekeeping in its most insidious form: bias camouflaged as policy, exclusion hidden behind the language of risk management, and inequality coded into the very infrastructure of AI.
The funding landscape compounds this reality. African AI initiatives remain largely accountable to donor priorities rather than community-defined needs. With “African rich uncles and aunties” reluctant to invest in the local tech ecosystem, dependence on Global North funding persists, ensuring that even well-meaning inclusion efforts ultimately serve external institutional agendas more than African technological sovereignty.
In this context, Alice’s decision to reframe “African Women in AI” as “HerIntellect” is a philosophical pivot. It shifts the centre of gravity from asking to be included in pre-existing systems to claiming the right, and the capacity, to build our own. The framing moves the question from “How can we participate in AI?” to “What would AI look like if we built it ourselves?” This subtle but profound shift transforms the conversation from one of permission-seeking to one of sovereignty. It asserts that African AI futures must be designed and governed by those who live with their consequences.
Closing remarks
Alice’s career resists the idea that inclusion means fitting African women into pre-existing systems. Her work, from policy reform to rebuilding datasets, reframes the systems themselves, treating care, consent, and trust as infrastructure, not afterthoughts.
She moves fluidly between working inside institutions and building outside them, using each position to challenge extractive models and create space for community-defined governance. This dual strategy, strategic engagement without dependency, is what makes her influence last beyond any single role.
In short, Alice’s career is a case study in how to shift technological power from adaptation to authorship, and why that matters for everyone living with the consequences of AI systems.
Thank you for reading!
Join the mission
This newsletter is independently researched, community-rooted, and crafted with care. Its mission is to break down walls of complexity and exclusion in tech, AI, and data to build bridges that amplify African innovation for global audiences.
It highlights how these solutions serve the communities where they're developed, while offering insights for innovators around the world.
If this mission resonates with you, here are ways to help sustain this work:
📩Become a partner or sponsor of future issues → reambaya@outlook.com
→ 🎁Every child deserves to be data literate. Grab a copy of my daughter's data literacy children's book, created with care to spark curiosity and critical thinking in young minds. (Click the image below to get your copy!)
You can also sponsor a copy for a child who needs it most or nominate a recipient to receive their copy. Click here to nominate or sponsor.
→ 🧃Fuel the next story with a one-time contribution. Click the image below to buy me a coffee (though I'd prefer a cup of hot chocolate!)
These stories won't tell themselves, and mainstream tech media isn't rushing to cover them. Help ensure this reaches the audience it deserves.
Let’s signal what matters together.
Thank you for being part of this journey!



