Building Fair Tech: Lipika Kapoor on Why AI Must Meet People Where They Are
"The non-technical perspective is not a gap in an AI company. It is a necessity. Every company, in the end, is building for human beings." Today’s woman dreamer, Lipika Kapoor, is co-founder of NABU Sciences, building a human-AI advantage for organizations navigating transformation. An MIT Sloan MBA and former investment banker, Lipika brings a uniquely human-centered lens to AI. In her Women Who Win interview, she reflects on learning to lead without a technical background, designing AI that people actually adopt, and her vision for building "fair tech" that meets people where they are. Her story is relatable and timely. As she shares, “There is a particular pressure non-technical founders face, and women especially, in rooms full of builders. The instinct is to over-explain your presence. Or to go quiet when the technical conversation heats up. I had to resist both.” We are excited to share her journey on Women Who Win.
1. Tell us your story. You began your career in India, then pursued investment banking after completing your MBA at MIT Sloan, and you are currently a co-founder of an AI company focused on improving organizational management. What sparked your interest in becoming an entrepreneur, and what ultimately inspired you to make that leap, especially coming from a traditional banking trajectory?
I kept asking the same question everywhere I went: why does impact move so slowly when the problems are so urgent?
It followed me through many years at J-PAL, MIT's global poverty action lab, where I worked on scaling evidence-based programs across South Asia. The research was airtight. The evidence was there. And yet change felt slow.
That friction never left me.
I went to MIT Sloan to find new tools, new frameworks, and a new language for the problem I had been carrying for years. I came from a non-traditional background - behavioral science research, development economics, and policy - in a class dominated by consultants, bankers, and engineers. Finance was not my world.
But I had spent years watching the people who put up the capital make the decisions. I wanted to stop watching. I wanted to learn how capital actually moves. And make it work. Because investment banking is where capital allocation actually happens. Where risk is priced, value is decided, and you learn the logic that determines where resources flow and why.
So I taught myself. Started early. Put in the work. Got in.
And then I kept returning to the same question.
The answer came while I was mentoring fellow MBA students, many of them navigating fields that had not been built for them. I realized that what I had done, teaching myself an entirely new discipline under pressure, was exactly what millions of people would need as AI remade the economy.
The people who would thrive were not going to be the most credentialed. They would be the ones who knew how to learn to learn.
That insight was personal. It was also, I believed, a business. That is why I co-founded NABU Sciences.
2. As a non-technical co-founder of an AI company, what were the biggest challenges you faced in the early stages of building credibility and navigating the technical landscape?
The biggest challenge was not the technology. It was unlearning the belief that I needed to apologize for not being an engineer.
There is a particular pressure non-technical founders face, and women especially, in rooms full of builders. The instinct is to over-explain your presence. Or to go quiet when the technical conversation heats up. I had to resist both.
My value was never going to come from pretending I could write production code. So I did not pretend. And I did not stop there either. Because here is what I have learned: in the age of AI, picking up technical concepts is not what it used to be. When you learn with AI alongside you, the curve is different. Steeper at first. Then suddenly, surprisingly fast. Things that once required years of training become accessible in weeks if you are curious enough and honest enough about what you do not know.
I am not a coder. But I understand the processes. I ask better questions than I did a year ago. I know when something does not hold up. And I bring what no amount of technical training alone can give: understanding, with real precision, how human beings actually change. That is not a soft skill. It is the hardest unsolved problem in AI deployment.
Most AI rollouts fail not because the model is wrong. They fail because no one designed the human system around it. Across more than a decade spanning policy, investment banking, and co-founding NABU, I had been studying exactly those questions. That became my edge. The non-technical perspective is not a gap in an AI company. It is a necessity. Every company, in the end, is building for human beings.
3. What advice would you give to other founders who may have the vision for an AI product but do not come from a technical background?
I taught myself investment banking without a finance background.
Not because I was fearless. Because the cost of playing it safe felt higher than the cost of getting it wrong. That is the thing nobody tells you. The discomfort of learning in public is temporary. The cost of waiting until you feel ready is not.
So here is what I know:
Know your problem cold. Not the technology. The problem. Who is suffering. What specifically changes when it is solved. That clarity is rarer than any technical skill. And in the rooms that matter most, it is more powerful.
Get technically literate without trying to become an engineer. Learn enough to ask the right questions. Learn enough to know when the answer does not hold up. That is all you need.
Choose your technical co-founder the way you would choose a life partner. Slowly. With enormous attention to how you disagree.
And then the thing I wish someone had said to me directly, as a woman building in a space that was not built for me: We are often the ones who place the ceiling ourselves. Stop waiting to feel ready. Dream bigger. Dream higher. Then rebuild the room.
4. Many organizations experiment with AI tools but struggle with adoption. What have you learned about designing AI so that people actually want to use it in their daily work?
Most organizations are asking the wrong question. They ask: how do we get people to use this tool? The right question is: does this make me better at what I do, or does it make me less necessary?
That is the fear underneath every stalled rollout. And it is a legitimate one.
Through NABU’s work with executives, governments, and frontline teams, we have seen that people learn AI in four stages. First as a search engine. Then as a tool for creating outputs. Then developing a real sense of where AI extends their thinking versus where it introduces risk. And finally, fluency, where AI becomes part of how you think, not just something you occasionally reach for. Most organizations try to skip from stage one to stage four overnight. Then wonder why nothing changed. Real adoption requires scaffolding every stage. And leaders who are willing to learn openly alongside their teams. Because when people see their leaders learning, they give themselves permission to do the same.
One more thing. Trust is a two-way street. If a farmer is rejected for a loan by an AI model, they need to be told why, in the language they actually speak, and what they can do. That is not a UX detail. That is the accountability layer that determines whether AI-powered inclusion is real, or just a talking point.
5. At the India AI Impact Summit, you spoke about "fair tech" and the idea that AI can enable inclusion at scale. What opportunities excite you most at the intersection of AI and social impact?
I was in Delhi for the India AI Impact Summit, one of the largest gatherings on AI policy in the Global South. Sitting with some of my mother's friends one evening, one of them said very simply: "AI is happening in the world. But it is not for me."
That sentence stopped me. It is the most precise description of the fairness problem I have ever heard. Not a policy paper. Not a conference panel. Just a woman, quietly naming the gap. It is not about devices or connectivity. It is about whether people feel a technology was built with them in mind, or just delivered to them.
For the first time in history, the cost of accessing expertise is no longer determined by your zip code or your bank account. Legal guidance, financial advice, medical decision support. These have always been functions of where you were born and how much money you had. AI changes that structurally. Not at the margins. At the root.
In India the scale is staggering. More than 500 million people work in the informal sector. The dominant models were built in the West, trained on English. Reaching the last mile requires more than translation. It requires AI that meets people where they actually are. A street vendor who has paid every bill on time for years should not be locked out of formal credit because they do not fit an old underwriting model. That is the fairness I care about. Not as a principle. As a measurable outcome in a specific person's life.
6. To end on a fun note, what is one way you personally use AI in your daily life that you now cannot imagine living without?
I use it the way I used to use long walks.
To think something through before I have to say it out loud to another person.
It pushes back. It finds the hole I missed. It offers the frame I was circling around but could not quite land.
It does not always get it right. But I always get somewhere.
Thank you Lipika for sharing your story on Women Who Win. We are excited to welcome you to our global women’s network!
Bio: Lipika Kapoor is the strategic engine behind NABU Sciences, driving growth, AI adoption, and leadership transformation in academia and enterprise. A former investment banker and MIT Sloan MBA, she combines financial rigor with bold vision to help organizations leap into the age of intelligent systems. At NABU, Lipika leads initiatives in AI training, adaptive leadership, and change management - translating complex innovation into real-world impact.