Dr. Shruti Shankar Gaur is a polymath and thought leader working at the intersection of education, inclusion, diversity, policy, and innovation. Holding a Ph.D. in Inclusive Education, she has been honored with a University Gold Medal and the Certificate of Academic Excellence by India’s Ministry of Human Resource Development, reflecting her commitment to transformative change.
As founder of Research & Innovation in Education (RIEDU), she leads initiatives like the Young Editors Program, fostering young global writers focused on inclusion and diversity. Inspired by UN SDG 4.5 and 4.7, her work includes teacher training workshops, academic publishing, and her poetry collection, Four Decades.
At The Digital Economist, she served until recently as Program Director at the Center of Excellence, managing the fellowship program and interdisciplinary collaborations. She has represented the organization at G20 India, the World Economic Forum in Davos, and Cannes Lions, contributing to global discussions on AI, policy, and socio-economic transformation.
Beyond policy, she is a Creative Partner, Mentor, and Strategist at Sankarsingh-Gonsalves Productions in Canada, advocating for culturally inclusive storytelling. In New Delhi, she is Director of Research, Innovation, and Inclusion at ae-research, leading the launch of its first DEI Lab.
Dr. Gaur’s work spans education, policy, and creative industries, ensuring a lasting impact on global inclusion, equity, and innovation. Whether mentoring young writers, shaping policy, or driving research, she remains a catalyst for transformative change.
https://linktr.ee/dr.shrutishankargaur
https://www.linkedin.com/in/drshrutishankargaur/
https://drshrutishankargaur.com/
https://www.amazon.in/Four-Decades-Prose-My-Life-ebook/dp/B0B4K8B4M5/ref=tmm_kin_swatch_0?_encoding=UTF8&qid=1673179228&sr=8-1
https://www.thedigitaleconomist.com/
[00:00:02] Hello everybody and welcome to the Crypto Hipster Podcast. This is your host Jamil Hasan, the Crypto Hipster, where I interview founders, entrepreneurs, executives, thought leaders, amazing people all around the world of crypto, blockchain, and today AI as well. I have an amazing guest. I've known her for half a year now through The Digital Economist. Her name is Dr. Shruti Shankar Gaur. Dr. Gaur, Shruti, welcome to the show.
[00:00:33] Thank you. Thank you for having me, Jamil. It's a pleasure. You're very welcome. So let's kick things off. I ask everybody the same question and get amazing answers all the time: what is your background, and is it a logical background for what you're doing now? Okay, so let me begin by saying that I'm a unique data set.
[00:01:00] I'm a polymath. I have worked extensively in organic chemistry, and yet I eventually ended up becoming an educationist, and I have a PhD in inclusive education. I'm also a university gold medalist and was awarded a certificate of academic excellence by the Ministry of Human Resource Development, as it was then called, by the Government of India.
[00:01:22] And I'm a published poet. I have worked extensively in the inclusion and diversity space and in the education space for more than a decade now in India. And I have my own initiative called RIEDU, Research and Innovation in Education, where we work with children in the domain of inclusion and diversity. So yeah, that's me. Awesome. Awesome. I did not know you were a poet. Very cool.
[00:01:52] We'll get into that a little bit later. First, I want to find out about your association with The Digital Economist: what it has been, and what it will be going forward. So yeah, The Digital Economist is my sacred space. It's something that is so precious to me. It was in 2022 that I first collaborated with them on a position paper. And ever since, you know, I have been a part of it and loved every bit of it. I joined as a senior fellow in 2023.
[00:02:22] In 2024, I started working as program director, which I had been until just the last few days, managing the fellowship program of the global think tank. And yeah, it's a wonderful space for anyone to grow, learn, evolve, and be part of the change.
[00:02:51] Yeah, I agree. So speaking of being part of the change: every year we have a footprint at the World Economic Forum in Davos. I couldn't go this year, and I hope to go next year. So regarding technology and AI, what are some of the most critical areas that must be addressed, from a socio-cultural and inclusive perspective?
[00:03:21] Yeah, great question, I would say. So I went to Davos in 2023. That was my first time, and it was the first time for AI too; 2022 was the year of AI, so in January 2023 it was all about AI, and it featured in so many panels. This year, definitely, AI had a big presence. But along with that, what I observed was that there were more concerns being addressed.
[00:03:47] So while we were talking about the focus of AI, or the impact of AI in today's context as well as from a future perspective, there were also a lot of conversations happening about concerns: about the way AI was evolving, how to build guardrails, how to build guidelines, and how to ensure that AI ethics are in place for safety.
[00:04:15] So, yeah, I was part of a lot of those conversations about concerns. And I think it is imperative, at the pace at which AI is growing, that we build the guidelines and the guardrails and set up AI ethics for it to evolve in a beautiful way, because otherwise it's scary at times. Oh, yeah. So you used two adjectives there when describing AI.
[00:04:44] You used scary, which is the common public sentiment, and safety. How do you bring safety into the conversation? How do you ensure safety? That's a good question. So I think I would like to look at it from a sociocultural or philosophical perspective, because that's what my background is, instead of a technology perspective.
[00:05:07] So if you look at it from a philosophical perspective, I think the human race, or human civilization, has evolved by always being in control. We are, and we were, scared of the unknown. And AI is venturing into the unknown. So the moment we venture into the unknown, our safety net is not there. And it's scary.
[00:05:33] So definitely, for us all, we would like to build those safety nets and be part of it. And to what you said about how these two relate: I think they are the same perspective, like two sides of the same coin.
[00:05:53] And it's important, while we are evolving in the AI era, that we start building these guardrails, which help us navigate the unknown. Else, how will we navigate the unknown, the way we are moving ahead? Yeah. I hope that answers your question. It does. I remember going to Tony Robbins, where you walk on fire, walk on coals, back in 2016.
[00:06:23] And he said the key is to learn how to dance with uncertainty, you know? And so I always remember that wherever I go. So you mentioned humans, right? Right. So the role of AI technology in human evolution, you know, Darwinism, right? What is that to you? Interesting, because you didn't say the role of humans in AI evolution.
[00:06:52] You said the role of AI in human evolution. So that means we have accepted and agreed that AI is a big force, or a tsunami, or whatever, and that we are in an AI age. And it has the power to influence, navigate, and manipulate human evolution, human thinking, human working, the whole civilization landscape.
[00:07:21] And if I look at it, the world order is changing. From a sociocultural perspective, the world order is changing. Most of the world order has crumbled. We are not yet sure what's the way ahead. And what I would like to work on is this: right now, change is a forced change. Like, you know, I might not like it, I might not want it, but I have a phone,
[00:07:50] because everyone has one. I was part of the change. What I would like to be part of is conscious change. So, yeah. And for that to happen, a new homeostasis, a new sort of balance, a new harmony has to be built, knowing very well that AI is here to stay. We have to learn to coexist and to navigate our lives with AI.
[00:08:17] And in human evolution, instead of going into the unknown in an unconscious way, we have to navigate ourselves into the unknown in a conscious way. We have to ask ourselves primary questions like: what is the vision for humanity 100 years from now, or 50 years from now? What are the core values which make anyone human, or which differentiate us from AI?
[00:08:44] Or which are the core values which will always remain relevant? It doesn't matter if we have technology or if we don't have technology; that is what makes us who we are. So, yeah. I'm wondering, you know, looking at it from the time that Facebook came on board and the iPhone came on board, there's been dramatic change, right?
[00:09:11] But how do you help people or make people want to practice conscious change? How do you bring consciousness into that change as you're going forward? Very good question, I would say. I would say that change has to begin at the individual level. So, if you ask me, AI is just a mirror of our own consciousness.
[00:09:41] Take, for example, Charles Eisenstein's theory of separation. My disconnect with myself makes me, you know, more susceptible to virtual reality. Because the evolution that we are talking about now is not going to be an organic evolution. AI is not organic. It's artificial. That's why it's called artificial.
[00:10:09] That's something that we humans have built. And if I talk about the early 20th century, there was this theory of the noosphere. You can check it. It was by a philosopher and a theologian.
[00:10:26] And they said that human evolution would be shaped by a sphere of global conscious knowledge, acting through humans. You know, let me explain and simplify it. So, for example, in an AI world, AI will impact the way we think.
[00:10:50] So AI will have a role to play in the whole conscious knowledge pool that we have; AI will have an impact on that. And that will decide the future course of human evolution. That's the theory of the noosphere, which was theorized in the early 20th century. Yeah. It's interesting you said that, because AI is not new. It's been around.
[00:11:19] It comes in cycles, in 20-year cycles, and it's been around since the early 1900s, right? Everybody's talked about it, but now it seems to be more relevant, or more prevalent, or more, I guess, becoming a reality. So, you are a thought leader. You are a social thinker, an educationist, and a poet, right?
[00:11:46] So, you are involved in three diverse areas. What are your biggest concerns from each of those areas? And how do you bring it together and, you know, move forward being conscious in those three realms?
[00:12:05] So, you know, my biggest concern here, Jamil, is that AI is being projected as a demigod, which it is, mind you, in many spheres, as if it is the savior being born, giving fuel to the salvationist culture that we are part of. I don't agree with that, and my biggest concern is there.
[00:12:35] And even in Nexus, which you will know, by Harari, and even in Finite Education, both these books, the authors have mentioned the birth of a digital species. So they have said that AI is not a tool, you know, the way we might think of it. Like you just said, it has become a part of who we are. It's part of our whole world. It's a tsunami now.
[00:13:02] So it's a digital species, and we have to learn to coexist with the digital species. And my concern here, be it as a social thinker, be it as an educationist, or even as a poet, is how to save what is human in this world. And how to ensure that humanity is not marginalized. And how to ensure that we are connected.
[00:13:30] You know, that there is a connect, a human-to-human connect, a human-to-nature connect, and that we are not separated. Because I may be connecting to you through technology right now, but maybe I'm disconnected from the people around me here. So, yeah, that is my biggest concern: how to remain human, or save what is human, or humanity, in this AI world.
[00:13:57] So I'm wondering if those were some of the topics on the agenda at Davos this year. They were not, if you ask me. The closest thing they were talking about was AI ethics and guardrails. But no one is actually thinking about the root causes and the impact and how disconnected we are.
[00:14:25] You know, the social or cultural perspective of it. Why I'm in blockchain, why I'm in crypto, is because I think it's the first time in human history where you're at this intersection of social and technological advancement, but also investing, finance, really.
[00:14:55] You know, so at that triangle, right? What does the world of that triangle look like in AI, comparatively, to you? Does AI help the blockchain world, that intersection of those three? Does it enhance it? Does it take away from it? What is the role, or what could be the role, of blockchain and AI?
[00:15:24] And what's the third one? Well, the role of blockchain is, you know, this combination of finance, technology, and social impact. Right. So how does AI either take away from that or enhance it? I don't think AI can enhance it or take away from it, neither of them, if you ask me from a sociocultural perspective of how I see blockchain.
[00:15:54] So let's go back to the values that blockchain stands for. It stands for transparency. It stands for trust. It stands for decentralization and, you know, protection of our privacy. Now, if you look at the world around us, do you see trust? No. Trust is breaking. Do you see transparency? I don't see that happening. Do you see decentralization being promoted?
[00:16:23] In fact, the opposite is happening: more centralized, more controlled. Do you see, you know, sensitivity to or respect for privacy or data protection? No. Your data and my data are just a click away. Everything can be known. So that is why. And I come from an inclusion background. I come from a diversity background.
[00:16:49] I see blockchain technology as a technology of hope. But at the same time, I'm very well aware that the world right now is not an ally of, or in union with, blockchain technology, because of the values it stands for.
[00:17:11] However, you and I are people who would want to work to ensure that the world starts to stand for all these values. And Jamil, the day that happens, blockchain will be the next AI. Blockchain doesn't need the support of AI to do something,
[00:17:33] if it aligns with the values that humanity right now has, which are in a very bad state. So that's my vision. That's my perception. And I can also share with you a very beautiful anecdote that is shared in The Story of B by Daniel Quinn.
[00:17:58] And he said that there is a difference between a vision and a program. And blockchain is a program. AI is a vision. A vision pulls itself forward, so AI is automatically growing and evolving. Blockchain, because it's still a program, still a plan, needs a push effect. And we have to push it until it becomes a vision.
[00:18:28] Or change its course in some other form until it takes its own shape and becomes as big and as independent or autonomous as AI is, which is growing on its own. So yeah. Got it. That gives me something to think about. I want to shift gears a little bit, but stay within your expertise.
[00:18:56] You have been a champion and a crusader for inclusion and diversity. I want to talk about the job markets, and we'll eventually get there, but I want to start by talking about policy. You know, the U.S. has recently taken a policy position to roll back inclusion and diversity, DEI, policies. They're not the only country.
[00:19:22] What's your perspective on that rollback of DEI policies in the USA and around the world? Great question. I would like to continue what I was saying: again, the DEI push was a program. It was not a vision. We were just pushing it to happen.
[00:19:49] And the moment there was not enough reason to push it, which came from the government, sorry to say, it rolled back. I was not surprised when it happened. And I'm not worried, even though there is a snowballing effect across the world. Because I have worked extensively in the inclusion space at the ground level.
[00:20:16] One thing that I have understood is that you cannot teach people inclusion. The reason being that the cultural training, or the cultural indoctrination, which is pasted on us all, creates so many cultural or unconscious biases that it's difficult to go back to, you know, being who you are.
[00:20:44] Because when we talk of diversity, when we talk of equity and inclusion, these are fundamental values of being human. And cultural training has ensured that we discriminate against the other person on the basis of caste, gender, region, language, religion, XYZ, you know, all these parameters. So I and you are different, and we don't become we.
[00:21:14] So I believe it's about working backwards, going back to the core, removing those pastings, understanding that at the core level, compassion is omnipresent. Inclusion is omnipresent. You know, I will give you an example. You might think that Shruti is being too philosophical. So I was running this workshop a few years back, at least 10 or 12 years back.
[00:21:43] I was running this workshop at a Delhi government school for pre-service and in-service teachers. And this was to orient them towards inclusive education, towards inclusion and diversity, to embrace students of every kind, including those with disabilities. For two days, I ran this workshop. We did a lot of work, a lot of activities, trying our best. And at the end, after the workshop was completed,
[00:22:13] one of the teachers came to me and said, you know, ma'am, it was so good, you were so nice, but don't you think that disability is a curse from a previous birth? How do you expect us to allow that child to be in our class with other children? You won't believe it, Jamil. I felt the two days had been completely wasted, and that it was my failure that I could not get through.
[00:22:42] Then I thought about it, contemplated it. I was so filled with sadness, I would say. Because I thought: how do I fight the cultural training, since childhood, of this teacher, who is well-educated, who understands at the cognitive level that inclusion is important if we want the whole society to grow? But the cultural indoctrination that disability is a curse from a previous birth, how do I negate that?
[00:23:11] So that is why I believe that for inclusion to actually happen, it has to work at a very deep level; that is why I work with children, if you ask me. Because I feel that they are the most inclusive people. They are the most beautiful. They just love everyone. They don't discriminate. They are still without those cultural trainings and pastings.
[00:23:36] So I can work with them, and I can show how beautiful the world is when we celebrate diversity. I hope that answered your question. Oh, it does. It's good. Yeah. So that's your perspective on it. Now I want to find out, from a policy perspective, what's the impact of the Trump factor on the sociocultural landscape around the world?
[00:24:03] I guess for or against the US? You know, what's your... So my take here would be that it's just another step. Because what is happening has been happening for at least the last few years across the world, and now the US has also joined the bandwagon.
[00:24:32] Worldwide, pseudo-democracy is taking place, basically. And the US has also proved it: yes, pseudo-democracy is there as well. And there was a time when the whole world looked up to the United States of America. I don't think they do at this point in time.
[00:24:56] And it's a time, like I discussed initially, of turmoil. It's a time of change, which is always messy. It's a time of building a new world order. And to build the new world order, there has to be a lot of destabilization. There has to be a lot of disharmony going on. And yeah, I'm hopeful, Jamil, that we will find the new homeostasis. We will find the new balance.
[00:25:26] Let this broth boil. Let it brew for a while. Because any sort of revolution, any sort of change, won't happen while we are in that comfortable position. And I think Trump is the best person to shake our seats, to wake us up, and to get us to do some work. Yeah. Yeah. That makes sense. We're going to need to do work.
[00:25:56] And speaking of work, I want to talk about the job market, right? What essential changes do we need in education to ensure the future generation is equipped for the new world? What's coming down the pike, including AI, Balaji, and everything. You know, what changes need to happen? I think we should start by asking the basic questions. Like, the first question is, what is the new world?
[00:26:26] The second question is, what is the role of education? So, if you ask me, the fundamental purpose of education is preparation for life. But if we talk about it from a narrow perspective, I would like to quote Robert Kiyosaki, who said that the job of education is to get a job. As simple as that. And then we come back to the first question: what is the world that we are heading toward?
[00:26:51] So that we know this is the world our children have to be prepared for. We know that factory-model education is redundant; it's not required. And we don't know the world we are heading into. But we know one thing for sure: adaptability is the one single trait that will be required so much more. Whatever work area you are in, you need to be adaptable.
[00:27:19] Because the social change that used to happen over decades is now happening in months. So unless and until you're agile, it will become difficult to cope with that. And I think the second and third most important skills to be nurtured are curiosity and critical thinking.
[00:27:44] And, you know, if you ask me what is the role of AI, or technology, in education, I think the best thing technology has done is that it provides access and opportunity, and it can be equitable. And that is why I would still be hopeful that AI can de-standardize our standardized education system.
[00:28:15] Because it can help learners develop and work according to their individual education plans, whoever needs what, based on that. And, yeah, I think my only concern is whether EQ and SQ are taken care of. Got it. So a key word you said there was redundancy. And I think about the education system.
[00:28:41] I think about my education, you know, and a lot of people come out of school, both high school and college, financially illiterate. Because, you know, they'd rather teach you something like what happened 200 years ago instead of, hey, this is how you make a sound financial decision.
[00:29:04] So how can technology, AI, blockchain, help redefine and remake the education system and make it non-redundant? Yeah, so first, as I said, by de-standardizing the standardized education system. Because we don't need our students or children to know all the information. Information is available, right?
[00:29:33] We are living in an information era. It's just a click away. We want them to be able to analyze, to correlate, to connect, and to comprehend different pieces of information. And for that to happen, they have to have curiosity. Curiosity will lead them to question.
[00:30:02] To be curious: why is this, why is that, how are they interconnected, and all that. And with AI, we see a lot of, you know, I love the visual representation of things. That makes you understand any topic or concept in a very different, and I would say very simple, way.
[00:30:29] So that is how I see that technology can play a very big role. As for the financial thing, education's role is not to teach finances, because its role, like the factory-model education that was built during industrialization, is to create followers, to create workers. So they will not want any of us to know about finances,
[00:30:58] so that you can do work for someone else. If you learn finances, you will know how to work for yourself. And I'm not a big fan of tech solutions; I find them old wine in a new bottle. So I would say, for finances, it still comes down to us learning the game, learning the trade ourselves.
[00:31:27] I think the key word you said there was curiosity. Curiosity, right. What makes sense to me is that it would drive conscious change. Yes. I agree. Absolutely. Makes sense. Awesome. Well, I want to thank you very much for speaking with me today. This has been an enjoyable and delightful conversation. I have one last question, really.
[00:31:54] And, you know, it is: how can people find out more information about you? How can they follow your work, poetry, and everything else? How can they find out more information about The Digital Economist? Yeah, of course. I can put the links: my portfolio, my organization as well, and the Amazon link of my book. So I look forward to interacting with the audience.
[00:32:24] And thank you so much for having me. It was an honor, and I thoroughly loved our conversation. Yeah. Awesome. Thank you very much for your time today. Thank you.


