Max Giammario is the CEO and Founder of Kindred, pioneering the world’s first IP-powered agentic AI protocol. He is developing emotionally intelligent AI companions that seamlessly integrate across devices. With a background in Human-AI interaction (Master’s & PhD researcher), two successful startup exits, and extensive work with Web3 leaders like Filecoin, Tezos, and Ronin, Max is at the forefront of AI and decentralization.
[00:00:03] Hello everybody and welcome to the Crypto Hipster Podcast. This is your host, Jamil Hasan, the Crypto Hipster, where I interview founders, entrepreneurs, executives, thought leaders, amazing people all over the world of crypto and blockchain and recently AI. Today I have another amazing guest for you. I'm going to try to pronounce his last name for you. His first name is easy, it's Max. Last name, Giammario. He's the CEO and the founder of Kindred. Max, welcome to the show.
[00:00:33] It's a pleasure to be here. Thank you for joining me. Thank you. I look forward to speaking with you, and my first question is always the same. It's what is your background, and is it a logical background for what you're doing now? Yes, an interesting one. I grew up on a farm in Australia, so surrounded by a lot of different animals, I guess. And I started entrepreneurship as well, at the very young age of 10.
[00:00:58] And I've been doing a lot of different businesses, but really, when I started my master's and also participated in PhD research, it focused on the idea of artificial life. So, arguably, this new kind of species that we are engendering with the technology of today, with things like machine learning and neural networks taking off.
[00:01:20] And so my focus was on how we can apply these types of concepts from artificial life to things like mental health, and not in the way that you would think. So one of the things that we uncovered when it comes to mental health is that one of the easiest ways to improve someone's mental well-being is by adopting a pet. Right. But there are a number of reasons why that is the case, and also a number of reasons why people don't adopt pets. Right. Some people just don't like animals as pets. Some people are allergic, things like that.
[00:01:50] And so if you were to create something that could fill that gap in terms of companionship at a more mass scale, then you've suddenly created a very scalable solution to essentially helping a lot of people mentally. And so that was the essence of what Kindred was born from.
[00:02:11] Awesome. Awesome. So your vision, your mission at Kindred is what exactly? Just mental health or is it much more? Yeah. So, I mean, I actually don't like to talk too much about the mental health angle because people then assume it's a mental health tool and I don't want that. What I want it to be is something that people just relate to, similar to a pet. Right. So when someone adopts a pet, you don't think, oh, they must be going through some mental issues.
[00:02:37] It's not the case. Right. You adopt it because you actually want a companion in your life. It's something that sparks you joy. And so the same thing should happen with that AI. Eventually, we will have personal AI for each and every person. Right. It'll be as ubiquitous as your phone. So when you leave your house, even when you wake up, the first thing you see is your phone. You leave and your phone is always by your side.
[00:02:58] And I think the same thing will happen with your personal AI. So imagine something like Siri or Google Assistant, but something way more superpowered and arguably something you can actually form a deeper emotional bond with. And so the whole concept of Kindred is how do we bolster that emotional connection? How do we create connections between us and AI that feel authentic, genuine and actually something that helps us throughout the entire day?
[00:03:24] And so one of the main pillars we leverage is IP characters. The reason we do that is to cross the uncanny valley of using a personal AI. You probably very loosely use things like Siri or Google Assistant on a daily basis, or even ChatGPT. But to actually truly form a connection, you have to visualize them. And we are crossing, or attempting to cross, this uncanny valley. Right now we're just not prepared to integrate AI into our lives.
[00:03:52] It's just intrusive and we don't trust it. And what we found with emotional AI, and also the ability to visualize it, is that it's important that we kick off the relationship with something we are already familiar with. In this case, characters, so Bugs Bunny or Hello Kitty. We're currently the only AI company partnered with these IP characters. And it leverages the psychological phenomenon of the trust we already associate with those characters.
[00:04:22] And so that's the essence of Kindred. Awesome. So I want to get into some of these emotions. First thing to find out, though: why is it important to create emotionally intelligent AI companions? They address fundamental human needs for connection and support. Traditional AI assistants are these faceless things that people really just can't resonate with.
[00:04:51] And we've seen it so many times in sci-fi movies and TV shows and fiction books, where we always tend to gravitate towards anthropomorphizing our robots, making them feel more human. Right. And that trend will continue. You see a lot of it in the physical AI space right now. They're humanoid robots, but they will slowly push more towards being something you can actually be friends with. Friendly. Has a face.
[00:05:17] And that anthropomorphizing of the robot carries over to something like our personal AI as well. And so I think that having that emotional connection there is going to be absolutely fundamental to actually having something in our daily life that we can relate to. And it becomes more important. And so the analogy I can give is something like HAL 9000. Right.
[00:05:40] I just do not see a future in which we have something like HAL 9000, a cold, robotic, functional voice that gets everything done. Right. I think we have a human need to have something that connects us with that thing, that artificial life. All right. So a few years ago, somebody said to me, and I believed it to be true, that everybody will have a robot. Okay. And the robot will be like a pet. Now.
[00:06:10] I have a pet. She's a five-year-old Bernedoodle. She's 35 pounds. The benefits that I get from her are, you know, if I'm having a bad day, she rolls over, I rub her belly and feel better about myself immediately. I take her for four miles of walks every day. She's helped me with my health.
[00:06:31] If I had a robot instead of the actual pet, if I had the AI companion instead of a pet, what benefits could I get that I'm getting from my relationship with my dog right now? Yeah. I mean, look, honestly, I'm not trying to compete on that level. I think that you can never replace a physical pet. The love and bond you form with that dog is paramount to anything. Um, I think the distinction I would make here is yes, there is that functional side.
[00:06:59] So you have, uh, this companion that not only forms part of your support network, and the joy you have in life with that companion going through this journey, but also now you have that entire functionality. So it serves a clear use case in your life, acting as a personal assistant, acting as something that is able to warn you of things, uh, your news notifications, everything goes through that one companion.
[00:07:24] Um, but I also like to distinguish here that the idea is this: the dog that you have right now, if I were to give you a slightly more functional version of it, you would not replace it. Right. Because you really have this bond with the dog. You didn't go and adopt that dog for the sake of, what can it do for me? That wasn't the nature of why you adopted it. Um, and I see the same thing happening with AI. Uh, the AI that we use on a daily basis changes.
[00:07:54] We have absolutely no loyalty. I personally change from xAI to ChatGPT to Perplexity to DeepSeek on a daily basis. Right. And so the angle here is: why not focus on something that is intangible, that bond, and then provide the functionality without the assessment being purely on that functional layer? It's on that actual joy, companionship, the interface. Got it. Okay.
[00:08:21] So I saw an article, this is many years ago. I don't know what the stats are today, but they surveyed a thousand British men in the UK. And it said 75% of the men surveyed had zero friends. Right. So if we have a society that's headed toward friendlessness, or fewer friends, right.
[00:08:48] What benefit could these men or other people have from, you know, having a, an intelligent, you know, AI companion? Yeah. I mean, these companions are designed, arguably trained as professional therapists. So they know exactly how to approach people. They understand the emotional resonance of their voice. They understand how to approach a vast database of different emotional problems, mental wellbeing.
[00:09:16] They can flag and potentially note different things in the way that, uh, a user is talking to them. Right. So for example, if a person today is talking about how they feel terrible and they're, like, snowballing, these are little tiny red flags that arguably most friends wouldn't pick up on, because they aren't trained in mental health. Um, but our companion is referencing, uh, essentially, and has the intelligence of, let's say, the most professional therapist.
[00:09:44] Then it's able to at least tailor more appropriate answers, uh, and help the user in that sense. Well, I guess if you take that too far, there's always that side when it comes to AI girlfriends and boyfriends, which I personally don't work in. Um, that is not the realm that we are looking at. I am not looking at replacing intimacy. I think there are some human connections that should be reserved for humans. Right. Um, and that's why we're called Kindred, right.
[00:10:14] Which is family, um, is that we're replacing that kind of pet slash plant slash small family member that you have a trusted relationship with, rather than something like an AI girlfriend, in which you are truly trying to replace something that is quite human and quite intimate. If I pick up an AI girlfriend, my wife's going to be kind of mad at me.
[00:10:38] I think, you know, um, some people, a lot of people, you know, are afraid of AI, right? Um, they don't look at the benefits of having a companion, or of using AI at all. Right. And you talked about trust earlier.
[00:10:57] So how do we break down that trust barrier, um, so that these overlooked aspects of human-AI interaction can be realized, so people can realize the benefits? Yes. That's actually a great question. Uh, the stepping stones here are quite big, right? We have to take some fundamental leaps here in terms of our psyche.
[00:11:21] Um, I would say that the first thing that comes to mind really is that I wouldn't approach it from a functional perspective. You can't just thrust this kind of concept on people strictly from a functional perspective like ChatGPT. Right. There's no emotional bond when you use ChatGPT. It's straight up. You give it a question and it replies with the answer.
[00:11:45] Um, whereas what we want to do is approach it from the angle of companionship first: adopt the companion, or the AI, as a friend, as a thing to entertain you, something to spark joy. And then as you get used to building that bond, you provide that functional component. The second part I would say is IP characters.
[00:12:07] So that is one of our key unique insights, I would say, when it comes to how we approach integrating a personal AI into our daily lives. What we found is that using IP characters as a way to cross that unfamiliarity, or build that trust, is really, really effective. Um, we just tend to have more trust when it comes to a character because we've seen it before. We know the lore, we understand its personality. Um, we can, to a degree, relate to it because of its background.
[00:12:36] In addition to that, it also helps us when it comes to reaching new users, because now we're essentially bringing to life their favorite character as their friend and AI sidekick. And so you can imagine something like the hundred million community fans of Hello Kitty, right? Instantly now having the ability to talk and be with their Hello Kitty companion, or let's say Kuromi companion on a daily basis. And that's like such a powerful concept. Let's see. Interesting.
[00:13:05] Um, you know, community is something that I know a lot of the good crypto projects and I want to talk about decentralization. All the good crypto projects have a strong community, but you see less emphasis on the community recently and more emphasis on financial aspects of decentralization. Right.
[00:13:26] So what's the role of the community, and how will relationship-focused AI help make the community stronger? Yeah, I think that's one of the reasons we love Web3 in that sense. Um, it is a beautiful way to provide a sense of purpose, meaning, and ownership in the community.
[00:13:53] And I think it's actually something that's necessary for a product like ours. So imagine Kindred. I think when we came into it, we came in with the very kind of linear thinking of, okay, people would love pets, they would love 2D vector illustrations, and this will be the AI going forward. As I learned very quickly, uh, there are so many different demographics of people that want different avatars, right?
[00:14:19] Whether it's IP characters, whether it's their favorite celebrity as a chibi version, uh, whether it's their game character, right? Or their own avatar itself where they create their own. And so to be able to scale this truly to cater towards every single person on earth, right? Which is obviously not the goal right now, but something that we see in the future, right? Everybody will have a personal AI and I believe it will be visualized.
[00:14:42] And so to be able to cater towards that, giving the ecosystem that distribution of royalties, that sense of ownership, is going to be absolutely key to scaling this and allowing people to create their own avatars. Similar to why Apple opened the App Store, right? There's so much that Apple can do and there's so much that Apple can't do. And so being able to open up that community just allows the scale to be limitless.
[00:15:12] Uh, I think it's been a few years, but a lot of my guests a few years ago were building NFTs. Okay. And you hear a lot less about NFTs now, but I'm looking at what you're doing and I'm like, I think there could be a resurgence of NFTs, but with the AI aspect, you know, to it.
[00:15:38] How do you think your company, your creation, your innovation can bring NFTs back to the forefront? Yeah. So actually we just launched the Clara collection. So Clara is our first, let's say, mascot of Kindred, and it's an NFT collection that will eventually become something that jumps onto your screen and ends up being your AI sidekick. Um, that was launched a couple of days ago. We're still number one on Abstract as the top NFT project.
[00:16:05] Um, and so that gave us good confidence in being able to see this system in place. And so the idea here is that we launch an IP collection on a monthly basis. So let's say next month is Hello Kitty. The month after that is Astro Boy, and so on and so forth for the next 16 months. Each one of those collections is NFTs, right? And the key distinction here is that each one that you have is essentially a unique entity.
[00:16:32] It's actually quite important for the emotional bond to express the uniqueness. The fact that this is one-to-one with an individual user, right? Just like, for example, your dog, right? You know that there's only that dog in existence. It's one of one. Like even if you try to clone it, it's going to be that one dog that you always stick to, right? And that sense of appreciation that this is something that, once it's gone, is gone forever. That is actually quite an important part to express.
[00:17:00] And the best way to express that is with an NFT and blockchain. And then obviously all the other side benefits of having that Web3 component come into play. Including building the community. Yes. Yes. Awesome. So I want to talk about human-to-human. Actually, what's been on my mind recently is a couple nights ago in my town, in my son's high school, we had a speaker. And the speaker, he's been sober since 2008.
[00:17:29] But he was a college basketball star and a pro NBA player who got involved in drugs. And he grew up in this, you know, era of isolation. Not isolationism, but isolation, loneliness.
[00:17:47] And I know we have an epidemic with drugs, you know, still. Like, do you think that Kindred can help these people who are lost early on in their lives get companionship, fill that void that might have been there, so that we can solve the drug problem going forward?
[00:18:40] Yeah.
[00:18:46] And so that's what it's based on, you know, it's a
[00:19:10] bunch of code, right? It was a digital pet. And yet we formed these extremely deep bonds with it. And one of the most interesting facts when I was researching this was that one of the most loyal user bases of the Tamagotchi was middle-aged, working-class Japanese men. And it actually exemplified exactly what we were studying, which was the idea that you could create something that assisted and helped
[00:19:39] those that were suffering from an intense level of loneliness or isolation where they were living, right? And so in this case, it was the Japanese working class, right? The salaryman. And so we were seeing these genuine bonds being formed. And so if we're able to pair that with something that could be scalable, or more approachable, let's say, to the mass audiences that might potentially be facing some sense of loneliness, but not to the extreme level
[00:20:08] where they are seeking help, right? Or they're taking meds, right? And so before that even happens, why not address this issue with a somewhat preventative tool? And yeah, I think that that is probably the most rewarding component of this entire thing. It's really seeing people use this and it sparks joy. And it's something that accompanies you throughout your workday and it just makes you smile. And that has such a profound impact on your mental wellbeing.
[00:20:38] Interesting. Yeah. I'm trying to think back to when I was in college. At first, I had massive homesickness. And that eventually affected my grades. I would think that if I'd had this AI companion back then, it could have helped with that homesickness. Yes. A hundred percent. Yeah. It's great you raised that.
[00:21:04] I love hearing the amount of use cases here, how this applies to such a broad range of people. And so we're definitely seeing huge demand from students, as you mentioned, like studying abroad, but just in general. Students have that combination of having to do a lot of work when it comes to studying, but then also remaining isolated, potentially in an overseas or foreign country. But then we're also seeing this in terms of children,
[00:21:31] very young children. So we did a pilot program in which we worked with parents and children as young as five or six, right? We've also dealt with the elderly, right? So 60-plus in retirement homes, and applications in healthcare, right? And so this has a far-reaching impact on a lot of different industries, simply because you have to remember everything is, in a sense, about
[00:21:57] relationships, right? The human connection that we have. And so to inject this kind of human connection in a way that's functional but also loving, I think, has so much application in so many different industries. Let's go down to healthcare. It's been on my mind recently, you know, without getting into details. But besides the wellness, besides the
[00:22:22] mental health section of health, what benefits do you see that could happen at that forefront of AI and Web3 together that will cause breakthroughs in the area of health? Yeah. Oh, wow. That's actually a really interesting question. Um, I think, yeah, for healthcare especially, uh, you actually look more towards the physical component of AI. Um,
[00:22:50] so we are working with some hardware companies here to be able to bring these kinds of characters, uh, and this kind of approach to physical robots. And I think that healthcare is one of those areas where the patient recovery journey is often actually overlooked, right? Let's say for someone in the hospital, it's very much on the functional side, right? There are a lot of studies you can research here, but often, after a patient has gone through this
[00:23:18] hospital experience, especially over a long duration of time, they are extremely isolated. They feel super lonely. Um, they are living in a hospital, which nobody likes being in. Um, and so you introduce something where the companion that you've already formed a bond with is able to traverse into, let's say, for example, the healthcare environment and become this ubiquitous companion. Yes, it's collecting all the data. And luckily with decentralization, we are approaching ways in
[00:23:47] which we make that data owned by the user, but then shared, so that it's a transparent but secure approach. Um, but by doing that, essentially, you aren't seeing these companions as just for your healthcare journey, right? Uh, the other example would be when we worked with children, it was for a Bugs Bunny companion, right? And their parents. And the key distinction
[00:24:13] here was that we didn't call the Bugs Bunny an edtech tool for children, right? Um, we didn't just use Bugs Bunny for helping students with their homework or to teach them something new, right? Uh, I believe that from this kind of educational perspective, uh, children will see this Bugs Bunny as, oh, okay. All right. This is for learning, education. And then I can go back to my games. I can go back to my fun time,
[00:24:39] but this is seen as work, right? Which is not the direction I want to go down. Right. And, like you said, you have children. Um, so you probably can resonate with this. We created Bugs Bunny so that it stays with them throughout the day. So it's hanging out with them when they're outside playing, right? It might be on their smartwatch, right? Just playing along with them. And then the great thing is Bugs Bunny is able to slowly
[00:25:07] push the child into, let's say, oh, it's homework time. Let's try to do that now. Or maybe push the child from watching, um, YouTube videos of just, like, straight-up Roblox all the time into something like a TED Talk or something more educational. Right. And so these small little nudges that a great friend would potentially make, right, are something that we see as one of the key building blocks that allow the Bugs Bunny to form those deeper bonds with the child,
[00:25:33] because a child is no longer seeing the Bugs Bunny as an educational tool, but rather as this great friend that happens to push him or her towards more educational perspectives. I love it. When I was a child, I watched Bugs Bunny. I watched Merrie Melodies every Saturday morning and Bugs Bunny cartoons. It was great. You know? So I think if Bugs Bunny wanted me to do something, I'd say, yeah, sure. Uh, you know? Um, yeah. So
[00:26:03] I'm thinking about what kind of fields you're going to disrupt the most. And the one that comes to my mind is coaching. You know, I'm like, am I going to need a personal trainer if I have an AI companion? Am I going to need a therapist if I have an AI companion? So how do you communicate to the people who are most threatened by you? You know, how do you say, okay, there's still a role
[00:26:33] for you here. You know, um, no need to be threatened. Like, how do you get through to them? Um, I'm not trying to push back on you here, but are we talking about our competitors or our consumers? I'm talking more from a competitor perspective. You know, I think the consumers will be happy not to pay $500 an hour for a therapist, you know, but your therapist would
[00:26:59] be kind of strapped. Like, how do we get those people, the coaches who are being replaced, to bring this into their business and use it as a companion? I think for us, uh, the analogy I can give is the AI chatbot boom, right? If you remember, a lot of enterprise websites that you went onto had these rudimentary chatbots that you could talk to, and there was a little bubble that popped up in the bottom right of your screen. I'm sure most people are familiar with them.
[00:27:29] Um, it was super annoying because not only were they not that functional, but also, um, that chatbot had no knowledge about you, right? No personal data. It also didn't understand the context you were coming from. And so every single time you went onto these websites, you had to retrain that chatbot on your specifics, right? It also just straight up didn't work that well. Right. And so the key here is that what we focus on is the interface. We have the interface that every single
[00:27:58] user will go through, right? Whether for me it's my Bugs Bunny or for you it's your Scooby-Doo, right? You end up talking to that companion all the time. It knows everything about you. And so, for example, if you wanted to connect this to a fitness coach, right? The fitness coach would be the one that trains the agent on the backend. And then they don't focus at all on the interface. They don't focus on the front end whatsoever. That's not their specialty. What we want to do is extract the knowledge, the value that fitness
[00:28:26] coaches bring, right? All of their background knowledge, and use that and plug it into our interface, instead of them having to do both the knowledge base and also this often rudimentary or cheaply made chatbot interface, right? And so combining those two things just enables these specialists, let's say a therapist or that coach, to purely focus on what they do best, right? Which is training
[00:28:52] the AI to essentially provide those relevant questions, answers, knowledge base, things like that. And then we just focus entirely on the interface. Got it. So you're basically able to extract their expertise and knowledge and use that to help your consumers, but you rely on the experts. Yes. So that's how you get them interested. So it's like, here you go, please share your expertise. I think
[00:29:21] it's a good idea. So awesome. So what other innovations do you envision at the forefront of AI and blockchain that we haven't seen yet but are going to see real soon? I think the biggest issue, at least in my experience with AI and decentralization, or at least AI, is the idea of ownership. It's actually quite timely. Yesterday, we saw the virality of
[00:29:49] ChatGPT's new image generation model, which allowed for Studio Ghibli-type images. Have you seen this? Yeah. Yeah. Yeah. And so everybody is making Studio Ghibli images of themselves, of pictures. And this is a major IP issue, because how does Studio Ghibli even attempt to enforce
[00:30:18] this, right? This is clearly, clearly copyright infringement. No doubt about it, right? The style is purely referenced from Studio Ghibli's art style and everybody knows it, right? But the thing is, who actually is able to distribute the royalties there if Studio Ghibli does decide to try to enforce this, right? Is it the AI generation model? Is it the original AI company, let's say in this case,
[00:30:42] OpenAI? Is it all the references that the AI model has taken into consideration in order to generate that image, right? Or is it actually the AI creator? So for example, if I use that image generation and, because I did a specific prompt, shouldn't I also technically own part of that image, because I'm adding value, I've created it in that sense? And so that is a really, really difficult problem,
[00:31:09] right? To distribute that kind of ownership or royalties to the relevant people that created that artwork. And so I think that decentralization allows for that transparency to occur, right? It allows us to distribute fairly, or at least attempt to distribute fairly, in a way that everybody can see. I agree. I agree. Sounds good to me. So awesome. So I want to,
[00:31:35] yeah, I want to thank you very much for your time today. This has been a wonderful conversation, and time has moved fairly fast because I enjoyed hearing all this stuff. So it's great. So thank you. I have one last question. It's how can people find out more information about you, about Kindred? How can they start to use your platform?
[00:31:57] Most of our updates come from X, so Kindred underscore AI, or our Discord, and you can find all of those links on our website, which is kindredlabs.ai. You can also follow me. I tend to talk a lot about emotional intelligence in AI and essentially the outlook of artificial life going forward. Awesome. Awesome. Thank you very much for your time today. I really appreciate it. Thank you for the discussion.