Yannik Schrade is the CEO and Co-Founder of Arcium, a parallelized confidential computing network bringing fast, scalable, and universal encryption. As the lead architect and visionary behind Arcium, Yannik created the project by leveraging his expertise in cryptography with the goal of advancing data security and confidentiality.
A renowned thought leader, Yannik has spoken at several prominent industry conferences, including the 2024 World Economic Forum, where he challenged TradFi leaders on the importance of decentralization, privacy, and trustlessness.
He previously founded ShiftScreen, an iOS app that attracted over 100,000 paying customers and was a regular top seller globally. Yannik studied Computer Science and Mathematics at the Technical University of Munich, and he also studied law, adding another dimension to his extensive portfolio.
[00:00:00] Hello everybody, and welcome to the Crypto Hipster Podcast. This is your host, Jamil Hasan, the Crypto Hipster, where I interview founders, co-founders, executives, entrepreneurs, thought leaders, you name it, from all over the world of crypto and blockchain,
[00:00:15] and I have an amazing guest today for you, and I'm looking forward to this. I have the co-founder and CEO of Arcium, Yannik Schrade, joining us. Yannik, welcome. Hey, I'm Yannik. Great pleasure to be here today.
[00:00:35] Yeah, very excited. Awesome. So, thank you for joining me. Let's kick things off. I ask everybody the same question, and I get all kinds of great answers. What is your background, and is it a logical background for what you're doing now?
[00:00:53] So originally, I started 12 years ago, at the age of 12, teaching myself programming. And so it has always been part of my childhood to program applications and just tinker around with things.
[00:01:17] But then after high school, for some mysterious reason, I decided to go study law instead of pursuing mathematics, computer science, or physics.
[00:01:30] And yeah, at the time I thought that it was the right decision, and even while studying law, I was very attracted to legal tech, to combining traditional law with automation, artificial intelligence, and just digital technology in order to make it more up to date, I guess.
[00:01:56] But I noticed during COVID, when I had some lecture-free spare time, that, yeah, I was drawn back more heavily into this computer science world. At that point I was programming a quite simple application that helped me at the time to use
[00:02:18] my iPad Pro as a full desktop computer, because those were in trend during COVID, when everyone was working from home.
[00:02:26] I was programming an application that allowed a secondary monitor to be connected to your iPad and then have the full desktop work experience, and overnight that turned into a big success. And so at that point, during my last civil law course, I decided, okay,
[00:02:48] let's switch things up and instead go back to mathematics and computer science. And so that's how I came back to my original passion.
[00:03:00] And then I met my co-founders studying at university. All of us were doing computer science and math, and we were just intrigued by blockchain systems and privacy technology, essentially. And so that's how we ended up teaming up to build
[00:03:21] decentralized confidential compute now, though originally, when we started off, it was just general on-chain privacy. Again, I've always been sort of a privacy fanatic, I would say. I think one of the biggest achievements in my life
[00:03:40] so far has been onboarding my entire family into using Signal as the primary messenger, with family communications now running over Signal instead of Zuckerberg's WhatsApp. Yeah, we've got to have things to talk about.
[00:04:00] And so, I think that's very cool. You know who introduced me to Signal? It was John Draper, Captain Crunch, back in 2017, and, you know, he told me all about it.
[00:04:13] I haven't followed up with him in years, but yes, it was a great platform. And confidential computing, I'm going to get into that too.
[00:04:21] First of all, for those who may not know, what is Arcium all about, and how do you redefine trust in this digital age? Yeah, sure.
[00:04:33] So what we are essentially trying to tackle is that we want to remove trust from computing, because in the centralized computing world there are many actors involved that one has to trust.
[00:04:49] One has to trust them with their data and with the processing over that data, right? And so on a day-to-day basis we see these data exploits
[00:05:01] and attacks on individuals and on corporations, and most of those exploits are due to the infrastructure that is being used, that centralized infrastructure being very prone to attacks.
[00:05:22] So for us, it's this introduction of confidential computing, and decentralized confidential computing. What's amazing about confidential computing as a concept is that operations can be performed over encrypted data without having to decrypt the data first, right?
[00:05:42] So you can encrypt data, make it secure, make it inaccessible to anyone, and still be able to process that data. And this is a concept that has been pursued in the traditional computing space as well for quite some time.
[00:05:59] So the go-to technology in general has been so-called trusted execution environments, which are dedicated chips provided by hardware manufacturers that come with some security and data privacy promises associated with them.
[00:06:19] But yeah, similar to just generalized server infrastructure, what we've seen with these kinds of systems is that they inherently require trust from whoever is using them. And so for us it has always been this struggle to remove trust and replace those trusted systems with cryptographically based systems,
[00:06:44] so that the only trust, I guess, you require is in some mathematical proof. And so yeah, I think that's where we're standing in order to redefine trust. We can get into the more detailed technical ways we are facilitating that, I guess, but
[00:07:14] in general our approach is to decentralize things and introduce cryptographic protocols that protect data and allow arbitrary computations to be executed over that data. So what we are trying to achieve is for everything to become encrypted and for anything to be computable over that encrypted data.
[00:07:41] Yeah, very cool. I do want to get into the technicals; we'll do that after this question.
[00:07:49] I want to find out, you know, I saw an announcement that you are now part of a new DeCC coalition. You know, what is that all about, and what is your role and your vision for your role in it?
[00:08:05] Yeah, sure. So essentially the idea behind it was this kind of vision and mission. I think the main reasoning is that we require more education about what we're trying to do here.
[00:08:58] We require more education about the importance of data security and also about the kinds of solutions that exist out there. So there needs to be more education, and if we unite, yes, the startups in this space, we are able to facilitate this education in the ecosystem. And then,
[00:09:24] yeah, we want to clearly have this sort of separate new category for web3 technology with decentralized confidential computing, and for us it's important that the use cases DeCC offers
[00:09:42] expand upon financial use cases. So it's not just about financial privacy: decentralized confidential compute is generalized confidential compute, so the ability to create, I think, the infrastructure for fully confidential smart contracts that can perform any action
[00:10:05] and perform arbitrary on-chain actions, but at the same time use this technology to also facilitate confidential off-chain operations. So in general, what we are trying to achieve with decentralized confidential compute is to offer
[00:10:24] a cloud, if you will, so really a computation execution environment that can be used both by enterprises and for confidential smart contract applications. And so there was just this need in the entire blockchain ecosystem to introduce a new term, a term that really
[00:10:47] captures, yeah, the uniqueness of what we are trying to achieve. I looked at some of the companies and projects in that coalition. Pretty strong, so you're in good hands and in good company, I think.
[00:11:04] I do want to find out now how Arcium can help enhance that security, you know, on the tactical side, right, for the large healthcare providers. Yeah, alright, so
[00:11:21] let's get into it. So first things first, I think what's important about our approach, and it might even be unique in the DeCC or just blockchain space in general, is that
[00:11:36] we didn't come about using blockchains or decentralization for blockchain's or decentralization's sake, but instead because using decentralization and blockchain together with the cryptography that we're using is just this natural fit that allows new properties to emerge.
[00:12:00] And so the core technology that we're using is so-called secure multi-party computation (MPC), a cryptographic primitive that allows a set of parties to collaboratively run a computation over distributed data, meaning data that each of those parties holds, without having to share that data.
[00:12:23] So the two of us would be able, using a multi-party computation protocol, to compute the average of our salaries or income or net worth, whatever, without having to disclose any of the underlying information. We are able to
[00:12:39] compute anything in a distributed way and receive the output, but there's no risk of the input data or the intermediary results being leaked. So that's a very powerful cryptographic primitive, and
[00:12:56] at the core of how multi-party computations work is the concept of secret sharing. So if there are multiple parties that run a computation and they have private input data, their secrets,
[00:13:09] they take those secrets, split them into shares, essentially blinded, randomized data at that point, and distribute those shares to the other participants. The participants can't gain any knowledge about the original secret from those shares alone; only all shares combined back together can form the secret. And so multi-party computation essentially means multiple parties taking their secrets,
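The share-and-reconstruct step described here can be sketched with additive secret sharing over a prime field. This is a generic textbook illustration, not Arcium's actual implementation; the modulus and function names are chosen just for the example.

```python
import secrets

P = 2**61 - 1  # prime modulus for the share arithmetic (toy choice)

def split_into_shares(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n additive shares that sum to it mod P.
    Any subset of fewer than n shares is uniformly random noise."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Only the full set of shares, combined, reveals the secret."""
    return sum(shares) % P

salary = 85_000
shares = split_into_shares(salary, 3)
assert reconstruct(shares) == salary      # all shares recover the secret
# A partial set recovers nothing (matches only with negligible probability).
assert reconstruct(shares[:2]) != salary
```

Each individual share is a uniformly random field element, which is why seeing anything less than the full set reveals nothing about the secret.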
[00:13:39] creating randomized, blinded shares, sending them to the other parties so everyone has this heap of shares, and then running a computation over those shares. And what we're using for that is so-called somewhat homomorphic encryption, which essentially means that
[00:14:00] yeah, one operation is very easy and efficient to execute over this data. So how you can imagine it really is
[00:14:09] each player having their data and then just running a local computation over that data, and they can't learn anything from running this computation, because it's just random data that they are operating over. And then some communication is involved between the participants,
[00:14:25] exchanging some intermediary results, running more computations, and then at the end they reconstruct the output. And that's the core principle of the multi-party computation that we're utilizing.
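Putting the pieces together, the salary-averaging example from earlier needs only local additions over shares plus one final reconstruction. A minimal sketch, again using generic additive sharing rather than Arcium's protocol:

```python
import secrets

P = 2**61 - 1  # prime modulus for share arithmetic (toy choice)

def split_into_shares(secret: int, n: int) -> list[int]:
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    return parts + [(secret - sum(parts)) % P]

# Two parties with private salaries; each deals one share of its
# input to every party, so party j holds dealt[i][j] for each input i.
salaries = [85_000, 95_000]
n = len(salaries)
dealt = [split_into_shares(s, n) for s in salaries]

# Online step: each party locally adds the shares it holds. The
# shares are random field elements, so nothing is learned here.
local_sums = [(dealt[0][j] + dealt[1][j]) % P for j in range(n)]

# Reconstruction: combining the local results reveals only the total.
total = sum(local_sums) % P
average = total / n
print(average)   # 90000.0, with neither salary ever disclosed
```

Addition needs no interaction at all; only the final reconstruction involves communication, which is why sums and averages are among the cheapest MPC operations.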
[00:14:39] And what's beautiful about this technology compared to others is that there are two properties we care about: trustlessness, but also performance. Because there are very trustless technologies, for example fully homomorphic encryption, which on its own could be even more trustless than the technology that we're using,
[00:15:04] but at the same time, having to force, in this example, healthcare providers to run computations at a 10,000 to 100,000 times slowdown in performance and latency
[00:15:19] is so unreasonable that no one will opt to use the privacy-enhancing technology, right? So it always matters that we're able to actually supply a practical solution. And what we found
[00:15:33] in practice is that even those fully homomorphic encryption based systems have trust assumptions involved that are the same as the trust assumptions that we use with multi-party computation. So what are those trust assumptions? They basically boil down to two cases:
[00:15:54] honest majority and dishonest majority. Honest-majority protocols, as with practical Byzantine fault tolerance, would imply that in a computation more than half of the participants have to behave honestly in order for the computation to maintain confidentiality.
[00:16:15] For us, that's not really a reasonable enough trust assumption, because,
[00:16:21] yeah, having to trust that more than half of the players in such a computation are honest seems risky. And so the trust assumption that we prefer and that we are using is the dishonest-majority trust assumption, which means
[00:16:38] in a set of participants there only has to be one honest participant. One honest participant is the requirement, which means if you have a set of two players,
[00:16:51] you would, yeah, have this chance of one of them being honest, but if you have a set of 10, 100, or more players, it becomes a really reasonable trust assumption, especially if those computations are sort of permissionless and you
[00:17:09] have the ability to become an honest player yourself, right? So you can also be part of the computation with the others, with you being honest, which
[00:17:19] reduces the trust assumption to yourself. If you care about the confidentiality of your data in general, then this becomes a zero-trust game, essentially.
[00:17:29] And so that's what we are able to facilitate trust-wise. And performance-wise, what we're utilizing are essentially breakthroughs in multi-party computation
[00:17:46] protocols over the last years in the so-called preprocessing model: splitting the computation into two phases, a preprocessing phase and an online phase, with the preprocessing phase just generating so-called
[00:18:04] correlated randomness that can be consumed in the online phase, this preprocessing phase being completely independent of whatever is being executed in the online phase.
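A standard example of such correlated randomness is the Beaver triple: shares of random values a, b, and c = a*b, generated in preprocessing with no knowledge of the inputs, then consumed in the online phase to multiply two secrets cheaply. A two-party sketch of this textbook construction (not Arcium's specific protocol):

```python
import secrets

P = 2**61 - 1  # prime modulus (toy choice)

def split(v: int) -> list[int]:
    r = secrets.randbelow(P)
    return [r, (v - r) % P]

def open_shares(shares: list[int]) -> int:
    return sum(shares) % P

# Preprocessing phase: input-independent. Generate a Beaver triple
# (a, b, c) with c = a*b, and secret-share each component.
a, b = secrets.randbelow(P), secrets.randbelow(P)
a_sh, b_sh, c_sh = split(a), split(b), split((a * b) % P)

# Online phase: multiply secret inputs x and y using the triple.
x, y = 12, 34
x_sh, y_sh = split(x), split(y)

# The parties open d = x - a and e = y - b. Since a and b are uniform
# random masks, d and e reveal nothing about x or y.
d = open_shares([(x_sh[i] - a_sh[i]) % P for i in range(2)])
e = open_shares([(y_sh[i] - b_sh[i]) % P for i in range(2)])

# Local step: x*y = c + d*b + e*a + d*e, computed on the shares
# (the public d*e term is added by just one party).
z_sh = [(c_sh[i] + d * b_sh[i] + e * a_sh[i]) % P for i in range(2)]
z_sh[0] = (z_sh[0] + d * e) % P

assert open_shares(z_sh) == (x * y) % P   # 408
```

The expensive randomness generation happens before the inputs even exist, which is exactly why the online phase can approach plaintext speed.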
[00:18:15] So by splitting those computations, it's possible to have execution in this online phase, where the actual inputs and the algorithm, the function of the computation,
[00:18:27] are being used, and this allows the phase to come very close to plaintext execution speed, which is again super important, because every time you want to have an enterprise or a blockchain protocol or anyone utilize
[00:18:47] privacy-enhancing technology, you can't, and that's what has been shown in practice, you really can't enforce that sort of high performance or cost penalty upon
[00:19:03] the company using it or the end user, right? You should not have to go out of your way to have privacy, and that's really what we're trying to achieve. And so what we're building out with Arcium, using this technology at our core, is
[00:19:23] a parallelized confidential computing network, so a purely computational network that is sort of chain-agnostic and that is able to run those computations. And what that means is that as a healthcare provider, what you can do with Arcium is create a so-called MXE,
[00:19:48] a multiparty execution environment, which is a virtual encrypted environment. You can open such an environment and you can define encrypted state,
[00:20:00] and you can define functions, and then just let the network run confidential operations over that state without you trusting some singular server that could be hacked. So it's a secure execution environment which can be provided as a replacement for those traditional centralized server applications.
[00:20:23] But more importantly, and I think that's only one of the use cases, this autonomous execution environment, if you will: what's so fascinating about
[00:20:35] the technology we have and this core primitive of multi-party computation is that we can directly facilitate things like federated learning with it, right? So if we have hospitals that have access to super sensitive patient data,
[00:20:54] I as an individual don't want those hospitals to share my data, and from a regulatory standpoint, I also have the right to expect that they will not share my data. At the same time,
[00:21:08] as a patient, I would love it if, in general, we had a healthcare system where it's possible to predict diseases, using machine learning for example, as early as possible, right? But I don't want my data to be
[00:21:26] shared or exploited. So with the technology we're building, it becomes possible to use the data to gain insights without ever having to risk that data.
[00:21:38] So what's possible is for ten hospitals to use Arcium, where every hospital remains the owner of their sensitive data while at the same time they're able to train a machine learning algorithm over that sensitive data that can predict different kinds of diseases
[00:21:58] without ever exposing any of the patient data. And with this system we can even go full circle, so it doesn't even have to be those parties just training together and receiving some plaintext machine learning model; it can even be
[00:22:17] this machine learning model remaining encrypted forever. So they just provide the original data, right, within an MXE on Arcium, which creates an encrypted machine learning model. And then a doctor wants to predict
[00:22:34] the likelihood of someone having the disease that the model has been trained on those features to recognize, and they confidentially provide their data, so the data that they use as inputs for the prediction, run against this encrypted model, is also not exposed.
[00:22:56] They get their prediction back, and no data is ever leaked. So it's really end-to-end encrypted AI, if you will, at that point. And so I think that would be a very straightforward use case that we are able to facilitate with Arcium for,
[00:23:17] yeah, healthcare providers, I think. I love it. I'm sitting here silently clapping.
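The hospital scenario is essentially secure aggregation: each hospital secret-shares its locally trained weights so that only the combined model is ever reconstructed. A toy sketch with generic additive sharing and fixed-point encoding; the hospitals, weight values, and scale factor are invented for illustration, and this is not Arcium's API:

```python
import secrets

P = 2**61 - 1      # prime modulus (toy choice)
SCALE = 10**6      # fixed-point scale, since shares are field elements

def share_vector(weights, n):
    """Additively secret-share a vector of fixed-point model weights."""
    rand = [[secrets.randbelow(P) for _ in weights] for _ in range(n - 1)]
    last = [(int(round(w * SCALE)) - sum(col)) % P
            for w, col in zip(weights, zip(*rand))]
    return rand + [last]

def to_signed(x):
    """Map a field element back to a signed integer."""
    return x if x <= P // 2 else x - P

# Three hospitals, each with a locally trained weight vector that
# must never leave the hospital in the clear.
local_models = [[0.12, -0.40], [0.08, -0.36], [0.10, -0.44]]
n = len(local_models)
dim = 2

# Each hospital deals one share-vector to each of the n compute parties.
dealt = [share_vector(m, n) for m in local_models]

# Each compute party locally sums the share-vectors it received.
partials = [[sum(dealt[h][p][k] for h in range(n)) % P for k in range(dim)]
            for p in range(n)]

# Only the aggregate model is ever reconstructed; no individual one is.
avg_model = [to_signed(sum(partials[p][k] for p in range(n)) % P) / (n * SCALE)
             for k in range(dim)]
print(avg_model)   # [0.1, -0.4]
```

Keeping the model itself encrypted for later inference, as described above, would extend this same pattern by never running the final reconstruction.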
[00:23:30] There are a few things where I want to make sure I understand this right, first of all, because I've talked to people about multi-party computing, and you're the only one using it in this manner.
[00:23:44] You assume, because you're working with randomness, you assume everybody is dishonest, but one person is honest, which gives you your control factor, and then everybody else can be randomly dishonest, because you have that one base. Right?
[00:24:02] Yeah, and it can be anyone, right? If we have a set of 100
[00:24:08] computers, your computer, my computer, you could be honest, I could be honest, both of us could be honest. It doesn't matter, as long as there's one person that is not part of this malicious group trying to control everything.
[00:24:24] Okay, I understand. That's brilliant, and that's why it works. Number two, you know, I've got to say, I've been part of a few studies. I've got this rare thing, one in two million.
[00:24:39] And I've had it for the whole entire time of the podcast, three and a half years. You know, one in two million is not a mass test size, but there's a new drug that came out in December, and there have been studies since.
[00:24:53] I'm part of that, and I answer everything honestly. Um, that being said, these medical companies, these pharma companies that are conducting research for people like me, if they use your platform,
[00:25:08] what will be the benefits to them, you know, as part of clinical research, as part of these trials? What insights could they gain, and why should they use you?
[00:25:20] Yeah, so I think there are multiple aspects to that. One is that without this kind of cryptography-based security it's impossible to have this,
[00:25:39] yeah, cross-entity collaboration, because the data, under data protection laws, cannot be shared and processed in such a way. So there we have this regulatory limit, potentially.
[00:25:54] And now, what I think is more interesting is that we can have new forms of collaboration arise, even between competitors, because, and it doesn't have to be in the healthcare sector, right,
[00:26:11] if we think about supply chains for example: we have different logistics players that have data, and all players, all competitors, would benefit from shared insights or from some shared models,
[00:26:27] but they can't, or are not willing to, share any of their sensitive customer, company, whatever data. With this kind of system they're able to do so. So completely new intelligence and insights can arise from this kind of collaborative approach.
[00:26:44] So I think it's just this win-win situation where you don't have to give up any of your data. Yeah. So I want to investigate this, because you probably have a good answer. I interviewed a federated learning company some weeks ago.
[00:27:06] And a lot of people use ChatGPT, you know. Does federated learning provide a better data set for AI, for companies doing your kind of research, than just the regular ChatGPT-style data grab? What are your thoughts? Why would that be the case?
[00:27:37] Yeah, so I think it boils down to the sensitive data that can be used.
[00:27:47] I think the patient example really highlights that. It's really this tension, if we think about it on an individual level, because that's, I think, what we need to think about for this healthcare case.
[00:28:01] It's really, on an individual level, having sensitive data that has to be protected, right? Again, I'm the biggest privacy advocate, and what I found is just that with this kind of technology I can have both privacy and, at the same time, get the benefits of
[00:28:24] confidentially operating over my encrypted data. So I think completely new types of data that haven't been used before can be used. And also where I think, in this kind of example, federated learning and blockchain come into play is the ability
[00:28:54] to have better data, because there can be a real incentive structure behind the data. So it can be: you provide good encrypted data, and you can even generate, say, a zero-knowledge proof about the quality of your data set, for example, right?
[00:29:13] And then this training process can happen in a verifiable way, so everyone can be convinced: okay, this model has been trained on good data, and whoever provided the data is rewarded correspondingly. So I think that just really,
[00:29:32] yeah, there's more potential. I wouldn't say that in every case the models would be better, but I think there's just more potential, in that better data and more sensitive data can be used.
[00:29:48] Yeah, and I think it's really also about cross-referencing data, right? So a good example could be just the government in general. If you were to use
[00:30:06] tax data, social security data, all of that, cross-reference it, train models on that, you potentially would be able to take policies you want to create and predict the outcome of those policies on different, yeah, different parts of the country, I guess, right? So
[00:30:28] things like that are not possible today because the data can't be shared, but they could be realized if done securely and cryptographically with what we're building, for example. So it's just completely new kinds of cross-referencing and data connections being enabled.
[00:30:53] I like it. I like the idea, I like the concept of that, of thinking about what to implement and then, you know, using blockchain, multi-party computing, and federated learning on your platform to really
[00:31:13] predict what the outcome will be. I mean, I think that's very fascinating, and, you know, if we can do that, then what's possible? Exactly, right. And the beautiful thing is that, and that's really
[00:31:31] what's driving the team, and that's again where blockchain comes into play here: mainly because for MPC, and that's something that I wasn't able to talk about yet, we are implementing a new kind of cryptographic protocol combined with
[00:31:52] MPC that allows us to enforce execution and have censorship resistance, especially for those autonomous execution use cases, right, where this confidential computing environment replaces, sort of, your centralized server. So we are able to do that and cryptographically identify
[00:32:13] a party that is misbehaving, right? So in this multi-party computation we have this perfect trust assumption, but at the same time those MPC scientific papers with this dishonest-majority trust model have this practical limitation:
[00:32:33] a dishonest player can DDoS the computation, so they can cause the computation to abort by sharing wrong data, and then the computation doesn't produce output, which in the decentralized setting is not ideal if you want guaranteed operation over your encrypted state.
[00:32:52] So we were able to implement a protocol that allows the participants not only to identify that something wrong has happened, because that's always detected, but to pinpoint the player who misbehaved, so-called cheater identification. And with this protocol combined with,
[00:33:15] yeah, collateral staking and slashing mechanisms, we can enforce guaranteed correct execution in this network. So that, for one, is where blockchain comes into play for us, and we also use decentralization techniques and transparency techniques, so you can have this
[00:33:35] public ledger of the history of those computations: how they operated, with what other nodes they ran operations, their uptime, whether they ever cheated, whether they ever misbehaved. And so things like that just,
[00:33:53] that's what further reduces the trust assumption, so we can, yeah, have
[00:34:00] the trust assumption go down even further, right? So that's where blockchain really comes into play for us, and it's just this logical fit of those two kinds of technologies. And yeah, for us, really the driving force,
[00:34:19] to get back to our previous point, is that we realized fast, when playing around with the technology and reading the papers about what's possible to be done with decentralized confidential compute, that it allows us to have
[00:34:38] these kinds of insights that benefit our societies and humanity in general without having to risk the individual's freedom and privacy. I think that's so powerful, and that's what we are trying to achieve. I hope you achieve it. It sounds wonderful. Wow.
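The cheater-identification and slashing flow outlined above (detect a bad value, pinpoint the node that sent it, take its collateral) can be sketched with hash commitments. This is a heavily simplified illustration of the general idea, not Arcium's actual protocol; the node names and stake values are invented:

```python
import hashlib
import secrets

P = 2**61 - 1

def commit(value: int, salt: bytes) -> str:
    """Hash commitment to a share, published before shares are opened."""
    return hashlib.sha256(salt + value.to_bytes(16, "big")).hexdigest()

# Hypothetical nodes: each holds a share, a public commitment to it,
# and staked collateral.
nodes = {}
for name in ("node_a", "node_b", "node_c"):
    share_val = secrets.randbelow(P)
    salt = secrets.token_bytes(16)
    nodes[name] = {"share": share_val, "salt": salt,
                   "commitment": commit(share_val, salt), "stake": 100}

# node_b misbehaves: it broadcasts a wrong share during reconstruction,
# which would normally force the computation to abort.
broadcast = {name: st["share"] for name, st in nodes.items()}
broadcast["node_b"] = (broadcast["node_b"] + 1) % P

# Cheater identification: check every broadcast share against its
# commitment, pinpointing the misbehaving node rather than merely
# detecting that something went wrong.
cheaters = [name for name, val in broadcast.items()
            if commit(val, nodes[name]["salt"]) != nodes[name]["commitment"]]

# Slashing: the identified cheater loses its collateral, making
# misbehavior economically irrational.
for name in cheaters:
    nodes[name]["stake"] = 0

print(cheaters)   # ['node_b']
```

In a real protocol the commitments and openings would be woven into the MPC itself; the plain hash commitment here just stands in for that machinery.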
[00:35:03] Great, this industry keeps evolving every day, and I'm happy to be here to see it. So, I want to thank you very much for speaking with me today.
[00:35:16] I really enjoyed speaking with you, and I have one last question; it's really simple. How can people find more information about Arcium and about you, and how can they start using your platform?
[00:35:31] Yeah, first of all, it was a pleasure talking to you as well. So you can go to our website, arcium.com, importantly. And the name actually comes from the Latin plural form of arx, so Arcium Network, if translated from Latin, would mean the network of fortresses.
[00:35:59] And so you can learn more about our network at our website or on our Twitter at ArciumHQ, and right now we're in the phase of rolling out the private testnet.
[00:36:14] So we are testing the infrastructure, we're onboarding node operators and developers to test out the network, and then, step by step, we want to go through multiple iterations of the private testnet, move to public testnet, and then eventually to mainnet.
[00:36:33] So that's the stage we're currently at, and you can learn more about Arcium by just joining our community.
[00:36:41] We've seen incredible growth of our Discord server, for example, and every two weeks I'm doing a session of Arcium Declassified, as we call this series, where I talk about the technology, use cases, and updates on what we're doing. So that's a great place to learn more.
[00:37:03] Excellent, thank you very much for your time today. Alright, thank you so much for having me.


