IOHK research update with chief scientist Prof. Aggelos Kiayias.

Tim: So Aggelos, thank you very much for joining us on the show today. Let's start with a very straightforward question. We talk about cryptocurrencies, but you've been working in cryptography since the mid 90s. What is cryptography?

 

Aggelos: Cryptography, first of all, has been my personal passion. I started working on it when I was a young student in college, and discovering this area was very inspiring. Much less was known at that time compared to how popular the field is today; certainly things were very different. Then, in the mid 90s, the internet itself, although functional, was by no means as popular as it is today. It was something you would have access to if you were working in a lab or a university, and people would access it in very specific ways, through services like America Online and others. So it was a very different time.

Cryptography as a discipline, especially in its modern form, has of course been driven by the internet. The realization was that if we want to take advantage of the internet, if you want to put it that way, to its full capacity, we will have to develop ways for people to have secure and reliable communications and information exchanges over it, and cryptography would be an integral part of that. So modern cryptography is a scientific domain; it is mostly part of computer science, and it also touches on mathematics, electrical engineering and of course physics, but in the way it has developed today it would still be considered a field of computer science.

As a research area, I think we can trace it back to the pioneering work of a number of people in the 1970s, including Ralph Merkle, Adi Shamir, Ron Rivest, Whitfield Diffie and Martin Hellman. These are the pioneers of the 1970s who made major advances in cryptography, which really shaped the field as we know it today, and in fact a lot of the things they did are still very widely used.

As a scientific discipline, I think it basically arrived with the creation of the International Association for Cryptologic Research (IACR) in the early 1980s. That was another very interesting moment, because it basically said that cryptography should have its own home, separate from other areas in computer science, and in fact not directly associated with the big computing research associations of that time, the ACM and the IEEE. The IACR has basically been the home for cryptography for all the decades up to today, and those of us who did research in cryptography, myself included, aspired to present our work and communicate advances in cryptography in its venues.

This gives you a little bit of background on how cryptography came about as a scientific field. Now, what is cryptography? I think that's also an important thing to clarify. Cryptography, if I were to define it in a sentence, is the science that aims to organize digital systems so that trust is distributed across their components and the susceptibility of the systems to attack is minimized. It's a way of designing systems so that trust is distributed to their components in an optimal way, with respect to a class of attacks that we're particularly interested in mitigating.

And there are two important elements in that process. First of all, computational hardness plays a role. Computational hardness is, if you want, one of the most central concepts in computer science; it is also at the heart of the most important open question in theoretical computer science right now, the famous P versus NP question, which asks exactly this, whether these two important classes of problems in computer science coincide. Cryptography has a very interesting role in that context: it tries to shape computational hardness into a defense mechanism, and it tries to do that with respect to not just a handful of attacks, but with respect to large classes of attacks that are well defined in a computational model. So this gives a bird's eye view, if you will, of what cryptography as a scientific discipline is all about.

Tim: So there's a significant legacy in crypto, despite the relatively recent advent of cryptocurrencies. But where are we as an industry? We're at an early age, but evolving and maturing, and there's a similar maturation happening around the application of core research within crypto projects. What's your sense of the current state of thinking there?

Aggelos: Well, it's certainly better compared to a few years ago, when Input Output was basically the only company that was really doing it. I was very happy to see other projects taking a similar approach to the one that we took. But there still remain only a handful of projects that really engage in it. It's not very surprising; I think one reason is, first of all, that doing research in cryptocurrency, and more broadly in other areas that touch the cryptocurrency space, such as game theory and economics, is very hard. At the same time, the industry, so to speak, is not yet mature enough to properly value projects on that basis. So you can get by without doing a lot of research, focusing on marketing, or on some specific application domain that people are interested in at that particular time. You can still see projects that get a lot of mileage without necessarily investing a lot of effort in thinking, planning and designing what they do. But I think this is just a matter of time; compared to a few years ago the space has matured a lot, and this trend is definitely going to continue in the near future.

Tim: Tell us a little bit about the Input Output research approach, also in the context of the software development organization.

Aggelos: The research that Input Output does basically covers all areas related to distributed ledger technology, and in a first-principles way. I think it's important to focus on how this works, because it in itself needs to be appreciated. When you design a system in a first-principles way, you don't just produce a blog post or a design document that says what the system will look like; you try to articulate the system in the context of computer science research, and cryptography in particular. You have to present the system in the context of the research domain it belongs to, explain why the system has the benefits it claims to have, what the state of the art is, in what way the state of the art is lacking, and what exactly is the problem that the proposed system solves. In fact there are a lot of benefits to taking such a first-principles approach. While at first it may seem to take longer, there are many cases where it helped us arrive more quickly at certain desirable outcomes. I'll give you an example, because it is one that I've been thinking about recently. A lot of people now are talking about Ethereum and EIP-1559, which is seen as a major breakthrough for Ethereum. So what's the main feature that people are talking about? They're talking about fees, of course. Traditionally in Ethereum, following Bitcoin, the fees in a block go to the miner who issues the block. What happens with the EIP-1559 proposal is that there will be a base fee that is burned. The main difference is that when a transaction fee is burned, that potentially benefits all Ether holders, because it realizes a slight reduction in the existing supply of Ether.

Now, I haven't seen this discussed much, but when you look at Cardano, in fact Cardano has been doing this from the beginning, in a much more sophisticated way, and without burning. How does it work? The fees in Cardano don't go to the block producer alone; they are put into a reward pot, which is then equitably distributed to all the Ada holders who participate in staking. You can immediately see the parallel: the fees benefit everybody. But it is actually done in a much more sophisticated way, because the Cardano reward mechanism excludes those holders that don't contribute to the system, so holders that don't contribute to the security of the system will not benefit from a particular fee.
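
To make the contrast concrete, here is a minimal sketch, in Haskell, of the two fee-handling styles just described. It is purely illustrative: the function names, types and numbers are hypothetical simplifications, not Cardano's or Ethereum's actual ledger rules.

```haskell
-- Purely illustrative sketch of the two fee-handling styles discussed above.
-- Names, types and numbers are hypothetical simplifications.

import qualified Data.Map as Map

type Holder   = String
type Lovelace = Integer

-- EIP-1559 style: the base fee is destroyed, shrinking total supply and
-- thereby indirectly benefiting every holder.
burnBaseFee :: Lovelace -> Lovelace -> Lovelace
burnBaseFee totalSupply baseFee = totalSupply - baseFee

-- Cardano style (simplified): fees go into a reward pot that is split
-- pro rata among holders who actively participate in staking; holders
-- who do not participate receive nothing from it.
distributeFees :: Lovelace -> Map.Map Holder Lovelace -> Map.Map Holder Lovelace
distributeFees pot activeStake = Map.map share activeStake
  where
    total   = max 1 (sum (Map.elems activeStake))
    share s = pot * s `div` total

main :: IO ()
main = do
  print (burnBaseFee 1000000 500)                                          -- 999500
  print (distributeFees 500 (Map.fromList [("alice", 700), ("bob", 300)]))
  -- fromList [("alice",350),("bob",150)]
```

The point of the comparison is simply that burning shrinks the supply for everyone, while a reward pot can be targeted so that only holders who actively contribute to security share in the fees.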

So now you might wonder how we arrived at this. By the way, I should say this is something I haven't seen discussed much in this business; I certainly didn't see it being discussed in the context of Ethereum's development, despite the fact that this was a feature of Cardano from day one. You might ask yourself, "how did we come up with such a sophisticated mechanism?", one which we didn't particularly celebrate either. In fact it came from first principles. While we were designing the original Ouroboros protocol, and writing the paper itself, we were trying to prove some theorems, and some of these theorems were about an economic equilibrium in the protocol. When I was organizing the proof that the protocol is in equilibrium in a certain configuration, the only way to get the mathematical proof through was to adopt such an organization of the rewards, in the epoch-based way that Cardano does it. So here you see an example where, just because you're trying to achieve certain high-level properties that are important, the mathematics itself demands certain things of the design. Then, by following them, you arrive at this particular development. As I often tell my students, one has to listen to the math: if you are able to tell it what the problem is, and articulate the problem in the right language, it will tell you the solution, if you are able to listen.

Tim: How did the Input Output team grow and develop its approach and processes over the last two years?

Aggelos: The overall Input Output team has of course grown a lot over the last few years. The overall focus, especially in research, has fundamentally not changed: we still strive for high-impact, top-quality research results, which are widely disseminated in the top venues for research in the areas of distributed ledgers, cryptography and financial technologies. Now, maybe I should say a couple of words, just to remind all the listeners, about how Input Output research operates. We have a three-pronged approach. First, we have our in-house researchers, who are full-time employees of Input Output. Second, we have embedded research, which is people working in university labs funded by IOHK. And finally we have external collaborations, external research projects, which is basically outsourcing specific research problems to partners under contracts that are very specific in terms of timing and deliverables. In this three-pronged approach, everything comes together and delivers the research that you have seen the company producing over the years. I also want to emphasize a little bit the research fellow position within Input Output. These are extremely selective posts: someone already has to have published for several years after their PhD, in the top venues for their respective area, to be considered for a research fellow position. I have to say that I am truly blessed to be able to work with some really amazing people who serve as research fellows at Input Output.

Tim: You and your team of course have delivered and contributed a significant volume of research, I'll provide a link so everyone can dive into that. Perhaps you can highlight a few pieces of recent research that you would like people to understand and know more about.

Aggelos: In the last few months we've either published or had accepted for publication a few papers that I'm very excited about, so I'll tell you a little bit about them. The first one is Hydra, which appeared in Financial Cryptography and Data Security 2021 earlier this year. This is just the first installment of the Hydra protocol package. Specifically, what we released is what we call the head protocol. This covers the case where you have a second-layer protocol that is maintained by a set of entities that are online and are interested in advancing the state, or part of the state, of the main chain protocol extremely fast, up to the physical limits of the underlying network. A unique feature of Hydra is that it accomplishes this in an isomorphic sense, isomorphic being a Greek word meaning "equal form". The key point behind this concept is that the scripting language that is available on the second layer is identical to the one that runs on the main chain. That means that if someone wants to run a distributed application on top of this second layer, they don't have to make any specific or special adjustments to have their smart contracts operate on Hydra. This distinguishes this second-layer approach from others, for example Lightning, which is a second-layer solution on top of Bitcoin that does not respect the underlying scripting language, so an application running on top of Lightning has to account for the way Lightning works. There is more coming regarding Hydra: we are now working on the tail protocol, which will itself come in a couple of installments. This covers the other use case, a second-layer configuration that does not have the same online requirements for all participants. In particular, what we're focusing on is a configuration where one participant is highly available while the remaining participants might not be, and might connect to the system only intermittently. So more updates on these new installments of the Hydra protocol, covering these other use cases, will be coming very soon, as we finalize the first versions of that protocol, focusing on payments.
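
As a rough illustration of what "isomorphic" buys a developer, here is a hedged sketch, assuming hypothetical types and a toy spending condition (this is not the actual Cardano or Hydra API): the same validator function is reused unchanged both on the main chain and inside a head.

```haskell
-- Minimal sketch of the "isomorphic" idea: the very same validator script is
-- used to check transactions on layer 1 and inside a Hydra head. Types and
-- names are hypothetical simplifications, not the real Cardano/Hydra API.

type Datum    = Integer
type Redeemer = Integer

-- One script, written once by the application developer.
validator :: Datum -> Redeemer -> Bool
validator datum redeemer = redeemer == datum + 1   -- toy spending condition

-- Layer 1 applies the script when the transaction settles on the main chain...
validateOnChain :: Datum -> Redeemer -> Bool
validateOnChain = validator

-- ...and a head applies the *same* script off-chain, so the contract needs no
-- adjustment to run inside Hydra (unlike, say, Lightning on top of Bitcoin).
validateInHead :: Datum -> Redeemer -> Bool
validateInHead = validator

main :: IO ()
main = print (validateOnChain 41 42, validateInHead 41 42)   -- (True,True)
```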

Another paper I should mention, which I'm very excited about, is Ouroboros Chronos, which was accepted to Eurocrypt 2021 and was actually scheduled to be presented last month. However, this year Eurocrypt was moved a little later in the year, to October, due to the pandemic. The hope is that by October it will be more feasible for people to travel, so the conference could be in person, or at least partially in person and partially online; this has yet to be determined. In any case this is a very exciting outcome, not only because Eurocrypt is one of the top two conferences in cryptologic research, but also because the result itself demonstrates something that, when we started working on these questions many years ago, seemed exceptionally difficult. The question is whether, in order to build a proof-of-stake blockchain protocol, you need the participants to already agree on a common notion of time. This was something that seemed essential, and in fact it was assumed in all previous proof-of-stake blockchain protocols. With Ouroboros Chronos, however, we proved that this is not necessary. The Ouroboros Chronos protocol in fact demonstrates how it is possible, relying only on the participants themselves, even under dynamic participation, to build a global notion of time and export it, not only to the participants of the protocol, but also to any application that is running on top of the ledger. That means that any external dependency and reliance on a global clock is removed; in this sense the protocol becomes its own timekeeper.
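
To give a flavor of what "the protocol becomes its own timekeeper" means, here is a deliberately toy sketch, emphatically not the actual Chronos algorithm: a party compares its local clock with the times implied by other participants' recent messages and adjusts towards a robust middle value, so that no external clock service is needed.

```haskell
-- Toy illustration of a protocol-internal clock (NOT the actual Ouroboros
-- Chronos construction): each party nudges its local clock toward the median
-- of the offsets it observes in other participants' recent beacons/blocks.

import Data.List (sort)

-- Differences between my local clock and the times implied by others' beacons
-- (hypothetical sample values).
observedOffsets :: [Integer]
observedOffsets = [3, -2, 5, 0, 4]

-- Use the median so that a minority of wildly wrong (or adversarial) reports
-- cannot drag the adjustment arbitrarily far.
medianOffset :: [Integer] -> Integer
medianOffset xs = sort xs !! (length xs `div` 2)

adjustClock :: Integer -> [Integer] -> Integer
adjustClock localTime offsets = localTime + medianOffset offsets

main :: IO ()
main = print (adjustClock 1000 observedOffsets)   -- 1003
```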

Tim: Those are the recently published papers, what other things is the team working on that you feel are particularly noteworthy?

Aggelos: Of course I mentioned the upcoming work on the Hydra tail protocol, which continues very intensively over the coming months. Something else that I'm very excited about, work we've been doing over the last few months and that will be intensifying over the next few months and perhaps the remainder of the year, is on the concept of transaction fees. This thread of research started earlier this year with a blog post I wrote about babel fees. This is the concept that allows users not only to transact with user-defined tokens, but also to pay transaction fees using such tokens. So what happened there, in terms of process, is that the blog post first presented the idea and we collected some feedback that way. We then worked on the paper, and I'm very happy to say that it is now online as a technical report and is also in the process of being submitted for academic peer review. Very soon we will also go to the stage of looking at a prototype for that system.
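
Here is a hedged sketch of the babel fees idea as described above, with entirely hypothetical names, token identifiers and exchange rates (the actual design is in the technical report): a transaction offers a native-token payment in place of the Ada fee, and a block producer that values that token accepts the transaction and covers the fee.

```haskell
-- Hedged, simplified sketch of the babel fees idea (illustrative only).
-- Token names, amounts and exchange rates below are hypothetical.

type Token = String

-- A transaction that owes `feeDue` lovelace but offers `offerAmount`
-- units of some native token instead of Ada.
data BabelTx = BabelTx
  { feeDue      :: Integer
  , offerToken  :: Token
  , offerAmount :: Integer
  }

-- A block producer's personal view of how many lovelace one unit of a
-- token is worth (Nothing = token not accepted by this producer).
type Quotes = Token -> Maybe Integer

-- The producer accepts the transaction if the offered tokens, at its own
-- quoted rate, cover the Ada fee it pays on the user's behalf.
accepts :: Quotes -> BabelTx -> Bool
accepts quote tx =
  case quote (offerToken tx) of
    Nothing   -> False
    Just rate -> offerAmount tx * rate >= feeDue tx

main :: IO ()
main = do
  let quote "TOKA" = Just 2        -- this producer values 1 TOKA at 2 lovelace
      quote _      = Nothing
  print (accepts quote (BabelTx 100 "TOKA" 60))    -- True:  60 * 2 >= 100
  print (accepts quote (BabelTx 100 "TOKB" 500))   -- False: token not accepted
```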

Following that, and complementary to that work, is something that we have been looking at for some time now, but will be intensifying during the summer and the rest of the year: the concept of stable fees. Stable fees address the problem of having predictable and fair transaction costs for the entire user community in a system like Cardano. The fee problem is a crucial one, widely recognized as essential for a blockchain system. What we want to achieve with this research is, first of all, predictability. In other words, if you're a heavy issuer of transactions, you should be able to plan ahead, depending on your transaction needs, and have resources that are adequate for you to engage with the system. We don't want, and I should stress this, even at layer one, transaction issuers to have surprises as to what they might encounter when they're engaging with the system.

At the same time, the fees should be fair and consistent with the service and the effort that the system requires to maintain itself. In other words, it should not be possible for fees to be manipulated, or to be set at values that are not consistent with the utility that the system offers. These are fascinating and genuinely difficult questions, and I'm very excited to be leading the team towards making some very interesting advances in this direction, something I offered a glimpse of in a recent post on stable fees that came out about a week ago. Of course this is just one of many active research threads at Input Output; there are many more, but going through them all is perhaps a much longer discussion, and you will see many of them coming out. Perhaps to select just one: there is our work on Voltaire and the governance system, which requires very interesting research advances, both in terms of game theory and in terms of the cryptographic mechanisms that support voting.

Tim: Aggelos, of course the area of rewards, fees, engagement, these are all hot topics within the community, of course you and your team have been heavily involved in this, do you have any updates in this area?

Aggelos: In fact in the last few months we did a deep dive; we investigated how the incentive mechanism has been absorbed by the community and how people are relating to it. There are very interesting findings that will soon be shared, as well as slight adjustments to the system that improve the way the mechanism operates. One has to realize, Tim, that this is a difficult problem. Real decentralization and security is not just counting the number of validators or the size of the market capitalization; we are in this for something real, we want a decentralized, resilient system. Given the challenge, and given the analysis we did of the system, I'm quite happy with where it currently stands: there are a lot of truly independent pools, and there are a lot of meaningful discussions in the community about how staking should work. But there are also things we would like to tweak and make better.

At the same time, Ada's popularity and expanding user base mean that many new people have had little opportunity to become familiar with the mechanism. One thing that was actually interesting to observe is how suboptimal the choices people make about delegation are. There are also many suboptimal pool offerings, but suboptimal delegation is something we observed extensively. By suboptimal I mean people not making the best decision to maximize their rewards. And in fact this generates a lot of problems in the system. To explain: at the theoretical equilibrium, with any set of pools that exist, there is an expected rate of return in Ada; you can think of this as an ROI or ROA metric. Many delegators accept much worse returns on the Ada they delegate to a pool. So, if you want to think of it this way, they don't play the staking game optimally to maximize their returns. And this gives some pool operators a lot of opportunity to take advantage of it. What we see, for example, as a strategy that exploits this, is operators creating multiple pools. Creating multiple pools only gives marginal benefits when delegators play optimally, that is, when the whole system is close to equilibrium; but when the system is not close to equilibrium, this behavior can be quite profitable for those who engage in it. And this is bad for the system: multi-pool operators crowd out small SPOs and leave them with no chance to contribute to the system. Interestingly, in the typical case, they also deprive their own delegators of some rewards. So raising awareness of these issues and creating a more engaged delegator community is something we will continue to work on.
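
A toy model of the delegation decision may help make "suboptimal" concrete. The sketch below is a deliberate simplification, not Cardano's actual reward formula; the saturation point, costs and margins are hypothetical. It just shows that stake above a pool's saturation point earns nothing extra, so delegating to an oversaturated pool dilutes the return per delegated Ada.

```haskell
-- Toy model of the delegation decision (not Cardano's actual reward formula).
-- Saturation point, costs and margins below are hypothetical.

data Pool = Pool
  { name      :: String
  , poolStake :: Double   -- total stake currently delegated to the pool
  , fixedCost :: Double   -- operator's fixed cost per epoch
  , margin    :: Double   -- operator's margin, between 0.0 and 1.0
  }

saturationPoint :: Double
saturationPoint = 1000000

-- Expected reward per unit of delegated stake (a rough ROA proxy):
-- stake above the saturation point earns nothing extra, so it only
-- dilutes what every delegator in the pool receives.
returnPerAda :: Double -> Pool -> Double
returnPerAda epochPot p = delegatorShare / poolStake p
  where
    capped         = min (poolStake p) saturationPoint
    grossReward    = epochPot * capped / saturationPoint
    afterCost      = max 0 (grossReward - fixedCost p)
    delegatorShare = afterCost * (1 - margin p)

-- An "optimal" delegator simply picks the pool with the best expected return.
bestPool :: Double -> [Pool] -> Pool
bestPool epochPot =
  foldr1 (\a b -> if returnPerAda epochPot a >= returnPerAda epochPot b then a else b)

main :: IO ()
main = do
  let pools = [ Pool "small-independent" 400000  340 0.02
              , Pool "oversaturated"     1500000 340 0.01 ]
  putStrLn (name (bestPool 50000 pools))   -- "small-independent"
```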

What the community should expect is that there will soon be some immediate adjustments to the way people engage with the mechanism, to give them a better experience and a better understanding of how the system works. A little further down the line we will also be making careful and surgical adjustments to the reward mechanism itself, so that the efficiency of the whole system, from an economic perspective, is maximized.

Tim: Another topic that people might have heard something about is the Mithril thin client project, maybe you can tell us a little bit about that, and why it matters.

Aggelos: Yes, Mithril is an important set of cryptographic techniques that we're investigating now, centered around building thin clients. From a cryptographic perspective, to start with that, we're focusing on the primitive of succinct non-interactive arguments of knowledge; people in the broader cryptographic community sometimes talk about these as SNARKs. A SNARK is a cryptographic primitive that allows you to create a very short proof of a certain assertion. The main problem we want to solve is how to create a very short certificate that the majority of the holder population stands behind. The benefit of such a mechanism is that you can allow the whole community of holders, for example in Cardano, to create a very short assertion and say "this is the current state of the system". Furthermore, you can even create a chain of such statements: you can create a statement saying "at the present moment, this is the current community of holders, it stands behind this message, and this is the current state of the system", and then you can present the next statement, and the next one. These assertions are very short, so the result of this process is that you end up with a certified system state which is completely trustless, because the whole community is behind it and you don't have to trust anybody, and yet you can validate it extremely quickly. The benefit is that somebody can connect to the blockchain much faster compared to what happens now. We've already completed the first round of research on these techniques; there is a technical report coming out, there is a paper that was submitted for peer review, and what the community can expect very soon is to start seeing a prototype being tested, as well as the system being deployed as part of Cardano.
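
To make the goal concrete, here is a hedged toy sketch, not the actual Mithril construction: a statement counts as certified when the holders who signed it control at least a threshold fraction of the total stake. The real system additionally aggregates the individual signatures into one short proof, which is omitted here.

```haskell
-- Hedged toy sketch of the certificate goal described above (not the actual
-- Mithril protocol): a statement about the chain state is treated as certified
-- if the holders who signed it control at least a threshold of the total stake.

import qualified Data.Map as Map
import qualified Data.Set as Set

type Holder = String
type Stake  = Integer

certified :: Double                  -- required fraction of stake, e.g. 0.51
          -> Map.Map Holder Stake    -- stake distribution
          -> Set.Set Holder          -- holders who signed the statement
          -> Bool
certified threshold stakeDist signers =
    fromIntegral signedStake >= threshold * fromIntegral totalStake
  where
    totalStake  = sum (Map.elems stakeDist)
    signedStake = sum [ s | (h, s) <- Map.toList stakeDist, h `Set.member` signers ]

main :: IO ()
main = do
  let stake   = Map.fromList [("alice", 60), ("bob", 30), ("carol", 10)]
      signers = Set.fromList ["alice", "bob"]
  print (certified 0.51 stake signers)   -- True: 90 of 100 stake behind the statement
```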

Tim: Aggelos, of course we are very busy in terms of the deployment and the road to Alonzo right now, what does your summer look like?

Aggelos: This is a very busy summer for research as well; there are a lot of research threads that we are actively pursuing. Maybe I'll just say a few words about one, which relates to a particular part of Ouroboros Omega. First, just a word about Omega: it is the project that takes all the research we've done on Ouroboros over the last few years and converges it into a single protocol, which will be a layer-one, highly scalable, highly efficient protocol that can be used in the context of Cardano. Within that context, what we're looking at relates to security under network partitions.

A couple of words about the motivation for looking at this issue. If you look at consensus from first principles, as a problem it has two properties: consistency and liveness. Ideally you want to have both together; obviously it doesn't make sense to have one without the other. But when the system is pushed to extreme conditions, at least one of them has to fail; it is impossible to maintain both consistency and liveness. When you design a consensus protocol you have to decide, or at least that's the conventional wisdom, which one you're going to give up. For example, Bitcoin favors liveness, and the original Ouroboros protocol also favored liveness, while other, classical consensus protocols favor consistency. Since mathematically you cannot have both, you could say "okay, that's the end of the story". But there's a very interesting point here: granted, one of the properties has to go if you're in extreme conditions, but can this be a decision that's driven by the user rather than by the protocol itself? This could be quite an exciting capability, because depending on the threat model that a given user prefers, they can choose which property to give up in times of system disaster. So depending on the particular use case that a certain user of the protocol is interested in, they may choose to sacrifice a different property: some users may choose to sacrifice consistency, and others may choose to sacrifice liveness. This is very natural and makes the system much more flexible compared to a consensus protocol that by default chooses one of the two properties for everyone. So I'm very excited to go deeper on this research thread over the summer, and in fact I'm hoping that by the end of the summer we'll have this done. It will be one of the major milestones in our development of Ouroboros Omega, which I would consider one of the jewels of our research department over all these years: finally delivering a proof-of-stake protocol that basically solves every problem we could have imagined when we started this journey towards understanding these kinds of protocols.
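
As a rough illustration of the user-driven choice being described, here is a hedged toy sketch, not an actual Ouroboros Omega mechanism, with hypothetical types and thresholds: each client picks its own policy for what to do when the network appears partitioned, instead of the protocol hard-wiring one choice for everybody.

```haskell
-- Hedged toy sketch of a user-chosen consistency/liveness preference
-- (not an actual Ouroboros Omega mechanism; types and thresholds are hypothetical).

data Preference = FavourConsistency | FavourLiveness

-- A very simplified view of what a client observes about the chain.
data ChainView = ChainView
  { suspectedPartition :: Bool  -- e.g. block production far below expectation
  , confirmations      :: Int   -- depth of the transaction in the local chain
  }

-- Should this client treat a transaction as settled right now?
settled :: Preference -> ChainView -> Bool
settled FavourLiveness    v = confirmations v >= 6   -- keep settling even during a suspected partition
settled FavourConsistency v = not (suspectedPartition v) && confirmations v >= 6

main :: IO ()
main = do
  let duringPartition = ChainView True 10
  print (settled FavourLiveness    duringPartition)   -- True
  print (settled FavourConsistency duringPartition)   -- False
```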

Tim: Aggelos, thank you very much.

Aggelos: Thank you Tim.

 
