Fairness in Blockchain: An academic insight - Ask me anything

Welcome to our AMA on Klaus' recently published research paper on Fairness :innocent:

Klaus' paper examines the concept of Fairness in blockchain-based trading systems and whether they can guarantee that order book access is treated transparently and fairly, i.e., handling attempts at frontrunning, rushing, denial of service, or other attempts to manipulate the sequencing of trading instructions.

His research explores adding a module to blockchains that supports the concept of 'relative fairness', so that competing transactions are sequenced under a known and understood protocol rather than at a validator's discretion. This keeps transaction speed fast enough whilst maintaining a good approximation of fairness.

Have a quick read, or check out Rebecca's summary, then let's pick Klaus' big brain apart, 1 question at a time! :brain:

The AMA will go live on Tuesday 7th July 2020, 17.00-18.00 BST

10 Likes

Klaus' Dystopian Labs talk is a pretty good 101 on the paper too … Here is the recording

3 Likes

Hi Klaus,

I was wondering whether you've done any modelling / simulations for "Probabilistic Relative Block Fairness" (and for the more advanced definitions / protocols) of how many requests would end up scheduled unfairly under reasonable assumptions on, say, latency, implying a relatively low r_max.

Cheers,
David

2 Likes

Hello and welcome to our AMA on @klaus' 'Fairness' paper!

This AMA will last from now until ~6pm BST. Klaus will be present and answering your questions when and as they come. I will make an announcement 5 mins before the end as we begin to wrap up the discussion!

I'm looking forward to seeing what everyone comes up with and how Klaus meets any challenges to his research!

Without further ado, let's begin ✨

2 Likes

Thanks Daigan, hoping for a heap of questions!

3 Likes

Hi David,
there are no simulations so far; that would be a good next step. The results I would expect depend considerably on the network architecture: if there is high variance in message delivery times and a lot of messages, the probability of unfair schedules under the probabilistic protocol goes up. The other open question (which is hard to model) is the level of network control an attacker has. From what I see now, an attacker would need a fair amount of network control, but not an impossible level. On a well-behaved network with low-ability attackers, I would say the probability that the probabilistic protocol creates some unfairness is very low, so it would be acceptable for some markets (also, not every unfairness creates a monetary loss). It would be good, though, if we could quantify this to give market makers more guidance on which fairness version they need for a particular market.
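To make the simulation idea concrete, here is a minimal Monte Carlo sketch of the kind of experiment discussed above. The model is entirely my assumption, not the paper's protocol: two transactions sent a fixed gap apart, independent normally distributed network delays per validator, and "potentially unfair" meaning a majority of validators receive the later transaction first.

```python
import random

def unfair_probability(n_validators=10, gap=0.01, delay_sd=0.05,
                       trials=10_000, seed=42):
    """Estimate how often a majority of validators see transaction b
    before transaction a, even though a was sent `gap` seconds earlier.
    Delays are modelled as independent Gaussian noise (an assumption)."""
    rng = random.Random(seed)
    unfair = 0
    for _ in range(trials):
        # arrival time at each validator = send time + random delay
        a_first = sum(
            1 for _ in range(n_validators)
            if 0.0 + rng.gauss(0, delay_sd) < gap + rng.gauss(0, delay_sd)
        )
        if a_first <= n_validators // 2:  # majority saw b first
            unfair += 1
    return unfair / trials

# Higher delay variance relative to the send gap means more potential unfairness.
print(unfair_probability(delay_sd=0.001))
print(unfair_probability(delay_sd=0.1))
```

With tight delays the estimate is near zero, matching the intuition above that a well-behaved network rarely produces unfair schedules; as the delay variance grows past the send gap, the estimate rises sharply.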

1 Like

Hi there, what's the worst example of order book manipulation that you've come across, and would your proposal have mitigated this?

2 Likes

Hi Klaus,
I was curious how you account for the increase in latency necessary to ensure fairness? This is especially key for high-frequency trading systems, no?
Best,
Seb

1 Like

Hey Klaus, I don't understand why fairness is so hard … what are the key problems to getting it right?

1 Like

Hi Sebastian,
there are two places where latency comes in. One is that the protocol adds a message round trip to collect votes from the validators (plus some additional time until enough validators are aware of the transaction, which is up to another round trip depending on the architecture). The other factor is that some messages cannot be scheduled right away, because the protocol does not yet have enough information to determine whether it can do so fairly. This added latency depends on the fairness definition we use (for example, the timed one has a very low delay, while the probabilistic one depends on the maximum block size one wants to allow) and on how badly behaved the network is (another point where simulations would help). We mitigate this in the protocols by having a breakpoint where something happens one way or the other, which is configurable by the system, so there is a configurable upper bound on message delay.
One way we reduce the impact is that fairness is only required for messages that affect the same market, i.e., some markets can choose that their transactions don't need to be fair (those are then completely unaffected), and messages of different markets don't need to wait for each other (this also means every market maker can set their own parameters on maximum latency).
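The per-market scoping can be illustrated with a small sketch. This is not Vega's or the paper's API, just an assumed shape: transactions are partitioned by market, only fairness-enabled markets buffer their transactions for fair ordering, and everything else is eligible for immediate scheduling.

```python
from collections import defaultdict

def partition_by_market(transactions, fair_markets):
    """Group transactions by market. Only transactions in the same
    fairness-enabled market compete for fair ordering; all others can
    be scheduled immediately. (Illustrative sketch only.)"""
    needs_fair_order = defaultdict(list)
    schedule_now = []
    for tx in transactions:
        if tx["market"] in fair_markets:
            needs_fair_order[tx["market"]].append(tx)
        else:
            schedule_now.append(tx)
    return needs_fair_order, schedule_now

txs = [
    {"id": 1, "market": "BTC/USD"},
    {"id": 2, "market": "ETH/USD"},
    {"id": 3, "market": "BTC/USD"},
]
fair, fast = partition_by_market(txs, fair_markets={"BTC/USD"})
print(fair)  # only the two BTC/USD orders must wait for each other
print(fast)  # the ETH/USD order is unaffected
```

The point of the design choice is visible here: added latency is paid only within a market that opted into fairness, never across markets.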

2 Likes

So since things can't be provably fair, is provably consistent unfairness (relative fairness) enough?

Is this a proof of concept or is it ready to be implemented? If not, when/do you see it being adopted?

1 Like

What kinds of unfairness, in a system like Vega, would not create monetary loss?

1 Like

Hi Rebecca,
life just ain't fair :slight_smile:

The hardness comes from two factors. One is that it's quite difficult to find a good definition of what fair actually means in the first place (which is why we have three to choose from, and are trying to add more in discussions with traders). The second is that some of the more intuitive definitions of fairness can be proven to be impossible. The first one I came up with (if all good validators see a before b, then a must end up before b) cannot be enforced, because in this case the fair schedule depends on who is good and who isn't, which is generally not known to the system. So, like many things on the consensus layer, this is a wild ride circumventing impossibilities and finding the right place to compromise.
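A classic illustration of why intuitive order-fairness definitions break down (this example is not from the paper; it is the Condorcet-style cycle known from the order-fairness literature) is that majority-based pairwise preferences need not admit any total order at all:

```python
from itertools import combinations, permutations

# Three validators' local receive orders for transactions a, b, c.
observations = [("a", "b", "c"), ("b", "c", "a"), ("c", "a", "b")]

def majority_constraints(observations):
    """Pairwise constraints: x must precede y if a majority saw x before y."""
    constraints = set()
    for x, y in combinations("abc", 2):
        x_first = sum(o.index(x) < o.index(y) for o in observations)
        if x_first > len(observations) / 2:
            constraints.add((x, y))
        elif x_first < len(observations) / 2:
            constraints.add((y, x))
    return constraints

def satisfiable(constraints):
    """Does any total order of a, b, c respect every pairwise constraint?"""
    return any(
        all(order.index(x) < order.index(y) for x, y in constraints)
        for order in permutations("abc")
    )

cons = majority_constraints(observations)
print(cons)               # a before b, b before c, c before a: a cycle
print(satisfiable(cons))  # False
```

Every pair has a 2-of-3 majority, yet the three constraints form a cycle, so no schedule can honour all of them; this is the flavour of impossibility that motivates settling for relative fairness.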

1 Like

Hi Caocao,

I think the worst thing I've seen was some game on Ethereum where the last person to bid before some timer expired would get a large pool of money, and someone did go and spend all the gas necessary to lock out all the other bidders in the end (don't nail me on the details, I would need to look up the references again). This would definitely have been avoided. I don't have numbers for order book frauds; anyone else can jump in here?

1 Like

Hi Ti8*,
it's in between. The protocol is good enough that it could be implemented now (i.e., if we use the optimised protocol that's the last one in the paper, I don't expect things to get substantially more performant, and it is already optimised to integrate into a blockchain with minimal integration pain). There are a couple of smaller things we've come up with since the writeup, though, so what will be implemented in the end is likely a version 2 of the protocol.

1 Like

Hi Candia,

there are probably more, but there are definitely orders I can make that don't compete with yours (e.g., if you're selling while I'm buying, and I'm offering less than you are asking). In those cases, if I unfairly get in front of you, it shouldn't affect you at all.
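The non-competing case reduces to a simple price check. A minimal sketch (prices only, ignoring quantities, time priority, and everything else a real matching engine handles):

```python
def crosses(buy_price, sell_price):
    """A buy and a sell only compete (can trade) if the buy price
    meets or exceeds the sell price. Simplified price check only."""
    return buy_price >= sell_price

# You are asking 105; my bid of 100 is below your ask, so our orders
# never match and their relative ordering cannot affect you.
print(crosses(buy_price=100, sell_price=105))  # False
# A bid of 110 would cross your ask, so here ordering does matter.
print(crosses(buy_price=110, sell_price=105))  # True
```

Unfair sequencing between two orders that never cross causes no monetary loss, which is the point made above.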

1 Like

Hey @klaus,

I have a question. Do you see any other projects one day picking up 'Wendy the Fairness Widget', or will this be Vega exclusive?

Thanks,
Daigan.

1 Like

@klaus Is there any consideration for punishing/removing adversarial validators from the wendy-set, or is that left to the corresponding consensus layer?

1 Like

Hi Klaus,

Would it be possible to design a protocol that reshuffles the order of transactions in the block (so the consensus layer would effectively only determine the constituents of the block, but not their order) in a way that's not a priori determinable by participants? Would this require an oracle to provide the entropy, or can it be somehow internal to the protocol? Also, would such an approach fit into the fairness taxonomy that you've put forward in your paper?

I do realise that random block ordering may not be a desirable characteristic (especially in financial applications); it's more of an abstract question.

Thank you in advance for your reply,
Witold

1 Like