VNP-018 Remove Liquidity SLA bond penalty

I have been thinking about the current penalties for Liquidity Providers that fail to meet the requirements of their liquidity commitment. Currently, when an LP fails to fulfil their commitment by quoting adequately for less than the commitmentMinTimeFraction of the epoch, they lose all of their allocated fee revenue for that epoch and are also subject to bond slashing, with the amount depending on how far below the commitmentMinTimeFraction they were.

I think it is worth considering temporarily removing the bond penalty so that LPs are not slashed if they fail to meet the time-based SLA. This can be done by setting the network parameter market.liquidity.sla.nonPerformanceBondPenaltyMax to 0.

Proposal JSON:

{
  "rationale": {
    "title": "VNP-018 Change market.liquidity.sla.nonPerformanceBondPenaltyMax from 0.05 to 0.",
    "description": "Change `market.liquidity.sla.nonPerformanceBondPenaltyMax` from `0.05` to `0` as set out in [VNP-018](https://community.vega.xyz/t/vnp-018-remove-liquidity-sla-bond-penalty/4367)."
  },
  "terms": {
    "updateNetworkParameter": {
      "changes": {
        "key": "market.liquidity.sla.nonPerformanceBondPenaltyMax",
        "value": "0"
      }
    },
    "closingTimestamp": 1700073000,
    "enactmentTimestamp": 1700073000
  }
}

Some of the potential arguments in support of this change:

  • The protocol does not yet have strong liquidity and needs to attract Liquidity Providers; the risk of slashing will reduce the appeal of committing liquidity to markets on Vega.
  • Vega is a new technology currently in Alpha, and it is conceivable that issues such as bugs, data node outages, or chain halts may arise that, in combination with bond slashing penalties, could lead to unnecessary loss of funds for Market Makers.
  • The time investment required for Market Makers to understand and integrate with Vega is not insignificant and is already a barrier to entry into the Vega ecosystem; we shouldn’t introduce further barriers at this early stage that will dissuade Market Makers from participating.
  • There are already market-level fee penalties, configurable for each market, that can be used to apply strict penalties to Liquidity Providers without putting their existing capital at risk. These can also be tuned to create a highly competitive environment between LPs, whereby a well-performing LP can earn an outsized fee income relative to lesser-performing LPs.

Some arguments against this change:

  • A lack of bond slashing might result in Liquidity Providers frequently failing to fulfil the requirements of the SLA, which could negatively impact the quality of liquidity on markets on Vega.
  • There is an ongoing proposal to increase market.liquidity.stakeToCcyVolume from 1 to 20. If this passes, LPs will need significantly less (95% less) capital in their bond, so there will be less capital at risk of slashing and bond penalties will naturally be much less severe relative to LPs’ deployed capital (see the short worked sketch after this list).
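To put the capital reduction in that last point in rough numbers, here is a quick back-of-the-envelope sketch. It assumes the sided obligation is simply commitment * stakeToCcyVolume (as also described later in this thread), which is an approximation rather than the full protocol definition:

# Rough arithmetic behind the "95% less" figure above: under this assumption,
# the bond needed to support a fixed target obligation scales as 1 / stakeToCcyVolume.

target_obligation = 250_000  # USDT of sided volume the LP wants to support (illustrative)

for stake_to_ccy_volume in (1, 20):
    required_bond = target_obligation / stake_to_ccy_volume
    print(f"stakeToCcyVolume = {stake_to_ccy_volume}: ~{required_bond:,.0f} USDT bond at risk")

# 250,000 USDT -> 12,500 USDT, i.e. 95% less capital exposed to slashing.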

Overall I am in favour of removing the bond penalty for now. I think the pros outweigh the cons and having some good liquidity on Vega is better than having none.

Ultimately I have started this discussion to gauge the sentiment within the community and I welcome feedback and viewpoints from all participants.

Regards

Ed


I think I agree with this, but would prefer to wait until VNP 16 has passed.

Also if this bond penalty is set to 0, is there much of a reason to require a bond at all?

In the future, I think the bond will be useful if we can employ a flexible market listing program where MMs can quickly launch a market by putting up a bond.


Jubi,

Thanks for your response.

It is important to note that there is still a bond penalty in the form of an early exit penalty, applied when an LP removes their commitment and doing so brings the market’s total stake below the target stake. So the bond still serves to penalise LPs that pull their liquidity at a time when the market has high demand for it.
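For anyone following along, below is a rough sketch of how I understand that early-exit penalty to behave. The parameter name and the exact mechanics are my own assumptions for illustration, not something confirmed here:

# Non-authoritative sketch: assumes only the slice of the released bond that
# takes total market stake below the target stake is penalised, at a rate given
# by a parameter such as market.liquidity.earlyExitPenalty (assumed name/semantics).

def early_exit_penalty(released: float, total_stake_before: float,
                       target_stake: float, penalty_rate: float) -> float:
    """Bond forfeited when an LP releases `released` of its commitment."""
    stake_after = total_stake_before - released
    # Portion of the release that pushes the market below its target stake.
    shortfall = max(0.0, min(released, target_stake - stake_after))
    return penalty_rate * shortfall

# Example: releasing 50k from a market with 120k total stake and a 100k target stake.
print(early_exit_penalty(50_000, 120_000, 100_000, penalty_rate=0.05))  # 1500.0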

However, I think it makes sense to bring back the non-performance bond penalty at a later date when markets on Vega actually have some decent liquidity and more MMs are integrated and confident in the system.

I think this is worth considering.

Even with 0 bond penalty, an LP that doesn’t meet the SLA will be losing all LP fees. So some motivation for meeting the SLA is still there.

What would you think of significantly reducing it, say to 0.001 instead of setting it to 0? Then you would lose 1 USDT out of a bond of 1000 USDT per epoch in which the SLA hasn’t been met.

Yes I think this is fine and would serve the same purpose.

I like to adopt the ethos of not trying to solve problems that don’t yet exist.

I understand the problem that bond slashing attempts to solve, namely LPs abandoning a market. That said, I think it is likely to be very rare for LPs to do that, as there will always be an arbitrage incentive to close out open interest, even if the prices offered by LPs are not good.

So, generally, I am in support of this until some riskier markets exist where sufficient anxiety about LPs abandoning them is warranted.

On the other hand, there’s an open proposal / discussion about reducing the bond requirement by a factor of 20x, which gets us 95% of the way there anyhow. So with that in mind, do we need to do both right now?

Hi David,

Yes, I think the reduction in bond amount takes precedence and might be enough on its own to address the aims of this proposal.

Update:

Over the past two days, the two largest liquidity providers have both been slashed for failing to meet their liquidity obligations as set out by the protocol, despite the fact that they quote continuously on the book. The likely scenario is that they fell short of their volume obligation by only a small amount and as a result were treated by the protocol as not quoting at all, since the current implementation does not differentiate between an LP that misses its obligation volume by a small amount and an LP that does not quote at all.

As a result, both LPs have either withdrawn their liquidity commitments or, where they are the only remaining LP on a market, significantly reduced their commitment. The obligation on the Bitcoin December future has fallen from ~250,000 USDT to 20,000 USDT, and the obligation on the Ethereum December future has fallen from ~160,000 USDT to 13,000 USDT.

[Screenshots: reduced liquidity obligations on the Bitcoin and Ethereum December futures markets]

This highlights that the nonPerformanceBondPenaltyMax of the SLA for both markets should be set to 0, or the nonPerformanceBondPenaltySlope should be set to a conservative value such as 0.01. I see no reason to penalise the LPs so harshly when they are clearly doing a sufficient job of quoting on the markets; it only disincentivizes participation in the Liquidity Provision system and drives Market Makers away from Vega Protocol at a time when it is most important to bootstrap liquidity on the protocol.

Regards

Ed


I would like to know what @Yy-Shadow thinks. He made the proposals and is likely linked to the LP.

This is not quite true. The protocol absolutely is designed to differentiate between an LP who only just missed the target and one that is further off, via the “slope” parameter for the penalty (market.liquidity.sla.nonPerformanceBondPenaltySlope).

This parameter is set to 1.0, which means that if the required time on book is 95% then an LP must be below 90% to be given the maximum 5% penalty. If the slope were 0.5 then the penalty would max out at 5% with 85% time on book; a slope of 0.1 would mean a max. 5% penalty at 45% time on book, etc.
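To put rough numbers behind that, here is a quick sketch. It assumes the per-epoch bond penalty fraction is approximately min(maxPenalty, slope * (1 - timeOnBook / requiredTimeFraction)); treat it as an illustration of the shape rather than the exact on-chain formula:

def bond_penalty_fraction(time_on_book: float, required: float = 0.95,
                          slope: float = 1.0, max_penalty: float = 0.05) -> float:
    # Assumed shape, illustrative only; the exact protocol formula may differ.
    if time_on_book >= required:
        return 0.0
    return min(max_penalty, slope * (1.0 - time_on_book / required))

# The maximum 5% penalty is already reached at the time-on-book levels quoted above.
for slope, time_on_book in ((1.0, 0.90), (0.5, 0.85), (0.1, 0.45)):
    print(f"slope {slope}, time on book {time_on_book:.0%}: "
          f"{bond_penalty_fraction(time_on_book, slope=slope):.0%} of bond slashed")
# -> 5% of the bond in each case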

This doesn’t necessarily change the arguments above and it is certainly true that the combination of the current settings overall (SLA target, slope, max. penalty) appears to be too aggressive/harsh.

However, it’s important to consider the entire algorithm/protocol rather than look at one parameter in isolation. We should consider whether it would be better to change other parameters instead of or in addition to the maximum penalty parameter discussed so far here.


Hi Barney, I am not referring to the time-based aspect; I am referring to the volume-based obligation aspect. I.e. an LP could quote with 100% uptime, but if their quoted volume is less than their obligation volume, even by a tiny amount, then their “time fraction” on the book will be 0%.

I think I remember reading somewhere that, to check whether an LP is meeting their obligation, you take the min(total_bid_volume, total_ask_volume) for the LP and compare it to their commitment * stakeToCcyVolume. Is this correct, or am I remembering incorrectly?
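In code terms, the check I have in mind looks roughly like this (my recollection only, not a statement of the actual implementation):

def meets_obligation(total_bid_volume: float, total_ask_volume: float,
                     commitment: float, stake_to_ccy_volume: float) -> bool:
    # An LP only counts as "on the book" if the smaller of its two sides
    # covers the full obligation of commitment * stakeToCcyVolume (as recalled above).
    obligation = commitment * stake_to_ccy_volume
    return min(total_bid_volume, total_ask_volume) >= obligation

# Quoting 99.5% of the obligation on one side counts the same as not quoting at all.
print(meets_obligation(10_000, 9_950, commitment=10_000, stake_to_ccy_volume=1))  # False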

With respect to taking other parameters into account: I think that harsh/aggressive fee penalties are fine, but such harsh bond penalties at this early stage in the protocol’s lifetime are counter-productive, given that we are at a stage where we can still see bugs, outages, data node migrations, etc. I think setting the slope and the max to more reasonable values would certainly be less extreme than setting the max to 0. What do you think of a conservative slope value such as 0.02? This would slash 2% of an LP’s bond if they had a 0% time fraction and ~1% if they had a 50% time fraction.
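For reference, under the same assumed penalty shape as in the sketch above (illustrative only, not the exact protocol formula), a 0.02 slope works out roughly as follows:

# Assumed shape: penalty = min(max_penalty, slope * (1 - time_on_book / 0.95)).
slope, max_penalty = 0.02, 0.05
for time_on_book in (0.0, 0.5):
    penalty = min(max_penalty, slope * (1 - time_on_book / 0.95))
    print(f"time on book {time_on_book:.0%}: ~{penalty:.2%} of the bond slashed")
# -> 2.00% at a 0% time fraction, ~0.95% at 50%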


Yes, I think a slope like 0.02, perhaps combined with a max of 0.01 so that they can lose at most 1% of their bond if their time fraction is ≥50% below the SLA target, would be totally fine.

Of course it can all be set to zero, but then there is no discouragement for an LP posting a much bigger commitment than they can meet and gaining an ELS advantage etc., so my preference would be to try to avoid that.


I understand where you’re coming from, but this is also somewhat the point of the system.

The commitment is supposed to represent an amount that will pretty much always be available to trade against on the book, not the most that will be available each time the LP refreshes their quotes, and that’s what the current measurement aims to capture.

If the most an LP ever quotes is 10k, then it might not make sense for them to commit to having 10k on the book 90% of the time: as their orders trade and might not get replenished immediately, there is going to be some time spent providing less than that, and most LPs also want to be able to move off the book or skew their quotes at times.

I’m no expert on the exact parameters an LP might run their algos by (and I know these, and even the strategies themselves, vary quite a bit from market maker to market maker), but I would imagine that for a standard quote amount of 10k on each side, an LP might commit to something in the region of 5k–9k.

We are looking at whether there is a potential improvement to the algorithm whereby trades against the LP’s quoted volume during a block count towards their total supplied volume, so that the commitment amount can be closer to the amount an LP is quoting and the liquidity that actually traded is credited. I’m not sure how much difference this would make, but it seems desirable if we can make it work. Currently we have identified a few ways it might be exploited, so I’m not sure whether it would in fact be workable. I’ll write up another post on this potential future enhancement later today or over the weekend.

Would be up for revisiting any of this of course.


@Ed-Commodum further to this and based on some other feedback from MMs and community members, I think it might actually be best to do what you initially suggested and set the max penalty to zero.

We had previously considered making the bond smaller and essentially a sybil-prevention feature only (i.e. not at risk of slashing). Bond slashing would be replaced by collecting the LP rewards/fees for some configurable number (1, 3, 7, …) of epochs prior to distribution and slashing those collected rewards instead.
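A very rough sketch of the shape of that idea is below. Nothing here is designed or implemented yet, all names are made up, and details such as whether a penalty hits only the latest epoch’s fees or the whole held-back pot are still open:

from collections import deque

class DeferredLpRewards:
    # Illustrative only: fees an LP earns are held back for `hold_epochs` epochs
    # before payout; an SLA breach slashes the held-back pot instead of the bond.
    def __init__(self, hold_epochs: int = 3):
        self.hold_epochs = hold_epochs
        self.pending = deque()  # undistributed fee totals, one entry per epoch

    def end_epoch(self, fees_earned: float, sla_penalty_fraction: float) -> float:
        """Record this epoch's fees, apply any SLA penalty to the held-back pot,
        and return whatever has matured and can now be paid out."""
        self.pending.append(fees_earned)
        if sla_penalty_fraction > 0:
            self.pending = deque(f * (1 - sla_penalty_fraction) for f in self.pending)
        paid_out = 0.0
        while len(self.pending) > self.hold_epochs:
            paid_out += self.pending.popleft()
        return paid_out

# Example: three good epochs, then an epoch where the LP misses the SLA badly.
lp = DeferredLpRewards(hold_epochs=3)
for fees, penalty in ((100, 0.0), (100, 0.0), (100, 0.0), (100, 0.5)):
    print(lp.end_epoch(fees, penalty))
# -> 0.0, 0.0, 0.0, 50.0 (the oldest epoch matures, reduced by the slash)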

I think we will likely reopen the discussion and move forward with implementing something like the above (subject to community and LP agreement). It is going to be necessary for hybrid/AMM liquidity anyway, and it seems to solve some of the frictions and issues that having their capital at risk of slashing creates for LPs.


That newly proposed method sounds like an interesting idea to explore. Potential loss of unrealised earnings seems like a good motivator without being too threatening to LPs.

I will update the initial proposal with the correct timestamps shortly before I make the proposal submission on chain.

Thanks for your in-depth discussion and feedback.

Thanks @Ed-Commodum - I agree it feels like a better approach, and the existing parameters (max, slope) plus a new parameter for how many epochs’ earnings can be at risk would give pretty fine-grained control. Will write up soon and appreciate your input as always.

Thanks!
