Handling of Node Provider rewards

Hmm, the problem might have more to do with liquidity than token existence.

How much of the voting rewards is liquidated within 1 month of receipt? Little.

How much of the node rewards is sold off within 1 month of receipt? More.

E.g., Satoshi’s initially mined BTC is perceived to be illiquid even though it exists.

Regardless, I’m not sure how much any of this matters.

2 Likes

Besides this artificial floor, it may be prudent to offer node providers who do not sell any ICP (and those who use it to buy cycles), over some weighted period of time, the chance to earn a share of the ICP burned for cycles, whenever the ICP burned for cycles exceeds the node and voting rewards in some time period (per day, or over rolling weeks, I suppose). This would make cycle burns slightly less deflationary, but it could be a carrot that spurs more decentralized marketing of the IC by node operators (sort of how BTC miners evangelize Bitcoin).

In this case, node operators who hodl the ICP they earn, or who spend it inside the ecosystem (burning ICP for cycles), have an added incentive to market the IC to app developers.

We should take this opportunity not to punish the node operators for what is not their doing and, while adjusting for this extreme situation, also offer them carrots that align with growth in cycles usage.

4 Likes

To ensure fairness, it’s also crucial for Dfinity to reevaluate its marketing strategy to stimulate demand. I have an idea that I’ve submitted as a feature request, which you can find here.

This request represents an initial concept that has the potential to boost demand for computational cycles. While this may not fall directly within your scope of responsibility, it would be great if you could pass it along to the relevant people at Dfinity who are in charge of this area.

2 Likes

Inflation in the NNS is a huge problem. More than 30 million ICP are issued every year. This is an ongoing thing.

2 Likes

Marketing for ICP is weak at best. The tech, team, project, and community are great. I just can’t get my head around the poor marketing.

7 Likes

@bjoernek
The problem statement makes perfect sense. It’s essential to realize that modifying the NP rewards approach might inadvertently affect not just NP investors but also the overall perception of, and trust in, Dfinity’s commitments. Decisions made now have repercussions that could echo in the platform’s long-term credibility.

Drawing from my experience and observation, I believe the biggest area where ICP could significantly benefit is enhanced marketing and outreach. The platform could achieve more widespread adoption with effective marketing. Take a look at what Google Cloud did for large-scale adoption in Asia Pacific, where I was part of it. To any enterprise, sales and marketing are the lifeblood that ensures sustained growth and visibility. Whatever short- or mid-term strategies are in place to slow down inflation will not be enough. Let’s address the elephant in the room: visibility and adoption.

7 Likes

Agreed, finally. All the technical advancements and the outstanding R&D team are on the very next level. We have HTTPS outcalls, ckBTC, soon ckETH, the SNS option, OpenChat, soon AI, and so on.
ICP is more than ready for mass adoption to take place.

The lack of a massive marketing campaign is what is holding it back big time. All the other big “chains” have always had big marketing and weak R&D, which still puts them at the forefront for the mainstream audience.

ICP has the greatest advancements in blockchain, but it is going under because nobody is reporting the advancements to the mainstream; consequently, there is no significant adoption. Attending blockchain events is great and important, as is being active on Twitter, but it is clearly not enough to draw mainstream interest.

Others seem to pay to get visibility on YouTube and news outlets; shouldn’t we do the same?
Including real-world marketing like advertising on a public digital billboard, e.g. NY Times Square or London Piccadilly Circus… ( Does ICP / Dfinity Have A Marketing Strategy? )

As MalithHatananchchige said, this could turn things around without modifying NP rewards and impacting the overall perception and trust in Dfinity’s commitments.

Suddenly modifying NP rewards could also rub them the wrong way. Abruptly changing their rewards would go in the same direction as the “platform risk” that comes with building on AWS, i.e. the platform suddenly changing underlying fundamentals, wouldn’t it?

Instead consider launching a massive marketing campaign to gain visibility & adoption.

Thoughts?

4 Likes

I’m sorry, but the math here makes no sense to me at all, and I am an engineer.

So you defined the ratio as follows: 2.22 (ICP/XDR)

Okay, and assuming we receive rewards equivalent to 1000 XDR, how many ICP would that be?

ICP amount = 1000 (XDR) * 2.22 (ICP/XDR) = 2220 ICP

I don’t know why you were dividing. Your units don’t even cancel out. In the above calculation the units do cancel out.

Please correct me if I am getting something wrong, but I’m pretty sure you shouldn’t be dividing.

While we understand that Dfinity is a “not-for-profit organization” and the Internet Computer protocol resembles a “non-profit public internet infrastructure”, it still needs to be managed with a business mindset, akin to a corporate entity. The key distinction here is that it’s not managed for maximizing profit, but rather to strike a balance between supply and demand.

Currently, I’ve observed that the community is exerting a considerable amount of effort on the supply side. However, there are limitations to how low the supply can be tweaked before it begins to negatively impact the protocol itself.

In my view, the lack of discussion on the demand side is the primary reason for the low consumption of computational cycles. This leads us to the question: What metrics should define a successful marketing strategy in the case of the Internet Computer Protocol (ICP)? Using the wrong Key Performance Indicators (KPIs) can yield misleading results. If we measure success solely based on metrics like the number of Internet Identity (II) creations, canister creations, or active developers on GitHub, it may appear as a successful marketing effort. But is this the most important metric for measuring successful usage adoption?

From my perspective, it all boils down to economic fundamentals. The most crucial metric to monitor is cycles consumption because it directly relates to the supply and demand dynamics of the tokenomics.

In my opinion, ICP should consider developing ‘products’ or ‘services’ that are closely tied to real-world usage and cycles consumption. Prioritizing products or services that generate sales should be a part of building this essential infrastructure layer.

While I understand that these endeavors are typically the responsibility of projects built on ICP, given the current market conditions and difficulties in securing funding, many projects are currently on hold.

To address this situation, it might be prudent for Dfinity to step in proactively and create basic products or services that can generate demand and absorb excess liquidity in the interim.

3 Likes

No, his calculation is correct.

1 XDR = 1.3 USD
1 ICP = ~3 USD => 1 ICP = ~2.3 XDR

3 Likes

Great topic, some of my thoughts on this as well. I personally think there are two slightly separate issues here:

  1. Node providers contributing to ICP inflation - this doesn’t really seem to be the case to a significant degree, as inflation appears to be mainly driven by other factors, judging from the statistics shown, and maybe that is where one should look first.
  2. Overcapacity of nodes (i.e. 50% of nodes just “awaiting subnet”).

Regarding point 2, I think a certain degree of overcapacity is necessary in order for sufficient capacity to be present when ICP starts to scale more, because adding nodes is not a quick process. Even assuming there would always be potential node providers ready to jump in and add nodes, the steps to become a node provider are not quick. Just getting through all the proposal steps would take around 3 weeks at best. One also has to negotiate with data centres and then, crucially, procure the hardware. The type of node machine required is quite high-spec, and at the very best, procurement lead times are around 8-10 weeks; if there are chip shortages and the like, this can increase to 6 months. Then one also has to ship them to where they need to go; particularly for non-EU/US based node providers this adds complexity, as they often wouldn’t be able to procure locally. So it can be quite a lengthy process to add nodes. Therefore, having a certain overcapacity of nodes in the network at any given time is probably prudent. Does it have to be 50%? Probably not, but maybe 20-25%.

Right now the top 13 node providers (each having more than 40 node machines) account for 66% of all node machines, and 87% of these node machines are EU- and US-based. I think this is probably where I would start to take a look. How many of these are Gen 1 machines? I understand it’s normally a 4-year business model and the life of the node machines is assumed to be 4 years, so quite a few of these may come up to the 4-year mark soon (I don’t think this information can be found on the dashboard). They should then be phased out and not renewed.

Given such heavy EU/US concentration and only nascent emergence of nodes in other countries, it may make sense not to allow any more additional nodes in the EU/US and to focus on adding nodes in the rest of the world for decentralisation. Yes, adding nodes outside the EU/US is often a bit more costly, because bandwidth costs are significantly higher and somewhat higher node provider compensation is required, but this is also changing as time passes. One should probably also think about setting an upper limit on the number of nodes per data center and location, to ensure adequate decentralisation (maybe 20 nodes or so; right now some of these top node providers have more than 60 nodes all in one data center).

Lastly, Dfinity themselves run more than 90 nodes total in the EU/US; I guess they must all be Gen 1 nodes. I understand if they wanted to run a few nodes themselves for development purposes, but that should only be a few, so maybe these can be the first ones to be reduced/phased out, while allowing some additional nodes to come online in the rest of the world to drive decentralisation.

10 Likes

It’s 2.22 XDR per ICP (XDR/ICP). The graphs all say ICP/XDR, which would be 0.45; I think that is a mistake @bjoernek made. 0.45 ICP = 1 XDR, or 1 ICP = 2.22 XDR.

1 Like

Note that X/Y is the standard notation for currency pairs. X/Y = z means that 1 X is traded for z Y.

3 Likes

@Jesse @Qasmi1216 @BHare1985

To extend on what @THLO had mentioned:

  • In the post and the example given, I have used the standard notation for currency pairs. Stating that the ICP/XDR exchange rate is equal to 2.22 means that 1 ICP corresponds to 2.22 XDR.
  • An easy trick to remember this convention is that “ICP/XDR = 2.22” can be re-arranged to “ICP = 2.22 XDR”.
  • However, ICP/XDR is NOT the unit of the exchange rate, which to my understanding created some confusion.
  • So in summary, if you want to convert a given XDR amount (e.g. 1000 XDR) to ICP, then you need to divide by the ICP/XDR rate (e.g. 2.22).

I hope this clarifies things!
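The convention above can be sketched in a few lines of code. This is purely illustrative; the function names are hypothetical and not from any IC codebase:

```python
def xdr_to_icp(amount_xdr: float, icp_xdr_rate: float) -> float:
    # ICP/XDR = 2.22 means 1 ICP = 2.22 XDR, so converting
    # an XDR amount to ICP divides by the ICP/XDR rate.
    return amount_xdr / icp_xdr_rate

def icp_to_xdr(amount_icp: float, icp_xdr_rate: float) -> float:
    # Converting ICP back to XDR multiplies by the ICP/XDR rate.
    return amount_icp * icp_xdr_rate

# 1000 XDR at a rate of 2.22 is roughly 450.45 ICP, not 2220 ICP.
```

With this convention the round trip is consistent: converting 1000 XDR to ICP and back returns 1000 XDR.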

4 Likes

The idea is to apply the ICP/XDR exchange rate floor only in the calculation of node provider rewards (which is done in the NNS governance canister) and not in the context of cycles minting (which is done in the cycles minting canister). Does this address your concern?
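A minimal sketch of that separation, assuming a hypothetical floor value (`XDR_RATE_FLOOR` and the function names are illustrative; the actual floor and canister logic are not specified here):

```python
XDR_RATE_FLOOR = 5.0  # hypothetical ICP/XDR floor, for illustration only

def node_provider_reward_icp(reward_xdr: float, icp_xdr_rate: float) -> float:
    # Governance-side conversion: the rate is floored, so a falling
    # ICP price stops increasing the amount of ICP minted as rewards.
    return reward_xdr / max(icp_xdr_rate, XDR_RATE_FLOOR)

def cycles_minting_icp_needed(xdr_amount: float, icp_xdr_rate: float) -> float:
    # Cycles-minting side: no floor is applied; the market rate is used.
    return xdr_amount / icp_xdr_rate
```

At a rate of 2.22 ICP/XDR, a 1000 XDR reward would be converted at the floor of 5.0 (yielding 200 ICP), while cycles minting would still use the market rate.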

1 Like

Good morning @bjoernek, just wanted to follow up on this, as I feel it has gone unnoticed.

I believe this is going to be a core factor in solving the potential of an inflation spiral - I would appreciate your feedback on the topic.

2 Likes

Thank you, @Accumulating.icp. I agree that the ICP protocol should have clear criteria or a model to determine the number of nodes needed both now and in the future. This model should take into account a variety of factors, including network diversification across node providers, data centers, and countries, as well as cost considerations, required spare capacity, and incentives for node providers. I plan to share some analysis and suggestions on this topic soon.

8 Likes

Yes. Thank you for elaborating.

2 Likes

This is not a solution but more of a tool so everyone could see the decentralization aspect of upcoming data centers. The node map on the dashboard could include another node marker for proposed data centers. Then everyone could visually see whether the DC is actually helping to decentralize or whether it lies within the bounds of current node overlaps. After the proposal is complete, it could then be changed to upcoming (if adopted) or removed (if declined).

Another visualization could be color coding overlaps:

  • No overlaps, no color: nodes are contributing to decentralization optimally.
  • 1 overlap, yellow: nodes are contributing to decentralization within acceptable parameters.
  • 2 overlaps, orange: nodes are beginning to cluster; more nodes in these areas contribute to centralization.
  • 3 overlaps, red: nodes are centralized; no further proposals from these areas should be accepted.

The number of overlaps that triggers each color could be adjusted to reflect decentralization goals. I will admit that I don’t know how many nodes are needed within any given area to optimize coverage, so adjustments to the suggestion are likely necessary, but I think a visualization like this would help everyone with decision making as far as onboarding new nodes and data centers.
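The mapping above is simple enough to sketch as code. The thresholds are the ones proposed in this post and would, as noted, need tuning against real decentralization goals:

```python
def overlap_color(overlaps: int) -> str:
    """Map a node-location overlap count to a dashboard marker color
    (illustrative sketch of the color-coding proposal above)."""
    if overlaps <= 0:
        return "none"    # contributing to decentralization optimally
    if overlaps == 1:
        return "yellow"  # within acceptable parameters
    if overlaps == 2:
        return "orange"  # beginning to cluster
    return "red"         # centralized; no further proposals accepted
```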

2 Likes

I wrote about a year ago that ICP urgently needed to update its tokenomics. The forum administrator deleted my post and warned me, and I don’t post on the forum anymore. ICP officials are so caught up in the technology that they are not even aware that the tokenomics currently in effect are leading to ICP’s death. Under any circumstances, they must find a way to limit or even stop token issuance. I think it may be too late.

4 Likes