Of course, it would be great if the canister could be informed of the response to the HTTP request (or at least receive some kind of acknowledgement that it was received), i.e. that the email was successfully sent.
We would need to employ idempotency in a similar way for responses/acknowledgements as well.
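Idempotent handling of responses/acknowledgements can be sketched as a simple dedup store that remembers which requests have already been acknowledged, so a retried acknowledgement becomes a no-op. This is only an illustration of the pattern; `AckStore` and `record_ack` are hypothetical names, not part of any IC API:

```rust
use std::collections::HashSet;

/// Minimal sketch of idempotent acknowledgement handling.
/// Remembers which request IDs have already been acknowledged,
/// so a retried/duplicated acknowledgement is ignored.
struct AckStore {
    seen: HashSet<u64>,
}

impl AckStore {
    fn new() -> Self {
        Self { seen: HashSet::new() }
    }

    /// Returns true only the first time a given request id is acked;
    /// subsequent calls with the same id return false.
    fn record_ack(&mut self, request_id: u64) -> bool {
        self.seen.insert(request_id)
    }
}
```

A caller would only act on the acknowledgement (e.g. mark the email as sent) when `record_ack` returns true, making retries safe.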
There is a scenario where Alice transfers ICP to Bob at canister C’s request. To do so, Alice calls the send function on the Ledger canister.
Now Alice wants to tell canister C that she has completed this transfer, so she calls notify on the Ledger canister to have it notify canister C. But this cannot succeed, because the Ledger canister requires the transfer target and the notification target to be the same.
Of course, relaxing this restriction would bring some risks, but if you do the relevant checks in the receiver canister, those risks can be avoided.
What are we doing about SEO? Sites within the IC will want to be findable by the old internet. Can we return the raw files to bots and allow them to read them normally at the boundary level?
For example: User A wants to make a cool newsletter site using ICME but notices it never appears on Google Search, even with an extremely specific, exact search. His newsletter goes unnoticed by the general population. He may consider moving it back to the old web just to get normal SEO.
I propose this be a high-priority item on the roadmap, not just as a dev, but as someone who understands the real-world problems that big tech is eating up.
I wonder how we could leverage the community to help design and implement software changes to the IC.
On-chain governance in the form of the NNS is unique to the IC, and could help the network evolve much faster than other blockchains. (Look how long it’s taking for ETH 2.0.)
I’d bet there are quite a few of us who would be willing to help actually write the Rust code to add features to the IC. Being bottlenecked by the DFINITY Foundation on all of these large-scale changes doesn’t seem to be sustainable (or fast enough) in the long term. If we have a more formalized reward system and proposal review process for decentralized source code contributions, I think this could be a huge advantage for the IC.
Well, as you know, in one of my proposals I did actually write the code to implement the proposed change.
I agree that there may be a bottleneck with waiting on the DFINITY foundation to implement features that are voted upon. However, in this situation I think the bottleneck is not in the implementation, so I also agree that an improved process around proposing changes is needed.
Is it really unique? My understanding is that Tezos had on-chain governance, including changing the core protocol and code in an on-chain, governance-driven way, long before the IC existed.
But yes, a discussion on how the design and development can be actually community-owned is very welcome IMHO
A good next target seems to be open sourcing all code with each repository accepting pull requests from DFINITY and the community. If we can get to that point we’ll be in a good position to allow individuals, companies, DAOs, etc to start contributing.
Canisters need more fine-grained permission control. For example, a controller can upgrade the code and view the canister status, while an observer cannot upgrade the code but can still view the canister status.
If you use one canister to manage the canisters of many different users, and only need to read those canisters’ status, you need this feature.
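The requested controller/observer split could be modeled as a small role-to-action permission table. This is not an existing IC feature; `Role`, `Action`, and `allowed` are hypothetical names sketching what finer-grained permissions might look like:

```rust
/// Sketch of finer-grained canister permissions than the current
/// single `controller` role. Hypothetical, not an existing IC API.
#[derive(Clone, Copy, PartialEq)]
enum Role {
    Controller, // may upgrade code and view status
    Observer,   // may only view status
}

enum Action {
    UpgradeCode,
    ViewStatus,
}

/// Returns whether a given role is permitted to perform an action.
fn allowed(role: Role, action: &Action) -> bool {
    match (role, action) {
        (Role::Controller, _) => true,
        (Role::Observer, Action::ViewStatus) => true,
        (Role::Observer, Action::UpgradeCode) => false,
    }
}
```

A management canister could then check `allowed(caller_role, &Action::ViewStatus)` before forwarding a status request.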
At the risk of embarrassing myself, I have to ask: I am not sure what “FOOD” means. Is that a new way of saying FUD on the interwebs, or something completely different?
I’m not an expert on the ICP ecosystem yet, but yes, I am in blockchain generally.
This is my first review after completing the staking process and choosing followee neurons.
I don’t really know whether the followees are verified teams, or whether anyone holding ICP could create one.
If so, then for a staking period of 7+ years I see some risk for both parties, the token holder and the ICP ecosystem, since I know these neurons hold a lot of voting power.
What about a partnership with VeChain for the verification process?
They’re solving real-world challenges around carbon footprint, Web3, DAOs and sustainability, and they have strong partnerships in quality auditing.
It would also be useful to have a list showing all of these verified token holders.
I’ve been pondering a concept that might sound unconventional and is still in the early stages of thought. I’m not entirely sure if it’s feasible, so I’d greatly appreciate your insights or corrections if I’m off track. The idea revolves around embedding a subnet within a subnet.
Concept Overview: Imagine a basic subnet comprising five nodes. What if one of these nodes is replaced with another subnet, also consisting of five nodes? This structure would potentially influence the consensus mechanism of the parent subnet, as consensus in the child subnet would impact the parent’s decision-making process.
Potential Implications:
Increased Security: By adding layers, the network might become more resilient to attacks or failures by virtue of having more independent nodes.
Asymmetric Consortium Subnets: This could be particularly beneficial in scenarios where consortiums involve both large and small companies. Small companies could manage their subnets, which then act as individual nodes in a larger subnet controlled by more influential corporations. This arrangement could balance power and participation, maintaining engagement and trust in the network.
Weakened Security / Higher Tolerance: This structure could allow a higher percentage of nodes to disconnect without disrupting the update process. For instance, if we abstract this idea to a 13-node subnet where each node is replaced by a 5-node subnet, the tolerance works out to (4 × 5 + 9 × 1) / (13 × 5) = 29/65 ≈ 44.6% of physical nodes. Although that makes updates easier to keep going, it has the mirror effect of lowering the percentage of bad actors required to push a malicious update to 55.4%.
Slower Consensus: Updates that depend on agreement within a child subnet would take more time.
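The tolerance arithmetic above generalizes to any parent/child sizes, assuming a BFT-style bound of f faulty members with 3f + 1 ≤ n at each level: f whole child subnets can fail entirely, and each remaining child can additionally lose its own f nodes. A small sketch of that calculation (function name is my own, not from any IC codebase):

```rust
/// Fraction of physical nodes that can fail in a nested subnet
/// (a parent of `parent` members, each member itself a subnet of
/// `child` nodes), assuming each level tolerates f faults where
/// 3f + 1 <= n.
fn nested_liveness_tolerance(parent: u64, child: u64) -> f64 {
    let parent_faults = (parent - 1) / 3; // f for the parent subnet
    let child_faults = (child - 1) / 3;   // f inside each child subnet
    // parent_faults children may fail wholesale (all their nodes),
    // and every surviving child can still lose child_faults nodes.
    let tolerable = parent_faults * child + (parent - parent_faults) * child_faults;
    tolerable as f64 / (parent * child) as f64
}
```

For the 13-of-5 example this gives (4 × 5 + 9 × 1) / 65 = 29/65 ≈ 44.6%, matching the figure in the list above.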
Seeking Feedback: This idea is still in its nascent stage and might have overlooked complexities or technical challenges. What are your thoughts on this? Could this nested subnet approach be a viable way to enhance the network, or are there fundamental flaws in this thinking? Any feedback, criticisms, or further ideas would be greatly appreciated.