Hi. I see that the source is open and hosted on DFINITY's GitHub, but why choose GitHub to store the source of the Internet Computer, and in particular the NNS?
As a starting point it's understandable, since GitHub is very popular at the moment, but moving forward I'm wondering whether the source code of the Internet Computer will also be hosted by truly decentralized means rather than by a single source-code-hosting company. If that's already the case (and the repositories on GitHub are only for reference purposes), then excuse my ignorance, but I'd really like an explanation in this regard.
Also, if the source code that's actually used to build the Internet Computer and upgrade the governance software isn't taken from GitHub, I'd like to know where it lives as well.
Sure, but someone needs to create a software stack that is as easy to use, trusted, reliable, available, etc. as the current alternatives are. Decentralized doesn’t mean anything if the software itself is not on par with what’s available.
I don't think it should be created by someone. Rather, source code hosting should be part of the chain, just like application canister hosting is. But before even attempting that, the Internet Computer should implement a multi-level governance authority system (similar to the proposed/under-development idea), with source hosting being the most trusted part of the chain, so that (sub)nodes can't override the governance of the source for the entire chain.
I wonder if the idea of the source code (every distinct commit, say) being NFTs sounds too wild to you, but to me a change in this direction is a must in order to ensure that the entire development process is at least as trustworthy as the running chain itself.
I think I see where you're coming from. If you're talking about traceability and ensuring that the replicas are running the code they say they are running, then I agree. This should happen, and I believe there have been efforts in this area. I know DFINITY is working on an improved reproducible-build pipeline, and there are also some community efforts that already provide this service, e.g. matching some code in a repo with a canister hash, to make sure the code published is actually the code running on the replicas.
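To make the "matching a repo with a canister hash" idea concrete: the IC reports a canister's installed module hash as the SHA-256 of its wasm binary, so a reproducible build lets anyone rebuild the wasm locally and compare digests. A minimal sketch of the comparison step (the function names are my own, and a real check would read the reproducibly built `.wasm` file and fetch the reported hash from the network, e.g. via `dfx canister info`):

```python
import hashlib

def wasm_module_hash(wasm_bytes: bytes) -> str:
    """Hex SHA-256 digest of a wasm binary, the digest the IC reports as the module hash."""
    return hashlib.sha256(wasm_bytes).hexdigest()

def matches_onchain_hash(wasm_bytes: bytes, reported_hash: str) -> bool:
    # Reported hashes are often shown with a "0x" prefix; normalize before comparing.
    return wasm_module_hash(wasm_bytes) == reported_hash.lower().removeprefix("0x")

# Stand-in payload for illustration: the 8-byte wasm magic + version header.
blob = b"\x00asm\x01\x00\x00\x00"
digest = wasm_module_hash(blob)
print(matches_onchain_hash(blob, "0x" + digest))  # True when the build is reproducible
```

The hard part, of course, is not this comparison but making the build deterministic enough that independent rebuilds produce a byte-identical wasm.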
The issue with the initial question is that GitHub provides much, much more than simply displaying source code. So even if someone builds the features you described, and the code can somehow live on chain in perpetuity, GitHub will probably still be used, as it is one of the most mature products in this field, and the advantages of using it are huge.
No matter how advantageous it may sound to build your new enterprise on someone else's established product early on, depending on it to this extent will surely prove fatal in the event of a conflict of interest between the two parties. My bet matches what my Spidey-Senses tell me: Microsoft (as in the United States of Microsoft) will want to intervene, if they haven't already. This is nothing to be taken lightly if the Internet Computer is to truly remain distributed, open, and transparent to all, by all means.
To be precise, we do not use GitHub at all for active development on the IC repo. It is there for open-sourcing and accepting PRs from the community, but all the build automation and pipelines are hosted on GitLab.
Everyone at DFINITY would like to start using source control on the IC, but git is an awesome and flexible tool, and there are no git providers on the IC yet that give us anywhere near the utility or reliability that would allow ~100 devs to work out of a single monorepo. Even if the source code were "hosted" on the IC as its canonical source, we'd still need tooling for build pipelines, comparing and approving PRs, running automated tests, and so on.
Thank you all for replying. I didn't know about the GitLab side of things.
So this means a GitLab-like collaboration environment and build pipeline is crucial for core development, and for what I'm hoping for to happen, an alternative rivaling those tools would have to be built on the Internet Computer itself. All I was looking for was a clarification of what you'd need to achieve such a feat and whether you're open to the idea, so I'm happy to hear that you are!
DFINITY runs a private GitLab instance. I don't actually know that much about it, since the SDK repos do operate on GitHub, and that's mostly what I work on. We have a whole internal developer-experience team in charge of the IC repo and GitLab, and from what I understand, their big project right now is migrating to Bazel as the primary build tool.
I don't know what their list of necessary features would be, but at the very least it would include being able to compile Rust projects with deterministic reproducibility of wasm hashes, something we've had to rely on Docker for until now.
Hmm, the change to Bazel as the build system sounds promising, and I hope it'll be able to work on sources hosted on the Internet Computer in the future too. I also hope developments like ICFS will help in this regard.
Re: Bazel: it wasn't always so; DFINITY used to build all the Rust code with cargo and nix. Bazel is just plain faster, a LOT faster. Any complete stack hosted on the IC will need similarly fast builds. It might be that there are non-replicated builds on development branches but replicated builds for merges to main.
I believe that there is already a project that acts as “github on the IC”. In fact, I landed here when looking for it. Anyone know where I might find it?