It’s time to post my findings.
Unfortunately, I won’t post the raw results here as names/contacts were shared and I’m not sure people would agree with having those publicly available on the forum.
We’ve had a total of 38 answers:
- 28 answers from a survey that was general and open to any type of community member (investor, user, founder, developer…)
- 10 answers from a more technical survey focused on developers only.
How did people hear about the Internet Computer?
Answers were very diverse; there doesn’t seem to be a single major channel through which people heard about the IC. A few prominent answers:
- Genesis launch event
- Supernova event
- Airdrop events (DSCVR)
- The original 2017 white paper.
- Friends (word of mouth)
- Personal research
Highlight: the Genesis & Supernova events attracted a lot of fresh eyes to the ecosystem, which suggests we should keep organizing more of them.
Suggestion: events combined with a hard push on social media (over a short period, to go viral) seem to be an effective communication strategy and should be reused in the future.
Why do people think that decentralization matters?
A major value proposition of the Internet Computer is to enable decentralization of the entire tech stack (everything on-chain) compared to other chains that can only decentralize financial assets/tokens.
For financial assets the value of decentralization seems accepted, but what about decentralized applications? The aim of this question was to hear opinions on why this proposition matters and, ultimately, why people should build on the Internet Computer rather than on a centralized cloud provider.
Here are the answers:
- Decentralization seems to offer better security and censorship-resistance properties.
- A decentralized system reduces trust assumptions in individuals/organizations and hence enables more efficient collaboration.
- Decentralization is meaningless if any single part of the system still relies on a centralized entity, because that vulnerability puts the entire system at risk of being unplugged.
- A decentralized application will enable better alignment of incentives between all stakeholders (investors, users, developers).
- A decentralized cloud is the only way a DAO can truly operate.
- Decentralized applications will enable the protection of user data.
Highlight: the value proposition of the Internet Computer seems to be closely associated with security, censorship resistance, alignment of incentives, and improved cooperation through DAOs.
Suggestion: the community should aim to maximize the following attributes when making decisions.
- Security (protocol level & application level).
- Censorship resistance.
- Alignment of incentives between actors (node operators, developers, users, holders…)
- A flourishing environment for DAOs to emerge.
If the Internet Computer were to fail in any of these areas (security holes in the protocol, no resistance to censorship by major organizations), then all the other attributes of the platform would be meaningless, as the IC would be deprived of its core value proposition.
Which language (Motoko/Rust/Typescript…) do people choose, what motivated their choice, and what is their experience with the language they are currently using?
Out of the 10 teams that answered the developer survey:
- 6 are using Motoko
- 4 are using Rust
Motoko
Most teams considered Motoko a safe and secure choice: the official backing of the DFINITY Foundation is perceived as a strong guarantee of long-term support.
All 6 teams using Motoko report a high level of satisfaction and have not encountered any limitation specific to the language.
Rust
Rust was usually chosen because the maturity of the language offers more available packages than Motoko, and because the language can be used elsewhere across the blockchain industry (and beyond!).
A lack of learning resources was mentioned for both languages.
Highlight: while having multiple languages is great and enables more developers to build on the IC, there is also a risk of fragmentation across packages, tutorials, and codebases.
Suggestion: we can do a better job of presenting the different language options and explaining the pros and cons of each. Working on more learning resources will indeed be useful. I don’t have a simple answer to the fragmentation problem I’ve mentioned.
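To make the comparison a bit more tangible for readers who haven’t tried either language, here is a minimal sketch of what a canister looks like in Motoko. The `Greeter` actor and its `greet` method are purely illustrative, not taken from any surveyed project:

```motoko
// A complete, deployable Motoko canister: a single actor with one public method.
actor Greeter {
  // Query calls are read-only and served quickly; update calls go through consensus.
  public query func greet(name : Text) : async Text {
    "Hello, " # name # "!"
  };
};
```

Both Motoko and Rust canisters compile to the same WebAssembly module format, so, as the answers above suggest, the choice mostly comes down to ecosystem maturity and familiarity rather than raw capability.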
What are the main difficulties of getting started and building on the IC?
Almost all teams felt that the main difficulties were the new concepts and terminology introduced when starting to build on the Internet Computer (canisters, cycles, stable memory, upgrades…).
Some teams also mentioned that understanding how the IC works was a long process, because they wanted a deep understanding of it before building on top of it.
Documentation was mentioned as being hard to understand, but no specific part was identified.
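To make the “new concepts” point concrete, here is a small illustration of my own (not taken from the survey answers) of one idea newcomers have to absorb: in Motoko, state marked `stable` survives a canister upgrade, while ordinary state is reinitialized.

```motoko
actor Counter {
  // Preserved across canister upgrades: the value is carried into the new code version.
  stable var total : Nat = 0;

  // Not preserved: reset to 0 every time the canister is upgraded.
  var sinceUpgrade : Nat = 0;

  public func increment() : async Nat {
    total += 1;
    sinceUpgrade += 1;
    total
  };
};
```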
Highlight: this section raises an interesting point: introducing new concepts is hard and slows down developer adoption of the platform. This leads us to an interesting question: how do abstraction and simplicity of use rank compared to the other properties of the Internet Computer mentioned previously? How should we balance improving the building experience for end developers against the lightness/simplicity of the core protocol?
(This is especially interesting if we take into account that new concepts are being proposed in an attempt to fix the current limitations of the protocol).
Suggestion: this question deserves a whole separate topic, so I will just quickly mention my opinion: while simplicity of use is great, I would rank it lower in priority than the other properties mentioned, such as security and censorship resistance. Other layers above the core protocol could be developed to improve the developer experience and abstract away concepts.
What limitations have you encountered?
Only two important limitations were mentioned.
- Response speeds (update calls / inter-canister calls)
- Memory limitation
The memory limitation is being worked on, and canister stable memory has already been increased to 32 GB.
The first limitation is inherent to the protocol, and it remains to be seen whether the protocol can be improved to further increase the speed of consensus/update calls.
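As a rough sketch of where that latency shows up (the canister ID and the `get` interface below are placeholders, purely for illustration): every `await` on a call into another canister has to wait for the remote reply to go through consensus, so each extra hop adds noticeable latency compared to local computation.

```motoko
actor Aggregator {
  // Placeholder principal and interface for some other canister we depend on.
  let counter : actor { get : () -> async Nat } = actor ("aaaaa-aa");

  public func readRemote() : async Nat {
    // This await suspends the call until the remote response has been agreed on,
    // which is the inter-canister latency mentioned above.
    await counter.get()
  };
};
```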
What are your main wishes?
- Better troubleshooting and error messages with dfx & moc (Motoko compiler).
- Improved documentation and standards (NFTs and wallets).
- Improved authentication methods beyond Internet Identity (more interoperability).
- No-code tools.
- Integration with Ethereum.
- More memory.
- Support for more intensive computation.
- Advice from the foundation on scaling and business planning.
- Easier acquisition of cycles.
- More examples (Flutter for mobile deployment, patterns for multi-canister dApps…).
- Faster asset loading.
Most of those wishes are currently being addressed by one or more initiatives.
Thanks to everyone who participated!
If you have any other feedback, or if anything I’ve mentioned or raised in this post gave you ideas, feel free to post here or send me a DM.