Hi everyone - I’m building a decentralised version of YouTube. I’m stuck and need help.
My users will be uploading very large full-resolution videos. My current solution is IPFS, but all the pinning providers out there frankly suck, so my only option is to host my own IPFS node, which seems kinda centralised haha.
I looked at the CanCan repo, and it's literally storing chunks of videos in canister memory. That can work for small videos, but not for large ones, which most videos are these days.
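For anyone unfamiliar with the pattern: the chunked approach splits a video blob into fixed-size pieces and stores each piece under a (video id, chunk index) key. A minimal Python sketch of that idea, where the function names and the 2 MB chunk size are my own illustrative assumptions, not CanCan's actual constants:

```python
# Sketch of a CanCan-style chunking scheme (sizes and names are assumptions).
CHUNK_SIZE = 2 * 1024 * 1024  # assumed per-chunk size, not CanCan's real constant


def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Split a video blob into fixed-size chunks, ready to upload one per call."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]


def reassemble(chunks: list[bytes]) -> bytes:
    """Concatenate fetched chunks back into the original blob on the client."""
    return b"".join(chunks)


video = bytes(5_000_000)  # stand-in for a real video file
chunks = split_into_chunks(video)
assert reassemble(chunks) == video  # round-trip is lossless
```

The problem the poster describes isn't the chunking itself but that the total still has to fit in one canister's memory.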
Are there any plans to provide a solution for BLOB data / large media? Or is there another solution I'm not considering?
The media will either be a single large H.264 MP4 or an m3u8 playlist for adaptive streaming (many files, still relatively large). All our metadata is currently on the ICP.
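For context on the m3u8 option: an HLS playlist is just a small text manifest pointing at many short segment files, so each segment can be a separately stored asset. A minimal example (segment names and durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.0,
segment0.ts
#EXTINF:6.0,
segment1.ts
#EXT-X-ENDLIST
```

That structure is why the adaptive-streaming route maps more naturally onto per-chunk storage than a single multi-gigabyte MP4 does.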
In a previous thread Diego stated that for canisters “Storing a 6 GB video - very possible”
I doubt any video for your MVP will exceed that amount.
At worst you may have to generate new canisters for every upload, but I could see that being very cycle-intensive.
Lastly, if your files are > 8 GB, split them up; perhaps you can build (or there already is) a tool that "seamlessly" transitions from canister to canister within your dapp.
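To make that canister-to-canister handoff concrete: if each canister holds at most a fixed number of bytes, any global playback offset maps deterministically to a canister index plus a local offset. A hypothetical helper (the function name and the 8 GB cap are illustrative assumptions):

```python
# Sketch: map a playback position to the canister holding that byte range.
CANISTER_CAPACITY = 8 * 1024**3  # assumed per-canister byte budget (8 GiB)


def locate(offset: int, capacity: int = CANISTER_CAPACITY) -> tuple[int, int]:
    """Return (canister_index, offset_within_canister) for a global byte offset."""
    return offset // capacity, offset % capacity


# A byte 9 GiB into the file lives in the second canister, 1 GiB in.
print(locate(9 * 1024**3))  # (1, 1073741824)
```

The "seamless" part is then just the player prefetching from canister `i + 1` as playback nears the end of canister `i`.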
Maybe that's an opportunity for an ad break, or to query the user for some feedback on the video - similar to Netflix's "Are you still watching?"
I understand it’s not the solution you’re looking for but hopefully it helped shed a bit of light ^^
You caught me on my vacation - sorry for missing this. Our certified asset canister / service worker solution kind of sucks at this right now, but you can upload the file using dfx and it can stream on raw.ic0.app fairly well.
We still don't support scrubbing, progressive enhancement, or a bunch of other nice-to-have features for a professional video streaming service, though. Someone will need to invest in that before a full YouTube-quality experience is possible.