Yeah, floats were a pain.
Alright, I think I need clarification on a few things because I'm confusing myself.
Is the CBOR encoding a custom thing, versus the HTTPS interface?
From what I understand, the CBOR is encoded/decoded in the boundary nodes and doesn't touch the canister itself. From the tests/examples I have seen for call_raw, the encoding is just the raw Candid, not CBOR. Is there going to be CBOR inside the Candid in blob form?
Looking at the candy_library, I get what is happening: it allows you to look at properties/metadata of the data itself. BUT what I don't get is how it goes from a Motoko value to a candy value. It seems like it's just a manual process, similar to what I'm doing with CBOR values, where it's just a variant storing Motoko types with some metadata. Am I understanding it correctly, or am I missing something?
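To make sure I'm picturing it right, here's a minimal sketch of that kind of wrapper (my own simplified illustration, not the real candy_library types):

```motoko
// Hypothetical simplified wrapper, not the actual candy_library definitions:
// a recursive variant tags each Motoko value so its type can be inspected
// at runtime.
type CandyValue = {
  #Nat : Nat;
  #Text : Text;
  #Blob : Blob;
  #Array : [CandyValue];
};

// Going from a Motoko value to a candy value is then just manual tagging:
let wrapped : CandyValue = #Array([#Nat(42), #Text("hello")]);
```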
How exactly do we want the code to look when converting to and from CBOR in the call_raw method? I see in the description that we want to be able to inspect what's inside or build CBOR based on a definition, but I think I'm having trouble picturing it. For "construct a payload for call_raw using an uploaded candid definition", what does the definition look like? Is it a *.did file? If so, is the idea to parse that file, build some kind of descriptor data from it, then use that for validation and creation of Candid models? That's also where I'm confused, because I'm not sure what that has to do with CBOR; it seems like a Candid thing.
Any help would be great because I'm going in circles.
I think some sort of pseudocode of the call_raw function, showing what you expect/want, would be helpful as well.
Starting the candid portion in a different project
It will work similarly to how I have the CBOR one, where I will take the Blob and create a candy_library-like model holding the Candid value/type metadata.
Sweet. I'll try to get to your questions later today… sorry for the delay.
No worries. I have baby duties occupying most of my time, and the candid work is ongoing.
I have been pulled down a rabbit hole this week.
I needed a better system for encoding/decoding numbers, along with having floats with different precisions. So I have moved all the work I have done already for those, and much more, to the Numbers project, which will be a referenced project in both the CBOR and Candid encoders.
I have it set up for the CBOR project, and now I'm going back to the Candid project. It should be much easier since I have already implemented LEB128 encoding in the Numbers project.
Dealing with 1's and 0's in long bit streams and bytes, along with different endiannesses, can really destroy one's mind.
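For context, the unsigned flavor of LEB128 is small; here's a sketch of what has to go right (my own illustration, not the Numbers library code):

```motoko
import Buffer "mo:base/Buffer";
import Nat8 "mo:base/Nat8";

// Sketch of unsigned LEB128: emit 7 bits per byte, least significant
// group first, with the high bit set on every byte except the last.
// e.g. 300 encodes as [0xAC, 0x02].
func encodeULEB128(n : Nat) : [Nat8] {
  let bytes = Buffer.Buffer<Nat8>(4);
  var value = n;
  var done = false;
  while (not done) {
    let low7 = Nat8.fromNat(value % 0x80);
    value := value / 0x80;
    if (value == 0) {
      bytes.add(low7); // last byte: high bit clear
      done := true;
    } else {
      bytes.add(low7 | 0x80); // more bytes follow: high bit set
    };
  };
  Buffer.toArray(bytes);
};
```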
You are a rock star! Fun things down those rabbit holes.
Just wanted to give an update
I'm close on the Candid project.
I have done all the 'encoding' except for func/service, plus a couple of TODOs/polish items, but it's in a good state with all my tests passing.
I still need to do the 'decoding' from bytes, but that shouldn't take as long now that I have the structure.
Any updates on my questions above?
Actually, the decoder was much easier than I expected, and it's in a similar situation to the encoder: mostly done, but it needs some edge cases/polish and a few broken tests fixed.
I guess this would be a good time to figure out if the library is structured in the way we need and handle any initial reviews.
How is the review process going to work?
Alright @skilesare (forgot to tag you in previous comments, oops): the Candid, CBOR and Numbers projects are ready to go / be reviewed. They are all working, with all tests passing. There are a few TODOs here and there, but they should be fully functional. Each README should have at least some usage code.
@Gekctek Thanks for working on these - looking forward to digging in!
Shout out to some Motoko people that might want to glance at these
Awesome! Will take a look.
This is awesome! We almost have reflection in Motoko here:
Maybe the biggest thing for Motoko since last week, when we got `x with y = z;`
Nice. So how does the review process work for this bounty? I have to add a bit more documentation and misc things, but I'm not sure how to proceed.
I think the first thing I see is that you should clear all the warnings unless there is a really good reason not to. Then we just ask for review from the community… probably give a demo during one of the community calls.
Sounds good. I think I was just making sure the main things were all there and no major reworks are needed before I do the final cleanup and documentation.
I have updated all the projects with better documentation and did some cleanup.
By warnings, do you mean the TODOs, or are you getting compiler warnings that I'm not seeing?
As far as cleanup goes… for instance, when I go to Motoko Playground - DFINITY and compile, I see a bunch of warnings like: "Warning in file candid/Decoder.mo:254:107: field hash is deprecated: For large Int values consider using a bespoke hash function that considers all of the argument's bits."
Maybe this is a new warning coming in a new Motoko? Either way, we should get ahead of it. Also, any Array.append calls should be replaced by Buffer logic.
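Something along these lines is what I mean by Buffer logic (a sketch with a hypothetical helper, not code from the projects):

```motoko
import Buffer "mo:base/Buffer";

// Sketch of the idea: instead of repeated Array.append calls (each one
// copies the whole array, so a loop of appends is quadratic), accumulate
// bytes in a growable Buffer and convert to an array once at the end.
func concatBytes(chunks : [[Nat8]]) : [Nat8] {
  let out = Buffer.Buffer<Nat8>(0);
  for (chunk in chunks.vals()) {
    for (byte in chunk.vals()) { out.add(byte) };
  };
  Buffer.toArray(out);
};
```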
Weird. I updated my dfx, but I still don't get warnings. I'll have to do more investigation, but at least the Motoko Playground works.
I updated the Int.hash usage, and I'm not getting any more warnings now. So it should be good to go.
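For reference, one workaround along the lines the warning suggests (a sketch, not necessarily what I committed) is to hash the textual form so all of the Int's bits are considered:

```motoko
import Int "mo:base/Int";
import Text "mo:base/Text";
import Hash "mo:base/Hash";

// Sketch of a bespoke hash that covers all of a large Int's bits by
// hashing its textual representation, instead of the deprecated Int.hash
// which truncates to 32 bits.
func hashInt(i : Int) : Hash.Hash {
  Text.hash(Int.toText(i));
};
```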
Nice, thank you! I think you might find this helpful.