Thank you for sharing. Being a new Motoko dev, I agree that it would be helpful to understand when to use an Array. I feel like I should avoid them entirely and always use a Buffer, but that’s probably incorrect.
Edit: Looking at the Buffer class, it seems like it is always mutable. So maybe the preference is to use a Buffer until you need to do some operation on a subset of its entries that won’t change, in which case I would use Buffer.toArray() to create a fixed-size array. Does that sound right?
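For illustration, a minimal sketch of that pattern (assuming a recent version of base where Buffer.toArray is a module function; in older versions it was a method on the Buffer class):

```motoko
import Buffer "mo:base/Buffer";

// Hypothetical sketch: grow the sequence in a mutable Buffer...
let entries = Buffer.Buffer<Nat>(0);
entries.add(1);
entries.add(2);
entries.add(3);

// ...then snapshot it as a fixed-size, immutable array once it no longer needs to grow.
let snapshot : [Nat] = Buffer.toArray(entries);
```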
First, on behalf of Motoko the language and ecosystem, I apologize that these design choices are not documented more clearly.
There are roughly two ways that a value of some type could arise in your code (I say “entry point” for any public method of your actor):

- The value is produced entirely during the activation of some entry point. It serves some role in answering the query or update request there.
- The value is stored in an actor var (stable or otherwise) and mutated there. Its data persists between entry points’ activations and generally grows larger and larger over time, with successive activations.
To avoid performance issues related to Array.append:

- For arrays in the first category, Array.append is fine to use, assuming one avoids calling it in a loop. This category corresponds to the situation mentioned here by @roman-kashitsyn.
- For every value in the second category, use a Buffer to store large sequences, not an Array. Yes, it is very tempting to use a stable var of type array, because buffers are not stable and cannot be stored that way. However, please resist this temptation unless there happens to be a natural bound for the array and it will never grow, only mutate in place. Since that basically never happens in practice, please use a buffer (stored as a non-stable actor var) to store this data. To persist it, use the pre- and post-upgrade hooks to write its data to, and read its data from, an array used only in that step (see the sketch after this list). Alternatively, I wrote a tree-based (think “ropes”) representation for sequences that can be stored directly in stable vars. It should be worst-case O(log n) time for all insertions, making it asymptotically even better than Buffer. However, we have not tested this structure much; it remains experimental and is not yet part of base.
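For concreteness, here is a minimal sketch of the buffer-plus-upgrade-hooks pattern. The names (stableEvents, events, addEvent) are hypothetical, and it assumes a recent version of base where Buffer.toArray and Buffer.fromArray are module functions:

```motoko
import Buffer "mo:base/Buffer";

actor {
  // Stable array used only around upgrades.
  stable var stableEvents : [Text] = [];

  // Non-stable Buffer that actually holds the growing sequence.
  var events = Buffer.Buffer<Text>(0);

  system func preupgrade() {
    // Copy the buffer's contents into the stable array before the upgrade.
    stableEvents := Buffer.toArray(events);
  };

  system func postupgrade() {
    // Rebuild the buffer from the stable array, then release the array.
    events := Buffer.fromArray<Text>(stableEvents);
    stableEvents := [];
  };

  public func addEvent(e : Text) : async () {
    events.add(e);
  };
};
```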
If any use case you have is hard to classify into those two situations, let me know. I’d be happy to help give more details and figure it out.
Thank you @matthewhammer, this is really helpful!
I will follow this guidance while updating the canister and let you know if I run into any difficult decisions.
I apologize for taking so long to follow up on this thread. We solved the problem a while back and I neglected to close the issue (even after @domwoe was kind enough to remind me). I understand this is not good etiquette on a forum like this.
In short, the cycles leak was due to my failure to initialize the Certified Asset Provenance (CAP) service before deploying our NFT canister. During operation, whenever a token was sold or transferred, the CAP service would attempt to report that event using the cronCapEvents method:
```motoko
public shared(msg) func cronCapEvents() : async () {
  canistergeekMonitor.collectMetrics();
  logEvent(level_3, "cronCapEvents()");

  // Drain the queue of pending CAP events, reporting each one to the CAP service.
  var _cont : Bool = true;
  while (_cont) {
    var last = List.pop(_capEvents);
    switch (last.0) {
      case (?event) {
        _capEvents := last.1;
        try {
          ignore await CapService.insert(event);
        } catch (e) {
          // On failure, push the event back onto the queue and log the error.
          _capEvents := List.push(event, _capEvents);
          logEvent(level_2, "CapService Error : cronCapEvents()");
        };
      };
      case (_) {
        // Queue is empty; stop draining.
        _cont := false;
      };
    };
  };
};
```
```motoko
public shared(msg) func initCap() : async () {
  canistergeekMonitor.collectMetrics();
  logEvent(level_1, "initCap()");

  // One-time handshake with the CAP router to obtain a root bucket for this canister.
  if (Option.isNull(capRootBucketId)) {
    try {
      capRootBucketId := await CapService.handshake(Principal.toText(Principal.fromActor(this)), 1_000_000_000_000);
    } catch e {};
  };
};
```
After executing initCap, we no longer experienced any elevated burn rates and have been running smoothly for about a month now. On average we burn ~0.27T cycles/day.
Thank you to everyone who helped with this. It was a great learning experience.