The auto progress mode does both.
Maybe you're requesting status in a tight loop, which doesn't give the PocketIC server a chance to tick on its own in auto progress mode?
You could try to send a request to the path /instances/0/read/ingress_status, available in the latest unreleased PocketIC server, bypassing the HTTP gateway and production HTTP handler, to check the status of your request based on its request id. This custom PocketIC API is again simpler (no signatures/ingress expiry) than the production API exposed by the HTTP gateway serving the /api/v2 and /api/v3 paths.
Tried a few different methods including running it manually in a terminal, all with the same results
How would one get that, or would I have to do a manual build with the repo?
Thinking out loud, there's got to be something wrong on my end, because why would a dfx call work with no tick while mine needs the tick? It doesn't seem to make sense.
I tried to replace the "tick" call with a delay or another HTTP call to see if it's just a timing/lock thing, but it doesn't work unless I tick at least once, even though I'm in auto_progress mode.
You can download it from the dfinity/ic release page.
OK, so it doesn't seem like it's an HTTP gateway specific thing. I tried the ingress_status endpoint and I get the same results: I only get an Ok with a "tick" and not with an artificial delay.
RequestId requestId = await agent.CallAsynchronousAsync(canisterId, "inc", CandidArg.Empty());

// ICTimestamp currentTime = await pocketIc.GetTimeAsync();
// await pocketIc.SetTimeAsync(currentTime + TimeSpan.FromSeconds(5)); // Doesn't work
await pocketIc.TickAsync(1); // Works
// await Task.Delay(5000); // Doesn't work

IngressStatus ingressStatus = await pocketIc.GetIngressStatusAsync(requestId, EffectivePrincipal.Canister(canisterId));
switch (ingressStatus.Type)
{
    case IngressStatusType.Ok:
        break;
    case IngressStatusType.NotFound:
        throw new Exception("Ingress message not found");
    default:
        throw new Exception("Unexpected ingress status: " + ingressStatus.Type);
}
I get the "NotFound" if there is no Tick call.
I know the AutoProgress works in some form because I have tests for it that have been working:
// Test auto progress time
await using (await pocketIc.AutoProgressTimeAsync())
{
    await Task.Delay(100);
    ICTimestamp autoProgressedTime = await pocketIc.GetTimeAsync();
    Assert.True(autoProgressedTime.NanoSeconds > resetTime.NanoSeconds);
}
Could you please compare the logs (exporting RUST_LOG=trace) with and without additional ticks?
So I took the test code above and ran it once with the "tick" call and once with it commented out, that's all.
They both seem to run advance_time_and_tick in between the ingress_status call and the process_canister_http_internal call, not before the ingress_status call. So advance_time_and_tick is not being triggered before then? But it doesn't seem to be an issue I can just fix with a delay.
I'm also not sure what process_canister_http_internal is.
Without Tick
// Start async call
2024-12-13T22:15:19.940410Z DEBUG request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2024-12-13T22:15:19.940928Z TRACE request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(125F6973BDBAD77FE9093FD3BF9C33B47EF738F2291127BABD26B3EC0475137D))
2024-12-13T22:15:19.940973Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0A000000000000000000000000000000) op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(125F6973BDBAD77FE9093FD3BF9C33B47EF738F2291127BABD26B3EC0475137D))
2024-12-13T22:15:19.941334Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(125F6973BDBAD77FE9093FD3BF9C33B47EF738F2291127BABD26B3EC0475137D))
2024-12-13T22:15:19.941345Z TRACE request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(125F6973BDBAD77FE9093FD3BF9C33B47EF738F2291127BABD26B3EC0475137D))
2024-12-13T22:15:19.941362Z DEBUG request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=202
// End async call
2024-12-13T22:15:19.941436Z TRACE hyper_util::client::legacy::pool: put; add idle connection for ("http", localhost:43029)
2024-12-13T22:15:19.941441Z DEBUG hyper_util::client::legacy::pool: pooling idle connection for ("http", localhost:43029)
// Start Ingress_status call
2024-12-13T22:15:19.992203Z DEBUG request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2024-12-13T22:15:19.992687Z TRACE request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=ingress_status_0x26acb65104aaa0aed8e36d7984fdfdd939d56b08c4f520c4948a15bee463a468
2024-12-13T22:15:19.992742Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0B000000000000000000000000000000) op_id=ingress_status_0x26acb65104aaa0aed8e36d7984fdfdd939d56b08c4f520c4948a15bee463a468
2024-12-13T22:15:19.992765Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=ingress_status_0x26acb65104aaa0aed8e36d7984fdfdd939d56b08c4f520c4948a15bee463a468
2024-12-13T22:15:19.992799Z TRACE request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=ingress_status_0x26acb65104aaa0aed8e36d7984fdfdd939d56b08c4f520c4948a15bee463a468
2024-12-13T22:15:19.992814Z DEBUG request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200
// End ingress_status call
// Start advance_time_and_tick
2024-12-13T22:15:20.006910Z TRACE pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=advance_time_and_tick(104.399937ms)
2024-12-13T22:15:20.006971Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0C000000000000000000000000000000) op_id=advance_time_and_tick(104.399937ms)
2024-12-13T22:15:20.009221Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=advance_time_and_tick(104.399937ms)
2024-12-13T22:15:20.009268Z TRACE pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=advance_time_and_tick(104.399937ms)
// End advance_time_and_tick
// Start process_canister_http_internal
2024-12-13T22:15:20.009274Z TRACE pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=process_canister_http_internal
2024-12-13T22:15:20.009297Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0D000000000000000000000000000000) op_id=process_canister_http_internal
2024-12-13T22:15:20.009316Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=process_canister_http_internal
2024-12-13T22:15:20.009342Z TRACE pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=process_canister_http_internal
// End process_canister_http_internal
// Fail in my code with "Ingress message not found"
Exception thrown: 'System.Exception' in Sample.PocketIC.dll
With Tick
// Start async call
2024-12-13T22:14:52.108062Z DEBUG request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2024-12-13T22:14:52.108712Z TRACE request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(2AF10D632FAEEBEE0F186C3696CFDC2E979B3E6D536E6D37D43F2E12A6981C0D))
2024-12-13T22:14:52.108742Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0C000000000000000000000000000000) op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(2AF10D632FAEEBEE0F186C3696CFDC2E979B3E6D536E6D37D43F2E12A6981C0D))
2024-12-13T22:14:52.109127Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(2AF10D632FAEEBEE0F186C3696CFDC2E979B3E6D536E6D37D43F2E12A6981C0D))
2024-12-13T22:14:52.109170Z TRACE request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=call(7tjcv-pp777-77776-qaaaa-cai,Digest(2AF10D632FAEEBEE0F186C3696CFDC2E979B3E6D536E6D37D43F2E12A6981C0D))
2024-12-13T22:14:52.109188Z DEBUG request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=1 ms status=202
// End async call
2024-12-13T22:14:52.109248Z TRACE hyper_util::client::legacy::pool: put; add idle connection for ("http", localhost:37621)
2024-12-13T22:14:52.109254Z DEBUG hyper_util::client::legacy::pool: pooling idle connection for ("http", localhost:37621)
// Start tick call
2024-12-13T22:14:52.119064Z DEBUG request{method=POST uri=/instances/0/update/tick version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2024-12-13T22:14:52.119527Z TRACE request{method=POST uri=/instances/0/update/tick version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=tick
2024-12-13T22:14:52.119606Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0D000000000000000000000000000000) op_id=tick
2024-12-13T22:14:52.122470Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=tick
2024-12-13T22:14:52.122534Z TRACE request{method=POST uri=/instances/0/update/tick version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=tick
2024-12-13T22:14:52.122560Z DEBUG request{method=POST uri=/instances/0/update/tick version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=3 ms status=200
// End tick call
// Start Ingress_status call
2024-12-13T22:14:52.146569Z DEBUG request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: tower_http::trace::on_request: started processing request
2024-12-13T22:14:52.147109Z TRACE request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=ingress_status_0x471701f56b556c4c440ef7143c1c5b2f7ea404d2a9bc3f7a8c54eae075ea3ad8
2024-12-13T22:14:52.147154Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0E000000000000000000000000000000) op_id=ingress_status_0x471701f56b556c4c440ef7143c1c5b2f7ea404d2a9bc3f7a8c54eae075ea3ad8
2024-12-13T22:14:52.147189Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=ingress_status_0x471701f56b556c4c440ef7143c1c5b2f7ea404d2a9bc3f7a8c54eae075ea3ad8
2024-12-13T22:14:52.147231Z TRACE request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=ingress_status_0x471701f56b556c4c440ef7143c1c5b2f7ea404d2a9bc3f7a8c54eae075ea3ad8
2024-12-13T22:14:52.147270Z DEBUG request{method=POST uri=/instances/0/read/ingress_status version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=0 ms status=200
// End Ingress_status call
// Start advance_time_and_tick
2024-12-13T22:14:52.152311Z TRACE pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=advance_time_and_tick(103.196967ms)
2024-12-13T22:14:52.152354Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(0F000000000000000000000000000000) op_id=advance_time_and_tick(103.196967ms)
2024-12-13T22:14:52.154744Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=advance_time_and_tick(103.196967ms)
2024-12-13T22:14:52.154780Z TRACE pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=advance_time_and_tick(103.196967ms)
// End advance_time_and_tick
// Start process_canister_http_internal
2024-12-13T22:14:52.154785Z TRACE pocket_ic_server::state_api::state: update_with_timeout::start instance_id=0 op_id=process_canister_http_internal
2024-12-13T22:14:52.154805Z TRACE pocket_ic_server::state_api::state: bg_task::start instance_id=0 state_label=StateLabel(10000000000000000000000000000000) op_id=process_canister_http_internal
2024-12-13T22:14:52.154823Z TRACE pocket_ic_server::state_api::state: bg_task::end instance_id=0 op_id=process_canister_http_internal
2024-12-13T22:14:52.154847Z TRACE pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=process_canister_http_internal
// End process_canister_http_internal
BUT
When I do add a delay instead of a tick:
await Task.Delay(500);
I do get advance_time_and_tick many times, along with process_canister_http_internal repeating, but it still has the same issue of "Ingress message not found". So it's not an advance_time_and_tick issue?
Could you please also again check that your system time matches the PocketIC time available at /instances/0/read/get_time?
If you see
2024-12-13T22:14:52.109188Z DEBUG request{method=POST uri=/instances/0/api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call version=HTTP/1.1}: tower_http::trace::on_response: finished processing request latency=1 ms status=202
followed by
2024-12-13T22:15:20.009268Z TRACE pocket_ic_server::state_api::state: update_with_timeout::synchronous instance_id=0 op_id=advance_time_and_tick(104.399937ms)
and your message is the only pending message (so there's no contention on the server), then mismatching PocketIC time is the only reason I see for the ingress status being unavailable.
Get time from PocketIC seems to match:
PocketIC time: 12/14/2024 21:50:33
System Time: 12/14/2024 21:50:33
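As an aside, a comparison like this can be scripted; the following is only a rough sketch using the ICP.NET names from the snippets above, and it assumes ICTimestamp.NanoSeconds can be cast to a ulong count of nanoseconds since the Unix epoch:
// Sketch: compare PocketIC time against the local system clock.
// Assumption: ICTimestamp.NanoSeconds is castable to a ulong of nanoseconds since the Unix epoch.
ICTimestamp pocketIcTime = await pocketIc.GetTimeAsync();
ulong nanos = (ulong)pocketIcTime.NanoSeconds;
DateTime pocketIcUtc = DateTime.UnixEpoch.AddTicks((long)(nanos / 100)); // 100 ns per .NET tick
TimeSpan drift = DateTime.UtcNow - pocketIcUtc;
Debug.WriteLine($"PocketIC: {pocketIcUtc:u}, System: {DateTime.UtcNow:u}, drift: {drift}");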
So, I did some more testing to see why the dfx is different:
- If I do --async with dfx canister call, use that request id, and then get the ingress_status, it is also NotFound (unless I do a manual tick call).
- I'm currently looking into the non --async version. It seems dfx is using v2, so it's a little different. I'll have to try.
Have you waited until you see op_id=advance_time_and_tick in the logs?
Been going through many, many logs and I am not seeing anything helpful.
But when doing some experimentation, I found:
- BOTH the dfx AND code CAN work with a delay/waiting
- The dfx always works
- The code works about 7% of the time
I noticed once in a while that the code would work, even with a small delay, just like the dfx, but had a hard time pinning it down. I switched from looping to see if a request would EVER process to just making many different requests, and about 7% of the time (small sample size) the code would process with a 100ms delay, while the dfx call would work 100% of the time with a 100ms delay.
Now the question is what is the issue?
It seems like it has to be on my side. Everything seems identical as far as I can tell in the log comparisons.
My focus was to look at whether there is some issue with parsing/building the request id for the requests, but then there would normally be an issue, AND then I went and checked v3, because there is no requestId parsing and waiting there.
V3 has the exact same issue: only a very small fraction is getting through. I was getting something like 1 in 30-ish successful, so it doesn't seem like a request id issue.
And I made sure all of this is running the same code, except for the dfx vs code call:
[Fact]
public async Task HttpGateway_Ingress_Code()
{
    await this.Common(async (agent, canisterId, httpGateway) =>
    {
        return await agent.CallAsynchronousAsync(canisterId, "inc", CandidArg.Empty());
    });
}

[Fact]
public async Task HttpGateway_Ingress_Dfx()
{
    await this.Common(async (agent, canisterId, httpGateway) =>
    {
        var processInfo = new ProcessStartInfo
        {
            FileName = "dfx",
            Arguments = $"canister call {canisterId} inc () --network {httpGateway.Url} --identity anonymous --verbose --async",
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false,
            CreateNoWindow = true,
            WorkingDirectory = "/home/gekctek/git/ICP.NET",
        };
        Debug.WriteLine("dfx " + processInfo.Arguments);
        using (var process = Process.Start(processInfo))
        {
            string output = process!.StandardOutput.ReadToEnd();
            string error = process.StandardError.ReadToEnd();
            process.WaitForExit();
            // Optionally log or handle the output/error
            Debug.WriteLine($"Output: {output}");
            Debug.WriteLine($"Error: {error}");
            return new RequestId(Convert.FromHexString(output.Trim().Substring(2)));
        }
    });
}
I just don't understand what could be different here. I'm trying to think of ways to better compare the requests or something to see what's different, but the logs aren't helping me.
My current gut feeling is that my client does something a little different/wrong, but randomly(?), and the IC mainnet can handle the difference fine but PocketIC can't?
I'm done for the night, but I feel like I'm getting nowhere.
I'd suggest polling for the ingress status (with sleeps in between, i.e., not in a tight loop) until the ingress status is known (with a generous timeout, e.g., a few minutes). I wouldn't rely on the ingress status always being available within a fixed and small delay, e.g., 100 ms.
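For example, something along these lines (a sketch reusing the API names from the C# snippets above; exact signatures and enum values may differ):
// Sketch: poll until the ingress status is known, sleeping between attempts.
DateTime deadline = DateTime.UtcNow + TimeSpan.FromMinutes(2); // generous timeout
IngressStatus ingressStatus = await pocketIc.GetIngressStatusAsync(requestId, EffectivePrincipal.Canister(canisterId));
while (ingressStatus.Type == IngressStatusType.NotFound)
{
    if (DateTime.UtcNow > deadline)
    {
        throw new TimeoutException("Ingress status not available within the timeout");
    }
    await Task.Delay(200); // sleep instead of polling in a tight loop
    ingressStatus = await pocketIc.GetIngressStatusAsync(requestId, EffectivePrincipal.Canister(canisterId));
}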
You could consider using a networking tool such as Wireshark to monitor your local HTTP traffic and compare the relevant HTTP requests to PocketIC.
That's what my initial tests were: a mixture of just long delays and doing loops with the same request with delays. Both just sit indefinitely.
I'll see if I can find anything with this.
OK, I have been capturing and comparing the HTTP requests and CBOR bodies, without much luck. Everything seems to match.
Code:
POST /api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call HTTP/1.1
Host: localhost:43031
Accept: application/cbor
Content-Type: application/cbor
Content-Length: 115
D9 D9F7 # tag(55799)
A1 # map(1)
67 # text(7)
636F6E74656E74 # "content"
A7 # map(7)
6C # text(12)
726571756573745F74797065 # "request_type"
64 # text(4)
63616C6C # "call"
6B # text(11)
63616E69737465725F6964 # "canister_id"
4A # bytes(10)
FFFFFFFFFFD000000101 # "\xFF\xFF\xFF\xFF\xFF\xD0\u0000\u0000\u0001\u0001"
6B # text(11)
6D6574686F645F6E616D65 # "method_name"
63 # text(3)
696E63 # "inc"
63 # text(3)
617267 # "arg"
46 # bytes(6)
4449444C0000 # "DIDL\u0000\u0000"
66 # text(6)
73656E646572 # "sender"
41 # bytes(1)
04 # "\u0004"
6E # text(14)
696E67726573735F657870697279 # "ingress_expiry"
1B 181164CCA3FBF740 # unsigned(1734278161601656640)
65 # text(5)
6E6F6E6365 # "nonce"
50 # bytes(16)
2DEC2F8B3A00250A1A3598341114BB3C # "-\xEC/\x8B:\u0000%\n\u001A5\x984\u0011\u0014\xBB<"
DFX:
POST /api/v2/canister/7tjcv-pp777-77776-qaaaa-cai/call HTTP/1.1
content-type: application/cbor
accept: */*
host: localhost:43031
content-length: 137
D9 D9F7 # tag(55799)
A1 # map(1)
67 # text(7)
636F6E74656E74 # "content"
A7 # map(7)
6C # text(12)
726571756573745F74797065 # "request_type"
64 # text(4)
63616C6C # "call"
65 # text(5)
6E6F6E6365 # "nonce"
50 # bytes(16)
0B041C680C8D52D5D53DE9782C15D88E # "\v\u0004\u001Ch\f\x8DR\xD5\xD5=\xE9x,\u0015\xD8\x8E"
6E # text(14)
696E67726573735F657870697279 # "ingress_expiry"
1B 1811635314FE5800 # unsigned(1734276540000000000)
66 # text(6)
73656E646572 # "sender"
41 # bytes(1)
04 # "\u0004"
6B # text(11)
63616E69737465725F6964 # "canister_id"
4A # bytes(10)
FFFFFFFFFFD000000101 # "\xFF\xFF\xFF\xFF\xFF\xD0\u0000\u0000\u0001\u0001"
6B # text(11)
6D6574686F645F6E616D65 # "method_name"
63 # text(3)
696E63 # "inc"
63 # text(3)
617267 # "arg"
46 # bytes(6)
4449444C0000 # "DIDL\u0000\u0000"
I've also been trying things like making sure the CBOR matches perfectly and is in the right order, and that the HTTP requests match in every other way, but with no change.
Also, inspecting the "working" code calls vs the "not working" code calls doesn't show anything either, that I can see.
In any case, there is something happening on the PocketIC side, even if it's a bug on my end.
I feel like my next step is to debug the PocketIC side.
Is it reasonable to set up the PocketIC source code and run it locally vs the binary?
I'm not sure what else to do but see where the issue is coming from, and I'm not sure what I'm even looking for.
The Content-Length: 115 header in your code is less than the actual content length of 137 bytes.
The ingress expiry values are quite off: Sun Dec 15 2024 15:29:00 GMT+0000 for dfx vs Sun Dec 15 2024 15:56:01 GMT+0000 for your code. Did you run these requests in quick succession? If so, then your ingress expiry is too large: the ingress expiry must not be in the past and must not be more than 5 minutes into the future.
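For illustration, a valid ingress_expiry can be computed as "now" plus a few minutes, expressed as nanoseconds since the Unix epoch; a minimal sketch (the exact margin an agent applies varies):
// Sketch: ingress_expiry as nanoseconds since the Unix epoch, a few minutes ahead of now.
// It must not be in the past and must not be more than ~5 minutes in the future.
DateTime expiry = DateTime.UtcNow.AddMinutes(4);
ulong ingressExpiryNanos = (ulong)(expiry - DateTime.UnixEpoch).Ticks * 100; // 1 .NET tick = 100 ns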
Sure. You can clone the dfinity/ic repository, enter the dev container using ./gitlab-ci/ci/container/container-run.sh, and then run bazel build //rs/pocket_ic_server:pocket-ic-server. Finally, copy the binary to the root of the repository (cp bazel-bin/rs/pocket_ic_server/pocket-ic-server .) and then you can start it from there.
You can println! the request ID (variable message_id) in the v2 or v3 handler (whichever you're using) and make sure your code arrives at the same request ID. If you're using the v3 handler, you can check out its response using the HTTP traffic monitoring tool (if the update call is executed quickly, within a few seconds, the HTTP response should already contain the update call response).
No, I spaced them out to track the HTTP calls, so that should be good.
I can see the v2/v3 message id and it does match, and I can see the HTTP response, if it comes back.
It seems to get stuck on this line:
match certification_subscriber
    .wait_for_certification()
    .timeout(Duration::from_secs(
        ingress_message_certificate_timeout_seconds,
    ))
    .await
{
Are there some good places you would recommend I check to see what's going on? It gets stuck here every time except on the ones where it is successful. This whole thing is quite complicated, so I'm not even sure where to start.