**** dT 0.000
* top TEST ../../../../bin/varnishtest/tests/b00096.vtc starting
**** top extmacro def pkg_version=trunk
**** top extmacro def pkg_branch=trunk
**** top extmacro def pwd=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest
**** top extmacro def date(...)
**** top extmacro def string(...)
**** top extmacro def localhost=127.0.0.1
**** top extmacro def bad_backend=127.0.0.1:52392
**** top extmacro def listen_addr=127.0.0.1:0
**** top extmacro def bad_ip=192.0.2.255
**** top extmacro def topbuild=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub
**** top extmacro def topsrc=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/../..
**** top macro def testdir=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest/../../../../bin/varnishtest/tests
**** top macro def tmpdir=/Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658
**** top macro def vtcid=vtc.33729.79318658
** top === varnishtest "Test vcl_backend_refresh on streaming object"
* top VTEST Test vcl_backend_refresh on streaming object
** top === barrier b1 sock 2
**** b1 macro def b1_addr=127.0.0.1
**** b1 macro def b1_port=52393
**** b1 macro def b1_sock=127.0.0.1:52393
** top === server s1 {
** s1 Starting server
**** s1 macro def s1_addr=127.0.0.1
**** s1 macro def s1_port=52394
**** s1 macro def s1_sock=127.0.0.1:52394
* s1 Listen on 127.0.0.1:52394
** top === server s2 {
** s2 Starting server
**** s2 macro def s2_addr=127.0.0.1
** s1 Started on 127.0.0.1:52394 (1 iterations)
**** s2 macro def s2_port=52395
**** s2 macro def s2_sock=127.0.0.1:52395
* s2 Listen on 127.0.0.1:52395
** top === varnish v1 -vcl+backend {
** s2 Started on 127.0.0.1:52395 (1 iterations)
**** dT 0.007
** v1 Launch
*** v1 CMD: cd ${pwd} && exec varnishd -d -n /Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 52396' -P /Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658/v1/varnishd.pid -p vmod_path=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 CMD: cd /Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest && exec varnishd -d -n /Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 52396' -P /Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658/v1/varnishd.pid -p vmod_path=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 PID: 33747
**** v1 macro def v1_pid=33747
**** v1 macro def v1_name=/Users/bsdphk/VT/_vtest_tmp/vtc.33729.79318658/v1
**** dT 0.026
*** v1 debug|Debug: Version: varnish-trunk revision 58e30e74ed511726f5c55f6f00fc9fb38120b3b0
*** v1 debug|Debug: Platform: Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
*** v1 debug|200 307
*** v1 debug|-----------------------------
*** v1 debug|Varnish Cache CLI 1.0
*** v1 debug|-----------------------------
*** v1 debug|Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
*** v1 debug|varnish-trunk revision 58e30e74ed511726f5c55f6f00fc9fb38120b3b0
*** v1 debug|
*** v1 debug|Type 'help' for command list.
*** v1 debug|Type 'quit' to close CLI session.
*** v1 debug|Type 'start' to launch worker process.
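
Note on the "barrier b1 sock 2" step above: a sock barrier is a rendezvous point. The VCL loaded below calls vtc.barrier_sync("127.0.0.1:52393"), i.e. ${b1_sock}, and each party blocks until the expected two have checked in. A minimal Python sketch of that general mechanism, for illustration only; varnishtest's actual wire format is not shown in this log and may differ:

    import socket

    def run_sock_barrier(host: str, port: int, parties: int) -> None:
        # Accept connections until the expected number of parties is reached,
        # then answer and close them all at once, releasing every waiter.
        srv = socket.create_server((host, port))
        waiting = [srv.accept()[0] for _ in range(parties)]
        for conn in waiting:
            conn.sendall(b"go\n")
            conn.close()
        srv.close()
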
*** v1 debug|
**** dT 0.123
**** v1 CLIPOLL 1 0x1 0x0 0x0
*** v1 CLI connection fd = 8
**** dT 0.124
*** v1 CLI RX 107
**** v1 CLI RX|twhdztiibjxpktmkbinuwnjzocdamezk
**** v1 CLI RX|
**** v1 CLI RX|Authentication required.
**** v1 CLI TX|auth 635e85017aacce062ffcb61509df50dbd6420de8609fcded94e0711ec9800f14
**** dT 0.125
*** v1 CLI RX 200
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Varnish Cache CLI 1.0
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
**** v1 CLI RX|varnish-trunk revision 58e30e74ed511726f5c55f6f00fc9fb38120b3b0
**** v1 CLI RX|
**** v1 CLI RX|Type 'help' for command list.
**** v1 CLI RX|Type 'quit' to close CLI session.
**** v1 CLI RX|Type 'start' to launch worker process.
**** v1 CLI TX|vcl.inline vcl1 << %XJEIFLH|)Xspa8P
**** v1 CLI TX|vcl 4.1;
**** v1 CLI TX|backend s1 { .host = "127.0.0.1"; .port = "52394"; }
**** v1 CLI TX|backend s2 { .host = "127.0.0.1"; .port = "52395"; }
**** v1 CLI TX|
**** v1 CLI TX|
**** v1 CLI TX|\timport vtc;
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_recv {
**** v1 CLI TX|\t\tif (req.http.stale) {
**** v1 CLI TX|\t\t\tset req.backend_hint = s2;
**** v1 CLI TX|\t\t}
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_backend_response {
**** v1 CLI TX|\t\tset beresp.ttl = 0.01s;
**** v1 CLI TX|\t\tset beresp.grace = 0s;
**** v1 CLI TX|\t\tset beresp.keep = 10m;
**** v1 CLI TX|\t\tif (bereq.http.stale) {
**** v1 CLI TX|\t\t\tvtc.barrier_sync("127.0.0.1:52393");
**** v1 CLI TX|\t\t}
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_backend_refresh {
**** v1 CLI TX|\t\treturn (beresp);
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|
**** v1 CLI TX|%XJEIFLH|)Xspa8P
**** dT 0.234
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.344
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.416
*** v1 CLI RX 200
**** v1 CLI RX|VCL compiled.
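
Note on the 107/auth exchange above: per varnish-cli(7), status 107 carries a 32-byte challenge, and the reply is "auth" followed by the hex SHA256 digest of the challenge, a newline, the full contents of the secret file, then the challenge and a newline again. A sketch of that computation; the secret file for this run lives under the -n workdir and its contents are not in the log, so the exact digest above cannot be reproduced here, and the path in the usage comment is hypothetical:

    import hashlib

    def cli_auth_response(challenge: str, secret: bytes) -> str:
        # varnish-cli(7): SHA256(challenge NL secret-file-bytes challenge NL)
        h = hashlib.sha256()
        h.update(challenge.encode("ascii") + b"\n")
        h.update(secret)  # full file contents, trailing newline included
        h.update(challenge.encode("ascii") + b"\n")
        return h.hexdigest()

    # print("auth", cli_auth_response("twhdztiibjxpktmkbinuwnjzocdamezk",
    #                                 open("/path/to/_.secret", "rb").read()))
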
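Note: for readability, the inline VCL transmitted above, with the CLI heredoc framing ("vcl.inline vcl1 << %XJEIFLH|)Xspa8P" through the closing delimiter) stripped and the \t escapes expanded to plain indentation:

    vcl 4.1;

    backend s1 { .host = "127.0.0.1"; .port = "52394"; }
    backend s2 { .host = "127.0.0.1"; .port = "52395"; }

    import vtc;

    sub vcl_recv {
        if (req.http.stale) {
            set req.backend_hint = s2;
        }
    }

    sub vcl_backend_response {
        set beresp.ttl = 0.01s;
        set beresp.grace = 0s;
        set beresp.keep = 10m;
        if (bereq.http.stale) {
            vtc.barrier_sync("127.0.0.1:52393");
        }
    }

    sub vcl_backend_refresh {
        return (beresp);
    }
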
**** v1 CLI TX|vcl.use vcl1
*** v1 CLI RX 200
**** v1 CLI RX|VCL 'vcl1' now active
** v1 Start
**** v1 CLI TX|start
**** dT 0.421
*** v1 debug|Debug: Child (33757) Started
**** dT 0.454
*** v1 debug|Child launched OK
**** v1 vsl| 0 CLI - Rd vcl.load "vcl1" vcl_vcl1.1756735676.977452/vgc.so 1auto
**** dT 0.570
*** v1 CLI RX 200
*** v1 wait-running
**** v1 CLI TX|status
*** v1 debug|Info: Child (33757) said Child starts
*** v1 CLI RX 200
**** v1 CLI RX|Child in state running
**** v1 CLI TX|debug.listen_address
**** dT 0.571
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 52398
**** v1 CLI TX|debug.xid 1000
*** v1 CLI RX 200
**** v1 CLI RX|XID is 1000 chunk 1
**** v1 CLI TX|debug.listen_address
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 52398
** v1 Listen on 127.0.0.1 52398
**** v1 macro def v1_addr=127.0.0.1
**** v1 macro def v1_port=52398
**** v1 macro def v1_sock=127.0.0.1:52398
**** v1 macro def v1_a0_addr=127.0.0.1
**** v1 macro def v1_a0_port=52398
**** v1 macro def v1_a0_sock=127.0.0.1:52398
** top === client c1 {
** c1 Starting client
** top === delay 0.02
*** top delaying 0.02 second(s)
** c1 Started on 127.0.0.1:52398 (1 iterations)
*** c1 Connect to 127.0.0.1:52398
**** dT 0.572
*** c1 connected fd 18 from 127.0.0.1 52402 to 127.0.0.1:52398
** c1 === txreq
**** c1 txreq|GET / HTTP/1.1\r
**** c1 txreq|Host: 127.0.0.1\r
**** c1 txreq|User-Agent: c1\r
**** c1 txreq|\r
** c1 === rxresp
**** dT 0.593
** top === client c2 {
** c2 Starting client
**** dT 0.594
** top === client c1 -wait
** c1 Waiting for client
** c2 Started on 127.0.0.1:52398 (1 iterations)
*** c2 Connect to 127.0.0.1:52398
*** c2 connected fd 19 from 127.0.0.1 52403 to 127.0.0.1:52398
** c2 === txreq -hdr "stale: 1"
**** c2 txreq|GET / HTTP/1.1\r
**** c2 txreq|stale: 1\r
**** c2 txreq|Host: 127.0.0.1\r
**** c2 txreq|User-Agent: c2\r
**** c2 txreq|\r
** c2 === rxresp
**** dT 0.665
*** s2 accepted fd 20 127.0.0.1 52404
** s2 === rxreq
**** dT 0.666
**** s2 rxhdr|GET / HTTP/1.1\r
**** s2 rxhdr|stale: 1\r
**** s2 rxhdr|Host: 127.0.0.1\r
**** s2 rxhdr|User-Agent: c2\r
**** s2 rxhdr|X-Forwarded-For: 127.0.0.1\r
**** s2 rxhdr|Via: 1.1 v1 (Varnish/trunk)\r
**** s2 rxhdr|Accept-Encoding: gzip\r
**** s2 rxhdr|X-Varnish: 1004\r
**** s2 rxhdr|\r
**** s2 rxhdrlen = 158
**** s2 http[ 0] |GET
**** s2 http[ 1] |/
**** s2 http[ 2] |HTTP/1.1
**** s2 http[ 3] |stale: 1
**** s2 http[ 4] |Host: 127.0.0.1
**** s2 http[ 5] |User-Agent: c2
**** s2 http[ 6] |X-Forwarded-For: 127.0.0.1
**** s2 http[ 7] |Via: 1.1 v1 (Varnish/trunk)
**** s2 http[ 8] |Accept-Encoding: gzip
**** s2 http[ 9] |X-Varnish: 1004
**** s2 bodylen = 0
** s2 === expect req.http.if-none-match == "abcd"
---- s2 EXPECT req.http.if-none-match () == "abcd" failed
**** dT 0.674
**** v1 vsl| 0 CLI - Wr 200 52 Loaded "vcl_vcl1.1756735676.977452/vgc.so" as "vcl1"
**** v1 vsl| 0 CLI - Rd vcl.use "vcl1"
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd start
**** v1 vsl| 0 Debug - sockopt: Setting SO_LINGER for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting TCP_NODELAY for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 52398
**** v1 vsl| 0 CLI - Rd debug.xid 1000
**** v1 vsl| 0 CLI - Wr 200 19 XID is 1000 chunk 1
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 52398
**** v1 vsl| 1001 Begin c sess 0 HTTP/1
**** v1 vsl| 1000 Begin c sess 0 HTTP/1
**** v1 vsl| 1001 SessOpen c 127.0.0.1 52402 a0 127.0.0.1 52398 1756735677.516015 22
**** v1 vsl| 1000 SessOpen c 127.0.0.1 52403 a0 127.0.0.1 52398 1756735677.516015 23
**** v1 vsl| 1001 Debug c sockopt: Test confirmed SO_KEEPALIVE non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Test confirmed SO_SNDTIMEO non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Test confirmed SO_KEEPALIVE non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Test confirmed SO_RCVTIMEO non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Not testing nonhereditary SO_SNDTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Not testing nonhereditary SO_RCVTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Test confirmed TCP_NODELAY non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Test confirmed TCP_NODELAY non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Test confirmed TCP_KEEPALIVE non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Test confirmed TCP_KEEPALIVE non heredity for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Setting TCP_NODELAY for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 1001 Debug c sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Setting TCP_NODELAY for a0=127.0.0.1:52398
**** v1 vsl| 1000 Debug c sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 1001 Link c req 1002 rxreq
**** v1 vsl| 1000 Link c req 1003 rxreq
**** dT 0.714
**** b1 macro undef b1_addr
**** b1 macro undef b1_port
**** dT 0.715
**** b1 macro undef b1_sock
**** dT 1.646
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified SO_LINGER for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified SO_KEEPALIVE for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified TCP_NODELAY for a0=127.0.0.1:52398
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified TCP_KEEPALIVE for a0=127.0.0.1:52398
**** dT 3.477
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735680 1.0
**** dT 6.509
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735683 1.0
**** dT 9.550
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735686 1.0
**** dT 12.464
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735689 1.0
**** dT 15.504
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735692 1.0
**** dT 18.529
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735695 1.0
**** dT 21.564
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735698 1.0
**** dT 24.503
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735701 1.0
**** dT 27.562
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735704 1.0
**** dT 30.498
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756735707 1.0
**** dT 30.572
---- c1 HTTP rx timeout (fd:18 30.000s)
* top Aborting execution, test failed
* top RESETTING after ../../../../bin/varnishtest/tests/b00096.vtc
** c2 Waiting for client
**** dT 30.596
---- c2 HTTP rx timeout (fd:19 30.000s)
** s1 Waiting for server (5/-1)
** s2 Waiting for server (6/-1)
** v1 Wait
**** v1 CLI TX|panic.show
*** v1 CLI RX 300
**** v1 CLI RX|Child has not panicked or panic has been cleared
*** v1 debug|Info: manager stopping child
*** v1 debug|Debug: Stopping Child
**** dT 30.608
**** v1 vsl| 0 CLI - EOF on CLI connection, worker stops
**** dT 31.574
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 32.548
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 33.624
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 34.587
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 35.668
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 36.630
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 37.711
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 38.690
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 39.755
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 40.736
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 41.721
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 42.796
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 43.764
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 44.861
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 45.844
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 46.824
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 47.901
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 48.860
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 49.945
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 50.911
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 51.993
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 52.973
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 53.940
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 55.024
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 55.994
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 57.067
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 58.039
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
**** dT 59.122
*** v1 debug|Info: Child (33757) said shutdown waiting for 3 references on vcl1
# top TEST ../../../../bin/varnishtest/tests/b00096.vtc TIMED OUT (kill -9)
# top TEST ../../../../bin/varnishtest/tests/b00096.vtc FAILED (60.002) signal=9
FAIL tests/b00096.vtc (exit status: 2)