**** dT 0.000
* top TEST ../../../../bin/varnishtest/tests/b00096.vtc starting
**** top extmacro def pkg_version=trunk
**** top extmacro def pkg_branch=trunk
**** top extmacro def pwd=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest
**** top extmacro def date(...)
**** top extmacro def string(...)
**** top extmacro def localhost=127.0.0.1
**** top extmacro def bad_backend=127.0.0.1:64104
**** top extmacro def listen_addr=127.0.0.1:0
**** top extmacro def bad_ip=192.0.2.255
**** top extmacro def topbuild=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub
**** top extmacro def topsrc=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/../..
**** top macro def testdir=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest/../../../../bin/varnishtest/tests
**** top macro def tmpdir=/Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b
**** top macro def vtcid=vtc.75360.4eb8f09b
** top === varnishtest "Test vcl_backend_refresh on streaming object"
* top VTEST Test vcl_backend_refresh on streaming object
** top === barrier b1 sock 2
**** b1 macro def b1_addr=127.0.0.1
**** b1 macro def b1_port=64105
**** b1 macro def b1_sock=127.0.0.1:64105
** top === server s1 {
** s1 Starting server
**** s1 macro def s1_addr=127.0.0.1
**** s1 macro def s1_port=64106
**** s1 macro def s1_sock=127.0.0.1:64106
* s1 Listen on 127.0.0.1:64106
** top === server s2 {
** s2 Starting server
**** s2 macro def s2_addr=127.0.0.1
**** s2 macro def s2_port=64107
**** s2 macro def s2_sock=127.0.0.1:64107
** s1 Started on 127.0.0.1:64106 (1 iterations)
* s2 Listen on 127.0.0.1:64107
** top === varnish v1 -vcl+backend {
** s2 Started on 127.0.0.1:64107 (1 iterations)
**** dT 0.007
** v1 Launch
*** v1 CMD: cd ${pwd} && exec varnishd -d -n /Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 64108' -P /Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b/v1/varnishd.pid -p vmod_path=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 CMD: cd /Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/bin/varnishtest && exec varnishd -d -n /Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b/v1 -i v1 -l 2m -p auto_restart=off -p syslog_cli_traffic=off -p thread_pool_min=10 -p debug=+vtc_mode -p vsl_mask=+Debug,+H2RxHdr,+H2RxBody -p h2_initial_window_size=1m -p h2_rx_window_low_water=64k -a '127.0.0.1:0' -M '127.0.0.1 64108' -P /Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b/v1/varnishd.pid -p vmod_path=/Users/bsdphk/VT/varnish-cache/varnish-trunk/_build/sub/vmod/.libs
*** v1 PID: 75378
**** v1 macro def v1_pid=75378
**** v1 macro def v1_name=/Users/bsdphk/VT/_vtest_tmp/vtc.75360.4eb8f09b/v1
**** dT 0.025
*** v1 debug|Debug: Version: varnish-trunk revision 092c3a251979a846b932fdffa0a27c631a4d9df4
*** v1 debug|Debug: Platform: Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
*** v1 debug|200 307
*** v1 debug|-----------------------------
*** v1 debug|Varnish Cache CLI 1.0
*** v1 debug|-----------------------------
*** v1 debug|Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
*** v1 debug|varnish-trunk revision 092c3a251979a846b932fdffa0a27c631a4d9df4
*** v1 debug|
*** v1 debug|Type 'help' for command list.
*** v1 debug|Type 'quit' to close CLI session.
*** v1 debug|Type 'start' to launch worker process.
*** v1 debug|
**** dT 0.123
**** v1 CLIPOLL 1 0x1 0x0 0x0
*** v1 CLI connection fd = 8
**** dT 0.124
*** v1 CLI RX 107
**** v1 CLI RX|steqnmvgiazxtfhdjkryjzidphrsuqdt
**** v1 CLI RX|
**** v1 CLI RX|Authentication required.
**** v1 CLI TX|auth c5efc05d6ccc0b83a12eaa66e590f97826202f0de22237ef906c0d29217b5df7
**** dT 0.125
*** v1 CLI RX 200
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Varnish Cache CLI 1.0
**** v1 CLI RX|-----------------------------
**** v1 CLI RX|Darwin,24.5.0,arm64,-jnone,-sdefault,-sdefault,-hcritbit
**** v1 CLI RX|varnish-trunk revision 092c3a251979a846b932fdffa0a27c631a4d9df4
**** v1 CLI RX|
**** v1 CLI RX|Type 'help' for command list.
**** v1 CLI RX|Type 'quit' to close CLI session.
**** v1 CLI RX|Type 'start' to launch worker process.
**** v1 CLI TX|vcl.inline vcl1 << %XJEIFLH|)Xspa8P
**** v1 CLI TX|vcl 4.1;
**** v1 CLI TX|backend s1 { .host = "127.0.0.1"; .port = "64106"; }
**** v1 CLI TX|backend s2 { .host = "127.0.0.1"; .port = "64107"; }
**** v1 CLI TX|
**** v1 CLI TX|
**** v1 CLI TX|\timport vtc;
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_recv {
**** v1 CLI TX|\t\tif (req.http.stale) {
**** v1 CLI TX|\t\t\tset req.backend_hint = s2;
**** v1 CLI TX|\t\t}
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_backend_response {
**** v1 CLI TX|\t\tset beresp.ttl = 0.01s;
**** v1 CLI TX|\t\tset beresp.grace = 0s;
**** v1 CLI TX|\t\tset beresp.keep = 10m;
**** v1 CLI TX|\t\tif (bereq.http.stale) {
**** v1 CLI TX|\t\t\tvtc.barrier_sync("127.0.0.1:64105");
**** v1 CLI TX|\t\t}
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|\tsub vcl_backend_refresh {
**** v1 CLI TX|\t\treturn (beresp);
**** v1 CLI TX|\t}
**** v1 CLI TX|
**** v1 CLI TX|
**** v1 CLI TX|%XJEIFLH|)Xspa8P
**** dT 0.235
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.345
*** v1 vsl|No VSL chunk found (child not started ?)
**** dT 0.413
*** v1 CLI RX 200
**** v1 CLI RX|VCL compiled.
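For readability, the inline VCL transmitted over the CLI above (the `**** v1 CLI TX|` records between the two `%XJEIFLH|)Xspa8P` here-document delimiters, with the `\t` escapes expanded) corresponds to the following program; the backend and barrier addresses are this run's ephemeral ports:

```vcl
vcl 4.1;

backend s1 { .host = "127.0.0.1"; .port = "64106"; }
backend s2 { .host = "127.0.0.1"; .port = "64107"; }

import vtc;

sub vcl_recv {
	if (req.http.stale) {
		set req.backend_hint = s2;
	}
}

sub vcl_backend_response {
	set beresp.ttl = 0.01s;
	set beresp.grace = 0s;
	set beresp.keep = 10m;
	if (bereq.http.stale) {
		vtc.barrier_sync("127.0.0.1:64105");
	}
}

sub vcl_backend_refresh {
	return (beresp);
}
```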
**** v1 CLI TX|vcl.use vcl1
*** v1 CLI RX 200
**** v1 CLI RX|VCL 'vcl1' now active
** v1 Start
**** v1 CLI TX|start
**** dT 0.418
*** v1 debug|Debug: Child (75388) Started
**** dT 0.454
*** v1 debug|Child launched OK
**** dT 0.563
**** v1 vsl| 0 CLI - Rd vcl.load "vcl1" vcl_vcl1.1756993618.150307/vgc.so 1auto
**** dT 0.569
*** v1 debug|Info: Child (75388) said Child starts
*** v1 CLI RX 200
*** v1 wait-running
**** v1 CLI TX|status
**** dT 0.570
*** v1 CLI RX 200
**** v1 CLI RX|Child in state running
**** v1 CLI TX|debug.listen_address
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 64110
**** v1 CLI TX|debug.xid 1000
*** v1 CLI RX 200
**** v1 CLI RX|XID is 1000 chunk 1
**** v1 CLI TX|debug.listen_address
*** v1 CLI RX 200
**** v1 CLI RX|a0 127.0.0.1 64110
** v1 Listen on 127.0.0.1 64110
**** v1 macro def v1_addr=127.0.0.1
**** v1 macro def v1_port=64110
**** v1 macro def v1_sock=127.0.0.1:64110
**** v1 macro def v1_a0_addr=127.0.0.1
**** v1 macro def v1_a0_port=64110
**** v1 macro def v1_a0_sock=127.0.0.1:64110
** top === client c1 {
** c1 Starting client
** top === delay 0.02
*** top delaying 0.02 second(s)
** c1 Started on 127.0.0.1:64110 (1 iterations)
*** c1 Connect to 127.0.0.1:64110
*** c1 connected fd 18 from 127.0.0.1 64114 to 127.0.0.1:64110
** c1 === txreq
**** c1 txreq|GET / HTTP/1.1\r
**** c1 txreq|Host: 127.0.0.1\r
**** c1 txreq|User-Agent: c1\r
**** c1 txreq|\r
** c1 === rxresp
**** dT 0.592
** top === client c2 {
** c2 Starting client
** top === client c1 -wait
** c1 Waiting for client
** c2 Started on 127.0.0.1:64110 (1 iterations)
*** c2 Connect to 127.0.0.1:64110
**** dT 0.593
*** c2 connected fd 19 from 127.0.0.1 64115 to 127.0.0.1:64110
** c2 === txreq -hdr "stale: 1"
**** c2 txreq|GET / HTTP/1.1\r
**** c2 txreq|stale: 1\r
**** c2 txreq|Host: 127.0.0.1\r
**** c2 txreq|User-Agent: c2\r
**** c2 txreq|\r
** c2 === rxresp
**** dT 0.655
*** s2 accepted fd 20 127.0.0.1 64116
** s2 === rxreq
**** dT 0.656
**** s2 rxhdr|GET / HTTP/1.1\r
**** s2 rxhdr|stale: 1\r
**** s2 rxhdr|Host: 127.0.0.1\r
**** s2 rxhdr|User-Agent: c2\r
**** s2 rxhdr|X-Forwarded-For: 127.0.0.1\r
**** s2 rxhdr|Via: 1.1 v1 (Varnish/trunk)\r
**** s2 rxhdr|Accept-Encoding: gzip\r
**** s2 rxhdr|X-Varnish: 1004\r
**** s2 rxhdr|\r
**** s2 rxhdrlen = 158
**** s2 http[ 0] |GET
**** s2 http[ 1] |/
**** s2 http[ 2] |HTTP/1.1
**** s2 http[ 3] |stale: 1
**** s2 http[ 4] |Host: 127.0.0.1
**** s2 http[ 5] |User-Agent: c2
**** s2 http[ 6] |X-Forwarded-For: 127.0.0.1
**** s2 http[ 7] |Via: 1.1 v1 (Varnish/trunk)
**** s2 http[ 8] |Accept-Encoding: gzip
**** s2 http[ 9] |X-Varnish: 1004
**** s2 bodylen = 0
** s2 === expect req.http.if-none-match == "abcd"
---- s2 EXPECT req.http.if-none-match () == "abcd" failed
**** dT 0.670
**** v1 vsl| 0 CLI - Wr 200 52 Loaded "vcl_vcl1.1756993618.150307/vgc.so" as "vcl1"
**** v1 vsl| 0 CLI - Rd vcl.use "vcl1"
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd start
**** v1 vsl| 0 Debug - sockopt: Setting SO_LINGER for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting TCP_NODELAY for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 0 CLI - Wr 200 0
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 64110
**** v1 vsl| 0 CLI - Rd debug.xid 1000
**** v1 vsl| 0 CLI - Wr 200 19 XID is 1000 chunk 1
**** v1 vsl| 0 CLI - Rd debug.listen_address
**** v1 vsl| 0 CLI - Wr 200 19 a0 127.0.0.1 64110
**** v1 vsl| 1001 Begin c sess 0 HTTP/1
**** v1 vsl| 1000 Begin c sess 0 HTTP/1
**** v1 vsl| 1000 SessOpen c 127.0.0.1 64115 a0 127.0.0.1 64110 1756993618.679218 23
**** v1 vsl| 1001 SessOpen c 127.0.0.1 64114 a0 127.0.0.1 64110 1756993618.679219 20
**** v1 vsl| 1000 Debug c sockopt: Test confirmed SO_KEEPALIVE non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Test confirmed SO_SNDTIMEO non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Not testing nonhereditary SO_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Not testing nonhereditary SO_SNDTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Test confirmed SO_RCVTIMEO non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Test confirmed SO_RCVTIMEO non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Test confirmed TCP_NODELAY non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Test confirmed TCP_NODELAY non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Test confirmed TCP_KEEPALIVE non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Test confirmed TCP_KEEPALIVE non heredity for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: SO_LINGER may be inherited for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Setting SO_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Setting TCP_NODELAY for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Setting TCP_NODELAY for a0=127.0.0.1:64110
**** v1 vsl| 1000 Debug c sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 1001 Debug c sockopt: Setting TCP_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 1001 Link c req 1002 rxreq
**** v1 vsl| 1000 Link c req 1003 rxreq
**** dT 0.713
**** b1 macro undef b1_addr
**** b1 macro undef b1_port
**** b1 macro undef b1_sock
**** dT 1.637
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified SO_LINGER for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified SO_KEEPALIVE for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting SO_SNDTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Setting SO_RCVTIMEO for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified TCP_NODELAY for a0=127.0.0.1:64110
**** v1 vsl| 0 Debug - sockopt: Not setting unmodified TCP_KEEPALIVE for a0=127.0.0.1:64110
**** dT 3.481
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993621 1.0
**** dT 6.543
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993624 1.0
**** dT 9.561
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993627 1.0
**** dT 12.519
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993630 1.0
**** dT 15.563
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993633 1.0
**** dT 18.499
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993636 1.0
**** dT 21.534
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993639 1.0
**** dT 24.557
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993642 1.0
**** dT 27.474
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993645 1.0
**** dT 30.508
**** v1 vsl| 0 CLI - Rd ping
**** v1 vsl| 0 CLI - Wr 200 19 PONG 1756993648 1.0
**** dT 30.572
---- c1 HTTP rx timeout (fd:18 30.000s)
* top Aborting execution, test failed
* top RESETTING after ../../../../bin/varnishtest/tests/b00096.vtc
** c2 Waiting for client
**** dT 30.594
---- c2 HTTP rx timeout (fd:19 30.000s)
** s1 Waiting for server (5/-1)
** s2 Waiting for server (6/-1)
** v1 Wait
**** v1 CLI TX|panic.show
**** dT 30.595
*** v1 CLI RX 300
**** v1 CLI RX|Child has not panicked or panic has been cleared
*** v1 debug|Info: manager stopping child
*** v1 debug|Debug: Stopping Child
**** dT 30.618
**** v1 vsl| 0 CLI - EOF on CLI connection, worker stops
**** dT 31.570
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 32.553
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 33.631
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 34.583
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 35.671
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 36.633
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 37.716
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 38.693
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 39.771
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 40.759
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 41.745
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 42.836
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 43.813
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 44.782
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 45.850
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 46.824
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 47.904
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 48.874
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 49.862
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 50.949
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 51.925
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 53.019
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 53.996
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 54.979
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 56.074
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 57.064
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 58.053
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
**** dT 59.138
*** v1 debug|Info: Child (75388) said shutdown waiting for 3 references on vcl1
# top TEST ../../../../bin/varnishtest/tests/b00096.vtc TIMED OUT (kill -9)
# top TEST ../../../../bin/varnishtest/tests/b00096.vtc FAILED (60.002) signal=9
FAIL tests/b00096.vtc (exit status: 2)