flake.lock: Update #305
Merged
Garnix CI / check vm_vaultwarden_sso [x86_64-linux]
succeeded
Sep 16, 2024 in 4m 56s
Run results
Build succeeded
Last 100 lines of logs:
server # [ 12.570568] lldap-start[1114]: 2024-09-16T01:30:00.396076455+00:00 INFO ┝━ i [info]: Upgrading DB schema to version 6
server # [ 12.571634] lldap-start[1114]: 2024-09-16T01:30:00.399396430+00:00 INFO ┝━ i [info]: Upgrading DB schema to version 7
server # [ 12.572728] lldap-start[1114]: 2024-09-16T01:30:00.402142868+00:00 INFO ┝━ i [info]: Upgrading DB schema to version 8
server # [ 12.573793] lldap-start[1114]: 2024-09-16T01:30:00.404990437+00:00 INFO ┝━ i [info]: Upgrading DB schema to version 9
server # [ 12.575035] lldap-start[1114]: 2024-09-16T01:30:00.407931313+00:00 INFO ┝━ i [info]: Upgrading DB schema to version 10
server # [ 12.576159] lldap-start[1114]: 2024-09-16T01:30:00.424198744+00:00 WARN ┝━ 🚧 [warn]: Could not find lldap_admin group, trying to create it
server # [ 12.577426] lldap-start[1114]: 2024-09-16T01:30:00.429651113+00:00 WARN ┝━ 🚧 [warn]: Could not find lldap_password_manager group, trying to create it
server # [ 12.578751] lldap-start[1114]: 2024-09-16T01:30:00.432814923+00:00 WARN ┝━ 🚧 [warn]: Could not find lldap_strict_readonly group, trying to create it
server # [ 12.580198] lldap-start[1114]: 2024-09-16T01:30:00.436802581+00:00 WARN ┝━ 🚧 [warn]: Could not find an admin user, trying to create the user "admin" with the config-provided password
server # [ 12.581837] lldap-start[1114]: 2024-09-16T01:30:00.507933409+00:00 INFO ┝━ i [info]: Starting the LDAP server on port 3890
server # [ 12.582957] lldap-start[1114]: 2024-09-16T01:30:00.508368939+00:00 INFO ┕━ i [info]: Starting the API/web server on port 17170
server # [ 12.584148] lldap-start[1114]: 2024-09-16T01:30:00.509404545+00:00 INFO i [info]: starting 1 workers
server # [ 12.585123] lldap-start[1114]: 2024-09-16T01:30:00.510421714+00:00 INFO i [info]: Actix runtime found; starting in Actix runtime
server # [ 12.587577] lldap-start[1114]: 2024-09-16T01:30:00.534381742+00:00 INFO i [info]: DB Cleanup Cron started
server # [ 14.115159] authelia-auth.example.com-pre-start[1129]: Connection to 127.0.0.1 3890 port [tcp/ndsconnect] succeeded!
server # [ 14.116712] lldap-start[1114]: 2024-09-16T01:30:02.060589402+00:00 INFO LDAP session [ 682µs | 100.00% ]
server # [ 16.248401] authelia-auth.example.com-pre-start[1139]: Configuration parsed and loaded with warnings:
server # [ 16.250140] authelia-auth.example.com-pre-start[1139]: - configuration key 'identity_providers.oidc.issuer_private_key' is deprecated in 4.38.0 and has been replaced by 'identity_providers.oidc.jwks': you are not required to make any changes as this has been automatically mapped for you, but to stop this warning being logged you will need to adjust your configuration, and this configuration key and auto-mapping is likely to be removed in 5.0.0 : see https://www.authelia.com/c/oidc for more information
server # [ 16.254046] authelia-auth.example.com-pre-start[1139]: - identity_providers: oidc: clients: client 'dummy_client': option 'client_secret' is plaintext but for clients not using the 'token_endpoint_auth_method' of 'client_secret_jwt' it should be a hashed value as plaintext values are deprecated with the exception of 'client_secret_jwt' and will be removed in the near future
server # [ 16.259979] systemd[1]: Started Authelia authentication and authorization server.
server # [ 16.262441] systemd[1]: Reached target Multi-User System.
server # [ 16.263546] systemd[1]: Startup finished in 1.970s (kernel) + 14.290s (userspace) = 16.261s.
server # [ 16.339128] authelia[1144]: time="2024-09-16T01:30:04Z" level=debug msg="Loaded Configuration Sources" files="[/nix/store/xy297l93sg30xfxj4myhcai8xw9vr3q7-config.yml /var/lib/authelia-auth.example.com/oidc_clients.yaml]" filters="[]"
server # [ 16.341083] authelia[1144]: time="2024-09-16T01:30:04Z" level=debug msg="Logging Initialized" fields.level=debug file= format=json keep_stdout=false
server # [ 16.343479] authelia[1144]: time="2024-09-16T01:30:04Z" level=debug msg="Process user information" gid=999 uid=999 username=authelia
server # [ 16.345447] authelia[1144]: time="2024-09-16T01:30:04Z" level=warning msg="Configuration: configuration key 'identity_providers.oidc.issuer_private_key' is deprecated in 4.38.0 and has been replaced by 'identity_providers.oidc.jwks': you are not required to make any changes as this has been automatically mapped for you, but to stop this warning being logged you will need to adjust your configuration, and this configuration key and auto-mapping is likely to be removed in 5.0.0 : see https://www.authelia.com/c/oidc for more information"
server # [ 16.354177] authelia[1144]: time="2024-09-16T01:30:04Z" level=warning msg="Configuration: identity_providers: oidc: clients: client 'dummy_client': option 'client_secret' is plaintext but for clients not using the 'token_endpoint_auth_method' of 'client_secret_jwt' it should be a hashed value as plaintext values are deprecated with the exception of 'client_secret_jwt' and will be removed in the near future"
server # [ 16.357145] authelia[1144]: time="2024-09-16T01:30:04Z" level=info msg="Authelia v4.38.10-nixpkgs is starting"
server # [ 16.358262] authelia[1144]: time="2024-09-16T01:30:04Z" level=info msg="Log severity set to debug"
server # [ 16.383796] authelia[1144]: {"level":"debug","msg":"Registering client dummy_client with policy one_factor (one_factor)","time":"2024-09-16T01:30:04Z"}
server # [ 16.398927] authelia[1144]: {"level":"info","msg":"Storage schema is being checked for updates","time":"2024-09-16T01:30:04Z"}
server # [ 16.420047] authelia[1144]: {"level":"info","msg":"Storage schema migration from 0 to 15 is being attempted","time":"2024-09-16T01:30:04Z"}
server # [ 16.461772] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 0 to 1","time":"2024-09-16T01:30:04Z"}
server # [ 16.479110] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 1 to 2","time":"2024-09-16T01:30:04Z"}
server # [ 16.489055] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 2 to 3","time":"2024-09-16T01:30:04Z"}
server # [ 16.558285] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 3 to 4","time":"2024-09-16T01:30:04Z"}
server # [ 16.562090] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 4 to 5","time":"2024-09-16T01:30:04Z"}
server # [ 16.640212] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 5 to 6","time":"2024-09-16T01:30:04Z"}
server # [ 16.672254] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 6 to 7","time":"2024-09-16T01:30:04Z"}
server # [ 16.683535] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 7 to 8","time":"2024-09-16T01:30:04Z"}
server # [ 16.685442] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 8 to 9","time":"2024-09-16T01:30:04Z"}
server # [ 16.687365] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 9 to 10","time":"2024-09-16T01:30:04Z"}
server # [ 16.689099] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 10 to 11","time":"2024-09-16T01:30:04Z"}
server # [ 16.705699] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 11 to 12","time":"2024-09-16T01:30:04Z"}
server # [ 16.712634] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 12 to 13","time":"2024-09-16T01:30:04Z"}
server # [ 16.714638] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 13 to 14","time":"2024-09-16T01:30:04Z"}
server # [ 16.720125] authelia[1144]: {"level":"debug","msg":"Storage schema migrated from version 14 to 15","time":"2024-09-16T01:30:04Z"}
server # [ 16.721337] authelia[1144]: {"level":"info","msg":"Storage schema migration from 0 to 15 is complete","time":"2024-09-16T01:30:04Z"}
server # [ 16.842927] authelia[1144]: {"level":"debug","msg":"LDAP Supported OIDs. Control Types: none. Extensions: 1.3.6.1.4.1.4203.1.11.1","time":"2024-09-16T01:30:04Z"}
server # [ 16.845072] lldap-start[1114]: 2024-09-16T01:30:04.681587602+00:00 INFO LDAP session [ 64.9ms | 1.60% / 100.00% ]
server # [ 16.846242] lldap-start[1114]: 2024-09-16T01:30:04.683411576+00:00 INFO ┝━ LDAP request [ 63.8ms | 98.33% ]
server # [ 16.847287] lldap-start[1114]: 2024-09-16T01:30:04.748281832+00:00 INFO ┕━ LDAP request [ 46.4µs | 0.07% ]
server # [ 16.851139] authelia[1144]: {"error":"error occurred during dial: dial udp: lookup time.cloudflare.com: no such host","level":"warning","msg":"Could not determine the clock offset due to an error","time":"2024-09-16T01:30:04Z"}
server # [ 16.882303] authelia[1144]: {"level":"info","msg":"Listening for non-TLS connections on '127.0.0.1:9091' path '/'","server":"main","service":"server","time":"2024-09-16T01:30:04Z"}
server # [ 16.884749] authelia[1144]: {"level":"info","msg":"Startup complete","time":"2024-09-16T01:30:04Z"}
server # [ 16.885630] authelia[1144]: {"level":"info","msg":"Listening for non-TLS connections on '127.0.0.1:9959' path '/metrics'","server":"metrics","service":"server","time":"2024-09-16T01:30:04Z"}
server # Connection to localhost (127.0.0.1) 9091 port [tcp/xmltec-xmlmail] succeeded!
(finished: waiting for TCP port 9091 on localhost, in 7.18 seconds)
server: must succeed: mkdir -p /tmp/shared/tmpui2n_v9w
(finished: must succeed: mkdir -p /tmp/shared/tmpui2n_v9w, in 0.02 seconds)
server: must succeed: cp -r /etc/ssl/certs/ca-certificates.crt /tmp/shared/tmpui2n_v9w/ca-certificates.crt
(finished: must succeed: cp -r /etc/ssl/certs/ca-certificates.crt /tmp/shared/tmpui2n_v9w/ca-certificates.crt, in 0.02 seconds)
client: must succeed: rm -r /etc/ssl/certs
client: waiting for the VM to finish booting
client: Guest shell says: b'Spawning backdoor root shell...\n'
client: connected to guest root shell
client: (connecting took 0.00 seconds)
(finished: waiting for the VM to finish booting, in 0.00 seconds)
(finished: must succeed: rm -r /etc/ssl/certs, in 0.02 seconds)
client: must succeed: mkdir -p /tmp/shared/tmpdrge69u1
(finished: must succeed: mkdir -p /tmp/shared/tmpdrge69u1, in 0.01 seconds)
client: must succeed: mkdir -p /etc/ssl/certs
(finished: must succeed: mkdir -p /etc/ssl/certs, in 0.01 seconds)
client: must succeed: cp -r /tmp/shared/tmpdrge69u1/ca-certificates.crt /etc/ssl/certs/ca-certificates.crt
(finished: must succeed: cp -r /tmp/shared/tmpdrge69u1/ca-certificates.crt /etc/ssl/certs/ca-certificates.crt, in 0.02 seconds)
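Note: the mkdir/cp pairs above are the test driver propagating the server's CA bundle to the client through the /tmp/shared directory that the NixOS test driver mounts in every VM, so that curl on the client trusts the test certificates. A minimal sketch of the corresponding test-script steps (machine names and ports taken from the log; the shared-directory path and exact helper calls are assumptions, not the verbatim test):

# Sketch, not the actual test script: wait for Authelia, then hand the
# server's CA bundle to the client via the shared 9p directory.
server.wait_for_open_port(9091)

# Copy the CA bundle out of the server VM into the shared directory
# (the real test uses a generated temporary directory name).
server.succeed("mkdir -p /tmp/shared/certs")
server.succeed("cp /etc/ssl/certs/ca-certificates.crt /tmp/shared/certs/")

# Install it on the client so curl trusts the test certificates.
client.succeed("rm -r /etc/ssl/certs")
client.succeed("mkdir -p /etc/ssl/certs")
client.succeed("cp /tmp/shared/certs/ca-certificates.crt /etc/ssl/certs/")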
subtest: access
curl --show-error --location --cookie-jar cookie.txt --cookie cookie.txt --connect-to v.example.com:443:server:443 --connect-to v.example.com:80:server:80 --connect-to auth.example.com:443:server:443 --silent --output /dev/null --write-out '{"code":%{response_code}}' https://v.example.com
server # [ 18.084887] authelia[1144]: {"level":"debug","msg":"Check authorization of subject username= groups= ip=2001:db8:1::1 and object https://v.example.com/ (method GET).","time":"2024-09-16T01:30:06Z"}
server # [ 18.087606] nginx[938]: server nginx: {"remote_addr":"127.0.0.1","remote_user":"-","time_local":"16/Sep/2024:01:30:06 +0000","request":"GET /api/verify HTTP/1.0","request_length":"258","server_name":"auth.example.com","status":"200","bytes_sent":"439","body_bytes_sent":"6","referrer":"-","user_agent":"curl/8.9.1","gzip_ration":"-","post":"-","upstream_addr":"127.0.0.1:9091","upstream_status":"200","request_time":"0.008","upstream_response_time":"0.007","upstream_connect_time":"0.000","upstream_header_time":"0.007"}
server # [ 18.096143] nginx[938]: server nginx: {"remote_addr":"2001:db8:1::1","remote_user":"-","time_local":"16/Sep/2024:01:30:06 +0000","request":"GET / HTTP/2.0","request_length":"30","server_name":"v.example.com","status":"200","bytes_sent":"2533","body_bytes_sent":"1236","referrer":"-","user_agent":"curl/8.9.1","gzip_ration":"-","post":"-","upstream_addr":"127.0.0.1:8222","upstream_status":"200","request_time":"0.023","upstream_response_time":"0.008","upstream_connect_time":"0.000","upstream_header_time":"0.008"}
(finished: subtest: access, in 0.13 seconds)
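Note: the "access" subtest runs the curl command shown above from the client, resolving the virtual hosts to the server VM with --connect-to and capturing only the response code via --write-out. A hedged sketch of how the test script likely wraps and checks that output (the assertion message is illustrative, not from the source):

import json

# Sketch of the "access" subtest: fetch the Vaultwarden front page through
# nginx/Authelia and assert that it answers 200 for an anonymous request.
with subtest("access"):
    out = client.succeed(
        "curl --show-error --location --cookie-jar cookie.txt --cookie cookie.txt"
        " --connect-to v.example.com:443:server:443"
        " --connect-to v.example.com:80:server:80"
        " --connect-to auth.example.com:443:server:443"
        " --silent --output /dev/null"
        " --write-out '{\"code\":%{response_code}}'"
        " https://v.example.com"
    )
    assert json.loads(out)["code"] == 200, f"unexpected response: {out}"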
subtest: extraScript
subtest: unauthenticated access is not granted to /admin
curl --show-error --location --cookie-jar cookie.txt --cookie cookie.txt --connect-to v.example.com:443:server:443 --connect-to v.example.com:80:server:80 --connect-to auth.example.com:443:server:443 --silent --output /dev/null --write-out '{"code":%{response_code},"auth_host":"%{urle.host}","auth_query":"%{urle.query}","all":%{json}}' https://v.example.com/admin
server # [ 18.184343] authelia[1144]: {"level":"debug","msg":"Check authorization of subject username= groups= ip=2001:db8:1::1 and object https://v.example.com/admin (method GET).","time":"2024-09-16T01:30:06Z"}
server # [ 18.186332] authelia[1144]: {"level":"info","method":"GET","msg":"Access to https://v.example.com/admin (method GET) is not authorized to user \u003canonymous\u003e, responding with status code 401","path":"/api/verify","remote_ip":"2001:db8:1::1","time":"2024-09-16T01:30:06Z"}
server # [ 18.188797] nginx[938]: server nginx: {"remote_addr":"127.0.0.1","remote_user":"-","time_local":"16/Sep/2024:01:30:06 +0000","request":"GET /api/verify HTTP/1.0","request_length":"268","server_name":"auth.example.com","status":"401","bytes_sent":"241","body_bytes_sent":"16","referrer":"-","user_agent":"curl/8.9.1","gzip_ration":"-","post":"-","upstream_addr":"127.0.0.1:9091","upstream_status":"401","request_time":"0.007","upstream_response_time":"0.007","upstream_connect_time":"0.001","upstream_header_time":"0.007"}
server # [ 18.192852] nginx[938]: server nginx: {"remote_addr":"2001:db8:1::1","remote_user":"-","time_local":"16/Sep/2024:01:30:06 +0000","request":"GET /admin HTTP/2.0","request_length":"36","server_name":"v.example.com","status":"302","bytes_sent":"456","body_bytes_sent":"138","referrer":"-","user_agent":"curl/8.9.1","gzip_ration":"-","post":"-","upstream_addr":"-","upstream_status":"-","request_time":"0.009","upstream_response_time":"-","upstream_connect_time":"-","upstream_header_time":"-"}
server # [ 18.201364] nginx[938]: server nginx: {"remote_addr":"2001:db8:1::1","remote_user":"-","time_local":"16/Sep/2024:01:30:06 +0000","request":"GET /?rd=https://v.example.com/admin HTTP/2.0","request_length":"57","server_name":"auth.example.com","status":"200","bytes_sent":"1806","body_bytes_sent":"1057","referrer":"-","user_agent":"curl/8.9.1","gzip_ration":"-","post":"-","upstream_addr":"127.0.0.1:9091","upstream_status":"200","request_time":"0.000","upstream_response_time":"0.001","upstream_connect_time":"0.000","upstream_header_time":"0.001"}
(finished: subtest: unauthenticated access is not granted to /admin, in 0.10 seconds)
(finished: subtest: extraScript, in 0.10 seconds)
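Note: in the subtest above, Authelia answers 401 on /api/verify, nginx turns that into a 302, and curl follows it to the Authelia portal with the original URL in the rd= query parameter. The --write-out format used in the log captures the effective URL's host and query, so the test can assert on them. A hedged sketch of what that check might look like (the exact assertions are assumptions, not the verbatim test):

import json

# Sketch: an anonymous request to /admin must end up on the SSO portal
# (auth.example.com) carrying the original URL in the rd= parameter.
with subtest("unauthenticated access is not granted to /admin"):
    out = json.loads(client.succeed(
        "curl --show-error --location --cookie-jar cookie.txt --cookie cookie.txt"
        " --connect-to v.example.com:443:server:443"
        " --connect-to v.example.com:80:server:80"
        " --connect-to auth.example.com:443:server:443"
        " --silent --output /dev/null"
        " --write-out '{\"code\":%{response_code},\"auth_host\":\"%{urle.host}\",\"auth_query\":\"%{urle.query}\"}'"
        " https://v.example.com/admin"
    ))
    assert out["auth_host"] == "auth.example.com", "request was not redirected to the SSO portal"
    assert "rd=" in out["auth_query"], "redirect did not carry the original URL"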
(finished: run the VM test script, in 18.88 seconds)
test script finished in 18.94s
cleanup
kill machine (pid 8)
qemu-kvm: terminating on signal 15 from pid 5 (/nix/store/h3i0acpmr8mrjx07519xxmidv8mpax4y-python3-3.12.5/bin/python3.12)
kill machine (pid 29)
qemu-kvm: terminating on signal 15 from pid 5 (/nix/store/h3i0acpmr8mrjx07519xxmidv8mpax4y-python3-3.12.5/bin/python3.12)
(finished: cleanup, in 0.02 seconds)
kill vlan (pid 6)