| Name | Theory_2922-4761709-423_1 |
| Workunit | 2640047 |
| Created | 18 Dec 2025, 12:44:43 UTC |
| Sent | 20 Dec 2025, 12:38:09 UTC |
| Report deadline | 30 Dec 2025, 12:38:09 UTC |
| Received | 20 Dec 2025, 14:21:38 UTC |
| Server state | Over |
| Outcome | Computation error |
| Client state | Compute error |
| Exit status | 1 (0x00000001) Unknown error code |
| Computer ID | 5388 |
| Run time | 3 min 27 sec |
| CPU time | 9 sec |
| Validate state | Invalid |
| Credit | 0.00 |
| Device peak FLOPS | 1.89 GFLOPS |
| Application version | Theory Simulation v7.62 (docker) x86_64-pc-linux-gnu |
| Peak working set size | 39.50 MB |
| Peak swap size | 1.99 GB |
| Peak disk usage | 3.79 MB |
<core_client_version>8.2.8</core_client_version>
<![CDATA[
<message>
process exited with code 1 (0x1, -255)</message>
<stderr_txt>
Error: creating container storage: the container name "boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1" is already in use by 404090d24f2bb66679e357d49743882a4c74e4450354261a266c509cd78494ad. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.
command output:
starting container
running docker command: start boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
4.45% 224.7MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 12 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.19% 757.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 22 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
2.76% 2.619GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 32 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.87% 1.806GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 42 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.42% 3.355GB / 12.54GB
got quit/abort from client
running docker command: stop boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
time="2025-12-20T15:10:31+01:00" level=warning msg="StopSignal SIGTERM failed to stop container boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1 in 10 seconds, resorting to SIGKILL"
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
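The `create` in the cycle above fails because a container with the same name, left over from an earlier attempt, still exists; Podman's error message names the conflicting container ID and suggests either removing it or passing `--replace`. As an illustrative sketch (not something the docker_wrapper itself does), the ID can be pulled out of that message with plain POSIX shell string handling:

```shell
# Sketch only: extract the conflicting container ID from Podman's error text.
# The message below is copied verbatim from the log; the parsing is illustrative.
err='Error: creating container storage: the container name "boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1" is already in use by 404090d24f2bb66679e357d49743882a4c74e4450354261a266c509cd78494ad. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.'

id=${err#*already in use by }   # drop everything up to the container ID
id=${id%%.*}                    # drop the sentence that follows it
echo "$id"
```

With that ID in hand, `podman rm -f "$id"` would clear the conflict manually, or, as the message itself suggests, adding `--replace` to the wrapper's `create` invocation would recreate the container in place.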
docker_wrapper config:
workdir: /boinc_slot_dir
use GPU: no
Web graphics guest port: 80
create args: --cap-add=SYS_ADMIN --device /dev/fuse -v /cvmfs:/cvmfs:shared
verbose: 1
Using podman
running docker command: ps --all --filter "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Exited (137) 10 seconds ago 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
creating container boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: images
command output:
REPOSITORY TAG IMAGE ID CREATED SIZE
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4825193-424 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4909923-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4845910-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4797184-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4840834-424 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4856958-408 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4882138-422 latest a50cf76a1834 9 days ago 620 MB
docker.io/library/almalinux 9 623706a2d956 6 months ago 195 MB
docker.io/library/hello-world latest 74cc54e27dc4 11 months ago 26.7 kB
web graphics: host port 44777, guest port 80
running docker command: create --name boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1 -v .:/boinc_slot_dir -p 44777:80 --cap-add=SYS_ADMIN --device /dev/fuse -v /cvmfs:/cvmfs:shared --log-driver=k8s-file boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423
Error: creating container storage: the container name "boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1" is already in use by 404090d24f2bb66679e357d49743882a4c74e4450354261a266c509cd78494ad. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.
command output:
starting container
running docker command: start boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 1 second 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.00% 463.8MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 11 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.31% 1.006GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 21 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
2.78% 2.982GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 31 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.89% 3.797GB / 12.54GB
got quit/abort from client
running docker command: stop boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
time="2025-12-20T15:11:32+01:00" level=warning msg="StopSignal SIGTERM failed to stop container boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1 in 10 seconds, resorting to SIGKILL"
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
docker_wrapper config:
workdir: /boinc_slot_dir
use GPU: no
Web graphics guest port: 80
create args: --cap-add=SYS_ADMIN --device /dev/fuse -v /cvmfs:/cvmfs:shared
verbose: 1
Using podman
running docker command: ps --all --filter "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Exited (137) About a minute ago 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
creating container boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: images
command output:
REPOSITORY TAG IMAGE ID CREATED SIZE
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4825193-424 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4909923-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4845910-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4797184-423 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4840834-424 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4856958-408 latest a50cf76a1834 9 days ago 620 MB
localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4882138-422 latest a50cf76a1834 9 days ago 620 MB
docker.io/library/almalinux 9 623706a2d956 6 months ago 195 MB
docker.io/library/hello-world latest 74cc54e27dc4 11 months ago 26.7 kB
web graphics: host port 46999, guest port 80
running docker command: create --name boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1 -v .:/boinc_slot_dir -p 46999:80 --cap-add=SYS_ADMIN --device /dev/fuse -v /cvmfs:/cvmfs:shared --log-driver=k8s-file boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423
Error: creating container storage: the container name "boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1" is already in use by 404090d24f2bb66679e357d49743882a4c74e4450354261a266c509cd78494ad. You have to remove that container to be able to reuse that name: that name is already in use, or use --replace to instruct Podman to do so.
command output:
starting container
running docker command: start boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 1 second 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
4.51% 256.5MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 11 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.19% 659.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 21 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
2.76% 1.841GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 31 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.88% 567.7MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 41 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.42% 668.5MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 52 seconds 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
1.15% 335.7MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
9.59% 358.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
8.24% 682.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
7.22% 1.144GB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
6.42% 385MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.79% 385.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up About a minute 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
5.27% 386.3MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
4.83% 386MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
4.46% 387.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
4.15% 387.1MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
3.87% 385.3MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
3.63% 386.3MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 2 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
3.42% 371.6MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 3 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
3.23% 365.4MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Up 3 minutes 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: stats --no-stream --format "{{.CPUPerc}} {{.MemUsage}}" boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
3.07% 365.2MB / 12.54GB
running docker command: ps --all -f "name=boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1"
command output:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
404090d24f2b localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest /bin/sh -c ./entr... About an hour ago Exited (0) 3 seconds ago 0.0.0.0:58707->80/tcp boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: logs boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
===> [runRivet] Sat Dec 20 14:12:35 UTC 2025 [boinc ee zhad 43.6 - - pythia6 6.428 377 100000 423]
command output:
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=9544
job: logsize=56 k
job: times=
0m0.004s 0m0.005s
2m43.852s 0m12.286s
job: cpuusage=176
Job Finished
boinc_shutdown called with exit code 0
stderr from container:
Could not download a wpad.dat from lhchomeproxy.{cern.ch|fnal.gov}
Got a proxy from the local environment
Will use it for CVMFS and Frontier
Using CVMFS on the host.
Probing CVMFS repositories ...
Probing /cvmfs/alice.cern.ch... OK
Probing /cvmfs/cvmfs-config.cern.ch... OK
Probing /cvmfs/grid.cern.ch... OK
Probing /cvmfs/sft.cern.ch... OK
Excerpt from "cvmfs_config stat":
VERSION HOST PROXY
2.13.3.0 http://s1ral-cvmfs.openhtc.io http://192.168.50.138:3128
Environment HTTP proxy: http://192.168.50.138:3128
job: htmld=/var/www/lighttpd
job: unpack exitcode=0
job: run exitcode=0
job: diskusage=9544
job: logsize=56 k
job: times=
0m0.004s 0m0.005s
2m43.852s 0m12.286s
job: cpuusage=176
Job Finished
boinc_shutdown called with exit code 0
stderr end
running docker command: stop boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: container rm boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
command output:
boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423_1
running docker command: image rm boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423
command output:
Untagged: localhost/boinc__lhcathomedev.cern.ch_lhcathome-dev__theory_2922-4761709-423:latest
2025-12-20 15:15:59 (3178471): called boinc_finish(1)
</stderr_txt>
]]>
©2025 CERN