13 Commits

Author SHA1 Message Date
49d616ef32 Merge branch 'upstream'
2025-04-12 14:38:43 -05:00
12f795bcf8 container names 2025-04-12 14:37:19 -05:00
Jelle van der Waa
653b482ec5 Include request scheme into opensearch data 2025-03-29 19:08:16 +01:00
Jelle van der Waa
ecc4bdf0a7 bump django to 5.0.13 2025-03-29 18:55:35 +01:00
luis.carilla
a45b88da4b fix linting issue 2025-03-29 15:13:22 +01:00
luis.carilla
e460ba4727 set a timeout for the rsync subprocess when checking mirror availability 2025-03-29 15:13:22 +01:00
Jelle van der Waa
910e428baa public: update to latest seqouia version
Closes: #553
2025-02-20 17:53:50 +01:00
8e6bc69713 Merge branch 'upstream'
2025-02-17 19:08:14 -05:00
Christian Heusel
f624f5677b donate: Add link to LoadView
This was forgotten in the previous agreement with that sponsor.

Signed-off-by: Christian Heusel <christian@heusel.eu>
2025-02-12 10:56:53 +01:00
luzpaz
2064099696 Fix typos
2025-01-30 10:32:52 +01:00
Jelle van der Waa
67209075c5 devel: fix grammar mistake a HTTP => an HTTP
Closes: #546
2025-01-28 10:13:01 +01:00
Jelle van der Waa
336d686ca2 public: add spacing between past donors and a past donor
2025-01-25 12:10:38 +01:00
Christian Heusel
4f0e24f1f7 donate: Add link to DotcomMonitor
Signed-off-by: Christian Heusel <christian@heusel.eu>
2025-01-25 11:55:13 +01:00
14 changed files with 43 additions and 27 deletions


@@ -131,7 +131,7 @@ Archweb provides multiple management commands for importing various sorts of dat
 * reporead_inotify - Watches a templated patch for updates of *.files.tar.gz to update Arch databases with.
 * donor_import - Import a single donator from a mail passed to stdin
 * mirrorcheck - Poll every active mirror URLs to store the lastsnyc time and record network timing details.
-* mirrorresolv - Poll every active mirror URLs and determine wheteher they have IP4 and/or IPv6 addresses.
+* mirrorresolv - Poll every active mirror URLs and determine whether they have IP4 and/or IPv6 addresses.
 * populate_signoffs - retrieves the latest commit message of a signoff-eligible package.
 * update_planet - Import all feeds for users who have a valid website and website_rss in their user profile.
 * read_links - Reads a repo.links.db.tar.gz file and updates the Soname model.


@@ -59,7 +59,7 @@ class Command(BaseCommand):
         arches = Arch.objects.filter(agnostic=False)
         repos = Repo.objects.all()
-        arch_path_map = {arch: None for arch in arches}
+        arch_path_map = dict.fromkeys(arches)
         all_paths = set()
         total_paths = 0
         for arch in arches:
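The `dict.fromkeys` change here (and in the sibling command below) is behavior-preserving: with a single argument, `dict.fromkeys` maps every key to None, exactly like the comprehension it replaces. A minimal sketch, using string stand-ins for the Arch model instances:

```python
arches = ['x86_64', 'aarch64']  # illustrative stand-ins for Arch instances

# the old spelling and the new one build the same mapping
comprehension = {arch: None for arch in arches}
fromkeys = dict.fromkeys(arches)  # default value is None

assert comprehension == fromkeys
```

`dict.fromkeys` also preserves insertion order, so iteration over `arch_path_map` is unaffected.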


@@ -72,7 +72,7 @@ class Command(BaseCommand):
         arches = Arch.objects.filter(agnostic=False)
         repos = Repo.objects.all()
-        arch_path_map = {arch: None for arch in arches}
+        arch_path_map = dict.fromkeys(arches)
         all_paths = set()
         total_paths = 0
         for arch in arches:


@@ -1,20 +1,20 @@
 version: '2'
 # Run the following once:
-# docker compose run --rm packages_web python manage.py migrate
-# docker compose run --rm packages_web python manage.py loaddata main/fixtures/arches.json
-# docker compose run --rm packages_web python manage.py loaddata main/fixtures/repos.json
-# docker compose run --rm packages_web python manage.py createsuperuser --username=admin --email=admin@artixweb.local
+# docker compose run --rm archweb_web python manage.py migrate
+# docker compose run --rm archweb_web python manage.py loaddata main/fixtures/arches.json
+# docker compose run --rm archweb_web python manage.py loaddata main/fixtures/repos.json
+# docker compose run --rm archweb_web python manage.py createsuperuser --username=admin --email=admin@artixweb.local
 ## go to /admin and create a user according to overlay/devel/fixtures/user_profiles.json
 ## go to /admin/auth/user/2/change/ and add a name
-# docker compose run --rm packages_web python manage.py generate_keyring pgp.surfnet.nl ./config/keyring
-# docker compose run --rm packages_web python manage.py pgp_import ./config/keyring
+# docker compose run --rm archweb_web python manage.py generate_keyring pgp.surfnet.nl ./config/keyring
+# docker compose run --rm archweb_web python manage.py pgp_import ./config/keyring
 ## go to /admin/devel/developerkey/ and set the owner (and parent) for the ownerless key
 ## go to /admin/sites/site/1/change/ and set the domain
 services:
-  packages_web:
+  archweb_web:
     container_name: artixweb-packages
     build:
       context: ./
@@ -25,7 +25,7 @@ services:
     volumes:
       - ./config:/usr/src/web/config
-  packages_sync:
+  archweb_sync:
     container_name: artixweb-sync
     build:
       context: ./
@@ -35,7 +35,7 @@
       - ./config:/usr/src/web/config
     command: ./downloadpackages.sh
-  packages_nginx:
+  archweb_nginx:
     container_name: artixweb-nginx
     image: linuxserver/nginx:latest
     restart: "no"


@@ -10,7 +10,7 @@ def format_key(key_id):
     if len(key_id) in (8, 20):
         return '0x%s' % key_id
     elif len(key_id) == 40:
-        # normal display format is 5 groups of 4 hex chars seperated by spaces,
+        # normal display format is 5 groups of 4 hex chars separated by spaces,
         # double space, then 5 more groups of 4 hex chars
         split = tuple(key_id[i:i + 4] for i in range(0, 40, 4))
        return '%s\u00a0 %s' % (' '.join(split[0:5]), ' '.join(split[5:10]))
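The corrected comment describes the fingerprint layout that format_key renders. A runnable sketch of the function as shown in the hunk (the final fallback return is an assumption, since that branch is outside the hunk):

```python
def format_key(key_id):
    # short (8-char) and long (20-char) key IDs get a 0x prefix
    if len(key_id) in (8, 20):
        return '0x%s' % key_id
    elif len(key_id) == 40:
        # normal display format is 5 groups of 4 hex chars separated by spaces,
        # double space, then 5 more groups of 4 hex chars
        split = tuple(key_id[i:i + 4] for i in range(0, 40, 4))
        return '%s\u00a0 %s' % (' '.join(split[0:5]), ' '.join(split[5:10]))
    return key_id  # fallback for unexpected lengths (assumed, not in the hunk)

# a 40-char fingerprint comes out as two space-separated halves
# joined by a no-break space plus a regular space
print(format_key('A' * 40))
```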


@@ -184,12 +184,26 @@ def check_rsync_url(mirror_url, location, timeout):
     with open(os.devnull, 'w') as devnull:
         if logger.isEnabledFor(logging.DEBUG):
             logger.debug("rsync cmd: %s", ' '.join(rsync_cmd))
         start = time.time()
-        proc = subprocess.Popen(rsync_cmd, stdout=devnull, stderr=subprocess.PIPE)
-        _, errdata = proc.communicate()
-        end = time.time()
-        log.duration = end - start
-        if proc.returncode != 0:
+        timeout_expired = False
+        # add an arbitrary 5-second buffer to ensure the process completes and to catch actual rsync timeouts.
+        rsync_subprocess_timeout = timeout + 5
+        try:
+            proc = subprocess.Popen(rsync_cmd, stdout=devnull, stderr=subprocess.PIPE)
+            _, errdata = proc.communicate(timeout=rsync_subprocess_timeout)
+            end = time.time()
+            log.duration = end - start
+        except subprocess.TimeoutExpired:
+            timeout_expired = True
+            proc.kill()
+            logger.debug("rsync command timeout error: %s, %s", url, errdata)
+            log.is_success = False
+            log.duration = None
+            log.error = f"rsync subprocess killed after {rsync_subprocess_timeout} seconds"
+        if proc.returncode != 0 and not timeout_expired:
             logger.debug("error: %s, %s", url, errdata)
             log.is_success = False
             log.error = errdata.strip().decode('utf-8')
@@ -197,7 +211,7 @@ def check_rsync_url(mirror_url, location, timeout):
             # don't record a duration as it is misleading
             if proc.returncode in (1, 30, 35):
                 log.duration = None
-        else:
+        elif not timeout_expired:
             logger.debug("success: %s, %.2f", url, log.duration)
         if os.path.exists(lastsync_path):
             with open(lastsync_path, 'r') as lastsync:
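The timeout pattern from this change can be exercised standalone. One caveat: in the hunk, the except branch logs errdata even though communicate() raised before assigning it, which would itself error at debug level. The sketch below (illustrative names, not archweb's API) reaps the killed child and reports without relying on errdata:

```python
import subprocess
import sys
import time


def run_with_timeout(cmd, timeout):
    """Run cmd, killing it if it runs longer than `timeout` seconds.

    Returns (returncode, stderr, duration); returncode and duration are
    None when the timeout fired.
    """
    start = time.time()
    proc = subprocess.Popen(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.PIPE)
    try:
        _, errdata = proc.communicate(timeout=timeout)
        return proc.returncode, errdata, time.time() - start
    except subprocess.TimeoutExpired:
        proc.kill()
        proc.communicate()  # reap the killed child so it doesn't linger
        return None, b'', None


# a child process guaranteed to outlive the 0.5-second timeout
rc, err, duration = run_with_timeout(
    [sys.executable, '-c', 'import time; time.sleep(10)'], timeout=0.5)
```

The second communicate() call after kill() matters: without it the child stays a zombie until the parent exits, which a long-running mirrorcheck daemon would accumulate.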


@@ -27,7 +27,7 @@ def test_mirrorurl_get_full_url(mirrorurl):
 def test_mirror_url_clean(mirrorurl):
     mirrorurl.clean()
-    # TOOD(jelle): this expects HOSTNAME to resolve, maybe mock
+    # TODO(jelle): this expects HOSTNAME to resolve, maybe mock
     assert mirrorurl.has_ipv4
     # requires ipv6 on host... mock?
     # assert mirrorurl.has_ipv6 == True


@@ -67,7 +67,7 @@ def test_sort(client, package):
 def test_packages(client, package):
     response = client.get('/opensearch/packages/')
     assert response.status_code == 200
-    assert 'template="example.com/opensearch/packages/"' in response.content.decode()
+    assert 'template="http://example.com/opensearch/packages/"' in response.content.decode()
 
 
 def test_packages_suggest(client, package):


@@ -25,7 +25,7 @@ def opensearch(request):
     current_site = Site.objects.get_current()
     return render(request, 'packages/opensearch.xml',
-                  {'domain': current_site.domain},
+                  {'domain': f'{request.scheme}://{current_site.domain}'},
                   content_type='application/opensearchdescription+xml')


@@ -31,7 +31,7 @@ def index(request):
         'news_updates': News.objects.order_by('-postdate', '-id')[:15],
         'pkg_updates': updates,
         'staff_groups': StaffGroup.objects.all(),
-        'domain': current_site.domain,
+        'domain': f'{request.scheme}://{current_site.domain}',
     }
     return render(request, 'public/index.html', context)


@@ -1,5 +1,5 @@
 -e git+https://github.com/fredj/cssmin.git@master#egg=cssmin
-Django==5.0.11
+Django==5.0.13
 IPy==1.1
 Markdown==3.3.7
 bencode.py==4.0.0


@@ -7,7 +7,7 @@
 <div class="box">
     <h2>Tier 0 Mirror usage information</h2>
-    <p>Arch Linux Tier 0 mirror on <a href="https://repos.archlinux.org">repos.archlinux.org</a> which can be used if to obtain the absolute latest packages. The mirror is protected with a HTTP Basic Auth password unique per Staff member.</p>
+    <p>Arch Linux Tier 0 mirror on <a href="https://repos.archlinux.org">repos.archlinux.org</a> which can be used if to obtain the absolute latest packages. The mirror is protected with an HTTP Basic Auth password unique per Staff member.</p>
 {% if mirror_url %}
     <code id="serverinfo">Server = {{ mirror_url }}</code> <button id="copybutton">Copy to clipboard</button>


@@ -84,6 +84,8 @@
     <h3>Past donors</h3>
+
+    <p><a href="http://www.dotcom-monitor.com/" title="Dotcom-Monitor">Dotcom-Monitor</a> &amp; <a href="https://www.loadview-testing.com/" title="LoadView">LoadView</a></p>
     <div id="donor-list">
     <ul>
     {% for donor in donors %}


@@ -132,10 +132,10 @@
     <pre><code>$ b2sum -c b2sums.txt</code></pre>
     To verify the PGP signature using Sequoia, first download the release signing key from WKD:
-    <pre><code>$ sq network wkd fetch {{ release.wkd_email }} -o release-key.pgp</code></pre>
+    <pre><code>$ sq network wkd search {{ release.wkd_email }} --output release-key.pgp</code></pre>
     With this signing key, verify the signature:
-    <pre><code>$ sq verify --signer-file release-key.pgp --detached archlinux-{{ release.version }}-x86_64.iso.sig archlinux-{{ release.version }}-x86_64.iso</code></pre>
+    <pre><code>$ sq verify --signer-file release-key.pgp --signature-file archlinux-{{ release.version }}-x86_64.iso.sig archlinux-{{ release.version }}-x86_64.iso</code></pre>
     Alternatively, using GnuPG, download the signing key from WKD:
     <pre><code>$ gpg --auto-key-locate clear,wkd -v --locate-external-key {{ release.wkd_email }}</code></pre>