#11493 retire ambassadors map service/website
Closed: Fixed a year ago by phsmoura. Opened 2 years ago by kevin.

We ran a service that mapped Fedora ambassadors on a world map so people could find nearby ambassadors.

This is explained at:

https://fedoraproject.org/wiki/Fedora_ambassadors_map
and the map itself is at:

https://fedoraproject.org/membership-map/ambassadors.html

The problem is that when we moved the wiki to OIDC auth, it broke our updater script, so the map hasn't been updated in a long time.

It also uses FAS2 (the old, now-retired account system).

So, IMHO we should remove it.

It's in at least the following places:

./roles/membership-map
playbooks/groups/proxies.yml
playbooks/groups/sundries.yml
playbooks/include/proxies-miscellaneous.yml
roles/apps-fp-o/files/apps.yaml
roles/rsyncd/files/rsyncd.conf.sundries

It also seems to be used by MirrorManager in roles/mirrormanager/frontend2/tasks/main.yml, but I think we could move those files into the mirrormanager role, or do something else. Perhaps @adrian could chime in on it?

In addition to a PR to remove this, we also need a small set of Ansible ad-hoc commands to clean up leftover files. Or we could do two PRs: one that uses 'state: absent' to remove everything, and then another to drop things from Ansible entirely.
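For the 'state: absent' approach, the first PR's task could look something like the sketch below. This is an illustration only: the file paths and loop items are placeholders, since the real list depends on what the membership-map role actually installs.

```yaml
# Illustrative sketch -- paths are placeholders, not the role's real file list.
- name: Remove leftover membership-map files
  file:
    path: "{{ item }}"
    state: absent
  loop:
    - /srv/web/membership-map                # hypothetical deployed location
    - /etc/httpd/conf.d/membership-map.conf  # hypothetical proxy config
```

The ad-hoc equivalent would be along the lines of `ansible <hostgroup> -m file -a 'path=/srv/web/membership-map state=absent'`, again with a placeholder path and host group.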


Metadata Update from @phsmoura:
- Issue assigned to phsmoura

2 years ago

Metadata Update from @zlopez:
- Issue priority set to: Waiting on Assignee (was: Needs Review)
- Issue tagged with: low-gain, medium-trouble, ops

2 years ago

Created 2 PRs:
1. PR 1572: Add task in membership-map role to remove its files
2. PR 1573: Remove membership-map role and its references from other files

After executing the task from the first PR, we can merge the second.

Great! I think we should probably wait until the f39 freeze is over to do these.

The two PRs are merged. So what remains?

Metadata Update from @phsmoura:
- Issue close_status updated to: Fixed
- Issue status updated to: Closed (was: Open)

a year ago

