chore: update urls
purarue committed Oct 27, 2024
1 parent d008147 commit 620c8c2
Showing 13 changed files with 23 additions and 23 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -15,19 +15,19 @@ This runs on a Debian server, but it should be OS agnostic. `vps_install` will

[`bin`](./bin) includes scripts that are run on my machine or on the server

-See [here](https://sean.fish/x/blog/server-setup/) for a blog post describing how I set up this server.
+See [here](https://purarue.xyz/x/blog/server-setup/) for a blog post describing how I set up this server.

- [`vps_install`](./bin/vps_install) clones and sets up environments for each application. It checks that the corresponding commands/packages are installed and that required credentials/files are in the right location, then installs virtual environments/packages for each application.
- [`super`](./super) lets me interact with the underlying `supervisord`/`supervisorctl` processes with my environment variables/configuration.
- [`logs`](./logs) streams the logs from all applications
-- [`vps_backup`](./bin/vps_backup) copies cache/token files to a tar.gz so they can be backed up. [runs with bgproc](https://github.com/purarue/bgproc). [`backup_server`](./backup_server) is run from my computer, which ssh's into the server to run that. Runs once per day, in [`housekeeping`](https://sean.fish/d/housekeeping)
+- [`vps_backup`](./bin/vps_backup) copies cache/token files to a tar.gz so they can be backed up. [runs with bgproc](https://github.com/purarue/bgproc). [`backup_server`](./backup_server) is run from my computer, which ssh's into the server to run that. Runs once per day, in [`housekeeping`](https://purarue.xyz/d/housekeeping)
- [`vps_deploy`](./bin/vps_deploy) and [`deploy`](./deploy) are basic ssh/git pull/restart/view-logs scripts for projects which I deploy frequently
- [`vps_generate_static_sites`](./bin/vps_generate_static_sites) builds my static websites and places them in `~/static_files`.
- [`remsync`](./bin/remsync) is a script that's run on my machine, which rsyncs files from a local directory to the server. That directory is served with nginx, so I can sync something to the server from my CLI and send someone a link. [example output](https://gist.github.com/purarue/2b11729859d248069a0eabf2e91e2800). Has two endpoints, `f` and `p`, which specify private (a non-autoindexed nginx listing) and public indexes.
- [`playlist`](./bin/playlist) interfaces with my [playlist manager](https://github.com/purarue/plaintext-playlist). It allows me to select multiple playlists, combines all the files from those playlists into a single mp3, and syncs that up to my server with `remsync`. I often run this on my machine before I leave my house; I then listen to the file on my phone by going to the corresponding URL.
- [`mediaproxy`](./bin/mediaproxy) to ssh into the server, `youtube-dl`/`ffmpeg` something and host it on a readable link. Has video/audio wrappers that use more of my [personal scripts](https://github.com/purarue/dotfiles/) to prompt me to select format codes (similar to [`mpvf`](https://github.com/purarue/mpvf/)). That way, I can press a keybind, which grabs the URL from my clipboard and re-hosts it on my server.
- [`shorten`](./bin/shorten) creates a shortened url using [`no-db-shorturl`](https://github.com/purarue/no-db-shorturl)
-- [`approve-comments`](./bin/approve-comments) approves comments for my guest book at [https://sean.fish](https://github.com/purarue/glue)
+- [`approve-comments`](./bin/approve-comments) approves comments for my guest book at [https://purarue.xyz](https://github.com/purarue/glue)
- [`mnu`](./bin/mnu) runs the periodic job to update the [google sheet](https://github.com/purarue/mnu_gsheets)

- [`directories`](./directories) is a helper script sourced at the top of other scripts that defines common application location environment variables
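The `remsync` bullet above boils down to an rsync into a web-served directory plus a predictable URL. A minimal sketch: the variable values mirror the ones visible in `bin/remsync` in this diff, but `remote_url` and `sync_file` are hypothetical helper names for illustration, not the script's actual internals:

```shell
#!/usr/bin/env bash
# sketch of the remsync flow: rsync a file into an nginx-served directory
# on the server, then print the URL it becomes reachable at.
SYNC_USER="sean"
SSH_TO='vultr'                  # host alias configured in ~/.ssh/config
TO_DIR="/home/${SYNC_USER}/f/"  # /f/ is the private (non-autoindexed) index
BASE_URL="https://purarue.xyz"

remote_url() {
  # map a local file path to the URL it will be served at under /f/
  printf '%s/f/%s\n' "$BASE_URL" "$(basename "$1")"
}

sync_file() {
  rsync -avz "$1" "${SSH_TO}:${TO_DIR}" && remote_url "$1"
}
```

Calling `sync_file notes.pdf` would push the file and print `https://purarue.xyz/f/notes.pdf`, ready to be sent to someone as a link.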
2 changes: 1 addition & 1 deletion bin/approve-comments
@@ -1,5 +1,5 @@
#!/bin/bash
-# run from my computer to approve guest book comments at https://sean.fish/#
+# run from my computer to approve guest book comments at https://purarue.xyz/#
# specify one of --approve-comments, --review-comments, --print-count or --print-new-comments to this script
# defaults to --approve-comments
# see https://github.com/purarue/glue/blob/master/production_server
2 changes: 1 addition & 1 deletion bin/mediaproxy
@@ -17,7 +17,7 @@
readonly SYNC_USER="sean"
readonly SYNC_KEYFILE="${HOME}/.ssh/vultr" # ssh keyfile
readonly TO_SERVER=140.82.50.43
-readonly BASE_URL="https://sean.fish/m"
+readonly BASE_URL="https://purarue.xyz/m"

######### SETUP

2 changes: 1 addition & 1 deletion bin/remsync
@@ -54,7 +54,7 @@ readonly SYNC_USER="sean"
readonly SSH_TO='vultr' # setup in ~/.ssh/config
declare TO_DIR="/home/${SYNC_USER}/f/"
[[ -n "$REMSYNC_PUBLIC" ]] && TO_DIR="/home/${SYNC_USER}/p/"
-readonly BASE_URL="https://sean.fish"
+readonly BASE_URL="https://purarue.xyz"
readonly TO_DIR

# local information
2 changes: 1 addition & 1 deletion bin/remsync-image
@@ -29,5 +29,5 @@ cp -p "$image" "${XDG_DOCUMENTS_DIR}/remsync/i/${name}" || {
}
remsync || exit 1
# create URL
-url="https://sean.fish/f/i/${name}"
+url="https://purarue.xyz/f/i/${name}"
printf '%s' "$url" | clp
4 changes: 2 additions & 2 deletions bin/shorten
@@ -6,11 +6,11 @@
# the server generates a random hash
# e.g.
# shorten "https://wiki.archlinux.org/index.php/File_opener" open
-# would return "https://sean.fish/s/open"
+# would return "https://purarue.xyz/s/open"
# which now redirects to the archwiki link

# handle user input
-readonly UPLOAD_TO="https://sean.fish/s/"
+readonly UPLOAD_TO="https://purarue.xyz/s/"
readonly SHORTURL_TOKEN="${SHORTURL_TOKEN:?No shorturl token set}"
readonly URL="${1:?No url provided to shorten}"
readonly HASH="$2" # is fine if this is empty
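The hunk above only shows the request setup (`UPLOAD_TO`, the token, the URL, an optional hash); the HTTP call itself falls outside the diff. As a hedged sketch of the surrounding logic, with the actual request left as a comment since no-db-shorturl's exact API is not shown here:

```shell
#!/usr/bin/env bash
# hypothetical shorten-style client; only UPLOAD_TO mirrors bin/shorten,
# the request format below is an assumption and left commented out
UPLOAD_TO="https://purarue.xyz/s/"

short_link() {
  # the server stores the long URL under a hash (given or generated);
  # joining that hash onto UPLOAD_TO yields the shareable short link
  printf '%s%s\n' "$UPLOAD_TO" "$1"
}

# hypothetical request, not the script's literal curl invocation:
#   curl -s -X POST -H "Token: $SHORTURL_TOKEN" \
#     --data "$URL" "${UPLOAD_TO}${HASH}"
short_link "open"   # prints https://purarue.xyz/s/open
```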
2 changes: 1 addition & 1 deletion bin/update-recent-page-hits
@@ -3,7 +3,7 @@
set -o pipefail

main() {
-data="$(curl -sL 'https://sean.fish/api/page_hit/7' | jq)" || return $?
+data="$(curl -sL 'https://purarue.xyz/api/page_hit/7' | jq)" || return $?
count="$(echo "$data" | jq -r '.count')"
# https://github.com/purarue/is-integer
if is-integer "$count" >/dev/null; then
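The pattern in this script (fetch JSON, extract `count` with jq, validate it is an integer before using it) can be sketched as below; `fetch_count` mirrors the curl/jq pipeline but is not executed here, and `is_int` is a plain-bash stand-in for the separate `is-integer` script:

```shell
#!/usr/bin/env bash
# sketch of update-recent-page-hits' fetch-then-validate pattern
fetch_count() {
  # same endpoint and jq extraction as in the script above
  curl -sL 'https://purarue.xyz/api/page_hit/7' | jq -r '.count'
}

# stand-in for the is-integer helper: succeed only on non-negative integers
is_int() { [[ "$1" =~ ^[0-9]+$ ]]; }

# usage sketch:
#   count="$(fetch_count)" || exit $?
#   is_int "$count" && echo "recent page hits: $count"
```

Validating before use matters because a transient server error would make `jq` emit `null` or an error string rather than a number.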
4 changes: 2 additions & 2 deletions bin/vps_backup
@@ -76,8 +76,8 @@ mkdir_if_not_exists "$BACKUP_COUNTDOWN"
mkdir_if_not_exists "$BACKUP_NOTIFY"

# save data from my website
-curl -s 'https://sean.fish/api/gb_comment' >"$BACKUP_DIR/gb_comment.json"
-curl -s 'https://sean.fish/api/page_hit' >"$BACKUP_DIR/page_hit.json"
+curl -s 'https://purarue.xyz/api/gb_comment' >"$BACKUP_DIR/gb_comment.json"
+curl -s 'https://purarue.xyz/api/page_hit' >"$BACKUP_DIR/page_hit.json"

expect_file_and_copy "$NOTIFY_BOT/token.yaml" "$BACKUP_NOTIFY"
expect_file_and_copy "$NOTIFY_BOT/old" "$BACKUP_NOTIFY"
8 changes: 4 additions & 4 deletions functions.sh
@@ -9,8 +9,8 @@ alias remsync-public='REMSYNC_PUBLIC=1 remsync' # to push to /p/ (public index)
alias remsync-ranger='ranger "${XDG_DOCUMENTS_DIR}/remsync" && remsync'
alias remsync-public-ranger='ranger "${HOME}/Files/remsync_public" && remsync-public'
alias print-new-comments='approve-comments --print-new-comments'
-alias page-hits="curl -s 'https://sean.fish/api/page_hit' | jq '.count'"
-alias gb-comments="curl 'https://sean.fish/api/gb_comment' | jq 'reverse'"
+alias page-hits="curl -s 'https://purarue.xyz/api/page_hit' | jq '.count'"
+alias gb-comments="curl 'https://purarue.xyz/api/gb_comment' | jq 'reverse'"
gb-comments-pretty() {
gb-comments |
jq '.[]' -c |
@@ -21,10 +21,10 @@ gb-comments-pretty() {
# print/select open shortened urls
# https://github.com/purarue/no-db-shorturl
alias shorturls="ssh vultr 'ls shorturls'"
-alias shz="shorturls | fzf | sed -e 's|^|https://sean.fish/s/|' | tee /dev/tty | clipcopy"
+alias shz="shorturls | fzf | sed -e 's|^|https://purarue.xyz/s/|' | tee /dev/tty | clipcopy"
remsync-html-from-stdin() {
local tmpf
-# https://sean.fish/d/pipehtml?redirect
+# https://purarue.xyz/d/pipehtml?redirect
tmpf="$(pipehtml "$*")"
remsync "$tmpf"
rm -f "$tmpf"
2 changes: 1 addition & 1 deletion jobs/linux/backup_server_tar.job
@@ -3,6 +3,6 @@
evry 2 weeks -backup-fish-backup && {
backup_to="${HOME}/Files/Backups/fish_server"
mkdir -p "${backup_to}"
-printlog 'backing up tar.gz sean.fish...'
+printlog 'backing up tar.gz purarue.xyz...'
cp ~/.cache/backup_dir.tar.gz "${backup_to}/$(epoch)_backup_dir.tar.gz"
}
6 changes: 3 additions & 3 deletions jobs/linux/check_fish_server.job
@@ -3,7 +3,7 @@
wait-for-internet -q --timeout "${WFI_TIMEOUT:-10}" || exit 0

evry 30 minutes -check-fish-server && {
-printlog 'checking sean.fish...'
-HTTP_CODE="$(curl -L -so /dev/null -w "%{http_code}" 'https://sean.fish')"
-[[ "$HTTP_CODE" != "200" ]] && send-error "sean.fish is down"
+printlog 'checking purarue.xyz...'
+HTTP_CODE="$(curl -L -so /dev/null -w "%{http_code}" 'https://purarue.xyz')"
+[[ "$HTTP_CODE" != "200" ]] && send-error "purarue.xyz is down"
}
4 changes: 2 additions & 2 deletions jobs/linux/guestbook_comments.job
@@ -1,14 +1,14 @@
#!/usr/bin/env bash
# saves the number of unapproved comments to a cache file
-# this is for my guest book on https://sean.fish/
+# this is for my guest book on https://purarue.xyz/

wait-for-internet -q --timeout "${WFI_TIMEOUT:-10}" || exit 0

evry 15 minutes -guestbook_comments && {

get_count() {
local COUNT_LINE
-COUNT_LINE="$(curl -sL 'https://sean.fish/api/gb_comment/1' | jq -r .count)" || return $?
+COUNT_LINE="$(curl -sL 'https://purarue.xyz/api/gb_comment/1' | jq -r .count)" || return $?
echo "$COUNT_LINE"
}

2 changes: 1 addition & 1 deletion jobs/linux/page_hit_count.job
@@ -3,6 +3,6 @@
wait-for-internet -q --timeout "${WFI_TIMEOUT:-10}" || exit 0

evry 30 minutes -recent_page_hits && {
-printlog 'recent_page_hits:getting recent page hit count from sean.fish'
+printlog 'recent_page_hits:getting recent page hit count from purarue.xyz'
update-recent-page-hits
}
