- On your terminal, create a custom ssh key to use with github: `ssh-keygen -t rsa -b 2048 -C "https://kunalmgupta.github.io"`. Use your own email ID or some other string.
- Save the key to a custom location for github, e.g. `/Users/kunal/.ssh/github`. This is going to create a private key `~/.ssh/github` and a public key `~/.ssh/github.pub`.
- View and then copy the public key using `cat ~/.ssh/github.pub`, and add it to your github account at https://github.com/settings/keys.
- Since we made a custom ssh key, we have to add an entry for it to `~/.ssh/config`. If that file is not there yet, create a new one: `vim ~/.ssh/config`
- Add the following to the config file. Replace `/Users/kunal/.ssh/github` with your own ssh key path.
```
Host github.com
  HostName github.com
  User git
  Port 22
  PreferredAuthentications publickey
  IdentityFile /Users/kunal/.ssh/github
  IdentitiesOnly yes
```
- Check if authentication is working: `ssh -T git@github.com`. If it says something like "Hi KunalMGupta! You've successfully authenticated, but GitHub does not provide shell access.", then congratulations, you are all set.
- Create a new repository on your github account, then `git clone` it to your machine.
- Add your username to git: `git config --global user.name "KunalMGupta"`
- In your folder, add the remote. For instance, for this repo you would do `cd RemoteResearchTools` and then `git remote set-url origin git@github.com:KunalMGupta/RemoteResearchTools.git`
- Now you can do your usual add/commit/push to the remote (see the example after this list). Enjoy!
- To see the log of all commits: `git log`
- To see the files that are currently being tracked: `git ls-tree -r master --name-only`
- To push files: `git push origin main`
- Set email: `git config --global user.email k5gupta@eng.ucsd.edu`
- Set username: `git config --global user.name KunalMGupta`
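For reference, a typical cycle after the setup above might look like the sketch below. The repository name, file name, and commit message are placeholders, not part of this guide.

```
# Clone your new repository over ssh (placeholder repo name)
git clone git@github.com:KunalMGupta/MyNewRepo.git
cd MyNewRepo

# Stage, commit, and push a change
git add results.txt
git commit -m "Add experiment results"
git push origin main
```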
I have found that a combination of NoMachine and TeamViewer works quite well for this. The catch is that TeamViewer generally gives much better bandwidth than NoMachine, but I couldn't get it to start from the ssh terminal itself; luckily NoMachine does not have this issue. So the trick is to first use NoMachine to launch TeamViewer (only if it is not already running). Once TeamViewer is active, we can simply use that.
- Install NoMachine on both your machine (client) and the linux server (host). For download instructions look here for Mac OS and Linux. Also, I recommend first downloading both installers to your local machine and then using scp to copy the linux installer to the server (see the sketch after this list).
- NoMachine will automatically be activated after the installation. Connect to the server with your username and password (using NoMachine).
- Once in, you can simply connect using TeamViewer and close NoMachine later on.
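For example, copying the installer over and installing it on the server might look like this. The file name, username, and server address are placeholders, and the `dpkg` step assumes a Debian/Ubuntu server with the `.deb` package.

```
# Copy the NoMachine installer from your laptop to the server (placeholder names)
scp ~/Downloads/nomachine_amd64.deb kunal@my-server.example.com:~/

# On the server, install the package (Debian/Ubuntu assumed)
sudo dpkg -i ~/nomachine_amd64.deb
```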
Sometimes it is handy to be able to use Finder to quickly browse through the directories on your remote machine. For this we use the Cyberduck application, which relies on SFTP, is super easy to set up, and is free (but I encourage donating if you like it!).
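Cyberduck itself is set up through its GUI (open a new connection, choose SFTP, and enter your server address and credentials). If you want to sanity-check that SFTP access works before configuring it, a quick terminal session does the job; the username and hostname below are placeholders.

```
# Confirm SFTP access to the server (placeholder user/host)
sftp kunal@my-server.example.com
# At the sftp> prompt you can use ls, cd, get <file>, put <file>, and exit
```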
Note: Lately the method for authentication has changed. Use the latest rclone, which will ask you to complete authentication on a system with access to a web browser.
sudo -v ; curl https://rclone.org/install.sh | sudo bash
On several occasions there is a need to upload data from the servers to a place where it can be shared more easily with others. I found rclone to be a great tool for this. To back up some file `backup.zip` to your GDrive, do the following.
- First install rclone on your linux server: `apt install rclone`
- Then configure your Google account with rclone using `rclone config`
- Give some name, say `drive`, to refer to the remote location
- Since we are using GDrive, type `drive` next
- Leave Client Id and Secret blank
- Since we don't have access to a monitor on the remote server, type `n` here to get an external signup link
- Sign in using the generated link on your local machine and copy-paste the generated key under `Enter verification code`
- Type `y` to verify that the token is correct, and then hit `q` to quit
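To double-check that the remote was created, you can list the configured remotes; `drive` here is just the name chosen above.

```
# Should print the remote configured above, e.g. "drive:"
rclone listremotes
```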
The destination is now set under rclone config. To list the contents of your GDrive you can use `rclone lsd <remote name>:`. Now, to do a fast copy from your local server to GDrive, use the following command: `rclone copy backup.zip drive:<target directory in GDrive>/ -v --drive-chunk-size=512M`
Here, the `-v` flag will tell you things like progress and ETA, and the `--drive-chunk-size` flag is crucial to make this run fast. However, in my experience any value above 128M doesn't help much, since the upload is happening over commodity internet, which isn't that great. You can always use multiple threads to speed things up (see the sketch below); I use k8s, so I end up using multiple jobs to run the upload, which makes this process fast!
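If you are uploading a directory of many files rather than a single zip, rclone can also parallelize the transfer itself via the `--transfers` flag; the directory and remote paths below are placeholders.

```
# Upload a whole directory, moving several files in parallel
rclone copy ./results drive:backups/results/ -v --transfers=8 --drive-chunk-size=128M
```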
- https://medium.com/@minkesh.asati/development-with-the-docker-on-the-remote-server-via-ssh-from-vs-code-ef9e0f2fcbe6
- docker/for-mac#770 (comment)
- https://code.visualstudio.com/docs/remote/tunnels#_using-the-code-cli