Today I'm asking about your scripts.
Almost every engineer I know has a collection of scripts/utilities for automating ${something}. I'm willing to bet that HN users have some of the most interesting scripts on the internet.
So that said, could I please see your scripts?
I'll go first: https://github.com/fastily/autobots
On first connect to a server, this syncs all the dotfiles I want to the remote host; on subsequent connects, it updates them.
Idk if this is "special", but I haven't seen anyone else do it really, and it beats for example Ansible playbooks by being dead simple.
Match Host 192.168.123.*,another-example.org,*.example.com User myusername,myotherusername
ForwardAgent yes
PermitLocalCommand yes
LocalCommand rsync -L --exclude .netrwhist --exclude .git --exclude .config/iterm2/AppSupport/ --exclude .vim/bundle/youcompleteme/ -vRrlptze "ssh -o PermitLocalCommand=no" %d/./.screenrc %d/./.gitignore %d/./.bash_profile %d/./.ssh/git_ed25519.pub %d/./.ssh/authorized_keys %d/./.vimrc %d/./.zshrc %d/./.config/iterm2/ %d/./.vim/ %d/./bin/ %d/./.bash/ %r@%n:/home/%r
Command help, inspired by http://explainshell.com/, extracts help text from builtin commands and man pages. Here's an example:
$ ch ls -AXG
ls - list directory contents
-A, --almost-all
do not list implied . and ..
-X sort alphabetically by entry extension
-G, --no-group
in a long listing, don't print group names
https://github.com/learnbyexample/regexp-cut/blob/main/rcut is another - it uses awk to provide cut-like syntax for field extraction. After writing this, I found commands like `hck` and `tuc`, written in Rust, that solve most of the things I wanted.
https://github.com/tpapastylianou/misc-updater
In full honesty, I'm as proud of the "MISC" acronym as of the script itself. :p
I'm secretly hoping the acronym catches on for referring to any stuff outside the control of a system's standard package management.
I eventually got tired of writing that manually, so I wrote a small
git co-commit --thor ...
that works just like 'git commit', except it adds another line to the commit message template. Placing it in e.g. ~/bin/git-co-commit and having ~/bin in your $PATH will enable it as a git sub-command.
I've never had a use for this before, and I don't think I'll need it much beyond this team, but this was my first git sub-command that wasn't trivially solvable by existing command parameters (that I know of).
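The author's script isn't shown, so here is a hedged sketch of how such a sub-command might look, assuming the `--thor` flag adds a Co-authored-by trailer (the name/email are placeholders, and `git commit --trailer` needs git 2.32+). It only builds the translated argument list; the real script would end with `exec git commit "${args[@]}"`:

```shell
#!/usr/bin/env bash
# Sketch: translate a hypothetical --thor flag into a commit trailer,
# passing all other arguments through to `git commit` unchanged.
co_commit_args() {
    local arg
    local -a out=()
    for arg in "$@"; do
        if [[ $arg == --thor ]]; then
            # placeholder identity - substitute your co-worker's
            out+=(--trailer "Co-authored-by: Thor <thor@example.com>")
        else
            out+=("$arg")
        fi
    done
    # print one argument per line so the translation is visible
    printf '%s\n' "${out[@]}"
}
```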
https://gist.github.com/sshine/d5a2986a6fc377b440bc8aa096037...
~LButton up::
if (GetKeyState("CapsLock", "P")) {
while(!GetKeyState("LButton", "P")){
MouseClick, left
sleep 20
}
}
return
Useful in situations when you need to click a lot :)
Also, a bookmarklet that you can use to turn any page dark with a click:
javascript:document.querySelectorAll('*').forEach(e=>e.setAttribute('style','background-color:#222;background-image:none;color:#'+(/^A|BU/.test(e.tagName)?'36c;':'eee;')+e.getAttribute('style')))
From here: https://github.com/x08d/222
#!/usr/bin/sh
while true; do
reset;
"$@";
inotifywait -e MODIFY --recursive .
done
For example, if you invoke `rerun make test` then `rerun` will run `make test` whenever you save a file in your editor.
So I use this script to give me a nice work environment, based on each day.
Every time you open bash, it'll drop you into today's directory. (~/work/year/month/day/)
When I think about stuff it's like.. oh yeah I worked on that last week, last year, etc - the folder structure makes this a lot easier, and you can just write 'notes' or 'meeting-with-joe' and you know the ref date.
For your bashrc:
alias t='source /path/to/today'
t
Now every day you'll know what you worked on yesterday!
# this is where you'll get dropped by default.
calvin@bison:~/work/2022/07/10$
calvin@bison:~/work/2022/07/10$ ls
WardsPerlSimulator.pl
calvin@bison:~/work/2022/07/10$ cd ..; ls;
01 02 04 05 06 07 08 09 10
calvin@bison:~/work/2022/07$ cd ..; ls
01 02 03 04 05 06 07
calvin@bison:~/work/2022$ cd ..; ls
2021 2022
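The `today` script itself isn't shown; a minimal sketch of what it might look like, based on the description above (make today's directory, refresh the ~/t symlink, cd there — it must be sourced, as the alias does, so the cd sticks):

```shell
# Sketch of a `today` script (an assumption, not the author's version).
d="$HOME/work/$(date +%Y/%m/%d)"   # e.g. ~/work/2022/07/10
mkdir -p "$d"                      # create the day's directory
ln -sfn "$d" "$HOME/t"             # keep ~/t pointing at today
cd "$d"                            # drop the shell into it
```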
additionally you'll get a shortcut: you can type 't' as a bash fn, or go to ~/t/ which is symlinked and updated every time you run today (which is every time you open bash or hit 't'). This is useful if you want to have Firefox/Slack/whatever always save something in your 'today' folder.
1. Preserve single newlines that people typed in: often people hit Return only once, and their intended formatting becomes a wall of text. Hacker News preserves the newline in the HTML.
.commtext {
white-space: pre-wrap;
}
.commtext .reply {
white-space: normal; /* fix extraneous whitespace caused by first rule */
}
2. Vertical lines to visually represent thread depth!
.ind {
background-image: repeating-linear-gradient(to right, transparent 0px 39px, rgba(204, 204, 204, 1.0) 39px 40px);
}
https://github.com/dom111/dotfiles/blob/master/bin/git
which when combined with files like:
$cat ~/.gitidentities/github.com
[user]
name = dom111
email = dom111@users.noreply.github.com
means I don't accidentally leak email addresses etc.
Also, not entirely related, but I wrote a small tool to add some animated GIFs to scripts: https://dom111.github.io/gif-to-ansi/
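As an aside, newer git (2.36+) can get a similar per-host identity effect without a wrapper script, via a conditional include keyed on the remote URL; a sketch for ~/.gitconfig, reusing the same identity files:

```ini
[includeIf "hasconfig:remote.*.url:https://github.com/**"]
    path = ~/.gitidentities/github.com
```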
To use this alias, make an executable file called `git-recent` with the following contents and ensure it is in your `$PATH`
git for-each-ref \
--sort=-committerdate refs/heads/ \
--format='%(HEAD) %(color:red)%(objectname:short)%(color:reset) %(color:yellow)%(refname:short)%(color:reset) - %(contents:subject) - %(authorname) (%(color:green)%(committerdate:relative)%(color:reset))'
Here's a redacted example of what the output looks like:
* fc924a7e68 team/name/feature-2 - save - My Name (4 days ago)
2fed1acfac team/name/feature-1 - add test - My Name (3 weeks ago)
4db4d4ac77 main - Remove changes (#22397) - My Name (6 weeks ago)
I made a CLI utility for automating certain operations I was doing all the time: rsync of sources (push or pull), db backup / rollback, copying the local db to the remote server or back, etc. The utility looked for a dotfile in the project directory to get things like the remote server address, remote project path, etc.
The tool served several purposes:
- Executing auxiliary tools (rsync, mysqldump, drush) with the right parameters, without requiring me to remember them.
- Storing (non-secret) information about the remote environment(s) in the project directory.
- Some dangerous operations (e.g. copying the local db to the remote server) were prohibited unless the dotfile explicitly enabled them. Some sites were only edited on dev and then pushed to production, but some had user data that should never be overwritten.
- When running rsync of sources the tool always did a dry-run first, and then required entering a randomly generated 4-letter code to execute them... so I would have to stop and think and didn't deploy by mistake.
This tool is too rough for sharing it with the general public... but I consider it one of my greatest professional achievements because it saved me a lot of mental effort and stress over the years, quite a bit of time, prevented me from shooting myself in the foot, and forced me to use proper workflows every time instead of winging it. It required a small investment of time and some foresight... but my philosophy is that my work should be to build tools to replace me, and that was a step in that direction.
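The confirmation-code step described above can be sketched as follows (my own hedged reconstruction, not the author's code): generate a random 4-letter code and refuse to proceed unless it is typed back.

```shell
# Sketch: require a freshly generated 4-letter code before a deploy.
confirm_deploy() {
    local code answer
    code=$(LC_ALL=C tr -dc 'a-z' < /dev/urandom | head -c 4)
    read -r -p "Type '$code' to deploy: " answer
    [ "$answer" = "$code" ]
}
# usage: confirm_deploy && rsync ... (rsync only runs if the code matched)
```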
function ccd { mkdir -p "$1" && cd "$1"; }
Biggest timesaving script I've ever included in my arsenal.
For example, I have one script that uses rumps to show how many outdated homebrew packages I have (and also as a convenient shortcut to update those packages in the dropdown menu). I also have a second script that uses it to show a counter for open pull requests that I need to review (with links to individual PRs in the dropdown menu). It's great!
Result looks like this: https://imgur.com/yy6GlYk.jpg
git-move src/afile.c src/bfile.c src/cfile.c ../destination/repo
https://gist.github.com/mnemnion/87b51dc8f15af3242204472391f...
function go() {
let line = document.querySelector('#statementLines .line');
if (line) {
let leftAndRight = line.querySelectorAll('.statement');
let left = leftAndRight[0];
let right = leftAndRight[1];
if (right.classList.contains('matched')) {
let leftDetails = left.querySelectorAll('.info .details-container .details span');
let leftDate = leftDetails[0].textContent;
let leftReference = leftDetails[1].textContent;
let rightDetails = right.querySelectorAll('.info .details-container .details span');
let rightDate = rightDetails[0].textContent;
let rightReference = rightDetails[2].textContent;
rightReference = rightReference.replace('Ref: ', '');
if (Date.parse(leftDate) == Date.parse(rightDate) && leftReference.toLowerCase() == rightReference.toLowerCase()) {
var okButton = line.querySelector(".ok .okayButton");
console.log(leftReference);
okButton.click();
var waiter = function () {
if (line.parentNode == null) {
go();
}
else {
setTimeout(waiter, 50);
}
};
setTimeout(waiter, 50);
}
else {
console.log("Details don't match");
}
}
else {
console.log("Line not matched");
}
}
else {
console.log("No line found");
}
}
setTimeout(go, 100);
BUT, this thread is so special because it feels like this is the stuff you only get to see when you sit down at a co-worker's desk and watch them type something and then say "WHAT? HOW COOL!"
I miss that part now that it is all remote work. :(
#!/usr/bin/python3
battery_directory = "/sys/class/power_supply/BAT1/"
with open(battery_directory + "status", "r") as f:
state = f.read().strip()
with open(battery_directory + "current_now", "r") as f:
current = int(f.read().strip())
with open(battery_directory + "voltage_now", "r") as f:
voltage = int(f.read().strip())
wattage = (voltage / 10**6) * (current / 10**6)
wattage_formatted = f"{'-' if state == 'Discharging' else ''}{wattage:.2f}W"
if state in ["Charging", "Discharging", "Not charging"]:
print(f"{state}: {wattage_formatted}")
Output:
Charging: 32.15W
Discharging: -5.15W
https://github.com/sirikon/workstation/blob/master/src/cli/c...
For Linux, it can install and configure everything I need when launched on a clean Debian installation. apt repositories, pins and packages; X11, i3, networking, terminal, symlinking configuration of many programs to Dropbox or the repository itself... The idea is to have my whole setup with a single command.
For Mac, it installs programs using brew and sets some configurations. Mac isn't my daily driver so the scripts aren't as complete.
Also there are scripts for the terminal to make my life easier. Random stuff like killing any gradle process in the background, upgrading programs that aren't packetized on APT, backing up savegames from my Anbernic, etc. https://github.com/sirikon/workstation/tree/master/src/shell
And more programs for common use, like screenshots, copying Stripe test cards into the clipboard, launching android emulators without opening Android Studio, etc. https://github.com/sirikon/workstation/tree/master/src/bin
Use it everyday. Great because my company has multiple git submodules in any given project and I can use this to watch for pipeline failures and the like.
x=$(git config --local remote.origin.url|sed -n 's#.*/\([^.]*\)\.git#\1#p')
y=$(git symbolic-ref --short HEAD)
url="https://git.thecompany.com/thecompany/$x/tree/$y"
open -a "firefox developer edition" "$url"
function alarm_forever() {
# play one part of the track at a time so that this function can be killed any time
while :; do
afplay --time .72 ~/sounds/alarm.mp3;
done
}
function alarm_until_input() {
alarm_forever &
pid=$!;
read -n 1 -p "$*";
kill -9 $pid;
}
# pip install termdown
function timer {
termdown $@;
alarm_until_input "[Press any key to stop]"
}
alias alarm="timer"
# TODO: ask if user soaked the rice first
function rice {
echo "1. Wash rice. Place in pressure cooker with 1-1 water-rice ratio."
echo "2. Place the pressure cooker on the stove on high."
read -n 1 -p "3. When the pressure pot starts whistling, press any key to start the timer."
termdown --title "Tweet!" 2m
alarm_until_input "4. Take pot off heat and press any key."
termdown --title "RICE" 11m
alarm_until_input "5. Open the pot and stir the rice immediately."
alarm_until_input "6. Eat!"
}
clip_video () {
ffmpeg -ss "$2" -i "$1" -t "$3" -c copy "$4"
}
Used like so: clip_video filename.mp4 start_point duration output.mp4
https://github.com/trevorgross/installarch/blob/main/install...
It's a personal tool that just kept growing. Probably amateurish by HN standards, but then, I'm an amateur. Yes, I could simply copy a disk image, but that's no fun.
One I was particularly proud of/disgusted by was one that allowed me to jump around a network with a single command despite access being gated by regional jumphosts..
You are warned: https://git.drk.sc/-/snippets/107
Another script I wrote for our devs to get access to MySQL in production on GCP; the intent was for the script to be executable only by root and allow sudo access to only this script: that means also ''chmod ugo-rwx gcloud'' too though: https://git.drk.sc/-/snippets/98
I have another script to generate screenshots from grafana dashboards since that functionality was removed from grafana itself (https://github.com/grafana/grafana/issues/18914): https://git.drk.sc/-/snippets/66
Another time I got annoyed that Wayland/Sway would relabel my screens on successive disconnect/reconnects (IE my right screen could be DP-1 or DP-7 or anything in between randomly); so I wrote a docking script which moves the screens to the right place based on serial number: https://git.drk.sc/-/snippets/74
I have used these two on my machines for the last four years and have written tons of scripts for myself; here are a few:
- Displaying internet/internal ip and allow me to click it to put in clipboard
- taskwarrior
- Simple conversion script that takes my clipboard & encodes/decodes in base64, hex, or URL encoding, converts epoch to UTC, etc.
- "Auto-type" my clipboard by simulating keystrokes - particularly useful for pasting text into terminals that disable the clipboard
- An incident response switch that triggers a script to take a screenshot every 5 seconds when my mouse moves, reduce image quality, and save it to a folder in my home drive. Another script GPG-encrypts them at the end of the day so I can go back and look at an incident if needed.
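The base64 leg of such a clipboard conversion script might look like this (a hedged sketch assuming an X11 session with xclip installed; the author's tool isn't shown): read the clipboard, transform it, write it back.

```shell
# Sketch: base64-encode (default) or -decode (-d) the X clipboard in place.
clip64() {
    if [ "$1" = "-d" ]; then
        xclip -selection clipboard -o | base64 -d  | xclip -selection clipboard
    else
        xclip -selection clipboard -o | base64 -w0 | xclip -selection clipboard
    fi
}
```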
var nodes = [...document.querySelectorAll('*[aria-label="Number of times this review was rated helpful"]')];
nodes.sort((a, b) => (parseInt(b.innerText) || 0) - (parseInt(a.innerText) || 0));
nodes.map(e => ([
parseInt(e.innerText) || 0,
e.parentNode.parentNode.parentNode.parentNode.parentNode.children[1].textContent.toString().trimStart(),
]));
https://github.com/Mister-Meeseeks/subcmd/blob/master/subcmd
It can be configured to exclude certain directories (.cache and Downloads being likely contenders). Also, it can read in config files so it can backup other directories.
cd ()
{
builtin cd "$@" || return $?
ls --my-usual-flags
}
# Change to the Front Folder open in Finder
function ff {
osascript -e 'tell application "Finder"'\
-e 'if (0 < (count Finder windows)) then'\
-e 'set finderpath to get target of the front window as alias'\
-e 'else'\
-e 'set finderpath to get desktop as alias'\
-e 'end if'\
-e 'get POSIX path of finderpath'\
-e 'end tell';};\
function cdff { cd "`ff $@`" || return; };
git() {
if [[ "$1" == 'checkout' ]]; then
echo 'Reminder: Use `git switch` or `git restore` instead.' >&2
fi
command git "$@"
}
Then I wrote a script that does that automatically:
#!/usr/bin/env bash
main() {
local package="$1"
if [ -z "$package" ]
then
echo "usage: $0 PACKAGE"
exit 1
fi
install_package "$package"
}
install_package() {
local package="$1"
local subpackage
if sudo apt-get -y install "$package"
then exit 0
else
sudo apt-get -y install "$package" \
|& grep '^ ' \
| sed 's/[^:]*:[^:]*: //;s/ .*//;' \
| {
while read subpackage
do install_package "$subpackage"
done
}
sudo apt-get -y install "$package" \
&& echo "SUCCESS: $package" \
|| echo "FAILURE: $package"
fi
}
main "$@"
#!/usr/bin/env bash
function shc() {
#: cat for shell scripts, source code.
#: prints text with line numbers and syntax highlighting.
#: accepts input as argument or pipe.
if [ $# -eq 0 ]; then
# arguments equal zero; assume piped input
nl | /usr/local/bin/pygmentize -l bash
# accept piped input, process as source code
else
case "$1" in
-h|--help)
printf "%s\n" "shc usage:" " shc [file]" " type [function] | shc"
;;
-v|--version)
printf "%s\n" "vers 2"
;;
*)
if [ -f "$1" ]; then
# test anything that isn't expected flags for file
cat "$1" | nl | /usr/local/bin/pygmentize -l bash
# process file as source code
else
# if not a file or expected flags, bail
printf "%s\n" "error; not the expected input. read shc_func source for more details"
fi
esac
fi
}
---------
#!/bin/sh
echo "Store and retrieve session token AWS STS \n\n"
# Get source profile
read -p "Source Profile [
# Get destination profile
read -p "Destination Profile [
mfa_serial_number='arn:aws:iam::
echo "\nOTP: "
read -p "One Time Password (OTP): " otp
echo "\nOTP:" $otp
echo "\n"
output=$(aws sts get-session-token --profile
echo $output
access_key_id=$(echo $output | jq .Credentials.AccessKeyId | tr -d '"')
secret_access_key=$(echo $output | jq .Credentials.SecretAccessKey | tr -d '"')
session_token=$(echo $output | jq .Credentials.SessionToken | tr -d '"')
aws configure set aws_access_key_id $access_key_id --profile=$destination_profile
aws configure set aws_secret_access_key $secret_access_key --profile=$destination_profile
aws configure set aws_session_token $session_token --profile=$destination_profile
echo "Configured AWS for profile" $destination_profile
!`::CycleCurrentApplication(0)
!+`::CycleCurrentApplication(1)
WhichMonitorAppIsOn(winId) {
WinGetPos, cX, cY, cW, cH, ahk_id %winId%
xMid := cX + (cW / 2)
yMid := cY + (cH / 2)
SysGet, nMons, MonitorCount
Loop, % nMons
{
; MsgBox %A_Index%
SysGet, tmp, Monitor, %A_Index%
withinWidth := (xMid > tmpLeft) && (xMid < tmpRight)
; MsgBox % tmpLeft . " -> " . tmpRight . "`t" . xMid
if (withinWidth == 1)
return %A_Index%
}
}
CycleCurrentApplication(same_desktop_only) {
WinGet, curID, ID, A
curMon := WhichMonitorAppIsOn(curID)
WinGetClass, ActiveClass, A
WinGet, WinClassCount, Count, ahk_class %ActiveClass%
IF WinClassCount = 1
Return
Else
WinGet, List, List, % "ahk_class " ActiveClass
Loop, % List
{
index := List - A_Index + 1
WinGet, State, MinMax, % "ahk_id " List%index%
WinGet, nextID, ID, % "ahk_id " List%index%
nextMon := WhichMonitorAppIsOn(nextID)
if (same_desktop_only > 0 && (curMon != nextMon))
continue
if (State != -1) ; if window not minimised
{
WinID := List%index%
break
}
}
WinActivate, % "ahk_id " WinID
}
The reason I like it is it also backs up the original in case I mess up the regex (happens sometimes...)
#!/usr/bin/env bash
perl -i.bak -p -e 's/oldtext/newtext/g;' $1
#!/bin/sh
if tmux has-session 2>/dev/null; then
if [ $(tmux list-windows -f '#{window_active_clients}') ]; then
if [ $(tmux ls | head -n 1 | awk '{print $2}') -le 2 ]; then
xterm -e "tmux new-session -f active-pane,ignore-size -t "0" \; new-window"
else
xterm -e "tmux new-session -f active-pane,ignore-size -t "0" \; select-window -t +2"
fi
else
xterm -e "tmux attach -f active-pane,ignore-size -t "0""
fi
else
xterm -e tmux new-session -f active-pane,ignore-size
fi
if [ $(tmux ls | wc -l) -gt 1 ]; then
for i in $(tmux ls -F '#S' -f '#{?session_attached,,#S}' ); do
tmux kill-session -t ${i}
done
fi
This can be extended easily, even dynamically creating an account if the user is part of an org, or use libnss-ato to alias the user to a specific account.
#! /bin/bash
remote_file_path=$1
wget --recursive --level=5 --convert-links --page-requisites --wait=1 --random-wait --timestamping --no-parent "${remote_file_path}"
And a couple of zshrc functions which make jumping around my filesystem quite snappy. `jump` is aliased to `j`, and `mark` to `m`.
MARKPATH=~/.marks
function jump {
cd -P ${MARKPATH}/$1 2> /dev/null || (echo "No such mark: $1" && marks)
}
function mark {
mkdir -p ${MARKPATH}; ln -s $(pwd) $MARKPATH/$1
}
function unmark {
rm -i ${MARKPATH}/$1
}
function marks {
ls -l ${MARKPATH} | sed 's/  */ /g' | cut -d' ' -f9- && echo
}
_jump()
{
local cur=${COMP_WORDS[COMP_CWORD]}
COMPREPLY=( $(compgen -W "$( ls $MARKPATH )" -- $cur) )
}
complete -F _jump jump
(Totally stolen, and fixed up to work in ZSH)
function git-checkout-branch-by-search-string() {
local maybe_branch_name
maybe_branch_name=$(git branch --sort=-committerdate | grep $1 | head -n 1)
if [ -n "$maybe_branch_name" ]; then
git checkout "${maybe_branch_name:2}"
else
echo "Could not find branch matching $1"
fi
}
alias gcos="git-checkout-branch-by-search-string"
Branches often include things like ticket numbers and project keys, so you can do
$ gcos 1234
and save some typing.
I have a pair of fixup commit functions, which make it faster to target fixup commits prior to rebasing:
function git-commit-fixup() {
git commit --fixup ":/$*"
}
function git-add-all-then-git-commit-fixup() {
git add .
git commit --fixup ":/$*"
}
Long function names that are then assigned to an alias can make it easier to find them later if you forget rarely used ones. That is, you can do:
$ alias | grep fixup
to see the list of relevant aliases and the functions they call.
I also have two functions I use like a linear git bisect:
function git-checkout-parent-commit() {
local prev
prev=$(git rev-parse HEAD~1)
git checkout "$prev"
}
function git-checkout-child-commit() {
local forward
forward=$(git-children-of HEAD | tail -1)
git checkout "$forward"
}
function git-children-of() {
for arg in "$@"; do
for commit in $(git rev-parse $arg^0); do
for child in $(git log --format='%H %P' --all | grep -F " $commit" | cut -f1 -d' '); do
echo $child
done
done
done
}
#!/usr/bin/env bash
for file in `find . -name '*.md'`; do
output=${file::-3}.html
if [[ `date -r "$file" "+%s"` -le `date -r "../$output" "+%s"` ]]
then
echo "Skipping $file"
continue
fi
mkdir -p ../$(dirname $output)
echo Generating $output from $file
cat << EOF > ../$output
`cat head.html`
`cat navigation.html`
`pandoc $file`
EOF
done;
# Simple calculator
function calc() {
local result=""
result="$(printf "scale=10;$*\n" | bc --mathlib | tr -d '\\\n')"
# └─ default (when `--mathlib` is used) is 20
#
if [[ "$result" == *.* ]]; then
# improve the output for decimal numbers
printf "$result" |
sed -e 's/^\./0./' `# add "0" for cases like ".5"` \
-e 's/^-\./-0./' `# add "0" for cases like "-.5"`\
-e 's/0*$//;s/\.$//' # remove trailing zeros
else
printf "$result"
fi
printf "\n"
}
search_notes() {
    input=$(rg -v '(\-\-)|(^\s*$)' --line-number /home/user/some-dir |
        fzf --ansi --delimiter : \
            --preview 'batcat --color=always {1} --highlight-line {2}' \
            --preview-window 'up,60%,border-bottom,+{2}+3/3,~3' |
        choose -f : 0)
    if [[ -n $input ]]; then
        less "$input"
    fi
}
It uses various Linux utilities, including fzf and batcat (https://github.com/sharkdp/bat), to open a terminal with all the places where my query comes up (supporting fuzzy search). Since the workhorses are fzf and ripgrep, it is quite fast even for very large directories.
So I will do `search_notes postgres authentication`. I can select a line and it will open the file in less. Works like a charm!
1: https://gist.github.com/adewes/02e8a1f662d100a7ed80627801d0a...
The most recent was a script that parsed a financial report and generated multiple emails depending on a set of criteria. Then the user could manually review these emails and press send if everything checked out. The goal of the script was to reduce some of the menial work my financial co-worker was doing. I don't have it published on GitHub because it has some internal company info in it. But it works cleanly, and regularly saves him hours of tedious work.
Also I highly recommend EasyGui library for those quick scripts that need user input from people who are not comfortable with a console/cmd. Helps make different types of popup windows for user input/selection with a few simple lines.
https://github.com/64kramsystem/openscripts
Missed the previous cheatsheet post :) I have a massive collection, which are large enough to be books more than cheatsheets (still, I access and use them as cheatsheets):
https://github.com/64kramsystem/personal_notes/tree/master/t...
[1] https://gitlab.com/victor-engmark/tilde/-/blob/master/.bash_...
alias filter_repos_z="grep -ZzEv '/tags|/\.hg/|/\.svn/|/\.git/|/\.repo/|\.o$|\.o\.cmd$|\.depend|\.map$|\.dep$|\.js$|\.html$'"
function findxgrep()
{
find . -type f -print0 | filter_repos_z | xargs -0 grep --color=auto "${@}" | grep -v "^Binary file" | sed 's/^\.\///' | less -F
}
The "${@}" is the critical bit that allows me to pass arguments like -i to grep. The grep, find and xargs commands all support using a NULL as a file separator instead of whitespace.
My "Bash Toolkit": https://github.com/adityaathalye/bash-toolkit
My (yak-shaving-in-progress :) "little hot-reloadin' static shite generator from shell": https://github.com/adityaathalye/shite
A subtle-ish aspect is, I like to write Functional Programming style Bash. I've been blogging about it here: https://www.evalapply.org/tags/bash/
sshcreen () {
ssh -t "$@" screen -xRR
}
Works with bash and zsh. Usage is pretty simple:
$ sshcreen user@example.com
Or for local Docker instances mapped to port 2222:
$ sshcreen root@localhost -p 2222
Detach the session with CTRL-A + D, reattach by rerunning the sshcreen command you previously used.
pman()
{
man -t "${1}" | open -f -a /System/Applications/Preview.app
}
I like using this in conjunction with pbcopy to quickly generate a random password at a given length:
pwgen()
{
length=${1:-64}
charlist='0-9a-zA-Z~!@#$%^&*()_+-=:";<>?,./'
tr -dc "$charlist" < /dev/random | head -c "$length"; echo
}
It also uses my https://github.com/pcho/dotfiles, https://github.com/pcho/vimfiles and https://github.com/pcho/zshfiles
alias makepw='cat /dev/urandom | LC_ALL=C tr -cd A-Za-z0-9,_- | head -c 25; echo'
Any proper password manager will of course be able to supplant tricks like these.
My son opened an account over a year ago, but we didn't sign up for Mint until this weekend, so I ended up writing a new import script for the updated API:
- Scripts to test our rate limiting for both authenticated and unauthenticated users (was handy)
- API routes changed in a given PR (set of commits since the last interaction with master in reality)
- ssl-expiration-date - Checks the expiration date of a site's certificate
domain="$1"
echo "Checking the SSL certificate expiration date for: $domain"
curl -vI "$domain" 2>&1 | grep -o 'expire date: .*$'
- test-tls-version - Checks if a website supports a given version of TLS
domain="$1"
curl_options=( "--tlsv${2}" --tls-max "$2" )
curl "${curl_options[@]}" -vI "$domain" 2>&1
There are also some miscellaneous PHP scripts lying around for template-related stuff. PHP makes a great templating language when you need some basic programmatic additions to your output text.
Everything is too coupled to my work to be useful to others, and most of the automation scripts I've written for work run as cron jobs now and send emails to the appropriate people. Most of these are written in PHP (we're a PHP shop).
It shows the current status, lists out the most recent tags, prompts for a new tag and message, and finally pushes.
Everything is colorized so it's easy to read and I use it quite often for Golang projects.
https://github.com/bbkane/dotfiles/blob/e30c12c11a61ccc758f7...
#!/bin/bash
pdf_viewer="mupdf";
latex_cmd="pdflatex -interaction=nonstopmode"
if [[ $# -eq 0 ]]; then
echo "No arguments: filename required"
exit 1
fi
filename=$1;
pdfname=${filename%%.*}.pdf
# inital compilation to make sure a pdf file exists
${latex_cmd} ${filename};
${pdf_viewer} ${pdfname} &
# get pid of the pdf viewer
pdf_viewer_pid=$!;
while true; do
# as long as the pdf viewer is open, continue operation, if it gets closed,
# end script
if kill -0 "${pdf_viewer_pid}" 2>/dev/null; then
if [[ ${filename} -nt ${pdfname} ]]; then
${latex_cmd} ${filename};
# reload pdf file, only works with mupdf
kill -HUP ${pdf_viewer_pid};
touch $pdfname
fi
sleep 1;
else
exit 0;
fi
done;
There are many ways to search for the process, but here's what I use:
lsof -iTCP -sTCP:LISTEN -P | grep [PORT NUMBER]
Look for port num and kill the process with: kill -9 [PID OF PROCESS YOU WANT TO KILL]
Note if running as root user, you will need to prepend the above commands with sudo
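The two steps above can be rolled into one function (my own sketch, not the commenter's workflow; `lsof -t` prints bare PIDs, suitable for feeding straight to kill):

```shell
# Sketch: kill whatever is listening on the given TCP port, e.g. killport 3000
killport() {
    local pids
    pids=$(lsof -t -iTCP:"$1" -sTCP:LISTEN)
    # only kill if something was actually found listening
    [ -n "$pids" ] && kill -9 $pids
}
```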
As a workaround, I wrote a small wrapper script that would enable multi-threading for SimpleHTTPServer.
~/bin/http-cwd , Python 2 version (original):
#!/usr/bin/python
import argparse
import BaseHTTPServer
import SimpleHTTPServer
import SocketServer
import sys
class ThreadedHTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
pass
def main(argv):
parser = argparse.ArgumentParser()
parser.add_argument(
"--port", type = int, nargs = "?",
action = "store", default = 8000,
help = "Specify alternate port [default: 8000]",
)
parser.add_argument(
"--iface", type = str, nargs = "?",
action = "store", default = "127.0.0.1",
help = "Specify iface [default: 127.0.0.1]",
)
args = parser.parse_args(argv[1:])
server_address = (args.iface, args.port)
srv = ThreadedHTTPServer(server_address, SimpleHTTPServer.SimpleHTTPRequestHandler)
sa = srv.socket.getsockname()
print "Serving http://%s:%r ..." % (sa[0], sa[1])
srv.serve_forever()
if __name__ == "__main__":
sys.exit(main(sys.argv))
Python 3 version (necessary for platforms that have dropped Python 2, such as macOS):
#!/usr/bin/python3
import argparse
import http.server
import socketserver
import sys
class ThreadedHTTPServer(socketserver.ThreadingMixIn, http.server.HTTPServer):
pass
def main(argv):
parser = argparse.ArgumentParser()
parser.add_argument(
"--port", type = int, nargs = "?",
action = "store", default = 8000,
help = "Specify alternate port [default: 8000]",
)
parser.add_argument(
"--iface", type = str, nargs = "?",
action = "store", default = "127.0.0.1",
help = "Specify iface [default: 127.0.0.1]",
)
args = parser.parse_args(argv[1:])
server_address = (args.iface, args.port)
srv = ThreadedHTTPServer(server_address, http.server.SimpleHTTPRequestHandler)
sa = srv.socket.getsockname()
print("Serving http://%s:%r ..." % (sa[0], sa[1]))
srv.serve_forever()
if __name__ == "__main__":
sys.exit(main(sys.argv))
# Search all directories for this directory name.
dname() {
[ $# -eq 0 ] && echo "$0 'dir_name'" && return 1
fd --hidden --follow --exclude .git --type directory "$*"
}
# Search all files for this filename.
fname() {
[ $# -eq 0 ] && echo "$0 'file_name'" && return 1
fd --hidden --follow --exclude .git --type file "$*"
}
# Find and replace with a pattern and replacement
sub() {
[ $# -ne 2 ] && echo "$0 'pattern' 'replacement'" && return 1
pattern="$1"
replace="$2"
command rg -0 --files-with-matches "$pattern" --hidden --glob '!.git' | xargs -0 perl -pi -e "s|$pattern|$replace|g"
}
# Uses z and fzf, if there's a match then jump to it. If not, bring up a list via fzf to fuzzy search.
unalias z 2> /dev/null
z() {
[ $# -gt 0 ] && _z "$*" && return
cd "$(_z -l 2>&1 | sed 's/^[0-9,.]* *//' | fzf)"
}
fndi ()
{
tgt="${1}";
shift;
echo find . -iname \*"${tgt}"\* "${@}";
find . -iname \*"${tgt}"\* "${@}" 2> /dev/null;
[[ -z $tgt ]] && {
echo;
echo "No target was specified, did the results surprise?"
}
}
Shorthand to find all files containing a pattern:
fndg ()
{
binOpt="-I";
wordOpt="";
caseOpt="-i";
while true; do
if [[ -z $1 || $1 =~ ^[^-+] ]]; then
break;
fi;
case $1 in
+i)
caseOpt=""
;;
-B)
binOpt=""
;;
-w)
wordOpt="-w"
;;
*)
echo "Unrecognized option '${1}', cannot proceed.";
return 1
;;
esac;
shift;
done;
if [[ -z $2 ]]; then
startIn=.;
else
startIn='';
while [[ ! -z $2 ]]; do
startIn+="$1 ";
shift;
done;
fi;
[[ -z $1 ]] && {
echo "No target specified, cannot proceed.";
return
};
tgt=$1;
echo find ${startIn} -type f -exec grep $binOpt $wordOpt $caseOpt -H "${tgt}" {} \;;
find ${startIn} -type f -exec grep $binOpt $wordOpt $caseOpt -H "${tgt}" {} \; 2> /dev/null
}
There is also a collection of more "obscure" scripts in my shellscripts repository documented here: https://masysma.lima-city.de/32/shellscripts.xhtml.
Another (probably niche) topic is my handling of scanned documents, which arrive as PDFs from the scanner and which I want to number according to the stamped number on the document and convert to PNG at reduced color space: https://masysma.lima-city.de/32/scanning.xhtml
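For the color-space-reduction step specifically, something like Ghostscript's `png16` output device could do the job. This is my own hedged sketch, not the linked setup; `scan.pdf` and the output name are placeholders:

```shell
# Render each page of a scanned PDF to a 16-color PNG at 300 dpi.
gs -dBATCH -dNOPAUSE -sDEVICE=png16 -r300 \
   -sOutputFile=scan-%03d.png scan.pdf
```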
8<-----------------------------
function prompt_func {
CMDNUM=`history 1 | awk '{print $1}'`
LAST_CMD=`history 1 | cut -f 3- -d ' '`
if [ x$LAST_CMDNUM = xwho_knows ]; then
LAST_CMDNUM=$CMDNUM
fi
if [ x$CMDNUM != x$LAST_CMDNUM ]; then
FULL_CMD_LOG="$HOME/full-history/$(date "+%Y-%m-%d").log"
echo "$(date '+%H:%M:%S') `munge_pwd` $LAST_CMD" >> $FULL_CMD_LOG
LAST_CMDNUM=$CMDNUM
fi
}
export PROMPT_COMMAND=prompt_func
export LAST_CMDNUM=who_knows
function fh() {
grep -r --color=NEVER ${*} ~/full-history |
sed 's/[^ ]* //' |
sed 's/ \[[^]]\*\]/$/'
}
8<-----------------------------
`munge_pwd` is another script that does various substitutions on the prompt (specific to how my work directories are laid out), but mostly you can just substitute `pwd` if you don't care about deduplicating stuff like multiple checkouts of the same project.
https://github.com/djsamseng/cheat_sheet/blob/main/grep_for_...
#!/bin/bash
if [ $# -eq 0 ]; then
    echo "Usage: ./grep_for_text.sh \"text to find\" /path/to/folder --include=*.{cpp,h}"
    exit
fi
text=$1
location=$2
# Remove $1 and $2 to pass remaining arguments as $@
shift
shift
result=$(grep -Ril "$text" "$location" \
    $@ \
    --exclude-dir=node_modules --exclude-dir=build --exclude-dir=env --exclude-dir=lib \
    --exclude-dir=.data --exclude-dir=.git --exclude-dir=data --exclude-dir=include \
    --exclude-dir=__pycache__ --exclude-dir=.cache --exclude-dir=docs \
    --exclude-dir=share --exclude-dir=odas --exclude-dir=dependencies \
    --exclude-dir=assets)
echo "$result"
json2yaml() {
python3 -c "import json,sys,yaml; print(yaml.dump(json.load(sys.stdin)))"
}
export -f json2yaml
yaml2json() {
python3 -c "import json,sys,yaml; json.dump(yaml.safe_load(sys.stdin), sys.stdout, default=str)"
}
export -f yaml2json
httping() {
while true; do
curl $@ -so /dev/null \
-w "connected to %{remote_ip}:%{remote_port}, code=%{response_code} time=%{time_total}s\n" \
|| return $?
sleep 1
done
}
[[ ! $(>&/dev/null type httping) ]] && export -f httping
redis-cli() {
REDIS_HOST="${1:-127.0.0.1}"
REDIS_PORT="${2:-6379}"
rlwrap -S "${REDIS_HOST}:${REDIS_PORT}> " socat tcp:${REDIS_HOST}:${REDIS_PORT} STDIO
}
[[ ! $(>&/dev/null type redis-cli) ]] && export -f redis-cli
https://gist.github.com/stuporglue/83714cdfa0e4b4401cb6
It's one of my favorites because it's pretty simple, and I wrote it when a lot of things were finally coming together for me (including GIS concepts, plpgsql programming, and a project I was working on at the time).
This is code which takes either two focus points and a distance, or two foci, a distance, and the number of points per quadrant, and generates a polygon representing an ellipse. Nothing fancy, but it made me happy when I finally got it working.
The use case was to calculate a naive estimate of how far someone could have ridden on a bike share bike. I had the locations they checked out the bike, and where they returned it, and the time they were gone. By assuming some average speed, I could make an ellipse where everywhere within the ellipse could have been reached during the bike rental.
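The geometry reduces to textbook ellipse parameters: the semi-major axis is a = d/2 (half the total travel distance), the half focal distance is c = |F1F2|/2, and the semi-minor axis is b = sqrt(a² − c²). A hedged awk sketch of the same idea (mine, not the linked plpgsql) that emits polygon vertices; the foci, distance, and point count are example values:

```shell
# Emit n vertices of the ellipse whose foci are (x1,y1),(x2,y2) and whose
# points all satisfy dist-to-f1 + dist-to-f2 = d.
awk -v x1=0 -v y1=0 -v x2=4 -v y2=0 -v d=6 -v n=16 'BEGIN {
    cx = (x1+x2)/2; cy = (y1+y2)/2          # center = midpoint of the foci
    c = sqrt((x2-x1)^2 + (y2-y1)^2) / 2     # half the focal distance
    a = d/2; b = sqrt(a*a - c*c)            # semi-major / semi-minor axes
    t = atan2(y2-y1, x2-x1)                 # rotation of the major axis
    for (i = 0; i < n; i++) {
        p = 2 * 3.14159265358979 * i / n
        ex = a*cos(p); ey = b*sin(p)        # point on axis-aligned ellipse
        printf "%f %f\n", cx + ex*cos(t) - ey*sin(t), cy + ex*sin(t) + ey*cos(t)
    }
}'
```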
And then I have different dotfile repos. I have a base one that I keep so clean I could get a job at Disney with it. That's where most of my scripts live. And then I have locale ones, like -home, - I don't have to hop systems too much, so grabbing fresh tooling is a twice a year problem. I'm a cli-as-ide dinosaur so I just hide all my seldom-used scripts under a double underscore prefix. __init_tooling will update vim and give me the 8 or 9 plugins I have grown dependent upon, give me a ruby and python environment, etc. I have a function called "add_word". Every time I see a word I dont know, I learn it, and then I run "add_word The dirtiest thing I have is a cheap vault that uses vim and shell automation. I have a grammar for descripting secrets, and I can pass a passphrase through automation to get secrets out. I'm sure it's 100% hackable. I know the first rule of security software is "dont ever try to make your own". So I don't put anything too good in there.
Long story short: you can use hard links + rsync to create delta snapshots of a directory tree. I use it to create a back up of my important directory trees.
Funny story about this: I had a really old HP "Lance Armstrong" branded laptop that I used for years. The above script was on it and was rsyncing to a separate machine, so it was fully backed up. Because of that, I was actually hoping for the laptop to die so I could get a new one (frugalness kicking in strong here).
My girlfriend at the time was using it and said "Oh, should I not eat or drink over your laptop?" and I responded: "No, please do! If you break it that means I can allow myself to order a new one."
f0() {
echo 'select moz_bookmarks.title || '"'"' = '"'"' || url from moz_places, moz_bookmarks on moz_places.id = moz_bookmarks.fk where parent = 2;' | sqlite3 /home/user/.mozilla/firefox/twht79zd.default/places.sqlite
}
f1() {
firefox `echo 'select url from moz_places, moz_bookmarks on moz_places.id = moz_bookmarks.fk where moz_bookmarks.title = '"'$1'"';' | sqlite3 /home/user/.mozilla/firefox/twht79zd.default/places.sqlite`
}
f$# $1
Execute PostScript programs alone with command-line arguments: exec gs -P -dBATCH -dNODISPLAY -dNOEPS -dNOPAUSE -dNOSAFER -q -- "$@"
Tell the IP address: curl -s 'http://icanhazip.com/' | cat -v
alias please='sudo zsh -c "$(fc -ln -1)"' # rerun the last command with sudo (because it failed )
Easier PATH management:
# nicer path configuration and lookup
function path {
    if [[ $# -eq 0 ]]; then
        echo -e ${PATH//:/\\n} | sort
    elif [[ "$1" == "--save" ]]; then
        path $2 && echo "\npath $2" >> $HOME/.profile
    else
        if [[ -d "$1" ]]; then
            if [[ -z "$PATH" ]]; then
                export PATH=$1
            else
                export PATH=$1:$PATH
            fi
        else
            echo "$1 does not exist :("
            return 1
        fi
    fi
}
This is a bash one-liner that takes the place of an RSA/2FA token/AuthyApp
Variables:
$HOSTNAME - the computer hostname
$TOBACKUPDIR - the local directory you want backed up
$N_CORES - the number of cores you want to use for compression
$REMOTEUSER - the ssh user login on the remote server
$REMOTEHOST - the remote server's IP
$BACKUPDIR - where you want the file to be backed up to
#!/bin/bash
bfile=`date +%F`.$HOSTNAME.tar.gz
# You can exclude local directories here with --exclude="dir"
/usr/bin/tar cvpf - \
    $TOBACKUPDIR | pigz -p $N_CORES | \
    ssh $REMOTEUSER@$REMOTEHOST "cat - > /$BACKUPDIR/$bfile"
#!/usr/bin/env bash
# Change working directory to the top-most Finder window location
function cdf() {
    cd "$(osascript -e 'tell app "Finder" to POSIX path of (insertion location as alias)')"
}
https://www.masteringemacs.org/article/fuzzy-finding-emacs-i...
manps()
{
if [ -z "$1" ]; then
echo usage: $FUNCNAME topic
echo This will open a PostScript formatted version of the man page for \'topic\'.
else
man -t $1 | open -f -a /Applications/Preview.app
fi
}
This is for macOS. All it does is display a `man` entry as properly formatted PostScript. If you were around in the '80s when we had ring-bound paper manuals, you may remember how much easier they were to read compared with a fixed-pitch terminal rendering of the same page.

Sorry, no Linux version, as I rarely have a graphical desktop open on Linux. It should be easy to rig something up with Ghostscript or similar.
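Along the lines of the Ghostscript suggestion, a Linux analogue might look like this (untested sketch; `manpdf` is my name, and it assumes `ps2pdf` from Ghostscript and a working `xdg-open`):

```shell
# Render a man page to PDF and open it in the default viewer.
manpdf() {
    man -t "$1" | ps2pdf - "/tmp/$1.pdf" && xdg-open "/tmp/$1.pdf"
}
```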
This will fork it and wait until it ends.
#!/bin/bash
TOTAL=`ps aux | grep YOUR_CRONJOB.php | grep -v grep | wc -l`
echo "TOTAL PROCESSES ALREADY RUNNING :"$TOTAL
MAX_THREADS=20
TOTAL_MODS="$(($MAX_THREADS-1))"
echo "TOTAL MODS: "$TOTAL_MODS
if [ $TOTAL -eq 0 ]
then
echo "RUNNING..."
for i in $(seq 0 $TOTAL_MODS)
do
echo "Starting thread $i"
timeout 10000 php YOUR_CRONJOB.php $i $MAX_THREADS &
pids[${i}]=$!
done
echo "FINISHED FORKING"
else
    echo "NOT RUNNING...."
fi
for pid in ${pids[*]}; do
    wait $pid
done
echo "OK FINISHED"
[1]: https://github.com/axelf4/nixos-config/blob/e90e897243e1d135...
https://github.com/cednore/dotfiles/blob/master/.functions#L... https://github.com/cednore/dotfiles/blob/master/.aliases#L46...
Use gnuplot to plot one or more files directly from the command line: https://github.com/RhysU/gplot/blob/master/gplot
mkcd() {
mkdir -p "$1" && cd "$1"
}
mkcdtmp() {
mkcd ~/tmp/$(date "+%y%m%d")
}
When I read it today, I miss those oh-so-oversimplified solutions for doing stuff :'-)
Here are some selected scripts folks might find interesting.
Here's my backup script that I use to encrypt my data at rest before shipping it off to s3. Runs every night and is idempotent. I use s3 lifecycle rules to keep data around for 6 months after it's deleted. That way, if my script goofs, I can recover: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
I have so many machines running Archlinux that I wrote my own little helper for installing Arch that configures the machine in the way I expect: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A tiny little script to recover the git commit message you spent 10 minutes writing, but "lost" because something caused the actual commit to fail (like a gpg error): https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A script that produces a GitHub permalink from just a file path and some optional line numbers. Pass --clip to put it on your clipboard: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae... --- I use it with this vimscript function to quickly generate permalinks from my editor: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A wrapper around 'gh' (previously: 'hub') that lets you run 'hub-rollup pr-number' and it will automatically rebase that PR into your current branch. This is useful for creating one big "rollup" branch of a bunch of PRs. It is idempotent. https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
Scale a video without having to memorize ffmpeg's crazy CLI syntax: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
Under X11, copy something to your clipboard using the best tool available: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
[1] https://github.com/benwinding/dotfiles
[2] https://zachholman.com/2010/08/dotfiles-are-meant-to-be-fork...
https://github.com/ianmiell/bash-template
It's a 'cut and paste' starter for shell scripts that tries to be as robust as possible while not going crazy with the scaffolding. Useful for "I want to quickly cut a script and put it into our source but don't want it to look totally hacky" situations.
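For a sense of what such scaffolding usually covers, here is my own minimal sketch of a robust-script preamble (not the linked template itself): fail fast, report the failing line, and handle options:

```shell
#!/usr/bin/env bash
# Exit on errors, unset variables, and failures anywhere in a pipeline.
set -euo pipefail
# Report where an unexpected error occurred before exiting.
trap 'echo "error on line $LINENO" >&2' ERR

usage() { echo "Usage: $0 [-h]" >&2; exit 1; }

while getopts h opt; do
    case $opt in
        h|*) usage ;;
    esac
done

echo "ok"
```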
autoload -U add-zsh-hook
add-zsh-hook chpwd source_env
source_env() {
if [[ -f .env && -r .env ]]; then
source .env
fi
}
## coding analysis
function lines_coded {
perl -ne'print unless /^\s*$/ || /^\s*(?:#|\/\*|\*)/' $* | wl
}
function lines_commented {
perl -ne'print if /^\s*(?:#|\/\*|\*)/' $* | wl
}
And wl is just a small alias (because I used it all the time): wl='wc -l'
replace() {
grep -rl "$1" . | xargs gsed -i "s/$1/$2/g"
}
Also, I run Spotify from the command line: https://github.com/hnarayanan/shpotify
zfsnapr, a ZFS recursive snapshot mounter - I run borg-backup.sh using this to make consistent backups: https://github.com/Freaky/zfsnapr
mkjail, an automatic minimal FreeBSD chroot environment builder: https://github.com/Freaky/mkjail
run-one, a clone of the Ubuntu scripts of the same name, which provides a slightly friendlier alternative to running commands with flock/lockf: https://github.com/Freaky/run-one
ioztat, a Python script that basically provides what zfs-iostat(8) would if it existed: https://github.com/jimsalterjrs/ioztat
I replaced Plone for my personal use with about 1000 lines of Python: an object-oriented database. The interface is awkward, but if you get past that, the goal was to produce pictures of trees with graphviz.
playMeSomeMusicMan() { rg --files -tmusic ~/Music | shuf | mpv --playlist=-; }
I also got sick of waiting for Activity Monitor to boot to kill an errant process, so I wrote this one to fuzzy-search and kill the selection:
kp() { ps aux | fzy | awk '{ print $2 }' | xargs kill; }
TASKKILL /IM outlook.exe
TASKKILL /IM teams.exe
TASKKILL /IM onedrive.exe
timeout /t 2
TASKKILL /F /IM outlook.exe
TASKKILL /F /IM teams.exe
TASKKILL /F /IM onedrive.exe
TASKKILL /F /IM Microsoft.AAD.BrokerPlugin.exe
timeout /t 2
start outlook.exe
start "" %LOCALAPPDATA%\Microsoft\Teams\Update.exe --processStart "Teams.exe"
start "" "C:\Program Files\Microsoft OneDrive\OneDrive.exe" /background
There is a short period between network start and VPN start where all the Microsoft thingies start and want me to log in again. As their SMSes sometimes take hours to arrive, it is easier to just kill and restart them and let them reuse their existing login. So I dropped the batch above on my desktop and click it while the VPN is starting up. In the 4 seconds it takes to kill everything, the network works as it should.
#!/bin/bash
echo |\
    openssl s_client -connect ${1:?Usage: $0 HOSTNAME [PORT] [x509 OPTIONS]}:${2:-443} 2>&1 |\
    sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' |\
    openssl x509 ${3:--text} ${@:4} 2>/dev/null |\
    sed '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/d'
https://github.com/ericfitz/dominfo
Dependencies: sublist3r (Python) pv (used for progress bars)
The junk I haven't touched in 10 years: https://github.com/psypete/public-bin/src
# M1 compatibility switches
arm() { arch -arm64 "${@:-$SHELL}" }
x86() { arch -x86_64 "${@:-$SHELL}" }
This, with the addition of `$(uname -m)` in my $PROMPT, has saved me a lot of time by letting me switch between arm and x86_64 architectures.
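For reference, the prompt part might look like this in bash (my guess at the setup, not the poster's exact config; zsh users would put `$(uname -m)` in $PROMPT with `prompt_subst` set):

```shell
# Show the active architecture in the prompt, e.g. [me@mac arm64 src]$
PS1='[\u@\h $(uname -m) \W]\$ '
```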
$ wat wat
#!/usr/bin/env bash
cat `which $1`
Photoshop Layer Labeler: https://www.middleendian.com/pslayerlabeler
$ qtime.bash
It's nearly twenty-five past two.
loop() {
    NUM=$1
    shift
    for i in {1..$NUM}; do
        "$@"
    done
}
http://angg.twu.net/eepitch.html
that lets me execute my scripts line by line very easily.
kubectl get pods --all-namespaces --sort-by=.metadata.creationTimestamp -o wide -Lapp \
| grep -vP "Completed|Terminating|ContainerCreating|Running\s+[01234567]\s+"
function rgs {
rg --line-number --with-filename --color always "$@" | sort --stable --field-separator=: --key=1,1
}
# Recursively search for keyword in all files of current directory
grr() {
grep -rHIn --exclude-dir=.git --exclude-dir=node_modules --exclude=*.min.* --exclude=*.map "$@" . 2>&1 | grep -v "No such file"
}
I wanted to control my display’s brightness using my keyboard on Linux. Turned out to be pretty easy with ddcutil!
[ -z "$PS1" ] && return
function cd { builtin cd "$@" && ls; }
https://github.com/mrichtarsky/linux-shared
The repo name is a bit outdated, it works on macOS too. Lots of scripts are missing, will add them soon.
Will definitely be adding more as I tidy them up! :)
This one will generate any kind of TLS certificate: Root CA, intermediate, mail, web, client-side …
#!/bin/bash
# Perform a work-in-progress commit
# Add everything
git add $(git rev-parse --show-toplevel)
# If the latest commit is already a WIP, amend it
if [[ $(git show --pretty=format:%s -s HEAD) = "WIP" ]]; then
git commit --amend --no-edit --no-verify
else
git commit -m "WIP" --no-verify
fi
I wanted a way to quickly commit everything in a branch without thinking about it. This comes up a lot when I'm working on something and either need to pivot to something else, or I want to pull down a PR and verify it without losing my work. I also wanted the option to quickly switch back to that branch, pick up where I left off, and be able to drop it again just as quickly without muddying up the commit history.

This script automatically stages everything and commits it as "WIP". If it detects that the most recent commit was a "WIP", then it amends the previous commit. No more weird stashing just to avoid losing my place.
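The "pick up where I left off" half can be a one-liner too. A sketch of a companion function (my own, not part of the quoted script): pop the WIP commit but keep its changes staged.

```shell
# Undo the WIP commit while keeping its changes staged, so you can
# resume work. Only safe when HEAD really is the WIP commit.
git-unwip() {
    if [[ $(git show --pretty=format:%s -s HEAD) = "WIP" ]]; then
        git reset --soft HEAD^
    else
        echo "HEAD is not a WIP commit" >&2
        return 1
    fi
}
```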
hist() {
history | grep $1
}
> up
Does a `cd ..` on every keypress except ESC or space.
> up $n
Does a total of $n `cd ..` and (important!) set OLDPWD to the initial directory for proper `cd -`.
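The `up $n` form described above could be implemented like this (a sketch, not the poster's actual script): build the `../..` path first, then do a single `cd`, which is what keeps OLDPWD pointing at the starting directory.

```shell
# Climb n directories in one cd so that `cd -` returns to the start.
up() {
    local n=${1:-1} path=.
    while [ "$n" -gt 0 ]; do
        path=$path/..
        n=$((n - 1))
    done
    cd "$path"
}
```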
Sets up an Ubuntu server as a strongSwan IKEv2 VPN.
#!/usr/bin/env sh
set -o errtrace; set -o errexit; set -o pipefail
if [ -n "${1}" ]; then filter="${1}"; else filter=''; fi
jq ."${filter}" package.json
Never promoted it but I’ve been quietly using it myself to build stuff that I need. Obviously browser based stuff have limitations but I found I still get a lot done
What a descriptive name :D
#!/usr/bin/env bash
function szup() {
description='
#: Title: szup
#: Synopsis: sort all items within a directory according to size
#: Date: 2016-05-30
#: Version: 0.0.5
#: Options: -h | --help: print short usage info
#:        : -v | --version: print version number
'
funcname=$(echo "$description" | grep '^#: Title: ' | sed 's/#: Title: //g')
version=$(echo "$description" | grep '^#: Version: ' | sed 's/#: Version: //g')
updated="$(echo "$description" | grep '^#: Date: ' | sed 's/#: Date: //g')"
function usage() {
printf "\n%s\n" "$funcname : $version : $updated"
printf "%s\n" ""
}
function sortdir() {
Chars="$(printf " %s" "inspecting " "$(pwd)" | wc -c)"
divider=====================
divider=$divider$divider$divider$divider
format=" %-${Chars}.${Chars}s %35s\n"
totalwidth="$(ls -1 | /usr/local/bin/gwc -L)"
totalwidth=$(echo $totalwidth | grep -o [0-9]\\+)
Chars=$(echo $Chars | grep -o [0-9]\\+)
if [ "$totalwidth" -lt "$Chars" ]; then
longestvar="$Chars"
else
longestvar="$totalwidth"
fi
shortervar=$(/Users/danyoung/bin/qc "$longestvar"*.8)
shortervar=$(printf "%1.0f\n" "$shortervar")
echo "$shortervar"
printf "\n %s\n" "inspecting $(pwd)"
printf " %$shortervar.${longestvar}s\n" "$divider"
theOutput="$(du -hs "${theDir}"/* | gsort -hr)"
Condensed="$(echo -n "$theOutput" | awk '{ print $1","$2 }')"
unset arr
declare -a arr
arr=($(echo "$Condensed"))
Count="$(echo "$(printf "%s\n" "${arr[@]}")" | wc -l)"
Count=$((Count-1))
for i in $(seq 1 $Count); do
printf " %5s %-16s\n" "$var1" "${var2//\/*\//./}"
done
echo
}
case "$1" in
-h|--help)
usage
return 0
;;
*)
:
;;
esac
if [ -z "$1" ]; then
    local theDir="$(pwd)"
    sortdir
    return 0
else
    oldDir="$(pwd)"
    cd "${1}"
    local theDir="$(pwd)"
    sortdir
    cd "$oldDir"
    return 0
fi
}
When you download a video from certain sites, ctime is the time you created the file (i.e. when you downloaded it), but the video still comes with a timestamp which is saved as the mtime (I'm not sure why this happens; maybe there's an HTTP header for that?). I presume it's the time when the video was first uploaded to the site?
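There is indeed a header for this: `Last-Modified`. Downloaders can copy it onto the saved file's mtime; with curl that's the `-R`/`--remote-time` flag. A small sketch (function name and URL handling are mine):

```shell
# Save a URL to a local file whose mtime comes from the server's
# Last-Modified response header (when the server sends one).
fetch_with_mtime() {
    curl -fsSL -R -o "$(basename "$1")" "$1"
}
```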
Here's a favorite of mine: all my scripts' -h simply show the source code
$ cat $(which ,)
#!/bin/sh
#export DRI_PRIME=1
cmd=mpv
param='-fs --msg-level=all=no,cplayer=info'
filter='/Playing/!d; s/^Playing: //'
order='%C@' # by default, order by ctime
sort=-n # (which is a numeric sort)
reverse=-r # ... show newer videos first
depth= # ... and do it recursively
loop= # ... without looping
[[ -n $MYT_MUTE ]] && set -- -u "$@"
[[ -n $MYT_1 ]] && set -- -1 "$@"
[[ -n $MYT_REC ]] && set -- -r "$@"
while getopts 1rcmsaRnolugh o; do
case "$o" in
1) depth='-maxdepth 1';; # just the current directory
r) depth=;; # recursively
# a video uploaded in 2009 but downloaded in 2015 will have
# mtime in 2009 and ctime in 2015
#
# (note: moving the video to another directory actually bumps
# the ctime)
c) order='%C@'; sort=-n;; # order by ctime (download time)
m) order='%T@'; sort=-n;; # order by mtime (upload time)
s) order='%s'; sort=-n;; # order by size
a) order='alpha'; sort=;; # order lexicographically
R) order='random'; sort=-n;; # order at random
n) reverse=-r;; # newer first
o) reverse=;; # older first
l) loop=--loop=inf;; # infinite loop
u) mute=--mute=yes;; # no sound
g) filter=; param=;; # debug
h) ${PAGER-less} "$0"; exit;;
esac
done
shift $((OPTIND-1))
find -L "$@" $depth -mindepth 0 \
-not -path '*/\.*' \
-type f \
-name '*.*' \
-printf "$order %p\0" \
| awk 'BEGIN { RS="\0"; srand() } {
if ($1 == "random")
sub ($1, int(rand()*100000));
printf "%s\0", $0
}' \
| sort -z $sort $reverse \
| sed -zr 's/^[^ ]+ //' \
| xargs -0 $cmd $loop $mute $param 2>&1 \
| sed "$filter"
# -exec $cmd {} + | sed "$sed"
In .zshrc I keep as little as possible so it opens fast. But I include the commands that extend it (srcBlah) or help me tune how to extend it (editBlah):
For my pet projects I'd use these two:
alias editSeb="code ~/.sebrc"
alias srcSeb="source ~/.sebrc"
As you can see, editSeb opens VSCode and srcSeb sources it in the current terminal. My .sebrc:
# Download audio as mp3 from the given YouTube URL
function ytdl() {
youtube-dl -x --audio-format mp3 --prefer-ffmpeg $1
}
# Download audio from the given YouTube URL and re-encode it to a 320K mp3
function ytmp3() {
ytdl $1 | ffmpeg -i pipe:0 -b:a 320K -vn $2.mp3
}
# Shows total size of the given directory at $1
function dus() {
du -h -d 1 $1
}
# Used to opt-out of pre-commit autofixes
export NO_COMMIT_CHECKS=true
function cleanUSB() {
volumeName=$1
subdir=$2
if [[ "$volumeName" != "" ]] && [[ "$subdir" = "" ]]; then
rm -rfv /Volumes/$volumeName/.DS_Store
rm -rfv /Volumes/$volumeName/.Spotlight-V100
rm -rfv /Volumes/$volumeName/.fseventsd
rm -rfv /Volumes/$volumeName/.Trashes
rm -rfv /Volumes/$volumeName/._*
echo "Volume $volumeName is clean"
elif [[ "$volumeName" != "" ]] && [[ "$subdir" != "" ]]; then
rm -rfv /Volumes/$volumeName/$subdir/.DS_Store
rm -rfv /Volumes/$volumeName/$subdir/.Spotlight-V100
rm -rfv /Volumes/$volumeName/$subdir/.fseventsd
rm -rfv /Volumes/$volumeName/$subdir/.Trashes
rm -rfv /Volumes/$volumeName/$subdir/._*
echo "Volume $volumeName/$subdir is clean"
else
echo "No volume name given. Nothing to do."
fi
}
function blogBackup() {
rsync -avzh --progress -e ssh root@seb-nyc1-01:/root/blog/db /Users/seb/Documents/blog
}
alias showHiddenFiles='defaults write com.apple.finder AppleShowAllFiles YES; killall Finder /System/Library/CoreServices/Finder.app'
alias hideHiddenFiles='defaults write com.apple.finder AppleShowAllFiles NO; killall Finder /System/Library/CoreServices/Finder.app'
# Show ports currently listening
function openPorts() {
netstat -p tcp -van | grep '^Proto\|LISTEN'
}
# Create a RAM disk on macOS
function ramDisk() {
# https://eshop.macsales.com/blog/46348-how-to-create-and-use-a-ram-disk-with-your-mac-warnings-included/
# 2048 = 1MB
# 2097152 = 1G
quantityOfBlocks=2097152
diskutil erasevolume HFS+ "RAMDisk" `hdiutil attach -nomount ram://${quantityOfBlocks}`
}
# Tauri watcher for source file changes will not stop automatically.
function killRollup() {
ps aux | grep node | grep rollup | awk '{print $2;}' | xargs kill -9
}
# X pet project required env var
export X_TOKEN=blahValue
function dockerCleanAll() {
docker stop $(docker ps -aq)
docker rm $(docker ps -aq)
docker rmi $(docker images -q) -f
}
function dockerCleanVolumes() {
docker volume rm $(docker volume ls -qf dangling=true)
}
alias ll='ls -lah'
alias gg='git status -s'
# Creates a timestamped backup of the current branch:
alias gbk='git checkout -b "backup-$(git symbolic-ref -q HEAD --short)-$(date +%Y-%m-%d-%H.%M.%S)" && git checkout -'
#!/usr/bin/env sh
find | grep -- "$1"
If you are searching for a Python or a Java package / class, it will work because the dots in it mean "any char" for grep and will match the slashes in its path.

oneline:
#!/usr/bin/env sh
tr '\n' ' '; echo
Puts anything you give on its standard input onto one line.

L, my journaling tool (whenever I need to get something out of my head or be sure to find it later); I can edit and fix stuff by editing the file it generates after the fact:
#!/bin/sh
set -e
CONFIG_FILE="${HOME}/.config/Ljournalrc";
if [ ! -f "${CONFIG_FILE}" ]; then
mkdir -p "$(dirname "$CONFIG_FILE")"
printf 'JOURNAL_FILE="${HOME}/Documents/journal.txt"\n' >> "${CONFIG_FILE}"
printf 'VIEWER=less\n' >> "${CONFIG_FILE}"
printf 'LESS='"'"'-~ -e +G'"'"'\n' >> "${CONFIG_FILE}"
fi
L=$(basename $0)
usage() {
cat <> ${JOURNAL_FILE}
fi
printf "%s\n" "$msg" >> ${JOURNAL_FILE}