HACKER Q&A
📣 fastily

Can I see your scripts?


A few weeks ago, I asked if I could see your cheatsheets (https://news.ycombinator.com/item?id=31928736) and I was really impressed by all the high quality responses!

Today I'm asking about your scripts.

Almost every engineer I know has a collection of scripts/utilities for automating ${something}. I'm willing to bet that HN users have some of the most interesting scripts on the internet.

So that said, could I please see your scripts?

I'll go first: https://github.com/fastily/autobots


  👤 hoechst Accepted Answer ✓
Not really a script, but a `.ssh/config` that automatically deploys parts of my local CLI environment to every server I connect to (if the username and IP/hostname match my rules).

On first connect to a server, this syncs all the dotfiles I want to the remote host, and on subsequent connects it updates them.

Idk if this is "special", but I haven't really seen anyone else do this, and it beats, for example, Ansible playbooks by being dead simple.

   Match Host 192.168.123.*,another-example.org,*.example.com User myusername,myotherusername
      ForwardAgent yes
      PermitLocalCommand yes
      LocalCommand rsync -L --exclude .netrwhist --exclude .git --exclude .config/iterm2/AppSupport/ --exclude .vim/bundle/youcompleteme/ -vRrlptze "ssh -o PermitLocalCommand=no" %d/./.screenrc %d/./.gitignore %d/./.bash_profile %d/./.ssh/git_ed25519.pub %d/./.ssh/authorized_keys %d/./.vimrc %d/./.zshrc %d/./.config/iterm2/ %d/./.vim/ %d/./bin/ %d/./.bash/ %r@%n:/home/%r

👤 asicsp
https://github.com/learnbyexample/command_help/blob/master/c...

Command help, inspired by http://explainshell.com/, extracts help text from builtin commands and man pages. Here's an example:

    $ ch ls -AXG
           ls - list directory contents

           -A, --almost-all
                  do not list implied . and ..

           -X     sort alphabetically by entry extension

           -G, --no-group
                  in a long listing, don't print group names

---

https://github.com/learnbyexample/regexp-cut/blob/main/rcut is another - it uses awk to provide cut-like syntax for field extraction. After writing this, I found commands like `hck` and `tuc`, written in Rust, that solve most of the things I wanted.


👤 tpoacher
I have a nice little script for managing "MISC" packages, which stands for "Manually Installed or Source Compiled".

https://github.com/tpapastylianou/misc-updater

In full honesty, I'm as proud of the "MISC" acronym as of the script itself. :p

I'm secretly hoping the acronym catches on for referring to any stuff outside the control of a system's standard package management.


👤 sshine
I've pair-programmed a lot this year, and some of my colleagues like the "Co-authored-by: ..." message because due attribution is given regardless of who was in control of the keyboard.

I eventually got tired of writing that manually, so I wrote a small

  git co-commit --thor ...
that works just like 'git commit', except it adds another line to the commit message template.

Placing it in e.g. ~/bin/git-co-commit and having ~/bin in your $PATH will enable it as a git sub-command.

I've never had a use for this before, and I don't think I'll need it much beyond this team, but this was my first git sub-command that wasn't trivially solvable by existing command parameters (that I know of).

https://gist.github.com/sshine/d5a2986a6fc377b440bc8aa096037...
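
Not the gist above, just a sketch of the general shape such a sub-command could take; the `--thor` flag and the name/email mapping are placeholders I'm assuming:

  #!/usr/bin/env bash
  # ~/bin/git-co-commit (hypothetical sketch)
  # Turns flags like --thor into Co-authored-by: trailers pre-filled in the
  # commit message template, then hands everything else to git commit.
  declare -A coauthors=(
    [thor]="Thor Odinson <thor@example.com>"   # assumed mapping
  )
  trailers=""
  args=()
  for arg in "$@"; do
    name="${arg#--}"
    if [[ "$arg" == --* && -n "${coauthors[$name]:-}" ]]; then
      trailers+="Co-authored-by: ${coauthors[$name]}"$'\n'
    else
      args+=("$arg")
    fi
  done
  template=$(mktemp)
  printf '\n\n%s' "$trailers" > "$template"
  git commit --template="$template" "${args[@]}"
  rm -f "$template"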


👤 WithinReason
AutoHotkey script: releasing the left mouse button while Caps Lock is held simulates 50 mouse clicks per second, until the left mouse button is pressed again:

  ~LButton up::
    if (GetKeyState("CapsLock", "P")) {
      while(!GetKeyState("LButton", "P")){
        MouseClick, left
        sleep 20      
      } 
    }
    return
Useful in situations when you need to click a lot :)

Also a bookmarklet that you use to turn any page dark with a click:

    javascript:document.querySelectorAll('*').forEach(e=>e.setAttribute('style','background-color:#222;background-image:none;color:#'+(/^A|BU/.test(e.tagName)?'36c;':'eee;')+e.getAttribute('style')))
From here: https://github.com/x08d/222

👤 bwhmather
I use this script, saved as `rerun`, to automatically re-execute a command whenever a file in the current directory changes:

    #!/usr/bin/sh

    while true; do
        reset;
        "$@";
        inotifywait -e MODIFY --recursive .
    done
For example, if you invoke `rerun make test` then `rerun` will run `make test` whenever you save a file in your editor.

👤 calvinmorrison
Okay, it's 2022 and you still don't want to run Nepomuk, the holistic semantic filesystem approach never happened, and you're stuck with a million files in your Downloads folder, home folder, etc.

So I use this script to give me a nice work environment, organized by day.

Every time you open bash, it'll drop you into today's directory. (~/work/year/month/day/)

When I think about stuff it's like... oh yeah, I worked on that last week, last year, etc. - the folder structure makes this a lot easier, and you can just write 'notes' or 'meeting-with-joe' and you know the reference date.

For your bashrc:

  alias t='source /path/to/today'

  t
Now every day you'll know what you worked on yesterday!

  # this is where you'll get dropped by default.

  calvin@bison:~/work/2022/07/10$

  calvin@bison:~/work/2022/07/10$ ls
  WardsPerlSimulator.pl

  calvin@bison:~/work/2022/07/10$ cd ..; ls;
  01  02  04  05  06  07  08  09  10
  calvin@bison:~/work/2022/07$ cd ..; ls
  01  02  03  04  05  06  07
 calvin@bison:~/work/2022$ cd ..; ls
  2021  2022
Additionally you'll get a shortcut: you can type 't' as a bash function, or go to ~/t/, which is symlinked and updated every time you run today (which is every time you open bash or hit 't'). This is useful if you want to have Firefox/Slack/whatever always save something in your 'today' folder.

https://git.ceux.org/today.git/
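
If you just want the gist of it, a rough sketch of what such a `today` script might look like (my guess, not the linked source; it needs to be sourced so the cd sticks):

  # e.g. /path/to/today, used via: alias t='source /path/to/today'
  dir="$HOME/work/$(date +%Y/%m/%d)"
  mkdir -p "$dir"
  ln -sfn "$dir" "$HOME/t"   # keep ~/t pointing at today's folder
  cd "$dir"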


👤 cormorant
Not exactly a script, but I have a UserStyle applied to Hacker News (using Stylus: <https://addons.mozilla.org/en-US/firefox/addon/styl-us/>). Here are the best bits:

1. Preserve single newlines that people typed in: Often people hit Return only once, and their intended formatting becomes a wall of text. Hacker News preserves the newline in the HTML.

  .commtext {
    white-space: pre-wrap;
  }
  .commtext .reply {
    white-space: normal; /* fix extraneous whitespace caused by first rule */
  }
2. Vertical lines to visually represent thread depth!

  .ind {
    background-image: repeating-linear-gradient(to right, transparent 0px 39px, rgba(204, 204, 204, 1.0) 39px 40px);
  }

👤 dom111
Similar to most other posters, I have a dotfiles repo. Most of it isn't particularly novel, but I have a light wrapper around `git` that, after a successful clone, adds custom identity information to `.git/config` so that when I commit, I won't inadvertently use my work author string instead of my personal one:

https://github.com/dom111/dotfiles/blob/master/bin/git

which when combined with files like:

    $ cat ~/.gitidentities/github.com
    
    [user]
            name = dom111
            email = dom111@users.noreply.github.com
means I don't accidentally leak email addresses, etc.

Also, not entirely related, but I wrote a small tool to add some animated GIFs to scripts: https://dom111.github.io/gif-to-ansi/


👤 zffr
I use a custom git command, `git recent`, almost every day. It shows you the most recent branches you have worked on. This is useful when you are trying to find a branch you worked on recently but forgot its name.

To use this alias, make an executable file called `git-recent` with the following contents and ensure it is in your `$PATH`

    git for-each-ref \
      --sort=-committerdate refs/heads/ \
      --format='%(HEAD) %(color:red)%(objectname:short)%(color:reset) %(color:yellow)%(refname:short)%(color:reset) - %(contents:subject) - %(authorname) (%(color:green)%(committerdate:relative)%(color:reset))'

Here's a redacted example of what the output looks like:

    * fc924a7e68 team/name/feature-2 - save - My Name (4 days ago)
      2fed1acfac team/name/feature-1 - add test - My Name (3 weeks ago)
      4db4d4ac77 main - Remove changes (#22397) - My Name (6 weeks ago)

👤 elondaits
Not that long ago I was maintaining around 30 Drupal (popular PHP CMS) websites for different clients, on different ISPs.

I made a CLI utility for automating certain operations I was doing all the time: rsync of sources (push or pull), db backup / rollback, copying the local db to the remote server or back, etc. The utility looked for a dotfile in the project directory to get things like the remote server address, remote project path, etc.

The tool served several purposes:

- Executing auxiliary tools (rsync, mysqldump, drush) with the right parameters, without requiring me to remember them.

- Storing (non-secret) information about the remote environment(s) in the project directory.

- Some dangerous operations (e.g. copying the local db to the remote server) were prohibited unless the dotfile explicitly enabled them. Some sites were only edited on dev and then pushed to production, but some had user data that should never be overwritten.

- When running rsync of sources the tool always did a dry-run first, and then required entering a randomly generated 4-letter code to execute it... so I would have to stop and think and couldn't deploy by mistake.

This tool is too rough to share with the general public... but I consider it one of my greatest professional achievements because it saved me a lot of mental effort and stress over the years, quite a bit of time, prevented me from shooting myself in the foot, and forced me to use proper workflows every time instead of winging it. It required a small investment of time and some foresight... but my philosophy is that my work should be to build tools to replace me, and that was a step in that direction.
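
That last safeguard is easy to bolt onto any deploy script. A hedged sketch of the idea (my own illustration, not the author's tool; `$remote` and `$remote_path` stand in for values read from the project dotfile):

  # dry-run first, then require a randomly generated 4-letter code
  rsync -avz --delete --dry-run ./src/ "$remote:$remote_path/"
  code=$(LC_ALL=C tr -dc 'a-z' < /dev/urandom | head -c 4)
  read -rp "Type '$code' to run the real deploy: " answer
  if [ "$answer" = "$code" ]; then
      rsync -avz --delete ./src/ "$remote:$remote_path/"
  else
      echo "Aborted." >&2
  fi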


👤 junon

    function ccd { mkdir -p "$1" && cd "$1"; }
Biggest timesaving script I've ever included in my arsenal.

👤 leibnitz27
Not mine, and not ..... really.... serious.... but someone has to mention the greatest work scripts ever :

https://github.com/NARKOZ/hacker-scripts


👤 sunaurus
Related to writing scripts on macOS: I highly recommend rumps (https://rumps.readthedocs.io) for showing anything you want in your status bar.

For example, I have one script that uses rumps to show how many outdated homebrew packages I have (and also as a convenient shortcut to update those packages in the dropdown menu). I also have a second script that uses it to show a counter for open pull requests that I need to review (with links to individual PRs in the dropdown menu). It's great!

Result looks like this: https://imgur.com/yy6GlYk.jpg


👤 samatman
Here's a shell script for moving files in git between repos, preserving the history, and following that history through the file being renamed:

     git-move src/afile.c src/bfile.c src/cfile.c ../destination/repo
https://gist.github.com/mnemnion/87b51dc8f15af3242204472391f...
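
I haven't read the gist, but for context, the classic manual recipe for this is to export each file's history as an email-formatted patch stream and replay it in the destination repo, roughly:

     # in the source repo: export the full history (following renames) of one file
     git log --pretty=email --patch-with-stat --reverse --full-index --binary \
         --follow -- src/afile.c > /tmp/afile.patch
     # in the destination repo: replay it
     cd ../destination/repo && git am < /tmp/afile.patch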

👤 grandpoobah
Bank reconciliation in Xero has no "auto match if reference and date are the same" option, so I (very crudely) scripted one. It'll run until it encounters a mismatch (which for the company I work for is basically never). Paste it into the browser console.

    function go() {
        let line = document.querySelector('#statementLines .line');

        if (line) {
            let leftAndRight = line.querySelectorAll('.statement');

            let left = leftAndRight[0];
            let right = leftAndRight[1];

            if (right.hasClassName('matched')) {
                let leftDetails = left.querySelectorAll('.info .details-container .details span');
                let leftDate = leftDetails[0].textContent;
                let leftReference = leftDetails[1].textContent;

                let rightDetails = right.querySelectorAll('.info .details-container .details span');
                let rightDate = rightDetails[0].textContent;
                let rightReference = rightDetails[2].textContent;

                rightReference = rightReference.replace('Ref: ', '');

                // Date.parse() already returns a millisecond timestamp
                if (Date.parse(leftDate) == Date.parse(rightDate) && leftReference.toLowerCase() == rightReference.toLowerCase()) {
                    var okButton = line.querySelector(".ok .okayButton");

                    console.log(leftReference);

                    okButton.click();

                    // wait for the reconciled line to disappear from the DOM,
                    // then process the next one
                    var waiter = function () {
                        if (line.parentNode == null) {
                            go();
                        } else {
                            setTimeout(waiter, 50);
                        }
                    };

                    setTimeout(waiter, 50);
                } else {
                    console.log("Details dont match");
                }
            } else {
                console.log("Line not matched");
            }
        } else {
            console.log("No line found");
        }
    }

    setTimeout(go, 100);


👤 xrd
I just reviewed my own set of scripts in bin and don't feel I've got anything to contribute.

BUT, this thread is so special because it feels like this is the stuff you only get to see when you sit down at a co-worker's desk and watch them type something and then say "WHAT? HOW COOL!"

I miss that part now that it is all remote work. :(


👤 Nephx
Simple command-line utility to display the charging (or discharging) rate in watts on Linux. You might have to modify battery_directory and the status/current_now/voltage_now names based on laptop brand, but Lenovo, Dell and Samsung seem to use this convention.

    #!/usr/bin/python3

    battery_directory = "/sys/class/power_supply/BAT1/"

    with open(battery_directory + "status", "r") as f:
        state = f.read().strip()

    with open(battery_directory + "current_now", "r") as f:
        current = int(f.read().strip())

    with open(battery_directory + "voltage_now", "r") as f:
        voltage = int(f.read().strip())

    wattage = (voltage / 10**6) * (current / 10**6)
    wattage_formatted = f"{'-' if state == 'Discharging' else ''}{wattage:.2f}W"

    if state in ["Charging", "Discharging", "Not charging"]:
        print(f"{state}: {wattage_formatted}")

Output:

    Charging: 32.15W
    Discharging: -5.15W


👤 Sirikon
My workstation setup, both for Linux and MacOS, is in the following repository: https://github.com/sirikon/workstation

https://github.com/sirikon/workstation/blob/master/src/cli/c...

For Linux, it can install and configure everything I need when launched on a clean Debian installation. apt repositories, pins and packages; X11, i3, networking, terminal, symlinking configuration of many programs to Dropbox or the repository itself... The idea is to have my whole setup with a single command.

For Mac, it installs programs using brew and sets some configurations. Mac isn't my daily driver so the scripts aren't as complete.

Also there are scripts for the terminal to make my life easier. Random stuff like killing any Gradle process in the background, upgrading programs that aren't packaged in APT, backing up savegames from my Anbernic, etc. https://github.com/sirikon/workstation/tree/master/src/shell

And more programs for common use, like screenshots, copying Stripe test cards into the clipboard, launching android emulators without opening Android Studio, etc. https://github.com/sirikon/workstation/tree/master/src/bin


👤 impalallama
A bash script to open up the current git branch in my browser.

I use it every day. It's great because my company has multiple git submodules in any given project, and I can use this to watch for pipeline failures and the like.

  x=$(git config --local remote.origin.url|sed -n 's#.*/\([^.]*\)\.git#\1#p')
  y=$(git symbolic-ref --short HEAD)
  url="https://git.thecompany.com/thecompany/$x/tree/$y"
  open -a "firefox developer edition" "$url"

👤 garfieldnate
I use termdown to run timers in my terminal. Back when we only had a pressure cooker and couldn't afford an automatic rice cooker, I wrote a bash function "rice" that would give instructions for cooking it in the pressure cooker. Kinda silly in retrospect, but it did ease the pain of being broke a bit:

    function alarm_forever() {
        # play one part of the track at a time so that this function can be killed any time
        while :; do
            afplay --time .72 ~/sounds/alarm.mp3;
        done
    }

    function alarm_until_input() {
        alarm_forever &
        pid=$!;
        read  -n 1 -p "$*";
        kill -9 $pid;
    }

    # pip install termdown
    function timer {
        termdown $@;
        alarm_until_input "[Press any key to stop]"
    }
    alias alarm="timer"

    # TODO: ask if user soaked the rice first
    function rice {
        echo "1. Wash rice. Place in pressure cooker with 1-1 water-rice ratio."
        echo "2. Place the pressure cooker on the stove on high."
        read  -n 1 -p "3. When the pressure pot starts whistling, press any key to start the timer."
        termdown --title "Tweet!" 2m
        alarm_until_input "4. Take pot off heat and press any key."
        termdown --title "RICE" 11m
        alarm_until_input "5. Open the pot and stir the rice immediately."
        alarm_until_input "6. Eat!"
    }

👤 GOATS-
I used to clip tons of videos for highlight reels, which was made a lot quicker with this snippet.

  clip_video () {
          ffmpeg -ss "$2" -i "$1" -t "$3" -c copy "$4"
  }

Used like so:

  clip_video filename.mp4 start_point duration output.mp4

👤 trevithick
A bloated script to automate creation of an Arch Linux Qemu VM. The subscript that runs in the VM is useful by itself for setting up a new Arch installation.

https://github.com/trevorgross/installarch/blob/main/install...

It's a personal tool that just kept growing. Probably amateurish by HN standards, but then, I'm an amateur. Yes, I could simply copy a disk image, but that's no fun.


👤 dijit
My scripts are usually for work so they don't make sense outside of that.

One I was particularly proud of (and disgusted by) allowed me to jump around a network with a single command despite access being gated by regional jumphosts.

You are warned: https://git.drk.sc/-/snippets/107

Another script I wrote is for our devs to get access to MySQL in production on GCP; the intent was for the script to be executable only by root and to allow sudo access to only this script (which also means `chmod ugo-rwx gcloud`): https://git.drk.sc/-/snippets/98

I have another script to generate screenshots from grafana dashboards since that functionality was removed from grafana itself (https://github.com/grafana/grafana/issues/18914): https://git.drk.sc/-/snippets/66

Another time I got annoyed that Wayland/Sway would relabel my screens on successive disconnect/reconnects (i.e. my right screen could be DP-1 or DP-7 or anything in between, at random), so I wrote a docking script which moves the screens to the right place based on serial number: https://git.drk.sc/-/snippets/74


👤 scottLobster
Going to use this opportunity to spam ShellCheck, because it has historically saved me dozens of hours catching many silent Bash scripting errors and just making my scripts more robust/warning me of obscure edge cases:

https://www.shellcheck.net/


👤 mateuszbuda
I have a script for concurrent web scraping: https://github.com/mateuszbuda/webscraping-benchmark. It takes a file with URLs and scrapes the content. For more demanding websites it can use a web scraping API that handles rotating proxies. I add some logic to process the output as needed.

👤 ebfe1
Since everyone here likes scripting, may I suggest, if you have not used them already, checking out Xbar (https://xbarapp.com/) for Mac and Argos (https://argos-scripts.github.io/) for Linux.

I have used these two on my machines for the last 4 years and have written tons of scripts for myself; here are a few (with a minimal sketch of such a plugin after the list):

- Displaying my internet/internal IP, with a click to copy it to the clipboard

- taskwarrior

- A simple conversion script that takes my clipboard and encodes/decodes base64, hex and URL encoding, and converts epoch timestamps to UTC

- "Auto-typing" my clipboard by simulating keystrokes - particularly useful for pasting text into terminals that disable clipboard paste

- An incident-response switch that triggers a script to take a screenshot every 5 seconds when my mouse moves, reduce the image quality and save it to a folder in my home drive. Another script GPG-encrypts everything at the end of the day so I can go back and review an incident if needed.
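
For anyone who hasn't used them: xbar/Argos plugins are just executables whose stdout becomes the menu. A hedged sketch of a brew-outdated counter in roughly that style (my own illustration, not the poster's script):

  #!/usr/bin/env bash
  # xbar/Argos convention: the first line shows in the menu bar,
  # lines after "---" become the dropdown.
  outdated=$(brew outdated --quiet)
  count=$(printf '%s\n' "$outdated" | sed '/^$/d' | wc -l | tr -d ' ')
  echo "brew: $count"
  echo "---"
  printf '%s\n' "$outdated"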


👤 hunterb123
JS snippet to sort and return Play Store app reviews by helpfulness:

  var nodes = [...document.querySelectorAll('*[aria-label="Number of times this review was rated helpful"]')];
  nodes.sort((a, b) => (parseInt(b.innerText) || 0) - (parseInt(a.innerText) || 0));
  nodes.map(e => ([
    parseInt(e.innerText) || 0,
    e.parentNode.parentNode.parentNode.parentNode.parentNode.children[1].textContent.toString().trimStart(),
  ]));

👤 dcolkitt
Here's one I wrote a few years back that I'm quite fond of. It turns any arbitrary directory tree of individual executables into a "git [X] [Y]" style shell command.

https://github.com/Mister-Meeseeks/subcmd/blob/master/subcmd
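
The linked script is the real thing; purely to illustrate the idea, here's a hypothetical dispatcher along the same lines (the name `mytool` and the ~/.mytool layout are my assumptions):

    #!/usr/bin/env bash
    # mytool: walk the directory tree under ~/.mytool until the arguments stop
    # naming subdirectories, then exec the matching executable with the rest.
    root="$HOME/.mytool"
    cmd="$root"
    while [ $# -gt 0 ] && [ -d "$cmd/$1" ]; do
        cmd="$cmd/$1"
        shift
    done
    if [ $# -gt 0 ] && [ -x "$cmd/$1" ]; then
        exe="$cmd/$1"
        shift
        exec "$exe" "$@"
    fi
    echo "usage: mytool <subcommand...>" >&2
    exit 1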


👤 HeckFeck
Somewhat boring, but I wrote a shell script to tar and gzip my home directory and then rsync it to a NAS drive.

It can be configured to exclude certain directories (.cache and Downloads being likely contenders). Also, it can read in config files so it can back up other directories.

https://github.com/lordfeck/feckback


👤 SuperCuber
Dead simple but serves me extremely well, and I haven't seen anyone do it:

    cd ()
    {
        builtin cd "$@" || return $?
        ls --my-usual-flags
    }

👤 superfamicom
On macOS this can be really useful: it changes the current terminal directory to the front-most folder open in Finder. I don't recall where it came from:

  # Change to the Front Folder open in Finder
  function ff {
    osascript -e 'tell application "Finder"'\
    -e 'if (0 < (count Finder windows)) then'\
    -e 'set finderpath to get target of the front window as alias'\
    -e 'else'\
    -e 'set finderpath to get desktop as alias'\
    -e 'end if'\
    -e 'get POSIX path of finderpath'\
    -e 'end tell';};\
  function cdff { cd "`ff $@`" || exit; };

👤 aendruk
Old habit breaker:

  git() {
    if [[ "$1" == 'checkout' ]]; then
      echo 'Reminder: Use `git switch` or `git restore` instead.' >&2
    fi

    command git "$@"
  }

👤 paskozdilar
I've often encountered dependency issues on Ubuntu. One time, while dealing with NVidia/CUDA, running `apt-get -y install cuda` complained about some missing dependency. I recursively went through the error messages and installed every missing dependency manually, and it worked, but it took me a long time and a lot of typing.

Then I wrote a script that does that automatically:

    #!/usr/bin/env bash

    main() {
        local package="$1"

        if [ -z "$package" ]
        then
            echo "usage: $0 PACKAGE"
            exit 1
        fi

        install_package "$package"
    }

    install_package() {
        local package="$1"
        local subpackage

        if sudo apt-get -y install "$package"
        then exit 0
        else
            sudo apt-get -y install "$package" \
            |& grep '^ ' \
            | sed 's/[^:]*:[^:]*: //;s/ .*//;' \
            | {
                while read subpackage
                do install_package "$subpackage"
                done
            }
            sudo apt-get -y install "$package" \
                && echo "SUCCESS: $package" \
                || echo "FAILURE: $package"
        fi
    }

    main "$@"

👤 alsetmusic
Here’s a cat function with highlighting for when I’m working on different platforms and might not have the shell configured for syntax highlighting:

    #!/usr/bin/env bash

    function shc() {
        #: cat for shell scripts, source code.
        #: prints text with line numbers and syntax highlighting.
        #: accepts input as argument or pipe.

    if [ $# -eq 0 ]; then
        # arguments equal zero; assume piped input
        nl | /usr/local/bin/pygmentize -l bash
        # accept piped input, process as source code
    else
        case "$1" in
            -h|--help)
                printf "%s\n" "shc usage:" "           shc [file]" "           type [function] | shc"
                ;;
            -v|--version)
                printf "%s\n" "vers 2"
                ;;
            *)
                if [ -f "$1" ]; then
                    # test anything that isn't expected flags for file
                    cat "$1" | nl | /usr/local/bin/pygmentize -l bash
                    # process file as source code
                else
                    # if not a file or expected flags, bail
                    printf "%s\n" "error; not the expected input. read shc_func source for more details"
                fi
        esac
    fi
    }

👤 simzor
Makes operating the AWS CLI with an MFA-enabled user easier:

---------

    #!/bin/sh

    echo "Store and retrieve session token AWS STS \n\n"

    # Get source profile
    read -p "Source Profile []: " source_profile
    source_profile=${source_profile:-''}
    echo $source_profile

    # Get destination profile
    read -p "Destination Profile [-mfa]: " destination_profile
    destination_profile=${destination_profile:-'-mfa'}
    echo $destination_profile

    mfa_serial_number='arn:aws:iam:::mfa/'

    echo "\nOTP: "
    read -p "One Time Password (OTP): " otp

    echo "\nOTP:" $otp
    echo "\n"

    output=$(aws sts get-session-token --profile $source_profile --serial-number $mfa_serial_number --output json --token-code $otp)

    echo $output

    access_key_id=$(echo $output | jq .Credentials.AccessKeyId | tr -d '"')
    secret_access_key=$(echo $output | jq .Credentials.SecretAccessKey | tr -d '"')
    session_token=$(echo $output | jq .Credentials.SessionToken | tr -d '"')

    aws configure set aws_access_key_id $access_key_id --profile=$destination_profile
    aws configure set aws_secret_access_key $secret_access_key --profile=$destination_profile
    aws configure set aws_session_token $session_token --profile=$destination_profile

    echo "Configured AWS for profile" $destination_profile


👤 davisoneee
This is my AutoHotkey function that gives Windows the same functionality as macOS for cycling through instances of the same app (e.g. multiple Firefox windows). Pass 0 to cycle through all instances, or 1 to cycle through only those on the same desktop (monitor).

    !`::CycleCurrentApplication(0)
    !+`::CycleCurrentApplication(1)


    WhichMonitorAppIsOn(winId) {
        WinGetPos, cX, cY, cW, cH, ahk_id %winId%
        xMid := cX + (cW / 2)
        yMid := cY + (cH / 2)
        SysGet, nMons, MonitorCount
        Loop, % nMons
        {
            ; MsgBox %A_Index%
            SysGet, tmp, Monitor, %A_Index%
            withinWidth := (xMid > tmpLeft) && (xMid < tmpRight)
            ; MsgBox % tmpLeft . " -> " . tmpRight . "`t" . xMid
            if (withinWidth == 1)
                return %A_Index%
        }
    }


    CycleCurrentApplication(same_desktop_only) {
        WinGet, curID, ID, A
        curMon := WhichMonitorAppIsOn(curID)

        WinGetClass, ActiveClass, A
        WinGet, WinClassCount, Count, ahk_class %ActiveClass%
        IF WinClassCount = 1
            Return
        Else
            WinGet, List, List, % "ahk_class " ActiveClass
        Loop, % List
        {
            index := List - A_Index + 1
            WinGet, State, MinMax, % "ahk_id " List%index%
            WinGet, nextID, ID, % "ahk_id " List%index%
            nextMon := WhichMonitorAppIsOn(nextID)

            if (same_desktop_only > 0 && (curMon != nextMon))
                continue

            if (State != -1)  ; if window not minimised
            {
                WinID := List%index%
                break
            }
        }
        WinActivate, % "ahk_id " WinID
    }

👤 acomjean
I use a short bash/perl script to find/replace globally in large files. I have to change the search expression each time, although I pass in the file name. It's not sophisticated, but it's been very useful.

The reason I like it is that it also backs up the original in case I mess up the regex (happens sometimes...):

   #!/usr/bin/env bash
   perl -i.bak -p -e 's/oldtext/newtext/g;' "$1"

👤 t-3
I used to have a bunch of scripts, but I compulsively "clean" my backups too often to keep old stuff around. Here's my "newterm" script, which I use for launching xterm with tmux:

  #!/bin/sh
  
  if [ $(tmux has-session 2>/dev/null; echo $?) -eq 0 ]; then
      if [ $(tmux list-windows -f '#{window_active_clients}') ]; then
          if [ $(tmux ls | head -n 1 | awk '{print $2}') -le 2 ]; then
              xterm -e "tmux new-session -f active-pane,ignore-size -t "0" \; new-window"
          else
              xterm -e "tmux new-session -f active-pane,ignore-size -t "0" \; select-window -t +2"
          fi
      else
          xterm -e "tmux attach -f active-pane,ignore-size -t "0""
      fi
  else
      xterm -e tmux new-session -f active-pane,ignore-size
  fi
  
  if [ $(tmux ls | wc -l) -gt 1 ]; then
      for i in $(tmux ls -F '#S' -f '#{?session_attached,,#S}' ); do
          tmux kill-session -t ${i}
      done
  fi

👤 withinboredom
Using my GitHub SSH keys to log in as root on my servers: https://gist.github.com/withinboredom/84067b9662abc1f968dfad...

This can be extended easily, e.g. dynamically creating an account if the user is part of an org, or using libnss-ato to alias the user to a specific account.
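
For context, a sketch of the core trick (mine, not the linked gist): sshd can delegate key lookup to an external command, and GitHub publishes every user's public keys at https://github.com/<username>.keys.

  #!/bin/sh
  # hypothetical /etc/ssh/github_keys.sh -- print the keys sshd should accept;
  # wire it up in sshd_config with:
  #   AuthorizedKeysCommand /etc/ssh/github_keys.sh
  #   AuthorizedKeysCommandUser nobody
  curl -sf "https://github.com/some-github-user.keys"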


👤 falcolas
For local documentation of libraries (and languages):

    #! /bin/bash

    remote_file_path=$1

    wget --recursive --level=5 --convert-links --page-requisites --wait=1 --random-wait --timestamping --no-parent ${remote_file_path}
And a couple of zshrc functions which make jumping around my filesystem quite snappy. `jump` is aliased to `j`, and `mark` to `m`

    MARKPATH=~/.marks
    function jump {
        cd -P ${MARKPATH}/$1 2> /dev/null || (echo "No such mark: $1" && marks)
    }
    function mark {
        mkdir -p ${MARKPATH}; ln -s $(pwd) $MARKPATH/$1
    }
    function unmark {
        rm -i ${MARKPATH}/$1
    }
    function marks {
        ls -l ${MARKPATH} | sed 's/  / /g' | cut -d' ' -f9- && echo
    }
    _jump()
    {
        local cur=${COMP_WORDS[COMP_CWORD]}
        COMPREPLY=( $(compgen -W "$( ls $MARKPATH )" -- $cur) )
    }
    complete -F _jump jump
(Totally stolen, and fixed up to work in ZSH)

👤 firesloth
I have a bash function I use to checkout a git branch based on a search string:

  function git-checkout-branch-by-search-string() {
    local maybe_branch_name
    maybe_branch_name=$(git branch --sort=-committerdate | grep $1 | head -n 1)
    if [ -n "$maybe_branch_name" ]; then
      git checkout "${maybe_branch_name:2}"
    else
      echo "Could not find branch matching $1"
    fi
  }
  alias gcos="git-checkout-branch-by-search-string"
Branches often include things like ticket numbers and project keys, so you can do

  $ gcos 1234
and save some typing.

I have a pair of fixup commit functions, which make it faster to target fixup commits prior to rebasing:

  function git-commit-fixup() {
    git commit --fixup ":/$*"
  }
  function git-add-all-then-git-commit-fixup() {
    git add .
    git commit --fixup ":/$*"
  }
Long function names that are then assigned to an alias can make it easier to find them later if you forget rarely used ones. That is, you can do:

$ alias | grep fixup

to see the list of relevant aliases and the functions they call.

I also have two functions I use like a linear git bisect:

  function git-checkout-parent-commit() {
    local prev
    prev=$(git rev-parse HEAD~1)
    git checkout "$prev"
  }
  function git-checkout-child-commit() {
    local forward
    forward=$(git-children-of HEAD | tail -1)
    git checkout "$forward"
  }
  function git-children-of() {
    for arg in "$@"; do
      for commit in $(git rev-parse $arg^0); do
        for child in $(git log --format='%H %P' --all | grep -F " $commit" | cut -f1 -d' '); do
          echo $child
        done
      done
    done
  }

👤 demindiro
I use this to generate my site:

    #!/usr/bin/env bash

    for file in `find . -name '*.md'`; do
        output=${file::-3}.html
        if [[ `date -r "$file" "+%s"` -le `date -r "../$output" "+%s"` ]]
        then
            echo "Skipping $file"
            continue
        fi
        mkdir -p ../$(dirname $output)
        echo Generating $output from $file
        cat << EOF > ../$output
    
    `cat head.html`
    
    `cat navigation.html`
    
    `pandoc $file`
    The content on this page is licensed under the CC BY-ND 4.0 Source
    EOF
    done

👤 flobosg
Not mine and I don’t remember the source, but really useful:

    # Simple calculator
    function calc() {
        local result=""
        result="$(printf "scale=10;$*\n" | bc --mathlib | tr -d '\\\n')"
        #                       └─ default (when `--mathlib` is used) is 20
        #
        if [[ "$result" == *.* ]]; then
                # improve the output for decimal numbers
                printf "$result" |
                sed -e 's/^\./0./'        `# add "0" for cases like ".5"` \
                    -e 's/^-\./-0./'      `# add "0" for cases like "-.5"`\
                    -e 's/0*$//;s/\.$//'   # remove trailing zeros
        else
                printf "$result"
        fi
        printf "\n"
    }

👤 muskmusk
I have my notes in Dendron, which is basically a directory of yaml files. I often need to search through the notes, so I made the function below:

    search_notes() {
        input=$(rg -v '(\-\-)|(^\s*$)' --line-number /home/user/some-dir \
            | fzf --ansi --delimiter : \
                  --preview 'batcat --color=always {1} --highlight-line {2}' \
                  --preview-window 'up,60%,border-bottom,+{2}+3/3,~3' \
            | choose -f : 0)
        if [[ "$input" != "" ]]; then
            less "$input"
        fi
    }

It uses various Linux utilities, including fzf and batcat (https://github.com/sharkdp/bat), to show all the places where my query comes up (with fuzzy search). Since the workhorses are fzf and ripgrep, it is quite fast even for very large directories.

So I will do `search_notes postgres authentication`. I can select a line and it will open the file in less. Works like a charm!


👤 ThePhysicist
I run this simple shell script to make daily incremental backups of my home folder using Borg. It works really well, haven't touched it in years [1].

1: https://gist.github.com/adewes/02e8a1f662d100a7ed80627801d0a...
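
For anyone curious what such a script typically boils down to, a hedged sketch (my assumptions: repo path, compression, exclusions; not the linked gist):

  #!/usr/bin/env bash
  # assumes a repository was already created with `borg init` at this path
  export BORG_REPO=/mnt/backup/borg-repo
  borg create --stats --compression lz4 \
      --exclude "$HOME/.cache" \
      ::'home-{now:%Y-%m-%d}' "$HOME"
  # keep a rolling window of daily/weekly/monthly archives
  borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6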


👤 doctorwho42
To shake it up compared to other responses: I regularly integrate the PyWin32 library into work scripts. Sometimes you just need a way to automate interactions with Windows in those non-dev jobs.

The most recent was a script that parsed a financial report and generated multiple emails depending on a set of criteria. The user can then manually review these emails and press send if everything checks out. The goal of the script was to reduce some of the menial work my financial co-worker was doing. I don't have it published on GitHub because it has some internal company info in it. But it works cleanly, and regularly saves him hours of tedious work.

Also, I highly recommend the EasyGui library for those quick scripts that need user input from people who are not comfortable with a console/cmd. It helps make different types of popup windows for user input/selection with a few simple lines.


👤 sm_ts
I can only share a part, since the majority of my scripts reveal much about my system structure (I try to open whatever I can, though; the tedious part of open sourcing a script is making it generic/configurable):

https://github.com/64kramsystem/openscripts

Missed the previous cheatsheet post :) I have a massive collection, which are large enough to be books more than cheatsheets (still, I access and use them as cheatsheets):

https://github.com/64kramsystem/personal_notes/tree/master/t...


👤 l0b0

👤 ansible
I use find and grep a lot in code repos, so I came up with this bash function:

    alias filter_repos_z="grep -ZzEv '/tags|/\.hg/|/\.svn/|/\.git/|/\.repo/|\.o$|\.o\.cmd$|\.depend|\.map$|\.dep$|\.js$|\.html$'"
    function findxgrep()
    {
        find . -type f -print0 | filter_repos_z | xargs -0 grep --color=auto "${@}" | grep -v "^Binary file" | sed 's/^\.\///' | less -F
    }
The "${@}" is the critical bit that allows me to pass arguments like -i to grep. The grep, find and xargs commands all support using a NULL as a file separator instead of whitespace.

👤 adityaathalye
Favourite topic!

My "Bash Toolkit": https://github.com/adityaathalye/bash-toolkit

My (yak-shaving-in-progress :) "little hot-reloadin' static shite generator from shell": https://github.com/adityaathalye/shite

A subtle-ish aspect is, I like to write Functional Programming style Bash. I've been blogging about it here: https://www.evalapply.org/tags/bash/


👤 olifante
My favorite utility is a shell function for quickly attaching to an existing screen session after connecting via ssh (or creating a new one if none exists). It's pretty handy for treating a single ssh connection to a server as if it was a long-lived multi-tab terminal:

  sshcreen () {
      ssh -t "$@" screen -xRR
  }
Works with bash and zsh. Usage is pretty simple:

  $ sshcreen user@example.com
Or for local Docker instances mapped to port 2222:

  $ sshcreen root@localhost -p 2222
Detach the session with CTRL-A + D, reattach by rerunning the sshcreen command you previously used.

👤 hk1337
This was handy before oh-my-zsh, but omz has this functionality now, so it's no longer necessary.

    pman()
    {
        man -t "${1}" | open -f -a /System/Applications/Preview.app
    }

I like using this in conjunction with pbcopy to quickly generate a random password at a given length:

    pwgen()
    {
        length=${1:-64}
        charlist='0-9a-zA-Z~!@#$%^&*()_+-=:";<>?,./'
        echo `cat /dev/random | tr -dc "$charlist" | head -c$length`
    }

👤 monkin
I use my system setup script: https://github.com/pcho/binfiles/blob/master/bt. It helps me a lot with setting up a new VPS when I need to, and with daily tasks. While using macOS, I also had these as helpers: https://github.com/pcho/binfiles/blob/master/.archive/setup-..., to set up Homebrew, and https://github.com/pcho/binfiles/blob/master/.archive/setup-..., for a bunch of options, as much of the build-from-source work is fine on both systems. In the .archive folder there are a lot of other scripts that I used, but I've tried to incorporate them into the bootstrap script.

It also uses my https://github.com/pcho/dotfiles, https://github.com/pcho/vimfiles and https://github.com/pcho/zshfiles


👤 daneel_w

  alias makepw='cat /dev/urandom | LC_ALL=C tr -cd A-Za-z0-9,_- | head -c 25; echo'
Any proper password manager will of course be able to supplant tricks like these.

👤 jader201
Mint recently updated the API they use behind the scenes, and it broke the preexisting scripts others had written for importing transactions from a CSV (when you link a new bank, Mint only imports the past 90 days).

My son opened an account over a year ago, but we didn't sign up for Mint until this weekend, so I ended up writing a new import script for the updated API:

https://github.com/jeradrose/mint-simple-import


👤 jkern
I'm not sure bashrc tweaks completely qualify, but considering it involves probably the most convoluted shell script I've ever had to come up with, I'll plug https://github.com/jkern888/bash-dir-collapse. I like having the current directory in my prompt but got annoyed at how long it could get, so I made this to shrink the overall length without completely sacrificing the full path.

👤 jjice
I'm looking in my `~/bin` folder on my work machine right now. I have a good few that are very specific to my work.

- Scripts to test our rate limiting for both authenticated and unauthenticated users (was handy)

- API routes changed in a given PR (set of commits since the last interaction with master in reality)

- ssl-expiration-date - Checks the expiration date of a site's certificate

  domain="$1"
  
  echo "Checking the SSL certificate expiration date for: $domain"
  
  curl -vI "$domain" 2>&1 | grep -o 'expire date: .*$'
- test-tls-version - Checks if a website supports a given version of TLS

  domain="$1"
  curl_options=( "--tlsv${2}" --tls-max "$2" )
  
  curl "${curl_options[@]}"  -vI "$domain" 2>&1
There are also some miscellaneous PHP scripts lying around for template-related stuff. PHP makes a great templating language when you need some basic programmatic additions to your output text.

Everything is too coupled to my work to be useful to others, and most of the automation scripts I've written for work are run as cron jobs now and send out emails to the appropriate recipients. Most of these are written in PHP (we're a PHP shop).


👤 bbkane
Here's a small script I use often to tag commits with Git.

It shows the current status, lists out the most recent tags, prompts for a new tag and message, and finally pushes.

Everything is colorized so it's easy to read and I use it quite often for Golang projects.

https://github.com/bbkane/dotfiles/blob/e30c12c11a61ccc758f7...
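
I haven't opened the link, but the flow described reads roughly like this (a plain, uncolored sketch):

  git status --short
  echo "Recent tags:"
  git tag --sort=-creatordate | head -n 5
  read -rp "New tag: " tag
  read -rp "Tag message: " message
  git tag -a "$tag" -m "$message"
  git push origin "$tag"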


👤 pveierland
I use a script called `shell-safe-rm` [1], aliased as `rm` in interactive shells, such that I don't normally use `rm` directly. Instead of directly removing files, they are placed in the trash folder so they can be recovered if they were mistakenly deleted. Highly recommend using a script/program like this to help prevent accidental data loss.

[1] https://github.com/kaelzhang/shell-safe-rm
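
The linked project handles the details properly (per-platform trash locations, name collisions, and so on); as a much cruder sketch of the same idea:

  # move targets into a timestamped trash folder instead of deleting them
  trash() {
      local dir="$HOME/.trash/$(date +%Y%m%d-%H%M%S)"
      mkdir -p "$dir" && mv -- "$@" "$dir"/
  }
  alias rm='trash'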


👤 tls-kn
Script I quickly wrote that automatically compiles my tex files on change, and reloads my pdf viewer (mupdf in this particular case). This was written for OpenBSD (TeXstudio wasn't available, and I ended up liking this editor+mupdf approach even more), so I don't know if it perfectly translates to other OSs.

    #!/bin/sh
    
    pdf_viewer="mupdf";
    latex_cmd="pdflatex -interaction=nonstopmode"
    
    if [[ $# -eq 0 ]]; then
        print "No arguments: filename required"
        exit
    fi
    
    filename=$1;
    pdfname=${filename%%.*}.pdf
    
    # inital compilation to make sure a pdf file exists
    ${latex_cmd} ${filename};
    
    ${pdf_viewer} ${pdfname} &
    
    # get pid of the pdf viewer
    pdf_viewer_pid=$!;
    
    while true; do
        # as long as the pdf viewer is open, continue operation, if it gets closed,
        # end script
        if kill -0 "${pdf_viewer_pid}" 2>/dev/null; then
            if [[ ${filename} -nt ${pdfname} ]]; then
                ${latex_cmd} ${filename};
    
                # reload pdf file, only works with mupdf
                kill -HUP ${pdf_viewer_pid};
                touch $pdfname
            fi
            sleep 1;
        else
            exit 0;
        fi
    done;

👤 mstudio
I sometimes leave a process running on a port for webdev and then try to start a new one, resulting in the error "Error: listen EADDRINUSE 0.0.0.0:NNN", e.g. 0.0.0.0:443.

There are many ways to search for the process, but here's what I use:

   lsof -iTCP -sTCP:LISTEN -P | grep [PORT NUMBER]
Look for the port number and kill the process with:

   kill -9 [PID OF PROCESS YOU WANT TO KILL]
Note: if the process is running as the root user, you will need to prepend the above commands with sudo.
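
If you already know what's squatting on the port, the two steps can be collapsed into one, relying on lsof's -t (terse, PIDs only) output:

   kill -9 $(lsof -t -iTCP:443 -sTCP:LISTEN)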

👤 jmillikin
I often have a need to serve a local directory via HTTP. In the old days the built-in Python webserver was enough, but at some point browsers became more aggressive about concurrent connections and the single-threaded `python -m SimpleHTTPServer` would just get stuck if it received two requests at once.

As a workaround, I wrote a small wrapper script that would enable multi-threading for SimpleHTTPServer.

~/bin/http-cwd , Python 2 version (original):

  #!/usr/bin/python
  import argparse
  import BaseHTTPServer
  import SimpleHTTPServer
  import SocketServer
  import sys

  class ThreadedHTTPServer(SocketServer.ThreadingMixIn, BaseHTTPServer.HTTPServer):
      pass

  def main(argv):
      parser = argparse.ArgumentParser()
      parser.add_argument(
          "--port", type = int, nargs = "?",
          action = "store", default = 8000,
          help = "Specify alternate port [default: 8000]",
      )
      parser.add_argument(
          "--iface", type = str, nargs = "?",
          action = "store", default = "127.0.0.1",
          help = "Specify iface [default: 127.0.0.1]",
      )
      args = parser.parse_args(argv[1:])
      server_address = (args.iface, args.port)
      srv = ThreadedHTTPServer(server_address, SimpleHTTPServer.SimpleHTTPRequestHandler)
      sa = srv.socket.getsockname()
      print "Serving http://%s:%r ..." % (sa[0], sa[1])
      srv.serve_forever()

  if __name__ == "__main__":
      sys.exit(main(sys.argv))
Python 3 version (necessary for platforms that have dropped Python 2, such as macOS):

  #!/usr/bin/python3
  import argparse
  import http.server
  import socketserver
  import sys

  class ThreadedHTTPServer(socketserver.ThreadingMixIn, http.server.HTTPServer):
      pass

  def main(argv):
      parser = argparse.ArgumentParser()
      parser.add_argument(
          "--port", type = int, nargs = "?",
          action = "store", default = 8000,
          help = "Specify alternate port [default: 8000]",
      )
      parser.add_argument(
          "--iface", type = str, nargs = "?",
          action = "store", default = "127.0.0.1",
          help = "Specify iface [default: 127.0.0.1]",
      )
      args = parser.parse_args(argv[1:])
      server_address = (args.iface, args.port)
      srv = ThreadedHTTPServer(server_address, http.server.SimpleHTTPRequestHandler)
      sa = srv.socket.getsockname()
      print("Serving http://%s:%r ..." % (sa[0], sa[1]))
      srv.serve_forever()

  if __name__ == "__main__":
      sys.exit(main(sys.argv))

👤 jasonariddle
Yup, here are mine.

    # Search all directories for this directory name.
    dname() {
        [ $# -eq 0 ] && echo "$0 'dir_name'" && return 1
        fd --hidden --follow --exclude .git --type directory "$*"
    }

    # Search all files for this filename.
    fname() {
        [ $# -eq 0 ] && echo "$0 'file_name'" && return 1
        fd --hidden --follow --exclude .git --type file "$*"
    }

    # Find and replace with a pattern and replacement
    sub() {
        [ $# -ne 2 ] && echo "$0 'pattern' 'replacement'" && return 1
        pattern="$1"
        replace="$2"
        command rg -0 --files-with-matches "$pattern" --hidden --glob '!.git' | xargs -0 perl -pi -e "s|$pattern|$replace|g"
    }

    # Uses z and fzf, if there's a match then jump to it. If not, bring up a list via fzf to fuzzy search.
    unalias z 2> /dev/null
    z() {
        [ $# -gt 0 ] && _z "$*" && return
        cd "$(_z -l 2>&1 | sed 's/^[0-9,.]* *//' | fzf)"
    }

👤 PeterWhittaker
Shorthand to find all files matching a pattern (with optional additional arguments, e.g. -delete, -ls, -exec ..., etc.):

  fndi () 
  { 
      tgt="${1}";
      shift;
      echo find . -iname \*"${tgt}"\* "${@}";
      find . -iname \*"${tgt}"\* "${@}" 2> /dev/null;
      [[ -z $tgt ]] && { 
          echo;
          echo "No target was specified, did the results   surprise?"
      }
  }
Shorthand to find all files containing a pattern:

  fndg () 
  { 
      binOpt="-I";
      wordOpt="";
      caseOpt="-i";
      while true; do
          if [[ -z $1 || $1 =~ ^[^-+] ]]; then
              break;
          fi;
          case $1 in 
              +i)
                  caseOpt=""
              ;;
              -B)
                  binOpt=""
              ;;
              -w)
                  wordOpt="-w"
              ;;
              *)
                  echo "Unrecognized option '${1}', cannot proceed.";
                  return 1
              ;;
          esac;
          shift;
      done;
      if [[ -z $2 ]]; then
          startIn=.;
      else
          startIn='';
          while [[ ! -z $2 ]]; do
              startIn+="$1 ";
              shift;
          done;
      fi;
      [[ -z $1 ]] && { 
          echo "No target specified, cannot proceed.";
          return
      };
      tgt=$1;
      echo find ${startIn} -type f -exec grep $binOpt $wordOpt $caseOpt -H "${tgt}" {} \;;
      find ${startIn} -type f -exec grep $binOpt $wordOpt $caseOpt -H "${tgt}" {} \; 2> /dev/null
  }

👤 Linux-Fan
My most-used self-written script is probably `syssheet` -- this script displays information about the running system in terms of load (RAM usage, disk usage, load avg, users) and hardware (CPU model, network interfaces...). It is like a crossover between top and inxi. I use it to clearly distinguish what kind of system I am logging into, and I deploy it to all of my systems: https://masysma.lima-city.de/11/syssheet.xhtml
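
Not syssheet itself (see the link above), but for flavor, a tiny sketch of the kind of summary it prints:

  hostname
  uptime
  grep -m1 'model name' /proc/cpuinfo
  free -h | awk 'NR==2 {print "RAM: " $3 " used of " $2}'
  df -h / | awk 'NR==2 {print "disk /: " $3 " used of " $2 " (" $5 ")"}'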

There is also a collection of more "obscure" scripts in my shellscripts repository documented here: https://masysma.lima-city.de/32/shellscripts.xhtml.

Another (probably niche) topic is my handling of scanned documents, which arrive as PDFs from the scanner and which I want to number according to the stamped number on the document and convert to PNG with a reduced color space: https://masysma.lima-city.de/32/scanning.xhtml


👤 zem
Not mine (I stole it from someone else), but a very useful bash prompt_func to store your entire bash history forever:

8<-----------------------------

  function prompt_func {
    CMDNUM=`history 1 | awk '{print $1}'`
    LAST_CMD=`history 1 | cut -f 3- -d ' '`

    if [ x$LAST_CMDNUM = xwho_knows ]; then
      LAST_CMDNUM=$CMDNUM
    fi

    if [ x$CMDNUM != x$LAST_CMDNUM ]; then
      FULL_CMD_LOG="$HOME/full-history/$(date "+%Y-%m-%d").log"
      echo "$(date '+%H:%M:%S') `munge_pwd` $LAST_CMD" >> $FULL_CMD_LOG
      LAST_CMDNUM=$CMDNUM
    fi
  }
  export PROMPT_COMMAND=prompt_func
  export LAST_CMDNUM=who_knows

  function fh() {
    grep -r --color=NEVER ${*} ~/full-history |
    sed 's/[^ ]* //' |
    sed 's/ \[[^]]\*\]/$/'
  }
8<-----------------------------

`munge_pwd` is another script that does various substitutions on the prompt (specific to how my work directories are laid out) but mostly you can just substitute `pwd` if you don't care about deduplicating stuff like multiple checkouts of the same project.


👤 djsamseng
Recursive grep: for every time you know you've written that code before but can't remember the exact syntax. Filters out known build directories that would otherwise make it slow (modify this to your personal use case).

https://github.com/djsamseng/cheat_sheet/blob/main/grep_for_...

    #!/bin/bash

    if [ $# -eq 0 ]; then
        echo "Usage: ./grep_for_text.sh \"text to find\" /path/to/folder --include=*.{cpp,h}"
        exit
    fi

    text=$1
    location=$2

    # Remove $1 and $2 to pass remaining arguments as $@
    shift
    shift

    result=$(grep -Ril "$text" "$location" \
        $@ \
        --exclude-dir=node_modules --exclude-dir=build --exclude-dir=env --exclude-dir=lib \
        --exclude-dir=.data --exclude-dir=.git --exclude-dir=data --exclude-dir=include \
        --exclude-dir=__pycache__ --exclude-dir=.cache --exclude-dir=docs \
        --exclude-dir=share --exclude-dir=odas --exclude-dir=dependencies \
        --exclude-dir=assets)

    echo "$result"


👤 mxvzr

    json2yaml() {
        python3 -c "import json,sys,yaml; print(yaml.dump(json.load(sys.stdin)))"
    }
    export -f json2yaml

    yaml2json() {
        python3 -c "import json,sys,yaml; json.dump(yaml.safe_load(sys.stdin), sys.stdout, default=str)"
    }
    export -f yaml2json

    httping() {
        while true; do
            curl $@ -so /dev/null \
                -w "connected to %{remote_ip}:%{remote_port}, code=%{response_code} time=%{time_total}s\n" \
                || return $?
            sleep 1
        done
    }
    [[ ! $(>&/dev/null type httping) ]] && export -f httping

    redis-cli() {
        REDIS_HOST="${1:-127.0.0.1}"
        REDIS_PORT="${2:-6379}"  
        rlwrap -S "${REDIS_HOST}:${REDIS_PORT}> " socat tcp:${REDIS_HOST}:${REDIS_PORT} STDIO
    }
    [[ ! $(>&/dev/null type redis-cli) ]] && export -f redis-cli

👤 stuporglue
Does a PostgreSQL plpgsql function count as a script?

https://gist.github.com/stuporglue/83714cdfa0e4b4401cb6

It's one of my favorites because it's pretty simple, and I wrote it when a lot of things were finally coming together for me (including GIS concepts, plpgsql programming, and a project I was working on at the time).

This is code which takes either two focus points and a distance, or two foci, a distance, and the number of points per quadrant, and generates a polygon representing an ellipse. Nothing fancy, but it made me happy when I finally got it working.

The use case was to calculate a naive estimate of how far someone could have ridden on a bike-share bike. I had the location where they checked out the bike, where they returned it, and the time they were gone. By assuming some average speed, I could make an ellipse where everywhere inside it could have been reached during the bike rental.


👤 digitalsushi
I have an exported shell function called "cheat". It looks for a plain text file in a specific location named after the one argument to cheat() and just prints the file out. I think it might glob and print multiple matches.
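
Something along these lines, as a guess at its shape (the ~/.cheats location is my assumption, not digitalsushi's):

  cheat() {
      cat "$HOME/.cheats/$1"* 2>/dev/null || echo "no cheat sheet for: $1" >&2
  }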

And then I have different dotfile repos. I have a base one that I keep so clean I could get a job at Disney with it. That's where most of my scripts live. And then I have locale ones, like -home, -. Those have overlays so that I can have contextual extensions, such as a cheat database with work stuff. Also, I can keep that dotfile-employername hosted at my employer so that I'm not "crossing the streams". I don't even have to link them, they just autoload based on their location and name.

I don't have to hop systems too much, so grabbing fresh tooling is a twice a year problem. I'm a cli-as-ide dinosaur so I just hide all my seldom-used scripts under a double underscore prefix. __init_tooling will update vim and give me the 8 or 9 plugins I have grown dependent upon, give me a ruby and python environment, etc.

I have a function called "add_word". Every time I see a word I don't know, I learn it, and then I run "add_word ". It creates a new file called with the definition and commits it to a git repo hidden away. Every couple of years I'll work through the list and see which I remember. I have about a 30% success rate adopting new words, which, again, dinosaur here, so I'll take whatever I can get.

The dirtiest thing I have is a cheap vault that uses vim and shell automation. I have a grammar for describing secrets, and I can pass a passphrase through automation to get secrets out. I'm sure it's 100% hackable. I know the first rule of security software is "don't ever try to make your own". So I don't put anything too good in there.


👤 alexpotato
I wasn't the author, but if I had to pick one set of scripts that has lowered my anxiety the most, it's this: http://www.mikerubel.org/computers/rsync_snapshots/

Long story short: you can use hard links + rsync to create delta snapshots of a directory tree. I use it to create backups of my important directory trees.
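
The same idea can be expressed today with rsync's --link-dest (a sketch of the technique, not Mike Rubel's exact script): unchanged files become hard links into the previous snapshot, so every snapshot looks like a full tree but only the deltas take new space.

  src="$HOME/important"
  dest="/backup/snapshots"
  new="$dest/$(date +%Y-%m-%d_%H%M%S)"
  rsync -a --delete --link-dest="$dest/latest" "$src/" "$new/"
  ln -sfn "$new" "$dest/latest"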

Funny story about this: I had a really old HP "Lance Armstrong" branded laptop that I used for years. The above script was on it and was rsyncing to a separate machine, so it was fully backed up. Because of that, I was actually hoping for the laptop to die so I could get a new one (frugalness kicking in strong here).

My girlfriend at the time was using it and said "Oh, should I not eat or drink over your laptop?" and I responded: "No, please do! If you break it that means I can allow myself to order a new one."


👤 zzo38computer
Access Firefox bookmarks from command-line:

  f0() {
    echo 'select moz_bookmarks.title || '"'"' = '"'"' || url from moz_places, moz_bookmarks on moz_places.id = moz_bookmarks.fk where parent = 2;' | sqlite3 /home/user/.mozilla/firefox/twht79zd.default/places.sqlite
  }
  f1() {
    firefox `echo 'select url from moz_places, moz_bookmarks on moz_places.id = moz_bookmarks.fk where moz_bookmarks.title = '"'$1'"';' | sqlite3 /home/user/.mozilla/firefox/twht79zd.default/places.sqlite`
  }
  f$# $1
Execute PostScript programs alone with command-line arguments:

  exec gs -P -dBATCH -dNODISPLAY -dNOEPS -dNOPAUSE -dNOSAFER -q -- "$@"
Tell the IP address:

  curl -s 'http://icanhazip.com/' | cat -v

👤 kakulukia
If you are polite, it works:

  alias please='sudo zsh -c "$(fc -ln -1)"'  # rerun the last command with sudo (because it failed)

Easier PATH management:

  # nicer path configuration and lookup
  function path {
    if [[ $# -eq 0 ]]; then
      echo -e ${PATH//:/\\n} | sort
    elif [[ "$1" == "--save" ]]; then
      path $2 && echo "\npath $2" >> $HOME/.profile
    else
      if [[ -d "$1" ]]; then
        if [[ -z "$PATH" ]]; then
          export PATH=$1
        else
          export PATH=$1:$PATH
        fi
      else
        echo "$1 does not exist :("
        return 1
      fi
    fi
  }


👤 hcrean
python3 -c "import pyotp ; print(pyotp.TOTP(''.lower()).now())"

This is a one-liner that takes the place of an RSA/2FA token or the Authy app (paste your TOTP secret into the empty string).


👤 jamesralph8555
Wrote this simple one to create gzipped tar backups and send them over ssh. A lot faster than rsync if you just need to back up a whole folder on a schedule. It requires pigz, which is a parallel gzip implementation.

Variables:

  $HOSTNAME    - the computer hostname
  $TOBACKUPDIR - the local directory you want backed up
  $N_CORES     - the number of cores to use for compression
  $REMOTEUSER  - the ssh user login on the remote server
  $REMOTEHOST  - the remote server's IP
  $BACKUPDIR   - where you want the file to be backed up to

#!/bin/bash

bfile=`date +%F`.$HOSTNAME.tar.gz

    # Exclude local directories by adding e.g. --exclude="dir"
    # to the tar invocation below.
    /usr/bin/tar cvpf - $TOBACKUPDIR | pigz -p $N_CORES | \
            ssh $REMOTEUSER@$REMOTEHOST "cat - > /$BACKUPDIR/$bfile"

👤 alsetmusic
This is not mine. I can’t remember where I found it. Apologies for no attribution (Mac):

#!/usr/bin/env bash

  function cdf() {
    #: Change working directory to the top-most Finder window location
    cd "$(osascript -e 'tell app "Finder" to POSIX path of (insertion location as alias)')"
  }


👤 mickeyp
A bash script + a little elisp magic that leverages Emacs for fuzzy finding on the command line instead of fzf:

https://www.masteringemacs.org/article/fuzzy-finding-emacs-i...


👤 tengwar2
This is embarrassingly simple, but useful.

    manps()
    {
     if [ -z "$1" ]; then
      echo usage: $FUNCNAME topic
      echo This will open a PostScript formatted version of the man page for \'topic\'.
     else
      man -t $1 | open -f -a /Applications/Preview.app
     fi
    }
This is for macOS. All it does is display a `man` entry as properly formatted PostScript. If you were around in the 80s, when we had ring-bound paper manuals, you may remember how much easier they were to read compared with a fixed-pitch terminal rendering of the same page.

Sorry, no Linux version as I rarely have a graphical desktop open on Linux. It should be easy to rig something up with Ghostscript or similar.


👤 ransom1538
Let's say you have a cronjob YOUR_CRONJOB.php that is slow and uses a query to get the data it needs to iterate over. Change your SQL WHERE clause to filter on MOD(id, total_threads) matching the worker's thread number. Toss this into Jenkins and you are set.
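
The sharded query each worker runs might look something like this (a sketch; the table and column names are made up, and it's shown as shell rather than PHP):

  # invoked as: php YOUR_CRONJOB.php <thread> <total_threads>
  THREAD=$1
  TOTAL_THREADS=$2
  mysql mydb -e "SELECT id, payload FROM jobs WHERE MOD(id, $TOTAL_THREADS) = $THREAD"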

This will fork it and wait until it ends.

  #!/bin/bash

  TOTAL=`ps aux | grep YOUR_CRONJOB.php | grep -v grep | wc -l`
  echo "TOTAL PROCESSES ALREADY RUNNING: "$TOTAL

  MAX_THREADS=20
  TOTAL_MODS="$(($MAX_THREADS-1))"
  echo "TOTAL MODS: "$TOTAL_MODS

  if [ $TOTAL -eq 0 ]
  then
      echo "RUNNING..."
      for i in $(seq 0 $TOTAL_MODS)
      do
          echo "Starting thread $i"
          timeout 10000 php YOUR_CRONJOB.php $i $MAX_THREADS &
          pids[${i}]=$!
      done
      echo "FINISHED FORKING"
  else
      echo "NOT RUNNING...."
  fi

  for pid in ${pids[*]}; do
      wait $pid
  done

  echo "OK FINISHED"


👤 yabones
A few years ago, I needed a quick way to create QEMU VMs locally for testing some weird software configurations. So I made a script to pull Ubuntu cloud images and clone them into qcow2 disks, then create and register libvirt virtual machines. Part of the "magic" was creating a cloud-config ISO image that would be mounted to pre-seed the VM on first launch. It also pushed my ssh key into the VM so I wouldn't need to use passwords. Janky, but it worked well for what I needed.

https://github.com/noahbailey/kvmgr/blob/master/kvmgr.sh


👤 axelf4
Here is my script gfm-preview [1], which I think is pretty cool since it implements an HTTP server in 50 lines of shell script (ab)using netcat. What it does is start an HTTP server that serves a rendered preview of a Markdown document, using GitHub's API for rendering GitHub Flavoured Markdown. The page automatically updates when the document changes, using fswatch and HTTP long polling!
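
For anyone curious, the basic netcat-as-HTTP-server trick looks roughly like this (a sketch only, assuming BSD-style `nc -l PORT`; the linked script layers the GitHub rendering API, fswatch, and long polling on top):

  # serve one hard-coded page per connection via a fifo
  fifo=$(mktemp -u) && mkfifo "$fifo"
  trap 'rm -f "$fifo"' EXIT
  while true; do
    nc -l 8080 < "$fifo" | {
      read -r request_line    # e.g. "GET / HTTP/1.1"
      body='<h1>preview goes here</h1>'
      printf 'HTTP/1.1 200 OK\r\nContent-Type: text/html\r\nContent-Length: %s\r\n\r\n%s' \
        "${#body}" "$body"
    } > "$fifo"
  done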

[1]: https://github.com/axelf4/nixos-config/blob/e90e897243e1d135...



👤 RhysU
A shebang-friendly script for "interpreting" single C99, C11, and C++ files, including rcfile support: https://github.com/RhysU/c99sh/blob/master/c99sh

Use gnuplot to plot one or more files directly from the command line: https://github.com/RhysU/gplot/blob/master/gplot


👤 vbezhenar
Some zsh functions I can't live without

  mkcd() {
      mkdir -p "$1" && cd "$1"
  }

  mkcdtmp() {
      mkcd ~/tmp/$(date "+%y%m%d")
  }

👤 lukaszkups
Oh gosh, that made me so nostalgic - I made a mono .sh file like this to boost my system setup a couple of years ago, since elementaryOS at the time did not support release upgrades and I had to do a clean install whenever a new release was published: https://github.com/mrmnmly/linux-installation-script/blob/ma...

When I read it today I miss those oh-so-oversimplified solutions for getting stuff done :'-)


👤 burntsushi
My dotfiles: https://github.com/BurntSushi/dotfiles

Here are some selected scripts folks might find interesting.

Here's my backup script that I use to encrypt my data at rest before shipping it off to s3. Runs every night and is idempotent. I use s3 lifecycle rules to keep data around for 6 months after it's deleted. That way, if my script goofs, I can recover: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

I have so many machines running Archlinux that I wrote my own little helper for installing Arch that configures the machine in the way I expect: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

A tiny little script to recover the git commit message you spent 10 minutes writing, but "lost" because something caused the actual commit to fail (like a gpg error): https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

A script that produces a GitHub permalink from just a file path and some optional file numbers. Pass --clip to put it on your clipboard: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae... --- I use it with this vimscript function to quickly generate permalinks from my editor: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

A wrapper around 'gh' (previously: 'hub') that lets you run 'hub-rollup pr-number' and it will automatically rebase that PR into your current branch. This is useful for creating one big "rollup" branch of a bunch of PRs. It is idempotent. https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

Scale a video without having to memorize ffmpeg's crazy CLI syntax: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...

Under X11, copy something to your clipboard using the best tool available: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...


👤 gitgud
Here's my dotfiles repository [1], which is used to sync my little scripts and config files between my different systems (Mac/Linux). I first heard about it here [2].

[1] https://github.com/benwinding/dotfiles

[2] https://zachholman.com/2010/08/dotfiles-are-meant-to-be-fork...


👤 zwischenzug
Not a script as such, but I did put this together, building on what someone else did:

https://github.com/ianmiell/bash-template

It's a 'cut and paste' starter for shell scripts that tries to be as robust as possible while not going crazy with the scaffolding. Useful for "I want to quickly cut a script and put it into our source but don't want it to look totally hacky" situations.


👤 sumosudo

  autoload -U add-zsh-hook

  add-zsh-hook chpwd source_env

  source_env() {
        if [[ -f .env && -r .env  ]]; then
                source .env
        fi
  }

👤 lifefeed
Small bash functions that're useful for silly-ego-reasons.

    ## coding analysis
    function lines_coded {
        perl -ne'print unless /^\s*$/ || /^\s*(?:#|\/\*|\*)/' $* | wl
    }
    function lines_commented {
        perl -ne'print if /^\s*(?:#|\/\*|\*)/' $* | wl
    }
And wl is just a small alias (because I used it all the time):

    wl='wc -l'

👤 jamescampbell
My Mac OS bootstrap script to set up any new Mac from scratch. It includes niceties like moving the default screenshots folder from the Desktop to a saner location, setting full disk encryption, and setting up privoxy & dnscrypt out of the box. https://github.com/james-see/fresh-mac

👤 altilunium
With this I can produce beautifully typeset LaTeX PDFs and publish a blog post, right from my text editor.

https://rtnf.prose.sh/prose-sublime-text-integration

https://rtnf.prose.sh/pandoc-sublime-text-integration


👤 hnarayanan
I use the following definition in my .profile to be able to replace foo with bar in all text files within a folder.

  replace() {
      grep -rl "$1" . | xargs gsed -i "s/$1/$2/g"
  }
Also, I run Spotify from the command line: https://github.com/hnarayanan/shpotify

👤 Freaky
borg-backup.sh, which runs my remote borg backups off a cronjob: https://github.com/Freaky/borg-backup.sh

zfsnapr, a ZFS recursive snapshot mounter - I run borg-backup.sh using this to make consistent backups: https://github.com/Freaky/zfsnapr

mkjail, an automatic minimal FreeBSD chroot environment builder: https://github.com/Freaky/mkjail

run-one, a clone of the Ubuntu scripts of the same name, which provides a slightly friendlier alternative to running commands with flock/lockf: https://github.com/Freaky/run-one

ioztat, a Python script that basically provides what zfs-iostat(8) would if it existed: https://github.com/jimsalterjrs/ioztat


👤 thedanbob
Not sure whether to be proud or ashamed of this, but this script runs in a cron job to monitor my home server’s IPv4 and IPv6 addresses and update them everywhere (DNS, firewall, reverse proxies) if they change: https://gist.github.com/thedanbob/13f88ca8c21cb2ab7904ec5a6e...

👤 gengiskush
https://bitbucket.org/mieszkowski/introspect/src/master/

I replaced Plone for my personal use with about 1000 lines of Python: an object-oriented database. The interface is awkward, but if you get past that, the goal was to produce pictures of trees with graphviz.


👤 mauli
I started - but rarely update, and kinda forgot to push to GitHub - some small scripts and knowledge snippets. One of them is a network/ssh-based distributed unseal mechanism (using Shamir's algorithm) that allows machines to boot and decrypt their OS partition.

https://github.com/maulware/maulstuff


👤 PainfullyNormal
I wrote a silly little script to play a random assortment of music.

    playMeSomeMusicMan() { rg --files -tmusic ~/Music | shuf | mpv --playlist=- }
I also got sick of waiting for Activity Monitor to boot to kill an errant process, so I wrote this one to fuzzy search and kill the selection.

    kp() { ps aux | fzy | awk '{ print $2 }' | xargs kill }

👤 hyperman1
themicrosoftchainsawmassacre.cmd

  TASKKILL /IM outlook.exe
  TASKKILL /IM teams.exe
  TASKKILL /IM onedrive.exe

  timeout /t 2

  TASKKILL /F /IM outlook.exe
  TASKKILL /F /IM teams.exe
  TASKKILL /F /IM onedrive.exe
  TASKKILL /F /IM Microsoft.AAD.BrokerPlugin.exe
  timeout /t 2

  start outlook.exe
  start "" %LOCALAPPDATA%\Microsoft\Teams\Update.exe --processStart "Teams.exe"
  start "" "C:\Program Files\Microsoft OneDrive\OneDrive.exe"  /background


There is a short period between network start and VPN start where all the Microsoft thingies start and want me to log in again. As their SMSes sometimes take hours to arrive, it is easier to just kill and restart them and let them reuse their existing login.

So I dropped the batch above on my desktop, and click it while the VPN is starting up. In the 4 seconds it takes to kill everything, the network works as it should.



👤 RijilV
I find this one handy:

  #!/bin/bash
  echo |\
    openssl s_client -connect ${1:?Usage: $0 HOSTNAME [PORT] [x509 OPTIONS]}:${2:-443} 2>&1 |\
    sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' |\
    openssl x509 ${3:--text} ${@:4} 2>/dev/null |\
    sed '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/d'


👤 efitz
Here’s one I wrote recently to take a list of domains and enumerate subdomains using sublist3r. Mostly it wraps the latter tool and cleans up the output, but it also enriches the output with dig info.

https://github.com/ericfitz/dominfo

Dependencies:

  sublist3r (Python)
  pv (used for progress bars)


👤 0xbadcafebee
My junk drawer: https://github.com/peterwwillis/junkdrawer

The junk I haven't touched in 10 years: https://github.com/psypete/public-bin/src


👤 ilozinski
I work on an M1 MacBook, and the ARM architecture frequently breaks dependencies. I have two really basic functions in my .zshrc (they should also work in bash):

  # M1 compatibility switches
  # (the ';' before '}' keeps these working in bash as well as zsh)
  arm() { arch -arm64 "${@:-$SHELL}"; }
  x86() { arch -x86_64 "${@:-$SHELL}"; }

This, with the addition of `$(uname -m)` in my $PROMPT, has saved me a lot of time by letting me switch between the arm64 and x86_64 architectures.
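
The prompt part might look something like this in zsh (a sketch; PROMPT_SUBST is needed so the command substitution is re-evaluated for each prompt):

  setopt PROMPT_SUBST
  PROMPT='%n@%m ($(uname -m)) %~ %# '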


👤 klysm
My philosophy is that if I need a script something is wrong. Unfortunately a lot of things are wrong so I have a lot of scripts

👤 maybe_pablo
A quick one to display the contents of my scripts:

  $ wat wat
  #!/usr/bin/env bash
  cat `which $1`

👤 MiddleEndian

👤 kseistrup
https://codeberg.org/kas/qtime/src/branch/master/shell/qtime...

    $ qtime.bash
    It's nearly twenty-five past two.

👤 pech0rin
Not mind-blowing, but I use it all the time to loop a task a certain number of times. I always forget the bash syntax for loops, so I just wrote a simple command where I can run `loop 10 ./this-other thing and stuff` to run it ten times.

  loop() {
    NUM=$1
    shift
    # note: {1..$NUM} only expands a variable range in zsh;
    # for plain bash, use `for i in $(seq 1 $NUM)` instead
    for i in {1..$NUM}; do
      "$@"
    done
  }


👤 edrx
This is a bit of a meta-answer... many years ago I saw that I was not very good at writing error handling code for my scripts, so I (mostly) switched to this:

http://angg.twu.net/eepitch.html

that lets me execute my scripts line by line very easily.


👤 alinspired
Show kubernetes pods in "unusual states" or restarted 8 or more times:

    kubectl get pods --all-namespaces --sort-by=.metadata.creationTimestamp -o wide -Lapp \
        | grep -vP "Completed|Terminating|ContainerCreating|Running\s+[01234567]\s+"

👤 gkfasdfasdf
Sorted output from ripgrep without sacrificing parallel search (of course, you have to wait for the search to complete before seeing any output):

  function rgs {
      rg --line-number --with-filename --color always "$@" | sort --stable --field-separator=: --key=1,1
  }

👤 lioeters
I use this one almost every day.

  # Recursively search for keyword in all files of current directory

  grr() {
    grep -rHIn --exclude-dir=.git --exclude-dir=node_modules --exclude=*.min.* --exclude=*.map "$@" . 2>&1 | grep -v "No such file"
  }

👤 thedookmaster
Here's my scripts directory from my dotfiles repo: https://github.com/kleutzinger/dotfiles/tree/master/scripts

👤 jlund-molfese
https://github.com/johnl-m/display-brightness-scripts

I wanted to control my display’s brightness using my keyboard on Linux. Turned out to be pretty easy with ddcutil!


👤 dusted
I don't have that many useful scripts online, but one I'm using a bit is for quickly generating static photo albums: https://github.com/DusteDdk/chromogen

👤 jcuenod
Zotero: I `git init`ed my zotero folder and wrote scripts to do daily commits and pushes to a remote host:

https://github.com/jcuenod/zotero-backup-scripts/


👤 j1elo
Not a script for a specific need, but I have a folder with Bash snippets from where I copy and mix parts of them when writing scripts.

https://github.com/j1elo/shell-snippets


👤 hansschouten
I often type ls after cd, and even if I know the folder contents I don't mind seeing them again. To automatically ls after each cd command, add this to ~/.bash_profile:

[ -z "$PS1" ] && return function cd { builtin cd "$@" && ls }


👤 RandomBK
My scripts aren't particularly fancy or original, but you can look through everything at https://github.com/randombk/randombk-dotfiles.

👤 mrich
System setup and scripts:

https://github.com/mrichtarsky/linux-shared

The repo name is a bit outdated, it works on macOS too. Lots of scripts are missing, will add them soon.


👤 curtis86
I've started adding some of my shorter scripts to a single repo - https://github.com/curtis86/my-scripts

Will definitely be adding more as I tidy them up! :)


👤 yakshaving_jgt
I wrote a small script to stick my local weather (based on IP address) in my tmux status bar.

https://jezenthomas.com/showing-the-weather-in-tmux/


👤 egberts1
Script? Bash? TLS?

This one will generate any kind of TLS certificate: Root CA, intermediate, mail, web, client-side …

https://github.com/egberts/tls-ca-manage


👤 mcluck
TL;DR Quick way to create work-in-progress commits:

    #!/bin/bash
    # Perform a work-in-progress commit
    
    # Add everything
    git add $(git rev-parse --show-toplevel)
    
    # If the latest commit is already a WIP, amend it
    if [[ $(git show --pretty=format:%s -s HEAD) = "WIP" ]]; then
      git commit --amend --no-edit --no-verify
    else
      git commit -m "WIP" --no-verify
    fi
I wanted a way to quickly commit everything in a branch without thinking about it. This comes up a lot when I'm working on something and either need to pivot to something else or I want to pull down a PR and verify it without losing my work. I also wanted the option to quickly switch back to that branch, pick up where I left off, and be able to drop it again just as quickly without muddying up the commit history.

This script automatically stages everything and commits it as "WIP". If it detects that the most recent commit was a "WIP" then it amends the previous commit. No more weird stashing just to avoid losing my place


👤 adulakis
search history like "hist tsc"

  hist() {
    history | grep $1
  }

👤 ellis0n
I am using a simple python script to check Terabytes of files

https://github.com/web3cryptowallet/drive-py


👤 jph
I maintain git alias scripts, such as for shortcuts, metrics, and workflows.

https://github.com/gitalias/gitalias


👤 archi42
I lost the script (a bash function) when I changed jobs, but it was inspired by a co-worker (a rough reconstruction is sketched below):

> up

Does a `cd ..` on every keypress except ESC or space.

> up $n

Does a total of $n `cd ..` and (important!) sets OLDPWD to the initial directory for a proper `cd -`.
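
Something like this, perhaps (a sketch only; the original is lost, so the key handling is a guess):

  up() {
    local start="$PWD" key
    if [ -n "$1" ]; then
      local i
      for ((i = 0; i < $1; i++)); do
        builtin cd ..
      done
    else
      while read -rsn1 key; do
        case "$key" in
          $'\e'|' ') break ;;          # ESC or space stops climbing
          *) builtin cd .. && pwd ;;   # any other key goes up one level
        esac
      done
    fi
    OLDPWD="$start"                    # so `cd -` jumps back to the starting directory
  }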


👤 gmac
https://github.com/jawj/IKEv2-setup

Sets up an Ubuntu server as a strongSwan IKEv2 VPN.



👤 oweiler
Open GitLab MRs from the commandline

https://github.com/helpermethod/mr


👤 avestura
I keep some of my scripts and cheat sheets here:

https://avestura.dev/snippets


👤 woodruffw
Mine are mostly here: https://yossarian.net/snippets

👤 vermaden

👤 jbenner-radham
This is a really simple script that I use to save a few keystrokes when I'm querying a package.json from the CLI. It depends on jq. E.g., `pkg dependencies`, `pkg version`, etc.

  #!/usr/bin/env sh
  set -o errtrace; set -o errexit; set -o pipefail
  
  if [ -n "${1}" ]; then filter="${1}"; else filter=''; fi
  jq ."${filter}" package.json

👤 czhu12
This exact idea is what I'm trying to build https://trytoolkit.com for.

I've never promoted it, but I've been quietly using it myself to build stuff that I need. Obviously browser-based stuff has limitations, but I find I still get a lot done.


👤 ivanjermakov
> "fastily/autobots/ubuntu/scripts/bin/vomitSMART.sh"

What a descriptive name :D


👤 alsetmusic
A function for sorting contents of a directory by storage consumed (sorry for lack of comments throughout). I must admit, I’m particularly pleased with this one. (This is for MacOS):

#!/usr/bin/env bash

function szup() {

description='
#: Title: szup
#: Synopsis: sort all items within a directory according to size
#: Date: 2016-05-30
#: Version: 0.0.5
#: Options: -h | --help: print short usage info
#:        : -v | --version: print version number
'

funcname=$(echo "$description" | grep '^#: Title: ' | sed 's/#: Title: //g')
version=$(echo "$description" | grep '^#: Version: ' | sed 's/#: Version: //g')
updated="$(echo "$description" | grep '^#: Date: ' | sed 's/#: Date: //g')"

    function usage() {
        printf "\n%s\n" "$funcname : $version : $updated"
        printf "%s\n" ""
    }

    function sortdir() {
        Chars="$(printf "    %s" "inspecting " "$(pwd)" | wc -c)"
        divider=====================
        divider=$divider$divider$divider$divider
        format="    %-${Chars}.${Chars}s %35s\n"
        totalwidth="$(ls -1 | /usr/local/bin/gwc -L)"
        totalwidth=$(echo $totalwidth | grep -o [0-9]\\+)
        Chars=$(echo $Chars | grep -o [0-9]\\+)
        if [ "$totalwidth" -lt "$Chars" ]; then
            longestvar="$Chars"
        else
            longestvar="$totalwidth"
        fi
        shortervar=$(/Users/danyoung/bin/qc "$longestvar"*.8)
        shortervar=$(printf "%1.0f\n" "$shortervar")
        echo "$shortervar"
        printf "\n    %s\n" "inspecting $(pwd)"
        printf "    %$shortervar.${longestvar}s\n" "$divider"
        theOutput="$(du -hs "${theDir}"/* | gsort -hr)"
        Condensed="$(echo -n "$theOutput" | awk '{ print $1","$2 }')"
        unset arr
        declare -a arr
        arr=($(echo "$Condensed"))
        Count="$(echo "$(printf "%s\n" "${arr[@]}")" | wc -l)"
        Count=$((Count-1))
        for i in $(seq 1 $Count); do
            # each entry in arr is "size,path" (built by the awk step above)
            var1="$(echo "${arr[$i]}" | cut -d, -f1)"
            var2="$(echo "${arr[$i]}" | cut -d, -f2)"
            printf "   %5s    %-16s\n" "$var1" "${var2//\/*\//./}"
        done
        echo
    }

    case "$1" in
        -h|--help)
            usage
            return 0
            ;;
        *)
            :
            ;;
    esac

     if [ -z "$1" ]; then
             oldDir="$(pwd)"
             cd "${1}"
             local theDir="$(pwd)"
             sortdir
             cd "$oldDir"
             return 0
     else
            :
             oldDir="$(pwd)"
             cd "${1}"
             local theDir="$(pwd)"
             sortdir
             cd "$oldDir"
             return 0
     fi


}

👤 makobado
I do not understand

👤 nextaccountic
Here's my script for playing videos in a folder. The command is , (comma)

When you download a video from certain sites, the ctime is when you created the file (i.e. when you downloaded it), but the video also comes with a timestamp that gets saved as the mtime (I'm not sure why this happens; maybe there's an HTTP header for it?), and I presume it's the time the video was first uploaded to the site.
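
One quick way to see the two timestamps side by side (GNU stat syntax; the filename is just an example):

    $ stat -c 'mtime: %y   ctime: %z' some-video.mp4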

Here's a favorite of mine: all my scripts' -h simply show the source code

    $ cat $(which ,)
    #!/bin/sh
    
    #export DRI_PRIME=1
    
    cmd=mpv
    
    param='-fs --msg-level=all=no,cplayer=info'
    filter='/Playing/!d; s/^Playing: //'
    
    order='%C@' # by default, order by ctime
    sort=-n     # (which is a numeric sort)
    reverse=-r  # ... show newer videos first
    depth=      # ... and do it recursively
    loop=       # ... without looping
    
    [[ -n $MYT_MUTE ]] && set -- -u "$@"
    [[ -n $MYT_1 ]] && set -- -1 "$@"
    [[ -n $MYT_REC ]] && set -- -r "$@"
    
    while getopts 1rcmsaRnolugh o; do
        case "$o" in
        1) depth='-maxdepth 1';;  # just the current directory
        r) depth=;;               # recursively
    
            # a video uploaded in 2009 but downloaded in 2015 will have
            # mtime in 2009 and ctime in 2015
        #
        # (note: moving the video to another directory actually bumps
        # the ctime)
    
        c) order='%C@'; sort=-n;;    # order by ctime (download time)
        m) order='%T@'; sort=-n;;    # order by mtime (upload time)
    
        s) order='%s'; sort=-n;;     # order by size
        a) order='alpha'; sort=;;    # order lexicographically
        R) order='random'; sort=-n;; # order at random
    
        n) reverse=-r;; # newer first
        o) reverse=;;   # older first
    
            l) loop=--loop=inf;; # infinite loop
    
        u) mute=--mute=yes;; # no sound
    
        g) filter=; param=;;    # debug
    
        h) ${PAGER-less} "$0"; exit;;
        esac
    done
    shift $((OPTIND-1))
    
    find -L "$@" $depth -mindepth 0 \
         -not -path '*/\.*' \
         -type f \
         -name '*.*' \
         -printf "$order %p\0" \
        | awk 'BEGIN { RS="\0"; srand() } {
            if ($1 == "random")
              sub ($1, int(rand()*100000));
            printf "%s\0", $0
          }' \
        | sort -z $sort $reverse \
        | sed -zr 's/^[^ ]+ //' \
        | xargs -0 $cmd $loop $mute $param 2>&1 \
        | sed "$filter"
    
    
    #     -exec $cmd {} + | sed "$sed"

👤 sebastianconcpt
I'd add how I load a terminal environment for different profiles.

In my .zshrc I keep as little as possible so it opens fast, but I include commands that extend it (srcBlah) or let me tweak those extensions (editBlah):

For my pet projects I'd use these two:

   alias editSeb="code ~/.sebrc"
   alias srcSeb="source ~/.sebrc"
As you can see, editSeb opens VSCode and srcSeb sources it in the current terminal.

   .sebrc

   # Download video from the given YouTube URL
   function ytdl() {
     youtube-dl -x --audio-format mp3 --prefer-ffmpeg $1
   }

   # Download audio from the given YouTube URL
   function ytmp3() {
     ytdl $1 | ffmpeg -i pipe:0 -b:a 320K -vn $2.mp3
   }

   # Shows total size of the given directory at $1
   function dus() {
     du -h -d 1 $1
   }

   # Used to opt-out of pre-commit autofixes
   export NO_COMMIT_CHECKS=true

   function cleanUSB() {
     volumeName=$1
     subdir=$2
     if [[ "$volumeName" != "" ]] && [[ "$subdir" = "" ]]; then
       rm -rfv /Volumes/$volumeName/.DS_Store
       rm -rfv /Volumes/$volumeName/.Spotlight-V100
       rm -rfv /Volumes/$volumeName/.fseventsd
       rm -rfv /Volumes/$volumeName/.Trashes
       rm -rfv /Volumes/$volumeName/._\*
       echo "Volume $volumeName is clean"
     elif [[ "$volumeName" != "" ]] && [[ "$subdir" != "" ]]; then
       rm -rfv /Volumes/$volumeName/$subdir/.DS_Store
       rm -rfv /Volumes/$volumeName/$subdir/.Spotlight-V100
       rm -rfv /Volumes/$volumeName/$subdir/.fseventsd
       rm -rfv /Volumes/$volumeName/$subdir/.Trashes
       rm -rfv /Volumes/$volumeName/$subdir/._\*
       echo "Volume $volumeName/$subdir is clean"
     else
       echo "No volume name given. Nothing to do."
     fi
   }

   function blogBackup() {
     rsync -avzh --progress -e ssh root@seb-nyc1-01:/root/blog/db /Users/seb/Documents/blog
   }

   alias showHiddenFiles='defaults write com.apple.finder AppleShowAllFiles YES; killall Finder /System/Library/CoreServices/Finder.app'
   alias hideHiddenFiles='defaults write com.apple.finder AppleShowAllFiles NO; killall Finder /System/Library/CoreServices/Finder.app'

   # Show ports currently listening
   function openPorts() {
     netstat -p tcp -van | grep '^Proto\|LISTEN'
   }

   # Create a RAM disk on macOS
   function ramDisk() {
     # https://eshop.macsales.com/blog/46348-how-to-create-and-use-a-ram-disk-with-your-mac-warnings-included/
     # 2048 = 1MB
     # 2097152 = 1G
     quantityOfBlocks=2097152
     diskutil erasevolume HFS+ "RAMDisk" `hdiutil attach -nomount ram://${quantityOfBlocks}`
   }

   # Tauri watcher for source file changes will not stop automatically.
   function killRollup() {
     ps aux | grep node | grep rollup | awk '{print  $2;}' | xargs kill -9 $1 
   }

   # X pet project required env var
   export X_TOKEN=blahValue

   function dockerCleanAll() {
       docker stop $(docker ps -aq)
       docker rm $(docker ps -aq)
       docker rmi $(docker images -q) -f
   }

   function dockerCleanVolumes() {
       docker volume rm $(docker volume ls -qf dangling=true)
   }

   alias ll='ls -lah'
   alias gg='git status -s'

   # Creates a timestamped backup of the current branch:
   alias gbk='git checkout -b "backup-$(git symbolic-ref -q HEAD --short)-$(date +%Y-%m-%d-%H.%M.%S)" && git checkout -'

👤 jraph
list:

    #!/usr/bin/env sh
    
    find | grep -- "$1"
If you are searching for a Python or Java package/class, it will work because the dots in the name mean "any character" to grep, so they also match the slashes in its path.
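
For example (the paths are hypothetical):

    $ list com.example.Foo
    ./src/main/java/com/example/Foo.java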

oneline:

    #!/usr/bin/env sh
    tr '\n' ' '; echo
Puts anything you give it on standard input onto a single line.

L, my journaling tool (whenever I need to get something out of my head or be sure to find it later); I can edit and fix stuff by editing the file it generates after the fact:

    #!/bin/sh
    set -e

    CONFIG_FILE="${HOME}/.config/Ljournalrc";

    if [ ! -f "${CONFIG_FILE}" ]; then
        mkdir -p "$(dirname "$CONFIG_FILE")"
        printf 'JOURNAL_FILE="${HOME}/Documents/journal.txt"\n' >> "${CONFIG_FILE}"
        printf 'VIEWER=less\n'                                  >> "${CONFIG_FILE}"
        printf 'LESS='"'"'-~ -e +G'"'"'\n'                      >> "${CONFIG_FILE}"
    fi

    L=$(basename $0)

    usage() {
        cat <> ${JOURNAL_FILE}
    fi

    printf "%s\n" "$msg" >> ${JOURNAL_FILE}