1: https://gist.github.com/adewes/c9b2a71457c6c6f01f2f
2: https://gist.github.com/adewes/02e8a1f662d100a7ed80627801d0a...
3: https://gist.github.com/adewes/7a4c20a5a7379e19d78ba54521d3d...
I'm still super proud of that because it was so much fun to create. I recall "releasing" the program to a handful of friends, and they all seemed to like it, but it never really went anywhere.
I'd never written a line of code in my life, so going through the iterative process of getting it functioning and fixing bugs was pretty invigorating. I could see where the generated code was going awry and describe it in natural language, and ChatGPT would turn that into syntax and explain why it worked. Definitely a fun way to learn.
I'm sure it's ugly/inefficient, but here's the script:
mkdir ~/.marks/
export CDPATH=.:~/.marks/
function mark { ln -sr "$(pwd)" ~/.marks/"$1"; }
Then: mark @name # add bookmark called @name for the current directory
cd @name # jump to the bookmarked location
cd @ # list all available bookmarks
It can list bookmarks, auto-complete, jump to sub-directories within bookmarks, all without introducing any new commands - just `cd`.
[1]: http://karolis.koncevicius.lt/posts/fast_navigation_in_the_c...
To my knowledge, this was the first "site-specific" browser extension, so I wrote a blog post about it:
https://www.holovaty.com/writing/all-music-guide/
Aaron Boodman saw my post and decided to generalize the idea; he created Greasemonkey, which was a single browser extension that could aggregate site-specific JS customizations. In the years since, this idea of "user scripts" has further developed, and I think some browsers even support the concept natively.
#!/bin/bash
function todo {
    if [ $# -eq 0 ]; then
        vim ~/Dropbox/todo/todo_list.txt
    else
        vim "$HOME/Dropbox/todo/todo_list_$1.txt"
    fi
}
Sourcing this in my shell means I just have to type... todo
To automatically get into my default todo list, which is synced to Dropbox (and thus accessible everywhere on mobile as well). Meanwhile, whenever I have a new project or similar come up, I can just
todo new_project_name
And get a new, nice clean file, with the same backups and multi-device accessibility.
Before I could register I could see most classes I wanted to take had plenty of availability except for one which was almost full. That class would make my schedule perfect (no classes before 10am, everything ending before 4PM, good teachers), so I was ready the moment my group got access. I logged into the janky Java applet and saw that my desired class was full. Dismayed, but not giving up, I refreshed the class list to see if someone might drop during the drop-add week. Nope, still full, but there was still time for someone to change their mind and for me to slip in. I was going to need an edge.
Another peculiarity of this system was that it was definitely a weird Java app based on some type of mainframe. It must have been around for a decade at least. One of the signs of its age was that you had a limited amount of time per day during which you could be logged into the system, something like 90 minutes a day.
So I did what any good hacker would: I made _really_ good use of that limited time. I wrote an AutoHotkey script which would automatically log in, traverse all the menus and attempt to sign me up for that class. It was much faster than a human, so each run barely put a dent in my logged-in time allotment. As a cherry on top, I wired it up to a python script to push notify me if it succeeded.
After letting it run for two days, some false alarms, and tweaking, I got a delightful "You're registered!" notification on my phone and found that it had successfully gotten me the class I wanted. I'm still chasing that high to this day.
#!/bin/sh
# Rotate the input by a random fraction of a degree in a random direction,
# stretch the levels, add noise, and convert to grayscale (so a clean
# document ends up looking like a scanned copy).
ROTATION=$(shuf -n 1 -e '-' '')$(shuf -n 1 -e $(seq 0.05 .5))
convert -density 150 "$1" \
    -linear-stretch '1.5%x2%' \
    -rotate "${ROTATION}" \
    -attenuate '0.01' \
    +noise Multiplicative \
    -colorspace 'gray' "$2"
We had great AP density, so by the 2nd night I started working on an Expect script that would log in and boot 1 AP at a time per floor-building combo (multiple areas simultaneously), wait for them to come back, wait about 30 seconds, and move to the next AP on each floor. At first I did a dry run where the reboot was commented out and the "is it back yet" was delayed by a short timer to simulate a reboot delay + make sure the script would really run without a login timeout or something stupid.

This would take a couple hours, but it was better than taking everything down, as clients under the specific AP being cycled would just roam and have a weaker signal for a minute (assuming the system had been booted the night before). We ran it the 3rd night in place of the mass reboot and watched it carefully. It went great so we left it to run every night.

We changed the nightly notice from an outage to a potential outage and stopped getting complaints because, except for a particularly unlucky laptop in a corner for 1 minute, nothing was really out of the ordinary. At this point we knew we were buying a brand new wireless system that fiscal year (already in the budget) so we ended up just riding the year out with this script.
In all it was a lame reboot script shorter than the table containing the lists of APs to boot 1 by 1 but it probably had more real impact than any fancier scripts/tools I made while I was there.
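Roughly, the shape of it in bash over ssh (the real thing was an Expect script against the controller, and every name below is made up):

#!/usr/bin/env bash
# One AP list per floor/building combo; the real table was much longer.
APS_B1_F1=(ap-b1-f1-01 ap-b1-f1-02 ap-b1-f1-03)

reboot_and_wait() {
    local ap="$1"
    ssh admin@wlan-controller "reboot ap $ap"        # hypothetical controller command
    sleep 30                                         # give it a moment to actually go down
    until ping -c 1 -W 2 "$ap" >/dev/null 2>&1; do   # wait for it to come back
        sleep 5
    done
    sleep 30                                         # settle time before the next AP
}

for ap in "${APS_B1_F1[@]}"; do
    reboot_and_wait "$ap"
done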
yt-dlp --download-archive archive.txt --no-overwrites -o '%(playlist_index)s - (%(uploader)s) %(title)s.%(ext)s' --format bestvideo+bestaudio --yes-playlist --ignore-errors --write-info-json [PLAYLIST_URL]
Edit: if you use this, note that you should probably sort your playlist by Date added (oldest), otherwise the playlist_index doesn't work correctly. Or you can just leave out playlist_index from the filename if you don't care about it.
Another one was a password rotation script for my university. There was an idiotic password policy that required you to change your password every 6 months via a web form, and if you tried to change it to a password you had used before you would get an error saying that the new password couldn't match the last 24 passwords you had used. This information was very useful as it allowed me to write a script that simply changed my password 24 times and then changed it back to the one I wanted.
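Roughly like this with curl (the endpoint and field names below are invented; the real form was whatever the university's portal exposed):

#!/usr/bin/env bash
# Burn through 24 throwaway passwords, then set the real one back.
real="$REAL_PASSWORD"
current="$real"
for i in $(seq 1 24); do
    next="throwaway-password-$i"
    curl -s -X POST https://accounts.example.edu/change-password \
        -d "old_password=$current" -d "new_password=$next" -d "confirm_password=$next"
    current="$next"
done
# 25th change: back to the password we actually want to keep.
curl -s -X POST https://accounts.example.edu/change-password \
    -d "old_password=$current" -d "new_password=$real" -d "confirm_password=$real"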
But I started accumulating so many of them, it got to be a problem just remembering all the ones I had and how to trigger them. I had to start finding new ways of keeping them all organized. That led to figuring out two things that I'm still really proud of and use every day: custom right-click menus and Emacs-style keychords.
The custom right-click menu is: any time I right-click while also holding down Ctrl or Alt, it opens a right-click dialogue run by AutoHotkey, not the default one in the given context. That lets me visually select what I want to happen rather than needing an arcane key combo for every thing. All of my frequently-used Regex functions are right-click menus now.
The key chords are: if I trigger a "prep" hotkey first—my most-used one is "Alt+G" for "Go-to"—it sets up a keyboard hook that listens for the next key and then dispatches to a function based on that. So, "Alt+G H" would be really easy to remember in my head as "Go to Hacker news." And it pops open a new tab in my browser. I use that one a lot for opening desktop applications—"Alt+O C" maps in my head to "Open vs Code." Super easy to remember.
I've automated 80% of the most common navigation items at my job this way. My coworkers have seen me do it and stopped me to ask, "Wait, tell me how you did that." It looks and feels like magic.
Over the years, I've made it more complex. I added a simple menu system using text-to-speech and accepting the DTMF tones for the selection. This way I could tell people to enter a secret PIN and they could provision themselves access to my lobby. If they didn't know the secret PIN, there was an option to have it patch through to my phone so I could screen the guest using normal voice and I could manually press 9 on my cell phone.
It also has an SMS interface where I can select different modes and hours of operation. So when I host a poker night, I'd txt my bot "poker" and it would update the menu system to be poker themed, so that when guests were at the callbox they'd be greeted with poker puns. When my friend comes to visit from Austin, I'd txt "StarCraft" and he'd get to hear some dumb SC2 puns. I have 20+ different modes now for various occasions over the years. Now when people visit they get their phones out and record the greeting, half-expecting something they'll want to remember or share on social media.
Sadly, my building just replaced their callbox with a newer model that does not accept secondary input. So once the call is connected to Twilio, if the user touches any button on the callbox (e.g. to enter a secret PIN), it will disconnect the call. I suppose it's now time to use some voice-to-text options to bring back the interactivity, but I suspect the lag would make the experience more frustrating than fun.
When on, it would make data noises at a reasonable volume indicating the current bandwidth usage. If a download got stalled, it was obvious even if I didn’t have that window in the foreground, or wasn’t even sitting at the computer.
Also, different kinds of data had different sounds and cadences, and I got used to many of them. It was fun to hear an email coming in before my sound card officially announced it.
[1] is the reason I no longer need a taskbar/dock. It's a sort of framework you can use to build a muscle-memory-friendly window switcher in Linux, similar to pinning windows to the taskbar in Windows -- map Win+1 to terminal, Win+2 to VS Code, etc. If the app isn't running, it launches it; otherwise it switches to it. (I wrote this because I couldn't find a consistent and foolproof way to do this in Linux, and if you switch window managers you have to start from scratch.)
(For better ergonomics, if the keyboard I'm using doesn't have a win/meta key on the right, I rebind right control to win/meta)
[2] is a timestamped log file for you, a human. Set up a cron job to launch it every hour to note what you were working on, or append lines from the terminal whenever you're chewing on a hard problem. Then, use the data to build an accurate picture of what you worked on during the last week, or grep last quarter's log to find out why you decided to use library A instead of library B. I have used this since 2018 and frequently go back to gain insight into the reasons for old decisions.
1: https://gist.github.com/cbd32/cbec9a32b32bd9e93b0d2696c71b5f...
2: https://gist.github.com/cbd32/f1ee2967ec0181b934639c30f4e68f...
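The core of [2] is just appending a timestamped line; a minimal version of that part looks something like this (the real script in the gist does a bit more):

# Append a timestamped line to a per-user work log.
worklog() {
    printf '%s\t%s\n' "$(date '+%F %T')" "$*" >> "$HOME/worklog.txt"
}
# from the terminal:  worklog "chose library A over B because of the async API"
# from cron, put the same one-liner in a small script (e.g. ~/bin/worklog) and
# schedule it hourly:  0 * * * * ~/bin/worklog "hourly check-in"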
Which was the same one they used at my university's cafeteria. I managed to plagiarise the receipt we got when ordering food (which you needed to show the kitchen to get a plate of food) and was thus able to eat for free for the rest of my enrollment.
Not even my professors could teach me to do GUIs in Java but this thing sure was ‘Swinging’
To take advantage of this, I wrote a script that tracked our shipments from the previous 12-months and identified which parcels were late. It then lodged a new inquiry through Australia Post's website. I received a call from the head of Australia Post's customer service team, telling me I was responsible for 90% of their volume. Apparently, what was causing delays was the need to wait for us to select "get money back" or "prepaid satchel", and she asked if they could do one or the other for all inquiries. This would have impacted the queue size (guessing one of their KPIs).
In the end, we received several thousand dollars' worth of postage refunds.
Built a microcontroller driving a normally-closed relay to interrupt the power for 1 minute out of every 120. No more cold mornings. I later added temperature sensing, smarter resetting based on temperature (to prevent power cycling a working boiler), and a webserver to it: https://imgur.io/a/VM7nD74 (the graph is an SVG; everything was generated on the local ESP8266)
Made a bunch of Hammerspoon plugins for macOS:
- https://github.com/dbalatero/SkyRocket.spoon - mod+click to resize/drag windows
- https://github.com/dbalatero/VimMode.spoon - Vim mode everywhere
- https://github.com/dbalatero/HyperKey.spoon - pop-up menu for when you forget your keybinds (like neovim's which-key or spacemacs)
# Sort list by most common terms
alias sorn='sort | uniq -c | sort -n'
# Most common IP addresses:
$ cat access_log | awk '{ print $2 }' | sorn
# Last 30 modified files.
function new() {
ls -lat "$@" | head -30 ;
}
# I just downloaded something, where is it?
$ new ~/Downloads
I looked for a way to do that. Browser cookies seemed like a possibility, until I learned that one site couldn't read another site's cookies. When I asked people if there was any workaround I was told that there wasn't. But then I figured out how to use them to track people anyway. That hack is my answer to the OP.
I filed a patent in 1995 that showed how to do it. The patent tied that technique to my specific mathematical technique for selecting the ads the user would most likely be interested in, so it could not be used to restrict other people from using cookies that way as long as they didn't also use my specific math. The patent also contained a lot of features about how to give users complete control over their information and that protected their privacy. (That part didn't get implemented much!)
I eventually sold the patent to DoubleClick, and DoubleClick was eventually sold to Google for an amount I recall to be in excess of $1 billion. By then I'd seen a marketing site recommend that Google buy DoubleClick, with my patent listed as one of the reasons they should. DoubleClick had their own patent describing the tracking cookie, but their priority date was a couple months later than mine. And it didn't have the privacy features mine did, or the same degree of tech about how to actually select ads.
In defending against a patent troll in 2021, Google and Twitter together filed a joint brief calling the tracking cookie "Robinson's Cookie". My name is Gary Robinson. (In case anyone is curious there's more info at https://www.garyrobinson.net/2021/07/did-i-invent-browser-co....)
https://gist.github.com/michaelterryio/478f5b93111f5f5fc57c0...
To practice my marketing copy, I wrote a not-for-profit one page site about how I use it:
I didn't feel like doing that deep of a dive to determine why it sometimes sees the NVMe drive, so instead I wrote a script [0] that uses IPMI to power cycle it until it can get an ssh connection. Originally I was sending a magic packet, but realized that was only a one-shot, so I switched to calling the IPMI executable via subprocess. No idea why I left the other stuff in there.
Anyway, this has reliably worked for well over a year at this point, dutifully restarting the server until it comes up. Some days it takes 1 attempt, some days it takes 5, but it always comes up.
[0]: https://gist.github.com/stephanGarland/93c02385e344d8b338aab...
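The gist is Python, but the loop amounts to roughly this (host names and credentials are placeholders):

#!/usr/bin/env bash
# Keep power-cycling the box over IPMI until ssh answers.
until ssh -o ConnectTimeout=5 -o BatchMode=yes root@server true 2>/dev/null; do
    ipmitool -I lanplus -H server-bmc -U admin -P "$IPMI_PASSWORD" chassis power cycle
    sleep 300   # give it time to POST and (hopefully) find the NVMe drive
done
echo "server is up"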
$ ch ls -vx
ls - list directory contents
-v natural sort of (version) numbers within text
-x list entries by lines instead of by columns
Script to provide `cut`-like syntax for a few `awk` features: https://github.com/learnbyexample/regexp-cut
$ echo ' one two three' | rcut -f2
two
$ echo 'sample42string123with777numbers89' | rcut -d'[0-9]+' -f3
with
As a young person, I was on a trip to visit another ISP to learn how a VPN would work between our two companies. Their administrator, a wonderful person with 30 years of experience in the telco-turned-ISP world, had all these incredible aliases on her workstation for managing her fleet. But on the new vendor's system, a fresh install of the same OS left her almost unable to perform.
My jobs have all had the same opportunity for the same experience, and I've tried to leave it alone. I spend maybe 2/3 of my time logged into someone else's server, and I don't doubt myself, even knowing full well it's a personal preference above all.
Something like the following, then just copy it in a shell script, chmod +x it and execute it:
select 'curl -XPOST localhost:9090/apis/some-service/' || p.technical_name || '/locale/' || l.full_locale
- Created a folder called "shortcuts"
- Copy the folder URL into the system path (Control Panel, Environment variables something something)
- Create a shortcut of anything I want to hyperlink to and put in there, with descriptive names
- In IE, I think the setting was to enable autocomplete. For some reason that was linked to the autocomplete of the run menu item in Windows.
- For extra bonus points, add a shortcut of the shortcuts folder and put in the folder.
Result is that you can hit Win-R, type e.g. "sho" and it would autocomplete "shortcuts", or "ti" for "timesheet" etc etc. Way faster than clicking around for icons, just a half-second until your oft-used app/spreadsheet/folder/website is loading.
Question: how can I do this on Mac?
I wrote a script on top of Puppeteer that would be called from crontab on my home server, to go to the reservation page, press the buttons to flip through the calendar and extract days with vacancy (the reservation app is an SPA). When there were new vacancies it would send me a message through one of the messaging APIs about new dates available, so I could see if it aligns with our schedule and book immediately from the phone wherever I am.
That actually worked like a charm, and I snatched a very convenient reservation after a few days.
It auto-archives any emails older than 2 days (you can change this obviously) unless they're starred - this changes my email experience in two ways:
1) My email inbox ONLY has emails that are either new or that I decided were relevant. I have two days to notice this.
2) When I'm scanning my email, if I see something important that I don't have time to deal with, I can star it and keep moving and know that it'll still be visible and in a manageable-size queue later.
When I travel off-grid it is a bit of a pain since I have to go through a lot of archives, but that can be addressed by just turning it off.
I also ruthlessly unsubscribe to things I've stopped reading to keep everything manageable.
# full terminal page of \n
pgdown () {
printf '\n%.0s' $(eval echo {1..$(( $(tput lines) - 1 ))})
}
# clear terminal and TMUX history and get to the bottom line
c () {
echo -en "\ec"; pgdown
if [[ -n $TMUX_PANE ]]; then
tmux clear-history -t $TMUX_PANE
fi
}
# force MOTD display in a new ssh session
alias sshmotd='ssh -o SetEnv=SSH_MOTD=1'
if command -v tmux 1> /dev/null; then
# open a new tmux session named default or attach to it
tmuxa () {
systemd-run -q --scope --user tmux new-session -A -s default "$@"
exit
}
# open a new tmux session named ssh_tmux or attach to it
alias tmuxs='tmux new-session -A -s ssh_tmux && exit'
# run tmux if TMUX variable is null locally or in an ssh session
if [[ "${TERM-}" != tmux* ]] && [[ -z "${TMUX}" ]] && [[ -z "${SSH_CONNECTION}" ]]; then
tmuxa
elif [[ -z "${TMUX}" ]] && [[ -n "${SSH_CONNECTION}" ]]; then
tmuxs
fi
fi
# run MOTD at login only in an ssh session
if [[ -n "${TMUX}" ]] && [[ -n "${SSH_CONNECTION}" ]] && [[ "${SSH_MOTD-}" == 1 ]] && [[ -f "/etc/update-motd.d/00-motdfetch" ]]; then
motd
else
pgdown
fi
Dunno if I'm proud of it, but I sold a bunch of stuff, most of which I'd bought by having other scrapers on other sites alert me when ads meeting my criteria were posted.
I automated the renewal process using TF and integrated it as much as possible with other services. I then integrated it into a CI/CD pipeline. It took DAYS to do these renewals (ironically of course, $30 per year for a fixed certificate would have been cheaper than the people time but #startuplife). It became a 30 minute process. It only took me two days of interrupted time to build this solution and get it working.
Still going after two years so it’s already saved about 900% on the labour costs.
The hotel website itself is inordinately difficult to search more than 1-2 nights (each day takes 6-8 clicks to search), but it’s all queryable through an undocumented API, so I wrote a small CLI tool in Crystal that can scan the next 60-90 days (configurable, of course) to see if there have been any sudden openings.
It does it all in parallel, so I can find out within about 10 seconds if there are any rooms available within a 30 (or longer) day range.
It’s already helped book one wedding anniversary trip and one special getaway for 2 friends. I don’t use it often, but it’s wonderful to have around.
https://github.com/WantGuns/auto_meet
it does the bare minimum of:
- parsing calendar events (which have meet links)
- opening zoom meets at the lecture's starting time
- exiting meet meets at the lecture's ending time
- shutting down the machine when all the meetings are over
my sleep schedule was f-ed up to say the least. these scripts helped me maintain my attendance :)
AFAIK it doesn’t have an option to check a standalone CSS file.
I wrote a script to insert a CSS file into a dummy one-line HTML file, so that I can pass a CSS file to the script, and when errors are generated from the CSS, they are given using the correct line number.
I have written many more complex and perhaps more interesting scripts, but this is the one I am, by far, the most proud of.
- Borg to Restic
- Borg to Kopia
They work quite well, though Restic and Kopia lack the ability to change the root directory of the backup, so if your script extracts to a unique temporary directory for each snapshot you get some confusing output about no files being cached (Kopia), even though it technically deduplicates them quite well. (I suspect Borg doesn't have that ability either.)
Haven't published them yet but I think I will sometime this year. If somebody pings me I'm very likely to work on them a bit more and publish them though.
drag () {
dragon --and-exit --on-top "$@"
}
I write a lot of extremely simple but handy shell functions. This one lets me drag-and-drop things out of a terminal session (kind of) into applications with https://github.com/mwh/dragon and I use it way too often!
tic mark and top row number 2 sends a double mouse click.
It helps reduce strain on my right hand mouse clicking finger.
Isn’t that spamming? The bot answers ads that you haven’t even seen and that you might reject based on description/photos. There could be other people genuinely interested in these ads who will have a harder time contacting the landlord.
#!/bin/bash
choice=$(echo -e 'pt -> es\nen -> es\nes -> pt\nes -> en\npt -> en' | dmenu)
text_to_translate=$(echo '> what to translate?' | dmenu)
[ "$choice" = 'pt -> es' ] && res=$(trans -no-warn --brief pt:es "$text_to_translate")
[ "$choice" = 'en -> es' ] && res=$(trans -no-warn --brief en:es "$text_to_translate")
[ "$choice" = 'es -> pt' ] && res=$(trans -no-warn --brief es:pt "$text_to_translate")
[ "$choice" = 'es -> en' ] && res=$(trans -no-warn --brief es:en "$text_to_translate")
[ "$choice" = 'pt -> en' ] && res=$(trans -no-warn --brief pt:en "$text_to_translate")
notify-send "tradução" "$res"
echo "$res" | xclip -selection clipboard
To automate this process, we have an Ansible playbook that handles most of it: give it a query and you'll get a bunch of files with the results.
While useful, the setup and cleanup can get tedious. So, I whipped up a little shell script that creates a temporary directory and symlinks it to a location somewhere in $XDG_STATE_HOME. I created a few subcommands to create the directory structure, add a query, run the query, aggregate results and destroy the state. This keeps things simple for me. Even if I forget to clean up after myself, a reboot will make sure nothing is left behind. The symlink trick to XDG state home helps me a lot to keep track of my state while I work on a certain problem. It allows me to easily handle multiple requests at the same time, when necessary.
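The create-and-register part is roughly this (paths are illustrative, not my exact layout):

# Create a throwaway working directory and register it under $XDG_STATE_HOME
# so it can be found again by name; a reboot clears the real data in /tmp.
workdir_new() {
    local name="$1" state="${XDG_STATE_HOME:-$HOME/.local/state}/workdirs"
    local dir
    dir=$(mktemp -d) || return 1
    mkdir -p "$state"
    ln -sfn "$dir" "$state/$name"
    printf '%s\n' "$dir"
}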
https://github.com/Tade0/emergency-poncho
There are many like it, but this one is mine.
Useful when you're a front-end developer and the backend app is not easily deployable locally and the test environment is down.
Also with it you can make a blazing-fast, browsable snapshot of JIRA.
It does two things: 1. Lists the files in order of last modified or last opened. This way I can list the 25 latest files and go to the ones opened by me (no folders needed, just the last opened ones).
2. With the last modified date, it tells me when the file was last changed (useful if you collect data in sheets from forms or other stuff; it tells you when you have a new lead). It also tells you whether the colleague you share a file with has edited it or not.
I wanted to enhance it by adding more stuff, like showing a diff so I know what has changed, and then showing the content from a doc right there in the sidebar, because it's easier to type a mail while looking at that.
I use this now:
import { execSync } from 'child_process';
execSync('git add -A');
const modifiedFiles = execSync('git status --porcelain')
.toString()
.trim()
.split('\n')
.map((line) => line.replace(/^\s*[MADRCU?!]{1,2}\s+/, ''));
const fileList = modifiedFiles.join(', ');
const commitMessage = `Change: ${fileList}`;
execSync(`git commit -am '${commitMessage}'`);
"changes:summary": "tsx ./tools/changes-summary.ts",
After a bit of experimentation I determined that it was necessary to verify the existence of a given index entry at a given level, and that if it existed, one could add additional entries --- if it didn't exist, then it was necessary to create the level above as an index entry, and then it was possible to add the new one.
I probably have the code somewhere, but haven't done any Applescripting for a while now (kind of miss it, Powershell is nowhere as comfortable, and there doesn't exist an equivalent to Application Dictionaries).
Finding an appointment is notoriously difficult, and people often depend on the results of those appointments to start their life here. So I built a UI for the script and put it online. Not just code on GitHub but a live website everyone can use. It uses websockets to wait for appointments from the script running on a server.
It's been live for almost a year now. Thousands of people got an appointment that way. I got the city's blessing to keep it online, albeit with a restrictive poll rate.
It's really cool to see people recommend the tool to each other, and to work directly with the city.
When I return from vacation, I want to ask the city to extend the tool, and also translate it to other languages.
Solution: I used Home Assistant and Frigate (local NVR) to detect whether there are humans present in the room; if there are no humans for 10 minutes, switch off everything. I actually use a combination of things to detect human presence, including a motion sensor, PC activity, and the user sleeping (10 PM to 8 AM); there’s also a switch I use to turn on sleep mode if I’m taking an afternoon nap (the switch automatically turns off after 3 hours). I also use a template to check whether the AC temperature is below 22; if it is, I set it to 22 degrees Celsius. The automation above saved me 15-20% on my electricity bill.
At analysis, files containing multiple sweeps were averaged to reduce random noise (YYMMDDXX.Ann files), then these files were exported as a CSV-like series of time-voltage pairs (YYMMDDXX.Tnn files). Tables containing the calculated variables were also generated and exported (YYMMDD.Nnn files). The fun part is that these files had to be named MANUALLY. Each and every one of them. I can't stress hard enough how repetitive this got... We generated a double-digit number of files every day, and analyzing each file thoroughly required around 30-50 keypresses to move around in the menu and to name the files. Lucky for me, no mouse use was required, and keypresses could at least be automated.
I used DOSBox on Debian to do the analysis, and I ended up creating a bash script that could automatically analyze whole folders of these files in a few minutes. To achieve this, I generated xmacro files that would be played back while the DOSBox window was open. Opening the file was also put in these xmacro files. The generation of the files was wrapped inside a bash script that kept track both of the files in the folder and of the files generated by the analysis. If a file was supposed to be there but something broke inside DOSBox, it would just stop playing the macro for the next file, so it could be restarted relatively easily.
A few months later, I met the guy who wrote the software for our team, and asked him if he could write us a script to unpack the binaries into CSVs. From there, I could come up with my own completely automated solution for analysis, and everything was much-much faster. I also showed him the macro-monster I created. I'm still not sure if he was amazed or he just thought that I was impatient.
A virus that turned files and folders on removable drives into hidden system files was everywhere, and antivirus programs just didn't help with it. Friends and co-workers would constantly bombard me for help recovering their files. When I figured out essentially how the pesky virus worked (this was before the variants that made registry edits), I coded something to do the opposite in a batch script. I set it to detect when a portable drive was added; it would then delete the virus and the .lnk shortcut files it created, then unhide the folders and files and return them to normal.
Nothing fancy, but it did influence my career direction a lot.
I'm medical and a hobbyist, not technical, so I was happy to figure this out in bash with some trial and error. I scan the fronts, it gives me 15 seconds to put the backs in the way they came out, and it deinterlaces after with deskew, OCR, etc. I used it to shred most of my filing cabinet and now scan bills/letters/receipts/etc. A cron job backs up paperless-ngx offsite using borg. For me it's just slick, my spouse uses it independently (!), and it helped my bookkeeping a lot.
I think it was a Perl script connected to a named pipe or similar, so whenever my mutt client read it I got a new sig.
It generated 4 lines of an ASCII art underwater scene, containing a randomised selection of different fish, sharks, boats, seaweed, crabs, rocks, shells, treasure chests, etc.
There was a bunch of code to ensure no overlap, and more natural looking scenes.
There were also certain things that only triggered on certain special days of the year (Christmas etc.), and others that only appeared in conversations with people I'd exchanged more than N emails with.
Sadly it went away when my email client moved to be primarily web based.
I ran ping very often and needed to see when exactly the connection dropped.
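The idea boils down to something like this (just an illustration, not the original script):

# Prefix every line of ping output with a wall-clock timestamp.
ping 8.8.8.8 | while IFS= read -r line; do
    printf '%s  %s\n' "$(date '+%F %T')" "$line"
done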
mkfifo f;echo \0>f&xargs -E, -a<(cat f) -I{} bash -c "wget http://api.us.socrata.com/api/catalog/v1?scroll_id={} -O-|jq -r '.results[]|.resource.id+"\","\"+.metadata.domain+"\"/api/views/\""+.resource.id+"\"/rows.csv\""'|tee <(cut -d, -f2|xargs wget -P'{}' --content-disposition)|sed q"|tee f
alias f='files=$(cat ~/.fzfdirs | fzf) && cd "${files}"'
I type f and it fuzzy find the directory that I have added to the ~/.fzfdirs file and cd into it.
If I want to add a dir, I append the directory into the file or `pwd >> ~/.fzfdirs` inside the wanted directory.
I find it simple and efficient. It gives me an amazing boost of productivity for moving across directories.
With 3-4 keystrokes I get inside any common directory on my box.
fzf is so amazing.
Sometimes on another computer I would think "what was that URL again..?" and just SSH into that computer and execute the command.
[1]: https://github.com/bergheim/dotfiles/blob/a525b216c7404a4332...
#!/usr/bin/env bash
set -o errexit
set -o nounset
usage(){
printf "[TRACE=...] %s [-b /path/to/clones/root] (%s/w) -- \n" "${0##*/}" "$HOME"
}
if [[ -n "${TRACE-}" ]]; then
set -o xtrace
fi
trap 'set +o xtrace' ALRM HUP INT TERM EXIT
: "${GIT_BASE:=$HOME}"
while getopts b:h OPT; do
case "$OPT" in
b) GIT_BASE="$OPTARG";;
h) usage; exit 0;;
*) ;;
esac
done
shift $(( OPTIND - 1 ))
while read -r REPO; do
[[ -d "$REPO"/.git ]] || continue
[[ "$REPO" =~ \.terraform\/modules ]] && continue
tput setaf 8
printf "%s … %s\n" "$REPO" "$( git -C "$REPO" config --local --get remote.origin.url )"
tput sgr0
if [[ $# -eq 0 ]]; then
git -C "$REPO" status -sb
else
git -C "$REPO" "$@"
fi
printf "\n"
done < <( fd --one-file-system --type d . "$GIT_BASE" )
2: Auto restart ADSL modem on connectivity loss: https://gist.github.com/TalalMash/1fdb6363b46575d05530c6f654...
A couple of PowerShell scripts that move multi-gigabyte files from the computer that generates them to an intermediate server; another computer then picks those up and puts them onto whatever storage space is available, squeezing them in via a naive knapsack algorithm. It shaved about two hours per day off my workload, instead of manually copying files around the network and then monitoring them to make sure they got where they were meant to go.
A couple of Python scripts that tell a number of SONY video cameras to wake up, pull in the video and audio over USB, verify the files transferred correctly, then erase them from the camera to make space on the SD card.
A simple voice-controlled personal assistant that can start & stop multiple video cameras, perform the "mechanical clapper" effect for each scene and take, label the video files, and read out the prompt for the scene about to be shot.
A simple C# stopwatch timer for Windows that logs my daily walks via a touch screen computer located near the front door, calculating speed & distance for that walk, total daily time, etc. Hooks in to a couple of security cameras to track me as I walk past the house on my laps, recognize that it is me, and time stamp my passing (by the camera).
A simple Python script that pulled a whole bunch of data out of PDFs for a research project. It saved the small research team multiple thousands of dollars that they couldn't get a research grant for, and was able to then save the department multiple tens of thousands of dollars each year from that point onward. I got paid a large pizza (toppings of my choice). And made a couple of friends of some librarians.
A home point touch off position script for my CNC.
There are dozens more I'm proud of that I am forgetting. These are the kinds of interesting little projects that take an afternoon or an evening and often have a huge impact on quality-of-life.
The basic format is: "t <letter>". "t a" goes to my main development git worktree, while "t b" goes to the place I build from. But the real star is the history. I embedded "`save_t_history`" into my bash prompt. That's another script that saves the current directory to a history file, then removes duplicates so staying in the same directory doesn't accumulate entries. Typing "t h" shows the history back to about 25 unique directories. Typing "t h1" goes to the last directory you were in; "t h5" goes five ago. I'll be working in tmux in one pane, then jump to another and type "t h0" to bring it to the same directory. This is basically my bread and butter for getting around. I would post the code, but a) it's on a work computer so technically property of my company, and b) it's a mess, so I'd recommend others just build their own version. A hint to get started is that the "t" alias is an alias for "cd `python3 /opt/t.py`"
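A bare-bones, bash-only sketch of the idea to start from (not my real code; the bookmarks are placeholders):

save_t_history() {   # call this from PROMPT_COMMAND
    { echo "$PWD"; cat ~/.t_history 2>/dev/null; } | awk '!seen[$0]++' | head -n 25 > ~/.t_history.new
    mv ~/.t_history.new ~/.t_history
}
t() {
    case "$1" in
        a) cd ~/src/main-worktree ;;   # placeholder bookmark
        b) cd ~/build ;;               # placeholder bookmark
        h) nl -ba -v 0 ~/.t_history ;; # list history, most recent first
        h[0-9]*) cd "$(sed -n "$(( ${1#h} + 1 ))p" ~/.t_history)" ;;
        *) echo "usage: t {a|b|h|hN}" ;;
    esac
}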
* dstream (self-hosted music streaming) https://github.com/DusteDdk/dstream
* finalkey (hardware device for storing/typing logins) https://github.com/DusteDdk/FinalKey
* justhpscan (scan an image via a web gui, no drivers needed) https://github.com/DusteDdk/justhpscan
* boilerController (automatic startup/shutdown of my wood gasification boiler) https://github.com/DusteDdk/boilerController
* racinggpstracker (generate overlays with speed / track times / map for road-racing from go-pro) https://github.com/DusteDdk/RacingGpsTracker
* chromogen (generate static-html albums for sharing pictures online easy) https://github.com/DusteDdk/chromogen
https://github.com/TorbFoundry/torb
"Torb is a tool for quickly setting up best practice development infrastructure on Kubernetes along with development stacks that have reasonably sane defaults. Instead of taking a couple hours to get a project started and then a week to get your infrastructure correct, do all of that in a couple minutes."
My ambition is to fill the space of Heroku and right now I can get something like a React App and a Flask API or a full Elixir Phoenix project started and running in less than 5 minutes on Kube, with a full infrastructure as code environment, reproducibility and more.
It's definitely not meeting my ambition yet, but it's definitely in a place where I think people can use it and get a lot of value from it. I've been testing it with some friends over a few months and have been dog fooding it on my other projects.
I just finished adding a file system watcher to it over the last few days so you can iterate and changes will quickly be reflected onto your cluster. Next I'm going to extract this out into a library from the CLI tool, build an API over it and offer a hosted solution and a web app.
I've been meaning to share it with people but I've been heads down building for maybe a little longer than I should have been.
npm install happy -g
$ happy
// Basically does:
- npm build (if package.json build available)
- npm lint (if package.json lint available)
- npm test (if package.json test available)
- git add . && git commit -m $MESSAGE || "Saved on $DATE"
- git pull
- git push
happy --now
// Same but omit the build+lint+test steps, ideal for e.g. a documentation change
happy --major | --minor | --patch | --publish 1.2.3
// Same but use `np` (install separately) to bump the {publish} version and publish it in npm
It is built to maintain many small Git/npm projects, which fits my use case and needs perfectly. But it has some limitations ofc: it assumes small projects working on a single git branch, so if you have any kind of more advanced git setup it won't work properly.
I wrote a little daemon that'd l2ping my Nokia brick phone; if it didn't get a response for 30 seconds it'd invoke xscreensaver. Saved me a lot of paperwork.
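A heavily simplified sketch of such a daemon (the phone's address is a placeholder, and the real one handled edge cases better):

#!/usr/bin/env bash
PHONE=AA:BB:CC:DD:EE:FF    # Bluetooth address of the phone (l2ping typically needs root)
misses=0
while sleep 5; do
    if l2ping -c 1 -t 5 "$PHONE" >/dev/null 2>&1; then
        misses=0
    else
        misses=$((misses + 1))
        if [ "$misses" -ge 6 ]; then    # ~30 s without a response
            xscreensaver-command -lock
            misses=0
        fi
    fi
done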
I currently work at a Call of Duty studio. My favorite hacks ( not super high tech, but the ones that had the most impact for the least code, and the ones I feel I can talk about.. ):
* Put together a little box that polls varied knobs on a USB midi device to mangle the traffic going across its two interfaces. Allows for real time latency / jitter / packet loss testing https://twitter.com/ultrahax/status/1200902654882242562
* Studio LAN game browser didn't work across subnet boundaries ( studio is a couple of class B's ). Wrote a little daemon that'd take game discovery packets from one subnet, mangle the src addr, and send it on its merry way. Everyone can see everyone's games, happy producers.
tabChar=$'\t'
function prompt_command {
echo "$(date +%Y-%m-%d--%H-%M-%S)$tabChar$(hostname)$tabChar$PWD$tabChar$(history 1)" >> ~/.full_history
}
export PROMPT_COMMAND=prompt_command
function c {
while read -r _ dir
do
if [[ -e "$dir" ]]
then
echo "$dir"
cd "$dir"
break
fi
done < <(cat ~/.full_history | tail -n 10000 \
| cut -f 3 | sort | uniq -dc | sort -hr | grep "/$1$")
}
function _c {
local IFS=$'\n'
COMPREPLY=( $(cat ~/.full_history | tail -n 10000 \
| cut -f 3 | sort | uniq -dc | sort -hr \
| sed 's/.*\///g' | grep "^$2") )
}
complete -F _c c
$ some_cmd | another_cmd -d doing -some complex -thing # somecmd to do something complex
Some time later, I don't have to remember the long string of commands - just the comment .. CTRL-R "somecmd" .. and I'm back again!
Another small personal hack that has had a big impact is that I print-to-PDF any web page I find interesting, instead of bookmarking it. I have done this since 1999 and now have over 70,000 PDFs of everything interesting I've ever read online, safely in a local cache of readable PDFs for offline use.
The next big life hack will be to put myself on a boat in the middle of a vast ocean with 70,000 PDFs to read .. joking aside, it's also pretty interesting to harvest this data and see what my interests have been over the years, and see the bread-crumb trails of sites I've visited related to those interests .. kind of like having my own personal Internet ..
Some gifs (with speed comparisons) here: https://github.com/dp12/probe
https://gist.github.com/xenophonf/893b323b99644290fad420a54c...
It keeps the custom certbot install updated, and if anything goes wrong—and only if something goes wrong—I'll get an email thanks to chronic. Same goes for the install command outputs when running the wrapper/certbot interactively. You'll only see error messages when something breaks. Otherwise, it looks exactly like certbot was invoked directly, just with a short delay while the script does the install/update in the background.
I back up my photos every couple of weeks, and I found out that if the laptop and the first HDD have the same directory structure, rsync can figure out the diff and only copy over photos that are not yet backed up. This also lets me archive photos from my laptop to my first HDD at any time, and rsync still picks it up correctly.
It works like this
rsync "$source_dir_laptop" "$source_dir_hdd" "$backup_destination" -ah --info=progress2 --delete-before --backup --backup-dir="$delete_backup_dir"
https://github.com/harish2704/dotFiles/blob/master/home/.loc...
This script + the SpaceFM file manager save a lot of time for me.
It replaces common ASCII characters with their typographic counterparts, e.g. "Hello - I'm back" becomes "Hello — I'm back".
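The core can be approximated with sed (only a guess at which substitutions the real tool performs):

# Replace a few common ASCII sequences with typographic characters.
typographize() {
    sed -e 's/ - / — /g' \
        -e 's/\.\.\./…/g' \
        -e 's/(c)/©/g'
}
# echo "Hello - I'm back..." | typographize   ->   Hello — I'm back…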
Anyway, when working on feature branches I wanted a quick way to switch to master, pull, switch back and finally rebase.
So I made a little script (https://gist.github.com/lelanthran/6c9bd1125f89e7621364878d1...) that pushes the current branch name onto a stack before switching to a new branch, and allows me to switch back to the topmost branch on the stack.
xgit master # Switch to master from current
git pull
xgit pop # Switch back to the branch we came from
git rebase master
I use this now very frequently; when I need to switch branches I use `xgit`.
Sadly it was lost when the HDD crashed, and I never got around to remaking it because reasons.
I automated everything except the actual recording using a Makefile that's over here https://gist.github.com/nibrahim/2466292. I think I added some sox commands to clean up audio too in a later version but lost the file.
Another one I did was a small script to create ruling sheets for calligraphy. It's a tedious process by hand and having a script create a PDF based on nib width is a great time saver https://github.com/nibrahim/Calligraphic-Rulings
When I run a command that takes long, I run it (I use fish but any shell has an equivalent) like:
./mycommand && beep
or
./mycommand && play something.wav
It can be useful if I am doing something else on the computer or outside of it. It saves time from checking over and over whether the command has finished.
https://forecast.weather.gov/MapClick.php?lat=40.7983&lon=-1...
The issue I had with these was that the pages are always in desktop mode, even on mobile. So I made a one-page static website which takes and stores these URLs in local storage, and fetches the one I click on. On the next page load, it remembers the last loaded city and loads it again.
The whole page is in mobile-first view mode in CSS.
https://spa.bydav.in/weather/index.html
You might see nothing at first, as you need to preload/populate the local storage with either manually copied URLs or by simply pasting this JSON:
[
{
"abb": "SNR",
"full": "Sonora",
"url": "https://forecast.weather.gov/MapClick.php?lat=37.98&lon=-120.38&unit=0&lg=english&FcstType=json&TextType=1"
},
{
"abb": "RDB",
"full": "Red Bluff",
"url": "https://forecast.weather.gov/MapClick.php?lat=40.1781&lon=-122.2354&unit=0&lg=english&FcstType=json&TextType=1"
},
{
"abb": "SFO",
"full": "San Francisco",
"url": "https://forecast.weather.gov/MapClick.php?lat=37.7771&lon=-122.4197&unit=0&lg=english&FcstType=json&TextType=1"
},
{
"abb": "ERK",
"full": "Eureka",
"url": "https://forecast.weather.gov/MapClick.php?lat=40.8033&lon=-124.1595&unit=0&lg=english&FcstType=json&TextType=1"
}
]
I also became frustrated with weather and news options, so I made a minimalist weather and news application to play VOA, NPR, and Sky news, along with the weather forecast from weather.gov by zip code: https://www.locserendipity.com/Start.html
I think of both of these as hacks because they are just static pages that anyone can download and use without connecting to my website.
To regularly remove any files in commonly dirty directories that haven't been viewed in a week
Keeps the desktop tidy: https://gist.github.com/higgins/b825103ce0bcf3fbefc79a439212...
https://gist.github.com/wittman/ab90f0eca5124233fa5163a32f89...
# Prerequisites: ImageOptim (desktop app) https://imageoptim.com/mac
## pngquant https://github.com/kornelski/pngquant
brew install pngquant
## oxipng https://github.com/shssoichiro/oxipng
brew install oxipng
## sips (utility provided by default in OS X)
It is generally possible to impersonate a user but there are always subtle differences in the way those sessions work.
Accepting the hash lets people with database access perform a real login if needed.
Why it was even necessary to do this in the first place, rather than using a script that also entered the correct answer, was presumably down to some weird thing in the contract with the state that administered the test.
1. Toggle how function arguments are indented: either all on the same line, or all on separate lines.
2. Convert arbitrary piece of text into a literal string (i.e. with quotes).
3. Extension to Magit that calls git-grep.
4. Dired function that calls Ghostscript on multiple selected PDFs to combine them into a single PDF.
5. Magit extensions for internal Git commands added by the company I work for (they are all nonsense, but I have to deal with them and it's better to keep this nonsense to a minimum).
----
Stuff I don't use anymore, but used to be useful:
1. Govmomi Emacs Lisp wrapper for select functions that rendered information as Org tables (I don't need to work with VMware anymore). Not sure if this counts as small, but it wasn't more than some 100-200 lines.
2. Emacs command to incrementally show duplicated lines.
bindsym ctrl+$alt+t exec xdotool getactivewindow getwindowpid | xargs ps | grep "alacritty" && (xdotool getactivewindow getwindowpid | xargs pgrep -P | xargs pwdx | cut -d":" -f 2 | xargs alacritty --working-directory ) || WINIT_X11_SCALE_FACTOR=1 alacritty
https://github.com/pkos98/Dotfiles/blob/master/config/i3/con...
I probably scraped some parts together from the internet a couple of years ago.
comm -23 <(awk 'NR>1' "$DSTDIR/build-info") <(find "$SRCDIR" -name "*$EXT" -type f -printf "%P\n" | tee >(gen_index) | xargs -n1 "$0" "$SRCDIR" "$DSTDIR" | sort | tee -a "$DSTDIR/build-info.new") | (cd "$DSTDIR" && xargs rm)
The rest is here: https://gist.github.com/hadrianw/060944011acfcadd889d937b960...
CronBin: https://github.com/theowenyoung/blog/tree/main/scripts/cronb...
Both scripts are deployed as single file js in the free tier of Cloudflare's Workers, and my personal workflow relies on these two simple services for all persistent data storage and scheduled tasks.
Using n8n I can parse those emails and put the data into a nice Airtable base (and a csv for backup). Next step, is to add GPS co-ordinates to each transaction.
Does anyone know if there is a way to run a secure and private location tracking service to my iPhone, or is my only hope to pay Apple the annual fee and do it myself?
(Yes: I wrote "secure and private location tracking" with a straight face - I'm hoping there's a solution out there)
This is a script I made which can be invoked through "git difftool -d -x bettervimdiff" that opens each diff pair of changed files in a separate tab page. I acknowledge that it abuses tab page functionality a bit but I think it's pretty handy. Handles deletions, doesn't handle renames and symlinks yet.
https://gist.github.com/PhilipRoman/60066716b5fa09fcabfa6c95...
motd() {
date +%D > /tmp/motd_seed
sort -R --random-source=/tmp/motd_seed ~/Dropbox/motd.txt | head -n 1
}
motd
One is a RSS feed reader (GORSS) for the terminal that I use to always be up to date with stuff that interests me. The other is a simple todo-list that I use for work, shopping etc (DoIT).
https://github.com/lallassu/gorss https://github.com/lallassu/doit
`diary` opens the entry for today
`diary -1` opens the entry for yesterday
`diary -365` opens the entry for a year ago
https://gist.github.com/bspammer/12d6b4a8049c69c832df7c4fd1d...
I can easily get Iosevka Nerd Font just by running the command
install_nerd.sh Iosevka
One is GenFortune: https://github.com/EternityForest/GenFortune Which is like fortune, but it generates them using a MadLibs style algorithm from data files.
One is a candle flicker simulation algorithm. I did not make this demo, I wrote the original for the PIC, they adapted a version of it for JS: http://blackdice.github.io/Candle/
At the time cheap LED candles weren't very good, but it seems many are a lot better now. I used a model assuming that the flickering was wind driven: the flame always wants to rise towards its maximum at a certain rate, but depending on the current wind speed it can get randomly "toppled" to a lower value at any time, while the wind settles down to a baseline but can randomly jump up to a higher speed in a gust.
Of course, now we have multicolor multipixel flame algorithms that do way better, but this is pretty good on a single pixel light.
I still use mostly the same algorithm for DMX flame effects in Python, but I apply the effect less to the red channel, so it gets redder when the light goes down, for a little added variety.
The other is this RNG. It does not even have a full period for its 32 bits of state, but it's very fast on PIC10-type chips. I can't think of any reason I'd use it. I was like 15. But of everything I've ever made, this gets the most attention, and I'm not entirely sure why. It doesn't even pass all statistical tests. I'd probably use it over an LCG though.
uint8_t x, a, b, c;
uint8_t randRNG8() // Originally unsigned char randomize().
{
    x++;                      // x is incremented every round and is not affected by any other variable
    a = (a ^ c ^ x);          // note the mix of addition and XOR
    b = (b + a);              // and the use of very few instructions
    c = ((c + (b >> 1)) ^ a); // the right shift is to ensure that high-order bits from b can affect
    return (c);               // low order bits of other variables
}
`$ diary` to create/open a file for today's diary in vim: https://github.com/Aperocky/diaryman/blob/master/diaryman.sh
Or try `pip install diarycli`: https://github.com/Aperocky/diarycli, for a pip packaged python version that does the exact same thing.
I've actually kept diary and work logs, things I did not know I was capable of.
Layer Labeler (sequentially labels layers with prefixes and leading 0s): https://www.middleendian.com/pslayerlabeler
Mask Placer (because Photoshop trims transparent pixels when exporting multiple layers for no fucking reason): https://www.middleendian.com/psmaskplacer
mkcd() {
    if [ $# != 1 ]; then
        echo "Usage: mkcd <dir>"
        return 1
    fi
    mkdir -p "$1" && cd "$1"
}
It uses OLE to open Word documents and print them to PDF. Nowadays Word has a save-to-PDF option, but initially the script would print to PostScript and then convert to PDF with Ghostscript.
It runs from IIS, but over the years it has been used as a CLI tool or with Apache.
So far nothing has shown to be as reliable as this script at converting to PDF. It would be great not to depend on Word or even Windows, but there's nothing that can convert with the same level of fidelity.
Aside from this, I basically just rely on things like Ctrl+P for search and various bash/shell scripting + ChatGPT to be efficient. Always improving!
A GET request (by the phone's browser) is sent when I "touch" the phone to any of the stickers; the backend is a simple Node.js Express script that listens for GET requests on a route.
I use it not for just persisting commands but leaving comments for future me such as github personal tokens or api keys etc.
[0] https://eli.thegreenplace.net/2013/06/11/keeping-persistent-...
git-clean-squashed
Which finds all squash-merged branches in the current Git repo, and warns you before deleting!
https://github.com/benwinding/dotfiles/blob/master/bin/git-d...
* Load a playlist from Spotify (including fetching the API token programmatically from bash... shudder)
* Search YouTube for the song + artist
* Download the MP3 of the song from YouTube
* Set up the directory structure for Rekordbox DJ
* I could then manually import the song
When I was too broke to buy music for DJ'ing this satisfied my craving for new music to play.
This way I can quickly boot any Pi in my house (and there are quite a few now) to any arbitrary OS image just by renaming a symlink on the server.
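On the server it's essentially just re-pointing a symlink; something like this (paths are illustrative; Raspberry Pi network boot looks for a directory named after the board's serial number):

# Point the per-Pi boot directory at whichever image tree it should boot next.
ln -sfn /srv/netboot/images/raspios-bullseye /srv/netboot/tftp/a1b2c3d4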
So I automated the notification system using IFTTT.
https://github.com/ravivooda/thoughts/blob/main/docs/garage_...
o mv mispelled.txt
and it will make an interactive prompt that looks like:
mv mispelled.txt mispelled.txt_
but the last word is editable, so I can use the arrow keys to alter it in place
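Bash's `read -e -i` can reproduce most of that behaviour; a tiny sketch (not the original implementation):

o() {
    # Prompt with the command so far, pre-filled with an editable copy of the
    # last argument, then run the command with whatever was typed.
    local edited
    read -e -p "$* " -i "${!#}_" edited
    "$@" "$edited"
}
# o mv mispelled.txt   ->   mv mispelled.txt mispelled.txt_   (last word editable)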
https://github.com/robswc/docker-essential-aliases
When deep diving into containers, I got tired of typing/tabbing out container names and docker formatting lol
Nothing revolutionary but it has saved me a ton of combined time.
This ended up always making sure I got into the section I wanted.
About the same as the GUI interface, but with IaC advantages.
The problem: I want to know when content from one of my sites is submitted to Hacker News, and keep track of the points and comments over time. I also want to be alerted when it happens.
Solution: https://github.com/simonw/scrape-hacker-news-by-domain/
This repo does a LOT of things.
It's an implementation of my Git scraping pattern - https://simonwillison.net/2020/Oct/9/git-scraping/ - in that it runs a script once an hour to check for more content.
It scrapes https://news.ycombinator.com/from?site=simonwillison.net (scraping the HTML because this particular feature isn't supported by the Hacker News API) using shot-scraper - a tool I built for command-line browser automation: https://shot-scraper.datasette.io/
The scraper works by running this JavaScript against the page and recording the resulting JSON to the Git repository: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
That solves the "monitor and record any changes" bit.
But... I want alerts when my content shows up.
I solve that using three more tools I built: https://datasette.io/ and https://datasette.io/plugins/datasette-atom and https://datasette.cloud/
This script here runs to push the latest scraped JSON to my SQLite database hosted using my in-development SaaS platform, Datasette Cloud: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
I defined this SQL view https://simon.datasette.cloud/data/hacker_news_posts_atom which shows the latest data in the format required by the datasette-atom plugin.
Which means I can subscribe to the resulting Atom feed (add .atom to that URL) in NetNewsWire and get alerted when my content shows up on Hacker News!
I wrote a bit more about how this all works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
Bit more than a hack, but a simple thing which has made the whole process much easier
Usage is hugonew 'my awesome new nice title'
https://gist.githubusercontent.com/Explosion-Scratch/53acf9d...
This saved me on avg about an hour a day of manually remoting into retail store systems to gather sales and stock data for the central system processing.
That was a good feeling, and was so simple.
I got this idea from a Twitter discussion which eventually blew up a year later after its initial release.
I stopped doing it because every time the server crashed through no fault of my own, everyone turned to look at me.
I haven't updated it in a few years because it just works
I'm still impressed by my connect4 and 2048 solver
my favorite is hypothesis die hard[2]. i need a sequence of actions that accomplishes a goal. assert there is no such sequence. error, here is such a sequence. ok thanks!
1. https://gist.github.com/nathants
2. https://gist.github.com/nathants/3dc397270d73fee57c86f246243...
> Relax and be ready to answer incoming calls :D
And they call you, with Berlin's rental market? Improbable.
The corp in question: Begins with V and ends with z and likes to screw customers.
All above board, approved by several levels of various teams/depts. The head of a specific dept sent me several strongly worded texts/voicemails. He was rather expressive to say the least over something that cost $0.
I could go in more detail but there are a few NDAs I signed.
At my first software developer job (small company of about 400 people in a country in Asia), there was a company wide requirement that our working hours should be logged into OpenProject.
Logging the hours turned out to be painful because the UI was clunky. Too many clicks and page loads were needed. You probably see where this is going.
Iteration #1:
After working there for a year, one evening I checked out the API for OpenProject. The web interface itself includes a place to set an access token, and examining the browser network tab gave me additional information that I needed.
In about a couple of hours, I wrote a quick and dirty python script that would read a text file formatted this way:
etc
My script, when invoked, read the file line by line and pushed each entry to OpenProject via API calls. What took 15 minutes every evening now only took 30 seconds!
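Each entry boiled down to a single authenticated POST, roughly this shape with curl (going from memory of OpenProject's v3 API, so treat the exact payload as an assumption):

# Log 1.5 h against a work package (host, IDs and token are placeholders).
curl -s -u "apikey:$OPENPROJECT_API_KEY" \
    -H 'Content-Type: application/json' \
    -X POST 'https://openproject.example.com/api/v3/time_entries' \
    -d '{
          "hours": "PT1H30M",
          "spentOn": "2020-01-15",
          "comment": { "raw": "code review" },
          "_links": { "workPackage": { "href": "/api/v3/work_packages/1234" } }
        }'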
Having said that, this was a quick and dirty script, and although it worked fine I didn't want to spend time cleaning things up.
Iteration #2:
30 seconds was too long!
This was because for every line in my input text file it would establish a new API connection. So if there were 5 lines of input, it would essentially make 5 separate API calls.
(There's probably a proper way to bundle all input into a single API call, but that's not the approach I took.)
I realised that my workload is embarrassingly parallel. I altered my script to process each line in parallel.
What took 30 seconds on a typical day now only took like 5 seconds! (Depending on input size of course).
(My dirty script became dirtier in terms of code quality.)
Iteration #3:
Even though it took literally just 5 seconds, manually running the script every evening was a chore!
So I set up a cronjob to run at 6:05 pm every day to run this automatically. 6 pm is when I usually call it a day.
On the occasional day that I stay a little bit longer, I just had to log in manually to OpenProject and do the necessary adjustments. No big deal.
Iteration #4:
Before the aforementioned cronjob ran I'd like to review what it's going to push. I discovered the notify-send utility in Linux that I can use to display a popup.
Wrote a crontab entry to show me the contents of the input file in a popup a few minutes before my other script ran.
Iteration #5:
(By this time I've been relying on this python script for over a year).
OCD finally became unbearable about the crappy code quality (even though I never actually had to look into the code at all for like 9 months by this point).
One weekend I rewrote everything in Go (to take advantage of goroutines, since my program could readily use those — and because parallel programming in other languages I knew felt like an afterthought, though that's a topic for another day). My keys and such things were now in a proper config file.
The Go program also ran without hiccups.
I left the place about 6 months later though.
It was fun!
The solution is a simple bash script that deletes "target/" directories.
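Something along these lines (a guess at the shape of it, with the path left generic):

#!/usr/bin/env bash
# List and then delete every target/ build directory under the given root.
root="${1:-.}"
find "$root" -type d -name target -prune -print -exec rm -rf {} +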