It worked amazingly well: it got me hundreds (yes, 100+) of dates, incredible dating stories, and a lifetime of memories. As a matter of fact, I met my wife (of 7 years!) through the software, and she is the most important person in my life. She knows about the scripts and laughs about them.
I would let the script send the initial messages with some basic filtering and scraping, and I would take over the conversation only when I got an initial positive reaction (which, as you can imagine, was a tiny fraction of the outreach) and had manually confirmed that the profile was potentially interesting to me. The script presented the relevant information (pictures, links, etc.) in a format ideal for reviewing in bulk, instead of my having to click through a million profiles. It really saved a lot of time.
I also have automations for sending a notification when my chest freezer goes above 10 degrees.
In the winter, I have an automation that turns my humidifier on/off to keep the indoor humidity within certain ranges depending on the outdoor temperature, to avoid excess condensation on my failing windows.
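For what it's worth, that kind of humidifier automation boils down to a small decision function: pick a target humidity band from the outdoor temperature (colder outside means a lower target, so moisture doesn't condense on cold glass), then switch with some hysteresis. The thresholds below are made-up placeholders, not the commenter's actual numbers:

```python
def humidifier_action(outdoor_temp_c, indoor_rh, running):
    """Decide whether the humidifier should be running.

    The target indoor humidity drops as the outdoor temperature drops,
    to limit condensation on cold window glass.  All numbers here are
    illustrative assumptions.
    """
    # Pick a target relative-humidity band based on outdoor temperature.
    if outdoor_temp_c <= -20:
        lo, hi = 15, 25
    elif outdoor_temp_c <= -10:
        lo, hi = 20, 30
    elif outdoor_temp_c <= 0:
        lo, hi = 30, 40
    else:
        lo, hi = 35, 45

    # Hysteresis: turn on below the band, off above it, otherwise keep
    # the current state so the unit doesn't rapidly cycle.
    if indoor_rh < lo:
        return True
    if indoor_rh > hi:
        return False
    return running
```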
One of our routers would periodically start timing out, and I had to go reboot it. I placed it on a networked power switch so that if the router timed out 5 times in a row, the power switch bounced, rebooting it automatically.
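The watchdog logic for that is just a consecutive-failure counter. Here is a sketch with the power-cycle callable injected (the `bounce` hook and the class itself are my invention; the 5-in-a-row threshold is from the comment):

```python
class RouterWatchdog:
    """Bounce a networked power switch after N consecutive ping timeouts.

    `bounce` is any callable that power-cycles the switch; it is
    injected here so the logic is testable anywhere.
    """

    def __init__(self, bounce, threshold=5):
        self.bounce = bounce
        self.threshold = threshold
        self.failures = 0

    def record_ping(self, ok):
        """Feed in one ping result; returns True if a bounce was triggered."""
        if ok:
            self.failures = 0          # any success resets the streak
            return False
        self.failures += 1
        if self.failures >= self.threshold:
            self.bounce()              # power-cycle the router
            self.failures = 0          # count fresh after the reboot
            return True
        return False
```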
I also had to go around the small three-building campus collecting readings from the outdoor thermometers. I replaced those with networked thermometers whose APIs could be hit to get a reading.
Every 30 minutes we had to record information from a `net send` ping and check some jobs. First I had to engineer a solution to get `net send` messaging working on Windows 7. Then I extracted the information from it, transformed it into a usable data structure, and used a Perl script to load that into our MySQL journal DB.
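The extract/transform step of a pipeline like that can be sketched in a few lines. The `job=...; status=...; runtime=...s` message format below is invented for illustration (the real messages, and the Perl loader, obviously looked different):

```python
import re

def parse_net_send(message):
    """Turn a raw status line into a dict ready for a SQL insert.

    Example input (hypothetical format):
        "job=nightly_backup; status=OK; runtime=42s"
    """
    # Grab every key=value pair, stopping values at the ';' separator.
    fields = dict(re.findall(r"(\w+)=([^;]+)", message))
    return {
        "job": fields["job"].strip(),
        "status": fields["status"].strip(),
        "runtime_s": int(fields["runtime"].strip().rstrip("s")),
    }
```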
There were data management tasks that required clicking precise locations in an application. I used tools to make sure the window always opened in the same place, then I learned how to do precise cursor positioning and manipulation with either batch files or WScript, depending on the needs at the time. Then I nailed down precise timing for each action in the tasks and virtualized the entire setup into a VM.
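One way to structure that kind of click macro is as pure data, a list of (delay, x, y) steps, replayed by a tiny runner. The coordinates and delays below are placeholders, and the `click` callable is injected (on Windows it would wrap WScript/SendInput or similar; here it's a stub so the timing logic stands alone):

```python
import time

# A click macro as data: (seconds_to_wait_first, x, y) per step.
# These values are illustrative, not from the original setup.
SCRIPT = [
    (0.5, 120, 340),   # open the data entry dialog
    (1.0, 400, 220),   # focus the first field
    (0.2, 400, 260),   # confirm
]

def run(script, click, sleep=time.sleep):
    """Replay a click script, waiting the given delay before each click.

    `click(x, y)` and `sleep` are injected so the same logic can drive
    a real cursor or a test stub.
    """
    performed = []
    for delay, x, y in script:
        sleep(delay)
        click(x, y)
        performed.append((x, y))
    return performed
```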
Then I used the host machine to refactor our Java 5 application to Java 7 with considerations for Java 8.
But that wasn't "my job" and when the owner died and his son inherited the business he hired his buddy to manage me out. When his buddy found out that I'd "automated my entire job", I lost my job. (Never mind the fact that the job had become refactoring Java code, writing blog posts for the website, and maintaining said website.)
I was getting paid $9/hr.
You won't ever catch me automating anything anymore.
Another example is a “1234 rename” script that does (F2, Ctrl-V, …). Recently I automated some government site, with literally pages of inputs and combo boxes and nonexistent API docs, to input a couple hundred cards into it. I also semi-automated LoRA dataset preparation, which makes it a lot easier to handle: I can collect and prepare a new dataset while a previous LoRA is training, very productive compared to the default kohya_ss experience.
Figuring out how to enter my pin automatically in my banking website was a surprisingly fun challenge: https://www.ing.com.au/securebanking/
He was happy, and it kept everything untouched, so I wasn't going to introduce bugs. That matters a lot in a job shop with repeat products that only show up every few years but represent the lifeblood of the business.
We used Pabbly automations on our Google Sheets to send WhatsApp messages to candidates, fetch candidates' confirmations for scheduled interviews directly into Sheets, and so on.
I built a cron job which checked the page every minute, and sent a push notification to my phone when the page changed.
I probably could have optimised it further, but this was sufficient for me to secure a ticket.
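The change-detection part of a watcher like that is usually just hashing the page body and comparing against the last run. A sketch of only that comparison step (the real version would fetch the URL from cron and fire the push notification; those parts are omitted):

```python
import hashlib

def page_changed(previous_hash, page_bytes):
    """Compare the current page body against the last-seen hash.

    Returns (changed, current_hash) so the caller can persist the new
    hash between cron runs.
    """
    current = hashlib.sha256(page_bytes).hexdigest()
    return current != previous_hash, current
```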
--->>>
Write a Python script to scrape the spreadsheet (20 sheets, 1000s of rows and columns)
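The core of a multi-sheet scrape, once a library such as openpyxl or pandas has loaded each sheet into rows (the loading itself is omitted here), is just filtering across sheets. A minimal sketch, with `sheets` mapping sheet name to a list of dict rows:

```python
def scrape_workbook(sheets, column, predicate):
    """Collect matching rows from every sheet of a workbook.

    `sheets` maps sheet name -> list of dict rows; returns
    (sheet_name, row) pairs for every row whose `column` value
    satisfies `predicate`.
    """
    hits = []
    for name, rows in sheets.items():
        for row in rows:
            if predicate(row.get(column)):
                hits.append((name, row))
    return hits
```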
--->>>
Calculate optimal networking conditions and connections/allocations for an entire country's network using Dijkstra's algorithm and some other custom code. Solves a major allocation issue that can't really be solved manually, very painstaking.
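The core of that step is plain Dijkstra over a weighted graph. A toy sketch with a heap (the real run had country-scale data plus the custom allocation code on top):

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path cost from `source` to every reachable node.

    `graph` maps node -> list of (neighbour, edge_cost) with
    non-negative costs.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                     # stale heap entry, skip
        for nbr, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist
```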
--->>>
1 hour later: report results to bosses, solve the entire task, ask for 2 pay rises, leave the next week for double my salary
lol
The backbone of this was Pipedream, Jira, and Buddy, and GPT-4 handled the test writing. This was a year ago, when GPT-4 was the top-of-the-line AI model. Yes, we needed all of these; it was a fragile thing built on external tools.