I feel like if I don't use it for something useful it'll be wasted. Any ideas?
I happen to need around a few thousand dollars in GCP credits in order to egress around 30TB of data from our cloud buckets. It's training data, training logs, model snapshots, and pretty much all of our ML history (http://wiki.tensorfork.com/).
The full tally is actually 83TB, but around 30TB of that is incremental model snapshots we don't need to download (turns out, you can generate a lot of data when you're training on TPUs), and another ~30TB is miscellaneous data no one cares about or data duplicated across regions. (To use TPUs in both the EU and US, you need to place the training data in both regions.)
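For scale, the arithmetic above works out to a few thousand dollars at typical GCP internet-egress rates. A rough sketch, where the $0.08-$0.12/GB tiers are assumptions rather than quoted prices:

```python
# Back-of-envelope egress estimate. The per-GB rates below are
# assumptions: GCP internet egress is tiered, roughly $0.08-$0.12/GB
# depending on destination and monthly volume.
TOTAL_TB = 83      # everything in the buckets
SNAPSHOT_TB = 30   # incremental model snapshots, skipped
DUPLICATE_TB = 30  # duplicated / unwanted data across regions

egress_tb = TOTAL_TB - SNAPSHOT_TB - DUPLICATE_TB   # ~23 TB to pull down
egress_gb = egress_tb * 1000                        # decimal TB -> GB

low_cost = egress_gb * 0.08    # cheapest assumed tier
high_cost = egress_gb * 0.12   # priciest assumed tier

print(f"{egress_tb} TB -> ${low_cost:,.0f} to ${high_cost:,.0f}")
```

So even after skipping the snapshots and the duplicates, the download alone plausibly lands in the ~$2,000-$3,000 range, which is why $300 trial credits don't get you far.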
Normally I wouldn't ask, but we've been playing "hot potato" with this data for several months now, since our last credits ran out. We've basically been making new GCP accounts (which come with $300 in credits) and then letting the data sit there for a month (GCP gives you a month's grace before deleting it, even after the credits run out).
If you'd be willing to help, it would be quite simple: I would transfer the buckets to your account, and then download the data to a Hetzner server. I'd then make the data publicly available, such as it is. You can DM me on twitter (https://twitter.com/theshawwn) or email me (see profile). Thanks for considering!
EDIT: For an example of how we've amassed terabytes of training logs, you can see some of our BigGAN outputs here: https://twitter.com/search?q=from%3Atheshawwn%20biggan&src=t...
[0]: https://old.reddit.com/r/gridcoin/comments/7vhku5/fyi_we_are...
[1]: https://github.com/GoogleCloudPlatform/terraform-folding-at-...
Of course, GCP, AWS, and DO don't owe me anything, and these programs must be successful judging by their data, but it still shows the irony of a world where resources aren't matched well against needs.
Maybe people can use these on our platform (https://iko.ai), which runs on GCP and GKE. The platform provides real-time collaborative notebooks so people can train, track, package, deploy, and monitor machine learning models on the users' own Kubernetes clusters (AWS, GCP, Azure, Digital Ocean, etc.). So if you have credits elsewhere, this may be doable.
You could transfer your credits to us; we'd generate a special link, and you could then donate that link to people or entities who, by signing up with it, would use your resources.
I am inspired by this: https://news.ycombinator.com/item?id=28550764
I was thinking about building a domain name index, this would give the project a nice deadline and budget :-)
If you want to collaborate, my email is in my profile.
You'll always need storage.
"ROI"