Looks great. Feature request: Google Drive for desktop. The feature that gives you your drive as a mounted file system and streams files as you need them. It gives me easy access to a huge number of files stored in my Google Drive without having to worry about the space they take up locally, or about moving files up and down.
Actually, what solutions to that might already exist? I don't really use the web UI of gdrive as much as use it as a cloud disk drive.
To be fair, I can't remember the last time I needed Dropbox or Google Drive, but I do use iCloud, since it comes with plenty of storage on my family plan. I don't send anyone files like back in the day, when people would send me a Dropbox link and I'd send them one back.
Hah, wow. A post with an ID under 10k (https://news.ycombinator.com/item?id=9224). Meanwhile this one is over 47M.
I didn't realize I've been reading HN nearly its whole existence. For all my complaining about what's happened to the internet since those days, HN has managed to stay high quality without compromising.
Every so often someone is like, Dropbox isn’t that hard. Look at this amazing ZFS/whatever! So simple. Yeah, I keep paying Dropbox every year so I don’t have to think about it. I shoot a sync off to Backblaze every once in a while.
At the risk of a comment that doesn't age well: for most people on HN, I would definitely look into just using rclone. It also has a GUI for people who want that. rclone is mind-blowingly good. You can set up client-side encryption (so the object storage never sees the data, or even the filenames) to be seamless. I'm a huge fan.
The selling point of Dropbox/Google Drive isn't the storage itself, but that there's an app for mobile and desktop operating systems that deeply integrates it into the OS, so it's just like a local folder that's magically synced.
So it's a cool project, but not really what I'd say is a Dropbox replacement.
Yep, I use rsync to sync files / directories between my desktop, laptop and even phone (Android). Also an external drive.
I ended up creating https://github.com/nickjj/bmsu which calls rsync under the hood but helps you build up a valid rsync command with no surprises. It also codifies each of your backup / restore strategies so you're not having to run massively long rsync commands each time. It's 1 shell script with no dependencies except rsync.
Nothing leaves my local network since it's all local file transfers.
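The "build up a valid rsync command with no surprises" idea can be sketched in a few lines. This is a toy version, not bmsu itself; the flag choices (and defaulting to a dry run) are my assumptions:

```python
import shlex

def build_rsync(src: str, dest: str, excludes=(), delete=False, dry_run=True):
    """Compose a predictable rsync command; dry-run by default so the
    first invocation shows what would change instead of changing it."""
    cmd = ["rsync", "-a", "--human-readable", "--itemize-changes"]
    if dry_run:
        cmd.append("--dry-run")
    if delete:
        cmd.append("--delete")
    for pattern in excludes:
        cmd += ["--exclude", pattern]
    # Trailing slash on src: copy the directory's contents, not the dir itself
    cmd += [src.rstrip("/") + "/", dest]
    return cmd

# shlex.join(build_rsync("~/docs", "laptop:/backup/docs", excludes=[".cache"]))
# gives a copy-pasteable command line to review before running for real.
```

Codifying each backup/restore strategy as one of these calls is what saves you from retyping long rsync invocations.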
Free, open source, works on computers and phones, can in most cases punch through NAT, and supports local discovery (LAN multicast).
No Googles, no Dropboxes, no clouds, no AI training, no "my kid likes the wrong video on YouTube, now our whole family lost access to every Google account we had, so we lost everything, including family photos". Just sync!
(Not affiliated, just really love the software.)
This is my go-to solution for code sync across a macOS laptop, Windows VMs, and Linux VMs, to build and run/debug across environments. Unless something has changed, excluding build artifacts was always an issue with cloud sync providers.
I have been doing more cross-compilation on macOS lately (copy and run on those other machines) for prototypes, but for IDE-based debugging it’s great to edit locally or remotely and get it all synced to the target machine in seconds.
Given how many fuckups sync has had over its lifetime (at one point it basically asked for a re-login every day; at another it just corrupted data or didn't finish syncing), no.
The critical part of Dropbox is not just the storage layer but the combination of their client and server. Even small things, like how to handle conflicting writes to the same file from multiple threads, matter a great deal for data consistency and durability.
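As a concrete illustration of why this matters: one well-known strategy is to never silently overwrite on conflict, but divert the losing write to a separate "conflicted copy" file. A minimal sketch, assuming a naive per-file version counter (the function and host names are illustrative, not any real client's API):

```python
from pathlib import Path

def conflict_name(path: Path, host: str) -> Path:
    # Dropbox-style rename: "notes (laptop's conflicted copy).txt"
    return path.with_name(f"{path.stem} ({host}'s conflicted copy){path.suffix}")

def apply_write(root: Path, rel: str, data: bytes, base_version: int,
                versions: dict) -> Path:
    """Write `data` for `rel`; if another writer bumped the version since
    `base_version` was read, divert to a conflicted copy instead of
    overwriting the newer content."""
    target = root / rel
    current = versions.get(rel, 0)
    if current != base_version:
        target = conflict_name(target, "laptop")  # stale write: keep both
    else:
        versions[rel] = current + 1
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_bytes(data)
    return target
```

Getting even this toy version right across crashes, partial uploads, and clock skew is the part that's genuinely hard.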
A lot of the backend bucket providers can handle file versioning.
I too would like an answer to this concern, because the features page doesn’t mention it. I want to be able to keep file version history.
I’m currently using Filen which I find very reasonable and, critically, it has a Linux client. But I wish it was faster and I wish the local file explorer integration was more like Dropbox where it is seamless to the OS rather than the current setup where you mount a network share.
I think the idea is that any S3-compatible API endpoint can be used. The code also clearly supports Backblaze and, more importantly, local blob storage.
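In practice "any S3-compatible endpoint" usually means configuration looks something like this (variable names here are hypothetical, check the project's README for the real ones):

```
# Hypothetical settings; any S3-compatible endpoint should work the same way.
S3_ENDPOINT=https://s3.us-west-004.backblazeb2.com   # or http://localhost:9000 for a local MinIO
S3_BUCKET=my-locker-bucket
S3_ACCESS_KEY_ID=...
S3_SECRET_ACCESS_KEY=...
```

The only thing that changes between Backblaze, MinIO, Garage, etc. is the endpoint URL and credentials; the API calls are the same.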
Just saying, but this is not really fair. It's not like you use that 2 TB, so you shouldn't compare it to a 2 TB bucket. Most of these plans have limits to prevent abuse, but they're well beyond the 'I need to care' level.
Maybe you use 1 TB, maybe just 10 GB. As a user of this site, I expect you know that a 10 GB plan and a 1 TB plan won't be priced that differently.
Looks like a good lightweight solution to front object storage with a front end and auth. One suggestion: add the license to the repo. The readme says "License: MIT", but there’s no license file.
I bought a $35/mo 16 TB server from OVH. I am running 2 replicas of Garage, one on this server. I am using this for backups for now, but I will probably also move my Nextcloud files and websites there. This is fine for now and less pricey than any S3 provider I was able to find.
I use the archive storage class on Google Cloud to store old movies, wedding videos, and pictures from old vacations.
For everything else I use a paid OneDrive subscription.
The biggest problems with S3-like storage are the user interface and predictable pricing: remember, you also pay for data retrieval and other storage API calls, whereas with Dropbox etc. you pay a fixed amount. Every year or so I roll data over into the bucket.
This is in Go, and exposes both WebDAV and SFTP servers, with user and admin web interfaces. You can configure remotes, then compose each user's space from various locations: some local, others remote.
I use a mini PC with small SMB shares (less than 1 TB). This thing is on 24/7, but runs energy-efficiently.
When it's time to move data, I copy it to a Synology NAS that holds lots of TBs. Then it's also time to back up the really important stuff, which goes to a Hetzner Storage Box[2].
[1]: https://en.wikipedia.org/wiki/Backup#3-2-1_Backup_Rule [2]: https://www.hetzner.com/storage/storage-box/
1 TB is roughly 20-30 USD per month at AWS/GCP for storage alone, plus traffic and operations. R2 is slightly cheaper and includes traffic.
Compare that to e.g. a Google AI plan, where you get 5 TB of storage for the same price (25 USD/month), with Gemini Pro thrown in.
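Back-of-the-envelope math for the storage-only part. The per-GB rates below are commonly cited list prices and may be out of date; check the providers' current pricing pages:

```python
# Rough monthly storage cost for 1 TB (1024 GB), storage only, no egress/ops.
# Rates are assumptions based on published list prices at time of writing.
GB = 1024
rates_usd_per_gb_month = {
    "S3 Standard": 0.023,   # first 50 TB tier
    "GCS Standard": 0.020,  # regional; multi-region costs more
    "R2": 0.015,            # egress included
}
for name, per_gb in rates_usd_per_gb_month.items():
    print(f"{name}: ${per_gb * GB:.2f}/mo for 1 TB")
```

So S3 lands around $23.55/mo before any retrieval or request charges, which is where the "20-30 USD" range above comes from.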
I'd rather control the whole stack, even if it means deploying my own hardware to one or more redundant, off-site locations.
Edit: are there robust, open-source, self-hosted, S3-compatible engines out there reliable and performant enough to be the backend for this?
How much on S3? A LOT more.
Old technology still works, even if it is old!
And so easy to set up on a home computer. Except it's not always on and doesn't come with backups.
I'm not saying S3 is where it's at but might need a bit more than just Samba. Or maybe you don't but people who need Dropbox do.
Turning on SMB is usually just the click of a button; even macOS supports it.
Any user technical enough to set up an S3 bucket, Syncthing, Nextcloud, or this "Locker" tool from OP can also set up an SMB share.
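On Linux the equivalent of that one click is a few lines of Samba config (a minimal sketch; the share name and path are placeholders):

```
[shared]
   path = /srv/shared
   read only = no
   guest ok = no
```

Drop that into smb.conf, restart smbd, add a user with smbpasswd, and you have a LAN file share.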
I was responding to the above thread, where sharing files on an offline network is being discussed. Backups were not mentioned as a requirement.
Sure, ChatGPT can help, but to use it reliably, you still need enough medical knowledge to ask good questions and evaluate the answers.
(and regarding contributors for all of his projects, it's mostly vibe-coded)
The comment is disingenuous, though, since Locker doesn't need AWS S3 to function.
But for infrequently accessed data it's fine.
For a better alternative, run MinIO on a cloud provider of your choice, or stick with a secure option like Proton Drive.
> run MinIO
When people say "S3", in my experience they mean "any S3-compatible storage", not "Amazon S3 specifically" or just "S3 as a protocol".
It doesn’t require an external database (just an S3 bucket) and is a single binary. A web UI is shipping in the next few days.