Be careful with vanity address generators. A cryptocurrency market maker once lost around $160 million from a vanity Ethereum address because the generator they used (Profanity) seeded its keys with only 32 bits of entropy.
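To illustrate why 32 bits is fatal, here is a toy sketch (not Profanity's actual code, and the key-derivation function is a stand-in): if every private key is a deterministic function of a 32-bit seed, an attacker only has to enumerate 2**32 seeds to recover any key the generator ever produced, instead of the ~2**256 work a properly seeded key would require.

```python
# Toy model of a badly seeded keygen: the "private key" is a pure
# function of a 32-bit seed, so the whole keyspace is enumerable.
import hashlib

def toy_keygen(seed: int) -> bytes:
    # stand-in for the real key derivation: deterministic in the seed
    return hashlib.sha256(seed.to_bytes(4, "big")).digest()

def brute_force(target: bytes, search_space: range):
    # walk the seed space until the derived key matches the victim's
    for seed in search_space:
        if toy_keygen(seed) == target:
            return seed
    return None

victim_key = toy_keygen(0x00C0FFEE)
# the full attack would use range(2**32); truncated here so the demo runs fast
print(hex(brute_force(victim_key, range(0x00C0FF00, 0x00C10000))))
```

At a few million hashes per second per core, the full 2**32 space falls in hours on commodity hardware.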
I was already able to host onion services last year by using the crate directly. There were a few footguns related to flushing, but it generally works as expected. I will say, though, that the code quality could be improved. When trying to contribute, I found a number of questionable practices, such as direct file reads/writes littered around without abstraction, which made refactoring difficult (I was trying to add different storage/cache options, such as in-memory-only or encrypted).
Opting not to over-engineer the solution with abstractions nobody asked for until you came along is the definition of best practice. Something not being designed for any and all use cases doesn't make it bad practice. Reading and writing from a filesystem you always expect to be available is more than reasonable. Modular code for the sake of modularity is a recipe for FizzBuzz Enterprise Edition.
Not disagreeing or agreeing, but "best practice" is probably one of the concepts together with "clean code", that has as many definitions as there are programmers.
Most of the time it depends: on context, on what else is going on in life, on where the priorities lie, and so on. I don't think anyone can claim for others what is or isn't "best practice", because we simply don't have enough context to know what they're basing their decisions on, nor what they plan for the future.
It is also very useful for exposing services to the world wide web from behind a restrictive network. Tor takes care of the NAT punching and all that jazz, and you get free naming and encryption as an extra bonus :)
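For the stock C Tor daemon, that setup is essentially a two-line torrc stanza (the directory path and ports here are just example values):

```
# torrc: expose a local web server on port 8080 as an onion service
HiddenServiceDir /var/lib/tor/my_service/
HiddenServicePort 80 127.0.0.1:8080
```

Tor generates the keys (and hence the .onion address) in HiddenServiceDir on first start, and NAT traversal falls out for free because the host only makes outbound connections into the Tor network.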
> I'm not sure if this is generally considered acceptable within the Tor network
Tor is already encrypted; that's why you don't need TLS. Some services (like Facebook's hidden service back in the day) had HTTPS, but from what I remember that was more of a vanity thing.
Back when EV certificates were widely supported by browsers, HTTPS was a great way of cryptographically associating a .onion service with a real legal entity, for sites like Facebook which didn't care about being anonymous.
> have https but that was more of a vanity from what I remember
It has a functional difference as well: lots of newer client-side features (like WebCrypto) only work on "secure origins", which .onion isn't but TLS-protected websites are. So if you want to deploy, say, something that encrypts/decrypts data client-side on .onion, you unfortunately need TLS today, otherwise the APIs aren't available.
Of course browsers could fix this, but I don't think they have any incentive to do so. I guess Tor Browser could in fact fix it, and maybe they already do, but it'd be a patch on top of Firefox, something they probably want to do less of, not more.
The key used to encrypt traffic is in the URL, and everything, including the path, is encrypted from the client all the way to the onion service. What you are saying is true for non-onion HTTP sites, not for onions.
tl;dr: Pressure from browsers, enterprise, and the overall ecosystem (e.g., the unavailability of advanced web features without HTTPS) is pushing for the use of HTTPS without exception, even for .onion sites where it has no significant technical advantage.
As long as they have the private key they can move it to new hosting infrastructure without issue, and the same onion address will still be operational.
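For context on why that works: a v3 onion address is derived purely from the service's long-term ed25519 public key (per Tor's rend-spec-v3), so whoever holds the matching private key can stand the service up on any infrastructure under the same name. A small sketch of the derivation:

```python
# Derive a v3 .onion address from a 32-byte ed25519 public key, following
# Tor's rend-spec-v3: base32(pubkey || checksum || version) + ".onion",
# where checksum = SHA3-256(".onion checksum" || pubkey || version)[:2].
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    assert len(pubkey) == 32  # ed25519 public key
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"

# demo with an all-zero "public key" (not a real service)
print(onion_v3_address(bytes(32)))
```

Since the address is a pure function of the keypair, "moving" a service is just a matter of copying the key material to the new host.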
"Mirroring" is also a term used when a single source publishes data over different mediums (technically we're only talking about the internet here, but the internet is full of different protocols, so I'll call them mediums). For example, some websites mirror their content to Geminispace, or, as in this case, make it available as an onion service.
You are correct that this solution does not prevent problems if the server goes down. This particular approach aims to reach a larger audience, while your idea of mirroring enables resiliency.
Both approaches have their use cases and can even be combined too!
No part of hosting or visiting onion services involves exit nodes. Onion service traffic stays within the Tor network instead of exiting to the clearnet.
Run more exit nodes then, and more onion services so they don't need to involve exit nodes.
It's also not such a big deal, provided they aren't messing with your exit traffic which you did encrypt, right? There are few exit nodes, but a great many non-exit nodes which still help anonymize your traffic. If you think it's a problem though, run an exit node.
Well, if you use Tor somewhat regularly and check your exit node's IP, there's roughly a 50% chance it falls in that subnet each time you build a new circuit. Which raises questions.
Maybe I'm wrong, but it would look more benign to have exit nodes distributed without this much bias towards that particular subnet.
It's only 185.220.100 [0] and 185.220.101 [1] that contain all those relays. Some of the bigger German families work together under "Stiftung Erneuerbare Freiheit", which is why you see a big cluster there. But Tor never uses two relays in the same /16 for a circuit, so it's not really an issue.
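A minimal sketch of that "never two relays from the same /16 in one circuit" rule (an assumed simplification of Tor's real path selection, which also accounts for relay families, bandwidth weights, and more):

```python
# Check whether two IPv4 addresses fall in the same /16, the condition
# under which Tor refuses to place both relays in one circuit.
import ipaddress

def same_slash16(a: str, b: str) -> bool:
    # two IPv4 addresses share a /16 iff their first two octets match
    return ipaddress.ip_address(a).packed[:2] == ipaddress.ip_address(b).packed[:2]

# relays in 185.220.100.x and 185.220.101.x would never share a circuit
print(same_slash16("185.220.100.1", "185.220.101.7"))  # True
print(same_slash16("185.220.100.1", "198.51.100.23"))  # False
```

So even if both guard and exit candidates come from that cluster, a single circuit will only ever use one of them.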
Correct. "Stiftung Erneuerbare Freiheit" acts as the LIR in charge of the address space, handing out chunks of that space to exit-relay-operating non-profits for free, but it does not operate any Tor infrastructure itself and has no visibility into the traffic. The cost for us is the RIPE membership fee (approx. 2000€/yr).
Source: I'm its director and founder of torservers.net. Usually using a different nick here.
Very unlikely if you're just hosting an onion service with legal content, where all traffic is encrypted.
Having to deal with law enforcement is unlikely even if you run a normal, non-exit Tor relay.
Exit nodes, on the other hand, will most likely get letters or even visits by law enforcement. But those are not involved at all when just running an onion service.
There is one form of harassment, though: if you run even just a Tor relay, you tend to get put on real-time blackhole lists (RBLs) regularly, which will cause random websites to refuse your connection. Banks, ticket sites, even your insurance company might suddenly block your connection because your IP is listed as "Extreme Risk, active threats, verified" on one of some 200 RBL sites, because someone scraped the Tor relay list and tagged every IP address they found as an active threat.
It does make me wonder if people are running very boring polite websites that can suddenly do very not boring or polite things if you know how to ask the right way over an onion address.
Surely I can't be the only one to think of this right?
>https://blog.torproject.org/introducing-webtunnel-evading-ce...

>WebTunnel is a censorship-resistant pluggable transport designed to mimic encrypted web traffic (HTTPS) inspired by HTTPT. It works by wrapping the payload connection into a WebSocket-like HTTPS connection, appearing to network observers as an ordinary HTTPS (WebSocket) connection. So, for an onlooker without the knowledge of the hidden path, it just looks like a regular HTTP connection to a webpage server giving the impression that the user is simply browsing the web.
Anecdotally, I used to be in control of more than half of Tor's exit capacity (until I had inspired enough other people to take over), with no association to US TLAs, and I personally know many exit and other relay operators. I have no reason to assume they are affiliated with US TLAs or any other TLAs. The majority in terms of numbers may be, but not the majority in terms of bandwidth.
Personally, I doubt the US TLAs have a need to operate any relays themselves. They can simply wiretap, and use control flow data for correlation when necessary. Tor can still be useful for all those who do not try to hide from the few agencies who may have this kind of visibility.
The relay community is pretty good in terms of interacting with each other. There are real-world meetings to get to know others in the space, which may make you also more comfortable seeing their personal reasons for providing bandwidth.
I am of the view that having Gopher and .onion versions of sites is important for avoiding government blocking where possible and for keeping information as free as possible.
Can you recommend a Gopher server that is actively maintained? I've always wanted to host a Gopher site but could never find a solid option that I wouldn't be afraid could be easily compromised.
https://www.forbes.com/sites/jeffkauflin/2022/09/20/profanit...
Noting that the default configuration does not turn your server into a relay or exit node, in case anyone interprets this that way.
Thanks for offering a .onion, bookmarked for the caddy configuration.
https://tpo.pages.torproject.net/core/arti/
https://gitlab.torproject.org/tpo/core/arti/-/blob/main/CHAN...
Hosting onion services is apparently still a work-in-progress, though, and turned off by default.
edit: oh, is the last relay the onion service? So the entire chain is encrypted?
https://proton.me/blog/tor-encrypted-email
In the above blog post, they seem to imply that they made HTTPS mandatory for Proton Mail over Tor for security reasons.
Just saying, this is an important distinction to me, and I've been hosting Tor nodes since the 2000s.
Archiving information, and making it available, is sometimes more powerful than anonymous proxying.
Especially if there's an anonymous proxy available to that archive. ;)
That jazz is increasingly played by the same band of 185.220.0.0/16 exit nodes, and at a scale that is anything but anonymous.
[0] https://metrics.torproject.org/rs.html#search/185.220.100 [1] https://metrics.torproject.org/rs.html#search/185.220.101
TIL that Onion-Location is a header; I only knew about the <meta> element.
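For anyone setting it up, the header variant is a one-line addition in e.g. nginx (the onion address below is a placeholder):

```
# nginx: advertise the onion mirror of the current page via a header
add_header Onion-Location http://youraddress.onion$request_uri;
```

The <meta> equivalent goes in the page head: <meta http-equiv="onion-location" content="http://youraddress.onion" />. Tor Browser picks up either form and offers to switch to the onion site.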