Hello everyone,
I recently came across an article on TorrentFreak about the BitTorrent protocol and found myself wondering if it has remained relevant in today’s digital landscape. Given the rapid advancements in technology, I was curious to know if BitTorrent has been surpassed by a more efficient protocol, or if it continues to hold its ground (like I2P?).
Thank you for your insights!
BitTorrent has a new version now: BitTorrent v2. You'll see this in clients that support it, like qBittorrent, in the form of a v2 info hash. It's still getting better. v1 and v2 are not interoperable, because some of the changes can't work together, but you can create hybrid torrents that work with both. https://www.libtorrent.org/features-ref.html#bittorrent-v2 https://blog.libtorrent.org/2020/09/bittorrent-v2/
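For illustration, here's roughly how you'd make one of these hybrid torrents with libtorrent's Python bindings (a sketch; the folder and tracker names are placeholders, and as far as I know libtorrent 2.0+ creates hybrid v1+v2 torrents by default):

```python
import libtorrent as lt  # requires libtorrent 2.0+ for BitTorrent v2 support

# Collect the files to share (placeholder directory name).
fs = lt.file_storage()
lt.add_files(fs, "my_release")

# With libtorrent 2.0+, create_torrent produces a hybrid v1+v2 torrent by default.
t = lt.create_torrent(fs)
t.add_tracker("http://tracker.example/announce")  # placeholder tracker

# Hash the content: v1 piece hashes plus v2 merkle tree hashes.
lt.set_piece_hashes(t, ".")  # path is the parent directory of "my_release"

with open("my_release.torrent", "wb") as f:
    f.write(lt.bencode(t.generate()))
```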
“Is email still relevant in the modern era?”
This seems like a dumb question. BitTorrent absolutely is still relevant, and it's probably the most popular method of file sharing in the scene. FOSS groups use it too, for distributing ISO files for operating systems, and it might even be used as the video hosting layer in future Fediverse YouTube alternatives (I've heard talk of a video platform on Fedi that uses ActivityPub for everything else but hosts videos via BitTorrent). Pretty cool stuff.
So yeah, BitTorrent is still relevant, and that makes sense: if it isn't broken, why fix it? Not to say it couldn't be better (the biggest problem with it is the anonymity issue), but until someone makes something better, BitTorrent will continue to be popular and the ideal choice for decentralized file sharing, especially in the piracy scene.
I almost always find torrenting the most convenient way to download anything. When someone puts a file up for download on one of those stupid free file hosters, I usually get annoyed by "disable your ad blocker" prompts, slow download speeds, etc.
A torrent makes things so much more convenient.
Snappy Driver Installer uses torrents to distribute Windows drivers.
I've never used torrents as much as in the last few years.
Most piracy runs on one of two ancient methods that work perfectly: Usenet or BitTorrent. There is nothing wrong with these methods.
Usenet has many things wrong with it. NNTP is not at all designed for distributing large files; it's for propagating messages across servers. File integrity checks have to be tacked on, for instance, and the few servers still serving binaries are commercial services that are vulnerable to copyright trolls.
Thanks for explaining. I don’t use it.
Good to know
Considering that Usenet goes back to the late '70s and BitTorrent was invented in 2001, one of these things is clearly ancient and the other isn't.
2001 was 24 years ago in 2 days. BitTorrent can drink.
I dislike this fact, because I very clearly remember when it was brand spanking new
Yeah and each torrent was its own separate window with no pause option.
Haha yes! 20 little BitTorrent windows ticking along
I remember when eDonkey and later eMule were brand spanking new… It took quite a while for BitTorrent to gain enough traction (and for me to get fast enough internet) for it to be better… (and, frankly, I still miss the peer-to-peer search capabilities of eMule's Kademlia network…)
eD2k/Kad are still kicking; I use MLDonkey for those networks.
It's still newer than HTML, CSS, and JavaScript.
Yeah, that's pretty ancient to me. That's like saying XP isn't ancient.
The article you linked answers most of your questions.
- Relative global upstream traffic went down, but not because of other file-sharing protocols; it was displaced by entirely different applications
- I2P is not mentioned anywhere in the article, nor any other sharing alternative
- VPNs are mentioned as a potential reason torrent traffic can't be identified; VPNs have become much more prevalent and heavily promoted in the scene
- The article says, in piracy, streaming websites are much more popular now
It has not been surpassed by another protocol. The relative numbers don’t say much about absolute numbers or usage.
And 10 % of global internet upload is certainly no irrelevancy.
I2P is not an alternative to BitTorrent, but to IP networks. Essentially, I2P is an overlay on top of the IP-based Internet.
BitTorrent can work through I2P just like it can over plain IP or Tor.
Thank you for this clarification
wow, this has blown up!
some additional clarification:
I2P is not universally supported by BitTorrent clients, because a client needs specific knowledge of how to connect to the I2P network through an I2P router (using the "SAM" protocol).
The Java-based BiglyBT BitTorrent client has pretty good support, as I hear; it supports I2P-specific DHT and Peer Exchange. DHT is used for peer discovery without a tracker; Peer Exchange is another mechanism that helps with finding more peers. qBittorrent (and a few other clients that use the libtorrent programming library) gained support for I2P around a year ago, but it's experimental so far, I think, or at least it hasn't been tested that much.
These libtorrent-based clients don't (yet) support DHT and PeX for I2P torrents; the functionality is missing from libtorrent, and its single developer is already very busy. If you're interested in the technical aspects, here are some more words about using BitTorrent with I2P from a developer's perspective: https://geti2p.net/en/docs/applications/bittorrent
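To make the SAM part concrete, here's a minimal sketch of what a client does to reach a local I2P router over SAMv3 (this assumes the default bridge address 127.0.0.1:7656 and omits all error handling; just an illustration of the handshake, not production code):

```python
import socket

# Connect to the I2P router's SAM bridge (default address assumed).
s = socket.create_connection(("127.0.0.1", 7656))

# SAMv3 greeting: negotiate a protocol version with the router.
s.sendall(b"HELLO VERSION MIN=3.1 MAX=3.3\n")
print(s.recv(1024).decode())  # expect: HELLO REPLY RESULT=OK VERSION=...

# Create a transient streaming session; a torrent client would then use
# STREAM CONNECT / STREAM ACCEPT on further sockets to talk to I2P peers.
s.sendall(b"SESSION CREATE STYLE=STREAM ID=demo DESTINATION=TRANSIENT\n")
print(s.recv(8192).decode())  # expect: SESSION STATUS RESULT=OK DESTINATION=...
```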
What the what? More relevant than ever. How is this a legitimate question? I2P is great, but adoption is extremely low.
How is this a legitimate question?
It’s not.
I2P is not a more efficient file-sharing protocol.
You may be thinking of IPFS, which is a file-sharing protocol, but I wouldn't say it's more efficient than BitTorrent, AFAIK.
Yes, it's very much alive and very important. A lot of media (books and movies, but also games) is getting restricted, taken away, taken down, and removed from platforms. Old ROM sites are taken down. And platforms like archive.org are having to remove their books.
The problem is that nobody is archiving anymore… because it's not allowed due to "copyright infringement". In the end, all these works (books, movies, and old games) might be gone forever. Future generations will not have access to them. That is what worries me the most. And torrents might be the only way to fix this: by distributing this kind of material, especially older books, older movies, and older games.
Yeah, hoarders and seeders are cultural heroes to me.
Torrenting is a decentralized approach and the corpo parasite hates it because there is nothing they can do about it, short of shutting down the internet lol
Get fucked, Disney.
It’s alive and well. My independent research shows that torrents of users are using it for large foss packages, as well as various media.
This duck in a hoodie shows how both technologies can function together. https://hackyourmom.com/en/pryvatnist/bittorrent-cherez-i2p-dlya-anonimnogo-obminu-fajlamy/
I use torrents daily. I basically never stop seeding what I download to my Plex server, and I also use a Real-Debrid account, which essentially caches torrents on their servers for us to stream through different methods (like Kodi, Stremio, or more recently for me, Plex, thanks to Riven/Zurg).
The protocol is still relevant. Is there anything better yet with enough people using it that it’s relatively easy to find anything you want through it?
I mean I primarily use Usenet to find anything I want.
A better question is: what would you improve about the current way that torrents work?
I wish there were some way to enable availability to persist even when torrents’ peak of popularity has passed - some kind of decentralized, self-healing archive where a torrent’s minimal presence on the network was maintained. Old torrents then could become slow but the archival system would prevent them being lost completely, while distributing storage efficiently. Maybe this isn’t practical in terms of storage, but the tendency of bittorrent to lose older content can be frustrating.
I don't see what you can do at the protocol level to improve availability; you still need people storing the file and acting as peers. Some trackers try to improve that by incentivizing long-term seeding.
It’s called private trackers, and they are great.
Meh… I get itchy when I hear "private". We could also improve the experience of seeding publicly and for longer, not only through education but maybe even with some kind of incentive to keep seeding.
The issue is that public trackers are too easy for people to monitor and pursue copyright infringement claims for. Private trackers, by design, are much harder to do that with, which makes them leaps and bounds safer to use.
Don’t think about it as keeping the common man out, it is about keeping The Man out.
A better question is: what would you change in the current Internet/WWW to make it as decentralized as torrents are?
I wish there was a decentralised way of hosting websites. Kind of like torrents.
Sounds like maybe what you’re looking for is ipfs? https://ipfs.tech/
The problem with IPFS is that it's not as decentralized as I wish it was. By default the data is not replicated across the network, meaning that if nobody else is downloading and hosting it, you are still the only one with a copy of the data. So if your connection goes down or you get censored, there is no other node where the IPFS data lives. It only works if somebody else is actively downloading the data.
Oh, and then you also need to pin the content, or the data will be removed again -,-
Furthermore, look-up via the DHT is very slow, and resolving the data takes far too long to be practical. People today expect at most 1 or 2 seconds of look-up time; adding page load, that's 4 or 5 seconds… max… With IPFS this could be 20 or 30 seconds, or even minutes…
That’s just for files though. Imagine a specific decentralised protocol for hosting websites.
You can technically host a website on IPFS, but it's a nightmare and makes updating the website basically impossible (see the 2021 Wikipedia IPFS mirror). A specific protocol would make it far more accessible.
Websites are just files. For something like running a site on ipfs, you’d want to pack everything into a few files, or just one, and serve that. Then you just open that file in the browser, and boom, site.
I’m not really sure it qualifies as a web site any more at that point, but an ipfs site for sure. Ipfs has links, right?
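For what it's worth, publishing a single self-contained file through a local IPFS node looks something like this against Kubo's HTTP RPC API (a sketch assuming the default daemon ports and a hypothetical site.html; a real site would also need pinning, plus something like IPNS or DNSLink to handle updates):

```python
import requests

# Add a self-contained HTML file to the local Kubo (go-ipfs) node.
# Assumes the daemon is running with its RPC API on the default port 5001.
with open("site.html", "rb") as f:
    resp = requests.post(
        "http://127.0.0.1:5001/api/v0/add",
        files={"file": f},
    )
cid = resp.json()["Hash"]  # the content ID the file is now addressable by

# Anyone running a node (or using a gateway) can now fetch it by CID,
# e.g. via the local gateway on the default port 8080:
print(f"http://127.0.0.1:8080/ipfs/{cid}")
```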
With LibreWeb I tried to go this route, using the IPFS protocol. But like I mentioned above, IPFS is not as decentralized by design as people might think. People still need to download the content first and host a node… and then ALSO pin the content… It's not great. And look-up via their DHT takes way too long as well.
Well… it’s not really designed for that use case, so yeah you’ll have to deal with issues like that. For interplanetary file transfers, that’s acceptable.
I’m personally trying to fix it… https://libreweb.org/. Still a proof of concept though.
Looks really cool. Thanks for the share
Why MIT license and not something like GPLv3?
MIT license is more permissive.
Yeah but then companies can use your work and not provide compensation. But to each their own.
Yes that is true.
That would be very cool. I know we have onion sites on the Tor network that use keypairs for their domains, but the sites themselves are still centrally hosted by a person: anonymously hosted, but still centrally hosted.
There is actually a JS library called Planktos that can serve static websites over BitTorrent. I don’t know how good it is, but it sounds like a starting point.
There’s some cryptobro projects about sticking distributed file sharing on top of ~ THE BLOCKCHAIN ~.
I’m skeptical, but it might actually be a valid use of such a thing.
Blockchain is a nice technology, but not every solution needs it. Just as BitTorrent doesn't require a blockchain, a decentralized internet alternative doesn't need one either.
The profit motive
Make mutable torrents possible.
What’s the advantage to that? I don’t want the torrent I’m downloading to change.
I want that. For example, you downloaded the Debian ISO version 13, and after some time it could be updated to 13.1. Obviously it shouldn't be an automatic operation unless you allowed it before starting the download.
I wouldn’t call that mutable, more like version tracking in which each torrent is aware of future versions.
I kind of like that, but you might be able to accomplish it with a plugin or something.
Put a file in the torrent called "versions" or something like that, and in there would be a URL that the client can use to tell you if there is a new version (see the sketch below).
It wouldn’t change the protocol though, since the new version and old version would still need to be separate entities with different data and different seeding.
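Here's a rough sketch of what such a plugin could do (the "versions.json" filename, its fields, and the update endpoint are all invented for illustration; this is not an existing BitTorrent extension):

```python
import json
import urllib.request

# Hypothetical metadata file shipped inside the torrent, e.g.:
# {"current": "13.0", "check_url": "https://example.org/latest.json"}
with open("versions.json") as f:
    meta = json.load(f)

# Ask the publisher's endpoint for the newest release, e.g.:
# {"version": "13.1", "magnet": "magnet:?xt=urn:btih:..."}
with urllib.request.urlopen(meta["check_url"]) as resp:
    latest = json.load(resp)

if latest["version"] != meta["current"]:
    print(f"New version {latest['version']} available: {latest['magnet']}")
    # A client plugin could add the new magnet link, after user confirmation.
```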
Like the 13.1 torrent being only a patch on top of the 13 one, listing it as a dependency? Downloading the 13.1 torrent would transparently download 13 if it wasn't there already, then download the 13.1 patch and apply it. But I don't think any of this needs to be at the protocol level; that's client functionality.
Resilio Sync can do this, I'm pretty sure.
Although if implemented as an extension to BitTorrent, I’d want it to be append-only, because I don’t want to lose 1.0 just because 1.1 becomes available.
The last 0.01 percent comes in at the same speed as the rest of it