For instance, say I search for “The Dark Knight” on my Usenet indexer. It returns a list of uploads and where to get them via my Usenet provider. I can then download the pieces, stitch them together, and verify that it is, indeed, The Dark Knight. All of this costs me only a few dollars a month.
My question is, why can’t copyright holders do this as well? They could follow the same process, then send takedown requests for each individual article that makes up the movie. We already know they try to catch people torrenting, so why don’t they do this too?
I can think of a few reasons, but they all seem pretty shaky.
- The content is hosted in countries where they don’t have to comply with takedown requests.
It seems unlikely to me that literally all of it is hosted in places like this. Plus, the providers wouldn’t be able to operate at all in countries like the US without facing legal repercussions.
- The copyright holders feel the upfront cost of indexer and provider access is greater than the cost of people pirating their content.
This also seems fishy. It’s cheap enough for me as an individual to do this, and if Usenet weren’t an option, I’d have to pay for 3+ streaming services to watch everything I currently do. They’d break even on this scheme even if the only person it cut off were me.
- They do actually do this, but it’s on a scale small enough for me not to care.
The whole point of doing this would be to make Usenet a non-viable option for piracy. If I don’t care about it because it happens so rarely, then what’s the point of doing it at all?
They do receive takedown notices; however, files uploaded to usenet are mirrored across many providers in many jurisdictions, while also being split into many parts, as you noted. Usenet’s implementation of file sharing is quite robust: a file can be rebuilt even when it’s missing a significant portion of its data. To successfully take down a file, you need to remove many of these parts across almost all of the usenet backbones, which requires cooperation across many nations/jurisdictions governed by varying laws. It’s not an easy task.
Here’s a somewhat limited map of usenet providers:
This makes perfect sense. Thank you!
So it’s basically the fediverse, 30+ years earlier?
why is the DMCA the one fucking law that actually gets enforced at a high rate when there are literally billions of more important things we could spend money on?
Because violating the DMCA is copyright infringement, and § 501 (b) of the Copyright Act gives copyright holders a private right of action to file a civil lawsuit to enforce it. Copyright holders tend to be motivated in a way that the State very often isn’t.
Because the corporations that benefit from this law can afford to buy lots of politicians.
It’s civil lawsuits by corporations, not state prosecution
It’s not generally law enforcement enforcing it; it’s the copyright holders threatening civil action against things like internet service providers, who in turn will cut off your internet or some such. They have a lot of money, so they can get law enforcement to do their bidding when they want, but the majority of DMCA action is civil action. This is my very uneducated opinion, looking from the outside.
Because it’s simple. The company that owns the content files a DMCA claim, and the host either removes it or gets sued. Removing it is simple and largely automated.
That’s why we don’t talk about it
As far as I know, they do get DMCA’d. But the provider deletes only a single file, so the set is incomplete. If you have two different newsgroup providers, they usually didn’t delete the same file, so you can still download it.
But I could be totally wrong because I haven’t really looked into this, and this is all from a very old memory.
That makes some amount of sense. I’m not sure exactly how each article is stitched together to create the full file. Do you happen to know if they’re just put together sequentially, or if there’s XORing or a more complex algorithm going on? If it’s only the former, the providers would still be hosting copyrighted content, just a bit less of it.
EDIT:
https://sabnzbd.org/wiki/extra/nzb-spec
This implies that they are just individually decoded and stitched together.
The file is compressed into a set of small archives, and each archive is posted separately.
Usually par files are included so you can regenerate a few missing archives. https://en.m.wikipedia.org/wiki/Parchive
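On the XOR question above: that’s essentially what the par files do, in a more general form. The simplest version is a single parity block, the XOR of all data blocks, which can rebuild any one missing block; PAR2 generalizes this with Reed-Solomon coding over GF(2^16) so that N recovery blocks can repair any N missing blocks. Here’s a minimal Python sketch of the single-parity case (the block contents are invented for the demo):

```python
# Single-parity recovery: the simplest case of what PAR2 does.
# PAR2 itself uses Reed-Solomon coding over GF(2^16), which lets
# N recovery blocks rebuild any N missing data blocks.

def xor_blocks(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length blocks byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

# Equal-length data blocks standing in for pieces of an upload.
blocks = [b"part-one", b"part-two", b"part-3!!"]

# The parity (recovery) block is the XOR of all data blocks.
parity = blocks[0]
for blk in blocks[1:]:
    parity = xor_blocks(parity, blk)

# Simulate a takedown removing block 1 from the server.
survivors = [blk for i, blk in enumerate(blocks) if i != 1]

# XOR the parity with every surviving block to recover the missing one.
recovered = parity
for blk in survivors:
    recovered = xor_blocks(recovered, blk)

assert recovered == blocks[1]
print(recovered)  # b'part-two'
```

This is also why taking down one or two articles often isn’t enough: as long as the number of missing blocks doesn’t exceed the number of recovery blocks, the client can repair the download.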
Do you happen to know if it’s just put together sequentially or if there’s XORing or more complex algorithm going on there? If it’s only the former, they would still be hosting copyrighted content, just a bit less of it.
Copyright is a legal construct, not a technological one. Shuffling the file contents around doesn’t make the slightest bit of legal difference, as long as the intent is to reconstruct it back into the copyrighted work.
(Conversely, if the intent was to, say, print out the file in hexadecimal and wallpaper your house with it, that wouldn’t be copyright infringement even if you didn’t rearrange it at all because the use was transformative. Unless the file in question was a JPEG of hex-digit wallpaper, of course.)
First, a massive amount of content is removed. You won’t find a lot of popular, unencrypted content on usenet these days. It’s all encrypted and obfuscated now to avoid the bots.
Speaking of bots, I don’t think you realize how much of this process is automated, or how wide a net is being used. The media corporations all have enormous collected libraries of material. It gets posted constantly to all sorts of places. This includes public torrents, public usenet, YouTube, PornHub (yes, really, even for non-porn), Facebook, TikTok, Tumblr, Gnutella, DDL sites…
The list goes on and on. Each one gets scanned for millions of potentially infringing items, often daily. No actual people are doing those steps.
Now, throw in things like private torrents, encrypted usenet posts, invite-only DDL, listings that use ‘3’ instead of ‘e’ or other character substitutions… These require actual humans to process (though the simplest substitutions can still be caught automatically; see the sketch after this comment). Humans that cost money, and a considerable amount of it. As a business, you have to show a return on investment. Fighting piracy, even at its theoretical best, doesn’t increase revenues by much.
You mention revenue and breaking even, but you left out an important detail: your time is free. They don’t just have to pay $10/month; they have to pay $10/month + $20/hour for someone to deal with it. And most pirates at that level will just find another method.
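On those substitutions: here’s a rough sketch of the kind of title normalization a scanning bot might apply to the simple cases. The substitution table and sample titles are invented for illustration; real systems would add fuzzy matching and content fingerprinting on top.

```python
# Rough sketch of canonicalizing listing titles to defeat simple
# character substitutions. The table and titles here are invented;
# real systems add fuzzy matching and content fingerprinting.
LEET_MAP = str.maketrans({
    "3": "e", "4": "a", "0": "o", "1": "i", "5": "s", "7": "t", "@": "a", "$": "s",
})

def normalize_title(title: str) -> str:
    """Lowercase, undo common substitutions, and drop separators."""
    cleaned = title.lower().translate(LEET_MAP)
    return "".join(ch for ch in cleaned if ch.isalnum())

# Both variants collapse to the same canonical string.
print(normalize_title("Th3.D4rk.Kn1ght.2008") ==
      normalize_title("The Dark Knight (2008)"))  # True
```

Anything that survives this sort of canonicalization, plus the encrypted and invite-only stuff, is what ends up needing the expensive human review.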
You won’t find a lot of popular, unencrypted content these days on usenet. It’s all encrypted and obfuscated now to avoid the bots
That’s not been my experience at all. Pretty much everything I’ve looked for has been available and I rarely come across encrypted files. I do regularly have to try 2 or 3 nzbs before I find a complete one, but I almost always find one.
Are they obfuscated in any way? Depending on your client, you may not be able to see the names and subjects. But if you didn’t have the NZB, is there any real chance you could find it otherwise?
But if you didn’t have the NZB, is there any real chance you could find it otherwise?
No, but that’s just the nature of NZB file sharing. The individual articles aren’t typically tagged/named with the actual file names; that info comes from the NZB and from the decompressed, stitched-together articles.
I’m not using any special indexers, just open public registration ones. The NZBs aren’t hard to find, for me or for IP claimants.
I remember specifically Game of Thrones. If you didn’t download an episode within a day or so of release, it was DMCA’d and gone from the most popular usenet providers.
Haven’t seen that with anything else, though.
HBO used to be bad. Going back to the Deadwood or True Blood days… the show would air, and within 20 minutes of the end credits it’d be up on Usenet. You had to start grabbing it ASAP; HBO’s sniffers would be on the lookout within an hour, and they’d take down just enough parts to make the PARs useless. I don’t miss the UUEncode days, or the days of refreshing a newsgroup every few minutes to see if my show got posted (or re-posted).
The only things I’ve had a problem finding are Parks and Rec and ATHF season 12.
Off topic, but is there any tutorial on how to do this Usenet thing? Feel free to contact me on Matrix, it’s on my profile.
You’ll need 3 things:
A usenet client such as SABnzbd. This is equivalent to a torrent client like qBittorrent.
An NZB indexer such as NZBGeek, again equivalent to torrent indexers, but for nzb files.
And finally a usenet provider such as FrugalUsenet. This is where you’re actually downloading articles from. (there are other providers listed in the photo in my other comment here)
Articles are individual posts on usenet servers. NZB files contain lists of articles that together make up the desired files. Additional articles are also included so that if some are lost (taken down due to DMCA/NTD), they can be rebuilt from the remaining data. Your NZB client handles the whole process: reading the NZB file, trying to download the articles from each of your configured usenet providers, then decompressing, rebuilding lost data, and finally stitching it all together into the files you wanted.
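If you’re curious what the client is doing under the hood, here’s a hedged sketch in Python using the stdlib nntplib module (removed in Python 3.13, so assume 3.12 or older). The server name, credentials, and the yEnc decoder are placeholders, and a real client would also run PAR2 repair and unpack the resulting archives:

```python
# Hedged sketch of what an NZB client like SABnzbd does under the hood.
# Uses stdlib nntplib (removed in Python 3.13, so assume <= 3.12).
# news.example.com, the credentials, and decode_yenc() are placeholders.
import nntplib
import xml.etree.ElementTree as ET

NZB_NS = "{http://www.newzbin.com/DTD/2003/nzb}"  # namespace from the NZB spec

def segments_from_nzb(path):
    """Yield (file subject, article message-id) pairs listed in an NZB."""
    root = ET.parse(path).getroot()
    for file_el in root.iter(NZB_NS + "file"):
        subject = file_el.get("subject", "")
        for seg in file_el.iter(NZB_NS + "segment"):
            yield subject, "<" + seg.text + ">"  # NNTP expects angle brackets

def decode_yenc(lines):
    """Placeholder: articles are yEnc-encoded; use a real decoder library."""
    raise NotImplementedError

def download(nzb_path):
    with nntplib.NNTP_SSL("news.example.com", user="me", password="secret") as srv:
        for subject, msg_id in segments_from_nzb(nzb_path):
            try:
                _resp, info = srv.body(msg_id)  # fetch one article's body
            except nntplib.NNTPTemporaryError:
                continue  # article missing/taken down; PAR2 repair may cover it
            raw = decode_yenc(info.lines)
            # ...append raw to the output file for `subject`, then PAR2 + unrar...
```

Multiple providers slot in at the `except` branch: if one backbone says the article is gone, the client retries the same message-ID against the next configured server before falling back on PAR2 repair.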
deleted by creator
Appreciate the post, but damn, that’s still cryptic. “Find a quality private usenet indexer”… I don’t even know where to begin to do this.
By searching Google for Usenet indexers.
It’s going to be difficult as many of them require invites.
Please note that, unlike private trackers, indexers are usually a paid service.
NZBGeek, Althub, NinjaCentral, DrunkenSlug, just to name a few.
Can recommend NZBGeek.
Honest question. Are there any advantages to usenet if I’m already deep into torrenting and on many private trackers? I’ve never used usenet but have been torrenting for 20+ years so I’m not sure if I’m missing out on anything
deleted by creator
Thank you for the answer. It sounds more legit than I had always thought. My current setup is just all private trackers with a VPN so I’m not sure I’d want to pay to set this up. I have PTP and BHD for movies. BHD and MTV for TV shows, RED for music and MAM for books. I generally always find what I’m looking for but I was always curious about usenet
Hey, yes there is. I was 100% torrents up until recently, also using multiple torrent sites including private trackers. The thing I struggled with was getting all the episodes of TV shows: if I added a niche or old show to Sonarr, it was a constant struggle to find missing episodes or seasons (for some reason South Park was a misery). This was pretty much completely eliminated with usenet. So now I have usenet as primary and torrenting as a fallback if something can’t be found. This config gives me almost full coverage. The downside is that usenet typically isn’t free while torrenting is (although it’s very cheap).
Usenet will download much faster and is generally much more “private” and secure (your ISP can never see what you download).
Find a Usenet provider. A quick web search and some reading should get you to the right place. I’m not sure if any good free servers are available anymore, but there’s probably one that’s cheap enough.
Looks like https://sabnzbd.org/ is a free and open source Windows/MacOS/Linux client that can download files. I haven’t tried it, but it’s highly rated on alternativeto.net
Install the *arr programs that you want to manage your libraries - Radarr (for movies), Sonarr (for tv shows), Lidarr (for music)
Install NZBGet (for downloading the files)
Sign up to a usenet provider.
Sign up to an indexer like NZBGeek, NZBFinder, etc.
Put your usenet provider details into NZBGet under the News-Server section.
Put your indexer details into the indexer settings in *arr programs.
Put your NZBGet details into the Download Clients setting section in *arr programs.
Pretty much the gist of it. Then you can just search for and add the content you’re after in Radarr/Sonarr/Lidarr and it will go looking.
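For the curious, the indexer piece is just an HTTP API (the Newznab convention) that the *arr programs query for you. A minimal manual search might look like the sketch below; the endpoint and API key are placeholders, so check your indexer’s own docs for the real values:

```python
# Minimal sketch of a Newznab-style indexer search, the API the *arr
# programs talk to. Endpoint and API key are placeholders; the parameter
# names (t, q, o, apikey) follow the common Newznab convention.
import urllib.parse
import urllib.request

BASE = "https://indexer.example.com/api"  # your indexer's API endpoint
params = {
    "t": "search",           # search type: search / tvsearch / movie
    "q": "the dark knight",  # query string
    "o": "json",             # response format
    "apikey": "YOUR_API_KEY",
}
url = BASE + "?" + urllib.parse.urlencode(params)
with urllib.request.urlopen(url) as resp:
    print(resp.read()[:500])  # each result carries a link to an NZB file
```

The *arr programs run searches like this on a schedule and hand the chosen NZB to NZBGet for you.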
If you need a hand I’m happy to help.
deleted by creator
There’s about 100 tutorials that’ll come up in a Google search.
You forgot the first rule of Usenet: We do not talk about Usenet. That’s why it works. Keep talking and see what happens.
Nah, you’re not nearly as cool. Usenet is by no means a secret lol
I mean, how does usenet compare to just pulling torrents from public trackers?
Is there a good way of searching to find out if something is available before diving in?
I don’t really know how usenet compares, and private trackers seem quite a pain; I haven’t ever really had anything I couldn’t find on public trackers anyway.
If what you want is recent (within the last 10-20 years) or older but popular, Usenet is far superior: if it’s available, it will always download as fast as your connection allows. With torrents, the content also needs seeders in addition to simply being available, and then of course you’re beholden to those seeders’ upload bandwidth.
If what you want is obscure and/or old, torrenting is probably your best bet, especially since you can just leave the download open and grab it byte by byte over months if you so choose. But even then, it’s still worth checking Usenet, since you never really know what people will upload and when. If someone reuploads something obscure/ancient and is never seen again, it doesn’t matter: you’ll still be able to download it fast until it’s removed or its retention expires (about 10 years for the good providers).
2 main advantages:
- No hosting liability. Unlike torrents, you’re not seeding (i.e. hosting) the files yourself; you’re purely downloading. This moves you out of the crosshairs of copyright holders, as they are only interested in the hosts (providers). It also means a VPN is not necessary for usenet downloading. (Providers don’t log who downloads what, either.)
- Speed. As long as the content is available (hasn’t been removed due to DMCA/NTD), you are always downloading at the maximum connection speed between you and your provider: no waiting/hoping for seeds and whatever their connections can provide. I’m usually at around 70 MB/s, whereas torrents very rarely broke 10 MB/s for me, usually struggling to reach 1 MB/s.
As far as availability goes, stats from my usenet client: of 17 million articles requested this month, 78% were available. I’m only using a single usenet provider; that percentage can be improved by using more than one in different jurisdictions (content is difficult to remove from multiple servers across different regions).
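Back-of-the-envelope on that: if each provider independently held 78% of requested articles (optimistic, since takedown waves across backbones are correlated), adding providers compounds quickly:

```python
# Rough math on multi-provider availability. Assumes each provider
# independently holds 78% of requested articles -- optimistic, since
# takedown waves across backbones are correlated.
p_single = 0.78

for n in (1, 2, 3):
    p_any = 1 - (1 - p_single) ** n
    print(f"{n} provider(s): {p_any:.1%} of articles retrievable")

# 1 provider(s): 78.0% of articles retrievable
# 2 provider(s): 95.2% of articles retrievable
# 3 provider(s): 98.9% of articles retrievable
```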
Tom Cruise loves DMCA
deleted by creator
They do. Almost everything gets DMCA’d within hours of being uploaded. What happens is it just gets uploaded again and again, on different backbones and servers, and you end up either piecing the complete thing together from multiple sources or grabbing it as soon as it’s uploaded, before it gets DMCA’d.