Basically title. Recently I saw a new option in Chromium’s website permission settings called “allow access to local network” or something like that, and I know some antiviruses on Windows can list all devices connected to the same WiFi network. I usually use Firefox-based browsers, which obviously don’t have an option to enable or disable that access. So can some really invasive websites mine data about my local network, connected devices, etc.? And if so, what can I do to prevent it, other than just disconnecting everything else when visiting such websites?
There is a Firefox extension that blocks port scanning by websites, and the prime example is eBay. If you block eBay with this extension, you cannot log in. eBay specifically requires a port scan of your machine or it won’t let you log in. So based on that alone, I would say that yes, there is a risk.
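For the curious, the reported technique is plain in-page JavaScript timing WebSocket connection attempts against 127.0.0.1 (localhost is exempt from mixed-content blocking, so this works even from an https page). A minimal sketch of the idea; the port list and timeout here are just for illustration, not eBay’s actual script:

```typescript
// Sketch of a localhost "port scan" from a web page: the page can't read any
// response, but how a WebSocket attempt to 127.0.0.1 fails, and how fast,
// can hint at whether something is listening on that port.
const portsToProbe = [3389, 5900, 5938]; // remote-desktop-style ports, for example

function probePort(port: number): Promise<{ port: number; ms: number }> {
  return new Promise<{ port: number; ms: number }>((resolve) => {
    const started = performance.now();
    const ws = new WebSocket(`ws://127.0.0.1:${port}`);
    const finish = () => resolve({ port, ms: performance.now() - started });
    ws.onerror = finish; // a closed port usually errors out almost immediately
    ws.onopen = finish;  // something actually answered the WebSocket handshake
    setTimeout(() => { ws.close(); finish(); }, 3000); // slow failure: open, but not speaking WebSocket
  });
}

Promise.all(portsToProbe.map(probePort)).then((results) => console.table(results));
```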
What in the world are they digging for?
Anything that can help advertisers. In this case they can get data about your wealth, and they can also assume that nearby devices belong to the same person or family. That’s some very useful data for unethical advertisers.
Interesting, I didn’t know about that. BleepingComputer has a good write-up on it (I’m assuming they broke the story): https://www.bleepingcomputer.com/news/security/ebay-port-scans-visitors-computers-for-remote-access-programs/
According to Nullsweep, who first reported on the port scans, they do not occur when browsing the site with Linux.
HA!
Is this related to how Linux does permissions?
Probably a user-agent check. They likely figured they’d get caught quicker if they scanned Linux users.
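Purely as an illustration of that guess (nobody outside eBay knows what their script actually checks), a gate like that only takes a couple of lines:

```typescript
// Illustrative only: a user-agent gate of the kind being speculated about here.
const ua = navigator.userAgent;
const isLinuxDesktop = ua.includes("Linux") && !ua.includes("Android");

if (isLinuxDesktop) {
  console.log("skip the localhost probe for Linux desktop user agents");
} else {
  console.log("run the localhost probe here");
}
```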
Hmm ok thanks for the information. I’ll look into that.
This is something new. Thanks for the info. Man we are not safe.
You can stop that (and many other things) with JShelter.
Any extensions or mitigations you use can themselves be detected and can make your browser/device fingerprint even more unique.
If I visit that page I get a “fingerprinting activity detected” warning from JShelter and then a mostly blank page with “FP ID: Computing…” at the top, and a bunch of JavaScript errors in the console.
Most sites are fine with the settings where I normally leave them, but it’s not much of a surprise that one devoted entirely to browser fingerprinting gets broken by JShelter. Stopping, or at least hindering, most fingerprinting attempts is among the things it does. It can’t stop all of them of course, but it’s one component that helps work against them.
JShelter removes the WebWorker API by default, and creepjs needs web workers to run. If you set just that protection to Strict instead of the default Remove, creepjs still works fine.
But creepjs could be modified to work without web workers, if you were thinking JShelter really does something to hide your fingerprint from someone who wants it badly enough. And you can still be fingerprinted in many other ways, even without JavaScript at all.
Yeah, my main browser is easily fingerprinted due to the many ways it is non-standard. I’ll use Tor Browser or something if it actually matters. But JShelter doesn’t really make that problem worse for most people, and it probably frustrates some fraction of attempts, apparently including those that rely on web workers.
The page load time of creepjs would not be acceptable for use in real life. Anything with that much creepy js is going to get itself blocked by other means.
The page load time of creepjs would not be acceptable for use in real life
Well, any site that uses fingerprinting tech, regardless of what it is, is just going to load it silently in the background, so I don’t think it would be noticeable anyway.
That depends on what’s making it take so long, among other things. But with sufficient effort I suppose the sneakier fingerprinters (those which aren’t already blocked by other extensions) could probably be made hard for unprepared users to notice. JShelter popping up a big warning about a “very high” level of fingerprinting activity is a pretty good hint though, and I take it as a suggestion to add some uBlock rules if I expect to visit that site again.
As fingerprinting continues to get more common, maybe it’s time to go back to using NoScript as well.
Mullvad Browser, uBlock, JShelter, Privacy Badger, NoScript
Whelp, adding this to my extension list. There is no webpage I visit that should need this info… I think. Thanks for the link.
It is WebRTC.
WebRTC has a separate toggle.
Not in Firefox-based browsers. Also, that’s the tech they use for scanning.
In about:config: media.peerconnection.enabled = false
It has a separate toggle in Chromium so I think these are 2 separate things.
It is possible, yes. Here’s a proof-of-concept implementation, and there are undoubtedly others out there.
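For a sense of what such proofs of concept typically build on: the classic WebRTC trick is to gather ICE candidates from a data-channel-only RTCPeerConnection, which historically exposed the machine’s private LAN address (modern browsers usually mask it with a random mDNS .local name, and the Firefox pref above kills it entirely). A minimal sketch:

```typescript
// Gather ICE candidates without needing any camera/mic permissions.
async function sniffLocalCandidates(): Promise<string[]> {
  const found: string[] = [];
  const pc = new RTCPeerConnection({ iceServers: [] });
  pc.createDataChannel("probe"); // a data channel is enough to force ICE gathering

  pc.onicecandidate = (ev) => {
    if (ev.candidate) {
      // e.g. "candidate:... 192.168.1.23 ..." on old browsers, "<uuid>.local" on current ones
      found.push(ev.candidate.candidate);
    }
  };

  await pc.setLocalDescription(await pc.createOffer());
  await new Promise((resolve) => setTimeout(resolve, 1000)); // let gathering finish
  pc.close();
  return found;
}

sniffLocalCandidates().then((candidates) => console.log(candidates));
```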
I guess I’ll switch to Chromium then
Except that Chromium and everything based on it sends information about your PC’s resource usage to Google sites, as far as I have heard.
I don’t use those sites on the devices with the highest threat model, so it should be fine. Hopefully.
I don’t know if it’ll work on Chromium or not. It’s worth a try.
Wasn’t it Google Drive that, once you installed it on a device on a network, would scan your entire network for other devices? I tried Googling for it, but then laughed realizing Google wouldn’t let that information continue to linger. Or I could just be wrong.
Is it maybe the case that the setting is for allowing/disallowing you to go to sites on your local network?
For example, your router’s admin page at “192.168.1.1” (example address), or a Raspberry Pi with a self-hosted service like Nextcloud, etc.
You can probably test whether my claim is true by trying to visit your router’s page with the setting enabled vs. disabled. (I am not using Chrome.)
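If you want to poke at it from the scripting side as well, you can open the dev console on some public site and fire a request at the router address. Rough sketch only: 192.168.1.1 is just the example address from above, and on an https page the http:// target may also get blocked as mixed content, which muddies the result:

```typescript
// With mode "no-cors" the page can't read the response, but whether the request
// is allowed out at all, and how quickly it fails, still tells you something.
const target = "http://192.168.1.1/"; // assumed router address; adjust for your own network
const started = performance.now();

fetch(target, { mode: "no-cors" })
  .then(() => console.log(`completed after ${Math.round(performance.now() - started)} ms`))
  .catch((err) => console.log(`blocked or failed after ${Math.round(performance.now() - started)} ms`, err));
```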
I don’t think websites have access to your local network through the browser’s JavaScript engine, but I may be wrong.