>After discussing this with several content blocking extension developers, we have decided to implement DNR and continue maintaining support for blocking webRequest.
We've seen this before, a number of times. This is what's put forth to head off a cacophony of pushback from developers and end-users alike. Both technologies will be supported for a short time, and then blocking webRequest will be removed altogether shortly thereafter.
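For context on what's being traded away: DNR (declarativeNetRequest) replaces an extension's ability to run code that inspects and vetoes each request with a static rule list that the browser itself evaluates. A minimal sketch of a Chrome MV3 DNR rule (the domain is just a placeholder):

```json
[
  {
    "id": 1,
    "priority": 1,
    "action": { "type": "block" },
    "condition": {
      "urlFilter": "||ads.example.com^",
      "resourceTypes": ["script", "image", "xmlhttprequest"]
    }
  }
]
```

A blocking webRequest listener, by contrast, can run arbitrary per-request logic (dynamic filter lists, full request context) before deciding to cancel; that is the capability ad blockers like uBlock Origin are built on.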
>We will support blocking webRequest until there’s a better solution which covers all use cases we consider important ... (emphasis mine)
And there it is.
In a short time we'll read a statement -- released on a Friday afternoon -- stating the EOL for blocking webRequest with little-to-nothing in the way of analogous behavior because it wasn't considered important by Mozilla (or, perhaps, the pressure from Google was too much).
That's my cynical, entirely pessimistic outlook of it all anyway.
I'd like to read Raymond Hill's thoughts on this.
However, if Mozilla intended to keep the option of blocking webRequest (or whatever's necessary to keep uBlock possible), it would also look like this. Not supporting the Chrome API isn't an option for Mozilla, because the entire point of WebExtensions is the admission that Chrome is the dominant browser, and extension makers will only consider porting to Firefox if it takes as little effort as possible. Hence they need to support the same APIs.
Given that uBlock Origin was the first extension they added support for in Firefox for Android, I'm relatively confident that they're intent on making sure it stays able to do its job.
At least the last time I developed a browser extension, this was not the case. I found that what was considered the WebExtension API was not 100% portable across browsers, and I had to rewrite parts of my code between Chrome and Firefox because Chrome had many extra APIs that Firefox lacked.
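A concrete (if simplified) illustration of that porting friction: Firefox exposes a promise-based `browser.*` namespace, while Chrome exposes a callback-based `chrome.*` one. A hypothetical feature-detection helper (`pickExtensionApi` is an invented name; `browser` and `chrome` are the real WebExtension globals) might look like:

```javascript
// Sketch of a cross-browser namespace shim. The globals it probes
// (`browser`, `chrome`) are the real WebExtension namespaces; the
// helper itself is hypothetical.
function pickExtensionApi(globalObj) {
  // Firefox (and newer Edge) expose the promise-based `browser` namespace.
  if (globalObj.browser && globalObj.browser.runtime) return globalObj.browser;
  // Chrome exposes the callback-based `chrome` namespace.
  if (globalObj.chrome && globalObj.chrome.runtime) return globalObj.chrome;
  return null; // Not running inside an extension context at all.
}
```

In practice most authors reach for Mozilla's webextension-polyfill rather than hand-rolling this, and even then, APIs that exist only in Chrome still force per-browser code paths.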
The sentence continues:
> since DNR as currently implemented by Chrome does not yet meet the needs of extension developers.
That's an admission that webRequest won't be removed until there is something better than DNR on the table.
Killing the webRequest API (without a real replacement) would absolutely cripple uBO on Chrome. The Firefox version having access to WebAssembly without an extra permission isn't preventing the Chrome version from working.
More about CNAME:
Hasn't that been the case for a while now?
About CNAME, I think this is what you're looking for:
They already disabled (private) extension support on the Windows version of Firefox (you have to upload your extension to Mozilla servers), and they completely disabled extension support on Firefox mobile (no, that small list of approved extensions isn't enough). I'm not sure they care much about their user base.
Have you noticed that public-use PCs (in e.g. libraries, hotel lobbies, kiosks) used to have Firefox installs, but now mostly have Chrome installs? It's not that the IT people setting these up are any less fans of FOSS than they used to be. It's that Chrome has, for a while now, only persistently loaded local extensions if they're declared in an MDM policy (the point being that the policy comes from a remote, trusted source), while Firefox has continued to trust the local filesystem. Public PCs with Firefox (and without some OS change-reversion tool like Faronics Deep Freeze) have been cesspools of malware extensions since ~2011.
Besides — if your private extension isn't malware, why not just upload it to Mozilla's servers, but never set it to public, instead just personally being its only user?
(What, are you burning evidence of illegal activities into browser extensions or something? That's bad engineering! Make your illegal activities runtime state!)
> they completely disabled extension support on Firefox mobile (no, that small list of approved extensions isn't enough).
That's the platforms' idiocy, not Mozilla's. Apple and now Google both say that all non-sandboxed executable code an app ever runs must be part of the app's submission package, to be checked over and approved by the platform owner.
That "small list of approved extensions" is a set of extensions that gets burned into the Firefox Mobile binary and audited by the platform-owner during publication. It's small because every extension on it is another chance for a Firefox update to be shot down during QA by an overzealous platform owner. The extensions in the approved set are the ones whose developers understand well exactly what's required to ship code as part of Firefox Mobile, and who therefore do what they can to appease the platform-owners in each release.
Because it's private. Because I don't want to register and maintain a Mozilla account. Because I can't trust my current network environment. Because I just don't want to deal with that code-signing shit. Why should I even need an internet connection in order to run my own scripts in my browser?
Assuming private means malware or illegality isn't very creative.
I specifically listed the Windows version of Firefox because generally I have no problem with that procedure at all. But on Windows they even removed the about:config-flag to disable it. That's very rude.
Snark: because browser extensions only run on HTTP(S)-scheme URLs, so you'll at least need an active (maybe virtual) NIC with a loopback interface and a webserver listening on it, in order to see your extension running.
Reality: you don't. You can still tell Firefox (or Chrome!) to trust a local extension via Group Policy, and it will. But only if your PC is actually bound to an AD domain, such that the GPO came from the AD controller rather than being something your computer created locally for itself. Because a virus can (get elevation and then) create a local GPO to allow itself.
(You don’t technically need an Internet connection for setting up AD binding, because your AD controller can actually live on a VM that runs on the computer it’s managing. Crazy, but it works. People who pirate Windows may be familiar with a tool I won’t name here, that’s essentially a tiny wire-protocol-mimic of a VM running Windows running an AD controller running a Key Management Server.)
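For reference, the Firefox side of this is its enterprise policy engine, delivered via GPO on Windows or a `policies.json` file elsewhere. A sketch of a policy that force-installs a local extension (the extension ID and file path are placeholders):

```json
{
  "policies": {
    "ExtensionSettings": {
      "my-extension@example.com": {
        "installation_mode": "force_installed",
        "install_url": "file:///opt/extensions/my-extension.xpi"
      }
    }
  }
}
```

As the comment above notes, such a policy is only worth trusting when it arrives through a channel (an AD controller, an MDM server) that local malware can't simply write to.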
> But on Windows they even removed the about:config-flag to disable it. That's very rude.
It's because Windows is where all the viruses that install malware extensions are. macOS/Linux could run the malware extensions just as well — the extensions are just trying to [hijack your browser to] show ads/mine bitcoin/add Amazon affiliate IDs to things, and all of that can be done in pure JS — but they can't run the viruses that install them (and they're uncommon enough on networks to have a low per-neighbour R0 even for viruses written specifically for them), so they're somewhat protected.
It is frankly shocking to see such a hardcore authoritarian viewpoint here.
Things that steal your password, track your information, and show you ads. Y’know, BonziBuddy-esque things. Things that have a registered signature in an antivirus database, with nobody complaining about that.
Let me rephrase my initial statement to make it more clear: 99+% of the usage of the private extension install API — whether you choose to measure by volume of installations, or by variety of software installed — is by malware. Just like usage of the modern POTS phone system is 50+% by scam callers. Both because of the sheer quantity and variety of bad actors, and how automated and wide-reaching their attacks are; but also because of the dwindling valid use-cases, as most users’ needs have since been met by other technologies that meet those needs better.
In the POTS phone case, most people have switched to using email/texting/collab software/IP video-calling, which is why it feels like every ordinary phone call you receive is a spam/scam call. In the browser extension case, most people with more complex needs — e.g. the people who thought Chrome Apps were a good idea, the people who needed their extensions to have a native-library component, etc. — have switched to building standalone apps that have a browser embedded in them, where their “extension” can run un-sandboxed on your OS, not sandboxed within the browser. Y’know, Electron and the like.
(And moving your extension to Electron actually solves a great amount of the problem, because now your functionality itself is considered a base part of an “app” that can be independently quarantined by the OS, installation-constrained by MDM/GPO, sandboxed with a capabilities manifest to just what its use-case needs rather than what an entire browser needs, etc. IT admins love the shift from extensions to purpose-built Electron apps.)
> hardcore authoritarian
I have nothing against having control over your own system. For modern security, though, having control over your own system truly means not letting it control itself, but rather having two systems, where one follows the other's whims via MDM.
The only truly-secure systems are ones where their choices are out of their own hands, and rather in some other system’s hands. Either interactively, via MDM; or non-interactively, via limited-capability ephemeral immutable-image infrastructure. Such systems have no entrypoint for a rootkit: in the immutable-image case, because the OS itself doesn’t have the physical capability to modify the boot volume (because it’s an NFS/iSCSI mount from a server that’s sharing it read-only; or because the OS kernel is a simplified unikernel that just doesn’t have “write to disk” as an implemented code path; or because it’s running as an instanced VM that disappears if it reboots.) And, in the MDM case, because there’s a chain of trust (signed BIOS → signed bootloader), where the bootloader will only boot a boot volume if that boot volume has been signed by the MDM controller; and where the keys for doing that aren’t on the device.
(If you like, instead of an additional MDM system, you could in theory have your boot-volume signing key held offline in a smart card/hardware wallet. No OS currently has the capability to use that offline signing key during the OS update flow, but it’d sure be neat if they started doing that.)
Is that authoritarian? In the same way as someone on a space station telling you not to open the airlock, perhaps.
As we move to a desktop world with strongly sandboxed processes (e.g. snap and flatpak), this argument will become less relevant, since it'll no longer be the case that dancingbunny.exe will run with the same credentials as the user himself and be able to plop an extension in a browser extension directory.
In a sandboxed-app world, local extensions will be much more likely to reflect the real intent of the user. I don't think we'll see a loosening of browser permissions, though: basically every web browser maker feels that it ought to keep extensions it considers bad, however desired, out of the hands of users.
> Besides — if your private extension isn't malware, why not just upload it to Mozilla's servers, but never set it to public, instead just personally being its only user?
Because extension-market web sites censor desired extensions that they think are bad. One example is Dissenter. Another example is "bypass paywalls". Browser makers have blacklisted both extensions despite their not being malware and despite real human users actually wanting to use these extensions. Why? Because the people at major browser vendors just don't like these extensions. Browser makers should not have the power to censor other people's programs. Fundamentally, it's the user's computer, not Mozilla's or Google's.
Have you ever uploaded an extension yourself? There’s more to these extension hosting portals than a public-facing website. There’s also an authenticated area for you to manage your own uploaded extensions, where extensions go through a set of states similar to a CMS: from “draft” to “private/unpublished”, to “private beta”, to “in review”, to “public.”
I'll say that again: in even the Chrome Web Store, "private beta" comes before platform review. You can publish an extension for just yourself plus a limited set of friends, give them the URL, and they can install it, without the platform owner ever even taking a glance at it. And they won't in the future, either, because the behaviour of private-beta extensions just doesn't matter to them. They can't spread like public extensions (they have explicitly limited install-count caps), so it doesn't even matter if they're malware.
But, just like with emulators on iOS, anyone who wants one can just take the source code, change the identifier on it, and deploy a copy of it to their own device for their own use, as a “test.” And unlike with iOS TestFlight, these private-beta browser-extension deployments don’t expire after a week.
Yes, that’s still a big hurdle to expect non-technical people to jump, compared to just allowing these extensions to get published. Yes, that puts Mozilla/Google/etc. in positions of censorship, where they may make arbitrary decisions.
But don’t just say “don’t do that then.” Carefully envision an alternative. What do you want the browser to do: let you add alternative extension trust-roots, like adding repos to apt? Then a virus could go in and add its trust-roots. (This would, again, be fine if it was only something the browser trusted if set in an MDM profile / GPO policy, signed with some remote key a virus couldn’t grab from the local OS; but it’s fundamentally not something the browser can trust if it’s just data sitting in the user’s personal preferences directory, or even the OS’s preferences.)
I’d be happy to see any solution to this problem that lets the locus of control remain within the local computer, while preventing viruses from tampering with it. Capability-based OSes were there decades ago, and some sandboxing approaches are kinda-sorta getting there. But nothing that consumers use is in any way there yet.
Why should browser makers get to decide which extensions are allowed to spread?
> let you add alternative extension trust-roots, like adding repos to apt? Then a virus could go in...
In an OS with app sandboxing, no, a virus can't just "go in" and add extensions. The way to stop malware from compromising the user's data is to sandbox the malware at the OS level, not to impose central control over third-party software development.
The fundamental gap here is in people continuing to think that any app can access all of a user's preferences directory. That's an obsolete and reckless model. In the modern world, every app runs in a sandbox and can affect only its own preferences directory.
Yes, but I'm talking about how the world works today — and today, viruses still find ways to run on OSes where they aren't forced into any kind of sandboxing.
Usually because they get spread in the form of some kind of system-automation script (which inherently runs with the kind of non-sandboxing that "gluing disparate apps together" requires) where "run embedded script with elevation" is an API call, so all they need to do is convince the user to accept the elevation prompt.
Today, in the here and now, we shouldn't be playing with knives. But maybe we can play with knives again when we learn to wear gloves.
I feel like I should say: Mozilla aren't an OS manufacturer. They can't fix the OS security model of all OSes, everywhere. They can only modify the thing they have control over (the browser) to be secure in the face of generally-broken OS security models.
But that's not even really the point, because this is as much about Evil Maid attacks as it is about viruses. Sandboxing doesn't prevent someone impersonating the user, with the user's full authority, from installing extensions on the system. They might not have the right keys to authenticate as an admin, but as long as they can authenticate as the user, they can act as the user to do the same things the user themselves would do, including installing user-specific malware browser extensions.
(And, maybe of more concern to home users, there are plenty of ways for a virus to impersonate the user, e.g. through accessibility APIs. Unlike superuser elevation prompts (which run on a different Desktop / as high-integrity / etc., and thus ignore other programs sending input events to them), there's no prompt for one program interacting with another program's files (à la macOS's recent Gatekeeper changes) that accessibility APIs can't be made to click through. Imagine how a user operating the OS with switch control would react to not being able to use switch control to accept an inter-app elevation prompt. Untenable, right?)
As for evil maid attacks: that's just paternalism. It's the user's responsibility not to grant unauthorized humans access to his user account. Trying to protect against the scenario in which a user willingly grants account access to somebody he shouldn't trust through artificial feature limitations is an unacceptable infringement on user autonomy.
See my reply to a sibling comment: it's because extensions in Firefox Mobile aren't really "extensions" (modules external to the download package) per se, but rather are built into the package pushed to the App Store / Google Play Store. This is forced by the platform owners, who want to vet all [non-sandboxed] code that ever runs in relation to a given app — which means all that code has to be in the app, amenable to static analysis / fuzz testing / malware signature scanning / human spot checks / whatever else the platform-owner wants to put the app-bundle through as part of vetting it for release.
The alternative model Firefox Mobile could do, is to build an extension model where the extensions are each their own app-like things that get submitted to the App Store + Play Store. (Think iOS Safari's "iMessage Sticker Pack" app-bundles.)
This, though, has three problems: first, it takes the extension trust model out of Mozilla's hands (so, depending on the API, Mozilla potentially wouldn't be able to block what they consider to be malware extensions, without disabling all extensions.) Second, it forces extension authors to go through the platform owners to publish, which removes the "write once, publish once" flow that WebExtensions currently have.
But the third and most important problem, is that it requires these platforms to add one-off support for the new "type" of purchasable item within their stores — which they might not necessarily be willing to do.
Maybe Mozilla has already tried and failed to get the platforms to do so already. Or maybe — I'd guess this is more likely — they're still working on it, and the current situation is a stop-gap until then.
> It's the user's responsibility not to grant unauthorized humans access to his user account.
If you're the CIO of a corporation; and the frequency at which your company's employees fall victim to social-engineering attacks is a literal KPI that your job security is determined by; and you can't drive that number down by just saying "don't hire people who aren't paranoid about OPSEC", because there aren't enough paranoid-mindset people in the world to actually fill all the available positions in your company; then what are you going to do?
There's an analogous situation you're in when you're an able-bodied adult, advising an elderly person or a child or a very new computer-user of what software to install.
Yes, that's literal paternalism. Some people need someone else to be their IT admin. I really wish it was easier to set up home-OS computers during initial setup so that they were one-off MDM managed by some other person somewhere, without that person needing to run an MDM server.
But as that isn't easy — you need to actually have an MDM server set up, and bind them to it, and everything gets screwed up if their computer can't reach the domain for a month — then the closest thing you can do, is to just tell them to install software that enacts the same sort of MDM-restrictions you'd enact, for you.
The trend in the industry has been to ignore the existence of the individual home-user "power user" — i.e. the person who wants to be the sysadmin of their own computer. In the modern security model most OSes and apps subscribe to, there are three groups / user-stories:
1. home users who aren't (and don't want to be) sysadmins, but who don't have anyone else to be their sysadmin
2. corporate employees who have a real sysadmin managing their computer
3. the corporate sysadmins themselves
The modern setup is to give group 3 complete control over group 2, and to restrict the actions of group 1 about the same way as if the software ISV itself was the user's sysadmin (but to take these de-facto restrictions away for group 2, instead trusting group 3 to make the right decisions concerning them.)
This works well-enough to prevent the things the ISVs want to prevent — the spread of viruses/malware, etc. And it works well-enough for most users as well.
For the users that want to be their own sysadmin, their situation is annoying, but not impossible: they have to opt out of being a member of group 1, and instead join both groups 2 and 3: make their computer think it's managed by some corporate sysadmin, and then also be that corporate sysadmin.
We could make this much easier for these people... but honestly, there aren't that many of them. It's a good-enough solution, especially since most of these "power users" are exactly the kind of people who can follow extended lists of arcane instructions to put their systems into this externally-self-managed state.
It's also why I'm very interested in hearing what thoughts, if any, Raymond Hill (author of uBlock Origin) has on this matter.
Yes. I read the announcement totally differently. They're keeping blocking webRequest because there's no other way to make good ad blockers, and it will stay that way until there's a way to get them working in a new API.
This is good news. Great news, even.
>I'd like to read Raymond Hill's thoughts on this.
Would not be surprised if he was one of the developers mentioned in the article...
1/ Of course they will sell it as API unification, for the greater good of the web's interoperability. Or maybe for safety, as this is Google's current argument for killing all sources of user-defined (a.k.a. not blessed by Google and the extension store) code running in Chrome. Google is framing the issue as "disabling Remote Scripts"; of course, they are remote to Google, but local to users, as they sit on the user's HDD.
And you assume that nobody will fork it to keep webRequest. This seems strange, as Firefox is itself a fork of Netscape, created because the dominant browser was horrible to use.
Yes? It would be the... third or fourth, I can't remember anymore, mass extinction event for extensions, and be consistent with previous instances of sidelining functionality before killing it (say, tab groups, which were native functionality, then moved to an extension, and then completely killed off in one of the aforementioned extinction waves). Previous behavior is an excellent reason to be cynical.
> And you assume that nobody will fork it to keep webRequest. This seems strange as Firefox is itself a fork of Netscape created because the dominant browser was horrible to use.
I guess someone will, but they'll always lag, and accrue security flaws and bugs and missing features (e.g. Pale Moon), since modern web standards helpfully move too fast to keep up with unless you're huge and/or making only the most minor of changes (and even then, really; IceCat failing to keep up is a reasonable argument that it's impractical).
No, I'm not sure how you got that idea. There are plenty of instances in the past where Mozilla/Firefox has been forked.
Edit: I know Mozilla said they are waiting for a better alternative, but any alternative that can be proposed will end up being less powerful than what we currently have.
uBlock Origin just happens to be important and trusted enough that these limitations should not be imposed on it.
Now that would be an interesting and pro-user move that sets their browser apart from others.
But it might piss off Google a little too much, which is probably why they've not made that or a similar move long before now. One of FF's earliest differentiators, before it was even called Firefox, was a form of ad blocking, after all (pop-up and pop-under blocking, which at the time mostly meant blocking really annoying ads).
I think it is ultimately necessary due to the incentives at play that the adblocking technology can be delivered by any third party.
As long as Mozilla aren't themselves in ad sales/brokerage I wouldn't be worried about a browser shipping an ad-blocker, as far as incentives go. Google doing it, that'd be concerning. Mozilla? Good.
[EDIT] though actually this is another case of their relationship with Google being kinda crippling, since that does introduce a conflict of interest... which is part of why Google does it, I'm sure.
Forget Google, you might just get blocked by UserAgent from a lot of sites if they know Firefox has ad blocking and the default config is to block all their advertising.
There's a fine line between the status quo, pushing the envelope, and jumping so far ahead you end up being counterproductive.
Just pretend to be Chrome.
If you're saying Firefox should just pretend to be Chrome by default, well what more sign is needed to indicate irrelevance than being able to be shunned without consequence and having to pretend to be another.
Yes. I think user agents should not even be identifiable to begin with. Browsers should always pretend to be the most common browser at all times. This improves privacy and also prevents websites from discriminating against users.
Ideally there should be absolutely no way to detect which browser the user is running.
> well what more sign is needed to indicate irrelevance than being able to be shunned without consequence and having to pretend to be another
Being shunned is a good thing. Firefox is user-centric and therefore actively hostile to abusive websites. It stops just short of directly threatening their business models, a stance I think they should grow out of.
The fact they want to block it is evidence that it's working. It's not a sign of irrelevance, it's a sign the industry is taking us very seriously and actively working to undo our progress. We must develop and deploy every possible countermeasure to prevent them from doing so.
I agree with your ideal, but I'm also trying to acknowledge what I see as reality. I think this would be seen as an excuse to view Firefox as hostile, and sites would discriminate against it. And it's trivial to actually detect a browser with a different JS and DOM rendering engine.
> The fact they want to block it is evidence that it's working.
I think that's a naive sentiment at best, which people use as an excuse to pursue their own agendas and proclivities at the expense of the agenda they purport to support.
Who did more for civil rights and increasing the standing of black Americans, MLK Jr., the Black Panthers, or the Black Liberation Army?
Progress is often best achieved through measured steps, not extreme changes. You don't train for a marathon, when not already a runner, by just running the length of a marathon one day; that would most likely damage you and be counterproductive by setting you back. Instead, you make measured steps towards a goal and eventually you get there.
I think this is fine. We are hostile. Maybe we should just embrace it.
We literally don't want sites profiting off of our attention or personal information. We don't want to enter into a compromise; that would allow them to continue. We want them to stop. We couldn't care less how much money they lose.
> And it's trivial to actually detect a browser with a different JS and DOM rendering engine.
Right now, yes. I expect this situation to improve though. Curbing fingerprinting for the sake of privacy will necessarily entail normalizing all browsers to the point they can no longer be used as identifying bits of information. That will destroy the ability to detect which browser the user is running.
> Progress is often best achieved through measured steps, not extreme changes.
That's true but there's nothing wrong with a revolution. We have a huge advantage in the form of control over our computers and what software they run. There is no need to compromise.
Once you're seen as hostile, you'll see countermeasures applied that target you. You don't get to act with impunity, and since Firefox is not coming from a place of power (their market share isn't enough to force the issue), Firefox will take a lot of damage. Possibly enough to kill it as a choice for most people.
> expect this situation to improve though. Curbing fingerprinting for the sake of privacy will necessarily entail normalizing all browsers to the point they can no longer be used as identifying bits of information.
It's impossible to do so without making the browsers exactly the same. The browser you are running, possibly even the version of the browser you are running, will almost always be trivially detectable as long as there are actually different browsers to choose from, with actually different implementations.
Think about it: even if the actual API surface is identical in usage and result, there will still be differences in how the underlying code runs, with differing performance and timing characteristics. We live in an age where it's best practice to do security operations in constant time, just so something like password auth can't be used to determine someone's real password by how quickly it fails across many attempts. Without throwing away all performance criteria for everything, and relying on other browsers to do so as well (it does no good to hamstring yourself as the only one acting this way), you can never rely on the browser you're using being anonymous if you want any JS usage at all. My guess is something similar applies to DOM and CSS as well, even if somewhat harder to capitalize on.
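To make the timing argument concrete, here is a minimal sketch of the kind of measurement a fingerprinting script could take. The workload is an arbitrary illustration; real fingerprinters combine many such probes and compare against per-engine timing profiles:

```javascript
// Time a fixed JS workload. Different engines (V8, SpiderMonkey,
// JavaScriptCore) optimize differently, so the distribution of timings
// can leak engine identity even when every API surface looks identical.
function timeWorkload(iterations) {
  const start = Date.now();
  let acc = 0;
  for (let i = 0; i < iterations; i++) {
    acc += Math.sqrt(i) * Math.sin(i); // arbitrary numeric workload
  }
  const elapsed = Date.now() - start;
  return { elapsed, acc }; // return acc so the loop can't be optimized away
}

// A fingerprinter would repeat this many times and compare the timing
// distribution against known per-engine profiles.
const samples = [];
for (let i = 0; i < 5; i++) samples.push(timeWorkload(100000).elapsed);
```

Defeating this class of probe requires coarsening or jittering timers (which browsers have partially done since Spectre), at a real cost to legitimate uses of high-resolution timing.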
> We have a huge advantage in the form of control over our computers and what software they run.
That advantage is worth nothing in the larger scheme of things. Facebook doesn't care what you do. But they do care about Firefox affecting ~7% of people visiting, and if Facebook and a few other high profile sites both ban Firefox and start up with their own rhetoric and narrative about Firefox, Firefox is dead. Other sites will use that as an excuse to do the same. At a minimum people that want access to these sites will install a separate browser, but most will likely just switch browsers, either initially or eventually.
Then we're left with the best current choice for privacy dead, and everything has been set back years because not only have people switched to less protective browsers, but there's been a clear showing of power and others are going to be less likely to push against that for fear of the same outcome (and let's be honest, Chrome has at best mixed incentives).
> That's true but there's nothing wrong with a revolution.
A revolution you can't win is a disaster for all those except those in power, who now have convenient targets to go after.
This is already the case for uBlock Origin and filtering extensions in general. Sites deploy blocker blockers. Users deploy blocker blocker blockers. It's a never-ending arms race at this point.
> and since Firefox is not coming from a place of power (their market share isn't enough to force the issue), Firefox will take a lot of damage. Possibly enough to kill it as a choice for most people.
It doesn't have to force anything. It just has to keep working despite any interference. If uBlock Origin can get away with it, surely Firefox can too.
> Think about it, even if the actual API space is identical in usage and result, there will still be differences in how the underlying code run, with differing performance and timing characteristics.
Surely there are ways to mitigate these side channel attacks. If it can be done in other systems, it can be done in the browser as well.
> Without throwing away all performance criteria for everything and relying on other browsers to as well
It doesn't mean throwing away performance, it means matching the performance and output of the more popular browsers.
> But they do care about Firefox affecting ~7% of people visiting, and if Facebook and a few other high profile sites both ban Firefox and start up with their own rhetoric and narrative about Firefox, Firefox is dead.
Probably not. We already have at least one browser with ad blocker included and enabled by default even on mobile: Brave browser. It doesn't just block ads, it replaces them with its own advertising system in order to steal their market share. Not sure if it's possible to be more hostile than this and yet Brave browser is growing.
Great, and now websites are actively incentivized to not support Firefox, because they'll know Firefox users generate zero advertising revenue by default.
I totally agree with this. At this point uBlock Origin's technology is so important and essential it should be a standard feature of every browser. I've posted this many times before. People usually say that uBlock Origin is better off independent because Mozilla is funded by Google.
As far as browser technology is concerned, extensions like uBlock Origin are so important they should be part of every browser. I would like to see even deeper integration and more powerful filtering features. I do agree that we need the right incentives for such a thing to happen. We can't have an advertising company in charge of ad blocker code.
Maybe then just don't install the extension -> voilà, zero access given! Of course this is stupid advice, because obviously you want the working extension for some reason. Just like others want their extension to be able to do whatever it needs to get its job done.
If Google screws this up, more effective ad blocking extensions could end up being a good, concrete way for Firefox to differentiate itself from other browsers.
Running from the same artifact is a bit harder as Chrome will only accept extensions signed by Google from the Chrome Web Store. I guess Firefox could let you install them, but then it leaves the possibility of installing an extension that is actually incompatible and confusing users.
Improve performance, reduce bugs, and stop trying to badly reimplement existing community extensions. The UI also does NOT need another redesign.
Or maybe we just need a fresh new browser
The only other companies doing it are among the richest companies in the world (Google and Apple). Wishing for fewer bugs and better performance isn't invalid, but you have to put the stuff Mozilla is doing into perspective. There are only so many things you can do.
The addons post shows a compromise in trying to keep the browser compatible (even if that means adopting the `chrome.*` namespace and implementing stuff from Google's wishlist). But it also shows that where it matters — where Google wants to abuse their market position — they diverge from the spec.