Closing this as we are no longer pursuing Swift adoption

2026-02-18 23:08 · github.com


@ADKaster

List of issues preventing moving forward on moving Swift 6.0 support out of an experimental state:

Swift issues:

  • swiftlang/llvm-project#8998

    Details: Swift's version of LLVM is missing the fix for ...

CMake issues:

  • https://gitlab.kitware.com/cmake/cmake/-/issues/26174

    Details: Swift + Ninja doesn't respect CMAKE_OSX_DEPLOYMENT_TARGET. This results in a mismatched LC_BUILD_VERSION on Swift and C++ object files, spamming the console with warnings.

    Workaround:

    # FIXME: https://gitlab.kitware.com/cmake/cmake/-/issues/26174
    if (APPLE)
        set(CMAKE_Swift_COMPILER_TARGET "${CMAKE_SYSTEM_PROCESSOR}-apple-macosx${CMAKE_OSX_DEPLOYMENT_TARGET}")
    endif()
  • https://gitlab.kitware.com/cmake/cmake/-/issues/26175

    Details: With CMP0157 enabled, swiftc does not set install_name directory to "@rpath" per CMAKE_INSTALL_NAME_DIR

    Workaround:

    # FIXME: https://gitlab.kitware.com/cmake/cmake/-/issues/26175
    if (APPLE)
        add_custom_command(TARGET LibGfx POST_BUILD
            COMMAND install_name_tool -id @rpath/liblagom-gfx.0.dylib "$<TARGET_FILE:LibGfx>"
        )
    endif()

    PR: https://gitlab.kitware.com/cmake/cmake/-/merge_requests/9692, merged Aug 2, 2024, with backports planned for CMake 3.29 and 3.30.

  • https://gitlab.kitware.com/cmake/cmake/-/issues/26195

    Details: Imported targets from dependencies can have INTERFACE_COMPILE_OPTIONS or INTERFACE_LINK_OPTIONS that swiftc doesn't understand.

    Workaround: Swizzle the flags just after import, for every single imported library.
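
    A sketch of what that swizzle can look like, following the style of the workarounds above. The function, target, and flag names here are hypothetical (not from the issue); the actual list of offending flags depends on each dependency.

    ```cmake
    # FIXME: https://gitlab.kitware.com/cmake/cmake/-/issues/26195
    # Hypothetical helper: strip options swiftc can't parse from an imported target.
    function(swizzle_swift_incompatible_flags target)
        get_target_property(options ${target} INTERFACE_COMPILE_OPTIONS)
        if (options)
            # Illustrative flags only; each dependency exports its own set.
            list(REMOVE_ITEM options "-fno-exceptions" "-fno-rtti")
            set_target_properties(${target} PROPERTIES INTERFACE_COMPILE_OPTIONS "${options}")
        endif()
    endfunction()
    ```

    The pain point is that this has to be applied to every single imported library, which is why it is listed as a blocker rather than a fix.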

Ladybird issues:

Nice-to-have:

Open questions:

  • Unclear how to pass view types or byte slices to Swift without creating a copy.

    • We will want to be passing untrusted Strings, or C++-owned Spans of bytes, to Swift for it to crunch on and return some structured data. It's not clear how to inform Swift about this without copying the data (at least) once.
    • I was not able to massage Swift into interpreting our String and StringView types as 'CxxConvertibleToContainer' or 'CxxRandomAccessContainer' types. Likely because they are actually immutable?
  • Unclear how to convince Swift that our types are just as good as std:: ones.

    • AK::Optional
    • AK::HashTable/HashMap
    • AK::Time
    • more?
  • How to integrate with our garbage collector? https://forums.swift.org/t/ladybird-browser-and-swift-garbage-collection/76084


Comments

  • By incognitojam 2026-02-18 23:32 · 1 reply

    The commit removing Swift has a little bit more detail:

        Everywhere: Abandon Swift adoption
    
        After making no progress on this for a very long time, let's acknowledge
        it's not going anywhere and remove it from the codebase.
    
    https://github.com/LadybirdBrowser/ladybird/commit/e87f889e3...

  • By miffy900 2026-02-19 2:44 · 5 replies

    As someone who first began using Swift in 2021, after almost 10 years in C#/.NET land, I was already a bit grumpy at how complex C# was (it was 21 years old at that point). But coming to Swift, I couldn't believe how complex it was compared to C#. Swift was released in 2014, so it would've been 8 years old in 2022. How is a language less than half the age of C# MORE complex than C#?

    And this was me trying to use Swift for a data access layer + backend web API. There's barely any guidance or existing knowledge on using Swift for backend APIs, let alone a web browser of all projects.

    There's no precedent or existing implementation you can look at for reference; known best practices in Swift are geared almost entirely towards using it with Apple platform APIs, so tons of knowledge about using the language itself simply cannot be applied outside the domain of building client-running apps for Apple hardware.

    To use Swift outside its usual domain is to become a pioneer and try something truly untested. It was always a longshot.

    • By VerifiedReports 2026-02-19 3:53 · 5 replies

      I started using it around 2018. After being reasonably conversant in Objective-C, I fully adopted Swift for a new iOS app and thought it was a big improvement.

      But there's a lot of hokey, amateurish stuff in there... with more added all the time. Let's start with the arbitrary "structs are passed by value, classes by reference." And along with that: "Prefer structs over classes."

      But then: "Have one source of truth." Um... you can't do that when every data structure is COPIED on every function call. So now what? I spent so much time dicking around trying to conform to Swift's contradictory "best practices" that developing became a joyless trudge with glacial progress. I finally realized that a lot of the sources I was reading didn't know WTF they were talking about and shitcanned their edicts.

      A lot of the crap in Swift and SwiftUI reminds me of object orientation, and how experienced programmers arrived at a distilled version of it that kept the useful parts and rejected the dumb or utterly impractical ideas that were preached in the early days.

      • By ChrisMarshallNY 2026-02-19 6:47 · 1 reply

        I think Swift was developed to keep a number of constituencies happy.

        You can do classic OOP, FP, Protocol-Oriented Programming, etc., or mix them all (like I do).

        A lot of purists get salty that it doesn’t force implementation of their choice, but I’m actually fine with it. I tend to have a “chimeric” approach, so it suits me.

        Been using it since 2014 (the day it was announced). I enjoy it.

        • By alper 2026-02-19 8:43 · 2 replies

          No Swift was developed as a strategic moat around Apple's devices. They cannot be dependent on any other party for the main language that runs on their hardware. Controlling your own destiny full stack means having your own language.

          • By TazeTSchnitzel 2026-02-19 10:12

            Apple already had that "strategic moat" with Objective-C. It was already a language you could effectively only use on Apple platforms (the runtime and the standard library only run on Darwin) and for which Apple controlled the compiler (they have their own fork of Clang).

          • By ChrisMarshallNY 2026-02-19 11:18

            I suspect that it was developed, in order to make native development more accessible. SwiftUI is also doing that.

            They want native, partly as a “moat,” but also as a driver for hardware and services sales. They don’t want folks shrugging and saying “It doesn’t matter what you buy; they’re all the same.”

            I hear exactly that, with regard to many hybrid apps.

      • By zffr 2026-02-19 6:27 · 1 reply

        Prefer structs over classes != only use structs.

        There are plenty of valid reasons to use classes in Swift. For example if you want to have shared state you will need to use a class so that each client has the same reference instead of a copy.

      • By raw_anon_1111 2026-02-19 4:14 · 1 reply

        > But there's a lot of hokey, amateurish stuff in there... with more added all the time. Let's start with the arbitrary "structs are passed by value, classes by reference." And along with that: "Prefer structs over classes."

        This is the same way that C# works - and C and C++. Why is this a surprise?

        • By AlexandrB 2026-02-19 4:41 · 1 reply

          Neither C++ nor C pass classes by reference by default (what even is a C "class" other than a struct?).

          • By raw_anon_1111 2026-02-19 5:01

            You are correct - it’s been ages since I’ve done C. The distinction is in C#.

      • By sumuyuda 2026-02-19 8:02 · 2 replies

        > when every data structure is COPIED on every function call

        Swift structs use copy on write, so they aren’t actually copied on every function call.

        • By Epskampie 2026-02-19 8:26

          They are, as far as "Have one source of truth" is concerned. That is what parent is talking about.

        • By amomchilov 2026-02-19 13:47

          They don’t, by default. That’s something you have to implement yourself.

          It’s a common misconception that comes from the standard library data structures, which almost all do implement CoW

      • By fingerlocks 2026-02-19 6:21 · 1 reply

        Nowhere does it say structs provide "one source of truth". It says the opposite, actually: that classes are to be used when unique instances are required. Every class instance has a unique identity, which is simply its virtual memory address. Structs by contrast get memcpy'd left and right and have no uniqueness.

        You can also look at the source code for the language if any of it is confusing. It's very readable.

        • By mvdtnz 2026-02-19 7:42 · 1 reply

          You're re-stating his exact problem while trying to refute him.

          • By fingerlocks 2026-02-19 21:06 · 1 reply

            No, I’m not. OP is conflating multiple guidelines for different purposes and attempting to use them all simultaneously.

            • By zephen 2026-02-20 9:03 · 1 reply

              > OP is ... attempting to use [multiple guidelines] simultaneously.

              No, he's literally explaining why the guidelines can't be used simultaneously.

              • By fingerlocks 2026-02-21 12:24 · 1 reply

                They aren't supposed to be used simultaneously, which is literally what my comment was explaining. Different guidelines to solve different problems.

                • By zephen 2026-02-21 16:34

                  Have one source of truth is a universal guideline.

                  Prefer structs over classes is a universal, if weak, guideline.

                  It's funny how people can be all hung up on composability of things like type systems, and then completely blow off the desire for composability of guidelines.

    • By librasteve 2026-02-19 8:11 · 4 replies

      In recent years, simplistic languages such as Python and Go have "made the case" that complexity is bad, period. But when humans communicate expertly in English (Shakespeare, JK Rowling, etc.) they use its vast wealth of nuance, shading and subtlety to create a better product. Sure, you have to learn all the corners to have full command of the language, to wield all that expressive power (and newcomers to English are limited to the shallow end of the pool).

      But writing and reading are asymmetrical, and a more expressive language used well can expose the code patterns and algorithms in a way that is easier for multiple maintainers to read and comprehend. We need to match the impedance of the tool to the problem. [I paraphrase Larry Wall, inventor of the gloriously expressive https://raku.org]

      • By grey-area 2026-02-19 8:54 · 4 replies

        Not sure how I feel about Shakespeare and JK Rowling living in the same parenthesis!

        Computer languages are the opposite of natural languages - they are for formalising and limiting thought, the exact opposite of literature. These two things are not comparable.

        If natural language was so good for programs, we’d be using it - many many people have tried from literate programming onward.

        • By palata 2026-02-19 14:45

          Natural languages are ambiguous, and that's a feature. Computer languages must be unambiguous.

          I don't see a case for "complex" vs "simple" in the comparison with natural languages.

        • By librasteve 2026-02-19 9:06 · 2 replies

          I fully accept that formalism is an important factor in programming language design. But all HLLs (well, even ASM) are a compromise between machine speak (https://youtu.be/CTjolEUj00g?si=79zMVRl0oMQo4Tby) and human speak. My case is that the current fashion is to draw the line at an overly simple level, and that there are ways to wrap the formalism in more natural constructs that trigger the parts of the brain that have evolved to handle language (nouns, verbs, adverbs, prepositions and so on).

          Here's a very simple, lexical declaration made more human-friendly by use of the preposition `my` (or `our` if it is package-scoped)...

            my $x = 42;

          • By j_w 2026-02-19 16:29 · 1 reply

            How is that snippet any better than:

            x := 42

            Or

            let x = 42

            Or

            x = 42

            It seems like a regression from modern languages.

            • By bmn__ 2026-02-23 15:31 · 2 replies

              "my" is 33% shorter than "let"

              Examples 1 and 3 are not declarations, so apples ↔ oranges.

              • By yossi_peti 2026-02-23 16:16

                Example 1 is a declaration in Go. Example 3 is a declaration in Python.

              • By j_w 2026-02-23 16:18

                my $x = 42;

                let x = 42

                Well, when you add in the '$' and ';' tokens the "let" example is still shorter. Also as another person replied to you, those other two examples are declarations in other languages. So 0 for 3 there.

          • By grey-area 2026-02-20 8:01

            Have you looked at all the previous attempts?

            Your example is not compelling I’m afraid but you should try building a language to see. Also read literate programming if you haven’t already.

        • By h3lp 2026-02-20 17:11 · 1 reply

          Literate programming is not about programming in natural languages: it's about integrating code (i.e. the formal description in some DSL) with the meta-code such as comments, background information, specs, tests, etc.

          BTW, one side benefit of LP is freedom from arbitrary structure of DSLs. A standard practice in LP is to declare and define objects in the spot in which they are being used; LP tools will parse them out and distribute to the syntactically correct places.

          • By grey-area 2026-02-20 19:49

            Well I think the ambition was to have as much as possible in natural language, with macros calling out to ‘hidden’ code intended for machines. So I do think there is a good link with later attempts to write using natural language and make computer languages more human-friendly and he was one of the first to have this idea.

            Neither strategy has had much success IMO.

        • By tom_m 2026-02-19 15:39

          Exactly. I mean think about the programming languages used in aircraft and such. There's reasons. It all depends on what people are willing to tolerate.

      • By integralid 2026-02-19 9:52 · 2 replies

        >But writing and reading are asymmetrical and a more expressive language used well can expose the code patterns and algorithms in a way that is easier for multiple maintainers to read and comprehend.

        It's exactly the opposite. Writing and reading are asymmetrical, and that's why it's important to write code that is as simple as possible.

        It's easy to introduce a lot of complexity and clever hacks, because as the author you understand it. But good code is readable for people, and that's why very expressive languages like perl are abhorred.

        • By librasteve 2026-02-19 10:11

          > Writing and reading are asymmetrical, and that's why it's important to write code that is as simple as possible.

          I 100% agree with your statement. My case is that a simple language does not necessarily result in simpler and more readable code. You need a language that fits the problem domain and that does not require a lot of boilerplate to handle more complex structures. If you are shoehorning a problem into an overly simplistic language, then you are fighting your tool. OO for OO. FP for FP. and so on.

          I fear that the current fashion for very simple languages is the result of confusing these aspects, and a way of enforcing certain corporate behaviours on coders. Perhaps that has its place, e.g. Go at Google - but the presumption that one size fits all is quite a big limitation for many areas.

          The corollary of this is that richness places a burden of responsibility on the coder not to write code golf. But tbh you can write bad code in any language if you put your mind to it.

          Perhaps many find richness and expressivity abhorrent - but to those of us who like Larry's thinking it is a really nice, addictive feeling when the compiler gets out of the way. Don't knock it until you give it a fair try!

        • By valenterry 2026-02-26 7:51

          Then you should write assembly only. Like `MOV`, `ADD`... can't really get simpler than that.

          Problem is, that makes every small part of the program simple, but it increases the number of parts (and/or their interaction). And ultimately, if you need to understand the whole thing it's suddenly much harder.

          Surely you can write the same behaviour in a "clever" (when did that become a negative attribute?) or "good" way in assembly. You are correct. But that's a different matter.

      • By spongebobism 2026-02-19 10:22 · 1 reply

        Perlis's 10th epigram feels germane:

        > Get into a rut early: Do the same process the same way. Accumulate idioms. Standardize. The only difference(!) between Shakespeare and you was the size of his idiom list - not the size of his vocabulary.

        • By librasteve 2026-02-19 15:16 · 1 reply

          Well sure - being in a rut is good. But the language is the medium in which you cast your idiom, right?

          Here's a Python rut:

            n = 20  # how many numbers to generate
            a, b = 0, 1
            for _ in range(n):
              print(a, end=" ")
              a, b = b, a + b
            print()
          
          Here's that rut in Raku:

            (0,1,*+*...*)[^20]
          
          I am claiming that this is a nicer rut.

          • By zephen 2026-02-19 17:48 · 2 replies

              seq = [0,1]
              while len(seq) < 20:
                  seq.append(sum(seq[-2:]))
              print(' '.join(str(x) for x in seq))
            
            > I am claiming that (0,1,*+*...*)[^20] is a nicer rut.

            If it's so fantastic, then why on earth do you go out of your way to add extra lines and complexity to the Python?

            • By jaen 2026-02-19 20:50 · 1 reply

              Complexity-wise, this version is more complicated (mixing different styles and paradigms), and it's barely fewer tokens. Lines of code don't matter anyway; cognitive load does.

              Even though I barely know Raku (but I do have experience with FP), it took way less time to intuitively grasp what the Raku was doing, vs. both the Python versions. If you're only used to imperative code, then yeah, maybe the Python looks more familiar, though then... how about riding some new bicycles for the mind.

              • By zephen 2026-02-20 4:43 · 1 reply

                > Complexity-wise, this version is more complicated (mixing different styles and paradigms)

                Really? In the other Python version the author went out of his way to keep two variables, and shit out intermediate results as you went. The raku version generates a sequence that doesn't even actually get output if you're executing inside a program, but that can be used later as a sequence, if you bind it to something.

                I kept my version to the same behavior as that Python version, but that's different than the raku version, and not in a good way.

                You should actually ignore the print in the python, since the raku wasn't doing it anyway. So how is "create a sequence, then while it is not as long as you like, append the sum of the last two elements" a terrible mix of styles and paradigms, anyway? Where do you get off writing that?

                > Lines of code don't matter anyway, cognitive load does.

                I agree, and the raku line of code imposes a fairly large cognitive load.

                If you prefer "for" to "while" for whatever reason, here's a similar Python to the raku.

                  seq = [0,1]
                  seq.extend(sum(seq[-2:]) for _ in range(18))
                
                The differences are that it's a named sequence, and it doesn't go on forever and then take a slice. No asterisks that don't mean multiply, no carets that don't mean bitwise exclusive or.

                > If you're only used to imperative code, then yeah, maybe the Python looks more familiar, though then... how about riding some new bicycles for the mind.

                It's not (in my case, anyway) actually about imperative vs functional. It's about twisty stupid special symbol meanings.

                Raku is perl 6 and it shows. Some people like it and that's fine. Some people don't and that's fine, too. What's not fine is to make up bogus comparisons and bogus implications about the people who don't like it.

                • By jaen 2026-02-20 9:41 · 1 reply

                  Reminds me a bit of the fish anecdote told by DFW... they've only swum in water their entire life, so they don't even understand what water is.

                  Here are the mixed paradigms/styles in these Python snippets:

                  - Statements vs. expressions

                  - Eager list comprehensions vs. lazy generator expressions

                  - Mutable vs. immutable data structures / imperative reference vs. functional semantics

                  (note that the Raku version only picks _one_ side of those)

                  > seq.extend(sum(seq[-2:]) for _ in range(18))

                  I mean, this is the worst Python code yet. To explain what this does to a beginner, or even intermediate programmer.... oooooh boy.

                  You have the hidden inner iteration loop inside the `.extend` standard library method driving the lazy generator expression with _unspecified_ one-step-at-a-time semantics, which causes `seq[-2:]` to be evaluated at exactly the right time, and then `seq` is extended even _before_ the `.extend` finishes (which is very surprising!), causing the next generator iteration to read a _partially_ updated `seq`...

                  This is almost all the footguns of standard imperative programming condensed into a single expression. Like ~half of the "programming"-type bugs I see in code reviews are related to tricky temporal (execution order) logic, combined with mutability, that depend on unclearly specified semantics.

                  > It's about twisty stupid special symbol meanings.

                  Some people program in APL/J/K/Q just fine, and they prefer their symbols. Calling it "stupid" is showing your prejudice. (I don't and can't write APL but still respect it)

                  > What's not fine is to make up bogus comparisons and bogus implications about the people who don't like it.

                  That's a quite irrational take. I didn't make any bogus comparisons. I justified or can justify all my points. I did not imply anything about people who don't like Raku. I don't even use Raku myself...

                  • By zephen 2026-02-20 19:38 · 2 replies

                    > You have the hidden inner iteration loop inside the `.extend` standard library method driving the lazy generator expression with _unspecified_ one-step-at-a-time semantics

                    That's why it wasn't the first thing I wrote.

                    > To explain what this does to a beginner, or even intermediate programmer.... oooooh boy.

                    As if the raku were better in that respect, lol.

                    > Some people program in APL/J/K/Q just fine, and they prefer their symbols.

                    APL originally had a lot of its own symbols with very little reuse, and clear rules. Learning the symbols was one thing, but the usage rules were minimal and simple. I'm not a major fan of too many different symbols, but I really hate reuse in any context where how things will be parsed is unclear. In the raku example, what if the elements were to be multiplied?

                    > Calling it "stupid" is showing your prejudice. (I don't and can't write APL but still respect it) > Reminds me a bit of the fish anecdote told by DFW...

                    Yeah, for some reason, it's not OK for me to insult a language, but it's OK for you to insult a person.

                    But you apparently missed that the "twisty" part was about the multiple meanings. Because both those symbols are used in Python (the * in multiple contexts even) but the rules on parsing them are very simple.

                    perl and its successor raku are not about simple parsing. You are right to worry about the semantics of execution, but that starts with the semantics of how the language is parsed.

                    In any case, sure, if you want to be anal about paradigm purity, take my first example, and (1) ignore the print statement because the raku version wasn't doing that anyway, although the OP's python version was, and (2) change the accumulation.

                      seq = [0,1]
                      while len(seq) < 20:
                        seq = seq + [seq[-2] + seq[-1]]
                    
                    But that won't get you very far in a shop that cares about pythonicity and coding standards.

                    And...

                    You can claim all you want that the original was "pure" but that's literally because it did nothing. Not only did it have no side effects, but, unless it was assigned or had something else done with it, the result was null and void.

                    Purity only gets you so far.

                    • By jaen 2026-02-22 8:12

                      You're getting more and more irrational.

                      > it's OK for you to insult a person.

                      I made an analogy which just means that it's hard to understand what the different styles and paradigms are when those are the things you constantly use.

                      You're apparently taking that as an insult...

                      > But you apparently missed that the "twisty" part

                      I didn't miss anything. You just didn't explain it. "twisty" does not mean "ambiguous" or "hard to parse". Can't miss what you don't write.

                    • By lizmat 2026-02-20 23:30 · 1 reply

                      > In the raku example, what if the elements were to be multiplied?

                      $ raku -e 'say (0, 1, 2, * × * ... )[^10]'   # for readability
                      (0 1 2 2 4 8 32 256 8192 2097152)

                      $ raku -e 'say (0, 1, 2, * * ... *)[^10]'   # for typeability
                      (0 1 2 2 4 8 32 256 8192 2097152)

                      • By zephen 2026-02-21 1:29

                        Yeah, no thanks.

                        My instincts about raku were always that perl was too fiddly, so why would I want perl 6, and this isn't doing anything to dissuade me from that position.

            • By librasteve 2026-02-19 18:55 · 1 reply

              err - I cut and pasted the Python directly from ChatGPT ;-)

              • By zephen 2026-02-20 6:43

                But it doesn't do the same thing at all as the raku.

                It doesn't build a list, rather it dumps it as it goes.

                It has an explicit print.

                It uses a named constant for 20 rather than a literal.

                etc, etc...

    • By belmont_sup 2026-02-19 3:38

      Not to mention how heated my laptop gets when I try to compile a new Vapor template. On an M1.

    • By i_am_a_peasant 2026-02-19 12:32

      same. i thought it would have been as quick to pick up as rust. nowhere near. i spent weeks trying to go through every feature of the language at least once. time in which i could’ve read several rust books and already start hacking up some interesting projects. so much in swift is pointless syntax sugar. why do i need 50 ways to do exactly the same thing, it’s just nonsense. then i have to look up the language reference whenever i read a new codebase

    • By fud101 2026-02-19 5:31 · 1 reply

      So did you go back to and keep using C#/.NET?

      • By miffy900 2026-02-19 22:35

        well for backend development, yes - I technically never stopped as I had existing projects to maintain. But after trying out Swift a couple times, I've dropped it entirely for backend. For new backend work it's C#/.NET all the way.

        I wanted to try using a native language other than C++, and Swift ostensibly seemed easier to pick up. I continue to use Swift for iOS app development, though, where it is much easier to use; but that has its own share of compromises and trade-offs - not centred around Swift, but around SwiftUI vs UIKit.

  • By drnick1 2026-02-19 1:39 · 2 replies

    Regardless of the language it is written in, one thing that I hope Ladybird will focus on when the time comes is a user-respecting JavaScript implementation. Regardless of what the Web standards say, it is unacceptable that websites can (ab)use JS against the users for things such as monitoring presence/activity, disabling paste, and extracting device information beyond what is strictly necessary for an acceptably formatted website. One approach could be to report standardized (spoofed) values across the user base so that Ladybird users are essentially indistinguishable from each other (beyond the originating IP). This is more or less the approach taken by Tor, and where a project like Ladybird could make a real difference.

    • By diath 2026-02-19 1:44 · 2 replies

      There are just too many defense mechanisms on popular websites that would simply get Ladybird flagged as a bot and render the website unusable. I wouldn't mind a toggle to switch between this and normal behavior, but having that as a default would be bad for wider adoption.

      • By drnick1 2026-02-19 1:53 · 2 replies

        If those "popular websites" are the likes of Facebook and Instagram, I don't see that as a big loss. That being said, I find that most of the Web works just fine on Tor, so it's certainly possible. Most of the issues seem related to the (known) exit IP being overused or identified as Tor.

        • By diath 2026-02-19 1:56 · 2 replies

          > If those "popular websites" are the likes of Facebook and Instagram, I don't see that as a big loss.

          Personally I wouldn't mind either but my point is that they probably want to cater to the average person, and not just security conscious tech savvy people, and if that's the case, then you really can't exclude FB/IG/YT and others from working properly in your browser.

          • By roughly 2026-02-19 2:19 · 1 reply

            > they probably want to cater to the average person, and not just security conscious tech savvy people

            Why? The average person is well served by large existing players, whereas security conscious tech people are extremely underserved and often actually willing to pay.

            • By ulyssys 2026-02-19 3:07 · 1 reply

              Specific numbers aside, one possible reason is they want to increase adoption to gain user volume, in order to have an effect on the larger ecosystem.

              Once you have non-trivial network effects, you could continue to influence the ecosystem (see: MSIE, Firefox in its early days, and Google Chrome). There are probably multiple paths to this. This is one.

              • By roughly 2026-02-19 20:19

                Influence isn’t all about the raw numbers - if yours is the browser of choice for developers, that’s going to give you a stronger voice than just being the fourth best browser. Think about Twitter - by all accounts, Twitter’s user numbers were dwarfed by the other networks, but they punched way above their weight because the entire political policy making and reporting apparatus was on there.

          • By drnick1 2026-02-19 3:17

            Firefox tries to position itself as that secure and private alternative, but this is mostly marketing. For a long time, Chromium had better site isolation, and the default Firefox settings are permissive when it comes to fingerprinting. Out of the box, it seems that Brave wins here, but for now using Brave means accepting a lot of extra commercial stuff that should not be in a browser in the first place (and that increases the attack surface). I have been using the Arkenfox user.js for Firefox, but it's unclear how much good it does or if it isn't counterproductive (by making the user stand out).

        • By xmcp123 2026-02-19 2:07

          Most of the web works with Tor, but to make Tor successful at the things it is intended to do, you have to disable JavaScript.

          This kills the internet.

      • By bluGill 2026-02-19 1:59

        Only if there is not widespread adoption.

    • By mvdtnz 2026-02-19 7:44

      A web browser that explicitly does its own thing regardless of web standards is the last browser in the world I would consider using.

HackerNews