Show HN: Spark, An advanced 3D Gaussian Splatting renderer for Three.js

2025-06-11 17:02 · 377 points · 86 comments · sparkjs.dev

An advanced 3D Gaussian Splatting renderer for THREE.js

Integrate splats into your scene alongside other meshes, with fast rendering on all devices, programmable dynamic splat effects, and wide format support (PLY, SPZ, SPLAT, KSPLAT)
Get started →


Read the original article
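
For a rough sense of how this plugs into an existing Three.js app, here is a minimal sketch. The package name, the SplatMesh class, and its constructor options below are assumptions for illustration; check the docs at sparkjs.dev for the actual API.

import * as THREE from "three";
import { SplatMesh } from "@sparkjsdev/spark"; // assumed package name and class

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Splats and regular meshes share the same scene graph.
const splat = new SplatMesh({ url: "scan.spz" }); // assumed option name; .ply, .splat and .ksplat are also listed as supported
scene.add(splat);
scene.add(new THREE.Mesh(new THREE.BoxGeometry(), new THREE.MeshNormalMaterial()));

renderer.setAnimationLoop(() => renderer.render(scene, camera));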

Comments

  • By erulabs 2025-06-11 17:19 · 1 reply

    Super impressive looking demo, works well on my older iPhone.

    As an only-dabbling hobbyist game developer who lacks a lot of 3D programming knowledge, the only feedback I can offer is that you might perhaps define what "Gaussian splatting" is somewhere on the GitHub or the website. Just the one-liner from Wikipedia helps me get more excited about the project and potential uses: Gaussian splatting is a volume rendering technique that deals with the direct rendering of volume data without converting the data into surface or line primitives.

    Super high performance clouds and fire and smoke and such? Awesome!
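
    For readers new to the technique, a rough illustrative sketch of the core idea (not Spark's actual code): each splat is a 3D Gaussian with a center, a covariance (size and orientation), a color, and an opacity; the renderer projects the splats to the screen, sorts them by depth, and alpha-blends them per pixel.

    function shadePixel(splatsSortedNearToFar) {
      const color = [0, 0, 0];
      let transmittance = 1.0;                      // light still passing through this pixel
      for (const s of splatsSortedNearToFar) {
        const alpha = s.opacity * s.weightAtPixel;  // Gaussian falloff from the splat center
        for (let c = 0; c < 3; c++) color[c] += transmittance * alpha * s.color[c];
        transmittance *= 1 - alpha;
        if (transmittance < 0.001) break;           // early out once the pixel is effectively opaque
      }
      return color;
    }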

    • By dmarcos 2025-06-11 17:30

      Thanks. We definitely have to add an FAQ.

  • By jasonthorsness 2025-06-11 17:48 · 2 replies

    The food scans demo ("Interactivity" examples section) is incredible. Especially Mel's Steak Sandwich looking into the holes in the bread.

    The performance seems amazingly good for the apparent level of detail, even on my integrated graphics laptop. Where is this technique most commonly used today?

    • By dmarcos 2025-06-11 17:52 · 1 reply

      There's a community of people passionate about scanning all sorts of stuff with handheld devices, drones... Tipatat generously let us use his food scans for the demo. I also enjoy kotohibi's flower scans: https://superspl.at/user?id=kotohibi

      Edit: typos

      • By jasonthorsness 2025-06-11 18:04 · 2 replies

        Wow, what kind of device do I need to make my own?

        • By dmarcos 2025-06-11 18:08 · 1 reply

          The food scans are just photos from a Pixel phone processed with Postshot (https://www.jawset.com/) to generate the splats.

          • By mft_ 2025-06-12 8:50 · 1 reply

            Out of interest, to what extent do splats recorded in this manner have reliable/measurable dimensions?

            • By jaccola 2025-06-12 18:53

              They are not reliable at all unless paired with some physical measurements (Lidar, or a known size object in the scene).

              Probably an interesting use for a pretrained model to estimate scale based on common items seen in scenes (cars, doorframes, trees, etc…)

        • By ChadNauseam 2025-06-11 18:09 · 1 reply

          I'm sure it's not cutting edge, but the app "scaniverse" generates some very nice splats just by you waving your phone around an object for a minute or so.

          • By dmarcos 2025-06-11 18:16

            Yes there are several phone apps to generate splats. Also Luma 3D capture.

    • By creata 2025-06-11 19:02 · 1 reply

      And the transfer size for that level of detail isn't that bad, either - only around 80MB. (Not being sarcastic, it's really neat.)

      • By dmarcos 2025-06-11 19:09

        Yeah. And some of the individual scans like Clams and Caviar or Pad Thai are < 2MB.

  • By ertucetin 2025-06-11 20:22 · 1 reply

    This is cool. BabylonJS also has nice Gaussian splat support: https://doc.babylonjs.com/features/featuresDeepDive/mesh/gau...

    • By echelon 2025-06-11 20:49 · 3 replies

      BabylonJS and the OP's own Aframe [1] seem to have similar licenses and a similar number of GitHub stars and forks, although Aframe seems newer and more game / VR focused.

      How do Babylon, Aframe, Three.js, and PlayCanvas [2] compare from those that have used them?

      IIUC, PlayCanvas is the most mature, featureful, and performant, but it's commercial. Babylon is a featureful 3D engine, whereas Three.js is fairly raw. Though it has some nice stuff for animation, textures, etc., you're really building your own kit.

      Any good experiences (or bad) with any of these?

      OP, your demo is rock solid! What's the pitch for Aframe?

      How do you see the "gaussian splat" future panning out? Will these be useful for more than visualizations and "digital twins" (in the industrial setting)? Will we be editing them and animating them at any point in the near future? Or to rephrase, when (if ever) will they be useful for the creative and gaming fields?

      [1] https://github.com/aframevr/aframe

      [2] https://playcanvas.com/

      • By dmarcos 2025-06-11 21:20 · 1 reply

        A-Frame is an entity component system on top of THREE.js that uses the DOM as a declarative layer for the scene graph. It can be manipulated using the standard APIs and tools that Web developers are used to. The initial target was onboarding Web devs into 3D, but it found success beyond that. The super low barrier of entry (hello world below) without sacrificing functionality made it very popular both for people learning programming / 3D (it's part of the curriculum in many schools / universities) and in advanced scenarios (moonrider.xyz, the most popular WebXR content to date with ~100k MAUs, 300k at peak, is made with A-Frame).

        One of the Spark goals is exploring applications of 3D Gaussian Splatting. I don't have all the answers yet, but compelling use cases are already developing quickly, e.g. photogrammetry / scanning, where splats represent high-frequency detail in an appealing and relatively compact way, as you can see in one of the demos (https://sparkjs.dev/examples/interactivity/index.html). There are great examples of video capture already (https://www.4dv.ai/). Looking forward to seeing new applications as we figure out better compression, streaming, relighting, generative models, LOD...

        A-Frame hello world

        <html>
          <head>
            <script src="https://aframe.io/releases/1.7.1/aframe.min.js"></script>
          </head>
          <body>
            <a-scene>
              <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
            </a-scene>
          </body>
        </html>

        • By echelon 2025-06-12 3:36

          Thank you, this is great info!

      • By ovenchips 2025-06-12 2:51

        When you say that PlayCanvas is commercial, that's a little misleading. The PlayCanvas Engine (analogous to Three.js and Babylon.js) is free and open source (MIT). The PlayCanvas Engine is where you'll find all the cool 3DGS tech. There are two further frameworks that wrap the Engine (for those who prefer to use a declarative interface): PlayCanvas Web Components and PlayCanvas React. Again, both of these are free and open source (MIT). Only the PlayCanvas Editor (analogous to a browser-based Unity) has optional payment plans (for those who want to create private projects).

        PlayCanvas Engine: https://github.com/playcanvas/engine

        PlayCanvas Web Components: https://github.com/playcanvas/web-components

        PlayCanvas React: https://github.com/playcanvas/react

      • By Joel_Mckay 2025-06-11 21:30 · 2 replies

        Did a test study in BabylonJS, and generally the subset of compatible features is browser specific.

        The good:

        1. the Blender plugin for exporting baked mesh animations as streaming assets is cool

        2. procedural texture tricks combined with displacement maps make reasonable-looking in-game ocean/water possible with some tweaking

        3. adding 2D sprite swap-out for distant objects is trivial (think Paper Mario style; see the sketch after this list)

        The bad:

        1. burns GPU VRAM far faster than normal engines (dynamic paint bloats up fast when duplicating aliases, etc.)

        2. JS burns CPU cycles, but the wasm support is reasonable for physics/collision

        3. all resources are exposed to end users (expect unsophisticated cheaters/cloners)

        The ugly:

        1. mobile gpu support on 90% of devices is patchwork

        2. baked lighting ymmv (we tinted the gpu smoke VFX to cheat volumetric scattering)

        3. in-browser games essentially combine the worst aspects of browser memory waste and security sandbox issues (audio sync is always bad in browser games)

        Anecdotally, I would only recommend the engine for server-hosted transactional games (e.g. cards or board games could be a good fit).

        Otherwise, if people want something that is performant and doesn't look awful... then just use the Unreal Engine and hire someone who has mastered efficient shader tricks. =3
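
        Point 3 under "The good" above is simple enough to sketch. A rough example, assuming an existing Babylon scene; the texture URL, sprite capacity, cell size, and the 30-unit threshold are made-up values for illustration:

        const impostors = new BABYLON.SpriteManager("impostors", "tree.png", 100, 256, scene);

        function addImpostor(mesh) {
          const sprite = new BABYLON.Sprite(mesh.name + "_sprite", impostors);
          sprite.position = mesh.position.clone();
          sprite.isVisible = false;
          scene.onBeforeRenderObservable.add(() => {
            const d = BABYLON.Vector3.Distance(scene.activeCamera.position, mesh.position);
            const far = d > 30;          // beyond this, draw the cheap 2D sprite instead of the mesh
            mesh.setEnabled(!far);
            sprite.isVisible = far;
          });
        }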

        • By tmilard 2025-06-12 7:43 · 1 reply

          Personally I have been using BabylonJS for five years, and I just love it. For me it's so easy to program (cleanest API I have ever seen), and my 3D runtime is so light that my demos work fine even on my Android phone.

          • By Joel_Mckay 2025-06-12 13:46 · 1 reply

            Web browsers add a lot of unnecessary overhead, and require dancing with quarterly changes in policies.

            In general, most iOS devices are forced to use/link Apple's proprietary JS VM implementation. While Babylon makes it easier, it has often had features nerfed by both Apple iOS and Alphabet Android: in the former case driven by a walled-garden App Store business, and in the latter by device design fragmentation.

            I like Babylon in many ways too, but we have to acknowledge the limitations in deployment impacting end users. People often end up patching every update Mozilla/Apple/Microsoft pushes.

            Thus, it is difficult to deploy something unaffected by platform-specific codecs, media syncing, and interface hook shenanigans.

            This coverage issue is trivial to handle in Unity, Godot, and Unreal.

            The App store people always want their cut, and will find convenient excuses to nudge that policy. It is the price of admission on mobile... YMMV =3

            • By m_kos 2025-06-12 17:20 · 1 reply

              One component of my hobby web app project is a wavetable. Below are two examples of wavetables. I want it not to tax the browser, so that other, latency-sensitive components do not suffer.

              Would you have any suggestions on what JS/TS package to use? I built a quick prototype in three.js but I am neither a 3D person nor a web dev, so I would appreciate your advice.

              Examples:

              - https://audiolabs-erlangen.de/media/pages/resources/MIR/2024...

              - https://images.squarespace-cdn.com/content/v1/5ee5aa63c3a410...

              • By Joel_Mckay 2025-06-12 18:32 · 1 reply

                Personally, I wouldn't try to do DSP pipe code in a VM.

                1. Use a global fixed 16-bit 44.1 kHz stereo format and a raw uncompressed lossless codec (avoids GPU/hardware-codec and sound-card-specific quirks)

                2. Don't try to sync your audio to the gpu 24fps+ animations ( https://en.wikipedia.org/wiki/Globally_asynchronous_locally_... ). I'd just try to cheat your display by 10Hz polling a non-blocking fifo stream copy. ymmv

                3. Try statically allocated FIFO* buffers in wasm, and software mixers to a single output stream for local chunk playback ( https://en.wikipedia.org/wiki/Clock_domain_crossing ); see the sketch further down

                * recall that fixed-rate producers/consumers should lock relative phase when the garbage collector decides to ruin your day; things like software FIR filters are also fine, and a single-thread pre-mixed output stream will eventually buffer through whatever abstraction the local users have set up (i.e. while the GC does its thing... playback sounds continuous.)

                Inside a VM we are unfortunately at the mercy of the garbage collector and any assumptions JIT-compiled languages make. Yet wasm should be able to push IO transfers fast enough for software mixers on modern CPUs.
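
                A minimal JS sketch of the statically allocated FIFO from point 3 (the capacity and interleaved-stereo layout are arbitrary choices; in practice the same structure would live in wasm or an AudioWorklet):

                const CAPACITY = 8192;                        // frames, allocated once up front
                const ring = new Float32Array(CAPACITY * 2);  // interleaved stereo, never reallocated
                let writeIdx = 0, readIdx = 0;

                function push(left, right) {                  // producer side, non-blocking
                  const next = (writeIdx + 1) % CAPACITY;
                  if (next === readIdx) return false;         // full: drop rather than block
                  ring[writeIdx * 2] = left;
                  ring[writeIdx * 2 + 1] = right;
                  writeIdx = next;
                  return true;
                }

                function pull(out) {                          // consumer side, e.g. an AudioWorklet process() callback
                  for (let i = 0; i * 2 < out.length; i++) {
                    if (readIdx === writeIdx) {               // underrun: emit silence instead of blocking
                      out[i * 2] = out[i * 2 + 1] = 0;
                      continue;
                    }
                    out[i * 2] = ring[readIdx * 2];
                    out[i * 2 + 1] = ring[readIdx * 2 + 1];
                    readIdx = (readIdx + 1) % CAPACITY;
                  }
                }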

                Best of luck =3

        • By echelon 2025-06-12 3:36

          Thanks!
