Blur Busters Open Source Display Initiative – Refresh Cycle Shaders

2025-01-05 4:09 · blurbusters.com


CES 2025 Announcement from Blur Busters “Area51” UFO Lab – BBOSDI Version 1.01 – Initial Draft Version

Part 1: The CRT Simulation Breakthrough Leads to Plasma TV Simulation and CRT-VRR Experiments

Blur Busters’ real-time CRT electron beam simulation is a breakthrough: our open source code successfully simulates a CRT and reduces motion blur at any simulated refresh rate, including simulated 60Hz for 60fps content (with less flicker than eye-searing squarewave PWM BFI strobe).

The CRT shader does not even require integer-divisible native:simulated Hz ratios, because we do temporal scaling (the temporal dimension of good spatial scaling). Later in 2025, we plan to release a plasma display simulation shader (frame-stacked 3D dither in x-y-time).
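
As a rough illustration of the temporal scaling idea (a toy sketch in Shadertoy dialect, not our production shader; all constants are illustrative), the beam position can be computed as a continuous fraction of the simulated refresh cycle, so it can be sampled at whatever the native Hz happens to be:

    // Toy CRT beam sketch (Shadertoy dialect). Not the Blur Busters shader;
    // constants are illustrative. iChannel0 = the current game frame.
    const float SIM_HZ = 60.0;   // simulated CRT refresh rate
    const float DECAY  = 25.0;   // phosphor decay constant (arbitrary)

    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        vec2 uv  = fragCoord / iResolution.xy;
        vec3 rgb = texture(iChannel0, uv).rgb;

        // Continuous beam position within the simulated refresh cycle.
        // Sampled at whatever the native Hz happens to be, so the
        // native:simulated ratio never needs to be an integer.
        float beamY = fract(iTime * SIM_HZ);      // 0..1, top to bottom
        float since = beamY - (1.0 - uv.y);       // time since beam hit this row
        if (since < 0.0) since += 1.0;            // beam passed in previous sweep

        // Exponential phosphor fade since the beam energized this scanline.
        // (A production shader would do this in linear light for correct
        // Talbot-Plateau brightness, and spread energy across native frames.)
        fragColor = vec4(rgb * exp(-DECAY * since), 1.0);
    }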

Since our CRT shader does not require integer native:simulated Hz ratios, we are now experimenting with CRT-VRR (software-based VRR strobing), as well as other display simulation algorithms.

Attracting the Community with Proof of Concept Demonstrations

The retrogaming community has brilliantly skilled open source software developers working in the SPATIAL dimension (CRT filters, scanlines, phosphor simulation, etc).

But Blur Busters? We do the TEMPORAL dimension, since we’re the Hz geniuses. Motion, blur reduction, display simulation, interlace simulation, subfield simulation, BFI, GtG simulators, etc.

With the launch of TestUFO Version 2.1, we have added more display simulators. Now we have extended this to a third dimension, thanks to brute-force refresh rates making it possible. We can use upcoming 600Hz displays to simulate a 600Hz plasma TV, using stacked dithered images as subfields, as sketched below.
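
A hedged sketch of what such a subfield shader could look like (Shadertoy dialect; a real plasma drives weighted subfields in linear light, so treat this as a toy): each pixel fires fully in roughly brightness × 10 of the 10 subfields per 60Hz frame, with an ordered-dither offset spreading the quantization across x, y, and time:

    // Toy plasma subfield sketch (Shadertoy dialect; illustrative only).
    // Assumes a 600Hz native display showing 60Hz content: 10 binary
    // subfields per content frame. iChannel0 = current game frame.
    const float SUBFIELDS = 10.0;

    float dither2x2(vec2 p)   // tiny ordered-dither kernel, values 0..0.75
    {
        p = floor(mod(p, 2.0));
        return mod(p.x * 2.0 + p.y * 3.0, 4.0) / 4.0;
    }

    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        vec3 rgb = texture(iChannel0, fragCoord / iResolution.xy).rgb;

        // Index of the current subfield within the content frame.
        float sub = mod(float(iFrame), SUBFIELDS);

        // Fire the pixel in ~rgb*SUBFIELDS subfields; shifting the dither
        // kernel by the subfield index spreads error in x, y, and time.
        float threshold = (sub + dither2x2(fragCoord + sub)) / SUBFIELDS;
        vec3 fire = step(threshold, rgb);   // 1.0 where rgb >= threshold

        fragColor = vec4(fire, 1.0);
    }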

Our popular display testing website already has the TestUFO interlacing demo (view at 60Hz + 400% browser zoom), the TestUFO color wheel simulator (view at 240Hz+), the TestUFO black frame insertion demo (view at 120Hz+), the TestUFO photo version of BFI (view at 120Hz+), and the TestUFO VRR simulator! So we already simulate many line items of displays. However, as we roar higher in Hz, more simulation superpowers become possible!

IMPORTANT: TestUFO integrity check: make sure your browser is frame-pacing properly, in single-monitor mode, on the primary monitor, with no red spikes at the TestUFO Frame Pacing Tester, to prevent erratic flicker problems.

Yes, we’re porting the CRT shadertoy to TestUFO soon too. All display simulators will be built as TestUFO demos, even if some may arrive at shadertoy first.

Why Are We Doing This?

We have been working for years to improve display technology through programs such as Blur Busters Approved and Blur Busters Verified, with successes such as the ViewSonic XG2431 and the Retrotink 4K.

  • Limiting Factor: Display scaler/TCONs are not as programmable as we would like
  • Limiting Factor: Modern OLEDs are unable to do flexible-range BFI (e.g. a 48Hz-240Hz range, even at fixed Hz), and unable to do software-adjustable pulse width. Some vendors refuse to implement our requests.
  • Limiting Factor: Some shaders outperform firmware now. Even the Retrotink 4K box-in-middle BFI has less latency than a popular brand’s OLED firmware BFI, because of the incredible partially beam-raced optimizations I helped Retrotink 4K implement.
  • Limiting Factor: Communication problems between a headquarters and a factory that speak different languages make display algorithms extremely difficult to engineer in an era of falling-cost displays.
  • Limiting Factor: Generic scaler/TCONs ship with severely quality-degrading attributes such as outdated 17×17 overdrive lookup tables: formerly good at 60Hz, they look absolutely awful at 360Hz+, or on panels with very asymmetric GtG response (e.g. tiny GtG hotspots that 17×17 OD LUTs cannot cover properly for good strobing).
  • Limiting Factor: We can’t do ergonomic creativity in both directions, such as an adjustable LCD GtG simulator for OLED panels to make Netflix 24fps stutter less (24fps stutters more on OLED than on LCD).
  • Limiting Factor: Aging greybeard CRT and display engineers, trained on yesteryear research, need to collaborate with star shader programmers and modern temporally-skilled people, having only recently realized that a 480Hz OLED can very accurately emulate most temporal aspects of a CRT tube via a GPU shader, visibly to nigh-100% of the human population.
  • Limiting Factor: There are both mainstream and niche spinoff benefits that display manufacturers aren’t delivering for us, as evidenced by software-based CRT simulation eventually leading to CRT-VRR, etc.

GPUs are getting more powerful. Why not do some blur busting in a shader instead? A modern RTX 3080 is now powerful enough to do a software-based equivalent of “GSYNC Pulsar” for a future 1000Hz OLED. Much easier software development work! Instead of limited FPGA talent, the world has tons of shader star programmers. In addition, ordinary VRR can now be simulated via brute-force refresh cycle blending algorithms similar to the TestUFO VRR Simulator, as sketched below.
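
A minimal sketch of that blending idea (Shadertoy dialect; illustrative, not TestUFO’s actual algorithm): when a virtual frame flip lands partway through a native refresh cycle, the straddled cycle shows a mix of the two game frames, weighted by how much of the cycle each frame covers. The channel bindings are assumptions:

    // Toy VRR-simulation blend (Shadertoy dialect; illustrative only).
    // iChannel0 = previous game frame, iChannel1 = newest game frame
    // (assumed rebound by the host at every virtual frame flip).
    const float NATIVE_HZ   = 240.0;
    const float VIRTUAL_FPS = 97.0;   // deliberately a non-integer divisor

    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        vec2 uv    = fragCoord / iResolution.xy;
        vec3 prevF = texture(iChannel0, uv).rgb;
        vec3 currF = texture(iChannel1, uv).rgb;

        // Start of the current native refresh cycle, and the first virtual
        // frame flip at or after it.
        float cycleStart = floor(iTime * NATIVE_HZ) / NATIVE_HZ;
        float flipTime   = ceil(cycleStart * VIRTUAL_FPS) / VIRTUAL_FPS;

        // Fraction of this native cycle spent showing the older frame.
        float w = clamp((flipTime - cycleStart) * NATIVE_HZ, 0.0, 1.0);

        // Blending SHOULD happen in linear light to preserve total energy.
        fragColor = vec4(mix(currF, prevF, w), 1.0);
    }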

Great Filter-Down of BYOA (Bring Your Own Algorithm)

  1. Stage 1: Bleeding-edge communities (with great shader programmers) create shaders.
  2. Stage 2: Shaders get implemented into software such as the RetroArch emulator (which now has the CRT simulator), as well as other products such as the Retrotink 4K (which is receiving the CRT simulator too).
  3. Stage 3: Operating systems with refresh cycle hooks can then accept optional plug-in shaders.
  4. Stage 4: Display manufacturers can port shaders to FPGA if they wish. But we’re not waiting for them.

Blur Busters is currently communicating with Valve, and a commitment has now been made to add a refresh cycle shader system to SteamOS that runs independently of content frame rate.

DRAFT Specification v1.01 – Refresh Cycle Shader

This draft specification is aimed at the following parties, any of whom can implement refresh cycle shader support:

  • Operating Systems (e.g. Valve SteamOS, Microsoft Windows, Apple MacOS/iOS, Linux);
  • GPU driver vendors (e.g. NVIDIA, AMD, Intel, Qualcomm);
  • Display manufacturers (the whole industry);
  • Video Processor Vendors and Video Capture Vendors (e.g. Retrotink, Elecard);
  • Independent Software Vendors (e.g. RetroArch, Reshade)

We hereafter call whichever layer implements this support the “core subsystem”.

Possible Use Cases of Refresh Cycle Shaders

There are mainstream and niche use cases.

  • Display simulators (CRT simulator, plasma simulator, etc);
  • Adjustable pixel response and overdrive (shader-based overdrive algorithm, better LCD overdrive, etc);
  • Ergonomic features (deflickering features, destuttering features, etc);
  • Motion blur reduction features (softer impulse-based blur reduction via phosphor simulation);
  • Improved gaming and de-stuttering features (simulated VRR, de-jittering gametimes, etc);
  • Improved home theater (simulating a 35mm projector / simulated LCD GtG, less harsh for 24fps on OLED);
  • Videophile ISF-league features (Color adjustment hooks and preferred-tonemapping hooks);
  • High-end esports gaming (some, not all, algorithms have less lag as a refresh cycle shader);
  • Easier 3D shutter glasses on ANY 240Hz+ generic display with NO crosstalk;
    (Most 240Hz displays work with generic shutter glasses via a LEFT/BFI/RIGHT/BFI sequence; see the sketch after this list).
  • Etc, etc, etc.
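
For instance, the shutter-glasses use case can amount to little more than the following sketch (Shadertoy dialect; illustrative only, with assumed channel bindings and glasses sync):

    // Toy LEFT/black/RIGHT/black sequence for generic shutter glasses on a
    // 240Hz display (Shadertoy dialect; illustrative only). The black cycles
    // hide LCD GtG transitions between eyes, which is what kills crosstalk.
    // iChannel0 = left-eye image, iChannel1 = right-eye image; the glasses
    // are assumed to be synced to the same 4-phase cadence.
    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        vec2 uv   = fragCoord / iResolution.xy;
        int phase = iFrame % 4;   // 240Hz native -> 60Hz per-eye sequence

        vec3 rgb = vec3(0.0);                                // phases 1, 3: BFI
        if (phase == 0) rgb = texture(iChannel0, uv).rgb;    // left eye
        if (phase == 2) rgb = texture(iChannel1, uv).rgb;    // right eye

        fragColor = vec4(rgb, 1.0);
    }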

Mandatory Driver Subsystem Requirements (RFC2119 “MUST”)

Your software, display, hardware, driver, or operating system MUST:

  • Contain a GPU that supports a shader programming language;
    (you can use the same GPU used for other purposes such as gaming/streaming; an existing GPU in a display used for apps can do double duty running refresh cycle shaders too!)
  • Support refresh cycle shader processing, independently of content frame rate.
    (e.g. CRT simulation on a 240Hz OLED must run continually, every refresh cycle)
  • Support plug-in refresh cycle shaders, installed by the end user
    (e.g. BOTH open source & proprietary shaders)
  • Support virtualization of VSYNC, independent of real device VSYNC.
    (e.g. 60Hz CRT VSYNC on 240Hz OLED. This allows existing 60fps games/videos to work properly.)
  • For displays, support linear nits in at least one display mode
    (e.g. post-gamma corrected SDR is very common, but ideally I’d like the first 20% windowing of HDR)

This MAY be internal (RetroArch) and/or driver-level (OS + GPU driver) and/or box-in-middle (video processor box) and/or display-side (display with a built-in GPU).
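
As an illustration of what VSYNC virtualization means from the shader’s side (a toy sketch in Shadertoy dialect; illustrative only), the shader runs every native refresh cycle but derives a virtual 60Hz cadence from the native timebase:

    // Toy VSYNC virtualization (Shadertoy dialect; illustrative only).
    // Runs every native refresh cycle; derives a virtual 60Hz phase that a
    // display simulator (CRT, plasma, BFI) would consume as its timebase.
    const float VIRTUAL_HZ = 60.0;   // e.g. 60Hz virtual VSYNC on 240Hz native

    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        // 0..1 progress through the current virtual refresh cycle; a 60fps
        // game frame latched at the last virtual VSYNC stays bound in
        // iChannel0 for several native cycles while the shader repaints it.
        float vPhase = fract(iTime * VIRTUAL_HZ);

        vec3 rgb = texture(iChannel0, fragCoord / iResolution.xy).rgb;

        // Debug strip along the bottom visualizing the virtual phase.
        if (fragCoord.y < 8.0 && fragCoord.x < vPhase * iResolution.x)
            rgb = vec3(1.0, 1.0, 0.0);

        fragColor = vec4(rgb, 1.0);
    }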

Recommended Subsystem Requirements (RFC2119 “SHOULD”)

Refresh cycle shaders require real-time shader computations that involve energy buffers (linear colorspace) to subdivide refresh cycles into multiple subfields.

Subfields MAY involve CRT scanout, plasma subfields, DLP subfields, BFI frames, simulation of LCD GtG, phosphor simulation, subfields invented for dream displays never built in hardware, etc.
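
To make the linearspace requirement concrete, here is a hedged sketch of the Talbot-Plateau bookkeeping (Shadertoy dialect; illustrative, using an approximate 2.2 gamma rather than the exact sRGB curve). A pixel lit for only a quarter of the refresh cycles must be driven 4x brighter, and that multiplication is only valid in linear nits:

    // Toy Talbot-Plateau energy accounting (Shadertoy dialect; illustrative).
    // A pixel visible in 1 of every 4 native cycles is boosted 4x in linear
    // light so its time-averaged brightness is unchanged.
    const float DUTY = 0.25;

    vec3 srgbToLinear(vec3 c) { return pow(c, vec3(2.2)); }        // approx.
    vec3 linearToSrgb(vec3 c) { return pow(c, vec3(1.0 / 2.2)); }  // approx.

    void mainImage(out vec4 fragColor, in vec2 fragCoord)
    {
        vec3 rgb = texture(iChannel0, fragCoord / iResolution.xy).rgb;

        bool lit   = (iFrame % 4) == 0;              // 25% duty impulse
        vec3 boost = srgbToLinear(rgb) / DUTY;       // energy boost, linear
        boost = min(boost, vec3(1.0));               // SDR headroom clips here,
                                                     // hence the HDR-window ask
        fragColor = vec4(lit ? linearToSrgb(boost) : vec3(0.0), 1.0);
    }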

In order to facilitate this:

  • The subsystem SHOULD be able to communicate a time budget to the shader as a uniform (e.g. 1ms, 2ms, 4ms), to allow the shader to determine whether there is enough time to run.
    (e.g. compute-heavy display simulators, versus lightweight display simulators)
  • The subsystem SHOULD allow a configurable virtual refresh rate, complete with virtualized VSYNCs lower than the native VSYNC.
    (e.g. 72Hz CRT on a 240Hz OLED).
  • The subsystem SHOULD relay as much of the following as shader uniforms to the refresh cycle shader:

    – Native Hz of display;
    – Virtualized Hz (or, for a VRR simulation shader, the current frametime);
    – Framebuffer of the last 1 or 2 game frames (frame-based);
    – Whether it is a changed frame or not (e.g. no new frame has arrived due to a low frame rate);
    – Resolution and color depth;
    – Optional: Game 6DOF positionals (e.g. for lagless 10:1 framegen algorithms);
    – Optional: Epoch gametime timestamps, if relayed from the game engine (e.g. for dejittering duty);
    – Optional: Epoch time elapsed between game frame and refresh cycle (e.g. for dejittering duty).

    Where this is not possible, inform the shader of the limitations, so shader metadata can determine whether the shader can run.

  • Optional: The subsystem MAY allow virtualized Hz above native Hz.
    (e.g. this would permit splitting 4K 1000Hz across multiple 4K 120Hz LCoS projectors (using Arduino-driven spinning shutter wheels in front of existing projectors), projection-mapping 8 video outputs stacked onto the same projector screen, flashing one at a time for an unlimited-bounds refresh rate display.)
  • Optional: The subsystem SHOULD NOT assume integer divisors between native Hz and virtual Hz, although assuming them is a reasonable initial simplification to bootstrap a plug-in shader system. Some shaders support temporal scaling (a temporal version of spatial bilinear scaling).
    (e.g. the Blur Busters CRT simulator can do 24fps Netflix on a 72Hz simulated CRT on a 280Hz LCD)
  • The shader SHOULD be able to inform the subsystem of its approximate expected workload.
    (e.g. informing that one shader is 4x more compute-heavy than another)
  • For displays only: due to the linearspace requirements of some (not all) refresh cycle shaders, the display subsystem SHOULD relay the APL/ABL tonemapping algorithm and/or inform the shader of constraints (e.g. linear HDR up to the first 10% window).

    Note: Currently, we often rely on SDR plus gamma correction to get access to linearspace for correct Talbot-Plateau law mathematics in refresh cycle shaders such as CRT simulators.
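
Put together, the uniform interface implied by this list might look like the following GLSL declarations (all names are hypothetical; no such binding is standardized yet):

    // Hypothetical uniforms a core subsystem could relay to a refresh cycle
    // shader, mirroring the SHOULD list above. All names are illustrative.
    uniform float nativeHz;          // e.g. 240.0
    uniform float virtualHz;         // e.g. 60.0 (0.0 if VRR frametime used)
    uniform float frametimeMs;       // current frametime, for VRR simulation
    uniform int   newFrame;          // 1 if a new game frame arrived this cycle
    uniform float timeBudgetMs;      // per-cycle compute budget (1.0, 2.0, 4.0)
    uniform vec2  resolutionPx;      // framebuffer size in pixels
    uniform int   colorDepthBits;    // e.g. 8, 10
    uniform sampler2D gameFrame;     // most recent game frame
    uniform sampler2D prevGameFrame; // previous game frame (for blending)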

While no mandatory requirement exists, if your shader is based on widely available knowledge with no patents, it is RECOMMENDED that generic open source refresh cycle shaders be provided under a reasonably permissive source code license (MIT/Apache) permitting integration into both GPL2/GPL3 projects and proprietary/commercial projects, for the fastest and widest industry spread.

Otherwise, the hardware industry suffers with inferior, little-used or little-known entrenched algorithms that get worse over time (e.g. outdated 17×17 LCD overdrive LUTs from yesteryear, still used in 360Hz+ panels) as refresh rates progress to stratospheric levels and create GtG heatmapping issues.

Recommended Testing Displays

  • Faster displays perform better.
    Example: OLEDs run display simulators better than VA LCDs.
  • Higher bit-depth displays perform better.
    Example: Most TN panels are only 6-bit, which interferes with temporal brightness-spreading algorithms.
  • Higher native:simulated Hz ratios help refresh cycle shaders. More Hz is better.
    Example: The CRT simulator benefits from more Hz (see the worked equation after this list):
    CRT sim on 120Hz sample-and-hold reduces 60fps 60Hz motion blur by up to 50%
    CRT sim on 240Hz sample-and-hold reduces 60fps 60Hz motion blur by up to 75%
    CRT sim on 480Hz sample-and-hold reduces 60fps 60Hz motion blur by up to 87.5%
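
These percentages follow from a simple persistence model: the CRT simulator concentrates each 60Hz frame’s light into roughly one native refresh cycle, so persistence drops from 1/60 sec toward 1/N sec on an N Hz display, and tracking-induced motion blur shrinks proportionally:

    $$\text{blur reduction} \;=\; 1 - \frac{f_{\text{content}}}{f_{\text{native}}}
    \quad\Rightarrow\quad
    1 - \tfrac{60}{120} = 50\%,\;\;
    1 - \tfrac{60}{240} = 75\%,\;\;
    1 - \tfrac{60}{480} = 87.5\%.$$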

While the now-released CRT simulator showed noticeable benefit (better than BFI) on a 120Hz IPS LCD, the best testing results occurred with 480Hz OLEDs. Test using as much Hz as you can. That being said, a 240Hz OLED generally produces very good results with the CRT simulator, especially at retro resolutions (e.g. Super Mario).

Researchers have noticed that on OLED, unlike LCD, 120Hz vs 480Hz is more visible to humans than 60Hz vs 120Hz. The OLED’s near-instant pixel response (LCD GtG behaves like a slow camera shutter) delivers unfiltered refresh rate benefits to human eyes, similar to a 1/120sec camera shutter versus a 1/480sec camera shutter. As of January 5th, 480Hz OLEDs are available, and 750Hz LCDs are coming soon.

Specification is Evolving

The founder, Mark Rejhon, has experience creating specifications (Mark, who was born deaf, authored XMPP XEP-0301 Real-Time Text). However, we are releasing this specification as a Blur Busters page to bring it mainstream much more quickly.

With our reputation (30+ peer-reviewed citations in papers and industry textbooks), we welcome collaborators to join our growing consortium for a formalized Version 1.1 specification of this Blur Busters Open Source Display Initiative, to get ready for the Bring Your Own Algorithm (BYOA) Open Source Display Revolution.


Read the original article

Comments

  • By JonathanFly 2025-01-05 6:37

    So this is a new method that simulates a CRT and genuinely reduces motion blur on any type of higher-framerate display, starting at 120Hz. But it doesn't dim the image like black frame insertion, which is the only current method that comes close to the clarity of a CRT. And it also simulates other aspects of CRT displays, right?

    Can you use this method just to reduce blur without reducing brightness, on any game? They mention reducing blur for many things other than retro games in "Possible Use Cases of Refresh Cycle Shaders" but does reducing blur in a flight simulator also make it visually look like a CRT with phosphors?

    • By delusional 2025-01-05 6:53

      They do mention that it does reduce brightness. The selling point compared to strobing seems to be less eyestrain. I'd expect it to lose more brightness than strobing, considering the lower relative pixel on-time.

  • By stevage 2025-01-05 8:40

    I do not understand at all what this is talking about or why. Is it some elaborate joke?

    Don't visual effects people go to lots of effort to add motion blur? Why would you want to remove it?

    Why are they trying to simulate old CRT displays?

    Can someone explain what this is about?

    • By Springtime 2025-01-05 9:03

      This is about improving motion clarity, so each displayed frame of moving content looks crisp rather than having blur (something that monitors can struggle with even at high refresh rates / high Hz).

      Most good monitor reviews of high Hz displays (eg: 120Hz+) take fast photographs of moving objects (typically from Blur Busters' 'Test UFO' web page) to demonstrate how good or poorly a monitor handles fast moving content.

      One technique of significantly improving motion clarity is inserting frames of pure black in the display output (aka BFI, black frame insertion). A downside is some are sensitive to this where it causes eyestrain.

      This CRT beam simulating shader is said to be similarly effective to BFI at improving motion clarity but with the benefit of reducing eyestrain. However from what I understand the current version is limited to simulating a lower Hz display and requires a higher Hz monitor.

      All this is distinct from the kind of in-media motion blur that can be enabled in games or seen in recorded video. It's instead about the monitor not being able to render fast moving content clearly enough which leads to non-crisp output frames.

      • By stevage 2025-01-05 9:26

        Thank you, that's a really great explanation.

      • By noduerme 2025-01-05 10:08

        What is the method used on newer TVs that attempts to double the framerate / interpolate frames / make everything shot on film look like an overlit soap opera? I find it impossible to watch; it destroys the lighting and the performances. My recollection of CRT TVs was that they had a lot of blur, both motion and spatial, and that was kind of what made them feel warmer and more analog / less overly crispy.

        • By Springtime 2025-01-05 10:52

          That's typically called 'motion smoothing' and yeah that's trying to interpolate frames to manipulate lower framerate video (like 24FPS) into higher framerates in an attempt to make scenes like panning shots 'smoother' at the expense of a soap opera feel and interpolation artifacting.

          Whereas what Blur Busters (and similar enthusiasts) are focused on is how accurately frames are (perceptibly) displayed on the screen, so ideally each input frame is perfectly presented without any interference from prior frames (due to limits of panels in keeping up with changing the pixels from one frame to another, very rapidly, causing blurring).

          The ultimate goal, in a perfect scenario, is for input from say a video game running at 60 frames per second to have each frame perfectly rendered like individual screenshot stills, one after the other. In reality for most monitors displaying such content there's not enough distinct separation between frames, leading fast changing pixel content (like objects moving) to blend into each other, causing blurring at the monitor level.

          The BFI technique, by inserting alternating black frames in the output, mitigates the inter-frame blending issues since instead of the prior frame being various colors (ie: of the prior input frame) it's starting from pure black which dramatically reduces frame blending artifacts and perceptibly makes the motion clarity more distinct.

          • By CuriousSkeptic 2025-01-05 11:52

            It’s not that the frames blend in the screen. Screens are perfectly capable of switching the pixels fast enough. It’s rather that each frame is displayed for too long.

            In a CRT the “pixels” start to fade immediately leaving the full screen mostly dark as the beam sweeps over the screen. It never shows a full frame.

            One could say that modern screens are more like slide shows, while BFI tries to make them more like stroboscopes.

            The blurring effect is more pronounced at low refresh rates; it’s just that BFI requires at least 120Hz to make sense at all.

            • By tigen 2025-01-05 13:58

              Yes, the relevant blur here is in your retina, as it tracks a moving screen object, called "sample and hold" blur. 60 fps is not enough when the pixel persists for the full frame duration -- the pixels smear across your retina.

            • By fulafel 2025-01-05 13:03

              CRTs don't darken that fast, one way to observe this is that CRTs don't appear black in photos/video with shutter times << 1/60

              • By yuriks 2025-01-05 17:55

                They do darken that fast (not fast enough you can't catch it in a high speed camera, but much faster than a frame). Most of the apparent persistence in the CRT comes from the retina/camera exposure, not the phosphor. A CRT has a sharp peak of light that quickly falls off, but the peak is bright enough that even though it is brief, when averaged out in an exposure in the camera it still appears bright enough to form an image.

              • By rcxdude 2025-01-06 11:32

                They frequently flicker or are only showing part of an image in video footage or photos, for exactly this reason. They're a right headache to film clearly.

                (see this youtube video showing one in slow-motion to get an idea: https://www.youtube.com/watch?v=3BJU2drrtCM)

                • By fulafel 2025-01-06 13:58

                  Thanks for the link. It seems I'd concluded this a bit wrongly from seeing those half-lit frames (like at 1:35 of this YT video).

              • By chowells 2025-01-05 19:31

                You've never taken photos of a CRT, have you? Even at like 400 ISO equivalent, only about a third of the screen is illuminated.

          • By thfuran 2025-01-05 20:24

            >at the expense of a soap opera feel

            The "soap opera feel" is precisely the goal of motion interpolation on 24 fps source. It reminds people of soap operas because they were often broadcast 60i instead of 24p. The weird part is that many people somehow prefer the terrible 24 fps to higher film frame rates.

            • By noduerme 2025-01-06 0:13

              I think you're right that it's partly a subconscious association with what we're used to seeing at higher frame rates (TV and video games).

              But it's also that a DP / cinematographer on a movie is crafting shots in ways that knowingly make use of a 24 fps framerate. There are consciously chosen effects that are in the shot, particularly directional motion blur that acts as a visual cue (like in action sequences or with hand-held cameras), which gets destroyed when the frame rate is increased without adding additional blur in the right places in post. Rather than a smoothly increasing/decreasing blur that creates a sort of ease-in-out as the camera or subject changes speed, you end up with jagged, rapid shifts in direction and speed which make well-crafted motion sequences feel either jarring or as if they're not really moving. I suspect that if a director were shooting originally at 60 fps they would have probably made the necessary adjustments in post production to achieve the effects they wanted, which they initially got by tuning their shots to 24 fps. But when it's done automatically by some software in a TV set, all of that subtlety is lost.

              It's sort of like if you took an oil painting and say the colors look more lifelike in digital reproduction: That may be true, but it wasn't the artist's intent. The artist understood they were working with a particular palette and worked within its limitations to achieve their desired effects.

              My contention is that it's not the higher frame rate which bothers people, per se, but that all the motion blur (slight as well as heavy) in a well-shot 24 fps movie is intentional, and therefore the problem is that removing it detracts from the intended effect of the shot. If you chose to replicate the original blur across 60 fps, rather than interpolate the sharpest possible interstitial frames, people might not have the same negative reaction.

    • By haunter 2025-01-05 9:30

      First thing I turn off in every single game is motion blur. It’s only useful in racing sims to have more sense of speed but that’s also a personal taste.

      Motion blur made a bit more sense on the 30fps Xbox 360 and PS3 games.

    • By martini333 2025-01-05 12:41

      Why exactly do you think motion blur is added?

    • By cubefox 2025-01-05 20:09

      Our eyes are constantly and mostly unconsciously tracking moving objects in our field of view in order to keep them still relative to our eyes. It's called Smooth pursuit: https://en.wikipedia.org/wiki/Smooth_pursuit

      This is because our retina has a very low "refresh rate", which means things can easily blur together. Smooth pursuit prevents that. However, modern sample-and-hold displays like LCD and OLED work against Smooth pursuit. If you watch anything moving on a screen (including "still" objects moving on screen due to camera movement), your eye will automatically track those objects if they are momentarily the focus of attention, which should make them be still relative to your eyes and thus appear sharp.

      However, since the tracked object is being still relative to your eyes and the individual frames on screen are being still relative to your screen, the frames move (are not being still) relative to your eyes. Which means they appear blurry during smooth pursuit, when in reality they should be perfectly sharp.

      For example, your eyes track a sign that moves on the screen due to camera movement. Say it moves 10 pixels per frame horizontally. This means you will see a 10-pixel-wide horizontal blur on the sign, which could make it unreadable. In reality (a real sign, not one on a screen) the sign would appear perfectly sharp.

      On CRT screens this doesn't happen (to the same extent) because the frame is not displayed for the entire frame time (e.g. 1/60th of a second) but much more briefly. The CRT just very quickly flashes the frames and is dark in between. Strobing/flickering, basically. So if the tracked object moves 10 pixels per frame, and the frame is only visible for (say) 1/5th of the frame time, it moves only 2 pixels while the frame is actually on screen. So you get only 2 pixels of blur, which is much less.
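
      In equation form: perceived blur is roughly the tracking speed times the lit fraction (duty cycle) of the frame time:

      $$\text{blur} \;\approx\; v \cdot \frac{t_{\text{lit}}}{t_{\text{frame}}}:
      \quad 10\,\tfrac{\text{px}}{\text{frame}} \times 1 = 10\ \text{px (sample-and-hold)},
      \quad 10\,\tfrac{\text{px}}{\text{frame}} \times \tfrac{1}{5} = 2\ \text{px (CRT-like)}.$$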

      Of course at 60 FPS you might instead get some degree of perceptible flicker (computer CRTs therefore often ran higher than 60) and in general the overall achievable screen brightness will be darker, since the screen is black most of each frame time. CRTs had a low maximum brightness. But they had very little of the "persistence blur" which plagues sample-and-hold screens like OLED and LCD.

      The motion blur intentionally introduced by video games is there to make moving objects appear smoother that are not tracked by our eyes. In that case motion blur is natural (since smooth pursuit doesn't try to remove it). So some forms of motion blur are undesirable and others are desirable.

      The optimal solution would be to run games (and videos content in general) at an extremely high frame rate (like 1000 FPS) which would introduce natural perceptible motion blur where it naturally occurs and remove it where it doesn't naturally occur (during smooth pursuit). But obviously that would be computationally an extremely inefficient way to render games.

      By the way, if you have a screen with 120+ Hz you can test the above via this black frame insertion demo, which emulates how CRTs work:

      https://testufo.com/blackframes

      On my 120 Hz OLED screen, the 40 FPS (1 frame + 2 black frames) UFO looks as clear as the native 120 Hz UFO. A real 60 or even 80 Hz CRT screen would be even better in terms of motion clarity. Perhaps better than a 240 or even 480 Hz OLED.

    • By fishermanbill 2025-01-05 9:02

      Yeah, they are two different effects. There's motion blur on individual objects, which you want (as human eyes see/have it), and then there's full-screen motion blur due to the display technology (LCD, OLED, etc.), which you don't want (as human eyes don't see/have it). CRTs don't have this motion blur, as the screen is blank most of the time; see the Slow Mo Guys on YouTube for CRT displays.

    • By 7734128 2025-01-05 9:14

      Because I hate it.

  • By fishermanbill 2025-01-05 9:13

    We need display manufacturers to provide a refresh cycle that is agnostic of the incoming signal Hz sent down the cable, AND either to provide shader support (ideally) at the display's Hz OR to implement this shader.

    There really would be no need for an expensive RetroTink if we had this. Some manufacturer must be able to do it, and the rest would follow.

HackerNews