...

pfranz

Karma: 3709

Created: 2012-03-30

Recent Activity

  • > The problem is that it just doesn't work on modern, fast displays.

    I'm very confused by this. From what I've seen it's been getting a lot better since the transition away from CRTs. At least for television, frame-rate matching is becoming more of a thing, and higher display refresh rates really help. Calling everything fps for simplicity: 120 divides evenly by 24, 30, and 60, while lower refresh rates won't match every content rate evenly and cause judder (a rough cadence sketch follows the comment list below).

    Similarly (maybe back in the 90s?), film projectors would flash each frame twice to reduce the flicker between frames. With digital, there's no longer any film to advance between frames.

    > smooth motion like credits at certain speeds are extremely uncomfortable to look at at these frame rates.

    I think scrolling credits are the toughest case: white text on black, hard edges, no motion blur. DLP projectors (common 10+ years ago) drive me nuts by displaying R, G, and B sequentially.

    Outside of credits, cinematographers and other filmmakers do think about these things. I remember a cinematographer talking about working on space documentaries in IMAX: if you panned too quickly, a white spaceship over a black star field could jump multiple feet each frame (a rough pan/blur sketch follows the comment list below). Sure, films shot today are optimized for the theater, but the technology gap between theater and home is nowhere near as wide as CRT vs. acetate.

    > Frame size is different from the other parameters, as it is solely a physical practicality.

    I'm still struggling to see how it's that different. Widescreen meant a lower effective resolution (it didn't have to--it started with Cinerama and CinemaScope), but it was adopted for cost and aesthetic reasons.

    > If some technology somewhere else in the stack causes a change…and soon all content aligns on the format, and the majority of home TV sets will be shaped to fit the majority content it can receive.

    And the industry and audiences are really attached to 24fps. Like you say, home televisions adopted film's aspect ratio, and I've also seen them add much better support for 24fps.

    As kind of an aside, I wonder if motion blur is what people are attached to more than the actual frame rate. I assume you're talking about frame rates higher than 30? Sure, we have faster film stocks and brighter lights, but the exposure time per frame is still really short. I saw The Hobbit in theaters in both high frame rate and 24fps, and the 24fps version looked weird to me too--I meant to look it up, but I assume they just dropped frames, which would make the blur look odd.

  • > I think this would have a negative effect.

    How? Disclosure should already be legally required, and class actions and lawsuits should already be a thing. The Have I Been Pwned data sets aren't volunteered by these companies; it's a catalog of leaked data.

    The class-action remedy of "identity monitoring" is nonsense. Companies that can't afford to secure data, or don't want to, shouldn't collect it or should aggressively purge it. User data should be a liability.

  • Right. Just like the article says, HDR is too vague to mean anything specific; it's a label that gets slapped onto products. In gaming, it often meant engines were finally simulating light and exposure separately--clipping highlights that would previously have been shown, which in their opinion reduced the fidelity (a toy sketch of the idea follows the comment list below). Same with depth of field blurring things that never used to have blur.

  • > Ugh. I will never understand the obsession this effect.

    All of these (lens flares, motion blur, film grain, DoF, tone mapping, exposure, frame rate) are artistic choices constrained by the equipment we have to capture and present them. I think they'll always follow trends. In my entire career following film, photography, computer graphics, and game dev, the only time I've heard anyone talk about how we actually experience any of these things is when people say humans see roughly the equivalent of a 50mm lens (on 35mm film).

    Just look at the trend in frame size. Film was roughly 4:3, and television copied it. Film started matting/cropping the frame, and it got crazy with super-widescreen formats where some films used three projectors side by side, before most settled around 16:9. Then television copied that. Widescreen is still seen as more "filmic." I remember being surprised, working on a feature that switched to CinemaScope's aspect ratio, that the frame was only about 850 pixels tall--a full frame would be about twice that (the quick arithmetic follows the comment list below).

    To me, high frame rate was always just another style. My only beef was with motion-smoothing muddying up footage shot at different frame rates.

  • Stereo film has its own limitations. Sadly, shooting for stereo was expensive, and corners were often cut just to get something into a theater where they could charge a premium for a stereo screening. Home video was always a nightmare--nobody wants to wear glasses (and glasses-free stereo TVs had a very narrow viewing angle).

    It may not be obvious, but film has a visual language. In early film, it wasn't a given that the audience would understand what was going on when you cut to something else. Panning from one object to another implies a connection. It's built on the visual language of still photography (things like the rule of thirds, using contrast or color to direct your eye, etc.)--all of it directing your eye.

    Stereo has its own conventions that were still being explored. In a regular film, you'd use a rack focus to connect something in the foreground to the background; in stereo, people don't follow a rack focus the same way. In a regular film, you can show someone's back in the foreground of a shot and cut them off at the waist; in stereo, that looks weird.

    When you're presenting something, you're always directing where someone is looking--whether it's a play, a movie, or a stereo show. The tools are just adapted to the medium.

    I do think it worked way better for movies like Avatar or How to Train Your Dragon and was less impressive for things like rom coms.
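
A minimal sketch of the divisibility point in the first comment above, assuming a display that simply holds each source frame for a whole number of refreshes (the function and the cadence model are my own illustration, not from the comment):

```python
# Why 120 Hz handles 24/30/60 fps content evenly while 60 Hz has to juggle 24 fps
# with an uneven 3:2 cadence.

def repeat_cadence(content_fps: int, display_hz: int) -> list[int]:
    """Return how many refreshes each source frame is held for over one second."""
    base, extra = divmod(display_hz, content_fps)
    # Real pulldown alternates the long and short holds (3,2,3,2,...); for this
    # sketch it's enough to see whether the hold counts are all equal.
    return [base + 1 if i < extra else base for i in range(content_fps)]

for hz in (60, 120):
    for fps in (24, 30, 60):
        holds = set(repeat_cadence(fps, hz))
        verdict = "even" if len(holds) == 1 else "uneven -> judder"
        print(f"{fps:>2} fps on {hz:>3} Hz: held for {sorted(holds, reverse=True)} refreshes ({verdict})")
```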
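
A back-of-the-envelope sketch of the IMAX-pan and exposure-time points in the same comment; the 180-degree shutter, the 70 ft screen, and the one-second pan are illustrative assumptions, not figures from the comments:

```python
# How far an object jumps between frames during a pan, and how long the blur
# streak is within each frame, at a few frame rates.

def per_frame_motion(speed_ft_per_s: float, fps: float, shutter_deg: float = 180.0):
    """Return (jump between frames in feet, blur streak within a frame in feet)."""
    jump = speed_ft_per_s / fps                  # displacement from one frame to the next
    exposure_s = (shutter_deg / 360.0) / fps     # e.g. 1/48 s at 24 fps with a 180-degree shutter
    blur = speed_ft_per_s * exposure_s           # streak length while the shutter is open
    return jump, blur

# Hypothetical fast pan: an object crossing a 70 ft IMAX screen in one second.
speed = 70.0
for fps in (24, 48, 120):
    jump, blur = per_frame_motion(speed, fps)
    print(f"{fps:>3} fps: jumps {jump:4.1f} ft between frames, blur streak {blur:4.2f} ft per frame")
```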
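
A toy sketch of the game-HDR point above (simulate scene light, apply exposure separately, clip to the display range); the luminance values and the function are illustrative, not from any particular engine:

```python
# Once light is scene-referred and exposure is applied as a separate step,
# anything that lands above the display's 0-1 range gets clipped.

def expose_and_clip(luminance: float, exposure: float) -> float:
    """Scale scene-referred luminance by exposure, then clip to the 0-1 display range."""
    return max(0.0, min(1.0, luminance * exposure))

scene = {"shadowed wall": 0.05, "face in sunlight": 0.8, "sky": 4.0, "sun glint": 40.0}
for name, lum in scene.items():
    print(f"{name:16s} -> {expose_and_clip(lum, exposure=0.5):.2f}")
# The sky and the sun glint both land on 1.00 -- the clipped highlights the
# comment describes, which some read as a loss of fidelity.
```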
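
And the quick arithmetic behind the "only 850 pixels tall" remark, assuming a 2K (2048 px) working width, which the comment doesn't actually specify:

```python
# A CinemaScope-style 2.39:1 frame is only a bit over half the height of a 4:3
# full frame at the same width.

width = 2048  # assumed 2K working width
for name, aspect in [("CinemaScope ~2.39:1", 2.39), ("full frame 4:3", 4 / 3)]:
    print(f"{name:20s}: {width / aspect:6.0f} px tall")
# -> roughly 857 px vs 1536 px
```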
