
Babylon 5 and HD...

...I HIGHLY doubt that the Foundation crew just handed over one copy of each file to Netter after S3, and then deleted all their backups. I bet Ron Thornton and others have their own copies of their groundbreaking pieces of CG history squirreled away somewhere. If the scene files can be found....

I'm under the impression that the B5 production teams had to turn over everything to Warner Bros at the end of production, and that anyone other than WB keeping copies of those files after B5 ended would be illegal. Perhaps Joe, Amy, or Jan could help clarify for me here.
 
Very interesting story b4bob. Sounds like it was a fun community.

And vacantlook, I believe that is pretty much what happened. Or at least, according to JMS here:
Babylonian Productions was not allowed to keep this material, it all had to be
turned over to WB each year, and what could have been maybe squirreled away at
NDEI was lost when that company went away.

jms
Which makes it sound like they were legally required to turn everything over, and that with ND folding after Crusade was cancelled, JMS believes that what may have been retained there was lost. Not sure about personal archives, although if I'm reading JMS's message right, the legality might make those who did keep something back not all that forthcoming about it.
 
I think it's a great idea. JMS was proud of how B5 was at the forefront of bringing this cutting edge technology to television, and I'd love to watch a glorious high-def version of the show with the CGI brought back to the cutting edge once more (while most definitely keeping the aesthetic style and shot composition the same). After all, I'll (as I assume we all here will) always have the original versions on DVD. And I already wince at the blown-up, cropped comps on those DVDs, just in regular PAL resolution.

Incidentally, as one of the guys behind the new Star Trek CGI work pointed out, even if they weren't incredibly intent on reproducing the original shots exactly but with greater detail, they'd have great difficulty making any significant changes, because the shots are all of a very specific length. Even the longest and most extravagant of B5 battle sequences or establishing shots are going to be measured in seconds, not minutes. That would leave them only so much wiggle room to deviate from the way the original shots went.

For the record, I'm a fan of the SW special edition versions as well, with the only exceptions being the horrendous 'Jedi Rocks' song in Jedi, and the messed up lightsabre colours.
 
One thing that always bothered me about the B5 FX was that they were rendered in a way that made them look different from the live-action footage. The live-action stuff looks like most movies displayed on a TV, but the CG was rendered interlaced, like video!

Movies and TV shows shot on film use film that runs at 24 frames per second. Film is also inherently “progressive,” meaning the entire frame is captured at the same time. The pre-high-definition American TV system, on the other hand, displays video images at 30 frames per second. And the frames are also “interlaced,” meaning for every frame you first see every even line of the image and then every odd line. This 30-frame interlaced look is what folks think of when they think of the 6 o’clock news or porn or your home camcorder. In short: “video” or “cheap” (at least to Americans).
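If it helps to picture what “interlaced” actually means mechanically, here's a tiny illustrative sketch in Python (toy code, not from any real video tool; the names are made up for the example):

# Toy illustration of interlacing: each video frame is delivered as two
# "fields," one carrying the even-numbered scan lines and one the odd.

def split_into_fields(frame_lines):
    """Split one frame's scan lines into (even_field, odd_field)."""
    even_field = frame_lines[0::2]   # lines 0, 2, 4, ...
    odd_field = frame_lines[1::2]    # lines 1, 3, 5, ...
    return even_field, odd_field

frame = ["scan line %d" % n for n in range(480)]   # a 480-line SD frame
even, odd = split_into_fields(frame)
print(len(even), len(odd))   # 240 lines per field
# NTSC shows ~30 such frames (~60 fields) per second; film runs at 24 full frames.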

A good way to recognize this is to think of British TV from the 70’s, like the Jon Pertwee or Tom Baker Doctor Who eras. The studio work, like the TARDIS interior, was shot with studio video TV cameras. But the location work, like the endless quarries, was shot on film.

Babylon 5 was a little like this. The live-action was shot on film and looked something like a movie on TV (24p). But the CG was rendered at 30 frames per second, interlaced. I strongly believe that, at least to American TV-watching eyes, interlaced video looks cheap, like, as I said, porn or the evening news. I think this was one of the factors that made the FX work a little jarring and why some thought it looked “cheap.”

Why was this done? I really don’t know. Maybe it was because the guy who pioneered the FX work and set the tone was Ron Thornton, an Englishman. Maybe he didn’t realize that to us Yanks interlaced = cheap. Modern CG artists go to great lengths to match the look of the FX to the live-action footage.

A “simple” re-rendering of the CG at 1080p resolution with a 24p frame-rate effect applied would yield fairly spectacular results methinks.
 
b4bob:

That is almost entirely wrong. :) The difference between the studio stuff and the exterior stuff in Dr. Who or Monty Python has nothing to do with interlaced vs. non-interlaced, and everything to do with the inherent differences between videotape (and live electronic video cameras) and film and film cameras.

All pre-HD television signals (and even some HD signals) are transmitted interlaced. PAL, SECAM, NTSC, ATSC, doesn't make a difference. All film seen on TV is converted to video, and generally interlaced video, before it reaches your house. Nowadays films are first transferred to pro-grade video tape and broadcast from that. In the early days the film was projected on a screen in the TV studio and a TV camera was aimed at it to send it out over the airwaves. Either way, the film image was converted to an electronic signal that went out at 25 frames per second (PAL) or 30 frames per second (NTSC), in each case with 2 fields per frame - odd lines first, then even lines. Unless you happen to have seen an HD broadcast at 720p, everything you have ever seen on over-the-air, cable or satellite television has been interlaced video. (The highest resolution broadcast standard in the U.S. is 1080 lines - 1080 lines interlaced. And it looks a hell of a lot more like film than anything played back from a DVD at 480 progressive lines.)

Look, this is a real complicated and real technical issue, which is why I've basically stayed out of all the threads I've seen on it. This is the kind of thing that really needs either a huge answer - like a book - or no answer at all.

However, having come this far, a few thoughts.

1) JMS always thought that the definitive version of the show would be the HD broadcast or the home video version - in widescreen.

2) When they started shooting the show, the state-of-the-art in home video was laserdisc. DVD and anamorphic digital compression didn't exist, so nobody planned for it. Ironically this resolution-enhancing technology makes it harder to do a decent widescreen transfer of a show produced the way B5 was.

3) JMS assumed that if the show did well enough to make it to five years it would have done so well that they would have the time and the money to take the existing CGI wireframes and composite shots, extend the frame to 16:9, and re-render and recomposite the shots. He didn't count on the entire franchise imploding in the wake of the Crusade debacle and Warner Bros. essentially losing all interest in it.

4) Several episodes had CGI that was just barely finished in time to air. A few weren't quite finished and the episodes originally aired with missing pieces or obvious mistakes. These were fixed after the fact in time for reruns and future broadcasts. Because of this JMS always, in some ways, considered the CGI and composite shots something of a "work in progress" to be fixed later if time and money allowed.

In order to re-do B5 for HD TV all the live action footage would have to be retelecined from the original Super35 elements to new hi-def digital tape masters. In theory the CGI would then have to be recreated from scratch because so much has been lost. What, if anything, can be done with the digital composite shots would depend on whether or not the live action footage that went into the comps was saved. Don't forget, the final assembly for B5 was done on tape, at NTSC resolution. If the film elements of the composite shots were not saved, then the lo-res video is all there is and all there ever will be. At that point we have to accept that either an HD version is impossible, or that we can get a mostly HD version that will have glaringly bad composite shots interspersed through it.

(For an invaluable analysis of the issues that were involved in even the original DVD widescreen releases see this piece by Henrik Herranen, which also describes a technique that would have yielded better results than those obtained by the contractor that WB used to create the widescreen masters for The Sci-Fi Channel. )

If all the film is still around then a new HD version of the show is possible, provided WB is willing to put up the money. In that case I'd have no problem with some good digital artists attempting to reproduce a higher res version of the original FX shots and composites, without pulling a full George Lucas. I'm comfortable with this because JMS spoke of redoing the CGI from the beginning, and because none of it would involve changing the story. And because it would make it easier for the show to continue being shown on TV where new generations can discover it.

And yes, of course I'd buy it again. :)

But this is all pretty academic unless the Lost Tales really put B5 back on the map as far as Warner Bros. is concerned and they think they can make back the cost of the revisions.

Regards,

Joe
 
Yeah, that doesn't sound quite right, b4bob. For someone who says he was part of a community that Ron Thornton interacted with, I think you're selling him massively short. Most people involved in the more technical sides of making television should know the difference between progressive and interlaced in their sleep, and a CGI supervisor even more so. And, like Joe said, both NTSC and PAL television broadcast standards were interlaced (480i and 576i to be precise, so the biggest differences there are the number and length of the scanlines, and the framerate). That the CGI and comped shots were done interlaced happened for the same reason they were done in 4:3 instead of widescreen: that's the format the show was broadcast in, and they did not have the render time to go beyond that.

And thank you Joe for writing down something that is almost exactly the same as how I feel about this, only expressed with more clarity. :)

If the film elements of the composite shots were not saved, then the lo-res video is all there is and all there ever will be. At that point we have to accept that either an HD version is impossible, or that we can get a mostly HD version that will have glaringly bad composite shots interspersed through it.
Yeah. If you look at the questions I asked JMS, you'll see this one among them, but he didn't really answer it directly. Only that "but again you'd have to crop the CGI if you wanted it in wide." I do remember, in the season 5 DVD documentary section on the CGI, that they had a short splitscreen shot of a comped shot 'clean' and the same shot with the CGI comped in (Garibaldi looking out over a Drazi city from his hotel room). Whether that means all of it is still stored on film is impossible to tell, though. I did find this on it by JMS:
Nor can this footage be re-rendered because the separate elements do not exist anymore, only the original un-comped film elements are there. The CGI files are not around anymore, and to recreate every shot would be prohibitively expensive. In a big way.
And if they were originally intending to redo comped and CGI shots, I can hardly imagine they would not have turned over clean versions on film. If they do have it, hopefully they stored it in the same vault as the B5 master film reels, and not somewhere where the rats can get to it. :p

(For an invaluable analysis of the issues that were involved in even the original DVD widescreen releases see this piece by Henrik Herranen, which also describes a technique that would have yielded better results than those obtained by the contractor that WB used to create the widescreen masters for The Sci-Fi Channel. )
That's a wonderful analysis. I remember reading it a few years ago. The PAL versions look even worse though; there is some horrible blur on some of the comped shots because of the way they handled the frame rate conversion. And while he mentions it very briefly, in surprise, he largely ignores the fact that the 4:3 telecine and the wide telecine were done separately, with many years and significant technological advancements in between them, and that the latter looks superior because of it. More detail is picked up on the wide telecine, even in the area where the 4:3 frame and the wide frame overlap, and the colour is much more vivid and well defined.

But really, I agree with just about every point you made. :)
 
No disrespect intended, Joe & Shabaz, but I think we’re kind of talking past each other here on the interlaced issue. I only bring it up because I’ve always been interested in the issue, and I’ve never really seen it discussed elsewhere.

I’m not talking about the final broadcast of the B5 footage, which in NTSC is of course interlaced. I’m talking about the original acquisition of the footage. Footage acquired in 24p, like movies and many single-cam TV shows, has a different look when transferred to NTSC than footage originally acquired in the native NTSC mode of 30 frames/second interlaced (60i).

The Babylon 5 live action footage was acquired—filmed—in 24p. But the B5 CG was acquired (in this case “rendered,” not filmed, per se) in native NTSC 60i. This, at least to me, makes the 60i CG look like video compared to the film look of the live action footage. Modern CG is given an effect to mimic the look of 35mm film transferred to NTSC to better match the live action stuff. It’s very similar to the 24p mode on some newer camcorders. The cams mimic the look of footage acquired in 24p and then transferred to 60i NTSC.
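To put rough numbers on that acquisition-versus-delivery point, here's a little illustrative sketch (the frame and field rates are the standard ones; everything else is just made-up example code):

# Why 24p-origin material inside a 60i stream still "moves" like film.
FIELDS_PER_SECOND_60I = 60      # 60i delivery (really ~59.94, rounded here)
FILM_FRAMES_PER_SECOND = 24     # 24p acquisition

# Native 60i video: every field is a fresh snapshot of the scene,
# so motion updates about 60 times per second.
unique_motion_samples_native_60i = FIELDS_PER_SECOND_60I

# 24p film carried in 60i: the 60 fields are built by repeating fields
# taken from only 24 distinct film frames, so motion still updates
# just 24 times per second.
unique_motion_samples_film_in_60i = FILM_FRAMES_PER_SECOND

print(unique_motion_samples_native_60i, unique_motion_samples_film_in_60i)  # 60 24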

I know there are many differences between film and video—color curve, film grain, bloom, DOF, contrast. But the main tell-tale sign that you’re watching something acquired on film is the 24p-transferred-to-60i “look.” The 60i video “look” of the CG makes the original FX work look a little cheaper than it needed to. Because of this, wouldn’t you know, in the last part of season 5 JMS mentioned that:

The CGI is now being produced at 24 frame per second, starting about 517 or
518, at a higher degree of resolution, nearer film-res. We're doing 24 so it
matches better with the live-action which is also shot at 24 fps.

jms

Quote here.

Do you get what I’m talking about, Joe?
 
That quote is about the frame rate and resolution. When you said "Maybe he didn’t realize to us Yanks interlaced = cheap", that isn't quite the same as what JMS is describing there. And, like I said, I sincerely doubt it was because of ignorance from the side of Thornton, but rather it was because of technical restrictions.

You say B5's film is inherently 24p, which is true, but that is not the way it was broadcast, as far as I know. The way it was broadcast was interlaced, as all television was then, and that was matched by the effects shots.

And Joe is right; Who's interior and exterior shots don't look different because of interlaced versus progressive, but because of the inherent differences between film and electronic shooting.

What I think you are doing here b4bob, is taking a whole range of issues - framerates, film versus digital, resolution - and mistakenly calling them all interlacing-related problems. And you seem to have some notion about the European PAL standard being more interlaced; why, I'm not sure.
 
Yes, I know the live action was filmed in 24p and broadcast in 60i. But footage acquired in 24p and transferred to 60i looks different than footage both acquired and broadcast in 60i.

The "camera" in Lightwave 3d was set to mimic the look of 60i video, not the look of 24p film transferred to 60i video. Modern effects work matches the motion quality of the CG to the live action. B5 did not. That's just one of the things that could be improved upon.

And I never said that PAL was more interlaced. What I was trying to say was that--for me--watching B5 jump from 24p acquired live action footage to footage rendered in native 60i is akin to watching an older BBC TV show where they used video for the studio work, and 16mm film for the location work.

I know what I'm talking about is esoteric and confusing. If you really want to see what I mean, find someone with a 24p camcorder, and shoot a brief clip twice--once in 24p mode and once in 60i mode. Watch them back to back on an NTSC TV, and you'll see what I mean. The 24p clip looks slightly more "filmic" due to the motion properties.
 
b4bob, it's not so much that the issue is confusing as that you are making connections that you don't properly explain, and that don't really seem very clear or sound without explanation. From what I get after re-reading your first post a few times, the argumentation you are making goes something like this:

Interlaced is cheap looking. Standard TV cameras are interlaced, and film cameras are progressive. In Doctor Who, television camera footage and film footage were mixed. Who is British, and Thornton is British. Thornton might not have realized that interlaced is cheap looking.

Do you see where the gaps in your argumentation are now? You don't make a clear and logical progression in your points, leaving us to fill in the blanks. And if there is no obvious way to fill in the blanks, we're left to conclude that maybe the argumentation isn't sound.

Yes, transferring film to 60i will look different than something rendered directly into 60i, and there are ways to compensate for that difference. One of them is to render the CG first in 24p, and then do the same type of transfer that is done to get the 24p film footage into 60i, or do it with CGI filters. That's what you're saying, right? I'm still not sure what Thornton's Britishness has to do with it; to misquote G'Kar, part of that progression escapes me. I still think it's more likely that doing the CGI directly to 60i, and doing the comped shots on film that was already transferred into 60i format, was done for reasons of time and technical achievability rather than ignorance.
 
Okay. Sorry for derailing this whole conversation off into some bizarre digression. This is my last comment on the unfortunate interlacing of B5 CG. Promise.

Here’s how Lightwave handles the rendering of fields. It can render interlaced (even or odd field first) or it can render no fields, also known as “progressive.” It actually takes slightly less time to render progressive frames because when it renders interlaced fields in one frame it needs to compensate for the slight motion possible between the even and odd fields. If Foundation had simply rendered the frames progressive, the CG would have been a native 30p. That wouldn’t have been a perfect match to the 24p live action footage, but it would have avoided the cheap look which I believe is inherent to native interlaced video. In a perfect world, the CG would have been rendered in 24p and then, as you said, somehow processed to give it the telecine look of film transferred to video. Whether that was even possible, given the budgetary and technological limitations facing the B5 crew, I have no idea.
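Just to make the field-rendering difference concrete, here's a rough sketch of the timing involved (generic toy Python, not Lightwave's actual behaviour or API):

# With fields on, each rendered frame samples the animation at two moments
# (half a frame apart), so motion updates ~60 times a second; with fields
# off ("progressive"), the whole frame is one snapshot, ~30 times a second.

def render_times_for_frame(frame_index, frame_rate=30.0, fields=True):
    """Return the moments in time (seconds) sampled for one output frame."""
    t = frame_index / frame_rate
    if fields:
        return [t, t + 0.5 / frame_rate]   # two fields, half a frame apart
    return [t]                             # one progressive snapshot

print(render_times_for_frame(0, fields=True))    # [0.0, 0.0166...]
print(render_times_for_frame(0, fields=False))   # [0.0]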

Now, the British thing. I just happen to be personally interested in B5, British TV, and Lightwave 3d. I was merely theorizing that Thornton may not have realized that Americans are culturally less used to seeing interlaced video and 24p filmic footage in the same show, especially prime-time dramas.

Let's look at it this way. Thornton worked on the classic BBC SF series Blake's 7. The show ran in the late 70's and early 80's for 4 seasons on the biggest channel in Britain, BBC 1 (I believe Thornton helped create some of the spacecraft miniatures used in the 4th season). B7, like many other British shows from the era, shot the studio scenes on video and the location scenes on film. There are many factors that tip one off to the fact that footage is shot on either film or video. But one of the chief differences is the motion of interlaced video versus progressive film frames. Interlaced video has that crisp, realistic, local news look. 24p film has a slightly more blurry, more cinematic motion.

B7 was a prime-time drama on the biggest channel in Britain in the early 80's. And it was perfectly natural to create the show by mixing some 24p footage with some interlaced video footage. Now, let's think of Star Trek. Can you imagine if Trek TOS, made 10 years earlier, had used studio video cameras to shoot the studio footage??? No way. I think we would be hard-pressed to cite ANY American high profile prime-time drama from the past 40 years that shot large portions on interlaced video.

Of course there are exceptions in both directions. Plenty of older British shows--especially shows intended for American distribution--were shot entirely on film. But it was common for many years in Britain to use interlaced video for prime-time drama in a way it is not used in the US. I honestly think that interlaced video has a subtly different effect on American viewers than it does on British viewers.

Why is this? I have no idea. I read one bloke on a Dr. Who board mention that PAL interlaced video doesn't look quite so cheap when viewed on an actual PAL set, possibly because it's running at 25 frames/second rather than 30. I'd love to see interlaced PAL in its native habitat though...
 
I think I finally get all what you're saying now. ;)

I'm not sure you're right though, I don't know how much 'cultural familiarity' plays into things like this. It's an interesting theory, but I can't say I've heard it before.

There is actually something interesting going on with early Who, to go off into an even further tangent. I mentioned that the first six seasons were in part outright trashed, and that what survives comes in part from copies that were given out to foreign stations. Who was mostly shot on interlaced video, but because of incompatible video standards and the like, it was deemed easier to copy the show onto 16mm film for foreign distribution, which actually removes some frame information and some of the native look of the show. The folks who are doing the DVDs have actually gone back to those 16mm copies, and through careful digital interpolation, tried to extract something approximating the earlier video look, using custom software. See here. I thought the process was something that might be of interest to you. :)
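If you want a feel for the basic idea (and only the basic idea; the actual restoration software reportedly does proper motion estimation, which this toy doesn't), it boils down to manufacturing the in-between fields that the 16mm film copies threw away:

# Toy stand-in for the interpolation idea: the 16mm copy keeps 25 full
# frames/s, and the goal is to get back towards a 50-fields/s video look
# by synthesising the missing in-between moments. Crude averaging here,
# purely for illustration.

def fake_in_between(frame_a, frame_b):
    """Crude in-between image: average the two surrounding film frames."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def to_fifty_fields_per_second(film_frames):
    fields = []
    for current, following in zip(film_frames, film_frames[1:]):
        fields.append(current)                               # the original moment
        fields.append(fake_in_between(current, following))   # a synthesised moment
    return fields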
 
Yeah, vidfire. I have one of the Who DVDs where the effect was used. I think maybe Dalek Invasion of Earth or Aztecs? It's pretty cool. But see, that plays into what I'm saying. Because the interlaced studio look was how portions were originally intended to be seen, they're actually adding the interlacing effect back into the image!

That's funny considering many modern productions shooting on interlaced video (like new Who) are now actually "filmizing," trying to give the video a false 24p or 25p look on TV.
 
New Who is actually broadcast in 25p, I believe. PALplus, the widescreen version of PAL over here, at least offers support for it. That is, the last 2 seasons were. Starting with season 3, it will be done in HD (as is its spinoff series, Torchwood).
 
I now understand what you were driving at, but you are still fundamentally wrong in both the way you expressed what you had in mind and in the way you seem to understand film and video. :)

Film is not "24p". "Progressive" and "Interlaced" are terms that only apply to electronically scanned images that are made up of individual lines. Film images, whether 8mm, 16mm, 35mm or any other variant, are NOT made up of scan lines. "Progressive" makes no more sense when applied to a filmed image than "Interlaced" would. It is akin to talking about the "bitrate" of an analog recording. Analog recordings don't have bitrates, because they don't have bits. The individual lines of a film image aren't "progressive" or "interlaced" because film images don't have individual lines.

1080i ("interlaced") HD video looks smoother and more filmlike than any standard def video source. In fact, it looks better than film-sourced material extracted at 480p from an SD DVD. Yet even the best Hi-Def video format - 1080p - doesn't come close to being able to capture the level of detail that a frame of 35mm film can. It is the inherent qualities of the media, not "interlaced" vs "non-interlaced" that determines the difference between 35mm film and 1080p video. Yet you have consistently written as if the only difference between broadcast material from a filmed source and that taken from video is this mythical "interlacing" - which, as we've seen, doesn't even apply to film.

It adds nothing but confusion to the conversation to speak of film frames as if they were interchangeable in every way with video fields and frames. And I'm sure that even if Ron Thornton were mysteriously confused about the standards of American television, someone would have explained the problem to him. In fact, the FX shots looked just fine and worked very well with the original broadcasts. It was only when they were first transferred to widescreen and then reprocessed for anamorphic DVD that quality issues began turning up in the CGI.

Regards,

Joe
 
If I didn’t know better, I’d think you were willfully trying to misunderstand me.

You do realize that, when speaking about NTSC, part of the reason film looks different from video on TV is the motion characteristics of the interlacing, right? Yes, everything is broadcast and displayed interlaced (on NTSC), but the film is originally shot non-interlaced, and the video, recorded with an NTSC video camera, IS interlaced. When both are displayed on an NTSC TV, they look different, and part of the reason is the interlacing.

Do you understand this fact?
 
When film at 24 frames per second is played on NTSC TVs some of the pictures are repeated. Are you talking about these repeats?
 
I believe that's called a "3:2 pulldown". Not sure if that was what was used for the NTSC version of B5, but from what I understand, it's a pretty common way to go from 24fps to 30fps.
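For anyone who hasn't seen the cadence written out, here's a quick toy sketch of how 3:2 pulldown maps film frames onto video fields (illustrative code only, not any real tool):

# Four film frames (A, B, C, D) become ten video fields (five interlaced
# frames) by holding each film frame for alternately 3 and 2 fields.

def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ...
        fields.extend([frame] * repeats)
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# 24 film frames/s -> 60 fields/s (30 interlaced frames/s), with some fields
# repeated -- those are the repeated pictures asked about above.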
 
Film is not "24p". "Progressive" and "Interlaced" are terms that only apply to electronically scanned images that are made up of individual lines. Film images, whether 8mm, 16mm, 35mm or any other variant, are NOT made up of scan lines. "Progressive" makes no more sense when applied to a filmed image than "Interlaced" would…

I think you’re picking some big nits here, Joe. I am aware that film isn’t made up of scan lines. But the frame-rate of most motion pictures is 24/second. And in its native form, film captures and displays its frames all at once, not interlaced. Hence I think “24p” is reasonable shorthand for how B5 was filmed compared to video formats.

Ok, let me try one more time. This is mainly for Joe’s benefit. He’s the resident Mr. Smarty Pants, and I’ve enjoyed his posts and been educated by him many, many times over the last 4 years or so. But this is one area where he’s not quite getting me. So this one is for you, Joe. :cool:

This is the heart of the matter. You keep speaking as though image acquisition and delivery are the same thing.

Let me use your 1080i example to try to explain what I mean about acquisition versus delivery. Say my local ABC HD channel broadcasts in 1080i (which is 1080/60i specifically). Tonight they will broadcast SW ep III in 1080/60i. As we know, Lucas shot ROTS in 1080/24p. But let’s imagine that one scene, for some bizarre reason, was originally shot in 1080/60i. Even though both are being broadcast in 1080/60i, the footage shot in 1080/60i will look slightly different than the footage shot in 1080/24p. Anyone watching on the HD channel will see it, but many might not know why it looks different. The footage shot in 1080/60i will have a certain shot-on-video motion look to it. The 1080/24p will have the look of a film transferred to video. Even though they are both being broadcast in 1080/60i. There will be nothing different about these two kinds of footage other than the difference between 60i and 24p original acquisition. That difference is precisely what I’m talking about with the B5 effects.

To extend this story, it is as if the live action B5 footage was filmed in the 24p setting but the CG was filmed at the 60i setting. Or to use another metaphor, it is as if the B5 crew had magically filmed the whole show with one of those new-fangled prosumer camcorders that have both 60i and 24p modes: the live action with the 24p mode and the CG with the 60i mode. Even though both are finally delivered in a 60i format, the difference is notable.

…you have consistently written as if the only difference between broadcast material from a filmed source and that taken from video is this mythical "interlacing"…

No, I’ve said many times in this thread that there are many differences between film and video, and many differences between video on TV and film transferred to video on TV. The interlacing issue I’m talking about is only one of those differences, but it’s the one that directly affects the problem I’m referring to.

In fact, the FX shots looked just fine and worked very well with the original broadcasts…

In your opinion. And I largely agree with you. I loved the effects, and they got me interested in 3D animation at the amateur level. In a broader sense, the issue I’m describing isn’t a very big deal, especially if the effects ever get re-rendered. It’s just kinda odd. The problem I’m describing is real, however, and I just want you block heads to not think I’m crazy now! :D
 
