
Is there a name for a 60th of a second (an NTSC/PAL60 cycle)?


Best "60th of a Second" Synonym  

20 members have voted

  1. What is your favorite word or phrase that means a 60th of a second?

    • Third.
      0
    • Sexagintisecond.
      0
    • Hexacontasecond.
      0
    • Cycle.
      1
    • Frame.
      13
    • Jiffy.
      5
    • Tick.
      0
    • 60th of a second.
      0
    • Sixtieth of a second.
      0
    • 1/60 second.
      1
    • 60 FPS divided by 60.
      0
    • I don't care.
      0
    • Other. (I'll post my favorite.)
      0


Recommended Posts

I know it's not a millisecond; that's one thousandth of a second. Wikipedia says that 16.67 milliseconds (1/60 second) is called a "third," but that seems like it would confuse people, since they might think I was talking about a third of a second. Something like sexagintisecond or hexacontasecond would probably be just as confusing. I've seen people use cycle and frame to mean a sixtieth of a second. Some people use the word jiffy or the word tick.

Which word is better? Or do you have one that hasn't been posted?


I choose frame, if it's meant to literally denote the time between 240p frames, i.e. slightly longer than 1/60th of a second on NTSC, and 1/50th on PAL.

 

"cycle" to mean 1/60th of a second is either wrong or confusing.

"jiffy" used to be right, but now has many differing specific meanings and a general one too. I'd say it's lost.

"tick" I use in my own code, but in my mind it isn't limited to 1/60th of a second. I use it in my comments as a unit of movement/action, and it may be 1/60th, 1/30th, 1/20th, etc., depending on the application.


I learned it as Jiffy from my Commodore days. The VIC used a Jiffy Clock to keep track of time. On the VIC, the zero-page locations $A0-$A2 held the 3-byte jiffy clock, whose value was incremented every 1/60th of a second.
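For anyone curious how a counter like that behaves, here's a minimal Python sketch (the 60 Hz rate and 3-byte size are as described above; the function names and the simple 24-bit wrap are my own simplification, not Commodore's actual KERNAL code):

```python
JIFFIES_PER_SECOND = 60  # one jiffy per vertical blank on an NTSC machine

def increment_jiffy_clock(clock):
    """Advance a 3-byte (24-bit) jiffy counter by one tick, wrapping at 24 bits."""
    return (clock + 1) % (1 << 24)

def jiffies_to_hms(clock):
    """Convert a jiffy count into hours, minutes, seconds (plus leftover jiffies)."""
    total_seconds, jiffies = divmod(clock, JIFFIES_PER_SECOND)
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds, jiffies

clock = 0
for _ in range(90 * JIFFIES_PER_SECOND):  # simulate 90 seconds of vertical blanks
    clock = increment_jiffy_clock(clock)

print(jiffies_to_hms(clock))  # (0, 1, 30, 0)
```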

 

 

The online computing dictionary has this:

1. The duration of one tick of the computer's system clock. Often one AC cycle time (1/60 second in the US and Canada, 1/50 most other places), but more recently 1/100 sec has become common.


NTSC television runs at 29.97 FPS.

 

You have 29.97 frames, each consisting of two fields, one field being useless to you since it's black when running 240p, as old game systems do. That is not "technically" progressive video, but we call it that to satisfy the lingo. :)

 

So, an Atari is ~30FPS.

Edited by R.Cade

NTSC television runs at 29.97 FPS.

 

You have 29.97 frames, each consisting of two fields, one field being useless to you since it's black when running 240p, as old game systems do. That is not "technically" progressive video, but we call it that to satisfy the lingo. :)

 

So, an Atari is ~30FPS.

 

 

Nope. The Atari is 60 fps. Analog television works by scanning the screen from top to bottom 60 (or 50) times/second. Usually, the timing of even and odd fields is offset by 1/2 line so the lines appear to be between the lines of the previous field, giving better coverage/resolution on the tube. The TV doesn't really care if the previous field was even or odd, so you can send all odd or even fields and they will simply appear as a screen with half the lines, but they're being updated twice as often.


 

Nope. The Atari is 60 fps. Analog television works by scanning the screen from top to bottom 60 (or 50) times/second. Usually, the timing of even and odd fields is offset by 1/2 line so the lines appear to be between the lines of the previous field, giving better coverage/resolution on the tube. The TV doesn't really care if the previous field was even or odd, so you can send all odd or even fields and they will simply appear as a screen with half the lines, but they're being updated twice as often.

 

I am obviously confused about how this can be true. Classic NTSC broadcast television is only capable of 29.97 frames per second. It can do 59.94 *fields* per second, but the Atari does not output anything for the second field, or else it would be in interlaced mode, which it is not. They are black, which is why we see scan lines.

 

I'm not sure how you extrapolate that to mean that the Atari can make a 30fps TV do 60fps... I am missing the logic leap.


 

I am obviously confused about how this can be true. Classic NTSC broadcast television is only capable of 29.97 frames per second. It can do 59.94 *fields* per second, but the Atari does not output anything for the second field, or else it would be in interlaced mode, which it is not. They are black, which is why we see scan lines.

 

I'm not sure how you extrapolate that to mean that the Atari can make a 30fps TV do 60fps... I am missing the logic leap.

Here's how it works:

 

The TV beam starts at the top of the tube and scans left to right while progressing downward. Then it zips back to the top and does it again. This happens 60 times a second.

 

NTSC is considered to have 262 scan lines but many of these happen off screen so we often refer to the picture as having 240 scan lines.

 

Now, a little trick is employed to get 480 lines. The TV is incapable of showing that many in one pass, so we do it in two passes and slightly delay the start of the 2nd pass so the beam is a little lower than before, placing these new lines in the space between the previous ones. This delay distinguishes an odd field from an even one. So the TV draws lines 1,3,5,7...479 on one pass and 2,4,6...480 on the other. Since a frame now requires two passes (fields), the frame rate is 30 fps. Since the lines are drawn out of order, we call it interlaced.

 

If we don't want to interlace, we can simply draw lines 1,3,5,7...479 over and over every 60th of a second. Our resolution drops from 480 to 240 because we aren't drawing between those lines with a 2nd field type. Since we only have one field in our picture there's no difference between a field and a frame, and our update rate is 60fps.

 

The important thing is that you are not forced to send a sequence of both even and odd fields. The TV doesn't care. It's going to scan the screen at 60Hz with whatever you send it.
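For anyone who wants to put numbers on that, here's a quick Python sketch using the standard NTSC horizontal rate (nothing here is specific to one console; it's just the arithmetic behind the post above):

```python
LINE_RATE = 15_734.26  # standard NTSC horizontal rate, lines per second

# Broadcast interlace: 525 lines per full picture, sent as two 262.5-line fields.
field_rate = LINE_RATE / 262.5   # ~59.94 fields per second
frame_rate = field_rate / 2      # ~29.97 complete (480-line) frames per second

# Non-interlaced "240p": 262 whole lines drawn over and over, no half-line delay.
progressive_rate = LINE_RATE / 262  # ~60.05 complete pictures per second

print(f"interlaced: {field_rate:.2f} fields/s -> {frame_rate:.2f} frames/s")
print(f"240p:       {progressive_rate:.2f} frames/s")
```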


 

I am obviously confused about how this can be true. Classic NTSC broadcast television is only capable of 29.97 frames per second. It can do 59.94 *fields* per second, but the Atari does not output anything for the second field, or else it would be in interlaced mode, which it is not. They are black, which is why we see scan lines.

 

I'm not sure how you extrapolate that to mean that the Atari can make a 30fps TV do 60fps... I am missing the logic leap.

It's because you are thinking of a frame as 2 fields... which is true. However, if you ever look at an Atari game on a CRT, you'll see that it's not adjusting the beam so that one field interlaces with the other. That missing adjustment is the reason why there are black gaps between the scanlines on a CRT for the 2600. It's not because it's drawing a field and then drawing a blank black field afterwards to complete the frame. It is continually updating each field, and it does that 60 times a second.

 

Now you can call that 60 fields per second if you wish. However on a CRT it does appear as 60 unique images, and not an interlacing of them.


The key thing is, the second field's interlace doesn't come from the fact that it's the second field and is therefore somehow special. It comes from the delay between field pairs that Bryan was talking about. A 480i signal has the delay, a 240p signal doesn't, and otherwise they're pretty much identical from a signal perspective.


Even crazier is that a 480-line NTSC picture usually isn't even derived from a single snapshot in time. The 2nd field is usually just as delayed when it's captured by the camera as when it's shown, meaning you can't assemble the 2 fields into a coherent 480-line image when things are moving. If you try, you get this:

 

[image: distortion1.jpg, showing the comb-like distortion when two fields of a moving scene are combined]

 

The real purpose of interlace is to fill in more of the picture tube, and to sort of fudge the trade-off of resolution vs. refresh rate.
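Here's a tiny sketch of why the two fields won't line up, assuming the simplest possible moving scene: a block that shifts sideways between the moments the two fields are captured. Weaving the fields back together (what a naive deinterlacer does) produces the comb edges in the photo above. The scene, sizes, and function name are all made up for illustration:

```python
WIDTH = 24

def capture_field(object_x, rows):
    """Capture the requested scan lines of a flat scene: a #### block at object_x."""
    lines = []
    for _ in rows:
        line = ["."] * WIDTH
        for x in range(object_x, object_x + 4):
            line[x] = "#"
        lines.append("".join(line))
    return lines

# The two fields are captured 1/60 s apart, so the block has moved between them.
odd_field  = capture_field(object_x=4,  rows=range(0, 8, 2))  # lines 0, 2, 4, 6
even_field = capture_field(object_x=10, rows=range(1, 8, 2))  # lines 1, 3, 5, 7

# "Weave" them into one tall frame: alternate lines come from alternate fields.
frame = [""] * 8
frame[0::2] = odd_field
frame[1::2] = even_field

print("\n".join(frame))  # the block appears in two places on alternating lines: combing
```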


OK, but television shows are 30fps because the TV spends half its time on each interlaced set of 240 lines?

 

So as I understand what you are saying, we can do 240 lines 60 times per second, but not 480 lines at 60fps like television shows are normally broadcast. So the flicker is because every other "frame" has the other lines blacked out.

 

So since the Atari doesn't output an interlaced signal, it is sending only the first "field" at 60fps.

Edited by R.Cade

OK, but television shows are 30fps because it spends 50% on each interlaced set of 240 lines?

 

So as I understand what you are saying, we can do 240 lines 60 times per second, but not 480 lines at 60fps like television shows are normally broadcast. So the flicker is because every other "frame" has the other lines blacked out.

 

So since the Atari is not an interlaced signal, it is sending a signal with only the first "field" at 60fps.

 

Close.

 

Nothing is ever blacked out. A TV draws 240 visible lines per scan. That's all the resolution you get per refresh.

 

[image: TV-picture-scan.gif, a diagram of the TV's raster scan]

 

This isn't enough for a 480 picture. So on the next scan we'll draw the new lines halfway between those lines and with some persistence of vision, it will look like there are 480 lines.

 

SCAN 1, LINE 1 -------------------------------------------------------------------------

SCAN 2, LINE 1 -------------------------------------------------------------------------

SCAN 1, LINE 2 -------------------------------------------------------------------------

SCAN 2, LINE 2 -------------------------------------------------------------------------

SCAN 1, LINE 3 -------------------------------------------------------------------------

 

We get the illusion of 480 lines because even and odd fields are shifted vertically, but the TV only draws 240 lines per 60th of a second. So, it takes 2 offset fields to get to 480.

 

I don't know how else to describe it. Try this:

 

 


OK, but television shows are 30fps because it spends 50% on each interlaced set of 240 lines?

 

So as I understand what you are saying, we can do 240 lines 60 times per second, but not 480 lines at 60fps like television shows are normally broadcast. So the flicker is because every other "frame" has the other lines blacked out.

 

So since the Atari is not an interlaced signal, it is sending a signal with only the first "field" at 60fps.

 

I don't know what flicker you're talking about.

 

Yes, "normally" every other line is blacked out. With a CRT television it's not apparent because the screen doesn't have time to actually go black; the phosphors continue to phosphoresce between fields.

So yes, it only sends the "first" field, but it sends it twice per NTSC frame (30 fps, or 60 fps if you prefer).

I don't recall how the 2600 is set up, so I don't know if the lines in the individual fields are actually interleaved by the 2600 (I don't think so). With older TVs they would be, because that's just the way the TV worked, but maybe (probably?) not with modern TVs.

Edited by bogax

OK, but television shows are 30fps because it spends 50% on each interlaced set of 240 lines?

 

So as I understand what you are saying, we can do 240 lines 60 times per second, but not 480 lines at 60fps like television shows are normally broadcast. So the flicker is because every other "frame" has the other lines blacked out.

 

So since the Atari is not an interlaced signal, it is sending a signal with only the first "field" at 60fps.

 

TV shows are displayed interlaced, with each field happening at 60FPS. The two fields together make a frame.

 

An interlaced signal includes half a scan line to offset where the beam will scan the next field. When this half scanline is included, the two fields appear in slightly different vertical positions on the CRT. Because your eye has persistence of vision, you see 480 lines. And you see a complete frame every 1/30 of a second.

 

All the Atari does is omit that half scanline. This puts both fields at the same position on the CRT. What you get is half the vertical resolution, and twice the frame rate. The concept of a field mostly goes away when there is no interlacing going on; it's unnecessary. Since both fields do the exact same thing, we can just consider them frames, update the display every 1/60 of a second, and call it all good.
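The half scanline is easier to see with a little arithmetic. A field that is 262.5 line periods long means the next field starts halfway through a line, which pushes its raster down half a line; a 262-line field starts every pass at the same phase, so every pass lands on the same rows. A minimal sketch (the function name is mine):

```python
def field_start_phases(lines_per_field, fields=4):
    """Fractional line phase at which each successive field begins."""
    return [(n * lines_per_field) % 1.0 for n in range(fields)]

print(field_start_phases(262.5))  # [0.0, 0.5, 0.0, 0.5] -> alternate fields shifted half a line: interlace
print(field_start_phases(262))    # [0.0, 0.0, 0.0, 0.0] -> every field lines up: 240p
```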

 

Now, as to the question of the TV broadcast being 30FPS... This is sort of mostly true. You do get a complete set of 480 lines in 1/30 of a second, but as Bryan mentioned above, the capture of moving images means stuff does not line up. It's not like the world sits still for 1/30 of a second while the interlaced gear captures all the lines needed for broadcast. So the fields end up conveying part of the scene part of the time.

 

In reality, your brain processes this when it's shown to you in real time, and it averages and extrapolates everything to give you the illusion of 60 FPS motion even though a complete frame only arrives every 1/30 of a second. Detail gets spread out in both space and time, which your brain assembles into a pretty good representation of what the camera did capture.

 

Pause full motion interlaced video and then step one field at a time to see how this all works. Today, we have sophisticated electronics that can do image processing for us, and the purpose of that is to trade frame rate for improved detail again. The frame rate on those is typically back to the 1/30 second, and there is latency because it takes time to receive the interlaced image, process it and send the result back to the viewer as a more complete sum of the two fields. How this works varies a lot, but one effect it can have is to remove flicker from some games! I've got an LCD that does this. Many VCS titles have a bit more choppy movement than I'm used to, and that's due to the whole 1/30 vs 1/60 thing being discussed, but that LCD really wants to handle it as an interlaced image. So it does that, and objects that flicker every other frame get included on both. It's kind of spiffy. I've got a VGA monitor that will do the same. If I send it a 640x400 interlaced signal, which is the slowest one supported today, that monitor assembles it all and just shows me complete frames. It's odd.

 

For understanding this stuff, I highly recommend an old school CRT and if you can find one, get an old amber, grey or green composite screen. Those things will display just about any signal, PAL, NTSC, whatever, and are very high resolution, which lets you see all sorts of artifacts, and they will show you interlaced vs non interlaced, etc... clearly and cleanly. Some of them have slow phosphors too, meaning you can play Asteroids, crank the contrast, turn down the brightness and in a dark room get killer trails. :) One day, I want to convert one to vector display just for that experience, but I seriously digress.

 

The important thing to realize about TV broadcast devices and computer monitors is the NTSC and PAL formats (and PAL is 1/50 second or 1/25 second for fields and frames) are all designed to maximize the perception of motion and detail while not actually delivering any more motion and detail than necessary! It's illusion and good engineering combined to make TV a good experience for people. And back when this was all decided, it was a serious effort to get all that done with old, slow, kind of crappy analog gear, such as tubes, etc...

 

So the standards are modest, and they are flexible, due to the gear being used at the time. This is why an Atari and most other home computers could abuse the signal standard a little and trade vertical resolution for a stable, fast display.

 

The thing is, interlaced displays actually work really well on real world things. Movies, nature, people, all have lots of color detail, and because of that, blend together and generally mask the artifacts both the format (NTSC) and scanning system (interlace vs progressive) impose on the image we view. If you can, go watch a weather broadcast on a standard definition CRT device. Most HDTV sets today do enough correction to spoil the fun, but you can try those too. Watch carefully and pay attention to the line art and graphics and compare those to the weather man, or the newscast to follow. What you are looking for is that crisp display of the graphics. You can see flicker, and you can see image tearing on the weather map when it or its graphics move. This is because a computer is outputting precision graphics every field (if they don't have a more sophisticated system that does blending; most don't).

 

Now, when you look at the people hard, you will find they look smooth, but you can't quite see the detail! This is the real world blending together and it's this effect the engineers back in the day realized would work just fine for people in most cases. Home computers came later.

 

One other thing: early VGA displays were often interlaced. One of my early PCs had a 1024x768 display that was interlaced. This was due to monitors and graphics systems being too slow to actually do a progressive scan of it all. It worked best in a dark room with low contrast, to maximize persistence of vision and allow my eyes to blend well enough to keep it all from being distracting.

 

Interlace mostly fails on high contrast line art and text. Where there is a lot of detail and color and intensity depth, our brains work that out. It's the same problem our eyes have. They are only so fast, and the world is faster. To avoid us seeing a mushed up perception of things, the brain has image processing that can work this all out for us. It's really quite amazing when you dig into the detail of what the eye actually captures and what our brain can make of it.

 

But, text and line art are harsh and there just isn't enough detail for it to work. So our brains see the artifacts, tearing, etc... which is why interlaced displays are undesirable for computer use. Back in the day, they knew this, and chose to bend the standard just a little, trading overall resolution for a fast, field and frame consistent image that people could read without fatigue and discomfort.

 

BTW, the Atari uses blue and white for text because that combination tends to mask small artifacts on text, and it also takes advantage of the fact that our eyes have very poor blue resolution. Truth! We have only a fraction as many blue receptors as red and green ones. If you want, you can explore this by loading a nice picture into a graphics program capable of breaking it out into red, green and blue channels. Go ahead and blur the blue one and recombine the picture. You will see nearly the same picture despite there being much less resolution in the blue channel you blurred!
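If you want to try that experiment without hunting for a graphics program, here's a minimal sketch using the Pillow library (the filename and blur radius are placeholders; any detailed photo will do):

```python
from PIL import Image, ImageFilter

img = Image.open("photo.jpg").convert("RGB")  # placeholder filename
r, g, b = img.split()

# Heavily blur ONLY the blue channel, then reassemble the image.
b_blurred = b.filter(ImageFilter.GaussianBlur(radius=8))
result = Image.merge("RGB", (r, g, b_blurred))

result.save("photo_blue_blurred.jpg")
# Viewed side by side, the two images look nearly identical, even though the
# blue channel of the second one has lost most of its fine detail.
```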

 

So, the white on blue plays to our eyes and brain to make the text easy to read. To explore this, take your Atari and change the background (POKE 710,color) and try some combinations! White on black on a color set is the worst. You will see all sorts of little color fringing. Green is better, red is kind of ugly, purple is a mess, etc...

 

The next best thing is a slow display, amber on black, or green, or even grey. Those deliver crisp images, no artifacts, very little flicker at all, even on interlaced images, due to the slow phosphors and the single wavelength of light. (the grey ones are a little less prime, because grey isn't a single wavelength of light) Our perception peaks on green and amber, and it's the fastest with the highest detail, which is why those colors were used. We like 'em and can view them for very long periods of time without significant fatigue.

 

Our eye doesn't focus perfectly either. Each wavelength of light requires a bit different focus. Our eye balances this for best focus for us, but extremes can be seen. Purple text on a blue or red background, for example, just can't be focused. We can't see the edges well, because two different focus settings are required and the typical person will view that text and experience fatigue due to the eye and brain fighting over what the best compromise should be. Since there isn't one, you can literally feel your eyes focusing in, out, in, out, in, out, etc...

 

Text displays on home TV devices can be a problem because of this and due to how the standard requires color to be encoded. NTSC actually has pretty low color resolution, and considerably higher monochrome resolution. When you view something in NTSC, the color and intensity all get smudged together, your brain extrapolates, and it's reasonable. Again, high contrast text fails. There isn't enough for the magic to work, and the artifacts are seen clearly.

 

This, BTW, is why many people do S-video mods to their computers. Separating color from intensity can significantly improve on all of that, which can be important for text and line art, but may actually make some color art worse, depending on your preferences. Think, little pixels smudging together compared to precise little pixels easily seen. A lot of people prefer real hardware over an emulator for these reasons. The original art was done taking the TV and how it works into account. When viewed "perfectly", like a PC and emulator can do, it's sometimes a lesser experience.

Edited by potatohead

One more thing. What I want to emphasize is that interlaced video is an extension of how a TV works. It's not set in stone. It's an option.

 

What is set in stone:

 

1. The lines must be sent at a rate of ~15.7KHz or the TV will not sync.

2. The beam must be returned to the top of the screen at a rate of ~60Hz or the TV will not sync (some are more flexible here than others).

3. Given the numbers above, the TV can only scan 262 lines (~240 visible) before it must return to the top and start over.

4. The beam is always moving downward so the time that a line is sent determines its vertical position on the screen.

 

So, using these parameters, we can easily use a TV as a monitor for 60fps video at 240 lines or less.

 

Interlacing is a more complicated application of the above, but the rules are never broken. Interlacing just does its thing over the course of two fields, so we call it 30Hz.
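Item 4 in the list above is the handy one for software: the beam's vertical position is just elapsed time divided by the line period. A minimal sketch of that mapping (the ~63.6 µs figure is the standard NTSC line period; the function name is mine):

```python
LINE_PERIOD_US = 63.556  # one NTSC scan line, in microseconds (~1 / 15,734 Hz)

def beam_row(time_in_frame_us):
    """Which scan line the beam is on, given microseconds since the top of the frame."""
    return int(time_in_frame_us // LINE_PERIOD_US)

# A line sent 1,000 microseconds into the frame lands around row 15;
# one sent 10,000 microseconds in lands around row 157.
print(beam_row(1_000), beam_row(10_000))
```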


Yep. Those are hard limits, though I have tried "50 Hz NTSC" on a variety of American TV sets, and most of them will display it properly. Of those two hard limits, the horizontal is the primary one. It can be a little off, rounded to the nearest cycle, which the Atari actually does, but not much more than that. The vertical can be off by more, depending on how it's done.

 

50Hz NTSC employs slightly different timing to match PAL vertical frame rates. I first ran into this on the CoCo 3, which has a 50/60 Hz switch in the graphics chip. It's an NTSC signal, tweaked to result in 50 frames per second, but is the same in all other ways.
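Assuming the horizontal rate stays at the usual NTSC figure (which is how I understand these 50 Hz NTSC modes to work; I haven't checked the CoCo 3's exact line counts), the arithmetic is just more lines per field before the beam returns to the top:

```python
LINE_RATE = 15_734.26  # NTSC horizontal rate, lines per second

lines_for_60hz = LINE_RATE / 60  # ~262 lines per field, the normal NTSC count
lines_for_50hz = LINE_RATE / 50  # ~315 lines per field

print(round(lines_for_60hz), round(lines_for_50hz))  # 262 315
```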

 

I also believe the Amiga would output NTSC at 50Hz to match up PAL / NTSC game rates. Never did own one, but I'm sure someone here can clear up how that's done.

 

50Hz NTSC is also in use in some parts of the world. Can't recall off hand, but it's out there.

