
FPGA Based Videogame System


kevtris

Interest in an FPGA Videogame System

682 members have voted

  1. I would pay....

  2. I Would Like Support for...

  3. Games Should Run From...

    • SD Card / USB Memory Sticks
    • Original Cartridges
    • Hopes and Dreams

  4. The Video Interface Should be...


Well, I'll be a... It is drama, and it's going to stay drama forever and ever until the end of time itself. You know it, and I know it, and anyone not thickheaded knows it too.

https://www.youtube.com/results?search_query=drama

 

https://twitter.com/search?q=%23drama&src=typd

 

Those are links to a YouTube search for drama as well as a Twitter search for "#drama". If you can find one. single. non-theatrical. instance. where drama is used to mean anything other than "a pointless waste of time by immature people," I will concede that I am wrong. There are 30 million results on YouTube alone. If you cannot find it used a single time in a way that fits what you think it means, then you can admit you are incorrect and get up to date with what everyone who is not a moron understands the word to mean.


Does anyone here know what 8K television refresh rates are going to be like? For sure, 60Hz. But anything higher?

8K will require HDMI 2.1 for 60Hz (HDMI 2.0 can only do 8K at 30Hz), but the HDMI 2.1 data-transfer limit is 10K at 120Hz, so I imagine 8K could do at least 140Hz: https://en.wikipedia.org/wiki/HDMI#Version_2.1
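
If you want to sanity-check that with back-of-envelope numbers (raw pixel counts only; this ignores blanking intervals, chroma subsampling, and Display Stream Compression, so treat the results as ratios rather than exact HDMI limits):

```python
# Back-of-envelope pixel-rate comparison for the HDMI claims above.
# Raw pixel counts only, so read these as rough ratios, not spec figures.

def pixel_rate(width, height, hz):
    """Pixels per second for a given mode."""
    return width * height * hz

r_8k_60   = pixel_rate(7680, 4320, 60)     # 8K at 60 Hz
r_10k_120 = pixel_rate(10240, 4320, 120)   # the cited 10K at 120 Hz ceiling

# If the link can carry the 10K@120 pixel rate, the same budget spent on 8K gives:
max_8k_hz = r_10k_120 / (7680 * 4320)

print(f"8K@60 needs ~{r_8k_60 / 1e9:.1f} Gpixels/s; 10K@120 is ~{r_10k_120 / 1e9:.1f} Gpixels/s")
print(f"Same budget at 8K resolution: ~{max_8k_hz:.0f} Hz")  # ~160 Hz
```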

 

How long until a decent display supports that is anyone's guess though.



8k, really? How long will they continue to boost the resolution? Of course it's great for prints and stuff, but if they keep doubling the resolution every ten years or so, eventually you will need a 30x microscope held to the screen to see the pixels. I prefer mine big, like a CRT mask.

 

So... SD -> "2K", i.e. 1920x1080, then "4K", i.e. 3840x2160, and now "8K" is apparently a thing at 7680x4320. What next? "32K", technically 30720x17280, would be a half-gigapixel resolution, or more precisely about 531 megapixels. Yeah, I skipped 16K. Now we have Moore's law applied to HD displays, when the human eye really can't discern much above 4K, or even above 1080p on a non-gigantic screen at normal viewing distances. Hopefully they just stop somewhere and settle on some arbitrary value once the pixels are so infinitesimally small as to be invisible.
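
Here's that ladder as raw pixel counts (simple multiplication, nothing more):

```python
# The resolution ladder above, as raw pixel counts.
ladder = [
    ("1080p / '2K'", 1920, 1080),
    ("4K UHD",       3840, 2160),
    ("8K UHD",       7680, 4320),
    ("'16K'",        15360, 8640),
    ("'32K'",        30720, 17280),
]
for name, w, h in ladder:
    print(f"{name:>13}: {w} x {h} = {w * h / 1e6:,.0f} megapixels")
# '32K' lands at ~531 megapixels, i.e. roughly half a gigapixel.
```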

 

Frame rate, definitely, will be a huge boost to the film industry. Live sports and news events have been captured at 60Hz since the advent of broadcast television, yet we still shoot movies at 24fps. If people during the early days of cinema had had access to faster projectors and faster, higher-ISO film, I am sure they would have gone with a full 60Hz or faster frame rate, yet legacy dictates that even most modern films be shot at 24fps. And with the push for higher detail and bandwidth, lower resolutions at higher frame rates are simply not in the cards at this time. I consider the jump from 24Hz to 48, 60, or 120Hz a quantum leap, not so much the jump from 4K to 8K.


8k, really? How long will they continue to boost the resolution? [...]

Well, there is the law of diminishing returns. The difference between 30Hz and 60Hz can be night and day, but the difference between 120 and 150 would be virtually indistinguishable. I imagine display technology will continue to improve until it hits the limit of what can be marketed. Regardless of what the human eye can detect, people love numbers, and the bigger they are, the more we spend. Hence phone cameras marketed by megapixels, even though that says nothing about whether the camera is any good at all. Factors like motion detection, light sensitivity, focus, etc. are far more important, but once we pick a number we're attached to, everything else be damned as long as that number gets bigger with each generation.

 

That said we are nowhere near the limit of what the human eye can detect now.

 

As for when films will adopt this technology: it wasn't until "The Hobbit" that the movie industry tried shooting some films at a mere 48fps. You might wonder why, considering that 1080p 60fps cameras in phones were common at the time, let alone the industrial-grade gear studios could purchase if they chose to. The answer is that the movie industry doesn't want to show you reality; they want to show you fantasy. With practical effects and CGI, deceiving your audience into thinking something is real becomes far more difficult when the real parts of the film are captured in such distinct detail. If you pay attention while watching "The Hobbit," for example, you can clearly see the makeup on some of the characters, and for me at least that broke the immersion and took me out of the film.

 

As for games, graphics are *huge* selling points: most people who know what the numbers mean are a bit elitist about it, and most casual gamers just know that bigger is better. On the other hand, would you rather have a game that took 8 more years to develop because you can see with crystal clarity every link in a chain-link fence, or would your enjoyment of the game overall be virtually the same regardless of how clear that fence looked?

 

There really are no correct answers to these issues, as the tradeoffs come down to personal preference for both developers and consumers. Personally, I'm very excited for ultra-realistic, immersive virtual and augmented reality, and I'll take my improvements where I can get them. But for retro gaming it doesn't matter, except for getting your game to display with minimal lag, unless of course you are also enhancing the image Ultra 64-style with anti-aliasing filters and such.


I don't know how far they're going to take it. But this article talks about eye resolution and may shed some light on a practical limit. I personally seem to recall it's around 16,000 lines at standard viewing distances.

 

It's also interesting to learn you can double or quadruple your resolution by tilting and straining your neck to see more detail. You do it all the time when trying to make out details in the distance. It's a technique camera makers are only now bringing to the public: Sony does it with their anti-shake mechanism, shifting the entire sensor by a quarter pixel to get a different set of light rays, and then integrating the new, slightly different scene in software.
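
As a minimal sketch of that sensor-shift idea (this is not Sony's actual pipeline, which also has to handle alignment and color demosaicing; the arrays below are just stand-in data, and I'm using half-pixel offsets for simplicity), several sub-pixel-shifted captures can be interleaved into a grid with twice the sampling density:

```python
# Minimal sketch of sensor-shift "super resolution": four captures, each
# offset by half a pixel, slotted into a grid sampled twice as densely.
import numpy as np

def combine_pixel_shift(captures):
    """captures: dict keyed by (dy, dx) in {0, 1} -> equal-shape 2D arrays."""
    h, w = captures[(0, 0)].shape
    out = np.zeros((2 * h, 2 * w), dtype=float)
    for (dy, dx), frame in captures.items():
        out[dy::2, dx::2] = frame   # each shifted capture fills its own sub-grid
    return out

# Fake data standing in for four half-pixel-shifted exposures.
rng = np.random.default_rng(0)
caps = {(dy, dx): rng.random((4, 4)) for dy in (0, 1) for dx in (0, 1)}
print(combine_pixel_shift(caps).shape)   # (8, 8): double the sampling density
```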

 

http://www.clarkvision.com/articles/eye-resolution.html


All this crap about not discerning 250-260 FPS is total bullshit. You won't see the individual frame, but instead you'll see different degrees of smearing.

 

For zero smearing of a sprite traversing the entire screen in one second, moving left to right on a screen with 200 pixels of horizontal resolution, you have to update at 200Hz. Anything less, and the object begins to smear or flicker as it skips pixels. At a 100Hz rate it has to skip every other pixel to cover the distance; at 60Hz it skips two or three.
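
The same 200-pixel, one-second traversal worked out as plain division:

```python
# Pixels of travel per refresh for a sprite crossing a 200-pixel-wide
# screen in exactly one second, at various refresh rates.
screen_width_px = 200
for hz in (60, 100, 120, 200, 240):
    step = screen_width_px / hz
    print(f"{hz:>3} Hz: {step:.2f} px per frame")
# Only at 200 Hz (or above) does the sprite land on every pixel; below that
# it has to skip pixels every frame, which reads as judder or smear.
```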

 

And while you may not see individual frames above 30FPS, you will see the object distort, stretch, or ghost. You can see this another way in the fast-action cameras used for the action scenes in Star Trek: Enterprise. It's almost stop-motion: you can pause the playback and see perfectly sharp poses in what should be a blur of punches and tumbles. It's pretty obvious when they switch to the small high-speed camera.

 

---

 

I used to play Quake III at 90 FPS. It was oil-on-glass smooth - until the game engine was asked to move something more than 90 pixels in one second. Then you could see a subtle double image that would stretch a little more and a little more the faster it moved.

 

The effect is not unlike the old-school cartoons where they ring a bell over the character's head and everything starts vibrating. Just much smoother and consistently modulated.


All this crap about not discerning 250-260 FPS is total bullshit. You won't see the individual frame, but instead you'll see different degrees of smearing. [...]

What it is saying is that the human eye, on average, updates about 250 times per second. What you are describing, moving one pixel per frame for as many frames as there are pixels, is "perfect quality": an exact record of how something moves. That is not the same as how the human eye perceives motion. Essentially, from the roughly 250fps of input we take in, our brains construct the movement. So while what you are describing would be perfect quality, it is not quality we can actually detect.

 

If you doubt me, feel free to test it with a 240Hz monitor and see if you can accurately tell the difference between 230Hz and 240Hz.


What it is saying is that the human eye, on average, updates about 250 times per second. [...]

 

Any deviation from perfect quality as described shows up not as flicker but as wider edges on an object, in the direction of motion, especially if the edges are outlined in a different color than the background. High contrast or a vector outline shows it best.


8k, really? How long will they continue to boost the resolution? [...]

 

We are not going to see TVs or computer screens over 8K. UHD was created with the intent of having 8K theatrical resolution. The only things higher than that are hemispherical screens for IMAX, and the only way you get that at home is with a VR headset.

 

It's actually unlikely we will see anything over 4K for computer monitors. We may see 240Hz 4K monitors, but we won't see 8K. TVs will get 8K, but that's it.

 

I know it sounds strange to basically say "that's it," but that is it. Computer hardware is not improving in a way that will allow anything better. There is maybe one viable die shrink left, and that might not be seen for three more years. So take what we have right now and multiply by 4. Current GPU power can do 4K at 60fps with a single $1000 card. To get 240fps, you need four of them, so a die shrink may allow that to be reduced to a single card. 8K is four 4K screens, just as 4K was four 1080p screens.
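
Spelling out that "multiply by 4" as raw pixel-rate ratios (fill-rate proportions only; real GPU load depends on far more than pixel count):

```python
# Rough pixel-rate comparison behind the "multiply by 4" argument.
base = 3840 * 2160 * 60            # 4K at 60 fps, the stated single-card baseline

targets = {
    "4K @ 240": 3840 * 2160 * 240,
    "8K @ 60":  7680 * 4320 * 60,
    "8K @ 240": 7680 * 4320 * 240,
}
for name, px in targets.items():
    print(f"{name}: {px / base:.0f}x the 4K@60 pixel rate")
# Both 4K@240 and 8K@60 are 4x the baseline; 8K@240 is 16x.
```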

 

Without some kind of lossy compression allowed in the monitor, it's unlikely anything higher will be possible. And that's assuming anything higher than 8K could even be built at anything under 40".



I've been happy with HD and 4K for a long time. If 8K comes, it comes. If not, no big loss.

 

In a weird way I hope the evolution and process-shrinkage rate slows down, because all it has resulted in is bigger and more bloated software. Once limits are set, improvements and incentives to upgrade will be transformed from the need to maintain speed into actual functional improvements.

 

Take the word processor: all one ever needed was what was present in the early Office series, at most. Now, to run the latest WP you need the latest CPU or it bogs down. That's what the industry has been calling an upgrade. With slower hardware evolution they'll need to wow the customer by doing something genuinely different, not just curing bloat-induced slowdowns.


All this crap about not discerning 250-260 FPS is total bullshit. [...]

Yes, 240fps is a useless metric. If an object travels fast enough across the screen, you will see jaggies. For instance, if someone shot a gun in front of a strobe light running at several thousand discharges per second, a human viewer would not see a blur but multiple bullets. The bullet would only appear for an instant, but with persistence of vision combined with the high strobe rate, the retina would record several images of the passing bullet, perceiving not a blur or a flash but a series of repeated objects. A more practical example is when LEDs or VFD displays are strobed at a flicker rate much higher than 60 or even 240Hz. If you rapidly move your eyes from point A to point B in a darkened room, steady lamps will trace a straight line, whereas strobed or duty-cycled lamps will produce a series of dots in your vision.

 

Whenever there is fast motion in a film, 24fps creates a jarring effect, with the scenery staggering. This is especially true, whether shooting digital or analog film, when the scene is in broad daylight and the exposure time is considerably shorter than the length of a frame. Ideally the shutter should be held open for the entire duration of each frame, so that high-motion elements in a scene become blurred and the stuttering effect is reduced. For analog film this is impossible, but it is possible to reset the CCD of a digital camera without closing the shutter. So if video recorded at 24, 48, 60, 120fps, or some other arbitrary frame rate could record the light entering the camera for the entire duration of each frame, with mechanical rather than digital image stabilization, you don't get a serration effect when an object passes across the screen at high speed. Low-motion detail is still sharp, but high-motion detail becomes blurred.

This is harder to do with CGI effects, as motion must be tweened between frames, requiring many more rays to be rendered per frame to avoid image degradation. It would help with persistence of vision and realism, especially if the CGI effects are motion-blurred by the same fraction of the frame as the shutter speed of the live-action footage they are being combined with. Oftentimes in fast-moving scenes the CGI elements are sharper than the real-life action, though that can be mitigated by the expensive tweening technique of rendering subsampled pixels at varying sub-frame time points. So while yes, 48fps is a definite improvement, it is not good enough for fast motion, albeit far less jarring than 24fps.

As for SD vs HD vs UHD, it depends upon a lot of factors. If you're viewing a small 26" TV from 20 feet away, anything beyond SD is a placebo. Your eyes will discern far less detail than, say, 5 feet away from a 72" screen, where 4K UHD may be a marked improvement over 1080p.
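
As a minimal sketch of that idea (not any studio's actual renderer; render_scene() below is a hypothetical stand-in for whatever produces an image at a given time), an open-shutter look can be approximated by averaging several sub-frame renders within each frame interval:

```python
# Sketch of simulating an open ("360 degree") shutter for CGI: render several
# instants inside one frame interval and average them, so fast-moving elements
# pick up the same blur a live-action camera with that shutter would record.
import numpy as np

def render_with_motion_blur(render_scene, frame_index, fps=24, shutter=1.0, subsamples=8):
    """shutter=1.0 keeps the virtual shutter open for the whole frame;
    shutter=0.5 would match a 180-degree film shutter."""
    frame_duration = 1.0 / fps
    t0 = frame_index * frame_duration
    times = t0 + np.linspace(0.0, shutter * frame_duration, subsamples, endpoint=False)
    return np.mean([render_scene(t) for t in times], axis=0)

# Toy "scene": a single bright pixel sweeping across a 1x200 strip in one second.
def render_scene(t):
    img = np.zeros((1, 200))
    img[0, int(t * 199) % 200] = 1.0
    return img

blurred = render_with_motion_blur(render_scene, frame_index=0, fps=24)
print(np.count_nonzero(blurred))   # several lit pixels: the motion smear within one frame
```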


That said we are nowhere near the limit of what the human eye can detect now.

 

Nice video.

 

I'd always taken it that the 100Hz CRT was the standard, and the eye's limit.
But I suppose it is better to double that to 200Hz-250Hz for sampling reasons, like the Nyquist theorem in analog signal theory.
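
Reading that as a loose Nyquist-style argument (and it's only an analogy; the eye isn't a uniform sampler), the arithmetic is just "sample at more than twice the highest temporal frequency you care about," which is where doubling a ~100-125Hz figure into the 200-250Hz range comes from:

```python
# Loose Nyquist-style arithmetic: the sampling (refresh) rate must exceed
# twice the highest temporal frequency you want to represent without aliasing.
def min_refresh_hz(f_max_hz):
    return 2 * f_max_hz

for f_max in (50, 100, 125):
    print(f"Detail up to {f_max} Hz -> refresh above {min_refresh_hz(f_max)} Hz")
```
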
Well, again, VR headsets are more likely to get that 16,000p 250fps realism by using the way the eye works against itself. Instead of having a 16K monitor that wraps around your entire peripheral vision, you'd have something like a 2K or 4K display that physically occupies about 4" of space but takes up the entire peripheral vision within the VR headset's sweet spot.

 

As it stands right now, people get sick if a VR headset isn't running at even 100fps (Sony's VR kit is apparently only 90), to say nothing of the latency. I don't even like VR stuff.

 

But to reiterate my previous point, there will likely be no market for an 8K computer monitor or a >8K TV, because for it to be of any practical use you'd need a 48" screen filling your entire peripheral vision, and right now a 24" 4K monitor sitting about 3' away tends to fill the entire sweet spot. You can't really tell a 4K from an HD monitor at 24", short of some scaling artifacts.

 

E.g., with my 4K and my HD monitor side by side showing a photo I scanned from a negative at better-than-4K resolution, yes, the 4K IPS monitor looks better; but if I'm standing 8' away from the monitors I could not tell you whether that was a 4K photo or an HD photo, only that the 4K IPS monitor is more colorful.
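
A rough way to put numbers on that is pixels per degree of visual angle. Using the commonly quoted ~60 pixels/degree figure for 20/20 acuity (ballpark only, and this geometry assumes a flat 16:9 panel):

```python
# Rough pixels-per-degree check for the "can't tell 4K from HD at 24 inches" claim.
import math

def pixels_per_degree(h_pixels, diag_in, distance_in, aspect=(16, 9)):
    width_in = diag_in * aspect[0] / math.hypot(*aspect)
    h_fov_deg = 2 * math.degrees(math.atan((width_in / 2) / distance_in))
    return h_pixels / h_fov_deg

for name, px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    ppd = pixels_per_degree(px, diag_in=24, distance_in=36)
    print(f'24-inch {name} at 3 ft: ~{ppd:.0f} pixels/degree')
# 1080p already lands near the ~60 ppd acuity figure at this distance,
# which is why 4K and 8K gains are hard to see on a small desktop monitor.
```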

 

At some point, likely by 2025, computers, TVs, and monitors will just be like cars, where there's very little improvement other than energy efficiency and it's just a feature fight.


A quadrillion quadrillion pixels, with a refresh period of one Planck time unit. Anything less will simply not compare to real life.

 

There is a difference between real life (the subjective experience of the world) and the real world (the world as it objectively is). A screen doesn't have to render what is objectively out there to compare to real life. It just has to render what we experience subjectively in here, which is much more macroscopic than the microscopic quantum world out there. In other words, a screen just needs to render what can be seen with the naked eye, not what is happening on the quantum level that we use instruments other than our eyes to detect.


 

There is a difference between real life (the subjective experience of the world) and the real world (the world as it objectively is). [...]

 

Except that a) every person is different from every other person, with occasional extreme differences in both physical capabilities and cognitive workings, and b) as time goes on we will be enhancing our physical bodies (including our brains) with additional technology that may increase our perception and the amount of information we can directly absorb.

 

Perhaps one goal some might strive for would be to have "real life" (as you put it) approach the "real world", through the use of technology. Even today, just by wearing eyeglasses, billions of people are "enhanced".

 

Artificial eyes are already being developed. It probably won't be that long until we start arguing about the resolution of our implants!


 

Except that a) every person is different from every other person, with occasional extreme differences in both physical capabilities and cognitive workings... [...]

 

Let's say we had eyes that could see down to the quantum level. Would real life be brought any closer to the real world by being able to see probability waves? No, we would only see particles, because the act of observing collapses the wave function. In other words, whether a photon shows up as a probability wave or a particle depends on whether or not we are observing it. To put that another way, we can't directly see what the world objectively is, because what has brought us closer to understanding it isn't just figuring out how it behaves when we are looking at it, but also how it behaves when we are not.


I work with a few of the people that are trying to figure this all out and are actively developing industry standards to make sure it all holds together.

HDR (at 12-bit color depth) is the main topic of conversation, along with higher frame rates for sports content (120fps good, 240fps better), more efficient compression codecs (like HEVC), as well as studies on things like the best viewing distance from the screen (~6 feet), etc.
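
For reference, here's what the jump to 12-bit buys over 8-bit and 10-bit pipelines, just counting representable levels (straight powers of two, nothing vendor-specific):

```python
# Levels per channel and total RGB colors at common bit depths.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:,} levels per channel, {levels ** 3:,} total colors")
# 8-bit:  256 levels   -> ~16.8 million colors
# 10-bit: 1,024 levels -> ~1.07 billion colors
# 12-bit: 4,096 levels -> ~68.7 billion colors
```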

Then there's the distribution problem. This is probably where the streaming vendors will have the advantage over cable and local broadcasters.

In terms of infrastructure, most broadcast facilities can't handle anything over 1080i.

 

I've seen 8K TVs at trade shows and I have to say that I can't tell that I'm not looking at a 4K screen.

But if the public wants it they get it.

