Everything posted by jbanes
-
Mmmm... a lot of good points there. It must have been a lot of *cough* fun to work for Atari in those days. It's a secret to everybody. But the 2700 sticks only had one fire button. That undoubtedly factored in. Single-button controllers weren't feasible anymore. Still, adding another button is a lot less work than designing a new controller. From this photo, it seems that Atari added the second button at some point during the Sylvia's development. What's interesting, though, is that the image looks a lot closer to what became the 5200 controller. So I guess the answer to my own question might be that Atari wanted something *different* from the 5200 controller, and they felt that the 5200 lineage tainted the Sylvia controller.
-
Odyssey2: Intel 8048 8-bit microcontroller running at 1.79 MHz
2600: MOS Technology 6507 @ 1.19 MHz
Or are you comparing cycle counts per instruction? (As I said, I know jack-all about the O2 architecture.) It's kind of the same for the 2600, though. If it had more sprites available, more work could be done. Instead, programmers had to pull tricks like flickering, reusing missiles as sprites, and that repeat-the-sprite-on-one-line trick whose name escapes me at the moment, but which was used in Space Invaders.
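Tricks like flicker are really just time-multiplexing the sprite hardware: with more objects than sprites, you rotate which objects get drawn each frame. A minimal sketch of the idea (purely illustrative Python with hypothetical object names, not actual 2600 code):

```python
# Illustrative sprite multiplexing: 4 objects share 2 hardware sprites.
# Each object is drawn every other frame, appearing to "flicker" at half
# the display rate instead of vanishing entirely.

HW_SPRITES = 2
objects = ["player", "alien1", "alien2", "alien3"]

def sprites_for_frame(frame):
    """Return which objects get the hardware sprites on this frame."""
    start = (frame * HW_SPRITES) % len(objects)
    return [objects[(start + i) % len(objects)] for i in range(HW_SPRITES)]

# Frame 0 draws the first pair, frame 1 the second pair, and so on.
```

Over any two consecutive frames, every object gets drawn once, which is why flickering games still look complete to the eye at television refresh rates.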
-
How interesting. I assume that you're checking every so often to see if the computations need to be split up between frames?
-
The difference is beside the point since you don't store sprites in the visible buffer. I was speaking only in terms of operations. 256 operations to blit an image is still more than 1. (Or 1 per line if we want to get specific about 2600 hardware.) When offering an explanation to someone, the goal should be to clarify their understanding, not confuse it. Fair enough. Simplifying for non-techies is a difficult thing to do. If you feel you can do better, please feel free to interject your own explanation.
-
*nods* I suppose. Though wouldn't the computer side have been the former console side? Given that it was going to be the next console anyway, that is.
-
*sigh* How did I know that was going to bug you? I was trying to keep it simple. If you miss your deadline, what you want to get rendered doesn't get rendered. Thus you can't afford to miss your deadline. No it isn't. Flickering is the 2600's answer to limited sprite hardware. From the perspective of a non-techie, though, the two are very similar. The 2600 runs out of graphical power before it runs out of CPU power. Other systems run out of CPU power before they run out of graphical power. Thus the tradeoffs are similar, even if they're not precisely the same. Not that I expect you to agree with such an abstract viewpoint. EDIT: See? Now we've overloaded the poor schmuck's cognitive stack! He's lucky he doesn't have smoke pouring out of his ears!
-
I'm not sure that I'm the best person to answer this, given that I currently know jack-all about the O2's rendering hardware. However, I can make a few generalizations.

First and foremost, as the Killer Bees programmer explained in an interview, speed is merely an illusion. If you want something to go faster, merely move it more pixels per frame. The only tradeoff is in the smoothness of the movement. Large jumps can occasionally become detectable by the human eye. Thankfully, television screens run at 60 fields per second, interlaced: far above a frame rate that humans can consciously see, but bordering the edge of what the brain can perceive. So we don't normally notice a bit of oddness in the movement. This can be further obscured by using sudden changes in direction and acceleration to screw up the brain's ability to detect where the object *should* be.

Secondly, different rendering technologies make for strengths in different areas. A machine like the 2600 could smoothly render a few objects on screen at a time, because the signal generator was capable of converting a few chunks of memory into a partial video signal. The remainder of the signal was just filled in with a solid color. The background ability of the 2600 allowed that color to switch between two options in increments of 1/20th of the width of your television. (More colors could be shown if you were willing to waste precious CPU power on fiddling with the signal generator.) Something like the Intellivision, however, was able to fill the screen by rendering repetitive chunks of memory for the background, then allowing 8 more chunks of memory to be arbitrarily tracked and rendered on the screen. The Nintendo did something similar, except that it used tiles of multi-colored imagery instead of single-color "on/off" pictures. (You know, the more I think about it, the more I realize that the NES designers must have been Intellivision fans.)

The ultimate rendering technology, however, is the framebuffer.
By mapping each pixel onto a memory location, the entire screen can be manipulated. The advantage to this is that you can show anything that can possibly be shown by a television monitor. There are no limits on how many objects can be rendered, and there's no difference between the background and the moving objects. It's all the same.

The disadvantage to a framebuffer, however, is that it takes an incredible amount of time to move an image from main memory to video memory. For a 16x16 pixel image, that's 256 operations! Now compare that to the sprite hardware of most game consoles, where the game merely modifies a single register (1 operation) to point to an image in ROM. That's a big difference! However, if you have enough memory for a framebuffer, then you can also afford to miss a frame. If you don't render the next frame, the video memory still contains the last frame's information, so the signal generator keeps showing the last frame until it gets updated. If the CPU is particularly busy and can't update the frame, this will look like the game has slowed down.

Now, most older consoles didn't use framebuffers due to memory costs. (320x200 at 256 colors is ~62K of memory!) However, they could store data for rendering an entire frame instead of storing only enough for one line. (The 2600 registers had to be updated on each line, or the next line would end up looking just like the previous one.) Once the data was stored like this, more advanced graphics processors could render a frame of data independent of what the main CPU was doing. Thus the ANTIC/GTIA (Atari 8-bit), MARIA (7800), and PPU (NES) were all capable of missing a frame without failing to display anything. When they missed a frame, they'd appear to "slow down" much in the way that a framebuffer would "slow down".

Umm... does that sort of make sense, or is it too technical?
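To put a number on that blit cost, here's a tiny counting sketch (purely illustrative Python, not console code; the dimensions are the 16x16 example from above):

```python
# Count the per-pixel writes needed to blit a 16x16 image into a
# framebuffer, versus the single register write sprite hardware needs.

WIDTH, HEIGHT = 16, 16

framebuffer = [[0] * 320 for _ in range(200)]  # e.g. a 320x200 display
image = [[1] * WIDTH for _ in range(HEIGHT)]   # the 16x16 sprite image

blit_ops = 0
for y in range(HEIGHT):
    for x in range(WIDTH):
        framebuffer[y][x] = image[y][x]  # one memory write per pixel
        blit_ops += 1

sprite_ops = 1  # sprite hardware: point one register at the image in ROM

print(blit_ops, "vs", sprite_ops)
```

256 writes versus 1: that's the gap the dedicated sprite hardware was buying back.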
-
Gas == $3.00/gal
90 mi / 25 MPG = 3.6 gallons (You can do better than that, right?)
3.6 gal * $3.00/gal = $10.80
~$0.40/toll * 5 tolls = $2.00
$10.80 + $2.00 = $12.80 one way
$12.80 * 2 = $25.60 round trip

Considering that you'll be spending $10.00 per person for entry, at least $5 per for lunch, and $100's of dollars on new gear, all I have to say is: Don't be such a cheapskate.
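For anyone who wants to fiddle with the assumptions, the same trip-cost arithmetic as a quick sketch (numbers taken straight from the figures above):

```python
# Round-trip cost estimate: gas plus tolls, one way, doubled.
gas_price = 3.00   # $/gal
distance = 90      # miles one way
mpg = 25
toll = 0.40        # $/toll (approximate)
num_tolls = 5

gallons = distance / mpg            # 3.6 gal
gas_cost = gallons * gas_price      # $10.80
toll_cost = toll * num_tolls        # $2.00
one_way = gas_cost + toll_cost      # $12.80
round_trip = one_way * 2            # $25.60

print(f"${round_trip:.2f} round trip")
```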
-
Pretty much. In the world of computing, the 2600 just didn't have enough power to make that particular Space/Time tradeoff. It had to either power the graphics device or expect to show a blank screen. (Speaking of which, I believe that 3D Tic Tac Toe did exactly that. While it was "thinking" it helpfully killed the video signal so that it could use more processing power. Given its graphics, that's actually kind of pathetic when you think about it. But hey, they had to keep it down to 20 minutes of "thinking" somehow.)
-
Doug was talking about his original Atari 800 version of Star Raiders, not the 2600 version. The 800 had the memory to store an entire frame for the GTIA to render, as opposed to the 2600's just-in-time rendering design. The 2600's design led to more flicker, because missing your deadline meant that nothing would get rendered. On top of that, the 2600 just didn't have enough objects to throw around the screen at any given time, so they had to be multiplexed. The multiplexing ended up being the most common source of flicker simply because it was intentional.
-
That's a good point. I'm a bit curious, though. Does anyone understand the interplay inside of Atari between the design of the Atari 8-bit boards and the cancellation of the 3200? Some of the questions in my mind are:

- Why didn't they release a PAM console in parallel with the Atari 400/800 release?
- Why did they decide to create a *new* console in response to the Intellivision when they already had one?
- Why did Atari decide to redesign the 2600 controllers as the Prolines when they already had the excellent 2700/Sylvia designs?

There's just so much about the 5200 history that doesn't seem to add up. One could almost swear that Atari was suffering from multiple personality disorder, and was unable to do anything other than start brand-new projects. Not to mention this infatuation with copying the competitor. The Sylvia had a 10-bit processor while the 5200 had a (mostly useless) keypad. WHY? Especially given that the Intellivision's power stemmed more from the STIC design and less from the processor itself. (Which was actually a) 16-bit and b) clocked slower than the 6502.)
-
See, this is what I really love about the Blue Sky Rangers. They truly enjoyed working with the Intellivision. To them, it was much more than just a job. While Mattel lost a few developers to Imagic early on, Mattel never experienced an exodus of their best developers quite like the Activision exodus at Atari. One has to wonder if the entire third-party market could have been prevented had Atari treated their programmers with a little more respect and dignity.
-
They're both. The Intellivision contains built-in graphics (mostly alphanumeric characters) in GROM, as well as user-definable characters in GRAM. The GROM has 256 entries, while GRAM can be programmed with up to 64 entries.

Not exactly. According to the Blue Sky Rangers, the Intellivision was based on a "fun project" design published in the General Instruments catalog. GI called the console the Gimini 6900. Now, the Gimini was designed to use only GROM, but BSR Dave James insisted that dynamic sprites were a requirement. Thus GI helped Mattel modify the design to use both. According to this page, the running man was in GROM. But I don't know if that's an authoritative source or not.

To get back on topic... All kinds of old systems are underrated and underappreciated. IMHO, each and every one has a history behind it from which we can learn something. Their popularity in the market, or lack thereof, only makes them that much more interesting. Even for the consoles that did poorly, you still have to take the time to realize that no one set out to make a bad system. Rather, *someone* *somewhere* thought the work they were doing was valuable at the time. Analyzing why it didn't pan out is half the fun. And there's no better way to do an analysis than... play the games!
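The GROM/GRAM split mentioned above can be pictured as a toy card lookup (hypothetical function and data layout for illustration only; the real STIC/BACKTAB word encoding differs):

```python
# Toy model of the Intellivision's two character stores: 256 fixed
# pictures in GROM, 64 programmable pictures in GRAM. Each "card" is
# modeled as 8 bytes, one byte per 8-pixel row.

GROM_CARDS = 256   # built-in pictures (mostly alphanumerics)
GRAM_CARDS = 64    # user-definable pictures

grom = [bytes(8) for _ in range(GROM_CARDS)]      # read-only in hardware
gram = [bytearray(8) for _ in range(GRAM_CARDS)]  # programmable at runtime

def fetch_card(index, use_gram):
    """Return the 8-byte picture for a card, from GRAM or GROM."""
    if use_gram:
        return gram[index % GRAM_CARDS]
    return grom[index % GROM_CARDS]

# A game can redefine a GRAM card on the fly for dynamic sprites:
gram[0][0] = 0b00111100  # top row of a custom shape
```

The point of the split is exactly what the post describes: GROM gives you a big fixed library for free, while GRAM is the small pool you can redraw for dynamic graphics.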
-
Indeed. Since Doom is so low-res anyway, all they really needed to do was swap in the (much smaller) Intellivision color palette, and voila! Instant pixelation. Of course, the Intelli's graphics design makes such a port unlikely at best (the STIC isn't exactly a framebuffer), but it's kind of fun for stringing along the less astute. The IntyOS, on the other hand, now that's cool. Utterly and completely useless, but cool nonetheless.
-
Someone doesn't get the joke*. * You did read the fine print, didn't you?
-
Cool links time!

IntyOS
A desktop environment for your Intellivision? Who would have thought? These guys have gone to great pains to ensure that you can use your Intellivision disc as a mouse controller to run a full desktop system. It doesn't do much, but it's pretty neat to look at. If you don't have an IntelliCart, you can try a demo of their older version thanks to the Java version of Bliss.

Doom
Make all your Atari 2600 friends jealous with this bonafide* Intellivision port of Doom! The only downside is that the ROM is a whopping 32K. But hey, it's Doom! Try it through your IntelliCart or load it into Bliss.

Mike's Intellivision Page
Missing a manual? Don't know what the overlay is supposed to look like? Want a rarity guide similar to AA's, but for the Intellivision? Mike's site is the place to go!

INTV Funhouse
Lots more Intellivision info, including box scans, reviews, overlays, and cartridge photos. If a game isn't listed here, it probably didn't exist.

That's all for now. FYI, Space Spartans is a lot more fun than I had expected! After playing Star Raiders, Star Voyager, and Starmaster, I was expecting Yet Another clone of the overly complex, extremely difficult space games. I must say, SS exceeded my expectations by a mile. The controls are extremely easy to use (once you figure out that you need two keypresses to make it do what you want), the difficulty ramps up smoothly, and the voices really do help keep the management details manageable. Sure, it's really more of a Space Battle II, but Space Battle was fun! If you haven't tried it before, consider getting yourself an Intellivoice and a copy of Space Spartans. You won't be disappointed!

* Make sure you read the fine print.
-
By "close enough to drive," do you mean "close enough to get a hotel" or "close enough to drive from home"? Keep in mind that it's a two day event. If you mean the latter, you can go for just one day. Or perhaps (depending on your son's interests) take him with you as a Birthday celebration?
-
Which brings up a question that's really been bugging me. Does the 6502 downshift even for accesses like reading the Joystick Position, or is the lower clocking only triggered when you write to the TIA's ports? I've been crossing my fingers that it's the latter, because the former would make the chip incredibly erratic; but I'm concerned that I may just be practicing wishful thinking.
-
You do? Wow, that's cool! Can I have one?
-
Considering that they haven't been manufactured since Carter was President, I'd say the engineers will be tweaking them whether NASA likes it or not. The materials used today just aren't the same heavy metal sheets that were popular when the Saturn V was designed. As a result, the engines are likely to be lighter and more efficient based on materials and manufacturing upgrades alone. A bit like the Flashback 2.0.

That somewhat concerned me as well, but it all works out in the end. As it so happens, the SSMEs (7000 lbs) are just about twice the weight of the J-2s (3600 lbs), and just about twice the thrust. The SSMEs win on theoretical fuel efficiency (Isp), but not so much as to make a difference.

I never quite understood this. In my mind a turbopump is a motor, not a controlled explosive. Alas, that's what they call them on model rockets as well, so go figure.

Yeah, they really tried their best to get that sucker flightworthy as fast as possible. It's too bad that the program was cancelled after only one successful flight. I couldn't care less about the Shuttle itself (they copied many of the bad features of the American shuttle with only a few excellent corrections, but did so without a contextual reason for doing so), but the Energia was a sight to behold. Still, it's a bit sad that the only Russian shuttle ever to fly has been lost to history.
-
Ok, I got a response back: So there you have it. This was a really interesting thread, though. It would have been cool if we'd ended up with a rare non-released cart. Almost as exciting as the Cheetahmen II find a few years back. (Not that anything Intelli sucks that bad, but without the International Intellivoice, it would have been just about as playable.)
-
Lightguns don't work on LCD screens, even less so on LCD Computer Monitors. So I can't find out for myself.
-
The "English Language" splat on the link you posted makes me think you're right. It almost looks like it's a repackaging for the French market, but on American hardware. You'll have to plug it in and find out for us.
-
I've sent an email to Keith Robinson about the matter. With any luck, he'll reply with some insight.
-
From the description of the French version: So it should speak French if you can find an unreleased International Intellivoice. Kind of a catch-22 there.
