
How the 2600 reached the '90s when better consoles failed


Serguei2


New console generations are now like... 3 or 4 years, because the industry just can't keep making a profit off them, so they have to convince people that a new console will be an advantage to them.

So much selective observation here...

 

The 360/PS3 console era was the longest planned cycle ever.

 

The 2600's end-of-life marketability was just an accident and a spasm of desperation. You can bet that if the PS3 had somehow been a failure, Sony would have swamped the world with "budget" PS2s in the exact same way.

 

Also, there is nothing forcing me to upgrade my already 4-year-old XOne. It will continue to run software for years.


 

 

What industry?

 

I buy brand-new chips expecting a max of 48 volts to be applied, just 'cause new fragile dainty ARMs are only 1-point-squat volts, heh.

 

CMOS, TTL, DTL, RTL and all the like are still supported; they just might not be in a 1 GHz QFP that does something stupid like blink a light.

For the record, I'm aware that hobbyists can still buy 5V TTL/CMOS chips all day long. But how much of that is newly manufactured versus old stock? Most manufacturers of microcontrollers and CPLDs/FPGAs have discontinued 5V logic. So while some old manufacturers may still produce DIP chips and such with 5V gates on them, the fab facilities for actual custom processors with more complex circuits are dying. If whatever fab Nintendo used to manufacture their FC/SFC CPUs/PPUs and such was shutting down or converting its facilities to smaller die sizes and 3.3V logic, it may not have been feasible to continue production of 5V parts. Then there were RAM chips and other components sourced from outside manufacturers. And furthermore, 1983-2003 is a freakishly long time for a console. Consoles were at an all-time low on the secondhand market, so there was no demand at the time for new ones. They weren't going to keep making legacy hardware forever, as it's not a sustainable business model.


There are hardware reasons for this move, but look at the "mid-generation" refreshes of the Xbox One and PS4. It's ludicrous how short the generation cycle of consoles is getting.

 

This is a widespread and growing problem for consumers of tech. Things move too rapidly and there is less effort on the part of tech companies to keep things backwards compatible.

 

In a semi-related field (no pun intended), look at how fast Intel is revising microprocessors: Skylake, Kaby Lake, Cannon Lake. Or for those of you not into codenames, the 6th, 7th, and 8th generation labels will have to do.

 

And except for niche & technical applications, the 3rd generation of these i7 processors hasn't been anywhere near fully exploited in the consumer space. Imagine the awesomeness we could have if i7s were programmed with the same care as a VCS game!

 

---

 

There is no reason for this rapid pace of granular improvements other than to milk profits. And if they have to install artificial limitations to ensure that happens, then so be it. Look at what OSes are supported on the 6th generation vs. 7th generation of processors.

 

Backward compatibility is now a liability. It doesn't have to be, and there is plenty of power to make workarounds and all that. But... profit...

 

There is no reason the semiconductor industry shouldn't advance at a breakneck pace. But stop breaking things with artificially set rules and limits. No. Won't do it. Because greed & profit.


For the record, I'm aware that hobbyists can still buy 5V TTL/CMOS chips all day long. But how much of that is newly manufactured versus old stock? Most manufacturers of microcontrollers and CPLDs/FPGAs have discontinued 5V logic. So while some old manufacturers may still produce DIP chips and such with 5V gates on them, the fab facilities for actual custom processors with more complex circuits are dying. If whatever fab Nintendo used to manufacture their FC/SFC CPUs/PPUs and such was shutting down or converting its facilities to smaller die sizes and 3.3V logic, it may not have been feasible to continue production of 5V parts. Then there were RAM chips and other components sourced from outside manufacturers. And furthermore, 1983-2003 is a freakishly long time for a console. Consoles were at an all-time low on the secondhand market, so there was no demand at the time for new ones. They weren't going to keep making legacy hardware forever, as it's not a sustainable business model.

 

The problem for hobbyists is that TTL parts are still available, but not in the smaller form factors. A lot of the newer form factors and package technologies, like SMT, thin SMT, and fine-pitch SMT, are rather hard to find, and that's what designers of add-ons for classic computers like to use. Look at the CFFA3000 for the Apple II: it has a whole row of 3V/5V translating buffers, or bus-interface chips. Those are likely still available in full-size DIP packages, but not in the tiny format used in the current product.
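
Not from the thread, but to make the 3V/5V translation concrete: the crudest alternative to a proper translating buffer is a plain resistor divider knocking a 5V output down to a 3.3V input. A minimal sketch in Python (the resistor values are illustrative assumptions, not anything from the CFFA3000 schematic):

    # Toy calculation: a two-resistor divider stepping a 5V TTL output
    # down to something a 3.3V input can tolerate.
    # R1/R2 are illustrative picks, not from any real schematic.
    def divider_out(vin, r1, r2):
        # Standard divider formula: Vout = Vin * R2 / (R1 + R2)
        return vin * r2 / (r1 + r2)

    vout = divider_out(5.0, r1=1000.0, r2=2000.0)  # 1k on top, 2k to ground
    print(f"5V in -> {vout:.2f}V out")  # ~3.33V

A divider like that only works for slow, one-directional signals, though; for a bidirectional bus at speed you need real level-translating buffers, which is exactly why the CFFA3000 carries a whole row of them.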


 

I completely agree. I mean, for me, I take an anti-capitalist lesson from all this ruining of artistic integrity by the shortened business cycle. I don't expect anyone else to agree, but I feel like we need a new way of fundraising and getting resources together in order to make sure people can actually finish their damn work, so you don't have to release a 40GB patch like they did for Halo: The Master Chief Collection on Xbox One. That game was literally shipped deliberately unfinished because the publisher demanded that it be released on a certain schedule.

 

Where they get these schedules from, I don't know. Somehow, people who aren't involved in actually making the video games and hardware get to dictate the flow of the work. Personally, I'm sick of management and HR. They're not there to help you get your job done, but to defend the people pocketing the money from you.

 

Anyway, end rant.

 

I reckon the Atari is amazing for the same reasons as you - the console is well designed, has an amazing library of games that had a great deal of effort put into them, and was accessible to a huge number of people. Pretty much none of that applies today.

 

Absolutely this 2x!

 

The CEO of Panera Bread is on a personal crusade to try and educate other upper-level managers both in and out of his company. He says product cycles are too short, and businesses have an outlook that only extends one year in the best of circumstances. This needs to change.

 

Personally, I think it's a futile attempt because of greed and the almighty dollah. But what do I know; I'm a smartass small-town kid. In any case, I applaud his statement.

 


 

In a semi-related field (no pun intended) look at how fast intel is revising microprocessors, SkyLake, KabyLake, CannonLake. Or for those of you not into codenames, the 6th, 7th, and 8th generation labels will have to do.

AMD, man. Put the money down on a Threadripper. 32 cores of fractal rendering bliss. If only it were still compatible with Win7.


I would be really careful about claims like "reaching the 90s". I am quite sure there was hardly any interest in the 2600 in the early 90s. Of course, you could argue they offered it in catalogues and things like that, but I wouldn't be surprised if demand was significantly lower than it is nowadays.


I would be really careful about claims like "reaching the 90s". I am quite sure there was hardly any interest in the 2600 in the early 90s. Of course, you could argue they offered it in catalogues and things like that, but I wouldn't be surprised if demand was significantly lower than it is nowadays.

I have read about how the 2600 and some other systems were discontinued by 1992, but I have never stumbled across any adverts or catalogs listing the 2600 past the mid-80s, or seen any sales figures.

 

I'd love to see what a late-80s or early-90s ad for the 2600 or 7800 would look like, and how sales fared in the early 90s.

 

I remember seeing a spanking-new Wii sitting alone on a shelf just two years ago at the local electronics store in the small town I live in. The owner refused to lower the price even when I pointed out they could be had used for a fraction of the price.

 

I wonder where and when the last new original 2600 was sold.


The 2600's end-of-life marketability was just an accident and a spasm of desperation. You can bet that if the PS3 had somehow been a failure, Sony would have swamped the world with "budget" PS2s in the exact same way.

 

 

Interesting point. I think they did do this for a couple of years when the PS3 was first released.

 

Trying to keep backward compatibility through emulation was a telling point too; only the PS2 proved more difficult to emulate than anticipated, with titles like GOW showing off how far the hardware could be pushed - another parallel to the 2600.


I have read about how the 2600 and some other systems were discontinued by 1992, but I have never stumbled across any adverts or catalogs listing the 2600 past the mid-80s, or seen any sales figures.

 

 

 

It's true that Atari Corp. ended all official support for 8-bit systems (including the 2600) that year. But I remember that during the early/mid-90s Radio Shack was offering 2600 carts through their ordering service, plus they still sold accessories like a joystick & TV switch box.

 

And even though there was no serious demand from 'current' gamers at the time, there were people collecting Atari games sold at thrift stores & flea markets while swapping notes on Internet newsgroups. Hence how the classic gaming scene got started...


I think there is a difference between popularity and official corporate support, so maybe there is an argument to be made that the 2600 was finished before it was officially unsupported by Atari.

 

But I think we can still interpret the wording of the OP charitably - the 2600 did have a very long lifespan.


For the record, I'm aware that hobbyists can still buy 5V TTL/CMOS chips all day long. But how much of that is newly manufactured versus old stock? Most manufacturers of microcontrollers and CPLDs/FPGAs have discontinued 5V logic. So while some old manufacturers may still produce DIP chips and such with 5V gates on them, the fab facilities for actual custom processors with more complex circuits are dying. If whatever fab Nintendo used to manufacture their FC/SFC CPUs/PPUs and such was shutting down or converting its facilities to smaller die sizes and 3.3V logic, it may not have been feasible to continue production of 5V parts. Then there were RAM chips and other components sourced from outside manufacturers. And furthermore, 1983-2003 is a freakishly long time for a console. Consoles were at an all-time low on the secondhand market, so there was no demand at the time for new ones. They weren't going to keep making legacy hardware forever, as it's not a sustainable business model.

 

For what it's worth,

 

I think the reason the Famicom lived so long in Japan was the same reason the Atari lived on in America: it was their first big console, and it had a momentum of nostalgia behind it. I imagine Japanese parents often bought Famicoms for their (younger) kids even after newer, more powerful systems appeared.


Interesting that your KB kept their 2600 inventory; the one I used to frequent basically cleared all evidence of 2600 items - I don't even recall 7800 merch. Strictly NES- and SMS-related items around that '87 era.

My KB had them as well. I remember seeing Winter Games there, and thinking it was the 7800 version. Nope, it was for the 2600! I couldn't believe it existed.


I completely agree. I mean, for me, I take an anti-capitalist lesson from all this ruining of artistic integrity by the shortened business cycle. I don't expect anyone else to agree, but I feel like we need a new way of fundraising and getting resources together in order to make sure people can actually finish their damn work, so you don't have to release a 40GB patch like they did for Halo: The Master Chief Collection on Xbox One. That game was literally shipped deliberately unfinished because the publisher demanded that it be released on a certain schedule.

 

Where they get these schedules from, I don't know. Somehow, people who aren't involved in actually making the video games and hardware get to dictate the flow of the work. Personally, I'm sick of management and HR. They're not there to help you get your job done, but to defend the people pocketing the money from you.

 

Anyway, end rant.

When you consider that a game earns most of its money the week it's released, and that most game sales happen in Q4 because of Christmas, it becomes crucial for companies to get the game out by the Christmas season, or it will lose money. If you delay it until the next Christmas, it will probably lose money as well, because you are paying a development staff that nowadays consists of hundreds of people for an extra year of work. Companies also try to pick a release date where the game won't be completely overshadowed by another major release. If it slips even a week or two, it will lose the advantage of a carefully chosen date and likely be competing against something else.

 

In the old days, one programmer could create a game in a matter of weeks. These days, you have hundreds of people working for 3 years or more on a single game. All those people have to be managed and kept in sync, or else your game goes into "development hell". You can't compare today's development environment to the old days; it's completely different.


I think you can compare them, though - you need bigger teams because of the logic of money. If you forget about money - something totally incidental and secondary to making games aesthetically perfect - you don't have this dilemma.

 

If people have enough time to finish a game, it has a much higher chance of actually being good. Because of the first week of sales, because of Christmas, we're forced to have enormous teams creating bland sequels.


I think you can compare them, though - you need bigger teams because of the logic of money. If you forget about money - something totally incidental and secondary to making games aesthetically perfect - you don't have this dilemma.

 

If people have enough time to finish a game, it has a much higher chance of actually being good. Because of the first week of sales, because of Christmas, we're forced to have enormous teams creating bland sequels.

 

You need bigger teams because games are bigger, which is a natural offshoot of improving technology. Better graphics and sound capabilities mean you need actual artists, composers, voice actors, etc. to take advantage of the possibilities, not just one person sitting at a terminal coding. That would be the case even if every game was free and every release date was the same as every other.


When you consider that a game earns most of its money the week it's released, and that most game sales happen in Q4 because of Christmas, it becomes crucial for companies to get the game out by the Christmas season, or it will lose money. If you delay it until the next Christmas, it will probably lose money as well, because you are paying a development staff that nowadays consists of hundreds of people for an extra year of work. Companies also try to pick a release date where the game won't be completely overshadowed by another major release. If it slips even a week or two, it will lose the advantage of a carefully chosen date and likely be competing against something else.

 

In the old days, one programmer could create a game in a matter of weeks. These days, you have hundreds of people working for 3 years or more on a single game. All those people have to be managed and kept in sync, or else your game goes into "development hell". You can't compare today's development environment to the old days; it's completely different.

This is one area where Nintendo gets game development right. They will push a game back months or more if it means a more polished experience for the end user. But they also tend to march to the beat of their own drum, refusing to play ball in MS/Sony's court - a strategy that pays off big about every other generation.

 

Look at the Switch. It completely missed holiday 2016, as did Zelda, and the system is doing fine right now. Had they shot for Q4 2016, it would have spelled mass shortages, no games, shoddy hardware, and a surefire repeat of the Wii U fiasco.


Look at the Switch. It completely missed holiday 2016, as did Zelda, and the system is doing fine right now. Had they shot for Q4 2016, it would have spelled mass shortages, no games, shoddy hardware.

And the 2017 launch was different from this, how? :P

 

 

Games are bigger now... but are they any better? The new Ghost Recon game was just loads of mountains and lots of quests that had you murder people.

 

I feel like the standard big open-world game is pretty overrated.

 

It's what people demand. Gamers demand 30, 40, 50 hours or more from their game purchases, so these big games get padded out with useless quests. If you don't have this, there will be a backlash. See "The Order: 1886", for example. People were angry it only lasted 6-8 hours, and it got bashed relentlessly for it. Can't have a concise story. It needs to be long and padded out with useless side quests to check a box.

 

There are indie companies making games the old-school way. They have their fans, but most games not named "Minecraft" will never see the kind of success that AAA games have.


And the 2017 launch was different from this, how? :P

 

 

 

It's what people demand. Gamers demand 30, 40, 50 hours or more from their game purchases, so these big games get padded out with useless quests. If you don't have this, there will be a backlash. See "The Order: 1886", for example. People were angry it only lasted 6-8 hours, and it got bashed relentlessly for it. Can't have a concise story. It needs to be long and padded out with useless side quests to check a box.

 

There are indie companies making games the old-school way. They have their fans, but most games not named "Minecraft" will never see the kind of success that AAA games have.

6-8 hrs sounds right to me. I'd prefer 10hr games which cost $20. If a 30hr game costs $60, they could do a 10hr game for $20, right? I'd get to enjoy 3 worlds for the price of 1.

 

I probably wouldn't get to see the latter 20hrs of a 30hr game anyway.


6-8 hrs sounds right to me. I'd prefer 10hr games which cost $20. If a 30hr game costs $60, they could do a 10hr game for $20, right? I'd get to enjoy 3 worlds for the price of 1.

 

I probably wouldn't get to see the latter 20hrs of a 30hr game anyway.

 

Except dev costs don't scale linearly with the number of hours of content. It's relatively cheap to add quests to an existing open world to boost playtime, but expensive to create that world in the first place.
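
To put rough numbers on that (a toy model in Python; every figure here is invented purely to show the shape of the curve): treat the world, engine, and core assets as a big fixed cost and each extra hour of quests as a small marginal cost.

    # Toy model with invented numbers: total cost = fixed cost of building
    # the world/engine/assets + a marginal cost per hour of content.
    FIXED_COST = 40_000_000      # hypothetical cost of world, engine, core assets
    COST_PER_HOUR = 1_000_000    # hypothetical cost of one more hour of quests

    def dev_cost(hours):
        return FIXED_COST + COST_PER_HOUR * hours

    for hours in (10, 30):
        total = dev_cost(hours)
        print(f"{hours}hr game: ${total:,} total, ${total // hours:,} per hour")

Under those made-up numbers, the 10hr game costs $50M and the 30hr game $70M - about 70% of the cost for a third of the content - which is why nobody can sell the short version for a third of the price.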

