
Presenting the Atari 800 to college students: How would you do it?


jaybird3rd


As others have noted, next year marks the 40th anniversary of the release of the Atari 400/800 computer series. Other prominent classic computers and video games, such as the TI home computer series and the Intellivision, will also be hitting the big 4-0 next year. I'd like to mark the occasion by spotlighting some of these classic systems during our monthly Computer Science Club meetings at our university next year, with a focus on one system at each meeting. It would be very enjoyable for me, but more importantly, it may help to raise awareness of these systems among the younger generation, which, in the long run, will be increasingly important if any appreciation for these beloved machines is to live beyond the first generation of users (which would include many of us).

 

Here is my question: if you were to introduce the Atari 800/XL/XE computers to a group of college students, how would you go about doing it? These would be students of Computer Science, some of whom are concentrating in game design, who have probably never seen a computer or video game that is more than about fifteen years old. Ideally, the introduction should be done in a way that makes these machines as relatable to the students as possible, so that they can get an appreciation of the Atari 800's place in history, and of the reasons people still enjoy using them today. I think it would come across as self-indulgent and boring if I spoke only in terms of personal nostalgia, and that is the last thing I would want.

 

I can think of a few possible approaches. My first idea was to show them early video games which started the game franchises that they know today, but it occurs to me that this would be difficult to do because, sadly, there seems to be little in today's game industry which can trace its lineage back to the Atari 800 except in very indirect ways (such as Donkey Kong being the first game that features Mario, or the Pac-Man games).

 

I could talk about games which broke new ground in game design on the Atari 800 in some important way, but I'd need to have some good examples ready to show them. I could also show them a bit of programming in Atari BASIC, and talk about how the first kids ever to have home computers learned computer programming by typing in program listings from magazines and the like. That could be relatable because many of these students are just learning to program themselves. Or, I could focus on the hardware, and relate the Atari computer architecture and the 6502 to the microcontroller-based systems that some of the students are studying in their computer architecture classes.

 

Whatever angle I end up taking, what I plan to do is to introduce the computer itself, talk a little bit about its history, do a quick demonstration, and then leave the rest of the meeting open so that anyone who wants to play with it can do so. I'd have at least two or three computers set up with CRT displays so they can be experienced as intended, with plenty of games and controllers within reach.

 

Any input would be greatly appreciated. Thanks in advance!


I think any introduction would have to include some history, to put these systems in context. First, that Atari was already an established name in video gaming, both in the arcades and in the home with the Atari 2600. I think even arcades themselves would need some context, since the game arcades of the 70s and 80s were a much bigger part of gaming culture than they are today.

 

With that in mind, you'd also have to set some context about the Atari coming out at the birth of the personal computer industry. At that time, it was a big deal if a computer could do color graphics at all, and the machines that did support it (Apple II, TI-99/4A) were quite expensive. So the fact that the Atari 800 was essentially a hybrid of personal computer and videogame hardware was quite remarkable. It's also one of the earliest machines that could be described as having a GPU, in the sense of the ANTIC chip and its support for scrolling and sprites in hardware.

 

The system was also fertile ground for some seminal games. Star Raiders, for instance, pioneered the entire first-person space combat simulator genre, and I think it would have to be demonstrated alongside some modern games that trace right back to it.

 

When you set the machine into its historical context, it becomes impressive without being nostalgic.


It's a VERY interesting concept, and I applaud your enthusiasm and efforts. The Millennial generation and younger lads and lasses interested in the past, hummmm? These machines should be celebrated, and the early pioneers should be commended. However, at the risk of sounding like Debbie Downer, I think these kids don't really care. Life for them is what it is now. They have little reverence for the past, for history. They are not taught history (except with an agenda attached). They really couldn't care less. Some real hardcore coding students might show some interest. There certainly will be those who go for the nostalgia angle (Pac-Man, Mario Bros.). But for the most part, I fear they will walk right past the booth/display barely noticing it (not even cranking their heads up from their cell phone screens). I am jaded, I know. I just hope I am absolutely wrong about it, and it turns out to be the biggest thing since their hero, their almighty spiritual guru, Steve Jobs announced the iPhone. Keep pressing for ideas.


Personally, if I were going to introduce students to entry-level systems like this, it probably wouldn't be college students. At this point in their lives they've probably done some C/C++, Java, JavaScript, and/or Python, and they know they will never touch assembly, so this will hold little to no interest for them. It's true that these old systems had serious limitations, and that that challenge extended into the 90s, but those are historical facts. Devs these days are confronted with different challenges, and those generally aren't hardware resources. Still, it's an interesting history and a good lesson in depending on ingenuity, but they're probably not planning on being computer historians, so I'd quickly move on to how things are today. Now, if you were teaching middle school kids, then maybe they might be interested in developing for such an old system, but in reality, I think you'd get a better response teaching them how to program the NES/SNES. Limitations of hardware are a fun challenge, but it was a challenge that people took on because there wasn't any other option. Today, people program these old systems out of a dedication to nostalgia.

That being said, you didn't ask how NOT to do it, so here is my suggestion on how to make it fun. First, if you have the time and circuit design is part of the curriculum, building a simple computer would be the first step. Then, an OS. If you don't have time for both, start with the OS. I'm guessing OS design is part of a computer science degree, and this would be a very simple way for them to get their feet wet while learning to work with system limitations. It might be interesting to see what they come up with.

 

Hope that helps,


It's a VERY interesting concept, and I applaud your enthusiasm and efforts. ... However, at the risk of sounding like Debbie Downer, I think these kids don't really care. ... But for the most part, I fear they will walk right past the booth/display barely noticing it...

Some will. I have a bunch of retro tech in my office, and it can be an amazing conversation starter with some undergrads and grads. Not all, but some definitely get it. There's even one undergrad who's put together a Computing History Seminar for next quarter. This is at a national Top 5 CompSci program. There are definitely young folks out there who care.


I'd talk about the system in the context of its time and what made it unique among its competitors.

I'd talk about the really advanced stuff that was going on with the system, like the SIO bus, System handlers, and the dedicated Sound and Graphics chips.

I'd demo a couple of contemporary games, stuff that was pretty groundbreaking at the time:
Star Raiders: free fly-around "3D" space battle sim!
Alternate Reality: grandfather of the walk-about RPG -- also the intro to that game is still damned impressive.

That's just general stuff off the top of my head. I'll let this percolate and post if I come up with anything else.

Edited by Mechanicjay

Show them and tell them the TRUTH...

 

Show them the amazingly great work put together by Jay Miner, Joe Decuir, Doug Neubauer, and the rest of the team... Tell them that the 800 (the real McCoy) is on PC World's list of the 25 BEST PCs ever made. Tell them that Doug Neubauer's work (Star Raiders) is on lists of the ten most important video games in history (!).

 

Tell them that APPLE, ATARI and IBM are the (real) holy trinity of personal computing, not just because of how special the products were, but also because these three companies are connected in space and time: California, New York and Florida are where everything really happened. Steve Jobs worked at Atari and introduced the idea of personal computing three years before Atari got it, and five years before IBM got it. Jay Miner was introduced to, and worked (contractually) for, Steve Jobs in California. Steve Jobs tried to recruit Don Estridge (the real IBM maverick who, along with 12-14 other people, made the IBM PC a reality in Florida). IBM approached Atari with interest in having the 800 become its OEM personal / home computer. And Paul Allen and Bill Gates sat in IBM's Boca Raton office to SIGN the contract that would catapult Microsoft and IBM to almost immediate stardom!

 

Also tell them that, more than THIRTY YEARS LATER, folks like Avery have just discovered how to extract 60 fps of real-time video and audio with basically... a CF reader attached to the cartridge port (!!!) WAY, WAY beyond the wildest dreams of Atari's original team... Tell them that SIO is basically a direct precursor of USB, the two having Joe Decuir, the engineer actually responsible for SIO, in common... Tell them about Jon's later work, and the amazingly good graphical OS prototype / demo, running on a fundamentally STOCK 1979 chipset (!!!) We are talking about the SAME concepts that Steve Jobs and Bill Gates otherwise found at XEROX's Palo Alto research center (!)

 

I like superlatives when credit is due where it is really due... A slightly distant (but familiar) analogy may inspire the final presentation...

 

 

Just tell them how fortunate we ALL were to witness and be part of the dawn of a technological revolution... one that WILL NEVER happen again!

Edited by Faicuai

As you're talking with Comp Sci students, definitely get them laying hands on the machine. Come up with a couple of fairly trivial programming exercises and let them hack at it in Atari BASIC.

Maybe have a couple of programming demos ready to show how easy it was to access the hardware and all its features.

Also something demonstrating how primitive things were, like simulating a multi-dimensional string array in Atari BASIC (which has no string arrays at all).
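For instance (a from-memory sketch, so test it in an emulator first): the period workaround was to carve fixed-width "elements" out of one big DIMensioned string using Atari BASIC's slice notation:

```basic
10 REM FAKE A 4x5 ARRAY OF 8-CHAR STRINGS IN ONE BIG STRING
20 DIM A$(160),E$(8):W=8:NC=5
30 A$(1)=" ":A$(160)=" ":A$(2)=A$:REM CLASSIC FILL-WITH-BLANKS TRICK
40 REM STORE "HELLO" AT ROW 2, COLUMN 3
50 R=2:C=3:P=((R-1)*NC+(C-1))*W+1
60 A$(P,P+W-1)="HELLO   "
70 REM READ THE "ELEMENT" BACK OUT
80 E$=A$(P,P+W-1):PRINT E$
```

Watching students work out why line 30 blank-fills the whole string usually makes the "primitive" point all by itself.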


I don't think showing Star Raiders is worthwhile - unless you show what it was like before it.

i.e. You need to show what came before it - the turn-based Star Trek game - that was just TEXT -

like on a CBM PET.

So it's quite dramatic to go from the TEXT game to a full-blown 3D graphics simulation in real time - and it was an 8K game, because it was initially thought that 8K cartridges would be the norm - hence the low-res-looking graphics.

Memory prices dropped - so 16K became the standard - hence the nice-looking Donkey Kong became possible.

You gotta show the arcade version - then the Atari version - so as to point out how nice the home conversion was - for the Atari.

 

Harvey


You may have a look at

http://www.gymnasium-nordhorn.de/index.php/aktuelles/241-stefan-both-geschichte-der-computer

 

I'm sorry it is German only, but you can click the picture to get a slideshow. Google Translate should do the rest fairly well.

 

I understand you'd like to give the students an overview of the Atari era. I doubt it's enough material to fill an hour - and I have even more doubt that this small part of computer history (alone) is interesting enough to captivate the audience.

 

I absolutely agree, students love to touch and test old computers and like to play (easy to understand) games (no time to read manuals), or operate a paint program or word processor (incl. disk access and the corresponding printer in operation) with the help of a self-made, super-short manual.

 

Of course the 40th anniversary of Atari is a wonderful opportunity to put a special focus on Nolan Bushnell and the history of Atari - including the "I love to fuck" T-shirt story, etc...

Just my 50 cents ;-)

 

Stefan


I don't think showing Star Raiders is worthwhile - unless you show what it was like before it.

i.e. You need to show what came before it - the turn-based Star Trek game - that was just TEXT -

like on a CBM PET.

So it's quite dramatic to go from the TEXT game to a full-blown 3D graphics simulation in real time - and it was an 8K game...

 

Harvey

I agree completely. I showed it to my high school robotics students and they were largely unimpressed. I should have done it as you suggest, and put it in the context of the times.


I would probably start by showing some pictures of me and my friends, hovering around the desk in my dorm room, playing Montezuma's Revenge on my old Atari 800 with 13" RCA television, Rana1000 disk drive, and Wico Command Control joystick, assuming any such pictures actually still existed anywhere today.

 

Then I'd lead into a discussion of trying to do word processing on a 40 column display and actually have things line up and look the way you wanted. I'd probably pull up some of the original versions of my first resumes, created on HomeText, part of Batteries Included's HomePak, making sure to show them one of the first print preview modes I can recall ever seeing.

 

I'd probably wrap up by showing screenshots of the old VT-100 terminal software I used with my Atari 1030 300 baud modem to connect to my university's network to check email. It was barely usable, with each character being about 3 pixels wide.

 

Ahhh, kids today. They have it soooooo easy!

 

I'd probably show off the old Tandy Color Computer 2 that I had to purchase for some of the labs in my later computer electronics courses. It was actually pretty cool. We etched our own expansion cards to plug into the side expansion port, then attached a breadboard. We used this setup to create all kinds of neat circuits. I vividly remember that one of the projects was to set up a stoplight using LEDs. We had to get the timing right, and everything was programmed in machine code. Every couple of days the instructor would add a new twist to the circuit and it would be back to the drawing board. First it was one light, then it was a true intersection with a light for each direction. Then he added a vehicle-present sensor. Then there was a crosswalk button. He kept us going for two or three weeks. I could never look at a stoplight the same way after that!

 

 

Ah, thinking about my 8-bit Atari and college has really taken me back. That was a really neat trip back to 1987!


I would focus on the 6502 as one of the first widely used microprocessors, and on how a computer is actually built (logic gates, built with transistors). On these Ataris you can actually look at the motherboard and see how everything is connected: data bus, address bus, latches, I/O, etc.

It is also a perfect opportunity to show them what programming actually is: how programming, in the end, is nothing more than filling some registers with values built out of 0's and 1's, and how those registers are built out of latches, which are built out of logic gates, which are built out of transistors... I don't think many people, even grad students, understand how a computer works at a low level, how it is actually built. With these machines, you can go back to the roots of computing and computer architecture.
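A quick hands-on way to make that point is that every chip register is just a memory address you can PEEK and POKE from BASIC. A from-memory sketch (the addresses are the standard ones from the old manuals, but double-check them before the demo):

```basic
10 REM HARDWARE IS JUST MEMORY: 712 IS THE BORDER-COLOR
20 REM SHADOW REGISTER, 53770 IS POKEY'S RANDOM REGISTER
30 FOR I=1 TO 100
40 POKE 712,PEEK(53770):REM HARDWARE NOISE DRIVES THE BORDER COLOR
50 NEXT I
60 POKE 712,0:REM BACK TO BLACK
```

Five lines, no libraries, no driver, and the screen border is flashing - that tends to land with students raised on layers of abstraction.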


I believe computer science is more about software than hardware, while electrical engineering would be more interested in the hardware design? Obviously a computer and its OS etc. benefit when both disciplines collaborate on a design. But anyway, for CS students I might focus more on the software angle, specifically on how the OS used a central I/O, "everything's a file" type of concept. That's how the big iron which preceded it was doing things (and still does to this day), and Atari managed to make it work on the small scale of a microcomputer. I personally had enough exposure to the Atari, Apple and CBM 8-bit computers to do some programming on each of them, and I always really appreciated how the Atari approach made it more intuitive to program for.
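That central I/O idea demos well from BASIC, since every device (E: editor, S: screen, K: keyboard, P: printer, D: disk, C: cassette) goes through the same OPEN/GET/PUT calls. A rough, from-memory sketch:

```basic
10 REM THE KEYBOARD, OPENED AND READ LIKE A FILE
20 OPEN #1,4,0,"K:":REM CHANNEL 1, MODE 4 = INPUT, DEVICE K:
30 PRINT "PRESS A KEY..."
40 GET #1,A:REM ONE "BYTE" FROM THE DEVICE = ONE KEYPRESS
50 CLOSE #1
60 PRINT "YOU PRESSED ";CHR$(A)
70 REM SWAP "K:" FOR "D:FILE.DAT" OR "P:" AND LITTLE ELSE CHANGES
```

CS students who've met Unix file descriptors will recognize the idea immediately, which is exactly the point.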


As someone who has taught degree level students (BSc, MA, BA, MSc) - in game design, and also integrated classic systems as part of the wider historical context of gaming and games development, what I have found valuable is not a description of chips and the like, but rather the experiences they facilitated in context.

 

I presented games and the devices that enabled them - so for the 800 I had students play M.U.L.E. - as a four-player experience and in the context of Dani Bunten's legacy. I had students dissect the ludological values of Alley Cat and Necromancer (I featured Bill Williams as a pioneering creative).

 

I also brought forward the work of Brenda Laurel and the Atari labs team in the pursuit of extending the potential of games (Computers as Theatre is a wonderful book)...

 

Obviously I utilized many other platforms and games from PONG to the Playstation for other areas of exploration of cultural context and impact of games...

 

sTeVE


I wish you good luck with your presentation. As said before, some of them will be interested, and that's enough. As it's game design students you teach, I'd try to show the evolution of game concepts, which was closely intertwined with, but not always in sync with, the development of hardware. Maybe you'll have to draw this out over several machines to show how games progressed from "single screen" to "multiple screens" to "move from screen to screen" to "move freely over a scrolling map". As said above, you can point out how certain limitations influenced game design.

 

A "hands-on" could be done using emulators, and even if an hour is too short to get students programming, dissection of e.g. a good ten-liner game (like the "beam dropping" game) could be a good reminder that game ideas are a lot more important than power.

 

In every generation of computer/console there are examples that show that sometimes it's better to make a good game on current hardware than a game that needs the next generation to look good or play with decent speed (IMHO that's true even today if you look at VR).

 

As for the discussion of hardware, I'd limit that to describing new and revolutionary features (such as dedicated graphics chips for the Atari) that set computers apart from the previous generation/competitors.


I would focus on the ANTIC chip, which was revolutionary at the time. The ability to mix graphics modes on the screen with display lists was unique, as were the display list interrupts that Atari engineers wisely built in to allow line-by-line control of the graphics, making it possible to put 256 colors on the screen at the same time. You could then end with a discussion of player-missile graphics. The ability to have players and missiles composited into the TV picture independently of the playfield graphics is amazing and blew the competition out of the water. Apple, Commodore, and Radio Shack had nothing like this when the Atari 400 and 800 were released in 1979. Wow! A point in tech history that college students studying computer engineering, graphics, or game design must know about. Their education will not be complete without this essential background. Gives me the chills just writing about this. Pure awesomeness.
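The classic magazine-style demo of the display list (typed from memory, so verify the offset in an emulator first) is poking a new mode byte straight into it so one of the upper text lines turns into big GRAPHICS 2 text:

```basic
10 GRAPHICS 0:REM PLAIN 40-COLUMN TEXT SCREEN
20 DL=PEEK(560)+256*PEEK(561):REM ADDRESS OF THE DISPLAY LIST
30 REM AFTER 3 BLANK LINES AND A LOAD-MEMORY-SCAN INSTRUCTION,
40 REM EACH BYTE IS ONE MODE LINE; 2 = GR.0 TEXT, 7 = GR.2 TEXT
50 POKE DL+10,7:REM TURN ONE TEXT LINE INTO A GR.2 LINE
60 PRINT "TWO MODES ON ONE SCREEN"
```

One POKE and the screen visibly reprograms itself - a nice way to show students that ANTIC is executing a program of its own, which is the whole "early GPU" argument in miniature.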


It is often said that the more things change, the more they stay the same. I am invariably fascinated when I think about the direct correlation between early home computer components and those of modern systems. Take the Atari 8-bit, for example. The ANTIC is an obvious predecessor to the modern-day GPU, complete with its own instructions, interrupts, and memory access. The POKEY is clearly analogous to the sound card, even in its ancillary functions. Remember how PC sound cards used to also carry the game port? POKEY handles the paddle and keyboard input, right? USB is strikingly similar to SIO as a plug-and-play communication channel for variable-function devices addressed by OS handlers. The microcomputer industry is currently standardized on a highly modular, extensible version of the proprietary, integrated architectures of the 8-bit era. Systems like the Atari 8-bit line have become the giants on whose shoulders the current industry stands.
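That POKEY analogy demos in a few BASIC statements (values from memory; pitch 121 is the manual's middle C, so treat the exact numbers as approximate):

```basic
10 REM POKEY DOES THE AUDIO *AND* SCANS THE KEYBOARD AND
20 REM PADDLE POTS - MUCH LIKE A SOUND CARD WITH A GAME PORT
30 SOUND 0,121,10,8:REM VOICE 0, PITCH 121, PURE TONE, HALF VOLUME
40 FOR I=1 TO 400:NEXT I:SOUND 0,0,0,0:REM DELAY, THEN SILENCE
50 PRINT "PADDLE 0 READS ";PADDLE(0)
```

Sound, input, and a "game port" all answered by the same chip, in 1979.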


The Atari 400/800 story - is that they were there first - with hardware sprites and scrolling - a custom graphics chip that allowed custom screen layouts - but like any new hardware - it takes 2-3 years for the software to appear which shows off what that hardware can really do.

When the C-64 appeared - the Atari 400/800 seemed just as capable when you do a Let's Compare between them on the games available for both.

 

Then it becomes much like the Beta versus VHS story - as C-64 sales take off - with programmers getting to know the C-64 hardware well. Support drops off for the Atari 400/800 because the market for its software is small - and Tramiel doesn't do what is required to encourage new software development for the new XL/XE/XEGS hardware, which is priced against the C-64. Maybe we finally see that the gap between these machines is closer than we imagined it to be - with the homebrew development of the past decade or so...

 

There is something to be learned from this history - that should make developers of new technology more aware of the problems in trying to make new hardware/software successful for the masses. It is much the same kind of story when you review past inventors and their inventions - like Tesla versus Edison. Tesla is not generally recognised for the true genius that he was - a one-man think tank - whereas Edison relied more on his assistants and on trial and error.

 

And there is the common story of how - when something revolutionary does come along - it is rarely one person working on it alone (Tesla's AC power system being one of the rare exceptions) - but rather several people working independently, each trying to solve the problem of how to make it work. Competing systems appear in the marketplace - and hopefully the best-designed one comes out on top.

 

Harvey

Edited by kiwilove

The 400/800 were way ahead of their time:

 

1) The ANTIC/GTIA custom video chips were the precursor to modern GPUs

 

2) The POKEY custom chip for sound was the precursor to Sound Blaster

 

3) The SIO port that could identify the device and start working was the precursor to USB (1st plug and play)

 

Atari got there in 1979.


I know that by 1981, Atari was outselling Apple. The 16K 400 was arguably the first "real" home computer under $500. By real I mean that it had power (a 1.79 MHz 6502), features (POKEY, ANTIC, GTIA) and enough memory (16K) to write large programs and run applications, at a price point that was half that of its contemporaries (the Atari 800 and Apple II). The 400/800 was originally conceived as the follow-on successor to the 2600, so it was designed to deliver better graphics and sound than the other home computers of the time. Again, Atari was way ahead of the curve. The Atari 400/800 was a "me too" because of the Apple, and the C-64 was a "me too" because of the 400/800. And so it goes... Intel Core i9, Radeon GPU. What do we use these high-end PCs for? Playing video games. Most people can do everything they need on a Core i5 with onboard integrated graphics (browse, stream videos, etc.). Today's super rigs are designed to be the best performers at high-demand applications (games). The 400/800 of 1979 is their great-great-grandpa.

Edited by ACML
