
What if we never got past the 8-bit chips?


BSA Starfire


Yesterday I had a daydream, kind of a what-if scenario, about how the world would be today if we had found it impossible to exceed 8-bit microchips and, say, off the top of my head, a 10 MHz ceiling. For argument's sake, let's say that is as far as microchip technology can progress.

How would the world look today? I guess the internet could still exist in some form, possibly more like teletext pages or viewdata. Huge hard disks probably wouldn't exist, as I guess you wouldn't need them; file sizes would, I assume, be much smaller. DVD, in fact digital video in general, wouldn't exist, and I'm guessing cell phones wouldn't either.

Would video games have fizzled out with no huge increases in WOW factor? Or would they have continued to evolve within the technological constraints?

How about personal/home computers? Would there still be a market if there wasn't much in the way of technological jumps to drive upgrade sales?

How would the technology advance within these constraints? What do you think?

Would companies like Atari, Acorn, Sinclair, Dragon Data, Oric etc. still be here without the advent of the cheap IBM clones from the Far East?

Would the arcades still be alive and well without the competition from the top-end consoles of today?

How about movies? I guess none of the CGI extravaganzas would exist, so we would still make films like George Lucas and Spielberg did, with models etc.

 

Give it some thought; I think it's an interesting what-if.

 

Best,

Chris


Yesterday I had a daydream, kind of a what-if scenario, about how the world would be today if we had found it impossible to exceed 8-bit microchips and, say, off the top of my head, a 10 MHz ceiling.

 

So, basically you're asking what would happen if technology were capped at the level it reached in the mid-80s. That's a hard question to answer, almost impossible really, because if it can't advance, it can't advance. However, I think you're trying to limit just that one facet of the technology, allowing improvements in all other areas. That's still a stretch, but since I love these kinds of questions, I'm happy to ponder it.

 

First of all, don't forget that there is usually more than one solution to a problem, so increased memory and storage technologies could theoretically make up for the lack of raw horsepower. Secondly, keep in mind that a lot of what we have today isn't really as new as we think it is. You brought up DVD. That's a great example. DVD is derived from the CD, which was first used in the late 70s. Also, at the time you're looking at, Laserdisc was in widespread use (well, widespread by Laserdisc standards...). Those are two viable technologies ready to go. To create something like DVD, one could make an analog format by reducing the pit size on a Laserdisc platter until the disc could be CD-sized. Assuming that the data could be laid down densely enough on the disc, Laserdisc could eventually have evolved into a pseudo-DVD.
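To put rough numbers on that pit-shrinking idea, here's a quick back-of-envelope sketch; the diameters are round figures, and treating capacity as proportional to recording area is a deliberate simplification (real discs lose area to hub and rim margins):

```python
# Back-of-envelope check on the pit-shrinking idea: how much denser
# would a Laserdisc have to get to fit the same signal on a
# CD-sized platter?

LD_DIAMETER_CM = 30.0  # standard 12" Laserdisc
CD_DIAMETER_CM = 12.0  # standard CD

area_ratio = (LD_DIAMETER_CM / CD_DIAMETER_CM) ** 2  # density increase needed
linear_shrink = area_ratio ** 0.5                    # pit/track pitch shrink

print(f"density increase needed: ~{area_ratio:.1f}x")       # ~6.3x
print(f"pit and track pitch shrink: ~{linear_shrink:.1f}x")  # ~2.5x
```

A 2.5x shrink in pit and track pitch is aggressive but not absurd, which is why the pseudo-DVD idea doesn't seem crazy.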

 

It would be harder, but it's still feasible to do the same to the CD. Since it's digital, the limiting factor becomes the decoding power of the playback device. Variable bitrates would probably be out of the question.

 

Computers used for purposes other than entertainment probably would not be limited as much. More applications would have been text-based, and full GUIs would have only been used in specialized applications... IRC, Mosaic, and Usenet would probably have had much longer lifespans. It's possible, even likely, that as the need for more graphical apps grew, we might have seen GUIs appear as hardware upgrades. (Hey, Bob, I've got an extra PCI-E slot! I think I'll buy Photoshop!)

 

Interesting possibilities.


The longer a hardware technology exists, the further programmers are able to push it. We need look no further than our 2600 to see this, both in terms of commercial releases and homebrew. That would be a big factor. I'm sure that in 1982 no one would have fathomed that something like GEOS could be programmed on the Commodore 64 either.

 

I'd also wonder if multi-processor computing would have developed with chips like the 6502 (has this been done in reality?). I can just see a file-cabinet-sized machine with hundreds of 6502s or Z80s in it, crunching away to somehow play a 3D Doom-like game. I'm not saying there wouldn't be other throughput issues, but I'll presume those issues could somehow be solved as well.
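For fun, here's a toy sketch of how such a cabinet might (and might not) scale. Every figure in it is an illustrative guess, not a measurement; the point is just that the serial fraction of the workload, not the chip count, quickly becomes the ceiling:

```python
# Toy estimate for the "file cabinet full of 6502s" idea, using
# Amdahl's law: the part of the work that can't be split across
# CPUs caps the speedup no matter how many chips you add.

def speedup(n_cpus, serial_fraction):
    """Amdahl's law: speedup from n_cpus when some work stays serial."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cpus)

SINGLE_6502_MIPS = 0.4  # rough throughput guess for a ~1 MHz 6502

for n in (10, 100, 1000):
    for serial in (0.01, 0.10):
        s = speedup(n, serial)
        print(f"{n:4d} CPUs, {serial:4.0%} serial: "
              f"{s:6.1f}x speedup, ~{SINGLE_6502_MIPS * s:5.1f} MIPS total")
```

Even with 1000 chips, 10% serial work caps you below a 10x speedup, so the inter-CPU plumbing would matter as much as the CPU count.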

 

Interesting questions! I think that companies like Microsoft/IBM/Compaq would have standardized hardware and software to be somewhat similar to what we have today, though. That seems somewhat inevitable. We'd still be running something like Windows XP or Vista, but it would just be on huge, heat-producing, rack-mounted, multiprocessor 8-bits!


Thanks for your input, I think it's quite an intriguing scenario.

Maybe CP/M would have been dominant, or maybe even MSX, with the Asian companies behind it.

I still have an Amstrad PCW; it's a 512K Z80-based CP/M machine that was produced here in the UK until the mid-1990s and mostly marketed as a word processor. It is, however, a remarkably versatile machine, capable of most of the things I use my dual-core Macintosh for today.

I think 2D would have stayed the predominant force in gaming, but with larger game worlds.

I forgot about Laserdisc, but I don't know much about the technology. I also remember the CED videodisc player we had when I was a kid; it worked like a gramophone, but this too could possibly have evolved if the need was there.

 

Look forward to reading more of your ideas.

Also love the idea of huge multiprocessor 8-bits playing Doom!

 

Best,

Chris


Laserdisc was an excellent video format that predated DVD. It consisted of an analog stream encoded onto a large optical disc. Later discs were able to add digital audio. Much as computers in the 80s could use cassette tapes to store data, I could see someone finding a method of using Laserdiscs to store data. IIRC, a CD-R-ish WORM disc was actually created for Laserdisc, but only in very small numbers, and it only saw limited military use.

 

CED was essentially a movie stored on a vinyl record, read with a physical needle. It is for this reason that I don't think it would be a viable long-term format. The needle would lead to too much wear and tear and the physical contact would make the system unreliable in the long term. Reasonable for movie or music use, possibly, but that technology was destined to die in the 80s, IMO.


Laserdisc was an excellent video format that predated DVD. It consisted of an analog stream encoded onto a large optical disc. Later discs were able to add digital audio. Much as computers in the 80s could use cassette tapes to store data, I could see someone finding a method of using Laserdiscs to store data. IIRC, a CD-R-ish WORM disc was actually created for Laserdisc, but only in very small numbers, and it only saw limited military use.

 

CED was essentially a movie stored on a vinyl record, read with a physical needle. It is for this reason that I don't think it would be a viable long-term format. The needle would lead to too much wear and tear and the physical contact would make the system unreliable in the long term. Reasonable for movie or music use, possibly, but that technology was destined to die in the 80s, IMO.

 

I definitely agree with you there. We had one, and it used to skip ALL the time. I don't remember the quality looking any better (through my then-child eyes) than VHS... it was the only way I could watch my Great Muppet Caper movie, though :-D


Even in the 8-bit era there was still big iron. You know, minicomputers and mainframes that ran behind the scenes. You had the metered walled gardens like CompuServe, GEnie, and Prodigy, and the private internet limited to government and universities. The only free stuff was BBSs, and only for local calls, one caller at a time. So what we'd have today is something in between dumb terminals and thin clients. The stuff that was beyond the capability of 8-bit machines could be done client-server, but you'd have to pay a premium to use it.

 

Most of the computing power we have today is already overkill for what we primarily use computers for. Aside from HD movies, games, and content creation, computers are really little more than internet communications devices. The only times they really get pushed is when streaming YouTube and Hulu clips or running a videoconferencing session. It's really not necessary to have much more power than what netbooks offer to get by, now or in the future, as far as basic utilitarian tasks go. Going forward, I think there will be more focus placed on all-day mobile computing than on speed. A great many people who have no need for the bleeding edge will stick with the increasingly compact and miserly systems. These low-end nettop systems will probably become so commoditized that they will just be something standard embedded in TVs, as manufacturers are starting to do now. It could be like a no-compromises version of WebTV.


If we just pursue the Atari aspect of the question...

 

Way back when Atari were still interested in giving us hardware, they had some interesting possibilities:

 

Project 'Sara'... Atari's attempt at a 10-bit games system.

 

Project 'Sierra'... Atari's attempt at a multi-CPU-based computer (one version apparently was to come with what became the Amiga chipset).

 

Project 'Falcon'... Atari's attempt at bringing its hardware knowledge to the telecoms market, and also at interfacing telecoms equipment to Atari hardware.

 

I don't know the name of the project, but there was also Atari's attempt at the burgeoning 'transputer' market of the late 80s, with a system called the Abaq/ATW.

 

And then of course we had the 'bought-in' technology that Atari just slapped its logo on, i.e. Jaguar, Lynx, Portfolio (and I guess you could say the ST Pad/Book etc.).

 

Interesting that Atari didn't take these possibilities to their zenith... I could have been typing this on a 128-bit, multi-CPU, transputer-based Atari 130XE 3000 deluxe ProSystem cum game console.


Well, there's probably no limit on parallel processing (new tricks etc. would be learned), and 8-bit CPUs run really cool (remember, computers used to not need heatsinks and cooling fans).

 

Largely, I think the world would look more or less the same now, except things like cellphones and notebooks would be considerably bigger.

 

A lot of companies did have great ideas in case we never moved beyond 8 bits (or probably in hopes of it), but as 16-, 32-, and higher-bit chips came out, they got cheaper and the 8-bits got left behind.


I would have this in my car :cool: :

 

I actually know a few people who had early mobile phones. Oddly enough, they seemed to work better on the old analog system, and people were trained on how to properly drive and talk at the same time (as in, don't if it can be avoided).


I remember way back in '82, a friend of mine was telling me that he was in All American one day and he saw a guy watching "TV" on his wristwatch. I knew this was pure and utter bullshit because, as awesome as it sounded, I knew it wasn't true. I didn't think we'd ever get the technology, but we have it now. Granted, we don't watch TV or movies on a wristwatch, but we can view them on iPods and such, and they are near that size. Frankly, I can't see screens getting any smaller, because you would basically strain your eyes trying to watch something so small!


I don't know the name of the project, but there was also Atari's attempt at the burgeoning 'transputer' market of the late 80s, with a system called the Abaq/ATW.

Well, the ATW really was released. Unfortunately, it didn't sell too well, and they dumped the product really fast. It's too bad, as it's a really curious design. I bet they spent a lot of money designing the thing too.

 

--Zero


Well, look at what an Apple ][ ended up being capable of!

 

A nicely expanded machine had a megabyte of RAM, decent graphics, sound, and interface devices. If you take a look at an enhanced //e, it's still a viable workstation today! That's 8 bits. Other machines, like the Color Computer, ran OS-9 and had their fair share of expansion options as well.

 

I suspect we would have many of the things we have today, only just a bit slower. Things like hardware memory management chips and dedicated graphics chips would have been optimized for the 8-bitters. If you cap it at the late 80s, there were multi-core 6502 chips! (Always wanted to run one of those.) There were also lots of hardware interface devices that could have made for some potent computers.

 

We didn't see real power until SPARC/MIPS et al., though. Moto was a contender with its 68K series. The first Silicon Graphics workstations used Moto chips and ran multi-user UNIX in the form of IRIX. These buggers had a fair amount of RAM, ran the distant parent of OpenGL (IRIS GL), and many other goodies.

 

That's not really constraining things to 8 bits, though. More like just picking a moment in time.

 

Being limited to 8-bitters means CPUs like the 6809 ruling the roost. Maybe the 65816, or a 68K with an 8-bit interface. Don't know.

 

I do think we would have seen more CPUs put together for specific purposes. I'm thinking a main program processor, with math assist, graphics assist, and I/O assist, all of those tasks being 8-bit as well. With hardware memory management, it would then be possible to have quite the machine!


Well, look at what an Apple ][ ended up being capable of!

 

A nicely expanded machine had a megabyte of RAM, decent graphics, sound, and interface devices. If you take a look at an enhanced //e, it's still a viable workstation today! That's 8 bits. Other machines, like the Color Computer, ran OS-9 and had their fair share of expansion options as well.

 

 

Yeh! I still have mine with tons of original hardware: a 10-meg hard drive the size of a briefcase, an SVGA graphics card, 4 megs of RAM, and various peripherals. You should see the MX-80 dot-matrix printer. A sight to behold.

 

I programmed the thing to play music (the printer!) by striking the pins and swishing the head back and forth. Slick, and I'd use the floppy drive to accompany it for rhythm, or background 'drum' effects, banging the head and recalibrating.

 

And I just found a Sudoku puzzle solver that performs as fast as a 2 GHz Pentium; it solves the difficult puzzles in a fraction of a second. Fast enough that human error is the deciding factor if you use a stopwatch. 0.4 seconds? 0.7 seconds? Whatever. And that is on a 1 MHz CPU with 48K. And you know what? The data entry is so much easier than on a full-blown GUI, like Windows. Excellent programming.

 

http://home.comcast.net/~mjmahon/Sudoku.html
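For anyone wondering how a 1 MHz, 48K machine keeps pace with a 2 GHz Pentium on this: the win is algorithmic, not raw speed. Constraint checks prune almost the entire search space, so there just isn't much work left to do. Here's a minimal sketch of the standard backtracking approach (to be clear, this is a generic textbook version, not the code behind that link):

```python
# A minimal backtracking Sudoku solver: try a digit, recurse, undo on
# failure. The constraint checks cut the search tree down so far that
# even very slow hardware finishes quickly.

def fits(grid, r, c, digit):
    """True if digit can go at (r, c) without breaking row/col/box rules."""
    if any(grid[r][i] == digit or grid[i][c] == digit for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != digit
               for i in range(3) for j in range(3))

def solve(grid):
    """Solve a 9x9 grid in place (0 = empty). Returns True on success."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for digit in range(1, 10):
                    if fits(grid, r, c, digit):
                        grid[r][c] = digit
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # dead end: undo, try next digit
                return False             # no digit fits here: backtrack
    return True                          # no empty cells left: solved
```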

 

And the Apple ][ has Ethernet, and USB is in development now...


And one more thing: I think that if we were limited to 6502s, we'd probably have tons of them in parallel, and on the same package, like Larrabee from Intel. Or perhaps, if you meant hardcore 6502s running at 10 MHz in discrete packaging, meaning one chip per 40-pin DIP, then we'd have one as a master controller of many more dedicated chips. And I mean many more!

 

But also, everything would be minimalist in nature. And that would be a good thing, as it would keep computer technology away from the idiotic masses and in the hands of folks who know what to do with computers. We wouldn't have all the mindless 'leet haxorz' and 'pwned' crap going on at useless LAN parties and garbage like that - a sad waste of resources.


I'll play the role of bad guy.

 

The past only works better when we filter out the bodies.

 

We've already seen the scenario occur if we take the question literally and allow no creative workaround: take an 8-bit CPU capped at 10 MHz, and you have a GBC. While the system has one of the strongest libraries of games ever made, it also has one of the worst, and the bad outnumbers the good by at least a 4-to-1 margin. Whether it's graphics over gameplay (Perfect Dark) or the complications involved in adapting too ambitious an idea (the Tony Hawk series), even good teams can fall victim to problems we no longer even think about.

 

As consumers, we wouldn't see much of a difference beyond bad video games... and a more user-friendly internet.

 

The GUI dates back to the late 60s. The desktop comes to us from the early 80s. A flat cap on what technology can achieve means more time spent refining the user's experience. We skip Windows Vista entirely. YouTube is your choice of a tiny window or pixels the size of emoticons. Webpages skip Flash openings. Copyright lawyers stick mostly to suing public schools for their use of Mickey Mouse.

 

On the other hand, any spyware makes your computer die. A single virus bricks it forever. It all balances out.


This screams "what if the Atari 8-bit was still king?"

Unless you are going to alter the laws of physics (and everything else as a result), it's not even possible.

If you limit the laws of physics, that makes all sorts of other things impossible, and we would probably be lucky to even have television.

 

 

A more realistic approach would be: what if something had caused the existing 8-bit market to hang around longer? Then where would we be?

Perhaps the 680x0 and 80x86 series just never get introduced.

If you look at what was happening with 8-bits, you would see they were evolving into more powerful CPUs.

You would have seen some of the modern enhancements to 8-bits built in.

 

As far as the Z80 goes...

The MSX Turbo R runs circles around the original Z80, executing instructions in fewer clock cycles.

The Hitachi 64180 adds improvements to the Z80 instruction set and adds an MMU (later to become standard on the Z180).

The Kawasaki 8400 and other CPUs implement the Z180 in a similar fashion to the CPU in the Turbo R.

The Rabbit enhances the Z80 instruction set further... etc.

That would have made it into 8-bit computers, and CP/M gets some sort of odd stepchild that can take advantage of more RAM and CPU features.

At some point, 16- or 32-bit enhancements need to be added in spite of the real speed improvements that have taken place, but the CPU doesn't have much room for additional instructions, so something is introduced that is source-code compatible only.
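To make concrete what an enhancement like the 64180's MLT buys: a stock Z80 has no multiply instruction at all, so programmers write a shift-and-add loop, one pass per multiplier bit. Here's that classic loop mirrored in Python purely for readability (a sketch of the algorithm, not a cycle-accurate model):

```python
# What a stock Z80 has to do for an 8x8-bit multiply: shift-and-add,
# eight passes, each costing dozens of cycles in real assembly.
# Parts like the Hitachi 64180/Z180 collapse this entire loop into a
# single MLT instruction.

def mul8x8(a, b):
    """8-bit x 8-bit -> 16-bit product via shift-and-add (8 iterations)."""
    product = 0
    for _ in range(8):
        if b & 1:          # low multiplier bit set: add shifted multiplicand
            product += a
        a <<= 1            # multiplicand marches left each round
        b >>= 1            # consume one multiplier bit
    return product & 0xFFFF

assert mul8x8(200, 150) == 30000
```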

 

As far as the 6502 goes...

The 65816 gets higher clock speeds, and a faster IIgs gets introduced by Apple, but probably without a GUI, since that originated on the Mac.

The 32-bit enhanced 6502 part from WDC would actually have been released, instead of mysteriously disappearing from their website shortly before it was due for release.

 

Motorola introduces a follow-up to the 6809, with enhancements similar to what the 65816 added to the 6502 but without the mode hopping. Beyond that is pushing the extension of the 8-bit era.

 

 

Apple probably goes on to be the largest PC manufacturer in the world mostly due to mistakes their competition makes.

 

Tandy doesn't focus on the PC clone market and introduces more powerful 8 bits. They remain a major PC manufacturer for much longer than they did.

 

CBM introduces some form of the C65 instead of the Amiga and probably their own competitor to the Tandy 100 series.

 

Atari is still plagued by financial problems, but manages to introduce a follow-up series of 8/16-bit machines before they go under.

 

For that matter, anyone that had financial problems before the end of the 8-bit era would still have those problems, except possibly Sinclair.

Without spending money on a totally new 68008-based system (which flopped), they introduce an evolution of the Spectrum instead.

But then Clive dumped too much money into other projects, so I think the sellout to Amstrad was inevitable.

He needed to clean up problems in his business instead of focusing on his pipe dreams.

 

If Dragon Data hadn't gone under, you would have seen multi-CPU systems anyway, so I think such systems may have become common.

 

Ultimately, 16/32 bits would have ruled anyway, but the ones that ruled would probably have been better than what we ended up with, due to more time spent figuring out what worked best.

 

 

Now, having said all that...

The April 1980 issue of Micro that I downloaded from a 6502 archive site had some interesting articles.

It seems that someone at TI had already been saying that the 8-bit didn't really have a place, because 16 bits was better for more powerful applications and 4 bits was enough for smaller applications. The editorial proclaims how wrong the guy was, but history seems to show he was at least partially correct. What he didn't foresee is 8-bits becoming small enough and cheap enough to replace the 4-bits. He was proven correct with personal computers, though. Now, if that made it into a 1980 magazine, that talk had probably been around since early in the life of the personal computer market, but no 16-bit was cheap enough yet. Remember, if you had the money, there were 16-bit CPUs already available.

 

BTW, that magazine also had an article about the Synertek SY6516 (aka 6509) processor, which was originally supposed to go into the Atari 8-bit but wasn't ready. Think a 65802-ish CPU standard on the Atari at its introduction. Even Atari wanted 16 bits before the machine was ever built, at least if the article is correct. However, from a search for the SY6516, I found some notes by the author of that article. Synertek denied such a project existed, and the author seemed to think the company was just trying to find out if there was any interest. The SY6516 never saw the light of day, and its supposed design appears to make up a large part of the 65802/65816.


I'll play the role of bad guy.

 

The past only works better when we filter out the bodies.

 

We've already seen the scenario occur if we take the question literally and allow no creative workaround: take an 8-bit CPU capped at 10 MHz, and you have a GBC. While the system has one of the strongest libraries of games ever made, it also has one of the worst, and the bad outnumbers the good by at least a 4-to-1 margin. Whether it's graphics over gameplay (Perfect Dark) or the complications involved in adapting too ambitious an idea (the Tony Hawk series), even good teams can fall victim to problems we no longer even think about.

 

As consumers, we wouldn't see much of a difference beyond bad video games... and a more user-friendly internet.

 

The GUI dates back to the late 60s. The desktop comes to us from the early 80s. A flat cap on what technology can achieve means more time spent refining the user's experience. We skip Windows Vista entirely. YouTube is your choice of a tiny window or pixels the size of emoticons. Webpages skip Flash openings. Copyright lawyers stick mostly to suing public schools for their use of Mickey Mouse.

 

On the other hand, any spyware makes your computer die. A single virus bricks it forever. It all balances out.

 

 

I think most of the wildly overambitious games on 8-bits in later years were simply trying to convert 16- and 32-bit games to the old tech. If that tech didn't exist, all the game concepts produced would be tailored within the limits of the hardware. This does not mean all games would be good, but concepts unsuitable because of hardware constraints should be rare.

Many of the game staples of today may not exist, simply because they are a product of the availability of cheap, high-powered systems. No developers would fiddle with high-end 3D etc. if the power wasn't there in the average system, for example.


Every day would be like waking up in heaven, and people wouldn't all be morons obsessed with uploading pictures of themselves every split second of their lives. Also, videogames would be much better.

I think you might be right; taking the image and video capabilities away from the 'net would be no bad thing, IMHO. It would probably clear the 'net of most of the pointless junk (Facebook etc.) and porn.

Also, the evolution of games rather than hardware and shiny graphics would perhaps have given us experiences that we don't have now; look at how so many of our best homebrew programmers today deliver experiences on supposedly obsolete technology that are often much fresher and more entertaining than the latest next-gen games. Sometimes having boundaries can lead to more creativity than having practically no limits.

Also, it always seems that the finest games on any gaming system are made in its twilight years, as programmers finally get to grips with the capabilities and tricks involved in making the most of the available resources. Again, I look to homebrew writers to illustrate how far games could have come within the 8-bit frame. What would an Atari with a Supercharger, with 64K of RAM loaded from an audio CD, have been capable of if it was available as mainstream hardware in the mid/late 80s? Just one small example of the possible add-ons that might have been available.

 

As far as personal computers are concerned, a previous poster mentioned his Apple II and how it was advanced and upgraded with modern technology as it became available.

 

It would be interesting to see how the popularity of mobile phones would have fared without the super-slim MP3- and video-playing fashion devices on the market today.

 

Thanks to all who are participating in this discussion; I'm enjoying reading everyone's thoughts so far. Funny how a simple daydream can have such far-reaching and diverse conclusions.

Best regards,

Chris

