
Why did Atari make the 400 have a membrane keyboard?


LostRanger22


12 hours ago, Robert Cook said:

Well, BASIC is a poor language to learn programming on, in general, which is why I kind of treat the subject as largely moot.

Can't agree with that. I started with Atari BASIC and yes, you soon learn the limitations (which is a good thing), as it made me move on to assembler, initially using USR calls, but then pure assembler. When I got my ST I went on to 'C' and assembler.

 

From that initial day with my 800, my career in electronics turned to a mix of hardware design and software design. I ended up with a full 35+ years of programming, which I put down to my good old 800 with BASIC as a starter.

  • Like 4

2 hours ago, bfollowell said:

How in the world did a thread about the Atari 400 membrane keyboard get hijacked into a discussion about various BASICs?

 

Because programmers appear very hung up on the languages they use, which is perhaps entirely understandable- it's their medium in the same way as an artist's medium is their paints...

 

Similarly, it's easy to perhaps get too close to the subject and end up arguing too strongly for the merits of one language or paint (or dialect or shade of paint) over another, when perhaps standing back it's more apparent that one can program (or paint) well or badly in any language (or paint). Or on any platform, come to that...

 

Some things disparaged in old BASICs- such as line numbering rather than labelling- were a function of the limitations of terminals and rudimentary line editors rather than the language.  Without an interactive full screen editor, it's rather more difficult to indicate that you want to insert a line of code between line 20 & line 30 if they don't have line numbers...  When fully-featured screen text editors became the norm, BASIC evolved to become more structured and drop its line numbering requirement.
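
A trivial, made-up listing illustrates the point (any classic line-numbered BASIC behaves this way): to insert a statement between two existing lines, you just type a new line whose number falls between theirs at the READY prompt.

10 REM LINE NUMBERS DOUBLE AS EDIT HANDLES
20 PRINT "FIRST"
30 PRINT "SECOND"
25 PRINT "SQUEEZED IN BETWEEN 20 AND 30"

The next LIST or RUN shows line 25 slotted into place between 20 and 30, with no full-screen editor required, which is exactly why the numbering convention made sense at the time.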

  • Like 1

12 hours ago, Robert Cook said:

It'll teach many things about programming that are generally useful, along with many things that are best forgotten (bad habits that need to be broken) when moving on to other languages for serious use.  While it's not going to permanently "ruin" anyone who has any kind of talent or knack for programming, it is far from the best starting point.  BASIC's accessibility and interactivity for users as an interpreted language with an immediate mode is its main strong point, but the real accessibility that accounts for its popularity is that it was relatively undemanding on primitive early computers and easy--and therefore cheap--to implement and include as part of a packaged system.  Compilers for better high-level languages to learn were hard to come by.  They existed even for 8-bit computers, but were overshadowed by the ubiquity of BASIC.

BASIC did enable bad habits in me that were hard to break. 

 

Compilers generally required disk drives, so compiled languages were not practical on these computers, which were usually sold without built-in drives. So the machine had to include something interpreted. These computers were always low on memory and didn't come with full-screen editors or IDEs, so line editors and line numbers were used in their place. It made BASIC the perfect language for this class of machine.

 

16 hours ago, MrFish said:

In answer to BASIC in general (or any specific BASIC) being a bad starting point for programming, I disagree. It all depends on how the language is handled. Bad programming habits can be seen in any language. Good BASIC code can be written; and the benefit is that the language is quite accessible for beginners, compared to some other languages.

Even with the best practices, it is difficult to write good BASIC code on these systems. For one, the limited memory meant that comments or remarks were expensive, since they had to be kept in RAM. Same for whitespace. Many BASIC interpreters didn't support indentation for loops or if/then blocks, and they also allowed multiple commands to be strung together on a single line. The end result is dense code that is difficult to read. Also, even if you use the good practice of GOSUB instead of GOTO, many interpreters didn't have labels, so you were GOSUBing to some non-descriptive line number. There were also generally no parameters or local variables. Some later BASICs added these niceties, but on 8-bit systems, BASIC was very... well, basic.
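
To illustrate (a contrived fragment, not taken from any real program), this is roughly what those constraints pushed you toward in Atari BASIC: everything crammed onto numbered lines, GOSUB targets that tell you nothing, and every variable global.

10 HS=0:FOR G=1 TO 3:S=0:FOR I=1 TO 5:S=S+I*10:NEXT I:GOSUB 1000:NEXT G:END
1000 IF S>HS THEN HS=S:PRINT "NEW HIGH ";HS
1010 RETURN

It runs fine, but six months later nothing about it explains itself.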

  • Like 3

6 hours ago, TGB1718 said:

Can't agree with that. I started with Atari BASIC and yes, you soon learn the limitations (which is a good thing), as it made me move on to assembler, initially using USR calls, but then pure assembler. When I got my ST I went on to 'C' and assembler.

Like I said earlier, if you have talent for programming, then BASIC is not going to ruin you as a programmer (it didn't ruin me, either), but I stand by my argument that it's not a good starting point, and that's because it's not a good language in general.  It served a purpose at the time, based on many constraints, and doing any programming is better than doing none, but it's simply not the best starting point.  Are you arguing that it is the best way to start?  It was the best that could affordably be included for "free" in a timely (as in extremely rushed) manner on early 8-bit computers, perhaps, but hardly the best for learning and/or actually programming.  That would be a tough argument to make (and well out of the scope of this topic).

 

I started with BASIC, too, but if I could have started with something better and/or more useful, then I would have, and indeed, I switched to 6502 assembler before I ever did much programming.  Even such a low-level language is inherently superior to 8-bit interpreted BASIC in many ways, especially in the labels and structure it allows (as minimal as that is).  If all one ever wanted to do was dabble in programming out of curiosity, then BASIC is fine, but if really learning how to program is one's goal, then there are better languages than BASIC for that.  I never said nor implied that anyone who started out with BASIC (like I did) is going to become an awful programmer.

 

6 hours ago, TGB1718 said:

From that initial day with my 800, my career in electronics turned to a mix of hardware design and software design. I ended up with a full 35+ years of programming, which I put down to my good old 800 with BASIC as a starter.

Same here, essentially, but I for one would not program in BASIC if I could avoid it.

 

4 hours ago, bfollowell said:

How in the world did a thread about the Atari 400 membrane keyboard get hijacked into a discussion about various BASICs?

It was more direct than you're probably thinking.  The thread starter explicitly mentioned programming in the very first post, a couple of members criticized one version of BASIC (CBM BASIC V2), and I defended it as minimally adequate for what it is, as well as rather dependable and bug-free, which not all BASICs from back then are.  It was never all about the 400's membrane keyboard; it was also about what reduced-cost computers like the 400 or the VIC-20 were intended for, and we covered that topic as well.

 

1 hour ago, drpeter said:

Because programmers appear very hung up on the languages they use, which is perhaps entirely understandable- it's their medium in the same way as an artist's medium is their paints...

Maybe some programmers are religious zealots for certain languages or paradigms, but I use a variety of languages, depending on the application and its requirements.  For example, sometimes I use object-oriented languages, and other times I don't.  And sometimes I use interpreted languages or ones running on virtual machines, while other times I use natively compiled ones.

 

As for your analogy, while there is some truth in that programming involves creativity and to some extent individuality, it is really more of a craft because there can be objectively bad ways of doing things.  For example, if I were to use Java in a virtual machine to program a real-time embedded system such as a space probe, just because I LUV Java and think it's SO KEWL and would never let me make a mistake, then I would be an idiot, because what if the garbage collection function happened right when the probe needed to make a critical burn to change its trajectory, or had exactly one split-second opportunity to take a photo before zooming out into deep space?  It could work, probably, but the choice of language and environment for this purpose is still objectively horrible.  Should a pottery maker fire their creations in a kiln or use a blowtorch?  Any ugly, poorly made piece of crap could be considered "art", but programming is a craft, so there can be bad ways to make bad programs.

 

As programming languages go, BASIC, especially on 8-bit computers in the 1970s and 1980s, is one of the worst imaginable, if not the very worst, and in all of my decades of programming, it has never been the best choice for anything.  It was always there and served utilitarian purposes for these early personal computers adequately, and you can write virtually any kind of program with it, but it is virtually always the worst choice for anything more, including learning how to program.  If we were discussing the merits of automobiles, then 8-bit interpreted BASIC would be a Yugo (remember those?).  The only thing it is better than is nothing.

 

1 hour ago, drpeter said:

Similarly, it's easy to perhaps get too close to the subject and end up arguing too strongly for the merits of one language or paint (or dialect or shade of paint) over another, when perhaps standing back it's more apparent that one can program (or paint) well or badly in any language (or paint). Or on any platform, come to that...

So people like me can't see the forest for the trees?  What are you really trying to say, that 8-bit interpreted BASIC is just as good a programming language as, say, C, and just as productive for doing real programming?  Like I said, if one is only ever going to dabble, then BASIC is fine, but it would and should be the last choice for actually trying to program something that comprises more than several lines of code.  You wouldn't choose to drive a Yugo if you had the option to choose a better car, would you?  It's better than walking (unless you have to get out and push it!), but that's all.

 

1 hour ago, drpeter said:

Some things disparaged in old BASICs- such as line numbering rather than labelling- were a function of the limitations of terminals and rudimentary line editors rather than the language.  Without an interactive full screen editor, it's rather more difficult to indicate that you want to insert a line of code between line 20 & line 30 if they don't have line numbers...  When fully-featured screen text editors became the norm, BASIC evolved to become more structured and drop its line numbering requirement.

Some modern BASICs are fundamentally different languages, but we're talking about vintage interpreted BASICs here.

 

1 hour ago, zzip said:

BASIC did enable bad habits in me that were hard to break.

If you ever got seriously into it, then sure, I can definitely see that happening.  It's nothing you couldn't overcome, but my point is that it's a bad thing you've had to overcome, as you're saying, too.  And bad things are best avoided if possible, obviously.

 

1 hour ago, zzip said:

Compilers generally required disk drives, so compiled languages were not practical on these computers, which were usually sold without built-in drives. So the machine had to include something interpreted. These computers were always low on memory and didn't come with full-screen editors or IDEs, so line editors and line numbers were used in their place. It made BASIC the perfect language for this class of machine.

I totally understand and agree.

 

1 hour ago, zzip said:

Even with the best practices, it is difficult to write good BASIC code on these systems. For one, the limited memory meant that comments or remarks were expensive, since they had to be kept in RAM. Same for whitespace. Many BASIC interpreters didn't support indentation for loops or if/then blocks, and they also allowed multiple commands to be strung together on a single line. The end result is dense code that is difficult to read. Also, even if you use the good practice of GOSUB instead of GOTO, many interpreters didn't have labels, so you were GOSUBing to some non-descriptive line number. There were also generally no parameters or local variables. Some later BASICs added these niceties, but on 8-bit systems, BASIC was very... well, basic.

Additionally, comments, which are essential for efficient and effective code maintenance, and whitespace, which is important for readability and for work efficiency and quality, not only took up valuable limited memory but also slowed programs down even more.  Frankly, I'm surprised that my contention that BASIC (especially in this vintage form) isn't the best language to either use or learn first is so controversial.  Perhaps you've just conveyed what I was trying (but failing) to say.


2 hours ago, zzip said:

Even with the best practices, it is difficult to write good BASIC code on these systems. For one, the limited memory meant that comments or remarks were expensive, since they had to be kept in RAM. Same for whitespace.

 

I don't find this to be true at all. Lack of comments is more a habit of laziness or neglect. Most programs written by beginners never get to the point of maxing out memory. These were things seen by more advanced (for lack of a better term) users. Simple, short comments, reasonably placed throughout the code, were never a problem in interpreted BASIC; if they were, BASIC was also capable of being compiled. Well-written code is also self-documenting. If you choose to use variable names only to demonstrate your command of reciting the alphabet, then I guess you can expect your program to look like alphabet soup.
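
For example (a made-up fragment, just to illustrate), a couple of short REMs and meaningful names cost next to nothing, even in Atari BASIC, which keeps every character of a long variable name:

100 REM ROLL TWO DICE AND REPORT THE TOTAL
110 DIE1=INT(RND(0)*6)+1
120 DIE2=INT(RND(0)*6)+1
130 ROLL=DIE1+DIE2
140 PRINT "YOU ROLLED ";ROLL

Strip the REM and rename everything to A, B and C and you save a handful of bytes at the cost of ever understanding it again.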

 

2 hours ago, zzip said:

Many BASIC interpreters didn't support indentation for loops or if/then blocks, and they also allowed multiple commands to be strung together on a single line. The end result is dense code that is difficult to read.

 

Sure, better constructs are helpful for cleaner code; but like writing in general, good organization and readability are more a function of the one doing the writing. Blaming the language for bad coding practices is either an excuse or a lack of understanding.

 

2 hours ago, zzip said:

Also, even if you use the good practice of GOSUB instead of GOTO, many interpreters didn't have labels, so you were GOSUBing to some non-descriptive line number. There were also generally no parameters or local variables.

 

Variable names can be used for GOSUB and GOTO line numbers. Parameters and local variables can be simulated. You assign variable values before hitting your GOSUBs, and keep variables specific to each routine.
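
Something like this hypothetical sketch, assuming Atari BASIC (which accepts an expression, and therefore a variable, as a GOSUB target):

90 REM GIVE THE ROUTINE A 'NAME' AND PASS 'PARAMETERS' IN DEDICATED VARIABLES
100 DRAWSCORE=1000
110 SCOREVAL=2500:SCOREROW=2
120 GOSUB DRAWSCORE
130 END
1000 REM DRAWSCORE: PRINT SCOREVAL AT ROW SCOREROW
1010 POSITION 2,SCOREROW:PRINT "SCORE: ";SCOREVAL
1020 RETURN

It isn't real scoping, but as long as each routine sticks to its own set of variables, it reads almost like a named subroutine with arguments.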

 

Most of these claims are exaggerated excuses or lazy practices of people in coding frenzies during which they exhibit no self-control or patience, and place little to no emphasis on being able to interpret their own work in the future. Having the freedom to do something stupid is only an invitation to those so inclined.

 

  • Like 1

26 minutes ago, Robert Cook said:

If you ever got seriously into it, then sure, I can definitely see that happening.  It's nothing you couldn't overcome, but my point is that it's a bad thing you've had to overcome, as you're saying, too.  And bad things are best avoided if possible, obviously.

I suppose I was no different than a kid who taught himself guitar by picking one up and playing it until things sounded good, and then being subjected to a music theory class and bewildered by a bunch of rules that seemed pointless.

 

That was me learning "structured programming". As soon as I got my Atari, I started cranking out various BASIC games and soon became confident in my BASIC skills, but the idea of coding without line numbers was scary. I also didn't immediately see the point of "local variables" or recursive functions, and was completely scratching my head when people started talking about "object oriented" languages. Of course, a lot of that doesn't make sense for BASIC programs on an 8-bit machine, but as I started writing bigger programs, I started to see the need for the other stuff... even the object-oriented 'BS' started to make sense. :)

 

So I don't know if learning BASIC my way first was a blessing or a curse. It did cause me to resist changing my coding style at first, but on the other hand it gave me a head start over the other kids in class who had never programmed anything before.

 

But now I look at BASIC and wonder how I was ever able to code like that!

39 minutes ago, Robert Cook said:

Frankly, I'm surprised that my contention that BASIC (especially in this vintage form) isn't the best language to either use or learn first is so controversial.  Perhaps you've just conveyed what I was trying (but failing) to say.

I think it was the best prior to maybe 1985 or so. Floppy drives were crazy expensive before that, and BASIC was well suited to those small computer systems, which were all many of us had. In the later part of the '80s, the schools started teaching Pascal as the beginner language instead.


57 minutes ago, Robert Cook said:

You wouldn't choose to drive a Yugo if you had the option to choose a better car, would you?

Context is all. If I was popping down the road to the shops, I would definitely choose a Yugo over a Lamborghini... But that is perhaps not 'real driving'?


13 minutes ago, MrFish said:

I don't find this to be true at all. Lack of comments is more a habit of laziness or neglect. Most programs written by beginners never get to the point of maxing out memory.

I ran out of memory, even in my first programs.

 

15 minutes ago, MrFish said:

Simple, short comments, reasonably placed throughout the code, were never a problem in interpreted BASIC; if they were, BASIC was also capable of being compiled.

BASIC was free; the BASIC compilers were not. At least not until Turbo Basic XL came along. But you still needed a disk drive for the compiler. If you started off like my friends and me, with just a cassette drive, that was out.

 

18 minutes ago, MrFish said:

Well-written code is also self-documenting. If you choose to use variable names only to demonstrate your command of reciting the alphabet, then I guess you can expect your program to look like alphabet soup.

It is still a very dense language and not highly readable compared to alternatives. Also, using variables instead of line numbers consumes extra memory, which is always in short supply on 8-bit systems.

 

23 minutes ago, MrFish said:

Sure, better constructs are helpful for cleaner code; but like writing in general, good organization and readability are more a function of the one doing the writing. Blaming the language for bad coding practices is either an excuse or a lack of understanding.

BASIC requires bad coding practices in a lot of cases. Of course, this is very dependent on the BASIC dialect. But in the early BASICs every variable is global in scope, there aren't many data types available, no structures, no local variables. And then there are the bad practices that BASIC encourages. GOTO is the obvious one, but using line numbers also encourages people to string more commands onto a single line to avoid renumbering.
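
A contrived example of the global-scope problem (made up for illustration, but it will run in Atari BASIC and most other old dialects):

10 N=0
20 FOR I=1 TO 3
30 GOSUB 100
40 NEXT I
50 PRINT "CALLS MADE: ";N
60 END
100 REM 'HELPER' QUIETLY REUSES I AS A SCRATCH VARIABLE
110 I=I*10:N=N+1
120 RETURN

The loop is supposed to call the subroutine three times, but because I is global the helper clobbers the loop counter and the call happens only once. With real local variables, that whole class of bug disappears.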

 

But in general, even the best-coded BASIC program still tends to be inferior to what the best coding practices in other languages can produce.

  • Like 2

1 hour ago, drpeter said:

Context is all. If I was popping down the road to the shops, I would definitely choose a Yugo over a Lamborghini... But that is perhaps not 'real driving'?

Clearly you have never driven a Yugo.  I had a neighbor who actually bought one, and within weeks, he had duct-taped the rear window back on when it had spontaneously fallen off.  He wasn't the only one who did that, either--I've seen others do the same.  It wasn't long until he had disposed of the vehicle altogether.  He had bought it new, but it wasn't even worth fixing the window for real, and he hardly ever drove it (that I saw--it seemed to always be sitting there as an object of ridicule and much amusement for my family, albeit not in front of our neighbor).

 

Anyway, analogies are never perfect or complete.  Maybe I should have said moped or bicycle instead, and ignoring all other context was deliberate.  Within the context of personal/home computing at the time, BASIC was the only practical--and therefore the best--choice on the part of the manufacturers.  It was and is also not the most ideal language to really learn programming in earnest on.  Both statements can be true at the same time, you know--they are not contradictory.  Maybe it was the only language many people had access to, and it was certainly better than nothing, but that in no way implies that at least in theory, it was just as good a language as any other.

 

As for what is "real" or not, the point is that 8-bit interpreted BASIC is significantly more limiting in what can be accomplished than other languages.  It could be compiled, but that was still quite limiting in comparison to machine language, due to the barely adequate hardware of the time.  And other compiled high-level languages weren't necessarily much better in this regard, depending on the type of compiler and its level of optimization, but at least they didn't tempt beginners (or even professional programmers) to adopt bad programming conventions and habits to nearly the extent that BASIC did.  Of course, if you were someone who was born to program, like I and some others here so modestly feel about ourselves, then we could have always used the best practices by choice, but not everyone is like that.  And even I, for one, chose to give up on BASIC early on, largely due to the context of the low-performance computers of the time.  It wasn't worth learning all of the ins and outs of a language that, in the long run, I would never voluntarily choose to use, either professionally or casually.  For little bitty things that I needed to get done right now, sure, I did a bunch of that, but I wouldn't write an arcade-style game or any major application in BASIC.


4 hours ago, Robert Cook said:

Clearly you have never driven a Yugo.  I had a neighbor who actually bought one, and within weeks, he had duct-taped the rear window back on when it had spontaneously fallen off.  He wasn't the only one who did that, either--I've seen others do the same.  It wasn't long until he had disposed of the vehicle altogether.  He had bought it new, but it wasn't even worth fixing the window for real, and he hardly ever drove it (that I saw--it seemed to always be sitting there as an object of ridicule and much amusement for my family, albeit not in front of our neighbor).

 

Anyway, analogies are never perfect or complete.  Maybe I should have said moped or bicycle instead, and ignoring all other context was deliberate.  Within the context of personal/home computing at the time, BASIC was the only practical--and therefore the best--choice on the part of the manufacturers.  It was and is also not the most ideal language to really learn programming in earnest on.  Both statements can be true at the same time, you know--they are not contradictory.  Maybe it was the only language many people had access to, and it was certainly better than nothing, but that in no way implies that at least in theory, it was just as good a language as any other.

 

As for what is "real" or not, the point is that 8-bit interpreted BASIC is significantly more limiting in what can be accomplished than other languages.  It could be compiled, but that was still quite limiting in comparison to machine language, due to the barely adequate hardware of the time.  And other compiled high-level languages weren't necessarily much better in this regard, depending on the type of compiler and its level of optimization, but at least they didn't tempt beginners (or even professional programmers) to adopt bad programming conventions and habits to nearly the extent that BASIC did.  Of course, if you were someone who was born to program, like I and some others here so modestly feel about ourselves, then we could have always used the best practices by choice, but not everyone is like that.  And even I, for one, chose to give up on BASIC early on, largely due to the context of the low-performance computers of the time.  It wasn't worth learning all of the ins and outs of a language that, in the long run, I would never voluntarily choose to use, either professionally or casually.  For little bitty things that I needed to get done right now, sure, I did a bunch of that, but I wouldn't write an arcade-style game or any major application in BASIC.

 

I think perhaps we are in agreement, after all.

 

PS LOL re the Yugo anecdote!


On 10/19/2021 at 4:31 PM, Robert Cook said:

which is why, for one thing, its performance is relatively poor (CBM BASIC V2 remains better optimized).  I guess we just have a different sense of the meaning of "whackadoodle".

 

You may be surprised, though....

 

[Attached: two benchmark screenshots]

 

It is clear where CBM Basic was definitely optimized, with a good sense of practical value, as well as which areas were totally neglected or simply passed on the horrid performance of its common ancestor.

 

I personally cut my teeth on Atari Basic rev.A and a membrane-keyboard 400, with 16K and a totally unreliable 410... Brave times, those were...

 

 


1 hour ago, Faicuai said:

I personally cut my teeth on Atari Basic rev.A and a membrane-keyboard 400, with 16K and a totally unreliable 410... Brave times, those were...

I wonder how many great BASIC programs were lost because the 410 couldn't retrieve them. I know I lost a few.

  • Like 2

1 hour ago, zzip said:

I wonder how many great BASIC programs were lost because the 410 couldn't retrieve them. I know I lost a few.

I got the tee-shirt on that one.

My 410 was the early model that used a power supply tap from the 800 power supply; I used it for maybe a year or more, until the 130XE came out and I got a 1050.


On 10/29/2020 at 5:23 PM, drpeter said:

It was a huge frustration to many of those who (like me) loved the Atari 8-bit line that in 13 years of production the only significant evolution was to make the original 800 design cheaper to produce, with little-to-no functional enhancement of either hardware or firmware.  Not improving that unnecessarily creepingly-slow floating point ROM while somehow managing to partly use up an expanded ROM by replacing the Memo Pad with something even more functionally useless (Self-Test) being perhaps only the most inexplicable and unforgiveable example...

 

On the other hand, that the essential 800 design was able to endure pretty much unchanged for 13 years in production, and leaves a legacy to this day, is undeniable testament to its brilliance.

That might be true. However, if you have an Atari ST to sell, there is no point in investing in development of the 8-bit line.

Cutting production cost and selling at a lower price is economically more viable, as you can milk the old tech to the last cent.

The same happened to the C64: once the Amiga was there, Commodore did the same with the C64, and it sold well as a low-cost computer.

 

All things considered, it would have been nice to have the following as standard:

  • 256 KB RAM
  • Enhanced GTIA with 8 sprites
  • Fixes and enhancements in the OS ROM
  • Better BASIC
  • Tape transfer speed increased to 2400 bps

In 1985, there really was no point in developing this.

 


On 10/20/2021 at 1:06 PM, MrFish said:

 

 Well-written code is also self-documenting.

 

 

No, no, no, a thousand times, NO.

 

When I get new minions, I tell them I want comments that describe what they were thinking, what they intended the code to do. Any idiot can read the code and deduce what it does; the question becomes, is that what it was supposed to do?

 

  • Like 4
  • Thanks 1

3 minutes ago, poobah said:

No, no, no, a thousand times, NO.

 

When I get new minions, I tell them I want comments that describe what they were thinking, what they intended the code to do. Any idiot can read the code and deduce what it does; the question becomes, is that what it was supposed to do?

Agreed, there's no substitute for comments. I'm as guilty as the next person; code I wrote back in the '80s, whether BASIC or assembler, that I had not commented is difficult to read.


2 hours ago, thorfdbg said:

Is this test suite available anywhere?

Well, I have been working on it on and off for quite some time, but gave it a final (serious) push last week. Here it is:

 

Atari Basic format (tokenized): vulcangt.bas

Atari Basic LISTED (ATASCII): vulcangt.lst (editable outside Atari Basic)

Atari Microsoft Basic (ATASCII): vulcangt.msb (also editable outside MSB)

Windows TEXT format: vulcangt.txt

 

As long as the maximum line length (plus ":" single-line concatenation) is supported, the above versions should run on virtually ANY BASIC out there.


For a much more revealing (and interesting) display of information, set variable "VB" to "1" (top of listing).

 

Here's a run on Avery's kick-ass 8K Altirra-Basic v.157, in an XEP-80 Ultra session:

 

[Attached: two screenshots of the run]

 

 


1 hour ago, Faicuai said:

Well, I have been working on it on and off for quite some time, but gave it a final (serious) push last week. Here it is:

 

Atari Basic format (tokenized): vulcangt.bas

Atari Basic LISTED (ATASCII): vulcangt.lst (editable outside Atari Basic)

Atari Microsoft Basic (ATASCII): vulcangt.msb (also editable outside MSB)

Windows TEXT format: vulcangt.txt

 

As long as the maximum line length (plus ":" single-line concatenation) is supported, the above versions should run on virtually ANY BASIC out there.


For a much more revealing (and interesting) display of information, set variable "VB" to "1" (top of listing).

 

Here's a run on Avery's kick-ass 8K Altirra-Basic v.157, in an XEP-80 Ultra session:

 

[Attached: two screenshots of the run]

 

 

That is likely with DMA disabled, and thus not quite comparable...

  • Haha 1

1 hour ago, thorfdbg said:

That is likely with DMA disabled, and thus not quite comparable...

 

There is no such thing as "not quite comparable". 

 

Having said that, I am a bit "confuzed" with your statement, since the only "comparison" (relevant to a prior post left on this thread) is actually on post #63, after which there are no other vis-a-vis comparisons (all in the spirit of attempting to keep the thread hovering around the old "membrane" keyboard trauma, which seems the essence of it).

 

I personally designed and worked on this Test Suite with the sole purpose of squarely hitting the interpreters' engine directly, in any HW/SW setup (while also cutting through all the academic bushtit, synthetic-algorithmic crusades, dependencies on interpreter dialects, and all that sort of garbage), as long as:

  1. The structure and # of lines of the suite are NOT altered (because there is essentially no need or point in doing so).
  2. Things like using shorter-named variables for line-fitting, or adapting STRING-based commands to a specific dialect, are OK.
  3. Two runs are reported / considered:
  • Run #1: "default", on the system's power-up HW/SW config. and the interpreter's default runtime config right after being launched.
  • Run #2: "optimized" run, with total freedom to optimize any and/or all of the following allowed variables:
    • Interpreter of choice. If ROM-based, it is ideal to be a 1:1 fit for the OEM interpreter. Larger is OK, but must be disclosed.
    • If a compiler is also used, it needs to be disclosed, as well as the compiling settings, if [compiler : interpreter] is of interest.
    • Runtime variables of the interpreter (e.g. pre-compilation of line #s / locations, default FOR-NEXT execution modes, use of INTEGER-type variables as instructed in the test, etc.)
    • The interpreter's operating environment, which includes the system's OS and HW-based processing (local CPU, HW-based integer or floating-point ops., co-processors, etc. are ALL allowed to be optimized or used!)
    • NOTE: Run #2 will be valid if and only if the optimized config. is disclosed and repeatable by anyone else interested.

 

Now, with that in mind, we may suggest benchmarking how fast folks can type in the above test suite on a 400 membrane keyboard vs. an 800 mechanical keyboard, while also drinking their favorite latte...

 

 

 

 

 


On 10/22/2021 at 5:11 AM, Faicuai said:

You may be surprised, though....

It doesn't surprise me that Atari BASIC might do some things faster than CBM BASIC.  At least in my experience, though, Atari BASIC runs noticeably slower, at least for the things I've done in that language and ported between various platforms.  CBM BASIC additionally seems to scale better as programs increase in size.  And everything, especially final scores and assessments, depends on how various factors are weighted, obviously.

 

My original and main point, though, was simply to counter the false notion that CBM BASIC V2 is some weird, highly non-standard, bad, "whackadoodle" variant of interpreted BASIC, when in fact its core is actually MicroSoft (how they stylized their name back then) BASIC--the de facto standard of the time--and for some perspective, at least in comparison, Atari BASIC is a bit of an oddball (just a tiny bit, mostly due to its non-standard string handling).  CBM BASIC V2 is also quite stable and reliable, and was reasonably optimized, not some trash that was thrown together haphazardly (like we might expect from cheap-ass Commodore at times, or some other BASIC variants back then).  I'm just trying to be fair.  On that note, I called the VIC-20/C64 keyboard cheap (yet solid and quite usable), but I failed to mention that the keycaps are double shot, which while far from rare for the time, is nice and definitely not the cheapest option.  That's classier than the printed keycaps on one of my modern laptops for sure, which have been worn blank from ordinary use (high use but no particular abuse).

 

On 10/22/2021 at 5:11 AM, Faicuai said:

It is clear where CBM Basic was definitely optimized, with a good sense of practical value, as well as which areas were totally neglected or simply passed on the horrid performance of its common ancestor.

It's just not some crazy, out-of-left-field variant of BASIC that lacks the essentials, as some have claimed or implied, nor is it something weird that only some nutcase at one company could come up with as a "quick and dirty" stopgap measure.  I wonder how many people think that it is.  I bet that many do--wouldn't surprise me in the slightest, based on numerous comments I've been hearing/reading for decades now.  It shares a core or at least "code heritage" with BASICs that were licensed by Apple, IBM, and various S-100 CP/M computers.  Actually, in a way, that's kind of disappointing.  Maybe it would be "cool" to have a unique, "whackadoodle" BASIC variant instead.  But I'm just giving the facts, and at least it works well.

 

On 10/22/2021 at 5:11 AM, Faicuai said:

I personally cut my teeth on Atari Basic rev.A and a membrane-keyboard 400, with 16K and a totally unreliable 410... Brave times, those were...

While this is not a contest, I cut my teeth on paper, learning how to program (starting with BASIC) years before I ever owned a computer or took a programming class.  This included Atari 800s on display in retail stores (and later, VIC-20s and C64s), at least when a BASIC cartridge was handy in the case of the older Ataris.  At one place, there was always a 400 next to the 800, but I rarely bothered with the 400, for obvious reasons, so you got me there, although I did work with PET 2001s on occasion, and in my opinion that particular Chiclet keyboard is even worse!  For a long time, my programs were always retyped (especially since I used a variety of computers for limited amounts of time, and most had no storage devices connected), never saved.  While this was inefficient, it forced me to reexamine my code all the time, and actually might have helped me in some ways.  Then in high school I sometimes had to use a Teletype with the PDP-11/70 (earlier I had said it was a 60, but that was a brain fart--we actually had a 70 running RSTS for BASIC and FORTRAN) because some of the old video terminals (mostly Lear Siegler ADM-3As) were out for repair; I volunteered because the other students were just beginning to learn, and I was already quite familiar with the language.  I missed out on punched cards (there was a reader present, but it hadn't been used for untold years), although there was a dual 8-inch floppy drive that I occasionally had a reason to use.  I don't know about these things in general, but the disks I used (from digital/DEC) were somewhat unreliable, with media that constantly shed particles and disks that developed a permanently bent shape that caused friction and spindle slippage.

 

Brave times, indeed, and strange, especially if one was a kid who was interested in computers.  When my elementary school got its first computer, an Apple ][ Plus, it was placed in my 5th grade class because my teacher, we were told, was the only one who volunteered to deal with it.  Then I immediately had to come to her rescue.  She tried to boot a floppy disk with some kind of math learning game, but wasn't having any luck, so without asking permission or saying a word, I walked right up and closed the drive door, and it worked.  Then she tried to boot a different disk with the Reset key, since someone had told her that unnecessarily cycling the power did "unbelievable" amounts of damage, but this particular computer had been switched to require a Control-Reset combination instead (because the silly Reset key was right above Return), so I showed her how to do it, and she laughed and said "Alright, that's it, you're the computer monitor!  Everyone, talk to Robert when it's your turn to use the computer."  And I was thinking "Whoa, [most] adults are so intimidated by these machines!"

 

On 10/22/2021 at 9:01 AM, poobah said:

No, no, no, a thousand times, NO.

 

When I get new minions, I tell them I want comments that describe what they were thinking, what they intended the code to do. Any idiot can read the code and deduce what it does; the question becomes, is that what it was supposed to do?

Ideally, one would always utilize both "self-documenting code" and good comments, although yes, the former is never a substitute for the latter.  In fact, especially back in vintage times, sometimes it was necessary to write arcane, enigmatic algorithms for some functions/operations in order to squeeze out every last bit of performance, but as long as the comments are good (and it is necessary or highly beneficial to write such code), that is acceptable, in my view.
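
For instance (a hypothetical Atari BASIC fragment, written from memory of the old copy-onto-itself string-fill trick), the fast version is utterly opaque without its comment:

100 REM FILL PAD$ WITH SPACES FAST: COPY THE STRING ONTO ITSELF STARTING AT POSITION 2
110 DIM PAD$(960)
120 PAD$=" ":PAD$(960)=" ":PAD$(2)=PAD$

Without line 100, nobody reading line 120 cold would guess that it just fills PAD$ with 960 spaces.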


On 10/22/2021 at 8:11 AM, Faicuai said:

It is clear where CBM Basic was definitely optimized

I'm not sure one can call it optimized, per se. The use of 16-bit line numbers in branches, for instance, was a side effect of their not tokenizing numbers at edit time. So they had to read them from ASCII every time, and as such it was easier to make an ASCII->16-bit parser than to use the 40-bit version and then convert. In contrast, AB was converting these to FP at edit time, so in their case it was easier to just call the FP->int conversion. In neither case do I think anyone asked "how do I make this fast?" That said, MS's "search from here forward" is clearly a deliberate optimization. I suspect that came later.

 

One sees the same pattern in the loops. I suspect in both cases it was simply a case of using whatever pointer they had in-hand to push on the stack.

