The Complete Noob & Idiots Guide to E/A programming...


Omega-TI


I think success in assembly programming often depends on what sort of program you are trying to write. If you are writing a program with "basic" display and input requirements, you can get away with building upon the built-in routines for video, keyboard, and IO. You still need your own routines to accept and manipulate input and strings, but that is arguably simpler than trying to code a keyboard matrix CRU bit-scanning routine.

 

What stymied me in the beginning was how various memory and video locations translated into displaying characters or accepting input. Once I understood the simple IO building blocks, other pieces fell into place. Reading -commented- source code was very important. As I experimented I took sample code and modified it to do something completely different, building upon what I learned in the process. Compute! and Molesworth were much better reference books than the E/A manual.

 

The first few years my assembly programming was done in tandem with Extended BASIC programming. I tend to disagree that you should not try to translate BASIC programs into assembly. I still periodically write a simple proof-of-concept program in Extended BASIC because it can be written and tested very quickly. If I like it, I'll convert it into assembly based on the overall flow, input, and outputs. While not a direct conversion, the overall flow carries over nicely in many efforts. For some tasks I will either just code the routine or, if I'm feeling up to it, I will document the entire flow before coding. It often depends on whether I need a simple brute-force hack, a quick routine, or a more structured solution.

 

As for "messy spaghetti" code, I don't think new programmers should overly concerned about creating perfect routines and algorithms. At the end of the day, it is their own improved understanding and the final program result that matter. To me it's loosely similar to comparing someone who files all their paperwork neatly in folders, tucked away in a file drawer versus the person who must visually see the stacks of paper on the desk. They can both probably tell you where any item is located, though their method for locating said item is completely different on many levels.


The way I learn is to read simple books, ask questions, try examples, experiment around, observe others' techniques.

 

Best way there is.

 

 

I know you do your stuff in XB and have a certain mastery of it which allows you to transcend that slap-dash level of work. You also use a higher-level language than XB (TIdBit) for a lot of your work, so that helps with the structure a lot. Though I do not think it will slap you around silly if you use a variable you have not declared :)

 

 

I also program in assembly on occasion (SkyChart, Chaos Musings, Robot Arm Controller, Credit Card Reader, Ultimate Planet), essentially projects requiring either the use of the bitmap screen or low-level hardware access ;) Outside of that, XB remains my programming language of choice for its simplicity, power, and quick project turnaround. By the way, TidBit is not a language but simply a formatting interpreter allowing the use of long labels and indentation as well as automatic line numbering. The code is still 100% XB.


It is my belief that no one can teach you anything, they can only guide you and try to help you understand. Learning is a solitary activity. You have to make the discoveries, your brain has to make the correlations and put the pieces together, no one can do that for you. That means you only get out of it what you put into it. I am willing to guide you (those wishing to learn), but you have to be willing to put in the effort and keep up your end of the bargain. Learning to "program" (which is not so easy to define anymore) is not something you do in 14 days, regardless of what book titles suggest or various "learn to code" websites would have you believe. It takes time and a lot of effort, and it is not unlike learning to read music and play a musical instrument. If you are not willing to give that amount of effort then don't sign up. Programming is not a casual sport.

 

To me "programming" is using a computer to solve a problem. These days there are many ways to solve many kinds of problems with a computer, but back in the 99/4A days there were not many choices when it came to "home computers". You mostly had ROM BASIC or assembly. So, I will assume the problems you want to solve are such that you can use an old home computer with limited resources. Problem solving also means you should have a curious nature about you and wonder how things work. This is a fundamental trait shared by most programmers and the drive that keeps you going. Being stubborn and wanting to figure something out is also a good personality trait to have. I find that working on puzzles (the kind you do on a table where you assemble lots of little uniquely shaped pieces) gives me similar feelings to programming. Finishing the program and having it work, or watching other people enjoy what you created, is the reward. Programming is just as much about creating as painting, music, sculpture, or anything else. Your brain also perceives learning as "fun", so programming can be very enjoyable.

 

When you decide you are going to use a low-level language (programming languages that get you closer to the hardware) then you need to have a good understanding of what the computer can do and how it does it. That means you have to learn more than just the language itself; you have to understand the parts that make up the computer, how they are put together, and what each does. People using computers to solve problems typically like abstractions from the hardware, but assembly language is right at the CPU's level.

 

There are advantages and disadvantages to everything, and using assembly vs. some other language is no different. My opinion is that assembly is a good choice for 8-bit and 16-bit computers because such systems typically have limited resources (RAM, speed, disk, video, etc.) which means the problems you will be solving are usually going to be in the realm of what one person can achieve. You don't need the abstractions that you get from higher-level languages, resources of modern computers, or teams of people. Also, learning to "code on the metal" will make you a better programmer as you move up the language ladder. You can absolutely use your low-level skills (no matter the flavor of assembly you learn) in higher level languages, I have done it for years.

 

Learning how a computer works at a low level is also fun and will appeal to the curious nature that programmers have. The secret of programming in assembly, if there is such a "secret", is in understanding how a computer and its parts work, and using that knowledge to solve problems.

 

Going from one language to another, starting with BASIC vs. assembly vs. whatever, is all irrelevant. If you are open-minded and willing to learn, then any past knowledge will simply be something else you know, and some aspects of it may be useful as you learn assembly.

 

You will learn more by: teaching (guiding), reading other people's code, hacking on your own code, reading books, listening to other people. In that order.

 

For those still here, there are some prerequisites:

 

* I won't hold your hand. I can guide you, answer questions, present challenges, etc. but YOU HAVE TO DO THE LEARNING. Don't waste your time or mine, please.

 

* You are going to have to get familiar with binary and hex. We can cover some of that if required, but there are a billion (yes, I counted) websites out there that already explain these things and I'm sure you can find an explanation that "clicks" with you. If not, then ask and I'll see if I can make it clear. (A short worked example follows this list.)

 

* Ask specific questions. Vague questions beget vague answers.

 

* Don't be afraid to try something. Exploration and forging ahead are required.

 

* You won't always understand everything, so don't try. Sometimes you will have to just do something to move forward before you understand it all. There is always a layer underneath that will take you deeper into the machine, and if you try to understand it all up front you will never return to get anything done.

 

* NO COPY AND PASTE! You will type all your code. You learn by doing. If you don't you are just cheating yourself and wasting my time.

 

* Participate. If we were in the same location doing this exercise in person, I would ask you a lot of questions and expect you to think and answer. Questions would be answered with a question. When I ask a question, think about it and try to answer and figure it out. But don't get frustrated, ask for more help before you get to that point.
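
The worked example promised above: the assembler's ">" prefix marks hexadecimal, and each hex digit stands for exactly four binary digits, so conversions go digit by digit. Two values that appear in code later in this thread:

   >0102 = 0000 0001 0000 0010 binary
         = 0*4096 + 1*256 + 0*16 + 2 = 258 decimal
   >4000 = 0100 0000 0000 0000 binary, a single bit on its own, handy as a flag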

 

So, where do you want to do this? Here? What is the starting point? Someone mentioned having a goal (a problem to solve) and I think that is a good idea. It can't be something too vague or uninteresting though, or the reward won't be worth the effort and you will become disillusioned. Getting something on the screen is usually a good place to start since it gives visual feedback. "Hello World" has been suggested, so maybe start there?


If someone posted a DSK image with the necessary stuff, we could all follow along, ask stupid questions and maybe learn something.

"E/A programming" as such is possible using Asm994a *), but I guess you want to use the E/A-module to program (implying a more "hardware" oriented approach) ?

 

To the question about necessary stuff: I guess all one needs is the "editor" (built in with some images/cartridges)?

 

Other necessary stuff may pop up as you go along (thinking font editor, sprite designer etc.).

 

If you're more into doing applications (vs. games) then XB, GCC, GPL and Forth may work better for you.

 

 

*) Cross-platform development and you don't need anything on the DSK (to begin with).


I agree with everything that Matthew wrote, but still, it can be a major hurdle just to get the tools working without even having to think about writing any code. So here's a small hello world program that will get you going:

       DEF  START                * Tell E/A where the program can be started
       
*      Define nice labels for the memory addresses used for communicating with the VDP.
*      This does not generate any machine code by itself. 
VDPWD  EQU  >8C00                * VDP write data
VDPWA  EQU  >8C02                * VDP set read/write address


*      This is where the actual program starts       
START  LIMI 0                    * Turn off interrupts, since we want to write to the VDP
*      Tell the VDP which RAM address we want to write to.
       LI   R0,>0102             * This is the actual address, 258 character positions into the screen
*      This code would usually go into a subroutine since we use it all the time.
       ORI  R0,>4000             * Set the bit to tell the VDP to set up a write address
       SWPB R0                   * Swap high and low byte
       MOVB R0,@VDPWA            * Send the low byte to the VDP
       SWPB R0                   * Swap bytes back again
       MOVB R0,@VDPWA            * Send the high byte to the VDP
*      Send the characters to display to the VDP
       LI   R1,TXTHEL            * Load the source address of the text into R1
       LI   R2,12                * This is the loop counter, initialized to the length of the text
LOOP1  MOVB *R1+,@VDPWD          * Send one byte from the address that R1 points to to the VDP, then add 1 to R1
*                                * Note that the VDP will automatically increment its write address after we write a byte
       DEC  R2                   * Subtract 1 from the counter
       JNE  LOOP1                * Jump as long as the counter has not reached zero
*      Wait for quit
       LIMI 2                    * Enable interrupts and the check for quit
LOOP2  JMP  LOOP2                * Loop forever


*      The text to display, stored as one byte per character       
TXTHEL TEXT 'HELLO WORLD!'       
       
       END  START                * This will tell E/A to run our program automatically at START

The code could have been much shorter if I had decided to use the E/A VDP routines, but I figure there are plenty of examples using those routines available, whereas a program that includes everything required (without calling any internal or external routines) may be harder to find. The code does depend on being run from E/A because it assumes that the VDP registers have already been set up with certain values and that a character set has been loaded.
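
For comparison, here is a minimal sketch of what the screen write collapses to if you do use the E/A utilities (assuming the standard VMBW vector, which the loader resolves through REF, and reusing TXTHEL and the address from above):

       REF  VMBW                 * E/A utility vector (normally near the top of the source)
       LI   R0,>0102             * VDP destination address, as before
       LI   R1,TXTHEL            * CPU address of the text
       LI   R2,12                * number of bytes to write
       BLWP @VMBW                * replaces the address setup and copy loop above

VMBW sets the VDP write-address bit itself, so the ORI/SWPB dance goes away too.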

 

Here's how you can assemble and run the program from Windows:

  • Make a folder called 'hello'. Open a text editor and type in :-) the code above. Save it in the folder as hello.a99.
  • Open WinAsm99 and add hello.a99 as the only source file. Make sure 'Def Regs' is checked. Save the project in the same folder under the name hello.apf.
  • Click Run Assembler. If everything goes well it should generate an E/A#3 object file called hello.obj.
  • Open Classic99 and set DSK1 to point to the folder you created. Open E/A and use option 3 to run the file DSK1.HELLO.OBJ

Once the program is working you can try to change the text or the placement on the screen. You will soon want to move on to having a general toolbox of VDP routines, which you will find in Matthew's thread.
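
For example, screen positions in graphics mode run row*32+column (rows 0-23, columns 0-31, counted from zero), so moving the text to, say, row 5, column 3 is a one-line change to the sketch above:

       LI   R0,5*32+3            * row 5, column 3 = >00A3, replacing >0102 above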


Worth mentioning is the reason Rasmus has "Make sure 'Def Regs' is checked" in his instructions. This is what allows you to reference the 9900 registers as R0, R1, etc., up to R15. Simply put, without these definitions the assembler does not "automatically" recognize Rx as referring to registers, and you must instead use plain numbers (0, 1, ... 15). These definitions, whether you put them in your code to start or have the assembler assume them for you, make your source easier to work with because registers are clearly indicated. As an example

ADD8    AI   R0,8

is much easier to work with than

ADD8    AI   0,8

as the former explicitly shows that the first argument is a register, while the latter only implies it.

 

If you started with the Mini Memory's Line-by-line Assembler, the explicit form is the form with which you will be familiar.

 

Later down the road you will see you can use any definition you want to stand in for a register.
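
A small sketch of that idea (the names here are hypothetical): once a label is EQUated to a register number, it can stand in anywhere a register is expected, which lets the name document the register's role.

PTR    EQU  1                    * R1 will serve as our pointer
CNT    EQU  2                    * R2 will serve as our counter
       LI   PTR,>A000            * assembles exactly like LI R1,>A000
       LI   CNT,8                * assembles exactly like LI R2,8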

 

 

 

@Vorticon: I argue for TIdBit as a language as it has its own rules and grammar as a super-set of XB. That it is interpreted or re-formatted into another language makes it no less so. It is just a higher-level language than its target. TIdBit -> XB -> GPL/Assembly at run-time.

 

 


Perhaps even a little clearer would be to say that R0, R1, ..., R15 are constants in the same way that labels are constants, i.e., once they are declared in the source code, they cannot change. As mentioned somewhere above, they are names for constants that make the numbers they reference easier to read and use in your program. You can include the following in your program yourself to make your source code easier for you to understand:

R0     EQU  0
R1     EQU  1
R2     EQU  2
R3     EQU  3
R4     EQU  4
R5     EQU  5
R6     EQU  6
R7     EQU  7
R8     EQU  8
R9     EQU  9
R10    EQU  10
R11    EQU  11
R12    EQU  12
R13    EQU  13
R14    EQU  14
R15    EQU  15

Even checking “Def Regs” in WinAsm99 AND including your own EQUates will not hurt anything as long as the labels are the same, which they are. Obviously, the Assembler will complain if you try to EQUate two different numbers to the same label.

 

...lee


I would very strongly recommend reading Bruce Harrison's Art of Assembly articles which are found in the pinned Development Resources thread at the top of the forum. They are fun to read, and very easy to follow. Bruce was clearly learning as he went, so his writing was very approachable. I recall desperately trying to understand the Radix 100 representation of floating point numbers for my SkyChart program, and Bruce had a full article on it which made it seem so simple.


Agreed, his articles were the best! Especially at part 31, when he did a set of "intro to assembly" articles, starting with "This is a Football"


I put a number of PDFs about assembly on my TouchPad so I can read them handily at will. Harrison's articles are among them.

 

I keep the PDF on my flash drive to access anytime. :)

 

I lament that Bruce Harrison left the community the way he did... frustrated and angry, with 3rd-party hardware constantly breaking his software being the primary factor.

 

My recent struggles with getting my CRPG's code safely baked into the cartridge RAM area definitely make me realize how tricky it is to tailor software to hardware at times... If I ever HAD to use the Superspace II cartridge with the bank switching, I'd have to write not just a custom loader but a custom compiler as well... Yeesh.

In talking to Bruce Harrison, I mentioned to him that software written with future problems in mind will crash less often.

 

He objected to this, as he thought hardware should center on being backward compatible, with me countering that that is hard to do as hardware advances, the TI being already 20 years old at that point.


Do you know which article it is? I'd like to read that one on R100 :-)

 

[Edit] never mind - found it! :thumbsup:


Well, whaddaya know... Learned something today... An undocumented feature:

*
* FOLLOWING DISPLAYS A NUMBER USING AN
* UNDOCUMENTED FEATURE WITH GPLLNK
* BY CONVERTING THE INTEGER PLACED AT
* >835E INTO A STRING, WHICH IS THEN
* DISPLAYED AT ROW 12, COLUMN 7
* THE NUMBER IS DISPLAYED AS AN UNSIGNED
* INTEGER FROM 0 THROUGH 65535
* THANKS TO MERLE VOGT FOR THIS ONE!
*
*
   LI R0,11*32+6        SET R0, ROW 12, COL 7
   MOV @NUMBER,@>835E   PLACE NUMBER AT >835E
   CLR @>837C           CLEAR GPL STATUS BYTE
   BLWP @GPLLNK         USE GPLLNK
   DATA >2F7C           TO CONVERT INTEGER TO STRING (UNDOCUMENTED)
   MOVB @>8361,R2       GET STRING LENGTH
   SRL R2,8             RIGHT-JUSTIFY
   MOVB @>8367,R1       GET LOW BYTE OF ADDRESS
   SRL R1,8             RIGHT-JUSTIFY
   AI R1,>8300          ADD >8300 HIGH BYTE
   BLWP @VMBW           WRITE STRING TO SCREEN
*
*

(the above taken from Bruce Harrison's articles on FP)
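
As posted, the excerpt will not assemble stand-alone; something along these lines would need to surround it (NUMBER is a hypothetical word holding the value to display, and GPLLNK and VMBW are the usual E/A utility references):

   REF GPLLNK,VMBW      REFERENCE THE E/A UTILITIES
NUMBER DATA 12345        THE VALUE TO DISPLAY (EXAMPLE)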


I had to browse through my disks to remember how I actually learned Assembly language. Indeed, I somehow managed to make sense of the TI Editor/Assembler manual, although I remember it took me quite some effort to work through the manual, taking into account that it was all in English, and maybe that I was just 14 years old. :)

 

I was always looking for example programs in Assembly language, but all you could get in those days came from magazines, written in BASIC or Extended Basic. I remember well that I finally found two books that were a great help to me: "99 Special I" and "99 Special II" (see below), issued by Texas Instruments, authored by @APEsoft.

 

Out of curiosity and a fit of nostalgia, I just wandered through the images of my old disks from the time when I learned programming on the TI. Since I numbered my disks, I can trace a little of the path I followed, going back to 1983, and how I spent my afternoons ...

 

So if you plan to learn Assembly language, maybe there are some inspirations for you:

  • Do some simple graphics mode text output using utils (VSBW, ...)
  • Multicolored text in bitmap mode
  • Tool that converts pound to kilogram: implement a replacement for INPUT known from BASIC
  • Play music
  • Load a file from disk (DSRLNK) and play music
  • Sidescroller (just column-wise, not soft scrolling)
  • Play music and scroll its lyrics on the screen
  • Prime number generator: input a number, printing all primes up to that number in Text mode using multi-column output (first complete application, saved as EA#5, dated 1984)
  • Read speech ROM and print contents on screen as hexdump
  • Disk-to-tape copier: Learning to use GPLLNK
  • Memory tests (my TI console suffered from cartridge port wear, so I tried to diagnose the problem)
  • Soft scrolling (one line of text in bitmap mode)
  • Load EA#5 in Extended Basic from tape (a friend of mine had a memory expansion, but no disk drive and no Editor/Assembler)
  • Interfacing with Extended Basic; learn about XMLLNK
  • Utility to save an Extended Basic program by pressing CTRL+FCTN (learn about interrupt hooks)
  • LOAD interrupt handler
  • Disk sector loader and hexdump display (date in comments: July 1986)
  • Printer operations; set particular modes (EPSON LX-80)
  • Disk cataloger (complete application)
  • CRU experiments
  • Create a stripped-down DSRLNK version for disk only
  • Use LOAD interrupt to get a screen dump from the printer
  • Copy tape to disk. Used especially for copying Scott Adams Adventure tapes to disk.
  • 1987: Kick-off of my first multi-year project "SPEECODER" (initial name "DECODE/ENCODE") - an assembly language implementation of a TI Extended Basic tool to decode and encode LPC speech. Final version 1989.
  • Designed fixed-point arithmetic for faster computation
  • First Mandelbrot set generator, black/white
  • Switch to Geneve 9640 and learn to use v9938
  • ...

=============

 

As some of you probably know, I'm holding courses at the Technical University at Nuremberg, and this semester I had the freshmen (first semester) again. So we try to make them understand basic computing architecture and machine language programs, and we use the MIPS architecture for that purpose.

 

The MIPS instruction set architecture is pretty interesting, and quite different from the TMS9900. Most notably, all operations only deal with direct register access, and we have 32 registers on the chip. All memory accesses must be done with dedicated load and store operations. Interestingly, all commands are exactly 32 bits long; no additional arguments.

 

This week, the exam was due, and the last assignment was to write a MIPS assembler program that converts a null-terminated string with characters '0' - '9' to an integer. It takes about 10 lines; unfortunately, only a few people actually managed to create meaningful code.
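
The algorithm itself is language-independent: walk the string, and for each digit multiply the running result by ten and add the digit's value. A rough equivalent in TMS9900 assembly, to keep with the rest of this thread (STR is a hypothetical label for the string; the register roles are my own choice):

* Convert a null-terminated ASCII digit string at STR into a
* binary number in R3. Called with BL @ATOI; clobbers R1, R2, R4.
ATOI   CLR  R3                   * result accumulator
       LI   R1,STR               * point at the string
NEXT   MOVB *R1+,R2              * fetch one character, advance the pointer
       SRL  R2,8                 * right-justify the byte (sets the EQ flag)
       JEQ  DONE                 * a zero byte ends the string
       AI   R2,-48               * ASCII '0'..'9' (>30..>39) -> value 0..9
       MPY  @TEN,R3              * 32-bit product R3*10 lands in (R3,R4)
       MOV  R4,R3                * keep the low 16 bits as the running result
       A    R2,R3                * add the digit just read
       JMP  NEXT
DONE   B    *R11                 * return with the number in R3
TEN    DATA 10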

 

I've got the impression that it gets increasingly difficult to bring people to a certain level of technical understanding. Although everyone spends lots of time with information technology devices, they get increasingly detached from the underlying technology.

[attached image: the "99 Special" books mentioned above]


Back when I started with the MiniMemory, I wrote a few utility routines for a game I wrote in BASIC. I used the manual as a guide and wrote routines to: copy a custom font and character definitions into VDP RAM, copy the color table and sprite table into VDP RAM, and load entire screens into VDP RAM (one in text mode, one in standard graphics mode). I also wrote routines to stash the same into MiniMemory RAM space for me to save onto cassette.

 

Now I am learning more while I work on Arkanoid. The biggest difference between the MiniMemory and E/A environments is the method of assembly. With the MiniMemory you do assembly one line at a time, hence "Line-by-line Assembler," whereas with the E/A you take an entire source file and assemble it.


I've got the impression that it gets increasingly difficult to bring people to a certain level of technical understanding. ...

The number of technical "geeks" in the world has not changed just because computers and technology have become commonplace. Back when affordable home computers came out (early 80's), the only people who had computers were the geeks. Now we are just lost in the noise of the general masses.

 

There is this perception that everyone who uses a computer should know something about how they work or how to program them. I disagree. That would be like saying everyone who drives should know how to work on and fix their car. Everyone can drive, but most people have no knowledge or ability to fix their car, let alone modify or engineer some part of it. Why should it be any different for computers?

 

Some people are technically inclined, some are not. Computers and programming will be naturally attractive to some people, but probably only a small percent compared to the number of people simply using computers.


You're certainly right, Matthew, and one of the perpetual objectives is to deliver "computing power" to every home and user the way we do it for water, electric power, or other commodities. In an invited talk at my former university, the term "commodification of computer science" was coined, back in 2006, when Cloud Computing was in its beginnings. The speaker came from a bank, maybe Deutsche Bank, if I remember correctly. Of course, they are dreaming of getting their IT done with fewer computer science geeks. :-)

 

I doubt this will happen too soon, if ever. We continue to find issues that seem to require quite a lot of "computer science thinking", far beyond what we need to know about power generation when we plug in our vacuum cleaner. People seem to believe that with the proper tools we just click together our solutions, but it looks as if just this "click together" is the crucial point, not being able to put it down in a programming language.

 

Anyway, I was talking about some other people, particularly those who start to study computer science, and who should have some inherent interest in the details of computing and its machinery. When I started to teach at TH Nuremberg, I expressed my surprise that our study plan includes Assembly language in the first semester, because I know that topic from the second semester or higher. Also, we do get recurring complaints that people studying CS/Economics feel they have no use for Assembly language at all. On the other hand, we say that we cannot grant a Bachelor/Master degree in CS to someone who never got an idea how the machine level of a computer is actually doing its job.

 

The problem, as I see it, is that it has become increasingly difficult to teach students the technical details of computers, different from what it was in our time, when we sat in the auditorium. This is more than "everything was better in those days long ago".


People seem to believe that with the proper tools, we just click together our solutions ...

I don't understand why everyone is trying to turn computer science, engineering, and programming into some high-level modular "click pieces together" kind of discipline. Everyone has a body, so why don't we all start doing surgery on ourselves? I mean, really, doctors and surgeons don't do anything I can't do, and medicine has been around a long time now and should be really easy for anyone to do. (That is meant to be sarcastic, in case it is not clear.)

 

I wish the world would get off computer science as something that is supposed to magically get easier and faster, and just accept that the discipline is actually really hard, takes time to learn, and requires skilled people in the field.

 

Nowhere have I seen "technology" mean that things got easier, so why do people think that about computers? Maybe easier for the end user, but not easier to learn, design, or produce. For example, look at aircraft. I could build an airplane like the ones the Wright Brothers made, but it would be very hard to use and dangerous. Today people get on a commercial jet and fly all over the world very easily, thanks to advances in technology. But could the passengers design and build the aircraft they are riding in? Do any of them *think* they could design and build such an aircraft? Anyone want to ride in an airplane that a passenger just bolted together by selecting major components like wings, landing gear, fuselage sections, tail, engines, various controls, etc.?

 

Anyway, I was talking about some other people, particularly those who start to study computer science ...

I think assembly should be taught first. As a 13-year-old I managed to learn BASIC from the blue book that came with the 99/4A, followed by assembly language when I finally got the expansion unit. In a learning environment with older students that you have in a university setting, learning an assembly language should be a realistic goal.

 

However, I do believe that you need to learn how to solve problems using the tools a computer provides before you can expect to produce any working code. It does not matter if you know all the syntax of a language if you don't understand how to put instructions together to get the computer to do something meaningful. That would be like trying to learn a foreign language by only reading a dictionary.

 

It is a mistake to think that by learning a programming language you will inherently learn to think like a computer, or pick up on the underlying structure of the machine. For example, it takes more than just explaining the syntax of a loop for someone to use one. You have to explain when to use loops, why they are used in the first place, how they are a tool to solve a problem, what the computer is really doing (at the assembly level), and how they relate to the problem as a whole.
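
To make that concrete on the machine this thread is about: everything a BASIC FOR loop asks of the CPU is a counter, a body, a decrement, and a conditional jump. A bare sketch in TMS9900 assembly:

       LI   R0,10                * set the loop counter (FOR I = 1 TO 10)
LOOP   NOP                       * the loop body would go here
       DEC  R0                   * count one pass
       JNE  LOOP                 * jump back until the counter reaches zero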

 

People are also used to thinking in generalities: "come here", "go to the store", etc., but a computer requires you to describe absolutely every single step in a process down to the smallest detail. Get it wrong and the computer has no problem telling you, in the form of a crashed program. In some ways this is a direct reflection back at the programmer that they made a mistake, and some people take offense at that, while others see it as a challenge and yet another problem to solve and overcome. The higher the frustration, the greater the reward, if you manage to get through the task.

 

If I were teaching a first-year course, I would have two parallel lines of learning going on. One would be where students are following instructions and getting the computer to do something. People like to see things happen and get feedback. If you try to start with a ton of theory to establish some initial foundation, you are going to bore the students and they will lose interest. Don't make them drag through binary and hex number theory, ugh! That stuff will come naturally over time. Getting something on the screen, hearing sounds, etc. is rewarding, even if they don't completely understand what they are doing. Then hacking on the code, making small changes to see the results, etc. reinforces the learning, builds confidence, and helps keep up the interest.

 

The other line of learning would be in teaching why the programs do what they do. How the computer is dealing with that information and data and producing something on the screen, making a sound, or whatever. At points in the process you would have deep-dives into a concept (e.g., 8 bits in a byte, I/O vs. memory mapping, etc.) which could be injected by the teacher or, better yet, prompted by a question from a student.

 

The problem, as I see it, is that it has become increasingly difficult to teach students the technical details of computers ...

Computers have not changed since the stored-program design, or basically since the mid-1960s. Our computers are faster, more dense, and have more storage available to them, but how a CPU and computer actually works has not changed. All CPUs still perform the same old fetch, decode, execute, store process and get their instructions and data from RAM. ICs are still designed with photolithography processes building up layers of silicon and other elements to make transistor circuits, which in turn make up the basic flip-flop and Boolean logic circuits that let a CPU do what it does.

 

I don't think it would be hard to teach students the details of computers. I think it would be hard to convince the students that they *need* to learn the details of computers. This goes back to that perception problem that people have about technology these days, and most students are going to be coming into class with preconceived notions.

