Marius Posted January 10, 2012
Yikes - not so safe without battery-backed RAM. You wouldn't want a power cut after two or three hours of coding. That's why I was a bit disappointed to find out that the battery on the U1MB upgrade is only meant for the RTC. Candle's explanation that 'we' today have fast enough hard disks is true, but it doesn't suit the way I work on my Atari 8-bit :S
flashjazzcat Posted January 10, 2012
"Candle's explanation that 'we' today have fast enough hard disks is true, but it doesn't suit the way I work on my Atari 8-bit :S"
I'm rather intrigued now... how does an HDD fall short of requirements?
Marius Posted January 10, 2012
"I'm rather intrigued now... how does an HDD fall short of requirements?"
What I said above. When I'm coding, I don't want my own code messing with my hard disk and the important data on it, so while I'm coding my hard disk goes on write protect (with the write-protect switch on my BlackBox). I don't want to code on a slow device either, so the best alternative is a RAMdisk. In fact, it isn't an alternative - it's my first choice for coding. Just like you wrote, it's a bit tricky: with a power cut or other serious hardware problem, everything on the RAMdisk is gone. A battery backup for all of that would be awesome. Coding with the hard drive not write-protected is not really a nice alternative, and for me the RAMdisk is the best thing to use when I'm working on my Atari projects.
flashjazzcat Posted January 10, 2012
Is the RAMdisk not equally or more susceptible to damage while you're coding and testing? I assume you're editing source code, saving it to the RAMdisk, compiling, testing, and so on?
Marius Posted January 10, 2012
Yes. But then the only thing I lose is that day's work. When I lose my complete hard disk contents...
candle Posted January 10, 2012 (Author)
Why should you lose your hard disk contents? Is the Black Box THAT unstable?
Marius Posted January 10, 2012
Sigh... I can keep repeating it: I would lose hard disk contents if I made a coding error that screwed up my hard disk contents. That could happen with any device. That's why, when I'm coding, everything except the RAMdisk is write-protected.
Marius Posted January 10, 2012
@candle please read back and answer my questions. Every time, a lot of messages follow and you seem to miss my question(s). Thanks, Marius
candle Posted January 11, 2012 (Author)
Marius, could you make the hack yourself? It would require you to lift one pin of each RAM chip and add a small signal diode (a 1N4148 would suffice) plus a so-called supercapacitor (double-layer capacitor) per chip. It would be similar to what I've seen in Hias's creation (or MegaHerz's?). You would end up with a few hours of backup, I suppose, which should suffice for the short-term protection you're talking about. A proper way of doing it would require another chip, called an NVRAM controller, which takes control of the OE and WE signals and passes them through to the RAM only if the power supply it supervises is stable and above the tripping point. That is considerable cost-wise and area-wise (board area, that is), and I believe not required by GP.
flashjazzcat Posted January 11, 2012
Here's another solution: boot the machine, copy your work to the RAMdisk, write protect the BB, edit, test, etc. Then unlock the BB, copy the day's work back to the HDD (assuming it hasn't all been wiped out by runaway code) and turn off the computer. Repeat ad infinitum.
+rdemming Posted January 11, 2012
"Sigh... I can keep repeating it: I would lose hard disk contents if I made a coding error that screwed up my hard disk contents."
I'm surprised that in this age - with 6502 cross-compilers on fast PCs, nice, user-friendly text editors and integrated development environments (IDEs) with no 40-column limitation that assemble huge projects in the blink of an eye, no risk of losing your work in a RAMdisk because you made a programming mistake, far better keyboards than an Atari ever had, near-perfect Atari emulators to test your work, and SIO2PC solutions that can transfer your work to a real 8-bit far faster than a Happy drive - someone is still programming directly on an Atari 8-bit. Respect. Robert
flashjazzcat Posted January 11, 2012
"...someone is still programming directly on an Atari 8-bit. Respect"
I completely agree. Regardless of the specific working methods employed, it's something I miss.
Marius Posted January 11, 2012 Share Posted January 11, 2012 (edited) I'm surprised that in this age with 6502 cross compilers on fast PCs with nice and user-friendly text editors and integrated development environments (IDE) with no 40 column limitation that assemble huge projects in a eyeblink, with no risk that you lose your work in a ramdisk because you made a programming mistake, with far better keyboards than an Atari ever had, near perfect Atari emulators to test your work and SIO2PC solutions that can transfer your work to a real 8-bit far faster than a happy drive, someone is still programming directly on an Atari 8-bit. Respect Well... thanks a lot for the Respect Robert! Here I explain why: http://www.atariage....ost__p__2439444 I have been using cross assemblers. In fact: our last project (The Abbuc Intro #106) we did on cross assembler. I must agree: the editor was easier to use than the line numbered mac/65. But the FIRST moment I saw the final result on the real thing, was when it was done. The only thing I could do was watching our final result... 2 minutes of joy on the real thing! I don't know the english translation of this Dutch saying: "Het bezit van de zaak, is het eind van het vermaak" ... but it means in English: once you reached your goal, the fun is gone. And that's just my problem with all that PC/Mac/Emulation involved things. Everybody who uses PC or Mac could say: I can not understand why someone would still use it's atari at all? They are outdated, and everything on modern PC/Mac goes faster, better and easier. Why in the world would you still use your real atari... you have an emulator? Right... For me: coding = fun For me: atari = fun For me: coding on the atari = FUN ^ 2 But of course when I'm on the Atari I love to use the best configuration there is. I do not use Atari Assembler Editor. I use the Mac/65 cart with DDT (it's a fabulous development cart). Another fabulous Assembler is the Synassembler. 
It would fit in the 8K slot of the Ultimate Upgrade (that's why I want that FLASHER and the feature to select or Basic, or the other 8K rom). And sometimes I use the 130XE+ assembler ... that one has a fantastic editor. I think... with the right configuration... and the right knowledge of how to get the max. performance/environment features out of your little atari you would be surprised how easy it is to program on little Atari. I bet it would be just as easy and versatile as cross assembler. Only more fun. Edited January 11, 2012 by Marius1976 1 Quote Link to comment Share on other sites More sharing options...
Marius Posted January 11, 2012
"Marius, could you make the hack yourself? It would require you to lift one pin of each RAM chip and add a small signal diode (a 1N4148 would suffice) plus a so-called supercapacitor (double-layer capacitor) per chip..."
Great! Is that supercapacitor a gold cap? And one more question: which pin on the RAMs would that be? I'm not sure I'm going to do it... my experience with SMD is very limited, and I don't want to kill the boards. But I will think it over. It is an interesting option (for me). Greetz, M.
candle Posted January 11, 2012 (Author)
Pin 32. These capacitors go under different names - gold cap, backup cap, double-layer cap, or supercap - it's all the same. You're looking for 0.5-1F capacity, rated at 5V-5.5V (there will be some voltage drop across the diode, so a 5V rating is marginal, but enough).
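As a rough sketch of what such a diode-plus-supercap backup buys you: the hold-up time follows from t = C·ΔV/I. The figures below are assumptions for illustration, not from the thread - CMOS SRAM data-retention standby current varies enormously (microamps to milliamps depending on the chip and temperature; check the datasheet), which is why estimates range from hours to weeks.

```python
# Back-of-the-envelope hold-up estimate for a supercap-backed SRAM,
# as in the diode + supercapacitor hack described above.
# ASSUMED values (not from the thread): ~1 uA standby current per chip,
# data retained down to 2 V, ~0.7 V drop across the 1N4148.

def holdup_hours(capacitance_f, v_start, v_min, standby_current_a):
    """Time for the cap to discharge from v_start to v_min at a
    constant current draw: t = C * dV / I (seconds), returned in hours."""
    return capacitance_f * (v_start - v_min) / standby_current_a / 3600.0

# 1 F cap charged to ~4.3 V (5 V supply minus the diode drop):
hours = holdup_hours(1.0, 4.3, 2.0, 1e-6)
print(f"~{hours:.0f} hours")  # ~639 hours under these optimistic assumptions
```

With a less optimistic 1 mA draw the same formula gives well under an hour, which is closer to the "few hours" mentioned above; the standby current is the figure that matters.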
+bob1200xl Posted January 11, 2012
How about a slave drive (CF card?) that you can leave R/W while the master is WP? Memory is a lot easier to corrupt than a drive... Bob
Fox-1 / mnx Posted January 11, 2012
"I'm surprised that in this age with 6502 cross compilers on fast PCs with nice and user-friendly text editors and integrated development environments (IDE) with no 40 column limitation that assemble..."
I can't believe people get any joy out of using an ordinary, modern, everyday PC/Mac/whatever system to do things with buggy emulated tools (read: buggy = close, but still not 100% emulation) while having a 100% real system at hand.
+bob1200xl Posted January 11, 2012
So, I have an Atari with a hardware hack (or two...). How do I support that on a PC? Should I climb the learning curve for programming in both environments? How do I separate a bug in my program from a bug in my emulator/cross-assembler/Atari? You can program on the Atari with higher-speed clocks, multiple CF drives, battery-backed SRAM, an internal SynAssembler... if you really want all that. Or you can just use Assembler/Editor and a 1050. I would guess that the total time invested is comparable, depending on the project. Would you do all that PC stuff for 50 lines of code? Bob
flashjazzcat Posted January 11, 2012
"So, I have an Atari with a hardware hack (or two...). How do I support that on a PC? ... Would you do all that PC stuff for 50 lines of code?"
Very much depends on the size of the project. I went about as far as I could with MAC/65-compatible source code on the A8, before running the assembler in emulation for extra speed, and then finally jumping over to a (MAC/65-compatible) cross-assembler. The learning curve as far as the assembly language was concerned was nil, and while it takes some moments to set up an environment on the PC, once it's done, it's done. SpartaDOS X was written using a cross-assembler back in the day, and there are plenty of similarly huge projects which are - sadly - somewhat unsuited to in-place development. I've encountered bugs in cross-assemblers and emulators, just as I've encountered bugs in assemblers on the A8. There's more chance of getting the bugs in the emulator/cross-assembler fixed tout de suite. I developed a large project using an XF551 and a home-brewed macro assembler in the 90s. Towards the end, my main worry was that the drive would give up at any moment... but they were great times! I can surely envisage handling small to medium-sized projects again on the A8... but multi-bank 1MB flash carts full of code and 40KB assembler projects... I don't have time to wait for them to compile on the little machine.
candle Posted January 11, 2012 (Author)
It's about what you're up to, I guess. If it's the retro feel and you have nothing but time on your hands, I would do it on the Atari - perhaps with an AKI- or TT-Touch-equipped keyboard at least, as my original keyboard was so mushy it was beyond the point of using it for coding. It didn't feel that way back in the day, but I didn't have anything to compare it with. The same story goes for the ST - back then I was able to sit down all day and write code under Devpac using a single floppy drive as my setup. Nowadays I wouldn't do it again, even though I don't own any hard drive for the ST. But if it's about setting a goal and attaining it, the tools aren't important - the goal is (plus I'm not that confident in my code, so sometimes it's a trial-and-error approach).
flashjazzcat Posted January 11, 2012
OT: but you mentioned the other day that there was no decent cross-dev platform for the ST, Sebastian. I was amazed, but I believe you're correct!
+rdemming Posted January 12, 2012
"I can't believe people get any joy out of using an ordinary, modern, everyday PC/Mac/whatever system to do things with buggy emulated tools ... while having a 100% real system at hand."
"So, I have an Atari with a hardware hack (or two...). How do I support that on a PC? ... Would you do all that PC stuff for 50 lines of code?"
LOL, doing everything on a real system certainly has its charm. Emulators are never perfect; that's why I said you can use (high-speed) SIO2PC solutions to test your code on a real system. For large projects this seems more comfortable than waiting for things to assemble on real hardware. Emulators are not perfect, but for development purposes they have the advantage of debugging features like (conditional) breakpoints, step-by-step code execution and tracing. I'm not familiar with native development environments, but are those debugging features available on a real system? Robert
Marius Posted January 12, 2012
"Emulators are never perfect; that's why I said you can use (high-speed) SIO2PC solutions to test your code on a real system."
SIO is a lot slower than my RAMdisk, which is lightning fast.
"For large projects this seems more comfortable than waiting for things to assemble on real hardware."
I have worked on rather big projects with my MAC/65 and I never found the assembly time annoying. Even a rather big source assembles in a matter of just a few seconds (MAC/65 or SynAssembler). Yes, the Assembler Editor is a pain in the ass, so if you have that assembler in mind, then yes, I understand what you say about speed. MAC/65 is incredibly fast. When you put a .OPT NO LIST at the start of your source and turn off the screen output, it is assembled before you know it!
"Emulators are not perfect, but for development purposes they have the advantage of debugging features like (conditional) breakpoints, step-by-step code execution and tracing. I'm not familiar with native development environments, but are those debugging features available on a real system?"
Partly, yes. DDT has step-by-step code execution, and I believe breakpoints are possible in it too. Tracing is an option in SynAssembler for sure (I'm sure, because I used it there). I don't know whether DDT (MAC/65's debugging tool) has it or not; I never checked. And of course I count my BlackBox too. That has the best feature: a real-time 'break' button that drops you into the 6502 monitor during operation. So, like I said in my previous posts and that other topic: if you have the right configuration (both software and hardware) on the little Atari, and you know how to use it... I'm wondering which setup gives the most satisfaction. I know there are a lot of Atari-related PC tools, like G2F, the RMT music tool, etc. Yes, of course it is more convenient to use the creations of these tools in your program. But I don't use those programs. I use MPT and CMC, and for graphics I use Atari-native programs. So for that reason I can stick with MAC/65 or SynAssembler too.
flashjazzcat Posted January 12, 2012 Share Posted January 12, 2012 (edited) Sio is a lot slower than my ramdisk, which is lightning fast. I think the time spent waiting for 200KB of source code to compile on the A8 would outweigh the inconvenience of waiting for the target binary to load via SIO2PC. However, a PBI peripheral emulator is, IMO, desperately needed - particularly with the profusion of fast A8 HDDs nowadays. The time required to copy large files across to the hard disk via SIO is an issue. I have worked on rather big projects on my Mac/65 and I never found the assembly time annoying. Even a rather big source assembes in matter of just a few seconds. (Mac/65 or Synassembler). Yes Assembler Editor is a pain in the ass, so when you have that assembler in mind: yes I understand what you say about speed. Mac/65 is incredible fast. When you put a .opt no list at the start of your source, and turn off the screen output, ... it is assembled before your know it! I'm gonna have to test this theory out. Back when I was coding on the A8, I was using my own assembler (MA65, which is fast) to assemble about 200KB of code disk-to-disk under SpartaDOS X. It took about 5 minutes to compile the lot on an XF551, but I'm starting to wonder how long it would take using a hard disk. Eventually, raw processor speed will become the limiting factor, no matter how fast the I/O. So - I'm starting to think I haven't fully explored coding on the A8 using modern software and hardware. When I have time to burn, I'll get back into it. Edited January 12, 2012 by flashjazzcat Quote Link to comment Share on other sites More sharing options...
Rybags Posted January 12, 2012
The thing is, if you're producing anything substantial on the real machine, you tend to assemble to a file rather than to RAM. For anything significant I do, I just use PC tools and emulation, and test on real hardware once in a while. Unless you've got time to burn, it's the only way: productivity wins over nostalgia. Also, I don't like typing lots of stuff on the Atari - the POKEY keyboard scan is a real weakness there.