
sdcc 3.3.0 released today


PkK


A new version of sdcc, the compiler used with both Daniel's and my ColecoVision development tools, has been released today. Compared to the previous release, there are many improvements relevant to Z80, and thus ColecoVision development. Here's the release announcement from Maarten:

 

A new release of SDCC, the portable optimizing compiler suite for Intel MCS51 based (8031, 8032, 8051, 8052, etc.), Maxim (formerly Dallas) DS80C390 variants, Freescale (formerly Motorola) HC08 based (hc08, s08), Zilog Z80 based (z80, z180, gbz80, Rabbit 2000/3000, Rabbit 3000A) and Microchip PIC16 and PIC18 microprocessors is now available. Sources, documentation and binaries compiled for x86 Linux, x86 and x64 MS Windows and Mac OS X universal binaries are available ( http://sdcc.sourceforge.net ).

 

SDCC 3.3.0 Feature List

 

• Many small improvements in code generation for the z80-related ports (merged smallopts branch).

• lospre (currently enabled for z80-related and hc08-related ports only; merged lospre branch).

• More efficient initialization of globals in the z80, z180, r2k and r3ka ports.

• Inclusion of tests from the gcc test suite into the sdcc regression test suite led to many bugs being found and fixed.

• Split sdas390 from sdas8051.

• Merged big parts of ASxxxx v5 into sdas.

• New PIC devices (synchronization with MPLABX 1.60), except for very old MCUs.

• New script (mcs51-disasm.pl) that disassembles hex files containing MCS51 code.

• Added the PIC16F1788 and PIC16F1789 devices.

• C11 _Alignof operator.

• C11 _Alignas alignment specifier.

• C11 _Static_assert static assertion (a brief sketch of these three C11 features follows this list).
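
A brief, hedged sketch of the three C11 features listed at the end of the list above; the identifiers and values here are made up for illustration and are not taken from the SDCC documentation:

#include <stdint.h>

/* _Static_assert: a compile-time assertion */
_Static_assert(sizeof(uint16_t) == 2, "uint16_t must be 16 bits wide");

/* _Alignas: request a specific alignment for an object (here: that of uint16_t) */
_Alignas(uint16_t) static unsigned char io_buffer[2];

/* _Alignof: query the alignment requirement of a type */
static const unsigned char uint16_alignment = _Alignof(uint16_t);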



Hmm...so I tried upgrading to 3.3 (daily build) and even 3.2.0a, and it wasn't working with lib4k. I patched the crtcv.s file that Daniel supplies a bit, to add in the new .areas and put them in the correct order, but it still doesn't seem quite right. So I'm back using 3.1.0 until I can figure out what is wrong. The areas list now looks like:

;; Ordering of segments for the linker - copied from sdcc crt0.s
    .area _HOME
    .area _CODE
    .area _INITIALIZER
    .area _GSINIT
    .area _GSFINAL
        
    .area _DATA
    .area _INITIALIZED
    .area _BSEG
    .area _BSS
    .area _HEAP

but there is still something wrong with some variables. Maybe they aren't initialized as expected? I'm not sure yet.

 

PkK, is there something else I need to add to the startup code to initialize and zero all variables properly on startup? Anyway, I'll poke around a bit as time permits.



Okay, so I finally got back to it, and found that I'd missed something else that's different in the newer SDCCs. The initialization of global variables is now done in crt0.s, so I need to migrate it to crtcv.s. It now looks like:

        .area _GSINIT
gsinit::
        ld      bc, #l__INITIALIZER
        ld      a,b
        or      a,c
        jr      z, gsinit_next
        ld      de, #s__INITIALIZED
        ld      hl, #s__INITIALIZER

        ldir
gsinit_next:


        .area _GSFINAL
        ret
        ;
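
A rough C-level sketch (not actual SDCC startup code, and the function name is made up) of what the _GSINIT fragment above does: if the _INITIALIZER area has a non-zero length, copy it from ROM over the _INITIALIZED area in RAM. The l__/s__ symbols are created by the linker; declaring them as arrays is just a trick to read the linker-assigned values from C.

extern unsigned char s__INITIALIZER[];  /* ROM copy of the initial values         */
extern unsigned char s__INITIALIZED[];  /* RAM location of the variables          */
extern unsigned char l__INITIALIZER[];  /* its address is the length of the area  */

static void gsinit_sketch(void)
{
    unsigned int n = (unsigned int)l__INITIALIZER;  /* as in "ld bc, #l__INITIALIZER" */
    unsigned char *src = s__INITIALIZER;
    unsigned char *dst = s__INITIALIZED;

    while (n--)                 /* the "jr z" skip plus the ldir block copy */
        *dst++ = *src++;
}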

And from what I can see, I can now use the daily build from Jan 11. Most excellent. I've requested that Daniel run his test suite and see if it works for his games.

 

When I was looking at it a couple of months ago I got errors that the symbols didn't exist, so I added some new .globl lines and that did the trick.

        ;; global from C code
        .globl _main
        .globl _nmi

        .globl l__INITIALIZER
        .globl s__INITIALIZER
        .globl s__INITIALIZED

Just those l__ and s__ symbols were missing.

Edited by hardhat


Sorry for not replying earlier. I'm a bit busy this term. While it took you some effort to adapt your programs for current sdcc, there is a reward, since the new way of initializing global variables is much more efficient.

 

Philipp


Is there any chance of these updates being put into NewColeco's devkit? Pending that, how would a neophyte integrate this into an existing NewColeco devkit install?

Hi,

 

I have made the changes in NewColeco's dev kit, but it is up to him when he publishes it next. In the meantime, I've attached the new startup code (crtcv.s) for lib4k. I will be using lib4ksa on my next project; the changes to its startup code are similar, and I will post them on request. For the moment I haven't tested them, so I'll save that for a later post.

 

To build from source, put that file in the lib4k directory and run build.bat (or build.sh on Mac/Linux) to make a new crtcv.rel file. Then put that .rel file in your project directory to use with CCI3 as appropriate.

 

But the best way might be to just take the two files in the attached zip (the crtcv.rel and the cvlib.lib) and put them in the project directory with CCI3.

 

crtcv.s

lib4k-sdcc3_3.zip

Edited by hardhat


I've begun working on Jewel Panic today, and my first (well, actually second) step was to attempt to migrate from Hi-Tech C to SDCC. I've got SDCC 3.3.0 and CCI3 set up according to the most recent info I could find, but while compiling works, I get this warning when I press the "Link" button:

 

?ASlink-Warning-Couldn't find library '../comp.lib'

 

And the ROM that is generated doesn't want to work under blueMSX, while the exact same source code compiled under Hi-Tech C (via CCI.exe) works perfectly.

 

So what am I missing? :)

 

And also, why is compiling so slow on a 60K source file? I would expect a modern compiler like SDCC to burn through such a small amount of source code in the blink of an eye! :P


Did you include comp.lib in your linker? I don't use CCI3, but in CCI2 there is a checkbox to check. You only have to check it if you use functions that rely on Dan0 compression.

 

Concerning the compilation speed, personally I have this kind of issue with all recent versions of SDCC.

 

For instance, I recently migrated Ghost'n Zombies from Hi-Tech C to SDCC.

 

With SDCC 2.something, it compiles in 20 seconds.

 

With SDCC 3.something, it compiles in 10 minutes!!!... the exact same source code on the exact same machine!

 

So for now, I've given up on using the latest versions. :(

 

Another thing I noticed: the SDCC version of Ghost'n Zombies runs slower than the Hi-Tech C one. But the ROM is smaller with SDCC (even with the speed optimization option).

Edited by youki

UPDATE: I figured out my main problem, which was that I didn't download the latest version of Daniel's dev kit.

 

With everything now installed properly (after running the "patchsdcc" batch file), the Link operation works without giving any error messages, but the ROM still doesn't want to function under BlueMSX.

 

EDIT: Oh, and my 60K source file now compiles in seconds. :P

 

EDIT #2: I tried compiling some of the sample programs that come with Daniel's dev kit, and they all work under BlueMSX. That means there's something I'm doing in the code of Jewel Panic that, once compiled with SDCC, BlueMSX doesn't like...

 

EDIT #3: I want to point out that my binary does run under blueMSX, but all I get is a black screen. Perhaps some kind of disable/enable_nmi problem?


For Pete's sake, what's going on here? I reduced my code to the most basic bitmap display test, and no matter what I try, the generated binary still displays a black screen under blueMSX!

 

Here's my test code below, so can anyone tell me what I'm doing wrong? Was the RLE decoder implemented in rle2vram() changed at one point? I don't get it. :(

 

#define NO_SPRITES
#include <coleco.h>
#include <getput1.h>
#include "jp_define.h"

/*=================*
 * Graphic patterns, colors, and name tables
 *=================*/
/* Defined in jp_gra.c */
extern byte JP_PAT1_TITLE[];
extern byte JP_PAT2_TITLE[];
extern byte JP_PAT3_TITLE[];
extern byte JP_COL1_TITLE[];
extern byte JP_COL2_TITLE[];
extern byte JP_COL3_TITLE[];
extern byte JP_NAM1_TITLE[];
extern byte JP_NAM2_TITLE[];
extern byte JP_NAM3_TITLE[];

/*=================*
 *=================*
 * jp_init_title_screen
 *=================*
 *=================*/
void jp_init_title_screen() {

   /*------------------*
    * Load patterns and colors
    *------------------*/
   rle2vram(JP_PAT1_TITLE, 0);
   rle2vram(JP_PAT2_TITLE, 0x800);
   rle2vram(JP_PAT3_TITLE, 0x1000);

   rle2vram(JP_COL1_TITLE, 0x2000);
   rle2vram(JP_COL2_TITLE, 0x2800);
   rle2vram(JP_COL3_TITLE, 0x3000);

   /*------------------*
    * Load title screen into name table
    *------------------*/
   rle2vram(JP_NAM1_TITLE, 0x1800);
   rle2vram(JP_NAM2_TITLE, 0x1900);
   rle2vram(JP_NAM3_TITLE, 0x1A00);

   enable_nmi();
}


/*=================*
 *=================*
 * nmi
 *=================*
 *=================*/
void nmi(void) {
}


/*=================*
 *=================*
 * main
 *=================*
 *=================*/
void main(void) {
   /*------------------*
    * Clear VRAM and initialize screen mode
    *------------------*/
   screen_mode_2_bitmap();
   jp_init_title_screen();
   while (1==1) {
      delay(1);
   }
}

When you set a screen mode, the screen is off by default. Daniel fixed the screen mode issues that had a problem with sprite mirroring; he changed the register settings so that the screen is off by default.

 

screen_on();

Will show the screen.
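
A minimal sketch of the order Kiwi describes, using the library calls that appear elsewhere in this thread (the helper function name is hypothetical):

#include <coleco.h>

void show_title_screen(void)
{
   screen_mode_2_bitmap();   /* the screen is off after this call            */
   /* ... rle2vram() calls here to load the pattern/color/name tables ...    */
   screen_on();              /* make the display visible                     */
   enable_nmi();             /* resume taking vertical-blank interrupts      */
}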

Edited by Kiwi


If I use a goto statement at the very end of a very big main(), with the target label at the very beginning of the code, then SDCC takes forever to compile and uses 1.5 GB of RAM, even with the size optimization and the max-allocs setting at minimum. I'm using the goto to reset Text Adventure. The main() function is very large, and my inexperience in coding for the ColecoVision shows. One of these days, I will write a proper game engine that drives the game instead of having a linear structure.

 

Removing the goto statement lets SDCC compile quickly, but then the game stops at the ending. For Pong and Computer Space, however, main() is short and calls the game as a function; when that function returns to main(), the game resets properly, and compilation doesn't take long even without the max-allocs setting. A sketch of that structure is shown below.
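
A minimal sketch (function names are hypothetical) of that short-main() structure: the game lives in its own function, so "resetting" is just returning to main() and looping, with no backwards goto across a huge main().

#include <coleco.h>

void play_game(void);      /* hypothetical: runs one full play-through, returns when it ends */

void nmi(void) {
}

void main(void) {
   while (1==1) {
      play_game();          /* when it returns, the loop restarts the game */
   }
}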


 


I tried the screen_on() suggestion. Doesn't work. I tried placing the call to screen_on() just before and also just after enable_nmi(), and I still get a black screen.

 

I'm starting to think there's a bug in rle2vram() that makes the program hang while it reads and decodes my graphic data...


I'm not sure if

 

#include "jp_define.h"

 

is supposed to be

 

#include <jp_define.h>

No, that won't work. The quotes are for user-defined header files; the < and > are for headers native to the compiler.
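
A two-line illustration of that distinction, matching the includes used in the test code above:

#include <coleco.h>       /* angle brackets: header shipped with the compiler/devkit */
#include "jp_define.h"    /* quotes: user header found in the project directory      */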

 

EDIT: I just removed the #include since I didn't need it for the test display code, and it makes no difference under blueMSX.

 

I haven't loaded my name table as 3 separate chunks of data before; usually I have one name table. What did you use to make the picture and encode it in RLE format? Just curious.

I used ICVGM303. The application only allows the definition of 256 tiles, so I could only create the data for one screen third at a time. So one pattern + color + name for each screen third. :)


This is interesting: I moved the data tables (JP_PAT1_TITLE, JP_COL1_TITLE, etc.) to the main file in order to have everything in a single source file, and now the ROM won't even run at all under BlueMSX. I checked the binary with a hex editor, and it doesn't even start with (0xAA 0x55) or (0x55 0xAA).

 

Weird...


When you export to RLE, the name table is one big group of 784 tiles, so it would write to 0x1A00-0x1D00 and the sprite information would get overwritten. It would be very difficult to export just 256 name entries if the data is encoded in RLE.

 

post-24767-0-88344400-1392785946_thumb.png

 

This is how I make a full-screen title screen. Note in the right picture that the name table area looks funny; it goes in place of the left picture's name table. So when I export to RLE and copy the data to ROM, it'll be complete.

 

post-24767-0-40058400-1392786369_thumb.png

 

This is how I load the image.

 

disable_nmi();
screen_mode_2_bitmap();
rle2vram(title1PATTERN,0x0000);
rle2vram(title1PATTERN,0x0800);
rle2vram(title2PATTERN,0x1000);
rle2vram(title1COLOR,0x2000);
rle2vram(title1COLOR,0x2800);
rle2vram(title2COLOR,0x3000);
rle2vram(titleNAME,0x1800);
screen_on();
pause();

 

Wait, you need the const keyword. Change:

 

extern byte JP_PAT1_TITLE[];
extern byte JP_PAT2_TITLE[];
extern byte JP_PAT3_TITLE[];
extern byte JP_COL1_TITLE[];
extern byte JP_COL2_TITLE[];
extern byte JP_COL3_TITLE[];
extern byte JP_NAM1_TITLE[];
extern byte JP_NAM2_TITLE[];
extern byte JP_NAM3_TITLE[];

 

to

 

extern const byte JP_PAT1_TITLE[];
extern const byte JP_PAT2_TITLE[];
extern const byte JP_PAT3_TITLE[];
extern const byte JP_COL1_TITLE[];
extern const byte JP_COL2_TITLE[];
extern const byte JP_COL3_TITLE[];
extern const byte JP_NAM1_TITLE[];
extern const byte JP_NAM2_TITLE[];
extern const byte JP_NAM3_TITLE[];

Data without 'const' gets loaded into the ColecoVision's 1 KB of RAM instead of staying in ROM. Add the const, then see if you can load one name table to 0x1800 and whether it shows up.
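
A hypothetical fragment of jp_gra.c showing the fix Kiwi describes (assuming the byte type comes from coleco.h, as in the code above); the data values are placeholders, not real RLE graphics:

#include <coleco.h>

/* 'const' keeps the table in ROM instead of having it copied into the
   ColecoVision's 1 KB of RAM at startup. */
const byte JP_NAM1_TITLE[] = {
   0x40, 0x01, 0x00      /* ...rest of the RLE stream exported from ICVGM... */
};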



 

 

Try to define your arrays containing RLE data as:

 

const byte array[], instead of byte array[].

 

Do the same for the "extern byte" declarations => extern const byte.

 

When you migrate from Hi-Tech C to SDCC, you have to do that.

 

If you don't put "const", it is allocated in RAM, I think.

 

Today, if I can find some time, I will try your sample to see if I can find the issue.

Edited by youki


A-HA! Adding "const" to all my data arrays did the trick! The test program works now! Many thanks, youki! :D

 

Now to restore the project to its original state and get a proper binary! :twisted:


 

You're welcome.

 

So now, I guess Jewel Panic ships next week! :D

LOL! Not quite. ;)

 

I got the game mostly working now, but there's something weird happening with the main game display. See the attached pics below. The opening menu is displayed correctly, but in the main game, it's like the software displays the patterns in the first third of the screen only.

 

I'm currently investigating whether it's some kind of bug related to calling set_screen_mode_2_text (for the main game) after calling set_screen_mode_2_bitmap (for the title screen) but I'm far from sure that's the actual problem. It's more likely some kind of bug in my code that appeared after migrating from Hi-Tech C to SDCC.

 

Hopefully I'll put my finger on the problem soon...

post-7743-0-88522900-1392828270_thumb.png

post-7743-0-69817200-1392828282_thumb.png


 

 


 

Could it be that you forgot the COLOR table for the 2 zones at the bottom? If all the colors are set to black, you see nothing even if the patterns are there...

 

Even in screen mode 2 TEXT, you have 3 color tables. Not only one!

 

#edit: Oops... sorry, it's 3 pattern tables, not color ones; there is only one color table!
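
A hedged guess at what that means for the symptom above: in Graphics II the pattern generator table is split into three 0x800-byte thirds, so if only the first third is loaded, the middle and bottom of the screen don't show the intended patterns. Loading data into all three thirds, as Kiwi's example earlier in the thread does, covers the whole screen (the identifier here is hypothetical):

rle2vram(GAME_PATTERNS, 0x0000);   /* top third    */
rle2vram(GAME_PATTERNS, 0x0800);   /* middle third */
rle2vram(GAME_PATTERNS, 0x1000);   /* bottom third */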

 

Daniel modified screen_mode_2_text after I discovered a bug, and his SDK has included the fix since then:

 

http://atariage.com/forums/topic/143304-did-you-have-problem-on-real-console/?do=findComment&comment=1752992

Edited by youki
