
SDCC PIC16 C compiler?


zezba9000


Another option is the LCC C compiler: https://en.wikipedia.org/wiki/LCC_(compiler)

It's designed to solve the issues we are looking at. Someone used it to target a 16-bit CPU, for example: https://stackoverflow.com/questions/7484466/choosing-cpu-architecture-for-llvm-clang

 

Here is the git repo: https://github.com/drh/lcc

Edited by zezba9000
Link to comment
Share on other sites

From the StackOverflow post.

You can certainly make char, short, int, long, long long, etc. the same size...

 

 

LCC may be the C compiler to use if it's much easier than GCC for this type of thing: https://stackoverflow.com/questions/7484466/choosing-cpu-architecture-for-llvm-clang

Edited by zezba9000

So I was looking at some defines in GCC and noticed "BITS_PER_UNIT", which GCC uses directly for char's bit length.

Ref: http://www.delorie.com/gnu/docs/gcc/gccint_112.html

 

So I looked to see if this value is used in SDCC, and it looks like it is, at least here: https://github.com/darconeous/sdcc/blob/master/support/cpp/output.h#L263

 

If you changed "BITS_PER_UNIT" to 16 in GCC or SDCC, might that solve the problem?
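Worth noting how this plays out at the C level: sizeof() always counts in units of char, and only CHAR_BIT changes per target. A small host-side sketch (illustrative, not tied to any particular port; the function name is made up):

```c
#include <limits.h>   /* CHAR_BIT: bits in a char, >= 8 per the C standard */
#include <stddef.h>

/* sizeof() counts in units of char, whatever CHAR_BIT happens to be.
 * On a typical PC, CHAR_BIT is 8; on a CP1600-style target whose
 * smallest addressable unit is a 16-bit word, CHAR_BIT would be 16
 * and sizeof(int) could legitimately be 1.  sizeof(char) is 1 by
 * definition on every conforming implementation. */
size_t int_bits(void)
{
    return sizeof(int) * CHAR_BIT;   /* total bits in an int, portably */
}
```

Code written this way keeps working whether a "byte" is 8 or 16 bits wide.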

 

Indeed, in the GCC port I started, I set BITS_PER_UNIT to 16. It's also used in binutils, IIRC. It shows up in SDCC because they pulled some material from GCC (the preprocessor, IIRC).

 

SDCC doesn't use BITS_PER_UNIT elsewhere, though.

 

GCC and binutils do support a wider range of CHAR_BIT. I did find some bugs in binutils' ELF support, but I submitted bugfixes upstream.

 

As for LCC... It's a bit dated and I hear the code quality isn't terribly high. Plus, I pretty much have to buy the dead-tree book for documentation on retargeting it. It didn't really make the jump into the online era.


 

Indeed, in the GCC port I started, I set BITS_PER_UNIT to 16. It's also used in binutils, IIRC. It shows up in SDCC because they pulled some material from GCC (the preprocessor, IIRC).

 

SDCC doesn't use BITS_PER_UNIT elsewhere, though.

 

GCC and binutils do support a wider range of CHAR_BIT. I did find some bugs in binutils' ELF support, but I submitted bugfixes upstream.

 

As for LCC... It's a bit dated and I hear the code quality isn't terribly high. Plus, I pretty much have to buy the dead-tree book for documentation on retargeting it. It didn't really make the jump into the online era.

I mean, you're targeting a very old CPU; using an older compiler like LCC isn't that bad. The documentation not being officially available online is a little silly, although you might be able to find a PDF version of the book on the Wayback Machine.

 

Also, what about just writing an LLVM backend? http://llvm.org/docs/WritingAnLLVMBackend.html#introduction

Then you could just use Clang. LLVM seems to have lots of docs.

Edited by zezba9000

For LCC if you go here: https://sites.google.com/site/lccretargetablecompiler/lccmanpage

Then click on "Docs -> Code Generate Interface" and it will pull up this PDF: http://drhanson.s3.amazonaws.com/storage/documents/interface4.pdf

Here are more directions from someone who made LCC target a 16-bit CPU using that PDF: http://www.fpgacpu.org/usenet/lcc.html

 

Anyway just some interesting options.

Edited by zezba9000

I mean, you're targeting a very old CPU; using an older compiler like LCC isn't that bad. The documentation not being officially available online is a little silly, although you might be able to find a PDF version of the book on the Wayback Machine.

 

Also, what about just writing an LLVM backend? http://llvm.org/docs/WritingAnLLVMBackend.html#introduction

Then you could just use Clang. LLVM seems to have lots of docs.

 

As I mentioned, Clang / LLVM don't support CHAR_BIT != 8.


For LCC if you go here: https://sites.google.com/site/lccretargetablecompiler/lccmanpage

Then click on "Docs -> Code Generate Interface" and it will pull up this PDF: http://drhanson.s3.amazonaws.com/storage/documents/interface4.pdf

Here are more directions from someone who made LCC target a 16-bit CPU using that PDF: http://www.fpgacpu.org/usenet/lcc.html

 

Anyway just some interesting options.

 

Looking again through the LCC docs, I don't see where you can configure the number of bits in char. Its type metrics are all specified in multiples of char, though, and it says char's size and alignment must be 1.

Edited by intvnut

 

Looking again through the LCC docs, I don't see where you can configure the number of bits in char. Its type metrics are all specified in multiples of char, though, and it says char's size and alignment must be 1.

I asked about it on GitHub. Fingers crossed maybe someone has a good answer: https://github.com/drh/lcc/issues/39

From reading other material online, it sounds like it should be possible, but maybe someone who has used LCC before has a better answer.

Edited by zezba9000

See the bottom of this page: http://www.homebrewcpu.com/projects.htm

 

 

LCC retargeting for D16/M homebrew computer

Recently, John Doran put together a web site with details of his homebrew machine, D16/M. While looking over the instruction set, it occurred to me that it might not be that difficult to retarget LCC to it. LCC is a nice full ANSI C compiler designed to be easily retargeted. For the most part it is, particularly if your target happens to look like a classic RISC machine with an orthogonal instruction set architecture. D16/M is quite different from that: it's a 16-bit accumulator machine, and addresses memory as an array of 16-bit words. To get LCC to work, I told it that D16/M's char, short, int, long int, pointer, float and double types were all 1 byte wide. Additionally, I changed a bunch of places in LCC's source code that had a hard-wired assumption that a byte is 8 bits wide. Instead, I added a field in the interface record to allow a target to specify how wide a byte (i.e. - minimum unit of addressability) is. For D16/M, it is 16, for others 8. The result of this is that D16/M will have 16-bit ints and chars. It could easily be extended to support 32-bit ints & floats and 64-bit doubles.

 

The second big trick was to lie to LCC and tell it that D16/M had 8 integer registers and 8 floating point registers. In truth, these were simply fixed memory locations. Amazingly, it all seems to work. For you LCC retargeters out there, here's the .md file to see what I did. Email me if you'd like more info on the byte width modifications.

 

So, LCC doesn't support it by default, but it can be beaten about the head to make it work.
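To make the quoted approach concrete, here is a toy sketch in C of what those per-type metrics look like. The struct and field names are modeled on LCC's interface record as described in interface4.pdf, but they are illustrative placeholders, not LCC's actual declarations:

```c
/* A toy model of LCC's interface record, as described in
 * interface4.pdf: each type metric is a {size, alignment, outofline}
 * triple measured in multiples of char, which is why char itself is
 * required to be {1, 1, 0}.  Names here are illustrative only. */
struct metrics { int size, align, outofline; };

struct target_interface {
    struct metrics charmetric, shortmetric, intmetric, ptrmetric;
};

/* A D16/M-style target where every scalar occupies one 16-bit word,
 * so every metric collapses to the same triple as char: */
static const struct target_interface d16m = {
    { 1, 1, 0 },  /* char    */
    { 1, 1, 0 },  /* short   */
    { 1, 1, 0 },  /* int     */
    { 1, 1, 0 }   /* pointer */
};
```

The "tell LCC everything is 1 byte wide" trick from the quote is exactly this: every triple becomes {1, 1, 0}, and the separate byte-width field (his addition) carries the fact that the byte is 16 bits.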


See the bottom of this page: http://www.homebrewcpu.com/projects.htm

 

 

So, LCC doesn't support it by default, but it can be beaten about the head to make it work.

I see; so unless you want to do the same hacks, GCC is the only option for a C compiler. In either case, I'm going to continue IL2X, as I can force the concept of "sizeof(byte) = sizeof(int) = sizeof(IntPtr) = 1" in C# terms.


 

You still need to be careful about sizeof() and pointer arithmetic.
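A quick sketch of the kind of thing to watch for (the function names are hypothetical, just for illustration):

```c
#include <limits.h>
#include <stddef.h>

/* The classic trap: code that multiplies by a literal 8, or hard-codes
 * an octet count, silently breaks when CHAR_BIT != 8.  Writing sizes
 * in terms of sizeof and CHAR_BIT keeps them correct on both an
 * 8-bit-char host and a 16-bit-char CP1600-style target. */
size_t buffer_bits(size_t n_ints)
{
    /* wrong on a 16-bit-char machine: n_ints * sizeof(int) * 8 */
    return n_ints * sizeof(int) * CHAR_BIT;
}

/* Pointer arithmetic steps in whole objects: p + 1 advances by one
 * int no matter how many bits (or chars) an int occupies. */
int second_element(const int *p)
{
    return p[1];
}
```

The pointer arithmetic itself is safe; the danger is any code that converts between object counts and bit or octet counts with a hard-wired 8.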

 

Just to be clear, intvnut, what you are saying is not that it is impossible, but that it will require some heavy invasive changes and exploratory analysis and that there is risk that missing something would cause it to fail in some mysterious way.

 

I would suppose that this is because there is a working assumption in those projects that a byte is always 8 bits wide.

 

Is this correct?

 

-dZ.


I did not read the whole thread, but why would people be interested in a C compiler when we have IntyBASIC? Is it because those people don't know BASIC syntax?

Sometimes when people hear BASIC they think of slow code. That was true on older systems where the code was not compiled. I don't think IntyBASIC is slow: first of all, it compiles to assembler, which is assembled into useful code; second, it is maintained by Oscar, who is also a very good assembly programmer :) so I am quite sure his compiler knows what it's doing :)

 

Maybe someone can make a C compiler which compiles into IntyBASIC code :)


 

Just to be clear, intvnut, what you are saying is not that it is impossible, but that it will require some heavy invasive changes and exploratory analysis and that there is risk that missing something would cause it to fail in some mysterious way.

 

I would suppose that this is because there is a working assumption in those projects that a byte is always 8 bits wide.

 

Is this correct?

 

Not necessarily "heavy and invasive" changes, but definitely some changes that aren't confined to the target backend. See my comment #110 above where I quote someone who has already made such changes to a version of LCC.

 

 

I did not read the whole thread, but why would people be interested in a C compiler when we have IntyBASIC? Is it because those people don't know BASIC syntax?

Sometimes when people hear BASIC they think of slow code. That was true on older systems where the code was not compiled. I don't think IntyBASIC is slow: first of all, it compiles to assembler, which is assembled into useful code; second, it is maintained by Oscar, who is also a very good assembly programmer :) so I am quite sure his compiler knows what it's doing :)

 

I want a C compiler for me. That alone is enough for me. That same impulse already resulted in jzIntv, AS1600, 4-Tris, Space Patrol, JLP, and LTO Flash.

 

C supports register-allocated variables, local variables, recursion, a type system and type checking, proper pointers, proper separate compilation and libraries, and so on. C does not bake in default libraries for music, graphics, and controller input. If I build on an established C compiler such as GCC, I inherit all of the work they've done to optimize C, including inlining, loop unrolling, global code optimizations, register allocation, etc. And if that compiler also extends to C++, I get some crazy-powerful facilities for compile-time evaluation.

 

Those are some of the reasons *I* want a C compiler.

 

I actually don't care much if I'm the only person who wants a C compiler. I see most folks sticking with IntyBASIC, actually, because it's good enough for what they want to do.

 

IntyBASIC is a fine implementation of BASIC, and is an order of magnitude faster than an interpreted BASIC (if not more). It has a limited compilation model, though, and I've seen where it tops out.

 

It lacks local variables, so you have to be careful about how you name things in different parts of the code. It compiles statement-at-a-time, which prevents it from optimizing over large blocks of code. (No inlining, loop unrolling, global code optimization, or register allocation.) Its memory model requires results written back to memory with every statement, which puts a persistent drag on performance.

 

It provides a lot of mechanisms to get you started (a "high floor"), but simultaneously has many limitations in its programming model that limit it (a "low ceiling").

 

A few times now, I've helped someone out by rewriting some IntyBASIC code in assembly for a huge speedup (3x to 4x) to keep their game viable. That was after putting a fair bit of effort getting the code to perform well in IntyBASIC. My hope is that I could do much better in C.

 

 

 

Maybe some one can do a C Compiler, which Compiles into Intybasic code :)

 

I'm thinking the other way around: if I manage to get a decent backend working in GCC, perhaps it's worth looking at writing an IntyBASIC front end for GCC, so it can inherit all of the optimization infrastructure GCC provides?

 

If nothing else, once I get the GNU binutils a little more mature, it might at least be worth migrating IntyBASIC off of AS1600 and onto GNU binutils. Since binutils provide a proper linker, you could replace "intybasic_prologue.asm" and "intybasic_epilogue.asm" with "libintybasic.a". Furthermore, you could solve the "memory map problem" by providing a proper linker script.
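As a sketch of what that might look like, a GNU ld linker script pins sections to memory regions. The region names, origins, and lengths below are made-up placeholders for illustration, not the actual Intellivision memory map:

```
/* Illustrative sketch only: region names and addresses are made up,
   not the real Intellivision memory map. */
MEMORY
{
  rom (rx) : ORIGIN = 0x5000, LENGTH = 0x2000
  ram (rw) : ORIGIN = 0x0200, LENGTH = 0x0100
}

SECTIONS
{
  .text : { *(.text*) } > rom           /* code lives in cartridge ROM   */
  .data : { *(.data*) } > ram AT > rom  /* initialized data, ROM-staged  */
  .bss  : { *(.bss*)  } > ram           /* zero-initialized data in RAM  */
}
```

With something like this, placement decisions move out of the source files and into the link step, which is the "memory map problem" fix described above.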

 

If done right, you could theoretically have a program with some portions in IntyBASIC, some portions in C, and some in assembly.

Edited by intvnut

I did not read the whole thread, but why would people be interested in a C compiler when we have IntyBASIC? Is it because those people don't know BASIC syntax?

Sometimes when people hear BASIC they think of slow code. That was true on older systems where the code was not compiled. I don't think IntyBASIC is slow: first of all, it compiles to assembler, which is assembled into useful code; second, it is maintained by Oscar, who is also a very good assembly programmer :) so I am quite sure his compiler knows what it's doing :)

 

Maybe someone can make a C compiler which compiles into IntyBASIC code :)

IntyBASIC isn't a portable language, and for me that makes it unusable. I want to be able to write code in C (preferably) so 90% of game logic, etc., can be shared with CC65 for other Atari platforms, for example. My personal project is actually getting C# to run on legacy platforms, and the primary path to making that happen was to target C. Besides, C has a much smaller learning curve, as most people already know how to read its syntax.

 

With my project IL2X, I could translate .NET IL to IntyBASIC, but it sounds like a better idea to just target ASM (or C, if that ever happens).

Edited by zezba9000

It's always great to have more options, and IntyBASIC is still in development; maybe in the future it will perform better and support other features.

Local variables could be nice, and booleans, but otherwise I don't have a long list of missing features in IntyBASIC. Until now I've stuck with IntyBASIC because I really don't know C that well. I grew up with BASIC on a home computer, and later did some Turbo Pascal on CP/M and MSX-DOS (and MS-DOS). But it's more about what you are used to. If you are used to C, you would want to use a C compiler, sure. It's great if we can have a C compiler, so we have more choices :)


Switching between programming languages isn't a big deal unless you're moving to object-oriented programming for the first time. Fortunately, C is not object-oriented. Good documentation is important, and the enormous resources for C programming help are a big plus.


IntyBASIC isn't a portable language, and for me that makes it unusable. I want to be able to write code in C (preferably) so 90% of game logic, etc., can be shared with CC65 for other Atari platforms, for example.

 

Sign me up!

 

If the same code can be easily adapted to multiple platforms, that means:

- As an (aspiring) developer, if I write something for the Intellivision, more people will get to play it on other platforms. Now I have more of an incentive to write exceptional games, and potentially monetize the best ones.

- As a player, if something cool is written for other platforms, I have an increased probability of seeing it ported to the Intellivision.

 

What's not to like?

 

Am I missing something?


 

Sign me up!

 

If the same code can be easily adapted to multiple platforms, that means:

- As an (aspiring) developer, if I write something for the Intellivision, more people will get to play it on other platforms. Now I have more of an incentive to write exceptional games, and potentially monetize the best ones.

- As a player, if something cool is written for other platforms, I have an increased probability of seeing it ported to the Intellivision.

 

What's not to like?

 

Am I missing something?

Your points are spot on. That's the main point for me: portability. I only see pros here. Also, simplifying dev resources has been an interest of mine... you end up bringing in more content devs, which is what makes a platform shine.

Edited by zezba9000

  • 3 weeks later...

Just an update: I'm working on a new language design my brother came up with over the last year. It's designed to be memory safe, with no GC scanning, fast, and very low on memory (think Rust, but much easier to read and use while being faster and far more portable).

 

Its memory design is single ownership rather than the shared ownership most languages use now. This is key to making a memory-safe language without a scanning GC.

Its type system will let you create objects without any overhead, as you can in C, which is perfect for the Intellivision.

 

Also, because it will be designed to handle sizeof(byte) = 16 bits early on (allowing it to be used on custom 16-bit FPGA homemade CPUs, etc.) AND designed to be retargetable via an extension/plugin system (no need to recompile the compiler), it should be very easy to retarget it with an ASM backend for the Intellivision. Portable C89 will be its first target, allowing it to run on the majority of hardware out there.

 

Anyway, these are some of its goals. It's in the early stages, and the first version of the compiler will be written in C# (.NET Core) so it's cross-platform; later we will bootstrap the compiler as early as possible.

If it pans out as planned, I will post an update.

Edited by zezba9000

Not sure if I understand, but the CP1600 backend doesn't seem to be in your plans, does it?

Have you abandoned the idea of a C compiler for the Intellivision?

CP1600 support is still in the plans.

 

I personally never abandoned C; my original goal was to get C# working for the CP1600, not C. I'm all for a C compiler, but C is just portable assembly to me.

 

Because of the conversations with @intvnut, and after looking into all the issues people have with retargeting C to run on custom 16-bit CPUs made with FPGAs or on the CP1600, I think a new language will solve portability issues across the Intellivision, Atari 8-bit, Nintendo, etc., and be useful on modern PCs as well.

 

This new language will be called "Ez", and you will get the same performance as you would with C (if not better).

