What could have saved the Jag?


Tommywilley84

Recommended Posts

12 hours ago, Greg2600 said:

Great discussion, though really nothing suggested would have helped.  From a software publisher's perspective, even if you ignore the future arrival of Sony, Nintendo and Sega were the only games in town.

This is why I say the Jaguar was a lost cause around 1985, when Atari was focused on computers and neglecting the console side. Nintendo was afraid of the Warner Atari; they actually tried to get Atari to market the NES. But with the Tramiel Atari chasing computers, it left a huge opening for Sega/Nintendo to come in.

 

2 hours ago, Lost Dragon said:

To this day I never understood why the Tramiels didn't simply move into PC clones, like Dell or Compaq. Maybe Atari would have survived?

Well, they did make clones too. Problem was they did it on the cheap: their PCs had no ISA expansion slots as I recall and integrated everything. But Atari was trying to get into too many areas and not doing any of them right. Workstations were another example of this, with the ill-fated Transputer.

  • Like 4

9 hours ago, agradeneu said:

You are a hardware designer?

BSEE. I've designed computer cards and input adapters and other such devices. I also write software. It's my opinion (and just an opinion) that taking hardware classes makes for better programmers than just software classes.

 

3 hours ago, agradeneu said:

TOM is a graphics chip; you could use a Hitachi or ARM for the CPU but would still need a capable custom graphics chip. For most consoles it was like this: a general-purpose CPU off the shelf, plus a custom chip for the graphics with powerful features specific to game development, e.g. scaling sprites like "Suzy" in the Lynx.

 

3 hours ago, agradeneu said:

No.

Correct me if I'm wrong, but ARM and SH1 are NOT graphics chips; you could slap 2 or 3 into the Jaguar and end up with super pricey hardware, but you'd still need a capable dedicated graphics chip to get anything done effectively. The more CPUs the better? I don't think so.

 

The 32X has 2x SH-2  but is still outperformed by the Jaguar in most scenarios.

 

The Saturn has 2 SH2s as well, but is drastically more capable than the 32X. So why is that? ;-)

 

Programming the 2 SH-2s effectively was said to be very hard; guess whether it would have been a good idea then to put in 3x?! ;-)

 

 

 

Those two quotes go together in this discussion. Tom has some extra hardware associated with handling graphics alongside the RISC. The RISC itself has nothing special for graphics other than being fast and having a very limited matrix operation. My own thought is that they should have kept all the stuff in Tom other than the RISC processor, which would have made Tom cheaper and easier to debug. Ditto for Jerry - they had some limited operations to make saturated arithmetic work for audio summing, but that's hardly an issue at audio rates. Again, rip out the RISC processor and keep the rest, making Jerry cheaper and easier to debug. Maybe it wouldn't have had the serial bug (for example) if they hadn't had to spend all their time working on RISC bugs instead.

 

So I wasn't saying get rid of Tom and Jerry, just get rid of the RISC processors inside them. Keep the Object Processor and the Blitter and all the rest, just not the custom RISC cores. That would have left them plenty of time to work the bugs out of the rest of the hardware and given them mature programming tools to work with.

 

The SuperH is not hard to program at all - it's much easier to write effective code for than x86, for example. That's talking about assembly, of course, as C is the same for everybody - it's all on the compiler to turn the C into good assembly.

 

The Saturn was much more capable than the 32X for a number of reasons: the SH2s were clocked faster and had 32-bit access to 1MB of RAM (where the 32X SH2s just had 16-bit access to 256KB of RAM), and the Saturn had a dedicated video rendering processor (VDP1) while the 32X only had a simple line fill function in its VDP. The 32X would have been far more capable if Sega had given it only one SH2 along with the VDP1 from the Saturn, but SOJ didn't want the 32X competing with the Saturn... they didn't want it at all, to be honest.

 

  • Like 1

1 hour ago, Chilly Willy said:

BSEE. I've designed computer cards and input adapters and other such devices. I also write software. It's my opinion (and just an opinion) that taking hardware classes makes for better programmers than just software classes.

 

 

 

Those two quotes go together in this discussion. Tom has some extra hardware associated with handling graphics alongside the RISC. The RISC itself has nothing special for graphics other than being fast and having a very limited matrix operation. My own thought is that they should have kept all the stuff in Tom other than the RISC processor, which would have made Tom cheaper and easier to debug. Ditto for Jerry - they had some limited operations to make saturated arithmetic work for audio summing, but that's hardly an issue at audio rates. Again, rip out the RISC processor and keep the rest, making Jerry cheaper and easier to debug. Maybe it wouldn't have had the serial bug (for example) if they hadn't had to spend all their time working on RISC bugs instead.

 

So I wasn't saying get rid of Tom and Jerry, just get rid of the RISC processors inside them. Keep the Object Processor and the Blitter and all the rest, just not the custom RISC cores. That would have left them plenty of time to work the bugs out of the rest of the hardware and given them mature programming tools to work with.

 

The SuperH is not hard to program at all - it's much easier to write effective code for than x86, for example. That's talking about assembly, of course, as C is the same for everybody - it's all on the compiler to turn the C into good assembly.

 

The Saturn was much more capable than the 32X for a number of reasons: the SH2s were clocked faster and had 32-bit access to 1MB of RAM (where the 32X SH2s just had 16-bit access to 256KB of RAM), and the Saturn had a dedicated video rendering processor (VDP1) while the 32X only had a simple line fill function in its VDP. The 32X would have been far more capable if Sega had given it only one SH2 along with the VDP1 from the Saturn, but SOJ didn't want the 32X competing with the Saturn... they didn't want it at all, to be honest.

 

Have you read the tech manual for the Jaguar's chipset?

The RISC GPU was designed to run together with the OP and Blitter; all the graphics processors and the RISC were integrated into one chip. So you mean you can just rip out one of its integral parts without possibly ruining the whole design/architecture?

 

From the tech reference manual:

"The Graphics Processor and Blitter provide a tightly coupled pair of processors for performing a much wider range of animation effects. A design goal of this system was to provide a fast throughput when rendering 3D polygons. The Graphics Processor therefore has a fast instruction throughput, and a powerful ALU with a parallel multiplier, a barrel-shifter, and a divide unit, in addition to the normal arithmetic functions. The Graphics Processor has four kilobytes of fast internal RAM, which is used for local program and data space. This allows it to execute programs in parallel with the other processing units. The Blitter is capable of performing a range of blitting operation 64 bits at a time, allowing fast block move and fill operations, and it can generate strips of pixels for Gouraud shaded Z-buffered polygons 64 bits at a time. It is also capable of rotating bit-maps, line-drawing, character-painting, and a range of other effects. The graphics processor and the Blitter will usually act together preparing bit-maps in memory, which are then displayed by the Object Processor"

 

 

 

To summarize some other suggestions that could have improved the chipset:

 

- higher clock rates 

- bigger caches for the RISCs

- dedicated Video RAM 

- bugfixing/improved revisions of the RISCs
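
For anyone who hasn't poked at the hardware, the "local program and data space" part of that manual quote works roughly like this: you copy a routine into the GPU's 4KB of internal RAM, point its program counter at it and set the GO bit, and it runs in parallel with the 68K. A rough C sketch of the idea - the register names and addresses are written from memory rather than from the manual, so treat them as assumptions and check the reference docs:

#include <stdint.h>
#include <stddef.h>

/* Assumed Jaguar memory map values; verify against the software
   reference manual before relying on them. */
#define G_RAM   ((volatile uint32_t *)0xF03000)   /* GPU 4KB local RAM   */
#define G_PC    (*(volatile uint32_t *)0xF02110)  /* GPU program counter */
#define G_CTRL  (*(volatile uint32_t *)0xF02114)  /* GPU control         */
#define GPUGO   0x00000001                        /* run bit in G_CTRL   */

/* Copy a GPU routine into local RAM and start it; the 68K carries on
   with whatever it was doing while the GPU runs. GPU RAM only accepts
   32-bit accesses, hence the long-word copy. */
static void gpu_run(const uint32_t *code, size_t longs)
{
    size_t i;

    for (i = 0; i < longs; i++)
        G_RAM[i] = code[i];

    G_PC   = 0xF03000;    /* entry point = start of GPU local RAM  */
    G_CTRL = GPUGO;       /* off it goes, in parallel with the 68K */
}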

 

 

 

 

 

Edited by agradeneu

2 hours ago, Chilly Willy said:

BSEE. I've designed computer cards and input adapters and other such devices. I also write software. It's my opinion (and just an opinion) that taking hardware classes makes for better programmers than just software classes.

 

That, and "security." When I first started programming in the late 1990s, security wasn't even a concept. In normal applications, and particularly in new "web" applications, it was quite common to literally take text directly from text boxes or other text inputs and concatenate it into strings that you'd just push to the database... which immediately makes you susceptible to SQL injection or, in some cases, cross-site scripting. Or people would pass pertinent information in the URL as cleartext parameters, like user and session IDs, making them immediately susceptible to masquerade attacks, and in some cases even side-jacking where cookies were involved.
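
To make the injection point concrete, here's a minimal C sketch using SQLite (the table and column names are made up for illustration): the first version pastes the user's text straight into the SQL string, the second uses a bound parameter so the input can never change the shape of the query.

#include <stdio.h>
#include <sqlite3.h>

/* Unsafe: user text concatenated straight into the SQL string.
   user = "x' OR '1'='1" turns this into a query that matches everyone. */
int login_unsafe(sqlite3 *db, const char *user)
{
    char sql[256];
    snprintf(sql, sizeof sql,
             "SELECT id FROM users WHERE name = '%s';", user);
    return sqlite3_exec(db, sql, NULL, NULL, NULL);
}

/* Safer: a prepared statement with a bound parameter. */
int login_safe(sqlite3 *db, const char *user)
{
    sqlite3_stmt *stmt;
    int rc = sqlite3_prepare_v2(db,
        "SELECT id FROM users WHERE name = ?;", -1, &stmt, NULL);
    if (rc != SQLITE_OK)
        return rc;
    sqlite3_bind_text(stmt, 1, user, -1, SQLITE_TRANSIENT);
    rc = sqlite3_step(stmt);   /* SQLITE_ROW if the user exists */
    sqlite3_finalize(stmt);
    return rc;
}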

 

But I think it depends on the developer. It was rare that I was employed in a position where I was writing directly to hardware... that was limited to some PLCs for medical devices and writing directly to smart cards. Most "programmers" today simply write in an IDE or through a compiler that is *heavily* supplemented with multiple layers of framework to make it easier for the developer to rapidly develop applications. We can thank Rear Admiral Hopper for that...

 

 

  • Like 1

4 hours ago, zzip said:

This is why I say the Jaguar was a lost cause around 1985, when Atari was focused on computers and neglecting the console side. Nintendo was afraid of the Warner Atari; they actually tried to get Atari to market the NES. But with the Tramiel Atari chasing computers, it left a huge opening for Sega/Nintendo to come in.

 

Well, they did make clones too. Problem was they did it on the cheap: their PCs had no ISA expansion slots as I recall and integrated everything. But Atari was trying to get into too many areas and not doing any of them right. Workstations were another example of this, with the ill-fated Transputer.

That isn't one of my posts being quoted there. 

 

Far too well written for a start. 


Sticking with my "getting development teams on board is one thing, keeping them is something else again" approach...

 

Never realised D. I. D (T. F. X and Inferno) had such a low opinion of the 3DO and preferred working on the Commodore CD32 platform instead. 

 

D. I. D commented that the 3DO was overhyped and nowhere near as powerful as they had been led to believe, and that they had abandoned all 3DO development after finding the hardware not up to scratch and the O/S far too slow. 

 

The only criticism they appeared to have of the CD32 was its lack of fast memory. 

 

Wonder what they made of the Jaguar hardware in comparison... 


8 hours ago, Lost Dragon said:

From a personal perspective, the Jaguar was simply the second day-one hardware purchase I had made (the first being the Sega Mega CD) whose potential was sadly ignored by a lot of publishers, who simply moved existing titles to it with minimal enhancements. 

 

When you think you're buying something with a very promising future, only to see your hopes dashed and games canned left, right and Chelsea, it's not hard to feel cheated.


Sheesh, remind me not to ask you for lottery numbers!

7 hours ago, Lost Dragon said:

Don't forget the poor sods who owned Falcon computers, only to see all resources pushed into games for the Jaguar. 

 

A few Lynx titles were also canned to focus on the Jaguar. 

 

By Atari's own admission, they never had the resources to support multiple flagship platforms. 

 

The Panther was originally intended to launch alongside the Lynx. 

Yup they lost out too.

3 minutes ago, Lost Dragon said:

Sticking with my "getting development teams on board is one thing, keeping them is something else again" approach...

 

Never realised D. I. D (T. F. X and Inferno) had such a low opinion of the 3DO and preferred working on the Commodore CD32 platform instead. 

 

D. I. D commented that the 3DO was overhyped and nowhere near as powerful as they had been led to believe, and that they had abandoned all 3DO development after finding the hardware not up to scratch and the O/S far too slow. 

 

The only criticism they appeared to have of the CD32 was its lack of fast memory. 

 

Wonder what they made of the Jaguar hardware in comparison... 

Well, 3DO were already onto the M2 as their current platform was clearly inadequate. I still go back to business sense, though. Nintendo, Sega, eventually Sony: if you were the CFO or CEO of one of dozens of developers, weren't you going to be more comfortable with partners you expected to still be standing in 5 years?


2 hours ago, Greg2600 said:

Sheesh, remind me not to ask you for lottery numbers!

Yup they lost out too.

Well, 3DO were already onto the M2 as their current platform was clearly inadequate. I still go back to business sense, though. Nintendo, Sega, eventually Sony: if you were the CFO or CEO of one of dozens of developers, weren't you going to be more comfortable with partners you expected to still be standing in 5 years?

It's interesting (well, I find it somewhat interesting) looking back at old interviews with established development teams during the 32/64-bit console wars. 

 

You see Core Design having the utmost faith in the Sega Saturn in the very early days, with Jeremy Heath-Smith doubting the Playstation had the legs (software support) to go past 2.5-3 years maximum. 

 

Sony, being the newcomer, would, he thought, struggle to convince publishers to stay with the platform. 

 

 

Skip to Core abandoning Saturn projects to focus on Playstation not that long after.. 

  • Haha 1
  • Sad 1

5 hours ago, agradeneu said:

Have you read the tech manual for the Jaguar's chipset?

The RISC GPU was designed to run together with the OP and Blitter; all the graphics processors and the RISC were integrated into one chip. So you mean you can just rip out one of its integral parts without possibly ruining the whole design/architecture?

 

From the tech reference manual:

"The Graphics Processor and Blitter provide a tightly coupled pair of processors for performing a much wider range of animation effects. A design goal of this system was to provide a fast throughput when rendering 3D polygons. The Graphics Processor therefore has a fast instruction throughput, and a powerful ALU with a parallel multiplier, a barrel-shifter, and a divide unit, in addition to the normal arithmetic functions. The Graphics Processor has four kilobytes of fast internal RAM, which is used for local program and data space. This allows it to execute programs in parallel with the other processing units. The Blitter is capable of performing a range of blitting operation 64 bits at a time, allowing fast block move and fill operations, and it can generate strips of pixels for Gouraud shaded Z-buffered polygons 64 bits at a time. It is also capable of rotating bit-maps, line-drawing, character-painting, and a range of other effects. The graphics processor and the Blitter will usually act together preparing bit-maps in memory, which are then displayed by the Object Processor"

 

The GPU and Blitter are only "tightly coupled" in that they share silicon in Tom. Many games use the 68000 rather than the RISC to drive the blitter. Most of the description of the GPU RISC itself is applicable to virtually any CPU of the time and nothing special. The 4KB of fast internal RAM wasn't anything special, either. The SH2 has 4KB of internal RAM that can be used as 4KB of 4-way set-associative cache, 2KB of 2-way set-associative cache + 2KB of fast scratch RAM, or 4KB of fast scratch RAM. The GPU can only use its RAM as scratch RAM. ARM chips might also have fast scratch RAM, as we saw in things like the GBA. The rest of the description has nothing at all to do with the GPU RISC and is the part I say they should have kept.

 

In case you're wondering, the original SH1s had 1KB of RAM for fast scratch. Later versions had 2KB. ARM was kinda the oddball of the time in that it didn't come with internal RAM standard, but could be added to the design if desired for a particular use. Various MIPS chips had different cache sizes internally.
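
If you're curious what that flexibility looks like in practice, here's a rough sketch of switching an SH-2 into the 2KB cache + 2KB scratch configuration through its cache control register. The address and bit positions are as I remember them from the SH7604 hardware manual, so double-check them before use.

#include <stdint.h>

/* SH7604 (Saturn/32X SH-2) cache control register; address and bit
   layout recalled from the hardware manual, verify before relying on it. */
#define CCR (*(volatile uint8_t *)0xFFFFFE92)

#define CCR_CE 0x01   /* cache enable                    */
#define CCR_ID 0x02   /* instruction replacement disable */
#define CCR_OD 0x04   /* data replacement disable        */
#define CCR_TW 0x08   /* two-way mode: frees 2KB as RAM  */
#define CCR_CP 0x10   /* cache purge                     */

/* Drop from the full 4KB 4-way cache to 2KB 2-way cache + 2KB scratch.
   The freed half of the cache array can then be used as fast on-chip
   RAM through the cache data array area. */
static void sh2_cache_mixed_mode(void)
{
    CCR = 0;                 /* disable the cache first   */
    CCR = CCR_CP;            /* purge stale contents      */
    CCR = CCR_TW | CCR_CE;   /* re-enable in two-way mode */
}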

  • Like 1
  • Confused 1

I'd argue the Jaguar was already saved; by its fans. The folks that continue to produce games for it, and the players that buy and share their experiences with others, they saved the Jaguar years ago. It's still alive. I mean, shit! I think folks show the console more respect now than when it was launched. That's saying something, considering we have numerous click-bait articles and YouTube programs that continue to push it as the punchline of retro gaming. The Jaguar died, only to be resurrected. IT LIVES, DAMN YOU! IT LIVES!

  • Like 11

Considering Jaguar games fetch $50-200 on the regular and systems sell for $300+ every single time, and new games are being released to the point that homebrew and indie titles now practically match the number of retail releases, I agree it's far from dead. Bring a PS1 and some games to a decent retro gaming store. "Eh?" That's what you'll get. Bring in a Jaguar and people start asking questions. The Jaguar failed, but not because the hardware was crap, just the situation. I fully agree with Andrew Rosa, the Jaguar sure as hell is alive and kickin'.

 

  • Like 2

11 minutes ago, Andrew Rosa said:

That's saying something, considering we have numerous click-bait articles and YouTube programs that continue to push it as the punchline of retro gaming.

Yeah, there's a lot of negative noise out there, but I wouldn't worry about it.  I got here by way of retro YouTube videos, and when you watch the negative videos about Jaguar, they just clearly have nothing to offer.  Further, I found YouTube will direct you to something positive like the Nostalgia Nerd one after those most of the time.  Watching both types of videos back to back, it's pretty clear from the passion, research and even video editing quality of the positive reviews and retrospectives which ones are going to win the debate in the viewers' minds.

17 minutes ago, Andrew Rosa said:

The Jaguar died, only to be resurrected. IT LIVES, DAMN YOU! IT LIVES!

This.  A thousand times this.  It doesn't sound like Atari the company was something worth saving anyway.  However, the hardware, including the engineering marvels and passion behind it, live on.

  • Like 5

10 hours ago, Chilly Willy said:

 

The GPU and Blitter are only "tightly coupled" in that they share silicon in Tom. Many games use the 68000 rather than the RISC to drive the blitter. Most of the description of the GPU RISC itself is applicable to virtually any CPU of the time and nothing special. The 4KB of fast internal RAM wasn't anything special, either. The SH2 has 4KB of internal RAM that can be used as 4KB of 4-way set-associative cache, 2KB of 2-way set-associative cache + 2KB of fast scratch RAM, or 4KB of fast scratch RAM. The GPU can only use its RAM as scratch RAM. ARM chips might also have fast scratch RAM, as we saw in things like the GBA. The rest of the description has nothing at all to do with the GPU RISC and is the part I say they should have kept.

 

In case you're wondering, the original SH1s had 1KB of RAM for fast scratch. Later versions had 2KB. ARM was kinda the oddball of the time in that it didn't come with internal RAM standard, but could be added to the design if desired for a particular use. Various MIPS chips had different cache sizes internally.

But what's most important: DID you program the GPU, Blitter or OP to give out definitive verdicts? ;) I'm a bit sceptical about that. Initially you said 3 ARMs would have kicked ass; now you're saying swap the GPU, a graphics processor, for an ARM or SH2, which means one extra chip and a different design for the TOM RISC. Sorry, I'm not convinced. I think it was a MUCH better idea to improve the chipset/design by increasing clock speeds or adding caches for buffering.

Edited by agradeneu

8 hours ago, cubanismo said:

Yeah, there's a lot of negative noise out there, but I wouldn't worry about it.  I got here by way of retro YouTube videos, and when you watch the negative videos about Jaguar, they just clearly have nothing to offer.  Further, I found YouTube will direct you to something positive like the Nostalgia Nerd one after those most of the time.  Watching both types of videos back to back, it's pretty clear from the passion, research and even video editing quality of the positive reviews and retrospectives which ones are going to win the debate in the viewers' minds.

This.  A thousand times this.  It doesn't sound like Atari the company was something worth saving anyway.  However, the hardware, including the engineering marvels and passion behind it, live on.

The likes of:

 

Andrew Rosa

BTB via The Jag Bar 

Second Opinion Games 

 

Have done a fantastic job in giving the Jaguar a much more balanced viewpoint than a lot of the commercial games press did at the time. 

 

 

They present the viewpoint from a pure gamer's perspective, and titles like:

 

Super Burnout, Raiden, Ultra Vortex are looked at far more sensibly as a result. 

 

Having said that, I still stumble across old European games press where scores for titles like Hoverstrike (95%) and Checkered Flag (91%) can catch me by surprise. 

 

The US press was little different: an old interview with Ed Ringer, then of Design Star, has him keen to distance his company from an ST version of one of their titles that Mindscape converted, badly.

 

Even when you think you've exhausted all the Atari coverage from the time, you can still find things unexpectedly. 

  • Like 2

On 6/26/2020 at 1:01 AM, Lost Dragon said:

Commercial games coders can be a fickle bunch, praising hardware like the Jaguar one moment and then taking a rather different view some months down the line. 

 

I put John Carmack quotes up some time ago, but the Jaguar to a certain degree sometimes reminds me of how the Playstation 2 fared at times. 

 

Lorne Lanning, whilst hyping up Oddworld: Munch's Oddysee, said that whilst he would have liked more VRAM and CPU power, he considered the Playstation 2 hardware truly amazing, and that the biggest challenge with regards to game development on it would be creative, not technical.

 

 

It wasn't that long before he announced he was taking the game to the Xbox as the PS2 lacked the power needed. 

 

 

Jez San pulled a similar stunt with Malice. 

 

Starting life on PS1, then becoming an Xbox exclusive, before appearing on PS2 and not that great either. 

 

Point being, it doesn't matter if you're a company with the sheer resources of someone like Sony (or M. S) and your previous console had racked up sales of over 56 million; developers are always going to be enticed by the next big thing. 

 

With Atari thinking the war was with the existing Sega and Nintendo consoles and the 3DO, Sony proved to be a very brutal wake-up call, but Sony themselves didn't have it all their own way a mere generation later, finding themselves battling more than just M. S and Nintendo. 

Marc Rosocha is pretty consistent when it comes to the Jaguar. He thinks it was great; he also praised the RISCs and their integration into the system/architecture and did NOT consider the Jaguar to be particularly hard to program.

Edited by agradeneu
  • Like 1

On 6/25/2020 at 10:43 PM, Chilly Willy said:

From the POV of an engineer/programmer, I'd say they should have gone with a more mature, existing RISC processor instead of trying to roll their own. The buggy state of Tom and Jerry combined with the lack of stable programming tools for them killed the Jaguar more than almost anything else. The 68000 was a good choice; a custom RISC was not. They'd have been better off with MIPS, ARM, or SuperH instead. Those processors were cheap, fast, and had stable compilers/assemblers/debuggers.

Marc Rosocha (Iron Soldier 1&2) might disagree; he praised the RISCs on more than one occasion for having powerful features and being fast for graphics, and he also suggested turning the 68K off to get the best out of the hardware. You can also read that Duranik improved Native's framerate by moving the game engine from the 68K to the GPU; it then ran at 60 FPS without slowdowns. I wholly disagree with your assumption that a general RISC was a better choice than a custom RISC optimized with feature sets for graphics; I doubt that a scenario like this really exists, and it's highly vague in the case of the Jaguar. At the same clock speed, a custom chip will perform better. I already pointed out that the 32X had 2 (!) general RISCs that could not match the Jaguar's capabilities despite having similar clock speeds. I already pointed out that TOM has to be treated as a whole, as Flare's intent was to design a chip that integrates all the graphics processors for rendering 3D and 2D graphics. 

After analysis of Jaguar 1, they later addressed the shortcomings, e.g. slow texture mapping, with Jaguar 2; they did not resort to slapping in general RISCs off the shelf, but improved the custom design drastically.

BTW, John Mathieson later worked for Nvidia, a company well known for developing custom graphics chips.

 

By your assumption, slapping in 4-5 SuperHs (or any other general CPU) and cranking up clock speeds should have been enough to keep improving 3D graphics, at least from the 90s to the early 2000s. ;-)

 

Edited by agradeneu
  • Like 1

25 minutes ago, agradeneu said:

Marc Rosocha is pretty consistent when it comes to the Jaguar. He thinks it was great; he also praised the RISCs and their integration into the system/architecture and did NOT consider the Jaguar to be particularly hard to program.

Atari could have done with a lot more coders of the calibre of Marc. 

 

Jaguar badly needed a much higher ratio of killer app titles written exclusively for the Jaguar hardware, rather than converted from Amiga, SNES and M. D. 

  • Thanks 1

3 minutes ago, Lost Dragon said:

Atari could have done with a lot more coders of the calibre of Marc. 

 

Jaguar badly needed a much higher ratio of killer app titles written exclusively for the Jaguar hardware, rather than converted from Amiga, SNES and M. D. 

Word. :-)


31 minutes ago, agradeneu said:

You can also read that Duranik improved Native's framerate by moving the game engine from the 68K to the GPU; it then ran at 60 FPS without slowdowns. 

 

It might not slow down,  but Native literally rips the screen apart by bus saturation, totally overloading the object processor to the point where it just gives up and draws black lines.  And that is without audio. 

 

It isn't the gold standard people claim it is. 

  • Like 1

2 minutes ago, CyranoJ said:

It might not slow down,  but Native literally rips the screen apart by bus saturation, totally overloading the object processor to the point where it just gives up and draws black lines.  And that is without audio. 

 I think it looks just great, best 2D on the Jaguar in my book.

  • Like 1

6 hours ago, agradeneu said:

Initially you said 3 ARMs would have kicked ass; now you're saying swap the GPU, a graphics processor, for an ARM or SH2, which means one extra chip and a different design for the TOM RISC. Sorry, I'm not convinced.

 

No, what you're not doing is reading the thread completely. I always called for replacing JUST THE CUSTOM RISC processors with mature ones. When I said 3 ARMs would kick ass, it was following the rest of the thread in that you would be replacing the 68000, the GPU RISC, the DSP RISC, and NOTHING ELSE with ARM processors. You'd still have the blitter and object processor and all the other hardware, just not the processors Atari used.

 

My original post left the 68000, only replacing the GPU and DSP RISC processors. Someone else pointed out that replacing the 68000 would also boost performance. That was the basis for that part of the thread, not using ONLY RISC processors and nothing else.

 

  • Confused 1
