
OT: Dumping Thread


True, but I can build you a version of Windows XP embedded that boots in 30 seconds too. The distro is everything in both cases.

 

Yes but I can get bug fixes for my copy of Xubuntu..which boots up in like 20 seconds :) and runs modern software..

 

G


I once considered a specific measure for boot performance:

 

How long does it take from turning on the PC until you get a display of a selected web page?

 

This includes the time for booting, for logging on, and for launching an application. I know that Windows is pretty quick to show the login screen, but it takes some time after logging in before the system becomes responsive because a lot of services still have to be started.
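On the Linux side, a rough version of this measurement can be sketched in the shell. This is only a sketch: `systemd-analyze` is assumed to be available (systemd-based distros only), and the `sleep` is a stand-in for whatever browser launch you actually want to time.

```shell
#!/bin/sh
# Sketch: two pieces of the "power-on to web page" measurement.

# Piece 1: kernel + userspace boot time, if this is a systemd distro.
command -v systemd-analyze >/dev/null 2>&1 && systemd-analyze

# Piece 2: a plain stopwatch around "log in, launch browser, page shown".
# Replace the sleep with the actual command being timed,
# e.g. firefox http://example.com
start=$(date +%s)
sleep 2                      # stand-in for the thing being timed
end=$(date +%s)
echo "launch-to-page: $((end - start)) seconds"
```

The stopwatch half works on any POSIX shell; only the boot-time half depends on systemd.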


Yes but I can get bug fixes for my copy of Xubuntu..which boots up in like 20 seconds :) and runs modern software..

Ignoring for the moment what a stupid thing this is to argue about (why must people argue in favor of supporting prejudices again?), it's misleading when you take it in context.

 

Windows XP was released in 2001. Xubuntu did not even exist until 2006. Windows XP was officially supported with patches and releases for 13 years; Xubuntu's official support timeline is /THREE/ years. Attempting to run a Linux program from 13 years ago on a modern distribution has a medium-to-low likelihood of working; running a Windows program from 20 years ago is highly likely to work on a modern version of Windows (excluding 16-bit software, which will work on a modern 32-bit Windows but not on 64-bit). (Even re-compiling that old program from source has a questionable chance of working, due to major incompatible changes in many common Linux libraries. In fairness, you might not be able to recompile that old Windows program either, but the binary is more likely to work anyway.)
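One concrete way to see whether an old dynamically linked Linux binary even stands a chance on a modern distro is to check its shared-library dependencies. A minimal sketch; `./old-program` is a hypothetical placeholder, not a binary from the thread:

```shell
#!/bin/sh
# Sketch: check whether an old binary's shared libraries still resolve
# on this system. "./old-program" is a placeholder name; point BIN at a
# real executable to try it.
BIN=./old-program

# ldd lists each required shared object; anything the runtime loader
# cannot locate is printed as "not found".
if ldd "$BIN" | grep -q "not found"; then
    echo "missing libraries -- the binary is unlikely to run"
else
    echo "all libraries resolved"
fi
```

Missing libraries don't always mean game over (sometimes a compat package supplies the old soname), but "not found" lines are usually where the 13-year-old binary dies.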

 

Your argument doesn't make sense because you are comparing a specific, very old release of Windows to the current release of Xubuntu, and treating them as equivalent. Even the first Ubuntu, from which Xubuntu is derived, did not exist when Windows XP was released. But if you upgrade Windows to the current version, yes, you can get bug fixes and modern software for it.

 

I work with Linux distributions (more than one) every day; it's what pays my bills. And a big part of what I do is legacy Linux software on modern distributions. I know what I'm talking about, as I do this every day. I also spent 14 years doing that for Windows distributions. When you've done the same, THEN let's have a conversation about pros and cons. :)

 

(edit: correct an editorialized comment - struck out)

Edited by Tursi

I once considered a specific measure for boot performance:

 

How long does it take from turning on the PC until you get a display of a selected web page?

 

This includes the time for booting, for logging on, and for launching an application. I know that Windows is pretty quick to show the login screen, but it takes some time after logging in before the system becomes responsive because a lot of services still have to be started.

 

Yes, this was a trick Microsoft did.. but we were talking originally about stripped-down, embedded systems, and Embedded XP or Embedded 7 can likewise be made not to start (or even have) all these unnecessary services. BartPE was previously mentioned and actually does this nicely for free, and Microsoft also has an official tool for creating minimized versions. I worked with XP Embedded for a good number of years and despite some silly decisions here and there, it did work well (I had a booting system with GUI down to 25MB, but I had to increase it to about 100MB for the final image because of some driver dependencies). I also used BartPE to create a tiny installation environment, and it's definitely cool to see Windows without the bloat. ;)


Ignoring for the moment what a stupid thing this is to argue about (why must people argue in favor of supporting prejudices again?), it's misleading when you take it in context.

 

Windows XP was released in 2001. Xubuntu did not even exist until 2006. Windows XP was officially supported with patches and releases for 13 years; Xubuntu's official support timeline is /THREE/ years. Attempting to run a Linux program from 13 years ago on a modern distribution has a medium-to-low likelihood of working; running a Windows program from 20 years ago is highly likely to work on a modern version of Windows (excluding 16-bit software, which will work on a modern 32-bit Windows but not on 64-bit). (Even re-compiling that old program from source has a questionable chance of working, due to major incompatible changes in many common Linux libraries. In fairness, you might not be able to recompile that old Windows program either, but the binary is more likely to work anyway.)

 

Your argument doesn't make sense because you are comparing a specific, very old release of Windows to the current release of Xubuntu, and treating them as equivalent. Even the first Ubuntu, from which Xubuntu is derived, did not exist when Windows XP was released. But if you upgrade Windows to the current version, yes, you can get bug fixes and modern software for it.

 

I work with Linux distributions (more than one) every day; it's what pays my bills. And a big part of what I do is legacy Linux software on modern distributions. I know what I'm talking about, as I do this every day. I also spent 14 years doing that for Windows distributions. When you've done the same, THEN let's have a conversation about pros and cons. :)

 

(edit: correct an editorialized comment - struck out)

 

 

Sorry, I thought we were discussing running XP on a computer today.. were we not? Then never mind :) I'll be happy with my Xubuntu machines.. and the XP machine I keep just to run HDX server ;)

 

G


Ignoring for the moment what a stupid thing this is to argue about.

 

Attempting to run a Linux program from 13 years ago on a modern distribution has a medium-to-low likelihood of working; running a Windows program from 20 years ago is highly likely to work on a modern version of Windows (excluding 16-bit software, which will work on a modern 32-bit Windows but not on 64-bit).

 

It's stupid to argue about, but it's fun to debate ;). As long as no one gets their internet panties in a bunch, I'm all for it.

 

That said, I'm not on board with your reasoning here. First of all, there are plenty of age-old programs that run perfectly well on modern Linux distros (I still run UT99 on my Linux machines; granted, it's only 15 years old, but still). Secondly, you're making it sound as if the choice to drop backwards compatibility with apps that use deprecated libraries is a bad thing. It's exactly the reason why Linux is (or can be) so much more snappy than Windows, and one of the reasons why I prefer it. Apple falls somewhere in between with OSX, and it shows some of the same benefits. Thirdly, due to the nature of the ecosystem, at least with open source programs there is something that can be done about it. Try that with 20-year-old software that you just found in an obscure corner of the internet (what is that application that can generate speech synthesizer compatible data again?).

 

But as you said, it's a silly argument. If you want something lean and mean, this thing beats it all: entire OS + GUI + web browser on a single floppy disk: http://toastytech.com/guis/qnxdemo.html :)


That said, I'm not on board with your reasoning here. First of all, there are plenty of age-old programs that run perfectly well on modern Linux distros (I still run UT99 on my Linux machines; granted, it's only 15 years old, but still).

Seriously? The same /binary/ from 1999? Or UT99 that has been ported to the modern libraries? The program being updated to work doesn't count for my argument. :) If it is the same binary, my hat's off to that developer for somehow getting it right! I'd love to see that!

 

Secondly, you're making it sound as if the choice to drop backwards compatibility with apps that use deprecated libraries is a bad thing.

No, I didn't actually pass judgment, I just stated the way it is. It's true that it sometimes annoys me, and that may have come through. ;)

 

It's exactly the reason why it is (or can be) so much more snappy than Windows and one of the reasons why I prefer it.

Using both, every day, all the time, I don't feel that Linux at the GUI is any snappier than Windows. (I do like the command prompt more ;) ). Some people do, but some people don't keep their Windows machines maintained, and let them get loaded down with crap. If you don't /like/ the system, how can you be expected to understand it well enough to keep it clean? I like Linux just fine. I just like the Windows model better.

 

An old Windows application and a new Windows application run through different code paths. The old application doesn't run through layers of emulation; it just loads the old DLLs. The only impact is disk space. Linux /could/ do the same thing, but they choose not to, and many library maintainers break APIs just because they can. I feel that the lack of respect for backwards compatibility is one of the reasons that Linux has not made the inroads into the desktop market that it once was hoping for. It's getting better, but there are still problems.

 

Thirdly, due to the nature of the ecosystem, at least with open source programs there is something that can be done about it. Try that with 20-year-old software that you just found in an obscure corner of the internet (what is that application that can generate speech synthesizer compatible data again?).

I covered re-compiling old software for modern distributions. For the average user this is difficult, and if it fails and needs re-coding in sections, it is just as impossible as reverse engineering an old binary. That said, Qbox Pro (which is that software) runs on modern Windows systems as long as they are 32-bit, since it is a 16-bit app. Out of the box and without any modification needed. (It just has path assumptions that need to be satisfied and suffers from old-software-design-itis. ;) )

 

I'm not entirely sure what point you were trying to make, though. Mine was just that saying Xubuntu is better than Windows because you can get updates on the modern distribution of Xubuntu while your options are limited on XP is a silly argument. ;) (And sorry, ArcadeShopper :) )

Edited by Tursi

Actually.. sort of related (even though this is an OT thread).. if you have 45 minutes to kill, this guy gives a brilliant breakdown of things in Linux that need work (ignore the trollish title, it's a great in-depth discussion).

 

 

... and actually, I haven't watched this one yet! I saw last year's and the year before. If you are really really looking to kill time, also check out the previous ones. :)

 

Linux really is great for what it is and it is growing every year. It's probably the world's greatest and most successful software experiment. But it could be SO MUCH BETTER. ;)


Screw all of you and your petty squabbles. I can run the same AmigaOS binaries from the 1980s on a modern AmigaOS system.

 



Linux has been in this situation for 10 years. It is not getting better, nor is it going away.

 

I suspect another video like this in 2 years and the situation will have changed very little if at all.

 

That is the problem with a committee that designs and creates an OS.

 

It takes years to fix or update, and no one agrees on the how...


But it could be SO MUCH BETTER. ;)

It could - and I think it's getting there little by little ;). (I'd still rather run Linux over Windows myself though :P).

 

 

Screw all of you and your petty squabbles. I can run the same AmigaOS binaries from 1980s on a modern AmigaOS system.

I wish I'd gotten more experience than I did with those machines back in the day (we had some at our family church in the early 90's), but they were awesome 8). Pretty impressive to boot.


I felt back in the early 2000s that Linux's fatal flaw was that it was trying too hard to woo Windows users with a Windows-like interface. At the time, there was actually a Linux distro which was designed to look just like XP. Meanwhile, the Mac OSX UI was snatching up users left and right, us old-school Amiga users were lamenting that our beloved Workbench and various enhancements were not mainstream (well, many of us, anyway), and Microsoft was going on and on in TS2 seminars about how Vista was geared to snatch Mac users.

 

I always liked Gnome, but KDE made inroads and IIRC even The Torvalds selected it as his personal preference (with some very nasty words for Gnome), and at some point I actually missed CDE. Okay, not too much, but all the changes in other window environments were just annoying.

 

Anyway, just some thoughts, nothing really coherent here. Right now I am actually of two minds about my work. I absolutely loathe and despise the bullshyte Microsoft is pulling and pushing onto its customers, and I am ready to abandon my field because of it. At the same time, last week I deployed a Lenovo Yoga with Windows 8.1 Update 1 for a customer and it was actually an enjoyable experience.


Seriously? The same /binary/ from 1999? Or UT99 that has been ported to the modern libraries? The program being updated to work doesn't count for my argument. :) If it is the same binary, my hat's off to that developer for somehow getting it right! I'd love to see that!

Loki's original UT99 binary has worked for me all of this time. Of course, it's a bit easier for a game that basically only links against SDL 1.2, OpenGL, and libc. But it is true that Ryan Gordon is a bit of a god when it comes to these things.

 

Using both, every day, all the time, I don't feel that Linux at the GUI is any snappier than Windows. (I do like the command prompt more ;) ). Some people do, but, some people don't keep their Windows machines maintained, and let it get loaded down with crap. If you don't /like/ the system, how can you be expected to understand it well enough to keep it clean? I like Linux just fine. I just like the Windows model better.

Well, I have to admit that my experience with Windows has been limited to my work laptop, where there's a bunch of company-mandated crap probably slowing things down. But I can definitely say that my Ubuntu install on the same device feels a fair bit quicker while covering my needs in a way that better suits my workflow. But it's mostly preference, and having been a Linux enthusiast since before Y2K was a thing, it's not unexpected that I lean towards the Linux-y style of working.

 

I covered re-compiling old software for modern distributions. For the average user this is difficult, and if it fails and needs re-coding in sections, it is just as impossible as reverse engineering an old binary. That said, Qbox Pro (which is that software) runs on modern Windows systems as long as they are 32-bit, since it is a 16-bit app. Out of the box and without any modification needed. (It just has path assumptions that need to be satisfied and suffers from old-software-design-itis. ;) )

 

That's not entirely true, though. It's not just the DLLs; it's also the legacy services that need to run in the background and the age-old components that in some cases have not been updated/modernized/optimized because they feel that backwards compatibility is so important. I also disagree that it's just as difficult to patch an open source project as it is to reverse engineer an old binary. Most of the incompatibilities stem from evolved APIs, or libraries that have been deprecated in favor of newer ones that offer similar but improved functionality in a non-compatible way. If you have the source, you're mostly concerned with updating/replacing/refactoring the calls to those deprecated APIs. With binaries only, you can hope that the changes are small enough that a simple binary patch might be doable, or you're stuck reverse engineering the entire functionality.
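The "update the calls to deprecated APIs" half of that workflow can be sketched in the shell. `old_api_call` and `src/` below are hypothetical placeholders, not names from any real library or project:

```shell
#!/bin/sh
# Sketch: inventory every use of a symbol that a newer library dropped,
# so each call site can be updated by hand. "old_api_call" and "src/"
# are placeholder names.
grep -rn "old_api_call" src/

# After editing, rebuild against the modern library; the compiler and
# linker will flag any call sites that were missed:
#   ./configure && make
```

With a binary, there is no equivalent of this step: you are reverse engineering instead of grepping, which is the asymmetry being argued above.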

 

I'll gladly concede that even with source code this is beyond most users' capabilities, though :)

Edited by TheMole

I'm not entirely sure what point you were trying to make, though. Mine was just that saying Xubuntu is better than Windows because you can get updates on the modern distribution of Xubuntu while your options are limited on XP is a silly argument. ;) (And sorry, ArcadeShopper :) )

 

To be honest, neither am I :). Like I said, sometimes it's just fun to talk about this stuff, as long as it remains civil and good natured of course ;).


Last I heard, Torvalds used XFCE, but that was a while ago, when MATE was just coming out (the GNOME 2.x fork - which I use myself on Mint 13 ;)).

 

Illustrates my point... I have no idea what he is using these days. Not that it matters, really. I have tried several window environments and each has its aggravation point with me. I am moving away from Ubuntu, though. I have been running that in a VM since v10, but it has become a pain for me. My big gripe with Linux today is that almost everyone is depending upon packages. For extra credit on an assignment I needed to run and dissect the BitCoin DVR virus in an ARM QEMU host. I could not get it to run on Windows (it just sat there), and the images I found were QCOW, which QEMU 1.blah would not run, so I needed 1.5, but Ubuntu 12.04 LTS only had 1.blah. v14 was due out on the assignment due date, which has QEMU 1.5, but after the ninetieth upgrade attempt (all 89 just hung while restoring packages) I lost my personal directory, so I turned in the assignment with no extra credit. Everywhere I checked about compiling QEMU was all "apt-get" this and that.

 

I use packages on my Solaris boxes a lot more now than I used to. But I still like being able to compile exactly what I need, AND having information from others who may have run into silly things (like a Solaris .h equivalent to a Linux .h and what-not). Anyway.


...My big gripe with Linux today is that almost everyone is depending upon packages...

I hear you. I think they do that for ease of use for beginners. What they should do is make some of the tricks more obvious, like using

sudo apt-get build-dep <package-name> 

to get all the dependencies needed to build something. For me, getting all the dependencies has always been the hardest part of building - once that's done, it's a simple matter of ./configure && make && make install :).


What is the point of creating an OS that has such a limited user base?

One based solely on C programming skill and on spending hours mucking around with it just to do anything?

 

If you cannot do anything from the very start after the install, then what is the point?

 

This is like buying a kit car you build yourself vs. a car you buy at the dealership.

Yea, the kit car allows total customization and freedom.

But who wants to waste that much time just to play a game or load an app?

 

Imagine a phone you had to spend two hours setting up before you could make a phone call.

That, in a nutshell, is life as a pure Linux person. Any package that gets away from this nightmare is welcome and should be applauded.

 

Any package of any software that requires that much setup is going to die out from having a small user base.

Look, ease of use sells, and hard to use just sucks.


A lot of Linux users have the idea that Windows users spend most of their time running virus scans and defragging the hard drive and never doing anything useful. A lot of Windows users have the idea that Linux users spend most of their time typing things like sudo apt-get build-dep <package-name> and never doing anything useful. Both those viewpoints have an element of truth, but less and less as time goes on.

For example, a couple of years ago I was experimenting with Linux and trying to get a driver for my old Canon printer. With XP there was a disk that came with the printer, or you could connect to the internet, turn on the printer, and Windows would download a driver for you. I found a four-year-old site that described how to install a Linux driver for the printer: two pages of sudo apt-get BS. I commented to one of the Linux users at work that you could run a lot of virus scans in the amount of time it would take to install a printer driver that way. He said, "Try turning on the printer to see what happens." I did, and in short order was given the option to download a driver for the printer. So time does not stand still for either Linux or Windows - improvements keep happening all the time.

 

FWIW, I am using Linux Mint with the MATE desktop. It installs about the same way that Windows does: you put in the disk, and when it is done installing it just plain works. Use the Clearlooks theme and Elementary icons and you will be hard pressed to tell the difference between this and Windows. I think it actually takes a little longer to boot up than Windows, but not much. However, it doesn't seem to get the infamous Windows slowdown. You would never have to use the terminal (command line) in general use. The new release will come out at the end of the month and will be supported for 5 years. I think I will install it and leave it alone until the computer dies.

 

BTW, the Windows XP version that I mentioned earlier that loads so fast is XP Home Premium. My old HP computer came with this but got a bad case of blue-screenitis. Our IT guy said it had a bunch of viruses, swept the drive clean, and "upgraded" me to Windows XP Professional. This turned the computer into a real dog; the change in performance was shocking. It just didn't have enough horsepower to handle the larger OS. If I had known then what I know now, I would have reinstalled Home Premium, but instead I wound up hating the computer for 3 years.

 

One of the biggest problems that I see with Windows is the way it slows down as the computer ages. Tursi mentioned maintaining the computer to avoid this, but most people don't know how to do it. You could make a lot of people happy if you could direct us to a site that explains how this is done.


The incremental updates are a major reason that I see Windows slow down as it ages. Windows XP originally required only 128MB of RAM, with 512MB recommended. As of SP2, 512MB was the bare minimum to run reasonably, 1GB for comfort, and 2GB for performance. Disk fragmentation was a minor issue, as XP would defrag when idle.

 

Disk age is another reason for performance problems over the years: IDE drives fail. Period. Worst of all, IDE desktop drives can experience problems and not report them to the host OS. The drive will beat a sector to death until it gets statistically valid data, and in the meantime the OS will wait. The trick is that IDE drives relocate data from bad sectors, but only on write. Program files tend not to change, and tend to stay in the same place over time, so if the part of the drive where your program files reside gets flaky, load time can increase. (Somehow this avoids the prefetch, which I have yet to understand why.) A program which will non-destructively "refresh" the media surface can help.

 

Startup programs can be an issue, and there are a number of programs out there to corral those things if they cannot corral themselves. Boot time can be decreased at the "expense" of the numerous "quick load" preloaders for programs like OpenOffice, amongst numerous others. Also remember that the first time a program is run after boot it will generally be slow, as all of the program's parts (and/or its prefetch data) are loaded from disk.

 

But the worst problem I have with Windows is shitty driver writers. And I mean just the absolute bottom-of-the-barrel, worthless-piece-of-shit programmers, especially the ones whose drivers just shotgun-blast themselves all over Windows. I have seen printer drivers (the worst of the worst) bring a machine to a crawl.

 

The Internet Explorer cache is a place to keep an eye on. By default it is 10% of your hard drive, so if you have a 200GB hard drive, that is a 20GB Internet cache. That is a LOT of little tiny files which have to be tracked in a live database. Reduce your IE cache. For that matter, if you really are using Windows XP, stop using IE (the latest version for XP is 8, which is not only unsupported but quite vulnerable).

 

If you really want to breathe some new life into your XP machine (really, just ditch it already... you can probably run Windows 7 anyway), install an SSD. Since XP does not support TRIM, you need either an Intel SSD with the SSD Toolbox, which can schedule a TRIM run (I use one on my XP x64 machine), or an SSD which automatically runs a TRIM when it detects that the OS does not (the old OCZ Vertex drives do this).

 

If you do install an SSD in XP, I have a registry file which applies some of the same tweaks the Intel SSD Toolbox applies. I do not recommend disabling the page file (Windows always needs it, irrespective of what people say -- read Russinovich sometime), nor do I recommend disabling System Restore (it will save your ass at some point). Even if you do not go with an SSD, a newer hard drive (that 80GB IDE is ready to retire) will help out immensely: better read speeds, better interface speeds, larger cache, and far fewer failed sectors.

 

Run Disk Cleanup (cleanmgr.exe) every so often. Watch your anti-virus: modern AV for Windows is pretty resource-heavy, especially with all of the hooks it has to install throughout the OS, upper- and lower-level disk drivers, etc. Most AVs will drop XP support within the next year anyway.

 

I maintain a number of XP machines which have been running for years with little performance loss.


The only thing I need XP (or Windows 7) for is to run Classic99 or Win994a, which aren't totally happy running in Wine. I have disabled internet access, so there should be no need for antivirus. It's a bit of a pain, because if I want to see something on the internet I need to exit XP and boot up Linux. I thought about getting an SSD for this old laptop, which was low-end in 2005. The light showing disk access was lit all the time when using XP, and I figured that speeding this up would be a good thing. Instead, I upgraded the memory from 512MB to 2GB. Now the light is hardly ever lit, and I see no reason for an SSD as the computer is now adequate for my needs. I may try running XP in VirtualBox again. Earlier, XP in VirtualBox would have periods of inactivity where I thought the system had crashed, but it might be that the memory was inadequate - I had reserved 512MB for XP and this is obviously not enough.

 

Your advice above is much appreciated - does it also apply to Windows 7?


I use Classic99 (and to a lesser extent Win994a) in a VirtualBox XP quite frequently - it works great. I would suspect your problems were with memory - now that you have 2GB you should be good to go :).


In my experience, many computer problems are the result of cheap, unreliable hardware and poor utility power (or wiring) quality. For example, one of my friends builds his own PCs using some of the cheapest parts he can find, and invariably his systems will blue-screen and crash. It didn't seem to matter whether it was Win98, Win2000, XP, etc. His cheap RAM and motherboard combos were usually the culprit, but he always blamed the OS.

 

On the other hand, I can count on one hand (well, maybe two) the number of times my self-built systems and purchased laptops have crashed in the last 10 years. Adequate RAM, more reliable parts, and not beating the hardware to death with "cleanup" and "scan" programs have worked well for me. Coupled with a UPS, my last home server ran for nearly seven years without missing a beat. I only rebuilt it to take advantage of larger drives and less power-hungry hardware, and to virtualize some apps for me and my wife. That replacement has been running 24x7, with only an occasional reboot, for over two years.

 

Whether you hate or love Microsoft, it seems pretty obvious they get some of the bad rap for problems that are really not of their doing.


That's not entirely true, though. It's not just the DLLs; it's also the legacy services that need to run in the background and the age-old components that in some cases have not been updated/modernized/optimized because they feel that backwards compatibility is so important.

No, this statement is not true. There is very little software on Windows that depends on services. I'm not saying you can't give five examples, but I am saying it's not anywhere near a majority. I am not exaggerating when I talk about old software compatibility with modern Windows, nor am I just supposing; I actually run tests to see what works. Likewise, I am forced to do the same on Linux.

 

/Drivers/ are a different case, and Windows DOES change the underlying architecture to make the new ones run better. :)


Tursi mentioned maintaining the computer to avoid this, but most people don't know how to do it. You could make a lot of people happy if you could direct us to a site that explains how this is done.

They don't "just" slow down. Something changes.

 

Usually it's caused by installing new software and letting it install startup programs, toolbars, and other crap. Just about every damn package on Windows comes with some kind of startup tool to "speed startup" or "check for updates" or perform some other stupid task, usually completely unnecessary. (Adobe and MS Office are the worst offenders for this on my system, and removing their startup tasks has no impact on the programs. The MS Office startup tool, in particular, does nothing but load the DLLs so they are already in memory in case you want to run Office later.) The other problem is that the way Windows manages swap as a file means that a full hard drive can slow it down. (Linux doesn't use a swap file but a swap partition, which is a smarter way to do it ;) ).

 

So... I don't know if there is a website, but it's pretty easy, and it's the same as on Linux and other systems, with the dual benefit of performance and improved security: remove unnecessary software and disable unused background processes. (Emptying the temp folder from time to time doesn't hurt either ;) ).

