
What if NEW computers operated like OLD computers...


Omega-TI

Recommended Posts

Remember back in the day... you know, the "Classic Computer Era," those pre-PC days when our computers generally ran only one program at a time. The time when there weren't 10,000 unknown processes going on in the background to spy on us, slow down our computers, or track our every movement for revenue-generating purposes.

 

Imagine how fast our modern computers would seem if they ran like our old computers.

I'm sure most people like multiple programs running at the same time, BUT I wonder how much B.S. is really going on in the background that we could get rid of to actually improve our experience?

 

Do any of you gurus know?


It's an obvious point of contention with me and has been for a long time. It doesn't matter how "fast" PCs are on paper (or spec) anymore. Counter-productive, counter-intuitive, performance-crushing updates, inefficient internet languages, and all the things you mention and more only aid in slowing down even the fastest PCs. The modern PC has been stripped down to nothing more than an appliance at this point, and those behind the software have seen to that. Such a miserable situation. Welcome to the future. :(


Multitasking is generally a good thing. What's bad is the plethora of background processes hogging the CPU and causing hard drive chatter. Like when svchost on my overclocked Win7 rig maxes out one of my eight cores, and I hear the fan spin up or the HDD light flicker. The culprit was the Windows Update service. I turned that crap off, something Win10 disallows... But forced updates, spying, and phone-home BS is what really grinds my gears. That and walled-garden app stores telling me what I can and cannot install... :mad:


My work laptop is about three years old, and even though it has a 2.6 GHz quad-core i5 and 8GB of RAM, it is dog slow. I think the aggressive antivirus/malware software my company put on it is a major contributor. It got even worse when they went to full hard drive encryption. Any access to files is crazy slow now. It's a serious productivity killer.

 

A few years ago I set up my 10-year-old AMD desktop to dual boot WinXP and DOS. It boots in about 1 second when I choose DOS, and the programs absolutely fly.


Multitasking has been around on personal computers since OS-9 and MP/M were released in 1979... so almost as long as personal computers have been around.
If you have used either OS, it's obvious that multitasking in itself isn't part of the problem.

The problem is that we have a generation of developers and companies with the attitude that if their software isn't fast enough, you need to upgrade your computer.
Every company thinks it's okay to install something that runs in the background just to check whether its software needs an update.
Interfaces aren't text or simple buttons anymore; now you need eye candy drawn by a graphic artist, taking up megabytes of hard drive space and RAM.

Word processing isn't just ASCII text anymore; it's hundreds of languages, fonts, symbols, colors, graphics, kerning, etc...
We have virus checkers scanning every program before they can run to make sure some idiot didn't infect a file with malware.
Software now requires virtual memory, protected memory, over a dozen run time libraries just to display anything on your screen, and those include a huge amount of code whether you use it or not. The OS must do everything imaginable now because someone thought it was a good idea or just cool, so it now includes more lines of code than all your common applications put together.
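That runtime-library point is easy to see for yourself. A rough Python sketch (an illustration, not a measurement of any particular OS): even a script that does nothing has already pulled in dozens of support modules by the time it starts, where an old ROM BASIC loaded essentially nothing.

```python
import sys

# Count the support modules a "do-nothing" modern runtime has already
# loaded before any user code runs. On a typical CPython install this
# is dozens of modules; the exact number varies by version and platform.
print(len(sys.modules), "modules loaded before doing any work")
```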

If you want to know what things would be like without all the garbage, just compare a fast Amiga to a modern PC.
The Amiga OS and GUI are amazingly responsive even compared to Windows machines clocked way faster.
I haven't tried it for a while, but AROS (Amiga Research OS) will give you an idea of what it's like to run such an OS on modern hardware.

I'd love to see a benchmark of a modern PC with and without the bloat.
The bloat certainly means slower boot times, slower load times for applications, and a large RAM footprint for everything.
That certainly makes a machine feel slow, and swapping code in and out of RAM using virtual memory really slows programs down.
But once things are running, and when you aren't swapping code around with virtual memory, I wonder how much cpu time is actually eaten up by background processes.
Even 5% would be a lot though when you consider 5% of 2GHz is as much CPU power as a 100 MHz cpu.
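That back-of-the-envelope figure checks out. Here it is as a tiny Python sketch (the 2 GHz and 5% numbers are just the ones from the post, not measurements):

```python
# Express a background-process overhead as the clock speed of an
# equivalent dedicated CPU. 5% of one 2 GHz (2000 MHz) core is an
# entire 100 MHz machine's worth of compute spent on housekeeping.
def overhead_as_mhz(clock_mhz: int, overhead_percent: int) -> int:
    return clock_mhz * overhead_percent // 100

print(overhead_as_mhz(2000, 5))  # → 100
```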


Modern computers *do* basically operate like old computers. Nothing much has really changed fundamentally. What's changed is that there's more memory and CPU speed available, so programs are written to take advantage of that. I think threads like this kind of miss the point and assume that people upgrade their RAM and speed in order to run the same programs faster, but that's not really the case. Usually people upgrade their machines in order to be able to do something they couldn't do before, or at least not well. For example, I got a workstation laptop recently to be able to edit 4K video, which is not something I needed to do a couple years ago. And while I could technically do it on my old machine, it was painful.

 

Editing 4K video and having it look professional requires a professional-level editing app, as well as certain hardware and a modern graphical OS to run it on. I use DaVinci Resolve 12.5, something that could not have been imagined even in the days of the Amiga, because computers did not have enough memory and were too slow to run a program like that, or even the underlying technologies that allow it to exist.

 

The end result is that our computers nowadays don't really feel any faster than they used to, despite being objectively many thousands of times faster. But they *do* a hell of a lot more. And that's really the whole point. Classic computers are fun to toy around with, but would you really want to try to get by in life with a C64 as your main computer, *even if* it was 100,000 times faster? The advantage of that extra speed would be in allowing you to write and run *new* programs that give you functionality you didn't have. It wouldn't be in running "California Games" at 100,000 times speed.



 

We could get rid of quite a lot. Probably 70% or so.

 

I have an older WinXP machine that's pretty tight, and since its purchase it has gotten better and faster over the years. And on the flipside, it downclocks to 50MHz in a special low-voltage mode, allowing me to run basic office applications all damn day long. Essentially a sleep mode with the LCD going. Keypresses or clicks blip the speed up to around 500MHz when on battery, or 2GHz when on AC. I also have a cache module between the spinner and the IDE PATA connector for some good buffering. It's one of my long-term computing machines that I have spare parts for, and I expect it'll work many more years.

 

When running full tilt it handles most all my emulators, and boots in 25 seconds from its spinner hard disk. And it has about 30 processes running, about 10 of which are there to support HP printing and scanning. Otherwise it'd be around 15-20 processes.

 

Typically I have Office running, Outlook, Firefox, iTunes, some file-management windows, a couple of desktop support tools like priority changers and "this window on top" types of things, and several instances of Wordpad or Notepad. It also has a full suite of photo-editing software, ready to handle images from the biggest and baddest DSLRs. I added a lot of auxiliary functionality through third-party utilities that stay out of memory till called upon.

 

It's all on demand and "happens" when **I** want it. Not automatically loading at startup, or whenever some remote server or some company says so.

 

It's currently sitting at SP3, with the last OS patch done some years ago. So it's a stable and dependable system I can count on day to day. I also have a Windows 7 machine, but I haven't fully optimized it yet; that's an ongoing project that happens in bits and pieces.

Edited by Keatah


 

A really great synopsis.

 

I want to re-emphasize the laziness. Programmers are using more and more toolkits, libraries, and "development ecosystems," for lack of a better term on my part, like Qt or whatever. In gamers' terms I suppose it'd be Unity. Whatever. These APIs and libraries are a big source of bloat, and they can be responsible for 60% or even more of a program's computational time.

 

In the old days, like on the Apple II, you hit a key, the CPU does something with it, and it ends up in video memory, waiting to be scanned to the screen. Today a keypress thrashes through many layers of software. Too many middlemen. And it seems every damned company "has a solution" for speeding that process up, while in reality they're packing on the pounds. Being cross-platform development tools doesn't help any: hardware-specific features are abstracted away in favor of the least common denominator. Qt, that's you!

Thinking about word processing, many great works were accomplished on basic IBM compatibles and lesser machines. Does anyone not doing pro-level DTP need anything more than WordStar, WordPerfect, or Word 2.0?

 

There seems to be this fallacy that more tools, more busy-work on the desktop, more options, and more pull-downs will make you a more creative person. And they sent men to the moon with slide rules and computers made of discrete transistors! By comparison, every village and every town should be designing starships with the computing power available today. Just like in the commercials on TV. BULLSHIT!!

Edited by Keatah

I think spacecadet gets it -- modern computers are orders of magnitude faster than the old beaters of the past, and a few well-chosen background processes are like spit in the ocean in terms of slowing things down.

 

Windows 10 might be getting a dedicated game mode (someone found a DLL suggesting this), which could give you what you think you need. Personally, I'm skeptical that it will change much on a well-tuned system, but I guess it's nice to have options. http://www.pcgamer.com/windows-10-may-be-getting-a-new-game-mode-option/


Sorry about quoting long messages when only a line or two would do. Blame it on the shitty, tedious software on tablets.

 


 

As early as the KIM-1, I always thought there were two classes of upgrades: one for speed and one for added functionality.

 

Back in the day we'd upgrade our CPU from 1 MHz to 3 MHz (the Applied Engineering TransWarp card for the Apple II) in order to finish a task faster. We'd upgrade the SYSTEM with a printer, digitizer, or sound board to gain new capabilities like sound/vision I/O.

 

This remains true today; we just don't think about it that much anymore. Consider the electric car: the CPU gets digitizer cameras, a suite of sensors, and a powerful motor. It's essentially a robot. Not fundamentally different from Mike.



Operate like pre-PC days? Without pretty pictures to point and click at, 90% of earth's population would disappear from the internet. Hell, most wouldn't be able to boot up a computer, let alone figure out something like DOS. Think how intelligent and pleasant the internet would be! :)

 

I feel my modern PCs do anything I want in the blink of an eye and speeding things up really wouldn't make a difference. It's rare that I get a background process going that slows things down.

I think the low point had to be the 90s. I recently picked up some IBM P70s, and I'd forgotten how painfully slow things were back then. They are 386 machines with 486 AOX add-on kits (they had to be big-$$$ screaming machines back in the 80s). When you turn the power on, you may as well go make a sandwich, and that's just booting into DOS. Windows 3.1 loads instantly, but I'm also reminded why I didn't really start using Windows until ME.


I'm as big a fan of classic computers as anyone, but I really would never want to go back to the old days. A properly maintained modern system shouldn't be slow, and running multiple programs on multiple displays and doing multiple things at once is a boon, not a detriment, to productivity when managed correctly.

 

Anyway, to be fair, while there was no real performance degradation with many classic computers (especially those with ROM-based OSes) no matter how many years they ran, removable storage media like cassettes and disks tended to be slow and finicky. Cartridges were obviously instant, but it wasn't like you could write to most of them.


Multi-tasking is overrated. Task-switching is all that's needed with most programs. Resident background programs should be illegal without special authorisation. Too much sloppy programming; optimisation costs money, and "acceptable" performance seems to be a low target. In the old days, hardware performance lagged behind and was being pushed by the software. That's not the case anymore.


  • 3 weeks later...

While I don't miss the old OSes, software bloat that isn't really giving us new functionality has long been a pet peeve of mine. There's an old saying that what Intel giveth, Microsoft taketh away. My first experience with bloat was in college, when I was using Microsoft Word to write papers and enjoying it. I came back the next semester to find Word on those machines had been "upgraded," and it was now so slow as to be unusable; supposedly the machines needed to be upgraded. But it's not just Microsoft. Every hot new web browser starts out small, quick, and pleasant to use, and eventually grows into a monster.

 

I understand the guy above who says "now I can edit 4K video"; that's valid. But for most of these apps, we are doing the same things we were doing 10-15 years ago, yet suddenly the software needed to be bigger and our machines are now too slow.

 

Related is the fact that many modern OSes get slow over time, due to clutter, excessive background processes, disk fragmentation, or whatever. A good OS could keep many of these things in check, which makes me think they are allowed to happen on purpose: many users will simply upgrade because "my machine is too slow" rather than cleaning up the crap that made it slow.

 

Smartphones are even worse. By the end of my contract, they've gotten very slow and sometimes painful to use. I even take steps to clean mine and keep it running well, but it's a losing battle. Over time they replace all your apps with ones that run slower on your hardware, and so on.

 

So yes, the limited resources of old computers forced developers to optimize. An OS in ROM meant upgrades were the exception rather than the rule.


I also feel like a skinny person trapped inside a fat person's body whenever I use Windows these days. Same with most modern software. That's why I switched to Linux a while back. Of course, I still feel like most modern Linux distributions could use memory much more efficiently.


Even BITD, miniaturization marched on fast.

In 1970 DEC released the famous PDP-11:

[photo: a DEC PDP-11/40]

13 years later, the Soviets shrank it into a C64-sized box:

[photo: a Soviet PDP-11-compatible machine in a C64-sized case]

 

And it was sold for 600 rubles, so about two months' salary for a worker of the era. Pretty sure the PDP-11 in 1970 wasn't that cheap at all :D

 

And reading about it, computers supported multitasking and real time quite early on, with the LEO III and the Bull Gamma 60 offering multitasking in 1960/1961. DEC's RT-11 OS in the early 1970s was real-time (hence the RT in the name).

So modern computers do operate like old computers; it might be more accurate to say that modern programming (as was already pointed out) isn't done like it was BITD, when resources were what mattered.

Edited by CatPix

After spending four hours on Friday evening trying to get two different vintage computers to read floppy disks, I am rather pleased that new computers no longer work like the old ones used to. In particular the IBM PC 5150 and 5160 XT, which I find extremely finicky when it comes to booting from floppy disks and finding a disk that is bootable. And it's not the first time I've had this trouble. Not to mention all the work of disassembling them to reach the components.


I definitely don't miss floppies. You have to wonder what programs we might have had if there had been fast devices with large capacities similar to what we have now.


My old XP laptops got very slow towards the end. It got to the point where it was easier to keep them on standby all the time than to boot them every day. It wasn't until I upgraded and moved over to Linux that I started tinkering with them and found that removing the anti-virus software improved their performance by about 80%.

 

Those laptops still come in handy for old games, emulators and movies and they've run a lot better since Microsoft and AVG stopped supporting them.

