EUFI YOU!

If you are a tech nerd, you care about this.

If you just buy and use Micro$oft products and never think about it, you don’t care.

As of Windows 8, MicroSnot is requiring that your hardware only work with their stuff. The boot loader locks you out otherwise. A couple of Linux distributions have signed up with Microsoft's signing service and paid the fee (so Microsoft will let Red Hat Linux boot, for example), but if you don't like the idea of your machine needing Microsoft's cryptographic blessing before it will boot anything, you are screwed.

As I'm fond of getting old hardware essentially for free and running very fast Linux on it, this will choke off the supply of reusable old hardware (Oh Boy, more old dead computers to the landfill or China Recycle instead of reuse… /sarcoff;)

Story here:

https://www.linux.com/learn/tutorials/588498-uefi-secure-boot-big-hassle-questionable-benefit

Microsoft requires UEFI “secure” boot for Windows 8 certified hardware. More security is good, right? Even if it locks out Linux?

Microsoft is requiring Windows 8-certified hardware to ship with UEFI Secure Boot enabled. This prevents installing any other operating systems, or running any live Linux media. There are ways to get around Secure Boot– but why should we, once again, have to jump through Microsoft hoops just to use our own hardware the way we want to?
[…]
UEFI was originally EFI, which was developed by Intel as a modern alternative to the PC BIOS. Now it’s supported by a big ole industry consortium populated by pretty much everyone in tech. UEFI is really a little operating system, so it can be programmed to do just about anything: boot fast, play games, allow remote access without starting the operating system, shutdown, Web surf, and all those things that the Linux pre-boot environments promised. It supports both a pretty mouse-driven graphical interface, and a console interface. It has its own networking stack, and supports its own set of device drivers so you can have video, networking, peripherals, and other functions available during pre-boot.

Aside from the fact that this is a 100 MB pig of a boot loader, I don't really WANT my bootloader to enable things like remote access and networking to who knows whom about who knows what… The number of potential security "issues" this has makes my skin crawl. (Can you spell TLA?… Three Letter Agency…)

Microsoft has gone to something called "signed drivers", where various bits of code have to be validated against a signing authority's certificate before they will run. What happens if the signing authority "has issues"?
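Underneath, "signing" is plain public-key crypto. A rough sketch with openssl (file names made up; real Secure Boot uses Authenticode-format signatures rather than raw digests, but the idea is the same) of what "validated before it runs" amounts to:

```shell
# Generate a signing key pair (stand-in for the signing authority's key)
openssl genrsa -out signer.key 2048 2>/dev/null
openssl rsa -in signer.key -pubout -out signer.pub 2>/dev/null

# "Sign" a pretend boot binary
echo "pretend bootloader" > bootx64.efi
openssl dgst -sha256 -sign signer.key -out bootx64.sig bootx64.efi

# Firmware-side step: verify the signature against the trusted public key.
# Prints "Verified OK" on a match; a tampered binary or key fails the check.
openssl dgst -sha256 -verify signer.pub -signature bootx64.sig bootx64.efi
```

The whole scheme stands or falls on who holds `signer.key` and what's burned into the firmware as `signer.pub` — which is the author's complaint.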

To deliver some actual security, Secure Boot needs a bulletproof pre-boot environment, and a trusted, secure certificate authority and signing keys. So the second big question, after “Bulletproof? What’s that?” is whose CA and keys? Microsoft already has a CA infrastructure in place for signing drivers.

But haven’t we learned that the root CAs are vulnerable? How many times has Verisign been compromised? Bruce Schneier calls the certificate system “completely broken.” Who hosts Microsoft’s CA? Verisign. Have we already forgotten the Flame malware that spoofs Microsoft’s own CA, takes over Windows Update, and fools Windows computers believing that the malware they’re installing is genuine trusted signed binaries?

To borrow Bruce Schneier’s wonderful phrase, I think this just another piece of security theater that will inconvenience many and benefit no one. Except for whatever value is derived from forcing purchasers of new hardware to be Secure Boot beta testers, and to once again dance to Microsoft’s tune.

OK, so it is sort of useless, really. So what? Surely we are used to fat bloated code with bugs in it, so just run Linux instead and get over it. Right?

Despite all the questions about its safety and actual security benefits, Microsoft requires, as a condition of receiving the official Windows 8 certification, that hardware vendors enable UEFI Secure Boot by default on client systems. They may use their own signing keys, or Microsoft’s. There are financial incentives to getting that official certification, so they’ll all do it. Windows 8 will boot without Secure Boot, and it will install on legacy hardware. But later this year, as the new OEM Windows 8 PCs enter the market, they’re going to ship with UEFI Secure Boot turned on. So everyone who doesn’t want to hassle with Secure Boot will be forced to. Originally Microsoft did not even want a disable option, or to allow users to use their own keys and certificate authorities, but they changed their minds for x86 hardware.

Fedora and Red Hat, wanting to keep their users’ options open, have chosen to pay the $99 fee to be signed with Microsoft’s keys. This will allow Fedora 18 users to use Secure Boot-encumbered systems without disabling it, and eventually Red Hat Enterprise Linux as well. Other distributions are still figuring out what to do.

OK, so they backed off a little bit, for now, for x86 hardware, and will let you do the whole "roll your own keys and certificates" thing and all that crap. I'm sure that as soon as they can swing it, they will go back to their original position of "no changes, Windoze only"…

Why do I think that?

Turning it Off– Except on ARM

How to turn it off on your x86 device? That will depend entirely on the hardware vendor’s implementation. Users will have to enter their UEFI interfaces and hunt down the Secure Boot control, which could be called anything and buried anywhere. I have been unimpressed with the quality control that went into the PC BIOS for all these years, so who knows what fun awaits us in messing with UEFI firmware settings.
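You can at least see from a running Linux box whether it booted through UEFI at all; a small sketch (efivars only appear under /sys when the kernel was started by UEFI firmware, and mokutil, where installed, reports the Secure Boot state):

```shell
# Quick check: did this Linux box boot via UEFI or legacy BIOS?
if [ -d /sys/firmware/efi ]; then
    mode="UEFI"
    # mokutil (if installed) can also report the Secure Boot on/off state:
    if command -v mokutil >/dev/null 2>&1; then mokutil --sb-state; fi
else
    mode="legacy BIOS (or efivars not mounted)"
fi
echo "Boot mode: $mode"
```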

ARM hardware is another story– Secure Boot is mandatory and cannot be disabled.
(See page 122 of the Windows 8 hardware certification.)

So if you have an ARM based computer shipped with Windows 8, Microsoft owns it (and you).

You dance to their tune, they decide what you can run, and when, and that’s that.

No thanks.

The only good news in all this is that generic boards have gotten cheap enough (and fast enough) that for nearly no money you can get a generic bit of hardware not so afflicted, where you can put up a secure (and PRIVATE) Linux environment without it tattling to MicroSnoop and asking permission to use the devices and device drivers.

Guess it’s time to start planning to “Roll your own” computers again…

Why Bigger Isn’t Always Better

This UEFI is really an operating system. Its reference code carries a BSD (Berkeley Software Distribution) license. And while I love BSD, and it is one of the most robust and well written operating systems, this makes ALL your higher level OS loads et al. dependent on ANOTHER operating system. One you can not inspect, nor harden. God knows what's in it. While the basic BSD has been hardened by generations of college students giving their professors heartburn, and likely has had most of the easy bugs patched, this is a new port and a new version. It WILL have bugs and holes.

Not to mention that a TLA showing up at the vendor can request all sorts of nice little back doors be slid into the code and left inactive, where you won’t notice, until they want a little look around. (TLAs from all sorts of governments… remember that many of these machines will be assembled in China…)

Oh Well. It’s not like I was ever going to buy a Windows 8 machine anyway. At least now more of them will go to the landfill faster.

And I can’t wait for the first time a “bios bug” is announced that lets EVERY ONE OF THEM be completely compromised no matter what you do.

Compromising the signing authority step is a known technique now. It is just a matter of time (and likely not that much of it…)

Windows 8 – Just Say No.


About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in Tech Bits. Bookmark the permalink.

96 Responses to EUFI YOU!

  1. Ian W says:

    So what you are saying is that Micro$oft Windoze 8 is actually a virtual machine running on BSD Unix. Not only that but the BSD is buried so that white and black hats have full access but the user does not.

    Does Micro$oft have a death-wish?

  2. E.M.Smith says:

    @Ian W:

    I don’t think it is a “Virtual Machine” (that is, I think the MS Binaries run directly on the hardware) but just that they could only make a “secure boot loader” that did all the “Check with MS key exchange certificate authority crap” with a BSD kernel. It boots, does all the Chatty Cathy stuff and validation, then hands the box over to the MS Kernel IFF your devices and device drivers et. al. have had their “signatures” validated.

    During that time, there’s a great opportunity for various exploits….

    As to MS having a death-wish: One can only hope…

  3. p.g.sharrow says:

    Windoze is a virus. Why advance to a worse quality system that will lock you out of your own system and leave hidden back doors openable by possible hostiles? If they can, they will. Still, I avoid becoming familiar with a different operating system, and I don't want to kill this XP system before I get a useable Linux system up. Dual boot always leads to a system crash and the need to rebuild and recover. Oh well, soon; and I am really tired of needing to reboot every day or two as this thing locks up at the most annoying times. pg

  4. adolfogiurfa says:

    And what about Apple's OS?

  5. E.M.Smith says:

    @P.G.:

    Well, I’d not be complaining if it had a full BSD on it and I could just stop at that point. But it only has a BSD kernel and some tools to boot onward. IFF it doesn’t prevent me from loading the rest of the BSD kit, I’d be happy (but that would be an incredible security hole, one even MS is not dumb enough to allow…)

    So basically, they load a crippled BSD ’cause it works, to then tattle, rat, and police you, so you can have the “privilege” of running MS OS under their watchful eye and direct control; and if you don’t like it, you can throw away the box…. or get into Bios Hacking.

    One good thing: This will train a whole new generation of Bios Hackers…. Heck, I’m already thinking: “If it’s on the disk, I can do binary overlays to disk. If it is in a ROM, I can burn a replacement ROM… or maybe just a clip on around it…. it’s got to have an error mode exit…”

    Heck, if it expects to boot BSD, just find the actual low level boot load code, and flash your own BSD with it into a replacement chip…

    But the good news is that I've found a few dozen single board computers in the $Hundred-ish range with all sorts of nice hardware that will be very nicely fast with Linux. The other good news is that there will undoubtedly be someone selling a vanilla box with the UEFI left open for booting "Other Operating Systems", and perhaps even with a spot where you can 'flash' your desired boot security checks – meaning I can make a Linux box that can also spoof MS if I want. ( I wouldn't bother, but there are other folks…. "Look, I got MS to validate my signature, from a Linux box!"… )

    There are so many computers running Linux and BSD (full distribution) out there that a market exists for generic boot boxes.

    The other bit of good is that Linux is so much more efficient, that all the existing Windoz 7 boxes have me covered for “more hardware than I need” for the next couple of decades… Heck, I’m happily running on an AMD 400 MHz chip with 128 MB of memory and don’t feel any need to upgrade… so one of those multiple core GHz things with memory measured in GB is just so way much overkill…

    Ah, well, back to looking at specs on Single Board Computers to see what else turns up in the process of figuring out if all the parts will work together…

  6. Larry Geiger says:

    Virtual. Just get enough ram. MS never knows what or where it’s running.

  7. tckev says:

    From http://en.wikipedia.org/wiki/Unified_Extensible_Firmware_Interface
    you will notice
    In January 2006, Apple Inc. shipped its first Intel-based Macintosh computers. These systems used EFI instead of Open Firmware, which had been used on its previous PowerPC-based systems.[29] On 5 April 2006, Apple first released Boot Camp, which produces a Windows drivers disk and a non-destructive partitioning tool to allow the installation of Windows XP or Vista without requiring a reinstallation of Mac OS X. A firmware update was also released that added BIOS compatibility to its EFI implementation. Subsequent Macintosh models shipped with the newer firmware.
    So Apple has been using this style of system for a while. But that is the price the user pays for a proprietary system. What MicroSlug wants to do is a bit of 'me too' profit protection, by trying to say a general hardware computer platform was designed only for their operating system. The Linux groups have been trying to wrestle control of the keys away from MicroSlug for a while but so far have only gained the small concession of being allowed to issue user keys. There is also a problem with hardware upgrades on PCs that were originally Windoze machines and then remade for Linux (or any other operating system) – who issues keys for that new hardware? Without the new hardware key it's a doorstop. MicroSlug insists it is themselves only! The whole thing is a nasty PITA and a good source of future litigation.

  8. adolfogiurfa says:

    Doesn't it violate US anti-trust law? Everything under the sun is born, grows, gets old and finally dies. That is the ever-present sine law, and it seems Microsoft has been fatally injured by Apple's big sales. In any case we are in need of a free and open system.

  9. tckev says:

    I must admit it was about 2 years ago when I first looked at this, and things have moved on a bit, so now I'm playing catch-up.
    Basically Apple can get away with it because the whole shooting match is theirs – hardware and software. But hackers have got through.
    MS on the other hand are pushing for maximum control while leaving the door half ajar to everyone else. I've just downloaded the overview of the new Intel spec and some things have changed a lot.
    1. Open systems – Linux et al. have a greater say, with Fedora, Novell, IBM and HP there.
    2. Original OS vs user-changed OS (M$ vs Linux installed) looks to be sort of fixed – you get a 'non-optimized' boot-up.
    The difficulty of user hardware upgrades still looks like it's there, but I have not looked at the full spec yet. That's the problem with this hardware-key/OS-key scheme: the Boot Manager has to arbitrate.
    ARM chip devices tend to be hardware and OS specific so I suppose M$ gets away with it.

  10. p.g.sharrow says:

    tckev says:
    6 July 2012 at 2:25 am
    Hardware upgrades to our MS systems have been a nightmare for the last dozen years. Every time we needed to add a newer piece of equipment it would demand a total system upgrade, as everything else was obsolete or out of date. I have been fighting Microsoft for over 25 years and it gets harder and harder to do workarounds of this enforced obsolescence of hardware and software. pg..

  11. Kevinm says:

    Is this designed to stop my kids from using the Linux dd command in a virtual machine for replicating copyrighted material? The mass market seems to otherwise stop at the "If you just use Micro$oft…", or eject to Apple.

    They are targeting a minority that cares, but only one of them needs to find a way to thwart it; then the knowledge will pass along. Futile misallocations (iPad has corrected that to missile locations twice… Grrr! But it's so portable and handy) of resources like that are how Microsoft is losing the market to Apple.

    Apple is even more restrictive, but that's designed in, accepted, and paired with actual research and development of things its customers do want. If Microsoft shifted its protection spending into development spending like Google does, that near monopoly they had might still be flourishing.

  12. gallopingcamel says:

    Windows 95 was a great operating system so I rebuffed Bill Gates' attempts to make me replace it with Windows 2000, Windows NT etc. I would be using "95" today but for the fact that some of my newer applications refused to run unless I "upgraded" to Windows XP.

    Back then I tried Knoppix and Red Hat Linux but lacked the software skills needed to make them work properly. In the end I gave up and bought several copies of Windows XP. Eventually I got to like XP almost as much as Windows 95 but Bill Gates was not done with me. He forced me to buy a copy of Windows Vista.

    Trying to make Windows Vista work on my main (quite complex) computer was such a dreadful experience that I gave the Vista disk away (with malice aforethought) and switched to Linux for good. Freedom at last! Fortunately the Linux world had advanced to the point that even a software dummy like this camel can handle it.

    Five years later all six of my computers run Ubuntu Linux ranging from “Hardy Heron” (Ubuntu 8.04) to “Precise Pangolin” (Ubuntu 12.04). My applications now run without Windows with one exception. My Optical Time Domain Reflectometer has a chip set for its serial port that has no Linux driver so whenever I need to make OTDR measurements I have to fire up Windows. Just one machine out of six for less than eight weeks per year.

    My business runs on Linux except for payroll that my bank (Bank of America) used to take care of. Sadly, the top management of BoA has deteriorated since the days of Hugh McColl.
    http://joannenova.com.au/?s=Bank+of+America

    In protest at this lunacy I am in the process of closing my BoA accounts. Can anyone recommend any good US payroll applications for Linux?

    Thank you Chiefio for the "heads up". I don't expect to be buying any PC hardware in the next few years, but if something breaks I think you are telling me to be careful to avoid replacements that need Windows 8. I am reminded of the "Sandisk" flash drive that came with "U3" that was irritating until I found out how to get rid of it.

  13. gallopingcamel says:

    adolfogiurfa says (6 July 2012 at 1:23 am),

    The marketplace is taking care of overreaching by MS much more effectively than any kind of legislation. Just take a look at the relative capitalization of MS and Apple.

    The Windows 8 ploy from MS will backfire because PCs are being overtaken by more compact devices that don’t use Windoze. Then there is the tiny minority like this camel who are too cheap to buy anything from MS or Apple.

  14. princessartemis says:

    P.G.: “Dual boot always leads to a system crash and the need to rebuild and recover. ”
    I had this issue with a dual boot system where Windows and Linux were on the same disk, but not when they were on two different drives. There are things which Windows gets upset about if it isn't the master boot drive, such as certain updates, but otherwise I've not had an issue with booting into one or the other via grub2 when they are on different drives.
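    For a two-drive setup like that, a hand-written GRUB2 entry to chainload the Windows disk is usually all it takes; a sketch, with the disk/partition numbers as placeholders for your own layout:

```
# Appended to /etc/grub.d/40_custom, then run `sudo update-grub`.
# (hd1,msdos1) is a placeholder: second BIOS disk, first MBR partition.
menuentry "Windows (second drive)" {
    insmod part_msdos
    insmod ntfs
    insmod chain
    set root=(hd1,msdos1)
    chainloader +1
}
```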

  15. When the IBM PC came out, the technical docs had a listing of the BIOS, which was needed if you wanted to write device drivers. I did – where DOS2 had a disk-size limit of 32MB, the company I worked for at the time started selling 40MB disks, so I needed to write a device-driver to split it into several smaller partitions. Later versions of the BIOS were written in C, and we didn’t get the source for that, but one of the contractors I knew later on had a hand in the Phoenix BIOS. He was somewhat disparaging about the quality of the code.

    It's looking like we might have to get a new open-source BIOS, so we can change the Micro$oft-inspired version to something more user-friendly. Needing to be online before you boot up is just not a good idea – lose the net and you lose everything, and the machine would be effectively useless out of range of a net link.

    As princessartemis says, Windoze has problems when it is not the only OS on the disk, but running a separate disk for other OSes works fine, and most versions of *nix expect to co-exist. One way of getting this is to use a plug-in external disk, so Windoze never sees it when it’s running, and when you want to run the other OS you plug in and boot from it. For the last few years I’ve been running Ubuntu only rather than dual-booting – I just can’t stand the continual updating and not being able to just switch the system off when I’m done.

    W95b was a nice stable OS, and the interface has gotten steadily worse since then, needing an ever-bigger machine just to run the OS. Security issues mean that you need a virus-checker that is ever-more intrusive (It’s For Your Own Good) whereas in Linux I’d have to put a password in in order to install a virus.

    Maybe a good idea is to stock up on boxes that run *nix, since we’ll need something to bootstrap the alternative effort of getting a solid BIOS written.

  16. Pascvaks says:

    Left Field Q: Is there a difference in this issue if you’re operating a website? (IOW- Is MS the Big Daddy Dictator of Website software as well as PC software in homes/offices? And, who’s the Big Cheese in the ISP software market? MS again?)

    PS: If someone were to try to ‘zero out’, or dump, all software on a new computer and install new ‘other’ programs, is there something still in there that’s going to mess up that attempt because the hardware folks have a deal with MS?

  17. j ferguson says:

    I’m having my usual migration nightmares with moving my HP G42 from a 500 gig to 1 TB disk. I have XP and Linux on the various partitions. I need the XP because we need a second machine with Quicken on it and the Wine installation for Quicken is far too tricky for prime time.

    I've used Apricorn Update Suite in the past (they ship a really impressive USB drive enclosure which can handle both SATA and IDE 2.5 inch disk hardware). In the past, I hooked the new drive up using the Apricorn case, and ran Apricorn pretty much default all the way. It created partitions proportioned to consume the new bigger disk and then wrote the contents of the old partitions to the new ones – more space on each partition. It even made the new XP run on E: just like the old one, AND it was bootable.

    I was then able to fix the loader with a Live USB version of Linux using something called boot-repair.

    This time, no such luck. The new Seagate Momentus (it's momentous all right – especially the time you will invest getting the education you didn't know you needed) 1TB drive has "advanced formatting", which means 4kB sectors instead of 512. Apricorn bungled the partition start locations, to GParted's dismay. AND XP would hang while booting, possibly because it now thought it was C: instead of E:, which it had been. So I re-partitioned the drive and loaded fresh installations of Ubuntu 12.04 (gnome classic) and XP. Then I copied all the stuff from the old drive, but of course I now have to re-install all of the XP applications.

    Many years ago, I made a moderate amount of money moving data from Jaccard 8″ floppies to Xerox 5 1/4 inch floppies using a CP/M utility called du which copied disks bit by bit. Worked great and no-one else on the North Shore (Chicago) had figured it out (yet).

    There is a Linux utility called dd, which I think does the same thing and I’m now considering starting over.
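    dd does indeed copy raw bytes. A tiny sketch on ordinary files – the same flags apply to /dev/ disk paths, which is where the danger lives (double-check if= and of= before pressing enter):

```shell
# Make a scratch "disk image" and clone it byte-for-byte with dd
dd if=/dev/urandom of=old.img bs=1024 count=64 2>/dev/null
dd if=old.img of=new.img bs=1024 2>/dev/null

# cmp is silent and exits 0 when the copies are identical
cmp old.img new.img && echo "identical"
```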

    This is like rebuilding tunnel-type VW transmissions, a skill i will never again use and I wish didn’t clutter my dwindling capabilities.

    Adobe has quit supporting AIR for Linux systems and has a note on why on their web-site. You need AIR if you want to run the Compleat National Geographic on Linux. Adobe has been good enough to archive (there, I said it, the magic word unknown to climatology) the incarnations which did work. We use Compleat National Geographic for trip planning. At one time, pre-1950, they used to have more words than photos and the stuff was mostly intelligent – good priming for a trip to civilisation (UK).

    As you can see, I’m writing all this instead of starting on SWMO’s list for the day.

    Alas.

  18. DirkH says:

    It will be possible to run Linux VMs on these machines. Just sayin'. Microsoft can't prevent that, and would be stupid to try. Too many industry environments run on Linux.

    But thanks for the news, ChiefIO, I missed it. Will stick to my existing XP and Win 7 machines – I wanted to skip Win 8 anyway because every second windows version sucks. Seems like they stick to this rule. Maybe they’re cursed.

  19. tckev says:

    Pascvaks
    First question – no
    Most of that is done by third parties. The HTML standards don't *care* what platform the code comes from; it just has to conform to the rules.
    "who's the Big Cheese in the ISP software market?" It's a mixture. The big boys like Google and Oracle run their own Linux-type systems. Indeed the largest supercomputer systems are Linux/proprietary systems. Lots of stock exchanges have gone to Linux systems (many from M$ :)). Going down the scale of size you've got many systems – IBM, Cisco, HP; M$ is there but is not the governor, though M$ is growing its market share.

    Your question “PS: If someone were to try to ‘zero out’…” I’m not sure what you are asking.
    If you are asking can you clone a system (operating system + applications) across from one 'puter to the other – no! That is what this technology is specifically trying to stop.
    If you are trying to move application programs (not operating systems), the UEFI system makes no difference. That is an operating system security issue.

    IMO the big issue with the hardware is that it kicks the used/reused market into touch. Gone is the hardware hacking where you could remake systems from a pile of rejected parts.
    As Intel says in their literature, "A core aspect of UEFI Secure Boot is its ability to leverage digital signatures to determine whether an EFI driver or application is trustworthy." What they are basically saying is 'we are locking the chips and assemblies in place on this computer board and that's the only place where they'll work.' Yes, there's a long torturous method to get round this, but I cannot see anyone ever using it.
    This also goes against the 'sustainable, nice to the environment' messages that everyone chants. Got a hardware problem? Change the whole motherboard. Damn it, change the whole 'puter. Not very sustainable.
    Also I'm still not convinced about what happens when a hard drive dies in a computer. It's a hardware *and* operating system security key exchange nightmare!

  20. Ed Forbes says:

    DirkH says: I wanted to skip Win 8 anyway because every second windows version sucks. Seems like they stick to this rule. Maybe they’re cursed.

    LoL… this is so true. I still remember when they released DOS 4.0….. It deleted data on the drive.

    4.01 came out quick !!!

  21. adolfogiurfa says:

    @Ed Forbes: The "winner is….." Windows Vista; it could never be fixed, it duplicated "My Documents". I guess that all the difficulties are caused by the need of using the OS as a means of spying on people. That slows processing and it is naive and childish: What is going to happen will happen anyway and no existing power in the world would stop it. An adult knows that, a child doesn't.

  22. George says:

    Microsoft has been in a phase where they alternate crappy releases with good ones. There was Windows, then Windows ME which sucked, then WinXP which was good. Then there was Windows Vista which sucked, and then Win 7 which was good. If they are true to form, Win 8 will suck rocks and will gain no traction in the marketplace and so they will come out with Win 9 which will be usable.

  23. Paul Hanlon says:

    I'm not getting this. If we are effectively getting an operating system with UEFI, why not just use it as an operating system? OK, so some drivers don't get loaded first time round; just add them after the initial checks have been made. Because the operating system is loading from ROM, we'll have the closest thing to instant-on.

    Another thing that wasn’t really clear is what happens if I buy a Windows 8 enabled PC but immediately format the disk and load Linux, do I still have UEFI on my PC, with the potential backdoors still in place? Just because it is open source, doesn’t mean that that is what is on your PC.

    I can see a big market in embedded system embedding. Think I’ll brush up on my QNX skills.

  24. tckev says:

    p.g.sharrow :
    Sorry I missed your comments above. IMO the big problem with M$ and upgrading is that they make the assumption that your system is their system. You have older software or hardware that worked perfectly fine. Then an M$ upgrade changes how an interface works in Windows and, bang, your old stuff is out. Usually you're out trying to find the original designer/manufacturer of the old stuff to get driver/patch updates from them. By far the worst thing that happens is when M$ takes over a company that makes your favorite hardware or software – and there's a lot of them.
    See – http://en.wikipedia.org/wiki/List_of_mergers_and_acquisitions_by_Microsoft
    M$ usually dump most of the old companies product line (retaining the IP though), then incorporate the bits M$ wanted into their own products. All support for the old stuff withers away.

    Problems with dual boot systems are legion. One is that an M$ upgrade tool too often will want to change bits in the MBR (the M$ system start-up) for (no) good reason, and this action destabilizes the GRUB bootloader. Quite often this happens without the user being aware till it's too late.

    This is not to say Linux is without its own fair share of problems – the move from GRUB1 to GRUB2 was a mess. Same with some of the tools that were supposed to get you from a LILO boot to GRUB. BSD and Linux systems that cannot correctly identify your hardware are the other big bugbear, and that causes many months (years?) of tracking problems down. I will not go into the man-years of desperate work that many people (including me) have expended trying to get audio to play right on some Linux boxes. My only caveat is it's getting better.

    Personally I've found that sticking with one computer manufacturer that is known to work (dual booting) saves a lot of heartache. My favorite is rebuilding used IBM/Lenovo boxes as they are quite cheap, very well built, plentiful and well supported for decades. The full maintenance manuals are available online from IBM and Lenovo.
    Big companies tend to buy/upgrade systems in huge numbers of PCs and servers at a time, so waves of Dells, HPs, and IBM come onto the ‘second user’ market often.

  25. gallopingcamel says:

    P.G & princessartemis,

    Five of my six computers still have multi-boot that includes Windows and Linux. At first I had to revert to Windows fairly frequently while replacing my applications one by one. The sixth computer has no CD or hard drive (to maximize battery life) so I could not spare 30 GB for Windoze.

    As long as you leave Windows in its own (roomy) partition and then install Linux after it, in partitions that Windows cannot see (Reiserfs, ext4 etc.), multi-boot should be trouble free.

    Install Windows on top of Linux and all kinds of problems will follow.

  26. Sparks says:

    UEFI Sounds similar to how they’ve engineered their proprietary platform on the XBOX 360, is this the idea?
    It could mean future M$ PCs could be locked out from the Internet unless individual users pay an additional subscription, possibly through buying M$ points on top of their ISP charges to access a Microsoft-controlled network.
    It may also mean that application developers would have to fork out serious funds to develop and deploy their programs on the M$ network.

    Just a thought!! :)

  27. tckev says:

    Paul Hanlon :
    have a read of M$ site
    http://blogs.msdn.com/b/b8/archive/2011/09/22/protecting-the-pre-os-environment-with-uefi.aspx

    Look at the comments and concerns below the write-up. It seems to be a matter of whether the OEM system builders (hardware manufacturers, NOT users) enable or disable the UEFI security features.

  28. Pascvaks says:

    @tckev – “Your question “PS: If someone were to try to ‘zero out’…” I’m not sure what you are asking. If you are asking can you clone a system (operating system + applications) across from one ‘puter to the other – no! That is what this technology is specifically trying to stop. If you are trying to move application programs (not operating systems) the UEFI system make no difference. That is an operating system security issue.”

    What I was trying to say/ask was – if a new computer is bought without software, or you wanted to dump everything (software) that it came with, and load your own software, would they then be OK; or is there something (OS?) that still keeps you guys from doing what you want to do and building your own ‘systems’ out of parts and pieces from the junk yard (or your own stash of stuff)?

    PS: If MS is the problem, isn’t someone going to offer a fix or totally new software package (and/or OS) for $? Hope that makes more sense.

  29. Gallopingcamel – Linux on top of Windows is OK till you need to reinstall Windoze after a year or so of use, when it’s getting too slow. If so, then back up the Linux partitions before doing any systems work. Running a separate disk avoids that problem nicely, though at a slightly increased cost. Having the Linux system on a separate disk also makes life easier when you have hardware problems (except a disk crash), so would be worth it for business. Having a bootable Windoze disk and shifting it between boxes is not possible, since you’d need to get permission from M$ each time it found new hardware.

    I’ve just found a nice piece of layout software (SeeTrax XL Designer) that runs on Wine, which solves a long-term problem of not having a decent layout system in Linux. The only thing I’d need Windoze for now is Turbocad, which won’t run under Wine. The free alternatives all pretty well suck – even the 2D ones – and there’s no 3D CAD apart from artistic systems that don’t give dimensions. Life’s too short to spend the effort writing or modifying a CAD system to the way I’d like it, considering how often I’d use it now. As you may have guessed I’m not into accounting software, so can’t suggest any decent offerings for this.

  30. Pascvaks – building processor boards using modern components is no longer something you can do in a garage on a small budget. Most of the stuff will be SMD and the processors will be BGA (various sizes, and getting to micro-BGA now). Soldering needs accurate process control and probably needs to be done in an inert atmosphere, too. As such, look for something that you can shirt-tail. A modern phone has more power than the old IBM PC by a long chalk, and the Raspberry Pi is much the same. Need more power – add more boards and use the Linux multi-processor support. For the same processing power, you won’t get down to the price of the standard PC, but that’s a price you pay for getting independence. I suspect that China may produce boards that don’t have this problem, and use an older version of BIOS without the EUFI hassles. May be worth a trip to Hong Kong. There may well end up being a black market in motherboards that are OS-free. If M$ take over the net, though, they may look for the M$ code that allows you to use the net – if you don’t have it you’ll be blocked, to protect everyone else from your piratical attitudes.

  31. gallopingcamel says:

    @ Simon Derricutt
    Thanks for those great insights. Currently I am planning to delete Windows from the five of my computers that still have it.

    I will follow your recommendation by installing a copy of Windoze XP on an I/O Magic external USB drive. I will also try installing Windoze on a 32 GB flash drive (belt and braces). I only have one computer that needs to run Windoze anymore (six to eight weeks per year) so carrying one or more external drives for that computer is the way to go.

    You mention the Linux Windows application layer (aka “Wine”). I have been astonished to find that some of the most useful and powerful Windows “Apps” run flawlessly via Wine. Some run better with Wine than they ever did with Windoze! Here are a few examples:

    Photoshop!!!!!!!!!
    Dreamweaver (the entire “Studio” application set). Linux still lacks a decent html editor.
    Windows games (Hearts, Solitaire, Spider and Freecell)

    My accounting software used to be from Intuit (Quicken, Quickbooks and Turbo Tax). While most of these programs run under Wine they tend to work in a flaky manner so I eventually gave up. My accounting is now Linux based:

    GNUCash for management accounts
    Tax Act for personal and 1120S taxes

    Since closing my Bank of America accounts earlier this week I need to find a way to handle the payroll. My first try will be based on “Non-free” Linux here:
    http://www.patriotsoftware.com/Payroll-Software/?pc=MPPC45&utm_source=MSN&utm_medium=text&utm_content=payroll%2Bcost&utm_campaign=PAY

    As my employees like getting paid promptly, this is “Job #1” so in a week or so I should be able to report success or failure.

    When it comes to serious CAD work Windoze apps like AutoCAD have long been outclassed by software running on Unix or Linux. Prior to “retiring” ten years ago I was involved in particle accelerator design using cell codes, solids modeling, finite element analysis, programmable logic etc. We did use some AutoCAD but the heavy lifting required EPICS, ProEngineer, CATIA or Concept Modeler running on Sun workstations.

  32. E.M.Smith says:

    @Pascvaks:

    Yes, a “just dump MS and reload” will fail as the boot process now checks in with Microsoft and gets validation of what is allowed to run, and that will be MS, not what you installed. It isn’t a “deal” between hardware guys and MS so much as a Micro$oft mandate. Do it or else no Windows 8.

    (Though they did back off, per the article above, for now. So if you root around long enough and IF the hardware vendor lets you, you might be able to turn off the MS Mandate.)

    Most ISPs and server farms these days are Linux or Unix based (and have been for many years). Microsoft keeps trying to become important there, and they have occasionally had some success (NT had a memory leak requiring a weekly reboot, but one client still used it… Most of that was fixed in Windoze 2000). At this point, were I still choosing equipment, I’d have NO critical infrastructure services on a Windows box other than those that only ran on Windows.

    I’ve worked with many vendors and ISPs and the ones that made the most money had a floor full of Linux / Unix boxes. I ran the computers at a small software company for a couple of years. We ran BSD Unix for most servers. White Box PCs. One had been “up” for over a year non-stop and we only shut it down as my “walk around check” caught that not much air was coming from the power supply fan… it had gunked up from a year+ of non-stop air flow… At other sites, we did nearly weekly maintenance on the MS stuff…

    @J. Ferguson:

    I’m sorry for your dos ;-)

    Yes, ‘dd’ is a unix copy command. DD stands for “Convert and Copy” (as “cc” was already taken by the C compiler… It’s a Unix thing… don’t ask…) It lets you do just about any kind of copy you could like. It even does a ‘swap bytes’ for endian things… I’ve done whole system backups using it, sometimes cross-system and with all sorts of “oddities” that it would iron out… It’s part of a standard “move a disk image here to there” script thingy I wrote decades back.
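    A safe-to-run sketch of both tricks – the plain copy and the byte swap. The file names here are throwaway demo paths, not anything from a real backup script:

    ```shell
    # Clone a small "disk image" with dd, then verify the copy bit-for-bit.
    printf 'hello, dd' > /tmp/dd_demo_src
    dd if=/tmp/dd_demo_src of=/tmp/dd_demo_copy bs=4k 2>/dev/null
    cmp /tmp/dd_demo_src /tmp/dd_demo_copy && echo "copy verified"

    # conv=swab swaps each pair of bytes -- the 'endian' trick:
    printf 'abcd' | dd conv=swab 2>/dev/null   # prints "badc"
    ```

    On a real disk you point if= and of= at device nodes instead – and are very careful about which is which.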

    @DirkH:

    It is common practice to have a ‘feature’ release that adds all the new bugs, and then a “patch” release that fixes the new features. Most folks do this on the minor numbers, and often the “odd” releases are the patch releases. So most of the time I avoid “even” minor numbers. Some folks have “minor” feature releases on minor even numbers and MAJOR overhauls on large even numbers. That makes something like “8.0” nearly lethal, while 7.9 would indicate a very stable major feature set with several bug-fix releases and a few important new features worked in, but largely patched. I NEVER install a 2.0 release…. Ever… Just too many burned fingers. It is often worse than the 1.0 release, where things are just missing due to being broken. By 2.0 they got those working, added more, and have a raft of bugs…

    In the case of M.S., they seem to have a major bug, er, feature, release on even numbers and then keep on patching and bugging with each release…
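    That rule of thumb can be sketched as a tiny shell loop (my own illustration of the heuristic, not any project’s actual numbering policy):

    ```shell
    # Classify releases by minor-number parity, per the odd = patched heuristic.
    for v in 7.9 8.0 2.0 3.1; do
      minor=${v#*.}                      # text after the first dot
      if [ $((minor % 2)) -ne 0 ]; then
        echo "$v: odd minor -- likely patched and stable"
      else
        echo "$v: even minor -- fresh features, fresh bugs"
      fi
    done
    ```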

    @Paul Hanlon:

    You get a “stripped” BSD. The applications are missing (and most of the tools) and you are locked out of it. Like the Linux hiding in many small routers. It is there, you just can’t get to it.

    Or your Android phone. It, too, is a Linux.

    Yes, wipe the disk, and the 100 MB “bios” is still there. If it’s got a bug, it’s still there. It will then do a ‘key check’ with MS and not boot your stuff, as you are not MS approved.

    You then get to root around in its guts to see if you can turn off that “feature” (unless on an ARM box where even that is forbidden).

    @Sparks:
    Don’t give them any ideas!

    @Tckev:

    The rule WAS going to be “must be locked out to get a Windows 8 approval”, but MS backed off after some uproar so that vendors can now let you shut it off – maybe. Except no big enough noise was made about ARM chips (largely tablets and phones), so those were left as mandatory lock. I fully expect x86 to end up there as soon as MS can make it happen.

    @Pascvaks:

    IFF you buy new parts intended for non-MS use, this ought not to be a problem.

    IF you buy a Windows 8 box, it will be a mandatory “check with MS before booting” enforced by the boot process, and removing Windows does NOT remove that “check and ask MS for permission; and check that the drivers have keys that match”. Drivers you have now erased… So your hardware is dead. No, you can’t just wipe Windows 8, install Linux, and go. In fact, I think you can’t even move to new hardware on a hardware failure…

    This is going to seriously crimp the style on disaster “midnight fix” options…

    It is a way to LOCK the hardware to SPECIFIC software with security keys mandatory checked over the internet at Microsoft or The Box Will Not Boot.

    @GallopingCamel:

    I had zero problems just handing the payroll sheets to ADP and letting them do it all… Cheap, fast, and 100% accurate. http://www.adp.com/ Most banks can connect you with them.

    Wells Fargo does, IIRC.

    FWIW, I spent most of today playing with VirtualBox (was Sun, now Oracle) and it seems to work fine for Virtual Machines. Free download.

    https://www.virtualbox.org/wiki/Downloads

    It was a bit slow with Ubuntu (but I was running it a bit resource-light and in an encrypted space via TrueCrypt). Yes: take a Windows file, put an encryption layer over it with TrueCrypt, then put a virtual machine in it, and then run a fat Linux like Ubuntu in that VM…

    I could likely make it faster by turning on more than one CPU core and giving it more than 512 MB of memory… Did I mention this was on a laptop? So not the fastest disk on the planet… and it has paging on top of paging…

    But it worked. I may try a non-encrypted “disk” and more resources and see how it goes. Frankly, just having a bootable Linux on this HP laptop without doing the whole disk format dump / restore was worth it.

    I also had two other Linux ports running in their own VMs and they were faster (and smaller).

    Most likely this is the last Windows box I’ll ever buy. The only reason I wasn’t running Linux on it already was the lack of video driver and the funky all physical disk partition format. This lets me just use it as a base under the Linux.

    But I need to find a less resource-hungry distribution than Ubuntu. (That it spent an hour talking to the internet at the end of the install didn’t make me happy either. I want an install from an iso image to be a private thing…) Maybe I’ll see if I can find an old RedHat 7.2 image… it was a nice, stable, and fast / efficient release…

    Oh, and as per “the future”:

    There is an explosion of ARM chip machines happening. They are a RISC platform that uses near no power (like 2 Watts…) and performs as well as x86 in much smaller packages. They are widely used in phones and tablets, so Android and similar Linux ports are showing up all over the place.

    I predict that in a year or two, there will be so much stuff on ARM chips that the x86 will not be that interesting. As you can get a “system on a chip” “box” that is about a 2 inch cube and runs full Linux, and for about $100, I’m just not seeing the future for Windows….

    http://www.linuxfordevices.com/c/a/News/SolidRun-CuBox/

    I’d have already bought one, but they were sold out… (as were several other similar devices… it is clearly a ‘hot area’…) Realize this “box” is being sold as an HTPC – Home Theatre PC – and does video decode fast enough to directly drive movies to a large TV with the built-in port.

    Solid-Run is shipping an open source mini-PC platform for developing Android TV and media center apps. Measuring 2.17 x 2.17 x 1.65 inches and consuming less than three Watts, the CuBox runs Android 2.2 or Linux 2.6 on an 800MHz Marvell Armada 510 CPU, has 1GB of DDR3 memory and a microSD slot, and includes eSATA, USB, infrared, S/PDIF, HDMI, and gigabit Ethernet interfaces.

    So with that much in something the size of a Wall Wart and with no MS pain, I’m having trouble thinking of why anyone would put up typical mail or web servers or DNS or… on anything else. It will have “issues” with code tied to x86 for a year or two, but as tablets come to dominate (and their ARM chips with them) applications are going to be Android-friendly…

    I’d not try to build an IT Shop out of these today, but I wouldn’t put in a Microsoft web server or file server with that kind of power available that cheap. (There is another company with a ‘personal cloud’ server based on something like this. Still teething a bit, as some folks discovered that stuffing a thick 7200 rpm disk in the “cabinet” overheated it, but the one with some air space and a 5800? rpm drive worked well.) Still, a 1 TB file server in a wall wart?

    http://www.tonidoplug.com/

    So we’re reaching the point where shortly your “infrastructure” will be a brick on the wall (files, network ‘cloud’, media storage, firewall) and your “terminal” will be your tablet and your big screen TV. I’m not seeing where MS is needed. Me? I’d add a 2 inch cube for general purpose Linux playing…

    For a while one of them will need to be an x86 (as Open Office doesn’t look to be on ARM at the moment), but with the rate things are moving to Android on tablets (and thus on ARM), it is only a matter of time.

    So while EUFI has me PO’d, I’m seeing my own “development path” leading away from x86 and recycled Windoze boxes anyway…

  33. Pascvaks says:

    It may be OT, but it seems the weakest link in the chain is the ISP. MS appears to be taking advantage of ‘individuals’ and the ISPs are letting them do it – maybe because they can’t do anything, or won’t?

  34. EM – You probably know that cross-compiling has always been a feature of GNU C, so re-compiling anything you can get the source of (so – most Linux stuff) to run on the CuBox is not going to be that much of a sweat. Try http://www.openoffice.org/download/source/ if you feel like playing (could be a while…). Linux is not tied to x86 processors, and the only reason we have x86 dominant is that IBM chose it in the first place and M$ settled on it and made it relatively painless for non-nerds to use. So maybe keep the high-capacity desktop for compiling stuff, and then run it on the CuBox. I never really got into C, but I’ve got the books for when it’s necessary. I may still have a Red Hat CD or two, too – I’ll have to go search. If you can’t find it on the web I’ll go through the CD collection. I use Ubuntu now since it’s less hassle updating it, though there’s often a problem with sound chips. I also have Mandrake and Slackware of various vintages, and maybe the odd Suse or two.
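    A sketch of what that cross-compile looks like with GNU tools. The ARM triplet name below is an assumption (toolchain names vary by distro), and the script falls back to the native compiler if no cross-compiler is installed:

    ```shell
    # Same C source, different target -- one compiler name away.
    printf '#include <stdio.h>\nint main(void){puts("hello from the CuBox");return 0;}\n' > /tmp/hello.c

    CC=arm-linux-gnueabihf-gcc            # assumed ARM toolchain name
    command -v "$CC" >/dev/null || CC=cc  # fall back to the native compiler
    "$CC" /tmp/hello.c -o /tmp/hello && echo "built with $CC"
    ```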

    Gallopingcamel – thanks for the CAD suggestions – I’ll try them out sometime soon. XP on a pluggable disk sounds like a good solution for you, since it’s not tied to specific hardware like the later versions. It may take a little longer to boot up when it’s put on a different machine than last boot-up, but otherwise it’s stable and good enough for most Windoze applications. One other solution that’s possible (but I haven’t tried it) is to copy the Windoze DLLs into the Wine tree, so that you are using mostly Windoze in a virtual machine. Since you have presumably legal copies of XP, this ought to be legal and acceptable, and may make your applications happy again.

  35. Chiefio,
    Thanks for your comments which are most helpful to this software challenged person.

    Your comments on Ubuntu struck a chord. I was really happy with Ubuntu until the new Unity interface came along. Here are just a few of things that went south with “Unity”:

    “Cosmos”…the best screen saver ever. Ten striking Hubble photos cycling slowly…….
    The ability to customize “Panels” and icons.
    Can’t get “Force Quit” to work anymore
    Log in & log out is flaky. Often refuses to re-boot!
    Ubuntu 12.04 won’t install on a non-PAE CPU.
    File transfer problems on my home network.

    Doubtless you could fix all these things in short order but I was attracted to Ubuntu because it was user friendly and worked “Out of the Box”.

    While Ubuntu saved me from Windoze it may be time to move on to a brave new world filled with ARMs and their offspring.

    Discerning folks have been PO’d by Microsoft many times before but this EUFI thing is blatantly unfair competition, likely to have unintended consequences that Microsoft will not enjoy. At the very least it will accelerate the turning of the tide towards non-Windoze solutions. I think MS might have got away with this five years ago but given the decline of the PC their timing is lousy.

  36. adolfogiurfa says:

    @E.M. : “As above so below”: It seems that such EUFI replicates in computing what some folks are trying with the “New World Order”, then psychiatrically speaking, what our E.M. is trying is but his subconscious trying to remove such a menace from our Earth´s hard disk!!
    Don´t worry: We are ALREADY in the process of deletion…

  37. Petrossa says:

    Luckily there is only one department in MS working on the UEFI lock-in, and 1000’s of volunteers working on how to get rid of it. We already had a first attempt in 2011 with the Windows 8 Bootkit.

    Furthermore:

    In a nutshell, all the major distributors of Linux need to do is have their own key signed through the Microsoft developer portal at a one-time cost of $99. The $99 does not go to Microsoft; it goes to Verisign. And after a distributor has a signed signing key, they can sign all the modules they need to, and UEFI (aka Secure Boot) will work with Linux and Microsoft.

    If you wish, you can bypass secure boot, or add your own signing key to the computers under your control, or finally register your own signing key (at a cost of $99) and distribute software of your own globally.

    http://www.redhat.com/about/news/archive/2012/6/uefi-secure-boot
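    For the curious: generating such a signing key is garden-variety openssl, and only the countersigning step involves the Microsoft/Verisign portal. A sketch under assumptions – the file names and Common Name are invented, and sbsign (from the sbsigntools package) is only invoked if it happens to be installed:

    ```shell
    # Make a 2048-bit key and self-signed cert -- the raw material a
    # distributor would submit for countersigning.
    openssl req -new -x509 -newkey rsa:2048 -nodes -days 365 \
      -subj "/CN=Demo Secure Boot key/" \
      -keyout /tmp/MOK.key -out /tmp/MOK.crt 2>/dev/null

    # With sbsigntools present, a kernel image is then signed like so:
    command -v sbsign >/dev/null \
      && sbsign --key /tmp/MOK.key --cert /tmp/MOK.crt \
           --output /tmp/vmlinuz.signed /boot/vmlinuz 2>/dev/null \
      || echo "sbsign not installed (command shown for illustration)"
    ```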

  38. adolfogiurfa says:

    @Petrossa That would be like letting them stamp a fire mark on your back, like cattle! They will own you as soon as you do that.

  39. j ferguson says:

    Galloping Camel,
    I’m running 12.04 with the Gnome Classic GUI. Google to get the additional stuff to run Gnome Classic. Machine is an HP G42 -AMD Athlon and I’m using the 32 bit version of Ubuntu.

    After you get Gnome Classic running, move the cursor over the top or bottom panel and hold the “Windows” and “Alt” keys down while you right-click. This will give you the panel menu, and after a lot of fooling around you may be able to find your way back to where you were before you “upgraded” to this bloated incarnation. As for Unity, if I wanted a GUI aimed at idiots, I’d go get a lobotomy.

    I loved 10.10.
    I loved SunOS 4.3.1 (IIRC) even better. I think it was their last version before they were forced to go to System V.

    good luck

  40. Greg F says:

    If you are a tech nerd …

    1. You are not buying preloaded OEM hardware.
    2. You are not dual booting, you are running all your OS’s on a hypervisor.

    [ Reply: Well, I am a tech nerd and I do run dual boot. I’ve also bought loads of preloaded OEM gear as Director of I.T., and inherited a lot that I wanted to re-purpose to Linux after it was done running Windoze. Yes, I’ve run various hypervisors and emulators too. Not quite the same as a straight OS on bare metal… -E.M. Smith ]

  41. adolfogiurfa says:

    All my computing science ended with Turbobasic many years ago. But I think, based on what you are talking about, that some guys did to systems what GMOs did to world agriculture.
    And what you are trying is a way to get back to the original genetic code. :-)

  42. jim says:

    @gallopingcamel says:
    7 July 2012 at 3:17 pm

    I, too, didn’t like Unity. I have moved from Ubuntu to Mint and really love it. Mint will not automatically upgrade the OS for you. I use Deja Dup to backup my files and Backup Tool to backup my software installations. When you upgrade to a new version of Mint, the files and software are restored. The software installs from the internet, so the repository has to be there. Some of my software didn’t re-install, but overall, it worked well.

  43. jim says:

    The files and software don’t install automatically, you have to kick it off, BTW.

  44. @ j ferguson
    Thanks for that idea. I installed “Gnome classic” and now the desktop on my main computer looks like it used to back in the good old “Hardy Heron” days.

    Thanks for that tip on adding apps to the panels. The “Force Quit” app is back! Wow! Makes me wonder why Ubuntu makes changes that confuse the faithful. It looks as if many have jumped ship already:
    http://royal.pingdom.com/2011/11/23/ubuntu-linux-losing-popularity-fast-new-unity-interface-to-blame/

    @jim
    The machine I use for trying things out is my “Business” laptop so I installed Mint as you suggested. Mint prefers installing from a DVD rather than a CD and that makes sense to me. To my surprise Mint found my Firefox “Bookmarks” so I did not have to run Xmarks.

  45. jim says:

    gallopingcamel – You can make a bootable Mint USB drive also. OS in your pocket ;)
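    One common recipe for that is a dd one-liner. /dev/sdX below is a placeholder – triple-check the device name, since dd overwrites without asking. In this runnable sketch, scratch files stand in for the ISO and the stick:

    ```shell
    # Real thing: dd if=linuxmint.iso of=/dev/sdX bs=4M   (sdX = YOUR stick!)
    truncate -s 8M /tmp/fake_stick
    truncate -s 4M /tmp/fake_mint.iso
    dd if=/tmp/fake_mint.iso of=/tmp/fake_stick bs=4M conv=notrunc 2>/dev/null \
      && echo "image written"
    ```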

  46. jim says:

    camel – I forgot about Mint Mate. It makes Mint look more like Windoze. Here are screen shots of it.

    http://news.softpedia.com/news/Linux-Mint-13-RC-MATE-Screenshot-Tour-270109.shtml

  47. adolfogiurfa says:

    @Jim The next improvement will be a RFID- implanted drive…. hard drive would be in Big Brother´s “Cloud”

  48. jim says:

    @adolfogiurfa says:
    8 July 2012 at 12:08 am
    So, you believe your stuff isn’t already in Big Brother’s cloud?

  49. Reblogged this on contrary2belief and commented:
    UEFI? UEFI?
    Gesundheit!

  50. Petrossa says:

    @adolfogiurfa
    If you can’t beat them, join them. In a few years all motherboards will have UEFI BIOSes, so your options are limited. Anyway I have a stamp on my back already: my socsec number. One stamp that rules them all, so one minor stamp more or less…

  51. adolfogiurfa says:

    @Petrossa: They are but kids playing the spy´s game. They already have your face to identify you and all your data in the “cloud”, no stamp or chip is needed. But, What do they really have after a bullet reaches their beautiful skull?: Nothing at all!

  52. This seems to be taking a sinister turn. Ghosts from George Orwell!

  53. adolfogiurfa says:

    @Gallopingcamel: We are but “dust in the wind” and children “spy games” are unable to reach our “selves”. Those games are as effective as CO2 in changing the climate :-)
    Time to choose sides kids, before it´s too late. :-)

  54. Paul Hanlon says:

    @tckev, @E.M.

    Thanks guys for the clarifications and links.

    I’m currently building a high end PC with two graphics cards. I’m hoping to learn OpenCL and parallel processing with it.

    It’s based around an MSI motherboard, and it so happens that it comes with a DVD with an MSI branded linux distro on it called Winki.

    They are pushing it as an instant on Desktop OS that goes under a full OS, but it has internet access and a few other things, somewhere between UEFI and a full install. It would be interesting to see if it could be extended to be a full OS on its own and not have to load another OS onto it.

    And that was the point I was trying to make. We currently have BIOS built into the ROM and it is updatable. That BIOS is now being replaced by UEFI, and there will surely be a way to update that. If UEFI is essentially a crippled BSD distro, what is to stop people updating it to a full BSD distro that still has all the BIOS (low-level) bits in it, circumventing the unrecyclability of the PC, and getting instant-on to boot? The fact that you cannot then load Windows 8 afterwards is a feature IMV :-) .

  55. adolfogiurfa says:

    I will never forget my old PC with just DOS 1 and Basic. Those were the days my friend….

  56. p.g.sharrow says:

    @Adolfo; DOS-1! WOW! I started with DOS-3 and WORKS-1 as well as a sort of AutoCAD on an XT, although before that I worked on a CP/M machine and BASIC – now that was a real waste of time. pg

  57. gallopingcamel says:

    adophogiurfa and p.g.sharrow,

    Nostalgia time! The peak of my computer career was as manager of the ME29 project for International Computers Limited, the primary UK computer company. This fine “Super-Mini” had 32 cards (Printed Circuit Boards). Three cards were used for the bit-slice ECL logic CPU (low 16 bits, high 16 bits, and “Control”). Then there was a DMA card for fast ECL memory (yes, we did have DMA in 1976). Next, an interface card to connect the ECL bus to the TTL logic that the rest of the machine used. Then eight memory cards containing 2 MB of RAM, of which 1.3 MB was dedicated to the VME/K operating system.

    The other 19 cards were used for I/O including a 4 card hard disk controller managing up to 16 hard drives, each the size of a dish washer. Each hard drive had removable “Platters” capable of storing 200 MB. A maximum configuration boasted an unheard of capacity for a mini (back then) of 3.2 GB. The CPU clock operated at 40 MHz which was considered more than respectable. Here is a picture of a 6 hard drive system with a mighty 1.2 GB storage capacity:
    ICL 2966

    If you look at the ICL history on Wikipedia you will read mostly about the large machines like the 2920s, 2930s & 2940s. These machines had up to 1,000 cards and were developed by engineering teams consisting of 2,800 people whereas my ME29 team located in Kidsgrove had only 36 people (not including the operating system team in Bracknell). Yet for several years the ME29 (2960 series) accounted for roughly half of ICL’s sales. One critic of the ME29 commented that the hardware and software appeared to have been designed by independent groups that did not talk to each other very much. This comment stung…….the truth hurts!

    Working at ICL was a lot of fun. I got to see Manchester United in action many times at their wonderful stadium. Technically the ME29 was child’s play compared to building the Duke University Free Electron Laser:
    http://www.bdidatalynk.com/PeterMorcombe.html

  58. p.g.sharrow says:

    @the Camel; Kind of awesome what is now inside a smart phone. I started in electronics with discrete components at the end of the tube era in the late 1950’s. Today there are millions of components on a chip and hundreds of chips on a PC board that can fit in your hand. Very few have any idea of the investment in man hours and wealth to create the modern electronics era. pg

  59. pg – DOS 1 was basically the same as CP/M – it had all the same OS hooks, except that you didn’t need to write the BIOS when you installed it – it only ran on IBM PCs, so the BIOS was constant and in ROM. Take a CP/M program, change the name, and it would happily run on DOS at that time. I think the underlying hooks remained there through to Windows 3.1. DOS remained the backbone of Windoze up till the NT incarnation, though even there you could open a DOS terminal. The main problem with running a command-line program was always remembering all the options and switches (and avoiding typos), so setting up little batch files for frequently-run operations was useful.
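    That habit translates directly to Linux – a two-line wrapper script (names invented for the demo) means the switches only have to be remembered once:

    ```shell
    # A tiny wrapper: tar up a directory without retyping the flags each time.
    printf '#!/bin/sh\ntar -czf "${2:-backup.tgz}" "$1"\n' > /tmp/tgz.sh
    chmod +x /tmp/tgz.sh

    mkdir -p /tmp/demo_dir && echo hi > /tmp/demo_dir/f
    /tmp/tgz.sh /tmp/demo_dir /tmp/demo.tgz && echo "archive created"
    ```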

    It’s interesting to look back and see just how far we’ve come with current computing/electronics, and it seems quite a few of us here participated in the developments. The increase in computing power available to the average person still appears to be rising exponentially – what’s really needed is some way of avoiding software bugs, since we’ll be relying more and more on the correct working of the hardware/software in future.

  60. E.M.Smith says:

    @Simon:

    Never had a ‘first port’ to a new architecture that didn’t turn up some unexpected dependencies….

    Yes, worth a trial compile, but as just one example of potential ‘issues’: the x86 is “little endian”, while the ARM was little endian but then changed to bi-endian… So is OO code “endian”?…
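    The byte-order question is easy to poke at from a shell. A toy demo (scratch files, nothing OO-specific) of the same 16-bit value, 0x0102, in both layouts:

    ```shell
    printf '\002\001' > /tmp/le.bin   # little-endian: low byte first
    printf '\001\002' > /tmp/be.bin   # big-endian: high byte first

    # dd's conv=swab turns one layout into the other:
    dd if=/tmp/le.bin conv=swab 2>/dev/null | cmp -s - /tmp/be.bin \
      && echo "swab flips the byte order"
    ```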

    @Jim:

    Neat card. Only complaint is that without Real World connectors, I need a motherboard with R.W. connectors…. Would be a great thing to use in a kind of DIY Beowulf Cluster supercomputer in a shoe box ;-)

    As every Real World connector mini-card ARM thingy I’ve tried to buy has been sold out and back-ordered, I’m pretty sure there is going to be an explosion of development.

    I spent last night looking at ARM emulators to “pick on”, just so I could start testing things without real hardware in hand…

    @GallopingCamel:

    I’ve been happy with my old releases of Linux, but am gradually getting “new stuff envy” and have started looking for a new release to go with. No idea yet… But that Unity swap has just frosted me… So on to Debian (or is that Dwebian ;-)

    I have some old Red Hat 7.2? (something like that) CDs and like it, but the browser is sucky… and it’s old enough that newer things don’t like to play there any more…

    We’ll see…

    BTW, no software bug is ‘fixed easily’ ;-)

    @Petrossa:

    Nice to know, but I’m fond of computers that don’t tell folks when the start up, or ask permission to run. I’ll likely be doing more “roll your own” so as to maintain anon hardware boots….

    @J. Ferguson & GallopingCamel:

    OK, so with enough hoop jumping….

    I think I’ll check out Mint….

    @Paul Hanlon:

    Well, one can only hope…. I’d love to have a generic BSD as my “boot” program ;-)

    BTW, does that OS download mean I can now ask if you are playing with your winki? ;-)

    @P.G.Sharrow, G.Camel, Adolfo, and others looking at nostalgia….

    I started with vacuum tube radios and moved up to an Altair Mits 8800. First program was toggled in through the front panel switches. It was about 10 lines of assembly and copied itself from low memory through all of it to the top….

    I miss front panel switches and register lights ;-)

    @Simon:

    Moore’s Law continues to run….

    But I did a calculation some time back that gave me pause. It was something along the line of noticing that there were 230,000 or so electrons used to store a single bit (then). Cut that in half every 18 months, and in something like 18 iterations, you have a single electron.

    So in 324 months, or about 27 years, we hit a quantum wall… Moore’s Law stops then OR we find something other than electrons to use for data storage…. (As that was from a few years back, we’re likely down to about 20 years now….)
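    The back-of-the-envelope version of that calculation, taking the ~230,000 electrons-per-bit figure and a halving every 18 months as given:

```python
from math import log2

electrons_per_bit = 230_000    # rough figure from the time
months_per_halving = 18

# Number of halvings to get from 230,000 electrons down to one:
halvings = log2(electrons_per_bit)          # ~17.8, call it 18
months = round(halvings) * months_per_halving

print(round(halvings))    # 18 halvings
print(months)             # 324 months
print(months / 12)        # 27.0 years
```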

  61. EM – even if the quantum computers don’t come into being (and I really can’t see the maths for superposed logical states) there are 3D structures to explore that will give us quite a few more years of Moore. In storage, a while back a 100-layer DVD was in design, though it obviously hasn’t yet made it to the shops (around 1/2 TB on a CD-sized disc). As the physical sizes and electron counts go down, there will be less certainty over whether it’s a 1 or a 0, but error-correction techniques are getting ever better. People may go for ternary or higher systems – more complex, and the logic will take some sorting, but it’s denser, so it may also continue Moore’s progression another decade or so. While the size may then stop diminishing, the cost will still probably follow the law a while after that, since if they are self-replicating the cost goes pretty near zero, and the main cost will come in writing the software to drive them.

  62. p.g.sharrow says:

    Moore’s Law can not be depended on indefinitely. I think code bloat will be the stumbling block, as it was at the beginning. A new way of creating the software is what is needed. The computer is a tool to expand the thinking process: use the computer to do the work. Today, as at the beginning, men try to think out the process and then “write out” (code) all the instructions. Then the computer compiles them to the needed machine code. It creates a result and converts it to a language that the operator can understand, hopefully. Another step is needed: the computer must write the code. Hammer it to find the best path. To get the most work done you must give up some control, take risks, delegate responsibility, and if necessary train, correct mistakes, and move on. pg

  63. Petrossa says:

    Nail on the head P.G.
    At this moment the codebase for Windows 7 is mostly computer generated. The reason Win8 has been redone almost from the ground up (it has XP roots) is because W7 is beyond human comprehension. No real changes can be made anymore without opening a can holding the mother of all bugs.

    Actually, back in the day when the Space Shuttle software was created, it was already almost beyond human control. That was borderline at what, 420,000 lines? It cost a megafortune to keep it at less than a two-digit number of bugs.

    Windows rolls in at a staggering 50,000,000-odd lines. Evidently that can’t have been written by humans, nor can it ever be bug free. Regression bugs are almost a certainty at each bugfix.

    And when it comes down to it, it’s only an operating system.

  64. adolfogiurfa says:

    @P.G.:A new way of creating the software is what is needed
    What would you think of this one? (take into account that I am but a daring ignorant):
    http://es.scribd.com/doc/39978961/Tetraktis-Language

  65. adolfogiurfa says:

    @P.G.: Here you can see it better:
    http://www.giurfa.com/tetraktis_language.pdf

  66. pg – Back in ’83 I was sysprog on an IBM 4341, and the code was said to run to around 2 million lines then. Bugfixes were a weekly occurrence. Code-bloat was a problem then and has become worse.

    Looking at other people’s code, the main problem I saw was lack of clarity over what they actually wanted to do. Sometimes this was built in at the start with underspecified code; other times it was a change in specification during the programming effort. My programs were more comment than code, since I described what each subroutine was supposed to do and what I expected as input and output for each. I hope it helped those who afterwards had to maintain it – I certainly wished someone had done that for me when I had to modify old code. The descriptions helped me to get the logic clean, anyway, and mostly I only had to change things when the specification changed.
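    That documentation style translates to any language. A small Python sketch of a subroutine whose contract (inputs, outputs, edge cases) is spelled out up front; the fixed-width field layout here is invented purely for illustration:

```python
def parse_fixed_record(line):
    """Split one fixed-width inventory record into its fields.

    Input:  line -- an 80-column record string; columns 1-8 are the
            part number, 9-38 the description, 39-44 the quantity
            (right-justified, blank-padded).
    Output: a (part_no, description, quantity) tuple; quantity is an
            int, and an all-blank quantity field is returned as 0.
    """
    part_no = line[0:8].strip()
    description = line[8:38].strip()
    qty_field = line[38:44].strip()
    quantity = int(qty_field) if qty_field else 0
    return part_no, description, quantity

print(parse_fixed_record("AB123   " + "Widget".ljust(30) + "    42"))
# ('AB123', 'Widget', 42)
```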

    DLLs in Windoze should be bug-free and do what they say on the box, but that doesn’t seem to be the case. I’m not sure why there are so many of them, since again it’s a fairly limited number of things that need to be done, and those can be parameterised to allow the needed variations provided the code is re-entrant and threadable. Maybe with a back-to-basics approach the code bloat could be addressed, though I doubt that anyone will put the effort into doing it.

  67. p.g.sharrow says:

    Exactly, Simon. No profit. Just get it into a box and ship it. If it breaks later, then patch it IF the customer complains loud enough.
    There are many ways to get from point “A” to point “Z” on roads and highways. While a programmer can specify the path to be followed, no matter how convoluted, the computer CAN follow all the paths and determine the most efficient route from the destination back to the point of origin. pg
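    That idea of letting the machine try all the paths and then walk the best route back from the destination is essentially Dijkstra’s classic shortest-path search. A minimal sketch (the road network here is invented):

```python
import heapq

def shortest_route(roads, origin, destination):
    """Dijkstra's algorithm: explore paths cheapest-first until the
    destination is reached, then walk the recorded predecessors back
    to the origin -- the 'destination back to point of origin' step."""
    dist = {origin: 0}
    prev = {}
    queue = [(0, origin)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == destination:
            break
        if d > dist.get(node, float("inf")):
            continue          # stale queue entry, already found cheaper
        for nbr, cost in roads.get(node, []):
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(queue, (nd, nbr))
    # Reconstruct the route backwards from the destination
    route = [destination]
    while route[-1] != origin:
        route.append(prev[route[-1]])
    return list(reversed(route)), dist[destination]

roads = {"A": [("B", 4), ("C", 2)],
         "B": [("Z", 5)],
         "C": [("B", 1), ("Z", 8)]}
print(shortest_route(roads, "A", "Z"))   # (['A', 'C', 'B', 'Z'], 8)
```

    The convoluted-looking route A-C-B-Z (cost 8) beats both “direct” alternatives, which is exactly the point: the machine checks them all.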

  68. p.g.sharrow says:

    @Adolfo; Binary is not positive or negative, it is on or off, as in switches that are set to conduct or not. The first “program” that I created was a setup of single- and double-throw switches. Computing with charge levels, as you suggest, has been proposed, but nothing commercial that I know of. pg

  69. jim says:

    At IBM, code bloat was a problem partly because coders were paid by the KLOC: kilo-lines-of-code. I can think of a lot of inelegant ways to code something if I’m gettin’ paid for it.

  70. adolfogiurfa says:

    @P.G.: Perhaps that was due to the hardware limitations then. Really, it could be made with an additional circuit of variable frequency; thus the product, if graphically represented, would not be a square of black and white bits of the same size but an analog drawing.

  71. gallopingcamel says:

    adolfogiurfa (9 July 2012 at 5:11 pm)

    @P.G.:A new way of creating the software is what is needed

    adolfogiurfa, please accept my apologies for mangling your moniker in an earlier comment. When it comes to software necessity is the mother of invention.

    Back in 1993 my team at Duke began working with BINP (the Budker Institute of Nuclear Physics, located in Novosibirsk). When it came to writing mathematically intensive software such as “Finite Element Analysis” programs the Russians were (are?) astounding. Later I realized their software prowess was a direct result of a lack of high-performance computer hardware. They were working with “286” technology while we were two generations ahead with “486” CPUs. Their software was compact and efficient; ours was bloated and slow. Running their software on our hardware was a BREATHTAKING experience!

    Probably the best book ever on project management and software development is “The Mythical Man-Month” by Frederick P. Brooks Jr. It was a real privilege to meet Fred while he was emeritus professor of computer science at UNC, Chapel Hill. His son masterminded a Mac-based computer learning center in one of the six charter schools I helped to create.
    http://www.woodscharter.org/

  72. Petrossa says:

    Back in the day I developed/marketed 286-based multi-user systems with serial terminals to replace AS/400s. The whole system, software included, cost a fraction of a bare AS/400 and outperformed it for up to 16 users.

    Windows killed my business; weird how people did data entry on a GUI. I even wrote an angry piece about it, https://dl.dropbox.com/u/1828618/gui.doc but well. Pretty pictures sell, functionality is just an afterthought.

  73. Petrossa – If you use a lot of different types of software, it’s kind of hard to remember all the switches and tweaks in a particular command, and with long filenames it’s also easy to get a typo into the line and get something wrong. As such, I prefer a GUI that makes a little-used program directly usable rather than spend the time reading the man pages on it. I want the computer to make life easier, not more complex. I’m unusual in that I have problems recognising icons without a bit of text to help me remember what they mean – I’m text-oriented rather than picture-oriented, so some GUIs are far too arty for me to easily use. That’s why I use Linux rather than Apple, I suppose. People vary, and there is no one-size-fits-all. Giving people what they find easiest, whether text-based, GUI with text, or only icons, means not much more programming time but saves a lot of hassle over the life of a program.

    Gallopingcamel – I also wrote a 3D drawing system to help in kitchen design that ran on a 286 with a Hercules graphics card. It drew a complete kitchen in wire-frame in about a minute, and you could choose the perspective type and where the camera was placed. It would maybe have run quicker on a Z80 box, but needed a fair amount of memory for the 3rd dimension – the 64K limit was a problem in 3D, and although the 64180 came in later it never really took off since the IBM PC took over everything else. Giving programmers small, slow boxes encourages tight coding. Paying by the kilo-line of code is definitely not the way to go!

    pg – Yep – seems that if it works OK in alpha then ship it. Getting it clean takes a lot of effort and no-one’s getting paid for the time. Maybe with the Raspberry Pi and similar little boxes we might end up with a new generation of programmers coming along who can write compact code that works. We can hope.

  74. Petrossa says:

    Simon
    I was talking about data entry, which even today is still the most common usage of computers in a business environment; at the time, Windows 3 was a freaking nightmare.
    Data entry normally keeps the user away from command lines. A GUI in those days didn’t add anything but problems in those circumstances.
    BTW, I still see here and there the characteristic blue Novell data-entry screens.

  75. Petrossa – Using a GUI for data entry is really not optimal, I agree. For data entry we used fill-in-the-box and skip to the next one with text. Quick and easy for everyone. A nice restful colour scheme was yellow text on a brown background – it seemed better than the green text/grey background of a normal monitor. Once colour started coming in you could highlight bad entries with red, but I never liked the blue background.

    Most users used the command line once to start the day – not worth putting that into a GUI anyway. Windows 3.0 was almost unusable, and it only became reasonably stable with 3.1.

    Remembering what things were like then, running up to 24 terminals from an 8-bit 1MHz processor and still getting response-times of less than 5 seconds, makes you wonder where all the increases in processing-power have gone to. Code bloat and OS, mainly, I think.

  76. Paul Hanlon says:

    @Chiefio

    Well, it doesn’t look like I’m going to be playing with my “winki” after all :-(. It’s basically a crippled “live” distro (no installer software or console; I can’t even find which flavour of Linux underlies it).

    However, because I bought a large hard drive, I also got an SSD, and I did a net install of Debian, which is basically the barest minimum to get the PC booting (about 160MB), and then installed the applications using apt. The BIOS POST takes longer to finish than getting from it to the login screen – about fifteen seconds from switch-on to login, which is as close to instant-on as makes no difference. Still, as you say, it would be nice to just have one “boot” program that morphs into an OS. Just have a core system in ROM, with any user-defined applications and data stored on disk.

    Great thread, I’ve learnt a pile.

  77. j ferguson says:

    Galloping Camel,
    One other thing vis a vis 12.04. If you want to use Chrome, don’t download it from Google. Use the Chromium you will find in the Ubuntu Software Center. The Google download will give you an education which isn’t likely to improve you in any qualitative (or quantitative) way.

  78. Petrossa says:

    @Paul
    Actually that was MS’s idea in the first place after W7: Windows completely in ROM, and a pay-by-the-second usage license. It’s still on the shelf, so that horror scenario might yet come true.

  79. adolfogiurfa says:

    What would happen if planes ran on Windowze?

  80. Petrossa says:

    Nothing much, it wouldn’t even manage a systems check before running out of resources.

  81. j ferguson says:

    Flogging the defunct horse:
    This URL points to some experience I had getting Ubuntu 8.04 to run on my big machine.
    https://answers.launchpad.net/ubuntu/+question/62918

    More recent versions have overcome this problem and I no longer have to do a lot of hardware-specific tweaking when I bring up a new incarnation. My point is that if you have a choice between a DC-3 and a Martin 202, go with the DC-3. Chances are better that someone else will already have discovered an “issue” (i.e., crashed), the problem will have been worked, and the fix is in, especially if there are more of them out there. Many of the things I had to “repair” in earlier revs now work out of the box. These observations may have nothing to do with Mint, but one might suppose that the Ubuntu user base includes more varieties of hardware, more (in terms of numbers) thoughtful problem solvers, etc. than a system with fewer installations. I guess I’m wondering if going to Mint might be like going back to 8.04 and going through all of that again – and with fewer knowledgeable sorts to help me sort it out.

    If you are going to be doing stuff that you make yourself and that runs on a command line, then maybe this isn’t a reasonable concern, but if you use applications which might be difficult to fix if they don’t run right, maybe it is. For example, we navigate the boat with PolarView, an inexpensive charting package which can display the various indicators – wind, depth, heading, etc. – as well as show NOAA charts, show where the GPS thinks we are, and run the autopilot. It runs fine on Ubuntu. The application for showing DVDs works fine, as does the one for enabling asynchronous DVD viewing. I suppose I might have thought the same thing about XP – I do still have it on each of our machines, but then it didn’t do all the things I wanted without spending money for bloatware. Linux did.

  82. This has been a great thread. I am now much happier with Ubuntu after getting rid of “Unity” and am trying a new flavor (Mint). Looking to the future it would be great to hear more about ARM hardware, operating systems and applications.

  83. E.M.Smith says:

    Wow! A lot of comments showed up while I was ranting on a new posting ;-)

    OK, I need a coffee break before I can read / address them all!

    @GallopingCamel:

    Well, I’ve now got an ARM emulator running with a Debian port on it. Seems to work OK. Faster than the emulated x86, as though that meant anything ;-) I’ll be downloading some other ports for ARM (assuming I can find them…) to see how they do. It also looks like “LibreOffice” is in some of them, so the claim that OpenOffice is x86-only may be more of a name game…

    As of now, I’m leaning toward popping the $100 and getting a non-RaspberryPi-but-shipping-now board and just going with one. Worst thing that happens is that it gets turned into a stripped down security portal / router. (Making a secure Linux firewall / router is in many ways an exercise in deleting things you don’t need…) But from what I’ve seen so far, it’s enough for running a browser, mail client, etc. Having a $100 or less “disposable” web browsing platform is worth it.

    With the latest hacking tools, just clicking on an email address (or doing a rollover / hover over a link) can get a “kit” loaded onto a Windoz box… so you pretty much have to assume that if you’ve been “up” for a few days you have been hacked. Moving to an alien architecture and a hardened Linux, then taking an OS “reset” on each boot: that ought to be enough to keep the OS clean. After that, just keep most data in dismounted encrypted containers unless you need them. If the box does “catch something” while wandering sites, the “reboot prior to using private files” purges it. (As the reboot comes from read-only media…)

    Basically, I think I’m comfortable enough with what works on an ARM to get one for testing further. (Though I’ll do a couple of more emulated runs first ;-)

    I’ve also got QEMU and VirtualBox both running on the laptop. Emulators / hypervisors can suck a fair amount of resources (and I generally prefer real hardware to emulated for speed and control) but they might still have a bit of use. I need to think some more about how much of what is done in the emulator can be seen by the host (likely too much – so things like keyloggers would still capture what you were doing) but having a good firewall, a hardened host, and THEN an emulated world – well, most “crap” would land in the emulated world and be tossed out at reset.

    The question is just “would it be enough?”… Maybe I’ll put an emulator on top of the disposo system… have it be a ‘stripped’ environment just enough to run the emulator (so hard to hack). Frankly, just the complexity of an ARM, with a stripped security wrapper on an emulator (all reloaded from ROM at reboot), with an environment that is displaying network and CPU stats on the side (so you can see if it starts doing things when you are not…), would be a complex enough environment to confuse many attackers…. Then that place with tools (compilers et al.) would be in the sandbox virtual world and reset on each restart anyway…

    Decisions decisions…

    Right now I’ve decided to have coffee ;-)

  84. jim says:

    @j ferguson
    I switched from Ubuntu to Mint. Mint comes with most needed drivers right out of the box. Mint, in fact, is based on Ubuntu. I like Mint much better. It’s easier to do things like play vids on the web, and stuff like that. I did do what I consider an upgrade and installed Mate on Mint. Love it!

    I’ve added a cheap Hauppauge WinTV board and am moving some of our videos from VCR to DVD. The TV resolution isn’t so good, but there may be some tweaks – I haven’t messed with that much. But the VCR resolution is as good as the original tape.

    If you go to http://distrowatch.com/ , you will find this page-hit ranking data (H.P.D. = hits per day):

    Rank  Distribution  H.P.D.
    1     Mint          3915
    2     Ubuntu        2218
    3     Fedora        1716

  85. jim says:

    I misstated that somewhat, I installed the Mate flavor of Mint.

  86. EM – it’s starting to sound like you may have a product that people would buy, just to save the effort of doing it themselves. May be worth considering as an alternative to making money from the markets. Even though all the software is free and available to anyone, the work of the systems integrator is always worth paying for. Smithux could be popular with the paranoid net users (anyone who has more than maybe 10 years running systems under the belt). You might want to add a publicly-available device-driver that is in fact a bit-bucket – somewhat like the canary in the coal-mine in that malware will try to see what it can write to and giving it something that looks writeable can tell you there’s a problem.

  87. j ferguson says:

    Jim,
    I don’t doubt that more people are downloading Mint than Ubuntu. And the population that downloads and self-installs is the one you’d want to be a part of. And it likely is in the phase where the sharpest people are paying attention, so maybe my observation on using a product in greater distribution is flawed – it does matter who they are, and if they are a lot of sheep at some corporation that’s decided to go with Ubuntu, there may only be one person in the whole place who understands the thing, if even one.

    We converted all our tapes to DVDs using the ATI card under Windows XP to get the data into files, and then Linux to write the DVDs. It’s funny, but the only one we ever watch is Cousin Vinny.

    Asynchronous DVD use means copying a DVD borrowed from the library or a private collection, watching the file once, and erasing it. I’d like to think that this isn’t piracy, but it probably is.

  88. jim2 says:

    j ferguson – I’m copying home-recorded tapes of family, not movies. They can’t be replaced.

  89. gallopingcamel says:

    Chiefio,
    Thanks for those comments on ARM. I will try to keep up even though I have never been comfortable with high level programming.

    As a hardware guy I never got beyond assembler-level programming, such as Mikbug for the Motorola 8-bit MPUs and MASM for Intel MPUs. Thanks to a great cross-assembler from a little company called “Avocet” I was able to write assembler-level programs for an office switching system with two phone lines and eight extensions. The program fitted comfortably in the 4 kBytes of EEPROM and 128 Bytes of RAM built into the MPU. One of the things that gave me the greatest satisfaction was a subroutine that generated the precision audio tones for external DTMF signalling.
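    For anyone curious what that subroutine was computing: each DTMF key is the sum of two precise sine tones, one row frequency and one column frequency. A Python sketch of the same idea (the sample rate and burst length are arbitrary choices, not anything from the original firmware):

```python
import math

# Standard DTMF frequency pairs (Hz): row tone + column tone per key.
DTMF = {"1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
        "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
        "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
        "*": (941, 1209), "0": (941, 1336), "#": (941, 1477)}

def dtmf_samples(key, duration=0.1, rate=8000):
    """Return one burst of DTMF audio for `key` as a list of floats
    in [-1, 1]: the sum of the key's two sine tones, scaled by half."""
    lo, hi = DTMF[key]
    return [0.5 * (math.sin(2 * math.pi * lo * t / rate)
                   + math.sin(2 * math.pi * hi * t / rate))
            for t in range(int(duration * rate))]

samples = dtmf_samples("5")
print(len(samples))   # 800 samples: 0.1 s at 8 kHz
```

    In 4 kB of EEPROM you obviously wouldn’t call a floating-point sine routine; a table lookup or digital oscillator does the same job in a handful of bytes.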

  90. gallopingcamel says:

    jim (11 July 2012 at 12:26 am),

    “I’ve added a cheap Hauppauge WinTV board and am moving some of our videos from VCR to DVD. The TV resolution isn’t so good, but there may be some tweaks – I haven’t messed with that much. But the VCR resolution is as good as the original tape.”

    There are still three Hauppauge cards installed in my main PC, two NTSC and one ATSC (HD). They were working very well with Windows XP but when I “Upgraded” to Vista they died as did several of my favorite applications. At that moment the worm turned. I replaced Windoze with Ubuntu “Edgy Eft”. I tried to get my Hauppauge cards working again using “MythTV” but it was beyond my meager software skills.

    My family insisted on maintaining their ability to skip over the advertising and “Talking Heads” that account for 75% of NFL broadcasts, so I obtained a DVR from AT&T “U-verse”. Originally the quality was abominable, owing to a buffering problem that caused the TV to freeze whenever Tiger Woods was lining up a putt. We switched to “DirecTV” (excellent picture thanks to MPEG-4) for a year, but now we are using AT&T once again. Why?

    The AT&T U-verse DVR allows you to record up to 4 TV programs simultaneously. They do this by treating TV as data. Their basic download speed is 21 Mbps, which is sufficient for 4 MPEG-4 digital TV signals with something left over for the Internet. DirecTV provides a great picture but cannot provide a high-speed broadband service.

    It does not make sense to get my Hauppauge cards working again, as the AT&T U-verse works better than my Hauppauge WinTV. However, there is still a fundamental weakness in the AT&T U-verse product. Fiber optics is used for most of the AT&T local network (Fiber To The Curb), but the last few thousand feet uses copper cable. Consequently data rates are restricted to a maximum of 21 Mbps. While this is adequate to provide TV service given today’s efficient TV encoding technology, it leaves very little bandwidth for Internet access at times when the TV is working at full capacity.
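    The budget for that 21 Mbps pipe, in numbers. The ~5 Mbps per-stream figure is my assumption for an MPEG-4 stream of that era, not a quoted AT&T spec:

```python
# Rough bandwidth budget for the 21 Mbps U-verse line described above.
line_rate_mbps = 21
streams = 4           # four simultaneous recordings
mbps_per_stream = 5   # assumed average MPEG-4 stream rate

tv_load = streams * mbps_per_stream
leftover = line_rate_mbps - tv_load
print(tv_load, leftover)   # 20 1  -- almost nothing left for Internet
```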

    I will remain a loyal customer of ATT “U-verse” until FTTH (Fiber To The Home) becomes available. With FTTH I can have the best TV picture currently available without any effect on my Internet download speed of 50 Mbps.

  91. j ferguson says:

    The big machine – built in a 3U aluminum server chassis, powered by a DC-DC power supply fed from our 12-volt system, and secured to the overhead near the helm – has an ATI All-In-Wonder card with a TV receiver, proprietary drivers, and something they call Catalyst software. It gets its signal from a DirecTV receiver and a dish on the upper deck, which is kept aimed in the same direction no matter how the boat rotates by a device called Follow-Me, essentially a heading-hold autopilot. It worked great once the convulsion of loading drivers which conflicted with the drivers for the Audigy Platinum 2 sound card was solved. I set it up in 2003 when I first built the thing, with XP. This thing was our television for our first 6 years of cruising, until I realized that the Samsung 17-inch 16×9 monitor used a lot of electricity and that I could buy a 30-inch Sony that used half the total of computer and monitor – this was a problem during the Tour de France, Olympics, and World Series, where we would want to watch late into the night without disturbing others in the anchorage with genset noise.

    I tried for about a month to get one of the various Linux TV-receiver applications to work when we installed 8.04, but was unable to. Maybe one of them works now.

  92. jim2 says:

    Catalyst is a suite of video drivers.

  93. E.M.Smith. says:

    @GallopingCamel:

    My Brother-in-law worked at NASA doing aeronautical modeling. Ph.D. “We talked” as we both were involved with computer modeling on Crays.

    Turned out that he had a nifty little graph. The total improvement in “computes” over time in his field. By both hardware and software improvements.

    The Line for Software was higher and faster than the hardware improvements.

    What this said was that GOOD algorithms and tight code made MORE improvement in total computing than Moore’s Law!

    The necessary corollary of this is that BAD software can consume ALL of the Moore’s Law gains (and then some)…

    And that is exactly what Microsoft coding practices do…

    The idea that faster hardware means it is OK to indulge in Code Bloat is wrong.

    So, as you noted, put the tight code on the hot hardware and stand back… (Why I like running the older versions of Linux and Unix on the newer hardware ;-)

    BTW, as an example of that: this posting is coming from the Opera browser on a Puppy Linux virtual machine. It has better responsiveness than Firefox by far. (No waiting on typing, for example.) While F.F. under the same OS was painful, this is quite comfortable.

    So for my “base system” for further testing it will be an Opera browser by default. (Later, on real hardware, if there’s enough performance, I can consider other browsers).

    @Jim:

    I’ve put “Mint” on my list to try in a virtual machine…

    @Simon:

    Yup. Good idea. I’ll just be posting the basic description, specs, and how to DIY to start. Folks who want to can do the integration themselves, folks who want a kit can get one from me. (Assuming any real interest develops) But it’s likely 6 months away. I’d also expect a couple of variations.

    @J. Ferguson:

    Sony won a case back in the Betamax days that found “copy for home use” legal. It is a violation of copyright if you sell it or use it for commercial purposes. All DVRs depend on that case. It is also humorous to watch Sony (who are now a media company too…) try to fight for “protections” against the case law they set… ;-)

    @GallopingCamel:

    Over time Linux apps improve. Might want to try that MythTV again… (Or just do a websearch on your card and that software name to see what comes of it.)

    As I have a large library of tapes (to dump eventually… including some Beta…) I’ll be getting a TV dongle eventually. (Likely HD and NTSC combined as my Sat receiver and tape machines are NTSC output). So if you can wait a year, I’ll likely get to it myself ;-)

    FWIW, High Level Languages are just like Assembler, but without the hardware ;-)

    @J. Ferguson:

    Where you dock in Florida is just a few miles from where GallopingCamel lives.

    You might want to holler at him next time you are in Florida port. Between playing with video and computers and the local pub, I think you both would enjoy the process ;-)

  94. gallopingcamel says:

    @J.Ferguson,
    The host of this site and most of the faithful who comment here continue to amaze me. Someday I would like to meet the totally awesome P.G.Sharrow or adolfogiurfa.

    If any of you find yourselves near the Space Coast I would love to show you some neat places that are not the tourist traps like the central Florida “Attractions”. Please feel free to contact me via my pathetic web site:
    http://www.gallopingcamel.info/ email address = info@gallopingcamel.info

    Make it soon as I plan to relocate to Medellin, Colombia when I can no longer afford to live in the USA. When will that be? I don’t see anything positive coming out of the election scheduled for November 6.

Comments are closed.