The Faster I Go, The Behinder I Get

Just a small note about improvement leading to overwhelming leading to failure.

The problem is simple. There are millions of people and computers creating ever more media, content, and information. I am only one person. Adding systems to enhance data gathering and management just means much more gets handled, so I end up even further behind. It seems to be a general property.

I first started thinking about this as I looked over my 15 to 24 TB of disks. Only a few years ago, I had a 500 MB main disk and some backups. Buying the first TB disk seemed far too much. Now I’ve reached the point where the time to manage it all is nearing the value of it… Some disks hold backups of that 500 MB disk, now long gone to disk heaven. Is it worth the time figuring out what is worth keeping, what is a duplicate, and what is trash? But just buying more disk exponentially increases the problem…

Most recently, I bought a Roku, ONLY to be a cord cutter. We had Netflix. Amazon Prime for shipping brought that video service functionally for free. Pluto has a bunch of movies and news too. So 345 channels on Roku, AFTER I selected only the ones that looked interesting. Some of them, like Amazon, Netflix, Vudu, Hulu, etc., have, themselves, subchannels, categories, and/or lists of shows. Then the Yahoo and YouTube channels bring the internet flood.

So what is the product of devices (computer plugged into TV as well) x channels x stations x shows x series x features x …???

I have no idea.

And that is the point.

I realized tonight that it is impossible to manage. I picked one of the 300+ Roku channels I had thought might be interesting, but had yet to actually watch, and watched it. I set the goal to do one every day or two so as to actually pick keepers and tossers. TubiTV is now in the keeper group. Watched a Denzel Washington movie, “Out of Time”. A very good cop flick set in Florida. One down.

Started it about an hour before dinner, paused to cook and eat, finished it a half hour after dinner. Marked the channel as good, looked at the dozens more movies to watch on that channel, and moved on. Looking at Amazon Prime TV, I decided to cruise some of the categories and see what was new, i.e. get off the “watch list” I’d picked a week or two ago. I’ve watched maybe 4? things from that list. That was about an hour ago… More stuff showed up than I’d watched. On just one device, one “channel”, more new hours of content were created than I could consume.

That raised this dilemma to the front.

If it takes me 6 months just to check each channel once, and say 1/4 of them each individually create more content than I can watch in that interval, then I can never watch even a fraction of the content offered.
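The arithmetic behind that is easy to sketch. A minimal back-of-envelope model (every number here is an illustrative assumption loosely taken from the post, not a measurement):

```python
# Back-of-envelope backlog model. Every number is an illustrative
# assumption loosely taken from the post, not a measurement.
channels = 345              # Roku channels kept after the first cull
days_per_channel = 0.5      # time spent sampling one channel
growing_fraction = 0.25     # fraction adding content faster than it can be watched

first_pass_days = channels * days_per_channel    # one pass: about 172 days
always_ahead = channels * growing_fraction       # channels never caught up with

print(f"One pass through every channel: {first_pass_days:.0f} days")
print(f"Channels that outrun the viewer meanwhile: {always_ahead:.0f}")
```

With those assumptions, a single sampling pass takes most of a year, during which dozens of channels have each produced more than a person could watch: the backlog grows no matter how fast you go.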

Buying one cheap device has assured I can’t ever “catch up” on entertainment alone. It has imposed a new burden to filter, prune, and select. With enough devices and sources, that burden alone will consume all available time. The necessary conclusion is that much must be discarded without evaluation.

I’m going to be resorting to channel description sites and “voting” or “stars” rankings for many channels and shows, just to toss the junk faster if nothing else. Yet I hate reputational rankings, as my tastes often diverge from the crowd.

Basically, I now know why people pick a favorite and stay with it, even when better exists. The search and sort cost is too high. Why people depend on gossip and what’s trending. It avoids the work and cost of search and validation. In the face of impossible selection times, rolling loaded dice becomes a reasonable strategy. Watching something likely to be good enough is better than searching for an optimal choice that never is found.

So we pick politicians who are tall and look good, or who shout our own biases and wants back at us. Why spend time looking at track records or actual ability?… We buy the hot album our friend played, believe the religion our Mom followed, eat the foods we grew up eating, wear the current “fashion”, and never check on why, or even whether it makes sense. Just too much work and time to do.

Thus the herd following Algore and “The (fictional) 97%”. Getting them to change requires them to invest time, to think, to care about doing better. They won’t, because they are happy just doing what their friends do. Even scientists know Science advances one funeral at a time. Because once we have settled on a choice, new stuff shows up faster than we can possibly search it, so we just ignore that stuff. We are busy looking at some other, more familiar, Shiny Thing…

Lessons from the Roku… who knew?

Well, I think I’ll watch the news now. I usually watch the news…



About E.M.Smith

A technical managerial sort interested in things from Stonehenge to computer science. My present "hot buttons" are the mythology of Climate Change and ancient metrology; but things change...
This entry was posted in Human Interest. Bookmark the permalink.

16 Responses to The Faster I Go, The Behinder I Get

  1. p.g.sharrow says:

    Too many choices to make ;-) I looked through DirecTV’s 600 channels and found nothing worth watching, so back to Fox News as background while net surfing…pg

  2. Larry Ledwick says:

    Reminds me of that commercial where the guy is sitting in front of his computer and a pop-up message appears:
    “You have reached the end of the internet, please go back!”

    I came to a similar realization a few years ago when I pondered the exponential growth of content. It is literally impossible for you to even keep up with one small niche of content.

    If you sat down and decided to watch all the YouTube videos for a given topic, new ones would appear faster than you could view them. (One of the reasons I tend to put quick one-line summaries when I post a link to a video, so you don’t need to waste 5 minutes watching a video you have no interest in to discover it has no content you care about.)

    That is why recently I have moved toward using social media as an aggregator of content.
    I select a handful of people to follow that have either very similar or exactly opposite viewpoints. I let the similar-viewpoint folks recommend things; if they post a link, I am highly likely to be interested in it. On the contrary, those who have very dissimilar interests make me aware of things I would never dream of looking for.

    In effect I am using social media to crowd source likes and dislikes. That way I can skim over content that has been viewed and evaluated by hundreds of people. Stuff I would never even know existed if left to my own browsing and search efforts.

    It is actually hard to comprehend that in the time I have taken to compose just this one message, probably thousands of videos have been posted, and postings it would take me a lifetime to read have been uploaded to various blogs and web sites.

    In a sense we are not far from the universal borg mind in that if you take interest in something you can probably find 20 videos telling you how to do it in just a few minutes search.

  3. E.M.Smith says:


    Disney announced an ESPN streaming channel and a Disney streaming channel by 2018, cutting off Netflix as current contracts run out.

    Cord cutters hitting Disney on cable….

    Plus Netflix making their own, good, content.

    So the world is becoming direct-to-the-consumer content providers, AND many more of them.

    Instead of a few big studios with a few theatre chains, 3 networks, and local monopoly cable companies, it is flattening end to end.

    Minimum Economic Scale is one guy with a DIY Instant TV Roku channel.

    So, can that guy get mind share and eyeballs in all the search space, or will “just watch the usual” stand up when it costs by the month…

    I think there is a giant fracturing of media coming. Plus a “media search engine” is needed, so somebody is going to make money there.

    Rich folks will pony up for a dozen paid providers. The poor will drop out to the free media space. The middle class, like me, will buy one or two like Netflix, then blow off the rest of the paid ones.

    I can see folks getting to know their neighbors better. One getting Disney for the collective kids, another ESPN for the Dads, a third CBS for the Moms & Trekkies, etc etc. So far, I get much more than I want for free, so only buying Netflix and CBS for the spouse (who is very Trekker).

    From my point of view, profits in media are going to be dropping for most providers, just due to the eyeballs being shared over 5000 channels and a million shows instead of 3 networks and 25 shows…

  4. How do you do it? My head spins on 2.6% of what you stored. My cloud backup is 63 GB and I can’t manage even that. How can anyone manage 24 TB? My head is exploding!

  5. The good news is that we can buy the content we want without paying for what we do not want. My cable company offered SEC sports but I wanted the ACC. So I cut the “Cable TV” cord, got the content I wanted and saved a bundle of money.

    You have to love it when new technology (IPTV) improves the quality of your entertainment while reducing the cost.

  6. E.M.Smith says:


    It’s not the GB, it’s the file count….

    I have a few very big things. Like a “scrape of NCDC” or an “archive of all Debian releases” that can suck down a TB for just a few files. If that were 24 TB of Word docs and PDFs I’d be unable to cope too… But take Unix or Linux isos at 1 GB to 4 GB each, times 4 or so architectures (i386, AMD64, ARM, SPARC) x 20 release dates, that’s about 200 GB in one wad. Now add Ubuntu, Puppy, BSD, and Slackware, and you can be pushing a TB very quickly. Add a duplicate for backup, that’s 2 TB, just for storing a very limited set of isos.

    That’s the stuff that sucks down bytes…
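    That iso arithmetic is easy to reproduce; a quick sketch using the rough figures from the comment (sizes and counts are approximations, not an inventory):

```python
# Rough iso-hoard arithmetic, using the approximate figures above.
avg_iso_gb = 2.5        # isos run about 1 GB to 4 GB each
architectures = 4       # e.g. i386, AMD64, ARM, SPARC
releases = 20           # release dates kept per distro

one_distro_gb = avg_iso_gb * architectures * releases   # ~200 GB "in one wad"
distros = 4             # add Ubuntu, Puppy, BSD, Slackware
total_tb = one_distro_gb * distros / 1000               # pushing a TB
with_backup_tb = total_tb * 2                           # duplicate for backup

print(one_distro_gb, total_tb, with_backup_tb)
```

    A handful of files, yet the totals land right where the comment says: hundreds of GB per distro family and a couple of TB once backups are counted.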

    One media issues:

    Tonight I added the BBC America channel on the Roku. It is on DirecTV so I have it there, but wanted to check out streaming. Click click: not nearly all the shows, but it has old episodes of what it has. Click on one…

    OH, it’s one of THOSE… wants me to log in with a “code” to register. On to the tablet… Tappety tap… Now it wants my provider… I select DirecTV, making a note to self that it will break when I dump them… then it wants my email address AND my DirecTV password… as that would let them sign me up for two more years, or cancel, or change my programming, I cancel out and delete their channel…

    I don’t need that grief, nor do I need the BBC, as I’ve already got too much to watch….

    I suspect more of that will happen. Folks feeling full up already are just not willing to play 20 questions nor pay up…

  7. Lynn Clark says:


    I’ve got ~40 TB of usable disk space on my home network, not counting the couple TBs installed in my iMac and Macbook Pro (MBPro). Why so much? Mostly it’s because I got a good scare a couple years ago when what I thought was my bullet-proof 6 TB “ultimate-backup-hardware-raid-5-array” decided to crap out without warning. So I bought and connected two new 4-bay Thunderbolt, software RAID 5 arrays to the iMac in my “office”, each with 9 TB usable disk space, plus a Synology 8-bay RAID 6 NAS array with 16 TB of usable disk space in the basement. Everything is plugged into uninterruptible power supplies, which are in turn plugged into Zero Surge surge suppressors (go to if you’re serious about surge protection). What was originally on the RAID 5 array that crapped out — luckily, I was able to get it to limp along long enough to recover everything that was on it — was dispersed to the two new RAID 5 arrays and everything on them gets backed up to the Synology RAID 6 array by rsync cron jobs. The remaining ~5.5 TB of the 40 total TBs is my almost-decade-old 4-bay Drobo which is used as a Time Machine (TM) volume that backs up everything on my iMac. On top of all of that, the boot volume in the iMac gets backed up onto a second internal disk every other day so that if the boot volume dies I can immediately boot from the backup volume. I don’t keep anything on the MBPro that isn’t also on the iMac or a RAID array, although I do connect a standalone 2 TB TM drive to the MBPro every couple days and let TM do its thing.
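    The rsync-cron backup pattern described above can be sketched in a few lines. This is a minimal illustration only; the paths, host name, flags, and schedule are hypothetical stand-ins, not the commenter’s actual setup:

```python
import subprocess

# Hypothetical source and destination; substitute real paths and hosts.
SRC = "/Volumes/raid5_office/"
DEST = "backup@nas.local:/volume1/backup/raid5_office/"

def rsync_cmd(src, dest):
    # -a preserves permissions/times/links, -z compresses in transit,
    # --delete makes the destination an exact mirror of the source.
    return ["rsync", "-az", "--delete", src, dest]

def mirror(src=SRC, dest=DEST):
    # Returns rsync's exit code (0 on success).
    return subprocess.run(rsync_cmd(src, dest)).returncode

# A nightly cron entry would then run something like:
#   0 2 * * * /usr/bin/python3 /usr/local/bin/mirror.py
```

    The point of the design is that once the cron entry exists, the RAID 5 volumes get mirrored to the RAID 6 NAS with no further human effort.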

    In my case, the main thing that contributes to ever-expanding disk usage is my sense of paranoia. A few years ago I started saving all the GoPro dashcam video from when I drive anywhere. I had concluded that it might be a good idea to be able to prove where I was on a particular day and time if it ever became necessary to do so (since I started watching Dateline NBC and 48 Hours a year or so ago, my paranoia has been validated by many episodes that strongly suggest that there are a lot of innocent people serving long sentences in prison because they couldn’t provide definitive proof of their whereabouts when a crime was committed far away from where they were). I also have two wireless IP cameras, one pointing out a front window of the house, and the other in the garage, both of which regularly capture me going about my business when I’m at home. As with the GoPro dashcam video, I save all of the IP camera video just in case it ever becomes necessary to prove I was at home minding my own business at a particular point in time (all of the IP camera footage is transferred automatically via ftp to the RAID 6 NAS array). None of those dash cam or IP cameras document where I am 24 hours a day, but it’s better than nothing, and it takes little effort on my part other than importing GoPro dashcam footage to a RAID 5 array on the iMac when I return home from wherever I’ve been.

    Of course, I also have the usual collection of family/friend photos in my iPhoto library (just surpassed 9,000 photos), iTunes music collection and rips of my DVD collection, plus scads of personal documents (letters to/from family members, genealogy files, etc., etc.), all of which resides on at least three different RAID storage devices. The biggest unresolved issue with the personal stuff is what happens to all of it when I die? In previous generations there would be boxes of physical documents and photos that the survivors would find and take charge of. Now with it all existing only as digital data, I fear that this kind of stuff will be lost forever as people die off and their survivors wipe and re-purpose computers and hard drives, especially if it’s all password-protected and the password dies with the deceased.

    So that is a long answer to your question about how/why anyone would have so much storage and how they manage it. Anyone with rudimentary Unix skills and operating in a Unix environment (like MacOS, Linux, BSD) can easily set up a mostly-automated system to manage large amounts of data. Since I don’t do Windows (and haven’t for at least a dozen years), I can’t opine as to how easy or difficult it would be to do in that environment.

  8. ROM says:

    Down through history, bursts of innovation and creative surges have lasted just a few generations at best before once again sinking into a slothful slumber or, far more likely, descending into the chaos of war and conflict between groups, races, nations, and ideologies: the inevitable clash of civilisations, often a deadly, fatal clash for all the civilised nation states that become part of a great conflict of the kind that has shaped history down through the ages.

    Just maybe we are now looking at and experiencing the last great burst of human innovation and creativity, good, bad and indifferent, that will soon be over for mankind once again for maybe many centuries into the unknown future.

    With all of the still unused, even unexplored technology and innovation and the creative aspects of the last two hundred and fifty years of human creativity and innovation, still only partly explored and developed, future generations might just sit back and wait for the AI Mechs to deliver the drinks and watch the Virtual Vid, the VV and only very occasionally wonder whether perhaps there is another way.
    And weren’t those damn 19th, 20th and early 21st century Vid Blocs bloody clever to come up with some of the stuff they did and not even have any AI to help them.

    They just used some very primitive methods and some mental exercises of some unknown type to create all those items that we use regularly today, as did my parents and my grand parents through the last two or three centuries.

  9. philjourdan says:

    I remember a conversation with the CIO of a multi-billion dollar retail chain telling me he had “9 gb” of online storage capacity.

    That was split between 3 Amdahl mainframes. And that was only 25 years ago.

  10. paul says:

    I’m really liking my Roku. It plus an outside antenna for the Austin stations along with the sub-channels…. wow… there is a lot to watch. Sure, a lot is crap. That’s nothing new. But I’m not stuck here watching “Green Acres” or some soap opera because that is all that is on. I don’t have a problem with “Green Acres” either.

    I had DirecTV for 18 years, starting with a Sony I bought on eBay. Flawless. The only time the picture went out was the evening it rained 6″ in 2 hours. About 8 years ago I went to HD. Great pic for the new 55″ Vizio. And yeah, the extra $10 a month for HD to feed a $2000 TV was ok. But none of it worked when it “rained like hell”. I’m not sure why HD would be more sensitive to weather… seems it should scale back to SD…. I see Amazon vids on the Roku scaling down. My guess is that DirecTV’s hardware is really cheap. Plus noisy, like an old 486/66 CPU fan nearing death.

    DirecTV didn’t give the locals’ sub-channels. They didn’t deliver the locals in HD either. I don’t miss the $143 a month bill.

    I’m having fun with the Roku. I do wish my internet connection was faster just because the commercials seem to be HD even if the show is not..

    Now, drive space. I must be a slacker. I bought my current PC in January 2013. A Gateway DX4860. It’s an i5 with a 2 TB drive. Win7. After all of this time, with 80+ GB of music plus untold amounts of porn and videos and 20 years of e-mail plus copying everything from the other two PCs to this HD (and the other way too) I have managed to use about 1/4 of the drive.

    I have a file drawer of pictures I could scan all of it and empty a drawer. But why? If my PC pukes, it’s all gone. One big thunder storm…. Why scan the stuff? I’m not putting it on Facebook or even on my website. My sisters are not interested in old pics mom snapped.

  11. beng135 says:

    Reminds me of Fry on Futurama surfing thru channels and saying “A billion channels and nothing good” — or something like that.

  12. angech says:

    Bookshelves at home: I will never read them all. The library, even more so. That really, really big book store in New Zealand: I never knew so many books existed in my life.
    It is not a new computer problem.
    It is a fault of our six-sense, only-12-hours-a-day, limited personal computer problem.
    Part of the paradox of why we are here.
    The good news is we have less and less time to worry about it all.
    Personally: put it, the problem, into your blind spot, and when it accidentally shifts out, push it back in again.
    Works with everything else annoying in life.

  13. tractormike says:

    I just found one of my old laptops the other day and I was amazed that I could work on less than a 1 GB hard disk… then I remembered the floppies… maybe there is such a thing as too much information.

  14. E.M.Smith says:


    Uh, yeah! There’s this “it is free, just use more” cycle folks get into. Then they make sloppy languages and fat compilers and build systems with MB of buffers that aren’t used, and more. Eventually you are doing EXACTLY THE SAME THING effectively in a file edit or a spreadsheet, but instead of running in 128 MB of memory you need 2 GB to start… All from being lazy, not caring about efficiency, and wanting mindless glitz over simple effectiveness.

    Believe it or not, in every SD card there’s a tiny controller managing the bits. Something like a 4-bit processor and no applications at all. Just very stripped down firmware with a specific function. Yet there it is. A $4 SD card has a computer running in it.

    So tell me again why I need a GB to open on a Linux box, ever? (Worse for PCs with Windows as their bloat goes wild…)

    My Old Cray Supercomputer was an XMP-48. That is 4 processors (64 bit word) and 8 “megawords” of memory, or 64 MB. Total. That’s IT. Unix ran incredibly fast and it was a wonderful experience to be on it. Now I’ve got a 4 core faster CPU, a GB of memory, and it’s not fast enough? What? The difference was the Cray folks did incredible fine tuning of every aspect of the software. From operating system to compilers to utilities and more. Crappy software can suck up ALL of Moore’s Law gains, and then some. Just look at what Microsoft does to a machine…

    Oh Well…

  15. Randy Alexa says:


    I have to admit that my main benefit of this space is the ability to increase my culture level and store my movie, music and photo collection.

    The rest of this expansion I fail to understand. My macOS has gone from 5 GB to 20 GB and I still use the same programs as I did 6 years ago.

    Where does all that space go?

  16. E.M.Smith says:

    Where do things go? Well, in no particular order:

    When “word size” doubles, the size of binary files (programs) more or less doubles.

    So in the march from 8 bit to 16 bit to 32 bit (and now to 64 bit) processors we’ve had about a 2 x 2 x 2 = 8 fold increase in the size of a binary blob.

    Now it isn’t QUITE that bad, especially on the ARM architecture, as some processors implement the ability to run smaller words or older word formats ( ARM Thumb instructions, for example) but mostly it is.

    Added “features”. So software that used to spell check only if you selected that option now spell checks as you type. More code added. Repeat 100 times / program.

    Special Changes Of Features: In particular, the move from simple ASCII to fonts and now to multilingual fonts. It used to be 8 bits to a character. For support of things like Chinese, we’ve gone to kinds of characters that use two bytes (16 bits) or more. This affects all things stored using characters…
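    That character-width growth is easy to verify in Python; note that UTF-8 actually spends three bytes per CJK character, while UTF-16 spends two:

```python
# Character-width growth from plain ASCII to multilingual encodings.
ascii_text = "hello"
chinese_text = "\u4f60\u597d"   # two Chinese characters ("ni hao")

assert len(ascii_text.encode("ascii")) == 5         # 1 byte per character
assert len(chinese_text.encode("utf-16-le")) == 4   # 2 bytes per character
assert len(chinese_text.encode("utf-8")) == 6       # 3 bytes per CJK character here
```

    So the same “one character” can cost one, two, or three bytes depending on the script and encoding, and every stored string grows accordingly.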

    New! Improved!! TRENDY!!! and PAY ME MORE!!!! Languages!!!!!!!! i.e. “Object Oriented Programming”. There’s been about 3? or maybe 4 generations of language change based on the notion that This Time For Sure! we will get better code re-usability and more efficient use of programmer time by the next Great leap. It never works out that way. Each one DOES increase the size and sloth of programs dramatically, though. Roughly in order from memory:

    Assembly: Small, fast. A PITA to write. IF you really want efficiency, this is it. These are called 2nd generation languages, as an “assembler” turned them into machine-executable binary. 1st gen was toggling in that binary through switches by hand. I’ve done both…

    Assembly Libraries: Folks started collecting standard sets of subroutines into libraries anyone could use. Some, though, might be a “Math Library”, so to load “Do ln X” you also loaded code to do ×, /, +, −, log base e, etc. etc.

    Compiled High Level Languages: Things like FORTRAN. Now the compiler loads ALL the math functions even if you don’t use them… 3rd Generation languages.

    HLL Libraries: Same as the Assembly ones, but full of compiler added stuff…

    There are also what are called 4th Generation Languages (often “non-procedural”) and I was a consultant on a couple of them for several years. Highly effective at doing things like database reporting and such, but if you do them wrong can drive a machine to its knees. I once was called in for an “efficiency review” (which was my specialty) at the State Architects Office. They had an IBM Mainframe that was hitting 100% CPU and dreaded spending $4 Million more they didn’t have. When I was done it was at about 5% to 10% CPU utilization. I had a checklist of 26 things. Things like “Do your selection of records BEFORE you process them” (i.e. don’t calculate pay / person THEN pick out Joe Blow… doing 10000 pay calcs then picking one is much more compute intensive than picking one and calculating once).

    Table file pay
    select Joe Blow
    pay = hours x rate
    print pay

    is much more efficient than

    Table file pay
    pay = hours x rate
    select joe blow
    print pay

    by orders of magnitude, but these languages were sold on the basis that just about anyone could write them… without considering that not everyone can write them well if they don’t know what the machine is doing… This tendency (hire cheap, with languages that hide what the machine is doing) continues to today. It has crawled into operating systems design too (cough, Micro$oft).
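    The same select-before-process rule carries straight into any modern language. A minimal sketch in Python (the payroll records and names are made up for illustration):

```python
# Hypothetical payroll records: (name, hours, rate).
records = [(f"emp{i}", 40, 25.0) for i in range(100_000)]
records.append(("Joe Blow", 38, 30.0))

def pay_then_select(recs, who):
    # Inefficient: compute pay for EVERY record, then pick out one.
    pays = [(name, hours * rate) for name, hours, rate in recs]
    return next(pay for name, pay in pays if name == who)

def select_then_pay(recs, who):
    # Efficient: pick the one record first, then do one calculation.
    name, hours, rate = next(r for r in recs if r[0] == who)
    return hours * rate

# Same answer, roughly 100,000x fewer multiplications.
assert pay_then_select(records, "Joe Blow") == select_then_pay(records, "Joe Blow")
```

    Both functions return the same pay; the only difference is whether the selection happens before or after the bulk computation, which is exactly the orders-of-magnitude gap described above.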

    Which brings us to the latest fads:

    Object Oriented Languages: These are just like procedural languages except without the procedure part (except when they do) and are a lot like the 4GL non-procedural languages except when doing procedural things… The big claim to fame is that you can easily reuse prior modules by teaching them a new trick (each layer has inheritance from other layers, and you can effectively inherit a lot of prior work but just overlay the one bit you want to change). This is different from all the other prior ways of reusing code in that it is done backwards… Oh, and instead of loading only one lump of things you don’t need, you can load a long list of ancient things you don’t need, each layer loading the one under it, too. Also, since nobody can possibly learn / read all the documentation for all the things in the Object Library, lots of times you get the same function re-written and may even have two or three things loaded doing the same thing. Needless to say, I’m not fond of OO languages…

    Interpreted Portable Languages: Java, Java-Script, and others.

    The advent of Web Stuff forced the need for code to run everywhere, so now you get to load a whole fake computer environment (a Java Virtual Machine, for example) on which to run your interpreted languages that are slower and often fatter, and then they can be OO languages too, and… so multiply all the above doubles by each other…

    Now all your applications want to be “web enabled” so all THAT bloat gets stuck back into all your basic applications. Never mind that I do not want my spreadsheet talking to anything Web related…

    Virtual Machines: And the final fad is the use of Virtual Machines all over the place. Docker lets you load an entire fixed machine environment in a container so you know your application will work as intended. (Only needed because the whole basic infrastructure has been made “dynamic” with things like Dynamic Linked Libraries (DLLs)… and poor upgrade practices, i.e. auto-updating daily and breaking randomly.)

    Finally, the folks making the operating system can choose to set things fat or thin. As the mantra was “Memory is cheap” they began setting all the compile flags to “USE LOTS OF MEMORY!!!” so why release and reuse memory if you can just malloc a GB or two and not worry? That first showed up when Linux went from a 64 MB memory size machine to 128. Buffers and such were just doubled in one switch setting… God knows how many now…

    Well, that’s the bulk of it, I think.

    Update: Oh, and Applications that hang onto stuff. FireFox is great at this. EVERY web page (and their embedded Java engines running their embedded YouTubes and…) gets held in memory. It just keeps expanding until you are running out of 2 GB memory and 2 GB swap and… (Some browsers on the Tablet don’t do this yet, so are much more efficient.) Why take a page re-load if you can just hold everything in memory… Even Linux does this with caching file inodes. Move a TB of data and you see swap being used even after the move. All the info about all the files is held in memory even though it’s not needed anymore. Eventually it ages out…

    As a quick check showed it getting a bit darker out the window, and it’s about 10 min to max eclipse, I’m going to take a break to experience it. Find a nice tree and look at the pinhole-camera projections of the crescent sun in the leaf shadows ;-)

Comments are closed.