View Diary: Warnings From The Trenches (188 comments)

  •  When it comes to computers there's another prob. (5+ / 0-)

    Innovation in computer software has another insidious effect:  

    It teaches people to be bad at computer software.

    Computer software is unique as a labor-saving device in that the labor it saves you from is thinking, rather than physically exerting your body.

    So the better the software is at its job, the more it insulates you from the details of what's going on, and the more ignorant you become of how a computer works.

    Which becomes a problem for the next generation of software because the people who write it didn't grow up having to know much about their computer to use it.

    Once upon a time you HAD to know at least a crude, basic set of commands to issue, and how to write very simple software, just to operate a computer.  The effect was that most people didn't bother having one.  While it's good that more people can use them now, it came at the cost of most of them having no clue how anything works.  Worse yet, systems these days ship without even the tools to learn more; you have to buy add-on software for that.  (And if you don't know what you're doing, you won't even know that this is the step you'd have to take to start learning what you're doing.)  Once upon a time all computers shipped with some type of programming language by default.  Not anymore.

    When they do decide later on to start learning how to write software, their teachers end up having to start from scratch with them.

    The problem is made worse by the legal morass of intellectual property when it comes to copyright and patent law.  When copyright and patent laws were first made, the demarcation between the two was simple: copyright is for text you write, which is an inherently passive work that sits there until a human reads it, while patents are for active things that do stuff on their own.  The problem is that when it comes to writing software, that line of demarcation is nonexistent.  All software is BOTH text you can read AND an active thing that does something.  So software companies get the luxury of picking and choosing whether to file a patent on a thing their software does or to claim copyright over the software itself (usually they do both at the same time).  The effect this has is that it becomes increasingly illegal to learn how your computer works in a school setting.  You end up having to learn a lot of it from your employer later on instead.

    •  well, I am going to jump in on this (11+ / 0-)

      and agree only in part

      let me set up my remarks by noting that before I became a teacher I was in data processing, in a variety of capacities, for more than two decades: my experience goes back to 1401s and punch-card technology; I was a certified systems professional and am technically still a certified data processor; I helped write the 1985 CDP exam because I had the highest score in the nation on the systems analysis portion of the 1984 exam; and at that point I probably knew as much about the different COBOL compilers around the country as anyone in the nation

      In the earlier days we were quite limited in storage, so we took lots of shortcuts.  That led to the supposed crisis in 2000 because of using YYDDD formats for dates (which, when packed, took only 3 bytes of storage).  Smart professionals used other methods - a fixed offset from some date in the past, for example - so that the so-called Y2K problem would not exist.
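
      For anyone who never dealt with packed dates, here's a rough sketch in modern Python of the two schemes I'm contrasting - the field widths and the 1901 base date are just illustrative, not anybody's actual production layout:

          import datetime

          def to_yyddd(d):
              # Two-digit year plus day-of-year: 1999-12-31 -> 99365.
              # The century is thrown away, which is the whole Y2K problem:
              # 2000-01-01 encodes as 1 and sorts *before* 99365.
              return (d.year % 100) * 1000 + d.timetuple().tm_yday

          def to_day_offset(d, base=datetime.date(1901, 1, 1)):
              # Days elapsed since an arbitrary fixed date in the past.
              # Monotonically increasing, so the century rollover is a non-event.
              return (d - base).days

          for d in (datetime.date(1999, 12, 31), datetime.date(2000, 1, 1)):
              print(d, to_yyddd(d), to_day_offset(d))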

      But we also understood about backup and recovery.

      We could not afford to send out software that was full of bugs (are you listening, Bill Gates?)

      It was rare to receive purchased software only in executable form, and most good programmers and all systems programmers could read machine-level code to see what was actually going on in the computer.

      Of course, most software then was far less complex than the average email or word processing or spreadsheet program is today.  Remember, memory was very limited.  People learned to be very terse and precise in their coding to save memory, and this often meant we were far more likely to identify problems with the code before it went live.

      I have seen far too many expensive pieces of software today that are buggy.  I agree that the user is quite limited in being able to figure out problems.  I used to be able to take purchased software and debug it and tell the vendor what was wrong, because we were sent source code that we compiled on our own machines.  Now you get executable code in a black-box format with no real way to examine what is going on, supposedly for proprietary reasons.

      But sharp computer folks can almost always find a hack to get around problems -  being known as a hacker used to be a good thing some three decades ago.

      I have reached the point in my own life where I no longer care to get into the internals, but want only to know what I need to in order to apply software to the task at hand.  I paid my dues for several decades.

      Peace.

      "We didn't set out to save the world; we set out to wonder how other people are doing and to reflect on how our actions affect other people's hearts." - Pema Chodron

      by teacherken on Tue Feb 05, 2013 at 03:06:57 PM PST

      •  I don't see the part where you disagree. (1+ / 0-)
        Recommended by:
        CA wildwoman

        You open by saying you only agree "in part" so I started searching your comment for the bit where you disagree and I couldn't find it.

        (Oh, and on an unrelated note, the system of marking dates as seconds-since-epoch didn't entirely get rid of the problem, it just kicked it down the road to the year 2038, which is when a 32-bit two's-complement signed integer will no longer be able to count the seconds since Jan 1, 1970.)
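
        If you want to see exactly when that 32-bit counter runs out, the arithmetic is short - a quick Python sketch:

            import datetime

            # A signed 32-bit counter tops out at 2**31 - 1 seconds past the epoch.
            epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
            rollover = epoch + datetime.timedelta(seconds=2**31 - 1)
            print(rollover)   # 2038-01-19 03:14:07+00:00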

        •  but that was only one solution to yyddd (1+ / 0-)
          Recommended by:
          CA wildwoman

          some used a fixed offset of days from some arbitrary date in the past - say 01/01/1901

          of course some of them forgot that 2000 would be a leap year even though 1900 was not, but that was a separate problem.
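
          The rule they tripped over is simple enough to write down - a quick sketch of the Gregorian leap-year test in Python:

              def is_leap(year):
                  # Divisible by 4, except century years, except centuries divisible by 400.
                  return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

              print(is_leap(1900), is_leap(2000))   # False True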

          "We didn't set out to save the world; we set out to wonder how other people are doing and to reflect on how our actions affect other people's hearts." - Pema Chodron

          by teacherken on Tue Feb 05, 2013 at 03:19:43 PM PST

          •  That still doesn't fix the problem. It just kicks (0+ / 0-)

            it down the road a lot further.  ALL date formats on any von Neumann machine will inevitably store the date in a form that can represent only a fixed, finite set of dates.  The problem will always be there in some form; it only "goes away" in the sense that the solar system won't last forever, so any finite timekeeping scheme that at least outlasts the earth is probably good enough.  Basically the seconds-since-epoch problem coming in 2038 will be solved by switching from storing the time in 32-bit numbers to storing it in 64-bit numbers (which is already happening in modern OSes), which can count a span of seconds far longer than the earth will last.  Still finite, but good enough for most things.  (I'm not being snarky when I say "most things", because some astronomy calculations really do involve timescales longer than the lifespan of the earth.)

            (Basically, I find seconds-since-epoch far superior just because the Western calendar is so ugly in how it works.  When the computer calculates things in its own head, it makes far more sense for it to keep track of time as a single number.  It can be scaled to seconds if that's the accuracy you need, or scaled to days if that's the accuracy you need, but either way it's the same concept: don't bother dealing with the crap about some months being 30 days, some 31, and one being 28 except once every 4 years when it's 29 - except it's not exactly every 4 years, and sometimes it's still 28 even though the year is divisible by 4, and so on and so forth.  Just ignore all that messy stuff until the moment you want to translate the time back into something human-readable, or you want to take input from a silly human who insists on using that messy system; only for doing that translation do you deal with the messy human calendar.  The rest of the time you think of time as a simple scalar number.)
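
            To put rough numbers on both points, here's a quick Python sketch - how far a signed 64-bit count of seconds reaches, and keeping time as a plain scalar until the human-readable boundary:

                import datetime

                # A signed 64-bit second count reaches on the order of
                # 292 billion years - far longer than the sun will last.
                print((2**63 - 1) / (365.25 * 24 * 3600) / 1e9, "billion years")

                # Internally, time is just a scalar; the messy calendar only
                # appears at the boundary, when translating for (or from) a human.
                t = 1_000_000_000                      # seconds since the epoch
                human = datetime.datetime.fromtimestamp(t, tz=datetime.timezone.utc)
                print(human)                           # 2001-09-09 01:46:40+00:00
                print(int(human.timestamp()) == t)     # back to the scalar, unchanged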

      •  Actually, that's not true as to a sophisticated (0+ / 0-)

        enough programmer, binary is just as good as source code.  Perhaps you haven't seen how it was discovered that Sony used the same "random" number in all their PS3 signatures, making it possible to use simple algebra to recover their private signing keys?
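
        The algebra really is that simple.  Here's a toy Python sketch of the idea - the modulus and numbers are made up and there's no actual elliptic curve in it, just the signature equations, which is all the recovery needs (not Sony's real parameters):

            # ECDSA-style signing: s = inverse(k) * (z + r*d) mod n, where d is the
            # private key, k is the per-signature "random" nonce, z is the message
            # hash, and r is derived from k.  Reuse the same k (and hence the same
            # r) for two messages and you have two equations in the two unknowns
            # k and d - simple algebra hands you the private key.

            n = 2**61 - 1        # toy prime modulus standing in for the curve order
            d = 123456789        # the "secret" signing key (made up)
            k = 987654321        # the nonce that should have been fresh every time
            r = 555555557        # in real ECDSA this comes from k*G; here it's fixed

            def sign(z):
                return (pow(k, -1, n) * (z + r * d)) % n

            z1, z2 = 1111, 2222            # two different message hashes
            s1, s2 = sign(z1), sign(z2)    # two signatures sharing the same k and r

            # Recovery using only public values (r, s1, s2, z1, z2):
            k_rec = ((z1 - z2) * pow(s1 - s2, -1, n)) % n
            d_rec = ((s1 * k_rec - z1) * pow(r, -1, n)) % n
            print(k_rec == k, d_rec == d)  # True True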

        As for systems not having tools like a compiler and assembler, thank goodness for the GNU project and gcc (and mingw32, etc).  In fact, that kind of thing is exactly why Richard Stallman started the Free Software Foundation and GNU project in the first place.

        You have watched Faux News, now lose 2d10 SAN.

        by Throw The Bums Out on Tue Feb 05, 2013 at 06:39:05 PM PST

        •  don't be so sure (1+ / 0-)
          Recommended by:
          Throw The Bums Out

          going back to my day dealing with 2nd-generation hardware and software, a number of us knew how to create self-modifying software that was not easily traceable in the source code, and thus for practical purposes untraceable in the binary - when did you take the dump, before or after it modified itself?

          "We didn't set out to save the world; we set out to wonder how other people are doing and to reflect on how our actions affect other people's hearts." - Pema Chodron

          by teacherken on Tue Feb 05, 2013 at 06:43:52 PM PST

          •  I wouldn't be so sure, how about cracking software (0+ / 0-)

            that is not only heavily obfuscated and self-modifying but actually uses its own custom bytecode and virtual machine system, which is itself heavily obfuscated and self-modifying, with dozens of layers of encryption and obfuscation?  Ever hear of StarForce?  That is the equivalent of trying to debug an NES game running on PocketNES running on a GBA emulator running inside DOSBox - only every single layer is heavily obfuscated and almost completely written in self-modifying code.  Of course, it helps that now you can waste 512MB-1GB of RAM on the obfuscation/encryption/self-modification part alone.
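
            To give a sense of what "its own custom bytecode and virtual machine" means at the simplest possible level, here's a toy Python interpreter for a made-up three-instruction stack machine - protection schemes like that bury dozens of obfuscated, self-modifying layers of exactly this idea:

                # A made-up stack-machine bytecode: a list of (opcode, operand) pairs.
                PUSH, ADD, PRINT = 0, 1, 2

                def run(program):
                    stack = []
                    for op, arg in program:
                        if op == PUSH:
                            stack.append(arg)
                        elif op == ADD:
                            stack.append(stack.pop() + stack.pop())
                        elif op == PRINT:
                            print(stack.pop())

                # The "protected" logic ships only as this opaque list of numbers;
                # a cracker has to reverse-engineer the interpreter before the
                # program itself even begins to make sense.
                run([(PUSH, 2), (PUSH, 40), (ADD, 0), (PRINT, 0)])   # prints 42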

            You have watched Faux News, now lose 2d10 SAN.

            by Throw The Bums Out on Tue Feb 05, 2013 at 07:00:43 PM PST

          •  Most modern systems explicitly forbid (0+ / 0-)

            self-modifying code.  As in, it breaks assumptions the OS makes about how to handle multitasking and scheduling, so if the hardware has such instructions they are deliberately disabled by adding them to the list of things that will cause a fault and get trapped by the OS.

            •  Actually, no they don't as that would make most (0+ / 0-)

              modern web browsers unusable, as they use dynamic recompilation to convert JavaScript code into native code.  The same is done with Java and Flash, both of which would be impossible without self-modifying code.
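
              At bottom that's all a JIT is doing: asking the OS for memory it can both write and execute, dropping freshly generated native code into it, and jumping there.  A rough Python sketch, assuming Linux on x86-64 - a system with a strict W^X policy will refuse the writable-plus-executable mapping, which is exactly the restriction we're arguing about:

                  import ctypes, mmap

                  # x86-64 machine code for:  mov eax, 42 ; ret
                  code = bytes([0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3])

                  # Ask the OS for a page that is writable AND executable --
                  # the same request every JIT has to make in some form.
                  buf = mmap.mmap(-1, mmap.PAGESIZE,
                                  prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
                  buf.write(code)

                  # Treat the start of that page as a C function returning an int.
                  addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
                  jit_func = ctypes.CFUNCTYPE(ctypes.c_int)(addr)
                  print(jit_func())   # 42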

              You have watched Faux News, now lose 2d10 SAN.

              by Throw The Bums Out on Wed Feb 06, 2013 at 10:55:38 AM PST

        •  You are really stretching the truth there (0+ / 0-)

          when you used the phrase "just as good as".  That phrase implies it doesn't take any longer to understand binary code than to understand source code, and that seeing only the binary doesn't make things any harder to follow.  That's absolutely incorrect.  Even if you are dealing with a programmer who can understand the binary, you lose all the naming of things.  Variables in disassembled binary code no longer carry English names (unless you were really lucky and got your hands on a debugger-usable version of the binary that still has the symbol table in it, but that's typically not what gets distributed as the final product, now is it?)

          If you claimed that there exist some people who can make use of the binary code, I'd agree.  But when you claim that those people will find the binary code "just as good as" the source code, that's total bollocks.
