Warning... A bunch of rambling from an old-time programmer lies ahead, should you wish to read below this line:

So I am sitting here munching on homemade deer salami while drinking coffee (would have been homemade beer, but I gave up drinking years ago) and my mind begins to wander (as it did all last night, which, combined with the fact that my son was sick, resulted in very few hours of sleep) onto all sorts of paths (what will I be doing job-wise in the next five years, how will an upcoming black bear hunt turn out, what kind of home projects do I want to work on, etc.).  After both reading one of the latest posts on this site (dealing with some new reversing library) and some unstructured focusing of mental capabilities, I have begun to wonder if my newfound frame of mind is shared among those members of the board who have been around the block time and time again, or if I am the only one.

The frame of mind that I am referring to is almost one of apathy, compared to my younger years, as it relates to the fundamental urge not only to know how everything works but also to be able to apply that knowledge to manipulate the items that make up our little computer universe (hardware, OSes, and the applications that live within).

At this point in my life (can't be a mid-life crisis, as I am only 37) I find myself bored with the idea of reversing everything in sight just for the sake of doing so (when I first started out, it was just the opposite, as I could not find enough things to figure out).  As for the development side of the house, I have reached a point where I no longer care to spend any time developing unless there is both an actual need and a specific purpose (I'm just not one of those individuals who cares to code for the sake of coding).

Am I the only one who no longer finds himself needing to understand each microscopic level of such things as what takes place when a user causes a memory access request (from the user perspective all the way down through the OS and into the microcode of the processor, and then back up), but is satisfied knowing that, should the situation arise, I can quickly utilize my skill set, prior experience, prior knowledge, and the vast amount of resources I have acquired over the years to unlock the secrets of any computer-based subject regardless of the interaction level (hardware, OS, user, etc.), apply those secrets to the task at hand, and, when all is said and done, swap that knowledge out for some other task (until I need it again)?  While I am sure that the future of computing is full of new types of OSes, programming languages, and hardware interactions, I for one find it all to be not only old news, but the concept of learning each of these to be only as important as the means to a particular end (heck, I have even had to dive back into the world of application-level development using the latest .NET junk {perl, vb.net, etc.} just to keep my employer happy).  What do I mean by this?  Well, I am sure that each one of these new items has a reason to exist, but if I am going to spend the time to learn its uses, it must be the right tool for the job (i.e. no reason learning the latest craze in web development if it does not apply to the project that I am working on) -- if only I could apply my ability to quickly learn and understand programming languages to my battle with learning both German and Latin!

With that said, I find those days of my software version of the movie Paycheck (the reverse engineering and then rebuilding part) all but over -- that is, until I am forced to look for a new job.  Heck... who knows... perhaps I will write a book focused on the concept of making order out of the chaos that is a software bug (regardless of the OS, hardware, and development languages involved) {i.e. given application A, developed in language X, running on hardware B on top of OS C, introduce bug D within the stream of execution Y at level Z (hardware, OS, or user).  Now, using tools N, utilize your knowledge of all things computer-related (without prior knowledge of the particular system in question) not only to track down the point of origin of bug D but also to come up with solution W to resolve bug D and ensure that other such bugs do not happen in the future}.  Another section of the book could be on utilizing one's ability to look at both the bigger picture and the microscopic level of computer operations to not only streamline daily tasks (i.e. compile-time tasks, QA, etc.) but also improve/enhance one's ability to perform said tasks (in other words, use one's coding ability to code oneself out of his or her own job, in a manner of speaking).
Posted on 2008-01-01 15:17:36 by madprgmr
I stopped hacking bits long ago (sometime around when I left the DOS world.)

Now I hack algorithms: the desire to understand why one algorithm performs better than another in a given situation, and so forth... and I'm not talking about common algorithms.  Mainly algorithms from the fields of machine learning and massive state-space search: Genetic Algorithms, Simulated Annealing, Neural Nets, Temporal Difference Learning, MTD(f), and so forth.

I spend a great deal of time thinking about and trying to tackle some simple problems which are so far computationally infeasible.

A very good example is the problem of finding optimal comparator networks (a.k.a. sorting networks), and how a problem so simple can be so computationally infeasible.  One chap tossed a variation of a genetic algorithm at the problem (he called it the END algorithm) and found a network that sorts 16 elements with one fewer comparator than any previously known network.  It is still not known whether this network is optimal, and it is unlikely that there will ever be an exhaustive enumeration to prove it (the state space for even this small problem size is approximately 1.2E+0142).
Posted on 2008-01-05 05:11:28 by Rockoon
Rockoon:

After reading that, all I can say is wow -- just thinking about the math involved in all of that makes my brain hurt.

Just out of curiosity, to what do you apply your findings?

Posted on 2008-01-05 11:29:11 by madprgmr
madprgmr, I think there are many reasons for that apathy.
- PCs being powerful enough to do almost any task we currently need done. Meanwhile, your competition is weak .NET-generation junkies - your job is not to make software like well-polished gems anymore; it's a plain race to quickly ship shovelware.
- how can you keep an interest in OSes' internals, when whatever you learn from one OS applies 90-99% to all the others? Desktops, PDAs, smartphones, consoles, next-gen TVs - you can describe the differences in only a few sentences. You have no need to study the new OSes for eventual future tasks at your job, because if you ever need to, you will quickly learn those differences - and surpass the .NET junkies in no time (as long as you're not forced to use only their primitive tools).
- also, your left brain easily causes apathy toward those subjects: you probably no longer have any incentive to plunge into that development/reversing. Before, your incentive was probably "to train, to gain those abilities". Now that you are certain you know how to do all the tasks you might ever need to do, more training is considered a waste of time, and thus you refuse to spend your spare time on such development/reversing => you seem apathetic to those subjects.

Fortunately for you, there is a heap of fields in comp. sci. where you can start leveling up :) . Your reasoning will quickly kick in on whatever you try to add as a hobby - so you really should force yourself to stop thinking about finance, investment, and time (all forms of these three terms) when you start searching for a new field to conquer. There are plenty of unsolved problems, and if studying is your forte, there are tonnes of materials.

Btw, just for kicks, see the Cell BE CPU documentation - its SPEs are the only place on Earth where .NET shit is not and never will be allowed ;). It will be an interesting read for you, at least. Another similar place is current-gen GPUs... but there, too much goes on under the hood: JIT recompilation happens whenever the drivers wish, and the "full docs" are a small, uninspired paper full of blanks.

And btw, age has nothing to do with it - a similar thing happened to a 26-year-old friend of mine. He had just mastered the couple of subjects he was interested in.
I can bet you already knew/thought about the things I wrote down :)
Posted on 2008-01-05 12:02:29 by Ultrano

madprgmr, I think there are many reasons for that apathy.
- PCs being powerful enough to do almost any task we currently need done. Meanwhile, your competition is weak .NET-generation junkies - your job is not to make software like well-polished gems anymore; it's a plain race to quickly ship shovelware.
- how can you keep an interest in OSes' internals, when whatever you learn from one OS applies 90-99% to all the others? Desktops, PDAs, smartphones, consoles, next-gen TVs - you can describe the differences in only a few sentences. You have no need to study the new OSes for eventual future tasks at your job, because if you ever need to, you will quickly learn those differences - and surpass the .NET junkies in no time (as long as you're not forced to use only their primitive tools).
- also, your left brain easily causes apathy toward those subjects: you probably no longer have any incentive to plunge into that development/reversing. Before, your incentive was probably "to train, to gain those abilities". Now that you are certain you know how to do all the tasks you might ever need to do, more training is considered a waste of time, and thus you refuse to spend your spare time on such development/reversing => you seem apathetic to those subjects.


I must admit that you have done a fine job of complementing my post, as the items that you mention above are things that I wish I had included within the body of my post (perhaps you work in the psychology field?).

I guess the one thing that I will never understand about those ".NET junkies" is that they seem to be interested only in the latest version of the latest "in" programming language, and if you strip their tools away from them (with most of those tools being developed by someone else), they have no true understanding of programming.  Heck, the ones that I have interviewed (while I was working at Transmeta) could not even figure out how to write a basic memory copy routine (their precious objects just do all the work for them).  To this day I still remember a co-worker's response to a programming question (while I was working at Symantec): "I don't know how it's done; the application wizard takes care of all that for me".  While I see no need to always re-invent the wheel, I do feel that it is important to understand the code block (object or whatever) that you are borrowing from someone (be it from a book, the web, or a co-worker), and if you do not understand it, you should have the mental tools available to figure it out.

Granted, I am not the smartest one out there, but I do my best to ensure that I understand what I am using (be it tools or code or whatever).  Heck, I remember a co-worker of mine (fresh out of college) always griping at me that he could not understand how I could rip through a tough debugging issue in the blink of an eye yet did not have concept one of red/black binary trees and their various theoretical uses.  My answer to him, "when I encounter a need to know, I will pick up a book, study what I need, take what I need for the given situation, resolve the issue, swap out the newly gained knowledge for the next needed block, and then move on", always drove him mad (he thought that my handwritten books of pieces of information on computer systems {ranging from the 8088 up to the present} and various OSes -- he called them my swap pages -- were just nuts)!  What drove him even madder, along with being almost incomprehensible to him, was the concept of shotgun debugging (I still giggle when I think about how he would ask me why I don't go down every code path every time, and I would respond with "because I don't... it's just of no interest to me yet").

To this day I still live my life by the words of Dr. Alan Solomon (for those of you old enough to remember Dr. Solomon's Antivirus, it is that Dr. Solomon), imparted to me during a job interview with him: "If you try to remember everything you will only fail.  It is more about knowing your resources and how to quickly use them in any given situation than it is about memorization".  For those of you new to the programming game, these words of advice from a master (this guy could look at a debug printout in hex and quickly tell you exactly what the program was doing; heck, he used to think that I was crazy for wanting to debug viruses using SoftICE -- he just used a debug printout) have served me very, very well over the years, and I recommend that you take heed.

For me, being a programmer is like being a master of a martial art: the type of person who can take what he knows and mold it to fit any situation that he may encounter.  These .NET junkies (or whatever the latest craze in development is) are, to me, nothing more than individuals who want to pump up their resumes and be in with the "in" crowd.  On the other side of the coin, I do understand the viewpoint of those individuals in the scientific world who use such programming to perform works of art in their respective fields, and I have nothing against that.


Fortunately for you, there is a heap of fields in comp. sci. where you can start leveling up :) . Your reasoning will quickly kick in on whatever you try to add as a hobby - so you really should force yourself to stop thinking about finance, investment, and time (all forms of these three terms) when you start searching for a new field to conquer.


I wish that I could!  While I have expended much effort in trying to get back to basics (I have taken large pay cuts to move to Montana and work out of my house), it is very hard to get past having to think about where your job will be in the next x number of years (especially if you have been laid off as much as I have, due to company failures and the like, along with having a wife and kids).


There are plenty of unsolved problems, and if studying is your forte, there are tonnes of materials.

Btw, just for kicks, see the Cell BE CPU documentation - its SPEs are the only place on Earth where .NET s*** is not and never will be allowed ;). It will be an interesting read for you, at least. Another similar place is current-gen GPUs... but there, too much goes on under the hood: JIT recompilation happens whenever the drivers wish, and the "full docs" are a small, uninspired paper full of blanks.


I just checked out the information on this Cell BE CPU (from "ISSCC 2005: The CELL Microprocessor", http://www.realworldtech.com/page.cfm?ArticleID=RWT021005084318), and it does sound very interesting.  Perhaps I will try to find a need to pick up a system that uses this processor.


And btw, age has nothing to do with it - a similar thing happened to a 26-year-old friend of mine. He had just mastered the couple of subjects he was interested in.
I can bet you already knew/thought about the things I wrote down :)


Glad to hear both that age has nothing to do with it and that I am not the only one with the problem (I thought that it might be time to up the medication :) ).  How did things turn out for your friend?

Posted on 2008-01-05 18:09:30 by madprgmr

I just checked out the information on this Cell BE CPU (from "ISSCC 2005: The CELL Microprocessor", http://www.realworldtech.com/page.cfm?ArticleID=RWT021005084318), and it does sound very interesting.  Perhaps I will try to find a need to pick up a system that uses this processor.

Go pick up a PlayStation 3 :)

IIRC, Sony has made it possible to run Linux on those, so you should be able to actually code for them.

Oh, and the latest craze in programming would be Ruby. And I wish they would make a fully native version of C#; the language itself seems pretty nice. Or, better yet, make a "package" system for C++ like what .NET and Java have, and Turbo Pascal had with its "units" back in the early '90s.
Posted on 2008-01-05 19:10:15 by f0dder
f0dder: use D

:P
Posted on 2008-01-05 19:24:33 by vid
I think you should use Ruby unless you use Python but Perl is good enough except when you need C or better yet C++ but one guy recommends Lisp except another guy likes C# or VB.NET but Haskell is competing with Scheme.....something sick is going on here.

The Programmer's Hierarchy
Posted on 2008-01-05 20:20:02 by drhowarddrfine

f0dder: use D

:P

D is interesting and all, but I don't think it's necessarily The Solution(TM).
Posted on 2008-01-05 21:08:18 by f0dder

Glad to hear that both age has nothing to do with it and that I am not the only one with the problem (I thought that it might be time to up the medication :) ).  How did things turn out for your friend?

He quit making general PDA software and moved to another company to make only 2D/3D mobile games. He seemed happy about it (even after two years of dealing with the nagging problems they meet in the OSes there). And whenever his job gets boring, he visits the nearby Ubisoft office and gives them a hand.

The PS3 is the cheapest way to get a Cell CPU, but if you don't want to work with a tiny NTSC monitor's screen size, you need an HDCP-enabled monitor (HDMI or DVI) or an HDTV with component cables. The cheapest monitor is the Samsung 205BW.  And Linux for the PS3 is slow and cranky.


...I wish that I could!  While I have expended much effort in trying to get back to basics ....

What I meant, mostly, is that to fend off apathy while starting to study some other field of comp. sci., there are three general thoughts you should avoid: "financial: it'd be cheaper if I do nothing", "investment: even if I master this subject, I will hardly find a job with it", and "time: studying this is, overall, a waste of time".  You simply need to entertain yourself in a creative way, for as cheap as possible.
Posted on 2008-01-05 23:20:27 by Ultrano

Go pick up a PlayStation 3 :)


I would love to, can I use your credit card f0dder?  ;)
Posted on 2008-01-06 01:06:24 by madprgmr
This reminds me of a cartoon I saw in the New Yorker magazine: There was a sign which said, "Stop and Think." One guy says to another one, "Kind of makes you 'stop and think', huh?"
Posted on 2008-01-11 12:25:32 by bitRAKE
Sorry it's a little late; I just felt I had something to add.

I have been in this state of mind for a long, long time.

Pretty much straight away after learning I can do anything I set my mind to, I realised there is no point setting your mind to anything!

If you know you can do something, why bother trying to prove it (to yourself)!

My addition: I have recently forced myself to start learning again, with an urge to regain my old inquisitive mind that was capable of anything.

The big concern that keeps flowing round my mind now, though, is where all this knowledge will go when I die!

Btw I am 25 years young  :shock:
Posted on 2008-02-13 15:43:28 by wilson

If you know you can do something, why bother trying to prove it (to yourself)!

The product could turn out really useful to others or yourself. You will usually find some interesting quirks not mentioned in the books you've read. You'll shuffle through dozens of books, but finally solve those problems yourself - it usually feels great afterwards. You might write an article later...
Reading is interesting, and so is programming. I think it's best if you do both; it's most fun when reading and writing a lot :). Both become boring if you've been stomping around the same field for a long time, so plunge into another field :). Or negate some of the limits of the current field (e.g. at work I usually develop games for underpowered smartphones, and I got a bit sick of the limitations, so in my spare time I've undertaken some next-gen pixel shader 3/4 goodness).



The big concern that keeps flowing round my mind now, though, is where all this knowledge will go when I die!

If the knowledge you have is not already completely present, in chunks, in many existing books, the only way to pass it on is by writing an article/book or becoming a teacher (not necessarily at a uni or any facility), obviously :P. At least, in the future, your kids will probably like to learn from you.
Posted on 2008-02-13 19:22:57 by Ultrano
At least, in the future, your kids will probably like to learn from you.


Not in my experience.
My father, who is in his 60s now, has been involved with restoring old cars for most of his life - I never learned a thing about cars from him, opting to learn from my friends and other sources... such a wasted resource, as I look back on my life. He has taught me much, just not from his own field of expertise, and that is only my fault.
My son, who is now 16, is unwilling to learn programming or engineering of any sort from me, although he will listen to general life knowledge and absorb it.
Perhaps this is a peculiarity of my bloodline, but I doubt that.
Generally, I think we all want to forge our own path in life, dismissing the wisdom that is closest to us, but inevitably walking the well-trodden paths despite our attempts to leave our own mark.

Posted on 2008-02-14 00:04:22 by Homer
I haven't been active lately when it comes to programming; it's 3D modelling these days.
Posted on 2008-02-18 12:17:12 by daydreamer