Friday, June 13, 2014

On the Dichotomy of Aging

The process of aging and feeling older (as opposed to actually getting older) has been on my mind a lot lately. This is at least in part because I feel like I haven't done anything groundbreakingly important with my life. Philosophically, though, this has led me to realize a strange dichotomy.

When we're young, we often want to take it easy and have fun before we're forced to "grow up" and do responsible adult things, like work a regular job, get married, and raise spawn. Often, once we've gotten old enough, we may think, "I should have taken the time to have fun back then while I could. Maybe I wouldn't have high blood pressure and heart disease if I had." It seems to me that this thought arises most often in our older years, and usually about the time we spent through the middle. Perhaps we admit that we had fun when we were children, but that we should have done it again when we were of a middling age.

What if we had taken that time when we were in the middle years? Would life be vastly different? I have a sneaking suspicion that if I spent most of my time having fun right now, then when I'm much older, I'll believe I haven't accomplished anything with my life, and complain to one and all that I still need to get shit done. I know this because even now, in my middle years, as I still play games on a daily basis and such, I feel like I haven't accomplished anything important with my life. It may feel more peaceful and relaxing at times, but I worry about how my older self will view my life.

The dichotomy, for me, is simply put: We often think we should have relaxed more when we were younger, but if we had relaxed when we were younger, we may believe we didn't achieve anything worthwhile and must still struggle to do so even as we grow older.

One thing I do know is that I've had a positive impact on many people's lives, from the people on the streets that I give leftover food to, to trying to make everyone around me laugh and keep up their spirits, even in harsh times. This is, in a way, one of my last refuges: At least I made the people around me happier, and rarely, if ever, at the expense of another person's feelings. That's important to me, because I was bullied a lot as a kid, and even as an adult by a psychopath ex.

Yeah. I think if I can make the people around me happy without hurting others, then I can, in the end, say I lived a life worth living, and I feel that's all anyone can hope for when the bell finally tolls.

Wednesday, June 11, 2014

The Icho and the Meme

Today, I invented a word that surprised me in the depth of meaning: icho. It is a portmanteau of the words "idea" and "echo". My proposed definition of it is: the subjective interpretation of an idea over time and experience. Allow me to elaborate.

When I was young, I was very, very Catholic. I tried to do everything the church told me to so that I didn't go to hell. In time, I came to realize that I didn't really believe what the church said or did. For example, I really didn't (and still don't) understand how a selective reading of the Bible could lead people to be able to rationalize the belief that condom usage is a bad thing, when using them has such obvious and powerful benefits. I realize it's a question of one's choice of morals, but I never did see any kind of appropriate justification of that belief, especially not in light of the science of it.

By that time, I'd given up my belief. But the echoes of many of the ideas in the Bible (and other sources) remained, such as do unto others as you would have done unto you, an idea present in the Bible, but also in numerous other pre- and co-existing texts across many cultures. These echo differently for different people, and it's easy to dismiss those echoes because it's difficult to see what could have transformed them to be the way they are. These echoes can evolve, in a way, over time and space, to come to mean very different things for different people. Perhaps a person has had a different experience such that the echo of the idea has a very different interpretation to them than to someone else, i.e. a certain hard surface in their mind mutated the idea along the way, either before or after learning the idea.

When I thought through this line of reasoning originally, I'd had trouble differentiating an icho from a meme. After all, what is a meme, really, except an inside joke among internet trolls? Just kidding. But it IS an idea or behavior that is passed along more by cultural or sociological memory of sorts; a gene for the mind, as the originator of the term probably intended.

The primary difference that I can see is that the meme is intended to convey (debatably) objective information from one person to another, whereas an icho is the subjective interpretation of an idea across time and experience. Similarly, memes are intended to be relatively stable in meaning across people, as opposed to an icho of an icho, which becomes more subjective still.

Life looks to me to be a series of ichoes, from person to person, from person to self, from life to life. And so I believe that life is subjective, and changes and fades over time, to be re-ichoed by experiences.

Monday, June 2, 2014

Conway's Law for the Mind

I was reading up on Conway's Law this morning and had an interesting thought, which I'll detail here.

What makes a good programmer? A lot of people will tell you it's someone who is lazy; it's not that the lazy are inherently more adept, but rather that a good programmer will try to find the least difficult path to a solution and go with that, until and unless it proves futile; then they'll find the next least difficult and continue. Others will tell you it's someone who accounts for the most possible problems that may arise with a system and either codes so as to avoid them, or at least notes to managers that they could be an issue in the future. Heck, I'm sure we all have our own rules for what we need to check before we push anything to our source control servers to ensure we didn't break anything, for example, and a good programmer might be someone who has a lot of those rules.

But how can you tell if a person could become a good programmer? Can you tell? I don't mean find a child and determine if they can program. I mean take any ordinary person off the street and ask a couple questions about how they solve problems; if they answer in a certain way and meet certain criteria, they may be a person who could pick up programming and succeed. What questions would you ask, and what criteria would need to be met? Can you train someone to think differently so that the answers to the questions change over time, and they begin to meet the criteria? This would be useful to know for workforce rehabilitation and such, so that's why I think it would be an interesting area of research.

And what does this have to do with Conway's Law, Kyle, you stupid jerk? I'm sure you're asking yourself that very question right now. What I propose is that people who are good at programming are capable of and regularly do store, organize, and access information in their brains the same way that a computer does. In essence, the structure of the person's brain will closely reflect the structure of a well-thought-out computer program and vice versa. I call this Conway's Law of the Mind.

What does a well-thought-out computer program have? This is, of course, entirely subjective, but I think a few attributes would probably be: high cohesion, low coupling, strong modularity, and responsiveness to errors. I don't think anyone would dispute that those are hallmarks of a likely-well-written program. There may be other attributes, but those are a good place to start. So how do those map to human personality traits, and how can they be divined from a simple question-and-answer scenario? Let's go through them.

High cohesion

Cohesion is a fairly simple concept which, loosely stated, is: "The responsibilities of a code entity are focused toward one goal." Put even more simply, a piece of code isn't trying to do too many things at once.
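
As a quick sketch of what I mean (in Python, purely for illustration; all the names here are made up), compare one class that mixes several jobs with the same work split into focused pieces:

```python
# Low cohesion (hypothetical example): one method that parses, validates,
# and formats all at once, so its responsibility is unclear.
class ReportTool:
    def run(self, raw):
        rows = [line.split(",") for line in raw.splitlines()]
        rows = [r for r in rows if len(r) == 2]
        return "\n".join(f"{name}: {value}" for name, value in rows)

# Higher cohesion: the same work as three functions, each focused on one goal.
def parse(raw):
    return [line.split(",") for line in raw.splitlines()]

def validate(rows):
    return [r for r in rows if len(r) == 2]

def render(rows):
    return "\n".join(f"{name}: {value}" for name, value in rows)
```

Both produce the same output, but each small function can be read, tested, and changed on its own, which is the whole point.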

I would argue that this has a very interesting parallel in the interview room. When a person is trying to solve a problem, do they say a lot of irrelevant things, little details that make no difference to the actual answer or problem? Are their questions actually related to the problem? These are the types of things I've looked for, and they're useful for determining if a candidate is thinking clearly. Conversely, it's easy to mistake the absence of this trait for incompetence; I don't believe it's correct to assume that if a person seems a little scatterbrained, they therefore won't be a good programmer. The simple reason is that they could just be very nervous, as interviews often cause feelings similar to White Coat Syndrome, though that's just my belief, not a factual statement that I researched.

Low coupling

Coupling is the idea that a code entity has dependencies on other code entities, and hence low coupling means a piece of code doesn't have too many dependencies. High coupling means a piece of code depends on numerous other pieces of code, and that can be a really big issue, for the simple reason that changes to those dependencies will likely break the dependent code, or at least make it less stable.
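
One common way to lower coupling, sketched here in Python with hypothetical names, is to have code depend only on the narrow interface it needs, handed in from outside, rather than constructing a concrete dependency itself:

```python
# High coupling (hypothetical example): Orders builds a concrete
# SqlDatabase itself, so any change to SqlDatabase risks breaking it.
class SqlDatabase:
    def save(self, record):
        print(f"saving {record} to SQL")

class TightlyCoupledOrders:
    def place(self, order):
        SqlDatabase().save(order)

# Lower coupling: Orders depends only on "anything with a save method",
# passed in by whoever creates it.
class Orders:
    def __init__(self, store):
        self.store = store  # any object with .save(record)

    def place(self, order):
        self.store.save(order)
```

With the second version, a test can hand Orders a fake store instead of a real database, and a change to SqlDatabase no longer ripples through everything that places orders.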

In this sense, when interviewing, I like to know how a person gets along with the rest of the team. I will usually have several people from the team interview them. I think this is likely something we've all come to do, but for each person, I have them talk with the previous interviewer in-between sessions so that they can stagger a series of questions that build upon each other. Finally, I have someone toward the end make a change to an earlier requirement. If the person identifies that this will break code from previous interviews, then that's VERY good, and a very good sign that they will be a good programmer, in my book. However, I believe the same negation problem applies here as with high cohesion: if a person doesn't see it, it doesn't necessarily mean they'll be a bad programmer; it could just mean they've had a long and exhausting interview process. We've all had them!

Strong modularity

Modularity is the idea that a program is broken up into multiple entities that clearly define their boundaries and cleanly separate their responsibilities, so that they don't have too many dependencies, they each have high cohesion, and they're very, very difficult to break when used in the way they were intended.

This has a rather simple parallel in the interview room: If a person takes the problems given, breaks them up into small steps, clearly elucidates and enumerates the steps required to solve the problem, and executes those steps accurately, they will probably be a good programmer. This is almost the most important rule, I think, frankly. To me, this one almost doesn't have the negation problem that the previous two did. If a person can't clearly think through the steps to solve a problem, while you can say they may be tired or nervous, it will be their job in the hot-seat to kick ass in scenarios just like this, and if they can't cut the mustard, they probably can't be a good programmer. On the other hand, if they admit that they don't know how to solve the problem because they've never encountered anything like it, but they make an attempt using all the knowledge they have available, I actually really like that, as long as they clearly explain their steps and execute them the way they say they will. The difference is simple: They don't have to get the right answer, so long as they get AN answer, with clear, logical steps that are close. In this sense, the solution to the problem is almost irrelevant; in an ideal world, everyone would get every problem right, but that will just never happen, especially since interview questions can and should differ per business. A person who has the right thoughts will be easier to train than one who does not.
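
To make the "small, clearly enumerated steps" idea concrete, here's a Python sketch of a made-up interview problem (find the most common word in a text), decomposed the modular way:

```python
# Hypothetical interview problem, solved as small steps with clear boundaries.
def normalize(text):
    return text.lower()

def tokenize(text):
    return text.split()

def count(words):
    freq = {}
    for w in words:
        freq[w] = freq.get(w, 0) + 1
    return freq

def most_common(freq):
    return max(freq, key=freq.get)

def solve(text):
    # Each step can be explained, tested, and replaced on its own.
    return most_common(count(tokenize(normalize(text))))
```

A candidate who narrates a decomposition like this out loud, then executes it accurately, is showing exactly the trait I'm describing, even if their final answer has a small bug in it.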

Responsive to errors

Being responsive to errors means a program won't just blow up as soon as something unexpected happens. It will not become entirely inoperable, either, hopefully. We see this happen all the time, unfortunately, but this is mostly because a lot of problems can't be solved by the program itself, but require the operating system to get involved (think things like stack overflows, out of memory errors, etc.). Ideally, a program would be able to handle all other errors safely and recover appropriately, e.g. if a person attempts to log in using the wrong password by accident, instead of crashing, the application notifies the user of a typo and lets them attempt to correct it.
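
A minimal Python sketch of that login example (the user store and messages here are hypothetical) might look like this, where bad input is an expected condition to report rather than a reason to crash:

```python
# Hypothetical in-memory user store, for illustration only.
USERS = {"alice": "s3cret"}

def login(username, password):
    """Return (success, message) instead of blowing up on bad input."""
    stored = USERS.get(username)
    if stored is None:
        return (False, "unknown user; check for a typo")
    if password != stored:
        return (False, "wrong password; please try again")
    return (True, "welcome")
```

The point isn't the specifics of the check; it's that every anticipated failure path ends in a message the user can act on, and the program keeps running.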

I find this one has a rather strange associated interview question: What if the interviewer gets something wrong? I like to see how people react to things they (should) know are wrong. Is the interviewee confident in their knowledge? When something is wrong, can they identify what it is, and are they not afraid to speak up? Depending on the company you work for, do they overreact to errors? This can be a serious problem in big, political corporations, and is certainly something to look out for: As much as we want a person who is able to get the thought-processes correct and also get the correct answer, so also do we want a person who isn't going to implode if they see an error, but rather someone who is capable of handling those errors in a sane and rational way.


This more or less concludes my post. I really like the idea of Conway's Law of the Mind, and was wondering if you had any interview tips to see if a person's mind is structured well to be or become a programmer.