Star Ford

Essays on lots of things since 1989.

You have a voice

I wrote this poem for the Pride vigil last Spring.


If you never understood how to be part of a movement,
like me, we still have a voice.
If you haven’t had friends to show you the way,
like me, we still have a voice.
Even if your mom told you how to think and feel about everything,
like me, we still have a voice.
If you are too new and inexperienced to be taken seriously,
like me, we still have a voice.
If you are too old and out of date to be taken seriously,
like me, we still have a voice.
If you were never radical enough, queer enough, oppressed enough,
like me, we still have a voice.
If your sexuality has sent people running in horror,
like me, we still have a voice.
Even if you sometimes could not use words at all,
like me, we still have a voice.
If you have been erased or treated like charity, or seen as a problematic outsider,
like me, we still have a voice.
Even if you don’t know what you would say if you really believed you had a voice,
like me, we still have a voice.
As long as you are breathing,
like me, we still have a voice.

On racism

This is about my phases of understanding racism through life, and about structuralism vs liberalism.

  • PHASE 1 – UNQUESTIONED OPTIMISM. Ages 0-16. I believed everyone was equal and it’s best to be color-blind to race because that makes everything fair. Being autistic and faceblind, I was literally color-blind, meaning I didn’t understand where people drew the lines between “races” and didn’t get that people can even have a race identity. In high school I didn’t know my friend was black, and this phase ended when someone pointed out that she was; a line was cruelly drawn between us.
  • PHASE 2 – LIBERALISM. Ages 16-20. I learned that some people are still racist, but thought it was only a few, and still believed in the power of good will to fix it, because after all, don’t we all want justice? Seeing that black people seemed to stay poor through generations was confusing and I tried to come up with explanations that exclusively focused on what’s up with “them”, and I remember being irritated that I couldn’t understand it.
  • PHASE 3 – STRUCTURALISM. Ages 20+. Learning how social structures are maintained by profit and power, I added a structuralist layer on liberalism.

So what’s the difference? The column titles below are abstract and no widely agreed definitions seem to exist, but “structuralism” here encompasses Critical Race Theory, which is somewhat better defined. Each statement in the table is something you might believe if you generally fit into that column.

Naive/Extreme Liberalism: Each person is unique. Problems and solutions are the result of individual actions.
Mix: People are unique and also products of group socialization. Racism is the result of individual actions fueled by a diseased social structure.
Naive/Extreme Structuralism: People exist in groups. The white group as a whole oppresses all other groups. The problem is the structure.

Liberalism: Anyone of any race might be racist, depending on whether they do racist things.
Mix: Any person can be racist to different degrees, plus the collective inaction of white people is also a form of racism.
Structuralism: White people are racist by definition. People of color cannot be racist by definition.

Liberalism: We need laws, a justice system, and police to keep order.
Mix: While we need order and justice, massive reforms are needed to actively prevent bias and racism in policing and courts.
Structuralism: The laws are unjust. The police are part of maintaining white supremacy.

Liberalism: Each person is judged on their actions, not their intent, effect, or associations. (So, walking around in a white cone hat is protected and not harmful.)
Mix: Actions matter the most, and intent and effect also matter. Actions that constitute race-based threats can be wrong even if the same action in another context, or done by a different person, is not wrong.
Structuralism: White people are guilty by association with their ancestors. People are judged by their associations and effect, regardless of intent or action. (So, humming a tune unaware that it goes back to slavery is a racist and oppressive action if a white person does it.)

Liberalism: Sub-criminal “wrongs”, like making someone feel bad, are primarily communication issues, not a matter of guilt.
Mix: Making someone feel bad through racist but protected speech is not likely to be solved by individual communication, because it exists in the context of a violent history.
Structuralism: White aggressive behavior should be punishable, not protected by racist laws; POC aggressive behavior is retaliatory or protective and therefore just.

Liberalism: Policies that level the economic playing field matter, and race should play no part in policy.
Mix: Because of the history of racial violence and the level of entrenched racism, some race-informed policies need to be in place to change the course towards equality.
Structuralism: (I have not yet found policy proposals from the structuralist side.)

Liberalism: Culture and language are shared and dynamic, and subcultures can mix and evolve.
Mix: Cultures naturally mix and evolve, but those having a true lineage in the cultural element in question should usually be respected as authoritative, rather than white people co-opting, watering down, and receiving credit and money for those things.
Structuralism: Nonwhite cultures need to be preserved; it is wrong to appropriate elements from those cultures into white culture.

Liberalism: Everyone can study and understand racism; it affects all of us.
Mix: While a few people directly benefit from racism, it mainly hurts all of us. White people usually have a harder time understanding it.
Structuralism: People with privilege will be blind to the actual racism dynamics; they benefit from it and will never admit their guilt or truly understand it.

Liberalism: We can solve racism rationally through reasoned dialog and policy, with all parties at the table.
Mix: Power structures self-perpetuate unless we actively challenge them, so we cannot accept the usual players at the table.
Structuralism: Those in power will prevent any change, and so they cannot be at the table.

Liberalism: We should mix, not intentionally segregate anything ever.
Mix: Minority and oppressed people need group identity and private space, but we should also aim to have one system that is equal and open to all.
Structuralism: To preserve and empower communities of color, POC need to own and control separate communities and avoid integrating; integration always causes a loss of self.

Where do you fit? One way to identify your own position is to feel where you get energized by potential changes in power and policies, and where you feel fear. For example if you heard that a black separatist political party had won a handful of seats in the House, would you feel some fear that they would fuel a “race war” that would take away what you/we/someone worked for, or even destabilize the country? Or, would you be excited that white men have a new obstacle in their way? Noticing the fears and other feelings that come up can clarify how far you are willing to go away from the liberal side. (It should be clear that my bias is in the “mix” column.)

Woke and radical. Sometimes it seems that people are in a woke war and want to out-radical other people. But what is right is not necessarily the most extreme, and simple things can be more radical than flashy absolutes. For example, self-identified TERFs have “radical” in their name but they don’t accept people; it might be more radical and less attention-seeking to accept everyone. In the same way, “All lives matter” might be more radical than “black lives matter” – but only if you really mean it, and are not just saying it to deflect the particular urgency of black lives mattering.

Culture supremacy. One of the ways racism thrives among liberals is the demand to keep white cultural patterns central – that is, ways of communicating, structuring people in groups, values, and language (spelling, word choice and everything). So POC can be invited into white-dominated space but are only accepted if they conform to the cultural demands; this is not a melting pot of cultures; it is one culture over the others, also known as supremacy. Often it feels like the ground rules are set before the space has opened up for diversity, and then anyone new coming in is secondary to the founders because some things are pre-defined to be off-topic. I’ve experienced this a lot because my disability makes it impossible for me to internalize white culture as strongly as others, and I think this is somewhat how nonwhite people can feel in spaces that are “open to everyone” but aren’t really.

Voicing. There’s a huge difference between being allowed to exist and having a voice that is heard – meaning the authentic experience of a person voiced by her own words in her way of expressing and about the topics that are important to her. If a diversity of voices is really accepted, then the demand for white culture supremacy would be set aside. If there are nonwhite authentic voices that affect how things are organized and what’s being discussed, it would feel very different and liberating.

White depowering. I’m not sure how white people would create real space for everyone else, but I’m very turned off by the rampant shame that goes along with adopting the extreme structuralist view. White people acting like we are automatically wrong and racist about everything is not helpful, especially when it goes along with seeking validation for doing it. If we want black people to award us the non-racist prize for being allies, we are still centering ourselves. It would be better to do two things. One, be quiet, go on being liberal, and just not engage with this. Two, live authentically in your cultures even if they originate from northern Europe. We need a middle way between one extreme of banding together to push out all other voices, and the other extreme of pretending to be powerless.

Structuralism as religion. Christianity has a nasty kind of internal logic that means when you don’t believe in one part of it, fundamentalists will diagnose your disbelief from within the belief system, such that the system can never be questioned. Structuralism as in the table above has the same feature – if you claim you are not racist, it proves that you are racist. The white individual is fully culpable but has no way to be non-racist. According to the logic, we have responsibility in the sense of guilt but no responsibility in the sense of agency. To me, taking it that far is the same as being a religious zealot.

Rootlessness. A lot of people in North America have tenuous links to any indigenous homeland, or none at all, even though all of our ancestors at some point lived indigenously. Dominant white culture today seems to be a remnant of some actual cultures of the past in Northern Europe, with a giant dose of colonial/capitalist thinking, idol worshiping, and civic and communication norms. I suppose we cling to idols and systems and other flat substitutes for rich culture because we’re hurting from the loss. From some observation, POC in America seem to drive culture change more than white people and don’t seem to cling as much.

Cultural appropriation. I haven’t grasped the outrage on this point, and would like to propose that mixing is mostly a good thing. Anyone who lacks culture needs it; autistic people like me can fail to integrate into a culture even if it is all around us, and I feel like the flatness of what’s left of white culture is even harder to assimilate to than a more real one. I used to do a lot of Israeli dancing before I understood that it was an element of a minority culture; I appropriated it into me and made it mine, and I think that’s the right way to spread things. But on the other hand when I call my preferences for arranging furniture “feng shui”, that’s misleading because I’ve never studied it and made it mine. In the extreme when minority-cultural elements are monetized by white people in that colonizing flattening way, it demeans the real thing. But even when a white pop star “steals” in this way and none of the original artists get the money or credit, it seems symptomatic rather than a cause of racial stratification.

Passivity. Here is a comprehensive and clear source on these topics and more. That author takes a mostly structuralist perspective but without the dogmatic absolutes. One of the points is about how white people lie to ourselves about being passive and therefore not responsible. The structuralists seem to want white people to actively identify as white in order to own the oppression that is going on and stop it, rather than just cop out. I never identified with being white and I have no idea how to purposefully adopt an identification with something when I don’t feel it, but it is an idea worth further reflection.

This is another article worth reading.

Agile America

If a software project is big enough to be “nontrivial” (as programmers say), a company needs not only a system for management of the quality of the engineering product, but also a system for management of the quality of the process – that is, management of management. This meta-management has become a big business itself with companies that just sell processes to software companies, and it involves proving that certain practices and ways of making decisions are likely to produce better outcomes than others. The work process is itself a product, complete with advertising and brand loyalty.

Trends

Here are some of the trends since the 90s that set the context for what is happening now in the industry:

  • Systems are getting scaled larger (more simultaneous users), are more open to hacking, and have higher up-time requirements, so they require more people to keep them running.
  • New systems don’t appear to be any more complex, but they often contain more records because companies are consolidating into monopolies: database sizes are much larger, while the complexity stays the same.
  • New systems are more business-critical. Earlier, a business often had a paper backup or a way to work without the software system, but now the company relies on it with no alternative.
  • The total number of people involved in automation has exploded. The nerdy prodigy type historically associated with programming is now a small minority; there are not enough of those kinds of people. Most people in IT departments may not even be naturals at the job, and many are extroverted, not terribly exact about logic, and much like the society at large.
  • More people’s demands are perceived to be important – more stakeholders. In particular, non-technical people are now deciding on the look and feel and flow of systems, and there is a noticeable degradation in internal consistency across business applications.
  • In terms of hardware, tools and platforms, the main change is that speed and memory are much, much cheaper. Despite popular belief, there has not been much innovation in operating systems, databases, and languages compared to the period in the 80s and 90s.

Meta-management works with these trends and also aims to prevent the worst problems of the past. Two of the most serious and common problems have been:

  • A developer suddenly disappears and no one else knows how to change the system, so it becomes stuck.
  • A large project goes way over estimated time and cost, and by the time it is done (if ever), it no longer matches what the company actually needs.

In order to solve those problems, we now have “agile” processes. While there are competing variants, they all aim to solve those two major problems through teamwork (no one person should be irreplaceable) and short planning horizons (projects cannot be months long by design). The idea is to work on very small increments of improvement to a working system and leave the system in a working condition all along, instead of going into the back room and coming out months later with a big finished system. It ensures that a company is getting a continuous stream of value in return for a continuous stream of investment. It is intended to eliminate the risk of losing all the investment due to the two problems above.

Sabotage

In my observations of company productivity, I see about 20% productive people, 60% deadweight, and 20% malicious or destructive people. It does not appear to be possible to change these ratios, because useless and destructive people often have greater powers of persuasion than productive people, and thus they appear essential. Usually they honestly believe they are contributing.

One of the most important meta-management concerns is whether any one person can sabotage a project. Each of the 20% destructors will sabotage it if they can. What we want is a system that restricts the power of the misinformed or malicious to only slightly slow down progress or make a feature slightly worse, but not to completely sabotage it.

Different company organizations offer different ways to sabotage projects:

  • In pyramid-shaped organizations that work primarily with delegation, every project gets subdivided into tasks and delegated, then further subdivided and delegated, and so on, making every person accountable to their superior in the pyramid. If any one person is a destructor, their part fails and the whole system fails. The destructors can be identified, but perhaps too late. For the same reason, productive people can do a lot and be recognized. This structure is widely considered obsolete.
  • Team-based organizations are an answer to the pyramid problem: everyone is replaceable and the productive 20% accomplishes everything by going around everyone else, leveraging a great many more connections between people – not just up and down the chain of command. Sabotage is thwarted by ensuring that no one person has any substantial power, and only teams have collective power. Destructors are difficult to identify, as are producers. An especially productive person is held back to the slower pace of the team.
  • Team-pyramids are a way to combine the worst aspects of teams and pyramids. Authority is delegated down through levels in a pyramid, but the engineering product is not passed back up through that pyramid. The product building is done as with teams, but the decisions are in pyramids. This system allows anyone to sabotage the project with no accountability, since the producers are not allowed to work around the deadweight and destructors. My last job recently switched to this system (unknowingly I guess) and the ability of anyone to accomplish anything seized up like an engine without oil.

Other meta-management problems

A key problem of meta-management as a supposedly legitimate field of study is that it requires longer term knowledge than almost anyone has. Most people in the industry are still in their first 10-year project, and hardly anyone lives long enough to have done enough 10-year projects to be able to compare them and have personal experience of which process works better. Therefore most of the claims made about meta-management cannot really be substantiated by experience.

Related to that legitimacy problem is the bizarre worship of process that goes on. Like anything based on faith alone, people become frenetic adherents to their chosen “agile” process and evangelize it with a giant set of internal vocabulary. There are normal-sounding terms like “runway”, “backlog” and “standup” – and esoteric words (sometimes bordering on religious doctrine) like “kanban”, “scrum”, “manifesto” and “epic”. The words can be used as an appeal to a higher authority to prove a point, when that point cannot be supported by common sense or the normal use of language.

In that last company, a lot of people were as sure as the sky is blue that the process they were trained in was going to work, even when they had never personally done that process successfully. Many of them had never accomplished even one thing independently, since their whole careers were in teams that did not expose whose contributions were relevant. Some of the deadweight people had a very faith-based allegiance to certain aspects of process, and the less experience they had with it working, the more evangelical they seemed to be about it.

Deep in the house of mirrors, there are blatantly countersensical and even reality-defying beliefs. My “favorite” one is the belief that software bugs fall into one of two categories: Either it is a critical bug and the whole team stops all other work and fixes it immediately, or it is non-critical and should never be fixed or even written down anywhere. Anyone who has used software knows there is really a whole range of severity: some bugs make a product unusable, while others only make it annoying, slow, confusing or risky in some way. Anyone who hasn’t drunk the Agile kool-aid can see that obviously some bugs need to be assigned a medium non-emergency priority, but that common sense position had not been canonized and you are not supposed to believe it.
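That common-sense range of severity is trivial to model. The sketch below is purely illustrative: the severity names and the triage rules are my own invention, not any real bug tracker’s schema.

```python
from enum import IntEnum

class Severity(IntEnum):
    CRITICAL = 0  # the whole team stops other work and fixes immediately
    HIGH = 1      # fix within the current cycle
    MEDIUM = 2    # the non-emergency priority the doctrine leaves out
    LOW = 3       # annoying or cosmetic; fix when convenient

def triage(unusable: bool, risky: bool) -> Severity:
    """Hypothetical triage rule: map a bug's observed impact to a severity."""
    if unusable:
        return Severity.CRITICAL
    if risky:
        return Severity.HIGH
    return Severity.MEDIUM
```

The point is only that the middle categories exist at all; any team that has used software for a week could fill in its own version of this mapping.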

Actual agility

The term “waterfall” refers to software development processes that cannot be reversed or changed mid-course; an extremely waterfall-ish approach would be one that does all planning up front, then does all development in a back room without communication with the users, then the product is considered “done” when the contract terms are met, even if it does not work or does not meet expectations.

An agile approach is one that by contrast continuously checks in with users and is capable of adapting quickly. Ironically though, a waterfall approach can often be quicker and more agile than one labeled “Agile”, and the reason has to do with what I call “chunk size”.

A large chunk size means attempting a plan-build-test cycle that includes the whole project in a single cycle. For a large project, people are not able to plan and communicate everything accurately and it can fail just because the chunk is too large. On the other extreme, if the chunk is too small, then the plan-build-test cycle might only include one micro-feature per cycle and then it ends up taking years to develop a useful product. Very small chunks also result in inconsistency in the product usage, as different people try to push the user-interface paradigm in all different directions at once.

As an aside, the process of building something, no matter what size the chunks are, must have three general stages: one is understanding what we want to accomplish (requirements); two is building it in the back room; and three is showing what was built and then evaluating, testing, fixing, and integrating it. If you have a large chunk, the back-room part might be months long; with a small chunk it might be only hours long. In any case there must be a back-room period when no communication is occurring, because it is a technical creative process that requires focus, and because it is essential to commit to something and complete it instead of being continuously up in the air.

As I saw in my last company, the plan-build-test cycle itself can be disintegrated or defined out of existence. In those conditions, management thinks that the cycle can be so short that there is no back-room period at all, and that all work can be accomplished with continuous communication going on. But that makes the work stop completely.

The ideal chunk size is one that is about the same size as (and not more than 10% larger than) the most recent successful chunk done by the developer. So if that person is comfortable with a 2-week long chunk and has demonstrated that ability, then it will work. If the person is only ready for a 1-day chunk, that is the appropriate size. Anything larger is too waterfall-ish, and anything smaller is too slow. Picking the right size yields the most actual agility.
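That rule can be stated as a one-line formula. This is a minimal sketch of my reading of it; the function name is mine, and the sizes can be in any consistent unit such as working days:

```python
def next_chunk_size(last_successful: float, proposed: float) -> float:
    """Clamp a proposed chunk size to the rule above: about the same as the
    developer's most recent successful chunk, and at most 10% larger."""
    upper_bound = last_successful * 1.10
    return min(proposed, upper_bound)
```

So a developer who just completed a 10-day chunk would be given at most an 11-day chunk next, however ambitious the proposal.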

Can it work?

Part of my reason for writing this is to debrief myself on what happened in that last company (which was my first and only big-company job) and ponder whether large teams can really build software at all. Most of that company’s software had been written by a few people, before they started calling themselves “agile”. After that shift, it looks like they started spending more money and slowing down work. The tech giants all seem to be growing and slowing in the same way. The software I love to hate the most is healthcare.gov, a fiasco of waste ($500 million) and slowness; had it been written before Agile was doctrine, I think the contract would have been terminated without pay and a different supplier chosen, which would have been faster in the end. All these things make me doubt all the new meta-management thinking.

My prediction is that someone with clout will come along one of these years and declare Agile to be dead, and the industry will shift to something similar with new vocabulary and the same problems. But what would REALLY work?

The things we cannot change are:

  • the 60% deadweight problem (the sector overpays because the demand for workers is so high, so people who are not naturals are swept in and cannot contribute much)
  • the 20% destructor problem
  • the fact that most people (even the contributors) value their careers over the product and will lie about their abilities and backstab anyone without allies
  • the newness problem (most people have never completed a large project cycle)

The newness and non-naturals problem means most people are in over their head; that anxiety combined with self-protective human nature is not a great combination for making good decisions. I don’t know how anyone could change this starting point, so we need meta-management that assumes these problems will always exist.

In my last company, as productivity and quality were perceived to be declining, decision making tended to become more centralized as a stress-induced reaction, and that further slowed productivity in a vicious cycle. As someone who has actually gone around more than one 10-year cycle of large application building, I would now consider myself qualified for technical leadership, but I had to complete many small projects independently and some large ones before I felt I was ready for that. Those who did get decision-making power at that company appeared to feel they were qualified without having had that length of experience. There was a general sense of operating on theory alone – things were declared to be the right way because Agile (or because of some other appeal to outside authority), not because the decision-maker had any practical experience with it.

The universal rule is that if sabotage is possible, someone will do it, generally by preventing those who are the most qualified to make decisions from making them. Thus it is essential that any process is built around having many channels of communication and no single person who can restrict the communication or decisions. Statements like the following must be impossible: “Everything has to go through me”. “So-and-so needs to be at that meeting”. “We need buy in/approval from so-and-so for this request.”

So, to conclude, here are three ideas for structures that might work, at least better than the “agile” ways I’ve seen:

  • The pyramid of tiny teams: This is essentially an old-fashioned pyramid of delegation, but each node in the structure is a tiny team of 3-5 people that are collectively accountable for delivering their part. The most successful tiny-teams are given larger chunks and get newly formed tiny-teams under them, which they can delegate to. It might prevent a node from sabotaging the whole because there is likely to be one producer on each team.
  • Decisions by duplication: Instead of holding up engineering work to resolve conflicting points of view, the company proceeds to develop the product in multiple independent teams and then uses only the most successful result.
  • Chaos plus portfolio: This is a system where no one has fixed roles and the only management is in periodic evaluations of what a worker accomplished. It relies on the notion that people like to do engineering and will naturally learn to become more effective given a lot of freedom.

On rationality and buffoonery

The structure of un-reason

Listening to the extreme claims made during the Kavanaugh hearings has made me consider the limits of rationality. There appears to be a structure of un-reason that we humans are stuck in, and it looks something like this:

[Diagram: the structure of un-reason]

The two paths depicted are different ways of expressing why we do things or why we adopt positions, when the topic is contested. I am not attempting to explain why we do everything we do; indeed, a lot of what we do in a day is habitual or autonomic, or arises just because we feel like it. This paper concerns only those things that we make conscious verbal claims about, when we are thinking and expressing positions.

The top pathway seems to be the more common pattern and the subject of this paper. The lower pathway is the scientific version that I believe is only used in limited settings when we are able to be unusually objective.

Terms I’m using are:

  • The impetus is the actual basis for doing something, which can often be unconscious, and is self-interested. So it’s either related to a basic need (like hunger or loneliness) or resolves a state of unrest (like fear).
  • The justification is the expressed basis of the action, which we essentially make up after doing the thing, or as preparation for explaining ourselves after we have decided to do the thing. It’s retrospective of the action.
  • Buffoonery is the action-justification sequence, when viewed in the light of that retrospective order.
  • Rationality is the opposite of buffoonery, in which the reason is prospective of the action. In other words, the action is legitimately done for a known, expressed reason.
  • Buffoonery dissonance is what happens when justifications contradict each other.
  • The “cement” is all the layers of beliefs that we use to ease the buffoonery dissonance.

Examples

I will walk this through three examples – one from children in a classroom, one from a teen/adult perspective that I hope is relatable to the reader, and finally one from politicians showing an extreme case.

In the childhood example, imagine a student pleading with a teacher to relax the rules in some way – maybe to allow eating during class. “Teacher, we should be able to eat in class because we’ll be able to pay attention more!” The “reason” given is really a justification (or rationalization) – an invented basis that the student hopes will appeal to the teacher’s supposedly disinterested sense of reason. But whatever basis is given, it is not the real impetus, which is actually more simple and direct: she’s hungry. So we have the real and self-serving impetus that happens first (hunger), and the false expressed justification that comes later (in order to pay attention more), and so far there is no rationality. The teacher might think “what principle can I apply to make an impartial decision here?” and that part could be actual rationality up to a point, but the teacher also has her own justifications for things, so the final ruling on the matter might not end up being rational.

The buffoonery of young children making up justifications for what they want is often transparent and teachers might even laugh at it. In particular they laugh when they notice the same child suddenly switches positions when their self-interest changes. Children’s lower level of sophistication allows us to see the structure. They are presumed to be less capable of reason so we do not hold them accountable. More on that below.

The second example is fictional but similar to things I have done. I love cookies and my impetus is to have them all for myself. Maybe I’m greedy or I fear a future cookie shortage, but I’m not consciously thinking of these causes. When I’m not actually hungry, I might tell others not to eat the cookies because “we should save them for a special occasion”. But when I get the munchies, I might eat them all and then say they were getting stale. The buffoonery here is switching justifications in a way that would make the other people in the house raise an eyebrow at my inconsistency and doubt my “reasoning”. If they pointed it out, I would feel the “buffoonery dissonance”, or the shame that goes with being caught.

The third example is what prompted this whole line of thought – the Supreme Court confirmation hearings on Kavanaugh. A senator supports something “on principle” one year and opposes the same thing a year later, appealing to the opposite principle. In this case, senators who made arguments for prudent and lengthy consideration of facts when Obama nominated a justice (and effectively delayed hearings until Obama was out of office) are now making arguments for quick action and not looking too hard at the nominee’s history. A staple of late night comedy shows is to search databases of footage for cases of inconsistency like this, and air the two clips juxtaposed. We all laugh at the very obvious lies, but beyond laughing is there any result of exposing them?

Responses to buffoonery dissonance

When confronted with buffoonery, people respond in different ways:

  • In the case of children, they might just take stabs at whatever gets them off the hook or whatever seems to appeal to adults, and not feel the dissonance at all.
  • In the case of some senators, they appear to have developed an immunity to the shame, so they also take stabs at whatever gets them off the hook and rely on the press having a short memory. The increase of sophistication over children is only slight.
  • A conflict-avoiding person might retreat from their positions and stay safe within cultural norms, while not really facing or resolving the dilemma. (“Okay you’re probably right”)
  • A person valuing relationships over positions might soften or release principles and claims, and see multiple sides. (“It’s not so simple.”)
  • An introspective and ego-balanced person might feel the shame and admit “you got me there”. That could lead to bringing the impetus into consciousness and adjusting the position towards being rational, or having a growth moment. (However, people don’t mature in leaps like this every day, so the response would be less laudable most of the time.)
  • A person with a fighting spirit could double down and invent a more abstract justification that logically bridges the opposing ones. This is the “cement” that locks the positions in place.

Cement?

Cement is all the other beliefs that we adopt to lock in our justification after we have invented it.

In the cookie hoarding example, if I was called to account for a discrepancy in expressed principles, and I was not ready to admit that the actual impetus was to have all the cookies for myself, then I would need to come up with a new all-encompassing “reason” why my two prior “reasons” were compatible. The layers of justification can get ever more intellectual until I win. A lot of politics is essentially the art of getting all the cookies for oneself, where “cookies” can be substituted by anything, such as agri-business subsidies or stockpiling for war.

A global example is slavery. The impetus for holding slaves includes greed and aggression, but that is not admitted to directly. Instead slave-holders make up a justification that makes themselves sound innocent. Those same people might also say that “all men are created equal”, and then they might be confronted with the inconsistency. They would then need to appeal to some more abstract justification that unites the inconsistency, which could be, among other things, that “slaves are not people.” In all of this, I am making the argument that the actual sequence in time is the action, then the initial justification, then the cement. When people want to sound rational, they reverse the sequence and claim that the abstract principles were first in mind, then it led to reasons which led to the action to hold slaves.

The brain is a re-sequencing machine

While I cannot prove that buffoonery (reverse rationality) is the norm, there are cousin processes in the brain that point to backwards sequencing being something that occurs constantly as a central part of consciousness.

The first observation supporting this is that hearing is faster than seeing. If you create an experimental setting where a subject has a brain monitor and a startling noise happens at the same time as a picture on a screen appears, then there are two sequences that occur in the experiment. One is what science observes: First both stimuli are detected (the light arrives before sound but that difference is insignificant at short range). Second, the sound is processed and it signals a twitch reaction in the muscles. Third, the slower visual processing part of the brain sees the image, after the muscle reaction happened. The alternate sequence is what we as subjects believe we experienced: we are really sure that we saw and heard the stimuli at the same time, and then twitched afterwards. So consciousness is subjectively sure of something that is not true. The brain is constantly re-sequencing stimuli in short term memory to align the timeline of hearing and sight and providing a false sense of being present in this exact moment, when we really are never aware of a moment until later.

The other observation is about dream recall. Freud and possibly others asserted that dreams occur as disconnected images without the usual adherence to the laws of time and space, but that we force those images into a story during the process of recall. So the sequence is imposed later upon the original dream. Even though we feel sure we “saw” the things in a linear story format, that certainty only came after the dream was already done and we were waking up.

Onlookers

What’s been bugging me about the un-reasons of the Kavanaugh supporters is the thickness of that cement, the beliefs built up to support the justifications for supporting him. There may have been some rationality exercised by the people who originally put him on a short list and then selected his name – those are the sorts of activities where, at least some of the time, we can be rational. But the millions of supporting Americans can’t be doing that most of the time – their real impetus for support could be white men’s fear of losing control, or some related fear-stoked groupthink, or simply being drawn to the norms of the people around them for safety.

The easiest justifications for supporting him despite his potentially disqualifying traits are (1) there is no proof that anything happened; and (2) whatever happened was so long ago. Bart Simpson has a line that covers the bases – something like “I wasn’t there. You can’t prove it happened. I don’t know anything about it.” None of Bart’s justifications are consistent with each other, and the two easy justifications for supporting Kavanaugh are the same way: it is inconsistent to argue both that nothing happened and that it happened a long time ago.

Then to resolve the inconsistency, the cemented beliefs are constructed, and they feel dangerous. These could be abstractions like “violent assaults are inconsequential” (a way of saying that if anything happened, it doesn’t matter). The only way a person can say that shamelessly is by adopting beliefs such as: rape is not a real crime, and women are not fully human in a way that makes crimes against women “real”. Possibly rape is imagined by men as not that bad, if they can only visualize themselves as the rapist. Or people have claimed the victims are fake, or that caving in to the real concerns of the opposition fuels some kind of liberal conspiracy.

So the danger is that in the rush to pretend to be rational and escape buffoonery dissonance, millions of people invent and adhere to dangerous beliefs.

My guess is that the left (including me) is doing the same un-reason, and if the nominee were a Democrat with the same history, both sides might have adopted exactly opposite “principles”. One way to test the theory is to assume a hypothetical nominee 20 years in the future, and all we know is that he probably committed some misdemeanor or felony decades earlier that was not reported, but we do not know his political leanings. What would our principles be then? Without knowing our self-interest in a question like this, we usually resist answering, and when I’ve asked people questions like this, I get a lot of “it depends”. They will not say what it depends on, and I tend to think that what it depends on is self-interest at the time, which has to remain unsaid because we rarely can admit to our real impetus.

When I try to bring my own un-reason sequence into consciousness, I get some data from the introspection, but I doubt it is really possible to know oneself enough to fully escape buffoonery. I can admit that a crime in the distant past should not be a disqualifier for almost any job. Relatedly, most people working to help ex-offenders re-integrate are left-leaning, but they do not want to apply that same principle to this hearing. However I also then feel drawn to the justification that the particular job of Supreme Court justice should have higher qualifications than others. Also I am drawn to the notion that he should be disqualified for lying under oath, but at the same time President Clinton did that and at the time I felt it was not consequential because the subject matter was not of national significance. To be fair I would have to admit that Kavanaugh’s probable crimes are also not of national significance. That is about as far as I can get with looking at my own un-reason.

Teaching and change

I wonder if the balance of buffoonery versus rationality could be partly influenced by culture, or if it changes in different time periods. On one hand it feels like a solid part of how the brain is wired. On the other hand there is a case to be made for buffoonish cultural patterns. For example when we want children to act rationally and we ask them why they did something we do not approve of, they say something to answer the question, like “I was tired” or some other justification. When we engage with them on that level, it is as if we have agreed that there has to be a lie and we can only talk about the lie, whether refuting or bolstering it, but always staying at that level. So parents can be the gatekeepers of pseudo-rationality by colluding to stay within the un-reason pattern and limit the conversation to which lie, out of the many possible justifications, is an acceptable one to settle on. Thus we are teaching and modeling dishonesty. But maybe that way of teaching is culturally prescribed and change is still possible. If we had a cultural pattern of not engaging in pseudo-rationality with children, we might not have a culture that appears to celebrate lying so enthusiastically.

I also wonder about the cycle of shame and revision with people who are introspective: I do not know if the cycle is really just adding so much sophistication to the lies that they really get convincing, or if we inch towards actual rationality. I worry that the more I cultivate the idea that I’m being rational myself, the more I’m building up “principles” that let me have more cookies without risking the shame of inconsistency. And that goes for all of us who feel that we are fair and benevolent.

I also wonder if the cycle of shame and revision is less prevalent in the internet age, and that could be why the Kavanaugh hearing seems to be so much more buffoonish than things in politics in the past. I cannot recall seeing that “you got me” shame in public in recent years, but I think I remember it from earlier. My memory might be limited to local settings and not national politics though. If that is a real culture shift, it could be related to the new internet-age phenomenon of people avoiding anyone they disagree with, so they can never be caught in their inconsistencies. Without ever being caught, the inconsistencies could grow larger without anyone noticing. Possibly related, the president now is someone who cannot be caught because he has no principles, therefore can rarely feel inconsistency or shame.

1 Comment »

A glop taxonomy

In my lifelong quest to un-confuse myself, I have gradually awakened to the fact that I was constantly thrown off by words that are defined associatively, because classical definitions are so much more accessible. As a toddler I would have been so much more aware of the world around me if people had answered my demands for meaning in the form “X is a thing in set Y, but distinguished from other members of Y by variable Z”. Because no one would give me a clear definition like this, I spent all those decades not completely sure what an ottoman was, or a hatchback, or a ranch house, or khakis or bows or blouses, or salads or tarts or barbecues. I was more clear about things that can be defined classically, such as that fermions are particles distinguished from bosons by their spin, or that a county is a kind of jurisdiction that allocates 100% of the land area of states into non-overlapping regions. That kind of definition is so much more accessible than trying to figure out why some cars with doors in the back are hatchbacks and others are not, when no one could tell me the distinguishing feature of the class. When people would debate whether a tomato was a fruit or vegetable without defining fruit and vegetable, I thought there was some mysterious classical structure behind their claims, and I came to find out as an adult that there is not, and the question itself (which stole precious minutes from my life) was wrong. A fruit is the seed-bearing part of a flowering plant, and a vegetable is any edible unprocessed plant part. At the time those minutes were stolen, I was too young to catch on that the dichotomy was unreal – the two options are not members of the same parent set.

The particular problem category I’m working on in this paper is words defined by fluid consistency. For example if we assume that peanut butter is made of the two ingredients peanuts and butter, as I naturally did, then we miss the critical understanding that “butter” does not indicate animal fats at all, but is instead used as a reference to the resulting consistency of ground peanuts. Coconut milk, by the same illogic, contains no milk. All my childhood I would wonder “what is oil” and “what is wax” and no one could say, because every definition would fail to account for all the other things that they would also call oil or wax but which were clearly different from the thing in question.

So it turns out, as everyone else already knew, that the meaning of words like butter, wax, and syrup is not about what is actually in the thing, but only has to do with how the thing interacts. In particular, since the non-solid parts of us, and biomass in general, are mostly salt water, sugars and lipids, many word definitions center on how materials interact with water, sugar and lipids. A “syrup” for example need not contain any sugar; it is just anything that is “syrupy” – but then how do we know in the classical sense if something is syrupy? Answer: it has a certain range of solidity, coherence, and adherence; also it absorbs water and thus is easily washed away by water. Liquid metals such as mercury might have the same solidity but they lack adherence to water, so they are not syrupy. Olive oil has the same solidity and the same adherence as something syrupy, but it repels water so it can’t be called syrup.

The variables

These variables appear to be the most relevant to how we define words for non-solid things:

  • solidity – at one extreme, a measure of how pourable the thing is, or at the other extreme, how much it tends to return to shape (rubberiness)
  • elastic energy – the effort required to re-shape the thing
  • friability – a measure of how pressure tends to either break the thing (like tofu) or stretch the thing (like sugars), or both (like corn starch or silly putty)
  • cohesion and adhesion – the binding qualities of the thing to itself and to other things, affecting surface tension and mixability

A note on elasticity vs solidity: While they might appear to be the same thing, consider jello (and generally things called “gels”) – it is easy to reshape compared to molasses, even though they are both sugar water. Jello has low elastic energy (moves with a light touch) but high solidity (does not pour), while molasses has higher elastic energy (requires greater time or force to spoon up) and lower solidity (it pours).

There are many other variables that chemistry knows about, but they don’t appear to be major players in word definitions. For example:

  • The volatility of solvents is a phenomenon you can feel (such as how hand sanitizer seems to disappear with use) but I could not think of a common word describing liquids with that quality.
  • Density does not appear to affect word choice, possibly because everyday things have very similar densities.
  • The irregularity of a thing because of contained materials affects word choice; for example grout, or other cement, sand and gravel aggregates, would never be called milky or pasty even if they match the other properties of such materials. However there doesn’t appear to be an everyday noun describing aggregation.
  • The quality of surfactants, or soapiness, does not appear to have a word for the class of substance behaviors.

The taxonomy

Now to turn to the actual taxonomy of glop-words, I’ve placed those annoyingly un-definable words into a context of some of the variables above.

  • words for water-absorbing things with low elasticity, low friability, ordered from low to high solidity
    • water > milk > batter > glop > paste > dough > putty / clay
  • words for water-repelling things with low friability, ordered from low to high solidity (and also low to high elastic energy)
    • cream > oil > slime > grease > butter > wax
  • words for water-adhering things, ordered from low to high solidity (and also low to high elastic energy)
    • syrup > gel / jelly > marshmallow > gum > rubber

This is a limited attempt to give a classical definition to these 20 words by defining the class and the variable differentiating the thing from other things in the class.
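The taxonomy above can be sketched as a simple lookup. This is only a toy illustration: the essay defines an ordering, not numbers, so the numeric solidity scale and the position of each word on it are invented here.

```python
# Toy sketch of the glop taxonomy: pick a word from the water-affinity
# branch by where a substance falls on an invented 0..1 solidity scale.

def glop_word(water_affinity, solidity):
    """water_affinity: 'absorbs', 'repels', or 'adheres'
    solidity: 0.0 (pours freely) .. 1.0 (holds its shape)"""
    branches = {
        "absorbs": ["water", "milk", "batter", "glop", "paste", "dough", "putty"],
        "repels":  ["cream", "oil", "slime", "grease", "butter", "wax"],
        "adheres": ["syrup", "gel", "marshmallow", "gum", "rubber"],
    }
    words = branches[water_affinity]
    index = min(int(solidity * len(words)), len(words) - 1)
    return words[index]

print(glop_word("absorbs", 0.05))  # water
print(glop_word("repels", 0.95))   # wax
```

The two-variable lookup is the point: once the class (water behavior) and the differentiating variable (solidity) are fixed, the word falls out classically.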

There are a variety of other words that describe what a thing does for you, or how it affects a process, like a lotion, balm, or detergent. However we don’t use these words as a noun class defined by the exemplar of the class, the way we do with milk, syrup, and the other words in the taxonomy.

Leave a comment »

The custodial economy

We are in a gradual shift in the economy in which more people’s economic lives are enveloped in custodial institutions such as day care, schools, prisons, and disability and elder services. This “custodial economy” is made up of parts of the government and service sectors, and is not normally considered a sector in itself. However it is distinct because it involves two expanding groups of people who are neither buyers nor sellers in a market. Instead they consist of the beneficiaries – the students, prisoners and other people whose lives are being occupied by and supported by the systems – and the custodians who derive income from running it – the teachers and wardens and so on.

This paper is meant just to shine light on this phenomenon and look at the history, economic forces, and some commonly believed myths.

The main points defining the custodial economy are:

  • The beneficiaries in general do not have jobs with earnings sufficient to support themselves. Without economic power, their world is partly or mostly run by other people.
  • The custodians are not being paid directly by the beneficiaries as they would be in a market system. Instead the beneficiaries are a third party to the transaction, with less economic power. It’s similar to the way users of social media act (often unwittingly) as a third party in the sale of data about them to advertisers. Likewise, students and prisoners neither buy nor sell the services being performed on them, but they must be there to make possible the salaries earned by the people working for those systems.
  • There are fuzzy edges to these ideas – there is not an exact way to categorize every person or every job.

Does the custodial economy encompass all poverty programs?


1 Comment »

Do not resuscitate

(This is what came to me the night after my mother in law died.)

scene

At the base of jagged cliffs in the river, the weapon and a used body remain but the pain is gone. Some things remain and some are free. Pines and soft grasses accept the coming and going of life. The Gallinas accepts the washing downpour. Everything proceeds in its cycle as if the suffering had never been.

In memory of Patricia Knoebel and all the stone-shaping waters that connected her to Earth.


Leave a comment »

On spark plugs and being present where you are

Yesterday, which spanned 48 hours, my car broke down while moving a load of stuff to Las Vegas. Cruise control decided that jumping to 6,000 RPM would be appropriate on the Cochiti hill; pistons went sproing-a-tattatat; oil all gone. Until They came, the predicament had no solution because tow trucks must – according to some towing logic – leave your trailer there on the highway, amongst rain and pillagers. Towing was the only plan I could think of, but it would amount to giving away the trailer with three chain saws, chicken wire, a table, back packs, and basic necessities such as a croquet set and other things I can’t remember.

They came flying by and some kind of flash happened in her wild mind. Stop, she said, and that was final. (How can we leave the lady there? Can’t.) They came back, found some oil, checked everything. They were both mechanics. She explained she learned how to fix everything because she could not afford to get it fixed. The questionable tail lights caught his attention and he went to work on that too. She alternately hugged me and diagnosed the oil pump.

They helped tenaciously from that sprinkly warm afternoon through nightfall in Santa Fe, through two trips to Vegas, and sunrise outside Tecolote. What do they want, I kept thinking. If they were con artists, they were not very good at it. They were somewhat intolerable to be around in the pushy way of people selling, but they were not wanting money or anything. All sleepless night I thought, surely they put water in the oil reservoir, and they overheard me telling my address, and a looting was surely in progress. They would disable my car, then conveniently “run out of gas” outside Pecos after waiting til the bats go home when no one would ever know what happened. I facebooked my coordinates because I think ahead.

I noticed two interesting things about their language. One was the omission of names, hellos, good byes, or anything of protocol. You just start interacting in the middle of the conversation and end in the middle. Or just don’t talk if that suits you. The people involved in this thing were normally referred to as The Man or Dude (interchangeably as there were two of them, the talking one and the other one), Home Girl (the dominant one who said Stop), The Lady (that was me), Her (her who owned the Suburban), and the nephew. In this context, the formality of “nephew” stood out as oddly specific. In the 14 hours, no one asked my name or said theirs. Home Girl and I were both the age of grandmothers.

The other oddity – a syntax rule – was a way of phrasing possessives. It’s become a meme to say “my baby daddy”, or to laugh at people saying that. But their dialect took this further. The main man held up a device and said “This: home girl baby daddy phone” with no verb or prepositions, like you might (not really) say in German Das Heimmaedchenkindvatihandy, leaving the word for the actual thing as the last element in the compound word. Romance languages would start at the other end of the chain of nouns and say this is the phone of the dad of the baby of my girl, putting the word for what it is – a phone – first. English is historically undecided between those two syntaxes, apparently excepting this German-leaning dialect of Spanglish.

I also noticed two patterns about behavior. One was living in the moment. Really in the moment. Not filling up the gas tank before driving in remote country in the middle of the night. Not considering what he would do without a tow bar once he got there. Forgetting food.

The Man talked about his own Suburban (currently missing since being “borrowed” without notice) being a gas hog at one time, and then he put in new spark plugs and that raised it from nine miles per gallon to something more affordable. He said he would rather spend the ten dollars for new plugs “up front” rather than spending $60 in gas each time he drives to Albuquerque. But he announced that as if it was radical to do anything preventive or with foresight, while to me, 10<60 is pretty simple math so of course you would do that. On the other hand, the suburban of Her, which he had “borrowed” for today’s adventure, did not have features like updated spark plugs, so it still got nine miles to the gallon, a fact that one had a lot of time to contemplate in the silence that happens without gasoline.

The other behavior that was impressed on me was a dedication to being responsible so complete that they could not seem to even consider dropping a commitment. So he stayed up all night through sunrise because I had no other choice. A middle class American would have set a limit: “It’s 2 AM, so I can’t help you any more!” That thought didn’t seem compatible with their whole way of thinking. When Home Girl saw me, stopping to help me became true forever, not just for a reasonable amount of time. It wasn’t up to me to refuse or for them to reconsider. There was no undoing of that flash.

These are poverty behaviors, they say. If people could learn to think ahead and set limits, they would get further; they would get out of the cycle of poverty. Rich people buy in bulk and do things in a durable, planned manner, so they actually spend less and save more, while the poor are forced to live crisis to crisis eating at convenience store prices and paying for gas because of not paying for tune ups. The story is that poverty thinking is the problem that keeps them always on their last ten dollars.

I agree that people who tread on others, accumulate, and hoard do get ahead. I’m just not sure I know which of the two sides is the one with the problem.

He knew how to fix tail light bulbs with scrap bubble gum, spending nothing as habit because habitually there is nothing to spend. While scientists may not have discovered it, he had become an expert in getting ambient air pressure at certain temperatures to raise gas fumes into an engine when there is no gas. He clearly had years of experience coaxing life out of broken things.

I wondered if she was a calm heart of gold on the inside with tarnish and rough edges on the outside, or maybe a con artist to the core with a thick layer of deceptive frosting. Both seemed to be true; the layers were so thin and so sandwiched together, that it was impossible to tell which was on top and which was beneath. She was in some larger battle in life and like me, losing it most of the time but never giving up.

I had a sense from them that their role in their fleeting social network is keeping things zipped together and resisting entropy. The Other Dude did not have that role; he ran reactionarily into the twilight of Tecolote, which is not walking distance from anywhere, and did not reappear in this drama.

Demonstrating the power of the Suburban while still in Santa Fe, the main man flew over curbs, and not all our stuff remained on the trailer. One of the things about moving is, if you cannot remember what else you had before, after a portion of your belongings launch from your trailer into the night, you might not have needed those things.

Leave a comment »

Code forests

This paper is about a layering paradigm for enterprise scale software called a “code forest”. The paradigm is a tree-shaped database of elements that compose the code base, with their full content and revision history. Developers edit the database rather than the file system. I will get into details on what that means, but first I want to start with a list of problems that the approach improves upon.

Why we need code forests

In my examples I’m using C#-like code and terminology, but it is the same concept with Java or any language designed for enterprise-scale software. In this context “enterprise” means possibly millions of lines of code and multiple tiers with overlapping legacy and new products.

Some problems with today’s large code bases:

  • Depending on the language there are now three or more competing naming systems permeating the code base. These include the class names and optionally hierarchical namespaces of classes; names in the file system; and names of folders, projects and “solutions”. The only reason for the complication is the history of adding tools on top of other tools; it is not needed. A code forest only has one naming scheme.
  • Developers are usually forced to deal with source files. The base unit in programming and compiling is traditionally the file, but it does not have to be that way. A source file is not a meaningful concept in the compiled product. A code forest allows you to work with named elements in the forest, not files.
  • Developers are currently forced to deal with deployment considerations when writing classes. A code forest allows deployment decisions to be made completely separately.
  • The skills and other team characteristics needed to manage a code base are different than skills for writing classes and methods. A code forest helps teams do forest management as distinct from code quality management.
  • Documentation and understanding often decline as code bases get bigger. A code forest organizes code with documentation about code structure in the same place as the code itself.
  • Unwanted dependencies and friend dependencies creep into code bases as teams get bigger. A code forest makes creating dependencies an action that you have to take explicitly, so there can be no dependencies creeping in accidentally.
  • Source control is standard practice now, but technically it is an optional layer on top of a non-source-controlled system, and it can be broken, avoided, worked around, misused, or not be fully integrated with the development tool, leading to complications. A code forest is source control to begin with, so it is impossible to not use source control with it.
  • Developers can spend a great deal of time recompiling “the universe” of code when only one line changed. The compiler is usually too unaware of the code layering to optimize away unnecessary work. A code forest allows compiling to be based on changes only.
  • Visibility of class members is not flexible enough, with the options of public, private and protected members. Sometimes you need visibility in more complex ways, but we end up making too much public. Code forests control visibility exactly.
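Two of the bullets above – the explicit dependency map and change-based compiling – work together. As a sketch (in Python rather than the C#-like syntax used elsewhere in this paper, and with invented names), a tool holding the dependency map can find exactly which elements need recompiling when one element changes: its transitive dependents and nothing else.

```python
# Sketch: given the forest's dependency map as (user, used) pairs,
# compute the set of elements to recompile after one element changes.
# Only transitive dependents of the changed element are "dirty".

def elements_to_recompile(changed, dependencies):
    # Invert the map: for each used element, which elements use it?
    dependents = {}
    for user, used in dependencies:
        dependents.setdefault(used, set()).add(user)
    # Walk outward from the changed element.
    dirty, stack = set(), [changed]
    while stack:
        element = stack.pop()
        for user in dependents.get(element, ()):
            if user not in dirty:
                dirty.add(user)
                stack.append(user)
    return dirty

deps = {("UI.Person", "PersistentData.Person"),
        ("UI.ThisUser", "UI.Person")}
print(sorted(elements_to_recompile("PersistentData.Person", deps)))
# -> ['UI.Person', 'UI.ThisUser']
```

Because dependencies are declared explicitly in a code forest, this map is always accurate, which is what makes the "recompile only what changed" promise tractable.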

Basic definition of the code forest

A code forest is, as the name suggests, a forest: a set of trees (more precisely, a directed acyclic graph with any number of roots) of nodes (which I’ll call “code elements”), along with the revision history of each element and a map of all dependencies between all elements. It can also include the concepts of code branches, commits, and other source control features.

Each element is composed of an expression in the form “visibility-spec name = element-definition” and a separate area for typing definitional or contract comments. Some example elements are shown here:

  • public A = 3
  • B = int (string s) { return s.Length; }
  • visible C = class { … }

The examples show a variable element, a function element, and a class element, respectively. The class element will have child elements inside it. The only type of element that allows fairly long definitions is the function body. Since most functions are ideally less than 20 lines, that means source control is operating on much smaller units than we are used to.

You may be questioning the function and class syntax. I am not concerned with exact syntax in this paper. There are many function syntaxes, and the one used here is chosen simply because it puts the name on the left of the equals sign so it is consistent with all other definitions. We are assuming that the type of any element is unambiguous from the definition, so in the example, A is known to be an integer. Classes also use the name = class syntax.

To organize millions of lines of code, one can think of the whole code base as one giant file with a lot of nesting. Of course a tool would never display it all at once because of its size, but it is one logical way to picture the code. Replacing brackets with indented bullets to indicate the tree shape, that would look like this example:

  • PersistentData =
    • public Person = class
      • Name = ""
      • IsSally = bool () { return Name == "Sally"; }
    • Team = class
      • Members = new List<Person>;
  • UI =
    • Person = PersistentData.Person
    • ThisUser = (Person)null;
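Since the whole code base is logically one nested tree, the indented-bullet view above can be produced by a simple tree walk. This sketch (the nested-tuple representation and `render` function are my own illustrative choices, not part of the proposal) shows the idea:

```python
# hypothetical nested representation of the example above: (name, definition, children)
forest = [
    ("PersistentData", "", [
        ("public Person", "class", [
            ("Name", '""', []),
            ("IsSally", 'bool () { return Name == "Sally"; }', []),
        ]),
        ("Team", "class", [
            ("Members", "new List<Person>;", []),
        ]),
    ]),
    ("UI", "", [
        ("Person", "PersistentData.Person", []),
        ("ThisUser", "(Person)null;", []),
    ]),
]

def render(nodes, depth=0):
    # walk the tree, emitting one indented bullet per element
    lines = []
    for name, definition, children in nodes:
        suffix = f" = {definition}" if definition else " ="
        lines.append("  " * depth + "• " + name + suffix)
        lines.extend(render(children, depth + 1))
    return lines
```

A viewer would apply the same walk lazily, expanding only the part of the tree being looked at.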

A team of developers can be branching, editing and merging elements all at the same time. The editable unit is the element; there is no need to “check out” or edit whole classes as a unit.
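One consequence of making the element the editable unit is that merging becomes simpler: two branches that edit different elements can always merge cleanly, and a conflict can only occur when both branches change the same element. A minimal three-way merge over elements (names and representation are mine, purely to illustrate) might look like:

```python
def merge(base, branch_a, branch_b):
    # base, branch_a, branch_b: dict of element name -> definition
    merged, conflicts = dict(base), []
    for name in set(branch_a) | set(branch_b):
        a = branch_a.get(name, base.get(name))
        b = branch_b.get(name, base.get(name))
        if a == b:
            merged[name] = a
        elif a == base.get(name):
            merged[name] = b          # only branch b changed this element
        elif b == base.get(name):
            merged[name] = a          # only branch a changed this element
        else:
            conflicts.append(name)    # both branches changed the same element
    return merged, conflicts
```

Because the unit is small (usually a single definition or a short function body), conflicts are rarer and easier to resolve than file-level conflicts.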

Layer views

You can also look at a code forest visually, showing boxes for the organizational classes and arrows denoting dependencies. Here is an example:

The example comes from an earlier paper “Megaworkarounds” – http://www.divergentlabs.org/tech/megaworkarounds/

The advantage to this kind of view is that it shows how the code is layered. Tools can also allow you to draw layers and drag elements to change the structure of the code base. For example, you could draw a box around a number of functions dealing with the same thing in an overly complex class and create an encapsulating layer.


Complete and incomplete covers in engineering

I confess I have been irritated my whole life by car dashboard controls for heating and cooling, because they are an incomplete cover for the complexity going on inside. It has been a rough few decades for user interface enthusiasts!

What is a complete cover? It is a layer or shell over some machine complexity that completely hides it and does not let any of the complexity out. A cover is incomplete if it forces you to understand what is going on underneath, or if it is confusing when you do not understand, or if the cover is insufficient to operate all aspects of the machine. A cover can be thick or thin – the thicker the cover, the more it changes the paradigm of the machine interaction. A cover is optimal when it is complete, regardless of whether it is thick, thin, or absent. Sometimes it is optimal to have no cover.

I will explain this with some examples, starting with a mechanical mercury thermostat. There are three kinds of people in relation to these devices: (1) those with a gut fear reaction when they look at dials and numbers; (2) those who understand the two exposed dials – measured temperature and set point – but do not know or care how it works inside; and (3) those who understand that the rotation of the temperature-sensitive coil, superimposed on the rotation of the set point, tips a mercury switch; that the bi-stable two-lobed shape of the mercury chamber affects the temperature swing; and why mercury is used in the first place. It is a lovely mechanism, but not really the scope of this paper. I am mainly concerned with the middle category of people, who are functional operators of the cover, and with what kind of cover it is.

thermostat

The thermostat is a complete cover because you can operate every aspect of the heater with it, without needing to know how it works. It is also a fairly thick cover in the sense that it translates one paradigm to another. The actual heater requires an on/off switch to work, so the only language it understands is on/off. But the thermostat exposes a set point to the user; it translates the language of on/off into the language of set points. Someone could replace the whole heater and wiring with a different inside paradigm but leave the exposed paradigm in place, and the user would not need to know that anything changed, because the operation stays the same. In many systems – especially software systems – the replaceability of layers is an important design point, and complete coverage is one of the factors that makes it possible.
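In code terms, a complete cover looks like an object that exposes only the outer paradigm and keeps the inner one entirely private. The `Heater` and `Thermostat` classes below are my own illustrative sketch of the thermostat example, not something from an actual control system:

```python
class Heater:
    # the underlying machine: the only language it understands is on/off
    def __init__(self):
        self.on = False

    def switch(self, on):
        self.on = on

class Thermostat:
    # a complete cover: exposes only a set point, hides on/off entirely
    def __init__(self, heater):
        self._heater = heater
        self.set_point = 20.0

    def set(self, degrees):
        self.set_point = degrees

    def sense(self, measured_temp):
        # translate the set-point paradigm into the heater's on/off paradigm
        self._heater.switch(measured_temp < self.set_point)
```

Swapping in a different `Heater` implementation changes nothing for the user of `Thermostat`, which is the replaceability property described above.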
