In Sputnik’s Orbit

A few thoughts to tide you over…


A Monologue in Inner Monologue

Over the last few years, it’s come to the attention of researchers (and the rest of us) that some people have no “inner speech.” Inner speech, which some call “inner monologue,” is the experience of hearing a voice in your head as you read or reason with yourself. Most people have inner speech, and most are perplexed to learn that some others don’t. We don’t yet have a good handle on why we have it or why some don’t, but it’s quickly become clear that like autism, handedness, gender identity, and love of puppies, inner speech comes on a spectrum.

So…this is interesting.

I have inner speech, and while I am capable of thinking and experiencing the world without it, I would say that I seldom do. I most often hear my own voice in my head, or more precisely, a sort of standardized and simplified mental model of my own voice as I hear it when I talk. I strongly suspect that this inner voice is a model of my memory of my own voice. That is to say, we now know that memories are not recordings of sensory input, but rather of experiences. When we remember something, we are not reconstituting the sights and sounds we experienced during the event; rather, we are reconstituting our experience or understanding of the event and reverse-engineering the sensory input that we think must have caused it. This is why eyewitness testimony is so notoriously untrustworthy, and it is, I think, why inner speech doesn’t “sound like me” so much as it is “reminiscent of me.”

At any rate, apparently unlike most people, my inner speech is very rarely negative or self-chastising. I do, at appropriate times, “hear” my mother or father talking to me. Almost every time I use a hand saw, for example, I hear my father guiding me in its use–this is one of my earliest and fondest memories with him. But I don’t generally “hear” my mother, teachers, or myself criticizing me. Perhaps that just means I’m well adjusted–or that I’m old enough to be over it–or that I’m an asshole; who can say?

Also, unlike many people, I don’t generally “talk to myself.” That is to say, I don’t have a two-way dialogue with myself, an imaginary other, or a model of another. We joke about “talking to yourself” being a sign of insanity, but in fact it turns out to be quite common, especially among people who had an imaginary friend as a child. Not me. I only have an inner dialogue when I am rehearsing or revisiting a conversation with another person. I do, like most people, occasionally re-argue a past conversation for which I only thought up “what I should have said” after the fact. But I don’t argue with myself or talk to myself in a two-way give-and-take the way one would with another person, as many people say they do. I do “talk things through,” just as I might with a friend or coworker, when I’m trying to reason through a complex problem, say a bit of computer code or a piece of literary blocking.

Interestingly, people who lack inner speech may be able to read more quickly, and many people report that when they read, their inner speech is not a word-for-word verbatim reading of the text, but a shorthand in which single words take the place of whole concepts or phrases. I don’t seem to be able to do that, though I may simply be out of practice. When I read, I hear every word, every inflection, every nuance, and this makes it very difficult to read any faster than normal human speech. Now, I can read at an accelerated rate, usually for schoolwork when just scanning for content, but when I do that I still have my inner speech; it’s just highly abbreviated, like “Pursuant to da da, da da, da…residential…commercial…permit…for any new construction…etc.” You get the idea. I can’t really do that when reading fiction, not because it can’t be done but because if I’m reading for pleasure, that’s not fun, and if I’m editing, it misses everything I would be looking for.

So…it’s interesting. I have long complained that too many people in business fail to understand that the purpose of writing (at least technical writing) is not merely to jot down some words associated with the pictures in your head, but to craft the words needed to build those pictures in someone else’s head. Clearly, I’m better at that than many. Is that because I lean more heavily on inner voice? Is the price for that ability the inability to read as quickly as some others? I don’t know, but it’s interesting.

Then there are the people who have no pictures…people with literally “no imagination” who, like one fellow I saw talking about it on YouTube, always thought that when we say “picture” so-and-so, it was only a metaphor. I can’t even imagine that mental state. I picture everything and can take it apart and rotate the pieces in my head. I can’t even understand how thought is possible otherwise, yet clearly, it is. So…spectrums.

Everything that makes us human, that makes us individuals, that makes us US, comes on a spectrum. We really, really need to chill out about that, embrace our diversity, and profit by our different approaches.

But what do you think? Do you have inner speech? Does it chastise and criticize? Does it encourage? Is it only an analytical tool? Leave a comment and let me know.

Zeroing in

Science is like an archer getting closer to the target with practice—and an ever-improving view of the remaining discrepancy. That the aim varies as it zeroes in does not make it wrong along the way—and only a fool would think so.

Estimates for the age of the Earth have evolved over time as new scientific methods have been developed and as new data has been collected.

  • From the 1770s to the 1890s, Earth’s age could only be guessed at (scientifically speaking) based on a crude understanding of natural processes such as geologic change, planetary cooling, and ocean salinity balance, so estimates ranged wildly from a few million to a few billion years.
  • 1905: The physicist Ernest Rutherford suggested that the age of the Earth could be estimated by measuring the amount of lead in uranium minerals. His estimate was around 500 million years, but it was only a swag intended to prod geologists into the atomic age.
  • 1920s: The geologist Arthur Holmes used the radioactive decay of uranium into lead to establish that the Earth was billions of years old. Lead-isotope measurements in the 1950s refined this to around 4.5 billion years, a figure still widely accepted today, with a margin of error that has only narrowed since.
  • 1990s: Refinements to radiometric dating techniques, such as uranium-lead dating and samarium-neodymium dating, allowed scientists to estimate the age of the Earth with greater precision. These methods place the age of the Earth at around 4.54 billion years, with a margin of error of around 1%.


  • 1922-1927: The first estimates of the age of the universe came from the expanding-universe solutions to Einstein’s general relativity worked out by Alexander Friedmann and Georges Lemaître, which suggested an age on the order of 10 billion years. These estimates rested on assumptions about the expansion rate of the universe and the amount of matter it contained, and carried no meaningful margin of error.
  • 1920s-1930s: Other astronomers, such as Arthur Eddington and Edwin Hubble, proposed different estimates of the age of the universe, ranging from a few hundred million years to several billion years. These estimates were based on observations of the Hubble constant (the rate of expansion of the universe) and the ages of the oldest stars in our galaxy.
  • 1940s-1950s: With the discovery of nuclear reactions and the ability to measure isotopes, physicists were able to estimate the age of the universe more precisely. In the late 1940s, physicist George Gamow and his colleagues worked from the then-accepted expansion rate, which implied an age of around 2 billion years, awkwardly younger than the oldest rocks on Earth. By the early 1950s, recalibrated measurements of the Hubble constant led to estimates of 10-20 billion years with a margin of error of about 25%.
  • 1960s-1970s: The discovery of cosmic microwave background radiation in 1965 provided strong evidence for the Big Bang theory and allowed scientists to refine their estimates of the age of the universe. In the late 1960s and early 1970s, estimates ranged from 10-20 billion years with a margin of error of about 10%.
  • 1980s-1990s: With more precise measurements of the cosmic microwave background radiation, estimates of the age of the universe improved further. By the 1990s, estimates were in the range of 13-15 billion years with a margin of error of about 1-2%.
  • 2000s-present: Advances in technology and new observations, such as measurements of the cosmic microwave background radiation by the Wilkinson Microwave Anisotropy Probe (WMAP) and the Planck satellite, have allowed scientists to refine their estimates even further. Current estimates are in the range of 13.7-13.8 billion years with a margin of error of about 0.1-0.2%.

A Changing of the Guard

Today I spent the day with my new boss, a woman I first met near the start of my IT career when she, then a newly-hired contractor, was appointed business liaison for what turned out to be a highly successful software application I was designing. At lunchtime, we got to chatting and our conversation turned to my boss way back then, Frank, among the most brilliant, capable, and just plain decent human beings I have ever had the pleasure to know.

I will not waste your time detailing all the kind things Frank did for me or taught me over my years under his wing, except to say that a lot of it was not strictly work-related, the sort of thing my father might have impressed on me had my parents not divorced, had he not been away most of the time on Air Force duty, and, frankly, had he been a better man.


Roller-Bat, a Game for All Ages

When I was a kid, we lived at the end of a long gravel road without any neighboring kids to play with. I was the youngest, with a sister four years older and a brother two beyond her. To while away lazy, pre-Internet summers when we’d read all our books, the rabbits were fed, and the clouds and the neighbor’s cows were doing nothing of interest, we’d often play a game of our own invention called “Roller-Bat”.


The Double-Edged Blade of Capitalism

Today, for the first time in a long time, I tried out a new product I was genuinely excited to get hold of.

Capitalism is not, as many millennials think, the root of all evil. Neither, as many boomers seem to think, is it the garden of all virtue. There is a balance to be found between public and private interests, and between innovation and foolish obfuscation. The shaving business is a case in point.

If you’re under 40 and don’t have an MBA, you may not be aware that the shaving razor business is a scam so well known it’s part of the Harvard curriculum. It works like this: give away an attractive razor for cheap or for free, then make a profit selling the owner proprietary replacement cartridge blades that you somehow convince them are better in some way than the crazy cheap standard blades they were using before. Bonus dollars if you hook them young enough that they never used the cheaper alternative, or still think of it as grampa’s old-school way. Why buy 50 blades for $9 when you can buy two for $10 and get half the performance? Ah, but the packaging is so manly and sleek, like what Captain Kirk would get his condoms in.


Wading in Java’s Dependency Hell

It’s more than disappointing that as I write this in 2022, more than a generation since I first worked as a software developer and nearly a century since the first programmable electronic computers, we are still wrestling with things as arcane and crude as build files and what the Java community not-so-affectionately calls “dependency hell.”

For the uninitiated, computer programs are written in relatively “high-level” human-readable languages like Java and C, but execute in a low-level machine language understandable by a microprocessor. In simple terms, the process of getting from high-level code to executable code is the “build,” and build files are instructions that govern that process. Dependency hell arises because Java programs are built up from libraries of pre-built components that do everything from manipulating strings to writing the contents of memory out to an application file (like the .docx file produced by Microsoft Word). No developer or team could possibly afford the time or money to implement all these supporting functions from scratch, and libraries provide standardized mechanisms for various entities to develop and support them and make them available to application developers. So far, so good.
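
To give a feel for what these build instructions look like in practice, here is a minimal Gradle build file for a hypothetical Java application; the library names and versions are purely illustrative, not from the project discussed below:

    // build.gradle -- a minimal, hypothetical example
    plugins {
        id 'java' // teaches Gradle how to compile, test, and package Java code
    }
    repositories {
        mavenCentral() // the public repository from which libraries are fetched
    }
    dependencies {
        // each line declares one library; Gradle also pulls in whatever that library needs
        implementation 'org.apache.commons:commons-lang3:3.12.0' // string-manipulation helpers
        testImplementation 'junit:junit:4.13.2'                  // unit-testing framework
    }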

The trouble is, over time, code gets written against specific versions of these libraries and sometimes becomes unintentionally dependent on version-specific bugs or peculiarities. Library ownership changes hands, and the new owners decide to clean house, reorganize, or kill existing libraries. Open-source libraries branch into new flavors, and those that were once a de facto standard become tomorrow’s orphans, while the new alternatives may be as similar as two sects of the Greek Orthodox Church or as different as Chinese and English. Meanwhile, applications are always being written and maintained, building up ever-growing, ever-changing rat’s nests of dependencies on these various libraries.

Dependency hell.

This is not unique to Java. .Net developers face “DLL hell,” but .Net developers are far more dependent on components written and maintained by Microsoft. Java is somewhat more vulnerable because its greatest strength–an open, platform-independent architecture–is inherently dynamic and fungible. Unfortunately, the language’s original developers didn’t foresee or adequately address this issue, so today we face the unenviable situation in which many existing applications cannot, for all practical purposes, be updated to the latest, greatest (and more secure and supportable) libraries and language core because the change would simply be too expensive. The best solution would have been to build right into the language some mechanism for guaranteeing backward compatibility to all libraries, forever. Then the compiler could simply choose the latest version of every library it could find, and it would be guaranteed to work so long as the application didn’t require functionality newer than the available version. Even that could be addressed by a centralized repository and automated system to permit the compiler to go find what it needs. Alas, what seems reasonable today (though it would be more problematic than it sounds) was impossible when Java first appeared.

Still, changes have been gradually made to improve the situation. One of these is a rule that dependencies must be unambiguously specified. Throughout the first eight major releases of Java, this was not the case. You could reference two libraries, one of which had been compiled with the other, and the two different versions of that other library would be equally available to the compiler and runtime. Not only did this make your application code larger, it meant that which library was used at runtime was a bit of a crap shoot. Mechanisms were in place to let you take control, but in the vast majority of cases, developers ignored them except when testing found a specific problem.

Do you know what programs that appear correct because of testing are called? Miracle programs. In theory at least, programs should be provably correct, like mathematical proofs. That’s not often the case in practice, but at the very least, we should avoid practices that necessarily increase the odds of hidden defects. Allowing random resolution of multiple library versions is one such practice, so starting in Java 9, it’s no longer permitted.
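
To make the Java 9 change concrete, here is a minimal module descriptor of the kind the module system introduced; the module and package names are invented for illustration. Each package must now come from exactly one module, which is what turns the old silent ambiguity into a hard compile-time error:

    // module-info.java -- a hypothetical example; all names are invented
    module com.example.designapp {
        requires java.xml;                 // use the JDK's built-in XML APIs, explicitly
        exports com.example.designapp.api; // only this package is visible to other modules
    }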

This brings me here today. I’m working on a novel design application derived from an open-source Java project originally developed in Java version 7. It contains lots and lots and lots of these “transitive dependencies,” as they are called. When I try to compile it under Java 9 or higher (16 is the current version), I get almost 300 errors stating “Package such-and-such is available from more than one module…” This wouldn’t be so bad if the IDE (Eclipse) or build tool (Gradle) contained a nice little hook to let you right-click, resolve the problem, and move on–you know, the way normal people would do it. But this is Java, so normal people need not apply. In fairness, it’s not JUST that the Java and open-source communities are arrogant twits who see no problem in requiring others to master arcane technical skills to accomplish simple, obvious tasks that should be baked into the interface. It’s also that very often, they lack the resources or authority needed to bake in such features, and that’s why we are stuck in a world dominated by Microsoft. But I digress…

MAKE A COMPLETE BACKUP OF YOUR ENTIRE PROJECT DIRECTORY BEFORE BEGINNING.

KEEP A RECORD OF EACH CHANGE YOU MAKE. YOU WON’T BE ABLE TO COMPILE AND TEST UNTIL ALL ERRORS HAVE BEEN REMOVED, SO THERE’S A POSSIBILITY OF RUNTIME ERRORS CAUSED BY CHANGES MADE TO REMOVE COMPILER ERRORS.

DO NOT TRUST SUDDEN LARGE REDUCTIONS IN ERROR COUNT. THIS MOST LIKELY MEANS ECLIPSE HAS REVERTED TO A LOWER COMPLIANCE LEVEL AND MAY BE LYING ABOUT IT. CHECK THE COMPLIANCE LEVEL. PERFORM GRADLE AND ECLIPSE CLEANS. REVERT LAST CHANGE. RESTART IF NECESSARY.

REMEMBER TO MANUALLY SAVE CHANGES TO THE BUILD FILE. FORGETTING WILL LEAD YOU ON A MERRY CHASE WHEN ERRORS DON’T DISAPPEAR WHEN YOU EXPECT THEM TO.

So here’s how to resolve the problem in Eclipse and Gradle:

  1. Go to the command line and change to the folder containing your application. You can grab this in Eclipse by opening Project/Properties/Resource/Location.
  2. Run the command “gradle -q dependencies”
  3. Copy the resulting report and paste it into a searchable editor (like Notepad++)
  4. In Eclipse, go to Project/Properties/Java Compiler and change the compliance level to Java 9. It’s probably best to leave the workspace settings alone for now, as you’ll want to switch back and forth a lot to make sure you haven’t broken anything as you proceed. At least in my version of Eclipse, the compliance level gets set back to Java 1.7 (version 7) after almost every build, for which I thus far have no explanation. Change it to 9 (or higher), apply and close, and the project will build with Java 9 compliance. You’ll see a large number of “such-and-such cannot be resolved to a type” errors caused by the failure of library import statements that no longer work due to transitive (or other ambiguous) dependencies.
  5. Double-click on any of these errors, and a related source code (Java) file will open. Go to the top, then scroll down until you see imports that have been flagged with the message “Package such-and-such is available from more than one module…”
  6. If you see an import ending with a wildcard, skip it for now; find one that imports a specific type.
  7. Highlight the type and press Control-Shift-T to open the type viewer (in Eclipse). You should expect to see multiple .jar files listed. The type viewer tells you exactly where each file is, but that’s not actually helpful. Instead, you need the name of the library (group or module), usually indicated in the filename of the jar. We are going to try to eliminate from this list all but one of the dependencies, preferably one with (jre) beside it, because that’s a supported part of the Java language itself.
    1. First, search your build.gradle file–the one written in Groovy script and containing at least a “dependencies” section and multiple “implementation” lines.
    2. If one of the multiple dependencies is referenced here, you can try just deleting it (comment it out, though, so you can bring it back if needed).
    3. Save the build file and use Project/Clean to clean and rebuild. If all errors disappear, it probably reverted the compliance level to pre-Java 9. Use Project/Properties/Java Compiler to change it back to 9 or higher, apply and save, and see what that does.
    4. If the total number of errors has gone down, that’s a good sign, but even if it hasn’t, you can go back and look at the same type in the type viewer, and you should see one fewer dependency. Repeat until there is only one and the error count is down by at least one. Fortunately, fixing this issue anywhere fixes it everywhere, so the error count can go down by tens or more all at once.
    5. IF THE ERROR COUNT GOES UP, comment the dependency back into the build file and try something else. The idea is to find a library that contains a redundant reference to another, usually more generic library, and either eliminate it (the library containing the reference) or add an exclusion removing either the entire redundant dependency or better yet, the specific redundant dependency added by that library. Put another way, if your application requires library “A” and library “B” but is failing because “A” contains “B” and Java already contains “B”, you need to tell the runtime and build tools to ignore the version of “B” contained inside “A”.
      1. Search the dependency report you copied earlier for references to the problematic library. This will give you clues as to which listed dependency may be including a redundant (transitive) reference.
      2. If you can omit a dependency that includes the problematic transitive reference and get away with it, great.
      3. If you can’t, you might be able to add an exclusion at the bottom of the build file, something like this one for xml-apis (though I haven’t worked out the details of how best to use this; I found it following an online thread):
        configurations.all {
            all*.exclude group: 'xml-apis'
        }
      4. Or better yet, you can specifically exclude the transitive dependency brought along by some other needed library. That library will still use the code it brought with it, but the compiler and runtime won’t see it on the classpath and get confused.
        // Apache Batik
        // implementation 'org.apache.xmlgraphics:batik-dom:1.8' // CSH: exclude added for Java 1.8 to 9 update
        implementation('org.apache.xmlgraphics:batik-dom:1.8') {
            exclude group: 'org.apache.xmlgraphics', module: 'batik-ext'
        }
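      5. Finally, rather than searching the full dependency report by hand each time, Gradle’s built-in dependencyInsight task can trace a single troublesome library through the dependency graph and show every path by which it arrives, which makes it much easier to decide where an exclusion belongs. (Here xml-apis is just my running example from above, and the configuration name may differ with your Gradle version and project setup.)
        gradle -q dependencyInsight --dependency xml-apis --configuration runtimeClasspath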


New is Not Always Improved

Today I made a special trip to the big-box hardware store only to learn they no longer stock aluminum HVAC tape. In case you don’t know, this product is the jack-of-all-trades that “duct tape” (more properly, Duck Tape) pretends to be for a wide range of tasks around the house, studio, and attic. It’s strong, easy to apply, light-proof, leakproof, and removable, almost always without a trace. And unlike Duck tape–which was designed for waterproofing ammunition crates in WWII, not for sealing ducts–aluminum HVAC tape sticks like hell, even to less-than-perfectly-clean surfaces, is impervious to moisture and heat, and will not turn to dust in the attic.

I have used it to build studio lights, seal leaks in my HVAC system, build an attic door insulation cover out of scrap polystyrene, attach dryer vent hoses, and make a removable duct attachment to permit annual cleaning of the dryer exhaust duct through the attic. When I renovate, I also use it to seal the duct plenums to the ceiling drywall to reduce drafts and prevent windy weather from pushing attic dust down into the house.

Recently, I installed an air return from our master suite, which forms a peninsula at one end of the house and has always been difficult to heat. Upgraded windows, attic insulation, and a duct booster fan have helped. The air return gives air a roomier path back to the HVAC intake, so it no longer has to squeeze around the bedroom door, building up a rind of dust over time.


How to Survive Nuclear War

I grew up at the tail end of the Cold War. I’m a nerd, and I used to play “camping” under giant vinyl sheets my dad brought home from the Air Force, which I later realized were maps of aerial bombardment exercise ranges in central California. My dad’s job was to fly off on a moment’s notice, never to return, and convert thousands of Russian children very much like me into radioactive corpses. We didn’t worry too much, as we understood the point was that no rational person would start such a war knowing how it must inevitably end.


Speed Dating for Creatives

While prepping for a recent “Pay it Forward Day” presentation, I stumbled on a useful analogy: submitting creative work for sale is a bit like dating.

When a submission doesn’t work out, it’s tempting to think like a teenager who just got ghosted by the hottest kid in the “in crowd”: “I wish I knew why they didn’t like me; then I could change.” You might even be tempted to ask a purchasing editor exactly that. Don’t. Don’t, because you are selling yourself short. Don’t, because you’ll look like a pestering kid asking “but why don’t you like me? Why? Why?”

The reality is, rejection of your story is like rejection after a date. It just means that story isn’t the right fit for that editor at that moment.


It’s a Small World And We’re Not Gonna Take It Anymore

You may have noticed, there’s a war on. Of course, it’s 6,000 miles away, so it doesn’t affect me, right? I can turn a blind eye the way most people do most of the time when atrocities and injustices are happening a quarter of the circumference of the planet away, right?

Well no, not so much. If you are reading this on my blog, there’s a small but non-zero chance you’ve noticed my website was down for over 30 hours and has been running with all but the home page broken for several days. Indirectly, this is because of that war–my hosting provider no longer has contact with the region of the world from which, for many years, it’s been getting exceptional support labor. As I write this, Vladimir Putin is in the process of walling Russia off from the global Internet so his subjects can’t be told the truth. President Biden has barred all energy imports from Russia, and many other nations are likely to follow suit, though this is mostly symbolic since shipping in the Black Sea essentially stopped the day Lloyd’s of London declared it a war zone. International sanctions mean that all Russian airlines are now required to return half their airplanes, all but 170-something of which are leased from Western companies for efficiency. And they’ll find a way to do it, too, unless Putin puts a gun to their heads. They want this madness to end so they can go back to making money, and to compete in the global marketplace they need more than permission to fly outside Russia; they need those leasing agents, and access to tech support in the US and maintenance facilities in Germany.

Meanwhile, Ukraine isn’t just firing tank- and aircraft-killing missiles from the US and Western Europe–it’s firing its own, made with parts designed by a design company in Belarus, one brown-shirt thug nation supplying a goodly portion of Putin’s army of pointless conquest.

Putin claims he can’t abide NATO expansion toward Russia’s borders. This is absurd, in part because NATO was created to protect its members from a USSR hell-bent on ideological colonization of the planet, and in part because, since the fall of that empire, the expansion of the alliance has been driven not by the US or other Western powers but by the well-founded fear of Russia’s neighbors. Russia has nothing to fear from NATO if it stops invading other countries, and those countries have no reason to join NATO unless they fear Russian belligerence.

Meanwhile, NATO or no NATO, the economic ties binding Russia, the West, and all the former Soviet satellites had done more to ensure Russian security in the last 20 years than all the wars and “special military operations” fought by Putin or his predecessors all the way back to the Iron Age–until he pissed all that away in a week.

The best defense against attack is close economic partnership. Best, because it both fosters the communication and partnership needed to work out disputes without recourse to violence and greatly lessens the need for spending on defense.

The people of Russia, Ukraine, Belarus, and for that matter, Finland and Texas, have no interest in empires. They just want to raise their kids, see if Elon Musk really does die on Mars, and get prompt support for their websites when they need it.

Wars of conquest are stupid. They serve nothing and no one but the egos of the men who start them. Thus, those men should be the first to die when the shooting starts.