The marmot's tag-line is from The Big Lebowski, which is also the source of the marmot's moniker. It's a disclaimer, and a comment on blogs in general. Maybe life, too.

The marmot's ancestor was called Groundhog Day, and I believe it went through a few tag-lines. One I recall was from The Princess Bride, "Let me explain... No, there is too much. Let me sum up."

And one of the recurring themes in GHD was about the illusion of power.

In our sloppy, lazy idiomatic way of communicating we often ascribe certain outcomes or results to the "power" of leaders or parties or corporations. We speak of "asymmetric power dynamics" in relationships. We encourage and facilitate the illusion in our culture and society, largely because it helps maintain hierarchy and order.

But it's still an illusion.

The only power that exists, in a human relationship context, is the power to choose. Yeah, there's the work-per-unit-time power of my fist hitting your nose, but that's physics. A course of action involving an individual is the result of one of two things: habit or choice. Most of our behavior, our daily course of action, is habituated. This is, in the main, a good thing. Nobody has the cognitive capacity to consider, moment to moment, all the possible courses of action and potential outcomes of any particular choice.

And one of the artifacts of that habit is the illusion of power.

This post isn't intended to recapitulate all that; it's just a little context. When we say a police officer has "power," what we really mean is that he has "authority." Authority is an element of a social contract. We agree to treat the statements or commands of people whom society has granted "authority" as compelling, as having the effect of "force" (power again). But participating in that social contract, recognizing that authority, is always a choice.

Society recognizes the dangers of authority, so it erects guardrails to prevent its misuse. Authority is granted commensurate with responsibility or duty. Society makes certain people responsible for guiding or compelling other people's choices and conduct, to maintain some beneficial feature of society, or of military order.

That authority is bounded by responsibility. Its exercise is accompanied by accountability. That is, the organization that grants the authority can also withdraw it, and impose penalties for its abuse or misuse or failure to exercise it in meeting the responsibility for which it was granted.

So there's a three-legged stool that kind of supports this idea of "power" in the social contract: responsibility, authority, accountability.

It also exists in the professions. We grant titles to people who possess expert knowledge in particular fields, titles that convey we can rely on their opinions. Doctors, lawyers, architects and so on. Someone tells you they want to cut you open and take out your gall bladder, you want to have some confidence they know what they're talking about!

Again, there are three legs, or pillars, pick an analogy, Rogers. Responsibility, authority, accountability. We don't like our buildings falling down, so we make people responsible for ensuring they don't. We grant them the authority to state that a set of plans will result in a structure that won't fall down, and their signature has a certain force to it. You can rely on it, and you must comply with it. And professionals are held accountable, by legal and professional structures, for misuse or incompetent use of their authority. People can seek damages, attempt to be "made whole." (Good luck getting that gall bladder back.)

Basically, everything else is bullshit and you're on your own. Now, people can acquire reputations for being sources of reliable information. But if they give you a bum steer, the only thing they have to lose is that reputation. And certain requirements of law, in contracts or testimony under oath, can impose penalties if you bullshit someone. But mostly, life is just bullshit.

Which brings me to AI.

We are suckers for our own infernal cleverness. I remember when we got these fancy color computers in the combat information centers of navy ships. They purported to give an "all source," "fused" picture of the space around the ship. Because it was in a fancy box, and especially because it had color graphics, we all bought it. It didn't take long to learn that it was bullshit. The data was time-late. The data was wrong. It was just a fancy picture. A toy. But we sure did love it for a while.

It did get better. We learned what its limitations were. We understood what it could tell us. We got better at vetting the data, and faster at inputting it. Today, that sort of thing works pretty well and it's used every day. But occasionally a drone will fly through and blow up a bunch of your shipmates.

AI is a toy. It's bullshit. Some fancy-boy, tech-bro nerds might take exception to that characterization, but they're emotionally invested in their toys, so that's understandable.

We can begin to take AI seriously when it's supported by the three pillars of responsibility, authority and accountability. And since it's hard to figure out how to make a machine learning model accountable, we'll have to be satisfied with holding its corporate masters accountable. When we start putting people in prison, I'll know we're taking it seriously.

In the meantime, I expect we're going to have a lot of fun with AI as a toy. And it's going to do quite a bit of damage too, because it's kind of like treating an AR-15 like a toy. Some folks will do some useful things with it, but a lot more folks will cause chaos and mayhem just for shits and giggles, or because they genuinely want to sow chaos and mayhem, or because they have good intentions but just don't know any better.

But, hey, this is all just my opinion. I could be wrong. I don't have any intention of deceiving anyone. I probably just don't know what I'm talking about.

And this is the marmot. It's a blog. I'm an authority on nothing. I make all this shit up. Do your own thinking. Or ask ChatGPT.

Originally posted at Nice Marmot 06:06 Wednesday, 7 February 2024