Posted on November 8, 2019
I’ve just watched Steve Jobs. Yes, the movie from four years ago depicting Jobs’ rise from humble hanger-on, riding Wozniak’s demonstrably more innovative coattails, to cultural entity who devoted most of his life to obfuscating the fact that he used to be just some fuckwad. My derision and special disdain for Steve Jobs and people like him aside, what truly struck me was the massive juxtaposition between the spirit of computing as it existed before Apple and Microsoft (but after the hysteria over chess-playing machines that might one day decide the trajectory of our nuclear arsenal) and its current place as the fountainhead of our society and, indeed, of the modern conception of capitalism in its entirety. Let’s be honest: we hardly go a day without hearing how some half-cocked venture capitalist’s idea for a new app is going to change the way we look at X or how we live with Y.
It’s circa 1980. The Cold War is winding down; we’ve long since strode past the Cuban Missile Crisis. Reagan is a fuck but hasn’t been able to be much of one yet. Alien just came out last year, and the first Star Wars a few years before that. In short, what a pretty alright time to be alive. By comparison, our current hellscape is truly just that. A hopeless march to climate death unless we can accomplish a conscious restructuring of human society that is utterly without compare in all of human history; that’s the current narrative. The narrative of the fabled age of Jobs and Gates and Wozniak, which, like it or not, is really the only mythological era we have in modern America, was one of boundless hope and optimism. There’s a reason why all the cultural products of our current day are busy harkening back to the early ’80s: Stranger Things, Ready Player One, the aforementioned Steve Jobs biopic, the new season of American Horror Story, reboots or long-awaited sequels to franchises like Star Wars, and the list goes on. I would venture to say that it was the last time anything felt new and not some derivation of something else. There’s no end of people who would challenge me on that. Who would claim the smartphone, wireless internet, and self-driving vehicles as heralds of an era all their own. And those people would be wrong.
Innovation and optimism are now commodities. And commodities are derivations. They are what happens to inventions when the frontier is exhausted and we may set about exploiting the discovery. Geography or technology, this is the spirit that drives on the great lumbering machine of capitalism. That the computing power of a smartphone once lay in a machine that took up entire rooms is amazing. An interesting tidbit. But no more than that. It’s a natural progression. It’s a commodity. It’s more people with more computers to get more money. Which, fine, whatever. It’s a largely useless invention that satisfies an animal craving for “more, but faster this time,” but to each their own. What’s concerning is where we’ve ended up, so far from where we started.
The heyday of personal computing, when these things were still being built in garages, a time that we nerds speak of with wonder and awe, was strongly infused with a democratic spirit of reclaiming computing technology from massive corporate giants like Hewlett-Packard and IBM, companies, we must remember, that have done more than any other to contribute to the slow destruction of the fabric of society. The latter went so far as to entangle itself with both the Nazis and the CIA. The whole spirit of the enterprise was putting into the hands of the have-nots the ability to lift themselves up into a new technocratic utopia in which knowledge would serve the world. Bootstraps adjacency and naivete aside, it’s a far cry from the mythology that would take its place. The mythology of Jobs and of Gates, while Woz tinkered (richly) into obscurity.
These days, Apple and Microsoft are engines of the economy. There are no greater patron saints in the corporate or tech worlds than Gates and Jobs, Jobs and Gates. Geniuses, misunderstood savants, messiahs to the unwashed masses who, without computers, were just apes. You can see the dividends this worship has paid in the yogababble of modern companies seeking to elevate the human consciousness by turning millennials into pod people in the name of sharing the vague camaraderie of a workspace. WeWork, on its face and on its own, is enough to make the argument that even the language of solidarity has been commodified before the altar of growth and consumerism. The idea that the concept of ‘We’ means anything to the ghouls who developed that company is laughable. But this is the logical endpoint of greed, not just corporate greed, but the kind enabled only by the cripplingly oppressive spirit of capitalism. It elevates failsons to gods and demands we worship them. And Steve Jobs was indeed a failson, whose only talent was elevating himself to corporate power through sheer force of will and despite a number of fuck-ups. He had one golden goose on his side, though, the kind that enables failsons the world over to continue failing upwards: a fuckton of money. Money gotten from his humble start as the vampire at the shoulders of better men, double-dealing them and sucking out their all too humble and gentle souls. It is perhaps only too fitting that he expired at a comparatively young age while pursuing quack cures for cancer when the treatment he could have paid for a thousand times over was right down his stupid autonomous-vehicle-ridden, dormitory-(serf)-housing-lined, homeless-peopled street.
Posted on October 21, 2019
Hark! It is done!
I have released my first book! “There Is Life in the Tree and Death in the Well” is the first novel in a Fantasy Horror series. Set in the crumbling city-state of Sulidhe, the story follows an orphan boy called Arnem and Dob, his three-eyed dog who also happens to be as big as a lion, as the two seek to unearth the secrets binding together a mutative plague, an alien religion, an incestuous ruling caste of mages, and guerrilla bands of druidic rebels. It is presently available for preorder with a hard release date of Oct. 31st. Just in time for Halloween!
I hope you make a purchase and enjoy reading it as much as I enjoyed writing it. I promise you won’t be disappointed.
But, wait! There’s more!
I now also have a Patreon. Jumping the gun? Maybe. But I’m a fan of being preemptive. In addition to grabbing a copy of my novel, if you so choose, you may now also help me keep writing, and thus publish the sequel sooner, by sparing me from worrying so much about living expenses.
More to follow! Thank you all!
Posted on May 31, 2019
It is important to note the chief similarity between main and side characters: at any point, a main character might become a side character, and vice versa. Their lives could stop. This is what helps them achieve personhood, imagining them as people with control over their lives and not bound to some linear quest. Ultimately, this is what we want for our own lives. Besides entertainment, it is why we consume culture: to live vicariously through the thankless decisions of another, that we might in some way find answers to our own dilemmas.
I am often reminded, in telling stories, of a scene in the Kubrick/Spielberg picture A.I.: Artificial Intelligence. David (played by Haley Joel Osment) is telling Joe (Jude Law) of his plans to keep looking for the “Blue Fairy”, a distortion of the Pinocchio myth that the rest of the movie plays on extensively. He hopes to become a real boy so that his adoptive mother will love and accept him. It is a heart-wrenching tale that, for me, always hits too close to home (much like the original Pinocchio). In the scene, Joe implores him to give up his quest. He tells him that his mother’s love is not real and never could be, no deeper than her love for a pet. He suggests David just stay with him and await the incipient destruction of humanity from the safety of the robot-friendly city. The story could have ended here and not progressed. In a revision of the script, another character, the main character, could have happened upon them. Weighted with their own past and destination, the pain of David and his strange friendship with the pleasure-bot Joe would have been only a pale reflection beside the main thrust of the story.
Thinking of character and plot this way not only helps flesh out the side characters but also develops the protagonist. What if they didn’t proceed? What reasons do they have not to? What palpable will or necessity drives them on if the consequences of settling down into history are not really that dire? This method also does its work organically. You’re asking true and genuine questions of yourself and of your characters, as if you were them, rather than sitting down with a sheet of paper and filling in the blanks:
Motivation: etc. etc.
Worst fear: etc. etc.
Traumatic origin: etc. etc.
You see where I’m going with this. Perfectly fine books are written following the fill-in-the-blank formula. Many of them. But I know from my own work that things feel more alive, the narrative more present, if I’ve done as much as I can to create things organically. Formulas produce expected results. They can be relied upon. But isn’t it sometimes more entertaining, for the writer and the reader, to fuck something up and watch an explosion?
Posted on May 24, 2019
It’s important not to interrupt yourself, but it’s even more important not to interrupt the other person. In writing or in speaking, interruptions can be disastrous. Reading isn’t any different, and the dialogue writers hold with their readers is (or should be) as sacrosanct as that which we might have with a friend or close family member. However, those two forms of dialogue hold little else in common.
The two differ in mode and form. I, as the writer, cannot literally stand behind you and tap you, the reader, on the shoulder to interrupt your meditative consideration of the page. We hope not, anyway. If I knew you, it would simply be annoying. If I didn’t, it’d be creepy. But I digress.
The ways in which writers may interrupt their readers’ experience are many. From a slip in skill to losing the thread of their own plot, failures of craft only succeed in kicking the reader out the door. But, in my experience, there is a very subversive and subtle tic that can best be described as the writer opening the door in a snowstorm. Expected, if a little inherently irritating. You might not even notice it, except in the oblique way that the fluidity of the narrative – and by extension your immersion – is thwarted. Unless you’re paying strict attention, your eyes will just keep on tracking across the words and you’ll take them in, but with a little less gusto than you did before.
I am referring to interruptions of inference, when the reader has to stop and unduly consider something the writer has expressed or an image they have attempted to craft. Clarity and elegance are usually the elements lacking. There are, of course, times when the reader must be expected to try and wrap their head around the text. That’s the prime enjoyment in reading literature, considering and digesting complex thoughts. There’s a whole argument to be made concerning the clothing of said thoughts in such wayward and dense language that most readers will have to rope a professor or two into the fray. But, for now, we’ll settle on refraining from the little speed bumps in narration and/or exposition that draw the reader out of the experience.
It’s a delicate art, giving just enough detail while leaving room for the reader’s mind to play in the background. It’s frighteningly easy to give too much and too little, sometimes at the same time. It’s the description of a ruin without giving reference to the object that has been ruined until later in the process of describing an action in relation to said ruin. Convoluted? Assuredly. Nitpicking often gets convoluted.
So let’s break it down:
A group of thieves are scaling the ruin’s side, say. Finding this and that foothold, climbing to this and that promontory. Then they get to the top and tie themselves off to the temple’s minarets in order to scale down through holes in the roof. Temple? Minarets? Reading this, I would have to infer (read: fill in the blank) that this was a temple all along and its minarets were the true object of the climbers. I might even go back to make sure I didn’t miss anything else that was important about this structure.
Ideally, the author would detail the thieves’ struggles to climb under the weight of the coiled rope on their shoulders and would shout out the presence of the minarets before or during the climb. This gives the objects a precedent, and the reader won’t have to stop and read back to see if they missed something, or trust the author and quickly infer the image being created in order to get on with reading. In either instance, the reader has been booted out of the scene and has to try and get back in again. It’s like the doorbell ringing. A mild annoyance, but an annoyance.
Is this worth paying attention to? Is this nitpicky? Is this convoluted past the point of importance? You could make that argument. But I’d make the argument that the dividing line between good and great is paying attention to this sort of thing, whether from the get-go or in the final draft. And don’t feel like I’m shouting my sermon down from Mount Righteous. That example up there? From the rough draft of a book I’m going back through currently. I’m just as lost as anyone else.