Is Atheism/Religious Apathy Killing People?

“Deaths of despair” are rising among middle-aged whites, and may be due to a reduction in religion.

Source: Photo by Zachary DeBottis from Pexels: https://www.pexels.com/photo/silhuoette-of-a-person-2953863/

According to MarketWatch, which reports on a study by researchers at several universities, “deaths of despair” have been growing dramatically among middle-aged white Americans.

Deaths of despair, according to Wikipedia, are deaths attributed to “suicide, drug or alcohol overdose, or liver failure.”

As the report notes:

The authors noted that many measures of religious adherence began to decline in the late 1980s. They find that the large decline in religious practice was driven by the group experiencing the subsequent increases in mortality: white middle-aged Americans without a college degree.

The disturbing rise in these tragic types of deaths among middle-class whites is not a new phenomenon. It’s been going on for a while, and has mostly been attributed to the opioid epidemic and economic inequality. However, this is perhaps the first article I’ve seen that attributes the trend to a decline of religion. That’s been a theory of mine for a while, so it’s nice to see a study back up what I, and probably a lot of others, already knew intuitively.

This article struck me for a number of reasons. I am white (partly, anyway). I am middle-aged. And I am a former fundamentalist Chick-tract-passing-out Christian teenager turned non-religious/agnostic adult.

I’m almost smack dab in the bullseye of the target demographic.

The only exception is I do have a college degree, at least as far as a liberal arts diploma counts as a “degree.” FYI, it doesn’t.

In terms of income and net worth, I’m well past the U.S. median, though still considered middle-class.

Despite this, I don’t engage in any behaviors that would lead to a “death of despair.” I don’t drink, smoke, or do drugs.

I do, however, struggle with general feelings of apathy and nihilism, which I would attribute partly to my agnostic worldview. Emerging from the vortex of fundamentalist Christianity was probably the most psychologically grueling process I’ve ever undergone. And I’d be lying if I said that I feel “freed” or “liberated” from the supposed shackles of religion.

Going from thinking you’ve got a mansion in the sky waiting for you after you die, to realizing you’re going to simply cease existing when your ticket’s finally punched, ain’t an easy thing to do.

Though nowadays I do lend credence to the concept of reincarnation. If it was possible for me to exist once, who’s to say it can’t happen again, in some future universe? Lightning strikes twice all the time, contrary to the old saying.

Sometimes, I wish I could go back. Sometimes, I wish, like Cypher in The Matrix, I could reinsert myself back into my religious worldview, and forget I’d ever left in the first place. It was comforting. Hopeful. It gave a sense of peace. I mean, how reassuring is it to think you’re communicating with the all-powerful creator of the universe when you close your eyes and pray? As opposed to just whispering into empty air?

But once you’re out of the religion game, there really is no going back. It would take monumental self-hypnosis, or a literal Road to Damascus moment, to psych myself back into becoming a true believer.

I don’t align with the atheist movement, which is predominantly far left-wing with a small libertarian contingent (two political ideologies I don’t espouse), and which is also borderline toxic and hateful. I find the movement has a lot of misplaced anger toward religion, and is populated with ax grinders burned by bad childhood religious experiences. I am not anti-religious. I think religion has an important, even vital place in society, and for most people. Just not for me.

However, this article has prompted me to consider a question:

Is it worth believing in a “lie” if it makes your life better and gives you a sense of purpose?

I put “lie” in quotes because who’s to say whether any given religion is truly just mythology, or whether a god or gods exist at all. I’m content with simply saying, “I don’t know,” hence my agnostic posture. I prefer to keep an open mind rather than make some arrogant declaration of certainty.

I don’t think the solution to all of these deaths of despair is simply getting back to religion. I don’t think a revival, even a Billy Graham-level one, would do the trick. Science has driven too much of a wedge between glib faith and the cold, unrelenting force of reality. Information debunking religious claims that, in years past, would have gone unchecked and unchallenged is now just too readily available. Naivete came easy thirty-plus years ago. Now you’ve got to actively work at it. Facts are a mere Google search away.

For instance (and I’m ashamed to admit this, but I’m going to anyway), I believed the claim of a young earth well into my young adult years. One of my “evidential” claims was the fact that Neil Armstrong stepped into only a small layer of dust on the moon, which would indicate, given the annual amount of space dust that lands on our lunar neighbor, that creation had only been around for 6,000 years. Then I found out how the dust compacts into a hard surface over time, leaving only a powdery top layer, and my long-held “theory” got blown apart.

Hey, at least that’s not as bad as claiming God is real just because the banana can fit in your hand, like Mr. Ray Comfort.

Living in a “post-religion phase,” cocooned in cold scientific truths, has its downsides. The loss of religion leaves a huge vacuum. The human mind is too active, reflective, and unwieldy in some ways to subsist on raw data alone. It craves meaning, purpose, and fulfillment, which are not things the world readily provides. Especially not our modern culture, which prioritizes consumerism and trains its young to aspire mainly to corporate citizenship.

I graduated college almost five years ago. In my second-to-last semester, at the end of the term, one of my professors asked the class what our plans in life were post-college. The question wasn’t about careers specifically. It was general and open-ended. It was more about the vision we had for our lives. We were all well-acquainted with one another at this point. It was two weeks before Christmas.

Everybody answered something related to either career or continuing advanced degrees, with many stating they would be weaving political activism into their lives in one form or another. Not one person mentioned a desire or plans to get married or start a family. This was a class that was two-thirds female, incidentally, all ranging in ages from 22–25.

Everybody was apparently a good little worker bee eager to punch the time clock.

The lone exception was a young female international student from a Muslim country. She’d had an arranged marriage when she was a teenager, and already had one child. I don’t advocate for arranged marriages. But there was no doubt that this young woman felt fulfilled in her life. She often spoke glowingly about her daughter, and had published poems about her. This young woman was also a good student, though she was held back by the language barrier. She often reached out to me for help outside the classroom.

From a Western perspective, this young woman was “exploited,” because of her arranged marriage, and having a daughter at such a young age. Yet, I never got a sense from her that she felt exploited whatsoever. What’s more, she was freely pursuing an education. It’s not like she was being kept locked in the house in some patriarchal dungeon.

Even if you see religion as a superstitious remnant of mankind’s ancient past, there’s no denying the role it plays as a social and psychological glue. Remove it, and people fill in the gaps with something else. And that something else may not always be a comprehensive enough framework for navigating the struggles of life. Things like career, political activism, social media, pop culture, even movie franchise fandom. Star Wars and Marvel may as well be de facto religions by now. Those are all nice things to care about, but I doubt any of them are enough for most people.

Religion also used to foster many romantic relationships. Now millions of singles turn to the slot machine world of Tinder, Match, Bumble, Hinge, and others. Reducing themselves to a mass of digitized pixels to be swiped away with the flick of a finger. A generation of secular wizards poofing away unacceptable mates on a magic screen. No wonder birth rates are plummeting.

Even basic relationships seem to have gone missing. Community is largely atomized and directed online. As though our souls were being slowly sucked into the cyber world, leaving skin-shaped husks behind to play pretend in the “real world.”

Ask most people today what their spiritual views are, and you’ll likely get the standard answer: “I’m not really religious.” A statement almost always delivered with a palpable grimness, if not discreet regret. But whether one adheres to one denomination or another, or subscribes to one holy book or another, is not really the point. It’s more about what thread keeps your seams from splitting apart. For thousands of years, for most people, religion was that thread, however nonsensical, quaint, or silly it may seem from another’s perspective. But sadly, it seems many people have had that thread pulled, and whatever replacements they’ve found are sorely lacking.

Modern Super Bowl Rings are Oversized, Gaudy, and Ridiculous-Looking

Are you commemorating a hard-fought championship that took blood, sweat, and tears to earn, or starring in a rap music video?

Source: https://twitter.com/nfl/status/1358615158562709505

A lifetime ago I worked at a Saturn dealership in Memphis, TN. One of the very first — if not the first — Saturn dealership to ever open in the country, in fact.

My career as a “sales advisor” (or “lot lizard,” a term used, interestingly, for both car salesmen and truck stop prostitutes) was brutishly short. As was the Saturn brand itself, actually. The niche GM label went belly-up in 2010 after the 2008 Wall Street crash pulverized GM’s balance sheet.

I did, however, have the chance to meet some cool people during my short tenure. One of whom was a former Dallas Cowboys football player. He and his wife showed up one day interested in buying an Outlook. I had no idea who he was, but judging by the way our Sales Manager immediately showed the couple into his office, I sensed they weren’t your usual tire kicking weekend walk-ins.

At some point my manager asked me over to his office to deliver some paperwork. And it was then I noticed the twinkling rock on the man’s right ring finger.

“Yes, Dean, that’s a Super Bowl ring,” my manager said, as if I didn’t know.

I was introduced to the former pro athlete, who had played for the Cowboys during the ’70s, when they went to five Super Bowls, winning two of them, under Hall of Fame coach Tom Landry and QB Roger Staubach. The ’70s was when Dallas really built its “America’s Team” identity and its winning culture. Two decades later Dallas would dominate again, winning three Super Bowls in the ’90s with QB Troy Aikman and RB Emmitt Smith. Since those two eras of dominance, Dallas’s post-season record has been rather spotty. But there was a time when the Cowboys were synonymous with victory, and the swaggering pride of dynasty.

Seeing that I’d noticed the ring, the former Cowboy immediately, without saying anything, twisted it off his finger and handed it to me so I could examine it up close. This must have been something he did often. Super Bowl rings are quite exquisite and eye-catching. I mean, just look at this thing:

Source: https://sports.ha.com/itm/football-collectibles/others/1977-dallas-cowboys-super-bowl-xii-championship-ring-presented-to-harvey-martin/a/7130-80096.s

Beautiful, right?

Even back in the ’70s, Super Bowl rings were jumbo-sized. The Cowboys’ Super Bowl XII ring, earned by defeating the Denver Broncos 27–10 in January 1978, felt heavy, but not so much that it would be cumbersome to wear. It had a rough, scratchy exterior due to the numerous diamonds studded into its surface. But its real “weight” came from the victory it represented, and the teamwork and athletic exceptionalism required to achieve that victory. Far bigger than any wedding band or class ring, the Dallas ring I held felt big enough to encapsulate the importance of the Super Bowl win. It felt “appropriately sized.”

Nowadays, Super Bowl rings are gaudy, ridiculously over-sized, and look like something you’d wear if you were making a Weird Al Yankovic parody of a rap music video.

Just take a look at the Los Angeles Rams Super Bowl ring from last year’s championship game, as worn by DeSean Jackson:

Source: https://www.si.com/nfl/rams/news/los-angeles-desean-jackson-super-bowl-ring-championship-las-vegas-raiders

WTF? There is no way a ring that size is even remotely comfortable or practical to wear on an everyday basis, or even once in a while. A ring that size isn’t even really a “ring.” It’s more of a Christmas tree ornament that you stick on your finger. A ceremonial pendant that you’d realistically only wear once a year, likely during championship team reunions or public media events.

I know most of these players are big guys with sausage fingers, but a ring of such elephantine proportion would look too large even on Andre the Giant’s hand.

Imagine trying to stick your hand in your pockets, or conducting routine activities like opening a door, driving, or pulling out your wallet with a glittery rock the size of a strawberry sticking out past your knuckles. It’d be clanking, snagging, sticking, and getting caught on everything. And all those physical impacts would risk damaging it. Diamonds may be the hardest material on earth, but they can still be knocked loose. The New England Patriots ring from Super Bowl 51 famously has 283 diamonds (a trolling nod to the 28–3 score midway through the third quarter before Tom Brady mounted the team’s epic comeback). That’s a lot of rocks to keep track of.

Not to mention a ring that big would unduly attract thieves and other low-lifes. You wear bling that size and you might as well be announcing into a bullhorn that you have a shit ton of money, and are therefore the perfect robbery target. Does the NFL expect its players to maintain a security detail everywhere they go for the rest of their lives?

A championship ring should be a complement to your attire, not a liability. It should be something you could wear every day, if desired. Not something you realistically need to stick in a safe or safety deposit box for the 364 days a year that you can’t wear it.

There’s another issue, too. These gaudy, comically gargantuan finger stones, to me, devalue the true meaning of the championship itself. I mean, the NFL is already a pretty glitzy theatrical affair as it is. Many stadiums shoot off fireworks after a touchdown. You’ve got jumbo screens everywhere, decorated dancing cheerleaders, war chants, bright lights, confetti, trophy presentations, billion dollar broadcasting deals, centimillion dollar player contracts, and hundreds of millions of fans tuning in every week. Adding in a Super Bowl ring the size of a Volkswagen Beetle kind of seems like overkill.

A Super Bowl ring is all about both a player’s individual effort, and a team’s collective effort, to achieve perfection. It’s about the brotherhood you forged that year in pursuit of the top trophy. The relationships made during the season that will likely last a lifetime. Going all “Pimp My Ride” on the championship ring kind of cheapens the whole deal. It places more emphasis on a gaudy trinket, rather than on those intangible sacred bonds.

Making matters worse, in the effort to keep one-upping every previous year’s ring, the latest thing now is making the tops removable. A trend the Tampa Bay Buccaneers started after their 2020 season win over the Kansas City Chiefs. Take a look at the features of the Super Bowl LV ring:

Now, is that “cool”? Well, yeah, sure. The Rams did their own version last year. Their ring is modeled on SoFi Stadium, lets you peer through the side into a miniature version of the field, and even has a section of the game ball inserted into the surface.

These newer split rings are definitely a creative upgrade. But they’re also risky and kind of stupid. How long until a player loses the top half? Or until the snap-on fittings become worn and don’t seat properly? A ring with a top like that is obviously meant to be shown off again and again. Eventually they’ll be worn enough to render the ring unwearable. Then what do you do? Stick it in a drawer? Kind of defeats the purpose of getting to wear a championship ring in the first place.

At the rate we’re going, by Super Bowl 100, rings will be the size of footballs. Players will need special gauntlets to wear them. They’ll cost a million dollars each, and require their own separate mining operations to procure the needed precious metals and stones. When you press a button the ring will open up, and a holographic display will show the player’s career highlights and a recording of the championship game. And they’ll also have mortar tubes installed so players can launch commemorative fireworks whenever they get nostalgic about the big game.

Going back to the Dallas Cowboys Super Bowl XII ring I got to hold back in 2007, it was modestly “small” and “quaint” by today’s standards. But it was clearly something the retired player wore probably every day. Every day for almost thirty years. That’s a long time for a piece of jewelry to last being worn constantly, and a real testament to the hardiness and durability of a Super Bowl ring. Despite its “diminutive” dimensions, it was still pretty awesome to see and hold.

After all, what matters most is what the ring symbolizes: the teamwork, sacrifice, blood, sweat, and tears, and everything else that goes into winning a championship. Not the size of the bling.

Four Helpful and Humorous Writing Secrets From Jay Cronley, Author of ‘Quick Change’

Source: Book cover for ‘Quick Change’ by Jay Cronley

So, I was doing some preliminary research on my next film review for the ‘90s-era “cromedy” (crime-comedy) Quick Change, when I stumbled across one of the most hilarious book introductions I’ve ever read.

Firstly, since you may not know, Jay Cronley was an author and newspaper columnist for the Tulsa World who achieved some notoriety in the late 1980s and early ’90s for a string of comedy films made from his books. These adaptations include Good Vibes, made into the 1989 comedy Let It Ride, starring Richard Dreyfuss; Funny Farm, made in 1988, starring Chevy Chase; and Quick Change, which was adapted for film twice: first in France in 1985, then in America in 1990, starring and directed by Bill Murray, and co-starring Geena Davis and Randy Quaid.

Cronley had quite an under-the-radar run. For a while, he was like the comedy version of Ira Levin. Everything he wrote got filmed. However, we’re not here to talk about his films, but about his writing. More specifically, his introduction to his 1981 novel Quick Change, rereleased in 2006 (not an affiliate link) with his reflections on the impact of his book.

Source: https://www.seattletimes.com/nation-world/author-newspaper-columnist-jay-cronley-dies-at-73/

Quick Change is “about a bank robbery,” as Cronley writes in the opening line of his introduction. It’s a comedy about three thieves who mastermind a clever heist in New York City, only to run into every problem imaginable trying to escape Gotham.

Cronley’s book introduction for Quick Change contains a lot of interesting, useful, and funny writing tips that I felt would be good to share. As an author of dark comedies and satires myself, I certainly appreciated happening across this gem. And I’m someone who almost always skips author intros.

So, here are four writing secrets from Jay Cronley’s Quick Change intro:

1. Dig Deep to Find an Original Idea

Writes Cronley:

Before I began to write this novel, I sat down with a pencil and a notepad and I thought of every way I had ever seen anything stolen…Name it, I noted it. Then I began making notes of all the angles and methods ever used to take what isn’t yours.

It’s fair to say that the crime genre is a fairly well-mined one. Especially with famous authors like Donald E. Westlake (more on him later), Agatha Christie, and classic writers like Edgar Allan Poe and Sir Arthur Conan Doyle, among many others, having hatched almost every conceivable crime story over the last 150 years of publishing history. It’s true now. It was also true back in the early ’80s, when Cronley wrote his book.

This sounds like an easy thing to do. Just research a bunch of plots and then write something that hasn’t been done before. What’s so hard about that, right? Except few writers do that. Instead, they grab hold of whatever new idea they have, refusing to let go. But if you’re trying to break through, you have to do something that hasn’t been done before, that will get you noticed ahead of the many other writers working in your genre.

Speaking of genre, this little tip also means you have to know your genre through and through. And that means reading the hell out of it. Or at least knowing some of the common tropes and twists, so that you can surprise with some of your own.

2. Don’t Give Your Good Ideas Away to Other Authors!

Apparently, all of Cronley’s research for the “last great bank robbery idea on earth” attracted the attention of Donald E. Westlake, whom Cronley considered, “arguably the greatest living American writer.” Westlake actually wanted to use Cronley’s idea for Quick Change in one of his own books. But, as Cronley writes:

“No,” I said to Don on the phone that night. “You can’t have my idea. It took me a year to think of it and a year to write it.”

The idea of Westlake, the legendary crime author with almost 100 titles to his credit, coming to a lesser-known author like Cronley hoping to procure a good story idea is actually rather hilarious to me. It makes me wonder if this is a common thing amongst best-selling authors. It just seems wrong and impolite. Popular musicians borrow, steal, and pay homage to one another all the time, though, so I suppose published authors would do the same. I know I’ve provided good feedback, and even suggested story plots and ideas to other writers on forums and comment threads. But I’ve never given away, or even discussed, an idea of mine that I felt had merit for a good book or screenplay.

3. Hollywood Sucks Because Nobody Reads

This is right in line with Stephen King’s famous On Writing maxim: “Read a lot, write a lot.” Cronley, who was criticizing the creative shortcomings plaguing Hollywood even back in the early 2000s, goes on to say:

The simple reason behind the creative crisis is that nobody reads good stuff, which is the old stuff. The only way to learn how to write well is to read. If nobody reads, you get that Adam Sandler baby-talk thing.

This is especially true nowadays. It is so, so easy to get wrapped up in the mindless bits and pieces of Twitter and other short-form social media. I find myself getting caught in this no-reading trap all the time. But sadly, so many today are smartphone slaves, addicted to the dopamine hits from divisive news headlines, celebrity gossip, or vapid Buzzfeed-style articles that convey little to no useful information. To say nothing of the infinite scroll of YouTube videos, TikTok shorts, streaming shows and movies, Twitch broadcasts, and immersive video games. Who has time for classic literature?

This problem extends beyond the general population, to English majors, writers, and novelists like myself as well. When I was in college, I rarely encountered classmates who’d read much of anything beyond the Harry Potter series or other contemporary titles, let alone books published before the 1980s. And that’s honestly a crime, because classics are classics for a reason.

4. Movie Deals Don’t Always Lead to a Pot of Gold — In Fact, They Might Even Get You Sued For $10 Million

Source: Warner Bros. ‘Quick Change’ (1990)

Quick Change quickly landed a movie option, which is a contract giving someone a set amount of time to make a piece of intellectual property into a film. Sometimes options may only last for a year, and an author might be paid a few thousand dollars. The company that first landed the option to Quick Change wound up in bankruptcy. Right before the option expired, Cronley’s agent offered the book to a producer in France. However, this caused the first option holder to sue Cronley for almost $10 million.

Fortunately, the lawsuit was eventually tossed. The French producer went on to make Quick Change into the film Hold-Up in 1985, starring Jean-Paul Belmondo. The Bill Murray version would, of course, come later. But it goes to show that sometimes Hollywood deals can actually be spring-loaded boxing gloves ready to punch you in the face.

If you haven’t checked out some of Jay Cronley’s books, now’s a good time. If you can find them, of course. Many of them are out of print. But evidently, judging by the Amazon link above, Quick Change is still readily available.

‘Quick Change’: A Solid, Sadly Forgotten “Cromedy” Sapphire

Pagliacci meets the heist genre in a comical crime caper that proves there are no easy getaways.

Source: ‘Quick Change’ poster art via Warner Bros., fair use, https://en.wikipedia.org/w/index.php?curid=5364770

In July of last year I initiated a review of the late ‘80s/early ’90s sub-genre I’ve dubbed the “cromedy” (crime/comedy), beginning with early 1992’s Christian Slater-starring Kuffs as my inaugural piece. This article will constitute the second in what I’d like to make an ongoing series.

But why bother commencing a cromedy concatenation, especially about a largely underappreciated sub-genre relegated to a bygone era? There are several reasons. First, a personal one. Many of these films were in perpetual rotation on such channels as TBS Superstation and USA back in the late ’90s, a time when I was a teenager and constantly curating my isolative boredom on the weekends with middling minor hits. The network channels had all the big movies (which I’d already seen), and my house didn’t have premium cable. So, I satisfied myself with more niche fare. Most importantly, this was a sub-genre I’d discovered for myself, rather than one presented or pre-screened for me by friends, family, or parents.

Secondly, I’ve recently begun seeing movies not merely as vehicles of entertainment, but as time capsules. Little glowing glimpses into particular periods in history. The crime/comedy sub-genre is, of course, not unique to the late ‘80s/early ’90s. Abbott and Costello were satisfying studio contracts with such films as Abbott and Costello Meet the Invisible Man and Meet the Killer, Boris Karloff back in the 1940s and ’50s. However, the “cromedy” era is one I’ve designated as running from about 1984, with the release of Beverly Hills Cop, the sub-genre’s undisputed apex predator, to the Tarantino-penned True Romance (1993). The foot fetish dialogue virtuoso would effectively obliterate cromedies forever with Pulp Fiction the following year, a film that instantly rendered any crime flick obsolete if it didn’t have henchmen riffing on cheeseburgers and pop culture.

Odd as it is to say, American movies set in the late ’80s/early ’90s, and earlier, are practically period pieces at this point. They depict a very different cultural landscape. Before the internet, cell phones, social media, large-scale racial integration, and the financialization and digitization of the economy. Politically, the era sits right at the tail end of the Cold War, and well before 9/11 pricked the USA’s bubble of security. People smoked (on airplanes even!) without verbal warning signifiers, words like “faggot” and “retard” were casually slung around by film protagonists, and sexual harassment and violence, even against underage teens, were often playfully portrayed. Unacceptably strange behaviors that would obviously never fly in today’s Puritanical woke climate.

Aside from Eddie Murphy’s monster ’84 smash about a Detroit cop who fish-out-of-waters in upscale L.A., and A Fish Called Wanda, there are few if any other diamonds in the cromedy genre. It’s an oddball assortment of sapphires, rubies, occasional pearls, and a whole lot of quartz.

Source: Warner Bros. Pictures ‘Quick Change’ (1990)

Which brings me finally to Quick Change, a strong sapphiric entry. A 1990 crime comedy starring Bill Murray, Geena Davis, and Randy Quaid as three bank robbers who pull off the perfect heist on a New York City financial institution, only to get hilariously bogged down in Looney Tunes-esque fashion in their escape from Gotham. The film remains Murray’s only directing credit, a distinction he shares with Howard Franklin, who co-helmed the picture.

Despite having watched the film numerous times during its syndication heyday on cable in the late ’90s, and then again recently, I only just found out that it’s actually based on a book. The novel of the same name was published in 1981 by Jay Cronley. Cronley was an author of eight comedy novels, several of which became films from the mid-’80s through 1990, starring such actors as Chevy Chase (Funny Farm) and Richard Dreyfuss (Let It Ride, based on the novel Good Vibes).

If the cromedy genre has Founding Fathers, Jay Cronley is perhaps its Ben Franklin. Even his last name sounds serendipitously similar. Cromedy/Cronley. How nice is that?

According to Cronley in his introduction to the 2006 re-release of his novel Quick Change, he spent almost a year racking his brain for the last great heist idea. His efforts more than pay off with a clever high-concept robbery that appears to have done more than a little to inspire the Joker’s mob bank heist in the opening of The Dark Knight 18 years later. Granted, the Joker character predates Quick Change by decades, and the Clown Prince of Crime no doubt pilfered many a vault in the comics, but I’d be surprised if Christopher Nolan hadn’t seen, and been a fan of, the Bill Murray film before making the Batman sequel.

Murray, playing “Grimm,” shows up at a Manhattan bank dressed in a full-on clown costume, complete with a suicide vest packed with dynamite. After initially failing to garner any attention with his droll announcement that “this is a robbery” (this is Manhattan, after all), he fires his gun into the air. That does the trick. Police are summoned by the notorious under-the-counter RED BUTTON alert system, and a SWAT team is assembled. The criminal clown is quickly surrounded. Except this is all part of his perfect plan. Grimm makes some outrageous demands, including a monster truck, which the hapless but earnest Police Chief Rotzinger accommodates in exchange for hostages. Except the first three released “hostages” are actually Phyllis (Geena Davis), Grimm’s girlfriend; Loomis (Randy Quaid), his childhood best friend; and Grimm himself (sans clown make-up), all of whom have taped the stolen money under their clothes. Hence the film’s double-meaning title.

Unrecognized due to disguises, the trio slip away to a rendezvous point in a ghetto section of town, where they plan to leave for JFK Airport and eventually, a spot in the Caribbean, with their ill-gotten gains.

It’s a masterful plan. The police are completely fooled. But just when it appears Grimm and company are set to make a clean getaway, all their good luck runs out. They get lost en route to the “BQE” (Brooklyn-Queens Expressway), get held up by a gun-wielding apartment visitor, get robbed by a con man, lose their car, get stymied by a rulemeister bus driver, and have to fool their way past the mob, a foreign cab driver who doesn’t speak English, and a bizarre assortment of NYC characters. All the while Loomis regresses into infantile buffoonery, and a just-pregnant Phyllis threatens to leave her beau to protect their baby from a life of crime.

Quick Change has a lot going for it aside from its clever concept. It has some cool cameos, including Phil Hartman, Stanley Tucci, Tony Shalhoub, and a post-RoboCop, pre-That ’70s Show Kurtwood Smith. An amusing, if one-note, musical score. And some endearingly nostalgic technical touches: payphones, paper maps, and film spool recording equipment.

Source: Warner Bros. Pictures ‘Quick Change’ (1990)

The film seems comfortable relying mostly on its comedic sequences. What few dramatic character moments there are barely register. Unlike the turbulently jocular relationship dynamic between Wanda, Archie, and Otto in the smartly scripted A Fish Called Wanda, another heist film that focuses on the aftermath of the crime rather than the crime itself, Quick Change remains more plot-focused and surface-level. Yet its superficial trappings don’t seem to hurt it. It still boasts witty dialogue, colorful performances, and surprising enough twists and turns, all delivered with well-paced flair. It gets the job done, and more than earns its distinction as a sadly forgotten cromedy gem.

What questionable points remain are largely ignorable, if not workably humorous. Grimm’s bank robbery adds up to a whopping one-million-dollar haul, split three ways. 1990’s $333,000 in today’s inflation-spiked currency is about three quarters of a million dollars. Hardly enough to go on the lam from law enforcement permanently. Much less raise a new family. Though it comes close to Walter White’s initial, uh, fundraising goal of $737,000, which the chemistry teacher sets for himself when he first starts selling meth in Breaking Bad.

Then there’s the glaring issue of Police Chief Rotzinger seemingly figuring out that the criminal trio are escaping on a plane just as it’s taking off, after Grimm provides the false name “Skipowski” (he’d used the name “Skip” as an alias while robbing the bank). Are we to assume the NYC police chief wouldn’t have the ability to get a plane grounded if he suspected there were three bank robbers on board, robbers the city has been on a manhunt for all day? Even as a credulous teen, I found the movie’s ending rather implausible. As a middle-aged adult, even more so. I imagine a SpongeBob-style “Three Hours Later” time card popping up, then cutting to Grimm and friends getting arrested on the tarmac by the feds. Fade to black.

For a very brief time, Quick Change was my favorite film, if only because I hardly ever encountered people who knew it existed, and it therefore felt like something I alone had “discovered.” And this was back in the late ’90s, well before Bill Murray’s resurgence as a serious actor in Lost in Translation and a fixture in Wes Anderson films. As a Murray vehicle, Quick Change’s Grimm hardly competes against his more popular roles. Which is a shame, because it’s one of his most natural performances. Although, I will say that in addition to possibly influencing The Dark Knight’s bank robbery opening, I noticed a lot of similarity between Murray’s quippy, quick-thinking, smart-mouthed portrayal of Grimm and Bob Odenkirk’s wily criminal lawyer Saul Goodman. Grimm dances out of danger with his mouth at multiple points, talking down a gun-wielding psycho, outwitting a local mob boss, and of course, playfully taunting the police. Grimm and Goodman could practically be brothers.

Quick Change may lack the charisma and energy of Beverly Hills Cop, and the intelligence of A Fish Called Wanda, but as a cromedy entry, it’s a solid sapphire.

Do yourself a favor, and check out Quick Change, both the movie and the book.

Reviewing Some of My Crypto “Mistakes” in 2022

An end-of-year look-back at my crypto investments, and a 2023 look-forward. Is it worth HODL’ing to infinity? Also, some price predictions.

Photo by David McBee from Pexels: https://www.pexels.com/photo/round-silver-and-gold-coins-730564/

2022, we gotta talk, man.

What the hell happened? You were supposed to be the year cryptocurrency went super mainstream. The year where everything that wasn’t dirty, dirty fiat went to the moon, baby.

Instead, we crash landed in Death Valley. Celsius and BlockFi went bankrupt. FTX imploded, causing thousands of investors to lose access to their funds. TerraUSD and LUNA completely evaporated, taking billions with them.

And instead of Bitcoin crossing the much ballyhooed $100,000 threshold and putting a Lambo in my garage, it’s ending the year below its 2017 high-water mark.

Screenshot by author from CoinGecko.com

What am I supposed to drive now, 2022? A freaking Toyota? GTFO of here!

Man, what a terrible year for the crypto asset class.

(Disclaimer Side Note: None of this is financial advice. I am not an investment professional. This is mostly a humorous, gut-checking look at my crypto stack and the future of crypto, rather than some serious chart-analyzing deep dive. If you want a pro’s opinion, go check out ClearValue Tax on YouTube, which is where I get a lot of my info. And do your own due diligence too, of course.)

Like most people, I got burned badly this year in the crypto space. But rather than wailing and whining, I thought I’d take a sober look at the wreckage. Throughout the year, I did sell some of my holdings, while keeping some others.

But even if I dumped some of my coins, that doesn’t mean I’ve lost faith in the asset class. If anything, I think all the chaos will make the space much stronger. Crypto needed a wake-up call. Investors of all stripes, amateur or otherwise, needed a wake-up call. With the Federal Reserve now rapidly raising interest rates, the era of easy money is over. For now. Investments have to really prove themselves, rather than just offer some potential speculative future gains.

So what does the current crypto winter mean going forward? Should you just HODL till the cows come home, or cut your losses and move on in life?

I think the answer is somewhere between the two extremes. For sure, some crypto assets are dead and may never come back. According to ClearValue Tax, my go-to for crypto answers, many altcoins from 2017 failed to regain their former highs in the 2021 bull market run. So just because some alts are down big now, doesn’t mean that in a future bull run they’ll make all-time highs again.

I think overall this recent crypto crash should prompt a return to the fundamentals. Even if many alts might promise exponential gains, how do you know which ones will survive through the crypto winter?

For me, I’m going to stick with Bitcoin and Ethereum, and maybe an alt or two going forward into the next run. Luckily, I was mostly loaded up on those two in the beginning anyway. So for me, not much is going to change. It’s all about HODL’ing through the pain.

For sure, it sucks seeing an asset you hold plummet by double-digit percentages over the course of a year. But if you’re planning to hold long-term, this latest “setback” is really a huge buying opportunity.

Bitcoin and Ethereum may drop more from their current levels going into 2023. In fact, I wouldn’t be surprised to see Bitcoin drop into the mid-teens, or even as low as $10,000. I could see ETH hitting $600-$800. But long-term, I think these assets still have a lot to offer in the way of gains.

So, here are my Very Crude, Very Amateur Predictions

I think Bitcoin will rise to somewhere between $80,000 and $120,000 sometime in the next bull run. Possibly late 2024 into 2025, after the next halving event (roughly around March 2024), and assuming the Federal Reserve pivots from hawkish to dovish. If Bitcoin first falls to $10,000, that means you could possibly be looking at an 8x-12x if you time your purchase and sell points right.

For Ethereum, I think it will rise to somewhere between $6,000 and $10,000 at around the same time. If it first falls to $600, then you could be looking at a potential 10x or greater if you time the bottom right and sell near the top.
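For anyone who wants to sanity-check the arithmetic, here’s a minimal sketch of how those multiples fall out of the hypothetical entry and exit prices above. The prices are the same speculative guesses from this post, not data or forecasts.

```python
# Rough multiple calculator for the speculative price targets discussed above.
# All figures are hypothetical guesses from this post, not forecasts or data.

def implied_multiple(entry_price: float, exit_price: float) -> float:
    """Gross multiple (exit / entry) on a simple buy-and-hold position."""
    return exit_price / entry_price

scenarios = {
    "BTC (bear-case entry $10k)": (10_000, (80_000, 120_000)),
    "ETH (bear-case entry $600)": (600, (6_000, 10_000)),
}

for name, (entry, (low_target, high_target)) in scenarios.items():
    low_x = implied_multiple(entry, low_target)
    high_x = implied_multiple(entry, high_target)
    print(f"{name}: {low_x:.0f}x to {high_x:.0f}x")

# Prints roughly 8x-12x for BTC and 10x-17x for ETH, in line with the ranges
# mentioned above. Timing the actual bottom and top is the hard part.
```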

Regarding altcoins, DeFi, DEXs, CEXs, etc. I have no freaking clue. I’m very much a crypto minimalist. I get my coins off exchanges ASAP, and hold everything in cold storage. Part of what appeals to me about cryptocurrency is the decentralization and self-custody aspect. But therein lies some danger. You have to be extra careful guarding your own stack.

I’m proud and lucky to say none of the exchange nonsense impacted me. I did briefly hold some Bitcoin and other assets on BlockFi earlier in the year. But even then, I kept hearing negativity regarding that institution. And BlockFi kept reducing its yield anyway. It didn’t seem worth it to give my coins to someone else to hold for such a little payoff. So sometime around the end of last winter I pulled everything I had off BlockFi and into my own cold storage wallet. A decision that’s paid off very well.

If an exchange, centralized or otherwise, is offering ridiculous yields that look too good to be true, they probably are. Stay away. I’m glad I trusted my gut regarding BlockFi. And while I never used FTX, when I kept seeing a bunch of random celebrities and YouTube crypto dorks shilling for it, my B.S. detector went off. I didn’t know why. I just didn’t like the vibe that exchange put off, much less its supposed “man of the people” founder. I don’t have some sixth sense or a crystal ball. It’s just that I’m old enough to remember institutions like Lehman Brothers, WorldCom, Enron, and the numerous dot-com failures that blew up in the 2000s. Common sense told me something wasn’t right.

Hey, maybe this getting older thing has some advantages.

With all that said, here’s a more microscopic look at some of my crypto HODLings, starting with the biggest one of all.

Bitcoin (BTC)

Photo by Pixabay from Pexels: https://www.pexels.com/photo/close-up-view-of-a-golden-coin-315788/

I first started buying Bitcoin back in September, 2020. Slowly, tentatively, without really knowing what I was doing. But the more I learned about the digital decentralized store of value, the more excited and enthralled I became about its future. Even if I didn’t fully understand the technology behind Bitcoin, I understood its unique and revolutionary characteristics.

Then when the bull run began, fueled by the Federal Reserve printing money into oblivion following the Covid market crash in March 2020, I felt vindicated. Though I’m proud to say I never got into the laser eyes craze, which felt stupid and cocksure to me.

I only bought $25 worth of Bitcoin that September, when it was valued around $10,000. Then I slowly accumulated more through the end of the year, ramping up aggressively into the bull run. By the time it hit around $25k-$35k in the first quarter of 2021, I’d already doubled or even tripled my investment. Bitcoin had a double top in 2021, hitting the mid-$60Ks in the spring and nearly $69k in November, crashing almost 50% in between. But since November of last year (2021), it’s been almost all downhill, and now it sits somewhere between $17k and the abyss. If the Fed keeps jacking rates, and the economy spirals into a recession, Bitcoin could crash even lower.

But I’m still accumulating and holding. And now is the best time to do so, hard as it may be. When the price is low, and everyone else is saying it’s dead and never coming back. Not in two or three years when it’s making all-time highs again and crypto bros are mortgaging their houses at $80k a coin, and posting YouTube thumbnails of their stupid faces and their stupid mouths wide open.

I believe in Bitcoin. But I wouldn’t say I’m a fanatic or evangelical about it. And I don’t buy into the moon-shot price predictions that so many mouth-breathing YouTube dweebs like to make. My plan is to take advantage of the bear market over the next two years (or however long it may last). When I first started buying in fall 2020, I felt like it was too late. I missed out on buying when Bitcoin dropped to around $5,000 during the Covid crash because I didn’t see it for the deal it was. But next time I intend to take better advantage of major price drops, and then sell more into strength once the bull run starts up again.

Ethereum (ETH)

Photo by Jonathan Borba from Pexels: https://www.pexels.com/photo/ethereum-coin-on-yellow-background-14891535/

ETH has actually been my most lucrative crypto investment. I started buying this one in December, 2020, and have held since. I’m still up about 2x to 3x on my holdings, even considering I bought some at the tail end of the bull run.

Even though ETH’s been a good investment, I still have doubts about it as a long term value, given the collapse of FTX and many other crypto exchange firms. Not because I think ETH will collapse like those firms, but because unlike Bitcoin, it’s not adequately decentralized. So, it goes against the whole ethos of “crypto” in general.

ETH is also buggy and expensive to use, unlike many of its competitors. I compare it to Microsoft Windows in the ’90s: despite being slow and awkward, it had such enormous market share that everyone used it.

However, ETH will most likely continue to be used well into the future. It has the distinction of first mover advantage, which, in the tech world, is often enough to secure a stronghold for years to come. Ethereum is the foundation for many DeFi and NFT projects, and I don’t see that changing anytime soon. At this point, whatever supposed “ETH-killers” come out, they’re really competing for third place or worse.

So, my plan with ETH is pretty much the same as with Bitcoin. Keep HODL’ing, and keep stacking. I missed out on loading up on ETH at its ridiculously low bottom during the Covid crash, something I regret. I still say there is downside to come. So next time I’ll be more ready to deploy capital and swallow the risk. But to be clear, it’s riskier than Bitcoin, and Bitcoin itself is already far out on the risk curve as far as investments go.

Polkadot (DOT)

Screenshot by author from CoinGecko.com

While BTC and ETH will likely remain in the number one and two spots in terms of market cap and utility for the foreseeable future, the question that remains is who’s going to take the #3 spot and below?

Polkadot is an altcoin I started buying sometime in 2021 into early 2022, but eventually offloaded in favor of more BTC. In fact, I wrote about it in this article here.

DOT has some positives going for it. It’s supposedly meant to act as a bridge of sorts, linking other blockchains together via use of its native token. Its founder is Gavin Wood, who also co-founded ETH with Vitalik Buterin. DOT also has a good number of developers. It offers a high yield for staking the token, which you can do in your own wallet or on an exchange like Kraken. The downside is that DOT has an unlimited supply and, effectively, perpetual inflation. DOT may pay out between 7% and 14% (or higher), but it has to, because its current annual inflation rate is about 7%. Token inflation kind of goes against the whole philosophy underlying crypto, particularly Bitcoin itself. Inflation is what fiat money does all on its own. And God knows we’ve seen enough inflation in 2022 to last a lifetime.
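As a back-of-the-envelope illustration of why that inflation figure matters, here’s a minimal sketch of how a nominal staking yield nets out against token inflation. The percentages are the rough numbers quoted above, not DOT’s actual parameters at any given time.

```python
# Back-of-the-envelope real staking yield, using the rough figures from the text.
# These percentages are illustrative assumptions, not live DOT parameters.

def real_yield(nominal_staking_yield: float, token_inflation: float) -> float:
    """Approximate yield measured in share-of-total-supply terms."""
    return (1 + nominal_staking_yield) / (1 + token_inflation) - 1

TOKEN_INFLATION = 0.07  # ~7% annual issuance, per the figure above

for nominal in (0.07, 0.14):
    print(f"{nominal:.0%} staking yield vs {TOKEN_INFLATION:.0%} inflation -> "
          f"~{real_yield(nominal, TOKEN_INFLATION):.1%} real")

# ~0.0% and ~6.5%: a 7% yield roughly treads water, while anyone not staking
# gets diluted by the full issuance rate.
```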

At this point, it’s really a guessing game as to which token will be able to compete alongside ETH in the future. I highly doubt anything will beat ETH at this point, even if a coin is cheaper and easier to use. I may return to DOT in the future now that it’s fallen to its near-ICO price (sub-$5). I may even stake it again on my Ledger, which for newbies is not the easiest (or risk-free) thing to figure out.

DOT could be one of those rare alts that make higher highs in the next bull run. Which would provide enormous returns considering its present price. But there are still just too many questionable elements. If it keeps dropping, however, and gets to the $1-$2 level during a potential capitulation crash next year, I may decide to stack some again. For now I’m staying away.

Algorand (ALGO)

Screenshot by author from CoinGecko.com

Ugh, this one was a big disappointment that once held some promise. While it may still technically be a “blue chip” alt, it’s fallen out of favor pretty hard, like many supposed “ETH killers.”

Despite its current state, I actually did okay with Algo, initially buying some in 2021 when it was around $0.30, watching it pump to over $2, then plummet back down to earth. Like DOT, this is another alt competing to at least get into the same ballpark as ETH. It has MIT-trained founders and a development team. It also had a very easy staking feature. All you needed to do was hold it in your wallet, and you’d get daily rewards (around 4% annually). I liked the simplicity of it. It was one of my first altcoin “wins.” I did okay, but sold most of my small stack well below the high.
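For a rough sense of what “daily rewards at around 4% annually” works out to per day at the token level, here’s a minimal sketch. The 4% rate and the 1,000-token stack are illustrative assumptions; Algorand’s actual reward mechanics have changed over the years.

```python
# Convert an annual reward rate into its daily-compounded equivalent.
# The 4% rate and 1,000-token stack are illustrative assumptions only.

ANNUAL_RATE = 0.04
DAYS = 365

# Daily rate such that compounding it daily reproduces the annual rate.
daily_rate = (1 + ANNUAL_RATE) ** (1 / DAYS) - 1

balance = 1_000.0  # hypothetical starting stack
for _ in range(DAYS):
    balance *= 1 + daily_rate

print(f"Daily rate: {daily_rate:.5%}")          # ~0.01075% per day
print(f"Stack after one year: {balance:.2f}")   # ~1,040 tokens
```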

This is another alt I may buy back into if it drops hard enough, like DOT. I’m not into meme coins. I don’t do DeFi. I no longer trust exchange tokens, no matter how established they may be. For me, I look for a strong development team, utility, and other fundamentals. Algo has some good things going for it. But its glaring weakness is its max supply, which is ten billion tokens. While that’s better than DOT’s potentially infinite supply, it’s still hardly a scarce amount.

Algo still has a lot of questions. But like DOT, it could be a survivor into the next bull run. So I’ll keep my eye on it and may reacquire if it drops low enough.

PancakeSwap (CAKE)/Uniswap (UNI)/ApeSwap (BANANA)/DeFi in General

Screenshot by author from pancakeswap.finance

I only dabbled with DeFi and its various outrageous (and childish) forms. Yes, I see the potential. No, I don’t really care. At least not anymore. And that’s because, for all the research and experimentation I did, I still don’t really understand it. And it’s riddled with scams.

I think at most I put a few thousand into the DeFi space. I even made some small gains. But because I just never really got the tech behind it, I ultimately pulled out. Even if an asset space is new and looks promising, I don’t think it’s a good idea to get into it if you don’t grasp it. This is why I avoid penny stocks, most tech stocks, or investing in anything that’s too cutting edge. I like some risk, sure, but I’m not a pioneer.

Still, it will be interesting to see how this space matures in the future. I don’t know that I agree with the idea that DeFi is the “future,” if only because trust is such an important aspect of the financial world. Banks, whether you like them or not, have been established institutions for thousands of years, have government backing, and in 99% of cases, work just fine. Ask the millions who use them for mortgages, car loans, and to store their savings. Banks ain’t going anywhere, buddy. And if DeFi never matures beyond dessert- and fruit-themed tokens that crash more than 95% during bear markets, it probably won’t make it beyond the sidelines of the finance world.

Conclusion

This look-back actually encompasses 2020–2022, not just a single year. But it’s been a long time coming. And now that crypto winter has clearly set in, it was time to look at my holdings and think about the future.

2023 going forward will be all about fundamentals, and prioritizing Bitcoin and Ethereum over pretty much everything else. I see those two as the “safest” bets in crypto that still have a lot of runway. The way I see it is this: Would you rather have a good chance at your investment making an 8x-12x? Or a much smaller chance of your investment 20xing or more by going into a riskier altcoin? I’ll stick with the “smaller” but more likely return. It’s easy to lose sight of the big picture and underappreciate how outsized the potential gains in the crypto space already are. A good, diversified S&P 500 index fund will generally make 8%-12% a year. An 8x-12x on a few virtual coins in a few years is quite frankly a ridiculous ROI, and trying to get more than that is just asking for trouble.
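To put that comparison in rough numbers, here’s a minimal sketch of how long steady index-fund compounding would take to match those multiples. The return rates and multiples are the ranges mentioned above; nothing here accounts for risk, volatility, or taxes.

```python
import math

# Years of steady compounding at a given annual return needed to reach a
# target multiple. Rates and multiples are the rough ranges from this post.

def years_to_multiple(multiple: float, annual_return: float) -> float:
    """Solve (1 + r)^n = multiple for n."""
    return math.log(multiple) / math.log(1 + annual_return)

for annual_return in (0.08, 0.12):
    for target in (8, 12):
        years = years_to_multiple(target, annual_return)
        print(f"{target}x at {annual_return:.0%}/yr: ~{years:.0f} years")

# Roughly 18-32 years depending on the combination, which is why an 8x-12x
# over just a few years counts as "a ridiculous ROI."
```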

Another lesson I’ll take with me moving forward is keeping a better eye on the exit. It’s nice to learn about new technology and to see where all this crypto could lead in the future. But in the end, it’s an asset like anything else. Not a religion. Not a philosophy. I’m here to make money, not change the world. Had I sold my Bitcoin and Ethereum holdings late last year, in 2021, I’d be sitting on a pretty good down payment for a nice house. As it is, I’ll have to wait for the next bull run. But my “mistake” of HODLing through the peak, and my patience, may potentially get rewarded, if the Fed and the economy ever get themselves sorted out. The next peak could be way bigger.

We’ll see where things go in the new year. One thing’s for sure. I’m looking forward to 2023 and beyond. 🙂

Review: Avatar: The Way of Water

A stunning visual spectacle, as expected. But does the story match up with the eye-popping CGI?

Source: Official ‘Avatar: The Way of Water’ Poster from 20th Century Studios via WDW News Today

Last week I wrote an article titled I Love ‘Avatar’ and Haters Can Choke. An aggressive response to nearly a decade and a half of people unfairly singling out James Cameron’s 2009 smash hit and beating it over the head for its supposed lack of originality, simplistic story, or sometimes for no reason whatsoever.

Avatar haters really are a special breed. Filled with glib self-righteousness, dismissive of the titanic efforts that went into making the groundbreaking special effects, all while ignoring the scores of big, dumb sci-fi/action franchise films like Transformers, Star Wars, or 80% of Marvel products, which don’t possess even half the technical precision or directorial vision, much less the box office gross, of the Pandora-set film.

Frankly, I get second-hand embarrassment just thinking about them.

But we’re not here to talk about those with clearly lesser-developed frontal lobes. We’re here to talk about the monumental, long, looong-delayed sequel that just finally premiered this week. We’re here to discuss Avatar: The Way of Water.

::SPOILERS::

AWOW, as I call it (which is pronounced “Hey, wow!” by the way), starts off over a decade after the events of the original film. Jake Sully, now a full-bodied Na’vi member, having transferred his consciousness into his avatar form with the help of Eywa in the first film, is married to Neytiri and has a growing brood. This includes Neteyam, his eldest son, their second son Lo’ak, and “Tuk,” their youngest daughter. Rounding out the family are its adopted daughter and son: 14-year-old Kiri, born of an apparent immaculate conception from Grace Augustine’s avatar body and played by Sigourney Weaver, and Spider, a human boy and son of Colonel Quaritch.

Unlike the first film, where Jake only had his own safety to be concerned with, now he has a family to worry about. So when humans in powerful warships descend from the sky over Pandora to resume their quest for the planet’s minerals, he must balance his role as protector of the tribe with his role as protector of his immediate family. At first he’s able to make this arrangement work. He leads the Omatikaya clan in derailing (literally) a train system and stealing weapons and supplies to be used against further encroachment by the “sky people.”

However, it isn’t long before an old enemy resurfaces, one who has made this restarted colonization effort a personal matter rather than a mission to secure the future of a dying earth, and Jake is forced to go on the run with his family. Leaving their home in the forest, Jake leads his wife and children to the Metkayina clan, who live near the ocean, in harmony with its vibrant aquatic life forms. But Colonel Quaritch, now reincarnated in an avatar body, but with all of the old soldier’s maliciousness, eventually catches up to the Marine who betrayed him, leading to an explosive (and watery) showdown.

Visually, this film is a masterpiece, of course. It’s even unprecedented, despite being a continuation of the original. It makes 2009’s Avatar look like an early-2000s PlayStation game (Resident Evil 3: Nemesis, anyone?), even though Avatar’s CGI still looks as good as, or better than, that of films made in the past five years. It goes without saying that Cameron has raised the bar again, having pioneered underwater motion capture for AWOW. See it in 3D and IMAX and it’s as transportive as the original, maybe even more so. There are sequences that just look impossible. Cameron sought to create a cinematic spectacle that was like “dreaming with your eyes wide open,” and he certainly pulled it off. Except while dreams are usually chaotic and messy, Way of Water is a focused, entertaining spectacle of the highest order.

All that said, what about the story? While I’ve defended the original as vigorously as an attorney who knows his client is innocent, it’d be the height of intellectual dishonesty not to acknowledge the film’s shortcomings. The Godfather, it is not. But then, no four-quadrant tentpole science-fiction/action/fantasy film that costs a quarter of a billion dollars to produce is, either. It’s a simple story told on a vast stage, which is what most big-budget spectacle films try to do.

However, there is a noticeable upgrade in nuance, both in the character work and in the plot of AWOW, versus the original. People who liked the first film but left wanting a little more will likely be satisfied with this sequel.

In fact, some of that extra narrative weight proves to be both the film’s weakness and its strength. In AWOW, Cameron does something I don’t think he’s ever done before in any of his films: balance a large central cast filled with intricate relationships. This is a story about a family, not about a man (Sully), as in the original. Sometimes Jake recedes into the background in favor of his colorfully characterized children, only to resurface when needed. Neytiri almost fades away after the midpoint, only to come back powerfully at the climax. Almost all of Cameron’s films involve a single, central relationship. In The Terminator, it’s Sarah Connor and Kyle Reese. In Aliens, it’s Ripley and the young Newt, the lone surviving colonist. In T2, he steps it up with John Connor and the good T-800, and John and his mother, Sarah. In Titanic, it’s Jack and Rose. In Avatar, it’s Jake and the Omatikaya, and Jake and Neytiri.

In AWOW, there is no single central relationship to streamline the story. There are something like five interwoven ones: Jake and his overall role as patriarch of his family and husband to Neytiri. Jake and his relationship to his “disappointing” second-oldest son, Lo’ak. Lo’ak and his relationship to Payakan, a whale-like creature. Kiri and her spiritual relationship to Eywa. And lastly, Spider and his relationship to his sort-of father, Quaritch.

It’s all pretty big and cumbersome, and it would be almost unwieldy were it not for the film’s monstrous 3-hour-and-12-minute running time. Cameron gives ample space to most of the major plot threads, though some are no sooner started than minimized to background noise. For instance, after Spider is captured by Quaritch and his avatar-embodied marines, he largely disappears, save for a few short sequences, until the film’s final fight. Meanwhile, Lo’ak encounters the friendly and misunderstood Payakan, while Kiri explores her ability to connect with the planet’s maternal entity, Eywa. Most of the threads come together for proper payoffs, save for Kiri’s seizure-like reaction to plugging in to Eywa. Human medics state that if the teenager attempts to access the spirit mother again, she could die, in an apparent set-up for a future sacrificial moment. But this never comes to fruition, and it is never mentioned again.

All of these relationships in AWOW are meant to reflect the bigger theme, which is mankind’s relationship with nature itself. This film is a “love letter to the ocean,” as Cameron puts it. More specifically, it’s a call to be more mindful of what we put into the ocean, and into our environment. Reinforcing that theme is a cruel, Captain Ahab-like character who hunts Tulkuns (Payakan’s species) for their highly valuable brain fluid, which can evidently “stop aging” in its tracks. Though one wonders why humans would want to live forever on a “dying planet.” The metaverse must have improved to Matrix-like levels in the future Earth of Avatar.

While it’s possible sequels will resolve some of the missing links in AWOW’s great chain of relationships and plot points, it’s fair to say that this latest film is a ten-pound story stuffed into a five-pound bag. Even though it would never be financially feasible to mount an Avatar streaming series given the technical requirements (not to mention Cameron’s time-consuming perfectionism), you all but need one at this point. If the visionary director who submerged to the lowest point in our oceans has his way, we could be seeing at least three more sequels. Might five installments (or more) be enough to contain the whole story? It’s hard to tell, but part of me thinks this is a bit of a runaway train. Not that I’m complaining.

It would be impossible to count off every CGI marvel in AWOW. But for me, the most impressive, and honestly most touching, visual was a brief shot after Payakan splashes his new friend, Lo’ak. The Tulkun dives under the water, but not before casting back a self-aware, playful glance. I don’t think I’ve ever seen behavioral subtlety done that effectively in an animal character before. Sure, humanoid CGI characters like Thanos, or the chimpanzee protagonist Caesar from the Apes films, emote quite realistically. We’ve become spoiled by all the computer-generated wizardry. But to make a whale’s personality shine in a brief moment takes things to another, emotional level. Quite impressive, to say the least.

Overall, I’d give Avatar: The Way of Water 4.5 stars out of five, losing half a point only for its overstuffedness and some minor indulgence. There were sequences that could have been trimmed for a leaner run-time. But that’s like complaining about there being too much food at a long-awaited feast. AWOW is the best thing you’re going to see in IMAX 3D, and absolutely worth checking out once, twice, or more. I just hope part three arrives in a timely fashion. I’d like to see this franchise reach completion within my lifetime.

Sorry, Tattoos Are Still Stupid

Photo by Kevin Bidwell from Pexels: https://www.pexels.com/photo/man-with-floral-arm-tattoos-2183132/

So, I was sitting in some vinyl or cheap plastic-covered chair, in a low-lit room, surrounded by a few strangers. I had my sleeve rolled up. I was ready for what was coming.

Then she came by. Set up all her equipment. Rubbed some disinfectant on my arm. And after making sure everything was a go, inserted the needle into my skin.

I felt that familiar pinch as the metal penetrated through my epidermis and into my vein, and started drawing my blood like a thirsty little vampire.

And so began my latest whole blood donation to the Red Cross. My 26th in total since first giving back in 1999 in my junior year of high school.

What does this have to do with the fact that tattoos are stupid, beyond the misdirection? There’s the needle connection, of course. But it’s also the fact that most people like to ink themselves to commemorate something of significance. A person, like a partner. A date. A symbol representing a group of some kind. Or maybe just a cool design they like. Whatever they put, it usually means something important to them.

Just like I think donating blood is important. I’ve been giving blood for almost a quarter century. I actually really enjoy doing it. It’s the same to me as any valued relationship. But as much as I like helping out the nation’s blood supply, and for as long as I’ve been doing it, I would never consider stamping my body with a Red Cross logo. Or printing something like “GALLON DONOR” on my shoulder, with a big bolt of lightning going through it.

And why not? Well, because that would look tacky and stupid. It would demean and cheapen the altruistic act, as far as I’m concerned. It would also be unnecessary. Who exactly am I trying to inform about my blood donation activities? Myself? Am I going to forget that I donate, and therefore need the constant reminder? Or would I be getting it to signal to the world that I donate blood, in hopes of convincing others to do the same? That’s something I do already. I’ve featured it in a previous article. I bring it up in conversations when appropriate, such as when discussing hobbies or volunteer work. It’s not necessary to carry around a permanent sign on my flesh that screams “ME, YES ME, DONATES BLOOD, AND YOU SHOULD TOO!” I’m not a blood donation TV evangelist, or something.

Plus, a tattoo would disqualify me from donating blood for at least three months if I got it in a state that does not regulate tattoo facilities. Or potentially disqualify me permanently if I were to pick up something nasty like Hepatitis while getting the tattoo.

If I got a Red Cross tattoo, let’s say, on my neck or hand, or some visible area not often covered by clothing, most people would probably think I’m insane. But if it were a skull, or a bird, a snake, or a rainbow, that’d be A-okay. Those are “normal” type tats. People get all kinds of such tats on those areas, and nobody bats an eye.

Why is that? Because unfortunately, tattoos have been normalized in society. So much so that NOT having one is unusual. I can remember a time when only bikers, convicts, and military types got tattoos. And usually they only had a few. Now everyone’s got them. And people rarely stop at one. Even nerds and dorks are getting full sleeve tats. I used to work with a guy who had tats of horror movie villains all over his body. He had Michael Myers standing at the top of the stairs in Halloween (the pic shown at the top of this linked article) on one bicep. Freddy Krueger on another. I think Jason Voorhees was on him somewhere, too. He was 26 years old. I’m sorry, but why the fuck does a full-grown man need to garishly exclaim to the world in the form of permanent ink on his skin that he likes mainstream horror movies so much? Does he think that makes him unique or special? Judging by the box office returns and cultural impact of the three franchises represented by those characters, EVERYONE likes those movies. You’re not the only one, dude.

I mean, with that kind of passion for horror movies, I’d expect the guy to be a writer/director of horror movies himself. Or maybe an actor in one. But he wasn’t. He was just a guy working in a shitty market research company for $29k a year. Just like me at the time.

I don’t care that it’s 2023 next month and everyone and their grandma has a tattoo. I will never not think they look stupid and unnecessary. In fact, I’ll go even further. I have never once ever seen a tattoo that I thought looked attractive, unique, or creative.

But what about those 3D tattoos? Or those colorful, freshly-inked ones? Or the ones done by top tattoo artists that are one of a kind?

Let me explain to you exactly what a tattoo is to me with a little story.

I had just pulled into a Home of Economy when I noticed a brand-new, top-of-the-line Ford F-250 King Ranch parked in the lot. This truck was jacked up, with camouflage trim, a custom olive green color, and a creamy leather interior. It had the works. It likely cost somewhere close to $60,000 or more with all the modifications. I am not a truck guy, but even I know a customized King Ranch means something. You order a truck like that, and you’re probably waiting six months for it to arrive. This truck was somebody’s baby. It looked immaculate.

Except for the bumper sticker on the back that read, in all upper-case black letters, against a white background: “I’M ONLY DRIVING THIS FAST BECAUSE I REALLY HAVE TO POOP!”

That right there is what a tattoo is to me. A trashy, low-class, contemptible, 100% stupid bumper sticker.

“But I got my tattoo for (insert deep heartfelt explanation that for some reason brings meaning to your life, and that I’m supposed to agree with to make you feel better)!”

Nah, see you’re wrong. You didn’t get that butterfly tat on your shoulder because you just love flying insects. I don’t see a degree in lepidopterology framed on your wall, I see a Natty Light poster. You got it because you’re dumb and impressionable, and you thought it looked cute and would make you fit in more with your equally dumb friends. You might as well hang a big neon sign around your neck that reads: “I am desperate for approval, can’t think for myself, and willing to scar my body to fit in with the crowd.”

And that sleeve tattoo of flames going down your arms? What the fuck is that supposed to tell me? That you’re some kind of badass? That if you started punching, you’d swing so fast it would look like your arms were on fire? No, all that tells me is that you know how to waste money on dumb shit. My local part-time librarian associate has a sleeve tattoo. I know she can’t make more than $20k a year. A typical sleeve tattoo can cost thousands of dollars. So it’s likely she spent at least 10% of her annual pre-tax income on a vomitous mess of pre-millennium kids’ cartoon-themed ink, including a piss-poor rendering of Rainbow Brite. I don’t care how much Proust she reads. She is a dumbass.

Look, you can justify them all you want. You can come up with any reason you can about why they’re meaningful to you. I don’t care. Tattoos are stupid, and you are stupid for having them.

I Love ‘Avatar’ and Haters Can Choke

Source: Official ‘Avatar: The Way of Water’ Poster from 20th Century Studios via WDW News Today

“Avatar is just Fern Gully in space.”

“Avatar is just Pocahontas in space.”

“Avatar is a white savior movie…in space.”

“Avatar left no pop culture footprint.”

“Avatar is the most successful failure ever.”

“Avatar has phenomenal CGI but a terrible story and stock characters.”

Yeah, yeah, go fuck yourselves.

You know, for a movie so many people keep insisting sucks and was evidently “forgotten” after premiering in December 2009, you’d think they were talking about something like Delgo, that weird, creepy-looking animated film from 2008 that became one of the biggest box office bombs in history, making barely a million bucks against a $40,000,000 budget.

Delgo, like Avatar, is set in a mythical fantasy land where conflict arises between two warring alien clans, and features two lovers from opposing sides trying to find romance despite their differences in heritage.

But then the two films diverge quite massively: Delgo legit blows, while Avatar is a bona fide masterpiece and a technical milestone that made nearly $2.7 billion in its initial theatrical run.

Avatar kicks ass, and I’m tired of people pretending that it doesn’t.

The same losers who shit on Avatar are the ones who made Star Wars: The Force Awakens the highest-grossing U.S. domestic release in box office history. And that JJ Abrams-directed fan fiction is nothing but a bigger-budget remake of Episode IV, with about as much creativity as an elephant fart. You know it and I know it.

Oooh, ooh, instead of a Death Star, we’re going to have a planet-sized Death Star this time! And instead of blowing up just one planet, it’s going to blow up several! And we’re going to make a FEMALE our protagonist. A female lead? OMG, that’s unprecedented! And we’re even going to have a black guy as a lead too (though in the sequels we’re going to reduce his importance until he’s so buffoonishly inconsequential the actor wants nothing more to do with the franchise). Never been done!

So which is it, idiots? Avatar’s just a ripoff of Fern Gully and Pocahontas using a familiar hero’s journey template, but Force Awakens is somehow the greatest thing ever put to film? You can’t have it both ways.

Source: Screenshot of Google results for “Avatar is…”

I’m not going into all the technical details about why Avatar’s CGI and motion capture technology was so groundbreaking. It’s all been said before, I’m pressed for time, and I’m not enough of a dork to get into the nitty-gritty tech stuff.

I will, however, compare Avatar to a jury of its peers — other massive science-fiction/action film franchises — which is how everyone should appraise James Cameron’s movie.

First, some blunt honesty.

Avatar is obviously not The Godfather. It’s not some deep A24 indie film about some guy’s existential middle-aged crisis. It’s a finely crafted popcorn flick with phenomenal special effects and some superficial environmentalist themes. That’s about it. I love it because it’s a transportive amusement park ride that executes its story just about perfectly, a rare feat few movies accomplish. Avatar is still the best and most immersive movie theater experience I’ve EVER had. It does everything a sci-fi/action film should do, and way more. Yet for some reason everyone judges Avatar harshly, while giving substantially lesser franchises an easy pass.

People only hate Avatar at this point because they think they’re supposed to. It’s cool and hip to be anti-Avatar because it’s so popular. And I get it. Sometimes popular stuff does indeed suck ass. God knows I’ll never understand the appeal of Twilight or Fifty Shades, but I’m not going to sit here and act like I’m better than people who like those franchises. Most of the time, things are popular for a good reason. They connect with people in a visceral and profound way.

Anyway, we’ve already peeked at the new Star Wars films, Avatar’s closest competitor that’s also within the massive Disney ecosystem. And what do we see? A disorganized mess with no vision, no scope, no outline, no coherency, and no theme. Seriously, what the fuck are the newest Star Wars products even about? Empire bad, robed people good? LOL, GTFO! They stand for nothing!

Whether you hate Avatar or not, it’s virtually impossible to walk away from that film without getting the pro-environmental and anti-imperialist messages, however ham-fistedly those themes are handled. And whether you agree or disagree with them, the fact that a writer/director is able to inject his vision into a spectacle film about giant blue people in a coherent and frankly meaningful way is honestly pretty impressive.

Now let’s take a look at another competitor: the Marvel films. I will grant you that some of the Marvel installments, like Infinity War, Captain America: The Winter Soldier, and even the first Guardians of the Galaxy, are pretty good. The mocap technology used to bring Thanos to life is incredible. And I have no idea how producer and plot captain Kevin Feige manages to interweave so many characters and storylines across so many directors and writers. But for the most part, your typical Marvel movie is serviceable, without saying much or doing much to stick with you. Really, do you remember anything from Iron Man 2, or Captain Marvel, or The Incredible Hulk? I don’t. And nowadays, the Marvel mega-franchise has become increasingly watered-down and impossible to follow, with all the Disney+ shows and what feels like ten releases every year. Marvel is proof that there can be too much of a good thing. I’ve given up on the superhero soap opera simply because there’s too much to keep track of. Marvel used to feel special with maybe two releases a year. Now it feels forgettable and paint-by-numbers.

Meanwhile, James Cameron took 13 years — 13 years! — to make Avatar: The Way of Water, the sequel to the 2009 film. Could you imagine any other franchise taking that level of care and patience in producing a sequel? Could you imagine any director delaying an Avengers film for almost a decade and a half until the tech was “right”? Of course you couldn’t. And that’s because Marvel films, for all their occasional creativity and spectacle, are mostly business products at the end of the day. They exist to enhance Disney’s bottom line.

But what is Avatar: The Way of Water? According to the creator himself, it’s the “worst business case in movie history.” The sequel apparently has to gross almost $2 billion just to break even. Or roughly one-sixth the current market cap of Dogecoin. Its budget is a bloated potential time bomb for Disney’s already floundering stock price. A cathedral to one director’s massive ego and his bid to make the biggest film of all time thrice.

And you know what? That’s awesome. Not many artists are willing to go all in on their own work time after time, costs be damned. Hell, most people aren’t even willing to invest 10% of their money without making sure it’s in a properly “diversified portfolio.” And most artists can’t sell themselves for shit, and have about as much confidence as a teenage boy with a face full of acne standing in front of his high school crush. Not James “Big Balls” Cameron, who hit a grand slam twice in a row with Titanic and Avatar, each taking the number one worldwide box office crown, and who declared himself the “King of the World.” Can he do it a third time? Time will tell.

So, what mega sci-fi franchise is next? Jurassic World? Ha ha. Yeah, right. The franchise has devolved to the point where the 2015 hit was basically a meta joke about itself, and is mostly remembered for its lead actress unrealistically running around in heels the whole time. Seriously, Bryce Dallas Howard, what’s your deal? Even Debbie Reynolds wore flats for some of the dance routines in Singin’ in the Rain. And her dance partner was Gene Kelly. Who was your co-star again? Oh, yeah, this goober:

Source: Parks and Recs (via Den of Geek)

Jurassic Park is a brilliant novel and film, without a doubt. But look at the franchise collectively and you can’t help but feel exhausted by its purely business-driven cynicism. It’s like looking at a giant Excel spreadsheet with a T-Rex pasted in the corner. Pass.

Now what? Transformers? The cinematic equivalent of what a 13-year-old kid would puke up after binging all night on Mountain Dew and Snickers bars? A franchise filled with robot-on-robot action scenes so messy and convoluted you need shock therapy to recover after seeing them? It’s kind of amazing that a franchise that ran for 11 years only produced one decent film, Bumblebee, the last of the live-action Transformers films since its 2018 release, and the one that made the least money. Which is a bad look for the once-mighty audience of this franchise: when they were finally presented with a sequel of quality, they turned their noses up at it.

Oh, but I’m sure Transformers: Rise of the Beasts is totally going to rejuvenate the franchise when it comes out next year. Can’t wait for that one.

The Matrix? You talk as if there are sequels to the 1999 groundbreaking bullet-time hit. Maybe one day there will be. But for now, we’ll just have to keep waiting.

The Terminator franchise, aka the Old Arnie Show? Even as a big fan of the original two Cameron films, I can’t bring myself to acknowledge the sequels/reboots/remakes/reimaginings/whatever you want to call them, even as fun escapism. And as far as artistry or filmmaking craft go, they fall woefully short of the original and Judgment Day.

It’s not all negative, though. Avatar does have franchise peers in its genre.

The Alien franchise, including Ridley Scott’s two prequels, Prometheus and Alien: Covenant, is probably the closest to Avatar in terms of special effects and directorial vision. Not to mention polarization among audiences, with many still split on the thematically muddled messes those last two Scott films presented. But unlike Avatar, with its colorful, lush world, the Alien films are confined to bleak and nihilistic horror settings. Not exactly a fair comparison. And while the latest films to feature the creature with acid for blood haven’t been knockouts, they’re far better crafted than 99% of what you’ll find in the typical horror/sci-fi genre.

Another peer franchise to Avatar would have to be the new Planet of the Apes films, with the next sequel, Kingdom of the Planet of the Apes, scheduled to premiere in May 2024. But unlike Avatar’s blowout box office numbers, Apes has largely flown under the radar, with Dawn of the Planet of the Apes (the franchise’s highest-grossing film to date) grossing only about a quarter of what Avatar did. It really goes to show how hard it is to find an audience these days, even when you have groundbreaking special effects, great writing, a classic IP, and a strong cast, which the Apes films have in spades.

Will Avatar: The Way of Water become another smash hit like the original when it comes out on December 15? There’s no way to know for sure. But I do know that it’s one of the few films I’ve actually been looking forward to seeing for a long time. And I’m damn sure I’m not the only one who feels that way.

Put against its rivals, Avatar stands head and shoulders above most of its competition in every categorical measure. Special effects, story, cast, box office draw, vision, and yes, even impact and legacy. Who exactly is still talking about Transformers? No one’s even discussing Jurassic World: Dominion, and that movie came out this year. But everyone will be talking about the new Avatar sequel. Not to mention seeing it again and again.

I love Avatar and haters can choke.

Did Humanity Peak in the Late ‘90s-Early 2000s?

Source: Screenshot of ZubyMusic Twitter.

Or is this just pedestalization of the past?

I’ve followed ZubyMusic on Twitter for almost three years now, at least since around 2019.

If you’re not familiar with the artist, “Zuby,” short for Nzube Olisaebuka Udezue, is a 36-year-old English rapper educated at Oxford University, with a substantial and growing worldwide fan base. Known mostly for his music, he’s also a strong conservative voice, often criticizing identity politics, and a Christian. He’s self-released three albums, and he has a podcast and a YouTube channel.

I’m not a Christian myself, nor do I listen to rap. In fact, I’ve never once even listened to Zuby’s music, as I think “Christian” and “rap” sounds about as cringe as almost anything the “Christian” world tries to attach itself to in the secular realm in order to be hip and relevant. Christian comedy. Christian rock. Christian movies. Ugh. No, thanks.

Still, I like Zuby because he often posts interesting and thought-provoking tweets. Even if I don’t always agree, it’s nice to get a different perspective on current events, especially on Twitter. It’s funny how conservatism is actually quite mainstream and common in everyday life, yet online it’s seen as odd and “alternative,” with liberalism and left-wing politics treated as the default. In reality, it’s much more evenly split.

Last October Zuby tweeted the comment above, which I frankly dismissed almost immediately. There’s a temptation to glamorize one’s youth, seeing it as some bygone golden age. Zuby, born in 1986, would have had his most formative childhood years in the ’90s, and been a teen for the first half of the ’00s. I remember Jon Stewart on The Daily Show saying something like “you don’t miss that era, you just miss being a carefree child” in response to a not-yet-disgraced Bill O’Reilly saying he felt the 1950s (the decade of O’Reilly’s youth) constituted America’s best years. Politically, Stewart and I are quite opposed, though I have to admit the guy could be pretty insightful at times.

Nostalgia-gazing is particularly characteristic of the right wing. And while it’s soothing and addictive, it’s also as pointless and counter-productive as the left’s own habit of fantasizing about future utopias. Neither side seems to want to deal with the here and now, preferring to longingly await a DeLorean to whisk them away to another timeline. No wonder things remain such a mess, when both sides abdicate their responsibility in the present.

Then this morning I was reminded of Zuby’s tweet by Nick Sherwood, author of The Social Virus: Social Media’s Psychological and Social Impact on America (And What We Can Do About It). He posted a series of tweets articulating why he feels Zuby is correct.

Source: Screenshot of N. Sherwood’s tweet.

The above was followed by a long thread of reasons and supporting evidence, some of which I thought had credence. Others I found questionable. And by “others,” I mean most. And by “questionable,” I mean mostly B.S.

To begin, I don’t think it’s possible to declare any particular era in human history a “peak” at all, given that so many cultures and nations around the world are undergoing vastly different experiences than others, both positive and negative.

If we’re talking strictly the Western world (America and Western Europe), one could make the argument the late ’90s to early 2000s certainly wasn’t a bad era. The Cold War had ended, and the economy and job market were strong. But that’s looking at things from the macro view. For someone working a cash register in a small town in Idaho, was their life any better or worse, or much different for that matter, than ten years prior?

Sherwood continues:

Source: Screenshot of Sherwood’s Tweet.

I agree with the first half of the second sentence, if by “progress” we mean technological and social progress. No doubt the ’90s was an era of progress. But so were the ’80s, the ’70s, and almost every decade before, at least in America and other parts of the world. “Progress” is also subjective. No doubt Lenin and Stalin considered their Communist Revolution in Russia “progress.” But was it? Big doubt.

The second half of the statement is basically meaningless. How do you even measure levels of overindulgence and entitlement? These are aspects of human nature, and I don’t think humanity has evolved much, if at all, in just the past 25 years. So I’d say there’s a good chance that we’re seeing the same levels of indulgence and entitlement now that we saw a quarter-century ago. Maybe now it’s just more visible due to social media.

Moving on to his next points:

Source: Screenshot of N. Sherwood’s tweet.

Sherwood seems to posit that the late ’90s/early 2000s comprised some kind of Goldilocks “sweet spot” era in which we had just the “right amount” of technology. Not so much that it became omnipresent, like the smartphone in everyone’s pocket, but just enough that it worked in the background.

Again, this is highly subjective. One man’s too much technology is another man’s not enough. I can certainly remember people fixating on computers even as far back as the mid-90s, when the internet became more accessible to the mainstream.

Infrastructurally speaking, we’ve been dependent on computers since probably the 1960s. Almost all of our telecommunications, major medical equipment, civil defense systems, and so on depend on computers and microchips.

If we’re talking about how the ’90s marked the beginning of computers separating people into their own bubbles as everything went digital, there’s an argument for that. I do think people were more socially fluid back then than they are now. Younger generations today can’t seem to communicate effectively unless it’s through a screen. It was Millennials, after all, who popularized “ghosting.” When people are reduced to simple online avatars, it’s much easier to dismiss their humanity and snap them out of your existence. People today shy away from conflict more readily, and terms like “social anxiety” are prevalent.

Source: Screenshot of Sherwood’s tweet.

I wrote for a newspaper as a teen. Had my own column. I also worked in the printing industry for eight years as it transitioned into the digital age. Newspapers are cool, but I wouldn’t single them out as the best, or even necessarily a good, source of information. At least, no more than radio or TV. Local news hasn’t really changed in 25 years, either. Traffic on I-95. Some guy got busted for dealing drugs. A kindergarten teacher retires. A new waffle restaurant just opened. The song remains the same.

It’s true we get hit way more with B.S. news alerts and app notifications. But that’s a simple fix. I either delete a misbehaving app, or don’t turn on notifications at all. The only alerts I get on my phone are from my Medium app, which is actually starting to get on my nerves.

But again, Sherwood is really making more of a case against smartphones, and by extension social media, and not so much a case for the ‘90s/2000s being some golden era. You can’t just argue in the negative. Smartphones didn’t exist during the Bubonic Plague in Europe either, and I don’t think anyone would argue those were good times. Not unless they’re some hardcore “survival of the fittest” Darwinist fanatic, or something.

Source: Screenshot of Sherwood’s tweet.

What?! Has this guy not heard of the John Birch Society, which handed out leaflets and pamphlets pandering to very specific and extreme right wing beliefs WAY back in the ’50s and ‘60s?

Or The Daily Worker newspaper, published by the Communist Party USA back in the 1920s?

Or Bop magazine, delivering steamy servings of teen heartthrobs like Jonathan Taylor Thomas, Johnny Depp, and Jonathan Brandis?

Hmmm…if the ’90s was peak anything, it was Peak Hot Guys Named John.

Source: Screenshot of Sherwood’s tweet.

No matter how many streaming or cable channel options exist, there are effectively only a small number that any one person will ever regularly watch, as there is only so much attention one can give, and limited time.

And why is the expansion of entertainment media necessarily a bad thing? You wouldn’t say the same about the millions of books that have been printed in the last few hundred years. So why would TV shows and movies be any different? There being five million Star Wars movies/shows/books/toys is annoying to me, yes, but it’s not like it ruins the quality of my life. I just ignore it, like anyone older than twelve who possesses a frontal lobe should.

Source: Screenshot of Sherwood’s tweet.

Ah, so media is only “good” if EVERYONE is watching, so they can discuss it the next morning around the water cooler. Got it. That being the case, I guess the daily state broadcasts North Korea puts out to all its slaves, er, “citizens,” must be of the highest excellence. I’m sure KCTV fosters something a bit more than a “semblance of monoculture.”

It’s true that much of pop culture and media is fractured among varying demographics and audiences. But that’s always been the case. I can remember my friends and I discussing how freaking awesome the T-1000 was in the school cafeteria the year Terminator 2: Judgment Day came out, only to get blank stares from the girls, who were busy talking about Beauty and the Beast. Then going home and being told by my step-dad to shut up about “Turdinator” while he watched a re-run of Welcome Back, Kotter. Then running to my mom to whine that her husband had insulted my hero Arnold, only for her to shut the door in my face so she could watch Knots Landing.

Like that South Park videogame, it’s always been a fractured but whole, Sherwood.

Monoculture is a myth. No matter how big a movie is, it’s likely that not even three percent of the world’s population will see it. Take Avatar, the highest-grossing movie of all time (not adjusted for inflation), at almost $3 billion in global ticket sales. If the average movie ticket in 2009, the year Avatar premiered, was $7.50, then at most 400,000,000 people saw James Cameron’s remake of Fern Gully in theaters, out of around 7 billion people on the planet. Except that number doesn’t account for the people who went to see the movie repeatedly, or the fact that many paid far more to see it in glorious 3D. If you cut that number in half to 200,000,000, then only about 3% of the world’s population saw Avatar. Even if you double it back to 6%, that’s still pitifully low in the grand scheme of things. And that’s the biggest movie ever released.
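If you want to sanity-check that napkin math yourself, here’s a minimal sketch in Python using the same rough assumptions as above (a roughly $3 billion gross, a $7.50 average 2009 ticket, a world population of about 7 billion, and the same crude halving for repeat viewings and pricier 3D tickets), none of which are precise figures:

```python
# Back-of-the-envelope check on the Avatar "monoculture" math above.
# All inputs are the rough assumptions from the paragraph, not hard data.

gross = 3_000_000_000        # approximate worldwide gross, in USD
avg_ticket = 7.50            # assumed average 2009 ticket price, in USD
world_pop = 7_000_000_000    # rough 2009 world population

max_admissions = gross / avg_ticket   # upper bound on tickets sold (~400 million)
unique_viewers = max_admissions / 2   # crude haircut for repeat viewings and 3D premiums

print(f"Max admissions: {max_admissions:,.0f}")                                     # 400,000,000
print(f"Upper-bound share of world population: {max_admissions / world_pop:.1%}")   # 5.7%
print(f"Share after the repeat-viewing haircut: {unique_viewers / world_pop:.1%}")  # 2.9%
```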

To put that in perspective, the biggest religion in the world, according to the Pew Research Center, is Christianity, and it hasn’t even cracked 1/3 of the global population with its 2.2 billion followers.

Source: Screenshot of Sherwood’s tweet.

No, Chapter 1 of the internet was “How Much Freaking Longer is This Thing Going to Take to Log On, Goddammit!” With the sub-chapter “Don’t Use the Phone I’m on AIM Right Now!” Chapter Two was “When Are We Getting Broadband, Everyone Else Has It Now!”

The internet sucked 98% of the time back in the ’90s. It wasn’t cool. It wasn’t awesome. You didn’t find anything “fresh.” It was where you IM’d your friends from school until some creep found your teen chat room and tried to have cybersex with you. There’s a reason To Catch a Predator came out in the mid-2000s, right after the supposed “golden age” of the internet. It’s because the world wide web, with its anonymity and wild-west novelty, empowered a lot of perverts in the early days.

The internet was also a place for piracy. Remember Napster, which single-handedly almost destroyed the entire music industry? “I Love the ’90s” my ass, especially if you played in a band named Metallica.

The internet was weird, distrusted, seen by some as a fleeting fad, buggy, slow, mostly useless, and the driver of the dot-com meltdown. Saying the internet was “cool” back then, before high-speed connections and regulation, is like saying bloodletting was cool before modern medicine discovered viruses and bacteria.

Source: Screenshot of Sherwood’s tweet.

Ah yes, that wonderful period in the late ’90s and early 2000s when politicians never pandered for votes, didn’t treat those across the aisle like horrid zombies, and joined arms as fellow Americans. Back then we didn’t have contested elections, or impeachment trials, or “vast right wing conspiracies,” or third party presidential runs conducted by eccentric billionaires. Politicians didn’t lie. They never even used foul language. Certainly they didn’t have affairs with interns, or cheat on their cancer-stricken wives. Or invade countries based on false claims of weapons of mass destruction. None of that ever happened.

Source: Screenshot of Sherwood’s tweet.

If kids growing up and maturing sooner is your benchmark for the golden years, then you’d have to look way past the ’90s. Back to, say, during WWII, when kids lied about their age so they could go to war.

Take the case of Calvin Graham, for instance. Born in Canton, TX, Graham signed up for the U.S. Navy after the bombing of Pearl Harbor at 12 years old. He’d later be wounded by shrapnel at the Naval Battle of Guadalcanal, for which he received the Bronze Star and the Purple Heart. Graham would eventually get booted from the Navy after attending his grandmother’s funeral without permission. Get married at age 14. Become a father a year later. Get divorced at 17. Then join the Marine Corps at 17 to serve during the Korean War. Then break his back in 1951 after falling off a pier.

Look at that. Two wars. Two branches of the military. Married and divorced. Has a kid. And even gets his first case of workman’s comp. All before most kids even learn how to shave.

Sorry, kids were not free-roaming Mad Max badasses in the ’90s. They were mostly soft, squishy, sticky bags of shit. Eating Lucky Charms, Pop Tarts, and Ellio’s Pizza. Capable only of Nintendo marathons, watching Saturday morning cartoons, remembering the Konami code, and making fun of Michael Jackson’s face.

I don’t know what causes people to glamorize and pedestalize the past. Nostalgia has practically become its own genre now, with Hollywood dumping ‘80s-inspired crap like Stranger Things on us constantly like Nickelodeon slime. I remember the ’80s, man. I was a kid then, too. Well, mostly I remember watching TV and movies during the late ’80s, and not having to worry about a whole hell of a lot. What do you mean the Russkies could drop a nuke on us any moment? I don’t care, I’m watching Inspector Gadget here and drinking chocolate milk.

For sure, sometimes I miss not having any responsibility other than deciding what kind of dinosaur I want to be for Halloween. But it’s kind of ridiculous and suspect to declare any particular era “humanity’s peak” when it just so happens to coincide with your childhood. It almost sounds like indulgence and entitlement, come to think of it.

Think of it this way. Right now there’s a horrible war going on in Ukraine. It’s the worst of times for anyone who lives there. But somewhere in Colorado, Florida, Canada, or maybe even Japan, some kid is having the time of his life. He’ll grow up thinking it never got any better than the late 2010s and early 2020s. The ’90s and early 2000s will be as foreign to him as the ’60s and ’70s are to a Millennial or Gen Z’er.

And you know what? He’ll probably be right. At least he didn’t have to deal with the Macarena.

Should the Voting Age Be Raised to 28?

Source: Screenshot of Peter Schiff’s Twitter

Is voting an inherent right? Or is it something that should be “earned” with maturity?

I was scrolling through Twitter on Election Day afternoon when I came across Peter Schiff’s tweet, screenshotted above.

If you aren’t familiar with Schiff, he’s a popular gold bug, media commentator, and CEO of Euro Pacific Capital. He famously hates Bitcoin, considers all digital currencies Ponzi schemes, and is often regarded as an economic “doom and gloomer.”

He’s 59 years old, lives in Puerto Rico for the tax benefits, loves gold, has been warning of an imminent global economic collapse for almost two decades now, and favors fiscal conservatism.

He’s the quintessential Boomer’s Boomer.

I like Peter Schiff somewhat. I think he’s right on many things. Not Bitcoin. But I mostly agree with his overall ethos.

Which is why his tweet on voting yesterday afternoon got me thinking.

Admittedly, the knee-jerk response to his proposal to raise the voting age to 28 is a resounding “No!” It seems preposterous on its face. How dare you suggest taking the right to vote away from people who are old enough to join the military and die for their country.

You can drink at 21. You can sign up for six figures of student loan debt at 18, even if you’re going to a posh private art school to learn fingerpainting. You pay taxes even when you’re still a minor. You can sign business deals at 18.

So why should you not be able to vote starting at 18?

I can remember graduating high school in the thick of the 2000 presidential election, and actively looking forward to casting my first ballot. I even volunteered to work for the GOP on a street reconstruction project, as the convention was being held in Philadelphia, where I lived. The idea of being able to have even a very small say in who ran the government was an exciting prospect for me as a newly-minted adult.

Of course, that election became infamous for being undecided until December, hung up by “hanging chads” in Florida. George W. Bush slipped through with a razor-thin margin of victory, thanks in part to a Supreme Court ruling to stop the recount process. It was a cold-plunge initiation for me into the oftentimes crazy democratic process.

Schiff’s proposal may sound anti-democratic on its face. But I don’t think it’s that simple. It’s about applying more standards to a democratic practice Schiff feels is sacred, beyond the incidental component of age. You could argue it puts more of a premium on democracy. Something freely given is rarely valued as much as something earned, after all.

We apply standards to nearly everything in life. You have to pass a test to obtain a driver’s license, and you must abide by the rules of the road if you intend to keep your license. You have to apply to college, and pass your classes if you expect to graduate. You have to show up to work on time and do the job if you want to stay employed.

So why not apply stricter standards to voting?

At its core is the idea that those with a bigger stake in society should have a bigger say in how it runs. Why should the middle-aged father or mother of two kids, who owns a house, pays property taxes, and works two jobs, have no greater say in who governs them than the 19-year-old unemployed college student living in their basement?

Schiff articulates another angle of his argument here, in response to a tweet:

Source: Screenshot of Peter Schiff’s Twitter

I disagree with Schiff’s assumption that older voters necessarily make for better government. Some of the Founding Fathers were in their mid-to-late 20s and early 30s during the American Revolution, and the average age of the delegates at the Constitutional Convention was 42. If anything, Schiff’s argument puts a premium on middle age; at 59, he inadvertently undercuts his own age group.

Nor do I think that just because someone has kids or a mortgage, they’re more qualified to vote than someone who doesn’t. Much less that they’re more mature. There are plenty of dumb parents and irresponsible people who got conned into bad mortgages. And there are also plenty of savvy, wise-beyond-their-years young people well under Schiff’s critical age of 28.

Schiff’s stance is likely a recipe for stagnation. And I’d be remiss not to point out that underlying his argument is the desire to filter out younger voters because so many of them vote for socialist policies that raise taxes on people like Schiff, and on workers (like myself) in general. Schiff’s proposal is more about protecting his wealth than about advocating some more pristine version of democracy.

And he’s not wrong about wanting to do that. As someone who spent years working in the harsh North Dakota oilfields to obtain some measure of financial freedom, I abhor the idea of a bunch of freeloaders coming along and helping themselves to my money out of some half-baked notion of “equity.”

But then again, it’s not really the young, socialist voters who are the biggest threat to an investor’s net worth. Bad Federal Reserve policy, which led it to print too much money, has now produced spiraling inflation, which has helped crater the stock market and the economy. And I don’t see too many young faces sitting on that banking board. So much for the wisdom and maturity that supposedly come with age.

Schiff is right, however, to want to apply stricter standards to the power and privilege of voting. I don’t think voting should be a free-for-all. Otherwise you run the risk of mob rule. Voting should be regarded as an important duty, given to those who have proven they care about this country and have a vested interest in securing their community.

But is age the best way to apply the standard of civic responsibility?

I can think of better metrics. Living independently. Paying your own bills. Being free of consumer debt. Being gainfully employed or financially secure. Not having a criminal record. All things that aren’t necessarily age-related.

I do think you should have to prove that you aren’t a burden on society or dependent on others (outside of factors beyond your control, like physical disabilities) if you intend to have a say, via voting, in how it runs. Voting is something that should be earned, rather than simply handed out by virtue of reaching a certain magical age.