The God Tool
How the Tech Industry Turned a Lamborghini of Potential Into a Donkey Cart Hauling Manure on Concrete Wheels
I fucking hate computers. Every time I sit down to work, my first task isn’t logging in—it’s praying that nothing breaks before I get past the welcome screen. These days, successfully logging in feels like getting your coffee right before walking out the door: it sets the tone. But in tech, the ritual goes far beyond coffee. First it’s the password. Then 2FA. Then the fingerprint. Then third-party verification. Then standing on your head and patting your stomach while the authentication gods decide if you’re worthy. And even if you pass all that, you’re still not working—you’re waiting to see if email loads, if your calendar’s been scrambled, if Teams opens without a seizure.
When everything does load, you let yourself exhale—until the screen freezes, stutters, and collapses into a full blue-screen-of-death. Like a slot machine that almost hit the jackpot, then hit you in the face instead. Restart. Re-authenticate. Re-enter the pit. When things finally stabilize, what you’re met with isn’t clarity. It’s chaos. Emails full of spam, scams, and “gotcha” phishing tests from the cybersecurity team. Teams? Less like a communication tool and more like a digital landfill—overflowing threads, missed messages, half-formed ideas rotting in open view. There’s information in there somewhere. But it’s buried. Weaponized inefficiency masquerading as productivity. And the worst part? This isn’t a bug. It’s the design.
Our systems of productivity are a hodge-podge of services built around great marketing, clean aesthetics, and a philosophy of “should”—all grounded in duct-taped hope. The vendors do a great job pitching the dream: that SharePoint, Teams, Slack, Zoom, Jira, Outlook, MinIO, Dropbox, OneNote, OneDrive, and Google Drive are all part of a seamless ecosystem that integrates human patterning, enhances productivity, increases customer satisfaction, and ultimately boosts shareholder value. It all sounds compelling—until you try to use it. The gap between what these tools promise and what they deliver isn’t just a technical failure—it’s the natural outcome of how we’ve chosen to deploy technology: with misplaced priorities, commercial incentives, and a complete disregard for the psychological and social cost.
Mental Harm
Targeted ads are the uninvited company that shows up in the RV for a two-week stay. Not only does their arrival disrupt whatever focus you had left, but trying to expel the intrusion requires a mental flush that erases memory you’ll never get back. Every hover box closed, every "speed-scroll" dodge through nonsense in your feed adds up. And somehow—without you saying a word—they know you were looking for a pre-trimmed, nylon, bank-robber head stocking. These aren’t random coincidences. They’re the result of behavioral profiling systems tuned to your vulnerabilities. You think you're browsing—but you're being watched, shaped, and sold in real-time. What you're seeing isn't advertising—it's influence, and it works better than you'd like to admit.
And it goes deeper than click-tracking. They’re not just logging where you go or which ad led to a click-through—they’re logging how long your scroll hesitates. They’re measuring eye movement, not just engagement. Your "off" camera? It's not really off. They can use it to detect where your gaze lands on a page, who’s in the room with you, and what recognizable objects are in frame—using shape detection, ambient sound signatures, and sensor feedback. Accelerometers, microphones, GPS, biometric data, proximity sensors—it's all fair game. Your device doesn’t just know what you’re doing. It knows how, when, where, and increasingly, why. This isn’t personalization. It’s surveillance dressed up as convenience.
These systems weren’t designed to be helpful. They were designed to be hard to put down. Infinite scroll, autoplay, like counters, refresh mechanics—all of it built to keep your brain chasing a reward that rarely arrives. It’s not subtle. It’s casino logic: randomized gratification to keep you in the chair. And just like a slot machine, the longer you stay, the more you lose—time, attention, presence, clarity. And here’s the part no one wants to say out loud: none of these features actually help you. They don’t make you smarter, calmer, more productive, or more connected. They help one thing—the information extraction machine. You’re not being served. You’re being mined.

The apps just dress like chiclets—bright, rounded, friendly icons hiding compulsive loops beneath. TikTok, Instagram, YouTube Shorts—they don’t serve content, they drip-feed stimulation. And just when your brain starts to fatigue, it gets a new hit. The damage doesn’t just show up in productivity metrics. It shows up in the growing number of people who can no longer read a full paragraph, sit in silence, or finish a thought without checking their phone. And to make sure you never break the cycle, the system dings, vibrates, and flashes—reminding you that you’re not in control. You’re on a leash.
The charisma machine is in your hand. With a few swipes, a filter can make you J-Lo pretty. You can crop out the chaos, retouch the blemishes, and post the stills that make it seem like you’re living the glamorous life—even if you're knee-deep in debt, loneliness, or dysfunction. Over time, this curated self starts to replace the real one. You don’t just clean up your photos—you start sculpting your behavior, your opinions, your voice to fit the performance. And it doesn’t stop with appearance. Social media rewards the extremes: the hottest takes, the loudest breakdowns, the slickest highlight reels. Authenticity becomes a liability, vulnerability becomes content, and real human connection gets replaced by engagement metrics.
What we end up presenting isn’t our ego—or even our id. It’s not our higher self or shadow self. It’s a hollowed-out digital caricature—flattened, stylized, and trained to chase likes instead of meaning. It mimics the alter ego but lacks the soul. It wears the costume of identity but is detached from experience, context, consequence. And the more we feed it, the more it feeds on us.
True story: my ex-wife used to try to talk me into letting the kids buy clothes for their online characters. The first time I heard the request, I was pissed. Who the fuck buys fake clothes you can’t wear? What’s going to happen to my $15 when you buy your cartoon a jacket? But one day, I saw the charges on the bank statement. I was fuming. She’d let them do it. They bought the fake clothes for the fake characters—and I knew we’d crossed a line. And yeah, I got sucked in. I started telling myself this was the “new way.” Digital currency, online expression, evolving norms. Fucking bullshit. I gave in. That’s how it happens. Not through logic—but exhaustion.
Now we’ve got a generation raised to believe value is virtual, ownership is cosmetic, and identity can be built one skin at a time. We’re not teaching them how to live in the real world—we’re training them to decorate a simulated one. These aren't just microtransactions. They're psychological training wheels. They teach dependency, not agency. Appearance over substance. Symbols over reality. And worst of all, they teach that status doesn’t have to be earned—it can be bought. In the physical world, effort produces tangible rewards. In the digital one, the only thing produced is corruption of the soul. One of those systems has value. The other is a lie.
Environmental Harm
Every click costs something. Not metaphorically—literally. The convenience we take for granted runs on an industrial-scale engine that most people will never see. Every email, AI query, photo backup, or cloud sync burns electricity and drains water. Data centers pull megawatts of power and millions of gallons of water to stay cool. AI isn’t magic—it’s a glutton. The more we ask machines to think for us, the more we burn through the planet to keep them fed. You think you're just checking your calendar. You're boiling a reservoir to do it.
And if we weren’t wasting all that power on filters, memes, and synthetic bullshit—what could we be doing with it? That energy could desalinate water for drought-stricken cities. Power hospitals. Charge public infrastructure. Light up the towns that still go dark after sunset. The power is real. The waste is real. The tragedy is what we’re wasting it on.
They say the rising cost of electricity is just a percent or two each year. What they don’t say is that those increases are stacked on top of a pricing model designed to extract, not serve. In a post–Citizens United world of corporate personhood, utility executives answer to shareholders, not to the communities they serve. So when Big Tech plugs in a new data farm or expands an AI model, the cost of expansion gets offloaded—onto you. Your rates rise, not because electrons are scarce, but because capital needs growth.
Even when you use less power, your bill doesn’t drop. It can’t. Because you’re not the customer anymore. You’re the product. The system isn’t broken. It’s performing exactly as it was redesigned to—to keep you paying more for less, forever.
When the machines break down or become obsolete, the story doesn’t end—it gets shipped overseas. Recycling is the lie we tell ourselves to feel clean about dumping waste on the poor. “E-waste processing” often means open-air burning, unprotected dismantling, and lead poisoning in countries we pretend not to think about. We export our guilt and call it sustainability. We send trash disguised as opportunity and pretend it's charity.
Every time a corporation brags about “green practices,” what they’re really saying is: we’re not doing the damage here. Out of sight. Out of mind. Out of moral responsibility.
Before that device was in your hand, it came out of a mine—often dug by slaves. Cobalt from the Congo. Lithium from South America. Rare earth metals from regions stripped of sovereignty, safety, and soil. Our addiction to upgrades feeds a global economy of exploitation. These aren’t just minerals. They’re blood-soaked building blocks.
And when the next phone comes out, we throw the old one away and do it again. This isn’t consumption. It’s cannibalism. And the tech industry has made it look like progress.
Social Harm
I guess it was naive of me to think there were no slaves tied to the logistics trail of tech manufacturing. I wanted to believe that working in a forward-looking industry meant leaving the horror and legacy of the past behind. Surely something as cutting-edge as AI, quantum computing, and cloud infrastructure couldn’t still have blood in its circuitry. But that was wishful thinking. Even more naive was the hope that slavery was confined to tech. Because if slavery still exists—and it does—it can be tied to any industry. But which is more insidious? The one that’s always been rooted in labor exploitation, like agriculture or textiles? Or the one that pretends to be solving humanity’s future while quietly dragging the oldest sin into the digital age?
You don’t have to look hard to find the bodies. Cobalt, essential for lithium-ion batteries, is mined in the Democratic Republic of Congo, often by children, often by hand, often without protection. Competition for control of these mineral-rich regions has led to open conflict, and with that comes atrocities most people can’t bear to imagine. Rape, mutilation, forced child conscription, and yes—even cannibalism—have all been documented in the wake of mining violence. This isn’t just labor abuse—it’s industrialized suffering.

In China, reports of Uyghur forced labor tied to tech component factories have been traced through major supply chains, while companies scramble to deny responsibility behind layers of contractors and legal insulation. Apple, Samsung, Tesla—none of them are clean. At Foxconn’s factories in China, workers assembling iPhones were driven to such despair that nets had to be installed to stop suicides. In places where regulation is weak and profit margins are sacred, the human cost is simply priced in. You don’t see it on the box. But it’s in the box.
Tech doesn’t democratize—it divides. The digital world pretends to be open to everyone, but it’s tiered by design. Wealthy users get premium security, responsive support, faster speeds, and devices that last. Poor users get surveillance, throttling, pre-installed bloatware, and platforms engineered to extract attention and data. In schools, kids in wealthy districts get iPads and coding classes. In poor ones, they get outdated Chromebooks with monitoring software that tracks every keystroke. We call it “access.” What it really is, is digital caste. The poor don’t just get worse tools—they become the product the tools are built to farm. Privacy, autonomy, even basic functionality—all of it is a privilege now.
This isn’t just a class issue—it’s geopolitical. In some parts of the world, tech giants don’t just influence the infrastructure—they are the infrastructure. In Myanmar, Facebook became synonymous with the internet. Entire communities only accessed the online world through Facebook’s “free internet” initiative. And what did they get? Not education. Not opportunity. They got an algorithm that amplified racial hatred, conspiracy theories, and calls for ethnic cleansing. What was framed as a donation—a gift to the disconnected—helped fuel a genocide. Facebook didn’t build a bridge. It handed a loaded weapon to a volatile regime and turned away. This is what it means to export digital tools without accountability. Not just bad outcomes—mass graves.
What is my time now? It’s fighting Outlook to make sure it isn’t hiding a message I actually need—or worse, trying to recover an important email I accidentally dragged into the void with my monkey-like hands and now have to spend an hour chasing through twenty folders. It’s troubleshooting authentication. Permissions. Bandwidth limitations. Firewall rules. Mandatory reboots for updates. Mandatory reboots to uninstall the update that broke everything. Mandatory reboots to reinstall the update that’s supposed to fix the update that broke the update. It’s requesting group inclusion, requesting distro list removal, turning off the dings and pings—then doing it again after the latest app update resets the preferences I already set. It’s migrating from Teams to Slack for collaboration, only to find out we’re now using both, because someone in a meeting said “agile.” This isn’t productivity. This is unmanagement. A treadmill of digital maintenance work dressed up as contribution.
This is where the hours go. Not into creation. Not into mastery. Not into anything that strengthens a soul. Just… into the system. Time isn’t just being stolen—it’s being ground into dust.
Conclusion
The god tool still exists. The machine still works. The circuits still hum. But we no longer control what it’s doing—or who it’s doing it for. The problem isn’t the code. It’s the compass. We deployed this system with the wrong values, under the wrong incentives, for the benefit of the wrong people. And now we’re living inside its consequences—mentally fragmented, environmentally drained, socially hollowed.
It hasn’t just hollowed out the individual spirit—it’s drained the spirit of entire cities. San Francisco, once a symbol of innovation and rebellion, now bleeds under policies crafted to please the very tech elite who fled as soon as the chaos touched their doorstep. Open-air drug markets, tent cities, and algorithm-driven policing didn’t appear out of nowhere—they followed the logic of “optimization.” In Myanmar, that same logic helped feed a genocide. In Memphis, it manifests in generational economic abandonment and digital redlining—another layer of systemic disregard baked into the same machinery. And all the while, our collective intellect erodes. Books go unread. Attention spans atrophy. The very wiring of our minds is being overwritten. Not by accident—but by design.
The tragedy isn’t that we failed to use the tool wisely.
It’s that we never intended to.
The harm was always a feature.
The breakdown was always the business model.
Unplug with intention. Not to retreat, but to regroup. Audit every tool, every platform, every digital ritual in your life. Ask what it gives you—and what it takes. Refuse convenience that costs clarity. Refuse systems that steal your time, gut your privacy, and sell your children hollow dreams wrapped in dopamine.
Build what serves. Burn what doesn’t.
Because if we don’t reclaim the tool,
it will keep shaping the world in its image.
And if that doesn’t scare you,
you haven’t been paying attention.
I use AI to help me refine structure, clarify my points, and punch through the noise—but the ideas, the voice, and the fire are all mine. If you want raw thoughts, read my notebook. If you want the sharpened blade, you're holding it.