
Instability


🔗 a linked post to robinrendle.com » — originally shared here on

The whole point of the web is that we’re not supposed to be dependent on any one company or person or community to make it all work and the only reason why we trusted Google is because the analytics money flowed in our direction. Now that it doesn’t, the whole internet feels unstable. As if all these websites and publishers had set up shop perilously on the edge of an active volcano.

But that instability was always there.

The only social network I post on anymore is LinkedIn. I have close to 2,000 followers there.

Lately, I’ve noticed that the “engagement” on my posts is increasingly sparse. Earlier this year, I was routinely seeing thousands of views per post. These days, I’m only seeing hundreds, and when it comes to sharing links to my newsletter, I’m seeing only dozens.

Meanwhile, here on my ragtag blog, I know my thoughts end up reaching the people who matter most to me.

That’s certainly fewer than the 2,000 people who follow me on LinkedIn, and substantially fewer than the tens of thousands of people a week who “engage” with my “content”1 there… but I don’t care.

By posting here, I’m taking the harder route of building an audience without the flashy shortcuts promised by platforms like LinkedIn and Google.

Whenever I try to take shortcuts and play SEO games, I end up doing things to my website which make it feel less authentic.

And these days, I find myself asking, “what exactly do I need to take a shortcut for?”

Robin also quotes this piece by Jeremy Keith where he discusses our need for human curation:

I want a web that empowers people to connect with other people they trust, without any intermediary gatekeepers.


The evangelists of large language models (who may coincidentally have invested heavily in the technology) like to proclaim that a slop-filled future is inevitable, as though we have no choice, as though we must simply accept enshittification as though it were a force of nature.

But we can always walk away.

It’s tough to walk away from the big tech companies, but I can assure you it is possible.

Facebook used to dominate my daily existence, but aside from perhaps Marketplace for selling my junk, I haven’t missed any of Meta’s properties since I left several years back.

Google was my portal to my email, search, and maps for years. In the past few years, I have switched to primarily using Fastmail, Ecosia, and Apple Maps. Here in 2024, they all work well.2

I do my best to avoid ordering stuff off of Amazon, and I hardly stream anything on Netflix anymore.3

I haven’t made the move over to the Light Phone yet, and I find it hard to believe that I’ll give up my Apple Watch, Apple TV, or iPad/Macs… but I do find myself questioning the prolific presence of Apple in my life more often than I did, say, ten years ago.

As I continue to experiment with LLMs, I’ve noticed that locally run, open source models are getting closer to the performance you see in closed source models like GPT-4o and Claude 3.5 Sonnet. It’s only a matter of time before they’re good enough to handle the tasks I find myself turning to ChatGPT for today.
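
Part of what makes that switch feel low-stakes is that most local model servers speak the same API as the hosted ones. Here’s a minimal sketch, assuming something like Ollama is running locally and exposing its OpenAI-compatible endpoint; the model name, port, and prompt are placeholders for illustration, not a recommendation:

```python
# A minimal sketch of calling a locally hosted, open source model through an
# OpenAI-compatible endpoint (Ollama exposes one at /v1 by default).
# The model name, port, and prompt below are assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local server instead of api.openai.com
    api_key="not-needed-locally",          # the client requires a value; the local server ignores it
)

response = client.chat.completions.create(
    model="llama3.1",  # whichever open source model you've pulled locally
    messages=[{"role": "user", "content": "Summarize this blog post in two sentences."}],
)

print(response.choices[0].message.content)
```

The rest of the script doesn’t know or care whether the answer came from a local model or a hosted one, which is exactly why walking away gets easier over time.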

Enshittification isn’t inevitable. Like depression, it’s an indicator that something in your digital life needs to change.


  1. Sorry for the obnoxious emphasis on terms like “engagement” and “content”… I’ve reached a point where I feel like those words are meaningless. A lot of the themes of this post can be summed up with trust, and in order to accurately measure engagement, you have to trust that the metrics provided by the platform vendor are accurate (which I do not). And calling our collective knowledge “content” as though it’s the equivalent of feed for cattle also upsets me.

  2. Ecosia’s results are powered by Bing, which traditionally hasn’t been that great, but one upside of Google’s results becoming terrible is that the gap hardly matters anymore. Now both search engines return subpar results, and by using Ecosia, I am at least helping to plant trees. It ain’t much, but it’s honest work.

  3. The last couple of weeks have seen my most Netflix action in years, because I watched Muscles & Mayhem, the American Gladiators documentary, last week, and I highly recommend it. I’m also gonna give the Tour de France documentary a shot.

Continue to the full article



Choose Boring Technology


🔗 a linked post to mcfunley.com » — originally shared here on

I saw this article referenced while reading Bill Mill’s recap of relaunching a website, which in and of itself is a delightful read for those of us who nerd out on large-scale system architectures.

I am almost certain I’ve read Dan’s piece on boring technology before, but I wanted to share it here because it serves as a great reference for those of us who are sick of making bad tech stack decisions for bad reasons.

In particular, the ending here sums up my experience consulting with many different tech teams:

Polyglot programming is sold with the promise that letting developers choose their own tools with complete freedom will make them more effective at solving problems. This is a naive definition of the problems at best, and motivated reasoning at worst. The weight of day-to-day operational toil this creates crushes you to death.

Mindful choice of technology gives engineering minds real freedom: the freedom to contemplate bigger questions. Technology for its own sake is snake oil.

The teams which move the fastest are the ones who are aligned on a vision for what is being built.

Often, these teams hold a “strong opinions, loosely held” mentality where they decide what tools they’ll use, and they’ll use them until they no longer solve the problem at hand.

Put another way: in a business context, experimenting with your tooling is a huge organizational expense that rarely yields a worthwhile return on investment.

Your focus should be on what you are building rather than how you’re building it.

Continue to the full article


Perplexity’s grand theft AI


🔗 a linked post to theverge.com » — originally shared here on

We’ve seen a lot of AI giants engage in questionably legal and arguably unethical practices in order to get the data they want. In order to prove the value of Perplexity to investors, Srinivas built a tool to scrape Twitter by pretending to be an academic researcher using API access for research. “I would call my [fake academic] projects just like Brin Rank and all these kinds of things,” Srinivas told Lex Fridman on the latter’s podcast. I assume “Brin Rank” is a reference to Google co-founder Sergey Brin; to my ear, Srinivas was bragging about how charming and clever his lie was.

I’m not the one who’s telling you the foundation of Perplexity is lying to dodge established principles that hold up the web. Its CEO is. That’s clarifying about the actual value proposition of “answer engines.” Perplexity cannot generate actual information on its own and relies instead on third parties whose policies it abuses. The “answer engine” was developed by people who feel free to lie whenever it is more convenient, and that preference is necessary for how Perplexity works.

So that’s Perplexity’s real innovation here: shattering the foundations of trust that built the internet. The question is if any of its users or investors care.

Well, I sure do care.

Continue to the full article


Selfish


🔗 a linked post to ofdollarsanddata.com » — originally shared here on

As everyone was celebrating and feeling good, I was barely functional. Truthfully, I had never felt closer to death in my life. I’ve done hard workouts before. I know what it’s like to push myself. I’ve been running for over a decade. But what I experienced after crossing that finish line was something else entirely.

And for what? To have a 07:25 pace instead of a 07:30 pace? Remove my two sprints from the race and I come in maybe 30 seconds later. What difference would it have made in my life? None. I don’t win some extra prize by coming in at 25:57 instead of 26:27. 

So why did I do it? Yes, I wanted to push myself. Yes, I wanted to beat my goal. But, ultimately, I did it because I was selfish.

I love a good running analogy.

I heard Derek Sivers make a similar point about biking a few years back. Pacing is an important aspect of a well-lived life.

I also enjoyed this Josh Brown quote he included in this article:

Make yourself useful to smart, successful people. That’s how you should spend the first ten years of your career.

Surround yourself with smart, successful people and then bet on them. That’s how you should spend the next ten years.

Continue to the full article


Get Rid Of The Imposter Syndrome For Good!


🔗 a linked post to goodness-exchange.com » — originally shared here on

When you embrace the idea that, yes, you were lucky, the fear drops away. And then you become more open to the possibility that the universe will continue to guard your back.

Because here is a truth that only a few discover: when you look for signs that the Universe is ‘friendly’ you will find them everywhere.

It is far better to live in a ‘friendly’ universe than an ‘indifferent’ or ‘hostile’ one.

I’m used to treating neutrality as the universe’s default mode, but I hadn’t considered the possibility that a neutral universe can be harnessed in whichever way you want.

As a developer, whenever I see my code work right, I often squint at it in disbelief, wondering what I did wrong and feeling like it’ll break the second I push it to production.

Maybe in those moments where my impostor syndrome is peaking, I should accept the pat on the back from the universe and give it some flowers.

Continue to the full article


Conan O’Brien Doesn’t Matter


🔗 a linked post to nytimes.com » — originally shared here on

I'm a sucker for profiles on people like Conan O'Brien. The way his mind works is endlessly fascinating to me.

What intrigued me about this particular New York Times piece is his observations on agony:

Many comedians see a connection between misery and their ability to be funny, often citing humor as a survival mechanism. But after considerable therapy and reflection, O’Brien has changed his mind. He’s come to believe that not only are they not related at all, but so much stress didn’t help him be funnier. With new eyes, he has set about creating a new story. “Looking back now, I think some of my best ideas came from just goofing around,” he told me.

He points to possibly his most celebrated writing credit: the monorail episode of “The Simpsons,” which many television critics agree is the greatest in the history of the show. He describes its origins in an Olympic Boulevard billboard for a monorail, leading him to write on a legal pad: “Springfield gets a monorail. Homer likes the idea. Marge not so sure. First act: ‘Music Man.’ Second: Irwin Allen parody.”

He brought this pitch to the “Simpsons” office, writers liked it and started adding jokes. “It was like falling off a log,” he said. No agonizing at all.

I have a ton of quotes on the main page of this site1, and one of them is from Eckhart Tolle: "Suffering is necessary until you realize it is unnecessary."

The more I agonize over my own life choices and what's next for me, the more I realize that I just need to let go. It's a constant push/pull: you have to be unabashedly dogged in your pursuit of what you want, but you also need to be chill about it.


  1. Conan is in this rotation twice now, and one of those quotes came from this article, so thanks, Conan! 

Continue to the full article


Security at Startup


🔗 a linked post to vadimkravcenko.com » — originally shared here on

In my opinion, security is one of the most forgotten aspects of software engineering. It rarely gets focused on until it’s too late. Even though at least one incident lands on HackerNews every week where some data gets leaked or someone gets hacked — people still think, “Nobody cares about my little startup.” You might think you're too small to be noticed by the big, evil hackers. Wrong. Size doesn't matter. You're always a target; there’s always data to leak and ways to exploit your business.

This is a great primer for the security-related items you need to consider when you’re building software.

Some takeaways:

First, any human-built product is going to be insecure. Nothing is 100% secure, ever. The best you can do is make the bad guys earn it by making it difficult to break into.

Second, your biggest vulnerabilities are almost always human. You can build Fort Knox, but if I’m able to trick your guard into opening the door for me, then what’s the point?

Third, I’m grateful for frameworks like Ruby on Rails which handle a good chunk of the author’s “step 0” items out of the box. Picking the right tool (and keeping that tool sharpened) is the best first step.

Fourth, there’s never a moment with software when you can dust off your hands and say, “ope, we’re done!”

Security is especially an area in which you can’t sit still. If you build an app and let it sit for a decade without any updates, I can almost guarantee you that there’ll be a vulnerability in one of your dependencies which I could exploit to take over your system.
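
That claim about aging dependencies is easy to check for yourself. Here’s a rough sketch that asks the public OSV.dev vulnerability database whether a pinned dependency has known advisories; the package name and version below are placeholders, and in practice you’d reach for a purpose-built tool like pip-audit, bundler-audit, or Dependabot rather than a hand-rolled script:

```python
# A rough sketch: query the OSV.dev database for known vulnerabilities
# affecting a pinned dependency. The package and version below are
# placeholders; swap in whatever your decade-old app actually depends on.
import json
import urllib.request

def known_vulnerabilities(name: str, version: str, ecosystem: str = "PyPI") -> list[dict]:
    payload = json.dumps({
        "version": version,
        "package": {"name": name, "ecosystem": ecosystem},
    }).encode()
    request = urllib.request.Request(
        "https://api.osv.dev/v1/query",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response).get("vulns", [])

# Example: an old, unpatched pin is very likely to turn something up.
for vuln in known_vulnerabilities("django", "2.2.0"):
    print(vuln["id"], "-", vuln.get("summary", "(no summary)"))
```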

Finally, if you reach a certain size of organization, you need someone thinking about this stuff full time and orchestrating all the pieces needed to keep a secure system.

Continue to the full article


It’s Time to Dismantle the Technopoly


🔗 a linked post to newyorker.com » — originally shared here on

[Techno-selectionism] is a perspective that accepts the idea that innovations can significantly improve our lives but also holds that we can build new things without having to accept every popular invention as inevitable. Techno-selectionists believe that we should continue to encourage and reward people who experiment with what comes next. But they also know that some experiments end up causing more bad than good. Techno-selectionists can be enthusiastic about artificial intelligence, say, while also taking a strong stance on settings where we should block its use. They can marvel at the benefits of the social Internet without surrendering their kids’ mental lives to TikTok.

As much as I personally enjoy hanging out on the cutting edge and experimenting with new technologies, I would consider myself a techno-selectionist when it comes to adopting these tools into our lives.

I am sure some people enjoy the new AI-driven Google search results, but when Google still recommends adding glue to pizza despite the widespread mockery it received initially, maybe we should take a step back and demand better from our techno overlords.

Or, since we know that’ll never happen, maybe we need to decide for ourselves which tools are worth incorporating into our lives.

Continue to the full article


I Will Fucking Piledrive You If You Mention AI Again


🔗 a linked post to ludic.mataroa.blog » — originally shared here on

Consider the fact that most companies are unable to successfully develop and deploy the simplest of CRUD applications on time and under budget. This is a solved problem - with smart people who can collaborate and provide reasonable requirements, a competent team will knock this out of the park every single time, admittedly with some amount of frustration. The clients I work with now are all like this - even if they are totally non-technical, we have a mutual respect for the other party's intelligence, and then we do this crazy thing where we solve problems together. I may not know anything about the nuance of building analytics systems for drug rehabilitation research, but through the power of talking to each other like adults, we somehow solve problems.

But most companies can't do this, because they are operationally and culturally crippled. The median stay for an engineer will be something between one to two years, so the organization suffers from institutional retrograde amnesia. Every so often, some dickhead says something like "Maybe we should revoke the engineering team's remote work privile - whoa, wait, why did all the best engineers leave?". Whenever there is a ransomware attack, it is revealed with clockwork precision that no one has tested the backups for six months and half the legacy systems cannot be resuscitated - something that I have personally seen twice in four fucking years. Do you know how insane that is?

This whole article is a must read.

The main point: with any major leap in technology, there will be hucksters who purport to use the new hotness to solve all your problems.

The problem is that most organizations don't even take the time to solve the already solvable problems that exist within that organization.

New JavaScript frameworks, database software, on-prem versus cloud-based server architecture, containerized systems, blockchain, mobile apps... unless you know how using these tools will solve a problem that your existing tech stack cannot solve, they're nothing more than distractions.

You don't need a garage full of tools to get a job done. Getting the fundamentals right is so much more important than making another trip down to Home Depot to buy your sixth version of a hammer.

Continue to the full article