Sunday, 20 April 2025

Why I Can’t Take Surveys

I don’t take surveys. Not because I don’t want to help, and not because I’m uninterested, but because I understand the intent.

Surveys, in theory, are supposed to gather honest, unbiased feedback. But in practice? They’re often a game of subtle cues and loaded questions. I can see through them. I know what’s being asked, and more importantly, I know why it’s being asked.

When I read a survey question, I don’t just answer it: I analyze it. I think about what the question is trying to elicit, what the organization hopes to hear, how the data might be used, and how my response will contribute to a narrative. My mind doesn’t just respond; it reverse-engineers the intention behind the question.

And because of that, my answers are no longer neutral. They’re no longer spontaneous or unfiltered. They’re shaped by awareness, an awareness that makes it impossible for me to respond without second-guessing everything. That bias, in turn, makes the results of any survey I take unreliable.

Surveys count on participants not overthinking. But I do. Every. Single. Time.

So I skip them. Not out of apathy, but out of respect for the integrity of the data. Because if I can’t answer neutrally, maybe I shouldn’t answer at all.

No One Wants to Look Like a Fool — Except Children, and That’s Why They Learn Faster

Somewhere along the line, we all picked up the idea that being seen as a fool is a fate to avoid at all costs. We bite our tongues in meetings, hesitate to try new things, dodge eye contact when we’re unsure of the answer. It’s not always pride. Often, it’s fear. The fear of being seen. Of being wrong. Of trying and failing in public.

But here’s the thing: children don’t care. Not because they’re braver or wiser — but because they haven’t yet learned the social rules that teach us shame. When a toddler stumbles while learning to walk, no one laughs. When a kid mispronounces a word or draws a very questionable-looking cat, it’s met with smiles, not mockery. They’re not afraid to be seen trying, failing, experimenting, because no one expects them to have it all figured out.

And that’s exactly why they learn so fast.

A child’s world is trial and error. They touch, taste, ask, mimic, mess up, and try again. They don’t worry if their questions sound silly, if they’ve done something “wrong,” or if someone sees them fall down. They don’t yet know what it means to “look foolish,” and even once they start to, they don’t attach the same shame to it that adults do.

Then they grow up. School happens. Social dynamics shift. You get laughed at for the wrong answer. You feel your cheeks burn when your voice cracks during a presentation. You start to hide your curiosity behind a cool facade. You begin waiting until you’re “good enough” before trying something new. The fear of looking like a fool settles in — and with it, the learning slows down.

But here’s the twist: if you want to grow, you have to be willing to look foolish again.

Every new skill has an awkward phase. Every creative idea starts rough. Every first attempt is a bit clunky, a bit embarrassing. But hiding from that discomfort means hiding from progress. The irony is that the people we admire most — the risk-takers, the innovators, the creators — all looked foolish at some point. Many still do. They just got used to it.

So maybe the goal isn’t to avoid looking like a fool. Maybe the goal is to relearn how to be childlike — not childish, but childlike. Curious. Unafraid. Willing to stumble out loud. To take swings. To be seen in the mess of becoming something more.

Because when you drop the fear of looking foolish, you pick up the freedom to grow.

The Illusion of “Facts” in the Age of Opinion

We live in an age where “facts” are at our fingertips. A simple search on Google can bring up a wealth of information, and with it, a sense of certainty about the world around us. But are we truly equipped to discern what is factual, or are we just being fed a curated version of reality?

It’s easy to be swept up in the momentum of a compelling discussion or an engaging panel. As you listen to different perspectives, some ideas will strike you more than others — they appeal to your sense of reason or align with your beliefs. These ideas might seem so convincing, so logical, that you accept them as true without questioning them. The more passionately the speaker defends their viewpoint, the more compelling it becomes, and before you know it, you are advocating for these ideas too, convinced of their truth.

But here’s the catch: often, the “facts” presented in these discussions are anything but. In the world of debates and panel discussions, exaggeration is a standard rhetorical tool. The more a speaker feels their point is being challenged, the more they stretch the truth to hold attention and win over the audience. What starts as a grain of truth can quickly swell into a monumental exaggeration designed to make their viewpoint seem indisputable.

Now, imagine you’re one of the few skeptics in the room, the one who wonders, “Wait, is this really true?” After the discussion ends, you go home and set out to verify the information. Your first instinct is likely to turn to Google — after all, what better place to confirm a fact than the world’s most powerful search engine?

But here’s the second catch: the articles and sources you find online are rarely unbiased or objective. Every writer, journalist, or content creator presents their own perspective, shaped by their experiences, beliefs, and agendas. The articles that pop up when you search for answers are not definitive, neutral statements of truth; they are simply someone’s perspective, written through their own lens. In fact, many writers are motivated by a desire to convince you of a certain point of view, not necessarily to provide a balanced, fact-checked presentation of events.

What happens next is classic confirmation bias. The more you read, the more you find articles that support what you already want to believe. And soon enough, you fall into the trap of assuming that because you found several articles on the same topic, you’ve found “the facts.” But here’s the hard truth: the internet is full of opinions masquerading as facts. The writers you read are presenting their own beliefs, and when you read their work, you are, in essence, adopting their perspectives.

This is why I always remain skeptical of so-called “facts.” In a world where information is so easily manipulated, it’s difficult to separate truth from bias, exaggeration, and opinion. The more I delve into any topic, the more I realize that what I thought was a “fact” might just be someone’s well-argued opinion, carefully crafted to make me believe in it.

So, where does that leave us? How do we navigate a world where “facts” are increasingly subjective?

  1. Question Everything: Always approach information with a critical mindset. Just because something sounds convincing doesn’t mean it’s true. Look for multiple sources, especially ones known for impartiality and fact-checking.

  2. Validate Your Sources: Ask yourself where the information is coming from. Is the writer an expert in the field? Are they citing reputable sources, or just presenting an opinion based on personal experience?

  3. Be Open to Multiple Perspectives: The truth isn’t always black and white. A nuanced view, acknowledging multiple sides of an issue, is often more accurate than a single, exaggerated viewpoint.

  4. Recognize the Power of Persuasion: We are all influenced by persuasive arguments. Being aware of how rhetoric works can help us see through the manipulation and form our own opinions based on evidence, not eloquence.

At the end of the day, the pursuit of truth is a messy, complicated process. But it’s a process worth undertaking. The next time you come across a “fact,” take a moment to question it. Verify it. Don’t just take it at face value — because in this era of information overload, it’s all too easy to confuse opinion with fact.

In a world of exaggerated truths and persuasive voices, a healthy dose of skepticism is not only reasonable — it’s essential.

Thursday, 17 April 2025

"Good" Is Dead: Why Only Exceptional Work Matters in the Age of Infinite Content

We’re living in an era unlike any before. The gates to information, tools, and distribution have been thrown wide open. Anyone with an internet connection can now access the same resources that were once reserved for experts or institutions—YouTube tutorials, podcasts, generative AI, books, online courses, social media, and more. The democratization of knowledge is one of the most powerful shifts of our time.

But with that shift comes a new reality: good is no longer good enough.

In fact, “good” has become the bare minimum—the cost of entry. The internet has leveled the playing field, and in doing so, it has dramatically raised the bar. Competence is abundant. Skill is scalable. What used to be impressive is now expected. And in this landscape, only the exceptional stands out.

Welcome to the Winner-Takes-All Era

We now operate in a winner-takes-all environment. The top creators, thinkers, and performers capture a disproportionate share of attention, influence, and reward. The long tail has flattened, and the middle is crowded. Everyone is trying to break through the noise, but the noise is louder than ever.

This means that to be noticed, to make an impact, to build a meaningful brand or business—you have to be operating at an A+ level, not an A or even an A-.

“Pretty good” won’t get you anywhere. “Good enough” will be forgotten. “Solid” is just invisible.

The Sea of Adequacy

We’re flooded daily with content and creations that are fine, decent, or even impressive in a vacuum. But in context—when stacked against the sheer volume of output happening globally—they disappear. They fade into the sea of adequacy.

You can produce thoughtful YouTube videos, write smart essays, build decent products, or launch solid services… and still get zero traction. Not because your work isn’t valuable, but because value alone doesn’t differentiate anymore. Excellence does.

There Are No Exceptions

This is the hard truth: there are no exceptions. Whether you’re a writer, designer, developer, founder, freelancer, artist, strategist—whatever your field—being “good” puts you in the middle of the pack. And that’s the most dangerous place to be.

Excellence is no longer a nice-to-have. It’s a necessity.

So What Does This Mean?

  • Be ruthless with quality. If it’s not outstanding, don’t ship it. Rework it. Elevate it. Sweat the details.

  • Find your edge. What’s the thing only you can bring to the table? Double down on it.

  • Play the long game. Excellence takes time. Don’t chase virality—chase mastery.

  • Iterate with intention. Improve relentlessly. Learn fast, fail better, and build momentum.

Final Thought

We’re in a golden age of creation—but also a brutal one. The tools are accessible, but the expectations are higher than ever. If you want to stand out, build something that’s not just good—but unforgettable.

Because in a world where everyone is creating, only the exceptional will rise.