Posts by Nathan Coumbe

My mission is to learn, inform, inspire, and improve. I am a passionate teacher, an avid writer, a leader of people, and a strategic thinker. Wherever I am, whatever the work I am called to do, my goal is the same: make my little corner of the world better for everyone in it. To do this, I ask better questions and solve more interesting problems for those I serve. Think deeply. Think often. Keep exploring. Always be curious.

The first rule in juggling…

Never lunge for the ball.

If you make a bad throw, just let it drop. Then start over.

Learning to juggle taught me how to handle life: sometimes you make a bad throw.

Sometimes you take on one too many projects. Or Murphy’s Law derails your plans.

Don’t lunge to save things. Let them drop.

Reset, and begin anew.

What kind of fear is it?

Is this fear keeping you safe?

Or is it the kind of fear that’s preventing you from being your best?

Learn to differentiate between the two.


Inspiration

Do you know a virtuous person?

Who do you know who is courageous?

Wise?

Disciplined?

Just?

Do you know anyone who embodies all four of these cardinal virtues?

How much better would things be if you had a boss like this? A coworker or employee?

How would the world improve if we had leaders like this?

It’s hard to succeed with only one or two. You need all four to be truly effective.

The German soldiers who steamrolled Europe were courageous and disciplined. But they exercised those virtues in service of the most unwise and unjust of causes.

You can probably think of several people who were incredibly wise… But who lacked the courage to stand up and do the right thing when the time called for action.

We need more virtuous people in the world.

They aren’t born this way. They make themselves so.

AI isn’t taking your job

…at least not yet.


I use AI almost every day to assist with my work and to learn unfamiliar topics (part of my job). I read diligently to stay up to date on the latest developments so I can learn how to use it more effectively.

AI will become (if it hasn’t already), and continue to be, a large part of all our lives.

However, we’re receiving a significant amount of misinformation about what’s happening and the effects it’s having on workers. Some of it is outright deception, while some is simply lazy reporting.

First, the deception.

The CEOs of these massive tech companies (e.g., Dario Amodei, Sam Altman) are brilliant businesspeople who’ve created mind-boggling products. But they’re hemorrhaging cash trying to make their models more powerful…

And after years of unbelievable growth and progress, they’re failing. The scaling laws they used to project LLM growth are yielding diminishing returns, and the improvements are now incremental rather than exponential.

This is a serious financial problem for them. They need to keep their current investors engaged, and they need new investors to infuse them with additional capital. So what do they do?

They go on cable news shows or podcasts and claim that their AI software will replace all entry-level workers (10-20% of the workforce) within a matter of months.1 It just isn’t true.

But you wouldn’t know that from the news you’re consuming. The media have bought into this story hook, line, and sinker.

Which brings me to my accusation of lazy reporting. Headlines like “Goodbye, $165,000 Tech Jobs” and “AI is Replacing 10 million Workers” (I made that one up) are attention-grabbing… But untrue.

These media companies, like the AI companies they write about, need to make money. They do that by getting as many eyes on their work as possible. And the best way to do that is to scare people into giving them attention… Even if the claims are untrue or misleading.

To paraphrase Ryan Holiday, who warned us about this years ago: “Trust them… They’re lying.”

It is true that computer science graduates are having a much harder time finding jobs at the moment. And it’s true that there have been massive layoffs in the tech sector.

It’s also true that the companies doing these layoffs are investing more of their money and efforts in AI. But AI is not the cause of this, nor is it replacing those who’ve been laid off.

Here’s what’s actually happening:

During the pandemic, these tech companies went on a massive hiring spree—they simply overhired. Now they’re bloated, and the quickest way to reduce the bloat and (temporarily) increase shareholder value is to shed programmers left and right.

At the same time, the tech sector itself is contracting, which means there are fewer jobs for all the newly minted computer science graduates.

This has historical precedent. The same thing happened in 2008 during the financial crisis. And it happened before that during the dot-com bust at the turn of the century.

The number of people entering the computer science field fluctuates in response to the economy. There’s a tech boom, prompting more people to enter the field. Then the sector contracts, and all those people get laid off, which in turn reduces the number of people entering the field.

Until the next boom.

Contrary to what many journalists have written, these people aren’t being replaced by AI. They’re simply being let go because companies overhired during the pandemic or because the companies are refocusing on AI.

However, that refocus, coupled with layoffs and fewer job openings, has led journalists to conflate the two, concluding that these computer science graduates are being replaced by AI.

This simply isn’t true. That may happen in the 2030s, but it’s not happening right now.

I’ve been guilty of buying into this hysteria too, as you can see in my piece on job hunting in 2025. And I’m here to tell you I was wrong in what I wrote about AI replacing workers in that piece.

All that to say this: Read AI journalism with a healthy dose of skepticism right now. And take any apocalyptic predictions with a grain of salt.


  1. Dario Amodei actually said this in an interview with Anderson Cooper and, ironically, claimed to be worried about it… Which raises the question: if you’re worried about it, why do you continue to do it?

    Why doesn’t he just stop if it actually worries him? It’s his company. ↩︎

Four questions

What are we measuring?

How are we measuring it?

Should we be measuring this?

And, most importantly, are we sure we’re measuring the right things?

Ignore Goodhart’s Law (when a measure becomes a target, it ceases to be a good measure) at your own peril.

Perfectionism hinders progress

We all have a tendency to strive for perfection. But it’s often a trap that keeps us from reaping any rewards at all.

Case in point:

I used to write morning pages each day to clear my mental clutter and get ideas flowing for the day. And, as their creator, Julia Cameron, prescribed, I did them longhand on 8.5″ x 11″ sheets of paper first thing upon waking.

But doing it this way took me nearly an hour each morning. Not a problem when I was laid off and unemployed, but quite difficult on a regular, full day when I also needed to cook breakfast, help my wife get ready for work at 5 am, and squeeze in a workout.

That was hard enough, but life made it harder.

After I left my two-week ICU stay in 2020 and went home to recover from COVID-19, I found myself dealing with horrible inflammatory issues all throughout my body, including my hands and wrists. This made it difficult, if not impossible, to do much writing by hand.

So I stopped writing morning pages. And as a result, I lost all the benefits of that wonderful mental decluttering each morning and the ease with which new ideas flowed.

I tried here and there to dive back into morning pages—the right way, by hand—for years, but never managed more than a few days before I quit. Frustration, or pain, stopped me from continuing.

But in the back of my mind, I always knew there was a simple solution to this: just type your morning pages out on your computer! Rather than mangle my hands or suffer through a slow hour of writing I didn’t have, I could just type them.

But I resisted, because it wasn’t the right way to do morning pages. Julia was very explicit.

I let perfection prevent me from doing anything at all, when something, however imperfect, would have been better than nothing. By refusing to do them any way other than “perfectly,” I was missing out on 100% of the benefits of the process itself.

Doing no morning pages guaranteed that I got 0% of the benefits of morning pages—no mental declutter and no new ideas to work with during the day.

Doing anything, however imperfectly, had to be better than nothing, right? 25% better? 50%?

In fact, it would have been infinitely better! Because a 1% benefit is infinitely better than a 0% benefit.

Interestingly, I found that typing my pages on my computer not only let me do them on days when time was limited, and without physical pain, but it also gave me all the benefits I’d received when writing them by hand.

I delayed doing a less-than-perfect version of something, and missed out on all of its benefits, rather than doing a “good enough” version and getting at least some of them.

We’re all guilty of this.

We do it with our health: if we can’t do the extreme workout perfectly, we just don’t do anything at all. But going for a 15-minute walk is literally infinitely better than vegging out on the couch.

If we can’t stick to our meal plan perfectly (and no one ever can), we say “f–k it,” and eat an entire 18″ pizza. But eating 3 slices of pizza with a little salad is infinitely better than binge eating out of frustration.

We do it with our hobbies: if we can’t set aside two hours to practice our guitar, we let it languish on the stand in the corner. But spending 15 minutes learning a small section of a song is infinitely better than doing nothing.

This all-or-nothing mindset is all too common and the enemy of progress in everything we do. We’re trained in school to live by an A+ mindset: how far away am I from 100%?

But we’d be so much better off if we reversed it and asked, “how far away from 0% am I, and what decision would let me move a notch or two higher?”

I use this tactic all the time with my coaching clients when trying to make behavior changes stick, and it works wonders.

The next time you find yourself battling perfectionism, stop and take a breath. Then ask yourself, “What is 0% on this thing I’m trying to do?” Then figure out what a tiny notch higher on that scale is for you and do that.

Those little points, day after day, add up.

Courage is a skill

Seth Godin arguably has one of the best ideas for getting a project started that you’ll ever read. It’s called “First, ten.”

The idea is to share what you’ve created—a book, podcast, newsletter, business idea, whatever—with 10 people who already know and trust you. And if it’s good, they’ll share it with three, five, or ten others. Soon, your idea will spread, and you’ll have the opportunity to do it again.

But sometimes, even that is too terrifying to contemplate. So what can you do instead?

Find a single person. Just one person who loves you unconditionally and whom you trust implicitly. Maybe it’s your sister, your mom, or your best friend.

Share it with them. Not because they’ll praise you for it or because they’ll share it widely. Do it simply to show your fear who’s boss.

Stretch that courage muscle by starting as small as possible. Because bravery is a skill. It can be learned through practice and repetition.

Some ideas on hiring (Part 3 of “Same job, different pay”)

(This is part 3 in a rant on hiring, salaries, and job postings. You can read part 1 here and part 2 here.)

If someone with less experience than I, who didn’t attend college like I did, started today at the company I currently work for, in the same role I hold now—do I think she should be paid the same salary as me?

Absolutely, I do.

The work is the work, and if she is doing the same work as me, to spec, she should earn the same amount I do. My years of experience and educational background do not entitle me to a higher salary if we’re doing the same work.

Can she do the work that’s expected of her? That’s all that matters.

And for someone looking for a job like mine, if they can learn how to do it well, then why does it matter how they learned?

So if I’m against education as a prerequisite for entry, how should we go about hiring people?

I have two ideas, and the first is simple: more companies should adopt open hiring practices.

There’s a factory in New York that makes the brownies for Ben & Jerry’s Ice Cream. If you want to work at the factory, you put your name on a list.

When a job opens at the factory and you’re next on the list, you get a call asking if you’re still interested in the job. If you are, you report to training, and if you can pass the training, you’re hired.

They don’t care about education, criminal backgrounds, living situation, previous employment, or current skills. If you can learn how to do the job, you get the job at the wage they pay everyone else.

Keep a list of applicants, hire them in order when jobs open up, and train them to do the work.
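
To underscore how simple this is: mechanically, open hiring is nothing more than a first-come, first-served queue. Here’s a rough sketch in Python (my own illustration, with made-up names; not any real company’s system):

```python
from collections import deque
from typing import Optional

# A minimal sketch of an open-hiring list. Illustrative only: every name
# and rule here is my assumption, not any real company's process.

waiting_list = deque()  # applicants, in the order they signed up


def sign_up(name: str) -> None:
    """Anyone who wants a job adds their name to the end of the list."""
    waiting_list.append(name)


def still_interested(name: str) -> bool:
    """Placeholder for the phone call asking if they still want the job."""
    return True


def fill_opening() -> Optional[str]:
    """When a job opens, offer it to the next person in line.

    No resume screen, no interview; training is the only gate.
    """
    while waiting_list:
        applicant = waiting_list.popleft()
        if still_interested(applicant):
            return applicant  # reports to training; pass it and they're hired
    return None  # nobody left on the list


sign_up("Ada")
sign_up("Grace")
print(fill_opening())  # Ada: first on the list gets the first opening
```

That’s the entire “algorithm”: no ranking, no filtering, just a line.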

Why doesn’t every restaurant, coffee shop, and retail establishment in the world already do this?

Simple: They’re scared of making a bad hire. They worry that they’ll hire someone who doesn’t show up to work, arrives late, has a bad attitude, or struggles to perform.

Well… what happens when those same people are hired through traditional (i.e., shitty) job search qualifications?

They get fired.

And if you implement open hiring practices, you spend a lot less money on the recruitment process than you otherwise would, so you lose out on a lot less if you must fire someone.

The ritual of hiring people through traditional methods (posting job openings, soliciting hundreds of résumés, and holding interviews) gives companies a sense of control over whom they hire.

But it’s an illusion: either the person will work out or they won’t. And interviewing people with the “right” background on paper isn’t a guarantee that they’re a good hire.


Now, I already know that this idea is so radical for people in knowledge work that it won’t happen anytime soon, even though I firmly believe many companies can hire and train people to perform a significant percentage of the available office jobs out there.

So, what if you’re absolutely certain that you can’t use open hiring due to the nature of the work your company does? Let’s say, because you don’t have the time or resources to train a person to the level you need quickly enough to make it worthwhile.

That’s where my second solution comes into play: contingency hiring.

Time for a story: I was laid off a few months after the COVID-19 pandemic began (from a job I had held for only a few months).

My job search lasted nearly a year (no one was hiring). It was excruciating and terrifying.

But one of the best things that ever happened to me in terms of my career happened near the end: a CEO took a chance on me in an unconventional way.

I’d been studying (and, to some small extent, practicing) marketing for a couple of years before the layoff. I’d spent a lot of time during my unemployment talking to people in the field to get a sense of the jobs available. I wanted to know what skills and knowledge they required so I could make myself more appealing to employers.

A former classmate from university recommended me for her role at a marketing agency when she left to take another job. I went into the interview feeling woefully underqualified, but I knew I had a decent foundation of self-study to build on and could learn the rest of what I needed to know on the job.

The CEO agreed, but was still somewhat reluctant to fully commit to this neophyte in the marketing world (and who wouldn’t be?). So he offered me a deal:

He said that he would let me work for the agency for four weeks in exchange for a single lump-sum payment to see if I could do—or learn how to do—the work required for the role.

After that, we would reconvene, and depending on the results, either he would hire me full-time or we would part ways with no hard feelings, and with gratitude for the time and skill I’d given the company.

It was one of the most generous and thoughtful offers I’d ever gotten.1

If I were starting my own business today, this is exactly how I would hire someone. I’d post a job opening, and then I would select an applicant to work with me on a contingency basis.

Rather than conducting an interview (which only tells me if she’s good at interviewing), I would work on a project with her that I already needed to complete for the business, and I would pay her for her time.

If, at the end of the project, we found it was a good fit (and she still wanted to work for me, because she would also get to try out the business), I would extend a full-time offer to her.

Why doesn’t every company do this all the time? They get work done, can assess whether the person is a good fit for the role (which isn’t always apparent from an interview), and the applicant not only gets paid but also gets to test whether it’s a role they actually want.


I can’t fix the hiring process simply by writing and ranting about it. But I do know that it’s 100% broken right now.

Social media (LinkedIn) has made it worse, not better.

AI screening and ridiculous job requirements (college degrees, 30 years of experience) have reduced the job search to months or years of misery, frustration, and indignity.

But these simple tweaks—from removing degree requirements to contingency hiring—could go a long way to fixing a broken system.


  1. I didn’t end up accepting his offer. I asked him to give me a day or two to think about it and discuss it with my wife, which he happily agreed to.

    Literally, that same day, after I got home from the interview, I got a call from another company I’d had a few interviews with. They had a firm offer for me for full-time remote work with benefits.

    I called him that evening to let him know I was taking the other job offer, which he completely understood. But he also let me know that the offer was still on the table if it didn’t work out. ↩︎

On the frivolousness of educational requirements in job postings (Part 2 of “Same job, different pay”)

I anticipated some of the pushback I’d receive from yesterday’s post, and I wanted to address it here.

Some might argue that certain jobs require graduate or other advanced education to obtain them. And I agree some should: I’d much rather have a surgeon who went to medical school operate on me than one who learned from YouTube.

And we’re all better off with engineers who went to school for the subject than relying on amateurs to build our bridges.

But doctors, engineers, nurses, and other “professional” roles all require specialized education simply to learn and carry out the basics of their jobs.1

This isn’t the reality for many knowledge or service sector jobs, which comprise a significant portion of our modern workforce. However, you might point out that many of these jobs require a college or even graduate education, as indicated in their job postings. Isn’t that at odds with what I’m saying?

No—because these jobs don’t actually need you to have a degree. It’s a tool to keep you from applying for them.

Hiring managers simply use that to make their lives easier and weed out 90% of otherwise qualified applicants without ever having to look at their applications. It reduces their workload.

For the vast majority of us working in the knowledge sector, a college education neither prepares us for specific jobs, like the professional roles listed previously, nor equips us with most of the skills required for knowledge work. We learn on the job and through self-education.

You don’t need a master’s degree in computer science from MIT to work as a software developer. You simply need to know how to program (and be damn good at it). You can learn on Codecademy or attend a bootcamp, gaining enough knowledge to get a job. You’ll have to learn the specifics of the role when you start working, anyway.

I work in learning and development, so I’m somewhat biased in my thinking on this. The shift in L&D now is toward skills-based training and qualifications. Essentially, we’re trying to answer this question:

Leaving aside formal education, what specific skills does a person either need to possess—or need to learn—to be qualified for this specific job?

You don’t consider their past college education (or lack thereof); you only consider what they’re capable of. This approach doesn’t harm people who attend school for specific fields because, as long as they’re qualified by their skills, they can still get the job. However, it also doesn’t prevent those who didn’t attend a formal school, but who do possess the necessary skills, from securing work for which they’re qualified.

Again, why do we need someone who is applying for a job in marketing or customer success to have a college degree? If they have the skill, or can learn it outside of a university, shouldn’t that be enough?

For most knowledge work jobs, it’s simply ridiculous to require a college degree (and I have two of them, neither of which I’ve ever used in my knowledge work jobs).

So I suppose I’m arguing two things:

  1. Eliminate degree requirements for most jobs.
  2. Pay the same wages (and good ones) for the same work, regardless of educational background or years of experience.2

Only the quality and the results of the work should matter. Not how much education someone has.

And for God’s sake, not based on how good someone is at negotiating. Not everyone is comfortable negotiating salaries or demanding raises, especially when they think their boss will just fire them and hire someone cheaper if they try.

We are obsessed with meritocracy in this country, often to our detriment. And we obsess over it in such a way that it harms people who may be just as capable at a job as someone else, but who don’t have the courage or the skills to negotiate with someone in a position of power above them.


Does this mean that I think another L&D specialist at my company who hypothetically started today should make the same amount of money I make after nearly four years of raises?

Absolutely, I believe that. But I’ll save that rant for Part 3 tomorrow.


  1. I have a holdup about including lawyers in this list for one very specific reason: Until sometime in the 20th century, you did not have to go to law school to practice law. You simply needed to pass the bar exam for the state(s) in which you intended to practice.

    Walter Gordon, a native of my home state of Mississippi (who was one of the main characters in Band of Brothers), passed the bar exam after the war while still in law school and was allowed to practice even without his diploma.

    At some point, law schools realized they could make a lot more money if they collaborated with the American Bar Association to require would-be lawyers to attend their schools, thereby creating a somewhat arbitrary barrier to entry into the field.

    Are lawyers who attend law school objectively better off? I don’t know, nor do I feel qualified to say. However, it does seem similar to what we encounter in knowledge work job searches.

    Are all hiring managers part of some cabal to make it difficult to get jobs? No. It’s simply easier for them and also just “how things are done around here.” The status quo is the status quo for a reason. ↩︎
  2. What if one person is objectively better at the job than someone else in the same role? Personally, I believe this issue can be easily resolved with bonuses or commissions.

    I don’t mean paying people crappy wages and hoping that they’ll make it up in bonuses (looking at you, restaurants paying servers $2.13 an hour). I mean that if two people are doing the same job, but one of them has been outstanding for just one quarter or year or whatever, pay that person some sort of bonus to show your appreciation.

    But don’t punish the other person who is doing good work—to spec, doing what needs to be done and is asked of her—by paying her less for arbitrary reasons. ↩︎

Same job, different pay

I saw a job posting’s salary description the other day that gave me pause.

The salary was dependent on three things:

  1. The number of courses taught (yeah, that makes sense. More work = more pay)
  2. The type of courses taught (More advanced courses = more difficulty = more pay. Also makes sense)
  3. The educational level held by the instructor…

That third item is the one that gave me pause. Here’s why:

If two people are doing the exact same type of work at the exact same level of quality, why should one with a higher-level degree be paid more than the person with a lower-level degree?

You might say, “Well, they went to school longer. They have more education. They’re more qualified.”

So what? Does that degree automatically mean that the person is more skilled at the job? No, not at all.1

More education does not automatically confer a higher level of qualification or suitability for a job. The skill of the person, and nothing else, does that.

If the person with the higher degree actually delivers more or better work than the other, then I understand receiving more pay. They are arguably more valuable. But that has nothing to do with the degree and everything to do with the output of the worker.

Additional education might enable that higher quality, but then again, it might not. There are countless MBA graduates out there who are suitable for little more than responding to email or working in middle management. They would flounder trying to run a small business.

Perhaps changing the nature of the work in question will make the point clearer:

Let’s say Person A has a master’s degree in burger-flipping, and Person B has a bachelor’s degree in burger-flipping. But both workers flip the same number of burgers each hour at the same level of quality expected of anyone on the line.

Should Person A be paid more money simply because they got a master’s degree in the subject? I would argue no, because the quality of the output and the nature of the work are the same.

You might think I’m stretching this a bit, but I’m not. It’s the work that matters, the output, the results.

A person’s demonstrable skill determines their qualifications, not a piece of paper. That paper is often a false proxy for genuine qualification, a stand-in for real value.

But we buy into it because we’ve been trained to believe that more is better, higher is better. We must stop this.

We have to start measuring the proper targets and rewarding the right things appropriately.


  1. I’m aware that teachers are paid at different salary levels based on their educational levels (e.g., a master’s degree earns more income than a bachelor’s degree). However, just because that’s the case doesn’t make it right.

    Teachers should not be paid based on how much schooling they received, but on how good they are at schooling others. If someone with a master’s degree is educating students so well that they outscore everyone else, then I can understand paying that teacher more (and she should share her secrets with everyone else so they can level up their students and make more money too!).

    Also, teachers should simply be paid substantially more than they currently earn, but that’s a topic for another day… ↩︎