Some ideas on hiring (Part 3 of “Same job, different pay”)

(This is part 3 in a rant on hiring, salaries, and job postings. You can read part 1 here and part 2 here.)

If someone with less experience than I, who didn’t attend college like I did, started today at the company I currently work for, in the same role I hold now—do I think she should be paid the same salary as me?

Absolutely, I do.

The work is the work, and if she is doing the same work as me, to spec, she should earn the same amount I do. My years of experience and educational background do not entitle me to a higher salary if we’re doing the same work.

Can she do the work that's expected of her? That's all that matters.

And for someone looking for a job like mine, if they can learn how to do it well, then why does it matter how they learned?

So if I’m against education as a prerequisite for entry, how should we go about hiring people?

I have two ideas, and the first is simple: more companies should adopt open hiring practices.

There’s a factory in New York that makes the brownies for Ben & Jerry’s Ice Cream. If you want to work at the factory, you put your name on a list.

When a job opens at the factory and you’re next on the list, you get a call asking if you’re still interested in the job. If you are, you report to training, and if you can pass the training, you’re hired.

They don’t care about education, criminal backgrounds, living situation, previous employment, or current skills. If you can learn how to do the job, you get the job at the wage they pay everyone else.

Keep a list of applicants, hire them in order when jobs open up, and train them to do the work.

Why doesn’t every restaurant, coffee shop, and retail establishment in the world already do this?

Simple: They’re scared of making a bad hire. They worry that they’ll hire someone who doesn’t show up to work, arrives late, has a bad attitude, or struggles to perform.

Well… what happens when those same people are hired through traditional (i.e., shitty) job search qualifications?

They get fired.

And if you implement open hiring, you spend far less money on recruitment than you otherwise would, so you lose far less if you have to fire someone.

The traditional ritual of posting job openings, soliciting hundreds of résumés, and holding interviews gives companies a sense of control over who they hire.

But it’s an illusion: either the person will work out or they won’t. And interviewing people with the “right” background on paper isn’t a guarantee that they’re a good hire.


Now, I already know this idea is so radical for people in knowledge work that it won't happen anytime soon, even though I firmly believe many companies could hire and train people to perform a significant percentage of the available office jobs out there.

So, what if you’re absolutely certain that you can’t use open hiring due to the nature of the work your company does? Let’s say, because you don’t have the time or resources to train a person to the level you need quickly enough to make it worthwhile.

That’s where my second solution comes into play: contingency hiring.

Time for a story: I was laid off a few months after the COVID-19 pandemic began (from a job I had held for only a few months).

My job search lasted nearly a year (no one was hiring). It was excruciating and terrifying.

But one of the best things that ever happened to me in terms of my career happened near the end: a CEO took a chance on me in an unconventional way.

I’d been studying (and, to some small extent, practicing) marketing for a couple of years before the layoff. I’d spent a lot of time during my unemployment talking to people in the field to get a sense of the jobs available. I wanted to know what skills and knowledge they required so I could make myself more appealing to employers.

A former classmate from university recommended me for her role at a marketing agency when she left to take another job. I went into the interview feeling woefully underqualified, but I knew I had a decent foundation of self-study to build on and could learn the rest of what I needed to know on the job.

The CEO agreed, but was still somewhat reluctant to fully commit to this neophyte in the marketing world (and who wouldn’t be?). So he offered me a deal:

He said that he would let me work for the agency for four weeks in exchange for a single lump-sum payment to see if I could do—or learn how to do—the work required for the role.

After that, we would reconvene, and depending on the results, either he would hire me full-time, or we would part ways with no hard feelings and with gratitude for the time and skill I'd given the company.

It was one of the most generous and thoughtful offers I’d ever gotten.1

If I were starting my own business today, this is exactly how I would hire someone. I’d post a job opening, and then I would select an applicant to work with me on a contingency basis.

Rather than conducting an interview (which only tells me if she’s good at interviewing), I would work on a project with her that I already needed to complete for the business, and I would pay her for her time.

If, at the end of the project, we found it was a good fit (and she still wanted to work for me, because she would also get to try out the business), I would extend a full-time offer to her.

Why doesn’t every company do this all the time? They get work done, can assess whether the person is a good fit for the role (which isn’t always apparent from an interview), and the applicant not only gets paid but also gets to test whether it’s a role they actually want.


I can’t fix the hiring process simply by writing and ranting about it. But I do know that it’s 100% broken right now.

Social media (LinkedIn) has made it worse, not better.

AI screening and ridiculous job requirements (college degrees, 30 years of experience) have reduced the job search to months or years of misery, frustration, and indignity.

But these simple tweaks—from removing degree requirements to contingency hiring—could go a long way to fixing a broken system.


  1. I didn’t end up accepting his offer. I asked him to give me a day or two to think about it and discuss it with my wife, which he happily agreed to.

    Literally, that same day, after I got home from the interview, I got a call from another company I’d had a few interviews with. They had a firm offer for me for full-time remote work with benefits.

    I called him that evening to let him know I was taking the other job offer, which he completely understood. But he also let me know that the offer was still on the table if it didn’t work out. ↩︎

Touching the hot stove

Sometimes, you just have to let people touch the metaphorical hot stove.

We work so hard to enact safeguards that protect people from making poor choices. But those safeguards are often viewed as shackles on individual liberty by people who either don't understand them or don't care.

For many, experiencing the consequences of their actions and choices is the only way they’ll learn.

The problem is that, in a society as interconnected and dependent as ours, those of us who know the stove is hot often get burned in the process.

If you value it, subsidize it

You would think that after what we saw with the COVID-19 pandemic and the ongoing shortage of doctors, nurses, and other healthcare practitioners, we would be seeing some sort of decline in the price of medical and nursing educational programs.

Fewer people going into the field would mean lower prices for those programs, right? (Supply and demand.)

Let me hypothesize why this might not be happening.

We have deeply ingrained in our culture the idea that the most important thing you can do is make a lot of money. Therefore, the best thing you can do for yourself is obtain a degree that leads to a certain type of job that pays well.

This means that, because we’ve conditioned our kids to believe that money is everything, people will continue to borrow astronomical amounts of money to attend medical school, believing that they will earn enough to cover it afterward. 

I suspect that a similar pattern is emerging with other college degrees, where individuals are borrowing six figures to earn degrees that lead to jobs paying half that or less, and this will eventually affect medical students. 

This trend is already happening with dental students. There are now a few hundred dentists in the United States who owe more than $1 million in student debt!

Tuition costs are likely to continue rising while salaries remain stagnant. Consequently, we may have doctors with $1 million in loans earning $250,000 a year (or less).

I think one solution is collective action. To make a difference, we, as a society, must unite and declare that we will not continue this way. But that’s hard to do.

The other option is to implement some form of government intervention based on the values we hold as a country.

If we believe that we need more doctors, engineers, and teachers in this country, rather than more hedge fund managers and trust-fund babies, our policies have to match that belief.

One of my professors in college—a funny little self-described country boy from the Mississippi Delta—had something of a law he preached to us:

  • If you want people to start doing something, subsidize it.
  • If you want people to stop doing something, tax it.

It works: this very idea is how we almost created a generation of non-smokers.

All the ad campaigns in the world about the dangers of smoking didn’t make a difference. What worked was taxing cigarettes to make them so prohibitively expensive that most Gen-Zers never started smoking them to begin with.

Now, we’re “taxing” the wrong things in the form of tuition increases and poor salaries.

Right now, we’re making it incredibly expensive to become a doctor or engineer. Or we’re making other fields financially unviable to work in (e.g., teaching) by failing to pay practitioners what they’re worth.

Our tax incentives and subsidies (the “rewards” our government doles out) don’t help these people, but they damn sure help those who are less visibly beneficial to society but make vastly more money. It’s why we have so many people entering finance and so few entering teaching.

I don’t know about you, but I think it’s time we flipped this.

The teaching experience got worse after COVID

When the COVID-19 pandemic erupted, it sent all the children home for an extended period of virtual schooling. This showed parents what it was like to deal with kids all day long.

Not just “deal” with them either, but also hopefully have them learn something.

Multiply parents’ experience with their handful of children by 10, and that was what the average schoolteacher dealt with on a daily basis for years before 2020.

What fascinates me about this, however, is that the experience didn’t lead to an outpouring of support. It didn’t lead to calls for higher pay, better working conditions, and more classroom assistance for teachers.

Instead, COVID-19 made schooling much, much worse for teachers as, inexplicably, it led to a focus on culture war issues and concerns over what was being taught in the classrooms.