Customer satisfaction · Home services operations · May 11, 2026 · Clint Research Team

How to Track Customer Satisfaction in a Home Service Business

Customer satisfaction in home services runs through three channels: online reviews, post-job surveys, and repeat booking rate. Most businesses only watch star rating averages. Here is how to track all three and what to do with the data.


Key takeaways

  • Post-job text surveys sent within 2 hours of job completion achieve 35-55% response rates. Email surveys achieve 8-12%.
  • Repeat booking rate (what percentage of customers from 12 months ago have booked again) is the most honest satisfaction signal available.
  • A customer who needs a callback visit within 14 days is 35% less likely to leave a 5-star review, even if the return visit resolves the issue completely.
Contents
  1. Online Review Metrics
  2. Post-Job Survey Setup
  3. Repeat Booking Rate as the Honest Metric
  4. Callback Rate as a Negative Signal
  5. Connecting Satisfaction Data to Techs
  6. Frequently Asked Questions

Customer satisfaction in home services is not one number. It runs through online reviews, post-job surveys, and repeat booking behavior, and each channel tells a different part of the story.

Most businesses watch their Google star rating average and call it customer satisfaction tracking. That is lagging data on one channel. A business with a 4.7 Google rating can have a 28% annual churn rate and never see the contradiction, because unhappy customers leave quietly rather than reviewing.

This post covers setting up all three channels and connecting the data to the tech level, where satisfaction is actually made or lost. The retention math that compounds from satisfaction is in how to track customer retention in a home service business.

Online Review Metrics

Google rating is the visibility metric. It is what prospective customers see before they call you, and it affects your Local Services Ads cost-per-lead and your organic ranking in local search.

The number to track is not just the star average. Track four metrics:

Star rating average. The visible number. Anything below 4.5 on Google starts to affect conversion on first contact. Below 4.0, it actively costs you leads.

Review count. Volume signals activity and recency to new customers. A business with 4.8 stars and 12 reviews is less credible than a business with 4.6 stars and 340 reviews. Review count matters for LSA ranking as well. Google's documentation confirms that higher review volume contributes to Local Services Ad placement.

Review velocity. New reviews per month. A business that earned 60 reviews two years ago and gets 1 new review per month is in slow decay relative to a competitor getting 12 new reviews per month. A 12-month trend in review velocity tells you whether your reputation is building or stagnating.

Review response rate. The percentage of reviews that receive an owner response. Google recommends responding to all reviews. BrightLocal's 2024 Local Consumer Review Survey found that 88% of consumers would use a business that responds to all reviews. Responding to negative reviews within 48 hours limits reputational damage from that specific review.

Set up Google Business Profile notifications so every new review triggers an email alert. Assign someone to respond to reviews within 24 hours. This is a 5-minute task that most businesses skip. See how to respond to negative reviews in a home service business for the response framework and how to get more Google reviews in a home service business for the collection mechanics.

Text Clint: "how many Google reviews have we received in the last 30 days and what is the average rating?"

Post-Job Survey Setup

Post-job surveys are the most actionable satisfaction data because they connect to a specific job, a specific tech, and a specific date.

A star rating average does not tell you which tech, which job type, or which service area is generating the complaint. A post-job survey tagged to the job record does.

Channel: Text, not email. Post-job surveys sent via text within 2 hours of job completion achieve 35-55% response rates. The same survey sent via email achieves 8-12%. The gap is due to how customers interact with post-service communication: most residential customers check texts within minutes and email hours or days later, if at all.

Format: Two questions, not five. Question 1: "How was the service today? Reply 1 (poor) to 5 (excellent)." Question 2: "Anything we could do better?" The two-question format maintains response rates. Five-question surveys lose half the respondents by question three.

Timing: Within 2 hours of job completion. The job is fresh, the tech just left, and the customer is still thinking about the service. Same-day texting captures the emotional state at the moment of highest relevance. Sending 24-48 hours later picks up a different and lower-quality response.

Routing: Responses that score 4 or 5 trigger a follow-up text with the Google review link. Responses that score 1, 2, or 3 trigger an internal alert for the service manager to call the customer within 4 hours. This routing converts your post-job survey into both a review generation system and a complaint interception system.
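The routing described above amounts to a simple threshold check. Here is a minimal sketch (the function name and return labels are hypothetical; platforms like Podium, NiceJob, and Birdeye implement this logic natively):

```python
def route_survey_response(score: int) -> str:
    """Route a 1-5 post-job survey score to the next action.

    4-5 -> follow-up text with the Google review link.
    1-3 -> internal alert; service manager calls within 4 hours.
    Illustrative sketch only; action labels are made up.
    """
    if not 1 <= score <= 5:
        raise ValueError("score must be between 1 and 5")
    if score >= 4:
        return "send_review_link"       # happy customer: ask for the review
    return "alert_service_manager"      # unhappy customer: intercept the complaint
```

The one design point worth stressing: the review request and the complaint alert are mutually exclusive branches, never both.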

Most field service CRMs have a native follow-up message feature or integrate with tools like Podium, NiceJob, or Birdeye for automated post-job review requests. These platforms handle the routing logic natively.

Text Clint: "what is the average post-job rating by tech this month, and which techs have ratings below 4.0?"

Repeat Booking Rate as the Honest Metric

Repeat booking rate is the one satisfaction metric that cannot be gamed.

A customer can give you a 5-star review and never call again. A customer who leaves no review but has booked 3 times in 18 months has already voted with the highest-signal behavior available.

Calculate it: of the customers who completed at least one job with you in the 12-month window that ended 12 months ago, what percentage have booked again in the most recent 12 months?

For example: 340 customers completed a job in the window June 2024 to May 2025. Of those 340, how many have completed another job in the window June 2025 to May 2026? If 187 of them have, your repeat booking rate is 55%.
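The same calculation as a sketch, assuming a flat export of (customer_id, completion_date) pairs from your CRM (the function and field names are illustrative, not from any specific system):

```python
from datetime import date, timedelta

def repeat_booking_rate(jobs, as_of):
    """Repeat booking rate: of the customers with a completed job in the
    12-month window that ended 12 months ago, the share who completed
    another job in the most recent 12 months.

    `jobs` is a list of (customer_id, completion_date) tuples.
    Illustrative sketch; adapt to your CRM's export format.
    """
    recent_start = as_of - timedelta(days=365)   # start of the most recent 12 months
    prior_start = as_of - timedelta(days=730)    # start of the cohort window

    prior_cohort = {c for c, d in jobs if prior_start <= d < recent_start}
    rebooked = {c for c, d in jobs
                if recent_start <= d <= as_of and c in prior_cohort}
    return len(rebooked) / len(prior_cohort) if prior_cohort else 0.0

# 187 rebooked out of a 340-customer cohort -> 187 / 340 = 0.55
```

The cohort is fixed first, then rebooking is measured only against that cohort, which is what keeps new-customer growth from inflating the number.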

What a healthy repeat booking rate looks like varies by trade:

  • HVAC (service and repair): 45-60% annual repeat rate is strong for a primarily repair-driven business. Maintenance agreement customers should approach 85-90% annual retention.
  • Residential cleaning: 60-75% repeat rate is the baseline for a well-run operation.
  • Plumbing (repair only): 30-45% is realistic given the emergency-driven nature of the work.
  • Landscaping (recurring plan): 80%+ for plan customers is the target.

If your repeat rate is materially below these ranges, survey scores alone will not reveal why. Only a look at churn segments will: when did these customers stop booking, what was their last job type, and who was the assigned tech?

Text Clint: "what percentage of customers who had a job completed in the 12 months ending May 2025 have booked again since then?"

Callback Rate as a Negative Signal

Callback rate is the inverse satisfaction signal. A customer who calls back within 14 days for the same complaint is, by definition, not satisfied with the first visit.

This metric is covered in depth in the first call resolution post. The satisfaction connection: a customer who receives a callback visit within 14 days has a 35% lower probability of leaving a 5-star review, even if the return visit resolves the issue completely.

The callback rate tracks two things simultaneously: service quality (is the tech fully resolving the issue on first visit?) and customer satisfaction (is the customer's experience clean and final, or does it require a follow-up?).

Track callback rate by tech for the same reason you track post-job ratings by tech: the aggregate number tells you where the business is, but the tech-level breakdown tells you where the coaching needs to go.

A tech with a 4.8 post-job rating average and an 18% callback rate is producing a contradiction. Customers are happy when they rate the survey (probably because the tech is personable and communicates well), but they are calling back at a high rate (because the technical completion is inconsistent). That pattern requires a different coaching conversation than a tech with a 4.2 rating and a 6% callback rate.

Text Clint: "show me the callback rate and average post-job rating for each tech side by side over the last 60 days."

Connecting Satisfaction Data to Techs

All four satisfaction signals become useful when they can be attributed to a specific tech on a specific job.

Online reviews: Most review request platforms allow you to tag the tech who ran the job when sending the request. NiceJob, Podium, and Birdeye all support tech-level attribution. When a review comes in from a tech-tagged request, the review is associated with that tech in the platform's reporting.

Post-job survey scores: If the survey is triggered from your CRM at job close and the job record has an assigned tech, the survey response inherits the tech attribution. Run the average score by tech monthly.

Repeat booking rate: Segment by the tech most frequently assigned to a customer's jobs. If a customer's last 3 jobs were all with Tech A, their rebooking behavior is attributable to that tech relationship.

Callback rate: Native to the job assignment once the callback tag is in place.

The combined view by tech: revenue per day, post-job rating, callback rate, review count, and repeat booking rate on assigned customers. That set of five numbers tells you not just whether a tech is generating revenue, but whether they are generating durable revenue or revenue that churn and callback will erode over time. See technician performance metrics for home services for the full tech-level scorecard.
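As an illustration of that combined view, here is one possible shape for the tech scorecard. All values and field names are made up for the example (the two rows echo the contrast described in the callback section), and the 10% callback ceiling is an assumed threshold, not a benchmark from this post:

```python
# Hypothetical scorecard rows; pull the real values from your CRM reports.
scorecard = [
    {"tech": "Tech A", "revenue_per_day": 1850, "post_job_rating": 4.8,
     "callback_rate": 0.18, "review_count": 22, "repeat_rate": 0.41},
    {"tech": "Tech B", "revenue_per_day": 1500, "post_job_rating": 4.2,
     "callback_rate": 0.06, "review_count": 15, "repeat_rate": 0.58},
]

def flag_for_coaching(row, rating_floor=4.0, callback_ceiling=0.10):
    """Flag a tech whose rating falls below the floor or whose callback
    rate exceeds the ceiling. Thresholds are illustrative assumptions."""
    return (row["post_job_rating"] < rating_floor
            or row["callback_rate"] > callback_ceiling)

flagged = [r["tech"] for r in scorecard if flag_for_coaching(r)]
```

In this made-up data, Tech A gets flagged despite the 4.8 rating, because the 18% callback rate breaches the ceiling: exactly the high-rating, high-callback contradiction the coaching conversation should target.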

Text Clint: "what is the average post-job rating by tech this month?"

Frequently Asked Questions

4 questions home service owners actually ask about this.

  • What response rate should I expect from a post-job text survey?

    35-55% for a 2-question survey sent within 2 hours of job completion. Response rate drops meaningfully with each additional question (a 5-question survey typically achieves 15-25% via text) and drops sharply if the survey is delayed beyond 4-6 hours post-job.

  • Should I ask for Google reviews in the same message as the satisfaction survey?

    No. The satisfaction survey goes first, and only customers who score 4 or 5 receive the follow-up with the review link. Routing everyone to a review request regardless of their score produces negative reviews from customers who would have been handled quietly through the complaint path. Keep the two steps separate.

  • How do I track which tech a Google review is about?

    The most reliable method is tech-tagged review requests sent immediately after job close. When the review request is sent from the job record with a tech attribute, and the customer then leaves a review, most review platforms can associate the review to the tech. If you use NiceJob or Birdeye, the tech-level attribution is native. Google's own platform does not support tech tagging natively, so the attribution lives in your review request tool, not in Google Business Profile.

  • What is a realistic repeat booking rate for a $2M HVAC company?

    For a company with a mix of service repair (call-and-dispatch) and maintenance agreement customers, a combined repeat booking rate of 45-55% is reasonable. Maintenance agreement customers should be at 80-90% annual retention. If the blended number is below 35%, the business is over-reliant on new customer acquisition to maintain revenue, which is a structural cost and marketing dependency problem.

See Clint in action

Clint is the pre-built AI for home service shops. Connect your CRM, email, and phone system in minutes and the agents run on your real data.