How We Score

Every roofing software product we review gets an RSG Score from 1 to 10. It’s not a gut feeling. It’s not based on how nice the vendor’s sales rep was. It’s a structured evaluation across five categories that matter to contractors who actually use this stuff on job sites.

We built this system because we were tired of review sites that hand out 4.5 stars to everything. If every product is “great,” the rating is useless. Our scores have teeth. A 7.5 means we found real problems. A 9.0+ means we’d recommend it with confidence.

This page explains exactly how we calculate every score. No black box. No mystery algorithm. If you disagree with a score, you can point to the specific category and tell us why — and we’ll listen.

The Five Categories

Every product is evaluated across five categories. Each category is scored independently on a 1–10 scale, then combined into the overall RSG Score as a weighted average.

Ease of Use (20%)

Can your least technical crew member figure this out without a week of training? We evaluate onboarding time, interface clarity, mobile app quality, and how many clicks it takes to do the things you do 50 times a day. If a tool is powerful but takes a month to learn, that costs you real money.

Features & Core Function (20%)

Does the tool do what it claims to do — and does it do it well? We evaluate feature depth, reliability, integration options, and whether the product actually delivers on its marketing promises. A CRM that crashes when you have 200 active jobs isn’t a CRM.

Pricing Value (20%)

We don’t score cheapest = best. We score value — what you get for what you pay. A $300/month CRM that saves your office manager 15 hours a week is a better value than a $50/month tool that creates more work than it eliminates. We also factor in hidden costs: per-user fees, add-ons, report charges, and required integrations that aren’t included in the base price.

Support Quality (15%)

When your CRM goes down during a hailstorm and you’ve got 40 leads to follow up on, how fast can you get help? We evaluate response times, support channels (phone, chat, email, knowledge base), onboarding assistance, and what real users report about their support experiences on G2 and Capterra.

Roofing-Specific Capabilities (25%)

This is the category that separates us from every other review site. A generic field service tool can score 9/10 on Capterra and still be a 6/10 for roofers if it doesn’t understand roof measurements, material ordering from ABC or Beacon, insurance supplement workflows, or storm damage documentation. We weight this category the heaviest because it’s the entire reason this site exists.

Why the weights aren’t equal: Roofing-Specific gets 25% because that’s why you’re here — you need software that understands roofing, not generic business software. Support gets 15% because it matters but it’s not the reason you buy. Everything else is 20% because ease of use, features, and pricing are equally important in different situations.
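
For anyone who wants to check the math, here's a minimal sketch of the weighted average in Python. The weights are the ones listed above; the sample category scores are invented, and rounding to one decimal is our assumption based on how scores appear on this page.

```python
# Weights from the five categories described above (they sum to 1.0).
WEIGHTS = {
    "ease_of_use": 0.20,
    "features": 0.20,
    "pricing_value": 0.20,
    "support": 0.15,
    "roofing_specific": 0.25,
}

def rsg_score(category_scores: dict[str, float]) -> float:
    """Combine five 1-10 category scores into one overall RSG Score."""
    total = sum(WEIGHTS[cat] * score for cat, score in category_scores.items())
    return round(total, 1)  # one-decimal rounding is an assumption

# Hypothetical product: excellent roofing capabilities, weaker support.
print(rsg_score({
    "ease_of_use": 9.0,
    "features": 8.5,
    "pricing_value": 8.0,
    "support": 7.0,
    "roofing_specific": 9.5,
}))
# 0.20*9.0 + 0.20*8.5 + 0.20*8.0 + 0.15*7.0 + 0.25*9.5 = 8.525 -> 8.5
```

Note how the 25% weight lets a strong Roofing-Specific score move the overall number more than any other single category.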

Two Scoring Tracks: Full Platforms vs. Specialized Tools

It’s not fair to score a CRM platform the same way you score a measurement app. AccuLynx and CompanyCam are both roofing software, but they do completely different things. So we use two scoring tracks:

Full Platforms

CRMs and all-in-one tools that try to handle the full job lifecycle. Scored on breadth and depth across all categories. Examples: AccuLynx, JobNimbus, Jobber, ServiceTitan, Projul.

Specialized Tools

Tools that do one thing exceptionally well. Scored on how well they execute their specific function. Examples: CompanyCam, EagleView, Roofr, RoofSnap, HailTrace, Hover, iRoofing, Leap.

A specialized tool isn’t penalized for not having CRM features. CompanyCam doesn’t manage leads — and we don’t score it as if it should. It’s evaluated on photo documentation, which is what it’s designed to do. This is why CompanyCam (9.5) scores higher than AccuLynx (9.1) — it’s the best at what it does, even though AccuLynx does more things.

The Tier System: Gold, Silver, Bronze

Every RSG Score maps to a tier. The tiers tell you at a glance whether we recommend a product — and how strongly.

| Score | Tier | What it means |
| --- | --- | --- |
| 9.0 – 10.0 | RSG Gold | We recommend this product with confidence. It excels in its category and we’d use it ourselves. |
| 8.0 – 8.9 | RSG Silver | A strong product for the right use case. Good but not best-in-class — usually one or two categories hold it back. |
| 7.0 – 7.9 | RSG Bronze | Functional but with notable limitations. Usually overpriced for what you get, or built for a different industry and adapted for roofing. |

Products scoring below 7.0 are reviewed but don’t earn a tier badge. We won’t pretend a bad product is good just because we wrote about it.
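
In code terms, the tier assignment is just a threshold check against the cutoffs in the table. This sketch is illustrative only; the function name and the no-badge return value are ours, not part of any RSG tooling.

```python
def tier(score: float) -> str | None:
    """Map an RSG Score to a tier badge per the table above."""
    if score >= 9.0:
        return "RSG Gold"
    if score >= 8.0:
        return "RSG Silver"
    if score >= 7.0:
        return "RSG Bronze"
    return None  # reviewed, but no tier badge

print(tier(9.5))  # RSG Gold
print(tier(8.5))  # RSG Silver
print(tier(6.8))  # None
```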

Where Our Data Comes From

Every score is backed by research across multiple sources. We don’t base a score on a single demo or a marketing page.

Vendor websites — official pricing, feature documentation, and product updates
G2 and Capterra — aggregated user reviews, filtered for roofing-specific feedback
Industry publications — Roofing Contractor Magazine, RC&D, trade show coverage
User communities — roofing contractor forums, Facebook groups, Reddit
Vendor press releases — product updates, partnership announcements, pricing changes

90-Day Review Cycle

Software changes fast. A product that had terrible support six months ago might have hired a whole new team. A tool that was $50/month might now be $150/month. Scores based on stale data are worthless.

Every RSG Score is reviewed on a 90-day cycle. During each review cycle, we check for pricing changes, new feature releases, changes in user sentiment, and any major updates from the vendor. If something material has changed, we update the score. Every review displays a “Verified current” badge showing when it was last checked.

If you notice something we missed — a pricing change, a new feature, or a problem we didn’t catch — let us know. We’ll verify it and update the score within 24 hours if warranted.

What Scores Don’t Reflect


Personal preferences. Matt might prefer one interface style over another. The scoring categories are designed to be as objective as possible — they measure what can be measured, not what one person likes.

One-time problems. If a vendor had a server outage in January and fixed it in February, that doesn’t permanently tank their score. We evaluate the current state of the product during each 90-day cycle.

Questions about our methodology? Get in touch.