Every roofing software product we review gets an RSG Score from 1 to 10. It’s not a gut feeling. It’s not based on how nice the vendor’s sales rep was. It’s a structured evaluation across five categories that matter to contractors who actually use this stuff on job sites.
We built this system because we were tired of review sites that hand out 4.5 stars to everything. If every product is “great,” the rating is useless. Our scores have teeth. A 7.5 means we found real problems. A 9.0+ means we’d recommend it with confidence.
This page explains exactly how we calculate every score. No black box. No mystery algorithm. If you disagree with a score, you can point to the specific category and tell us why — and we’ll listen.
The Five Categories
Every product is evaluated across five categories. Each category is scored independently on a 1–10 scale, then combined into the overall RSG Score using weighted averages.
Ease of Use
Can your least technical crew member figure this out without a week of training? We evaluate onboarding time, interface clarity, mobile app quality, and how many clicks it takes to do the things you do 50 times a day. If a tool is powerful but takes a month to learn, that costs you real money.
Functionality
Does the tool do what it claims to do — and does it do it well? We evaluate feature depth, reliability, integration options, and whether the product actually delivers on its marketing promises. A CRM that crashes when you have 200 active jobs isn’t a CRM.
Value for Money
We don’t score cheapest = best. We score value — what you get for what you pay. A $300/month CRM that saves your office manager 15 hours a week is a better value than a $50/month tool that creates more work than it eliminates. We also factor in hidden costs: per-user fees, add-ons, report charges, and required integrations that aren’t included in the base price.
Customer Support
When your CRM goes down during a hailstorm and you’ve got 40 leads to follow up on, how fast can you get help? We evaluate response times, support channels (phone, chat, email, knowledge base), onboarding assistance, and what real users report about their support experiences on G2 and Capterra.
Roofing-Specific Fit
This is the category that separates us from every other review site. A generic field service tool can score 9/10 on Capterra and still be a 6/10 for roofers if it doesn’t understand roof measurements, material ordering from ABC or Beacon, insurance supplement workflows, or storm damage documentation. We weight this category the heaviest because it’s the entire reason this site exists.
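To make the weighted-average math concrete, here is a minimal sketch in Python. The category weights below are illustrative assumptions, not RSG’s published values — the methodology only states that the roofing-specific category is weighted heaviest:

```python
# Hypothetical weights for the five categories. These are assumed for
# illustration; the only stated constraint is that roofing-specific fit
# carries the most weight. Weights must sum to 1.0.
CATEGORY_WEIGHTS = {
    "ease_of_use": 0.20,
    "functionality": 0.20,
    "value": 0.15,
    "support": 0.15,
    "roofing_fit": 0.30,  # weighted heaviest per the methodology
}

def rsg_score(category_scores: dict[str, float]) -> float:
    """Combine per-category 1-10 scores into one weighted RSG Score."""
    total = sum(
        CATEGORY_WEIGHTS[cat] * category_scores[cat]
        for cat in CATEGORY_WEIGHTS
    )
    return round(total, 1)

# Example: a product strong on roofing fit but weaker on pricing.
print(rsg_score({
    "ease_of_use": 9.0,
    "functionality": 9.0,
    "value": 8.0,
    "support": 8.0,
    "roofing_fit": 10.0,
}))  # prints 9.0
```

Because the heaviest weight sits on roofing-specific fit, two products with identical generic-category scores can land in different tiers once that category is factored in.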
Two Scoring Tracks: Full Platforms vs. Specialized Tools
It’s not fair to score a CRM platform the same way you score a measurement app. AccuLynx and CompanyCam are both roofing software, but they do completely different things. So we use two scoring tracks:
Full Platforms
CRMs and all-in-one tools that try to handle the full job lifecycle. Scored on breadth and depth across all categories. Examples: AccuLynx, JobNimbus, Jobber, ServiceTitan, Projul.
Specialized Tools
Tools that do one thing exceptionally well. Scored on how well they execute their specific function. Examples: CompanyCam, EagleView, Roofr, RoofSnap, HailTrace, Hover, iRoofing, Leap.
A specialized tool isn’t penalized for not having CRM features. CompanyCam doesn’t manage leads — and we don’t score it like it should. It’s evaluated on photo documentation, which is what it’s designed to do. This is why CompanyCam (9.5) scores higher than AccuLynx (9.1) — it’s the best at what it does, even though AccuLynx does more things.
The Tier System: Gold, Silver, Bronze
Every RSG Score maps to a tier. The tiers tell you at a glance whether we recommend a product — and how strongly.
Products scoring below 7.0 are reviewed but don’t earn a tier badge. We won’t pretend a bad product is good just because we wrote about it.
Where Our Data Comes From
Every score is backed by research across multiple sources. We don’t base a score on a single demo or a marketing page.
G2 and Capterra — aggregated user reviews, filtered for roofing-specific feedback
Industry publications — Roofing Contractor Magazine, RC&D, trade show coverage
User communities — roofing contractor forums, Facebook groups, Reddit
Vendor press releases — product updates, partnership announcements, pricing changes
90-Day Review Cycle
Software changes fast. A product that had terrible support six months ago might have hired a whole new team. A tool that was $50/month might now be $150/month. Scores based on stale data are worthless.
Every RSG Score is reviewed on a 90-day cycle. During each review cycle, we check for pricing changes, new feature releases, changes in user sentiment, and any major updates from the vendor. If something material has changed, we update the score. Every review displays a “Verified current” badge showing when it was last checked.
If you notice something we missed — a pricing change, a new feature, or a problem we didn’t catch — let us know. We’ll verify it and update the score within 24 hours if warranted.
What Scores Don’t Reflect
Personal preferences. Matt might personally prefer one interface style over another. The scoring categories are designed to be as objective as possible — they measure what can be measured, not what one person likes.
One-time problems. If a vendor had a server outage in January and fixed it in February, that doesn’t permanently tank their score. We evaluate the current state of the product during each 90-day cycle.
Questions about our methodology? Get in touch.