How I review dating apps

Last updated: April 2026

Quick version: I sign up, use the app for 30 days in the city I'm writing about, log what happens, and write it up. This page explains exactly what I'm logging and why.

The longer version is that most dating app reviews in Australia are useless. Not because the people writing them are bad writers. Because they've never actually used the apps. They're working from press releases and App Store screenshots. That's not a review, it's a summary.

I started SwipeReport because I kept finding that gap. I'd be trying to work out whether Hinge had enough users in Brisbane to bother with, or whether RSVP's user base in Adelaide had thinned out, and there was just nothing. Either US-focused content that ignored Australian market dynamics entirely, or listicles from fashion magazines. Neither was any use.

The 30-day protocol

For every platform I review, I create a test profile with a consistent photo set and bio, and run it actively for 30 days in the city I'm writing about. Throughout that period I log every match, every message sent and answered, and every profile I flag as suspicious.

At the end I have numbers. “9 matches over 30 days, 28% response rate, 6 suspicious profiles” is something you can use. “Moderate activity” is not.
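
If you want the bookkeeping spelled out, here's a rough sketch in Python of how that month-end tally works. The field names and structure are illustrative only, not a description of the actual log I keep.

```python
# A rough sketch of the month-end tally; field names are illustrative.
from dataclasses import dataclass

@dataclass
class DayLog:
    matches: int = 0
    messages_sent: int = 0
    replies_received: int = 0
    suspicious_profiles: int = 0

def summarise(days: list[DayLog]) -> dict:
    sent = sum(d.messages_sent for d in days)
    replies = sum(d.replies_received for d in days)
    return {
        "matches": sum(d.matches for d in days),
        "response_rate": round(replies / sent, 2) if sent else 0.0,
        "suspicious_profiles": sum(d.suspicious_profiles for d in days),
    }

# Thirty DayLog entries go in; something like
# {"matches": 9, "response_rate": 0.28, "suspicious_profiles": 6} comes out.
```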

How I spot fake profiles

There's a pattern to them. I keep a checklist of red flags and mark a profile as suspicious when it hits two or more of them.

I'm not calling every flagged profile definitively fake. Some will be real users with odd behaviour. But 3 suspicious profiles in 30 days versus 11 tells you something real about a platform's moderation quality.
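
In code terms the rule is nothing more than a threshold. This sketch uses placeholder flag names, because the point here is the two-or-more rule, not the checklist itself:

```python
# Placeholder flag names; the rule is simply "two or more red flags = suspicious".
def is_suspicious(flags: dict[str, bool]) -> bool:
    return sum(flags.values()) >= 2

profile = {"red_flag_a": True, "red_flag_b": True, "red_flag_c": False}
print(is_suspicious(profile))  # True: two flags hit, so it goes in the suspicious count
```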

How I score platforms

Five categories, 100 points total.

Category | Weight | What I measure
Profile authenticity | 25 pts | Fake profile rate, verification quality, responsiveness to reports
Match quality | 25 pts | Relevance to stated intent, location accuracy, filter functionality
UX and features | 20 pts | Onboarding friction, messaging quality, mobile vs desktop parity
Pricing transparency | 15 pts | Clear pricing before signup, easy cancellation, no dark patterns
Safety features | 15 pts | Reporting tools, block functionality, privacy controls
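
The maths behind the final number is just a weighted sum out of 100. A minimal sketch, assuming each category gets a 0-to-1 sub-score (the sub-scores below are made up, not a real platform's):

```python
# Weights from the table above; they add up to 100.
WEIGHTS = {
    "profile_authenticity": 25,
    "match_quality": 25,
    "ux_and_features": 20,
    "pricing_transparency": 15,
    "safety_features": 15,
}

def overall_score(subscores: dict[str, float]) -> float:
    """subscores holds a 0.0-1.0 rating per category; the result is out of 100."""
    return round(sum(WEIGHTS[name] * subscores[name] for name in WEIGHTS), 1)

# Made-up example numbers:
print(overall_score({
    "profile_authenticity": 0.6,
    "match_quality": 0.7,
    "ux_and_features": 0.8,
    "pricing_transparency": 0.5,
    "safety_features": 0.9,
}))  # 69.5
```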

The affiliate thing

I earn a commission when you sign up through links on this site. That's how the site funds itself. I'm disclosing it upfront because pretending it doesn't exist is worse than acknowledging it.

It doesn't change the scores. I've given below-average scores to platforms I'm affiliated with when the 30-day data warranted it. I've recommended platforms where I earn nothing when they were the better option for what someone was looking for.

Where I've personally tested

I can't test every platform in every city. For Tier 1 cities (Sydney, Melbourne, Brisbane, Perth, Adelaide, Canberra, Gold Coast, Darwin) I've run the full 30-day protocol. For Tier 2 regional cities, I estimate platform density from Google Trends data and community feedback, and I say so clearly on those pages. For smaller regional areas, pages are labelled as data-driven estimates.

Being honest about this is more useful than implying everything has the same depth of testing when it doesn't.

Updates

Apps change constantly. Pricing changes, features get added or cut, user density shifts. I update reviews when the platform makes significant changes, when I get multiple reader reports that something's off, or when a year has passed since I last tested. Each review shows a last-tested date.

Questions about the methodology? Contact alex@swipereport.com.au