What’s In Your Database?
Given that Virgent is a technology-driven brokerage (and that we're all huge data nerds), it should be no surprise that we collect a lot of data about our transactions, covering everything from listing basics to excruciating detail on showing scheduling and offers.
We knew abstractly that all of this data was valuable, but it was only recently that we started playing around with it to see what we could learn. The results were fascinating and educational, and, most importantly, they told us a lot about our business. Here are some highlights (all data points are averages):
List Price: $558,911
Days On Market: 29.6
Sale Price To Valuation Ratio: 97.0%
Valuation Price: $321,669
Time between valuation request and delivery: 9.6 Hours
Homeowner response time to showing requests: 333.09 seconds (about 5.5 minutes)
Showings per sold house: 13.0
Showings that receive buyer feedback: 44.6%
Extra cash sellers walk away with using Virgent: $11,230
Now sure, you could go on the MLS or Zillow and use a calculator to figure out some of these data points, but much of this you could only get by mining your own proprietary database.
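If your transaction data lives in a relational database, most of these figures fall out of a few aggregate queries. Here's a minimal sketch in PostgreSQL-flavored SQL; the listings table and every column name are made up for illustration, not our actual schema:

    -- Hypothetical table and columns; adjust to your own schema.
    -- In PostgreSQL, subtracting one date from another yields whole days.
    SELECT AVG(list_price)                            AS avg_list_price,
           AVG(contract_date - list_date)             AS avg_days_on_market,
           AVG(sale_price::numeric / valuation_price) AS sale_to_valuation_ratio
    FROM listings
    WHERE status = 'sold';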
So how do we look at this data through a business-building lens? Some data points are pretty self-explanatory: being able to prove that we make home sellers an extra $11,230 on their home sale is pretty powerful.
Some insights are less obvious, but still useful. For example, our average list price is more than $200k above our average valuation price. This implies that while homeowners with lower-priced homes are interested in our service, we're converting better among homeowners with higher-end homes, so we have some work to do to connect with the former group.
We can also do better at securing feedback after showings on our listings when our automated systems don't capture it. 44.6% is very good compared to industry averages, but our sellers want to hear from 100% of the buyers who see their home, so there's still room to improve.
Given our culture of transparency, we publish all of our important stats, like the ones above, unedited to the relevant parts of our platform, so our current and potential clients have unfiltered access to our track record.
That type of transparency isn't for every company, but whether or not you choose to share the results, you should consider diving into your own database to see what you can learn and how you can improve. You never know what you'll find.
Bryn Kaufman
Posted at 16:14h, 18 April
Ben, thanks for sharing that interesting data.
Regarding the Days on Market, I'm not sure how you or your MLS track that, but I just posted on Facebook about our MLS Days on Market not being accurate, so in our case it's not really something to pay too much attention to.
http://www.oahure.com/
Would you mind sharing how you came up with the calculation for the extra cash sellers walk away with? That's a great stat, but I'm not sure how you can accurately calculate it.
Ben Kubic
Posted at 06:21h, 19 April
The MLS uses days between listing and under contract to calculate DOM. For comparison's sake we do the same, but we can easily calculate list date to closing date as well.
While DOM really matters at the neighborhood or micro-market level, we watch the top-level number because it tells us two things: whether we're doing something wrong compared to broad market averages (we're more than twice as fast right now), and whether the market is slowing down or speeding up. Is there a real difference between 29.6 and 31.2 for an individual sale? No, but at a macro scale it could say something.
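In query terms, the two definitions differ only in the end date. A quick sketch, with illustrative table and column names rather than our actual schema:

    -- dom_to_contract: list date to under contract, as the MLS calculates it.
    -- dom_to_close: the same listings measured all the way to closing.
    SELECT AVG(contract_date - list_date) AS dom_to_contract,
           AVG(closing_date - list_date)  AS dom_to_close
    FROM listings
    WHERE status = 'sold';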
As for extra cash – we charge a flat listing commission of $5,000, so it’s a fairly straightforward calculation. What’s more challenging is what we’re working on now, which is a PPSQFT comparison of our solds vs. comps to see where we stand on pricing. That’s going to take more finesse, but it’s an interesting and useful exercise.
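To give a feel for the shape of the math (simplified, and assuming a 3% traditional listing-side commission purely for illustration, since rates vary):

    -- Illustrative only: compares a hypothetical 3% listing-side commission
    -- against our flat $5,000 fee on each sold listing.
    SELECT AVG(sale_price * 0.03 - 5000) AS avg_extra_cash
    FROM listings
    WHERE status = 'sold';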
Bryn Kaufman
Posted at 12:11h, 19 April
Glad to hear your MLS does it right. Hopefully mine will change and do it right too in the future.
Regarding PPSQFT, that seems like it could be influenced by things outside your control, so it's perhaps not representative of anything significant.
For example, take two listings with the same square footage. Your listing is in average condition; the other is in excellent condition. Your listing will obviously sell for less per square foot, but that in no way reflects negatively on you; it was just the listing you got. The same goes for the agent selling the excellent listing: they aren't better than you because they got a higher PPSQFT, they just got a listing in better condition.
I realize this might wash out with enough data, but I would still be concerned about this number.
Ben Kubic
Posted at 12:16h, 19 April
Yeah, that's what makes the endeavor interesting. It may not be feasible without massive numbers to wash out condition differences, etc., but we're taking a look.
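The naive version is simple enough to write; the hard part is exactly the condition problem you describe. A rough sketch, again with made-up table and column names:

    -- Naive PPSQFT: our solds vs. other solds, grouped by zip code.
    -- Deliberately ignores condition, which is the weakness discussed above.
    SELECT zip_code,
           AVG(sale_price::numeric / sqft) FILTER (WHERE is_our_listing)     AS our_ppsqft,
           AVG(sale_price::numeric / sqft) FILTER (WHERE NOT is_our_listing) AS comp_ppsqft
    FROM sold_listings
    GROUP BY zip_code;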
graymoment
Posted at 20:36h, 18 April
I would be interested to read a post comparing database options. What are you using for your database backend?
Bryn Kaufman
Posted at 20:37h, 18 April
Just FYI, I use MySQL.
Bryn Kaufman
Posted at 20:42h, 18 April
I should mention we also have our website tied into Pipedrive, which is working really well for us, but the heavy database work is MySQL with PHP.
Ben Kubic
Posted at 06:22h, 19 April
Our platform is built mostly in Ruby on Rails with a PostgreSQL database. I'd be happy to do a post comparing database options, but I've found most production databases to be functionally similar for day-to-day purposes.