
A Few Thoughts on Upstream

I’m somewhat surprised I haven’t written anything about Project Upstream yet (though Bryn has). Rob has written about the topic numerous times recently (see here and here), which got me thinking about broker listing data and the future of the MLS.

I actually mostly agree with Rob’s vision of a modular system where brokers control their listing data and can easily grant access (or revoke it) to any vendor/partner.

That said, I’m not sure about the path to get there, since that world only exists if every vendor is connected to such a system. Let’s take Facebook as an example, because it works the same way: I can authenticate any service, grant it access to my Facebook data, and revoke that same access at any time with the click of a button. That scenario would not be possible if everyone didn’t already have a Facebook account (i.e., 10 years of growth to reach 1 billion plus users). What’s going to FORCE every broker onto a new tech system? What’s going to force every vendor to integrate with such a system/standard?
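The grant-and-revoke model the Facebook analogy describes can be sketched as a simple token registry. This is a minimal illustration only, with hypothetical names; it is not a real Upstream, Facebook, or MLS API:

```python
# Minimal sketch of a broker-controlled access registry, in the spirit of
# OAuth-style grants. All class and method names here are hypothetical.
import secrets


class ListingAccessRegistry:
    """Tracks which vendors a broker has granted listing-data access to."""

    def __init__(self):
        self._tokens = {}  # access token -> vendor name

    def grant(self, vendor: str) -> str:
        """Issue an access token for a vendor the broker has approved."""
        token = secrets.token_hex(16)
        self._tokens[token] = vendor
        return token

    def revoke(self, token: str) -> None:
        """Revoke a vendor's access 'with the click of a button'."""
        self._tokens.pop(token, None)

    def is_authorized(self, token: str) -> bool:
        """Check whether a presented token is still valid."""
        return token in self._tokens


registry = ListingAccessRegistry()
token = registry.grant("SomePortal")
print(registry.is_authorized(token))  # True while the grant stands
registry.revoke(token)
print(registry.is_authorized(token))  # False after revocation
```

The point of the sketch is that grant and revoke are trivial once every party checks the same registry; the hard part, as argued above, is getting every broker and vendor connected to that registry in the first place.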

Two questions that I see…

1. Sales Data?

This was brought up in the comments on Rob’s post. Who owns the sales data? What can all the various parties do — or not do — with that data?

Keep in mind, sales data is a matter of public record in most states. It certainly takes a long time to collect from public sources, but the data can be had (eventually). Thus, the portals are going to get sales data either way — they’d just prefer to have it sooner rather than later.

The topic of sales data leads me directly to the products that are created on top of that data, notably the likes of Cloud CMA and Top Producer.

2. Derivative Works?

Going back to the Facebook scenario we operate in today, the sites/companies/apps that I’ve authenticated with my Facebook data can’t store that data, meaning they can’t aggregate it or produce derivative works from it. Virtually none of the real estate market data products on the market could exist if they had to obtain buy-in from every single brokerage to deliver a compelling (and accurate) data product for local market X. Without complete data for a market, all the sales trends are largely useless. Can you imagine what it would take to get every real estate broker in a given market to grant access to sales data?

I’d wager the portals aren’t going to like, or agree to terms with, a world where they can’t produce derivative works (and own them completely). That’s (one of) the largest reason(s) they don’t use IDX data: if they did, they would be severely limited on any derivative works that used listing data as one of the inputs (like Zestimates), and they’d also have to follow 900 different sets of IDX display rules.

Can a modular system work?

Yes.

Is it without its fair share of challenges?

Absolutely not.

Will it become a reality?

The jury’s still out.

About Drew Meyers

Founder of Geek Estate Blog / Geek Estate Labs. Zillow Alum. Travel addict & co-founder of Horizon. Social entrepreneurship & microfinance advocate. Fan of Red Hot Chili Peppers and Kiva.

This entry was posted in Listings Syndication, MLS Technology.
