
Upstream & The National Broker Public Portal: Unearthing, Creating Super-MLS Data Layers

Project Upstream aims to become the defender and gatekeeper of brokers' real estate listings nationwide. The National Broker Public Portal intends to provide a nationwide display of data that puts brokers at the controls. The former is not intended to exclusively fuel the latter (though it seems like a match made in heaven; more on that later).

Upstream is not an MLS and will not replace the MLS’s co-broker, commission, and professional standards roles.  Its creators do not intend for it to be public-facing on its own.  It’s intended to feed many portals with standardized data.

Upstream and the National Broker Public Portal are projects about advertising online to the real estate consumer. Accurate data and fair display guidelines are important to brokers. Judging by the traffic numbers at certain portal sites, consumers don't seem to care nearly as much about those issues.

Brokers need to achieve their goals by creating their own vision of perfected data display, while simultaneously building an attractive platform that consumers will not only approve of, but proactively search out.

Differentiation on a level that consumers can easily and quickly understand will be key to gaining a significant market share at a price that’s attainable.

To differentiate in the consumer space, broker projects need unique data layers that are only available to brokers. They can create them by:

1) Reworking the proprietary data they already have, or

2) Generating entirely new layers of data that accompany any new listing that enters the marketplace through their gateway.

Unearthing another layer of current listing data

When we say the portals have “access to MLS listings”, we’re really talking about just a small subset of listing data.  Portal advertisers show the basic listing data, and surround it with other data from publicly available sources.

An Upstream-type gateway could require its members to grant broader access to their listing data fields in return for membership in this new, unique consortium of data providers. By allowing more fields in this single repository to be publicly displayed, a preferred portal that is permitted to show the data could claim a truly unique identity.

Just one example of data that might be leveraged for unique consumer content:  keybox history.  The preferred portal might be able to display:

  • The number of showings for a property
  • The rate of showings
  • The days in which the property gets the most activity
  • Aggregated showing data across neighborhoods and cities
  • Heat maps on Sunday traffic for house hunting
  • Which neighborhoods are trending upward
  • Which ones are lagging and offer better potential for price discounts

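To make the idea concrete, here is a minimal sketch of the kind of aggregation a portal could run over keybox access records. The record layout, field names, and sample data below are all hypothetical; real keybox systems store and expose their data differently.

```python
from collections import defaultdict
from datetime import date

# Hypothetical keybox access records: (listing_id, neighborhood, showing_date).
# This is illustrative sample data, not a real Upstream or keybox schema.
showings = [
    ("L1", "Ballard", date(2016, 5, 1)),
    ("L1", "Ballard", date(2016, 5, 8)),
    ("L2", "Ballard", date(2016, 5, 8)),
    ("L3", "Fremont", date(2016, 5, 15)),
]

def showings_per_listing(events):
    """Count keybox-recorded showings for each listing."""
    counts = defaultdict(int)
    for listing_id, _, _ in events:
        counts[listing_id] += 1
    return dict(counts)

def neighborhood_totals(events):
    """Aggregate showing activity by neighborhood."""
    totals = defaultdict(int)
    for _, neighborhood, _ in events:
        totals[neighborhood] += 1
    return dict(totals)

def busiest_weekday(events):
    """Return the weekday with the most showings (0=Monday .. 6=Sunday)."""
    by_day = defaultdict(int)
    for _, _, day in events:
        by_day[day.weekday()] += 1
    return max(by_day, key=by_day.get)
```

The same grouping pattern extends naturally to showing rates over time, neighborhood heat maps, and trend lines; the point is that this data already exists and only needs to be surfaced.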
That’s just one facet of listing data that’s currently behind a shroud.  We could think of another half dozen in an hour.  An initiative to expose this kind of data on a single website would have immediate consumer impact.

Creating a New Layer of Super-MLS Data

If an Upstream-style project really gained ground as the starting point for listing input, why not give the option, or potentially the requirement, that new listings on this data source add new fields that might not have existed at the MLS level before?

Imagine requiring every single listing entered on Upstream to include a legal description, a floor plan, a permit history, or a 3D model of the home. Think of the value of a Super-MLS layer of data that added the kind of consumer-viewable documents and features that can't be found anywhere else online and can't be reproduced through public data sources.
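A gateway could enforce those new fields programmatically at input time. The sketch below shows one way that validation might look; the field names are invented for illustration and do not reflect any actual Upstream schema.

```python
# Hypothetical required "Super-MLS" fields a gateway might enforce at
# listing-input time. These names are invented for illustration only.
REQUIRED_SUPER_MLS_FIELDS = [
    "legal_description",
    "floor_plan_url",
    "permit_history",
    "model_3d_url",
]

def missing_super_mls_fields(listing):
    """Return the required Super-MLS fields a listing has not supplied."""
    return [field for field in REQUIRED_SUPER_MLS_FIELDS
            if not listing.get(field)]

# A sample (incomplete) listing submission.
listing = {
    "address": "123 Main St, Seattle, WA",
    "legal_description": "Lot 4, Block 2 of the recorded plat",
    "floor_plan_url": "https://example.com/plans/123.pdf",
}
```

A gateway could reject or flag any submission where `missing_super_mls_fields` returns a non-empty list, making the new layer mandatory rather than optional.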

Of course these things would have legal and financial ramifications, but the point is not that the examples given are the specific answers.  The answer itself might not be apparent today, but the ability to provide a listing with a totally unique display layer, one that isn’t available to any other advertiser, is riveting.

Why would it work?  The company inputting the data has 1 million data entry staff members.  

Agents are the data creators.  Even a well-funded portal’s engineering army is a fraction of the size of the agent masses creating unique content on a property-by-property basis.

When the data gatekeeper becomes the de facto starting point for the industry, whatever new layers of proprietary data have been added to the listing input fields will become the standard of operation. The listings could still be fed to MLSs for B2B/IDX/co-brokerage functionality. Meanwhile, a single consumer portal allowed to be the sole national display vehicle for these new layers could hammer its opponents with this Super-MLS listing data.

Get To The Point Or Go Back To Work

The conversations we’ve had about portals selling our data back to us have been long on what brokers want.  They’ve always been short on how to realistically get those things in a consumer advertising market, which is far more important.

If brokers want to create better data, and display it in a superior way, they should focus first on who will pay attention. Ask what new things can be done with the proprietary data they already have. Add layers of new proprietary fields and make them a feature or a bonus for the agents who input them: an opportunity for their listings to stand out, not just more work. Create a reason for consumers to fund the project(s) through traffic going forward.

By unearthing hidden listing data and creating new Super-MLS layers, you might just have the kind of data that consumers can't get enough of and that a portal could benefit from financially in a big way. That's where the data creators start taking their leverage back.

About Sam DeBord

Sam DeBord is a former management consultant and web developer who writes for Inman News and REALTOR® Magazine. He is Managing Broker for Seattle Homes Group with Coldwell Banker Danforth, and 2016 President-Elect of Seattle King County REALTORS®. His team sells Seattle homes, condos, and Bellevue homes.

This entry was posted in Company News & Analysis, MLS Technology, Websites.
  • Exciting ideas Sam, it would be great if they can execute on some of these. Not sure why it has taken so long for Brokers to be having these ideas, but at least they are now.

    The only thing I disagree with is I think consumers care a lot about accurate data.

    It is very discouraging to see a home for sale that is no longer for sale, or see square footage that is not accurate.

    There are 4 reasons the portals get a lot of traffic.

    1. Consumers don’t realize the information is not as accurate as a direct MLS feed.

    2. The information is not too far off. In other words, if 50% or more of it were wrong, their traffic would drop, but perhaps the number is less than 10%, so they can get away with it to a degree.

    3. Their huge marketing budget drives a lot of traffic, not to mention their great search engine placement.

    4. They produced a clean, easy-to-use website with a lot of information.

    • That’s really it, Bryn, consumers can’t tell clearly enough that it’s inaccurate, so they just keep doing what they’re doing. Good point.

  • Chris Doyle

    So when is this going to happen? The current situation is less than optimal.
