
Building a (Simple) Mashup in 24 Hours

When Zillow released their neighborhood data last Wednesday night, I was floored. My company had talked off and on over the past two years about trying to acquire similar data, but it was always cost prohibitive. As I thought about the magnitude of what they had done that night, I realized that it was practically required for me as a coding geek to see what I could do with the data.

I knew next to nothing about GIS before I was able to download the data on Thursday night, and had thus never heard of ArcGIS, ESRI, or a shapefile. I do know a good bit about Google Maps though, and like any self-respecting software developer, I figured I could just do a bit of research and cobble together a quick demo showing what the data is all about. 24 hours later, I had a functioning demo; check it out here. What follows is a simplified rundown of how I built it.

(Step 1) I did quite a bit of research to find out what the heck a shapefile even is. I found out that the ArcGIS software costs over $1000 for a license, and since that was way too much money for me to spend, I had to find an alternative. I found this page that details a number of ways one can convert a shapefile to Google Earth’s KML format, but I couldn’t use KML files because of the way I wanted to highlight each area on Google Maps with a filled-in polygon. I considered writing my own shapefile parser based on the published technical specs, but it would have taken far more time than I was willing to invest in this little project. I also tried The Google Map Creator, which takes a shapefile, cuts it up into hundreds or thousands of Google map tiles, and then gives you all the code necessary to use them on your own site. It wasn’t what I was looking for though.

Ultimately, I found a little program called shp2text that I was able to run against the California shapefile zip I had downloaded. The data it gave me back was in tab-delimited spreadsheet format, and I was then able to easily move that data into our database. As an aside, the same site I found shp2text on also suggested Christine GIS as a free alternative to ArcGIS; I didn’t try it out though, so if you want to try it, do so at your own risk.
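
For anyone wanting to do something similar, here’s roughly what my import step can look like. This is only a hedged sketch, not my exact code: I’ve used Node-style JavaScript for brevity, the file name is made up, and the assumed column layout (a name column followed by alternating longitude / latitude values) may not match your own shp2text output, so inspect it first.

```javascript
// CAVEAT: the column layout assumed here -- a neighborhood name followed
// by alternating longitude / latitude values -- is an illustration only;
// check your actual shp2text output. The file name is made up too.
var fs = require('fs');

var lines = fs.readFileSync('ZillowNeighborhoods-CA.txt', 'utf8').split('\n');
var neighborhoods = [];
for (var i = 0; i < lines.length; i++) {
  var cols = lines[i].split('\t');
  if (cols.length < 3) continue; // skip blank or malformed rows
  var points = [];
  for (var j = 1; j + 1 < cols.length; j += 2) {
    // Store as [lat, lng] pairs, ready for the encoding step in Step 2.
    points.push([parseFloat(cols[j + 1]), parseFloat(cols[j])]);
  }
  neighborhoods.push({ name: cols[0], points: points });
}
// Each record can now be INSERTed into the database, one row per neighborhood.
```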

(Step 2) I had to do some research to figure out how to convert the raw data I had into a format Google Maps could take in. While I could have just fed it all of the latitude / longitude pairs in order, some neighborhoods had more than a thousand different pairs of points to define the boundary. I was worried that if I fed it all of those points, it would be slow to initially load and then slow to map out in the browser. To combat exactly this problem, Google created an encoding algorithm that condenses all of the data into a shortened form based on a somewhat standardized encoding format. Basically, they made something that acts like a zip program for all of those latitude / longitude pairs.
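
For the curious, here’s a minimal sketch of that scheme in JavaScript: round each coordinate to five decimal places, delta-encode it against the previous point, then pack the result into 5-bit ASCII chunks. The function names below are my own, not the code I ended up using.

```javascript
// A sketch of Google's documented encoded polyline format.
function encodeSignedNumber(num) {
  // Shift left and invert negatives so the sign lives in the low bit.
  var sgn = num << 1;
  if (num < 0) sgn = ~sgn;
  var encoded = '';
  // Emit 5-bit chunks, least significant first, with 0x20 as the
  // "more chunks follow" flag, offset by 63 into printable ASCII.
  while (sgn >= 0x20) {
    encoded += String.fromCharCode((0x20 | (sgn & 0x1f)) + 63);
    sgn >>= 5;
  }
  return encoded + String.fromCharCode(sgn + 63);
}

// points: array of [lat, lng] pairs; returns the encoded string.
function encodePolyline(points) {
  var result = '';
  var prevLat = 0, prevLng = 0;
  for (var i = 0; i < points.length; i++) {
    // Round to five decimal places and encode only the delta from the
    // previous point -- the deltas are small, so the output stays compact.
    var lat = Math.round(points[i][0] * 1e5);
    var lng = Math.round(points[i][1] * 1e5);
    result += encodeSignedNumber(lat - prevLat) + encodeSignedNumber(lng - prevLng);
    prevLat = lat;
    prevLng = lng;
  }
  return result;
}
```

The real thing also emits a companion “levels” string that tells the map which points to keep at each zoom level; I’ve left that out here for brevity.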

I started coding my own implementation of the encoder, but coding it and debugging it was taking more time than I thought it would. I looked around on the web for something that I could just plug right into my program, and I found a page called Encoding for Performance. The code was written by some guys in Australia and worked well for me in my initial tests (with modifications), so I used it. I ran their code against all of the latitude / longitude pairs that I had previously imported into our database. When it was finished running, I had a single entry in our database for every neighborhood in California that Zillow had made outlines for. Each of those entries in turn had the encoded latitude / longitude pairs stored next to them in the database.
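
Tying the two sketches above together, the batch job boils down to something like this. saveToDatabase is a hypothetical stand-in for the actual INSERT, and a real run would also store the levels string the encoder produces.

```javascript
// Encode each neighborhood's outline once and store one row per
// neighborhood, with the encoded string alongside the name.
// encodePolyline is the encoder sketched above; saveToDatabase is a
// hypothetical stand-in for the real INSERT statement.
for (var i = 0; i < neighborhoods.length; i++) {
  saveToDatabase({
    name: neighborhoods[i].name,
    encodedPoints: encodePolyline(neighborhoods[i].points)
  });
}
```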

(Step 3) Now that I had all of the data I needed in the form I needed it in, the rest was cake. I was able to throw together a quick page using a Google Map and a tree outline “control” that the brilliant guys over at Yahoo! had made for exactly this kind of purpose. I used my favorite JavaScript library to make AJAX calls whenever one of the neighborhoods was clicked. When the server receives one of those AJAX requests, it does a lookup in the database for the neighborhood that was clicked. The server then takes that information and sends off a request to Zillow’s GetDemographics API to see if they know anything more about that area. If they do have information, I add all of the charts they have for that area to the AJAX response and then send it back to the browser. The browser then takes that information, draws a polygon around the area, and displays all of Zillow’s charts for that area in a little info window on the map. Easy!
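
Here’s a hedged sketch of what that click-to-highlight flow looks like. The jQuery-style $.getJSON call stands in for whichever library you prefer (I’m not saying which one I used), the /neighborhood endpoint and the encodedPoints / encodedLevels / charts response fields are hypothetical names for whatever your server returns, and GPolygon.fromEncoded / openInfoWindowHtml are the Google Maps API v2 calls of the era.

```javascript
// map is assumed to be an existing GMap2 instance on the page.
function onNeighborhoodClick(neighborhoodId) {
  $.getJSON('/neighborhood', { id: neighborhoodId }, function (data) {
    // Build the filled polygon directly from the encoded string stored in
    // the database -- the browser never sees thousands of raw pairs.
    var polygon = GPolygon.fromEncoded({
      polylines: [{
        points: data.encodedPoints, // encoded polyline string
        levels: data.encodedLevels, // per-point zoom levels
        numLevels: 18,
        zoomFactor: 2,
        color: '#0000ff',
        weight: 2,
        opacity: 0.8
      }],
      fill: true,
      color: '#0000ff',
      opacity: 0.3,
      outline: true
    });
    map.addOverlay(polygon);

    // Show Zillow's charts for the area in an info window on the map.
    var html = '';
    for (var i = 0; i < data.charts.length; i++) {
      html += '<img src="' + data.charts[i] + '" />';
    }
    map.openInfoWindowHtml(polygon.getBounds().getCenter(), html);
  });
}
```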

Anyway, I hope this just goes to show that you (agents / brokers / software developers / everyone else) don’t need to write off something just because you don’t know anything about it. I knew next to nothing about the details of what Zillow was freely offering, but I knew that I wanted to learn more. In the same way, I hope and expect that at least a few of Geek Estate’s readers will want to run off now and create some mashups of the data in the way I just did. You (usually) don’t need any money nowadays, but you do need time, patience, and a willingness to learn.

  • As a fellow coder, I must applaud you. This is a very nice demo with lots of potential. Well done! The world is a better place with free APIs and open source standards. Glad you could work around that $1000 license.

  • Love it — well done Andrew!

  • Andrew, it is amazing what you were able to do in 24 hours, including all of the research into these open source tools. Wow.

  • Pingback: A Mashup with Zillow Neighborhood Shapefiles and the Zillow API - Zillow® Blog - Real Estate News and Analysis

  • Wow – very impressive for just 24 hrs!

  • 24 hours uninterrupted…nice…get some sleep 🙂 Such neighborhood data would be very applicable to city websites looking to provide more consumer-centric data to prospective and current residents. http://www.GlendaleAZ.com (host city of the 2008 Super Bowl) could use more GIS, neighborhood-centric data.

  • What did you think of the topological quality or accuracy of the data?

  • Heh, I never said 24 hours uninterrupted! I started working with the data on Thursday night, went to work on Friday, and had everything pretty much wrapped up by the end of Friday night. Ergo, that’s 24 hours to me.

    Like I said, I’m certainly not a GIS guy, so I don’t know that I’m the one to ask regarding the quality or accuracy of the data. All I know is that they effectively provided coordinates for polygon outlines, and I was able to draw them on a map. 🙂 On a subjective level, it seems very accurate for the neighborhoods around where I live (Irvine, CA in Orange County). I’m not sure how one would measure the data objectively though. As for topographical data, I don’t believe the shapefiles they provided represent anything in 3D space (only 2D).

  • Very cool. This is something that a lot of people would be interested in – especially as mashups tend to be built by people with tons of time and “geekspertise”. It’s nice to see articles explaining how it can be done more easily. 🙂 Great job!

  • Man… Great Work!!!

    I am amazed at the ability people have to pick something up like this and run with it to a finished product!!

    Giles

  • Pingback: Now Available: Google Street View via API | GeekEstate Blog

  • Pingback: Zillow’s API Expands the Reach of Real Estate Data

  • Jack Smith

    Good work Andrew. I followed your guide to essentially replicate what you did so I can tailor it to suit my needs. However, I couldn’t figure out how you got the hierarchical data for state > county > city > neighborhood, or how you mapped the neighborhood ID in the Zillow shapefile to this hierarchy. Can you talk about that? Thanks.

  • @Jack: I didn’t actually structure the data hierarchically. Instead, I just created a quick SQL query that loaded all of the data from a single table in our database. I then processed that resulting data in a few lines of application layer code and turned it into a string representing a JavaScript object that I wrote to the page when the page is first loaded. Once that JSON object is fed to Yahoo’s treeview control, it’s visible as the hierarchy that you see in the demo above.
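
    Roughly, the conversion looks like the sketch below. The row fields and node shape are illustrative, not my exact code.

```javascript
// Turn flat { state, county, city, neighborhood, id } rows from the
// single-table query into the nested label/children node definitions
// that a treeview control can consume.
function buildTree(rows) {
  var root = {};
  for (var i = 0; i < rows.length; i++) {
    var r = rows[i];
    // Create each branch the first time we see it, then nest under it.
    var state = root[r.state] = root[r.state] || {};
    var county = state[r.county] = state[r.county] || {};
    var city = county[r.city] = county[r.city] || {};
    city[r.neighborhood] = r.id;
  }
  // Convert the nested maps into { label, children } node definitions.
  function toNodes(obj) {
    var nodes = [];
    for (var key in obj) {
      nodes.push(typeof obj[key] === 'object'
        ? { label: key, children: toNodes(obj[key]) }
        : { label: key, neighborhoodId: obj[key] });
    }
    return nodes;
  }
  return toNodes(root);
}
```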

  • Great work Andrew!

  • Gr8 work Andrew!
    I tried to follow the same steps, but when I tried shp2text, it gave me an error which said “unable to open shape file”. Any idea why this is happening?

    Thx.

  • This is great!! I'd love to tinker around with this project. Could you make the database and scripts available for download? 🙂

  • hudsonbar

    I am to submit a report on this niche; your post has been very, very helpful.

    Regards

  • Sam

    Sadly, almost none of these links work. But very cool.
