Tuesday, January 29, 2008

Watching Google Maps Changes

http://mw1.google.com/staticfiles/gmre/index.html

Thursday, January 24, 2008

Emerging Tech You Should Know About: Jing

New tech tools are arriving at an astounding pace, and many are free. Adena Schutzberg explores Jing, a new image and video capture tool that might help you in your day-to-day professional (and personal) life. Jing makes it easy to record and demonstrate a particular workflow in a software product, for example, and eliminates the more tedious alternative of trying to explain the details in written form. It is a productivity tool that may have some quirks but gets the job done.

Monday, January 21, 2008

Building a (Simple) Mashup in 24 Hours

http://www.geekestateblog.com/building-a-simple-mashup-in-24-hours/

Building a (Simple) Mashup in 24 Hours
By Andrew Mattie, January 21, 2008

When Zillow released their neighborhood data last Wednesday night, I was floored. My company had talked about trying to acquire similar data off and on over the past two years, but the cost had always been prohibitive. As I thought about the magnitude of what they had done that night, I realized that, as a coding geek, I was practically required to see what I could do with the data.

I knew next to nothing about GIS before I was able to download the data on Thursday night, and had thus never heard of ArcGIS, ESRI, or a shapefile. I do know a good bit about Google Maps, though, and like any self-respecting software developer, I figured I could just do a bit of research and cobble together a quick demo showing what the data is all about. 24 hours later, I had a functioning demo; check it out here. Following is a simplified rundown of what I did to build it.

(Step 1) I did quite a bit of research to find out what the heck a shapefile even is. I found out that the ArcGIS software costs over $1,000 for a license, and since that was way too much money for me to spend, I had to find an alternative. I found this page that details a number of ways one can convert a shapefile to Google Earth's KML format, but I couldn't use those KML files because of the way I wanted to highlight the area on Google Maps with a filled-in polygon. I considered writing my own shapefile parser based on their technical specs, but it would have taken far more time than I was willing to invest in this little project. I also tried the Google Map Creator, which takes a shapefile, cuts up hundreds or thousands of Google map tiles based on the file, and then gives you all the code necessary to use it on your own site. It wasn't what I was looking for, though.

Ultimately, I found a little program called shp2text that I was able to run against the California shapefile zip I had downloaded. The data it gave me back was in tab-delimited spreadsheet format, and I was then able to easily move that data into our database. As an aside, the same site I found shp2text on also suggested Christine GIS as a free alternative to ArcGIS; I didn’t try it out though, so if you want to try it, do so at your own risk.
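To give a rough idea of what that import step looks like, here is a minimal sketch. The file name, the tab-delimited column order, and the table schema are assumptions for illustration only; shp2text's actual output may be laid out differently.

```typescript
// Sketch: load tab-delimited boundary points into a SQLite table.
// Assumes each row is "neighborhoodName<TAB>latitude<TAB>longitude";
// adjust the column indexes to match what shp2text actually emits.
import { readFileSync } from "fs";
import Database from "better-sqlite3";

const db = new Database("neighborhoods.db");
db.exec(`
  CREATE TABLE IF NOT EXISTS boundary_points (
    neighborhood TEXT,
    lat REAL,
    lng REAL
  )
`);

const insert = db.prepare(
  "INSERT INTO boundary_points (neighborhood, lat, lng) VALUES (?, ?, ?)"
);

const lines = readFileSync("california.txt", "utf8").split("\n");
for (const line of lines) {
  const [name, lat, lng] = line.split("\t");
  if (!name || !lat || !lng) continue; // skip blank or partial rows
  insert.run(name, parseFloat(lat), parseFloat(lng));
}
```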

(Step 2) I had to do some research to figure out how to convert the raw data I had into a format Google Maps could take in. While I could have just fed it all of the latitude / longitude pairs in order, some neighborhoods had more than a thousand different pairs of points defining the boundary. I was worried that if I fed it all of those points, it would be slow to load initially and then slow to map out in the browser. To combat this, Google created an encoding algorithm that condenses all of the data into a shortened form based on a somewhat standardized encoding format. Basically, they made something that acts like a zip file program for all of those latitude / longitude pairs.
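The idea behind that format (Google documents it as the encoded polyline algorithm) is simple: round each coordinate to 1e-5 degrees, store only the delta from the previous point, and pack the result into 5-bit chunks of printable ASCII. A minimal TypeScript sketch of an encoder, just to show the technique rather than the exact code I ended up using, looks roughly like this:

```typescript
// Encode one delta value: shift left one bit, invert if negative so the sign
// lands in the low bit, then emit 5-bit chunks (low bits first) offset by 63.
function encodeNumber(value: number): string {
  let v = value < 0 ? ~(value << 1) : value << 1;
  let out = "";
  while (v >= 0x20) {
    out += String.fromCharCode((0x20 | (v & 0x1f)) + 63); // continuation chunks
    v >>= 5;
  }
  out += String.fromCharCode(v + 63); // final chunk
  return out;
}

// Encode a whole boundary: each point is rounded to 1e-5 degrees and
// delta-encoded against the previous point, so long runs of nearby points
// compress into very few characters.
function encodePolyline(points: Array<[number, number]>): string {
  let prevLat = 0;
  let prevLng = 0;
  let encoded = "";
  for (const [lat, lng] of points) {
    const latE5 = Math.round(lat * 1e5);
    const lngE5 = Math.round(lng * 1e5);
    encoded += encodeNumber(latE5 - prevLat) + encodeNumber(lngE5 - prevLng);
    prevLat = latE5;
    prevLng = lngE5;
  }
  return encoded;
}

// The example from Google's documentation encodes to "_p~iF~ps|U_ulLnnqC_mqNvxq`@".
console.log(encodePolyline([[38.5, -120.2], [40.7, -120.95], [43.252, -126.453]]));
```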

I started coding my own implementation of the encoder, but coding and debugging it was taking more time than I thought it would. I looked around on the web for something that I could just plug right into my program, and I found a page called Encoding for Performance. The code was written by some guys in Australia and worked well for me in my initial tests (with modifications), so I used it. I ran their code against all of the latitude / longitude pairs that I had previously imported into our database. When it was finished running, I had a single entry in our database for every neighborhood in California that Zillow had made outlines for. Each of those entries in turn had the encoded latitude / longitude pairs stored next to them in the database.

(Step 3) Now that I had all of the data I needed in the form I needed it in, the rest was cake. I was able to throw together a quick page using a Google Map and a tree outline “control” that the brilliant guys over at Yahoo! had made for exactly this kind of purpose. I used my favorite JavaScript library to make AJAX calls whenever one of the neighborhoods was clicked. When the server receives one of those AJAX requests, it does a lookup in the database for the neighborhood that was clicked. The server then takes that information and sends off a request to Zillow’s GetDemographics API to see if they know anything more about that area. If they do have information, I add all of the charts they have for that area to the AJAX response and then send it back to the browser. The browser then takes that information, draws a polygon around the area, and displays all of Zillow’s charts for that area in a little info window on the map. Easy!
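For readers who want to picture that round trip, here is a hypothetical server-side sketch of the same flow in a Node/Express style. This is not the stack or the code used for the demo; the table and column names are made up, and the Zillow GetDemographics parameters are assumptions based on how that web service was generally called.

```typescript
// Hypothetical sketch: look up the clicked neighborhood's encoded boundary,
// ask Zillow's GetDemographics service about it, and return both to the
// browser in one JSON payload for the AJAX caller to render.
import express from "express";
import Database from "better-sqlite3";

const app = express();
const db = new Database("neighborhoods.db");

app.get("/neighborhood/:name", async (req, res) => {
  // 1. Fetch the polyline string encoded in step 2.
  const row = db
    .prepare("SELECT encoded_boundary FROM neighborhoods WHERE name = ?")
    .get(req.params.name) as { encoded_boundary: string } | undefined;
  if (!row) {
    return res.status(404).json({ error: "unknown neighborhood" });
  }

  // 2. Ask Zillow for demographics for that area (endpoint and parameters
  //    here are assumptions, not a verified copy of the API contract).
  const url =
    "http://www.zillow.com/webservice/GetDemographics.htm" +
    `?zws-id=${process.env.ZWSID}` +
    `&state=CA&neighborhood=${encodeURIComponent(req.params.name)}`;
  const demographicsXml = await (await fetch(url)).text();

  // 3. The browser decodes the boundary, draws the polygon, and shows the
  //    charts from the demographics response in an info window.
  res.json({ boundary: row.encoded_boundary, demographics: demographicsXml });
});

app.listen(3000);
```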

Anyway, I hope this just goes to show that you (agents / brokers / software developers / everyone else) don't need to write off something just because you don't know anything about it. I knew next to nothing about the details of what Zillow was freely offering, but I knew that I wanted to learn more. In the same way, I hope and expect that at least a few of Geek Estate's readers will want to run off now and create some mashups of the data in the way I just did. You (usually) don't need any money nowadays, but you do need time, patience, and a willingness to learn.

Author's Website: http://www.diversesolutions.com

Tuesday, January 1, 2008