How to choose the optimal Bing maps route? - php

I want to use this script to work out journey distance. Simple enough. Except this returns a different answer from Bing Maps itself. Okay, I understand that. But what I'd like is a way of picking the "best" one. How do I define best? Well the one that Bing would choose. For example, in this case, it's 11.9 miles (and that's the correct one).
However the script thinks it's 182 miles - which is true if it uses a different Shuttleworth/Bedford.
How do I get it to give me the one that Bing would pick?
For the purpose I'm using this script I can't have an "option to choose", it needs to decide for itself.
<?php
// urlencode keeps the query string valid for multi-word place names.
$from = urlencode("Bedford");
$to = urlencode("Shuttleworth");
$load = "http://dev.virtualearth.net/REST/v1/Routes?wp.1=$from&wp.2=$to&key=MYAPIKEY&distanceUnit=mile";
$data = file_get_contents($load);
echo $data;
?>

I'm not going to get a Bing API key in order to test this, so don't get terribly excited until you test it yourself.
I suspect that what is going on is that Bing is using what it thinks is your current location to determine what Bedford and Shuttleworth mean. So if it thinks you are in England, you get Bedford and Shuttleworth in England, and if you are in some other part of the world that has a Bedford and a Shuttleworth, it picks those. I'm in Texas and I got the 11.9 mile route you want. So maybe Bing is smart enough to know that in every part of the world except near the 182-mile-apart Bedford and Shuttleworth, people really are looking for the English ones. My guess is that the server where your code runs is near the wrong Bedford and Shuttleworth.
So I was looking at the API, and the Bing API User Context Parameters page documents the userLocation parameter, where you can put in your latitude and longitude to help Bing make decisions. Try adding that parameter to your query with your actual latitude and longitude, or, if that fails, try putting in one for Austin, TX, USA (where I am) or one that is smack dab between the Bedford and Shuttleworth you are interested in.
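A sketch of the query with userLocation added (the coordinates here are a guess roughly midway between the English Bedford and Shuttleworth, and MYAPIKEY is still a placeholder):

```php
<?php
// Sketch: bias Bing's geocoding with the userLocation context parameter
// ("latitude,longitude"). Coordinates and key below are placeholders.
function buildRouteUrl(string $from, string $to, string $userLocation, string $key): string
{
    return "http://dev.virtualearth.net/REST/v1/Routes"
        . "?wp.1=" . urlencode($from)
        . "&wp.2=" . urlencode($to)
        . "&userLocation=" . urlencode($userLocation)
        . "&distanceUnit=mile&key=" . urlencode($key);
}

$url = buildRouteUrl("Bedford", "Shuttleworth", "52.10,-0.45", "MYAPIKEY");
// echo file_get_contents($url); // uncomment once you have a real key
echo $url;
?>
```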

Related

Road coordinate caching (breaking Google's T&Cs?)

We have sets of rough coordinates for a few hundred road routes across the UK.
The waypoints along the routes are often 50 metres apart. This means that when we draw a line through the coordinates (our software limits us to straight lines) they sometimes cut across roads, buildings etc.
The plan is to create a PHP script that will run the coordinates through something which will return close, nicely placed road coordinates and insert them into our database, essentially replacing our spread out, 50 metre apart coordinates.
The only technology we've found that can do this is the Google Maps Directions API. If we pass the waypoints along the route, Google can in return give us a perfect road route with the coordinates of each step on a particular leg/straight of the route.
We'd go ahead and do this now if we weren't uncertain about this being allowed.
https://developers.google.com/maps/faq
We have read through the Google Maps FAQ and can't find anything about caching the road coordinates. We don't want to do anything that would breach the terms of service as our application heavily relies on other Google APIs and Google Maps itself.
How should we continue? If this is breaking Google's terms of service, couldn't we just randomize the coordinates they return slightly so they're not the same? How could it be proved we've broken them then?

Use Steamworks API to pull competitive game scores?

I have a dilemma that I need to figure out.
So I am building a website where people can go watch a competitive game (such as Counter-Strike: Global Offensive), perhaps using either a Twitch TV stream, or actually through the matchmaking streaming services that the game may offer (in the case of this example, CS:GO TV). While the match is being played, members can place "bets" on which teams will win, using some form of credits with no real value. Of course, the issue here is that the site will need to be able to pull the score from the game and update in real time. So, sticking with the example of CS:GO, is there a portion of the Steamworks API that would allow for real-time pulling of a game's score, through some kind of PHP or JavaScript method?
I'm sorry to tell you that you can't, for now.
The API wiki entry for CS:GO Competitive Match Information says:
It would be interesting to be able to find out competitive match information -- exactly like what DOTA 2 has. It could contain all the players in the map, with their steamids and competitive ranks, the score at half time/full time. There are probably a few more bits of info that could also be included. Pigophone2 16:54, 14 September 2013 (PDT)
To answer your question, there is no Steam developed API that does this.
However many websites still do exactly what you are looking for.
My guess is that they use a regularly updated script which parses websites like ESEA and ESL and pulls data about those matches. After all, they are the ones who host almost all the big games that people care about.
You'll need to keep up-to-date with private leagues though, as they don't typically publish live stats in an easily parse-able format. GOSU Gamers can help you track any new players that come to the big-league table.

most efficient way of calculating nearest city (from whitelist)

I have a whitelist of cities. Let's say Seattle, Portland, Salem. Using GeoIP, I'd detect the user's city. Let's call it $user_city. Based on $user_city, I want to display classified listings from the nearest city on my whitelist (Seattle || Portland || Salem) within 140 miles. If no whitelisted city is within 140 miles, I'd just show a drop-down and ask the user to manually select a city.
There are a few ways of doing this:
calculate this on the fly (I found an algorithm in one of SO answers)
with help of DB (let me explain):
create a table called regions
regions will have
city 1 | city 2 | distance (upto 140 miles)
city 1= cities from whitelist
city 2= any city within 140 miles from city 1
This would create a reasonably sized table. If my whitelist has 200 cities and there are 40 cities (or towns) within 140 miles of each, that would create 8,000 rows.
Now, when a user comes to my site:
1) I check if user is from whitelist city already (city 1 column). If so, display that city
2) If not, check if $user_city is in the "city 2" column
2a) if it is, get whitelist city with lowest distance
2b) if it is not, display drop-down for manual input
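The lookup above can be sketched in plain PHP. Here $regions stands in for the precomputed "regions" table (nearby city mapped to a list of [whitelist city, distance] pairs); all names and distances are made up:

```php
<?php
// Sketch of the whitelist lookup; a real version would query the
// precomputed "regions" table instead of an in-memory array.
function nearestWhitelisted(string $userCity, array $whitelist, array $regions): ?string
{
    if (in_array($userCity, $whitelist, true)) {
        return $userCity;                       // 1) already a whitelist city
    }
    if (isset($regions[$userCity])) {           // 2) found in the "city 2" column
        $rows = $regions[$userCity];
        usort($rows, fn($a, $b) => $a[1] <=> $b[1]);
        return $rows[0][0];                     // 2a) lowest distance wins
    }
    return null;                                // 2b) fall back to the drop-down
}

$whitelist = ["Seattle", "Portland", "Salem"];
$regions   = ["Tacoma" => [["Portland", 110], ["Seattle", 33]]];
echo nearestWhitelisted("Tacoma", $whitelist, $regions); // Seattle
?>
```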
Final constraint: whichever method we select, it has to work from within an iFrame. I mean, can I create this page on mysite1.com and embed it inside someothersite2.com in an iframe? Will it still be able to get $user_city and find the nearest whitelisted city? I know there are some cross-domain scripting rules, so I am not sure if the iFrame would be able to get the user's IP address, pass it to GeoIP, and resolve it to $user_city.
So, my question:
How best to do this? If a lot of people embed my page in their page (using iframe) then my server would get pounded 10000s of times per second (wishful thinking, but let's assume that's the case). I don't know if a DB would be able to handle so much pounding. I don't want to have to pay for more DB servers or web-servers. I want to minimize resource-requirement at my end. So, I don't mind offloading a bit of work to user's browser via JavaScript.
EDIT:
Some answers have recommended storing lat/long and then doing the math. The reason I suggested creating a 'regions' table is that this way all the math is precomputed: if I have a whitelist of cities and I precompute every possible nearby city for each whitelisted city, then I don't have to compute distance (using the haversine formula, e.g.) every time.
Is it possible to offload all of this to the user's browser via some crafty use of JavaScript? I don't want to overload my server for a free service. It might make money, but I am very close to broke and I am afraid my server would go down before I make enough money to pay for the upgrades.
So, the three constraints of this problem are: 1) should work from inside an iframe (I am hoping this will go viral and every blogger will want to embed my site into their page's iframe); 2) should be very fast; 3) should minimize load on my server.
Use one table City and do a MySQL math calculation for every query, with the addition of a cache layer, e.g. memcache. Fair performance and very flexible!
Use two tables, City (id,lat,lng,name) and Distance (city_id1,city_id2,dist), and get your result by a traditional JOIN. (Could use a cache layer too.) Not very flexible.
Custom data structure: CityObj (id,lat,lng,data[blob]). Just serialize and compress a PHP array of the cities and store it. This might raise your eyebrows, but as we know the bottleneck is rarely CPU or memory, it's disk IO. This is one read from an index on an INT, as opposed to the JOIN, which uses a tmp table. It is not very flexible but will be fast and scalable, and easy to shard and cluster.
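The third option might look roughly like this (the city data is made up; gzcompress/gzuncompress come from PHP's standard zlib extension):

```php
<?php
// Sketch of the CityObj idea: the whole city list serialized and
// compressed into one blob, read back with a single indexed lookup.
$cities = [
    ["id" => 1, "lat" => 47.6062, "lng" => -122.3321, "name" => "Seattle"],
    ["id" => 2, "lat" => 45.5152, "lng" => -122.6784, "name" => "Portland"],
];
$blob = gzcompress(serialize($cities));        // store this in CityObj.data
$restored = unserialize(gzuncompress($blob));  // one read, then pure PHP
echo $restored[1]["name"]; // Portland
?>
```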
Is it possible to offload all of this to the user's browser via some crafty use of JavaScript? I don't want to overload my server for a free service. It might make money, but I am very close to broke and I am afraid my server would go down before I make enough money to pay for the upgrades.
Yes, it is possible...using Google Maps API and the geometry library. The function you are looking for is google.maps.geometry.spherical.computeDistanceBetween. Here is an example that I made a while ago that might help get you started. I use jQuery here. Take a look at the source to see what's happening and modify as needed. Briefly:
supplierZips is an Array of zip codes comparable to your city whitelist.
The first thing I do on page load is geocode the whitelist locations. You can actually do this ahead of time and cache the results, if your city whitelist is constant. This'll speed up your app.
When the user enters a zip code, I first check if it's a valid zip from a JSON dataset of all valid zip codes in the U.S. ( http://ampersand.no.de/maps/validUSpostalCodes.json, 352 kb, data generated from zip code data at http://www.geonames.org).
If the zip is valid, I compute the distance between that zip and each location in the whitelist, using the aforementioned computeDistanceBetween in the Google Maps API.
Hope this helps get you started.
You just have to get the lat and long of each city and add it to the database.
So every city has only one record; no distances are stored, just its position on the globe.
Once you have that, you can easily do a query using the haversine formula ( http://en.wikipedia.org/wiki/Haversine_formula ) to get the nearest cities within a range.
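For reference, a minimal PHP haversine, if you'd rather compute the distance in application code than in SQL (3959 is the Earth's mean radius in miles):

```php
<?php
// Great-circle distance in miles between two lat/long pairs.
function haversineMiles(float $lat1, float $lon1, float $lat2, float $lon2): float
{
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = sin($dLat / 2) ** 2
       + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * sin($dLon / 2) ** 2;
    return 2 * 3959 * asin(sqrt($a));
}

// Seattle to Portland, roughly 145 miles great-circle.
echo round(haversineMiles(47.6062, -122.3321, 45.5152, -122.6784));
?>
```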
I know there are some cross-domain scripting rules so I am not sure if iFrame would be able to get user-ip address
It will be possible to get the user ip or whatever if you just get the info from the embedded page.
I don't know if a DB would be able to handle so much pounding
If you have that many requests you should have by then found a way to make a buck with it :-) which you can use for upgrades :D
Your algorithm seems generally correct. What I would do is use PostGIS (a PostgreSQL extension, and easier to set up than it looks :-D). I believe the additional learning curve is totally worth it; it is THE standard for geodata.
If you put the whitelist cities in as POINTs, with latitudes and longitudes, you can actually ask PostGIS to sort by distance to a given lat/lon. It should be much more efficient than doing it yourself (PostGIS is very optimized).
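A sketch of what that query could look like, assuming a whitelist_cities table with a 4326 geometry column `geom`, with :lon/:lat bound through PDO from the GeoIP-resolved user location (ST_DistanceSphere is the PostGIS 2.2+ spelling; older versions call it ST_Distance_Sphere):

```php
<?php
// Nearest whitelisted city by spherical distance; 1609.34 metres per mile.
$sql = "
    SELECT name,
           ST_DistanceSphere(geom, ST_SetSRID(ST_MakePoint(:lon, :lat), 4326)) / 1609.34 AS dist_miles
    FROM whitelist_cities
    ORDER BY dist_miles
    LIMIT 1";
// $stmt = $pdo->prepare($sql);
// $stmt->execute([':lon' => $userLon, ':lat' => $userLat]);
echo $sql;
?>
```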
You could get lats and longs of your user cities (and the whitelist cities) by using a geocoding API like Yahoo Placefinder or Google Maps. What I would do would be to have a table (either the same as the whitelist cities or not) that stores city name, lat, and lon, and do lookups on that. If the city name isn't found though, hit the API you are using, and cache the result in the table. This way you'll quickly not need to hit the API except for obscure places. The API is fast too.
If you're really going to be seeing that kind of server load, you may want to look into using something besides PHP (such as node.js). Incidentally, you shouldn't have any trouble geocoding from an iframe; from the point of view of the server, it's just like the browser is going to that page normally.

Find the closest locations to a given address

I have built an application in CakePHP that lists businesses. There are about 2000 entries, and the latitude and longitude coordinates for each business is in the DB.
I now am trying to tackle the search function.
There will be an input box where the user can put a street address, city, or zipcode, and then I would like it to return the 11 closest businesses as found from the database.
How would I go about doing this?
I use the Yahoo GeoPlanet API to identify the place corresponding to the search term the user entered. This normally matches multiple places, so you have to present them back to the user to get them to pick the right one. Then, once you know the right place and its lat/long, which the Yahoo API provides, you can use the haversine formula to get the closest businesses to the user's location. There's a good example in the answer to this question.
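Once the search term is geocoded to a lat/long, the 11 closest businesses can come straight from MySQL. A sketch, assuming a `businesses` table with `lat`/`lng` columns (the repeated named placeholders rely on PDO's default emulated prepares):

```php
<?php
// Haversine in SQL: 3959 is the Earth's radius in miles, so dist_miles
// is the great-circle distance from the geocoded point (:lat, :lon).
$sql = "
    SELECT id, name,
           3959 * 2 * ASIN(SQRT(
               POWER(SIN(RADIANS(lat - :lat) / 2), 2) +
               COS(RADIANS(:lat)) * COS(RADIANS(lat)) *
               POWER(SIN(RADIANS(lng - :lon) / 2), 2)
           )) AS dist_miles
    FROM businesses
    ORDER BY dist_miles
    LIMIT 11";
// In CakePHP this could be run with the placeholders bound to the
// coordinates returned by the geocoder.
echo $sql;
?>
```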
I'd approach this by creating a square around the point. Getting the point is a whole problem in itself, as you'll need a postcode database or API, which tends to cost money, either in buying the database or per lookup.
Doing it by city or similar, at least, you could probably (not sure) return the long/lat from GMaps for that city.
Then I'd try and get four corner long/lat coords around that point. Then I could search the database for values which are between those.
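A crude way to get those four corners, assuming one degree of latitude is about 69 miles (the longitude span has to widen with latitude):

```php
<?php
// Corners of a square roughly $miles around ($lat, $lon).
// Longitude degrees shrink by cos(lat), so we divide by it.
function boundingBox(float $lat, float $lon, float $miles): array
{
    $dLat = $miles / 69.0;
    $dLon = $miles / (69.0 * cos(deg2rad($lat)));
    return [$lat - $dLat, $lat + $dLat, $lon - $dLon, $lon + $dLon];
}

// A ~69-mile box around Portland: latitude runs from about 44.5 to 46.5.
[$latMin, $latMax, $lonMin, $lonMax] = boundingBox(45.5152, -122.6784, 69.0);
// ...then: SELECT ... WHERE lat BETWEEN $latMin AND $latMax
//                       AND lng BETWEEN $lonMin AND $lonMax
echo round($latMin, 2) . " to " . round($latMax, 2);
?>
```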
Either way it's a tricky thing, I'd say. Fab question though! Interested to see people's suggestions.

Google Maps Geocoding API - country biasing, bizarre results

I'm connecting to the Google Maps API from PHP to geocode some starting points for a rental station locator application.
Those starting points don't have to be exact addresses; city names are enough. Geocoding responses with an accuracy equal to or greater than 4 (city/locality level) are used as starting points, and the surrounding rental stations are searched.
The application is supposed to work in Germany. When a locality name is ambiguous (i.e. there are more than one place of that name) I want to display a list of possibilities.
That is not a problem in general: If you make an ambiguous search, Google's XML output returns a list of <PlaceMark> elements instead of just one.
Obviously, I need to bias the geocoding towards Germany, so if somebody enters a postcode or the name of a locality that exists in other countries as well, only hits in Germany actually come up.
I thought I could achieve this by adding , de or , Deutschland to the search query. This works mostly fine, but produces bizarre and intolerable results in some cases.
There are, for example, 27 localities in Germany named Neustadt. (Wikipedia)
When I search for Neustadt alone:
http://maps.google.com/maps/geo?hl=de&output=xml&key=xyz&q=Neustadt
I get at least six of them, which I could live with (it could be that the others are not incorporated, or parts of a different locality, or whatever).
When, however, I search for Neustadt, de, or Neustadt, Deutschland, or Neustadt, Germany, I get only one of the twenty-seven localities, for no apparent reason: it is not the biggest, nor is it the most accurate, nor does it have any other unique characteristics.
Does anybody know why this is, and what I can do about it?
I tried the region parameter, but to no avail: when I do not use , de, postcodes (like 50825) will be resolved to their US counterparts, not the German ones.
My current workaround idea is to add the country name when the input is numeric only, and otherwise filter out only german results, but this sounds awfully kludgy. Does anybody know a better way?
This is definitely not an exhaustive answer, but just a few notes:
You are using the old V2 version of the Geocoding API, which Google has recently deprecated in favour of the new V3 API. Google suggests using the new service from now on, and while I have no real experience with the new version, it seems that they have improved the service on various points, especially the structure of the response. You do not need an API key to use the new service, and you simply need to use a slightly different URL:
http://maps.google.com/maps/api/geocode/xml?address=Neustadt&sensor=false
You mentioned that you were filtering placemarks on their accuracy property. Note that this field does not appear anymore in the results of the new Geocoding API, but in any case, I think it was not very reliable in the old API either.
You may want to try to use the bounds and region parameters of the new API, but I still doubt that they will solve your problem.
I believe that the Google Geocoder is a great tool for when you give it a full address in at least a "street, locality, country" format, and it is also very reliable in other formats when it doesn't have to deal with any ambiguities (Geocoding "London, UK" always worked for me).
However, in your case, I would really consider pre-computing all the coordinates of each German locality and simply handle the geocoding yourself from within your database. I think this is quite feasible especially since your application is localized to just one country. Each town in the Wikipedia "List of German Towns" appears to have the coordinates stored inside a neat little <span> which looks very easy to parse:
<span class="geo">47.84556; 8.85167</span>
There are sixteen Neustadts in that list, which may be better than Google's six :)
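Scraping those coordinates could be as simple as a regex over the span quoted above (real pages may vary slightly, so a proper HTML parser would be more robust):

```php
<?php
// Pull "lat; lon" out of Wikipedia's geo span.
$html = '<span class="geo">47.84556; 8.85167</span>';
if (preg_match('/<span class="geo">([-\d.]+);\s*([-\d.]+)<\/span>/', $html, $m)) {
    $lat = (float) $m[1];
    $lon = (float) $m[2];
    echo "$lat, $lon"; // 47.84556, 8.85167
}
?>
```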
I found Google autocomplete works better than their geocoder:
http://code.google.com/apis/maps/documentation/javascript/places.html#places_autocomplete
Found this question searching for this exact issue. Then realized the Bing Maps API works much better. It has its quirks though. For example, if you pass an airport code, it will show you six different results, one for each terminal of the airport.
