IP location using ipinfo.io - php

I need to get the state and country from the visitor's IP. I will be using the country info to showcase custom-made products. The state info will not be used for the same purpose, only for record keeping to track demand.
I have found on this site an instance of using the ipinfo.io API with this example code:
function ip_details($ip) {
    $json = file_get_contents("http://ipinfo.io/{$ip}/json");
    $details = json_decode($json);
    return $details;
}
However, since I do not need the full details, I see that the site allows grabbing single fields, so I am considering using these two:
1) ipinfo.io/{ip}/region
2) ipinfo.io/{ip}/country
like so:
function ip_details($ip) {
    $ip_state = file_get_contents("http://ipinfo.io/{$ip}/region");
    $ip_country = file_get_contents("http://ipinfo.io/{$ip}/country");
    return $ip_state . $ip_country;
}
OR would I be better off going with:
function ip_details($ip) {
    $json = file_get_contents("http://ipinfo.io/{$ip}/geo");
    $details = json_decode($json);
    return $details;
}
The last one uses "/geo" in the URL to slim down the response compared to the first one with "/json". Currently I am leaning towards the option with the two file_get_contents calls, but I want to know whether that is slower than the last one, which returns everything in a single decoded result. I just want to minimize the load time. Any other method would also be much appreciated.

In short, go for the single-request option (the /geo endpoint) - file_get_contents makes a GET request when passed a URL.
Decode the result into an associative array and access the details you want via their keys:
function ip_details($ip) {
    $json = file_get_contents("http://ipinfo.io/{$ip}/geo");
    $details = json_decode($json, true);
    return $details;
}
$ipinfo = ip_details('86.178.xxx.xxx');
echo $ipinfo['country']; // GB
// etc.
Regarding the speed difference: 99% of the overhead is network latency, so making ONE request and picking out the details you need will be much faster than making two separate requests for individual fields.
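If load time is the big concern, it may also be worth caching the lookup per IP so repeat visitors don't trigger a request at all. Here is a minimal sketch of that idea, assuming the APCu extension is available (the cache key prefix and the one-day TTL are arbitrary choices):

function ip_details_cached($ip) {
    $key = "ipinfo_" . $ip;
    $details = apcu_fetch($key, $found);
    if (!$found) {
        // Single request for the slimmed-down /geo payload
        $json = @file_get_contents("http://ipinfo.io/{$ip}/geo");
        if ($json === false) {
            return null; // network error: fail gracefully rather than blocking the page
        }
        $details = json_decode($json, true);
        apcu_store($key, $details, 86400); // cache for one day
    }
    return $details;
}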

Related

Microsoft Graph API - paging large collections

I'm just looking at the Microsoft Graph API PHP SDK to get a bunch of resources, notably Users.
Looking at the SDK docs, there are two ways to get users: one using the createRequest() method and the other using the createCollectionRequest() method.
The docs suggest using createCollectionRequest() and then just doing a while loop with array_merge() and getPage() to build an array.
while (!$docGrabber->isEnd()) {
    $docs = array_merge($docs, $docGrabber->getPage());
}
The issue is, I have a collection of ~50,000 users, so this method isn't particularly efficient.
I guess the biggest issue is that the above example (using the while loop) is designed to avoid using the @odata.nextLink that the API returns.
But what if we actually want to use this, instead of returning every single record in a single array?
Thanks
Instead of using getPage() and that sample, you can access the nextLink with something like this:
$url = "/users";
// Get the first page
$response = $graph->createCollectionRequest("GET", $url)
->setPageSize(50)
->execute();
if ($response->getNextLink())
{
$url = $response->getNextLink();
// TODO: remove https://graph.microsoft.com/v1.0 part of nextlink
} else {
// There are no more pages.
return null;
}
// get the next page, page size is already set in the next link
$response = $graph->createCollectionRequest("GET", $url)
->execute();
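Putting that together, here is a rough sketch of walking the whole collection page by page using the same calls shown above. The prefix stripping is the TODO from the snippet; processUsers() is a hypothetical callback for whatever you do with each page, and getBody() is assumed to return the decoded JSON with the items under 'value':

$url = "/users";

do {
    $response = $graph->createCollectionRequest("GET", $url)
                      ->setPageSize(50)
                      ->execute();

    // Handle this page now instead of merging ~50,000 users into one array
    processUsers($response->getBody()['value']);

    $url = $response->getNextLink();
    if ($url) {
        // nextLink is absolute; strip the service root so it can be reused
        // as a relative request URL (the TODO above)
        $url = str_replace("https://graph.microsoft.com/v1.0", "", $url);
    }
} while ($url);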

parsing paginated json from web service

I am trying to parse a large amount of JSON data generated from a remote web service. The output is paginated across 500 URIs and each URI contains 100 JSON objects. I need to match a property in each JSON object, its DOI (a digital object identifier), against a corresponding field fetched from a local database, and then update the record.
The issue I am having is controlling my looping constructs to seek out the matching JSON DOI while making sure that all the data has been parsed.
As you can see I have tried to use a combination of break and continue statements but I am not able to 'move' beyond the first URI.
I later introduced a flag variable to help control the loops without effect.
while($obj = $result->fetch_object()){
    for($i=1;$i<=$outputs_json['meta']['response']['total-pages'];$i++){
        $url = 'xxxxxxxxxxxxxxx&page%5Bnumber%5D='."$i".'&page%5Bsize%5D=100';
        if($outputs = json_decode(file_get_contents($url),true)===false){
        }
        else{
            try{
                $outputs = json_decode(file_get_contents($url),true);
                $j=0;
                do{
                    $flag = false;
                    $doi = trim($outputs['data'][$j]['attributes']['identifiers']['dois'][0], '"');
                    if(!utf8_encode($obj->doi)===$doi) continue;
                }else{
                    $flag = true;
                    $j++;
                }
                }while($j!==101);
                if($flag===true) break;
            } catch(Exception $e) {
            }
        }
    }
}
}
What is the optimal approach that guarantees each JSON object at all URIs is parsed and that CRUD operations are only performed on my database when a fetched record's DOI field matches the DOI property of the incoming JSON data?
I'm not 100% sure I understand every aspect of your question, but to me it would make sense to change the order of execution:
fetch page from external service
decode json and iterate through all 100 objects
get one DOI
fetch corresponding record from database
change db record
when all JSON objects are processed, fetch the next URL
repeat until all 500 URIs are fetched
I think it's not a good idea to fetch one record from the local DB and try to find it across 500 different remote calls - instead it's better to base your workflow/loops on the fetched remote data and try to find the corresponding elements in your local DB.
If you think that approach will fit your task - I can of course help you with the code :)
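For what it's worth, here is a rough sketch of that order of execution, reusing the JSON paths from your code. The publications table, its doi/matched columns and the mysqli prepared statement are assumptions, and $total_pages would come from the meta block of the first response:

$stmt = $mysqli->prepare("UPDATE publications SET matched = 1 WHERE doi = ?");

for ($page = 1; $page <= $total_pages; $page++) {
    $url = $base_url . '&page%5Bnumber%5D=' . $page . '&page%5Bsize%5D=100';
    $json = @file_get_contents($url);
    if ($json === false) {
        continue; // page failed to download, move on
    }
    $outputs = json_decode($json, true);
    if ($outputs === null) {
        continue; // page failed to decode, move on
    }
    foreach ($outputs['data'] as $item) {
        $doi = isset($item['attributes']['identifiers']['dois'][0])
            ? trim($item['attributes']['identifiers']['dois'][0], '"')
            : '';
        if ($doi === '') {
            continue;
        }
        // Only touch the database when the service actually returned a DOI
        $stmt->bind_param('s', $doi);
        $stmt->execute();
    }
}
$stmt->close();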

Sideload API Calls with PHP

Is there a way to sideload API calls (make multiple API calls at the same time) to lessen the impact on API call limits, using PHP?
For example, we're using the EchoNest API to gather information on musicians. When the artist page on our site is accessed, we run multiple functions which each call a different API method that returns the specific data that we need. Everything works and looks awesome!
Here are a few (abbreviated) methods that we're calling that each count against our call limit:
function artistPageNews() {
    $artist_name = $_GET['artistname'];
    $results = iTunes::search($artist_name, array(
        'entity' => 'musicVideo'
    ))->results;
    $echonest_api_key = "OUR_API_KEY";
    // News Method
    $echonest_news = 'http://developer.echonest.com/api/v4/artist/news?api_key='.$echonest_api_key.'&name='.str_replace(" ", "+", $artist_name).'&format=json&results=2&start=0';
    $echonest_news_json = file_get_contents($echonest_news);
    $news_json = json_decode($echonest_news_json);
    $news_entry = $news_json->response->news;
    foreach ($news_entry as $news) {
        // Do Magic Stuff Here...
    }
}

function artistPageVideos() {
    $artist_name = $_GET['artistname'];
    $results = iTunes::search($artist_name, array(
        'entity' => 'musicVideo'
    ))->results;
    $echonest_api_key = "OUR_API_KEY";
    // Videos Method
    $echonest_videos = 'http://developer.echonest.com/api/v4/artist/video?api_key='.$echonest_api_key.'&name='.str_replace(" ", "+", $artist_name).'&format=json&results=6&start=0';
    $echonest_videos_json = file_get_contents($echonest_videos);
    $videos_json = json_decode($echonest_videos_json);
    $videos_entry = $videos_json->response->video;
    foreach ($videos_entry as $video) {
        // Do More Magic Stuff Here...
    }
}
We have maybe about 7 (or more) of these methods that are called on each Artist page load. Obviously this can mean trouble when lots of people are viewing the artist pages every hour.
I understand that there's a way to store the more static information into a database and use that info instead of calling the API methods on every request. I am currently exploring that option. But I also read here that there may be a way to 'sideload' the API calls so that you can make multiple requests at one time. In that example, they're using Curl. I'm trying to do this with PHP.
curl https://{subdomain}.zendesk.com/api/v2/help_center/fr/articles.json?include=users \
-v -u {email_address}:{password}
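From what I can tell, the parallel-request approach in PHP is usually done with the curl_multi_* functions; something roughly like this (a sketch only, reusing two of the EchoNest URLs built in the functions above):

// One cURL handle per API endpoint we want to hit in parallel
$urls = array(
    'news'   => $echonest_news,
    'videos' => $echonest_videos,
);

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $key => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$key] = $ch;
}

// Run all the requests until every one has finished
do {
    $status = curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0 && $status === CURLM_OK);

// Collect the responses, keyed the same way as the URLs
$responses = array();
foreach ($handles as $key => $ch) {
    $responses[$key] = json_decode(curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);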
Can anyone help me get started with this or perhaps recommend a better way to do this, such as storing this information into a database or table and pulling from that instead of calling the API every time?
Thanks in advance.

Netsuite SuiteTalk - requesting list of invoices for a customer via PHP

I'm using the Netsuite PHP Toolkit to try to obtain a list of invoices for a customer. I can do the call (using a TransactionSearch) with no problem, but I'm struggling to understand how I'm supposed to get all details for an invoice - i.e. the invoice "header" details (e.g. grand total, currency, main menu line etc) as well as details for each line item (net value, taxable value, item etc).
I have tried a couple of approaches:
TransactionSearchAdvanced, with return columns specified and the returnSearchColumns preference set to "false". This gives back all the separate lines (woo!), but things like currency and term aren't expanded out - you just get the internalId and not the actual text (or symbol). Also, with TSA, do you really have to specify every column you want? i.e. is the default really just an empty set of fields? Isn't there a way of just saying "give me all the details for all lines of each invoice"?
TransactionSearch, with returnSearchColumns preference set to "true". This gives a list of single Invoice type records, with all the currency and term stuff correctly populated, but frustratingly, none of the individual line items. It's more of a summary.
So I am left with a couple of options, neither of which are very palatable, namely:
Do both calls for all invoices and combine the data. These searches take a long time (performance is another bugbear for me), so I really don't want to do this.
or
Figure out a way of requesting the data for terms, currency etc and also a way of obtaining invoice lines.
I have no idea how you're supposed to do this, and can't find anything on the internet about it. This is one of the worst interfaces I've used (and I've used some pretty bad ones).
Any help would be hugely appreciated.
Just like you I started out trying to do things with the Web Services API (aka SuiteTalk). Mostly it was an exercise in frustration because eventually what I found out was that I plain couldn't do what I wanted with them. That and the performance was pretty bad, which would have killed my project even if it had worked properly.
Like Faz, I've found it much easier and faster to use a combination of RESTlets and Saved Searches than deal with the web services framework.
Basically break your problem down into these parts:
Saved Search that returns the results that you want (keep track of the internal ID - you'll need it later)
RESTlet: just a JavaScript file that defines the function you will use to return the results from the search
Client code to call the RESTlet and get the results.
Part I:
So the saved search is pretty straightforward. I'm going to assume you can make that happen and also that you can actually get all the fields you want in one place. That hasn't always been the case in my experience.
Part II:
The RESTlet involves a lot more steps even though it's really a very simple thing. What makes it complicated is getting it uploaded and deployed on your NetSuite site. If you don't already have the NetSuite IDE installed I highly recommend it if only to make deploying the scripts a little easier. The autocompletion and tooltips are extremely useful as well.
For instance here is code I use to get results from a search I cared about. This was adapted from some kind soul's posting somewhere on the internet but I forget where:
function getSearchResults() {
    var max_rows = 1000;
    var search_id = 1211;
    var search = nlapiLoadSearch(null, search_id);
    var results = search.runSearch();
    var rows = [];
    // add starting point for usage
    var context = nlapiGetContext();
    var startingUsage = context.getRemainingUsage();
    rows.push(["beginning usage", startingUsage]);
    // now create the collection of result rows in 1000 row chunks
    var index = 0;
    do {
        var chunk = results.getResults(index, index + 1000);
        if (!chunk) break;
        chunk.forEach(function(row) {
            rows.push(row);
            index++;
        });
    } while (chunk.length === max_rows);
    // add a line that returns the remaining usage for this RESTlet
    context = nlapiGetContext();
    var remainingUsage = context.getRemainingUsage();
    rows.push(["remaining usage", remainingUsage]);
    // send back the rows
    return rows;
}
This is where you get things primed by passing in your Saved Search Internal ID:
var search = nlapiLoadSearch(null, SEARCH_ID);
var resultSet = search.runSearch();
Then the code repeatedly calls getResults() to get chunks of 1000 results; this is a NetSuite limitation. Once you have this written you have to upload the script to NetSuite and configure and deploy it. The most important part is telling it which function to assign to each verb. In this case I assigned GET to execute getSearchResults. There is a lot of work to do here, and I'm not going to type all of it out because it is worth your time to learn this part - at least enough to get the IDE to do it for you =D. You can read all about it in the "Introduction to RESTlets" guide.
Part III.
Client code can be in whatever language you want that does REST the way you like. Personally I like Python for this because the requests library is fantastic.
Here's some example Python code:
import requests
import json
url = 'https://rest.sandbox.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1'
headers = {'Content-Type': 'application/json', 'Authorization':'NLAuth nlauth_account=1234567, nlauth_email=someone@somewhere.com, nlauth_signature=somepassword, nlauth_role=3'}
resp = requests.get(url, headers=headers)
data = resp.json()
The URL is going to be displayed to you as part of the deployment of the RESTlet. Then it's up to you to do what you want with the data that comes back.
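Since the rest of your stack is PHP, the equivalent client call with cURL would look roughly like this (same RESTlet URL and NLAuth header as the Python example; the account, email, password and role values are placeholders):

$url = 'https://rest.sandbox.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1';
$auth = 'NLAuth nlauth_account=1234567, nlauth_email=someone@somewhere.com, '
      . 'nlauth_signature=somepassword, nlauth_role=3';

$ch = curl_init($url);
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER     => array(
        'Content-Type: application/json',
        'Authorization: ' . $auth,
    ),
));
$rows = json_decode(curl_exec($ch), true);
curl_close($ch);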
So the things I would suggest you spend time with would be
Setting up the NetSuite IDE
Getting and reading the SuiteScript developer reference docs
Finding a good way to create REST client code in your language of choice.
I hope that helps.
I created a saved search in NetSuite and call that search using a RESTlet. This is pretty lightweight, and you get the data exactly as it is in the saved search.
Performance-wise a RESTlet is much better than web services.
Create a new Suitelet script and deploy it.
The script below will give you an invoice list by customer internal ID:
function customSearch(request, response) {
    var rows = [];
    var result;
    var filters = [];
    // 9989 is the customer internal id; you can add more
    // by pushing additional ids to the array
    filters.push(new nlobjSearchFilter('entity', null, 'anyOf', [9989]));
    var invoiceList = nlapiSearchRecord('invoice', null, filters, []);
    // by default the record limit is 1000
    // taking 100 records
    for (var i = 0; i < Math.min(100, invoiceList.length); i++)
    {
        if (parseInt(invoiceList[i].getId()) > 0) {
            var recordid = invoiceList[i].getId();
            try {
                result = nlapiLoadRecord(invoiceList[i].getRecordType(), recordid);
                // pushing in to result
                rows.push(result);
            } catch (e) {
                if (e instanceof nlobjError) {
                    nlapiLogExecution('DEBUG', 'system error', e.getCode() + '\n' + e.getDetails());
                } else {
                    nlapiLogExecution('DEBUG', 'unexpected error', e.toString());
                }
            }
        }
    }
    response.setContentType('JSON');
    response.write(JSON.stringify({'records' : rows}));
    return;
}
Here is what I have for getting a customer's invoices:
public function getCustomerInvoices($customer_id)
{
    $service = new NetSuiteService($this->config);

    $customerSearchBasic = new CustomerSearchBasic();
    $searchValue = new RecordRef();
    $searchValue->type = 'customer';
    $searchValue->internalId = $customer_id;
    $searchMultiSelectField = new SearchMultiSelectField();
    setFields($searchMultiSelectField, array('operator' => 'anyOf', 'searchValue' => $searchValue));
    $customerSearchBasic->internalId = $searchMultiSelectField;

    $transactionSearchBasic = new TransactionSearchBasic();
    $searchMultiSelectEnumField = new SearchEnumMultiSelectField();
    setFields($searchMultiSelectEnumField, array('operator' => 'anyOf', 'searchValue' => "_invoice"));
    $transactionSearchBasic->type = $searchMultiSelectEnumField;

    $transactionSearch = new TransactionSearch();
    $transactionSearch->basic = $transactionSearchBasic;
    $transactionSearch->customerJoin = $customerSearchBasic;

    $request = new SearchRequest();
    $request->searchRecord = $transactionSearch;

    $searchResponse = $service->search($request);

    return $searchResponse->searchResult->recordList;
}
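To get the line-level detail as well, one option (slow, but it works) is to follow that search with a get() per invoice internal ID. Here is a rough sketch using the usual PHP Toolkit GetRequest/RecordRef pattern, assuming the same $this->config as above:

public function getInvoiceWithLines($invoice_internal_id)
{
    $service = new NetSuiteService($this->config);

    $recordRef = new RecordRef();
    $recordRef->type = 'invoice';
    $recordRef->internalId = $invoice_internal_id;

    $request = new GetRequest();
    $request->baseRef = $recordRef;

    $response = $service->get($request);

    // The full record has the header fields (total, currency, terms, ...)
    // plus the itemList with each line's item, quantity, amount, tax, etc.
    return $response->readResponse->record;
}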

How to track users location / region in PHP

I'm trying to get the country from which the user is browsing the website so I can work out what currency to show on the website. I have tried using the GET scripts available from http://api.hostip.info, but they just return XX when I test it.
If anyone knows any better methods please share.
Thanks.
I use this:
$_SESSION['ip'] = $_SERVER['REMOTE_ADDR'];
$ip = $_SESSION['ip'];
$try1 = "http://ipinfodb.com/ip_query.php?ip=".$ip."&output=xml";
$try2 = "http://backup.ipinfodb.com/ip_query.php?ip=".$ip."&output=xml";
$XML = @simplexml_load_file($try1, NULL, TRUE);
if (!$XML) { $XML = @simplexml_load_file($try2, NULL, TRUE); }
if (!$XML) { return false; }
// Retrieve location, set time
if ($XML->City == "") { $loc = "Localhost / Unknown"; }
else { $loc = $XML->City.", ".$XML->RegionName.", ".$XML->CountryName; }
$_SESSION['loc'] = $loc;
Try these:
http://ip-to-country.webhosting.info/
http://www.ip2location.com/
Both are IP address-to-country databases, which allow you to look up the country of origin of a given IP address.
However it's important to note that these databases are not 100% accurate. They're a good guide, but you will get false results for a variety of reasons.
Many people use proxying to get around country-specific blocks and filters.
Many IP ranges are assigned to companies with large geographic spread; you'll just get the country where they're based, not where the actual machine is (this always used to be a big problem for tracking AOL users, because they were all apparently living in Virginia)
Control of IP ranges is sometimes transferred between countries, so you may get false results from that (especially for smaller/less well-connected countries)
Keeping your database up-to-date will mitigate some of these issues, but won't resolve them entirely (especially the proxying issue), so you should always allow for the fact that you will get false results.
You should use the geoip library.
Maxmind provides free databases and commercial databases, with a difference in the date of last update and precision, the commercial being of better quality.
See http://www.maxmind.com/app/geolitecountry for the free database.
I think it should be sufficient for basic needs.
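With the PECL geoip extension and the GeoLite Country database installed, the lookup is a single call; a minimal sketch (assuming the extension is available on your server):

$ip = $_SERVER['REMOTE_ADDR'];
$country_code = geoip_country_code_by_name($ip); // e.g. "GB"
$country_name = geoip_country_name_by_name($ip); // e.g. "United Kingdom"

if ($country_code === false) {
    // Address not found in the database (private range, unknown, ...)
    $country_code = null;
}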
You can use the Geolocation API to get the coordinates and then some service to get the country from that, but the Geolocation API is browser based, so you can only access it via JavaScript and then have to pass this information to PHP somehow. I wrote something on the JS part once:
http://www.lautr.com/utilizing-html5-geolocation-api-and-yahoo-placefinder-example
When it comes to getting the location via the IP, there are a bazillion services out there that offer databases for that, some free, some for charge, some with a lot of IPs stored and lots of data, some with less. For example, the one you mentioned works just fine:
http://api.hostip.info/?ip=192.0.32.10
So you can either go with the Geolocation API, which is pretty neat but requires the user's permission, works via JS and doesn't work in IE (so far), or look for an IP location service that fits your needs :)
Try this:
$key="9dcde915a1a065fbaf14165f00fcc0461b8d0a6b43889614e8acdb8343e2cf15";
$ip= "198.168.1230.122";
$url = "http://api.ipinfodb.com/v3/ip-city/?key=$key&ip=$ip&format=xml";
// load xml file
$xml = simplexml_load_file($url);
// print the name of the first element
echo $xml->getName() . "";
// create a loop to print the element name and data for each node
foreach($xml->children() as $child)
{
echo $child->getName() . ": " . $child . "<br />";
}
There are many ways to do it, as suggested earlier. But I suggest you take a look at the IP2 PHP library available at https://github.com/ip2iq/ip2-lib-php which we developed.
You can use it like below:
<?php
require_once("Ip2.php");
$ip2 = new \ip2iq\Ip2();
$country_code = $ip2->country('8.8.8.8');
//$country_code === 'US'
?>
It doesn't need any SQL or web service lookup, just a local data file. It is faster than almost all other methods out there. The database is updated monthly and you can download it for free.
The only thing you will need to do for now, if you need the country name in your language, is map the code to a name using an associative array from something like this: https://gist.github.com/DHS/1340150
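That mapping is just a code-to-name lookup, along these lines (only a few hypothetical entries shown):

$country_names = array(
    'US' => 'United States',
    'GB' => 'United Kingdom',
    'MY' => 'Malaysia',
);
$name = isset($country_names[$country_code]) ? $country_names[$country_code] : $country_code;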
