How can I secure a JSON web service? - php

I have location data containing lat, long, and location_name to be shown on a map. Only logged-in users can see this map. What I did was use PHP with a SELECT * against the MySQL DB, then json_encode to format the data in a usable way, and echo it to be grabbed by the front-end and used in the map API. The PHP file echoing this JSON is called mapData.php.
I want this file to not be accessible even to a logged-in user. I came across checking the session and request headers in the mapData.php file (the internal API file), but then again, if a hacker signs up to my service and opens the dev console, he/she can see the received file, and with any client-side request tool can set those headers and see the data. Or maybe I could change the access level with Linux permissions, but I have no idea how.
Another method is uglifying and minifying the JSON, but since I have 29,000 rows in my DB plus an inner join, I think that will slow down the process. Any suggestion for securing this internal API so that even a logged-in user cannot access it?

I would hide the map data file in a subdirectory, then use a service to access the data file and retrieve just the data you need. If you absolutely need all 29,000 rows at once, there's not much you can do. Even if you encrypt it, eventually the data will be in native JavaScript format, and then it's just a matter of running a debugger and peering into the data structures.
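Concretely, the "service" could be a small session-checked endpoint that returns only the points the map currently needs. A minimal sketch, assuming PHP sessions and a PDO connection; the locations table, its columns, and the bounding-box parameters are illustrative placeholders, not the asker's actual schema:

<?php
// mapData.php - reject anyone without a valid login session
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Return only the points inside the viewport the client asked for,
// instead of dumping all 29,000 rows at once.
$stmt = $pdo->prepare(
    'SELECT lat, `long`, location_name FROM locations
     WHERE lat BETWEEN :south AND :north
       AND `long` BETWEEN :west AND :east'
);
$stmt->execute([
    ':south' => (float) $_GET['south'],
    ':north' => (float) $_GET['north'],
    ':west'  => (float) $_GET['west'],
    ':east'  => (float) $_GET['east'],
]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

A logged-in user can still see whatever the map requests, but they only ever receive the slice of data they would see anyway, never the full dataset in one response.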

Related

php scrape dynamically loaded content via ajax using knockout.js

I need to scrape some data from a website that is loaded via AJAX using knockout.js (I don't know exactly which technology it uses).
The site is www.msc.com. I am searching for schedules, for example from Barcelona to Miami. The results are loaded via AJAX but don't show up in the console or Firebug.
I have tried many times. Any help or suggestion will be appreciated.
Their script is located at: https://www.msc.com/CMSTemplates/CraftedCMS/WebServices/RouteFinder.svc/Routes
They prevent you from calling it directly in a browser tab/window, probably because what you're trying to do is against their policy. If they wanted people to scrape their data, they would not block direct requests to their API, or they would provide a publicly documented API for you to use on your server.
With that said, you can see in your browser console that their web service returns JSON objects. You would have to hack your way in (maybe illegally) by faking protocol variables to accomplish what you're aiming for. The first thing to consider is that they only return results through that web service when:
a) The call is made through XMLHttpRequest as a POST. (You can fake this easily, but the next points, not so much...)
b) The call is made with a Referer header; in this precise case, the referer is: https://www.msc.com/routefinder?fromId=406&isCountryFrom=false&toId=83&isCountryTo=false
c) The call passes a cookie to the server, which is encrypted and signed, so each session is in their database and your key is probably unique, so good luck decrypting this: CMSPreferredCulture=fr-FR; ASP.NET_SessionId=gza5rfjrog2eb21ukrzma223; BIGipServerkentico.app~kentico_pool=439883018.20480.0000; bbbbbbbbbbbbbbb=LIKIGEACDJHDJPGPEOKGJBKODKDGOMHNKAEGEGKNODEDAEILEICBMLNLEFMAOIPPKMOIBBFAILFEEKJPIJDCBDDLFNBBMBPBGGKAIDOCMGHBEEIDMLPMIJJAMNFNIFMI; rxVisitor=1497537754979PTPODMSFNIR8BFVAKK353FS76M2D1KNN; dtPC=3$537845860_975h-vCQTABPJMGEOKDPDVNLHPCPDASGAPMCPCBA; rxvt=1497539656937|1497537754995; dtSa=-; dtLatC=8; _ga=GA1.2.1247106544.1497537756; _gid=GA1.2.879601947.1497537756; _gali=results; cookiePolicyApproved=true; MSCAgencyId=355840; _gat=1; _gat_local=1; dtCookie=3$B74DFC30736F7DBF485B79C31C55B167|www.msc.com|1
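Purely to illustrate the mechanics listed above (and with the same caveat that doing so may violate the site's terms of use), replicating those request properties with PHP cURL would look something like this; $url, $postBody, $referer, and $cookie are placeholders, not values from the site:

<?php
$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $postBody,
    CURLOPT_REFERER        => $referer,         // point b) above
    CURLOPT_COOKIE         => $cookie,          // point c): raw "name=value; ..." string
    // Sites typically detect XHR (point a) via this header:
    CURLOPT_HTTPHEADER     => ['X-Requested-With: XMLHttpRequest'],
    CURLOPT_RETURNTRANSFER => true,
]);
$json = curl_exec($ch);
curl_close($ch);

Even with all three faked, the signed session cookie is the part that is likely to break, since it is tied to a server-side session you cannot forge.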

Cache XML API Response in php server side caching

Is it possible to cache the response from a POST request to an XML API? I want to do this without having to create actual files on my server. I was told this is possible, but I've been unsuccessful.
Essentially, I want to check whether that call was already made; if so, pull that data; if not, do the API call and save the result for future use.
To do this without files, you would need to store the call and the result in a database, which is probably the best way to do it anyway. I have done this with the Google geolocation API because there are limits on how many requests you can make per day.
So I would just store the request (i.e. a ZIP code like 37206) and the result from Google. The next time 37206 was requested, I would simply return the result stored in my database.
Further, to account for something like a ZIP code's lat/lon changing, I would store the date at which the result was saved and, after say a year, use a cron job to remove it.
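A minimal sketch of that idea, assuming PDO and a MySQL api_cache table with (request_key, response, created_at) columns; callRemoteApi() is a hypothetical stand-in for the actual POST call:

<?php
function cachedApiCall(PDO $pdo, string $key): string
{
    // Serve from cache if a fresh entry exists (here: younger than a year).
    $stmt = $pdo->prepare(
        'SELECT response FROM api_cache
         WHERE request_key = :k AND created_at > DATE_SUB(NOW(), INTERVAL 1 YEAR)'
    );
    $stmt->execute([':k' => $key]);
    if ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        return $row['response'];
    }

    // Otherwise hit the API and store the result for next time.
    $response = callRemoteApi($key); // hypothetical helper for the POST request
    $ins = $pdo->prepare(
        'INSERT INTO api_cache (request_key, response, created_at)
         VALUES (:k, :r, NOW())'
    );
    $ins->execute([':k' => $key, ':r' => $response]);

    return $response;
}

The one-year cutoff in the query makes the cron cleanup a housekeeping step rather than a correctness requirement: stale rows are ignored even before they are deleted.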

Mobile app downloading JSON with PHP

I've got an iOS and Android app which downloads data from a database using JSON and PHP. Basically it's a lot of MySQL queries that return data from my MySQL database. When I started this project, I just created an array in PHP which holds all the queries. Then I would send an index in my URL to access that query, and optionally some variables.
http://url.nl/script.php?index=1&var1=foo&var2=bar
This worked fine, and for a small project it wasn't bad, but I knew it wasn't good programming or a good model.
So basically it's something like this:
App with a Model-View-Controller-Store model
When the app needs data, the Store classes request it through a URL, sending an index in that URL
The PHP script reads the index, executes the saved query from the array, encodes the data to JSON, and returns it
The app's Store classes read and decode the data
The app's View classes present the data in any way wanted.
So I'm not really doing much with PHP other than accessing my database, encoding, and returning data.
Since my app is getting very large and using more and more queries, I wanted to do things right in my new version. What would be a good model for PHP to use in this scenario?
I'm no web developer, so I was trying to keep all PHP processing to a minimum, but I realized this isn't a good way of programming.
Instead of just storing your queries in an array, you should use some kind of RESTful API on your server.
You would then send GET requests to your server, which executes them and returns the desired data. This can then be read and decoded. (You can also send and update data on the server.)
There are a bunch of REST frameworks for PHP, but I used the Slim Framework because it's really easy to understand (even for people not familiar with PHP).
This example from their website:
$app = new \Slim\Slim();
$app->get('/hello/:name', function ($name) {
echo "Hello, $name";
});
$app->run();
makes it possible to call www.yoursite.com/hello/Mark, which then returns "Hello, Mark". With 5 lines of code. It's awesome.
You can write any PHP code there. To encode your data from a MySQL database, just follow this tutorial: http://coenraets.org/blog/2011/12/restful-services-with-jquery-php-and-the-slim-framework/ Just ignore the jQuery part.
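As a rough sketch in the same Slim (v2) style, a route that reads a row from MySQL and returns it as JSON might look like this; the products table, the credentials, and the Composer autoload path are placeholder assumptions, not taken from the tutorial:

<?php
require 'vendor/autoload.php'; // assuming Slim was installed via Composer

$app = new \Slim\Slim();

$app->get('/products/:id', function ($id) use ($app) {
    // Parameterized query so the route variable can't inject SQL.
    $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
    $stmt = $pdo->prepare('SELECT * FROM products WHERE id = :id');
    $stmt->execute([':id' => $id]);

    $app->contentType('application/json');
    echo json_encode($stmt->fetch(PDO::FETCH_ASSOC));
});

$app->run();

Each query in your old array becomes one named route like this, so the app requests /products/42 instead of script.php?index=1&var1=42.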
In your app you then request the data from the provided URL. I use AFNetworking for this. (Google it; on their page, find "HTTP Request Operation Manager" and look at GET.)

PHP Get Cookie by Session ID (or otherwise pass data between two different connections)

Normally I try to format my question as a basic question and then explain my situation, but the solution I'm looking for might be the wrong one altogether, so here's the problem:
I'm building a catalog application for an auction website that has the ability to save individual lots. So far this has worked great by simply creating a cookie with a comma-separated list of IDs for those lots, via something like this:
$_COOKIE["MyLots_$AuctionId"] = implode(",",$arrayOfIds);
The problem I'm now hitting is that when I go to print the lots, I'm using wkhtmltopdf through the command-line to request the url of the printout I want, like this:
exec("wkhtmltopdf '$urlofmylots' filename.pdf");
The problem is that I can't pass a cookie to this call, because Apache sees an internal request, not the request of the user. I tried putting the data in the GET string, but once it exceeds a preset limit for GET parameters, the value disappears from the $_GET array on the target URL, and I can't seem to find a way to send POST data between them. My next ideas are the following:
Maybe just pass the session ID in the URL and see if there's a way for PHP to dig through the cookies for that session and pull the right one, but that sounds risky security-wise for a PHP server to allow (letting one session be aware of another). Example:
exec("wkhtmltopdf '$urlofmylots?sessionId=$sessionIdFromThisRequest' filename.pdf");
Possibly set a session variable and then pass that session ID, and see if I can use PHP to wade through that information instead (rather than using the cookie).
Would I be able to just create an array and somehow make that other script aware of it, possibly by including it? That doesn't really solve the problem of wkhtmltopdf expecting a web-facing address as its first parameter.
(Not really an idea, but some reasoning.) In other instances of using this, I've just passed an ID to the script that generates the markup for wkhtmltopdf to parse, and the script uses that ID to get data from the database. I don't want to store this data in a file or the database simply to transfer it from the caller to the callee; cookies and sessions seem cleaner, since Apache/PHP handle memory allocation for these sessions.
The ultimate problem is that I'm trying to make my second script (referenced here as $urlofmylots) aware of data available to the calling script, but it's being executed as if it were an external web request, not two PHP scripts called from the web root.
Can anyone offer some insight here?
You might consider rendering whatever the output of $urlofmylots?lots=$lots_to_print would be to a temporary file and running wkhtmltopdf against that file.
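A minimal sketch of that approach; renderLotsHtml() is a hypothetical helper standing in for the existing print view, run in-process where the session and $_COOKIE are still available:

<?php
// Build the markup here, with full access to this request's cookies/session.
$html = renderLotsHtml($arrayOfIds);

// Write it to a temp file and hand the file (not a URL) to wkhtmltopdf,
// so no second HTTP request - and no cookie forwarding - is needed.
$tmp = tempnam(sys_get_temp_dir(), 'lots_');
$tmpHtml = $tmp . '.html';
rename($tmp, $tmpHtml);
file_put_contents($tmpHtml, $html);

exec('wkhtmltopdf ' . escapeshellarg($tmpHtml) . ' filename.pdf');
unlink($tmpHtml); // clean up the temporary file

This sidesteps the whole problem: the data never has to cross between two connections, because only one PHP request is ever involved.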

PHP - Manage data from URL variables

Using a PHP script, I need to manage data sent to the script via URL variables.
The URL sent is something like: http://hawkserv.co.uk/heartbeat.php?port=25565&max=32&name=My%20Server&public=True&version=7&salt=wo6kVAHjxoJcInKx&players=&worlds=guest&motd=testtet&lvlcount=1&servversion=67.5.0.1&hash=randomhash&users=0
(clicking the link returns a formatted table of the results)
What is the best method of storing this information so it can be used in a formatted HTML page?
Multiple URLs will be sent to the script with different values. The script needs to store each response for later use, and also "time out" responses that haven't been updated in a while.
Example scenario:
Three servers exist: Server 1, Server 2, and Server 3. Each sends the above URL every 45 seconds, with a few values changed per server. A formatted table displays the information when the page is requested, and is updated on refresh with any new information the servers have sent.
Server 1 goes offline and stops sending requests. The script accounts for the missing requests and removes Server 1's information from the list, declaring it offline.
Although code would be greatly appreciated, I think I just need the best way of doing it. Is it storing each URL as an array in a file and reading the file when needed, or is there some other way?
I would store the variables plus the time the request was received in a database. The database can be an SQLite one if you don't want to go through the hassle of setting up a full-blown system. The advantage of SQLite over dumping arrays to a file is that you can run flexible queries without coding up parsing routines and the like.
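A minimal sketch of that approach with PDO and SQLite; the servers table, the choice of port as the unique key, and the 90-second timeout are illustrative assumptions:

<?php
// heartbeat.php - one row per server, replaced on every heartbeat.
$db = new PDO('sqlite:' . __DIR__ . '/heartbeats.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS servers (
    port INTEGER PRIMARY KEY,
    name TEXT,
    data TEXT,          -- the remaining query variables, JSON-encoded
    updated_at INTEGER  -- unix timestamp of the last heartbeat
)');

// Upsert the incoming heartbeat.
$stmt = $db->prepare('REPLACE INTO servers (port, name, data, updated_at)
                      VALUES (:port, :name, :data, :ts)');
$stmt->execute([
    ':port' => (int) $_GET['port'],
    ':name' => $_GET['name'],
    ':data' => json_encode($_GET),
    ':ts'   => time(),
]);

// Treat servers silent for more than 90 seconds (two missed 45s beats) as offline.
$db->prepare('DELETE FROM servers WHERE updated_at < :cutoff')
   ->execute([':cutoff' => time() - 90]);

The page that renders the table then just SELECTs the remaining rows; any server that stopped sending heartbeats has already been pruned.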
