Currently I am using Firebase for storing my app data online, but I would like to create my own database.
I was planning to get a 100 GB bandwidth hosting plan with PHP and MySQL (is that bandwidth enough?).
Per download, my app transfers approximately 0.4 MB of data (as reported by Firebase).
So, to create the API, do I just have to encode the MySQL data as JSON and print it? Then my Android app will read it and use it? Is this the best method?
$conn = mysqli_connect("localhost", "username", "password", "database"); // connection details are placeholders
$sth = mysqli_query($conn, "SELECT ...");   // mysqli_query() needs the connection as its first argument
$rows = array();
while ($r = mysqli_fetch_assoc($sth)) {
    $rows[] = $r;                           // collect each row as an associative array
}
header('Content-Type: application/json');
print json_encode($rows);
Or is there any other, more efficient method to do this?
Yes, but you should send an HTTP response code such as 200 or 403 as well.
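For example, here is a minimal sketch building on the snippet above. The authorization check is purely hypothetical, and http_response_code() needs PHP 5.4+ (on older versions, send a raw header() status line instead):
header('Content-Type: application/json');

if (!userIsAllowed()) {                 // hypothetical authorization check, replace with your own logic
    http_response_code(403);
    echo json_encode(array('error' => 'Forbidden'));
    exit;
}

http_response_code(200);                // explicit, although 200 is the default
echo json_encode($rows);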
Here is a similar question:
How to write a REST API?
I have an app with over 500k members, and all of their devices get data from the server every hour.
All of my data is stored in a PHP file, and the devices fetch it as JSON.
This is my simple PHP file:
<?php
$response = array();
header('Content-type: application/json');
$response["AppInf"] = array();
$product = array();
$product["apptitle"] = "string1";
$product["apps"] = "string2";
$product["apps2"] = "string3";
$product["apps4"] = "string4";
$product["idapp"] = "stringid";
array_push($response["AppInf"], $product);
$response["success"] = 1;
echo json_encode($response);
?>
But when more than 15k users access the server, my CPU load grows to 100%.
I have a good VPS with 64 GB of RAM and a Xeon CPU.
Can anyone help me manage and fix this problem?
If your content is really static, as in your example, store the content in a static file and use caching (a minimal sketch follows below). If the content is the same for at least a group of users, then you only have to calculate the desired result once and store the data for later retrieval.
Consider using a reverse proxy like Varnish to move load from your web server to another server.
If it is possible, don't let all users fetch data at the same time. Add some random offset to the time when the data is being pulled.
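As an illustration of the first point, here is a minimal file-cache sketch of the PHP file above; the cache path and the one-hour TTL are assumptions, adjust them to your setup:
<?php
// A minimal file-cache sketch; the cache path and one-hour TTL are assumptions.
$cacheFile = '/tmp/appinf.json';
$cacheTtl  = 3600; // regenerate at most once per hour

header('Content-type: application/json');

// Serve the cached copy if it is fresh enough; no arrays are built, nothing is re-encoded
if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTtl) {
    readfile($cacheFile);
    exit;
}

// Otherwise rebuild the response once and store it for the next requests
$response = array();
$response["AppInf"] = array();
$product = array(
    "apptitle" => "string1",
    "apps"     => "string2",
    "apps2"    => "string3",
    "apps4"    => "string4",
    "idapp"    => "stringid",
);
array_push($response["AppInf"], $product);
$response["success"] = 1;

$json = json_encode($response);
file_put_contents($cacheFile, $json);
echo $json;
?>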
I've got Minecraft software written in C# from which I want to send a heartbeat to my site. I've already got the code that sends the beat:
if (Server.Uri == null) return;
string uri = "http://GemsCraft.comli.com/Heartbeat.php";
// create a request
try
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";

    // turn the request string into a byte stream
    byte[] postBytes = Encoding.ASCII.GetBytes(string.Format("ServerName={0}&Url={1}&Players={2}&MaxPlayers={3}&Uptime={4}",
        Uri.EscapeDataString(ConfigKey.ServerName.GetString()),
        Server.Uri,
        Server.Players.Length,
        ConfigKey.MaxPlayers.GetInt(),
        DateTime.UtcNow.Subtract(Server.StartTime).TotalMinutes));

    request.ContentType = "application/x-www-form-urlencoded";
    request.CachePolicy = new System.Net.Cache.RequestCachePolicy(System.Net.Cache.RequestCacheLevel.NoCacheNoStore);
    request.ContentLength = postBytes.Length;
    request.Timeout = 5000;

    Stream requestStream = request.GetRequestStream();
    // send it
    requestStream.Write(postBytes, 0, postBytes.Length);
    requestStream.Flush();
    requestStream.Close();

    /* try
    {
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        Logger.LogToConsole(new StreamReader(response.GetResponseStream()).ReadToEnd());
        Logger.LogToConsole(response.StatusCode + "\n");
    }
    catch (Exception ex)
    {
        Logger.LogToConsole("" + ex);
    }*/
}
catch (Exception ex)
{
    // without a catch (or finally) the try block above would not compile
    Logger.LogToConsole("Heartbeat failed: " + ex);
}
Now, I want to be able to retrieve the heartbeat in PHP, upload it to the SQL database, and then display each user's server in a table on the webpage.
How do I do this?
portforwardpodcast's answer isn't very well suited to your purposes; here's a process for you to ponder.
Server accesses the following page: heartbeat.php?port=25565&maxplayers=25&players=2&name=Cheese_Pizza_Palace
Your PHP script will then do the following (a rough sketch follows after this list):
Go through each value, making sure they're all the types you want them to be (integers/strings)
Connect to the database
Update the server in the database if it already exists, create it if it doesn't
Return some value so the server knows that it completed successfully.
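Here is a rough PHP sketch of such a heartbeat.php, using PDO (suggested further down in this answer); the servers table, its unique key on name, and the database credentials are placeholders:
<?php
// Rough sketch of heartbeat.php; the servers table, its unique key on name,
// and the database credentials are placeholders.
$name       = isset($_GET['name']) ? (string)$_GET['name'] : '';
$port       = isset($_GET['port']) ? (int)$_GET['port'] : 0;
$players    = isset($_GET['players']) ? (int)$_GET['players'] : 0;
$maxPlayers = isset($_GET['maxplayers']) ? (int)$_GET['maxplayers'] : 0;

// make sure the values are the types you expect before touching the database
if ($name === '' || $port <= 0) {
    header('HTTP/1.1 400 Bad Request');
    exit('Bad request');
}

$pdo = new PDO('mysql:host=localhost;dbname=minecraft', 'dbuser', 'dbpass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// update the row if the server already exists, otherwise create it
$sql = "INSERT INTO servers (name, port, players, max_players, last_heartbeat)
        VALUES (:name, :port, :players, :max_players, NOW())
        ON DUPLICATE KEY UPDATE
            port = VALUES(port),
            players = VALUES(players),
            max_players = VALUES(max_players),
            last_heartbeat = NOW()";
$stmt = $pdo->prepare($sql);
$stmt->execute(array(
    ':name'        => $name,
    ':port'        => $port,
    ':players'     => $players,
    ':max_players' => $maxPlayers,
));

// return something so the caller knows the heartbeat was recorded
echo 'OK';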
And to display the servers (a sketch follows after this list):
Fetch all 'active' servers
Loop through them and display each one.
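A matching sketch for the display page, assuming the same hypothetical servers table and treating a server as "active" if its last heartbeat is under five minutes old:
<?php
// Display sketch; table and column names match the placeholder schema above.
$pdo = new PDO('mysql:host=localhost;dbname=minecraft', 'dbuser', 'dbpass');
$stmt = $pdo->query(
    "SELECT name, players, max_players
     FROM servers
     WHERE last_heartbeat > NOW() - INTERVAL 5 MINUTE
     ORDER BY players DESC"
);

echo '<table><tr><th>Server</th><th>Players</th></tr>';
foreach ($stmt as $server) {
    echo '<tr><td>' . htmlspecialchars($server['name']) . '</td>'
       . '<td>' . (int)$server['players'] . '/' . (int)$server['max_players'] . '</td></tr>';
}
echo '</table>';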
Things you'll need to figure out:
How to determine uptime
How to determine "active" servers
How to update/create MySQL entries
How to (properly) connect to a database. I would suggest using PDO since you're using PHP. It's a bit difficult to learn, but it's much more secure than writing the queries directly.
How to loop through all the GET variables.
Good hunting!
I would create a simple PHP page that accepts a GET variable, something like www.site.com/beat.php?lasttime=123456&serverid=1, where the number is a Unix timestamp. Then you need to rework your C# to do a simple GET request on a website. Finally, your PHP should insert into a MySQL table with columns for id, timestamp, server_id, etc.
First you need to pull the data from the request. The $_REQUEST variable in PHP is nice because it works for both GET and POST:
http://php.net/manual/en/reserved.variables.request.php
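For instance, a throwaway debugging sketch (the field names are taken from the C# POST body above):
// heartbeat.php - quick check that the fields arrive as expected
var_dump($_REQUEST['ServerName']);
var_dump($_REQUEST['Url']);
var_dump($_REQUEST['Players']);
var_dump($_REQUEST['MaxPlayers']);
var_dump($_REQUEST['Uptime']);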
Start out by var_dump-ing or echoing the fields you want, as sketched above. Once you can get the needed data into variables, you are done with the first part. For the next part you need to create a database and a table in MySQL. The best tool for this is phpMyAdmin. If you have a host like GoDaddy (or some others), you can get at it from the control panel. If not, you may need to upload the phpMyAdmin files yourself. It's a pretty simple tool to use:
http://www.youtube.com/watch?v=xxQSFHADUIY
Once your database has the correct columns, you need to insert the data from your PHP file. This page should help:
http://www.w3schools.com/php/php_mysql_insert.asp
I have a script running in a shared hosting environment where I can't change the available amount of PHP memory. The script consumes a web service via SOAP. I can't get all my data at once or it runs out of memory, so I have had some success caching the data locally in a MySQL database so that subsequent queries are faster.
Basically, instead of querying the web service for five months of data, I query it one month at a time, store that in the MySQL table, then retrieve the next month, and so on. This usually works, but I sometimes still run out of memory.
My basic code logic is like this:
connect to web service using soap;
connect to mysql database
query web service and store result in variable $results;
dump $results into mysql table
repeat steps 3 and 4 for each month of data
The same variables are used in each iteration, so I would assume that each batch of results from the web service would overwrite the previous one in memory? I tried using unset($results) between iterations, but that didn't do anything. I am outputting the memory used with memory_get_usage(true) each time, and with every iteration the memory used increases.
Any ideas how I can fix this memory leak? If I wasn't clear enough, leave a comment and I can provide more details. Thanks!
EDIT: Here is some code (I am using NuSOAP, not the PHP 5 native SOAP client, if that makes a difference):
$startingDate = strtotime("3/1/2011");
$endingDate = strtotime("7/31/2011");

// connect to database
mysql_connect("dbhost.com", "dbusername", "dbpassword"); // note the comma between username and password
mysql_select_db("dbname");

// configure nusoap
$serverpath = 'http://path.to/wsdl';
$client = new nusoap_client($serverpath);

// cache soap results locally
while ($startingDate <= $endingDate) {
    // quote the dates and pass the base timestamp as strtotime()'s second argument
    $sql = "SELECT * FROM table WHERE date >= '".date('Y-m-d', $startingDate)."' AND date <= '".date('Y-m-d', strtotime('+1 month', $startingDate))."'";
    $soapResult = $client->call('SelectData', $sql);

    foreach ($soapResult['SelectDateResult']['Result']['Row'] as $row) {
        foreach ($row as &$data) {
            $data = mysql_real_escape_string($data);
        }
        $sql = "INSERT INTO table VALUES('".$row['dataOne']."', '".$row['dataTwo']."', '".$row['dataThree']."')"; // closing quote was missing
        $mysqlResults = mysql_query($sql);
    }

    $startingDate = strtotime('+1 month', $startingDate);
    echo memory_get_usage(true); // MEMORY INCREASES EACH ITERATION
}
Solved it, at least partially. There is a memory leak when using NuSOAP: it writes a debug log to a $GLOBALS variable. Altering this line in nusoap.php freed up a lot of memory.
change
$GLOBALS['_transient']['static']['nusoap_base']->globalDebugLevel = 9;
to
$GLOBALS['_transient']['static']['nusoap_base']->globalDebugLevel = 0;
I'd prefer to just use PHP 5's native SOAP client, but I'm getting strange results that I believe are specific to the web service I am trying to consume. If anyone is familiar with using PHP 5's SOAP client with www.mindbodyonline.com's SOAP API, let me know.
Have you tried unset() on $startingDate and mysql_free_result() for $mysqlResults?
Also SELECT * is frowned upon even if that's not the problem here.
EDIT: Also free the SOAP result too, perhaps. Some simple stuff to begin with to see if that helps.
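For what it's worth, here is a small sketch of that cleanup inside the while loop. Note that mysql_free_result() only applies to SELECT result sets; an INSERT just returns a boolean, so there is nothing to free there:
// at the end of each while-loop iteration
unset($soapResult, $row);            // drop references to the current month's data
if (function_exists('gc_collect_cycles')) {
    gc_collect_cycles();             // PHP 5.3+: collect circular references, if any
}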
What is the best way to count a Facebook user's friends...
I'm currently using (PHP):
$data = $facebook->api('/me/friends');
$friends_count = count($data['data']);
and it's very slow (about 2 seconds).
Querying the Facebook API sends a request to Facebook. Because it's an ordinary HTTP request, this probably takes most of the time, and there is usually no way around it. If you need the values more often, you should cache them somewhere:
if (file_exists($cacheFile)) {
    $data = json_decode(file_get_contents($cacheFile), true); // the cache holds JSON, so decode it back to an array
} else {
    $data = $facebook->api('/me/friends');
    file_put_contents($cacheFile, json_encode($data));        // note: $cacheFile, not $cachefile
}
$friends_count = count($data['data']);
Remember to update the cache file from time to time.
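One simple way to do that (a sketch; the one-hour TTL is an arbitrary choice) is to compare the file's modification time against a maximum age before trusting it:
$maxAge = 3600; // refresh the cache at most once per hour
if (!file_exists($cacheFile) || (time() - filemtime($cacheFile)) > $maxAge) {
    $data = $facebook->api('/me/friends');
    file_put_contents($cacheFile, json_encode($data));
} else {
    $data = json_decode(file_get_contents($cacheFile), true);
}
$friends_count = count($data['data']);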
If you are not processing the data from Facebook on the server side, then instead of doing it with PHP you can use the JavaScript Graph API to fetch it via Ajax, which won't affect your page load time.
I'm trying to create an iOS application which upon loading, will initially connect via HTTP back to a PHP web service which will output data as JSON from a MySQL database. I would then like it to import this data into a local SQLite database within the iOS app. I've already downloaded the JSON-Framework for Objective-C.
My question is twofold.
1) What is the best way to output the JSON from PHP so that I can send multiple database tables in the same JSON file? I have 4 tables of data that I'm trying to send (user, building, room, device).
Here is how I am currently outputting the JSON data:
// Users
$query = "SELECT * from user";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$users = array();
if(mysql_num_rows($result)) {
while($user = mysql_fetch_assoc($result)) {
$users[] = array('user'=>$user);
}
}
// Buildings
$query = "SELECT * from building";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$buildings = array();
if(mysql_num_rows($result)) {
while($building = mysql_fetch_assoc($result)) {
$buildings[] = array('building'=>$building);
}
}
// Rooms
$query = "SELECT * from room";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$rooms = array();
if(mysql_num_rows($result)) {
while($room = mysql_fetch_assoc($result)) {
$rooms[] = array('room'=>$room);
}
}
// Devices
$query = "SELECT * from device";
$result = mysql_query($query,$conn) or die('Errant query: '.$query);
$devices = array();
if(mysql_num_rows($result)) {
while($device = mysql_fetch_assoc($result)) {
$devices[] = array('device'=>$device);
}
}
header('Content-type: application/json');
echo json_encode(array('users'=>$users));
echo json_encode(array('buildings'=>$buildings));
echo json_encode(array('rooms'=>$rooms));
echo json_encode(array('devices'=>$devices));
I fear that this method isn't the right way to send multiple objects.
2) In the iOS app, how can I automatically take this JSON data and insert it into the corresponding local database tables in SQLite?
Thanks for any help.
On 1: Instead of JSON you could use binary property lists. They are natively implemented on the iPhone, and there is a library to turn PHP data into a binary plist: https://github.com/rodneyrehm/CFPropertyList
There are many benefits to using binary property lists: they are probably about 1/5 of the size of JSON, you don't need an external library to parse them, so all the code is much simpler, etc.
On 2: There is no easy way to take the JSON/plist structure and insert it into a SQL database, because JSON/plists allow much more flexibility than SQL tables. So you would have to first create the right tables in your SQLite DB and then use normal INSERT statements to insert the data one by one into the database, exactly like you would do with PHP.
Yeah, Luke's recommendation is good, but you will be fine with the way you are exporting your tables. You may just have to dig "deeper" into the structure to get what you want - i.e. your output will return a "dictionary of dictionaries of arrays" which will then contain the data for each table.
As for downloading them first:
1) NSURLConnection and its delegate methods - you can send an asynchronous request to your web server to get this file and get notified when the data has been downloaded, so the user interface is never blocked in your app.
Here's the documentation with some good examples from Apple: http://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSURLConnection_Class/Reference/Reference.html
At the end of the download you will have an NSData object, which can then be converted back to a string using NSString *jsonContents = [[NSString alloc] initWithData:jsonData encoding:NSUTF8StringEncoding];
You can then use a JSON parser library - I recommend SBJSON https://github.com/stig/json-framework - which will parse the data and return it as a dictionary or array depending on your structure.
From there you can access your tables and values with valueForKey: in dictionaries or objectAtIndex: in arrays, and then map them into your chosen local storage, for which I recommend Core Data (or you could use SQLite if you are familiar with it too).
Hope it helps.
Rog
I can't speak to 2), but for 1) I would recommend combining your JSON into a single array. One of the nice things about JSON (and arrays) is the ability to nest elements as deeply as you like:
echo json_encode(array(
'users'=>$users,
'buildings'=>$buildings,
'rooms'=>$rooms,
'devices'=>$devices,
));
What about preparing the SQLite database on the web server and downloading that to the iOS application? That way you do the heavy lifting on the PHP side. If the data is relatively static, you can even set up a scheduled task to generate the SQLite database at a regular interval.
We've done this for one of our apps and it worked very well.
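For example, here is a rough sketch of a PHP script that builds the SQLite file from a MySQL table using PDO; the table, column, path, and credential names are placeholders, and a cron job could run it periodically:
<?php
// Sketch: build a downloadable SQLite file from MySQL data (names and paths are placeholders).
$mysql  = new PDO('mysql:host=localhost;dbname=appdb', 'dbuser', 'dbpass');
$sqlite = new PDO('sqlite:/var/www/downloads/app-data.sqlite');
$sqlite->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// recreate the table so the file always reflects the current MySQL contents
$sqlite->exec("DROP TABLE IF EXISTS user");
$sqlite->exec("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT, email TEXT)");

$insert = $sqlite->prepare("INSERT INTO user (id, name, email) VALUES (?, ?, ?)");
foreach ($mysql->query("SELECT id, name, email FROM user") as $row) {
    $insert->execute(array($row['id'], $row['name'], $row['email']));
}
// The iOS app can now download /downloads/app-data.sqlite over HTTP.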
One thing to keep in mind is that you should enable gzip compression on the web server to minimize the data transfer. Remember that you have to do some extra work to use gzip compression with NSURLConnection:
http://www.bigevilempire.com/codelog/entry/nsurlconnection-with-gzip/
You can use a REST server and RestKit.
If you would like a more full-featured solution than what is offered by a standalone parsing library, you may want to take a look at RestKit: http://restkit.org/
The framework wraps the operations of fetching, parsing, and mapping JSON payloads into objects. It can handle deeply nested structures and can map directly back to Core Data for persistence.
At a high level, here's what your fetch & post operations would feel like in RestKit:
- (void)loadObjects {
    [[RKObjectManager sharedManager] loadObjectsAtResourcePath:@"/path/to/stuff.json" delegate:self];
}

- (void)objectLoader:(RKObjectLoader*)loader didLoadObjects:(NSArray*)objects {
    NSLog(@"These are my JSON decoded, mapped objects: %@", objects);

    // Mutate and PUT the changes back to the server
    MyObject* anObject = [objects objectAtIndex:0];
    anObject.name = @"This is the new name!";
    [[RKObjectManager sharedManager] putObject:anObject delegate:self];
}
The framework takes care of the JSON parsing/encoding on a background thread and lets you declare how attributes in the JSON map to properties on your object. A number of people in the community are working with PHP backends + RestKit with great success.