Send long JSON string by POST in PHP

From my Windows client, I need to send my data to the server as JSON, like this:
[{"score":"MathJun: 90","shenase":"2981051261"},
{"score":"MathJun: 80","shenase":"2981021192"},
{"score":"ChemJun: 90","shenase":"2981027931"},
{"score":"MathFeb: 90","shenase":"2981060775"},
{"score":"MathJun: 90","shenase":"2981010824"},
{"score":"MathJun: 00","shenase":"2981017039"},
{"score":"ChemJun: 10","shenase":"3120292011"}]
The number of JSON objects is between 1 and 40.
In my PHP file, I loop over the JSON objects and insert a record into my database for each one.
So the JSON string can get quite long. Someone told me to split it into 5 parts and send them as 5 GET requests. Does that really help?
What is the best solution for this job?
Does the length of the JSON string cause problems?
If so, how should I fix it?
Does executing 40 queries in a for loop cause a problem?

JSON can occasionally cause problems because it has to follow a strict structure, and GET is not suited to sending big data; if you want to send a large payload, it is better to use POST.
You can refer to the link below for more details about GET and POST:
http://www.w3schools.com/tags/ref_httpmethods.asp
Coming to your problem:
If you send multiple requests, you keep the server busy. Instead, send the data in a single shot; you can then insert it either in a for loop or with a single multi-row INSERT like this:
insert into table1 (First,Last) values ('Fred','Smith'),
('John','Smith'),
('Michael','Smith'),
('Robert','Smith');
I suggest you use POST, send the data in a single shot, and build a string such as
$str="insert into table1 (First,Last) values ('Fred','Smith'),
('John','Smith'),
('Michael','Smith'),
('Robert','Smith')";
and then execute that query. It should be faster than running 40 separate inserts.
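As a rough sketch of how the PHP side could handle this (the table name scores, the column names, and the $pdo connection are illustrative assumptions, not from the original post), you can decode the POSTed JSON and build one multi-row INSERT with bound parameters:

// Minimal sketch: decode the POSTed JSON (assumed to arrive as the raw request body)
// and insert all rows in one query. Table/column names and $pdo are assumptions.
$rows = json_decode(file_get_contents('php://input'), true);

if (is_array($rows) && count($rows) > 0) {
    $placeholders = [];
    $values = [];
    foreach ($rows as $row) {
        $placeholders[] = '(?, ?)';
        $values[] = $row['shenase'];
        $values[] = $row['score'];
    }
    $sql = 'INSERT INTO scores (shenase, score) VALUES ' . implode(', ', $placeholders);
    $stmt = $pdo->prepare($sql);
    $stmt->execute($values); // one round trip instead of up to 40
}

Binding parameters instead of concatenating the values into the SQL string also protects you against malformed or malicious input.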

Related

PHP - Read every line in a CSV, take the data and Post/CURL each line one by one

I have about 14,000 lines in a CSV file.
Each line is formatted like this:
Firstname Lastname,UUID
Firstname Lastname,UUID
I want to Post/CURL to a url like:
http://some.url/directory/file.php?name=Firstname%20Lastname&uuid=UUID
I have tried doing this a few ways, but I always run into an issue. What's a good way to do this? I don't want to hammer the URL by sending 14,000 requests at once.
I do not need any data back from each posting, I just need to send the data over to the PHP file so it can be analyzed.
NOTE: Each post triggers a MySQL query that analyzes the data, so I don't want to flood the server with 14,000 MySQL requests at once either; all the more reason to throttle this.
I just have to do this one time ever.
Would something like "pull data from csv file and submit each row to url with curl" be a good place to start?
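No answer is quoted here, but one possible throttled approach in PHP (the CSV file name and the delay between requests are illustrative assumptions) is to read the file with fgetcsv() and POST each row with cURL, pausing briefly between requests:

// Sketch: read the CSV and POST each row, with a short pause so the
// server is not flooded. File name and delay are illustrative assumptions.
$handle = fopen('names.csv', 'r');
while (($row = fgetcsv($handle)) !== false) {
    list($name, $uuid) = $row;
    $ch = curl_init('http://some.url/directory/file.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(['name' => $name, 'uuid' => $uuid]));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // the response body is ignored
    curl_exec($ch);
    curl_close($ch);
    usleep(100000); // roughly 10 requests per second
}
fclose($handle);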

Which way is better to send data by POST method in objective-C?

I am moving data from SQLite on the iPhone to a web server (MySQL); a PHP script is the landing page that inserts the data into MySQL after receiving it via POST.
I am using NSURLConnection in iOS. I am not sure how best to move the data, so I figured I could append all the data from a table into one string, with fields separated by commas and rows separated by \n. The data consists of complete postal addresses (name, address, phone, email) and there might be up to 500 records, so I don't know whether there are better ways.
My options are:
Send each row separately after reading it from the table
Send all the data in one single variable (string)
I don't know which is better.
Please advise.
I recommend the POST method: POST has no fixed character limit (only server-side limits such as PHP's post_max_size apply), so you can send the whole data set to your MySQL server in one request.
Alternatively, you could make an export or a backup in phpMyAdmin (or something similar) and import that into your MySQL server.
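On the PHP landing side, a minimal sketch (the table and column names, the connection details, and the choice to POST the rows as a JSON array rather than a comma/newline-separated string are illustrative assumptions) could insert all the rows inside one transaction:

// Sketch of the landing script; names and connection details are assumptions.
// The client is assumed to POST the rows as a JSON array in the request body.
$rows = json_decode(file_get_contents('php://input'), true);

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO addresses (name, address, phone, email) VALUES (?, ?, ?, ?)');
foreach ($rows as $row) {
    $stmt->execute([$row['name'], $row['address'], $row['phone'], $row['email']]);
}
$pdo->commit(); // 500 rows in one transaction completes quickly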

Using data from a web-service in an iOS Application

I have to list the table entries from MySQL in my iPhone app when a button is pressed. I am able to do it, but just not the way I want. I am using a PHP script to send my app what has to be displayed.
First, I tried using an HTML table, but I didn't like the way it was rendered with cells. Next I tried printing plain text, with spaces between columns and \n for every new row. I used NSURL and loaded it into a webView on the iPhone. It looks good in a browser, but the formatting is not preserved when the iPhone opens it.
Is there a good way to do this, so I can just list the table entries without going through a traditional HTML table? Any other idea is welcome.
Also, please keep it simple, as I am new to Obj-C and PHP as well.
Thanks!!
Any thoughts on how I can do this in a UITableView? Do I have to return a string with separator characters and fill the tableView from it?
Output the results encoded as JSON. Send an (a)synchronous request to the server using NSURLConnection or a third-party library such as AFNetworking. Parse the JSON using NSJSONSerialization, turning the results into an array or dictionary depending on the contents. Then feed the results into your UITableViewCells. It may be easier to subclass the cell so that it carries the data you'd like to use.
To encode the results from the database into JSON, you can use PHP's json_encode() function.
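A minimal sketch of the PHP side (the table name, column names, and connection details are illustrative assumptions):

// Sketch: fetch the entries and return them as JSON for the app to parse.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$rows = $pdo->query('SELECT id, name, value FROM entries')->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($rows); // NSJSONSerialization turns this into an NSArray of NSDictionaries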

Which thing is better and efficient for data display using AJAX Full template or JSON data

I am new to jQuery and JSON, and I am a bit confused about what the best and most professional approach is.
Suppose I have a long list of data such as song name, artist, author, etc.
Now I want to dynamically display records from the database.
I have two options:
First, return the full HTML and update the target element with it.
Second, retrieve the JSON data with the song info and then build the HTML with JavaScript and populate it.
I want to know which approach is better and which is used by high-traffic sites.
The second is better because of the amount of data sent over the network; reducing it will improve performance.
Write a function that uses the JSON data to generate HTML elements.
If you're running a high-traffic site with lots of data, then the second solution, using JSON, has the advantage of sending only the raw data and relying on the browser to generate the HTML to display it.
Personally I would need to hear some really convincing arguments for using the first option at all.
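To make the size difference concrete, here is a rough PHP sketch of one endpoint serving both styles (the songs table, its columns, and the format parameter are illustrative assumptions); the JSON branch sends far fewer bytes per record:

// Sketch: same data served as a full HTML fragment or as raw JSON.
// Table/column names, connection details, and the format switch are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=music', 'user', 'pass');
$songs = $pdo->query('SELECT name, artist, author FROM songs')->fetchAll(PDO::FETCH_ASSOC);

if (isset($_GET['format']) && $_GET['format'] === 'json') {
    // Option 2: raw data only; JavaScript on the client builds the markup.
    header('Content-Type: application/json');
    echo json_encode($songs);
} else {
    // Option 1: full HTML fragment that the client drops into the target element.
    foreach ($songs as $song) {
        echo '<li>' . htmlspecialchars($song['name']) . ' - ' . htmlspecialchars($song['artist']) . '</li>';
    }
}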

Handle CSV import with large array through a three-step process

I need some help with a project of mine. It is a DVD database. At the moment I am planning to implement a CSV import function to bring in DVDs with all their information from a file.
I will do this in three steps.
Step 1
- show the data I want to import, building an array
- import the data, building session arrays
Step 2
- edit the information
Step 3
- show the result before the update
- update the data
So far it works, but I have a problem with large files. The CSV data has 20 columns (title, genre, plot, etc.), and for each line in the CSV I create several arrays to use in the next steps.
When I have more than about 500 lines, the browser often hangs while importing and I get no response.
So now I am trying to do this as an AJAX call process. The advantage is that I can define how many rows the system handles per call, and the user can see that the system is still working, like a status bar when downloading/uploading a file.
At the moment I am trying to find a useful example illustrating how I can do this, but I have not found anything so far.
Maybe you have some tips or an example of how this could work, say processing 20 lines per call and building the array.
Afterwards I would like to use the same function to build the session arrays used in the next step, and so on.
Some information:
I use fgetcsv() to read the rows from the file. I go through the rows, and for each column I run different queries, such as whether the item ID is unique, the title exists, the description exists, etc.
If one of these pieces of data is missing, I get an error telling me in which row and column the error occurred.
I'd appreciate any help.
Use the LOAD DATA INFILE syntax. I've used it on files upwards of 500 MB with 3 million rows and it takes seconds, not minutes.
http://dev.mysql.com/doc/refman/5.0/en/load-data.html
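A minimal sketch of what that could look like from PHP (the file path, table name, and column list are illustrative assumptions; the MySQL server and client may also need local_infile enabled):

// Sketch: bulk-load the CSV straight into MySQL instead of looping in PHP.
// File path, table name, and column list are illustrative assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=dvddb', 'user', 'pass',
               [PDO::MYSQL_ATTR_LOCAL_INFILE => true]);
$pdo->exec("
    LOAD DATA LOCAL INFILE '/tmp/dvds.csv'
    INTO TABLE dvds
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (title, genre, plot)
");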
While this is not the direct answer you were looking for, 500 lines shouldn't take too long to process, so here is another thought for you.
Create a temporary table with the right field structure.
You can then extract the various unique entries for plot, genre, etc. from it using SELECT statements, rather than building a bunch of arrays along the way.
A MySQL import of your data would be very fast.
You can then edit it as required, and finally insert the data from your temporary (but now validated) table into your final table.
In terms of doing it with AJAX, you would have to set up a repeating timed event to refresh the status. The problem is that, rather than every 20 lines, it would have to run on a fixed time interval, because your browser has no way of knowing how far along the server is, assuming the CSV is uploaded and you process it in 20-line chunks.
If you paste the CSV into a big textbox, you could work on it by taking the first 20 lines and passing the remainder to the next page, and so on, but that strikes me as a potential mess.
So while I know I have not answered your question directly, I hope I have given you food for thought about alternative and possibly more practical approaches.
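If you do go the chunked-AJAX route, a rough server-side sketch (the endpoint name, session keys, and the 20-line chunk size are illustrative assumptions) could process a fixed number of lines per call and report progress as JSON, which the page polls with an increasing offset to drive a progress bar:

// import_chunk.php - sketch: process 20 CSV lines per AJAX call.
// Endpoint name, session keys, and chunk size are illustrative assumptions.
session_start();

$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$chunk  = 20;
$rows   = [];

$handle = fopen($_SESSION['csv_path'], 'r');
for ($i = 0; $i < $offset && fgetcsv($handle) !== false; $i++) {
    // skip lines already handled in earlier calls
}
for ($i = 0; $i < $chunk && ($row = fgetcsv($handle)) !== false; $i++) {
    $rows[] = $row; // run the per-column checks (unique ID, title, description) here
}
fclose($handle);

$existing = isset($_SESSION['import_rows']) ? $_SESSION['import_rows'] : [];
$_SESSION['import_rows'] = array_merge($existing, $rows);

header('Content-Type: application/json');
echo json_encode(['processed' => $offset + count($rows), 'done' => count($rows) < $chunk]);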
