How to get all items from database in json format php?

I am new to PHP development.
How do I display multiple JSON objects fetched from the database? So far I am only getting a single record; I need to display all of the data inserted in the database in JSON format.
When I insert the first record, I get a response like this. When I insert a second record, the upload_details object should contain the first and second records (and so on), but it only contains the last inserted record.
{"code":200,
"message":"The file FileUpload1444329638_li.jpg has been uploaded.",
"upload_details": {"desc":"hi",
"file_name":"abc.com\/FileUpload1444329637_li.jpg"}}
When I insert second data:
{"code":200,
"message":"The file FileUpload1444329638_li.jpg has been uploaded.",
"upload_details": {"desc":"h2",
"file_name":"abc.com\/FileUpload1444329638_li.jpg"}}
Here is my code:
<?php
include 'db_config.php'; //echo "hi";exit;

if($_POST['api_name'] == "upload_file"){
    if(!empty($_FILES["profile_pic"]["name"])){
        $fileName = time().'_'.$_FILES["profile_pic"]["name"];
        if (move_uploaded_file($_FILES["profile_pic"]["tmp_name"], "uploads/".$fileName)) {
            $sql = "Insert into file_upload(`desc`,`file_name`) values ('".$_POST['desc']."','".$fileName."');";
            if($conn->query($sql)){
                $response = array(
                    'code'    => 200,
                    'message' => "The file ".basename($_FILES["profile_pic"]["name"])." has been uploaded.",
                    'upload_details' => array(
                        "desc"      => $_POST['desc'],
                        "file_name" => $_SERVER['SERVER_NAME'].dirname($_SERVER['SCRIPT_NAME']).$fileName
                    )
                );
                //print_r($response);exit;
            } else {
                $response = array('code' => 500, 'message' => "Error in uploading file");
            }
        } else {
            $response = array('code' => 500, 'message' => "Error in uploading file");
        }
    } else {
        $response = array('code' => 500, 'message' => "Error in uploading file");
    }
} elseif ($_POST['api_name'] == "get_files"){
    $response['code'] = 200;
    $response['file_lists'] = array();
    $res = $conn->query("select * from file_upload");
    while($row = $res->fetch_object()){
        array_push($response['file_lists'], array(
            'desc'      => $row->desc,
            'file_path' => $_SERVER['SERVER_NAME'].dirname($_SERVER['SCRIPT_NAME']).$row->file_name
        ));
    }
}
echo json_encode($response);
exit;
I expect the response to contain all of the data inserted in the table, like:
{"code":200,
"message":"The file has been uploaded.",
"upload_details": {"desc":"hi",
" file_name":"abc.com\/FileUpload1444329637_li.jpg"},
{"desc":"hi2",
"file_name":"abc.com\/FileUpload1444329638_li.jpg"}
}

First, in the upload branch your script only has access to the data being inserted for that POST request. That code path does nothing at all to query the records that already exist in the file_upload table, so I don't know how you expect those other records to magically be returned in the response.
Second, it would be VERY atypical to tie a single insert operation to a full listing of all records in the table into which the insert was made. This is a problematic approach in that you slow down returning a successful insert message to the caller while you query the full table. Normally, if you wanted a full listing of records, you would make a GET request against an API specifically designed for that, rather than a POST (insert).
It also seems very odd to intentionally design an API that will get slower and slower over time as you add more records and have a larger payload to return to the caller. The problem is compounded by the fact that you json_encode the data structure: the script will take more and more memory with each call to the insert API, since it has to hold the entire record set in memory to encode it. At a server level, calls to the API will consume a greater percentage of system resources with each passing request, perhaps forcing you to scale hardware just to meet this use case. You should REALLY, REALLY, REALLY (is that enough REALLYs?) reconsider this requirement.
Third, you have a significant SQL injection vulnerability. You should look into using prepared statements and/or sanitizing and validating the user input before performing the insert.
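For example, a minimal sketch of the same insert using a mysqli prepared statement (assuming $conn is a mysqli connection, as your $conn->query() calls suggest):

$stmt = $conn->prepare("INSERT INTO file_upload (`desc`, `file_name`) VALUES (?, ?)");
$stmt->bind_param("ss", $_POST['desc'], $fileName);   // bind both values as strings
if ($stmt->execute()) {
    // build the success $response here
} else {
    $response = array('code' => 500, 'message' => "Error in uploading file");
}
$stmt->close();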
Finally, the response format you propose is not valid JSON. If you decide you REALLY want to return all records on each successful insert, you would want a format like:
{
    "code": 200,
    "message": "The file has been uploaded.",
    "upload_details": [
        {
            "desc": "hi",
            "file_name": "abc.com\/FileUpload1444329637_li.jpg"
        },
        {
            "desc": "hi2",
            "file_name": "abc.com\/FileUpload1444329638_li.jpg"
        }
    ]
}
Note the array wrapper around the two returned records.
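If you do keep this requirement, a rough sketch (untested, reusing the column names from your code) of building that structure after a successful insert might look like this:

// Sketch only: after a successful insert, query every row and return them
// all under upload_details as an array of objects.
$response = array(
    'code'           => 200,
    'message'        => "The file " . basename($_FILES["profile_pic"]["name"]) . " has been uploaded.",
    'upload_details' => array(),
);
$res = $conn->query("SELECT `desc`, `file_name` FROM file_upload");
while ($row = $res->fetch_object()) {
    $response['upload_details'][] = array(
        'desc'      => $row->desc,
        'file_name' => $_SERVER['SERVER_NAME'] . dirname($_SERVER['SCRIPT_NAME']) . $row->file_name,
    );
}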

Related

parsing paginated json from web service

I am trying to parse a large amount of JSON data generated by a remote web service. The output is paginated across 500 URIs, and each URI contains 100 JSON objects. I need to match a property in each JSON object, its DOI (a digital object identifier), against a corresponding field fetched from a local database, and then update the record.
The issue I am having is controlling my looping constructs to seek out the matching JSON DOI while making sure that all the data has been parsed.
As you can see I have tried to use a combination of break and continue statements but I am not able to 'move' beyond the first URI.
I later introduced a flag variable to help control the loops without effect.
while ($obj = $result->fetch_object()) {
    for ($i = 1; $i <= $outputs_json['meta']['response']['total-pages']; $i++) {
        $url = 'xxxxxxxxxxxxxxx&page%5Bnumber%5D=' . "$i" . '&page%5Bsize%5D=100';
        if ($outputs = json_decode(file_get_contents($url), true) === false) {
        } else {
            try {
                $outputs = json_decode(file_get_contents($url), true);
                $j = 0;
                do {
                    $flag = false;
                    $doi = trim($outputs['data'][$j]['attributes']['identifiers']['dois'][0], '"');
                    if (!utf8_encode($obj->doi) === $doi) {
                        continue;
                    } else {
                        $flag = true;
                        $j++;
                    }
                } while ($j !== 101);
                if ($flag === true) break;
            } catch (Exception $e) {
            }
        }
    }
}
What is the optimal approach that guarantees every JSON object at every URI is parsed, and that CRUD operations are only performed on my database when a fetched record's DOI field matches the DOI property of the incoming JSON data?
I'm not 100% sure I understand every aspect of your question, but to me it would make sense to change the order of execution:
fetch a page from the external service
decode the JSON and iterate through all 100 objects
get one DOI
fetch the corresponding record from the database
update the DB record
when all JSON objects on the page are processed, fetch the next URL
repeat until all 500 URIs have been fetched
I think it's not a good idea to fetch one record from the local DB and try to find it across 500 remote calls; instead, it's better to base your workflow/loops on the fetched remote data and look for the corresponding elements in your local DB.
If you think that approach will fit your task - I can of course help you with the code :)
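For example, a rough sketch of that order (the endpoint URL, the records table with its doi and matched columns, and the $mysqli connection are all placeholders/assumptions on my part):

// Sketch of the suggested order: drive the loops from the remote pages and look
// each DOI up in the local DB instead of the other way around.
$totalPages = 500; // known from the service's pagination metadata

// prepared statements reused across iterations (table/column names are placeholders)
$select = $mysqli->prepare("SELECT id FROM records WHERE doi = ?");
$update = $mysqli->prepare("UPDATE records SET matched = 1 WHERE id = ?");

for ($page = 1; $page <= $totalPages; $page++) {
    $url  = 'https://example.org/api/outputs?page%5Bnumber%5D=' . $page . '&page%5Bsize%5D=100';
    $body = file_get_contents($url);
    if ($body === false) {
        continue; // skip pages that fail to download
    }
    $payload = json_decode($body, true);
    if (!is_array($payload) || empty($payload['data'])) {
        continue; // skip pages that fail to decode or carry no data
    }
    foreach ($payload['data'] as $item) {
        $doi = isset($item['attributes']['identifiers']['dois'][0])
            ? trim($item['attributes']['identifiers']['dois'][0], '"')
            : '';
        if ($doi === '') {
            continue;
        }
        $select->bind_param('s', $doi);
        $select->execute();
        $select->store_result();
        if ($select->num_rows > 0) {
            $select->bind_result($id);
            $select->fetch();
            // only touch the DB when the local record's DOI matches the remote one
            $update->bind_param('i', $id);
            $update->execute();
        }
        $select->free_result();
    }
}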

Google Big Query + PHP -> How to fetch a large data set without running out of memory

I am trying to run a query in BigQuery/PHP (using google php SDK) that returns a large dataset (can be 100,000 - 10,000,000 rows).
$bigqueryService = new Google_BigqueryService($client);
$query = new Google_QueryRequest();
$query->setQuery(...);
$jobs = $bigqueryService->jobs;
$response = $jobs->query($project_id, $query);
//query is a synchronous function that returns a full dataset
The next step is to allow the user to download the result as a CSV file.
The code above will fail when the dataset becomes too large (memory limit).
What are my options for performing this operation with lower memory usage?
(I figured an option is to save the results to another table in BigQuery and then do partial fetches with LIMIT and OFFSET, but a better solution might be available.)
Thanks for the help
You can export your data directly from BigQuery:
https://developers.google.com/bigquery/exporting-data-from-bigquery
You can use PHP to make an API call that performs the export (you don't need the bq command-line tool).
You need to set the job's configuration.extract.destinationFormat; see the reference.
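For reference, the extract job body you submit to jobs.insert looks roughly like this (field names come from the BigQuery REST API; the project, dataset, table, and bucket values are placeholders), whichever PHP client you use to send it:

// Shape of the extract job configuration; all resource names are placeholders.
$extractJob = array(
    'configuration' => array(
        'extract' => array(
            'sourceTable' => array(
                'projectId' => 'my-project',
                'datasetId' => 'my_dataset',
                'tableId'   => 'my_table',
            ),
            // the wildcard lets BigQuery split exports larger than 1 GB into shards
            'destinationUris'   => array('gs://my-bucket/export-*.json'),
            'destinationFormat' => 'NEWLINE_DELIMITED_JSON', // or 'CSV'
        ),
    ),
);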
Just to elaborate on Pentium10's answer:
You can export up to a 1 GB file in JSON format.
Then you can read the file line by line, which minimizes the memory used by your application, and json_decode each line.
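For example, a minimal sketch (the file paths are placeholders) of streaming an exported NEWLINE_DELIMITED_JSON file into a CSV without holding it all in memory:

// Sketch: read the export one row at a time so the whole file never sits in memory.
$in  = fopen('/path/to/export-000000000000.json', 'r');
$out = fopen('/path/to/result.csv', 'w');

while (($line = fgets($in)) !== false) {
    $row = json_decode($line, true);   // one JSON object per line
    if ($row === null) {
        continue;                      // skip blank or malformed lines
    }
    fputcsv($out, $row);               // append the row to the CSV download file
}

fclose($in);
fclose($out);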
The suggestion to export is a good one, I just wanted to mention there is another way.
The query API you are calling (jobs.query()) does not return the full dataset; it just returns a page of data, which is the first 2 MB of the results. You can set the maxResults flag (described here) to limit this to a certain number of rows.
If you get back fewer rows than are in the table, you will get a pageToken field in the response. You can then fetch the remainder with the jobs.getQueryResults() API by providing the job ID (also in the query response) and the page token. This will continue to return new rows and a new page token until you get to the end of your table.
The example here shows code (in Java and Python) to run a query and fetch the results page by page.
There is also an option in the API to convert directly to CSV by specifying alt='csv' in the URL query string, but I'm not sure how to do this in PHP.
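For example, something along these lines (a rough sketch; the project id, job id, and OAuth token are placeholders and error handling is omitted) pages through jobs.getQueryResults over REST and streams each page straight into a CSV file:

// Sketch: page through jobs.getQueryResults and append each page to a CSV,
// so only one page of rows is ever held in memory.
$projectId   = 'my-project';        // placeholder
$jobId       = 'job_abc123';        // returned by the initial query call
$accessToken = 'ya29....';          // a valid OAuth 2.0 token (placeholder)

$base = "https://bigquery.googleapis.com/bigquery/v2/projects/$projectId/queries/$jobId";
$out  = fopen('result.csv', 'w');
$pageToken = null;

do {
    $url = $base . '?maxResults=10000';
    if ($pageToken !== null) {
        $url .= '&pageToken=' . urlencode($pageToken);
    }
    $ctx  = stream_context_create(array('http' => array(
        'header' => "Authorization: Bearer $accessToken\r\n",
    )));
    $page = json_decode(file_get_contents($url, false, $ctx), true);

    $rows = isset($page['rows']) ? $page['rows'] : array();
    foreach ($rows as $row) {
        // each row comes back as {"f": [{"v": ...}, ...]} in the REST response
        fputcsv($out, array_map(function ($cell) { return $cell['v']; }, $row['f']));
    }
    $pageToken = isset($page['pageToken']) ? $page['pageToken'] : null;
} while ($pageToken !== null);

fclose($out);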
I am not sure if you are still using PHP, but the answer is:
$options = [
    'maxResults' => 1000,
    'startIndex' => 0
];
$jobConfig = $bigQuery->query($query);
$queryResults = $bigQuery->runQuery($jobConfig, $options);
foreach ($queryResults as $row) {
    // Handle rows
}

Wrong datatype for in_array

Good morning.
I'm currently trying to build a very basic caching system for one of my scripts. The cache is JSON data; each entry contains only one key and its value, but there are many individual entries, something like this:
{"Item1":"Item1 Description"}
{"Item2":"Item2 Description"}
{"Item3":"Item3 Description"}
What I'm intending to do is:
First check if a cache file is available
Then check if the item exists in the cache
Then add the new item along with its description if it's not already in the cache...
...or return the item's description if it is already there.
All data being stored is strings. The cache file doesn't store any other type of data.
I've put together a basic function but I'm having trouble getting it to work:
function ItemIsInCache($CacheFile, $ItemId) {
    if(file_exists($CacheFile)) {
        $json = json_decode(file_get_contents($CacheFile, true));
        if(in_array($ItemId, $json)) { // <<
            $itemname = array_search($ItemId, $json);
            return itemname;
        } else {
            $item[$itemId] = GrabItemName($ItemId);
            $itemname = array_search($ItemId, $json); // <<
            return $itemname;
        }
    } else {
        $item[$ItemId] = GrabItemName($ItemId);
        $ejson = json_encode($item);
        file_put_contents($CacheFile, $ejson);
        return $item[$ItemId];
    }
}
Notes
GrabItemName is a different function that returns the description data based on the $ItemId.
The warnings I'm getting are "Wrong datatype for second argument" in both in_array() and array_search(), on lines 4 and 9 respectively (those are the line numbers in the above code; due to the nature of my script, the actual numbers are later on). For simplicity, I've marked the problem lines with // <<.
The function is running in a loop which I've no problems with. The problems lie within this function.
What currently happens
Right now, if the cache doesn't exist, the function creates it and adds the first item from the loop to the cache file in its respective JSON format (that branch fires because the cache file doesn't exist, i.e. the final else statement).
However, items from the loop after that don't get added, presumably because the file exists and there's something wrong with the code.
The last part of the function works exactly as I want it to but the first part does not.
Expected behaviour with fixed code
Check cache > Return description if item exists ELSE add new item to cache.
The items and their associated descriptions will NOT change, but I'm pulling them from a rate-limited API, and I need to ensure I cache whatever I can for everyone's benefit.
So, any ideas what I'm doing wrong with the function? I'm sure it's something incredibly simple that I'm overlooking.
Your file is not valid JSON for an array. The correct JSON for an array is:
[
{"Item1":"Item1 Description"},
{"Item2":"Item2 Description"},
{"Item3":"Item3 Description"}
]
You're missing the brackets around the array, so you just get a single object.
When creating the initial file, you need to do:
$ejson = json_encode(array($item));
so that it's initialized as an array of one item, not just an item.
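If it helps, here is a rough sketch (mine, not your original code) of the whole function using a single JSON object keyed by item id, which also sidesteps the in_array()/array_search() datatype warnings; it assumes GrabItemName() returns the description string:

// Sketch: store the cache as one JSON object mapping item ids to descriptions.
function ItemIsInCache($CacheFile, $ItemId) {
    $cache = array();
    if (file_exists($CacheFile)) {
        // decode as an associative array; fall back to empty on missing/bad JSON
        $decoded = json_decode(file_get_contents($CacheFile), true);
        if (is_array($decoded)) {
            $cache = $decoded;
        }
    }
    if (array_key_exists($ItemId, $cache)) {
        return $cache[$ItemId];                  // cache hit
    }
    $cache[$ItemId] = GrabItemName($ItemId);     // cache miss: fetch and store
    file_put_contents($CacheFile, json_encode($cache));
    return $cache[$ItemId];
}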

Ajax calling memcache functions always querying Db

I'm experiencing a strange problem. I'm caching the output of a query using memcache functions in a file named count.php. This file is called by an ajax request every second while a user is viewing a particular page. The output is cached for 5 seconds, so within this time, if there are 5 hits to this file, I expect the cached result to be returned at least 3-4 times. However, this is not happening; every time, a query goes to the DB, as evidenced by an echo statement. But if the file is called from the browser directly by typing the URL (like http://example.com/help/count.php) repeatedly many times within 5 seconds, data is returned from the cache (again evidenced by the echo statement). Below is the relevant code of count.php:
mysql_connect(c_dbhost, c_dbuname, c_dbpsw) or die(mysql_error());
mysql_select_db(c_dbname) or die("Coud Not Find Database");
$product_id = $_POST['product_id'];
echo func_total_bids_count($product_id);

function func_total_bids_count($product_id)
{
    $qry = "select count(*) as bid_count from tbl_userbid where userbid_auction_id=".$product_id;
    $row_count = func_row_count_only($qry);
    return $row_count["bid_count"];
}

function func_row_count_only($qry)
{
    if($_SERVER["HTTP_HOST"] != "localhost")
    {
        $o_cache = new Memcache;
        $o_cache->connect('localhost', 11211) or die("Could not connect to memcache");
        //$key="total_bids" . md5($product_id);
        $key = "KEY" . md5($qry);
        $result = $o_cache->get($key);
        if (!$result)
        {
            $qry_result = mysql_query($qry);
            while($row = mysql_fetch_array($qry_result))
            {
                $row_count = $row;
                $result = $row;
                $o_cache->set($key, $result, 0, 5);
            }
            echo "From DB <br/>";
        }
        else
        {
            echo "From Cache <br/>";
        }
        $o_cache->close();
        return $row_count;
    }
}
I'm confused as to why the DB is hit every second when ajax calls this file, but cached data is returned when the URL is typed into the browser. To try the URL method I just replaced $product_id with a valid number (e.g. $product_id=426 in my case). I don't understand what's wrong here, as I expect data to be returned from the cache within 5 seconds after the first hit. I want the data to be returned from the cache. Can someone please help me understand what's happening?
If you're using the address bar, you're doing a GET, but your code is looking for $_POST['...'], so you will end up with an invalid query. So for a start, the results using the address bar won't be what you're expecting. Is your Ajax call actually doing a POST?
Please also note that you've got a SQL injection vulnerability there. Make sure $product_id is an integer.
There are many problems with your code. First of all, you always connect to the database and select a database, even when you don't need to. Second, you should check $result with !empty($result), which is more reliable than just !$result because it also covers empty objects.
As noted above, if the product_id is not in the $_POST array, you could use $_REQUEST to also cover $_GET (but you shouldn't if you are certain it's coming via $_POST).
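A small sketch of those two fixes at the top of count.php (my suggestion, not your original code): accept the id from either method while testing in the browser, and force it to an integer before it reaches the query:

// Sketch: accept the id from GET or POST while debugging, and cast it to an
// integer so it cannot carry SQL injection into the query below.
$product_id = isset($_REQUEST['product_id']) ? (int) $_REQUEST['product_id'] : 0;
if ($product_id <= 0) {
    die('Invalid product_id');
}
echo func_total_bids_count($product_id);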

How to protect processing files

So I have a PHP form-processing file, say process.php, with code like this:
<?php
$value = $_POST['string']; //Assume string is safe for database insertion
$result = mysql_query("INSERT into table values {$value}");
if($result) {
    return true;
} else {
    return false;
}
?>
Ideally, only someone who is logged in to my website should be allowed to send that POST request to perform the insertion. But as things stand, anyone who knows this processing file's path and the request being sent can send a spoofed POST request from any domain (if I'm not wrong). This will lead to the insertion of unwanted data into the database.
One thing I did was check whether a user is logged in before the insertion; if not, I ignore the POST request. But how exactly should I secure my processing files against exploits?
As it stands, this is vulnerable to SQL injection. Make sure you use a parameterized query library like PDO for the insert (and, if you are storing file contents, the MySQL BLOB or LONGBLOB column type). You should never use mysql_query().
You should also keep track of the user's id for user access control. It doesn't look like you have taken this into consideration.
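A minimal sketch of both points combined (the messages table, its columns, and the DSN credentials are hypothetical; the login check assumes you store a user id in the session when someone logs in):

// Sketch only: session-based access check plus a PDO prepared statement.
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Not logged in');
}

$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'dbuser', 'dbpass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

// the placeholders (?) keep user input out of the SQL string entirely
$stmt = $pdo->prepare('INSERT INTO messages (user_id, body) VALUES (?, ?)');
$ok = $stmt->execute(array($_SESSION['user_id'], isset($_POST['string']) ? $_POST['string'] : ''));

echo $ok ? 'saved' : 'error';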

Categories