I wanted to know some things about Firebug. When I load a page with Firebug open, it starts the timeline. What are:
waiting,
receiving,
DOMContentLoaded,
Load?
Which of these do MySQL queries affect? I've noticed that the more MySQL queries I add, the longer the receiving part gets.
Let me paste a function I use in my core to generate a dynamic link or content.
function getContent($id = '') {
    $id = mysql_real_escape_string($id); // note: $id is never actually used in the query below
    $sql = 'SELECT id, post_title, post_content FROM wp_posts WHERE post_category="67" ORDER BY post_date DESC LIMIT 1';
    $res = mysql_query($sql) or die(mysql_error());
    if (mysql_num_rows($res) != 0):
        while ($row = mysql_fetch_assoc($res)) {
            // strip the WordPress [caption] shortcode, take the first 250
            // characters for the excerpt, then encode for HTML
            $mycontent = $row['post_content'];
            $mycontent = strip_tags($mycontent);
            $mycontent = preg_replace("/\[caption.*\[\/caption\]/", '', $mycontent);
            $mycontent = substr($mycontent, 0, 250);
            $mycontent = htmlentities($mycontent);
            // encode the title for HTML
            $title = htmlentities($row['post_title']);
            echo '
            <h1>' . $title . '</h1>
            <div class="cssclass">' . $mycontent . '</div>
            ';
        }
    else:
        echo "This page doesn't exist.";
    endif;
} // end getContent
Is anything wrong with this code, or is this normal? My table has about 75,000 rows.
Thank you for reading this post.
waiting: after sending a request to the server, this is the time spent waiting for data to start coming back
receiving: time spent receiving the content
DOMContentLoaded: time until the entire DOM is available (note: this is not all resources loaded, just the HTML portion, i.e. the closing </html> tag has been received and processed)
load: time until the entire page, including images/scripts/CSS, has been received, processed and loaded.
Don't worry about the receiving portion increasing. You're outputting more data, so it'll take more time to receive. That's perfectly normal.
Waiting is the time between when the browser sends the request and receiving any data at all from the server.
Receiving is the time actually getting data.
The reason Receiving is longer is that you're sending more data, so it takes longer to download. You might expect Waiting to go up slightly too, but the time spent transferring the data across the network outweighs the time spent producing it on the server.
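If you want to confirm where the time is going, a quick way (just a sketch, reusing the same mysql_* calls and query as the question) is to time the query on the server: query time feeds the Waiting phase, while the size of what you echo feeds Receiving.
<?php
// Sketch: measure query time server-side. A slow query shows up in
// Firebug's "waiting"; a large echo'd payload shows up in "receiving".
$start = microtime(true);
$res = mysql_query('SELECT id, post_title, post_content
                    FROM wp_posts
                    WHERE post_category="67"
                    ORDER BY post_date DESC
                    LIMIT 1') or die(mysql_error());
$queryTime = microtime(true) - $start;

// Log rather than echo, so the measurement doesn't inflate the payload.
error_log(sprintf('query took %.4f s', $queryTime));
?>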
I have a script which gets some values from an XML response.
Here's the code:
<?php
// Data
$xml_data = '<image_process_call><image_url>https://i.pinimg.com/originals/e4/41/54/e44154308e3466d987665c6d50887f06.jpg</image_url><methods_list><method><name>collage</name><params>template_name=Nun Face in Hole;</params></method></methods_list><result_format>jpg</result_format><result_size>800</result_size><template_watermark>false</template_watermark></image_process_call>';

// Settings
$app_id = '';
$key = '';
$sign_data = hash_hmac('SHA1', $xml_data, $key);

// Send request (urlencode so the spaces in the template name survive the URL)
$request_url = 'http://opeapi.ws.pho.to/addtask?data=' . urlencode($xml_data) . '&sign_data=' . $sign_data . '&app_id=' . $app_id;
$request_xml = simplexml_load_file($request_url);
$request_id = strval($request_xml->request_id);

if ($request_id !== '') { // isset() would always be true here; test for an empty string instead
    $result_url = 'http://opeapi.ws.pho.to/getresult?request_id=' . $request_id;
    sleep(6);
    $result_xml = simplexml_load_file($result_url);
    $result_status = strval($result_xml->status);
    $result_img = strval($result_xml->result_url);
    if ($result_img !== '') {
        echo $result_img;
    } else {
        echo 'Result image not found';
    }
} else {
    echo 'Request ID not found';
}
?>
The problem is the time it takes to generate the second XML file. $result_xml takes a few seconds to become available, so I have to use the sleep(6) function.
If I remove it, I need to refresh the page (at least three times) to get a link to the generated image from the second XML.
Do you have an idea how to do this more professionally? I can't be sure that every image will be generated within 6 seconds (sometimes it's shorter, sometimes longer).
Is there any method for fetching the result only after $result_img is available? Thanks in advance for your help!
I think it's worth showing how it works in practice.
The script sends $request_xml and the site returns this XML:
<image_process_response>
<request_id>2d8d4dec-4344-4df0-a1e1-0c8df304ad11</request_id>
<status>OK</status>
<description/>
<err_code>0</err_code>
</image_process_response>
The script takes request_id from this XML and requests $result_xml. However, the image URL isn't in that XML immediately; it takes a few seconds to appear.
After refreshing the page three times, or using the sleep(6) function, we finally get:
<image_process_response>
<request_id>2d8d4dec-4344-4df0-a1e1-0c8df304ad11</request_id>
<status>OK</status>
<result_url>
http://worker-images.ws.pho.to/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</result_url>
<result_url_alt>
http://worker-images.ws.pho.to.s3.amazonaws.com/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</result_url_alt>
<limited_image_url>
http://worker-images.ws.pho.to/i1/3F797C83-2C2E-401C-B4AF-C4D36BBD442D.jpg
</limited_image_url>
<nowm_image_url>
http://worker-images.ws.pho.to/i1/9F1E2EAF-5B31-4407-8779-9A85F35862D3.jpg
</nowm_image_url>
<duration>2950.879097ms</duration>
<total_duration>2956.124067ms</total_duration>
</image_process_response>
Edit:
When I try to fetch the result immediately, I get this XML:
<image_process_response>
<request_id>e615f0a1-ddee-4d81-94c4-a392f8f123e8</request_id>
<status>InProgress</status>
<description>The task is in progress, you need to wait for sometime.</description>
</image_process_response>
So this is the reason why I see a blank page...
Does anyone have an idea how to make the script re-request the second XML until it contains a result_url?
According to the Pho.to API, an add-task request is a queued POST request.
In my opinion: send the status request in a while loop, but wait a smaller amount of time instead of a fixed 6 seconds; check the status in image_process_response and keep looping while it is InProgress. After that, you can safely fetch the processed image result, as in the sketch below.
You may hit a timeout due to a low timeout configuration (DoS protection) if you run this script on a web server (via CGI/FastCGI). To handle that, queue the task when the HTTP request arrives and process it offline (that is, outside the web environment).
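A minimal sketch of that loop, reusing the endpoints from the question (the 1-second interval and the 30-attempt cap are arbitrary assumptions):
<?php
// Poll getresult until the task is no longer InProgress.
function waitForResult($request_id, $max_attempts = 30) {
    $result_url = 'http://opeapi.ws.pho.to/getresult?request_id=' . $request_id;
    for ($i = 0; $i < $max_attempts; $i++) {
        $result_xml = simplexml_load_file($result_url);
        if ($result_xml !== false && strval($result_xml->status) !== 'InProgress') {
            return strval($result_xml->result_url); // empty string if missing
        }
        sleep(1); // wait a little, then poll again
    }
    return null; // gave up; caller should report an error
}

$result_img = waitForResult($request_id);
echo ($result_img !== null && $result_img !== '') ? $result_img : 'Result image not ready';
?>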
I am building an animation tool where I fetch a value from my database and a picture then animates to a certain position. My question is whether it's possible to retrieve data constantly, or every 5 seconds or so?
Somehow like this:
while (autoretrieve) {
    $data = mysql_query("select * from ......");
}
UPDATED from here
Thanks for your answers! They made it a little clearer what to do. Maybe I can explain better what my code does.
As I said, I am building an animation program where balls of information move around to different locations. I have one value that is updated frequently in the database; let's call it 'city'.
First, on the previous page, I post the balls of information I want based on the 'city', like this (simplified):
$pid = $_POST['id'];
$pcity[0] = $_POST['city'];
$pcity[1] = $_POST['city'];
$pcity[2] = $_POST['city'];
//...
while (autoretrieve) { // HOW TO?
    $data = mysql_query("SELECT * FROM table WHERE city='$pcity[0]' OR city='$pcity[1]'"); //...
    while ($rows = mysql_fetch_array($data)) {
        $city = $rows['city'];
        $id = $rows['id'];
        if ($city == 'example1') {
            // animate to a certain position (attached to image)
        }
        else if ($city == 'example2') {
            // animate to a certain position (attached to image)
        }
    }
}
So for every update in the database, the image animates to a new position. A time interval of 5 seconds would be great. I'm not an expert in coding, so sorry for the deprecated code. I'm not so familiar with AJAX either, so what would need to be added to the code? It is also important that the page does not reload, just the fetch from the database.
You can do it with AJAX and JavaScript: write one JavaScript function that contains the AJAX code to retrieve the data from the database, and on page load use setTimeout (or setInterval) to call that function every 5 seconds. A rough sketch of the server side such a call would hit follows.
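The JavaScript side is beyond the scope here, but the PHP endpoint the AJAX call would poll every 5 seconds could look roughly like this (table and column names are taken from the question; everything else is an assumption):
<?php
// positions.php - sketch of the endpoint polled by the AJAX call.
// Returns the current rows as JSON for the client to animate with.
header('Content-Type: application/json');

$cities = isset($_GET['cities']) ? explode(',', $_GET['cities']) : array();
$in = "'" . implode("','", array_map('mysql_real_escape_string', $cities)) . "'";

$data = mysql_query("SELECT id, city FROM `table` WHERE city IN ($in)");
$rows = array();
while ($row = mysql_fetch_assoc($data)) {
    $rows[] = $row;
}
echo json_encode($rows);
?>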
You can use the sleep function to control how often you want to fetch data:
while (autoretrieve) {
    $data = mysql_query("select * from ......");
    // output your data here; see the note on Server-Sent Events below
    sleep(5);
}
Since you haven't specified how you plan to access the data, I'm writing this answer assuming Server-Sent Events, as they are the only approach for which a loop like the one above makes sense. You'll most likely want to fetch the data with AJAX instead, but Server-Sent Events can also be a good way to achieve this; a minimal sketch follows.
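For reference, a minimal Server-Sent Events endpoint in PHP might look like this (the query is a placeholder based on the question; the EventSource consumer on the page is assumed):
<?php
// sse.php - minimal Server-Sent Events sketch. The browser connects
// with new EventSource('sse.php') and receives a message every 5 s.
set_time_limit(0); // the loop runs longer than the default script limit
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');

while (true) {
    $data = mysql_query("SELECT id, city FROM `table`"); // placeholder query
    $rows = array();
    while ($row = mysql_fetch_assoc($data)) {
        $rows[] = $row;
    }
    // SSE frames are "data: ...\n\n"
    echo 'data: ' . json_encode($rows) . "\n\n";
    if (ob_get_level() > 0) ob_flush();
    flush();
    sleep(5);
}
?>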
And don't use the mysql_* functions; they're deprecated. Switch to PDO or mysqli_*.
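For instance, the query could look roughly like this with PDO (DSN and credentials are placeholders):
<?php
// Rough PDO equivalent with a prepared statement.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('SELECT id, city FROM `table` WHERE city IN (?, ?, ?)');
$stmt->execute(array($pcity[0], $pcity[1], $pcity[2]));
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>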
I'm running a query over a database with over 20MM entries, which means I'm breaking the query into several smaller queries.
The problem is that if I try to fetch all 20MM entries, the page does not load and displays a blank screen with no title and no content. However, if I fetch 5MM entries, the page loads correctly and displays the content:
Here's my code
$like = array();
$likes375 = 0;
set_time_limit(0); // no execution time limit for this script

for ($n = 0; $n < 20000000; $n = $n + 500000) {
    $m = 500000;
    $query = "SELECT * FROM user_likes LIMIT " . $n . "," . $m;
    //echo $query;
    $result = mysql_query($query) or die(mysql_error());
    // build the counts array
    while ($row = mysql_fetch_array($result)) {
        if (!isset($like[$row['name']])) $like[$row['name']] = 0;
        $like[$row['name']]++;
        if ($like[$row['name']] == 375) $likes375++;
    }
}
// print the sizes
echo count($like) . "<br>";
echo "375: " . $likes375;
I would appreciate it if someone could help me with this.
Thanks
EDIT:
After adding error_reporting(E_ALL); the page displays this error: MySQL server has gone away
The reason is that your page stays blank while MySQL is crunching through the table looking for your results. After a while the request times out (depending on your settings), which is why this doesn't happen with the smaller queries.
This is due to the execution time limit on PHP scripts; by default a PHP script will run for 30 seconds (the max_execution_time setting) before timing out...
Just use this after <?php
set_time_limit(0); // 0 means unlimited
but it is not recommended to rely on it; it's better to improve the logic at both the database and the PHP level, for example as sketched below.
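One such improvement (a sketch, assuming the counting in your loop is all you need) is to let MySQL do the counting with GROUP BY instead of dragging 20MM rows into PHP:
<?php
// Sketch: push the counting into MySQL. One row per distinct name comes
// back, so PHP handles thousands of rows instead of 20 million.
$query = "SELECT name, COUNT(*) AS cnt FROM user_likes GROUP BY name";
$result = mysql_query($query) or die(mysql_error());

$names = 0;
$likes375 = 0;
while ($row = mysql_fetch_assoc($result)) {
    $names++;
    if ($row['cnt'] >= 375) $likes375++;
}
echo $names . "<br>";
echo "375: " . $likes375;
?>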
EDIT:
Try setting an error log for this file specifically:
ini_set("log_errors", "1");
ini_set("error_log", "errors.txt");
ini_set("display_errors", "0");
I've created autosave via AJAX for my content management system, but I'm having a problem. When I test on my local server, the PHP side updates a big piece of data easily; but when I test on my web host, if the updated content is large, PHP doesn't update the table row on the first attempt, only on the second. Any suggestions? How do I deal with this problem?
PHP Side
<?php
session_start();
require '../../core/includes/common.php';
$err=array();
$name=filter($_POST['name'],$db);
$id=$db->escape_string($_POST['id']);
$title=filter($_POST['title'], $db);
$parentcheck=$db->escape_string($_POST['parentcheck']);
if(isset ($_POST['parent'])) $parent=$db->escape_string($_POST['parent']);
else $parent=$parentcheck;
$menu=$db->escape_string($_POST['menu']);
$content = html($_POST['content'], $db);
if (!isset($content)) die('error');
$result=$db->query("UPDATE pages AS p, menu AS m SET m.parent='$parent', m.name='$name', m.showinmenu='$menu', p.id='$id', p.title='$title', p.content='$content' WHERE m.id='$id' AND p.id=m.id") or die($db->error);
if ($result){
echo "{";
echo '"msg": "Success" ';
echo "}";
}
else{
echo "{";
echo
'"err": "Error"';
echo "}";
}
?>
Your code has tons of potential error paths, and you're mixing two types of error handling: dying straight away and reporting the action result back to AJAX. You should do one or the other. This version always reports something back in a format your AJAX request can deal with:
<?php
session_start();
require '../../core/includes/common.php';
...
if (!isset($content)) die('{"err": "No content"}'); // This will never happen BTW.
$result = $db->query("UPDATE pages AS p, menu AS m SET m.parent='$parent', m.name='$name', m.showinmenu='$menu', p.id='$id', p.title='$title', p.content='$content' WHERE m.id='$id' AND p.id=m.id");
if ($result)
{
    echo '{"msg": "Success"}';
}
else
{
    echo '{"err": "Error"}';
}
?>
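On a side note, building the JSON with json_encode instead of string concatenation avoids exactly the kind of stray-quote bug fixed above; for example:
echo $result ? json_encode(array('msg' => 'Success'))
             : json_encode(array('err' => 'Error'));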
Could you parse the HTML and store only the important content, i.e. the div rather than the entire page? MySQL's LONGTEXT column type can hold up to 4GB, but it's not safe to overload MySQL like that.
Idea 1
Could you save the HTML content in an XML file and store the link, along with the node, in the MySQL column, so that AJAX picks up the XML and the node to display the data on the fly?
Idea 2
Gzip the content and store it in MySQL (sketched below).
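A sketch of Idea 2 (the compression level and column names are assumptions; the column should be a BLOB type, since gzcompress output is binary):
<?php
// Store compressed; level 6 is an arbitrary default.
$compressed = gzcompress($content, 6);
$db->query("UPDATE pages SET content='" . $db->escape_string($compressed) . "' WHERE id='$id'");

// ...and decompress on the way out.
$content = gzuncompress($row['content']);
?>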
I'm working on a jQuery AJAX call that returns a set of search results. I return HTML with an echo, and I need to back up the results by inserting them into the database. I will also have to check whether they already exist before inserting.
I want the HTML returned as soon as possible. If I add the insert code below the echo, will the entire script have to finish running before the HTML is returned?
The most important thing is getting content back to the user right away.
This is all on mobile, so every 100 milliseconds counts.
$data = file_get_contents($url);
$result = json_decode($data); // no second argument, so we get objects back
$returnhtml = '';
foreach ($result->results as $items) {
    $name = $items->name;
    $description = $items->desc;
    $id = $items->id;
    $coverurl = $items->coverurl;
    $returnhtml .= "<h3>" . $name . "</h3>";
    $returnhtml .= "<h4>" . $description . "</h4>";
}
echo $returnhtml;
//how to backup to database
//check if already in db
//insert into db
To answer your question: yes, the entire script must complete before the HTML is output back to the client. You can use output buffering to capture the output, send it, and then continue with the other processing. The server will still try to output anything printed afterwards, but AJAX doesn't know what to do with a second chunk of output, so it just ignores it.
Look into http://www.php.net/manual/en/ref.outcontrol.php
It'll be something like:
ob_start();
// ... generate the HTML for the client ...
ob_flush(); // send the output buffer to the client
flush();    // push it through PHP's own buffering as well
// ... insert data into the database ...
ob_end_clean();
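A fuller sketch of that pattern (fastcgi_finish_request() is the clean way under PHP-FPM; the header/flush fallback is an approximation whose behaviour depends on server buffering):
<?php
ob_start();
echo $returnhtml; // the HTML for the client

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // response complete; the client stops waiting
} else {
    header('Content-Length: ' . ob_get_length());
    header('Connection: close');
    ob_end_flush();
    flush();
}

// ...now check for existing rows and INSERT into the database...
?>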
After doing the ob_flush as explained above, you can try MySQL's INSERT DELAYED if you don't want to wait for the INSERT to complete.
http://dev.mysql.com/doc/refman/5.5/en/insert-delayed.html