I'm having difficulty firing an image tracking pixel from crontab (server side) using PHP.
For example:
<img src="http://www.company.com/tracking.php?p=abcdefg#abcdefg.com"/>
I tried using fputs, cURL and file_get_contents; every time I fire the pixel, company.com doesn't receive the hit with the parameter.
But when I browse the URL itself, they say it works as expected.
What is the preferred way to fire HTTP or HTTPS (not postback) pixels from cron and PHP?
Maybe I should use wget to fire the pixels?
Thanks,
Bentzy
Without knowing exactly how company.com's tracking works, it could be several things.
One could be that it only registers hits for 'valid' browsers - e.g. it might look at the headers sent with the request and decide whether it's a valid hit based on that. That would probably be my guess.
You could try to fool it by sending browser-like headers along with your request - you should be able to do this using cURL and PHP. You can find more about cURL in the docs:
http://php.net/manual/en/book.curl.php
Other than that, without knowing what company.com is, it's hard to say exactly how their tracking works. It could also be that it requires a cookie to be set on the user's browser before it registers a hit; this can also be done with cURL, but is a little more involved.
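For example, here is a minimal sketch of a browser-like request with cURL - the header values are placeholders, not anything specific to company.com. Note also that anything after # in a URL is a fragment and is never sent to the server, so if it is meant to be part of the parameter it must be encoded as %23:

<?php
// Fire the pixel with browser-like headers and a cookie jar.
$ch = curl_init('http://www.company.com/tracking.php?p=abcdefg%23abcdefg.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0');
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Accept: image/png,image/*;q=0.8,*/*;q=0.5',
    'Referer: http://www.company.com/',
));
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/pixel_cookies.txt');  // store any cookies the tracker sets
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/pixel_cookies.txt'); // and send them back on later runs
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects the tracker may issue
curl_exec($ch);
curl_close($ch);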
Related
I've got the following problem at hand:
I have users on two separate pages, but both save their page input to the same text file. While one user is editing, the other can't. I keep track of this with sessions, and write the changes - and whose turn it is to edit - into a file.
This works fine so far; the output in the end is quite similar to a chat. However, right now my users have to manually refresh their page and reload the file. What I'd like is for the page to execute a redirect when the file timestamp changes (indicating that the last user has saved their edits and it's another user's turn). I've looked into JavaScript short polling a little, but then found the PHP filemtime() function, which looks much easier to use. Well - here's what I've got:
while (true) {
    $oldtimestamp = filemtime("msks/" . $session['user']['kampfnr'] . ".txt");
    $waittimer = 2;
    sleep($waittimer);
    clearstatcache(); // filemtime() results are cached within a request
    $newtimestamp = filemtime("msks/" . $session['user']['kampfnr'] . ".txt");
    if ($newtimestamp > $oldtimestamp) {
        addnav("", "kampf_ms.php?op=akt");
        redirect("kampf_ms.php?op=akt");
    }
}
In theory, while the user sees the output "it's ... turn to edit the file.", this should loop in the background, checking whether the file has been updated, and if so, redirect the user.
In practice, this heavily affects server performance (I'm on shared hosting) until it breaks with a "memory exceeded" error message.
Is something wrong with the code? Or is it generally a bad idea to use a while loop in this case?
Thanks in advance!
PHP should only be used to generate web content (the client makes a request to the server => the server calls the required script and returns the response to the client).
Once the page is built and delivered to the client, the connection is closed; the server could go down and the client wouldn't be informed...
So with an infinite loop, not only can the client end up waiting an infinite time for the response, but the server may also be heavily impacted by the load... It really is a bad idea :)
PHP can't be used for bidirectional communication: it is just invoked to build the pages the client requests, so it can't do anything "in the background" (not directly - you can call an external script, but not to notify a client...).
Also, for bidirectional communication, PHP and "regular" HTTP are not a good fit, because of the client/server architecture (the server only answers client requests; it is passive).
I can suggest the WebSocket protocol for building a chat application:
http://socket.io/
https://en.wikipedia.org/wiki/WebSocket
But for that, you need an "active" server-side solution, such as Node.js or Ruby (depending on your server's capabilities...).
The other way, if you want to stay in PHP, is to have the client make an Ajax request every 10 seconds (for example) to a PHP script which checks the file and sends a message back if it has been updated - but that kind of polling is really wasteful because of the performance cost, so forget it immediately.
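For reference, if you did go that polling route anyway, the PHP side would stay small. A rough sketch - check.php and the 'since' parameter are invented names, and $_SESSION stands in for however your framework stores the user:

<?php
// check.php - hypothetical polling endpoint: the client sends the last
// timestamp it saw, and we answer whether the file changed since then.
session_start();
$file = "msks/" . $_SESSION['user']['kampfnr'] . ".txt";
clearstatcache(); // filemtime() results are cached within a request
$mtime = file_exists($file) ? filemtime($file) : 0;
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;
header('Content-Type: application/json');
echo json_encode(array('changed' => $mtime > $since, 'mtime' => $mtime));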
I have a cron job running a PHP script, but there's some HTML and JavaScript that needs to execute for the actual script to work.
Converting the JavaScript to PHP isn't an option.
Basically, I need it to act as though a person is viewing the page every time the cron job runs.
EDIT:
The script uses JavaScript from a different site to encrypt some passwords so it can log into my account on that site, and the JavaScript is thousands of lines long. The flow of the script is: send data to the website > get the data it sends back > use the site's JavaScript to alter the data > set an HTML form value to the value returned by the JavaScript function > submit the HTML form to get the info back to PHP > send the data to log me in. I know the code is very shoddy, but it's the only way I could think of to do it without rewriting all the JavaScript they use to encrypt the password in PHP.
You can try Node.js to run the JavaScript code on the server.
Install your favorite web browser, and then have the cron job run the browser with the URL as an argument.
Something like:
/usr/bin/firefox www.example.com/foo.html
You'll probably want to wait a minute or so and then kill the process, or find a better way to determine when it finishes.
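For example, a crontab entry along these lines would load the page at 03:00 and kill the browser after a minute - assuming GNU coreutils' timeout is available; also, a graphical browser needs a display, so a headless server may need something like xvfb-run:

0 3 * * * timeout 60 /usr/bin/firefox www.example.com/foo.html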
Cron jobs always run on the server side only. When there is no client side, how can you really expect JavaScript to work?
Anyway, the solution is: use the cron job to run another PHP script, which in turn calls the PHP script you want to run, using cURL.
E.g. file1.php - the file you want to execute, on whose page you expect the JavaScript to work.
file2.php - another file you create... In this file, use cURL to call file1.php (make sure you provide the full http:// path, like you would type in a browser - you can also pass values the way GET/POST methods on HTML forms do). In your cron job, call file2.php.
Make sure cURL is available and that no firewall rule is blocking HTTP calls (i.e. port 80 calls) to the same server. On most servers both conditions are fulfilled.
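A rough sketch of what file2.php could look like (the URL and parameters are placeholders - and note, per the edit below, that cURL alone still won't execute the JavaScript on the fetched page):

<?php
// file2.php - fetches file1.php over HTTP so it runs as a normal web request.
$ch = curl_init('http://www.example.com/file1.php?foo=bar'); // full http:// path, GET params as needed
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture output instead of printing it
$response = curl_exec($ch);
if ($response === false) {
    error_log('Call to file1.php failed: ' . curl_error($ch));
}
curl_close($ch);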
---------- Sorry guys - Kristian Antonsen is right, so don't consider this a full answer at the moment. However, I am leaving it up, as someone might get food for thought from it. -----
In simplest terms, I use external PHP scripts throughout my client's website for various purposes, such as getting search results, updating content, etc.
I keep these scripts in a directory:
www.domain.com/scripts/scriptname01.php
www.domain.com/scripts/scriptname02.php
www.domain.com/scripts/scriptname03.php
etc..
I usually execute them using jQuery AJAX calls.
What I'm trying to do is find a piece of code that will detect (from within) whether these scripts are being executed via an AJAX call or manually, via the URL, by the user.
IS THIS POSSIBLE??
I have searched absolutely everywhere and tried various methods involving the $_SERVER[] array, but still no success.
What I'm trying to do is find a piece of code that will detect (from within) whether these scripts are being executed via an AJAX call or manually, via the URL, by the user.
IS THIS POSSIBLE??
No, not with 100% reliability. There's nothing you can do to stop the client from simulating an Ajax call.
There are some headers you can test for, though, namely X-Requested-With. They would prevent an unsophisticated user from calling your Ajax URLs directly. See Detect Ajax calling URL
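A minimal sketch of that header test in PHP (and again, the header is trivially spoofable):

<?php
// jQuery and most other AJAX frameworks send this header automatically.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';
if (!$isAjax) {
    header('HTTP/1.1 403 Forbidden');
    exit('This script is meant to be called via AJAX.');
}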
Most AJAX frameworks will send an X-Requested-With: header. Assuming you are running on Apache, you can use the apache_request_headers() function to retrieve the headers and check for it/parse it.
Even so, there is nothing preventing someone from manually setting this header - there is no real 100% foolproof way to detect this, but checking for this header is probably about as close as you will get.
Depending on what you need to protect and why, you might consider requiring some form of authentication, and/or using a unique hash/PHP sessions, but this can still be reverse engineered by anyone who knows a bit about JavaScript.
As an idea of things you can verify: if you check all of these before servicing a request, it will afford a degree of certainty (although not much, and none at all if someone is deliberately trying to circumvent your system):
- Store a unique hash in a session value, and require the AJAX call to send it back to you (in a cookie or a request parameter) so you can compare the two on the server side and verify that they match (see the sketch after this list)
- Check that the X-Requested-With: header is set and that its value is sensible
- Check that the User-Agent: header is the same as the one that started the session
The more things you check, the more chance an attacker will get bored and give up before they get it right. Equally, the longer each request will take and the more system resources it will consume...
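As a sketch of the first two checks (the 'ajax_token' name and the request parameter are invented for illustration):

<?php
session_start();
// Issue the token once, when rendering the page that makes the AJAX calls.
if (!isset($_SESSION['ajax_token'])) {
    $_SESSION['ajax_token'] = md5(uniqid(mt_rand(), true));
}
// In the AJAX endpoint: require the token back, plus the usual Ajax header.
$tokenOk  = isset($_REQUEST['token']) && $_REQUEST['token'] === $_SESSION['ajax_token'];
$headerOk = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && $_SERVER['HTTP_X_REQUESTED_WITH'] === 'XMLHttpRequest';
if (!$tokenOk || !$headerOk) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}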
There is no 100% reliable way to prevent a user from invoking your script if he knows its address.
This is why you have to authenticate every request to your script. If your script is only meant to be called by authenticated users, check for that authentication again inside the script. Treat the request as you would treat incoming user input - validate and sanitize everything.
On Edit: The same could be said for any script which the user can access through the URL. For example, consider profile.php?userid=3
I have a while loop that constructs a URL for an SMS API.
This loop will eventually be sending hundreds of messages, and thus hitting hundreds of URLs.
How would I go about doing this?
I know you can use header('Location: ...') to change the location of the browser, but that isn't going to work here, as the PHP page needs to remain running.
Hope this is clear
Thank you
You have a few options:
- file_get_contents, as Trevor noted
- curl_* - use the cURL library of functions to make the request
- fsock* - handle the connection at a slightly lower level, making and managing the socket connection yourself
All will probably work just fine and you should pick one depending on your overall needs.
After you construct each $url, use file_get_contents($url).
If it is just the case that during the construction of all these URLs you get a "Maximum Execution Time Exceeded" error, then add set_time_limit(10); after each URL generation to give your script an extra 10 seconds to generate the next one.
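Put together, something like this sketch - buildSmsUrl() is a stand-in for however you construct each URL, and file_get_contents() on a URL requires allow_url_fopen to be enabled:

<?php
set_time_limit(0); // hundreds of HTTP calls can easily exceed the default 30 seconds
foreach ($messages as $msg) {
    $url = buildSmsUrl($msg); // hypothetical helper building the SMS API URL
    $result = file_get_contents($url); // fires the request; the browser never navigates anywhere
    if ($result === false) {
        error_log("SMS request failed for $url");
    }
}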
I'm not quite sure what you are actually asking in this question - do you want the user to visit the URLs (if so, does the end user's web browser support JavaScript?), just to be shown the URLs, the URLs to be generated and stored, or the PHP script to fetch each URL (and do you care about the user seeing the result)? If you clarify the question, the community may be able to provide you with a perfect answer!
Applying a huge amount of guesswork, I infer from your post that you need to dynamically create a URL, and that invoking that URL causes an SMS message to be sent.
If this is the case, then you should not be trying to invoke the URL from the client but from the server side, using the URL wrappers or cURL.
You should also consider running the loop in a separate process and reporting back to the browser using (e.g.) AJAX.
Have a google for spawning long-running processes in PHP - but be warned, there is a lot of bad advice published on the topic.
C.
I am looking for a way to start a function on form submit that would not leave the browser window waiting for the result.
Example:
The user fills in the form and presses submit; the data from the form goes via JavaScript to the database, and a PHP function that will take several seconds starts. I don't want the user to be left waiting for the end of that function - I would like to be able to take them to another page and leave the function doing its thing server side.
Any thoughts?
Thanks
Thanks for all the replies...
I got the AJAX part. But I cannot call AJAX and have the browser move to another page.
This is what I wanted:
- User fills in the form and submits
- The result from the form is passed to the database
- The long annoying process function starts
- The user carries on visiting the rest of the site, independent of the status of the "long annoying process function"
By the way, and before someone suggests it: no, it cannot be done by a cron job.
Use AJAX to call the PHP script, and at the top of that script turn on ignore_user_abort:
ignore_user_abort(true);
That way, if they navigate away from the page, the script will continue running in the background. You can also use
set_time_limit(0);
to remove the execution time limit, which is useful if you know your script will take a while to complete.
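So the top of the script the AJAX call hits would look something like this (longtask.php is a made-up name):

<?php
// longtask.php - endpoint for the long-running work.
ignore_user_abort(true); // keep running even if the user navigates away
set_time_limit(0);       // lift the execution time limit
// ... the long annoying process goes here ...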
The most common method is:
exec("$COMMAND > /dev/null 2>&1 &");
Ah, OK - well, you're essentially asking whether PHP supports threading, and the general answer is no... however...
there are some tricks you can perform to mimic this behaviour, one of which is highlighted above: forking out to a separate process on the server. This can be achieved in a number of ways, including the exec() method. You may also want to look here:
PHP threading
I have also seen people try to force a flush of the output buffer halfway through the script, attempting to push the response back to the client early. I don't know how successful this approach is, but maybe someone else will have some information on it.
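For completeness, that flush trick usually looks something like this sketch - whether it actually works depends on the web server and output buffering configuration:

<?php
ignore_user_abort(true); // don't die when the client disconnects
ob_start();
echo 'Request accepted - processing continues on the server.';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush(); // the client receives the complete response here
// ... long-running work continues after the response has been sent ...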
This is exactly what AJAX (shorthand for Asynchronous JavaScript + XML) is for:
AJAX Information
It allows you to write client-side code that sends asynchronous requests to your server, so that the user's browser is not interrupted by an entire page request.
There is a lot of information about AJAX out there on the web, so take a deep breath and get googling!
Sounds like you want to use some of the features AJAX (Asynchronous JavaScript and XML - google it) has to offer.
Basically, you would have a page with content. When the user clicks a button, JavaScript is used to POST data to the server and begin processing. Simultaneously, that JavaScript might load a page from the server and then display it (e.g. load data, then replace the contents of a DIV with the new page).
This kind of thing is the premise behind AJAX, which you see everywhere when a web page does multiple things simultaneously.
Worth noting: this doesn't mean the script is running "in the background on the server". Your web browser is still maintaining a connection with the web server - which means the code is running in the "background" on the client's side. And by "background" we really mean "processing the HTTP request in parallel with other HTTP requests to give the feel of a background process".