I am trying to manage caching on a heavily used web page written in PHP. I have marked some cacheable sections of PHP code, which I want to execute only once, pre-caching their output, whenever the administrator makes changes in the CMS. For this, I use the following method:
I have a file (for example "index-source.php") with some marked areas of PHP code, each of which is interpretable on its own. When the admin changes some settings, these marked parts are executed and replaced with their results (for example, the MySQL queries that read menu items from the DB are replaced with the generated HTML menu). The resulting file is saved as a new "index.php", which still contains some PHP code that can't be optimized by caching.
Now to my problem:
Assume that this server is heavily loaded, receiving for example 100 requests per second, each of which requires index.php. If I use file_put_contents() to overwrite this index.php with the new pre-cached version, is there any risk that some requests will be interrupted because of a locked or not fully overwritten file? Basically, I want to somehow update my PHP file and be assured that PHP will include either the complete old version or the complete new version of that file, or wait a few milliseconds until the file is overwritten. I don't want PHP to fail the require or to load a partially overwritten file.
Is that possible? Thanks
file_put_contents is not what you want.
Have a look at this project, and dive into the source to get a feel for the challenges you may face, as well as the solutions chosen.
https://github.com/PHPSocialNetwork/phpfastcache
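That said, for the atomic-swap part of the question: a common pattern (not specific to that library) is to write the new version to a temporary file in the same directory and then rename() it over index.php. On Linux, rename() within a single filesystem is atomic, so every require sees either the complete old file or the complete new one, never a partial mix. A minimal sketch, where $newContent is assumed to hold the freshly pre-cached PHP source:
$tmp = __DIR__ . '/index.php.tmp.' . uniqid(); // temp file in the SAME directory: rename() is only atomic within one filesystem
file_put_contents($tmp, $newContent); // $newContent: the new pre-cached version (assumption)
rename($tmp, __DIR__ . '/index.php'); // atomic swap: readers see the old or the new file, never a mix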
Related
I am working on a website in which it would be useful to allow a user the option of downloading the contents of a file, even when it's going to be updated by another user at the same time or later.
My problem is that the solution I've tried so far allows downloading, but disrupts any later updating of the file. I don't think I can present the updating code concisely (it is spread over multiple files), except to say that it works by AJAXing the data (and I'm not sure why that would cause this problem). In case it's relevant, this is a file that gets updated multiple times.
When I use FireFTP I can download the file without disrupting this process, which makes me optimistic that there's a PHP solution. I am currently downloading the data by AJAXing the file contents to the page the "downloading user" is on. The code for this (within PHP) is:
$file_contents = file_get_contents($_POST['file']); // file address comes through an AJAX POST request
echo $file_contents; // to access the content client side
Is there another way to access the text/content within a file without any unintended consequences on other server processing of it?
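One possibility, assuming the updating code can be made to lock the file with flock(), is for the reader to take a shared lock before reading, so the read waits for any in-progress write instead of colliding with it. A rough sketch, assuming the writer locks with LOCK_EX:
$fp = fopen($_POST['file'], 'r'); // file address from the AJAX POST request; validate this path in real code
if ($fp !== false && flock($fp, LOCK_SH)) { // shared lock: blocks only while a writer holds LOCK_EX
    echo stream_get_contents($fp); // send the content back to the AJAX caller
    flock($fp, LOCK_UN);
}
if ($fp !== false) { fclose($fp); }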
I'm looking for a way to send a user a regular file (mp3s or pictures) and to keep count of how many times the file was accessed, without going through an HTML/PHP page.
For example, the user will point his browser to bla.com/file.mp3 and start downloading it, while a server-side script will do something like saving data to a database.
Any idea where should I get started?
Thanks!
You will need to go through a PHP script. What you can do is rewrite the extensions you want to track, preferably at the folder level, to a PHP script that does the calculations you need and then serves the file to the user.
For Example:
If you want to track the /downloads/ folder, you would create a rewrite rule on your web server that rewrites all (or just specific) extensions to a PHP file, which we'll call proxy.php for this example.
An example URI would be proxy.php?file=file.mp3. The proxy.php script sanitizes the file parameter, checks whether the user has permission to download (if applicable), checks that the file exists, serves the file to the client, and performs any operations needed on the backend, like database updates.
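A minimal sketch of such a proxy.php; the downloads/ directory and the MIME type are assumptions, and the counter update is left as a comment:
<?php
// proxy.php: target of a rewrite rule like proxy.php?file=file.mp3
$file = isset($_GET['file']) ? basename($_GET['file']) : ''; // basename() strips directory components (sanitization)
$path = __DIR__ . '/downloads/' . $file;
if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}
// Perform backend operations here, e.g. an UPDATE ... hits = hits + 1 query against your database.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path); // serve the file to the client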
Do you mean that you don't want your users to be presented with a specific page that interrupts their flow? If so, you can still use a PHP page, using the following steps (a rough sketch in code follows the list). (I'm not up to date with PHP, so it'll be pseudo-code, but you'll get the idea.)
1. Provide links to your file as (for example) http://example.com/trackedDownloader.php?id=someUniqueIdentifier
2. In the trackedDownloader.php file, determine the real location on the server that relates to the unique id (e.g. 12345 could map to /uploadedFiles/AnExample.mp3).
3. Set an appropriate content type header in your output.
4. Log the request to your database.
5. Return the contents of the file directly as page output.
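A minimal PHP sketch of those steps; the table and column names are assumptions:
<?php
// trackedDownloader.php: maps an opaque id to a file, logs the hit, streams the file
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->prepare('SELECT path, mime FROM tracked_files WHERE id = ?'); // steps 1-2: id to real location
$stmt->execute(array($_GET['id']));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
if ($row === false) {
    http_response_code(404); // unknown id
    exit;
}
header('Content-Type: ' . $row['mime']); // step 3: content type header
$pdo->prepare('UPDATE tracked_files SET hits = hits + 1 WHERE id = ?')->execute(array($_GET['id'])); // step 4: log
readfile($row['path']); // step 5: file contents as page output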
Otherwise, you would need to scan the server's log files. Either way, you would most likely want to store the counters in a database.
There is a great solution for serving static files through PHP:
https://tn123.org/mod_xsendfile/
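With mod_xsendfile enabled, the PHP script only does the bookkeeping and sets headers, and Apache streams the file itself. A minimal sketch; the path and MIME type are placeholders:
$path = '/var/files/file.mp3'; // resolve and validate the real path first
// update your download counter in the database here
header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="file.mp3"');
header('X-Sendfile: ' . $path); // mod_xsendfile intercepts this header and serves the file efficiently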
I am editing PHP files on my server. I am in the habit of including several comments at the top of each file containing licensing information, my web address, and the version number of the software. I'd like a more convenient way of maintaining this practice without having to manually deal with it when I open and update PHP files on my server. It would be nice if I could also append a timestamp of my recent modification at the bottom of the file too.
I am currently using Dreamweaver 8.0.2.
If Dreamweaver could insert this header at the top (taking into account that the file may already have a header in place), that would be acceptable.
If the FTP server could modify the file (assuming it is a PHP file) when I download or upload it, that might work too.
I'm unsure what approach to take or what would work for this.
The better option is to use "Beyond Compare". It lets you see files in an Explorer-like view and open multiple files for editing and saving.
I always use it. With it, you can open multiple files on the server and save them without having to wait while each one saves to the server; just switch to another file (in a new tab).
:)
The best solution is to include a header script that checks the file's last edit date for the timestamp and simply outputs the rest of the information.
The licensing information, web address, and software version number should only need to be changed once for all the files: echoing the comments from PHP, or simply putting them in an HTML part, would be enough.
As for the timestamp of the most recent modification, you could use the following script to get the last update time of the file:
$time = filemtime(__FILE__); // filemtime() needs a filesystem path; $_SERVER['SCRIPT_FILENAME'] also works for the requested script
$convenient_date = date('d-m-Y', $time);
echo "Last modified on {$convenient_date}.";
Before I begin, I must warn you that I'm not much of a web programmer so my methods may seem somewhat roundabout and the terminology I use may be awkward.
Here's the situation. I'm developing a website for users to visualize data.
I have a public PHP page sitting at /var/www/thepage/index.php (yes, Linux server + Apache). This is the main page of the site and is also where users make selections in a form.
Upon form submission, a second PHP page is called, and this is where the form selections from the first page are passed to the JavaScript that creates the visualization. In order for that to happen, CSV files are first written into this directory by a PHP script that queries a MySQL database.
Thing is, I want users to be able to see the visualizations but not be able to download the CSV files (unless they are admin). The way I allow admins to download the files is by creating a protected (.htaccess) subdirectory, /var/www/thepage/secure/, which has an index HTML page that runs a CGI script once an admin logs in (prompted when a download link is clicked). This script copies the latest files (which have dynamic names) from the /var/www/thepage/ directory into the secure/ directory under static filenames. Download links pointing to these statically named files are on the protected index.html. However, if a user looks at the source code of the second PHP page, they can also download the files, since they know the paths and the files are not protected.
If I remove the file permissions, the PHP script won't be able to read the files either, causing the visualization to fail (I want normal users to be able to see the visualizations). It is also important to keep the files, because I have a CGI script (bash + awk) that runs a mathematical function on them, which also requires read permission.
Obscuring the filenames doesn't really work either, since the files are written on the fly and the source code of the HTML page will reveal the obscured CSV filenames being written.
How can I get around this problem? I would prefer not to have to create sessions and log-ins for normal users, etc...
As previously said, it's hard to hide anything on the net, especially if you need to send it to JavaScript. You could try hacking around it a bit; it could cost you a little performance, but it would be a deterrent against people who aren't web savvy... though it could also be seen as a challenge by others :)
A rough example would be something like:
$handle = fopen("/var/www/thepage/secure/file.csv", "r"); // fgetcsv() needs a file handle, not a path
$rows = array();
while (($row = fgetcsv($handle)) !== false) { $rows[] = $row; } // read every CSV row
fclose($handle);
echo "<script type='text/javascript'>";
echo "var csvData = " . json_encode($rows) . ";"; // expose the parsed CSV to the client-side code
echo "</script>";
Bit rusty here, but JavaScript will interpret the JSON as an array/object that you can use in your code. You could go a step further and break the PHP array into sections before sending it off, making it harder to tell what's going on.
Like I said. It's rough, but it could be a solution.
I would have imagined the best way is to store the files in a secure directory that isn't accessible in a web browser (outside of the web root). You could then show a list of the available files to authenticated users with a download link. When they click the link you could check they are logged in and if so then begin the file download.
PHP's readfile() may help.
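A rough sketch of that approach; the data directory and the is_admin() check are placeholders for your own layout and login logic:
<?php
// download.php: serves a CSV stored outside the web root to admins only
if (!is_admin()) { // hypothetical auth check; replace with your own
    http_response_code(403);
    exit;
}
$path = '/var/data/thepage/' . basename($_GET['file']); // outside /var/www; basename() blocks path traversal
if (!is_file($path)) {
    http_response_code(404);
    exit;
}
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="' . basename($path) . '"');
readfile($path);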
I am trying to generate an RSS feed from a MySQL database I already have. Can I use PHP in the XML file that is sent to the user, so that it generates the content upon request? Or should I run the PHP file via cron and generate a static XML file? Or should I trigger the PHP script that generates the XML whenever the content to be used in the RSS feed is submitted? What do you think is the best practice?
All three approaches are technically possible. However, I would not use cron, because it delays the update of your XML files after the database content has changed.
You can easily embed PHP code in your XML files; you just have to make sure the files are interpreted as PHP on the server side, either by renaming them with a *.php extension or by changing the server directives in the .htaccess file.
But I think the best practice here is to generate new XML files upon updating the database contents. I would guess the XML files are viewed more often than the database content changes, so this approach reduces the server load.
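In code, that just means rebuilding the static file right after the content change is saved; a short sketch, where build_rss_from_db() is a hypothetical helper:
// Run this right after the database update succeeds.
$xml = build_rss_from_db($pdo); // hypothetical helper returning the complete RSS markup as a string
file_put_contents('/var/www/feed.xml', $xml, LOCK_EX); // LOCK_EX avoids interleaved writes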
Use a cron to automate a PHP script that builds the XML file. You can even automate the mail part as well in your PHP.
The third method you mentioned. I don't see how cron could be used here when the data arrives with users' requests, and the first method cannot be implemented.
Set the Content-Type header to text/xml and have your PHP script generate the XML just as it would generate any other content. You may want to consider using caching, though, so you don't overwhelm the server by accident.
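A minimal on-request generator along those lines; the DSN, table, and column names are assumptions:
<?php
// feed.php: builds the RSS feed from the database on every request
header('Content-Type: text/xml; charset=utf-8');
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<rss version="2.0"><channel>' . "\n";
echo '<title>My Site</title><link>https://example.com/</link><description>Latest items</description>' . "\n";
$items = $pdo->query('SELECT title, url, created_at FROM items ORDER BY created_at DESC LIMIT 20');
foreach ($items as $row) {
    echo '<item><title>' . htmlspecialchars($row['title']) . '</title>';
    echo '<link>' . htmlspecialchars($row['url']) . '</link>';
    echo '<pubDate>' . date(DATE_RSS, strtotime($row['created_at'])) . '</pubDate></item>' . "\n";
}
echo '</channel></rss>';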