Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 3 years ago.
In my PHP website, I call a Python script that uses Theano and runs on the GPU.
However, when calling this Python script from PHP, Apache doesn't seem to have any permissions on the GPU, so the program falls back to the CPU, which is far less efficient.
How can I grant Apache the rights to run programs on the GPU?
I would split that up: save the request as an event in some storage (Redis, for example, or even RabbitMQ) and listen to it with a daemonized script (cron would be a bad choice, since it's hard to make it run more often than every minute). The script will update the storage entry with the results, and you can access it again in your HTTP stack. You can implement the functionality via AJAX, or use a usleep call in PHP to wait for the results. If you use a while loop, don't forget to break out of it after a second or so, so the request doesn't run too long.
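A minimal sketch of that pattern in Python, with an in-memory dict as a stand-in for Redis/RabbitMQ and a thread as a stand-in for the daemonized script (all names here are hypothetical):

```python
import threading
import time
import uuid

jobs = {}  # stand-in for Redis: job_id -> result (None while still pending)

def worker(job_id):
    """Stand-in for the daemonized script that does the slow GPU work."""
    time.sleep(0.1)               # pretend this is the Theano computation
    jobs[job_id] = "result-data"  # write the result back to shared storage

def submit():
    """Web-request side: record the event and return immediately."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = None
    threading.Thread(target=worker, args=(job_id,), daemon=True).start()
    return job_id

def poll(job_id, timeout=1.0, interval=0.05):
    """Like the PHP usleep loop: wait briefly, then give up so the
    HTTP request never runs too long; the client retries via AJAX."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if jobs[job_id] is not None:
            return jobs[job_id]
        time.sleep(interval)
    return None
```

In the real setup the submit and poll sides would be separate PHP requests talking to the same Redis instance, and the worker a long-running process outside Apache.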
Your problem might be the configured user that executes the PHP binary: it may not be permitted to access those binaries on your system. Typically it's the www-data user. By adding www-data to the necessary group, you might be able to solve this without splitting everything up. Have a look at the binary's ownership and permissions to figure that out.
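A quick way to check that from Python (a hypothetical diagnostic sketch, assuming a Linux system where the GPU appears as a device node such as /dev/nvidia0):

```python
import grp
import os
import pwd

def user_groups(username):
    """All group names a user belongs to (primary plus supplementary)."""
    primary = grp.getgrgid(pwd.getpwnam(username).pw_gid).gr_name
    supplementary = [g.gr_name for g in grp.getgrall() if username in g.gr_mem]
    return {primary, *supplementary}

def device_group(path):
    """Name of the group that owns a file or device node."""
    return grp.getgrgid(os.stat(path).st_gid).gr_name

if __name__ == "__main__":
    # Hypothetical check: can www-data reach the GPU device via its group?
    user, dev = "www-data", "/dev/nvidia0"
    if os.path.exists(dev):
        ok = device_group(dev) in user_groups(user)
        print(f"{user} {'is' if ok else 'is NOT'} in the group owning {dev}")
```

If the check fails, adding the user to the device's group (e.g. `usermod -a -G <group> www-data`) and restarting Apache is the usual fix; the actual group name varies by driver setup.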
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
I'm working on a program that uses Python scripts to pull information from a MySQL database. The MySQL database is managed from a PHP site interface (so HTML forms, buttons, and such). I have this all set up on a Raspberry Pi and it works. But I want to add some more functionality.
Specifically, I want to be able to execute certain Python scripts from the PHP site. I want it to be as simple as the press of a button, literally.
Is this a scenario where I should use Django? I've never used it before but have read about how it connects Python with the web. I found an answer to a similar question, but I'm wondering if I need to set up anything special on my Apache server: https://stackoverflow.com/a/31811462/5609876
I even made a little picture as a visual representation of my program, in case my explanation wasn't good enough:
No, you do not need Django at all.
If all you want to do is execute a Python script from PHP - assuming you have already written the script and stored it somewhere:
First, assign execute permissions on the Python script to the user that is running the PHP code. Normally, this is the same user that is running Apache. Usually this is called www-data or apache or something similar. The user will be listed in the Apache configuration file.
Then, on your PHP side, all you really need is exec:
<?php
exec('/path/to/python /path/to/your/script.py')
?>
If the shell_exec function is allowed on your server, you can use it to run your Python script through the shell. shell_exec returns the full output of the command. The only thing you have to make sure of is that shell_exec isn't disabled in your server's php.ini file (look for a disable_functions line that includes shell_exec).
If your Python script is called mypythonscript.py and is in the same directory as the PHP file, you can run it like this:
<?php echo shell_exec('python mypythonscript.py'); ?>
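For illustration, here is a hypothetical mypythonscript.py written to be invoked this way; everything the script prints to stdout becomes shell_exec()'s return value on the PHP side:

```python
# mypythonscript.py: a script meant to be called from PHP via shell_exec().
# Whatever it prints to stdout is the string that shell_exec() returns.
import sys

def main(argv):
    # Hypothetical behavior: greet whatever argument PHP passed along.
    name = argv[1] if len(argv) > 1 else "world"
    print(f"hello {name}")

if __name__ == "__main__":
    main(sys.argv)
```

On the PHP side you could pass arguments too, e.g. `shell_exec('python mypythonscript.py ' . escapeshellarg($name))`; always escape anything that originates from user input.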
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I'm just arriving in the PHP world.
Based on the experiments I'm doing, I'm making some assumptions:
1- A PHP script is launched when the file it resides in is requested via HTTP.
2- The script creates an independent scope for its vars.
3- The script can only access other DOM elements at the end of its execution. However, it can perform file operations at any time.
Are those assumptions right?
What other actions can it perform while being executed?
Thanks.
All of these assumptions are at least partially incorrect or, in the case of the last one, a bit nonsensical. You might want to find a good tutorial and also read through PHP The Right Way.
To address your assumptions in part:
1- A PHP script is launched when the file it resides in is requested via HTTP.
Usually, but not necessarily. PHP files can also be run from the command line or via other protocols.
2- The script creates an independent scope for its vars.
No. Any variables created outside of a function, class, or method have global scope and are automatically shared across any included or required files.
3- The script can only access other DOM elements at the end of its execution. However, it can perform file operations at any time.
This doesn't really make sense; PHP scripts don't have anything to do with accessing the DOM. PHP runs purely on the server side. It can do anything on the server (assuming correct permissions, etc.) right up until it terminates.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I need to create an application in PHP with a background thread containing a timer that keeps updating a database (by collecting data from different sites) without any user interaction. What I mean by this is: without anybody visiting the site, the thread has to keep updating the database. Is this possible in PHP, and how can I achieve it?
I think the best way is to create a PHP script that does whatever you want and then set up a cron job to run that script at a specific time or interval.
There are several options for this:
A scheduled task in your operating system, such as cron on *nix or Windows Scheduler for the Windows platform.
A permanently running script. This is not ideal for PHP though, as memory is sometimes not correctly released, and the script can run out of memory. It is common for such scripts to be set to die and respawn to prevent this from happening.
A scheduled task in your database server. MySQL now supports this. If your purpose is to run database updates, this might be a good option, if you are running MySQL, and if your version is sufficiently recent.
A queue, where some processing is done in the background upon a request signal. See Gearman, Resque and many others. It is useful where a user requests something in a web application, but that request is too lengthy to carry out immediately. If you need something to run permanently then this may not be ideal - I add it for completeness.
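The "die and respawn" pattern from the second option can be sketched like this (a hypothetical Python worker; the respawning itself would be handled by cron, systemd, or a supervisor, and `update_database` is a placeholder for the real work):

```python
import time

MAX_ITERATIONS = 100  # die periodically so the supervisor respawns us

def update_database():
    """Placeholder for the real work: fetch remote data, write it to the DB."""
    pass

def run(max_iterations=MAX_ITERATIONS, interval=0.0):
    """Permanently-running worker that exits after a bounded number of
    cycles, so any slowly leaking memory is reclaimed on respawn."""
    done = 0
    for _ in range(max_iterations):
        update_database()
        done += 1
        time.sleep(interval)
    # process exits here; the supervisor restarts it with a fresh heap
    return done
```

The same cap-and-exit idea applies to a long-running PHP CLI script; only the respawn mechanism differs.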
Having a PHP process run for a long time isn't really a good idea, because PHP isn't a very memory-efficient language and PHP processes consume a lot of memory.
It would be better to use a job manager. Take a look at Gearman.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
Is it possible, on a website/webserver (with full root access), to run a PHP script that executes MySQL queries in the background? What I mean by that:
A user clicks to process something; however, to prevent the user from waiting for the query to run, it should look to the user as if it were already done, so they don't have to wait on PHP/MySQL in the browser.
The script, however, should keep running on the server and finish.
How can I do that? If there is no effective solution in PHP, is it possible with other languages?
I'm not talking about cron jobs. I'm on an Ubuntu machine (no Windows).
This would be for running many PHP scripts (all the same) in the background. Would Nginx or Apache be the better solution? Is it even relevant?
The best architecture I could recommend here is probably a queue/worker setup. For instance, this is simple to do with Gearman (alternatively: ØMQ, RabbitMQ or similar advanced queues). You spin up a number of workers which run in the background and can handle the database query (I'm partial to daemonizing them with supervisord). Spin up as many as you want to support running in parallel; since the job is apparently somewhat taxing, you want to carefully control the number of running workers. Then any time you need to run this job, you fire off an asynchronous job for the Gearman workers and return immediately to your user. The workers will handle the request whenever they get around to it. This assumes you don't need any particular feedback for the user, i.e. that the job can simply finish whenever, without anybody needing to know about it immediately.
If you do need to provide user feedback when the job is finished, you may simply want to try to execute the request via AJAX. For really sophisticated setups and realtime feedback, you may use the Gearman approach with feedback delivered via a pub/sub websocket. That's quite an involved setup though.
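A minimal sketch of the bounded worker-pool idea, using Python's standard library as a stand-in for Gearman plus supervisord (the job and handler names are hypothetical):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# A bounded pool: stand-in for a fixed number of daemonized workers.
# Four workers means at most four taxing jobs hit the database at once.
pool = ThreadPoolExecutor(max_workers=4)

def taxing_db_job(query_id):
    """Placeholder for the slow database work a background worker performs."""
    time.sleep(0.05)  # pretend this is the expensive query
    return f"done:{query_id}"

def handle_request(query_id):
    """Web-handler side: enqueue the job and return to the user immediately."""
    future = pool.submit(taxing_db_job, query_id)
    return future  # a real setup would hand back a job id, not the future
```

With Gearman the enqueue call would cross process (and possibly machine) boundaries, but the shape is the same: submit, return at once, let the workers drain the queue at their own pace.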
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers.
Closed 9 years ago.
I manage a VPS for my client. He wanted a backup solution that includes some folders and MySQL databases. The OS is Ubuntu; the web server is Apache. I don't want my client to mess with SSH or FTP.
I think I can save database backup files and trigger PHP's exec function from a web page to zip the folders and database backup files, then give him a link to download the zip file.
This is technically possible, but I wonder if there is a better solution, other than automatically copying backup files to another server, because being able to create a backup at any time is a requirement in my situation.
There are lots of possibilities; here is a very simple one we use every day:
create a backup script (e.g. in bash) with the usual suspects such as mysqldump, tar and date
make sure this backup script locks against double runs
create a cron job that runs every minute, checks if a flagfile exists, and if yes starts the backup script and then clears the flagfile
if you want, create more cron jobs (e.g. a daily one) that do nothing but set the flagfile
create a trivial PHP script that just touches the flagfile to trigger an ad-hoc backup
You can download the finished backup package once the flagfile is cleared (again, check via a trivial PHP script)
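The flagfile handshake above can be sketched like this (a hypothetical Python version; in the real setup the trigger is a PHP script, the checker is a cron job, and the backup is a bash script around mysqldump and tar):

```python
from pathlib import Path

FLAG = Path("/tmp/backup.flag")  # touched by the PHP trigger or a daily cron
LOCK = Path("/tmp/backup.lock")  # guards against double runs

def request_backup(flag=FLAG):
    """What the trivial PHP trigger does: just touch the flagfile."""
    flag.touch()

def run_if_requested(flag=FLAG, lock=LOCK, backup=lambda: None):
    """What the every-minute cron job does: if the flag is set and no
    backup is already running, take the lock, back up, clear the flag."""
    if not flag.exists() or lock.exists():
        return False
    lock.touch()
    try:
        backup()          # e.g. invoke mysqldump and tar here
        flag.unlink()     # cleared flag signals "backup package is ready"
        return True
    finally:
        lock.unlink()
```

The lock-then-clear ordering matters: the flag only disappears after a successful run, so the download page can safely treat "flag gone" as "package ready".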