I'm currently developing a new website where a lot of files will be uploaded and downloaded.
When a file is uploaded to the server, ClamAV will start a virus scan on the tmp file before it is moved to the HTTP server. Everything works great, except that when I use clamscan it seems like ClamAV needs to load the whole virus database before every scan, and this stresses my CPU to 50% for maybe 10 - 20 seconds.
This seems to be a big problem, because if two users upload files at the same time my website will probably become very slow.
So I installed the ClamAV daemon, because it runs in the background and already has the virus database loaded, so a lot of time and CPU power can be saved. But on to the problem...
When I use clamdscan (the ClamAV daemon) it can't access any of the tmp files that are uploaded with the PHP script. It only works when I use clamscan. This is probably because clamdscan runs in the background and uses some very strict user permissions.
But how can I solve this? Can I change the tmp PHP upload file permissions from 0600 to 0644? Is that safe? Or should I change the permissions for the ClamAV daemon?
I don't really know how to do this, so if someone knows and wants to share, I'd be very thankful.
Try passing --fdpass as an option to clamdscan.
I had the exact same issue and it worked for me.
--fdpass
Pass the file descriptor permissions to clamd. This is useful if clamd is running as a different user as it is faster than streaming the file to clamd. Only available if connected to clamd via local(unix) socket.
Source: https://linux.die.net/man/1/clamdscan
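For reference, a minimal sketch of what that looks like from a PHP upload handler; the upload field name and the destination directory are just placeholder assumptions, not anything from the original setup:

<?php
// $_FILES['file']['tmp_name'] is the temporary file PHP created for the upload.
$tmpFile = $_FILES['file']['tmp_name'];

// --fdpass hands clamd an open file descriptor, so the 0600 permissions on the
// tmp file stop being a problem even though clamd runs as a different user.
// --no-summary just keeps the output short.
exec('clamdscan --fdpass --no-summary ' . escapeshellarg($tmpFile), $output, $exitCode);

// clamdscan exit codes: 0 = clean, 1 = virus found, 2 = an error occurred.
if ($exitCode === 0) {
    // Placeholder destination - sanitize the client-supplied name in real code.
    move_uploaded_file($tmpFile, '/var/www/uploads/' . basename($_FILES['file']['name']));
} else {
    unlink($tmpFile);
}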
I have a problem with my PHP website.
It is supposed to run a batch file. The batch file starts a program.
That program reads a file and creates a BMP and a TXT file.
This is my PHP code:
exec('cmd.exe /c "path\\to\\file.bat"');
The problem is that when I run the PHP script I can see the program in the Task Manager under "background processes", but no BMP or TXT file is created. Also, the program shuts itself down after creating the files.
I tried giving permissions to the users, but it seems I'm still making a mistake somewhere.
I had the same problem 3 months ago. The file is not created because the instance of cmd you're running does not have administrative permission (i.e. run as administrator). The only solution I could find was to give the application pool for that app administrative permissions on cmd, which ultimately resulted in me implementing a completely different approach, since giving a website admin rights to the server is bound to end badly.
Hope this helps.
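If someone hits the same silent failure, one thing that at least makes the error visible is capturing the batch file's output and exit code instead of calling exec() blind. A small diagnostic sketch, reusing the placeholder path from the question:

<?php
// 2>&1 folds stderr into stdout so "Access is denied" and similar messages
// actually show up in $output instead of disappearing.
exec('cmd.exe /c "path\\to\\file.bat" 2>&1', $output, $exitCode);

echo "Exit code: $exitCode\n";
echo implode("\n", $output), "\n";

A non-zero exit code or an access-denied message in the output points back at the permission problem described above.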
I know this has probably been covered in many other places as well, but I have tried many of the things written in other posts, and still no luck.
I am running an Ubuntu VPS with Apache, FTP and PHP.
My goal: every time I add an image to a folder, the image should be accessible to the public. I have tried different chmod commands but still no luck.
I want each image to have access rights rwxrwxrwx, but by default when I upload them to the folder through FileZilla they end up with access rights rwx------.
I hope I have given enough information. Please comment below if not, and I will provide more as fast as possible. (I am kind of new to the game, sorry about that.)
You can change the default permissions in your FTP server; I don't know which FTP server you're using. Or you can build a script that changes the permissions and execute that.
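If you go the script route, here is a minimal sketch; the folder path is just an assumption you would swap for your own, and 0644 is used instead of the full rwxrwxrwx because read access is all Apache needs to serve the images:

<?php
// Assumed upload folder - adjust to wherever FileZilla drops the images.
$dir = '/var/www/html/images';

foreach (glob($dir . '/*') as $file) {
    if (!is_file($file)) {
        continue;
    }
    // Owner read/write, everyone else read. World-writable 0777 is almost
    // never needed just to make images publicly viewable.
    if (!chmod($file, 0644)) {
        echo "Could not change permissions on $file\n";
    }
}

You could run that from cron, or fix it at the source instead: if the FTP server happens to be vsftpd, the local_umask setting controls what permissions new uploads get.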
I am attempting to fwrite() via an FTP wrapper from a UNIX-based server to a Windows-based server with PHP. I am connecting successfully, and am able to (for example) create directories. However, I can't seem to write files to the folders! When I look at the directory permissions using an FTP client, I notice that they are all 0000 and cannot be changed. Apparently, Windows-based servers do not use the same FTP permission system.
So... what's up? What do you recommend? I am able to upload files to the directory using an FTP client, but when I attempt to write files with PHP, nothing happens.
I am at a complete loss as to why this may be occurring. I have confirmed my script works by writing to UNIX-based servers, so that is not a problem. Is it possible that the fact the destination server does not have PHP installed matters? I would not think so, but I'm open to any ideas at this point!
Thank you!
EDIT - What's really getting me is that I AM able to create directories, so it doesn't make any sense that writing shouldn't be working. On Windows servers, is there some setting that would prevent just the writing of FILES, but not folders?
EDIT 2 - More research has told me that, though you cannot CHMOD on Windows servers, PHP's CHMOD still somehow does something with the permissions. However, this does not appear to be working for me. Is there a way to change the permissions directly with PHP code, or is this something that has to be done directly on the server, outside of my reach?
I recommend file_put_contents, and make sure PHP is allowed to write in that directory.
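A minimal sketch of that, with made-up credentials and paths; the overwrite flag matters because the ftp:// wrapper refuses to replace an existing file without it:

<?php
// Hypothetical FTP credentials and target path - replace with your own.
$target = 'ftp://user:password@ftp.example.com/htdocs/output/test.txt';

// Allow the wrapper to overwrite an existing remote file.
$context = stream_context_create(['ftp' => ['overwrite' => true]]);

$bytes = file_put_contents($target, "Hello from PHP\n", 0, $context);
if ($bytes === false) {
    // If this fails while an FTP client can upload fine, the FTP account PHP
    // uses most likely lacks write permission on the Windows side.
    echo "Write failed\n";
} else {
    echo "Wrote $bytes bytes\n";
}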
Our website currently backs up every night to a separate server that we have, which is fine, but when we go to download the files the next day (usually around 36,000+ images) it takes a long time and affects the speed of everyone else using our network, so ideally we would do this in the middle of the night - except there's no one here to do it.
The server that the backup is on is running cPanel, which appears to make it fairly simple to run a PHP file as a cron job.
I'm assuming the following; feel free to tell me if I'm wrong.
1) The server the backup is on runs cPanel. It appears that it shouldn't be too difficult to set up a PHP script to run as a cron job in the middle of the night.
2) We could deploy a PHP script utilizing the FTP functions to connect to another server and start the backup of these files using this cron job.
3) We are running XAMPP on a Windows platform. It has FileZilla Server as part of it, so I'm assuming it should be able to accept incoming FTP connections.
4) Overall - we could deploy a script on the backup server that would run every night and send the files back to my local computer running XAMPP.
So that's what I'm guessing. I'm getting stuck at the first hurdle though. I've tried to create a script that runs on our local computer and sends a specified folder to the backup server when it executes, but all I seem to be able to find are scripts relating to single files. Although I have some experience with PHP, I haven't touched the FTP functions before, and they are giving me some problems. I've tried the other examples here on Stack Overflow with no success :(
I'm just looking for the simplest form of a script that can upload a folder to a remote IP. Any help would be appreciated.
There is a fair amount of overhead involved in transferring a bunch of small files over FTP. I've seen jobs take 5x as long over a local network. It is by far easier to pack the files into something like a zip and send them as one large file.
You can use exec() to run zip from the command line (or whatever compression tool you prefer). After that, you can send it over FTP pretty quickly (you said you found methods for transferring 1 file). For backup purposes, having the files zipped would probably make things easier to handle, but if you need them unzipped you can set up a job on the other machine to unpack the file.
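A rough sketch of that idea, to run from the cron job on the backup server; the paths, IP and credentials are all placeholders:

<?php
// Assumed source folder and archive name - adjust to your environment.
$sourceDir = '/home/backup/images';
$zipFile   = '/tmp/images-' . date('Y-m-d') . '.zip';

// Pack everything into one archive: -r recurses into the folder, -q keeps it quiet.
exec('zip -rq ' . escapeshellarg($zipFile) . ' ' . escapeshellarg($sourceDir), $out, $rc);
if ($rc !== 0) {
    echo "zip failed with exit code $rc\n";
    exit(1);
}

// Send the single archive to the machine running XAMPP/FileZilla Server.
$conn = ftp_connect('192.0.2.10');           // placeholder IP
ftp_login($conn, 'backupuser', 'secret');    // placeholder credentials
ftp_pasv($conn, true);                       // passive mode tends to behave better through firewalls

if (!ftp_put($conn, basename($zipFile), $zipFile, FTP_BINARY)) {
    echo "Upload failed\n";
}
ftp_close($conn);

One large transfer also makes it easy to tell from the cron log whether the night's backup made it across or not.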
This problem has occurred to me multiple times now, and it's time for me to do it the right way!
How can I upload a website to the server so that PHP has access to the folders for writing data?
Usually I use an FTP program, but I can't upload as root, so there are restriction problems all over the place...
How do you do stuff like this?
Thanks!
EDIT
I'm sorry, I accidentally added rails to the tags instead of php.
I probably need to clarify my problem, since the answers didn't really help me out here:
I already have a server running Apache, DirectAdmin and some other stuff like Rails.
The problem is that when I upload a website like Joomla or WordPress via FTP, the permissions always need to be set to 777/775 or these sites can't write to the folders.
So what I need to know is:
How can I upload these sites (via FTP/SSH) as a user (root) that is the same as the one PHP runs as, so that PHP can create files in all the folders it needs to write to?
Hope I'm being more clear now, thanks for the help so far!
Use a server with SSH access and full write access to wherever your Rails app is hosted (and usually SSH access is as the user that Rails runs as).
For me this usually means a VPS-type server. I like Rackspace Cloud, which turns out to be around $11 - $15 per month for a low-traffic, low-spec server. I've also heard good things about Linode.
The solution
Upload your site with FTP
SSH to the server and go to the public_html folder
chown -R [user_name]:[group_name] [folder_name]
For me the right user was apache.
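If you're not sure which user that is on your box, a quick check on the server tells you what to put in the chown command. A sketch (the posix functions aren't enabled in every PHP build, hence the fallback):

<?php
// Prints the user the PHP/web server process runs as - that's the user
// to hand to chown -R.
if (function_exists('posix_geteuid')) {
    $info = posix_getpwuid(posix_geteuid());
    echo $info['name'], "\n";   // e.g. "apache" or "www-data"
} else {
    // Fallback: ask the shell (requires exec functions to be allowed).
    echo trim(shell_exec('whoami')), "\n";
}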