So I got a client who wants me to work on his WordPress website. The issue is, the code he shared with me was just PHP files and a database dump. However, WordPress only imports projects exported from WordPress as XML files. Is there another way I can set this up from the local PHP files, JS, and CSS scripts back into WordPress?
Here is the file structure:
You can zip up all the files, transfer the archive to the web server's document root (often a folder named public_html) using an SFTP client, and unzip it there; index.php should then be accessible. Be sure to import the .sql file (the MySQL database dump) into a MySQL database that you create on the server.
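After the files and database are in place, WordPress also needs to know the new database credentials. These live in wp-config.php in the document root; a sketch with placeholder values (use whatever database name, user, and password you created on the new server):

```php
// wp-config.php — placeholder values, substitute your own credentials
define('DB_NAME', 'client_db');       // the database you imported the .sql dump into
define('DB_USER', 'db_user');
define('DB_PASSWORD', 'db_password');
define('DB_HOST', 'localhost');       // often localhost on shared hosting
```

If the site is moving to a different domain, you will also need to update the siteurl and home rows in the wp_options table of the imported database, or the site will redirect to the old address.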
I'm creating a file-sharing service that runs through a mobile app. There's a folder on the server that hosts user uploads. I know that in these scenarios the uploads folder should usually be placed outside the public HTTP directory, but I'm hosting the code on an online hosting service which doesn't allow that.
So far, here are the security measures I've taken:
Files inside the folder are named with randomly generated IDs, while all the file information (name, type, etc.) is stored in the database.
The folder itself is protected with .htaccess (Deny from all) so nobody can access any data inside except scripts hosted on the server.
When a user wants to download a file, my idea is to have a script copy the required file to a temporary folder, while adding a record in the database so that a cron job deletes the temp file 2 hours after the request.
How efficient is my method? Can a PHP script handle copying a large number of files without putting too much pressure on the server? And what alternative ways are there to protect the folder's data?
Thanks for your time reading this
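The temp-copy scheme described above might look something like this in PHP. This is a sketch only: the $pdo connection, the files and temp_files tables, and the folder names are all assumptions, not part of the original post.

```php
<?php
// Sketch of the copy-to-temp download scheme. $pdo, table names,
// and folder paths are hypothetical placeholders.
$fileId = $_GET['id'] ?? '';

$stmt = $pdo->prepare('SELECT stored_name FROM files WHERE id = ?');
$stmt->execute([$fileId]);
$file = $stmt->fetch();
if (!$file) { http_response_code(404); exit; }

$src = __DIR__ . '/protected_uploads/' . $file['stored_name'];
$tmpName = bin2hex(random_bytes(16));          // unguessable temp name
$dst = __DIR__ . '/tmp_downloads/' . $tmpName;
copy($src, $dst);                               // copy() streams in chunks, not all in memory

// Record the expiry so the cron job can delete it 2 hours later
$pdo->prepare('INSERT INTO temp_files (name, expires_at) VALUES (?, ?)')
    ->execute([$tmpName, time() + 2 * 3600]);

header('Location: /tmp_downloads/' . $tmpName);
```

Note that copy() is reasonably cheap on CPU but doubles disk I/O and storage for every download; an alternative worth weighing is skipping the copy entirely and having the script send the protected file directly with readfile() after checking authorization, which streams the file without creating a duplicate.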
I have been working on a web app where I need to allow the user to select a number of files and download them as a zip file. I am working with lots of data, so storing the zip file in memory or on disk isn't an option.
I am currently using Apache and haven't been able to find any solution for dynamically creating and streaming zip files to a client. One thing I did find is nginx's mod_zip, which seems to do exactly what I want.
What would be an Apache equivalent to mod_zip, or another solution to dynamically create and stream zip files (without using disk space or loading the whole archive into memory)?
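One way to do this at the application layer rather than in the web server is the ZipStream-PHP library, which writes zip entries straight to the output stream as it reads each source file. A sketch, assuming the library is installed via Composer (the exact API differs between library versions, so treat this as illustrative):

```php
<?php
// Sketch using ZipStream-PHP to stream a zip without buffering it
// on disk or in memory. File paths are placeholders.
require 'vendor/autoload.php';

$zip = new ZipStream\ZipStream('download.zip');   // sends headers, streams output

foreach (['/data/report1.pdf', '/data/report2.pdf'] as $path) {
    // Each file is read in chunks and compressed on the fly
    $zip->addFileFromPath(basename($path), $path);
}

$zip->finish();   // writes the central directory and ends the stream
```

Because entries are emitted as they are read, peak memory stays roughly at the chunk size regardless of how large the selected files are.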
I am trying to build a solution where I sync a local directory with an FTP server directory. The files in the local directory should then be sent to Jira as a specific issue.
I am trying to understand how I can make a script which watches the local directory and, when a new file (.pdf) and its attachments are created (looking at the file ID), sends a create command to Jira with some data from this newly created file.
Currently I have made a PowerShell script which reads the FTP server and syncs the files to a local directory. Could the rest be done in PHP?
Regards,
Kristian
PHP and PowerShell are both very powerful languages with extension capabilities, so you can do pretty much anything you want with either of them. PHP is also available on Windows, so I see no objection to writing the whole thing in PHP.
However, seeing that you have already built something in PowerShell, I would recommend registering the freshly downloaded files in Jira from the same script. Otherwise you would have to build the logic for detecting the changes/additions twice.
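If you do go the PHP route for the Jira side, issue creation is a single POST to Jira's REST API. A sketch using cURL; the Jira URL, credentials, and project key are placeholders you would replace with your own:

```php
<?php
// Sketch: create a Jira issue via the REST API (v2).
// URL, credentials, and project key are placeholders.
$payload = [
    'fields' => [
        'project'   => ['key' => 'PROJ'],
        'summary'   => 'New PDF received: invoice-2024-001.pdf',
        'issuetype' => ['name' => 'Task'],
    ],
];

$ch = curl_init('https://jira.example.com/rest/api/2/issue');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_USERPWD        => 'jira-user:jira-password',
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_POSTFIELDS     => json_encode($payload),
]);
$response = curl_exec($ch);   // JSON containing the new issue key on success
curl_close($ch);
```

The same REST call can be made from PowerShell with Invoke-RestMethod, which is one reason keeping everything in the existing PowerShell script is attractive. Attaching the PDF itself is a second request, to /rest/api/2/issue/{issueKey}/attachments.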
I have a WordPress blog which is replicated, i.e. two servers behind a load balancer serve the same blog. I pointed both servers to the same database, so I have no problem there. However, when a user is forwarded (by the load balancer) to server-1 and uploads files, they are kept on server-1; the same goes for server-2. Those files are not shared between the two servers, so a user who is forwarded to server-2 will not see the files (e.g. images) which were uploaded to server-1.
I read that the upload folder can be changed but "This path can not be absolute. It is always relative to ABSPATH".
What are the best practices to share the upload folder between servers?
Options:
Set something up to replicate files between the servers, e.g. rsync in a cron job.
Mount a network share as the uploads folder on both servers.
You are already load balancing, so why not get rid of some of the HTTP load as well and move the uploads to something like S3?
Here is one plugin for it: http://wordpress.org/plugins/wp2cloud-wordpress-to-cloud/
Moving the rest of your static files, e.g. theme and plugin files, would also help with server load.
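For the rsync option, the cron entry might look like this (paths and hostname are placeholders; adjust to your setup):

```
# On server-1's crontab: push new uploads to server-2 every 5 minutes
*/5 * * * * rsync -az /var/www/html/wp-content/uploads/ server-2:/var/www/html/wp-content/uploads/
```

One caveat: since uploads can land on either server, you need a mirrored job running in the other direction as well, and you should avoid rsync's --delete flag here, because a one-way sync with --delete would erase files that were just uploaded to the other server. This fragility is a reason to prefer the shared-mount or S3 options for anything beyond a small site.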
I am designing an internal mail delivery application for the users of my site using PHP and MySQL, in which users can attach files such as images to their messages.
The problem occurs when a user attaches a PHP script or HTML page: giving the recipient a direct URL causes the attached script/page to run on the server, which is dangerous if it contains vulnerable code.
So what I want is:
Deny all PHP scripts, .html, and .exe files within the attachment folder from running on the server.
But all files must still be downloadable and viewable inside the website as text files (for .php, .html, .css, and .js files), with the user's authorization checked by another PHP script located in another folder.
All in all: deny execution of the scripts, but not access to the folder.
Can anyone help me with this? I've seen questions similar to this, like: Disable PHP in directory (including all sub-directories) with .htaccess
but this one is different, as it specifically asks about an IIS environment and a web.config solution (if possible). The others use .htaccess or httpd.conf, neither of which works on IIS on Windows.
Thanks in advance.
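The "viewable as text after an authorization check" part of the question can be sketched as a gatekeeper script living outside the attachment folder. The is_authorized() helper and the folder path are hypothetical stand-ins for whatever the real application uses:

```php
<?php
// download.php — lives outside the attachments folder.
// is_authorized() and the attachments path are placeholders.
session_start();
if (!is_authorized($_SESSION)) { http_response_code(403); exit; }

$name = basename($_GET['file'] ?? '');        // basename() blocks ../ path traversal
$path = __DIR__ . '/../attachments/' . $name;
if (!is_file($path)) { http_response_code(404); exit; }

header('Content-Type: text/plain');           // serve .php/.html/.js/.css as inert text
header('Content-Disposition: inline; filename="' . $name . '"');
readfile($path);                               // streams the file without loading it whole
```

Because the script, not the web server, decides the Content-Type, the browser renders the attachment as plain text and nothing in the folder is ever executed through this path.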
I think the quickest and easiest way would be to simply zip all uploaded files and then serve them all as zips, regardless of content.
You could also get some of this done with .htaccess files, but if the server software ever changes you may need to adjust them again to make everything safe, and until you do, you're exposed.
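Since the question specifically asks about IIS, the server-side part can also be expressed as a per-folder web.config. One commonly used approach is restricting the folder's handlers to read-only access, so no script mappings (PHP, ASP, etc.) ever execute there; a sketch:

```xml
<!-- attachments/web.config : allow reads, deny script/execute permission -->
<configuration>
  <system.webServer>
    <handlers accessPolicy="Read" />
  </system.webServer>
</configuration>
```

This keeps the files downloadable while preventing IIS from running them, which is the web.config analogue of the .htaccess tricks discussed in the linked question.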