I'm a student, new to the NetBeans PHP remote server feature. I was trying to upload PHP files to a remote server through NetBeans (on Run). I'm sure everything in the run configuration and the FTP information is correct, and I have the rights to upload files to that folder.
And this error occurs:
Message prompt:
"Cannot logout from server *name. Reason 250 OK. Current directory: *dir name"
Log in the output window:
220---------- Welcome to Pure-FTPd [privsep] ----------
220-You are user number 17 of 500 allowed.
220-Local time is now 04:13. Server port: 21.
220-This is a private system - No anonymous login
220 You will be disconnected after 3 minutes of inactivity.
USER a4514022
331 User a4514022 OK. Password required
PASS ******
230-OK. Current restricted directory is /
230-362 files used (3%) - authorized: 10000 files
230 2476 Kbytes used (0%) - authorized: 1536000 Kb
TYPE I
200 TYPE is now 8-bit binary
CWD /public_html
250 OK. Current directory is /public_html
CWD /public_html
250 OK. Current directory is /public_html
Summary
Failed:
file test.php Cannot upload file test.php (unknown reason).
Runtime: 408 ms, transfered: 0 file(s), 0 KB
I don't understand what is happening. How do I fix it?
I use NetBeans 6.9.1 on Windows 7 with Java 7 (build 1.7.0_05, platform 1.7). I don't know which of those numbers is the relevant version, so I've listed them all. It seems I'm one of the rare ones getting this problem...
I know you asked this a while ago and your problem has probably been solved, but I'm writing this for anyone else who runs into it:
First, check the proxy settings:
Tools > Options > Proxy Settings
and set them to "No Proxy".
That will probably solve it.
I had to select a less secure option (e.g. Encryption: Pure FTP) to get it working with my GoDaddy setup.
This answer comes even later. I also had the error and ended up here while searching.
My mistake was that the file I wanted to download was open for editing in another program (outside NetBeans). In other words, the downloaded file could not be saved to the local filesystem, because the open file prevented it.
Related
I have a small PHP script that uses phpseclib to download files from a remote server.
The script is like below:
include('Net/SSH2.php');
include('Net/SCP.php');

$ssh = new Net_SSH2('serverip'); // connection setup was omitted in the question
if (!$ssh->login('myuser', 'mypassword')) {
    exit('Login failed');
}
echo var_dump($ssh->exec('whoami')); // debug to test the ssh connection. returns "myuser"

$scp = new Net_SCP($ssh);
try {
    $remotePath = '/home/user/test.txt';
    $localPath = '/tmp/myfile';
    if (!$scp->get($remotePath, $localPath)) {
        throw new Exception("Problems to get file");
    }
} catch (Exception $e) {
    echo "\n\n" . var_dump($e->getMessage()) . "\n\n";
    die;
}
There are some other questions here on SO that use very similar snippets.
It works like a charm for many files, but it fails for some binary files ($remotePath = '/home/user/test.p12';, for instance).
Are there any known limitations when downloading binary files using phpseclib (I didn't find anything in their issues on GitHub)? If not, what am I doing wrong? Am I forgetting some option?
As a side note, scp myuser@serverip:/home/user/test.p12 /tmp/teste.p12 works fine on the command line.
Following the comments, I should clarify that my script just fails: the statement $scp->get($remotePath, $localPath) returns false for all binary files that I tried. That's all I have for now.
As far as I know, phpseclib does not produce any detailed log of these failures.
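For what it's worth, phpseclib 1.x can produce a packet-level log if you define its logging constant after including the class. A sketch along those lines, assuming the same Net_SSH2/Net_SCP classes as in the question (host and credentials are placeholders):

```php
// Sketch: enable phpseclib 1.x packet logging to see what went over the
// wire when the transfer fails. Constant names per the phpseclib 1.x sources.
include('Net/SSH2.php');
include('Net/SCP.php');

define('NET_SSH2_LOGGING', NET_SSH2_LOG_COMPLEX); // record the full packet log

$ssh = new Net_SSH2('serverip');            // placeholder host
if (!$ssh->login('myuser', 'mypassword')) { // placeholder credentials
    exit('Login failed');
}

$scp = new Net_SCP($ssh);
if (!$scp->get('/home/user/test.p12', '/tmp/test.p12')) {
    echo $ssh->getLog(); // dump the recorded SSH traffic for inspection
}
```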
My application log (nginx) does not show anything special. In the access log on my remote server (CentOS; for these tests I have control over it, but it's not the real scenario) I got something like the below:
Jul 27 15:22:58 localhost sshd[14101]: Accepted password for myuser from myip port 51740 ssh2
Jul 27 15:22:58 localhost sshd[14101]: pam_unix(sshd:session): session opened for user myuser by (uid=0)
Jul 27 15:22:58 localhost sshd[14103]: Received disconnect from myip port 51740:11:
Jul 27 15:22:58 localhost sshd[14103]: Disconnected from user myuser myip port 51740
PHP version: 7.3 (the script is also used in servers with older versions)
Local server: Debian 10
Remote server: CentOS 8
The problematic file that triggers the problem is a p12 certificate file.
I found the problem, and it was much simpler than I thought. It was just a permission problem (for no good reason, I had put all my test files in a directory without read permission).
I decided to leave this answer here because I think this is not clear in the phpseclib documentation: 'Net/SCP.php' only works with files that have read permission, so before downloading, make sure the file is readable, or run something like chmod o+r filename.
The snippet in the question works fine with binary files.
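A pre-flight check along those lines can be scripted too. This is a sketch, assuming the same logged-in $ssh (Net_SSH2) and $scp (Net_SCP) objects as in the question:

```php
// Sketch: verify the remote file is readable by the SSH user before
// attempting the SCP get, and fix the permission if it is not.
$remotePath = '/home/user/test.p12';

// 'test -r' exits 0 only if the file is readable by the logged-in user.
$ssh->exec('test -r ' . escapeshellarg($remotePath));
if ($ssh->getExitStatus() !== 0) {
    // make it world-readable (or fix ownership) and try again
    $ssh->exec('chmod o+r ' . escapeshellarg($remotePath));
}

if (!$scp->get($remotePath, '/tmp/test.p12')) {
    throw new Exception('Problems to get file');
}
```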
I am writing a small script that manipulates a file name, renames the file, and moves it to a directory within its current directory.
The problem (I think) is that I am running a local PHP Script from my computer and trying to have it do the manipulation on a network drive (R:\ in this case)
Since I'm using Windows 7, I decided to use the rename() function. Here is the snippet of code:
if (!empty($_REQUEST['uploaded_file'])) {
    rename("R:\\" . $fileName, "R:\\Movies\\" . $newFileName);
}
Note: $fileName and $newFileName are set earlier in the script.
The resulting error is:
Warning: rename(R:\movie.title.2011.avi,R:\Movies\Movie Title [2011].avi):
The system cannot find the path specified. (code: 3)
The code used to work for a while, but for whatever reason, all of a sudden, it doesn't want to work anymore.
I'm not sure if the rename() approach is the best way to get this done. My Synology DS1010+ didn't like me using SFTP to connect and rename too many times; it eventually started refusing connections, so I needed to do this differently.
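One thing worth checking when the web server, rather than your console session, runs the script: drive letters mapped in your user session (like R:) are often invisible to the account the server runs under, so a UNC path may work where the mapped drive does not. A diagnostic sketch (the \\nas\share names are placeholders for the DS1010+'s actual share):

```php
// Sketch: check whether PHP can see the mapped drive vs. the raw UNC path.
var_dump(is_dir('R:\\'));           // false if the R: mapping is invisible to PHP
var_dump(is_dir('\\\\nas\\share')); // placeholder UNC path to the same share

$src = '\\\\nas\\share\\' . $fileName;
$dst = '\\\\nas\\share\\Movies\\' . $newFileName;

if (file_exists($src) && is_dir(dirname($dst))) {
    rename($src, $dst);
} else {
    echo "Source or destination path is not visible to PHP\n";
}
```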
Environment Information:
Operating System:
Windows 7 Ultimate x64
PHP Version: 5.4.4
FileZilla Connection Attempt:
Status: Connecting to 192.168.5.25:21...
Status: Connection established, waiting for welcome message...
Response: 220 Diniz-Stora FTP server ready.
Command: USER admin
Response: 331 Password required for admin.
Command: PASS **********
Response: 230 User admin logged in.
Status: Connected
Status: Retrieving directory listing...
Command: PWD
Response: 257 "/" is current directory.
Command: TYPE I
Response: 200 Type set to I.
Command: PASV
Response: 227 Entering Passive Mode (24,63,165,211,217,0)
Command: LIST
We have a script which downloads a CSV file. When we run this script on the command line on the EC2 console it runs fine; it downloads the file and sends a success message to the user.
But if we run it through a browser, then we get:
error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.
When we checked the back end for the downloaded file, it is there, but the success message sent after the download is not received by the browser.
We are using cURL to download from a remote location with authentication. The user, group and ownership of the folder is "ec2-user", and the folder has full rights, i.e. 777.
To summarize: the file is downloaded, but at the browser end we do not receive any data or the success message which we print.
P.S.: The problem occurs when the downloaded file size is 8-9 MB; with a smaller file, say 1 MB, it works. So either the script execution time, the download file size, or some EC2 instance config is blocking the response to the browser. The same script works perfectly fine on our GoDaddy Linux VPS. We have already increased the max execution time for the script.
Sadly, this is a known problem without a good solution. There's a very long thread on the Amazon forum here: https://forums.aws.amazon.com/thread.jspa?threadID=33427. The solution offered there is to send a keep-alive message to keep the connection from dying after 60 seconds. Not a great solution, but I don't think there's a better one unless Amazon fixes the problem, which doesn't seem likely given that the thread has been open for 3 years.
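The keep-alive workaround can be sketched with cURL's progress callback: emit a harmless byte of output to the browser every so often while the long download runs, so the connection never sits idle for 60 seconds. The URL, credentials, local path, and 30-second interval below are all placeholder assumptions:

```php
// Sketch of the keep-alive workaround: pad the response periodically
// while a long cURL download runs so the proxy doesn't drop the connection.
set_time_limit(0);
header('Content-Type: text/plain');

$fp   = fopen('/tmp/report.csv', 'w');          // placeholder local destination
$last = time();

$ch = curl_init('https://example.com/export.csv'); // placeholder source URL
curl_setopt_array($ch, array(
    CURLOPT_FILE             => $fp,
    CURLOPT_USERPWD          => 'user:pass',       // placeholder auth
    CURLOPT_NOPROGRESS       => false,             // required for the callback
    CURLOPT_PROGRESSFUNCTION => function () use (&$last) {
        if (time() - $last >= 30) { // well before the 60s idle timeout
            echo ' ';               // harmless padding byte
            flush();
            $last = time();
        }
        return 0;                   // non-zero would abort the transfer
    },
));
curl_exec($ch);
curl_close($ch);
fclose($fp);

echo "Download complete\n";
```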
A very strange thing is happening. I am running a script on a new server (it works on my current server and laptop).
The strange thing is that I can only get it to (sort of) work when I increase the memory limit to 1024M (!). It is extracting a large zip file and going through the files, so I thought that was normal. But instead of the script terminating or ending with errors, I get an error from my browser:
The server at www.localhost.com is
taking too long to respond.
Localhost.com? The web server is just localhost:9090 and I can see Apache is still running. Maybe Apache crashes momentarily and the browser can't find the server? But there is nothing about Apache crashing in the log files.
This isn't a server issue; it's more to do with my PHP script and memory usage, I think, so no need to move it to Server Fault.
What could be the problem? How can I narrow down the cause? I am at a loss here!
The server is a Windows server running Apache 2.2 with PHP version 5.3.2. My laptop and the other working server are running PHP versions 5.3.0 and 5.3.1.
Thanks all for any help
Ensure that:
ini_set('display_errors', 'On');
error_reporting(E_ALL);
ini_set('max_execution_time', 180);
ini_set('memory_limit', '1024M'); // note: 'M', not 'MB' -- PHP does not recognize the 'MB' suffix
I'd pop this at the top of the script and see what comes out. It should show you the errors and the like.
The other thing: have you checked fopen and the path of the file it's loading?
As Abs said, check that the files being zipped up can be read by PHP (permissions especially, on a Windows OS with multiple users).
I kept getting this problem too, and none of these sites really helped until I started looking at the same issue for people using Internet Explorer. The way I fixed it was to open the system hosts file, located at C:\Windows\System32\drivers\etc\hosts, and uncomment the line that mentions ::1, which is needed for IPv6. After that it worked fine.
Somehow your system is munged up and isn't treating localhost as the local 127.0.0.1 address. Is your hosts file properly configured? This is most likely why you're getting the "taking too long to respond" error:
marc@panic:~$ host www.localhost.com
www.localhost.com has address 64.99.64.32
marc@panic:~$ wget www.localhost.com
--2010-08-03 22:41:05-- http://www.localhost.com/
Resolving www.localhost.com... 64.99.64.32
Connecting to www.localhost.com|64.99.64.32|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
www.localhost.com is a fully valid hostname as far as the DNS system is concerned.
I am not a PHP guru by any means, but are you writing the extracted files to a temporary local storage location that is within the scope of the application? If you are not, then I think what is happening is that the application is holding the zip file and the extracted files in memory and then attempting to read them. So if it is a large zip and/or the extracted files are large, that would introduce a huge amount of overhead on top of the overhead introduced by your read and processing actions.
So if you are not already, I would extract the files and write them to disk in their own folder, dispose of the zip file at that point, and then iterate over the files in the newly created directory and perform whatever actions you need on them.
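That approach can be sketched with PHP's built-in ZipArchive (the paths below are placeholders): extract everything to disk first, free the archive, then process the files one at a time.

```php
// Sketch: extract to disk, dispose of the archive, then iterate over files.
$zipPath = '/tmp/upload.zip';   // placeholder archive path
$workDir = '/tmp/extracted';    // placeholder working directory

$zip = new ZipArchive();
if ($zip->open($zipPath) !== true) {
    exit("Could not open $zipPath");
}
if (!is_dir($workDir)) {
    mkdir($workDir, 0755, true);
}
$zip->extractTo($workDir);      // write everything to disk, not memory
$zip->close();
unlink($zipPath);               // dispose of the archive before processing

// Now process the extracted files one at a time from disk.
foreach (new DirectoryIterator($workDir) as $entry) {
    if ($entry->isFile()) {
        // ... process $entry->getPathname() here ...
    }
}
```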
<?php
print "hello";
?>
I wrote this code, saved it as "1.php", and then uploaded the PHP script to my server.
I have accounts on 8 different free hosting servers, and I noticed that there are 2 types of server settings.
(1) "Type A" - for example, sqweebs.
We need to set the PHP file permission to 640. This means the sqweebs server requires us to give group permission for the PHP script to run. If I set 604, then the server generates errors like:
Warning: Unknown: failed to open stream: Permission denied in Unknown on line 0
Fatal error: Unknown: Failed opening required '/www/sqweebs.com/1.php' (include_path='.:/blahblah') in Unknown on line 0
(2) "Type B" - for example, izfree.
On this server, I found that I can make a PHP script work if I give it 604 as the permission.
So I want to know why there are so many different server settings, what the reason is, and to hear related opinions, like which kind of server I should use, etc.
The problem is probably with using free hosts. They add server limitations and, most likely, enable or disable whatever restrictions they see fit for governing how you can behave on their service.
It probably depends on the permissions and on which group/user Apache is running under. If it's running with your rights, you will be fine with 600; if Apache is not even in the same group as you, you probably need something like 604. All I'm saying is, it depends on the server configuration AND the file permissions. This is a wild guess, but if you really need something like 604, it could be a sign that something is wrong and other users may be able to look into your home directory...
It is expected by the web server that the file has the right permissions for the user the web server runs as, so it can open and run it.
So, if the web server (say Apache) runs as www, then www should have read access to the file. (Some hosts run Apache as www, some as apache, and some as nobody.)
When you upload the file, the permission it ends up with depends on how the umask is set. (So on one host the file could end up with permission 655, on another 600, when the permission is not set explicitly.)
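A quick way to see which user and group the server is actually running your PHP as, so you know which permission bits matter, is a sketch like this (requires the POSIX extension, which is common on *nix builds of PHP):

```php
// Sketch: report the effective user/group PHP runs under, and this
// script's own permission bits, to diagnose 640-vs-604 requirements.
$user  = posix_getpwuid(posix_geteuid());
$group = posix_getgrgid(posix_getegid());

echo 'PHP runs as user: '  . $user['name']  . "\n";
echo 'PHP runs in group: ' . $group['name'] . "\n";

// Compare against this script's own mode (e.g. 640 or 604).
printf("This script's mode: %o\n", fileperms(__FILE__) & 0777);
var_dump(is_readable(__FILE__));
```

If the reported user or group matches your account's group, group-read (640) is enough; if not, the "other" read bit (604) is what the server needs.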
It always helps if you know a bit about the OS you normally deploy your applications on. Mostly, PHP is deployed on *nix systems, and the permission scheme is nearly (almost always) the same across all *nix systems.
Try getting hold of the "Unix System Administration Handbook" (by Evi Nemeth & co). It's quite fun to read and easy to understand. (Mine is an old edition, but Unix permissions have not changed.)