How to ignore woocommerce_product_image_upload_error over REST API - php

I need to re-create a shop, and I need to do this in parallel to the current online shop. I created a new server but didn't point the DNS record for the domain to the new server yet. Instead, I created an entry in my local hosts file and in the server's hosts file pointing to the new server's IP address.
I also configured the SSL connection with a self-signed certificate.
Now I want to upload the products over the WooCommerce API, but I get this error:
{
    "code": "woocommerce_product_image_upload_error",
    "message": "Error retrieving the external image https://example.com/wp-content/uploads/2023/02/product_76548_1.jpg. Error: cURL error 35: error:0200100D:system library:fopen:Permission denied",
    "data": {
        "status": 400
    }
}
I already changed the PHP configuration and added:
curl.cainfo="/tmp/CA.pem"
openssl.cafile="/tmp/CA.pem"
All images are already uploaded to the server, and I can open the link and get the image. I can also open the link with curl on the server's CLI and get a 200 response.
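For diagnosis, a minimal sketch like the following (the image URL is the one from the error message, the CA path the one from the php.ini settings above) can show whether PHP itself, as opposed to curl on the CLI, is able to read the CA file and fetch the image:
<?php
// Sketch: reproduce the image fetch from PHP with the same CA bundle.
// URL and CA path are just the values quoted above.
$url    = 'https://example.com/wp-content/uploads/2023/02/product_76548_1.jpg';
$cainfo = '/tmp/CA.pem';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CAINFO, $cainfo);

$body = curl_exec($ch);
if ($body === false) {
    // cURL error 35 with "fopen: Permission denied" usually means the
    // web-server/PHP user cannot read the CA file at this path.
    echo 'cURL error: ' . curl_error($ch) . PHP_EOL;
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . ', ' . strlen($body) . ' bytes' . PHP_EOL;
}
curl_close($ch);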
I need to either ignore this error or otherwise get the products created together with their images.
Is that possible, or do I need to take the live shop down and do the migration the normal way?

Related

json api plugin error in wordpress

I'm using the JSON API plugin to fetch data as JSON records on my WordPress site.
When I run the site on localhost it works without any problem, but after uploading the site to the host, requesting the JSON data gives me this error:
Something went wrong fetching URL with JSON-data: Failed to connect to localhost port 88: Connection refusederror: get of json-data failed - plugin aborted: check url of json-feed
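The message suggests the plugin's configured JSON feed URL still points at localhost port 88, which only exists on the development machine. A hedged sketch to test this from the live host (the feed URL below is hypothetical, just the address taken from the error message):
<?php
// Sketch: check from the live host whether the configured JSON feed is reachable.
// The URL is hypothetical; replace it with the feed URL configured in the plugin.
$feed = 'http://localhost:88/?json=get_recent_posts';

$ch = curl_init($feed);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
$data = curl_exec($ch);

echo $data === false
    ? 'Connect failed: ' . curl_error($ch) . PHP_EOL
    : 'Got ' . strlen($data) . ' bytes of JSON' . PHP_EOL;
curl_close($ch);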

PHP error doing file_get_contents on a Hostinger host: failed to open stream

I'm trying to use the http://ip-api.com/json/ API to get and show information about a visitor that connects to my website on Hostinger.
When I do the following, it should respond with all the info in JSON format:
echo file_get_contents("http://ip-api.com/json/{$user_ip}");
But I get the following error message:
Warning: file_get_contents(http://ip-api.com/json/[MY PUBLIC IP]): failed to open stream: Connection refused in /home/[HOSTINGUER USER]/public_html/ip2.php on line 31
And another strange thing: if I use a different API it works correctly and returns the correct JSON! The other API call:
echo file_get_contents("http://ipinfo.io/{$user_ip}/json");
So I want to use the ip-api.com API because the results are more accurate, but it only works if I use the ipinfo.io API... Why can I make the request to one website and not to the other?
Also, I tried both locally, with curl and by typing the URLs in my web browser, and they work correctly. I also tried them locally on a LAMP stack and both work perfectly. Finally I tried something like this post: PHP file_get_contents() returns "failed to open stream: HTTP request failed!" on Hostinger, and it doesn't work either.
I thought maybe it's something in the Hostinger configuration, but...
Thanks in advance!
Your IP address is banned. If you're using normal shared web hosting, your outgoing IP address is shared with other users, who probably made more requests than allowed.
Go to http://ip-api.com/docs/unban and enter your server's outgoing IP address (check it via http://ipinfo.io/json).
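If you prefer to look up the outgoing IP from PHP rather than from a shell, a minimal sketch using the ipinfo.io endpoint mentioned above would be:
<?php
// Sketch: print the server's outgoing IP address, then submit it at
// http://ip-api.com/docs/unban as described above.
$info = json_decode(file_get_contents('http://ipinfo.io/json'), true);
echo 'Outgoing IP: ' . $info['ip'] . PHP_EOL;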

Why can I obtain file size and the time stamp with FTP but not be able to retrieve the actual file?

I've been working on this for about 20 hours now and I need some major help. I'm able to get the file size and the timestamp on the file but I am unable to actually obtain the data.
The server I'm trying to get the data from requires FTP over explicit TLS
I'm receiving the same error(s) with both FTP_BINARY and FTP_ASCII in the ftp_fget()
The server the file is coming from is UNIX
If I refresh the page every few hours the errors I get from PHP are different with no change in code
Error 1: 'ftp_fget(): Transfer mode set to BINARY' if ftp_get uses binary,
or 'ftp_fget(): Transfer mode set to ASCII' if ftp_get uses ASCII
Error 2: 'ftp_fget(): Entering Passive Mode (12.345.678.90.12.34)'
From what I've read about these errors, PASV mode being FALSE is what triggers Error 1, so I think the switching between the two errors reflects passive mode sometimes working and sometimes not. Not positive though.
<?php
$server     = "12.345.678.90";
$local_file = 'inv3.txt';
$file       = 'inventory-alp.txt';

$con = ftp_ssl_connect($server, 21) or die("Could not connect to $server");
ftp_login($con, "xxxxxx", "xxxxxx") or die("Could not login");
ftp_pasv($con, true);

$fsize = ftp_size($con, $file); // works
if ($fsize != -1) {
    echo "<br>$file is $fsize bytes.<br><br>";
} else {
    echo "<br>Error getting file size.<br><br>";
}

$lastchanged = ftp_mdtm($con, $file); // works
if ($lastchanged != -1) {
    echo date("F d Y H:i:s.", $lastchanged) . "<br><br>";
} else {
    echo "Could not get last modified<br><br>";
}

if (ftp_get($con, $local_file, $file, FTP_ASCII)) { // fails
    echo "successfully written to $local_file";
} else {
    echo "There was a problem while downloading $file to $local_file";
}

$var = error_get_last();
echo '<pre>';
var_dump($var);
echo '</pre>';

ftp_close($con);
?>
EDIT 1: Solution: I ended up not being able to access what I needed in order to change the firewall settings and such, so I couldn't fix it in PHP. While this is not the true answer, I did make it work and it is relatively easy. I ran across WinSCP; being able to connect to the server in a FileZilla-style layout and then save the session URL was nice. All I did was open the saved session in the .exe and I was able to set up my connection in half an hour.
What Martin indicated is very true: the SIZE and MDTM commands run over the main FTP control connection only. Transferring data files, and usually the directory listing as well (unless MLST/MLSD is used), requires a separate connection, the data connection, which is negotiated by the client and server over the control connection using a series of commands, most notably PORT and PASV.
Without going into a ton of detail (there's a link to our white paper later): when the client and server negotiate the terms of the data connection, one endpoint tells the other the specific IP address and port number to use, and that endpoint then listens and waits for a connection from the other. This works great unless there is a firewall in front of the endpoint waiting for the inbound connection. If the session is running in active mode, the server actively connects back to the client on the IP/port the server received from the client in the PORT command. In passive mode, the server passively waits for the client to connect on the IP/port the server sent to the client in its response to the client's PASV command.
Again, firewalls tend to block FTP data connections, unless the firewall does active FTP NAT'ing or unless Port Forwarding has been set up on the Firewall and a set of passive-ports has been opened and routed to the endpoint specifically.
So check the firewall settings on the client if you want to use Active/Port mode; check the firewall settings on the server if you want to use Passive/PASV mode.
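As a rough PHP illustration of the two modes (only a sketch; the host, credentials and file names below are placeholders, not the asker's exact values), one can attempt the download in passive mode and fall back to active mode if the data connection cannot be established:
<?php
// Sketch: try the download in passive mode first, then fall back to active mode.
// ftp.example.com, the credentials and the file names are placeholders.
$con = ftp_ssl_connect('ftp.example.com', 21) or die('Could not connect');
ftp_login($con, 'user', 'pass') or die('Could not log in');

ftp_pasv($con, true);                              // passive: the client opens the data connection
if (!@ftp_get($con, 'local.txt', 'remote.txt', FTP_BINARY)) {
    ftp_pasv($con, false);                         // active: the server connects back to the client
    if (!ftp_get($con, 'local.txt', 'remote.txt', FTP_BINARY)) {
        echo 'Download failed in both modes - check the firewalls on both ends';
    }
}
ftp_close($con);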
Here's a link to our white paper which outlines the basics of FTP/PASV/PORT, hopefully it'll help you with your issue.
http://www.webdrive.com/wp-content/uploads/FTP_Explained1.pdf
Best of Luck!
Michael
With the FTP protocol, it's perfectly possible that you are able to obtain the file size and the modification timestamp (using SIZE and MDTM commands respectively), but not the file itself.
The SIZE and MDTM commands use the FTP control connection only.
While a file transfer (or a directory listing) requires a separate data connection. And it's likely that there's something that prevents the data connection from being opened.
See (my) article on the FTP connection modes for more details and typical issues with data connections.
Typically a culprit would be a firewall on your webserver. If you have an SSH/terminal access to the webserver, are you able to connect from it to the FTP server?
Another possibility is a misconfigured FTP server. Is the IP address in "Error 2" routable from your web server? (=Is it the real IP address you connect to?)
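A hedged way to check that from the web server (just a sketch; the numbers are the masked ones from "Error 2") is to rebuild the IP address and port from the passive-mode reply and try to open a plain TCP connection to them:
<?php
// Sketch: take the six numbers from "Entering Passive Mode (h1,h2,h3,h4,p1,p2)",
// rebuild the IP address and data port, and test reachability from the web server.
// The numbers below are the masked ones from "Error 2".
$parts = explode('.', '12.345.678.90.12.34');
$ip    = implode('.', array_slice($parts, 0, 4)); // first four numbers = IP address
$port  = (int)$parts[4] * 256 + (int)$parts[5];   // last two numbers = data port

$sock = @fsockopen($ip, $port, $errno, $errstr, 5);
if ($sock) {
    echo "Data port $ip:$port is reachable";
    fclose($sock);
} else {
    echo "Cannot reach $ip:$port - $errstr ($errno)";
}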
It is unlikely this is related to an ASCII/BINARY mode. The messages you are getting (Transfer mode set to ...) are status messages, not error messages. They are not related to your problem. It's indeed strange that you got no other message/error.
You can try to use the active mode, instead of the passive.
ftp_pasv($con, false);
But usually the active mode is more problematic.

Programatically Update FTP account quota in cPanel?

I am trying to update the quota of a cPanel FTP account through php.
https://username:password#mydomain.com:2083/frontend/x3/ftp/doeditquota.html?acct=mailid#mydomain.com&quota=50
I am making an HTTP request with the above URL, and I get this response message:
The FTP account maildid#mydomain.com was successfully modified with a new quota of 50 Megabytes. 0Failed to determine FTP maildid#mydomain.com directory.
But the quota is not updated on the server. What is actually the issue?
And why this message:
0Failed to determine FTP maildid#mydomain.com directory.
Finally I got the solution.
https://username:password#mydomain.com:2083/frontend/x3/ftp/doeditquota.html?acct=mailid&quota=50
Remove the domain from the 'acct' parameter.
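For reference, a sketch of making that call from PHP (host, credentials and account name are placeholders; the x3 endpoint path is the one from the question, and 'acct' carries only the account name without the domain):
<?php
// Sketch: call the cPanel x3 doeditquota endpoint over HTTPS with basic auth.
// Host, credentials and account name are placeholders.
$url = 'https://mydomain.com:2083/frontend/x3/ftp/doeditquota.html'
     . '?' . http_build_query(['acct' => 'mailid', 'quota' => 50]);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, 'username:password'); // cPanel login
$response = curl_exec($ch);
echo $response === false ? curl_error($ch) : $response;
curl_close($ch);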

file_get_contents - failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found

I'm having some weird problems with file_get_contents after moving my site to a new domain. I had to set up a new domain and IP address (using Plesk) to get a new SSL certificate working. Now my file_get_contents call to a script on the same domain is giving me this:
failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
If I call the same URL using file_get_contents on another server it works fine, and if I call www.google.com from the server that's failing, that works too, so it only seems to happen if I call a URL on the same server!
I have a feeling it might have something to do with having two IPs with two different SSL certificates on the one server. When I file_get_contents the / (index page) of the server from the server itself, I get the Plesk 'this is a new domain' page, so it's like Apache isn't looking up the right virtual host when it's called from its own server.
To clarify (hopefully!):
On the server hosting the domain:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
gives "failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found"
file_get_contents('http://www.google.com');
works correctly
On another server:
file_get_contents('https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12');
works fine.
I have tried turning SSL off and I still get the same problem.
I've had this problem too, when I was working on a little test server at home. The domain name is resolved to your external IP address, and a request is sent. But because the request comes from inside your own network, the router doesn't recognise it as a normal request. It probably has a web interface for its configuration and tries to return a page from its own management system, which is then not found at the path you specified.
In that case, I was working on a Windows PC, and I could solve it by adding the domain I was testing to my hosts file, specifying 127.0.0.1 as the IP address (or the IP address of the server, if it is another machine within the same network). On Linux there should be a similar solution, I think.
The problem isn't PHP or your server, but your router.
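For example (a sketch using the question's placeholder domain), on Linux that would be a line like this in /etc/hosts, so that requests from the server to its own domain stay on the machine instead of going out through the router:
127.0.0.1    mydomain.com www.mydomain.com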
If you just need to handle the warning when the URL is not found (as I did), you can turn warnings into exceptions like this:
set_error_handler(
    function ($err_severity, $err_msg, $err_file, $err_line, array $err_context = []) {
        // do not throw an exception if the @-operator is used (suppress)
        if (error_reporting() === 0) {
            return false;
        }
        throw new ErrorException($err_msg, 0, $err_severity, $err_file, $err_line);
    },
    E_WARNING
);
try {
    $contents = file_get_contents($your_url);
} catch (Exception $e) {
    echo $e->getMessage();
}
restore_error_handler();
Solution based on this thread/question.
Most hosting providers now disable allow_url_fopen, the setting that lets file_get_contents() load data from an external URL.
You can use cURL or a PHP HTTP client library like Guzzle instead.
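For example, a minimal cURL sketch (the URL is just the one from the question) that fetches the page without relying on allow_url_fopen:
<?php
// Sketch: fetch the URL with cURL instead of file_get_contents(),
// which works even when allow_url_fopen is disabled by the host.
$url = 'https://mydomain.com?limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$contents = curl_exec($ch);
echo $contents === false ? 'cURL error: ' . curl_error($ch) : $contents;
curl_close($ch);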
Try to do this:
file_get_contents('https://mydomain.com?'.urlencode('limit=4&offset=0&s_date=2012-02-05&e_date=2012-03-13&order=release_date&dir=desc&cid=12'));
I got the same error in CodeIgniter 3. I was doing it like this:
file_get_contents(base_url('database.json'));
and then this
file_get_contents(site_url('database.json'));
My problem got resolved after I changed it to this:
file_get_contents(__DIR__.'/database.php');
The reason behind this was that I was trying to fetch an internal resource through the external URL that the base_url and site_url methods return, while __DIR__ returns the internal path.
