E.g. I am selling a WordPress or Joomla plugin, and after the user installs and activates it, it still remains non-working, because he needs to click some "verify" button before the status of the plugin changes to working.
This button will trigger a function that connects to some "service" where I will previously have added his website URL, e.g. http://myclientsweb.com, plus maybe a verification code attached to the URL. When this matches the data on my server, the status of the plugin changes to activated.
I can basically do the programming on both sides (the client's site and the verification server), but the problem is that I need the server where the verification URLs and codes are stored to be available all the time, something like a CDN, so that even if one server is down, the client can always verify his plugin somewhere else.
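To make this concrete, the core check on the verification server could be sketched roughly like this (illustrative only; the record store, field names, and the choice of Python are my own assumptions, not how it has to be built):

```python
# Rough sketch of the verification server's core check.
# In a real service these records would live in a database,
# keyed by the site URLs added ahead of time.
REGISTERED_SITES = {
    "http://myclientsweb.com": "abc123-verification-code",  # hypothetical code
}

def verify_plugin(site_url: str, code: str) -> dict:
    """Return the activation status for a site URL / verification code pair."""
    expected = REGISTERED_SITES.get(site_url)
    if expected is not None and expected == code:
        return {"status": "activated"}
    return {"status": "invalid"}
```

The plugin's "verify" button would send its own site URL and code to this endpoint and switch itself to working only on an "activated" reply.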
So the best solution would be some kind of CDN-like service that specializes in this. It could be free or paid. Do you know of something of that nature? Or would you suggest a better solution?
I am thinking you need something other than a CDN for that purpose. I would suggest you look to the cloud for a solution. If you want the most uptime for your verification app, with the simplicity of a CDN, you should go for something like Pagodabox or Heroku.
Those two services host your code for you, much like a regular server, but they automatically scale and handle requests. In theory that should make your app available 100% of the time, with minimal resources spent on your part.
Both services offer free plans so you can get going and test whether it's something for you.
I hope this helps; it's my suggestion for your problem, at least.
If your hosting provider is good, they usually offer a 99% uptime guarantee. A CDN is only required if you are getting a large number of verification requests that cause delays in processing.
Here is what I would suggest: when someone tries to verify the plugin and the server is down (no reply), send an email to the admin (you). That way you instantly know the server is down and can process the verification manually.
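A rough sketch of that fallback on the plugin side (the URLs, addresses, and the Python language are placeholders for illustration):

```python
import urllib.request
from email.message import EmailMessage

VERIFY_URL = "https://verify.example.com/check"  # placeholder endpoint
ADMIN_EMAIL = "admin@example.com"                # placeholder address

def notify_admin(error: str) -> EmailMessage:
    """Build the alert mail sent when the verification server is unreachable."""
    msg = EmailMessage()
    msg["Subject"] = "Verification server down"
    msg["From"] = "plugin@example.com"
    msg["To"] = ADMIN_EMAIL
    msg.set_content(f"A verification request failed: {error}")
    return msg

def verify_or_alert(send_mail=None):
    """Try the verification request; on failure, alert the admin instead."""
    try:
        with urllib.request.urlopen(VERIFY_URL, timeout=5) as resp:
            return resp.read()
    except OSError as exc:  # connection refused, timeout, DNS failure...
        alert = notify_admin(str(exc))
        if send_mail is not None:  # e.g. smtplib.SMTP(...).send_message
            send_mail(alert)
        return None
```

The admin can then run the verification by hand once the server is back up.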
CDN stands for content delivery network: a large distributed system that is mostly used to serve content. It does not seem to fit what you are looking for. If I understand correctly, you want to offer 99.x% uptime so clients can always register their plugins.
This could require setting up multiple virtual hosts / dedicated servers behind a load balancer, so that when one node goes down, the load balancer redirects traffic to another node to handle requests. You really need to find a good hosting company. That's all!
I’m making a new static website for a client, and they want a contact form that a user can fill out to send an enquiry to the client's email.
The client is currently using cPanel as their web host manager, on what I believe is an Apache server. I don’t actually know which server-side languages are available, as I don’t have access to their cPanel account.
I was initially thinking I could create a PHP script that uses the mail() function, put it into the public_html folder with the rest of the static site, and point the form's action at that script, so it runs when the user submits a POST request.
But from what I’ve read, that isn’t the best way to go about it, and instead I should be using the Simple Mail Transfer Protocol (SMTP). I have a vague idea of what this is from googling, but have no idea how to implement such a thing.
I’m typically used to creating sites with React.js/Gatsby.js, Node.js, and Netlify, so I’ve had no experience with cPanel and the like.
Has anyone done something similar, or have any advice? Anything will be much appreciated!
I think the first step is to get as much information as you can about the available server setup. It is very hard to work out a good solution without control of, or knowledge about, the server side: a tool or resource you rely on when implementing an SMTP-based solution might not be available when you deploy it.
You could try to persuade your client of the importance of knowing the server structure, using arguments about quality of service, security, and so on.
If you get the server information, consider using PHPMailer, provided the server offers the required resources and dependencies.
If you cannot get that information, your first idea (using mail()) will probably work. Great discussions about the topic have occurred here.
However, first ensure that the server offers an email sending service at all. It would be unusual for it not to, but it is a possibility.
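To show what the SMTP route involves, here is a minimal sketch (in Python rather than PHP, purely for illustration; the host, port, addresses, and credentials are all placeholders — PHPMailer's workflow is analogous):

```python
import smtplib
from email.message import EmailMessage

def build_enquiry(name: str, sender: str, body: str) -> EmailMessage:
    """Assemble the contact-form enquiry as an email message."""
    msg = EmailMessage()
    msg["Subject"] = f"Website enquiry from {name}"
    msg["From"] = "no-reply@example.com"  # placeholder sending account
    msg["To"] = "client@example.com"      # placeholder client address
    msg["Reply-To"] = sender              # so the client can reply directly
    msg.set_content(body)
    return msg

def send_enquiry(msg: EmailMessage) -> None:
    """Deliver the message over authenticated SMTP with STARTTLS."""
    # cPanel hosts typically expose an SMTP server such as mail.<domain>
    # that accepts an email account's own login; these are placeholders.
    with smtplib.SMTP("mail.example.com", 587) as smtp:
        smtp.starttls()
        smtp.login("no-reply@example.com", "app-password")
        smtp.send_message(msg)
```

The advantage over bare mail() is authentication: mail sent through the account's own SMTP server is far less likely to be flagged as spam.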
Long-time reader, first-time asker; if my question is silly, missing info, or mistitled, let me know and I'll fix it.
Okay, so I'm working at a community center for the next 8 weeks as a tech help assistant; I'm also a CS student.
They have a web application that is quite old, running locally on an IIS server (version 7). It's for keeping track of their members, events, and registrations. It's written in ASP.NET and uses an Access database. They also have a WordPress website (PHP, MySQL, Apache) for advertising events and sharing information about what's going on in the community.
What they would like is to link their WordPress site to the local application. I've been racking my brain about whether this is even possible. I'm leaning towards not, because the local application shouldn't be outward facing: it holds sensitive data and was not designed to be secure in the face of would-be hackers.
The only solution I could think of is to create a "walled off" section of the computer hosting the local application, with an outward-facing port that accepts incoming data from the WordPress site and passes it on to the Access database as an update (incrementing a counter of people registered for a program). It would need to be possible for one file to have some kind of global (from the web) executable permission, while all the other files on the localhost machine stay locked down from that global permission.
We would also need to be able to get two boolean values from the local app for the WordPress site: whether the program/camp/whatever is full, and whether the update was unsuccessful in the event of something going wrong. I'm just not sure whether something like that is even possible, or where to start. The most important thing is that it's secure.
If a secure API could work I have time to create something like that.
I don't have enough time to upgrade their local system to make it safe enough to be online, because I have to run the tech help sessions, even though I know that would be the most realistic option.
Thanks very much
What they would like is to link their WordPress site to the local application. I've been racking my brain about whether this is even possible. I'm leaning towards not, because the local application shouldn't be outward facing: it holds sensitive data and was not designed to be secure in the face of would-be hackers.
I think you've hit the nail on the head right there. It looks like you have a decent understanding of the situation, but not of their internal app. The fact is that it's hard to scope something like this without getting in and getting details. Step 1 would be to see if you can talk to whoever built the thing and get their feedback. It might be secure enough to expose some sort of connection.
Really, there's not enough information here to determine a good answer, and you should be wary of anyone who says it's secure. There are a ton of factors that go into web security.
You might be able to throw together a basic RESTful API with authentication that responds only to the WordPress site's IP. But if the site is on shared hosting, that IP is shared, and the information could be consumed by third parties, so you'll have to decide whether that is an acceptable risk.
I wouldn't try to expose everything and partition it with Apache. A basic RESTful API with authentication would be best at first glance, IMO. That way you only expose consumable data and limit what can be done with it.
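One simple way to authenticate such an API is a shared-secret HMAC signature on every request; this is only one possible scheme, sketched here in Python with made-up names:

```python
import hashlib
import hmac

# Secret known only to the WordPress site and the local app (placeholder value).
SHARED_SECRET = b"change-me-to-a-long-random-value"

def sign(payload: bytes) -> str:
    """Signature the WordPress site attaches to each request body."""
    return hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()

def is_authentic(payload: bytes, signature: str) -> bool:
    """Constant-time check the API performs before touching the database."""
    return hmac.compare_digest(sign(payload), signature)
```

On top of this, the API would respond only to the WordPress server's address and expose nothing beyond the two boolean reads and the single counter update the site needs.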
I am using the MageMonkey extension from Ebizmarts, and when I save my config in the admin I get the following error:
Could not add Webhook "http://example.com/monkey/webhook/index/wkey//" for list "Test Mailing List", error code 508, We couldn't connect to the specified the URL. Please double check and try again.
I did some digging and arrived at the conclusion (duh) that MailChimp cannot see my local environment, so it's unable to add the webhook. Is it possible for me to configure this locally for testing purposes, or do I have to wait until the site is live (which sounds pretty strange to me)?
UPDATE: I reached out to MailChimp and got the following response. It seems they won't add a host entry to recognize my test environment; the only way to accomplish this would be to use a handshake key.
Thanks for reaching out to MailChimp support. I can certainly understand the concern here and will be happy to help.
Unfortunately, any webhooks being used must be publicly available, and there is no way to add a host entry in MailChimp so that the URL can be used.
If your testing environment allows for handshake keys, one option might be to add that onto the URL: (can't add more than 2 links)
At MailChimp we definitely appreciate testing and encourage it with our users and I will be sure to pass this feedback along to our developers so that testing in closed environments might be a bit easier. I also wanted to provide a link to our feedback form in case you wanted to leave some feedback for our developers directly: (can't add more than 2 links)
If you have any additional questions or concerns, feel free to reach back out and we will be happy to help.
Thank you,
Mikey
Use https://ngrok.com/, available for all platforms.
It allows you to tunnel requests to your local dev machine. It's very easy to use: just download it and run:
ngrok http 80
Then it'll show you the forwarding URL (where xxx is randomly generated):
Forwarding https://xxxxxxxx.ngrok.io -> localhost:80
Use https://xxxxxxxx.ngrok.io as the beginning of your webhook callback URL.
Once it's running, a web interface is available at http://127.0.0.1:4040 that shows metrics and lets you replay requests.
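For completeness, the local endpoint that ngrok forwards to can be as small as this (a Python sketch of a generic webhook receiver, not specific to MageMonkey):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    """Accept webhook POSTs on any path and acknowledge them."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)
        print("webhook payload:", payload.decode(errors="replace"))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request console logging

def serve(port: int = 80) -> None:
    """Run this, then `ngrok http 80` to expose it publicly."""
    HTTPServer(("", port), WebhookHandler).serve_forever()
```

Anything MailChimp sends to the ngrok URL then shows up on your local console, and in ngrok's web interface for replaying.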
We have a unified portal which links multiple services through a jQuery tab-based interface, using iframes to display content from the different services. Our portal runs on a secure server with HTTPS/SSL. While most of our external services are HTTPS, two of them aren't. Obviously we are aware of the issues with mixed content, and we didn't like the idea of having non-HTTPS sites within the portal, but we didn't have a choice. Everything was OK until a few days ago, when Google updated Chrome to version 30, which now silently blocks mixed content. This has created a great number of problems for us. We contacted the external services and asked whether they could upgrade to HTTPS, and one of them has come back saying they have no plans to do so for the next 2 years.
Obviously this is a problem. We tried to get around it by having this service open in a new browser window, but that is a rather inelegant workaround which I would like to get rid of, if at all possible. Is there any way I can use AJAX or PHP to prefetch the page in question and then display it within the portal iframe without Chrome blocking it?
I would be very grateful for any advice at all. I do understand how bad an idea it is to mix secure content with non-secure content, but I have no choice in the matter, as my manager is adamant that the service has to be part of the portal.
Thanks in advance.
Regards
Alex
A somewhat simple solution would be to use a reverse proxy. You can configure Apache quite easily to take an HTTPS connection, fetch the requested content from another URL and return it. See mod_proxy. The problem is that the browser will necessarily see a different URL/domain on its part (your reverse proxy), which may or may not cause problems with cookies or hardcoded links.
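For reference, a minimal mod_proxy setup inside the portal's existing HTTPS virtual host might look like this (hostnames and paths are placeholders, and mod_proxy plus mod_proxy_http must be enabled):

```apache
# Proxy the insecure external service under an HTTPS path of the portal.
ProxyRequests Off
ProxyPass        /legacy-service/ http://insecure-service.example.com/
ProxyPassReverse /legacy-service/ http://insecure-service.example.com/
```

The iframe would then point at /legacy-service/ on the portal's own domain, so the browser only ever sees HTTPS. ProxyPassReverse rewrites redirect headers, but hardcoded absolute links inside the proxied pages and cookies scoped to the original domain may still need extra handling (see ProxyPassReverseCookieDomain).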
I have already heard about the cURL library, and it has caught my interest. I've read that there are many uses for it; can you provide me with some?
Are there any security problems with it?
One of the many useful features of cURL is interacting with web pages: you can send and receive HTTP requests and manipulate the data. That means you can log in to websites and actually send commands as if you were interacting from your web browser.
I found a very good web page titled "10 awesome things to do with curl" at http://www.catswhocode.com/blog/10-awesome-things-to-do-with-curl
One of its big use cases is automating activities such as fetching content from other websites in your application. It can also be used to post data to another website and to download files via FTP or HTTP. In other words, it allows your application or script to act as a user accessing a website, just as they would when browsing manually.
There are no inherent security problems with it, but it should be used appropriately, e.g. use HTTPS where required.
cURL Features
It's for spamming comment forms. ;)
cURL is great for working with APIs, especially when you need to POST data. I've heard that it's quicker to use file_get_contents() for basic GET requests (e.g. grabbing an RSS feed that doesn't require authentication), but I haven't tried it myself.
If you're using it in a publicly distributed script, such as a WordPress plugin, be sure to check for it with function_exists('curl_init'), as some hosts don't install it...
In addition to the uses suggested in the other answers, I find it quite useful for testing web-service calls, especially on *nix servers where I can't install other tools and want to test the connection to a third-party web service (checking network connectivity, firewall rules, etc.) in advance of installing the actual application that will be communicating with it. That way, if there are problems, the usual response of "something must be wrong with your application" can be avoided, and I can focus on diagnosing the network or other issues that are preventing the connection from being made.
It can certainly simplify small programs you need to write that communicate over higher-level protocols.
I do recall a contractor, however, attempting to use it in a high-load Apache web server module, and it was simply too heavyweight for that particular application.