How do I protect individual WordPress URLs from being attacked? [closed] - php

I am trying to secure my WordPress site from hackers, more specifically individual non-content pages that appear to be getting a lot of hits. I am using SiteGround and installed WordPress a few months ago. Checking the website statistics, I was taken aback by what I saw. I have briefly summarised the page hits below.
https:// ... /wp-admin/admin-ajax.php --> this page has been viewed over 16k times each month. Considering my site has been live for only two months, has no SEO, and no one knows it exists, this is odd!
https:// ... /index.php/wp-json/wp/v2/users/ --> this page is giving away my usernames.
https:// ... /index.php/wp-json/wp/v2/pages --> appears to display code from one of my main pages.
And a whole load of pages that it seems very odd for anyone to be accessing:
https:// ... /index.php/wp-json/wp/v2/taxonomies
https:// ... /index.php/wp-json/wp/v2/categories
https:// ... /index.php/wp-json/wp/v2/taxonomies/post_tag
https:// ... /index.php/wp-json/wp/v2/taxonomies/category
https:// ... /index.php/wp-json/wp/v2/tags
https:// ... /wp-admin/load-styles.php --> shows blank screen
There's a whole load more: some redirect to the WordPress login page and others show a blank screen. There are also some URLs that allow any user to download a *.woff file (whatever that is?!).
The point is, I thought WordPress would be secure enough not to expose these pages, or at the very least not to reveal details.
Is there anything I can do? As I pointed out, I'm using SiteGround, which doesn't use cPanel.
I thought the most difficult parts of a blog site would be content creation and overall web design. I'm not sure now.
Any help and/or advice would be greatly appreciated.
Thank you.

As for accessing login pages, that's to be expected with a WordPress site - bots love them.
You should have a strong password and use a nonstandard username for your admin-rights accounts. Bots will always try the default login page with default credentials. You could go a step further and move the login page too; that will massively reduce hits on the real login page. There's a plugin for it if you aren't comfortable coding that yourself: WPS Hide Login.
As for the wp-json URLs, you can require a login for them or disable them outright, for example with a plugin that disables the REST API: Disable REST API.
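If you'd rather not install a plugin, a minimal sketch of the same idea can go in your theme's functions.php. This uses WordPress's rest_authentication_errors filter to require a login for all REST API requests (the error message and error code string are my own choices):

add_filter('rest_authentication_errors', function ($result) {
    // If another authentication check has already produced a result, respect it.
    if (!empty($result)) {
        return $result;
    }
    // Otherwise require a logged-in user for every REST API request.
    if (!is_user_logged_in()) {
        return new WP_Error('rest_forbidden', 'REST API restricted to logged-in users.', array('status' => 401));
    }
    return $result;
});

Be aware that locking down the API can break plugins and the block editor, which use wp-json themselves, so test your site after adding it.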
Concerning the .woff files, those are just font files; either a bot is scraping them or a user's browser is fetching them to render the page as designed. Not really a concern.
WordPress also has a decent article here on additional things you can do to secure your website.

Related

Pass URL parameter (utm code) across my site's webpages [closed]

I have a webpage I'm trying to promote via ad banners. I want to attach a UTM code to those links so that when a visitor lands on my website, I'll be able to track where they came from (mysite.com?utm_campaign=adXYZ).
Normally these ad banners lead to a single webpage with a single point of conversion, where I'm able to capture the utm_campaign ID and gauge how effective my marketing is. However, I'm now leading users to a full website with many pages and many points of conversion. I'm hoping to keep that utm_campaign ID across multiple pages using some crafty JS or PHP.
For example:
user clicks ad banner to mysite.com?utm_campaign=adXYZ
user lands on mysite.com but wants to go to mysite.com/features
user goes from mysite.com/features to mysite.com/pricing
By the time the user reaches mysite.com/pricing, I want ?utm_campaign=adXYZ to still be there in the URL.
I know there are ways via analytics and whatnot to track a session/conversion, but I specifically need to capture the referral UTM code in an HTML form down the road. Can anyone point me in the right direction? Thanks a bunch!
Edit: An important point to note. The site should still be accessible organically via search, bookmarks, links, etc., without the trailing campaign ID in the URL. Only when a user arrives from an ad banner should the campaign ID be present, and on all subsequent pages.
I would set a cookie containing the relevant information the first time the user enters your website.
Otherwise you have to pass the information along again with every request (GET/POST). That solution will work even if the user doesn't allow cookies; Murat Cem YALIN wrote how this works in detail. But if you want to use the JS method, be aware that the user must have JS enabled!
The third option might be using PHP Sessions.
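A minimal sketch of the cookie approach in PHP (the cookie name and the 30-day lifetime are my own choices; this must run before any output, e.g. at the top of a shared header include):

if (isset($_GET['utm_campaign'])) {
    // Remember the campaign ID for 30 days, across the whole site.
    setcookie('utm_campaign', $_GET['utm_campaign'], time() + 30 * 86400, '/');
    $_COOKIE['utm_campaign'] = $_GET['utm_campaign']; // also make it available on this first request
}

Later, in the conversion form, read it back into a hidden field:

<input type="hidden" name="utm_campaign"
       value="<?php echo htmlspecialchars(isset($_COOKIE['utm_campaign']) ? $_COOKIE['utm_campaign'] : ''); ?>">

This also satisfies the edit in the question: organic visitors simply never get the cookie, and no URL has to carry the campaign ID.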
You can do it with both PHP and JS.
In PHP, use simplehtmldom (http://simplehtmldom.sourceforge.net/) to find all the links in the HTML output and append ?utm_campaign=adXYZ to each of them just before sending the rendered HTML, as sketched below.
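A rough sketch of that server-side rewrite, assuming the fully rendered markup is already in $html and simple_html_dom.php is on your include path:

require_once 'simple_html_dom.php';

$dom = str_get_html($html);
foreach ($dom->find('a[href]') as $link) {
    // Use & if the link already has a query string, ? otherwise.
    $sep = (strpos($link->href, '?') === false) ? '?' : '&';
    $link->href .= $sep . 'utm_campaign=adXYZ';
}
echo $dom; // output the rewritten page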
In JS, you can use jQuery to do the same once the document has loaded, e.g.:
$("a").each(function(){
$(this).attr("href", $(this).attr("href") + '?utm_source=adxYZ');
});

Content management for my already coded website [closed]

I am in secondary school learning web development, and our latest school project was to come up with our own company based around a website.
Basically, my website is going to display aspiring animators' videos. There is going to be a place where other users of the website can post feedback on these videos, and there are going to be other resources for the animators to use.
I have already created the base of the website. I have placeholder YouTube videos on the home screen (where the users' videos would go), and I have a contact page and a resource page.
My teacher told me that if I wanted the website to actually function - that is, to have a login system where users can sign in and post their own videos for other users to see (posting a video would most likely mean submitting a YouTube link, which would then be displayed on the home page), plus a comment system so users can leave feedback on each other's videos - my best option was to use a CMS, e.g. Drupal. I was unsure about this because, as far as my research goes, I believe a CMS is made for users to work from its own web templates and doesn't work well for those who have already coded a website. (unsure)
I am new to making websites but quite capable with a bit of learning. All I need to know is what method I should use to add a login system so users can post and comment on my website, plus a way for the admin running the website to manage its content easily without changing any code. Since I have already coded my website, I am unsure whether this is possible, and I do not have time to start again.
Thanks for your help.
Actually, I believe it would be a lot easier to simply take your coded website and convert it into a template for one of the most popular CMS platforms (Joomla, for example). That would allow you to use thousands of free plugins (including ones for video uploading and galleries, for that matter), and it will make your site a LOT safer. It's a lot faster than coding your own CMS too - if the design is not very complicated and you don't need a lot of functionality, I believe it would take you a few days at most to install Joomla, find, add, and configure the few necessary plugins, and follow one of the hundreds of tutorials on converting your HTML into a Joomla template.
If you insist on coding your own CMS, start with this tutorial:
https://css-tricks.com/php-for-beginners-building-your-first-simple-cms/
It's old, from 2009, but it covers most of the basics of working with simple databases, user login sessions, etc.
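To give you a flavour of the "user login sessions" part, here is a minimal sketch; the users table, its columns, and the connection details are all hypothetical, and a real site would also need registration, logout, and error handling:

session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $pdo = new PDO('mysql:host=localhost;dbname=mysite', 'dbuser', 'dbpass');
    $stmt = $pdo->prepare('SELECT id, password_hash FROM users WHERE username = ?');
    $stmt->execute([$_POST['username']]);
    $row = $stmt->fetch();

    // password_verify() checks the input against a hash created with password_hash().
    if ($row && password_verify($_POST['password'], $row['password_hash'])) {
        $_SESSION['user_id'] = $row['id']; // the user is now logged in
    }
}

$loggedIn = isset($_SESSION['user_id']);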

Why is a design for a "404 not found" page needed? [closed]

I want to know the advantage of a 404 Not Found page. Why should one create a design for it? Why not just set a 404 header and redirect to the home page with a message something like "What do you want of the URL? Please just walk into the website"?
I've noticed many websites have a special page for this purpose. But why?
Regards
404 Not Found
The 404 (Not Found) status code indicates that the origin server did not find a current representation for the target resource or is not willing to disclose that one exists. A 404 status code does not indicate whether this lack of representation is temporary or permanent; the 410 (Gone) status code is preferred over 404 if the origin server knows, presumably through some configurable means, that the condition is likely to be permanent.
Hypertext Transfer Protocol (HTTP/1.1), section 6.5.4
404 pages let users know that the page does not exist.
Many people will change the URL manually (for example, /pictures/page/1 to /pictures/page/2). If you redirect the user to the front page it will be confusing to them. What they expected to happen (either go to page 2 or get a "Page not found" error) will not happen; finding themselves on the front page is not useful.
The more important thing, however, is that users should ideally never see 404 pages. When they actually do see one, it should be very clear that the page does not exist. Redirecting them does not tell them that the page does not exist; it tells them that... it is the front page, which it should not be. It is confusing.
Another issue is that search engines may find it odd that a lot of your pages get redirected to your front page. It is not exactly how the internet is supposed to work, so they may actually penalize your website because of it.
If you want your users to have a good experience on your website then your 404 pages should attempt to help them find whatever they were looking for. Some things you can show the user are:
A search box. Google has custom search boxes you can put on your website, which only search your site. If making your own is too complicated then this is a good solution.
Content that may be similar to what they were looking for, if you are able to determine something like that. Just make sure it works reasonably well; if you cannot find anything similar, don't show random stuff -- it is not helpful.
The newest content on the page. This is especially useful if it is a blog, news site, or some other kind of website that frequently gets new content.
The most popular content. If the user is just browsing to pass the time then popular content may allow them to continue browsing without leaving your website.
A link to a sitemap (or the sitemap itself) may also be useful if the website is small enough to summarise on one page.
...and so on. Just try to think of what would be helpful to the user.
A good custom 404 page will help people find the information they're looking for, as well as providing other helpful content and encouraging them to explore your site further.
Moreover, if you do not make your own custom page, the server's default 404 error will be displayed, which will not match the design of your website. There are also bad actors who probe for unauthorized pages, and keeping an eye on your 404s helps you notice them and gives a sense of a secured website.
What if you go into a department store looking for a soap that is actually out of stock, but the storekeeper just makes you start again from the entrance? Isn't a "not available" message the better option? The same goes here.
The idea of a 404 page is to tell the user that the file they were looking for wasn't found, or that the link they clicked on was broken.
Say you're running a news site and a user clicks on a link to an article on your site, but the article has been deleted. A 404 page makes it very clear to the user that the article is gone. If you just redirect them to your homepage, they might think a featured article on your page is the one they were linked to, or that they got redirected for no reason. They won't have any reason to think that what they were looking for no longer exists on your server.
So the purpose of a 404 page is to say "Hey, what you were looking for isn't here." If you want to get more fancy, you can even use a 410 error instead, which means "Hey, what you were looking for used to be here, but it's gone now."
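In practical terms, the key is to send a real 404 status along with your designed page instead of redirecting. A minimal PHP sketch (the template path is hypothetical; on Apache you would point ErrorDocument 404 at such a script in .htaccess):

http_response_code(404);                // tell browsers and crawlers the page is missing
include __DIR__ . '/templates/404.php'; // render the site's designed "not found" page
exit;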

How does Google crawl dynamic pages? [closed]

I am about to create an online shopping site for one of my clients. I have to make this site SEO-friendly, and therefore I must understand a few things before I proceed.
As I said, I am going to make a custom CMS-based website so that my client can add new content through the CMS, but there are a few things I don't understand.
For example: I have an index.php page with many links to different products, all generated from the database using PHP. The links look like:
http://www.def.com/shoes/Men-Shoes
My Questions:
1) When GoogleBot crawls my site, will it also follow my dynamically created links and index them, along with their content?
2) Do I have to create separate pages for all of the products on the site and store them on my server? Or just a single page which serves content dynamically according to the user's query for every product?
I read this:
"It functions much like your web browser, by sending a request to a web server for a web page, downloading the entire page, then handing it off to Google’s indexer."
Is that right?
My URL above actually looked like this, and I used an .htaccess file to make it pretty:
http://www.def.com/shoes.php?type=Men-Shoes
So is that correct, and will Google crawl and index it?
SEO is a complex science in itself and Google is always changing the goal posts and modifying their algorithm.
While you don't need to create separate pages for each product, creating friendly URLs using the .htaccess file can make them look better and easier to navigate; a sketch of such a rewrite is below. Also, creating a sitemap and submitting it to Google via their Webmaster Tools will help them know which pages to index.
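For reference, the kind of rewrite described in the question is usually just a couple of lines in .htaccess; the pattern below is an assumption based on the example URLs, so adjust it to your real URL scheme:

RewriteEngine On
# Map /shoes/Men-Shoes to shoes.php?type=Men-Shoes internally
RewriteRule ^shoes/([^/]+)/?$ shoes.php?type=$1 [L,QSA]

Google sees and indexes the pretty URL; the rewrite to the query-string version happens internally on the server.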
GoogleBot will follow the links on your site, including dynamically created ones, but it is important not to try to game the system using blackhat methods if long-term success is your aim.
Also, use social media (Twitter, Facebook, Google+) to help promote your brand, and make sure you follow Google's guidelines with regard to SEO and on-page optimisation.
There is a huge amount of information on the internet on this subject, but be careful what advice you follow.
Google and other search engines index the dynamic links too. One way to avoid duplicate content is to use the "Crawl" -> "URL Parameters" tool in Google Webmasters. You can read more about how that works here: https://support.google.com/webmasters/answer/6080548?rd=1. Set the "Crawl" field to "No URLs". This way you can hide dynamic links from search, but you need a list of all of your website's/CMS's dynamic links so you don't accidentally hide important content. The "URL Parameters" feature is also available in Bing Webmaster Tools: http://searchenginewatch.com/sew/how-to/2195777/bing-webmaster-tools-an-overview#.

bulk "fetch as google" by PHP [closed]

I update my website content daily, around 15 to 20 new pages.
From my webmaster account I "Fetch as Google" for each page, which is a long process.
Is there a way to automate it with PHP?
Can PHP "auto submit" my new pages (the new links are in a MySQL database) to "Fetch as Google" for me?
Please help.
Thank you.
It is wrong to do that "by hand".
I will cite another answer:
I would say it is not the preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like:
add the page to your XML sitemap (make sure sitemap is submitted to Google)
add the page to your RSS feeds (make sure your RSS is submitted to Google)
add a link to the page on your home page or other "important" page on your site
tweet about your new page
status update in FB about your new page
Google Plus your new page
Feature your new page in your email newsletter
Obviously, depending on the page you may not be able to do all of these, but normally Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
I only use fetch if I am trying to diagnose a problem on a specific page, and even then I may just fetch but not submit. I have only submitted when there was some major issue with a page that I could not wait for Google to update as a part of its regular crawl of my site.
As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and submitted the robots.txt to encourage Google to update the page sooner, so that our new section would be "live" to Google sooner, as G does not hit our robots.txt as often. Otherwise, for 99.5% of the other pages on my sites, the options above work well.
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps can include thousands of pages each, while Google fetch is limited, which is another reason I reserve it for my time-sensitive emergencies.
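Putting the sitemap advice together with the question: since the new links are already in MySQL, PHP can regenerate the sitemap daily and then ping Google about it. A rough sketch (the table and column names, the credentials, and the sitemap URL are all assumptions):

$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'dbuser', 'dbpass');

$xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pdo->query('SELECT url FROM new_links') as $row) {
    $xml .= '  <url><loc>' . htmlspecialchars($row['url']) . '</loc></url>' . "\n";
}
$xml .= '</urlset>';
file_put_contents('sitemap.xml', $xml);

// Tell Google the sitemap changed (run after each daily update).
file_get_contents('http://www.google.com/ping?sitemap=' . urlencode('http://www.example.com/sitemap.xml'));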
