I am building a MediaWiki site here.
It is going OK, but now I want some articles that document things whose names begin with a lower-case letter, such as Unix commands: "man", "chmod", "ls", "iPod", etc. I don't want them to show up the way MediaWiki normally renders them: it tries to make all page and article titles begin with an upper-case letter.
Also, I don't want the searches to be case-sensitive. I want searching for "apple" to find "Apple", etc.
I believe I've achieved case-insensitive searches by following the instructions from a web page I found.
I believe that this is good, but I am a little squeamish about what I had to do:
Changing the structure of the database table _pages: changing the type of the page-title column to VARCHAR(255) and changing its collation to a case-insensitive UTF-8 collation.
Adding a global function to globalFunctions.php.
Changing the PHP code in the wiki's skin.
It seems like this should just be a PHP variable in LocalSettings.php.
But this all seems to work. I mean, I could enter "apple" and it would find the article on "Apple" rather than prompting me to create a new article called "apple".
But then I noticed that page titles were still capitalized for new articles, such as an article on "chmod".
I went back to googling and found a web page that said to use the MediaWiki global variable:
$wgAllowDisplayTitle = true;
and that this would enable me to use markup such as the following:
{{DISPLAYTITLE:chmod}}
http://www.learnbymac.com/wiki/index.php?title=Chmod
This partially works. The displayed title of the article is now "chmod", but in the database the title is still "Chmod". That wouldn't be so bad, except that when I go to the Category "Unix", all of the Unix commands show up starting with an upper-case letter.
I read on the MediaWiki site that beginning a page title with a lower-case letter, in any language, is disallowed.
I would like things on my wiki to be like they are on my Mac, not case-sensitive, but case-preserving.
I know that MediaWiki has to consider just about every language in the world, but I don't.
I really would rather not modify the structure of my MediaWiki database any further, but maybe that's what's required. I just noticed that the page titles are wrong not only on category pages but also in the title shown when you are editing a page.
Here's a link to a category that lists the titles in the wrong case:
---Edit---
I figured it out. I believe it is fine now. I was missing the following line in my MediaWiki configuration file, "LocalSettings.php":
# disable first-letter capitalization of page names
$wgCapitalLinks = false;
I know that I entered this the first time. I believe what happened is that the changes got saved to my local file system instead of being uploaded by my text editor, via FTP, to my website.
As you noted, setting $wgCapitalLinks = false; in LocalSettings.php will do the trick. If you already had pages in your wiki, you will probably want to run the maintenance script CleanupCaps as well: http://www.mediawiki.org/wiki/Manual:CleanupCaps.php
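For example (a sketch, assuming a standard install where the maintenance scripts live in the maintenance/ directory under the wiki root; check the manual page above for the exact script name and options in your version):

# run from the wiki root, after setting $wgCapitalLinks = false;
php maintenance/cleanupCaps.php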
For your second question: to make the search case-insensitive you can use the TitleKey extension (http://www.mediawiki.org/wiki/Extension:TitleKey).
It is stable and used on many major wikis. There is also the possibility of plugging in the Lucene search engine, if you want more control over the behaviour (http://www.mediawiki.org/wiki/Extension:MWSearch).
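If it helps, installing TitleKey is typically just a line in LocalSettings.php; this is only a sketch assuming the extension follows the usual ExtensionName/ExtensionName.php entry-point convention of its era, so check the extension page above for the exact line:

# LocalSettings.php - load the TitleKey extension (path assumed; see the extension page)
require_once "$IP/extensions/TitleKey/TitleKey.php";

After loading it you will likely also need to run the maintenance script the extension ships with, so that search keys are built for pages that already exist; the extension page documents that script.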
The thing is, in my WordPress website I have posted some links to some posts, and the domain name in those links has changed. There are a lot of them, across a total of 549 posts. The link looks like yourdomain.com/free-access/ and it is inside a button. Any suggestions on how I can change the domain name?
Thanks.
The best way to change the domain names in posts is to export and download a database backup, search and replace the domain-name occurrences, save the SQL file, and then import it again.
You can do a global search and replace with a plugin. There are others but I suggest Better Search and Replace. It's in the WordPress Repository and is free.
Before you make a change like this you should run a full backup, just in case anything goes wrong.
The plugin in question will let you do a dry run: you enter your search text and replacement text, select the tables you want to search, and then check the box to Run as Dry Run. This will show you where your changes would be made without actually making them, so you're sure you've entered the right search and replace terms.
You can also make changes like this directly on the database tables using SQL but I strongly recommend against it. This tool is much safer.
How can I get search results when a word is misspelled, or when a Greek word is entered, and still return the English result?
Example:
On my web site I want a search for "jeans" or "jenz" to return the same results. That is a simple example; my main concern is what happens if a user enters a word with the same meaning, a misspelled word, or a Greek word, while my site is developed in English. How can such a word be searched?
Can anyone give me an idea how to solve this problem? My site is in Laravel 5.2.
For the "spelling wrong" part (Did you mean)
You could do some LIKE queries in the db e.g. LIKE %j%e%a%n%s% but if you want to do it well, you should use a search library like this:
https://github.com/TomLingham/Laravel-Searchy
http://tnt.studio/blog/did-you-mean-functionality-with-laravel-scout
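To make the LIKE idea concrete, here is a minimal sketch for Laravel's query builder; the products table and name column are invented names, so adapt them to your own schema:

use Illuminate\Support\Facades\DB;

// Build the pattern "%j%e%a%n%s%" from the user's input, then match against it.
$term = 'jeans';
$pattern = '%' . implode('%', str_split($term)) . '%';
$results = DB::table('products')
    ->where('name', 'LIKE', $pattern)
    ->get();

Note that patterns like this cannot use an index, so they get slow on large tables; that is another reason to reach for one of the libraries above.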
You can publish the configuration file to your app directory and override the settings by running php artisan vendor:publish to copy the configuration to your config folder as searchy.php.
You can set the default driver to use for searches in the configuration file. Your options (at this stage) are: fuzzy, simple and levenshtein.
You can also override these methods using the following syntax when running a search:
By defining a Levenshtein distance you can fine-tune how far off a word can be.
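As a rough illustration of the idea in plain PHP (not tied to any particular library; the candidate words and the threshold of 2 are made up):

// Treat a candidate word as a match if it is at most 2 edits away from the query.
$query = 'jenz';
$candidates = ['jeans', 'jacket', 'jean'];
$maxDistance = 2;
$matches = array_filter($candidates, function ($word) use ($query, $maxDistance) {
    return levenshtein(strtolower($query), strtolower($word)) <= $maxDistance;
});
// $matches now contains 'jeans' and 'jean'.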
If you need better performance you should consider something like Solr or Elasticsearch for this task.
https://wiki.apache.org/solr/SpellCheckComponent
https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-fuzzy-query.html
For the translation part:
You should have a dictionary ready in your database or localization files and run the input through that as well. Expect high computing times for both spelling correction and translation in your code.
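A very small sketch of that lookup in plain PHP; the map entries here are invented examples and would really live in the database or in Laravel's localization files:

// Map misspellings and non-English words to the canonical English search term.
$dictionary = [
    'jenz' => 'jeans',   // misspelling
    'τζιν' => 'jeans',   // Greek input (example entry, assumed)
];
$userInput = 'Jenz';
$input = mb_strtolower(trim($userInput));
$normalized = isset($dictionary[$input]) ? $dictionary[$input] : $input;
// ...then run the normal search with $normalized instead of the raw input.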
Cheers everyone!
Please bear with me, I really did do some research on this, but I couldn't come to a final solution, hence I'm here to hear your opinions.
What I want to build is a small i18n-CMS with dynamic hierarchical pages such as:
domain.tld/en/I/am/a/path
I want to find the least performance-intensive way that allows me to have beautiful, SEO- and human-friendly URLs.
I use a closure table, so two tables in the database: one for the pagenodes and one for the pathtree, plus another table for the localised page that references a certain pagenode (three in total).
My different solutions so far:
Sure, I could make an algorithm that goes through all the request segments and checks whether there is an English "path" under an "a" under an "am" under an "I" (a rough sketch follows below), but this seems very unwise considering the number of page hits.
Or is it?
Positive: I wouldn't need to save the path anywhere, because it would be calculated. So moving pages around wouldn't require recalculating the path and saving it again.
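A rough sketch of that segment-by-segment lookup in plain PHP/PDO; the table and column names (pagenodes, parent_id, slug, lang) are deliberately simplified stand-ins for the closure/localisation tables described above, and the point is only to show the one-query-per-segment cost:

$pdo = new PDO('mysql:host=localhost;dbname=cms;charset=utf8', 'user', 'pass');
$segments = explode('/', trim('I/am/a/path', '/')); // taken from the request URI
$parentId = 0; // 0 = hypothetical root node
foreach ($segments as $slug) {
    $stmt = $pdo->prepare(
        'SELECT id FROM pagenodes WHERE parent_id = ? AND slug = ? AND lang = ?'
    );
    $stmt->execute([$parentId, $slug, 'en']);
    $parentId = $stmt->fetchColumn();
    if ($parentId === false) {
        http_response_code(404); // no node for this segment
        break;
    }
}
// If the loop completes, $parentId is the ID of the requested page.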
I could simply save the whole path in the database, as VARCHAR(2000) or something, and then just check whether there is a page with path "I/am/a/path" in the English language and fetch that one.
This seems to be rather messy.
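For contrast, a minimal sketch of that single lookup, reusing the $pdo connection from the sketch above; localized_pages, path and pagenode_id are again invented names:

// One query resolves the whole request when the full path is stored per page.
$stmt = $pdo->prepare('SELECT pagenode_id FROM localized_pages WHERE lang = ? AND path = ?');
$stmt->execute(['en', 'I/am/a/path']);
$pageId = $stmt->fetchColumn(); // false if no such page exists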
As I do it now: currently I add an "ID" at the end of my path, such as:
domain.tld/en/I/am/a/path.1
So if you enter "domain.tld/en.1" you get forwarded to the URL with the right slug. But here again I need to save the slug in the database for every single page.
Also, I would love to get rid of the ID (could I do this with mod_rewrite and .htaccess?).
Any more insights on this one? I'm not a web developer, so I'm not really sure about the performance implications.
Kindest regards,
Meren
It seems to me that page requests will happen a million times more often than an editor changing a page address, so I would definitely go with the save-to-db option. What you can do is create an extra field in which you save the 'slug' for that page; in combination with .htaccess you can rewrite requests for the 'slug' addresses. For example, in http://www.fuuu.com/futest-fu , 'futest-fu' is a slug which could be rewritten to an ID number (or anything you would want it to be). Amongst others, WordPress works this way. Check out this discussion for some insights: http://wordpress.org/support/topic/where-are-the-permalinks-slug-stored-in-the-database
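To make the slug idea concrete, here is a minimal sketch; the pages table, the slug column and the query-string parameter are all invented, and the rewrite rule is just the usual front-controller pattern rather than anything specific to this CMS:

// Assumed .htaccess in the web root (Apache mod_rewrite), sending every
// request that is not a real file to index.php with the requested slug:
//   RewriteEngine On
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^(.*)$ index.php?slug=$1 [QSA,L]

// index.php: look the slug up and map it to a page ID.
$slug = isset($_GET['slug']) ? trim($_GET['slug'], '/') : '';
$pdo = new PDO('mysql:host=localhost;dbname=cms;charset=utf8', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id FROM pages WHERE slug = ?');
$stmt->execute([$slug]);
$pageId = $stmt->fetchColumn();
if ($pageId === false) {
    http_response_code(404); // unknown slug
} else {
    // render the page identified by $pageId ...
}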
I looked into the "text" table in the SQL database and found the fields for the page contents rather complicated. I was trying to use WhatLinksHere but got myself into a bigger mess.
I believe there must be a simple method I can use, judging from the ReplaceText extension as well as the Search PHP files. But those files really look cryptic to me, since they reference some other files. I wonder if anyone can help me out with this.
(P.S. I looked into the "pagelinks" table and saw only pl_from. I wonder why there is no pl_to?)
Template transclusions are recorded in the templatelinks table, not in pagelinks.
The format of both tables is the same: the tl_from field contains the page ID of the linking / transcluding page, while the fields tl_namespace and tl_title contain the namespace and normalized (DB key form, i.e. underscores for spaces) title of the target page being transcluded.
For templatelinks, the target namespace will usually be 10 (Template), but this need not always be the case: pages in any namespace can be transcluded using the syntax {{Namespace:Title}} (or just {{:Title}} for pages in the main namespace).
The reason for this asymmetry is that, while the transcluding page must, necessarily, exist, there's no guarantee that the template being transcluded does. Thus, the target page might not have a page ID, and so we need to refer to it using its title (and namespace).
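If you want to pull this out in code (say, from a small extension or maintenance snippet) rather than with raw SQL, a sketch using MediaWiki's old-style database wrapper might look like the following; "Infobox" is just an example template name:

// Find all pages that transclude Template:Infobox.
$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select(
    'templatelinks',
    array( 'tl_from' ),
    array(
        'tl_namespace' => NS_TEMPLATE, // usually 10
        'tl_title'     => 'Infobox',   // DB key form: underscores for spaces
    )
);
foreach ( $res as $row ) {
    $title = Title::newFromID( $row->tl_from ); // the page doing the transcluding
    if ( $title ) {
        echo $title->getPrefixedText() . "\n";
    }
}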
I have set up a wiki family using tutorial #2 specified here:
http://www.mediawiki.org/wiki/Wiki_family
Specifically, I am using the same code, same folder, same database, and different tables (they differ by prefix) to create a multilanguage MediaWiki.
For now I have created two languages,
French and English, which can be accessed at fr.sitename.com/wiki/ and en.sitename.com/wiki/.
Now I need to add interlanguage links to the articles, but the syntax
[[:fr:Main Page]]
does not work; it just takes me to a new page saying that I need to create the page because it does not exist, while I can access it at fr.sitename.com/wiki/Main_Page.
Can someone please help me solve this?
You probably need to update your interwiki table - what does it contain now for iw_prefix='fr'? For example, see maintenance/wikipedia-interwiki.sql. Also, I think there is a MediaWiki extension to do this, if you prefer a web-based interface.