.htaccess tips and tricks

<ifModule>
 clever stuff here
</ifModule>

Introduction to .htaccess..

This work in constant progress is some collected wisdom: stuff I've learned on the topic of .htaccess hacking, commands I've used successfully in the past on a variety of server setups, and in most cases still use. You may have to tweak the examples a little to get the desired result, though, and a reliable test server is a powerful ally, preferably one with a similar setup to your "live" server. Okay, to begin..

[Screenshot: an old Win32 Apache mirror of corz.org - peecee Explorer view with invisible files]

.htaccess files are invisible

There's a good reason why you won't see .htaccess files on the web; almost every web server in the world is configured to ignore them, by default. Same goes for most operating systems. Mainly it's the dot "." at the start, you see?

If you don't see, you'll need to disable your operating system's invisible file functions, or use a text editor that allows you to open hidden files, something like BBEdit on the Mac platform. On Windows, showing invisibles in Explorer should allow any text editor to open them, and most decent editors to save them too**. Probably most Linux users know how to find them without any help from me.

In the image, the operating system has been instructed to display invisible files. Ugly, but necessary sometimes. You will also need to instruct your FTP client to do the same.

By the way; that folder is no longer there. But folks still find it via my clever 404 script.


What are .htaccess files anyway?

Simply put, they are invisible plain text files where one can store server directives. Server directives are anything you might put in an Apache config file (httpd.conf) or even a php.ini**, but unlike those "master" directive files, these .htaccess directives apply only to the folder in which the .htaccess file resides, and all the folders inside.

This ability to plant .htaccess files in any directory of our site allows us to set up a finely-grained tree of server directives, each subfolder inheriting properties from its parent, whilst at the same time adding to, or over-riding certain directives with its own .htaccess file.

For instance, you could use .htaccess to enable indexes all over your site, and then deny indexing in only certain subdirectories, or deny index listings site-wide, and allow indexing in certain subdirectories. One line in the .htaccess file in your root and your whole site is altered. From here on, I'll probably refer to the main .htaccess in the root of your website as "the master .htaccess file", or "main" .htaccess file.
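
For example (just a sketch; the "private" folder name is made up), a master .htaccess in the root could switch indexes on for the whole site, while a second .htaccess in some private subfolder switches them back off for that branch only..

indexes on everywhere, except the private stuff..
# in /.htaccess (the master file)..
Options +Indexes

# ..and in /private/.htaccess
Options -Indexes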

There's a small performance penalty for all this .htaccess file checking, but it's not noticeable, and most of the time it's enabled anyway and there's nothing you can do about it, so let's make the most of it..

** Your main php.ini, that is, unless you are running under phpsuexec/CGI, in which case the directives would go inside individual php.ini files (sometimes named ".user.ini") in your site's directories.

Is .htaccess enabled?

It's unusual, but possible that .htaccess is not enabled on your site. If you are hosting it yourself, it's easy enough to fix; open your httpd.conf in a text editor, and locate this <Directory> section..

Your DocumentRoot  may be different, of course..
# This should be changed to whatever you set DocumentRoot to.
#
<Directory "/var/www/htdocs">
#

..locate the line that reads..

AllowOverride None

..and change it to..

AllowOverride All

Restart Apache. Now .htaccess will work. You can also make this change inside a virtual host, which would normally be preferable. In fact, if you are hosting yourself, it's probably smarter to disable .htaccess altogether and make all these changes in the virtual host configuration!
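
If you do go the virtual host route, the same AllowOverride line simply lives inside a <Directory> block in the vhost; a minimal sketch (the ServerName and paths here are made up, of course)..

inside your vhosts file..
<VirtualHost *:80>
 ServerName example.com
 DocumentRoot "/var/www/example"
 <Directory "/var/www/example">
  AllowOverride All
 </Directory>
</VirtualHost>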

If your site is hosted elsewhere, check your control panel (Plesk, cPanel, etc.) to see if you can enable .htaccess there, and if not, contact your hosting admins. Perhaps they don't allow this. In which case, switch to a better web host.

What can I do with .htaccess files?

Almost any directive that you can put inside an httpd.conf file will also function perfectly inside an .htaccess file. Unsurprisingly, the most common use of .htaccess is to..

Control (Allow/Deny) Access..

.htaccess is most often used to restrict or deny access to individual files and folders. A typical example would be an "includes" folder. Your site's pages can call these included scripts all they like, but you don't want users accessing these files directly, over the web. In that case you would drop an .htaccess file in the includes folder with content something like this..

NO ENTRY!
# no one gets in here!
deny from all

which would deny ALL direct web (HTTP) access to ANY files in that folder (your scripts reach them via the filesystem). You can be more specific with your conditions, for instance limiting access to a particular IP range, here's a handy top-level ruleset for a local test server..

NO ENTRY outside of the LAN!
# no nasty crackerpots in here!
Order Allow,Deny
Deny from All
Allow from 192.168.0.0/24
# this would do the same thing..
#Allow from 192.168.0

Note the Order directive, which controls the order in which Apache processes the two groups of access rules (aka. directives). With Allow,Deny, Apache first checks and applies the Allow rules, then the Deny rules, and denies anything that matched neither. With Deny,Allow, it first applies the Deny rules, then the Allow rules, and allows everything else.

If you think about it, the Deny line in the example above is redundant. This..

NO ENTRY outside of the LAN!
Order Allow,Deny
Allow from 192.168.0

.. is enough to secure a local server. And because Apache processes the Allow and Deny directives as groups, in the order defined by the Order directive (not the order they appear in the file), the actual ordering of the rules in your config file is unimportant. This..

NO ENTRY outside of the LAN!
Allow from 192.168.0.0/24
Order Allow,Deny

..is identical in operation to the previous example.
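
If you prefer, the same LAN-only restriction can be written the other way around. With Deny,Allow, the Deny rules are processed first, then the Allow rules, so your local addresses get in even though they match the Deny from all..

NO ENTRY outside of the LAN, take two..
Order Deny,Allow
Deny from all
Allow from 192.168.0.0/24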

Generally these sorts of requests would bounce off your firewall anyway, but on a live server (as my dev mirrors sometimes are) such rules become useful for filtering out undesirable IP blocks, known risks, lots of things. By the way, in case you hadn't spotted; lines beginning with "#" are ignored by Apache; handy for comments.

Sometimes, you will only want to ban one IP, perhaps some persistent robot that doesn't play by the rules..

post user agent every fifth request only. hmmm. ban IP..
# someone else giving the ruskies a bad name..
order allow,deny
deny from 83.222.23.219
allow from all

The usual rules for IP addresses apply, so you can use partial matches, ranges, and so on. Whichever way you do it, the user gets a 403 "access denied" error page in their client software (browser, usually), which certainly gets the message across. This is probably fine for most situations, but in part two I'll demonstrate some cooler ways to deny access, as well as how to deny those nasty web suckers, bad referrers, script kiddies and more.

One final note about Allow and Deny rules for local servers (or anywhere you have access to the main httpd.conf, vhost.conf and suchlike files). If AllowOverride All is set in a config file processed before the one containing these rules (it usually is), they will override any rules set in the preceding config file.

For example, if you have AllowOverride All and Deny from all set in your VirtualHost config, and Allow from all in your .htaccess, the .htaccess rules apply, allowing access from all addresses. If you delete the Allow rule in the .htaccess, the rules from your VirtualHost config will apply. If you delete those rules, the ones from your main httpd.conf will apply. <Location> rules override everything.
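
One hedged aside: everything above is the classic Apache 2.2-era syntax (mod_authz_host's Order/Allow/Deny). If your server has moved to Apache 2.4, the same intentions are expressed with the newer Require directives from mod_authz_core, roughly like so..

the Apache 2.4 flavour of the same rules..
# the old "Deny from all"..
Require all denied

# ..and the old LAN-only ruleset..
Require ip 192.168.0.0/24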

Custom error documents..

I guess I should briefly mention that .htaccess is where most folk configure their error documents. Usually with something like this..

the usual method. the "err" folder (with the custom pages) is in the root
# custom error documents
ErrorDocument 401 /err/401.php
ErrorDocument 403 /err/403.php
ErrorDocument 404 /err/404.php
ErrorDocument 500 /err/500.php

You can also specify external URLs, though this can be problematic, and is best avoided. One quick and simple method is to specify the text in the directive itself; you can even use HTML (though there is probably a limit to how much HTML you can squeeze onto one line). Remember, for Apache 1, begin with a ", but DO NOT end with one. For Apache 2, you can put a second quote at the end, as normal.

measure twice, quote once..
# quick custom error "document"..
ErrorDocument 404 "<html><head><title>NO!</title></head><body><h2><tt>There is nothing here.. go away quickly!</tt></h2></body></html>

Using a custom error document is a Very Good Idea, and will give you a second chance at your almost-lost visitors. I recommend you get mine. But then, I would.

Password protected directories..

The next most obvious use for our .htaccess files is to allow access to only specific users, or user groups; in other words, password-protected folders. A simple authorisation mechanism might look something like this..

a simple sample .htaccess file for password protection:
AuthType Basic
AuthName "restricted area"
AuthUserFile /usr/local/var/www/html/.htpasses
require valid-user

You can use this same mechanism to limit only certain kinds of requests, too..

only valid users can POST in here, anyone can GET, PUT, etc:
AuthType Basic
AuthName "restricted area"
AuthUserFile /usr/local/var/www/html/.htpasses
<Limit POST>
 require valid-user
</Limit>
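
You don't have to settle for any valid user, either; you can insist on particular users, or on membership of a named group kept in a separate group file. The group file path and the "staff" group below are just examples; the group file itself is plain text, one group per line, in the format "groupname: user1 user2"..

only jimmy, or anyone in the "staff" group, gets in:
AuthType Basic
AuthName "restricted area"
AuthUserFile /usr/local/var/www/html/.htpasses
AuthGroupFile /usr/local/var/www/html/.htgroups
require user jimmy
require group staff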

You can find loads of online examples of how to set up authorisation using .htaccess, and so long as you have a real user (or create one, in this case, 'jimmy') with a real password (you will be prompted for this, twice) in a real password file (the -c switch will create it)..

htpasswd -c /usr/local/var/www/html/.htpasses jimmy

..the above will work just fine. htpasswd is a tool that comes free with Apache, specifically for making and updating password files; check it out. The Windows version is the same; only the file path needs to be changed, to wherever you want to put the password file.

Note: if the Apache bin/ folder isn't in your PATH, you will need to cd into that directory before performing the command. Also note: You can use forward and back-slashes interchangeably with Apache/php on Windows, so this would work just fine..

htpasswd -c c:/unix/usr/local/Apache2/conf/.htpasses jimmy

Relative paths are fine too; assuming you were inside the bin/ directory of our fictional Apache install, the following would do exactly the same as the above..

htpasswd -c ../conf/.htpasses jimmy
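
And to add or update a user in an existing password file, simply drop the -c switch (which would otherwise wipe the file and start afresh); "anna" here is, of course, a made-up user..

htpasswd ../conf/.htpasses anna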

Naming the password file .htpasses is a habit from when I had to keep that file inside the web site itself; as web servers are configured to ignore files beginning with .ht, it too remained hidden. If you keep your password file outside the web root (a better idea), you can call it whatever you like, but the .ht_something habit is a good one to keep; even inside the web tree, it is secure enough for our basic purpose..

Once they are logged in, you can access the REMOTE_USER environment variable, and do stuff with it..

the remote_user variable is now available..
RewriteEngine on
# note: mod_rewrite server variable names are upper-case
RewriteCond %{REMOTE_USER} !^$ [NC]
RewriteRule ^(.*)$ /users/%{REMOTE_USER}/$1

Which is a handy directive, utilizing mod_rewrite; a subject I delve into far more deeply, in part two.

Get better protection..

The authentication examples above assume that your web server supports "Basic" HTTP authorisation; as far as I know they all do (it's in the Apache core). Trouble is, some browsers aren't sending passwords this way any more, so personally I'm looking to php to cover my authorisation needs. Basic auth works okay though, even if it isn't actually that secure - your password travels in plain text over the wire, which is not clever.

If you have php, and are looking for a more secure login facility, check out pajamas. It's free. If you are looking for a password-protected download facility (and much more, besides), check out my distro machine.

500 error..

If you add something that the server doesn't understand or support, you will get a 500 error page, aka. "the server did a boo-boo". Even directives that work perfectly on your test server at home may fail dramatically at your real site. In fact this is a great way to find out whether .htaccess files are enabled on your site; create one, put some gibberish in it, load a page in that folder, and wait for the 500 error. If there isn't one, they are probably not enabled.

If they are, we need a way to safely do live-testing without bringing the whole site to a 500 standstill.

Fortunately, in much the same way as we used the <Limit> tag above, we can create conditional directives, things which will only come into effect if certain conditions are true. The most useful of these is the "ifModule" condition, which goes something like this..

only if PHP is loaded, will this directive have any effect (switch the 4 for a 5 if using php5)
<ifModule mod_php4.c>
 php_value default_charset utf-8
</ifModule>

..which, placed in your master .htaccess file, would set the default character encoding of your entire site to utf-8 (a good idea!); at least, anything output by PHP. If the PHP4** module isn't running on the server, the above .htaccess directive will do exactly nothing; Apache just ignores it. As well as proofing us against knocking the server into 500 mode, this also makes our .htaccess directives that wee bit more portable. Of course, if your syntax is messed-up, no amount of if-module-ing is going to prevent an error of some kind, all the more reason to practice this stuff on a local test server.

** note: if you are using php5, you would obviously instead use <ifModule mod_php5.c>.
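
The same trick works for any module, not just PHP; for instance, you could wrap mod_rewrite rules (the subject of part two) so they only kick in where that module is actually loaded..

only do any rewriting if mod_rewrite is loaded..
<IfModule mod_rewrite.c>
 RewriteEngine on
 # RewriteRule lines would go here..
</IfModule>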

Groovy things to do with .htaccess..

So far we've only scratched the surface. Aside from authorisation, the humble .htaccess file can be put to all kinds of uses. If you've ever had a look in my public archives you will have noticed that the directories are fully browsable, just like in the old days before adult web hosts realized how to turn that feature off! A line like this..

bring back the directories!
Options +Indexes +MultiViews +SymLinksIfOwnerMatch

..will almost certainly turn it back on again. And if you have mod_autoindex.c installed on your server (probably, yes), you can get nice fancy indexing, too..

show me those files!
<IfModule mod_autoindex.c>
 IndexOptions FancyIndexing
</IfModule>

..which, as well as being neater, allows users to click the titles and, for instance, order the listing by date, or file size, or whatever. It's all for free too, built-in to the server, we're just switching it on. You can control certain parameters too..

let's go all the way!
<IfModule mod_autoindex.c>
 IndexOptions FancyIndexing IconHeight=16 IconWidth=16
</IfModule>

Other parameters you could add include..

NameWidth=30
DescriptionWidth=30
IconsAreLinks
SuppressHTMLPreamble (handy! or..)
XHTML (at last!)
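
Pulling a few of those together, a fuller fancy-indexing block might look something like this; the description and ignore list are only examples, tweak to taste..

a fuller fancy index..
<IfModule mod_autoindex.c>
 IndexOptions FancyIndexing IconHeight=16 IconWidth=16 NameWidth=30 DescriptionWidth=30
 AddDescription "plain text notes" .txt
 IndexIgnore header.html readme.html
</IfModule>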

I've chucked one of my old fancy indexing .htaccess files onsite for you to have some fun with. Just add a readme.html and away you go! Note: these days I generally use a single header file for all the indexes..

HeaderName /inc/header.html

.. and only drop in local readme.html files. Check out the example, and my public archives for more details.

Custom directory index files..

While I'm here, it's worth mentioning that .htaccess is where you can specify which files you want to use as your indexes, that is, if a user requests /foo/, Apache will serve up /foo/index.html, or whatever file you specify.

You can also specify multiple files, and Apache will look for each in order, and present the first one it finds. It's generally set up something like..

DirectoryIndex index.html index.php index.htm

It really is worth scouting around the Apache documentation; often you will find controls for things you imagined were uncontrollable, thereby creating new possibilities and better options for your website. My experience of the magic "LAMP" (Linux-Apache-MySQL-PHP) has been.. "If you can imagine that it can be done, it can be done". Swap "Linux" for any decent operating system; the "AMP" part runs on most of them.

Okay, so now we have nice fancy directories, and some of them password protected. If you don't watch out, your site will get popular, and that means bandwidth..

Save bandwidth with .htaccess!

If you pay for your bandwidth, this wee line could save you hard cash..

save me hard cash! and help the internet!
<ifModule mod_php4.c>
 php_value zlib.output_compression 16386
</ifModule>

All it does is enable PHP's built-in transparent zlib compression. This can halve your bandwidth usage in one stroke - more than that, in fact. Of course it only works with data being output by the PHP module, but if you design your pages with this in mind, you can use php echo statements, or better yet, php "includes" for your plain html output and just compress everything! Remember, if you run phpsuexec, you'll need to put php directives in a local php.ini file, not .htaccess. See here for more details.
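
If you'd rather have Apache itself do the squeezing (covering static html and css as well as PHP output), and your server has mod_deflate loaded (standard with Apache 2), something like this should do the trick; as ever, wrapped in an ifModule so it degrades gracefully if the module is missing..

compress text output at the server level instead..
<IfModule mod_deflate.c>
 AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>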

Hide and deny files..

Do you remember I mentioned that any file beginning with .ht is invisible? .."almost every web server in the world is configured to ignore them, by default". That is, of course, because .ht_anything files generally have server directives and passwords and stuff in them, so most servers will have something like this in their main configuration..

Standard setting..
<Files ~ "^\.ht">
 Order allow,deny
 Deny from all
 Satisfy All
</Files>

which instructs the server to deny access to any file beginning with .ht, effectively protecting our .htaccess and other files. The "." at the start prevents them being displayed in an index, and the .ht prevents them being accessed. This version..

ignore what you want
<Files ~ "^.*\.([Ll][Oo][Gg])">
 Order allow,deny
 Deny from all
 Satisfy All
</Files>

tells the server to deny access to *.log files. You can insert multiple file types into each rule, separating them with a pipe "|", and you can insert multiple blocks into your .htaccess file, too. I find it convenient to put all the files starting with a dot into one, and the files with denied extensions into another, something like this..

the whole lot
# deny all .htaccess, .DS_Store $hî†é and ._* (resource fork) files
<Files ~ "^\.([Hh][Tt]|[Dd][Ss]_[Ss]|[_])">
 Order allow,deny
 Deny from all
 Satisfy All
</Files>

# deny access to all .log and .comment files
<Files ~ "^.*\.([Ll][Oo][Gg]|[cC][oO][mM][mM][eE][nN][tT])">
 Order allow,deny
 Deny from all
 Satisfy All
</Files>

would cover all ._* resource fork files, .DS_Store files (which the Mac Finder creates all over the place) *.log files, *.comment files and of course, our .ht* files. You can add whatever file types you need to protect from direct access. I think it's clear now why the file is called ".htaccess".

<FilesMatch>

These days, using <FilesMatch> is preferred over <Files>, mainly because you can use regular expressions in the conditions (very handy), producing cleaner, more readable code. Here's an example, which I use for my php-generated style sheets..

parse file.css and file.style with the php machine..
# handler for phpsuexec..
<FilesMatch "\.(css|style)$">
 SetHandler application/x-httpd-php
</FilesMatch>

Any files with a *.css or *.style extension will now be handled by php, rather than simply served up by Apache. And because you can use regexp, you could do stuff like <FilesMatch "\.s?html$">, which is handy. Any <Files> statements you come across can be advantageously replaced by <FilesMatch> statements. Good to know.
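
For instance, the two deny blocks from the previous section could be collapsed into a single, case-insensitive <FilesMatch>; a sketch, assuming your Apache uses PCRE regular expressions (Apache 2 does), so the (?i) flag works..

the whole lot, FilesMatch style..
<FilesMatch "(?i)^\.(ht|ds_s|_)|\.(log|comment)$">
 Order allow,deny
 Deny from all
 Satisfy All
</FilesMatch>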

More stuff..

At the end of my .htaccess files, there always seems to be a section of "stuff"; miscellaneous commands, mainly php flags and switches; so it seems logical to finish up the page with a wee selection of those..

php flags, switches and other stuff..
# let's enable php (non-cgi, aka. 'module') for "EVERYTHING"..
AddType application/x-httpd-php5 .htm .html .php .blog .comment .inc

# better yet..
AddHandler php5-script .htm .html .php .blog .comment .inc

# for CGI/FastCGI, you might need..
AddHandler php-script .htm .html .php .blog .comment .inc

# legacy php4 version..
AddType application/x-httpd-php .htm .html .php .blog .comment .inc

# don't even think about setting this to 'on'
php_value register_globals off

# no session id's in the URL PULEEZE!
php_value session.use_trans_sid 0

# should be the same as..
php_flag session.use_trans_sid off

# using both should also work fine!
# php error logs..
php_flag display_errors off
php_flag log_errors on
php_value track_errors on
php_value error_log /home/cor/errors/phperr.log

# if you like to collect interesting php system shell access and web hack scripts
# get yourself a SECURE upload facility, and just let the script-kiddies come …
# in no time you will have a huge selection of fascinating code. If you want folk to
# also upload zips and stuff, you might want to increase the upload capacities..
php_value upload_max_filesize 12M
php_value post_max_size 12M

# php 5 only, afaik. handy when your server isn't where YOU are.
php_value date.timezone Europe/Aberdeen
# actually, Europe/Aberdeen isn't a valid php timezone, so that won't work.
# I recommend you check the php manual for this function, because many crazy places ARE!

Note: For most of the flags I've tested, you can use on/off and true/false interchangeably, as well as 0/1; also, php_value and php_flag can be switched around while things continue to work as expected! I guess, logically, booleans should always be php_flag, and values, php_value; but suffice to say, if some php erm, directive isn't working, these would all be good things to fiddle with!

Of course, the php manual explains all. The bottom line is; both will work fine, but if you use the wrong type in .htaccess, say, set a php_flag using php_value, a php ini_get() command, for instance, would return true, even though you had set the value to off, because it reads the "off" value as a string, which of course evaluates to not-zero, i.e. 1, or "true". If you don't rely on ini_get(), or similar, it's not a problem, though clearly it's better to get it right from the start. By the way; one of the values above is incorrectly set. Did you spot it?

Most php settings, you can override inside your actual scripts, but I do find it handy to be able to set defaults for a folder, or an entire site, using .htaccess.

over to you..

That should get you started with .htaccess, quite easy when you know how. If you really want to bend your brain out of shape, follow the link below for part two of the series, where I delve into the arcane mysteries of URL rewriting.

;o) Cor

Before you ask a question..

Firstly, read this at least once in your life. I insist!

NOTE: THIS IS NOT A COMMUNITY. And I am not your free tech dude. Sure, folk sometimes drop back in, but realistically, the chances of someone else coming along and answering your tech question are about as close to zero as it gets; almost no one sticks around but me, the guy who wrote all that text (above).

If you can't be bothered to read the article, I can't be bothered responding. Capiche? I do read all comments, though, and answer questions about the article. I'm also keen to discuss anything you think I've missed, or interesting related concepts in general.

If you are still sure that you want to post your own, personal, tech question, then please ensure that you first, either..

a) Have read the article (above) and have tried "everything" yourself; in which case; post the exact code that isn't working (preferably inside [pre][/pre] tags), replacing any personal domain names with "example.com" (advertising gets deleted) or else..

b) Pay me. The PayPal button is at the top right of the page. I offer many related services, if you need priority assistance, get in touch.

Other posts will be ignored and/or deleted.

If you want to know about rewriting with mod_rewrite please see the next page!



return to paged comments
Bill - 24.09.04 7:20 pm

I found that the command to bring back directories didn't work on my web server "Options Indexes MultiViews FollowSymlinks" But I did find one that worked perfectly "Options +Indexes"

Thank you for the tutorial.


cor - 17.10.04 6:40 pm

hey, good call!

the +/- options are to merge rules with previous rules, so if /mydocs/web/ had
Options Indexes FollowSymLinks

and /mydocs/web/code/ was set to
Options +Fancyindexing

then in effect, /mydocs/web/code/ has all three of the options set, the FancyIndexing directive being merged with any features already in effect. If you don't prepend with +/-, you are saying, in effect, switch OFF all rules and apply only these.

Probably at least followsymlinks has been set further up the tree somewhere. The default is that everything (except MultiViews) is ON, though most web hosts have disabled these directives before the webmaster gets there. Additionally, MultiViews will only work when mod_autoindex is installed in your server (it usually is).

You can use this mechanism to switch features on and off throughout the web tree. And I probably should add something about this to the tut; maybe I just did! I definitely recommend using MultiViews and FancyIndexing on top of plain indexes, though. With a touch of css they can look real professional, and automatically.

Finally, and this is the funny bit, my own server's public directory is set up like this..
Options +Indexes +MultiViews +FollowSymlinks

Which is pure overkill, but well on the safe side. both work, but I've since replaced the online .htaccess example file with that one.

Many thanks for the input Bill, no doubt someone else will appreciate the exact same solution.

;o)


Nick Seal - 20.02.05 10:12 pm

Hey, is it possible to add something to a website to directly update a .htpasswd file from a registration page?

Cheers


cor - 24.02.05 1:05 am

Nick, I don't know of anything specific, but it wouldn't be difficult to achieve with php, so long as the webserver process has write permission to the .htpasswd file.

;o)


anonymous - 24.02.05 6:58 pm

Hello,

how to use css in indexing???

and i using your file_view.htaccess.text all works fine exept
IconHeight=16 IconWidth=16 NameWidth=30 DescriptionWidth=30
i see changed icons and etc but Height Width NameWidth and DescriptionWidth is in default..
how can i fix it?


cor - 25.02.05 4:43 am

Ahh, css like in my public archives?

Well, you just add the style statement to your header.html file, simple.

As to the second question, I've no idea! It's possible that the master httpd.conf file isn't allowing over-ride on these attributes, but this sounds highly unlikely.

Have you enabled all the options? Particularly FancyIndexing..

Options +Indexes +MultiViews +FollowSymlinks
<IfModule mod_autoindex.c>

IndexOptions FancyIndexing IconHeight=16 IconWidth=16 NameWidth=33 DescriptionWidth=30

...


Feel free to mail me your .htaccess file, I could have a wee look.

for now..

;o)


sprock3t - 25.02.05 11:27 am

earlie known as (anonymous - 24.02.05 6:58 pm)

Hello cor,

I found the problem and fixed it, now all works fine for me! :)

Thank you for support!


cor - 25.02.05 12:15 pm

Gr8 news!

So then, sprock3t, what was it?
Maybe useful to other peeps?
If so, let us know!

;o)


Tony - 04.04.05 5:55 am

Hi
With IP access deny/allow rules, is there a way to list a block of IP addresses other than the whole range?
Not just 24.265.32.
But allow access to the range 24.265.32.5 to 24.265.32.122
I have a site that has a subscription based access system that allows some but not all IPs in a range.
# no nasty crackers in here!
order deny,allow
deny from all
allow from 24.265.32.5-24.265.32.122


TonyE


cor - 04.04.05 5:02 pm

nah, that won't work.

Fortunately, there is a trick, which I'm fairly sure I originally found at webmasterworld, you'll need something like this..
# deny IP Range with .htaccess..
SetEnvIf Remote_Addr ^24\.265\.32\.([5-9]|[1-9][0-9]|1[0-1][0-9]|12[0-2]) ban

<Files ~ "^.*$">
order allow,deny
allow from all
deny from env=ban
</Files>

The first portion bans numbers that fall between 5-9
The second portion bans numbers that fall between 10-99
The third portion bans numbers that fall between 100-119.
The last portion bans numbers that fall between 120-122.

It's a wee bit long-winded, but it gets the job done.

;o)


tech girl - 10.04.05 7:04 am

I would like to have a page that lists the files and directories but that is password protected.
I seem to be able to have one or the other. Either I can have a public page that lists the files and directories using +Indexes or I can have a password protected directory with out files being listed. When I add the +Indexes to my .htaccess that is set for password protection the page just loads with the files listed and skips the password aspect.
Is there a way to have both?
I tried the subdirectory idea but that didn't work either. It was the closest though.
Thanks
(p.s. this is not for porn or anything illegal, its just that I need to give somefolks access to some hudge files that I have the copyright to and I do not want just anyone to have access to them.)



cor - 10.04.05 3:52 pm

that's not right tech girl! So you are saying, this..
AuthType Basic
AuthName "restricted"
AuthUserFile /usr/local/var/.htpasses
require valid-user

Options +Indexes


doesn't work? Or something like it. It certainly should work (I was so shocked I had to go and try this myself, just to be sure, and it worked as I expected; no pass, no entry! regardless of whatever other directives I add!)

Something must be wonky somewhere, copy-and-paste your .htaccess file in here (remove personal info first!) and we'll have a wee look.

;o)


cor - 10.04.05 5:43 pm

ahh, forgot to mention, tech girl, you might want to take a look at this.

;o)



Shane - 16.04.05 5:17 am

First of all very interesting reading. Thank you.

Now to the question. I have in my index.php a href to a htm file and would like to know is it possible to block direct access to the htm file and yet the file will still fuction when the index.php calls for it? This file contains a script that runs when you access the index.php but i dont want people to type in the url of the htm file and be able to see it. If this can be can it also be done for direct access to images also but yet the images still are able to be called by the scripts.


cor - 17.04.05 3:01 am

Sure, you can block access to .htm files, or you can simply redirect them to their .php "containers", both methods are outlined above, and both will prevent users accessing the .htm files directly.

Images are trickier. You can block access easily enough, but if your scripts use them, they need to use them wisely! What I mean is, you'll need to use "passthru" functions or similar to grab the images, not direct links. Perhaps that's obvious, probably not.

Check out pajamas, (and also the link for the distro machine, which I dropped in my last post) which do exactly this.

Remember, .htaccess directives control server request  behaviour, that is, requests from outside. Your php scripts will access files directly, via the filesystem (though they don't have to do that) so generally, .htaccess directives don't affect your scripts' ability to access anything.

for now..

;o)


luuk - 08.05.05 6:32 pm

Is there any way to block the entrance to your site via onther site.
There is a site which has the link of my site in it, and I don't want people from there to visit my site...can that be done?
thanks in advance,
luuk de vries


cor - 09.05.05 3:25 am

luuk, simply put their domain in a deny statement in your main .htaccess file..

# go away microsoft.com!
order deny,allow
deny from 207.46.130.108


If you check out page two of these articles, the hot-linking code could easily be adapted to your needs, too (well, rarely is it easy with mod_rewrite, but usually doable!). Then you could send them to a nice page instead, with a cute message or something.

;o)


natasha - 20.06.05 3:40 am

hi guys
i m trying to make .htaccess file on window professional 2000, and when i try to save file like ".htaccess" it doesnt allow me and says "must write file name" its takin .htaccess as extension :( pls help me fellows how can i make .htaccess file to use with my site that resides in c:\apache\htdocs\mysite ............i know u buddies will help me soon :)
bye


cor - 20.06.05 2:03 pm

Try WITH the quotes.. ".htaccess"

;o)


dant - 20.06.05 11:06 pm

How about using an external program, using the useraname and password in basic auth, as arguments, to generate a result? I have a third-party app that I'd like to be executed every time basic auth is initiated, using the username and password as arguments. This application would then generate a result to iether allow or deny a user based on the given username/pass.

Does Apache's .htacecss controls have a mecahnism where I can, I don't know, pipe the user/pass to an external application?

Thanks!
-dant


cor - 22.06.05 5:35 am

dant, the external application you need will likely be called "script.php", or perhaps "script.cgi". That's what these things were invented for, and a whole lot more besides!

once you have it in the there, you can do whatever you like.

;o)


Listo - 25.06.05 12:52 am

Hello,

I would like a script that would read a search term from the "referrer" in my web logs then redirect to a particular page relevant to that search term. Is this something you can do??

Thanks in advance!!


cor - 26.06.05 12:46 am

firstly, Listo, parsing the logs isn't the way to go. a) it's expensive (in time and resources, those log files can get LARGE) and b) it's too late.

you want to catch the request at the time of the request. In php you might do..

<?php
if (isset($_SERVER['HTTP_REFERER'])) {

    // code to ascertain what page to redirect to
    // maybe some regexp, then send a location header..

    header("HTTP/1.1 301 Moved Permanently"); // if it is
    header("Location: http://some/$new/$page.html");

}
?>


You could probably do it in .htaccess, depending on how complex the redirection is, check out .htaccess part two, which is all about mod_rewrite, a module you can configure in your .htaccess file. it does exactly this sort of thing!

;o)


Bia - 08.07.05 7:02 pm

Hi,

I would like to know how do I enabe .htaccess in OS X Server version 10.4.1

thank you in advance.

Bia


cor - 09.07.05 10:07 am

as far as I know Bia, it should already be enabled. A good way to test is to drop an .htaccess file into the (verified 100% working) root of your website. put some gibberish text inside the .htaccess file. If .htaccess is enabled, you'll get a 500 error.

If they aren't enabled (you sure?) then add the following lines to your main httpd.conf file..
<Directory />
    Options FollowSymLinks
    AllowOverride All
    Allow from all
</Directory>

AccessFileName .htaccess
which should get everything working. Check you don't already have these lines (you're bound to have a main <Directory> section, perhaps the override line is commented out or something) and add if not.

This is generic advice, I'm not familiar with the latest OS X Server. Anyways, get back here if it doesn't work and I'll look into it more.

;o)


Feverdog - 28.07.05 6:56 am

Thanks you guys I have been saved because of you this helped me so much =***************


carl - 08.08.05 7:41 am

I would like to print the username and password to a file with the users IP and other info like refferer etc

ErrorDocument 401 /401.html
AuthUserFile /home/carl/web/xxxxxxxx/.htpasswd
AuthName "Members Must Login"
AuthType Basic
<Limit GET>require valid-user
</Limit>

Many thanks
Carl
admin@newfriends.net



cor - 09.08.05 2:12 am

In a word carl.. php!

;o)

ps.. careful printing password to plain text, that's kinda retrograde, security-wise.


Rachel Mlanao - 10.08.05 8:07 pm

Hi,
I have been fooling around with a Mac OS X Apache site. I would like to do 2 things and I haven't been able to figure it out.
I would like to have certain folders outside of the web documents root accessible. Is this possible in apache?
Or make it so I have different aliases and these aliases can point to different folders. The problem is that the aliases just point to the same site even thought I tell it to go somewhere else.

do you have any suggestions as to a good resource to figure this out? FYI: I am new to this apache web admin thing.
thanks,
Rachel


cor - 10.08.05 11:27 pm

Rachel, the first thing you need is called "Virtual Hosts", and there's LOADS of good documentation on the subject. You can create any number of unique hosts all running on the same box, with document roots all over the place. Get started here.

And secondly, yes, Aliases. If you look inside your httpd.conf, you'll see a section for them. Use the /icons/ alias as a template for your own. It begins
 Alias /icons/ "/usr/local/apache/icons/small/"
This mechanism also allows you to serve up folders outside of your regular document root.

You'll want to ensure the security arrangements of these folders matches what you expect. It's often a good idea, if you use your server purely for testing, to add deny rules to your configurations..
order deny,allow
deny from all
allow from 192.168.1.4
would allow access only from a machine at 192.168.1.4. You can allow subnets in a similar way, "allow from 192.168.1", or "allow from 192.168.1.0/24" would have the same effect, essentially denying access to all "outside" visitors.

;o)

ps.. it's wise to keep your virtual hosts in a separate file, simply "include" that file in your main httpd.conf..
include /usr/local/apache/conf/vhosts.conf
or wherever you keep it.


JenER - 23.08.05 11:08 pm

I am trying to convert a php file to a .gif so I can post it on a phpnuke site. I was told I could insert code into the .htaccess file so that it fools the site into thinking its a .gif file, but shows in the forum as a php file. Any suggestions?


cor - 24.08.05 12:07 am

you mean like a dynamic sig?

just add an image extension, like this..

[img]http://mydomain/path/to/img/script.php/somename.gif[/img]


should work fine.

;o)


Teaboy - 06.09.05 12:11 pm

How about this, allowing access from 192.168.0.x and using a htpasswd for all other IP ranges?


Confused - 06.09.05 3:31 pm

A totally unrelated question;

What's the color scheme you used in the Windows XP screenshot?

Har! Unexpected!


Sean - 06.09.05 6:22 pm

cramming everything inside a PHP file just to compress it is not very intelligent.

Instead, why don't you ask your hosting provider to install mod_gzip?


sajb - 06.09.05 6:34 pm

Teaboy, just use something like:

order deny,allow
allow from 192.168.0.0/24
deny from all
require valid-user
satisfy any



Will - 06.09.05 7:12 pm

How about automattically redirecting to a secure URL (https)?


cor - 07.09.05 11:49 am

Confused, same one I use on windows to this day, it's built-in. Basically, you just enable the olive theme, and then disable themes, so you are left with the colours, but no fancy windows!

good answer sajb! this is a technique I employ for a few of my local servers. As to redirecting them to a secure page, Will, you need to check out page two.

Sean, are you on the right page? Wha do you mean?

;o)


tallyho - 10.09.05 11:00 pm

redirecting to a secure URL (https)
using mod_rewite in .htaccess
- - - - - - - -
RewriteEngine on
RewriteCond %{SERVER_PORT} !443$
RewriteRule ^(.*) https://%{HTTP_HOST}/$1 [R=301,L]



Miles - 13.09.05 4:08 pm

I would like to know if there is a way to use .htacess to do this:
Only the ones who come to my domain being able to access my html files
for example i have a hacker at my site who uses this sort of url
file:///C:/Documents%20and%20Settings/hack/My%20Documents/hack.html
to be able to view into my chat site from the outside... is there a way i can use htacess to where only those who use the url of my site.. www.whatever.com can see into my chat room but no outside links gaining access... they are doing this more and more to eat up my bandwidth according to what i been told... thanks... Miles


Miles - 13.09.05 4:09 pm

Oh ps.. i dont want to password protect it maybe something like referrer i have heard about that doesnt allow outside linking only from the domain.... thanks again.. Miles


cor - 14.09.05 7:16 am

Miles, don't believe the popular press; hackers built the internet; your problem is with kids.

See "hot-link protection" in the text above these comments, also referred to as "The Main Article" - that part folk generally read before asking questions.

Referrer information can be spoofed, of course, kids know how to do that, too. To prevent snooping, what you really need is a better chat system; yours sucks.

I mean, I wrote my entire site chat system in one Saturday morning heavily under the influence of a whole cocktail of interesting drugs, but it still knows the difference between a chatter and a snooper.. https://corz.org/chat/engine.php

get an upgrade!

;o)

ps.. and repeat after me.. "hacker good, cracker bad, hacker good, cracker bad, hacker good ..."


DAV - 04.10.05 6:03 am

suppose my url is www.domain.com/index.php/sample.
I want to convert it to www.domain.com/index/sample using .htaccess file

Plz help


cor - 04.10.05 5:05 pm

DAV, see part two (link above).

;o)


newguy - 05.10.05 4:03 pm

cor,

Are there other ways to redirect, restrict traffic from internet sites other than .htaccess?


cor - 07.10.05 4:09 pm

Well, newguy, the .htaccess file itself is just a place to put "Apache directives", it's the Apache Directives that do the actual work. You could put these same commands in other places, like the main httpd.conf, or inside included files, vhost blocks, etc. Is that what you mean?

At almost every link in the chain, you can alter access in some way. If the server is behind a firewall, or NAT, you could use those to block access from domains, or domain blocks, ports, whatever you like.

It all depends on your setup, your level of access, and exactly what you wish to achieve.

;o)


ccy - 17.10.05 4:54 pm

im using cpanel to addon on a domain.

the curect domain i am using is e.g. currrent.com with the added on new domain named new.com

when i type the new.com into the addres and navigate, it redirect me to current.com/new

how can i mask the url to display as new.com ?

i really coun't find any resources, or maybe you could name me a few keyword ot digg in google ?

i know it works with DirectAdmin.



corz - 17.10.05 7:53 pm

Your cPanel is broken, ccy, tell the sysadmins. You shouldn't have to mess around with .htaccess if you added the domains with cPanel, indeed it could quite possibly break something.

This is an "add-on" domain, right?

;o)


ccy - 18.10.05 3:44 pm

yes indeed. i am using cpanel via the "addon domain" feature.

i went on digging on google, and i landed on a webhosting page stating this problem. but the admin resolve the matter "privately" while not posting it on how it was done..

anyway thank you for your reply. i will look further into it.


oxigen - 19.10.05 9:09 pm

Hi, how can i do the authentication without using the URL (ex: user:password@www.bla.com/directory/) and without a dialog box showing up.Are there any scripts in PHP?


corz - 19.10.05 10:34 pm

Yes, there are other methods of authentication, but none so simple as the regular .htaccess user/password method, and yes, oxigen, there are php scripts..

You might want to check out pajamas, kinda dormant at the moment, but fully working and downloadable. I use it onsite for my admin page, and I'm about to incorporate it into my blogging system, amongst other things. Help yourself.

;o)


oxigen - 19.10.05 11:36 pm

What i was looking was a way to authenticate using htaccess but without the standard pop-up window.
I wanted to send the username and password on the URL ou something like that, because i have already a login script in PHP
and i have the htaccess only for restricting access from those people who didn´t logged in.


oxigen - 19.10.05 11:52 pm

What i was looking was a way to authenticate using htaccess but without the standard pop-up window.
I wanted to send the username and password on the URL ou something like that, because i have already a login script in PHP
and i have the htaccess only for restricting access from those people who didn´t logged in. .


dangit - 21.10.05 2:54 pm

How can I edit the .htaccess with PHP? Dot files don't want to cooperate.


corz - 22.10.05 4:57 am

This is a php question, dangit, and believe it or not, standard php file open techniques will work just fine with .dot files, even something like..

<?php
echo '<pre>';
echo file_get_contents('/.htaccess');
echo '</pre>';
?>

would happily display your site's main .htaccess file if you saved it in your site root, perhaps called test.php, and ran it in your webbrowser.

Saving shouldn't be a problem either, so long as the server process has write-access to the file, which many would argue isn't such a good idea to begin with, but it's your server! Try this..

<?php
// test whether .htaccess file is writable, and write to it
$page = $_SERVER['DOCUMENT_ROOT'].'/.htaccess';
$text = 'oops! I just wiped my .htaccess file!';
if (is_writable($page)) {
    //$fp = fopen($page, 'wb');
    //fwrite($fp, $text);
    //fclose($fp);
    echo 'success!';
} else {
    echo "
    <strong>couldn't write to the file!</strong><br />
    you'll need to edit its permissions and make it world-writable!";
}
?>

Note I commented out the actual file writing code, in case some fool copy-pastes it into a file (perhaps called "test.php"), saves it into the root of their webserver, and runs it. If you do that (goan!) it should report failure, at least until you make your .htaccess file world-writable, which may or may not be a good idea.

Uncommenting the three file-access lines would not be a good idea, unless you a) have a blank .htaccess to begin with, and b) want concrete proof that all this works perfectly, which it does.

In general, there are better ways to approach almost every task that has folk trying to do these sorts of semi-automated things with .htaccess, and I'd definitely look into one of them if I were you (which, as ever, means "if you were me"). At the very least, treat all incoming script data as "highly malicious".

have fun!

;o)


naresh - 22.10.05 7:55 pm

i want to store my .htpasswd outside document root. how do i do that?


Jeff - 24.10.05 11:10 pm

Hey, thanks for the helpful article. I have been having trouble setting up clean urls with .htaccess. Things were working fine, until I had to add an additional redirect.

here is the file:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^(.+) - [PT,L]
RewriteRule ^main/(.*)$ /index.php?id=$1
RewriteRule ^(.*)/(.*)$ /$1/index.php?project=$2

to this, I want to add:
RewriteRule ^(.*)/(.*)/(.*)$ /$1/index.php?project=$2&id=$3

which should remap http://url.com/sample/project/10 to http://url.com/sample/index.php?project=project&id=10, but its a no go. Things start combusting..

A question: what is the $ indicate after each rewrite (when not used with variables). Also: I took RewriteCond %{REQUEST_FILENAME} -f and RewriteRule ^(.+) - [PT,L] from a sample file. Not sure what it does, but without it nothing in the .htaccess file works.. Thanks in advance.


jeff - 24.10.05 11:26 pm

Okay, I'm a nob, and didn't see part two of the article on redirections. That said, there are a few things that are still puzzling me. One, is that I still can't get the htaccess file to work without
RewriteCond %{REQUEST_FILENAME} -f
and
RewriteRule ^(.+) - [PT,L]
and I hate using code that I don't understand. And two, still cant get
RewriteRule ^(.+)/(.+)/(.+)$ /$1/index.php?project=$2&id=$3
to pass $2 and $3 into project and id.


corz - 25.10.05 4:03 am

Ahh yes, you're not the first to do this. The comment facility here has the capability of accepting a message from the calling page (this page) and places that in big bold letters above the comments, handy for warning, etc. I must remember to do one for here! smiley for :lol:

Anyways, the "$" means "ends with", in other words, the string preceding that character MUST be located at the very end of the string. The PT is a passthrough, and you definitely want to try things without that. If the rules break when you remove it, perhaps you have some further URL manipulation going on, mod_alias, for instance. "L", means "Last Rule". Check out the useful links for a cool cheat-sheet with all this stuff on it.

The RewriteCond %{REQUEST_FILENAME} -f simply checks for a file's existence (a consuming process). The corresponding RewriteRule will only execute if the request doesn't point to a "real" file.

As to your non-working rule, perhaps the first bracket is catching everything. It depends on the query. You might want to try ([^/]+)/([^/]+)/ type constructs. Enable rewrite logging, too, and find out *exactly* what's coming out the other end; certainly there's nothing inherently wrong with the rule itself and I'd expect to see project and id available in the target script.

Try pointing your rules to this. I find it most useful.

;o)

ps.. I'll move these posts to the other page eventually.


corz - 26.10.05 4:41 pm

Oh! naresh, I missed your question. It's a real simple one, too. The answer is; exactly the same way as you store one inside the document root!

So long as you specify the correct location in your .htaccess, and the server process has read access to the file, it will work fine.

By the way, storing .htpasswd files outside document root is definitely recommended.

;o)


himagain - 29.10.05 4:46 am

Hi there, from Downunder! (In more ways than one :-( )
I'm trying to source some information written as in your inimitable ACTUALLY UNDERSTANDABLE way. I'm even beginning to grasp the htaccess structure!
Thank you!

What I'd like to do is find a password generating system that ties in to HTACCESS on automatic. Say allow access from a landing page, to create a unique password which relates to/includes the IP origin and then updates the htaccess file.

Like a PHPBBS system does.

Cheers!

Him



corz - 29.10.05 7:30 am

himagain, firstly, no, I don't know of any system that does this sort of thing (I'd be surprised if there was one). And secondly, even if I did, I'd have a hard time recommending it. It sounds like one of those just-for-fun projects you wrestle with on some rainy Sunday afternoon, and then forget.

There are lots of potential problems with this sort of approach, and unless it's for something fairly trivial (security-wise), it's probably best handled with a more robust php/mysql type system.

There are also potential issues in tying in the IP address of the user. Many use proxies, or proxy farms, changing IP's all the time. Others have dial-up, and new IP every two hours. Even DSL users get a new IP on reconnect (generally). That's why we invented php sessions, etc, more efficient ways to track a browser.

My own pajamas uses a mix of variables to track the user (IP being an optional one, for the above reasons) and I know from experiences dealing with pajamas beta testers that the User IP is variable with a capital "V"!

At the end of the day you might be better just installing some forum software and using its user database functions for your members. I'm rather taken with Simple Machines Forum which gives you easy access to the user database from outside itself, so you can check the member status of a user by "including" the SMF include at the top of your page. Very handy.

Even just setting a cookie might be all you really need.

;o)

ps.. if you do come across a system that does the automatic .htaccess passwords thing, do let us know! smiley for :ken:


Greedy - 30.10.05 11:18 pm

I want to stop people directly entering a URL into the address bar and pulling out my files. Can I do this with .htaccess and still allow them to click a link on my site and get to a file for download?




corz - 01.11.05 10:56 am

Greedy, (appropriate name for this post!), in short, No. Check back through the comments. Either you allow access or you don't.

There are other ways around this, of course, password protection, using php to "passthru" the file, other ways. But not with simple mod_rewrite.

There are alternative strategies, too.

;o)


JohnBoy - 03.11.05 3:55 am

Greedy, are you talking about somebody typing in the URL and getting an index of all the files in the directory from which to pick, instead of navigating the site and only going to the pages that you have hotlinks to? If so, you can prevent to indexes from show by using

Options -Indexes

in your .htaccess file.



Stu - 04.11.05 12:42 pm

Hey there,

I have some scripts sitting under a folder in my cgi-bin. I have set up htaccess on this folder so that people can't normally access it. There is one script in that folder that visitors to my site need to access without going through htaccess authentication. Is it possible to set up a htaccess file to say that everyone can access 123.pl but not the rest of the directory unless they authenticate themself?

Cheers,
Stu.


corz - 05.11.05 12:41 pm

Stu, yes. Check the main article (above), the last section, about <Files>. That's what you need.

;o)

ps.. if you still can't get it to work, chuck your rules here and we'll have a look.


Stu - 11.11.05 5:04 pm

Hi Cor,

I did try that, but for some reason it didn't work. I actually found a switch in the script that allows you to control access to it based upon your IP, so I won't need to use htaccess there. However, I am part-way through a redesign at the moment, and are looking for a htaccess file to do something like what I've outlined below. Not sure if this is possible though.

I need a htaccess file to redirect visitors from the main domain to a subdomain, and also to redirect visitors from old pages now deleted, to their new counterparts (old pages are still showing up in Google). Below is an example of the sort of thing the htaccess file would need to do.

http://www.mysite.com/blog/archive/old-category.htm -> http://blog.mysite.com/
http://www.mysite.com/blog/archive/2005/03/a-post.htm -> http://blog.mysite.com/archive/2005/08/a-post.htm
http://www.mysite.com/blog/archive/2005-03-15.htm -> http://blog.mysite.com/archive/2005-08-01.htm
http://www.mysite.com/blog/uploads/*.* -> http://blog.mysite.com/uploads/*.*

Also, can this file go under http://www.mysite.com/ or must there be a separate htaccess file for each subdirectory of the blog site? (ie. one for mysite.com/blog, another for mysite.com/blog/archive, etc.)

These redirections would be permanent. I know how to do "Redirect permanent thisfile.htm http://blog.mysite.com/" - however that just appends the original filename to the new URL. As is outlined above, I don't always want to do that.

Any pointers would be much appreciated!

Cheers,
Stu.


sneh - 23.11.05 6:37 am

I've created subdomain named "sub1" in domain "mydomain.com"
i want that whenever i access "http://sub1.mydomain.com/index.php",
it should render "http://mydomain.com/index.php?subdomain=sub1"
but, in browser, the 1st link i.e. "http://sub1.mydomain.com/index.php" should be displayed..
i've tried following code in .htaccess file for index.php file:::


RewriteEngine on
RewriteBase /
RewriteCond %{REQUEST_FILENAME}    !-f
RewriteCond %{REQUEST_FILENAME}    !-d
RewriteCond %{HTTP_HOST} ^mydomain.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.mydomain.com$ [OR]
RewriteCond %{HTTP_HOST}    ^sub1.mydomain.com$ 
RewriteRule ^(.*) http://mydomain.com/index.php?sub=$1 [L,QSA,R]


it worked..but i want to write a single rule for various file, like register.php, faq.php, privacy.php etc..

can anybody plz, help me ...(as-soon-as possible)



sneh - 23.11.05 12:18 pm

help for htaccess rule for subdomain:
I've created subdomain named "sub1" in domain "mydomain.com"



corz - 24.11.05 1:54 pm

Guys guys!
I have a page just for rewrite stuff! Your answer is probably here

;o)


Mark Guadalupe - 19.12.05 3:58 pm

how do you secure a movie files from direct download and straight from streaming via browser, how do you set the security up? please correct me if i'm wrong, and do you do that via .htacess? then how? do you have a sample tutorial?

this is becuase I would have a group of members, and how would I apply it into that kind of setup?

thanks,
Mark Guadalupe


corz - 20.12.05 3:59 am

Mark Guadalupe, check out this section, above. Essentially you just need to create a user group with only those users in it, and then use the password mechanism to protect that folder.

But as I moved your comment here, maybe you hadn't even read that when you posted, so THERE is your tutorial!

;o)


Aaron D. Campbell - 29.12.05 7:00 pm

All it does is enables PHP's built-in transparent zlib compression. This will half your bandwidth usage in one stroke, more than that, in fact. Of course it only works with data being output by the PHP module, but if you design your pages with this in mind, you can use php echo statements and includes for your plain html output and just compress everything!

Well, there are a few mistakes here:

First, I think your syntax is off. It should be "php_flag zlib.output_compression On" or "php_flag zlib.output_compression 1" which will turn it on, and set its compression-level to -1 (default...same as 6). You can also fine-tune this by adjusting the compression-level "php_value zlib.output_compression_level 6" (values can range from 1-9...remember that you sacrifice CPU for compression). This is all as per http://www.php.net/zlib

Secondly, it will not necessarily halve your bandwidth. That really depends on what your bandwidth is being used for. For example, even if you pipe all your images or archives (zip, gzip, bzip, rar, etc) through PHP, it will save little to nothing. However, if you send mostly text ([x]html, css, etc) it will save more than half.

Lastly, you do not have to echo your html. If a page is parsed by PHP, it is sent compressed. This simply means naming your file with the .php extension (or whatever extension you use for processing php).
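
Laid out as .htaccess lines, that would be something like this (assuming PHP is running as an Apache module, so php_flag/php_value are available):

php_flag zlib.output_compression On
php_value zlib.output_compression_level 6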


corz - 29.12.05 7:24 pm

Nope, this must be the advanced syntax! As far as I know (and I've been wrong before), as well as enabling max compression, this also sets the chunk size (unlike the gzip handler, it spits the page out as it goes, in chunks), which by no mere coincidence exactly matches the chunk size of the well known http clients, wget, for example. Kills three birds with one stone, syat.

Of course the amount of real compression will alter with the content, no one would expect a jpeg to compress much (or perhaps they would. hmm. a tut on compression might be a good idea), or an mp3, but as most content is plain text, and that compresses sometimes by as much as 90%, "half" is a nice ball-park figure. If anyone was truly interested, I expect they would test their site's output themselves.

And lastly, I rarely echo php, and I do not have my html documents parsed as php, either. Mixing code and content in the same document is messy, so, I include (something.htm) wherever possible, and this seems to me to be the best balance. The point was, and is, to illustrate that the output must be going through the php machine to be compressed. You could even get specific..
<FilesMatch "\.(php|html?)$">
  php_value zlib.output_compression 16386
</FilesMatch>
and only compress certain files, so long as they are flagged as php parsable. Something I should probably implement around the distro machines. smiley for :roll:

I hear where you're coming from, sure, but hopefully now you hear where I'm coming from, too!

;o)


subrose - 04.01.06 11:32 pm

i am trying to override the extension=php_ldap.dll, which is commented out in the root php.ini file,
i need this to authenticate users and i don't have acces to php.ini.

Is there anyway to do this with .ht?


corz - 05.01.06 2:39 am

Well, subrose, I'd imagine that libraries like this need to be loaded at php's initialisation, and for very good reasons, so, as far as I know, the only way to do it is to ask the system admins nicely if they will enable it for you.

;o)


dan - 12.01.06 9:49 am

Hello there, I've tried almost all the resources on this page, but almost nothing worked.
for example i want to do this:
www.mysite.com/articleview.php?id=1

to become

www.mysite.com/article.htm

for the part with *.htm I've done this already, but how can I pass the parameter ?id=1 other than through the url, so that my url will become friendly????


corz - 12.01.06 10:09 am

Well dan, everything on this page is tested 100%, and works day-in-day-out right here at corz.org. Thousands of others have also confirmed this. So, logically speaking, where do you think the trouble lies? smiley for :ken:

There's nothing unfriendly about ?id=1, is there?

If you want to ask about redirection, first, read part two, which is ALL about redirection, and if you still can't suss it, post the exact commands you are using, along with specific details about what you are trying to achieve, on that page.

;o)

ps.. you know about [qsa], right?


dan - 13.01.06 12:10 pm

with all respect for this site and all its friends, I have the following problem:

1. I've made a site that shows articles (I have an index.php that shows the article, and a link to the full article - this is the page view_article.php)
so the problem is that I transmit the parameter ?id=1,2,3, etc through the url, and view_article.php gets this id=1 through $_GET etc...,
2. the problem is that in index.php the links to the full article are like www.site.com/view_article.php?id=1. I want to make this url a more friendly link, like www.site.com/view_article.htm, without the ?id=1 part.
3. I've tried the scripts from this site and adjusted them to my site, but the page view_article.php doesn't get the parameter id=1.
4. how can the page view_article.php receive the parameter id????

many thanks


corz - 13.01.06 3:12 pm

Dan,

If you want to ask about redirection, first, read part two, which is ALL about redirection, and if you still can't suss it, post the exact commands you are using, along with specific details about what you are trying to achieve, on that page.

thanks.

;o)


Liviu - 15.02.06 8:17 am

I don't know how to undo the htaccess settings I've made. The .htaccess file on the server is hidden and I don't have access to it. I've placed the .htaccess file in the site root folder, and my whole site is now under password!!!
I need help!
Thanks


corz - 15.02.06 12:04 pm

Liviu, simply enable showing of invisible files in your FTP client. The command is usually..
LIST -al
or something like that. Your FTP client software may have an option you can check, instead. You could probably see it if you use your hosting panel's file manager, too.

Hopefully this has taught you the value of testing everything on your development mirror first!

;o)


Liviu - 15.02.06 5:53 pm

Thanks, cor.
I don't use any FTP client software. I upload my files with MS internet explorer.
I've done it like so: a new empty .htaccess file placed exactly over the last one. It overwrites the last one and because it is empty, the settings are lost. That's what I need.



corz - 15.02.06 7:05 pm

Liviu, are you serious??? smiley for :eek:

Please, if you take webmastering seriously, go get an FTP client!

INTERNET EXPLORER?? smiley for :eek:

I really have heard it all, now! smiley for :lol:

http://flashfxp.com/

;o)

ps.. there are also some excellent free ftp clients kicking around.


Liviu - 15.02.06 8:03 pm

Your information is helpful. I'll get a FTP client. I'm a beginner, that's why I asked for help. Thanks for solutions.


Andrew - 16.02.06 12:00 pm

Hello,your tips are very useful.
I have an .htaccess file in the root of my site to prevent people I don't want from accessing it, but I want a subfolder to be fully accessible.
To be more precise, I have the main htaccess file in http://mydomain.com, prompting everyone for a password (it works), and I want the folder http://mydomain.com/photos to be accessible by anyone, no password prompting.
I guess I have to create another .htaccess file inside the photos folder with code like: allow from all. I tested it and it doesn't work.
Any help is welcome.


corz - 16.02.06 1:10 pm

Don't worry about it, Liviu, others will learn from your pain! smiley for :lol:

Andrew, at the risk of inviting some clever dick to come along and say, "Oh Yes it IS!", I have to say, that's not possible. Not as it stands.

One of the great things about .htaccess rules, you will remember, is that they apply to that folder and all the folders inside. By password protecting your root directory, the directories inside it are automatically protected.

The .htaccess file inside /images/ (or whatever) won't even be read (at least, applied) until the user has authenticated themselves. That's the trouble.

Your real problem is site design. Password protecting the root isn't really a clever idea. Far better to have a home page, and then a link to the "protected" area, which would be a directory inside the root, perhaps called "protected". This will save you much future agony.

Finally, remember that "basic auth" is quite feeble security. If you want to protect particular content, your MP3's or whatever, you might want to instead consider some sort of secure download facility.

Alternatively, use a more intelligent method of authentication, and then simply check your authentication status in your page header, or wherever, where you can decide what the user can and can't see. Beauty of php sessions!

There are loads of ways to achieve the effect you are after, even better ones, but with basic http auth in your site root, you are stuck with all-or-nothing.

have fun!

;o)

ps.. Perhaps consider installing some forum software. SMF, for instance, will allow you to check the user's authentication status by simply "including" a small php file. Very useful.


ED - 28.02.06 1:00 pm

Hi Cor

I've been password protecting my web directories for some time using htaccess.
Recently someone boasted that they had hacked into my site and that it had been easy.
(Both the htaccess and htpasswd files are in the web directory. The server does not allow .ht files to be displayed. By default directory contents are not listed.)


Is it possible for someone to hack in to my site and bypass or decrypt the access login.... how would they do it?

ED


akalias - 01.03.06 1:05 am

Anyway to do an .htaccess auth for remote hosts only?
I know how to do the Auth stuff for all hosts and also
for a limited number of hosts.
But how can I say, only show Auth if not localhost?
I don't want to type a user and pass when I am on one
of my own boxes within my own IP range.

-akalias


corz - 01.03.06 9:53 am

ED, whole books have been written about ways to hack web sites, where to begin?

The most likely cause is poorly written web-side code. Because php is easy, loads of folk dive right in and create web applications. This is good.

But writing web applications and writing secure web applications are two different things. This code is great for the dev mirror at home, or the fun home page, but not for a production environment. I've seen download scripts that will, for instance, allow you to grab any file on a server, this sort of thing..

<?php
$file = $_GET['file'];
echo file_get_contents($file);
?>

to which the amateur hacker simply requests.. download.php?file=../../../../../etc/passwd or whatever they like. Examples of this sort of thing on the real world wide web are legion. The bottom line is; treat all user input as potentially nefarious; filter it, screen it, and rather than removing what isn't allowed, accept only what is.

A good example: a download script that allows users to download any file in a directory. Rather than screen the user input for XSS attacks, path traversal attacks and all the rest, you simply scan the directory, create an array of the filenames of actual files, and then only allow requests that match one of those files exactly; pretty much the opposite of the way it's usually done.

Even the most innocuous little seemingly harmless script can often be put to the most nefarious uses by the determined cracker. I've seen would-be-corzoogle type "search engines" that happily display excerpts from any file they find (database passwords are easily discovered this way), I've seen database scripts that will obligingly email you a dump of the sql, you name it, even when the webmaster was thoughtful enough to provide a password for the thing in the first place, which many don't.

One of the most famous mac underground sites ran for years with no password on the root MySQL account. Once you get root, there's little to stop you. And simply using IIS as your chosen web server is enough to blow any carefully planned security arrangement!

Whole books, really. Read some of them if you are truly interested. Start here.

Remember; never rely on "Basic Auth". It's fun, and useful, but completely inappropriate for situations where you need anything but the most trivial of protection. Remember, with basic auth, the passwords travel over the wire in PLAIN TEXT, so at any node between the browser and server it's very easy to sniff this data. The situation for users of proxy servers, particularly unknown "free" proxy servers, is infinitely worse.

Sadly, most hosting plans still use Basic Auth for site admin logins, entirely obliterating any protection the webmaster might himself provide smiley for :erm:

Check out Pajamas, my own humble effort to work around this and other security issues with "basic" authentication. I have a nice OOP version of it kicking around here, mail me if you're interested.

Lastly, if you write all your own back-end code, go through it all with a fine-toothed comb, screen all user input, then do it again. If you use other folk's code, get the latest version, most recent patch, whatever, and get your site tight.


akalias, what you need to do is simply add the local subnet to the allow group, something like..
AuthType Basic
AuthName "corz secure area"
AuthUserFile "/home/.htpasswds/corz/secure/passwd"
require valid-user

<Limit GET POST PUT>
  order deny,allow
  deny from all
  allow from 192.168.1
  require valid-user
  satisfy any
</Limit>
The key line is satisfy any which basically allows access if you have either the correct IP, or the correct user/pass combination. Users in the 192.168.1 subnet (you can write that loads of ways, and of course, specify only one single machine) can access the folder without entering a user/pass, but any user outside the local subnet must authenticate. Switch GET POST PUT for whatever you need.

And thanks for bringing it up, akalias, I hadn't considered this before; it's a neat way to do things, so I'm glad I had to think about it! The only trouble is that I'll bet every time you freely access the area from the local domain, you get a pang along the lines of "shit! is this still secure from the outside?" smiley for :lol:

I had to test it with a proxy, just to be sure! smiley for :ken:

for now..

;o)


akalias - 03.03.06 12:07 am

Cor,

Thanks a lot ...I knew about the subnet thingy, but not the satisfy key. That "satisfy any" was exactly what I was looking for.
I knew I didn't have to do this check in php. Awesome, thanks a mill.
Exactly what I need for my music stream page.

BTW, if you need an mp3/ogg on-demand stream server, with ogg transcoding, let me know. Cause I wrote one in perl about 2 years ago, with metadata updates as well as seamlessly joining songs for Techno albums.
Plays nice with Apache and proxypass, so if your work has a proxy that only has p80 open, it will get through ;)

Again, thanks for the reply.

Cheers
-akalias




ED - 07.03.06 11:42 am

Gulp! Thanks... will now start to inwardly digest.


meh!_32768 - 13.03.06 5:21 am


hiya, great site.
I have been trying to find a generic way to specify expire and inactiveExpire timeouts using .htaccess. I'm using the thttpd server, so apache's pubCookie doesn't work. The server does not have cgi or php; it's minimal.

I wrote a perl user/pass create script which I run on the server, and users are able to log in; my problem is that users log in once and then never again.

I'd like to get a session timing out after a total logged-in time of a few hours, and secondly a session timing out after the user has not touched the thing for half an hour. (or if not possible, one of the two methods would be great ;D, since at the moment I don't have any methods x\ )
the thttpd webserver system is SGI IRIX; the read-only NFS system that contains htdocs and .htaccess is Sparc Solaris 9
thanx lots wtfpwn Cor!



corz - 13.03.06 9:37 am

meh!_32768, I've been using mod_expires to great effect recently. That's not much help to you, but it does remind me that I should put a bit on this page about caching.

Without cgi of any sort, your options are limited, to say the least. I don't understand how you get the perl script to login users without cgi. Am I missing something? But if you are trying to put a solution together, you might find this a good read.

At the end of the day you will probably be better off either upgrading to a different httpd (maybe Tux, or something), or else perhaps patching the one you've got.

;o)


Mantooth - 03.04.06 6:10 pm

Hello,

I'm on a shared server that has disabled the php function "fsockopen". Here's my host's php info http://minidemo.hostultra.com/

I need that function for some php scripts but my host told me that they absolutely will not open the function for the entire server because of security issues.

I've heard that there's a way to enable php functions for particular directories using .htaccess

Is this true? and would you know the lines i would need to add to enable "fsockopen" for the script's directory?

Thanks


corz - 04.04.06 5:46 am

Mantooth, just tell your hosting admins to stop being so lame and install some *real* security measures. Disabling useful php features just because they don't know how to properly secure their box is shameful. No passthru either? pfff.

Solution: consider a real web host.

;o)


Mantooth - 04.04.06 4:19 pm

quote
Solution: consider a real web host.

true that. I picked hostultra because the plan was dirt cheap. I've officially learned my lesson.

Thanks for the advice.

-Mantooth


Mihai - 07.04.06 7:35 pm

Is there a way to include files in htaccess files?
For instance - I have multiple vhosts pointing to the same folder, so I've created custom .htaccess files (.htaccess_site1, .htaccess_site2, etc..). Yet I want some parts of these to be the same, without changing every one of the 300+ files I'll have.

.htaccess
{shared content}

.htaccess_site1
{site1-specific content}
{shared content} <- will have to include .htaccess

.htaccess_site2
{site2-specific content}
{shared content} <- will have to include .htaccess

...

Thanks


corz - 08.04.06 2:28 pm

Nah, Mihai, "include" is only available inside your main httpd.conf (or virtual hosts configuration).

It sounds like you have access to the vhost configuration, so make your changes there.

;o)


Adam - 26.04.06 5:01 pm

I need to protect sub directories with htaccess. I have a database of users and have all the information to make this work. My only problem is, on the login screen (in the root) they login, but another popup window comes up asking for username and password to access protected directories. What I want to do is when they login using my login page, that information is passed to the htaccess and provided they are a registered user, have automatic access to the protected directories.

In other words, I want to get rid of the pop up window, and everywhere I look for the code to do this, it's the same copy-and-pasted article from Apache that says there is a way, but no one has any code to prove it.




corz - 30.04.06 1:42 am

Adam, my first response is; well, it shouldn't do. Once they login, that's it. There's nothing in Apache that would make a second dialog pop up, you must be doing that. So the answer is STOP DOING THAT!

At any rate, this just sounds like one big mess, why not simply authenticate with php, and use sessions. The code is lean, clean, and transparent to the user. The login can be a regular text input (like the name input in a comment form) so no pop-up at all.

Basic authentication sucks, anyway, and Internet Explorer won't do it any more unless you hack the registry, if not yet, then soon. Most folk are moving to better solutions, these days.

But perhaps I misunderstand. If you want to show me this "copy and paste" code that doesn't work, I might have a better idea of what you mean, and make it work.

l*rz

;o)


thedude - 01.05.06 8:47 pm

I'm attempting to password protect a directory using .htaccess and everything seems to be set up correctly. But the authentication window will only display under three known instances:
1) when accessing that directory's index page (e.g. http://www.mysite.com/directory/)
2) by entering a non-existent page (page 404s after logging in)
3) by accessing a page with an html extension (http://www.mysite.com/directory/page.html).

Entering the URL for an existing page opens that page without requiring authentication (e.g. http://www.mysite.com/directory/page.php).

Do you know what might be causing this?


corz - 02.05.06 12:19 am

thedude, yes, I'm almost certain it's caused by a server misconfiguration.

If you show us the configuration, someone might spot the "mis".

;o)



rickjames - 14.05.06 3:27 pm

hey I'm having trouble setting up .htaccess on my site. I've signed up for godaddy (aw man, awesome deal) and to access my site files I have to login to godaddy / highlight hosting servers / click my hosting account ...then click open, which loads a separate page on hostingmanager.secureserver.net... from which I access my files via hostingmanager's control panel or FTP client (java based), under which my directory /teepee (member/directory name under hosting manager) is listed as a remote system

(sometimes I just use smartftp to ftp://ftp.mysite.org to link to my /teepee main directory)

I've tried the "satisfy any"... thang, but that didn't work.
in turn I've come to believe it's something to do with that AuthUserFile line...

with the aforementioned info... my AuthUserFile has been /teepee/.htpasswd OR ftp://ftp.mysite.org/teepee/.htpasswd OR mysite.org/teepee/.htpasswd... I've also done the allow IP string with the IP I found on the hostingmanager control panel.... someone please help!!!


corz - 15.05.06 10:38 am

Yup, rickjames, godaddy is the place for domains!

However, I know absolutely nothing about their hosting plans, nor their server setup. My first instinct is that it's nothing to do with .htaccess. In every host I've been with, the FTP access is controlled through the hosting panel (or in rare cases, in the shell).

Check your hosting panel carefully, probably there's an option to enable and/or setup ftp accounts. Do that.

If it's something else (and asking the godaddy staff would probably get you a swift answer) then let us know.

l*rz..

;o)


BluEfficient - 03.06.06 9:29 am

The GoDaddy issue stems from its use of a cgi wrapper. The particular wrapper they use not only causes your issue, but causes MAJOR issues with .htaccess.

For instance, let's say I have a folder I want to protect, called admin. After fixing the .htaccess and .htpasswd file, I type in:

http://www.domain.com/admin/

Now, here's the kicker... If my index page is index.htm OR index.html, I will be prompted for a username/password. If my page is called index.php, the page will be displayed without asking for squat.

WTF?

Well, again, it's the cgi wrapper. Since the page has the php extension, it is handled much differently than a plainly coded html page with no parsing.

Thought someone might be interested to know. And, no, GoDaddy will NOT change this for you and does not have hosting without the cgi wrapper.


Daniel - 03.06.06 10:29 pm

My website is http://www.crazyworld.biz.ly,
i want to grant cgi access, to a folder called cgi,
how do i do that?


corz - 05.06.06 7:21 am

Daniel, add..
+ExecCGI
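..to the Options line in that folder's .htaccess. In full, something like this (assuming your host's AllowOverride permits it, and your scripts use these extensions)..

Options +ExecCGI
AddHandler cgi-script .cgi .pl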

;o)


SongDog - 06.06.06 7:59 am

I know I've posted here b4 (in the last month or so), we talked about apache, penguins with feathers, mod_rewrite and password protection. For the life of me I cannot find those posts. (have you ever considered a threaded board?)

Anywho I gots another question on mod_rewrite that maybe you can help me width smiley for ;)

There is a particular person who has screwed with me 2 much in the past who I would like to send to a particular page on my site when they try to access any of the rest of my site.

OK in the world of dynamic ips the actual numbers change from time to time, but the constant is that the ip number resolves to xx.bigKableKompany.xyz

What would I do to send anyone (I understand what's meant by anyone) from xx.bigKableKompany.xyz to http://myDomain.xyz/hiTurdleHead.html?


corz - 07.06.06 10:12 am

SongDog, the reason you can't find the posts is because they were on the mod_rewrite page (part two), where this post should probably be!

This is a tricky one. If hostname lookups are enabled on your server, you can redirect using the remote domain name in your rule (%{REMOTE_HOST} in the RewriteCond). However this is highly unlikely (it's an incredible waste of bandwidth, and disabled by default in Apache).

So you might be looking at a server-side scripting solution. This would be easy to do in php, for instance.

If you have a page header (something that's included in all your pages) then you could check there, and redirect as required. If you don't have such an include, it's easy enough to add that on a global basis (and this is why I left this reply on this - part one - page), in your .htaccess..
php_value auto_prepend_file /full/path/to/a55h01e-checker.php
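(And if hostname lookups did happen to be enabled, the rewrite version would look something like this - untested..

RewriteEngine on
RewriteCond %{REMOTE_HOST} bigKableKompany\.xyz$ [NC]
RewriteCond %{REQUEST_URI} !hiTurdleHead\.html
RewriteRule .* /hiTurdleHead.html [R,L]

..the second condition just stops him being redirected round in circles.)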
Have fun!

;o)


justin - 19.06.06 2:39 pm

I need the directory, say protected_doc, protected from access by any means other than a particular link to it.
or
to put it another way, one should be able to access the directory only if he uses this link, and not by typing the location into the browser (url bar)

Is it possible..?

The posts here were quite useful in understanding (a little at least) the might of .htaccess smiley for ;)

Thanks in advance ..

Justin


corz - 20.06.06 12:12 am

justin, probably your best bet would be to use regular hot-linking code, except remove the line that allows access with no referrer. While some (a few, very rare) browsers don't send referrer headers (or "referer", if you prefer the usual misspelling), this will cover 99% of your surfers.
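
Something like the standard hot-link block, minus the empty-referrer condition, dropped into the protected_doc folder's own .htaccess, ought to do it (untested; "yourdomain.com" is a placeholder)..

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://(www\.)?yourdomain\.com/ [NC]
RewriteRule .* - [F]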

Other than that, a server-side coding solution would likely be best (like a php session). You might also want to consider using regular cookies, which can be placed in the browser with JavaScript, php, whatever.

;o)


Eddie - 06.10.07 2:14 pm

Virtual Servers and Directory Permission

Great Tips! Finally someone who explains code rather than just showing it, nice work!

So I have read all over the web about being able to restrict directory access, allowing only your own server. This prevents people from viewing a directory or file directly.

.htaccess deny, allow code
order deny,allow
deny from all
allow from .trippymedia.com
allow from localhost
allow from 208.109.181.72
allow from 127.0.0.1
allow from 192.168.0

now as you can see i have tried every possible alias for my server, but still it gets denied requesting its own pages.

Is there any way to obtain its local IP, or get similar results?

Basically i want a web page in another directory to have exclusive access to this one.


corz - 06.10.07 4:35 pm

Yes, Eddie, it definitely helps to understand "what's going on", and along these lines I will now continue, except on the subject of "Web Requests"..

The source of your trouble is not the DNS, or the host name, or the IP, or anything like that, but that you don't realise that no matter what page the links are on, no matter where on your server the "other" folder is, the requests for the resources in the "denied" folder COME FROM THE USER'S BROWSER. And their IP is anyone's guess.

Unless you are using a server-side scripting language like php (which I would recommend, for what I suspect you are trying to do) to "passthru" resources from that folder (with "fpassthru()", for example) there's simply no way to protect it entirely.

You might want to check out part two of this article, and use hot-link code for a modicum of protection, but at the end of the day; it's server-side scripting, partial hot-link protection, or else put all the resources in one main folder, and password protect it.

Have fun!

;o)


Eddie - 06.10.07 10:49 pm

Well don't I feel foolish...

Luckily my site does run on PHP, which I guess is the reason I thought the request would come from my server. Thinking about it though, PHP just prints the resource for the browser to read, and the browser retrieves it.

I'll have to check out the fpassthru method you mentioned.

Thanks a million!



corz - 06.10.07 11:59 pm

No problem! And hey, I wouldn't worry too much about it, I've seen assumptions much more foolish than that made by experts. smiley for :ken:

You are absolutely right to say that php "prints the resource for the browser to read", but what those resources are, and how they are created, is a whole different world; the infinite realm of coding..

php is an entire programming language; with some desire and a little skill, you can do pretty much anything you might imagine a web page could do, and more, using all sorts of resources, even resources outside your web root. This whole site is basically me+php.

The method I suggested would be part of a larger program, which would roughly..
  1. decide if the resource can be accessed (security)
  2. send the browser the correct http header()s to let it know what the resource is (say, a zip file)
  3. read the file and send that data to the browser (with fpassthru)
Check out my distro machine, which does exactly this. I've just finished updating that part of the site today, and a new release will be out within a few days, but the code you want is identical to the existing version. Download it, and have a look.

Or just use my distro machine! smiley for ;)

;o)


Varahram - 12.10.07 2:58 am

Hi,

I have encountered a problem, previously I could access my website without adding "index.php" to the url but I made some changes to my website and now I have to add it to be able to get access to my homepage. I appreciate if you can help me with this matter.

Thanks.



corz - 12.10.07 3:03 am

It looks like your DirectoryIndex directive got deleted. Try adding the following line to your .htaccess file..

DirectoryIndex index.html index.php

That should fix it.

Note: Apache first looks for a file called "index.html", and if it doesn't find one, looks for "index.php". This is the standard way to do things, and handy if you need to work on some index file live; you just drop in a temporary index.html, and browsers get that instead. It might say "come back in five minutes".

When you're done, you delete it, and that folder's default reverts to index.php. You could also simply use..

DirectoryIndex index.php

;o)


AskApache - 07.11.07 4:07 pm

Some of the greatest and most creative .htaccess tips and tricks I've ever found, and i have searched extensively.. Used some of your ideas for my .htaccess articles.


corz - 08.11.07 12:36 am

Feel free to use anything you like! So long as you give credit, link back to here, it's all good.

Hmm. Strange, I don't see you in my referrers. smiley for :ken:

;o)


RS - 12.11.07 7:57 am

Dude, that askapache might as well be a spam bot. I'd lose the link to his site. That ass is constantly spamming the wiki, forums, etc and just grabs other people's shit for his site. Peace!


corz - 12.11.07 8:23 am

Thanks for the heads-up, RS.

It's a fairly common practice. I get mails every now and again from folk concerned that someone has ripped off my work, sometimes whole articles, verbatim. In fact, there's a whole TLD out there that is nothing but my .htaccess articles, and it has nothing to do with me, I can assure you!

The only thing that annoys me, is that I do still occasionally update these pages, refining and adding things, and anyone seeing a copy or a rip-off, is missing out on all that. A simple link back here solves the problem. But it's sooo hard to make a link, isn't it?

I got mailed recently about an interesting article on Automatic translation with .htaccess. The funny thing is, they don't seem to be able to get it to work properly, and as the author hasn't acknowledged that he nicked the idea from me, and isn't able to see what he's done wrong himself, they are screwed until a) someone points them back to here, or b) the "author" does some hard work.

Most people won't notice who did what and where and when, and it doesn't actually matter that much to me, so long as the information the user gets is good, and the code works, which sadly isn't always the case..

Our askapache friend seems to be in "growth phase" at the moment, gunning for PR, I guess. I checked out his site over the weekend, and found quite a few examples of unworkable code, clearly lifted from elsewhere without a full understanding of the caveats and gotchas that go with it. Other "ideas" were just plain stupid.

My plan was to see if he might give some credit on his page (not just to me), and if so, let the link slide, though with a "rel=nofollow", of course smiley for ;)

But after your extra information, I may just neuter it, or point it somewhere else, DisneyLand, perhaps.

Cheers!

;o)


RS - 15.11.07 11:02 pm

too funny. that douche just got his affiliate account killed and it looks like the hosting account won't be far behind for stuff like spamming and deceptive affiliate link redirection. he even had 'sponsored by dreamhost' on his site and thought it was okay because he was an affiliate. lmao

this askapache clown is delusional. this thread at his host's forum is pretty funny:

http://discussion.dreamhost.com/showthreaded.pl?Cat=&Board=forum_promotion&Number=94595


corz - 15.11.07 11:49 pm

no comment! smiley for :blank:


Daymon - 01.01.08 7:41 am

I recently moved a static HTML site (don't ask) to Godaddy and used

AddHandler x-httpd-php5 .php .html

so it will handle all the html pages as php, which saved me from renaming them. The only problem is when I try to upload 3rd-party software written in php, Godaddy gets retarded on me. Do you know of a way for it to ignore the rewrite rule on .php pages while still rewriting html? smiley for :eek:


Aryan - 09.01.08 10:10 pm

Hi,

I have encountered a problem, here it goes..

I have a set of files in "root/docs/id/" and I have a common php script that has to be executed whenever any of these files is accessed, whatever the access method or request type may be.

Is there any way to achieve this....

Thanks smiley for :D ...


xtirpata - 28.01.08 5:32 pm

Thanks for the tute!

In your 'Useful Links.." you mention .htaccess generator and ask 'Where's the php source. Anyone?'

There's a link to

http://www.southeasttelephone.com/tools/htAccessor/readme.txt

which credits

Chris Todd
chris@bitesizeinc.net
http://www.bitesizeinc.net/

but it's a dead link.

smiley for :ehh:


Bob_bob - 01.02.08 1:19 am

First of all thank you for making this page and helping the public with the problems they have (I am one of them).

I have a question, and would like to know: is there any possibility for .htaccess to make a session expire after x failed login attempts?

I am asking because there is someone coming to my site every day for a few hours, trying to login and then leaving. So I would like to ban his/her IP for that day only; let's say after 3 unsuccessful logins.

Regards,


corz - 26.02.08 5:42 am

Daymon, try part 2 (next page).

Aryan, try php's auto_prepend_file. see here for details. You can do it in the main .htaccess file for that directory/tree.
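Something along these lines (the path and filename are purely illustrative)..

php_value auto_prepend_file /full/path/to/docs/id/common.php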

xtirpata, thanks dude. I'll maybe see if I can track this guy down, get his php.

Bob_bob, it's an interesting question, and though I suspect it *may* be possible, it probably wouldn't be pretty, and would likely involve some fairly inefficient jiggery-pokery.

Alternatively, a cookie is fairly easy to set (on your 401 page/.htaccess) and check (on your login page/.htaccess), though obviously, the client could unset that themselves, if they sussed out your tactic. But it would make things inconvenient, and that's usually enough of a deterrent for most folk.

At the end of the day, a php session would be *much* easier to implement, and more secure, too. Try any of my logins here at corz.org (admin page/add-a-blog/etc.); you'll see what I mean.

for now..

;o)


Bijan - 28.02.08 8:31 am

Thanks for the content.

I have an image folder which I want to limit access to, so that public users can't browse the folder directly, eg: http://mydomain.com/images/myimage.jpg

But I want to have access to it by a php script that can show the images in that folder.
I have used the .htaccess file in the image folder like :

deny from all

But with this I cannot view the images with that script,
If I use something like :

deny from all
allow from 127.0.0.1 # or the server IP

It would be solved, But my question is :

Is there another way to do this stuff?
whereby nobody can have direct access to the folder, and they can only view the images via a special php script that limits access to the folder?

Thanks,


corz - 28.02.08 10:07 am

Bijan, try part two (next page - 'hotlinking' section).
Link above these comments.

;o)


Nacus - 26.03.08 12:55 am

I just found this site and it is very helpful. I do have an issue that I would like to present. Maybe the community can help solve it.

I have a web site that uses HTAccess to protect files from unauthenticated site visitors. The content is in a protected directory, but the pages that have the links to that protected content are publicly available. When I just click on a link before logging in, the authentication dialog appears as expected. However, before logging in, when I right click on the same link and select Save Link As, the authentication dialog does not appear. Instead, the Save As dialog appears and when I try to save the files (which can be fairly large), the Downloads window appears and shows a very fast download. If I open the file with a text editor it says inside that I have to authenticate before I can get the file I want.

If, before authenticating, I try to download a file with Safari, IE and Opera, I get the expected behaviour: Authentication dialog appears, I log in, then the download starts.

Am I missing something, or is this a Firefox issue?


corz - 26.03.08 2:05 am

Yes, the behaviour is browser-specific. I guess the Firefox developer's rationale would be that it's unusual for links to protected resources to appear on an unprotected page, or something like that. Regardless, many (perhaps most) folk wouldn't realize that the .zip file (or whatever) was actually an HTML page - they would just have a broken zip, and no explanation.

K-Meleon reports, "Download failed. Authentication required", which at least lets you know what happened; but I still prefer the IE/Safari/Opera method.

You might want to bug the Firefox developers about what you "expect", and with a bit (a LOT) of luck, they might even consider changing it. It might even already be a bug - worth a look.

Good luck!

;o)

p.s. this is NOT a community!


dREW - 06.04.08 1:26 am

I have set up my .htaccess but when i visit the page where there should be a prompt for user name and password nothing appears. Any thoughts?


corz - 06.04.08 2:25 am

Yeah. Paste your .htaccess code.

;o)


Sumeet - 08.04.08 3:57 pm

What an enlightening article!
I knew nothing of .htaccess files before
visiting ur site, really very thankful to u.
Great job!


jhoy imperial - 16.04.08 7:08 am

hello!

i just like to ask if this is possible with htaccess

rewrite from : www.domain.com/main
rewrite to : www.domain.com/folderhere/hello.php

thanks =) smiley for :roll:


corz - 16.04.08 7:14 am

jhoy imperial, try reading part 2! (link above)

;o)


jhoy imperial - 06.05.08 2:11 am

cool thanks smiley for :)


angelique54 - 20.05.08 11:03 pm

I'm a true newbie. Your writing is clearly written and wonderfully implemented. Thank you!


Marc - 27.05.08 7:27 am

Great stuff. I ran across this site while trying to figure out how to use .htaccess files to do fancy directory listings (using IndexOptions FancyIndexing). I got that part figured out now, but just one thing left I'd like to do if possible--is there a way to suppress the display of the:
"Apache/2.2.8 (Unix) mod_ssl/2.2.8 OpenSSL/0.9.7a mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635 Server at 74.54.250.154 Port 80"
at the bottom of the page? It would just make the page look a bit cleaner.


corz - 27.05.08 10:07 am

There are a couple of ways to do this. The first, and most obvious, is to put..

ServerSignature Off

in your .htaccess ("Off" is actually the default setting; your signature must have been set to On higher up somewhere). On some servers, the above setting has no effect, because the override has been disabled.

Another way to disable the ServerSignature, is to use a custom readme, which is the html that goes below the fancy index, just like in my own /public/ archives, something like..

ReadmeName readme.html

would insert the contents of a local file called readme.html under the index listing. You can also specify a full path (handy when you want to share readme files in many directories) from the root of your site..

ReadmeName /includes/txt/readme.html

While we're here, note there's also a HeaderName directive, which works in exactly the same way, except inserts the contents above the index listing.

;o)




Puniksem - 29.05.08 7:09 pm

Thanks for this most useful page, I really didn't know so much could be achieved with .htaccess files. thanks again.


Marc - 30.05.08 3:09 am

First I tried out ServerSignature Off and that did work on the server I'm on. Then I tried adding custom HeaderName and ReadmeName files. Super easy (now that you've taught us), and super cool.
Thanks cor! smiley for :D



Mohsen Ghassemian - 01.06.08 8:15 pm

Hi,

very nice, thanks for the nice simple page!


neal - 05.06.08 2:55 pm

its cool man


Slip - 06.06.08 4:20 pm

Thanks dude, your guide do help me a lot smiley for :)


Daivd - 09.06.08 1:01 am

Thank you for providing this information for people like me who are new to programming and wouldn't even have a chance to learn without people like you posting this tutorial.
I've spent days researching a problem getting .htm to work with php.
Within 1 hour of applying your information, my site is running perfectly and I can't thank you enough!


Mumin - 15.06.08 3:44 pm

Wow, you've made trying to learn easy and fun!
Grr, it's quite annoying trying to write to an .htaccess file using Cpanel >.<

Anyway, you rock dude.


pankaj - 21.06.08 11:04 am

I am not getting the use and functionality of .htaccess


Thomas - 07.07.08 9:57 pm

Hey there, really great article!

When using .htaccess for basic authentication, is it possible to read the username & password with php? The usage would be to redirect the user to "their" directory after logging in on a generic url.

Thanks!
.Thomas


corz - 14.07.08 6:00 pm

Yes, Thomas. But there's no need to know their password, they won't access the php script unless they have logged in, right? After that, all you need to direct them, is their user name, which is in the Apache variable "REMOTE_USER", e.g..

<?php
echo $_SERVER['REMOTE_USER'];
?>

;o)


Dranoweb.com - 05.08.08 1:59 am

Nicely written.

If textbooks were written like this, more people would stay in school.

Had some issues with hackers dumping some php code inside gif images on my uploads section lately; blocked it off with htaccess like much of my site.

I came looking for ways they may get around this, and as usual, got sidetracked by the wonders of imagination that can be realised in code of all sorts.

I might have to add you to my useful links section.


Zarg - 07.08.08 3:23 pm

Very good articles - and your writing style's fun too.

Based on what I read of your examples of limiting access to certain files, I tried to set up my own, but oops! Internal server error... Feck!

I'm happy to pay you. How much would you charge for the following? (Please email me.)

I want all html files restricted to valid users in my htpasswd file, but to allow access to absolutely everyone for all other files, including those in an /images subdirectory.

Cheers!


corz - 08.08.08 6:44 am

Hmm, something like this..

# your standard auth code..
AuthType Basic
AuthName "restricted area"
AuthUserFile /var/www/.htpasses
# or similar, then..

<FilesMatch "\.(html|htm)$">
  require valid-user
</FilesMatch>

Order Allow,Deny
Allow from all

All donations warmly accepted!

;o)


Andre - 08.08.08 9:12 pm

Very nice explanation of the .htaccess file functions. I loved it. Thank you very much for this resources.


Zarg - 13.08.08 9:15 am

Hi Corz. Thanks for the code. I tried that, and yes it prevented html files from being viewed and allowed other files to be viewed, but instead of presenting the dialog to enter the user and password, the page just returned:

Authorization Required
This server could not verify that you are authorized to access the document requested. Either you supplied the wrong credentials (e.g., bad password), or your browser doesn't understand how to supply the credentials required.

Does this look like a problem with my server?


corz - 13.08.08 11:34 am

Nah, Zarg, it looks much more like a problem with your browser (read the message!). Try another one, or simply restart it and try again.

I'm assuming you have a proper user/pass setup that works 100% before you make it conditional, yes?

;o)


Zarg - 13.08.08 3:49 pm

Ta muchly indeedy, Cor, but it's not a browser issue (Firefox 3 on Ubuntu).

I don't know what was happening at our server end but I contacted the sysadmins who seem to have fixed it so that the htaccess file you kindly provided now works. Most odd. I'm wondering whether, since our server is of the virtual type, it doesn't do all apache stuff in the same way as a vanilla Apache install would.


Zarg - 13.08.08 3:53 pm

PS - I just read my previous comment and the part "...the htaccess file you kindly provided now works. Most odd..." doesn't read properly. Of course I do not mean that it's odd that the code you provided works, I mean it's odd that it works now after someone at our hosting did summat. So. They're odd. Not you. Although, of course, you might be odd for different reasons. Effing hell, I'm rambling. I'll shut up now...


corz - 13.08.08 8:28 pm

smiley for :lol:

I knew what you meant, Zarg!

Interesting that the server is virtual; I've been investigating and testing virtual machines these last couple of days; it's been a while since I did that; so much is now possible; it's crazy. But you are probably referring to Virtual Hosts, so no; there shouldn't be any relevant functional difference.

*Much* more likely, you alerted the sysadmins to something which was broken, and they have now fixed it.

Good work!

;o)


AskApache - 20.08.08 12:15 pm

Hey!! What the heck..!

cor - 12.11.07 8:23 am

Our askapache friend seems to be in "growth phase" at the moment, gunning for PR, I guess. I checked out his site over the weekend, and found quite a few examples of unworkable code, clearly lifted from elsewhere without a full understanding of the caveats and gotchas that go with it. Other "ideas" were just plain stupid.

My plan was to see if he might give some credit on his page (not just to me), and if so, let the link slide, though with a "rel=nofollow", of course smiley for ;)

But after your extra information, I may just neuter it, or point it somewhere else, DisneyLand, perhaps.

I'm not going to comment on my longtime "fan" RS, other than to say you shouldn't form an opinion based on a flamer's flames.

If you find anything on my site that is yours let me know and I'll credit you, but I don't think you will as I don't lift stuff so much as experiment with stuff and post my findings.

Cor, you may want to check out my "Favorite .htaccess Links" section on my Ultimate .htaccess Tutorial and take a second unbiased look at my PR5 content.
;)


~AskApache


corz - 20.08.08 2:10 pm

I form opinions based on what I see. I've already seen your site, and I've no immediate plans to return. Also, I didn't keep notes.

Here's a free tip, though. If you are going to drop links on other pages, first ensure they work - I prefer not to link to error pages. Then I might even consider adding them here for real, as opposed to simply deleting them, which I have.

Here's another tip: spend more time on your work, and less time worrying about other people's opinion of your work. It makes for better work.

Experimentation is great, and I'm all for it; but real people find lab notes confusing. Transforming them into actual learning materials takes more than just typing. Regardless of what Stalin said, quality beats quantity every time.

;o)


corz is awesome - 13.09.08 4:33 am

this is a great site, best tutorial I've read on htaccess. Keep up the good work =)


ChristinaSeay - 16.09.08 2:26 am

I've looked around your site and many others... thank-you so much for such an easy-to-read explanation of .htaccess files.

I have a, hopefully, easy/quick question that I can't seem to find an answer to...

Do you know of a way to use a .htaccess file to redirect every request to a server to a new URL?

I am transferring my site from one host to another and I've completely set up my site on the new host on a new domain and while the old domain is transferring to the new host, I would like to redirect my visitors to the new domain and site so I don't have any down time.

Any thoughts.. ??

Yeah! Check out page two! ;o)



ChristinaSeay - 17.09.08 12:23 am

THANK-YOU THANK-YOU THANK-YOU!!!! I read that page but for some reason before it just went right over my head last night! Thanks again... :o)




ChristinaSeay - 17.09.08 12:31 am

Oh... I've just thought of something I don't know how to go about testing... I'm using:

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.+)\.php$ http://www.myURL.com [R=301,NC]

To redirect my old site/domain to the new site/domain on the new server... but while I'm physically transferring (not redirecting.. literally taking my domain hosting from tierhost to 1and1) and the domain is in limbo... will a query on that domain still initiate the redirect even though the domain is in limbo if the .htaccess file is on the old server?


corz - 17.09.08 12:52 am

If your TLD (Top Level Domain) name isn't changing, you don't need to do anything; let the DNS system take care of it - requests with old DNS information will still access the old host, right up until their DNS server gets up to date (usually within 24 hours), then they'll get the new host. Simply leave the site (the physical documents, etc.) on the old host for a few days until all the browsers catch up - so long the site exists in both places, the transition will be seamless - I've switched hosts a few times without anyone noticing.

But if you really are moving the site to a whole new TLD (e.g. "olddomain.com" -> "newdomain.com"), then your rule will do the job fine, though personally, I'd do..

RewriteRule . http://www.myURL.com [NC,R=301]

Which sends every request to your new home page.

;o)

p.s. any more rewrite questions, please, page 2!


B!n@ry - 27.09.08 7:41 am

This is a wonderful tutorial, and I would like to know if I can translate it to my language?

Best regards,
B!n@ry

Yes! So long as you aren't looking to translate part 2 (mod_rewrite tips & tricks) into French - we already have an excellent human French translation of that article, here and here. ;o)



Mohangk - 28.09.08 11:09 am

Just want to say thanks for this excellent guide! Provided all I needed to solve my issue.


B!n@ry - 30.09.08 1:51 am

Hi cor,

Thanks for your accept, actually I shall be translating it to Arabic. I shall send you the file as soon as I finish, so that you can use it here if anyone needs it.

Thanks again, and wish you keep these tut. moving on smiley for ;)


Numz - 05.10.08 7:35 am

Excellent, been surfin' a few hours looking for a clear and concise page to save.
(and ftr I just came from askapache and gave up...too much blather and ego) smiley for :roll:

Anyhoo, thanks and well done, took longer to find a freehost that allows .ht and mod_re

~Numz~


JD - 05.10.08 6:18 pm

Speaking of askapache, I see he removed his glowing link to corz.org after you gave him a dressing-down! Rather childish.

-JD


corz - 06.10.08 11:01 am

smiley for :lol:
Guys!

Thanks for caring, really, but that's enough now! smiley for :ken:

Numz, feel free to save away, but worry not; this page will be here for a long, long time. At least until 2014!

B!n@ry, an Arabic translation sounds great! Your command of English is as strong as most of the posters here, so I am looking forward to something highly useful from you!

You imply that you won't be hosting it anywhere. That's not a problem; I can host it around here. But if you are hosting it somewhere, let me know; I'll link to you.

Good luck!

;o)


rob - 13.10.08 12:16 am

I have a membership website. If I have a folder on my website with, for example, ten files in it, and the folder is protected by htaccess so I can allow only members access to the ten files, is it possible to htaccess each individual file, to allow my customers to buy and access only one file in the folder, without gaining access to all ten?
thanks Rob.

Yes, it's possible, but would be extremely long-winded in practice. Fortunately, such things are quite easy with most server-side languages, such as php. ;o)
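
Long-winded, as in one <Files> block per file; a rough sketch of the flavour, with invented file and user names (the folder's usual AuthType/AuthName/AuthUserFile lines stay as they are)..

<Files "file-one.zip">
  require user customer1
</Files>

<Files "file-two.zip">
  require user customer2
</Files>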



arkid - 16.10.08 9:21 am

hi all

i have a very simple question.

i have a site with a root .htaccess file that has one line in it,
ErrorDocument 404 /errorhandler.php

I've got a custom-written error handler page here. That's great and works well throughout my site.

The issue is that this influences 404 errors in all subfolders throughout the hosting folder space. Now there's one folder where I don't want my custom error messages to show, but the normal ones instead.


So what do I do to either

1) make a specific folder have a custom .htaccess that says "ignore all higher folders properties"
or

2) use some ErrorDocument clause that tells it to revert to using the default system error document.

Can anyone suggest a way of doing this!??!




aconana - 26.10.08 8:16 pm

Merci beaucoup for this great intro to htaccess! I hope you haven't achieved too much "htaccess exhaustion" to field a couple of (probably naive) questions.

•The discussion of <ifmodule> refers to mod_php4.c (implying version 4 of PHP), while the "wee" selection of code in the "more stuff" section refers to x-httpd-php5 and php5-script (i.e. implying version 5). Do these names actually imply versions, or are they just standard monikers?

They explicitly state the version of php the section will apply to ;o)
.

•In the discussion of <Files> versus <FilesMatch>, both code examples use regular expressions - ^\.ht and \.(css|style)$ respectively - so I am confused by the statement that <FilesMatch> is preferred "...mainly because you can use regular expressions in the conditions".

Neither example uses regular expressions (only basic wildcards), though <FilesMatch> could, if you wish. ;o)


Many thanks!


Fahed - 30.10.08 8:07 am

First of all, many thanks for this tutorial.

I have 2 htaccess related goals which I need help with and could not find the answers any where else.

1. I would like to remove the extension on all php files so that /something.php becomes /something

This would allow for files to become directories and provides a tiny bit of security through obscurity. To achieve this, htaccess would have to...

- Check if there were a file called /something.php and load it.
- If not, check if there were a folder called /something/ and load that.
- If not, trigger a 404.

2. I would also like to know how a single 404 file can be used by all subdomains.

Right now, the root htaccess file says:

ErrorDocument 404 /404.php

However, if a 404 is hit in a subdomain, it looks for subdomain.example.com/404.php and finds nothing. Any ideas?

Many, many thanks in advance for your help.

(p.s. If your site doesn't notify me of a response, may you be kind enough to let me know that you responded.)


corz - 30.10.08 9:45 pm

Fahed, the answer to part 1, is in part 2 of this two-part series of articles; the part about mod_rewrite - click the chevron-looking link above these comments.
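
(For the record, the general shape of that rewrite would be something like this - untested: if the request isn't an existing file but a .php version of it exists, serve that; otherwise folders and genuine 404s behave as normal..

RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]

..but do see part two for the details and the gotchas.)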

As for part 2 of your question, try using the full url in the subdomain's .htaccess files, e.g..

ErrorDocument 404 http://maindomain.com/404.php

I really must add a link to the Apache core features documentation in the useful links section! (also above these comments) There's a section about this in there, I'm sure.

You could also do it in the file system, using a symlink to a common document, or better yet; folder of the various error documents.

;o)

p.s. nope, I expect you to come back of your own free will, checking every hour in desperation! :evil:
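For the impatient, a minimal mod_rewrite sketch of part 1 (the precedence differs slightly from the wish-list above: existing files and folders always win, anything else gets the .php treatment, and whatever remains falls through to a regular 404)..

RewriteEngine On
# not an existing file..
RewriteCond %{REQUEST_FILENAME} !-f
# ..not an existing folder..
RewriteCond %{REQUEST_FILENAME} !-d
# ..but the same name with .php tagged on IS a file, so serve that
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.+)$ $1.php [L]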


Sugar - 31.10.08 12:43 am

Hi, i find this article very useful but i have a doubt; please can you help me out with this?

i am working with lion framework and want to alter the .htaccess file in the lionDemo directory so that
http://localhost/LionDemo/index.action

and

http://localhost/LionDemo/ produce the same result.

i tried adding 'DirectoryIndex index.php' in the htaccess file but it's not helping.

How do i get that to work?

please advise


Surely you meant..

DirectoryIndex index.action

;o)



Daito - 03.12.08 2:17 am

Excellent guide thanks for sharing. Always something new to learn.
Thx
Peter :D


Dex Barrett - 11.12.08 7:48 pm

Nice tut man, just what i need to flatlink my site. Keep it up!


kushal - 23.01.09 11:19 am

i need some information about .htaccess...

i don't want to show my images folder, where anyone can check that folder...

www.domain.com/images..

if anyone types this path it should show an error..

i need that code

The code you need is on this very page. ;o)



A - 08.02.09 1:16 pm

I would like to stop users from directly viewing an image on my server, but still have the image appear if it is called from the code on an html or php generated web page. i.e. I would like an error to appear if a user types in http://www.mydomain.com/my_image.jpg, but to have the image appear if it is called from the code of an html page, like http://www.mydomain.com/my_image_with_text.html.

I found .htaccess code to prevent "hot linking" at http://www.javascriptkit.com/howto/htaccess10.shtml

It seems to work fine when I create a page with a link to http://www.mydomain.com/my_image.jpg - in other words I receive an error - (the address even shows up in the browser address field) but when I type http://www.mydomain.com/my_image.jpg directly into the browser address field the image still shows up! I tried clearing my cache thinking the image might have been residing there, but still no luck. Any suggestions?

Thanks in advance!

See part two, about rewriting. Link above these comments. ;o)



A - 08.02.09 6:56 pm

Thanks for the above!

I used:

Options +FollowSymlinks
# no hot-linking
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com/ [NC]
RewriteRule .*\.(gif|jpg|png)$ - [F]

and put the .htaccess file in the sub-directory where I wanted to protect the jpg files from direct access.

In other words I removed

RewriteCond %{HTTP_REFERER} !^$

from the original script.

I tested it and it seems to work fine, i.e. when I type http://www.mydomain.com/my_image_sub_directory/my_image.jpg into my browser I get an error indicating there is no access to the file (.jpg), but when I call my_image.jpg in the context of a php page that resides in another sub-directory, the image appears on the loaded page just fine.

Just wanted to check if there was something I was missing, or if there are any drawbacks to this method - more importantly, that my images are securely protected from being accessed by any means other than through html or php pages from my web site, housed on my hosting provider's server. (obviously not including ftp)

Thanks again! :)





corz - 08.02.09 7:33 pm

Two things to note:

1. Some users will have referrers disabled altogether. Because you have removed that line, they won't see your images, regardless of whether or not they load your page properly. That's either a big problem, or no problem, depending on your users and/or point of view.

2. There is no way to completely prevent direct access by this method. If someone was determined enough, they could easily load your images directly (by spoofing the referrer header, which is a trivial matter with many web clients; download managers, browsers, etc.).

If you want to completely block direct access, you need to put the images in a folder with zero outside access (e.g. deny from all), and then use passthru(), or similar, to feed the images to the browser, like this page does.

;o)
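A bare-bones sketch of that approach; the "protected-images" folder name and "image.php" gatekeeper are invented for the example, and the folder itself holds nothing but images and an .htaccess containing "deny from all"..

<?php
// image.php?name=photo.jpg  (this script lives OUTSIDE the protected folder)
$name = isset($_GET['name']) ? basename($_GET['name']) : '';  // basename() strips ../ tricks
$path = dirname(__FILE__) . '/protected-images/' . $name;

// ..your own "is this visitor allowed?" check would go right here..

if ($name == '' || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: image/jpeg');            // or work it out from the extension
header('Content-Length: ' . filesize($path));
$fh = fopen($path, 'rb');
fpassthru($fh);                                // feed the image straight to the browser
?>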


Henry - 19.02.09 9:55 pm

i have a problem here...

this is for the intranet; my manager wants my intranet to be on the internet.. but with security.. firstly, i was thinking of using iis6 with AD configured so people outside the LAN can connect and use their username/password to authenticate, and if they are inside they don't need to use anything, they will just go directly..

my problem is, i guess.. in linux.. is there a way i can say.. all of 10.0.0.0/8 can view and don't need to use the username/password, but if outside, load the .htaccess auth?

i hope this makes sense, thanks

I have no experience of "iis6 with AD", I didn't even know you could run them on Linux. As to a more general solution, I would suggest running the two sites on completely different servers, and have your firewall sort it out, or at least having a special intra.domain.com subdomain where you can split the configurations more logically. Also remember, you can use .htaccess to check for the REMOTE_USER and AUTH_TYPE server variables, so you can, at least with Apache, know if a user has or hasn't logged in.

Check out part two for some mod_rewrite tricks which would enable you to roll together more functionality; check their IP block, redirect all requests from outside, and more. ;o)
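As a rough sketch of the "no password on the LAN, password for everyone else" idea, in Apache 2.2-style .htaccess (the realm name and paths are placeholders)..

AuthType Basic
AuthName "Intranet"
AuthUserFile /path/to/.htpasswd
Require valid-user

Order deny,allow
Deny from all
# the local network gets straight in..
Allow from 10.0.0.0/8
# ..everyone else must satisfy the login requirement instead
Satisfy Any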



John - 28.02.09 3:56 am

This is an excellent guide, but I have a problem and can't find a solution for my issue. Please can you help me out with this?
I wish for all pages in all my folders to be viewable and indexed by Google
http://domain.com/folder1/pages.html
http://domain.com/folder2/pages.html etc.
but I don't wish the pages in the root folder to be available, and I don't wish Google to index them
http://domain.com/pages.html
except
http://domain.com/index.html
and
http://domain.com/about-us.html
Can it be done with .htaccess?
Thank you very much smiley for :)

There's no need, you can do it all in robots.txt, a file specifically purposed for this. Check this out. ;o)
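A rough robots.txt sketch for that sort of layout; note that "Allow" is an extension to the original standard (Google and the other big crawlers honour it), and robots.txt only discourages crawling, it doesn't actually block access..

User-agent: *
# the two named root pages and the folders are fine to crawl..
Allow: /index.html
Allow: /about-us.html
Allow: /folder1/
Allow: /folder2/
# ..everything else in the root is off-limits
Disallow: /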



Vijay - 06.03.09 10:57 pm

Excellent information.
Here is a big request:
Besides the Apache documentation, I wish you would add a section regarding the httpd.conf file (Apache on Windows) and its directives.

I have, so far, intentionally avoided giving information on httpd.conf (or whatever you choose to call it), thinking that anyone competent enough to want to configure Apache will be familiar with its documentation. This document is aimed more at web masters than server admins, though of course, many are both. Having said that, most of the stuff here will happily work in any context; .htaccess is just where most people would be putting it. ;o)



Somename - 13.03.09 8:41 pm

This article was very helpful to me, thanks. *Bookmarked*


sxe - 18.03.09 9:54 am

Sweet, just what I needed. Thanks smiley for ;)


Niladri - 23.03.09 11:51 am

Full tutorial of .htaccess :lol: :lol: :D


svetikshark - 24.03.09 8:19 am

Have problem - made "Allow from my.IP.In.here". First time open page works fine, after refresh - writes access denied. Same with calling from other pages - link itself works fine, call from other page - access denied. Thx

Your question does not contain enough good information to form any reasonable hypothesis. ;o)



Andrei Cimpoca - 25.03.09 10:21 am

WOW! What a beautiful site! It is one of the most interesting and probably the most original design i have ever seen.


Andrei Cimpoca - 25.03.09 10:44 am

read your contact page and found a link for http://www.wholisticresearch.com/ecom/index.php. It was just what i needed. http://www.wholisticresearch.com/shop/home/m/Shop/c/20/. this section was especially helpful especially if you live in a country of colonists and want to take it to the next level. Another thing.. if you know of such a thing, i need to buy a book for brain surgery in just 3 weeks and why you shouldn't give yourself a neck operation. Please help.


User - 10.04.09 1:16 pm

This article was very nice and helpful to me, thanks.


Pankaj Pandey - 28.04.09 5:22 pm

Nice article, Can I post this article on my website?

No, please don't. Feel free to link to this page, though, with perhaps a wee excerpt. ;o)



Your Friend - 06.05.09 10:51 am

# no one gets in here!
deny from all

Will the above directive prevent access to sub-folders also? :ehh:




Jay - 22.05.09 3:25 pm

I spent a couple of hours going over .htaccess etc., and I still am not able to figure out how to do this:

I want to prevent PHP from reading a file like this:
c:\somefile.txt

But no luck after 2 1/2 hours - Any help would be appreciated.

See Here. ;o)



Juha - 26.05.09 10:07 pm

Hello there!
If i use the "deny from all", would <img src="includes/image.jpg"> still work?

If you put the deny all inside the "includes" directory, then no, they would not work. ;o)



BU - 06.06.09 2:28 pm

Great article! But I have real trouble reading yellow (?) letters on a yellowish background. I know that in theory I can change it myself, but any chance that by default we get higher contrast colors? This seems a page I will be visiting and referring to quite often.

There are already other themes; there are simply no controls to access them. Try adding "?theme=blues" onto the end of a corz.org URL, and you'll see! (btw "oranges" gets you back to the regular theme)

But aside from this sort of thing, no, I have no plans to increase the contrast or any such thing. If people have visual impairments, they need to take matters into their own hands (I use "Read Easily", in Firefox). That is not something a web designer should be thinking about, unless he is designing sites for the blind. ;o)



HughJass - 09.06.09 12:22 am

Yep, I actually *read* the disclaimer... what a novelty! smiley for ;)

Purely a comment, not a futile beg for help.

The PHP switches and flags you're showing above are HIGHLY dependent on your hosting provider's settings, and not all of us have access to them. Trying to flip a PHP or Apache switch that you're not allowed to touch generally ends in a 500 FATAL ERROR page for your viewers. Oops. I flipped one of the switches in an old .htaccess with one host, and didn't know for a week that the site was stone dead because the host suddenly decided to not allow us to disable register_globals. I eventually moved to another host because those jokers couldn't understand why I wanted basic security established for my site. <sigh>

Merely a word to the unknowing or unwise...

I do mention this! I too have had providers switch out functionality on me in the past, and had the site go down, even completely off-air. But I expect that aside from these unforeseen events, there's not usually a problem; people tweak, test, tweak, test, and so on. Extreme caution keeps most folk safe.

In fact, if anyone playing around with their live .htaccess isn't shitting their pants, then they are probably doing something wrong! ;o)



Lester - 19.06.09 5:09 pm

Hi. I have a hidden directory which I've password protected. This .htaccess file works fine:
AuthName "Restricted Area"
AuthType Basic
AuthUserFile /XXX/YYY
AuthGroupFile /dev/null
require valid-user

However, some of these sub-directories and files are "hidden" -- they start with "." and I cannot see them. What can I add to the .htaccess file above to permit this once I log in?

Thanks.

Lester

It depends "how" you want to see them. If you want invisible files to display in regular directory listings, you can try playing with the "Options" (see my example .htaccess), but I suspect this may not override the server settings.

Perhaps a better way would be to use some kind of php script, or similar, to access the hidden files. It won't be limited by any server settings further up the tree. ;o)



Chris - 23.07.09 6:46 pm

When working with Error Documents, can you capture the requested url with htaccess to change the redirect page?

Our company is in the process of translating our website into Chinese, and so far maybe 70 out of 500+ pages have been completed - eventually all pages will be translated so I do not want to rewrite links only to have to change them back later...

I thought maybe I could use Error Document 404 to capture the url when a visitor clicks a link to a page that has not yet been translated and redirect them to the English version of that page - Can this be done with htaccess? Or will I have to resort to something like a php header redirect script in my 404 page?

Thanks Corz - You Rock!

Sure! Download my super-cool 404.php (link at the top of this very page) and you'll see *exactly* how to do it! ;o)
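For reference, the guts of that approach are tiny; a hedged sketch of a 404 handler doing the fall-back-to-English dance (the /zh/ and /en/ path scheme is invented purely for the example)..

<?php
// 404.php  --  wired up with:  ErrorDocument 404 /404.php
$uri = $_SERVER['REQUEST_URI'];                    // the URL the visitor actually asked for

// not-yet-translated /zh/page.html  ->  bounce to /en/page.html
if (preg_match('#^/zh/(.+)$#', $uri, $m)) {
    header('Location: /en/' . $m[1], true, 302);   // temporary, until the translation exists
    exit;
}
header('HTTP/1.0 404 Not Found');
echo 'Page not found.';
?>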



Florida SEO - Edward Beckett - 03.08.09 5:55 am

Nice ... smiley for :)

This has got to be one of the coolest sites I've seen in months ...

You Rock ...

If you posted more [a hem ...]

I'd definitely aggregate your posts on SeoMasterList

SEO's ... need you ... smiley for ;)

Ciao ...

Florida SEO

EB

I wondered when you guys would spot I'm top of Google for everything! ;o)



George Garchagudashvili - 16.08.09 9:28 pm

very helpful tutorial thanks smiley for :)



Jorge Andrés Cañón Sierra - 18.08.09 11:57 pm

Hello smiley for :cool:

I want to know if it is possible to enable php_openssl.so in an .htaccess file. :roll:

Regards smiley for ;)

Afaik, No. ;o)



no idea about htaccess - 05.09.09 5:05 pm

If you do not want to show files/folders in Google search then you have to block google bots in the .htaccess file.

how is this done? I have a main website and then have several others that are under the same hosting.

but google has indexed my site by the primary domain, then showing the secondary,
i.e.: felixgato.com/otherwebsite.com/file name
Is this correct? how is this changed?

Thank you

There is a file specifically for this sort of thing, called "robots.txt". Google it. ;o)



homer - 13.10.09 7:05 pm

Hi. I've come across this "article" and this sounds interesting, but I have been trying to do something on my site, which is:
I have a pictures folder, and I want the pics to be displayed in the site, like using <img src=...>, but i don't want people accessing my folder directory and viewing all the pics that are there. How can this be done? Thanks.


I have a Problem - 02.11.09 7:58 pm

http://www.example.com/somefiles-you-like.html this is my url, but if in the browser i change the url to http://www.example.com/somefiles-you-like-a-lot.html, the browser shows the content of the first url http://www.example.com/somefiles-you-like.html

Why this ?


Casey - 02.11.09 10:29 pm

Your site is by far the most organized that I've found, and I am hoping you might have an answer to a very strange issue with mod_rewrite.

I am using a Macbook Pro, running OS X 10.6 (Snow Leopard). I've gotten mod_rewrite to work, but not as intended. I've tried to correct the error using a virtual host, success with the virtual host, not with fixing the problem.

When I make htaccess files on my localhost, they do not appear to recognize their relative directory.

Example:

I try to rewrite alice.html to bob.html, but I get an error saying bob.html does not exist. In the directory localhost/. But both the htaccess file and bob.html are in localhost/~Casey/Test. I was able to circumvent this by either including a full path from localhost (RewriteRule ^alice.html$ /~Casey/Test/bob.html [L]) or by setting the path manually in RewriteBase.

I thought perhaps using a virtual host would resolve this, but nope. I mean, if I point it straight to /~Casey/Test/, yeah, the htaccess file in /test/ works, but anything in a sub-directory beyond that has the same issue.

After trying just about every variant of configuration I could think of to resolve this, I am now completely rebuilding my apache .conf files, and I have learned more about Apache2 than I had ever hoped to know.

If you happen to see this post and know the fix, I'd greatly appreciate it.

Sincerely,

- Casey

PS - Snow Leopard comes with the packages pre-set, so the latest version (apache2.2 I think), and a full set of modules.


russ - 21.11.09 6:13 am

Hi Homer,

I want to protect my image folders on my website too. Simple protection is all I need......family code word. Do you have some script yet?


russ


globalinternethosting.com - 27.11.09 9:42 pm

excellent quick reference. too many things to keep in the head today. the key to success today is REFERENCING. remember.... never give WRITE privilege/permission for .htaccess to OTHER! Thanks guys.


mike - 02.12.09 6:28 am

Thank you very much for the instructions.
I have a question:
Can I redirect visitors from a certain IP to a different page, on the same site?
If I use "deny from..." and then put
ErrorDocument 403 /403.html
in the .htaccess file, it won't work, because when it tries to open 403.html it is denied also.
Any way to give visitors from such an IP range a special page?

Thank you.




Jorge Alves - 14.12.09 3:12 pm

i have two folders:

/main
/main/files

i have one file in /main

index.php

i have lots of files in /main/files

something.png
something.css
something.js
something.java
something.php

well, i need to use all those files in index.php with image tags, script tags, a java applet tag, php includes, etc... but users can't be allowed to access the /main/files folder or any of its content, or download those files. how can i do that?

thanks.


CjKun - 07.01.10 3:16 pm

When using redirect, is there a way to get a variable for the HTTP_HOST so you don't actually have to type http://yourdomain.com/new_file.html on the last parameter?

My case is that I have to implement the same on two different subdomains, and I want to be able to use the same htaccess file in both so I don't have to write one for each.
For instance,

Redirect /pages/oldfile.html http://sub1.mydomain.com/pages/newfile.html and
Redirect /pages/oldfile.html http://sub2.mydomain.com/pages/newfile.html

Can I just get the variable of the HOST?


arjun - 18.01.10 12:00 pm

i want some hacking scripts to test our web sites


Simon - 19.01.10 7:05 pm

you rock! hehe

I run a fashion website, and use Godaddy Windows web hosting..
But I got hacked today, so I really need to ban some IPs to prevent this happening again... I used an .htaccess file, and put the IP there, uploaded it to my website, and nothing happened; I used my own IP to test...
Could you please tell me what happened? please help/// An email to me would be great... Thanks soooo much!



Sujith - 20.01.10 12:48 pm



Well, I need a rewrite rule for the following requirement


my web pages have several links with different query strings; my case is, if I click any link in that page, it must redirect to the respective page with a defined variable/query string

i hope the following example will help you to understand

Link: http://myweb.com/index.php?module=mine&page=login

After clicking the link it will be like

http://myweb.com/index.php?module=mine&page=login&VCODE=777

is this possible in .htaccess? please let me know, thanks in advance, expecting your positive reply

Best regards,
Sujith



Akos - 21.01.10 4:42 am

I am programming in Java and C, reading lotsa O'Reillys, manuals, codes, code-sites etc.
This is one of the best technical sites I have ever seen!
It is detailed, to the point yet still friendly.
Perfect.
I reckon you are better than O'Reilly.

Ever thought of publishing?

Pls. keep up the excellent work,
cheers and thanks,

A


alok - 09.02.10 1:41 am

I am a website developer; i want to know how we can use .htaccess with PHP
code....
I don't have knowledge about .htaccess




dk - 13.02.10 2:38 am

the great breakdown of htaccess i was looking for.
foxmarked smiley for :)


Ram Krishna - 04.03.10 4:16 am

Hi

I want to define a rule for the zend framework's .htaccess file; that is, when the url is http://www.example.com/unichat/... then it should not be parsed as controller and action. I want it processed as core.


Andrew - 10.03.10 9:54 pm

Wow, Thank you for the amazing tips!
I spent so much time trying to find the detailed information on .htaccess, and I wish I landed here first.
All I wanted to know was how to deny the access of random visitors on the directories.
Thanks once again!



TomBo - 24.03.10 4:11 am

Congrats on doing this nice page. However, it hasn't helped me... yet!

I have a dedicated server on NetDepot.com or really Gnax.net, running Linux and Apache 2.x with PHP 5.2.9. Trying to move my large years-old site onto this new server, but can't get the new machine to include php files when it serves my basic middle-of-the-page html pages that use php to serve up sidebars, headers, footers, etc.

Of course, PHP 5.2.9 breaks the old AppType and Application declarations in my htaccess file that have worked very nicely for years on other servers using PHP4.

I've tried about every combo of AppType and AppHandler declarations in htaccess you can imagine, using advice from every site showing up in Google for searches like "htaccess parse html php include in php 5.2.9" and so on. Spent a couple days on this.

Doesn't seem to be anyone who can say what will actually work for PHP 5.2.9 to get it to parse through my html pages to include my php headers, sidebars and footers that are on all million plus pages of my site.

Can anyone help me? Thanks. TomBond30 at Yahoo.com.


Turbo - 30.03.10 3:22 am

this is the most super-rad .htaccess explanation I have found yet. awesome job.


Marc hankin - 15.04.10 7:06 pm

Thanks. Much appreciated.


Marc


MArtin - 05.05.10 1:25 am

I agree - very helpful resource, the best I found - thank you.
If there are even more hints, especially names and commands against hackers - go ahead, please.



Mehu - 06.05.10 12:26 pm

This is Good Script


manish kumar - 06.05.10 2:40 pm

I want to convert a php file (ex.php) to an html file (ex.html) in the address bar; how is that possible?


daves - 20.05.10 12:08 pm

Very nice article.
Keep it up.


ken - 08.06.10 3:22 pm

im not sure for how long this article has been up on your website, but today it has helped me and i just wanted to say thank you

a) a long time, and b) appreciated. ;o)



Jody - 18.06.10 6:38 pm

I hear and I forget. I see and I remember. I do and I understand. - Author unknown

A gifted teacher is as rare as a gifted doctor, and makes far less money. - Author unknown

You are a good dude. - Jody


jarav - 24.06.10 9:41 am

Could you please add how .htaccess could be used for cache control ?

I use it for this myself; it's very likely I'll get around to adding this at some point. Thanks for the kick. ;o)
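In the meantime, a minimal sketch of the usual mod_expires approach (wrapped in a condition so it won't break hosts without the module; the lifetimes are just examples)..

<IfModule mod_expires.c>
    ExpiresActive On
    # images and styles rarely change, so let browsers hang onto them for a while..
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
</IfModule>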



Sandesh - 01.07.10 10:19 am

I have started a blogging site recently. It is hosted on godaddy windows. Godaddy doesn't support the .htaccess file? i have tried a couple of times uploading and testing the compression stuff but no effect

I've said it before; Godaddy hosting = teh sux. You cannot use .htaccess with a regular godaddy hosting plan (they do reseller type accounts, too, but way more costly). Get a decent web host. ;o)




Jack - 05.07.10 4:46 am

Thanks for this superb article. It helped a lot. All in a nutshell.
Keep the good work up!! smiley for :)


dante - 21.07.10 3:53 am

Hi, can you create an alias in an .htaccess???


shaffy - 10.08.10 8:44 am

This is Good Script


Tom - 17.08.10 5:30 am

This is Great. So much good, useful and well written information and entertaining too! I appreciate the time you have spent on this. It is a big help for me. - Thank you!



siva - 21.08.10 2:09 pm

this site is nice and very usefull. i liked it.


Chris - 28.08.10 7:55 pm

smiley for :eek: awsome, thank you


Jomadar - 31.08.10 2:21 pm

You have just cured my 2 day headache smiley for :)
Great Thanx!


KazaJhodo - 31.08.10 6:44 pm

Excellent .htaccess article, one of the top- if not the best for sure. Interesting, informative and correct- can't ask for more than that. Should seriously send the guy a few bucks, maybe he'll write something else super useful.


webguy - 23.09.10 10:56 am

Hi.
Thanks for these two very informative articles and taking the time to explain what is a pretty complex topic on web management. Well written, clear and concise.

Mod_rewrite is difficult to grasp quickly. I am in the process of having to share a site I built with another web master. This was unplanned and sudden. I have some pretty complex PHP code that I don't really feel like sharing with the other person. It took way too much time to develop to just give it away so the other webmaster can market it to their customers and clients. Unfortunately, my customer (the domain owner) will insist that I work with the other web master and wouldn't understand fully why I would want to prevent full access to the site. They just want the site to run smoothly and don't really care how many countless hours were involved.

Mod_rewrite rules will allow me to assign a folder to the other webmaster for the main site content, while simultaneously allowing me to park most of the PHP and database tools in another folder. I'm hoping to have both folders appearing to operate from the root. I haven't executed this task yet, but it looks doable based on your article.

Thanks again for taking the time to write it. It was very helpful.





Kavita - 07.10.10 2:39 pm

My blog is hosted on Godaddy and I want to hide the php extension for my php files in the rootdirectory\go\ folder. Where do I find the httpd.conf file? I cannot find it anywhere, to make .htaccess enabled if it is not already. What to do?

I guess I should add that Godaddy hosting DOES NOT ALLOW .htaccess, maybe in the FAQ section, it does seem to come up a lot. Of course, first I'll need a FAQ section... ;o)



Mark Trail - 14.10.10 10:21 pm

smiley for :cool: Thanks, this is a great tutorial!


Ragha - 03.11.10 10:29 am

how to allow one type of files to browse eg : only .php or only .html etc ?

Pls help

Remove the other file types! ;o)



Perangi - 16.11.10 9:29 pm

If, in the 'photos' folder of my site, some pictures are missing, they are replaced by some other pictures; e.g. file f28.jpg, which is absent, is replaced in the browser by f8.jpg. Can you tell me how to turn this off?

Disable mod_negotiation. ;o)
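If it's the content-negotiation "best guess" behaviour (MultiViews) at work, one line in that folder's .htaccess is usually enough..

# stop Apache guessing a "similar" file when the requested one is missing
Options -MultiViews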



xvader - 06.01.11 6:21 am

perfect! i'm a new beginner in web development, and this .htaccess guide really helps me out. thanks and keep up the great works! smiley for :D


dynamind - 05.02.11 12:02 am

I'd just like to say thanks + give you a virtual hug for the (not only in my eyes) best ever .htaccess guide'n cheatsheet on the internet. best regards : ) you did a great job with this site smiley for ;)


mr1989foster - 25.02.11 4:13 pm

Thanks so much!

Am a new webmaster and this has made about ten of my jobs 10x easier!

Thanks again,
mr1989foster.


pezzo - 11.03.11 9:52 pm

with:

php_flag display_errors on


...you saved my life. THX!!!


Rajashekar - 29.03.11 5:43 pm

Nice site, it's helpful a lot to beginners.

I appreciate this Site.


jeff - 31.03.11 6:57 pm

Hi;
thanks so much to your tutorial. I have a question

1. I have a lot of IPs that i deny access to my pages, and i end up modifying the .htaccess. Is there a way to
write the banned IP addresses into a file and have the .htaccess read from that file?
2. if I restrict access to a directory based on IP address, is there a way to make sure that the remote IP address is really what it is?

thank so much

jeff

Check out this. ;o)



LP - 09.04.11 5:12 am

I have yet to see a web site that does the "perfect" job of explaining the ".htaccess" file in general, and how to write or modify it in particular.

Having said that, yours does come the closest I've yet found to actually being useful... to me.

I realize that the above sounds like a poor or "back-handed" complement, but it's not meant to be.

Everyone has different ways of learning any given topic. For _ME_ finding out more about .htaccess has been a challenge, and you've done the best job so far. Keep up the good work. Don't settle for how it looks now, keep on polishing.

Thank you, LP


micmic - 21.04.11 9:28 am

I have a question you probably haven't seen before: Does godaddy support .htaccess ? smiley for :D

Seriously, which would you consider a decent host these days ? I know that these things change quite quickly...

Yes, these things change all the time; companies come and go every day, literally, and I'm out of the loop because I've had years of prompt, personal service from LVCS; a fully-featured hosting company with a knowledgeable support staff that act quickly when needed, actually caring about, and for, their clients; I've had no complaints.

By the way, I don't get a commission, though by all means tell Ed I sent you, I might hit him up for a few free GB next year! ;o)



wasim - 22.04.11 8:14 am

THANKS for sharing this tutorial; it eradicated my confusion about .htaccess, and through your great tutorial i also learned how to restrict files in a directory. thanks again... :)


jarodms - 06.05.11 3:12 pm

Hey this is an awesome write-up!!! Definitely bookmarking to re-use later.

Thanks!!!!


varun - 12.05.11 7:56 am

Nice post.. really helpful.... :D :D :D :D :D :D :D :D :D :D :D :D :D...


epc - 01.06.11 11:27 pm

Thank you for the great guide! Just a few questions - which you will probably ignore smiley for ;) : is there any official apache community-released documentation for htaccess which covers everything about it? If not, where did you get this information from? I hope it's not just from experience... in that case, man, patience is your highest virtue!

1. See the useful links section, above. 2. Both. ;o)



e07rdd - 06.06.11 10:58 am

Can anybody tell me how to retrieve the .htaccess file using an HTTP request from the server??

You cannot access .htaccess via an HTTP request, unless the server is seriously misconfigured. It would be trivial to knock-up a php script that could edit it, though. Of course, you would need to secure that script against unauthorised HTTP requests! ;o)



James - 06.07.11 1:25 am

Thanks, this has been very useful to me.
I'm not sure, but if you can believe Google on "how much is a pint in Aberdeen" I may have just bought you one.

Well, it might not be enough for an actual pint at the local pub (who has the time, anyway?), but it certainly is enough for a bottle of my favourite Czech Pils from the local shop! Cheers! ;o)



ghost2012 - 22.07.11 6:37 am

eh.....someone mentioned free virtual doughnuts.

Cool and very informative site. Thanks for the loads of info and the hard work you put into it.

You will never know! … smiley for :eek:
;o)



blades - 23.07.11 4:16 pm

Great information.

Do you know of a way to see if a user is logged in or not when they are outside of the htaccess protected directory using basic authentication. Without the apache login popping up. And I do not want to use cookies. I have tried many things, but had no success.

No, unless you are inside the authenticated area, the REMOTE_USER variable simply isn't set.

Why not place everything inside the authenticated area? They only need to login one time. And why not use cookies? They are generally transparent, and session-only cookies are trivial to work with.

Remember, HTTP is a one-shot affair; after the request has been served, it's no more than a line of data in a log. If you want variables to be retained, you need to save them somewhere. Remember also, basic authentication is just like a "cookie", except cookies can be way more secure; with basic auth the browser sends the full login details with each and every request.

Consider php sessions. ;o)
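A tiny sketch of the php sessions idea; one script inside the protected area notes the login, and pages outside simply read the session (the file names are invented for the example)..

<?php
// members/note-login.php  --  lives INSIDE the basic-auth protected area
session_start();
$_SESSION['user'] = $_SERVER['PHP_AUTH_USER'];   // only populated inside the authenticated area
?>

<?php
// any page OUTSIDE the protected area
session_start();
$logged_in = !empty($_SESSION['user']);          // true once they've visited the members' area
?>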



bobby - 20.08.11 2:54 pm

Hey,
I have a php file browser which shows the user's files - zip, jpg, etc. The php itself needs to be in the folder where files are uploaded, and i want to hide index.php and one more php file. How can i do this with .htaccess?

You really want to do this within the (php) code of the file browser. It may even have a preference to ignore files.

Also, check out this. ;o)



Bill - 16.09.11 12:23 pm

Thanks for the crash course in .htaccess. I feel like I've learned enough to be dangerous!


nayandeep - 04.10.11 2:33 pm

i want to know more about .htaccess.
how do i apply a block of code in an htaccess file to the whole website, to re-write urls? :roll:

Put the command in the root .htaccess file. The rule cascades automatically. ;o)



domainmonstrocity - 08.10.11 2:44 am

I am a newbie at Web Hosting and I very much appreciate these tips. Feel free to check out my website at the name above and add the .com at the end. I won't spam ya smiley for ;)


Simon - 18.10.11 2:20 pm

"Does godaddy support .htaccess ?"
My understanding is that they do, but changes to htaccess files may take an hour to take effect. This may only apply to virtual server hosting. not sure 18/10/2011


Powers - 27.10.11 10:26 am

First of all I want to say great site; I never even knew what htaccess was until I landed here.
My question is, when I put deny all into one of my folders, it works. The only problem is, the files in that folder can not be linked to my pages.

I have a CSS folder with all my CSS inside. I also have other folders for Javascripts and so on. I don't want someone typing mysite.com/css/style.css and getting a look at my CSS or javascripts. When I place "deny all", my pages outside the folders do not call my CSS or Javascripts.

Can you help please?

Thanks

The link you want is hidden away up there in the main article, in the "Control access.." section, but I'll re-post it here for your convenience. ;o)



Thanks - 01.11.11 8:11 pm

Just a quick one. If I have PHP enabled on my server will it still make sense to write the following line?

<IfModule mod_php5.c>

What does mod_php5.c actually check for? Whether the php5 module is loaded or not, and therefore whether PHP can run on that Apache server?
So my question here is: even if I use that tag and still get an error, is it because I am not allowed to change the value of that php setting in the .htaccess file?

btw Corz, I love your site. Your posts help me tremendously in understanding things so easily :)
Keep up the good work :D

This tag creates a CONDITION, like I said above, only if php is loaded will the commands within it come into effect.

If you have some wonky code that causes an error, putting it inside a conditional tag that is true (i.e. the php5 module IS loaded) causes the wonky code to run, which again causes the same error.

The idea of using a conditional section is to prevent code from running in inappropriate environments, for example, trying to tell a php4 server to set php_value date.timezone which only became available with php5.

And of course, none of this works if php is running as a cgi rather than a module..

;o)
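To make that concrete, a wee sketch; date.timezone is just an example of a setting that only exists from php5 onward..

<IfModule mod_php5.c>
    # only attempted when php5 is loaded as an Apache module
    php_value date.timezone "Europe/London"
</IfModule>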



Manohar - 01.12.11 5:07 pm

great concept


xxx - 01.12.11 11:08 pm

WTF!! :roll:


OH Fiddlestix - 13.12.11 3:54 am

I don't know how I found this site, somebody help me, how do you get back to the world of no answers that make any sense?
smiley for :roll:


Surojit - 16.12.11 9:40 pm

Hi,

I'm looking for a way to save wrong url strings, so I could register what wrong urls visitors are trying to reach.

some like this:
visitor tried: http://www.site.com/anywrongurl

htaccess redirects:
ErrorDocument 404 http://www.site.com/error.php?url=anywrongurl

error.php does:
saves url parameter into database and redirects to home page (or anywhere).

Thanks, congratulations on the page, and merry Christmas.
Ricardo

Yup, that's roughly how to do it (except you don't add ?parameters to the 404 ErrorDocument command; you get the 404 script to grab the URI from the incoming HTTP request headers, i.e.. $_SERVER['REQUEST_URI']).

If you get stuck, take a look at my own 404 script which does this and a whole lot more.

;o)
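A stripped-down sketch of that idea, logging to a flat file for brevity (swap the file_put_contents() line for your own database insert)..

<?php
// error.php  --  wired up with:  ErrorDocument 404 /error.php
$bad = $_SERVER['REQUEST_URI'];                  // the wrong URL that was requested

file_put_contents(dirname(__FILE__) . '/404-log.txt',
                  date('c') . ' ' . $bad . "\n", FILE_APPEND);

header('Location: /', true, 302);                // then bounce them to the home page
exit;
?>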



Robert Benson - 17.02.12 9:12 pm

In my .htaccess file I have:

ErrorDocument 404 " (message) "

What could possibly go wrong? Well, selected IPs are getting the standard "Oops!!!" page instead of above message.

What am I doing wrong? Thank you.

'standard "Oops!!!" page '? ;o)



Robert Benson - 12.04.12 5:19 pm

In my .htaccess file I want to deny access to all domains ending in .gov . Is this possible? How do I do it?

Thank you!

I don't follow. Do you mean deny access (to your web site) to people coming from .gov sites (referers)? Or deny access (to your web site) to people inside the .gov IP blocks? Or deny access to .gov sites from your own network? Or something else? ;o)



Robert Benson - 13.04.12 4:47 pm

Sorry - was not clear.

What I want to do is block access to all traffic from persons sitting at Government desks using Government computers. These would be domains, I'm assuming, that end in .gov .

So is there a wild card I can use in .htaccess that in one line will exclude all the domains ending in .gov ? This would be super helpful to me, because the onesy-twosy mode is getting tiresome!

There is another beer in this for you.

Thanks.

I'm not familiar with the onesy-twosy method, though it sounds like a lot of fun!

As for your .gov clients, YES, it is theoretically possible to deny access to folk inside those domains, you simply use:
Deny from .gov

HOWEVER, for this to work, you will need to have host lookups enabled. Theoretically, again, you can achieve this in .htaccess, like so..
HostnameLookups On

but in practice, that will most likely get you a 500 error. Try it and see. HostnameLookups is somewhat wasteful of server resources, due to the DNS query being performed before each request is processed.

If it's your own server, you can easily enable this in your main httpd.conf (or equivalent). If it's a shared server, the admins are unlikely to enable it for the above mentioned reason.

Failing all that, you can block the IP ranges associated with these domains, though with .gov, .mil, etc., this would be a huge list. Here's a slightly out-of-date version of that list..

http://www.totse2.com/totse/en/hack/understanding_the_internet/governmentowne170262.html

Here's where to get a current list of all the .gov domains, though without IP address information..

https://explore.data.gov/Federal-Government-Finances-and-Employment/Federal-Executive-Branch-Internet-Domains/k9h8-e98h

Clearly, translating all that to an IP database would be a huge effort, though not beyond the scope of a fairly simple program.

And if you are programming something, this would probably be best handled with php (or similar), doing DNS calls for all inbound requests and creating a black-list of any domains which match your criteria (.gov, etc.), somewhat like the latest version of Anti-Hammer does for referers.

Your php script (most likely used as an "Auto-Prepend") would consult its local black-list before performing lookups, and if an IP is there, no lookup need be performed, saving resources and bandwidth.

The advantage of a php-based solution is that you don't need shared server admins to do anything. Just code-and-go!

Have fun!

;o)
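A very rough sketch of that php auto-prepend idea; there's no black-list caching here, so every request pays for a reverse DNS lookup, which makes it a starting point rather than a finished tool..

<?php
// prepend.php  --  e.g. via:  php_value auto_prepend_file /path/to/prepend.php  (mod_php)
$ip   = $_SERVER['REMOTE_ADDR'];
$host = gethostbyaddr($ip);                      // reverse DNS; returns the IP itself on failure

// block anything resolving to a .gov hostname
if ($host !== $ip && preg_match('/\.gov$/i', $host)) {
    header('HTTP/1.0 403 Forbidden');
    exit('Access denied.');
}
?>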



zauber - 18.10.12 12:15 pm

can you break up a picture into dots and make it writeable,
so you replace each dot with the original color + some text?

how? where do i find info for such a task?
how do i break up a pic into writeable dots + original color,
so when I look at the original dots, from "ABOVE", i still see
the original picture??
can this be done? how? how hard? what do i need?
need to know what?
thanks much..

Sounds like Steganography. ;o)



Jim - 23.10.12 9:38 pm

I read your tutorial 3 times. Thanks for using layman's terms for us noobs, but I still have a question, if you don't mind expanding one of your tips.

You said: "Save bandwidth with .htaccess" I have reseller account on Hostgator with .htaccess enabled on WordPress blog. Will this work for me? If Yes, where in .htaccess do I place your snippet? At the head? Tail?

Thanks

Wherever you like! The whole .htaccess is processed before serving the page. ;o)



Paul - 03.12.12 11:44 pm

Hi Corz,

Just dropping a line being you allow me to!

Fantastic site , Fantastic work,

I was completely blown away at how much effort you have put into this and how open you are with helping others out
excellent to see that human beings can still be helpful (my little bit of doubt in mankind creeping in my apologies)

I must say I will return and see what else you get up to and I love some of the excellent downloads

Anyway Thank you

been a pleasure being here and I will be back !

Your Hashing program is what got me here and it is indeed amazingly fast,

I may drop you a line as to a little project which would use your hashing program and would like your advice on it some time, but let me get my thoughts together and then compile an email to you rather than here in plain view.

My inbox is always open! ;o)



Michael - 11.12.12 3:09 pm

Thank you for all the examples, really helpful to learn more by oneself!

smiley for :cool:


Steve - 01.02.13 12:20 am

Hi,

I found your article during my search for information. Good article. However, I do have a question about why my application does not apply the rules in the .htaccess as you described them.

For example:
/app
    /include1
        .htaccess - deny from all
        files...
    /phpscripts
        .htaccess - deny from all
        php file
    index.php
    .htaccess - Options -Indexes 

When I visit my site through localhost (localhost/app/index.php), it displays the web page created by index.php. However when I clicked a link on the index page, which points to a script named test.php located in the folder /phpscripts, I get the message that /phpscripts/test.php is forbidden.

I thought your article stated that files in /phpscripts would be accessible through the filesystem? Am I missing something?

I'm trying to restrict access to PHP files located in /phpscripts folder from
visitors to my site, but be able to run them through the filesystem.

Any help would be appreciated.

When you say, "points to a script named test.php located in the folder /phpscripts", do you mean it includes it? If so, you have a problem. If it's just a link the user has to click, then everything is working as expected. ;o)



John - 01.02.13 7:16 pm

Thanks for helping so many of us with great information!
I would like to limit the requests for any single page to a set number within a set time from any single requester. (Sort of like a DOS attack.) Say a limit of 5 requests within 30 seconds. How would I do this, is it even possible?


As far as I know, you cannot do this in .htaccess. But you can do it with anti-hammer. Note, the version on the page has been superseded with an as yet unavailable beta which has much more functionality - until I get a chance to get that up, feel free to mail me for a copy. ;o)



Steve - 01.02.13 9:20 pm

Thanks for your prompt reply.

I do have a question about the /include1 folder. In my setup, the /include1 folder contains files for PHP functions and other useful PHP code that are 'included' and used by application PHP scripts.

I want to setup the structure and .htaccess to allow the application PHP scripts to use the keyword 'include' to add PHP files from the /include1 directory but PREVENT site visitors from running the /include1 PHP files directly (these files only make sense in the context where they are included)?

Any help would be appreciated!


It is explained here. ;o)



Chetan Sharma - 31.03.13 1:33 pm

Thanks buddy, it really helps


hisham - 29.07.13 3:07 am

i`m using joomla 1.5

i have 2 server connection

local network ---->info.kk.net (domain server. local only)---->10.4.8.15 (local joomla installation)

i`m using reverse proxy

the address for the site is http://info.kk.net/mysite and it can open the site

but when i point to the menu it points to http://10.4.8.15/mysite/...

so i want the site to point to http://info.kk.net/mysite/...

can i use .htaccess for this. anyone can help me?

check out part 2 - rewriting (link above these comments). ;o)



Jugnu - 19.09.13 4:20 am

Hi

I am trying to find code for restricting folder access so that it only works if the request has come via a specific port.

For e.g. I am setting up port 7050 and have got Incoming and Outgoing enabled on TCP.

Now I want an .htaccess that allows content under the folder to be accessible only if the content was requested via port 7050.

Thanks.

RewriteCond %{SERVER_PORT} ^7050$
;o)
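That condition needs a rule to act on; a hedged sketch of the full block for an .htaccess inside the folder..

RewriteEngine On
# anything NOT arriving on port 7050 gets a 403
RewriteCond %{SERVER_PORT} !^7050$
RewriteRule .* - [F]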



R.S - 21.11.13 5:40 am

i couldn't update my htaccess file. No changes can be made to the htaccess file. If i make changes, then within 5 seconds the htaccess file regains its old content. Can anyone say what the problem is?

Bad Web Host. ;o)



jazz - 24.11.13 6:29 pm

Hi corz - thanks for the article.
I'll be back to study it more as I try to implement
many of your suggestions.

Right now I'm trying to do something and don't know
if it's possible ... at least I can't work it out.

I have a folder tree protected by htpassword (auth basic?) and in that
tree I have a php file I want to use to check the username
of the person reading it ... that person has just logged into the tree with their username and password obviously.

I know I can read the .htpasswd file but
can my php code read the htpassword username of the user now active from the apache session or somewhere?

thanks again.

jazz

Check out part two; there's a download at the foot called "debug-report". Drop that inside the user area and load it in your web browser once logged in. Voila! ;o)



jazz - 25.11.13 11:10 am


Hi

Many thanks for the fast response cor.

That was exactly what I needed - perfect.
The value PHP_AUTH_USER does the trick for me.

You're worth your weight in gold (or bitcoins!)

cheers
jazz



John - 12.12.13 11:45 am

Hi, many thanks for this site. This is a big, big, big help for me. This is the only site where I got what I need.


Can I Ask About the Multiple Domains in One Root?

RewriteCond %{HTTP_HOST} domain-one.com
RewriteCond %{REQUEST_URI} !^/one
RewriteRule ^(.*)$ one/$1 [L]


What is the proper way to add the WWW and Non WWW in domain-one.com?

I made this version but it is not totally working with both WWW and non-WWW
RewriteCond %{HTTP_HOST} ^(www.)?domain-one.com
RewriteCond %{REQUEST_URI} !^/one
RewriteRule ^(.*)$ one/$1 [L]




I'll wait for your response.

Thanks

John :) :) :) :) :) :)

Try part 2 of this article - it's all about rewriting! ;o)
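As a quick pointer in the meantime, the usual tweak is simply to anchor the host match, escape the dots and add [NC] for case-insensitivity; something like..

RewriteCond %{HTTP_HOST} ^(www\.)?domain-one\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/one
RewriteRule ^(.*)$ one/$1 [L]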



rahamsher - 07.03.14 1:45 pm

...its great job...
Keep it up...


Shailendra - 18.04.14 3:13 pm

I have a URL like "localhost:800/testapp/Showcase/demopage/S1/user" in which the number after the folder "S" is dynamic, and the "user" folder is also dynamic. Now I want my URL to look like "localhost:800/testapp/". I have hidden the folders "Showcase" and "demopage" from the URL so that it looks like "localhost:800/testapp/S1/user", but I want to hide these S1, S2, S3 and user folders from the URL too. Any help will be greatly appreciated.

Thanks:
Shail

See part 2 of this article. ;o)



tmac - 11.02.15 1:06 pm

Thanks for this page, very informative. I have a problem though: whenever I change AllowOverride to All from None, instead of the PHP interpreter opening the page, it opens a save dialog. I have been trying for 2 days now to get inline php code in an html file to open. I can open a .php
file like menu.php fine but, if I call that in an html file like this:
<div class="menu">
<?php include 'menu.php';?>
</div>
it does not work, this is what brought me to this page. any idea what is going on?

thanks,

-Troy


After enabling AllowOverride All (in httpd.conf), something is then overriding your settings (in an .htaccess somewhere, perhaps) which disables the interpreter from handling .html files as php. Try something like this in the local .htaccess file (as diagnostics!)..

<FilesMatch "\.html">
    SetHandler application/x-httpd-php
</FilesMatch>

;o)



Iban - 18.04.15 2:36 am

I have a question-problem:

I have a wordpress site in the root domain.
And I have another wordpress site in a subfolder of that root domain, with another domain.


It means:

Web site a: Root with wordpress site. Domain: a.com
Web site b: Subfolder. Root/bsite. Domain: b.com

I need, in web site b, to make the wordpress url and the site url point to the same place, because otherwise the WooCommerce checkout doesn't work.

So I understand that I have to copy .htaccess and index.php of the subdirectory to the root and change:

Open the index.php file you just copied to root in a text editor
Find the line require('./wp-blog-header.php');
Edit that to read require('./wordpress/wp-blog-header.php');

BUT HOW can I do that if I already have an .htaccess file and an index.php in the root folder that belong to the wordpress website already installed in the root folder????????

I have no idea how can I do that. Anybody have a solution??


Thank you.


The wording of your question makes it impossible to give a full answer but I can tell you this..

.htaccess rules can be ADDED to the root .htaccess file. In other words, if you have an .htaccess in the root and want to add rules from another .htaccess file, you simply tag them onto the end of the root .htaccess file.

Replacing index.php is another matter. It sounds like you would be better off with a single domain.

;o)



Nordi - 30.07.15 8:31 pm

Is it possible to make a forward depending on the screen size, or whether it's a mobile (like in CSS @media only screen...), and then redirect to the one or the other file? How?


No, this isn't possible. Nor would it be desirable. ;o)



Gourav RR - 15.02.16 2:55 pm

Hello,

I have a query, i.e. I want to get contents from an external url in my site, but I can't.

Help me out of this issue, or can anybody suggest how to set the access permission for an external url,

in the server configuration file, i.e. httpd.conf or php.ini.

Thank you,


Stephen P - 26.05.16 5:38 am

Just wanted to mention regarding your statement

> or use a text editor that allows you to open hidden files, something like bbedit on the Mac platform

You can open hidden files with any editor (or any program) on Mac OS X because the standard OS file open dialog will show them. Press
Cmd + Shift + .
(command shift period) in the Open dialog and it will show you the "dot" files (however it doesn't work in column view, only in icon or list views).

About time! I'll update the text. Thanks. ;o)



Jason - 28.06.16 8:44 am

"Antiquated Browser"???????????? A browser that is still used Millions of times every day is NOT Antiquated!! I used a different browser just to say, Bite Me! :P


faquhar - 02.10.16 2:41 am

hello i really need your help.

I don't know how to block this situation using .htaccess


www.mysite.com/contact.php <-- real page

but these guys always visit my site

www.mysite.com/contact.php/contact.php
www.mysite.com/aboutme.php/aboutme.php



do you know how to block this using .htaccess




AskApache - 27.08.20 2:56 pm

Love the creative cookie acceptance banner!

