I need to do a somewhat strange operation.
First, I run Debian with Apache2 (which runs as user www-data).
I have simple text files with a .txt or .ini extension (or whatever; the extension doesn't matter).
These files are located in subfolders with a structure like this:
www.example.com/folder1/car/foobar.txt
www.example.com/folder1/cycle/foobar.txt
www.example.com/folder1/fish/foobar.txt
www.example.com/folder1/fruit/foobar.txt
So the file name is always the same, ditto for the hierarchy; only the folder name changes: /folder-name-static/folder-name-dynamic/file-name-static.txt
What I need to do is (I think) relatively simple: programs on the server (Python or PHP, for example) must be able to read these files, but if I try to retrieve the file contents with a browser (typing the URL www.example.com/folder1/car/foobar.txt), via cURL, etc., I must get a forbidden error (or similar) and not the file.
It would also be nice if those files were 'hidden', or at least not downloadable, via FTP (at least with the FTP root and user credentials I use).
How can I do this?
I found this online, to be put in the .htaccess file:
Order allow,deny
Deny from all
It seems to work, but only if the file is in the web root (www.example.com/myfile.txt), not in subfolders.
Moreover, the second-level folders (as in www.example.com/folder1/fruit/foobar.txt) will be created dynamically, and I would like to avoid having to change the .htaccess file each time.
Is it possible to create a rule that applies to all files with a given name, i.e. www.example.com/folder-name-static/folder-name-dynamic/file-name-static.txt, where the other parts are always the same and only the dynamic folder name changes?
EDIT:
As Dave Drager said, I could simplify this by keeping those files outside the web-accessible directory.
But those directories will also contain other files (images and other things used by my users), so I'm simply trying to avoid a duplicated folder structure, like:
/var/www/vhosts/example.com/httpdocs/folder1/car/[other folders and files here]
/var/www/vhosts/example.com/httpdocs/folder1/cycle/[other folders and files here]
/var/www/vhosts/example.com/httpdocs/folder1/fish/[other folders and files here]
// and then, for the 'secret' files:
/folder1/data/car/foobar.txt
/folder1/data/cycle/foobar.txt
/folder1/data/fish/foobar.txt
Answer
You could use Files/FilesMatch and a regular expression:

<FilesMatch "^foobar\.txt$">
    Order allow,deny
    Deny from all
</FilesMatch>

Placed in the web root's .htaccess, this applies recursively to every subfolder, so dynamically created folders are covered automatically.
This is how .htpasswd is protected.
Or return a 404 for any request for a .txt file:
RedirectMatch 404 \.txt$
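Either directive only blocks the HTTP path; a script running on the server reads the file through the filesystem, so Apache's rules never apply to it. A small sketch of that server-side read, using a temporary directory to mimic the question's layout (the real files would live under the document root):

```python
import os
import tempfile

# Illustrative only: recreate the folder1/<dynamic>/foobar.txt layout
# in a temp dir instead of the real document root.
root = tempfile.mkdtemp()
for category in ("car", "cycle", "fish", "fruit"):
    d = os.path.join(root, "folder1", category)
    os.makedirs(d)
    with open(os.path.join(d, "foobar.txt"), "w") as f:
        f.write(f"secret data for {category}\n")

# A PHP or Python script running as www-data opens the file directly,
# bypassing Apache entirely, so Deny from all does not affect it:
with open(os.path.join(root, "folder1", "car", "foobar.txt")) as f:
    print(f.read().strip())
# secret data for car
```

Note that the same is true of FTP: .htaccess rules have no effect there, so hiding the files from FTP users must be done with filesystem permissions or the FTP server's own configuration.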