Acmlm's Board - I2 Archive - Programming - Restricting a folder to prevent leeching | |
User Post
Stifu

Level: 28

Posts: 68/304
EXP: 129458
For next: 1880

Since: 03-16-04
From: Your mom's bed

Since last post: 18 hours
Last activity: 11 hours
Posted on 06-03-04 04:07 AM Link | Quote
Hey hey.

Here's the deal... I want to restrict one folder of a site to prevent direct linking.

Example: I've got 2 FTP accounts.
One holds the site and all the pages, the other holds the zip files to be downloaded.
I want to make it so people can only download files from the second site if they get to them by clicking a link on the first site.
If the link is on any other site, it should not work.

I've already looked around on the net, and only found stuff about how to protect image files, or how to block certain IPs or providers...
But what I want is to stop the zip files from being downloaded if you're not coming from the right place.

Can anyone help ?
Parasyte

Bullet Bill
Level: 35

Posts: 33/514
EXP: 267348
For next: 12588

Since: 05-25-04

Since last post: 104 days
Last activity: 32 days
Posted on 06-03-04 06:15 AM Link | Quote
Use a PHP or CGI script, rather than directly linking to the files. The script can check the 'referer' address to see which site the browser is coming from. Then the script can either redirect the browser (not recommended) or it can send the file data directly through the script. (Recommended! Gives more control, and doesn't allow the browser to ever see the actual file location.)

Here's a sample script which you can use and modify to suit your needs: http://parasyte.panicus.org/projects/download-script.zip

Link to it using something like: download.php?file=filename.zip
Then the user will be able to download "filename.zip" if he is allowed, else he will get a 404 message.
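The approach described above can be sketched like this (a minimal sketch, not the actual download-script.zip; $allowedSite, $downloadPath, the 404 page path, and the application/zip type are placeholder assumptions):

```php
<?php
// Sketch: check the referer, then stream the file through the script
// so the browser never learns where the file actually lives.

function referer_allowed(string $referer, string $site): bool {
    return strstr($referer, $site) !== false;   // simple substring check
}

if (PHP_SAPI !== "cli") {                       // only run when served over HTTP
    $allowedSite  = "yoursite.com";
    $downloadPath = "/path/to/files/";          // absolute path on the file server

    $file = basename($_GET["file"] ?? "");      // basename() blocks "../" tricks
    $full = $downloadPath . $file;

    if ($file === ""
            || !referer_allowed($_SERVER["HTTP_REFERER"] ?? "", $allowedSite)
            || !is_file($full)) {
        header("Location: /404.html");          // leechers get bounced instead
        exit;
    }

    // Send the data directly, as recommended above.
    header("Content-Disposition: attachment; filename=\"" . $file . "\"");
    header("Content-Type: application/zip");
    header("Content-Length: " . filesize($full));
    readfile($full);
}
```

A page on the allowed site would then link to it as download.php?file=filename.zip, exactly as described.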
Stifu

Level: 28

Posts: 69/304
EXP: 129458
For next: 1880

Since: 03-16-04
From: Your mom's bed

Since last post: 18 hours
Last activity: 11 hours
Posted on 06-03-04 01:15 PM Link | Quote
Thank you !

Doesn't sound too complicated... Although I'm still having problems.

---

<?php
require "mimetypes.php";

$allowed = strstr(getenv("HTTP_REFERER"), "yoursite.com");
if (($file) && ($allowed)) {
    $mimetype = $mimes[substr(strrchr($file["name"], "."), 1)];
    if (!$mimetype) showError();

    header("Content-disposition: filename=\"".$file["name"]."\"");
    header("Content-type: ".$mimetype);
    header("Content-length: ".filesize("/path/to/".$file["name"]));

    $fp = fopen($file["path"], "r");
    $content = "";
    while (!feof($fp)) $content .= fread($fp, 1024);
    fclose($fp);
    echo $content;

    updateCounter();
}
else showError();


function showError() {
    header("Location: /path/to/404.html");
    exit;
}
?>

---

I'm not sure what to change in the file apart from the stuff I put in red (well, more like pink...). What else must I change ?
The [name] and [path] thing maybe ? Not sure what to put there.

And am I supposed to change anything in mimetypes ? Looks like a weird file.

Thanks.


(edited by Stifu on 06-03-04 04:54 AM)
Jizuko

Jiz Is The Magic!
This board has run out of mana and can no longer use The Magic
Level: 51

Posts: 371/1191
EXP: 1004683
For next: 9255

Since: 03-15-04

Since last post: 230 days
Last activity: 213 days
Posted on 06-03-04 03:08 PM Link | Quote
I would recommend using htaccess to disable hotlinking. You can easily disable several folders or specific images with this or even create accounts for people so some can see it and others can't. Same with password protecting a dir or a specific file.
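For example, a mod_rewrite rule along these lines covers zip files too (a sketch, assuming the host is Apache with mod_rewrite enabled and .htaccess overrides allowed; the domain is a placeholder):

```apache
RewriteEngine On
# Let empty referers through (some browsers and proxies strip them)
RewriteCond %{HTTP_REFERER} !^$
# Let requests coming from the main site through
RewriteCond %{HTTP_REFERER} !^http://stifu\.free\.fr/ [NC]
# Everything else asking for a .zip gets 403 Forbidden
RewriteRule \.zip$ - [F]
```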
Parasyte

Bullet Bill
Level: 35

Posts: 36/514
EXP: 267348
For next: 12588

Since: 05-25-04

Since last post: 104 days
Last activity: 32 days
Posted on 06-03-04 03:27 PM Link | Quote
Sorry, the script contains old code that did not get removed. I've re-uploaded it, and tested this time. Download from here: http://parasyte.panicus.org/projects/download.html
Stifu

Level: 28

Posts: 70/304
EXP: 129458
For next: 1880

Since: 03-16-04
From: Your mom's bed

Since last post: 18 hours
Last activity: 11 hours
Posted on 06-03-04 03:48 PM Link | Quote
Jizuko: I already tried that before... And the link you provided only works for image files... Not zip files.

Parasyte: Thanks again... The new version seems more convenient to manage and all.
However I still can't get it to work. I get the error page from my site as if I was direct linking...

I guess the error may be coming from this line:

$allowed = strstr(getenv("HTTP_REFERER"), "yoursite.com");

I've tried putting "stifu.free.fr" or "http://stifu.free.fr" (the address to the site where the download links are, but not where the zip files are stored...) instead of yoursite.com... With no success.
I'm redirected to the error page no matter what.


(edited by Stifu on 06-03-04 06:51 AM)
Parasyte

Bullet Bill
Level: 35

Posts: 37/514
EXP: 267348
For next: 12588

Since: 05-25-04

Since last post: 104 days
Last activity: 32 days
Posted on 06-03-04 04:18 PM Link | Quote
You need to place download.php on the same server as the files that it accesses. Also, the "$downloadpath" variable should be set to the absolute path. Though you may be able to get away with using a relative path, like "./download/"
If you cannot run php scripts from that server, you will have to use browser redirection instead of the header/fread calls.
To relocate to the actual files:

header("Location: http://site.with.files.com/path/to/".$file);

Finally, you should use "stifu.free.fr" in the $allowed line. You generally won't want to include http:// or www, because many users visit a site like "yahoo.com" rather than "www.yahoo.com" and the script notices the difference between the two. If you were to remove the 'www.', then it would work properly in either situation.
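A quick demo of how that substring check behaves (the referer URLs are made up for illustration):

```php
<?php
// The $allowed line is just a substring match on the referer,
// so whatever you put in it must appear verbatim in the referer URL.
function allowed(string $referer, string $site): bool {
    return strstr($referer, $site) !== false;
}

var_dump(allowed("http://stifu.free.fr/downloads/index.html", "stifu.free.fr")); // bool(true)
var_dump(allowed("http://yahoo.com/", "www.yahoo.com"));  // bool(false): referer has no "www."
var_dump(allowed("http://www.yahoo.com/", "yahoo.com"));  // bool(true): dropping "www." matches both
```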


(edited by Parasyte on 06-03-04 07:19 AM)
(edited by Parasyte on 06-03-04 07:22 AM)
Stifu

Level: 28

Posts: 71/304
EXP: 129458
For next: 1880

Since: 03-16-04
From: Your mom's bed

Since last post: 18 hours
Last activity: 11 hours
Posted on 06-03-04 05:02 PM Link | Quote
Both PHP files need to be on the server with the zip files, not on the server with the HTML files, right ? Just making sure...

Last question: does putting "stifu.free.fr" in the $allowed line also affect sub folders of the stifu.free.fr address ? Because the concerned HTML files aren't at the root...

Sorry if it sounds like I'm being lazy, asking all of this instead of just testing and trying myself, but I think my 2nd server isn't working right at the moment, so I can't check properly... Or maybe this account is too restricted... I'll find out.

Alright... I think I'm almost there now.
Thanks.


(edited by Stifu on 06-03-04 08:05 AM)
(edited by Stifu on 06-03-04 08:12 AM)
(edited by Stifu on 06-03-04 08:13 AM)
(edited by Stifu on 06-03-04 08:15 AM)
(edited by Stifu on 06-03-04 08:21 AM)
Parasyte

Bullet Bill
Level: 35

Posts: 39/514
EXP: 267348
For next: 12588

Since: 05-25-04

Since last post: 104 days
Last activity: 32 days
Posted on 06-03-04 11:52 PM Link | Quote
Originally posted by Stifu
Both PHP files need to be on the server with the zip files, not on the server with the HTML files, right ? Just making sure...


That is correct.

Originally posted by Stifu
Last question, does putting "stifu.free.fr" in the $allowed line also affects sub folders of the stifu.free.fr address ? Because the concerned HTML files aren't at the root...


The $allowed line is checking where the browser is coming from -- not where it is.
So if you link to the file from *stifu.free.fr* (where the asterisks represent ANY text) then it will work. Unfortunately, this also means that any site linking to download.php with a directory named "stifu.free.fr" will be able to download files. This can be avoided by using "http://stifu.free.fr/" in the $allowed line. That should be fine for you, since I don't think anyone will be able to use anything like 'www.stifu.free.fr' to access the site, anyway.

As an example, say you have a link on your site (http://stifu.free.fr/downloads/index.html) pointing to "http://www.yoursite.com/download.php?file=filename.zip" ... When the script at yoursite.com checks where the browser came from, it will check for "http://stifu.free.fr/" in the referrer. The referrer would be "http://stifu.free.fr/downloads/index.html" which does contain the $allowed string. If http://www.lamersite.com/ was linking to the script though, the script would not be able to find "http://stifu.free.fr/" in the referrer, so it would give the error message. Makes sense?


(edited by Parasyte on 06-03-04 02:53 PM)
Stifu

Level: 28

Posts: 73/304
EXP: 129458
For next: 1880

Since: 03-16-04
From: Your mom's bed

Since last post: 18 hours
Last activity: 11 hours
Posted on 06-03-04 11:58 PM Link | Quote
Yes, thank you !

My 2nd account is down at the moment so I can't try it yet, but I think I understood everything... Should work fine.

Thanks for all of your help, I appreciate it a lot.
ErkDog

Fuzz Ball
Level: 47

Posts: 593/982
EXP: 752190
For next: 14013

Since: 03-15-04
From: Richmond, VA

Since last post: 40 days
Last activity: 19 days
Posted on 06-04-04 03:54 AM Link | Quote
you can also check the referrer with .htaccess files and return "Access Denied" for anything not coming from the 1st site...

which would bounce any direct requests or foreign referrers
Cellar Dweller

Flurry
!!!
Level: 27

Posts: 64/269
EXP: 107817
For next: 8342

Since: 03-15-04
From: Arkansas

Since last post: 16 days
Last activity: 34 min.
Posted on 06-04-04 01:13 PM Link | Quote
Please treat missing HTTP_REFERERs as matching the list of permitted referring sites. I've had problems with anti-leech scripts when using some browsers.
Parasyte

Bullet Bill
Level: 35

Posts: 43/514
EXP: 267348
For next: 12588

Since: 05-25-04

Since last post: 104 days
Last activity: 32 days
Posted on 06-04-04 01:23 PM Link | Quote
Yes. But then people using those browsers will have no trouble leeching from off-site. ;P Unless the script were changed to check if the user's IP address had been to the download page within the last ~5 minutes. Something of that sort would work nicely.

In fact, that's exactly how I had set up DES' "Emulation Forum" when we had user-uploadable files for download. If the user's IP address wasn't in the access log from the last 5 minutes, it would redirect the browser to the post containing the file. It was a very good system, and worked no matter which browser (or referrer settings) the user had.
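The scheme described above could be sketched like this (a sketch only, not the actual forum code; the flat-file log name and the 5-minute window are assumptions, and a real script would pass in time() and $_SERVER["REMOTE_ADDR"]):

```php
<?php
// Record IPs that hit the download page, then only serve the file
// if the requesting IP was seen within the last WINDOW seconds.

const ACCESS_LOG = "access.log";
const WINDOW     = 300;            // seconds (~5 minutes)

// Call this from the download *page* to log the visitor's IP.
function log_visit(string $ip, int $now): void {
    file_put_contents(ACCESS_LOG, "$ip $now\n", FILE_APPEND);
}

// Call this from download.php: was this IP on the page recently?
function recently_visited(string $ip, int $now): bool {
    if (!is_file(ACCESS_LOG)) return false;
    foreach (file(ACCESS_LOG, FILE_IGNORE_NEW_LINES) as $line) {
        [$seen, $when] = explode(" ", $line);
        if ($seen === $ip && $now - (int)$when <= WINDOW) return true;
    }
    return false;                  // not seen recently: redirect to the post instead
}
```

Since it keys on the IP rather than the referer, it works no matter which browser or referrer settings the user has, as noted above.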


(edited by Parasyte on 06-04-04 04:25 AM)
AcmlmBoard v1.01 (11-01-05)
© 2000-2005 Acmlm, Emuz, et al