FDM downloaded a file nearly 10,000 times from my site?
Moderators: Alex
Had an interesting situation over the weekend where a single visitor trying to download one of the few files I host on my site managed to download the same file almost 10,000 times, falling just short of 25GB worth of data. I've actually never used FDM before (actually never heard of it before this week) so please bear with my lack of savvy here, but how could this happen?
I'm unimpressed as a webmaster, as this one instance has skewed the heck out of my statistics for this file and used a massive amount of resources (25GB is usually around 5 months' worth of data for my site, not 2 days'). The person I really feel for, though, is the downloader, as they have probably just set it going for the weekend and come back to find that most of their monthly quota has been blown out of the water. Any insights would be greatly appreciated.
Was the entire file downloaded each time, or just a portion? Frankly, it sounds like a malicious attack to me. If it were me, I'd block the user's IP.
Anonymous wrote: Was the entire file downloaded each time, or just a portion? Frankly, it sounds like a malicious attack to me. If it were me, I'd block the user's IP.
1.) Nice, and then? Most IPs are dynamically assigned, not fixed, so it's a useless and counterproductive solution.
2.) I would guess the user simply set the "retry" option to "try for ever and ever and ...", plus some error occurred on his/her side, so the download must have started over and over again.
Thanks for the replies. I don't think it was malicious, as I've been able to track the user's access on my site from Google using a search term that was very relevant to the file they tried to download.
I'm not sure where the error occurred here, but I've been doing some reading and it appears that the CMS I am using may have some issues with file management. It might be something else altogether, though, as I've not seen this before and I'm sure this isn't the first time FDM has been used on my site.
Why are you sure that the user used FDM?
Because of the user agent reported in my logs. It could have been a bot spoofing the agent string, but the path the user took into and around the site looks fairly organic.
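For anyone who wants to look for the same pattern in their own logs, here is a minimal sketch in Python, assuming the standard combined access-log format; the log path and file path below are placeholders, not my real ones:

import re
from collections import Counter

LOG_PATH = "access.log"        # placeholder: your server's access log
TARGET = "/files/example.zip"  # placeholder: the file being hammered

# One line of Apache/nginx "combined" format looks like:
# IP - - [date] "GET /path HTTP/1.1" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.match(line)
        if m and m.group(2) == TARGET:
            hits[(m.group(1), m.group(3))] += 1  # count per (IP, user agent) pair

# Thousands of hits from a single (IP, agent) pair is the telltale sign.
for (ip, agent), count in hits.most_common(10):
    print(f"{count:6d}  {ip}  {agent}")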
Was it FDM 2.x or FDM 1.x?
Alex,
FDM development team
FDM 2.x
Re: FDM downloaded a file nearly 10,000 times from my site?
Anonymous wrote: Had an interesting situation over the weekend where a single visitor trying to download one of the few files I host on my site managed to download the same file almost 10,000 times, falling just short of 25GB worth of data. I've actually never used FDM before (actually never heard of it before this week) so please bear with my lack of savvy here, but how could this happen?
I'm unimpressed as a webmaster, as this one instance has skewed the heck out of my statistics for this file and used a massive amount of resources (25GB is usually around 5 months' worth of data for my site, not 2 days'). The person I really feel for, though, is the downloader, as they have probably just set it going for the weekend and come back to find that most of their monthly quota has been blown out of the water. Any insights would be greatly appreciated.
My bad, won't happen again.
Hello,
I have the same issue here... I have a web site (OptimFROG Lossless Audio Compression, at http://www.LosslessAudio.org/) and it seems that from time to time I get, in just a single day, something like 10-15 thousand requests, one second apart, for downloading a single file with user agent FDM 2.x, e.g. for this file
http://www.losslessaudio.org//O ... pr4510.php
This also wastes a huge amount of traffic (on the order of GBs).
What would be the best server-side workaround for this? A new version will solve the problem only for the users who have it, and most of them probably do not.
I have also seen this problem described at
http://bugs.sakaiproject.org/jira/browse/SAK-12006
Please, could you confirm whether it works, according to
http://www.w3.org/Protocols/rfc2616/rfc ... ml#sec14.5
to put in the headers
Accept-Ranges: none
to prevent current and previous versions of FDM from trying to use multiple ranges?
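For what it's worth, here is a minimal sketch of what I mean, using only the Python standard library; the file name and port are placeholders, and whether FDM actually honours the header is exactly what I am asking you to confirm:

from http.server import BaseHTTPRequestHandler, HTTPServer

FILE_PATH = "example.zip"  # placeholder for the file being served

class NoRangeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        with open(FILE_PATH, "rb") as f:
            data = f.read()
        self.send_response(200)                    # always a full response, never 206
        self.send_header("Accept-Ranges", "none")  # advertise no range support (RFC 2616 sec. 14.5)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)  # any Range header from the client is deliberately ignored

HTTPServer(("", 8000), NoRangeHandler).serve_forever()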
Otherwise, the only practical solution I see is to ban the agent FDM 2.x, which would probably affect a lot of users negatively.
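Just to illustrate that alternative too, a sketch in the same style; in practice this would rather be a rule in the web server configuration, and matching on the substring "FDM 2" is my assumption about the reported agent string:

from http.server import BaseHTTPRequestHandler, HTTPServer

class BlockFDMHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        if "FDM 2" in agent:  # crude substring match on the reported agent
            self.send_error(403, "This download manager is not supported")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok\n")

HTTPServer(("", 8000), BlockFDMHandler).serve_forever()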
Best regards,
Florin
P.S. It seems the forum truncates the message text at the first real quote character, even though it works fine in preview. Please delete my previous post fragment.