Re: [users@httpd] how to block the duplicated requests?

On Wed, May 20, 2015 at 1:26 AM, javalishixml <javalishixml@xxxxxxx> wrote:
4. Later, we set up a new registration page, changing its URL from "http://mywebsite.com/register1.jsp" to "http://mywebsite.com/register2.jsp".
For the first several days everything looked good.
But after several days we found the robots (crackers) could again successfully register thousands of different users on this web site within only a few minutes.

You just made my point. The fact that you changed the URI should not require updating the filtering rules. Similarly, if you add an RPC interface (or some other means of interacting with your lottery service), the front-ends to that interface should not have to know about the rules for rejecting duplicate requests. That filtering is the responsibility of your lottery service, not of the web server.
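To make that concrete, here is a rough sketch of what a check inside the application itself might look like. The class name, the per-IP key, and the 5-attempts-per-minute threshold are all made up for illustration; your service would use whatever signal actually identifies an abusive or duplicate registration.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    // Sketch: throttle registration attempts inside the application,
    // independent of which URL or front-end the request arrived on.
    public class RegistrationThrottle {

        private static final int MAX_ATTEMPTS = 5;          // per window, per client (placeholder)
        private static final long WINDOW_MILLIS = 60_000L;  // 1-minute window (placeholder)

        // client key (e.g. remote IP) -> { window start, attempt count }
        private final Map<String, long[]> attempts = new ConcurrentHashMap<>();

        /** Returns true if this client may attempt another registration. */
        public boolean allow(String clientKey) {
            long now = System.currentTimeMillis();
            long[] state = attempts.compute(clientKey, (k, v) -> {
                if (v == null || now - v[0] > WINDOW_MILLIS) {
                    return new long[] {now, 1};   // start a new window
                }
                v[1]++;                           // same window, one more attempt
                return v;
            });
            return state[1] <= MAX_ATTEMPTS;
        }

        // Tiny demo: the 6th attempt from the same client is rejected.
        public static void main(String[] args) {
            RegistrationThrottle throttle = new RegistrationThrottle();
            for (int i = 1; i <= 6; i++) {
                System.out.println("attempt " + i + " allowed? " + throttle.allow("203.0.113.7"));
            }
        }
    }

The point is only that the check lives next to the registration logic, so it keeps working no matter which URL, RPC endpoint, or front-end the request came through.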
 
Is it a DDoS attack? Is there a good way to resolve it at the httpd level? Which Apache module is best for my issue?

No Apache module is best for your issue, because the Apache web server is the wrong place to implement this request filtering. If Yehuda's arguments and mine haven't convinced you that the web server is the wrong place to implement this logic, then I would recommend ModSecurity: http://www.modsecurity.org/.
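If you do go that route, a rate-limiting rule chain along these lines is one common pattern. The rule IDs, the /register2.jsp URI, the 60-second window, and the 5-attempt threshold below are only placeholders to adjust for your setup, and persistent IP collections also require SecDataDir to be configured.

    # Track a per-client-IP collection (requires SecDataDir for persistence).
    SecAction "id:1001,phase:1,nolog,pass,initcol:ip=%{REMOTE_ADDR}"

    # Count attempts against the registration URI; counter expires after 60s.
    SecRule REQUEST_URI "@streq /register2.jsp" \
        "id:1002,phase:2,nolog,pass,setvar:ip.reg_attempts=+1,expirevar:ip.reg_attempts=60"

    # Reject clients that exceed the threshold within the window.
    SecRule IP:REG_ATTEMPTS "@gt 5" \
        "id:1003,phase:2,deny,status:429,log,msg:'Too many registration attempts from this IP'"

But again, this only slows the abuse down at the edge; the authoritative duplicate check still belongs in your lottery service.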

--
Kurtis Rader
Caretaker of the exceptional canines Junior and Hank
