RubyForge has been slow today because...


Richard Kilmer

Some freaking dork at the following IP address(es) was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

200.98.63.142 - - [23/Oct/2004:17:41:34 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:17:53:18 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:17:56:34 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:00:47 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:06:31 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:10:56 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:14 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:28 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:11:41 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:19:10 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 9190167
200.98.63.142 - - [23/Oct/2004:18:19:12 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:19:18 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:23:16 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:23:55 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:26:32 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:26:36 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:27:46 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:28:32 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:29:58 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:31:51 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136
200.98.63.142 - - [23/Oct/2004:18:32:07 -0400] "GET
/frs/download.php/1205/ruby182-14_RC8a.exe HTTP/1.1" 200 11613136

And I mean continually. Those IP addresses are now officially blocked. If we
find the perp who did this, they are going to be NAILED. We realize that
this is probably a DSL line or cable modem. If someone wants to help track
down who is doing this, it would be great. It seems to be coming from Brazil
(www.uol.com.br). RubyForge is a community resource, and this screws the
whole community.
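
A quick tally per IP makes the pattern obvious. A minimal Ruby sketch, if
anyone wants to replicate it (the log path is hypothetical; assumes a
standard combined-format Apache log):

  counts = Hash.new(0)
  File.foreach("/var/log/apache/access_log") do |line|
    next unless line.include?("ruby182-14_RC8a.exe")
    # The client IP is the first whitespace-delimited field.
    counts[line.split(" ").first] += 1
  end
  counts.sort_by { |ip, n| -n }.each { |ip, n| puts "#{n}\t#{ip}" }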

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge
 

David Ross

Richard said:
Some freaking dork at the following IP address(es) was continually
downloading ruby182-14_RC8a.exe from here:

200.98.63.142

Then from here...

200.98.136.108

How is this for an example log:

[snip]

And I mean continually. Those IP addresses are now officially blocked. If we
find the perp who did this, they are going to be NAILED. We realize that
this is probably a DSL line or cable modem. If someone wants to help track
down who is doing this, it would be great. It seems to be coming from Brazil
(www.uol.com.br). RubyForge is a community resource, and this screws the
whole community.

I can only assume this was a denial of service attack. I will block the
entire 200.98 subnet and every other subnet owned by uol.com.br if these
things continue (which may negatively affect innocent people...and I don't
want to do that).

Best,

Rich
Team RubyForge
I believe there are better ways than blacklisting so many users.
RubyForge is a great place for browsing projects, and I think it would be
ill-advised to prevent users from learning about RubyForge.

Maybe you could implement some sort of cap on downloads per day, or on
bandwidth usage per day?
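
Something like that would be simple enough. A rough in-memory sketch of the
cap in Ruby (the limit and names are invented, and the real download script
is PHP, so this is purely to illustrate the idea):

  MAX_PER_DAY = 5
  COUNTS = Hash.new(0)  # [ip, date] => downloads so far today

  # Call from the download handler before serving the file.
  def allow_download?(ip)
    key = [ip, Time.now.strftime("%Y-%m-%d")]
    COUNTS[key] += 1
    COUNTS[key] <= MAX_PER_DAY
  end

  # A real version would persist the counts somewhere instead of keeping
  # them in one process's memory.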

David Ross
 

Richard Kilmer

I am contacting the provider...I will attempt to identify the individual.
This is deliberate, and I will defend RubyForge from it. I will blacklist
the subnet as a last resort, but this WILL stop.

-rich
 

David Ross

Richard said:
I am contacting the provider...I will attempt to identify the individual.
This is deliberate, and I will defend RubyForge from it. I will blacklist
the subnet as a last resort, but this WILL stop.

-rich
Well, I scanned those two computers and turned up nothing. I seriously
doubt the people behind those machines were the ones attacking, but try to
contact the provider just in case. Brazil is the ultimate cracker funhouse;
there are many exploited computers and stupid people there (yes, stupid
people; I don't care how it sounds, but it's true). There are smart people
in Brazil too... they are just usually the ones who get infected. You are
probably going to end up blocking all of Brazil before this is over. Would
it be possible to integrate some sort of alert system into RubyForge?
Something that raises an alert if a person downloads too much, or keeps
downloading without pausing for more than 20 seconds or so?

The worst countries are Brazil and some of the Asian countries: spam,
crackers, and script kiddies.

David Ross
 

trans. (T. Onoma)

Speaking of attacks, I jumped over to the Garden Wiki just now and see that
the front page is spammed to the hilt and all the RecentChanges are nothing
but spam entries. So then I check out the PreventingWikiSpam page (which I
started) to see what was new there. There I find this interesting link:

http://www.rubygarden.org/ruby?action=browse&id=SpamIssue&revision=1

Imagine my surprise at seeing this! I am not sure who wrote it (daz?), but
actually I don't really care. It seems, though, that I am being insinuated as
the potential "Great Garden Wiki Spam Artist". Lol! Sorry to disappoint
folks, but it ain't me.

But I can tell you this. I am quite disappointed that the spamming problem
has not been satisfactorily dealt with yet. So much so that I am now declaring
my intent to FIX IT. The plan is simple: any revision that adds an external
link will be denied --or perhaps better, honeypotted. If you want to add
external link(s), you'll have to email the administrator or an official
moderator and ask that the link(s) be added to the page. I doubt the email
load this will create will be very high, especially if spread out over a
handful of moderators. But if it does prove too much, we can later add
temporary passwords (as in 48hrs) to let anyone do so via a special
(yet-to-be-determined) secure interface. That's it. Problem solved.
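
To make the check concrete, a minimal sketch in Ruby, assuming the wiki
hands us the page text before and after the edit (the method name is
invented):

  def adds_external_link?(old_text, new_text)
    url = %r{https?://[^\s)>"']+}
    # Flag the revision if it introduces any URL not already on the page.
    !(new_text.scan(url) - old_text.scan(url)).empty?
  end

A revision that trips this check would be held for a moderator instead of
being saved.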

So how should I proceed? Should I make a patch for Ruwiki? Or what?

T.

And BTW: I am not BillGuindon either.



On Sunday 24 October 2004 12:41 am, Richard Kilmer wrote:
| Some freaking dork at the following IP address(s) was continually
| downloading ruby182-14_RC8a.exe from here:
|
| 200.98.63.142
|
| Then from here...
|
| 200.98.136.108
|
| How is this for an example log:
|
| [snip]
|
| And I mean continually. Those IP addresses are now officially blocked. If
| we find the perp who did this, they are going to be NAILED. We realize
| that this is probably a DSL line or cable modem. If someone wants to help
| track down who is doing this, it would be great. It seems to be coming from
| Brazil (www.uol.com.br). RubyForge is a community resource, and this screws
| the whole community.
|
| I can only assume this was a denial of service attack. I will block the
| entire 200.98 subnet and every other subnet owned by uol.com.br if these
| things continue (which may negatively affect innocent people...and I don't
| want to do that).
|
| Best,
|
| Rich
| Team RubyForge

--
( o _ カラテ
// trans.
/ \ (e-mail address removed)

I don't give a damn for a man that can only spell a word one way.
-Mark Twain
 

Phlip

trans. (T. Onoma) said:
...So much so that I am now declaring
my intent to FIX IT.

Look up "reverse Turing test". You could write one in Java in about 6 hours,
or one in Ruby in 15 minutes.
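
For instance, a bare-bones arithmetic challenge in Ruby (the secret is a
stand-in and the form handling is hand-waved; this is a sketch, not a
hardened implementation):

  require 'digest/md5'

  SECRET = "some-private-string"  # hypothetical server-side secret

  # Generate a question plus a signed token to embed in the edit form.
  def challenge
    a, b = rand(10), rand(10)
    ["What is #{a} + #{b}?", Digest::MD5.hexdigest("#{a + b}-#{SECRET}")]
  end

  # On submit, verify the visitor's answer against the token.
  def human?(answer, token)
    Digest::MD5.hexdigest("#{answer.to_i}-#{SECRET}") == token
  end
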
The plan is simple: any revision that adds an external
link will be denied --or perhaps better, honeypotted.

I don't like that, because I add links-out to my other sites all the time.
 

David Ross

trans. (T. Onoma) said:
Speaking of attacks, I jumped over to the Garden Wiki just now and see that
the front page is spammed to the hilt and all the RecentChanges are nothing
but spam entries. [snip]
These spam attacks are ridiculous. There needs to be some type of honeypot
system. I'm still curious how these spammers are working, whether manually
or by bot. Logs or an explanation would help.

I still think the better idea is setting a trap that diverts the spammer
into their own copy of the database, where no public changes are made but
they still see their own edits.
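
The trap could be as small as this sketch (persist_publicly and
load_public are hypothetical stand-ins for the wiki's real save and load):

  SHADOW = {}  # [ip, page] => the spammer's private text

  def save_revision(ip, page, text, suspected_spammer)
    if suspected_spammer
      SHADOW[[ip, page]] = text  # only this IP will ever see the edit
    else
      persist_publicly(page, text)  # hypothetical real save
    end
  end

  def render_page(ip, page)
    # Serve spammers their own edits back so the trap isn't obvious.
    SHADOW[[ip, page]] || load_public(page)  # load_public is hypothetical
  end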

David Ross
 

Carl Youngblood

uol.com.br is one of Brazil's largest internet providers. I would be
careful about blocking the whole subnet.
 

Joao Pedrosa

Hi,

uol.com.br is one of Brazil's largest internet providers. I would be
careful about blocking the whole subnet.

You are right. UOL is perhaps the biggest ISP in Brazil; that is their main
business. Here in Brazil it is very difficult to educate people on how to
avoid security problems, and on how to avoid being a jerk and taking
advantage of others. There is an ongoing sense of impunity. We need
professional politicians; unfortunately, many of our politicians have
interests other than the well-being of society. The only cure for such
misery is time. And count that in hundreds of years. :)

Meanwhile, I hope that we can fight these vandal acts.

Cheers,
Joao
 

gabriele renzi

David Ross wrote:
These spam attacks are ridiculous. There needs to be some type of honeypot
system. I'm still curious how these spammers are working, whether manually
or by bot. Logs or an explanation would help.

I guess it is bots. This is why the spammed pages are usually the ones
linked from the main page, and why UseMod-based wikis get spammed almost
every day while other engines do not (UseMod is widely used, so it is an
optimal target for a spambot).
I still think the better idea is setting a trap that diverts the spammer
into their own copy of the database, where no public changes are made but
they still see their own edits.

I never understood this. A lone rider spamming some pages by hand is not
a problem; the wiki community can fix it easily. Automated systems are
the real problem, and they should be fought with a simple CAPTCHA. All
IMHO, anyway.
 

Phlip

David said:
These spam attacks are ridiculous. There needs to be some type of honeypot
system. I'm still curious how these spammers are working, whether manually
or by bot. Logs or an explanation would help.

They are probably using HttpUnit to "test" Wikis.
I still think the better idea is setting a trap that diverts the spammer
into their own copy of the database, where no public changes are made but
they still see their own edits.

Preventing a series of changes from the same IP might raise the bar.
Preventing the same text appearing on different pages would too.
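
Either check is only a few lines. A sketch in Ruby (the thresholds are
invented for illustration):

  require 'digest/md5'

  EDITS = Hash.new { |h, k| h[k] = [] }  # ip => edit timestamps
  SEEN  = Hash.new { |h, k| h[k] = [] }  # text digest => page names

  def suspicious?(ip, page, text)
    now = Time.now
    EDITS[ip] << now
    EDITS[ip].reject! { |t| now - t > 600 }
    # More than five edits from one IP in ten minutes smells like a bot.
    return true if EDITS[ip].size > 5
    digest = Digest::MD5.hexdigest(text)
    SEEN[digest] << page
    # So does the same text landing on more than two different pages.
    SEEN[digest].uniq.size > 2
  end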
 

trans. (T. Onoma)

| > ...So much so that I am now declaring
| > my intent to FIX IT.
|
| Look up "reverse Turing test". You could write one in Java in about 6
| hours, or one in Ruby in 15 minutes.

No. Motivated spammers will always find a way around these things. A REAL
Turing test --a human being-- is effective and works. We can no longer afford
to play games.

| > The plan is simple: any revision that adds an external
| > link will be denied --or perhaps better, honeypotted.
|
| I don't like that, because I add links-out to my other sites all the time.

But that's EXACTLY what we don't want! Do us a favor: make a single link to
your own page and add all the links you want to that.

I'm truly sorry, but we must do something about this if the Garden Wiki is to
remain a viable resource. I for one have already stopped using it because of
this spam problem, and I am sure others have done likewise. Not to mention
the number of man-hours that have been wasted fighting this.

I am truly sorry if this inconveniences you, but it's the sacrifice we all
need to make if we wish to continue to have such a great resource.

T.
 

trans. (T. Onoma)

| > I still think the better idea is setting a trap that diverts the spammer
| > into their own copy of the database, where no public changes are made but
| > they still see their own edits.
|
| I never understood this. A lone rider spamming some pages by hand is not
| a problem; the wiki community can fix it easily. Automated systems are
| the real problem, and they should be fought with a simple CAPTCHA. All
| IMHO, anyway.

I agree, and CAPTCHA was my first suggestion. But the general take seemed to
be against it, citing reasons of usability and implementation, and the claim
that spammers would just find a way around it. I'm not so sure about those
points, but nonetheless pre-moderating pages with new external links is
simple enough and 100% effective.

T.
 

gabriele renzi

trans. (T. Onoma) wrote:
I agree, and CAPTCHA was my first suggestion. But the general take seemed to
be against it, citing reasons of usability and implementation, and the claim
that spammers would just find a way around it. I'm not so sure about those
points, but nonetheless pre-moderating pages with new external links is
simple enough and 100% effective.

But this kills half of the goodness of the wiki.
I remain of the opinion that just stopping the bots by changing the
posting interface a little could be enough.
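
For example, a hidden honeypot field that humans never see but
form-stuffing bots happily fill in; a sketch (the field name is invented):

  # In the edit form:
  #   <input type="text" name="website" style="display:none" value="">
  #
  # A human leaves the invisible field empty; a naive bot stuffs it.
  def looks_like_bot?(params)
    params["website"].to_s.strip != ""
  end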
 

David Ross

gabriele said:
[snip]
I never understood this. A lone rider spamming some pages by hand is not
a problem; the wiki community can fix it easily. Automated systems are
the real problem, and they should be fought with a simple CAPTCHA. All
IMHO, anyway.
I would love CAPTCHAs; I think they are a good idea. Unfortunately, not
all people think so. I think they need to be implemented regardless. It
would solve problems.

David Ross
 

David Ross

trans. (T. Onoma) said:
| [snip]

I agree, and CAPTCHA was my first suggestion. But the general take seemed to
be against it, citing reasons of usability and implementation, and the claim
that spammers would just find a way around it. I'm not so sure about those
points, but nonetheless pre-moderating pages with new external links is
simple enough and 100% effective.

T.
Actually, I haven't seen any type of anti-bot method being applied.
Someone needs to add support via CAPTCHAs and see what happens. Yahoo uses
CAPTCHAs and you don't see them whinging about it. Wikis are free support
systems via webpages; someone should just build the damn support to stop
this moronic spam.

David Ross
 

Phlip

gabriele said:
But this kills half of the goodness of the wiki.
I remain of the opinion that just stopping the bots by changing the
posting interface a little could be enough.

How can you change it so the HTML doesn't use a <form> tag with a submit
button?

Automated Web page hits don't need to "look for" the Submit button by
pixels. They just parse the page and concoct an HTTP POST request.
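
Which takes about ten lines; a sketch with net/http (the URLs and field
names are invented):

  require 'net/http'
  require 'uri'

  # Fetch the edit form and scrape every field it carries, tokens included.
  body = Net::HTTP.get(URI.parse("http://wiki.example.com/edit?page=FrontPage"))
  form = Hash[*body.scan(/<input[^>]*name="([^"]+)"[^>]*value="([^"]*)"/).flatten]

  # Overwrite the page text and concoct the POST; no browser, no pixels.
  form["text"] = "spam spam spam"
  Net::HTTP.post_form(URI.parse("http://wiki.example.com/save"), form)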
 

trans. (T. Onoma)

| > I agree, and CAPTCHA was my first suggestion. But the general take seemed
| > to be against it, citing reasons of usability and implementation, and the
| > claim that spammers would just find a way around it. I'm not so sure about
| > those points, but nonetheless pre-moderating pages with new external links
| > is simple enough and 100% effective.
|
| But this kills half of the goodness of the wiki.
| I remain of the opinion that just stopping the bots by changing the
| posting interface a little could be enough.

Half? I've done a lot of wiki editing, probably more than most. In all that
time I've added just over a handful of external links. This change only
affects pages with _new_ external links, so I do not see how this is anywhere
near "half". Do you honestly add a new external link every other time you
edit or add a wiki page?

T.
 

trans. (T. Onoma)

On Sunday 24 October 2004 11:08 am, ts wrote:
|
| D> I would love CAPTCHAs; I think they are a good idea. Unfortunately, not
| D> all people think so. I think they need to be implemented regardless. It
| D> would solve problems.
|
| make it optional
|
| http://simon.incutio.com/archive/2004/07/29/jimmy

Did you notice that page was spammed too?

T.
 
