RubyGarden Spam


James Britt

The rubygarden wiki has been over-run with spam links.

220.163.37.233 is one of the offending source IP addresses.

I fixed the home page, and then saw the extent of the crap. Looks like
many personal pages have been altered.

Those with user pages may want to go check their own page to assist with
the clean up.

James
 

David Ross

You should create a way to generate images with text
verification. This would eliminate spam.

--dross

--- James Britt said:
> The rubygarden wiki has been over-run with spam links.
>
> 220.163.37.233 is one of the offending source IP addresses.
>
> I fixed the home page, and then saw the extent of the crap. Looks like
> many personal pages have been altered.
>
> Those with user pages may want to go check their own page to assist with
> the clean up.
>
> James





 

Robert McGovern

> You should create a way to generate images with text
> verification. This would eliminate spam.

I think it would slow them down but it wouldn't eliminate them completely.
 

Austin Ziegler

> You should create a way to generate images with text
> verification. This would eliminate spam.

Captchas can generally be defeated by programs, and they violate
usability standards in any case unless there's a fallback -- which
spammers could likely use to continue their process.

-austin
 

David Ross

Yes. Captcha analyzers would work on it, but only to an extent. If you
make the image complex enough -- not just text, but squiggly,
complicated lines mixed in -- it should be able to confuse an AI.
Squiggly letters alone are the sign of a novice ;). I certainly don't
use them. If I were to make one, it would be confusing enough that it
wouldn't look like words or anything to an analyzer.

Wiki spam is ridiculous. Of course they are going to keep doing it;
these considerations should have been thought through before writing
the wiki software. Security should be the ultimate goal in any
software, because there are nasty people out there who will exploit it.

--dross

--- Austin Ziegler said:
> Captchas can generally be defeated by programs, and they violate
> usability standards in any case unless there's a fallback -- which
> spammers could likely use to continue their process.
>
> -austin





 

trans. (T. Onoma)

> I think it would slow them down but it wouldn't eliminate them completely.

I disagree. With a little cleverness, this would stop it completely.

Sadly, because of the spam, I for one have stopped using Garden like I used to.

T.
 

Chad Fowler

> The rubygarden wiki has been over-run with spam links.
>
> 220.163.37.233 is one of the offending source IP addresses.
>
> I fixed the home page, and then saw the extent of the crap. Looks like
> many personal pages have been altered.
>
> Those with user pages may want to go check their own page to assist with
> the clean up.

I've got a list, but it has become obvious that maintaining a list
manually isn't going to work. I'm tempted to require registration and
authentication at this point as much as I hate the thought.

Chad
 

David Ross

Chad said:
> I've got a list, but it has become obvious that maintaining a list
> manually isn't going to work. I'm tempted to require registration and
> authentication at this point as much as I hate the thought.
>
> Chad

As much as I like the idea of having authentication, I don't think it
would work. Automation of scripts or a program would allow them to
bypass the authentication system. These attacks are not automatic; they
are performed manually by morons.

--dross
 

Robert McGovern

> As much as I like the idea of having authentication, I don't think it
> would work. Automation of scripts or a program would allow them to
> bypass the authentication system. These attacks are not automatic; they
> are performed manually by morons.

If you think these are being performed manually by morons, why did you
suggest earlier having a captcha-type system?

"You should create a way to generate images with text verification.
This would eliminate spam."

Rob
 

Robert McGovern

> I've got a list, but it has become obvious that maintaining a list
> manually isn't going to work. I'm tempted to require registration and
> authentication at this point as much as I hate the thought.

I'd certainly be against it. I know spam is a bad thing, and indeed my
own wiki has had it from time to time, but requiring authentication /
registration removes a freedom people shouldn't have to give up, and
might indeed push people away from using it.

Also, there is nothing to stop spammers from setting up a ton of "junk"
accounts to get around it. This has happened a lot on Yahoo Groups, and
the group we are in basically decided that new users (for a period of
a couple of weeks) had to have their posts moderated. This was to
prevent general spam and job solicitations. I can't think of a way to
make that sort of scheme work in a wiki environment though.

Rob
 

Robert McGovern

> > You should create a way to generate images with text
> > verification. This would eliminate spam.
>
> I disagree. With a little cleverness, this would stop it completely.

It all hangs on whether it's bot spam or manual spam. I never believe
in absolutes :)

Rob
 

David Ross

Robert said:
> If you think these are being performed manually by morons, why did you
> suggest earlier having a captcha-type system?
>
> "You should create a way to generate images with text verification.
> This would eliminate spam."
>
> Rob
Sorry, I didn't explain well. I meant that sites are targeted manually;
most likely they are using automation scripts to do the actual
spamming. It depends on how much spam was actually performed, as well.
It could've been performed manually for all I know; if that's the case,
nothing will stop morons. The internet is an insecure place,
unfortunately. Bans can be evaded by open proxies, HTTP, HTTPS, SOCKS,
etc.

Chad, James: Would it be possible to just have a simple command to roll
everything back by time? Automation scripts can be halted, yet the
manual attacks McGovern brought to mind can never be halted.

-dross
 

Dave Thomas

> Captchas can generally be defeated by programs, and they violate
> usability standards in any case unless there's a fallback -- which
> spammers could likely use to continue their process.

How about displaying a trivial line of Ruby code and asking the user to
enter the value. Something like

To stop spammers, please enter the value of the following

1.+(2) = | |

Change the + to a - or * randomly, and pick random numbers between 1
and 9.
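A minimal sketch of that generator (the method name is illustrative,
not from any existing wiki codebase):

```ruby
# Build a trivial Ruby-arithmetic challenge, as described above:
# random operands 1..9 and a random operator among +, -, *.
def generate_challenge
  a, b = rand(1..9), rand(1..9)
  op   = [:+, :-, :*].sample
  # Return the question shown to the visitor and the value their
  # answer must match before the edit is accepted.
  ["#{a}.#{op}(#{b}) = | |", a.send(op, b)]
end
```

The edit form would display the question and only save the page when
the submitted value matches the expected answer.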


Cheers

Dave
 

James Britt

Robert said:
> I think it would slow them down but it wouldn't eliminate them completely.

If the spam is entered by a script, then the wiki code should be able to
use some simple heuristics to block the most annoying crap.

For example, if the diff from the old page to the new page is greater
than some percentage, or if the new page contains X number of links to
the same site.

Make this Ruby Quiz #2 :)

Might this cause a problem for legit users once in a while? Sure. But
we have that now, with spam clean-up.
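A rough sketch of such a heuristic (the thresholds here are invented
purely for illustration):

```ruby
require 'uri'

# Invented threshold: reject an edit when any single host is linked
# more than this many times in the new page text.
MAX_LINKS_PER_HOST = 5

# Flag an edit as spam if it piles links onto one site, or if the page
# balloons far beyond its previous size.
def spammy?(old_text, new_text)
  hosts = URI.extract(new_text, %w[http https])
             .map { |u| URI(u).host rescue nil }.compact
  hosts.tally.values.any? { |n| n > MAX_LINKS_PER_HOST } ||
    new_text.length > [old_text.length * 3, 2_000].max
end
```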




James Britt
 

Daniel Cremer

> I'd certainly be against it. I know spam is a bad thing, and indeed my
> own wiki has had it from time to time, but requiring authentication /
> registration removes a freedom people shouldn't have to give up, and
> might indeed push people away from using it.

Interesting thought... I wonder if a sort of Ruby Passport service
would get any use and create less hassle. I don't really agree with
having it centralised for the whole world by one company, as Microsoft
is doing, but targeted at a community like Ruby it could be useful. All
the sites such as ruby-forum, ruby-garden and rubyforge could then
identify you, and you'd only have to go through one registration
procedure. Who knows, it could even be useful for things like
distributing RubyGems and other Ruby programs...

Just a crazy thought.

-Daniel
 

trans. (T. Onoma)

Here's an idea.

I was considering the potential of moderation. And I also recalled someone
else pointing out that spammers are interested in one thing: external links
--and they had suggested we just get rid of external links altogether. Both
are too much. But then it hit me: combine the two!

If a page edit adds an external link, then the page has to be
approved by a moderator.
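One way that rule might look in code -- flag only edits that introduce
external links not already present on the page (method names are
illustrative):

```ruby
require 'uri'

# Pull all http/https links out of a page body.
def external_links(text)
  URI.extract(text, %w[http https])
end

# An edit needs moderator approval only when it adds at least one
# external link the old revision didn't already contain.
def needs_moderation?(old_text, new_text)
  (external_links(new_text) - external_links(old_text)).any?
end
```

Edits that merely keep existing links, or touch only prose, would go
through untouched.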

T.
 

Curt Hibbs

trans. (T. Onoma) said:
> Here's an idea.
>
> I was considering the potential of moderation. And I also recalled someone
> else pointing out that spammers are interested in one thing: external links
> --and they had suggested we just get rid of external links altogether. Both
> are too much. But then it hit me: combine the two!
>
> If a page edit adds an external link, then the page has to be
> approved by a moderator.

That's a very good idea!

The spammers typically add a hundred or so external links to a page.
So, requiring approval for more than, say, two external links on a
page would ease the burden on legitimate users while limiting spammers.

Curt
 

Gavin Sinclair

> I've got a list, but it has become obvious that maintaining a list
> manually isn't going to work. I'm tempted to require registration and
> authentication at this point as much as I hate the thought.

There are other, less intrusive, ways of combatting wiki spam. Why
not be tempted by one of those instead?

Gavin
 

David Ross

trans. (T. Onoma) said:
> Here's an idea.
>
> I was considering the potential of moderation. And I also recalled someone
> else pointing out that spammers are interested in one thing: external links
> --and they had suggested we just get rid of external links altogether. Both
> are too much. But then it hit me: combine the two!
>
> If a page edit adds an external link, then the page has to be
> approved by a moderator.
>
> T.

This would certainly throttle the spammers who post links, but what
about the spammers (if any) who post abusive remarks against Ruby?

-dross
 

James Edward Gray II

> If the spam is entered by a script, then the wiki code should be able
> to use some simple heuristics to block the most annoying crap.
>
> For example, if the diff from the old page to the new page is greater
> than some percentage, or if the new page contains X number of links to
> the same site.
>
> Make this Ruby Quiz #2 :)

I'm glad to see you're on the lookout for ideas, James. I haven't seen
your quiz topic submission yet. :p

I have seen your posts with links to the site though, so I'll forgive
you for not being the first.

James Edward Gray II
 
