Jamit Software Forum
Author Topic: TRAFFIC COP (Bad guys hate the cop!)  (Read 22836 times)
Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #15 on: June 09, 2010, 01:01:45 am »

...Yes, ideally the Market should offer a feature for sending notifications to everyone who has already bought the plugin. For now, you could just maintain one read-only thread (others cannot post to it) and keep posting to it whenever you upload an update.

This is what one could try: the RSS feed from the Market. http://market.jamit.com/rss.php?cat=93

If a plugin has been updated, its date will be recent and the plugin will appear at the top of the list.
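If you want to check the feed automatically instead of eyeballing it, here is a minimal Python sketch. The feed URL comes from the post above, but the exact item layout of the Market's RSS is an assumption, so the example parses a hardcoded RSS 2.0 snippet; in real use you would fetch the URL first.

```python
from email.utils import parsedate_to_datetime
import xml.etree.ElementTree as ET

def latest_items(rss_xml):
    """Return (title, date) pairs from an RSS 2.0 feed, newest first."""
    root = ET.fromstring(rss_xml)
    items = []
    for item in root.iter("item"):
        title = item.findtext("title", "")
        pub = item.findtext("pubDate", "")
        items.append((title, parsedate_to_datetime(pub)))
    return sorted(items, key=lambda pair: pair[1], reverse=True)

# In real use you would fetch the feed itself, e.g.:
#   from urllib.request import urlopen
#   rss_xml = urlopen("http://market.jamit.com/rss.php?cat=93").read()
# The sample below stands in for the real feed (item fields assumed).
SAMPLE = """<rss version="2.0"><channel>
<item><title>Traffic Cop 3.58</title><pubDate>Wed, 09 Jun 2010 01:00:00 +0000</pubDate></item>
<item><title>Some Other Plugin</title><pubDate>Mon, 01 Mar 2010 12:00:00 +0000</pubDate></item>
</channel></rss>"""

for title, date in latest_items(SAMPLE):
    print(date.date(), title)
```

A recently updated plugin would show up as the first entry of the sorted list.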
Logged

SECURE your site BEFORE you wish you had! Use plugins by COLOSSAL MIND!
whitecollarjobsite
Jammers
Full Member
*
Posts: 23


« Reply #16 on: June 09, 2010, 01:58:21 am »

Very good idea! Now I'll just have to watch my RSS reader... Tongue
Logged
lee
Jammers
Sr. Member
*
Posts: 86



WWW
« Reply #17 on: July 24, 2010, 12:23:02 am »

Hi, does anyone know if Traffic Cop will stop the Google spiders from crawling the site?

Regards, lee
« Last Edit: July 25, 2010, 07:07:18 am by Peter » Logged
Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #18 on: July 24, 2010, 05:46:51 am »

It will not affect the Googlebot spider.

If you wanted to block Googlebot, you would have to set up a deny rule.

Traffic Cop is highly configurable.
« Last Edit: July 25, 2010, 07:07:40 am by Peter » Logged

lee
Jammers
Sr. Member
*
Posts: 86



WWW
« Reply #19 on: July 24, 2010, 11:13:54 am »

Cheers Peter for that

Regards lee
« Last Edit: July 25, 2010, 07:08:01 am by Peter » Logged
lee
Jammers
Sr. Member
*
Posts: 86



WWW
« Reply #20 on: July 27, 2010, 03:18:11 pm »

Hi Peter, I'm having trouble with the crawlers indexing my page; simulators are returning 0 results. I'm not sure if it's Traffic Cop causing the problem, because sometimes I receive Error 403 (Forbidden). But the reason I don't want to turn Traffic Cop off is that the last time I did, I had to reload Traffic Cop from scratch for some reason.

Regards lee
Logged
Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #21 on: July 28, 2010, 02:28:30 am »

....having trouble with the crawlers indexing my page, simulators are returning 0 results ..... because sometimes I receive Error 403 (Forbidden)..... I don't want to turn Traffic Cop off because the last time I did I had to reload Traffic Cop from scratch.....

Sorry, but I don't understand your message. Can you explain more clearly what is going on? What kind of trouble exactly? What kind of "crawlers"? How do you know and how did you verify that?

1.) What version of Traffic Cop are you using? You need to use the latest version. Today it's v 3.58.
2.) Please give me the complete Traffic Cop configuration that you are using (it's called "Deny Rules"). Just post it here in your reply.
3.) Also give me the other settings, including Default Denial Action, Whitelisted Countries etc.

Traffic Cop is 100% configurable. For example, if Traffic Cop uses no Deny Rules, then nothing will be blocked (denied), not even search engine spiders.

So, if search engine spiders are being denied access, it is because of the settings you are using.

Simulators? -- What are those?

Are you simulating a "crawler" (robot)? If so, you had better make sure that you are simulating the robot correctly.

AGAIN, the TRAFFIC COP will record each denial event in the log. If anything is being denied (blocked), it will be visible in the log. If a real robot was denied (blocked), you would see it in the log.

My conclusion (impression) is that you are worried about a problem that doesn't exist!
« Last Edit: July 28, 2010, 05:37:09 am by Peter » Logged

lee
Jammers
Sr. Member
*
Posts: 86



WWW
« Reply #22 on: July 28, 2010, 10:47:51 am »

Peter, how are you doing mate? Traffic Cop is set up with the standard config; the CONFIGURATION_PLEASE_READ file is copied and pasted into the Deny Rules (Blacklist).

I have nothing in the fields below:

Whitelisted IP Addresses (Optional)
Whitelisted Hosts (Optional)
Whitelisted User-Agents (Optional)

Traffic Cop version: 3.58
JJB v 3.5.6


Now, we have used a few different spider-bot simulators and they have all returned a 403 (Forbidden). They may be using a proxy, so we tried whitelisting their IP and host: same response. We then deleted the whitelistings and flushed the denial logs.

I have 40-50 other URLs which are not monitored by Traffic Cop, and all of these are indexed by Google, but none from the Jamit board. So last night we decided to turn Traffic Cop off overnight to see what response we would get: Jamit was indexed in Google, categories, job listings, the whole lot. So obviously the spiders are being blocked.

Can you elaborate on these instructions, Peter?

Whitelisted User-Agents (Optional)

Specified User-Agents will not be denied.

Put each entry on a new line.

You must use preg_match-style regular expression with delimiters. Examples: /Googlebot/i, /^My\ secret\ browser$/, or #Mozilla/5\.0#.

Also, is there a way to disable it? Every time we do, we have to reload the ip2nation.sql file.

Look forward to your reply

Lee
« Last Edit: July 28, 2010, 10:50:03 am by lee » Logged
Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #23 on: July 28, 2010, 11:48:26 am »

....Now we have used a few different spider-bot simulators and they have returned a 403 (Forbidden), they may be using a proxy so we tried whitelisting their IP and Host, same response, we then deleted the whitelistings and flushed the denial logs.....

OBVIOUSLY, the spider-bot simulators you are using are no good. If they were, you would not see any 403 (Forbidden) errors, because there is no deny rule that would block Googlebot, MSN, Yahoocrawl ....!

.....I have 40-50 other URLs which are not monitored by Traffic Cop and all of these are indexed by Google, but none from the Jamit board, then last night we decided to turn Traffic Cop off overnight to see what response we get, Jamit was indexed in Google, categories, job listings etc., the whole lot, so obviously the spiders are being blocked .....

Googlebot and most other normal bots (MSN, Yahoocrawl) are NOT being blocked (using the suggested configuration). If they were, you would find a record of the blocked (denied) event in the deny log. That is the whole purpose of the LOG: everything is logged, so you know what the plugin is doing.
« Last Edit: July 28, 2010, 12:00:53 pm by Peter » Logged

Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #24 on: July 28, 2010, 11:53:27 am »

.....Also is there a way to disable it? Every time we do we have to reload the ip2nation.sql file....

To temporarily disable (= toggle ON/OFF) the traffic management function of Traffic Cop, go to:

Preferences -> Settings -> Traffic Management (select the radio button Enabled/Disabled, and click save!)

You don't need to delete/reload the ip2nation database each time you want to toggle the traffic management function. The database table only needs to be installed during plugin installation (when you enable it in the Jamit plugin panel), or reloaded when you are replacing the database table with a new one.
« Last Edit: July 29, 2010, 01:26:09 am by Peter » Logged

Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #25 on: July 28, 2010, 01:20:43 pm »

.....
Can you elaborate on these instructions Peter

Whitelisted User-Agents (Optional)

Specified User-Agents will not be denied.

Put each entry on a new line.

You must use preg_match-style regular expression with delimiters. Examples: /Googlebot/i, /^My\ secret\ browser$/, or #Mozilla/5\.0#.

.........

Which instruction don't you understand?

TRAFFIC COP is made to be 100% configurable, with many, many possible settings. A typical user will only use the Deny Rules in section 1.1 (Blacklist).

When managing traffic (if enabled), Traffic Cop uses blacklist rules and whitelist rules. If a condition specified in the blacklist is true, an associated denial action will be executed. A blacklisted condition can be negated by a whitelisted condition (in most cases).
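To make the delimiter syntax from the whitelist help text concrete: Traffic Cop is PHP and uses preg_match, but the same idea can be sketched in Python. The helper names below are hypothetical; this only illustrates how patterns like /Googlebot/i get split into a body and modifiers and then applied to a User-Agent string.

```python
import re

def parse_delimited(pattern):
    """Split a preg_match-style pattern like '/Googlebot/i' into (body, flags).

    The first character is the delimiter; anything after the closing
    delimiter is a modifier (only 'i' is handled in this sketch)."""
    delim = pattern[0]
    end = pattern.rfind(delim)
    body, modifiers = pattern[1:end], pattern[end + 1:]
    flags = re.IGNORECASE if "i" in modifiers else 0
    return body, flags

def is_whitelisted(user_agent, whitelist):
    """True if any whitelist pattern matches the User-Agent string."""
    for pattern in whitelist:
        body, flags = parse_delimited(pattern)
        if re.search(body, user_agent, flags):
            return True
    return False

# One rule per line in the whitelist field, delimiters included:
rules = ["/Googlebot/i", "#Mozilla/5\\.0#"]
print(is_whitelisted("Mozilla/5.0 (compatible; Googlebot/2.1)", rules))  # True
print(is_whitelisted("EvilScraper/1.0", rules))  # False
```

Note that using # as the delimiter (third example in the help text) avoids having to escape the / inside Mozilla/5.0.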

What is a User-Agent? http://en.wikipedia.org/wiki/User_agent
You need to keep in mind that the User-Agent string can be forged. You cannot rely on it for security!!!

For example, the robot from Google, also called Googlebot, uses a User-Agent string such as:
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

The PROBLEM is that many people around the world, including some very tricky, criminal and malicious elements, also use this User-Agent string. They forge it! They do this to trick a webmaster into believing they are Googlebot, because Googlebot usually gets a "green light" to move around the site and access pretty much everything.

As I said, the User-Agent string cannot be relied upon for security. The only way to be certain that a visitor to your site is the real Googlebot is by IP address.
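Google's own documented method for this is forward-confirmed reverse DNS: reverse-resolve the visitor's IP, check that the hostname is under googlebot.com or google.com, then resolve that hostname forward and require it to map back to the same IP. A Python sketch (the function names are hypothetical, and this is not Traffic Cop's actual code):

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(host):
    """Check the hostname is really under Google's crawl domains.

    Matching a dotted suffix with endswith() defeats tricks like
    'googlebot.com.evil.example'."""
    return host.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Forward-confirmed reverse DNS check for a claimed Googlebot IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse lookup
    except socket.herror:
        return False
    if not is_google_hostname(host):
        return False
    try:
        # Forward lookup must return the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False
```

The forward-confirmation step matters because reverse DNS records are controlled by whoever owns the IP block, so a reverse record alone can also be forged.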

The text above is another reason why any sort of "robot simulator" you are using is a bad idea. I would not use any "simulator".

DO YOU KNOW HOW TO READ THE LOG?

Traffic Cop has a page called "Deny Log". This is a table which shows all the recent "deny events". This means all the HTTP requests that have been denied access.

The log table shows some important information, such as IP address, host name, User-Agent etc.

For example, a real Googlebot always has hostname googlebot.com, or the given IP address can be traced back to Google Inc., Mountain View, California, using the Whois function, which is part of Traffic Cop. Just click on the hyperlink of the IP address and you will see!

As I said, the User-Agent information in the log table (although useful) can be forged, so it cannot be relied upon. Although it may say "...Googlebot...", it may be some trickster trying to sneak into your site.
« Last Edit: July 29, 2010, 02:46:05 am by Peter » Logged

lee
Jammers
Sr. Member
*
Posts: 86



WWW
« Reply #26 on: August 19, 2010, 12:47:18 am »

Hi Peter, thanks for the update. Are the instructions I've added below the right way to update Traffic Cop? They are slightly different from your instructions.

Best regards, lee

First, delete the existing IP2nation database,
then upload the new SQL file to your server (folder /include/plugins/TrafficCop/),
then click "Install IP2nation Database" in the Admin control panel.
Logged
Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #27 on: August 19, 2010, 02:01:17 am »

Hi Lee, thanks for the comment. Your instructions are correct and so are mine! No problem. Thanks again!
Logged

Peter
Administrator
Hero Member
*****
Posts: 248



« Reply #28 on: September 10, 2010, 11:37:13 pm »

Today, 11 Sept, version 4.00 became available on the Market.

This is a MAJOR release. Many functions have been rewritten, bugs fixed, new functions added, old functions improved.

In detail:

1.) Eliminated one SQL query in function checkDenialDB(). Moved denial event counting to function logSingleDenial(). -- This improves speed.

2.) Added the "Security Alerts by Email (Hacker Attack Instant Notification)" feature. -- Now you can receive an alert in your email inbox (mobile phone?) the instant a security threshold has been breached. You define the threshold yourself. I have verified that these alerts can be received on Gmail accounts, but you should double-check by using the included test tool.

3.) Added "Basic User-Agent Security". -- This verifies that the User-Agent string is not forged. The function checks the User-Agent length and the presence of invalid (evil) characters.

4.) Updated the ARIN WHOIS URL link in the whois() function. -- ARIN has changed the URL on their website.

5.) OTHER: Changed character encoding to UTF-8. Column `condition` in table `jb_log_redirects` changed to MEDIUMTEXT.

6.) Extensive updates and additions to the Deny Rules in file CONFIGURATION_PLEASE_READ.txt. -- This is SIGNIFICANT!

7.) Added support for inline images (data URI scheme) to speed up page download (of the Admin pages of Traffic Cop).

8.) Added a function to count DNS files in the DNS cache.

9.) Many other improvements, bug fixes, functional additions and enhancements. (2010-09-11)
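The "Basic User-Agent Security" check from item 3 can be sketched as follows. The length bounds and the set of "invalid (evil)" characters are assumptions for illustration; the thread does not show the plugin's actual PHP implementation.

```python
import re

# Hypothetical bounds; the real plugin's limits are not given in the thread.
MIN_LEN, MAX_LEN = 5, 512

# Treat control characters plus a few injection-prone metacharacters
# as characters no legitimate browser or bot would send.
INVALID = re.compile(r"[\x00-\x1f\x7f<>`\\]")

def basic_user_agent_security(user_agent):
    """Reject User-Agent strings that are suspiciously short, overly
    long, or contain invalid characters (a crude forgery signal)."""
    if not (MIN_LEN <= len(user_agent) <= MAX_LEN):
        return False
    return INVALID.search(user_agent) is None

print(basic_user_agent_security("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(basic_user_agent_security("x"))                                        # False (too short)
print(basic_user_agent_security("Mozilla/5.0 <script>"))                     # False (bad chars)
```

A check like this only filters out clumsy forgeries; as discussed earlier in the thread, a well-formed User-Agent string can still be forged, so IP-based verification remains the only reliable test.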

This is what I am thinking about right now and I want you to know.

Cheers,
Peter
« Last Edit: October 13, 2010, 12:06:59 am by Peter » Logged

Amjad
Global Moderator
Hero Member
*****
Posts: 109


« Reply #29 on: September 11, 2010, 10:06:29 am »

Quote
Some ideas going through my head right now are that I sell the Deny Rules as a stand-alone product. (Traffic Cop would come only with few lines of most basic rules.)

This is what I am thinking about right now and I want you to know.

Thanks for your plugin, Peter. You can sell it to new customers; we started with and have supported Traffic Cop from the beginning!!!
Logged