Tag Archives: comparison

Comparison of China and US Unemployment Rates

Unemployment rates… Vote with your comments on which figure is believable and which is not.

Historical Data Chart



[ISN] Security Experts Expect ‘Shellshock’ Software Bug in Bash to Be Significant

http://www.nytimes.com/2014/09/26/technology/security-experts-expect-shellshock-software-bug-to-be-significant.html

By NICOLE PERLROTH
The New York Times
SEPT. 25, 2014

Long before the commercial success of the Internet, Brian J. Fox invented one of its most widely used tools. In 1987, Mr. Fox, then a young programmer, wrote Bash, short for Bourne-Again Shell, a free piece of software that is now built into more than 70 percent of the machines that connect to the Internet. That includes servers, computers, routers, some mobile phones and even everyday items like refrigerators and cameras.

On Thursday, security experts warned that Bash contained a particularly alarming software bug that could be used to take control of hundreds of millions of machines around the world, potentially including Macintosh computers and smartphones that use the Android operating system. The bug, named “Shellshock,” drew comparisons to the Heartbleed bug that was discovered in a crucial piece of software last spring.

[…]



Performance Results of Free Public DNS Services

I ran some tests today to optimize my Internet performance, using DNSBench. In my test I included all the major free public DNS services that provide some level of malicious-host protection. Below are my results.

Please Note: Performance results may vary. This test was performed over a Comcast home broadband connection in Livermore, California. The locations of the requester and the servers can contribute significantly to overall performance, so all users should perform their own tests to select an appropriate provider.
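If you want to reproduce a rough version of this test without installing DNSBench, you can time a raw UDP query to a specific resolver with a short Python sketch. This is only a sketch: the hand-built packet covers a simple A-record query, and the hostname and resolver IP in the usage example are placeholders.

```python
import socket
import struct
import time

def build_query(hostname, query_id=0x1234):
    # DNS header: ID, flags (recursion desired), QDCOUNT=1, other counts 0
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte
    qname = b"".join(struct.pack("B", len(p)) + p.encode() for p in hostname.split("."))
    # QTYPE=1 (A record), QCLASS=1 (IN)
    return header + qname + b"\x00" + struct.pack(">HH", 1, 1)

def time_lookup(server_ip, hostname, timeout=2.0):
    # Send one UDP DNS query and return the round-trip time in seconds
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        start = time.perf_counter()
        sock.sendto(build_query(hostname), (server_ip, 53))
        sock.recv(512)
        return time.perf_counter() - start
    finally:
        sock.close()
```

For example, `time_lookup("208.67.222.222", "www.example.com")` times a single query against OpenDNS; averaging many runs of a name the resolver has cached approximates the tool’s “Cached Name” column.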

Secure Free Public DNS Test Performance Graphic


Performance Ranking by Provider and DNS Server

#1 – Norton ConnectSafe DNS
nameserver 199.85.127.10 (fastest DNS lookups in overall test)
nameserver 199.85.126.10 (4th place DNS lookups in overall test)

#2 – OpenDNS Home
nameserver 208.67.220.220 (2nd fastest DNS lookups in overall test)
nameserver 208.67.222.222 (3rd place DNS lookups in overall test)

#3 – Comodo Secure DNS
nameserver 8.26.56.26 (5th place DNS lookups in overall test)
nameserver 8.20.247.20 (6th place DNS lookups in overall test)

Below is the full report from the tool:

Final benchmark results, sorted by nameserver performance:
(average cached name retrieval speed, fastest to slowest)

199. 85.127. 10 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.014 | 0.016 | 0.020 | 0.001 | 100.0 |
– Uncached Name | 0.017 | 0.086 | 0.254 | 0.070 | 100.0 |
– DotCom Lookup | 0.035 | 0.077 | 0.127 | 0.023 | 100.0 |
··· no official Internet DNS name ···
ULTRADNS – NeuStar, Inc.,US
208. 67.220.220 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.020 | 0.022 | 0.024 | 0.001 | 100.0 |
– Uncached Name | 0.021 | 0.146 | 0.590 | 0.136 | 100.0 |
– DotCom Lookup | 0.082 | 0.196 | 0.335 | 0.058 | 100.0 |
208. 67.222.222 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.020 | 0.022 | 0.025 | 0.001 | 100.0 |
– Uncached Name | 0.021 | 0.152 | 0.518 | 0.139 | 100.0 |
– DotCom Lookup | 0.078 | 0.189 | 0.351 | 0.070 | 100.0 |
199. 85.126. 10 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.030 | 0.032 | 0.037 | 0.001 | 100.0 |
– Uncached Name | 0.033 | 0.100 | 0.261 | 0.072 | 100.0 |
– DotCom Lookup | 0.061 | 0.105 | 0.159 | 0.022 | 100.0 |
··· no official Internet DNS name ···
ULTRADNS – NeuStar, Inc.,US
8. 26. 56. 26 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.032 | 0.058 | 0.246 | 0.053 | 100.0 |
– Uncached Name | 0.035 | 0.130 | 0.423 | 0.100 | 100.0 |
– DotCom Lookup | 0.035 | 0.094 | 0.132 | 0.036 | 100.0 |
ELVATE – Elvate.com, LLC,US
8. 20.247. 20 | Min | Avg | Max |Std.Dev|Reliab%|
– Cached Name | 0.032 | 0.066 | 0.253 | 0.054 | 100.0 |
– Uncached Name | 0.034 | 0.138 | 0.525 | 0.119 | 100.0 |
– DotCom Lookup | 0.034 | 0.099 | 0.130 | 0.031 | 100.0 |
ELVATE – Elvate.com, LLC,US
UTC: 2014-04-09, from 21:33:01 to 21:33:30, for 00:28.540
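To compare results across runs programmatically, the pipe-separated rows in the report can be parsed with a few lines of Python. This is a minimal sketch that assumes the exact “– Label | Min | Avg | Max | Std.Dev | Reliab% |” layout shown above:

```python
def parse_row(line):
    # Split a row like "– Cached Name | 0.014 | 0.016 | 0.020 | 0.001 | 100.0 |"
    # into its label and the five numeric columns.
    cells = [c.strip() for c in line.strip().lstrip("–-").split("|") if c.strip()]
    label = cells[0]
    values = [float(v) for v in cells[1:]]
    return label, dict(zip(["min", "avg", "max", "stddev", "reliability"], values))
```

Collecting the “Cached Name” rows for each nameserver and sorting by the `avg` field reproduces the ranking used in the report.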

Interpreting your benchmark results above:

The following guide is only intended as a quick
“get you going” reference and reminder.

To obtain a working understanding of this program’s operation, and to familiarize yourself with its many features, please see the main DNS Benchmark web page by clicking on the “Goto DNS Page” button below.

Referring to this sample:

64. 81.159. 2 | Min | Avg | Max |Std.Dev|Reliab%
– Cached Name | 0.001 | 0.001 | 0.001 | 0.000 | 100.0
– Uncached Name | 0.021 | 0.033 | 0.045 | 0.016 | 100.0
– DotCom Lookup | 0.021 | 0.022 | 0.022 | 0.001 | 100.0

The Benchmark creates a table similar to the one above for each DNS resolver (nameserver) tested. The top line specifies the IP address of the nameserver for this table.

The first three numeric columns provide the minimum, average, and maximum query-response times in seconds. Note that these timings incorporate all network delays from the querying computer, across the Internet, to the nameserver, the nameserver’s own processing, and the return of the reply. Since the numbers contain three decimal digits of accuracy, the overall resolution of the timing is thousandths of a second, or milliseconds.

The fourth numeric column shows the “standard deviation” of the collected query-response times which is a common statistical measure of the spread of the values – a smaller standard deviation means more consistency and less spread.

The fifth and last numeric column shows the reliability of the tested nameserver’s replies to queries. Since lost, dropped, or ignored queries introduce a significant lookup delay (typically a full second or more each) a nameserver’s reliability is an important consideration.
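The numeric columns can be reproduced from a list of raw query-response times. The sketch below assumes the tool reports the ordinary sample standard deviation, which the documentation above does not actually state, and treats reliability as the percentage of queries that received any reply:

```python
import statistics

def summarize(timings):
    # timings: raw query-response times in seconds for one nameserver/test
    return {
        "min": min(timings),
        "avg": statistics.mean(timings),
        "max": max(timings),
        "stddev": statistics.stdev(timings),  # assumption: sample std deviation
    }

def reliability(sent, answered):
    # Percent of queries that received any reply at all
    return 100.0 * answered / sent
```

A small standard deviation with a similar average means the server’s response times are consistent, which is what you want from a resolver.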

The labels of the middle three lines are colored red, green, and blue to match their respective bars on the response time bar chart.

The “Cached Name” line presents the timings for queries that are answered from the server’s own local name cache without requiring it to forward the query to other name servers. Since the name caches of active public nameservers will always be full of the IPs of common domains, the vast majority of queries will be cached. Therefore, the Benchmark gives this timing the highest weight.

The “Uncached Name” line presents the timings for queries which could not be answered from the server’s local cache and required it to ask another name server for the data. Specifically, this measures the time required to resolve the IP addresses of the Internet’s 30 most popular web sites. The Benchmark gives this timing the second highest weight.

The “DotCom Lookup” line presents the timings for the resolution of dot com nameserver IP addresses. This differs from the Cached and Uncached tests above, since they measure the time required to determine a dot com’s IP, whereas the DotCom Lookup measures the time required to resolve the IP of a dot com’s nameserver, from which a dot com’s IP would then be resolved. This test presents a measure of how well the DNS server being tested is connected to the dot com nameservers.
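The text above gives only the ordering of the weights (Cached highest, Uncached second, DotCom lowest), not the actual coefficients the Benchmark uses, so the weights in this sketch are purely illustrative:

```python
def overall_score(cached_avg, uncached_avg, dotcom_avg,
                  weights=(0.6, 0.3, 0.1)):  # hypothetical weights, not DNSBench's
    # Lower is better: a weighted blend of the three average timings,
    # dominated by the cached-name result per the description above.
    w_cached, w_uncached, w_dotcom = weights
    return w_cached * cached_avg + w_uncached * uncached_avg + w_dotcom * dotcom_avg
```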

The lower border of the table contains a set of eight indicators (O and -) representing non-routable networks whose IP addresses are actively blocked by the resolver to protect its users from DNS rebinding attacks: <O-OO—->. The “O” character indicates that blocking is occurring for the corresponding network, whereas the “-” character indicates that non-routable IP addresses are being resolved and rebinding protection is not present. The first four symbols represent the four IPv4 networks beginning with 10., 127., 172., and 192. respectively, and the second four symbols are the same networks but for IPv6.
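As a worked example of reading that indicator string, the sketch below maps each symbol back to the network it describes. The report names only the leading octets (10., 127., 172., 192.); the CIDR prefixes here are my assumption, matching the standard private/loopback ranges:

```python
NETWORKS = ["10.0.0.0/8", "127.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16"]

def parse_rebinding(indicator):
    # e.g. "<O-OO---->": the first four flags cover the IPv4 networks in
    # NETWORKS order; the last four cover the same networks under IPv6.
    # "O" means the resolver blocks (refuses to resolve) that network.
    flags = indicator.strip("<>")
    ipv4 = {net: flag == "O" for net, flag in zip(NETWORKS, flags[:4])}
    ipv6 = {net: flag == "O" for net, flag in zip(NETWORKS, flags[4:])}
    return ipv4, ipv6
```

For the sample `<O-OO---->`, the resolver blocks 10/8, 172.16/12 and 192.168/16 on IPv4 but provides no rebinding protection for the IPv6 equivalents.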

The final two lines at the bottom of each chart duplicate the information from the Name and Owner tabs on the Nameserver page:


The first line displays the “Reverse DNS” name of the server, if any. (This is the name looked up by the nameserver’s IP address.) The second line displays the ownership information, if any, of the network containing the nameserver.

The final line of the automatically generated chart is a timestamp that shows the date and time of the start, completion, and total elapsed time of the benchmark:

UTC: 2009-07-15 from 16:41:50 to 16:44:59 for 03:08.703

All times are given in Coordinated Universal Time (UTC), which is equivalent to GMT. In the sample shown above, the entire benchmark required 3 minutes, 8.703 seconds to run to completion.

All, or a marked portion, of the Tabular Data results on this page may be copied to the Windows’ clipboard or saved to a file for safe keeping, sharing, or later comparison.
• • •



[ISN] ‘Anonymous’ search engine sees rocketing growth after NSA revelations

http://rt.com/news/search-duckduckgo-popularity-nsa-956/

RT.com
June 19, 2013

The alternative search engine DuckDuckGo has enjoyed a record surge in traffic as NSA scandals spark fears and frighten Internet users away from the more popular Google or Yahoo!. Over the previous week DuckDuckGo, a private search engine which claims not to collect users’ searches or create any personal user profile, increased its traffic by 26 per cent and passed 3.1 million direct queries.

The traffic surge hit DuckDuckGo after Edward Snowden’s revelations that the NSA obtained direct access to the systems of Google, Facebook, Apple, Microsoft, Yahoo and other US technology companies. The scandal over the US’s PRISM program played into the website’s hands; before that, direct searches had hovered around 1.6 million a day for several months.

However, despite these impressive results, DuckDuckGo still cannot challenge the internet search industry giants. For comparison, Google handles 5,134 million searches a day.

[…]



SWOT analysis of vulnerability management vendors

Best Enterprise Vulnerability Management Product: Rapid 7 NeXpose

After reviewing the top players on my select list, it is my opinion that the most feature-rich, lowest-cost, and safest deployment option currently available is the Rapid 7 appliance. Qualys is my second choice based on the same criteria, mostly because I favor onsite deployment. Finally, McAfee comes in last for me, mostly due to their limited web and database scanning.
I just jotted down SWOT thoughts on the following vendors, so if there are any corrections, please send them to me via my blog’s contact form.

Vendors I Selected for the SWOT

  • Rapid 7
  • Qualys
  • McAfee, Inc.

Rapid 7 – NeXpose

Strengths
– Highly focused on just vulnerability management
– Quick deployment
– Fast customer adoption (high growth)
– Recent infusion of growth capital (VC funding)
– Enterprise ticketing integration
– Web application scanning
– Database scanning
– VMware capability
– Onsite deployment
– Low cost (depreciable)

Weaknesses
– Small company
– Limited policy compliance functionality (ITGRC)
– Operations cost (management, power, rack space, etc.)
– Small research team
– Small support team

Opportunities
– Take greater market share as larger vendors lag
– Expansion to policy management (ITGRC)
– Expand distribution channel
– Integration with 3rd-party blocking technology (web app firewalls)
– Integrate web app scanning ticketing with development bug-tracking systems

Threats
– Company acquisition
– Alternative technologies are developed
– Large players address weaknesses

Qualys – QualysGuard Enterprise

Strengths
– SaaS and cloud adoption increasing
– Web application security
– Database security
– Quick deployment
– Enterprise ticket integration
– Highly focused on vulnerability management

Weaknesses
– SaaS only (high cost for onsite deployment option)
– High ongoing fees (non-depreciable)
– Lower ROI due to continuous yearly subscription model
– Limited database scanning support

Opportunities
– Commit to an onsite deployment option
– Reduce yearly subscription renewals to address the ROI argument
– Move more toward a SaaS-based ITGRC platform
– Integrate web app scanning ticketing with development bug-tracking systems

Threats
– ITGRC vendors expand into the vulnerability management space
– Smaller, more nimble companies develop better functionality
– Larger players lower pricing further
– Larger players match the SaaS offering

McAfee – McAfee Vulnerability Manager

Strengths
– Large market share
– Countermeasure awareness
– VMware option available
– Foundstone research heritage
– Instant new-threat assessment reporting
– Onsite deployment option

Weaknesses
– Limited web application scanning
– Limited database scanning
– Countermeasure awareness limitations (competitor products?)
– Console strategy unknown (ePO?)
– Some functionality requires a separate console

Opportunities
– SaaS expansion to include ticketing and policy compliance (ITGRC)
– Consolidate existing SaaS offerings under one single website console
– Consolidate separately managed products into ePO (i.e., Vulnerability Manager, Risk and Compliance Manager, and Remediation Manager)

Threats
– Poor execution of a consolidated console strategy
– Possibility of acquisition
– Reduced revenue due to commoditization

Note: The results of this analysis are not quantitative in nature; they are solely the opinions of the author and not those of any other association, organization, or person.
