I ran some quick tests on a couple of other sites. Here are Wired.com and WashingtonPost.com. Note that on Wired the 'www' sub-domain was mandatory (or was before they fixed it, for whatever reason: load-balanced servers or a misconfiguration). WashingtonPost.com and Apache.org are still wide open, and God knows how many other sites, as I only tested a handful.
Here are a couple of quick snapshots of The Washington Post and Wired.com. I have blurred out client IPs.
WASHINGTONPOST.COM:
WIRED.COM:
THIS IS ONLY THE FIRST PART OF PAGES AND PAGES OF INFORMATION, INCLUDING CLIENT IPs!
Next, I tested Apache.org, which seemed a reasonable candidate. To my surprise, IT was wide open too! Great example they are setting, lol.
Again, this is only the FIRST of MANY, MANY pages of information (as you can see), including IP addresses of ALL clients currently hitting the server. In all my screenshots I've omitted them, or blurred them out.
Why Care?
So clearly we have here a lot of public information that could be useful to attackers, competitors, statistics collectors, and whoever else. A person could poll this page on an interval and gather a lot of information about your visitors and what content they are visiting. In addition, 'sensitive' URLs that make the mistake of hoping for security through obscurity may be exposed to potential attackers, especially if they pass credentials on the query string. And, of course, at the most basic level, this status page may let a DDoS attacker see what effect he or she is having, and what counter-measures you may have in place.
Let's recap. Consider:
- Sites that pass credentials on the query string via GET requests, under the *assumption* that nobody else is seeing those requests in REAL TIME.
- Sites that rely on security through obscurity, using unique folder and file names.
- The privacy implications, given that this status page can be continually refreshed to gather lots of information about a server AND its visitors.
- Cases where additional information about the server is helpful to attackers.
- Other theoretical concerns I can't imagine.
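To make the polling concern concrete, here's a minimal Python sketch that pulls the client and request columns out of a status-page table. The HTML below is a made-up fragment only loosely mimicking the real mod_status table (real pages have more columns and markup); an attacker would simply fetch /server-status in a loop and run something like this over each response.

```python
import re

# Hypothetical fragment standing in for a fetched /server-status page.
# Column positions here are illustrative, not the exact mod_status layout.
SAMPLE = """
<tr><td>0-1</td><td>1234</td><td>W</td><td>10.0.0.5</td>
<td>www.example.com</td><td>GET /secret-report.pdf HTTP/1.1</td></tr>
<tr><td>0-2</td><td>1235</td><td>_</td><td>10.0.0.9</td>
<td>www.example.com</td><td>GET /login?token=abc123 HTTP/1.1</td></tr>
"""

def extract_requests(html):
    """Pull (client, request) pairs out of status-table rows."""
    pairs = []
    for row in re.findall(r"<tr>(.*?)</tr>", html, re.S):
        cells = re.findall(r"<td>(.*?)</td>", row, re.S)
        if len(cells) >= 6:
            pairs.append((cells[3], cells[5]))
    return pairs

for client, request in extract_requests(SAMPLE):
    print(client, request)
```

Run on an interval, that's a live feed of who is requesting what.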
After forgetting a password at my forum, I saw one of many real-world examples of how this could be exploited. The forum emailed a 'secure login' URL to the account. I've seen other companies do this as well: instead of sending a new password, they send you a unique URL good for a period of time. Obviously, an attacker continually polling the server stats would see such URLs when they are accessed by the user (at least). If the pages allow for multiple accesses, and many may, then that could open the user's account to the attacker.
What is this?
As with all things Apache, it is a module, mod_status. Of course, almost every ancillary feature in Apache is implemented as a module, either statically or dynamically linked. Almost every Apache configuration will have it installed, though as we see, not all have it properly secured! It is configured by default to be built into Apache, as part of the 'base' modules.
Appending different parameters to the server-status URL provides information in other formats, and sometimes additional info (e.g. ?auto shows me mod_qos settings I don't see on the main status page of my server). See the mod_status documentation for full details.
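The ?auto variant is a plain 'Key: value' listing, one entry per line, which makes it trivial to scrape. A minimal sketch, where the sample text is a made-up excerpt of typical fields (a real script would fetch /server-status?auto over the network instead):

```python
# Hypothetical excerpt of /server-status?auto output.
SAMPLE_AUTO = """\
Total Accesses: 104820
Total kBytes: 998765
Uptime: 36520
ReqPerSec: 2.87
BusyWorkers: 12
IdleWorkers: 38
"""

def parse_auto(text):
    """Turn the machine-readable status into a dict of strings."""
    stats = {}
    for line in text.splitlines():
        if ": " in line:
            key, _, value = line.partition(": ")
            stats[key] = value
    return stats

stats = parse_auto(SAMPLE_AUTO)
print(stats["BusyWorkers"], stats["ReqPerSec"])  # prints: 12 2.87
```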
How many servers are vulnerable?
Given that this is an Apache default, and Apache.org proudly has its stats wide open, I'd say it is probably a fairly common mistake, due to lack of awareness and based on the handful of quick tests I did. I have not scanned a bunch of sites, nor will I; I just tested a handful off the top of my head. Could I have just happened to hit vulnerable ones? Perhaps, but it seems unlikely. I would guess as many as 10% of servers could be vulnerable, with 5% being a lower-end number. That is of all servers, including IIS (which is not vulnerable).
How to protect your server-status
The fix is pretty simple, as with everything Apache. Assuming you want these stats on, here's your best bet. You simply need to add something along these lines to your .htaccess, httpd.conf, or your management software's HTTPD include files:
# lock down server-status
<Location /server-status>
    SetHandler server-status
    Order deny,allow
    Deny from all
    Allow from localhost 127.0.0.1
    Allow from someplace.whereiam.example.com
</Location>
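If you're on Apache 2.4 or later, note that Order/Deny/Allow are deprecated in favor of the Require directive from mod_authz_host. The equivalent lockdown looks like this (the IP is a placeholder from the documentation range; substitute your own admin address):

```apache
# lock down server-status (Apache 2.4+ syntax)
<Location /server-status>
    SetHandler server-status
    Require local
    Require ip 192.0.2.10
</Location>
```

Multiple Require lines are OR'd together by default, so this allows localhost plus the listed IP and denies everyone else.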
The docs themselves actually recommend this, but it seems few have paid attention, or have implemented it correctly in hosted site default configurations. The ideal fix is to have mod_status not expose this page to non-localhost clients unless expressly specified.
So, check your server and make sure your stats are secure! If Wired.com isn't secure, there have to be lots more. It seems that most are secure, thankfully. However, there are PLENTY that are wide open.
UPDATE: Also check /server-info and (if using cPanel/WHM) /whm-server-status . The former may be wide open on some sites, and provides complete server information. Still, /server-status seems to be the one that is most ignored in the security configuration of sites around the web.
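When checking these endpoints, the giveaway is the page heading: default mod_status pages contain 'Apache Server Status for <host>', and mod_info pages contain 'Apache Server Information'. A quick sketch of a detector (the sample body below is made up; in practice you'd feed it the response from curl or urllib for each path):

```python
# Headings that appear on the default mod_status / mod_info pages.
SIGNATURES = ("Apache Server Status for", "Apache Server Information")

def looks_like_status_page(body):
    """True if a response body looks like an exposed status/info page."""
    return any(sig in body for sig in SIGNATURES)

# Made-up response body standing in for a live fetch:
sample = ("<html><head><title>Apache Status</title></head>"
          "<body><h1>Apache Server Status for www.example.com</h1>")
print(looks_like_status_page(sample))  # prints: True
```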
UPDATE2: As a new version of PHP was released, I checked their site and it is wide open too (PHP.NET). hXXp://www.php(dot)net/server-status
Shameless plug: Visit http://bitsum.com for great Windows freeware and shareware -- and embedded systems F/OSS software and information.
Here's a quick `nmap` command to scan a host (or range) for /server-status:
$ nmap -p 80 --script=http-methods.nse --script-args http-methods.url-path=/server-status example.com