sometimes developers just aren’t playing for the same team

This is the kind of stuff that makes us admins infuriated with developers. To illustrate, pretend we have three testing environments plus Production for a web app: Env1, Env2, Env3, Prod. Code is expected to move up through those environments sequentially.

Developer Ted: I am rolling out code to environments Env1 and Env2 at the same time.

[a few hours later]

Developer Ralph: Env2 is broken and my coworkers and I can’t get anything done. What’s wrong?

Admin Mike: The code rolled out to multiple environments earlier broke things and made the environments unstable. We’re working to fix this now. Talk to Ted, who rolled it out in multiple places all at once, and ask him not to do that in the future.

Developer Ralph: But I need to work now, and so do my coworkers. We’re going to start doing our work in Env3.

Admin Mike: [blank look knowing I don’t have authority here]

[short amount of time passes]

Developer Ralph: I need support in Env3 because it is not working properly now.

Admin Mike: Well, some of the stuff you moved up shouldn’t have been moved up and that environment is borked now and we’ll have to expend more energy to fix it.

Developer Ralph: But I and my people need to work, should we start moving to test in Production?

At this point strangling the developer actually seems like a plausible mitigation to further destruction and downtime…

powershell working with time objects

I have a perpetually running PowerShell script which is always looking at a text file to see if an install is scheduled to run within the next 2 minutes. This text file just contains a list of times when installs should run (or nothing). I want this install to run every night at 12:10 am. To do this, I need to make a list of the next 100 days’ worth of 12:10 am entries.

$basetime = get-date "11/15/2007 12:10 AM"
[array]$times = @()
for($a=0;$a -lt 100;$a++){ $times += "$($basetime.AddDays($a).ToString()) both" }
$times
11/15/2007 12:10:00 AM both
11/16/2007 12:10:00 AM both
11/17/2007 12:10:00 AM both…

This gives me a list of 100 strings that can be read into get-date as a time/date object!

$blah = get-date $times[3].replace(" both","")

Why the hell is that “both” part in there? Well, that’s something just for me; it describes the install that is occurring. When evaluating schedule entries, I strip it off and trim the string down. Why do I want to read this into get-date again? So I can do better compares!

$objScheduleTime = Get-Date $blah
if ($objScheduleTime.GetTypeCode() -ne "DateTime")
   { "timedate is invalid" }
else
   {
      $TimeDifference = $objScheduleTime - (Get-Date)
      if ($TimeDifference -lt 0)
         { "time is in the past" }
      else
         { "time is in the future" }
   }

First, convert $blah into a date-time object, then check the type code to make sure it converted correctly. A failed conversion needs to be handled rather than carried forward as a null object, or the rest of the script will complain. As usual, there are plenty of ways to do this, but this makes sense to me.
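Putting the pieces together, the 2-minute check the perpetual script performs looks roughly like this. The function name and the schedule file path are my own placeholders, and I’m assuming every line ends with an install tag like “ both”, as in my entries above:

```powershell
# Decide whether a single schedule entry is due within the next 2 minutes.
function Test-InstallDue {
    param([string]$ScheduleLine, [datetime]$Now = (Get-Date))

    # Strip the trailing install tag, same as the .replace() above
    $cleaned = $ScheduleLine.Replace(" both","")
    $scheduled = Get-Date $cleaned

    # Due if the scheduled time is between now and 2 minutes from now
    $diff = $scheduled - $Now
    ($diff.TotalMinutes -ge 0) -and ($diff.TotalMinutes -le 2)
}

# Example: check every line of the (hypothetical) schedule file
# Get-Content "C:\installs\schedule.txt" | Where-Object { Test-InstallDue $_ }
```

The perpetual script would just loop, sleep, and re-run that Where-Object against the file each pass.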

soccer goal security, risk analysis, and more from an auditor

I hesitate to post this link, which I gleaned from Anton Chuvakin’s blog, because some of its sentences are hard to read and it rambles a bit, but it has enough stuff to be thought-provoking. Anton Aylward’s post deals with soccer goal security, but touches on a ton of things involving security.

In his marvelous 1992 novel “Snow Crash”, Neal Stephenson describes a franchising system and makes reference to the “three ring manual”. This manual is the set of operating procedures for the franchise: who does what and how, down to the smallest detail. I mention this in contrast to, for example, some of the businesses that failed after 9/11. These businesses did not have any ‘plant’ (desks, computers, software, even data) that could not be replaced. They failed because their real assets were not documented; the business processes existed solely “in the heads” of the people carrying them out.

The real assets of a company are not the COTS components. This is a mistake that technical people make. The ex-IBM consultant, Gerry Weinberg, the guy who came up with the term “egoless programming”, also pointed out that people with strong technical backgrounds can convert any task into a technical task, thus avoiding work they don’t want to do. Once upon a time I excelled in the technical side of things, but I found that limited my ability to influence change with management.

Interesting stuff. Anton A. is an auditor, and as such has a unique perspective on the industry. It is easy (maddeningly easy) to point out the flaws in other people or businesses or processes, and no one does it better than auditors. Kinda like IT journalists who can spout off best practices and “told ya so’s” but don’t know anything about IT beyond their home office 10-in-1 fax printer. Ok, that’s unfair to the auditors, as they do have more usefulness and knowledge, in my book. 🙂

powershell and active directory searching

I’ve been doing some more work using PowerShell for small ad-hoc types of scripts. Basically I keep some notes around, and adjust those notes for what I need at the time. This works great when I need to query certain things from our Active Directory. While we use AD a lot, only my team uses it, which means it gets messy and out of sync quickly.

A recent request had me pull all the supervisors and managers in our company. Odd, but no one keeps a list of these, nor do we have neat groups in AD to accommodate the request. Great. I could, however, pull everyone who is listed as having a “direct report” in their AD account, which is something the desktop techs *are* good about updating.*

$objADSearcher = new-object DirectoryServices.DirectorySearcher([ADSI]"")
$objADSearcher.filter = "(&(ObjectClass=User))"
$objFoundUsers = $objADSearcher.FindAll()

[array]$objADUsers = @()

foreach ($t in $objFoundUsers)
{
   if ($t.properties.directreports)
   {
      $t.properties.name
      $objADUsers += $t
   }
}

This snippet will search out all user accounts in AD and display the names of those who have direct reports. Further properties on any given account can be found by appending .properties, e.g. $objADUsers[45].properties.
I’ve also had a need to quickly find all the members of a group in a way that allows me to copy and paste the results.

$i = "Supervisors Group"
$objADSearcher = new-object DirectoryServices.DirectorySearcher([ADSI]"")
$objADSearcher.filter = "(&(ObjectClass=Group)(name=$i))"
$objFoundGroup = $objADSearcher.FindAll()
$objFoundGroup[0].properties.member

This will display the result of the search for Supervisors Group. If only one object is returned, I often forget that I still need to reference it by index, [0].
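One annoyance: the member property comes back as distinguished names (“CN=Jane Doe,OU=Staff,DC=…”), which aren’t great for pasting into an email. A quick way I trim those down to plain names with string parsing alone, no extra AD round trips; the function name and sample DN are made up, and note this simple regex won’t handle CNs containing escaped commas:

```powershell
# Pull the value of the leading CN= component out of a distinguished name.
function Get-CNFromDN {
    param([string]$DN)
    # Match everything after "CN=" up to the first comma
    if ($DN -match '^CN=([^,]+)') { $matches[1] }
}

# Example against the group search above:
# $objFoundGroup[0].properties.member | ForEach-Object { Get-CNFromDN $_ }
```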

Now, if I get a user back and want to connect directly into their AD object, I need to leverage the path property.

$ADSPath = $objFoundUsers[0].path
$container = [ADSI]$ADSPath
$container.manager
$container.directreports

* I am positive there are many ways to accomplish these tasks, and I may not be using the most optimal method; however, this method works for me for now, until I find a better way.

rant on the economics of disk storage and business priorities

The economics of IT are always going to be a pain point. Sadly, such penny-pinching when it comes to IT spending can result in some pretty creative issues. This is just a small Friday rant from work, so read at your own risk!

Today we had a web server’s D drive (the drive with our data) fill up, which caused errors to start occurring on that server. It filled up because the log files weren’t getting cleaned up. We didn’t get alerts because our web servers run on such small disks that we were getting constant reminders about low disk space, so we turned them off since no one would pony up for more space. *

The log files weren’t getting cleaned up because a separate web log processing server’s disk was full and couldn’t pull the logs in anymore. That disk filled up because a) no one wants to make a policy on how long to keep log files or how important they are, so they are kept forever, and b) no one wants to look at the criticality of the server and assign a dollar value, which could then be used to offset costs for more storage. So it stays with the disks it has.

So a non-critical system that can’t get more storage due to penny-pinching caused an intermittent production outage on a system that itself is running on fumes because no one wants to put out for more storage. Capacity planning and budget submissions are one thing, but as much as we do them, the exec/business side continues to say “No thanks,” to the expense.

Ugh! I understand this can be a way to go for companies, kind of a JIT approach to disk storage, but it really, really helps to be up front with that policy so IT staff doesn’t have to constantly work in a “worry/told you so” sort of mode. It’s just not important until it brings down production and clients notice. Sounds awfully similar to security!

* I love the little side risk to this practice. Developers can easily enough put out code on their own that fills the disks and causes web servers to all die in production. And even without ill intent, we run the risk that someone will accidentally publish something large that effects a DoS.

misunderstood hushmail hands over mail records

I’m still playing news catch-up, but I was drawn to this Wired blog post about Hushmail handing over mail records. This is a confusing article, quite honestly.

First, I will swear that Hushmail has been offering webmail service well before the 2006 date mentioned in the article. I’ve been using them off and on for many years (both free and pay accounts), and definitely prior to 2006.

Second, I’ve never been aware of any sort of Java applet or encryption when doing mail with Hushmail. Maybe this is just in the commercial version, but I suspect it really only works with email sent to other Hushmail users or recipients forced to log into Hushmail to retrieve the mail.* I can also attest to never, ever having to supply any passphrases, only the password to my login. So this whole encryption thing with Hushmail is a niche that I would be willing to bet few people truly use or were even aware of.

Still, Hushmail seems a very misunderstood service, as they market to security-conscious people as being anonymous and private, when in fact it is really no less private than Gmail, unless you use their annoying “non-solution” tools (and as the article demonstrates, even that isn’t solid). I personally just liked having the anonymity, as opposed to the privacy.

Anyone truly paranoid about their email privacy and anonymity is much better off scouring the net for open mail relays, using PGP, and then sending through an ever-rotating list of relays to their recipients. This protects the message in transit, spreads out your mail to such a degree that no one can form a profile of you, and hides your own originating information. And even that doesn’t protect your address unless you use rotating and/or disposable mail addresses…

* I really don’t agree with that approach to email security, and most people who use it really hate the annoyance of having yet another web site to get mail from, rather than it coming to their own mailboxes. And yes, we have a secure mail solution that does this, but users both internal and external either don’t understand how to use it or actively hate it and try their damnedest to work around it…it’s just a terribly lame approach. What really sucks is marketing, who then try to say they secure email with encryption when I damn well know they can’t unless it never leaves their servers. Such misleading garbage that sucks in less-technical purchasers…

tool and book releases in my inbox

There have been a number of things released or updated recently that I want to try out, update, or read. Typically if I see new things at work, I’ll send notes to myself at home on my gmail account, but that has been getting jammed up as work has been insane lately. So I’m offloading some of the quick notes into blog posts…who knows, maybe someone else will like these too!

OSSEC 1.4 has been released. This is still on my short list of projects.

IDS Policy Manager 2.2 has been released. I’d love to check this out, but I need to get my Snort box fixed at home.

fgdump 1.7 is out. fgdump is a utility for dumping Windows passwords, essentially pwdump done more successfully and remotely.

Saw a note and placed an order today for a copy of Michael Rash’s latest book, Linux Firewalls: Attack Detection and Response.

Nipper 0.10.8 is out. Nipper can perform security audits on Cisco device configs.

proper education against werewolves?

I just wanted to capture some words from Bejtlich for my own preservation here because they rock. Feel free to take both sentences as wholly different subjects.

Forget about user education; I recommend management education. Deflect silver bullets.

If you want to read the post this was taken from… A-fucking-men. We can’t expect business and users to Get It if our own IT staffs and managers don’t Get It.

dan morrill on ethics in information security

I’ve been so terribly busy these past few weeks that I’ve not been able to keep up much with the blogs and news out there! However, one article I am very glad to have gotten to is a quick read from Dan Morrill that touches so many pain/pressure points for our industry. Need a conversation-starter with your fellow geeks? Pick a paragraph from this post and start yammering. Basically, this post is our life in a nutshell right now.

My only concern is how we actually can win battles. I should clarify that in this case I consider the attackers to be the enemy. The only way we can truly win against them is to catch them in the act and shut them down. Defending against their attacks is nothing more than being a hockey goalie slapping away shots on goal. We’re not often allowed to cross the center line and delve into the attacker’s territory, at least not with our organization’s blessing, unless we happen to work for law enforcement.

Of course, one can attack this position by modifying my definition of who the enemy is. If our battle is against the attacks, we certainly can win battles, many of them, and make progress. We can limit the attacks that affect us or make us worry, deflect the ones we do have to worry about, and detect the ones that make it through our gauntlet of defenses. We win battles every day when a random IP fails to brute-force our SSH server, or scripts/root.exe fails to execute against our web servers.

sending mail in powershell: mail message objects

I’ve made a previous post about sending emails in PowerShell. Some additional notes I have found include creating the mail message as an object rather than straight strings. I also wanted to send multi-line emails (carriage return, line feed, second line…), which is easier when the message is an object. One could properly declare the email address string as a mail address object, but I just let PowerShell auto-convert it for me.

$smtp = new-object Net.Mail.SmtpClient
$smtp.DeliveryMethod = "PickupDirectoryFromIis"
$objMailMessage = New-Object System.Net.Mail.MailMessage
$objMailMessage.From = "michael@server.com"
$objMailMessage.To.Add("michael@server.com")
$objMailMessage.Subject = "Subject line."
$objMailMessage.Body = "Hello `nThis is a second line."
$smtp.send($objMailMessage)
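If the body grows past a line or two, a here-string reads a lot better than chaining `n escapes. Here’s the same idea wrapped in a small helper; the function name and addresses are my own placeholders:

```powershell
# Build a MailMessage from plain strings; PowerShell auto-converts the
# address strings to MailAddress objects on assignment.
function New-StatusMessage {
    param([string]$From, [string]$To, [string]$Subject, [string]$Body)
    $msg = New-Object System.Net.Mail.MailMessage
    $msg.From = $From
    $msg.To.Add($To)
    $msg.Subject = $Subject
    $msg.Body = $Body
    $msg
}

# Multi-line body via a here-string -- each line is a real line, no escapes
$body = @"
Hello
This is a second line.
And a third.
"@

$msg = New-StatusMessage "michael@server.com" "michael@server.com" "Subject line." $body

# Sending works the same as before (commented out here):
# $smtp = New-Object Net.Mail.SmtpClient
# $smtp.DeliveryMethod = "PickupDirectoryFromIis"
# $smtp.Send($msg)
```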