Monday, August 31, 2009

Active Directory Powershell Utils

These four basic functions make life so much better when dealing with AD:


$ds = new-object System.DirectoryServices.DirectorySearcher

function Find-Group($samName){
    $ds.Filter = ('(samaccountname={0})' -f $samName)
    $ds.FindAll()
}

function Find-User($samName){
    $ds.Filter = ('(samaccountname={0})' -f $samName)
    $ds.FindAll()
}

function Find-Members($groupName){
    # assumes exactly one group matches; its 'member' property
    # holds the distinguished names of the members
    $group = Find-Group $groupName
    $group.Properties.member | Get-Entry
}

function Get-Entry($cn){
    # binds to a directory entry by DN, whether passed as an
    # argument or down the pipeline
    begin{
        if($cn){
            new-object System.DirectoryServices.DirectoryEntry("LDAP://$cn")
        }
    }
    process{
        if($_){
            new-object System.DirectoryServices.DirectoryEntry("LDAP://$_")
        }
    }
}


(ok, three functions. Two are the same, I know)
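Usage then looks something like this (the group and user names here are made up; any samaccountname in your domain will do):

# list the members of a group (each comes back as a DirectoryEntry)
Find-Members 'Domain Admins' | foreach { $_.Name }

# find a user account by samaccountname and look at the raw result
Find-User 'jbloggs' | foreach { $_.Path }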

Thursday, August 20, 2009

When to Automate (Revisited)

I'm going to add another scenario to my 'When to Automate' list: when the execution time is more important than the preparation time.

The classic example of this is a typical data migration effort. Months of work go into creating a one-off artefact, primarily to ensure that system down-time is minimized. There are other advantages (testing, pre-transition sign-off, auditing), but in most cases these exist to mitigate the risk of downtime, so it's still all about the execution interval.

What I'd not really thought about until the other day was that this also applies to refactoring efforts on code. Any large refactoring burns a considerable amount of time just keeping up with the changes people check in whilst you're still working away. Given a sizable enough refactoring, and enough code churn, you get to a point beyond which you actually can't keep up, and end up trapped in merge-integration hell.

There are two normal responses to this. The first is to keep the size down in the first place, and bite off smaller, more manageable chunks. This can involve some duplication of effort, which is why many team-leads will take the second option: come in in the evening or over the weekend for a big-bang macho refactorfest.

But there is a 'third way': the refactoring can be as big as you like, as long as you keep it brief. If you can take the latest version of the codebase, refactor it, and check it in before anyone else commits any more changes, then you only have to worry about the pained expressions on the faces of your co-workers as the entire solution restructures in front of their eyes. But you'll get that anyway, right?

Right now I've got some 50 projects spread over 4 solutions, most of which have folder names, project names, assembly names and base namespaces that don't line up, and some of which we just want to rename as the solution has matured.

To do this by hand would be an absolute swine of a job, and highly prone to check-in integration churn (all those merges on .csproj / .sln files! the pain! the pain!). Fortunately, Powershell just eats this kind of stuff up. It's taken the best part of a day to get the script ready, but now that it is, the whole thing can be done and checked in in some 15 minutes, solutions and source control included. And there's no point-of-no-return: at any point up till check-in, all I've done is improve the script for next time; no time has been wasted.
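For the curious, the heart of the script is no more than this kind of thing (a minimal sketch with invented names; the real one also drove the source control client and fixed up namespaces in code):

# rename a project folder / csproj and patch every reference to it
function Rename-Project($root, $oldName, $newName){
    # patch solution and project files first, while the old paths still match
    # ($oldName is treated as a regex, which is fine for typical project names)
    get-childitem $root -recurse -include *.sln,*.csproj |
        foreach {
            (get-content $_.FullName) -replace $oldName, $newName |
                set-content $_.FullName
        }
    # then rename the project file and its containing folder
    rename-item (join-path $root "$oldName\$oldName.csproj") "$newName.csproj"
    rename-item (join-path $root $oldName) $newName
}

And since AssemblyName and RootNamespace live inside the .csproj, the same blanket replace brings those into line too.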

And by de-costing[1] the whole process, we can do it again some day...



[1] ok, I'm sorry for 'de-costing'. But I struggled to come up with a better term for 'reducing the overhead to next-to-nothing' that wasn't either unnecessarily verbose or loaded with 'demean' connotations.

Monday, August 10, 2009

To Windows 7… and back again

This weekend I uninstalled the RTM of Windows 7.

Whilst I could probably have got the glide pad and sound card working with Vista drivers (had I spent enough time on the Dell website lying about what system I had, to find current drivers rather than just the aged XP ones listed under Inspiron 9300), what killed it was that my Kaiser Baas Dual DVB USB tuner wouldn’t play ball: the driver appeared all OK, the device was recognised and so on, but it just wouldn’t detect any TV channels any more. And for a media centre PC that’s just not on. With a limited window before it’d miss recording Play School and earn me the scorn of my sons, I pulled the plug and dug out the Vista RTM disk (yeah: do you think I could successfully download the with-SP2 version from MSDN this weekend? Of course not).

This is the only time in living memory I’ve upgraded a PC without pulling the old HD as a fall-back, and predictably it’s the one that bit me in the arse…

Mind you, a fresh install’s made it feel pretty snappy again, Vista or no.

Wednesday, August 05, 2009

Ticks

(No, not the ones you get camping. This is a developer blog).

What’s a Tick? Depends who you ask. ‘Ticks’ are just how a given component chooses to measure time. For example,

DateTime.Now.Ticks = 100 ns ticks, i.e. 10,000,000 ticks/sec
Environment.TickCount = 1 ms ticks since system start

These are just fine for timing webservices, round-trips to the database and even measuring the overall performance of your system from a use-case perspective (UI click to results back). But for really accurate timings you need to use system ticks, i.e. ticks of the high-resolution timer: System.Diagnostics.Stopwatch.GetTimestamp() [1]. This timer’s frequency (and hence resolution) depends on your system. On mine:

System.Diagnostics.Stopwatch.Frequency = 3,325,040,000 ticks/sec

…which I make to be 0.3ns ticks. So some 300x more accurate than DateTime.Now, and accurate enough to start measuring fine-grained performance costs, like boxing/unboxing vs generic lists.
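Which, from PowerShell, makes ad-hoc timing as easy as this (the loop is just a stand-in for whatever you're measuring):

$f  = [System.Diagnostics.Stopwatch]::Frequency    # ticks/sec on this machine
$t0 = [System.Diagnostics.Stopwatch]::GetTimestamp()
1..100000 | out-null                               # the workload under test
$t1 = [System.Diagnostics.Stopwatch]::GetTimestamp()
'{0} ticks = {1:N3} ms' -f ($t1 - $t0), (($t1 - $t0) * 1000 / $f)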

Clearly it’s overkill to use that level of precision in many cases, but if you’re logging to system performance counters you must use those ‘kind of ticks’, because that’s what counter types like PerformanceCounterType.AverageTimer and PerformanceCounterType.CounterTimer require.

[1] Thankfully since .Net 2, we don’t have to use P/Invoke to access this functionality anymore. On one occasion I mistyped the interop signature (to return void), and created a web-app that ran fine as a debug build, but failed spectacularly when built for release. Took a while to track that one down...

Tuesday, August 04, 2009

GC.Collect: Whatever doesn’t kill you makes you stronger

Just a meme that was bouncing around the office this morning, in relation to the nasty side-effect of calling GC.Collect explicitly in code: promotion of everything that doesn’t get collected.

What this means, of course, is that if you call GC.Collect way before you get any memory pressure, objects will get promoted to gen-1, then to gen-2, and quite likely not get cleaned up for a long, long time, if ever.
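You can watch it happen from the PowerShell console (generation numbers straight from the runtime):

$o = new-object object
[GC]::GetGeneration($o)   # 0: freshly allocated
[GC]::Collect()
[GC]::GetGeneration($o)   # 1: survived a collection, so promoted
[GC]::Collect()
[GC]::GetGeneration($o)   # 2: promoted again; now only a full collection will reclaim it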

LLBLGen take note
