Thursday, April 22, 2010

Accessing CodePlex using Windows Live ID via Team Explorer

…doesn’t work for me. I eventually remembered what my ‘native’ CodePlex password was, and that worked just fine.

Of course, this turns out to be an RTFM:

Q: Why do I still need a CodePlex account?
A: We still require a CodePlex account to successfully authenticate with the source control servers.

…but it wasn’t like it was plastered all over the account linking page, or (unfortunately) mentioned on that ‘how to set up TFS client’ popup they have.

Tuesday, April 13, 2010

3 Reasons to Avoid Automatic Properties

That’s a dramatic overstatement of course, because automatic properties are great in many cases (though are public fields really so wrong?). But now that VB.Net has joined the party too [1], it’s worth remembering that they are not all good news:

1/ They Can’t be Made ReadOnly

Sure you can make them have a private setter, but that’s not the same as a readonly field, which is a great check against whole classes of screw-ups. If a field shouldn’t change during an instance lifetime, make it readonly, and save yourself some pain.
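
Something like this (a minimal sketch with invented names, not code from a real project) shows the difference: the compiler polices the readonly field for you, but will happily let any method in the class reassign the private-setter version:

public class Order
{
    // readonly: the compiler only lets this be assigned in a field initializer
    // or a constructor, so it can't change during the instance's lifetime
    private readonly string _id;

    public Order(string id)
    {
        _id = id;
    }

    public string Id
    {
        get { return _id; }
    }

    // automatic property with a private setter: looks similar, but any method
    // in this class can still reassign it later, and the compiler won't object
    public string Reference { get; private set; }
}

Try to assign _id anywhere other than a constructor and the build fails, which is exactly the check you give up with the automatic property.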

2/ No Field Initializers (in C#)

The nice thing about initializing fields in field initializers is that you can’t forget to do so in one of the constructor overloads, and (in conjunction with readonly fields) you can ensure the field can never be null. Since it’s all on one line it’s easy to inspect visually, without having to chase down code paths / constructor chains by eye.
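
For example (again an illustrative sketch, names made up), the field-initializer-plus-readonly combination covers every constructor overload at once, which you can’t express with an automatic property (at least not as of C# 4):

using System.Collections.Generic;

public class RouteCache
{
    // Initialized exactly once, right here: no constructor overload can forget
    // to set it, and (being readonly) it can never subsequently become null
    private readonly List<string> _routes = new List<string>();

    public RouteCache()
    {
    }

    public RouteCache(IEnumerable<string> routes)
    {
        _routes.AddRange(routes);
    }

    public List<string> Routes
    {
        get { return _routes; }
    }
}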

(You can vote for this, for all the good it will do [2])

3/ Poor Debugging Experience

Properties are methods, right, even auto-generated ones, and they need to be executed for the debugger to ‘see’ the value. But that’s not always possible. If the managed thread is suspended (either via a threading / async wait, or by entering unmanaged code) then the debugger can’t execute the property at all, and you’ll just see errors like the one below:

Cannot evaluate expression because the current thread is in a sleep, wait or join

Here you can only determine the value of ‘AutoProperty’ through inference and guesswork, whereas ‘ManualProperty’ can always be determined from the backing field. This can be a real pain in the arse, so it’s worth avoiding automatic properties for code around thread synchronisation regions.
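
The kind of pair I’m talking about is just this (an illustrative sketch, not the actual class in question):

public class Settings
{
    // Manual property: even when the debugger can't run the getter, you can
    // still inspect the _manualProperty backing field directly in the watch window
    private int _manualProperty;
    public int ManualProperty
    {
        get { return _manualProperty; }
        set { _manualProperty = value; }
    }

    // Automatic property: the only way to see the value is to evaluate the
    // getter, which the debugger can't do while the thread is suspended
    public int AutoProperty { get; set; }
}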

As an aside, remember that there are still backing fields, it’s just that you didn’t create them, the compiler did, and it used its own naming convention (to avoid collisions) which is a bit odd. So if you write any ‘looping over fields’ diagnostic code you will see some strange names, which might take some getting used to. You’ll also see those in WinDBG and CDB when you’re looking at crash dumps and the like.
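
To illustrate (again just a sketch, reflecting over the hypothetical Settings class above), the Microsoft C# compiler currently generates names like <AutoProperty>k__BackingField:

using System;
using System.Reflection;

public static class FieldDumper
{
    public static void DumpFields()
    {
        // Enumerate every instance field on Settings, including compiler-generated ones
        foreach (FieldInfo field in typeof(Settings).GetFields(
            BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic))
        {
            // Prints "_manualProperty" and "<AutoProperty>k__BackingField"
            Console.WriteLine(field.Name);
        }
    }
}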

 

[1] ...but I bet the VB community spat chips over the curly brackets in Collection Initializers
[2] And yet whilst VB.Net 4 has this, they don’t have mixed accessibility for auto properties yet. Go figure.

Thursday, April 08, 2010

Automating WinDBG with PowerShell

I’ve been doing a bit of WinDBG work recently after a long hiatus, and I’ve been blown away by some of the things I’ve missed.

One of them was PowerDBG: a PowerShell (v2) module for working with WinDBG from PowerShell. How awesome is that? No really, how freaking awesome.

But I couldn’t help but feel the execution was lacking something. It wasn’t, for want of a better word, very PowerShelly. For example, this is what you’d do in PowerDBG to look at an object:

PS C:\> connect-windbg "tcp:port=10456,server=mr-its-64-vds"
PS C:\> Send-PowerDbgCommand ".loadby sos mscorwks"
PS C:\> Send-PowerDbgCommand "!do 0000000001af7680"
# Glance at the original WinDBG output to make sure it looks ok
PS C:\> $global:g_commandOutput
0:000> Name: MyNamespace.Services.MyService
MethodTable: 000007ff002b1fd8
EEClass: 000007ff002af238
Size: 72(0x48) bytes
(c:\Program Files (x86)\SomeFolder\SomeDll.dll)
Fields:
              MT    Field   Offset                 Type VT     Attr            Value Name
0000000000000000  4000148        8                      0 instance 0000000002409858 _laneGroups
0000000000000000  4000149       10                      0 instance 0000000002404490 _lanes
0000000000000000  400014a       18                      0 instance 00000000026c7730 _routes
0000000000000000  400014b       20                      0 instance 00000000024d4f78 _roadSections
0000000000000000  400014c       28                      0 instance 00000000026cc668 _devices
000007ff007222e0  400014d       30 ...gDatabaseProvider 0 instance 0000000001af76c8 _provider
0000000000000000  400014e       38                      0 instance 0000000002316b30 MappingsUpdated

# Call the dump-object parser to stick it in a CSV file
PS C:\> Parse-PowerDbgDSO

# look in the CSV file
PS C:\> type .\POWERDBG-PARSED.LOG
key,value

Name:,MyNamespace.Services.MyService#$#@
:,000007ff002b1fd8#$#@
:,000007ff002af238#$#@
72(0x48),bytes#$#@
4000148,8 0 instance 0000000002409858 _laneGroups#$#@
4000149,10 0 instance 0000000002404490 _lanes#$#@
400014a,18 0 instance 00000000026c7730 _routes#$#@
400014b,20 0 instance 00000000024d4f78 _roadSections#$#@
400014c,28 0 instance 00000000026cc668 _devices#$#@
400014d,30 ...gDatabaseProvider 0 instance 0000000001af76c8 _provider#$#@
400014e,38 0 instance 0000000002316b30 MappingsUpdated#$#@
PS C:\>


That’s a bit ugh. Commands share state via the global 'g_commandOutput' rather than the pipeline, and the end-goal of most operations seems to be to spit out the CSV file POWERDBG-PARSED.LOG.



I think we can do better.



I want objects, preferably ones that look like my original objects. I want to be able to send them down the pipeline, filter on them, sort them and maybe pipe some back to the debugger to pick up more details. And I want cmdlets for common WinDBG / SOS operations like !dumpobject, rather than passing every command as a string. In short, I want a real PowerShell experience.



More like this:



PS C:\> $o = dump-object 0000000001af7680
PS C:\> $o

__Name : MyNamespace.Services.MyService
__MethodTable : 000007ff002b1fd8
__EEClass : 000007ff002af238
__Size : 72
_laneGroups : 0000000002409858
_lanes : 0000000002404490
_routes : 00000000026c7730
_roadSections : 00000000024d4f78
_devices : 00000000026cc668
_provider : 0000000001af76c8
MappingsUpdated : 0000000002316b30
__Fields : {System.Object, System.Object, System.Object, System.Object...}


Note how I've mapped the field values/addresses onto a synthetic PowerShell object that uses the same names for its properties as the original fields (which were underscore-prefixed, as you can see in the original WinDBG output above). I can then work with the object in the debugger in a more natural way:



PS C:\> $o._lanes | dump-object


__0 : 000
__MethodTable : 000007ff0072b8c8
__EEClass : 000007feede6ba30
__Size : 88
buckets : 00000000024044e8
entries : 00000000024050e8
count : 688
version : 688
freeList : -1
freeCount : 0
comparer : 00000000013da180
keys : 0000000000000000
values : 0000000000000000
_syncRoot : 0000000000000000
m_siInfo : 0000000000000000
__Fields : {System.Object, System.Object, System.Object, System.Object...}


Note also that I've kept the metadata originally available about the object by mapping those WinDBG output lines to double underscore-prefixed properties on the object. And I've not lost all that extra metadata about the fields either: whilst the properties above 'shortcut' to the field value/address, you can look in the __Fields collection to find the details if you need them (it's just much harder to pipeline stuff this way):



PS C:\> $o.__Fields


MT : 0000000000000000
Field : 4000148
Offset : 8
Type :
VT : 0
Attr : instance
Value : 0000000002409858
Name : _laneGroups

MT : 0000000000000000
Field : 4000149
Offset : 10
Type :
VT : 0
Attr : instance
Value : 0000000002404490
Name : _lanes

# ... etc...


Normally looking in arrays and dictionaries via WinDBG is a massive pain in the arse (find the backing array for the dictionary, find the key-value pair for each item, find the object that the value points to). PowerDBG has a script to automate this, and again I've tried to implement a more 'pipeliney' one:



PS C:\> $items = dump-dictionary $o._lanes
PS C:\> $items[0..2]

key value
--- -----
00000000024098f8 00000000024098d0
0000000002409a10 00000000024099e8
0000000002409a68 0000000002409a40


You can easily pipe this to dump-object to inspect the objects themselves. In my case I wanted to know if any of the objects in the dictionary had a given flag set, which ended up looking something like this:



PS C:\> $items | 
% { Dump-Object $_.value } |
? { $_.MyFlag -eq 1 } |
% { $_.MyId } |
Dump-Object |
% { $_.__String }


That's a mouthful, but basically what I'm doing is a !do for all the values in that dictionary, and for all those that have MyFlag set to true I send the MyId field down the pipeline. That's a string, so I do a dump-object on it, and then send the actual string value to the output.



With a large dictionary this stuff can take quite some time (and seriously chew memory in WinDBG) but then you wouldn’t be worrying about any of this if the dictionary only had two items – you’d do it by hand.



At the moment all this is unashamedly sitting atop PowerDBG's basic 'channel' to WinDBG, but that should probably go too. PowerDBG grabs lines from the console output and concatenates them into a string, but I actually want line-by-line output from the debugger, because I want to leverage PowerShell's streaming pipeline (i.e. emit objects as they are ready, rather than when the command finishes). Maybe another day.



You can get the script for this from my SkyDrive. It’s definitely a first pass, but.
