public class ben:harrell

March 26, 2013

TFS WorkItem field “Name not supported” error

Filed under: Uncategorized — benjamin harrell @ 6:52 pm

Ran into this one the other day, and the odd error message threw me off for a few minutes until I tried one of my old developer tricks: highlight the field and see if anything weird is hiding in it. Sure enough, there were some spaces I had pasted at the end of the name. Apparently trailing whitespace is not supported, so check your spelling and any leading/trailing spaces when you get this error, and hopefully that will fix you right up.
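If you'd rather not eyeball it, a quick programmatic check catches hidden whitespace too. This is just an illustrative sketch; the class and method names are mine, not anything TFS-specific:

```csharp
using System;

public class FieldNameCheck
{
    // Flags a field name that carries hidden leading or trailing whitespace
    public static bool HasHiddenWhitespace(string name)
    {
        return name != name.Trim();
    }

    static void Main()
    {
        Console.WriteLine(HasHiddenWhitespace("Severity"));   // False
        Console.WriteLine(HasHiddenWhitespace("Severity  ")); // True
    }
}
```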

January 4, 2013

New updates for Cannot open user default database post

Filed under: database, SQL Server 2005, Visual Studio — benjamin harrell @ 1:24 pm

After writing this post we were fortunate to have many people (all smarter than me) comment with solutions, especially for different versions, so I have updated the post with the most helpful comments so that they can be found more quickly.  Good luck!

Cannot open user default database. Login failed. Login failed for user ‘UserName’. (Microsoft SQL Server, Error: 4064)

January 3, 2013

TFS Synchronization and Migration

Martin Hinshelwood recently posted about the current state of TFS sync and migration tools here, and while I thought the post was very detailed and covered the market pretty well, I think he did miss one smaller tool that I’ve used with some success: TFS Bug Item Synchronizer.

Now I don’t know what his criteria were for the article, or whether this tool didn’t fall into some category, but I definitely think it is worth a look, especially for smaller companies with smaller budgets, or even TFS teams that might not need one of the huge enterprise solutions.  (And it even has a trial period.)

From the product site:


TFS Bug Item Synchronizer is a tool which allows testing and development teams to work together using the tools built for their usage. Testers enter defects in Quality Center, and when a defect is created and updated in TFS, a developer can fix it, link all changes to it, and mark it as fixed. After this the tester continues the process on the Quality Center side to verify the fix.


I implemented this product last year, and like most sync tools it takes a little up-front work and definitely some understanding of how TFS works, but the support that was available was incredibly helpful and you can’t beat the price point.

So hopefully this adds one more tool to what seems to be an ever-growing list of tools in this space.  Now if we can just get one that requires little to no configuration  😉


August 4, 2010

TFS Builds Report clicking link gives error InvalidBuildUriException TF215070

Filed under: Team Foundation Server, Usability — benjamin harrell @ 6:53 pm

In Team Foundation Server, when running the Builds report you are presented with a list of build instances that have run in the past, plus a few columns of useful data such as Quality, Tests Passed, and % Code Coverage.  The build names themselves are intended to be clickable, showing the build details screen for that build, but often when clicking one of these links you will see the following message:

InvalidBuildUriException TF215070: The build URI 5087 is not valid. Make sure that this build exists and has not been deleted.

This error is simply telling you that the build result for this particular build no longer exists, most likely because of your retention settings for the build.  I defaulted my builds to keep only the 2 latest builds, thinking that would be a good way to save space, but in the end it means that all information about an expired build is removed, INCLUDING the build details and the label on your code!  I found more information on build retention and how to correct this in a blog post HERE

Now if I could only find out how to clean up that Build dropdown list on this report….

May 18, 2010

DevExpress add PivotGridField error

Filed under: .NET, ASP.NET, C#, Uncategorized — benjamin harrell @ 5:24 pm

If you get the “Object must be of type String” error (or Int, etc.) while adding a custom field to a Pivot Grid in DevExpress, check whether you are using a custom GroupInterval.  If you have a custom GroupInterval, you must provide a GroupValue for EVERY case that your values might contain, or you will get this error.

private void grid_CustomGroupInterval(object sender, DevExpress.XtraPivotGrid.PivotCustomGroupIntervalEventArgs e)
{
    if (Convert.ToDecimal(e.Value) < 5)
        e.GroupValue = "< 5%";
    else if (Convert.ToDecimal(e.Value) < 10)
        e.GroupValue = "< 10%";
    else if (Convert.ToDecimal(e.Value) < 15)
        e.GroupValue = "< 15%";
    else if (Convert.ToDecimal(e.Value) <= 20)
        e.GroupValue = "<= 20%";
    else
        e.GroupValue = "> 20%";
}



December 30, 2008

Beware of the “middleman”

Filed under: consulting — benjamin harrell @ 3:28 pm

I’ve consulted for many years and have usually had good success going through recruiters, third parties, etc.  They normally provide a valuable service of “buffering” your corporate consulting pay, essentially allowing you to get paid more frequently, and also allowing a large company (such as an investment bank) to contract with several smaller corporations without dealing with each individually.  My warning comes from this relationship and the leverage that such a large entity has with both parties once they’ve established…errr entrenched themselves in the system.

Whenever the middleman has too much leverage he can actually screw both parties and squeeze out all sorts of profits through junk fees, delayed payment (to gain interest on the float), fee increases, etc.  Not only is it a complete scam that I wait on average 45 days to get paid for my work, this “middleman” now wants $2.50 out of every $100 I make!  We call that taxation where I come from, and my only choice is to walk away.  But wait, that can’t be done, because my “employer” only works with this one company, so I’m forced to bend over and accept this increase.

Just remember this question when you get it:  Do you have a preferred vendor?  My new answer is NOT ZEROCHAOS.  Please let me know if any of you have had similar issues, or if I’m way off base.

The following is an email from my “middleman”, ZeroChaos (can’t you just hear the sarcasm in the email?):

Dear Valued Partner

Pursuant to the attached notification, the ZeroChaos administrative fee for all placements made at “Feral Wench” is now 2.50% for each billable hour. Continuing to provide Services on any ZeroChaos assignment at “Feral Wench” after December 19, 2008 will indicate Supplier’s acceptance of these terms. All invoiced time worked after 12/19/2008 will incur this updated administrative fee. Please direct any questions to the Workforce Support Services Team at 877-937-6242 Option 1 or

Best regards,

ZeroChaos Workforce Support Services

August 14, 2008

PSConfig.exe Beware!

Filed under: .NET, Team Foundation Server — benjamin harrell @ 9:09 pm

I’m at the end of a 2-day TFS 2008 install marathon (yes, the one with the “fixed” installer).  I needed to move my WSS 3.0 config database (wss_config) to a new database, which essentially requires creating a new wss_config database with the proper settings, and I found an article about using PSConfig.exe to create/configure everything magically for you.  Now remember, this is a Microsoft tool, recommended in MSDN and installed by WSS 3.0, so of course I trusted that it does what it says.  Well, let me just say this:

NEVER, EVER, EVER, EVER use this tool!

I ran it with the configdb command and -create, and oh yes, I got my shiny new database, but in the process it deleted my ENTIRE wwwroot folder.  I have no idea how or why, but something is REALLY wrong deep down in this code.  If you wrote this code, PLEASE feel free to contact me to discuss, and I will gladly retract my statements, but in the meantime save yourself the trouble.  I think STSAdm.exe might still do the job, or perhaps you can do it from the Admin screens (not sure), but avoid this tool.

June 8, 2008

SSIS Custom Source Component for EBCDIC

Well, it’s finally done!  Patrick and I finished version 1 of the EBCDIC Source Component, which aids tremendously in importing EBCDIC (mainframe, IBM, old-school stuff) data into SQL Server Integration Services.  We think this component will allow a number of shops to focus on getting the data right in their ETL solution within Integration Services, rather than beating their heads against an older data format that doesn’t always play well with others.

This component, named Lysine, works like most other Sources in SSIS, so it should be easy to get started.  Currently, the component has the following features:

  • Several EBCDIC code pages supported
  • Intuitive layout UI for rapid development
  • Quick preview to show you whether your layout is correct
  • All major column types supported: Redefines, Occurs, Occurs Depending On, Packed (Comp-3), Zoned
  • Single-pass conversion for scalable performance
  • Export/import of layouts for team development

Please come check out the demo, browse the User Guide, try some Samples and let us know what you think!
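To give a feel for the kind of conversion involved, here is a minimal, self-contained sketch of unpacking a Comp-3 (packed decimal) field. This is purely illustrative, not the component’s actual code; the helper name and sample bytes are mine, and the sign handling covers only the common 0xC/0xD convention:

```csharp
using System;

public class PackedDecimalDemo
{
    // Comp-3 stores two BCD digits per byte; the low nibble of the LAST byte
    // is the sign (0xD = negative, 0xC = positive in the common convention).
    public static decimal UnpackComp3(byte[] data, int scale)
    {
        long digits = 0;
        for (int i = 0; i < data.Length; i++)
        {
            digits = digits * 10 + ((data[i] >> 4) & 0x0F);  // high nibble is always a digit
            if (i < data.Length - 1)
                digits = digits * 10 + (data[i] & 0x0F);     // low nibble is a digit except on the last byte
        }
        int sign = (data[data.Length - 1] & 0x0F) == 0x0D ? -1 : 1;
        decimal value = digits;
        for (int i = 0; i < scale; i++)
            value /= 10m;                                    // apply the implied decimal point
        return sign * value;
    }

    static void Main()
    {
        // 0x12 0x3C = digits 1,2,3 with a positive sign; scale 1 gives 12.3
        Console.WriteLine(UnpackComp3(new byte[] { 0x12, 0x3C }, 1));
        // 0x04 0x5D = digits 0,4,5 with a negative sign; scale 0 gives -45
        Console.WriteLine(UnpackComp3(new byte[] { 0x04, 0x5D }, 0));
    }
}
```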

September 18, 2007

SSIS Custom Component – ProvideComponentProperties vs ReinitializeMetaData

Filed under: .NET, Custom Source Component, database, Integration Services, SQL Server 2005, SSIS, Technology — benjamin harrell @ 9:47 am


I am currently working on a custom source component in SSIS that converts EBCDIC data to ASCII inline, and one of the challenges I face is creating dynamic outputs and output columns based on the layout process of the source component.  Normally, when you want additional outputs on your component, you create them by overriding ProvideComponentProperties, like this:

public override void ProvideComponentProperties()
{
    // add the output
    ComponentMetaData.UsesDispositions = true;
    IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
    output.Name = "My New Output";
    output.ExternalMetadataColumnCollection.IsUsed = true;
}

This works really well if all of your output information is available at design time (in the SSIS UI), but what happens if your dynamic outputs are determined at runtime?  ProvideComponentProperties is only called once, when the component is added to the designer surface.  In order to add outputs dynamically at a later point you must use ReinitializeMetaData, which is called whenever Validate returns VS_NEEDSNEWMETADATA.

public override void ReinitializeMetaData()
{
    // add the output
    ComponentMetaData.UsesDispositions = true;
    IDTSOutput90 output = ComponentMetaData.OutputCollection.New();
    output.Name = "My New Output";
    output.ExternalMetadataColumnCollection.IsUsed = true;
}

Note that I have not shown the additional work of adding columns in either of these scenarios; you will need to add that code yourself.
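For completeness, adding a column to the new output follows the same pattern. This is only a sketch, assuming the same SSIS 2005 pipeline wrapper as the snippets above; the column name, length, and code page here are placeholders:

```csharp
// Sketch only: "output" is the IDTSOutput90 created above; name/type are placeholders.
IDTSOutputColumn90 column = output.OutputColumnCollection.New();
column.Name = "MyColumn";
// DT_STR with length 50 and code page 1252; precision and scale are 0 for strings
column.SetDataTypeProperties(DataType.DT_STR, 50, 0, 0, 1252);
```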

July 14, 2007

SSIS Row Limits and DefaultBufferMaxRows (Part 2)

Filed under: .NET, Custom Source Component, database, Integration Services, SSIS — benjamin harrell @ 10:44 am

I believe the mystery is solved, and I’m sad to say that (as usual) it was something silly but crucial.  In a custom source component you must create a new PipelineBuffer in order to write your rows out.  SSIS provides a way to let the engine know when we are done adding rows: a simple method called SetEndOfRowset.  The name says it all, and even the MSDN documentation is clear that you must call it.  If you don’t call this method, your error logs will say something like:

The PrimeOutput method on <your component> returned success, but did not report an end of the rowset. 

You would think that this error message would be enough to warn any developer that they were missing a key line of code, but sadly that wasn’t the case.  I had actually allocated 2 buffers: 1 for data rows and 1 for error rows.  I set a breakpoint on my SetEndOfRowset call and watched it execute, so I just knew this wasn’t my problem.  But I forgot the second buffer!  So just a note to all you brave souls commanding bits on the SSIS battlefield: call SetEndOfRowset for EACH buffer you allocate.  Good luck!
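In other words, when PrimeOutput hands you multiple buffers, each one needs its own SetEndOfRowset call. A minimal sketch of the shape, assuming a two-output SSIS 2005 source component; WriteDataRows and WriteErrorRows are hypothetical stand-ins for your own row-filling logic:

```csharp
// Sketch only: assumes buffers[0] is the default output and buffers[1] the error output.
public override void PrimeOutput(int outputs, int[] outputIDs, PipelineBuffer[] buffers)
{
    PipelineBuffer dataBuffer = buffers[0];
    PipelineBuffer errorBuffer = buffers[1];

    WriteDataRows(dataBuffer);    // hypothetical helper
    WriteErrorRows(errorBuffer);  // hypothetical helper

    // One call per allocated buffer -- forgetting either one produces the
    // "did not report an end of the rowset" error quoted above.
    dataBuffer.SetEndOfRowset();
    errorBuffer.SetEndOfRowset();
}
```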
