Friday, December 28, 2012

Quick Test for Entity Framework DbContext

Okay, so I have been working on a project using Entity Framework, with reverse-engineered POCO classes and custom T4 template files. One of the things we wanted was a way to quickly identify any issues caused by the generated entity and mapping classes.

One way to do this is to iterate through all the DbSet properties of the generated context and try to get some data. This proved to be tricky, but I did come up with something that I thought would be worth sharing with others. So here is the code that runs this basic test...
public void DbContextCheck()
{
    using (var myContext = new efDbContext())
    {
        // Use LINQ and reflection to grab the instance behind each IDbSet<T> property.
        var sets = from p in typeof(efDbContext).GetProperties()
                   where p.PropertyType.IsGenericType
                   && p.PropertyType.GetGenericTypeDefinition() == typeof(IDbSet<>)
                   select p.GetValue(myContext, null);

        foreach (var myDbSet in sets)
        {
            // Cast to IQueryable so we can enumerate without knowing TEntity;
            // MoveNext tries to pull one record and throws if the mapping is bad.
            IQueryable thisSet = (IQueryable)myDbSet;
            var myEnumerator = thisSet.GetEnumerator();
            myEnumerator.MoveNext();
        }
    }
}

Essentially, LINQ and a little reflection are used to get the instance behind each IDbSet&lt;TEntity&gt; property. One of the things that I found interesting/frustrating is that there is not an easy way to call the FirstOrDefault extension method on the DbSet. To get around this, each DbSet can be cast to IQueryable, since both IDbSet and DbSet have IQueryable in their inheritance chain. This allowed me to get the enumerator for the DbSet and call its MoveNext method to attempt to retrieve a record.

If something isn't quite right in the EF classes, this thing will blow up on you. Otherwise, it is an indication that your EF classes have generated properly, and you can code with confidence!

Wednesday, September 14, 2011

Access 2007 Macro Fun

Okay, don't ask me how or why, but an issue with the Access 2007 Transfer Text macro came across my desk. Macros... I thought these things would die out and go away, but apparently they are holding steady, if not increasing in popularity. But when macros break or don't work, who gets to fix them? Developers... developers, who would much rather be writing code and trying to figure out how to work the latest new design pattern into their current project. I digress...

Anyway, here are some tips for all you macrophiles wishing to use this particular macro.
  • Use UNC paths
  • Make sure the file you're importing is allowed by Access.
    • The allowed extensions are: txt, csv, tab, asc, tmp, htm, and html, but if you are so insistent that you need to allow some other extension, you can hack the registry using this MS article. Please note that if you are using a 64-bit OS with 32-bit Office, you may also need to look under HKLM\Software\Wow6432Node\ for the setting.
  • Make sure you have enabled macros.
    • This one is kind of a gimme, but can get overlooked.

  • If you cannot see the macro in the dropdown list, make sure you are showing all macro actions.
    • The Transfer Text macro is not trusted or regarded as "safe" so Access hides it by default (that should tell you macrophiles something right there).
  • Double check your import specification.
So, if you are going to tread into the land of macros, make sure to play by the rules as much as possible. Utilize the Microsoft documentation when setting them up, as it is rather thorough. If you run into trouble, retrace your steps and double-check everything. If all else fails and you do need to go to your resident developer, come with your hat in your hand and a shaker of salt (because he or she will probably make you eat it). Offer him or her a pizza, a case of Mt. Dew, or a 12-pack of their favorite beer for their trouble. After all, they are doing you a favor. :)

Friday, April 29, 2011

Hello Google Analytics! My name is... umm... err... meh

Okay, so here is the deal... I came up with this handy, dandy online newsroom app that has a mailer service. The goal: cut down the number of emails sent to all users of the company from 5 cagillion bazillion to one.

What does it do? Well, it is basically your typical newsroom app, with categories, defined posters, and defined post approvers. Approved stories, of course, can be viewed via the application interface. Essentially, people submit their would-be email announcement to this application, and then once a day, at a specific time, an email is sent out via a Windows service to all the users of the company, listing headlines for any relevant information and highlighting the items for the current day.

The end result... a big win in the eyes of the internal marketing folks, which of course bubbles up to the appropriate VP. Then the inevitable happens: an email from the VP asking, "Can we track how many people open this email?" CRAP!! Well, tracking email stuff I know nothing about, but tracking websites, that I can handle!

Enter Google Analytics. I have heard of it, and it is used to track and trend the company's external website, but not anything internal. From what I could tell, Google Analytics wanted an FQDN when you set up a profile, and we use no such thing for intranet sites where I work, for two reasons. 1) The end users here are not only a little dull, but lazy. I mean, who wants to type all those characters anyway? 2) The supported browser is IE, which drops network credentials and prompts for a user ID and password for FQDNs that require authentication, unless you set them up in the intranet zone. Umm... can you say group policy nightmare for the engineers? Anyway, setting an FQDN for the site was not an option. So I turned to the IT staffer's handbook: Google.

I found a nice blog article by Mike Knowles explaining that this was possible by calling the _setDomainName method of the Google Analytics tracker object (<object>._setDomainName("none")). So I was off and running, but when I got the script, it looked nothing like what he had in his example, but rather looked like this:
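The script in question was the standard asynchronous snippet of that era, which looks roughly like this ('UA-XXXXXXX-X' is a placeholder account ID, and the typeof-document guard is mine so the queue portion also runs outside a browser):

```javascript
// Standard asynchronous Google Analytics (ga.js) snippet, ca. 2011.
// 'UA-XXXXXXX-X' is a placeholder account ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
_gaq.push(['_trackPageview']);

// Loader for ga.js; guarded so the command queue above can run outside a browser.
if (typeof document !== 'undefined') {
  (function () {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}
```

Notice there is no tracker object to call _setDomainName on; everything goes through the _gaq array.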
Okay, so now what is going on? What is happening is that the _gaq object is essentially a FIFO queue of commands that will be run by the Google Analytics tracking API. Documentation for the push method can be found here. Essentially, I needed to push the _setDomainName command into the list before the _trackPageview command, as Mike's article suggests. So I modified my code to look like this:
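A sketch of the fix (same placeholder account ID as before; the key point is that _setDomainName is queued ahead of _trackPageview):

```javascript
// 'UA-XXXXXXX-X' is a placeholder account ID.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-X']);
_gaq.push(['_setDomainName', 'none']); // must be queued before _trackPageview
_gaq.push(['_trackPageview']);

// Loader for ga.js, unchanged; guarded so the queue also runs outside a browser.
if (typeof document !== 'undefined') {
  (function () {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}
```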
All appears to be well now, and I'm sure the VP will be yet another satisfied customer. Hopefully this helps somebody out there. The documentation on the Google Analytics API is pretty good and has some nice examples. Don't be afraid to venture forth there, and remember, it will take 24 hours for the fruits of your labor to show up.

Friday, October 15, 2010

SharePoint Content DB Restores

I know there are already a ton of articles, posts, etc. out there about this, but I have yet to find one that addresses all of these situations at once. In helping to administer a SharePoint farm with 1000+ sites in well over 50 site collections, I have had the opportunity to mop up after a few bone-headed blunders. Things had been quiet for some time, and then I got a request to restore an Excel file from 6 months ago. It turns out the library it was in did not have versioning turned on.

I was fairly sure of what I needed to do, but a little fuzzy on the exact procedures. "No problem," I said to myself, "I'll just pull up the documentation from the last time I did this." Oops! Apparently I did not document it last time, or I forgot where I put it, 'cause I couldn't find it anywhere. So this time around I made documentation and figured I would share it with the rest of the world. I'm sure there are other ways to do this, but I went for easy to follow, rather than sophisticated. Enjoy!

Restoring SharePoint Site or Item from Content Database Backup


  • If an entire site collection is not to be restored, then the content database must be restored to a staging farm.
  • The staging farm must be the same version/build as the production farm.
  • Any features (.wsp) that are active for the site in question on the production farm must be installed on the staging farm.


Restoring a site collection

To restore from a current version/build content db

  1. Stop the IIS website of the site collection to restore.
  2. Using SQL Server, set the content database for the site collection into single user mode.
  3. Restore the content db according to SQL Server db restore practices.
  4. Restart the IIS website.
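Steps 2-3 can be sketched with sqlcmd; the server name (SQL01), database name (WSS_Content_Portal), and backup path below are all hypothetical placeholders:

```shell
REM All names and paths here are hypothetical placeholders; adjust to your farm.
REM Kick active connections out and put the content db in single user mode.
sqlcmd -S SQL01 -Q "ALTER DATABASE [WSS_Content_Portal] SET SINGLE_USER WITH ROLLBACK IMMEDIATE"

REM Restore over the existing database from the backup file.
sqlcmd -S SQL01 -Q "RESTORE DATABASE [WSS_Content_Portal] FROM DISK = 'D:\Backups\WSS_Content_Portal.bak' WITH REPLACE"

REM Return the database to normal multi-user operation.
sqlcmd -S SQL01 -Q "ALTER DATABASE [WSS_Content_Portal] SET MULTI_USER"
```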

To restore from a previous version/build content db

  1. Stop the IIS website of the site collection to restore.
  2. Using SharePoint Central Administration remove the content db of the site collection to be restored.
  3. Using SQL Server, set the content database for the site collection into single user mode.
  4. Restore the content db according to SQL Server db restore practices.
  5. Use the stsadm –o addcontentdb command to add the content database to the site collection (stsadm –o addcontentdb –url  -databasename ). Other options may be needed.
  6. Restart the IIS website.

** Please note that it is not required to start and stop the IIS site, but in some cases it is more user-friendly.
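For step 5 above, a sketch of the addcontentdb call, run from a command prompt on the SharePoint server (the URL, database name, and server name are hypothetical placeholders):

```shell
REM All names here are hypothetical placeholders; adjust to your environment.
stsadm -o addcontentdb -url http://portal.example.local -databasename WSS_Content_Portal -databaseserver SQL01
```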

Restoring a web (sub web)
  1. Create an empty site collection on the staging farm (this can be a root or a sub site collection)
    • Sub site collections are a little trickier and may be best created via the stsadm –o createsiteinnewdb command.
  2. Restore the content db according to SQL Server db restore practices.
  3. Using SharePoint Central Administration or stsadm, take the current content db for the site collection offline, or remove it.
  4. Use the stsadm –o addcontentdb command to add the content database to the site collection (stsadm –o addcontentdb –url -databasename ). Other options may be needed.
  5. Use the stsadm –o export and stsadm –o import commands to export the specific sub web from the staging farm and import it into a specified location on the production farm.
    • It may be beneficial to use various options from the export and import commands to preserve security, etc.
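A sketch of the export/import in step 5, with the -includeusersecurity option shown as one way to preserve security (the URLs and file path are hypothetical placeholders):

```shell
REM All URLs and paths here are hypothetical placeholders; adjust to your farms.
REM On the staging farm: export the specific sub web to a .cmp file.
stsadm -o export -url http://staging.example.local/subweb -filename C:\restore\subweb.cmp -includeusersecurity

REM On the production farm: import the sub web into the target location.
stsadm -o import -url http://portal.example.local/restored -filename C:\restore\subweb.cmp -includeusersecurity
```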
Restoring an item
  1. Create an empty site collection on the staging farm (this can be a root or a sub site collection)
    • Sub site collections are a little trickier and may be best created via the stsadm –o createsiteinnewdb command.
  2. Restore the content db according to SQL Server db restore practices.
  3. Using SharePoint Central Administration or stsadm, take the current content db for the site collection offline, or remove it.
  4. Use the stsadm –o addcontentdb command to add the content database to the site collection (stsadm –o addcontentdb –url -databasename ). Other options may be needed.
  5. Find the item in question and move it to the production farm by whatever action(s) seem prudent.

Off the Market!

First of all, my apologies for not posting in some time. The tail end of the summer got quite busy for me, and although I was inspired to post some stuff here and there, alas, I did not. I do have a good reason however... I got married in September (destination wedding) and had a reception back home in early October.

As any good man knows, during the time approaching these events, I was at my bride's beck and call to help make sure things went as planned, and to help shoulder some of the stress involved with planning, etc. Now that we are through everything, life seems simpler and we again have free time to take up extracurriculars. So the posting is back on!

Tuesday, July 27, 2010

Sitefinity Users Beware!

Okay, so this is for anyone out there who plans on or is trying to run Telerik's Sitefinity <= 3.7 on IIS 7.0. If you haven't had much opportunity to read up on or play with IIS 7.0, then you may find this helpful. IIS 7.0 has changed the way that it processes IIS and ASP.NET requests. In the past it would use separate pipelines for the two different types of requests, but now they are integrated into the same pipeline, which can allow some powerful things to happen for applications that are written to take advantage of it. For applications that are not, however, frustration and confusion can ensue. This is where Sitefinity comes in...

Based on my own personal experience, if you are not careful, IIS 7.0 will be set up with the DefaultAppPool, which uses the integrated request pipeline. Sitefinity will install and work, kinda, but you will get weird 404 errors, which will drive you bonkers. Fortunately, there is a way around this: implement an application pool that uses the classic request pipeline. If you make sure to install IIS 7.0 with IIS Metabase and IIS 6 configuration compatibility, there will be a second application pool titled Classic .NET AppPool, which uses the classic request pipeline.

Simply switching the application to use this app pool will clear up a lot of gray area and allow you to steer around any other issues you may have (most likely dealing with authentication). One other thing you may want to ensure is that the Classic .NET AppPool runs under the LocalSystem account. I believe that Telerik does have some additional documentation here, but I didn't find that until after I figured stuff out for myself. Go figure...
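If you prefer the command line over the IIS Manager UI, the app pool switch can be done with appcmd; the site/application path "Default Web Site/Sitefinity" below is a hypothetical example:

```shell
REM "Default Web Site/Sitefinity" is a hypothetical site/app path; adjust to yours.
%windir%\system32\inetsrv\appcmd.exe set app "Default Web Site/Sitefinity" /applicationPool:"Classic .NET AppPool"
```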

Monday, July 19, 2010

MS Forefront Identity Manager 2010

So I was asked by my boss to take a look at an MS Virtual Lab for Forefront Identity Manager (FIM) 2010, MS's latest and greatest in its series of identity lifecycle management tools. The lab consisted of 3 sections: one dealing with account creation and provisioning, another dealing with self-service, approval-style group management, and a third using a password reset utility to show off workflow capabilities. I must admit that I approached this lab with a bit of skepticism and contempt, as I have already developed a web-based application for creating and provisioning accounts and have a hard time seeing the value in spending thousands of dollars on something that we basically already have, while giving up the ability to change/customize it. Anyway, this is my analysis and takeaway on what I experienced.

The FIM interface is basically a web interface... and a poky one at that, unless that was just the virtual lab slowing everything down. It did seem relatively intuitive, though, which is not surprising since MS has been at this for a while now.

Section 1: Account Creation and Provisioning
The first section of the lab dealt with basic account creation and provisioning. Not very impressive, as what we already have does more in some areas, and in the areas where it lacks, it could easily match or exceed FIM with a few minor modifications. The lab basically had you fill out a form with all of the particular account details, leaving room for data entry errors, etc., and based on certain options selected from a dropdown list or two, it automatically adds the user to a group or set of groups. All of this data apparently goes into a database backing the FIM somewhere, because the lab then has you fire up a script, which runs every 30 seconds, to synchronize the FIM with Active Directory. Two questions: first, why not interface directly with AD, and second, why wait so long to synchronize? The issue I see with this is that there is no immediate feedback of success, and this could ultimately slow down the account creation process in a high-demand environment. The application we have interfaces directly with AD as well as the global account list and provides direct feedback of success/failure, as well as logging all of the attributes that have been set during the account creation process. As far as adding the account to specific groups based on options selected from dropdown lists, that framework essentially already exists, is easily extensible, and just needs to be implemented.

The other question that comes to mind is security trimming and customization. Sure it is nice to have a neat web interface that can be used to create accounts, but is the interface security trimmed or can it be? The current application is and the security trimming that exists can be extended and modified. What about customizations? The environment that exists does not lend itself well to working with FIM out of the box. The advantage to the current application is that it is built around the unique account creation/provisioning process, as well as other needs with regard to modifications, moves, and deletions, and because of this, is more agile with respect to modifications dictated by the process, instead of modifying the process to deal with inflexibility in the app.

Section 2: Self-Service Group Management
I have to admit this concept is pretty cool, and I do like the approach that MS has taken here as far as patterning goes. Basically, an AD group is created and ownership is given to a manager. It appears as though FIM serves as a broker or gatekeeper for the group membership. Somewhere in the process of setting this up, an add-in is created for MS Outlook whereby users can apply to be members of a group. When a user applies for membership, a message is sent to the group owner, who can approve or deny it, which in turn informs the original requester.

So this is a pretty neat process. However, I see a bit of a problem in implementing the process via an Outlook add-in. The add-in may only be specific to Outlook 2007+, which is not consistent across the environment, and it seems that it will more than likely require user training. Windows SharePoint Services (WSS) offers an option for requesting access to SharePoint groups, which uses basically the same workflow process, but it is web-based.

Other questions I have about this FIM feature revolve around the groups themselves. Are end users limited in which groups they can even see to apply to? I'm thinking about role-based access control (RBAC) here... There could be a whole list of groups that one set of people could apply to for membership that would be superfluous to another group of people. The current application that we have does not offer this capability, but it certainly could, and it could do so while keeping RBAC trimming in mind as well. So although FIM offers a cool feature here, it is not beyond the reach of extending the current application, and doing it in a much easier-to-use web-based user portal that could exert some RBAC trimming.

Section 3: Password Reset Utility
Okay, so when I saw this in the lab outline, my first thought was "oh please... this is already being done." Then I got into it... The lab assumes that the user forgets their password at the login screen and needs to reset it. The utility works essentially like any other password reset utility you encounter on the web for any secure site, most notably banking sites. The lab first walks through the process of the user setting up the utility by picking specific questions and providing answers. It then walks you through, as an admin, viewing the workflow that is associated with, or generates, the utility program. Finally, we log out of the system and run the utility from the login prompt. Very cool! The user is able to reset their own password without any request for intervention. This could be a handy tool.

The current application does not have this exact feature, and even though it easily could, if there is not a way to place a link or call to the reset utility on the welcome/logon screen, it is a moot point. The only question/issue I have with this again goes back to software requirements: is this something that can be done on WinXP, or is it strictly Vista+?

Overall, this lab left me with more questions than a feeling of being drawn to FIM 2010. However, it did give me some cool ideas that I feel could easily be implemented in the web-based application we currently use. The only thing that I do not think would be possible is a password reset from the welcome screen, and that is just because I'm not familiar with the possible hooks into that part of the OS. A systems engineer may be able to help shed light on this. Still, with the licensing cost I have found for FIM 2010 at $15,000 per server and $18 per CAL, I wonder if a password reset utility is worth that when the rest can be done in the current web application at a fraction of the cost, while remaining flexible to the demands of business processes.