
Category Archives: Development

I’ve been using and developing against the product now known as FIM 2010 R2 since 2003. This particular piece of software started life at a company called Zoomit and was originally developed by a guy named James Booth. (Please correct me if I have the backstory wrong.) Zoomit was acquired by Microsoft, which renamed the product Microsoft Metadirectory Services. I encountered it as MMS 2.2, and in 2003 I helped migrate a large IT department to MIIS 2003, the first version of what would become the FIM Synchronization Service. The next revision came in 2007 with another name change: Identity Lifecycle Manager 2007. The product was essentially unchanged: a synchronization engine that could be extended with C# or VB.Net to sync data between LDAP, SQL, flat files, Lotus Notes…you name it. I found this extensibility really empowering. It’s the main reason I ended up focusing on development instead of system administration. Despite the “can do” attitude it fostered in administrators and developers, when polled, end users kept telling Microsoft to “get the coding” out of the product. The next release would add “declarative provisioning” capabilities to accommodate this desire. I think the results are mixed. It would also bring another name change. The product had been moved into the Forefront brand group, alongside such “winning” pieces of software as ISA Server. (Yes, that ISA…the “proxy server of doom.”)

FIM 2010 provided some very compelling use cases. Secure Self-Service Password Reset can provide a single unified password reset experience whether it’s performed from the GINA/Credential Provider (that’s the CTRL+ALT+DEL prompt to the rest of the world) or from a web-based portal. The Synchronization Service could now be controlled via declarative rules in a SharePoint-based portal that also included the ability to trigger workflow tasks when an object changed state. It also allowed delegated user and group administration. This was improved in the R2 release and again with the SP1 that came out in early 2013. There are still some rough edges, however.

MIIS 2003 could be customized by providing implementations of .Net interfaces packaged in a .Net DLL extension. The sync service scans a directory for extensions and loads them when it starts. This provided extensibility for every part of the sync cycle. When mapping an attribute in AD to an internal object, the service would call into your DLL and you could process it any way you liked with the full capabilities of the .Net Framework. There were guidelines about the kinds of things one shouldn’t do while the engine was doing its work, but it was the definition of flexible. From the sync manager UI you could even have it create a Visual Studio project for you with skeleton code to customize. All of that functionality remains, but there are some problems with this approach. Don’t get me wrong, I loved working on this stuff, but its requirements are becoming a “non-starter” in lots of IT shops. Namely, to work with the sync management service you have to log on to the console of the server via RDP. Not a big deal, but as of Server 2012 the official default deployment model is Server Core. You can’t use the sync service on Server Core; you need the full UI. In addition, if you want to debug any errors you run into, you need Visual Studio installed on the server. In 2003 it was recommended for production servers…I still do it to trap errors in the debugger that only arise with the full production data set. This really isn’t a good model going forward.
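For flavor, here’s a from-memory sketch of what an import attribute flow in such a rules extension looks like. This is a sketch, not verbatim product code: the class name, flow rule name, and attribute names are made up for illustration, the IMASynchronization interface and CSEntry/MVEntry types come from the Microsoft.MetadirectoryServices assembly that ships with the product, and the other interface members are omitted.

```csharp
using Microsoft.MetadirectoryServices;

// Sketch of a management agent rules extension. The real extension class
// implements IMASynchronization; only the import flow method is shown here.
public class ExampleExtension
{
    // Called by the sync engine for each "advanced" import attribute flow.
    public void MapAttributesForImport(
        string FlowRuleName, CSEntry csentry, MVEntry mventry)
    {
        switch (FlowRuleName)
        {
            case "cd.person:displayName": // hypothetical flow rule name
                // Compose displayName from sn and givenName in the
                // connected directory, with full .Net string handling.
                if (csentry["sn"].IsPresent && csentry["givenName"].IsPresent)
                {
                    mventry["displayName"].Value =
                        csentry["sn"].Value + ", " + csentry["givenName"].Value;
                }
                break;

            default:
                throw new EntryPointNotImplementedException();
        }
    }
}
```

The point isn’t this particular rule; it’s that arbitrary .Net code runs at each step of the flow.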

But what about declarative provisioning, you ask? Wouldn’t that eliminate the need for custom code in the sync service? I’m sure there’s an 80th-percentile case where, yes, it would do just that. The problem, though, is that the real utility of this software has been its flexibility in an area of IT that is rife with idiosyncrasies. The process of discovering, processing, and then syncing changes to and from the different identity “silos” around an enterprise can be incredibly complicated. This is often for reasons more political than technical, and because of that you might need to take steps you otherwise wouldn’t. So, for instance, you may have an attribute in a database view exposed by PeopleSoft that needs to be modified ever so slightly before it’s considered “gold” and ready to be synced to other systems, like AD. (And maybe the PeopleSoft guys hear “Microsoft” and tell you they’re afraid it will “break” their database. Seriously.) Currently, FIM 2010 R2 provides some limited means of manipulating it with functions like “Trim()” or “Left()”, but the experience is akin to writing JavaScript in Notepad after you’ve become used to Visual Studio. Did it work? Who knows, until you actually get the new “sync rule” into the engine via an import and start synchronizing…then it’s entirely possible that the “expected rule entry” that links every object to its particular matching sync rule could just say “unapplied”. Without going into the guts of how sync works in FIM 2010 (I’m saving that for another post), let’s just say that in some ways the current declarative system is easier for most things, but when you want to do something it can’t do, you are in the weeds. Just how far in the weeds is kind of amazing…like I said, I’m saving that for another post, but wow. Just wow.
I earn a living working with this software, and every now and then it occurs to me just how much weird behavior I have grown accustomed to over the years.

Anyway, as of a few months ago the Forefront brand is no more. Most of the products in its lineup have been killed. “Endpoint Protection” has been moved to the System Center line of products. I expect the next major release will continue the pattern of name changes that has accompanied every major release. I’ve been giving some thought to what the future of this thing should look like…at least if someone cared what I thought about the subject.

First off, it needs to run on Server Core. There needs to be a “development version” you can run on your workstation, where you can model the desired behavior for your environment and then upload it to the actual sync server. The current SharePoint portal just isn’t a good way to configure this incredibly complex software. It’s fine for delegating access to users and groups (as long as you don’t need much customization), but when you are doing admin or development work it is death by a thousand cuts. The UI uses AJAX wherever possible, but it’s always loading something…and even though it uses AJAX, it still blocks the UI! This is due to the main thing that needs to be taken out and shot in the head: the “FIM Service”.

FIM 2010 introduced a new component called the “FIM Service”. It provides a new database, separate from the metadirectory database the sync manager operates on, along with a hosted workflow engine built on Windows Communication Foundation and Windows Workflow Foundation. It interacts with the SharePoint-based portal via a SOAP-based WCF web service. The portal surfaces the functionality in the web service, so while you do your administration and configuration through the portal, you are actually using the web service. Awesome, right? That must mean you can fire up your favorite development tools and customize it to fit perfectly into your environment, right? No. Well, you could. If you were really a hardcore WCF developer, I’m sure this would be easy. No one, let me repeat, no one who works with this software is a hardcore WCF developer. No one. For that matter, even hardcore WCF developers don’t want to be WCF developers. WCF made writing web services incredibly easy compared to DCOM, but open-heart surgery is easy and agile compared to DCOM, so that’s kind of a low bar.

It seems like the FIM 2010 specs were designed by well-meaning people. There are lots of things about it that one would have thought were a good idea. Its web service is the most standards-compliant I have used; you could use it from Java if you were so inclined. It uses SharePoint, which lots of people use, right? It uses Kerberos constrained delegation. It has its own Security Token Service to issue claims for the web portal. Lots of buzz in those words. The problem is that while this was going on, the world shifted to lightweight REST-based web services that humans can use and understand. Microsoft has another Security Token Service called ADFS that works pretty well, from what I hear. No one uses the web service in a way that warrants the added complexity. Because of SharePoint and the web service together, the portal always feels slow. It has its own little AJAX spinner to let you know it’s loading data from the service…learn to love that little spinner. That’s all I can say. The workflow stuff is genuinely useful, but they have another workflow engine that is getting a great deal more development energy…it’s called System Center Orchestrator, and it’s awesome.

System Center has its own self-service platform called Service Manager, which can be coupled with SC Orchestrator to do everything from provisioning servers to running arbitrary PowerShell code. You can define service offerings, expose them to your enterprise customers, and tie it all together with approvals and notifications. Service Manager can integrate with SC Operations Manager to open a ticket if there’s something wrong with a service. FIM can do approvals. FIM can do custom workflows. (Like I said, there’s a future post coming on the pains of developing for FIM.) FIM can run PowerShell via an unsupported workflow module developed in the community. There are a couple, in fact. The problem here is that the System Center products are getting the major energy and focus at MS. That’s really what I see happening in the future, and it’s something I will be investigating in my own dev lab very soon. The sync engine could be paired with Service Manager and Orchestrator to do everything the current FIM solution does…and more. All without that painful SharePoint portal. Did I mention that you get a Service Manager license with FIM 2010 R2 for logging? Hmmm…the Forefront brand is dead. Other Forefront products have been transferred to System Center. Service Manager is already integrated with FIM…it doesn’t take a crystal ball to see where that’s going. In fact, I’m pretty sure I could build a passable replacement for the “FIM Service” components myself in the meantime.

I’ve been working with this software since it came out and recently I’ve gotten as deep into this thing as you can get without the source code. It’s been really, really painful. The next post will just be about what I would like to see if I were building some future version.


There are quite a few scenarios I run into routinely where I need a random value: a unique file name or password, perhaps. PowerShell makes this incredibly easy. It’s one of those features that are so useful it makes you wonder how you got along without them.

Examples

 

Get ten random uppercase letters (65..90 are the character codes for A–Z):

Get-Random -Count 10 -InputObject (65..90) | %{ [char]$_ }

Get fifteen random computer accounts for XP Workstations:

Get-ADComputer -LDAPFilter "(operatingsystem=*XP*)" | Get-Random -Count 15

Get some random records from your DNS cache:

Get-DnsClientCache | Get-Random

 

And that’s really the beauty of it…it works with any kind of collection.

Learn to love PowerShell…it’s the preferred management interface in the new server OS.

The preferred deployment model is going to be the “Server Core” installation. This is generally a Good Thing® in that it will drastically lower the attack surface for operating system exploits…but most Windows admins I know don’t even know the legacy VBScript that they should. I actually had a guy ask me how to get to the “search” feature that was on the XP start menu.

This is why UNIX admins seem better at their job.

Regardless, it really seems like MS gets how big a pain it is to run lots of machines at once. Things that used to require either expensive third-party tools or custom development are baked in. In fact, the new Server Manager interface is actually just surfacing PowerShell commands, and you can save the command text from the newest version of the Active Directory Administrative Center. This should ease the learning curve.

I guess I’ve finally read enough PowerShell examples that I’m starting to come around. Up until now I have mainly used C#, VBScript, and C++ (when forced) to do my work. One thing I can say for the PowerShell ‘methodology’ is that it’s incredibly consistent.

Get-Help Some-Command -Examples

It’s consistent patterns like the above that seem to pervade the whole system.

Speaking of ADAC…ADAC actually depends on the Active Directory Web Services. Some places might have reservations about deploying ADWS since it needs to be installed on every domain controller. (That’s the entry-level recommendation, at least.) I know quite a few places that didn’t deploy it just because it had “Web” in the name. Insert groan tag. The benefits really do outweigh any deployment or management costs, whether real or mythological.

One really neat feature of PowerShell is that HKCU, HKLM, the certificate store, and IIS are all drives. So is Active Directory.

[Image: a screenshot illustrating these PowerShell drives.]

It’s been a long time coming. Windows really didn’t have a good scripting story before PowerShell.

Full disclosure…you aren’t supposed to do this. The metaverse in FIM 2010 (ILM 2007, MIIS 2003) can be exported to an XML file. I finally got around to writing an app that can search the schema of LDAP directories and SQL tables and add those new attributes to the MV schema with a prefix. Why? Because you will pretty much never get away from data flow troubleshooting questions. The FIM Sync Manager interface can only run on the console of the server it’s installed on. (And no, you can’t install it on a server that has RemoteApp/terminal services installed.) There is some basic RBAC for segregating permissions, but you still have to give RDP access to the machine.

The initial configuration of this application is also the very definition of tedium. The “old” method using the Sync Manager was a pain…the SharePoint-based method is worse.

More to come in the next post, but the general idea is a more automated means of getting the connected directories mapped into the system, with inbound attribute flows that populate MV objects with the corresponding CD data. Programmatically generating the metaverse schema is the first step. The next step will be manipulating the MA export files. Being able to generate these config files out of band will also enable some interesting UI scenarios for creating the attribute flows…there has to be a better way.

For the impatient:

Here’s a snippet that adds MV attribute elements to the “person” class. (xDoc is the loaded metaverse export, and dsml/msDsml are the XNamespace values for the DSML and ms-dsml namespaces, declared elsewhere in the program.)

static void AddSchemaElement(string schemaObjectName)
{
    // Find the <directory-schema> element in the exported metaverse XML.
    var xEle = xDoc.Descendants(XName.Get("directory-schema", dsml.NamespaceName)).First();

    // Locate the <class> element whose first attribute (id) is "person".
    var personEle = xEle.FirstNode.ElementsAfterSelf()
        .First(x => x.Name.LocalName == "class" && x.FirstAttribute.Value == "person");

    // Build the new single-valued, indexable <attribute-type> element.
    XElement newAttributeType = new XElement(XName.Get("attribute-type", dsml.NamespaceName),
        new XAttribute("id", schemaObjectName),
        new XAttribute("single-value", "true"),
        new XAttribute(XName.Get("indexable", msDsml.NamespaceName), "true"),
        new XAttribute(XNamespace.Xmlns + "ms-dsml", msDsml.NamespaceName),
        new XElement(dsml + "name", schemaObjectName),
        new XElement(dsml + "syntax", "1.3.6.1.4.1.1466.115.121.1.15")); // Directory String syntax

    xEle.Add(newAttributeType);

    // Reference the new attribute type from the "person" class.
    personEle.Add(new XElement(dsml + "attribute",
        new XAttribute("ref", "#" + schemaObjectName),
        new XAttribute("required", "false")));
}

I upgraded my Mac to Lion. The thing I really like is the new Xcode. It runs in a single window with Interface Builder built in. Poking around in Grand Central Dispatch, Apple’s parallel task system, I was really impressed. It’s using what looks like, effectively, the lambda-passing approach that .Net 4 is using as well. Since the whole world has gone MVC crazy, it all seems easier to wrap my head around than my first forays into Mac development. The thing is, though, I’m completely over phones, tablets, and the like…I’m just done. Everywhere I go it looks like people are examining their navels with a tricorder from Star Trek.

 

I’ve been doing a lot of claims-based authentication development lately, so I am looking forward to extending that to cross-platform scenarios.

I end up writing at least one single-purpose command line app just about every day. Every now and then I end up having to “product-ize” one and give it to another group. Handling arguments is a painful part of that process. You always forget something and the app blows up because of it. This library adds some intelligence to that process.

http://www.sharpregion.com/clap-command-line-auto-parser/

 

In Active Directory you can search using the LDAP filter (anr=JOSH) and it will return any object where the search string ‘JOSH’ matches the start of givenName, sn, mail…basically any attribute that has the ANR bit (0x4) set in searchFlags in its schema entry. (This is Ambiguous Name Resolution.)

In SQL you would use the keyword ‘LIKE’ with a wildcard for a similar effect, although in SQL you would have to specify the column names to query.
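Here’s what issuing an ANR query looks like from .Net. This is a minimal sketch using System.DirectoryServices, assuming a domain-joined machine; ‘JOSH’ is just an example search term:

```csharp
using System;
using System.DirectoryServices;

class AnrSearch
{
    static void Main()
    {
        // Bind to the default naming context of the current domain.
        using (var root = new DirectoryEntry())
        using (var searcher = new DirectorySearcher(root))
        {
            // The server expands (anr=JOSH) into initial-substring
            // matches against every attribute in the ANR set.
            searcher.Filter = "(anr=JOSH)";

            foreach (SearchResult result in searcher.FindAll())
            {
                Console.WriteLine(result.Path);
            }
        }
    }
}
```

The expansion happens server-side, which is why a single short filter can cover a dozen attributes.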

I can’t tell you how much time this would have saved me in the past. I want to weep.

 

One of the problems I face often is that the wider community of .Net developers focuses on data access through databases, not LDAP directories. The impedance mismatch is even larger than OO-to-SQL. There are lots of neat-o controls and tooling built around “standard” data access, whereas LDAP directories are more of a roll-your-own kind of thing.

Enter Bart de Smet’s LinqToAd library. He implements IQueryable over a wrapper for System.DirectoryServices. This will be helpful for building model classes in MVC apps.

http://linqtoad.codeplex.com/

Sigh.

This snippet will remove non-printable characters from an ASCII string, because we will never be rid of ASCII.

// Replace ASCII control characters (code points below 32) with spaces.
// (The original cast each char to byte, which wraps for code points
// above 255; comparing the char directly avoids that.)
StringBuilder sb = new StringBuilder();
foreach (char c in someString)
{
    if (c < 32)   // control character: tab, CR, LF, etc.
    {
        sb.Append(' ');
    }
    else
    {
        sb.Append(c);
    }
}
string cleaned = sb.ToString();   // e.g. "a\tb" becomes "a b"
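Wrapped up as a reusable helper, with a quick example (the method name is mine):

```csharp
using System;
using System.Text;

class Cleaner
{
    // Replace every ASCII control character (code point < 32) with a space.
    static string StripNonPrintable(string input)
    {
        var sb = new StringBuilder(input.Length);
        foreach (char c in input)
        {
            sb.Append(c < 32 ? ' ' : c);
        }
        return sb.ToString();
    }

    static void Main()
    {
        Console.WriteLine(StripNonPrintable("col1\tcol2\r\nrow"));
        // prints: col1 col2  row  (tab, CR, and LF each replaced by a space)
    }
}
```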

In Visual Studio 2008 and 2010, holding Alt+Shift while selecting (with the arrow keys or the mouse) selects COLUMNS in the text editor. Column selection…you know it’s awesome.