Friday, November 06, 2009

SharePoint 2010 Likely To Offer App Store

This just in from ReadWriteWeb:

Microsoft will offer an application marketplace within Sharepoint 2010 that will integrate with third-party applications from its partner network. No date has been set for the marketplace launch, but it will evolve from "The Gallery", a feature that provides Sharepoint 2010 users access to templates…

Details are few about the application marketplace that will be offered through Sharepoint. But it does point to the increasing significance of third-party applications for the Sharepoint platform and how the service may evolve as cloud computing becomes more prevalent.

I predicted this a few weeks ago in my “Things To Get Excited About in SharePoint 2010” post. Here’s what I had to say:

Service Application Architecture – the Shared Service Provider was a good idea but it was a bit hard to use in practice. Under the new architecture, you can create Service Applications for things like Excel Services, Forms Services, Business Connectivity Services, and other services that you build or buy, and you can mix and match these in your farms as you like. The services get consumed by web front ends via a standard interface.

This should allow a lot of plug-and-play customization of farms. I’m even wondering if there is an opportunity for vendors here…create some services and expose them to clients from the cloud.

There are some other big changes like Claims Based Authentication and Solution sandboxing which are intriguing to me. The Solution sandboxing feature gives me this sneaking suspicion we will one day soon see a Microsoft SharePoint App Store where we can buy, download and run SharePoint solutions in our farms.

Magic Eight-ball now says: “You may rely on it”.

Wednesday, November 04, 2009

Hosting Clockwork Web Framework With Amazon

I’ve blogged a lot about my admiration for Amazon’s web services stack. I think they understand the web as well as any company in the world. It’s always been my intention to investigate Amazon’s Elastic Compute Cloud (EC2), and since I needed hosting for my new Clockwork Web Framework, I decided to give it a try.

The reason I went with Amazon rather than a traditional hoster is that I have no idea what kind of interest there will be in the framework, and therefore cannot predict what the load on a web server will be. Amazon EC2 is designed for this kind of flexibility, and you pay per hour.

The Platform

I am running a small 32-bit Windows Server 2003 instance to begin with. It only has 1.7 GB of RAM. I can scale this up if I need to, or more likely I will spin up another small instance and load balance the two using Amazon’s Elastic Load Balancer technology.

On this, I am using IIS 6, .NET 3.5, SQL Server 2005 Express, and PowerShell. Most of my files are kept on a permanent storage drive (more on this below) and served by IIS. To maximize speed and lower the CPU burden on the server, I have decided to use another Amazon technology, CloudFront.

CloudFront Content Delivery Network

CloudFront is a Content Delivery Network (CDN), like Akamai or Limelight. I use it to serve my images and resource files. Basically, Amazon has edge servers all over the world holding copies of my images and resource files, and when users request them from my website, CloudFront automatically serves them from the edge location nearest the user, making for some very fast download times.

To make this work, you have to use Amazon Simple Storage Service, or S3. This is a virtual file system. Basically you have “buckets” of files that are served up when requests come in from the CloudFront “distributions”.

I’ve optimized it a bit by having two distributions: one for images and one for resources. This means that a page which requires both will load even faster, since the browser can fetch from the two distributions in parallel.

You can create CloudFront distributions through code, or through Amazon’s web management portal.
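If you go the code route, the call is a single signed POST to the CloudFront REST API. Here is a minimal C# sketch, written from memory of the 2009-04-02 API – the keys, bucket, and CNAME are placeholders, and you should verify the header and XML details against Amazon’s documentation:

using System;
using System.IO;
using System.Net;
using System.Security.Cryptography;
using System.Text;

class CreateDistribution
{
    static void Main()
    {
        const string accessKey = "YOUR_ACCESS_KEY"; // placeholder
        const string secretKey = "YOUR_SECRET_KEY"; // placeholder

        // CloudFront authenticates by signing a single date header with HMAC-SHA1.
        // x-amz-date is used here because .NET 3.5 restricts setting the Date header directly.
        string date = DateTime.UtcNow.ToString("r");
        string signature;
        using (var hmac = new HMACSHA1(Encoding.UTF8.GetBytes(secretKey)))
            signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(date)));

        string config = @"<DistributionConfig xmlns=""http://cloudfront.amazonaws.com/doc/2009-04-02/"">
  <Origin>images-bucket.s3.amazonaws.com</Origin>
  <CallerReference>" + Guid.NewGuid() + @"</CallerReference>
  <CNAME>images.clockworkwf.com</CNAME>
  <Enabled>true</Enabled>
</DistributionConfig>";

        var request = (HttpWebRequest)WebRequest.Create("https://cloudfront.amazonaws.com/2009-04-02/distribution");
        request.Method = "POST";
        request.ContentType = "text/xml";
        request.Headers["x-amz-date"] = date;
        request.Headers["Authorization"] = "AWS " + accessKey + ":" + signature;

        byte[] body = Encoding.UTF8.GetBytes(config);
        request.ContentLength = body.Length;
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        // The response XML describes the new distribution,
        // including its assigned xxxx.cloudfront.net domain name.
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}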

Create CloudFront Distribution

Create CloudFront Distribution - Completed

Since you can control the public URL of the distributions, you will notice if you view the properties of my website that my images are served from “http://images.clockworkwf.com” and my resource files from “http://resources.clockworkwf.com”. In other words, I have full control over the paths I give them. Most people will never know these pictures are being served from Amazon.
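Behind the scenes, those friendly hostnames are just DNS CNAME records pointing at each distribution’s cloudfront.net domain – something like the following, where the cloudfront.net names are made-up placeholders:

images.clockworkwf.com.     IN CNAME d1111abcd1111a.cloudfront.net.
resources.clockworkwf.com.  IN CNAME d2222efgh2222b.cloudfront.net.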

I notice the website loads really quickly, so CloudFront makes a big difference.

EC2 Hosting Challenges

So that’s the high-level architecture. There are a number of impacts of using Amazon as a host that I’d like to talk about.

Server Goes Up, Server Goes Down

To begin with, you have to assume that at any moment your server will go down. If your server dies, it vanishes, and you have to “spin up” another one, using the web interface or code. It’s very easy to do from the web console: just click “Launch Instance” and you can pick any server ranging from Ubuntu Linux to 64-bit Windows Server 2003 R2 Enterprise.

Launching a new instance of ec2 With CloudWatch

Although the server instances you can use have their own hard drive space on C: and D: drives, you have to treat that as transitory storage.

I’ve set up my system in such a way that I can use an Elastic Block Storage (EBS) volume, provided by Amazon. This is more permanent drive space that you pay for, but it can be attached to any server instance. Think of it as a SAN (that’s probably what it is).

So I’ve got my database and web files on this EBS block, which I then mount to any server instance I’m currently running.

On the server instance, I simply point IIS web server to the EBS block files, and away we go.

The EBS can be any size you like, and you pay per GB per month. Right now I’m using 10GB since my log files and database don’t take up much room. I can add more space later if I need to.

Here’s a screenshot of that EBS volume, in the Amazon web console.

Allocate Elastic Block Storage Instance

Dynamic DNS Entries

Next problem: Since the server can go down at any moment, DNS is a problem. If my server dies and I spin another one up, it will be given its own IP address, which my DNS entry for www.clockworkwf.com wouldn’t know about. So there might be a long delay while DNS changes to the new IP address.

So, I’m using a Dynamic DNS service called Nettica. They have a management console where I can enter my various domain records and assign a short Time To Live (TTL), which means DNS changes propagate quickly. So if my server dies, I can change the entry in Nettica to point to the new server’s IP address, and within a few seconds requests are going back to the right place.

Nettica even allows me to control all of this through C# code. Going forward I plan to write powershell server management scripts that can automatically spin up a new server on Amazon, determine the IP, and register that with Nettica.

Incidentally, Amazon EC2 also lets you rent what it calls “Elastic IP Addresses” – fixed IP addresses that can be dynamically assigned to any server instance. In the short run this makes life easier for me: I have rented one, used it for my Nettica domain name record, and can assign this fixed IP to any new server instance.

Allocate IP Instance
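In code, renting and assigning one of these addresses is only a couple of calls. Here is a rough sketch using the (then brand-new) AWS SDK for .NET – the method and property names are from memory, so treat them as approximate, and the keys and instance id are placeholders:

using System;
using Amazon.EC2;
using Amazon.EC2.Model;

class ReserveAddress
{
    static void Main()
    {
        var ec2 = new AmazonEC2Client("YOUR_ACCESS_KEY", "YOUR_SECRET_KEY"); // placeholders

        // Rent a fixed ("elastic") IP address from Amazon's pool...
        var allocation = ec2.AllocateAddress(new AllocateAddressRequest());
        string ip = allocation.AllocateAddressResult.PublicIp;
        Console.WriteLine("Allocated " + ip);

        // ...then point it at whichever instance is currently serving the site.
        ec2.AssociateAddress(new AssociateAddressRequest()
            .WithPublicIp(ip)
            .WithInstanceId("i-12345678")); // hypothetical instance id
    }
}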

Next problem: Disaster Recovery.

Disaster Recovery is even more important in the Amazon EC2 world than elsewhere, since again your instances could die at any moment. Not that they will, but the point is, they are “virtual” and Amazon isn’t making any promises (unless you buy a Service Level Agreement from them).

However, Amazon’s EC2 provides a level of DR by its very nature – you can spin up another machine in a small amount of time. Estimates for new Windows instances are about 20 minutes.

There’s also something called an Availability Zone. Essentially it means “Data Centre” – Amazon has several of these, so you can spread your servers around between US-East, US-West, Europe, and so on. So when that Dinosaur-killing comet hits North America, the Europe Availability Zone keeps chugging.

Right now I’m not really doing much with my database, so DR isn’t such an issue. I have some security since my files are on an EBS block. However, eventually I’ll setup a second server in another availability zone and load balance the two.

Another Challenge: Price

Amazon Web Services are flexible, and you are charged per hour, for only what you use. This is an amazing model but it doesn’t work so well for website hosting, because of course your servers are supposed to be online 24/7, 365 days a year.

It’s hard to tell for sure what the annual bill will be, but for my small server instance (remember, only 1.7 GB of RAM) it will cost well over $1,000. That’s a lot more than shared space on a regular hosting provider. However, I’m willing to pay this for the flexibility I get, and also because I think Amazon web services are a strategic advantage, so the earlier I learn about them, the more business opportunities I might unlock.

One good thing is that Amazon has been aggressively dropping its prices as it improves its services. Additionally, they have started offering “Reserved Instances” – basically a pre-pay option for 1-year and 3-year terms. Unfortunately these are only for Linux servers at the moment, but hopefully they will add them for Windows and then I can save money year on year.

CloudWatch Monitoring

Amazon offers a web-based monitoring option, CloudWatch, for its server instances. I’ve started using it (for an additional fee) but I’m not sold on its utility yet. I don’t think I’m using it to its full potential – it is supposed to help you manage server issues by monitoring thresholds.

ec2 Cloud Monitoring

Managing S3 Files Using Cloudberry Explorer

I needed an easy way to create and manage my buckets, CloudFront distributions, and S3 files. I found Cloudberry Explorer, and downloaded the free version of it. I was able to drag and drop 1600 files from my Software Development Kit to the S3 bucket where I’m serving the resources. Super!

There’s a pro version I might purchase which would allow me to set gzip compression and other properties on the files. This would help lower my bandwidth costs and speed up transfers a bit.

Here’s a screenshot of Cloudberry in action:

Cloudberry Amazon S3 Explorer

I love how easy it is to setup and use Amazon’s web services stack. I think they have a great business model for the Cloud, and they’re the company to beat. I’m willing to rely on them for the launch of Clockwork Web framework and so far I haven’t been disappointed.

Sunday, November 01, 2009

Introducing Clockwork Web Framework for .NET

In 2003, I read a book, “Making Space Happen”, by Paula Berinstein. It’s about the efforts of entrepreneurs to open up space to the public. It’s the kind of thing that gets my propeller-head spinning, and after reading it I resolved to create the best website on space travel on the internet.

So, I sat down in a park and within two hours I had covered several sheets of paper with scribbles and scrawls of what my website needed. I had notes on authentication, web components, search boxes, themes, dynamic images, language toggles, and all kinds of stuff.

Being a good little programmer, the more I designed, the more intricate the design became, and pretty soon I was knee-deep in code. Flash forward six years, and I have yet to write a single page of that space website!

But I do have a web framework :)

What It Is

Clockwork makes it easy to build powerful .NET web sites. It’s completely free, open source (under the Apache 2 license) and you can use it in proprietary or open source projects, as you like.

Some of the ways it makes web development easy:

  • Database-agnostic data access
  • Dynamically displays content in different languages
  • Leverages the .NET 3.5 framework, including the Provider Model, generics, LINQ, automatic properties, and more
  • Integrates with popular web services such as those provided by UserVoice, LinkedIn, Google and Yahoo!
  • Makes it really easy to use object-oriented patterns like Dependency Injection / Inversion of Control, Repositories, and Specifications (see the sketch after this list)
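To illustrate that last point, here is the general shape the Repository/Specification combination takes in calling code. This is a purely hypothetical sketch – the type and member names are invented stand-ins, not the framework’s actual API:

// Hypothetical names throughout -- for illustration only.
IRepository<WebPage> repository = container.Resolve<IRepository<WebPage>>(); // resolved via the IoC container
ISpecification<WebPage> recent = new RecentlyModifiedSpecification(7);       // "pages touched in the last 7 days"
IList<WebPage> pages = repository.FindAll(recent);                           // the repository turns the spec into a query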

Under the hood I use many popular components, including NHibernate for database access, Castle Windsor for Dependency Injection, and log4net for logging.

Although today marks the official public release, the framework is currently at version 3.x because I’ve been using earlier versions of it in production websites since 2004.

I’ve built Clockwork using as many web standards as I can find, as many of the latest .NET elements as possible, software best practices, and a lot of love and stubbornness.

What It Will Become

Well, it’s obviously too early to say. But I am committed to continuing to develop it, I have a long list of things I plan to add, and I’m hopeful a community of .NET developers will adopt it and push it into areas I can’t even imagine today.

Please take a minute to visit the website and learn more about it. I hope you find it helpful.

Many thanks,

Nick

Monday, October 19, 2009

Central Administration in SharePoint 2010

Here’s a quick lap around the new Central Administration console in SharePoint 2010.

New Central Administration Layout

Central Administration

The navigation structure is broken down a little more than in 2007. There is no more “Operations” and “Application Management” divide; instead the new console is divided into the following sections:

  • Application Management: Manage site collections, web applications, content databases, and the new service applications
  • System Settings: Manage servers, features, solutions, and farm-wide settings
  • Monitoring: Track, view and report the health and status of your SharePoint farms
  • Backup and Restore: Perform backups and restores
  • Security: Manage settings for users, policy, and global security
  • Upgrade and Migration: Upgrade SharePoint, add licenses, enable Enterprise Features
  • General Application Settings: Anything that doesn’t fit into one of the other sections
  • Configuration Wizards: These are nice wizards to help setup or modify the farm

This new layout is an advantage – the “Operations” and “Application Management” tabs in 2007 always felt a bit arbitrary, and it wasn’t always clear which tasks went where.

Monitoring

This is quite useful – basically you can take the heartbeat of SharePoint and its services via reports, and view problems and solutions. Here’s a screenshot of the interface:

Central Administration - Monitoring

There are only a couple of reports right now, which tell you which pages loaded the slowest, and which users are the most active. I imagine for release there will be many more.

Central Administration - Monitoring - Health Reports

The problem and solution report is very helpful in identifying which services are failing on which servers, and why. Notice in this report there is detailed information about one of the failing services, in this case Visio, and links to remedy it.

Central Administration - Monitoring - Problem Report

Surfacing common errors in this way will go a long way to reducing the IT administrative burden of SharePoint. I hope Microsoft is active in populating this report engine (or provides a way for the community to modify it).

Usage logging settings are in here as well.

Service Applications

Central Administration - Application Management

These new plug-and-play replacements for the Shared Service Provider are major wins for the new SharePoint version. They allow an organization to really customize its farm based on its needs and even usage patterns. Services that need lots of performance and support can get it, while services that are less useful can have reduced resources or even be turned off altogether. Everybody’s SharePoint 2007 farm looked alike, but going forward it is likely that no two farms will be alike.

Of course, to manage this Microsoft has to surface the available services and their settings in the Central Admin. This screenshot gives an indication of just how many services can be used.

Central Administration - Manage Service Applications

Export Sites and Lists

Now you can export site and list data right from SharePoint! It’s straightforward with the new Backup and Restore section, which allows full Farm Backups and Restores along with far more granular backup. The backup can include full security including site users, as well as version history information for each item in the list.

I doubt this will replace the need for 3rd party backup software but it’s another tool for IT Admins.

Here I am backing up a Calendar from a site to file.

Central Administration - Site or List Export
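The same kind of export can also be scripted – presumably this page drives the content deployment object model (Microsoft.SharePoint.Deployment) that has been around since 2007. Here is a rough C# sketch of exporting a Calendar list; the site URL, path, and list name are placeholders:

using Microsoft.SharePoint;
using Microsoft.SharePoint.Deployment;

class ExportCalendar
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/team"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList calendar = web.Lists["Calendar"];

            SPExportSettings settings = new SPExportSettings();
            settings.SiteUrl = web.Url;
            settings.FileLocation = @"C:\Backups";
            settings.BaseFileName = "calendar.cmp";
            settings.IncludeVersions = SPIncludeVersions.All; // keep item version history
            settings.IncludeSecurity = SPIncludeSecurity.All; // keep site users and permissions

            SPExportObject obj = new SPExportObject();
            obj.Id = calendar.ID;
            obj.Type = SPDeploymentObjectType.List;
            obj.IncludeDescendants = SPIncludeDescendants.All;
            settings.ExportObjects.Add(obj);

            new SPExport(settings).Run();
        }
    }
}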

The new service architecture of SharePoint is one of the most exciting things about it, and obviously required a bit of a Central Administration retooling. That provided an opportunity for some other quick wins, including a much more intuitive navigation structure and some neat monitoring tasks. More evidence that SharePoint 2010 is building on, but not replacing, the core strengths of 2007.

When SharePoint 2010 Met Web 2.0

One of the goals in SharePoint 2010 was to make it easier for users to update their information and pages without lots of postbacks, clicking, and delays. Accordingly, Microsoft has invested a lot in improving the web user interface.

One way they have done this is by adding the Office Ribbon concept to SharePoint. I think this has to be a first for a web application, and to be honest while I saw the value in Office 2007, I wasn’t sold on it for a web interface.

I think the major weakness of the Ribbon concept is that you can spend a fair amount of time trying to remember what command belongs to what tab. As well, it doesn’t always save clicks. More on that in a moment.

The other major investment Microsoft made is adding AJAX. This is a no-brainer and a hands-down winner for me. I’ve attached some screenshots to show how you would modify a page in the new UI.

Let’s imagine you want to modify a team site:

Step 1: You are in the Browse tab of the Ribbon (up top) – choose the Edit tab.

New Team Site - Browse Ribbon

New Team Site - Edit Ribbon

To edit, click “Edit”, one of the buttons on the Edit tab. Then click on the area of the page you want, type some text in, and click Stop Editing. Are we saving clicks yet? :)

New Team Site - Edit Page

Well, not so far, but there weren’t any postbacks, so overall I think there’s some time saving here. An important benefit from a training perspective is that the server and Office products now have identical user experiences, which is a big win.

As well, there are some nice new options, including an XHTML converter. And did I mention this all works flawlessly in Firefox? Web standards, hooray!

You can also insert new web parts via the Insert section of the Edit Ribbon:

New Team Site - Insert Web Part

Of course, the context-based Ribbon experience continues when managing lists and libraries. Here’s a screenshot of the out of the box Shared Documents library’s two important ribbons, Documents and Library:

New Team Site - Shared Documents Library - Documents Ribbon

New Team Site - Shared Documents Library - Library Ribbon

Finally, tagging and sharing is a major concept in Web 2.0, and SharePoint 2010 addresses this by surfacing sharing activities through the Ribbon. Content can be easily tagged – tags can be private or public, and are automatically added to a suggested set so that users can share them.

New Team Site - Share and Track Ribbon

New Team Site - My Tags

Tagging is also part of a user’s Activity Stream (not sure what the official term is). You can see on my profile that I tagged an element.

My Profile - Tags and Notes

I’m not showing it here but there is also an Enterprise Metadata service that allows an organization to centrally control its taxonomy. So, now you can make peace between the “folksonomy” and “centralized taxonomy” gangs in your office!

All in all these UI improvements are icing on the SharePoint 2007 cake. I’m not sure they are enough by themselves to encourage SharePoint 2007 customers to upgrade (I think there are better reasons to upgrade), but somebody on 2003 or without SharePoint at all might now take the plunge. Either way, these are welcome additions to an already great product.

Although I’m not convinced the Ribbon will save clicks, and it will certainly take some retraining and familiarization time, it is at least consistent with the Office clients, making for tighter integration. The AJAX-style UI is a big win, and the inclusion of some interesting tagging and sharing features brings SharePoint up to date with the Web 2.0 world.

Things To Get Excited About In SharePoint 2010

Now that Microsoft’s lifted the TAP NDA and is presenting SharePoint 2010 publicly at the SharePoint Conference in Las Vegas, there will be a spurt of queued up blog posts on the net :)

Here are some things I’ve been very excited about, in no particular order. They are fairly developer-centric.

  • Ability to develop against the SharePoint DLLs on a developer desktop! ‘Nuff said.
  • Developer Dashboard – makes it easy to see tracing information and web server details when you are working on a SharePoint site.
  • LINQ to SharePoint – this is some nice syntactic sugar that helps replace CAML a little bit. You can create strongly-typed SharePoint entities using a utility called SPMetal and then query and manipulate the data in them using standard LINQ syntax (see the sketch after this list). I hopefully predicted this in another post.
  • Visual Studio 2010 integration – VS2010 will have a lot more tools to make SP2010 development a snap. SharePoint Project and Item Templates, the Feature Designer, and Project Packaging will hide most of the messy details of creating, packaging, and deploying a SharePoint solution from developers.
  • Business Connectivity Services – the next level of the Business Data Catalogue. BCS uses External Content Types which look a bit like Content Types, and are defined in the new SharePoint Designer or in Visual Studio and then added to SharePoint using a definition file (a bit like the BDC currently works). Users can then create External Lists in their sites, which pull in the data from these external sources.
  • Client Object Model – an abstraction layer that allows developers to write code that will work in client .NET applications, Javascript (for AJAX type operations), and Silverlight. Basically this is a disconnected, batch-style API that will operate on the existing SharePoint web services and handle requests and responses using XML and JSON.
  • SharePoint 2010 Designer – Whereas SPD 2007 was a warmed-over FrontPage, the new version has been rebuilt with a focus purely on SharePoint. The new navigation panel is great because it shows you a list of SharePoint objects, such as Entities, Lists, Master Pages, and Workflows. What’s great about this is it keeps you thinking about what you are trying to do in SharePoint, rather than where a command used to be hidden in SPD. Another big win is you can export your SPD changes as a .WSP file straight into Visual Studio for further customization.
  • The Office Ribbon makes it into SharePoint. The Ribbon kind of grew on me in Office 2007. I think it was a clever paradigm to surface many commands that used to be buried. Now the many SharePoint menus and Site Action dropdowns will coalesce into the Ribbon. I think this will make training and support a little easier. The big weakness of the Ribbon is that you often have to remember which tab the commands belong in. I found that was the case with the new SharePoint Ribbon but after a little while you get used to it, and it becomes faster to modify SharePoint pages.
  • STSADM is dead, long live PowerShell! Leveraging the great new scripting environment is a huge win for SharePoint. The ability to write .NET code to manipulate the command pipeline means we will start to see some very powerful “no-touch” deployment and management options for SharePoint.
  • More events – now you can find out when your web or list was created or deleted. This may sound like a small feature but this enables some provisioning and discovery scenarios that in SP2007 were not even possible!
  • Enterprise Metadata Manager. I’ve blogged a lot about the importance of governance and centralizing metadata. The new Enterprise Metadata Manager makes it easy to import and manage term sets, keywords, and tags.
  • Service Application Architecture – the Shared Service Provider was a good idea but it was a bit hard to use in practice. Under the new architecture, you can create Service Applications for things like Excel Services, Forms Services, Business Connectivity Services, and other services that you build or buy, and you can mix and match these in your farms as you like. The services get consumed by web front ends via a standard interface. This should allow a lot of plug-and-play customization of farms. I’m even wondering if there is an opportunity for vendors here…create some services and expose them to clients from the cloud.
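As promised in the LINQ to SharePoint bullet, here’s a sketch of the flow. SPMetal generates a typed data context from a target site; the TeamSiteDataContext and Task names below are whatever SPMetal would emit for your particular site, so treat them as hypothetical:

// Entities generated first with something like:
//   SPMetal.exe /web:http://server/sites/team /code:TeamSite.cs
using System;
using System.Linq;

class LinqToSharePointDemo
{
    static void Main()
    {
        // The data context class and its list properties come from SPMetal.
        using (var dc = new TeamSiteDataContext("http://server/sites/team"))
        {
            var overdue = from t in dc.Tasks
                          where t.DueDate < DateTime.Now && t.Status != "Completed"
                          select t;

            foreach (var task in overdue)
                Console.WriteLine(task.Title);
        }
    }
}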

There are some other big changes like Claims Based Authentication and Solution sandboxing which are intriguing to me. The Solution sandboxing feature gives me this sneaking suspicion we will one day soon see a Microsoft SharePoint App Store where we can buy, download and run SharePoint solutions in our farms.

Anyway, there’s a lot of exciting new stuff in SharePoint and I think SharePoint development is about to become really fun!

Monday, October 12, 2009

SharePoint: A Product and a Platform

SetFocus just published another of my articles for their Technical Articles section. This one is called “SharePoint: A Product and a Platform”, and discusses the implications of SharePoint as a software platform.

My conclusion is that the platform provides significant benefits – a unified development environment and reduced maintenance, development, support, and training costs – but that it may increase the risk of vendor lock-in.

I’ve written for SetFocus before because I have a long association with them, dating back a decade. I had my Java certification training and first job placement through them. For the past year I’ve been developing and teaching parts of their SharePoint programming classes for the SharePoint Master’s Program (I’m instructing evening classes again starting this Saturday).

You can read more at http://www.setfocus.com/TechnicalArticles/Articles/sharepointproductandplatform.aspx. I hope you enjoy it and welcome your feedback!

P.S. The article is licensed under the Creative Commons Attribution-Share Alike 3.0 Unported License which means you can modify it and share it around!

Wednesday, August 19, 2009

NHibernate Performance Profiling with NHProf

NHibernate

I’ve been using NHibernate a lot recently. It’s object-relational mapping (ORM) software that makes it easy to “map” between SQL database tables and standard C# object models. The goal is to talk to databases by writing code like this:


var query = session.CreateQuery("from WebPage p where p.VirtualPath like :path")
                   .SetString("path", "%pages%"); // bind the :path parameter
IList<WebPage> list = query.List<WebPage>();

Now behind the scenes there’s a relational database somewhere – and transactions, and validation, and syntax parsing, and query analyzing, and all of that standard relational database stuff – but as a programmer I just need to know about my object model: the query returns a list of WebPage objects that I can easily use in my code, update, and delete. Shiny!

NHibernate is a straight port of Hibernate, from the Java world where it originally evolved many years ago. So the concepts behind it have been field-tested in both Java and (now) .NET shops. This makes it a very robust ORM tool. Did I mention it is completely FREE?

While it’s amazing software, it comes with a big learning curve. There isn’t much documentation out there – and most of it is on blogs and wikis. I’ve bought Manning’s NHibernate In Action and that helps a bit. However, there isn’t much information on common performance and configuration traps.

Learning and Analyzing With NHProf

So I was glad to find out about NHProf, a profiling and analyzing tool for NHibernate created by one of NHibernate’s main developers (Oren Eini aka Ayende Rahien). His colleagues are Christopher Bennage and Rob Eisenberg.

Essentially the profiler is a slick-looking Windows Presentation Foundation executable that “records” your application as it writes statistical data to the NHibernate log file, then provides a graphical view of the various things that are going on under the hood.

The interface is well thought out, with only a few tabs and windows, so the information is easy to sort through. Here’s a screenshot of the main interface:

NHProf Main Interface

What I Like About It

Now to the things I really like about this software:

First, you can see the exact SQL query that NHibernate is generating. Straightforward, but critical. There is a related Stack Trace which allows you to jump to the part of your code where you executed this statement.

As well, you can view the rows that are returned by any query. This makes it easy to see exactly what data you are getting back – a much-needed sanity check at times :)

 NHProf View Results

Each NHibernate action is evaluated against known best practices (or bad practices) and you get “Alerts” that can provide more information on what to do (or not do).

For example, while running some recent queries, I received the following alert: “Too many cache calls per session”.

NHProf Alerts - Small 

This leads me to the final element that I LOVE – the “read more” and “NHibernate Guidance” features. Software is so complicated that I just want to get it working most of the time – but I know that if I really understood it, I would avoid a lot of bugs and future issues.

So what makes this software shine for me is the care that has gone into helping people learn NHibernate. By clicking “read more” you go straight to a web page that teaches you about that particular error and ways to avoid it – including code samples!

 NHProf Alerts - Learn More

As well, there is a “Guidance” option that you can always access to learn about general NHibernate performance issues such as “Select N+1” or “Unbounded Result Set”. I’ve already applied the lessons from “Unbounded Result Set” and “Do Not Use Implicit Transactions” to my code and the result is much better performance and stability.

NHProf NHibernate Guidance
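To give a concrete example, the “Do Not Use Implicit Transactions” fix boils down to wrapping every unit of work – even read-only queries – in an explicit transaction. A minimal sketch, assuming an ISessionFactory is already configured:

using (ISession session = sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    IList<WebPage> pages = session
        .CreateQuery("from WebPage p where p.VirtualPath like :path")
        .SetString("path", "%pages%")
        .List<WebPage>();

    tx.Commit(); // commit even read-only work so the transaction isn't left to roll back
}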

One thing I would like is the ability to hover over an alert icon in the main window and see a tooltip of the alert message. At the moment you can see the icon showing the alert, but then you have to click on the statement and then click on the “Alerts” tab at the bottom to see what it’s for.

NHProf is still in final beta but I have been using it for about a month and have found it to be very stable. I just bought my copy – there is a discount right now before it hits RTM and I think it has already been worth the money.

I would recommend this to anybody using NHibernate.

Thursday, July 16, 2009

Data Splunking

I’ve had my head down for the last couple of months, churning out code for that elusive framework I keep hinting at :) Right now I’m staying in a trailer with no TV, internet, or cell phone coverage, and I’ve never been more productive (says I).

Still, I thought I would pop up briefly to mention a cool IT tool that gives you a centralized, browser-based repository to search the millions of log files, event logs, and databases that are inevitably scattered around any company’s data centres.

It’s called splunk. Its name is clever – users get to spelunk into their data silos and see what’s there. It’s a simple, single-package install that runs on most desktop machines and servers. There’s a free version if you index less than 500 MB of data a day, and enterprises can pay to index larger corpuses. I’m running the free version on my Vista 64-bit box and it indexes and searches like a little champ.

In my case I’ve been using it on my framework log files to help analyze bugs and performance bottlenecks. Here’s a screenshot of a search on the keyword “nhibernate” (NHibernate is object-relational mapping software):

Splunk Log Files

As you can see, it quickly pops up all the logged events where NHibernate was called from my classes.

To get this to work, all I had to do was add an “Input” for splunk to index – in this case the full path to my log file folder.
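In splunk terms that’s a single “monitor” stanza in its inputs.conf – something like the following, where the folder path and sourcetype are placeholders for my setup:

[monitor://C:\ClockworkWF\Logs]
disabled = false
sourcetype = log4net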

As you would expect, it does lots of reporting. It has broken my log files down into various columns: custom C# properties I search on; the standard log file “stuff” such as the source name, date created, and file size; even the SQL commands that NHibernate generates for me. I can filter these columns for even more detailed breakdowns. In the next screenshot I am reporting on Entity ID values I use to track my framework objects.

Splunk Log Files - Report

I like splunk because it’s a one-stop shop for me to analyze all my various bits of IT Operations information. There’s a slick AJAX web user interface, and so far performance seems fine for me on my little dev laptop. I find it solid, intuitive, and I don’t have to expend much effort to install, manage, or learn it.

There’s also a way to extend splunk using its custom Application Programming Interface. I plan to investigate that when I have some free time but have not had a look yet.

I think any IT company should give splunk a test run.

Tuesday, May 19, 2009

WCF WTF

Today I was working on a WCF application integration layer for my framework. For some bizarre reason, no matter how many times I updated the service contract and rebuilt my client service references, I never got the latest version.

Specifically, I was trying to call the following method:

public void CreatePages(IList<WebPageDTO> pagesToCreate)
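For context, the contract looked roughly like this. This is a reconstruction – apart from WebPageDTO and CreatePages, which come from the post, the member and service names are assumptions:

using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class WebPageDTO
{
    [DataMember]
    public string VirtualPath { get; set; } // assumed member

    [DataMember]
    public string Title { get; set; }       // assumed member
}

[ServiceContract]
public interface IWebPageService            // assumed service name
{
    [OperationContract]
    void CreatePages(IList<WebPageDTO> pagesToCreate);
}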

The WebPageDTO was declared (temporarily) in the WCF project. I put DataContract and DataMember on it, the project built fine, and I could even add a reference to it from a client application. However, I could never update the reference from my clients, and when trying to view the service directly I would get this error:

The type '[namespace].[servicename]', provided as the Service attribute value in the ServiceHost directive could not be found.

Finally, I had to modify the method signature to RETURN WebPageDTO – which I didn’t want to do. However, that fixed the service reference and I could update it again. I have since switched the method stub back to return a custom ServiceResponse object, and that also works.

Bit of a head scratcher. I don’t fully understand why this is happening. Anyone know?

Saturday, May 09, 2009

Trawling for Business Offshore

On the High Seas, large ships drag vast nets along the ocean bottom, hoping to catch schools of helpless Fish Sticks. The IT equivalent are Offshoring Companies who cold-call businesses, trying to dredge up remote contract work.

Times must be tough for them - not a week has gone by without one of these firms ringing me out of the blue.

They always seem to follow the same screwy script. Somebody with a ludicrous title such as “Business Development Manager for Premier International Clients” introduces themselves as “Steve”, “Mike” or “Jane”. They’ve usually got thick Indian or Eastern European accents – perhaps their names really are Mike, Jane, or Steve, but I’m guessing they think I’m somehow more likely to talk to them if I think they’re exactly like me? That’s slightly insulting ;)

The conversation normally goes like this:

“Steve”: Hi, is this Nicholas?

Nick: Yes, speaking.

“Steve”: Hi, I am Steve from [Name withheld to protect the guilty]. I had a detailed look at your website and your focused approach towards Microsoft Technology. This focus presents tremendous opportunity between our companies to establish a mutual association towards IT outsourcing. [Name Withheld] is a mature, 200 person strong organization based out of [Somewhere]. We provide deep software engineering skills to the global software industry. We have a satisfied client base of more than 70+ companies spread across US and Europe. Our service areas include bespoke software development and testing services. We have a proficient team of IT professionals working in .Net (C#, ASP.Net, VB.Net, VB) and SharePoint technology. We are working on few of the latest technologies like AJAX, JSF and Web Services.

Nick: Err, we’re not planning…

“Steve”: We also have expertise in XML, SilverLight, IIS 5, 6, and 7, Windows Start button, SQL Server Database, Oracle Database, Access Database, MySql Database, Flat Files…

Nick: Please hold on, we’re not looking to outsource…

“Steve”: …in addition to client software experience customizing Word, Excel, PowerPoint, Clippy, Microsoft Bob…

(10 minutes later)

“Steve”: …As well as Binary Load Lifters, baking, and cattle rustling.

Nick: Umm, I was trying to say that we are not planning on outsourcing our cattle rustling. Thank you and goodbye.

“Steve”: Thank you - I look forward to speaking to you again very shortly, Jonathan.

Nick: (Dial tone)

Seriously, is this how we’re supposed to do business? Two companies that obviously don’t know the first thing about each other are expected to work across time zones and culture barriers to create complicated software on time and under budget? Riiiiiiiiiiiight.

I’m not against outsourcing firms, but can you please approach me in a professional manner that indicates you are actually interested in working with me?

Actually, the first outsourcing firm that called me has a reputation as scam artists. I even found a website run by outsourcing employees who named it the worst company to work for! This company wouldn’t take no for an answer and months later was still calling me up and trying to get me to start a project with them (asking for 25% of the money up front, which was apparently their scam). So these unsolicited calls can be funny, but annoying and scary at the same time.

Still, the fact they’re doing this means that – like Viagra spammers – they must somehow be getting some bites. So if you can’t beat ‘em, join ‘em. I’m going to start my own cold-calling to drive business partnerships:

“Nick”: Hi, is this Jeff Bezos?

Jeff Bezos: Speaking.

“Nick”: Hi, I had a detailed look at your website Amazon.com while I was buying DVDs on it, and really liked your focused approach towards e-Business. This focus presents tremendous opportunity between our companies to establish a mutual association towards IT outsourcing. We have a proficient team of IT professionals working in .Net (C#, ASP.Net, VB.Net, VB)  and SharePoint technology

Jeff Bezos: (Dial tone)

“Nick”: Thank you for your time, Bill.

“Nick”: Hello, Larry Ellison? I had a detailed look at your website MySql.com …

Larry Ellison: (Dial Tone)

So, anybody else getting caught in the net?

Monday, May 04, 2009

Sharing Knowledge With Creative Commons

Over time I’ve written lots of user guides, installation documents, governance papers, roadmaps for various companies…and I can’t use any of them ever again.

This is because the standard consulting or employment contract contains lots of legal text stating that the employer owns all the copyright to the employee’s work. This naturally includes software documentation. This seems like a good idea on the surface – the company pays for the time I take to write the document, so it’s their property.

However, there are hidden downsides.

To begin with, software project requirements tend to be fairly standard. Most projects require a project plan, a specification document of some kind, server topology diagrams, installation scripts, standards and best practices recommendations, help and support docs.

The details of these may vary a bit, but the definitions, explanations, and recommendations usually won’t. Even the details may be similar – one 3-server SharePoint farm will have to address the same issues any other 3-server farm will. This could be boilerplate text.

Reinventing the Wheel

If companies claim ownership of all software documentation IP – which is their default stance – they may not realize that they are paying their employees or consultants to reinvent the wheel. If they claim ownership of everything they generate, no one else can benefit. And no one has an incentive to share.

It might be better for a company to leverage documentation created somewhere else, and modify it to suit their own needs – then they are saving time and money.

As a consultant, it’s been suggested to me that if I bill by the hour I shouldn’t mind writing the same stuff over again. But I find that boring – having figured out a way to describe something once, why would I want to come up with new words to say the same thing again and again?

Also – I would rather spend my budgeted documentation time improving existing content. If I’m spending my allotted time rewriting from scratch, that won’t happen.

Finally, we programmers are lazy – in a good way! We’d rather move on to new, interesting problems than rehash the old ones again and again. So, proprietary IP ownership goes against most programmers’ core mindset.

I’m not saying a company should never claim software documentation IP – there can be valid business reasons for this.

If the company wants to own the IP because it is a business differentiator, and they will maintain and improve the documentation, that is a compelling reason for them to want to own, and not share, their IP.

Also, if the documentation contains lots of very sensitive information, or is very unique, that could be another good reason to hold on to the IP tightly.

I just think they should examine the alternatives first.

Why Bother Picking a Sharing License?

Why bother picking a license in the first place – why not just post the information somewhere and not worry about it?

Often information shows up on people’s blogs, in user groups, or in online tutorials. In most cases the authors want people to be able to use the information, but they don’t explicitly specify a license. They probably assume that other people can take the code or text and use it as they like.

However, most organizations are (rightly) nervous about copyright issues. If they can’t tell what license the information is released under, then rather than spend time requesting the rights to the content, and tracking the request, they may avoid using it at all. So, specifying the license makes the information easy to adopt, and provides legal protection for companies using it.

A Business Case for Creative Commons

Wikipedia is probably the best example of the power of sharing information on the web. It currently uses the GNU Free Documentation License, but will use Creative Commons Attribution-ShareAlike in the near future. Jimmy Wales, Wikipedia’s creator, says he would have used Creative Commons if it had existed at the time he founded Wikipedia.

There are other license types that allow sharing. Public Domain seems to allow freedom to do anything you like, but my understanding is the concept of Public Domain doesn’t exist in all countries, so it may be a legal impediment for companies wanting to share work. Creative Commons makes an effort to “port” the license terms to different jurisdictions so it is more likely to be valid elsewhere.

I’m primarily using Creative Commons Attribution-ShareAlike licensing because it fits my business model:

1) It’s transparent. People can easily understand what it means, because it’s in plain English and also in Legalese. Even the name is self-explanatory: “Share Alike” and “Attribution” make it clear what somebody using it can do, and how. Having spent the last three months examining various software and documentation licenses, I’ve come to the conclusion that clarity is one of the most important qualities of a license. Also, I want my three months back!

2) It encourages sharing, even for commercial use. Information Technology is so complicated because we combine many different systems and technologies together and expect them all to work. No wonder software breaks all the time. If we don’t share, there are a lot of things we’ll never figure out on our own. Any licensing that encourages the spread of knowledge is a net gain for everyone.

3) It requires attribution. Attribution is a tiny barrier to adoption, because people are used to adding copyright notices to text they reference. I like the idea that if I do good work, people will find out about it. I consider it viral advertising.

4) It’s low maintenance. I don’t have to get a law degree to figure out the licensing. I would rather copyright law stayed the heck out of my way while I focused on my work. CC-SA allows that.

5) Sharing is good karma :)

Free Online Content on our Extranet

So, is 100% of my content shared under this license? No, and it never will be, because licensing is a business decision that always depends on the specific context. However, my default stance is to share what I write.

The content on this blog is now licensed under CC-SA.

As part of my contracts now I usually negotiate to own the IP I generate, with the promise of licensing it under CC-SA so everybody can benefit. This is working well so far.

To put my money where my mouth is, I’ve started to publish some content on our extranet at https://extranet.griffonsolutions.com/clients/allclients. You can log in with the username “guestuser” and the password “guestus3r”. There is a shared document library with the documents grouped by license type.

I welcome feedback on the documents – what works or doesn’t work, what can be improved and how. I’ll keep the docs up-to-date. Please email feedback to me at nick@griffonsolutions.com.

Feel free to browse, mix, and share!

Monday, April 13, 2009

Griffon Solutions - Startup Diary

About four years ago my partner Marie-Claude and I started developing some custom ASP.NET websites for small businesses.

The company name - Griffon Solutions - comes from a lovely little village in the Gaspesie region of Eastern Quebec called l'Anse-au-Griffon.

down to gaspe 270

What began as a part-time effort on evenings and weekends slowly started to get a little more organized over time. Two years ago, before we moved to Australia, we registered the company in Quebec.

Our reason for coming back from the Lucky Country last year was to focus more on this company, with the eventual goal of having a fun and challenging full-time web solutions business (software and consulting services) that we could run from anywhere.

We'd love to have an "open source business" - not just by writing Open Source software but by transparently sharing what we're up to, and how we're going about it, and learning as much as possible from the web community as we go.

Website Redesign

We’ve just launched a redesign of our website, www.griffonsolutions.com. We had a couple of pages online before, but the site was put together very, very quickly as a placeholder, and needed a lot more love. In fact I’ve always been reluctant to publicize it :) As the old saying goes, “the cobbler’s children run barefoot”, and we never made the time to fix up the site, until now.

As a result of this SharePoint blog, I was contacted by Mario Hernandez, at Designs Drive. Mario has been looking into SharePoint and wanted to know a bit more about the LDAP integration I’ve written about. After a few email conversations he was kind enough to volunteer some of his time to help us redesign our website.

Marie-Claude and I picked a graphic design template we were happy with, and then Mario worked hard to make sure it was transformed into valid XHTML and CSS. We all felt it was important to adhere to web standards and avoid table layouts if at all possible. We’re very happy with the end result.

Thanks so much, Mario, for contributing your time and your knowledge to us!

I think this is an interesting trend, especially in tough economic times…Here we have a couple of guys working in Los Angeles and Quebec who have never met, trading their IT skills and time to each other online. I know we’re not the first ever to do this, but it shows how small the world is, and how many opportunities there are for people to collaborate and work together.

The Framework

Incidentally, our new website is built on a custom .NET web application framework I’ve been working on constantly for the last 5 years.

The framework has been very much a labour of love. Sometimes it has been very frustrating and dispiriting, while at (most) other times the work has been fascinating, challenging, and deeply educational.

I’m finishing up the architecture on it and aiming to release it this Autumn as supported Open Source (probably Apache 2). This means that other people can use it (even in proprietary software).

In plain English, the platform has two goals:

  1. Provide a foundation of Enterprise-level capabilities to any .NET application.
  2. Integrate with popular software, databases, and web services in a simple, stable, secure, and flexible way 

It is standards-based and fully multilingual right out of the box. It uses NHibernate so it can support most databases without any modification.

From a technical perspective I’m still nailing down the shipping features, but currently it includes all of the following:

  • C# 3.0 / .NET 3.5 Framework
  • Generic business Entities to model common web software concepts such as users, websites, documents, and web pages. These are implemented via interfaces so you should be able to integrate them with your existing code without much modification
  • Common metadata and provider information for all Entities, such as Creator, Date Last Modified, and Data Source
  • Basic LINQ querying for all Entities
  • Entities can be exposed via a variety of formats including JSON and XML, or through web services (such as REST and SOAP) or web feeds (RSS and Atom)
  • Full multilingualism down to the level of an individual piece of data
  • N-tier codebase, using object oriented best practices
  • NHibernate database-agnostic data storage
  • log4net for robust logging
  • Application Integration layer to make it easier to consume and provide information from a wide variety of services, software, and other sources (such as RSS and Atom feeds)
  • OAuth authentication for authenticating to web services such as YouTube or Google
  • Some strongly typed web service managers and web controls to make it easy to use popular services like LinkedIn, Yahoo FireEagle, and Google
  • Strongly-typed, standardized file access to a variety of file storage sources including Amazon S3, HTTP web servers, FTP servers, and Windows file systems (see the sketch after this list)
  • Uses the .NET Provider Model (especially for Roles and Membership)
  • Presentation layer with prebuilt base controls, pages, and master pages, as well as some server controls
  • Includes basic website project with Robots.txt, Master Pages, sitemap, XRDS file, and Admin area to make it easy to start up and manage a new website
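To illustrate the strongly-typed file access item flagged above, here is the general shape such an abstraction takes. The names are invented for this sketch rather than taken from the framework’s real API:

// Hypothetical illustration -- IFileStore / S3FileStore are stand-in names.
IFileStore store = new S3FileStore("my-bucket", accessKey, secretKey);
IFile file = store.GetFile("images/logo.png");
byte[] contents = file.ReadAllBytes();
// Swapping in an FtpFileStore or WindowsFileStore would leave the calling code unchanged.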

I’ve been using various versions of this framework in production since 2003 / 2004 so it is tested and stable (my current internal release number is 2.7), but it’s nowhere near where I eventually hope it can be, which is why I’m hoping to build a thriving community around it.

Right now I’m in talks with a group that wants to evaluate using it to integrate their software with a popular web messaging service. If you too would like to evaluate the framework around the Q3/Q4 period, drop me an email (address below) and we can have a chat.

I’ll provide more updates on this framework during the summer, as it nears RTM.

The Future

Obviously there’s a lot going on. In the future, this blog will speak not just about SharePoint and related technologies, but about business and technology issues in general. I hope to learn as much as I share, and I hope above all that the blog remains interesting and that you enjoy reading it :)

Cheers,

Nick

P.S. You can always email me at nick@griffonsolutions.com

Wednesday, April 01, 2009

Ottawa SharePoint User Group – PerformancePoint

Yesterday’s Ottawa SharePoint User Group was a demonstration of Microsoft PerformancePoint, given by Microsoft Canada’s Olivier van Brandeghem. PerformancePoint is a Business Intelligence product that was built on top of SharePoint (MOSS Enterprise only).

Olivier began by explaining Microsoft’s strategy of making Business Intelligence available across the organization. He pointed out that the people who tend to see Key Performance Indicators and Dashboards are the people who are least likely to act on them directly – so it is helpful to make these sorts of dashboards available as widely as possible. He argued that this form of Business Intelligence is collaborative or “democratized”.

In order to allow this, the technical complexities (of installing the BI product, managing it, producing OLAP cubes and other data sources, and making and deploying dashboards and reports) have to be reduced. This is a key goal of PerformancePoint, and Olivier therefore focused his demo on showing how easy it was to use.

As I mentioned, PerformancePoint was previously a standalone product. As of today, April 1 (International Conficker Day!), it can no longer be purchased separately – it is part of the SharePoint Enterprise Client Access License which means if you own Enterprise MOSS, you get PerformancePoint.

This is a huge win for clients who love the idea of dashboarding and BI but can’t afford even more software licensing on top of their SharePoint fees. It also fits well into the Enterprise SharePoint space, which provides basic KPIs, Excel Services, Forms Services, and the underappreciated Business Data Catalogue.

Additionally, Reporting Services can be bundled with SharePoint “natively”, so the Enterprise product fit is very good. PerformancePoint is also part of recent Microsoft moves from licensing by servers to licensing by services. This is due to the Software+Services initiative I blogged about here.

So what exactly does PerformancePoint give you? Here’s a quick list:

  • Scorecards
  • Analytics
  • Maps of business data
  • Data Linked Images
  • Search
  • Advanced Filters
  • Predictive KPIs (“You will come into some money”)
  • Planning Data

One nice feature is the Central KPI management, where you can set the KPIs in one place and share them all over a portal.

Olivier also demonstrated how Visio diagrams can be connected to KPIs. The demonstration he showed was of a hospital, which was actually a very intuitive way of showing all these capabilities. The Visio diagram for instance was a map of hospital rooms showing infection rate, patient turnover, and other metrics, and the various rooms of the hospital turned red or yellow or green depending on the KPI result.

It seems easy for end users to create their own reports, using various templates. Olivier mentioned the use of MasterPages so there can be a level of consistency in the branding (Reporting Services, are you listening?).

Strategy Map scorecards are available. These are dynamic combinations of KPIs – almost like a workflow or flow chart – that give a more realistic picture of key business metrics. As an example, if you have some red KPIs at the beginning of a business process, your whole process might be flagged red or yellow; but if everything is alright except for a few optional business metrics that are red, your strategy map may still be Go Go Green.

PerformancePoint ships with some built-in web parts that allow ad-hoc KPI manipulation. Generally they provide Master-Detail views and some charting or rendering components such as pivot tables. Each allows export to Excel as you would expect, where you can drill down into even more detail or take the data offline. For more information on the native SharePoint Excel / BI offerings, check out this blog post from the Sydney User Group.

The Advanced Analytics tool called ProClarity also ships with PerformancePoint. Microsoft purchased these guys in 2006 with the goal of beefing up their Business Intelligence offering. ProClarity gives you open access to the OLAP cube to manipulate and report on data. Although the tool is separate in this version, in the next version it will be tightly integrated into the rest of the toolset.

PerformancePoint supports a variety of data sources, including relational databases, but the obvious source is an OLAP cube. In response to a question from the audience, Olivier stressed that the goal is not to require SQL Server Analysis Services but to support any OLAP cube provider. Microsoft understands that companies that have made big investments in some other BI vendor, such as Cognos, won’t be willing to shift all their BI bits into another vendor simply to get dashboarding. So their goal is simply to help surface the existing data into SharePoint.

Olivier also mentioned that Enterprise Project Management, the latest version of Microsoft Project Server, now uses OLAP cubes to help report on project metrics. Anybody using both PerformancePoint and EPM should therefore need only a bit of time to put those metrics to use.

One thing that startled me a bit was the licensing discussion. Olivier mentioned that you can mix Standard and Enterprise User CALs. To be honest, the guidance has tended to vary depending on whom you talk to at Microsoft. Sometimes Microsoft representatives say no, everybody in the organization has to use Enterprise CALs if anybody does; other times the answer is yes, you can mix them as long as they are tracked somehow. An official FAQ seems to imply the latter. In any case, with dedicated Site Collections it’s pretty easy to lock down functionality to a select few, so this is achievable in SharePoint.

The tight integration of PerformancePoint with SharePoint is part of a growing trend I mentioned a couple of years ago. More and more products will end up on top of, or talking to, the SharePoint stack. This is the whole point of having a platform. We can expect much more evidence of this in the next release.

Sunday, March 29, 2009

The Code Factory - A Hangout for Ottawa Startups

I’ve started renting shared workspace at the Code Factory, a “collaborative co-working space” at 246 Queen Street in downtown Ottawa. I’ve purchased about 40 hours of time so far. I can use meeting rooms, internet, coffee machines, lending libraries, and even some entertainment in the form of a foosball table and a Wii.


The owner, Ian Graham, calls himself a “Management Consultant and Entrepreneurial Catalyst”. He’s very enthusiastic about the startup scene in Ottawa, and definitely wants to help by acting as an incubator and organizer for it.

It’s a neat place to hang out and meet entrepreneurs. The vibe is at once informal and exciting, and every time I’ve been there I’ve come back amped up. Although I mostly work out of a home office, it’s worth renting a bit of space because I like the chance to meet people, and I’m learning a lot about what’s going on in Ottawa, especially in the startup community.

It’s great to see such a place in Ottawa, a city with tons of IT knowledge built up over many decades. The Ottawa high-tech scene has some very deep roots, thanks to the presence of government research and development agencies, a highly educated workforce, a strong telecommunications sector, and (more recently) major hardware and software firms such as Nortel, Cognos, and Corel, which generated a lot of spinoffs. To prove the point, there’s a neat family tree on the wall showing the genesis of Ottawa-Gatineau startups and their sources (government, research, telecom, and so on).



Thursday, March 26, 2009

LDAP Authentication – More Tips

After posting about why LDAP authentication for intranets is a bad idea, I received some more emails with tips I thought I would share.

Although most people commented that they would try to take the approach of synchronizing their e-Directory identity store to a slave Active Directory, a few were in the unfortunate position of having to implement LDAP anyway.

So if you fall into that camp, here are a few more approaches that might help you. As always your mileage might vary:

Wen He posted some great advice on modifying the People Picker for LDAP Membership providers. This is a pretty common problem – even if the profiles are importing successfully, the people picker may not show them. So here’s the quick code to do this:

You will need to add a key for the LDAP membership provider (“LDAPMember”) to the <PeoplePickerWildcards> section of the web.config for both the web application and Central Administration, as shown:

    <PeoplePickerWildcards>
      <clear />
      <add key="AspNetSqlMembershipProvider" value="%" />
      <add key="LDAPMember" value="*" />
    </PeoplePickerWildcards>

The key="LDAPMember" entry, with its explicit wildcard value="*", enables the People Picker to search for people and groups by enumerating users from the LDAP directory. If you don’t add this line, the People Picker will only look for an exact match on the user ID.

Wen He goes on to describe user and group filtering against LDAP in SharePoint. It’s a great post, highly descriptive and comprehensive.

I also received an email from Krzysztof Wolski – after setting up LDAP and trying to log in, he was running into an “Unknown Error” (one of my all-time favourite exception messages!):

We have to use LDAP authentication because this is an official requirement of our client. I've solved the problem with "Unknown error" - I've changed the default user for the application pool and set the identity information in the web.config of the port 81 web application to:

    <identity impersonate="true" userName="WIN2003\#USER#" password="#PASS#" />

We've used the same user for the application pool and impersonation.

Krzysztof also mentions one other potential gotcha that occurs if your e-Directory store is set to not allow anonymous authentication. He solved it this way:

We've successfully installed SharePoint with LDAP authentication. One more thing we've added to the LdapMembershipProvider in web.config:

    <membership>
      <providers>
        <add name="LdapMembership"
             type="Microsoft.Office.Server.Security.LDAPMembershipProvider,
                   Microsoft.Office.Server, Version=12.0.0.0, Culture=neutral,
                   PublicKeyToken=71E9BCE111E9429C"
             server="#IP#" port="389" useSSL="false"
             useDNAttribute="false" userDNAttribute="cn" userNameAttribute="cn"
             userContainer="o=MyCompany" userObjectClass="user"
             userFilter="(&amp;(ObjectClass=inetOrgPerson))" scope="Subtree"
             otherRequiredUserAttributes="sn,givenName,cn"
             connectionUsername="#USERNAME#"
             connectionPassword="#PASSWORD#" />
      </providers>
    </membership>

Our eDirectory was configured to not allow anonymous access.

Thanks Krzysztof for the great tips!

I hope this information helps those of you doing LDAP integration – but even more I hope you don’t have to do this in the first place :)

Wednesday, February 25, 2009

I'm All Twitterpated

I'm late to the party, but based on the advice of many of the folks at the SharePoint Best Practices Conference I've signed up to Twitter - and boy is it fun!

It took me a while to understand it, because it's a variety of different things rolled into one. It's a micro-blogging platform, meaning it's easy to write little posts or "tweets" in 140 characters or less. It's a subscription service like an RSS feed, where you can listen to people talking about a particular topic. It's also a great way to get support - companies often search on twitter now and respond to questions and support issues.

Case in point: I'm still researching hosted Exchange email, so yesterday I started the usual Google search to find reviews etc...but then had the happy thought of asking Twitter people about it. By asking a question and putting the keyword #exchange in it, my question appeared to anyone following that keyword. Soon suggestions and advice were flowing in.

As well, companies that host Exchange started contacting me and making suggestions. So it's a kind of active-passive channel for getting quick information from people in the know.

Then there's the fun side of it - following people in all walks of life, listening to them talk about their favourite tv shows, trips, obsessions - there's a lot of weird and wonderful stuff out there. It's a real slice of life!

Right now I'm following a variety of SharePoint folks but also people interested in things like Exchange, NHibernate, and programming best practices.

I'm also following some friends and colleagues, and even the great British actor Stephen Fry, who is one of the most popular twitterers in the world with over 100,000 followers (people who subscribe to his tweets). His latest tweets describe him whale-watching, drinking Tequila, and riding a mule up a mountain for some movie he's filming (not at the same time).

To get started, all you have to do is create a profile on twitter.com. To make it easier to twitter I am using a desktop client. This helps with searching, following people, and tracking through the long lists of tweets. The one I chose is Twhirl which is an Adobe AIR application that sits in my task pane window and alerts me the moment a new tweet comes in.

A good thing to do is check out Twitter tips and tricks on the web. ReadWriteWeb has a lot of great information on twitter usage, applications, and tips - here's a link to their twitter articles. Also Joel Oleson has some SharePoint + Twitter tips on his blog.

You can follow me on twitter at @NickKellett or at www.twitter.com/NickKellett.

Friday, February 20, 2009

Using LDAP Authentication With A SharePoint Intranet Is A Very Bad Idea

A couple of years ago I wrote an article explaining step-by-step how to integrate Novell e-Directory with SharePoint. At the time it was pretty much the only available information on the web. Since then I have frequently been asked for tips on integrating LDAP with SharePoint intranets, most recently last week. So I thought I would provide some updated advice:

Run away! Run far, far away.

  1. Get one of your colleagues to distract your boss.
  2. Climb out of your office window.
  3. Head for the nearest bus, train station, or airport.
  4. Change your name.

If somehow your boss or client finds you and demands that you integrate LDAP with the SharePoint intranet, explain why it's a very bad idea.

Why is it a Very Bad Idea?


From a technical perspective LDAP integration is really just Forms Based Authentication (FBA) - you are passing in a username and password to SharePoint, and these happen to be authenticated via LDAP calls to an identity store somewhere.
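
To see what that means in practice, here is a minimal sketch of the simple LDAP bind that happens behind an FBA login. It's illustrative only - the real LdapMembershipProvider does considerably more - and the directory path and credentials are placeholders:

    // A sketch only -- path and credentials are placeholders, and the real
    // LdapMembershipProvider does considerably more than this.
    using System;
    using System.DirectoryServices;
    using System.Runtime.InteropServices;

    class LdapBindCheck
    {
        // Returns true if the directory accepts the credentials (a simple bind).
        static bool ValidateUser(string ldapPath, string userDn, string password)
        {
            try
            {
                using (var entry = new DirectoryEntry(ldapPath, userDn, password,
                                                      AuthenticationTypes.None))
                {
                    // Reading NativeObject forces the bind; it throws on bad credentials.
                    object native = entry.NativeObject;
                    return native != null;
                }
            }
            catch (COMException)
            {
                return false;
            }
        }

        static void Main()
        {
            bool ok = ValidateUser("LDAP://ldap.example.com:389/o=MyCompany",
                                   "cn=jsmith,o=MyCompany", "secret");
            Console.WriteLine(ok ? "Authenticated" : "Rejected");
        }
    }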

Using LDAP, logging into your SharePoint portal will look like this:

[Screenshot: LDAP sign-in page on an additional zone]

Doing this is a big mistake!

Reason #1: Time Spent Tinkering

Now in order to do this, there are a variety of technical steps you need to take. If you run into problems anywhere along the way, you will spend your valuable time trying to figure out if the problem is in your Role and Membership provider settings, in your various web.configs, your LDAP query, or something else you have to enable in SharePoint.

This means you are spending your effort (and your client or employer's money) struggling to implement something that with Active Directory would "just work".

Some might argue that this isn't a great reason - after all plenty of time and effort goes into modifying SharePoint to accomplish other requirements. But any effort you expend making SharePoint work without AD is time you could be spending modifying SharePoint to address problems the business actually cares about. The business does not care that its credentials are currently stored in Novell eDirectory and SharePoint prefers them in AD.

Reason #2: Your SharePoint Intranet Won't Work The Way You Expect

Portal users expect seamless integration and functionality when they are using SharePoint for the intranet, because that's what all the marketing materials teach them to expect.

Building an intranet without Active Directory can lead to some surprising and annoying side effects. Out of the box web parts or controls like the Organizational Hierarchy don't work very well without Active Directory. You can make them work with ADAM or by exploring 3rd party replacements, but you'll have to test any portal functionality you think you are likely to use.

Also, the user experience with Microsoft Office integration can become a problem. Open up Word or Excel when logged in using LDAP credentials and you might see this:

[Screenshot: LDAP Word integration problem]

It's fun explaining that to an end user!

SharePoint Designer is also tricky - it wants to automatically authenticate you using windows credentials, and throws errors when it runs into forms based authentication.

I'm not claiming that these issues are insurmountable, but collectively they introduce new bugs, development, testing, and management issues, increase cost and risk, and potentially annoy your users. Is it really worth it?

How Is that Different from Internet or Extranet Environments?

Many extranet and internet environments use Forms Based Authentication without problems. Administrators or installers have to modify SharePoint to work in these zones using Role and Membership providers. So why is it OK for those environments to use FBA, but not the intranet?

Again, it's a question of user expectations. When a user logs on to an extranet or internet site, they don't expect seamless Office integration and automatic access to file shares using their desktop credentials. When they use SharePoint intranets, they do expect these conveniences.

All the literature and sales material tells them that they should be able to interact directly with SharePoint intranets via Outlook, Word, and Excel - without annoying popup boxes and workarounds. They expect to be able to use all of the out of the box controls and web parts like the Organizational Hierarchy, and don't want to hear excuses like "that works best with Active Directory so we can't use that now".

So What Should You Do?

I recommend synchronizing your e-Directory identity information into an Active Directory domain and building SharePoint on top of that.

There are a variety of ways to do this, but one option to investigate is Novell's IDM, which can synchronize between e-Directory and Active Directory. Your e-Directory is still the master identity store, but any e-Directory changes get automatically synced to the child Active Directory. Here's a link that might help: http://www.novell.com/coolsolutions/appnote/18349.html

You can definitely make SharePoint use multiple identity providers - but the reality is your SharePoint portal becomes much more flaky and expensive to manage. These days there are many ways to populate Active Directory from some other identity store - I always recommend that since it is less effort and risk down the road, and (probably) less cost as well.

SharePoint is complicated enough without adding to the challenges.

Of course your mileage might vary. If you have attempted (or succeeded) in integrating a SharePoint intranet with LDAP what were your experiences? Besides IDM, are there other good ways to synchronize Active Directory with a master identity store?

Tuesday, February 17, 2009

Above the Clouds: A Berkeley View of Cloud Computing

The Electrical Engineering and Computer Sciences department at the University of California at Berkeley just put out a research paper on Cloud Computing as they see it.

The paper is an in-depth exploration of what some consider to be just another buzzword, Cloud Computing. Since nobody has agreed on what exactly it means, the implication is that it's just a marketing term.

I remember when web services started to appear, around 2000/2001 if I recall correctly. The descriptions and possibilities seemed great, but nobody really knew what to do with them or why. So there came a time when nobody talked about web services anymore and it looked like that particular bubble had burst.

In fact, behind the scenes, a host of companies and individuals were figuring web services out, building their own, and releasing them. A couple of years after the term started popping up, web services arrived for real and now we have mashups and SaaS and Software + Services and some really well-traveled XML fragments zipping around the globe.

The same thing seems to be going on with Cloud Computing. We're in the early days, and still hearing the "Moon on a stick" promises that Cloud Computing is a silver bullet for everything.

This white paper is one of the first I've seen that really quantifies the (potential) cost savings of Cloud Computing.

Some gems:

  • Explanations on Cloud Computing and how it differs from previous attempts;
  • Classes of Utility Computing on page 10, comparing Google AppEngine, Amazon Web Services, and Microsoft's beta Azure platform;
  • Cloud Computing economic models, on page 12;
  • A discussion of the Top 10 challenges - and potential solutions to them - on page 16;
  • The observation that FedExing your data is a good way to cut down on your bandwidth costs and delays.

This is very impressive work. The full paper is here:

http://www.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-28.pdf

Monday, February 09, 2009

SharePoint Best Practices - Another Great Year


We enjoyed another great year of Mindsharp's SharePoint Best Practices Conference in San Diego. Thanks to Mark Elgersma, Ben Curry, and Bill English as the chief organizers, although I know there are lots of other Mindsharp folks who worked hard to pull this off.

UPDATE: Just heard back from Ben - the primary conference organizers were Bill English, Paul Stork, Paul Schaeflein, Todd Bleeker, Steve Buchannan, Pamela James, Brian Alderman, Ben Curry, and Mark Elgersma. Thanks again guys!

Given the grim economy, it was remarkable how many people showed up - over 350 attendees, I believe, in addition to all the vendors and organizers.

I attended with echoTechnology's Director of Sales, Sean O'Reilly. He's a real hoot - a bit of a legend on the conference circuit. We arrived on the Sunday. Since Sean and I are now using demo laptops and have the exhibit booth process nailed down after numerous conferences, it only took us about 15 minutes to set up the booth before we could unwind.


That night the BPC kicked off unofficially with a Super Bowl party in Ben Curry's suite. It was great to watch the game with various attendees and exhibitors.

Joel Oleson gave a funny keynote on Monday morning, talking about the 10 steps to success. He argued that SharePoint is plastic, so just because you can do something with it doesn't mean you should. Some of his analogies included Robot Barbie and headless chickens. There was also a disapproving mother ("IT") and a finger-painting baby who got paint all over the wall ("the business").

One Gartner quote he mentioned: by 2010, fewer than 35% of WSS sites will have put effective governance in place!

Programming Best Practices

In between exhibit hours and meetings, I was able to attend only one seminar, given by Francis Cheung of Microsoft's Patterns and Practices Group. This was a really neat explanation of best practices for programming against SharePoint.

What was interesting was how Francis showed that object-oriented patterns like the Repository pattern can be used to abstract away SharePoint-specific resources like list names, loggers, and so on. Francis also pointed out the need to create strongly typed business entities for SharePoint, rather than making straight calls against SP objects.

This provides an additional layer of abstraction that allows mocking and unit testing. The idea is that for code testing purposes you should be able to swap in mock interfaces without relying on SharePoint being available. For instance you should be able to run unit tests against SharePoint code without the SQL database even being available.

These entities also make it easier to work with presenters/controllers without worrying about looking up Site or List GUIDs, and they provide an easy way to do CRUD operations.
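
Here's a minimal sketch of the idea - a strongly typed entity, a repository interface, a SharePoint-backed implementation, and an in-memory test double. This is my own illustration rather than the actual Patterns & Practices code, and the entity and list names are hypothetical:

    // A sketch only -- my own illustration, not the Patterns & Practices code.
    using System.Collections.Generic;
    using Microsoft.SharePoint; // referenced only by the concrete implementation

    // A strongly typed business entity instead of raw SPListItem access.
    public class TrainingCourse
    {
        public string Title { get; set; }
        public string Code { get; set; }
    }

    public interface ITrainingCourseRepository
    {
        IList<TrainingCourse> GetAll();
        void Add(TrainingCourse course);
    }

    // Production implementation: the only layer that touches SharePoint directly.
    // "Training Courses" is a hypothetical list name.
    public class SPTrainingCourseRepository : ITrainingCourseRepository
    {
        private readonly string siteUrl;

        public SPTrainingCourseRepository(string siteUrl)
        {
            this.siteUrl = siteUrl;
        }

        public IList<TrainingCourse> GetAll()
        {
            var result = new List<TrainingCourse>();
            using (var site = new SPSite(siteUrl))
            using (var web = site.OpenWeb())
            {
                foreach (SPListItem item in web.Lists["Training Courses"].Items)
                {
                    result.Add(new TrainingCourse
                    {
                        Title = (string)item["Title"],
                        Code = (string)item["Code"]
                    });
                }
            }
            return result;
        }

        public void Add(TrainingCourse course)
        {
            using (var site = new SPSite(siteUrl))
            using (var web = site.OpenWeb())
            {
                SPListItem item = web.Lists["Training Courses"].Items.Add();
                item["Title"] = course.Title;
                item["Code"] = course.Code;
                item.Update();
            }
        }
    }

    // Test double: unit tests run against this with no SharePoint farm or SQL.
    public class InMemoryTrainingCourseRepository : ITrainingCourseRepository
    {
        private readonly List<TrainingCourse> store = new List<TrainingCourse>();
        public IList<TrainingCourse> GetAll() { return store; }
        public void Add(TrainingCourse course) { store.Add(course); }
    }

Presenters and unit tests depend only on ITrainingCourseRepository, so they can swap in the in-memory version and run without SharePoint (or the SQL database) being available.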

Microsoft currently has patterns and practices guidance available at www.microsoft.com/spg. From the guidance doc:

This guidance discusses the following:

  • Architectural decisions about patterns, feature factoring, and packaging.
  • Design tradeoffs for common decisions many developers encounter, such as when to use SharePoint lists or a database to store information.
  • Implementation examples that are demonstrated in the Training Management application and in the QuickStarts.
  • How to design for testability, create unit tests, and run continuous integration.
  • How to set up different environments including the development, build, test, staging, and production environments.
  • How to manage the application life cycle through development, test, deployment, and upgrading.
  • Team-based intranet application development.

This is actually a standard approach to custom code development, but SharePoint has tended to blur the lines a little. What is very interesting is that more and more of the BPC seminars were about programming and unit testing. People were even talking about Test Driven Development (TDD) against SharePoint!

What this indicates is that organizations are starting to treat SharePoint seriously as a development platform. This has always held potential but required a steep learning curve. For example, at the Best Practices Conference last year there was very little on these sorts of developer-centric practices. This year it was all about programming against SharePoint in the traditional way - using unit tests, mocking, web smoke tests, and OO patterns.

There hasn't been a killer app for SharePoint yet but it's coming.

Best Practices For Centrally Governing Your Portal and Governance

On Wednesday morning I gave a presentation on helpful tips for centrally managing a portal. I showed a governance site collection I have been working on and talked about how it can be used to make running a portal intuitive and easy.

I'm including my PowerPoint presentation here.

There are no screenshots of the governance site collection yet although I am uploading a couple as part of this post.

[Screenshot: Governance Site Home Page]

[Screenshot: Governance Site Taxonomy]

Hope this helps!