Thursday, May 10, 2012

Book Review: .NET 4.0 Generics Beginner’s Guide

I am definitely no kind of expert in .NET software development.  That's probably not a surprise to anyone who's read this blog before, but in the case of this book review it's an important fact to keep in mind, because it makes me the ideal target audience for this book.

 

Packt Publishing’s “.NET 4.0 Generics – Beginner’s Guide” by Sudipta Mukherjee is an excellent resource for any newcomer to the .NET development arena.  Mukherjee’s writing style is approachable and not prone to the kind of dense technobabble that is a common feature in many modern computing texts.

Rather than talking at length about underlying principles, Mukherjee sketches out the topics and approaches relevant to each section, then walks through the application of those principles in the context of detailed examples.  These walkthroughs contain several “Have A Go Hero” points where the reader is encouraged to try the example out before reading on; rather than announcing “this is what the example will do”, the analysis of each example comes after the reader has attempted the code.  As a kinaesthetic learner I found this to be particularly helpful.  This approach drives the reader towards implementation, not just explanation – and having actually used the code patterns in this text means I’m less likely to forget them once I’ve set the book aside for more than a week.

In terms of topic coverage, I found this book to be quite comprehensive.  Early chapters explain the history of the problems addressed by generics, outline the interfaces common to most .NET generics (e.g. IEnumerable<T>, IComparable<T>), and then jump straight into seeing simple generic types and collections in action.
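
To give a flavour of what “simple generic types in action” means, here is a minimal sketch of my own – it is not an example from the book, just the kind of constrained generic class those early chapters deal with:

using System;
using System.Collections.Generic;

// A tiny generic type constrained to comparable elements. The class and
// its names are invented here for illustration.
public class Leaderboard<T> where T : IComparable<T>
{
    private readonly List<T> _scores = new List<T>();

    public void Add(T score)
    {
        _scores.Add(score);
    }

    // Finds the highest score using IComparable<T>.CompareTo, so the
    // same class works for ints, strings, or any other comparable type.
    public T Best()
    {
        if (_scores.Count == 0)
            throw new InvalidOperationException("No scores recorded.");

        T best = _scores[0];
        foreach (T score in _scores)
        {
            if (score.CompareTo(best) > 0)
                best = score;
        }
        return best;
    }
}

class Program
{
    static void Main()
    {
        var board = new Leaderboard<int>();
        board.Add(12);
        board.Add(42);
        board.Add(7);
        Console.WriteLine(board.Best()); // prints 42
    }
}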

This text also contains an excellent introduction to C# extension methods and LINQ.  I have to admit that my understanding of LINQ is basic at best, and the explanation of the various LINQ clauses – along with the use of anonymous Funcs, Actions and delegates – made more sense to me upon reading this text than after attempting to grind through others that have covered the same ground.
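
As a quick illustration of the territory (my own sketch, not code from the book), here is an extension method being consumed by a LINQ query via an anonymous delegate:

using System;
using System.Collections.Generic;
using System.Linq;

public static class StringExtensions
{
    // Extension methods are static methods whose first parameter carries
    // the "this" modifier; callers can then invoke them like instance methods.
    public static bool IsShorterThan(this string value, int length)
    {
        return value != null && value.Length < length;
    }
}

class Program
{
    static void Main()
    {
        var names = new List<string> { "Ada", "Grace", "Edsger", "Don" };

        // An anonymous delegate wrapped in a Func<string, bool>, fed to
        // LINQ's Where clause; a lambda would do the same job.
        Func<string, bool> isShort = delegate(string n) { return n.IsShorterThan(4); };

        IEnumerable<string> shortNames = names.Where(isShort).OrderBy(n => n);

        foreach (string name in shortNames)
            Console.WriteLine(name); // Ada, then Don
    }
}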

As the book progresses, more complex usage patterns are unfolded before the reader, and more complex types that use or are composed of primitive generic types are explained in the context of those primitives.  Multithreading in .NET generics is described with some especially clear examples, and given my background in scripting, seeing just how powerful the eventing model in .NET 4.0 generics is was, quite frankly, astonishing!  Additional third-party libraries are also described (again, with great examples) and the book is rounded out by a couple of great chapters on best practices and performance tuning – again in a way that takes great pains to show, not tell.
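
For readers who haven't met the thread-safe generic collections that the multithreading material revolves around, here's a minimal sketch of my own (using only the standard System.Collections.Concurrent types, nothing specific to the book):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var queue = new ConcurrentQueue<int>();

        // Many tasks enqueue in parallel; ConcurrentQueue<T> handles all
        // the synchronisation internally - no explicit locks required.
        Parallel.For(0, 1000, i => queue.Enqueue(i));

        int item;
        int dequeued = 0;
        while (queue.TryDequeue(out item))
            dequeued++;

        Console.WriteLine(dequeued); // prints 1000
    }
}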

Some of the examples are quite amusing, as they occasionally contain quirky references to well-known internet and software brands.  Between the easy reading, the great examples and the humour, I have little choice but to give this book a 5-star rating, and to offer Mukherjee and the editing team at Packt Publishing a resounding pat on the back!


Monday, May 7, 2012

Where I’ve been and What I’ve been doing

Wow… it’s been too long since I posted here.  I guess that means it’s time I offered a brief explanation.

In September 2010 I tendered my resignation from HP and hung out a shingle as a SQL Server consultant/architect/developer.  During the 20 months or so since then I've been kept pretty busy working on client projects.  Since I'm in the process of ramping up into a new piece of work, I actually have the luxury of a few minutes to do some blogging.

So… what have I been doing?  Well… the first project was for an agricultural services company doing some data modelling for a customer data warehouse and the SSIS development to populate said warehouse.  In that project I was using SQL Server 2008 R2, Pragmatic Works BI Documenter, BIDS 2008 R2 and Visual Studio 2010 Data Tools for the database development and source control.

The second project was much more code-intensive – I was building a database and corresponding client-side data entry application (and a bolt-on ASP.NET web site) to handle cross-institutional status credit assessments for a local university.  For that project I used Visual Studio LightSwitch on the client-side to build the forms solution.

Some of you might be asking “What the hell is Visual Studio LightSwitch?” – and I’m glad you asked.  VSLS is a model-driven framework for rapidly building business data applications using some of the best practice features in the .NET framework.  Amongst other things, it uses:

  • ADO.NET Entity Framework (to abstract underlying data stores as objects that can be interrogated and updated via LINQ to Entities – see the sketch after this list)
  • Managed Extensibility Framework (to provide a plugin model for various kinds of functionality, including themes, custom controls, application shells, business object templates, screen templates and so on)
  • A Model-View-ViewModel (MVVM) N-tier architecture (which enforces separation of concerns and imposes good coding practices regarding the separation of UI logic from business logic and data logic)
  • Microsoft’s ClickOnce framework (for deployment of client-side and mid-tier binaries)
  • Native Azure integration (both for the presentation/business tiers (deploys as Azure hosted apps) and data tier (deploys as SQL Azure database))
  • Visual Studio’s data tools (to manage the enumeration and deployment of database changes – although this is hidden from the developer)
  • Claims-based security model using the ASP.NET authentication and session management database (to allow pluggable authentication models (none/forms based/windows authentication) and a relatively simple claims-based security model to be implemented within applications)
  • Choice of C# and VB for code-behind
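
To illustrate the LINQ to Entities point above: queries against the model read like ordinary LINQ.  The sketch below is mine – the Assessment entity and its properties are invented, and I'm demonstrating the query shape over an in-memory stand-in rather than LightSwitch's actual generated classes:

using System;
using System.Collections.Generic;
using System.Linq;

class Assessment
{
    public string Title { get; set; }
    public string Status { get; set; }
    public DateTime SubmittedOn { get; set; }
}

class Program
{
    static void Main()
    {
        // In the real thing this would be an entity set exposed by the
        // Entity Framework; here it's just an in-memory list.
        IQueryable<Assessment> assessments = new List<Assessment>
        {
            new Assessment { Title = "Credit transfer #1", Status = "Pending",  SubmittedOn = DateTime.Today },
            new Assessment { Title = "Credit transfer #2", Status = "Approved", SubmittedOn = DateTime.Today }
        }.AsQueryable();

        var pending = from a in assessments
                      where a.Status == "Pending"
                      orderby a.SubmittedOn
                      select a;

        foreach (var a in pending)
            Console.WriteLine(a.Title); // Credit transfer #1
    }
}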

Microsoft promotes LightSwitch as a “No-Code” framework, although in its version 1.0 form that claim only really holds for the simplest of data entry and search forms.

The solution I built in LightSwitch does a lot of custom data presentation and complex validation logic, compiles and sends emails from templates stored in SQL Server, allows the upload, download, preview and printing of documents, and has a lot of term-based search functionality.  Navigation is relatively intuitive, and the solution is now deployed into production.  I also built a simple ASP.NET form for processing approvals from lecturers.

Once I got the hang of some of the quirks of the LightSwitch 2011 (i.e. v1.0) platform and learnt how to recover quickly from some infrastructure issues within LightSwitch that cropped up a few times, I was able to be quite productive within the framework.  Given that, as coders go, I make a great data architect, this is quite an achievement.  However, I'd warn any business developers looking to invest heavily in LightSwitch to be wary of the v1.0 release: the documentation is fairly sparse, and it took quite a while for the forum junkies on MSDN to get up to speed with some of the common glitches you might come across.  With that said, once you've got over the initial hump (and had to rebuild a couple of apps from scratch), you'll find it a much more efficient way to build business applications than hand-coding WinForms, ASP.NET, WPF or Silverlight apps.  The flexible deployment options also make for a painless release process – your infrastructure guys will love you for it.

This solution is going to form the basis for a cloud-based multi-tenanted solution that other higher educational institutions can subscribe to.

In addition to the status compendium solution, I have also started mapping out data models and some initial screens for a couple of other applications.  The first is an IT architecture repository that allows IT architects, change managers and service delivery managers to trace and audit business functionality through to the application systems, source code and infrastructure on which those solutions are deployed.  The second is a concept for a crowd-sourcing solution that lets people with idle capital equipment lend/hire out their gear to other people who need it for short-term projects.


Wednesday, January 27, 2010

Confessions of a Slacker (and OneNote Niftiness)

A quick update on DPGA and this blog...

For those following this blog, I've been quiet for a couple of weeks.  One reason for this is that I've been setting up a SQL Server blog over at http://ozziemedessql.blogspot.com, where I'm doing a series of posts on querying metadata for fun and productivity.  The other reason is that I have been working on building user story cards for the DPGA application.

As an aside, I've been building the DPGA story cards using Microsoft's OneNote application, which lends itself nicely to this kind of relatively ad hoc documentation.  It's possible to create OneNote post templates containing preconfigured content (which I've done), so that each time you create a new note within a given section it is based on an existing note.  The net result looks something like this:



[Screenshot: a new OneNote page created from the pre-formatted user story template, with an empty table ready to be filled in]

Is it not nifty?  I just click "New Page" and I get a clean pre-formatted template and can populate data directly into the table.

The DPGA app is fairly complex (lots of different TYPES of functionality), so building user stories is a relatively time-consuming process. Anyone who wants to help out by building story cards is more than welcome to do so. Just leave a comment with an email address I can hit you back on, and I'll set you up with access to the OneNote repository on the DPGA project site. This is a great opportunity to get involved in the design of the tool.

Once I've got all the user stories worked out, I'll publish a "project backlog" (basically a list of user stories to be implemented) and will work through the priorities.  Once that's done, I can start working with code!

Until then - Code Well and Code Often!

Thursday, January 7, 2010

Note: Game Assistant Contributions

Just so people are aware – I’ve created a project on CodePlex for this project, so people will be able to inspect the source code and discuss what I’ve done and how I’ve done it.  I’m happy to accept that my code will not be brilliant quality at the start – this project is as much about teaching me to code as it is about teaching others.

Also, if people want to help out on the project, I’m happy to accept other contributors.  My only request is that you focus on the real objective of the project – i.e. learning about what’s possible in the .NET framework – and make sure that your code is commented properly.  I’d also ask that you post blog entries on any modules submitted to the project, so both I and my readers can learn from what you’ve done.

Proposed Technologies for Game Assistant

In my last post, I put together a feature list for my Gaming Assistant project. At this point I’m ready to propose a set of technologies that will be used to build my solution.

Technology Selection

I’m going to take a structured approach to the technology selection. There are five key technology choices I need to make, as follows:

  • Code Platform (choices: VB.NET, C#, other)
  • Visual Platform (choices: WinForms, ASP.NET, WPF, Silverlight, DirectX)
  • Remoting Platform (choices: WCF, SOAP web services, TCP/IP Sockets)
  • Multimedia Platform (choices: Live Services, DirectX, WPF MediaElement, Skype API)
  • Structured Data Persistence & Distribution Platform (choices: XML files, SQL Server Compact Edition, SQL Server Express Edition)

Let’s tackle these one by one.

Code Platform

This is a personal choice only, not a criticism of any of the other options. I like the conciseness of C#. VB.NET might be a little “friendlier” to read, but I find myself having to read a lot more code to understand what’s going on in VB.NET than C#. I could possibly also look at alternative technologies with Common Language Runtime support (e.g. IronRuby, IronPython) but I’d be going back to square one, whereas with C# I at least have some idea about the syntax. So… C# it is.

Visual Platform

This is a much tougher choice. Most of the folks who will be using this program will also be geeky enough to have computer hardware capable of supporting all of the DirectX (and thus WPF and Silverlight) vector graphics features. So let’s walk through the pros and cons of each product.

Windows Forms

Pros:
  • I have some experience writing simple Windows Forms applications
  • Windows Forms does not require .NET Framework version 3.0 features, so backward compatibility may not be as hard to implement
  • WinForms offers a rich eventing model and provides full multithreading support

Cons:
  • Windows Forms is based on GDI+, not on DirectX, which means the “design palette” is not as functional as with other visual platform options
  • Windows Forms offers minimal support for vector graphics – particularly important for map rotation/zoom

ASP.NET Web Forms

Pros:
  • I have plenty of reference material on ASP.NET
  • Using ASP.NET I could build something along the lines of a Web 2.0 “mashup”
  • ASP.NET supports the full functionality of the .NET framework in server-side code-behind

Cons:
  • I have minimal experience working with ASP.NET
  • Managing session state and asynchronous events can be painful
  • ASP.NET requires extra effort around database security
  • ASP.NET does not support vector graphics to the same degree as other options
  • ASP.NET requires a server
  • For the app I want to build, ASP.NET will have to be mixed with AJAX or jQuery controls, requiring an additional language learning curve

Windows Presentation Foundation

Pros:
  • I have done a 3-day course on WPF
  • WPF supports vector graphics and DirectX
  • WPF uses XAML for form design – and as such is more declarative than WinForms – this simplifies coding slightly
  • WPF supports the full functionality of the .NET framework in code-behind
  • WPF does not require JavaScript, AJAX or jQuery knowledge

Cons:
  • I have less direct experience working with WPF than WinForms

Silverlight

Pros:
  • Silverlight is heavily based on WPF
  • Silverlight supports vector graphics and DirectX
  • Silverlight uses XAML for form design – and as such is more declarative than WinForms – this simplifies coding slightly
  • Silverlight will work on a Mac

Cons:
  • I have no direct experience working with Silverlight
  • Silverlight supports only a subset of the full functionality of the .NET framework in code-behind
  • Silverlight is a web control set, and as such runs in a sandbox that will make remoting awkward at best

Native Direct-X

Pros:
  • DirectX provides high-performance access to graphics hardware
  • DirectX natively supports most of the things we want to do
  • DirectX offers the opportunity to design a very rich interface
  • DirectX is a stable set of APIs – it has been around since the mid-90s
  • DirectX provides feature-rich audio capabilities, including DirectVoice – its audio conferencing API
  • Because DirectX is a set of modular APIs, it can be included in other visual platforms if required

Cons:
  • I have no real experience working with DirectX
  • DirectX is a complex API for developers who haven’t written games before
  • DirectX is very resource dependent – this could be a strength if I were artistically inclined, but I’m not, so having to create shader textures, rich bitmaps and so forth would be too hard

After working through the list above, I’ve decided that WPF provides the best mix of functionality and access to other tools.

Remoting Platform

What do I mean by “Remoting”? Remoting is the glue that connects one instance of the application to another over the network, allowing sharing of information, feeds (e.g. Chat) and streams (video/voice). Out of the APIs listed above, it’s probably going to be easiest to run with Windows Communication Foundation (WCF), as it serves as a wrapper/abstraction layer for the other two options (TCP/IP Sockets and Web Services).
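
To make that concrete, here's a minimal sketch of the WCF programming model – the service name and operation are invented for illustration, not the Game Assistant's actual contract:

using System;
using System.ServiceModel;

// A contract is just an attributed interface; the transport (TCP, HTTP,
// named pipes) is chosen by the binding, not by the code.
[ServiceContract]
public interface IChatService
{
    [OperationContract]
    void Send(string player, string message);
}

public class ChatService : IChatService
{
    public void Send(string player, string message)
    {
        Console.WriteLine(player + ": " + message);
    }
}

class Program
{
    static void Main()
    {
        // Swapping NetTcpBinding for BasicHttpBinding changes the wire
        // protocol without touching the service implementation.
        using (var host = new ServiceHost(typeof(ChatService),
                   new Uri("net.tcp://localhost:8000/chat")))
        {
            host.AddServiceEndpoint(typeof(IChatService), new NetTcpBinding(), "");
            host.Open();
            Console.WriteLine("Chat service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}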

Multimedia Platform

I think I’m going to park this one for now while I do more research. There are a lot of options out there (Windows-native and third-party APIs) and I’d like to spend some time evaluating them more thoroughly before I make a choice. Once I get a bit further into the project, I’ll post my reviews online here.

Structured Data Persistence & Distribution Platform

Having spent over a decade as a SQL Server DBA/developer, the product family of choice for me is going to be SQL Server of some sort. XML is an option, but it’s stored as plain text and is therefore prone to hacking. XML is also a bastard to query – XPath is a pain to use, and XQuery is not supported natively in all situations.

The choice to me seems to be between SQL Server Compact Edition (runs in-process with the application, no support for stored procedures, but very small footprint) and SQL Server Express Edition (provides more features, but also requires a fairly large installation footprint).

For the purposes of this application, I’m going to favour deployment footprint as the determinant of which edition to use. As such, SQL Server Compact Edition will be used. I will use Sync Services to make sure that players can inspect and annotate their character sheets offline, and have their changes synchronized back to the GM once they connect.
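
For anyone who hasn't used the Compact Edition before, here's a minimal sketch of what "in-process" means in practice – the file name and table are invented for illustration, and it assumes the SQL Server Compact runtime plus a reference to System.Data.SqlServerCe:

using System;
using System.IO;
using System.Data.SqlServerCe;

class Program
{
    static void Main()
    {
        const string dbFile = "GameAssistant.sdf";
        string connStr = "Data Source=" + dbFile;

        // The database is just a file; no service or server install needed.
        if (!File.Exists(dbFile))
        {
            using (var engine = new SqlCeEngine(connStr))
                engine.CreateDatabase();

            using (var conn = new SqlCeConnection(connStr))
            {
                conn.Open();
                using (var cmd = new SqlCeCommand(
                    "CREATE TABLE Character (Id INT IDENTITY PRIMARY KEY, Name NVARCHAR(50))",
                    conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }

        Console.WriteLine("Database ready at " + Path.GetFullPath(dbFile));
    }
}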



I know I promised that I’d add some scheduling to this post, but the platform selection discussions got a bit bigger than I intended.

I’ve had a deeper think about scheduling and decided that I’m also going to manage the activities on this project using a 1-man “agile” approach. This means that the next post will lay out the process of turning the high level features from the previous post into a set of user stories to be implemented. I will rate the user stories by priority and effort, and seek to build a schedule from that.

Note that this is an out-of-hours project for me, so I’m only going to commit 10-15 hours a week to working on it, where the 10 is the actual coding/design effort, and the 0-5 extra is the blogging time. I may spend more time on development or blogging occasionally, but this project should be considered to be a long-term initiative.

Wednesday, January 6, 2010

MIA but now returned…

Wow… what a year 2009 turned out to be.  I ended up doing a lot of travel – especially during the latter half of the year, and ended up working long hours on most of the projects I was assigned to last year.  I didn’t have the spare energy to do the personal learning or the blog posting required to be able to achieve the objectives of this weblog, but I’m back on deck now – refocused, rejuvenated and ready to start chasing Eureka moments again in 2010.

So… to keep myself honest, I’m going to post somewhat of a schedule here that I’ll try my best to meet.  I’m also going to put my dev work in the context of a real-world project that will make use of .NET 3.0 and 3.5 technologies as a way to task orientate my learning, and the information I’ll be sharing on this blog.

So – let’s get started then!

The App

The project I’m going to be working with is a dice-and-paper role-playing game assistant.  I’m an old school gamer (in the pre-PC gaming sense) who has some friends interstate who are keen to do some dice-and-paper gaming, so I want to build an application that can facilitate a distributed gaming session, where participants are in different cities, or perhaps even different countries!  For now, I’m going to focus on Dungeons and Dragons Ed 3.5, as it’s relatively well-known and still widely enjoyed.  However, once the initial cut is out, I may look at using some extensibility tools to see if I can make the game system “pluggable”, look at writing some plug-ins for the White Wolf “World of Darkness” game system, and perhaps look into wiring up Cyberpunk 2020 and Shadowrun.

Requirements

Video & Voice

The first thing I want the tool to do is display multiple web-cam feeds and play multiple voice streams.  I also want the tool to enable private conversations between the Game Master (GM – aka Dungeon Master) and individual players.

Text Messaging

The next thing I need is the ability for the GM to send text messages to individual players, and be able to see a feed of text messages being sent between the other players.

Roll, baby, Roll!

Another feature I need to implement is a centralized dice rolling system.  In a distributed gaming environment, relying on players to be honest with dice rolls can – well… let’s call it an exercise in optimism.  So a centralized dice manager is critical.  The dice roller needs to have the ability for all of the GM’s rolls to be made private.  The GM also needs to be able to make certain player rolls private between the GM and the chosen player.
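
As a first-pass sketch (my own, and certainly not the final design), the core of a centralised roller is tiny – the interesting work will be in broadcasting results and honouring the privacy flag:

using System;

class DiceRoller
{
    // A single server-side Random instance - clients never roll locally,
    // which is the whole point of centralising the rolls.
    private readonly Random _rng = new Random();

    public int Roll(int count, int sides, bool gmOnly, string player)
    {
        int total = 0;
        for (int i = 0; i < count; i++)
            total += _rng.Next(1, sides + 1);

        // gmOnly would control who the result is broadcast to; here we
        // just annotate the console output.
        Console.WriteLine("{0} rolled {1}d{2}: {3}{4}",
            player, count, sides, total, gmOnly ? " (GM only)" : "");
        return total;
    }
}

class Program
{
    static void Main()
    {
        var roller = new DiceRoller();
        roller.Roll(3, 6, false, "Aelar");   // public 3d6
        roller.Roll(1, 20, true, "GM");      // private d20
    }
}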

What a character…

Another core feature is character sheet tracking.  Character stats, abilities, experience points, To-Hit and Saving throw targets, Inventory, Banked Inventory and personal character notes all need to be tracked.  We will store some of this information in SQL Server database tables.  The database will also store relevant tables used for resolving combat and non-combat character situations.  I will also need to build a forms-based GUI for displaying and editing information in the character sheets.  The GM needs edit-level access to all character sheets, and the players need to be able to edit certain fields. 

A fast game is a good game…

A “nice to have” feature would be a sand-timer that the GM can trigger to force players to come to a quick decision.  This is particularly useful in combat scenarios where you don’t want players “meta-gaming” too much – i.e. haggling over who is going to hit the dragon with swords, where the archer is going to stand, what spell the magic user should be casting, or who the cleric should be healing first.  An alarm that plays in the audio channel of all players when the sand-timer empties should get the message across.

Combat Assistant

Another “nice to have” will be the inclusion of a combat ticker that can help the GM manage initiative rolls, character/enemy action sequencing and resolve the results of dice rolls.  Ideally, this would pop up in a separate window and provide support for various character choices such as subduing an enemy, charging, retreating, performing a defensive disengagement or calling an aimed strike.

Maps, maps and more maps!

Once again, on the “nice to have” list will be a map management tool that allows the GM to:

  • “Automagically” generate dungeon and building maps (this actually requires some quite complex rules, which is why this feature is a nice to have)
  • Upload maps created by the GM in Visio or JPEG format
  • Display currently visible map segments to players
  • Manage the “Fog of War” based on player lines of sight

In-Game Economics and Politics

The final “nice to have” feature (and probably the most difficult to implement) is tracking of in-game economics and politics.  For example:

  • If a band of adventurers clears all the goblins out of a dungeon, who moves in later?  What impact does this have on human, demi-human and humanoid populations in the area?  Who are the movers and shakers in the region? 
  • If the player characters killed the goblin smith who was working on a doomsday weapon for a major political player, can that player find out who did it?  And what form will their retribution take?
  • What about the slightly higher level scenario of a dark army levelling most of the towns in a given region and putting their crops to the torch?  What are the impacts of population change and resource changes? 
  • If the players drain a major shrine of its healing power to raise one of their fallen comrades from the dead, will this anger the locals? 
  • Given the typically feudal milieu that many role-playing games are set in, what sorts of problems might our characters set off by rising above their station, or mingling with lower castes than their own?

A good GM will typically define a set of “factions” with whom adventurers can raise or lower their reputations.  A reputation tracker will be an important part of this module, but the GM should also be able to do some basic price modelling based on the demand and supply of resources, keeping in mind that humanoid races may participate in black-market economies, and that monsters can have an impact on the local demand and supply of goods (e.g. the undead roaming the marsh-lands impacting the supply of peat to locals, or the bandits paying for their supplies with fool’s gold).  These impacts can probably be categorized as geographic, systemic, magic or malefic to help make the modelling easier.


Well… that’s a pretty fair volume of text for a single blog post.  I’ll sign off for now, but in the next post I’ll discuss the technologies I intend to use for the Game Assistant, and start building a schedule for follow-up posts.

Sunday, May 10, 2009

Moving from GedZone on Live Spaces

This is the first in what I hope will be a decent number of posts on Blogger. I actually started this blog on GedZone, but I'm moving content over here as I'm disappointed with the Live Spaces blogging engine's support for extensibility and RSS aggregation. I also have a blog over on Xanga.

So… firstly, a few things about me.

I’m an information technology professional who works for a large IT outsourcer. I’ve been somewhat rudderless there for about two years, trapped between infrastructure software support and software architecture (particularly in the SQL Server area), catching just about everything Microsoft-related that has fallen through the cracks, but largely twiddling my thumbs waiting for work to come along. One specific area in which I’ve spent a lot of hours is in tools support for development with the .NET platform. I can’t say I’ve had a huge amount of time to spend using those tools yet, but I’ve written a couple of hundred pages of documentation for support and development teams on tools such as SQLNexus, Microsoft’s RML utilities for SQL Server, and the Visual Studio Team System Database Edition tools that ship with VSTS 2008.

My employer is currently undergoing some interesting organizational changes as a result of a merger between it and a competitor. One of the things that is happening is that I’m moving across into the Microsoft Solutions Practice, in which my role will incorporate elements of a number of more formally defined roles – solution designs using Microsoft’s various application servers will certainly be a big part of my job, but I will also be responsible (along with my peers) for defining training paths, product capability maps, tools evangelism and standards evangelism. It looks pretty interesting, but I’m having to boost my skills both up into the software architecture space, and sideways into the sales support and software development areas. Just to round all this out, I’ve also decided to work towards raising my profile in the Microsoft developer community, and the blogging on this site will be the first step.

I’ve just got back from two weeks touring around the south west of Western Australia. The trip was a lot of fun (despite having cataclysmic elements such as tipping the car onto its right side while driving on a very loose, pebbly dirt road near Pemberton, and getting stuck in Manjimup for 4 days). From a .NET learning perspective it was also quite productive! The great thing about driving around on my own is that I had nights to myself, and I spent several evenings cutting C# code for some demos which you’ll start seeing up here over the coming weeks.

In the demos, I went back to my “CompSci 101” training and started out with a “Hello World” app. I then built that out with some user I/O, starting out with a static class for tracking the time between console events, and an instantiable class with private properties for file logging. I then reused the classes from my console app in a WinForms version of the Hello World demo. I’ve also started working on a “Demo 2” series that will focus more on inheritance, polymorphism and other OO features such as indexers, serialization and code attributes. I’m also taking pains to introduce a few .NET framework classes in each demo, so the output won’t just be “Here’s the C# language…” – it will be “Here’s the .NET Framework” as well.
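
To give a feel for where those demos are heading, here's a rough sketch of the two classes described above – the names are mine, and the real demo code will appear in the posts themselves:

using System;
using System.IO;

// A static class tracking the time between console events.
static class EventTimer
{
    private static DateTime _last = DateTime.Now;

    public static TimeSpan SinceLastEvent()
    {
        DateTime now = DateTime.Now;
        TimeSpan elapsed = now - _last;
        _last = now;
        return elapsed;
    }
}

// An instantiable class with private state that logs to a file.
class FileLogger
{
    private readonly string _path;

    public FileLogger(string path)
    {
        _path = path;
    }

    public void Log(string message)
    {
        File.AppendAllText(_path, DateTime.Now + " " + message + Environment.NewLine);
    }
}

class Program
{
    static void Main()
    {
        var logger = new FileLogger("hello.log");
        Console.WriteLine("Hello World");
        Console.ReadLine(); // wait for a console event...
        logger.Log("User pressed Enter after " + EventTimer.SinceLastEvent());
    }
}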

Over coming months I’ll be gearing up to actually video-casting some of this material, and preparing live presentations for user groups on more advanced topics. The products that will probably get most input from me in the new role at work will be BizTalk, SharePoint and Dynamics CRM, so expect to see more about these products. I will also be looking to stand up the infrastructure required to record some demos using these tools, and will start posting that material once I’m done exploring the .NET framework.


Well… that’s it for now. Code often… Code well!!!