Saturday, December 22, 2007

VSoft FinalBuilder 5.5

In the last few days I finally got around to testing and deploying FinalBuilder. This tool is like a DSL for building software applications. With over 600 actions (covering almost anything you can imagine), it takes just a couple of hours to set up daily builds for a Visual Studio solution and its projects.

What I did was set it up to automatically build one of my company's MOSS2007 components, SharePoint Rules, and five of its plug-ins. I created a project, configured it to get the latest version of the code, created six separate "action lists" (==build subroutines), and finished it off by sending a success e-mail. Each of the subroutines just preps the dependent assemblies, builds the respective solution, and then generates SharePoint's solution for installation. If anything fails in the process, a failure e-mail is sent.
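To make the shape of the build project concrete, here's a purely illustrative sketch in Python of the flow described above -- the component names and step strings are mine, not FinalBuilder's actual action format:

```python
# Hypothetical sketch of the daily-build flow: get latest, run one "action
# list" per component, and e-mail on success or failure. Not FinalBuilder's
# real API -- just the logic of the build project expressed as code.
def build_component(name):
    """One 'action list': prep dependencies, build, package the solution."""
    return [f"prep dependencies for {name}",
            f"msbuild {name}.sln",
            f"generate {name}.wsp"]

def daily_build(components):
    log = ["get latest version from source control"]
    try:
        for component in components:
            log.extend(build_component(component))
        log.append("send success e-mail")
    except Exception:
        log.append("send failure e-mail")
    return log

# e.g. daily_build(["SharePointRules"] + [f"PlugIn{i}" for i in range(1, 6)])
```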

The usual approach when setting up continuous/daily builds is to use either Microsoft's MSBuild/TFS, or something like CruiseControl.Net. Having been through this in the past, my opinion is: forget it. Buy this product; it is definitely worth it and will save you hours, if not days. It supports over 10 source control systems (including VSS, TFS, CVS, Subversion and ClearCase), FTP, NNTP, ICQ, installers, virtual machine control (VPC, Virtual Server, and VMware Server/Workstation), burning CDs, and a lot more (the list is here). The design experience even includes debugging the build project, with the mandatory breakpoints/watches.

All this said, there are some minor glitches: the lack of an "Undo" meant I had to revert to previous versions of the build project a couple of times. Also, I quickly found a discrepancy between the Help File and one specific action (I posted a message in VSoft's support forum and got a complete reply in under 24 hours explaining the situation -- with a solution attached).

Along with CodeSmith and a few others, this is one of the most valuable tools I've ever used in development. Highly recommended.


Changing topics: one of the things I tried to set up as part of the build project was generating the CHM help files for the compiled assemblies. The idea was to use the included Sandcastle actions to do this. I had a good impression of Sandcastle (I used the Oct2007 CTP) from what people told me, and of NDoc before that, but had never actually tried to use it until now, and I was thoroughly disappointed. Sandcastle is far from being a simple tool/product, and its usability is near zero. After trying to configure it (for longer than it took me to create the entire build project), hunting for tips in blogs and Xml files, failing to get even the included samples to work, and always ending up with 0x8000-like errors, I just quit. A colleague had success with the Sandcastle Help File Builder available at CodePlex, but since that is not usable in the automated build scenario, I just disabled the Sandcastle actions until a better day comes. A disappointment I wasn't expecting. Can't win them all. :-)


Disclosure: as an MVP, I have received a free license for FinalBuilder 5.5 as a 3rd party offer from VSoft Technologies.

Friday, November 16, 2007

MyTechEd 2007 - Day 3

Clearly the best day of the conference until now, with two really great sessions: Pat Helland's and Rafal's.

The first session was Justin Smith's "Connections in the Cloud - BizTalk Services and WCF". This was an interesting session about BizTalk Services, with some demos to illustrate how it can be used. I've described the technology previously, so I'm not going to spend time here on it. I especially liked the demo where the access control to a service is done at the ISB level, based on claims and without any change whatsoever to the service. We were also told the team is using agile methodologies, with new drops every 6-8 weeks. Workflow is obviously the feature everybody is waiting for. And I personally wish it had context-based routing, and not only named-uri/topic-like pub/sub. CBR is an immensely powerful mechanism available in BizTalk Server, and it allows for greater decoupling between the sender and receiver(s) of a given message.
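To illustrate why CBR decouples things further than topic-based pub/sub, here's a minimal sketch (mine, not BizTalk's or BizTalk Services' API): subscribers register predicates over message *content*, so the sender never needs to know any topic name or subscriber identity.

```python
# Minimal content-based routing sketch: delivery is decided by evaluating
# each subscriber's predicate against the message content, not by matching
# a named topic/URI chosen by the sender.
class ContentRouter:
    def __init__(self):
        self.subscriptions = []  # list of (predicate, handler) pairs

    def subscribe(self, predicate, handler):
        self.subscriptions.append((predicate, handler))

    def publish(self, message):
        delivered = 0
        for predicate, handler in self.subscriptions:
            if predicate(message):   # match on content, not on a topic name
                handler(message)
                delivered += 1
        return delivered

router = ContentRouter()
received = []
# This subscriber only cares about large orders, whoever sent them:
router.subscribe(lambda m: m.get("amount", 0) > 1000, received.append)
router.publish({"type": "order", "amount": 5000})   # delivered
router.publish({"type": "order", "amount": 10})     # not delivered
```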

Next was Matt Winkler's "What's the Context of this Conversation: Enabling long running services in workflow services". The basic ideas of Workflow Services (in .Net 3.5) are implementing services as workflows, and hosting workflows as services. The session described the current mechanisms used to communicate between the WF host and its instances using application queues, and how this works in long-running scenarios. If this were BizTalk Server, this would be a session about Correlation and Convoys. Since it's not, it was about the exchange of context information between the service and its clients, so that the correct instances can be rehydrated when messages/invocations arrive. Matt highly recommends the "Conversation" sample from the SDK to learn about this topic. Having studied previous versions of WF, I'm glad they are dropping the Handle External Event/Call External Method ways of communicating between host and workflows, which always felt strange to use.
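The context-exchange idea can be sketched in a few lines (a toy model of mine, not the actual WF/WCF context protocol): the host hands the client a context token when a conversation starts, and later messages carrying that token are routed to -- and "rehydrate" -- the matching long-running instance.

```python
# Toy sketch of context-based correlation: the token returned at the start
# of the conversation is what lets the host find the right instance later.
import uuid

class WorkflowHost:
    def __init__(self):
        self.instances = {}  # context token -> instance state (could be persisted)

    def start_conversation(self):
        token = str(uuid.uuid4())       # the "context" handed back to the client
        self.instances[token] = {"step": 0}
        return token

    def deliver(self, token, message):
        instance = self.instances[token]  # rehydrate the matching instance
        instance["step"] += 1
        instance["last_message"] = message
        return instance["step"]

host = WorkflowHost()
ctx = host.start_conversation()
host.deliver(ctx, "first")    # every call with the same token reaches
host.deliver(ctx, "second")   # the same instance, however long it lives
```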
This was not the greatest of sessions, but it was interesting nonetheless.

After lunch there was Pat Helland's "Data on the Outside vs Data on the Inside", the best session of TechEd IMHO. This is not a new session (also see this), it's a couple of years old, but the ideas are still up-to-date. The session starts with the idea that services communicate using messages, and from there Pat explores the location of the data (inside services, or in messages being exchanged). The most interesting part of the presentation was the parallels between the theory of relativity and messaging. Definitely a great session. Some quotable sentences I noted: «Messages are not from the now, but from the past. There is no simultaneity at a distance», «Services, transactions and locks bound simultaneity», «All data from distant stars is from the past [so, each service has its own perspective]» and «operators are hope that something will happen in the future».

The next session was "Silverlight, Asp.Net and Web Services in IronPython and IronRuby", presented by Mahesh Prakriya. The session was very much about the Dynamic Language Runtime and the languages that use it. It included quite a few impressive demos of Silverlight and its interaction with IronPython, as well as using Web Services (and dynamic proxy creation). The Asp.Net demo was less interesting. I find dynamic languages really interesting (I love languages like Lisp, Prolog and Xslt), although much less structured than C++/C#/Java-like languages, and would like to experiment with these languages in some enterprise scenarios. Unfortunately, however, it's possible to call from DLR languages into CLR languages, but not vice-versa. Which means I can't develop over SharePoint, OBA apps, or even BizTalk, calling into DLR code. Alas.

The last session of the day was Rafal Lukawiecki's "Developing More Intelligent Applications Using Data Mining". Rafal is a great speaker, and this was a very good session, which is available online here from a previous event. Rafal is widely known for being able to deliver great sessions on whatever topic (he has done sessions on security, networking and MSF at TechEd), being a very eloquent and expressive speaker. His session was dedicated to the use of data mining techniques in application development, and started by establishing the differences between OLAP (interactive exploration of data) and Data Mining (proactive discovery of information/patterns). This "discovery" aspect, or "Predictive Programming", is what can allow us to develop more intelligent apps, with adaptive user interfaces, data input validation, and business process validation. The session ended with a demo of Input Validation using Sql Server 2005 Analysis Services. Overall this was a very interesting session that left me with several ideas of possible applications in the development work we do at |create|it|.

Thursday, November 8, 2007

MyTechEd 2007 - Day 2

The second day started with a BizTalk Server session: "Enterprise Ready BizTalk - Best Practices from Customer Implementations", by Ewan Fairweather, one of the authors of the excellent "Professional BizTalk Server 2006" (Wrox). Not the greatest of sessions, but it had some interesting stuff. One of the topics covered was BizUnit, the test framework for BizTalk, and what I found interesting here was the usage of the database query shapes to query BizTalk's own database, as well as the notion of "priming messages", to warm up the system for the tests you are doing. The session also described and demo'ed code coverage using Visual Studio's performance tools (vsperfmon/vsinstr), to use for instance when testing pipelines of custom components, and the last part was dedicated entirely to Disaster Recovery and the various mechanisms available, from Log Shipping to SAN mirroring or applicational mirroring.

After this, a level 200 session, "Introducing the Microsoft Synchronization Framework", by Philip Vaughn. This was a very high-level and non-technical description of the features of the just-announced sync framework, presenting its use scenarios (offline and collaboration). The framework supports both hub&spoke and P2P topologies, and was described as "the glue for S+S". The framework includes three components: a sync agent (the "daemon"), a set of built-in providers (relational, file system, etc. -- and like the name implies, this set is extensible), and the runtime, and basically supports a "poll" model of change detection, based on versions. Apparently there's already a dev center at MSDN about this CTP technology. The session also included a demo of n-way contact synchronization between a SQL database, Outlook contacts, Windows Mobile 6 contacts, and Windows Vista contacts. Also, the SF is based on metadata information for the sync, which was summarily described, but no one quite understood what that metadata consists of. I was really curious about the Sync Framework, but this was definitely not the best of sessions, especially because it was too general for a technical audience (IMHO).

Following this session was lunchtime's David S. Platt's "Why Software Sucks". David Platt is a very engaging and amusing speaker, and his session was basically about user experience... problems. The session highlighted several UX problems, enumerated his law of UX design: «Know thy user because he is not thee», and ended by giving 5 suggestions: 1) Add a Virgin to the design team (ie, someone from outside); 2) Break convention when needed (ie, Palm/OneNote/MS Money's lack of a Save feature); 3) Don't let the edge cases complicate the mainstream case; 4) Instrument carefully (ie, get info from user usage); 5) Always ask: "Is this individual design decision taking us closer to Just Working? Or farther away?" (ie, are we introducing unnecessary steps or interaction). Having worked in Usability in my previous life, the message conveyed in the session really resonates with my personal views on this topic. HOWEVER, this is an easy session to make: the problems and bad examples in UX abound (I used several when I sold usability services), and Usability is described in a very light way. I totally agree with David Platt's principle, and clearly having the developers engage users (or at least be conscious of them and of the difference in usage profiles) is advantageous, but by itself this is not enough. Plus, I've seen a lot of bad decisions being made by marketing (after all, marketing teams design sites/products, not developers -- usually, at least), NOT developers, so pointing the finger at them is not 100% correct. To conclude: a fun and motivational session, if not 100% "scientific". :-)

After lunch there was another excellent session by Pat Helland, "Life beyond distributed transactions: an apostate's opinion". Pat's basic premise is that you shouldn't use distributed transactions. They are fragile, and they break encapsulation (especially in SOA scenarios). The entire session explored what happens when you assume this, and ways to design your systems, also focusing on scalability. Pat published an article at CIDR 2007 on this very topic, which apparently is available online for download, and he also has posts on his blog about it which I highly recommend. This is the kind of session that makes you reconsider the way you design systems. One of the sentences that I always find motivational is his "accountants don't use erasers". A lot can be derived from this, especially in the way we use databases.
Btw, and I didn't know this, apparently Pat was involved in several implementations of transactions, both local and distributed (2PC), and the word "apostate" means "someone who used to believe [in distributed txs] but no longer does". And when someone in the audience asked him about the MS sales pitch for distributed transactions a few years back, he just honestly replied: "You're right. I'm sorry."

Next session, Neil Padgett's "Implementing solutions that leverage MS Sync Framework to provide sync capabilities across devices, services and applications". This session gave some more detail about the Sync technology, and included both a repeat of the contact-sync demo and some code samples (who had the idea of using dark blue keywords on a dark gray slide background?). I learned that the SF is a set of assemblies, about "sync endpoints/replicas" -- the various parts in a sync network -- and that providers are used to expose these replicas. Neil also described the concepts of Version (of the information) and Knowledge (a representation of the versions a replica knows about), essential to the architecture.
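As I understood the Version/Knowledge idea, each replica versions its items and its "knowledge" summarizes which versions it has already seen, so a sync only ships changes the other side lacks. Here's a rough sketch of that idea (my own toy model -- conflict resolution and all of the real framework's metadata are omitted):

```python
# Toy model of sync-by-knowledge: a replica versions every local change,
# and syncing pulls only those versions absent from its own knowledge.
class Replica:
    def __init__(self, name):
        self.name = name
        self.items = {}     # key -> (version, value)
        self.counter = 0

    def update(self, key, value):
        self.counter += 1
        self.items[key] = ((self.name, self.counter), value)

    def knowledge(self):
        """The set of versions this replica already knows about."""
        return {version for version, _ in self.items.values()}

    def sync_from(self, other):
        known = self.knowledge()
        for key, (version, value) in other.items.items():
            if version not in known:          # a change we haven't seen yet
                self.items[key] = (version, value)

a, b = Replica("A"), Replica("B")
a.update("contact1", "Ann")
b.update("contact2", "Bob")
a.sync_from(b)
b.sync_from(a)   # after the two-way sync, both replicas hold both contacts
```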
I'm guessing the team still didn't have time to polish SF's presentations, as this was another somewhat poor session. Also, Neil didn't always speak very clearly, so I ended up leaving early.

The next session I picked was about "Microsoft Robotics Studio", by Martin Calsyn and Olivier Bloch. Robotics Studio is very frequently mentioned because of the CCR (Concurrency and Coordination Runtime), which can be used to manage apps without robots. Back to the session: Martin did most of the session and demos, with several robots on stage. The basic idea of the RS is abstracting the capabilities of the robots (which are controlled with Web Services), and once you do this, you can change the specific physical robot without modifying your code. MRS also includes a visual programming language (VPL).
I have been an owner of Lego Mindstorms for some years now, and am still amazed at how easy it is to program robots using Lego's visual language. This session is mostly unrelated to my everyday work, but it was very interesting nonetheless. It's a technology to keep an eye on, even if only for personal curiosity.

And thus ended the second day of the event. At night, there was an "Influentials" dinner, with several MVPs, RDs, and conference speakers. Did I mention TechEd was largely about networking? ;-)

MyTechEd 2007 - Day 1

The event officially opened with Soma's keynote, "Building great apps". This was an overview session, as was to be expected, with some announcements: VS2008/.Net 3.5 will RTM before the end of the month (Nov07). Also announced were the MS Synchronization Framework and P&P's "S+S Blueprints", both of which sounded interesting, and a "Popfly explorer". The session included a few very nice demos, including one using Visual Studio (not open to third parties) to develop World of Warcraft add-ins.

After this I went to Pat Helland's "Metropolis: Interchangeability of Applications". This was the first of a series of sessions Pat is doing, and it consisted of a study of how interchangeability evolved in the physical and industrial world, with parts replacement, assemblage locations, etc., and what we can learn in IT from this evolution, and apply it to the services-enabled world. We frequently read that, in the SOA world, if a service is not adequate, you can simply replace it with another one with the same interface. Well, truth is, I've never seen this happening in the real world. And even if the interface/contract is the same, are the semantics the same? So this is the kind of issue Pat discussed. It was a very thought-provoking and interesting session.

The next session was Stephen Forte's "Database Design Patterns", which identified some interesting database patterns. The most interesting were the Slowly Changing Dimension (SCD) and horizontal/vertical partitioning. The SCD basically consists of creating a replica of your business database, but optimized for reporting or a given type of queries. The replicas are typically created using some ETL mechanism like SQL Server Integration Services/DTS. This allows you to take load off your original, normalized business database. I'm guessing this technique is not used as frequently as it should be, especially in high-traffic systems. Forte's style is very dynamic; maybe we'll have him at the next TechDays. I hear he's available on that date and interested in visiting Lisbon, so perhaps someone will invite him over.

And thus ended the first day of the event. At night, after some time at the welcome reception, there was a "Connected Systems Influentials" dinner at Las Ramblas. Did I mention TechEd was largely about networking? ;-)

Monday, November 5, 2007

TechEd 2007 Influencer Community Camps: Development +/vs Architecture Communities

This year Microsoft held a set of pre-conference meetings with "Community Influencers" (MVP's, etc.). These meetings happened in an interesting open format:

The Microsoft Influencers Community Camps will be modeled after Open Space Technology where the attendees define the topics, volunteer or nominate peers to host sessions and then attend a series of sessions that interest them most (quote from here).

Basically the idea is that the participants define the rules and what is to be discussed. At an initial session, people suggest topics, which are placed on an agenda and assigned a room, and whoever is interested shows up. And rules are there to be bent (schedule, topics, etc.). A very interesting format, which I am looking forward to trying out in Portugal. Maybe at the next TechDays 2008? :-)

I proposed the topic "Development +/vs Architecture Communities", which got some people interested. The two issues mentioned most often were "Architects are the guys with the tie" and "Architects don't know the real technology, they are just theory". It's interesting how - at least at GASP - this doesn't happen, as the group is very much focused on real experiences. There were some good ideas about how to keep the "gap" small, most focused on the real issue: community and social issues. The Wiki will be updated with notes from that session, I'm told, so keep an eye on it for a complete summary.

Monday, October 29, 2007

Architect Forum: BizTalk Services - The Internet Service Bus (slides)

The slides of the presentation can be downloaded from here. Note that they are in Portuguese, and ~10 MB in size (PDF).

Saturday, October 27, 2007

TechEd 2007 Barcelona

This year will be my 3rd time at TechEd, once again in Barcelona. Apparently there are almost 80 people coming from Portugal, which is more than last year. My personal focus will be on the "SOA and Business Process" and "Architecture" tracks.

People who haven't yet attended one of these events tend to see them almost as a vacation. The reality is that there's in general very little free time, and you get back NEEDING a vacation. There are also several side events, including the mandatory networking dinners: Monday, the CSD Influencers; Tuesday, the MVPs; Wednesday, the country dinner. The networking is one of the most valuable aspects of these conferences, as you get to meet the people you know only from blogs or the net.

This year no big news is expected at TechEd, although I'd bet there will be some announcements -- either at Somasegar's keynote, or at the GEN01 general session on the last day, which has a vague description and is the only session in that time slot.

To those using Vista, there's a nice TechEd countdown gadget available here.

The only unfortunate thing is that Microsoft decided to stack up conferences: the SOA & Business Process Conference happens next week in Seattle (just check the speaker list), with TechEd right after it. There will probably be sessions at SOA&BP that won't happen at TechEd because they're back to back, which is unfortunate (and UNFAIR!).

See you in Barcelona.

Friday, October 26, 2007

Architect Forum: BizTalk Services - The Internet Service Bus

This last Wednesday I presented a session at the Microsoft Architect Forum 2007, held at the Lisbon Casino (a great venue), with ~100 architects attending. The overall topic of the event was S+S, with a general introduction to the topic given by Beat Schwegler. My session, following Beat's, was about the «S for Services», the S after the plus (S+S), and José António Silva wrapped up with a talk about the «S for Software».

My presentation was divided into two parts. The first was a general introduction to the topics of Services, SOA, SaaS and S+S. The second, and more provocative, part aimed at introducing a different paradigm for looking at the way software is developed: the notion of having it completely hosted in «The Cloud». Specifically, I talked about the concept of the Internet Service Bus, materialized (I wonder if this is the best word, since there's no box to buy) in BizTalk Services, under development by Microsoft and based on WCF technology.

I described its three main current components: Identity & Access Control, Connectivity, and Workflow (still to be made available). The first two are the essential parts of the platform, allowing for universal secure connectivity. It is perhaps not obvious that Microsoft should start with these two, but if you want to put software in the cloud, you do have to make sure that people can both reach it, and reach it in a secure way, so it makes sense.

I had to ask people for their "suspension of disbelief", however, to make a parallel between Facebook and BizTalk Services. Facebook is a social community site, and what I find most amazing about it is that there are over 6500 apps in its directory. Six and a half thousand! This is an amazing figure.

Now imagine you had the same in terms of services and enterprise services. Imagine you wanted a portfolio of services to handle HR, clicked "Add" in some kind of marketplace site (will there be one?), and BAM!, there you have it. You want an accounting app? Just pick one and click "Add". Not working as you want? Click "Remove", etc. This would be S+S nirvana, and maybe we'll have it one day.

I hope it is now clear why I asked people to suspend their disbelief :-)

At the end of the session I did a small demo. This was the scenario: a global company has a set of distributed warehouses, and wants its business users to be able to remotely monitor what merchandise goes out. The tracking of the merchandise was done using RFID tags and BizTalk RFID, which sent events to BizTalk Services, where they were consumed by a client app developed in WPF/.Net 3.0.

Here's how we did it: in BizTalk RFID, I created an app with two event handlers: the first removes duplicate tag reads; the second connects to the ISB using username/password authentication and sends a notification with the tag id read. I used the Phidgets RFID device. On the other end, the client app(s) just connect to the ISB and subscribe to that notification. Every time a tag is read, its photo is displayed in the RFID Dashboard.
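The multicast shape of the demo -- one RFID publisher, any number of subscribed dashboards -- can be sketched like this (my own illustrative model, loosely inspired by the idea behind the SDK's Multicast sample, not its actual code):

```python
# Sketch of the demo's pub/sub shape: a relay standing in for the ISB fans
# each tag-read notification out to every subscribed client app.
class NotificationRelay:
    """Stands in for the ISB: one publisher, many subscribers."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def notify(self, tag_id):
        for handler in self.subscribers:
            handler(tag_id)

relay = NotificationRelay()
dashboard1, dashboard2 = [], []
relay.subscribe(dashboard1.append)   # each client app just subscribes
relay.subscribe(dashboard2.append)
relay.notify("TAG-0042")             # every read reaches every dashboard
```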

I developed the BizTalk RFID part, Raúl did the BizTalk Services bit based on the Multicast sample in the SDK, and André did the WPF app (great work, guys!). The scenario supports several publishers and several consumers.

I did a recording of the demo, so if you want to check it out you can download it from here (2 minutes, 5 mb):

BizTalk RFID + BizTalk Services + WPF demo

Tuesday, October 2, 2007

«Hope, Unfortunately, has never been a very effective strategy»

You have to love that sentence, from an article about IT and its challenges.

The article is titled "The Trouble With Enterprise Software", by Cynthia Rettig, and - much in the same vein as "Does IT matter?" - it talks about the "failure" of enterprise software (ERP-like software, but not limited to it), the complexity of software, the problems with data quality, the misalignment of business executives with IT, and the promise of SOA:

The timeline itself for this kind of transformation may just be too long to be realistically sustainable and successful. The dynamic business environments of today, where whole industries and markets can undergo radical changes in a matter of a few years and the horizon for corporate strategies has shrunk from 10 years to three to five, makes it questionable whether companies can actually maintain a focused strategy long enough to align their core business processes with IT.

The article includes very interesting data, but unfortunately proposes no solutions. Software, in its "infinite" flexibility, is a complex beast to tame, and not all (most?) of the promises it made have materialized.

The average professional coder makes 100 to 150 errors for every 1,000 lines of code

The view that ERP software, by its size and unicity, tries to avoid the difficulties of integrating distinct [fractal-like] modules and applications is an interesting one. But apparently, if...

75% of ERP implementations were considered failures

... that may not be the way. But Gregor Hohpe reminds us, in Enterprise Integration Patterns, that developing loosely coupled asynchronous systems (today's fad) implies more complex development and debugging, which basically means we (IT) really have a tough problem.

[...] the way most large organizations actually process information belies that glorious vision and reveals a looking-glass world, where everything is in fact the opposite of what one might expect. Back office systems — including both software applications and the data they process — are a variegated patchwork of systems, [...] installed over decades and interconnected by idiosyncratic, Byzantine and poorly documented customized processes. To manage this growing complexity, IT departments have grown substantially: As a percentage of total investment, IT rose from 2.6% to 3.5% between 1970 and 1980. By 1990 IT consumed 9%, and by 1999 a whopping 22% of total investment went to IT. Growth in IT spending has fallen off, but it is nonetheless surprising to hear that today’s IT departments spend 70% to 80% of their budgets just trying to keep existing systems running.

The Red Queen hypothesis, usually applied to Human evolution, states that

For an evolutionary system, continuing development is needed just in order to maintain its fitness relative to the systems it is co-evolving with. (from wikipedia)

I wouldn't be surprised if these same words also applied to IT and business environments.

The Red Queen in Lewis Carroll's "Through the Looking-Glass" says:

Now, here, you see, it takes all the running you can do, to keep in the same place.

There's some comfort in the following sentence, however:

If you want to get somewhere else, you must run at least twice as fast as that!

Monday, October 1, 2007

Windows Live now Supports CardSpace

According to a post in Kim Cameron's blog,

You can now use Information Cards at Hotmail and all the other MSN/Windows Live sites. 

Just go here to associate an Information Card with your existing account.

It's the start... of the end of the road for password multiplication. Very welcome news, this one.

Friday, September 28, 2007

BizTalk 2006 R2 RTM

Although with very little publicity (!?), BizTalk 2006 R2 RTM'd 2 weeks ago. This new (evolutionary) version includes a very interesting set of new features, including an RFID server, the ability to expose and invoke services using WCF, EDI support, and lots of other new stuff. Here's my first post about this a few months back, and BizTalk HotRod issue 1 has an article detailing the new features.

The evaluation version can be downloaded from here. The Developer edition ISO was posted to MSDN Subscriber Downloads yesterday.

I'm looking forward to trying out this version in conjunction with BizTalk Services...

Scrum types a|B|C

Back from a hot summer, I wanted to post about an article published two years ago on Jeff Sutherland's Scrum blog: «Future of Scrum: Support for Parallel Pipelining of Sprints in Complex Projects». I am by no means a Scrum expert, but I've been on two projects using this methodology, and I've been introducing its adoption at |create|it|, my almost-6-year-old company (link in PT-PT). The article apparently created some discussion when it was published, and I'm finding it extremely interesting. It describes three ways to do Scrum, varying the intervals between Sprints and anticipating some of the work done in those intervals. What I found most interesting, however, were the findings related to functional and technical specifications:

[...] This suggests that minimal functional specifications should be clear at the beginning of a Sprint and that design and technical specifications are best done within a Sprint. [...]

An interesting notion, and one that I'd already been resorting to lately - writing detailed functional specifications, as well as overall architecture documents, and then just proceeding to development. I guess this is what Fowler calls Evolving Architecture (link in PT-PT).

The following paragraph has more interesting information:

[...] MacCormack’s multivariate analysis showed three primary factors that lowered defect rate (early prototype, design reviews, and integration or regression testing at code checkin) and two primary factors that increased productivity (early prototype and daily builds).
Releasing a prototype to customers that is only 40% functionally complete increases productivity by 36% and adopting the practice of daily builds increases productivity by 93%. [...]
Incremental and early delivery of working software is at the core of the effectiveness of Agile processes.[...]

I'd been discussing just this issue today with my colleagues on the company's management team: how do we increase productivity and lower defect rates? (Yes, we're looking for the silver bullet; I am well aware of this.) This article gives us some very interesting feedback, however. We've been developing packaged components for SharePoint 2007 lately (here is one of them - link in PT-PT), and we're considering structuring a Scrum-based process to handle these components. Surprisingly, I found very few existing books or documents on "modern" product development methodologies, covering everything from envisioning to production and support. Any recommendations would be welcome.

I still haven't finished reading the article, so I'll post more when it's done.

Sunday, July 29, 2007

BizTalk RFID: notes

After finally getting around to setting up VMware Workstation with BizTalk 2006 R2 Beta2 + BizTalk RFID, I ran several experiments with the Phidgets RFID reader and the Phidgets DSPI.

Two notes about the setup of the DSPI: first, do it manually, following the instructions. The setup script has an error in the names of the dll files (it uses the name "phidgets" instead of the dll's correct name, "*phidget*"). Second, if you get an error when starting the provider, check this page in the Msdn forums. The setup guide for the Phidgets RFID quickly walks you through a complete configuration of the provider, device, and business process, so I highly recommend you follow it.

One important limitation of the Phidgets DSPI is that it doesn't support the elimination of repeated reads. Just to give you an example of why this is a limitation: if you hover the RFID tag over the reader for about half a second, you get 10-20 tag reads. I have my doubts as to whether this elimination should happen at the provider level (DSPI)... it seems to me it should be done in the generic RFID infrastructure. Apparently other DSPIs have the same limitation, but there's a way around it: either use specific features of the device (not applicable in the case of the Phidgets reader), or an event handler associated with the business process you define (check the DupElim sample in the BizTalk 2006 R2 Beta2 samples).
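The duplicate-elimination an event handler would do can be sketched simply: suppress repeats of the same tag id seen within a short time window, so a hovering tag produces one logical read instead of 10-20. This is my own illustrative sketch, not the DupElim sample's code:

```python
# Time-window duplicate elimination for raw tag reads: a read is kept only
# if the same tag id was not seen within the last `window` seconds.
def eliminate_duplicates(reads, window=1.0):
    """reads: list of (timestamp, tag_id) pairs; returns de-duplicated list."""
    last_seen = {}
    unique = []
    for timestamp, tag_id in reads:
        if tag_id not in last_seen or timestamp - last_seen[tag_id] > window:
            unique.append((timestamp, tag_id))
        last_seen[tag_id] = timestamp   # refresh even on suppressed reads
    return unique

# Hovering a tag briefly yields a burst of raw reads; only the first survives:
raw = [(0.00, "TAG1"), (0.05, "TAG1"), (0.10, "TAG1"), (2.50, "TAG1")]
eliminate_duplicates(raw)   # keeps the 0.00 and 2.50 reads only
```

Note that refreshing `last_seen` on suppressed reads means a tag held continuously over the reader produces a single read, rather than one per window.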

Something I found unintuitive when setting up BizTalk RFID is that this package (much like BizTalk Services) is not part of BizTalk Server itself. Or rather: you won't get "tag read" messages directly into the Message Box, as you might assume. Instead, you associate some kind of "sink" with an application/business process you configure in the RFID management console (typically the out-of-the-box SqlSink, which just drops the reads in a SQL Server database), and then use BizTalk Server to consume events from that sink. An alternative way of getting tag reads is to use an orchestration to consume the RFID web service (check the ConsumeRFIDWS sample). This does make BizTalk RFID a generic "RFID server", but the BizTalk in the name is misleading. One further note: you do get, out of the box, the capability to use the Business Rule Engine to validate your tag reads (check the BRESample).

Microsoft's forum for BizTalk RFID is here, and I highly recommend it if you run into problems; I found a solution there for every difficulty I had.


On to a different subject: Microsoft has recently updated the roadmap for BizTalk Server, the interesting part being the "Beyond BizTalk Server R2" section. It seems that the "Solution Designer" that left us wide-eyed when it was shown in a Channel9 video two years ago is really not going to happen.

Tuesday, July 17, 2007


LoadGen is a Microsoft tool that can be used to generate test inputs for BizTalk solutions. The two features I really like about it are Message Creators, which allow you to generate different messages in each run (for example, different request ids, guids, etc. in every generated file/message), and the Load Generators/Transports, which allow you to generate files, HTTP or SOAP requests, MSMQ messages, etc. A third architectural component is Throttlers, which allow you to regulate the rate at which documents are generated.

Load generation is just the first part of the problem when testing a BizTalk solution, however. You must have a way to measure how your solution is holding up, and for this a typical approach is using Performance Counters.

The download includes several samples, but most of them omit the configuration of Message Creators, which took me some time to get working. One thing to remember is that you can/should include the configuration of the Message Creator inside a Section block, at its end.

Here are some other quick tips:

  • LoadGen writes (verbosely) to the Event Log. Check it to find out if, for example, something is wrong with your configuration.
  • In the documentation for Message Creators ("Dynamic Message Creation"), the MessageCreator/Field/InitialValue Xml element contains a simple replacement string: if the source file contains IDField_0, it will be replaced by the value generated by the configured Message Creator. The documentation describes this as a "field name", which is not entirely clear.
  • The MessageCreator/TemplateFilePath element should point to the configuration for the Message Creator, and not to the template for document generation. It's a bit misnamed.
  • If you have a Section for generating files, with a Message Creator within it, you have two references to the template file for your documents: the MessageCreator/@SourceFilePath element in the Message Creator configuration file, and Section/SrcFilePath in the Section. This can get a bit confusing.
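To make the replacement-string behaviour concrete, here is a minimal sketch (Python, with hypothetical names; this is not LoadGen's code) of what a Message Creator conceptually does: every placeholder found in the template document is swapped for a freshly generated value in each run.

```python
import uuid

def create_message(template: str, replacements: dict) -> str:
    """Replace each placeholder string in the template with a freshly
    generated value, mimicking the behaviour described in LoadGen's
    "Dynamic Message Creation" documentation."""
    out = template
    for placeholder, generator in replacements.items():
        out = out.replace(placeholder, generator())
    return out

# Hypothetical order template; IDField_0 is the replacement string.
template = "<Order><Id>IDField_0</Id><Qty>10</Qty></Order>"
msg = create_message(template, {"IDField_0": lambda: str(uuid.uuid4())})
```

Each call produces a document with a different id, which is exactly what you want when generating thousands of distinct test messages.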

In my tests, I wanted to generate values with specific formats, so I decided to write an additional Message Creator. What I found was that it was much quicker to disassemble the out-of-the-box CustomMC assembly and extend it than to write one from scratch. Not exactly recommended, and probably not supported/allowed, but... quick. I used Reflector.Net and the File Disassembler add-in.

LoadGen is a nice tool, but it's only the beginning of your work. Now, what I would really like to see is something like this being used in conjunction with BizUnit :-).

One last suggestion: Scott Colestock did a session on this testing topic at last year's SOA conference which I recommend: «Applying Maximum Sustainable Throughput to a Management/Operations Strategy». Slides are here.

Monday, July 16, 2007

Take a Look at These...

I'm not about to start a series of "daily links" posts, but here are some recent links I find relevant:

Happy reading!

Wednesday, July 11, 2007

Gadgets in Javascript? I changed my mind

When I first heard that Vista Sidebar Gadgets were to be developed using Javascript, I thought Microsoft was making a mistake, and that Javascript+Html+Css was the wrong platform for these developments.

Since then, I've tried several gadgets and tweaked two of them: first the Show Me Life Flickr gadget, and yesterday the Scraper gadget. What I found is that the technologies involved are really convenient, and development turned out to be really quick. The Scraper gadget sample doesn't work anymore (I think the site it scrapes info from is down), but I quickly changed it to get financial quotes from my bank's pages, added auto-update timers, gave it a better layout, and made it configurable. All in a couple of hours.

Being a simple development, I did the code editing in Notepad2, but for more advanced gadgets a more sophisticated development+test environment is probably a good idea. (A few months back I tweaked Microsoft's Outlook Upcoming Appointments gadget to get my own custom to-do list; Outlook's object model is significantly more complex, which made that much more complicated.)

You'll have to excuse me for not sharing the code, but I don't want the bank to block the scraping. :-)

Monday, June 25, 2007

Transferring files with BizTalk Services

BizTalk Services has been out for a few weeks, but only recently did I have the time to try out the samples included in the June SDK. BizTalk Services is one of Microsoft's moves in the SaaS (or rather, S+S) game, and is self-described as an Internet Service Bus (ISB). It is not related to the BizTalk Server line of products; it consists rather of an additional functionality stack on top of the WCF framework, hosted in the cloud.

Channel 9 published a 30-minute video about BizTalk Services which I recommend. The video includes demos of some of the samples included in the SDK, like the Echo and Multicast/Pub-Sub ones.

I have tried several of the samples, starting with the Echo. This short sample includes a simple setup of a WCF service and client, but with the communication between the two done via the ISB, using a relay binding which allows two-way communication between service and client. The interesting part is that you can go through firewalls (it just works, much like Office Groove). I played with the different instancing modes and several simultaneous clients (all of them work as expected), and checked how long it takes for the client to get the reply back (a little less than a second) and how big the echo message could be (8k). A somewhat annoying aspect of this sample is that CardSpace's identity selector keeps popping up, both when starting the client and the server.

Anyway, it's a great starting point, and I quickly wrote a simple file transfer service, something I could use (with the appropriate security, obviously) to get files from my company's file share when I have no VPN connectivity.

The service implements a GetFile contract, receiving as parameters the filename, the size of the page/block/chunk, and the page number, and returning a byte array. The service's instancing mode is PerCall.

The client app asks for the path to a file located on the server, and gets it page by page, saving it to the C:\Incoming folder.

Download the code below.

There are two other samples posted on the net which I recommend: Clemens Vasters' "Tweetiebot" and Christian Weyer's post "safe and secure WCF duplex callbacks through NATs and firewalls".

Wednesday, June 13, 2007

More on Mind Mapping

A site called Innovation Tools (which I recommend) published, in September 2006, the results of a survey on mind mapping software (in PDF). MindManager is the clear leader, with ~70% usage, followed by the open source FreeMind at ~10% (which I tried before preferring MindManager).

This survey is especially interesting when you see the diversity of topics people use these kinds of apps for: making to-do lists, preparing presentations, taking notes, solving problems, planning projects, making decisions, etc. According to the survey, the single most important benefit of using mind mappers is "Clarity of Thinking". This is very true, and you can get results very quickly. A simple map takes me ~15 minutes to create, and usually the result can be used directly in your work (for example, as the titles of a PowerPoint presentation, the chapters of a Word document, or even a database structure or class diagram).

Note on lost posts

I've just noticed that my cross-posting setup got misconfigured, and several of my latest posts were not included here. To check them out, head over to Arquitectura de Software, where they were successfully published.

Thursday, March 22, 2007

TechDays 2007: WF Extensibility With Custom Activities

The second session I delivered at the event was livelier than the first, and also much simpler: a level 300 session, but focused on developers just getting started with .Net 3.0's WF.

There are two features of WF, which I was unfortunately unable to demonstrate at the session, that I really love. The first is dynamic instance update - the ability to modify a running workflow at runtime, adding new activities and modifying its behavior. I can think of several uses for this, some on a professional level, some for fun. :-) The second is using custom activities together with WF to define Domain-Specific Languages. Simpler than the DSL Toolkit (and with distinct applicability), and given that you can run your declarative XAML workflows without recompilation, it's something I'm really looking into.

Wednesday, March 21, 2007

TechDays 2007: BizTalk R2 Session

I delivered my first session at TechDays today, an overview of the new features of BizTalk Server R2. Topics I covered included:

  • BizTalk RFID (I can't wait for the Phidgets provider to come out, so I can test it with a simple setup -- apparently this will happen soon after beta2 comes out);
  • WCF adapter, with the ability to expose BizTalk artifacts in IIS and to consume services using the WCF stack. This will make us BizTalkers learn at least the basics of WCF :-)
  • WCF Lob Adapter SDK, which allows you to develop adapters to Line Of Business applications (which can range from a SAP-like system to a simple database). One of my favourite features.
  • BAM Interceptors for WCF and WF. Another nice feature, this allows you to create BAM Activities that span not only orchestrations, but also things happening in the transport and in workflows. The most interesting part is that you don't need to change your existing WCF/WF developments; it's all done in configuration. You inject a Tracking Service, in the case of WF, or a Behavior, in the case of WCF. The downside? The events are handled synchronously.
  • EDI/AS2 adapter: not the most exciting of R2's new features, but the truth is that EDI usage is still growing, so this replacement for the simple "Base EDI Adapter" is welcome, especially given the pricing of the Covast adapter.

Although not a new feature in R2, but rather a set of developments and guidance on top of it, I also very briefly mentioned Microsoft's ESB Guidance, due to come out at about the same time as R2, and I ended the session with Microsoft's take on the whole "BizTalk vs .Net Framework" issue.

It was a tough session to deliver, because I had a lot of material and little time for demos, which always give some life to presentations like these.

Right at the end, I gave the dates people were waiting for: the first public beta of R2 (beta2) will come out around the end of March/start of April, and the final version in the second half of the year (from what I've seen, and since the SOA conference is in October, I'd probably bet on Q3).

Changing subjects: fellow MVP Charles Young has a good overview of the sessions we attended at the Summit. Check them out: day 1, 2, 3 and 4.

Sunday, March 18, 2007

Mvp Summit: it's over

Turns out I couldn't post that much. Anyway, I really enjoyed the deep-dive sessions included in the "Connected Systems" track, at Microsoft building 43. Clemens Vasters did the introduction to an agenda spanning topics such as BizTalk, WCF, WF, Identity (CardSpace+MIIS/ILM), and even POX/REST/Ajax. Session durations ranged from 30 minutes to an hour, and some had to be cut short or the overall agenda would have been delayed.

One of the liveliest sessions was by Paul "Workflow" Andrew, on the topic of BizTalk vs .Net (specifically, WF+WCF). Paul pointed out that all of these are being developed by Connected Systems people (the only part of the .Net framework not being developed by the CSD is WPF), and that while .Net is a framework, BizTalk is a premium server. Some structural parts of BizTalk are being replaced, that much is true, but that also means resources can be invested in other areas. Very little is known (by us :-)) about where exactly the evolution will happen, but the information was there for those listening attentively. :-)

As I said in previous posts, the Summit is in large part a networking event, and I did get to meet several BizTalk MVPs, as well as MVPs from various other competencies. My program included a Q&A and a dinner with people from the CSD/BizTalk teams. Both proved very interesting. One note on this: the MVPs turned out to be very demanding of Microsoft, which was interesting to see. Most of us are MVPs because we have a passion for technology, but that doesn't make us less demanding «customers».

As to information I can share: BizTalk R2 will be released in the second half of 2007, and the first public beta (beta2?) will probably come out near or a little after the end of this month.

Finally, Seattle: the city is in a great place, geographically; the only problem is really the weather, which makes you wonder if it affects the software Microsoft produces :-). My favourite spots were the Pike Street Market, the amazing-amazing-amazing Elliot Bay Book Company (photos: 1, 2), and obviously the Space Needle. No time to see more or drive around. Maybe next year, if I am nominated again (the next summit will be held 14-18 April), or at the SOA & Business Processes Conference later this year.

Monday, March 12, 2007

MVP Summit: Sleepy and Wet In Seattle

Seattle is known for its rain. And it's justified.

Two Portuguese MVPs at the summit this year are known for being sleepy and wet from the rain. :-) Went to Palermo's party tonight, where we met some of the 40 Brazilian MVPs attending. An estimated 1900 MVPs and RDs from almost 90 countries are coming to the biggest summit ever. There are few BizTalk MVPs, but I already managed to bump into four. Geekland. :)

Saturday, March 10, 2007

MVP Summit: Getting Ready

My second trip to the USA's West Coast will be to Seattle/Redmond, where the MVP Summit is being held. My agenda, the Connected Systems track ("BizTalk, WCF and WF experts will all be “Connected Systems experts” in the future"), includes a day and a half of in-depth technical sessions held in Redmond: 14 sessions on BizTalk, WCF, WF and Identity Management.

Short sessions, several of them level 400, and most of all the opportunity to learn and to meet other BizTalk MVPs from around the world, as well as the Product Group. The general part of the event will include talks by BillG, as well as Don Box and Chris Anderson.

I'll let you know what the "mothership" looks like. :-)

Friday, March 9, 2007

Excel Services v1

I have been doing some tests/prototypes using Excel Services (included in MOSS 2007). Excel Services is a new product, a server-side implementation of Excel. Its big selling point is that it allows business users to keep their Excel sheets, where logic has accumulated over the years, but now expose them on the server, where they can be shared enterprise-wide.

The product includes two main components: Excel Web Access (EWA) and Excel Web Services (EWS). EWA is in essence a web part that displays a mostly read-only view of an Excel spreadsheet. You can input single-valued parameters and see the calculations being updated. EWS is a web services layer in front of an Excel spreadsheet. You can input values, in individual cells or ranges, and get back either results or a binary snapshot of the spreadsheet.

I was really surprised at how much I can do with the product, and how easily: the ability to save a snapshot of a spreadsheet I half-filled through web service calls, the ability to use Ajax to invoke the web services from the browser (with some sample code available on the net), the power of User Defined Functions, developed in C# and able to access databases or whatever, etc. It IS a very interesting product, and I strongly recommend it.

This said, I think this first version, included in MOSS 2007, still has room to evolve in a couple of areas, improving its fit for several more design/architecture problems.

First, EWA has to be improved to support full read-write, instead of simple single-cell input: ranges, lists of values, etc. Until this happens, its usefulness lies mostly in data display and very simple usage scenarios.

The second is more strategic: the Excel "client" is not really a client app for Excel Services. They are different applications. It would be nice if one could have Excel (client) open a spreadsheet and do the calculations and data access on the server only.

Let me give an example, related to the product's main selling point: imagine you have an Excel spreadsheet initially developed in the 80's. It has hundreds of formulas, some 20 sheets, reference data, dozens of input fields/ranges, charts, etc. A living nightmare for IT, but not necessarily for the business users. Now I can store this spreadsheet on the server, true, but what do I use as a client? I know I can interact with it using EWS, but developing a specific Smart Client to access it, replicating some of the rules, is clearly an expensive option. This should be done directly in the Excel client, because that is the application the business users love and know how to use, and because that's what minimizes development effort. I can use VSTA and develop in the Excel client to do this, and this is probably the best option at the moment.

Developing this idea, two more thoughts: it would be nice to have an "Excel Click-Once" functionality, where a spreadsheet open in Excel (client) could be updated with the latest information from the server spreadsheet. Also, one of the problems with current Excel spreadsheets, built up over the years, is that they get lost on people's hard drives. Some kind of mechanism could be put in place to avoid or control this, so that people are always using the most up-to-date version of the logic.

Two further aspects, non-technical: when a business user is presented with a solution based on Excel Services, the reaction can easily be one of surprise: "is this it?!". They are used to IT giving them apps, web sites, etc., not saying: "Just use Excel". People still have the impression that, for something to be good, you have to pay for it.
Which brings me to the second aspect: licensing (which is where you do pay). To use Excel Services, one has to fully license MOSS 2007 Enterprise. That is, to use Excel Services, you have to pay for SharePoint Server, Forms Services, the BDC, etc.: all the components of SharePoint Server. Interestingly, this does not happen with Forms Services: you can simply license WSS 3.0 + Forms Services.

Anyway, these were just some ideas. Excel Services is an excellent start for a new product, and will surely see very interesting developments in the coming years.

Most of the information I used while developing my prototypes was based on the SharePoint 2007 SDK, but I also recommend LuisBE's blog, and Cum Grano Salis (where you can find the Ajax library I mentioned). Also, the ExcelPackage CodePlex project seems very promising (don't get me started on the power of the new formats :-)).

Monday, February 26, 2007

ArchCamp 2007 Field Report (part 2)

As I said previously, each team presented their solution to the Region President and his Assistant, who then asked questions about each approach. All the approaches were very high-level, and the view of architecture was perhaps that of an Enterprise Architect, very close to the business needs. Hugo Ribeiro is posting some pictures, and the event's blog will contain both the problem statement and the proposed solutions.

The Design Lab ended with an open space for questions from the audience to each of the groups, and the event was wrapped up with a final debrief and evaluation. Some interesting suggestions were made for future events, like doing a continuation of the Design Lab to further detail the technical approaches of each team. Also, analysing how each team self-organized to work on the project is something worth sharing: if you have a team of 6 architects, several of them used to being team leaders and setting the direction, it's certainly interesting to share how they worked together.

A final word of thanks to the event's sponsors, starting with Microsoft which again supported the event, but also Agilior, Create It, Primavera BSS, and PT.COM/Sapo.

Sunday, February 25, 2007

ARCHCAMP 2007 Field Report

Great experience! The event started Saturday morning with a keynote by the meeting's main organizer, Hugo Batista, setting out the rules and agenda. After this, I did two presentations, on the past and future of GASP - the Portuguese Architecture Group. Open presentations, with lots of space for discussion and exchange of ideas; it was extremely fruitful and a lot of valuable feedback was given. Three things will be defined in the next few months: the group's Mission, Growth Model, and Objectives. The presentation also included IASA's 2.0 chapter model, and the big news: the creation of a chapter in the Center Region of Portugal, based around Coimbra/Leiria/Aveiro/Viseu. There will be news on this soon; the leadership team has been identified and participated in the ArchCamp.

After this, Nuno Costa presented ideas and criteria for a proposed system to measure and value the participation of each member in the community. More exchange of ideas ensued, and alternative or complementary paths of evolution were thought out. Some questions clearly remain open, but several opinions were expressed that now give us a more precise idea of where to go in terms of the group's future.

At the end of the day, 4 teams of 6 were formed, and Mr Region President Tiago Pascoal, aided by his assistant Mr Hugo Ribeiro, presented a challenge to the groups, around the notion of a fictional regionalization happening in the country, and the need to provide local infrastructure and, essentially, support the region's economic growth. Perhaps the most interesting aspect of this problem was the context: this is a problem being issued by the region's President, who is being role-played as a purely political figure. In this view, if the President is the one doing the selection, the way the solution is presented must take this into consideration, and technological detail must be limited.

I teamed up with Nuno Costa, Miguel Madeira, Denis H., Pedro Lopes and Ricardo Teixeira (of Coimbra), in the Penguin Team. :-)

The complete problem statement will be posted soon, and also all the solutions.

The teams worked for the rest of Saturday and Sunday morning, and will each present their solution/proposal to the President and his aide. There is space for questions about each approach, and an open discussion at the end. Regardless of the proposals and their merits, this has been a valuable process, with lots of ideas and different points of view. As for Penguin Team's work, we decided to completely avoid technical terms (including English terms and acronyms) and design a 20,000-feet high-level view, complemented by a series of structural projects surrounding ours, to create a web of actions.

More details later.

Saturday, February 10, 2007

BizTalk 2006: Tool to add application and references from the command line

BizTalk Server 2006 allows us to use the concept of Application to logically organize the artifacts in our solutions, such as orchestrations, schemas, send and receive ports, etc. Further, you can add references from one application to other applications, which allows you - for example - to bind an orchestration logical receive port to a receive port defined in some other (referenced) application, or create a Send Port filter using a property defined in a Property Schema included in other application.
You can use the Administration Console to add these references manually, by simply selecting an Application's properties and the "References" tab.

I am a huge fan of scripts and automation, so I quickly developed a command-line tool to help me a) create applications and b) add references from one application to another. I use it to further automate my deployments, which sometimes are not easily done with MSIs.

Sample usage is:

BizAppManage -AddApp nameOfAppToCreate nameOfAppToReference1 nameOfAppToReference2 nameOfAppToReference3
BizAppManage -AddRef nameOfExistingApp nameOfAppToReference1 nameOfAppToReference2
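The command-line shape above is simple to parse. As a sketch (Python here, purely for illustration; the real tool is .NET), the verb determines the operation, the first name is the target application, and everything after it is the list of applications to reference:

```python
USAGE = "usage: BizAppManage -AddApp|-AddRef <appName> [refApp ...]"

def parse_args(argv):
    """Split the command line into (verb, target app, referenced apps).

    Hypothetical sketch of the tool's argument handling: -AddApp creates
    the target application first, -AddRef requires it to already exist.
    """
    if len(argv) < 2 or argv[0] not in ("-AddApp", "-AddRef"):
        raise SystemExit(USAGE)
    return argv[0], argv[1], list(argv[2:])
```

The actual work of creating applications and adding references is then done against the BizTalk management APIs.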

I've used Carlos Medina's BtsHelper class to avoid hard-wiring the BizTalk connection string.

Wednesday, January 17, 2007

Identity Management and CardSpace

Identity Management is not one of my priorities, but it's a subject I've been interested in for some time, and one very related to the work I am doing at the moment. It all started with Kim Cameron's Identity Blog and his Laws of Identity.

The most visible face of this whole Identity Management issue is the multiple logins people have to perform while browsing the internet, creating accounts at several sites to access their services or content. I've had to resort to password-management software, but the problem is deeper than memorizing your multiple logins and passwords, especially when financial transactions are involved.

Probably the best description of the problem, or at least an introduction (and also a demonstration of what great presentation skills are), is Dick Hardt's 2005 Identity 2.0 introduction to the concept of Digital Identity.

Yesterday I listened to Hanselman's Identity podcast, and came home to read more and try Windows CardSpace (.Net's 4th pillar). CardSpace is included in .Net 3.0, but if you are using Windows Vista, it's built in (just type "card" in the start menu and "Windows CardSpace" shows up :-)). I started it, created a simple card with some of my information, and went looking for a place to use it. I found one at .Net 3.0's site: the SandBox, a Community Server installation with CardSpace support for user registration and login. When I registered, I got into Vista's Secure Desktop mode, with CardSpace open, selected the card I wanted to present to the SandBox (I was shown which fields the SandBox would get from the card), and BAM, I was registered and logged in. All I had to do was pick a nickname. Later I got an email with a username and password, just in case I want to log in using "traditional" methods.

CardSpace is based on some of the WS-* standards, such as WS-Security and WS-Trust, which supposedly make it both "safe and standard", but what I like most is really the end-user experience. For me, the idea of no longer having to create logins everywhere, and being able to select the specific pieces of information I want to share with each site I visit, is a very interesting prospect. The question is, obviously, whether this will be accepted outside Microsoft, or whether this will be another Passport/Hailstorm situation. A major difference, the way I see it, is that the information is stored on your computer, not at Microsoft somewhere, so the trust obstacles are alleviated.

As to this being available on public sites, I have no idea. I found a comment on a blog saying that Community Server 2.1 should soon include full CardSpace support for anyone to install, but I found no details on this having happened yet, and no major implementation of it either (time to throw out Passport).

One final note, out of curiosity: when the screen greys out in Vista, you are in what Microsoft calls "Secure Desktop" mode. This is the Windows mode used, for example, when you log into your Windows computer (running XP, Vista, 2003, ...). It is designed to block other processes from executing, to make sure you are entering your password in a secure environment where no keyloggers or the like can work. In Vista, you get a greyed-out/transparent background when you are in this mode (which is just a UI thing; the grey is really a screenshot with transparency :-) Human Factors stuff). More information about this here and here.

Just before I go: there's already Firefox support for CardSpace, and Kim Cameron has an implementation of the identity system in PHP. Also note that CardSpace can be used for much more than simple site login; I just wanted to blog about it because the first impression it left was really positive.

Monday, January 15, 2007

BizTalk: Wire Tap

Debugging in BizTalk (and other async/messaging-based solutions) can be complex, and very often the UIs (the Admin Console and HAT) don't give you enough tracking information, either because you've just re-deployed and lost the tracking settings, or because the SQL Agent is turned off. This tends to happen frequently during development, when you want to look at the body of the messages.

A very simple and very useful technique in this situation, and one I often find underused, is to create a "Wire Tap". A Wire Tap is an integration pattern that allows the inspection of messages travelling across a channel. In BizTalk, this translates to simply creating a (Static, One-Way) Send Port that looks for specific messages (using its Filters) and sends them to some destination, typically a file folder (SMTP email is another frequent choice). This port is not bound to any orchestration; it's pure content-based routing.

One thing to remember: a Send Port that has no filters catches nothing, so always remember to set up a filter. The ones I use most are based on the message type (BTS.MessageType) and the receive port the message came in through (BTS.ReceivePortName).
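Conceptually, the filter mechanics amount to matching subscription properties against each message's context. A sketch of the idea (Python, with hypothetical property values; BizTalk evaluates these subscriptions in the MessageBox, not in your code):

```python
def make_wire_tap(filters):
    """Build a predicate over a message's context properties.

    All filter lines must match (like AND'd conditions within a filter
    group); an empty filter set matches nothing, mirroring the behaviour
    of a Send Port with no filters.
    """
    def matches(context):
        if not filters:
            return False  # no filters -> no subscription -> catches nothing
        return all(context.get(prop) == value for prop, value in filters.items())
    return matches

# Hypothetical message type and receive port name, for illustration only.
tap = make_wire_tap({
    "BTS.MessageType": "http://schemas.example.org/orders#Order",
    "BTS.ReceivePortName": "rpOrdersIn",
})
```

Any message whose context satisfies the predicate gets a copy dropped to the tap's destination, leaving the main flow untouched.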

I actually find that understanding and resorting to this mechanism is frequently a good indicator of the maturity of a BizTalk developer and of their grasp of BizTalk Server's pub/sub model.

BizTalk 2006: Custom Functoids and their Icons

When you develop custom functoids in BizTalk Server 2006, one of the steps is creating icons to represent them in the mapper.

These icons must be created as 16x16 BMPs, and inserted/embedded in a resources file. You can do this directly in Visual Studio, by opening the Resx file and adding an image file, or (my preferred way) by using Lutz' Resourcer for .NET.

You can start by creating an icon in Visual Studio's editor; I then screenshot it into Paint.Net and save it as a 16x16 BMP to later insert into the resources file.

Another "detail" that shouldn't be forgotten is updating the "Custom Tool Namespace", on the .resx file. Also remember that you have to restart Visual Studio if you want to have the icons/dlls update, as it caches the functoid assembly from its path at <drive>:\Program Files\Microsoft BizTalk Server 2006\Developer Tools\Mapper Extensions.

Using a background color for your functoid icons is good practice, to avoid confusing them with the built-in ones.

Catch Up in 2007

In the last two years I have felt an ever-growing difficulty in keeping up with the rhythm of technological evolution in my area. When there was only BizTalk 2000/2002 and .NET 1.0, the world was simple. Now there is .NET 3.0, with WCF and WF, the DSL Tools and several Software Factories, BizTalk itself with its R2 evolution, the much-discussed SharePoint/Office 2007, ever more frequent contributions in software architecture, more and more podcasts and videos, magazines, newsletters, hundreds of emails... and still only 24 hours in every day (they'll have to solve that one of these days).

It's not possible to stay up to date on everything, and it's not even worth trying.

One of my resolutions for 2007 concerns this information management issue. The first step was to identify the topics and technologies (I picked 3+1) I really want to follow in depth; anything outside these bounds I will not focus on. I could have opted for "knowing a little about everything", but somehow that doesn't feel right :-). I've already started unsubscribing from blogs (out of my list of 250 feeds) and newsletters, and deleting podcasts. It's sad to see them go, but there is no other option.

I also did a quick re-read of "Getting Things Done", and printed out the poster freely available at David Allen's store (you have to register, but it's a free purchase). Having it in front of you during your workday really helps. One of the hints I value most: if you get asked for something that takes less than 2 minutes to do, don't procrastinate, do it immediately. This applies to all those "Can you please send me document X?" emails. Sites like 43 Folders occasionally give you helpful hints to better manage your time. I personally tend to multitask a lot and get distracted by multiple things of interest, so some of the resources there might help. Some time ago someone posted a method to keep you on track with whatever you are doing: write it down on a Post-it note and stick it on your computer, next to the screen. People may look at you sideways, but it seems to work. :-)

The day I'll really be happy, however, is when I get my Outlook Inbox from 500 emails down to 0.

DSL Tools v1 (Windows SDK, Sept/2006) and Modelling

Last week I went back to v1 of the DSL Tools and re-did the tutorials. Much simpler than the beta ones; the entire platform is much more stable, and parts of it have been simplified and are easier to use. The unification of the DSL definition and its presentation definitions in a single file is very welcome. The lack of intellisense/syntax coloring support while writing the templates is one of my complaints. The inability to use the DSLs outside Visual Studio (in scenarios where you want the business user to at least prototype the model) is another serious limitation. Finally, there are also some concepts that are not easy to understand on a first approach, and there seems to be too much "visible plumbing" in the design of languages and their later use.

I personally find the whole Software Factories/Modelling idea very powerful, with some ingredients that can help solve the quality issue in the software industry. But while activity around Software Factories has been plentiful (see also issue 9 of the Architecture Journal), not much new has been coming up on the web concerning DSLs and modeling. Microsoft's DSL blogs also show little activity, and there's no news on the expected book.

Most of the work Create It and I do involves integration and custom software development, frequently around products such as BizTalk or SharePoint; we don't market our own products or specialize in any vertical industry. Given this, I've tried to find uses for DSLs in what we do, and mostly I came up with what I call "technical" scenarios: for example, modeling a BizTalk schema so that it's easier to create, or modeling a structure of SharePoint sites to automatically provision/update them. The Software Factories that have come out also work in this "technical" space, which helps explain why they are easier to come up with and develop.
I found very few uses for DSLs in real "business" scenarios, considering the kind of work we do. The most obvious case is perhaps building workflows, which is Windows Workflow Foundation's space, and WF in itself is a "technical" DSL without business-specific activities. WF in SharePoint is closer to what I am looking for, especially with its activities for content approval and routing, focused on the document management space.

This is a topic I would like to discuss with other people interested in modeling. Any feedback?