Thursday, December 10, 2015

«Creating a great [software] engineering culture» roundtable at Merge Lisbon #3

This evening I’ll be participating, together with people from local companies Uniplaces, Talkdesk and GrandUnion, in a panel on how to create a great software engineering culture, at the third meeting of Merge Lisbon.

An interesting topic, in a world where startups are born every day focused on delivering an MVP as soon as possible (and not on quality), where people stay in their jobs for less and less time or simply freelance and pick only the really enticing projects, where Agile can on occasion lead to sloppiness, where people (rightly) enjoy working from home, and where – very often – the “bits are worth less”, for example when developing microsites for events or apps for short-lived festivals. Being based in Lisbon, I could also add: and where people leave all the time to work in the UK, the Netherlands or Switzerland.

I’ll come back here after the panel, surely with new ideas.

More information here (but it’s been fully booked for some time, I’m told).

Monday, December 7, 2015

[Micro-post] Puretext has a new version!

I’ve been using this little tool for years now, and just noticed it has a new version. PureText is a simple tool that sits in your system tray, uses a mere 0.1 MB of RAM (!), and lets you press Windows+V to paste text with the formatting removed. You can change the key combination, but this one is perfect.

Extremely useful! Completely free, downloadable here.

Friday, December 4, 2015

«Azure WebApps: Why aren’t you using them yet?» @ Microsoft WebCamp 2015 - Lisboa

Microsoft Portugal held its yearly WebCamp event this last Wednesday, focused on web technologies, both from Microsoft and from the Open Source world. The openness to OSS is clear, with sessions focusing specifically on that approach to software development. The term “best of breed” does come to mind, in the sense that more and more solutions include parts from different sources/vendors that have to work well together.

Anyway, my session focused on Azure WebApps. I expected the technology to be familiar to most people by now, but was surprised to find out it is not. The session had an enterprise-ready focus, which I had to adjust somewhat as a consequence.

As usual, the session was demo-heavy, and unfortunately the 40 minutes were not enough to show them all. Here’s what I had planned:

  • Create a site and publish from Visual Studio – showing the simplicity of the process and the new window in the SDK 2.8.1;
  • WebApps in the Azure Resource Manager – open up https://resources.azure.com and show how the site resources and their settings are represented;
  • Backup and Restore – using a site (a to-do app sample) I had previously deployed, I showed how the Backup and Restore features work, including the ability to embed the database in the backup. For me, one of the killer features of this PaaS offering;
  • Remote Debugging – it’s always amazing to be able to debug code that is running remotely in the cloud. There was a specific session on Application Insights, so I opted not to go into it;
  • Staged Publishing – deployment slots are huge, both in supporting the dev/test/quality environments and in team development itself (e.g., having a slot for each developer). I had a chance to show a swap and “setting stickiness”, but had to skip the “A/B testing” support (where x% of the traffic is directed to one slot, and the rest to another);
  • Traffic Manager – this is another impactful demo: having pre-created sites in North Europe, Brazil and Japan, I used Traffic Manager to unify them under a single domain name, and then www.whatsmydns.net to check the dynamic resolution. Always a great demo (I’m impressed myself :)).

The session time was only 40 minutes, so other demos had to be left out:

  • Redis Session State Provider – the idea here was to explain Redis (an instance of which I had pre-created), install the package with the session state provider, change web.config, and use redis-cli to show the keys/sessions being added to the repository;
  • Scaling – with Redis as the backend for the user’s session state, the obvious next demo was using Scaling/Automatic scaling to see it working, and showing how scaling back down to a single server didn’t imply a loss of session;
  • Web Tests (not to be confused with Load Tests) – one of the newest features in WebApps is Web Tests, a service that works similarly to what services like AreMySitesUp provide: access your site from several locations worldwide, and if, for example, 3 of them can’t reach it more than 3 times, send an email alert. Discreet, but helpful;
  • IP Blocking – this final demo addressed one specific complaint about the way IP Blocking works in Azure WebApps: either you include the blocked IPs in Web.config, or you use an App Service Environment (ASE). ASE implies the Premium service tier, which is costly. Adding an IP in the web.config implies a site restart, and if you have sites in several regions, you have to make the change in all of them. So the demo goes the applicational path: the IPs to block are simply added to Redis (e.g., IPB1.2.3.4) and an Action Filter in the MVC project checks the source IP and returns HTTP 403s if it’s in the blocked list. Quick and yet not dirty :).
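The applicational IP-blocking demo can be sketched roughly as an ASP.NET MVC action filter like the one below. This is a minimal sketch, not the exact demo code: `RedisBlockList.IsBlocked` is an assumed helper that checks Redis for a key built from the “IPB” prefix plus the caller’s IP.

```csharp
using System.Web.Mvc;

// Sketch: an MVC action filter that returns HTTP 403 when the caller's IP
// is flagged in Redis (e.g. a key like "IPB1.2.3.4" exists).
public class BlockIpAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        string ip = filterContext.HttpContext.Request.UserHostAddress;

        // RedisBlockList.IsBlocked is an assumed helper wrapping a Redis
        // EXISTS check on the "IPB" + ip key.
        if (RedisBlockList.IsBlocked(ip))
        {
            filterContext.Result = new HttpStatusCodeResult(403);
        }
    }
}
```

Registering the attribute globally (or on specific controllers) means the block takes effect immediately across all regions as soon as the key is added to Redis, with no site restart.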

That was it. A full room, I’m just sorry I didn’t have the time to show everything I had prepared. Maybe I should do some screencasts?

Monday, November 16, 2015

Revista Programar: Azure Logic Apps: o futuro dos backends? [Portuguese]

My article "Azure Logic Apps: o futuro dos backends?" just made the cover of the 50th edition of the "Revista Programar" magazine. The article describes my view of the historical evolution from Mashups to SOA to Microservices, and describes the current version of Azure Logic Apps, Microsoft's implementation of that architectural view.

If you do happen to read Portuguese :), the direct link to the article is here.

Friday, June 26, 2015

IoT: Raspberry Pi2 and Azure Event Hubs and Mono and SQL Database–experiences

A couple of months ago I bought a Pi2, to complement the Pi1 I use mostly as a media center. I also bought modmypi’s Raspberry Pi YouTube Workshop Kit, a pack that includes a breadboard, cables and a set of sensors, and that pairs with a set of tutorial videos on how to set it up. The tutorials are all done using Python, but my goal was (obviously) to do the same using .Net/Mono on Raspbian.

Using an approach and code that initially was similar for example to Jan Tielens’ in his “Raspberry Pi + GPIOs with DS18B20 + Azure + C# = Internet Thermometer” article, and which I’ll describe in a later post, I now have my Pi2 sending temperature readings to an Azure Event Hub using REST, from where it is read by Azure Stream Analytics and then dropped into an Azure SQL Database. I still hope to wire this up to PowerBI, but there doesn’t seem to be a way at the moment to connect my MSDN Azure account with my corporate account where we have 100 PowerBI licenses, so that will have to wait.

What I wanted to share for now are some tips regarding the process, which are not described in other articles I read on the net, and which I guess are very specific to the IoT/sensor world (to which I am new). Keep in mind that my simple goal was to have the Pi2 send temperature readings to Azure every minute.

Service Bus Queue vs Event Hub

My initial code was posting readings to an SB Queue. I didn’t anticipate using Event Hubs, since an event every minute doesn’t justify the platform’s capabilities. Turns out that Azure Stream Analytics doesn’t support SB Queues as the source, so I had to change the connection code that posts temperature readings using REST. The changes were very few, but included:

  • Dropping the support for custom headers, which I was injecting in the message sent to the Service Bus (example: sensor id). I had to move this information into the message payload itself;
  • Changing the URL to which the message is posted, including the API version. To write to SB I was using https://myservice.servicebus.windows.net/queuename/messages?timeout=60&api-version=2015-01 and had to change this to: https://myservice.servicebus.windows.net/eventhubname/messages?timeout=60&api-version=2014-05.
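The resulting REST send can be sketched roughly like this, assuming a SAS token has already been created (the namespace and event hub names below are the placeholders from the URLs above; the payload is the JSON-serialized message, with the sensor id now moved into it):

```csharp
using System;
using System.Net;
using System.Text;

// Sketch: posting a JSON-serialized reading to an Event Hub over REST.
// The URL matches the one in the post; the SAS token goes in the
// Authorization header instead of custom headers.
static class EventHubSender
{
    const string Url =
        "https://myservice.servicebus.windows.net/eventhubname/messages?timeout=60&api-version=2014-05";

    public static void Send(string sasToken, string jsonPayload)
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.Authorization] = sasToken;
            client.Headers[HttpRequestHeader.ContentType] =
                "application/json; charset=utf-8";
            client.UploadData(Url, "POST", Encoding.UTF8.GetBytes(jsonPayload));
        }
    }
}
```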

Reading the Temperature – closing the Stream

The code that reads from the sensor, in Jan Tielens’ code (and other similar code found on the net), doesn’t allow for repeat readings in a loop. This line of code:

var w1slavetext = deviceDir.GetFiles("w1_slave").FirstOrDefault().OpenText().ReadToEnd();

… actually leaves a text stream open (StreamReader class), that has to be closed for repeat readings to work. So that was another fix.
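One way to sketch the fix is to wrap the `StreamReader` in a `using` block so it is disposed after every read (`deviceDir` is the `DirectoryInfo` for the 1-wire device directory, as in Jan Tielens’ code):

```csharp
using System.IO;
using System.Linq;

// Reads the raw w1_slave file contents, closing the underlying stream so
// that repeated readings in a loop keep working.
static string ReadSensorRaw(DirectoryInfo deviceDir)
{
    using (var reader = deviceDir.GetFiles("w1_slave").First().OpenText())
    {
        return reader.ReadToEnd();
    } // the StreamReader (and its stream) is disposed here
}
```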

The main loop – time between readings

My application is a simple console application implementing a while(true) loop that does this:

  • Read a temperature value from the sensor
  • Send a message to an Event Hub, by doing an HTTP post of a JSON-serialized message
  • Wait for 60 seconds with Thread.Sleep

One thing I noticed was that the readings were spaced not 60 seconds apart, but 60+something. This “something”, usually 2-3 seconds, was obviously caused by the time the first two steps took. So to fix this I created a System.Diagnostics.Stopwatch at the start of the main loop, and at its end waited for 60 seconds minus the time it took for the first 2 operations to execute.

Now the readings were close enough (to a few milliseconds) to one every minute. Simple fix, simple mistake to make.

The main loop – long operations

The previous solution has a problem, which I quickly found out about. After running for a few hours, I had some posts to the Event Hub that took a long time. More than 60 seconds. Maybe the cause was some Wi-Fi or network issue, I don’t know. But what this meant was that I was calling Thread.Sleep with a negative value, which crashed the app. So another fix: if the operations took more than 60 seconds, don’t sleep and do another temperature reading immediately.
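Both timing fixes together can be sketched as follows (`ReadTemperature` and `SendToEventHub` are assumed helpers standing in for the sensor read and the REST post):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Sketch of the main loop: subtract the time spent reading and sending
// from the 60-second interval, and never sleep a negative amount.
var interval = TimeSpan.FromSeconds(60);
while (true)
{
    var sw = Stopwatch.StartNew();

    double temperature = ReadTemperature();   // assumed helper
    SendToEventHub(temperature);              // assumed helper

    var remaining = interval - sw.Elapsed;
    if (remaining > TimeSpan.Zero)
        Thread.Sleep(remaining);
    // otherwise the post took more than 60s: read again immediately
}
```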

The main loop – SAS tokens’ lifetime

At the top of the app, before the main loop, the first thing I do is create a SAS token used to connect to the event hub. This token has a lifetime, which I think is one hour by default. So, as you can expect: after one hour of reading temperature (60 readings), the SAS token expired, sending the message failed with a 401 (permission denied), and I had an exception that stopped the app. Back to the code, another simple fix: wrap the sending of the message to the event hub (which uses the WebClient class) with a try/catch, and when I find a WebException with 401 as the error code, recreate the SAS token and send the message again.
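The retry logic can be sketched roughly like this (`CreateSasToken` and `SendMessage` are assumed helpers for the token creation and the WebClient post):

```csharp
using System.Net;

// Sketch: on a 401 from the event hub, recreate the SAS token
// (its lifetime elapsed, ~1 hour by default) and retry the send once.
try
{
    SendMessage(sasToken, payload);
}
catch (WebException ex)
{
    var response = ex.Response as HttpWebResponse;
    if (response != null && response.StatusCode == HttpStatusCode.Unauthorized)
    {
        sasToken = CreateSasToken();   // assumed helper
        SendMessage(sasToken, payload);
    }
    else
    {
        throw;   // any other failure is handled by the outer catch-all
    }
}
```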

The main loop – the all-encompassing try-catch

The last fix came after I started having unhandled exceptions which I am not sure are due to the Pi2, Mono, the network, whatever: I just wrapped the code inside the main loop in a general try-catch, logged any error to the console output, and continued the loop execution. A “just in case” solution.

Finally, getting information about the device

This is not specific to the handling of the readings themselves, but I think it is relevant. In the payload of my messages I wanted to include some information specific to the device, and found out I could get this information by reading from some devices/streams provided by the Raspbian OS. I dug into some samples on the net, and ended up with code that gets both the serial and the model name. The OS calls I make, using the Process/ProcessStartInfo classes as a way to invoke bash, are:

cat /proc/cpuinfo | grep Serial | awk '{print $3}' – this gets you the device’s serial, for example “00000000f5b55a06”

cat /proc/cpuinfo | grep 'model name' | head -n 1 – this gets you a string from which you can extract the model name, for example “ARMv7 Processor rev 5 (v7l)”
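A minimal sketch of invoking those commands from Mono, using the Process/ProcessStartInfo classes mentioned above (the `Bash` helper name is mine):

```csharp
using System.Diagnostics;

// Sketch: run a bash command and return its trimmed stdout.
static string Bash(string command)
{
    var psi = new ProcessStartInfo("/bin/bash", "-c \"" + command + "\"")
    {
        RedirectStandardOutput = true,
        UseShellExecute = false
    };
    using (var process = Process.Start(psi))
    {
        string output = process.StandardOutput.ReadToEnd();
        process.WaitForExit();
        return output.Trim();
    }
}

// Usage:
// string serial = Bash("cat /proc/cpuinfo | grep Serial | awk '{print $3}'");
// string model  = Bash("cat /proc/cpuinfo | grep 'model name' | head -n 1");
```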


I’m still cleaning up the code and making sure it’s stable, but I’ll post it to GitHub pretty soon. Contact me if you want to see it sooner. Anyway, what I have already realized is that the coldest time of the day, in my place at least, is between 22:00 and 02:00, which surprised me, and the temperature variation is about 4 Celsius on average. Interesting info!

Tuesday, June 16, 2015

«The Azure App Service Architecture» @ Microsoft Developer TechRefresh 2015–Lisboa

Yesterday we had another Developer TechRefresh at Microsoft Lisboa, where I and my colleague André Vala both presented sessions. My session was a repeat of the Build 2015 session of the same name, presenting the architecture and demoing the two new main components of the new Azure App Service: API Apps and Logic Apps. The first, especially, I would say is almost ready for prime time, and both are a very good play by Microsoft in the microservices/mashup space. Very interesting technology, although obviously not everything is finished yet and there are some issues.

Session slides are available on slideshare. The video of the session was recorded, will link to it when available.

PS: I just wish Microsoft fixed/replaced the “new” Azure portal, I still have frequent errors using it.

Friday, May 22, 2015

ITARC15 Architecting a Large Software Project - Lessons Learned

This morning I presented my “Lessons Learned” workshop at ITARC 2015 in Stockholm, Sweden. This session had previously been presented at Netponto, and was improved with more content targeted at software architects, as well as updated with more current information. The goal of the workshop (3.5 hours!) is to share experiences and discuss approaches in developing complex software projects. I had great feedback from the participants, and provocative and relevant questions.

One issue that did come up is the definition of “Large”: this was a large project by Portuguese standards, plus it was complex and took a long time until release. But by local Swedish standards, it wasn’t that “large” :). Even so, the contents are general enough to be interesting, or so I was told. It was also interesting to learn about some cultural differences between Portugal and Sweden – and those were much smaller than would be expected.

Great session, loved doing it. Here’s the slidedeck, for those interested.