toaster » (the universe lets me win) 2013-06-22 21:20:07

June 22nd, 2013
Elizabeth, E, Seamus, EJ, LJ, Kidface, Eliza, SpaceFrog, Lizabiz, Liz, Fizz, Spud, Piglet, Bald Cat, Grumpkin, Poppyseed. Our many-named baby is six months old! It's been the nicest time and the most exhausting and the most terrifying and the most fun. I guess everyone says things like this, because getting a kid comes with useful brain damage that makes any road-not-taken comparisons very suspect, but I'm so glad we did this. So glad! Is it objectively better? It _feels_ like it is, but it's impossible for me to tell.

At six months old she looks less like a bewildered baby and more like a small kid. She talks a lot, using words that sound like a language we don't know. She laughs. Oh my god it's so good. I hoist her up into the air on my knees and she grips my fingers and makes these big capital-D smiles and laughs like this is the funniest thing that ever happened. She laughs at stupid faces, at nonsense words, at falling over. Sometimes she laughs just because someone else is laughing. How does she know to do that? I don't know!

She can't crawl, but she has invented a ridiculous mode of perambulation where you push your face into the ground, brace your legs to make an arch, then flop. It hasn't occurred to her to use her arms for any part of this operation; the face does the heavy lifting. It works: she no longer stays where she's put. I have woken to find her looming over me, jaws wide, two tiny razor-sharp teeth glinting in the moonlight, gigantic baby bobble head moving jerkily around as she searches for something to chew on. It's a little unsettling at 4am.

She likes the baby in the mirror. She's just discovered how great it is to kick the water in the bath. She thinks peas are amazing and carrots are weird. She loves the bright pictures in The Snowy Day, but is also fairly attentive when I want to read Dr Seuss. When she thinks really hard, she looks cross, but isn't, just like Joel does when he's thinking really hard. I like to watch her contemplate things. I wonder about what she decides.

People in work say "How's the baby?" and I can feel myself light up as I think of a hundred things I want to enthuse about. I say "She's amazing :-D" and "She just did this new thing, let me tell you!", and honestly, I do know that other people's babies are not intrinsically interesting, and I try to rate-limit myself, but, dudes, she's so cool. I really enjoy having her. She's a great baby.

Here's a bunch of pictures.
Elizabeth 4-6 months.

Elizabeth, May 19, 2013

cambo » LINQ Deferred Execution & Lambda Methods for providing Simple Stats (Part II)

May 24th, 2013

Important!

This is part 2 in a series of posts on Linq & Lambda capabilities in C# 

Deferred Execution

So let's take a minute to talk about deferred execution. You may also hear this referred to as lazy execution. In a nutshell, what this means is that when you write a LINQ or lambda query against a collection or list, the execution of that query doesn't actually happen until the point where you need to access the results. Let's look at a simple example.

var ienum = Enumerable.Range(1, 10).ToList();

var query = from i in ienum
            where i%2 == 0
            select i;

ienum.Add(20);
ienum.Add(30);

SuperConsole.WriteLine(query);
//prints 2, 4, 6, 8, 10, 20, 30

So why does it print out 20 and 30? This is deferred execution in practice. At the point where you write your query (var query), the query is not actually executed against your data source (ienum). After the query is set up, more data is added to the data source, and the query is only actually executed at the point where the results need to be evaluated (SuperConsole.WriteLine).

This holds true in a number of other LINQ scenarios. In LINQ-to-SQL or LINQ-to-Entity-Framework, the SQL query is only sent to the database at the point where you need to evaluate the results. It's important to understand this so that queries don't go out of scope before being executed, so that un-executed queries aren't inadvertently passed to other parts or layers in your application, and so that you don't end up introducing N+1 problems where you think you're working on data in memory but are in actual fact performing executions over and over in a loop. If you do need to make your queries "greedy" and force them to execute there and then, you can wrap them in parentheses and immediately call .ToList() on them to force the execution.
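For example, forcing the earlier query to execute immediately changes the output (a small sketch reusing the snippet above):

var ienum = Enumerable.Range(1, 10).ToList();

// Wrapping the query in parentheses and calling ToList() executes it
// immediately ("greedy"), snapshotting the results at this point.
var greedy = (from i in ienum
              where i % 2 == 0
              select i).ToList();

ienum.Add(20);
ienum.Add(30);

SuperConsole.WriteLine(greedy);
//prints 2, 4, 6, 8, 10 - the later Adds are not picked up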

Min, Max, Count & Average

LINQ has a number of convenient built-in methods for getting various numeric stats about the data you're working on. Consider a collection of Movies which you want to query.

public class Movie
{
    public string Title { get; set; }
    public double Rating { get; set; }
}

...

var movies = new List<Movie>
    {
        new Movie() {Title = "Die Hard", Rating = 4.0},
        new Movie() {Title = "Commando", Rating = 5.0},
        new Movie() {Title = "Matrix Revolutions", Rating = 2.1}
    };

Console.WriteLine(movies.Min(m => m.Rating));
//prints 2.1

Console.WriteLine(movies.Max(m => m.Rating));
//prints 5

Console.WriteLine(movies.Average(m => m.Rating));
//prints 3.7

Console.WriteLine(movies.Count);
Console.WriteLine(movies.Count());
//prints 3

Min, Max and Average are all fairly straightforward, finding the minimum, maximum and average movie rating values respectively. It's worth mentioning with regard to Count that there are different "versions" of the count implementation depending on the underlying data structure you are operating on. The Count property is a property of the List<T> class and returns the current number of items in that collection. The Count() method is an extension method on the IEnumerable<T> interface which can be executed on any IEnumerable structure regardless of implementation.

In general LINQ's Count() will be slower and is an O(N) operation, while List.Count and Array.Length are both guaranteed to be O(1). However in some cases LINQ will special-case the IEnumerable parameter by casting to certain interface types such as IList or ICollection. It will then use that Count property to do an actual Count() operation, so it goes back down to O(1). But you still pay the minor overhead of the cast and interface call. Ref: [http://stackoverflow.com/questions/981254/is-the-linq-count-faster-or-slower-than-list-count-or-array-length/981283#981283]
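To see where the O(N) path kicks in, compare a plain list against a filtered sequence (an illustrative sketch):

var numbers = Enumerable.Range(1, 1000000).ToList();

// List<T>.Count is a simple property read: O(1).
Console.WriteLine(numbers.Count);

// Count() on the list still hits the ICollection<T> fast path: O(1),
// plus the minor cost of the cast and interface call.
Console.WriteLine(numbers.Count());

// After a Where(), the sequence is an iterator block with no cached
// count, so Count() has to walk the whole sequence: O(N).
Console.WriteLine(numbers.Where(n => n % 2 == 0).Count());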

This is important as well if you are testing your collections to see if they are empty. People coming from versions of .NET prior to generics would use the Count or Length properties of a collection to see if it was empty, i.e.

if(list.Count == 0)
{ 
    //empty
}
if(array.Length == 0)
{
    //empty
}

LINQ however provides another method to test for contents, called Any(). It can be used to evaluate whether the collection is empty, or whether the collection has any items matching a specific filter.

if(list.Any()) //equivalent of Count != 0
{ 
    //not empty
}
if(list.Any(m => m.Rating == 5.0)) //if it contains any top rated movies
{
    //contains at least one movie with a 5.0 rating
}

If you are starting with something that has a .Length or .Count (such as ICollection<T>, IList<T>, List<T>, etc.), then this will be the fastest option, since it doesn't need to go through the GetEnumerator()/MoveNext()/Dispose() sequence required by Any() to check for a non-empty IEnumerable<T> sequence. For just IEnumerable<T>, Any() will generally be quicker, as it only has to look at one iteration. However, note that the LINQ-to-Objects implementation of Count() does check for ICollection<T> (using .Count as an optimisation), so if your underlying data source is directly a list/collection, there won't be a huge difference. Don't ask me why it doesn't use the non-generic ICollection... Of course, if you have used LINQ to filter it (Where etc.), you will have an iterator-block based sequence, and so this ICollection<T> optimisation is useless. In general with IEnumerable<T>: stick with Any(). Ref: [http://stackoverflow.com/questions/305092/which-method-performs-better-any-vs-count-0/305156#305156]
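Putting that advice into practice with the movies list from above:

// Direct list: the Count property is the cheapest emptiness check.
bool hasMovies = movies.Count > 0;

// Filtered sequence: an iterator block with no cached count, so prefer
// Any(), which can stop at the first matching element.
bool hasTopRated = movies.Where(m => m.Rating == 5.0).Any();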

Next post, we’ll look at some different mechanisms for filtering and transforming our queries.

~Eoin C

cambo » Handy LINQ & Lambda Methods and Extensions (Part I)

May 23rd, 2013

The System.Linq namespace contains a fantastic set of utility extension methods for filtering, ordering & manipulating the contents of your collections and objects. In the following posts I'll go through some of the most useful ones (in my humble opinion) and how you might use them in your C# solutions.

Important!

This is part 1 in a series of posts on Linq & Lambda capabilities in C# 

Before we start, here’s a handy static method to print your resulting collections to the console so you can quickly verify the results.

public class SuperConsole
{
    public static void WriteLine<T>(IEnumerable<T> list, bool includeCarriageReturnBetweenItems = false)
    {
        var separator = includeCarriageReturnBetweenItems ? ",\n" : ", ";
        var result = string.Join(separator, list);
        Console.WriteLine(result);
    }
}

Enumerable

The System.Linq.Enumerable type has two very useful static methods for quickly generating a sequence of items: Enumerable.Range & Enumerable.Repeat. The Range method allows you to quickly generate a sequential list of integers from a given starting point for a given number of items.

IEnumerable<int> range = Enumerable.Range(1, 10);
SuperConsole.WriteLine(range);
//prints "1, 2, 3, 4, 5, 6, 7, 8, 9, 10"

So why is this useful? Well, you could use it to quickly generate a pre-initialised list of integers rather than new'ing up a list and then iterating over it to populate it. Or you could use it to replicate for-loop behavior. e.g.

for (int i = 1; i <= 10; i++) 
{     
    //DoWork(i); 
} 

Enumerable.Range(1, 10).ToList().ForEach(i =>
    {
        //DoWork(i)
    });

Repeat is similar but is not limited to integers. You can generate a sequence of a given length with the same default value in every item. Imagine you wanted to create a list of 10 strings all initialised with a default string of "ABC":

var myList = Enumerable.Repeat("ABC", 10).ToList();

Item Conversion

There are also a few handy ways to convert/cast items built into the System.Linq namespace. The Cast<T> extension method allows you to cast a list of variables from one type to another as long as a valid cast is available. This can be useful for quickly changing a collection of derived types into their base types.

var integers = Enumerable.Range(1, 5);
var objects = integers.Cast<object>().ToList();

Console.WriteLine(objects.GetType());
SuperConsole.WriteLine(objects);

//prints
//System.Collections.Generic.List`1[System.Object]
//1, 2, 3, 4, 5

But what if a valid cast isn't available? What if we wanted to convert our collection of integers into a collection of strings with a ':' suffix? Thankfully the ConvertAll method on List<T> has us covered.

var integers = Enumerable.Range(1, 5);
var converter = new Converter<int, string>(input => string.Format("{0}: ", input));
var results = integers.ToList().ConvertAll(converter);

SuperConsole.WriteLine(results, true);
/*prints
    1:
    2:
    3:
    4:
    5:
    */

In the next post, we'll look at some of the lazy & deferred execution capabilities of LINQ and some useful methods for performing quick calculations and manipulations on our collections.

~Eoin C

cambo » Implementing HTML Formatted Emails in the Enterprise Library Logging Block

May 21st, 2013

The Microsoft Patterns & Practices Enterprise Library contains a number of useful application blocks for simplifying things like Data Access, Logging & Exception Handling in your .NET applications. Recently we had a requirement to add HTML-based formatting to the Email TraceListener in the Logging Application Block, something that's unfortunately missing from the base functionality. Thankfully, Enterprise Library is an open source CodePlex project, so implementing a custom solution is a relatively trivial task. The email trace listener functionality is contained in 3 main files:

  • EmailTraceListener – the actual listener which you add to your configuration
  • EmailTraceListenerData – the object representing the configuration settings
  • EmailMessage – the wrapper object around a LogEntry which gets sent via email

Unfortunately, because of the way these classes are implemented in the Enterprise Library Logging Block, they are not easily extended due to dependencies on private variables and internal classes in the Enterprise Library Logging assembly, so they need to be fully re-implemented in your own solution.

Implementing a Solution

Step 1 was to take a copy of these three files and place them in my own library solution. I prefixed the name of each of them with Html: HtmlEmailTraceListener, HtmlEmailTraceListenerData and HtmlEmailMessage. Other code needed to be cleaned up, including removing some dependencies on the internal ResourceDependency attributes used to decorate properties within the classes, and tidying up the XML documentation comments. The main change was then to enable the IsBodyHtml flag on the mail message itself. This was done in the CreateMailMessage method of the HtmlEmailMessage.

protected MailMessage CreateMailMessage()
{
	string header = GenerateSubjectPrefix(configurationData.SubjectLineStarter);
	string footer = GenerateSubjectSuffix(configurationData.SubjectLineEnder);

	string sendToSmtpSubject = header + logEntry.Severity.ToString() + footer;

	MailMessage message = new MailMessage();
	string[] toAddresses = configurationData.ToAddress.Split(';');
	foreach (string toAddress in toAddresses)
	{
		message.To.Add(new MailAddress(toAddress));
	}

	message.From = new MailAddress(configurationData.FromAddress);

	message.Body = (formatter != null) ? formatter.Format(logEntry) : logEntry.Message;
	message.Subject = sendToSmtpSubject;
	message.BodyEncoding = Encoding.UTF8;
	message.IsBodyHtml = true;

	return message;
}

Using your new solution

Once implemented, it's simply a matter of reconfiguring your app/web.config logging sections to use the new types you've created instead of the original Enterprise Library types. You need to change the type and listenerDataType properties of your email listener in the <listeners> section of your config.

<listeners>
      <!-- Please update the following Settings: toAddress, subjectLineStarter, subjectLineEnder-->
      <add name="EmailLog"
           toAddress="toAddress@example.com"
           subjectLineStarter="Test Console - "
           subjectLineEnder=" Alert"
           filter="Verbose"
           fromAddress="fromAddress@example.com"
           formatter="EmailFormatter"
           smtpServer="smtp.gmail.com"
           smtpPort="587"
           authenticationMode="UserNameAndPassword"
           useSSL="true"
           userName="fromAddress@example.com"
           password="Password"
           type="YourLibrary.YourNamespace.HtmlEmailTraceListener, YourLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0000000000000000"
           listenerDataType="YourLibrary.YourNamespace.HtmlEmailTraceListenerData,  YourLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0000000000000000"
           traceOutputOptions="Callstack" />
    </listeners>

You'll also need to ensure that you've escaped your HTML-formatted textFormatter template in the formatters section of your config, i.e. replacing <html> with &lt;html&gt;

<formatters>
      <add name="EmailFormatter" 
              type="Microsoft.Practices.EnterpriseLibrary.Logging.Formatters.TextFormatter, Microsoft.Practices.EnterpriseLibrary.Logging, Version=5.0.414.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" 
              template="
                  &lt;html&gt;
                  &lt;body&gt;
                  &lt;table border=&quot;1&quot; style=&quot;border: solid 1px #000000; border-collapse:collapse;&quot;&gt;
                  &lt;tr&gt;&lt;td&gt;&lt;b&gt;Message&lt;/b&gt;&lt;/td&gt;&lt;td&gt;&lt;b&gt;{message}&lt;/b&gt;&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Local TimeStamp&lt;/td&gt;&lt;td&gt;{timestamp(local)}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Timestamp&lt;/td&gt;&lt;td&gt;{timestamp}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Title&lt;/td&gt;&lt;td&gt;{title}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Severity&lt;/td&gt;&lt;td&gt;{severity}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Category&lt;/td&gt;&lt;td&gt;{category}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Priority&lt;/td&gt;&lt;td&gt;{priority}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;EventId&lt;/td&gt;&lt;td&gt;{eventid}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Local Machine&lt;/td&gt;&lt;td&gt;{localMachine}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;AppDomain&lt;/td&gt;&lt;td&gt;{appDomain}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;LocalDomain&lt;/td&gt;&lt;td&gt;{localAppDomain}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Local Process Name&lt;/td&gt;&lt;td&gt;{localProcessName}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Local Process&lt;/td&gt;&lt;td&gt;{localProcessId}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Win32ThreadId&lt;/td&gt;&lt;td&gt;{win32ThreadId}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;ThreadName&lt;/td&gt;&lt;td&gt;{threadName}&lt;/td&gt;&lt;/tr&gt;
                  &lt;tr&gt;&lt;td&gt;Extended Properties&lt;/td&gt;&lt;td&gt;
                  &lt;table border=&quot;1&quot; style=&quot;border: solid 1px #000000; border-collapse:collapse;&quot;&gt;
                  {dictionary(&lt;tr&gt;&lt;td&gt;{key}&lt;/td&gt;&lt;td&gt;{value}&lt;/td&gt;&lt;/tr&gt;)}
                  &lt;/table&gt;&lt;/td&gt;&lt;/tr&gt;
                  &lt;/table&gt;
                  &lt;/body&gt;
                  &lt;/html&gt;
" />

    </formatters>

All done. Now you can happily send email log messages in HTML format via your Application Logging calls.

~EoinC

cambo » Using SignalR to Publish Dashboard Data

May 15th, 2013
SignalR

Recently David Fowler announced the release of SignalR 1.1.0 Beta. So I decided to do some dabbling to get a prototype application up and running. The solution is pretty simple: it uses a SignalR hub to broadcast the current processor % usage, and renders it in a nice visual graph using HighCharts.

Important!

The completed solution can be found on GitHub at https://github.com/eoincampbell/signalr-processor-demo 

First things first, we'll need a bare-bones web application into which we can pull the relevant NuGet packages. I started with a basic empty web application running under .NET 4.5. Installing the SignalR & HighCharts packages is a breeze: open up the NuGet Package Manager Console and run the following commands. HighCharts gets installed as a solution-level package, so you'll need to manually copy the relevant JavaScript files to your scripts directory in your application.

Install-Package HighCharts
Install-Package Microsoft.AspNet.SignalR

The Hub

SignalR relies on a "Hub" to push data back to all the connected clients. I've created a "ProcessorDataHub" which extends the SignalR base Hub class to manage this process. It contains a constructor for initializing a static instance of my ProcessorTicker class, and a Start method to start the thread within the ticker. The HubName attribute specifies the name by which the hub will be accessible on the JavaScript side.

[HubName("processorTicker")]
public class ProcessorDataHub : Hub
{
    private readonly ProcessorTicker _ticker;

    public ProcessorDataHub() : this(ProcessorTicker.Instance) { }

    public ProcessorDataHub(ProcessorTicker ticker)
    {
        _ticker = ticker;
    }

    public void Start()
    {
        _ticker.Start(Clients);
    }
}

The ProcessorTicker

The heavy lifting is then done by the ProcessorTicker. This is instantiated with a reference to the Clients object, a HubConnectionContext which contains dynamic objects allowing you to push notifications to some or all connected client-side callers. The implementation is fairly simple, using a System.Threading.Timer which reads the current processor level from a performance counter once per second and broadcasts that value to the client side.

Since the Clients.All connection is dynamic, calling “updateCpuUsage” on this object will work at runtime, so long as the relevant client side wiring up to that expected method has been done correctly.

Clients.All.updateCpuUsage(percentage);
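The full ProcessorTicker isn't shown in the post (it lives in the GitHub repo linked above), but a minimal sketch consistent with the description might look like this, with the member details assumed rather than taken from the repo:

using System;
using System.Diagnostics;
using System.Threading;
using Microsoft.AspNet.SignalR.Hubs;

public class ProcessorTicker
{
    private static readonly Lazy<ProcessorTicker> _instance =
        new Lazy<ProcessorTicker>(() => new ProcessorTicker());

    private readonly PerformanceCounter _cpu =
        new PerformanceCounter("Processor", "% Processor Time", "_Total");

    private Timer _timer;
    private HubConnectionContext _clients;

    public static ProcessorTicker Instance { get { return _instance.Value; } }

    public void Start(HubConnectionContext clients)
    {
        _clients = clients;
        // Sample once per second, starting immediately.
        if (_timer == null)
        {
            _timer = new Timer(Broadcast, null, 0, 1000);
        }
    }

    private void Broadcast(object state)
    {
        var percentage = (int)_cpu.NextValue();
        // Resolved dynamically at runtime; the method name must match
        // the handler wired up on the JavaScript side.
        _clients.All.updateCpuUsage(percentage);
    }
}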

The Client Side

One change since the previous version of SignalR is the requirement for the developer to manually & explicitly wire up the dynamically generated JavaScript endpoint where SignalR creates its JavaScript. This can be done on application start by calling the RouteTable.Routes.MapHubs() method.

protected void Application_Start(object sender, EventArgs e)
{
    RouteTable.Routes.MapHubs();
}

Finally we're ready to consume these published messages on our client page. SignalR requires the following JavaScript includes in the head section of your page.

<script type="text/javascript" src="/Signalr/Scripts/jquery-1.6.4.js"></script>
<script type="text/javascript" src="/Signalr/Scripts/jquery.signalR-1.1.0-beta1.js"></script>
<script type="text/javascript" src="/Signalr/signalr/hubs"></script>

With those in place, we wire up our own custom JavaScript function to access our ProcessorTicker, start the hub on a button click, and begin receiving and processing the published values.

    <script type="text/javascript">
    $(function () {
        var ticker = $.connection.processorTicker;

        //HighCharts JS Omitted..

        ticker.client.updateCpuUsage = function (percentage) {
            $("#processorTicker").text("" + percentage + "%");

            var x = (new Date()).getTime(), // current time
                y = percentage,
                series = chart.series[0];

            series.addPoint([x, y], true, true);
        };

        // Start the connection
        $.connection.hub.start(function () {
            //alert('Started');
        });

        // Wire up the buttons
        $("#start").click(function () {
            ticker.server.start();
        });
    });
</script>

The result is that I can fire up a number of separate browser instances and they'll all get the correct values published to them from the hub over a persistent, long-running connection. Obviously this is an extremely powerful system that could be applied to live operations systems where dashboards have traditionally relied on polling the server at some regular interval.

Live Processor Data to Multiple Browsers via SignalR

~Eoin C

cambo » When's a Deep Dive not a Deep Dive?

April 29th, 2013

Global Windows Azure Bootcamp

This weekend, I attended the Global Windows Azure Deep Dive conference in the National College of Ireland, Dublin. This was a community-organised event, run in conjunction with Microsoft, where local & national IT organisations, educational institutions & .NET communities held a series of events in parallel in a number of cities around the world. The purpose: deep dive into the latest technology available from Microsoft, as well as take part in a massively parallel lab where participants from all over the world would spin up worker roles to contribute to 3D graphics rendering based on depth data from a Kinect. Alas, Deep it was not, and Dive we didn't.

I suppose I can’t complain too much. You get what you pay for and it was a free event but I’d have serious reservations about attending this type of session again. Don’t get me wrong, I don’t want to sound ungrateful, and fair dues to the organisers for holding the event but if you’re going to advertise something as a “Deep Dive” or a “Bootcamp” then that has certain connotations that there would actually be some Advanced Hands-on learning.

Instead the day would barely have qualified as a Level 100 introduction to 2 or 3 Windows Azure technologies, interspersed with sales pitches, student demos of their project work, and filler talks relating to cloud computing in general. Probably most disappointingly, we didn't actually take part in the RenderLab experiment, which kinda torpedoed the "Global" aspect of the day as well. You can see the agenda below; the practical aspects were the two labs and the PaaS demo.

Time  Topic
0930  Welcome – Dr Pramod Pathak, Dean, School of Computing, NCI
0935  Schedule for the day – Vikas Sahni, Lecturer, School of Computing, NCI
0940  How ISIN can help – Dave Feenan, Manager, ISIN
0945  Microsoft's Best Practice in Data Centre Design – Mark O'Neill, Data Center Evangelist, Microsoft
1000  Virtual Machines – Demo and Lab 1 – Vikas Sahni, Lecturer, School of Computing, NCI
1100  Careers in the Cloud – Dr Horacio Gonzalez-Velez, Head, Cloud Competency Center, School of Computing, NCI
1110  Graduates available today – Robert Ward, Head of Marketing, NCI
1120  Break
1135  Web Sites – Demo and Lab 2 – Vikas Sahni, Lecturer, School of Computing, NCI
1235  Building the Trusted Cloud – Terry Landers, Regional Standards Officer for Western Europe, Microsoft
1300  Lunch
1400  Tools for Cloud Development – Colum Horgan, InverCloud
1410  Windows Azure Mobile Services – Overview and Showcase – Vikas Sahni, Lecturer, School of Computing, NCI and Students of NCI
1440  Developing PaaS applications – Demo – Michael Bradford, Lecturer, School of Computing, NCI
1530  Break
1545  Windows Azure – The Big Picture – Vikas Sahni, Lecturer, School of Computing, NCI
1645  Q&A

Alas, even the practical aspects of the day were extremely basic and the kind of thing that most people in the room had done, or could do, in their own spare time.

  • During the Virtual Machines Lab, we spun up a Virtual Machine from the Windows Azure Gallery and remote desktop connected into it.
  • During the Websites Lab, we deployed a WordPress install… unless you were feeling brave enough to do something else. To be fair I hadn’t done a hands on GitHub Deploy of the code so that was interesting.
  • During the PaaS Application Demo… well, it was supposed to be a Hello World web/worker role deployment, but god love the poor chap, he was out of his depth with Visual Studio, had a few technical hiccups, and it was just a bad demo. The upshot was we ran out of time before there was an opportunity for any hands-on time in the room.

At 15:30 we left… I didn’t have another lecture in me, although at least we’d had the common courtesy to stay that long. Half the room didn’t come back after lunch.

The takeaways: I know that a lot of time and effort goes into these events, and particularly when they are free, that time and effort is greatly appreciated. But you need to make sure you get your audience right. If you advertise advanced and deliver basic, people will be disappointed. That was clear from the mass exodus that occurred during the day… I'm kinda curious to know if there was anyone around for the Q&A at all. I'll be sure as heck checking the agenda on these types of events before committing my time to them in future. We aren't currently using Windows Azure in our company yet, and embarrassingly I had been promoting it internally and had convinced several of my colleagues to give up their Saturday for it.

~Eoin C

usal » How much solar capacity would be required to replace the energy output we get from oil?

March 27th, 2013

A while ago I was watching a talk by Elon Musk, founder of PayPal, SpaceX and Tesla. He was talking about a bunch of things, but something he said struck me:


“I think it is helpful to use the analytical approach in physics, which is to try boil things down to first principles and reason from there, instead of trying to reason by analogy. The way this applied to rocketry was to say, okay, well, what are the materials that go into a rocket, how much does each material constituent weigh, what's the cost of that raw material, and that's going to set some floor as to the cost of the rocket. That actually turns out to be a relatively small number. Certainly well under 5% of the cost of a rocket and, in some cases, closer to 1% or 2%. You can call it, maybe, the magic wand number. If you had piles of the raw materials on the floor and you just waved a magic wand and rearranged them, then that would be the best case scenario for a rocket. So, I was able to say, okay, there's obviously a great deal of room for improvement. Even if you consider rockets to be expendable.” - transcript

At the end of the talk he spoke a little about how he thought that the only way away from fossil fuels is solar.

In order to test my ability to use his ‘back to physics’ model, I decided to work out if going all solar was possible or obviously ridiculous.

Is there enough solar energy falling on earth to replace the ‘banked solar’ of fossil fuels?

Start with Sci-fi. Is there enough energy?

Imagine a large solar array, floating outside the atmosphere, in the same orbit around the Sun as Earth. How large would this array need to be to capture the equivalent amount of energy to that which we release by burning oil every day?

First off, how much oil do we produce?
About 85.64×10^6 barrels per day.

How much energy is contained in a barrel of oil?
About 6.12×10^9 J.

This gives us a world output of 6.12×10^9 J/barrel × 85.64×10^6 barrels/day ≈ 5.24×10^17 J/day.

If we simplify from day to second we get 5.24×10^17 J ÷ 86,400 s ≈ 6.07×10^12 J/s.

Luckily J/s is also known as Watt (W) making the next bit easier.

How much solar energy falls on each square meter out there?
That value is known as the solar constant, and it's about 1,367 watts per square meter at Earth's distance from the Sun.

Do the division (6.07×10^12 W ÷ 1,367 W/m^2) and you get an array 4,437,576,225 square meters in size.

Convert from square meters to square kilometers and you get 4,438 km^2: an array slightly larger than Rhode Island.

That’s not as bad as I expected.

What happens when we move to Earth?

From a convenient web page:
“Missouri has more than 200 sunny days per year, for an average of 4.5 to 5.0 kWh per square meter per day”

Taking 200 sunny days at roughly 5 kWh each, that translates to ≈1,000 kWh per square meter per year.

One barrel of oil contains 1.7 MWh.
Convert to kWh (×1,000): 1,700 kWh.
We use the same 'barrels per day' number as earlier, and we assume a 365-day year.

Oil produces 1,700 kWh × 85.64×10^6 barrels/day × 365 days ≈ 5.31×10^13 kWh per year.

Divide by that neat 1,000 kWh per square meter per year and you get 53,139,620,000 m^2.

That's 53,140 square km, or about 29% of Missouri's area, or less than 20% of the area currently farmed for corn in the US.
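If you want to re-run the arithmetic, the whole back-of-envelope calculation fits in a few lines of C# (a sketch only; the constants are the post's own figures):

using System;

class SolarBackOfEnvelope
{
    static void Main()
    {
        const double joulesPerBarrel = 6.12e9;   // J in a barrel of oil
        const double barrelsPerDay   = 85.64e6;  // world oil production
        const double solarConstant   = 1367.0;   // W/m^2 outside the atmosphere
        const double kWhPerM2PerYear = 1000.0;   // Missouri-style insolation

        // Orbital array: express world oil output as continuous power.
        double watts = joulesPerBarrel * barrelsPerDay / 86400.0;
        double spaceKm2 = watts / solarConstant / 1e6;
        Console.WriteLine("Orbital array: {0:N0} km^2", spaceKm2);   // ~4,438

        // Ground array: compare annual energy instead.
        double kWhPerBarrel = joulesPerBarrel / 3.6e6;               // ~1,700
        double kWhPerYear = kWhPerBarrel * barrelsPerDay * 365.0;
        double groundKm2 = kWhPerYear / kWhPerM2PerYear / 1e6;
        Console.WriteLine("Ground array: {0:N0} km^2", groundKm2);   // ~53,140
    }
}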

What I learned?

40% of US corn is used to make ethanol for fuel; replace that with solar and you can replace all the world's oil and get 20% of the land back.

Solar’s practical and corn is a really inefficient way to farm sunshine.

cambo » Could not load type 'System.Runtime.CompilerServices.ExtensionAttribute' from assembly mscorlib when using ILMerge

March 6th, 2013
Works On My Machine

I ran into a pretty horrible problem with ILMerge this week when attempting to build and deploy a Windows service I'd been working on. While the merged executable & subsequently created MSI worked fine on my own machine, it gave the following rather nasty problem when run on a colleague's machine.

Error!

Could not load type ‘System.Runtime.CompilerServices.ExtensionAttribute’ from assembly mscorlib

It turns out that between .NET 4.0 & .NET 4.5, this attribute was moved from System.Core.dll to mscorlib.dll. While that sounds like a rather nasty breaking change in a framework version that is supposed to be 100% compatible, a [TypeForwardedTo] attribute is supposed to make this difference unobservable.

Unfortunately things break when ILMerge is used to merge several assemblies into one. When I merge my .NET 4.0 app with some other assemblies on the machine with .NET 4.5 installed, it sets the target platform for ILMerge to .NET 4.0. This in turn looks in C:\windows\Microsoft.NET\Framework\v4.0.30319 to find the relevant DLLs. But since .NET 4.5 is an in-place upgrade, these have all been updated with their .NET 4.5 counterparts.

Breaking Changes

“Every well intended change has at least one failure mode that nobody thought of”

You need to specify that ILMerge should use the older .NET 4.0 reference assemblies, which are still available in C:\Program Files\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0 (or Program Files (x86) if you're on a 64-bit box). There's more info on the Stack Overflow question where I finally found a solution and in a linked blog post by Matt Wrock.

http://stackoverflow.com/questions/13748055/could-not-load-type-system-runtime-compilerservices-extensionattribute-from-a

and

http://www.mattwrock.com/post/2012/02/29/What-you-should-know-about-running-ILMerge-on-Net-45-Beta-assemblies-targeting-Net-40.aspx

To override this behavior you need to specify this target platform directory as part of your ILMerge command. e.g.

"C:\Path\To\ILMerge.exe"
    /out:"$(TargetDir)OutputExecutable.exe"
    /target:exe
    /targetplatform:"v4,C:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework\v4.0"
      "$(TargetDir)InputExecutable.exe"
      "$(TargetDir)A.dll"
      "$(TargetDir)B.dll"

I had previously been using the ILMerge.MSBuild.Tasks tool from NuGet but unfortunately, this library doesn't currently support specifying the TargetPlatform. There's an unactioned open item on their issue tracker on Google Code.
~EoinC

cambo » Automatically update the AssemblyFileVersion attribute of a .NET Assembly

January 25th, 2013
Automatic AssemblyFileVersion Updates

There is support in .NET for automatically incrementing the AssemblyVersion of a project by using the “.*” notation. e.g.
[assembly: AssemblyVersion("0.1.*")]

Unfortunately the same functionality isn't available for the AssemblyFileVersion. Oftentimes, I don't want to bump the AssemblyVersion of an assembly as it will affect the strong name signature of the assembly, and perhaps the change (a bug fix) isn't significant enough to warrant it. However I do want to automatically increment the file version, so that in a deployed environment, I can right-click the file and establish when it was built & released.

Important!

Enter the Update-AssemblyFileVersion.ps1 file.

This PowerShell script (heavily borrowed from David J Wise's article) runs as a pre-build command on a .NET project. Simply point the command at an assembly info file (or GlobalAssemblyInfo.cs if you're following my suggested versioning tactics) and ta-da, automatically updating AssemblyFileVersions.

The Build component of the version number will be set using the following formula, based on a day count since the year 2000.

# Build = (201X-2000)*366 + (1==>366)
$build = [int32](((Get-Date).Year - 2000) * 366) + (Get-Date).DayOfYear

The Revision component of the version number will be set using the following formula, based on the number of seconds in the current day.

# Revision = (1==>86400)/2 # .NET standard
$revision = [int32](((Get-Date) - (Get-Date).Date).TotalSeconds / 2)

The Major & Minor components are not set to update, although they could be. Simply add the following command to your pre-build event and you're all set.

%SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe 
    -File "C:\Path\To\Update-AssemblyFileVersion.ps1"  
    -assemblyInfoFilePath "$(SolutionDir)\Project\Properties\AssemblyInfo.cs"

~Eoin C

toaster » (the universe lets me win) 2013-01-13 08:45:09

January 13th, 2013
Some livejournal folks are blogging what they had for dinner, and today I will Participate in a Meme. Here's my dinner, in blogular format:

So, Joel asked "What do you want from seamlessweb?" and I said "No! I shall eat something from our fridge just like an adult would". This is not a normal response for me, and maybe I regretted it a bit as he ordered great Chinese food from Tofu in Park Slope and I extracted half a mozzarella and a bag of wilted basil and no Kerrygold because we put it all on the garlic bread on Thursday (and, seriously, that was three quarters of a block of Kerrygold and the garlic bread was an appetiser for a dish that was made mostly out of cheese. How are we still alive?). And I said "huh" and "well" and checked two or three more times to make sure that nothing else in this quite full fridge could be converted into food, but vermouth and apple sauce and old carrots do not a dinner make, even when you have two kinds of every condiment that has ever been sold.

So I went over to the bakery on the corner and I said "Hey, I have a mozzarella and I need bread to put it on" (because after five years living here I still don't know what any kind of bread is called, and this is my survival strategy: I lay out the problem and let them solve it) and the bakery lady said "You need an Italian" and she sold me a soft and crusty white loaf that felt pretty fresh even though it was 8pm. Also, the bakery was still open at 8pm because this is the city that never sleeps (until 9pm), and that's a thing I love about living here.

I took that home and sliced up a lot of the mozzarella and salted and peppered the holy hell out of it, and washed the basil and put it on top, and dug around in the pantry to see if we had any sardines and we did. The pantry is really a converted coat closet, but we have airs. I fried up the sardines in the olive oil they were canned in, which has the side-effect of making the entire house smell vibrantly like sardines, and to be clear I don't just mean the apartment, I mean the upstairs neighbours are probably like "did we buy the world's least likely air freshener? What were we thinking" and if you think sardines are amazing, then that's delightful, and if you hold the exact opposite opinion, well, you're Joel and I'm lucky to not have been divorced yet.

25% of the sardines found their way into the cats, as was laid out in the ancient covenant, and I poured the rest on top of the mozzarella and wrapped the bread around it, lamenting the Kerrygold we didn't have, and ate it in about 45 seconds while paying the co-op's water bill online.

I occasionally have classy dinners, but today was not a classy dinner day.