Christopher Bennage

not all who wander are lost

Render Action

It’s common for a single web page to include data from many sources. Consider this screenshot from Project Silk. There are four separate items displayed.

The primary concern of the page is displaying a list of vehicles. However, it also displays some statistics and a set of reminders. I labeled the stats and reminders as orthogonal because they are (in a sense) independent of the primary concern. Finally, there is the ambient data of the currently logged-in user. I call this data ambient because we expect it to be present on all the pages in the application.

It’s a common practice in MVC-style applications to map a single controller action to a view. That is, it is the responsibility of a single action to produce everything that is needed to render a particular web page.

The difficulty with this approach is that other pages often need to render the same orthogonal data. Let’s examine the code for the action invoked by /vehicle/list.

public ActionResult List()
{
    AddCountryListToViewBag();

    var vehicles = Using<GetVehicleListForUser>()
        .Execute(CurrentUserId);

    var imminentReminders = Using<GetImminentRemindersForUser>()
        .Execute(CurrentUserId, DateTime.UtcNow);

    var statistics = Using<GetFleetSummaryStatistics>()
        .Execute(CurrentUserId);

    var model = new DashboardViewModel
                    {
                        User = CurrentUser,
                        VehicleListViewModel = new VehicleListViewModel(vehicles),
                        ImminentReminders = imminentReminders,
                        FleetSummaryStatistics = statistics
                    };

    return View(model);
}

Disregarding how you might feel about the Using&lt;T&gt; method for invoking commands and other such details, I want you to focus on the fact that the controller is composing a model. We generate a number of smaller viewmodels and then compose them into an instance of DashboardViewModel. The DashboardViewModel class exists only to tie together the four otherwise independent pieces of data.

Project Silk had separate actions just to serve up JSON:

public JsonResult JsonList()
{
    var list = Using<GetVehicleListForUser>()
        .Execute(CurrentUserId)
        .Select(x => ToJsonVehicleViewModel(x))
        .ToList();

    return Json(list);
}

You’ll notice that both JsonList and List use the same GetVehicleListForUser command for retrieving their data. JsonList also projects the data to a slightly different viewmodel.

Reducing the Code

While reevaluating this code for Project Liike, we decided to employ content negotiation. That is, we wanted a single endpoint, such as /vehicle/list, to return different representations of the data based upon the requested format. If the browser requested JSON, then /vehicle/list should return a list of the vehicles in JSON. If the browser requested markup, then the same endpoint should return HTML.

First, we needed to eliminate the differences between the JSON viewmodel and the HTML viewmodel. Without going deep into details, this wasn’t hard to do. In fact, it revealed that we had some presentation logic in the view that should not have been there. The real problem was that I wanted the action to look more like this:

public ActionResult List()
{
    var vehicles = Using<GetVehicleListForUser>()
        .Execute(CurrentUserId);

    return new ContentTypeAwareResult(vehicles);
}

Only, the view still needed the additional data of statistics and reminders. How should the view get it?

We decided to use RenderAction. RenderAction allows a view to invoke another action and render the results into the current view.

We needed to break out the other concerns into their own actions. For the sake of example, we’ll assume they are both on the VehicleController and named Reminders and Statistics. Each of these actions would be responsible for getting a focused set of data. Then in the (imaginary) view for List we could invoke the actions like so:

// List.cshtml 
<ul>
@foreach (var vehicle in Model)
{
    <li>@vehicle.Name</li>
}
</ul>

<section role="reminders">
@{ Html.RenderAction("Reminders", "Vehicle"); }
</section>

<section role="statistics">
@{ Html.RenderAction("Statistics", "Vehicle"); }
</section>
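The child actions themselves can stay very small. Here is a sketch of what Reminders might look like, reusing the GetImminentRemindersForUser command from the original List action; the [ChildActionOnly] attribute and the partial view name are my assumptions, not code from Silk or Liike:

```csharp
[ChildActionOnly]
public ActionResult Reminders()
{
    // same command the dashboard action used to fetch this data
    var imminentReminders = Using<GetImminentRemindersForUser>()
        .Execute(CurrentUserId, DateTime.UtcNow);

    // renders Views/Vehicle/Reminders.cshtml with a focused model
    return PartialView(imminentReminders);
}
```

Marking it [ChildActionOnly] prevents the action from being requested directly as a page, while still allowing Html.RenderAction to invoke it.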

The value of using RenderAction is that we were able to create very simple actions on our controllers. We were also able to reuse the actions for rendering both markup and JSON.

A secondary benefit is the separation of concerns. For example, because we moved the responsibility of composition from the controller into the view, a designer could now revise the view for /vehicle/list without needing to touch the code. They could remove any of the orthogonal concerns, or even add new ones, without introducing any breaking changes.

The Downside

There are a few caveats with this approach.

First, don’t confuse RenderAction with RenderPartial. RenderAction is for invoking a completely independent action, with its own view and model. RenderPartial simply renders a view based on a model passed to it (generally derived from the main viewmodel).

Secondly, avoid using RenderAction to render a form. It likely won’t work the way you’d expect. This means that any form rendering will need to occur in your primary view.

Thirdly, using RenderAction breaks the model-view-controller pattern. What I mean is that, in MVC, it’s assumed that the view does nothing more than render a model. Controllers invoke a view, and not vice versa. Using RenderAction breaks this rule. Personally, I have no problem breaking the rule when it results in code that is simpler and more easily maintained. Isn’t that the whole point of best practices anyway?

Finding Out When Something Happened in Your Git Repo

Acknowledgment: This is meant to be the Windows equivalent of Anders Janmyr’s excellent post on the subject of finding stuff with Git. Essentially, I’m translating some of Anders’ examples to PowerShell and providing explanations for things that many Windows devs might not be familiar with.

This is the third in a series of posts providing a set of recipes for locating sundry and diverse thingies in a Git repository.

Determining when a file was added, deleted, modified, or renamed

You can include the --diff-filter argument with git log to find commits that include specific operations. For example:

git log --diff-filter=D # deleted
git log --diff-filter=A # added
git log --diff-filter=M # modified
git log --diff-filter=R # renamed

There are additional flags as well; check the documentation. By default, git log just returns the commit id, author, date, and message. When using these filters, I like to include --summary so that the list of operations in the commit is included as well.
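To see this in action, here is a throwaway demo you can run in a POSIX shell (Git Bash works fine); the file name and commit messages are invented for the example:

```shell
# build a disposable repo containing one added and one deleted file
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
echo hello > monkey.js
git add monkey.js
git commit -qm "add monkey"
git rm -q monkey.js
git commit -qm "remove monkey"

# only the deletion commit is listed, and --summary names the deleted file
git log --diff-filter=D --summary --oneline
```

The output is the single “remove monkey” commit followed by a delete mode line for monkey.js; the commit that added the file is filtered out.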

N.B. If you run a git log command and your prompt turns into a colon (:), you are in a pager; simply press q to exit.

I don’t think that you would ever want to return all of the operations of a specific type in the log, however. Instead, you will probably want to find out when a specific file was operated on.

Let’s say that something was deleted and you need to find out when and by whom. You can pass a path to git log, though you’ll need to precede it with -- and a space to disambiguate it from other arguments. Armed with this, and following Anders’ post, you would expect to be able to do this:

git log --diff-filter=D --summary -- /path/to/deleted/file

If you aren’t using PowerShell, this works as expected. I tested it with Git Bash (included with msysgit) and good ol’ cmd as well; both behave correctly.

However, when you attempt this in PowerShell, git complains that the path is an ambiguous argument. I was able to, um, “work around” it by creating an empty placeholder file at the location. Fortunately, Jay Hill heard my anguish on Twitter and dug up this post from Ethan Brown. In a nutshell, PowerShell strips out the --. You can force it to be recognized by wrapping the argument in double quotes:

git log --diff-filter=D --summary "--" /path/to/deleted/file

That works!

I’m guessing that PowerShell considers -- to be an empty argument and therefore something to be ignored. I also assume that when the file actually exists at the path, git is smart enough to recognize the argument as a path. (Indeed, the official documentation says that “paths may need to be prefixed”.)

While we’re here, I also want to point out that you can use wildcards in the path. Perhaps you don’t know the exact path to the file, but you know that it was named monkey.js:

git log --diff-filter=D --summary -- **/monkey.js
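Here is the whole recipe end to end in a POSIX shell, using a throwaway repo with an invented nested path (note the quotes around the wildcard so the shell passes it through to git untouched):

```shell
# disposable repo with a nested monkey.js that later gets deleted
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
mkdir scripts
echo hello > scripts/monkey.js
git add . && git commit -qm "add monkey"
git rm -q scripts/monkey.js && git commit -qm "remove monkey"

# locate the deletion without knowing the full path
git log --diff-filter=D --summary -- '**/monkey.js'
```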

Happy hunting!

Finding Content in Files With Git

Acknowledgment: This is meant to be the Windows equivalent of Anders Janmyr’s excellent post on the subject of finding stuff with Git. Essentially, I’m translating some of Anders’ examples to PowerShell and providing explanations for things that many Windows devs might not be familiar with.

This is the second in a series of posts providing a set of recipes for locating sundry and diverse thingies in a Git repository.

Finding content in files

Let’s say that there are hidden monkeys inside your files and you need to find them. You can search the content of files in a Git repository by using git grep. (For all you Windows devs, grep is a kind of magical pony from Unixland whose special talent is finding things.)

# find all files whose content contains the string 'monkey'
PS:\> git grep monkey

There are several arguments you can pass to grep to modify its behavior. These special arguments make the pony do different tricks.

# return the line number where the match was found
PS:\> git grep -n monkey

# return just the file names
PS:\> git grep -l monkey

# count the number of matches in each file
PS:\> git grep -c monkey
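If you want to watch the pony perform, here is a disposable repo to try the flags against (POSIX shell; the file and its contents are invented):

```shell
# disposable repo with one tracked file containing a single match
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
printf 'zebra\nmonkey\n' > animals.txt
git add animals.txt && git commit -qm "add animals"

git grep -n monkey   # animals.txt:2:monkey
git grep -l monkey   # animals.txt
git grep -c monkey   # animals.txt:1
```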

You can pass an arbitrary number of references after the pattern you’re trying to match. By reference I mean something that’s commit-ish. That is, it can be the id (or SHA) of a commit, the name of a branch, a tag, or one of the special identifiers like HEAD.

# search the master branch, and two commits by id, 
# and also the commit two before the HEAD
PS:\> git grep monkey master d0fb0d 032086 HEAD~2

The SHA is the 40-character id of a commit. We only need enough of the SHA for Git to uniquely identify the commit. Six or eight characters are generally enough.
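You can reproduce the ref-prefixed output on a tiny scale with a throwaway repo (POSIX shell; the branch name keeper is invented):

```shell
# disposable repo; create a second ref pointing at the same commit
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
echo monkey > zoo.txt
git add zoo.txt && git commit -qm "add zoo"
git branch keeper

# each result line is prefixed with the ref it was found in
git grep monkey keeper HEAD
```

This prints keeper:zoo.txt:monkey and HEAD:zoo.txt:monkey, the same ref-and-path prefix pattern you see in the RavenDB example below.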

Here’s an example using the RavenDB repo.

PS:\> git grep -n monkey master f45c08bb8 HEAD~2

master:Raven.Tests/Storage/CreateIndexes.cs:83:         db.PutIndex("monkey", new IndexDefinition { Map = unimportantIndexMap });
master:Raven.Tests/Storage/CreateIndexes.cs:90:         Assert.Equal("monkey", indexNames[1]);
f45c08bb8:Raven.Tests/Storage/CreateIndexes.cs:82:          db.PutIndex("monkey", new IndexDefinition { Map = unimportantIndexMap });
f45c08bb8:Raven.Tests/Storage/CreateIndexes.cs:89:          Assert.Equal("monkey", indexNames[1]);
HEAD~2:Raven.Tests/Storage/CreateIndexes.cs:83:         db.PutIndex("monkey", new IndexDefinition { Map = unimportantIndexMap });
HEAD~2:Raven.Tests/Storage/CreateIndexes.cs:90:         Assert.Equal("monkey", indexNames[1]);

Notice that each line begins with the name of the commit where the match was found. In the example above where we asked for the line numbers, the results were in the pattern:

[commit ref]:[file path]:[line no]:[matching content]

N.B. I had one repository that did not work with git grep. It was because my ‘text’ files were encoded as UTF-16 and git interpreted them as binary. I converted them to UTF-8 and the world became a happy place. Thanks to Keith Dahlby and Adam Dymitruk for helping me figure out the problem.


Finding Files by Name With Git

Acknowledgment: This is meant to be the Windows equivalent of Anders Janmyr’s excellent post on the subject of finding stuff with Git. Essentially, I’m translating some of Anders’ examples to PowerShell and providing explanations for things that many Windows devs might not be familiar with.

This is the first in a series of posts providing a set of recipes for locating sundry and diverse thingies in a Git repository.

Finding files by name

Let’s say that you want to locate all the files in a git repository that contain ‘monkey’ in the file name. (Finding monkeys is a very common task.)

# find all files whose name matches 'monkey'
PS:\> git ls-files | Select-String monkey

This pipes the output of git ls-files into the PowerShell cmdlet Select-String, which filters the output line by line. To better understand what this means, run just git ls-files.

Of course, you can also pass a regular expression to Select-String (that is, if you hate yourself).
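If you happen to be in a POSIX shell instead, plain grep plays the same role; a throwaway repo to illustrate (file names invented):

```shell
# disposable repo with one matching and one non-matching file name
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
mkdir zoo
touch zoo/monkey.js lion.js
git add . && git commit -qm "add animals"

# filter the tracked-file list line by line
git ls-files | grep monkey   # zoo/monkey.js
```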


Next, searching for files with specific content.

On Not Being a Jerk

My interest in making software well is an accident. What I’m really interested in is living life well. Chasing that chimerical beast of software “best practices” is merely a happy side-effect.

To that end, there’s an ancient maxim: ‘know thyself’. Despite over three decades of living with myself, I am often surprised by what I do. Surprised, and many times embarrassed.

For example, last week I complained on Twitter about what I had perceived as the selfish and inconsiderate behavior of some of my fellow employees. It was quickly pointed out to me that I was wrong; that I was completely misinterpreting my observations.

Once I realized my mistake, my immediate thought was “Oh, I don’t want people to think that I’m a jerk. I wish I hadn’t said that”. Shortly afterwards, though, I realized that I had been more concerned about what other people thought than about my real problem. The real problem was that I was a jerk. I had judged people I did not know with only scant evidence. This reminded me of another ancient maxim: “judge not, that ye be not judged”.

Now, here is the surprising conclusion. I’m glad that I stated my faulty opinion out loud, even though it embarrassed me, because it revealed my fault and forced me to correct it. I had to confront my own prejudice and fix it. If I had kept the venom to myself, I would have gone on nursing my prejudice.

My takeaway: it doesn’t matter what people think about me; it matters what I am. It is better for me to surface my flaws and fix them than it is for me to hide them and decay.

Refactoring Relationships

Working with people is a lot like working with code. New relationships are green fields. Over time they become brown fields and (just like code) they require maintenance. I’m sure that everyone reading this can identify some legacy relationships that they would describe as, well, complicated. Just like some legacy code.

Getting Started With JavaScript… Again

I’ve alluded before that I did a large chunk of my development in some form of ECMAScript for the first ten years of my professional life. Now, JavaScript is cool again for the first time. Everyone wants to learn it.

So, like me, you probably already kinda maybe knew JavaScript. But times have changed and now it’s a serious language. How do you get up to speed? Here’s what I did.

Mobile Development: Detecting Devices & Features

Take this post cum grano salis. I’m trying to figure this stuff out, and I’m thinking out loud.

Background

Whenever a browser makes a request, it includes a string identifying itself to the server. We commonly refer to this as the user agent string. This string identifies the browser and the platform and the version and a great deal more such nonsense.

Blocks and Playsets

I’ve recently discovered that I favor blocks over playsets. I’m talking about toys, and of course the canonical example of blocks is Legos. You can build nearly anything with them. They are useful, versatile, and inviting.

Now, the term ‘playset’ warrants a bit more explanation. I don’t mean the large outdoor sets with swings and sandboxes and spring-loaded ponies. No, I’m a child of the 80s and I loved me some Star Wars playsets.

Being a New Kid on the Mobile Block

For the last few weeks I’ve been trying to get a finger on the pulse of mobile web development. I wanted to identify the thought leaders, understand the big questions, and (perhaps most importantly) begin cataloging the practical considerations for building mobile experiences today.

Here’s where I’m at so far…