Notes from Daily Encounters with Technology
 
# Friday, October 17, 2014

I recently received a review copy of another game development book from Packt Publishing: Unity 4 Game Development HOTSHOT by Jate Wittayabundit. The book turned out to be much different from what I expected. Based on its excerpt and table of contents, I was hoping it would focus more on game design and development concepts and general approaches. Instead, it takes a much more hands-on approach. That's not necessarily worse; it just targets a different audience.

Taking that into account, the book still has its flaws. The way it is written makes it almost necessary to repeat the steps from the book in Unity while reading it; otherwise, the book can be really difficult to follow. This works very well for the sections oriented towards configuring the Unity project in its UI. The code-oriented parts left a weaker impression on me: there's simply too much code that needs to be typed into Unity or copied from the completed project. Having all the code printed in the book in both C# and JavaScript makes matters even worse.

Even after doing the projects from the book yourself, you'll often be on your own when generalizing the knowledge gained from them. The book doesn't offer all that many accompanying explanations which would give a broader picture of the concepts being used. Each section is followed by a brief description of what has been done, and there are a couple of appendices at the end which can be used as reference material. For the rest, the reader will have to follow many links to the official documentation or find a different source. It also bothers me that I've encountered a couple of technical inaccuracies in the book.

Having said all that, I can still recommend it if you already know your way around Unity and would like to learn a couple of very specific, more advanced techniques. I'm pretty sure you'll be able to put what you've learned to use in your own projects. On the other hand, if you're expecting to gain some general knowledge about 2D and 3D animation, AI and shaders, as I did, you'll most likely end up disappointed.

If you still find the book interesting, you can buy it from Amazon or directly from the publisher.

Friday, October 17, 2014 9:37:52 PM (Central European Daylight Time, UTC+02:00)
Development | Unity | Personal | Reviews
# Monday, October 13, 2014

Since I first installed ConEmu after reading Scott Hanselman's excellent blog post about it, I've been using it for all my console needs. Even the command prompt improvements in Windows 10 haven't really excited me all that much. There was an exception, though: I had never really made the effort to make Git work in my system command prompt. SourceTree seems to cover most of my needs when interacting with Git, and when I really needed the command line, I settled for its embedded terminal support. Now that I've finally decided to take care of this, I've collected all the steps I took in this blog post for future reference. I assume it might prove useful to others as well.

Since SourceTree comes with its own embedded installation of Git, the first step was actually installing standalone Git. I kept most of the default options in the installer, except for the PATH variable - I wanted Git to be included in the path so that I could use it from everywhere.

(Screenshot: the "Use Git from the Windows Command Prompt" option in the installer)

This was enough to get Git working in its most basic form, but I really wanted to take it further. As an avid PowerShell user, the next logical step was posh-git, to get some prompt and tab completion goodness. The simplest way of installing it is with PsGet. Even if you've never used the latter before, you only need to execute the following commands:

# download and install PsGet, then use it to install the posh-git module
(new-object Net.WebClient).DownloadString("http://psget.net/GetPsGet.ps1") | iex
Install-Module posh-git
. $PROFILE

The last command reinitializes your PowerShell session with the updated profile. After doing that, posh-git complained:

WARNING: Could not find ssh-agent

This happened because during Git installation I decided against polluting my path by adding c:\Program Files (x86)\Git\bin to it. Fortunately, there's an easy way to get around that. Just add the following two lines to your PowerShell profile file (type "notepad $PROFILE" to edit it):

Set-Alias ssh-agent "${env:ProgramFiles(x86)}\git\bin\ssh-agent.exe"
Set-Alias ssh-add "${env:ProgramFiles(x86)}\git\bin\ssh-add.exe"

After saving the changed file and reinitializing the session again (". $PROFILE", remember?), the error was gone.

There was one last thing I wanted to fix: Git kept prompting me for my password whenever I tried to interact with my HTTPS remotes. Windows Credential Store for Git can help with that. I just downloaded it, executed it, and it worked. The next time Git required my password to access a remote, the following popup was displayed instead of the usual prompt in the console:

(Screenshot: the "Enter Git Credentials" popup)

This was the last time I had to enter my credentials. They are now safely stored in the Windows Credential Store and retrieved from there whenever Git communicates with my remotes. Now I'm truly satisfied with my setup.
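As a quick sanity check (entirely optional), you can verify that the tool registered itself as a Git credential helper in the global Git configuration - at least that's what it did on my machine:

git config --global --get credential.helper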

Thanks to Phil Haack, brianary, and Joshua Gall for their posts, which were the source for all of the above.

Monday, October 13, 2014 6:39:07 AM (Central European Daylight Time, UTC+02:00)
Software | Git
# Monday, October 6, 2014

Windows Workflow Foundation was primarily designed for long-running workflows. Typically, their execution stops at certain points and waits for an external trigger before resuming. Since we don't want the workflow instance to remain in memory during all this time, support for persisting workflow instances is an important part of the framework. It is also easily extensible: additional data can be saved when a workflow is persisted. There are some things to be aware of when consistency between the persisted workflow and the extra data needs to be ensured. I'll discuss them in the scenario of a custom workflow host.

Different types of instance stores are supported, but only the SQL Server instance store is provided by Microsoft; for others you'll need to turn to third-party vendors, such as DevExpress and Devart. To enable persistence, a database of course needs to be prepared first. For SQL Server, the database schema scripts are distributed with the .NET framework and can be found in c:\Windows\Microsoft.NET\Framework\v4.0.30319\SQL\en. Only a subset of the scripts in the folder is required for workflow instance persistence in .NET 4.5; they need to be run in this exact order: SqlWorkflowInstanceStoreSchema.sql, SqlWorkflowInstanceStoreLogic.sql, SqlWorkflowInstanceStoreSchemaUpgrade.sql.
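If you'd rather script this than run the files from SQL Server Management Studio, something along these lines should work - a sketch assuming a local default SQL Server instance, an already created database named WorkflowInstanceStore, and the scripts folder as the working directory:

sqlcmd -S . -d WorkflowInstanceStore -i "SqlWorkflowInstanceStoreSchema.sql"
sqlcmd -S . -d WorkflowInstanceStore -i "SqlWorkflowInstanceStoreLogic.sql"
sqlcmd -S . -d WorkflowInstanceStore -i "SqlWorkflowInstanceStoreSchemaUpgrade.sql"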

To support persistence when hosting workflows in your own process, the WorkflowApplication class needs to be used and set up as follows:

Exception exception = null;
var handle = new AutoResetEvent(false);

var workflowApplication = new WorkflowApplication(workflowDefinition, inputs);

// unload the workflow upon persisting
workflowApplication.PersistableIdle = args => PersistableIdleAction.Unload;
workflowApplication.Unloaded = args => handle.Set();

workflowApplication.Aborted = args =>
{
    exception = args.Reason;
    handle.Set();
};

// setup the instance store - the schema scripts must already be run
workflowApplication.InstanceStore = new SqlWorkflowInstanceStore(ConnectionString);

// start the workflow
workflowApplication.Run();

// wait for the workflow to be unloaded or aborted 
handle.WaitOne();

Notice the use of AutoResetEvent for synchronizing the running workflow with the main thread. When the workflow is aborted, the exception which caused that is included in WorkflowApplicationAbortedEventArgs. I'm storing it in a local variable so that I can inspect it later.

To trigger the persistence, the workflow definition must include an activity which creates a bookmark, i.e. pauses the execution. Although there are some built-in activities like that, I decided to use my own trivial implementation:

public class ActivityWithBookmark : NativeActivity
{
    // the activity must declare that it can make the workflow idle
    protected override bool CanInduceIdle
    {
        get { return true; }
    }

    protected override void Execute(NativeActivityContext context)
    {
        // creating a bookmark pauses the workflow, making it persistable
        context.CreateBookmark();
    }
}

Here's the workflow definition and inputs that can be used with the above sample:

var workflowDefinition = new Sequence
{
    Activities =
    {
        new ActivityWithBookmark()
    }
};

var inputs = new Dictionary<string, object>();
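For completeness, here's a minimal sketch of how a persisted and unloaded instance could later be resumed. It assumes two small changes to the sample: the bookmark gets a name (e.g. context.CreateBookmark("Resume") instead of the parameterless call above, since an unnamed bookmark can't be resumed by name), and the value of workflowApplication.Id is stored somewhere before the instance is unloaded:

// create a new host for the same workflow definition
var resumedApplication = new WorkflowApplication(workflowDefinition)
{
    InstanceStore = new SqlWorkflowInstanceStore(ConnectionString)
};
resumedApplication.Completed = args => handle.Set();

// load the persisted state and resume execution at the named bookmark
resumedApplication.Load(workflowInstanceId); // the Id saved before unloading
resumedApplication.ResumeBookmark("Resume", null);

handle.WaitOne();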

Now that we've got basic persistence working, it's time to extend it with our own custom code. For this purpose we need to create a workflow instance extension deriving from the PersistenceIOParticipant base class:

public class PersistenceExtension : PersistenceIOParticipant
{
    public PersistenceExtension() : 
        base(true, false) // make save transactional, load non-transactional
    { }

    protected override IAsyncResult BeginOnSave(IDictionary<XName, object> readWriteValues, 
        IDictionary<XName, object> writeOnlyValues, TimeSpan timeout, AsyncCallback callback, object state)
    {
        // save custom data here
        return base.BeginOnSave(readWriteValues, writeOnlyValues, timeout, callback, state);
    }

    protected override void Abort()
    { }
}

The extension needs to be added to WorkflowApplication before running the workflow:

workflowApplication.Extensions.Add(new PersistenceExtension());

Now, our BeginOnSave override will be called just after the built-in persistence into the instance store is completed. The PersistenceIOParticipant constructor parameters allow us to specify whether we want the save and load operations to be included in the transaction, respectively.

Requiring the save transaction as in our case will cause the transaction in the instance store to remain open for the duration of our BeginOnSave method. Not only that: our code is going to run inside the same TransactionScope, therefore by default any database connection opened in this method will join this same transaction. Since there's no documented way of reusing the instance store's connection, the transaction will automatically be promoted to a distributed transaction. If that's okay with you, there's nothing more to do; otherwise you might want to wrap your database access code in a new TransactionScope, suppressing the ambient transaction:

protected override IAsyncResult BeginOnSave(IDictionary<XName, object> readWriteValues, 
    IDictionary<XName, object> writeOnlyValues, TimeSpan timeout, AsyncCallback callback, object state)
{
    using (new TransactionScope(TransactionScopeOption.Suppress))
    {
        // save custom data here
    }
    return base.BeginOnSave(readWriteValues, writeOnlyValues, timeout, callback, state);
}

Don't worry, you can still ensure transactional consistency, even if you do this. If anything goes wrong in your code and you want to roll back the ambient transaction, just throw an exception.

As long as you're using SQL Server 2008 or later, this is all you need to know (and this should be the case, since SQL Server 2005 is already in its extended support period). If you're stuck with an older version, you'll either need to use distributed transactions or give up transactional consistency. TransactionScope works differently with SQL Server 2005, therefore the transaction will already need to be distributed before your code in BeginOnSave is called at all. The only way to avoid that is not to require the save operation to be included in the transaction (in the PersistenceIOParticipant constructor). If you do this, the built-in persistence to the instance store will already be committed when BeginOnSave is called, so there's no way to roll it back. It's your choice.

Monday, October 6, 2014 6:53:24 AM (Central European Daylight Time, UTC+02:00)
Development | .NET | WF4
# Monday, September 29, 2014

How often does it happen that you need to write "too much" supporting code for your unit tests? Let's take a look at the following example:

[Test]
public void CompareDtoAndBusinessClassesBefore()
{
  // arrange
  UserEntity[] expectedUsers = new[]
    {
      new UserEntity(1, "Dennis", "Doomen", "ddoomen"),
      new UserEntity(2, "Charlie", "Poole", "cpoole")
    };
  var repository = new Repository();
  repository.Users.AddRange(expectedUsers);

  // act
  var adminService = new AdminService(repository);
  IEnumerable<UserDto> actualUsers = adminService.GetAllUsers();

  // assert
  var sortedExpectedUsers = expectedUsers.OrderBy(user => user.Username);
  var sortedActualUsers = actualUsers.OrderBy(user => user.Username);
  CollectionAssert.AreEqual(sortedExpectedUsers, sortedActualUsers, new UserComparer());
}

It represents a typical scenario when testing service-oriented applications. Notice the two different classes describing users:

  • UserDto is exposed to the consumers of the service
  • UserEntity is used inside the application and will usually have more properties than the former.

Both of them implement a common interface IUser to make testing easier and maybe even enable some code sharing elsewhere in the project.
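The interface itself isn't shown here, but judging by the constructor arguments in the test above, it presumably looks something like this (an assumed sketch, not actual code from the project):

// assumed shape of the shared interface, inferred from the
// constructor arguments used in the test above
public interface IUser
{
    int Id { get; }
    string FirstName { get; }
    string LastName { get; }
    string Username { get; }
}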

While the arrange and act parts could hardly be more concise, there is a lot of plumbing code in the assert part of the test. UserComparer probably won't be used outside the tests, but is required for comparing instances of IUser, whether they are of the same type or not. On top of that, we are sorting both collections because the results are not ordered. We should be using CollectionAssert.AreEquivalent instead of CollectionAssert.AreEqual here, but there's no overload accepting an IComparer available, so we had to find a different solution.
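For illustration, such a comparer might look something like this (a hypothetical sketch based on the assumed IUser interface above; NUnit only checks the result against zero, so any non-zero value is good enough to signal a mismatch):

// hypothetical plumbing code the test above depends on
// (requires using System.Collections;)
public class UserComparer : IComparer
{
    public int Compare(object x, object y)
    {
        var left = (IUser)x;
        var right = (IUser)y;

        // compare property by property, regardless of the concrete types
        bool equal = left.Id == right.Id &&
                     left.FirstName == right.FirstName &&
                     left.LastName == right.LastName &&
                     left.Username == right.Username;

        return equal ? 0 : 1;
    }
}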

The above sample is using the NUnit test framework, but the test wouldn't turn out much differently if we were using any other test framework. The additional code we need to write has many disadvantages: it means extra work, there can be bugs in the plumbing code itself, and the tests are more difficult to understand.

While trying to simplify a particularly nasty case of the scenario described above, I stumbled across the FluentAssertions NuGet package and quickly grew to like it. This is how the above test could look when using its assertions:

[Test]
public void CompareDtoAndBusinessClassesAfter()
{
  // arrange
  UserEntity[] expectedUsers = new[]
    {
      new UserEntity(1, "Dennis", "Doomen", "ddoomen"),
      new UserEntity(2, "Charlie", "Poole", "cpoole")
    };
  var repository = new Repository();
  repository.Users.AddRange(expectedUsers);

  // act
  var adminService = new AdminService(repository);
  IEnumerable<UserDto> actualUsers = adminService.GetAllUsers();

  // assert
  actualUsers.ShouldAllBeEquivalentTo(expectedUsers);
}

Of course, the arrange and act parts haven't changed. The assert part shrunk down to a single line with no supporting code whatsoever. Although the assertion is named clearly enough, you still need to know the following to understand the test fully:

  • The equivalency of both collections is asserted, i.e. the order is not important, hence there's no sorting required.
  • The items in both collections are compared property by property, based on their names; therefore the IUser interface is not required any more, as long as the properties have the same names.

There's another advantage in using FluentAssertions that can't be noticed by just looking at the test code. If we introduce a bug in the service method, this is the output of the first test:

  Expected is <System.Linq.OrderedEnumerable`2
[FluentAssertionsForCollections.UserEntity,System.String]>,
actual is <System.Collections.Generic.List`1
[FluentAssertionsForCollections.UserDto]> with 2 elements
  Values differ at index [0]
  Expected: <FluentAssertionsForCollections.UserEntity>
  But was:  <FluentAssertionsForCollections.UserDto>

Yes, we could improve it by overriding ToString() in both classes, but that would mean even more supporting code for the sake of the tests, which we want to avoid.

Here's the output of the second test:

Expected item[0].Id to be 1, but found 2.
Expected item[1].Id to be 2, but found 3.

With configuration:
- Include only the declared properties
- Match property by name (or throw)

There is still room for improvement, but unlike the first case, it's quite obvious how the two collections differ. This is just another argument in favor of FluentAssertions.

Still, I've just scratched the surface. The library includes a huge collection of assertions which should have you covered once you get to know them all. If not, it's really easy to customize the behavior of many of them with little or no supporting code.
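For example, excluding a single property from the comparison takes just one lambda. A quick sketch using the library's equivalency options (picking the Id property is an arbitrary choice for illustration):

// compare the collections property by property, ignoring Id
actualUsers.ShouldAllBeEquivalentTo(expectedUsers,
    options => options.Excluding(user => user.Id));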

There's really no reason not to give it a try. It seems to work with all unit test frameworks currently available. You never know, it might have a big impact on the way you're writing unit tests.

Monday, September 29, 2014 6:34:26 AM (Central European Daylight Time, UTC+02:00)
Development | .NET | Testing
# Monday, September 22, 2014

On Thursday a new season of Slovenian Developers User Group meetings started in the new premises of Microsoft Slovenia. After an introduction of the new community portal, I started the meeting with a session about NuGet basics. I focused on the advantages it brings to the process of managing project references, but also gave a short tutorial on how to create and publish a new NuGet package. I concluded the session by taking a broader look at the NuGet ecosystem and the tools improving it and taking advantage of it.

As always, here are my slides from the session. I didn't have any sample source code worth downloading this time.

If you want to learn more about the subject, I recommend reading my book. It gives a more detailed look at most of the topics presented in my session.

Monday, September 22, 2014 6:35:44 AM (Central European Daylight Time, UTC+02:00)
Development | NuGet | Downloads | Presentations