Using Xamarin Forms with .NET Standard

July 9, 2016

With the release of .NET Core and the .NET Standard Library last week, many people want to know how they can use packages targeting netstandard1.x with their Xamarin projects. It is possible today if you use Visual Studio; for Xamarin Studio users, support is coming soon.

Prerequisites

Using .NET Standard pretty much requires you to use project.json to eliminate the pain of “lots of packages” as well as properly handle transitive dependencies. While you may be able to use .NET Standard without project.json, I wouldn’t recommend it.

You’ll need the latest tools: Visual Studio 2015 Update 3 and up-to-date NuGet tooling.

Getting Started

As of now, the project templates for creating a new Xamarin Forms project start with an older-style packages.config template, so whether you create a new project or have an existing project, the steps will be pretty much the same.

Step 1: Convert your projects to project.json following the steps in my previous blog post.

Step 2: As part of this, you can remove dependencies from your “head” projects that are already referenced by the projects they reference. This should simplify things dramatically for most solutions. In the future, when you want to update to the next Xamarin Forms version, you can update it in one place, not 3-4 places. It also means you only need the main Xamarin.Forms package, not each of the packages it pulls in.

If you hit any issues with binaries not showing up in your bin directories (for your Android and iOS “head” projects), make sure that you have set CopyNuGetImplementations to true in your csproj as per the steps in the post.
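For reference, the property from that post goes in the head project’s csproj and looks like this:

```xml
<PropertyGroup>
  <!-- Tell NuGet to copy all required implementation assemblies to the output directory -->
  <CopyNuGetImplementations>true</CopyNuGetImplementations>
</PropertyGroup>
```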

At this point, your project should be compiling and working, but not yet using netstandard1.x anywhere.

Step 3: In your Portable Class Library projects, find the highest .NET Standard version you need/want to support.

Here’s a cheat sheet:

  • If you only want to support iOS and Android, you can use .NET Standard 1.6. In practice, though, most features are currently available at .NET Standard 1.3 and up.
  • If you want to support iOS, Android and UWP, then .NET Standard 1.4 is the highest you can use.
  • If you want to support Windows Phone App 8.1 and Windows 8.1, then .NET Standard 1.2 is your target.
  • If you’re still supporting Windows 8, .NET Standard 1.1 is for you.
  • Finally, if you need to support Windows Phone 8 Silverlight, then .NET Standard 1.0 is your only option.

Once you determine the netstandard version you want, in your PCL’s project.json, change what you might have had:

{
    "dependencies": {
        "Xamarin.Forms": "2.3.0.107"        
    },
    "frameworks": {        
        ".NETPortable,Version=v4.5,Profile=Profile111": { }
    },
    "supports": { }
}

to

{
    "dependencies": {
        "NETStandard.Library": "1.6.0",
        "Xamarin.Forms": "2.3.0.107"        
    },
    "frameworks": {        
        "netstandard1.4": {
            "imports": [ "portable-net45+wpa81+wp8+win8" ]
         }
    },
    "supports": { }
}

Note the addition of the imports section. This is required to tell NuGet that the specified TFM is compatible here because the Xamarin.Forms package has not yet been updated to target netstandard directly.

Then, edit the csproj to set the TargetFrameworkVersion element to v5.0 and remove any value from the TargetFrameworkProfile element.
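The csproj change looks roughly like this (the Profile value shown is just an example; yours will match whatever your PCL targeted before):

```xml
<!-- Before -->
<TargetFrameworkVersion>v4.5</TargetFrameworkVersion>
<TargetFrameworkProfile>Profile111</TargetFrameworkProfile>

<!-- After -->
<TargetFrameworkVersion>v5.0</TargetFrameworkVersion>
<TargetFrameworkProfile />
```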

At this point, when you reload the project, it should restore the packages and build correctly. You may need to do a full clean/rebuild.

Seeing it in action

I created a sample solution showing all of this working over on GitHub. It’s a good idea to clone, build and run it to ensure your environment and tooling are up-to-date. If you get stuck converting your own projects, I’d recommend referring back to that repo to find the differences.

As always, feel free to tweet me @onovotny as well.

Portable- is dead, long live NetStandard

June 23, 2016

With the RC of NuGet 2.12 for VS 2012/2013, and imminent release of .NET Core on Monday the 27th, it’s time to bid farewell to our beloved/cursed PCL profiles. So long, you won’t be missed! Oh, and dotnet, please don’t let the door hit you on the way out either.

In its place we join the new world of the .NET Platform Standard and its new moniker, netstandard.

When dotnet was released last July, there was a lot of confusion around what it is and how it worked. Working with the NuGet and CoreFX teams, I tried to explain it in a few of my previous posts. Despite good intentions, dotnet was a constant frustration to many library authors due to its design and limited support. dotnet only worked with NuGet v3, which meant that packages would need to ship a dotnet version and a version in a PCL directory like portable-net45+win8+wp8+wpa81 to support VS 2012/2013.

It was hard to fault anyone for wondering, “why bother?” The other downfall of dotnet, and likely the main one, was that dotnet fell into a mess trying to work with different compatibility levels of libraries. What if you wanted to have a version that worked with newer packages than were supported by Windows 8? What if you wanted to have multiple versions which “light up” based on platform capabilities? How many of you who installed a dotnet-based package, with its dependencies listed, saw a System.Runtime 4.0.10 entry and wondered why you were getting errors trying to update it? After all, NuGet showed an update to 4.0.20, why wouldn’t that work? The reality was that you had to release packages with multiple versions that were incompatible with some platforms because there was no way to put all of it into one package.

Enter .NET Platform Standard

netstandard fixes the shortcomings of dotnet by being versioned. As of today, there’s 1.0 through 1.6, with TFMs netstandard1.0 through netstandard1.6. The idea is that each .NET Platform Standard version supports a given set of platforms, and when authoring a library, you’d ideally target the lowest version that has the features you need and runs on your desired target platforms.

With NuGet 2.12, netstandard is also supported in VS 2012 and 2013, so that there’s no further need to include a portable-* version and a netstandard version in the same package. Finally!

What does this all mean

The short answer is that if you currently have a Profile 259 PCL today, you can change your NuGet package to put it in a netstandard1.0 directory and add the appropriate dependency group. The full list of profile -> netstandard mappings is here. If you support .NET 4.0 or Silverlight 5 in your PCL (basically a PCL “older” than 259), then you can continue to put that in your NuGet alongside the netstandard1.0+ version and things will continue to work. In addition, a platform-specific TFM (like net45) will always “win” over netstandard when both are compatible.
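In the nuspec, that move can be as simple as changing the target folder; the assembly name and paths here are hypothetical:

```xml
<files>
  <!-- Was: target="lib\portable-net45+win8+wp8+wpa81" (Profile 259) -->
  <file src="bin\Release\MyLibrary.dll" target="lib\netstandard1.0" />
</files>
```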

Detour: Dependencies for NetStandard

Like dotnet, netstandard requires listing package dependencies. For the most part, it’s easier than with dotnet as there is a meta-package, NETStandard.Library 1.6.0 (the RC2 version is 1.5.0-rc2-24027), that has most BCL dependencies included. This package is one that you probably have in your project.json today. Put the NETStandard.Library 1.6.0 dependency in a netstandard1.0 dependency group, along with any other top-level dependencies you have:

<dependencies>
  <group targetFramework="netstandard1.0">
    <dependency id="NETStandard.Library" version="1.6.0" />
    <dependency id="System.Linq.Queryable" version="4.0.1" />
  </group>
</dependencies>

Next steps

There’s about to be a lot to do over the coming weeks:

  • Get the .NET Core 1.0 RTM with the Preview 2 tooling
  • Download VS 2015 Update 3 when available
  • Grab NuGet 2.12 for VS 2012/2013

Start updating your packages to support .NET Core if you haven’t already. If you’ve been waiting for .NET Core RTM, that time has finally come. You can either use a csproj-based portable library targeting netstandard or use xproj with project.json. As an aside, my personal opinion is that xproj‘s main advantage over csproj today is cross-compiling. If you need to compile for multiple targets, that’s currently the best option; otherwise, you’ll get a better experience using the csproj approach.

Converting existing PCL projects

VS 2015 Update 3 has a new feature that makes it easy to convert a PCL to netstandard. If your library has any NuGet dependencies installed, you need to first follow the steps to switch packages.config to project.json. Once you do that, you can select Target .NET Platform Standard from the project properties and it’ll convert it over:
Project Properties

Icing on the cake

Xamarin supports all versions of netstandard as well. This means that if you were cross-compiling because PCL 259 was too limiting, try taking another look at netstandard1.3+. There’s a lot more surface area there and it may mean you can eliminate a few target platforms.

Bonus Round

If you want to use xproj for its cross-compiling features and also need to reference that library from a project type that doesn’t support csproj -> xproj references today (most of them, including UWP, don’t work well), I’ve written up an explanation of how you can do this on StackOverflow.

Project.json all the things

February 8, 2016

One of the lesser-known features of Visual Studio 2015 is that it is possible to use project.json with any project type, not just “modern” PCLs, UWP projects, or xproj projects. Read on to learn why you’d want to switch and how you can update your existing solution.

Background

Since the beginning of NuGet, installed packages were tracked in a file named packages.config placed alongside the project file. The package installation process goes something like this:

  1. Determine the full list of packages to install, walking the tree of all dependent packages
  2. Download all of those packages to a \packages directory alongside your solution file
  3. Update your project file with the correct libraries from the package (looking at \lib\<TFM>)
    • If the package contains a build directory, add any appropriate props or targets files found
  4. Create or update a packages.config file alongside the project that lists each package along with the current target framework
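The resulting packages.config looks something like this (the package ids and versions are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Newtonsoft.Json" version="8.0.3" targetFramework="net452" />
  <package id="Rx-Main" version="2.2.5" targetFramework="net452" />
</packages>
```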

Terms

  • TFM – Target Framework Moniker. The name that represents a specific Platform (platforms being .NET Framework 4.6, MonoTouch, UWP, etc.)
  • Short Moniker – a short way of referring to a TFM in a NuGet file (e.g., net46). Full list is here.
  • Full Moniker – a longer way of specifying the TFM (e.g., .NETPortable,Version=v4.5,Profile=Profile111). Easiest way to determine this is to compile and let the NuGet error message tell you what to add (see below).

Limitations

The above steps are roughly the same for NuGet up to and including the 2.x series. While this works for basic projects, larger, more complex projects quickly run into issues. I do not consider the raw number of packages that a project has to be an issue by itself – that is merely showing oodles of reuse and componentization of packages into small functional units. What does become an issue are the UI and the time it takes to update everything.

As mentioned, because NuGet modifies the project file with the relative location of the references, every time you update, it has to edit the project file. This is slow and can lead to merge conflicts across branches.

Furthermore, the system is unable to pivot on different compile-time needs. With many projects needing to provide some native support, NuGet v2.0 had no way of providing different dependencies based on build configuration.

One more issue surfaces with the use of “bait and switch” PCLs. Some packages provide a PCL for reference purpose (the bait), and then also provide platform-specific implementations that have the same external surface area (the switch). This enables libraries to take advantage of platform specific functionality that’s not available in a portable class library alone. The catch with these packages is that to function correctly in a multi-project solution containing a PCL and an application, the application must also add a NuGet reference to all of the packages its PCL libraries use to ensure that the platform-specific version winds up in the output directory. If you forget, you’ll likely get a runtime error due to an incomplete reference assembly being used.

NuGet v3 and Project.json to the rescue

NuGet 3.x introduces a number of new features aimed at addressing the above limitations:

  • Project files are no longer modified to contain the library location. Instead, an MSBuild task and target gets auto-included by the build system. This task creates references and content-file items at build time enabling the meta-data values to be calculated and not baked into a project file.
    • Per-platform files can exist by using the runtimes directories. See the native light-up section in the docs for the details.
  • Packages are now stored in a per-user cache instead of alongside the solution. This means that common packages do not have to be re-downloaded since they’ll already be present on your machine. Very handy for those packages you use in many different solutions. The MSBuild task enables this as the location is no longer baked into the project file.
  • Reference assemblies are now more formalized with a new ref top-level directory. This would be the “bait” assembly, one that could target a wide range of frameworks via either a portable- or dotnet or netstandard TFM. The implementation library would then reside in \lib\TFM. The version in the ref directory would be used as the compile-time reference while the version in the lib directory is placed in the output location.
  • Transitive references. This is a biggie. Now only the top-level packages you require are listed. The full chain of packages is still downloaded (to the shared per-user cache), but it’s hidden in the tooling and doesn’t get in your way. You can continue to focus on the packages you care about. This also works with project-to-project references. If I have a bait-and-switch package reference in my portable project, and I have an application that references that portable library, the full package list will be evaluated for output in the application and the per-architecture, per-platform assemblies will get put in the output directories. You no longer have to reference each package again in the application.
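To visualize the ref/lib split described above, a bait-and-switch package might be laid out like this (the package and assembly names are hypothetical):

```
MyPackage.nupkg
  ref/
    netstandard1.0/
      MyLibrary.dll      <- reference ("bait") assembly, used at compile time
  lib/
    net45/
      MyLibrary.dll      <- desktop implementation, copied to the output directory
    xamarinios10/
      MyLibrary.dll      <- iOS implementation ("switch")
```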

It is important to note that these features only work when a project is using the new project.json format of package management. Having NuGet v3 alone isn’t enough. The good news is that we can use project.json in any project type with a few manual steps.

Using project.json in your current solution

You can use project.json in your current solution. There are a couple of small caveats here:

  1. Only Visual Studio 2015 with Update 1 currently supports project.json. Xamarin Studio does not yet support it but it is planned. That said, Xamarin projects in Visual Studio do support project.json.
    • If you’re using TFS Team Build, you need TFS 2015 Update 1 on the build agent in addition to VS 2015 Update 1.
  2. Some packages that rely on content files being placed into the project may not work correctly. project.json has a different mechanism for this, so the package would need to be updated. The workaround would be to manually copy the content into your project file.
  3. All projects in your solution would need to be updated for the transitive references to resolve correctly. That’s to say that an application using NuGet v2/packages.config won’t pull in the correct transitive references of a portable project reference that’s using project.json.

With that out of the way, let’s get started. If you’d like to skip ahead and see some examples, look at projects that have already been converted over, such as Zeroconf and xUnit for Devices (both referenced later in this post). These are all libraries with a combination of reference assemblies, platform-specific implementations, test applications and unit tests, so the spectrum of scenarios should be covered there.

One last note before diving deep: make sure your .gitignore file contains the following entries:

  • *.lock.json
  • *.nuget.props
  • *.nuget.targets

These files should not generally be checked in. In particular, the .nuget.props/targets files will contain a per-user path to the NuGet cache. These files are created by calling NuGet restore on your solution file.

Diving deep

As you start, have the following blank project.json handy as you’ll need it later:

{
    "dependencies": {        
    },
    "frameworks": {        
        "net452": { }
    },
    "runtimes": {
        "win": { }
    } 
}

This represents an empty project.json for a project targeting .NET 4.5.2. I’m using the short moniker here, but you can also use the full one. This string is where you’ll likely hit the most trouble. Fortunately, when you get it wrong and try to build, you’ll get what’s probably the most helpful error message of all time:

Your project is not referencing the “.NETPortable,Version=v4.5,Profile=Profile111” framework. Add a reference to “.NETPortable,Version=v4.5,Profile=Profile111” in the “frameworks” section of your project.json, and then re-run NuGet restore.

The error literally tells you how to fix it. Awesome! The fix is to put .NETPortable,Version=v4.5,Profile=Profile111 in your frameworks section to wind up with something like:

{
    "dependencies": {        
    },
    "frameworks": {        
        ".NETPortable,Version=v4.5,Profile=Profile111": { }
    },
    "supports": { }
}

The eagle-eyed reader will notice that the first example had a runtimes section with win in it. This is required for desktop .NET Framework projects and for projects where CopyNuGetImplementations is set to true, like your application (we’ll come back to that in a bit), but is not required for other library project types. If you have the runtimes section, there’s rarely, if ever, a reason to also have the supports section.

The easiest way to think about this:

  • For library projects, use supports and not runtimes
  • For your application project, (.exe, .apk, .appx, .ipa, website) use runtimes and not supports
  • If it’s a desktop .NET Framework project, use runtimes for both class libraries and your application
  • If it’s a unit test library executing in-place and you need references copied to its output directory, use runtimes and not supports

Now, take note of any packages with the versions that you already have installed. You might want to copy/paste your packages.config file into a temporary editor window.

The next step is to remove all of your existing packages from your project. There are two ways to do this: via the NuGet package manager console or by hand.

Using the NuGet Package Manager Console

Pull up the NuGet Package Manager Console and ensure the drop-down is set to the project you’re working on. For each package in the project, uninstall each package with the following command:
Uninstall-Package <package name> -Force -RemoveDependencies
Repeat this for each package until they’re all gone.

By Hand

Delete your packages.config file, save the project file then right-click the project and choose “Unload project”. Now right-click the project and select Edit. We need to clean up a few things in the project file.

  • At the top of the project file, remove any .props files that were added by NuGet (look for the ones pointing to a \packages directory).
  • Find any <Reference> element where the HintPath points to a NuGet package library. Remove all of them.
  • At the bottom of the file, remove any .targets files that NuGet added. Also remove any NuGet targets or Tasks that NuGet added (might be a target that starts with the following line <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">).
  • If you have any packages that contain Roslyn Analyzers, make sure to remove any analyzer items that come from them.
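For reference, the NuGet-generated target you’re removing typically looks like this (the package name and path are illustrative):

```xml
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
  <PropertyGroup>
    <ErrorText>This project references NuGet package(s) that are missing on this computer. Enable NuGet Package Restore to download them.</ErrorText>
  </PropertyGroup>
  <Error Condition="!Exists('..\packages\SomePackage.1.0.0\build\SomePackage.targets')"
         Text="$([System.String]::Format('$(ErrorText)', '..\packages\SomePackage.1.0.0\build\SomePackage.targets'))" />
</Target>
```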

Save your changes, right click the project in the solution explorer and reload the project.

Adding the project.json

In your project, add a new blank project.json file using one of the templates above. Ensure that the Build Action is set to None (it should be by default). Once the file is present, you may need to save, unload and reload the project for NuGet to recognize it.

Now you can either use the Manage NuGet Packages UI to re-add your packages or add them to the project.json by hand. Remember, you don’t necessarily have to re-add every package, only the top-level ones. For example, if you use Reactive Extensions, you only need Rx-Main, not the four other packages that it pulls in.

Build your project. If there are any errors related to NuGet, the error messages should guide you to the answer. Your project should build.

What you’ll notice for projects other than desktop .NET executables or UWP appx’s is that the output directory will no longer contain every referenced library. This saves disk space and speeds up the build by eliminating extra file copying. If you do want the files in the output directory (for unit test libraries that need to execute in-place, or for the application itself), there are two extra steps to take:

  1. Unload the project once more and edit it to add the following to the first <PropertyGroup> at the top of the project file: <CopyNuGetImplementations>true</CopyNuGetImplementations>. This tells NuGet to copy all required implementation files to the output directory.
  2. Save and reload the project file. You’ll next need to add that runtimes section from above. The exact contents will depend on your project type. Rather than list them all out here, please see the Zeroconf or xUnit for Devices repos for the full examples.
    • For an AnyCPU Desktop .NET project win is sufficient
    • For Windows Store projects, you’ll need more
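As a sketch, a Windows Store (UWP) application’s project.json might end up looking like this; uap10.0 and the win10-* runtime identifiers are the usual values for store projects, but the dependency version shown is illustrative, so verify against the repos above:

```json
{
    "dependencies": {
        "Microsoft.NETCore.UniversalWindowsPlatform": "5.2.2"
    },
    "frameworks": {
        "uap10.0": { }
    },
    "runtimes": {
        "win10-x86": { },
        "win10-x64": { },
        "win10-arm": { }
    }
}
```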

Once you repeat this for all of your projects, you’ll hopefully still have a working build(!) but now one where the projects are using the rich NuGet v3 capabilities. If you have a CI build system, you need to ensure that you’re using the latest nuget.exe to call restore on your solution prior to build. My preference is to always download the latest stable version from the dist link here: https://dist.nuget.org/win-x86-commandline/latest/nuget.exe.

Edge Cases

There may be some edge cases you hit when it comes to the transitive references. If you need to prevent any of the automatic project-to-project propagation of dependencies, the NuGet Docs can help.

In some rare cases, if you start getting compile errors due to missing System references, you may be hitting this bug, currently scheduled to be fixed in the upcoming 3.4 release. This happens if a NuGet package contains a <frameworkAssembly /> dependency that contains a System.* assembly. The workaround for now is to add <IncludeFrameworkReferencesFromNuGet>false</IncludeFrameworkReferencesFromNuGet> to your project file.

What this doesn’t do

There is often confusion between the use of project.json and its relation to the DNX/CLI project tooling that enables cross-compilation to different sets of targets. Visual Studio 2015 uses a new project type (.xproj) as a wrapper for these. This post isn’t about enabling an existing .csproj or .vbproj project type (the ones most people have been using on “regular” projects) to start cross-compiling. Converting an existing project to use .xproj is a topic for another day, and not all project types are supported by .xproj.

What this does do is enable the NuGet v3 features to be used by the existing project types today. If you have a .NET 4.6 desktop project, this will not change that. Likewise if your project is using the Xamarin Android 6 SDK, this won’t alter that either. It’ll simply make package management easier.

Acknowledgments

I would like to thank Andrew Arnott for his persistence in figuring out how to make this all work. He explained it to me as he was figuring it out and then recently helped to review this post. Thanks Andrew! A shout out is also due to Scott Dorman and Jason Malinowski for their valuable feedback reviewing this post.

Syntax highlighting on WordPress with Prism and Markdown

February 5, 2016

Using the JetPack plugin, WordPress supports using Markdown natively in the editor. This makes it much easier to write posts, but one feature has been a bit wonky — syntax highlighting.

Out of the box, there are no syntax highlighting plug-ins that meet my needs. Prism.js is a popular highlighter, and while there are a few plugins to support it, the languages they pre-selected didn’t have what I wanted. They also didn’t seem to be regularly updated.

Fortunately, it wasn’t hard to create a child theme that contained a custom-downloaded version of the Prism JavaScript and CSS. I won’t walk through that part as it’s well documented elsewhere (see the previous link). What I did with the base child theme was to place my configured prism.js and prism.css files into a sub-directory and register them to be loaded by WordPress.

That almost worked. The trick is that Prism expects the syntax highlighting to be in tags matching <code class="language-foo">, where foo is whatever syntax rules it should apply. The trouble is that, by default, JetPack’s Markdown -> HTML processor turns a fenced json code block into <code class="json">, without the language- prefix. Seeing that, I hacked together a little fix-up script to run prior to the Prism code executing:

jQuery(function($) {
    $("code").each(function() {
        var className = $(this).attr('class');
        // If the class doesn't already start with 'language-', prepend it
        if (className && className.lastIndexOf('language-', 0) !== 0) {
            $(this).attr('class', 'language-' + className);
        }
    });
});

Caveat: I am not a “real” JavaScript programmer; you might have a better way to do this!

With it all assembled and uploaded to WordPress, I can now use the normal fenced-code syntax and things get highlighted as expected.

I’ve zipped up my child theme that only contains these changes here.

You’re welcome to use that as a starting point; you’ll probably need to rename the child theme and specify the correct parent. You can also merge it with your current child theme.

As the Prism JavaScript and CSS are highly customizable, you may wish to generate your own from their site and use those instead.

Announcing Humanizer 2.0

January 30, 2016

Earlier today we finalized and published the next major release of Humanizer. This version includes many fixes and new features, many of them coming directly from the community. A huge thank you to all those who have contributed!

You can find the latest Humanizer on NuGet, and the website contains the latest documentation. The release notes contain the full details of the changes.

I wanted to call out a few things though:

  • The Humanizer package now supports selecting locales to install. This was done by using a little-known feature of NuGet called satellite packages. The main Humanizer package is now a meta-package that pulls in all language packages plus the core library; this is the existing behavior of Humanizer today.
    • To install English only, you may elect to install Humanizer.Core directly
    • To install a specific language or set of languages, you can install Humanizer.Core.<locale>, where <locale> represents a supported language package.
  • There is currently a known issue with DNX with satellite packages. It might affect CLI too; track that one here.
  • For best results, using project.json/NuGet v3 is highly recommended over packages.config/NuGet v2. The key difference is that all of the child packages are transitively included instead of directly referenced in your packages.config file. project.json is supported in any project type, not just .NET Core or UWP projects.

Finally, I wanted to thank Mehdi Khalili for trusting me with the stewardship of the project. Mehdi did a fantastic job building Humanizer up and getting the community involved to contribute back. I also would like to thank Alexander I. Zaytsev and Max Malook for their efforts in coordinating the community contributions and guiding the project forward.

Xamarin MVP

January 25, 2016

Xamarin just announced their newest round of MVP awards, and I am very honored to have received one. I have always thought that C# and .NET are the best, most productive way to create applications for iOS and Android, and Xamarin’s tools make this possible.

Hope to see you at Xamarin Evolve later this April!

Continuous Integration for UWP projects – Making Builds Faster

December 3, 2015

Are you developing a UWP app? Are you doing continuous integration? Do you want to improve your CI build times while still generating the .appxupload required for store submission? If so, read-on.

Prerequisites

You’ll need VS 2015 with the UWP 1.1 tools installed. The UWP 1.1 tooling has some important fixes for creating app bundles and app upload files for command line/CI builds.

You’ll also need to register your app on the Windows Dev Center and associate your project with it. Follow the docs for linking your project to a store app from within VS first.

If you’re using VSO, you may need to set up your own VM to run a vNext build agent. I’m not sure VSO’s hosted agents have all the latest tools as of today. I run my builds in an A2 VM on Azure; it’s not the fastest build server, but it’s good enough.

Building on a Server

Now that you have a solution with one or more projects that create an appx (UWP) app, you can start setting up your build scripts. One problem you’ll need to solve is updating your .appxmanifest with an incrementing version each time. I’ve solved this using the fantastic GitVersion tool. There are a number of different ways to use it, but on VSO it sets environment variables as part of a build step, which I use to update the manifest on build.

I use a .proj msbuild file with a set of targets the CI server calls, but you can use your favorite build scripting tool.

My code looks like this:

<Target Name="UpdateVersion">
    <PropertyGroup>
      <Version>$(GITVERSION_MAJOR).$(GITVERSION_MINOR).$(GITVERSION_BUILDMETADATA)</Version>
    </PropertyGroup>    
    <ItemGroup>
      <RegexTransform Include="$(SolutionDir)\**\*.appxmanifest">
          <Find><![CDATA[ Version="\d+\.\d+\.\d+\.\d+"]]></Find>
          <ReplaceWith><![CDATA[ Version="$(Version).0"]]></ReplaceWith>
      </RegexTransform>
    </ItemGroup>
    <RegexTransform Items="@(RegexTransform)" />    
    <Message Text="Assm: Ver $(Version)" />
</Target>

The idea is to call GitVersion, either by calling GitVersion.exe earlier in the build process, or by using the GitVersion VSO Build Task in a step prior to the build step.

GitVersion can also update your AssemblyInfo files too, if you’d like.

Finally, at the end of the build step, you’ll want to collect certain files for the output. In this case, it’s the .appxupload for the store. In VSO, I look for the contents in my app dir, MyApp\AppPackages\**\*.appxupload.

If you setup your build definition to build in Release mode, you should have a successful build with a .appxupload artifact available you can submit to the store. Remember, we’ve already associated this app with the store, and we’ve enabled building x86, x64, and arm as part of our initial run-through in Visual Studio.

The problem

For your safety, a CI build will by default only generate the .appxupload file if you’re in Release mode with .NET Native enabled. This is to help you catch compile-time errors that would delay your store submission.

That’s well-intentioned, but it can severely slow down your builds. On one project I’m working on, on that A2 VM, a “normal” debug build takes about 14 min while a Release build takes 81 minutes! That’s too long for CI.

Fortunately, there’s a few things we can do to speed things up if you’re willing to live a bit dangerously.

  1. Force MSBuild to create the .appxupload without actually running the .NET Native compilation – yes, it is possible!
    • In your build definition, pass the additional arguments to MSBuild: /p:UseDotNetNativeToolchain=false /p:BuildAppxUploadPackageForUap=true. This overrides two variables that control the use of .NET Native and packaging.
  2. If you have any UWP Unit Test projects, you can disable package generation for them if you’re not running those unit tests on the CI box. There is a ~~good~~ reason for this — it’s hard. Running UWP CI tests requires your test agent to be running as an interactive process, not a service. You need to configure your build box to auto-login on reboot and then start the agent.

    In your test projects, add the following <PropertyGroup> to your csproj file:

<!-- Don't build an appx for this in TFS/command line msbuild -->
<PropertyGroup>
  <GenerateAppxPackageOnBuild Condition="'$(GenerateAppxPackageOnBuild)' == '' and '$(BuildingInsideVisualStudio)' != 'true'">false</GenerateAppxPackageOnBuild>
</PropertyGroup>

This works because the .appxupload doesn’t actually contain native code. It contains three app bundles (one per platform) with MSIL, that the store compiles to native code in the cloud. The local .NET Native step is only a “safety” check, as is running WACK. If you regularly test your code in Release mode locally, and have run WACK to ensure your code is ok, then there’s no need to run either on every build.
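You can verify that claim yourself: an .appxupload is just a zip archive. Here is a Python sketch (the file names are made up) that builds and lists one:

```python
import io
import zipfile

# An .appxupload is a plain zip archive holding the MSIL app bundle
# plus a symbols package; no native code is inside. Names are invented.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as upload:
    upload.writestr("MyApp_1.4.27.0_x86_x64_arm.appxbundle", b"msil bundle")
    upload.writestr("MyApp_1.4.27.0_x86_x64_arm.appxsym", b"symbols")

with zipfile.ZipFile(buf) as upload:
    names = upload.namelist()
print(names)
```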

After making those two adjustments, I’m able to generate the .appxupload files on every build and the build takes the same 13 min as debug mode.

Surface Book or Surface Pro 4?

October 6, 2015 Uncategorized 3 comments

windows101

This evening, at the Windows 10 Devices fan celebration in NYC, I got to use the Surface Book (and the other devices announced today) and talk to the product guys about the Surface Pro 4 and the Surface Book. One of my questions to them stemmed from a question at work regarding the split hinge on the Surface Book; I thought the answer was interesting, so here goes (read on for my comparison of the SP4 vs. the SB).

They said the split hinge was a deliberate design decision, stemming from the following goals:

  • To keep the base as thin as possible
  • To have a “perfect” keyboard. The travel on the keys is 1.6mm, which is greater than that of most laptop keyboards
  • When the lid is closed, they didn’t want the keys to scuff up the screen.
    • To address this, the keyboard is sometimes slightly recessed in the case – it is like that on the MacBook Pro I have. The problem is that’s wasted space, and they wanted to make the base thinner.

Also, when the lid is open, they had to get the balance exactly right so that when you push against the screen (it is a touch screen after all), it doesn’t tip over. Many/most other 2-in-1’s don’t have the balance quite right and are “tipsy”. Having tried the Surface Book, I can say it’s certainly not tipsy. The “dynamic fulcrum hinge” has some role in this too.

When it comes to a choice between a Surface Pro 4 and the Surface Book, I’d have to say that the differences are primarily around usage:

  • Surface Pro 4 is a tablet and can last its full battery charge without its keyboard
  • Surface Pro 4’s keyboard is better than the previous gen one, but for people that do a lot of typing (developers?), it may not be as ideal. In “lap mode”, the SP4 keyboard still has some “bounciness” as the cover overall could be stiffer.
  • Surface Book’s “clipboard” has three hours of battery life on its own. The remaining 9 hours are in the base (for a total of 12). That’s why they call it a clipboard and not a tablet, because the tablet usage is intended as a secondary/auxiliary mode, not the primary.
  • The Surface Book’s keyboard is really, really nice.
  • The 13.5” screen size feels bigger than it is due to its aspect ratio and the resolution. It also has a very narrow bezel, so the screen goes almost to the edge.

Both devices have the same memory/storage options, maxing out at 16 GB/1TB. The 1TB storage isn’t available yet (it will be in a month or two) as they are finishing testing those components. They are using Samsung 3D V-NAND modules, so the more storage you get, the faster it actually is. The pen is really nice and has a great feel to it. Even for people with messy handwriting, the friction level on the screen is just right for having control and writing something legibly.

Both machines are priced at about $2700 fully loaded (16GB/1TB). Which one to get really depends on your usage and needs; I have a feeling most developers would be happiest with the Surface Book while non-developers would probably like the Surface Pro 4 best.

Enabling source code debugging for your NuGet packages with GitLink

September 23, 2015 Uncategorized 3 comments , , , ,

Enabling source code debugging for your NuGet packages with GitLink

Recently on Twitter, someone was complaining that their CI builds were failing due to SymbolSource.org either being down or rejecting their packages. Fortunately, there’s a better way than using SymbolSource if you’re using a public Git repo (like GitHub) to host your project — GitLink.

Symbols, SymbolSource and NuGet

Hopefully by now, most of you know that you need to create symbols (PDB’s) for your release libraries in addition to your debug builds. Having symbols helps your users troubleshoot issues that may crop up when they’re using your library. Without symbols, you need to rely on hacks, like using dotPeek as a Symbol Server. It’s a hack because the generated source code usually doesn’t match the original, and it certainly doesn’t include any helpful comments (you do comment your code, right?)

So you’ve updated your project build properties to create symbols for release; now you need someplace to put them so your users can get them. Until recently, the easiest way has been to publish them on SymbolSource. You’d include the pdb files in your NuGet NuSpec, then run nuget pack MyLibrary.nuspec -symbols. NuGet then creates two packages: one with your library and one with just the symbols. When you run nuget push MyLibrary.1.0.0.nupkg and a symbols package sits alongside it, NuGet pushes the main package to NuGet.org and the symbols package to SymbolSource. If you’re lucky, things will just work. However, sometimes SymbolSource doesn’t like your PDB’s and your push will fail.

The issues

While SymbolSource is a great tool, it has some shortcomings:

  • It requires manual configuration by the library consumer: they have to know to go to VS and add the SymbolSource URL to the symbol search path.
  • It slows down your debugging experience. VS will by default check every configured Symbol Server for matching PDB’s, which leads many people to either disable symbol loading entirely or load symbols selectively. Even if you load symbols selectively, the load is still slow, as VS has no way to know which Symbol Server a PDB might be on and must check all of them.
  • It doesn’t enable source code debugging. PDB’s can be indexed to map original source code metadata into them (the file locations, not contents). If you’ve source-indexed your PDB’s and the user has source server support enabled, VS will automatically download the matching source code. This is great for OSS projects with their code on GitHub.

GitLink to the Rescue

GitLink provides us an elegant solution. When GitLink is run after your build step, it detects the current commit (assuming the sln is in a git repo clone), detects the provider (BitBucket and GitHub are currently supported) and indexes the PDB’s to point to the exact source location online. Of course, there are options to specify commits, remote repo location URLs, etc if you need to override the defaults.
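Conceptually, the index written into each PDB maps every recorded source path to a URL pinned to the exact commit. A hypothetical Python sketch of that mapping (GitLink’s real URL template may differ):

```python
def github_raw_url(repo, commit, local_path, repo_root):
    """Map a local source path recorded in a PDB to a raw file URL
    pinned to an exact commit (illustrative; not GitLink's actual code)."""
    relative = local_path.replace(repo_root, "").lstrip("\\/").replace("\\", "/")
    return f"https://raw.githubusercontent.com/{repo}/{commit}/{relative}"

print(github_raw_url("xunit/xunit", "0c41890",
                     r"C:\src\xunit\src\xunit.core\FactAttribute.cs",
                     r"C:\src\xunit"))
# → https://raw.githubusercontent.com/xunit/xunit/0c41890/src/xunit.core/FactAttribute.cs
```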

After running GitLink, just include the PDB files in your nuspec/main nupkg alongside your dll files and you’re done. Upload that whole package to NuGet (and don’t use the -symbols parameter with nuget pack). This also means that users don’t need to configure a symbol server as the source-indexed PDB’s will be alongside the dll — the location VS will auto-load them from.

An example

Over at xUnit and xUnit for Devices, we’ve implemented GitLink as part of our builds. xUnit builds are set up to run msbuild on an “outer” .msbuild project with high-level tasks; we have a GitLink task that runs after our main build task.

As we want the build to be fully automated and not rely on exe’s external to the project, we “install” the GitLink NuGet package on build if necessary.

Here’s the gist of our main CI target, which we invoke with msbuild xunit.msbuild /t:CI (abbreviated for clarity):

<PropertyGroup>
  <SolutionName Condition="'$(SolutionName)' == ''">xunit.vs2015.sln</SolutionName>
  <SolutionDir Condition="'$(SolutionDir)' == '' Or '$(SolutionDir)' == '*Undefined*'">$(MSBuildProjectDirectory)</SolutionDir>
  <NuGetExePath Condition="'$(NuGetExePath)' == ''">$(SolutionDir)\.nuget\nuget.exe</NuGetExePath>
</PropertyGroup>

<Target Name="CI" DependsOnTargets="Clean;PackageRestore;Build;GitLink;Packages" />

<Target Name="PackageRestore" DependsOnTargets="_DownloadNuGet">
  <Message Text="Restoring NuGet packages..." Importance="High" />
  <Exec Command="&quot;$(NuGetExePath)&quot; install gitlink -SolutionDir &quot;$(SolutionDir)&quot; -Verbosity quiet -ExcludeVersion -pre" Condition="!Exists('$(SolutionDir)\packages\gitlink\')" />
  <Exec Command="&quot;$(NuGetExePath)&quot; restore &quot;$(SolutionDir)\$(SolutionName)&quot; -NonInteractive -Source @(PackageSource) -Verbosity quiet" />
</Target>

<Target Name='GitLink'>
  <Exec Command='packages\gitlink\lib\net45\GitLink.exe $(MSBuildThisFileDirectory) -f $(SolutionName) -u https://github.com/xunit/xunit' IgnoreExitCode='true' />
</Target>

<Target Name='Packages'>
  <Exec Command='"$(NuGetExePath)" pack %(NuspecFiles.Identity) -NoPackageAnalysis -NonInteractive -Verbosity quiet' />
</Target>

There are a few things to note from the snippet:

  • When installing GitLink, I use the -ExcludeVersion switch so it’s easier to call later in the script without having to update a target path each time.
  • I’m currently using -pre as well; a number of bugs have been fixed since the last stable release.

The end result

If you use xUnit 2.0+ or xUnit for Devices and have source server support enabled in your VS debug settings, VS will let you step into xUnit code seamlessly.

If you do this for your library, your users will thank you.

UWP NuGet Package Dependencies

August 29, 2015 Uncategorized 4 comments , , ,

UWP NuGet Package Dependencies

[Updated: 9/15/15 on the NuGet package contents at the end]

In my last post, Targeting .NET Core, I mentioned that NuGet packages targeting .NET Core and using the dotnet TFM need to list their dependencies. What may not be immediately obvious, as this is new behavior for UWP projects, is that UWP packages need to list their BCL dependencies too, not just “regular” NuGet references.

The reason for this is that UWP projects also use .NET Core and may elect to use newer BCL package versions than the default. While the uap10.0 TFM does imply BCL + Windows Runtime, it doesn’t really say what version of the dependencies you get. Instead, that’s in your project.json file, which by default includes the Microsoft.NETCore.UniversalWindowsPlatform v5.0.0 “meta-package”, which pulls in most of the .NET Core libraries at a particular version. But what happens if newer BCL packages are published? Right now, the OSS BCL .NET Core packages are being worked on and they’re a higher version – System.Runtime is 4.0.21-beta*.
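For reference, the default project.json a new UWP project gets looks like this (versions as shipped with the 5.0.0 meta-package; later versions may differ):

```json
{
  "dependencies": {
    "Microsoft.NETCore.UniversalWindowsPlatform": "5.0.0"
  },
  "frameworks": {
    "uap10.0": {}
  },
  "runtimes": {
    "win10-arm": {},
    "win10-arm-aot": {},
    "win10-x64": {},
    "win10-x64-aot": {},
    "win10-x86": {},
    "win10-x86-aot": {}
  }
}
```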

In Windows 8.1 and Windows 8, this wasn’t an issue because those platforms each had a fixed set of BCL references; you knew for sure which BCL versions you’d get on each of them. With UWP, that’s no longer true, so you need to specify them.

Fortunately, you don’t have to figure out all of the dependencies by hand. Instead, you can use my handy NuSpec.ReferenceGenerator tool (NuGet|GitHub) to add those dependencies to your NuSpec file.

The ReadMe is fairly detailed, but for the majority of projects, if you have a NuSpec file whose filename matches your project name (like MyAwesomeLibrary.csproj with a MyAwesomeLibrary.nuspec sitting somewhere under the .sln dir), adding the reference should be all you need.
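As a sketch of what such a generator emits, here’s a hypothetical Python snippet that turns a resolved package list into a NuSpec dependency group (the element names follow the NuSpec schema; the tool’s actual output may differ):

```python
def dependency_group(tfm, packages):
    """Build a NuSpec <dependencies> group for one target framework
    from resolved package versions (illustrative helper, not the tool)."""
    lines = [f'<group targetFramework="{tfm}">']
    for pkg_id, version in sorted(packages.items()):
        lines.append(f'  <dependency id="{pkg_id}" version="{version}" />')
    lines.append("</group>")
    return "\n".join(lines)

print(dependency_group("uap10.0", {
    "System.Runtime": "4.0.20",
    "Microsoft.NETCore.UniversalWindowsPlatform": "5.0.0",
}))
```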

For a UWP Class Library package, you should do the following:

  • Include a dependency group with the uap10.0 TFM in your NuSpec
  • In your project build options for Release mode, choose “Generate library layout”
  • Copy the entire directory structure of the output to your \lib\uap10.0 dir