Renovate Bot Shareable Configurations


If you haven’t already noticed from the number of blog posts about Renovate Bot, I am really loving it and its feature set.

One very cool feature that was pointed out to me is the ability to have defaults or configurations to extend from, shared from a repository. I put mine in the repository Renovate Bot lives in and share it from there.

So, if you find yourself applying the same configuration in multiple repositories, this may be something you want to look into.

Defining a default config

In the repository where your Renovate Bot lives, I created a defaults.json. You can actually call it almost anything you want, but you will need to remember the name for when you extend it in the config of the repositories you are scanning. In this file I put something like the following, as these are the things I keep applying in most places:

{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": [
    "config:recommended"
  ],
  "prHourlyLimit": 0,
  "prConcurrentLimit": 0,
  "automerge": true,
  "azureWorkItemId": 123456,
  "labels": [
    "dependencies"
  ]
}

Using default configs in your configs for your repositories

To use the above defaults.json, it is as easy as removing the configuration entries covered by the defaults and adding a line such as this to your renovate.json config in the scanned repository:

{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["local>MyProjectName/RenovateBot:defaults"],
  ...
}

So here I use local>, which means the preset lives on the same self-hosted Git platform Renovate runs against. If you are on GitHub or GitLab or some other hosted Git, please refer to the Preset Hosting documentation. For Azure DevOps Repositories, local> works.

Beyond that, I just specify the project name in Azure DevOps and the repository the configuration I want to extend lives in. That is more or less it.
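For reference, preset references on hosted platforms follow the same shape. A quick sketch, with placeholder organization and repository names:

{
  "extends": [
    "github>myorg/renovate-config",
    "gitlab>myorg/renovate-config",
    "local>MyProjectName/RenovateBot:defaults"
  ]
}

The part after > is the repository holding the preset; an optional :name suffix selects a preset file other than the default one.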

Next time your Renovate Bot runs, it will pull those config items.

There is a lot more you can do, along with some recommended presets by the Renovate Bot team which you can apply. Read more about it in their documentation about Shareable Config Presets.

Cool Renovate Bot Features


Last month I wrote about how cool Renovate Bot is for updating the dependencies in your repositories. It works anywhere; I got it to work in Azure DevOps Pipelines, running every night at 3 AM while everyone is sleeping and no one else is using the build agents for important stuff.

I initially did a couple of things that I changed later on, after getting a bit more familiar with how Renovate Bot works. In this blog post I will go through some of the discoveries I made and how I configure it now. Hopefully you find this useful 😀

Configuration and pipeline can live in its own repo

Initially I had the Renovate Bot configuration and pipeline living in the same repository as the code I wanted it to run against. This is not necessary at all; it can live in its own repository and keep a base configuration there, for things such as authentication.

So now I have a repository called RenovateBot with two files in it:

  • azure-pipelines.yml the pipeline for running the bot (see the previous post on how I set that up)
  • config.js the configuration for the bot

When running, Renovate already knows how to check out files in the repositories you tell it to scan in the config file. So you do not need to run it in the code repositories you want to scan for updates.

In the config.js file I now simply have something like:

module.exports = {
    hostRules: [
        {
            hostType: 'nuget',
            matchHost: 'https://pkgs.dev.azure.com/myorg/',
            username: 'user',
            password: process.env.NUGET_TOKEN
        },
    ],
    repositories: [
        'myorg/repo1',
        'myorg/repo2',
        'myorg/repo3'
    ]
};

It will scan all those repositories defined in the repositories collection.

Neat!

You can have repository specific configs

For each repository you define, apart from the basic configuration you provide for Renovate, you can add additional configuration. I use this to add labels and to group Pull Requests made by Renovate for dependencies that belong together, for instance Unit Test dependencies.

So in each repository you can add a renovate.json file with additional configuration. This is the same file that Renovate creates initially on a repository on the first Pull Request it makes.

Here is an example of what a configuration for one of my repositories looks like:

{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "azureWorkItemId": 123456,
  "prHourlyLimit": 0,
  "labels": ["dependencies"],
  "automerge": true,
  "packageRules": [
    {
      "matchPackagePatterns": [
        "Cake.*",
        "dotnet-sonarscanner",
        "dotnet-reportgenerator-globaltool"
      ],
      "groupName": "CI",
      "addLabels": [
        "CI"
      ]
    },
    {
      "matchPackagePatterns": [
        ".*xunit.*",
        "Moq.*",
        "AutoFixture.*",
        "liquidtestreports.*",
        "Microsoft.NET.Test.SDK",
        "Microsoft.Reactive.Testing",
        "MvvmCross.Tests",
        "Xamarin.UITest",
        "coverlet.*",
        "MSTest.*",
        "System.IO.Abstractions.*"
      ],
      "groupName": "Unit Test",
      "addLabels": [
        "Unit Test"
      ]
    }
  ]
}

Let’s go through some of the options here.

  • azureWorkItemId will add the specific work item Id to every Pull Request it creates. This is especially useful if you have a policy set on your Pull Request to always link a work item
  • prHourlyLimit I’ve set this one to 0, such that Renovate Bot can create as many Pull Requests as it wants on a repository. Otherwise, I think the default is 2. So if you wonder why it didn’t update all dependencies, this could be why
  • labels This option lets you set default labels on pull requests, so for each of my Pull Requests made by Renovate it will have the dependencies label on it
  • automerge This option will set Auto Complete in Azure DevOps on a Pull Request using the default merge strategy, such that you can have Pull Requests automatically merge when all checks are completed
  • packageRules is super powerful. Here you can control which packages are grouped together; in the case above I have two groups, Unit Test and CI, which look for specific regex patterns of package names to include in the groups. I also add additional labels for these two groups using addLabels and assign a groupName, such that when Renovate creates a Pull Request for a group, the title will be Update <group name>. There are many more options you can set on packageRules; refer to the docs if you want more info.

You can scan many types of projects

So far I have scanned .NET projects and Kotlin projects with Renovate Bot and it handles these very well without any issues. I simply add additional repositories in the config.js file, and on the next run, or when I run the pipeline manually, it adds a renovate.json file to the repository and it is good to go.

Some Azure DevOps annoyances

When using System.AccessToken as your Renovate Token, the Pull Requests are opened by the user Project Collection Build Service (myorg). This user is built into Azure DevOps, does not have any e-mail assigned to it, and you cannot change that either. If you have “Commit author email validation” enabled on a repo, you will need to add both the Renovate Bot email (or the one you’ve defined in your config) along with the Project Collection user, like so: [email protected]; Project Collection Build Service (myorg), to the allowed commit author email patterns. Otherwise auto completion on Pull Requests will not work, as it will violate one of the repository policies.

Using Renovate Bot in Azure DevOps


I have been spoiled by Dependabot on GitHub, which helps keep NuGet and other packages up to date. However, Dependabot is not easily available in Azure DevOps. Again, the Open Source community to the rescue! After asking around on social media, my friends Martin Björkström, Mattias Karlsson and Pascal Berger let me know of the existence of Renovate Bot. The purpose of this bot is to periodically update the dependencies that you use in your projects. It has loads of plugins for all sorts of package systems, like NPM, NuGet, PIP and many more. For probably anything you can think of, it either has support built in or can be configured to work with it.

Pascal conveniently let me know of a Docker image you can use in your pipelines to run Renovate. This Docker image comes with the packages pre-installed, such that you just need to execute renovate. This is nice, because then you do not need to install the renovate npm package on every pipeline run.

Configuration

To configure Renovate, you will want to create a config.js file. Here you can add stuff like private NuGet feeds, rules about which labels to apply on PRs, and much more. For my usage, I need access to a private NuGet feed, and want to apply a dependencies label and a work item on every PR that Renovate creates:

module.exports = {
  hostRules: [
    {
      hostType: 'nuget',
      matchHost: 'https://pkgs.dev.azure.com/<org-name>/',
      username: 'user',
      password: process.env.NUGET_TOKEN
    }
  ],
  repositories: ['<project>/<repository>'],
  azureWorkItemId: 12345,
  labels: ['dependencies']
};

For private NuGet feeds, you need to add hostRules to let Renovate know how to authenticate with the NuGet feed. For Azure DevOps Artifacts, you unfortunately cannot just use the System.AccessToken in the pipeline, so you need to create a Personal Access Token (PAT) with permission to read the package feed.

You can have Renovate create PRs for one or more repositories; provide a list of repositories you want it to run on. You can quickly deduce this from the URL of your repo, which will be in the format https://dev.azure.com/<organization>/<project>/_git/<repository>. Each repository you want scanned is added as <project>/<repository>.
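As an example (organization, project and repository names here are made up), a repository at https://dev.azure.com/contoso/MobileApps/_git/ShoppingApp would be added to the repositories list in config.js as:

repositories: ['MobileApps/ShoppingApp']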

On my repositories, I have branch protection enabled and have a rule that work items must be linked to each PR. So for this I have created a work item, which I simply use for each renovate bot Pull Request.

That is it for the configuration.

Pipeline definition

With the configuration in place, you can now set up a pipeline to run Renovate on a schedule. I have used the example Renovate suggests, running every night at 3 AM.

This pipeline uses the Docker container Pascal Berger told me about. Every step after specifying container will run inside of it.

The NUGET_TOKEN env variable is what the password in the hostRule for the NuGet feed above is replaced with. In my case it is a Personal Access Token (PAT) that only has access to the private NuGet feed. The GITHUB_COM_TOKEN is used to fetch release notes for the Pull Request descriptions Renovate creates.

schedules:
- cron: '0 3 * * *'
  displayName: 'Every day at 3am (UTC)'
  branches:
    include: [develop]
  always: true

trigger: none

pool:
  vmImage: 'ubuntu-latest'

container: swissgrc/azure-pipelines-renovate:latest

steps:
- bash: |
    npx renovate
  env:
    NUGET_TOKEN: $(NUGET_PAT)
    GITHUB_COM_TOKEN: $(GITHUB_TOKEN)
    RENOVATE_PLATFORM: azure
    RENOVATE_ENDPOINT: $(System.CollectionUri)
    RENOVATE_TOKEN: $(System.AccessToken)

With this, you should be good to go! The first time Renovate runs, it will create a Pull Request with a renovate.json file. Merge it, and from then on it will create Pull Requests with dependency updates! Neat!

Here is a screenshot of how this looks.

[Screenshot: a Pull Request created by Renovate in Azure DevOps]

This works in many environments. Refer to the renovate documentation for more info.

Migration of Xamarin Libraries and Apps to .NET 6.0/7.0


We are well into the .NET 6 and .NET 7 release lifecycles, with .NET 8 coming up soon. Additionally, the end of life of Xamarin is drawing increasingly closer, so I bet a lot of people are looking towards migrating to the newer bits.

I have already migrated every App I work on to .NET 7.0 and have done multiple store releases of these Apps. So I wanted to share some insights with everyone.

1. Breaking changes

You may be aware that net{6|7|8}.0-ios targets are incompatible with Xamarin.iOS targets. The breaking changes are primarily in how NFloat and related types are now handled. Instead of living directly in .NET 6.0, these types are now added implicitly using the NuGet package System.Runtime.InteropServices.NFloat.Internal. If you are already using .NET 6.0 you might have noticed this package gets pulled in, even though you have no other Package References. This makes Xamarin.iOS assemblies not forward compatible with the newer target frameworks. Hence, library authors need to migrate their projects to make them compatible.
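To illustrate the type move (this snippet is mine, not from any of the affected libraries), here is the same value under the old and the new type:

// Xamarin.iOS: nfloat lived in the System namespace
// nfloat rowHeight = 44f;

// net6.0-ios and later: NFloat lives in System.Runtime.InteropServices
using System.Runtime.InteropServices;

NFloat rowHeight = new NFloat(44.0);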

I have done a couple of migrations: a huge one in MvvmCross, which targets a lot of frameworks, and NukeProxy, an iOS binding library, which had to be converted to using an XCFramework instead of a fat library. I have also migrated a bunch of internal libraries and 6 Apps at work. So now I have a little bit of experience under the belt.

2. Prerequisites

Make sure your .NET version is up to date. As of November, .NET 7.0 is out and newer versions of Visual Studio for Mac will pull it in. The things described here should also work if you are already on .NET 8.0, but also if you are for some reason still on .NET 6.0.

The bits for Android, iOS, macOS, Mac Catalyst etc. are no longer distributed with Visual Studio. Instead you will need to use a new feature in .NET to install a “workload” for each of these. This is super easy and much nicer in my opinion; instead of having to open a UI tool, you can update and install these tools from the command-line. For instance, if you want Android and iOS you do:

dotnet workload install android ios

You will need to install the appropriate workload for each Target Framework you intend to support in your library. If you are going to use .NET MAUI, then you might also want to install the maui, maui-android and maui-ios workloads.

3. Migrating a project to new SDK style project

The new SDK style project has been available for a very long time. The biggest advantage of the new project style is that you do not have to specify every single file you want compiled in your project file. Instead, the new style picks up any known file types, makes some sane assumptions about the build actions for those files, and adds them automatically. This makes working with csproj files much easier, as they are not polluted with loads of <Compile Include="folder/myfile.cs" /> definitions all over the place. The only things you would really have in these new csproj files are the Target Framework, Package References and Project References. If you have other stuff, you may be doing something wrong.
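As a sketch of what that can look like (the target framework, package name and paths here are just placeholders), a minimal SDK style csproj can be as small as:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>net7.0-ios</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="SomePackage" Version="1.2.3" />
    <ProjectReference Include="..\MyApp.Core\MyApp.Core.csproj" />
  </ItemGroup>
</Project>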

If you compare the Playground.iOS csproj file in MvvmCross, it went from over 200 lines to 29, since the new SDK style projects are so much more succinct.

The easiest way to migrate to the new project style and new TFM is to simply create a new project and drag over all your files, then add all the project references and package references.

If you need inspiration for how a csproj file can look, have a look at some of the MvvmCross project files.

Android Library: MvvmCross RecyclerView csproj

Multi-target Library: MvvmCross Messenger Plugin csproj

4. Do I still need MSBuild.SDK.Extras for multi-targeting?

In MvvmCross I historically used the excellent MSBuild.SDK.Extras project to help with multi-targeting scenarios. However, after migrating projects to net6.0 I started having weird build issues, and migrating away from MSBuild.SDK.Extras resolved them. Your mileage may vary, but not using it anymore has saved me from a bunch of issues.

If you are reading this and don’t know what it is, then you are not missing out on anything. With the older target frameworks, a bunch of extra setup was necessary, which excellent community members such as Claire Novotny helped build so library authors could have a much nicer experience. However, it appears this is no longer necessary.

5. Using Xamarin.Essentials?

If you are using Xamarin.Essentials, you may have heard that it has now moved over to MAUI.Essentials. However, not everyone is making Apps using MAUI, and you don’t really want to pull in everything from MAUI just to have access to the MAUI.Essentials API. MAUI.Essentials is not a NuGet package you pull in, though. As of writing this post, you can add MAUI.Essentials by adding the following to a <PropertyGroup> in your csproj file:

<UseMauiEssentials>true</UseMauiEssentials>

Remember to initialize MAUI Essentials on startup in your Activity, Application or ViewController:

Microsoft.Maui.ApplicationModel.Platform.Init(this);

Read more about the migration in the Microsoft Docs

6. Change IntPtr to NativeHandle on iOS

Often when you are creating CollectionView or TableView cells, among other views on iOS, you need to add constructors which historically took an IntPtr. This has changed, and you need to switch these over to use NativeHandle, or you will encounter issues at runtime where it complains about missing constructors.
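A sketch of the change in a cell subclass (the class name is made up; NativeHandle lives in the ObjCRuntime namespace):

using ObjCRuntime;
using UIKit;

public class MyCell : UITableViewCell
{
    // Old Xamarin.iOS constructor, no longer matched by the runtime:
    // public MyCell(IntPtr handle) : base(handle) { }

    // net6.0-ios and later; invoked when the cell is created from native code:
    public MyCell(NativeHandle handle) : base(handle)
    {
    }
}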

7. My App crashes with PlatformNotSupported exceptions in System.Linq.Expressions at runtime

There are places in net6.0 and net7.0 where some code paths require JIT compilation, such as parts of System.Linq.Expressions, which internally uses System.Reflection.Emit. This is supposed to be fixed in net8.0.


If you encounter such issues you will have to add the following to your iOS project:

<UseInterpreter>true</UseInterpreter>

You may also want to experiment with adding

<MtouchInterpreter>-all</MtouchInterpreter>

You can read more about this in Ryan Davis’s excellent blog post about improving build times

UseInterpreter kicks in the Interpreter mode for your App and allows for some cool features, which are also described by Ryan Davis in his post about the mono interpreter. Among these are emitting code, patching code at runtime and more, but more importantly it fixes the runtime crashes until libraries get patched.
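Putting the two properties together, a sketch of how they can sit in the iOS App csproj (whether to scope them to a specific configuration is up to you):

<PropertyGroup>
  <!-- Run with the Mono interpreter enabled; -all interprets everything -->
  <UseInterpreter>true</UseInterpreter>
  <MtouchInterpreter>-all</MtouchInterpreter>
</PropertyGroup>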

8. I have a Binding Library what do I do with that?

Business as usual. However, I’ve found that switching to XCFramework output for the stuff you want to bind is much easier to work with. Especially if you also want to support Mac Catalyst, then it is a must.

You still need to provide an ApiDefinition and a Structs file. These need specific build actions in your csproj file:

<ItemGroup>
  <ObjcBindingApiDefinition Include="ApiDefinition.cs" />
  <ObjcBindingCoreSource Include="Structs.cs" />
</ItemGroup>

Then you can refer to your XCFrameworks like so:

<ItemGroup>
  <NativeReference Include="..\..\Output\NukeProxy.xcframework">
    <Kind>Framework</Kind>
    <SmartLink>False</SmartLink>
  </NativeReference>
  <NativeReference Include="..\..\Carthage\Build\Nuke.xcframework">
    <Kind>Framework</Kind>
    <SmartLink>False</SmartLink>
  </NativeReference>
</ItemGroup>

If you are still on .NET 6.0, you may have issues with BitCode being pulled in for some reason. Refer to this comment on GitHub to see what extra bits you can add to your csproj to get rid of it. Supposedly it is fixed in .NET 7.0.

I think this is more or less what you need to know. Sure, this is not a complete migration guide, so you will have to try this yourself, but hopefully there are some things here that might help you. If you have any questions, please do reach out on Discord, Mastodon, Twitter or in the comments below.

Easy Setup of Dev Tools on macOS with ZSH dotfiles


I recently got a new machine at work. Before I got it, I spent a bit of time preparing and figuring out which essential tools I need for my daily work, but also how to minimize the amount of time I would have to spend installing everything.

As you, the reader, might know, I develop mobile Apps with Xamarin and .NET, so this blog post will be geared towards that. However, you can install anything using the same approach.

My colleague pointed out that there is this super cool feature of ZSH called dotfiles. ZSH ships with macOS and is the shell that you see when you open a terminal. It has a set of very powerful interactive tools, but is also a script interpreter, similar to other shells you might know such as bash, fish and csh.

The configuration for ZSH happens with a file called .zshrc, which is a list of settings, aliases for commands, styles and more.

Dotfiles in ZSH are a way to organize the settings you would normally put in .zshrc. But not only that: you can also organize configuration files for other tools, along with scripts.

There are multiple projects that build on top of these dotfiles for easily managing the tools you need for your daily work. The one I went for is Oh Your dotfiles. With this tool you can make a logical folder structure with the scripts and dependencies you want installed.

Let’s dig into it!

Start off by cloning Oh Your dotfiles:

git clone https://github.com/DanielThomas/oh-your-dotfiles ~/.oh-your-dotfiles

Then you run

ZDOTDIR=~/.oh-your-dotfiles zsh

Followed by:

dotfiles_install

Now you have Oh Your dotfiles installed. Time to create your own repository to customize your installation. For my own use I’ve created a dotfiles repository with the tools I need. I’ve called it dotfiles on purpose, as Oh Your Dotfiles automatically looks for paths containing dotfiles in the name. Call it anything you want, just make sure you clone it into a folder in your user directory starting with . and containing dotfiles in the name.

Now, with the repository cloned, you can start adding content.

For instance, I have a folder called dev with a file called install.homebrew with the following contents:

ca-certificates
git
gh
scrcpy
[email protected]
mitmproxy
gawk
imagemagick
curl
wget
tree

This will install all those tools from homebrew.

Similarly, if you want to install stuff from Homebrew casks, you can create an install.homebrew-cask file and add stuff there. For instance, I have this in one of my folders:

firefox
amethyst
anydesk
little-snitch
spotify
gimp
signal
macfuse
keka
hot

When you’ve added some files and stuff to install, simply run dotfiles_update and it will install everything.

Remember to synchronize your dotfiles somewhere. Then next time you get a new machine you can simply do:

git clone https://github.com/DanielThomas/oh-your-dotfiles ~/.oh-your-dotfiles
git clone https://github.com/Cheesebaron/dotfiles ~/.dotfiles
ZDOTDIR=~/.oh-your-dotfiles zsh
dotfiles_install

Then you just wait while everything gets installed. Super convenient.