Splitting up OneDrive folders to separate disks

Disclaimer: only follow this guide if you have a backup of your data. Don’t come to me telling me you lost all your data because you didn’t make a backup before following this guide. Use at your own risk!

I am using OneDrive for a lot of things: my documents, my photos, my videos, etc. On my main PC, I have 2 disks:

  • C:\ => a combination of 3 SSDs in RAID 0 to maximize performance for the OS and apps, available space is 300 GB
  • D:\ => a regular disk to store files that don’t need lots of activity, available space is 1 TB

Since I like to take RAW photos, my OneDrive usage looks like this:

  • Documents (5.88 GB)
  • Pictures (177 GB)
  • Videos (10 GB)

This means that after installing a few games (which also take quite some space) on the C drive, I am basically out of luck: syncing my OneDrive data would require more space than the C drive has to offer.

Possible solutions I tried in the past

The first thing I did was to uninstall a few games, but this was just a temporary solution because the Pictures folder eventually grew bigger and bigger (a Lumia 950 XL shooting RAW can take quite some space).

The second thing I did was to move all the files to the D drive because it had much more space. But boy, my documents (which I work in daily) were… so… slow.

The third option I tried was not syncing the Pictures folder to this main PC at all. But then I wanted to edit my photos in Lightroom and got stuck again.

The solution – directory symlinks

Then something came to mind: why not see whether symlinks could be used to store the Pictures and Videos folders on the D drive while keeping the rest of the stuff on the C drive? It was actually simpler than I thought. To move some OneDrive folders to a separate disk, follow this simple guide.

  1. Stop OneDrive so it doesn’t sync anything
  2. Move both the Pictures and Videos folders (or any folder you’d like to move to a different disk) to a different drive (in my case D), so I ended up with:
    1. D:\Pictures
    2. D:\Videos
  3. Open a command prompt window (in administrator mode) and navigate to C:\Users\[yourname]\OneDrive
  4. Use the following command to create a symlink for a directory:

    mklink /D [name] [directory]

    In my case these are the commands:

    1. mklink /D Pictures D:\Pictures
    2. mklink /D Videos D:\Videos
  5. You will see the directories “appear” again in the OneDrive folder. Now you can start OneDrive and your data is ready to be synchronized (it should already be synchronized if you haven’t changed any files since you performed step 1)
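To check that the links were created correctly, you can run dir in the OneDrive folder; directory symlinks show up as <SYMLINKD> entries together with their target (output below is abbreviated, the exact layout depends on your Windows version):

```
C:\Users\[yourname]\OneDrive>dir
...
<SYMLINKD>     Pictures [D:\Pictures]
<SYMLINKD>     Videos [D:\Videos]
```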


This way I can store my pictures, which I don’t edit that often, on a separate, very large and cheap drive while keeping my documents on a super fast drive and still sync everything with OneDrive.

Improving performance of .NET code

Recently I’ve been on vacation. Like most developers, I think vacation is a great time to take a step back from what I have been doing and evaluate some things in life. One of the things I’ve always intended to do was to learn about high-performance applications and how I could apply this to the software I write. To accomplish this, I’ve been reading the excellent book “Writing High-Performance .NET Code” by Ben Watson. In this blog post I will mention the most important things I’ve learned, but I really recommend you buy this book if you want to improve the performance of your .NET code.


Let me start with the most important lesson I’ve learned: measure, measure, measure! To accomplish this, I’ve created a benchmarking solution for Catel so the team can benchmark features and compare the performance against previous versions of Catel. This is all open source and can be found here.
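As a minimal sketch of what such a benchmark looks like (the methods compared here are illustrative examples, not actual Catel features), a BenchmarkDotNet class needs little more than a few attributes:

```csharp
using System.Text;
using BenchmarkDotNet.Attributes;

// Minimal BenchmarkDotNet sketch: compares two string-building strategies.
[MemoryDiagnoser] // also report allocations, useful when hunting GC pressure
public class StringBuildingBenchmarks
{
    [Params(100)]
    public int N;

    [Benchmark(Baseline = true)]
    public string Concat()
    {
        // Allocates a new string on every iteration.
        var result = string.Empty;
        for (var i = 0; i < N; i++)
            result += "x";
        return result;
    }

    [Benchmark]
    public string Builder()
    {
        // Grows a single internal buffer instead.
        var builder = new StringBuilder();
        for (var i = 0; i < N; i++)
            builder.Append('x');
        return builder.ToString();
    }
}
```

Running it with BenchmarkRunner.Run<StringBuildingBenchmarks>() from a Release build prints a table with mean times and, thanks to [MemoryDiagnoser], allocated bytes per invocation.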

The underlying framework for the benchmarking being used here is BenchmarkDotNet.

Try to minimize code generation

Most .NET code (with the exception of .NET Native) must be JIT-compiled the first time it is hit by the CLR. The less code there is, the faster this compilation will be. Therefore it’s wise to really consider:

  1. Do I really need this code at all?
  2. Do I really need that complex LINQ statement (resulting in complex underlying code)?
  3. Does this method need to be async? async/await generates a state machine under the hood, resulting in both more complex code and Task objects that require garbage collection. Only use async/await if it makes sense, don’t do it by default.
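To illustrate the async point with a sketch (LoadAsync is a hypothetical slow operation standing in for real I/O): when a value is usually available synchronously, you can skip the state machine on the hot path entirely:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

public static class ValueCache
{
    private static readonly Dictionary<string, string> _cache = new();

    // Marked async: the compiler emits a state machine (and its allocations)
    // on *every* call, even when the value is already cached.
    public static async Task<string> GetAsync(string key)
    {
        if (_cache.TryGetValue(key, out var cached))
            return cached;

        var value = await LoadAsync(key);
        _cache[key] = value;
        return value;
    }

    // No async keyword: the cached path returns a completed task without
    // the state machine (ValueTask<string> would avoid even the Task
    // allocation, at the cost of a slightly trickier API).
    public static Task<string> GetFastAsync(string key)
    {
        return _cache.TryGetValue(key, out var cached)
            ? Task.FromResult(cached)
            : LoadAndCacheAsync(key);
    }

    private static async Task<string> LoadAndCacheAsync(string key)
    {
        var value = await LoadAsync(key);
        _cache[key] = value;
        return value;
    }

    private static Task<string> LoadAsync(string key)
    {
        // Stand-in for real I/O such as a disk read or HTTP call.
        return Task.FromResult("value-for-" + key);
    }
}
```

Both methods behave the same; the second one just stops paying the async tax once the cache is warm.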

Try to minimize garbage collection

Garbage collection is “very expensive” in .NET. Therefore you should try to minimize the number of allocations (and thus members on a class / struct). If you need large objects, consider using a pool manager. For example, if you need to read a lot of HTTP messages into a memory stream, try to re-use memory streams from a pool to prevent garbage collection.

Try to prevent boxing

Boxing can be expensive in highly critical code. It affects both the CPU (boxing / unboxing) and the garbage collector. Therefore it’s better to prevent boxing where possible, for example by using generics. Another way is to cache boxed values for simple types (e.g. booleans) and return a cached version of the boxed value.
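Both ideas in a short sketch (the method names are illustrative):

```csharp
public static class BoxingExamples
{
    // Generic overload: the argument is passed as its own type,
    // so an int or bool caller avoids boxing at the call site.
    public static string Describe<T>(T value) => "value: " + value;

    // object overload: an int or bool argument is boxed on every call,
    // producing a small heap allocation each time.
    public static string DescribeBoxed(object value) => "value: " + value;

    // Cached boxed booleans: there are only two possible values, so box
    // them once and hand out the cached objects instead of allocating.
    private static readonly object BoxedTrue = true;
    private static readonly object BoxedFalse = false;

    public static object Box(bool value) => value ? BoxedTrue : BoxedFalse;
}
```

The cached-boolean trick is worthwhile in code that pushes booleans into object-typed APIs (dictionaries, reflection, data binding) millions of times.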

Try to avoid exceptions

Exceptions in .NET are really expensive. This is because the framework gathers a lot of information about an exception to provide a great debugging experience. Therefore it’s important to only use exceptions when they are really necessary, that is, in exceptional cases.
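The classic example is parsing: int.Parse throws on bad input, while the Try-pattern reports failure through a return value and skips the exception machinery entirely (a sketch, with illustrative method names):

```csharp
public static class Parsing
{
    // Exception-based: cheap on success, very expensive whenever
    // the input is invalid.
    public static int ParseOrZero(string input)
    {
        try
        {
            return int.Parse(input);
        }
        catch (System.FormatException)
        {
            return 0;
        }
    }

    // Try-pattern: invalid input is an expected outcome, not an exception,
    // so the cost is the same on both paths.
    public static int TryParseOrZero(string input)
        => int.TryParse(input, out var value) ? value : 0;
}
```

If malformed input is a normal part of your workload (user input, log files, network data), the second shape is the one to reach for.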


There are tons of ways to improve performance in your code. If you are interested in improving your performance skills, I suggest you buy the book. Note that posting benchmarks on the internet might make you feel very naked; some things might not be as optimized as you’d hoped. But remember that at least you’re learning and improving, and it takes courage to do so.

In the meantime I will start adding more benchmarks to Catel and hopefully add some great performance benefits for all the users of Catel to enjoy.

Hyper-V: improve your I/O performance

For a few weeks now I have been running Hyper-V instead of ESXi for my virtualization stuff. I have a very fast “server” with a decent CPU and a RAID 0 array consisting of 4 Samsung 840 Pro SSDs behind an Intel RAID controller.

This should all be super fast, but still I wasn’t happy with the I/O performance. After doing lots of research, I performed the following tasks. As you can see, I was able to gain a 200% performance improvement (for the build server + build agents):


Note: always make a backup before messing with your RAID configuration

Upgrade the driver and firmware of the RAID controller

Upgrade the driver of the RAID controller (check the official website of your RAID controller vendor, in my case Intel). Even if you are running Windows Core (Hyper-V 2016 without a UI), you can still install RAID Web Console 2 and run startupui.bat in the installation directory. Check if all settings are still correct for you to ensure the best performance.

Upgrade the firmware of the SSD disks

This is a bit trickier because the disks need to be recognizable by the upgrade tool, but they are not, since they are exposed as a single volume via the RAID controller. To upgrade, I did the following:


You will need:

  1. A separate machine that you can easily attach a SATA disk to. I have a specific desktop machine where I can just plug and play SATA disks at the top of the casing.
  2. The latest firmware upgrade software (on the separate machine), in my case Samsung Magician


Then follow these steps:

  1. Turn off the server and make sure you have a backup (!).
  2. Unplug the SSDs from the server and handle them one by one:
    1. Plug the disk into the separate machine (do not format or try to read the disk, just attach it)
    2. Run the firmware upgrade tool. I tried upgrading via a Samsung boot CD, but that gave me an “upgrade was unsuccessful” message every time. Samsung Magician worked great.
    3. Check if the firmware upgraded correctly, put the disk back into the server and continue with the next one

My firmware was updated from DXM04B0Q to DXM06B0Q.