Why I switched to Windows and built a water-cooled 5.2GHz 6-core editing machine

If there's one thing that will make even the most powerful computer feel like a seven-year-old rig, it's Adobe Lightroom paired with RAW files from any high-megapixel camera.

In my case, I spent over a year of spare time editing 848GB worth of 11,000+ 42-megapixel RAW photos and 4K videos from my New Zealand trip. I quickly realized that my two-year-old iMac was not up to the challenge.

In 2015 I took a stab at solving my photo storage problem with a . That setup is still running great. Now I just need to keep up with the performance requirements of having the latest camera gear with absurd file sizes.

I decided it was time to upgrade to something a bit more powerful. This time I decided to build a PC and switch to Windows 10 for my heavy computing tasks. Yes, I switched to Windows.

A note to the reader

This is a long blog post. The longest I've written on this site—over 32,000 words—and consumed many of my weekends for about 4 months. Typically these "I built a computer" posts are rather useless a few months down the line when new hardware comes out and it's nothing but an old parts list. While I can't avoid that, I aimed to provide enough information about my reasoning for why I chose certain parts or how I configured things so that this post may still be helpful a year or three down the line. Enjoy!

If you like this post, please share it with your friends, followers or anyone that might be interested.

What I use my computers for

For the last few years I have more or less had some variant of the same setup: a beefy desktop computer for heavy lifting and a small laptop for travel and casual use. My desktop usage, in order from most to least frequent, is largely comprised of Adobe Lightroom, web development for this website, Adobe Premiere Pro and some occasional gaming.

While I did love my 5K iMac, I hated that the only way to upgrade a year or two later was just to replace the entire thing. I hated that even the newest models were typically behind Intel's release schedule and you couldn't get the absolute latest and greatest hardware, much less be able to overclock them a bit for even more performance.

Apple has failed to provide the option for high-performance, user-upgradeable machines for years and even the new iMac Pro continues that trend. Perhaps the rumored upcoming Mac Pro will be different but I just don't see a world where you'll ever be able to hear about the latest Intel chipset and processor launch, immediately buy a new processor and motherboard and upgrade your Mac that weekend.

I'm not the only one with this mindset. More and more creative professionals that demand the most from their machines are getting over Apple for their high-end computing needs. Filmmaker Philip Bloom recently made the switch, as did photographer Trey Ratcliff, and I've been seeing more and more friends in the creative space do the same.

Nothing is really holding me to macOS on the desktop. I go in there, edit some photos or do another large task, then retreat to my 13" MacBook Pro. The Adobe suite works on Windows and there is now official Linux support via WSL on Windows 10, so I can run my development environment easily.

I would be lying if I didn't mention one of the main reasons I wanted to build a PC: finally having a modern full-size graphics card, for both GPU acceleration in creative applications as well as for gaming. With my iMac I casually played a few games on Steam, but with paltry settings. Even if I were to purchase a new high-end Mac, you just can't get the best graphics card on the market (even with the new iMac Pro), much less easily swap it out for a better card a year later.

When I began planning this new build around April 2017, I considered making it a dual-boot hackintosh and Windows 10 PC. At the time a hackintosh build sounded promising: Kaby Lake processor support and Nvidia drivers for Pascal GPUs for macOS had just been announced.

Then I began thinking of how I actually use my computer. The idea of constantly rebooting to hop into Windows for a bit to play a game, then reboot to go back to macOS seemed like a major inconvenience. It also meant that I couldn't just upgrade to the newest hardware — I would have to wait for hackintosh support to arrive. Not to mention the associated hackintosh annoyances I've dealt with in the past: tricky software updates and reliability issues. I knew what I needed to do.

The goal: Build a fast, yet quiet and understated desktop PC with a healthy overclock aimed at improving my photo workflow while giving me the ability to upgrade parts of it later on.

What makes Lightroom fast?

But first, let's talk about what Lightroom needs to thrive.

One thing to note, and this kind of defeats the purpose of this whole post: I've never seen any hardware improvement, no matter how drastic, turn Lightroom into a pure speed demon when dealing with the kind of huge RAW files I work with. I might experience single-digit to low-double-digit percentage improvements in certain tasks, but nothing that would blow my socks off. Nothing instant. If someone claims their Lightroom setup is instant, they're lying or they're working with tiny 12-megapixel JPGs.

In addition, any performance improvements gained on new hardware are often then negated when upgrading to newer and newer cameras that shoot higher megapixel photos and higher resolution and bitrate videos. It's a vicious cycle. I have even thought about downgrading to a lower megapixel camera to make editing easier; but I love being able to have room to crop photos and videos. And the extra megapixels helps when I .

What would make Lightroom really fast is the software itself receiving dramatic optimization and performance updates. It has been around for ages; I'd imagine there is quite a bit of code cruft that Adobe would love to refactor and rethink. Adobe has even acknowledged this and they're working on it. Nothing I can do here but cross my fingers and wait for software updates.

How I use Lightroom

Here's what I do in Lightroom that can feel slow.

I spend a lot of time in Lightroom. What exactly do I do to my photos? Increasingly less and less (more on that below), but there are still quite a few tasks, from culling to pick the best shots out of hundreds or thousands all the way to numerous adjustments made individually on each photo.

I have been interested in photography for over a decade but didn't really start taking it seriously until I built out my website and started crafting photosets of trips. At first I enjoyed making photos seem surreal and dramatic. I was all too eager to yank the saturation and clarity sliders and even use programs like Photomatix Pro and Aurora HDR that started out basically encouraging the creation of overly gaudy HDR images.

Over the years I have tried to hone my photography aesthetic to be more realistic and only edit to try to capture what it was like to be there and see something with your own eyes — recovering highlights and shadows, removing spots created by a dirty lens in a long exposure, adjusting color temperature to communicate the warmth of that day, removing noise to share the clear night with bright stars and so on. And sure, sometimes that vibrance slider might find its way to +15 to accentuate some glacial blue water, but I rarely touch the saturation slider these days.

And like designing a product interface, there's just as much work, if not more, that goes into keeping things simple and having it communicate effectively. Sometimes I'll spend the most time leveling a shot and finding a good crop.

Sony A7R III.

I love working in Lightroom on a high-res display in full-screen mode. I often zoom 100% into one of my massive 42MP RAW photos to find the sharpest and most in focus of several similar shots.

Unfortunately, all three of these behaviors incur a significant performance cost right off the bat.

If you're familiar with Lightroom you probably know about the different modules of the app. I spend the vast majority of my time in Develop module and some of my time in Library module. The different modules act as tabs — changing between them brings up a new set of functionality and contextual side panels.

Lightroom Classic CC fullscreened at 3840x2160 in the Develop module working on an HDR image.

When I’m doing basic culling, I try to stick around in the Library module where there are performance benefits at the expense of not being able to do any real editing to the shots. It's possible to make filmstrip scrolling and browsing in the Library module fairly speedy by generating previews either manually or on import.

Generating previews in advance means that Lightroom doesn't have to fire up the Camera Raw engine to process and then cache a large compressed RAW file each time you click on a photo, an action that can take up to 3-5 seconds per photo on a large screen.

There are several kinds of previews in Lightroom, but I generally have 1:1 previews created when I import a new set of photos. They're processed, full-size versions of the RAW photo. It takes a lot of time to generate them but it's done all at once. I don't mind that upfront cost as I can just go make a coffee, and come back in 30 minutes.

However, I often hop over to the Develop module while culling to see what a photo could look like with some basic adjustments or a crop to decide if the shot is worth keeping. Unfortunately, 1:1 previews are not utilized in the Develop module, and even if I had generated Smart Previews, which are used in the Develop module, those only go up to a max size of 2560px on the longest edge of each photo.

So where does this leave me? Spending the majority of my time in the Develop module where generated previews won't help on a large display with frequent 100% zooming. The only savior we have here is that the Develop module is the only part of Lightroom with GPU acceleration:

Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.

Even with that, it's still very early for Lightroom GPU hardware acceleration and it leaves much to be desired. GPU acceleration can make most Develop controls quicker but it seems that can come at the slight expense of two things: the time it takes to load full-resolution images as well as moving from image to image. Also, actions like panorama stitching, HDR photo merging, the adjustment brush and spot removal tools do not seem to get any boost here.

Inside the Develop module

Okay, here's my typical Lightroom workflow

After I have mostly completed the culling process and selected the better shots to keep in my collection, I go over each photo with a series of adjustments as needed. I most commonly visit these settings:

  • Camera Calibration → Profile: The profile determines how Lightroom processes the RAW and serves as a basis for all your adjustments. Depending on the camera you use and Lightroom's support for it, you will see different options here. I believe the goal from camera manufacturers is to have the profile mimic the camera's own creative style settings had you had any enabled and shot a JPG; those settings don't affect the actual RAW.

    I rarely use the default Adobe Standard profile and have Lightroom configured to use Camera Standard as the new default profile. Depending on the photo I may use something like Camera Landscape for more contrast and color, but I often find it too saturated and have to manually compensate for that. There's a wealth of information about Camera Profiles out there.

  • Remove Chromatic Aberration & Enable Profile Corrections: I tend to have these on by default. The latter will load a lens profile if one exists to correct any distortion (like barrel or pincushion) with your lens. One school of thought is to rarely use profile corrections as they can reduce detail and also lead to some minor cropping at times. But I find this to be a bit nitpicky and won't readily be able to discern a significant loss of detail by enabling it.

  • Transform: On certain occasions, like having a shot of a building that was taken at a slight skew, it can come in handy to enable a perspective correction. With more complex subjects, such as trying to tame two similar but slightly off leading lines in a photo, Lightroom has the guided transform feature. However, I try to only use these when the correction doesn't have to do too much; it can look pretty unnatural otherwise.

  • Basics: The essential controls that I fiddle with on every shot: Exposure, Temperature, Tint, Highlights, Shadows, Whites, Blacks and to a much lesser extent clarity and vibrance. At times I will jump directly to the Tone Curve but I often really only go there to tweak one or two RGB curves a bit, not everything at once.

  • HSL: Sparingly, I'll find myself wanting to reduce the luminance, saturation and rarely hue of a particular color in a shot. Most commonly I'll use it to decrease the prominence of a particular color in a scene I find distracting. Like if I adjusted the color temperature of a photo to be a bit warmer and it's making some yellow/orange foliage in a shot look obviously too saturated. Or if I wanted to adjust the luminance of blue to make a body of water darker or brighter to maybe compensate for other adjustments that may have made it appear a bit off.

  • Spot removal, adjustment brush, graduated filter:
    Spot removal gets a good amount of use. Most frequently to clean up anything caused by a dirty lens. When you're out shooting all day you tend to get some dust specks, mist and other tiny debris that only becomes obvious when capturing long exposures. I use the "Visualize spots" mode of the spot removal tool to easily track down and remove these spots.

    I use the adjustment brush much less, but in recent memory I used it to select a mountain range in the distance that had decreased visibility due to clouds/fog and increased the clarity and contrast a tad. But I'm using it less and less these days.

    Graduated filter rarely gets used anymore, but in the past I liked placing it above the horizon to make the top of the sky a bit darker, reduce highlights, boost contrast and clarity to make clouds pop.

    Lightroom Classic CC has a new range mask modifier for these actions that makes them easier to control, which I've used a few times.

The visualize spots mode for spot removal. Left: Dirty lens from shooting near the water a lot one day with lots of spots obvious in the sky (the largest spot was actually a tiny scratch in my polarizer). Right: With corrections applied.
  • Merge HDR: When necessary depending on the scene, I will turn on bracketing and shoot a ton of 3-shot brackets. I've done several 5-shot brackets but didn't find enough value in the difference to make up for all the extra storage and time required for those. In Lightroom I will stack each 3-shot bracket, then select a few of the stacks and begin merging them in parallel with the headless HDR merge. Just press Ctrl (Windows) or Cmd (Mac) + Shift + H when you have a few stacks selected. This is only good if your HDRs are from the same scene and fairly similar, as the headless mode skips the HDR Merge Preview dialog and just goes with whatever settings you last used.

Even though I consider this light editing, that's still a ton of actions to do on one photo. Even something seemingly as simple as a profile correction can end up increasing the number of calculations Lightroom has to do on all subsequent actions. Adobe even recommends a specific order of operations in the Develop module to speed things up:

The best order of Develop operations to increase performance is as follows:

  1. Spot healing.
  2. Geometry corrections, such as Lens Correction profiles and Manual corrections, including keystone corrections using the Vertical slider.
  3. Global non-detail corrections, such as Exposure and White Balance. These corrections can also be done first if desired.
  4. Local corrections, such as Gradient Filter and Adjustment Brush strokes.
  5. Detail corrections, such as Noise Reduction and Sharpening.

Note: Performing spot healing first improves the accuracy of the spot healing, and ensures the boundaries of the healed areas match the spot location.

Once I've adjusted each shot to my heart's content — and gone back and forth over each shot multiple times — I happily initiate a full-size JPG export. This takes a long time but I don't care as much compared to speed in the Develop module; I use the opportunity to take a break and do something else while the computer works.

A word about presets

Why don't you just have a few presets to pick from instead of adjusting everything manually?

Having a robust set of custom presets tailored to your personal photography aesthetic can save a ton of time when faced with a new set of imported photos. But that's not really my thing. I simply don't like using presets as one-click-and-done filters. I always want to manually adjust things to see what a certain photo is capable of and not "leave anything on the table" by just using a preset I have lying around. It could get the job done but wouldn't be what I would have ended up with had I started from scratch.

The thing that speeds up my workflow more than anything is not presets, but actually just a quick way to copy and paste develop settings with shortcut keys. I use VSCO Keys to do this using . and , hotkeys. It's quick and effective — I fiddle with settings on one photo to my liking, copy and use that as a base for any subsequent photos that are similar.

Many folks use presets in a similar way to speed up basic, repetitive things you typically do then stack them to provide a quick base to work from. You might have a handful of presets to do things like boost contrast, increase shadows and so on. And by making them for distinct actions and not set any other values, you can stack them by continuing to click on other such presets. Some folks love this flow but I never really got into it.

Hardware considerations

What do you need most? Disk I/O, GHz, CPU cores, GPU?

As you know there are a few main levers that affect the majority of a computer’s performance: storage, RAM, GPU and CPU. To be more precise: storage throughput, RAM size, RAM speed as well as the number of CPU cores and clock speed. In the case of Lightroom, CPU plays the most important role in overall application performance and to a much lesser extent GPU.

Storage

Surprisingly, storage speed is not of the utmost importance to Lightroom as long as you have something decent. It's especially a non-issue if you have some kind of SSD and store everything on it. That can be rather expensive, so you may opt to have a smaller SSD that only stores the Camera Raw cache, previews and catalog and then a regular hard drive to store the images themselves. Even with that setup the Lightroom performance difference is fairly indistinguishable. There are diminishing returns as far as Lightroom is concerned.

Whatever your storage solution, you'll want a lot of storage space when dealing with hefty RAWs. Or to archive shots when you're done with them.

RAM

As for RAM, you probably don't need more than 16GB for Lightroom. However, if you aggressively multitask and/or use Adobe Premiere Pro you'll want at least 32GB. You can definitely exceed that amount and go with the maximum amount supported by your motherboard.

But there are some things to keep in mind. First, RAM is not as cheap these days as it used to be for a variety of reasons, and low latency RAM is even pricier. Second, if you care about bleeding edge performance and overclocking, you should find your ideal amount of RAM in 2 sticks, not 4. It's harder to maintain an aggressive overclock with 4 sticks of RAM putting a larger strain on the integrated memory controller, especially if the CPU only has dual-channel support like the Intel i7-8700K and if the CPU itself is running an aggressive overclock.

GPU

Lightroom can use a good graphics card for hardware acceleration but it really doesn't take the most advantage of it. You might notice the benefit if you are using a 4K or better display and do basic actions in the Develop module.

On lower resolution displays I've heard that having GPU acceleration enabled can actually hurt performance as your computer spends time sending data between the CPU and GPU that the CPU could have just done on its own in a shorter amount of time.

Hopefully Lightroom will make better use of high-end graphics cards with future software updates. You definitely don't need a top of the line card for Lightroom, but if you're going to get one anyways you'll want to lean towards an Nvidia card. Adobe software seems to be more optimized for Nvidia graphics cards.

CPU

Adobe's recent upgrade to Lightroom Classic CC brought some performance improvements, largely related to increased multi-core performance for generating previews. Overall though, Lightroom does not make the best use of many CPU cores. The Develop module can use multiple cores to a degree, but the performance makes it obvious that it's not terribly efficient. This is a theme with all of Lightroom in regards to performance: it could always be better.

For my needs Lightroom loves the highest clock speed it can get, as opposed to a ton of lower clocked cores.

Having more cores in Lightroom can help you if you care more about exporting images and generating previews. That is not something I care about as it happens so infrequently compared to me fiddling with sliders in Develop. Otherwise, you're better off with fewer cores with a very high clock speed. This will help with:

  • Scrolling through photos in the Develop module
  • Performance and responsiveness for adjustments in Develop
  • Converting images to DNG
  • Merging HDR images and stitching panoramas

If you shoot a large amount of photos and hate waiting for images to export or previews to generate, then a higher core count CPU like the Intel Core i7 7820X 8-core, Core i9 7900X 10-core, or even the Core i9 7940X 14-core may be a great choice depending on your budget. You certainly give up general editing performance as you get into the higher core counts, but a 30-40% reduction in the time it takes to export and generate previews can be a massive time saver. However, if this isn't a major consideration and you just want the smoothest editing experience possible, then the Intel Core i7 8700K remains the go-to recommendation for Lightroom.

What about video editing?

Premiere Pro is great at using multiple cores and a beefy GPU

I wouldn't consider myself anywhere near savvy in any video editing apps, but I have been shooting more and more video footage on my trips. I've gone from making edits in iMovie to Final Cut Pro X then After Effects and finally to using Premiere Pro. A good chunk of my footage is now captured in 4K. Better Premiere Pro performance is a nice to have for me but not a priority compared to Lightroom performance.

On the completely opposite end of the spectrum compared to Lightroom, Adobe Premiere Pro has much, much better multicore efficiency. It also seems to make good use of a beefy GPU and will most definitely make use of as much RAM as you can give it.

My common tasks in Premiere Pro — rendering previews, warp stabilization, Lumetri color adjustments and exporting — all take advantage of additional processor cores. And the faster the clock of each core, the better.

Why build a PC?

Building a PC is easier than ever today

I'm no stranger to building a computer from scratch; I have built dozens by now. However, quite a few things have changed since the last time I built a computer.

These things are now fairly commonplace and welcomed additions to the building process:

  • AIO (all-in-one) liquid cooling systems
    Back in the day if you wanted a high-performance and relatively quiet cooling option for your overclocked processor, you would have to source a radiator, fans, pump, reservoir, tubing, CPU/GPU/Northbridge waterblock and assemble it yourself. You'd have to fill it up, put some anti-algae chemicals in, get all the bubbles out, do hours of leak testing and then change the liquid coolant out every 6 months or so. It was a huge hassle.

    Now you can just buy an all-in-one system that comes with everything you need in a simple closed loop. Most AIO units these days even have a USB connection and some desktop software to help monitor and automatically ramp up the fans and pump depending on CPU or GPU activity.

  • Fully-modular PSUs
    Why should a computer with a million internal drives and accessories have the same number of power supply cables as a basic setup with just one SSD? There's no need to have 20 extra power cables taking up space in your case if you don't need them.

    Modular power supplies let you connect only the cables you need, reducing clutter in your case. And most offer attractive sleeving styles to boot. Just don't ever mix and match cables from other PSUs — there is no standard pin layout for the connections on the PSU side, and rookie PC builders have fried components by accidentally keeping old cables around when switching to another modular PSU.

  • Operating systems are sold on USB sticks now. No longer do you need to buy a cheap optical drive just to install the operating system and never use it again. Windows 10 is now sold on a tiny USB stick.

  • Cases with no 3.5" and 5.25" drive bays! And on that note, why should you have a case with unsightly internal and external drive bays you may never use? Case manufacturers have started offering cases entirely devoid of 5.25" optical drive bays as well as 3.5" racks, or they have removable racks.

  • M.2 NVMe SSDs: Probably the biggest innovation for me personally. SSDs in a small PCIe stick that offer a tremendous performance advantage over even regular 2.5" SSDs. Something like 3-5x faster.

  • Case windows use real glass now. Modern high-end computer cases actually use real glass instead of scratch-prone and flimsy lucite or plexiglass. I still think windows in computer cases are kind of silly though.

  • Wireless mice are actually good now. Okay, I'm really dating myself here, but for the longest time wireless mice had noticeably laggy cursors, were impossible to use for even the most basic gaming and came with horrible battery life. I have been using wired Logitech mice for about a decade... but I recently switched to a Logitech MX Master 2S. Wireless mice with great battery life and nice customizability are finally here.

In the past you sort of had to wing it when it came to picking parts for your build. You would have to read a bunch of reviews for motherboards, graphics cards, RAM and so on. Then you might need to actively participate in a computer forum to see what folks were running, if there were any compatibility concerns, order everything and then hope everything worked as expected.

PCPartPicker is one relatively new resource that I have found to be invaluable. People share their build lists, photos and more there. It helped me answer very specific questions on numerous occasions:

  • Does this graphics card fit in this case?

  • Will this watercooler fit?

  • What does this case look like with these parts?

  • What is the smallest case that will fit this motherboard?

Chances are someone out there has built a machine identical to what you want to build and you can just look up pictures of that rig.

In addition, I've found a few other handy resources while building: the active Reddit communities and some popular YouTube channels.

Though it would be remiss of me not to mention why it may not be a great time to build such a PC: RAM prices and graphics card prices have skyrocketed in the last year. The latter is mainly due to insane high-end graphics card demand from cryptocurrency mining.

The case

Finding a good case will never be easy

Unfortunately, one thing has not become easier over time: finding an attractive, understated and simple case. Case manufacturers seem to only cater to the gamer stereotype of excess and gaudiness.

I'm in my 30s, I'm a designer... I want something simple, but that doesn't mean I don't want to have the best hardware, support for large water-cooling radiators and expected case amenities like thumb-screws, anti-vibration features and other noise considerations. I couldn't care less about LED fans, weird intake designs and other questionable aesthetic choices.

In the past I tended to like small form factor computers. So I started there, thinking I could get a micro-ATX motherboard. This would prove to be a challenge given my desire to have a long full-size high-end graphics card.

There were a few that were somewhat close to what I was looking for, like the Define Mini C and Corsair Air 240, at least size-wise. Then I found a Kickstarter for a ridiculously small case called the DAN Cases A4-SFX that used a mini-ITX motherboard and could house a full-size graphics card. It was dubbed the "world's smallest gaming tower" at roughly the size of a shoebox. Sure, it had some tradeoffs (non-ATX PSU, limited motherboard selection, limited heatsink-fan options and not the best cooling in general) but it seemed perfect.

Unfortunately, it was sold out everywhere. A new batch went up on Kickstarter and I ordered it, though it won't arrive for a while. Maybe I'll use that for a separate build later on.

The more I thought about it, I wanted a case large enough for me to pick a motherboard with great overclocking capabilities, the ability to have a water-cooling setup and the room to expand to two graphics cards via SLI if I so decided in the future.

The searching continued for a case that could accommodate an ATX motherboard as well as a large 280mm radiator. I'll spare you the details, but after looking at a bunch of cases (ones from NZXT, Fractal Design, Corsair and Lian Li mainly), I landed on the NZXT S340 Elite VR. NZXT also has a newer model called the H700i that seems interesting and has a bit better internal cable management, but I'm not a fan of some of the perforated panels it has on the top.

Shot these photos myself, just for this article.

The parts

What parts I chose and why

I first built this computer in April 2017 with a quad-core i7 7700K and a Z270 chipset motherboard. But later that year Intel released the six-core i7 8700K processor and Z370 chipset. I ended up upgrading both the CPU and motherboard at that time.

Processor

Intel Core i7 8700K

I went with the hexa-core Coffee Lake Intel Core i7 8700K processor running at 3.7GHz (4.7GHz with Turbo Boost). At the time I built this computer, the 8700K was the best processor for gaming as well as performing well in Lightroom compared to other chips. It's not the best for Adobe Premiere Pro but it's better than a 7700K with fewer cores.

Why not go for more than 6 cores?

There are processors with more cores — from both Intel and AMD — but I don't think they would be better for what's important to me: Lightroom and gaming. Two uses that traditionally prefer higher clocks and don't make good use of too many cores. Typically, the more cores a processor has, the lower the clock speed per core.

For example, each core in the ridiculous $2,000 18-core Intel Core i9 7980XE has a mere base clock speed of 2.6GHz but with a Turbo Boost (v3.0) up to 4.4GHz. There's some extra clarification to be made here: Turbo Boost does not mean every core gets that speed. In this example, only two cores get 4.4GHz. If there were some magical processor that had a ton of cores where each core had a very high clock speed as well, then the case may be different.

This does also apply to the 8700K. While the Turbo Boost is listed at 4.7GHz that's only for one core. If the computer decides two cores should be boosted, then they are each at 4.6GHz. That continues down to all cores running at 4.3GHz with Turbo Boost. Compare to the 7700K that has a Turbo Boost of 4.5GHz for one core and 4.4GHz for all four cores. So yes, the comparable all core Turbo Boost speed of the 8700K is slightly slower than the 7700K chip.

Then why did I get the 8700K if that's the case? Because I'm going to overclock the heck out of all cores on the 8700K — and a bit of a spoiler, but I got a good chip and was able to overclock it higher than I could with the 7700K. And as a nice secondary benefit, the extra two cores mean I have better performance for applications that make good use of multiple cores and many threads, like Premiere Pro.

I would try to explain more of the current landscape of Intel processors... but it would take way too long to even begin to explain and it doesn't really matter.

CPU cooling

Corsair Hydro H115i AIO liquid cooler

As I mentioned above, all-in-one liquid cooling options are affordable and highly performant alternatives to creating your own water-cooling loop, not to mention the related hassle and maintenance. These systems are easy to use, as long as your case is large enough to support the radiator size you want.

I opted for one of the larger ones: the Corsair H115i has dual 140mm fans for its 280mm radiator. Corsair recently released a newer version, but the main differences seem to be a pump head with RGB LED lights and mag lev fans that aim to be quieter (which I will probably order separately and swap out my current fans).

While I got the largest and easiest to use Corsair 280mm AIO system, there are a few other options if you're feeling more adventurous and wish to build a custom loop or have a larger case and want something more performant. If you plan to watercool your graphics card and don't want to have to deal with placing a second radiator from an AIO kit, a custom loop is a good way to go.

Motherboard

ASUS ROG Maximus X Hero

Aside from the basic need that it support LGA1151 processors and use the Intel Z370 chipset, I had a few requirements when I began searching for the motherboard:

  • Onboard 802.11ac Wi-Fi: Because I really don't want to have to get an additional card to add Wi-Fi capability.

  • Two M.2 slots: I didn't want to go the route of a traditional SATA SSD with this build and wanted to go with a tiny and speedy M.2 card slot SSD. As for why I wanted two — more on that in the section below.

  • First-class overclocking support: There are a few things I like to see in a board I plan to overclock with, even a bit:

    • Large built-in heatsinks over vital parts of the chipset, especially the power management components above and to the left of the CPU socket.
    • An easy way to reset or diagnose why the computer doesn't boot (external rear restart buttons are nice, as well as an onboard display to indicate error codes).
    • A solid UEFI BIOS that lets me control everything related to overclocking. While modern motherboards also have companion Windows software to let you control this on the fly, it's nice to have more control in the UEFI BIOS itself.
  • Aesthetics: Definitely further down on the list of wants, but I'd like something fairly discreet without a ton of bright red RAM and PCIe slots. Nowadays everything (motherboards, graphics cards...) has a ton of LEDs on it, but fortunately they are all customizable so I can turn them off.

  • Full ATX form factor: Adequately spaced PCIe slots to accommodate large graphics cards with non-standard height coolers and future expansion cards I may plug in.

  • Strong, reinforced PCIe slot: Graphics cards are so heavy these days with massive coolers that I'm always worried I'm going to damage the PCIe slot with all the weight. While I plan on getting a graphics card brace to help with this, it's definitely nice to have a motherboard with a stronger, reinforced PCIe slot (a "SafeSlot" as ASUS calls theirs).

This may sound like a laundry list, but if we're talking about fairly high-end boards, there are a lot that meet these needs. I ended up going with the ASUS ROG Maximus X Hero. In fact, it's the lowest end model of this high-end ASUS ROG Maximus line that caters to the gaming and overclocking crowd.

If you're looking for something with similar functionality but a few fewer boxes checked, any Z370 motherboard from the enthusiast ASUS ROG Strix line below this Maximus line would be a solid choice.

Graphics

ASUS ROG STRIX Nvidia GeForce GTX 1080 Ti

For the graphics card, there was really no beating the just-released (at the time) Nvidia GTX 1080 Ti. The much more expensive $1,200 Titan Xp came out shortly after but had marginally better performance (around 5-7%) — gains that could mostly be achieved by mildly overclocking the GTX 1080 Ti. And as I had mentioned earlier, Lightroom and other Adobe applications I use frequently are more optimized for Nvidia cards at this time.

You might be thinking... holy crap, that much for a graphics card!?! That's almost double the price of the CPU.

First off, this card gets you solid VR and 4K-and-beyond gaming, and should continue doing its job well into the next generation of VR gear. With a clock speed nearing 1.6GHz, 11GB of GDDR5X VRAM, 3,584 CUDA cores, 11.3 teraflops and a whopping thermal dissipation of around 250W (more than double the i7 8700K CPU's TDP), the GTX 1080 Ti is a beast. If you care more about the details, this should be more than enough.

High-end graphics cards have matured significantly since the last time I purchased one for a build. With their ridiculous number of cores excelling at highly parallelizable tasks, modern graphics cards have also found a life beyond gaming with the rise of general-purpose computing tasks on GPUs (GPGPU) like cryptocurrency mining.

However, knowing I wanted the GTX 1080 Ti wasn't enough. I had to pick out which of the many different models I wanted. I knew I wanted to avoid the standard "blower" type reference design cards. The single-fan blower style cards tend to be fairly loud and lack some ideal thermal characteristics that I'd want for such a card, especially one I will overclock. I ended up with this triple fan ASUS model that also featured a huge heatsink requiring a 2.5-slot height. There are way more options for GTX 1080 Ti cards at the time of publishing compared to when I started building this PC; for the most part look for anything with a huge cooler and you should be good.

There's one caveat though. The GeForce line of graphics cards doesn't output 10-bit color (40-bit RGBA) to a 10-bit monitor unless you're in a DirectX 11 fullscreen mode, which is basically only for gaming. It seems that Nvidia blocks their consumer line of cards from outputting 10-bit color for professional applications and prefers that you buy a card from their much more expensive Quadro line (the top Quadro cards cost several thousand dollars!).

Why get only one card?

I initially considered going for a dual-card SLI setup. After some research I discovered two things:

  • Lightroom has no support for a dual card SLI setup.

  • Not many PC games even have SLI support now.

As such, it doesn't seem worth pursuing a dual-card setup. I can already game at 4K 60fps no problem with this single GTX 1080 Ti. However, I expect the need for and adoption of SLI support from developers to change as the need for even more performance grows with future high-resolution, 240Hz DisplayPort 1.X+ monitors and VR HMDs.

Storage

Dual 1TB Samsung 960 EVO M.2 SSDs

I had heard so much about the crazy performance of these tiny new PCIe NVMe M.2 SSDs that I had to try one. Some of the latest high-end M.2 SSDs boast speeds more than 3-5x faster compared to their SATA counterparts.

960 EVO at home in its M.2 card slot.

But first.. what the heck does PCIe NVMe M.2 mean?

  • PCIe: The high-speed serial expansion bus that connects to a bunch of peripherals like graphics cards and some types of storage (not SATA). You might have heard about a CPU/chipset supporting a particular number of PCIe lanes. That roughly refers to how much bandwidth (each lane equates to 4 physical wires — two to send, two to receive) a particular device may require. For example, a modern graphics card usually wants a PCIe x16 slot to get 16 lanes for more bandwidth, while current M.2 SSDs only require 4 lanes (there's a rough sketch of the bandwidth math after this list).

  • NVMe: Short for NVM Express (which is short for something even longer), NVMe is just a specification for interfacing with non-volatile storage attached via PCIe. It's like an API for these new SSDs. Previously, PCIe-attached SSDs had their own custom ways to talk to the chipset and that led to requiring custom drivers. NVMe is now the standard and was designed with SSDs in mind, compared to the precursor protocol AHCI that was made with spinning disks in mind.

  • M.2: And this is simply the name of the connector for the expansion card itself. You might see them called M.2 2242 or 2280. That refers to the length of the card: 22mm wide and either 42mm or 80mm long.
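
To put rough numbers on those lanes, here's a quick back-of-the-envelope Python sketch. It's my own illustration, assuming PCIe 3.0's roughly 985 MB/s of usable bandwidth per lane per direction (8 GT/s with 128b/130b encoding):

    # Rough PCIe 3.0 bandwidth math: 8 GT/s per lane with 128b/130b encoding
    # works out to roughly 985 MB/s of usable bandwidth per lane, per direction.
    MB_PER_LANE = 985

    def pcie3_bandwidth_gbs(lanes: int) -> float:
        return lanes * MB_PER_LANE / 1000  # convert MB/s to GB/s

    print(f"x4  (M.2 NVMe SSD slot):  ~{pcie3_bandwidth_gbs(4):.1f} GB/s")
    print(f"x16 (graphics card slot): ~{pcie3_bandwidth_gbs(16):.1f} GB/s")
    # x4 is ~3.9 GB/s, which is why a 4-lane M.2 slot comfortably covers the
    # ~3.2-3.5 GB/s sequential reads of the SSDs discussed below.

In other words, four lanes are already more than the fastest consumer NVMe drives of this generation can saturate.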

When I began researching M.2 SSDs for this build, there seemed to be only two options when it came to no-holds-barred performance: the Samsung 960 EVO and the Samsung 960 PRO. The EVO model uses TLC V-NAND with some smart uses of two kinds of SLC caches to increase performance. The PRO model on the other hand uses superior MLC V-NAND flash memory.

Despite the difference in NAND types used between the 960 PRO and 960 EVO, the performance isn't too dissimilar (likely thanks to the EVO's great use of SLC caching):

  • 960 PRO — 3,500 MB/s seq. read, 2,100 MB/s seq. write

  • 960 EVO — 3,200 MB/s seq. read, 1,900 MB/s seq. write

I went with the 960 EVO to save a bit of money given that I probably would not be able to tell the difference between those two in terms of speed. I was not concerned with lifespan as I would likely upgrade long before I saw any diminishing performance.

I got two 1TB 960 EVOs. I started this build thinking I would have a dual-boot Windows 10 and macOS hackintosh machine. I ended up deciding against that for a variety of reasons, including having to limit my initial hardware choices to only things that would be friendly for a hackintosh setup. Then I thought maybe I would just RAID 0 the two SSDs but decided against that (more on that later). I just ended up making one a dedicated scratch disk for Lightroom to store photos I'm currently working on. It feels safer that way in case I do something that somehow nukes my main OS drive (though I always have the photos backed up to the NAS and Backblaze so it wouldn't matter much).

A note about 3D XPoint: Intel has a promising new type of memory technology called 3D XPoint that they have started selling under the Optane brand. It's really expensive for the time being but very fast and something to keep an eye on in the future.

Uh, this is a Lightroom PC and you only have 2TB of storage??

If I didn't have somewhere to archive photos I was done editing, I would have opted to also get a large internal mechanical hard drive if it was only going to be used on its own. If I was going to use it in a RAID array, I'd get several NAS-oriented drives instead (they have NAS/RAID-specific features like TLER).

RAM

2 x 16GB G.SKILL Trident Z DDR4-3200 CL14

I feel like RAM is an often overlooked piece of vital computer hardware for all but the more experienced computer enthusiasts. When it comes to RAM it's not just about picking enough so that your applications have enough room to play and don't need to unnecessarily keep paging to your SSD.

You at least need good enough RAM so you don't have stability issues. Bad RAM can lead to a myriad of stability issues and odd computer behavior. I will just reiterate that this is not an area you want to cheap out on. Unfortunately, RAM prices are so high these days it's actually pretty hard to "cheap out" in this space.

I began by looking for only 2 sticks of RAM instead of 4 for a few reasons. First, I plan to overclock a bit so I wanted to only use 2 sticks to reduce strain on the integrated memory controller. Second, the Z370 chipset on this motherboard only supports dual channel so there would be no performance benefit going with 4 sticks. Not to mention the extra heat created with 4 sticks crammed right next to each other.

I wanted two low latency matched 16GB RAM sticks for a total of 32GB. While I definitely wouldn't mind having more RAM, 32GB is more than sufficient for my needs and performance is a higher concern for me. And well, it's really not possible to find really fast, low latency RAM in anything larger than 16GB sticks; even that is a challenge. The very high speed RAM kits tend to only come in 8GB sticks.

When it comes to RAM, there's a lot more to look at beyond just the number of gigabytes. Speed and latency play a very large and interconnected role.

Overclocked RAM can provide a sizable improvement in frames per second in some CPU-bound games. The performance variance for general system tasks on an Intel machine is much smaller though: Intel machines are much less picky about RAM speed than AMD Ryzen machines.

The Intel Core i7 8700K with a Z370 motherboard supports a Coffee Lake DDR4 reference speed of 2666MHz. However, even if you have DDR4-2666 or faster installed, you won't get this speed out of the box without any configuration. It will run at 2133MHz due to the base JEDEC DDR4 specification. Fortunately, all you have to do is enable a memory setting in the UEFI called XMP (Extreme Memory Profile) — this will automatically bring your memory up to its rated speed and timings, adding a bit more voltage if necessary.

It's rather easy these days to overclock RAM on its own, separate from any CPU overclock. While the performance benefits on an Intel system outside of gaming probably don't make it worth your while to go overboard with extremely pricey RAM, just setting your RAM to its rated XMP settings can get you on your way quickly.

When I was overclocking long ago, the memory controller resided in the northbridge chip of the chipset and overclocking was frequently done by just increasing the front-side bus speed (in addition to the CPU multiplier if it was unlocked for the CPU) that links the processor and RAM, usually with a 1:1 memory divider if it would work. With modern Intel machines the memory controller resides inside the processor itself and it's much less common to overclock the base clock (BCLK) when RAM speed can be manipulated entirely on its own easily. Well that and because BCLK overclocking is tricky and can easily cause system-wide instability from RAM to devices on the PCIe bus.

Nerdy bits about RAM frequency and latency

When you shop for RAM, you typically see 3 things: size in GB, speed in MHz (like DDR4-3200 for 3200MHz) and finally latency or timings, typically shown with four numbers like 14-14-14-34. You may also see the latency listed as a CAS latency or CL value, that simply refers to the first and most important number for us of those four timing numbers.

In general, faster speeds and lower latencies are always better. But they are interconnected when it comes time to measure the absolute latency. Let's talk about what that means.

CAS latency (CL) does not represent a time value. Rather, it refers to the number of clock cycles it takes from the time the CPU (well, integrated memory controller inside the CPU to be more accurate) requests some data from the RAM to the time the RAM can supply that data back. For example, RAM with a CAS latency of 14 will take 14 clock cycles to return that data and CL16 RAM at the same speed would take 2 more cycles to get the same number of operations done.

So what is this clock cycle? The frequency at which the RAM operates is the number of operations per second the RAM can achieve. In the case of 3200MHz DDR4 RAM this is 3.2 billion cycles per second. A single cycle is the smallest unit of time the memory can recognize. There's one more wrinkle in this: we're talking about DDR, which stands for Double Data Rate. This kind of modern RAM transfers 2 pieces of data per clock cycle, so the DDR4-3200 we've been talking about is actually only clocked at 1600MHz but effectively operates at 3200MHz.

Now that we know how latency and frequency are related, we can begin to calculate absolute latency and see how it varies as RAM frequency increases.

Let's say we have DDR4-2666 RAM with a CAS latency of 14. First, we need the actual clock rate, which is half of the 2666MHz data rate. To get the absolute latency we compute: CAS latency × (1 / clock rate). That works out to 14 × (1 / 1333MHz) = 10.5 nanoseconds to complete an operation requested by the CPU.
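
Here's a tiny Python sketch of that same arithmetic (my own helper, not anything from a benchmarking tool), run over a few common DDR4 kits:

    # Absolute RAM latency in nanoseconds: CAS latency multiplied by the cycle time,
    # where the real clock is half the advertised DDR data rate.
    def absolute_latency_ns(ddr_speed_mhz: int, cas_latency: int) -> float:
        real_clock_mhz = ddr_speed_mhz / 2      # DDR transfers twice per clock
        cycle_time_ns = 1000 / real_clock_mhz   # one cycle, in nanoseconds
        return cycle_time_ns * cas_latency

    kits = [
        ("DDR4-2133 CL15", 2133, 15),
        ("DDR4-2400 CL12", 2400, 12),
        ("DDR4-2666 CL14", 2666, 14),
        ("DDR4-3200 CL14", 3200, 14),
        ("DDR4-3200 CL16", 3200, 16),
    ]

    for name, speed, cl in kits:
        print(f"{name}: {absolute_latency_ns(speed, cl):.2f} ns")

    # DDR4-2666 CL14 comes out to 10.50 ns, DDR4-3200 CL14 to 8.75 ns, and
    # DDR4-2400 CL12 ties DDR4-3200 CL16 at 10.00 ns.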

If you run a few different types of RAM through that equation, you can see the difference in absolute latencies. I only plotted a few RAM speeds and latencies, but it's possible to buy RAM at considerably higher frequencies than those (as far as I've seen).

(While I wasn't able to find RAM for sale at those CAS latencies, it might be possible to increase voltage and overclock the RAM to achieve some of those lower latencies. For example, I ended up running my DDR4-3200 CL14 at 3333 CL14.)

As you can see here, having the lowest latency doesn't mean much when it's not referring to absolute latency, which takes into account the number of cycles happening per second. Makes sense — the faster it's going, the less time each individual cycle takes, so at some point much faster RAM can make up for slightly higher CL timings. Here's another example: DDR4-2400 CL12 has the same 10ns absolute latency as DDR4-3200 CL16.

Does this mean you're better off getting cheaper RAM and overclocking it to the desired speed? Well, not quite. First off, there is no guarantee that your cheaper 2400MHz RAM could actually reach an overclock like 3200MHz. It might 1) not be possible, 2) require extra voltage, or 3) only be possible with significantly loosened higher CL timings, which defeats the purpose. As such, it's a good idea to always get the lowest latency RAM you can find, even if you plan to run it overclocked with higher timings. Better to get CL14 RAM and run it at CL15 or CL16 when overclocked much higher, than get CL16 RAM at the same speed but only be able to get it to work overclocked at CL18 or higher.

For all those reasons above, I ended up going with 2x16GB DDR4-3200 CL14 RAM. It's among the highest frequency and lowest latency RAM you can find. This should give me solid headroom to overclock the RAM past 3200MHz and still have a low CL even if I have to loosen it up a bit. In addition, this G.SKILL RAM uses Samsung's B-die chips which are reputable for their performance and overclocking ability.

And as a minor point, I was looking for RAM that would feel more at home in my mostly black PC and wasn't some obnoxious bright color.

PSU

Corsair AX860

Back when I was just getting into building computers some 15+ years ago, power supplies felt like they were largely overlooked by the DIY computer building community. The thinking was something like: just get something that's 300 watts or so with a big, heavy heatsink and a fan that isn't too loud and you were probably good to go.

In reality, the power supply is one of the most important parts of a stable and performant rig. Cheaping out on the power supply can result in random computer stability issues and restarts. In a worst case scenario a bad PSU could damage or even kill some of your components. The criteria for picking a good power supply has become a bit more stringent with power hungry graphics cards (modern graphics cards can use much more power than the CPU) in the last few years.

In addition, overclocking is no longer some mystical dark art — motherboard manufacturers cater to this crowd with high quality capacitors, PWMs and VRMs along with UEFIs featuring comprehensive voltage and control settings for just about everything.

When it came time to pick my power supply, I looked at a few things in particular:

  • Quiet: These days it's easier to find a PSU with a larger single fan (120 to 140mm in size) instead of the louder dual 80mm fans you used to find in power supplies. However, some more advanced power supplies have what they call a zero RPM mode — they don't even need to spin the PSU fan(s) until the load reaches some percentage of total output. Even then the fan only speeds up incrementally as needed. As such, it might be worth getting a more powerful PSU than you need, just so you can stay closer to that zero RPM mode with your regular idle/light use load.

  • Efficient: Again, it's fairly common these days to find PSUs with an 80 Plus efficiency designation, of which there are now a bunch of tiers: Bronze, Silver, Gold, Platinum and Titanium. The higher the efficiency of the PSU, the less power that turns into heat instead of becoming DC current for your computer. This usually also means less heat that the PSU needs to deal with and pump out of your case. For example, if you have a 1000W PSU with 80% efficiency, then your PSU will likely pull around 1250W from the wall outlet to generate the 1000W peak output. That'll cost you extra on your electricity bill compared to a PSU with a higher efficiency rating (there's a quick sketch of this math after this list).

  • Fully modular: A fully modular PSU has fully detachable cables. So if you have only a few internal drives and peripherals, you just plug in the cables you need and don't have to worry about where to hide the unused cables inside your case. This makes cable management much, much easier. It also improves airflow from having fewer cables obstructing it. Just make sure you're getting a PSU with enough connections (and wattage) to support the number of devices you need to power inside your case.

  • Sufficient wattage with a bit of headroom: The best way to figure out how much wattage your build will need is with an online PSU wattage calculator. You select your exact parts and it'll estimate max load wattage and provide a recommendation (about 10% more wattage). The nice thing about this kind of calculator in particular is that it lets you estimate usage if you overclock your CPU and GPU as well. In my case, it said my rig would use close to 600W when overclocked a bit.

    There's also the whole discussion to be had if you're really curious.
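
As promised above, here's a quick Python sketch of that efficiency math. The efficiency percentages are typical mid-load 80 Plus figures I'm assuming for illustration, not measurements from my build:

    # Wall draw is roughly the DC output divided by the PSU's efficiency at that load.
    def wall_draw_watts(dc_output_w: float, efficiency: float) -> float:
        return dc_output_w / efficiency

    # The example from the list above: a 1000W PSU running at 80% efficiency.
    print(wall_draw_watts(1000, 0.80))  # -> 1250.0 W pulled from the outlet

    load_w = 600  # roughly my rig's estimated load when overclocked a bit
    for label, eff in [("80 Plus Bronze (~85%)", 0.85), ("80 Plus Platinum (~92%)", 0.92)]:
        draw = wall_draw_watts(load_w, eff)
        print(f"{label}: ~{draw:.0f}W from the wall, ~{draw - load_w:.0f}W lost as heat")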

After a bit of research I ended up going with the Corsair AX860. It has the zero RPM fan mode I was talking about, is fully modular and has an 80 Plus Platinum rating. Corsair also has an AX860i model that has more functionality (there's a desktop app to control it) but there are some mixed reviews about fan issues and buggy software so I decided to avoid it.

At 860W this PSU is considerably more powerful than I need right now. I opted for something like this to provide enough headroom for high CPU and GPU overclocks and to future proof myself a bit in case I ever decide to add a second graphics card or do something crazy like upgrade to an overclocked 10+ core processor with a much higher TDP. Had that not been the case I could have gone with a power supply in the 650-750W range.

  • Alternatives
  • Affordable Lower Wattage Modular PSU

  • High-end lower wattage Modular PSU

Keyboard & mouse

Apple Magic Keyboard, Logitech MX Master 2S & Evoluent VerticalMouse 4

I've more or less always used and loved Apple keyboards and the slim new Magic Keyboard is no exception. It's Bluetooth and it's possible to configure it to work as expected with Windows 10. The new Magic Keyboard has keys with limited travel and some folks may not feel comfortable typing on it. You'll have to try it for yourself. One thing is for certain though: I absolutely hate loud clicky-style mechanical keyboards with long key travel. So no keyboards with Cherry MX mechanical switches for me.

As for the two mice, I often switch between a regular mouse and a vertical mouse to allay some RSI wrist pain from time to time. More detail on my page.

  • Alternatives

Speakers

Bose SoundLink Mini II

This one probably seems the most out of place compared to everything else on this list. Yes, it's a tiny portable speaker that I'm using for my desktop computer. I just didn't want a large multiple speaker setup taking up space on my desk, especially one requiring some bulky power adapter and multiple cables.

It's only for light use like watching videos on the web, basic Spotify background music or casual gaming. I have a much larger and more powerful Sonos system for when I really want to play music. And as for sound while gaming, that's not a priority for me. I make do with just this or plug in headphones.

There's a newer model of the SoundLink but it's not directional and didn't seem like what I wanted. The SoundLink Mini II is tiny but packs a good punch, can be powered via micro-USB and connects over Bluetooth or a standard 3.5mm aux cable. I have a micro-USB cable hidden under a cable management shelf under my desk that I can pull out whenever I need to charge this or my Logitech MX Master 2S mouse.

Operating System

Microsoft Windows 10 Home, USB flash drive

Nothing much to say here, Windows 10 Home. I have little use for any of the features included in Windows 10 Pro.

Parts list

The display

Dell UP2718Q 27" 4K display

I was definitely spoiled coming from a 5K iMac. It has a stellar display with great color accuracy and incredible brightness. Apple even tossed out the industry's habit of advertising monitors by sRGB or Adobe RGB coverage and instead slightly adapted the DCI-P3 color space, originally meant for projectors, for use on device displays, calling it Display P3. Apple is seemingly ahead of the game here.

I had a daunting challenge ahead of me if I was to find a quality replacement for that display.

I'm not a professional photographer. I'm merely a hobbyist.

That means I don't get paid to take photos or do anything where color accuracy is mission critical, such as shooting and post-processing portraits with accurate skin tones or working with print. That means I don't care quite enough to use a pricey 10-bit Nvidia Quadro graphics card paired with a 10-bit professional monitor with internal LUTs for calibration, great homogeneity and extremely high color accuracy across the board.

It also means I don't plan to buy and meticulously use a color calibration device. That means I don't typically share my images with others in large lossless formats. That means I don't care to limit myself to a 1920x1080 or 2560x1440 resolution display for increased Lightroom performance.

I, on the other hand, spend my time publishing my photos online. I will knowingly compress and sacrifice a good bit of image quality to make it easier for people to load my shots in a photoset. I know people view my compressed shots on a myriad of displays and devices where photos could look slightly different than how I might have intended. I also don't quite care enough to embed file-size-increasing ICC color profiles or save different versions of my files depending on the device. Whatever, I'm fine with that. (Okay, I do want to spend some time researching how to serve up wide-gamut images on my site at some point.)

So what am I looking for?

What matters for my hobbyist photography use

  • 4K or 5K resolution: Yes, I know this comes at the expense of Lightroom speed but I just love having more space.
  • Good color accuracy: There are a lot of ways you can define good, but for me this just means something exceeding the sRGB color space and ideally covering a solid portion of either the DCI-P3 or Adobe RGB color spaces.
  • Sub-10ms response time for occasional gaming
  • Ability to use the display with both Macs and PCs easily

What this meant was that right off the bat I was not looking at fast 144Hz gaming monitors, which tend not to be 4K, have a questionable gamer-oriented physical appearance and offer subpar color accuracy. Which is great, because I won't have to describe Nvidia G-Sync and AMD FreeSync, the adaptive sync technologies meant to reduce screen tearing while gaming.

At first I did a lot of searching for a 5K display (did you know Dell has an 8K monitor out now too?). I was going to break down a list of all the current 5K monitors on the market and what was not good about each of them, but I'll spare you the details. I just don't think there are any exceptional 5K displays on the market at this moment that satisfy my above criteria. I talk about this in more detail below, but this is changing and it may be a good time to wait a bit longer. For example, LG has some 21:9 5K Nano IPS displays with great color space coverage coming out this year.

We're in this weird time of transition from DisplayPort 1.2 to 1.3/1.4, so if you want to run 5K at 60Hz on a PC now, you're likely going to need to use two cables. And then there's the mixed bag of trying to output video via Thunderbolt on a PC for certain displays. For example, if you want to run the LG UltraFine 5K monitor (the one made for Macs) at 5K 60Hz as intended, you have to use a Thunderbolt 3 add-in card to pass the graphics through as a valid Thunderbolt 3 signal. It's all just a hassle right now. I'll wait.

I got a cheap Dell 4K.. and I didn't like it.

Given the current state of 5K monitors and my primary use of just publishing sRGB photos on the web, I thought I would be fine with a placeholder 4K display for now. Just something to hold me over for a year or two until a great 5K display came out. I picked up the 27-inch 4K Dell P2715Q display on the cheap (a slightly updated successor has since replaced it).

It wasn't the most attractive monitor and didn't have the best brightness or color space coverage; it only managed about 79% of Adobe RGB. There did not seem to be many 4K displays with Adobe RGB coverage in the 90% range without spending considerably more.

Even though it seemed fine at first and definitely did exceed the sRGB color space, I began to want more control over my photos and a bit more future-proofing. In short: even if I end up converting and publishing to sRGB, I still want to see my photos as close to how they were captured as possible and have control over the proofing.

More importantly: I think the notion that sRGB is the color space of the web is quickly changing, at least for the folks that visit my website, who tend to have Macs with wider-gamut displays or recent phones with OLED screens. If I'm going to keep my display for 3-5 years, I should be ready to publish my photos with larger color profiles in short order.

I put the Dell P2715Q to the side and replaced it with the Dell UP2718Q.

I ended up with the Dell UP2718Q, which I mounted on a Humanscale M8 VESA arm.

This display is not cheap, but as part of Dell's UltraSharp line it has some great color coverage: 100% sRGB, Adobe RGB and Rec. 709, along with 97.7% DCI-P3. In addition it has a 10-bit panel, HDR10 support and a ridiculous peak brightness of up to 1000 nits. And as a supremely nice benefit, it supports DDC/CI, which means its settings can be controlled from the OS. Long story short, I was able to script it so that I could control the monitor brightness with my keyboard.

You don't need 10-bit right now

While this display does have a 10-bit (per color) panel, I'm not using it with a 10-bit graphics card. Yes, you can get way more colors with a 10-bit setup: 1.07B compared to 16.7M for 8-bit. But it's going to cost you, a lot.
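For the curious, the math behind those numbers: 8 bits per channel means 2^8 = 256 shades each of red, green and blue, and 256 × 256 × 256 works out to roughly 16.7 million colors; 10 bits per channel means 1,024 shades per primary, and 1,024 × 1,024 × 1,024 is roughly 1.07 billion.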

If you want to take advantage of 10-bit in applications like Lightroom in Windows, you have to use a different graphics card such as one from the Nvidia Quadro line. They're pricey workstation cards, not gaming graphics cards. And once you have that, you need an even more pricey 10-bit display. If you find an affordable 10-bit display, it's probably not real 10-bit but something called 8-bit + FRC (Frame Rate Control) that fakes 10-bit output by flashing two colors very quickly to mimic another color it can't natively reproduce.

Don't buy into the HDR hype yet

Do not buy into the current HDR hype with computer monitors. It's just a huge bunch of gotchas. Some displays that boast HDR functionality, like the cheaper Dell U2718Q (not the UP2718Q), only support it over the HDMI port and not DisplayPort.

In Windows 10, HDR support has to be manually turned on and then all the colors go dull as colors get remapped to the Rec. 2020 color space and the display lowers the brightness. As such, it's not something you will use for your daily computing and only for a select few games and supported applications.

The state of HDR for your desktop is horrible right now. But at least now there is a new "DisplayHDR" standard to help you identify what kind of display you're working with. Currently there are a few levels: DisplayHDR 400, 600 and 1000. The numbers refer to the minimum required nits for brightness and each level requires a minimum color space accuracy and global display dimming functionality. Next, we'll just need operating systems to allow for HDR to be automatic depending on the application.

What about OLED?

OLED displays in particular are very intriguing. Dell released their first 4K OLED display last year, albeit at a steep price. OLED displays bring a lot to the table: a ridiculously fast response rate, an insanely high contrast ratio with very dark blacks and impressive color accuracy. This seems to make OLED displays ideal for everyone from gamers to creative professionals.

That's the hope at least. OLED displays currently have issues with color shift over time as well as image burn-in (remember old plasma TVs? I definitely got some burn-in on mine). There are some new technologies that aim to address those issues, so we can only hope the tech matures in a few years. At the moment, there are not many affordable 4K or 5K OLED displays on the market.

At the same time, nascent Micro LED displays are an area to watch. Like OLED displays they require no separate backlight and have great brightness and contrast (Micro LED can be even brighter) with the benefit of no burn-in or decreased performance over time.

On color spaces

A primer on sRGB, Adobe RGB, P3 and more...

I've mentioned color spaces a few times now so let me provide a somewhat brief description of what they are and what to look for in a monitor. The best way to describe it is probably to start off by showing you this chart. The colorful horseshoe shape behind everything represents the range of colors visible to humans. This is one of the more popular chromaticity diagrams, called the CIE 1931 color space. There are a few other diagrams that calculate things differently based on factors like the light source, but the purpose is the same.

CIE1931 color space comparisons: ProPhoto RGB, Rec. 2020, Adobe RGB, DCI-P3, sRGB

The triangles shown on top of this visible spectrum represent other color spaces. When it comes to talking about a display or photo-editing workflow, gamut is the range of a certain color space that can be reproduced. There is no display in the world that can reproduce every color humans can see. Gamut refers to the range of colors that can be displayed, not the range that can be observed. So as much as I would like to plot a high-end camera on this diagram, it doesn't quite work that way. But to give you a general idea: yes, your camera can capture colors you can't see.

Let's start with the smallest and most restrictive color space shown here, sRGB. The sRGB color space has for a long time been the "default color space of the web" because, while it's fairly limited compared to other color spaces, it more or less represents the lowest common denominator with respect to what various computer and device displays out there can reproduce. That's why historically SVG defaults to sRGB and CSS only supports sRGB. That means that if you edited and converted your photo to sRGB, it should look close to how you intended on the vast majority of devices.

Compared to another color space shown here, Adobe RGB, you can say sRGB has a smaller or narrower gamut. Adobe RGB provides a much larger color space (more than 50% of the visible spectrum) and was originally designed so that people working with RGB on computers could match the colors reproducible by CMYK printers. Most monitors do not come anywhere near displaying 100% of Adobe RGB. Coverage is getting better as the years go on, but it's nowhere near as universal as sRGB coverage. Displays that can reproduce or exceed Adobe RGB (or P3) are considered wide-gamut.

Next up we've got DCI-P3. While it has roughly the same gamut size as Adobe RGB, P3 gladly sacrifices a few saturated blues and greens in favor of some reds and yellows. That's because the P3 color space was made in 2007 for high-end digital cinema projectors; only recently has it been adapted for use in computer displays.

But if DCI-P3 was originally intended for the cinema, why should we care about it and why did Apple even push forward with it for their hardware? Perhaps they wanted to hop on the digital video bandwagon as P3 might be the next standard gamut for movies as we transition beyond the current Rec. 709 color space on the way to Rec. 2020 for UltraHD content. Maybe they wanted to cater even more to digital video content creators? Or maybe they just wanted to go with a color space that more uniformly augments the sRGB color space compared to Adobe RGB.

Then we have Rec. 2020. This one aims to represent the gamut for upcoming display technologies — both HDR10 displays (also Rec. 2100) and UHDTV 8K televisions. If this is the new standard gamut for such high-end televisions, one can only hope it will make its way to some kinds of professional computer displays meant for creatives. There is also a use for this gamut outside of the context of a display and more for video workflows (almost like ProPhoto RGB below) but there's not much point to diving even deeper into that now.

Finally, I wanted to point out a color space called ProPhoto RGB. This is not like the other color spaces mentioned here; it goes way outside the visible spectrum, and for a reason. ProPhoto RGB is a massive color space used (at 16 bits per channel) in Adobe Lightroom. It's not meant to be a display gamut. It's just a safe working space that won't clip or compress the colors captured by your camera when shooting in RAW, providing ample headroom while post-processing your shots.

You might have seen ProPhoto RGB listed in Lightroom if you go to edit a file you're working on in another app or plugin. There's a little dialog asking what color space and bit depth to send the file as. However, it gets a bit more complex behind the scenes. RAW camera files have a gamma of 1.0, so Lightroom has decided to do all of its calculations at this gamma in the ProPhoto RGB color space. What you end up seeing as a preview in the Develop module (not the Library module or filmstrip — those use Adobe RGB) actually uses a gamma close to that of sRGB at 2.2. I believe this modified color space is internally called Melissa RGB at Adobe, named after one of their engineers. Confused yet? Great, so am I.

While you work in this large ProPhoto RGB space while manipulating your photos in Lightroom, you later export your files and have them converted to your desired color space. This process remaps the colors to fit within your desired destination color space. If you've ever saved an image to another color space and seen terms like "perceptual" and "relative", they often define how to map colors and how to deal with colors that are outside the gamut of your destination color space.

And when it comes to doing this in Lightroom, you can always preview what this may look like by soft proofing:

An extreme example of soft proofing in Lightroom. These photos may look the same if your display can't show more than sRGB. The right photo (limited to sRGB) lacks certain saturated colors.
Photo: Off the coast of Grand Cayman Island near Rum Point

What it all means

That was a ton of detail to get one point across — when selecting a monitor, you should at least be aware of what coverage the monitor has in the color space you care about: likely Adobe RGB or P3 if you're doing a lot of photography. Since it's hard to find regular prosumer monitors right now that even list their P3 accuracy, you'll probably just want to find something as close to 100% Adobe RGB as possible. At the end of the day, having more coverage means for the most part you'll be able to reproduce more saturated colors, assuming you're working with color managed software.

Alternatives

I will refrain from recommending particular display alternatives right now beyond the UP2718Q I went with. There are a ton of new ones coming out this year, but I don't think you can go wrong with displays from Dell's UltraSharp line. There are also some interesting displays arriving this year that have Nvidia G-Sync and great 98% P3 color space coverage.

The build

Putting it all together

And now the fun begins. I started ordering parts here and there, slowly accumulating everything over a week or two. I didn't quite jump right into building the PC as soon as everything arrived; I actually ordered a white backdrop first to try my hand at taking some shots of the parts and computer, as you saw in some of the photos earlier.

Case

I began by unboxing the NZXT S340 Elite case and removing its side panels, which was uneventful thanks to its liberal use of thumbscrews that remain attached even when unscrewed. I found the case internals to be laid out well, providing some great channels to hide the PSU and various cables. The case also had some 2.5" drive cages pre-installed; I removed those since I wouldn't be using any SATA devices.

PSU & custom cables

I installed the Corsair AX860 power supply and connected a few of the cables I knew I would need soon: 24-pin ATX motherboard power cable, 8-pin CPU power cable and two 8-pin PCIe power cables for the graphics card.

I did not use the standard cables that came with the PSU though. I wanted something a bit more aesthetically pleasing and went with individually sleeved custom cables. They let you pick exactly what color paracord is used for each pin on the cable, but I opted for the same gray pattern on all of them. Since the cables are entirely custom I was able to specify the exact length of each one: 50cm for the PCIe cables, 70cm for the CPU power cable and 60cm for the motherboard power cable.

Water-cooling

Then I installed the Corsair H115i AIO liquid cooler. I removed the front plate of the S340 along with the handy dust filter that was magnetically affixed to the front intake. One of the main reasons I got this case was that it supports a 2x140mm radiator. It was a bit of a close fit but ended up working out. I screwed the fans and radiator in, then snaked the power cables for the fans and pump to the back.

Motherboard

The motherboard came next, but first I had to install the cooler's supplied backplate behind the CPU socket. The H115i has some pretty stiff tubes coming off the water block and it takes a good amount of force to fasten it down, so the backplate helps spread that force across the motherboard. Compare that to my old days of PC building, where a large and heavy heatsink-fan would bend the back of the motherboard.

RAM

Then I installed the two sticks of G.SKILL DDR4-3200 RAM. And this is the point where I remind you to make sure they are installed in slots of the same color; this ensures that dual-channel mode will be activated. Apart from that, there is one thing new about DDR4 that took me a second to realize: only one side of the RAM slot opens up. I'm not sure if this is a DDR4 spec thing or just a motherboard-specific feature.

The last time I built a desktop PC, inserting RAM meant nudging the two locking tabs away from the slot, then pressing the RAM straight down to snap both tabs into place. But with DDR4 the bottom lever does not move and is locked in place, so you have to seat that side first, cantilever the RAM down to the other side and then lock the lever in place. It sounds more complex than it is; it's just different from what I was accustomed to. Also, if you look closely at the bottom of DDR4 RAM you'll notice it has a slightly curved edge to reduce the required insertion force, given all the extra pins compared to previous generations of RAM.

CPU

With the brace installed and the motherboard screwed into the case, it was time for the delicate act of installing the CPU, applying thermal paste and then fastening the H115i's pump head.

But this isn't any regular i7 8700K CPU. I had it delidded.

The big metal thing you see attached to the green PCB of a processor is not the processor itself; it's a heat spreader, or IHS for Integrated Heat Spreader. Desktop processors have this IHS to take the brunt of the force that comes from attaching a large heatsink, which helps prevent amateur PC builders from inadvertently cracking the die. For years the CPU die under there was soldered to the IHS, providing highly effective heat transfer.

At some point, desktop processors began shipping with thermal paste instead of solder to connect the CPU die to the IHS. And with that change came a larger distance between the die and the IHS, due to a thick application of glue on the sides of the IHS. This means more distance for the heat to travel and through a less effective heat transfer agent. Depending on the particular CPU, that means higher CPU temperatures and likely worse overclocking potential.

Delidding fixes that. Delidding is the process of forcibly removing the IHS by applying so much lateral pressure that the adhesive breaks and the IHS comes off. Then comes meticulously cleaning the die and surrounding area, carefully applying a better and much thinner layer of "liquid metal" thermal compound, and gluing the IHS back on, this time with less adhesive so the die sits closer to the IHS.

Delidding can be done with a specialized tool that holds the CPU in place while pushing the IHS off. Given that it would be my first time doing this, I opted to have a professional do the job for me. Especially at a time when the 8700K was impossible to find in stock and I got lucky even getting one shortly after launch. I didn't have time to try to find a replacement if I ended up cracking this one.

After some research on various forums, I found a company that provides CPU delidding and binning services. I had them do both: I shipped them my 8700K and they sent it back out in a day or two.


The CPU came back in a great state along with its binning results: this particular 8700K can sustain a very admirable 5.2GHz overclock with a healthy dose of extra voltage, 1.425V. There's a small caveat in that it runs with an AVX offset, which downclocks the CPU for workloads that make use of Intel Advanced Vector Extensions instructions, which can be brutal on a machine. A bit of a note on AVX from Intel:

Because Intel AVX instructions generally consume more power, frequency reductions can occur to keep the processor operating within TDP limits. Intel is including additional AVX base and turbo frequency specifications to provide more clarity for these Intel AVX instructions. Performance of workloads optimized for Intel AVX instructions can be significantly greater than workloads that do not use Intel AVX instructions even when the processor is operating at a slightly lower frequency.

CPU in hand, I put it in the LGA 1151 socket on the motherboard and slowly closed the socket. I didn't expect closing the socket lever to require so much force, but my concerns were quickly allayed after some frantic Googling. The next step was to apply a thin layer of thermal paste to the CPU. The H115i ships with a questionable thermal pad affixed to it, so I cleaned that off first.

The most stressful part of the build was now out of the way. Installing the M.2 SSDs was up next. This motherboard has two M.2 slots but only one has its own heatsink. I wanted the more active drive, the one I would install the OS on, under the heatsink, and the less active Lightroom scratch drive in the standalone M.2 slot.

I couldn't quickly ascertain how each slot was identified in the UEFI and I didn't want to mistakenly install Windows on the wrong drive. To solve this I installed only the SSD under the heatsink first and added the other one after I had Windows up and running.

Samsung 960 EVO M.2 SSD in place but before heatsink installation. That USB cable on the Corsair pump head annoys the crap out of me. I need to find a slimmer cable I can hide more easily.
Graphics card

The massive GTX 1080 Ti was next up to bat. When I picked the case and graphics card I had to make sure that I would have enough room in the front for the radiator. Fortunately, that was not an issue here. But one thing I knew would be an issue: so-called GPU sag. Graphics cards like this one are very heavy, especially this one with its larger-than-average heatsink, and have two bulky 8-pin power cables adding even more weight. As such, when mounted in a vertical tower case like this, the card hangs horizontally and tends to sag down, putting a ton of stress on the PCIe slot.

While it's probably nothing to be terribly worried about as this motherboard has a reinforced PCIe slot, I also installed a GPU support bracket to be safe. Though I had one unexpected snag: this card has fans along the length of the card so there weren't any great bracing points for the support bracket to push against. I managed to place it at the very end which doesn't provide the best support.

Cable management

With the main components installed it was time to connect the remaining power cables, attach the front panel LED and power switch cables, the front panel USB cable as well as a micro-USB cable for the Corsair AIO pump. I wanted to hide some of these USB cables and gain another port, so I installed an internal USB hub and hid it in the back.

With that out of the way I meticulously zip-tied just about everything on the back. Not that it mattered much; it wouldn't be visible with the side panel on. Then I put the glass side panel back on. This took some work as the Corsair H115i tubes are very rigid and didn't want to stay inside the case at first.

The finished product

A closer look

Finally, here's the finished computer and desk setup! While I was initially concerned this case might be a bit larger than I wanted, I ended up being rather pleased with it, especially the mostly black/gray theme with the internals. The S340 Elite has enough room to make hiding cables easy and allow for just about any component I want without size or thermal restrictions.

The case does attract fingerprints, though not nearly as much as a glossy surface would. The side panels do scuff easily as well, but on the plus side I've been surprised at how well the front dust filter does at keeping dust out.

The next challenge was ensuring my desk cable management situation was at least decent enough to complement the tidy internals of the PC. I went with a three-pronged approach:

  • Humanscale M8 adjustable monitor arm

  • Humanscale NeatTech cable tray that screws under the desk to hide chargers and miscellaneous cables. I always hide micro-USB, USB-C and Lightning cables under there, as well as a MacBook Pro charger for when I connect my work laptop to the display while working from home.

  • Cable sleeving and velcro ties for taming PC cables under the desk

First boot

I plugged everything in — including a different keyboard directly into the USB socket labeled BIOS on the motherboard — and nervously pressed the power button for the first time. The computer quietly whirred to life, the motherboard and graphics card lit up their numerous animated LEDs and the motherboard's two digit Q-code display showed various codes before successfully POSTing.

As for the sound of the machine, it's not as dead silent as an idle iMac, but it's not much louder with the dual 140mm fans of the Corsair AIO liquid cooler in the default silent mode. However, the top 140mm and rear 120mm case fans could be quieter. I ended up undervolting the top fan to 7 volts with an adapter so it spins a bit slower. Though I do want to look into quieter Noctua fans, "be quiet!" brand fans or the new maglev Corsair ML line of fans.

PC on (after I disabled all the bright animating LEDs)

The first thing I did was enter the BIOS to do a quick runthrough of the settings. I wasn't concerned with overclocking just yet, but I did set up the boot drive order and disable SATA since I didn't have any SATA devices connected. I also enabled the XMP memory settings so the RAM would run at its rated 3200MHz. After Windows finished installing I went back to install the second M.2 SSD and then returned to the BIOS to ensure it was running at PCIe x4 speed (the default was x2).

The BIOS on this Asus Maximus X Hero is insanely detailed and overclocker oriented. You can tweak just about every voltage or timing you'd ever dream of fiddling with. After saving the settings I changed, I put in the Windows 10 USB stick and rebooted to the installer.

There was nothing particularly noteworthy about the Windows 10 installation process; it did its thing and went by pretty quickly.

Windows 10

Setup & first impressions

Windows 10 after installation. 4K display means screenshots will be rather tiny, even with 125% scaling

The first thing you notice about Windows 10 is the dark theme. It's a rather bold choice for Microsoft to set as the default for everyone, but something about it does feel modern, sleek and precise. However, as I quickly noticed throughout my entire Windows 10 setup experience, everything is customizable. You can change the accent color, adjust how you want it to appear in the title bar and so on.

I have some mixed feelings about the Start Menu though. Right after installing Windows and opening the Start Menu for the first time, it's more than a bit daunting in its default incarnation: there's a ton of stuff pinned to it, making it this rather large monstrosity that only draws attention to all the unwanted, preinstalled applications and games.

Ahhhh what is all this junk??

I had to spend a few minutes going through and uninstalling a bunch of stuff. It's only then that the Start Menu began to feel more humble and minimal. Some folks may like the live tiles for glanceable weather updates but I prefer something more basic.

Coming from macOS there were definitely a few things that felt familiar. There's the expandable sidebar for notifications and the Action Center on the right side of the screen. Similarly, Windows 10 has pop-up notifications in the bottom right corner when apps or the OS need to tell me something. Then there's the macOS Exposé/Spaces-like Task View that supports multiple virtual desktops (in addition to the expected Alt+Tab switcher).

Windows has had neat window snapping functionality for a while now, allowing quick resizing and snapping of windows to full-screen, half-screen or quarter-screen corner sizes. While I appreciate its existence, it's largely only useful for devices with smaller displays; with a 4K display I really don't need my windows taking up 50% of the screen, which is what it wants to give me.

Unlike macOS, typography on Windows leaves a lot to be desired, even after fiddling with Microsoft ClearType settings. Many typefaces throughout the OS just feel like they are lacking weight, and commonly used ones like in the Chrome address bar seem hairline thin. There is a third-party font rasterizer that aims to fix this, but I haven't had much luck with it and uninstalled it.

Another area this manifests itself is high-DPI display support. Some applications can display blurry text when mixed with UI scaling. I run my 4K display with 125% scaling, which seems to keep the native resolution but only scale certain parts of application chrome and text as necessary, which I do like. However, this particular issue may just be a transition-period thing until more Windows 10 applications are served as UWP apps that can run on any Windows 10 device.

This push for UWP apps makes sense for the grander vision of Windows 10. There are so many kinds of devices running Windows 10, especially convertible 2-in-1 tablet/laptop hybrids like the Surface Pro and Surface Book 2, that Microsoft has invested quite a bit in making the experience on any device smooth. For example, there's a tablet mode you can enter to make it easier to use with a touchscreen notebook.

Cortana is Microsoft's "truly personal digital assistant." It has lofty goals of being able to do stuff like Alexa and Siri: things it can only do by knowing more about your online habits, contacts, location, calendar, emails, et cetera. Instead of being a relatively hidden and streamlined-when-you-need-it part of the OS, Cortana seems to be more like an overbearing parasite grabbing every surface area it can. For now, that means there are a few more items in your Start Menu: Cortana Notebook, Cortana speaker (even if you don't have one there's a reserved menu item for it), Cortana Reminders and Collections.

Hopefully the rumors will come to fruition and Cortana will find a home where it can be safely ignored. I know that's a bit harsh, but I don't see the value from Cortana yet and I'm not sure how I could: I use Google Chrome, I use G Suite Gmail and Calendar instead of native clients and so on. There aren't too many ways for Cortana to learn about me aside from what I already use Cortana for: finding files and launching applications.

Even if you only use the regular Cortana search, its default screen is always trying to get you to do something else and show you what else it can search for. I get it, Microsoft wants to be aggressive with this and find ways for Cortana to grab hold of your daily needs and fit into your life somehow. It just comes off as adding complexity to everything. I disabled it as much as I could. Maybe I'll find a use for it one day, but for now I need Cortana to recede.

Other significant additions to the Windows 10 experience include the fast new Edge browser, the renamed and revamped Microsoft Store as well as less popular but noteworthy functionality like automatic facial recognition login with Windows Hello (though nowhere near as advanced as Face ID) and Dynamic Lock to log you out when you're not near the computer with a Bluetooth-paired phone.

The Windows store became the Microsoft Store and now features media for purchase.

As for the overall design of Windows 10, there's a definite feeling of inconsistency. Some parts feel refined and modern while others seem like a relic of the past. For example, take a look at the entirely different aesthetics of these two settings-related windows:

But that is changing, and quickly. Microsoft is beginning to incorporate their Fluent Design system to replace the older Metro style that still exists in parts of Windows 10.

Fluent design has a few areas of focus: light, depth, motion, material and scale. Material was the first aspect of Fluent design that I noticed in parts of the current Windows 10 release. There's an effect in certain menus and panes that looks like frosted glass, with translucency and a strong background blur.

Probably not the best example of Fluent design.. but you get the idea :)

Facets of light can also already be seen, mostly in the hover states for various elements. They now dynamically adjust the lighting of the container based on where your cursor is over the element. It's like what happens when you move your cursor over an inactive Google Chrome tab.

However, until Fluent design permeates more of those legacy surface areas, we'll have to deal with some repulsive stuff like this:

I have no idea how to use this menu.

While my first impressions of Windows 10 might have come off rather negative, I do really like the OS. Sure I'm dismayed that various parts of Windows have some clutter (I'm looking at you Cortana, OneDrive and Quick Access) but that's just the default. Similar to how many parts of Windows can be personalized, other items can often be changed and simplified to your liking with enough motivation. For every minor annoyance I've had, it only took a few minutes to find out how to customize it enough to make it more acceptable.

First step: uninstalling many preinstalled programs I don't want.

After a few customization and cleanup tasks that I'll dive into later, Windows itself felt like it began to recede and let me focus on my tasks at hand. It's a faster and more capable beast than the Windows versions I remember.

Fluent design sounds exciting but it'll take some time to see what the fully realized vision will do for Windows 10. The good news is that Windows feels like it's constantly being updated. Instead of waiting for larger annual tentpole releases, there are more frequent large updates like the recent Fall "Creators Update" that brings entirely new functionality, not just bug fixes. The Windows "Service Pack" updates are long gone.

And if you want to see new features being tested, you can easily sign up for Windows 10 Insider Preview builds. For example, the last big preview release in December 2017 featured two new workflow and window management features.

There's one more thing. You can now run Linux on Windows! And not in some slow VM. This is absolutely huge news.

Windows 10

Installing apps & drivers

After the Windows 10 installation completed, I had to first install some basic motherboard drivers to get online. Only the Ethernet port was working out of the box so I had to connect that to my router first. I'll spare you the exact details, but I went to the Asus site and had to download a ton of drivers from Wi-Fi to Bluetooth, and then some motherboard specific programs like AI Suite 3 from Asus to manage overclocking and advanced energy use settings. Then I downloaded the Asus Aura Sync program to be able to disable the animating LEDs on the motherboard and graphics card.

I then installed all Windows updates, a motherboard BIOS update, Nvidia GeForce drivers, Dell Display Manager software, Logitech Options for the MX Master 2S mouse, Corsair Link software for the AIO cooler and Samsung Magician for the 960 EVO SSDs. A reboot or two later and I had all required software installed. Definitely not as easy as turning on a Mac for the first time, but not difficult.

The vast majority of programs I wanted to use had Windows versions too... which is a really funny thing to say, because 15 years ago I would have been complaining that my favorite apps from Windows weren't on OS X. The largest exception was my preferred note-taking app; its iOS/macOS apps are iCloud-based and there's no web version. However, I'm currently testing out Notion, which has web support and a Windows 10 app.

Then I installed my essentials:

  • Adobe Creative Cloud: Lightroom Classic CC, Premiere Pro CC, Photoshop CC

  • Atom text editor

  • Backblaze

  • Dropbox

  • Spotify

  • Steam (CS:GO, Call of Duty: WWII, PUBG)

  • Origin (Battlefield 1)

  • Blizzard (Destiny 2, Overwatch)

  • Oculus (Robo Recall and misc VR games)

The Corsair Link software controls the settings of the CPU liquid cooler. You can create and set different profiles that determine what speed the radiator fans and pump should be running at for each temperature range. It also lets you glance at temperatures for the motherboard, CPU, graphics card and drives.

I find the Samsung Magician software much more interesting. Aside from letting you run firmware updates and performance benchmarks, it also gives you an easy way to enable over provisioning.

As mentioned during the SSD selection process earlier, there are different kinds of NAND memory from QLC to SLC. They all have some kind of maximum number of read/write cycles. For the average consumer like myself, I don't really need to worry at all about this, especially with TRIM and modern SSDs. Last I checked you'd have to write on the order of a few hundred TB of data to the SSD before you experience any kind of errors; though you'd get some performance degradation along the way. I've been running these two SSDs half a year so far and have only put about 8TB of writes between both drives. But there are some technologies at play to help extend the SSD lifespan.

The Samsung Magician app links you to the Windows drive optimization dialog but this is largely unnecessary. For one, it's already done on a schedule by default and second, Windows 10 has great TRIM command support. When TRIM is enabled, every time you delete a file Windows tells your SSD that a particular set of LBA data blocks are no longer being used by the OS and can be erased immediately.

Without TRIM, the SSD will retain the contents of those LBAs until they are overwritten by another action. In actuality it's a bit more complex than this with the SSD firmware doing some automated wear leveling and garbage collection in association with TRIM, but long story short TRIM is good and helps reduce write amplification and increase write speed overall.
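If you want to double-check that TRIM is actually on, Windows has a built-in query you can run from an administrator Command Prompt; a result of 0 means delete notifications (and therefore TRIM) are enabled:

fsutil behavior query DisableDeleteNotify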

As for over-provisioning, this lets you specify a percentage of the drive to go unused by the OS and be given to the SSD to help maintain performance and extend the lifespan of the drive. Seagate has a good explanation if you'd like to learn more:

Note that in this case, as the amount of over-provisioning increases, the gain in performance is quite significant. Just moving from 0% over-provisioning (OP) to 7% OP improves performance by nearly 30%.
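To put rough numbers on that: setting aside 10% of a 1TB drive as over-provisioned space leaves about 100GB that the OS never sees but that the controller is free to use for wear leveling and garbage collection, on top of whatever spare area the manufacturer already reserves internally.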

It's for these reasons—easy firmware updates, ability to set over-provisioning and TRIM support—that I opted to not use my two 1TB 960 EVO SSDs in a RAID array. While TRIM in a RAID array is technically possible (I've heard Intel RST can do it), I didn't want to have another potential thing to debug for a new build. Also, if I were to upgrade my processor and motherboard down the line, the new motherboard may not recognize the array. And these things are so fast already, I'm not sure I'd see a huge real world performance gain for the added risk.

Tweaking Windows

Making it feel like home

With the computer now up and running with all required drivers, it was time to tweak a few things that I wanted to feel more natural to me coming from macOS or just fix things that annoyed me:

  • Disable User Account Control: Every time I install a new program in Windows 10, I get this really annoying confirmation dialog that takes over the full screen. While this is great for less tech savvy Windows users, it's annoying for me.

  • Install 7-Zip: I'm just really not a fan of how Windows 10's native compressed folders work for unzipping things. I also find it way slower than 7-Zip. 7-Zip lets me right-click on any compressed archive and extract right in place.

  • Use KeyTweak to remap keys to feel more like the Mac keyboard layout: While I was able to easily pair the Magic Keyboard with Windows 10, I did not have any functioning media keys. In addition, I wanted to remap the Windows key (where Cmd sits on an Apple keyboard) to act as Control. That way I could keep the same Cmd (Mac)/Ctrl (Win) placement, meaning things like opening a new tab in Chrome would use the same finger position whether I was on my MacBook Pro or my Windows machine. It makes going between the two machines much less annoying.

    KeyTweak is a rather complicated piece of software that took some time to get used to, but I was able to remap those keys after some poking around. I have not had any issues after setting it up the one time.

  • Install Apple Boot Camp keyboard software: I wanted the Apple-style volume HUD and related features so I installed the Apple Boot Camp software. But since Boot Camp is intended for Macs that have Windows installed, by default it will want to install a bunch of other hardware drivers that I don't need. I found a way to only install the keyboard-specific software.

  • Install Xmeters: For years I have gotten used to seeing CPU and network activity at a glance in my macOS menubar. It's just some nice peace of mind to know if my machine is doing something (or isn't doing something) that I'm expecting. Xmeters is the closest equivalent I could find for Windows; it lets you put a myriad of system stats in your taskbar.

  • Install Seer for macOS-like "Quick Look" spacebar file previews: Because I've gotten way too used to selecting a photo and hitting the spacebar in macOS to preview it. There are two versions of Seer, a free one and a more advanced paid one. I went with the older free one for now.

  • Remove clutter from File Explorer:

    • Hide Quick Access: Just a matter of personal choice. I don't like Quick Access taking up space in the File Explorer side pane, and there's a registry tweak (shared by a Microsoft employee in a support thread) that hides it.

    • Completely uninstall OneDrive: I'm a Dropbox user; I don't want OneDrive everywhere. Even after uninstalling the program normally, it still remains in File Explorer, so you need to clean up the leftover entry with another registry tweak.

    • Remove Adobe Creative Cloud from the File Explorer side pane: Even after completely disabling Adobe Creative Cloud's file sync functionality, the related folder still exists, can't be deleted and just keeps coming back. Yet another registry tweak is in order.

  • Remove the Recycle Bin from the desktop: I prefer to have a clean desktop. This can be done from the desktop icon settings.

  • Rename the PC: So that I don't keep seeing some random PC name on the network or announced by my Bluetooth speaker when I connect. This setting can be found in Settings → System → About

  • Move the task bar to the top: I'm just used to having it up on top.

  • Install Virtual Desktop Manager: The good news is that Windows 10 has native virtual desktops. I often use two, with one dedicated to full-screen Adobe Lightroom. While doing this I kept running into issues with the standard Windows hotkey to switch desktops sometimes not working while I was editing a photo. I also didn't like that the native animation to switch desktops felt a bit slow. I installed the lightweight Virtual Desktop Manager app to change the hotkey and eliminate the slow switching animation. There are more full-featured alternatives as well if you need a bit more control.

  • Install Tiny Hot Corners to enable basic hot corner functionality like macOS: After years of using Exposé/Mission Control, I wanted to bring some of that behavior to Windows so I could easily throw my mouse into a corner of the screen to show all windows and quickly find what I'm looking for (I also used it to quickly sleep the computer or show the desktop). Windows 8 had hot corners, but apparently the implementation was more annoying and it was removed in Windows 10.

    After a lot of searching I found some really bad apps that offered this kind of functionality. Then I stumbled on Tiny Hot Corners. It's an impressively minimal, no-frills executable. You can specify the coordinates of the hot corner zone, the action to run on activation, the delay before it triggers and not much else. But it does the job of opening up Task View for me when I toss my mouse into the corner and it uses very little system resources.

  • Configure keyboard to control monitor brightness using AutoHotkey: I got the monitor brightness buttons on my Mac keyboard working. One benefit of running the Dell UP2718Q is that its settings can be adjusted via the Dell Display Manager app instead of fiddling with the hardware buttons on the display, thanks to its DDC/CI support.

    I also realized the Dell Display Manager executable accepts command line arguments, letting me easily script something basic using AutoHotkey.

  • Install a better launcher to replace Cortana search: As a general app launcher Cortana is fine. I can quickly click the search box in the taskbar or hit the Windows key and start typing. But it's not great. By default the search results are cluttered with suggestions from the Microsoft Store or the web. There are some rudimentary search filters, but it doesn't appear that they can be set as the default. There is a way to permanently disable those suggestions in Cortana settings. When that's done the search field placeholder text says "Type here to search" instead of "Ask me anything" and results look like this:

While the results are filled with fewer suggestions now, I don't like having the expanded search box taking up space in the taskbar and if I hide it, I'll still be pressing a hotkey to open up search. If I'm doing that, I might as well look for a better alternative. One that's even a bit simpler.

Enter a simple third-party launcher, a basic Spotlight/Alfred equivalent for Windows. I use it to launch applications and search for local files easily; for the latter it relies on a separate file indexing service. It can do a bit more than that with various plugins you can configure. But this may be temporary, as it seems like search on Windows is still evolving.

Developer mode + Linux!

Enabling the Windows Subsystem for Linux

It is now possible to run a full Linux environment right inside Windows. This means you can install Ubuntu or another distro and get access to the same bash prompt you'd expect inside Ubuntu. It was this new Linux functionality that was partially responsible for my initial curiosity about Windows 10 and building a new PC. It meant I could also easily carry out my basic web development tasks to maintain and publish to this site; for me that means a simple Ruby and Node development environment.

The Windows Subsystem for Linux lets developers run Linux environments -- including most command-line tools, utilities, and applications -- directly on Windows, unmodified, without the overhead of a virtual machine.

I really can't overstate the magnitude of this. There are some quirks and not everything is smooth sailing, but I've been able to adapt my workflow to it just fine. Your mileage may vary.

First I needed to enable Developer Mode and the WSL feature. I also found some nifty settings on the "For developers" page to enable, like showing the full path in the title bar by default as well as displaying file extensions and hidden files. The entire setup process is pretty straightforward and well documented: pick and install a Linux distro, then create a UNIX user account.
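If you'd rather skip clicking through the Windows Features dialog, the WSL feature itself can also be enabled from an elevated PowerShell prompt (a reboot is required afterwards):

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux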

After that's done you can now just type bash inside any command prompt to get access to your Linux distro's shell.

Install Hyper terminal

It didn't take long for me to dislike the included Command Prompt and PowerShell command-line shells in Windows 10. I wanted something more customizable, so I went with Hyper.

There are a few ways to install Hyper, but I used Chocolatey to quickly install it for me. Chocolatey is like the Homebrew package manager on macOS or apt on Ubuntu. Chocolatey also has a GUI you can install if that's more your style. A few moments later I had the lovely Hyper terminal up and running.
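For reference, assuming Chocolatey itself is already set up, grabbing Hyper is a one-liner from an elevated prompt (the -y flag just skips the confirmation prompts):

choco install hyper -y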

It's rather novel compared to other terminals I have used in that it's built with web technologies; you can even open up a web inspector for the terminal itself! When you want to add a plugin, you just type its name into the .hyper.js preferences file, and when you save, the plugin is automatically downloaded and installed behind the scenes with npm. It's super easy to get started and there are plenty of plugins and themes to explore. Unfortunately, despite all my tinkering, it seems like it's not possible to get a transparent or translucent terminal background on the Windows version of Hyper.

After spending too much time checking out various Hyper plugins and copying over some of my .bash_profile aliases and tweaks, I was ready to get back to work.

By default Hyper uses the Windows system prompt. This means that whenever you want to access your Ubuntu bash prompt you need to type bash at the prompt. This got annoying pretty quickly so I configured Hyper to make that my default prompt. There's just a line you can uncomment in the preferences file:

shell: 'C:\Windows\System32\bash.exe',

I also installed my preferred terminal font and set it as the typeface for Hyper.

Hyper after a few tweaks. I like a fairly simple and dark style.

Setting up my dev environment

My site is all static flat files based on the Jekyll static site generator, so I needed to set up a Ruby environment. I also work on my site from my laptop here and there, so I prefer to install the exact same Ruby version on both machines. I find it easiest to manage Ruby versions with rbenv. I installed rbenv and Ruby by following along with a standard Ubuntu Ruby setup guide and then ran a bundle install in my Jekyll directory to install the gems I use.
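For reference, the rbenv route on Ubuntu under WSL looks roughly like the sketch below; the Ruby version and project path are just placeholders for whatever you actually use, and it's worth checking the rbenv README for the current steps:

# build dependencies for compiling Ruby
sudo apt-get update && sudo apt-get install -y git build-essential libssl-dev libreadline-dev zlib1g-dev
# install rbenv and the ruby-build plugin
git clone https://github.com/rbenv/rbenv.git ~/.rbenv
git clone https://github.com/rbenv/ruby-build.git ~/.rbenv/plugins/ruby-build
echo 'export PATH="$HOME/.rbenv/bin:$PATH"' >> ~/.bashrc
echo 'eval "$(rbenv init -)"' >> ~/.bashrc
exec bash
# install and select a Ruby, then install the site's gems
rbenv install 2.4.3        # placeholder: match whatever version your other machine runs
rbenv global 2.4.3
gem install bundler
cd ~/sites/my-jekyll-site && bundle install   # placeholder project path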

That was pretty much it! I also installed ImageMagick and grunt, which I use for a few things: I resize my photos a few times, compress them and convert some to WebP. I have always done the resizing with a grunt script, but I had a Mac app I used for WebP conversion. I started looking around for a Windows equivalent to let me batch convert photos to WebP and found the lovely XnConvert. Another app was a solid runner-up, but I found XnConvert to be more capable.
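I do the WebP conversion in XnConvert, but if you'd rather stay in the WSL shell, Google's cwebp tool can do the same thing in a quick loop (the quality value of 80 here is just an example):

sudo apt-get install webp
for f in *.jpg; do cwebp -q 80 "$f" -o "${f%.jpg}.webp"; done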

There are a few gotchas associated with this WSL setup. The main one is this: you can't edit a file that originates from the Linux userland from inside Windows. My workaround was to pull down the git repo in Windows, edit it in Atom on Windows and have Jekyll in Linux work with those files. The catch was that I had to ensure git handled Windows line endings for me:

git config --global core.autocrlf true

But you won't run into that issue as long as you do all your Linux stuff inside Linux and all your Windows stuff inside Windows.

Configuring Lightroom

What I do after a clean install

The first thing I do with any new Lightroom installation is move the catalog to my Dropbox folder. I have done this for years, initially to sync catalogs when I used them interchangeably between my Macs. But with this new PC I pretty much only do my editing here, so syncing is less of a requirement now. But I digress; I like having the catalog backed up so that if I mess something up I can quickly revert.

Since I move the RAWs I'm no longer actively editing to my NAS, I went to File Explorer and mapped the NAS as a network drive. I set it as Z:\ and went into Lightroom and updated the locations of archived sets to the new drive path. When I'm done editing a photoset I just drag the local folder to the NAS network drive inside the Lightroom Library module.
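If you prefer the command line over File Explorer, the same mapping can be done with net use from a Command Prompt; the server and share names here are placeholders for whatever your NAS actually exposes:

net use Z: \\mynas\photos /persistent:yes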

Mount Cook, New Zealand

There's not much I need to do aside from tweak a few settings and install one thing to get Lightroom to my liking. I don't really use presets or plugins.

  • Install VSCO Keys: Once a paid app, VSCO Keys is now a free and open source (though no longer updated) shortcut tool I use to speed up my editing workflow. I mainly use it to copy and paste develop settings across photos quickly: I just tap the , key to copy develop settings from a photo and the . key to paste. It can also map keys to some other functions. It has been said that VSCO Keys was broken with Adobe Lightroom Classic CC, but I was able to get it to work, so maybe that was only the Mac version.

Some of the many customizable hotkeys.
  • Keep NVIDIA drivers up to date and ensure GPU acceleration is enabled: While Lightroom doesn't put the GPU to the best use, it is particularly helpful if you're on a 4K display. It was enabled for me by default, but it's good to double-check.

  • Save presets inside catalog: While I don't use many presets, I like having them stored inside the catalog and backed up. Handy if you have any synced catalogs across computers.

  • Increase Camera Raw cache to 20GB or more: As recommended by Adobe. I went to 50GB given that I'm working with much larger 42MP RAW files, so I might hit 20GB more readily.

  • Set JPEG preview to full size for DNG creation: I don't convert my RAWs to DNG very often. There is said to be some performance gain, but I can't really tell the difference enough for it to be worth the lengthy upfront conversion time. But if I end up changing my workflow in the future to use more third-party culling apps, I would want larger JPEG previews baked into the DNG. I might dabble with this more this year.

  • Set Default Develop Settings: Whenever I import new photos from my camera, I'd like to have my default develop settings applied automatically. I don't have many default settings — just Remove Chromatic Aberration, Enable Profile Corrections and set a camera profile other than the default Adobe Standard. As I mentioned above, the list you see in the Camera Calibration » Profile dropdown will vary depending on your camera manufacturer. For my Sony A7R III with Version 4 processing, the profiles feel a bit different from my A7R II so I'm still trying to see what I want to set as my default. Likely Camera Standard or Camera Neutral.

    To change these defaults, go to the Develop module and change any settings on a particular photo that you would like to have as the new default for every photo from that camera. When you're done tweaking, hold down the Alt key on Windows, then click "Set Default..." in the bottom right and accept the dialog that comes up:

And now I just fullscreen Lightroom and get to work:

I usually have Lightroom fullscreened in its own virtual desktop.
Akaroa, New Zealand

Performance

Overclocking & putting the new build to work

I set out to build a speedy Windows 10 PC mainly for my Lightroom photo editing work. One that would be easily upgradeable. How did I do on that goal?

Well, there's still one more thing to do: overclock! I built this computer with the intention of getting some extra performance by overclocking the CPU, RAM and GPU a bit, especially since many of the parts I purchased are geared towards the overclocking enthusiast. With a delidded processor and a liquid cooling system with a large dual-140mm radiator, I should be able to achieve a high, stable overclock that can run 24/7 on this processor.

I spend the majority of my time in the Develop module, a part of Lightroom that does not put extra processor cores to work efficiently. As such, I opted for relatively few CPU cores (compared to going with 8, 10 or more) at a very high clock speed, rather than tons of cores at a lower clock speed.

But why did I opt for 6 cores instead of 4? It seemed like I would be able to achieve a better overclock with the 6-core i7 8700K than with the 4-core 7700K, meaning I could have my cake (extra cores) and eat it too (a high overclocked clock speed). That wouldn't be the case for an 8-core chip, where I wouldn't be able to reach 5GHz+ on all cores and where the extra cores wouldn't be put to good use by Lightroom's inefficient multi-core handling.

And I got lucky. The Intel i7 8700K I have happens to be made from better silicon than other 8700Ks and I can get away with a stable 5.2GHz overclock on all cores. Compare that to the stock setup, which would only ever run all cores at a Turbo Boost of 4.3GHz (the advertised 4.7GHz Turbo Boost is just for one core). As a secondary benefit, the two extra cores markedly improve my Premiere Pro video editing performance; that was not my goal as I use Premiere Pro much less, but it's nice to have. Going with 6 cores seemed like the right choice given the current Lightroom implementation. If Lightroom keeps getting better at multi-core performance for Develop actions, that calculus may change.

Overclocking

How and what did I overclock?

There are two main ways to overclock the CPU: in the UEFI or using a Windows program like the one provided by the motherboard manufacturer. While you can tweak the overclocking basics quickly with the Asus AI Suite 3 in Windows, it doesn't contain every piece of functionality. The controls exposed by the UEFI for this Asus Maximus X Hero motherboard can be very daunting at first. There are pages and pages of discrete settings, voltages, frequencies and more that you can adjust and test.

As much as I would like to provide a guide to overclocking here, that would add many pages to an already long post, and more importantly, I'm not an overclocking expert. In a nutshell, it is lots of trial and error: increase the CPU Vcore voltage a tiny bit, increase the CPU multiplier a step, see if it boots and is stable, repeat until unstable, then back down until stable and lower the voltage as necessary.
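
You can't script the UEFI itself, of course, but the logic of that loop is simple enough to write down. This is a purely illustrative Python sketch of the process I follow by hand; `is_stable()` is a stand-in for applying the settings in the UEFI, booting, running a stress test and watching temperatures. Nothing here touches the hardware.

```python
# Illustrative only: this mirrors the manual tuning loop, it does not touch the UEFI.
VCORE_CEILING = 1.40    # my personal limit for 24/7 use, in volts
VCORE_STEP = 0.01
START_MULTIPLIER = 47   # a starting point; stock all-core Turbo Boost on the 8700K is 43

def is_stable(multiplier: int, vcore: float) -> bool:
    """Stand-in for: apply settings in the UEFI, boot, run a stress test, check temps."""
    raise NotImplementedError("this part is done by hand, not by code")

def find_overclock(multiplier: int = START_MULTIPLIER, vcore: float = 1.25):
    best = None
    while vcore <= VCORE_CEILING:
        if is_stable(multiplier, vcore):
            best = (multiplier, vcore)   # remember the last known-good combination
            multiplier += 1              # stable? try one multiplier step higher
        else:
            vcore += VCORE_STEP          # unstable? feed it a little more voltage
    return best                          # settle on the last stable multiplier/voltage pair
```

In practice each pass through that loop is a reboot, a stress test and a lot of waiting, which is why dialing in a final overclock takes days rather than minutes.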

Fortunately, overclocking is much easier than it was in the past.

Many enthusiast motherboards have simple two-digit error code displays directly on them to help you understand why your machine is not booting. And if you attempt some crazy settings, the motherboard will most likely catch the error and simply reboot for you with safe settings. There are other motherboard safeguards too: if you somehow corrupt your BIOS and can't POST at all, you can use the BIOS Flashback feature (again, it's UEFI, but half the stuff is still labeled BIOS) to flash or update the firmware from a file on a USB drive. Back in the day, a corrupt BIOS flash would have just meant a dead board.

Even with those new debugging tools and safeguards, your first time overclocking can still be a nerve-racking experience. Intel now lets you purchase a . I think it's a bit gimmicky but it's insurance: Intel will replace your CPU once if you kill it. I've only killed a processor from overclocking once — a 3.06GHz Northwood Intel Pentium 4 that I sent too much voltage to — and that was in 2002 and Intel sent me a free replacement anyway. I do applaud Intel for being so overclocker-friendly these days though. Of course, I already voided my warranty by delidding the CPU so that protection plan is not an option for me.

Fortunately, it's easy to find great overclocking guides for your exact hardware as well as general rules of thumb:

  • : A good starter guide on using the ASUS GPU Tweak II software to eke out some extra performance from their ROG Strix GTX 1080 Ti.

  • : An easy to follow video guide on overclocking the GTX 1080 Ti but this time using the MSI Afterburner software which I tend to prefer as well.

  • For those that are comfortable flashing their card to an XOC BIOS for more voltage control, or even extreme liquid nitrogen cooling and hardware voltage mod tweaking.

  • : Yea, I know this is not a Coffee Lake guide for an 8700K, but it's a thorough guide written by ASUS themselves and a fantastic primer for navigating around the UEFI BIOS on ASUS Maximus boards and more.

  • : And a great video tutorial walking through the UEFI BIOS and explaining what things do as it's being overclocked using my exact hardware.

  • : While memory overclocking is covered sufficiently in other guides, here's a much deeper dive on the topic for those interested.

Results

I'm still finalizing my exact settings but it's looking like this is a stable overclock for me:

  • CPU: 5.2GHz (52 CPU multiplier, 47 uncore multiplier) on all cores at 1.42V with -2 AVX offset

  • GPU: 2012MHz GPU clock, 5602MHz memory clock (11,204MHz DDR effective clock) at 1.06V with 120% power limit

  • RAM: DDR4-3333 at CL14 at 1.4V (using Maximus Mode 2)

The graphics card overclock isn't terribly necessary: Lightroom doesn't use the GPU much, it's already such a fast card, and I won't see any real gain in games since I'm already exceeding 60fps at 4K.

Given that this is quite an overclock, I should probably mention the idle and load temperatures. With the Corsair liquid cooler set to silent mode, the idle temperature runs somewhere between 29-32°C depending on the ambient temperature in my room. Under Lightroom loads with the fans kicked up, it approaches 55°C. Under complete 100% load, like a benchmark or Premiere Pro rendering a video, it can reach up to 72°C.

If I keep the fans in silent mode, I hit around 79°C at load. Given that this is a delidded CPU running higher voltage than normal and on water, it seems unlikely that such a high overclock could be sustained without a good liquid cooling system. Compare that to the max load temperature of around 52-56°C that I saw at stock clocks (4.3GHz Turbo Boost on all cores).

Benchmarking

What do six 5.2GHz CPU cores mean for Lightroom?

Now that everything is finally up and running as I had intended, it's time to see how this new PC stands up to my typical Lightroom workflow. First off, Lightroom Classic CC opens quickly, in about 5 seconds, thanks to the 960 EVO M.2 SSD.

The actions I care about are all in the Develop module: things like the responsiveness of dragging around the spot removal tool or adjustment brush, as well as simply scrolling through the filmstrip with Lightroom fullscreened on my 4K display. Unfortunately, I do not know of an easy way to benchmark actions in the Develop module.

Anecdotally, I can say everything in the Develop module is faster. I wouldn't say it's instantaneous or snappy — I mean we're still dealing with massive RAWs using relatively unoptimized software. But it's a marked improvement. I'd like to think this is the best performance I'd be able to achieve in the Develop module with any number of cores; only a higher clock would help more.

Then there are the items that are easier to benchmark: import, export, DNG creation and HDR merging. These are mostly tasks that can benefit from more cores, to varying degrees. While I did benchmark them, they're not really the type of task I was aiming to optimize with this build, so these results are a bit beside the point.

I selected a set of 350 42MP RAWs (15GB) from my recent visit to the and ran them through importing (with the copy option), 1:1 preview generation, DNG conversion and exporting. I ran these all multiple times and averaged everything. Since I initially built this machine with a quad-core Intel 7700K and Asus Maximus IX Code motherboard before upgrading to the 8700K, I also ran the benchmarks on that rig. In addition, I benchmarked each both with and without an overclock. For the 7700K I was able to get to 5GHz stable, and with the 8700K I'm at 5.2GHz on all six cores.

I'm not a professional benchmarker, so I wouldn't call these results totally accurate, but they are directionally accurate. For example, on my earliest 7700K benchmarks I think I didn't wait long enough after import before starting to build previews, so Lightroom was still applying my default camera profile corrections at the same time. Had I waited, that preview time probably would have been a tad faster.
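
There's no scripting hook into Lightroom for these tasks, so my "harness" was really just a stopwatch and a spreadsheet. Here's a minimal sketch of the bookkeeping, with made-up times standing in for the hand-recorded runs:

```python
from statistics import mean, stdev

# Times in seconds recorded by hand with a stopwatch.
# These numbers are placeholders, not my actual results.
runs = {
    "import (copy)":          [312, 305, 309],
    "1:1 preview generation": [544, 551, 539],
    "export full-size JPEGs": [423, 418, 430],
}

for task, times in runs.items():
    print(f"{task:24s} mean {mean(times):6.1f}s  spread ±{stdev(times):4.1f}s")
```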

Building 1:1 previews on the 8700K @ 5GHz (set to 5.2GHz, but Lightroom counts as an AVX process so it runs at the -2 offset)

Below you'll see two screenshots of the 8700K at work building 1:1 previews (left) and exporting photos (right). When I talk about Lightroom being efficient or inefficient with multiple cores, you can see the difference in CPU usage in Task Manager between the two tasks. While neither uses the CPU perfectly, exporting images is much more efficient. Building previews, on the other hand, is comparatively all over the place.

Building 1:1 previews (left) and exporting images (right). Notice the difference in CPU usage.

For the merge to HDR benchmarks, I selected 15 RAWs (five 3-shot brackets) and timed merging a single 3-bracket stack to an HDR in headless HDR mode. I did this many times for each set to find consistent times, then averaged them all together. I have less faith in the HDR numbers, as I could see dramatically different times just by closing and reopening Lightroom and running the HDR again, despite clearing the cache and so on.

5 stacks of 3 bracketed shots during headless HDR benchmarking.

Some thoughts on these numbers: first off, the import numbers are kind of cheating. The iMac and MacBook Pro were copying from and to the same SSD whereas the PC had two SSDs that I was copying between as I was moving the files to the dedicated SSD I use for Lightroom.

Second, maybe I'm doing something wrong, but the times on the Macs were faster than I was expecting. My hypothesis is that Lightroom Classic CC for macOS is more efficient at certain tasks than the Windows version. That definitely wasn't the case with Develop actions, which felt sluggish on the Macs.

And third, I was expecting a more significant gain with HDR merging between the machines but they were fairly minimal.

Update 2/4/2018

Adobe released an update right after this post came out and the new release has some very significant performance upgrades. It seems I was correct in saying that the macOS version of Lightroom had felt comparatively faster at certain tasks. Adobe added me to the prerelease program for Lightroom Classic CC and I got to take the new performance improvements in 7.2 R5 for a spin:


In short, this update brings the largest increase in Lightroom performance of any Lightroom update I can recall. Beyond these benchmarks, I found a massive responsiveness increase in filmstrip browsing and navigating from photo to photo in the Library module (with a smaller improvement in the Develop module as well) and a noticeable performance bump when using the adjustment brush in the Develop module.

I also ran the Premiere Pro PPBM H.264 encoding benchmark. Obviously this is a benchmark that puts the graphics card and every core to very efficient use. Again, not something I was optimizing for with this build but a nice secondary benefit from my move to 6 cores. What I'm showing here is different kinds of GPU acceleration: CUDA (Nvidia GPU), Software (no GPU acceleration) as well as OpenCL and Metal acceleration for the Macs.

An early 5.2GHz Premiere Pro benchmark while I was still finding my ideal stable overclock. This was without an AVX offset and was set using the Asus AI Suite overclocking app. I was using extreme voltage here; I absolutely do not recommend going past 1.4V Vcore. I went a bit overboard.

As expected, the Premiere Pro encoding times went down considerably with the addition of more and faster cores, and were monumentally lower with GPU acceleration enabled.

What's next?

Nothing's ever really finished.

I've been really happy with this machine so far. First off, I just love the aesthetics and the dark theme inside the case. While my goal was not to make a gaudy PC with a window to show off everything, I think this build was tastefully done. I do kind of wish the case was a tad smaller, but I wouldn't compromise for a smaller radiator or limit my motherboard options to get that.

This PC also does exceedingly well at 4K and VR gaming, which I was going to talk about in this post but decided to leave out. I also got an Oculus Rift and an extra sensor for 360° play and was thoroughly amazed the first time I used it and went through the demo. And I got that same feeling when I played for the first time. When I played for the first time. And again when I used for the first time. And when I fired up a massive virtual computer display in front of me in VR with . But after many hours spent in VR over several months, I started to become annoyed at the low resolution of the Rift (nothing ever feels sharp, especially text) and the so-called god rays. Maybe I'll give it another shot with the next generation of higher-resolution HMD hardware.

After a lot of tweaking to get Windows 10 to my liking, I've really come to like it, though to be frank I'm not sure it will ever feel as natural as macOS to me. It can do everything I need, no problem, but some of the Metro/Fluent design inconsistencies and the very involved ways of getting certain tasks done (try using Task Scheduler) make it clear that there are definitely parts of Windows 10 that were swept under the rug.

The big surprise for me was how good the current state of the Windows Subsystem for Linux is and how well it can take care of my web development needs.

But for how fast the PC is, it only makes me realize how much I'm being held back by Adobe's subpar Lightroom optimization.

It's like I have a fast hypercar but can only use bald tires and a worn clutch that can't put all the power down. I have been editing photos in Lightroom for years. I even remember when in 2006! At that time photographers had Photoshop, Camera Raw and Bridge to do all their RAW photo editing. It wasn't the best workflow, and Photoshop had become an unwieldy behemoth for photographers wanting to focus on bulk, basic image edits.

I don't see myself entirely leaving Lightroom for one of the many competitors: notably but there's also , , and . I've given other applications a shot but always come back to Lightroom.

While I'm really tied to the Develop module in Lightroom, maybe there are other parts of my workflow I can optimize. The best candidate would be trying to do my photo culling outside of Lightroom. That would mean fewer photos imported, fewer photos that need to have previews generated and so on. Even with 1:1 previews generated in Lightroom, navigating around photos in the Library module fullscreen view is not lightning quick.

There are two programs that come to mind for quick RAW viewing and culling: and . Photo Mechanic has a pretty powerful culling workflow and related file-management functionality. However, if I shoot RAWs alone, it only shows the small embedded JPEG inside the RAW file, which in the case of my Sony A7R III appears to be 1616x1080.

Fast Raw Viewer lacks some of the advanced culling features and is more focused on being a RAW viewer. It views the embedded JPEG inside the RAW file, but you can also set it to process the RAW and render that original image instead of the small embedded JPEG. The latter is not quite as fast as just displaying the JPEG but is handy when you want to zoom all the way in.

With Photo Mechanic you open your folder of RAW files then browse through them and tap a key to tag (select) any shot you want to end up importing into Lightroom. You can also assign a rating or color value during this process. When you're done culling, you can have Photo Mechanic only show the selected items, then you can simply drag them into Lightroom. Lightroom will pop open an import dialog and only the photos you culled will get added to Lightroom. And as long as you specified to have the photos added and not copied, any rating or other metadata you added in Photo Mechanic is visible in Lightroom as well.

If your Lightroom catalog allows for changes to automatically be written into XMP, any rating and metadata updated in Lightroom will be reflected in Photo Mechanic instantly. Unfortunately, it does not appear to work the other way around automatically: you'd have to go to the Metadata menu and select "Read metadata from file" to have it update in Lightroom.
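
Those XMP sidecars are just small XML files sitting next to the RAWs, which is what makes this kind of interoperability possible in the first place. As a rough illustration (not part of either program), here's a minimal Python sketch that pulls the star rating out of a sidecar; depending on which application wrote it, the rating appears either as an xmp:Rating attribute or element:

```python
import re
from pathlib import Path

def read_rating(xmp_path: Path) -> int | None:
    """Return the star rating stored in an XMP sidecar, or None if there isn't one."""
    text = xmp_path.read_text(encoding="utf-8", errors="ignore")
    # Matches both xmp:Rating="3" (attribute form) and <xmp:Rating>3</xmp:Rating> (element form).
    match = re.search(r'xmp:Rating(?:="|>\s*)(\d+)', text)
    return int(match.group(1)) if match else None

# Example: list ratings for every sidecar in a folder (the path is hypothetical).
for sidecar in Path(r"Z:\photos\new-zealand").glob("*.xmp"):
    print(sidecar.name, read_rating(sidecar))
```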

Photo Mechanic 5: powerful, though not the most aesthetically appealing UI to work in.

The benefit of simply viewing the embedded JPEG is that loading and going between photos truly is instant. Though for my exact needs, it leaves me wanting more. My typical culling process is not based on viewing the overall composition. I usually want to zoom 100% into the photo to see if the focus is sharp — for example if I have many similar shots of the same scene and only want to keep the one with the best focus, especially when I shoot with manual focus. Furthermore, sometimes I don't know whether I should keep a photo until I tinker with basic Develop settings in Lightroom like tone and cropping.

For the first issue about not being able to see the original, I reached out to the maker of Photo Mechanic. They offered one workaround for the Windows version's inability to render RAWs: just tell my camera to shoot RAW+JPEG. Photo Mechanic then displays the pair as one item and lets me view the full-size image quickly. Interesting, but that would significantly increase my storage needs while traveling. If my New Zealand trip exceeded 800GB, I can't imagine how many more SD cards I'd have to carry along with RAW+JPEG enabled.
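
To put a rough number on that, here's the back-of-the-envelope math. The per-JPEG size is my assumption (extra-fine JPEGs from a 42MP sensor are large), not a measured figure:

```python
# Rough estimate of the extra card/disk space RAW+JPEG would have cost on the NZ trip.
photos = 11_000
jpeg_mb = 20                  # assumed average size of an extra-fine 42MP JPEG, in MB
extra_gb = photos * jpeg_mb / 1024
print(f"~{extra_gb:.0f} GB of additional JPEGs")   # prints roughly 215 GB
```

That's a couple hundred gigabytes of duplicated image data just to make culling faster, which is a tough trade while traveling.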

Then there's my desire for some basic editing functionality during culling, just to see if a shot is worth keeping. Ideally, I would be able to tag photos in Photo Mechanic from a set I had already imported into Lightroom and see the new selections automatically appear in a rating filter in Lightroom. That would let me select a photo in Photo Mechanic, see it appear in Lightroom and then tinker with Develop settings or zoom further into the photo if it's one of those shots I'm not yet sure I want to keep. Like a two-way sync.

I do recognize that there is tremendous value in having just the quick preview functionality, and I'll keep tinkering to see if I can change or improve my workflow with it in some way.

But here we are in 2018 and Lightroom feels like a sluggish, unwieldy behemoth. Is history set to repeat itself?

It looks that way. Adobe recently spun off a simpler version called Lightroom CC. That's not the Lightroom Classic CC I've been referring to in this article. If I had to guess, it seems like Adobe has some different goals for Lightroom CC:

  • Entice new photographers and inspire casual photographers to try out the Lightroom ecosystem.

  • Enable entirely new and simple synced and mobile workflows for the casual user, functionality enabled by requiring all photos to be backed up to Adobe's cloud. Also, remove the traditional pain point of storage management.

  • Make some extra money in the process by requiring everyone to pay for each terabyte of cloud storage. (Though it appears that the most you could ever have is 10TB which excludes most prosumer/professional photographers.)

Lightroom CC is a native application for your computer along with companion mobile apps. It has a much simpler interface. There are no module tabs like in Lightroom Classic CC, just a pane on one side that acts like a basic Library module for browsing and managing your imports, and a pane on the right that expands to show Develop-like functionality, now called Edit. It's definitely more pleasant on the eyes. Labels and sliders are larger and easier to use on a 4K display.

Upon importing a set of photos, Lightroom CC immediately goes to work uploading all of those massive RAW files. You can pause it temporarily, but eventually they will need to be uploaded. You can specify how much of your local drive to give up for the photo cache and Lightroom CC will selectively remove and download photos when that cache gets filled.

Top and bottom right: Nelson Lakes National Park, New Zealand.
Bottom left: Wharariki Beach and the Archway Islands, New Zealand.

Performance-wise, it felt identical to Lightroom Classic CC in my brief back and forth comparison between the two applications. I think it was just the less cluttered and simpler interface that tricked me into liking Lightroom CC more initially and thinking it felt faster than it actually was.

If I like the UI of Lightroom CC more, then why don't I just use it? It lacks a number of features I rely on. Well, I shouldn't say lack, because the target audience for Lightroom CC probably doesn't care about these things:

  • Ability to merge HDRs or stitch panoramas. This is a big one.

  • Ability to split original files across various drives (I )

  • Support for multiple catalogs

  • Support for tethering and watched folders (Though I've only ever used this functionality once for a portrait studio)

  • Color labeling

  • Ability to set custom sort order

  • Range masking

That list used to include presets and tone curve adjustments but they were added recently.

Another part of the Lightroom CC story is around using machine learning — Adobe calls their tech Sensei and it appears to be used in two ways so far. First, the "auto" button now takes the shot into account and tries to find ideal settings for it compared to other shots. Second, Sensei enables easier searching by automatically tagging photos with keywords.

While I like the pitch and how bold Adobe is being with the cloud and machine learning approach, I won't be leaving Lightroom Classic CC anytime soon. Classic is for high-volume photographers who can handle and prefer to manage their own asset storage (and have numerous backups), and who aren't keen on doing any kind of real editing on a mobile device.

But Lightroom CC is good for one thing: it shows me that Adobe knows what they're doing and can make some good software. I hope that this will trickle its way down into future Classic improvements. And if one day Lightroom puts multiple cores to use efficiently, you can be sure I'll come back here with another long post about building a 24+ core machine optimized for Lightroom.

Please share :)

If you enjoyed this post, please share it with your friends and followers. It took me several months of spare time to write this and is currently my longest article out of 1,200+ on this site.

While I'm on the subject anyway... I really, really loathe the current trend of compositing photos. You've no doubt seen this practice on some popular Instagram accounts: taking parts from several different photos and combining them into one to make it appear as though it was actually captured in camera. Photo looks too good to be true, with a perfectly placed flock of birds in just the right spot in the background? It's likely a composite. Unfortunately, it's also common to use the clone tool to add or remove significant parts of a photo to get a desired effect. It just feels like lying to the viewer in a way that boosting shadows or adding vibrance never would.

One thing to note is that we're on the 3rd generation of Intel desktop processors built on a 14nm process, each with rather incremental improvements. The next big update is rumored for Q2 2018 and will be a 10nm process with a microarchitecture code-named Cannon Lake. So if you can wait until then to build a computer, you might have some impressive CPU/chipset options at your disposal.

As explained in the memory section of this post earlier, you want low timings at a fast clock. To beat CL14 at 3333 (8.4ns) I'd have to run something like CL15 at 3600 (8.33ns), and that would barely beat it. Other high-end RAM kits with XMP settings of CL16 at 3800 or CL17 at 4000 would not beat CL14 at 3333. Besides, those are increasingly high overclocks, and at those speeds you risk memory controller stability when paired with a very high CPU overclock, all for less than a 1% performance gain. I decided my slight modification of the XMP settings was plenty, and already much faster than the Coffee Lake reference of 2666MHz.
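
The comparison is easy to sanity-check yourself: true latency in nanoseconds is the CAS latency divided by the memory clock (which is half the transfer rate), i.e. CL × 2000 ÷ MT/s. A quick Python check of the kits mentioned above:

```python
def true_latency_ns(cl: int, transfer_rate_mts: int) -> float:
    """CAS latency in nanoseconds for DDR memory at a given transfer rate (MT/s)."""
    return cl * 2000 / transfer_rate_mts

for cl, mts in [(14, 3333), (15, 3600), (16, 3800), (17, 4000)]:
    print(f"CL{cl} @ DDR4-{mts}: {true_latency_ns(cl, mts):.2f} ns")

# CL14 @ DDR4-3333: 8.40 ns
# CL15 @ DDR4-3600: 8.33 ns
# CL16 @ DDR4-3800: 8.42 ns
# CL17 @ DDR4-4000: 8.50 ns
```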

Storage concerns while traveling aside: The Photo Mechanic folks did tell me about a way to delete the JPEGs to save disk space after I've completed my culling.

Published 22 Jan 2018

