Monday, 1 January 2018

How to set up SteamVR with the HTC Vive

IT'S THE NEW YEAR!

These last few weeks Meshminds gave me the opportunity to work with the HTC Vive and I've decided to build a small VR experience using Unreal Engine. I had a "VR Ready" laptop (Gigabyte Aero 15), and yet I must confess that the setup for the Vive is not as easy or straightforward as it seems. Here are some notes on the process...

Setting up the Play Area



Recommended setup

First, you'll need to clear the play area. In our case, having just moved back to Singapore, our boxes haven't arrived from London just yet, so we simply moved all the furniture out of the way. Technically speaking, all you need to clear is roughly a 3 x 4 metre patch of floor, with the two lighthouses no more than 5 metres apart on the diagonal. But let's be realistic: not everyone is going to have this luxury of floor space in their home or rental flat. And even if you think you've cleared everything (and even when you see the Chaperone boundaries) you'll probably still violently thwack something with the controllers or stub your toes on the furniture you stowed away in the corner at some point.



[Note that the average HDB flat living room will exceed the floor area required, so just take care that the two lighthouses are no more than 5m apart from each other]

Vive Installation


The Vive came in a reasonably compact box, but once I took out the items from their neatly packed state, the cables somehow seemed to gain volume and a wilful desire to take over the entire room. After moving the equipment from one room to another (whilst trying to decide which was the room to use), I found myself spending time detangling cables over and over again.


Cable explosion after I took it out. Even with the diagram it's difficult to put it all back in after you uncoil everything...

If you are renting, then you probably don't want to do anything so drastic as to drill a permanent mount into the wall. After cracking our brains on what to do, we found two cheap $5 selfie sticks with 1/4" camera screw mounts that also fit the Vive Lighthouse's screw base. We took the selfie stick apart and mounted the lighthouse on the flat stick, which gave us an adjustable mobile rig that we could stick to the wall with a combination of 3M Command tabs and duct tape. (Note that the base station generates some vibration, so you definitely want to pile on the duct tape for safety)






DIY Vive Lighthouse Mounting - NOT PRETTY BUT IT WORKS



Troubleshooting


After spending countless boring hours installing new drivers and updates for Windows and the graphics card, we were able to run SteamVR intermittently. One recurring issue related to the VR Compositor. Every time we restarted the entire system, the first run would always return "Shared IPC Compositor Connect Failed (306)" and "Compositor is not available (400)", even when everything was already set to use High Performance graphics (defaulting to the better GPU in the laptop). It seems to be a common issue - if you search online you will find several forums where many lost people are asking the same question: "HOW DOES I COMPOSITOR???"





Apparently this error is caused either by having too many display devices (monitors, etc.) connected, or by the integrated Intel graphics somehow taking precedence over the Nvidia GTX 1060 in this laptop. The fix is to go to SteamVR Settings, Disable Direct Mode, and then Re-enable Direct Mode. SteamVR will restart each time you do that. Sometimes this works, SOMETIMES THIS DOESN'T. Okay, most of the time it will fix it.



At this point it seems worth asking, so what is the VR Compositor? If you read Valve's documentation, it says:

The Compositor simplifies the process of displaying images to the user by taking care of distortion, prediction, synchronization and other subtle issues that can be a challenge to get operating properly for a solid VR experience.

If I understand it right, the compositor runs the VR display, and when an application wants to use the HMD (Head Mounted Display) it asks the compositor for access to a buffer and begins to render into that buffer. The eye buffers are a bit like layers stacked on top of one another. Vive's Chaperone is an example of one such layer that asked the compositor for access at the very start - which is why you see the room boundary in the HMD when you walk too close to the edge.



The 3D world is rendered normally into each of the eye buffers (one for each eye!), and then a warp pass applies the spatial and chromatic distortion over the eye buffers so that the final image matches the curved lenses on the HMD and you see it as a 3D image in the headset.

[Another issue we had was hardware related - one of the lighthouses seemed faulty - one LED does not light up (only 16 LEDs are lit instead of 17). In the meantime, we simply used the working Lighthouse as the front-facing one and tried not to turn 180 degrees in a hurry so the view wouldn't grey out. Apparently it still mostly works with one Lighthouse so long as you are still kinda facing that single Lighthouse]

ANOTHER FUN READ: Teardown of the HTC Vive

To take a screenshot in SteamVR, press System+Right Trigger at the same time. The prompt in SteamVR is quite confusing - I read it as a suggestion that you press System and THEN press Right Trigger. So I thought the screenshot feature was broken until George told me to press BOTH AT THE SAME TIME. As for where these screenshots are being saved, it's really not very obvious either. For example, you can't set the folder they get saved to. It took me a while to figure this out, but the screenshots are being saved into the userdata folder in Steam. For me, by default the screenshots were being saved here:

C:\Program Files (x86)\Steam\userdata\760\remote\250820\screenshots\

Besides the 250820 folder, there were other numbered folders for different apps or sessions. Look in all of the folders inside remote to find your screenshots.
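
If you don't fancy clicking through every numbered folder by hand, a few lines of Python will list whatever is sitting in any screenshots folder under userdata. This is just a sketch - it assumes the default Steam install path, so adjust it if your Steam lives elsewhere:

    from pathlib import Path

    # Default Steam install location -- an assumption; change this if Steam is installed elsewhere.
    userdata = Path(r"C:\Program Files (x86)\Steam\userdata")

    # Each numbered folder under 'remote' can contain its own 'screenshots' directory,
    # so just walk the whole tree and print anything found inside one.
    for shot in sorted(userdata.glob("**/screenshots/*")):
        if shot.is_file():
            print(shot)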

Also note that the screenshots are PRE-warp/distortion. So you'll get screenshots that all look like two normal views without the overlay of the other buffer layers such as the Vive Chaperone, etc.



Now for the fun... testing out various VR apps!



The Lab

On Steam: http://store.steampowered.com/app/450390/The_Lab/



First stop is surely The Lab. Pet your robotic dog on a mountainside and play with other very solid VR experiments in The Lab. This free title is deservedly highly rated - you'll definitely find quite a number of VR games appearing on the market which are simply clones / rip-offs of the devices from this Portal-themed collection.

VR Museum of Fine Art

On Steam: http://store.steampowered.com/app/515020/The_VR_Museum_of_Fine_Art/

Art? What art? I came here specially to look at the potted plants used to decorate the dusty corners of the museum. 10/10.



Google Earth VR


Google Earth VR is super impressive in 3D cities like London where you can fly around in a warped clone universe...



and with a wave your hand you can instantly change the time of day into THE TIME OF THE APOCALYPSE





There is also another frightening feature...



AIN'T NOBODY WANTS TO FIND THEMSELVES FLOATING OVER EARTH AT THIS ANGLE!!! URGHHH. Which brings us to the issue of sim sickness. It's not great when the motion in the graphics suddenly does not correspond with the user's own head movements...

Job Simulator




In Job Simulator, you play a human being assigned to do some artisanal jobbing for robots who have developed a refined taste for their services and products being made sloppily and badly by humans in a world of limitless machine perfection. I liked this game too much, because I'm not very good with following instructions in an open roaming world. I haven't tried all the jobs yet, but here I was being a chef - burning lemons on the grill, drinking virtual wine on the job, and throwing raw meat at the frowning robot customers.

SUPERHOT




Superhot works on the premise that time only moves when you move. The graphics are simple, but the rapidity with which things move when you accidentally make a wiggle of your head in VR is very effective. It also, however, leads to a lot of unintentional arm or hand flapping when you somehow need to make things go faster, so it was quite amusing to watch George playing this.

Saturday, 30 December 2017

Blender & Unity: Manually Rigging Blender Humanoid Characters for use with Unity Mecanim


I'm definitely no character animator by trade, but there comes a time when you end up with a Unity project that somehow requires it. There are obviously many automatic rigging methods available (Blender does actually have an auto-rigging system called Rigify for biped humanoids) and you could even try to download rigs made by other people and plonk them into your scene, but I found that many of the rigs, including the Rigify one, involve so many complicated bones you don't need that you end up sifting through the bones, deleting unwanted bones, renaming bones, and perhaps coming away with the impression of the impossibility of rigging up them bones.

Although it may seem terrifying at the beginning (I'm not an animator or rigging specialist!), I found that surprisingly, it is not that difficult to manually rig up all your bones if what you have is a very simple humanoid character. You just need to be orderly and to stick with the admittedly tedious bone naming process. (Although our character is blobby, we're sticking with a humanoid as we're going to use it with the Kinect to sync it with the movement of the human user, and our human user is going to return a humanoid set of values that we'll need to rig up our character to...)



According to the Unity Blog's post on Mecanim Humanoid:

"The skeleton rig must respect a standard hierarchy to be compatible with our Humanoid Rig. The skeleton may have any number of in-between bones between humanoid bones, but it must respect the following pattern:"
Hips – Upper Leg – Lower Leg – Foot – Toes
Hips – Spine – Chest – Neck – Head
Chest – Shoulder – Arm – Forearm – Hand
Hand – Proximal – Intermediate – Distal



This is the list of all the bones you need (I found it useful to copy and paste in these names directly)

head
neck
collarbone.L
collarbone.R
upperArm.L
upperArm.R
lowerArm.L
lowerArm.R
hand.L
hand.R
chest
abdomen
hips
upperLeg.L
upperLeg.R
lowerLeg.L
lowerLeg.R
foot.L
foot.R
toes.L
toes.R

Optional: eye.L and eye.R

For starters: Ensure that your character model is positioned at origin and that its pivot point is also at origin (0,0,0). Make sure you reset the scale to 1 just in case (Ctrl+A, then select Scale). The hip bone is the key bone in all this, so start by creating one big bone going from the bottom of the hips to the top of the chest. Hit Space, start typing "Subdivide Multi" (Armature) and give it 2 cuts so you get 3 bones. These will form the hips, abdomen and chest bones.



After you've done the main spine bones, you can turn on x-axis mirror.



- Select the ball on top of the bottom bone (hips bone). Make sure Options>Armature option>X-Axis Mirror is selected, then press Shift-E to extrude mirrored bones. When you're in mirror mode, every time you create a new bone, you'll have a second one mirrored on the other side of the X-Axis. Remember that you'll have to rename BOTH bones later on - if you are facing your model face-on, also remember that L is actually to the right and R is to the left, and name it accordingly.

- Arrange the leg bone into position (you may need to uncheck "Connected" in order to let the leg bone move into the right position). Reposition the leg bones away from the hip. Subdivide Multi (1 cut) this leg bone into two bones, forming upperLeg and lowerLeg.

- Shift-E to extrude two more foot and toe bones, and also add in the collarbone, arm and neck+head bones. Do make sure you keep it all in a standing T-pose (as if the character is standing in the shape of the letter T).

- Ensure that all of your bones are renamed correctly as per the list. If there is an L bone there must always be an R bone (a quick checking script follows this list).

- Go into Object Mode and select first the character and then Shift-select the armature. Press Ctrl+P and select Set Parent To - Armature Deform - With Automatic Weights. Your computer might lag for a second before it's all connected up.
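
If you'd like to double-check the renaming before (or after) the parenting step, here's a rough sketch you can paste into Blender's Python console. It assumes the armature is the active object in Object Mode and uses the bone names from the list above:

    import bpy

    # Bones that exist once, and bones that need both a .L and a .R version.
    single = ["head", "neck", "chest", "abdomen", "hips"]
    paired = ["collarbone", "upperArm", "lowerArm", "hand",
              "upperLeg", "lowerLeg", "foot", "toes"]

    bones = {b.name for b in bpy.context.active_object.data.bones}

    # Every required bone name should be present...
    for name in single + [p + side for p in paired for side in (".L", ".R")]:
        if name not in bones:
            print("Missing bone:", name)

    # ...and every .L bone should have a matching .R bone (and vice versa).
    for name in bones:
        if name.endswith((".L", ".R")):
            partner = name[:-1] + ("R" if name.endswith("L") else "L")
            if partner not in bones:
                print("Unpaired bone:", name)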



From there, you're in the home stretch. Export your Blender model in FBX format and then import it into Unity, and in Unity set the rig to humanoid (instead of generic) and at the bottom of that, hit Apply.

Let the wild rigging begin!



See also:
Animate Anything with Mecanim

Sunday, 17 September 2017

Singaporean Landscapes: Cut-and-paste greenery

With the opening of After The Fall at the National Museum of Singapore in a few days' time, I thought I should post up a series of entries about the design process of the holograms that I've been working on.

When I began thinking of my desired aesthetics for an imagined "Singaporean landscape", I couldn't think of anything more apt than clean textures and hard cut lines in the 3D models; that impression of some master designer wildly clicking Ctrl-V all over the landscape. Bizarrely hard contrasts between forms - such as this example of a generic industrial property constructed within a hair's width of a highly decorative Chinese temple:


View from the Bedok Park Connector: Taoist Federation Building meets Industrial Carpark on Bedok North Ave 4
(Photo: Debbie Ding)

Or another example is this pristine specimen of a concrete and metal crash barrier that I encountered the other day whilst out walking around Bedok - so blindingly white and perfectly new that it may as well have been a 3D render:

20170827_101146

20170827_101150

20170827_101153

IT'S REAL



For efficiency, the top tip would be to cut away any faces/vertices that never appear within the camera view, so no processing power is wasted on information that you will never get to see in the final render. So for a very tall tree, you can simply cut off the tree's crown, as long as you won't see it and you don't need the shadow of the leaves to affect the overall scene.



If you did it right, you'll be able to dramatically reduce overall file size and render time. But the visual outcome is that in the file you'll also see a lot of hard cuts in the virtual greenery... And whilst I was working in the isolation of my flat in London a few months ago when this project first began, I was actually worried that I might have overdone things with my.. er... overzealous cutting-and-pasting. But I need not have worried! For upon returning to Singapore, I was gratified to see countless examples of this highly efficient cut-and-paste greenery at work:

20170827_104433

Recently the trees around Bedok Reservoir seem to have been subjected to a round of very fastidious tree pruning, which looks like a model picture of the cut-and-paste public greenery that was in my head. A cutting exercise facilitated no doubt by what they describe as more cutting-edge tree monitoring tech...

20170827_104545

20170827_104532

20170827_104730

Sunday, 3 September 2017

Goodbye Mac, Hello Windows: Windows alternatives to Spotlight, Quick-look and other Mac interface staples.



This year my 5-year-old MacBook Pro finally reached the point where its old processor seemed completely unable to handle the 3D work I needed to do for a project, so it was time to make the big migration to a new desktop replacement... and a new OS!... Because after an entire adult life of using Macs - I've switched to Windows!!!

I've been really disappointed by the blandly uninspiring offerings for the 2016/2017 MacBook Pro. In my opinion, Apple has failed to produce a reasonable 15" machine that can serve as a decent desktop replacement. What they've made instead is really just a bigger MacBook Air. There have only been minor improvements to the processor, RAM is still capped at 16GB, the Touch Bar seems to have inflated the overall machine cost, and in order to reduce the thickness and weight of the machine, the USB ports and SD card reader have been taken out. This despite the fact that a majority of digital camera users still use SD cards, and that whole argument about moving towards a future in which we'll all be transferring huge (RAW or otherwise) files wirelessly is simply not yet feasible for most situations that I find myself in!

After adding in the cost of the usual screen/RAM upgrades and the extended international warranty that one would need for a long-term desktop replacement, the average MacBook Pro buyer in 2017 would have to pay close to or in excess of 4000 SGD for a 15" desktop replacement that lacks some of the most basic features you would expect of any decent portable machine. I mean, you're telling me that after spending so much on a new machine, I'll still have to buy a whole bunch of frustratingly expensive peripherals and dongles to make up for the lack of ports and card readers, which come as standard on every single other PC laptop on the market????

BYE MAC... (Image via GIPHY)


With so many compelling reasons to jump ship to PC, I spent some weeks formulating a new Windows PC criteria list. (Much of this was compiled with the help of George aka PC FAN BOY). In case it might be useful to other Mac users thinking of switching, this is what my list looked like:

DBBD's Specs list for a 15" Windows PC Desktop Replacement (written in May 2017):

Processor: 7th gen Kabylake [The newest at that point of time]
RAM Upgradability: 32GB
Graphics Card: At least GTX 1060/1070/1080 ["VR Ready"]
Screen resolution: 4K
Screen bezel: As small as possible [aesthetic preference]
SD Card Reader: Required
Fan: Adjustable speeds / Should have option for silent mode
Weight: Below 2.5kg (my Mid-2012 Macbook Pro 15" was 2.56kg not counting the charger and this was back-breaking)
Thickness: Below 3cm
Design: Shouldn't look overly aggro or like a monstrous tank [Ruling out Alienware and MSI...]
USB: The more USB 3.0 the merrier
Thunderbolt port: Required
HDMI Port: Required
Warranty: Two Years International [A Basic requirement!]

[The term "VR Ready" means the machine is capable of decent performance with VR headsets such as the Oculus Rift and HTC Vive. For this, I've used the benchmark of recommended system requirements, not minimum specifications, although these specifications are always changing]

[One might also sagely ask: since I am already switching, WHY NOT LINUX? and the answer is simply that I DONT HAVE THE TIME to spend weeks setting up Linux, deciphering my way around Linux and posting endless help messages on Linux forums which is probably what will happen if I try to switch to Linux.]

What I ended up getting based on my spec list:



Gigabyte Aero P65W
Processor: 7th gen Kabylake Intel® Core™ i7-7700HQ
RAM: UPGRADED TO 32GB
Graphics Card: GTX 1060
Screen resolution: 4K
Screen bezel: 5mm
SD Card Reader: YES
Fan: Adjustable / Silent Mode
Weight: 2.1KG + 0.5KG Charger
Thickness: 1.9CM
Design: SLEEK
USB: USB 3.0 x 3
Thunderbolt port: YES
HDMI Port: YES
Warranty: Two Years International

A compact, light 15" desktop replacement gaming laptop with one of the smallest bezels I've ever seen on a Windows laptop. The screen and colour look excellent to me, I'm so pleased to have a laptop with variable fan settings for once, and the keyboard can be set to conduct a light show performance in a dizzying array of LED colours (it can also be easily turned off when you get annoyed with the all-blinking keyboard feature). But because the screen bezel is so tiny, the webcam is located below the screen, so by default you'll always be pictured chin-first (FATFACE) when the unusually low webcam turns on. This doesn't bother me too much though, because in every other way this laptop basically meets all the specifications that I need for my work.



CONS: GET READY FOR UNUSUAL WEBCAM ANGLES....
PROS: GET READY FOR UNUSUAL WEBCAM ANGLES!!!!

Price in Mid 2017: 2899 SGD (3049 SGD with RAM upgrade)



Transitioning from Mac to Windows: Alternatives to Spotlight, Quick-look and other Mac interface staples on Windows


Besides the usual issues of getting used to a new keyboard for speed touch-typing, the transition from Mac to Windows has been pretty smooth. There are a couple of features which Windows doesn't seem to have a default solution for; however, for every single problem that presents itself, there are probably a half-dozen free ways to accomplish the same task...

1. Local Search / Quick File Launcher Replacement




Mac users will probably find that Windows 10's default File Explorer is unacceptably slow and unable to find files as efficiently as Mac's Finder/Spotlight can. Could this be an indexing problem? Is the File Explorer slow because it is pointlessly looking for network folders which it can't find? I honestly don't know what causes Windows 10's File Explorer to be so slow. Fortunately, the solution is simply to use a free tool called Everything, which indexes the file names on your drives and works as a much faster local file search tool. From within Everything, you can indeed find everything on your computer instantly. Used together with Wox, a launcher very similar to Mac's Spotlight, this basically does the job.

Wox's default hotkey is Alt-Space, you can theme Wox with different colours/fonts, and it's easy to add or write additional plugins to extend the function of the quick launcher. With different prefixes you can quickly search different things: for example, "g anything" will Google search for "anything", "wiki anything" will search Wikipedia for "anything", and #000 will show the hex colour #000 (black).

2. Quick-Look Replacement - Seer


I googled to find out when Quick Look was first introduced, and apparently it was first rolled out in 2007. That means I may have had 10 years of impulsively hitting the space bar whenever I want to see a larger preview of an image, and let me tell you it's hard to stop a habit like that. The same way you might launch yourself into a browser today and find yourself inexplicably typing in the first few letters Y O U T U B E for absolutely no reason at all. Ah, muscle memory.

It was a major disappointment to find that hitting the space bar did absolutely nothing on Windows at first - I asked George how to enable any kind of Quick Look feature on Windows, only to find out that the concept of Quick Look basically does not exist in Windows. For a moment there, this almost made me doubt whether switching to Windows 10 was a worthwhile investment of my time. There you have Windows 10's slow, completely unintuitive and cluttered File Explorer interface, and I could gripe about it all day. But one has to be objective about things, and the good thing is that very often there will be a free tool that does exactly what I want the Windows PC to do - a different file launcher, a different previewer - you just need to find it or make it yourself.

A quick fix to this entire Quick-Look problem is simply to install Seer, which provides the much needed Quick-Look feature that allows you to preview images in File Explorer quickly by hitting the space-bar.

3. Skitch Replacement - Monosnap




For years, I've probably used Skitch on a daily basis to annotate images for work purposes before sending them in email, but sadly Skitch was ruined by Evernote (Evernote bought up Skitch in 2011, almost seemingly just so they could force the discontinuation of the development of this excellent project. Because of this terrible behaviour on Evernote's part, I refuse to use Evernote. Evernote, suffer the wrath of the 10 million disgruntled Skitch users!).

Fortunately there is no lack of alternatives that have come onto the market since, especially for the Windows platform. There are many Skitch alternatives, and after trying a few options I settled on using Monosnap for all my quick screenshot and annotation needs. Monosnap is very similar to Skitch - you can draw arrows and lines and then quickly drag and drop the file into an email or file folder. Another super important feature is that you can also blur out parts of images quickly!

I set all *.png files to open in Monosnap by default, so that every screenshot PNG opens in Monosnap, ready to be instantly annotated. Super handy. Other similar alternatives include Greenshot and Lightshot; I just happen to like Monosnap's interface the most out of them all.

4. Web Browser - Microsoft Edge


Recently, after having created a monster of a webpage involving about 53 animated GIFs, each about 1-6MB, I realised that Chrome was suffering and lagging. Although I wasn't intending to upload the page in that form, I decided to check how the same page performed in every single browser on my computer, and came across Microsoft Edge, which I had never used or even seen before. Not at all to be confused with Internet Explorer, Microsoft Edge is the new Windows 10 browser and e-book reader. I was really surprised to find that my horrifying web creation did not lag at all in Edge!

Microsoft Edge was first released in 2015, and obviously for a new browser it lacks the same kind of extensibility that Chrome and Firefox have with their multitude of plugins. Nevertheless, when I look online and see the general complaints about Edge's earlier incarnations not having a browsing history (it now does) and not having a fullscreen mode (it now does), it seems like it has only been improving since it was introduced.

Apparently on Google's Octane and a lot of other JavaScript engine benchmark tests, Edge actually outperforms Chrome. And it clearly performs much better on my computer for graphics and page loading times compared to Chrome, Firefox and IE. Just so you know...

Thursday, 31 August 2017

A Warehouse Sale in Singapore: the mass production of freedom tshirts

Recently the parents took me to see one of the huge Robinsons warehouse sales in Singapore out at Expo, where I encountered vast mountains of GIORDANO shirts like never before. For the uninitiated, in Singapore Giordano has long established itself as the main purveyor of generic low-cost casual wear - with an outlet in practically every other HDB centre. Even I probably had a few Giordano basics in my childhood wardrobe, because Giordano is so ubiquitous that people sometimes even think they must be a Singaporean company (they're actually from HK). I don't know who designs these shirts, but if you just want the cheapest possible basic cotton shirt and have no care for fashion or aesthetics or things with meaningful messages on them, then this is the place for you. You know, a $5 tshirt that basically says "I absolutely do not care whether the words on my shirt make any sense at all in any context whatsoever because I do not care about the useless practice of wearing shirts to express my individuality and personality".

[This is also not to be mistaken for Things which are so bad until they're good again. No, this is not exciting, trendy, or radical clothing. No, these are just generic words printed onto a generic shirt and then sold in staggeringly vast quantities in Singapore]

Now the strange thing about all these Giordano shirts was that... THEY WERE ALL FREEDOM THEMED! Yes, a mountain of FREEDOM shirts in Singapore.

Here are a few of them:

20170902_103934

OKAY... WE ARE FREE TO CHOOSE...
BUT WHY ARE ALL THOSE PEOPLE RUNNING?
WHAT ARE THEY RUNNING FROM???

20170902_103923

OK... I'LL TRY TO KEEP CALM...
BUT..... IS THERE REASON NOT TO BE CALM?
IS IT RELATED TO THE "STAYING FREE" PART???

20170902_103914

I DON'T KNOW GIORDANO, WHY DON'T YOU TELL ME?
OR AM I SUPPOSED TO ASK THESE FISHIES?
DO THE FISHIES KNOW FREEDOM?
WHAT IS FREEDOM???

Maybe they had a meeting and then they thought about how Singapore and Singaporeans are always concerned about freedom. Maybe they were chortling to themselves as they designed it, "Oh these Singaporeans! They are so going to love our insightful and subtle commentary on the state of freedom and free speech in Singapore in our new line of freedom shirts!". Or who knows, maybe the designing of the shirts was done by a hapless designer being forced by the dollar to churn out more and more nonsensical mass produced tshirt designs which the designer knows will never be thought of or valued as a credible creative product and as the poor designer sheds a tear for the demise of his or her creative integrity he or she types into Illustrator: WHAT DOES FREEDOM MEAN?

I wish I had taken more pictures of all of the designs but the Dingfather pulled me away and told me to stop mocking the freedom shirts by laughing so loudly.

Saturday, 19 August 2017

Blender Cycles: 12 Ways to Reduce Render Time

Recently my challenge was to produce 2000 x 3 frames for the production of three holograms. At 50 minutes per frame, 6000 frames would have taken a frightful 208.333333 days to render out on one already relatively speedy laptop. On the slowest mechanised computating potato in the household (a 5+ year old MacBook Pro), it would have taken several hours per frame. But the real issue was not the complexity of the scene, but the way in which I had initially constructed the scene, and the render settings.

After several weeks of editing (and frantically googling "HOW TO REDUCE RENDER TIME"), here is a compilation of the ways I used to reduce my render time (CPU) for a single frame from a horrible 50 minutes to 1-2 minutes. It's not an exhaustive list, but I thought it would be useful to WRITE IT ALL DOWN NOW in case Future Debbie completely forgets everything after the hard slog of trial and error of the last few weeks..

1. Duplicate Linked Object Instances the right way




This may seem pretty obvious but I think it needs to be on the top of every list. The right shortcut to duplicate an instance of an object is ALT-D, not SHIFT-D. SHIFT-D produces a completely new object with no reference back to the data of the original object. A linked object shares the same object data, so any edits you make to the first object will make all the other linked objects change as well. When you're in a hurry it is easy to accidentally type Shift-D instead of Alt-D, and this has the potential to make a serious impact on render time, because linked duplicates are treated as instances of a single mesh, whereas full copies each add their own geometry to the scene.
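
If you're scattering lots of copies by script rather than by hand, the same principle applies: duplicate the object but leave its mesh data shared. A minimal sketch (Blender 2.7x API, with arbitrary placement just for illustration):

    import bpy

    src = bpy.context.active_object          # the object you want lots of copies of

    for i in range(1, 11):
        dup = src.copy()                     # new object, but dup.data still points at src.data,
                                             # which is exactly what Alt-D does in the viewport
        dup.location.x += i * 2.0            # arbitrary spacing, purely for illustration
        bpy.context.scene.objects.link(dup)  # Blender 2.7x; in 2.8+ you link into a collection instead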

2. Link Object Data




Let's say you completely can't recall whether you used Shift-D or Alt-D to duplicate your objects. If you look at the Mesh data-block you'll be able to see how many objects are currently using the same data. If your mesh is unique and unlinked, select all the other objects you want to share the master's data, then Shift-select the master last (so it becomes the active object), and press Ctrl-L whilst the mouse is over the 3D view. You'll get the "Make Links" dropdown menu, and you should select Object Data to link the objects. Other links for materials, groups, etc. can also be made using this shortcut.
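
The same relink can be done from the Python console. A small sketch, assuming you've selected the objects and that the master is the active (last-selected) one:

    import bpy

    master = bpy.context.active_object       # the link source (the last object selected)

    print(master.data.name, "is currently used by", master.data.users, "object(s)")

    # Point every other selected mesh at the master's data -- the script equivalent of
    # Ctrl-L > Object Data (bpy.ops.object.make_links_data(type='OBDATA') does the same via the operator).
    for obj in bpy.context.selected_objects:
        if obj is not master and obj.type == 'MESH':
            obj.data = master.data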



Note that if for some reason you accidentally select objects which aren't at all similar, all the other selected objects will still be changed to use the master's object data anyway...



In general, I personally found it useful to assign my materials really crazy colours in the viewport so that I could see at a glance which objects were the same and which were not.

3. Clamp Direct and Indirect




Usually you end up turning up the samples for a scene because there are too many 'fireflies' and noise, but clamping values can quickly remove these stray white dots which appear on the render. Clamping sets the very lightest (brightest) samples to a maximum value so it removes those stray white fireflies, but in the process, it will reduce the bright "pop" or light sheen that you might have wanted to achieve with certain glossy materials.

If you set Clamp too low, it will also cause the ENTIRE scene to be too dark/dimly lit, especially if you are using an HDR for environmental lighting, so don't set the Clamp too low. The general advice is to start from a high number like 10 and then work your way down to see what works for your scene. I was able to set Clamp Direct to about 4 and Clamp Indirect to about 6 and still achieve acceptable results. As for the overall "dimming" effect the Clamp will have on the scene, you can increase scene brightness through the compositor with a Color Balance node, or simply do it in post.
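
For reference, these clamp values live on the Cycles scene settings, so you can also set them from the Python console. The numbers below are just the ones that happened to work for my scenes:

    import bpy

    scene = bpy.context.scene
    scene.cycles.sample_clamp_direct = 4.0    # clamps the brightest direct-light samples
    scene.cycles.sample_clamp_indirect = 6.0  # clamps bounced light, the usual firefly culprit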

4. Subdivision surface




Subdivision surface is a modifier commonly used to create a smooth surface mesh from a blocky linear polygon mesh. It is done by splitting up the faces of the mesh into even smaller faces in a way that gives it a smooth appearance.

It is worth checking if you have stupidly set a subdivision surface of VERY MANY iterations for something incredibly trivial, tiny but also incidentally heavily duplicated in the scene...

5. Decimate






Did you subdivide your terrain into unwisely tiny bits and then handsculpt it with a 5px clay brush?? If there is complex modelling, or you've applied Subdivision Surface to a mesh and are now regretting it, you can undo your CPU-killing subdiv-happy ways by decimating the meshes that you don't need smoothing on! Add the Decimate modifier to reduce the number of faces.

6. Simplify




Let's say you're just rendering a preview for yourself and it's not the final render. You can quickly set a global cap on the Subdivision Surface levels and Child Particle numbers under Scene > Simplify. Just remember to uncheck the Simplify box when you're producing the final render.
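
These live under the render settings if you'd rather toggle them by script (handy for making sure Simplify is off before the final render):

    import bpy

    scene = bpy.context.scene
    scene.render.use_simplify = True              # set this back to False for the final render!
    scene.render.simplify_subdivision = 1         # global cap on Subdivision Surface levels
    scene.render.simplify_child_particles = 0.5   # global multiplier on child particle counts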

7. Delete unnecessary terrain




Set up Blender so that you can see several angles of your scene at the same time, along with the timeline if you need to scrub through it quickly. Go into Edit mode (Wireframe) and highlight the excess terrain that never appears in Camera view for the entire timeline using either B (to draw a box) or C (to paint it with a circular brush). Make sure you're viewing in the Wireframe mode though, because if you're viewing in Solid you'll only be able to select the vertices that you can see, rather than all the vertices in that area regardless of whether you can see them or not.

The handiest numpad shortcuts in the 3D view are:
Numpad 0: Camera view
Numpad 7: Top view
Numpad 5: Toggle between Perspective and Orthographic view



The resultant landscape will look a bit weird, like this, but you'll save time by not rendering all the bits. Do keep the bits you need for light to bounce off, though, to produce a realistic scene.
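
If you'd rather flag the out-of-view geometry with a script instead of box-selecting it by hand, here's a rough sketch using world_to_camera_view. It only checks the current frame's camera position, so scrub through the timeline (or loosen the margin) before trusting it, and it only selects vertices - inspect the selection in Edit Mode before deleting anything:

    import bpy
    from bpy_extras.object_utils import world_to_camera_view

    scene = bpy.context.scene
    cam = scene.camera
    obj = bpy.context.active_object          # the terrain mesh, with Blender in Object Mode
    margin = 0.2                             # keep a border for shadows and bounced light

    for v in obj.data.vertices:
        # Blender 2.7x matrix multiplication; use the @ operator in 2.8+.
        ndc = world_to_camera_view(scene, cam, obj.matrix_world * v.co)
        in_view = (-margin <= ndc.x <= 1 + margin and
                   -margin <= ndc.y <= 1 + margin and
                   ndc.z > 0)                # z > 0 means in front of the camera
        v.select = not in_view               # flagged vertices show up selected in Edit Mode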

8. CPU/GPU Compute and Tile Size




If you have an Nvidia graphics card, you'll still need to enable it in Blender's User Preferences in order to use the GPU, which can drastically cut down your render time. When GPU rendering works, it's like magic. The GPU can be dramatically faster than the CPU, but it is also limited by the total amount of VRAM on the card - once it hits that limit the rendering process will simply fail (memory error). Also, I had to dramatically rein in my expectations - I have always insisted on using desktop replacement laptops rather than a desktop for portability (especially for my kind of work) - but one has to consider that laptop GPUs generally aren't as powerful as their desktop counterparts in terms of VRAM, number of CUDA cores, and overall speed.



It is generally said that the tile size should be a perfect square or a factor of the final resolution (i.e. it should divide into it evenly, since having smaller leftover slivers of tiles is wasteful), but I think a lot more testing would be required to work out what suits each type of scene and CPU/GPU. Generally, if you make the tile size too small, you incur more overhead from switching between tiles. You should experiment with the numbers and see what works for you...

What worked for my scenes (Intel Core i7-7700HQ @ 2.80GHz / Nvidia GeForce GTX 1060 6 GB GDDR5):
For CPU an ideal Tile size seems to be around 16x16 or 32x32
For GPU an ideal Tile size seems to be 312x312
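
For the record, the device and tile settings can also be set from the console. These are Blender 2.7x property paths (the preferences module moved in 2.8+), and the tile numbers are just a starting point to experiment from:

    import bpy

    # Enable CUDA in User Preferences (2.7x path), then tell Cycles to render on the GPU.
    bpy.context.user_preferences.addons['cycles'].preferences.compute_device_type = 'CUDA'
    scene = bpy.context.scene
    scene.cycles.device = 'GPU'

    # Small tiles tend to suit the CPU, larger tiles the GPU -- experiment for your own scene.
    scene.render.tile_x = 256
    scene.render.tile_y = 256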

9. Number of AA Samples / Progressive Refine


The number of AA (anti-aliasing) samples has a huge impact on render time (it scales roughly in proportion to the sample count), and this was largely why my first renders were taking 50 MINUTES PER FRAME even on the best laptop in the entire household! How many samples are enough samples? How do you find out how many samples are good enough for you, visually?

Under Performance, there's an option for Progressive Refine, which will progressively show you the overall image at each sampling level. It can be slower to refine the entire image together, but you can also stop it when you think the image is good enough. It's useful to eyeball it until you find the number of samples you are happy with, then just use that number and uncheck Progressive Refine so it will be faster.
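
Once you've eyeballed a sample count you're happy with, you can pin it down in the scene settings (Progressive Refine itself is the checkbox in the Performance panel):

    import bpy

    scene = bpy.context.scene
    scene.cycles.samples = 128          # whatever final count you settled on by eye
    scene.cycles.preview_samples = 32   # keep the viewport preview cheap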

10. Resolution, Output File Location, and Output Quality




When you "double" the size of the image, you're actually making it four times as large and your image will take 4 times the CPU/GPU to compute! When making a preview and not the final render, you can set resolution to 50%. But don't forget to uncheck it when you are doing the final render!!! (ARGH!!!)



Make sure that you have set the correct Output file location. If you are opening the blend file for the first time on a new computer, MAKE SURE YOU HAVE RESET THIS. Blender does this thing where it doesn't tell you the output folder/drive doesn't exist - it will happily render but only tell you at the end of the process that it had no place to save the file.

Below that there's also the compression/quality setting for it. For a preview you can set it lower but remember to set it back to 100% for the final render.
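
Since Blender won't warn you up front about a missing output folder, a tiny sanity check before a long render can save a lot of grief. A sketch, with the filepath and percentage as placeholder values:

    import bpy, os

    scene = bpy.context.scene
    scene.render.resolution_percentage = 50          # previews only -- back to 100 for the final render!
    scene.render.filepath = "//renders/frame_"       # '//' means relative to the .blend file

    out_dir = os.path.dirname(bpy.path.abspath(scene.render.filepath))
    if not os.path.isdir(out_dir):
        print("WARNING: output folder does not exist:", out_dir)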

11. Selective Render / Animated Render Border




Whilst in Camera view (shortcut Numpad 0) within the 3D view, if you use the shortcut CTRL-B you can demarcate the "Selective Render" border (which appears as a red dotted line). To release this Selective Render border, it's CTRL-ALT-B.

Ray Mairlot's extremely useful Animated Render Border add-on allows you to selectively render an object moving through a scene, or even create an animated border that can be keyframed.

When using this add-on, the final output is a frame that will still be the size of the entire render resolution, but only the selective render area will have an image and the rest will be alpha transparent.

12. Use a Render Farm


Ok, so after I shaved scores of minutes off the render time, down to 2 minutes per full resolution frame, I realised that 2 minutes x 6000 frames = 8.33333333 days - and that was 8.33333333 days that I certainly did not have! There are limits to what a great laptop with a good graphics card can do. When the computer is rendering, you can't really use anything else that taxes the graphics card or processor, so rendering basically disables the computer for the duration.

So.. there was no other way around this - I had to use and pay for a render farm. I tried out Render Street, TurboRender and Foxrenderfarm as they were the farms which came up when you search for Blender Render Farms.

The basic process for all of them is:



- Pack external data into blend file (tick the option to automatically pack)
- Upload a zipped copy of the blend file
- Choose your render settings
- Pay the render farm some money (for slower render) or LOTS OF MONEY (for faster render)
- Monitor render progress and render quality through web interface
- Magically download completed rendered files in record time

[Do however note that the Animated Render Border add-on mentioned earlier will not work with the render farms, but you can write to most of these render farms regarding desired plugins and they will let you know whether particular add-ons and plugins can be installed]
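
The packing step can also be done in one go from the console (or a tiny script) before you zip and upload. A minimal sketch, with the output filename as a placeholder:

    import bpy

    # Pack textures and other external files into the .blend, then save a copy for uploading.
    bpy.ops.file.pack_all()
    bpy.ops.wm.save_as_mainfile(filepath=bpy.path.abspath("//packed_for_farm.blend"))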

A more comprehensive review of render farms coming up in the next post...