Lenovo and UPS completely FAIL to deliver my new Laptop

On 30.8.2016 I ordered my new Lenovo P70.
A 4400 EUR workhorse of a workstation laptop that was supposed to accompany me for the next 5 years.

On 3.09.2016 my payment was acknowledged.
...so far so good.
Somehow on the same day I was asked to register within 30 days for my 5 year extended warranty, despite obviously not having the laptop for weeks to come. After a quick email I was told that I only needed to register after actually receiving the laptop.

On 10.9.2016 I got a strange notification that I had been shipped???? a 5 year warranty. However that is hardly physically possible.

On 14.9.2016 I was informed that my laptop was delayed and would be shipped on 27.9.2016 (+1 week).

On 22.9.2016 I was informed that my laptop was delayed and would be shipped on 20.10.2016 (+1 month).
However, just 2 days later I was informed that my hardware had been shipped, albeit
  • with an empty tracking number
  • no mention of what shipping service was used
  • login data for a broken shop system that doesn't allow me to see or do anything at all beyond looking at the same error code in Firefox, Chrome and Safari on MacOS, Windows and Android.
On 27.9.2016 I finally got an answer that of all the possible shipping companies, Lenovo in their infinite wisdom had chosen UPS. ...and a tracking number.  ...that's when the "fun" starts.
I never ever choose UPS for anything. It's not a regular parcel service. It's a business-to-business courier and as such ONLY works for delivering to businesses with offices that are open every weekday 9-5. They are completely incapable of delivering to the private address of anyone who works during the day. They don't have offices, robotic parcel-pickup stations or a way of delivering to drop-off points or other neighbours. They never invested in any of that because business-to-business couriers don't need it.

Mo-Fr I work in a different city than the one where I live Sa+So. The city where I work changes frequently.

UPS answered that
  • I can not pick up my order on a Saturday because they are closed.
  • I can not reroute my package while it is still in Cologne (there it was scanned again and again for 4 days straight. 4-5 times a day. Without going anywhere.)
  • Lenovo has explicitly forbidden them from rerouting the package to my workplace, and that would only be possible after the first delivery attempt anyway.
  • They are incapable of routing the laptop to the only UPS partner shop for pickup.
  • They never invested in their own Packstation network.
  • They are incapable of drop-shipping into my shed behind a code lock.
Lenovo answered that
  • They can not change the delivery address after it was shipped. (Not understanding that this wasn't what I asked. I asked Lenovo to simply allow UPS to change the address.)

On 6.10.2016 I received a message that I have registered for my 5 year warranty... despite never having done that, because I still haven't received a laptop.
I have moved heaven and earth to find a way to stop work very, very early and take 3 trains, 1 taxi and 1 car ride to get to the depot 30min before it closes (according to the website)... only to be informed on the telephone that the depot has changed its opening hours and will close way before I have any chance of getting there.

So on 7.10.2016 my laptop will be automatically returned to China.
...Lenovo said they would refund all of it. (I hope that covers the paid-for extended warranty.)


Multicam editing in DaVinci Resolve 12.5

One of the great new features of DaVinci Resolve 12 is multicam editing, including synchronizing cameras via their sound track.
It's something I have done extensively in Final Cut Pro X and now want to do directly in Resolve, to save the extra step of rendering colour graded intermediate files from Resolve and editing them in FCPX.

Since nobody seems to write about actually using this feature, I am using this blog posting to document my findings.


If you don't want to read all of it:
  • you can't add a camera angle to an existing multicam clip with automatic syncing
  • syncing by soundtrack doesn't work at all (it either crashes, even in trivial cases, or doesn't sync)
  • there are serious bugs and even crashes related to multicam clips
  • file creation dates can not be used as a criterion to get at least a starting point
  • identifying which clips are from the same camera doesn't work (the Camera ID metadata field is ignored)


DaVinci Resolve 12.5 Studio with hardware dongle on MacOS.

This is a stage show. Not a fiction film. Not a documentary.
Together with sports events this is probably the most common occurrence of a multicam edit with more than 3 cameras. (Interviews typically max out at 3 cameras.)

Material to work with:
  • No clapperboard (the audience wouldn't be pleased)
  • No time code synced via cable (half the cameras don't support it, TC generators cost a fortune and we aren't allowed to put cables all across the emergency exit routes)
  • Nothing can ever be re-shot. There is one chance and one chance only to record it.
  • 1 Zoom H6 doing a continuous audio recording (including a -6dB backup), including pause times.
  • 2 Blackmagic Pocket cameras in ProRes with Atomos Samurai Blade HDD recorders
  • 2 GH4 with Atomos Shogun SSD recorders
  • 1 Blackmagic Production Camera 4K recording RAW (for dark scenes, for easier denoising) and ProRes (for bright scenes, to save SSD space due to the limited number of SSDs we could rent)
  • 1 GH4 with internal MPEG recording
  • some 3rd party footage will come in later
  • the cameras record 3 acts of 4 scenes each, but not the pause times

Problems identified

Problems in Ingestion

  • Resolve doesn't import any camera name/id from any camera except the Blackmagic Production Camera 4K in RAW (but NOT in ProRes) => WTF?
  • Individual dropped frames in RAW recordings (it's a 3+h show, such things happen) have to be manually added via shell script for Resolve to even acknowledge that there IS a recording in that directory and not a million image files. => not exactly robust software for a production environment

Problems in automatic syncing

Due to the 2GB limit, the audio track is broken up into 3 files.
  • syncing even a single clip to a compound clip of the 3 arranged audio files crashes Resolve every single time => Bug ticket with Blackmagic Design [Support #BAX-398-79379]
  • syncing to the 3 individual wave files lets everything start at 0, even the 3 audio files with file creation timestamps indicating their correct position (*hint* *hint*)
  • same for any attempt to sync 2 video clips with 95% identical audio
  • "detect clips from same camera using: Metadata Camera #" completely ignores any manually entered value into the "Camera ID" metadata field => BROKEN
  • There is no way to define a track as the master track for every other track to sync against
  • There is no user interface to ADD an additional camera/clip to an existing multicam clip with automatic syncing.
  • There is no user interface to re-sync a single or multiple, selected clips inside a multicam clip.
  • There is no user interface to even create a new multicam clip from clips that are in different bins (e.g. because you like to organize clips by camera to make colour grading them easier)
Result: syncing via audio track IS BROKEN and the feature is HIGHLY INCOMPLETE.

No software should EVER crash (at worst it should display an error message) and this feature plainly doesn't work even under ideal real-world conditions. Maybe it would work if the cameras had identical audio tracks instead of recording the same audio independently using their own microphones, but that's not a realistic scenario nor a useful feature.
Without the ability to add tracks later, the implementation of this feature can not be considered "complete" in any sense of the word.

Problems in manual syncing

  • Dragging a new video clip into the audio area of a multicam clip adds only the audio.
  • The selection of what should be visible and non-silent when doing sync-work in the multicam clip directly affects what video and audio angles are exported for use in timelines that make use of this multicam clip. Anything you hide/mute will not be offered as a choice later or display as black/silent.
  • There is no way to scale the waveform display without adjusting the audio levels, so that you can clearly see all you need to see for manual syncing via audio waveforms.
  • There is no visual indication of the border between individual frames when zoomed in far enough during manual syncing. You only notice how far you have zoomed in when trying to move the clip around.

Problems in workflow

  • Clicking or double clicking a multicam clip or timeline does not open it in the timeline editor... it simply does nothing at all.
  • When doing picture-in-picture and the multicam clip that makes up the smaller picture has no clip for the selected angle at the current time, not only does the scaled-down area where the PiP would appear go black but the ENTIRE frame goes black. => Bug ticket with Blackmagic Design [Support #BSX-823-55972]
  • The manual refers to keyboard shortcuts as they are on an English keyboard. There doesn't seem to be a German manual explaining the German keyboard shortcuts. (Something like "Option-Shift-[" or "Option-Shift-\" would break your fingers as "[" and "\" are Option-combinations of digit keys and not first order keys here.)
This blog posting is constantly being updated while I am editing my multicam project.



Getting started with the Raspberry Pi 3 OctoPrint Bundle

After upgrading to the low friction spool holder,
I just got myself the Raspberry Pi 3 OctoPrint Bundle from Watterott.
It shall be the center component of my Ultimate Filament sensor.

Since it doesn't come with any instructions, here is what you need to do:


Don't plug it in yet!
Insert the SD card into a regular computer and edit the file octopi-network.txt to enter your Wifi credentials.

Then insert the 2 transparent elements into the 2 holes near the micro USB socket.
(Yes, there are 2 plugs for 4 holes and no instructions.)

Now insert the Raspberry Pi and then insert the SD card.
The contacts should face upwards.
(It is near impossible to get the card out again.)

After switching it on, you can connect to it via http://octopi.local .


You can also access the raspberry via SSH
ssh pi@octopi.local
The default password is "raspberry".
The SD card is mounted as /boot
The OctoPrint config file is at "/home/pi/.octoprint/config.yaml"
You can restart the server via "sudo /etc/init.d/octoprint restart"

If your Wifi access point can't be seen by Linux via
sudo iwlist wlan0 scanning | grep ESSID
then run
sudo raspi-config 
and select "5 internationalization options" -> "I4 select Wifi locale"
to enable the Raspberry to see all Wifi channels that are legal in your country.

The Raspbian I got was very old. I had to provide Internet via Ethernet and do
sudo apt-get update
sudo apt-get dist-upgrade
then it was able to see Wifi networks on Channel 40 (5GHz) and 12+13 (2.4GHz).

GPIO fun 

While at the shell, you can have fun with the GPIO pins in Bash.
Sadly you can't set the pull-up resistors from the plain shell interface (/sys/class/gpio).
However my image came with WiringPi already installed.
It doesn't have a "--help" or a man page on the Pi itself, so here are the basics:
  • gpio readall
  • gpio mode (pin) in/out
  • gpio mode (pin) up/down/tri         (set pull up resistors)
  • gpio read (pin)
  • gpio write (pin) 0/1
  • gpio wfi (pin) rising/falling/both    (non-busy waiting for a state change)
  • (more)
  • (reading multiple gpios )
  • ...including timeout via read -t (seconds) || echo "timeout detected"  ...still in bash ;) 


Sorry, there is no hole for the Raspberry Pi camera. The best place to cut one is probably on the side (so you don't damage the cool logo), above the camera connector.

This longer cable (Reichelt) may be helpful.

Cura slicing

Luckily the Bundle comes with the CuraEngine plugin preinstalled, so slicing is not much of a problem. You can import your existing 15.x profiles (but not 2.1.1 profiles) in Settings->Plugins->CuraEngine->import profile.

BTW, there are "send to Octoprint" plugins for Cura on the desktop!

Ultimaker II setup

The Ultimaker series is not supported out of the box.

Settings->printer profile:

Profile  (UM2 extended)

  • Color: default
  • (X) Rectangular 
  • Origin: lower left
  • X: 223mm
  • Y: 223mm
  • Z: 315mm
  • (X) heated bed

Profile  (UM2 go)

  • Color: default
  • (X) Rectangular 
  • Origin: lower left
  • X: 120mm
  • Y: 120mm
  • Z: 115mm
  • (X) heated bed

Profile  (UM2)

  • Color: default
  • (X) Rectangular 
  • Origin: lower left
  • X: 223mm
  • Y: 223mm
  • Z: 205mm
  • (X) heated bed


After "after abort of a print job" enter:
M107 ;fans off
M104 S0 ;extruder heater off
M140 S0 ;heated bed heater off (if you have it)
G21 ;metric values
G90 ;absolute positioning
G28 Z0 X0 Y0 ;move Z and X/Y to min endstops
G91 ;relative positioning
G1 E-5 F300 ;retract the filament
M84 ;steppers off
G90 ;absolute positioning

Cura 15

in Cura set: GCode Type = RepRap (Marlin/Sprinter)
start.gcode (first line must be blank)

;Sliced at: {day} {date} {time}
;Basic settings: Layer height: {layer_height} Walls: {wall_thickness} Fill: {fill_density}
;Print time: {print_time}
;Filament used: {filament_amount}m {filament_weight}g
;Filament cost: {filament_cost}
;M190 S{print_bed_temperature} ;Uncomment to add your own bed temperature line
;M109 S{print_temperature} ;Uncomment to add your own temperature line
G21        ;metric values
G90        ;absolute positioning
M82        ;set extruder to absolute mode
M107       ;start with the fan off
G28 X0 Y0  ;move X/Y to min endstops
G28 Z0     ;move Z to min endstops
G0 X20 Y20 F{travel_speed} ;bring extruder to the front
G1 Z25.0 F{travel_speed} ;move the platform down 25mm
G92 E0                  ;zero the extruded length
G1 F200 E25              ;extrude 25mm of feed stock
G92 E0                  ;zero the extruded length again
G1 F{travel_speed}
;Put printing message on LCD screen
M117 Printing...
end.gcode (first line must be blank)
;End GCode
M107 ;fans off
M104 S0                     ;extruder heater off
M140 S0                     ;heated bed heater off (if you have it)
G21 ;metric values
G90 ;absolute positioning
G28 Z0 X0 Y0 ;move Z and  X/Y to min endstops
G91                                    ;relative positioning
G1 E-15 F300 ;retract the filament
M84                         ;steppers off
G90                         ;absolute positioning

Cura 2.1

The documentation should be here. However that's not the whole picture; it is incomplete.
You need an Ultimaker2extended, Ultimaker2Go or Ultimaker2 profile with the reprap g-code flavor to have start and end code added to your gcode files, including material temperatures, homing and shutdown. Like this one.
To avoid adding files to Cura itself (and keeping them after updating Cura),
you can put your .json files for a new machine definition here:
  • Cura 2.1 (Linux) ~/.local/share/cura/machines
  • Cura 2.2 (Linux) ~/.local/share/cura/definitions
  • Cura 2.1 (OSX) ~/.cura/machines
  • Cura 2.2 (OSX) ~/Library/Application Support/cura/definitions
  • Cura 2.1 (Windows) ~/AppData/Local/cura/machines
  • Cura 2.2 (Windows) ~/AppData/Local/cura/definitions
Due to bug #850, you need to copy the fdmprinter.json, Ultimaker2.json and other files you inherit from into the same directory.

You can also add additional materials (use the existing materials in Cura 2.1.2.app/Contents/Resources/cura/resources/profiles/materials as a reference) to
~/.cura/profiles/materials (OSX)
but be careful, the file structure is identical to the MATERIALS.txt that the firmware imports from an SD card but the property and section names inside are different. Strange design decision.

Ultimaker II attachment


It looks like self adhesive Velcro is the best option to attach the box to the back of your Ultimaker II.


Since the printer has ample 5V and 12V available, it is kind of silly to power a permanently connected Raspberry Pi via a separate wall wart. So we should see about powering it from the Ultimaker.


g-code for "after pause"
G91 ;relative positioning
G1 E-25 F200 ;retract the filament before lifting the nozzle, to release some of the pressure
G1 Z20 F15000 ;move the platform down 20mm
G90 ;absolute positioning
G0 X20 Y20 ;bring extruder to the front
g-code for "resume after pause":
G91 ;relative positioning
G92 E0 ;zero the extruded length
G1 F200 E55 ;extrude 55mm of feed stock
G92 E0 ;zero the extruded length again
G1 Z-20 F15000 ;move the platform up 20mm again
G90  ;back to absolute positioning
G1 F600 ; set travel speed



Ultimaker II upgrade with Low Friction Spool-Holder

I'm currently upgrading my Ultimaker II with a "Low friction UM2 spoolholder" by the YouMagine user IRobertI.

As I print a lot, I use huge and heavy Colorfabb 2.2Kg spools.
So when a spool is fresh and heavy, there is a lot of friction involved that the extruder needs to overcome.
My hope is to reduce this friction a great deal with this simple upgrade.

  • Print "608 mount" ith lots of infill, a large nozzle and thick layers. It needs to be strong.
  • M8x140 is plenty of length. It should have a hex head, so you only need 2 M8 nuts
  • You can use counter nuts (4 in total), so the nuts stay in place even with lots of vibration over a long time period
  • Use a bolt that has threads all the way (no smooth shaf and only threads at the tip)
  • Clean up the inside of "608 Core2 90mm" and "core 1", so a 608 bearing can be inserted into the far end
  • check the order of assembly:
  1. insert the M8x100 hex headed bolt into "608 mount"
  2. add an M8 nut and a 608 (skateboard) bearing, so "core 1" has a small air-gap with "608 mount"
  3. add "core 1"
  4. add the spacer
  5. add another 608 bearing and an M8 nut (don't overtighten)
  6. add "608 Core2 90mm"
  7. add the final "608 nut"


Panasonic YAGH for GH4 disassembly


I received a broken Panasonic YAGH unit for a GH4/GH4R camera.
The description was that it simply failed at some point and was dead. The original owner has a second YAGH in operation, so I'm ruling out handling error.

Since the Audio part also did not work, I can rule out any ground loop or static electricity issue with the SDI ports as the cause.
If the SDI ports  are broken, I'm still left with a good XLR audio preamp for my GH4. ;)

My educated guess is a burned protection diode/0 Ohm resistor/fuse in the power supply due to a power spike or ground loop.


This unit is all nuts and bolts.
Don't start this disassembly unless you have
  • a LOT of table space to lay out the bolts
  • a means to print photos you made of each board/side/layer to put the bolts on
  • I learned the hard way that putting the unit onto a photocopier does NOT produce good results
You have to remove all visible screws on all sides to remove the outer plastic hull.
Use the photos I made to get a good understanding of how the different parts inside connect, so you can disassemble them later.
You have to disconnect the board with the 2 XLR audio connectors to physically separate the "Main" and "Power" boards.
You have to disconnect the front panel "SW-Audio" board from the "Main" board to physically separate the elements.
You have to remove the lower board "Main" with the SDI connectors to remove the plastic on top of the "Power" board.

Apparently you can reconnect things in a way that lets you run the YAGH without any case and with easily reachable contacts. Just the upper side of "Main" and the lower side of "POWER" are blocked because they mate to each other.

What surprised me was that the unit actually contains a micro-HDMI to HDMI cable and a full sized HDMI socket inside!
My guess is that this is to easily replace a broken cable, but then again... you need to completely disassemble EVERYTHING to reach that cable.

Power supply


The unit accepts XLR power with
  • 12V applied on Pin 4 and 
  • GND on pin 1. 
  • Pin 2+3 are not connected
  • Rated for 11-17V (1.4A at 12V) -> 16.8V D-Tap  power is fine but very near the top limit

Setup: I'm testing the unit with a DTap to XLR power adapter cable.
I could also use a lab power supply but attaching it would be difficult as the cables are very well insulated and leave no place where I could attach clamps.

Lessons learned: apparently you CAN insert a Chinese D-Tap plug THE WRONG WAY without any resistance.

Found the problem? The Power board is not supplying power to any of the other boards. The fuses are intact and there are no obviously burned capacitors or diodes that I could easily replace.

 SDI out


  • 1080p24/25/30 -> the same 1.5G SDI signal on all 4 outputs
  • 1080p50/60 -> the same 3G SDI signal on outputs 1 and 2
  • UHD at 24/25/30fps or DCI 4K at 24fps -> 4 FullHD segments as 1.5G SDI signals, one on each port
Conclusion: I can test the SDI ports using my Atomos Samurai Blade or Shogun, which both only have a single SDI input.

Audio board 

The XLR inputs arrive on a small board "CJBB84417" with only passive components.
They are then routed into the board "VJBB0F44" with 4 low noise amplifiers labeled "8202 304 JRC".
It also houses the Pogo pins to the large connector on the camera and a large connector to the main board.
My guess is that this board does all the audio pre-amplification and hands the signals over to the camera.
It should also provide the volume level to the main board for display and get level information from the input board via the main board somehow.
All the other pins should just be routed through from the main board to the camera.
I further guess that if I supply this board with power and connect it to the GH4, it may work on its own (with a fixed level value).
If the amplifiers are similar to the NJM8202, they should have a 15V power supply.



Ultimaker II filament sensor and remote control using Raspberry Pi

The Plan

This is still an idea, growing in my head. I don't know if or when I'll build it.

I want to attach a Raspberry Pi 3 to my Ultimaker II extended.
Besides the obvious Wifi and USB it will have
  • a camera, watching the print
  • a GPIO pin or an Octoprint API call telling the Ultimaker to pause the print (via a tiny firmware modification)
I will replace the small plastic filament idler pulley on the back of the machine (opposite the extruder drive mechanism) with a ball bearing and 2 sensors:
  • a trivial 2 bit rotary encoder to measure the speed and amount of filament and auto-pause if no filament is transported anymore (see the sketch after this list)
  • a load cell with a Hx711 module to tell how much pulling force is applied and auto-pause on no force = filament empty and too much force = something blocks the filament from moving
The "too much force" part means it will also detect obstacles that would otherwise lead to underextrusion, because filament is still being transported but way less than there should be.
  • a second load cell in the modified "Low friction UM2 spoolholder" will measure the weight of the spool and thus tell me how much filament is left.
  • the modification is a 12-20cm arm with the load cell attached
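For reference, here is a minimal sketch of how such a 2 bit (quadrature) encoder can be read with RPi.GPIO in Python. The pin numbers are placeholders and this is not my final plugin code, just the basic technique: sample both pins on every edge and map the Gray-code transition to a direction.

# Minimal quadrature decoder sketch (hypothetical BCM pin numbers)
import time
import RPi.GPIO as GPIO

PIN_A = 23
PIN_B = 24

# Gray-code transitions (old state, new state) -> step direction
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

position = 0
last_state = 0

def read_state():
    return (GPIO.input(PIN_A) << 1) | GPIO.input(PIN_B)

def on_edge(channel):
    global position, last_state
    state = read_state()
    position += TRANSITIONS.get((last_state, state), 0)  # ignore invalid jumps
    last_state = state

GPIO.setmode(GPIO.BCM)
GPIO.setup([PIN_A, PIN_B], GPIO.IN, pull_up_down=GPIO.PUD_UP)  # internal pull-ups
last_state = read_state()
for pin in (PIN_A, PIN_B):
    GPIO.add_event_detect(pin, GPIO.BOTH, callback=on_edge)

try:
    while True:
        time.sleep(1.0)
        print("encoder position:", position)  # no change while printing = filament stall
except KeyboardInterrupt:
    GPIO.cleanup()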


  1. DONE: Raspberry Pi with Octoprint installed
  2. DONE: Designed parts for first prototype (using only Encoder and Dummy-Load Cells)
  3. DONE: Rotary Encoder arrived
  4. DONE: Pinout specified
  5. DONE: Waiting for load cells and Hx711 from China
  6. DONE: Test rotary encoder from Raspberry PI command line
  7. DONE: Test Octoprint API from Raspberry PI command line
  8. DONE:  Convert rotary encoder values to binary positions, movements, direction and speed
  9. DONE:  Wrote a simple filament monitor script that detects stalls and reverse movement and pauses the print via the Octoprint API
  10. DONE: Test Hx711 and load cell on Raspberry PI using Python
  11. DONE: The current "pause" in the Octoprint API is dangerous for scripts. It can resume if issued twice. ->fixed in Octoprint update
  12. DONE: solder the final cables and mount things on my UM2extended
  13. DONE:  designed a modified "608 mount" to weigh the remaining filament.
  14. DONE:  measuring the weight of the remaining filament works!
  15. DONE:  Write the final software including monitoring pulling force and spool-scale
  16. TODO: Write an assembly instruction including setting up the software.


Testing rotary encoder
Testing load cell for extruder pulling force
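Reading the Hx711 boils down to bit-banging its two-wire interface: wait for DOUT to go low, clock out 24 data bits and add one extra clock pulse to select channel A at gain 128. Below is a rough sketch with RPi.GPIO (again with placeholder pin numbers); real code should average several readings and subtract a tare value, and Python's sloppy timing means the occasional garbage reading when the clock line stays high too long.

# Rough Hx711 reading sketch (hypothetical BCM pin numbers)
import time
import RPi.GPIO as GPIO

DOUT = 5
PD_SCK = 6

GPIO.setmode(GPIO.BCM)
GPIO.setup(DOUT, GPIO.IN)
GPIO.setup(PD_SCK, GPIO.OUT, initial=GPIO.LOW)

def read_raw():
    while GPIO.input(DOUT):          # data is ready when DOUT goes low
        time.sleep(0.001)
    value = 0
    for _ in range(24):              # clock out 24 data bits, MSB first
        GPIO.output(PD_SCK, GPIO.HIGH)
        GPIO.output(PD_SCK, GPIO.LOW)
        value = (value << 1) | GPIO.input(DOUT)
    GPIO.output(PD_SCK, GPIO.HIGH)   # 25th pulse selects channel A, gain 128
    GPIO.output(PD_SCK, GPIO.LOW)
    if value & 0x800000:             # sign-extend the 24 bit two's complement value
        value -= 1 << 24
    return value

while True:
    print("raw load cell value:", read_raw())
    time.sleep(0.5)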

The software

I'm using a Raspberry Pi running Octoprint as the central controller.
The original plan was to use the Octoprint API.
For the final software (Here on GitHub) I decided to write an OctoPrint plugin.
This way I can not only detect whether filament is supposed to be moving at the moment and issue a pause command without hacking the firmware and dual-using one of the homing switches as an E-Stop.
I can also display information (such as remaining filament) in the OctoPrint web interface
and offer my measurements as new values in the OctoPrint API.
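For anyone going the plain API route instead of writing a plugin: pausing the current job is a single authenticated POST to the job endpoint. A minimal sketch (hostname and API key are the ones of your own OctoPrint instance); keep in mind that, as mentioned in the list above, older OctoPrint versions treated "pause" as a toggle, so issuing it twice resumed the print.

# Minimal sketch: pause the running print job via the OctoPrint REST API
import requests

OCTOPRINT_URL = "http://octopi.local"   # default hostname of the OctoPi image
API_KEY = "PUT-YOUR-API-KEY-HERE"       # from the OctoPrint settings dialog

def pause_print():
    r = requests.post(
        OCTOPRINT_URL + "/api/job",
        json={"command": "pause"},      # older versions toggle pause/resume
        headers={"X-Api-Key": API_KEY},
    )
    r.raise_for_status()

if __name__ == "__main__":
    pause_print()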

The parts

New method (using Octoprint API)
Old method (using GPIO signal to pause the printer):


Continuous multicam recording

I know, I have too many cameras.
What can I say. It's a hobby of mine.
But every now and then I actually use all of them... at the same time.
Lectures and events have a tendency of ending up as multicam events that need more than 20 or 40 minutes per camera and should have the highest possible quality because they simply can't be repeated.

Surprisingly I got my hands on a used Ninja 2 on eBay.
Now I have completed my setup!
(inserts manic laughter here) 


One GH4R with no internal recording limit can be powered by an Atomos Power Station and record in 4:2:2 onto a Ninja Assassin in UHD for the wide shot, which also gives crispy sharp, post-stabilized, post-panned FullHD medium shots as a crop. It doesn't move or refocus and thus needs little monitoring. (For Wifi monitoring I would have to switch from 10bit down to 8bit and I won't do that.)
An internal MPEG backup recording is running.
The Zoom H6 does the sound as 6 wav channels, including a -4dB backup against clipping if not all channels are used, and has its line out run into the Ninja Assassin too. Channels 1+2 are the Mid/Side microphone because that gives a perfect mono down-mix and can not have any stereo cancellations like the shotguns on 3+4. 5+6 is a Rode VideoLink digital wireless channel to a lavalier.

Remote Right

One Blackmagic Pocket Cinema Camera on the far right. Powered from a Lanparte V-mount plate. An Atomos H2S converts the HDMI to SDI.
An internal ProRes LT backup recording is running.
It provides FullHD closeups from the right.
The SDI cable runs to the left side.


On the left side there's me with an Atomos Samurai Blade, monitoring and recording it and giving instructions via PMR radio to match my own framing.
I'm operating another Blackmagic Pocket Cinema Camera, with another Lanparte battery plate and the Ninja 2, for matching closeups from the left side. The monitor is not as good and I have no scopes, but I have my eyes and the screen of the camera.
Now we have wide shots from dead center with any crop we desire and closeups for dialogue from left and right.
Whenever something important is obscured from one side, we have 2 different angles to choose from.


  • Each recording has a backup.
  • I can see what the opposite camera is doing, including focus peaking, and give instructions
  • the HDD (FullHD) and SSD (UHD) drives last forever
  • so do the batteries
  • audio from 3 locations + lav, in case a chair squeaks, and for a wide applause with no single shouts standing out
  • all recordings are in 10bit (or raw) - no banding when I pull the midtones up or down in post or pull up the shadows
  • all recordings are in ProRes (or raw) - fine noise floor for NeatVideo to work with and no MPEG artifacts
  • editing off the 3 SATA drives directly with no need to copy anything