Developing TITS (My overcomplicated, never ending project)

Tucker’s Intelligent Torque System
4wd torque vectoring for Ankle Wreacher
Strap in for a long and technical thread.

Legal

I have and am putting a lot of work and money into this project, and plan on sharing a lot here. I’m not a litigious person, but I want to add this legal disclaimer for some CYA.

The contents of this post and subsequent posts by @tuckjohn are licensed under CC BY-NC-SA 4.0


I started this project back in July 2024, and it’s still far from completion
(mostly through my own lack of controlling project scope :grin:)

Scope has grown to include:

  • Tire temperatures
  • Live display
  • Data logging
  • Data analysis

With the goal to eventually add:

  • Torque Vectoring
  • Rudimentary on-track interface
  • Traction Control
  • Antilock braking
  • Speed wobble dampener
  • Tire Forces

I’ve been teasing progress in the Ankle Wreacher thread… but it’s reached a point where I can share this journey with y’all. Below is a summary of the last 16 months:


The idea

  1. Sensors across board capture the board state, most notably truck angle(s)
  2. “Arduino-like microcontroller” reads from the sensors and receiver input
  3. “Arduino-like microcontroller” does calculations based on sensors and receiver
  4. “Arduino-like microcontroller” sends commands to VESCs to control the motors
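
For a sense of how those four steps fit together, here's a minimal, hypothetical sketch in Python. The real firmware is C++ on a microcontroller, and the torque-split law below is a placeholder invented for illustration, not TITS's actual vectoring math.

```python
# Hypothetical sketch of the four-step TITS loop. The torque-split law
# here is an invented placeholder, not the project's real algorithm.

def torque_split(throttle: float, steer_deg: float, gain: float = 0.01):
    """Split one throttle command into left/right torques.

    Positive steer biases torque toward the outside wheels; both
    outputs are clamped to [0, 1].
    """
    bias = gain * steer_deg
    left = min(max(throttle * (1 + bias), 0.0), 1.0)
    right = min(max(throttle * (1 - bias), 0.0), 1.0)
    return left, right

def loop_once(read_sensors, read_receiver, send_vesc):
    """One pass of steps 1-4: sense, read rider input, compute, command."""
    steer_deg = read_sensors()                       # 1: truck angle
    throttle = read_receiver()                       # 2: receiver input
    left, right = torque_split(throttle, steer_deg)  # 3: calculations
    send_vesc(left, right)                           # 4: command the VESCs
```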

The Start… TITS Main PCB V1.0

The first hardware prototype iteration had some good ideas, and some bad ones.

Development of V1

Hand assembled Custom PCB

Full assembly. Has a Teensy 4.1 for the main brain, an ESP32-C3 for display drawing, and a second ESP32-C3 as a VESC Express.

The PCB was designed to fit inside the bellypan

Screen mounted up front with the MEGAN display


Picture of all the required wiring for V1 (after I pulled it out to upgrade to V2)


Testbench for Development

V1 taught me how much software work this project would require; I needed a testbench where I could run experimental code without affecting my raceboard.

I bolted a bunch of spare parts to an old 3D printer frame to make a fully electrically representative skateboard.

Later on, tired of how much desk space it used, I screwed everything to the bottom of my lofted bed.

This has been invaluable for my software development efforts.


Truck angle Sensor

With the main goal of this project being torque vectoring, measuring rider intent is paramount. Truck angle is the best proxy for desired turning radius, so I opted to measure it to calculate the turning angle:

  • Accelerometer? → Too hard to reliably remove environmental factors and noise
  • Potentiometer/Encoder/RVDT? → Requires complex linkage geometry that I don’t have space for

Ended up using a magnet-based option. Specifically, I’m using a 3-axis magnetometer on each of the front and rear trucks.

Some magnets (in a 3D printed holder) are epoxied to 3-link hym

Then, using two extra collars mounted to my hanger, a custom magnetometer PCB is mounted directly above them

First test of the system proved very promising (graph showing XYZ magnetic field strength).

I’ll definitely post more information about the algorithm I came up with to convert these XYZ magnetic field magnitudes into deck angle/turning radius in the future; it’s quite complicated (probably more than it needs to be lol)

This PCB has gone through numerous revisions, fixing issues and changing the system architecture.

I anticipate doing another soon for reasons :grin:


Tire Temperature Sensors

Originally, I added these for the meme. However, I’ve found them to be invaluable while racing.

To measure my tire’s temperature, I use 4x MLX90614-BCC sensors.
I chose these because they’re readily available, cost-effective, and work decently enough.
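
For reference, the MLX90614 reports temperatures over I2C/SMBus as 16-bit words scaled at 0.02 K per LSB (object temperature lives in RAM register 0x07). The conversion is one line; the helper name here is mine, not the project's firmware:

```python
def mlx90614_raw_to_celsius(raw: int) -> float:
    """Convert an MLX90614 16-bit temperature word to Celsius.

    The sensor reports in units of 0.02 K, so scale the raw word
    and subtract absolute zero.
    """
    return raw * 0.02 - 273.15
```

For example, a raw reading of 14820 works out to 23.25 °C.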

These sensors are readily available pre-soldered to this sensor breakout PCB. My first revision used it directly, mounted in a 3D print that bolts to tapped holes in the motor mount plate.

First trials with them proved to be plenty accurate

After lots of problems with these breaking, I designed a custom PCB that allows for a beefier mount.



These have allowed me to do some cool stuff, like testing experimental tires here.
Can’t imagine having a raceboard without live tire temperature feedback now.


TITS PCB V2

V2 was a ground-up architecture redesign of V1. It copied what worked, and changed what didn’t.

Most notable was the location. By mounting it directly under the MEGAN, it:

  • Simplified some wire routing and harness construction
  • Lifted it out of the bellypan, where stray rocks were starting to cause PCB damage
  • Allowed a wider PCB, which was easier to lay out/route

I built two systems. One got installed onto Ankle Wreacher, the other was assigned to the testbench for software development (with a matching copy of sensor PCBs).

Installed on Ankle Wreacher. My board has a comical amount of wiring.


V2 Bus architecture
Here’s where the thread gets technical: “How is everything connected?”

This is my system architecture at the moment (subject to change as the project progresses, obviously)

My magnetometer and tire temperature sensors are on an I2C bus.

I2C bus design notes

Using an I2C bus for sensor communication isn’t ideal. I2C was designed as an on-PCB protocol, never meant to run more than a meter. My bus spans 7 custom PCBs, is about 1.5m long, and lives in a very EMI-noisy environment.

Despite this, I decided to use I2C for two reasons:

  • It’s a protocol that a lot of hobby-grade sensors use
  • All sensors can share the same wires (ex. SPI would require a separate CS wire for each sensor)

To protect signal integrity against EMI, everything is run through shielded wires. However, this greatly increases the bus capacitance, which reduces the maximum clock rate: I2C is an open-collector bus that relies on pull-up resistors to pull the lines back high. Great TI application note on the topic
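
To put rough numbers on that trade-off: an open-collector line rises through the pull-up's RC, and the commonly used 30%-to-70% rise-time estimate is t_r = 0.8473 × Rp × Cb (0.8473 being ln(0.7/0.3)). A sketch, with capacitance values picked purely for illustration:

```python
import math

RISE_CONST = math.log(0.7 / 0.3)  # ~0.8473, from the 30%->70% RC window

def i2c_rise_time(r_pullup_ohms: float, c_bus_farads: float) -> float:
    """Estimated SDA/SCL rise time for an open-collector I2C line."""
    return RISE_CONST * r_pullup_ohms * c_bus_farads

def max_pullup(t_rise_max: float, c_bus_farads: float) -> float:
    """Largest pull-up that still meets a given rise-time budget."""
    return t_rise_max / (RISE_CONST * c_bus_farads)
```

Assuming (hypothetically) 400 pF of shielded-cable bus capacitance: a common 4.7 kΩ pull-up gives roughly 1.6 µs of rise time, far beyond Fast-mode's 300 ns budget, while meeting that budget needs a pull-up somewhere under ~900 Ω. That's exactly the "much stronger pull-ups" situation.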


I originally tried solving this with a PCA9615, a software-transparent buffer that re-encodes the I2C signal as a higher-voltage differential signal (which is decoded at the other end)

However, the PCA9615 was never very reliable, frequently failing on me. To get a working system, I just bypassed them and used some much stronger (lower-resistance) pull-up resistors. This resolved the issues, but requires the sensors/microcontrollers to work a lot harder to pull the bus low (a lot more current). I’m surprised that I haven’t burned anything out yet :rofl:

In future revisions, I’d still like to incorporate an I2C buffer to help with signal integrity. In hindsight, the PCA9615 was complete overkill; a more “normal” I2C buffer like the P82B715 or similar would be fine

The Teensy 4.1 does all the sensor/vesc communication, relevant calculations, data logging, etc.
The Teensy 4.0 is an overqualified GPU. The much shorter distance to the display on V2 allowed for much faster communication, enabling higher framerates (~22fps atm)

Interfacing with the VESCs is done directly over the CAN bus, with a custom CAN library that I wrote. This utilizes the Teensy’s internal CAN controllers, allowing for very low overhead communication (Low latency and FAST!).
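
As a flavor of what goes over the wire: to my understanding of the VESC CAN convention, commands use 29-bit extended IDs with the packet type packed into bits 8+ and the controller ID in the low byte, and SET_CURRENT carries a big-endian int32 in milliamps. A hypothetical frame builder (not the author's library):

```python
import struct

CAN_PACKET_SET_CURRENT = 1  # packet-type value from the VESC firmware enum

def vesc_set_current_frame(controller_id: int, amps: float):
    """Build (extended CAN id, 4-byte payload) for a VESC SET_CURRENT command."""
    can_id = controller_id | (CAN_PACKET_SET_CURRENT << 8)
    payload = struct.pack('>i', int(amps * 1000))  # big-endian milliamps
    return can_id, payload
```

So commanding 12.5 A to controller 3 would produce id 0x103 with payload 00 00 30 D4.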

Isn't this just a copy of Radium RTS?

Inevitably what I’ve built here is going to be compared to Radium’s RTS system that ships with the Mach One. TITS and RTS are similar at a high-level, but I hope this thread is proof enough that this is not a reverse-engineered clone of RTS.

  • Entirely different system architecture
  • Entirely different hardware (including individual components)
  • Slightly different truck sensing method/algorithm
  • Live display feedback
  • SD Logging
  • Tire temperatures

To be fair, the RTS system does some things better than TITS:

  • Much simpler for an average user to use/tune (just not something that I’m optimizing for)
  • More cost optimized

Radium, you do cool stuff, love you guys :grin:
Everyone go buy motors from them


Logging infrastructure

Given that this is an experimental system, I need to log everything possible for post-race analysis and system debugging.
However, to allow my TITS control loop(s) to run at as high a frequency as possible, I needed the processing overhead of logging to be as low as possible.

Highly technical exploration of the infrastructure I've built

On a hardware level, the Teensy 4.1 writes to the SD card over a native 4-bit SDIO bus, which is ~4x faster than SPI (1-bit bus). (I’ve also gotten a U3-rated SD card, just to ensure it’s not a bottleneck)

On a software level, I went way overboard
In previous Arduino projects, I’ve formatted the data into a CSV format, then appended the CSV-formatted string to the end of the log file.
However, this was relatively computationally expensive and memory hungry. Additionally, every additional byte written to the card increases the write time, which decreases how fast I can run TITS (because Teensys are single-threaded).

Say I have an integer variable containing the number 433312. Instead of formatting and writing the string “433312,” (7 bytes), I can write the raw binary 0x00069CA0 (4 bytes).
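
The size difference is easy to demonstrate with Python's struct module (assuming little-endian storage, which is what the Teensy's ARM core uses natively):

```python
import struct

value = 433312  # e.g. a millis() timestamp

csv_bytes = f"{value},".encode()      # "433312," -> 7 bytes on disk
bin_bytes = struct.pack('<I', value)  # raw uint32 -> always 4 bytes

# bin_bytes is a0 9c 06 00: 0x00069CA0 stored little-endian,
# and struct.unpack('<I', bin_bytes) recovers 433312 exactly.
```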

A binary(.bin) file! Incomprehensible, but very information dense.
For example, these 4 bytes contain the milliseconds since powered on. 5203ms

To do anything with this data, it needs to be post-processed, parsing the raw binary back into separate variables. This requires the parser to know the number, order, and types of the values in the binary file.

So naturally, I also wrote a custom Matlab parsing script.

I could’ve hard-coded my parsing script to my current format, but I expect the format to change as I develop the system, add variables, etc. I definitely don’t want to have to change the parsing script every time I do:

  • Would make updating TITS source code take more work, and take away focus from what I would actually be trying to update
  • Updating the parsing script would make it unable to parse older log files that use an older format. The strict version control needed to preserve that capability would be annoying extra work.

So, upon board startup, TITS creates a “spec” plaintext file(.txt) that contains all information necessary for parsing the Binary file (.bin).
(I also use the Teensy’s onboard RTC clock to name the files with a datetime when they were created, really helps with usability and identifying the right log file)

The contents of the spec file are generated automatically by a custom Python script when the source code is compiled.

This spec file is automatically found by my Matlab parsing script, which uses it to dynamically parse the data into Matlab variables, ready for graphing/export/other douchebaggery.

All of this happens entirely transparently and automatically. My TITS source code can change, and my Matlab parsing script automatically adjusts to compensate :tada:
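
The whole spec-plus-binary scheme can be sketched in a few lines. The spec format below (one name,format-code pair per line) and the field names are invented for illustration; the real spec file and Matlab parser are their own thing:

```python
import struct

# Hypothetical spec: field name + struct format code, one per line.
SPEC = """\
millis,<I
duty,<f
motor_temp,<h
"""

def parse_log(spec: str, blob: bytes):
    """Parse a packed binary log into dicts using the plaintext spec."""
    fields = [line.split(',') for line in spec.strip().splitlines()]
    fmt = '<' + ''.join(code.lstrip('<') for _, code in fields)
    size = struct.calcsize(fmt)  # bytes per record (10 for this spec)
    return [
        dict(zip((name for name, _ in fields),
                 struct.unpack_from(fmt, blob, off)))
        for off in range(0, len(blob), size)
    ]
```

Because the logger and the spec come from the same source, the parser never goes stale: change the log format, and the spec changes with it.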

With this infrastructure up, I’ve got tons of cool graphs of data! Right now I’m able to get 50~100Hz data, and definitely will be sharing more as this thread goes on.

For starters, check out this one of my 4 tire temperatures. You’re able to see the ~40s laps just because of the tires cooling on the straightaway :exploding_head:


Misc steam deck

In addition to my desktop, I got my development environment set up on my steam deck so I can do mobile development/code updates. Should be useful if I ever need to modify code or look at data at a race event.


This post is an overview of what I’ve already achieved, but this is obviously a WIP project.
I’ve got a hardware revision in the works (with more features!) and lots of software still to write.

More to come!

Continued…


I came because of the Titi and I have no idea what is going on…


I want your tits…

Can you give me tits please!


My first login in over a year and i’m blessed with reading a post like this. Incredible dude


Does TITS have anything to do with the REDACTED photos you posted in the Nothing Fancy thread?

Did you redact your tits!?

I love this project, super cool and ambitious!


Unfortunately not, that’s a separate secret.
No TITS on Nothing fancy (for now)


@tuckjohn
I just read it and still have no idea, how. But what an interesting story! Amazing job and development :partying_face:


Very interesting read! I’ve been thinking of adding tire temperature and pressure readouts to my board, so might be going a similar way eventually.

Torque vectoring is also such an interesting topic, I really wanted to work on it but I am overwhelmed with my projects without that already :sweat_smile:

Can’t wait to hear more about this as you progress!


I am just…so impressed.

These are the best damn tits ive ever seen.


This is soooo dope!! Good idea to track the hanger with a magnet


Great TITS sir, beautiful work. Traction control and Wobble dampener could be a game changer if you get that working.


So, I’ve got a magnetometer on the front hanger, and another on the rear hanger. 2x sensors providing 3-axis XYZ data about the magnetic field.

What’s the best way to calculate a single variable (angle of the deck) from these six magnetic field strengths?

This is all about building a pipeline for calibration. Everything explained here happens entirely behind the scenes… To redo calibration, all that’s required is running the Magnetic TITS Calibration Tool and (slowly) sweeping the board across its range of motion.


Why it’s a complicated problem
Measuring rotation from a magnetic field is a very common engineering problem, even in esk8. There are plug-and-play integrated circuits that output degrees of rotation directly. In fact, most hall effect remotes use them.

However, sensing 3-link truck angles isn’t that simple.

  • 3-links don’t rotate around a static axis (like channel trucks do). Their axis of rotation changes throughout the rotation, so it’d be impossible to properly locate a sensor/magnet for a simple rotation measurement (because the axis constantly moves)
  • This axis also changes when linkage angles change, so even if it was possible, I’d need to re-glue magnets and re-mount the sensor whenever I changed linkage setup.
  • I don’t trust my magnets to be high-quality enough to be uniform
  • My magnet and sensor mounting is never going to be perfect

I decided that XYZ magnetometers would get me the most accurate measurement possible, thanks to the lower-level control and increased data throughput (at the cost of increased software complexity).

So, calibration?
Calibration is the process of correlating sensor values with a “ground truth”
Then, during operation, you go the opposite direction, calculating the truth from the sensor value.

Because this is going to need to be repeated anytime I change linkages or touch the sensors, I wanted to automate the process as much as possible.

Introducing, the Magnetic TITS Calibration Tool

This Matlab software runs on my PC and connects to TITS over USB to read sensor values.

On the TITS PCB, I have a high-accuracy IMU that measures acceleration and rotational rate. If the IMU is not moving and the board is on a flat surface, the angle of gravity will be equal to the angle of the deck. Easy to calculate with some basic trigonometry!
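
That trigonometry is essentially one atan2 on the gravity components. A roll-axis-only sketch (helper name invented):

```python
import math

def roll_from_gravity(ay: float, az: float) -> float:
    """Deck roll (degrees) from a stationary accelerometer's gravity vector."""
    return math.degrees(math.atan2(ay, az))
```

With the board flat, gravity is all on Z and the roll reads 0; lean the deck 10° and the Y/Z split of gravity yields exactly 10°.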

X-Axis Offset

You might have noticed the accelerometer zero button on the tool.

Ever since my collision with Redbeard at Apex, Ankle Wreacher’s front box has been… not straight.

Only a few degrees, but enough to be significant. To avoid this affecting the accuracy of my calibration, I do a coordinate transform (defined by the accelerometer’s zero calibration) to rotate all measured accelerations to the deck’s reference frame, correcting for this misalignment.

A nice side effect of this reference frame transform is that it would allow TITS to be mounted on an angle-tipped deck in the future, if I ever wanted to.
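
Reduced to the roll axis for brevity, that zeroing transform might look like the sketch below: press "zero" while the deck sits flat, remember the misalignment angle, and rotate every later sample back into the deck frame. (The real transform is presumably a full 3D rotation; names here are hypothetical.)

```python
import math

def make_roll_zero(ay0: float, az0: float):
    """Capture a 'zero' sample and return a roll-only deck-frame transform.

    (ay0, az0) is gravity as measured while the deck is known-flat; the
    returned function rotates any later (ay, az) sample by the opposite
    of that misalignment angle.
    """
    theta = math.atan2(ay0, az0)  # mounting misalignment
    c, s = math.cos(theta), math.sin(theta)

    def to_deck_frame(ay: float, az: float):
        return (c * ay - s * az, s * ay + c * az)

    return to_deck_frame
```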

If I then record the measured magnetometer values at the same instant, I’ve got a record of what magnetic field strength to expect at that amount of deck lean.

This is what the Magnetic TITS Calibration Tool does! It automatically records data, tossing out anything where the board is moving (which would throw off the calculated gravity vector).
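
The "is it moving?" rejection can be as simple as checking that the measured acceleration magnitude is close to 1 g; anything else means the board is accelerating, and the sample would corrupt the gravity estimate. (The threshold below is invented.)

```python
import math

G = 9.81  # m/s^2

def is_stationary(ax: float, ay: float, az: float, tol: float = 0.2) -> bool:
    """Accept a calibration sample only if |a| is within tol of gravity."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - G) < tol
```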

Rotating the board full left and right, this is what the raw data looks like.

Great! There’s a strong correlation between deck angle and the measured magnetic field on all axes. Some cool things to notice:

  • For a given amount of deck lean, the front sensor measurements change way more than the rear sensor’s. This makes sense, because my steering is front-biased
  • The measured magnitudes are not sinusoidal in shape. This reinforces my hunch that I wouldn’t be able to use a simple rotational encoder (and that selecting magnetometers was the right choice)
  • The rear data shows some interesting hysteresis on the Z axis (yellow). Speculation on this later.

Because we exist in the real world, we’ve got to deal with the sensor noise.

This is where the real math arrives.

Calibration algorithm
I spent a long time thinking about this algorithm. It needed to be:

  • Accurate and repeatable
  • Robust to interference from stray magnetic fields
  • Tolerant of something slightly shifting while I’m riding
  • Able to report the quality of the calibration (and whether re-calibration is needed)

I can use the XYZ magnetic field magnitudes as coordinates in a 3D plot, with color representing deck angle. Hopefully you’re able to see that all the points exist on a single line through the space, and that the deck angle (represented here by the color) changes continuously along that line.

After some processing, I get a single “idealized line” through magnetic state space that defines the relationship between deck angle and measured magnetometer values.

Now, when I measure the magnetometers and want to know the angle of my deck:

  1. Plot the point in the XYZ magnetometer state space
  2. Snap the point to the nearest location on the idealized line
  3. The calibrated “deck angle” at that point of the idealized line is almost certainly the current angle of the deck

Here’s a graph that kinda visualizes the process;

Because there will always be a “closest” location on the line, the algorithm should never fail to find a solution.

Even cooler, the distance correlates with calibration quality!
If something changes that invalidates the calibration, the distances from new measurements to the idealized line will suddenly become very large. That should make it easy to implement fault detection down the line (allowing TITS to detect and compensate for a faulty sensor/broken magnets)
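
The lookup (and the distance-as-quality idea) can be sketched like this. For brevity it snaps to the nearest stored vertex rather than projecting onto line segments and interpolating, and all names are hypothetical:

```python
import math

def calibrate_lookup(reading, line_points, line_angles):
    """Snap a raw XYZ magnetometer reading onto the idealized line.

    line_points: XYZ points sampled along the idealized line;
    line_angles: the calibrated deck angle at each point.
    Returns (deck angle, distance to the line) - a large distance
    flags a stale calibration or a faulty sensor.
    """
    best_i, best_d = 0, float('inf')
    for i, p in enumerate(line_points):
        d = math.dist(reading, p)
        if d < best_d:
            best_i, best_d = i, d
    return line_angles[best_i], best_d
```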


Checking my work
Feeding my recorded calibration data back through this algorithm, I get a graph of how accurate the calibration is. These graphs are so fucking cool, you have no idea.

  • The front sensor is incredibly accurate, calculating the deck angle to within ±0.5 degrees!
  • The front sensor also shows a nearly perfect Gaussian distribution. This makes me think that this is the maximum accuracy I can expect to achieve without a sensor upgrade (as sensor noise is almost always gaussian)
  • The rear sensor is less accurate (±1 degree), and it shows a bimodal distribution of data.

Bimodal distribution?
Basically, showing two distinct peaks (example from google)

Looking at the raw data, it has hysteresis (as I mentioned before). Basically, the measured data depends on the direction you approach it from. This would explain the bimodal distribution, as the “theoretically true” magnetic value is either over- or under-measured by the magnetometers. (another example from google)

I scratched my head for a while trying to figure out the source of this.
Multiple recalibrations showed the same thing. Try to guess; it’s fucking cool.

  1. The “true” value of deck angle comes from the accelerometer at the front of the board.
  2. The front trucks don’t show a bimodal distribution because they’re much closer to this “truth” sensor
  3. I stood on the board during the calibration, and swept it from full left, full right, then full left.
  4. Drag on the wheels/gears caused the chassis to twist, causing the rear trucks to lag slightly behind the front trucks.

In other words, my sensors are so accurate that they can pick up the torsional twist of a chromoly steel V5 chassis

Standing on the board and… yeah. That looks like a degree or two of twist :exploding_head:


Now what?
I still need to write the C++ version of the algorithm that will actually run on TITS, but that’s probably going to take a day or two of work.

Once I get that written, I’m not entirely sure. I want to start analyzing on-track data (see how it looks). I’m very excited to see truck angle data correlated with other stuff like GPS, accelerations, turning radius, power, etc. Should be really cool.


Dam tuck, anything your tits cant do? Omitits


You might be the coolest person I know, this is insane.


insanely impressive engineering in order not to use dualities haha


Look, I did this because it makes the entire system more flexible to unique hardware configurations and setups.

Also I already tried putting dualities onto a super spine/v5 and decided writing the software would be easier lol


Excited to get this data into a video format


That’s super cool data!! The spread at the end is interesting! wouldn’t have expected fronts to be hotter!


I present the best thing I’ve ever made.
My magnum opus;

The product of weeks of nonstop work, and a culmination of years of esk8 riding, board building, and TITS development.

Yes, this race happened 6 weeks ago. shush.


the long journey that was producing this video

From the moment I started TITS, I aspired to overlay data onto video. I had very high expectations/requirements, and wouldn’t accept any compromises:

  • Had to work with my unique data (obviously)
  • Tire temperatures are represented by the changing colors
  • Bar graphs for throttle/speed/etc.
  • Data shown in numeric text format
  • Hardware accelerated video encoding
  • Every frame gets new data plotted (no sharing data graphics between frames!). TITS records a lot of data, I want to use all of it!

The Dead Ends
Multiple times, I had to scrap days of effort and go back to the drawing board with a different approach.

Matlab

I started by extending my existing Matlab data parsing script. I was able to import the video as frame bitmaps, then draw graphics elements on top of each bitmap frame.
However, it quickly became apparent that

  • Manually drawing pixels would require coding custom drawing functions for text, graphics, shapes, etc.
  • Adding transparency to shapes would be possible, but difficult
  • Because I was doing everything in strict 8-bit RGB, the quality wasn’t great.
  • Everything was running on my CPU (no hardware encoding), so each frame took a very long time to draw and export. A full render could have taken weeks.
  • Adding GPU acceleration would require code refactoring, and increase the complexity substantially. Not to mention requiring a $50 toolbox package.

It quickly became clear this wasn’t the path forward, so I started looking at other options.

Python

Python has always been my Achilles heel.

I’ve failed to become proficient in it 4~5 separate times. I wanted to change that, and thought this video could be my excuse to finally become passably proficient.

That said, Python is undeniably powerful. I fully expected there to be an open source library that would do what I wanted… but I couldn’t find one. I tried 3~4 libraries over several days, and hit weird roadblocks with each one.

Part of me wanted to keep trying to get it to work, as python seems like the “most right” way to do it? But I didn’t think I was capable of getting it to work.

Joe Barnard captures this sentiment fantastically here:
https://youtu.be/4jgTCayWlwc?si=KgTda5t6Jjw8_Dn0&t=735

Times I’ve been bested by python… +1

Dashware

I was really hopeful for this one!
On the box, it should be able to do exactly what I want it to do - overlay data onto action camera footage!

It was really promising, but I only got as far as importing my data before realizing it wasn’t the solution. Death by a thousand cuts:

  • Dashware is abandonware, last updated in 2017. It’s only going to get more broken in the near-ish future.
  • Random crashes
  • Clunky UI that required a lot of clicks to do anything. Configuring a custom data importer and custom UI would take days, maybe weeks, of tedious data entry, and would be hard to adjust as I expand TITS.
  • Color-changing tire graphics might be possible? Would likely be hacky, and/or take a lot of effort to get working.
  • Multiple frames shared the same data. It only refreshed at ~10Hz iirc
  • Every time I exported, the video came out flipped upside down (with the graphics still upright!), regardless of how the exporter was configured.

Not the clean solution I was looking for, unfortunately

Telemetry Overlay

Telemetry Overlay is an absurdly expensive ($200) program for overlaying telemetry data onto action cam footage.

However, it doesn’t support custom data files, data types, or graphics elements. Easy reject.

All of these failed attempts pointed me to needing a powerful video suite to fully realize my vision.


DaVinci Resolve
is an absurdly powerful, fully free video editing suite. It comes with Fusion, its node-based compositing editor

Vonk Ultra is a 3rd party plugin for Fusion, which adds numerous nodes for doing math and drawing animations. Most notably, it supports animating graphical elements with data.
It also can pull that data from a JSON file!

Always wanted to get good at video editing, this was a great excuse to dive into the deep end.


A solution, at last

Some additional code in my Matlab data parsing script exports data as a JSON. This JSON file is ingested by my fusion composition, which then draws the graphical elements.

Technical Weeds
  • To make Fusion’s job easier, I resample/downsample the raw data in Matlab to match the framerate of the video (in this case, 50Hz) with linear interpolation. That way, each line of data in the JSON file is exactly one frame’s worth of information.
  • This greatly simplifies the logic that happens in Fusion, as the data for any given frame is just the matching line of the JSON file.
  • How did I sync the data to the video? I did a few quick throttle punches after starting the camera. These produce very sharp ppm spikes in the data, which I can use as a “clapperboard” to sync with the video.
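
That resampling step might look something like the pure-Python sketch below (the real pipeline does this in Matlab; the function name and signature are mine):

```python
def resample(times, values, fps, t_end):
    """Linearly interpolate irregular log samples onto the video frame clock,
    so row k of the output is exactly frame k's worth of data."""
    out = []
    j = 0
    for k in range(round(t_end * fps) + 1):
        t = k / fps
        while j + 1 < len(times) and times[j + 1] < t:
            j += 1
        if t <= times[0]:
            out.append(values[0])       # before the log starts
        elif t >= times[-1]:
            out.append(values[-1])      # after the log ends
        else:
            t0, t1 = times[j], times[j + 1]
            f = (t - t0) / (t1 - t0)    # position within the segment
            out.append(values[j] + f * (values[j + 1] - values[j]))
    return out
```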

This composition came together relatively quickly, only taking a few evenings of work to get the basics laid out (including learning!). Early prototypes were haphazardly laid out, prioritizing demonstrating a proof of concept before I invested time in making everything pretty.

Note: Resolve’s benefit over Python?

Using a full-suite video editor provided a UI to effortlessly place elements, make shapes translucent, rotate shapes/text, add transitions, etc… with real-time feedback for how it looks. I couldn’t imagine defining the positions and properties of the UI with just code. Would’ve taken ages to look half decent.

The actual video timeline is pretty simple (Fusion Composition in pink)


Music

I wanted something suspenseful, building, and flowy. I almost went with Mombasa, but the vibe was too serious.

Ended up going with
Worakls - Detached Motion
Worakls - Caprice

Tension building like Mombasa, but with some funk silliness thrown in


Rendering

It proved to be an entire project in itself. A week of troubleshooting!

I’m so far outside Vonk Ultra’s intended use case of abstract 3D animations that Resolve crashed when I tried to render. My hard drive was filling up with 100s of GB of temp files… causing a crash before the render was even a few minutes in.

Projecting how much space would be required, the temp files would end up bigger than my entire C drive!
And I REFUSE to solve this by reinstalling Windows onto a bigger drive; that’s just too dumb (even for me)

Oh, the Joys of a unique use case!

I was entirely unable to find resources online for this problem.

If I could move that cache to another drive, that would be okay. But that specific cache just wasn’t an option to configure in the Resolve settings. I burned a lot of time learning how Resolve caching works (even the 4,234-page Resolve manual provided little help)

Turns out there’s an entirely separate settings menu for Fusion caches. Fortunately, I found a better solution before I figured that out…

A German Savior

I put up a flag for help on the Vonk Ultra forums. Local moderator Dunn graciously offered his time to help debug.
Massive, massive thanks to him. He ended up fixing the issue with a custom Lua import script, written specifically for this project. I made moderate modifications to his initial version to add/change what data was processed and how, but the overall framework remained untouched.

I couldn’t have done this myself (lack of familiarity with Lua/Fusion/Vonk); Dunn made this video possible. The custom script reduced the temp rendering files from 1000GB+ to <1GB, and increased my rendering speed by 220% :tada:


Sharing

When I first uploaded the 1080p export to YouTube, the quality was dogshit. The video was blurry and the text unreadable. Unwatchable. Original (left) vs YouTube (right)

Predictably, YouTube compression *really* doesn't like how much stuff was changing each frame.

Basically, I inadvertently created a torture test for video compression. Probably why no other piece of software out there produces new data graphics for every single frame :zany_face:

YouTube re-encodes uploaded video to a lower bitrate, somewhere around ~12Mbps for 1080p.

Resolve exported with an average bitrate of 288Mbps to display everything… that’s 24x higher!

So it makes sense the YouTube version would look terrible.

I needed more bitrate. YouTube gives higher bitrates to higher resolution videos (~45Mbps for 4k), so I re-rendered the video at 4k resolution.

Much better! Original (left) vs YouTube 4k (right)
Even the 1080p quality was better than before!


… And that’s why it took me a month and a half to edit this video.

Hope you enjoyed it!
