Homelab Check In 2026

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am reviewing the important parts of my homelab, going over the servers and software, and improving the inner workings. Let’s get started!

Hardware

The Internet comes in and connects to my ISP’s box, which is set up in some kind of bridge mode. A dedicated OPNsense box named Cerberus serves as the network’s hardware firewall and router. Wi-Fi is handled by an older router with stock firmware.

I have a server closet with my old RedLaptop, a NAS (Network Attached Storage) called GoldenOakLibry, and a pair of Dell Optiplex towers named ButtonMash and Joystick.

Golden Oak Libry

When we first got a NAS, we went with Synology. Since then, I’ve learned a lot and the Linux kernel has developed, and one day I heard from tech support that they won’t be supporting a mission-critical feature given my software stack (more on that later). I’ve been taking it slowly, but I’m transitioning to a newer QNAP device and adjusting to its quirks.

The driving force behind this month’s efforts has been completing this migration. I’d previously moved all the data onto other drives and back, losing only a few small files – thanks mainly to a terminal program called rsync. I grabbed all the metadata along with it, including the UID/GID (User ID/Group ID) of every file. And since I’d set up the previous NAS not to care about users, they were all Synology’s default “everybody” identifiers.

My goal for this time around is to distinguish between users. How hard can that be? Very.

Researching for Network Login

NFS (Network File System), the protocol I’m using to talk with Golden Oak, by default just trusts the incoming UID of any client connecting to it. That means that if I upload a file from DerpyChips with a UID of 1000 and access the NAS from my new laptop with a UID of 1000, everything will just work. But nothing keeps my father’s computer from reporting the same UID, as 1000 seems to be the default value for the first user on most systems. Now, I have an account on my father’s computer, and since it was created after his, my UID there isn’t 1000. Too much work, too fragile. Approach rejected.
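To illustrate where that trust lives, here is a sketch of an export line on the NAS side (path and subnet are hypothetical). With the default `sec=sys`, the server simply believes whatever UID/GID the client reports:

```
# /etc/exports sketch – hypothetical share and subnet
/share/family  192.168.1.0/24(rw,sec=sys,no_subtree_check)
# A Kerberos-backed setup would use sec=krb5p here instead,
# authenticating users rather than trusting client-reported IDs.
```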

A more robust approach to solving this problem is Kerberos, which I vaguely remember from my initial days setting up the Synology NAS. In short: a client talks to a trusted server to get at a service. I spent a while researching implementation – including using Buildah to make a container to run with Podman – but found it expects far more network infrastructure than I have, so I pivoted toward learning about a technology I keep hearing about in relation to homelabs: LDAP.

LDAP (Lightweight Directory Access Protocol) is roughly equivalent to Active Directory in Microsoft Land. And in true Linux fashion, LDAP is a specification, not a single piece of software. Opinions differ as to which is best for a use case like mine, but LLDAP (Lightweight LDAP) seems to be right for me.

Joystick

Podman is the underlying container technology of my homelab. It is often compared to Docker, but it can run without root access, limiting the damage in case of a container breakout. I have a non-admin account on my server, Joystick, where I use Quadlet files to define permanent Podman containers so they’re always running when I expect them to be.
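For illustration, a Quadlet unit is just an INI-style file under ~/.config/containers/systemd/ that Podman translates into a systemd user service. The image, port, and volume path here are hypothetical, not my actual setup:

```
# ~/.config/containers/systemd/vaultwarden.container (hypothetical sketch)
[Unit]
Description=Vaultwarden password manager

[Container]
Image=docker.io/vaultwarden/server:latest
PublishPort=8080:80
Volume=%h/Services/vaultwarden:/data:Z

[Install]
WantedBy=default.target
```

After a `systemctl --user daemon-reload`, the generated service starts and enables like any other: `systemctl --user start vaultwarden`.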

But the heart of my homelab (software-wise) is Caddy. Caddy is more or less a do-it-all networking suite: a reverse proxy with built-in HTTPS, using either publicly trusted or self-signed SSL certificates. Everything Caddy does is controlled by a Caddyfile written in as simple a format as possible. Once everything is in motion, the most annoying part is telling your computers and browsers to trust your self-signed certificate authority.
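A minimal sketch of a Caddyfile entry in that spirit (hostname and port are hypothetical); `tls internal` tells Caddy to issue the certificate from its own local CA instead of a public one:

```
# Caddyfile sketch – hypothetical hostname and backend port
vault.myDomain.lan {
	tls internal
	reverse_proxy localhost:8080
}
```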

Taking inventory, I found I didn’t have much in fully functional order: Vaultwarden (a third-party Bitwarden server written in Rust) and Crafty Controller (Minecraft server management software). Beyond that, I have an incomplete Nextcloud setup waiting on Golden Oak and dreams of many more projects – many of which are also waiting on Golden Oak Libry. The home directory was getting full, so I sorted my containers between Services, which provide target functionality, and Network, which supports the Services.

One of Caddy’s features is a static web server. After reviewing all my containers, I wrote a landing page for it to display. Debugging it ate a bunch of time, but I found I had goofed a mount configuration. From there, I linked my different services as a way to test them.
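Serving a landing page only takes a couple of directives – something like this sketch, where the hostname and /srv/www are hypothetical stand-ins for the container’s mounted web root (the mount being exactly the sort of thing I had goofed):

```
# Caddyfile sketch – static landing page from a mounted directory
home.myDomain.lan {
	tls internal
	root * /srv/www
	file_server
}
```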

During this process, I discovered a poor Caddyfile entry for Crafty. Someone on GitLab had suggested essentially bypassing HTTPS security in a way that otherwise looked legit. This would have been fine for my setup, but out of concern for the community, I updated an earlier solution – one that relied on a deprecated Caddy directive – to instead point Caddy at a /path/to/certificate. In this way, I learned one way to “establish trust.”

Cerberus

Hardware firewall. It’s basically a fancy router – a REALLY fancy router. If Caddy is the heart, Cerberus is the head. It handles my DNS blocklist through Unbound, OPNsense’s built-in DNS resolver, which of course also handles ordinary domain name resolution. I was several major versions out of date. Oops. Fixed.

Ironing out Unbound was another place that ate up diagnostic time like crazy. I eventually happened across a bug report where the DNS service chokes when Unbound tries forwarding a wildcard subdomain (as in *.myDomain.lan) while the router believes itself to be on that same domain (under System > Settings > General > Domain). It wouldn’t stop completely, but it would get really slow. Switching to myDomain.arpa is a small price to pay for not having to manually forward each subdomain I configure for Caddy.
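For reference, the wildcard idea can be expressed in Unbound’s own configuration language. The address is hypothetical, and in OPNsense this sort of thing lands in the GUI or custom options rather than a hand-edited file:

```
# Unbound sketch: answer *.myDomain.arpa (and the apex) with Caddy's host
server:
  local-zone: "myDomain.arpa." redirect
  local-data: "myDomain.arpa. IN A 192.168.1.10"
```

The `redirect` zone type is what makes one `local-data` record cover every subdomain, so new Caddy sites need no extra DNS work.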

Takeaway

Life’s been hectic as of late, so I haven’t had the time or usable attention span to finish all my goals. Instead, I found several smaller goals and am whittling away toward the larger ones. My homelab was a bit of a mess after the techno-wizarding I’ve been pulling off, and on this later pass, I’ve improved my understanding of what I was doing and how to engineer it better.

I have plenty of smaller goals I left out because I want this to be a log of what I’ve managed to finish – not a dream log of what I still have to do. But I do still have one thing I wish to share.

One of the links on my homelab landing page is back to the router: https://cerberus.myDomain.lan. Firefox won’t load it. I looked up how to establish trust between Caddy and OPNsense, and my search results were flooded with a Caddy plugin for OPNsense. I’m sure it’s a fine project, but I don’t want to invest in that toy right now. Long story short, I pivoted to using an ACME (Automatic Certificate Management Environment) client I found for OPNsense. I left an entry in my Caddyfile to hopefully expose the ACME server and did everything needed for a first attempt. It failed, but at least I got that far. For now, though, I need to call it quits on the research.

Prayer

One YouTuber I occasionally pay attention to is Network Chuck (Shoutout if you’re reading this). I may disagree with his claim that “everything in IT requires coffee,” but I do like how he’s made a space in his videos for Jesus, and I’d like to do the same (the prayer, not the coffee).

Father in Heaven,

I thank you for the progress I’ve had this month, and I praise you for whoever is sticking around. Please bless him or her according to your will. May you inspire my readers in their personal projects, be it with Caddy, OPNsense, or more. Guide them towards whatever documentation or bug report they need, and help them avoid focusing on premature feature creep.

In Jesus’ name I pray, Amen.

And if you are curious about the Bible, God, or Jesus, I’d be more than happy to talk. These days, I primarily use Discord. There’s a link to my server in my Socials tab at the top of this page. So far, He’s been welcome there, but from now on, I’d say He’s properly invited.

Final Question

Do you have an opinion on which LDAP server reigns supreme for homelabs? I look forward to hearing about it in the comments below or in my Socials!

Game Pitch: AstroRush

Good Morning from my Robotics Lab! This is Shadow_8472, and today I have a game I’ve started making with a side of working on my homelab. Let’s get started.

The Pitch

AstroRush

A re-imagining of the classic asteroid-blasting arcade staple, this simple 3D space shooter puts you in the cockpit of a nimble spacecraft pulverizing asteroids in search of their mineral wealth. But is humanity alone in its interest in the Alpha Centauri system?

Styled to resemble the vector graphics of its inspirational Atari title, Asteroids, AstroRush is the first game in my lineup to build my skills toward expressing my Container Analogy in an interactive format.

Progress so far

First off, I’m building this game on Redot 26.1, a free and open source game engine forked from Godot to provide a more stable base. The project recently moved from a “soft fork” to a “hard fork” – meaning that backwards compatibility with Godot is no longer a concern. The word I heard is that Godot introduced breaking changes in a minor update, which generally is a big no-no.

Moving on to AstroRush. It’s a working title I came up with to replace “Asteroids 3D” for this post – a pun on “asteroid” and “gold rush.” The story in the back of my mind is that humanity is building its first interstellar colony, and all precious minerals are required.

Development proper started by grabbing my free flight controller and a picture of the sky from NASA for my background. I got the composite of the starry sky to feel like it could be drawn by an arcade game by turning down the saturation and maxing out the contrast in the environment node.

And then most of my time was spent on dead ends trying to get the look just right. This effort split into two parts: first, shaping the asteroids; second, outlining models with an all-black albedo. To shape the asteroids, I first made a vertex shader to nudge around the points of an icosphere exported from Blender. The naive approach of displacing each vertex independently gave me a “nacho explosion” where each triangle goes off on its own. The trick was using the values of a 3D noise texture, sampled by vertex position, as the seed so that coincident vertices move together. But I found I could only make one unique asteroid using this method.
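As a hedged sketch of the idea (uniform names are illustrative, not my actual shader), the Godot/Redot shading language version looks roughly like this:

```
// Sketch: displace along the normal using 3D noise sampled at the
// vertex position, so shared positions displace identically instead
// of each triangle tearing away into a "nacho explosion".
shader_type spatial;

uniform sampler3D noise_tex;
uniform float strength = 0.3;

void vertex() {
	// Map the vertex position into the 0..1 texture space, sample noise
	float n = texture(noise_tex, VERTEX * 0.5 + 0.5).r;
	// Center the noise around zero and push along the normal
	VERTEX += NORMAL * (n - 0.5) * strength;
}
```

Because the sample depends only on position, any two triangles sharing a corner get the same offset there, which is exactly what keeps the mesh watertight.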

My second approach was to build the same logic in code. It took a bit longer, but I again found that the last mesh defined was being used for everything. This is particularly problematic, as I need meshes to scale for larger and smaller asteroids. I ran my code past the Redot community and learned the engine was working as intended; I had made erroneous assumptions from the documentation. At this point, I’m likely to just “bake” a series of meshes that can be assigned randomly as needed.

Extrapolating the visual style of Asteroids to 3D was probably where I wasted the most time. After trying and failing to write an outline shader myself, trying to work with the stencil buffer (it turns out it’s listed as experimental and subject to change), and searching for a pre-made outline, I circled around to one that worked sort of OK but didn’t particularly like low-polygon models like the ones I was working with.

I wanted to understand the structure of the shader I was working with, so I spent a day rebuilding it as a visual shader – translating each line of shader code into a form resembling a flow chart. For some reason, I was never able to get it to behave the same as the original, even when I rigged the relevant parts into an “expression” node made for handling code. Yeah… those appear a little buggy at the moment.

Around this time, I was playing with a widget in the Redot interface when I noticed a pattern in the outlining misbehavior. All I had to do was vertex-deform the outline mesh as well.

Working on the free flight controller, I found that speed caps and drag for both rotation and lateral movement helped with the feel of a responsive spaceship. Without them, turns are a fight against velocity. Think: Kerbal Space Program. I’m also working on making the script friendlier toward reuse in other games, which means extending it with a second file.
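A minimal GDScript sketch of the cap-and-drag idea (the numbers and names are made up for illustration, not the actual controller):

```
# Sketch: clamp speed and apply per-frame drag so turns don't become
# a fight against stale velocity.
extends CharacterBody3D

@export var max_speed := 40.0
@export var drag := 1.5  # fraction of velocity shed per second

func _physics_process(delta: float) -> void:
	# ...thrust input would be added to velocity above this point...
	velocity = velocity.limit_length(max_speed)        # speed cap
	velocity *= clampf(1.0 - drag * delta, 0.0, 1.0)   # drag
	move_and_slide()
```

The same pattern applies to angular velocity for rotation; drag is what lets the ship settle onto a new heading instead of coasting forever.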

Takeaway

It will take time, and progress will be incremental as I learn the basics. I still don’t have a playable prototype. Is the game fun? I don’t know. I sure hope so. But as the code base grows, I’m going to want a local Git server in my homelab.

I am thankful for the Redot community being so chill, because the Godot Discord has reportedly gone on a witch hunt – banning people over common opinions the moderation team considers hateful, be they expressed openly, in a profile you have to hunt for, or in a leaked DM. No mercy, no notification – just a ban, according to some Redot users I met. Assuming it’s all true, I have to say it’s sad. On the one hand, I believe they’re hurting themselves. On the other, freedom of speech is not freedom of platform. They have every legal right to build an echo chamber, but I find it morally dubious at best to eject people for wrongthink when they are otherwise behaving.

Final Question

What would you call this game?

The Importance of Team Communication

Good Morning from my Robotics Lab! This is Shadow_8472 and this month, I participated in the 2026 Redot Winter Game Jam. I have a story out of it, but it didn’t go the best. Let’s get started!

Disclaimer

I do not wish to defame anyone – only warn about the importance of communication. I am therefore substituting the names of my teammates with Stardew Valley characters based loosely on their role in the project while ignoring other characteristics, such as age or gender.

People have bad days, and I am only one witness of a narrow window of time. So please: unless you are a moderator, leave idle curiosity here. Fellow participants, do not allow anything you read here to affect how you rate the entries.

Out the Door Studios

One of my longer-term goals has been to code a big idea I’ve had as a video game, but I need to build skills to that end first. I entered the Redot Winter Game Jam last year to do just that. Redot is a fork of Godot – itself a free and open source game engine. Godot made a comment I found unwelcoming on their socials, and Redot prioritizes making games over witch hunting wrongthinkers. In that context, I built what became Out the Door Studios. Going into the Redot Winter 2026 jam, I had two jams’ worth of experience plus a little from personal projects.

A Group Founded in Poor Communication

The theme for this jam is officially “Slippery when…”. For me, it was miscommunication – not all of it mine, not all of it bad. The first miscommunication involved meeting Sebastian, a programmer with a little 3D experience. He mistook our 2025 entry for 2026, but I invited him to Out the Door nonetheless. Nobody I’d previously jammed with was both available and interested. Within a few minutes of hearing my pre-jam pitch, he had delivered a visual matching my inspiration perfectly.

And then I spotted an ad by Lewis. He wanted a programmer and a 3D artist to round out his jam team for a more ambitious 3D project. I applied, offering Sebastian’s and my talents. It went back and forth between Sebastian, myself, and Lewis for quite a while, with me making Lewis out to be “middle-to-top of the little leagues.” Somewhere in there, I listed my years of experience with Linux intending that as an ability to solve problems.

Unbeknownst to me at the time, Lewis was more “bottom of the big leagues” and took my experience maintaining a homelab to mean I had been programming the entire time. When I arrived in Lewis’ Discord, it had about 10x the members I expected, and it was set up more like an office than an informal development environment. Impostor syndrome set in, but I pushed it aside. Their rules weren’t copy-pasted – usually a good sign – but the casual coarse language made me uncomfortable from the start.

Already on the team were people to whom I will refer as Leah, a concept artist I didn’t interact with much, and Maru, another programmer. Yellow flag out the gate: time zones – the five of us were spread roughly equally around the globe. Most pointedly: our programmers shared little in the way of sleep schedule, severely hampering collaboration.

The Jam Begins

Now, I never signed any NDAs (Non-Disclosure Agreements), but as of publication, the jam is ongoing, so I will avoid mentioning specifics unless they’re vague or important. We got a relatively good jam theme, “Slippery when…”, lending itself as easily to ice and slipping as to stealth, evasion, and pursuit. Four of us hopped into voice chat (Maru was asleep).

In that first meeting, I immediately got cussed at to turn my mic down (apparently, my mic settings were out of whack from last month’s fiddling). I proposed using Codeberg – having to explain the difference between GitHub (a code-hosting website) and Git (the version control software at GitHub’s heart) – but Lewis all but insisted on GitHub: “let’s just stick with what’s familiar.” ‘But GitHub is not familiar to me,’ I thought. His house, his rules.

Alone, that wouldn’t mean much, but every design decision followed suit. At one point, we were split 3:2 over what to call a gameplay element. Despite Lewis landing in the minority, it shifted from “we’ll just agree to know what we mean when either term comes up” to an unspoken, ‘just use Lewis’ term in-code.’ Yellow flag.

My main contribution to the game was locating a system important to the main gameplay gimmick. I found something on the in-engine asset store and had it halfway working, but after a day it proved buggy, and it inherently didn’t support browsers. I spent another day studying how similar effects were accomplished in familiar titles. One lonely video demonstrated everything we needed running in Godot (which Redot is compatible with), but without code or contact info.

The Tutorial

Then someone – Sebastian, I believe – located the tutorial: How to do <our gimmick> in Godot 4. Code was published, but it lacked a visible license. I spent a day tracking one down. Let’s call the tutorial author Mr. G4, for Godot 4. His project built off someone else’s Godot 3 code, and Mr. G3 was in turn following up on a fourth party’s Godot 2 demo. Between them, I only found an email address for Mr. G3. Mr. G2’s digital footprints ended in 2021, one post after retweeting an incitement to violence against police in 2020, which I reported. Mr. G3 got back to me with an MIT License, and I was able to share my Container Analogy post.

Mr. G4 was tricky. I asked Lewis, citing enough context between the video and comments section to treat it as MIT, and he said his personal philosophy is that if there’s no license, just do whatever; it’s just a game jam. Yellow flag: one might get away with treating unlicensed code as CC0 (public domain) even without an explicit dedication, but that’s not how copyright usually works.

I showed the tutorial to Muggyfox1, a colleague from my second jam, who figured it was fine: the code had been online for years, I had made an effort to reach out, and the author clearly intended it to be copy-pasted.

In either Mr. G4’s implementation or the asset library’s, the mechanic relies on a shader (GPU code usually used for graphics). Maru was wary of using shaders, but I figured it was within reach, given ready code and my success reading data back out of the graphics card.

Meeting

Sunday came around, and the jam team was invited to a meeting. Sebastian had a previous engagement, and I woke up early. Did I say the jam team? No, it was server-wide! I caught the tail end of a foul-mouthed, pre-meeting goof-off session. They deliberated over content creator sponsorship programs to apply for as a server while making crude anatomical jokes about the pair they eventually settled on.

I abstained from deliberation and voting, as I was planning to peace out after the jam – at least until another jam. Besides, none of the potential sponsors interested me anyway. When pressed for comment on something, I noted for the record that I considered myself “on probation” for the time being.

Flashpoint and Aftermath

A week into the two-week jam, Lewis was feeling the time crunch. He was using all-caps in chat, demanding that someone start on the 3D environment. We programmers were presumably all waiting on each other. Lacking a good 3D environment, I had been asking for help all day to add a slightly less important feature to the shader. Eventually, I entered a voice call with Lewis.

Lewis tried the demo I had on GitHub, but it was from a weird development point where something wasn’t working right. I offered to share directly from my screen, but Lewis wasn’t as impressed as I would have hoped. He was quite miffed to hear I was working on a shader and simultaneously told me to “SCRAP IT!!” and, “See it through!” If you’ve ever read about the development of Sonic X-treme, that’s what it felt like.

“Tell me ONE good reason I should not kick you out right now!” he said.

I had already committed to forgiving him after he apologized; otherwise, it might have turned into a shouting match. I prayed to Jesus for the situation as I actively ignored any rash words of passion that lent themselves to being hurled back. “I’m the only one who knows how the <REDACTED> system works,” I said as calmly as I could.

I tried not to flake afterwards, but the damage had already been done when Lewis yelled at me. It came into focus the next day, after two unproductive hours without positive direction and a reality check with my sister and mother. I was easily spending 10+ hours a day focused on this project, trying to keep things moving when Lewis was offline and people (mostly Sebastian) were at a loss as to what was currently needed. When I DM’ed Maru, she shared her doubts from the start and said, “learning while trying to deliver a complete game in 1 week [sic] is nuts.” (The jam is two weeks.) I had tried to be clear to Lewis: “I am here to learn and have fun with the goal of getting something out the door that is a bit more complex than my previous work.” Neither of those conditions – learning, or fun – held true anymore.

Sebastian had been more optimistic the night before. I had taught him the system I’d assembled and documented it with plenty of screenshots. He commented on a couple of missing shader features, and let no one truthfully say I didn’t give my all to add them before I burned out.

But when I did burn out, I tried to leave with as much class as possible while still saying things I felt needed to be said. I put my code in order, leaving a new comment or two, then organized the shader with frames around nodes meant to function together. When I told Sebastian about my plans to leave and about Maru’s pessimism, he replied, “I might be the only [guy] who can stitch everything together.” Maru expressed disappointment upon hearing we wouldn’t get to work together on something more serious. In the end, I wrote a 500-word letter of resignation telling this story and dropped it in his server, CC’ing everyone in the jam’s section, including what looked like a server admin.

As a last order of business, I licensed my contributions under the GPL to Lewis and his team, then ensured their admittance into the main branch (I don’t know that Lewis knows how to use ‘git checkout’ to cleanly remove my contributions). Since I’m not trying to sabotage the jam, I wrote in an exemption until the submission deadline. After that, the GPL applies, meaning if Lewis wants to continue development, he will have to either publish his entire project’s source under the GPL or contact me through a third party for negotiations – and only in the presence of a chaperone approved by me, with the authority to end the conversation at any time, for any reason.

Takeaway

I am proud of the work I did complete, and I really hope the team can publish a fun game on time. But at no point did I sign any NDAs (Non-Disclosure Agreements), nor did I sign away rights to the code I wrote or modified. I hope Lewis grows as an individual from losing me like this. When you work with volunteers, you’re working with volunteers.

I had fun with the mechanic I did get working, and I think I want to do something with it in the future.

Final Question

Have you ever had to walk out on a bad team situation?

New Year’s Laptop Work

Good Morning from my Robotics Lab! This is Shadow_8472, and today I have a few related projects I jumped between over the month. Let’s get started!

Kernel Regression

In software development, it is generally impossible to continuously add features without ever introducing bugs. The heart of an operating system is no different. I never figured out exactly what happened, but my laptop started taking ridiculously long to boot after a Linux kernel update. Think 10 minutes – not counting time spent waiting on me to enter decryption keys and login credentials – assuming it didn’t crash outright. However, by mashing F8 and F12 (only one of them is needed, and I’m too lazy to figure out which) during boot, I could access a menu to boot an old kernel without undue delay.

I honestly cannot recommend System76 enough. I’ve spent a while with different tech support agents, and my experience with System76 is consistently the best. I am very thankful they have native English-speaking agents who know their stuff. No one is paying me to say this.

After a few commands, I was told my hard drive was on the full side. That could explain some stuttering I’d been experiencing. So I deleted a large game I don’t play anymore and installed a 2TB drive in the second of my three M.2 slots. Opening the case to install it was as simple as unscrewing the bottom and pulling a little, following their guide. I installed Pop!_OS 24.04, and it booted in a reasonable time. I want to move over to it eventually, but I haven’t yet figured out enough LVM management to mount my encrypted partition, and unfortunately that is outside the scope of System76 support.
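For my own future reference, the usual sequence for mounting a LUKS-encrypted LVM volume from another install looks roughly like this command sketch. Every device, volume group, and logical volume name below is hypothetical – check yours with lsblk first:

```
# Command sketch – device/VG/LV names are hypothetical
sudo cryptsetup luksOpen /dev/nvme0n1p3 cryptdata   # unlock the LUKS container
sudo vgscan                                         # find volume groups inside it
sudo vgchange -ay                                   # activate them
sudo lvs                                            # list the logical volumes
sudo mount /dev/mapper/data-root /mnt/oldroot       # mount the one you want
```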

Our conclusion was that either I had a weird configuration somewhere or the kernel had a regression. In either case, there’s been another new update, and my laptop boots normally again.

Headphones Jack Destroyed!!

Towards the end of the month, I had my laptop set up while out and about, and I got startled badly. Like, really badly. Badly enough to rip the 3.5mm audio jack right off the circuit board, ruining the audio cable while I was at it. First moral of the story: never use a straight plug in a sideways-facing headphone jack. A right-angle connector makes for a much weaker lever at the connection point, and might have saved me an approximately $80 mistake after parts, shipping, and tax. Again, thank you, System76, for putting such exposed circuitry on an easily replaceable separate board (although the charge for shipping was painful).

As of publication, the board hasn’t arrived. In the meantime, I still need sound at my workstation. My first thought was to pass audio through my Blue Yeti. By default, this microphone has a monitor function so you can hear yourself; I had already disabled it, and it took just a little educated guesswork to route audio correctly. I don’t like keeping my good microphone set up all the time, though, when the same USB-C multi-adapter has a 3.5mm audio out.

However, educated guesswork wasn’t enough for round two. Seemingly no matter what I did while trying to play audio through the multi-adapter, sound would either not play at all or find its way over HDMI to my main monitor. On the odd occasion I got as much as a chirp out of my speakers, my video driver would blink to black, stabilize, then blink out a second time – before either returning to my monitor or giving another split second of audio and starting the loop all over again.

I got in touch with System76 again. The friendly agent had me run a couple of commands to show what devices were connected, and I did what I could to provide diagnostic information. Often this would lead to an idea, and I would delay my reply e-mail until I could explore it. The biggest tip from this session was to poke around with pavucontrol, a fancier audio mixer than the one built into KDE.
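pavucontrol is a GUI, but much of the same information is available from the terminal via pactl. The sink name below is hypothetical – every machine reports its own:

```
# Command sketch – sink names vary per machine
pactl list short sinks                  # list playback devices ("sinks")
pactl set-default-sink alsa_output.usb-multiadapter.analog-stereo
pactl set-sink-volume @DEFAULT_SINK@ 60%
```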

I found a mysterious LCD in my display configuration that appeared whenever 3.5mm audio was connected. I had a memorable independent exploration session in which my login screen was absorbed by this phantom screen. I looked it up, found the audio was being routed over HDMI, and concluded that duplicating my main screen to the LCD was my best move.

One night, I found myself playing a song on loop for an hour, so I decided to time the malfunction. As luck would have it, when I set up the glitch loop and timed the split-second “chirp,” the cycle ran around 9.5 seconds before the LCD disconnected and reconnected – completely busting whatever working theory I had at the time.

At some point, I muted my video’s playback in Firefox and turned the volume up slowly. No glitch. I didn’t put it together until after I’d reproduced the glitch in another browser, on another website, and even in a local game. The common factor was volume: my cheap not-quite-a-laptop-dock has some weak components, leading to signal blowout that resets its phantom HDMI monitor when the volume gets too loud. The solution was to turn the volume down and the amplification up. A little prompting by a second System76 agent led me to explore the fail state where the audio chirp knocked out HDMI to my real monitor, and I understand the Linux audio system better for it.

Takeaway

This laptop has been a very good learning opportunity for me. Running desktop Linux is almost the same everywhere, but switching between a two-monitor workstation with a speaker system and life on the go adds a new layer of complexity. I still have a while to go before I master PulseAudio. The day before posting, I had trouble accessing my internal microphone after disabling it somewhere deep in the OS. But overall, I’m learning a bit more with each fixed problem, even if the fix is just waiting a bit while upstream patches it.

Final Question

What crazy laptop repair stories do you have?

Cold Start VTubing in Linux for 2026

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am putting together a VTubing stack ready for 2026. Let’s get started.

Challenge Outline

I forget exactly when I saw the recording, but a livestream from late 2017 demonstrating a pair of animated characters standing around, talking with chat, blew my mind. Unbeknownst to me at the time, a very similar technology had been implemented by Disney Imagineers back in 2001 for a Lilo and Stitch themed video call attraction starring the fuzzy, blue alien. Fast forward through the 2020 pandemic to late 2025, and the infrastructure has developed to bring digital puppetry within reach of anyone with a good computer and a webcam.

I only share my face online with people I’ve met in person, so when a long-time friend announced a new game with webcams encouraged, I revisited my research on VTubing on Linux.

The Software Stack

I used the video “Full Body Tracking for Linux Vtubers? Yes!” by Corbent’s Workshop [1] as my starting point. It’s aimed at VTubers switching to Linux as opposed to Linux enthusiasts looking to pick up VTubing, but I found it gave me a solid foundation to work from. In summary, he recommended:

  • XRAnimator for pose capture
  • Either VSeeFace or Warudo to animate your avatar
  • OBS (Open Broadcast Software) for streaming to the Internet

XRAnimator

XRAnimator is an open source, Linux-native application I selected to provide face, body, and hand tracking. Only one webcam is needed for a basic setup. Playing around with puppeting the default avatar, I found a couple of problems. First, transparency doesn’t work right (I suspect an X11 bug), which I can work around by setting a black background. Second, it doesn’t particularly like my glasses, for which I can just wear contact lenses.

Warudo

I chose Warudo for the heart of my VTuber stack. I’d rather use something open source, but I’m on a deadline. Corbent’s Workshop noted how VSeeFace can only track the mouth by audio or motion capture, but not both at once [1]. Warudo can do both. Neither has a Linux-native version, but it looks like Warudo, at least, is working on one (I think). The video recommended the latest Proton 8.x, but I found GE-Proton10-26 (GE as in “Glorious Eggroll”) works just as well. Furthermore, a Warudo update made some launch options necessary (set at Warudo’s Steam page/Options [gear]/Properties…/General/LAUNCH OPTIONS):

PROTON_DISABLE_NVAPI=1 PROTON_USE_WOW64=1 %command%

Check with the community Discord if this doesn’t get you in – that is where Warudo’s support system lives.

OBS

This is one I had installed already. OBS stands for Open Broadcaster Software. Its main use is for streaming, but with one extra package I failed to document, I convinced it to host a virtual webcam that Discord had no problems picking up.
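I didn’t write down the package, but on Linux, OBS’s virtual camera relies on the v4l2loopback kernel module, so that is almost certainly what I installed. A hedged sketch – the package name below assumes a Debian/Ubuntu base, and other distros will differ:

```shell
# v4l2loopback provides the virtual /dev/video* device OBS writes to.
# Package name assumes a Debian/Ubuntu-based system.
sudo apt install v4l2loopback-dkms
sudo modprobe v4l2loopback   # load the module for the current session
```

After that, the “Start Virtual Camera” button in OBS should produce a device that Discord and other apps can select as a webcam.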

Character Model

This is all meaningless without a character .vrm file to animate. A fairly priced, quality avatar can run thousands of dollars. I have only basic 3D skills – even a simple robot avatar is out of scope for this project. Instead, I turned to VRoid Studio, the go-to (closed source) avatar creator tool, which I also ran on GE-Proton10-26. I took the time to read through the EULA, and the biggest restriction I picked up on was against redistributing their models and rigging to make a competing program, which I can understand.

I made a self-avatar and crudely textured my specific watch with the included brushes. The result didn’t track my eyes or silent mouth movements. As it turns out, VRoid Studio and Warudo use different names for mesh deformation “blendshapes.” After installing the “VRM format” plugin in Blender, I brought my model into compliance with the VRM 0.x convention Warudo expects by renaming my ‘eye closed’ blendshapes to ‘eye blink’ and removing the prefixes from A, I, U, E, and O (Japanese vowel order). However, this broke my audio lip sync, and I haven’t figured out where to fix it yet. It was enough for a full integration test where I called a friend I originally met in person, and he said it made me look… like a child. Well, I tried. I’ll need a more stylized concept for showtime.

Further Research

I may have reached baseline functionality, but there is so much more to learn about, even at a $0 hobby level.

  • XRAnimator
    • Loading a model into XRAnimator is rumored to improve quality.
    • Add a second webcam to improve tracking.
    • Move motion capture to another device, like my Kinect2 (if that works on Linux).
  • 3D Model
    • Finish full lip sync capabilities.
    • Make an ARKit compliant model for improved facial expressions.
      • NEB (Not Enough Blendshapes) (Unity plugin)
    • Learn to use Warudo blueprints and expressions.
  • Accept Unity as the gold standard for editing VRoid models.
  • WINE
    • Switch to Bottles if Steam’s grip on Warudo allows it (I’m on Lutris).
    • Put the Windows parts of my stack in the same WINE bottle.
  • Switch to a more open source ecosystem for avatars.

Takeaway

My journey to a VTubing Hello World was surprisingly easy, obstacles notwithstanding. The hardest part I finished was probably writing this up. I have a path forward with this project.

Final Question

Are you here starting your own journey? Be sure to let me know in the comments below or on my socials.

Work Cited

[1] Corbent, “Full Body Tracking For Linux Vtubers? Yes!” youtube.com, Corbent’s Workshop, Oct. 27, 2024. [Online]. Available: https://www.youtube.com/watch?v=O7EkbzLpl-8. [Accessed: Dec. 23, 2025].

A Pair of Programming Projects

Good Morning from my Robotics Lab! This is Shadow_8472 and today I am covering my programming work this past month. Let’s get started!

I am finding that even on the monthly cycle, I don’t always finish the projects I intend to. This month, I was intending to work on my phone, but that project hit some tough truths and the story isn’t ready yet.

HyperTrace

I might as well introduce my newest long-term project, HyperTrace. It started out in October as a spectator camera for a larger Redot community project. I studied up on using quaternions to rotate a camera node and avoid gimbal lock, but just as I was starting to get them under control – and well after I was bored stiff of them – I realized I could just attach a camera to a physics object.

But once I had a camera with six degrees of freedom (three positional and three rotational), I started thinking about how I could make a game out of it. I figured a space piloting game was in order, so I studied up on Perlin noise and similar methods of generating structured noise to produce a shifting cave structure representing a hyperspace mesh for the player to navigate – preferably at high speed.

The idea is that an unnamed pilot is tracing out a route through a shifting hyperspace (preferably at high speed), and if he collides with a wall, he is thrown out of hyperspace, where it isn’t safe to re-enter, so he must return to port under conventional physics. If he succeeds, his fleet can instantly jump across his route. This FTL system sets the stage for a future where humanity has colonized the solar system, but is hesitant to venture out to other stars. Think Star Wars meets Space Engineers meets Monkey Ball.

So far, my best tech demo is a blue fog you can navigate at slow, careful speeds. It takes an unacceptably long time to generate.

Shell Scripting

My services were called upon for a “computer project.” My mother had some PDF files of recipes to separate. Over the course of an afternoon, I wrote a BASH script to separate out each page in each PDF. At its heart was the program qpdf. Along the way, I tried to keep good practices like leaving comments and having checks to prevent accidental overwriting.
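I didn’t preserve the original script, but a minimal reconstruction of its core looks like the sketch below – the directory names are examples, and qpdf’s `--split-pages` mode does the actual page separation, substituting each page number for `%d` in the output filename:

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction of the recipe-splitting script.
set -euo pipefail
shopt -s nullglob   # skip the loop entirely if no PDFs are present

outdir="split"
mkdir -p "$outdir"

for pdf in ./*.pdf; do
    base="$(basename "$pdf" .pdf)"
    # Guard against accidental overwriting, as in the original script.
    if compgen -G "$outdir/$base-*.pdf" > /dev/null; then
        echo "Skipping $pdf: output already exists" >&2
        continue
    fi
    # qpdf writes one file per page, replacing %d with the page number.
    qpdf --split-pages "$pdf" "$outdir/$base-%d.pdf"
done
```

Run from the folder containing the PDFs, this leaves each page as its own file under `split/`.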

Now, I only know a modest bit of BASH (the popular Linux command line shell), but I know enough about basic programming that it was just a matter of syntax for me. I figured I could do one better in around the same amount of time, so I saved a copy of the script and its output, then set about naming files after their contents, identifying and recombining second pages, sorting out charts, and numbering repeats (some duplicate recipes had changes).

But for every hour I spent working out a feature, I figure I spent another debugging edge cases. Several recipes had characters in their titles that are special to BASH or to filenames, like *, &, and /. I would have preferred careful string handling, but in the end I sanitized / as ÷ and removed * entirely. This additional work extended the project to a total of three days.
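That sanitization step can be sketched in pure BASH parameter expansion – the function name is mine, and the exact substitutions are my reconstruction of what the script did:

```shell
#!/usr/bin/env bash
# Sketch of the title sanitizer: / becomes ÷ (slashes are illegal in
# Linux filenames), * is dropped entirely; & survives inside quotes.
sanitize_title() {
    local title="$1"
    title="${title//\//÷}"   # replace every / with ÷
    title="${title//\*/}"    # strip every * outright
    printf '%s\n' "$title"
}

sanitize_title 'Mac & Cheese*/Variations'   # -> Mac & Cheese÷Variations
```

Keeping the result quoted (`"$title"`) everywhere downstream handles the remaining characters like & without further rewriting.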

Closing thoughts

My goal with HyperTrace is to publish it to Itch.io or a similar platform on a donation basis. And before I get past more than a couple of tech demos, I’m going to want a version control server in my homelab, just in case something happens to my laptop. While I really want to think ahead in terms of security, I know I don’t know enough to make an informed decision about the utility I need now.

Final Question

What Git setup do you recommend for a homelab?

How to Run Electric Quilt 8 on Linux

Status: INCOMPLETE

Good Morning from my Robotics Blog! This is Shadow_8472 and today I am installing Electric Quilt 8 on Linux for my mother. Let’s get started!

First off, I think I’m going to try something new. I always seem to juggle 3-5 longer projects, splitting off extra parts when I don’t complete something, and a few updates were a bit underwhelming. Going forward, I want to try posting a series of updates into old posts as I finish projects.

Electric Quilt 8

EQ8 (Electric Quilt 8) is a quilting program my mother uses. It is also my mother’s last Windows holdout. Seeing as Windows 11 is all sorts of bad news and her laptop was forcefully migrated, I took on the project of coaxing EQ8 into running off-platform. The gold standard is of course native performance, but a small Windows 10 VM (Virtual Machine) is on the table as a last resort.

Attempted Solutions

WINE:

The first stop when attempting to run Windows programs on Linux is WINE (a recursive acronym: “WINE Is Not an Emulator”), a compatibility layer that translates Windows system calls into Linux equivalents. Long story short, I’ve thrown several versions at it, but I’m not a WINE wizard. There is potential here, but I cannot unlock it alone.
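For anyone retracing my steps, the usual starting point is a dedicated 64-bit prefix per application so experiments don’t contaminate each other – the prefix path and installer filename below are hypothetical:

```shell
# Hypothetical prefix and installer name; one clean prefix per app.
export WINEPREFIX="$HOME/.wine-eq8"
winecfg                          # first run creates the prefix; set the Windows version here
wine ~/Downloads/EQ8Setup.exe    # installer filename is a guess
```

Frontends like Lutris and Bottles automate this same prefix management, which is why they come up later in this saga.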

ReactOS:

ReactOS is an open source operating system based on Windows’ architecture. I had set up a VM for another project, but once I got EQ8 over and tried to install it, the installer said it wanted Windows 7. For context, ReactOS is only starting to think about re-implementing software libraries introduced in Vista. This is a dead end for now.

Crossover:

I contacted EQ Support with a don’t-ask, don’t-get attitude. As expected, they only support Windows installations, but the representative pointed me toward Crossover, saying, “Other customers have had success using their product with our software.” I am currently on the two-week free trial, and while I haven’t gotten it working yet, this is also a promising avenue to explore.

Fedora:

My mother contacted one lady who’s gotten EQ8 working on Fedora Linux. I made a VM for Fedora 42 and tried both Lutris and Crossover. No improvement. As of Oct. 26, the last I’ve heard from her is “Fedora Version 42, Wine Version 10.12, Windows 7 (without Internet-Connection), no launcher.”

Tech Pile:

It’s always fun to raid the tech pile in the hopes of getting something to run. We turned up two Vista laptops and a netbook with Windows 7 Starter. Win7 Starter is 32-bit; EQ8 is 64-bit. It wasn’t entirely fruitless, as one of my computers has a Windows 10 drive I never nuked.

Community:

Long story short: I haven’t been able to break into many conversations on my own about EQ8. Reddit has a couple references to running it on Linux, but they run dry pretty quickly.

This is an open project. Check back later for updates.

Final Question

Do you use EQ8 on Linux? Maybe you’re a WINE guru and found this Help Wanted post. If either of these is so, please reach out to me either in the comments below or on Discord so I can update this post with a solution. You can find an invite to my server on my Socials page.

Building an Improvised Recording Booth

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am building a sound booth in my closet. Let’s get started!

Designing a Recording Booth

At one point in my year off, I researched the basics of voice acting. I already have a Blue Yeti and Tenacity (fork of Audacity, a free/open source sound recording program), but the recording environment is just as important. Tenacity and other programs that handle audio often come with powerful tools to clean up background noise, and better techniques are being innovated all the time.

But no matter how good post-production gets, it’s still best to minimize unwanted noise relative to signal (voice or instruments) in the first place; the level of that background noise is called the noise floor. And depending on the size and shape of your room, sound reflects back and echoes. Vibrations from the building can also travel up a simple mic stand and degrade quality that way. Little tweaks like these are endless, but they are the scope of my project today.

Common DIY sound booth solutions range from recording in a closet to draping moving blankets off a collapsible PVC pipe frame to building a mattress fort. My budget is “stuff from around the home,” so no PVC or mattresses for me. The theory behind closets is that the clothes absorb the sound bouncing around. But I read that if you think you have enough clothes for that, you don’t have enough clothes. Long story short: I hung the heaviest, unused blankets I found around the house in my closet for a compromise between the first two solutions. I also used a dresser drawer to reduce mic vibrations.

Project

What good is a sound booth without a project to test it on? As of writing, I’ve only done test recordings of “Mary Had a Little Lamb,” comparing the noise floor out in my room, in a camper when I visited family in the mountains this summer, and within the sound booth. I think the biggest factor I need to worry about is my rather loud laptop fans.

For the longer term, I auditioned for an unpaid role in a fan re-animation project: The Tigger Movie: ReBounced [Link to Casting Call Club page]. I looked it up as I actually have a pretty good Eeyore impression, but that role was taken already. After a little self-debate, I tried out for Piglet since digital manipulation was on the table, and Piglet’s role is as small as he is. My family picked up a copy of the original movie to watch, and I’m finding Piglet’s stutter a little difficult to master, but it is nothing compared to Tigger’s lisp. The project is in need of animators, so if you happen to have some time, feel free to check it out.

Final Question

What entry-level equipment would you recommend and in what order?

I’m Back

Good Morning from my Robotics Lab! This is Shadow_8472 with a quick update. Let’s get started!

Keeping things short, I’m back from my year-long break. I didn’t do as much in terms of stuff to cover, but I did get a new laptop built for Linux and I participated in Redot’s New Year’s and Spring game jams. Check them out on Itch.io. I’ll be writing about them and other projects in the coming months.

Speaking of, I am moving to monthly posts – the last Monday of each month. Weekly posts kept me moving when projects were simple, but I eventually found myself writing fluff. There are only so many times I want to explain NFS auto-mounts, yet I’d do it just to publish something. A monthly format will allow time for tougher and/or more haunting challenges, like my Nextcloud saga, which I finally have an answer to. Concluding such epic efforts might be worthy of a video, a format I am interested in trying out.

Final Question

One of my long-term goals is to build a following, and I understand a good way to do that is networking with similar content creators. Do you know anyone I might have a chance at getting to know?

My Pi 400 Travel Desktop

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am trying out my Raspberry Pi 400 as a lightweight daily driver. Let’s get started.

rPi 400

The Pi 400 is a special edition of the Raspberry Pi – essentially a Pi 4 built into a keyboard. It’s missing a USB 2.0 port (made up for by the keyboard) and the 3.5mm audio jack. In theory, the miniature keyboard computer is perfect for travel – assuming, that is, you have a screen at your destination.

Choosing a Distro

My first thought was to try installing Arch. A brief search turned up the Arch Linux ARM project, as vanilla Arch only supports the x86_64 architecture. The instructions involved formatting a drive from the terminal. I bought a 256 GB SD card and did so on the first convenient rPi install I had – Kali Linux.

It took a few failed attempts before I gave up on Arch. The pure open source method that reportedly works on the regular Pi 4 didn’t on my Pi 400, and the standard method wasn’t cooperative when installing a login manager and any of a few desktop environments. Running on a time crunch before leaving on a trip, I switched to DietPi, another lightweight distro I’ve worked with before, but for a much different project. As with Arch, MATE was uncooperative, so I settled on XFCE. Special thanks to Balena Etcher for its warning when formatting large drives.

Packing For a Trip

To shortcut setup, I copied my browser and LibreOffice data from my main desktop. LibreOffice worked for me, but didn’t carry over the full dark mode – a problem I’ve encountered before and ultimately decided to live with.
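That copy boils down to a couple of rsync calls – the hostname and username are hypothetical, and the paths assume default profile locations:

```shell
# Pull browser and LibreOffice profiles from the desktop to the Pi.
# Host, user, and default profile paths are assumptions.
rsync -a shadow@desktop:~/.mozilla/ ~/.mozilla/
rsync -a shadow@desktop:~/.config/libreoffice/ ~/.config/libreoffice/
```

The `-a` flag preserves permissions and timestamps, and the trailing slashes copy directory contents rather than nesting the directory inside itself.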

Firefox ESR – as installed from DietPi’s walled garden – refused to accept my profile. Regular Firefox – installed from the apt repositories – was up to date and started as expected. Notably, it included my extensions – especially Bitwarden, my password vault.

A screen was not procured at my destination, so I packed one from home. The missing audio jack was also problematic, so I packed my Blue Yeti with plans to disable voice monitoring. For redundancy, I also packed an HDMI monitor, but busted the styrofoam while stuffing it into the slightly wrong box. As of writing, I have done nothing with sound.

Deployment

We left on our trip. Upon arriving, I found my over-packed tech bag lacked a long enough HDMI cable. I borrowed a longer one. The monitor signal kept flickering. For a while, I assumed I was overloading the Pi with a couple hundred browser tabs, but after a power blink and several reboots, it came out that the HDMI cable was bad. We bought a replacement, and it’s been working properly since.

Ejected Challenges

Arch wasn’t the only thing I had to back off from. I brought a couple of additional Pi 4s to have myself a nice, little network with Pi-Hole ad blocking, but Wi-Fi strength and configuration challenges meant those were both a no-go.

Another challenge I want to pull off is playing Stardew Valley. I copied the files over back home, but haven’t had time to try the conversion I found yet.

Takeaway

Finishing a project as a rule is better than stalling an overly ambitious one. I have an on-the-go workstation, even though it still lacks polish.

Of special interest, this week marks the 6th anniversary of my Robotics Lab. Some months ago, I decided I wanted to do a Sabbath year cycle. I’ve proven that I can be consistent at posting, even if I feel my quality slips some weeks. The fact of the matter is that weekly posts are getting a bit repetitive for me, so for the next year, I’m only going to post when I finish a major project, such as when I have the full software suite I have in mind for my homelab. After that, perhaps I’ll stick to monthly. We’ll see in a year’s time.

Final Question

Setting up a travel computer on the quick was a bit of a trick. What must-haves would you include in a similar package?

I look forward to hearing your answers in the comments below or on my Socials.