The Importance of Team Communication

Good Morning from my Robotics Lab! This is Shadow_8472 and this month, I participated in the 2026 Redot Winter Game Jam. I have a story out of it, but it didn’t go the best. Let’s get started!

Disclaimer

I do not wish to defame anyone – only to warn about the importance of communication. I am therefore substituting the names of my teammates with Stardew Valley characters based loosely on their roles in the project while ignoring other characteristics, such as age or gender.

People have bad days, and I am only one witness to a narrow window of time. So please: unless you are a moderator, leave idle curiosity here. Fellow participants, do not allow anything you read here to affect how you rate the entries.

Out the Door Studios

One of my longer-term goals has been to code a big idea I’ve had as a video game, but I need to build skills to that end first. I entered the Redot Winter Game Jam last year to do just that. Redot is a fork of Godot – itself a free and open source game engine. Godot made a comment on their socials that I found unwelcoming, and Redot prioritizes making games over witch hunting wrongthinkers. In that context, I built what became Out the Door Studios. Going into the Redot Winter 2026 jam, I had two jams’ worth of experience plus a little from personal projects.

A Group Founded in Poor Communication

The official theme for this jam is “Slippery when…”. For me, the theme is miscommunication – not all of it mine, not all of it bad. The first miscommunication involved meeting Sebastian, a programmer with a little 3D experience. He mistook our 2025 entry for a 2026 one, but I invited him to Out the Door nonetheless. Nobody I’d previously jammed with was both available and interested. Within a few minutes of my pre-jam pitch, he had delivered a visual matching my inspiration perfectly.

And then I spotted an ad by Lewis. He wanted a programmer and a 3D artist to round out his jam team for a more ambitious 3D project. I applied, offering Sebastian’s and my talents. Negotiations went back and forth between Sebastian, myself, and Lewis for quite a while, with me sizing Lewis up as “middle-to-top of the little leagues.” Somewhere in there, I listed my years of experience with Linux, intending it to convey an ability to solve problems.

Unbeknownst to me at the time, Lewis was more “bottom of the big leagues” and took my experience maintaining a homelab to mean I had been programming that entire time. When I arrived in Lewis’ Discord, it had about 10x the members I expected. It is set up more like an office than an informal development environment. Impostor syndrome set in, but I pushed it aside. Their rules weren’t copy-pasted – usually a good sign – but the casual coarse language made me uncomfortable from the start.

Already on the team were people to whom I will refer as Leah, a concept artist I didn’t interact with much, and Maru, another programmer. Yellow flag out the gate: time zones – the five of us were spread roughly equally around the globe. Most pointedly: our programmers shared little in the way of sleep schedule, severely hampering collaboration.

The Jam Begins

Now, I never signed any NDAs (Non-Disclosure Agreements), but as of publication, the jam is ongoing; I will avoid mentioning specifics unless they’re vague or important. We got a relatively good jam theme, “Slippery when…”, lending itself as easily to ice/slipping as to stealth/evasion/pursuit. Four of us hopped into voice chat (Maru was asleep).

In that first meeting, I immediately got cussed at to turn my mic down (apparently, my mic settings were out of whack from fiddling last month). I proposed using Codeberg – having to explain the difference between GitHub (a code-hosting website) and Git (the version control software at GitHub’s heart) – but Lewis all but insisted on using GitHub: “let’s just stick with what’s familiar.” ‘But GitHub is not familiar to me,’ I thought. His house, his rules.

Alone, that wouldn’t mean much, but every design decision followed suit. At one point, we were split 3:2 over what to call a gameplay element. Despite Lewis landing in the minority, it shifted from “we’ll just agree to know what we mean when either term comes up” to an unspoken, ‘just use Lewis’ term in-code.’ Yellow flag.

My main contribution to the game was locating a system important to the main gameplay gimmick. I found something on the in-engine asset store and had it halfway working, but after a day it proved buggy and fundamentally didn’t support browser exports. I spent another day studying how similar effects were accomplished in familiar titles. One lonely video demonstrated everything we needed running in Godot (which Redot is compatible with), but without code or contact info.

The Tutorial

Then someone – Sebastian, I believe – located the tutorial: How to do <our gimmick> in Godot 4. Code was published, but lacked a visible license. I spent a day tracking one down. Let’s call the tutorial author Mr. G4, for Godot 4. His project built off someone’s Godot 3 code. Mr. G3 was following up on a fourth party’s Godot 2 demo. Between them, I only found an email address for Mr. G3. Mr. G2’s digital footprints ended in 2021, one post after retweeting an incitement to violence against police in 2020, which I reported. Mr. G3 got back to me with an MIT License, and I was able to share my Container Analogy post.

Mr. G4 was tricky. I asked Lewis, citing enough context between the video and comments section to treat it as MIT, and he said his personal philosophy is that if there’s no license, just do whatever; it’s just a game jam. Yellow flag: one might get away with treating unlicensed code as CC0 (effectively public domain, without any public domain dedication), but that’s not how copyright usually works.

I showed the tutorial to Muggyfox1, a colleague from my second jam, who figured it was fine: the code had been online for years, I had made an effort to reach out, and the author clearly intended it to be copy-pasted.

In either Mr. G4’s implementation or the asset library’s, the mechanic relies on a shader (GPU code usually used for graphics). Maru was wary of using shaders, but I figured it was within reach, given ready code and my success reading data back out of the graphics card.

Meeting

Sunday came around, and the jam team was invited to a meeting. Sebastian had a previous engagement, and I woke up early. Did I say the jam team? No, it was server-wide! I caught the tail end of a foul-mouthed pre-meeting goof-off session. They deliberated over which content creator sponsorship programs to apply for as a server while making crude anatomical jokes about the pair they eventually got.

I abstained from deliberation and voting, as I was planning to leave after the jam, at least until another one – besides, none of the potential sponsors interested me anyway. When pressed for comment on something, I noted for the record that I considered myself “on probation” for the time being.

Flashpoint and Aftermath

A week into the two-week jam, Lewis was feeling the time crunch. He was using all-caps in chat, demanding that someone start the 3D environment. We programmers were presumably all waiting on each other for things. Lacking a good 3D environment, I had been asking for help all day to add a slightly less important feature to the shader. Eventually, I entered a voice call with Lewis.

Lewis tried the demo I had on GitHub, but it was from a weird development point where something wasn’t working right. I offered to share directly from my screen, but Lewis wasn’t as impressed as I had hoped. He was quite miffed to hear I was working on a shader, simultaneously telling me to “SCRAP IT!!” and to “See it through!” If you’ve ever read about the development of Sonic X-treme, that’s what it felt like.

“Tell me ONE good reason I should not kick you out right now!” he said.

I had already committed to forgiving him once he apologized; otherwise, it might have turned into a shouting match. I prayed to Jesus about the situation as I actively ignored any rash words of passion that lent themselves to being hurled back. “I’m the only one who knows how the <REDACTED> system works,” I said as calmly as I could.

I tried not to flake afterwards, but the damage had already been done when Lewis yelled at me. It came into focus the next day after two unproductive hours without positive direction and a reality check with my sister and mother. I was easily spending 10+ hours a day focused on this project, trying to keep things moving when Lewis was offline and people (mostly Sebastian) were at a loss as to what was currently needed. When I DM’ed Maru, she shared her doubts from the start and said, “learning while trying to deliver a complete game in 1 week [sic] is nuts.” (The jam is two weeks.) I had tried to be clear to Lewis: “I am here to learn and have fun with the goal of getting something out the door that is a bit more complex than my previous work.” Neither of those conditions – learning or having fun – was true anymore.

Sebastian had been more optimistic the night before. I had taught him the system I’d assembled and documented it with plenty of screenshots. He commented on a couple of missing features in the shader, and let no one truthfully say I didn’t give my all to add them before I burned out.

But when I did burn out, I tried to leave with as much class as possible while still saying things I felt needed to be said. I put my code in order, leaving a new comment or two, then organized the shader with frames around nodes meant to function together. When I told Sebastian about my plans to leave and Maru’s pessimism, he replied, “I might be the only [guy] who can stitch everything together.” Maru expressed disappointment upon hearing we wouldn’t get to work together on something more serious. In the end, I wrote a 500 word letter of resignation telling this story and dropped it in his server, CC’ing everyone in the jam’s section, including what looked like a server admin.

As a last order of business, I licensed my contributions to Lewis and his team under the GPL, then ensured they made it into the main branch (I don’t know that Lewis knows enough Git – say, ‘git revert’ – to cleanly remove my contributions). Since I’m not trying to sabotage the jam, I wrote in an exemption until the submission deadline. After that, the GPL applies, meaning if Lewis wants to continue development, he will have to either publish his entire project’s source as GPL or contact me through a third party for negotiations, but only in the presence of a chaperone approved by me who has authority to end the conversation at any time for any reason.

Takeaway

I am proud of the work I did complete, and I really hope the team can publish a fun game on time. But at no point did I sign any NDAs, nor did I sign away any rights to code I wrote or modified. I hope Lewis grows as an individual from losing me like this. When you work with volunteers, you’re working with volunteers.

I had fun with the mechanic I did get working, and I think I want to do something with it in the future.

Final Question

Have you ever had to walk out on a bad team situation?

New Year’s Laptop Work

Good Morning from my Robotics Lab! This is Shadow_8472, and today I have a few related projects I jumped between over the month. Let’s get started!

Kernel Regression

In the process of software development, it is generally impossible to continuously add features without ever introducing bugs. The heart of an operating system is no different. I never figured out exactly what happened, but my laptop started getting ridiculously long boot times after a Linux kernel update – 10+ minutes, not counting time waiting on the user to enter decryption keys and login credentials, and that’s assuming it didn’t crash. However, by mashing F8 and F12 (only one of them is needed, and I’m too lazy to figure out which) during boot, I could access a menu to boot an old kernel without undue delay.

I honestly cannot recommend System76 enough. I’ve spent a while with different tech support agents, and my experience with System76 is consistently the best. I am very thankful they have native English-speaking agents who know their stuff. No one is paying me to say this.

After a few commands, I was told my hard drive was on the full side. That could explain some stuttering I was experiencing. So, I deleted a large game I don’t play anymore and installed a 2 TB NVMe drive in the second of my three M.2 slots. Opening the case to install it was as simple as unscrewing the bottom and pulling a little, following their guide. I installed Pop!_OS 24.04, and it booted in a reasonable time. I want to eventually move over to it, but I haven’t yet figured out enough LVM management to mount my encrypted partition, and unfortunately that is outside the scope of System76 support.

Our conclusion was that either I had a weird configuration somewhere or the kernel had a regression. In either case, there’s been another new update, and my laptop boots normally again.

Headphone Jack Destroyed!!

Towards the end of the month, I had my laptop set up while out and about, and I got startled badly. Like, really badly. Enough to rip my 3.5mm audio jack out from the circuit board, ruining the audio wire while I was at it. First moral of the story: never use a straight plug in a sideways-pointing headphone jack. An elbow connector in the wire makes a much weaker lever at the connection point, and might have saved an approximately $80 mistake after parts, shipping, and tax. Again, thank you System76 for putting such exposed circuitry on an easily replaceable separate board (although the charge for shipping was painful).

As of publication, the board hasn’t arrived. In the meantime, I still need sound for my workstation. My first thought was to pass my audio through my Blue Yeti. By default, this microphone has a monitor function so you can hear yourself. I had previously disabled it, and it took just a little educated guesswork to route audio correctly. I don’t like the idea of keeping my good microphone set up all the time, though, when the same USB-C multi-adapter has a 3.5mm audio out.

However, educated guesswork wasn’t enough for round two. Seemingly no matter what I did while trying to play audio through the multi-adapter, sound would either not play or find its way over HDMI to my main monitor. On the odd occasion I got as much as a chirp out of my speakers, my video driver would blink to black, stabilize, then blink out a second time – before either returning to my monitor or giving another split second of audio and starting the loop all over again.

I got in touch with System76 again. The friendly agent had me run a couple of commands to show what devices were connected, and I did what I could to provide diagnostic information. Often, this would lead to an idea, and I would delay my reply email until I could explore it. The biggest tip this session got me was to poke around with pavucontrol, a fancier audio mixer than the one built into KDE.

I found a mysterious LCD in my display configuration whenever 3.5mm audio was connected. I had a good independent exploration session, memorable for my login screen being absorbed by this phantom screen. I looked it up and found the audio was being routed over HDMI, and duplicating my main screen to the LCD was my best move.

One night, I found myself playing a song on loop for an hour, and I decided to time the malfunction. As luck would have it, when I set up the glitch loop and timed the split-second “chirp,” it ran around 9.5 seconds before the LCD disconnected and reconnected, completely busting whatever working theory I had at the time.

At some point, I muted my video’s playback in Firefox and turned the volume up slowly. No glitch. I didn’t put it together until after I’d triggered the glitch in another browser, another website, and even a local game. The common factor was volume: my cheap not-quite-a-laptop-dock has some weak components, leading to signal blowout that resets its HDMI phantom monitor when the volume gets too loud. The solution was to turn the volume down and the amplification up. A little prompting by a second agent at System76 led me to explore the audio-chirp-into-real-monitor HDMI fail state, and I understand the Linux audio system better for it.

Takeaway

This laptop has been a very good learning opportunity for me. Running desktop Linux is almost the same as always, but switching between a two-monitor workstation with a speaker system and a portable setup adds a new layer of complexity. I still have a while to go before I master PulseAudio. The day before posting, I had trouble accessing my internal microphone because I had disabled it somewhere deep in the OS. But overall, I’m learning a bit more with each fixed problem, even if the fix is just waiting a bit while upstream fixes it.

Final Question

What crazy laptop repair stories do you have?

Cold Start VTubing in Linux for 2026

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am putting together a VTubing stack ready for 2026. Let’s get started.

Challenge Outline

I forget exactly when I saw the recording, but a livestream from late 2017 demonstrating a pair of animated characters standing around talking with chat blew my mind. Unbeknownst to me at the time, a very similar technology had been implemented by Disney Imagineers back in 2001 for a Lilo & Stitch-themed video call attraction starring the fuzzy, blue alien. Fast forward through the 2020 pandemic to late 2025, and the infrastructure has developed to bring digital puppetry within reach of anyone with a good computer and a webcam.

I only share my face online with people I’ve met in person, so when a long-time friend announced a new game with webcams encouraged, I reviewed my research on VTubing on Linux.

The Software Stack

I used the video “Full Body Tracking for Linux Vtubers? Yes!” by Corbent’s Workshop [1] as my starting point. It’s aimed at VTubers switching to Linux as opposed to Linux enthusiasts looking to pick up VTubing, but I found it gave me a solid foundation to work from. In summary, he recommended:

  • XRAnimator for pose capture
  • Either VSeeFace or Warudo to animate your avatar
  • OBS (Open Broadcaster Software) for streaming to the Internet

XRAnimator

XRAnimator is an open source, Linux-native application I selected to provide face, body, and hand tracking. Only one webcam is needed for a basic setup. I played around with puppeting the default avatar and found a couple of problems. First, transparency doesn’t work right (I suspect an X11 bug), which I can work around by setting a black background. Second, it doesn’t particularly like my glasses, for which I can just wear contact lenses.

Warudo

I chose Warudo for the heart of my VTuber stack. I’d rather use something open source, but I’m on a deadline. Corbent’s Workshop noted how VSeeFace can only track the mouth by audio or by motion capture, but not both at once [1]. Warudo can. Neither has a Linux-native version, but it looks like Warudo, at least, is working on one (I think). The video recommended the latest Proton 8.x, but I found GE-Proton10-26 (GE as in “Glorious Eggroll”) works just as well. Furthermore, a Warudo update made some launch options necessary (found at Warudo Steam page/Options [gear]/Properties…/General/LAUNCH OPTIONS):

PROTON_DISABLE_NVAPI=1 PROTON_USE_WOW64=1 %command%

Check with the community Discord if this doesn’t get you in. That is where their support system is.

OBS

This is one I had installed already. OBS stands for Open Broadcaster Software. Its main use is streaming, but with one extra package I failed to document, I convinced it to host a virtual webcam that Discord had no problem picking up.

Character Model

This is all meaningless without a character .vrm file to animate. A fairly priced, quality avatar can run thousands of dollars. I have only basic 3D skills – even a simple robot avatar is out of scope for this project. Instead, I used VRoid Studio, the go-to avatar creator tool (closed source), which I also ran on GE-Proton10-26. I took the time to read through the EULA, and the biggest concern I picked up on was a prohibition on redistributing their models and rigging to make a competing program, which I can understand.

I made a self-avatar and crudely textured my specific watch with the included brushes. The result didn’t track my eyes or silent mouth movements. As it turns out, VRoid Studio and Warudo use different names for mesh deformation “blendshapes.” After installing the “VRM format” plugin in Blender, I brought my model into compliance with the VRM 0.x convention Warudo expects by renaming my ‘eye closed’ blendshapes to ‘eye blink’ and removing the prefixes from A, I, U, E, and O (Japanese vowel order). However, this broke my audio lip sync, and I haven’t figured out where to fix it yet. It was enough for a full integration test where I called a friend I originally met in person, and he said it made me look… like a child. Well, I tried. I’ll need a more stylized concept for showtime.
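For the curious, the renaming boils down to a mechanical mapping. Here is a Python sketch of the rule I applied in Blender – note the VRoid-style input names are illustrative, not the exact strings VRoid Studio exports:

```python
def to_vrm0_name(name: str) -> str:
    """Map a VRoid-style blendshape name toward the VRM 0.x names Warudo expects."""
    # 'eye closed' shapes become 'eye blink' shapes.
    if "Close" in name and "EYE" in name.upper():
        return name.replace("Close", "Blink")
    # Vowel visemes keep only the bare letter (Japanese vowel order: A, I, U, E, O).
    for vowel in ("A", "I", "U", "E", "O"):
        if name.endswith("_" + vowel):
            return vowel
    return name

# e.g. "Fcl_EYE_Close_L" -> "Fcl_EYE_Blink_L"; "Fcl_MTH_A" -> "A"
```

In Blender itself, the same loop runs over each mesh's shape keys instead of plain strings.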

Further Research

I may have reached baseline functionality, but there is so much more to learn about, even at a $0 hobby level.

  • XRAnimator
    • Loading a model into XRAnimator is rumored to improve quality.
    • Add a second webcam to improve tracking.
    • Move motion capture to another device, like my Kinect2 (if that works on Linux).
  • 3D Model
    • Finish full lip sync capabilities.
    • Make an ARKit compliant model for improved facial expressions.
      • NEB (Not Enough Blendshapes) (Unity plugin)
    • Learn to use Warudo blueprints and expressions.
  • Accept Unity as the gold standard for editing VRoid models.
  • WINE
    • Switch to Bottles if Steam’s grip on Warudo allows it (I’m on Lutris).
    • Put the Windows parts of my stack in the same WINE bottle.
  • Switch to a more open source ecosystem for avatars.

Takeaway

My journey to a VTubing Hello World was surprisingly easy, obstacles notwithstanding. The hardest part was probably writing this up. I have a path forward with this project.

Final Question

Are you here starting your own journey? Be sure to let me know in the comments below or on my socials.

Work Cited

[1] Corbent, “Full Body Tracking For Linux Vtubers? Yes!” youtube.com, Corbent’s Workshop, Oct. 27, 2024. [Online]. Available: https://www.youtube.com/watch?v=O7EkbzLpl-8. [Accessed: Dec. 23, 2025].

A Pair of Programming Projects

Good Morning from my Robotics Lab! This is Shadow_8472 and today I am covering my programming work this past month. Let’s get started!

I am finding that even on a monthly cycle, I don’t always finish the projects I intend to. This month, I was intending to work on my phone, but that project hit some tough truths, and the story isn’t ready yet.

HyperTrace

I might as well introduce my newest long-term project, HyperTrace. It started out in October as a spectator camera for a larger Redot community project. I studied up on using quaternions to rotate a camera node while avoiding gimbal lock, but just as I was starting to get them under control – and after I was bored stiff of them – I realized I could just attach a camera to a physics object.
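For anyone curious why quaternions are worth the headache, here is a minimal sketch of the idea in plain Python (the project itself is in Redot, whose built-in Quaternion type does all of this for you):

```python
import math

def quat_from_axis_angle(axis, angle):
    """Unit quaternion (w, x, y, z) for a rotation of `angle` radians about `axis`."""
    ax, ay, az = axis
    n = math.sqrt(ax*ax + ay*ay + az*az)
    s = math.sin(angle / 2) / n
    return (math.cos(angle / 2), ax*s, ay*s, az*s)

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    """Rotate vector v by unit quaternion q: q * v * q^-1."""
    w, x, y, z = q
    p = quat_mul(quat_mul(q, (0.0, *v)), (w, -x, -y, -z))
    return p[1:]

# Rotating +X by 90 degrees about +Z yields +Y - no Euler angles, no gimbal lock.
print(rotate(quat_from_axis_angle((0, 0, 1), math.pi / 2), (1, 0, 0)))
```

Composing rotations is just quaternion multiplication, which is why a free-flying camera never hits the degenerate orientations Euler angles suffer from.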

But once I had a camera with six degrees of freedom (three positional and three rotational), I started thinking about how I could make a game out of it. I figured a space piloting game was in order, so I studied up on Perlin noise and similar methods of generating structured noise to produce a shifting cave structure representing a hyperspace mesh for the player to navigate – preferably at high speed.
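To illustrate what I mean by structured noise, here is a tiny 1D value-noise sketch in Python – a simpler cousin of Perlin noise, and not the project’s actual generator:

```python
import math

def lattice(i: int, seed: int = 0) -> float:
    """Deterministic pseudo-random value in [0, 1) for integer lattice point i."""
    h = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def value_noise(x: float, seed: int = 0) -> float:
    """Smoothly interpolate between neighboring lattice values."""
    i = math.floor(x)
    t = x - i
    t = t * t * (3 - 2 * t)  # smoothstep easing: continuous first derivative
    return lattice(i, seed) * (1 - t) + lattice(i + 1, seed) * t
```

Unlike white noise, nearby inputs give nearby outputs, which is what makes cave walls coherent instead of static. The real thing extends this to 3D (often with gradients per lattice point, à la Perlin) and animates by scrolling a fourth dimension.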

The idea is that an unnamed pilot is tracing out a route through shifting hyperspace, and if he collides with a wall, he is thrown out of hyperspace, where it isn’t safe to re-enter, so he must return to port under conventional physics. If he succeeds, his fleet can instantly jump across his route. This FTL system sets the stage for a future where humanity has colonized the solar system but is hesitant to venture out to other stars. Think Star Wars meets Space Engineers meets Monkey Ball.

So far, my best tech demo is a blue fog you can navigate at slow, careful speeds. It takes an unacceptably long time to generate.

Shell Scripting

My services were called upon for a “computer project.” My mother had some PDF files of recipes to separate. Over the course of an afternoon, I wrote a BASH script to separate out each page in each PDF. At its heart was the program qpdf. Along the way, I tried to keep good practices like leaving comments and having checks to prevent accidental overwriting.
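The original script stayed with the project, but the splitting step boils down to something like this Python sketch – a reconstruction, not the script itself. It leans on qpdf’s --split-pages mode and includes the overwrite guard I mentioned:

```python
import glob
import os
import subprocess

def split_pdfs(src_dir: str, out_dir: str, dry_run: bool = True):
    """Build (and optionally run) one qpdf split command per PDF in src_dir."""
    commands = []
    for pdf in sorted(glob.glob(os.path.join(src_dir, "*.pdf"))):
        stem = os.path.splitext(os.path.basename(pdf))[0]
        # qpdf replaces %d in the output name with a (zero-padded) page number.
        out = os.path.join(out_dir, f"{stem}-%d.pdf")
        # Overwrite guard: skip any PDF that already has split output.
        if glob.glob(os.path.join(out_dir, f"{stem}-*.pdf")):
            continue
        cmd = ["qpdf", "--split-pages=1", pdf, out]
        commands.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)
    return commands
```

Called with dry_run=False on a folder of recipe PDFs, each one fans out into per-page files ready for renaming.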

Now, I only know a modest bit of BASH (the popular Linux command-line shell), but I know enough about basic programming that it was just a matter of syntax for me. I figured I could do one better in around the same amount of time, so I saved a copy of the script and its output, then set about naming files after their contents, identifying and recombining second pages, sorting out charts, and numbering repeats (some duplicate recipes had changes).

But for every hour I spent working out a feature, I figure I spent another debugging edge cases. Several recipes had characters in their titles that are special to BASH or invalid in filenames, like *, &, and /. I would have preferred careful quoting of strings, but in the end I sanitized / as ÷ and removed * entirely. This additional work extended the project to a total of three days.
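The sanitization itself is trivial once you settle on substitutions. A Python sketch of the two rules I landed on:

```python
def sanitize(title: str) -> str:
    """Make a recipe title safe to use as a filename."""
    title = title.replace("/", "\u00f7")  # '/' is illegal in filenames; swap in the division sign
    return title.replace("*", "")         # drop glob characters entirely
```

The hard-won lesson was less the substitutions and more remembering to quote every variable expansion in the BASH version so the remaining odd characters pass through harmlessly.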

Closing thoughts

My goal with HyperTrace is to publish it to Itch.io or a similar platform on a donation basis. And before I get past more than a couple of tech demos, I’m going to want a version control server in my homelab, just in case something happens to my laptop. While I really want to think ahead in terms of security, I know I don’t know enough to make an informed decision about the utility I need now.

Final Question

What Git setup do you recommend for a homelab?

How to Run Electric Quilt 8 on Linux

Status: INCOMPLETE

Good Morning from my Robotics Blog! This is Shadow_8472 and today I am installing Electric Quilt 8 on Linux for my mother. Let’s get started!

First off, I think I’m going to try something new. I always seem to juggle three to five longer projects, splitting off extra parts when I don’t complete something, and a few of those updates were a bit underwhelming. Going forward, I want to try posting a series of updates to old posts as I finish projects.

Electric Quilt 8

EQ8 (Electric Quilt 8) is a quilting program my mother uses. It is also my mother’s last Windows holdout. Seeing as Windows 11 is all sorts of bad news and her laptop was forcefully migrated, I took on the project of coaxing EQ8 into running off-platform. The gold standard is of course native performance, but a small Windows 10 VM (Virtual Machine) is on the table as a last resort.

Attempted Solutions

WINE:

The first stop when attempting to run Windows programs on Linux is WINE (a recursive acronym: “WINE Is Not an Emulator”), a compatibility layer that translates Windows system calls into Linux equivalents. Long story short, I’ve thrown several versions at it, but I’m not a WINE wizard. There is potential here, but I cannot unlock it alone.

ReactOS:

ReactOS is an open source operating system based on Windows’ architecture. I had set up a VM from another project, but once I got EQ8 over and tried to install, it said it wanted Windows 7. For context, ReactOS is starting to think about re-implementing software libraries introduced in Vista. This is a dead end for now.

Crossover:

I contacted EQ Support with a don’t-ask, don’t-get attitude. As expected, they only support Windows installations, but the representative pointed me in the direction of Crossover, saying, “Other customers have had success using their product with our software.” I am currently on the two-week free trial, and while I haven’t gotten it working yet, this is also a promising avenue to explore.

Fedora:

My mother contacted one lady who’s gotten EQ8 working on Fedora Linux. I made a VM for Fedora 42 and tried both Lutris and Crossover. No improvement. As of Oct. 26, the last I’ve heard from her is “Fedora Version 42, Wine Version 10.12, Windows 7 (without Internet-Connection), no launcher.”

Tech Pile:

It’s always fun to raid the tech pile in the hopes of getting something to run. We turned up two Vista laptops and a netbook with Windows 7 Starter. Win7 Starter is 32-bit; EQ8 is 64-bit. It wasn’t entirely fruitless, as one of my computers has a Windows 10 drive I never nuked.

Community:

Long story short: I haven’t been able to break into many conversations on my own about EQ8. Reddit has a couple references to running it on Linux, but they run dry pretty quickly.

This is an open project. Check back later for updates.

Final Question

Do you use EQ8 on Linux? Maybe you’re a WINE guru and found this Help Wanted post. If either of these is so, please reach out to me either in the comments below or on Discord so I can update this post with a solution. You can find an invite to my server on my Socials page.

Building an Improvised Recording Booth

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am building a sound booth in my closet. Let’s get started!

Designing a Recording Booth

At one point in my year off, I researched the basics of voice acting. I already have a Blue Yeti and Tenacity (fork of Audacity, a free/open source sound recording program), but the recording environment is just as important. Tenacity and other programs that handle audio often come with powerful tools to clean up background noise, and better techniques are being innovated all the time.

But no matter how good post-production gets, it’s still best to minimize unwanted noise relative to signal (voice or instruments) in the first place – that background level is called the noise floor. Depending on the size and shape of your room, sound also reflects back and echoes. Vibrations from the building can travel up a simple mic stand and degrade quality that way. Little tweaks like these are endless, but the above are the scope of my project today.

Common DIY sound booth solutions range from recording in a closet, to draping moving blankets off a collapsible PVC pipe frame, to building a mattress fort. My budget is “stuff from around the home,” so no PVC or mattresses for me. The theory behind closets is that the clothes absorb the sound bouncing around, but I read that if you think you have enough clothes for that, you don’t have enough clothes. Long story short: I hung the heaviest unused blankets I found around the house in my closet for a compromise between the first two solutions. I also used a dresser drawer to reduce mic vibrations.

Project

What good is a sound booth without a project to test it on? As of writing, I’ve only done test recordings of “Mary Had a Little Lamb,” comparing the noise floor loose in my room, in a camper when I visited family in the mountains this summer, and within the sound booth. I think the biggest factor I need to worry about is my rather loud laptop fans.

For the longer term, I auditioned for an unpaid role in a fan re-animation project: The Tigger Movie: ReBounced [Link to Casting Call Club page]. I looked it up as I actually have a pretty good Eeyore impression, but that role was taken already. After a little self-debate, I tried out for Piglet since digital manipulation was on the table, and Piglet’s role is as small as he is. My family picked up a copy of the original movie to watch, and I’m finding Piglet’s stutter a little difficult to master, but it is nothing compared to Tigger’s lisp. The project is in need of animators, so if you happen to have some time, feel free to check it out.

Final Question

What entry-level equipment would you recommend and in what order?

I’m Back

Good Morning from my Robotics Lab! This is Shadow_8472 with a quick update. Let’s get started!

Keeping things short, I’m back from my year-long break. I didn’t do as much in terms of stuff to cover, but I did get a new laptop built for Linux, and I participated in Redot’s New Year’s and Spring game jams. Check them out on Itch.io. I’ll be writing about them and other projects in coming months.

Speaking of which, I am moving to monthly posts – the last Monday of each month. Weekly posts kept me moving when projects were simple, but I eventually found myself writing fluff. There are only so many times I want to explain NFS auto-mounts, yet I’d do it just to publish something. A monthly format will allow time for tougher and/or more haunting challenges, like my Nextcloud saga, which I finally have an answer to. Concluding such epic efforts might be worthy of a video, a format I am interested in trying out.

Final Question

One of my long-term goals is to build a following, and I understand a good way to do that is networking with similar content creators. Do you know anyone I might have a chance at getting to know?

My Pi 400 Travel Desktop

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am trying out my Raspberry Pi 400 as a lightweight daily driver. Let’s get started!

rPi 400

The Pi 400 is a special edition of Raspberry Pi – essentially a Pi 4 built into a keyboard. It’s missing a USB 2.0 port (made up for by the keyboard itself) and the 3.5mm audio jack. In theory, the miniature keyboard computer is perfect for travel – assuming you have a screen at your destination, that is.

Choosing a Distro

My first thought was to try installing Arch. A brief search turned up an Arch Linux ARM project, as vanilla Arch only supports the x86_64 architecture. The instructions involved formatting a drive from the terminal. I bought a 256 GB SD card and did so on the first convenient rPi install I had – Kali Linux.

It took a few failed attempts to install Arch. The pure open source method that reportedly works on the regular Pi 4 didn’t on my Pi 400, and the standard method wasn’t cooperative when installing a login manager and any of a few desktop environments. Running on a time crunch before leaving on a trip, I switched to DietPi, another lightweight distro I’ve worked with before, but for a much different project. As with Arch, MATE was uncooperative, so I settled on XFCE. Special thanks to Balena Etcher for its warning when formatting large drives.

Packing For a Trip

To shortcut setup, I copied my browser and LibreOffice data from my main desktop. LibreOffice worked for me, but didn’t carry the full dark mode – a problem I’ve encountered before, but ultimately decided to live with.

Firefox ESR – as installed from DietPi’s walled garden – refused to accept my profile. Regular Firefox – installed from the apt repositories – was up to date and started as expected. Notably, it included my extensions – especially Bitwarden, my password vault.

A screen was not procured at my destination, so I packed one from home. The missing audio jack was also problematic, so I packed my Blue Yeti with plans to disable voice monitoring. I also packed an HDMI monitor for redundancy, but busted the styrofoam while stuffing it into the slightly wrong box. As of writing, I have done nothing with sound.

Deployment

We left on our trip. Upon arriving, I found my over-packed tech bag lacked a long enough HDMI cable, so I borrowed a longer one. The monitor signal kept flickering. For a while, I assumed I was overloading the Pi with a couple hundred browser tabs, but after a power blink and several reboots, it turned out that the HDMI cable was bad. We bought a replacement, and it’s been working properly since.

Ejected Challenges

Arch wasn’t the only thing I had to back off from. I brought a couple of additional Pi 4s to have myself a nice little network with Pi-hole ad blocking, but Wi-Fi strength and configuration challenges meant those were both a no-go.

Another challenge I wanted to pull off was playing Stardew Valley. I copied the files over back home, but haven’t had time to try the conversion I found yet.

Takeaway

Finishing a project is, as a rule, better than stalling on an overly ambitious one. I have an on-the-go workstation, even though it still lacks polish.

Of special interest, this week marks the 6th anniversary of my Robotics Lab. And some months ago, I decided I wanted to do a Sabbath year cycle. I’ve proven that I can be consistent at posting, even if I feel my quality slips some weeks. The fact of the matter is that weekly posts are getting a bit repetitive for me, so for the next year, I’m only going to post when I finish a major project, such as when I have the full software suite I have in mind for my homelab. After that, perhaps I’ll stick to monthly. We’ll see in a year’s time.

Final Question

Setting up a travel computer on the quick was a bit of a trick. What must-haves would you include in a similar package?

I look forward to hearing your answers in the comments below or on my Socials.

Server Rebuild With Quadlet

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am continuing my work on my home’s network. Let’s get started!

State of the Homelab

Bitwarden. I self-host using Vaultwarden, a third-party server. Done properly, it fits nicely into a larger homelab stack, but its OCI container can stand alone in a development environment. Due to skill limitations, I’ve been using it in this configuration. My recent network work has invalidated my manually self-signed certificates, and I’d rather focus my efforts on upgrades instead of re-learning the old system to maintain it.

Today, I am working on my newest server, Joystick (Rocky Linux 9). I compiled some command-by-command notes on using `podman generate systemd` to make self-starting, rootless containers, but just as I was getting the hang of it again, a warning message encouraged me to use a newer technique: Quadlet.

Quadlet

Quadlets? I’ve studied them before, but failed to grasp key concepts. It finally clicked, though: they replace not just running `podman generate systemd` once I have a working prototype, but also everything I might want to do leading up to that point, including defining Podman networks and volumes. Just make your Quadlet definitions once, and the system handles the rest.
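
To give a feel for the format: a rootless Quadlet is just a systemd-style unit file dropped into ~/.config/containers/systemd/. The file below is a minimal, hypothetical sketch (the image and names are placeholders, not from my actual setup); on a daemon-reload, Podman’s systemd generator turns it into a regular .service unit.

```ini
# ~/.config/containers/systemd/example.container (hypothetical minimal sketch)
[Unit]
Description=Minimal Quadlet demo container

[Container]
# Any OCI image works; this one just sleeps so the unit has something to run.
Image=docker.io/library/alpine:latest
Exec=sleep infinity

[Service]
Restart=always

[Install]
# For rootless user units, default.target is the usual anchor.
WantedBy=default.target
```

After a `systemctl --user daemon-reload`, it shows up as example.service and can be started and stopped like any other unit.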

The tutorial I found that best matches my use case can be found at mo8it.com [1]. Follow the link under Works Cited for the full text. It’s very detailed; I couldn’t have done a better job myself. But it doesn’t cover everything, like how `sudo su user` isn’t a true login as far as `systemctl --user …` is concerned. I had to use a twitchy Cockpit terminal for that (Wayland-Firefox bug).

Caddy

Caddy is the base of my dream homelab tech tree, so I’ll start there. My existing prototype calls for a Podman network, two Podman volumes, and a Caddyfile I’m mounting as a volume from the file system. I threw together caddy.container based on my script, but only the supporting network and volumes showed up. Systemd did pick up on “mysleep.container,” an example from Red Hat.
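
For reference, a setup like the one described – one network, two volumes, and a Caddyfile mounted from the file system – can be sketched in Quadlet roughly as follows. All file names and paths here are illustrative guesses, not my actual files:

```ini
# caddy.network – defines a Podman network; referenced below by file name
[Network]

# caddy-data.volume and caddy-config.volume each contain just:
# [Volume]

# caddy.container – note the capitalized section and key names; Quadlet is picky
[Container]
Image=docker.io/library/caddy:latest
Network=caddy.network
Volume=caddy-data.volume:/data
Volume=caddy-config.volume:/config
# Bind-mount the Caddyfile from the host; :Z relabels it for SELinux on Rocky
Volume=%h/containers/caddy/Caddyfile:/etc/caddy/Caddyfile:Z
PublishPort=8000:80
PublishPort=44300:443
```

Referencing the .network and .volume files by name inside the .container file is what lets Quadlet manage the whole stack’s ordering for you.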

As it turned out, caddy.container had a capitalization error. I found the problem by commenting out lines, reloading, and running `systemctl --user list-unit-files` to see when it didn’t load. Likewise, my Caddyfile volume had a file path bug to squash.

Vaultwarden

Good, that’s started and should be stable. On to Vaultwarden. I updated both ButtonMash’s and Joystick’s NFS unit files to copy over relevant files, but Joystick’s SELinux didn’t like my user’s fingerprints (owner/group/SELinux data) on the NFS definitions. I cleaned those up with a series of sudo cp and mv commands, and then I could enable the automounts.
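
For anyone wanting the shape of such an NFS automount, here is a hedged sketch of a systemd mount/automount pair (the server name and paths are made up; note that the unit file names must match the mount point, with dashes standing in for slashes):

```ini
# /etc/systemd/system/mnt-buttonmash.mount (hypothetical; name must match Where=)
[Unit]
Description=NFS share from ButtonMash

[Mount]
What=buttonmash.lan:/srv/share
Where=/mnt/buttonmash
Type=nfs

# /etc/systemd/system/mnt-buttonmash.automount – enable THIS one, not the .mount
[Automount]
Where=/mnt/buttonmash
TimeoutIdleSec=600

[Install]
WantedBy=multi-user.target
```

Enabling the .automount (rather than the .mount) means the share only mounts on first access, which is friendlier when the server isn’t up yet at boot.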

Vaultwarden went up with simple enough debugging, but the challenge was in accessing it. I toyed with Cerberus/OPNsense (hardware firewall) DNS overrides until Caddy returned a test message from <domain.lan>:<port#>.
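
The kind of test configuration described can be sketched in a Caddyfile like the one below. The site name and upstream are placeholder guesses; `tls internal` makes Caddy sign with its own root certificate, which is why browsers need that root installed later.

```caddyfile
# Caddyfile sketch – illustrative names only
vaultwarden.home.lan {
	# Use Caddy's built-in CA instead of a public one for a LAN-only name
	tls internal
	# Forward to the Vaultwarden container over the shared Podman network
	reverse_proxy vaultwarden:80
}
```

With both containers on the same Podman network, Caddy can reach the upstream by container name instead of an IP address.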

Everything

My next battle was with Joystick’s firewall: I forgot to forward TCP traffic from ports 80 and 443 to 8000 and 44300, respectively. Back on Cerberus, I had to actually figure out the alias system and use that. Firefox needed Caddy’s root certificate. Bouncing back to the network Quadlet, I configured it according to another tutorial doing something very similar to what I want [2], though I configured mine without an external DNS. A final adjustment to my Caddyfile corrected Vaultwarden’s fully qualified domain name, and I was in – padlock and everything.
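
For the record, port forwards like these look something like the following on a firewalld-based system such as Rocky Linux (a sketch assuming the ports above and the default zone):

```
# Forward privileged ports to the high ports a rootless container can bind
sudo firewall-cmd --permanent --add-forward-port=port=80:proto=tcp:toport=8000
sudo firewall-cmd --permanent --add-forward-port=port=443:proto=tcp:toport=44300
sudo firewall-cmd --reload
```

This pairs naturally with rootless Podman, which by default cannot bind ports below 1024.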

Takeaways

I come out of this project with an intuition for managing Systemd files – especially Quadlets. The Quadlet workflow makes Podman container recipes for Systemd, and a working container will work forever – barring bad updates. I would still recommend prototyping with scripts when stuck, though. When a Quadlet fails, there is no obvious error message to look up – it just fails to show up.

Even though Joystick is still new, a lot of my time on it this week was spent diagnosing my own sloppiness. Reboots helped when I got stuck, and thanks to Quadlet, I won’t have to worry about spaghetti scripts like the ones I originally organized ButtonMash with and never stabilized – the reason this is a victory I had to re-achieve today.

Final Question

Nextcloud is going to be very similar, except I will make it a pod along with MariaDB and Redis containers. But I am still missing one piece: NFS. How do I do that?

I look forward to your answers below or on my Socials.

Works Cited

[1] mo8it, “Quadlet: Running Podman containers under systemd,” mo8it.com, Jan. 2-Feb. 19, 2024. [Online]. Available: https://mo8it.com/blog/quadlet/. [accessed: Sept. 13, 2024].

[2] G. Coletto, “How to install multi-container applications with Podman quadlets,” giacomo.coletto.io, May 25, 2024. [Online]. Available: https://giacomo.coletto.io/blog/podman-quadlets/. [accessed: Sept. 13, 2024].

Replacing My Tablet’s Battery

Good Morning from my Robotics Lab! This is Shadow_8472 with a side project of the week. Let’s get started!

I’ve had my Samsung Galaxy Tab A 10.1 (2016) since before my new Android device lockout. It has held out for the most part, but several months ago, it developed a problem where it would lose power when I plugged it in or unplugged it. At worst, it would pulse its battery charging picture until eventually moving on to a charge meter. The problem grew worse over time, and new power cables didn’t solve it. The parts to replace the USB connector and battery were out of my price range. One day, I even made a “goodbye” backup.

I don’t know how I discovered this, but a little pressure on the back let my tablet boot again. Sometimes, the battery would register as empty and shut down once Android loaded – even when connected to wall power – only for the battery to show up as 100% again a moment later.

I zeroed in on the left-middle (viewed from the front in portrait mode) as the place to press. As I got used to this arrangement, I had to keep pressing harder. Eventually, I pried the back cover off with an orange peeler. The battery has two sections, and I’d been pressing on the bottom-outside corner of the upper pack. The battery itself did not appear bloated or otherwise damaged, and the tablet continued to work when I applied pressure directly to the battery pack (DUMB IDEA, by the way; DO NOT TRY) and pressed the tiny, exposed power button.

As it turns out, Android sees the battery as a critical part (unlike laptops), and I don’t have the tools or know-how to simulate one, but some kind of wall-only power mode would have been a courteous gesture, even if it were buried deep within the firmware.

Through another month or two of observation, I learned that the battery failed when I stopped pressing too soon after booting or when the tablet drew too much power, such as when turning on or when plugging it in while off. With this, I narrowed my diagnosis to just the battery pack, which I figured was worth trying to replace. Reviews of different compatible replacement batteries said not to expect more than a year out of them – even genuine ones. My guess is that it’s all old stock, changes pushed to the operating system intended to slow battery aging, or a combination thereof.

Replacing the battery once it arrived was a mere formality. The screwdriver that came with the kit was worthless, but I had access to a better one. I expected little of the battery, and little is what I got. It’s a step or two above junk, but it’s worlds better than the old one. I just need to be a bit more careful about how I handle my tablet from now on. If it only lasts another year or less, my plan is to replace the tablet entirely.

Final Question

I can’t find any information online about applying pressure to batteries. Do you have any insights? If so, I look forward to hearing from you in the comments below or on my Socials.