Ice Boat Challenge Port Re-revamp

Good Morning from my Robotics Lab! This is Shadow_8472, and today, I am going over the “boat race” I ported to Minecraft 1.13 vanilla. Let’s get started.

At the beginning of the week, GrifterMage, an admin on the Minecraft server I play on, showed interest in getting the boat race up and running again sometime soon. I had already changed its host server over to a “Paper” server, like the server the project is being built for. Everything worked for a little while, but one day, the datapacks I tried sending weren’t working. I restarted my server, and my “original” copy was broken.

I decided to rebuild the backbone of the project from the ground up. While Grifter recommended I reuse as much as possible, I knew there was a certain amount of spaghetti logic at the end. I figured it would take less time to apply the lessons I learned last time and build a version 2.1.

Grifter said he was okay with work-in-progress “untested” datapacks, so I made provisions as I set up my workspace. With Derpy set up as a VNC machine, I experimented with different arrangements of windows on both Derpy and my home Windows machine until I ended up with something that works.

My creative server with the secret boat race development laboratory in an Ice Spikes biome served to manufacture the structure files. From there, I would switch over to Derpy, where I would copy them into the “original” datapack along with the “originals” of the functions to run the whole setup. The datapack was stored in a special folder on the desktop beside another folder called Nightly Backups. That folder did not live up to its name: I saved two to four zip files per day there. Whenever I felt I needed to make a progress report to Grifter, I would open a terminal and run a command to zip the datapack for both storage and transmission.
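
The post only mentions “a command to zip the datapack,” so here is a rough Python equivalent of that backup step; the folder names are guesses, not the real paths on Derpy:

```python
# Zip the datapack into the Nightly Backups folder under a timestamped name.
# Folder names here are hypothetical stand-ins for the real desktop layout.
import shutil
from datetime import datetime
from pathlib import Path

def backup_datapack(datapack_dir, backup_dir):
    """Archive datapack_dir into backup_dir as a timestamped zip file."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M")
    target = backup_dir / f"ice_boat_challenge_{stamp}"
    # make_archive appends ".zip" itself and returns the final path
    return shutil.make_archive(str(target), "zip", datapack_dir)
```

The same archive works for both storage and sending to Grifter, which is why one command covered both jobs.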

The greatest flaw I perceived in my design last time was my inability to be sure someone else could set the game up on another server, so I created a “Clean Paper Server” to test my datapacks myself where I wouldn’t need to worry about anything accidentally relying on some scoreboard objective I forgot to make sure was created. Once I was done, all I had to do was delete the old server, copy the Clean Paper Server beside itself on Derpy’s desktop, copy the datapack into Clean Paper Server (copy 1), launch the server, hop in with my client on Windows, load both halves of the arena, and install the Ice Boat Challenge game. If this were something I was planning on needing to do over the course of a month or more, I would have looked into writing a script to automate what I could on Derpy.
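
The script I considered but never wrote might have looked something like this; the paths and the “Clean Paper Server” layout are assumptions based on the manual steps above, and the `world/datapacks` location is where vanilla and Paper servers load datapacks from:

```python
# Sketch of automating the test cycle: clone the Clean Paper Server and
# drop the datapack into its world. Names are assumptions, not real paths.
import shutil
from pathlib import Path

def stage_test_server(clean_server, datapack, work_dir):
    """Copy the Clean Paper Server and install the datapack into its world."""
    test_server = Path(work_dir) / "Clean Paper Server (copy 1)"
    if test_server.exists():
        shutil.rmtree(test_server)  # delete the old test server first
    shutil.copytree(clean_server, test_server)
    packs = test_server / "world" / "datapacks"
    packs.mkdir(parents=True, exist_ok=True)
    shutil.copytree(datapack, packs / Path(datapack).name)
    return test_server
```

Launching the server and loading the arena would still be manual; this only covers the copy-and-install half of the cycle.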

The design objectives of this version remain almost the same as my first attempt at porting the boat race. I want to minimize the load on the server, while making it as easy as possible to develop other games for the system. I ended up coming up with a number of standards, like a “home” block for most command blocks in the Level Slots (floors a level can be loaded into) to run their commands from, forming a sort of two-way vector between the homes of each Alcove and Level Slot.

One improvement I noted in my notes for future development was to set up something similar for navigating the alcove. When Grifter introduced the /data merge command to me so I could use the same structure block for basically anything I wanted, he hinted that Minecraft 1.14 might actually let commands pipe information stored in such structures as input to other commands. I made sure to note that in many places within the comments of my functions. /data merge also opens up the possibility of multiplayer games that hijack functionality from the alcoves, so I made provisions for that functionality in the future.
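
The structure-block trick Grifter showed me can be sketched in two commands; the coordinates, the structure name, and the redstone trigger here are placeholders for illustration, not the real setup:

```mcfunction
# Reuse one structure block for any level by rewriting its NBT, then power it.
data merge block 10 64 10 {mode:"LOAD",name:"boat_race:level_01_a"}
setblock 10 65 10 minecraft:redstone_block
```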

This time around, the game starts when a player steps in an alcove. The Game Center operating system checks the player’s score for what game they want to play, and loads that game’s loader into the Level Slot for that alcove using a structure block in the home coordinates. The loader checks what slot it’s in, then checks what level to load for the player in the Level Slot’s alcove. It tries to load in the first half of the level, but if it fails for any reason, it has a carefully staged self-destruct to reset the Level Slot while making sure the redstone bits don’t accidentally pop off while multiple blocks are being changed (I’m pretty sure Minecraft likes to work from low y values on up based on an experiment I did with End Gateways, and my observations here back up this hypothesis).
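
As a rough sketch of that score check, assuming names I invented for illustration (the objective, score value, and function path are not the real ones):

```mcfunction
# Create the game-choice objective, then run the loader for players who
# picked game #1. All names here are hypothetical.
scoreboard objectives add game_choice dummy
execute as @a[scores={game_choice=1}] at @s run function game_center:load_boat_race
```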

After the loader, a series of command blocks embedded into each level takes over. These blocks handle spawning in the boat, teleporting the player, animating the boat so it does its cool spin, and checking whether the player has won or left their boat. It is here that I fixed a bug where the last version would smash players’ faces every once in a while; I used some redstone to space out commands. One of the last major bugs I faced before I got the main show working was in how I hooked up the win detector: I had two blocks on repeat calling the same function and feeding into the same conditional chain command block. I ended up moving the win condition into its own dedicated command block.

A very big thanks to Grifter for all the help he’s given me this week. He got me started and helped me build inertia until I understood things well enough to continue mostly on my own. He also provided a friendly Discord avatar for me to explain the problem to, in the hopes he would have time to help fix my current bug, only for me to keep working on it and often find the fix on my own.

A final word: documentation. Documentation is the link between you programming now and whoever is working on your project in the future. I have a whole folder in the root directory of the datapack describing what the logic behind the Game Center and the Boat Challenge game is supposed to be. It also has a list of known bugs, a list of naming conventions I found helpful, credits, and even some general notes on what to do if someone ever rewrites the project again.

Final Question: Have you ever gone back to redo a project, only to find you’ve sped up by a fairly large margin?


An Experiment Rerun

Good Morning from my Robotics Lab! This is Shadow_8472, and today, I am rerunning the experiment from a few months ago to see if my Raspberry Pi is likely to melt through its PLA casing. Let’s get started!

Last time, my apparatus failed miserably. The painter’s tape holding the thermal probe in place slipped just enough to let the probe get away from the point it was supposed to be measuring. This time, I used Scotch tape, and it held. Another improvement was a custom program that ran all the CPU’s cores at 100%, compliments of the workshop I’ve been going to. I recorded temperatures to the nearest tenth of a degree instead of rounding to the nearest whole.
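
The workshop’s program wasn’t published, so this is a reconstruction of the idea: one busy-loop process per core keeps every core at 100% for the test duration.

```python
# Minimal sketch of a full-CPU load generator (a reconstruction, not the
# actual workshop program): spin one process per core until a deadline.
import multiprocessing
import time

def burn(stop_time):
    # Tight loop keeps one core fully busy until the deadline passes.
    while time.time() < stop_time:
        pass

def stress_all_cores(seconds):
    """Run one busy-loop process per CPU core for the given duration."""
    stop = time.time() + seconds
    workers = [multiprocessing.Process(target=burn, args=(stop,))
               for _ in range(multiprocessing.cpu_count())]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return len(workers)

if __name__ == "__main__":
    stress_all_cores(2)  # short demo run; the real test ran for an hour
```

For the experiment itself, the duration would be the full hour of the test.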

Results for today’s experiment. All measurements are in Fahrenheit.

I ran the experiment for an hour, taking measurements on a white envelope on my desk, on a chip on the upward-facing bottom of the Pi, and with the probe at the target spot on the case. Measurements were taken while the Pi was off, again after it had been running long enough for the critical spot’s temperature to level off, and every five minutes after the test started, with extra data points at 1, 2, and 3 minutes.

I changed procedure at the 15-minute measurement, as the temperature from the chip was highly unstable, likely due to temporarily improved ventilation from removing the case’s lid. After dropping this measurement, I noticed the probe temperature climb slightly faster before stabilizing to within a degree of 102 °F for the second half hour of the test, well below the glass transition temperature of PLA.
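
As a sanity check on that conclusion, the plateau converts to about 38.9 °C, comfortably under PLA’s glass transition temperature (roughly 60 °C is a typical datasheet figure; that number is my assumption, not from the experiment):

```python
# Convert the ~102 °F plateau to Celsius and compare with PLA's glass
# transition temperature (~60 °C is an assumed typical datasheet value).
def f_to_c(f):
    return (f - 32) * 5.0 / 9.0

plateau_c = f_to_c(102)        # about 38.9 °C
pla_glass_transition = 60      # °C, approximate
print(plateau_c, plateau_c < pla_glass_transition)
```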

Conclusion: While this case should be fine with whatever load I put on it, PLA is not the material to be making computer cases out of.

Final Question: Have you ever changed an experiment mid-procedure because you suspected your measurements were affecting your results?

A Series of Limitations

Good Morning from my Robotics Lab! This is Shadow_8472, and today, I am laying a project to rest. Let’s get started.

The past couple weeks, I’ve been working to make a video call from my Raspberry Pi using the PiCamera taped into the case. Only this week did I learn it was too ambitious, when I stopped by the workshop and actually started testing what worked. I managed to get an inbound stream and presumably good audio (there weren’t any speakers connected at the time), but the video stream coming from the Discord webapp in Chromium came through far too slowly to be of any use except as a good laugh.

My time on this project was not wasted, though. I learned a lot about how the PiCamera works, what it can do, and what the Pi itself cannot do. It turns out the Raspberry Pi does all its graphics on the CPU, meaning there’s no specialized circuitry for video calls to begin with. That isn’t to say it cannot handle ANY streaming: it did manage to pull off viewing the live broadcast of my church’s services for the last couple Sabbaths with only a few problems. It’s likely something to do with the resolution and frame rate it’s trying to work through, or perhaps trying to send nonexistent audio while streaming video is a little much for the little computer.

This project was fun, but I simply don’t have the skills to pull it off at this time. If I ever do get it working, I expect it to be on a tiny, 64-bit OS for the Pi written just for video chat and nothing else.

On an unrelated note, I plan on separating the Pi itself from the cat deterrent(s). Setting up a little offline network connected by radio or WiFi will hopefully solve the power problems. Maybe I’ll take apart an old wireless mouse and see if I can use that.

Final Question: If you want to stretch your abilities, you should pick goals just outside of your abilities. What was the last project you tried, but found was just a little bit too far out of your reach?

A Series of Reluctant Leads

Good Morning from my Robotics Lab! This is Shadow_8472, and today, I am bringing you another fragment on my journey to get my PiCamera working in a video call. Let’s get Started.

This week had a lot of progress, but no workable results. I did a lot of research on the problem, refining my understanding of the situation, but the solution always feels an hour or two away.

To recap, work last week finished with the Discord webapp playing nicely with my Blue Yeti microphone, but it wouldn’t even acknowledge my PiCamera, even when getting into the sound and video test page.

This week, one of my early findings was a driver, Video4Linux2 (v4l2). I spent half a day believing something was wrong with it, because v4l2 looks a lot like v412, with numerals before and after a lowercase L. Long story short, after running a command:

sudo modprobe bcm2835-v4l2

The camera still didn’t work, but at least Discord tried to load an image from “Default” instead of insisting there wasn’t anything resembling a camera it could use.

Fortunately, while researching why the camera still wasn’t working, I came across a forum post where someone pointed out the command needs to be run after each reboot for the v4l2 driver to work. Other background information I picked up was that the problem might be something requesting a 0x0 feed to display.
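
If I wanted the module loaded automatically, one common approach on Raspbian would be to list it in /etc/modules; I haven’t verified this on my Pi yet, so treat it as an assumption rather than a tested fix:

```
# /etc/modules: kernel modules to load at boot time.
# Appending this line should make "sudo modprobe bcm2835-v4l2" unnecessary.
bcm2835-v4l2
```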

I also noticed for the first time that the “raspistill” program’s output covered parts of the black border Raspbian leaves. It’s as if it’s on a separate layer in front of the OS’s video output. One of the options included making the camera output transparent, so I used that freedom of vision to try to share screens, but any screen sharing captured the desktop layer only.
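
For reference, the preview options I was poking at look roughly like this, going by raspistill’s help text (`-op` sets preview opacity from 0 to 255, and `-t 0` disables the timeout so the preview stays up):

```
raspistill -t 0 -op 128   # keep the preview up at roughly half opacity
```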

I had a bunch of redundant/unfruitful research for a couple days, and I eventually ended up in the workshop again. I want to say I made about half my progress this week right there. It took a little while to review my progress so far, but once everyone understood things, we started narrowing down the source of the problem further. Many of my hunches about where the problem wasn’t were confirmed, though I did learn a bit about the Chromium hardware-permission system and how some software repositories are left in a power-saving state while not actively being used.

We managed to confirm the driver was working after arguing with a Python prompt. It still requires adjustment: it threw up what looked like a blotchy, pixel-for-pixel view of the camera, and it was zoomed in. It covered the black border entirely, so I’m hoping this isn’t just an educational dead end.

Final Question: Most of my unfruitful research is quite unmemorable. This was just one of those weeks where my early work was mentally lumped in with last week’s work until I reviewed my actual blog post and realized it didn’t cover any of it. When was the last time you forgot how recent your recent progress was?