Site Maintenance: Forums and Privacy Policies

Good Morning from my Robotics Lab! This is Shadow_8472 with a side project of the week. Let’s get started!

bbPress Forums

This week, I took another look at this website. If I’m going to promote open source and privacy-respecting software whenever possible, I had best be prepared to lead by example. WPForums Lite was the forums plugin I was shepherded into, along with MonsterInsights and OptinMonster, two plugins I do not feel OK running. As effective as I hear they can be, they’re not worth compromising my principles. I’m looking into alternatives.

Finding an analytics plugin is proving tricky – a side project for another time. Long story short: the most promising alternative needs me to arrange off-site data storage.

Forums, on the other hand, are now open for business. I’m currently working with bbPress, a forums plugin developed by the WordPress team under the same open source principles. You may have also noticed a new navigation bar, which I needed in order to expose a test forum. I expect it to change as I learn more about web development. Special thanks to Commander Stryker for helping me bug test it.

User Data Privacy

Since I started this blog, legislation regulating the collection and sale of user data has been passed in the EU (GDPR) and California (CCPA). I believe neither applies to me – I’m not advertising in the EU, and I’m not selling anything, respectively – but I want to build my site to respect the principles of privacy anyway.

This week also saw me working over the privacy policy with another plugin. I believe I’ve gotten it to a passable state, but it’s still subject to further development.

Final Question

Do you know of a good open-source, privacy-respecting analytics plugin for WordPress?

I look forward to hearing your answers in the comments below or on my Discord server.

A Brief Announcement

Good Morning from my Robotics Lab! This is Shadow_8472, and today I would like to talk. Let’s get started!

I want to interact with my readers and grow the community. That’s why I’m launching Shadow’s Robotics Lab on Discord, where you can talk about your tech projects, exchange help with Linux support, or just hang out.

On the flip side, I’ll be saving my main projects for the last Monday of every month, with side projects filling the other weeks.

If you want the nitty-gritty details about all my projects as I work on them, then join Shadow_8472’s Robotics Lab on Discord.

Final Question

How long have you been a reader?

I look forward to reading your answer on my Discord server.

NGINX: Project Redirect

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am configuring NGINX on my home server, ButtonMash. Let’s get started!

NGINX Reinstall

I sat down this week to work on NGINX, and any tutorials I tried – even ones specifically recommended to me – kept referencing files that didn’t exist. Staff member Pokey on the bitmerge Discord server suggested I check my version. The Rocky Linux 8 repositories are distributing version 1.14.1, whereas the latest version on the NGINX website is 1.21.6 [1]. A little further research revealed that one of those noted differences – the sites-available and sites-enabled directories – is commonly added by distribution packages and is not part of the upstream codebase.

A little research turned up a quick article on ServerDiary [2]. It turns out NGINX maintains its own repositories with stable and mainline releases of their software for Red Hat and CentOS – the latter of which I am trusting to stay open for the time being. The directions were clear but brief, and while I followed them to enable the mainline release, I’m trying to admin ButtonMash as if it were a production environment, so I reverted to the stable version. In addition to the commands found on ServerDiary, I worked out this one for myself after trying to enable nginx-stable:

sudo yum-config-manager --disable nginx-mainline
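For my own reference, here is roughly the sequence I would use to double-check which NGINX repo is active and what actually got installed. It assumes the nginx.org repo file from the ServerDiary article is already in place under /etc/yum.repos.d/:

sudo yum install yum-utils                      # provides yum-config-manager
sudo yum-config-manager --enable nginx-stable   # make sure the stable repo is on
yum repolist enabled | grep nginx               # confirm only nginx-stable shows up
sudo yum install nginx                          # install NGINX from the enabled repo
nginx -v                                        # confirm which version actually landed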

Technology Backbone

I tried configuring NGINX to blindly pass HTTPS packets from the incoming IP, but it was more trouble than it was worth. I kept getting SSL_ERROR_RX_RECORD_TOO_LONG, which – so far as I can tell – comes up when a browser expecting an HTTPS connection is offered a plain HTTP one instead.
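For anyone curious, the blind pass-through I was attempting looks roughly like the NGINX stream block below. This is only a sketch of the approach, not a config I got working, and the backend address is hypothetical:

# Goes at the top level of nginx.conf; the stream context sits alongside http, not inside it.
stream {
    server {
        listen 443;                    # accept raw TCP on the HTTPS port
        proxy_pass 192.168.0.50:443;   # hand the encrypted bytes to the backend untouched
    }
}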

There’s probably a way to convince it to cooperate. My productive time is probably better spent focusing directly on my technology backbone though. I expect I’ll learn the same skills at about the same time either way, but I won’t be redoing the project three or four times this way.

I registered a domain name through NameSilo. A new interface on some pages aside, the process was as straightforward as the previous times I’ve done it. The real payoff will come when I master subdomains and can use the same domain name for services intended for family use.

Going forward, my next immediate step is obtaining a security certificate from Let’s Encrypt for that domain name. Only then will I have what I need to set up NGINX properly.
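One common route for that step is Certbot with its NGINX plugin. A rough sketch of the commands I expect to need – the package names are as I understand them for RHEL-family systems, and the domain is a placeholder:

sudo dnf install epel-release                    # Certbot is distributed through EPEL
sudo dnf install certbot python3-certbot-nginx
sudo certbot --nginx -d example.com              # request a certificate and wire it into NGINX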

3D Printing Corner

Work here was slow again. For anyone not in the know: I printed a Sonic figure split into front and back halves, which I intend to glue together and paint. To practice, I made myself eight calibration cubes to glue together. I managed to locate some sandpaper and sanded down the flat surfaces of four of the cubes.

Once I had a feel for how the PLA reacted to being sanded, I sanded Sonic and worked on that brim I totally shouldn’t have printed. The two halves sat more flush in a dry fit afterward.

Side Projects

My father’s Debian install keeps freezing, only recovering when he sends a system request to kill everything in his session, returning him to a login screen. As this is an infrequent occurrence, I taught him a little about SSH. I had him sit down at DerpyChips, and after telling Derpy to forget about the Mint install at the same IP, he logged in to his machine remotely and practiced using top to find a process ID (PID) and kill problematic programs with a little more finesse. While I’m not looking forward to the next crash, we have practiced for when it happens, and I have a good feeling about the new procedure.
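Roughly the drill we practiced – the hostname, username, and PID are placeholders:

ssh leo@debian-box     # log in from DerpyChips
top                    # find the runaway process, note its PID, then press q
kill 12345             # ask the process to exit politely (SIGTERM)
kill -9 12345          # last resort: force it (SIGKILL)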

My laptop power cord has been slipping away from its 19.5 volt end, exposing the three jacketed wires inside it. As much as I’d have liked to push it back up, wires don’t have good compression strength. Instead, I pinched the cord halfway to the transformer and ran my fingers toward the end, stretching the outer casing and encouraging the cord back into a more presentable condition.

A while back, we made our own network cable to serve ButtonMash and now GoldenOakLibry. This week, it started dropping its connection. I was able to convince it to come back online for a few days, but almost as if it knew I needed to pad my post, it fizzled again, seemingly for good. A connectivity tester showed a few of its connections failing. We found another cable that happened to be long enough, and I was able to reach ButtonMash again. GoldenOakLibry needed to be disconnected from the local switch and reconnected before it came back online.

Final Question

I have several large tasks ahead with no immediate payout expected. These projects will likely be research-heavy, making for less interesting reading. What kinds of side projects should I work on?

Works Cited

[1] F5 Networks, Inc., nginx.com, [Online]. Available: https://www.nginx.com/. [Accessed: Mar. 21, 2022].

[2] Serverdiary, “How to install latest official Nginx on Centos 8 / RHEL 8,” serverdiary.com, [Online]. Available: https://serverdiary.com/linux/how-to-install-latest-official-nginx-on-centos-8-rhel-8/. [Accessed: Mar. 21, 2022].

Exporting a Discord Server

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am learning about Discord Chat Exporter so I can archive an old role play I was a part of in 2017. Let’s get started!

Vanishing Data Only Hosted Elsewhere

Discord is a communications platform built around the idea of [guild] servers, channels, and posts. Guild servers are hosted exclusively by the parent company, leaving you relying on them to maintain their physical servers. While they provide easy access to chat history going back years and years, they do not provide the tools to extract that history in case you are one day unable to access the service.

Around the middle of 2017, I met a very good friend on the PonyvilleSquare forums. He was advertising a My Little Pony role play, which I joined. We moved the game to Discord at my request. It was a good thing, too, because the forums were defunct within a week or two. I lost a minor detail or two from character creation, but we went on to have one of the best games I’ve ever been a part of. I liked the story well enough to go through it again, polishing it into more of a novel format with all the original players’ blessings. It’s always bothered me, though, that I could lose the whole thing should Discord either go out of business or deplatform me for wrongthink.

Downloading a Server

Discord is rolling out some new rules at the end of the month, and I don’t know whether everyone in my circle of friends will still be welcome when they hit. Ideally, we could transition to a decentralized platform – something that operates like e-mail does – where we are in complete control of our live chat server. [Matrix] fits my criteria for this end goal.

What better way to learn about Matrix than to find some tool to migrate a server I don’t want to lose? A Discord bot only needs API access to see a server’s history, and it shouldn’t be impossible to record that data in a format Matrix clients can accept. In fact, one of the key talking points of Matrix is its ability to bridge to one or more chat platforms while being as inconspicuous as reasonably possible on each platform. However, of the bridges I found, the only one that mentioned history listed it as not-a-feature. If there is a tool to directly migrate from Discord to Matrix, I either didn’t find it or didn’t recognize it.

Instead, I settled on another tool to archive our MLP role play: Discord Chat Exporter [1]. Discord Chat Exporter reads Discord channels through a bot or user token and reproduces them in a variety of static formats, including HTML, plaintext, and a couple of others. I found their OCI “Docker” container worked very nicely. Their documentation is clear, though I could nitpick how it didn’t like where I put the --media flag on my first try at downloading embedded pictures and the like.
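For anyone wanting to try it, a run looks roughly like the line below. The token, channel ID, and output path are placeholders, and the flags are as I understand them from the project’s documentation, so check there before copying anything:

podman run --rm -v "$PWD/exports:/out:Z" tyrrrz/discordchatexporter:stable \
    export -t BOT_TOKEN -c CHANNEL_ID -f HtmlDark -o /out/ --media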

3D Printing Corner

I’ve been trying to print a figure of Sonic I found on Thingiverse for a few weeks now. I was having a lot of trouble with the first layer sticking in previous weeks, but I finally added a bed-leveling print to my routine. Sonic himself is split front to back and arranged so he only needs minimal supports. Along the way, I tried slowing the first layer way down and adding this ugly brim I’m having trouble removing. I’ll get it next week when I glue the halves together and sand it.

Side Project

My earliest working NFS automounts were entries in the File System TABle (fstab) on my laptop. Later, I learned to mount file shares on demand with systemd. My laptop spends time off my home network, and without a timeout in my settings, I’m left with extended boot times and programs hanging until I get home and they can finish saving. Today, I copied the relevant unit files to a directory on GoldenOakLibry and from there over to my laptop, adding a 5-second timeout as suggested in the awesome tutorial I originally followed, by Ray Lyon on Ray Against the Machine [2].
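For the curious, here is the general shape of the unit pair involved. The names, paths, and server address are hypothetical stand-ins, and the 5-second timeout shows up as TimeoutSec on the mount unit; Ray Lyon’s post [2] has the real walkthrough.

# /etc/systemd/system/mnt-library.mount
# (a mount unit's file name must match its mount point)
[Unit]
Description=NFS share from GoldenOakLibry

[Mount]
# Hypothetical server address and export path
What=192.168.0.60:/export/library
Where=/mnt/library
Type=nfs
# Give up quickly when the laptop is away from the home network
TimeoutSec=5

# /etc/systemd/system/mnt-library.automount
[Unit]
Description=Automount the GoldenOakLibry share on first access

[Automount]
Where=/mnt/library
# Unmount again after ten idle minutes
TimeoutIdleSec=600

[Install]
WantedBy=multi-user.target

After a sudo systemctl daemon-reload, the pair is switched on with sudo systemctl enable --now mnt-library.automount.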

Takeaway

Imperfect solutions that achieve most of your goals are often a reasonable compromise when a full solution is a lot harder to research or even get around to doing. I’m glad I now have my MLP game properly downloaded, but even better would be the ability to import everything from a Discord server on over to Matrix.

Final Question

Do you know of a tool or set of tools that can migrate a Discord server to Matrix?

Works Cited

[1] Tyrrrz, “DiscordChatExporter” github.com, [Online]. Available: https://github.com/Tyrrrz/DiscordChatExporter. [Accessed Mar. 7, 2022].

[2] R. Lyon, “On-Demand NFS and Samba Connections in Linux with Systemd Automount,” Ray Against the Machine, Oct. 7, 2020. (Edited Aug. 26, 2021). [Online]. Available: https://rayagainstthemachine.net/linux%20administration/systemd-automount/. [Accessed Mar. 7, 2022].

My Discord Bot Understands Strange Dice

Good Morning from my Robotics Lab! This is Shadow_8472, and today I’m coding my Discord dice bot up to a practical state. Let’s get started!

Discord Bot Communication Skills

When I last wrote about this bot, I described my development environment: a custom OCI container with Python and Discord.py installed, plus a directory mounted from the host machine. To update the bot, I save the new source file on the host and restart the bot from within the container.
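As a rough sketch of that loop – the container and file names here are hypothetical, not my actual ones:

podman exec -it dicebot-dev bash    # hop into the running development container
python3 /bot/dicebot.py             # relaunch the bot after saving changes on the host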

Programming is boring to describe, so I’ll spare myself writing all the details. Long story short, I needed to translate user requests into a form the core function understood. I used Regular Expressions (regex) both to separate intended inputs from potential garbage and to extract the kind and number of dice to roll. regex101.com [1] was a great help in testing these expressions.
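To give a flavor of the approach, here is a simplified stand-in for that parsing step. It is not the bot’s actual pattern, just the same idea in miniature:

import re

# Matches requests like "d6", "3d20", or "2d10": optional count, a "d", then the die size.
DICE_PATTERN = re.compile(r"^(\d*)d(\d+)$")

def parse_roll(text):
    match = DICE_PATTERN.match(text.strip().lower())
    if match is None:
        return None                    # garbage in, None out
    count = int(match.group(1) or 1)   # a bare "d20" means one die
    sides = int(match.group(2))
    return count, sides

print(parse_roll("3d20"))    # (3, 20)
print(parse_roll("banana"))  # None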

On one minor note: I got the bot into a Git repository, but I am nothing but a bumbling beginner.

My goal for next time is to produce a container with my code internalized. I’m also scheming something that will let my group know when I’m “playing around with the source code” or the bot is running from within a production container.
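My working theory for that internalized container looks something like the Containerfile below. The file names are placeholders and I have not built this yet, so treat it as a sketch:

FROM python:3.10-slim
RUN pip install --no-cache-dir discord.py
COPY dicebot.py /app/dicebot.py
CMD ["python3", "/app/dicebot.py"]

Building it would then be a matter of something like podman build -t dicebot . followed by a podman run with the bot’s token supplied from outside the image.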

Commenting System

Any good program has comments left by its developer – notes for future work on the project. Wise use of comments can speed up development by reminding a programmer where work left off or by clearly marking off major sections. Working on the command line with the Nano text editor, syntax is nicely color-highlighted, but with all the comments I’m leaving, a simple note to myself looks no different from commented-out code blocks yet to be implemented or scheduled for deletion.

#A simple comment is useful for a quick note.

Basic comments can be anything from programmer notes on how some code works to incomplete or defunct chunks of code that shouldn’t be executed yet/anymore. Programmers must use them unless the objective is for God alone to know how their programs work.

###################################
#A title marks out major sections.#
###################################

Note: In the editor, I’m using “verse” formatting to represent code. In a fixed width text like you will find in a terminal, the lines should be the same length.

Editors that color code parts of your code are nice, but source files begin to look the same after several screens of scrolling. Marking off large groups of functions keeps them organized like chapters in a book. I like to spam a bunch of comment characters (# in Python) for extra visibility while flipping between places in the code.

################################################
#A broken pattern marks out scheduled changes.#
The quick, brown fox jumped over the lazy dog.
The little dog laughed to see such a sight.
################################################
#The quick, brown fox got distracted.
#The cow jumped over the candlestick.
################################################
################################################

I’d be surprised if I’m the first to invent a way to partition off major blocks of code for redevelopment. Not only can I quickly revert to old code without much thought, it’s also a landmark showing which sections are actively being worked on.

3D Printing Corner

It’s been a few weeks since I said I wanted to print something each week. This week is no different, but at least I tried.

Gluing a 3D printed object together from smaller parts lets you overcome the print volume limitations of your printer – a valuable skill to have. As I was browsing Thingiverse for something Sonic the Hedgehog related to print, I found a model of Sonic that someone had split front to back so it requires almost no support material.

Slic3r was something else. I had a difficult time getting both the front and back on the plate at the same time while avoiding the damaged section. Auto-generated supports ended up in more places than they needed to be; Sonic’s pupil even needlessly generated one with a corner sticking out of his face. I had to shift the halves around multiple times to fit the skirt within the print area while still avoiding the damaged spot in the middle – a challenge, but one I managed nonetheless. Setting supports to snug cleaned up most of the wayward supports, but I had to paint on support blockers around Sonic’s cuffs and fingers.

Optimizing for few supports in this way has a major tradeoff: a much more difficult first layer. I didn’t have the patience, so I left the project for another week.

Takeaway

Basic projects are often less exciting than any creative techniques devised to work on them.

Final Question

How have you used comments in creative ways?

Work Cited

[1] F. Dib, regex101.com, [Online]. Available: https://regex101.com/. [Accessed: Feb. 28, 2022].

Emergency Power Reserves

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am adding an uninterruptible power supply to ButtonMash instead of actually finishing the Bitwarden job because SSL certificates are very confusing to research. Let’s get started!

Computers 101

Here is the first lesson in computer literacy: power in, stuff happens. But unlike a simple light bulb, a computer is an endlessly shifting labyrinth of electronic circuits. Complex patterns of information flash back and forth at near light speed within any given computer, and networked computers form even more complex circuits. Everything is timed on the scale of nanoseconds or tighter. Power a node off suddenly, and those patterns get interrupted. That is why it’s important to use proper shutdown procedures.

But power outages are almost always a surprise. Planned maintenance or safety shutoffs can happen, but more often than not, the power company is just as surprised as its customers left in the dark. Backup systems are about as varied as their applications. I’ve heard of an old diesel locomotive engine rigged to jump start power generation for an entire college campus. Combustion-powered generators are popular at many scales. The hardware I’m dealing with today is an Uninterruptible Power Supply (UPS) from CyberPower.

First Impressions

This particular unit is essentially a battery with an inverter and some electronics to monitor everything. Going off just the box and the literature it shipped with, it can power a small network device for several hours or a gaming console and TV for half an hour. Just don’t go trying to vacuum with it or run a space heater off it.

The unit has a USB-B connector in back, but little more is said about it than that it carries data. I had to do a bunch of background research online to establish beyond a shadow of a doubt that my particular model could communicate a power outage so the system can gracefully shut itself down. Otherwise, it’s useless for long or unattended outages.

I did not find the CyberPower website helpful. The Linux driver section kept listing RHEL/CentOS 7, while ButtonMash is running a RHEL 8 compatible system. By the time I got around to seeking personal assistance finding the right driver, we had lost the receipt and didn’t know the exact date of purchase, which was the only field the help form labeled as required.

I borrowed a phone and called in. After I explained my situation, I believe in retrospect that I was shepherded into the part of the help script that pointed me past the one spot where the RHEL driver list included version 8, and by then I had forgotten that I was still wondering whether my specific model could actually use the driver. Assuming it could, it wasn’t clear whether the driver could hook into Cockpit or would want an open port all to itself.

Open Source Driver: NUT

CyberPower’s official support may be official, but it left a bad impression. When I saw that the driver needed root privileges, I decided to research an open source driver. Not many appear to exist. I reached out to my usual Linux community support channel, but it was quiet all day. I eventually landed on a mention of Network UPS Tools (NUT).

Unfortunately, my research here is incomplete. My impression is that they’ve been around for a while and that I will need to familiarize myself with some less beginner-friendly Linux rituals to operate effectively there. NUT appears to be exclusively distributed as source code signed with GPG. I haven’t mastered either skill. Furthermore, even if I do get the driver working, my specific model isn’t on their known working list. In any case, I will need to reach out to this specific community at some point if I want to see it working.
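If I do go the NUT route, my understanding is that the heart of the setup is a short entry in ups.conf like the one below. The section name is made up, the driver is my guess for a USB CyberPower unit, and none of it is verified against my specific model yet:

# /etc/ups/ups.conf (the path may vary by install)
[buttonmash-ups]
    driver = usbhid-ups
    port = auto
    desc = "CyberPower UPS feeding ButtonMash"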

Side Project

My sister is into a game called Stardew Valley. It recently had a slightly clunky update that broke her modded setup, which runs through something known as SMAPI. My father and I took turns working on it. I opted to try a clean install on the virtual machine I’m setting up for her, but I ran into a problem when the virtual drive filled its physical space without dynamically expanding. I had made a copy of the whole VM, so I was able to delete the troubled clone. In the future, I should take snapshots to restore from instead.

Meanwhile, my father worked on the original Windows install. Together, we traced the problem to the wrong specific version of the .NET 5 framework being used; I took the shasum of the install file and compared it against the files available for download. We later identified the root cause as a failure to communicate on the part of the download page: the three Windows downloads looked like they could be one download each for Windows, Linux, and Mac, when the correct version was actually located in the “column” below the Linux tab’s heading.

Takeaway

It’s safe to say I’m well beyond beginner-friendly Linux territory now. If I start doing sysadmin-level stunts, I had best expect to subsist on sysadmin-grade assistance. For now, I’m happy knowing ButtonMash can survive a brief power blink and that I have a while to shut it down manually if I notice an outage as it starts.

Final Question

Has quality of help ever chased you away from a particular product?

Slow and Steady Builds a Server

Good Morning from my Robotics Lab! This is Shadow_8472 and today I am continuing the grind on my Rocky Linux server running on my ButtonMash workstation. Let’s get started!

Quality of Life

I’ve had a single-minded focus on getting Bitwarden/Vaultwarden working, but I’ve been overlooking an important aspect of running a reliable server. One evening, any new connections to ButtonMash were failing. When I investigated the next day, the system was on and the network was connected, but when I logged in at the console and pulled up the address with ip a, I found the router had incremented the IP by 1.

The task of setting up static IPs is not alien to me; on some systems, I’ve even edited the config files by hand. Cockpit has a Networking tab, though. From there, I was able to quickly worm my way into the correct network interface and configure a static IPv4 address back where it belongs. Cockpit tried to reestablish its connection with the new configuration and predictably failed, and I had to make the change permanent with a follow-up prompt.
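For the record, the same change can be made from the command line with NetworkManager’s nmcli. A sketch with a placeholder connection name and addresses:

nmcli connection show                      # find the connection name to modify (placeholder below)
sudo nmcli connection modify "enp1s0" \
    ipv4.method manual \
    ipv4.addresses 192.168.0.20/24 \
    ipv4.gateway 192.168.0.1 \
    ipv4.dns 192.168.0.1
sudo nmcli connection up "enp1s0"          # reapply the connection with the new settings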

Container Isolation

One of the draws of Podman over Docker is rootless containers. In the unlikely event someone hacks into Vaultwarden’s container and breaks out into the host system, he won’t be able to do as much damage if he finds himself in a non-root account – even better is if he finds himself in an account without sudo privileges.

With that, I added a user specifically for running Vaultwarden with sudo useradd vaultwardenUsr, and I locked it with sudo usermod -L vaultwardenUsr to block any normal login attempts. sudo su vaultwardenUsr will still let me Substitute User into the account. And of course, I found the tools to do all of that under Cockpit’s Accounts tab.

Logged in as my dedicated Vaultwarden user, I looked for the two containers listed by Cockpit’s Podman plugin, but I found nothing. As I should have put together from my background research, containers are owned by individual users much like normal files. The two images the plugin listed with “system” as their owner were actually owned and operated by root, as clarified by a test where I stepped back from administrative access to download a new container image. Novel as it is, the plugin won’t be of much help for my use case unless Cockpit lets me access it without a password.

Bringing Vaultwarden Online

podman run -d --name vaultwarden -v /home/vaultwardenUsr/vw-data/:/data/:Z -p 44300:443 vaultwarden/server:latest

This is a long, complex command that took me a while to puzzle into working order. Here it is piece by piece:

podman run

This is the base command. I used it to go from a container image to a running container.

-d

“Detach” runs containers in the background.

--name vaultwarden

By default, containers are given random, pronounceable names, so you can spin up multiple copies of the same image without giving it much thought. Here, though, it’s best to have something descriptive.

-v /home/[username]/vw-data/:/data/:Z

“Volume” mounts a directory from the host into a directory in the container. This part deviates the farthest from most Vaultwarden guides. First of all, the Docker instructions assume you’ll be mounting the directory /vw-data/ directly from the root of your file system. That directory is off limits without elevated privileges, however, so it needs a place in the user’s home directory – and a full path pointing there.

The other issue I solved involves the :Z at the end. At one point, I seemingly had everything else sorted, but when I ran podman ps to list running containers, it kept coming back empty. My online searches for Podman and Vaultwarden kept sending me back to last week’s progress-report of a post, but I eventually located a discussion on Vaultwarden’s GitHub. I didn’t do much follow-up research after I got it working, but it appears to be related to SELinux (Security-Enhanced Linux), a kernel-level security module.

-p 44300:443

“Port” forwards ports from the host machine to individual containers: host on the left and container on the right. Recommended behavior (with Docker in mind) is to forward port 80 (HTTP), but I’d rather use HTTPS on port 443. Additionally: ports 0-1023 are reserved for system use, so I added a couple zeros on the host end.

vaultwarden/server:latest

With all the flags out of the way, this is the container image being copied into an operable format.

With the container running, all I had to do was punch a hole in the firewall, and now Firefox can laugh at me for a failed secure connection. During assembly, I was able to make a non-secure HTTP connection with -p 8000:80 and bring up a Bitwarden login.
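For the record, punching that hole on a firewalld system like Rocky Linux looks something like this, with the port matching the -p flag above:

sudo firewall-cmd --permanent --add-port=44300/tcp   # open the forwarded host port
sudo firewall-cmd --reload                           # load the permanent change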

The next step is to obtain a security certificate, but that’s a topic for another week…

Side Project

When I walked my father through installing Debian/LXDE on his computer, he was left with a few specialty keys on the keyboard going unresponsive – particularly the volume up, volume down, and mute keys. The system is usable without them, but it would be nice if they worked. Originally, I assumed the problem would be somewhere within Debian itself, but when I tried a search targeting LXDE instead, I knew from the results that I had today’s side project.

As we researched, I started building a flow chart of the system from key press to volume adjustment. The keyboard transmits its key events to a USB receiver. The receiver communicates with some sort of keyboard driver, be it discrete or compiled into the Linux kernel. The kernel then passes key events to the Xorg server. From here, we used xev (X EVent) to confirm that the volume key events were all making it that far.
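The xev check itself is easy to reproduce; the key names below are the standard X keysyms for media keys:

xev -event keyboard
# Press the volume keys with the little xev window focused. A working key prints a
# KeyPress event whose keysym reads something like XF86AudioRaiseVolume,
# XF86AudioLowerVolume, or XF86AudioMute.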

At some point, PulseAudio takes over and tells the Linux kernel what to push to the speakers and how loud to play it. While testing, we cut off a song by killing and restarting PulseAudio with pulseaudio --kill and pulseaudio --start, and had to refresh the webpage to get it back.

I personally think we’ll find the problem either within Xorg or on the link between Xorg and PulseAudio. In the meantime, the search continues.

Takeaway

For both of these projects, there are ready-made solutions out there we could just copy, paste, and hope for the best with, but we’re trying to learn the innards of Linux, so it’s okay to take the extra time.

Final Question

Have you ever had to program your keyboard events by hand?

Linux 101 with Leo_8472: Part 2: Browser

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am hardening Firefox with my father on his new Debian installation. Once again, I am hands off the actual machine. Let’s get started!

The popular Chrome browser is literally made by a company whose business is studying you to feed you ads you’re vulnerable to (i.e., more likely to click on) or content it wants you to see for some other reason (a political agenda it approves of). While this is shady and dishonorable, respectively, there’s little to keep them from aiding a government they wish to appease in targeting people for any sort of “wrong” thinking.

Consider: if logging in to a site online is like showing your face, the browser you use to visit that site is your car. Just as people can learn to recognize you by your car, websites can recognize you by your browser –even if you don’t log in– by all the individually nondescript details your browser shares by default.

Firefox markets itself as a privacy-respecting browser, and while it’s a less offensive choice than the alternative, Mozilla has made some choices that chase cash, and not all of its defaults respect Grandpa Joe’s technical ignorance. The default search engine is Google. There’s a “privacy respecting” news headline service that tracks you locally on your machine (as I understand it) but ultimately records your clicks and is subject to its own political bias.

Project Progress Review

Before diving into the main payload, I did guide my father through installing Debian again over his first install – this time using the LXDE desktop environment. It’s small, and we’re planning to install a different desktop environment anyway, one that isn’t included in the installer’s default lineup.

When starting a software project, it’s often best to check for updates first. We ran sudo apt-get update – and found my father’s account wasn’t in the sudoers group. I quickly found a command to fix that and talked him through logging in as root to take care of it. Once updated, we began working over Firefox using a guide by Chris Xiao [1].
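That fix boils down to one command run as root on Debian; the username below is a placeholder:

su -                   # become root, since sudo itself wasn't usable yet
usermod -aG sudo leo   # add the account to Debian's sudo group
exit                   # drop out of root; the change takes effect at the next login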

Firefox Hardening

Firefox presents its user with an overwhelming number of options. Xiao’s guide [1] does a good job of maneuvering you through a number of options you may want to set if present, but in the end, it’s up to you to make the final call on each setting exposed to you. We neither followed all the suggested steps nor limited ourselves to them.

The guide starts off with the settings menu one might find by fiddling with the menus, where it was simple enough to read the guide and switch settings as we spotted them. It then moves on to the spooky, scary deeper settings you only change when you’re okay with potentially breaking the software. My father compared it to the registry in Windows.

Closing thought:

I made a mistake. In a coming week, I plan on moving my father over to the “testing” branch of Debian. I don’t have the exact numbers in front of me, but his Firefox version is about 15 to 20 versions behind what I’m using on Manjaro (Firefox 92.0.1).

I will need to revisit this list.

Takeaway

Online privacy is largely a fleeting fantasy these days, but I believe it’s still worth working for.

Final Question

Where do you draw the line between privacy, functionality, and security?

Work Cited

[1] C. Xiao, “Yet Another Firefox Hardening Guide,” chrisx.xyz, May 5, 2021. [Online]. Available: https://chrisx.xyz/blog/yet-another-firefox-hardening-guide/. [Accessed: Sept. 26, 2021].

By Continuing To Stay I Agree To What?!

Good Morning from my Robotics Lab! This is Shadow_8472, and today, I’ve been thinking long and hard about my policies towards current practices of data collection and how I’m going to balance privacy and functionality. Let’s get started!

Wakeup Reactions

My first reaction when thinking about data collection is that it’s all bad – overreaching companies watch where you’re going and what you’re doing, then use that data to feed you ads and filter your social media feeds so you start thinking “correctly” according to some big boss at the top. I’m scared. Some people just give up because they feel it’s too late to stop the cyberpunk dystopia we’re quickly sinking into, so it’s not worth it to even try. Others go into overdrive in the other direction and redesign their whole lifestyle around having little to no online interaction.

I’d describe myself as somewhere in the middle. I’ve tried to take a reasonable stance; I only replace digital services with less invasive alternatives when they’re otherwise up for replacement. This strict of a policy has led me to go a year without a phone and to boycott Minecraft ever since they announced mandatory Microsoft accounts. Overall, I feel a little safer for my efforts, but I’d feel better if I could convince the rest of my family to take similar steps as they are able. At the same time, I feel a little more isolated than I should be.

Data Collection: the Good and the Bad

The truth about online data collection is that some functions logically won’t work if it doesn’t happen at all. Posting to a comments section implies permission to display said comment publicly. It may also subject it to moderation and inclusion in site backups (it may show up again if the site is rolled back to before a user-end deletion). Sharing a photo may additionally require giving the service permission to “digitally manipulate” it to make a low resolution thumbnail. Ordering up a video from a streaming service requires it to know what you’re watching. All these things are fine, even though you are technically forgoing elements of privacy to allow things to work.

The bulk of data collection exists in a huge gray zone between acceptable and unacceptable. That video streaming service can make more appealing suggestions by building up a profile on you: observe that all your peers are into MoneyBuck’s Capitalist Island Adventure let’s plays (not a real game so far as I can tell), and guess that you might be too! The same technology and permissions can be used to track your interest in candidates for the Yellow-flag political party. The way things are set up, it only takes a few lines of fine print you will never see – unless you actively look for them – before they can sell their profile on you to some other company that prefers you vote Orange-flag. And with another line or two of fine text you won’t likely ever see or understand, that company will slowly change what it shows you to make you more sympathetic to the Orange-flag party while it hides pro-Yellow-flag material from people already in the Orange-flag camp.

In extreme, real-life cases, they’re essentially digitally harvesting your organs, profiling you for profit in a not-illegal black market. That model of you is perfectly legal because you technically gave consent in the fine print of a web page they’ll only serve up when specifically asked.

Struggles of the Privacy Resistance Movement

Privacy tools exist. Many people give up hope anyway, presumably because their go-to information sites have often built their business models around the user data market – places with a vested interest in hiding tools designed to help undermine abusive data collection policies.

Some jurisdictions have passed laws requiring opt-out options. It’s a start, but in my experience, there’s still a lot of room for sites and services to avoid “unintentionally” informing users of their rights and/or to make it annoying or inconvenient for users wishing to exercise them. More than once, I’ve seen an opt-out popup revert on a page refresh. For user accounts, expect instructions for finding the opt-out controls to be buried behind at least a few layers of “Please see our X for more information” worth of tiny type.

Minecraft and Microsoft Accounts

I don’t think of Microsoft in a positive light anymore. Their announcement trailer for account migration was honestly more than a little offensive because it left me with the message that my concerns in particular could be bribed away with a cape. They can burn my cape for all I care. It wouldn’t look good with my player skin anyway. I’ve been vocal about rejecting the terms of service for a Microsoft account.

However, I’d be lying if I said I didn’t miss playing Minecraft. If I want to keep playing, there will come a time when I’ll need to migrate. I’m still not happy about getting an account, but how much of a privacy backslide is it really? What steps can I take to tell them, “I said, ‘No!’” in a way they will understand?

I reached out to the folks at tos;dr [Terms of Service; Didn’t Read]. Their whole thing is annotating terms of service documents and highlighting potentially objectionable or commendable phrases they find embedded in the legalese. But they only have so many resources. In the end, they’re good for sounding the alarm and not so good for pressing forward. I wasn’t able to get anything conclusive straight from them.

As migration propaganda was pushed through my family’s Minecraft launchers, I managed to talk my family into slogging through the tiny type. A lot of it sounded pretty good. Sugary even. Alarmingly calm. One document led to another led to three more and the web went on. It was not a fun read. Only as we were coming to the end of a document chain did we find that this Privacy Dashboard they mentioned several times [1] was effectively a partial red herring; I’ll need to talk to a human being somewhere in their system to exercise my full rights. {Writer’s note: I really wanted a direct quote, but it’s late, and “Learn more” is the enemy of “Find on Page.” Consequently, I don’t know if I’m on the right page.} With all the tricks I’ve seen so far, I half-expect their staffed privacy department to be severely understaffed to the point where it barely exists, legally speaking.

I may not agree with everything I’ve seen, but there’s nothing stopping me from minimizing my contact with them. I’ll only make an account once it’s absolutely necessary. I’ll only ever play on Linux, because it doesn’t have all the mysterious collection nodes that require additional data collection policies for Windows, Mac, and some other names I don’t recognize or remember. The official launcher, with its ads for their other games, will only be my last-resort option after days of trying to get a chosen third-party launcher working. Finally, once my account is made, I’ll lock it down with their Privacy Dashboard for what that’s worth, while I at least try to go the final step and clamp it down hard through a live agent. If I am going to accept their terms of service, it will be on MY terms.

Takeaway

Too few people care about data collection practices, and the people who do are burdened with the alarmingly rational feeling of Big Tech watching them, even if they make a hobby or career out of researching how to be as nondescript online as they would be while out walking a dog. Privacy should not be something Grandpa Joe only worries about after little Jimmy the computer nerd tips him off. On the other hand, it’s easy to get overenthusiastic about reclaiming privacy and cut yourself off socially instead. Each person has his or her own balance point, but it’s all too often privacy by intervention instead of privacy by default.

Final Question

Have you ever boycotted anything to seemingly no avail?

Work Cited

[1] Microsoft, “Microsoft Privacy Statement,” privacy.microsoft.com, July 2021. [Online]. Available: https://privacy.microsoft.com/en-us/privacystatement. [Accessed: Sept. 6, 2021].

Removing Backgrounds With GIMP

Good Morning from my Robotics Lab! This is Shadow_8472, and today I am editing pictures for my church’s photo directory. Let’s get started!

My Background

I first learned the basics of photo manipulation in high school using Photoshop. Pictures can be split apart, remixed with other pictures or original (hand-drawn) elements using a wide variety of tools, and sent back out for use elsewhere. The whole operation is organized as a series of layers, each with its own operations.

Photoshop is not a bad program, but ever since Adobe moved their product to a subscription model, the service has improved mainly for people who use their products consistently. For people like me who only need it once every few years and at a moment’s notice, monthly or yearly payments are a terrible investment. I own an old copy of Photoshop, but even then, the license is a pain to move from computer to computer – and if memory serves, it’s limited in the total number of computers it can ever rest on.

GIMP (GNU Image Manipulation Program) is a comparable free and open source alternative. In true Linux fashion, it focuses on a core feature set that serves the vast majority of users’ needs, while specialized functionality can be added with extensions. Contrast that with Photoshop, which loads every tool imaginable for users who won’t touch more than a quarter of them. In short: Photoshop has more polish, but GIMP is easier on system resources.

Selecting and Removing a Background

My church produces a photo directory every few years. Families grow, and pictures need updating. I was given a number of pictures and was expected to remove their backgrounds or else turn them white. Photoshop’s tool of choice would have been the Quick Select tool, but GIMP won’t have a proper counterpart for this simple tool until GIMP 3.0. Instead, I taught myself the Foreground Select tool, which I would describe as resistant to learning by trial and error.

The Foreground Select tool has multiple phases. In GIMP 2.10 specifically, you start with a selection loop, overestimating your subject. You can use line segments or hand-drawn borders, whichever you deem most reasonable at the time. Endpoints remain freely adjustable once you’ve closed the loop.

Hitting Enter puts you in phase 2. Here, the whole picture takes on a blue tint (the color is adjustable, but I didn’t learn how). The goal is to paint in your clear foreground, a heavily tinted background, and a lightly tinted buffer between them called “unknown” or “unknown pixels.” The tool options let you switch which zone you’re painting.

Here, more than anywhere else, the tool is particularly picky about staying on task. I lost about two hours total because I did something just a little too unrelated and the Foreground Select tool sent me back to phase 1 with no way to undo. It is best to save other tasks in the workflow for later, like adding an alpha (transparency) channel or duplicating/hiding the original picture layer for quick reference.

Phase 3 will “[calculate] the alpha value of unknown pixels,” referring to your buffer zone. It provides a preview of exactly what you’re selecting and lets you make further adjustments, but it recalculates everything with each paintbrush stroke you make. I recommend finding the small dialogue box that lets you disable the preview, effectively reverting to the more economical phase 2. By going back and forth between these two phases, you can tell where the algorithm understands your selection and where it needs more help. Hitting Enter once more while in phase 3 finalizes the selection.

Training Family

One of the more rewarding parts of this experience was teaching my father this tool, though it came with its own unique challenges. At one point, he was getting frustrated trying to make the tool cooperate. I had worked with it on two or three computers, but his description of what was happening sounded alien to my experience.

I took a look at his workflow, and GIMP was skipping straight from what I’m calling phase 1 to phase 3, forcing him to use long strokes or face tons of extra recalculation time. Long story short: while I was trying to make his tool icons larger, I found he was on GIMP 2.8, the latest available in the Linux Mint repositories. The GIMP download page directed me to a GIMP 2.10 Flatpak installer. The system still prefers GIMP 2.8 when unspecified, but he figured out how to proceed, so that’s a tech support topic for another day.

Color to Black and White

Later on, the software for the directory was making some pictures way too dark. Photoshop has a layer filter for this exact use case, with six sliders to adjust different color ranges. GIMP doesn’t include layer filters, but I was able to make a crude approximation of this tool. Just know there may be a better way to do this that I don’t know about – plugin or otherwise.

To make the perfect black and white version of a color picture, I would start by adding a white background and merging layers if necessary. Next, I menued over to an option to decompose the color channels to a new document. With each layer in grayscale, I added another white background and used Color to Alpha on the red, green, and blue color channel layers. I then adjusted the opacity of each layer until I had a result I was happy with.

Each picture was its own challenge. Most times, I would have one layer carrying most of the data while reducing the other two dramatically. Too little color data, and the picture would fade. Too much, and it would go dark. I targeted a mix where the picture would pop and my eye would relax a little while looking at it.

Takeaway

Both GIMP and Photoshop are wonderful pieces of software in their own right. I’ve had a class on Photoshop, so of course I’m going to find that program easier to navigate, even if it’s been years. Despite my slim requirements for advanced photo editing, GIMP’s slim feature set slimmed out a feature or two I miss.

Final Question

Have you taught yourself both Photoshop and GIMP? Which is harder to learn?