Import data to PowertoysRunTOTP from Aegis Authenticator

This article explains how to export plain JSON data from Aegis Authenticator and convert it into the format required by PowertoysRunTOTP, so you can import your two-factor authentication (2FA) accounts from Aegis into the PowerToys Run TOTP extension.

Warning: Do not keep the plain JSON file on your computer for an extended period. Store it in encrypted storage such as pCloud Crypto, or compress it with 7-Zip and protect the archive with a strong password.
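
For example, if the 7-Zip command-line tool is on your PATH, you can create a password-protected archive like this (the file names here are placeholders; -p prompts for a password and -mhe=on also encrypts the file names inside the archive):

7z a -p -mhe=on aegis_export.7z "C:\path\to\aegis_export.json"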

Step 1: Export Plain JSON from Aegis Authenticator

First, export your 2FA account data from Aegis Authenticator. Ensure the exported file is in plain JSON format and save it to a secure location, such as C:\path\to\aegis_export.json.
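
For reference, the parts of the Aegis export that the conversion script below relies on look roughly like this (a trimmed, made-up example; real exports contain additional fields such as uuid and icon data):

{
  "db": {
    "entries": [
      {
        "type": "totp",
        "name": "alice@example.com",
        "issuer": "ExampleService",
        "info": {
          "secret": "JBSWY3DPEHPK3PXP",
          "algo": "SHA1",
          "digits": 6,
          "period": 30
        }
      }
    ]
  }
}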

Step 2: Write a PowerShell Script

Write a PowerShell script to convert the exported Aegis JSON file into the format required by PowertoysRunTOTP. Below is the complete script, which you can copy and paste into Notepad and save as a .ps1 file, for example, convert_aegis_to_powertoysrun.ps1.

$inputFilePath = "P:\Crypto Folder\aegis.json"
$outputFilePath = "$env:LOCALAPPDATA\Microsoft\PowerToys\PowerToys Run\Settings\Plugins\Community.PowerToys.Run.Plugin.TOTP\OTPList.json_new"
try {
    # Read the Aegis JSON file and ensure it uses UTF-8 encoding
    $jsonContent = Get-Content -Raw -Path $inputFilePath -Encoding UTF8

    # Check if the JSON file is empty
    if ($jsonContent -eq $null -or $jsonContent.Trim() -eq "") {
        throw "The Aegis JSON file is empty or contains no content"
    }

    try {
        # Parse the JSON file
        $aegisData = $jsonContent | ConvertFrom-Json
    } catch {
        throw "JSON parsing error: $_"
    }

    # Prepare the JSON structure for PowerToysRunTOTP
    $powerToysRunTOTP = @{
        Version = 2
        Entries = @()
    }

    # Check the structure of the Aegis JSON file
    if ($aegisData.db.entries -ne $null) {
        # Iterate over Aegis entries and extract necessary data
        foreach ($entry in $aegisData.db.entries) {
            $newEntry = @{
                Name = "$($entry.issuer): $($entry.name)"
                Key = $entry.info.secret
                IsEncrypted = $false
            }
            $powerToysRunTOTP.Entries += $newEntry
        }
    } else {
        throw "Entries in the Aegis JSON file are empty or not found"
    }

    # Write the converted data to the PowerToysRunTOTP JSON file
    $powerToysRunTOTP | ConvertTo-Json -Depth 3 | Set-Content -Path $outputFilePath -Encoding UTF8

    Write-Host "Aegis JSON conversion successful and saved to $outputFilePath"
} catch {
    Write-Host "An error occurred during the conversion process: $_"
}
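
Based on the structure the script builds, the converted file should look roughly like this (the values are placeholders):

{
  "Version": 2,
  "Entries": [
    {
      "Name": "ExampleService: alice@example.com",
      "Key": "JBSWY3DPEHPK3PXP",
      "IsEncrypted": false
    }
  ]
}

Note that the script writes to OTPList.json_new rather than directly to OTPList.json, so it never overwrites an existing list; check the output first, and rename or merge it into OTPList.json yourself if the plugin does not pick it up.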

Step 3: Run the PowerShell Script

Method 1: Run via Right-Click on Windows 10 or Later

  1. Ensure PowerToys is closed. This prevents the PowertoysRun OTP extension from overwriting the user-edited file during the process.
  2. Open File Explorer and locate the PowerShell script file you saved, such as convert_aegis_to_powertoysrun.ps1.
  3. Right-click the file and select "Run with PowerShell."
  4. If you see a Windows security warning, select "More info" and then click "Run anyway."

Method 2: Run Using PowerShell Command

  1. Ensure PowerToys is closed. This prevents the PowertoysRun OTP extension from overwriting the user-edited file during the process.
  2. Press Win + X and select "Windows PowerShell (Admin)" or "Windows Terminal (Admin)."
  3. In the PowerShell window, type the following command, without pressing Enter yet (there is a space after -File):
    %%%
    PowerShell -ExecutionPolicy Bypass -File
    %%%
  4. Open File Explorer and locate the PowerShell script file you saved.
  5. Drag and drop the file into the PowerShell window. This will automatically fill in the complete path of the file.
  6. Ensure the command looks like this and then press Enter to execute:
    %%%
    PowerShell -ExecutionPolicy Bypass -File "C:\path\to\convert_aegis_to_powertoysrun.ps1"
    %%%

Step 4: Verify the Import Results

  1. Open PowerToys, which will automatically start the TOTP extension.
  2. Once the PowertoysRun TOTP extension starts, it will automatically encrypt the data in the OTPList.json file.
  3. Open PowerToys Run and check if your 2FA accounts were successfully imported. If everything is correct, you should see your imported accounts and be able to use them for authentication.

Summary

Through the above steps, we successfully converted the plain JSON file exported from Aegis Authenticator and imported it into PowertoysRunTOTP. This method helps you easily manage your 2FA accounts and migrate them between different devices.
If you found this article helpful, please leave a comment, give a thumbs up, or share it with others.

If you have any suggestions, feel free to leave a comment!

WordPress Docker Maintenance and Deployment Notes

Docker Compose

If you're worried that the latest tag will pull new image versions that could break your setup, you can pin the images to specific version tags instead.
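
The compose file below keeps latest, but the image lines could be pinned like this instead (the tags here are only examples; check Docker Hub for the versions you actually want):

services:
  tsumugi-db:
    image: mariadb:10.11          # instead of mariadb:latest
  tsumugi-wordpress:
    image: wordpress:6.4-apache   # instead of wordpress:latest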

version: "3.9" # the Compose file format only goes up to 3.x; newer Docker Compose ignores this key

services:
  tsumugi-db:
    image: mariadb:latest
    volumes:
      - tsumugi-mariadb_data:/var/lib/mysql
    restart: always
    environment:
      MARIADB_ROOT_PASSWORD: your-mariadb-root-pwd
      MARIADB_DATABASE: your-wordpress-db
      MARIADB_USER: yourDbUserForWp
      MARIADB_PASSWORD: yourMariaDbPassword

  tsumugi-wordpress:
    depends_on:
      - tsumugi-db
    #links:
    #  - mariadb:mysql
    image: wordpress:latest
    volumes:
      - tsumugi-wordpress_data:/var/www/html
      - tsumugi-wordpress_php:/usr/local/etc/php

    restart: always
    environment:
      WORDPRESS_DB_HOST: tsumugi-db
      WORDPRESS_DB_USER: yourDbUserForWp
      WORDPRESS_DB_PASSWORD: yourMariaDbPassword
      WORDPRESS_DB_NAME: your-wordpress-db

  zunda-db:
    image: mariadb:latest
    volumes:
      - zundamon-mariadb_data:/var/lib/mysql
    restart: always
    environment:
      MARIADB_ROOT_PASSWORD: some-mariadb-root-pwd
      MARIADB_DATABASE: zundamon-wordpress
      MARIADB_USER: zundamochi114514
      MARIADB_PASSWORD: some-mariadb-password

  zundamon-wordpress:
    depends_on:
      - zunda-db
    image: wordpress:latest
    volumes:
      - zundamon-wordpress_data:/var/www/html
      - zundamon-wordpress_php:/usr/local/etc/php
    restart: always
    environment:
      WORDPRESS_DB_HOST: zunda-db
      WORDPRESS_DB_USER: zundamochi114514
      WORDPRESS_DB_PASSWORD: some-mariadb-password
      WORDPRESS_DB_NAME: zundamon-wordpress
      WORDPRESS_TABLE_PREFIX: wpzundamochi_

  https-portal:
    image: steveltn/https-portal:1
    ports:
      - "192.168.19.19:80:80"
      - "192.168.19.19:443:443"
    restart: always
    environment:
      DOMAINS: 'www.zundamon-kawaii.com -> http://tsumugi-wordpress:80, blog.zundamon.co.jp -> http://zundamon-wordpress:80, www.zundamon.co.jp -> http://zundamon-wordpress:80, zundamon.co.jp -> http://zundamon-wordpress:80'
      CLIENT_MAX_BODY_SIZE: 500M
      STAGE: 'production' # Don't use production until staging works
      # FORCE_RENEW: 'true'
    volumes: 
      - https-portal-data:/var/lib/https-portal

volumes:
  tsumugi-mariadb_data: {}
  tsumugi-wordpress_data: {}
  tsumugi-wordpress_php: {}
  zundamon-mariadb_data: {}
  zundamon-wordpress_data: {}
  zundamon-wordpress_php: {}
  https-portal-data: {}

Troubleshooting

  • Browser Developer Console shows 413 Request entity too large
    • The https-portal needs an environment variable adjustment:
CLIENT_MAX_BODY_SIZE: 500M

If you accidentally add a semicolon after 500M like CLIENT_MAX_BODY_SIZE: 500M;, the container will still run, but the website will not respond. Check the https-portal error.log, and you will see an error message similar to this (my volume configuration is located in the dd87****87b folder):

2024/07/19 13:52:01 [emerg] 59#59: unexpected ";" in /etc/nginx/nginx.conf:56
  • Unable to upload files larger than 2MB to WordPress
    • Since my Compose configuration maps the PHP config directory to a volume, you can create an uploads.ini file in /var/lib/docker/volumes/yourstack_zundamon-wordpress_php/_data/conf.d (Compose prefixes the volume name with the project name) with the following content:
file_uploads = On
memory_limit = 500M
upload_max_filesize = 500M
post_max_size = 500M
max_execution_time = 600
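
PHP only reads these ini files at startup, so restart the WordPress container afterwards for the new limits to take effect (the service name comes from the compose file above; adjust it to yours):

docker compose restart zundamon-wordpress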

Backup Script

#!/bin/bash

# Define variables
NFS_SERVER="192.168.x.x" # Destination Hostname
NFS_PATH="/volume1/Backup-NFS" # Destination directory
LOCAL_PATHS=(
    "/var/lib/docker/volumes/yourblog_mariadb_data/_data"
    "/var/lib/docker/volumes/yourblog_wordpress_data/_data"
#...add and adjust as needed (bash array elements are separated by whitespace, not commas)
)
MOUNT_POINT="/mnt/backup_nfs"
DATE_NOW=$(date +'%Y%m%d%H%M%S')
BACKUP_FILE="$MOUNT_POINT/web/websiteBackup_$DATE_NOW.tar.gz"

# Create mount point
mkdir -p $MOUNT_POINT

# Check if NFS is already mounted
mountpoint -q $MOUNT_POINT
if [ $? -ne 0 ]; then
  echo "Mounting NFS shared directory..."
  mount -t nfs $NFS_SERVER:$NFS_PATH $MOUNT_POINT

  if [ $? -ne 0 ]; then
    echo "Failed to mount NFS shared directory"
    exit 1
  fi
fi

# Compress and backup data
tar -czf "$BACKUP_FILE" -C / "${LOCAL_PATHS[@]}"

# Delete excess backups
find "$MOUNT_POINT" -name "websiteBackup_*.tar.gz" -type f -print | while read FILE; do
    FILE_DATE=$(basename "$FILE" | sed 's/websiteBackup_\(.*\)\.tar\.gz/\1/')
    FILE_EPOCH=$(date -d "${FILE_DATE:0:8}" +%s)
    NOW_EPOCH=$(date +%s)
    AGE=$(( (NOW_EPOCH - FILE_EPOCH) / 86400 ))

    if [ $AGE -le 7 ]; then
        # Keep one backup per day for the last 7 days
        continue
    elif [ $AGE -le 30 ]; then
        # Keep one backup per week for the last month
        FILE_DAY=$(date -d "${FILE_DATE:0:8}" +%u)
        if [ "$FILE_DAY" -eq 1 ]; then
            continue
        fi
    elif [ "$AGE" -le 365 ]; then
        # Keep one backup per month for the last year
        FILE_DAY=$(date -d "${FILE_DATE:0:8}" +%d)
        if [ "$FILE_DAY" -eq 1 ]; then
            continue
        fi
    elif [ "$AGE" -gt 365 ]; then
        # Keep one backup per year
        FILE_MONTH_DAY=$(date -d "${FILE_DATE:0:8}" +%m%d)
        if [ "$FILE_MONTH_DAY" = "0101" ]; then
            continue
        fi
    fi

    # Delete files that do not meet the retention rules
    rm -f "$FILE"
done
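
For reference, restoring from one of these archives is roughly the reverse. This is only a sketch: run it from the compose project directory, and the archive name below is just a placeholder:

# stop the stack so nothing writes to the databases during the restore
docker compose down
# the archive stores paths relative to /, so extract back onto /
tar -xzf /mnt/backup_nfs/web/websiteBackup_20240719030000.tar.gz -C /
docker compose up -d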

Schedule the script with crontab -e (I use micro as the editor).
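
A minimal example, assuming the script above is saved as /root/backup_to_nfs.sh and should run daily at 03:00 (the path, schedule, and log file are only placeholders):

# open root's crontab in micro instead of the default editor
EDITOR=micro crontab -e

# then add a line like this to run the backup every day at 03:00
0 3 * * * /root/backup_to_nfs.sh >> /var/log/backup_to_nfs.log 2>&1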

How to Fix the Black Screen Issue When Switching to Desktop Mode on Steam Deck

Running into a black screen when switching to Desktop Mode on your Steam Deck is frustrating, but there is a simple fix. Here is a step-by-step guide:

Solution

  1. Switch Accounts:

    • First, log in to a new Steam account through the game interface.
    • Use this new account to switch to Desktop Mode.
    • If the switch is successful, delete all files (excluding folders) located in /home/deck/.local/share/kscreen and /home/deck/.local/share/kscreen/outputs.
    • For safety, you may choose to cut and back up these files to another location before deleting them from the original location.
    • After backing up, switch back to Game Mode, log in to the account that had the issue, and try switching to Desktop Mode again. The problem should now be resolved.
  2. For Users with Decky Loader Framework Installed:

    • Ensure you have the Decky Loader framework installed and have set a root password in advance.
    • Install the Decky Terminal plugin.
    • Execute sudo -i to log in as root, then run the following commands:
      cd /home/deck/.local/share/kscreen
      rm *
      cd /home/deck/.local/share/kscreen/outputs
      rm *
      exit
    • After executing these commands, close the Terminal and try switching to Desktop Mode again; the issue should be resolved.
    • It is recommended to create a script file named fix.sh containing the above commands and store it in /home/deck/ (see the sketch right after this list). Remember to make the script executable with sudo chmod +x /home/deck/fix.sh.
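
A minimal sketch of such a fix.sh, assuming it is saved as /home/deck/fix.sh and run with sudo:

#!/bin/bash
# fix.sh - delete the cached display configuration files (files only, not the
# folders themselves) that can cause the black screen when entering Desktop Mode
find /home/deck/.local/share/kscreen -maxdepth 1 -type f -delete
find /home/deck/.local/share/kscreen/outputs -maxdepth 1 -type f -delete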

By following these steps, you should be able to overcome the black screen issue on your Steam Deck and enjoy a seamless gaming experience.

One-Click Solution to Transfer iOS Illustail Images to Synology NAS

Requirements and Target Audience

  1. Users of Illustail on iOS, or anyone who frequently downloads images from Booru sites and Twitter.
  2. Have a Synology NAS.
  3. Insufficient storage space on iOS devices.

What is Illustail

Illustail is an iOS app designed for browsing, downloading, and searching images on Booru sites. You can use it on iPhone, iPad, or Mac, and easily save images to your device or cloud storage. Illustail supports various image sites, including TINAMI, Danbooru, Tumblr, Twitter, Misskey, and Mastodon.
It also comes with an iOS widget that can be used to decorate your home screen.

In short, Illustail is a tool that helps you browse images on iOS.

The free version has ads, but you can opt to pay to remove the ads and support the app developers.

Problem

My iPad storage fills up quickly with saved images! As of 2023, Illustail only offers Dropbox as a cloud storage option, and my free Dropbox plan only has 4.75 GB, which is not enough.

Isn't it a waste not to use a large Synology NAS?
This guide briefly explains how to set this up using the built-in Synology NAS packages, with no command line needed, so anyone can do it!

Steps

  1. Go to the Synology NAS Package Center and install Cloud Sync.
  2. Open Cloud Sync and create a new backup task for Dropbox.
  3. Configure the settings:
    • Connection Name: Choose a recognizable name;
    • Local path: Set the destination folder on your NAS where you want to save images;
    • Remote path: Set the folder on Dropbox where Illustail will upload images;
    • Sync direction: Set to Download remote changes only;
    • Check Don't remove files in the destination folder when they are removed in the source folder;
    • Schedule settings: Set the task schedule. I selected all options.
  4. Further adjust settings, such as how often to check for changes. I set it to 60 seconds. The shorter the time, the busier your NAS will be.
  5. In Illustail, set the upload destination to the specified Dropbox folder. And you're done!
  6. Test Illustail's upload function to see if the images appear on your NAS.

Simple Rclone Quick Backup to Google Drive

NPC Appears!
Boss: Hero, please don't let this task delay your other work!
─────────────────────────────────
= [ 1 Local or Network Copy and Paste ] =
= 2 Hard Disk Clone =
= 3 Escape =
= 4 Charm the Boss =
─────────────────────────────────
Hero chose 1, but even though Windows has FastCopy, it still feels too time-consuming, so this option is out.
─────────────────────────────────
= =
= [ 2 Hard Disk Clone ] =
= 3 Escape =
= 4 Charm the Boss =
─────────────────────────────────
Hero chose 2. This should be the fastest way, but the target hard drive itself would also need a backup in case it fails. That leaves option 3 and a new, unknown option.
─────────────────────────────────
= 5 rclone????????? =
= =
= [ 3 Escape ] =
= 4 Charm the Boss =
─────────────────────────────────
Hero wanted to choose 3, but his heart told him he couldn't run! He also couldn't choose 4, so he decided to try 5.
..?
..???
...!!
rclone swiftly and effectively resolved all the issues.

For such a ridiculous opening, I apologize. I promise not to waste any more words.

With a single command, rclone backs up all files from a hard drive to Google Drive or other storage, and the setup is very easy! Here, I'll demonstrate the simplest method: use the built-in web GUI to set up the config, then run one command to start the backup.

Config Setup to Connect to the Cloud

After downloading rclone, cd into its folder and enter the following command in cmd, bash, zsh, or any terminal.

rclone rcd --rc-web-gui

In PowerShell, you need to enter

"./rclone.exe" rcd --rc-web-gui

This will start a local web server with a simple interface.
Open the Configs section and create a new config, then go to the Google Developer Console to enable the Google Drive API, set up an OAuth client, and copy the client ID and secret into the corresponding fields. Follow the steps to complete the setup.
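
If you would rather skip the web GUI entirely, rclone's interactive wizard can create the same remote from the terminal:

# interactive setup wizard; pick "n" for a new remote and follow the prompts
rclone config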

Click Explorer; in the path input box, the config name you just set up will appear. Select it, and you can browse your Google Drive just like the web version. Create the destination folder for the backup and copy its path, for example:
MyGoogleDrive01:/MyBackupFolder/Laptop01/2022

Then return to the terminal and press Ctrl + C (it is Ctrl, not Command, even on macOS) to stop the web server.

Enter the following command, and it will start backing up everything from C: to Google Drive!
--progress shows the upload progress. Without this flag, the terminal would stay blank, giving no indication of how far along the backup is.

rclone copy "C:/" "MyGoogleDrive01:/MyBackupFolder/Laptop01/2022" --progress

Files that fail to copy will be skipped, such as files currently in use by the system.
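
If you do not need the entire drive, you can narrow the source and exclude noisy folders; the paths and patterns below are only examples:

# back up a single user profile, skipping the AppData folder
rclone copy "C:/Users/YourName" "MyGoogleDrive01:/MyBackupFolder/Laptop01/2022" --exclude "AppData/**" --progress

# afterwards, compare source and destination to confirm everything arrived
rclone check "C:/Users/YourName" "MyGoogleDrive01:/MyBackupFolder/Laptop01/2022" --exclude "AppData/**"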