Categories: Architecture & Resilience

Automating Application Management With Winget

In this post, I try automating my laptop’s application management with the Windows Package Manager tool Winget.


Introduction

After much frustration with my laptop’s performance, I finally booked it in for an upgrade to an SSD and 16GB of RAM. It’s now very responsive and far faster!

The shop originally planned to clone my existing HDD onto the new SSD. Unfortunately, the clone kept failing due to some bad sectors. Fortunately, this didn’t present a risk of data loss – most of my files are in OneDrive, and everything else is either in Amazon S3 or on external drives.

The failing clone meant that none of my previously installed programs and packages were on the new drive. I wasn’t flying blind here though, as I regularly use the free Belarc Advisor tool to create a list of installed programs.

But this is a heavily manual process, and the Belarc Advisor files contain a lot of unnecessary data that isn’t easy to use. So I found myself looking for an alternative!

User Story

In this section, I outline the problem I want to solve.

I want to capture a list of all applications installed on a given Windows device so that I can audit my device and have a better disaster recovery strategy.

ACCEPTANCE CRITERIA:

  • The process must be fully automated. I don’t want another job to do – I want the device to own this process.
  • The process must be efficient. Belarc Advisor gets the job done, but it takes time to load and does a bunch of other stuff that I don’t need.
  • There is no budget. Belarc Advisor isn’t ideal, but it’s free. I don’t want to start spending money on this problem now.

Introducing Winget

This section explains what Winget is and examines some of the features and benefits it offers.

What Is Winget?

Winget is the Windows Package Manager: a command-line tool that installs, upgrades, configures and removes applications on Windows 10 and Windows 11.

Package Managers look through configured repositories like the Windows Package Manager Community Repository for applications. If the application is available, it will be downloaded from the repository and installed onto the device.

Microsoft has open-sourced Winget and hosts its source code on GitHub. After installation, Winget is accessible via the Windows Terminal, PowerShell, and the Command Prompt.

Package Manager Benefits

Package Managers like Winget offer several benefits over traditional methods:

  • Applications are installed as CLI commands, so there is no need to navigate to different websites or go through multiple installation steps.
  • Their repositories enforce a strict submission policy and use standardized package formats, so applications are installed consistently and reliably.
  • They manage application dependencies. If a desired application needs another application to work, the package manager will automatically install that application as well.
  • They lend themselves well to CI/CD pipelines, IaC and disaster recovery, as package manager commands can be used in scripts and automated processes.
  • Community tools like winstall can generate batch-installation Winget commands and scripts from a web GUI.

Winget Commands

Winget regularly receives new commands, a list of which is maintained by Microsoft. These commands can be loosely grouped by purpose, from installing and upgrading applications to managing Winget itself.

For this post, I will be focusing on the commands that report on what is already installed: winget list and winget export.

winget list displays a list of installed applications. The list includes each application’s current version and package source, and has several filtering options.

The winget list syntax is:

winget list [[-q] <query>] [<options>]
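
For example, results can be narrowed with the query argument or the source option (the query string below is just an illustration):

winget list -q powertoys
winget list --source winget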

winget export creates and exports a JSON file of apps to a specified path.

This JSON file can combine with the winget import command to allow the batch-installing of applications and the creation of build environments.

winget export’s JSON files do not include applications that are unavailable in the Windows Package Manager Community Repository. In these cases, the export command will show a warning.

The winget export syntax is:

winget export [-o] <output> [<options>]
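
For example, an export taken on one device can rebuild its applications on another device with winget import (the file path is just an illustration):

winget export -o C:\Temp\WingetApps.json --include-versions
winget import -i C:\Temp\WingetApps.json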

Winget Scripting With VSCode

In this section, I write a script that will run the Winget commands.

I’m writing the script using Visual Studio Code, as this allows me to write the Winget script in the same way as other PowerShell scripts I’ve written.

Unique Filename

Firstly, I want to give each file a unique filename to make sure nothing is overwritten. A good way to do that here is by capturing Get-Date’s output, formatted as an ISO 8601-style timestamp:

$RunDate = Get-Date -Format 'yyyy-MM-dd-HHmm'

This returns a string with an appropriate level of granularity, as I’m not going to be running this script multiple times a minute:

2023-04-26-1345

Winget Export Code

Next, I’ll script my export command.

I need to tell Winget where to create the file, and what to call it. I create a new folder for the exports and capture its path in a $ExportsFilePath variable.

Then I create a $ExportsFileName variable for the first part of the export file’s name. It uses a WingetExport string and the device’s name, which PowerShell can access using $env:computername:

$ExportsFileName = 'WingetExport' + '-' + $env:computername + '-'

Including the computer’s name means I can run this script on different devices and know which export files belong to which device:

WingetExport-LAPTOP-IFIJ32T-

My third variable, $ExportsOutput, joins everything together to produce a valid path string for winget export’s output argument:

$ExportsOutput = $ExportsFilePath + '\' + $ExportsFileName  + $RunDate + '.json'

An example of which is:

C:\{PATH}\WingetExport-LAPTOP-IFIJ32T-2023-04-26-1345.json

Finally, I can script the full command. This command creates an export file at the desired location and includes application version numbers for accuracy and auditing:

winget export --output $ExportsOutput --include-versions

Here is a sample export:

{
  "$schema": "https://aka.ms/winget-packages.schema.2.0.json",
  "CreationDate": "2023-04-27T11:02:04.321-00:00",
  "Sources": [
    {
      "Packages": [
        {
          "PackageIdentifier": "Git.Git",
          "Version": "2.40.0"
        },
        {
          "PackageIdentifier": "Anki.Anki",
          "Version": "2.1.61"
        },
        {
          "PackageIdentifier": "Microsoft.PowerToys",
          "Version": "0.69.1"
        }
      ],
      "SourceDetails": {
        "Argument": "https://cdn.winget.microsoft.com/cache",
        "Identifier": "Microsoft.Winget.Source_8wekyb3d8bbwe",
        "Name": "winget",
        "Type": "Microsoft.PreIndexed.Package"
      }
    }
  ],
  "WinGetVersion": "1.4.10173"
}

As a reminder, these exports don’t include applications that are unavailable in Winget. This means winget export alone doesn’t meet the user story requirements, so there is still work to do!

Winget List Code

Finally, I’ll script my list command. This is mostly similar to the export command and I create the file path in the same way:

$ListsOutput = $ListsFilePath + '\' + $ListsFileName + $RunDate + '.txt'

The filename is changed for accuracy, and the extension is now .txt as no JSON is produced:

WingetList-LAPTOP-IFIJ32T-2023-04-25-2230.txt

Now, while winget list shows all applications on the device, it has no argument to save this list anywhere. For that, I need to pipe the winget list output to a PowerShell command that does create files – Out-File:

winget list | Out-File -FilePath $ListsOutput

Out-File writes the list to the $ListsOutput path, producing rows like these:

Name                       Id                 Version  Available  Source
Anki                       Anki.Anki          2.1.61              winget
Audacity 2.4.2             Audacity.Audacity  2.4.2    3.2.4      winget
DBeaver 23.0.2             dbeaver.dbeaver    23.0.2              winget
S3 Browser version 10.8.1  S3 Browser_is1     10.8.1

The entire script takes around 10 seconds to run in an open PowerShell session and produces no CPU spikes or memory load. The script is on my GitHub with redacted file paths.
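
Putting the pieces together, a condensed sketch of the script’s logic might look like this (the folder paths are placeholders):

#Timestamp for unique filenames
$RunDate = Get-Date -Format 'yyyy-MM-dd-HHmm'

#Export file path and name
$ExportsFilePath = 'C:\{PATH}\WingetExport'
$ExportsFileName = 'WingetExport' + '-' + $env:computername + '-'
$ExportsOutput = $ExportsFilePath + '\' + $ExportsFileName + $RunDate + '.json'

#List file path and name
$ListsFilePath = 'C:\{PATH}\WingetList'
$ListsFileName = 'WingetList' + '-' + $env:computername + '-'
$ListsOutput = $ListsFilePath + '\' + $ListsFileName + $RunDate + '.txt'

#Export Winget-known applications with versions, then capture the full list
winget export --output $ExportsOutput --include-versions
winget list | Out-File -FilePath $ListsOutput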

Automation With Task Scheduler

In this section, I put Task Scheduler in charge of automating my application management Winget script.

What Is The Task Scheduler?

Task Scheduler began life on Windows 95 and is still used today by applications including Dropbox, Edge and OneDrive. Parts of it aren’t great: the Send Email and Display Message features are deprecated, and monitoring and error handling rely on creating additional tasks that are triggered by failure events.

However, it’s handy for running local scripts and has no dependencies as it’s built into Windows. It supports a variety of use cases which can be scripted or created in the GUI. Existing tasks are exportable as XML.

Creating A New Task

There is plentiful documentation for the Task Scheduler. The Microsoft Learn developer resources cover every inch of it, and these Windows Central and Windows Reports guides are great resources with extensive coverage.

In my case, I create a new ApplicationInventory task, set to trigger every time I log on to Windows:

2023 04 25 TaskSchedulerTrigger

The task starts powershell.exe, passing an argument of -file "C:\{PATH}\ApplicationInventory.ps1".

This works, but will force a PowerShell window to open every time the schedule runs. This can be stopped by configuring the task to Run whether user is logged on or not. Yup – it feels a bit hacky. But it works!

I now have a new scheduled task:

2023 04 25 TaskSchedulerNewTask
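
For anyone who prefers to script the task itself, the ScheduledTasks PowerShell module can create an equivalent task. Here is a minimal sketch, keeping the script path as a placeholder and running in the logged-on user’s context:

#What to run: PowerShell with the inventory script
$Action = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File "C:\{PATH}\ApplicationInventory.ps1"'

#When to run it: at logon
$Trigger = New-ScheduledTaskTrigger -AtLogOn

#Register the task under a recognisable name
Register-ScheduledTask -TaskName 'ApplicationInventory' -Action $Action -Trigger $Trigger -Description 'Winget application inventory'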

Testing

An important part of automating my application management with Winget is making sure everything works! In this section, I check the script and automation processes are working as expected.

I’ll start with the task automation. Task Scheduler has a History tab, which filters events from Event Viewer. Upon checking this tab, I can see the chain of events marking a successful execution:

2023 04 25 TaskSchedulerHistory

When I check the WingetExport folder, it contains an export file created on 25/04/2023 at 22:30:

2023 04 25 AppInventoryExports

And there are similar findings in the WingetList folder:

2023 04 25 AppInventoryLists

Both files open successfully and contain the expected data. Success!

Summary

In this post, I automated my laptop’s application management with the Windows Package Manager tool Winget and Task Scheduler.


Thanks for reading ~~^~~

Categories: DevOps & Infrastructure

Migrating amazonwebshark To SiteGround

In this post, I examine the process of migrating amazonwebshark to SiteGround and give an overview of the processes involved.


Introduction

When I started amazonwebshark I had to make some infrastructure decisions. I registered the domain name with Amazon Route 53, and then needed to choose a blog hosting platform.

In December 2021 I took advantage of a Bluehost offer and paid £31.90 for their Basic WordPress Hosting package, which bundled a variety of services.

My Bluehost renewal came through earlier this month, priced at £107.76. I’ve had great service from Bluehost and have no complaints, but that price was quite a leap. So, before I accepted it, I decided to do some research and see what my alternatives were.

Hosting Alternatives

In this section, I go through the results of my research into alternative hosting platforms.

Amazon Lightsail

I started by looking at Amazon Lightsail. Essentially, Lightsail is a simplified way of deploying AWS services like EC2, EBS and Elastic Load Balancing.

Lightsail pricing differs from most AWS services. Instead of the common Pay-As-You-Go pricing model, Lightsail has set monthly pricing. For example, a Linux server with similar memory, processing and storage to my Bluehost server currently costs $3.50 a month.

There is an important difference though. While Bluehost has teams of people responsible for tasks like server maintenance, database recovery, hard disk failures and security patches, with Lightsail the infrastructure would become my responsibility. I would save money over Bluehost, but at the cost of doing my own systems admin.

And that list isn’t even exhaustive! It doesn’t include the setup and maintenance of services like a CDN, SSL certificates and email accounts, all of which come with their own extra requirements and costs.

At this point, Bluehost was still on top.

SiteGround

SiteGround is in the same business as Bluehost. It offers a variety of hosting solutions for an array of use cases and has good standing in the industry.

SiteGround also had a great Black Friday offer this year! It was offering pretty much the same deal as Bluehost at £1.99 a month:

2022 11 25 SitegroundOfferScroll

This is INSANELY cheap, especially considering how much all this infrastructure costs to run!

SiteGround has also developed a free WordPress plugin to automate migrations from other hosting platforms. While this isn’t unique to them, a combination of good reviews, extensive services, low hassle and a great price was more than enough to get me on board.

Migrate What Exactly?

Before continuing, I thought it best to go into a bit of detail about what exactly is being migrated. I’ve mentioned servers, databases and domains, but what gets moved where? And why?

Well, because I’m moving things around on the Internet, I need to talk about the Domain Name System (DNS).

Wait! Come back!

Explain DNS Like I’m 5

What follows is a very simple introduction to DNS. There’s far more to DNS than this, but that’s beyond the scope of this post.

Let’s say I want to phone The Shark Trust. I can’t type “The Shark Trust” into my handset – I need their phone number. So I open my phone book, turn to the S section and find The Shark Trust. Next to this entry is a phone number: 01752 672008. I type that number into the handset and get through to their office.

DNS is like the Internet’s phone book. Websites are held on servers, and the ‘phone numbers’ for those servers are IP addresses. When I request a website like amazonwebshark.com, my web browser needs to know the IP address for the server holding the site’s data, for example 34.91.95.18.

Explain DNS With Pictures

This WebDeasy diagram shows DNS at a high level:

WebDeasy: How the Domain Name System (DNS) works – Basics

When a URL is entered into a web browser, a query is sent to a DNS server. Using the phone book analogy, the web browser is asking the DNS server for the amazonwebshark.com phone number.

DNS servers don’t hold the websites’ IP addresses themselves, but they know which ‘phone book’ to look in. These ‘phone books’ are called name servers. The DNS server finds and contacts the right name server, which matches the amazonwebshark.com domain name to an IP address.

The DNS server then returns this IP address to the web browser, which uses it to contact the server hosting the amazonwebshark.com resources.

In the diagram, the DNS-Server represents Route 53. Route 53 holds DNS records for the amazonwebshark.com domain name, and knows where to find the name servers that have the amazonwebshark.com IP address.

The webdeasy.de server represents the Bluehost name servers. These servers can answer a variety of DNS queries, and their answers are treated as the source of truth for initial site visits and subsequent browser caching.

amazonwebshark’s DNS Setup

At the start of December 2022 the amazonwebshark.com domain name was hosted by Route 53, with an NS record pointing at the Bluehost name servers:

2022 12 02 Route53Bluehost

The basic infrastructure looked like this, with outbound requests in blue and inbound responses in green:

2022 12 27 amazonwebsharkDNSdiagram

And that’s it! To further explore DNS core concepts, this DNSimple comic is well worth a read and this Fireship video gives a solid, if a little more technical, account.

Data Migration

In this section, I start migrating my amazonwebshark data from Bluehost to SiteGround.

SiteGround has an automated migrator plugin that copies existing WordPress sites from other hosting platforms. And it’s very good! The process boils down to:

The process can also be seen in this Avada video.

The plugin copies all the amazonwebshark server files, scripts and database objects in a process that takes about five minutes. SiteGround then provides a temporary URL for testing and performance checks:

2022 12 02 SiteGroundMigratonComplete

After completely migrating amazonwebshark to SiteGround, the next step involves telling the amazonwebshark domain where to find the new server. Time for some DNS!

DNS Migration

In this section, I update the amazonwebshark DNS records with the SiteGround name servers.

I repointed the existing amazonwebshark NS record from Bluehost to SiteGround by updating the values in Route 53 from this:

2022 12 02 Route53Bluehost

To this:

2022 12 02 Route53SiteGround

My change then needed to propagate through the Internet. Internet Service Providers update their records at different rates, so changes can take up to 72 hours to complete worldwide.

Free DNS checking tools like WhatIsMyDNS can perform global checks on a domain name’s IP address and DNS record information. The check below was done after around 30 hours, by which time most of the servers were returning SiteGround IPs:

2022 12 06 DNSPropagationCheck
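
These checks can also be run locally from PowerShell with the built-in Resolve-DnsName cmdlet. A quick sketch, querying a public resolver so cached local answers don’t mislead:

#Ask a public resolver which name servers currently answer for the domain
Resolve-DnsName -Name 'amazonwebshark.com' -Type NS -Server 8.8.8.8

#Confirm the IP address the domain currently resolves to
Resolve-DnsName -Name 'amazonwebshark.com' -Type A -Server 8.8.8.8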

Any Problems?

First, the good news. There was no downtime while migrating amazonwebshark to SiteGround! During the migration, DNS queries were resolved by either Bluehost’s or SiteGround’s name servers. Both platforms had amazonwebshark data, so both could answer DNS queries.

Additionally, as I set a change freeze on amazonwebshark until the migration was over, there was no lost or orphaned content.

I did lose some WPStatistics hit statistics data though. There is no data for December 03 and December 04:

2022 12 18 WPStatisticsHits

This was my fault. The DNS propagation took longer than it should have because of a misunderstanding on my part!

So why was data lost? WPStatistics stores data in tables in the site’s MySQL database. When I first migrated my data on December 02, the Bluehost and SiteGround tables were the same. After that point, Bluehost continued to serve amazonwebshark until December 05, and wrote its statistics in the Bluehost MySQL tables.

It was only after I corrected my DNS mistake that SiteGround could start serving content and writing statistics to the SiteGround MySQL tables. So SiteGround didn’t record anything for December 03 and December 04, and, as no additional data migration was done, the statistics that Bluehost recorded never made it to the SiteGround tables.

I can recover this if I want to though. I took a full backup of my Bluehost data before ending the contract. That included a full backup of the Bluehost MySQL database with the WPStatistics tables. I’ll take a look at the tables at some point, see how the data is arranged and decide from there.

Future Plans

I’m considering moving amazonwebshark to a serverless architecture in 2023. While the migration was a success, servers still have inherent problems:

  • Servers can break or go offline.
  • They can be hacked.
  • They can be over- or under-provisioned.

Serverless infrastructure could remove those pain points. I don’t use any WordPress enterprise features, and amazonwebshark could exist very well as an event-driven static website. Tools like Hugo and Jekyll are designed for the job and documented well, and people like Kendra Little and Chrissy LeMaire have successfully transitioned their blogs to serverless infrastructures.

The biggest challenge here isn’t architectural. If I moved to a serverless architecture, I would want something similar to the Yoast SEO analysis plugin. This plugin has really helped me improve my posts, and by extension has made them more enjoyable to write.

I’ve seen lots of serverless tooling for migrating resources and serving content, but not so much for SEO guidance and proofreading. Any amazonwebshark serverless migration would be contingent on finding something decent along these lines. After all, if the blog becomes a pain to write for then what’s the point?

Summary

In this post, I examined the process of migrating amazonwebshark to SiteGround and gave an overview of the processes involved.

I’m very happy with how things went overall! The heavy lifting was done for me, both companies were open and professional throughout and what could have been a daunting process was made very simple!


Thanks for reading ~~^~~

Categories: Developing & Application Integration

Production Code Qualities

In this post, I respond to November 2022’s T-SQL Tuesday #156 Invitation and give my thoughts on some production code qualities.

tsql tuesday


Introduction

This month, Tomáš Zíka’s T-SQL Tuesday invitation was as follows:

Which quality makes code production grade?

Please be as specific as possible with your examples and include your reasoning.

Good question!

In each section, I’ll use a different language. Firstly I’ll create a script, and then show a problem the script could encounter in production. Finally, I’ll show how a different approach can prevent that problem from occurring.

I’m limiting myself to three production code qualities to keep the post at a reasonable length, and so I can show some good examples.

Precision

In this section, I use T-SQL to show how precise code in production can save a data pipeline from unintended failure.

Setting The Scene

Consider the following SQL table:

USE [amazonwebshark]
GO

CREATE TABLE [2022].[sharkspecies](
	[shark_id] [int] IDENTITY(1,1) NOT NULL,
	[name_english] [varchar](100) NOT NULL,
	[name_scientific] [varchar](100) NOT NULL,
	[length_max_cm] [int] NULL,
	[url_source] [varchar](1000) NULL
)
GO

This table contains a list of sharks, courtesy of the Shark Foundation.

Now, let’s say that I have a data pipeline that uses data in amazonwebshark.2022.sharkspecies for transformations further down the pipeline.

No problem – I create a #tempsharks temp table and insert everything from amazonwebshark.2022.sharkspecies using SELECT *:
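
A minimal sketch of that script, reconstructed from the table definition above, could look like this:

-- Temp table mirroring the five columns of [2022].[sharkspecies]
CREATE TABLE #tempsharks(
	[shark_id] [int] NOT NULL,
	[name_english] [varchar](100) NOT NULL,
	[name_scientific] [varchar](100) NOT NULL,
	[length_max_cm] [int] NULL,
	[url_source] [varchar](1000) NULL
)
GO

-- Copy everything across with SELECT *
INSERT INTO #tempsharks
SELECT *
FROM [2022].[sharkspecies]
GO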

When this script runs in production, I get two tables with the same data:

2022 11 02 SQLResults1

What’s The Problem?

One day a new last_evaluated column is needed in the amazonwebshark.2022.sharkspecies table. I add the new column and backfill it with 2019:

ALTER TABLE [2022].sharkspecies
ADD last_evaluated INT DEFAULT 2019 WITH VALUES
GO

However, my script now fails when trying to insert data into #tempsharks:

2022 11 02 SQLResults2Sharp
(1 row affected)

(4 rows affected)

Msg 213, Level 16, State 1, Line 17
Column name or number of supplied values does not match table definition.

Completion time: 2022-11-02T18:00:43.5997476+00:00

#tempsharks has five columns but amazonwebshark.2022.sharkspecies now has six. My script is now trying to insert all six sharkspecies columns into the five-column temp table, causing the Msg 213 error.

Doing Things Differently

The solution here is to replace the script’s SELECT * with the precise columns to insert from amazonwebshark.2022.sharkspecies:
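
Against the reconstruction above, the insert might become:

-- Name the five target columns explicitly so new source columns cannot break the insert
INSERT INTO #tempsharks ([shark_id], [name_english], [name_scientific], [length_max_cm], [url_source])
SELECT [shark_id], [name_english], [name_scientific], [length_max_cm], [url_source]
FROM [2022].[sharkspecies]
GO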

While amazonwebshark.2022.sharkspecies now has six columns, my script is only inserting five of them into the temp table:

2022 11 02 SQLResults3Sharp

I can add the last_evaluated column into #tempsharks in future, but its absence in the temp table isn’t causing any immediate problems.

Works The Same In Other Environments

In this section, I use Python to show the value of production code that works the same in non-production.

Setting The Scene

Here I have a Python script that reads data from an Amazon S3 bucket using a boto3 session. I pass my AWS_ACCESSKEY and AWS_SECRET credentials in from a secrets manager, and create an s3bucket variable for the S3 bucket path:
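
A minimal sketch of that setup, with placeholder credentials standing in for the secrets manager call and an illustrative object key:

import boto3

# Placeholder credentials - the real script retrieves these from a secrets manager
AWS_ACCESSKEY = 'AKIA_PLACEHOLDER'
AWS_SECRET = 'SECRET_PLACEHOLDER'

# boto3 session built from the stored credentials
session = boto3.session.Session(
    aws_access_key_id=AWS_ACCESSKEY,
    aws_secret_access_key=AWS_SECRET
)

# Hard-coded bucket path - fine in dev, but it travels unchanged to other environments
s3bucket = 's3://dev-bucket'

# Read from the bucket (the object key is illustrative)
s3client = session.client('s3')
bucket_name = s3bucket.replace('s3://', '')
response = s3client.get_object(Bucket=bucket_name, Key='data/example.csv')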

When I deploy this script to my dev environment it works fine.

What’s The Problem?

When I deploy this script to production, s3bucket will still be s3://dev-bucket. The potential impact of this depends on the AWS environment setup:

Different AWS account for each environment:

  • dev-bucket doesn’t exist in Production. The script fails.

Same AWS account for all environments:

  • Production IAM roles might not have any permissions for dev-bucket. The script fails.
  • Production processes might start using a dev resource. The script succeeds but now data has unintentionally crossed environment boundaries.

Doing Things Differently

A solution here is to dynamically set the s3bucket variable based on the ID of the AWS account the script is running in.

I can get the AccountID using AWS STS. I’m already using boto3, so I can use it to create an STS client with my AWS credentials.

STS then has a GetCallerIdentity action that returns the AWS AccountID linked to the AWS credentials. I capture this AccountID in an account_id variable, then use that to set s3bucket’s value:
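
Continuing the sketch from the previous section (the account IDs and production bucket name are illustrative):

# STS client from the existing boto3 session
stsclient = session.client('sts')

# GetCallerIdentity returns details about the credentials in use, including the AccountID
account_id = stsclient.get_caller_identity()['Account']

# Set the bucket based on the account the credentials belong to
if account_id == '111111111111':    # dev account
    s3bucket = 's3://dev-bucket'
elif account_id == '222222222222':  # production account
    s3bucket = 's3://prod-bucket'
else:
    # Unknown account - stop rather than touch the wrong bucket
    raise ValueError(f'Unexpected AWS AccountID: {account_id}')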

More details about get_caller_identity can be found in the AWS Boto3 documentation.

For bonus points, I can terminate the script if the AWS AccountID isn’t defined. This prevents undesirable states if the script is run in an unexpected account.

Speaking of which…

Prevents Undesirable States

In this section, I use PowerShell to demonstrate how to stop production code from doing unintended things.

Setting The Scene

In June I started writing a PowerShell script to upload lossless music files from my laptop to one of my S3 buckets.

I worked on it in stages. This made it easier to script and test the features I wanted. By the end of Version 1, I had a script that dot-sourced its variables and wrote everything in my local folder $ExternalLocalSource to my S3 bucket $ExternalS3BucketName:

#Load Variables Via Dot Sourcing
. .\EDMTracksLosslessS3Upload-Variables.ps1


#Upload File To S3
Write-S3Object -BucketName $ExternalS3BucketName -Folder $ExternalLocalSource -KeyPrefix $ExternalS3KeyPrefix -StorageClass $ExternalS3StorageClass

What’s The Problem?

NOTE: There were several problems with Version 1, all of which were fixed in Version 2. In the interests of simplicity, I’ll focus on a single one here.

In this script, Write-S3Object will upload everything in the local folder $ExternalLocalSource to the S3 bucket $ExternalS3BucketName.

Problem is, the $ExternalS3BucketName S3 bucket isn’t for everything! It should only contain lossless music files!

At best, Write-S3Object will upload everything in the local folder to S3 whether it’s music or not.

At worst, if the script is pointing at a different folder it will start uploading everything there instead! PowerShell commonly defaults to C:\Windows, so this could cause all kinds of problems.

Doing Things Differently

I decided to limit the extensions that the PowerShell script could upload.

Firstly, the script captures the extensions for each file in the local folder $ExternalLocalSource using Get-ChildItem and [System.IO.Path]::GetExtension:

$LocalSourceObjectFileExtensions = Get-ChildItem -Path $ExternalLocalSource | ForEach-Object -Process { [System.IO.Path]::GetExtension($_) }

Then it checks each extension using a ForEach loop. If an extension isn’t in the list, PowerShell reports this and terminates the script:

ForEach ($LocalSourceObjectFileExtension In $LocalSourceObjectFileExtensions)
{
    #Stop the script if a non-lossless extension is found
    If ($LocalSourceObjectFileExtension -NotIn ".flac", ".wav", ".aif", ".aiff")
    {
        Write-Output "Unacceptable $LocalSourceObjectFileExtension file found.  Exiting."
        Start-Sleep -Seconds 10
        Exit
    }
    #Otherwise confirm the extension is acceptable
    Else
    {
        Write-Output "Acceptable $LocalSourceObjectFileExtension file."
    }
}

So now, if I attempt to upload an unacceptable .log file, PowerShell reports it and terminates the script:

**********************
Transcript started, output file is C:\Files\EDMTracksLosslessS3Upload.log

Checking extensions are valid for each local file.
Unacceptable .log file found.  Exiting.
**********************

While an acceptable .flac file will produce this message:

**********************
Transcript started, output file is C:\Files\EDMTracksLosslessS3Upload.log

Checking extensions are valid for each local file.
Acceptable .flac file.
**********************

To see the code in full, as well as the other problems I solved, please check out my post from June.

Summary

In this post, I responded to November 2022’s T-SQL Tuesday #156 Invitation and gave my thoughts on some production code qualities. I gave examples of each quality and showed how they could save time and prevent unintended problems in a production environment.

Thanks to Tomáš for this month’s topic! My previous T-SQL Tuesday posts are here.


Thanks for reading ~~^~~