Categories
Security & Monitoring

Unexpected CloudWatch In The Billing Area

In this post I will investigate an unexpected CloudWatch charge on my April 2022 AWS bill, and explain how to interpret the bill and find the resources responsible.

Introduction

My April 2022 AWS bill has arrived. The total wasn’t unusual – £4.16 is a pretty standard charge for me at the moment, most of which is S3. Then I took a closer look at the services and found an unexpected cost for CloudWatch, which is usually zero.

But not this month:

While $0.30 isn’t bank-breaking, it is unexpected and worth investigating. More importantly, nothing should be running in EU London! And there were no CloudWatch charges at all on my March 2022 bill. So what’s going on here?

Let’s start with the bill itself.

The April 2022 Bill

Looking at the bill, the rows with unexpected CloudWatch charges all mention alarms. Since nothing else has generated any charges, let’s take a closer look at all of the rows referring to alarms.

$0.00 Per Alarm Metric Month – First 10 Alarm Metrics – 10.000 Alarms

The AWS Always Free Tier includes ten CloudWatch alarms.

$0.10 Per Alarm Metric Month (Standard Resolution) – EU (Ireland) – 2.000002 Alarms

In EU Ireland, each standard resolution alarm after the first ten costs $0.10. The bill says there are twelve alarms in EU Ireland – ten of these are free and the other two cost $0.10 each – $0.20 in total.

$0.10 Per Alarm Metric Month (Standard Resolution) – EU (London) – 1.000001 Alarms

CloudWatch standard resolution alarms also cost $0.10 in EU London. As all my free alarms are seemingly in EU Ireland, the one in EU London costs a further $0.10.

So the bill is saying I have thirteen alarms – twelve in EU Ireland and one in EU London. Let’s open CloudWatch and see what’s going on there.
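The free-tier arithmetic on the bill can be sketched in a few lines of Python. The prices and counts come from the bill itself; the region codes and the order in which AWS allocated the free alarms are my assumption:

```python
# Sketch of the alarm billing arithmetic from the April 2022 bill.
FREE_TIER_ALARMS = 10
PRICE_PER_ALARM = 0.10  # standard resolution, both EU (Ireland) and EU (London)

# Region -> alarm count; assumed order in which AWS applied the free tier
alarms = {"eu-west-1": 12, "eu-west-2": 1}

free_remaining = FREE_TIER_ALARMS
charges = {}
for region, count in alarms.items():
    free_used = min(count, free_remaining)
    free_remaining -= free_used
    charges[region] = round((count - free_used) * PRICE_PER_ALARM, 2)

print(charges)                           # {'eu-west-1': 0.2, 'eu-west-2': 0.1}
print(round(sum(charges.values()), 2))   # 0.3
```

Twelve Ireland alarms soak up all ten free ones, leaving two chargeable there and one in London: $0.30 in total.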

CloudWatch Alarm Dashboard

So do I really have thirteen CloudWatch alarms? I could only remember the four security alarms I set up in February.

CloudWatch says otherwise. This is my current EU Ireland CloudWatch dashboard:

Closer inspection finds eight alarms with names like:

  • TargetTracking-table/Rides-ProvisionedCapacityHigh-a53f2f67-9477-45a6-8197-788d2c7462b3
  • TargetTracking-table/Rides-ProvisionedCapacityLow-a36cf02f-7b3c-4fb0-844e-cf3d03fa80a9

Two of these are constantly In Alarm, and all have Last State Update values on 2022-03-17. The alarm names led me to suspect that DynamoDB was involved, and this was confirmed by viewing the Namespace and Metric Name values in the details of one of the alarms:

At this point I had an idea of what was going on. To be completely certain, I wanted to check my account history for 2022-03-17. That means a trip to CloudTrail!

CloudTrail Event History

CloudTrail’s Event History shows the last 90 days of management events. I entered a date range of 2022-03-17 00:00 > 2022-03-18 00:01 into the search filter, and it didn’t take long to start seeing some familiar-looking Resource Names:

Alongside the TargetTracking-table resource names linked to monitoring.amazonaws.com, there are also rows on the same day for other Event Sources including:

  • dynamodb.amazonaws.com
  • apigateway.amazonaws.com
  • lambda.amazonaws.com
  • cognito-idp.amazonaws.com

I now know with absolute certainty where the unexpected CloudWatch alarms came from. Let me explain.
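That manual trawl through Event History could also be scripted. Here is a sketch that tallies CloudTrail records by event source; a real script would page through boto3’s cloudtrail lookup_events for the date range, and the records below are illustrative stand-ins:

```python
from collections import Counter

# Illustrative stand-ins for CloudTrail lookup_events records from 2022-03-17.
# The event names here are examples, not copied from my actual trail.
events = [
    {"EventSource": "monitoring.amazonaws.com", "EventName": "PutMetricAlarm"},
    {"EventSource": "monitoring.amazonaws.com", "EventName": "PutMetricAlarm"},
    {"EventSource": "dynamodb.amazonaws.com", "EventName": "CreateTable"},
    {"EventSource": "apigateway.amazonaws.com", "EventName": "CreateRestApi"},
    {"EventSource": "lambda.amazonaws.com", "EventName": "CreateFunction"},
    {"EventSource": "cognito-idp.amazonaws.com", "EventName": "CreateUserPool"},
]

# Tally which services were busy that day
by_source = Counter(event["EventSource"] for event in events)
print(by_source.most_common())
```

A spike of monitoring.amazonaws.com events alongside DynamoDB, API Gateway, Lambda and Cognito activity on the same day tells the story at a glance.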

Charge Explanations

So far I’ve reviewed my bills, found the CloudWatch alarms and established what was happening in my account when they were added. Now I’ll explain how this all led to charges on my bill.

The $0.20 EU Ireland Charge

When I was recently studying for the Developer Associate certification, I followed an AWS tutorial on how to Build a Serverless Web Application with AWS Lambda, Amazon API Gateway, AWS Amplify, Amazon DynamoDB, and Amazon Cognito. This was to top up my serverless knowledge before the exam.

The third module involves creating a DynamoDB table for the application – a table I provisioned with auto-scaling for read and write capacity:

These auto-scaling policies rely on CloudWatch alarms to function, as demonstrated by some of the alarm conditions:

The DynamoDB auto-scaling created eight CloudWatch alarms. Four for Read Capacity Units:

  • ConsumedReadCapacityUnits > 42 for 2 datapoints within 2 minutes
  • ConsumedReadCapacityUnits < 30 for 15 datapoints within 15 minutes
  • ProvisionedReadCapacityUnits > 1 for 3 datapoints within 15 minutes
  • ProvisionedReadCapacityUnits < 1 for 3 datapoints within 15 minutes

And four for Write Capacity Units:

  • ConsumedWriteCapacityUnits > 42 for 2 datapoints within 2 minutes
  • ConsumedWriteCapacityUnits < 30 for 15 datapoints within 15 minutes
  • ProvisionedWriteCapacityUnits > 1 for 3 datapoints within 15 minutes
  • ProvisionedWriteCapacityUnits < 1 for 3 datapoints within 15 minutes

These eight alarms joined the existing four. The first ten were free, leaving two accruing charges.

This also explains why two alarms are always In Alarm – the criteria for scaling in are being met but the DynamoDB table can’t scale down any further.
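The “N datapoints within N minutes” conditions above can be made concrete with a toy evaluator. This is a simplification of how CloudWatch actually evaluates alarms (it ignores missing-data handling, periods and statistics), but it shows why an idle table’s scale-in alarm never clears:

```python
import operator

# Toy version of a CloudWatch "N datapoints within N minutes" alarm condition.
OPS = {">": operator.gt, "<": operator.lt}

def in_alarm(datapoints, op, threshold, needed):
    """True if the last `needed` one-minute datapoints all breach the threshold."""
    window = datapoints[-needed:]
    return len(window) == needed and all(OPS[op](dp, threshold) for dp in window)

# ConsumedReadCapacityUnits < 30 for 15 datapoints within 15 minutes.
# An idle table consumes ~0 units, so this scale-in alarm fires constantly:
idle_minutes = [0.0] * 15
print(in_alarm(idle_minutes, "<", 30, needed=15))  # True

# ConsumedReadCapacityUnits > 42 for 2 datapoints within 2 minutes:
print(in_alarm([50.0, 55.0], ">", 42, needed=2))   # True
print(in_alarm([50.0, 10.0], ">", 42, needed=2))   # False
```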

I could have avoided this situation by destroying the resources after finishing the tutorial; the final module covers exactly that. Instead I decided to keep everything around so I could take a proper look under the hood.

No resources accrued any charges in March, so I left everything in place during April. I’ll go into why there was nothing on the March bill shortly, but first…

The $0.10 EU London Charge

Remember when I said that I shouldn’t be running anything in EU London? Turns out I was!

I found a very old CloudWatch alarm from 2020, and it had been there ever since: never alerting, so I didn’t know it was there; included in the Always Free Tier, so never costing me anything or triggering an AWS Budget alert; appearing on my bill, but always as a free entry, so never drawing attention.

When I exceeded my ten free CloudWatch alarms, the one in EU London became chargeable for the first time. A swift delete later and that particular problem is no more.

No CloudWatch Charge On The March 2022 Bill

That only leaves the question of why there were no CloudWatch charges on my March 2022 bill, despite there being thirteen alarms on my account for almost half of that month:

I wanted to understand what was going on, so I reached out to AWS Support.

In what must have been a first for them, I asked why no money had been billed for CloudWatch in March:

On my April 2022 bill I was charged $0.30 for CloudWatch. $0.20 in Ireland and $0.10 in London. I understand why.

What I want to understand is why I didn’t see a charge for them on my March 2022 bill. The alerts were added to the account on March 17th, so from that moment on I had thirteen alerts which is three over the free tier.

Can I get confirmation on why they don’t appear on March but do on April please?

I soon received a reply from AWS Support that explained the events in full:

…although you enabled all 13 Alarms in March, the system only calculated a pro-rated usage value, since the Alarms were only enabled on 17th March. The pro-rated Alarm usage values only amounted to 7.673 Alarms in the EU (Ireland) region, and 1.000003 Alarms in the EU (London) region.

The total pro-rated Alarm usage calculated for March (8.673003 Alarms) is thus within the 10 Alarm Free Tier threshold and thus incurred no charges, whereas in April the full 13 Alarm usage came into play for the entire month…

To summarise, I hadn’t been charged for the alarms in March because the new ones were only active for around half the month, keeping the pro-rated usage within the free tier. Thanks for the help folks!
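AWS Support’s figures can be roughly reproduced with a day-based approximation. AWS pro-rates by the hour, so these numbers land close to, rather than exactly on, the 7.673 and 1.000003 quoted:

```python
# Rough reproduction of the March 2022 pro-rating (day-based approximation;
# AWS pro-rates by the hour, so the exact decimals differ slightly).
DAYS_IN_MARCH = 31
active_days_new_alarms = 31 - 17 + 1  # 8 new alarms enabled on 17 March

ireland = 4 + 8 * active_days_new_alarms / DAYS_IN_MARCH  # 4 old + 8 pro-rated
london = 1.0                                              # 1 old alarm, full month

total = ireland + london
print(round(total, 3))  # roughly 8.871, under the 10-alarm free tier
```

Either way the March total sits comfortably under ten alarms, while April’s full thirteen did not.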

Summary

In this post I investigated an unexpected CloudWatch charge on my April 2022 AWS bill. I showed what the bill looked like, demonstrated how to find the resources generating the charges and explained how those resources came to be on my AWS account.

If this post has been useful, please feel free to follow me on the following platforms for future updates:

Thanks for reading ~~^~~

Categories
Internet Of Things & Robotics

Getting Started With My Raspberry Pi 4 And AWS IoT

In this post I unbox and configure my new Raspberry Pi 4, and then register it with my AWS account as an AWS IoT device.

Introduction

After earning my AWS Certified Developer – Associate certification last month, my attention turned to the Raspberry Pi my partner got me as a birthday present. I’ve had it for a while and done nothing with it because of a lack of time and ideas. I promised myself that I’d open it up after finishing my exam, so let’s go!

What’s In The Box?

My birthday gift came in the form of the Labists Raspberry Pi 4 4GB Complete Starter Kit. Having seen the price, I must have been good that year!

The set includes:

  • Raspberry Pi 4 Model B 4GB RAM with 1.5GHz 64-bit Quad-core CPU
  • 32GB Class 10 MicroSD Card Preloaded with NOOBS
  • Premium Black Case (High Gloss) for Pi 4B
  • Mini Silent Fan
  • Two Micro HDMI to HDMI Cables

Labists have a great video for assembling the Raspberry Pi. Fiddling with exposed circuitry is anxiety-inducing for a heavy-handed data professional like myself, so the video was very welcome!

The steps basically boil down to:

  • Attach Heat Sinks To Pi
  • Screw Fan To Case
  • Screw Pi To Case
  • Connect Fan Pins To Pi
  • Close Case

My Raspberry Pi is now out of the box and fully assembled, so let’s get some advice on how it works.

Getting To Know My Pi With FutureLearn

FutureLearn is a global learning platform with a mission to transform access to education by offering online courses from the world’s leading universities and brands. They offer a range of all-online, on-demand courses with both free and paid content.

The Educators

The Getting Started with Your Raspberry Pi course is one of a number of free courses by the Raspberry Pi Foundation. The Foundation is a UK charity seeking to increase the availability of computing and digital making skills by providing low-cost, high-performance single-board computers, highly available training and free software.

The Course

The course is split into three weeks, although the lessons can be completed at the pace of the user. The first week of the course “Setting Up Your Raspberry Pi” introduces the facilitation team, walks through the hardware and software and gives a basic introduction to Raspberry Pi OS.

Week Two “Using Your Raspberry Pi” offers insight into what the Raspberry Pi can do. This includes the compute resources, the ability to connect peripherals and the built-in software such as the visual programming language Scratch and the introductory Python editor Thonny.

Finally, Week Three “Taking More Control Of Your Raspberry Pi” goes full SysAdmin and introduces security measures, the command line and remote access. Instructions are given on how to control the Pi via VNC Viewer and SSH, and commands like mkdir, cp and mv are covered.

Most significantly, the APT Package Manager is introduced along with commands including:

  • sudo apt update
  • apt list --upgradable
  • sudo apt autoclean

A beginners’ course that introduces the ideas of keeping devices updated, tidy and secure is a welcome sight, as it encourages good user behaviour early on and ultimately prolongs the life of the Raspberry Pi.

My Raspberry Pi is now accessible, updated and ready to take on jobs, so let’s give it something to do!

Connecting My Pi To AWS

AWS offer several IoT services that are summarised as Device Software, Control Services and Analytics. To simplify the process of connecting a new IoT device, AWS has added a wizard to the Build A Solution widget on the newest version of the AWS Management Console:

This loads the AWS IoT wizard used by AWS IoT Core, consisting of a three-step process:

A word of advice – check the region the wizard is running in! I mainly use eu-west-1 but the IoT wizard changed this to us-west-2 and would have created my resources in the wrong place!

Before starting, AWS need to know which operating system my IoT device uses and which SDK I want to use. I tell AWS that my Raspberry Pi is running Linux and that I intend to use the Python SDK, and in response AWS offers some advice before starting the wizard:

Some prerequisites to consider: the device should have Python and Git installed and a TCP connection to the public internet on port 8883.

This has already been taken care of so let’s continue.

AWS IoT Configuration

Step 1 involves creating an IoT Thing with a matching Thing Record. A Thing Record is how AWS represents and records a physical device in the cloud, and is used for recording properties of the IoT Thing including certificates, jobs and the ARN.

I name my Raspberry Pi dj-raspberrypi4-labists. AWS then attach a Device Shadow to the Thing Record. These make a device’s state available to apps and other services, whether the device is connected to AWS IoT or not. For example, my Pi’s state could be Online or Offline.

In Step 2 AWS confirm that a new thing was created. A new AWS IoT Core policy is also created to enable sending and receiving messages. AWS IoT Core policies are basically IAM for AWS IoT devices. They control access to operations including:

AWS also supply a downloadable connection kit. This contains certificates and keys for authentication, and a shell script for device configuration and message processing. This is provided as a ZIP archive, which I put on my Raspberry Pi in a new folder specifically for AWS objects.

Device Configuration

Finally, the wizard gives a list of commands to send to the IoT device to test the AWS connection. The first command unzips the connection kit:

unzip connect_device_package.zip

The second command adds execution permissions to the start.sh script in the connection kit:

chmod +x start.sh

I’m never keen on running unfamiliar code off the Internet without knowing what it does first, so I did some searching – it turns out that chmod +x makes a file executable.

Now start.sh is runnable, it can be executed using the command ./start.sh. This is a short script that performs the following actions:

The result is an infinite stream of Hello Worlds:
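I can’t reproduce start.sh here, but the loop it kicks off works along these lines. This is a pure-Python sketch: in the real script the AWS IoT Device SDK’s MQTT client does the publishing, and the topic name below is illustrative rather than taken from the kit’s config:

```python
import itertools
import json

def hello_world_messages(topic="sdk/test/Python", limit=None):
    """Yield (topic, payload) pairs like the connection kit's sample publisher.

    The topic name is a placeholder; the real one comes from the kit's config.
    """
    counter = itertools.count(1)
    sequences = counter if limit is None else itertools.islice(counter, limit)
    for sequence in sequences:
        yield topic, json.dumps({"message": "Hello World!", "sequence": sequence})

# In the real script each payload would go to an MQTT publish call.
# Here we just show the first three messages of the "infinite stream":
for topic, payload in hello_world_messages(limit=3):
    print(topic, payload)
```

With no limit the generator never stops, which matches the endless stream of Hello Worlds in my terminal.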

Finally, AWS give a summary of the steps completed:

Cost Analysis

AWS IoT Core hasn’t cost me any money so far. This might be because I’m only running test loads on it currently, but looking at the new lines on my bill it’s going to be a while before I start making AWS any money here:

Next Steps

Having set up my Raspberry Pi, I have found some upgrades that I need to take care of:

Operating System Upgrade

Firstly, my Raspberry Pi’s operating system has an update available. It is currently running Raspbian 10, known as Buster:

In November 2021 Raspberry Pi released Bullseye. This is a major upgrade so the recommended process is to download a new image, reinstall any applications, and move data across from the current image. This makes sense to do while there isn’t much data on my Pi.

This leads me on to…

Raspberry Pi Imager

A common task with a Raspberry Pi is installing an operating system onto an SD card. In 2013 Raspberry Pi released NOOBS, or New Out Of the Box Software to give it its full name. Someone at Raspberry Pi HQ clearly has a sense of humour.

NOOBS was designed to simplify the process of setting up a new Pi for first-time users, and the Labists kit included an SD card with NOOBS preinstalled. However, Raspberry Pi no longer support it, and now recommend the Raspberry Pi Imager for installing Raspberry Pi OS instead.

So plenty to be getting on with!

Summary

In this post I’ve unboxed and configured my Raspberry Pi and linked it to my AWS account as an IoT Thing. I’ve described the basic concepts of AWS IoT Core and have identified some important upgrades that my Pi needs before I consider using it for anything serious.

Thanks for reading ~~^~~

Categories
Developing & Application Integration

Earning My AWS Developer Associate Cert By Fighting A Cow

In this post I talk about my recent experience with the AWS Certified Developer – Associate certification, discuss why and how I studied for the exam and explain why part of the process was like an early 90s puzzle game.

Introduction

On 25 March 2022 I earned the AWS Certified Developer – Associate certification. This is my fourth AWS certification and I now hold all the associate AWS certifications. People wanting to know more are welcome to view my Credly badges.

Motivation For Earning The AWS Developer Associate

Firstly I’ll explain why I took the exam. I like to use certifications as evidence of my current knowledge and skillset, and as mechanisms to introduce me to new topics that I wouldn’t otherwise have interacted with.

There’s a gap of around 18 months between my last AWS certification and this one. There are a few reasons for that:

  • I wanted to give the knowledge from the Solutions Architect and SysOps Administrator certifications time to bed in.
  • I wanted to use my new skills for the AWS data migration project at work.
  • My role at the time didn’t involve many of the services covered in the Developer Associate exam.

After the AWS migration was completed and I became a Data Engineer, I felt that the time was right for the Developer Associate. My new role brought with it new responsibilities, and the AWS migration made new tooling available to the business. I incorporated the Developer Associate into the upskilling for my new role over a four month period.

The benefits of the various sections and modules of the Developer Associate can be split across:

  • Projects the Data Engineering team is currently working on.
  • Future projects the Data Engineering team is likely to receive.
  • Projects I can undertake in my own time to augment my skillset.

Current Work Projects

  • Our ETLs are built using Python on AWS Lambda. The various components of Lambda were a big part of the exam and helped me out when writing new ETLs and modernising legacy components.
  • Git repos are a big part of the Data Engineering workstream. I am a relative newcomer to Git, and the sections on CodeCommit helped me better understand the fundamentals.
  • Build tests and deployments are managed by the Data Engineering CICD pipelines. The CodeBuild, CodeDeploy and CodePipeline sections have shown me what these pipelines are capable of and how they function.
  • Some Data Engineering pipelines use Docker. The ECS and Fargate sections helped me understand containers conceptually and the benefits they offer.

Future Work Projects

  • Sections about CloudWatch and SNS will be useful for setting up new monitoring and alerting as the Data Engineering team’s use of AWS services increases.
  • The DynamoDB module will be helpful when new data sources are introduced that either don’t need a relational database or are prone to schema changes.
  • Sections about Kinesis will help me design streams for real-time data processing and analytics.

Future Personal Projects

  • The CloudFormation and SAM modules will help me build and deploy applications in my AWS account for developing my Python knowledge.
  • Sections on Cognito will help me secure these applications against unauthorized and malicious activity.
  • The API Gateway module will let me define how my applications can be interacted with and how incoming requests should be handled.
  • Sections on KMS will help me secure my data and resources when releasing homemade applications.

Resources For The AWS Developer Associate

Because AWS certifications are very popular, there are many resources to choose from. I used the following resources for my AWS Developer Associate preparation.

Stéphane Maarek Udemy Course

I’ve been a fan of Stéphane Maarek for some time, having used his courses for all of my AWS associate exams. His Ultimate AWS Certified Developer Associate course is exceptional, with 32 hours of well presented and informative videos covering all exam topics. His code and slides are also included.

Stéphane is big on passing on real-world skills as opposed to just teaching enough to pass exams, and his dedication to keeping his content updated is clearly visible in the course.

À votre santé Stéphane!

Tutorials Dojo Learning Portal

Tutorials Dojo, headed by Jon Bonso, is a site with plentiful resources for AWS, Microsoft Azure and Google Cloud. Their practice exams are known for being hard but fair and are comparable to the AWS exams. All questions include detailed explanations of both the correct and incorrect answers. These practice exams were an essential part of my preparation.

Their Certified Developer Associate practice exam package offers a number of learning choices:

  • Want to mimic the exam? Timed Mode poses 65 questions against the clock.
  • Prefer immediate feedback? Review Mode shows answers and explanations after every question.
  • Practising a weak area? Section-Based Mode limits questions to specific topics.

Tutorials Dojo also offers a variety of Cheat Sheets and Study Guides. These are free, comprehensive and regularly updated.

AWS Documentation & FAQs

AWS documentation is the origin of most questions in the exam and Stéphane and Jon both reference it in their content. I refer to it in situations where a topic isn’t making sense, or if a topic is a regular stumbling block in the practice exams.

For example, I didn’t understand API Gateway integration types until I read the API Gateway Developer Guide page. I am a visual learner, but sometimes there’s no substitute for reading the instruction manual! The KMS FAQs cleared up a few problem areas for me as well.

AWS also have their own learning services, including the AWS Skill Builder. While I didn’t use it here, some of my AWS certifications will expire in 2023 so I’ll definitely be looking to test out Skill Builder then.

Anki

Anki is a free and open-source flashcard program. It has a great user guide that includes an explanation of how it aids learning. I find Anki works best for short pieces of information that I want regular exposure to via their mobile app.

For example, one of my Anki cards was:

CodeCommit: Migrate Git = CLONE Git; PUSH Git
PULL = NULL

This was explaining the process of migrating a Git repo to CodeCommit. PULL = NULL was a way for me to remember that pulling objects from the Git repo was incorrect in that scenario.

If an Anki card goes over two lines I use pen and paper for it instead. Previous experience has taught me that I can visualise small notes better on Anki and large notes better on paper.

Blogging

My best exam performance is with the AWS services I am most familiar with. Towards the end of my exam preparation, I wanted to fill some knowledge gaps by getting my hands dirty!

My posts about creating security alerts and enhanced S3 notifications let me get to grips with CloudTrail, CloudWatch, EventBridge and SNS. These all made an appearance in my exam so this was time well spent!

I also ran through an AWS guide about Building A Serverless Web Application to get some quick experience using API Gateway, CodeCommit and Cognito. This has given me some ideas for future blog projects, so stay tuned!

Approach To Studying The AWS Developer Associate

This section goes into detail about how I approached my studies. I didn’t realise it at the time but, on review, the whole process is basically a long ETL. With sword fighting.

Extract

I started by watching Stéphane’s course in its entirety, ‘extracting’ notes as I went. Since Stéphane provided his slides and since I already knew some topics from previous experience, the notes were mostly on topics that I either didn’t know or was out of practice with.

Transform

Having finished Stéphane’s course, I started the Tutorials Dojo practice exams. The aim here is to ‘transform’ my knowledge from notes and slides to answers to exam questions.

I have a spreadsheet template in Google Sheets for this process:

As I work through a practice exam, I record how I feel about my answers:

I can choose from:

  • Confident: I’m totally confident with my answer
  • 5050: I’m torn between two answers but have eliminated some
  • Guess: I have no idea what the answer is

When I get the results of the practice exam, I add the outcomes:

The Gut Feel and Outcome columns then populate tables elsewhere on the spreadsheet:

I use these tables for planning my next moves:

  • The top table quantifies overall confidence, and can answer questions like “Is my confidence improving between practice exams?”, “How often am I having to guess answers?” and “How confident am I about taking the real exam?”
  • I can get the middle table from Tutorials Dojo, but have it on the sheet for convenience.
  • The bottom table analyses Gut Feel against Outcome, showing how many of my correct answers were down to knowledge and how many were down to luck.

I then update the Question column of the spreadsheet depending on the results in the bottom table:

  • I assume that anything listed as Confident and Correct is well known. Nothing is changed.
  • All 5050s and Correct Guesses are coloured orange. Here some knowledge is apparent, but more revision is needed.
  • All Incorrect Guesses are coloured red, because there are clear knowledge gaps here.
  • Anything listed as Confident and Incorrect is also coloured red. These are the biggest red flags of all, as here knowledge has either been misread or misunderstood.
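The flagging logic above is simple enough to sketch in code. The column names come from my spreadsheet, but the implementation and sample results are illustrative:

```python
from collections import Counter

# Illustrative practice-exam results: (gut feel, outcome) per question.
results = [
    ("Confident", "Correct"), ("Confident", "Correct"), ("Confident", "Incorrect"),
    ("5050", "Correct"), ("5050", "Incorrect"),
    ("Guess", "Correct"), ("Guess", "Incorrect"),
]

def flag(gut, outcome):
    """Colour-code a question the way my spreadsheet does."""
    if gut == "Confident":
        return "none" if outcome == "Correct" else "red"  # misplaced confidence
    if gut == "5050" or (gut == "Guess" and outcome == "Correct"):
        return "orange"  # some knowledge apparent; more revision needed
    return "red"         # incorrect guess: a clear knowledge gap

flags = Counter(flag(gut, outcome) for gut, outcome in results)
print(flags.most_common())  # [('orange', 3), ('none', 2), ('red', 2)]
```

Anything flagged red gets priority in the next round of revision; orange gets a lighter touch-up.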

Load

As the knowledge gaps and development areas became clear, I began to ‘load’ the topics that still didn’t make sense or were proving hard to remember.

Based on the Tutorials Dojo practice exam outcomes, I made a second set of notes that were more concise than the first. So where the first set was mostly “Things I Don’t Know”, the second set was mostly “Things I Can’t Remember”.

As you might imagine, this uses a fair amount of paper. I recycle this afterwards because I’m an environmentally-conscious shark.

Insult Sword Fighting

I’ve come to know part of the ‘load’ as Insult Sword Fighting. Some people will know exactly what I’m talking about here, while others will quite rightly need some explanation.

Insult Sword Fighting is part of the 1990 point and click adventure game The Secret of Monkey Island. In this section of the game, the player wins fights by knowing the correct responses to an opponent’s insults.

For example, during a fight the opponent might say:

“You fight like a dairy farmer.”

To which the player’s response should be:

“How appropriate. You fight like a cow!”

The player starts out with two insult-response pairs, and learns more during subsequent fights.

The aim of the section is to learn enough to defeat the Sword Master. However, her insults are different to the ones the player has previously seen. For the final challenge, the player must match their existing knowledge to the new insults.

So if the Sword Master says:

“I will milk every drop of blood from your body!”

The player should pick up on the word “milk” and respond with:

“How appropriate. You fight like a cow!”

OK But What Does This Have To Do With The Exam?

So let me explain. The first time with a practice exam is like the player’s first Insult Sword Fight. Most responses are unknown or unfamiliar, so things usually don’t go well.

The player gets better at Insult Sword Fighting by challenging new opponents. This time the player will know some responses, but will also encounter new insults to learn.

In the same way, the subsequent practice exams will pose some questions that are similar to those in the previous exam. Of course there will also be entirely new questions that need further investigation.

The player will decide they are ready to face the Sword Master when they are able to win the majority of their Insult Sword Fights because they know the logic behind the correct responses.

Like the insults, the logic behind the practice exam questions can also be learned. Knowing it well enough to regularly answer these questions correctly is a strong indicator that it’s time to book the real exam.

The Sword Master’s insults are different to the ones the player has trained with. To win, the player must look for key words and phrases in the new insults and match them to their existing responses during battle.

The real exam will use unfamiliar questions. However the key words and phrases in the questions will match the knowledge built up during the practice exams, revealing the logic to arrive at the correct answers!

For those wondering how I made these images, I direct you to this awesome tool.

Next Steps

Now that the Developer Associate exam is over, I have a number of ideas for blog posts and projects to try out:

  • Building an ETL for my own data
  • Creating an API to query that data
  • Deploying the solution using Git and CICD

Plus I have a bookmarks folder and Trello board full of ideas to consider. So plenty to keep me busy!

Thanks for reading ~~^~~