Building Strong Fundamentals With Microsoft’s SC-900

In this post, I will talk about my recent experience with Microsoft’s SC-900 certification and the resources I used to study for it.


Introduction

On 23 May 2022 I earned the Microsoft Certified: Security, Compliance, and Identity Fundamentals certification. This is my third Microsoft certification, joining my other badges on Credly.

I wanted to take the time to write about my experience in the hope that it may help others who are looking at the certification. I also wanted to address the elephant in the room – why did I take a Microsoft certification when my blog is called amazonwebshark?

First, I’ll talk about my motivation for studying for Microsoft’s SC-900 certification. Then I’ll talk about the resources I used, and finally I’ll cover my takeaways from the experience.

Motivation

In this section I’ll talk about my reasons for studying for Microsoft’s SC-900 certification.

Security Is Job Zero

While I’m a Data Engineer by trade, security is still a big part of my job. I need to make sure that any S3 objects and EBS volumes I create are encrypted. Any server-based AWS resources need security groups and NACLs that allow only the required levels of access. And SQL Server objects like logins and linked servers must be appropriately scoped and follow the principle of least privilege.
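
For example, here’s a minimal boto3 sketch of one such check, confirming that each S3 bucket has default encryption enabled. It assumes AWS credentials are already configured, and the function name is my own:

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def bucket_is_encrypted(bucket_name: str) -> bool:
    """Return True if the bucket has default server-side encryption enabled."""
    try:
        s3.get_bucket_encryption(Bucket=bucket_name)
        return True
    except ClientError as error:
        # S3 raises this error code when no default encryption is configured.
        if error.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            return False
        raise

# Flag any buckets in the account that lack default encryption.
for bucket in s3.list_buckets()["Buckets"]:
    if not bucket_is_encrypted(bucket["Name"]):
        print(f"Unencrypted bucket: {bucket['Name']}")
```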

Then I have my own resources to consider. My AWS account needs appropriate IAM and S3 bucket policies to control access to resources and data. I need to think about multi-factor authentication and access keys. And I need monitoring to catch hacking attempts and any costs arising from malicious activity.

This blog presents its own security challenges too. My site backups must be hardened and kept up to date. I need to consider attacks such as Cross-Site Scripting and SQL Injection. There are also plugins to consider: are they up to date, and are they fit for purpose?

These factors are only the tip of the iceberg. The security landscape is ever-changing and needs constant vigilance.

Validation Of Knowledge

While security is a vital part of the job, many security certifications are pitched at an advanced level. Examples include the AWS Certified Security – Specialty certification and CompTIA Security+. These usually recommend several years of industry experience and in-depth knowledge of the exam provider’s services.

Microsoft’s SC-900 is aimed at people who are getting to know the fundamentals of security, compliance, and identity. While the exam is Microsoft-branded, the topics tested are broadly the same across all cloud providers. This makes the SC-900 useful beyond Microsoft’s platform.

Speaking of which…

Security Is Security Is Security

Having earned my AWS Developer Associate Certification at the end of March, I currently hold all of the AWS Associate certifications. Each of these exams requires some security knowledge.

The Solutions Architect exam can ask about securing access to resources and data using IAM and VPC features like subnets and NACLs. Meanwhile, the SysOps Administrator exam looks at implementing and managing security policies and strategies, with services like AWS CloudTrail, AWS Control Tower and AWS SSO.

Finally, the Developer exam covers authentication, authorisation and encryption with services like Amazon Cognito, API Gateway and AWS KMS. In short, the three exams cover a wide range of AWS services offering different types of security.

The major cloud providers all have their own versions of common security products. Microsoft maintains a comparison list mapping AWS services to their Azure equivalents, with examples like:

  • AWS IAM ↔ Azure Active Directory and Azure RBAC
  • AWS KMS ↔ Azure Key Vault
  • Amazon GuardDuty ↔ Microsoft Defender for Cloud

Studying for the SC-900 forced me to check that I understood the various services conceptually. The challenge was less about remembering names, and more about recognising what the services did. In other words, what do I know outside of AWS?

Resources

In this section I’ll talk about the resources I used to study for the Microsoft SC-900 certification.

John Savill

John Savill’s Technical Training YouTube channel started in 2008. Since then he’s created a wide range of videos from deep dives to weekly updates. In addition, he has study cram videos for many Microsoft certifications including the SC-900.

I found John’s channel while studying for other Microsoft certifications in 2021. John’s DP-900 video was a big help in checking what I was happy with and what needed attention.

Since then I’ve kept an eye on John’s channel as I enjoy his style, and he publishes videos on other topics including PowerShell and DevOps. So when I committed to taking Microsoft’s SC-900 certification he was my first port of call.

Thanks John! You’re a great human!

Microsoft Learn

Microsoft Learn, Microsoft’s education platform, provides learning via guided paths and individual modules, depending on the desired outcome. It has options for casual and in-depth learning, exam preparation and on-demand streaming. It also includes a gamified experience system to drive user engagement.

Microsoft Learn has a free learning path with four modules tailored to the SC-900 exam. These modules offer a wide range of learning resources. Some sections are text-based, while others include videos and screenshots of the Azure portal. Some also include interactive exercises that provide a sandbox.

Microsoft Learn is a great resource and I look forward to seeing what else it has to offer.

Summary

In this post, I talked about my recent experience with Microsoft’s SC-900 certification and the resources I used to study for it.

In my opinion, Microsoft’s SC-900 has found a great niche for itself. Microsoft’s investment in this certification highlights the importance of security at a fundamental level. And despite the exam’s branding, the knowledge required to earn the certification is useful for many platforms and roles.

Microsoft’s SC-900 helped me prove my familiarity with security fundamentals. It also demonstrated to me that my security knowledge goes beyond the AWS cloud. It was a good experience and well worth taking on!

If this post has been useful, please feel free to follow me for future updates.

Thanks for reading ~~^~~


AWS Summit London 2022 Takeaways

In this post, I will talk about my main takeaways from my visit to the AWS Summit London 2022 event.


Introduction

Anyone following my Instagram will have seen that I attended the AWS Summit London 2022 event in April. This was my first AWS event, and I had a great time watching the presentations, taking in the atmosphere and finding things that a magnetic shark could stick to.

Besides stickers and badges, I left the event with pages of notes and photos of slides that fell roughly into two lists:

  • Consider for work
  • Consider for me

I’ve done the work list, so it’s time for mine! This post has two halves. Firstly, I’ll talk about some of the AWS services I want to try out on the blog over the next few months.

Then, in the second half, I’ll talk about some of the third-party presentations that introduced me to interesting things I hadn’t heard about before.

Let’s get started!

AWS Presentations

In this section, I’ll talk about some of the services mentioned in the AWS Summit London 2022 sessions that I want to try out over the next few months.

Amazon CloudWatch SDK For Python

I’d seen the CloudWatch SDK in passing while studying for my Certified Developer Associate certification, so it was great to finally see a demo of it in one of the sessions.

I was impressed with how quick and simple the SDK is to use, and have a few ideas for it as part of some Python ETLs and IoT functions I want to try. In addition, I can create and then re-use common monitoring modules to save myself some time in future.
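
To give a flavour of how little code is involved, here’s a minimal boto3 sketch that publishes a custom metric. The namespace, metric and dimension names are made up for illustration:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")  # assumes AWS credentials are configured

# Publish one data point to a hypothetical custom namespace.
cloudwatch.put_metric_data(
    Namespace="AmazonWebShark/ETL",
    MetricData=[
        {
            "MetricName": "RowsProcessed",
            "Dimensions": [{"Name": "Pipeline", "Value": "DemoPipeline"}],
            "Value": 1500,
            "Unit": "Count",
        }
    ],
)
```

Wrapping calls like this in a shared module is exactly the kind of re-use I have in mind.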

Amazon Timestream

From the Amazon Timestream website:

Amazon Timestream is a fast, scalable, and serverless time series database service for IoT and operational applications that makes it easy to store and analyze trillions of events per day up to 1,000 times faster and at as little as 1/10th the cost of relational databases.

Some time soon I’m hoping to try out a Raspberry Pi project that uses a temperature sensor. Timestream looks like a good fit for this! It’s built with IoT in mind, is serverless and offers built-in analytics. In addition, it offers integrations with Amazon Kinesis and Grafana, so it sounds simple to get off the ground.
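
As a taste of what that project might involve, here’s a hedged boto3 sketch that writes a single temperature reading. The database, table and dimension values are hypothetical:

```python
import time
import boto3

timestream = boto3.client("timestream-write")  # assumes credentials are configured

# One temperature reading from a hypothetical Raspberry Pi sensor.
timestream.write_records(
    DatabaseName="SensorData",   # hypothetical database
    TableName="Temperature",     # hypothetical table
    Records=[
        {
            "Dimensions": [{"Name": "device", "Value": "raspberrypi-01"}],
            "MeasureName": "temperature_celsius",
            "MeasureValue": "21.5",
            "MeasureValueType": "DOUBLE",
            "Time": str(int(time.time() * 1000)),  # milliseconds since epoch
        }
    ],
)
```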

AWS Data Exchange

From the AWS Data Exchange website:

AWS Data Exchange makes it easy to find, subscribe to, and use third-party data in the cloud.

After you’ve subscribed to a data product, you can use the AWS Data Exchange API to load data directly into Amazon Simple Storage Service (S3) and use a range of AWS analytics and machine learning (ML) services to analyze it.

One of the challenges of trying out services aimed at big data is a lack of big data.

Sample databases like Northwind, AdventureWorks and WideWorldImporters have been around for a while, helping generations of people learn their craft. However, Northwind was intended for SQL Server 2000. And although WideWorldImporters is more recent, it’s a bit limited by modern standards.

AWS Data Exchange offers a variety of modern Data Products via the AWS Marketplace. Currently, there are over 3,500 Data Products, and almost half of them cost nothing to access. So lots to use for potential EMR, Glue and SageMaker projects!
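
As a starting point, here’s a small boto3 sketch that lists the data sets an account is already entitled to. It assumes AWS credentials are configured and at least one subscription exists:

```python
import boto3

dataexchange = boto3.client("dataexchange")

# Page through the data sets this account is entitled to via its subscriptions.
paginator = dataexchange.get_paginator("list_data_sets")
for page in paginator.paginate(Origin="ENTITLED"):
    for data_set in page["DataSets"]:
        print(data_set["Id"], data_set["Name"])
```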

AWS DataOps Development Kit (DDK)

From the AWS DataOps Development Kit repo:

The AWS DataOps Development Kit is an open source development framework for customers that build data workflows and modern data architecture on AWS. Based on the AWS CDK, it offers high-level abstractions allowing you to build pipelines that manage data flows on AWS, driven by DevOps best practices.

The DDK joins the CDK as something I want to try out. I’ve not done anything with infrastructure as code on the blog yet. However, the CDK sounds like a good place to start, and the DDK could quickly spin me up some infrastructure to use with some Data Exchange data.
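
To show what starting out with the CDK might look like, here’s a minimal Python sketch that defines a stack containing one encrypted S3 bucket. The app, stack and bucket IDs are hypothetical:

```python
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct

class DataLakeStack(Stack):
    """A hypothetical stack holding a single encrypted bucket for raw data."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        s3.Bucket(
            self,
            "RawDataBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,           # encrypt by default
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,  # no public access
            removal_policy=RemovalPolicy.DESTROY,                # fine for experiments
        )

app = App()
DataLakeStack(app, "DataLakeStack")
app.synth()
```

Running cdk deploy against an app like this would create the bucket; per the quote above, the DDK builds on the same model with higher-level data pipeline constructs.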

AWS One Observability Workshop

From the One Observability Workshop Studio:

You will learn about AWS observability functionalities on Amazon CloudWatch, AWS X-Ray, Amazon Managed Service for Prometheus, Amazon Managed Grafana and AWS Distro for OpenTelemetry (ADOT). The workshop will deploy a micro-service application and help you learn monitoring.

I’ve already made some bespoke monitoring for my main AWS account. I’m interested in trying this workshop out to see what else I can learn. I’m also keen on getting some first-hand experience with X-Ray, Prometheus and Grafana.

Third-Party Presentations

In this section, I’ll talk about some of the third-party presentations that introduced me to interesting things that I hadn’t heard about before.

Cazoo’s Serverless Architecture

Cazoo's Engineering Coach Bob Gregory spoke about their use of AWS serverless technologies including Lambda, DynamoDB and Athena. As a result, Cazoo was first to market and could scale quickly in response to rapid customer demand.

This was my first time hearing about Cazoo, and Bob turned a very business-oriented presentation into a chat with some mates at the pub. He has a great speaking style.

Amazon published a press release about Cazoo on the day of the Summit. It details Cazoo’s current and future relationship with AWS and includes Cazoo’s plans to integrate various AWS machine learning tools. Examples include Textract for paperwork processing and invoice management and Rekognition for inventory handling and rapid image and video analytics.

And speaking of analytics…

EMIS Group’s Data Architecture

EMIS Group's CTO Richard Jarvis spoke about how they use various AWS services to ingest, analyse and present healthcare data. During the pandemic in 2020, they were able to quickly analyse national COVID-19 data and provide clinical research about topics including transmission, treatment and vaccination.

EMIS Group’s data architecture includes a Data Mesh, which separates data producers from data consumers. Meanwhile, AWS IAM secures their applications by controlling how users access them and how the applications interact with each other.

As a result, EMIS Group can ensure that the right applications are accessible by the right people, and that sensitive and personal data is stored appropriately and in line with GDPR.

Ocado’s Fulfilment Robots

Ocado's Chief Technology Officer James Donkin and Chief of Advanced Technology Alex Harvey spoke about the use of AWS at their fulfilment centres. Ocado has made a name for itself in the field of robotics and has used this technology to drive efficiency and innovation.

That video is from 2018 and a lot has changed since then. This year Ocado have begun upgrading to their new 600 Series fulfilment robot, pictured here:

Wait. That’s a Borg Cube. Hold on.

YOU WILL BE REFRIGERATED

Alex and James talked about the challenges of operating thousands of robots, and how AWS helps them innovate and scale while maintaining low latency and cost. Ocado deploys microservices and web applications to AWS, which the robots rely on for communication and navigation.

Further information is available in an Ocado case study on the AWS website.

Summary

In this post, I discussed the main takeaways from my recent visit to the AWS Summit London 2022 event. I talked about some of the services I want to try out on the blog over the next few months, as well as some of the third-party presentations that introduced me to interesting things that I hadn’t heard about before.

In conclusion, I had a great time at the summit! I came away with a lot of good ideas and had some great conversations. Hopefully, I’ll be able to go back next year!

If this post has been useful, please feel free to follow me for future updates.

Thanks for reading ~~^~~


My First Technical Job

In this post I respond to the May 2022 T-SQL Tuesday #150 Invitation “Your First Technical Job” and talk about my time as an Application Support Developer.


Introduction

This month, Kenneth Fisher’s T-SQL Tuesday invitation was as follows:

This month for TSQL Tuesday I’d like to hear about your first technical job(s). I know most DBAs don’t start out working with databases so tell us how you did start.

So let’s go back in time!

Application Support

My first technical job was as an Application Support Developer at Think Money Group. I’d been with TMG for almost six years at the time, starting in a customer service role before moving into an administrative role in 2012.

By 2016 I had a deep understanding of the organisation’s front and back ends, and had built several Excel and SharePoint-based solutions to make various internal and regulatory processes easier to manage.

When a vacancy in Application Support became available that summer, I went to meet the team and put in an application. My roles to date, knowledge of the organisation and technical skills were a good fit, so I was welcomed aboard!

Application Support was responsible for supporting TMG’s bespoke applications. The team had several regular tasks, which I’ll give a brief overview of below.

Ticketing System Triaging

This was my introduction to a ticketing system. As a team, we completed tickets from the backlog based on priority and age.

Incoming tickets needed to be triaged before they entered the backlog. This involved checks like:

  • Prioritisation: How urgent is the ticket?
  • Relevance: Is the ticket a reasonable request? Is it a duplicate of an existing ticket?
  • Serviceability: Is anything missing from the ticket? Can it be completed in its current form?

A correctly triaged and well-maintained backlog was essential. For the team, it helped us meet our SLAs and prevented bottlenecks and duplication. For each of us individually, it increased productivity and simplified the working day.

Servicing

Common tickets would involve giving users access to services or features within the application. For example, granting access to certain reports, or changing a user’s setup to alter their view on certain screens.

This didn’t need any code changes as it was handled within the application. Even so, care was needed to make sure requests were handled correctly, and that principles of least privilege were followed.

Scripting

Other common tickets would involve changing data at scale. For example, adding notes onto accounts when letters were sent. The application could handle this with small numbers of letters. However, it made more sense to insert notes directly into the database when there were thousands of them.
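
As an illustration of the kind of change involved, here’s how a bulk note insert might look today in Python with pyodbc. The connection details, table and columns are hypothetical:

```python
import pyodbc

# Hypothetical connection details for a test server.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=test-sql-server;DATABASE=TestCRM;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# (account_id, note_text) pairs, e.g. generated from a letter run.
notes = [
    (1001, "Letter sent 2016-08-01"),
    (1002, "Letter sent 2016-08-01"),
]

# executemany runs one parameterised INSERT per row in a single transaction.
cursor.executemany(
    "INSERT INTO dbo.AccountNotes (AccountId, NoteText) VALUES (?, ?)",
    notes,
)
conn.commit()  # nothing persists until the commit
```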

These tickets served as my introduction to SQL Server. I inserted new data into tables. I updated tables using temp tables and cursors. Every test customer once ended up with the same surname when I didn’t use a WHERE clause properly.

Mistakes are part of learning though. At least it wasn’t production…

Learning about the testing and production servers then introduced me to other SQL Server concepts like Agent Jobs, Linked Servers and Replication. So the floodgates were well and truly opened!

Troubleshooting

Users would sometimes get errors that they would send in for investigation. Common ones might be caused by:

  • Insufficient permissions for a certain screen.
  • Deadlocked processes in the backend database.
  • Unsupported application version.
  • Bugs in a new release.

Firstly, I would contact the user to explain what was happening. I would then either fix the problem or record it as a bug.

For unclear errors, I would contact one of the Software Engineers for guidance. We’d then either go through the error together, or I’d escalate it if it was very complex.

Summary

In conclusion, I have talked about my time as an Application Support Developer and have given an overview of what the role involved. In addition, I have explained how the role introduced me to SQL Server.

Thanks to Kenneth for this month’s topic! My previous T-SQL Tuesday posts are here.

If this post has been useful, please feel free to follow me for future updates.

Thanks for reading ~~^~~