
First Thoughts About Azure

As many of you probably already know, my cloud development career started in AWS, which I used for about three years while I worked at Scentsy. Since my recent transition to a new job at a different company, I have started developing in Azure instead, and it’s been a learning journey. Although both platforms allow for cloud development and processing, they have quite a few notable differences in what is offered and how it’s offered, which is what I’m going to cover in this post. My goal isn’t to give a deeply technical or all-inclusive list of the differences, but rather to describe the differences a developer might feel in their own work if they make the same switch that I have.


Azure seems simpler

Azure is simpler yet still robust. Sometimes I feel like AWS tries to overcomplicate their services in order to make them seem fancier or more cutting-edge, and it also seems like they split what could be one service into multiple just to increase their total service count. Azure combines multiple functions I was used to in AWS into a single service. An example of that is Azure DevOps, which combines your ticketing/user story system with your DevOps pipelines and your Git (or other) repos. In my past job, we used TeamCity and Octopus Deploy for the pipelines, Jira for the ticketing, and Bitbucket to store our code, so I was a little confused my first couple of weeks in my new role since everything seemed to be in only one location. But now I find it nicer and easier to work with.

Azure has better cloud ETL development

In the Azure cloud platform, there is a service called Azure Synapse Analytics (which you work with through Synapse Studio), and a second service called Azure Data Factory; both allow you to create ETL pipelines right in the cloud. AWS has Glue, but that really doesn’t have the same feel or capabilities that either Synapse or Azure Data Factory (ADF) has in the Azure realm. I have already updated and created several pipelines in each of those Azure services, and I really enjoyed working with them because they were intuitive to pick up as a newbie and I could do everything I needed for the ETL right in the cloud development workspace.

When I worked with Glue in the past, it definitely had some capabilities for making drag-and-drop ETLs in the cloud, but the service seemed to have a lot of limits that would force you to start writing custom PySpark code to make the data move. While writing custom code is also possible with Synapse and ADF, they both come with more robust built-in components that let you build your ETLs quickly without writing any more custom code than a few SQL queries. I have really been enjoying working in these new services instead of AWS Glue.
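To give a sense of what that custom code involves, below is a minimal sketch of the boilerplate an AWS Glue PySpark job starts from; the database, table, and bucket names are hypothetical placeholders, and a real job would run inside Glue’s job environment rather than on your laptop:

```python
# Minimal AWS Glue PySpark job skeleton: read a cataloged table, apply a
# column mapping, and write it out. Database/table/bucket names are hypothetical.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the source table from the Glue Data Catalog
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="orders"
)

# Rename/cast columns -- the kind of step ADF gives you as a built-in component
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[("order_id", "string", "order_id", "string"),
              ("amount", "double", "order_amount", "double")],
)

# Write the result to S3 as Parquet
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/orders-out/"},
    format="parquet",
)
job.commit()
```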

More on Azure Data Factory

Another reason I have been enjoying working with Azure Data Factory (ADF) is that it feels like a modern version of the SSIS I am already familiar with, hosted in the cloud instead of on an ETL server and a local developer box. Although ADF doesn’t look exactly like SSIS, it is still the drag-and-drop ETL development tool I love working with. And since it’s developed by Microsoft, you get the best features of SSIS ETL development without having to work with the old, buggy software. I’m sure that as I keep working with ADF I’ll find new frustrating bugs to work around, but my experience with it so far has been only positive.

Power Automate & Logic Apps

Two other tools that don’t seem to have an analog in the AWS ecosystem are Power Automate and Logic Apps. While these tools are aimed more at people who are not developers, to allow them to automate some of their daily work, they are interesting and useful features for certain scenarios, and I am enjoying learning about them and playing with them. One of the best parts about working with Azure services is that they’re fully integrated into the entire Microsoft ecosystem, so you can pull in other non-Azure Microsoft services to work with Azure and expand your horizons for development. I’m not sure yet that I would 100% recommend Power Automate or Logic Apps for task automation (I’m still not done learning and working with them), but they are at least another option to fall back on in the Microsoft realm that isn’t available in AWS.

Copilot isn’t what they want it to be

While most of my experience with Azure so far has been positive, there are a couple of annoying things I’ve noticed that I think are worth sharing, although neither of them is so egregious that it would prevent me from recommending the platform.

The biggest negative about Azure for me so far is that Microsoft keeps trying to shove Copilot (their AI assistant, which seems only slightly more advanced than Clippy) into every single product they offer, even when it provides no benefit or actively detracts from your productivity. The perfect example of this is the “New Designer” for Power Automate. For some unknown reason, Microsoft has decided that instead of giving you a drag-and-drop interface for the task components that build your automation flow, everyone should be required to interact with Copilot and have it build your components instead. That might be useful if you had already worked with Power Automate and knew what capabilities and components it offered. But as someone totally new to this space, trying to learn how to use the tool with no idea what is currently possible to develop, I find it basically impossible to communicate with the AI in any meaningful way to build what I want. I don’t know what to ask it to create when I’ve never seen a list of the tasks that are available. Luckily, for now it is possible to toggle off the “New Designer” and switch back to the old one, which lets you add each individual component as you go, selecting components from a list that gives a short description of what each one does. Maybe in the future I’ll be more open to using Copilot with everything I develop, but right now, as a new developer in Azure, it doesn’t work for me.

Unintuitive service naming

The only other nitpick I have about the Azure and Microsoft cloud ecosystem is that the names they pick for their services sometimes don’t make sense, are confusing, or reuse the name of a totally different product. Microsoft doesn’t seem to be great at naming things so they’re understandable at a quick glance, but I suppose that can also be attributed to every cloud computing company’s desire to look modern and cutting-edge.

The best example I can give of this phenomenon right now is that a data lake in Azure is built on what are called Storage Accounts, the blob storage service within Azure. It’s not as confusing to me now that I’ve been dealing with it for a month and a half, but that name doesn’t seem at all intuitive. Each time my colleagues directed me to go to the “data lake”, I would get confused about where I was supposed to navigate, since the service I would click into was called Storage Accounts instead.

Summary

Although moving from an AWS shop to an Azure shop felt like such a big switch in the beginning, I have already started to enjoy developing in Azure. It has so much to offer in terms of cloud ETL development, and I can’t wait to keep learning and growing with these tools. I’ve already compiled so many things I can’t wait to share, and I’m hoping to get those posts written and published soon so others can learn from my struggles as a new Azure developer.

How I Prepared for my AWS Certification Exams

So far in my career, I have achieved three different certifications from AWS covering different aspects of their cloud development platform. I passed all three exams, one beginner, one intermediate, and one advanced, on the first try. While I’m usually pretty good at learning new things and taking tests, two of the exams left me questioning whether I had actually passed when I left the testing center, but I did end up passing each of them. Many of my colleagues have asked me how I prepared for the exams, as they are also beginning to work on similar certifications, so I thought I would share my process here to help others outside of my organization as well.


AWS exams I have taken

  • Certified Cloud Practitioner
    • This is the easiest of the certifications I have achieved
    • The content of this exam is general knowledge of most of the services available in the AWS cloud
    • You will not need to have any great depth of knowledge of any particular service to pass this exam
  • Certified Solutions Architect – Associate
    • This was the middle ground in terms of difficulty of the exams I’ve taken
    • Very focused on the cost and design of systems across a broad range of services in AWS
    • Focus on the “Well Architected Framework”
    • More of an architecture-level exam (not development level)
  • Certified Database Specialty
    • This was the most difficult of the exams I’ve taken, simply due to the deep level of knowledge I needed to have for each of the database services available in AWS
    • Focused somewhat on the design of databases in different services of AWS, but mostly focused on the best way to implement different use cases for the various database services
    • More of a development-level exam rather than architecture-level

My level of knowledge before starting courses for each exam

  • Certified Cloud Practitioner: absolutely no knowledge of anything with AWS before starting a course to learn and prepare for this exam.
  • Certified Solutions Architect: I had more general AWS knowledge before starting to learn and study for this exam due to what I had already learned for the previous exam, but I still hadn’t done any actual development work myself in the AWS platform.
  • Certified Database Specialty: In addition to the knowledge I had gained from studying for the two previous exams, I had finally started to play around in the AWS console personally, so I had a tiny bit of AWS database development experience before starting a course for this exam.

While it’s best to actually develop in the AWS cloud before trying to take an exam, it isn’t required in any sense. Preparing for the exams is actually a great way to learn more in depth about all the AWS services so you can start working with them yourself.

How long I learned and studied before taking the certification exams

  • Certified Cloud Practitioner: 2-3 months
  • Certified Solutions Architect: 4 months
    • At the time I was learning and preparing for this exam, I had quite a bit of free time during my average work day so I spent many hours each week preparing for this exam
    • If you only have a couple hours available each week to prepare for this exam, you will likely need more than 4 months to learn and prepare for the exam
  • Certified Database Specialty: 4-6 months
    • I did not have as much dedicated time during my work weeks to focus on this exam, so it took me longer to prepare (although I can’t remember now exactly how many months I prepared for it)
    • If you only have a couple hours available each week to prepare for this exam, you will likely need closer to 6 months or more to learn and prepare for the exam

How I used the ACloudGuru learning platform

ACloudGuru is a learning platform specifically aimed at helping developers learn how to use various cloud technologies, including AWS. They seem to have a course designed for every possible AWS certification you would want to achieve.

In all honesty, I’m not sure I would seek out this platform again for future certifications if I had to pay for it myself. My work pays for every developer to have a license, and it’s a great jumping-off point for learning enough about AWS to pass the certification exams, but it certainly has flaws. If you have the opportunity to work with these courses, though, they do a good enough job to get you 90% of the way to what you need for the AWS certification exams, so they’re certainly not a bad option.

Pros of ACloudGuru

  • It offers a unified, full course experience to cover most topics you will need to know to pass an AWS certification exam
  • You can watch the videos at your own pace and come back to any video whenever you need to
  • Each video and section of a course will offer some links to read for further information, which can be helpful to find the AWS documentation you need to read
  • Each section of the course has a review quiz to test your knowledge as you go, which can help you remember things better
  • There is at least one practice exam provided at the end of the course to cover all of the course’s content. You can take this exam as many times as you would like, and the questions are not always the same or in the same order (so you can’t just pass by memorizing which answer to select – A, B, C, or D).

Cons of ACloudGuru

  • I found that the videos often focused on things that weren’t that important for the exam and would somehow cram the most important details into one or two sentences that I would then have to fully unpack myself.
  • You need to supplement the course teachings with additional reading of relevant AWS white papers or other documentation online.
  • The course content isn’t updated as frequently as the exams seem to be, so I ended up covering a lot of content in the course that was never covered in my exam, and the courses also didn’t cover some exam topics nearly enough.
  • The editing of the videos wasn’t the best in the Database Specialty course, which I find disappointing for a platform that I’m sure costs a lot of money. Many videos kept things in that should have been edited out (actual bloopers, not just irrelevant content).
  • Practice exam questions aren’t written in the same manner as the actual exams, which might lead people to believe that the actual exams will be easier than they are.
  • The course definitely doesn’t spoon-feed you everything for the exam; you have to be willing to do your own additional research and experimentation to be fully prepared.

How I used the AWS white papers to learn more

As I mentioned in the section above, in addition to going through the ACloudGuru courses online, I also read a lot of documentation and white papers from AWS to feel like I truly had a sense of how each service operates.

My approach for learning was to watch each video in the ACloudGuru courses, making sure to take thorough notes on what was covered. After completing each video, I would review any documentation linked for that topic (there was usually at least one document per video, but not all videos had links to AWS documentation for further reading). If a video covered topics that I felt weren’t covered well enough, or that I was still confused about, and those topics didn’t have documentation linked in the course, I would seek out AWS and other documentation to learn more, and I would take notes on those documents as well.

Although reading documentation is never the most interesting thing you could be doing with your day, it really does pay off when it comes to taking the exam, so you should try to read the AWS white papers and documentation for each service whenever possible during your studying journey. And make sure to take good notes: for each of the two tougher exams, I filled 1/2 – 3/4 of a composition notebook, because there was a lot of content to cover.

Additional tools I used to help myself study

After I made my way through the entire ACloudGuru course for each certification exam and had read enough AWS documentation to fill my head for a long time, I would try my best to synthesize and condense my notes into a useful form for my final studying. For all three of the exams I have taken, I used the note review and study skills I developed in college.

Custom practice tests to help review notes

I feel like this is one of the nerdiest things I can admit to, but I swear it works, so I’ve done it for all three of the exams I’ve taken. After my notes were completed, I went through them again, chapter by chapter of the ACloudGuru course, and wrote my own practice exams to test myself with as a first pass. Doing this takes a lot of time and paper, but I personally think it’s worth it.

To make these personal tests based on my notes, I would essentially turn the most important bullet points into questions that I could then answer. So, for example, say I have a note that says “Redshift is used for data warehousing and data analytics, not OLTP”; I would turn that into the question “Which AWS service can be used for data warehousing and analytics but isn’t suited for OLTP data?”. I developed these custom tests in a Word document, printed them out, and worked through all the questions I had made for myself, trying to rely on memory rather than my notes whenever possible for the best recall and memorization.
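If you’d rather skip the printer, the same idea is easy to script. Here’s a toy Python sketch (the notes file name and its “question|answer” line format are just my own invention for illustration) that quizzes you from a file of note-derived questions and sets aside the ones you miss, much like the flash card trick below:

```python
# Toy self-quiz: reads "question|answer" lines from a notes file and quizzes
# you in random order. The file name and format are hypothetical.
import random

with open("aws_notes.txt", encoding="utf-8") as f:
    cards = [line.strip().split("|", 1) for line in f if "|" in line]

random.shuffle(cards)
missed = []
for question, answer in cards:
    input(f"\nQ: {question}\n(press Enter to reveal the answer) ")
    print(f"A: {answer}")
    if input("Did you get it right? [y/n] ").strip().lower() != "y":
        missed.append(question)

print(f"\nMissed {len(missed)} of {len(cards)} -- review those again.")
```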

Flash cards for quick and repetitive review of high-level ideas

A high school and college classic study tip, creating and using my own flash cards really helped me burn the necessary knowledge into my brain before each exam. And I created literally hundreds of flash cards for each exam. Sorry, trees, but the numerous flash cards really helped me so it was worth the use of so much paper.

While the thought of creating and reviewing hundreds of flash cards may seem daunting, I hardly ever tried to work through all of them at once. Instead, I made, grouped, and reviewed flash cards by topic or section of the ACloudGuru course and only ever really reviewed one section of cards at a time.

My best tip for getting the most out of flash cards is to remove from your review stack anything you can answer immediately upon seeing the card, and keep the cards that took you a while to answer, or that you couldn’t answer at all, to review again later. Keep reviewing the problematic topics over and over until they are no longer problems.

What practice exams I used to prepare

While each ACloudGuru course does include a practice exam to help you test yourself on the course’s contents, I personally do not think those practice exams are useful in preparing for the real exam, beyond general recall of the topics that could be covered.

The most useful part of my study routine for each of the AWS certification exams was taking practice exams through Udemy/Tutorials Dojo. I don’t know how those folks have done it, but their practice exams are extremely similar to the actual AWS exams. The wording of their questions and answers is basically the same as on the real exam, which I found to be the most helpful thing for studying, since it prepares you for the verbose formatting of the AWS exam questions. In comparison, the practice exams from ACloudGuru have very different formatting and wording, which, in my opinion, isn’t useful preparation for the real exams.

You can get the Tutorials Dojo exams through their website directly or through Udemy, but you will have to pay. However, the price is reasonable and well worth it (make sure you wait for a sale on Udemy to get the “course” for ~$16 instead of the list price of $80+). The price is especially reasonable if you’re in a situation like mine, where I would have had to repay my company the multi-hundred-dollar cost of the exam if I failed. Pay the little bit of money and retake the practice exams until you repeatedly score 70% or higher. Also make sure to review the correct answers and explanations for the questions you get wrong, because those explanations tend to be super helpful.

Conclusion

Overall, it does take quite a bit of time and effort to fully prepare yourself to take one of the AWS certification exams, but it is all totally doable and the exams are not impossible to pass. Just make sure you do your due diligence in studying before signing up to take the exams. If I can pass them, you can pass them.

Why CFTs Take so Long to Delete

Welcome to another coffee break post where I quickly write up something on my mind that can be written and read in less time than a coffee break takes.

Background

Recently, I went through an AWS workshop for Lake Formation, a data lake management tool in AWS, and that workshop had me create many different CloudFormation Templates (CFTs) to spin up services to use in the workshop. After I finished, I had to go through my development AWS account for work and clean up everything that had been created so we would stop paying for services I no longer needed.

While attempting to delete the many CFTs I had used (or more precisely, the stacks created from them), I saw one that was seemingly stuck in the DELETE_IN_PROGRESS state for almost 20 minutes. I had not realized it could take so long to delete one CFT and was getting worried that it was actually stuck, so I started searching online to see if this had happened to others as well.

Why does the delete take so long?

I found this Reddit post of someone reporting the same thing, and it linked to a very informative answer to a similar question on Stack Overflow. I would recommend you go and read that detailed answer there for the best understanding of why CFTs sometimes take forever to delete.

The simple answer is that that’s just how it is. My CFT in question had set up a lot of Virtual Private Clouds (VPCs), Elastic Compute Cloud (EC2) instances, and Elastic Network Interfaces (ENIs), as well as other resources, and some of those items simply take a while to delete.
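One thing you can do while you wait is watch which resources the delete is actually stuck on. Here’s a rough boto3 sketch of that (the stack name is a hypothetical placeholder, and it assumes your AWS credentials are already configured):

```python
# Watch a slow CloudFormation stack delete and report what it's waiting on.
import time
import boto3
from botocore.exceptions import ClientError

cfn = boto3.client("cloudformation")
stack_name = "lakeformation-workshop-stack"  # hypothetical stack name

while True:
    try:
        status = cfn.describe_stacks(StackName=stack_name)["Stacks"][0]["StackStatus"]
    except ClientError:
        print("Stack no longer exists -- the delete finished.")
        break
    if status != "DELETE_IN_PROGRESS":
        print("Final status:", status)
        break
    # Events come back newest-first; keep each resource's most recent status
    latest = {}
    for event in cfn.describe_stack_events(StackName=stack_name)["StackEvents"]:
        latest.setdefault(event["LogicalResourceId"], event)
    waiting = sorted(rid for rid, e in latest.items()
                     if e["ResourceStatus"] == "DELETE_IN_PROGRESS")
    print("Still deleting:", ", ".join(waiting) or "(none reported yet)")
    time.sleep(30)
```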

Even though I can’t speed up the deletion process for these big CFTs, at least now I know that in the future, should I need to delete any other large CFTs from my AWS account, I can expect the process to take longer than I would anticipate.

How to Get Public IPv4 DNS for AWS EC2 Instance

I have been trying to learn how to work with AWS Glue because it’s probably going to be a new ETL solution my organization uses as we migrate to Postgres in AWS. Part of learning how to use Glue is learning how to set up and use Postgres RDS instances so that I can move data between them with Glue.

Setting up the RDS instances was the easy part, since AWS makes that process go very smoothly. Even setting up the EC2 jump server to connect locally to my RDS instances seemed easy as well: only a few options to select, and then a new server was created for me.

The Problem

However, in my most recent attempt at creating all three of these servers (I regularly have to delete them while not in use so I don’t incur additional charges), I kept running into an issue where my EC2 server was not being assigned an IPv4 Public DNS address, and without that value, I couldn’t connect to that server as a jump host from my local computer. That was a big problem for me.

I spent over half an hour trying to troubleshoot this problem, double-checking the VPC rules for DNS and everything else I could think of, and none of it was working. I terminated and recreated the instance multiple times, and that did not do the trick. Finally, I found this Stack Overflow answer, which was exactly what I needed; the fix was super obvious yet hard to see at the same time.

The Solution was Simple

For some unknown reason, the default AWS used when I was creating new instances was to set “Auto-assign public IP” to “Disabled”, and I didn’t catch it at first because that section of the instance creation settings was also non-editable by default. If you run into this same issue, when you get to the “Network Settings” part of your instance creation dialog and “Auto-assign public IP” is set to Disabled with seemingly no way to change it, click the edit button at the top right of that pane to change the default instance settings. Then enable the option to assign a public IP address to the instance.
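For anyone who scripts their instances instead of clicking through the console, the equivalent setting exists in the API as well. Here’s a minimal boto3 sketch (every ID and name in it is a hypothetical placeholder); the AssociatePublicIpAddress flag on the network interface is the same setting the console calls “Auto-assign public IP”:

```python
# Launch a jump server with the public IP setting explicitly enabled.
# All IDs and names below are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.micro",
    KeyName="jump-host-key",
    MinCount=1,
    MaxCount=1,
    NetworkInterfaces=[{
        "DeviceIndex": 0,
        "SubnetId": "subnet-0123456789abcdef0",
        "Groups": ["sg-0123456789abcdef0"],
        # The console's "Auto-assign public IP" setting, enabled explicitly
        "AssociatePublicIpAddress": True,
    }],
)
print("Launched:", response["Instances"][0]["InstanceId"])
```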

It’s that simple. I can’t believe it took me so long to figure out something so obvious! But that’s life in IT sometimes.

Extra Note

When you stop and then start your EC2 instance again, it will be assigned a new Public IPv4 DNS name. It took me longer than I would like to admit to figure this out. I kept having an issue each morning where my SSH tunnels to my RDS databases through this EC2 server would no longer work. After several weeks of trying many different things, I finally figured out that the public IP address was changing each time I stopped my instance at the end of the work day and restarted it the following day, and that’s what was breaking my tunnel.
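Since the name changes on every stop/start, the workaround I landed on is to look up the current value each time instead of hard-coding it into my tunnel config. A minimal boto3 sketch (the instance ID is a hypothetical placeholder):

```python
# Fetch the instance's current public DNS name after each stop/start
# instead of hard-coding it. The instance ID is a hypothetical placeholder.
import boto3

ec2 = boto3.client("ec2")
reservations = ec2.describe_instances(
    InstanceIds=["i-0123456789abcdef0"]
)["Reservations"]
instance = reservations[0]["Instances"][0]

# PublicDnsName is an empty string while the instance is stopped
print(instance.get("PublicDnsName") or "no public DNS assigned (is it running?)")
```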