Friday, June 14, 2019

AWS Certified Solutions Architect - Professional

So I recently needed to re-certify my AWS Certified Solutions Architect - Professional certification. I tried to keep track of everything I did, what I found helpful and what wasted my time. Hopefully this can aid someone looking at doing this certification; there is so much content to cover.
I spent roughly 2 months prepping for this certification, although I did take a week break in the middle and skipped a couple of days here and there.

Roughly, I did about 100 hours of video content (courses and YouTube), 10 practice exams and a whole bunch of reading.

Courses & Video Content

A Cloud Guru:
I have always been a fan of ACG; however, I found this course lacking in detail to be honest. The actual videos felt like summaries, and then you are pointed at other external resources. If you recently did your associate certification, I don't know if you'll get much out of it.
I used it as a starting point and it was a good "refresher". The exam simulator is another source of practice, and it is quite good.
https://acloud.guru/learn/aws-certified-solutions-architect-professional-2019

Linux Academy:
This course covers a lot of the content in some good detail, and gets you to actually try it out in labs. I would recommend everyone sign up for this. The instructor and other students are also quite active on their Slack channel, and I found some of those discussions and the exam feedback useful.
https://linuxacademy.com/amazon-web-services/training/course/name/aws-certified-solutions-architect-professional-2018

Exam Readiness: AWS Certified Solutions Architect - Professional (Digital):
https://www.aws.training/transcript/curriculumplayer?transcriptId=ui4o3L6JBUuG_Zx7hkNvwA2
It was a complete waste of 3 hours of my life.

YouTube / re:Invent:
Search YouTube for AWS re:Invent deep dives or 300/400-level sessions on any topic you need to cover.
Here are my playlists:
Level 400s - Watch all of these
Level 200-300s - Watch what's required

This is my Architecture:
I generally enjoy this format: quick, real-life examples of working architectures.
https://aws.amazon.com/this-is-my-architecture/


Notes

I didn't read a single FAQ; I personally don't like the format. Instead I used the AWS cheat sheets from Tutorials Dojo.
https://tutorialsdojo.com/aws-cheat-sheets/


Practice Exams

ACG, Linux Academy:

Both had 1 test each. They were pretty good, and I would recommend everyone do them. I just found their explanations focused only on the correct answer, whereas identifying why answers are incorrect is just as important.
https://acloud.guru/learn/aws-certified-solutions-architect-professional-2019
https://linuxacademy.com/amazon-web-services/training/course/name/aws-certified-solutions-architect-professional-2018

Whizlabs:
These have a bad rep amongst other people taking the test, and I would agree to some extent. I don't know if they added that much value, except for the practice in reading and dissecting questions at speed, and their explanations of some concepts were useful.
https://www.udemy.com/aws-certified-solutions-architect-professional-whizlabs/

Jon Bonso / Tutorials Dojo:
These were good. I would say they were a little easier than my actual exam, but they are still great practice, and great for reviewing the explanations for both correct and incorrect answers.
https://www.udemy.com/aws-solutions-architect-professional-practice-exams-2018/

General Reading



The Exam

Topics covered in my 75 questions (as much as I can remember):
  • AWS Organizations scenarios - Multi AWS accounts, SCP, SAML IAM  4+
  • Lambda + API Gateway: 4+
  • AWS Systems Manager - Patch, Run Command, Automation, Maintenance: 4+
  • Amazon Aurora vs RDS vs EC2 hosted: 3+
  • Elastic Beanstalk: 3+
  • ECS & Fargate: 3+
  • CloudFormation - nested stacks, StackSets: 3+
  • CloudFront - Caching & Lambda@Edge: 3 
  • SQS: 3
  • EBS - Provisioned IOPS & GP2: 2+
  • DynamoDB: 2+
  • ELB & Autoscaling : 2+
  • Redshift (one HA / quick recovery scenario) : 2+
  • AWS CodeDeploy & CodePipeline: 2+
  • CloudWatch / CloudWatch Logs: 2
  • CloudTrail: 2
  • Direct Connect & Direct Connect Gateway: 2
  • VPN & Direct Connect routing BGP preferred: 2 
  • Data Migration Service:  2
  • AWS WAF & Shield & Shield Advanced (DDoS) : 2
  • Transit VPC vs Transit Gateway / VPC Peering:  2
  • Kinesis family - Video Streams, Firehose, Analytics
  • NAT Instance / NAT Gateway
  • Route 53
  • AWS Config
  • Application Discovery
  • Snowball, Snowmobile
  • AWS Batch
  • Athena & QuickSight
  • AWS Rekognition
  • Trusted Advisor (Business Level Support)
  • Lex
  • OpsWorks
  • Service Catalog
  • AppStream
  • VPC Flow Logs
  • VPC Endpoints - Private DNS, S3
  • Storage Gateway - File and Tape
  • X-Ray
  • Amazon Connect

My experience writing the exam:

I'm going to start by saying I found it harder, and the questions less obvious, than any of the practice tests. I have a certification exam technique where I flag questions I am not 100% sure of... not necessarily to go back to, but as a counter... I know that if I flagged fewer than 20 out of the 75 questions, I could be pretty sure that I would pass... and could just complete the exam.

I started, Question 1: easy... ok nice can relax, I got this...

I then proceeded to "flag" the next 7 questions due to thoughts ranging from "mmm ... unsure" to "I have no clue, wtf?".

Anyways, that trend continued throughout the exam... a couple correct, a whole bunch of "I'm not sure". I got to the end with around 40 minutes to spare and realised that I had flagged 39 questions. I was quite disheartened at that point, so I went back and tried to review as much as I could. I got through about 20 flagged questions when I had 9 minutes left, resigned myself to the fact that this was my first attempt, and started thinking about what I should go back and learn for when I write again in 2 weeks.

Amazingly, I saw the words... "Congratulations, you have successfully passed ... bla bla bla."
I had to read that a couple of times... and still didn't believe it until I actually got my score report the following day.
I somehow managed to get: 880

So after receiving my score and re-evaluating my exam, I guess my main piece of advice would be:

The way the questions / scenarios are structured is purposely meant to be a little unclear. Be sure to pay attention to the key words / synonyms / phrases that show what the question is actually asking for:
cost, fewest changes, fastest processing time, least ongoing maintenance, most secure, highly available, most scalable, least downtime, fastest recovery time... etc
Then apply what you have learnt about the services. Even if you're not 100% sure on the actual implementation of the scenario, make sure it satisfies the core requirement.

To anyone reading this and planning to write, Good Luck! This is a tough one.

Thursday, July 6, 2017

The struggle to stay up to date is real...

We all know the software development industry is a non-stop innovation avalanche barreling down a mountain.




So staying up to date with all the tech you touch is becoming a real struggle. When I started out as a developer many years back, I felt staying on the bleeding edge wasn't too bad.
I mean I needed to know and keep in touch with:

  • My main language and some of its main frameworks... Java / JUnit / Hibernate
  • Some HTML, JavaScript, applets, JSP tags
  • Some RDBMS and SQL.. Latest changes on Oracle / SQL Server, how their indexing strategies change between versions etc.

These days however the list of technologies, frameworks, languages and tools has just exploded.

I have in some way or other worked with, tried, read about, implemented, studied, been interested in or been affected by the following in the last year, most of which have shipped so much functionality it's actually mind-boggling:

  • Java
    • Java 7 to 8 
    • Frameworks: Spring, Spring Cloud, Spring Boot, Hibernate 
  • JavaScript
  • TypeScript
  • Go
  • Python
  • Testing:
    • JUnit, Cucumber, Pact.IO, Selenium, Jasmine, Karma, Protractor, Hamcrest, Shazamcrest, Mockito
  • AngularJS
  • Angular 2 / 4
  • Docker
    • Docker, Compose, Swarm
  • AWS
    • Lambda, DynamoDB, Step Functions, S3, Cognito, ECS, EC2, VPC, Cloud Formation
  • Netflix
  • Elasticsearch
  • Logstash, Kibana
  • Weblogic
  • Tomcat
  • Kubernetes
  • Kafka
  • Pivotal Cloud Foundry
  • Caching
    • Hazelcast, Redis, EHCache
  • Storage
    • Oracle, MongoDB, Prometheus.io, Neo4J, H2, MySQL, InfluxDB
  • Build + CI/CD
    • Jenkins, Bamboo, Concourse
  • Git
  • Windows Server, MacOS, Linux, iOS
  • Blockchain

I am pretty sure there are a couple of others that I am missing... and that's not mentioning the architectural styles, patterns, agile practices and the like that are so prevalent in our industry.

What sparked this blog post was watching the DockerCon Videos from May...
I would say I use Docker pretty regularly, generally on quite simple use cases for my own development and some work-related projects...
But watching the videos below I realized that I had completely missed out on a ton of new functionality...

The reason I could watch a DockerCon video at 11AM on a Wednesday morning is that I am currently between countries and therefore between jobs... So I have a week or 2 to kill, and obviously I jump at the opportunity to go check out some of the latest stuff...

However, seeing how many changes there were and how much of it I had missed...
I think I may need to schedule "tech-cations" every couple of months to just gorge on all that is new for a couple of days... or hopefully, with the use of public transport in my new city, I can cram another hour or so of podcasts or videos into my daily schedule.







Friday, May 5, 2017

Some interesting videos from the last ng-conf

With so much content out there it is sometimes hard to find content that is actually worth spending your time on.

ng-conf released 66 videos in their latest playlist.
Also, someone gathered all the relevant resources for the conference, available here.

Here are a couple that I watched and found interesting or informative:

Angular & GraphQL ... Apollo looks awesome, something I definitely want to try on my next Angular app / side project.



I use the angular-cli and I think it's awesome... but John Papa uses it more.. some nice tips



RxJS - as I don't get to develop in Angular on a daily basis, it's always good to get a refresher and learn new functionality.



More RxJS...



Some component design considerations from the guys on the Angular Material team.

Monday, April 17, 2017

B.A.S.S - Boot Angular Secure Starter

As this is a post related to a project "starter", I decided this time to have most of the details in the README.md on GitHub rather than here on my blog.

To skip my ramblings: https://github.com/bdupreez/bass

I have recently done a couple of little side projects and proofs of concept, all in Angular 2 (or 4 or whatever someone calls it), all using Spring Boot, and all requiring some sort of authentication.

With both the Spring and Angular teams iterating and releasing so quickly, I found that by the time I got to a new POC I often had to do a number of updates, and some of the projects brought in their own unnecessary complications. It would take longer than I'd like to get started with the new POC, so I decided to get back to a barebones implementation.

Hopefully it proves helpful to someone that requires a quickstart full-stack Angular / Spring application.

It can be extended and deployed as required.

One choice I would like to mention, as it might not be to everyone's taste, is having the Angular application run as a Spring Boot application rather than just deploying the static content to Nginx or Apache. If you're looking to deploy this in a high-volume production environment, you'd probably want to relook at that.
The reason I did this was simply convenience for myself: I liked having the whole app deployed as 3 self-contained jars and driving both backend and UI configuration from one Spring profile environment variable. Of course, as this already contains some of the Spring Cloud dependencies, it would be simple enough to include service discovery with Eureka and a Config Server, but for a quickstarter... running 5 or 6 Boot applications for a simple web UI might be overkill :)
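As an illustration of that choice (and not necessarily how BASS wires it up), a common Boot 1.x-era pattern is to package the compiled Angular bundle under the jar's static resources and forward non-asset routes to index.html so the Angular router can take over. A minimal sketch, where SpaForwardingConfig is a made-up class name:

import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.ViewControllerRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurerAdapter;

@Configuration
public class SpaForwardingConfig extends WebMvcConfigurerAdapter {

    @Override
    public void addViewControllers(ViewControllerRegistry registry) {
        // Any top-level route without a file extension (i.e. not a .js/.css/.ico asset) is
        // forwarded to index.html so the Angular router handles it; the compiled Angular
        // bundle is assumed to live under classpath:/static (e.g. the output of ng build)
        registry.addViewController("/{path:[^\\.]*}").setViewName("forward:/index.html");
    }
}

In a high-volume production environment you would more likely serve those static assets from Nginx or a CDN, as mentioned above.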

More details on Github Project README.md

Tuesday, March 14, 2017

Consumer Driven Testing with Pact & Spring Boot

Recently a colleague of mine stumbled across Pact.io. Our current application had grown to over 50 services, and we were starting to have some integration test failures and a brittle dev / acceptance test environment, so we decided to have a look at ways to help with this.

I started out by reading:
https://docs.pact.io/faq/convinceme.html

Then watching:
https://www.youtube.com/watch?v=-6x6XBDf9sQ&feature=youtu.be

Those 2 resources convinced me to give it a shot.

So I set out and created a quick set of Spring Boot apps, in the GitHub repo here, to test out the concepts and get everything working.

To highlight some important bits from the demo:

Consumer:
Pact is a consumer-driven test framework, so this is where you start: you define a unit test in which the HTTP server response is mocked, and you assert against that.
Once the test is successful it creates a pact JSON file in the /pacts directory.
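As a rough sketch of what such a consumer test can look like with the pact-jvm-consumer-junit DSL (the provider / consumer names and the /api/code/1 endpoint here are made up for illustration, not taken from the demo repo):

import static org.junit.Assert.assertTrue;

import au.com.dius.pact.consumer.Pact;
import au.com.dius.pact.consumer.PactProviderRuleMk2;
import au.com.dius.pact.consumer.PactVerification;
import au.com.dius.pact.consumer.dsl.PactDslWithProvider;
import au.com.dius.pact.model.RequestResponsePact;
import org.junit.Rule;
import org.junit.Test;
import org.springframework.web.client.RestTemplate;

public class CodeServiceConsumerPactTest {

    // Spins up a mock HTTP server on a random port and plays back the interaction defined below
    @Rule
    public PactProviderRuleMk2 mockProvider = new PactProviderRuleMk2("code-service", this);

    @Pact(provider = "code-service", consumer = "code-consumer")
    public RequestResponsePact createPact(PactDslWithProvider builder) {
        return builder
                .given("a code with id 1 exists")          // becomes an @State on the producer side
                .uponReceiving("a request for code 1")
                .path("/api/code/1")
                .method("GET")
                .willRespondWith()
                .status(200)
                .body("{\"id\": 1, \"descr\": \"some code\", \"type\": 1}")
                .toPact();
    }

    @Test
    @PactVerification("code-service")
    public void fetchesACodeFromTheMockProvider() {
        // A real test would exercise your actual client class rather than RestTemplate directly
        String body = new RestTemplate()
                .getForObject(mockProvider.getUrl() + "/api/code/1", String.class);
        assertTrue(body.contains("some code"));
    }
}

Running this test is what produces the pact JSON file that later gets published to the broker.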

After the "mock" test is run and the pact file has been created, you need to include a Maven plugin ...pact... that is then used to publish the content of the pacts/ folder to the Pact Broker, which is defined in the pom as below.

Producer:

This uses the JUnit integration from Pact.io to download the pacts from the broker and then run them against a running service.

Since this already uses a @RunWith annotation, I could not use the Spring Boot runner. To get around that, as a @BeforeClass step I start the Spring Boot application, the pacts then get run against that running instance... and the Boot application gets stopped again after the tests. Depending on your use case, I guess it would also be an option to do this with @Before so you get a new service instance started before each pact, but that would slow down the execution tremendously.

The @State annotation allows clients to define a specific state, which the producer can then use to set up additional data / conditions required for the test to run.

Once the pacts have executed against the service, reports are generated in the target folder.
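For reference, a rough sketch of the producer-side verification using the pact-jvm-provider-junit runner; the provider name, broker host / port and the ProviderApplication class are placeholders rather than the demo project's actual code:

import au.com.dius.pact.provider.junit.PactRunner;
import au.com.dius.pact.provider.junit.Provider;
import au.com.dius.pact.provider.junit.State;
import au.com.dius.pact.provider.junit.loader.PactBroker;
import au.com.dius.pact.provider.junit.target.HttpTarget;
import au.com.dius.pact.provider.junit.target.Target;
import au.com.dius.pact.provider.junit.target.TestTarget;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.runner.RunWith;
import org.springframework.boot.SpringApplication;
import org.springframework.context.ConfigurableApplicationContext;

@RunWith(PactRunner.class)
@Provider("code-service")                     // must match the provider name used in the consumer tests
@PactBroker(host = "localhost", port = "80")  // pulls every pact registered for this provider
public class CodeServiceProviderPactTest {

    private static ConfigurableApplicationContext context;

    // The runner replays each interaction over HTTP against whatever is listening on this port
    @TestTarget
    public final Target target = new HttpTarget(8080);

    @BeforeClass
    public static void startService() {
        // ProviderApplication stands in for your @SpringBootApplication class
        context = SpringApplication.run(ProviderApplication.class);
    }

    @AfterClass
    public static void stopService() {
        context.close();
    }

    @State("a code with id 1 exists")
    public void codeWithId1Exists() {
        // Insert or arrange whatever data this consumer-declared state requires
    }
}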




Setting up the Pact Broker

1. Grab the public images from Docker Hub.
docker pull dius/pact_broker
docker pull postgres

2. Then set up the Postgres DB:
docker run --name pactbroker-db -e POSTGRES_PASSWORD=ThePostgresPassword -e POSTGRES_USER=admin -d postgres
docker run -it --link pactbroker-db:postgres --rm postgres psql -h postgres -U admin
CREATE USER pactbrokeruser WITH PASSWORD 'TheUserPassword';
CREATE DATABASE pactbroker WITH OWNER pactbrokeruser;
GRANT ALL PRIVILEGES ON DATABASE pactbroker TO pactbrokeruser;
3. Once the DB is up, run the actual Broker:
docker run --name pactbroker --link pactbroker-db:postgres -e PACT_BROKER_DATABASE_USERNAME=pactbrokeruser -e PACT_BROKER_DATABASE_PASSWORD=TheUserPassword -e PACT_BROKER_DATABASE_HOST=postgres -e PACT_BROKER_DATABASE_NAME=pactbroker -d -p 80:80 dius/pact_broker


Extra References:

https://docs.pact.io/documentation/
https://docs.pact.io/documentation/sharings_pacts.html
https://github.com/DiUS/pact-jvm
https://github.com/DiUS/pact-jvm/tree/master/pact-jvm-consumer-junit

Get the example project
https://github.com/bdupreez/pactdemo


Sunday, February 12, 2017

Quick Docker environment cleanup

Just a quick reference on my docker usage.

When working / playing with Docker a lot, you quite often end up with a ton of containers and volumes lying around after they are no longer needed...

Here are a couple tips I use to keep my environment clean.

On Windows 10
If you are using Git Bash, create a .bashrc and include the aliases below:

alias docker-rm='docker rm $(docker ps -q -f status=exited)'
alias docker-rmi='docker rmi $(docker images -q -f dangling=true)'
alias docker-rmv='docker volume rm $(docker volume ls -q -f dangling=true)'

For MacOS X 
Add the following to your .bash_profile:

docker-rm() {
  # Remove all exited containers
  docker rm $(docker ps -q -f status=exited)
}
docker-rmv() {
  # Remove all dangling volumes
  docker volume rm $(docker volume ls -q -f dangling=true)
}
docker-rmi() {
  # Remove all dangling (untagged) images
  docker rmi $(docker images -q -f dangling=true)
}
docker-update-all() {
  # Pull the latest version of every tagged image
  docker images | awk '(NR>1) && ($2!~/none/) {print $1":"$2}' | xargs -L1 docker pull
}


Then simply run docker-rm, docker-rmi and docker-rmv to remove all the dangly bits :)

docker-update-all is a handy way to pull the latest versions of your images from Docker Hub.

Sunday, December 18, 2016

AWS Solutions Architect - Professional Certification


Update:
Check out my updated re-certification on the new 2019 exam... here

Let me start by saying that for this certification I studied and prepared more than for any certification I have done before, and I have done a number of certifications throughout my career: Microsoft, Sun, IBM, Oracle and now AWS. It does feel like there is a mountain of content to get through, and then you need to ensure you know it at the correct level of detail for the exam.

I booked the AWS Solutions Architect Professional exam 5 weeks after the associate-level one and set a couple of hours a day aside to study... During the last 2 weeks, however, those couple of hours became all my available hours every day; it was all I focused on, including 8+ hours a day on the weekends.

Main 4 resources I used:
  • YouTube
    • The re:Invent session videos are an essential resource
    • My major topics playlist: 


    • Watch these twice:

  • Linux Academy 
    • These guys really impressed me; the content in the pro-level course was excellent. I found it had more depth than A Cloud Guru, and it was well worth the $29 per month subscription.
  • Cloud Guru
    • Decent structured content that covers the highlights of all the topics defined in the exam blueprint.
    • The discussion forum is an excellent source of tips and discussions on the practice exam questions.
  • The official practice exam
    • This was a good eye-opener into how hard some of the questions can be. I do feel that some of the questions are badly articulated, which makes them almost impossible to answer; however, working through every question and researching the answers was invaluable. There were also a couple of those questions on the real exam... word for word.

Other Resources (Summary  / Study Notes):


Final thoughts:

All the guides and sites recommend reading a ton of white papers, but to be honest I read only 4 or 5 of them... and I felt that the time spent was a waste. It seems all the content was already in the YouTube videos and online courses.

I did see a lot of people mention that they were pressed for time during the exam, but I found that I finished the 77 questions with a good hour to spare. One thing to mention though: I did not go back and review any of my answers. I have always found in the past that I rarely change my initial answer during the review phase, and I was reasonably confident that I had done enough.

End Result:

Overall Score: 83%

Topic Level Scoring:
1.0 High Availability and Business Continuity: 90%
2.0 Costing: 100%
3.0 Deployment Management: 85%
4.0 Network Design: 57%
5.0 Data Storage: 81%
6.0 Security: 78%
7.0 Scalability & Elasticity: 90%
8.0 Cloud Migration & Hybrid Architecture: 85%


Saturday, November 12, 2016

AWS Solutions Architect - Associate Certification

I have been messing around with AWS on little side projects and experiments for about the last year. I did find it quite a daunting experience as there is just so much to learn, do, try, break, struggle, deploy, re-learn and then eventually cheer when everything works.
So I decided to focus a little more and spend the time actually learning how to hopefully do all this properly. I have always found that a good way to force myself into the theory, and to step back from diving directly into implementing solutions, is certifications.

I booked the AWS Solutions Architect Associate exam for 4 weeks into the future and set a couple hours a day aside to study.

Main 2 resources I used:

  • A Cloud Guru
    • I watched and worked through both the Associate Developer and Associate Architect courses. There is quite a bit of overlap, but that just meant I had covered some of the really important bits more than once.
    • I really enjoyed Ryan's courses. I mostly watched at 1.3x speed, as it does sometimes feel a little slow and I had set myself a 4-week deadline.
    • I would recommend them to anyone starting out who's looking to learn about AWS, whether for practical real-life use or for certification.
    • The discussion section / forum is a great source of tips for the exams as well.
  • Cloud Academy
    • I only signed up for their 7-day free trial, so I did not work through all of their labs. However, they have a ton of quiz questions and explanations, which is a great way to study / practice specifically for the certification.
    • Their summary video was also good and covered some content not covered in depth on A Cloud Guru.
    • They also had some quick, easy-to-read blog posts highlighting important practical information for when you're starting with AWS... listed in Other Resources below.
Other Resources:

The main AWS FAQs to look at:

In the end I felt I had personally learnt a lot, not only on the exam topics but in general practical application as well, especially with regards to VPCs, networking and security. It was well worth the intense 4-week study session.

End Result:

Overall Score: 90%
Topic Level Scoring:
1.0 Designing highly available, cost-efficient, fault-tolerant, scalable systems: 90%
2.0 Implementation/Deployment: 83%
3.0 Security: 90%
4.0 Troubleshooting: 100%











Next up.. AWS Solutions Architect - Professional

Saturday, October 8, 2016

TOGAF 9.1 Notes

I was considering doing the TOGAF 9.1 certification. I have gone over all the content, to at least have a decent idea of what TOGAF is and what it is trying to achieve.
However, based on my current position, interests and industry trends, I decided to focus my attention elsewhere for the moment.

So I am just noting down all the resources I found and used, and other relevant links, in case I do have the time or the need to come back to it.

Training course UDEMY (Scott Duffy):
https://www.udemy.com/togaf-enterprise-architect/
https://www.udemy.com/togaf-part2/
I would avoid the Exam Strategy one (https://www.udemy.com/study-togaf) as I found it a bit of a waste, and he covers a lot of the content in the other 2 courses anyway.

Books / PDFs:
The Actual TOGAF specification (PDF)
TOGAF 9 Foundation Exam Study Guide
TOGAF 9.1 Quick Start Guide for Enterprise Architects (Not sure where I got a PDF from)
TOGAF Cheat Sheet from Scott Duffy's Udemy course.

Other Useful Resources:

Sunday, September 11, 2016

Oracle Workspace Manager - Basic POC with Spring Boot and Flyway

Working with a process to update configuration and master data within an enterprise is always a challenging task. While investigating possible solutions for letting someone change data via a UI, and then have those changes tested, signed off and approved before taking them to production, I stumbled onto Oracle Workspace Manager.
According to the developer docs, it seemed to fit this use case exactly:

Manage a collection of updates and insertions as a unit before incorporating them into production data
Workspace Manager lets you review changes and roll back undesirable ones before making the changes public. Until you make the changes public, they are invisible to other users of the database, who will access only the regular production data. You can organize the changes in a simple set of workspaces or in a complex workspace hierarchy. A typical example might be a life sciences application in which Workspace Manager supports the discovery and quality assurance (QA) processes by managing a collection of updates before they are merged with the production data.


You could think of Oracle Workspace Manager as light, "git"-like DB versioning.
Simply put:

  1. You create a workspace (branch)... 
  2. You make your changes there... other people can connect to your workspace and also make changes, or alternatively create their own branch from yours.
  3. You can refresh (fetch / merge) your workspace and resolve any conflicts that arise.
  4. Once everything is sorted and all is well with your changes, you merge them back into the "LIVE" (origin: develop / master) workspace.

I used an existing Docker image of an Oracle Standard Edition version; Oracle Workspace Manager is unfortunately not available on XE. This is a rather large image, and it does take a couple of minutes to initialise. The image is available here on Docker Hub.


Connect to the database with the following settings:
hostname: localhost
port: 1521
sid: xe
service name: xe.oracle.docker
username: system
password: oracle

To connect using sqlplus:
sqlplus system/oracle@//localhost:1521/xe.oracle.docker
To connect on macOS, install the Instant Client and run from there:

./instantclient_12_1/sqlplus system/oracle@local


To set up the initial DB I tried out Flyway. It was all quite simple and easy to implement.
Under resources/db/migration there are a number of SQL files that do the initial database setup:

  1. create the tables: CODE and CODE_TYPE, 
  2. insert initial data 
  3. enable versioning on those tables.

When you enable versioning, the following happens: the table is renamed and a view is created, allowing the "recording" of changes that occur.
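The demo does this step in a plain Flyway SQL migration, but for illustration the whole step boils down to one DBMS_WM call per table; a hypothetical JDBC equivalent, using the connection settings from the Docker image above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class EnableVersioningExample {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//localhost:1521/xe.oracle.docker", "system", "oracle");
             Statement stmt = conn.createStatement()) {
            // Version-enable the table: Workspace Manager renames it and puts a view with the
            // original name in its place, so existing SQL keeps working while changes are
            // recorded per workspace (related tables may need to be enabled together; see the docs)
            stmt.execute("BEGIN DBMS_WM.EnableVersioning('CODE'); END;");
        }
    }
}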

Reference: (oracle presentation available here)



So after the initial setup you will see a number of tables and views:



To try this out...

  1. Get the Oracle Docker image
  2. Once the DB is started... run the Boot DataApplication.
  3. Use Postman against the REST resources below.

To check the current workspace:
GET: http://localhost:9119/poc/workspace
To select all the information from the code table for the current workspace:
GET: http://localhost:9119/poc/workspace/data/code
Create code in the current workspace:
POST: http://localhost:9119/poc/workspace/data/code
BODY: 
{
"id":2,
"descr": "some code",
"type": 1
}
To change workspace (LIVE is default and available):
PUT: http://localhost:9119/poc/workspace/{workspaceName}
To create a workspace:
POST: http://localhost:9119/poc/workspace/{workspaceName}
To remove a workspace:
DELETE: http://localhost:9119/poc/workspace/{workspaceName}
To merge a workspace:
PUT: http://localhost:9119/poc/workspace/merge/{workspaceName}


These REST resources just wrap some of the functions from the DBMS_WM package.
This is maybe just the tip of the iceberg, as there is a ton of functionality available in this package.
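To give a sense of how thin that wrapping is, here is a rough sketch of the kind of JdbcTemplate calls involved; the workspace endpoints mirror the list above, but the controller itself is illustrative and not copied from the project:

import org.springframework.jdbc.core.CallableStatementCallback;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/poc/workspace")
public class WorkspaceController {

    private final JdbcTemplate jdbc;

    public WorkspaceController(JdbcTemplate jdbc) {
        this.jdbc = jdbc;
    }

    // Current workspace for this database session
    @GetMapping
    public String currentWorkspace() {
        return jdbc.queryForObject("SELECT DBMS_WM.GetWorkspace FROM dual", String.class);
    }

    // Switch this session to another workspace (LIVE always exists)
    @PutMapping("/{name}")
    public void gotoWorkspace(@PathVariable String name) {
        call("DBMS_WM.GotoWorkspace", name);
    }

    // Create a new child workspace of the current one
    @PostMapping("/{name}")
    public void createWorkspace(@PathVariable String name) {
        call("DBMS_WM.CreateWorkspace", name);
    }

    // Merge a workspace's changes back into its parent (e.g. LIVE)
    @PutMapping("/merge/{name}")
    public void mergeWorkspace(@PathVariable String name) {
        call("DBMS_WM.MergeWorkspace", name);
    }

    // Remove a workspace
    @DeleteMapping("/{name}")
    public void removeWorkspace(@PathVariable String name) {
        call("DBMS_WM.RemoveWorkspace", name);
    }

    // Each of these DBMS_WM procedures takes the workspace name as its single argument
    private void call(String procedure, String workspace) {
        jdbc.execute("{ call " + procedure + "(?) }",
                (CallableStatementCallback<Void>) cs -> {
                    cs.setString(1, workspace);
                    cs.execute();
                    return null;
                });
    }
}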

All the code is available here.

