BackSpace is now on Udemy. Get your free access!!!



Many of you have been requesting that we put our courses on Udemy, and we have finally listened. We are now listed on Udemy, and future courses will appear there as well.

As an introductory offer we are giving special deals on our Udemy listings.

We will of course continue to sell and support courses with exam engines at backspace.academy.

Super Fast Dynamic Websites with CloudFront, ReactJS, and NodeJS - Part 1


CloudFront should be an essential component of any web-based application deployment. Not only does it instantly provide super low-latency performance, it also dramatically reduces server costs while maximising server uptime.

Creating low-latency static websites with CloudFront is a relatively simple process: you simply upload your site to S3 and create a CloudFront distribution for it. This is great for HTML5 websites and static blogs such as Jekyll. But what about dynamic sites that need real-time information presented to the end user? A different strategy is clearly required. Much has been published about different methods of caching dynamic websites, but I will present the most common-sense and reliable technique to achieve this end.
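
The static half of that process can itself be scripted. Here is a minimal sketch using the AWS SDK for JavaScript, assuming credentials and region are already configured (the bucket name and file paths are placeholders):

var AWS = require('aws-sdk')
var fs = require('fs')
var s3 = new AWS.S3()

// upload one file of the built site; repeat for each asset
s3.putObject({
  Bucket: 'my-static-site',
  Key: 'index.html',
  Body: fs.readFileSync('./build/index.html'),
  ContentType: 'text/html'
}, function(err) {
  if (err) return console.error(err)
  console.log('Uploaded - now point a CloudFront distribution at the bucket')
})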

Server Side v Browser Side Rendering

If you read any book on NodeJS you will no doubt find plenty of examples of rendering Jade templates with Express on the server. If you are still doing this, then you are wasting valuable server resources. A far better option is to render on the browser side. There are a number of frameworks specifically for this, the most popular being Facebook's ReactJS and Google's AngularJS (see the State of JS report). I personally use ReactJS, and the example will be in ReactJS, but either is fine.

Creating your site using ReactJS or AngularJS and uploading it to your NodeJS public directory will shift the rendering of your site from your server to the client's browser. Users of your app will no longer be waiting for server-rendered pages and will see pages appear at the click of a button.
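
To give a feel for browser-side rendering, here is a minimal ReactJS sketch, assuming the React and ReactDOM scripts are already on the page (the component and element id are illustrative):

// everything below runs in the visitor's browser, not on the server
var Page = React.createClass({
  render: function() {
    return React.createElement('h1', null, 'Rendered in your browser')
  }
})

// mounts into <div id="app"></div> in the HTML shell served from S3
ReactDOM.render(React.createElement(Page), document.getElementById('app'))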

You can now create a CloudFront distribution for your ReactJS or AngularJS site.

Although pages may be rendered instantly in the browser, any dynamic data requested for those pages will, by default, be cached by CloudFront. We most probably do not want our dynamic data cached, so we still need a solution for delivering this data to the browser.


Handling Dynamic Data


Although many elaborate techniques have been published for handling dynamic data with CloudFront, the best way is to deliver this data through CloudFront without caching it at all.

Not all HTTP methods are cached by CloudFront: it only caches responses to GET and HEAD requests (although you can also configure it to cache responses to OPTIONS requests). If we use a different HTTP method, such as POST, PUT or DELETE, the request will not be cached by CloudFront. CloudFront will simply proxy these requests back to our server.

Our EC2 NodeJS server can now respond to requests for dynamic data: we create an API for our application that responds to POST requests from the client browser.
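
A minimal sketch of the idea (the route and payload are illustrative): the pages themselves come from CloudFront's cache, while POST requests pass straight through to the origin:

var express = require('express')
var bodyParser = require('body-parser')
var app = express()
app.use(bodyParser.json())

// responses to POST are never cached by CloudFront, so the data is always fresh
app.post('/api/latest', function(request, response) {
  response.json({ serverTime: new Date().toISOString() })
})

app.listen(8080)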




Some of you might be wondering why I haven't used serverless technology such as AWS Lambda or API Gateway. Rest assured I will be posting another series using this, but I consider EC2 the preferred technology for most applications. First of all, costs are rarely mentioned in the serverless discussion. If you have an application with significant traffic, the conventional EC2/ELB architecture will be the most cost effective. Secondly, many modern web applications are utilising websocket connections. Connections like this are possible with EC2 directly, and also behind an ELB when utilising proxy protocol. This is not possible with serverless technology, as connections are short-lived.

In the next post in this series we will set up our NodeJS server on EC2, create a CloudFront distribution and create our API for handling dynamic data.

Be sure to subscribe to the blog so that you can get the latest updates.

For more AWS training and tutorials check out backspace.academy

Attention all exam preppers! AWS Answers



AWS has just released a new AWS Answers page that is essential reading for those preparing for the certification exams.
It provides a great overview of AWS architecture considerations.

YAML and CloudFormation. Yippee!!!

YAML: YAML Ain't Markup Language


I spend a heck of a lot of time coding and, like many devops guys, love CoffeeScript, Jade, Stylus and YAML. No chasing missing semicolons, commas and curly braces. I just write clean code the way it should be, and at least twice as fast.

JSON, like plain JavaScript, is a lot cleaner, quicker and easier to read when you remove all those curly braces, commas and so on. YAML does just that!

AWS just announced support for YAML in CloudFormation templates. I would thoroughly recommend you check it out and start using YAML. It will make a big difference to your productivity, and your templates will be much easier to read and understand.

YAML, like CoffeeScript, Jade and Stylus, makes use of indentation to eliminate the need for braces and commas. When you're learning YAML, you can use a JSON to YAML converter (e.g. http://www.json2yaml.com) to convert your existing JSON to YAML.
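
If you would rather convert locally, a quick Node sketch does the same job, assuming the js-yaml package is installed (npm install js-yaml; the file names are placeholders):

var fs = require('fs')
var yaml = require('js-yaml')

// read an existing JSON template and write out the YAML equivalent
var template = JSON.parse(fs.readFileSync('template.json', 'utf8'))
fs.writeFileSync('template.yaml', yaml.dump(template))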

(Very) Basics of YAML

Collections using Indentation eliminate the need for braces and commas with Objects:

JSON 
"WebsiteConfiguration": {
"IndexDocument": "index.html",
"ErrorDocument": "error.html"
}

YAML
WebsiteConfiguration:
  IndexDocument: index.html
  ErrorDocument: error.html

Sequences with Dashes eliminate the need for square brackets and commas with Arrays:

JSON 
[
  "S3Bucket",
  "DomainName"
]

YAML
  - S3Bucket
  - DomainName

Full Example

Here is a full example I created for S3. I'll let you be the judge of which one is better!

JSON:

{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "AWS CloudFormation Sample Template",
  "Resources": {
    "S3Bucket": {
      "Type": "AWS::S3::Bucket",
      "Properties": {
        "AccessControl": "PublicRead",
        "WebsiteConfiguration": {
          "IndexDocument": "index.html",
          "ErrorDocument": "error.html"
        }
      },
      "DeletionPolicy": "Retain"
    }
  },
  "Outputs": {
    "WebsiteURL": {
      "Value": {
        "Fn::GetAtt": [
          "S3Bucket",
          "WebsiteURL"
        ]
      },
      "Description": "URL for website hosted on S3"
    },
    "S3BucketSecureURL": {
      "Value": {
        "Fn::Join": [
          "",
          [
            "https://",
            {
              "Fn::GetAtt": [
                "S3Bucket",
                "DomainName"
              ]
            }
          ]
        ]
      },
      "Description": "Name of S3 bucket to hold website content"
    }
  }
}

YAML:

---
AWSTemplateFormatVersion: '2010-09-09'
Description: AWS CloudFormation Sample Template
Resources:
  S3Bucket:
    Type: AWS::S3::Bucket
    Properties:
      AccessControl: PublicRead
      WebsiteConfiguration:
        IndexDocument: index.html
        ErrorDocument: error.html
    DeletionPolicy: Retain
Outputs:
  WebsiteURL:
    Value:
      Fn::GetAtt:
      - S3Bucket
      - WebsiteURL
    Description: URL for website hosted on S3
  S3BucketSecureURL:
    Value:
      Fn::Join:
      - ''
      - - https://
        - Fn::GetAtt:
          - S3Bucket
          - DomainName
    Description: Name of S3 bucket to hold website content



Shared Responsibility 3 - Identify and Destroy the Bots



Please note: you should have a link in your login for blind or vision-impaired people, as these techniques will prevent them from using your application. They could be accommodated using an alternative two-factor authentication process.

In my last post I detailed how to create dynamic CSS selectors to make it difficult for reliable Bot scripts to be written. The next part of this series is to identify the Bot and take retaliatory action. The code for this post is available at Gist in case Blogger screws it up again. The code for creating dynamic CSS selectors, including some additional decoy elements, looks like this:

var loginElements = { username: '', password: '' }

function dynamicCSS(){
 var username, password
 x = ''
 if ((Math.random()*2) > 1)
  x += '<link rel="stylesheet" href="style1.css">'
 else
  x += '<link rel="stylesheet" href="style2.css">'
 x += '<h3>Please Login</h3>'
 // between 2 and 6 stacked input pairs; the last pair written is the real one
 y = Math.floor((Math.random()*5)) + 2
 for (var a=0; a<y; a++){
  username = randomString()
  password = randomString()
  x += '<input class="stacked" id="' + username + '" name="' + username + '" type="text" placeholder="username">'
  x += '<input class="stacked" id="' + password + '" name="' + password + '" type="password" placeholder="password">'
  loginElements.username = username
  loginElements.password = password
 }
 // decoy inputs after the real ones to defeat jQuery ":last"
 for (var a=0; a<y; a++){
  x += '<input style="visibility:hidden" id="' + randomString() + '" type="text">'
  x += '<input style="visibility:hidden" id="' + randomString() + '" type="password">'
 }
 x += '<button onclick="submitForm(\'' + loginElements.username + '\',\'' + loginElements.password + '\')">Login</button>'
 return x
}

In a real app you would set this all up using a Jade template, but for simplicity we will just send raw HTML.

Analysing Header Information

The first thing we can look at is the header information sent to our NodeJS EC2 instance. I conducted tests using a number of different browsers, and also using the very popular PhantomJS headless WebKit browser, to find any clear differences between a real browser and a headless browser. Below are the results.

Request headers from Chrome:

{
  "host": "54.197.212.141",
  "connection": "keep-alive",
  "content-length": "31",
  "accept": "*/*",
  "origin": "http://54.197.212.141",
  "x-requested-with": "XMLHttpRequest",
  "user-agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36",
  "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
  "referer": "http://54.197.212.141/",
  "accept-encoding": "gzip, deflate",
  "accept-language": "en-GB,en-US;q=0.8,en;q=0.6",
}

Request headers from Firefox:

{
  "host": "54.197.212.141",
  "user-agent": "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:48.0) Gecko/20100101 Firefox/48.0",
  "accept": "*/*",
  "accept-language": "en-US,en;q=0.5",
  "accept-encoding": "gzip, deflate",
  "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
  "x-requested-with": "XMLHttpRequest",
  "referer": "http://54.197.212.141/",
  "content-length": "10",
  "connection": "keep-alive"
}

Request headers from Safari:

{
  "host": "54.197.212.141",
  "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
  "origin": "http://54.197.212.141",
  "accept-encoding": "gzip, deflate",
  "content-length": "9",
  "connection": "keep-alive",
  "accept": "*/*",
  "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_5) AppleWebKit/601.7.8 (KHTML, like Gecko) Version/9.1.3 Safari/601.7.8",
  "referer": "http://54.197.212.141/",
  "accept-language": "en-us",
  "x-requested-with": "XMLHttpRequest"
}

Request headers from Internet Explorer:

{
  "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
  "accept": "*/*",
  "x-requested-with": "XMLHttpRequest",
  "referer": "http://54.197.212.141/",
  "accept-language": "en-US;q=0.5",
  "accept-encoding": "gzip, deflate",
  "user-agent": "Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko",
  "host": "54.197.212.141",
  "content-length": "9",
  "dnt": "1",
  "connection": "Keep-Alive",
  "cache-control": "no-cache"
}

Request headers from PhantomJS:

{
  "accept": "*/*",
  "referer": "http://54.197.212.141/",
  "origin": "http://54.197.212.141",
  "x-requested-with": "XMLHttpRequest",
  "user-agent": "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0",
  "content-type": "application/x-www-form-urlencoded; charset=UTF-8",
  "content-length": "9",
  "connection": "Keep-Alive",
  "accept-encoding": "gzip, deflate",
  "accept-language": "en-AU,*",
  "host": "54.197.212.141"
}

I won't put the results here for all the browsers tested, but there is a clear structure to the PhantomJS header that is distinct from conventional browsers, in particular starting with "accept" and finishing with "host".
Analysing the header structure can therefore help distinguish a Bot from a real person.

Plugin Support

Support for plugins in headless browsers is minimal or non-existent. We can also look at using the JavaScript navigator object on the client to get information on supported plugins and other browser features. A simple test is to check the plugins array length. In order to do this we need to create a JavaScript file that runs on the client.

In the public directory of your NodeJS server create a new file called login.js (use your own IP address of course):

function submitForm(username, password) {
  loginURL = 'http://54.197.212.141' + '/login'
  user = $('#' + username).val()
  pass = $('#' + password).val()
  $.post( loginURL, {
    username: user,
    password: pass,
    plugins: navigator.plugins.length // empty for most headless browsers
  })
    .done(function( result ) {
      alert(result)
    })
}

This code will post information about the browser back to your NodeJS server. Although headless browsers can post forms, in my experiments they did not process jQuery post commands running on the client. In contrast, it worked on all normal browsers without issue.

If the post is submitted then further identification can occur. In this example we will just check the length of the plugins array and also send the filled-out input fields. If it is a Bot then the wrong input fields will be submitted and the plugins array length will be zero.

You will also need to install and enable bodyParser in your index.js file to receive the information:

var bodyParser = require('body-parser')
app.use(bodyParser.urlencoded({ extended: true }));

You will also need to update the dynamicCSS function to include jQuery on the client:

x += '<script src="https://code.jquery.com/jquery-3.1.0.min.js"></script>'

Also add a link to your login.js file:

x += '<script src="login.js"></script>'

Additional security can be achieved by creating multiple versions of the login file with different names for the parameters (username/password/plugins). The backend code that handles the post request would expect the fields defined in the login file that was served. Anything different that is posted means the request is most probably from a Bot. Another option is to serve the login.js file from the server side with random parameters and set up a route for it to be downloaded. Implementation is quite straightforward, as the sketch below shows.
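
For example, a minimal sketch of the second option (the route, field names and generated script are illustrative, not the code from the Gist):

// serve login.js dynamically, with a random name for the plugins field
app.get('/login.js', function(request, response) {
  loginElements.plugins = randomString() // remembered for the /login handler
  response.type('application/javascript')
  response.send(
    'function submitForm(username, password) {\n' +
    '  $.post("/login", {\n' +
    '    username: $("#" + username).val(),\n' +
    '    password: $("#" + password).val(),\n' +
    '    ' + loginElements.plugins + ': navigator.plugins.length\n' +
    '  }).done(function(result) { alert(result) })\n' +
    '}'
  )
})

The /login handler would then read the plugins value from request.body[loginElements.plugins] instead of request.body.plugins.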

Building a Profile of the Bot


It is quite important that you use a number of techniques to identify Bots, to make sure you do not have a case of mistaken identity. It is a good idea to use a scoring system and identify a threshold that will trigger corrective action.

In the index.js file that runs on your server, add the following code:

app.post('/login', function(request, response) {
  var botScore = 0
  if (request.body.plugins == 0) ++botScore // Empty plugins array
  if (!request.body.username) ++botScore // Clicked on decoy inputs
  if (!request.body.password) ++botScore
  if (getObjectKeyIndex(request.headers, 'host') != 0) ++botScore // Bot type header info
  if (getObjectKeyIndex(request.headers, 'accept') == 0) ++botScore
  if (getObjectKeyIndex(request.headers, 'referer') == 1) ++botScore
  if (getObjectKeyIndex(request.headers, 'origin') == 2) ++botScore
  console.log('Bot score = ' + botScore)
  if (botScore > 4) {
    console.log('Destroy Bot')
    response.send('fail')
  }
  else {
    response.send('Logged in ' + request.body.username)
  }
})
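
The getObjectKeyIndex helper used above simply returns the position of a key within an object (Node preserves the order in which the headers arrived, which is what makes the check possible):

function getObjectKeyIndex(obj, key) {
  return Object.keys(obj).indexOf(key) // -1 if the key is absent
}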

We are now building a bot score and deciding whether to allow access based on that score.

Attacking the Bot with Javascript

The following technique needs to be used with caution.

If you want to ensure the Bot is destroyed for the good of the internet community, you can look at launching a counterattack on the Bot with your own malicious script to crash its browser.

This can be achieved quite simply using an endless loop that reloads the page. This will block execution in the browser and eventually cause it to crash. Update your client-side code in login.js with:


function submitForm(username, password) {
  loginURL = 'http://54.197.212.141' + '/login'
  user = $('#' + username).val()
  pass = $('#' + password).val()
  $.post( loginURL, {
    username: user,
    password: pass,
    plugins: navigator.plugins.length
  })
    .done(function( result ) {
      if (result == 'fail')
        while(true) location.reload(true) // crash the Bot
      else {
        alert(result)
      }
    })
}


In my next post I will look at the different types and benefits of captcha and how to enable them on your site.

Be sure to subscribe to the blog so that you can get the latest updates.

For more AWS training and tutorials check out backspace.academy

New Classroom Website




We have just started updating our classroom website:

  • We now accept PayPal and use PayPal for all transactions, including credit cards (no more Stripe).
  • Runs on all browsers.
  • One-click login using Facebook (more options will be added soon).
Login is no longer with a username and password, so please log in using the Facebook account that is linked to the email you used with BackSpace. If your Facebook account uses a different email then please send me an email at info@backspace.academy and I will update the database with your Facebook email.

Shared Responsibility 2 - Using Dynamic CSS Selectors to stop the bots.


In my last post I talked about techniques to stop malicious web automation services at the source, before they reach AWS infrastructure. Now we will get our hands dirty with some code to put it into action. Don't worry if you are not an experienced coder; you should still be able to follow along.

How do Bot scripts work?

A rendered web page contains a Document Object Model (DOM). The DOM defines all the elements on the page, such as forms and input fields. Bots mimic a real user who enters information in fields, clicks on buttons and so on. To do this the Bot needs to identify the relevant elements in the DOM, and DOM elements are identified using CSS selectors. Bot scripts consist of a series of steps that detail CSS selectors and what action to perform on them.
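
To make that concrete, a typical Bot script for the PhantomJS headless browser looks something like this sketch (the URL and selectors are illustrative):

// load the login page, fill in the inputs found via CSS selectors, click login
var page = require('webpage').create()
page.open('http://example.com/login', function(status) {
  page.evaluate(function() {
    document.querySelector('#username').value = 'admin'    // select by id
    document.querySelector('#password').value = 'hunter2'
    document.querySelector('button').click()               // select by element type
  })
  phantom.exit()
})

If the ids and positions of those elements change on every page load, a script like this has nothing stable to hold on to, which is exactly what we will exploit.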

The DOM structure and elements of a page can be quickly identified using a browser. Pressing F12 in your browser will launch the developer tools with this information:


To see specific details of a DOM element, simply right-click on the element on the page and select 'Inspect':


This will open the developer tools with the element identified. You can easily get the CSS selector for the element by again right-clicking on the element in the developer tools:



Note that this will be only one representation of the element as a CSS selector (generally the shortest one). There are a number of ways an element can be defined as a CSS selector, including:

  • id name
  • input name
  • class names
  • DOM traversal, e.g. defining its chain of parent elements in the DOM
  • Text inside the element, using jQuery ':contains'.

Dynamic CSS Selectors

To make it difficult to develop Bot scripts you can use dynamic CSS selectors. Instead of creating the same CSS selectors each time your page is rendered, you can change them randomly each time.

When using NodeJS and Express this is quite straightforward, as you are already rendering pages on the server. Simply introduce some code to mix things up a bit.

Let's Start


First of all, set up an EC2 instance with NodeJS and Express configured to render pages. If you are unsure, you can view the video below:

https://vimeo.com/145017165

To save you typing, the code is available at Gist (also Blogger tends to screw up code when it is published).

Now let's change index.js to create a simple login form.

Point your browser to the public IP address of your instance to check everything is ok. e.g. xxx.xxx.xxx.xxx:8080

Now change the index.js file to include a dynamicCSS function:

app.get('/', function(request, response) {
  response.send(dynamicCSS())
})

function dynamicCSS(){
 x = ''
 x += '<link rel="stylesheet" href="style.css">'
 x += '<h3>Please Login</h3>'
 x += '<input id="username" name="username" type="text" placeholder="username">'
 x += '<input id="password" name="password" type="password" placeholder="password">'
 x += '<button>Login</button>'
 return x
}


Now run npm start at the command line of your EC2 instance and refresh the browser page. You will now see our very simple login form:


The problem with this form is that it is really easy to identify the DOM elements required to log in. The id, name and placeholder all refer to username or password.

Now let's change our code and introduce dynamically created CSS selectors.


var loginElements = {
 username: '',
 password: ''
}

function dynamicCSS(){
 var username = randomString()
 var password = randomString()
 loginElements.username = username
 loginElements.password = password
 x = ''
 x += '<link rel="stylesheet" href="style.css">'
 x += '<h3>Please Login</h3>'
 x += '<input id="' + username + '" name="' + username + '" type="text" placeholder="username">'
 x += '<input id="' + password + '" name="' + password + '" type="password" placeholder="password">'
 x += '<button>Login</button>'
 return x
}

function randomString(){
 chars = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split('')
 chars.sort(function() { return 0.5 - Math.random() })   // shuffle the characters
 return chars.splice(0, 8).toString().replace(/,/g, '')  // take the first 8
}



This now generates a random string for the id and name attributes of the input elements, making it impossible to use them in a reliable Bot script. If you run npm start again and view the element in the developer tools, you can see the random strings.

We now need to look at the other ways our elements can be identified as CSS selectors. As you can see, the text "username" and "password" is still used in the placeholder and input type attributes. Also, the DOM structure itself doesn't change dynamically, making it possible to reference the element by traversing the DOM structure.

We will address both problems by creating a random number of decoy input elements with the same parameters. The CSS position property will allow us to stack them on top of each other so that the decoy elements are not visible on the page:

app.get('/', function(request, response) {
  response.send(dynamicCSS())
})

var loginElements = {
  username: '',
  password: ''
}

function dynamicCSS(){
  var username, password
  x = ''
  x += '<link rel="stylesheet" href="style.css">'
  x += '<h3>Please Login</h3>'
  // between 2 and 6 input pairs, stacked by the CSS position property;
  // the last pair written is the real one
  y = Math.floor((Math.random()*5)) + 2
  for (var a=0; a<y; a++){
    username = randomString()
    password = randomString()
    x += '<input class="stacked" id="' + username + '" name="' + username + '" type="text" placeholder="username">'
    x += '<input class="stacked" id="' + password + '" name="' + password + '" type="password" placeholder="password">'
    loginElements.username = username
    loginElements.password = password
  }
  x += '<button>Login</button>'
  return x
}

function randomString(){
  chars = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split('')
  chars.sort(function() { return 0.5 - Math.random() })
  return chars.splice(0, 8).toString().replace(/,/g, '')
}


Now when you view the DOM in your browser developer tools, you can see the decoy input elements created underneath the real input element. If you refresh your browser you will see a different number of elements created each time (between 1 and 5 decoys).



The Bot creator can no longer use the username and password placeholders or input types to identify the elements. Nor can they traverse the DOM structure, as this is changing as well. As pointed out by a reader of this post (thanks Vadim!), you should also put some random inputs after the real ones to handle jQuery ':last'. A good place would be underneath your logo.
y = Math.floor((Math.random()*5)) + 2
for (var a=0; a<y; a++){
  username = randomString()
  password = randomString()
  x += '<input class="stacked" id="' + username + '" name="' + username + '" type="text" placeholder="username">'
  x += '<input class="stacked" id="' + password + '" name="' + password + '" type="password" placeholder="password">'
  loginElements.username = username
  loginElements.password = password
}
// decoys after the real inputs (e.g. underneath your logo) to defeat ':last'
for (var a=0; a<y; a++){
  x += '<input style="visibility:hidden" id="' + randomString() + '" type="text">'
  x += '<input style="visibility:hidden" id="' + randomString() + '" type="password">'
}

The next thing a bot script can do is click on an x-y position on the screen. We can handle this by randomly changing the position of the elements.

var loginElements = {
 username: '',
 password: ''
}

function dynamicCSS(){
  var username, password
  x = ''
  // two stylesheets place the form at different positions on the page
  if ((Math.random()*2) > 1)
    x += '<link rel="stylesheet" href="style1.css">'
  else
    x += '<link rel="stylesheet" href="style2.css">'
  x += '<h3>Please Login</h3>'
  y = Math.floor((Math.random()*5)) + 2
  for (var a=0; a<y; a++){
    username = randomString()
    password = randomString()
    x += '<input class="stacked" id="' + username + '" name="' + username + '" type="text" placeholder="username">'
    x += '<input class="stacked" id="' + password + '" name="' + password + '" type="password" placeholder="password">'
    loginElements.username = username
    loginElements.password = password
  }
  x += '<button>Login</button>'
  return x
}

function randomString(){
  chars = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split('')
  chars.sort(function() { return 0.5 - Math.random() })
  return chars.splice(0, 8).toString().replace(/,/g, '')
}


The position of the input elements is now random. This currently only has two positions, but you can elaborate on this to create many possible combinations. You may also place your login form inside a modal window that changes position on the screen.

If you want to go further you can look at having two login forms, username followed by password. Or even better, randomly change between the two.

We have now addressed the main techniques a Bot creator can use to identify your input elements and log in to your site.

Congratulations, you made it to the end!

What's next?

In my next post I will introduce techniques to identify bots and then look at launching a counter attack on the bot to crash it after it has been positively identified.

Be sure to subscribe to the blog so that you can get the latest updates.

For more AWS training and tutorials check out backspace.academy

Shared Responsibility - Stopping threats at the source


Over the past week Denial of Service (DoS) has been dinner-table talk in Australia since the catastrophic failure of its online Census implementation. Everyone from the Prime Minister down has been quick to blame IBM, and quick to accuse the Chinese government of the attack.

After the dust has settled and reality sets in, the true picture appears: no Chinese conspiracy, but certainly poor planning. Expecting the entire population of Australia to log in and complete a multi-page form on the same night was, in reality... dumb.


One thing that surprised me in all the discussion by the experts was the focus on back-end strategies to mitigate attacks. Surely any strategy should start at the root cause: the user interface. A lot can be done at the source using client-side strategies that prevent malicious traffic from ever reaching your resources. Presenting a simple login page without additional protection is surely a magnet for malicious attacks.

A magnet for attack

AWS talk a lot about "shared responsibility". This is a fundamental tenet of good application design. We need to take responsibility for what is within our control and out of the control of AWS. Allowing threats to reach AWS infrastructure is putting all of the responsibility on AWS and taking none yourself.

When designing a critical application a number of questions can be asked and the solutions designed into the application:

  1. Are you a real browser?
  2. Are you a real person?
  3. Can I identify you?
  4. Do you have permission to be here?

Headless Browsers

In general, threats will come from an automated service, not from a person at a desktop browser. Your first line of defence should be to detect headless browsers. Headless browsers run on a server without any GUI and operate using a simulated Document Object Model (DOM). There are a number of differences between headless browsers and normal browsers that can be used for detection and mitigation:

  • Plugins: The available plugins for headless browsers will be minimal and the plugins array will probably be empty. Testing for common plugin availability and the plugins array length can identify threats instantly (see the sketch after this list).
  • HTML5/CSS3: HTML5/CSS3 support is not a priority for a headless browser as it does not have a GUI. Testing for these features can help identify headless browsers.
  • Iframes: Embedding forms within iframes can make it more difficult for bots to identify DOM elements.
  • Ajax: Presenting a simple login form with input type password makes life easy for bots. Consider using Ajax to reveal the login process with animation step by step. Also consider varying the time randomly between steps.
  • CSS id and classes: Dynamically creating forms allows the opportunity to vary the class names and id of the elements on the form. It also allows the position in the DOM to be varied. This will make it difficult for reliable bot scripts to be created as they are based upon CSS selectors.
  • Web Security: Headless browsers do not have the same respect for security as conventional browsers, due to the limited value of the information contained in browser storage. Web security features are also mostly disabled by Bot owners. This creates an opportunity to run client-side scripts on the headless browser that consume its server resources. Extreme care must be taken with this approach due to the possibility of accidentally targeting innocent visitors.
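
As a rough illustration of the first two checks above, a client-side sketch (the signals and scoring are illustrative only, and should be combined with the server-side checks covered later in this series):

// feature probes that tend to fail in headless browsers
function botSignals() {
  var score = 0
  if (!navigator.plugins || navigator.plugins.length === 0) ++score // empty plugins array
  if (!window.localStorage) ++score                                 // missing HTML5 storage
  var c = document.createElement('canvas')
  if (!(c.getContext && c.getContext('2d'))) ++score                // no HTML5 canvas support
  return score // post this back to the server along with the login request
}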

Captcha:


Captcha is the technology users hate but unfortunately we need. Image-type Captchas are the most common and are generally effective at preventing brute-force attacks.

Image Captcha
Despite this, they can be easily overcome through the use of a Captcha-solving service. The Bot will save an image of the Captcha and forward it to the service for solving. These services use OCR where possible, and actual human entry otherwise, to solve the Captcha. The reason why this is still reasonably effective is the expense involved in using a Captcha service. Bots will generally move on to more cost-effective targets.



The latest one-click technology from reCAPTCHA is by far the best. Unlike image Captcha, this requires actual real clicks before it is accepted. This technology is extremely effective at distinguishing between real clicks and Bot clicks. It is also not possible to transfer the Captcha to a solving service, as it is not an image and will be disabled when the Captcha URL is run on a different machine. This is certainly a powerful weapon to have in your arsenal.
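
Server-side verification of the reCAPTCHA response is a single HTTPS call. A minimal Express sketch, assuming the request npm package, with RECAPTCHA_SECRET standing in for your secret key (the route and messages are illustrative):

var request = require('request')

app.post('/login', function(req, res) {
  // the reCAPTCHA widget posts its token as g-recaptcha-response
  request.post('https://www.google.com/recaptcha/api/siteverify', {
    form: {
      secret: process.env.RECAPTCHA_SECRET,
      response: req.body['g-recaptcha-response']
    }
  }, function(err, httpResponse, body) {
    if (err || !JSON.parse(body).success) return res.send('fail')
    res.send('Captcha passed - continue with the login')
  })
})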

Federated Identity

If your traffic has come this far it is most likely not a Bot, and probably has a Facebook account. Federated users identified through Facebook, Google, Amazon etc. are issued with temporary credentials, which can be used in conjunction with Amazon Cognito to ensure a high degree of authentication has occurred before AWS resources are accessed. More info:

Using AWS Cognito with Node.JS

Using Cognito with PhoneGap/Cordova

AWS IAM

Not much needs to be said about AWS Identity and Access Management. AWS supply a fantastic service and it is up to us to make full use of it, through least-privilege roles for your federated users and by ensuring credentials are temporary and safe.


So, next time you are thinking about security, don't forget about security at the source.

In the next post I will discuss using dynamic CSS selectors to stop the bots.



Welcome aboard India!


AWS Announces New Asia Pacific (Mumbai) Region





At last India has its own region, with two Availability Zones. Much overdue, but sure to be a popular decision. The following services are available in the new region:

    AWS Certificate Manager (ACM)
    AWS CloudFormation
    Amazon CloudFront
    AWS CloudTrail
    Amazon CloudWatch
    AWS CodeDeploy
    AWS Config
    AWS Direct Connect
    Amazon DynamoDB
    AWS Elastic Beanstalk
    Amazon ElastiCache
    Amazon Elasticsearch Service
    Amazon EMR
    Amazon Glacier
    AWS Identity and Access Management (IAM)
    AWS Import/Export Snowball
    AWS Key Management Service (KMS)
    Amazon Kinesis
    AWS Marketplace
    AWS OpsWorks
    Amazon Redshift
    Amazon Relational Database Service (RDS) – all database engines including Amazon Aurora
    Amazon Route 53
    Amazon Simple Notification Service (SNS)
    Amazon Simple Queue Service (SQS)
    Amazon Simple Storage Service (S3)
    Amazon Simple Workflow Service (SWF)
    AWS Support
    AWS Trusted Advisor
    VM Import/Export

The available services will no doubt be expanded, so be sure to check the AWS website for more details.

New Course AWS Certified SysOps Administrator!



The much-awaited AWS Certified SysOps Administrator course has been released. It is available with the AWS Certified Associate course, and all existing members will have access!

BackSpace Academy

Pre-Warming of EBS Volumes is not necessary


Amazon Web Services AWS EBS

A number of people have asked me about pre-warming of new EBS volumes. I realise that there are a lot of courses and exam dumps out there stating this is necessary. In fact, it is not necessary with new volumes, and if you answer this incorrectly you will lose valuable marks on the exam.

The only situation where preparation is required before access is with volumes that were restored from a snapshot:

"New EBS volumes receive their maximum performance the moment that they are available and do not require initialization (formerly known as pre-warming). However, storage blocks on volumes that were restored from snapshots must be initialized (pulled down from Amazon S3 and written to the volume) before you can access the block." Initializing Amazon EBS Volumes

When in doubt read the docs

BackSpace Academy

Amazon Aurora Cross-Region Read Replicas

Amazon Aurora

Watch out for this on the exam!

Just announced by AWS: Cross-Region Read Replicas for Amazon Aurora. You can now create Aurora read replicas in a different region from the master. Creating the new read replica also creates an Aurora cluster, which can contain up to 15 more read replicas!

We will be updating the course material with the changes. In the meantime, more details in the docs: Replicating Amazon Aurora DB Clusters Across AWS Regions.

BackSpace Academy 

New videos for AWS Certified Associate Courses

BackSpace Academy AWS Certified Associate Course


We have just created more new videos for the AWS Certified Associate course:
Amazon DynamoDB Core Knowledge  (New)
Amazon Simple Queue Service (SQS) Core Knowledge  (New)
Amazon Simple Notification Service (SNS) Core Knowledge  (New)

BackSpace Academy 

New Course Videos added

BackSpace Academy AWS Certification Videos

We have just updated some existing videos and also created new videos for the AWS Certified Associate course:
AWS Virtual Private Cloud (VPC) Core Knowledge  (New)
AWS Relational Database Service (RDS) Core Knowledge (New)
AWS Elastic Beanstalk Core Knowledge (New)
AWS OpsWorks Core Knowledge (New)
Amazon EC2 Core Knowledge (Updated)

BackSpace Academy 

AWS Certificate Manager rolling out to new regions

AWS Certificate Manager


Previously you would need to buy your SSL certificates outside of AWS, convert them to the format required by AWS, and then upload them to your ELB. Life is much easier now with AWS Certificate Manager, which provides this service, along with the certificates, for free! How cool is that?

The service has been rolled out to most regions so you may get a question on it in the exam.

AWS Certificate Manager is a service that lets you easily provision, manage, and deploy Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificates for use with AWS services. SSL/TLS certificates are used to secure network communications and establish the identity of websites over the Internet. AWS Certificate Manager removes the time-consuming manual process of purchasing, uploading, and renewing SSL/TLS certificates. With AWS Certificate Manager, you can quickly request a certificate, deploy it on AWS resources such as Elastic Load Balancers or Amazon CloudFront distributions, and let AWS Certificate Manager handle certificate renewals. SSL/TLS certificates provisioned through AWS Certificate Manager are free. You pay only for the AWS resources you create to run your application.

More details at https://aws.amazon.com/certificate-manager/ 

BackSpace Academy 

ECS Auto Service Scaling

Watch for this on the exam! An ECS tutorial will be released with the SysOps videos.

Amazon EC2 Container Service Supports Automatic Service Scaling

Amazon EC2 Container Service (Amazon ECS) can now automatically scale container-based applications by dynamically growing and shrinking the number of tasks run by an Amazon ECS service.
Previously, when your application experienced a load spike you had to manually scale the number of tasks in your Amazon ECS service.
Now, you can automatically scale an Amazon ECS service based on any Amazon CloudWatch metric. For example, you can use CloudWatch metrics published by Amazon ECS, such as each service’s average CPU and memory usage. You can also use CloudWatch metrics published by other services or use custom metrics that are specific to your application. For example, a web service could increase the number of tasks based on Elastic Load Balancing metrics like SurgeQueueLength, while a batch job could increase the number of tasks based on Amazon SQS metrics like ApproximateNumberOfMessagesVisible.

BackSpace Academy 

 

New EC2 instance X1

Watch out for this on the exam!

X1 instances, the largest Amazon EC2 memory-optimized instance with 2 TB of memory



X1 instances extend the elasticity, simplicity, and cost savings of the AWS cloud to enterprise-grade applications with large dataset requirements. X1 instances are ideal for running in-memory databases like SAP HANA, big data processing engines like Apache Spark or Presto, and high performance computing (HPC) applications. X1 instances are certified by SAP to run production environments of the next-generation Business Suite S/4HANA, Business Suite on HANA (SoH), Business Warehouse on HANA (BW), and Data Mart Solutions on HANA on the AWS cloud.
X1 instances offer 2 TB of DDR4 based memory, 8x the memory offered by any other Amazon EC2 instance. Each X1 instance is powered by four Intel® Xeon® E7 8880 v3 (Haswell) processors and offers 128 vCPUs. In addition, X1 instances offer 10 Gbps of dedicated bandwidth to Amazon Elastic Block Store (Amazon EBS) and are EBS-optimized by default at no additional cost.

BackSpace Academy 

AWS Certified SysOps Exam Engine


We have just added the Exam Engine for the AWS Certified SysOps Associate course!

The exam engine is included with the Associate courses. All existing paid customers will receive access to the exam engine.

The course videos and material will be released in the next week.

BackSpace Academy 

2016 Updated Courses!






Major changes to the format of our courses! One payment enables access to all associate courses and exam engines.
Courses have been updated for 2016 and the format has been changed to make study easier. Core AWS subjects that are relevant to all streams are in a separate course, and the specific subjects for the three streams are in separate courses.
Next week we will be adding the SysOps Administrator exam engine, followed by the course material the following week. Existing customers will receive this for free.

BackSpace Academy 

What a month in AWS!

It is certainly hard keeping up with AWS releases! Here are some of the highlights:

Amazon RDS Cross-Account Snapshot Sharing.

Watch out for this one on the certification exam!
Regular database snapshots have always been a part of any good AWS administrator’s routine. Now the service is even better, with the ability to share snapshots across different accounts.
Organisations should have multiple separate linked accounts for a number of reasons: security, separation from production environments, cost visibility etc. Now you can take snapshots of your production environment and copy them to a development account for testing without any risk.

EC2 Run Command.

Watch out for this one on the certification exam!
This new feature will help you to administer your instances (no matter how many you have) in a manner that is both easy and secure.
It also greatly increases security by allowing commands to be run remotely from the console, without having to log in through a bastion host.

EC2 Spot Blocks

Watch out for this one on the certification exam!
Now you can create spot instances that run for a fixed period of time from 1 to 6 hours.

MariaDB on AWS RDS

Watch out for this one on the certification exam!
We now have another database in the RDS suite. MariaDB is a fork of MySQL and provides some additional capabilities.

AWS WAF - Web Application Firewall

Watch out for this one on the certification exam!
Another tool in your AWS security arsenal. Deploy custom and application-specific rules in minutes that block common attack patterns, such as SQL injection or cross-site scripting.

 

Amazon Inspector – Released in preview

Amazon Inspector is an automated security assessment service. This allows you to inspect your applications for a range of security vulnerabilities.

Amazon Kinesis Firehose

Load streaming data quickly and easily into Amazon S3 and Amazon Redshift to enable real-time analytics.

Amazon QuickSight – Released in preview

Amazon’s new Business Intelligence tool for analysis of data from Amazon EMR, Amazon RDS, Amazon DynamoDB, Amazon Kinesis, Amazon S3 and Amazon Redshift. QuickSight utilises SPICE (Super-fast, Parallel, In-memory Calculation Engine) to return results from large datasets in rapid time.

AWS IoT – Released in beta

AWS IoT (Internet of Things) provides cloud services for embedded devices. Tiny devices can use AWS Lambda, Amazon Kinesis, Amazon S3, Amazon Machine Learning, Amazon DynamoDB etc. to provide powerful capabilities for many applications.
Many embedded processor suppliers, including Intel, Microchip PIC, TI, BeagleBone, Avnet, Marvell, MediaTek, Renesas, DragonBoard and Seeeduino, provide starter kits to get you going.

AWS Mobile Hub – Released in beta

This service streamlines the process of creating mobile iOS and Android apps that use AWS services.


We will be updating our backspace.academy certification courses to reflect all the changes.