Tuesday, April 2, 2024

AWS Custom Domain Name with API Gateway for REST APIs

Imagine you have a Lambda function exposed as an API endpoint, but you can only call it through the default endpoint URL that AWS generates.

Would you like to call that endpoint from a subdomain of your choice? Yes, it is possible. Let's look at how to implement this setup.

Below is the architectural diagram of what we need to achieve: the user should be able to call a subdomain such as {stage}.{aws-region}.api.{yourdomain}

                                         


As highlighted, you have to use AWS Route 53, API Gateway, ACM, and Lambda to build this setup.

First, go to AWS Certificate Manager and request a certificate for the desired subdomain, here {stage}.{aws-region}.api.{yourdomain}. Set up validation via DNS or email, and you will receive a notification to approve the certificate request (only if you are the owner of the domain).
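As a sketch, the certificate request can also be expressed in CloudFormation. This is a hypothetical fragment assuming DNS validation; the domain name is a placeholder for your own subdomain:

```yaml
# Hypothetical sketch: an ACM certificate for the custom subdomain,
# validated via DNS. Replace the domain name with your own.
Resources:
  ApiCertificate:
    Type: AWS::CertificateManager::Certificate
    Properties:
      DomainName: dev.us-east-1.api.example.com
      ValidationMethod: DNS
```

Note that for an edge-optimized API Gateway domain the certificate must be issued in us-east-1, while a regional domain needs the certificate in the same region as the API.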


Now that the certificate is ready, let's create a custom domain name in API Gateway.

Add an API mapping to the API backed by your Lambda.


That completes the basic setup. Next, you need to create a record in your Route 53 hosted zone so that calls to the domain resolve to your API Gateway.
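The custom domain, the mapping, and the DNS record can be sketched together in CloudFormation. This is a hypothetical fragment assuming a REGIONAL endpoint; `RestApi`, `ApiCertificateArn`, the stage name, and the domain names are placeholders for your own resources:

```yaml
# Hypothetical sketch: custom domain, API mapping, and Route 53 alias record.
Resources:
  CustomDomain:
    Type: AWS::ApiGateway::DomainName
    Properties:
      DomainName: dev.us-east-1.api.example.com
      RegionalCertificateArn: !Ref ApiCertificateArn   # ARN of the ACM certificate
      EndpointConfiguration:
        Types: [REGIONAL]
  ApiMapping:
    Type: AWS::ApiGateway::BasePathMapping
    Properties:
      DomainName: !Ref CustomDomain
      RestApiId: !Ref RestApi   # the REST API backed by your Lambda
      Stage: dev
  DnsRecord:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneName: example.com.
      Name: dev.us-east-1.api.example.com
      Type: A
      AliasTarget:
        DNSName: !GetAtt CustomDomain.RegionalDomainName
        HostedZoneId: !GetAtt CustomDomain.RegionalHostedZoneId
```

The alias record points at the regional domain name that API Gateway generates for the custom domain, so no IP addresses are hard-coded.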


Finally, once all of the above is done, you can use Postman to call the endpoint you created.




Tuesday, September 12, 2023

AWS Backup Service

Creating an hourly scheduled backup

If you are already using AWS RDS instances, you may face this issue in production environments.

Every database needs the closest possible restore point if a disaster happens. But can we actually set up an hourly backup with the default backup options? No; in AWS, most instance types only support automated backups up to a daily frequency.

Since there is no built-in option for hourly backups, in this post I will present an alternative solution using an AWS service: AWS Backup.

First, assess your environments and your actual customer base, because you need a clear idea of how important it is to protect your data if a disaster or failure happens.

Let's move on to the main topic: how to achieve hourly backups.

To achieve this, we have two possible solutions:

1. Use a Lambda function, S3, and a triggered event to take a snapshot. You may have to use Amazon EventBridge (with event rules) or EventBridge Scheduler (recently introduced by AWS, and more cost effective than event rules) - the old way

  1. Create a Lambda function and invoke it hourly via the Scheduler or an event rule
  2. Implement the Lambda to trigger the backup and save it to an S3 bucket

I am not going to cover the implementation here, as that is not the purpose of this post.

2. Use the AWS Backup service to solve the given problem

Follow the steps below to implement this approach:

  1. Create the CloudFormation template
  2. Deploy the resources
  3. Monitor the resources and backups

Using a CloudFormation template to create the AWS Backup resources

AWSTemplateFormatVersion: 2010-09-09
Description: 'RDS MySQL Backup Service'
Parameters:    
  CreatedBy:
    Description: Who is creating the cloudformation stack
    Type: String
    Default: CodePipeline
  BackupPlanName:    
    Description: Enter the name of the backup plan (Required)    
    Type: String    
    Default: "hourly-backup"
     
  CronExpression:    
    Description: Enter the cron expression for your backup plan (Required). Currently setup to occur hourly.    
    Type: String    
    Default: "cron(0 * ? * * *)"    
     
  Retention:    
    Description: The number of days after which your backup will expire (Required)    
    Type: Number    
    Default: 3    
 
Metadata:    
  AWS::CloudFormation::Interface:    
    ParameterGroups:    
      -    
        Label:    
            default: BackupPlan Configurations (Mandatory)    
        Parameters:    
          - BackupPlanName    
      -      
        Label:    
          default: Backup Rule configuration    
        Parameters:    
          - CronExpression    
          - Retention    
Resources:        
  BackupVault:    
    Type: "AWS::Backup::BackupVault"    
    Properties:    
      BackupVaultName: !Sub ${BackupPlanName}-vault    
      AccessPolicy:    
        Version: '2012-10-17'    
        Statement:    
        -    
            Sid: 'Vault-Access-Policy'    
            Effect: Deny    
            Principal: "*"    
            Action: "backup:DeleteRecoveryPoint"    
            Resource:    
              - "*"    
  BackupPlan:    
    Type: "AWS::Backup::BackupPlan"    
    Properties:
      BackupPlan:    
        BackupPlanName: !Ref BackupPlanName    
        BackupPlanRule:    
          -    
            RuleName: !Sub ${BackupPlanName}-rule    
            TargetBackupVault: !Ref BackupVault    
            ScheduleExpression: !Ref CronExpression    
            Lifecycle:    
              DeleteAfterDays: !Ref Retention  
            StartWindowMinutes: 60
            CompletionWindowMinutes: 120
    DependsOn: BackupVault            
  TagBasedBackupSelection:
    Type: "AWS::Backup::BackupSelection"
    Properties:
      BackupSelection:
        SelectionName: !Sub ${BackupPlanName}-job-assignment  
        IamRoleArn: !Sub "arn:aws:iam::${AWS::AccountId}:role/service-role/AWSBackupDefaultServiceRole"
        ListOfTags:
         -
           ConditionType: "STRINGEQUALS"
           ConditionKey: "aws:cloudformation:stack-name"
           ConditionValue: "rds" # selects resources tagged by the CloudFormation stack named "rds"
      BackupPlanId: !Ref BackupPlan
    DependsOn: BackupPlan


Make sure you update your CloudFormation execution role with the permissions below.

        {
            "Action": [
                "backup:*"
            ],
            "Resource": "*",
            "Effect": "Allow",
            "Sid": "backup"
        },
        {
            "Action": [
                "kms:*"
            ],
            "Resource": "*",
            "Effect": "Allow",
            "Sid": "kms"
        },
        {
            "Action": [
                "backup-storage:*"
            ],
            "Resource": "*",
            "Effect": "Allow",
            "Sid": "backupstorage"
        }









Backup vault: A container for storing backups. Backup vaults are created in AWS Regions and can be used to store backups of AWS resources from multiple accounts.

Backup plan: A set of instructions that defines how backups are created and stored. Backup plans can be used to automate the backup process and ensure that backups are created on a regular basis.

Recovery point: A snapshot of a resource that can be used to restore the resource to a previous state. Recovery points are created by backup plans and stored in backup vaults.

Resource assignment/backup selection: A set of instructions that defines which resources should be backed up. Resource assignments can be used to back up specific resources or groups of resources.

You may refer to the following links for more information and to get familiar with the available features.

-Backup plan

https://docs.aws.amazon.com/aws-backup/latest/devguide/API_BackupPlan.html

https://docs.aws.amazon.com/aws-backup/latest/devguide/creating-a-backup-plan.html

-Backup vault

https://docs.aws.amazon.com/aws-backup/latest/devguide/vaults.html

-Backup rule

https://docs.aws.amazon.com/aws-backup/latest/devguide/API_BackupRule.html

-Backup Assignee/Selection

https://docs.aws.amazon.com/aws-backup/latest/devguide/API_BackupSelection.html

-Backup vault (CloudFormation resource)

https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-backup-backupvault.html


Visual Design of our Backup



Snapshots vs Continuous backups

When you are doing deployments and taking backups, you will wonder how the backup types compare, because there are several options and the right selection varies.

You can find more information at the links below:

https://docs.aws.amazon.com/aws-backup/latest/devguide/integrate-cloudformation-with-aws-backup.html

https://www.nucleustechnologies.com/blog/aws-snapshot-vs-backup/

References

Pricing - https://aws.amazon.com/backup/pricing/

https://www.druva.com/documents/pf/white-papers/8-tips-to-simplify-aws-backup-and-recovery.pdf



Wednesday, August 2, 2023

Static Web hosting with Azure + Angular + Github + DNS

We are going to set up an Angular static website in Azure using GitHub. The domain provider is a separate vendor. Follow these easy steps.

  1. Install Angular and write your basic app
  2. Commit the changes to github repo
  3. Create an Azure account if you don't have one and enable the free (trial) subscription
  4. Create a static web app in Azure
    1. Create a new Resource Group
    2. Set Deployment details as GitHub
    3. Attach your Github Account and give the repo and branch details
    4. Review and Create
    5. Once you save, a GitHub Action will start to run and deploy the latest code in the selected branch. (Sometimes this may fail if the build output folder does not exist in the Angular project path, so create it if this error occurs.)
        



5. Now the deployment is almost complete. Next, you have to link the domain you already purchased.
 
6. Go to the static web app's custom domain section and get the alias. Then go to your domain provider and link it (use a CNAME or A record depending on the type of the domain).



Enable SSL in your Angular app to resolve most native OS issues (iOS/Android)




Today we are talking about an issue that is painful for developers when deploying an Angular app anywhere outside localhost (to your local IP address with a port).


If you are targeting the iOS/Android operating systems to launch your Angular app, this will probably be a useful tip.

Enable SSL in your Angular app by using a certificate. Create the file below inside your project folder and name it certificate.cnf.

[req]
default_bits = 2048
prompt = no
default_md = sha256
x509_extensions = v3_req
distinguished_name = dn

[dn]
C = BE
ST = digem
L = brussels
O = EY
OU = PAS
emailAddress = lahiru.dhananjaya@randomsoftware.net
CN = <serving address>

[v3_req]
subjectAltName = @alt_names

[alt_names]
DNS.1 = <serving address>


Then run this command in your terminal to create the certificate:

openssl req -new -x509 -newkey rsa:2048 -sha256 -nodes -keyout localhost-remote.key -days 3560 -out localhost-remote.crt -config certificate.cnf
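To make sure the certificate came out as expected, the generation and a quick check can be rehearsed end to end. This is a sketch that runs in a throwaway directory and uses `localhost` to stand in for the `<serving address>` placeholder (the directory and file names are just for the demo):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Minimal stand-in for certificate.cnf, with "localhost" as the serving address
cat > certificate.cnf <<'EOF'
[req]
default_bits = 2048
prompt = no
default_md = sha256
x509_extensions = v3_req
distinguished_name = dn

[dn]
C = BE
CN = localhost

[v3_req]
subjectAltName = @alt_names

[alt_names]
DNS.1 = localhost
EOF

# Same command as above: self-signed cert plus private key
openssl req -new -x509 -newkey rsa:2048 -sha256 -nodes \
  -keyout localhost-remote.key -days 3560 \
  -out localhost-remote.crt -config certificate.cnf

# Verify the subject alternative name was baked into the certificate
openssl x509 -in localhost-remote.crt -noout -ext subjectAltName
```

The last command should print a Subject Alternative Name section listing `DNS:localhost`; if the SAN is missing, browsers and mobile OSes will reject the certificate even when the CN matches.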


You can then try the command below in a terminal inside your Angular project path:


ng serve --host 192.168.x.x  --port 4200 --ssl --ssl-key <key-path> --ssl-cert <cert-path>

or you can add this to .angular-cli.json (in newer Angular CLI versions, the equivalent sslKey/sslCert options live under the serve target in angular.json):

{
    "$schema": "./node_modules/@angular/cli/lib/config/schema.json",
    "defaults": {
        "serve": {
            "sslKey": "<relative path from .angular-cli.json>/server.key",
            "sslCert": "<relative path from .angular-cli.json>/server.crt",
            ...
        }, ...
    }, ...
}


After following these steps, you will be able to smoothly use HTTPS over your IP address and debug easily. This enables most native features that Chrome/Safari have (such as per-page camera permission settings, etc.).

Wednesday, June 7, 2023

Play with Git SQUASH



Git squash refers to the act of merging multiple commits into a single one. This can be accomplished using Git's "Interactive Rebase" feature, which allows you to perform this action at any given moment. Squashing commits is typically performed when merging branches.

Below is the typical sequence of commands when you squash commits in a git branch:

  1. git log --oneline
    • Press 'q' to exit the log view
  2. git rebase -i HEAD~3
    • Change each sub-commit from pick to s (squash)
    • Then press ESC and type :wq to save and exit
    • Then edit the combined commit message, press ESC, and type :wq to exit
  3. git push -f

If something goes wrong in the above steps, use git rebase --abort to cancel the process and start again.
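The steps above can be rehearsed safely in a throwaway repository. This sketch drives the interactive rebase non-interactively via GIT_SEQUENCE_EDITOR (the repository, user identity, and commit messages are made up for the demo):

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

# Four commits: one base commit plus three we will squash into one
for n in base feat-1 feat-2 feat-3; do
  echo "$n" > file.txt
  git add file.txt
  git commit -qm "$n"
done

git log --oneline   # shows 4 commits

# Same as step 2 above, but scripted: turn the 2nd and 3rd "pick" lines
# of the rebase todo into "squash", and accept the combined message as-is.
GIT_SEQUENCE_EDITOR='sed -i -e "2,3s/^pick/squash/"' \
GIT_EDITOR=true \
git rebase -i HEAD~3

git log --oneline   # now 2 commits: base plus one squashed commit
```

After the rebase, the three feature commits are collapsed into a single commit whose message combines all three, which is exactly what the interactive editor session does by hand.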


Wednesday, April 18, 2018

Dot Net Core Preview Tool 3 Missing Issue with yeoman asp core generator with VS 2017 Sample Project

If you previously installed VS 2017 RC, upgraded to the VS 2017 Community version, and are working with the sample provided by the yeoman ASP.NET Core generator, you might see this error occur:



Previously there was a separate .NET Core SDK preview tools installer, but it is now missing from their websites.

To build the VS 2017 project with preview tool 3, visit this URL and download:

.NET Core 1.0.0 SDK – Preview 3-004056




Saturday, April 14, 2018

Azure IoT Getting Started

Hello after a long time! Today we'll have a quick post about Azure IoT.

The Internet of Things is going enterprise-level, and we have to get ready for the upcoming challenges at large scale. To prepare for it, let's begin with Azure IoT.

In this article I am going to practice simulating an IoT device and communicating with Azure IoT Hub (which uses the MQTT protocol).

First of all, we need an Azure account with a free tier subscription enabled; if you already have one, that's even better.


1. Go to https://portal.azure.com 

2. Search and create the IoT hub






3. It will take several minutes to complete the Azure IoT Hub creation. Once it completes, you can navigate into it. You need to get some important names and keys from the Overview section and the Shared access policies section.

Overview 



Then select Shared access policies and click on the iothubowner row in order to show the keys and connection strings.

Shared access policies




4. The next step is to integrate the devices. To do that we have to simulate a device from code. Here I am using .NET console applications to act as a device. You can download the sample project from GitHub, or just run the command below if you have Git Bash:

git clone https://github.com/Azure-Samples/iot-hub-dotnet-simulated-device-client-app.git


5. Once you have the clone, you can open the solution in Visual Studio. There are three main console applications:
       i. Create Device Identity - adds the simulated device to the IoT Hub
       ii. Read Device to Cloud Messages - reads the device-to-cloud messages arriving at the IoT Hub
       iii. Simulated Device - sends messages to the IoT Hub



Just set all the related parameters to the correct values shown in your IoT Hub.

6. Set the Create Device Identity project as the startup project and run it (after setting the correct values for the properties).



Once you run the application, you will see a device key in the console. Note it down and add that key into the Simulated Device project's Program.cs as the device key.



Also check the Azure portal; it will show the added device.



Now it's time to receive the messages from the device and send messages from the cloud.

Set the required properties to the correct values and run the Read Device To Cloud Messages project.




Sending a message to the device from the IoT Hub


You will receive the messages sent to the IoT Hub, and you can see all of them in the console.

7. Run the Read Device to Cloud console application and send a message from the IoT Hub using the Message to Device option.



8. Finally, run the Simulated Device console application to send data to the IoT Hub.


You can see the set of messages received in the Azure portal IoT Hub.



That's all for today with this quick walkthrough. See you next time with a more advanced article.

(References - Azure documentation for IoT Hub with .NET (https://docs.microsoft.com/en-us/azure/iot-hub/))