Building Workflows with AWS Lambda and Step Functions

Step Functions and Lambda are two key services within the AWS ecosystem that work seamlessly together to create complex, stateful workflows.

What are Step Functions?

  • A visual workflow service: Step Functions allows you to define workflows as a series of steps. Each step can be a task, a choice, or a wait.
  • State machines: These workflows are defined as state machines, which can be executed and managed.
  • Integration with other AWS services: Step Functions integrates with a wide range of AWS services, including Lambda, ECS, and DynamoDB.

What is Lambda?

  • Serverless compute service: Lambda lets you run code without provisioning or managing servers.
  • Event-driven: Lambda functions are triggered by events, such as API calls, file uploads, or messages from other AWS services.
  • Scalable: Lambda automatically scales to meet demand, ensuring your applications can handle varying workloads.
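
For example, a minimal Lambda handler in Python looks like this (the event fields and the return shape here are illustrative assumptions, not a fixed API):

import json

def lambda_handler(event, context):
    # 'event' carries the trigger's payload (an S3 notification, an API call,
    # or the input passed by a Step Functions state).
    name = event.get("name", "world")  # hypothetical input field
    return {"statusCode": 200, "body": json.dumps({"message": f"Hello, {name}"})}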

How Step Functions and Lambda Work Together

  • Lambda as a task: One of the most common use cases is to use Lambda functions as tasks within a Step Functions state machine. When a state machine is executed, the Lambda function associated with that step is invoked.
  • Input and output: Step Functions can pass data between steps, allowing Lambda functions to process and transform data as it flows through the workflow.
  • Error handling: Step Functions provides built-in error handling mechanisms, such as retries and catch blocks, to make your workflows resilient (see the Retry clause on the Check Inventory state in the example state machine below).

Benefits of Using Step Functions and Lambda

  • Simplified development: Step Functions provides a visual interface for creating and managing workflows, making it easier to build complex applications.
  • Scalability: Lambda’s serverless architecture ensures that your applications can scale to meet demand without requiring manual provisioning of resources.
  • Cost-effectiveness: Step Functions and Lambda are pay-as-you-go services, meaning you only pay for the resources you use.
  • Integration with other AWS services: Step Functions’ ability to integrate with a wide range of AWS services makes it a versatile tool for building complex applications.

Example use case: A common use case is building a data processing pipeline. The pipeline might involve:

  1. Ingesting data from a source like S3 or a database.
  2. Transforming the data using Lambda functions.
  3. Storing the processed data in a destination like S3 or DynamoDB.
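
As a sketch of steps 2 and 3, a transform Lambda might read a raw object from S3, reshape it, and write the result to DynamoDB. The bucket, key, table, and field names below are assumptions for illustration:

import json
import boto3

s3 = boto3.client("s3")
table = boto3.resource("dynamodb").Table("ProcessedOrders")  # hypothetical table

def lambda_handler(event, context):
    # Fetch the raw object named in the state input (keys are illustrative).
    obj = s3.get_object(Bucket=event["bucket"], Key=event["key"])
    record = json.loads(obj["Body"].read())
    # Transform: keep only the fields downstream steps need.
    item = {"orderId": record["orderId"], "total": record["total"]}
    table.put_item(Item=item)
    return item  # becomes the input of the next state in the workflow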

In Step Functions, you define your workflow in JSON using the Amazon States Language. Here’s a simplified example:

{
  "Comment": "An example of a simple order process",
  "StartAt": "Check Inventory",
  "States": {
    "Check Inventory": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account-id:function:CheckInventory",
      "Next": "Process Payment"
    },
    "Process Payment": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account-id:function:ProcessPayment",
      "Next": "Notify Customer"
    },
    "Notify Customer": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:account-id:function:NotifyCustomer",
      "End": true
    }
  }
}

In this pattern, Step Functions defines the workflow, with each state representing a specific task or decision point, while Lambda functions perform the actual data processing.
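
To run this workflow from code, you start an execution and pass the initial input. A minimal boto3 sketch (the state machine ARN and input fields are placeholders):

import json
import boto3

sfn = boto3.client("stepfunctions")

response = sfn.start_execution(
    stateMachineArn="arn:aws:states:region:account-id:stateMachine:OrderProcess",
    input=json.dumps({"itemId": "12345"}),  # becomes the input of the first state
)
print(response["executionArn"])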

AWS Doc: https://aws.amazon.com/step-functions/

Avoiding task failures in ECS: fixing the “Essential container in task exited” error

In Amazon ECS (Elastic Container Service), the concept of “essential” and “non-essential” containers refers to the importance of specific containers within a task definition. This distinction is used to determine how the task behaves when individual containers exit or fail.

Essential Containers

  • Definition: An essential container is a critical part of the task. If this container stops or exits with an error, the entire task will stop.
  • Behavior: If an essential container fails, ECS marks the task as failed and stops all other containers in the task. If the task was launched by an ECS service, the service scheduler will launch a replacement task.
  • Use Case: Essential containers are typically the main components of an application, like a web server, primary service, or a critical process that the task cannot function without.

Non-Essential Containers

  • Definition: A non-essential container is considered supplementary. If this container stops or exits, the task will continue to run as long as all essential containers are running.
  • Behavior: The failure of a non-essential container will not cause the entire task to stop. However, the status of the non-essential container will be reported as stopped.
  • Use Case: Non-essential containers are often used for auxiliary tasks like logging, monitoring, or sidecar containers that provide additional but non-critical functionality.

Example Scenario

Imagine a task with two containers:

  • Container A (Essential): A web server that serves application traffic.
  • Container B (Non-Essential): A logging agent that forwards logs to a remote server.

If Container A fails, the entire task will stop. If Container B fails, the task will continue running, but logs might not be forwarded until the logging container is restarted or replaced.

Configuration

In the ECS task definition JSON, you can specify whether a container is essential or non-essential by setting the essential parameter:

{
  "containerDefinitions": [
    {
      "name": "web-server",
      "image": "nginx",
      "essential": true
    },
    {
      "name": "log-agent",
      "image": "log-agent-image",
      "essential": false
    }
  ]
}

In this example, “web-server” is essential, while “log-agent” is non-essential.
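
When a task stops with an “Essential container in task exited” error, the first step is to find out which container exited and why. A quick boto3 sketch (the cluster name and task ID are placeholders):

import boto3

ecs = boto3.client("ecs")

resp = ecs.describe_tasks(cluster="my-cluster", tasks=["<task-id-or-arn>"])
for task in resp["tasks"]:
    # High-level reason ECS stopped the task
    print("Stopped reason:", task.get("stoppedReason"))
    for container in task["containers"]:
        # Per-container exit codes point at the one that actually failed
        print(container["name"], "exit code:", container.get("exitCode"),
              "reason:", container.get("reason"))

From there, check the failing container’s logs (for example in CloudWatch Logs) to fix the underlying crash, or mark genuinely optional sidecars as non-essential so they don’t take the whole task down.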

How to access Load Balancer logs

AWS provides access logs for Elastic Load Balancers (ELB), allowing you to monitor and analyze traffic patterns. Below are general steps to access and view ELB access logs:

Amazon ELB Access Logs:

  1. Navigate to the EC2 Console:
    • Open the AWS EC2 Console.
  2. Select Load Balancers:
    • In the left navigation pane, choose “Load Balancers” under the “Load Balancing” section.
  3. Choose Your Load Balancer:
    • Click on the name of the load balancer for which you want to access logs.
  4. View Access Logs:
    • In the “Description” tab, look for the “Attributes” section.
    • Check if the “Access logs” attribute is set to “Enabled.”
  5. Access Logs Location:
    • If access logs are enabled, you can find them in an S3 bucket specified in the “S3 bucket” field.
  6. Navigate to S3 Bucket:
    • Open the AWS S3 Management Console.
    • Access the S3 bucket mentioned in the “S3 bucket” field.
  7. Access Log Files:
    • Inside the S3 bucket, log files are stored under a date-based prefix such as AWSLogs/<account-id>/elasticloadbalancing/<region>/<YYYY>/<MM>/<DD>/....
    • Download or view the log files to analyze the access logs.
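
If you prefer to check or enable access logging programmatically, here is a minimal boto3 sketch for an Application or Network Load Balancer (the load balancer ARN and bucket name are placeholders; the bucket policy must allow the load balancer to write to it):

import boto3

elbv2 = boto3.client("elbv2")
arn = "arn:aws:elasticloadbalancing:region:account-id:loadbalancer/app/my-alb/1234567890abcdef"

# See whether access logging is currently enabled
attrs = elbv2.describe_load_balancer_attributes(LoadBalancerArn=arn)
print({a["Key"]: a["Value"] for a in attrs["Attributes"] if a["Key"].startswith("access_logs")})

# Turn it on and point it at an S3 bucket
elbv2.modify_load_balancer_attributes(
    LoadBalancerArn=arn,
    Attributes=[
        {"Key": "access_logs.s3.enabled", "Value": "true"},
        {"Key": "access_logs.s3.bucket", "Value": "my-log-bucket"},
    ],
)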

AWS CLI:

You can also use the AWS Command-Line Interface (CLI) to access ELB access logs:

# Replace <your-load-balancer-name> and <your-s3-bucket-name> with your actual values

aws s3 cp s3://<your-s3-bucket-name>/<your-load-balancer-name>/<path-to-log-file> .

This command downloads the specified log file to the current directory.

Analyzing Access Logs:

Access logs typically include information such as client IP addresses, request timestamps, response status codes, and more. You can use tools like AWS Athena, Amazon CloudWatch Logs Insights, or other log analysis tools to query and visualize the logs.
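
For quick ad-hoc analysis without setting up Athena, you can also parse the space-separated log lines directly. The field positions below follow the Application Load Balancer access log format:

import shlex

def parse_alb_log_line(line):
    # shlex.split honours the quoted "request" and "user_agent" fields
    fields = shlex.split(line)
    return {
        "time": fields[1],
        "client": fields[3],            # client IP:port
        "target": fields[4],            # target IP:port (the instance that handled it)
        "elb_status_code": fields[8],
        "request": fields[12],          # e.g. 'GET https://example.com:443/ HTTP/1.1'
    }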

Remember to adjust the steps based on the specific type of load balancer you are using (Application Load Balancer, Network Load Balancer, or Classic Load Balancer). Always refer to the official AWS documentation for the most accurate and up-to-date information.

These logs are especially helpful when you need to see which client a request came from or which target instance handled it.

Export Cognito Users from AWS Using AWS CLI


As of today, there is no built-in way to export users directly from a Cognito user pool.

However, you can use the AWS CLI or an AWS SDK to retrieve the list of users.

  • The first step is to install the AWS CLI on your machine. You can download it from https://aws.amazon.com/cli/.

  • Next, configure the AWS CLI. To do this, open a terminal (cmd on Windows) and run the following –
$ aws configure
AWS Access Key ID [None]: YourAccessKeyId
AWS Secret Access Key [None]: YourSecretAccessKey
Default region name [None]: YourRegion
Default output format [None]: json

Replace the above with your own values. For more info, see the AWS CLI configuration documentation.

  • To get the list of all users in Cognito, run the following command:
aws cognito-idp list-users --region <region> --user-pool-id <userPoolId> --output json > users.json

The above will return the list of users in JSON format. If you want to get the result in a table format, run the next command:

aws cognito-idp list-users --region <region> --user-pool-id <userPoolId> --output table > users.txt

  • Now, if you want to convert the resulting JSON to CSV, use the following C# snippet (it uses the ChoETL library):
private static void ConvertJsonToCsv()
{
    // users.json is the file produced by the list-users command above;
    // the CLI wraps the results in a top-level "Users" array.
    using (var csv = new ChoCSVWriter("users.csv").WithFirstLineHeader())
    using (var json = new ChoJSONReader("users.json")
        .WithJSONPath("$.Users")
        .WithField("Username")
        .WithField("Email", jsonPath: "$.Attributes[?(@.Name == 'email')].Value", isArray: true)
        .WithField("UserStatus"))
    {
        csv.Write(json);
    }
}
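
Note that list-users returns at most 60 users per call, so large pools need pagination. If you prefer Python over C#, here is a boto3 sketch that pages through the pool and writes the CSV directly (the pool ID and the choice of attributes are up to you):

import csv
import boto3

client = boto3.client("cognito-idp")

def export_users(user_pool_id, out_path="users.csv"):
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Username", "Email", "UserStatus"])
        token = None
        while True:
            kwargs = {"UserPoolId": user_pool_id}
            if token:
                kwargs["PaginationToken"] = token
            resp = client.list_users(**kwargs)
            for user in resp["Users"]:
                attrs = {a["Name"]: a["Value"] for a in user.get("Attributes", [])}
                writer.writerow([user["Username"], attrs.get("email", ""), user["UserStatus"]])
            token = resp.get("PaginationToken")
            if not token:
                break

export_users("<userPoolId>")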

Restore SQL database from .bak file stored in AWS S3 bucket

On Amazon RDS for SQL Server, a database can be restored from a .bak file using the rds_restore_database stored procedure. The query below restores a database from a .bak file stored in an AWS S3 bucket:

exec msdb.dbo.rds_restore_database
    -- Replace <name> with the database name
    @restore_db_name='<name>',
    -- Replace {backup s3 path} with the actual path, e.g. prod-backups/DLProdDB-12-07-19,
    -- where prod-backups is the bucket name and DLProdDB-12-07-19 is the .bak file name
    @s3_arn_to_restore_from='arn:aws:s3:::{backup s3 path}'


To check the status of the restore, use the following query –

exec msdb.dbo.rds_task_status @db_name='<name>'
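
If you want to poll the status from code rather than rerun the query by hand, here is a small pyodbc sketch (the connection string is an assumption; adjust the driver, endpoint, and credentials to your RDS instance):

import time
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-rds-endpoint>,1433;DATABASE=msdb;UID=<user>;PWD=<password>"
)
cursor = conn.cursor()

while True:
    cursor.execute("exec msdb.dbo.rds_task_status @db_name=?", "<name>")
    row = cursor.fetchone()  # first task row; inspect all rows if several exist
    print(row)               # includes the task's lifecycle status and % complete
    if row and row.lifecycle in ("SUCCESS", "ERROR", "CANCELLED"):
        break
    time.sleep(30)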

For more info, see the AWS documentation on native backup and restore for RDS for SQL Server.
