Developing TheRadio.Ai - Part 2
Migrating TheRadio.Ai to the Cloud: A Step-by-Step Guide
In our latest project, TheRadio.Ai, we are building a platform that allows users to create and manage radio stations with AI-managed playlists. This blog post provides an overview of our journey to migrate TheRadio.Ai to the cloud using AWS services.
Project Overview
TheRadio.Ai is designed to enable users to create personalized radio stations by selecting and purchasing songs. An AI DJ manages the playlists based on user instructions and pre-engineered algorithms. Our goal is to launch with at least five stations, each containing over 100 songs.
Step 1: Setting Up the Cloud Infrastructure
1. AWS Account Setup:
We started by creating an AWS account and setting up necessary IAM roles for secure access.
Configured the AWS CLI for streamlined interaction with AWS services.
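For reference, the CLI setup amounts to something like the following (the exact prompts depend on your CLI version):

```shell
# Verify the AWS CLI is installed
aws --version

# Store credentials and defaults (prompts for access key ID,
# secret access key, default region, and output format)
aws configure

# Confirm the credentials actually work
aws sts get-caller-identity
```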
2. Terraform Configuration:
Utilized Infrastructure as Code (IaC) with Terraform to set up our AWS infrastructure.
Defined the main.tf file to create a VPC, subnets, security groups, and EC2 instances.
Successfully initialized and applied Terraform configurations to provision the required resources.
provider "aws" {
  region = "us-east-1"
}

resource "aws_vpc" "main" {
  cidr_block = "10.0.0.0/16"
}

resource "aws_subnet" "subnet1" {
  vpc_id     = aws_vpc.main.id
  cidr_block = "10.0.1.0/24"
}

resource "aws_subnet" "subnet2" {
  vpc_id     = aws_vpc.main.id
  cidr_block = "10.0.2.0/24"
}

resource "aws_security_group" "allow_web" {
  vpc_id = aws_vpc.main.id

  # Allow inbound HTTP traffic
  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow inbound SSH so we can administer the instance with our key pair
  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound traffic
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_instance" "web" {
  ami           = "ami-03972092c42e8c0ca"
  instance_type = "t2.micro"
  subnet_id     = aws_subnet.subnet1.id

  # Instances launched into a VPC subnet must reference security
  # groups by ID rather than by name, so we use vpc_security_group_ids
  vpc_security_group_ids = [aws_security_group.allow_web.id]
  key_name               = "my-key-pair"

  tags = {
    Name = "WebServer"
  }
}

resource "aws_s3_bucket" "radio_bucket" {
  bucket = "theradio-bucket"
  acl    = "private"
}
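With main.tf in place, provisioning follows the standard Terraform workflow:

```shell
terraform init    # download the AWS provider plugin
terraform plan    # preview the resources that will be created
terraform apply   # provision the VPC, subnets, security group, EC2 instance, and S3 bucket
```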
3. SSH Access Configuration:
Generated an SSH key pair for secure access to EC2 instances.
Configured Git Bash to use the key pair for connecting to the instances.
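The key-pair setup looks roughly like this (key path and username are illustrative; the key name must match the key_name referenced in the Terraform configuration):

```shell
# Generate a 4096-bit RSA key pair (consider using a passphrase in practice)
ssh-keygen -t rsa -b 4096 -f ~/.ssh/my-key-pair -N ""

# From Git Bash, connect to the instance's public IP
# (replace <public_ip> with the address AWS reports for the instance)
ssh -i ~/.ssh/my-key-pair ec2-user@<public_ip>
```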
Step 2: Dockerization and Container Management
1. Creating Docker Images:
Developed a Dockerfile to containerize the application.
Built Docker images locally and ensured they were ready for deployment.
# Use an official Node.js runtime as a parent image
FROM node:14
# Set the working directory
WORKDIR /usr/src/app
# Copy the package.json and package-lock.json files
COPY package*.json ./
# Install dependencies
RUN npm install
# Copy the rest of your application files
COPY . .
# Expose the port the app runs on
EXPOSE 8080
# Define the command to run the app
CMD ["node", "app.js"]
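Building and smoke-testing the image locally might look like this (the tag matches what we push to ECR in the next step):

```shell
# Build the image from the Dockerfile in the current directory
docker build -t theradioai/app:latest .

# Run it locally, mapping the exposed port to the host
docker run --rm -p 8080:8080 theradioai/app:latest
# then, in another terminal: curl http://localhost:8080
```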
2. Pushing Docker Images to Amazon ECR:
Created an Amazon ECR repository for storing Docker images.
Authenticated Docker with ECR and pushed the images.
aws ecr create-repository --repository-name theradioai/app
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <your_aws_account_id>.dkr.ecr.us-east-1.amazonaws.com
docker tag theradioai/app:latest <your_aws_account_id>.dkr.ecr.us-east-1.amazonaws.com/theradioai/app:latest
docker push <your_aws_account_id>.dkr.ecr.us-east-1.amazonaws.com/theradioai/app:latest
Step 3: Setting Up Amazon EKS
1. EKS Cluster Creation:
Used eksctl to create an Amazon EKS cluster.
Configured kubectl for managing the EKS cluster.
eksctl create cluster --name theradioai-cluster --region us-east-1 --nodegroup-name linux-nodes --node-type t2.micro --nodes 3 --nodes-min 1 --nodes-max 4
aws eks --region us-east-1 update-kubeconfig --name theradioai-cluster
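Once the cluster is up, it is worth confirming that kubectl can reach it before deploying anything:

```shell
kubectl get nodes    # should list the worker nodes in the Ready state
kubectl cluster-info
```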
2. Deploying to EKS:
Defined a Kubernetes deployment file to manage application deployment.
Applied the deployment and service configurations to EKS.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: theradioai-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: theradioai
  template:
    metadata:
      labels:
        app: theradioai
    spec:
      containers:
      - name: theradioai
        image: <your_aws_account_id>.dkr.ecr.us-east-1.amazonaws.com/theradioai/app:latest
        ports:
        - containerPort: 8080
---
apiVersion: v1
kind: Service
metadata:
  name: theradioai-service
spec:
  type: LoadBalancer
  ports:
  - port: 80
    targetPort: 8080
  selector:
    app: theradioai
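Assuming the manifests above are saved as deployment.yaml, they are applied and verified with:

```shell
kubectl apply -f deployment.yaml

# Watch the rollout and grab the load balancer's external address
kubectl get deployments
kubectl get service theradioai-service   # EXTERNAL-IP is the ELB hostname
```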
Step 4: Additional AWS Services Integration
1. Setting Up API Gateway and Lambda:
Configured API Gateway for handling HTTP requests.
Developed Lambda functions to manage backend logic.
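A sketch of this setup from the CLI follows; the function name, handler, role ARN, and API name are all illustrative:

```shell
# Package and create the function (assumes the handler lives in index.js
# and an execution role already exists)
zip function.zip index.js
aws lambda create-function \
  --function-name theradioai-backend \
  --runtime nodejs14.x \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::<your_aws_account_id>:role/lambda-exec-role

# Quick-create an HTTP API in front of the function
aws apigatewayv2 create-api \
  --name theradioai-api \
  --protocol-type HTTP \
  --target arn:aws:lambda:us-east-1:<your_aws_account_id>:function:theradioai-backend
```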
2. Utilizing Amazon S3:
Created an S3 bucket for storing static content.
Configured bucket policies to allow appropriate access.
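In practice this amounts to commands along these lines (local path and policy file are illustrative; policy.json holds the access rules for the bucket):

```shell
# Upload static assets to the bucket created by Terraform
aws s3 cp ./static s3://theradio-bucket/ --recursive

# Attach the bucket policy
aws s3api put-bucket-policy --bucket theradio-bucket --policy file://policy.json
```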
Next Steps and Goals
With the foundational infrastructure set up and the application containerized and deployed, our next steps include:
Enhancing AI Features:
Develop and integrate advanced AI capabilities for the AI DJ.
Logging, Monitoring, and Alerting:
Implement comprehensive logging and monitoring using CloudWatch, Splunk, and Grafana.
CI/CD Pipeline Setup:
Establish continuous integration and deployment pipelines using AWS CodePipeline and Jenkins.
Mobile App Development and User Engagement:
Develop a mobile app to enhance user interaction and engagement.
Implement monetization strategies and performance optimization.
By following these steps, we are progressing steadily towards achieving our goal of launching TheRadio.Ai with robust cloud infrastructure and advanced features. Stay tuned for more updates as we continue this exciting journey!