Sunday, 16 November 2025

How to Use AI Effectively: The A.I.M. Framework (Actor • Intent • Mission)

 




Using AI isn’t just about typing prompts — it’s about thinking strategically so the AI can produce its best work for you. Whether you're a student, a creator, an analyst, a business owner, or someone simply exploring AI for the first time, mastering the A.I.M. Framework will transform the quality of your results.

Below is your complete guide to A.I.M. (Actor, Intent, Mission) and how to apply it immediately.


🔥 Introduction: Why You Need a Framework for AI

Most people use AI like a search engine:
They ask a quick question… and hope for the best.

But AI is not Google.
AI thinks with you, shapes ideas with you, and creates outputs based on how clearly you define the roles and goals.

That’s why you need a framework — A.I.M.


🎯 The A.I.M. Framework

A.I.M. stands for:

A — Actor

Tell AI who it should be.

I — Intent

Tell AI what you’re trying to do.

M — Mission

Tell AI what exact output or result you need.

When you define these three elements, AI behaves like a specialist, not a generic assistant.


A — ACTOR (Define the Role)

Start by assigning AI a role, persona, or expertise.

Examples:

  • Act as a senior cloud architect…

  • You are my marketing strategist…

  • Act as my expert DevOps consultant…

  • You are a world-class resume writer…

Why it works:
Roles give AI context — which raises the quality, relevance, and depth of your answers.


I — INTENT (State Your Objective)

Intent explains why you are asking.

Examples:

  • I want to troubleshoot an AWS issue…

  • I want to create a landing page that converts…

  • I’m trying to learn Kubernetes basics…

This tells AI the direction to think in.


M — MISSION (Specify the Final Deliverable)

The mission is the exact format you want.

Examples:

  • Give me a step-by-step guide.

  • Create a blog post with headers and bullet points.

  • Generate a PowerPoint outline.

  • Produce a comparison table between AWS, Azure, and GCP.

  • Write the email in a friendly but professional tone.

When your mission is clear, the output becomes instantly usable.


🔧 Putting It All Together (Example Prompt)

Here’s how a weak prompt becomes an expert-level one:

❌ Weak Prompt

“Explain Kubernetes.”

✅ A.I.M. Prompt

Actor: Act as a senior Kubernetes trainer.
Intent: I’m preparing for an interview and need to strengthen my fundamentals.
Mission: Give me a clear explanation with diagrams, analogies, and a 10-question practice quiz.

See the difference?
One triggers basic info.
The other triggers expertise.


🚀 Why A.I.M. Works Everywhere

You can use A.I.M. for:

  • Writing emails

  • Creating business plans

  • Debugging code

  • Learning new skills

  • Building creative content

  • Improving processes

  • Studying complex topics

  • Planning marketing campaigns

  • Writing job applications

  • Interview preparation

  • Research summaries

AI becomes clearer, faster, and more powerful when you give it structure.


🧠 Tips for Mastering AI with A.I.M.

1. Always define the Actor first

AI behaves differently depending on the role.

2. Write your Intent in plain language

You don’t have to be fancy — just tell it why.

3. Treat the Mission like instructions to a designer

Formats matter. Specify them.

4. Iterate

You can always say:
Revise this.
Make it shorter.
Improve this section.
Add visuals.

AI is a collaborator — not a vending machine.


🏁 Conclusion

The A.I.M. Framework is the fastest way to get powerful, accurate, and tailored responses from AI. When you define the Actor, clarify your Intent, and specify your Mission, you transform AI into a high-level partner who understands exactly what you need.

Use A.I.M. daily — and your AI skills will multiply.

How to Create an AI Agent to Send Email





Step 1: Register at n8n.io.

Step 2: On your dashboard, click Create Workflow.





Step 3: Click + to open the node panel.

Select AI Agent.




Then click outside the canvas 



Then Save




1. Chat Model

A chat model is the core AI engine that generates responses during a conversation.
It’s a large language model trained on vast amounts of data so it can understand questions, follow instructions, and provide helpful, human-like answers.

Key Characteristics of a Chat Model

  • Understands natural language (text inputs)

  • Generates context-aware responses

  • Can reason, summarize, explain, and create content

  • Adapts to conversation flow

  • Does not store personal information unless explicitly allowed

What a Chat Model does

  • Responds to prompts

  • Maintains conversation context

  • Performs tasks like writing, coding, explaining, solving problems

2. Memory

Memory allows an AI system to remember important user preferences across conversations—but only when the user explicitly wants it saved.

How Memory Helps

  • Saves user preferences (tone, style, recurring details)

  • Remembers ongoing projects or long-term goals

  • Makes future responses more personalized

What Memory Does NOT Do

  • It does not store personal details without permission

  • It does not automatically remember everything

  • It does not keep sensitive information unless the user clearly requests it

Examples of Memory

  • “Remember I prefer short answers.”

  • “Remember my business name is CloudWave Tech.”

3. Tools

Tools are extensions that the chat model can call to perform specialized tasks.
They allow the model to go beyond text generation and interact with systems, APIs, or external capabilities.

Why Tools Matter

Tools expand what AI can do. They let the model:

  • Access real-time information (web search tools)

  • Generate images

  • Analyze code or run environments

  • Create documents (PDF, Word, Excel)

  • Use APIs for specific workflows

Examples of AI Tools

  • Web browsing tool: fetches up-to-date information

  • Python sandbox: executes code securely

  • Image generation tool: creates images based on prompts

  • Document generation tool: outputs PDFs, slides, spreadsheets

  • Developer tools: run code snippets, test scripts, debug

How Tools Work

The chat model detects when a tool is needed → calls the tool → receives the result → integrates it into the final answer.
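The detect → call → integrate loop can be sketched with a toy shell "router" (this is only an analogy, not a real AI API: the keyword match stands in for the model's tool-use decision, and the date command stands in for a tool):

```shell
# Toy sketch of the tool-call loop: decide whether a prompt needs a tool,
# call it, and fold the result into the final answer.
answer() {
  local prompt="$1"
  case "$prompt" in
    *date*)
      # Tool needed: call the "date tool" and integrate its result
      echo "Tool result integrated: today is $(date +%Y-%m-%d)."
      ;;
    *)
      # No tool needed: answer from the model alone
      echo "Answered directly from the model."
      ;;
  esac
}

answer "What is the date today?"
answer "Explain Kubernetes."
```

A real agent (like the n8n AI Agent node below) makes this decision with the chat model itself rather than keyword matching, but the flow is the same.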


In Simple Terms

Component  | Purpose                        | Analogy
Chat Model | Brain that thinks and responds | A smart assistant
Memory     | Saves preferences              | A planner that remembers your style
Tools      | Special abilities              | Apps the assistant can use



Step 4: Click Chat Model and select a chat model; we use OpenAI for this lab.

n8n gives you 100 free OpenAI credits. Click Claim free credits to get them.



We will skip creating credentials for now. Click outside the canvas to go back to the workflow.




Next, set up memory: click the + under Memory and select Simple Memory.




You can increase the context window

A context window is the amount of text or information an AI model can “hold in mind” at one time during a conversation. It includes the user’s messages, the AI’s replies, and any additional data provided.

The larger the context window, the more the AI can remember within a single session—allowing it to follow long conversations, understand complex documents, and maintain continuity. Once the limit is exceeded, older parts of the conversation roll off and the model can no longer reference them directly.
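The rolling-off behaviour can be illustrated with a tiny shell analogy (the transcript file and the window size of 3 are made up for illustration): the "window" is just the last N messages, and anything older falls out of view.

```shell
# Rough analogy for a context window: keep only the last N lines of the
# conversation transcript; older messages "roll off".
CONTEXT_SIZE=3
printf 'msg1\nmsg2\nmsg3\nmsg4\nmsg5\n' > transcript.txt

# Only the newest 3 messages are still "in mind"
tail -n "$CONTEXT_SIZE" transcript.txt
```

With a window of 3, only msg3 through msg5 remain visible; msg1 and msg2 can no longer be referenced directly.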


This is now enough to test.



Hit the Play button, ask the AI agent a question, and it will respond.

Step 5: Connect the agent to a tool. Click the + below Tool.



Select Gmail...


Connect your Gmail credential: select OAuth2, sign in with Google, and choose your account. Let the agent define the To field, Subject, and Message.

Go back to your workflow, open the chat, and ask the AI to send an email.



It works

Friday, 22 December 2023

How to upgrade Maven

 

java.lang.IllegalStateException

I had installed Maven on my Ubuntu machine using the command

apt install maven

This installed Maven in the path /usr/share/maven.

Months later, I encountered a Maven exception when compiling a Java project. The error was as follows:

[ERROR] Error executing Maven.
[ERROR] java.lang.IllegalStateException: Unable to load cache item
[ERROR] Caused by: Unable to load cache item
[ERROR] Caused by: Could not initialize class com.google.inject.internal.cglib.core.$MethodWrapper

My java version at the time was

openjdk version "17.0.2" 2022-10-18
OpenJDK Runtime Environment (build 17.0.2+8-Ubuntu-2ubuntu120.04)
OpenJDK 64-Bit Server VM (build 17.0.2+8-Ubuntu-2ubuntu120.04, mixed mode, sharing)

and my maven version at the time was

Apache Maven 3.6.3
Maven home: /usr/share/maven
Java version: 17.0.2, vendor: Oracle Corporation, runtime: /usr/lib/jvm/java-17-openjdk-amd64
Default locale: en, platform encoding: UTF-8
OS name: "linux", version: "5.10.16.3-microsoft-standard-wsl2", arch: "amd64", family: "unix"

The cause of the error was that the Maven version (3.6.3) was too old. I needed to upgrade to the latest version of Maven.

Unfortunately, I could not upgrade to the latest Maven version (3.9.0 at the time) using the apt package manager on Ubuntu. Generally, the easiest way to install anything on Ubuntu is via the apt package manager; however, its repositories often lag behind the latest releases.

These are the steps to install the latest Maven version on Ubuntu:

  1. Download the latest maven binaries

a. cd into the /tmp directory on your terminal

b. Check https://maven.apache.org/download.cgi and copy the link for the “Binary tar.gz archive” file.

c. Run the following command to download the binaries:

wget https://dlcdn.apache.org/maven/maven-3/3.9.6/binaries/apache-maven-3.9.6-bin.tar.gz

d. Extract the archive and move it into place:

tar -xvf apache-maven-3.9.6-bin.tar.gz
mv apache-maven-3.9.6 maven
mv maven /usr/share/

Note: the latest Maven version at the time was 3.9.6. Make sure to replace the version in the commands above with the latest release. Also, if an older /usr/share/maven directory already exists from the apt install, remove or rename it first; otherwise the final mv will nest the new directory inside the old one.
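Since the version changes regularly, a small helper that builds the download URL for any version (matching the wget URL used above) makes the steps easy to repeat:

```shell
# Build the Apache CDN download URL for a given Maven 3.x version,
# matching the URL pattern used in the wget command above.
maven_url() {
  local v="$1"
  echo "https://dlcdn.apache.org/maven/maven-3/${v}/binaries/apache-maven-${v}-bin.tar.gz"
}

maven_url 3.9.6
```

Swap in whatever version the Maven download page currently lists, then run the same tar and mv steps.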

Tuesday, 12 December 2023

SQL Fundamentals

SQL Fundamentals Course

SQL Fundamentals Course Documentation

Table of Contents

  1. Oracle Cloud Account Setup
  2. Provisioning Oracle Autonomous Database
  3. Connecting to Oracle Autonomous Database
  4. SQL Development Tools Installation
  5. Lab Exercise 1: Setting Up SQL Environment
  6. Lab Exercise 2: Querying Data
  7. Lab Exercise 3: Exploring Joins
  8. Lab Exercise 4: Aggregating Data
  9. Lab Exercise 5: Modifying Data and Transactions
  10. Lab Exercise 6: Building a Blood Donation Database
  11. Final Project: Building a Blood Donation Database

1. Oracle Cloud Account Setup

Sign Up for an Oracle Cloud Account:

Go to Oracle Cloud.

2. Provisioning Oracle Autonomous Database

Access Oracle Cloud Console:

Log in to the Oracle Cloud Console.

Create an Autonomous Database:

Navigate to the "Autonomous Database" section.

Click "Create Autonomous Database" and follow the setup wizard.

Provide details such as database name, username, and password.

Obtain Connection Details:

Once the Autonomous Database is provisioned, note down the connection details (hostname, port, service name, username, password).

3. Connecting to Oracle Autonomous Database

Download SQL Developer or Toad for Oracle:

Download and install Oracle SQL Developer or Toad for Oracle on your local machine.

Connect SQL Developer or Toad to Autonomous Database:

Open SQL Developer or Toad and create a new connection.

Use the connection details obtained earlier (hostname, port, service name, username, password) to connect to the Autonomous Database.

4. SQL Development Tools Installation

Install SQL Developer:

Download SQL Developer from the official website.

Follow the installation wizard to install it on your machine.

Install Toad for Oracle:

Download Toad for Oracle from the official website.

Follow the installation wizard to install Toad on your machine.

5. Lab Exercise 1: Setting Up SQL Environment

1. Install Toad for Oracle:

Download Toad for Oracle from the official website.

Follow the installation wizard to install Toad on your machine.

-- SQL Command: None, as it involves setting up Toad.

2. Connect to a Database:

Open Toad and click on "New Connection."

Enter your connection details, including username, password, and database connection details (hostname, port), and click "Connect."


        -- SQL Command: None, as it involves setting up Toad.

    

3. Create a Sample Database and Table:

In the SQL Editor within Toad, execute the CREATE TABLE statement to create a table named users with columns id, name, and age.

CREATE TABLE users (
  id NUMBER PRIMARY KEY,
  name VARCHAR2(50),
  age NUMBER
);

ALTER TABLE users MODIFY id INT NOT NULL;

CREATE SEQUENCE users_sequence START WITH 1 INCREMENT BY 1;

CREATE OR REPLACE TRIGGER users_trigger
BEFORE INSERT ON users
FOR EACH ROW
BEGIN
  SELECT users_sequence.nextval INTO :new.id FROM dual;
END;

4. Insert Sample Data:

Use the INSERT INTO statements to add sample data to the users table.

INSERT INTO users (name, age) VALUES ('John Doe', 25);
INSERT INTO users (name, age) VALUES ('Jane Smith', 30);

5. Execute Basic Queries:

In the SQL Editor, run a SELECT * FROM users; query to retrieve all data from the users table.

SELECT * FROM users;

6. Lab Exercise 2: Querying Data

1. Basic SELECT Statement:

Retrieve all columns from the users table:

SELECT * FROM users;

2. Filtering Data:

Retrieve users older than 25:

SELECT * FROM users WHERE age > 25;

3. Sorting Data:

Retrieve users sorted by age in descending order:

SELECT * FROM users ORDER BY age DESC;

4. Limiting Results:

Retrieve the first 5 users:

SELECT * FROM users WHERE ROWNUM <= 5;

7. Lab Exercise 3: Exploring Joins

1. Inner Join:

Retrieve information from two tables where there is a match:

SELECT users.id, users.name, orders.order_number FROM users INNER JOIN orders ON users.id = orders.user_id;

2. Left Join:

Retrieve all records from the left table and the matched records from the right table:

SELECT users.id, users.name, orders.order_number FROM users LEFT JOIN orders ON users.id = orders.user_id;

3. Right Join:

Retrieve all records from the right table and the matched records from the left table:

SELECT users.id, users.name, orders.order_number FROM users RIGHT JOIN orders ON users.id = orders.user_id;

4. Full Outer Join:

Retrieve all records when there is a match in either the left or right table:

SELECT users.id, users.name, orders.order_number FROM users FULL OUTER JOIN orders ON users.id = orders.user_id;

8. Lab Exercise 4: Aggregating Data

1. Counting Records:

Count the number of users in the users table:

SELECT COUNT(*) FROM users;

2. Grouping Data:

Group users by age and display the count in each group:

SELECT age, COUNT(*) FROM users GROUP BY age;

9. Lab Exercise 5: Modifying Data and Transactions

1. Updating Records:

Update the age of a user in the users table:

UPDATE users SET age = 28 WHERE name = 'John Doe';

2. Deleting Records:

Delete a user from the users table:

DELETE FROM users WHERE name = 'Jane Smith';

3. Transactions:

Use transactions to ensure atomicity for a series of SQL statements:

-- Oracle starts a transaction implicitly with the first DML statement
UPDATE users SET age = 29 WHERE name = 'John Doe';
-- Make the changes permanent (or use ROLLBACK to undo them)
COMMIT;

10. Lab Exercise 6: Building a Blood Donation Database

1. Create Tables:

Create tables for donors, donations, and recipients:

CREATE TABLE donors (
  donor_id NUMBER PRIMARY KEY,
  donor_name VARCHAR2(50),
  blood_type VARCHAR2(5)
);

CREATE TABLE donations (
  donation_id NUMBER PRIMARY KEY,
  donor_id NUMBER,
  donation_date DATE,
  volume_ml NUMBER,
  FOREIGN KEY (donor_id) REFERENCES donors(donor_id)
);

CREATE TABLE recipients (
  recipient_id NUMBER PRIMARY KEY,
  recipient_name VARCHAR2(50),
  blood_type VARCHAR2(5)
);

2. Insert Sample Data:

Insert sample data into each table:

INSERT INTO donors (donor_id, donor_name, blood_type) VALUES (1, 'John Smith', 'O+');
INSERT INTO donors (donor_id, donor_name, blood_type) VALUES (2, 'Jane Doe', 'A-');
INSERT INTO donations (donation_id, donor_id, donation_date, volume_ml) VALUES (1, 1, TO_DATE('2023-01-01', 'YYYY-MM-DD'), 500);
INSERT INTO donations (donation_id, donor_id, donation_date, volume_ml) VALUES (2, 2, TO_DATE('2023-02-15', 'YYYY-MM-DD'), 750);
INSERT INTO recipients (recipient_id, recipient_name, blood_type) VALUES (1, 'Alice Johnson', 'AB+');
INSERT INTO recipients (recipient_id, recipient_name, blood_type) VALUES (2, 'Bob Williams', 'B-');

3. Write Queries:

Write queries to retrieve information about donors, donations, and recipients.

-- Example queries
SELECT * FROM donors;
SELECT * FROM donations;
SELECT * FROM recipients;

11. Final Project: Building a Blood Donation Database

Project Overview:

For the final project, you will build a Blood Donation Database to manage information about blood donors, donations, and recipients.

Project Tasks:

  1. Create tables for donors, donations, and recipients.
  2. Insert sample data into each table.
  3. Write queries to retrieve information about donors, donations, and recipients.
  4. Implement basic CRUD operations (Create, Read, Update, Delete) for the database.

Project Submission:

Submit your SQL script containing all the queries and commands used to create and populate the Blood Donation Database.

Monday, 27 November 2023

Project DeliApp Nov 2023

    Deli Foods is an emerging restaurant business with a presence all over the United States.

They currently have a legacy web application written in Java, hosted on their private server: https://project-deliapp.s3.us-east-2.amazonaws.com/DeliApp/src/main/webapp/index.html

Updates are manual and usually take 5 hours, which incurs a lot of downtime. This hurts their business: clients get locked out, giving their competitors the upper hand.




Your task is to migrate this application to the cloud and apply DevOps practices across their entire software development life cycle.

You should demonstrate concepts that implement Plan → Code → Build → Test → Deploy → Monitor.



TASK A - Documentation: Set up a Wiki Server for your Project (Containerization)

a.

You can get the docker-compose file from below link

https://github.com/bitnami/containers/blob/main/bitnami/dokuwiki/docker-compose.yml

Or

Use the below command in your terminal to fetch the YAML and save it as a Docker Compose file:

curl -sSL https://raw.githubusercontent.com/bitnami/containers/main/bitnami/dokuwiki/docker-compose.yml -o docker-compose.yml

b. Mount your own data volume on this container.

Hint: do this by modifying the Docker Compose file.



c. Change the default port of the wiki server so it runs on port 84.

d. Change the default username and password to:

         Username: DeliApp

         Password: admin

Hint: use the official image documentation to find the details needed to accomplish all of this:

https://github.com/bitnami/containers/tree/main/bitnami/dokuwiki#how-to-use-this-image
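As a starting point, the edits might look like the sketch below. The environment variable names and the container's internal port (8080) are assumptions taken from memory of the Bitnami image; confirm both against the documentation linked above before relying on them.

```shell
# Sketch of a modified docker-compose.yml for Task A. Assumptions:
# - the Bitnami image accepts DOKUWIKI_USERNAME / DOKUWIKI_PASSWORD
# - the container serves HTTP on internal port 8080
cat > docker-compose.yml <<'EOF'
services:
  dokuwiki:
    image: bitnami/dokuwiki:latest
    ports:
      - '84:8080'            # serve the wiki on host port 84
    environment:
      - DOKUWIKI_USERNAME=DeliApp
      - DOKUWIKI_PASSWORD=admin
    volumes:
      - ./dokuwiki_data:/bitnami/dokuwiki   # your own data volume
EOF

grep -n '84:8080' docker-compose.yml
```

After `docker compose up -d`, the acceptance criteria can be checked by browsing to port 84 and logging in as DeliApp/admin.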

TASK A Acceptance Criteria:

i. The wiki server should be up and running, serving on port 84.

ii. Your own container volume is mounted to persist data.

iii. You can log in with credentials DeliApp/admin.


TASK B: Version Control The DeliApp Project

Plan & Code

App Name: DeliApp

  • WorkStation A- Team  Osato- 3.142.247.23
  • WorkStation B - Team     -
Developer workstations are Windows machines. Your project supervisor will provide the IP/DNS and credentials you will use to log into the machine assigned to your group. You can use MobaXterm or Remote Desktop to connect. The username is Administrator.

When you access the Developer workstation assigned to your group, you will find the code base in the below location:
This PC:---->Desktop---->DeliApp



(You can use GitHub or Bitbucket.)

1) Set up 2 repos: a Build repo to store all the code base and a Deployment repo to store all your deployment scripts, and name them as shown below:

  • Build repo : DeliApp_Build  --->Developers Access
  • Deployment repo: DeliApp_Deployment   --->-Your Team Access

2) Version control the DeliApp project located on the developer workstation so the developers can migrate their code to the source control management tool (Bitbucket/Git).

  • Set up the developer workstation's SSH keys in Bitbucket to access the Build repo, and your team's (DevOps) workstation SSH keys in Bitbucket to access the Deployment repo.

3) Git branching strategy for DeliApp_Build

  • master
  • release: e.g. release/release-v1
  • feature: e.g. feature/feature-v1
  • develop

4) Git branching strategy for DeliApp_Deploy

  • master
  • feature: e.g. feature/feature-v1
  • develop



5. Secure the repos by installing git-secrets on your build (DeliApp_Build) and deployment (DeliApp_Deploy) repos as a pre-commit hook.

6. Prevent the developers and your team from pushing code directly to master by installing a pre-push hook.

TASK B Acceptance Criteria: 

1. You should be able to push and pull code from the Developer Workstation assigned to your Team to the DeliApp_Build repo in Source Control Management(SCM) 

2. Your team (DevOps) should be able to pull and push code from your individual workstations to the DeliApp_Deploy repo.

3. Demonstrate the git branching Strategy

4. Your git commit should throw an error when there is a secret in your repo.

Hint: Add a text file containing some secrets eg. aws secret key/access key and commit

5. You should get an Error when you try to push to master

TASK C: Set up your Infrastructure

1. Set up your Environment: DEV, UAT, QA, PROD A, PROD B

Provision 5 Apache Tomcat servers. (You can use any IaC tool: Terraform, CloudFormation, Ansible Tower.) You can host these on any cloud provider: AWS, Google Cloud, Azure.

i. DEV: t2.micro, 8 GB

ii. UAT (User Acceptance Testing): t2.small, 10 GB

iii. QA (Quality Assurance): t2.large, 20 GB

iv. PROD A: t2.xlarge, 30 GB

v. PROD B: t2.xlarge, 30 GB

Apache Tomcat Servers should be exposed on Port 4444

Linux Distribution for Apache Tomcat Servers: Ubuntu 18.04

Note: When Bootstrapping your servers make sure you install the Datadog Agent

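Tomcat listens on 8080 by default, so exposing it on 4444 means changing the HTTP Connector port in conf/server.xml. The sketch below demonstrates the edit on a local copy of the relevant fragment (actual paths vary by install, e.g. /etc/tomcat8/server.xml or /opt/tomcat/conf/server.xml):

```shell
# Demonstrate the server.xml port change on a local copy of the
# Connector fragment (real file location depends on your Tomcat install).
cat > server.xml <<'EOF'
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           redirectPort="8443" />
EOF

# Switch the HTTP connector from the default 8080 to 4444
sed -i 's/port="8080"/port="4444"/' server.xml

grep 'port="4444"' server.xml
```

The same sed line can go in your bootstrap/user-data script, followed by a Tomcat restart, so every provisioned server comes up on port 4444.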

2. Set up your Devops tools servers:

(These can be provisioned manually or with an IaC tool. Feel free to use any Linux distribution on these, e.g. Amazon Linux 2, Debian, Ubuntu, etc.)

NOTE: USE AZURE CLOUD FOR BELOW

1 Ansible Tower server: t2.2xlarge, 15 GB

1 Kubernetes server: you can use EKS, k3s, kubeadm, or minikube

1 Jenkins (CI/CD) server: t2.xlarge, 20 GB

1 vulnerability scanning tool server: OWASP ZAP (install on a Windows instance). See: https://www.devopstreams.com/2022/06/getting-started-with-owasp-zap.html

Install Helm in your Kubernetes server (k3s, EKS, kubeadm, minikube) and install the following with Helm:

SonarQube

Artifactory

Bonus Task:

Add an Application or Elastic Load Balancer to manage traffic between your Prod A and Prod B servers.

Register a Domain using Route 53, eg www.teamdevops.com

Point that domain to the Elastic/Application Loadbalancer 

Acceptance Criteria: When you enter your domain in the browser, it should point to either Prod A or Prod B.

TASK D: Monitoring

a. Set up continuous monitoring with Datadog by installing Datadog Agent on all your servers

 Acceptance criteria: 

i. All your infrastructure server metrics should be monitored (infrastructure monitoring).

ii. All running processes on all your servers should be monitored (process monitoring).

iii. Tag all your servers on the Datadog dashboard.

TASK E: Domain Name System

a. Register a Domain for your Team

i. You can use Route 53, Godaddy or any DNS service of your choice 

eg. www.team-excellence.com


TASK F: Set Up Automated Build for Developers 

The Developers make use of Maven to Compile the code

a. Set up a CI pipeline in Jenkins using a Jenkinsfile.

b. Enable webhooks in Bitbucket to trigger automated builds of the pipeline job.

c. The CI pipeline job should run on an agent (slave).

d. Help the developers version their artifacts, so that each build has a unique artifact version.

Tips: https://jfrog.com/knowledge-base/configuring-build-artifacts-with-appropriate-build-numbers-for-jenkins-maven-project/


Pipeline job Name: DeliApp_Build

The pipeline should check out the code from SCM, build it with the Maven build tool, provide code analysis and code coverage with SonarQube, upload artifacts to Artifactory, send an email to the team, and version the artifacts.

The pipeline should have a Slack channel notification to report build status.


i. Acceptance Criteria:

 Automated build after code is pushed to the repository

1. Sonar Analysis on the sonarqube server

2. Artifact uploaded to artifactory

3. Email notification on success or failure

4. Slack Channel Notification

5. Each artifact has a unique version number

6. Code coverage displayed

TASK G: Deploy & Operate (Continuous Deployment)

a. Set up a CD pipeline in Jenkins using a Jenkinsfile.

Create 4 CD pipeline jobs, one for each environment (Dev, UAT, QA, Prod), or 1 pipeline that can select any of the 4 environments.

Pipeline job name: e.g. DeliApp_Dev_Deploy


i. The pipeline should be able to deploy to any of your LLEs (Dev, UAT, QA) or HLEs (Prod A, Prod B).

You can use the Deploy to Container plugin in Jenkins, or deploy using Ansible Tower to pull the artifact from Artifactory and deploy it to either Dev, UAT, QA, or Prod.

ii. The pipeline should have a Slack channel notification to report deployment status.

iii. The pipeline should have email notification.

iv. Deployment Gate

1. Acceptance criteria:

i. Deployment is seen and verified in either Dev, Uat, Qa or Prod

ii. Notification is seen in slack channel

iii. Email notification

TASK H:a.  Deployment and Rollback

a. Automate the manual deployment of a Specific Version of the Deli Application using Ansible Tower

Manual Deployment Process is Below:


Step 1: Log in to the Tomcat server.

Step 2: Download the artifact.

Step 3: Switch to root.

Step 4: Extract the artifact to the deployment folder.

Deployment folder: /var/lib/tomcat8/webapps

Use service ID: ubuntu


Acceptance Criteria:

i. Deploy new artifact from artifactory to either Dev, Uat, Qa or  Prod

ii. Roll back to an older artifact from Artifactory in either Dev, UAT, QA, or Prod.

iii. All credentials should be encrypted

TASK H:b.  Domain Name Service and LoadBalancing

i. Add an Application or Elastic Load Balancer to manage traffic between your Prod A and Prod B servers.

ii. Configure your DNS with Route 53 such that entering your domain, e.g. www.team-excellence.com, directs you to the load balancer, which will in turn point to Prod A or Prod B.

Acceptance criteria: 

i. Your team domain name, e.g. www.mint.com, will take you to your application residing on Prod A or Prod B.

 

TASK I:

a. Set up a 3-node Kubernetes cluster (container orchestration) with namespaces dev, qa, and prod.

  • Using a Jenkins pipeline or Jenkins job: the pipeline or job should be able to create/delete the cluster.

b. Dockerize the DeliApp.

  • You can use a Dockerfile to create the image, or the OpenShift source-to-image tool.

c. Deploy the Dockerized DeliApp into the prod namespace of the cluster (you can use dev and qa for testing).

d. Expose the application using a load balancer or NodePort.

e. Monitor your cluster using Prometheus and Grafana.
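For the Dockerfile route, a minimal sketch is below. The artifact name (DeliApp.war), the Tomcat base image tag, and the build output path target/ are assumptions; adjust them to match what your Maven build actually produces.

```shell
# Hypothetical Dockerfile for dockerizing the DeliApp: a Tomcat base
# image with the Maven-built WAR copied into the webapps directory.
# Artifact name and image tag are assumptions, not from the project spec.
cat > Dockerfile <<'EOF'
FROM tomcat:9-jdk17
COPY target/DeliApp.war /usr/local/tomcat/webapps/DeliApp.war
EXPOSE 8080
CMD ["catalina.sh", "run"]
EOF

grep FROM Dockerfile
```

You would then build and push the image (e.g. docker build -t your-registry/deliapp:v1 .) and reference it in your Kubernetes Deployment manifest for the prod namespace.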
 TASK I Acceptance Criteria: 

1. You should be able to create/delete a kubernetes cluster

2. Be able to deploy your application into any Namespace(Dev,Qa,Prod)

3. You should be able to access the application through Nodeport or LoadBalancer

4. You should be able to monitor your cluster in Grafana

TASK J: Demonstrate bash automation of:

i. Tomcat

ii. Jenkins

iii. Apache


Acceptance criteria: 

1. Show your bash scripts and successfully execute them.
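As a starting point for such scripts, here is a small health-check sketch: it tests whether each service's TCP port answers (the hosts and the Jenkins/Apache ports are illustrative defaults; Tomcat 4444 follows the port used in Task C). Real automation would extend this with systemctl start/stop/restart.

```shell
# Bash health check: report whether a service's TCP port is reachable.
# Uses bash's /dev/tcp pseudo-device; hosts/ports below are examples.
check_port() {
  local name="$1" host="$2" port="$3"
  if timeout 1 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "${name}: UP"
  else
    echo "${name}: DOWN"
  fi
}

check_port "Tomcat"  localhost 4444
check_port "Jenkins" localhost 8080
check_port "Apache"  localhost 80
```

A cron entry or a simple while-loop wrapper around check_port turns this into continuous basic monitoring alongside Datadog.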


Wednesday, 1 November 2023

Year-End Blitz: DevOps Mastery at $1500 – Secure Your Future!
