Installation - VM (On-Prem, Cloud)
Step 1 - PSKnowHOW System Requirements
Please check here for System Requirements: PSknowHOW | System Requirements
Step 2 - Release Notes
Please read the release notes before starting the installation or upgrade.
Step 3 - Create folder structure
To create the "PSKnowhow" directory in Linux, use the mkdir command in the terminal:
mkdir /location/PSKnowhow
Replace "/location/PSKnowhow" with your preferred location. This command creates the "PSKnowhow" folder in the specified location. Make sure to choose a directory path where you have permission to create folders, that has sufficient free space, and that is on permanent storage.
Step 4 - Download PSKnowHOW docker-compose.yaml
Go to https://github.com/PublicisSapient/PSknowHOW/blob/master/docker-compose.yaml
Right-click on Raw and choose "Save link as" to download the docker-compose.yaml file to your local machine, as shown in the picture below.
Upload the downloaded docker-compose.yaml to the PSKnowhow folder which you created in Step 3.
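Alternatively, if the server has direct internet access, you can download the file straight into the folder. This assumes the standard GitHub raw URL for the master branch of the repository:
cd /location/PSKnowhow
wget https://raw.githubusercontent.com/PublicisSapient/PSknowHOW/master/docker-compose.yaml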
Step 4.b - Download auth service docker-compose.yaml file
Go to https://github.com/PublicisSapient/PSknowHOW/blob/master/central-auth-service/docker-compose.yaml
Open the docker-compose.yaml file in any preferred editor, replace all the placeholders such as Docker image tags, DB username, password, etc., and save it.
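As an illustration only, placeholders can be replaced in an editor or with sed. The placeholder names below are hypothetical; use the ones that actually appear in the file you downloaded:
sed -i 's/<AUTH_DB_USERNAME>/your_db_user/g' docker-compose.yaml
sed -i 's/<AUTH_DB_PASSWORD>/your_db_password/g' docker-compose.yaml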
Step 5 - Edit the docker compose file with appropriate values
Update the DB credentials and other environment variables as specified here: https://psknowhow.atlassian.net/wiki/spaces/PSKNOWHOW/pages/45711363
Create a readWrite-privileged user, which adds an extra layer of data protection against accidentally deleting collections, and use this user in the application. Follow these steps to create it:
docker exec -it mongodb /bin/bash
mongo admin -u ${MONGO_INITDB_ROOT_USERNAME} -p ${MONGO_INITDB_ROOT_PASSWORD}
use kpidashboard
db.createUser({user: '<MONGODB_APPLICATION_USER>', pwd: '<MONGODB_APPLICATION_PASS>', roles:[{role:'readWrite', db:'kpidashboard'}]})
exit
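As an optional check (not part of the original steps), you can confirm the new credentials work by authenticating against the kpidashboard database and printing the connection status:
docker exec -it mongodb mongo kpidashboard -u '<MONGODB_APPLICATION_USER>' -p '<MONGODB_APPLICATION_PASS>' --eval 'db.runCommand({connectionStatus: 1})'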
To begin with PSKnowhow, it is essential to have three mandatory containers: i) UI, ii) CustomAPI, and iii) MongoDB. Additionally, there are other optional containers, such as jira-processor, devops-processor (including Jenkins, GitHub, GitLab, Bamboo, Bitbucket, Zephyr, Sonar, and TeamCity collectors), azure-board-processor (for Azure Board), and azure-pipeline-repo (for Azure Pipeline and Azure Repo).
Based on specific requirements, you can bring up these respective containers as needed.
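For example (after pulling the images as described in Step 6), assuming the service names in the compose file match the container names above (check the services: section of your downloaded docker-compose.yaml, since the actual names may differ), a minimal setup could start with only the mandatory services and add processors later:
docker-compose up -d mongodb customapi ui
docker-compose up -d jira-processor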
Step 6 - Pull the PSKnowHOW docker images and Run the Containers
Open a terminal/command prompt in the PSKnowhow folder.
Pull the images by running:
docker-compose pull
Run the containers by running:
docker-compose up -d
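To confirm the containers started successfully, you can list their status (a generic Docker Compose check):
docker-compose ps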
Step 7 - Run the Auth service docker container
Now pull the auth service Docker images and run the containers, then connect the KnowHOW containers with the auth containers so that the two stacks can communicate; a sketch of these commands is shown below.
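A minimal sketch, assuming the auth service docker-compose.yaml was saved into its own folder and that the two stacks are joined by attaching the KnowHOW containers to the auth service's Docker network; the network and container names below are placeholders, so use the ones defined in your compose files:
# Run from the folder containing the auth service docker-compose.yaml
docker-compose pull
docker-compose up -d
# Attach the KnowHOW containers to the auth service network
docker network connect <auth_network_name> customapi
docker network connect <auth_network_name> ui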
Your auth application will now be running and can be accessed from a browser using the DNS.
Step 8 - Run DB Backup Script
Download the dbbackup.sh shell script to the /var/knh location on the server.
Make sure the shell script has executable permissions, set using the chmod command.
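For example, using the /var/knh location mentioned above:
chmod +x /var/knh/dbbackup.sh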
To automate the backup process, you can use the crontab scheduling utility. The example below schedules the backup script to run at 11:00 PM every day of the week (Monday to Sunday); with this configuration, the script executes nightly and creates backups in the /var/backups directory.
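A sample crontab entry for this schedule (added via crontab -e; the log redirection is optional, and the backup destination itself is assumed to be configured inside dbbackup.sh):
0 23 * * * /var/knh/dbbackup.sh >> /var/log/dbbackup.log 2>&1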
Step 8.b - Backing up to External Cloud Storage
While performing regular backups on your server is crucial, ensuring the safety of your data in the event of server failures or data breaches requires a multi-tiered approach. Consider implementing an off-site backup strategy by utilizing cloud storage services like Amazon S3. This safeguards your data by storing it in a separate location. Here's how you can set up a backup to AWS S3:
Setting Up Off-Site Backup to AWS S3
Create an AWS S3 Bucket: Begin by creating an Amazon S3 bucket to store your database backups. This bucket should be located in a different region from your server to enhance data redundancy.
Configure AWS CLI: Install and configure the AWS Command Line Interface (CLI) on your server to enable communication between your server and AWS services.
Modify Backup Script: Edit your dbbackup.sh script to include commands for transferring the backup files to the S3 bucket, using the AWS CLI to upload each backup file to your S3 bucket.
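A minimal sketch of the AWS CLI pieces, assuming backups land in /var/backups as described in Step 8; the bucket name and backup file name are placeholders:
# One-time CLI configuration (prompts for access key, secret key, and region)
aws configure
# Example line to append to dbbackup.sh after the backup file is created
aws s3 cp /var/backups/<backup-file-name> s3://<your-backup-bucket>/psknowhow-backups/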
Step 9 - Check the Post-installation steps for further configuration.
© 2022 Publicis Sapient. All rights reserved.