- Backend: Python 3.8
- Database: MongoDB
- Frontend: Node ^14.14.0 (https://nodejs.org/en/download/)
Assuming everything is installed, you should be able to use the `node` and `npm` commands:
$ node -v
$ npm -v
Run these commands in your project folder. This will clone the project and install the Python and Node dependencies:
(venv) $ git clone https://github.com/RuellePaul/datatensor.git
(venv) $ cd datatensor
(venv) $ pip install --upgrade pip
(venv) $ pip install -r api/requirements.txt
(venv) $ brew install rabbitmq
(venv) $ brew services start rabbitmq
(venv) $ cd ux
(venv) $ yarn
Set the environment variables in /development/init_env.sh.
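For reference, a minimal sketch of what init_env.sh might export; the variable names below are hypothetical, check the actual file:
#!/bin/bash
# Hypothetical variable names — adapt to the real init_env.sh
export ENVIRONMENT=development
export DB_HOST=localhost:27017
export API_SECRET_KEY=<secret>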
Backend
Run the FastAPI backend using:
(venv) $ python api/app.py
Front end
Run the React front end using:
(venv) $ cd ux
(venv) $ yarn run development
Worker
Run the Celery worker (it consumes from the RabbitMQ broker) using:
(venv) $ cd api
(venv) $ celery -A worker worker --loglevel=INFO
macOS Procedure
Install Docker using:
brew install --cask docker
Launch Docker Desktop and start the Docker daemon.
This section shows deployment for the test env, but the same applies to other envs.
In the PyCharm terminal, push a new tag:
(venv) $ git tag v_0.0.1
(venv) $ git push origin v_0.0.1
On AWS, search for the DTProduction instance, or rebuild it using the DTProduction instance model.
Then, log in to this instance over SSH using the `DTProductionLogin.sh` script:
(venv) $ cd builds/production
(venv) $ source DTProductionLogin.sh
This script expects `DTProductionKeys.pem` in `builds/production`.
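`DTProductionLogin.sh` presumably wraps an `ssh` call; a hypothetical sketch (the user and host below are placeholders):
#!/bin/bash
# Hypothetical sketch — the real script lives in builds/production
ssh -i DTProductionKeys.pem ubuntu@<DTProduction-public-dns>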
Next, on the machine, install `git`, `docker` and `docker-compose`:
sudo -i
apt install git
apt install docker.io
apt install docker-compose
Then, log in to GitHub Packages:
cat ./github_token.txt | docker login https://docker.pkg.github.com -u <username> --password-stdin
`<username>` must be authorized to collaborate on the Datatensor GitHub project, and you must have a `github_token.txt` with the repo, workflow and packages scopes enabled.
Retrieve it from GitHub here: https://github.com/settings/tokens/new
You can now clone the project:
git clone https://github.com/RuellePaul/datatensor.git
cd datatensor
If prompted, log in with `<username>` and your GitHub token as the password.
Fill the env with secret keys (copy them from your local setup):
cd ~/datatensor/builds/production
nano init_env.sh
Add the `deploy.sh` script:
cd ~/datatensor
nano deploy.sh
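The contents of `deploy.sh` are copied from your local checkout; a hypothetical sketch of what such a script might do (the compose file path and checkout strategy are assumptions):
#!/bin/bash
# Hypothetical sketch — fetch the pushed tag and restart the stack
cd ~/datatensor
git fetch --tags
git checkout "$1"                       # e.g. ./deploy.sh v_0.0.1
docker-compose -f builds/docker-compose.yml pull
docker-compose -f builds/docker-compose.yml up -d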
- In all `conf.d/<domain>.conf` files, comment out the line:
# return 301 https://$server_name$request_uri;
Then deploy the `proxy` service.
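One way to do this, assuming `proxy` is defined in `~/datatensor/builds/docker-compose.yml` like `filebeat` later on:
cd ~/datatensor/builds
docker-compose up -d proxy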
- Run certbot:
apt install certbot
certbot certonly --dry-run
Select option:
2: Place files in webroot directory (webroot)
Please enter in your domain name(s):
datatensor.io www.datatensor.io api.datatensor.io kibana.datatensor.io
Input the webroot:
/var/www/letsencrypt
If the dry run is successful, run in `/etc/letsencrypt`:
rm -rfd archive
mkdir archive
rm -rfd live
mkdir live
Then run the same command as above, without the `--dry-run` argument.
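Equivalently, the interactive answers can be passed as flags; a non-interactive sketch, assuming the same webroot and domains as the dry run:
certbot certonly --webroot -w /var/www/letsencrypt \
    -d datatensor.io -d www.datatensor.io \
    -d api.datatensor.io -d kibana.datatensor.io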
- Rename the generated certs directory in `/etc/letsencrypt/live`:
mv datatensor.io-0004/ datatensor.io
- Restore the line in all `conf.d/<domain>.conf` files:
return 301 https://$server_name$request_uri;
Then re-deploy the `proxy` service.
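For example, assuming the same compose setup as the initial `proxy` deploy:
cd ~/datatensor/builds
docker-compose up -d --force-recreate proxy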
Hydrate `~/datatensor/builds/production/elk/.env` with production values.
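A hypothetical sketch of that `.env` (the variable names are assumptions modeled on common ELK compose setups, check the actual template):
# Hypothetical variable names — check the actual .env template
ELASTIC_VERSION=<version>
ELASTIC_PASSWORD=<production-secret>
KIBANA_SYSTEM_PASSWORD=<production-secret>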
Deploy filebeat:
cd ~/datatensor/builds
docker-compose up -d filebeat
Deploy elasticsearch and kibana:
cd ~/datatensor/builds/production/elk
docker-compose up -d elasticsearch kibana
Copy the `ca.crt` certificate from the `certs` docker volume into `~/datatensor/builds/production/elk` (one way to do this is sketched after the commands below), then deploy logstash:
cd ~/datatensor/builds/production/elk
docker-compose up -d logstash
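One possible way to extract `ca.crt` from the volume (the volume name `certs` and the certificate's path inside it are assumptions):
docker run --rm -v certs:/certs \
    -v ~/datatensor/builds/production/elk:/out \
    alpine cp /certs/ca/ca.crt /out/ca.crt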
Kibana is then visible at https://kibana.datatensor.io