- Repo Structure
- Setup
- Version Control Guide
- Seeding the Production Database
- Formatting and Linting
- Secrets
llsc/
├── backend/
│   └── app/
│       ├── middlewares/ # Middleware components (e.g., auth middleware)
│       ├── migrations/ # Database migrations
│       ├── models/ # SQLAlchemy database models
│       │   └── __init__.py # App initialization, contains all models
│       ├── resources/ # Data Transfer Objects (DTOs)
│       ├── routes/ # Route components
│       ├── schemas/ # Pydantic schemas
│       ├── services/ # Services handling business logic
│       │   ├── implementations/ # Concrete implementations of service interfaces
│       │   └── interfaces/ # Service interfaces
│       └── utilities/ # Shared utility modules
├── frontend/
│   ├── public/ # Static files
│   └── src/
│       ├── APIClients/ # API clients
│       ├── components/ # Reusable UI components
│       ├── constants/ # Constants
│       ├── contexts/ # Context providers
│       ├── pages/ # Next.js page routes
│       ├── styles/ # Global styles
│       ├── types/ # Custom type definitions
│       └── utils/ # Utility functions
├── docker-compose.yml
├── Dockerfile
├── .env.sample
└── README.md
- Make sure you have been added to the UW Blueprint GitHub Workspace.
- Install Docker Desktop (MacOS | Windows | Linux) and ensure that it is running.
- Clone the LLSC GitHub Repository to your local machine and cd into the project folder:
SSH (recommended)
git clone git@github.com:uwblueprint/llsc.git
cd llsc
If you haven't already set up SSH keys for GitHub, follow the steps outlined here.
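As a quick reference (GitHub's documentation is the authoritative source), the usual flow looks roughly like this:
# generate a new SSH key (accept the default file location)
ssh-keygen -t ed25519 -C "your_email@example.com"
# start the ssh-agent and add the key to it
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_ed25519
# copy the contents of ~/.ssh/id_ed25519.pub into GitHub under Settings > SSH and GPG keys,
# then verify the connection
ssh -T git@github.com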
HTTPS
git clone https://github.com/uwblueprint/llsc.git
cd llsc
- Create a .env file in the root directory based on the .env.sample file. Update the environment variables as needed. Consult the Secrets section for detailed instructions.
cp .env.sample .env
- Build and start the Docker containers
docker-compose up --build
- Install pdm (this is a global installation, so the location doesn't matter). On macOS:
brew install pdm
Otherwise, feel free to follow the install instructions here.
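To confirm the global install worked, you can check the version from anywhere:
pdm --version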
You will then need to go into each directory individually to install dependencies.
FastAPI backend
cd backend
pdm install
To run the backend server locally (recommended for development), run the following command:
cd backend
pdm run dev
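To sanity-check that the server is up, you can hit the interactive API docs. This assumes FastAPI's defaults (port 8000, docs served at /docs); adjust if this project configures a different port or docs path:
# should return HTTP 200 if the backend is running (port and path are assumptions)
curl -I http://localhost:8000/docs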
To check if the database has been started up, type the following:
docker ps | grep llsc_db
This checks the list of Docker containers and searches for one named llsc_db.
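Equivalently, you can have Docker do the filtering instead of piping through grep:
# list only containers whose name matches llsc_db
docker ps --filter "name=llsc_db"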
Next.js frontend
cd frontend
npm install
To run the frontend server locally (recommended for development), run the following command:
cd frontend
npm run dev
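To verify the frontend is serving, open http://localhost:3000 in a browser or check from the command line (port 3000 is the Next.js default; adjust if this project overrides it):
# should return HTTP 200 if the dev server is running (port is an assumption)
curl -I http://localhost:3000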
- Branch off of main for all feature work and bug fixes, creating a "feature branch". Prefix the feature branch name with your GitHub username. The branch name should be in kebab-case, short and descriptive, and should include the ticket number, e.g. mslwang/LLSC-42-readme-update.
- To integrate changes on main into your feature branch, use rebase instead of merge:
# currently working on feature branch, there are new commits on main
git pull origin main --rebase
# if there are conflicts, resolve them and then:
git add .
git rebase --continue
# force push to remote feature branch
git push --force-with-lease
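If the rebase gets into a bad state and you'd rather start over, you can abort it and return the branch to where it was before the rebase began:
# undo the in-progress rebase and restore the pre-rebase state
git rebase --abort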
If you're new to Docker, you can learn more about docker-compose commands at this docker compose overview.
# builds images
docker-compose build
# builds images (if they don't exist) & starts containers
docker-compose up
# builds images & starts containers
docker-compose up --build
# stops the containers
docker-compose down
# stops the containers and removes volumes
docker-compose down --volumes
# get Names & Statuses of Running Containers
docker ps
# Remove all stopped containers, unused networks, dangling images, and build cache
docker system prune -a --volumes
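Another handy command while debugging is following the container logs (pass a service name from docker-compose.yml to narrow the output to one service):
# stream logs from all services; append a service name to follow just one
docker-compose logs -f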
Run in two lines (View Users Table):
docker exec -it llsc_db psql -U postgres -d llsc
SELECT * FROM public.users;
Running the commands line by line:
# run a bash shell in the database container
docker exec -it llsc_db /bin/bash
# in container now
psql -U postgres -d llsc
# in postgres shell, some common commands:
# display all table names
\dt
# quit
\q
# you can run any SQL query, don't forget the semicolon!
SELECT * FROM public."<table-name>";
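Alternatively, you can run a single query without entering the container or the psql shell by passing it with -c:
# runs the query inside the database container and prints the result
docker exec -it llsc_db psql -U postgres -d llsc -c 'SELECT * FROM public.users;'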
TBD
We use Ruff for code linting and formatting in the backend. To check for linting issues:
cd backend
pdm run ruff check .
To automatically fix linting issues:
cd backend
pdm run ruff check --fix .
To run the formatter:
cd backend
pdm run ruff format .
All code needs to pass ruff formatting and linting before it can be merged.
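A convenient pre-push habit is to chain the auto-fix and the formatter in one go:
cd backend
pdm run ruff check --fix . && pdm run ruff format .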
We use Prettier for code formatting in the frontend. To check for formatting issues:
npm run prettier:check
To automatically fix formatting issues:
npm run prettier:fix
We use ESLint for code linting. To check for linting issues:
npm run lint
To automatically fix linting issues:
npm run lint:fix
To run both Prettier and ESLint to format and fix linting issues in one command:
npm run format
- Commits should be atomic (guideline: the commit is self-contained; a reviewer could make sense of it even if they viewed the commit diff in isolation).
- Trivial commits (e.g. fixing a typo in the previous commit, formatting changes) should be squashed or fixup'd into the last non-trivial commit.
# last commit contained a typo, fixed now
git add .
git commit -m "Fix typo"
# fixup into previous commit through interactive rebase
# x in HEAD~x refers to the last x commits you want to view
git rebase -i HEAD~2
# text editor opens, follow instructions in there to fixup
# force push to remote feature branch
git push -f
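If you know which commit the fix belongs to, git can do the squashing for you: mark the commit as a fixup when you create it, then run an autosquash rebase (the commit SHA and the HEAD~ depth below are placeholders):
# stage the fix and mark it as a fixup of the commit it amends
git add .
git commit --fixup <sha-of-commit-being-fixed>
# git reorders and squashes the fixup automatically during the rebase
git rebase -i --autosquash HEAD~3
# force push to remote feature branch
git push --force-with-lease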
- Commit messages and PR names are descriptive and written in imperative tense. The first word should be capitalized. E.g. "Create user REST endpoints", not "Created user REST endpoints"
- PRs can contain multiple commits; they do not need to be squashed together before merging as long as each commit is atomic. Our repo is configured to only allow squash commits to main, so the entire PR will appear as 1 commit on main, but the individual commits are preserved when viewing the PR.
Secrets are stored in the Environment Variable file within the LLSC Notion.
Migrations (mirrors backend README)
We use Alembic for database schema migrations. Migration scripts mainly keep track of the incremental (and, in theory, revertible) changes made to the database schema; we don't need to rely on them to build the database itself, since Base.metadata.create_all(bind=engine) achieves that based on the current models. To create a new migration, run the following command after adding or editing models in backend/app/models.py:
cd backend
pdm run alembic revision --autogenerate -m "<migration message>"
To apply the migration, run the following command:
pdm run alembic upgrade head
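A few other Alembic commands that are useful while working with migrations (run from the backend directory):
# show the revision the database is currently at
pdm run alembic current
# list the migration history
pdm run alembic history
# roll back the most recent migration
pdm run alembic downgrade -1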