API Production Deploy 3/14/2023 #846

Merged 46 commits on Mar 14, 2024
46 commits
567dd39  fix personalisation (Jan 18, 2024)
ed3a356  merge from main and fix some tests (Jan 19, 2024)
6dc9828  fix tests (Jan 22, 2024)
4024c8f  merge from main (Jan 22, 2024)
d7c29b8  fix blank lines (Jan 22, 2024)
0c15a65  remove print statement (Jan 22, 2024)
c97be34  optimize personalisation lookup (Jan 23, 2024)
b5a0562  fix tests (Jan 24, 2024)
31ff6a1  merge from main (Feb 1, 2024)
0010a43  merge from main again (Feb 6, 2024)
0b714c2  merge (Feb 7, 2024)
aef797e  merge from main again (Feb 22, 2024)
b777249  merge from main (Mar 1, 2024)
ec9619c  merge from main (Mar 5, 2024)
a8dcda4  fix test (Mar 5, 2024)
492daee  fix personalisation for emails (Mar 6, 2024)
e2b2cdc  fix tests (Mar 6, 2024)
89200e5  Update README.md with logo (stvnrlly, Mar 6, 2024)
90e6b0b  Merge pull request #837 from GSA/stvnrlly/add-logo-to-readme (ccostino, Mar 7, 2024)
92d4171  We hope this is right. (xlorepdarkhelm, Mar 8, 2024)
ed9896f  Updating the versions for things. (xlorepdarkhelm, Mar 8, 2024)
7c95211  fix login.gov to use user uuid instead of email (notify-admin-1277) (Mar 8, 2024)
23167c5  Formatting is annoying. (xlorepdarkhelm, Mar 8, 2024)
b46bad8  Version bumps aren't fun. (xlorepdarkhelm, Mar 8, 2024)
e843b05  Removing all the references! (xlorepdarkhelm, Mar 8, 2024)
9ce1e48  Terraform formatting (xlorepdarkhelm, Mar 8, 2024)
4d89e64  Reversing the over-engineering. (xlorepdarkhelm, Mar 8, 2024)
6d72b0d  Merge pull request #839 from GSA/API-729_Recursive_Delete_Changes (ccostino, Mar 8, 2024)
c84d01b  Merge pull request #783 from GSA/notify-api-749 (ccostino, Mar 8, 2024)
8a6cc76  Merge pull request #840 from GSA/notify-admin-1277 (ccostino, Mar 11, 2024)
6c4c8b0  fix code coverage reporting (Mar 11, 2024)
d7edf32  Merge pull request #842 from GSA/code_coverage (ccostino, Mar 11, 2024)
6170347  Fix remaining Terraform for production and demo (ccostino, Mar 12, 2024)
ef46ddc  Fixed reference to the Cloud Foundry org instead of space (ccostino, Mar 12, 2024)
68fa123  Add space data back in - we still need it! (ccostino, Mar 12, 2024)
f0e6688  Adding missing variable for egress_space (ccostino, Mar 12, 2024)
111135a  Helping the migration succeed! (xlorepdarkhelm, Mar 12, 2024)
6713585  Merge pull request #850 from GSA/API-847_Clobbering_bad_enum_data (ccostino, Mar 12, 2024)
15f8be7  Explicitly add allow_ssh flag and disable for production (ccostino, Mar 13, 2024)
1484c2f  Adjust properties further to only apply to spaces (ccostino, Mar 13, 2024)
0404348  Removed last bit of extraneous config that is not needed (ccostino, Mar 13, 2024)
d72521f  Merge pull request #849 from GSA/fix-remaining-terraform (stvnrlly, Mar 13, 2024)
9d60e6e  Fix reference to CF org vs. space (ccostino, Mar 13, 2024)
8fe70fe  Merge pull request #851 from GSA/more-terraform-fixes (ccostino, Mar 13, 2024)
a8640a6  fix email notifications missing personalisation (notify-api-853) (Mar 13, 2024)
30ecc21  Merge pull request #854 from GSA/notify-api-853 (ccostino, Mar 13, 2024)
3 changes: 2 additions & 1 deletion .github/workflows/checks.yml
@@ -61,7 +61,8 @@ jobs:
NOTIFY_E2E_TEST_HTTP_AUTH_USER: ${{ secrets.NOTIFY_E2E_TEST_HTTP_AUTH_USER }}
NOTIFY_E2E_TEST_PASSWORD: ${{ secrets.NOTIFY_E2E_TEST_PASSWORD }}
- name: Check coverage threshold
run: poetry run coverage report --fail-under=50
# TODO get this back up to 95
run: poetry run coverage report --fail-under=87

validate-new-relic-config:
runs-on: ubuntu-latest
2 changes: 1 addition & 1 deletion Makefile
@@ -79,7 +79,7 @@ test: ## Run tests and create coverage report
poetry run black .
poetry run flake8 .
poetry run isort --check-only ./app ./tests
poetry run coverage run -m pytest -vv --maxfail=10
poetry run coverage run --omit=*/notifications_utils/* -m pytest --maxfail=10
poetry run coverage report -m --fail-under=95
poetry run coverage html -d .coverage_cache

2 changes: 2 additions & 0 deletions README.md
@@ -1,3 +1,5 @@
![notify-logo](https://github.com/GSA/notifications-api/assets/4156602/6b2905d2-a232-4414-8815-25dba6008f17)

# Notify.gov API

This project is the core of [Notify.gov](https://notify-demo.app.cloud.gov).
62 changes: 62 additions & 0 deletions app/aws/s3.py
@@ -130,6 +130,21 @@ def extract_phones(job):
return phones


def extract_personalisation(job):
job = job.split("\r\n")
first_row = job[0]
job.pop(0)
first_row = first_row.split(",")
personalisation = {}
job_row = 0
for row in job:
row = row.split(",")
temp = dict(zip(first_row, row))
personalisation[job_row] = temp
job_row = job_row + 1
return personalisation


def get_phone_number_from_s3(service_id, job_id, job_row_number):
# We don't want to constantly pull down a job from s3 every time we need a phone number.
# At the same time we don't want to store it in redis or the db
@@ -175,6 +190,53 @@ def get_phone_number_from_s3(service_id, job_id, job_row_number):
return "Unavailable"


def get_personalisation_from_s3(service_id, job_id, job_row_number):
# We don't want to constantly pull down a job from s3 every time we need the personalisation.
# At the same time we don't want to store it in redis or the db
# So this is a little recycling mechanism to reduce the number of downloads.
job = JOBS.get(job_id)
if job is None:
job = get_job_from_s3(service_id, job_id)
JOBS[job_id] = job
incr_jobs_cache_misses()
else:
incr_jobs_cache_hits()

# If the job is None after our attempt to retrieve it from s3, it
# probably means the job is old and has been deleted from s3, in
# which case there is nothing we can do. It's unlikely to run into
# this, but it could theoretically happen, especially if we ever
# change the task schedules
if job is None:
current_app.logger.warning(
"Couldnt find personalisation for job_id {job_id} row number {job_row_number} because job is missing"
)
return {}

# If we look in the JOBS cache for the quick lookup dictionary of personalisations for a given job
# and that dictionary is not there, create it
if JOBS.get(f"{job_id}_personalisation") is None:
JOBS[f"{job_id}_personalisation"] = extract_personalisation(job)

# If we can find the quick dictionary, use it
if JOBS.get(f"{job_id}_personalisation") is not None:
personalisation_to_return = JOBS.get(f"{job_id}_personalisation").get(
job_row_number
)
if personalisation_to_return:
return personalisation_to_return
else:
current_app.logger.warning(
f"Was unable to retrieve personalisation from lookup dictionary for job {job_id}"
)
return {}
else:
current_app.logger.error(
f"Was unable to construct lookup dictionary for job {job_id}"
)
return {}


def get_job_metadata_from_s3(service_id, job_id):
obj = get_s3_object(*get_job_location(service_id, job_id))
return obj.get()["Metadata"]
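For reviewers: a minimal sketch of what extract_personalisation produces, assuming a job is the raw CSV text of an upload with a header row. The sample values below are made up, not from the codebase.

```python
# Illustrative only: a job as the helper above expects it -- raw CSV text with
# a header row and "\r\n" line endings. Values are made up.
sample_job = "phone number,name,day\r\n+15555550100,Alice,Tuesday\r\n+15555550101,Bob,Wednesday"


def extract_personalisation_sketch(job):
    # Same shape as extract_personalisation: the header row supplies the keys,
    # and each data row becomes a dict keyed by its 0-based row number.
    rows = job.split("\r\n")
    header = rows.pop(0).split(",")
    return {i: dict(zip(header, row.split(","))) for i, row in enumerate(rows)}


print(extract_personalisation_sketch(sample_job)[1])
# {'phone number': '+15555550101', 'name': 'Bob', 'day': 'Wednesday'}
```

get_personalisation_from_s3 then caches that dictionary in JOBS under the key f"{job_id}_personalisation", so repeated per-row lookups for the same job avoid re-downloading and re-parsing the file.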
7 changes: 6 additions & 1 deletion app/celery/provider_tasks.py
@@ -1,10 +1,11 @@
import json
import os
from datetime import datetime, timedelta

from flask import current_app
from sqlalchemy.orm.exc import NoResultFound

from app import aws_cloudwatch_client, notify_celery
from app import aws_cloudwatch_client, notify_celery, redis_store
from app.clients.email import EmailClientNonRetryableException
from app.clients.email.aws_ses import AwsSesClientThrottlingSendRateException
from app.clients.sms import SmsClientResponseException
@@ -162,8 +163,12 @@ def deliver_email(self, notification_id):
"Start sending email for notification id: {}".format(notification_id)
)
notification = notifications_dao.get_notification_by_id(notification_id)

if not notification:
raise NoResultFound()
personalisation = redis_store.get(f"email-personalisation-{notification_id}")

notification.personalisation = json.loads(personalisation)
send_to_providers.send_email_to_provider(notification)
except EmailClientNonRetryableException as e:
current_app.logger.exception(
9 changes: 9 additions & 0 deletions app/dao/notifications_dao.py
@@ -58,6 +58,15 @@ def dao_create_notification(notification):
notification.id = create_uuid()
if not notification.status:
notification.status = NotificationStatus.CREATED

# notify-api-749 do not write to db
# if we have a verify_code we know this is the authentication notification at login time
# and not csv (containing PII) provided by the user, so allow verify_code to continue to exist
if "verify_code" in str(notification.personalisation):
pass
else:
notification.personalisation = ""

# notify-api-742 remove phone numbers from db
notification.to = "1"
notification.normalised_to = "1"
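A short sketch of the scrubbing rule added above, with made-up values; only the login-time verify code survives to the database, so personalisation from user-uploaded CSVs (which can contain PII) is never persisted:

```python
# Mirrors the check in dao_create_notification above; values are illustrative.
def scrub_personalisation(personalisation):
    return personalisation if "verify_code" in str(personalisation) else ""


assert scrub_personalisation({"verify_code": "123456"}) == {"verify_code": "123456"}
assert scrub_personalisation({"name": "Alice", "day": "Tuesday"}) == ""
```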
26 changes: 26 additions & 0 deletions app/dao/users_dao.py
@@ -25,6 +25,32 @@ def create_secret_code(length=6):
return "{:0{length}d}".format(random_number, length=length)


def get_login_gov_user(login_uuid, email_address):
"""
We want to check to see if the user is registered with login.gov
If we can find the login.gov uuid in our user table, then they are.

Also, because we originally keyed off email address we might have a few
older users who registered with login.gov but we don't know what their
login.gov uuids are. Eventually the code that checks by email address
should be removed.
"""

user = User.query.filter_by(login_uuid=login_uuid).first()
if user:
if user.email_address != email_address:
save_user_attribute(user, {"email_address": email_address})
return user
# Remove this 1 July 2025, all users should have login.gov uuids by now
user = User.query.filter_by(email_address=email_address).first()
if user:
save_user_attribute(user, {"login_uuid": login_uuid})
return user

return None


def save_user_attribute(usr, update_dict=None):
db.session.query(User).filter_by(id=usr.id).update(update_dict or {})
db.session.commit()
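A hypothetical call site for the new lookup, illustrating the fallback order (the uuid and email below are invented):

```python
# First match on login.gov uuid; for older accounts that predate uuid storage,
# fall back to email and backfill the uuid. Returns None if neither matches.
user = get_login_gov_user(
    login_uuid="0f0f0f0f-aaaa-bbbb-cccc-123456789abc",
    email_address="alice.example@agency.gov",
)
if user is None:
    # Not a known login.gov user -- treat as unregistered.
    ...
```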
27 changes: 23 additions & 4 deletions app/delivery/send_to_providers.py
@@ -1,3 +1,4 @@
import json
from datetime import datetime
from urllib import parse

@@ -10,7 +11,7 @@
)

from app import create_uuid, db, notification_provider_clients, redis_store
from app.aws.s3 import get_phone_number_from_s3
from app.aws.s3 import get_personalisation_from_s3, get_phone_number_from_s3
from app.celery.test_key_tasks import send_email_response, send_sms_response
from app.dao.email_branding_dao import dao_get_email_branding_by_id
from app.dao.notifications_dao import dao_update_notification
@@ -21,6 +22,18 @@


def send_sms_to_provider(notification):
# we no longer store the personalisation in the db,
# need to retrieve from s3 before generating content
# However, we are still sending the initial verify code through personalisation
# so if there is some value there, don't overwrite it
if not notification.personalisation:
personalisation = get_personalisation_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)
notification.personalisation = personalisation

service = SerialisedService.from_id(notification.service_id)
message_id = None
if not service.active:
@@ -105,6 +118,14 @@ def send_sms_to_provider(notification):


def send_email_to_provider(notification):
# Someone needs an email, possibly new registration
recipient = redis_store.get(f"email-address-{notification.id}")
recipient = recipient.decode("utf-8")
personalisation = redis_store.get(f"email-personalisation-{notification.id}")
if personalisation:
personalisation = personalisation.decode("utf-8")
notification.personalisation = json.loads(personalisation)

service = SerialisedService.from_id(notification.service_id)
if not service.active:
technical_failure(notification=notification)
@@ -126,9 +147,7 @@ def send_email_to_provider(notification):
plain_text_email = PlainTextEmailTemplate(
template_dict, values=notification.personalisation
)
# Someone needs an email, possibly new registration
recipient = redis_store.get(f"email-address-{notification.id}")
recipient = recipient.decode("utf-8")

if notification.key_type == KeyType.TEST:
notification.reference = str(create_uuid())
update_notification_to_sending(notification, provider)
14 changes: 13 additions & 1 deletion app/job/rest.py
@@ -2,7 +2,11 @@
import pytz
from flask import Blueprint, current_app, jsonify, request

from app.aws.s3 import get_job_metadata_from_s3, get_phone_number_from_s3
from app.aws.s3 import (
get_job_metadata_from_s3,
get_personalisation_from_s3,
get_phone_number_from_s3,
)
from app.celery.tasks import process_job
from app.config import QueueNames
from app.dao.fact_notification_status_dao import fetch_notification_statuses_for_job
@@ -97,6 +101,14 @@ def get_all_notifications_for_service_job(service_id, job_id):
paginated_notifications.items, many=True
)

for notification in paginated_notifications.items:
if notification.job_id is not None:
notification.personalisation = get_personalisation_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)

return (
jsonify(
notifications=notifications,
1 change: 1 addition & 0 deletions app/models.py
@@ -109,6 +109,7 @@ class User(db.Model):
id = db.Column(UUID(as_uuid=True), primary_key=True, default=uuid.uuid4)
name = db.Column(db.String, nullable=False, index=True, unique=False)
email_address = db.Column(db.String(255), nullable=False, index=True, unique=True)
login_uuid = db.Column(db.Text, nullable=True, index=True, unique=True)
created_at = db.Column(
db.DateTime,
index=False,
3 changes: 1 addition & 2 deletions app/notifications/process_notifications.py
@@ -77,8 +77,6 @@ def persist_notification(
document_download_count=None,
updated_at=None,
):
current_app.logger.info("Persisting notification")

notification_created_at = created_at or datetime.utcnow()
if not notification_id:
notification_id = uuid.uuid4()
@@ -117,6 +115,7 @@
notification.international = recipient_info.international
notification.phone_prefix = recipient_info.country_prefix
notification.rate_multiplier = recipient_info.billable_units

elif notification_type == NotificationType.EMAIL:
current_app.logger.info(
f"Persisting notification with type: {NotificationType.EMAIL}"
29 changes: 29 additions & 0 deletions app/notifications/rest.py
@@ -2,6 +2,7 @@
from notifications_utils import SMS_CHAR_COUNT_LIMIT

from app import api_user, authenticated_service
from app.aws.s3 import get_personalisation_from_s3, get_phone_number_from_s3
from app.config import QueueNames
from app.dao import notifications_dao
from app.enums import KeyType, NotificationType, TemplateProcessType
@@ -36,6 +37,19 @@ def get_notification_by_id(notification_id):
notification = notifications_dao.get_notification_with_personalisation(
str(authenticated_service.id), notification_id, key_type=None
)
if notification.job_id is not None:
notification.personalisation = get_personalisation_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)
recipient = get_phone_number_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)
notification.to = recipient
notification.normalised_to = recipient
return (
jsonify(
data={
@@ -67,6 +81,21 @@ def get_all_notifications():
key_type=api_user.key_type,
include_jobs=include_jobs,
)
for notification in pagination.items:
if notification.job_id is not None:
notification.personalisation = get_personalisation_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)
recipient = get_phone_number_from_s3(
notification.service_id,
notification.job_id,
notification.job_row_number,
)
notification.to = recipient
notification.normalised_to = recipient

return (
jsonify(
notifications=notification_with_personalisation_schema.dump(
34 changes: 22 additions & 12 deletions app/organization/invite_rest.py
@@ -1,7 +1,10 @@
import json

from flask import Blueprint, current_app, jsonify, request
from itsdangerous import BadData, SignatureExpired
from notifications_utils.url_safe_token import check_token, generate_token

from app import redis_store
from app.config import QueueNames
from app.dao.invited_org_user_dao import (
get_invited_org_user as dao_get_invited_org_user,
@@ -48,28 +51,35 @@ def invite_user_to_org(organization_id):
current_app.config["ORGANIZATION_INVITATION_EMAIL_TEMPLATE_ID"]
)

personalisation = {
"user_name": (
"The Notify.gov team"
if invited_org_user.invited_by.platform_admin
else invited_org_user.invited_by.name
),
"organization_name": invited_org_user.organization.name,
"url": invited_org_user_url(
invited_org_user.id,
data.get("invite_link_host"),
),
}
saved_notification = persist_notification(
template_id=template.id,
template_version=template.version,
recipient=invited_org_user.email_address,
service=template.service,
personalisation={
"user_name": (
"The Notify.gov team"
if invited_org_user.invited_by.platform_admin
else invited_org_user.invited_by.name
),
"organization_name": invited_org_user.organization.name,
"url": invited_org_user_url(
invited_org_user.id,
data.get("invite_link_host"),
),
},
personalisation={},
notification_type=NotificationType.EMAIL,
api_key_id=None,
key_type=KeyType.NORMAL,
reply_to_text=invited_org_user.invited_by.email_address,
)
redis_store.set(
f"email-personalisation-{saved_notification.id}",
json.dumps(personalisation),
ex=1800,
)
saved_notification.personalisation = personalisation

send_notification_to_queue(saved_notification, queue=QueueNames.NOTIFY)

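Taken together with the deliver_email and send_email_to_provider changes above, email personalisation now travels through Redis rather than the database: the producer stores it as JSON with a 30-minute expiry, and the delivery path reads it back by notification id. A minimal sketch of that round trip, assuming a generic Redis client (the id and values below are made up):

```python
import json

import redis  # assumption: any client exposing get/set with an "ex" TTL behaves the same way

r = redis.Redis()
notification_id = "11111111-2222-3333-4444-555555555555"  # made-up id

# Producer side (as in invite_user_to_org): stash the personalisation for 30 minutes.
r.set(
    f"email-personalisation-{notification_id}",
    json.dumps({"user_name": "The Notify.gov team", "organization_name": "Example Org"}),
    ex=1800,
)

# Consumer side (as in deliver_email / send_email_to_provider): read it back.
raw = r.get(f"email-personalisation-{notification_id}")
personalisation = json.loads(raw) if raw else {}
```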