PRMDR-338 subtask (#93)
* prmdr-242 create name validations

* prmdr-242 added more validation and test

* prmdr-242 refactor validation service and add unit test

* prmdr-242 test-fix

* prmdr-242 change regex

* prmdr-242 PR changes

* allow special characters in regex

* added test for allowing special char

* added model for metadata

* [PRMDR-336] Add csv_to_staging_metadata to lambda

Co-authored-by: NogaNHS <noga.sasson1@nhs.net>

* add lambda to workflow

* [PRMDR-336] Add unit test for lambda, tested lambda with sandbox

* [PRMDR-336] Substitute expected sqs msg in test with .json file

* run formatter

* [PRMDR-336] Setup configdict and alias in pydantic model to support serialise/deserialise with field names in NHS spec

* [PRMDR-336] Add unit test for sqs service

* [PRMDR-336] Add logging, run formatter

* [PRMDR-336] Remove unnecessary file

* [PRMDR-336] Add lambda deploy to github action yml

* [PRMDR-336] Add lambda deploy to dev-to-main github action yml

* [PRMDR-336] Remove a redundant str() call

* [PRMDR-338] Add bulk_upload_handler lambda, amend StagingMetadata model, add basic test for serialise/deserialise for StagingMetadata

* (WIP) [PRMDR-338] Add bulk upload service

* [PRMDR-338] Refactoring

* [PRMDR-338] Start bringing in LG file validation from other branch

* [PRMDR-338] Adding unit test for bulk upload service

* [PRMDR-338] Run formatter

* [PRMDR-338] Continue adding unit tests

* [PRMDR-338] run formatter

* [PRMDR-338] Adding unit test to bulk upload service

* [PRMDR-338] Rename the file for test data

* [PRMDR-338] Add unit test for new methods in S3 service

* [PRMDR-338] Add unit tests for bulk_upload_handler

* [PRMDR-338] Remove duplicated code

* Run formatter

* [PRMDR-338] Add new lambda to github action yml

* [PRMDR-338] Add unit test for added validation in lloyd_george_validator

* [PRMDR-338] Fix issue around sqs message attribute, add unit test for extract_info_from_filename

* Run formatter

* [PRMDR-338] Revert accidental deletion of some unit tests for bulk_upload_metadata lambda

* [PRMDR-338] Amend a unit test for bulk_upload_service

* run formatter

---------

Co-authored-by: NogaNHS <noga.sasson1@nhs.net>
joefong-nhs and NogaNHS authored Oct 16, 2023
1 parent cef4c71 commit aaaa996
Showing 27 changed files with 945 additions and 62 deletions.
43 changes: 41 additions & 2 deletions .github/workflows/lambdas-deploy-feature-to-sandbox.yml
@@ -412,7 +412,7 @@ jobs:
zip_file: package_lambdas_lloyd_george_record_stitch_handler.zip


python_deploy_bulk_upload_lambda:
python_deploy_bulk_upload_metadata_lambda:
runs-on: ubuntu-latest
environment: development
needs: ["python_lambdas_test"]
@@ -449,4 +449,43 @@ jobs:
with:
aws_region: ${{ vars.AWS_REGION }}
function_name: ${{ github.event.inputs.sandboxWorkspace}}_BulkUploadMetadataLambda
zip_file: package_lambdas_bulk_upload_metadata_handler.zip

python_deploy_bulk_upload_lambda:
runs-on: ubuntu-latest
environment: development
needs: ["python_lambdas_test"]
strategy:
matrix:
python-version: ["3.11"]

steps:
- name: Checkout
uses: actions/checkout@v3

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}

- name: Make virtual environment
run: |
make env
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
role-to-assume: ${{ secrets.AWS_ASSUME_ROLE }}
role-skip-session-tagging: true
aws-region: ${{ vars.AWS_REGION }}

- name: Create release package for Bulk Upload Lambda
run: |
make lambda_name=bulk_upload_handler zip
- name: Upload Lambda Function for BulkUpload Lambda
uses: appleboy/lambda-action@master
with:
aws_region: ${{ vars.AWS_REGION }}
function_name: ${{ github.event.inputs.sandboxWorkspace}}_BulkUploadLambda
zip_file: package_lambdas_bulk_upload_handler.zip
45 changes: 43 additions & 2 deletions .github/workflows/lambdas-deploy-to-test-manual.yml
@@ -418,7 +418,7 @@ jobs:
function_name: ${{ vars.BUILD_ENV}}_LloydGeorgeStitchLambda
zip_file: package_lambdas_lloyd_george_record_stitch_handler.zip

python_deploy_bulk_upload_lambda:
python_deploy_bulk_upload_metadata_lambda:
runs-on: ubuntu-latest
environment: test
needs: ["python_lambdas_test"]
@@ -457,4 +457,45 @@ jobs:
with:
aws_region: ${{ vars.AWS_REGION }}
function_name: ${{ vars.BUILD_ENV}}_BulkUploadMetadataLambda
zip_file: package_lambdas_bulk_upload_metadata_handler.zip

python_deploy_bulk_upload_lambda:
runs-on: ubuntu-latest
environment: test
needs: ["python_lambdas_test"]
strategy:
matrix:
python-version: ["3.11"]

steps:
- name: Checkout
uses: actions/checkout@v3
with:
ref: ${{ github.event.inputs.buildBranch}}

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}

- name: Make virtual environment
run: |
make env
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
role-to-assume: ${{ secrets.AWS_ASSUME_ROLE }}
role-skip-session-tagging: true
aws-region: ${{ vars.AWS_REGION }}

- name: Create release package for Bulk Upload Lambda
run: |
make lambda_name=bulk_upload_handler zip
- name: Upload Lambda Function for BulkUpload Lambda
uses: appleboy/lambda-action@master
with:
aws_region: ${{ vars.AWS_REGION }}
function_name: ${{ vars.BUILD_ENV}}_BulkUploadLambda
zip_file: package_lambdas_bulk_upload_handler.zip
17 changes: 15 additions & 2 deletions .github/workflows/new_base-lambdas-reusable-deploy-all.yml
@@ -92,8 +92,8 @@ jobs:
secrets:
AWS_ASSUME_ROLE: ${{ secrets.AWS_ASSUME_ROLE }}

deploy_bulk_upload_lambda:
name: Deploy metadata_bulk_upload_lambda
deploy_bulk_upload_metadata_lambda:
name: Deploy bulk_upload_metadata_lambda
uses: ./.github/workflows/new_base-lambdas-reusable-deploy.yml
with:
environment: ${{ inputs.environment}}
@@ -102,5 +102,18 @@
sandbox: ${{ inputs.sandbox }}
lambda_handler_name: bulk_upload_metadata_handler
lambda_aws_name: BulkUploadMetadataLambda
secrets:
AWS_ASSUME_ROLE: ${{ secrets.AWS_ASSUME_ROLE }}

deploy_bulk_upload_lambda:
name: Deploy bulk_upload_lambda
uses: ./.github/workflows/new_base-lambdas-reusable-deploy.yml
with:
environment: ${{ inputs.environment}}
python_version: ${{ inputs.python_version }}
build_branch: ${{ inputs.build_branch}}
sandbox: ${{ inputs.sandbox }}
lambda_handler_name: bulk_upload_handler
lambda_aws_name: BulkUploadLambda
secrets:
AWS_ASSUME_ROLE: ${{ secrets.AWS_ASSUME_ROLE }}
50 changes: 50 additions & 0 deletions lambdas/handlers/bulk_upload_handler.py
@@ -0,0 +1,50 @@
import json
import logging
import os

from services.bulk_upload_service import BulkUploadService
from services.lloyd_george_validator import LGInvalidFilesException
from services.sqs_service import SQSService
from utils.exceptions import InvalidMessageException

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def lambda_handler(event, _context):
logger.info("Received event. Starting bulk upload process")
bulk_upload_service = BulkUploadService()

if "Records" not in event:
logger.error(f"No sqs messages found in event: {event}. Will ignore this event")
return

for index, message in enumerate(event["Records"], start=1):
try:
logger.info(f"Processing message {index} of {len(event['Records'])}")
bulk_upload_service.handle_sqs_message(message)
except (InvalidMessageException, LGInvalidFilesException) as error:
handle_invalid_message(invalid_message=message, error=error)


def handle_invalid_message(invalid_message: dict, error=None):
# Currently we just send the invalid message to the invalid queue.
# A future ticket will change this to record errors in DynamoDB.
invalid_queue_url = os.environ["INVALID_SQS_QUEUE_URL"]
sqs_service = SQSService()

new_message = {"original_message": invalid_message["body"]}
if error:
new_message["error"] = str(error)

try:
nhs_number = invalid_message["messageAttributes"]["NhsNumber"]["stringValue"]
except KeyError:
nhs_number = ""

sqs_service.send_message_with_nhs_number_attr(
queue_url=invalid_queue_url,
message_body=json.dumps(new_message),
nhs_number=nhs_number,
)
logger.info(f"Sent message to invalid queue: {invalid_message}")
12 changes: 5 additions & 7 deletions lambdas/handlers/create_document_reference_handler.py
@@ -2,7 +2,6 @@
import logging
import os
import sys
import uuid
from json import JSONDecodeError

from botocore.exceptions import ClientError
@@ -11,12 +10,12 @@
UploadRequestDocument)
from pydantic import ValidationError
from services.dynamo_service import DynamoDBService
from services.lloyd_george_validator import (LGInvalidFilesException,
validate_lg_files)
from services.s3_service import S3Service
from utils.exceptions import InvalidResourceIdException
from utils.lambda_response import ApiGatewayResponse
from utils.utilities import validate_id

from services.lloyd_george_validator import validate_lg_files, LGInvalidFilesException
from utils.utilities import create_reference_id, validate_id

sys.path.append(os.path.join(os.path.dirname(__file__)))

@@ -64,7 +63,7 @@ def lambda_handler(event, context):
except ValidationError as e:
logger.error(e)
return ApiGatewayResponse(
400, f"Failed to parse document upload request data", "GET"
400, "Failed to parse document upload request data", "GET"
).create_api_gateway_response()
except JSONDecodeError as e:
logger.error(e)
@@ -92,7 +91,7 @@ def lambda_handler(event, context):

logger.info("Provided document is supported")

s3_object_key = str(uuid.uuid4())
s3_object_key = create_reference_id()

document_reference: NHSDocumentReference

@@ -121,7 +120,6 @@ def lambda_handler(event, context):
return response

try:

s3_response = s3_service.create_document_presigned_url_handler(
document_reference.s3_bucket_name,
document_reference.nhs_number + "/" + document_reference.id,
10 changes: 6 additions & 4 deletions lambdas/models/nhs_document_reference.py
@@ -1,11 +1,9 @@
from datetime import datetime, timezone
from typing import Any

from enums.metadata_field_names import DocumentReferenceMetadataFields
from pydantic import BaseModel



class UploadRequestDocument(BaseModel):
fileName: str
contentType: str
@@ -25,7 +23,7 @@ def __init__(
self.deleted = None
self.uploaded = None
self.virus_scanner_result = "Not Scanned"
self.file_location = f"s3://{self.s3_bucket_name}/{self.nhs_number}/{self.id}"
self.file_location = f"s3://{self.s3_bucket_name}/{self.s3_file_key}"

def set_uploaded(self) -> None:
self.uploaded = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%fZ")
@@ -54,6 +52,10 @@ def to_dict(self):
}
return document_metadata

@property
def s3_file_key(self):
return f"{self.nhs_number}/{self.id}"

def __eq__(self, other):
return (
self.id == other.id
@@ -63,6 +65,6 @@ def __eq__(self, other):
and self.created == other.created
and self.deleted == other.deleted
and self.uploaded == other.uploaded
and self.virus_scanner_result == other.virus_scan_result
and self.virus_scanner_result == other.virus_scanner_result
and self.file_location == other.file_location
)
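The new s3_file_key property gives the object key a single source of truth, so file_location can no longer drift from the key actually used in S3 (the old f-string duplicated the "{nhs_number}/{id}" layout inline). A condensed sketch of the pattern, with the unrelated fields omitted (the real NHSDocumentReference constructor takes more arguments than shown here):

class DocumentReference:
    # Condensed illustration of the property-derived key pattern.
    def __init__(self, nhs_number: str, ref_id: str, s3_bucket_name: str):
        self.nhs_number = nhs_number
        self.id = ref_id
        self.s3_bucket_name = s3_bucket_name
        # file_location reuses s3_file_key instead of repeating its format.
        self.file_location = f"s3://{self.s3_bucket_name}/{self.s3_file_key}"

    @property
    def s3_file_key(self):
        return f"{self.nhs_number}/{self.id}"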
1 change: 0 additions & 1 deletion lambdas/models/staging_metadata.py
@@ -18,7 +18,6 @@ class MetadataFile(BaseModel):
file_path: str = Field(alias="FILEPATH")
page_count: str = Field(alias="PAGE COUNT")
gp_practice_code: str
nhs_number: str = Field(exclude=True, alias=NHS_NUMBER_FIELD_NAME)
section: str
sub_section: Optional[str]
scan_date: str
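The MetadataFile model above depends on the ConfigDict/alias setup described in the [PRMDR-336] commit message. A minimal sketch of that pattern, assuming pydantic v2 and trimming the field set for illustration:

from pydantic import BaseModel, ConfigDict, Field


class MetadataFile(BaseModel):
    # populate_by_name lets the model be constructed either from the NHS
    # spec column headings (the aliases) or from the snake_case field names.
    model_config = ConfigDict(populate_by_name=True)

    file_path: str = Field(alias="FILEPATH")
    page_count: str = Field(alias="PAGE COUNT")


row = MetadataFile.model_validate({"FILEPATH": "1of1.pdf", "PAGE COUNT": "3"})
print(row.model_dump(by_alias=True))  # serialises back with the spec headings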
[Diffs for the remaining 20 changed files are not shown.]
