Use a triple store to store and query spek output from candidate smasher, then determine which candidates are acceptable using causal pathways.
- Upload data to S3 Bucket
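The upload command itself is not included here; a minimal sketch using the standard AWS CLI, where `${S3_BUCKET}` and the local `data/` directory are placeholder names:

```sh
# Recursively copy local RDF data to the bucket Neptune will load from.
# ${S3_BUCKET} and data/ are placeholders, not names from the original scripts.
aws s3 cp data/ "s3://${S3_BUCKET}/spek/" --recursive
```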
- Spin up Neptune cluster stack with bastion host

```sh
aws cloudformation create-stack \
  --stack-name ${STACK_NAME} \
  --capabilities "CAPABILITY_IAM" \
  --template-body file://nept_stack_cf.yaml \
  --parameters '[{"ParameterKey":"Env","ParameterValue":"test"},
                 {"ParameterKey":"DbInstanceType","ParameterValue":"db.r4.xlarge"},
                 {"ParameterKey":"KeyName","ParameterValue":"aws-growls-useast-1"}]'
```
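Stack creation takes several minutes. A standard AWS CLI convenience (not part of the original scripts) is to block until the stack is ready before moving on:

```sh
# Wait for CloudFormation to finish creating the Neptune stack.
aws cloudformation wait stack-create-complete --stack-name ${STACK_NAME}
```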
- Assign IAM role to cluster

```sh
aws neptune add-role-to-db-cluster \
  --role-arn ${NeptuneLoadFromS3IAMRoleArn} \
  --db-cluster-identifier ${DBClusterId}
```
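The `${NeptuneLoadFromS3IAMRoleArn}` and `${DBClusterId}` values can be read from the stack's outputs, assuming `nept_stack_cf.yaml` exports them (an assumption about the template, not something shown in this README):

```sh
# List the stack outputs; pick out the load role ARN and cluster identifier.
aws cloudformation describe-stacks \
  --stack-name ${STACK_NAME} \
  --query 'Stacks[0].Outputs'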
- Configure Bastion Host

```sh
ansible-playbook -i ${BastionIp}, -u ec2-user bastion_play.yml
```
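If the playbook fails to connect, Ansible's ad-hoc `ping` module is a quick way to verify SSH access first (this assumes the key pair from the stack parameters is loaded in your ssh-agent):

```sh
# Sanity-check SSH connectivity to the bastion before running the playbook.
ansible all -i ${BastionIp}, -u ec2-user -m ping
```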
- Login to Bastion Host and Load Data

```sh
ssh ec2-user@${BastionIp} scripts/neptune_load.sh
```
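For reference, `scripts/neptune_load.sh` presumably wraps Neptune's bulk loader HTTP API. A minimal sketch, assuming Turtle-formatted data and reusing the placeholder names above (`${NEPTUNE_ENDPOINT}`, `${S3_BUCKET}`, and the region are illustrative values, not taken from the script):

```sh
# Kick off a Neptune bulk load from S3 via the loader endpoint (default port 8182).
curl -s -X POST \
  -H 'Content-Type: application/json' \
  "https://${NEPTUNE_ENDPOINT}:8182/loader" \
  -d "{
    \"source\": \"s3://${S3_BUCKET}/spek/\",
    \"format\": \"turtle\",
    \"iamRoleArn\": \"${NeptuneLoadFromS3IAMRoleArn}\",
    \"region\": \"us-east-1\",
    \"failOnError\": \"FALSE\"
  }"
```

The response includes a load job id that can be polled at the same `/loader` endpoint to check progress.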
- Start an in-memory Fuseki instance that allows updates

```sh
${FUSEKI_HOME}/fuseki-server --mem --update /ds 1> fuseki.out 2>&1 &
```
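To confirm the server is up before loading data, Fuseki exposes a ping endpoint on its admin interface (default port 3030):

```sh
# Returns a timestamp once the server is accepting requests.
curl -s 'http://localhost:3030/$/ping'
```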
- Input example spek

```sh
./insert_spek.sh
```
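`insert_spek.sh` is not reproduced here; one plausible shape for it, assuming the spek is serialized as Turtle in a local file (`spek.ttl` is a hypothetical name), is a POST to Fuseki's Graph Store Protocol endpoint for the `/ds` dataset:

```sh
# Load the spek into the default graph of the in-memory dataset.
curl -s -X POST \
  -H 'Content-Type: text/turtle' \
  --data-binary @spek.ttl \
  'http://localhost:3030/ds/data?default'
```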
- Run ISR update to identify acceptable candidates

```sh
./update_isr.sh
```
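The actual SPARQL lives in `update_isr.sh`; the sketch below only illustrates the shape of such an update against Fuseki's update endpoint. The `spek:` prefix and property names are invented for illustration and are not the project's real vocabulary:

```sh
# Hypothetical sketch: mark candidates with a causal pathway as acceptable.
# The prefix and predicates below are illustrative, not the actual ontology.
curl -s -X POST \
  -H 'Content-Type: application/sparql-update' \
  --data-binary @- \
  'http://localhost:3030/ds/update' <<'EOF'
PREFIX spek: <http://example.org/spek#>
INSERT { ?candidate spek:acceptable true }
WHERE  { ?candidate spek:hasCausalPathway ?pathway }
EOF
```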
- Run query to get results

```sh
./query_isr.sh
```
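Again as a sketch only: the real query is in `query_isr.sh`, but a SELECT against Fuseki's query endpoint would look roughly like this (same caveat that the vocabulary is illustrative):

```sh
# Hypothetical sketch: list the candidates marked acceptable by the update above.
curl -s -G \
  -H 'Accept: application/sparql-results+json' \
  --data-urlencode 'query=
    PREFIX spek: <http://example.org/spek#>
    SELECT ?candidate WHERE { ?candidate spek:acceptable true }' \
  'http://localhost:3030/ds/query'
```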
- fuseki: https://jena.apache.org/documentation/fuseki2/

License: Creative Commons 3.0