Merge pull request #539 from usdot-jpo-ode/candidate_r1
Merge candidate_r1 into master
SaikrishnaBairamoni authored Feb 26, 2024
2 parents 179c99e + eef2915 commit 967a238
Showing 148 changed files with 3,619 additions and 1,095 deletions.
2 changes: 1 addition & 1 deletion .circleci/config.yml
@@ -17,7 +17,7 @@ version: 2.1
jobs:
build:
docker:
- image: 'circleci/openjdk:14.0.2-jdk-buster-node'
- image: 'cimg/openjdk:21.0.2-node'
steps:
- checkout
- run:
4 changes: 2 additions & 2 deletions .devcontainer/Dockerfile
Expand Up @@ -3,7 +3,7 @@
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------

FROM openjdk:11-jdk
FROM eclipse-temurin:21-jdk

# This Dockerfile adds a non-root user with sudo access. Use the "remoteUser"
# property in devcontainer.json to use it. On Linux, the container user's GID/UIDs
@@ -45,7 +45,7 @@ RUN apt-get -y install snmp

#-------------------Install Kafka----------------------------------
RUN mkdir ~/Downloads
RUN curl "https://archive.apache.org/dist/kafka/2.1.1/kafka_2.11-2.1.1.tgz" -o ~/Downloads/kafka.tgz
RUN curl "https://archive.apache.org/dist/kafka/3.6.0/kafka_2.12-3.6.0.tgz" -o ~/Downloads/kafka.tgz
RUN mkdir ~/kafka \
&& cd ~/kafka \
&& tar -xvzf ~/Downloads/kafka.tgz --strip 1
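As a quick check after this image builds, the extracted CLI can be exercised from inside the dev container (a minimal sketch, assuming the ~/kafka install path created above):

# confirm the extracted Kafka CLI runs (prints the tool version)
~/kafka/bin/kafka-topics.sh --version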
43 changes: 23 additions & 20 deletions .devcontainer/devcontainer.json
@@ -1,30 +1,33 @@
// For format details, see https://aka.ms/devcontainer.json. For config options, see the README at:
// https://github.com/microsoft/vscode-dev-containers/tree/v0.154.0/containers/java-8
{
"name": "Java 11 and C++",
"name": "Java 21 and C++",
"dockerFile": "Dockerfile",
"overrideCommand": false,
"shutdownAction": "stopContainer",
"settings": {
"terminal.integrated.shell.linux": "/bin/bash"
"customizations": {
"vscode": {
"settings": {
"terminal.integrated.shell.linux": "/bin/bash"
},
"extensions": [
"vscjava.vscode-java-pack",
"vscjava.vscode-java-debug",
"vscjava.vscode-maven",
"vscjava.vscode-java-dependency",
"vscjava.vscode-java-test",
"hbenl.vscode-test-explorer",
"ms-vscode.test-adapter-converter",
"esbenp.prettier-vscode",
"mhutchie.git-graph",
"tabnine.tabnine-vscode",
"redhat.java",
"redhat.vscode-commons",
"ms-vscode.cpptools",
"ms-vscode.cmake-tools"
]
}
},
// Add the IDs of extensions you want installed when the container is created.
"extensions": [
"vscjava.vscode-java-pack",
"vscjava.vscode-java-debug",
"vscjava.vscode-maven",
"vscjava.vscode-java-dependency",
"vscjava.vscode-java-test",
"hbenl.vscode-test-explorer",
"ms-vscode.test-adapter-converter",
"esbenp.prettier-vscode",
"mhutchie.git-graph",
"tabnine.tabnine-vscode",
"redhat.java",
"redhat.vscode-commons",
"ms-vscode.cpptools",
"ms-vscode.cmake-tools"
],
// Use 'forwardPorts' to make a list of ports inside the container available locally.
// "forwardPorts": [8080, 9090, 46753, 46800, 5555, 6666, 8090, 2181, 9092],
// Use 'postCreateCommand' to run commands after the container is created.
66 changes: 52 additions & 14 deletions .devcontainer/post-create.sh
@@ -6,17 +6,55 @@ bin/kafka-server-start.sh -daemon config/server.properties
# wait 2 seconds for the server to start and be able to add partitions
sleep 2s
# add topics
bin/kafka-topics.sh --create --topic "topic.OdeBsmPojo" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeBsmJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.FilteredOdeBsmJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeTimJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeTimBroadcastJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.J2735TimBroadcastJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeDriverAlertJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1DecoderInput" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1DecoderOutput" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1EncoderInput" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1EncoderOutput" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.SDWDepositorInput" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeTIMCertExpirationTimeJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedMessageJson" --zookeeper localhost:2181 --replication-factor 1 --partitions 1

# BSM
bin/kafka-topics.sh --create --topic "topic.OdeBsmPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeBsmJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.FilteredOdeBsmJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedBSMJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# TIM
bin/kafka-topics.sh --create --topic "topic.OdeTimJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeTimBroadcastJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.J2735TimBroadcastJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeTIMCertExpirationTimeJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedTIMJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# PSM
bin/kafka-topics.sh --create --topic "topic.OdePsmPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdePsmJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedPSMJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# SSM
bin/kafka-topics.sh --create --topic "topic.OdeSsmPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSsmJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedSSMJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# SRM
bin/kafka-topics.sh --create --topic "topic.OdeSrmPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSrmJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedSRMJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# MAP
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedMAPJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeMapTxPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeMapJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# SPaT
bin/kafka-topics.sh --create --topic "topic.OdeSpatTxPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSpatPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSpatJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.FilteredOdeSpatJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSpatRxJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeSpatRxPojo" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# ASN1
bin/kafka-topics.sh --create --topic "topic.Asn1DecoderInput" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1DecoderOutput" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1EncoderInput" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.Asn1EncoderOutput" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1

# ETC
bin/kafka-topics.sh --create --topic "topic.SDWDepositorInput" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeRawEncodedMessageJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
bin/kafka-topics.sh --create --topic "topic.OdeDriverAlertJson" --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1
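The commands above target the broker directly with --bootstrap-server rather than ZooKeeper, which matches the Kafka 3.6 CLI installed in the dev container. As a sanity check after post-create.sh finishes, the same tool can list everything that was created (a minimal sketch, assuming the ~/kafka working directory the script runs from):

# list all topics to confirm the create commands above succeeded
bin/kafka-topics.sh --list --bootstrap-server localhost:9092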
10 changes: 5 additions & 5 deletions .github/workflows/ci.yml
@@ -7,20 +7,20 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v2
uses: docker/setup-buildx-action@v3
- name: Build
uses: docker/build-push-action@v3
uses: docker/build-push-action@v5

sonar:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
submodules: recursive
- name: Set up JDK
uses: actions/setup-java@v3
uses: actions/setup-java@v4
with:
java-version: "11"
java-version: "21"
distribution: "temurin"
- name: Run Sonar
env:
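To reproduce these jobs locally, roughly equivalent commands are sketched below (the image tag and Maven goals are assumptions rather than values taken from the workflow, and the Sonar analysis itself additionally requires project credentials):

# approximate the docker build job (tag is illustrative)
docker buildx build -t jpo-ode:ci-local .
# approximate the sonar job's compile-and-test step with a local JDK 21 and Maven
mvn -B clean verify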
4 changes: 2 additions & 2 deletions Dockerfile
@@ -1,4 +1,4 @@
FROM maven:3.8.1-openjdk-11 as builder
FROM maven:3.8-eclipse-temurin-21-alpine as builder
MAINTAINER 583114@bah.com

WORKDIR /home
@@ -16,7 +16,7 @@ COPY ./jpo-ode-svcs/src ./jpo-ode-svcs/src

RUN mvn clean package -DskipTests

FROM eclipse-temurin:11-jre-alpine
FROM eclipse-temurin:21-jre-alpine

WORKDIR /home

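To try the updated two-stage image (JDK 21 Maven builder, JRE 21 Alpine runtime) locally, something along these lines should work (a sketch only; the tag and published port are assumptions, not values from the repository's compose files):

# build the image from the repository root and run it, exposing the ODE web port
docker build -t jpo-ode-svcs:local .
docker run --rm -p 8080:8080 jpo-ode-svcs:local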
3 changes: 2 additions & 1 deletion README.md
@@ -65,7 +65,7 @@ Once the ODE is deployed and running locally, you may access the ODE's demonstra
- [WYDOT Log Records](data/wydotLogRecords.h)
3. Press `Upload` button to upload the file to ODE.

Upload records within the files must be embedding BSM and/or TIM messages wrapped in J2735 MessageFrame and ASN.1 UPER encoded, wrapped in IEEE 1609.2 envelope and ASN.1 COER encoded binary format. Please review the files in the [data](data) folder for samples of each supported type. By uploading a valid data file, you will be able to observe the decoded messages contained within the file appear in the web UI page while connected to the WebSocket interface.
Records within the uploaded files may embed BSM, MAP, and/or TIM messages wrapped in a J2735 MessageFrame and ASN.1 UPER encoded, then wrapped in an IEEE 1609.2 envelope and ASN.1 COER encoded in binary format. Log processing of files containing messages with WSMP headers inside the ASN.1 UPER encoded payload is supported, but the header is removed before processing. Please review the files in the [data](data) folder for samples of each supported type. By uploading a valid data file, you will see the decoded messages contained within the file appear in the web UI page while connected to the WebSocket interface.

Another way data can be uploaded to the ODE is by copying the file to the location specified by the `ode.uploadLocationRoot/ode.uploadLocationObuLog` property. If not specified, the default location is the `uploads/bsmlog` sub-directory under the directory from which the ODE is launched.
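A minimal way to exercise that path from the project root is shown below (the file name is illustrative; any valid OBU log sample from the data folder works):

# drop a sample OBU log into the watched upload directory
cp data/bsmLogDuringEvent.gz uploads/bsmlog/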

@@ -90,6 +90,7 @@ Supported message types:
- SPaT
- SRM
- SSM
- PSM

1. Navigate to the [UDP sender Python scripts](<./scripts/tests/>) in the project.
2. Ensure the environment variable "DOCKER_HOST_IP" has been set in the shell that will run the script. It must be set to the same IP that the ODE deployments are using (see the sketch after these steps).
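A minimal sketch of the setup described in the steps above (the host IP and script name are illustrative; pick the sender script that matches your message type):

# make the host IP visible to the sender script, then transmit a test message
export DOCKER_HOST_IP=192.168.1.10
python3 scripts/tests/udpsender_tim.py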
