Sync develop to master #258

Merged: 51 commits merged into master from develop, Sep 11, 2024

Commits
3c6ab67
initial queue implementation cc bridge
abey-yoseph Mar 31, 2023
62b3ee0
add some comments
abey-yoseph Mar 31, 2023
1bf28d8
fix unsub logic (#126)
adev4a Apr 4, 2023
9a25264
init
dan-du-car Apr 5, 2023
732c2a9
Merge pull request #128 from usdot-fhwa-stol/fix_ui_confirm_topics
adev4a Apr 5, 2023
6a1c0af
cc bridge simplify logic
abey-yoseph Apr 5, 2023
a6bfeae
Cloud bridge investigation (#131)
abey-yoseph Apr 5, 2023
3181cb0
back to queue, add in file listener
abey-yoseph Apr 6, 2023
df63b3c
merge conflicts resolve
abey-yoseph Apr 7, 2023
12b6ad4
Merge pull request #132 from usdot-fhwa-stol/cloud_bridge_investigation
adev4a Apr 7, 2023
19d192e
Fix unit status and UI notification and event live status (#129)
dan-du-car Apr 7, 2023
d6d46a2
Remove credentials (#134)
dan-du-car Apr 10, 2023
8dcc64d
updating log levels, docker compose log size parameter
abey-yoseph Apr 17, 2023
ee3cbb4
revert local changes
abey-yoseph Apr 17, 2023
5016c14
init
dan-du-car Apr 20, 2023
c6a113a
remove hardcoded values
dan-du-car Apr 21, 2023
5e21503
Data type conflict on message fields with intermittent alphanumeric v…
adev4a Apr 25, 2023
8be591a
Verification testing small fixes (#139)
adev4a Apr 25, 2023
952bed3
add analysis scripts
adev4a May 1, 2023
6834657
messaging server log size parameter
abey-yoseph May 2, 2023
c55f61e
messaging server log size parameter (#143)
adev4a May 2, 2023
1880188
init
dan-du-car May 2, 2023
a71d39b
add nan
dan-du-car May 2, 2023
52e3338
Data type float into string field type conflicting (#144)
adev4a May 2, 2023
4c2e3d7
Fix json key value convertor logic (#146)
adev4a May 3, 2023
cac3958
Fix/drop nan value fields (#148)
adev4a May 3, 2023
b8405a3
clean up readme.md
SaikrishnaBairamoni May 5, 2023
1db4ec5
Merge release/k900 for 4.4.0 release (#149)
codygarver May 5, 2023
8bb8eae
update image names to release candidate (#234)
adev4a Jul 16, 2024
53dabb3
Fix/local deployment messaging server (#237)
adev4a Jul 22, 2024
e1e9c3b
Filename title should use original filename (#238)
dan-du-car Jul 22, 2024
e4fb7ba
Filename should only include alpha characters and underscore (#239)
dan-du-car Jul 22, 2024
0be1f4e
Fix a type in filename validation logic (#240)
dan-du-car Jul 22, 2024
36e9d03
FIx issue with undefined filename (#241)
dan-du-car Jul 22, 2024
a09ef47
Fix/update env variables (#242)
adev4a Jul 25, 2024
5f60f3d
Fix infrequent topic request (#243)
adev4a Jul 25, 2024
3a699c3
Fix analysis script issues (#244)
dan-du-car Aug 15, 2024
f085d35
Fix analysis script to parse bridge logs in XIL environment (#245)
dan-du-car Aug 16, 2024
7256884
Fix analysis script issues (#246)
dan-du-car Aug 16, 2024
eecec73
add script for historical data processing log parser (#248)
adev4a Aug 23, 2024
0dc84cf
Resolve merge conflicts on neon (#252)
adev4a Aug 27, 2024
dc29f66
Fix merge conflicts 2 (#253)
adev4a Aug 28, 2024
ad4d5f8
resolve merge conflicts with master
adev4a Aug 28, 2024
e6178d0
Fix neon merge conflicts (#257)
adev4a Aug 28, 2024
ec459ce
Merge release/neon (4.7.0) branch into master (#249)
SaikrishnaBairamoni Aug 28, 2024
f0d6376
update images in docker compose to release
adev4a Aug 28, 2024
9964ed5
Hotfix neon 4.7.1 (#259)
SaikrishnaBairamoni Aug 28, 2024
77d4697
update docker compose files to point 4.7.2
SaikrishnaBairamoni Aug 28, 2024
dc52391
update docker compose files to point 4.7.2 (#260)
SaikrishnaBairamoni Aug 28, 2024
301c9ed
update docker compose to point develop
SaikrishnaBairamoni Aug 29, 2024
30f6b00
fix the docker compose file to point develop org
SaikrishnaBairamoni Aug 29, 2024
11 changes: 5 additions & 6 deletions README.md
@@ -3,30 +3,29 @@
|-----|-----|-----|
[![Docker build](https://github.com/usdot-fhwa-stol/cda-telematics/actions/workflows/docker.yml/badge.svg?branch=develop)](https://github.com/usdot-fhwa-stol/cda-telematics/actions/workflows/docker.yml)| [![Docker build](https://github.com/usdot-fhwa-stol/cda-telematics/actions/workflows/docker.yml/badge.svg?branch=master)](https://github.com/usdot-fhwa-stol/cda-telematics/actions/workflows/docker.yml)| [![Quality Gate Status](https://sonarcloud.io/api/project_badges/measure?project=usdot-fhwa-stol_cda-telematics&metric=alert_status)](https://sonarcloud.io/summary/new_code?id=usdot-fhwa-stol_cda-telematics)


# CDA-Telematics
This project will create an open-source Module that can be installed on any vehicle (e.g. a CARMA Platform and/or Messenger vehicle, an L0 or L1 production vehicle, etc.) that will collect data about the vehicle and wirelessly send it out in real time for data analysis. The same Module, with any modifications, if necessary, will also be compatible with CARMA Streets and CARMA Cloud. On the receiving end of this data, a user will have a Data Processing & Visualization Tool available to visualize and/or plot the data that was sent using the Module(s). This Module can be thought of as a Fleet Management tool with extra capabilities to support CDA research and education.

## Architecture Diagram
[Detailed Design](https://usdot-carma.atlassian.net/wiki/spaces/WFD2/pages/2230321179/Detailed+System+Design)

![architecture](https://user-images.githubusercontent.com/34483068/171265484-67177ebb-69f7-4286-9602-016043079958.png)

## Release Notes
The current version of CDA-Telematics tool and release history of the CARMA software platform: [CARMA Release Notes](<docs/Release_notes.md>)

## Documentation
Documentation of the setup, operation, and design of the CDA Telematics can be found on the project [Confluence](https://usdot-carma.atlassian.net/wiki/spaces/WFD2/overview) pages.


## Contribution
Welcome to the CDA Telematics contributing guide. Please read this guide to learn about our development process, how to propose pull requests and improvements, and how to build and test your changes to this project. [CDA Telematics Contributing Guide](Contributing.md)

## Code of Conduct
Please read our [CDA Telematics Code of Conduct](Code_of_Conduct.md) which outlines our expectations for participants within the developer community, as well as steps to reporting unacceptable behavior. We are committed to providing a welcoming and inspiring community for all and expect our code of conduct to be honored. Anyone who violates this code of conduct may be banned from the community.

## Attribution
The development team would like to acknowledge the people who have made direct contributions to the design and code in this repository. [CDA Telematics Attribution](ATTRIBUTION.md)

## License
By contributing to the Federal Highway Administration (FHWA) CDA Telematics repository, you agree that your contributions will be licensed under its Apache License 2.0 license. [CDA Telematics License](<docs/License.md>)
2 changes: 1 addition & 1 deletion telematic_system/docker-compose.cloud.servers.yml
@@ -27,7 +27,7 @@ services:
    command: bash -c 'wait-for-it localhost:4222 && java -jar /telematic_cloud_messaging/app.jar'
    env_file:
      - .env

  rosbag2_processing_service:
    build:
      context: ./telematic_historical_data_processing
2 changes: 1 addition & 1 deletion telematic_system/docker-compose.local.yml
@@ -135,7 +135,7 @@ services:
    build:
      context: "./telematic_cloud_messaging"
    container_name: messaging_server
-   image: usdotfhwastoldev/telematic_local_messaging:develop
+   image: usdotfhwastoldev/telematic_cloud_messaging:develop
    depends_on:
      - nats
      - mysqldb
8 changes: 4 additions & 4 deletions telematic_system/docker-compose.webapp.yml
@@ -5,7 +5,7 @@ services:
      context: "./telematic_apps/web_app/server"
    restart: always
    container_name: web_server
-   image: usdotfhwastoldev/telematic_web_server:develop
+   image: usdotfhwastoldevdev/telematic_web_server:develop
    logging:
      options:
        max-size: "10m"
@@ -35,11 +35,11 @@
      - UPLOAD_HTTP_PORT=9011
      - UPLOAD_TIME_OUT=3600000 # Milliseconds
      - UPLOAD_MAX_FILE_SIZE=21474836480 # 20 GB
      - CONCURRENT_QUEUE_SIZE=5
      - PART_SIZE=10485760
      - NATS_SERVERS=<NATS_IP>:4222
      - FILE_PROCESSING_SUBJECT=ui.file.processing
      - FILE_EXTENSIONS=.mcap
    command: bash -c '/app/service.sh'
    volumes:
      - /opt/apache2/grafana_htpasswd:/opt/apache2/grafana_htpasswd
147 changes: 147 additions & 0 deletions telematic_system/scripts/log_analysis/README.md
@@ -0,0 +1,147 @@
# Prerequisites
- Preferred operating system: Ubuntu 20.04 or newer
- Python environment setup
1. Install Python
```
sudo apt update
sudo apt install python3
```
2. Check the Python version
```
python3 --version
```
The recommended version is `3.10`.
3. Create a virtual environment. Navigate to the `cda-telematics/telematic_system/scripts/log_analysis` directory, and run the command below:
```
python3 -m venv .venv
```
4. Activate the virtual environment.
```
source .venv/bin/activate
```
Note: run this command to activate the virtual environment every time you open a new terminal.
- Install dependencies:
- Install debian packages
```
sudo apt install libcairo2-dev libxt-dev libgirepository1.0-dev
```
- Install python packages
```
pip install -r requirements.txt
```
- Clone repos:
- Clone the cda-telematics GitHub repo
```
git clone https://github.com/usdot-fhwa-stol/cda-telematics.git
cd cda-telematics
```
- Download `log_timesheet.csv`
Most of the Python analysis scripts refer to `log_timesheet.csv` for the test runs and their durations. Since `log_timesheet.csv` is generated during the verification/validation testing, make sure to download it into this `log_analysis` folder before executing any of the Python scripts.
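For illustration only, a hypothetical `log_timesheet.csv` layout is sketched below; the actual column names and values come from the testing and may differ:
```
Test_Case,Run,Start_Time,End_Time
T20,R6,2023-04-25 14:01:05,2023-04-25 14:09:12
T20,R7,2023-04-25 14:15:30,2023-04-25 14:24:45
```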


# Process V2xHub bridge log
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download v2xhub logs to the current folder.
3. Run the command below to generate data publishing metrics.
```
python3 parse_v2xhub_telematic_plugin_logs.py --log_file_path <input-file-name>

e.g:
python3 parse_v2xhub_telematic_plugin_logs.py --log_file_path T20_R6-13_V2XHub.log
```
It will generate the parsed bridge logs as CSV files.

# Process Streets bridge log
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download streets bridge logs to the current folder.
3. Run the command below to generate data publishing metrics.
```
python3 parse_streets_bridge_logs.py <path-to-log-file>
```
It will generate the parsed bridge logs as CSV files.

# Process Cloud bridge log
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download cloud bridge logs to the current folder.
3. Run the command below to generate data publishing metrics.
```
python3 parse_cloud_bridge_logs.py <path-to-log-file>

e.g:
python3 parse_cloud_bridge_logs.py T20_R6-9_carma_cloud.log
python3 parse_cloud_bridge_logs.py T20_R10-13_carma_cloud.log
```
It will generate the parsed bridge logs as CSV files.

# Process Vehicle bridge log
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download vehicle bridge logs to the current folder.
3. Run the command below to generate data publishing metrics.
```
python3 parse_vehicle_bridge_logs.py <path-to-log-file>

e.g:
python3 parse_vehicle_bridge_logs.py T20_R6_R13_fusion/T20_R6_fusion.log
```
It will generate the parsed bridge logs as CSV files.

# Process Messaging Server log
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download messaging server logs to the current folder.
3. Run the command below to generate data publishing metrics.
```
python3 parse_messaging_server_logs.py <path-to-log-file>

e.g:
python3 parse_messaging_server_logs.py T20_R6-13_messaging_server.log
```
It will generate the parsed messaging server delay and message drop logs as CSV files.
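Judging from `get_message_drop.py` (further below), the messaging server CSVs are expected to carry at least `Unit Id`, `Topic`, `Message Time`, and `Log_Timestamp(s)` columns, and the bridge CSVs at least `Unit Id`, `Topic`, and `Payload Timestamp`.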

# Metric analysis
## Latency
1. Create a folder with the test case name in the current `log_analysis` folder.
For example, test case 20:
```
mkdir T20
```
2. Copy all the generated `T20_*_messaging_server_*_delay_parsed.csv` files to this new folder `T20`.
3. Run the latency plot script to generate plots for those CSV files with delay metrics in folder `T20`.
```
python3 latencyPlotter.py <folder-name or test case name>

e.g:
python3 latencyPlotter.py T20
```
The generated plots are saved into the `output` folder.
## Message loss
1. Create a folder named `<test case name>_message_drop` in the current `log_analysis` folder.
For example, test case 20:
```
mkdir T20_message_drop
```
2. Copy all generated `<test case name>_*_messaging_server_*_message_drop_parsed.csv` files to this new folder `<test case name>_message_drop`.
3. Copy all generated bridge CSV files into the same folder.
4. Run the message drop analysis script to analyze all files in the `<test case name>_message_drop` folder.
```
python3 get_message_drop.py <folder-name or test case name>_message_drop

e.g:
python3 get_message_drop.py T20_message_drop
```
The generated result is similar to the example below:
<br>
![Message_loss_result](https://github.com/user-attachments/assets/15fefacb-e929-4340-a0e3-6d7f6441ba8e)
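Here "Percentage of messages received" is computed as (1 - missing count / total count) * 100; for example, 5 missing messages out of 1,000 total gives 99.5%.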

## Rosbag Processing time
1. Navigate to `cda-telematics/telematic_system/scripts/log_analysis` directory
2. Download historical data processing service logs to the current folder.
3. Run the command below to generate processing time metrics.
```
python3 parse_processing_service_logs.py <path-to-log-file>

e.g:
python3 parse_processing_service_logs.py T19_R1_R5_rosbag2.log
```
It will print the time required to process each rosbag `.mcap` file and the average time across all the files in the log.
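Taken together, a typical analysis pass for a single test case might look like the sketch below; the log file names are illustrative, and the parse scripts should be matched to your units:
```
# Parse the raw logs into CSV files
python3 parse_messaging_server_logs.py T20_R6-13_messaging_server.log
python3 parse_vehicle_bridge_logs.py T20_R6_fusion.log

# Latency metrics
mkdir T20
cp T20_*_messaging_server_*_delay_parsed.csv T20/
python3 latencyPlotter.py T20

# Message drop metrics
mkdir T20_message_drop
cp T20_*_messaging_server_*_message_drop_parsed.csv T20_message_drop/
cp <bridge csv files> T20_message_drop/
python3 get_message_drop.py T20_message_drop
```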
74 changes: 23 additions & 51 deletions telematic_system/scripts/log_analysis/get_message_drop.py
@@ -9,17 +9,17 @@

import matplotlib.dates as mdates
import matplotlib.pyplot as plt
+import os
warnings.filterwarnings("ignore")

'''
This script combines bridge logs with the messaging server logs to give the number of dropped messages from each unit.

Input: The script looks within the argument directory for csv files from Messaging Server, Vehicle Bridge, Streets Bridge and Cloud Bridge logs,
which are parsed log output from the bridges, to calculate the number of dropped messages from each unit.

Required Input File Format:
The csv files to be read currently need to follow a specific format.
The messaging server parsed csv needs to start with the word "Messaging" separated by underscores
Streets bridge parsed csv file name needs to start with the word Streets separated by underscores(_)
Vehicle bridge parsed csv file name needs to start with the word Vehicle or BlueLexus or Fusion separated by underscores(_)
@@ -32,32 +32,31 @@ def combineFiles(log_dir):
    path_obj = Path(log_dir)
    print(log_dir)
    filenames = [ f.name for f in path_obj.glob('*.csv')]

    bridge_csv_exist = False
-   bridge_csv_regex = r'.*(Streets|Vehicle|BlueLexus|Fusion|V2xHub|Cloud).*'
+   bridge_csv_regex = r'.*(Streets|Vehicle|BlueLexus|Fusion|V2xHub|Cloud|Ros2).*'
    bridges_csv = []

    messaging_server_csv_exist = False
-   messaging_server_csv = ""
+   messaging_server_csv = []

    for filename in filenames:
-       if "Messaging" in filename:
+       if "messaging" in filename.lower():
            messaging_server_csv_exist = True
-           messaging_server_csv = log_dir + "/" + filename
+           messaging_server_csv.append(log_dir + "/" + filename)

        matched = re.match(bridge_csv_regex, filename, re.IGNORECASE)
        if matched:
            bridges_csv.append(log_dir + "/" + filename)
            bridge_csv_exist = True

    if not bridge_csv_exist:
        sys.exit("Did not find any Vehicle/Streets/Cloud/BlueLexus/Fusion/V2xHub bridge csv logs in directory: " +log_dir+ "")

    if not messaging_server_csv_exist:
        sys.exit("Did not find any Messaging server csv logs in directory: "+log_dir+ "")

-   messaging_server_df = pd.read_csv(messaging_server_csv)
+   messaging_server_df = pd.concat(map(pd.read_csv, messaging_server_csv), ignore_index=True)
    infrastructure_units = ['streets_id', 'cloud_id']

    ############# Load messaging server logs and get a list of dataframes for all unit ids
Expand All @@ -69,62 +68,35 @@ def combineFiles(log_dir):
if key not in infrastructure_units:
value = value[~value['Message Time'].isnull()]
# value = value.drop('Metadata',axis =1)


#Get dataframes from bridge logs
bridge_dfs = dict()
for bridge_csv in bridges_csv:
bridge_df = pd.read_csv(bridge_csv)
bridge_dfs.update(dict(tuple(bridge_df.groupby('Unit Id'))))

print(bridge_dfs.keys())

bridge_df = pd.concat(map(pd.read_csv, bridges_csv), ignore_index=True)
bridge_dfs = dict(tuple(bridge_df.groupby('Unit Id')))

# Create combined dataframes from

# Create combined dataframes from
for key in bridge_dfs:
if key in messaging_server_dfs:

bridge_df_combined = pd.merge(bridge_dfs[key], messaging_server_dfs[key], how='left', left_on=['Topic','Payload Timestamp'], right_on = ['Topic','Message Time'])
bridge_df_combined.to_csv(log_dir + key + "_combined.csv")
if not os.path.exists("output"):
os.mkdir("output")
bridge_df_combined.to_csv("output/"+log_dir+"_"+ key + "_combined.csv")

bridge_missing_message_count = bridge_df_combined['Log_Timestamp(s)'].isnull().sum()
bridge_total_message_count = len(bridge_df_combined['Payload Timestamp'])
print("Message drop for unit: ", key)
print("\nMessage drop for unit: ", key)
print("Missing count: ", bridge_missing_message_count)
print("Total count: ", bridge_total_message_count)
print("Percentage of messages received",(1 - (bridge_missing_message_count/bridge_total_message_count))*100)


topics_with_empty_count = (bridge_df_combined['Message Time'].isnull().groupby([bridge_df_combined['Topic']]).sum().astype(int).reset_index(name='count'))
topics_with_empty_count = topics_with_empty_count.loc[~(topics_with_empty_count['count']==0)]

print("{} missed messages: ".format(key))
print(topics_with_empty_count)

# Plot vehicle data
bridge_df_combined = bridge_df_combined[bridge_df_combined['Message Time'].isnull()]
bridge_df_combined['Payload Timestamp'] = pd.to_datetime(bridge_df_combined['Payload Timestamp'], infer_datetime_format=True)
bridge_df_combined['Message Time'] = pd.to_datetime(bridge_df_combined['Message Time'], infer_datetime_format=True)


ax1 = plt.plot(bridge_df_combined['Topic'], bridge_df_combined['Payload Timestamp'], '|')

#Plot start and end lines
start_time = pd.to_datetime(messaging_server_dfs[key]['Log_Timestamp(s)'].iloc[0])
end_time = pd.to_datetime(messaging_server_dfs[key]['Log_Timestamp(s)'].iloc[-1])

plt.axhline(y = start_time, color = 'r', linestyle = '-', label = 'Test Start Time')
plt.axhline(y = end_time, color = 'r', linestyle = '-', label = 'Test End Time')

plt.title('{} : Topics against time of dropped message'.format(key))
plt.xlabel('Topics with dropped messages hours:mins:seconds')
plt.ylabel('Time of message drop')
xfmt = mdates.DateFormatter('%H:%M:%S')
plt.gcf().autofmt_xdate()
plt.show()
# plt.savefig('{}_Message_drop.png'.format(key))






Expand Down
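For readers skimming the diff, here is a minimal, self-contained sketch of the drop-detection logic this script is built around: a left merge of bridge rows against messaging server rows, counting the rows with no match. The toy frames below are illustrative; the real data comes from the parsed CSVs.
```
import pandas as pd

# Toy stand-ins for one unit's parsed bridge and messaging server CSVs.
bridge = pd.DataFrame({
    "Topic": ["/gps", "/gps", "/speed"],
    "Payload Timestamp": [1.0, 2.0, 1.5],
})
server = pd.DataFrame({
    "Topic": ["/gps", "/speed"],
    "Message Time": [1.0, 1.5],
})

# A left merge keeps every bridge message; a row with no matching
# messaging server entry means that message was dropped in transit.
combined = bridge.merge(
    server,
    how="left",
    left_on=["Topic", "Payload Timestamp"],
    right_on=["Topic", "Message Time"],
)
missing = combined["Message Time"].isnull().sum()
total = len(combined)
print(f"Percentage of messages received: {(1 - missing / total) * 100:.1f}")
```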