Autorace

Autonomous RC-Car racing competition in HKUST

Autorace provides hardware and example code to achieve vision-based autonomous racing on RC-Cars. It is developed by RAM-LAB to support the 1st autonomous RC-Car racing competition at the Hong Kong University of Science and Technology (HKUST). The competition date is Feb 26, 2021 (tentative, due to COVID-19).

Event Collaborators: School of Engineering, Robotics Institute (RI), Robotics and Multiperception Lab (RAM-LAB), Intelligent Autonomous Driving Center, Entrepreneurship Center

Keywords: autonomous racing, visual navigation, artificial intelligence, deep learning.

Maintainer: Peide Cai <pcaiaa@connect.ust.hk>   Supervisor: Prof. Ming Liu <eelium@ust.hk>

💻 Official Website

📹 Demo & Workshops

🏁 Rules and Regulations

If you run into any problems, feel free to create an issue to let me and the other participants know, so we can solve it together.

If you like the project, give it a star ⭐. It means a lot to the people maintaining it 🧙


Features

  • Coding language: Python3

  • Deep learning framework: PyTorch 1.6

  • On-board computer: Jetson Nano B1

  • Complete pipeline of data collection, model training and testing

  • Easy to use and DIY

  • What can you achieve 🤔?

    1. Build a small but powerful RC-Car that can drive itself.
    2. Record driving data (camera images, control actions) by teleoperating the RC-Car.
    3. Train different AI autopilots to autonomously drive your car on the track as fast as possible.
    4. Autonomous collision avoidance around different obstacles.

Build an RC-Car

The RC-Car is named JetRacer, a high-speed AI racing robot powered by the Jetson Nano.

Website: 中文 | EN

JetRacer WiKi: 中文 | EN

Some parts of the vendor-provided software user guides in the above wiki are out of date (you may encounter errors during testing). Please follow this repository for system and software installation in the following sections, and treat the above wiki as a reference only.

Assembly Manual: 中文 | EN

Slides and Assembly Video: Link

Back to Top

System Installation

1. Jetson Nano on the RC-Car

This is for data collection and model deployment, covered in our 2nd workshop with slides and videos.

NVIDIA Jetson Nano lets you bring incredible new capabilities to millions of small, power-efficient AI systems. It is also the perfect tool to start learning about AI and robotics in real-world settings, with ready-to-try projects and the support of an active and passionate developer community.

To use this mainboard, we need to install a software development kit on it, which is named JetPack 👇

1.1 Use JetPack to Install a Base Ubuntu System

1.1.1 Features

Based on the introduction in https://developer.nvidia.com/jetpack-sdk-44-archive

JetPack SDK is the most comprehensive solution for building AI applications. It includes the latest Linux Driver Package (L4T) with Linux operating system named Ubuntu (version: 18.04) and CUDA-X accelerated libraries and APIs for Deep Learning, Computer Vision, Accelerated Computing and Multimedia. It also includes samples, documentation, and developer tools for both host computer and developer kit.

In this project we use JetPack 4.4.

1.1.2 Installation

JetPack installation is quite simple: flash the image to a microSD card -> connect the Nano to a display and boot the system (Ubuntu 18.04) -> finish initialization. Step-by-step instructions are on pages 4-23 of these slides from workshop#2. You will then enter the Ubuntu 18.04 system, which looks like the following:

1.1.3 Caution

Remember: do not upgrade the system after installation, even if the system prompts you to. Some of our library dependencies rely on the current JetPack version, which is 4.4.

1.2 Use JetCard to Quickly Configure the System

After installing the Ubuntu 18.04 system with JetPack 4.4, we provide a system configuration called JetCard to make it easy to get started with AI on the Jetson Nano. Simply execute the provided script and all external library dependencies will be installed automatically.

1.2.1 Features

JetCard comes pre-loaded with:

  • A Jupyter Lab server that starts on boot for easy web programming
  • A script to display the Jetson Nano's IP address, CPU & GPU usage, battery life, charging status, etc.
  • The popular deep learning framework PyTorch (version: 1.6)
  • Donkeycar, a Python3 library to drive RC-Cars
  • Other development tools such as Oh My Zsh, virtualenv, torch2trt

After configuring your system using JetCard, you can get started prototyping AI projects from your web browser in Python3.
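To verify the configuration, you can run a quick sanity check in a Jupyter cell or a Python3 shell (a minimal check, assuming the JetCard installation above finished without errors):

import torch

print(torch.__version__)              # expect 1.6.x on JetPack 4.4
print(torch.cuda.is_available())      # expect True on the Jetson Nano
print(torch.cuda.get_device_name(0))  # expect a Tegra device name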

1.2.2 Installation

After you install JetPack 4.4 on the SD card, boot the system (Ubuntu 18.04), open a terminal by pressing Ctrl+Alt+T, and then do the following to use JetCard to quickly configure your system:

$ cd ~
$ git clone https://github.com/caipeide/jetcard
$ cd jetcard
$ sh ./install.sh <password>

The whole installation takes about 40 min. Afterwards the script will ask you to reboot. Take a look at the IP address shown on the display (10.79.157.13 in this case).

Now you can disconnect the HDMI cable, keyboard and mouse from the Jetson Nano and start remote development: open a browser on your own laptop and enter 10.79.157.13:8888 to connect to and develop on the car. (You will be asked for the <password> you set earlier to enter JupyterLab.)

Finally, we clone this project to the RC-Car:

$ cd ~/projects/donkeycar
$ pip install -e .
$ cd ~
$ git clone https://github.com/caipeide/autorace
1.2.3 Tips

To ensure that the Jetson Nano doesn't draw more current than the battery pack can supply, place the Jetson Nano in 5W mode by calling the following command.

  • You need to launch a new terminal and enter the following command to select the 5W power mode:
$ sudo nvpmodel -m1
  • Check that the mode is correct (or take a look at the car's display):
$ sudo nvpmodel -q

m0: 10W power mode (MAXN), m1: 5W power mode

Back to Top

2. Your Host PC

A host PC is needed for model training.

2.1 Requirement

  • Nvidia GPU (with at least 6 GB of memory), e.g., RTX 2060, GTX 1080.
  • Ubuntu 18.04 should be installed.

Participants who do not have an NVIDIA graphics card (GPU) in their computer can apply to use our server to train their models and skip section 2.2 below. In that case, the laptop is only used for remote connection to the RC-Car, which works on any system, including Windows, macOS and Ubuntu.

2.2 Graphics Driver Installation

The following instructions are based on 2 Ways to Install Nvidia Driver on Ubuntu 18.04 (GUI & Command Line). Here we simply use the graphical user interface (GUI) to install the Nvidia driver.

First, go to system settings > details and check which graphics card your computer is using. By default, the integrated graphics card (Intel HD Graphics) is used.

Then open the Software & Updates program from your application menu and click the Additional Drivers tab. You can see which driver is being used for the Nvidia card. If you cannot see nvidia-driver-xxx in the list, open a terminal and run sudo apt update, then re-open this window to refresh the list of available drivers.

As you can see, many driver versions are available for the GeForce RTX 2080 Ti card on our server. Here we use nvidia-driver-440, which works fine for our model training. The available drivers may differ for your particular Nvidia card. Click the Apply Changes button to install the driver.

After it is installed, reboot your computer for the change to take effect. Then go to system settings > details and you will see that Ubuntu is using the Nvidia graphics card.

You can open a terminal and run nvidia-smi to check the running status of your GPU:

2.3 System Configuration

We provide a script install_host.sh for you to quickly configure your host PC (or server account) with all necessary dependencies for model training.

2.3.1 Features

By executing the script, the following packages will be automatically installed:

  • Miniconda: A package manager that helps you find and install packages. A new conda environment named autorace will then be created automatically
  • OpenCV: An open source computer vision and machine learning software library
  • Matplotlib: A comprehensive library for creating static, animated, and interactive visualizations in Python
  • Other tools such as PyTorch 1.6 and Donkeycar

After configuring your system, you can start transmitting data between your host PC (or server account) and your RC-Car (via ssh), and use the collected dataset to train your own self-driving car (introduced in the next section, Train a Self-driving Car)

2.3.2 Installation
  1. If you are using your own PC, open a terminal and run the following three commands first (these packages are already installed on the server)
$ sudo apt update
$ sudo apt install openssh-server
$ sudo apt install git
  2. Then you can install the other dependencies.

If you are using the server, first connect to it with ssh in a new terminal on your PC (or on the RC-Car via JupyterLab). For example, on Ubuntu, open a new terminal (Ctrl+Alt+T) and run ssh -p <port_number> <server_account_name>@<server_ip_address>. Then do the following in the terminal for host configuration.

$ cd ~
$ git clone https://github.com/caipeide/autorace
$ cd autorace
$ sh ./install_host.sh
$ source ~/.bashrc
$ conda create -n autorace python=3.6 -y
$ conda activate autorace
$ sh ./install_host_continue.sh

Back to Top

Start Your Journey of Self-Driving

1. Calibration

1.1 Configuration Files

All of the car's settings are in the config.py and myconfig.py scripts. config.py stores the default values for all parameters, and you can adjust these settings in myconfig.py by uncommenting the related lines and changing their values. When the main program manage.py starts, it first reads config.py and then overrides the variables with the values you set in myconfig.py.

You can edit these files in two ways:

  • If you are in a terminal, you can use nano to edit files:
nano ~/autorace/myconfig.py
  • If you prefer a GUI, you can edit files directly in Jupyter Lab: just double-click a file, make changes, and press Ctrl+S to save.

Here, uncommenting means deleting the # symbol at the beginning of a line. In most editors you can toggle a line between commented and uncommented by pressing Ctrl+/.
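Conceptually, the layering works like the following minimal Python sketch (illustrative only; the actual loading code in manage.py is more involved):

# defaults come from config.py; any variable you uncomment in myconfig.py wins
import config
import myconfig

settings = {k: v for k, v in vars(config).items() if k.isupper()}
settings.update({k: v for k, v in vars(myconfig).items() if k.isupper()})
print(settings["DRIVE_LOOP_HZ"])  # a default unless overridden in myconfig.py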

1.2 Throttle Calibration

Make sure your car is off the ground to prevent a runaway situation.

  1. Turn on your car and the motor, and connect to the car via Jupyter Lab on your PC.
  2. Open a terminal and run donkey calibrate --channel 1 --bus=1
  3. Enter 370 when prompted for a PWM value. You should then hear your ESC beep, indicating that it is calibrated. If not, adjust this value slightly; otherwise the motor will not work.
  4. Enter 400 and you should see your car's wheels start to go forward.
  5. Keep trying different values until you have found a reasonable max speed, and remember this PWM value. Do not set the max speed too high; otherwise the front wheels are easily damaged when the car crashes into the fence. For repairs, refer to this.

Reverse on RC cars is a little tricky because the ESC must receive a reverse pulse, a zero pulse, then a reverse pulse to start going backwards. To calibrate a reverse PWM setting:

  1. Enter the reverse value, for example 330, then the zero-throttle value you found above, then the reverse value again.
  2. Enter values +/- 10 around the reverse value to find a reasonable reverse speed. Remember this reverse PWM value.

Enter these values in the myconfig.py script as THROTTLE_FORWARD_PWM, THROTTLE_STOPPED_PWM, and THROTTLE_REVERSE_PWM.
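For example, using the sample values from the steps above (your calibrated numbers will differ), the uncommented lines in myconfig.py would look like:

# myconfig.py -- throttle calibration (example values from the steps above)
THROTTLE_FORWARD_PWM = 400   # PWM for your chosen max forward speed
THROTTLE_STOPPED_PWM = 370   # PWM at which the ESC beeps (motor stopped)
THROTTLE_REVERSE_PWM = 330   # PWM for a reasonable reverse speed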

1.3 Steering Calibration

  1. Open a terminal and run donkey calibrate --channel 0 --bus=1
  2. Enter 320 and you should see the wheels on your car move slightly. If not, enter 400 or 300.
  3. Next, enter values +/- 10 from your starting value to find the PWM settings that make your car turn all the way left and all the way right. Remember these values.
  4. Enter these values in the myconfig.py script as STEERING_RIGHT_PWM and STEERING_LEFT_PWM. Note that the default values are STEERING_LEFT_PWM = 460, STEERING_RIGHT_PWM = 290, saved in config.py.

Note: You need to make sure that the value (STEERING_LEFT_PWM + STEERING_RIGHT_PWM)*0.5 makes the front wheels face straight ahead, so that the car drives in a straight line with steering = 0.
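Using the defaults from config.py as an example, the corresponding myconfig.py entries and the straight-ahead check look like:

# myconfig.py -- steering calibration (defaults shown; replace with your values)
STEERING_LEFT_PWM = 460    # PWM for full left
STEERING_RIGHT_PWM = 290   # PWM for full right

# the midpoint is the PWM sent when steering = 0; with these values it is
# 375.0, which should point the front wheels straight ahead
print((STEERING_LEFT_PWM + STEERING_RIGHT_PWM) * 0.5)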

1.4 Fine Tuning and Testing Your Calibration

Now that your car is roughly calibrated, you can try driving it to verify that it drives as expected. Here is how to fine-tune your car's calibration.

  1. Start your car by running python manage.py drive in this folder.
  2. Go to <car_ip_address>:8887 in a browser (tested on Chrome). The following is what you will see: camera video streams, adjustable control values and driving modes. You can press J or L on the keyboard to steer the car left or right, and use I and K to adjust the throttle.
  3. Now set the steering to zero and press I a few times to get the car to go forward. If it goes straight, that is good and you can move on to the next section; if not, adjust your values of STEERING_LEFT_PWM and STEERING_RIGHT_PWM until the car drives straight.

Note: A throttle value that is too small, e.g., 0.1, may not be enough to drive the car. You need to increase it a bit.

2. Data Collection

2.1 Driving with Web Controller

Only for basic testing. If you want to collect data in a more flexible way, use the joystick introduced in the next section, Driving with Physical Joystick Controller.

This controller provides a UI accessible at <car_ip_address>:8887. This module is loaded after you run python manage.py drive.

2.1.1 Features
  • Pilot mode - Choose this if the pilot should control the angle and/or throttle. It has three options:

    • local_angle will only let the pilot model control the angle of the car.
    • local_pilot will let the pilot model control both the angle and throttle.
    • user_mode, where you have full control over the car using keyboard shortcuts or the mouse.

       Notes

    • Switching to a pilot mode only takes effect if you load your neural network model when starting manage.py, which will be introduced in Model Testing.
    • Clicking with the mouse on the blue area on the right (with the text "Click/touch to use joystick") switches from a pilot mode back to user_mode. You can use this function to quickly stop your self-driving car.
  • Recording - Press record data to start recording images, steering angles and throttle values. By default, data is recorded automatically whenever the throttle is non-zero in user_mode. If the car is in a pilot mode, you can use this function to manually record some self-driving data. Note: do not use data from a pilot mode to train your network. Training details will be covered in Model Training.

  • Throttle mode - Option to set the throttle as constant. This is used in races if you have a pilot that will steer but doesn't control throttle.

  • Max throttle - Select the maximum throttle for user_mode.

2.1.2 Keyboard Shortcuts
  • R : toggle recording
  • I : increase throttle by 0.05
  • K : decrease throttle by 0.05
  • J : turn left
  • L : turn right

Note: The throttle and steering ranges are both [-1.0, 1.0]. Throttle values less than 0 indicate reversing.
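For intuition, this is roughly how a normalized throttle command is turned into a PWM signal using the calibration values from section 1 (a simplified sketch; Donkeycar's actual controller code differs in its details):

def throttle_to_pwm(t, fwd=400, stop=370, rev=330):
    """Map a throttle command t in [-1.0, 1.0] to a PWM value by
    interpolating between the calibrated stop/forward/reverse points."""
    t = max(-1.0, min(1.0, t))                 # clamp to the valid range
    if t >= 0:
        return int(stop + t * (fwd - stop))    # 0 -> stop, 1 -> full forward
    return int(stop + t * (stop - rev))        # -1 -> full reverse

print(throttle_to_pwm(0.5))   # 385, halfway between stop and full forward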

2.2 Driving with Physical Joystick Controller

2.2.1 Features
  • Recommended for data collection; much more flexible than the web controller for operating the RC-Car.
  • By default, no UI is published in this mode. However, you can set USE_FPV = True in myconfig.py to monitor the camera video streams. The published FPV images are accessible at <car_ip_address>:8890
2.2.2 Start the Program for Data Collection
  • Plug the USB receiver of the joystick controller into the Jetson Nano, then start the program (before that, make sure your car's motor is powered on):
$ cd ~/autorace
$ python manage.py drive --js

Optionally, if you want joystick use to be sticky and don't want to add --js each time, set USE_JOYSTICK_AS_DEFAULT = True in your myconfig.py

  • The joystick controls are shown below; they will also be printed on your screen when the program starts.
+------------------+--------------------------+
|     control      |          action          |
+------------------+--------------------------+
|     a_button     |       toggle_mode        |
|     b_button     | toggle_manual_recording  |
|     x_button     |   erase_last_N_records   |
|     y_button     |      emergency_stop      |
|  right_shoulder  |  increase_max_throttle   |
|  left_shoulder   |  decrease_max_throttle   |
|     options      | toggle_constant_throttle |
| left_stick_horz  |       set_steering       |
| right_stick_vert |       set_throttle       |
|  right_trigger   |    constant_rage_mode    |
|   left_trigger   |   constant_gentle_mode   |
+------------------+--------------------------+
  • Explanations of the operations:
    • left_stick_horz: Left analog stick - Move left and right to adjust steering
    • right_stick_vert: Right analog stick - Push forward to increase forward throttle, pull backward to increase reverse throttle. Release for zero throttle.
    • toggle_mode: Switches modes - "User, Local Angle, Local (angle and throttle)"
    • toggle_manual_recording: Toggles recording of all data, even when your car stops with zero throttle. This is disabled by default because a more suitable method, auto record on throttle, is enabled by default: whenever the throttle is non-zero and you are in user mode, driving data is recorded. The data is saved in the folder data/tub_xx_xx_xx/
    • erase_last_N_records: You don't want to use bad data, such as collisions with walls, to train your network. This function erases the data of the last 100 frames to keep your dataset clean.
    • increase_max_throttle: The max_throttle for joystick control is set to 0.5 by default (when the right analog stick is pushed fully forward). Press the right shoulder to increase max_throttle by PER_THROTTLE_STEP if you need more speed. PER_THROTTLE_STEP is 0.05 by default; you can change it in myconfig.py (see the snippet after this list).
    • decrease_max_throttle: Similar to the above. Press the left shoulder to decrease max_throttle by PER_THROTTLE_STEP.
    • toggle_constant_throttle: Toggles constant throttle, set to the max throttle.
    • constant_rage_mode: Starts a constant-throttle mode with a very large throttle, RAGE_THROTTLE = 0.75. This can be useful on long, straight tracks, where you can quickly accelerate your car. The throttle value can be adjusted in myconfig.py
    • constant_gentle_mode: Starts a constant-throttle mode with a low throttle, GENTLE_THROTTLE = 0.45. This can be useful when the RC-Car enters a sharp corner at high speed and you need to slow down quickly without stopping. This throttle can also be adjusted in myconfig.py
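The joystick-related options mentioned above live in myconfig.py; uncommented and set as described in this section, they look like this:

# myconfig.py -- joystick driving options (values as quoted in this section)
USE_JOYSTICK_AS_DEFAULT = True   # start manage.py without needing --js
USE_FPV = True                   # publish camera stream at <car_ip_address>:8890
PER_THROTTLE_STEP = 0.05         # step size for the shoulder buttons
RAGE_THROTTLE = 0.75             # constant throttle for right_trigger
GENTLE_THROTTLE = 0.45           # constant throttle for left_trigger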
2.2.3 Data Collection Procedure
  1. Place some obstacles on the track and practice driving around it a couple of times. Since you may not drive well at first, you can set AUTO_RECORD_ON_THROTTLE = False in myconfig.py to disable automatic recording.
  2. When you're confident you can drive 10 laps without mistakes, restart the python manage.py process to create a new tub session and set AUTO_RECORD_ON_THROTTLE = True. The joystick will then auto-record whenever the throttle is non-zero.
  3. If you crash or run off the track, release the throttle immediately to stop recording, then tap the X button to erase the last 5 seconds of records.
  4. After you've collected 10-20 laps of good data (5-20k images), you can stop your car with Ctrl+C in the terminal session for your car.
  5. The data you've collected is in the most recent tub folder under data/ (a quick way to inspect the newest tub is sketched below).
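To get a quick feel for how much data you have collected, a small helper like this can count the frames in the newest tub (a sketch assuming frames are stored as individual .jpg files inside the tub folder; list a tub to confirm the layout):

from pathlib import Path

tubs = sorted(Path("~/autorace/data").expanduser().glob("tub_*"),
              key=lambda p: p.stat().st_mtime)
latest = tubs[-1]                           # most recent tub session
n_frames = len(list(latest.glob("*.jpg")))  # assumes one .jpg per frame
print(f"{latest.name}: {n_frames} recorded frames")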

After you have trained a car that can drive alone on the track using the following procedure, you should return to this part and cooperate with another team to collect data on wheel-to-wheel driving (two cars running on the track). These data are crucial for your model to learn how to behave in the presence of another car, such as overtaking and slowing down to avoid collisions. Otherwise, it is likely to perform poorly in the main race.

2.2.4 Tips
  • To increase the robustness of your model, place different obstacles (colors, types) at random locations on the track during data collection.
  • DRIVE_LOOP_HZ, set in myconfig.py, is the maximum frequency at which the drive loop runs. The actual frequency may be lower if there are blocking parts, e.g., if your AI model is too complex to run quickly.
  • You can choose whether or not to add extra control noise in user_mode. If CONTROL_NOISE is set to True in myconfig.py (the default is False), random noise is added to the steering angle and throttle during tele-operation. This helps you collect more diverse data in which the car recovers from off-center and off-orientation mistakes, which can make your trained agent more "intelligent". Two scalars, THROTTLE_NOISE and ANGLE_NOISE, adjust the level of noise (a minimal sketch of this kind of noise injection is shown below). Note that with this module activated, the data collection process becomes more difficult.
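A minimal sketch of that kind of noise injection (illustrative; the repo's implementation and the default noise levels may differ):

import random

def add_control_noise(angle, throttle, angle_noise=0.1, throttle_noise=0.05):
    """Perturb tele-operation commands so the dataset contains recovery
    maneuvers; angle_noise/throttle_noise play the role of ANGLE_NOISE and
    THROTTLE_NOISE (the 0.1 / 0.05 levels here are made-up examples)."""
    angle += random.uniform(-angle_noise, angle_noise)
    throttle += random.uniform(-throttle_noise, throttle_noise)
    # keep both commands inside the valid [-1.0, 1.0] range
    return max(-1.0, min(1.0, angle)), max(-1.0, min(1.0, throttle))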

3. Model Training

3.1 Transfer Data from RC-Car to Host PC

Training a deep neural network on Jetson Nano can be painful (quite slow 🐌). Therefore, we will use more powerful PCs for faster training. The first step is to copy the collected data from your RC-Car to the host PC.

  1. If you are using your own computer for model training, open a new terminal on your host PC and use rsync to copy your car's data/ folder.
$ cd ~/autorace
$ rsync -rv --progress --partial <car_account_name>@<car_ip_address>:~/autorace/data ./
  2. If you use the server for model training, open a new terminal on the RC-Car and do the following instead.
$ cd ~/autorace
$ rsync -rv -e 'ssh -p <port_number>' --progress --partial ./data <server_account_name>@<server_ip_address>:~/autorace/

Now you can check the new data in the folder ~/autorace/data/ on your host PC:

$ ssh -p <port_number> <server_account_name>@<server_ip_address>  # if you use the server, connect to it via ssh first on your RC-Car
$ ls ~/autorace/data/

The copied data folders will be printed. The following is an example.

3.2 Start Training Models

In the same terminal you can now run the training script on the latest tub by passing the path to that tub as an argument. For example,

$ cd ~/autorace
$ python manage.py train --model models/resnet18.pth --type resnet18 --tub data/tub_1_20-12-12/,data/tub_2_20-12-12

The trained model will be saved as ~/autorace/models/resnet18.pth. Optionally, you can omit the tub argument, in which case all tubs in the default data/ folder will be used.

$ python manage.py train --model models/resnet18.pth --type resnet18

We provide three basic model types for training: linear, rnn and resnet18. Click these hyperlinks for more details. The following image shows the model architecture from Nvidia's 2016 self-driving car work, which uses a series of convolutional layers followed by fully connected layers to learn driving behaviors.

You can change these model architectures in ai_drive_models.py, where linear -> LinearModel, rnn -> RNNModel and resnet18 -> LinearResModel. Note that the rnn model runs slowly on the RC-Car, so it may not be suitable for racing.
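For orientation, here is a stripped-down PyTorch sketch of such a convolution-plus-fully-connected pilot (layer sizes are illustrative; see ai_drive_models.py for the real definitions):

import torch
import torch.nn as nn

class TinyPilot(nn.Module):
    """Nvidia-style pilot: conv feature extractor followed by fully
    connected layers that regress steering angle and throttle."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(48 * 12 * 17, 100), nn.ReLU(),  # sized for 120x160 input
            nn.Linear(100, 2),                        # outputs: [steering, throttle]
        )

    def forward(self, x):
        return self.net(x)

out = TinyPilot()(torch.zeros(1, 3, 120, 160))  # dummy RGB camera frame
print(out.shape)                                # torch.Size([1, 2])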

Other training parameters, such as batch size and learning rate, can be configured in myconfig.py. Their default values should work well in most cases.
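For example, the training-related entries in myconfig.py might look like the following (the parameter names and values here are illustrative; check myconfig.py itself for the exact names the repo uses):

# myconfig.py -- training options (illustrative names; verify in the file)
# BATCH_SIZE = 128         # minibatch size for the data loader
# LEARNING_RATE = 1e-3     # optimizer step size
# TRAIN_TEST_SPLIT = 0.8   # fraction of data used for training vs. validation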

During training you can run nvidia-smi in a terminal to check GPU power consumption and memory usage. The following shows a training demo.

After the training is finished, you can view how the training loss and validation loss decrease in models/loss_plot_resnet18.png.

3.3 Copy Model Back to RC-Car

In the previous step we trained a model on the collected data. Now it is time to move the model back to the RC-Car, so we can test whether it can drive the car by itself.

  1. If you use your own computer for model training, open a new terminal on your host PC:
$ cd ~/autorace
$ rsync -rv --progress --partial ./models/resnet18.pth <car_account_name>@<car_ip_address>:~/autorace/models/
  2. If you use the server for model training, open a new terminal on the RC-Car and do the following instead.
$ cd ~/autorace
$ rsync -rv -e 'ssh -p <port_number>' --progress --partial <server_account_name>@<server_ip_address>:~/autorace/models/resnet18.pth ./models/

3.4 Accelerate your Model

The resnet18 model trained above runs at about 150 ms/frame (6.7 Hz) on the car, which can cause problems if your car drives fast (it cannot make decisions quickly enough to take turns -> collision). Therefore, we accelerate the model as follows on the RC-Car:

$ cd ~/autorace
$ python accel_model.py --model models/resnet18.pth --type resnet18

The process takes about 1 ~ 2 min, and the accelerated model will be saved as models/resnet18_trt.pth. The inference speed of this model is about 50 ms/frame (20 Hz).

Notes:

  1. You may have problems if you try to accelerate an rnn model, which seems to be a limitation of the torch2trt library we use. Unfortunately, I have no solution for that.
  2. Add --half to the above command if you want to reduce the memory usage of the neural network. The accelerated model will then use FP16 rather than FP32, allowing larger networks to be deployed (see the conversion sketch below).
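Under the hood, accel_model.py builds on torch2trt; the core conversion looks roughly like this (a sketch, not the script's exact code; the 120x160 input shape is an assumption about the camera resolution):

import torch
import torchvision
from torch2trt import torch2trt

# stand-in for the trained pilot; accel_model.py loads models/resnet18.pth
model = torchvision.models.resnet18().cuda().eval()
x = torch.ones(1, 3, 120, 160).cuda()   # example input (assumed camera shape)
model_trt = torch2trt(model, [x])       # build the TensorRT-accelerated model
# fp16_mode=True corresponds to the --half flag (FP16 instead of FP32):
# model_trt = torch2trt(model, [x], fp16_mode=True)
torch.save(model_trt.state_dict(), "models/resnet18_trt.pth")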

4. Model Testing

Place the car on the track and power on its motor so that it is ready to drive.

$ python manage.py drive --model models/resnet18_trt.pth --trt --type resnet18

When all modules are ready (it takes 1 ~ 2 min to warm up), the program will prompt you to press ENTER to start. Then the car should start to drive on its own. Congratulations!

Notes:

  1. Add --half to the above command if you used it during model acceleration.

Back to Top

Notes

  1. The IP address of the car may change automatically (it will change at most once after each boot). If your JupyterLab loses its connection, check whether the car's IP has changed; if so, use the new IP and refresh the browser.
  2. If you collect lots of data, for example more than 1000 images in a tub under the data/ folder, DO NOT try to open the tub folder and view the images through JupyterLab: loading the files takes a long time and the GUI may freeze. Transfer the data to your laptop to view it instead.
  3. Do not directly turn off the power while the system is running. Open a terminal and run sudo shutdown first to shut down the OS, then turn off the power.
  4. If the RC-Car crashes into the track fence at high speed, the front wheels are likely to get stuck on their drive rods. You then have to take off the two drive rods and reinstall them, which is a little troublesome. It is suggested to simply remove the drive rods of the two front wheels, which avoids this problem. This video demonstrates how to fix the problem. Other hardware modifications are not allowed.
  5. You are free to DIY your own algorithms to drive the car for the competition, but the following functions in this repo should be kept:
    1. Press ENTER to start the car immediately when using autopilot mode
  6. If you find coding in Jupyter Lab limiting because it lacks code navigation functions like go to definition and code suggestions, you can use VSCode for development.

Visual Studio Code is a high-productivity code editor which, when combined with programming language services, gives you the power of an IDE and the speed of a text editor.

After installation, you can install the Remote - SSH extension within VSCode to connect your editor to your RC-Car. The following is a video demo showing how.

Back to Top

Other Useful Tutorials

Python3

PyTorch

Credits

  • Donkeycar: Open source hardware and software platform to build a small-scale self-driving car.
  • JetRacer: An autonomous AI racecar using the NVIDIA Jetson Nano.
  • Jetcard: An SD card image for web programming AI projects with the NVIDIA Jetson Nano.
  • torch2trt: An easy-to-use PyTorch-to-TensorRT converter for model acceleration; real-time performance is important for our high-speed driving scenario.
  • Getting Started with Jetson Nano Developer Kit
