This repository contains the code used by the team of Hanze University of Applied Sciences during the Self Driving Challenge 2024.
The Self Driving Challenge is an annual competition organized by the RDW, in which teams from universities and colleges across the Netherlands compete against each other. The goal of the competition is to develop a self-driving vehicle that can complete a series of challenges, such as following a line, avoiding obstacles, and recognizing traffic signs.
The vehicle starts in front of a traffic light. The first objective is to detect when the traffic light turns green. Once it has turned green, the vehicle may start driving.
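One common way to detect a green light (not necessarily the approach used in this repository) is to threshold the camera image in HSV space and check whether enough green pixels appear in the traffic-light region. A minimal sketch in pure Python, with illustrative HSV bounds that are assumptions rather than the team's tuned values:

```python
def is_green_light(hsv_pixels, min_ratio=0.05):
    """Return True if the fraction of green pixels exceeds min_ratio.

    hsv_pixels: iterable of (h, s, v) tuples with h in [0, 180) and
    s, v in [0, 255] (OpenCV conventions). The bounds below are
    illustrative, not the values used by the team.
    """
    def is_green(h, s, v):
        # Roughly the green hue band, with enough saturation/brightness
        # to exclude dark or washed-out pixels.
        return 40 <= h <= 90 and s >= 100 and v >= 100

    pixels = list(hsv_pixels)
    if not pixels:
        return False
    green = sum(1 for h, s, v in pixels if is_green(h, s, v))
    return green / len(pixels) >= min_ratio
```

In practice the pixels would come from a cropped region of the camera frame around the detected traffic light.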
Traffic signs indicating speed limits will be placed along the track. Multiple different signs may be present. The vehicle should adhere to these limits until a new speed limit is indicated.
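Adhering to the most recent sign can be handled by a small stateful helper that replaces the active limit whenever a new sign is recognized. A sketch with a hypothetical interface and an assumed default limit:

```python
class SpeedLimitTracker:
    """Holds the most recently observed speed limit (km/h)."""

    def __init__(self, default_limit=10):
        # The default is an assumption, not a value from the competition rules.
        self.current_limit = default_limit

    def on_sign_detected(self, limit):
        """Called whenever a speed-limit sign is recognized."""
        self.current_limit = limit

    def clamp(self, requested_speed):
        """Limit a requested speed to the currently active limit."""
        return min(requested_speed, self.current_limit)
```

The sign detector would call `on_sign_detected`, and the speed controller would route every target speed through `clamp`.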
A red traffic light with a stop line will be present on the track. The vehicle approaching the traffic light will need to stop at an adequate location in front of the traffic light, and wait for it to turn green before continuing.
A pedestrian may be waiting on the side of a zebra crossing. The approaching vehicle will need to detect the presence of a waiting pedestrian, wait for them to cross and then continue its journey once the zebra crossing is cleared.
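The decision logic at the crossing reduces to three cases: someone is on the crossing, someone is waiting to cross, or the crossing is clear. A sketch (the three-state output is illustrative, not this repository's actual interface):

```python
def crossing_action(pedestrian_waiting, pedestrian_on_crossing):
    """Decide what to do at a zebra crossing.

    Both inputs would come from a pedestrian detector; the string
    states are hypothetical names for this illustration.
    """
    if pedestrian_on_crossing:
        return "wait"      # someone is crossing: stay stopped
    if pedestrian_waiting:
        return "stop"      # yield so the pedestrian can cross
    return "continue"      # crossing is clear
```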
A stationary vehicle will be present in the lane of the vehicle. The vehicle must plan and execute an overtaking maneuver by changing lanes, passing the obstacle, and then returning to its original lane.
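An overtaking maneuver like this is naturally expressed as a small state machine: follow, change lanes out, pass, change lanes back. A minimal sketch, assuming hypothetical boolean inputs from perception and localization:

```python
from enum import Enum, auto

class OvertakeState(Enum):
    FOLLOW = auto()        # driving in the original lane
    CHANGE_OUT = auto()    # moving to the adjacent lane
    PASS = auto()          # passing the stationary vehicle
    CHANGE_BACK = auto()   # returning to the original lane

def next_state(state, obstacle_ahead, obstacle_cleared, in_own_lane):
    """Advance the overtaking state machine one step (illustrative)."""
    if state is OvertakeState.FOLLOW and obstacle_ahead:
        return OvertakeState.CHANGE_OUT
    if state is OvertakeState.CHANGE_OUT and not in_own_lane:
        return OvertakeState.PASS
    if state is OvertakeState.PASS and obstacle_cleared:
        return OvertakeState.CHANGE_BACK
    if state is OvertakeState.CHANGE_BACK and in_own_lane:
        return OvertakeState.FOLLOW
    return state
```

Each state would map to a steering/throttle behavior in the actual planner; this only shows the transition logic.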
To finish the course, the vehicle will need to perform a parallel parking manoeuvre. It will need to park itself in a parking spot surrounded by barriers.
Prerequisites
This project is built using Python. Ensure you have Python 3.12 and its package manager pip installed on your system. Verify their versions in your terminal with these commands:
python --version
pip --version
Step 1: Clone the Repository
git clone https://github.com/HeadTriXz/SDC-2024
Step 2: Create a Virtual Environment
python -m venv venv
Step 3: Activate the Virtual Environment
# On Windows
venv\Scripts\activate
# On macOS and Linux
source venv/bin/activate
Step 4: Install the Required Packages
pip install -r requirements.txt
Before starting the go-kart, let's ensure it is ready:
- Plug in the Intel NUC: Make sure the Intel NUC is securely connected to a power source and powered on.
- Connect the Controller: Connect the controller to the Intel NUC via USB or Bluetooth (see Bluetooth Support).
- Calibrate the Cameras:
  - Place the ChArUco board in front of the go-kart.
  - Ensure the ChArUco board is clearly visible on all three cameras.
  - Run the calibrate_cameras script to calibrate the cameras:

    python -m scripts.python.calibrate_cameras
To start the go-kart, run the following command:
python -m src.main
The vehicle will start in manual driving mode. To switch to autonomous driving mode, hold the Start or Select button on the controller. Switching to autonomous mode will be indicated by 3 short vibrations followed by one long vibration. Switching back to manual mode will be indicated by one long vibration.
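The vibration feedback described above can be modelled as a simple pattern lookup. A sketch with assumed durations (the README only specifies "3 short then 1 long" for autonomous and "1 long" for manual; the millisecond values are made up for illustration):

```python
def vibration_pattern(to_autonomous):
    """Return the rumble durations (ms) signalling a mode switch.

    The 200/800 ms values are assumptions, not the project's settings.
    """
    short, long = 200, 800
    return [short, short, short, long] if to_autonomous else [long]
```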
Manual driving allows for direct control over the vehicle. This mode is essential for situations where human intervention is needed or for initial testing before deploying autonomous mode.
Bluetooth support is not enabled by default. To enable it, configure the following udev rule:
KERNEL=="event[0-9]*", SUBSYSTEM=="input", ATTRS{name}=="Xbox Wireless Controller", ACTION=="add", SYMLINK+="input/by-id/usb-Microsoft_Controller_Wireless-event-joystick"
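One way to install the rule (the file name and rules directory are common udev conventions, not specified by this project; verify them for your distribution):

```shell
# Save the rule to the local rules directory (name/priority are illustrative)
sudo tee /etc/udev/rules.d/99-xbox-controller.rules <<'EOF'
KERNEL=="event[0-9]*", SUBSYSTEM=="input", ATTRS{name}=="Xbox Wireless Controller", ACTION=="add", SYMLINK+="input/by-id/usb-Microsoft_Controller_Wireless-event-joystick"
EOF

# Reload the rules so they apply without a reboot
sudo udevadm control --reload-rules
sudo udevadm trigger
```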
The kart can be manually controlled using the following controller bindings:
Input | Type | Action |
---|---|---|
Left joystick | Move | Controls the steering of the vehicle. Move left to steer left and right to steer right. |
Left trigger | Hold | Applies the brake. The further you press, the stronger the braking force. |
Right trigger | Hold | Controls the throttle. The further you press, the faster the vehicle accelerates. |
A-button | Hold | Press and hold to enable the controller. |
B-button | Press | Sets the gear to forward, allowing the vehicle to move forward. |
X-button | Press | Sets the gear to reverse, allowing the vehicle to move backwards. |
Y-button | Press | Sets the gear to neutral, stopping the vehicle from moving forward or backwards. |
Start/Select-button | Hold | Switch between manual and autonomous mode. In autonomous mode, 3 short vibrations followed by one long vibration will indicate activation. Switching back to manual mode will be indicated by one long vibration. |
To execute a script, run the following command in your terminal:
python -m scripts.python.<script_name>
- braking_calibration - Calibrate maximum braking force using a binary search algorithm.
- calibrate_cameras - Calibrate the cameras and produce the matrices required to generate a top-down view of the road.
- data_driving - Capture images from the cameras while manually driving.
- view_lidar - Visualize lidar sensor data for analysis and debugging.
This project is licensed under the MIT License.
See the LICENSE file for more information.