The original idea of this project comes from one simple question: what if all vehicles on the road were autopiloted? Obviously, drivers like me would be freed from manual operation, which is not only boring but also risky. But what else? In recent years, we have witnessed great improvements in autonomous vehicle algorithms, which make these questions realistic. Although companies like Tesla and Amazon are already operating autonomous vehicle businesses in some form, their autopiloted cars work alone on the road. In other words, they assume all other vehicles are human-operated. So if more than one autonomous vehicle is nearby, should they communicate with each other to build a better understanding of the environment? Would this extra information help the built-in algorithms work better? Could this kind of information sharing benefit traffic in the future? We decided to do some research.
To build an algorithm for autopiloted vehicles that provides a reliable communication protocol between two or more vehicles.
Instead of adding onboard sensors, we want to make the following car blind but still auto-pilotable, based on the information shared by the leading car.
The default mode of the Raspberry Pi's onboard Bluetooth module is SLAVE ACCEPT; it needs to be changed to MASTER before connecting to the HC-05 on the Arduino:
```
sudo hciconfig hci0 lm master
```
After this, use the `bluetoothctl` tool to scan, pair, and trust the target HC-05 module.
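A typical session looks like this, where XX:XX:XX:XX:XX:XX stands for the HC-05 address discovered during the scan (the default HC-05 pairing PIN is usually 1234, but check your module):

```
bluetoothctl
[bluetooth]# scan on
[bluetooth]# pair XX:XX:XX:XX:XX:XX
[bluetooth]# trust XX:XX:XX:XX:XX:XX
[bluetooth]# quit
```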
Then, connect it as a serial port so that we can send commands over the Bluetooth channel:
```
sudo rfcomm connect hci0 XX:XX:XX:XX:XX:XX
```
Then run the master control script:

```
cd IBID/Master_Control
python3 ControlBT.py
```
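We do not reproduce `ControlBT.py` here, but a minimal master-side sketch could look like the following, assuming `pyserial` is installed, the `rfcomm` connection exposed the link as `/dev/rfcomm0`, and a hypothetical one-character command protocol that the slave understands:

```python
# Minimal master-side control sketch. Assumptions: pyserial is installed,
# rfcomm exposed the HC-05 link as /dev/rfcomm0, and the Arduino slave
# understands a hypothetical one-character command set (F = forward,
# S = stop) -- not necessarily the exact protocol in ControlBT.py.
import time
import serial

bt = serial.Serial('/dev/rfcomm0', baudrate=9600, timeout=1)  # HC-05 default baud

def send_command(cmd):
    """Send a single-character drive command over the Bluetooth serial link."""
    bt.write(cmd.encode('ascii'))

send_command('F')   # drive forward
time.sleep(2)
send_command('S')   # stop
bt.close()
```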
On the Arduino side, upload `Slave.ino` from `IBID/Slave_Control/Slave` to the following car.
The `pybluez` library is necessary for running the RSSI code:

```
pip install pybluez
```
```
cd IBID/Rssi
python3 testblescan.py
```
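The scan reports an RSSI value per beacon advertisement, not a distance. A common way to turn RSSI into an approximate distance is the log-distance path-loss model, sketched below; this is an illustration of the idea rather than the exact formula in `testblescan.py`, and the calibration constants (`tx_power`, the RSSI measured at 1 m, and the environment factor `n`) are assumptions that have to be measured for your beacons and room.

```python
# Log-distance path-loss model: estimated distance grows exponentially
# as RSSI drops. tx_power and n are calibration assumptions, not values
# taken from the project code.
def rssi_to_distance(rssi, tx_power=-59, n=2.5):
    """Estimate distance in meters from a BLE RSSI reading (dBm)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

print(rssi_to_distance(-59))  # ~1.0 m at the calibration point
print(rssi_to_distance(-75))  # noticeably farther; exact value depends on n
```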
The voice control system consists of:

- Linux Server
- Echo Dot / Amazon Alexa App to fetch voice commands
- Alexa Skill to do Natural Language Processing in the cloud
- Alexa API to send the Raspberry Pi the processed voice command package, whose main part is [{Device name}, {Operation name}]

At last, we combine all three parts above to make our final demo. In this demo, we use voice to activate the leading car (Master). When the leading car moves, it triggers the following algorithm, which guides the following car (Slave) to stay behind it while keeping a safe distance, based on information acquired by both the RSSI system and the sensor system.
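The follow logic can be summarized as a small control loop that fuses the RSSI distance estimate with the onboard ultrasonic reading and steers toward a target gap. The sketch below illustrates the idea only: `read_rssi_distance`, `read_ultrasonic_distance`, `drive`, and the numeric thresholds are hypothetical placeholders, not the project's actual interfaces.

```python
# Simplified follower loop (illustration only): keep a safe gap behind the
# leading car using both the RSSI estimate and the ultrasonic sensor.
import time

TARGET_GAP = 0.5   # desired following distance in meters (assumed value)
TOLERANCE = 0.1    # dead band around the target to avoid jitter

def read_rssi_distance():
    """Placeholder for the iBeacon/RSSI distance estimate (meters)."""
    return 0.6

def read_ultrasonic_distance():
    """Placeholder for the ultrasonic sensor reading (meters)."""
    return 0.55

def drive(cmd):
    """Placeholder: forward a drive command to the motor controller."""
    print('drive:', cmd)

while True:
    # Trust the shorter estimate: the ultrasonic sensor is more precise up
    # close, while RSSI keeps working without a clear line of sight.
    gap = min(read_rssi_distance(), read_ultrasonic_distance())
    if gap > TARGET_GAP + TOLERANCE:
        drive('F')    # falling behind: move forward
    else:
        drive('S')    # too close or inside the dead band: hold position
    time.sleep(0.1)   # ~10 Hz control loop
```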
In all, we successfully built a (simulated) vehicle interaction system this quarter, including an indoor positioning system, a sensor information sharing platform, and a voice control system. Human operators are able to control all registered vehicles in the system simply by voice. Vehicles inside the system, human-operated or not, are able to communicate with each other to achieve simple collaboration such as following or obstacle avoidance.
Although we did not have enough funding or support to test on real vehicles, we believe our trial is still meaningful and promising. The Arduino-Raspberry Pi intelligent car system can be regarded as a simulation of what might happen in the future. Currently, we rely mainly on cheap ultrasonic and IR sensors, but the same idea could be realized on a Tesla sedan equipped with far more precise radar for similar purposes. Likewise, we now use Bluetooth iBeacons to build an indoor positioning system; outdoors, we could simply swap the source to GPS satellites while keeping the original positioning algorithm structure. What we built is not a set of toy cars but a low-cost prototype of a future transportation system.