Vietnam’s first autonomous vehicle debuts
Phenikaa Group, a multi-sectoral corporation, has introduced its first prototype of a level-4 autonomous vehicle, Vietnam's first smart self-driving vehicle.
The vehicle is equipped with smart functions, artificial intelligence (AI) and advanced technologies such as 2D/3D maps, Lidar sensors, SLAM (simultaneous localisation and mapping), machine learning and deep learning.
The electric, eco-friendly self-driving vehicle also has nearly 40 level-4 self-driving features based on the standards of the Society of Automotive Engineers (SAE International), on a scale where level 5 is the highest, according to Vietnamnet.
Phenikaa's smart autonomous vehicle is driverless and the user can take control of the vehicle via a customised application.
It still needs to undergo more safety tests in different locations, as well as further research to reduce production costs, before being brought to market.
With the current infrastructure, technology and regulations in Vietnam, it might take some time before the vehicle can be used on the roads.
"We hope the introduction of Vietnam’s first 'Made-in-Vietnam' Level-4 Smart Autonomous Vehicle will facilitate the development of the self-operating industry, localise technological products and perfectly meet the market demand for high-quality, internationally standardised products and services," Le Anh Son, Director of Phenikaa-X JSC, said.
"We don't hold a dream of becoming an automaker. When investing in this project, our first target is to create useful and good high-tech products for lives," Ho Xuan Nang, Chairman of Phenikaa University, said.
"And the second purpose is to gather researching forces and to show that this kind of research in a university is really useful for training and teaching."
Last week, Phenikaa Group signed a memorandum of understanding (MoU) with Nippon Koei Vietnam, SICK Sensor Intelligence, Advantech Vietnam Technology Co Ltd, BAP Group, and VEDAX to co-operate in making new high-quality products with superior functions and to push the development of the autonomous vehicle industry.
Although current Advanced Driver-Assistance Systems (ADAS) provide important safety functions such as pre-collision warnings, steering assistance, and automatic braking, self-driving vehicles take these technologies to the next level by completely removing the need for a driver.
In fact, there are “levels” to autonomy, which break down as follows (a brief code sketch of the levels appears after the list):
Level 0: The automated system has no control over the vehicle, but may alert the driver to hazards
Level 1: The driver and the automated system share control of the vehicle. Examples of this can be found in most cars equipped with ADAS
Level 2: The automated system is capable of taking full control of the vehicle; however, the driver must be ready to intervene if the system fails to recognize a potential hazard
Level 3: The automated system takes full control of the vehicle and the passenger can safely take their attention away from driving tasks; however, they must still be able to intervene
Level 4: Driver can safely divert all attention away from driving tasks and let the automated system take full control. This functionality is currently limited to specific “geofenced” areas and other relatively controlled environments
Level 5: No human intervention is required
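To make the taxonomy concrete, here is a minimal sketch in Python that encodes the levels above as an enum; the names SAELevel and requires_human_fallback are hypothetical illustrations, not part of any SAE or Phenikaa software:

    from enum import IntEnum

    class SAELevel(IntEnum):
        """SAE driving-automation levels, mirroring the list above."""
        NO_AUTOMATION = 0           # system may warn of hazards but never drives
        DRIVER_ASSISTANCE = 1       # driver and system share control (typical ADAS)
        PARTIAL_AUTOMATION = 2      # system can drive, but driver must watch for hazards
        CONDITIONAL_AUTOMATION = 3  # passenger may look away, but must be able to intervene
        HIGH_AUTOMATION = 4         # no attention needed within geofenced areas
        FULL_AUTOMATION = 5         # no human intervention required anywhere

    def requires_human_fallback(level: SAELevel) -> bool:
        """Levels 0-3 still expect a human who can monitor or intervene."""
        return level <= SAELevel.CONDITIONAL_AUTOMATION

    print(requires_human_fallback(SAELevel.HIGH_AUTOMATION))  # -> False

The cut-off at level 3 captures the practical distinction the article draws: only at levels 4 and 5 can the occupant stop paying attention to the driving task.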
Autonomous cars rely on sensors, actuators, complex algorithms, machine learning systems, and powerful processors to execute software.
Autonomous cars create and maintain a map of their surroundings based on a variety of sensors situated in different parts of the vehicle. Radar sensors monitor the position of nearby vehicles. Video cameras detect traffic lights, read road signs, track other vehicles, and look for pedestrians. Lidar (light detection and ranging) sensors bounce pulses of light off the car’s surroundings to measure distances, detect road edges, and identify lane markings. Ultrasonic sensors in the wheels detect curbs and other vehicles when parking.
Sophisticated software then processes all this sensory input, plots a path, and sends instructions to the car’s actuators, which control acceleration, braking, and steering. Hard-coded rules, obstacle avoidance algorithms, predictive modeling, and object recognition help the software follow traffic rules and navigate obstacles.
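As a rough illustration rather than Phenikaa's actual software, this perceive-plan-act pipeline can be sketched in Python; the sensor fields, thresholds and gains below are invented for the example:

    import random
    from dataclasses import dataclass

    @dataclass
    class ActuatorCommand:
        throttle: float  # 0.0 .. 1.0
        brake: float     # 0.0 .. 1.0
        steering: float  # -1.0 (full left) .. 1.0 (full right)

    def read_sensors() -> dict:
        """Stand-in for fused radar/camera/lidar/ultrasonic input."""
        return {
            "obstacle_distance_m": random.uniform(5.0, 100.0),  # nearest obstacle ahead
            "lane_offset_m": random.uniform(-0.5, 0.5),          # lateral offset from lane centre
            "speed_limit_mps": 13.9,                             # roughly 50 km/h
            "current_speed_mps": 12.0,
        }

    def plan(perception: dict) -> ActuatorCommand:
        """Toy rule-based planner: brake for close obstacles, steer back to lane centre."""
        if perception["obstacle_distance_m"] < 15.0:
            return ActuatorCommand(throttle=0.0, brake=1.0, steering=0.0)  # emergency stop
        throttle = 0.3 if perception["current_speed_mps"] < perception["speed_limit_mps"] else 0.0
        steering = -0.2 * perception["lane_offset_m"]  # proportional lane-keeping correction
        return ActuatorCommand(throttle=throttle, brake=0.0, steering=steering)

    def control_step() -> ActuatorCommand:
        """One perceive -> plan -> act iteration of the driving loop."""
        command = plan(read_sensors())
        # a real stack would now send `command` to drive-by-wire actuators
        return command

    print(control_step())

A production stack replaces the toy planner with trajectory optimisation, predictive models of other road users and learned object recognition, but the loop structure of reading sensors, planning and commanding actuators is the same.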
The scenarios for convenience and quality-of-life improvements are limitless. The elderly and the physically disabled would have independence. If your kids were at summer camp and forgot their bathing suits and toothbrushes, the car could bring them the missing items. You could even send your dog to a veterinary appointment.
But the real promise of autonomous cars is the potential for dramatically lowering CO2 emissions. In a recent study, experts identified three trends that, if adopted concurrently, would unleash the full potential of autonomous cars: vehicle automation, vehicle electrification, and ridesharing. By 2050, these “three revolutions in urban transportation” could:
Reduce traffic congestion (30% fewer vehicles on the road)
Cut transportation costs by 40% (in terms of vehicles, fuel, and infrastructure)
Improve walkability and livability
Free up parking lots for other uses (schools, parks, community centers)
Reduce urban CO2 emissions by 80% worldwide
Chau Polly