2021 has been called the first year of the metaverse. Although the related technologies are still at an exploratory, frontier stage, some enterprises working on digital twins are already accelerating their deployment of the industrial internet and intelligent manufacturing in metaverse environments.
The concept of the digital twin was first proposed by Professor Grieves in 2003 from the perspective of product lifecycle management. Early on, digital twins were used mainly in the manufacturing of key equipment in the aerospace and defense industries. For example, the US Air Force Research Laboratory and the National Aeronautics and Space Administration (NASA) both applied digital twins to aircraft quality control, and Lockheed Martin introduced digital twins into the production of the F-35 fighter to improve its process flow, production efficiency, and quality. Because the digital twin combines virtual-real integration and real-time interaction, iterative operation and optimization, and full-element, full-process, full-business, data-driven operation, it has been widely applied across every stage of the product lifecycle, from design and manufacturing to service, operation, and maintenance.
The three-dimensional model defined by Professor Grieves, consisting of physical objects, virtual objects, and connecting objects, remains the main model guiding the development and application of digital twins. Physical objects are real entities that can be digitally observed and modeled; virtual objects are virtual entities that describe and map physical objects through geometric, physical, behavioral, and rule models; connecting objects are the operational links that map physical objects onto virtual objects and let virtual objects act back on physical objects.
The basic model of the metaverse is defined very similarly to that of the digital twin. In particular, the metaverse also emphasizes an ecosystem in which the virtual and the real coexist and merge through physical objects, virtual objects, and the mapping relationships between them, which makes the two concepts hard to distinguish in many scenarios. The metaverse, however, has carried an entertainment gene since its birth: if the digital twin is centered on product production, the metaverse is centered on user consumption and experience. The metaverse can extend the digital twin model from the R&D and production stages into the use and consumption stages, and expand it from a local digital workshop into an all-encompassing digital living space.
The metaverse can bring potential beyond imagination to the automobile industry, driving innovation in both manufacturing and business models. The three-dimensional immersive experience of the metaverse, the digitalization of human and social relations, the convergence of the physical and digital worlds, massive user-generated creation, and the realization of digital asset value will gradually be applied to automobile design and intelligent manufacturing.
In the previous article, "Car Enterprises Gather in the 'Metaverse' (I): From the Perspective of Design and Development", I focused on applications of the metaverse in automotive R&D and manufacturing scenarios. Here I continue with the application prospects of the metaverse in vehicle intelligence systems and in-car entertainment systems.
On November 2, 2020, the General Office of the State Council officially issued the New Energy Vehicle Industry Development Plan (2021-2035), pointing out that the integration of automobiles with technologies from the energy, transportation, and information and communication fields is accelerating, and that electrification, connectivity, and intelligence have become the development trend of the automobile industry. The Intelligent Connected Vehicle Technology Roadmap 2.0 likewise states that by 2025, sales of intelligent connected vehicles at the PA (partially automated) and CA (conditionally automated) levels should exceed 50% of total sales, the installation rate of C-V2X terminals in new cars should reach 50%, and highly autonomous vehicles should achieve commercial application in limited areas and specific scenarios.
According to the classification standard of the Society of Automotive Engineers (SAE), the development of intelligent vehicles is divided into six levels: manual driving (L0), driver assistance (L1), partial automation (L2), conditional automation (L3), high automation (L4), and full automation (L5). For example, the electronic stability program (ESP) and anti-lock braking system (ABS) belong to the driver-assistance level; cruise control (CCS) automates the combined control of accelerator and brake; adaptive cruise control (ACC) and automatic parking still require manual participation and so count at most as conditional automation. Judged by its actual performance, Tesla's Full Self-Driving (FSD) reaches only L3. No L4 vehicle is yet on the market, and L4 driving remains in the laboratory, as in the Baidu and Didi projects; L5, fully intelligent driving without a driver under all operating conditions, is still at the conceptual design stage and has a long way to go before reaching the road.
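The six-level taxonomy above can be captured in a few lines of code. This is a minimal sketch for illustration only; the names and the supervision rule summarize the article's description, not the SAE standard's full wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Driving-automation levels as described in the text (SAE classification)."""
    L0_MANUAL = 0        # human performs all driving tasks
    L1_ASSIST = 1        # single-function assistance (e.g. ESP, ABS, cruise control)
    L2_PARTIAL = 2       # combined assistance; driver supervises continuously
    L3_CONDITIONAL = 3   # system drives, driver must take over on request
    L4_HIGH = 4          # no driver needed within a limited operating domain
    L5_FULL = 5          # no driver needed, all regions and all weather

def driver_supervision_required(level: SAELevel) -> bool:
    """Below L4 the human remains part of the control loop."""
    return level < SAELevel.L4_HIGH
```

Under this scheme, the Tesla FSD system as assessed in the article (`SAELevel.L3_CONDITIONAL`) still requires driver supervision, while the laboratory L4 projects would not, within their limited domains.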
Schematic Diagram of the Development Stage of Automobile Intelligence
Source: Research status and prospect of human-computer cooperative control of intelligent vehicles 
Metaverse-based intelligent driving focuses mainly on L3-L4 and L5: at L3-L4 on human-machine cooperative driving, and at L5 on full vehicle autonomy.
Intelligent environment perception is the prerequisite for full automation. In current L4-L5 vehicle designs, large numbers of external sensors mounted on the body, including cameras, radar, and positioning systems, perceive physical objects in the real world and produce real-time environment data for vehicle decision-making. Tesla's Autopilot system, for example, comprises 8 cameras, 12 ultrasonic sensors, and 1 millimeter-wave radar, giving the driver 360-degree visual coverage and a viewing range of up to 250 meters.
Tesla Autopilot Rendering
The Autopilot sensor suite constitutes the environment-perception layer of intelligent driving. Because the reliability and latency requirements on sensors are so high, making real-time intelligent decisions solely from the vehicle's own perception of the external environment is not dependable. Although Tesla markets the system as Full Self-Driving (FSD), actual use still requires manual intervention, and the core problem is its heavy dependence on onboard sensors. Consider the Ya'an-Xichang Expressway in southwest China, the Sichuan section of the Beijing-Kunming Expressway (National Expressway G5), which runs through the complex mountain-canyon terrain of the Daliang Mountains. The scenery is spectacular, but accident risk is high due to year-round rainstorms, fog, mudslides, landslides, ice, and snow. Under the superposed effects of such complex climatic conditions, a vehicle's own environment-perception system falls far short of the level required for fully automated driving.
In the metaverse era, large volumes of road-condition and weather information will be digitally constructed into a three-dimensional road-condition space. In this virtual environment, geographic, weather, road-surface, and other exogenous variables produce multi-layered superposition effects; by modeling these effects, digital road-environment analysis and response can be achieved in all weather and across all road conditions. The metaverse's digital road-condition space will be synchronized to edge clouds by region. Over 5G and 6G networks, a vehicle can exchange its sensor-perceived environment with the edge cloud at millisecond-level response speeds, so that the vehicle's own perception and the edge cloud's digital environment correct and complement each other. Together, the edge cloud and the vehicle's own sensors form a dual data guarantee for fully automated driving: if the body sensors fail, the edge cloud takes over the supply of environment data, providing continuous data support for intelligent decision-making.
The metaverse's digital road-condition space will be maintained collectively. Public transport authorities, commercial data providers, and every intelligent vehicle will scan physical road-condition objects into the network, forming a bottom-up operating framework for digital road conditions. Today, for cost reasons, smart cars carry far more ultrasonic sensors than millimeter-wave radars; in the future they will be equipped with large numbers of 3D-scanning cameras, millimeter-wave radars, and climate sensors, building a more secure body of digital road data through remote perception of 3D road-condition entities.
L5 fully automated driving is defined as all-weather, all-region automated driving achieved through onboard perception and decision-making, without driver intervention. L5 requires vehicles to cope with dynamic changes in climate and geography, a definition that effectively demands overcoming every road condition and climate condition on earth, and every combination of them. A vehicle, as a single computing unit, cannot build such a real-time intelligent response mechanism for multiple road conditions from its own sensors and computing power alone. Like human drivers, intelligent vehicles need to accumulate and learn from large amounts of live data before they can handle complex road conditions.
According to the "Chelu Zhixing" (vehicle-road intelligent mobility) technical roadmap in Baidu's Apollo intelligent transportation white paper, intelligent transportation must pass through a 1.0 digitalization stage, a 2.0 networking stage, and a 3.0 full-automation stage. In Baidu's ACE traffic engine architecture, the intelligence engine consists of the Apollo autonomous driving system and a vehicle-road coordination system. Vehicle-road coordination is an intelligent driving-assistance system that integrates roadside multi-sensor data and relies on cloud-edge computing, deep learning, scene integration, dynamic high-precision maps, and edge-cloud collaboration to support the future large-scale deployment of autonomous vehicles.
Diagram of intelligent transportation development stages
Source: Apollo Intelligent Transportation White Paper 
Intelligent driving at the metaverse stage means fully realizing L5 automated driving: completing automatic driving in all road conditions and all weather under the dual guarantee of a cloud-based 3D virtual digital road space and the onboard intelligent decision-making system. Public transport authorities, commercial intelligent-transportation operators, and every intelligent vehicle will jointly participate in scanning, constructing, networking, and operating the physical objects of the road environment. Public transport authorities establish static models such as base 3D maps and climate models through satellite remote sensing, satellite imaging, and 3D surveying; commercial intelligent-transportation operators build third-person dynamic road-condition models through field surveys, dynamic 3D scanning, and millimeter-wave radar scanning; and each intelligent car builds a first-person dynamic road-condition model through its own 3D cameras, millimeter-wave radar, and microwave radar.
These three parties upload their detection data in real time to intelligent-transportation edge-cloud nodes via onboard 5G/6G communication chips. The edge-cloud nodes construct, maintain, and update the virtual digital road-condition space, and coordinate edge-cloud data through a blockchain network to ensure real-time, accurate synchronization to every node, assisting the intelligent decision-making of each vehicle.
In April 2019, Tesla released its self-developed FSD chip on the Autopilot HW3.0 platform. It adopts a dual-chip redundant design, with each chip delivering 72 TOPS of computing power for a board total of 144 TOPS, vertically integrating the autonomous-driving chip with the neural-network algorithms. This hardware-software co-design for real-time AI decision-making based on neural networks has become the mainstream design approach for future intelligent vehicles. Two different design philosophies still compete in intelligent-vehicle decision systems, however. One, represented by domestic Chinese intelligent-vehicle makers, is comprehensive intelligent decision-making combining body sensors, high-precision maps, and edge-cloud collaboration; the other, represented by Tesla, is a human-like visual decision system relying solely on high-precision onboard sensors such as high-definition cameras and millimeter-wave radar. It is not yet clear which approach has the advantage, but it is certain that if intelligent transportation gains the advantage of a digital road space, the first strategy will be safer and more available.
The intelligent-vehicle decision system in the metaverse environment is a comprehensive system that achieves edge-cloud collaboration on the basis of a digital road space: a 3D, multi-dimensional, multi-perspective decision model in which the onboard decision system is primary and the cloud decision system supplementary. The metaverse's digital road-condition space offers a third-person, "God's-eye" view of the virtual road environment, which can display changes in traffic, weather, accidents, and disasters more objectively, comprehensively, and dynamically, and can observe latent road-condition risks more penetratingly. The onboard decision system takes a first-person perspective on the current driving state: using onboard vision and radar sensors as its subjective decision inputs, and neural-network algorithms to emulate human visual observation and judgment, it can detect immediate road risks more quickly.
Road-condition risk analysis is essentially nonlinear structural analysis, and driving risk is often hard to predict. Beyond guaranteeing the systemic safety of the vehicle itself, driving safety largely depends on monitoring sudden road-condition risks in real time and making the best possible decision in response. Intelligent decision-making therefore requires not only the subjective, first-person analysis of the onboard system but also the objective, third-person auxiliary analysis of the cloud. First-person and third-person 3D perspectives, onboard and cloud-based decision systems, together constitute the dual decision system for intelligent, safe driving in the metaverse environment.
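One simple way to picture how the two perspectives could be combined is to fuse their risk estimates conservatively, acting on a hazard seen from either viewpoint. The function names, score ranges, and thresholds below are assumptions of this sketch, not part of any described system:

```python
def fuse_risk(onboard_risk: float, cloud_risk: float,
              cloud_confidence: float = 1.0) -> float:
    """Combine first-person (onboard) and third-person (edge-cloud) risk
    estimates, all in [0, 1]. Taking the maximum is deliberately
    conservative: a hazard seen from either perspective is acted on.
    The confidence weight on the cloud estimate models link quality."""
    return max(onboard_risk, cloud_risk * cloud_confidence)

def decide(onboard_risk: float, cloud_risk: float,
           brake_threshold: float = 0.7,
           alert_threshold: float = 0.4) -> str:
    """Map the fused risk score to a driving action."""
    risk = fuse_risk(onboard_risk, cloud_risk)
    if risk >= brake_threshold:
        return "emergency_brake"
    if risk >= alert_threshold:
        return "alert_driver"
    return "continue"
```

For instance, a landslide visible only from the cloud's "God's-eye" view (high `cloud_risk`, low `onboard_risk`) would still trigger braking, which is the point of the dual decision system.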
Which car first carried a radio can no longer be verified, but according to Chevrolet and Cadillac, both began trying to install radios in their models in the 1920s. The first in-car record player appeared in 1956, when Peter Goldmark fitted a 7-inch vinyl mini record player into the glove box of Chrysler cars, igniting a revolution in car audio. In 1960, Earl Muntz, an engineer and businessman nicknamed "Madman", built the first cartridge car stereo, the "Muntz Stereo-Pak", and car owners could finally listen to four-track stereo music. In 1987, Ford was the first to embed a CD player seamlessly into the Lincoln Town Car, and people began to feel the charm of digital music in the vehicle. In 2000, with the rise of MP3, MP4, and other multimedia players, the era of the in-car entertainment system officially began. Today, large displays, internet access, digital music, video, movies, in-car karaoke, and other diversified entertainment experiences have become basic features of new cars.
At NIO Day 2021, NIO showed exclusive AR glasses developed jointly with NREAL: weighing 76 g, they project a virtual screen at a viewing distance of 6 m, equivalent to a 201-inch display. NIO also showed VR glasses developed jointly with NOLO that achieve binocular 4K display with millisecond-level latency. Using metaverse technologies such as AR and VR, NIO is building the panoramic digital cockpit "PanoCinema". In 2022, Elon Musk said he hoped to bring the Steam game platform directly to Tesla models.
Although the automobile's safe-transportation function has always dominated its design, that has never stopped consumers from pursuing an ever more comfortable experience in the car's interior. The car offers an enclosed private space, which is undoubtedly a precondition for pursuing entertainment: in the eras of the car radio, the cassette player, the CD player, and the digital multimedia system alike, people have tried to put the latest entertainment technology into this private space. The development of automobile intelligence from L0 manual driving to L5 full automation is a process of gradually freeing the driver, letting occupants immerse themselves fully in a customized experience space, thereby weakening the vehicle's transport function and strengthening its consumption functions: office work, entertainment, sports, and more.
In the L0-L4 stages, the intelligent development of automobiles centers on helping drivers achieve safe and efficient automated driving. From the perspective of cockpit design and development, this means giving drivers a driving-assistance system with an intuitive view, simple operation, and comprehensive information, addressing safety during the transitions between driving and resting. From the multi-function steering wheel to the head-up display (HUD), designers try to let the driver operate the vehicle and view road information without lowering or turning the head, keeping the driver in the best possible state of observation.
A HUD projects key instrument information onto the front windshield during driving, so the driver can read it without looking down. Its design concept comes mainly from the helmet-mounted display systems of fighter pilots. In the 1970s, South African Air Force pilots flying the Mirage F1AZ pioneered the combat use of helmet-mounted sights: the missile seeker could be slewed onto the target by the pilot's head movement and launched at large off-axis angles, so the pilot no longer had to point the aircraft's nose at the target, achieving "what you see is what you get" in the sense of air combat.
Once automobile intelligence entered the L1 stage, a growing array of driver-assistance devices gradually freed the driver's hands and feet, so the driver no longer had to control the car 100% of the time. But this also made it hard to balance driver rest with sustained driving attention. In May 2021, a Tesla Model 3 collided with a truck on a California highway, causing the truck to roll over and killing the Tesla driver on the spot. US traffic-safety investigators found that Tesla's automated driving system had failed to detect the truck correctly, and that the driver's inattention combined with Tesla's immature driver-monitoring system caused the accident.
From a safety standpoint, maintaining the driver's attention remains critical at least until L5 full automation. A metaverse-based driving experience still puts vehicle safety first. Using a head-up display or smart-glasses display system, a deeply fused virtual-real metaverse space for safety-assisted driving can be established between the driver's eyes and ears and the road. When this space is active, AR or MR road-situation games are presented in the driver's focal area, and the massively multiplayer road-situation game of the metaverse anchors the driver's attention firmly inside the safety-assisted driving space. Against road-safety risks, the system can raise millisecond-level alarms that stimulate the driver's visual and auditory nerves, letting the driver retake control of the vehicle immediately. The metaverse safety-assisted driving space thus realizes "what you see is what you get" in the sense of safe driving.
The idea of working in the car comes mainly from the advent of the "business vehicle", that is, the multi-purpose vehicle (MPV). The "Voyager" and "Caravan" models launched by Chrysler in 1983 were the world's earliest MPVs. An MPV's greatest strengths are high passenger capacity, large interior space, and high comfort, suiting it to family travel, business reception, and similar scenarios.
Statistics show that in 2020 the average one-way commute was 47 minutes in Beijing and 42 minutes in Shanghai: office workers spend roughly an hour and a half a day on the road. How to use that travel time for work is the question an in-vehicle office system must answer. At the 2016 consumer electronics show in Berlin, Microsoft and Mercedes-Benz jointly launched the "In Car Office" project, turning the customer's car into a mobile workstation that helps complete work-related tasks on the move and improves office efficiency. Beyond Microsoft's Office software, it also integrates conferencing tools, WeChat, Facebook (now Meta), and other applications. Yet whether third-party systems like Microsoft's or automakers' own office systems, all have produced much thunder and little rain in recent years. One reason is that vehicle intelligence is still at the L2-L3 stage, so drivers cannot be fully freed; the other is that onboard office equipment has not escaped the traditional mobile "hard office" experience of phones, laptops, and tablets. These devices are relatively small and unstable in a car, and so cannot generate the direct efficiency gains needed to stimulate demand for in-vehicle work.
The metaverse virtualizes working distance, office space, and working mode into digital 3D form, and with holographic imaging, VR, AR, MR, and other new display technologies, achieves an immersive "soft office" experience. Compared with the traditional hard-office experience of phones, laptops, and tablets, an optically or digitally adjustable field of view can be formed within the narrow cabin, letting a passenger in one square meter of car space experience a hundred-square-meter conference room. Digital anti-shake technology can also greatly reduce the experience fluctuations caused by body vibration, keeping passengers immersed throughout the work experience.
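The digital anti-shake mentioned above can be pictured as low-pass filtering: the virtual screen follows the cabin's slow motion but ignores high-frequency vibration. A minimal one-axis sketch, assuming an exponential-moving-average filter and an illustrative smoothing factor (real stabilization pipelines use IMU fusion across all six axes):

```python
class MotionStabilizer:
    """One-axis digital anti-shake sketch: the displayed image anchors to a
    low-pass-filtered estimate of cabin position, so rapid shake does not
    move the virtual screen while slow, deliberate motion is followed."""

    def __init__(self, alpha: float = 0.1):
        # 0 < alpha <= 1; lower alpha means stronger smoothing.
        self.alpha = alpha
        self.position: float | None = None

    def update(self, measured: float) -> float:
        """Feed one raw displacement sample; return the display anchor."""
        if self.position is None:
            self.position = measured          # initialize on first sample
        else:
            # Exponential moving average of the measured displacement.
            self.position += self.alpha * (measured - self.position)
        return self.position
```

A sudden 1-unit jolt moves the anchor by only `alpha` units on the next frame, which is why the rendered conference room appears steady even as the cabin vibrates.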
In the seventy years since the first mini vinyl record player was put into a Chrysler, people have never stopped adding entertainment devices to the car: record players, cassette players, CD players, digital video players, karaoke systems, game consoles. Whatever the latest miniature entertainment device of an era, people have wanted it installed in the car. As an industrial product, the automobile has been endowed since birth with human values, lifestyles, and emotional needs, while reflecting the aesthetic orientations of different eras and peoples, and so forming the substance of automobile culture. Yet the most successful in-car entertainment devices have always been players of music. In automobile culture, music is the soul of a car; a car without music is a cold, emotionless industrial product.
From an engineering standpoint, the music-centered car entertainment system has endured and renewed itself for seventy years mainly because music provides drivers and passengers with a 360° immersive auditory experience that current display-centered visual equipment cannot match. Immersive in-car entertainment requires an enveloping, multi-sensory, realistic atmosphere spanning auditory, visual, and even social interaction experiences. The metaverse, a virtual-real symbiotic world built on mapping and interacting with the real world, is an immersive digital living space engaging multiple sensory systems; having evolved from the worlds of science fiction and games, it is entertaining by nature. Virtual-real symbiotic digital entertainment will compensate for the cabin's physical shortcomings of narrowness, noise, vibration, and airtightness: enlarging the passenger's field of view and scene depth through digital space, optimizing hearing through digital noise reduction, improving vision through holographic 360-degree display, and enhancing touch through 3D haptic feedback. In a metaverse-based in-car entertainment space, games, social activities, movies, and other entertainment will all enjoy a low-latency, 3D, enveloping immersive experience.
Although the core businesses of the metaverse and the automotive industry have not yet produced a direct chemical reaction, and large automakers so far limit themselves to auxiliary uses such as brand promotion, showroom construction, and sales, expectations for the next-generation digital world mean that more automakers will join the metaverse. How to endow the metaverse with productivity, open multi-dimensional consumption spaces for the automobile, and give users a virtual-real symbiotic automotive life will require a long period of imagination, thinking, design, R&D trial and error, and practice across the industry. This article is only a beginning.
[1] Tao Fei, Liu Weiran, Zhang Meng, et al. Digital Twin Five-dimension Model and Applications in Ten Fields [J]. Computer Integrated Manufacturing Systems, 2019, 25(01): 1-18. DOI: 10.13196/j.cims.2019.01.001
[2] Li Puchao, Ding Shouchen, Xue Bing. From the Development of Automotive Intelligence to the "Metaverse" of the Automotive Industry [J]. Internal Combustion Engine & Parts, 2021(24): 164-166. DOI: 10.19475/j.cnki.issn1674-957x.2021.24.054
[3] Hu Yunfeng, Qu Ting, Liu Jun, et al. Research Status and Prospect of Human-Machine Cooperative Control of Intelligent Vehicles [J]. Acta Automatica Sinica, 2019, 45(07): 1261-1280. DOI: 10.16383/j.aas.c180136