Creating an environment model for automated driving

Detection of the vehicle environment using a combination of cameras, RADAR, LIDAR and cloud-based data can help automated trucks orientate themselves in their surroundings

PMV Middle East

Reliable perception of the vehicle environment and its precise evaluation are basic requirements for highly developed driver assistance functions and automated driving. Technology company Continental is developing an environment model that captures the vehicle environment using various sensors. Data from sensors such as camera, RADAR and LIDAR is combined with additional information such as the profile of the route ahead. The data is evaluated and interpreted by an intelligent control unit, Continental’s assisted and automated driving control unit (ADCU), a high-performance computer. From the resulting data, the ADCU creates a complex and detailed environment model more than fifty times per second by linking information from the individual sensors and the various applications.
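The update cycle described above can be pictured as a loop that merges per-sensor detections into one shared object list, repeated more than fifty times per second. The sketch below is a deliberately minimal illustration of that idea; all class and field names are assumptions for this example, not Continental's actual ADCU software.

```python
from dataclasses import dataclass, field

# Hypothetical data types for one fusion cycle of an environment model.
# Names and structures are illustrative, not a real ADCU interface.

@dataclass
class SensorFrame:
    source: str                                     # e.g. "camera", "radar", "lidar"
    detections: dict = field(default_factory=dict)  # object id -> label

@dataclass
class EnvironmentModel:
    objects: dict = field(default_factory=dict)

    def update(self, frames):
        # Naive fusion: merge detections from every sensor into one view,
        # so an object seen by several sensors appears exactly once.
        merged = {}
        for frame in frames:
            merged.update(frame.detections)
        self.objects = merged

# One cycle; the article says the real system runs this >50 times per second.
model = EnvironmentModel()
model.update([
    SensorFrame("camera", {1: "truck", 2: "lane_marking"}),
    SensorFrame("radar",  {1: "truck", 3: "car"}),
])
print(sorted(model.objects.items()))  # [(1, 'truck'), (2, 'lane_marking'), (3, 'car')]
```

Keying objects by a shared identifier is what lets a camera detection and a radar detection of the same truck collapse into a single tracked object rather than two.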

Continental argues that only through this view of the vehicle environment can vehicles such as trucks orientate themselves in their surroundings and make driving strategy decisions, for instance, by recognizing possible driving corridors. However, the information can be configured in many ways, and manufacturers have the flexibility to integrate the functionality into their systems.

Continental offers various radar sensors and cameras for environmental detection, and its development engineers are working on a 3D Flash LIDAR for passenger cars and commercial vehicles. The advantage of using a combination of different sensors is that each sensor has its individual strengths and detects different environmental parameters, providing a more reliable and accurate view of the environment. In addition to sensor data on other road users and static objects such as lane markings and traffic signs, further data flows into the model by means of connectivity technologies for vehicle-to-vehicle (V2V) and vehicle-to-everything (V2X) communication; HD maps and GPS, for example, provide exact positioning data about the vehicle. Systems like the dynamic eHorizon and traffic data from third-party providers can also take the entire traffic situation into account, such as a traffic jam up ahead or a mobile construction site. This creates a reliable image of the vehicle environment.
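One way to see why combining sensors with different strengths yields a more reliable view is to fuse their independent detection confidences for the same object: the object is missed only if every sensor misses it. The sketch below illustrates this under an independence assumption, with made-up confidence values; it is not Continental's actual fusion method.

```python
# Minimal sketch of fusing independent per-sensor detection confidences
# for one object. Probabilities are illustrative assumptions.

def fuse_existence(probs):
    """Combine independent detection probabilities: the fused miss
    probability is the product of the individual miss probabilities."""
    p_miss = 1.0
    for p in probs:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# A camera, radar and lidar each report a confidence for the same
# vehicle ahead; fusing them raises the overall confidence well above
# any single sensor's.
camera, radar, lidar = 0.7, 0.8, 0.9
print(round(fuse_existence([camera, radar, lidar]), 3))  # 0.994
```

The independence assumption is optimistic (sensors can share failure modes, e.g. heavy rain degrading both camera and lidar), which is one reason production systems use more elaborate fusion than this.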

Dr. Michael Ruf, head of Continental’s commercial vehicles and aftermarket business unit, said: “The environmental model complements Continental’s portfolio of components and subsystems for vehicle environment detection. With our sensors, our applications for vehicle connectivity and intelligent control units for automated driving, Continental will in future be in a position to offer its customers everything they need for the reliable detection of the vehicle environment, from a single source. To achieve this, we have used our experience in automated driving functions for passenger cars, more than ten years of experience in sensor applications for driver assistance functions for trucks and our in-depth systems expertise.”

Merging of information from a camera, LIDAR and RADAR to create an accurate view of the environment.

Perception of the vehicle environment by an automated truck.
