Call me old-fashioned, but autonomous driving is a concept that has simply never won me over. I picture myself driving through the streets of a big city like Boston, glancing over at the car in the next lane, and realizing there is nobody in the driver's seat. Driving in Boston is bad enough for one's mood as it is; I'm afraid I would not be able to keep from yelling.
Yet for all my skepticism about autonomous driving, I find myself unexpectedly fascinated by the idea of autonomous parking. Imagine how liberating it would be to step out of the driver's seat, tell your car "go find somewhere to park," and stroll off to the restaurant you're headed for. There would be no more excuses for showing up late to a date, or for staying home because parking is impossible to find.
Google and automakers such as Audi and Volvo have all demonstrated results from their autonomous-driving R&D. Audi showed a concept car with a self-parking feature at the International Consumer Electronics Show (CES) earlier this year, and Volvo released a video last month showcasing its latest autonomous parking technology.
The Swedish automaker says its first self-parking prototype "interacts safely and smoothly with other cars and pedestrians," navigating its way into a car park on its own, finding an open spot, and parking itself.
Of course, whenever a new technology comes out, nobody takes the vendors' claims at face value. Volvo's latest system, for example, is built on its Vehicle 2 Infrastructure (V2I) technology, which means it only works in a parking structure fitted with embedded transmitters; you cannot really turn the car loose to hunt for a spot entirely on its own.
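To make that V2I dependency concrete, here is a minimal sketch of the idea, assuming a hypothetical broadcast format in which embedded transmitters in the parking structure each advertise one bay and the car simply picks the nearest free one; none of the names or fields below come from Volvo's actual system.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical V2I broadcast: each embedded transmitter reports one parking bay.
@dataclass
class BayBeacon:
    bay_id: str
    x_m: float      # bay position in the car park's local frame, metres
    y_m: float
    occupied: bool

def choose_bay(beacons, car_x_m, car_y_m):
    """Pick the nearest free bay advertised by the infrastructure.

    Returns None when the infrastructure reports no free bays, which is
    exactly the case a V2I-only system cannot handle on an ordinary street.
    """
    free = [b for b in beacons if not b.occupied]
    if not free:
        return None
    return min(free, key=lambda b: hypot(b.x_m - car_x_m, b.y_m - car_y_m))

# Example: two beacons received from the car park, car at the entrance (0, 0).
beacons = [BayBeacon("A-12", 35.0, 4.0, True), BayBeacon("B-03", 18.0, 9.5, False)]
print(choose_bay(beacons, 0.0, 0.0))   # -> BayBeacon(bay_id='B-03', ...)
```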
That is genuinely disappointing, because what drew me to the self-parking concept in the first place was the mistaken assumption that I would never again have to circle block after block looking for a spot. What was I thinking?
Think about how long it would take before every parking lot and every stretch of street is equipped with smart infrastructure (like V2I) capable of wirelessly exchanging all the necessary vehicle data: probably longer than you imagine. And although V2I technology is designed primarily to avoid or mitigate collisions, Volvo says its prototype does not rely on it to steer clear of pedestrians or runaway shopping carts; the system handles those situations with the car's own built-in sensors.
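As a rough illustration of what "built-in sensors" means here, the sketch below assumes a ring of ultrasonic range sensors and commands a stop whenever any reading falls below a safety margin; the threshold and speeds are invented for the example, not Volvo specifications.

```python
# Hypothetical safety check on the car's own ultrasonic ring: stop if anything
# (a pedestrian, a stray shopping cart) gets too close while creeping forward.
STOP_DISTANCE_M = 0.8      # assumed safety margin, not a Volvo specification
CREEP_SPEED_MPS = 1.0

def next_speed(ultrasonic_ranges_m):
    """Return the commanded speed given the latest range readings (metres)."""
    if any(r < STOP_DISTANCE_M for r in ultrasonic_ranges_m):
        return 0.0          # brake and wait for the obstacle to clear
    return CREEP_SPEED_MPS  # otherwise keep creeping toward the chosen bay

print(next_speed([2.4, 1.9, 3.1, 2.8]))   # clear -> 1.0
print(next_speed([2.4, 0.6, 3.1, 2.8]))   # cart in the way -> 0.0
```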
I can imagine what a big business building cars with "eyes" will be for the electronics companies that supply automakers. According to a report published in April by market research firm IHS, applications such as lane-departure warning and self-parking will be among the main growth drivers for the embedded vision market this year; embedded vision technology helps machines see and interpret data through computer vision software.
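Lane-departure warning is the textbook embedded-vision workload, so a minimal sketch may help make the idea concrete. The snippet below, which assumes OpenCV and a single dashcam frame saved as dashcam_frame.jpg, finds candidate lane markings with edge detection and a probabilistic Hough transform; it illustrates the kind of processing IHS is counting, not any particular vendor's pipeline.

```python
import cv2
import numpy as np

def find_lane_segments(frame_bgr):
    """Very rough lane-marking detector: grey -> edges -> Hough line segments."""
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(grey, 50, 150)
    # Probabilistic Hough transform returns candidate line segments (x1, y1, x2, y2).
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                               minLineLength=60, maxLineGap=20)
    return [] if segments is None else [tuple(s[0]) for s in segments]

# Usage: feed it one frame from a dashcam; a real system would then check
# whether the detected markings are drifting toward the centre of the image.
frame = cv2.imread("dashcam_frame.jpg")
if frame is not None:
    print(len(find_lane_segments(frame)), "candidate lane segments")
```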
IHS estimates that revenue from "special-purpose computer vision processors for automotive applications" will grow from US$126 million in 2011 and US$137 million in 2012 to US$151 million in 2013, and will reach US$187 million by 2016, a sign of solid prospects for embedded vision.
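As a quick sanity check on those figures, the implied growth works out to roughly 8 percent a year; a minimal calculation using only the revenue numbers quoted above:

```python
# Compound annual growth rate implied by the IHS figures quoted above.
revenue_musd = {2011: 126, 2012: 137, 2013: 151, 2016: 187}

cagr = (revenue_musd[2016] / revenue_musd[2011]) ** (1 / (2016 - 2011)) - 1
print(f"2011-2016 CAGR: {cagr:.1%}")                                           # about 8.2%
print(f"2012-2013 growth: {revenue_musd[2013] / revenue_musd[2012] - 1:.1%}")   # about 10.2%
```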
But exactly which underlying technologies let a car "see"? And what challenges does self-parking still face? Kevin Tanaka, senior manager of worldwide automotive marketing and product planning at Xilinx, put it this way: "While there is the start of some V2I out there, it's still very limited worldwide at the moment, so no automaker is really relying on that for autonomous parking in its current state."
There are trials underway in Germany, he said, in which cars communicate with an electronic parking facility that tells them which spaces are open, "but it's, again, very, very early." Google has its self-driving car as well, but it "does not utilize V2I, but rather a huge range of sensors," including cameras, radar, lidar (light detection and ranging), some ultrasonics, and mapping programs.
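To give a feel for how readings from such a mixed sensor set might be reconciled, here is a toy fusion step that merges camera, radar, and lidar detections into a single obstacle list when they fall close together; the data layout and the 1.5 m merge radius are invented for the illustration and are not taken from Google's system.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str     # "camera", "radar", "lidar", "ultrasonic"
    x_m: float      # position in the vehicle frame, metres
    y_m: float

def fuse(detections, merge_radius_m=1.5):
    """Greedy fusion: group detections closer than merge_radius_m into one obstacle."""
    obstacles = []   # each obstacle is a list of detections believed to be the same object
    for det in detections:
        for group in obstacles:
            gx = sum(d.x_m for d in group) / len(group)
            gy = sum(d.y_m for d in group) / len(group)
            if (det.x_m - gx) ** 2 + (det.y_m - gy) ** 2 <= merge_radius_m ** 2:
                group.append(det)
                break
        else:
            obstacles.append([det])
    return obstacles

dets = [Detection("radar", 12.1, 0.3), Detection("camera", 12.4, 0.1),
        Detection("lidar", 30.0, -4.2)]
print(len(fuse(dets)), "fused obstacles")   # -> 2
```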
In today's production systems, Tanaka added, automakers use combinations of radar, ultrasonic sensors, and cameras: "There is an incredible amount of parallel processing power that needs to be done to process the sensor data, run algorithms, and ultimately coordinate gearbox, steering, acceleration, and braking controls." That, he said, is why this application area has such a huge demand for programmable SoCs; Xilinx Automotive FPGAs and Zynq-7000 All Programmable SoCs are already used in many radar and camera programs today, and they will continue to play a key role.
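Tanaka's point about parallel processing can be sketched as a single control cycle in which the per-sensor processing is fanned out concurrently and the results are then turned into coordinated gearbox, steering, throttle, and brake commands. On a programmable SoC this work would sit in hardware pipelines; the thread pool and the made-up sensor outputs below are only a software stand-in for the idea.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-ins for the per-sensor processing stages that run in parallel.
def process_radar(frame):      return {"nearest_obstacle_m": 7.5}
def process_ultrasonic(frame): return {"kerb_clearance_m": 0.4}
def process_camera(frame):     return {"bay_line_offset_m": 0.1}

def control_step(radar_frame, ultra_frame, cam_frame):
    """One cycle: fan out sensor processing, then derive actuator commands."""
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = {
            "radar": pool.submit(process_radar, radar_frame),
            "ultra": pool.submit(process_ultrasonic, ultra_frame),
            "cam": pool.submit(process_camera, cam_frame),
        }
        radar, ultra, cam = (futures[k].result() for k in ("radar", "ultra", "cam"))

    # Toy coordination of gearbox, steering, throttle and brakes.
    return {
        "gear": "D",
        "steering_deg": -20.0 * cam["bay_line_offset_m"],              # steer back toward the bay line
        "throttle": 0.0 if radar["nearest_obstacle_m"] < 1.0 else 0.1,
        "brake": 1.0 if ultra["kerb_clearance_m"] < 0.2 else 0.0,
    }

print(control_step(None, None, None))
```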
This article is authorized for compilation from EE Times. All rights reserved; reproduction without permission is prohibited.
Compiled by: Judith Cheng
Original English article: Can a Car Find a Parking Spot by Itself?, by Junko Yoshida
Related reading:
• Robots can now stand in for humans in dangerous test drives
• The US has opened driverless-car testing, but licensed road use is still some way off
• The ultimate evolution of the "connected car" is autonomous driving
Editor in charge: Quentin