【Notes on the Automatic Parsing of Esperanto's "Second Book"】

As a weekend hobby, I recently revived my Esperanto parser (an automatic syntactic parser for Esperanto). Turning a platform built for "natural" languages onto a "constructed" language is truly a case of overkill, a "dimensional-reduction strike", like the US bombing Iraq. Let's first find a simple sentence to warm up:

Esperanto's First Book (Unua Libro) is a bit too simple, so let us seriously put the master butcher's blade to work on the Second Book (Dua Libro), LOL.

The first sentence of 《La Dua Libro》:

Elirante ankorau unu fojon antau la estimata publiko, mi sentas la devon antau chio danki la legantan publikon por la viva kunsento, kiun ghi montris por mia afero.

This is essentially the kind of syntactic analysis any foreign-language grammar teacher would walk through in class, except that here it is produced automatically by the machine. Specifically:

1. This first sentence of Zamenhof's Second Book is a complex sentence.
2. The backbone of the main clause is "mi sentas la devon", where "mi" is the subject, "sentas" the predicate verb, and "la devon" the object.
3. The main clause's adverbial is an adverbial participial phrase, "Elirante ankorau unu fojon antau la estimata publiko", in which "ankorau unu fojon" is an adverbial of the participle, and the prepositional phrase "antau la estimata publiko" is another adverbial of it.
4. The main-clause object "la devon" carries a relative (attributive) clause, "kiun ghi montris por mia afero", in which "montris" is the clause's predicate verb, "kiun" its object, "ghi" its subject, and the prepositional phrase "por mia afero" its adverbial.
5. At a finer grain, the parser can also analyze the determiners (e.g. "la") and attributive modifiers (e.g. "estimata") inside the NPs and PPs, and so on.
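The main-clause analysis above can be pictured as a small dependency tree. Below is a minimal sketch of one possible representation; the `Token` dataclass and relation labels are my own simplification for illustration, not the actual parser's output format.

```python
# A toy dependency representation of "mi sentas la devon"
# (mi = subject, sentas = root verb, la = determiner, devon = object).
from dataclasses import dataclass

@dataclass
class Token:
    form: str
    head: int      # index of the governing token, -1 for the root
    deprel: str    # dependency relation label

sent = [
    Token("mi", 1, "subj"),
    Token("sentas", -1, "root"),
    Token("la", 3, "det"),
    Token("devon", 1, "obj"),
]

def dependents(tree, rel):
    """Return the forms of all tokens bearing the given relation."""
    return [t.form for t in tree if t.deprel == rel]

print(dependents(sent, "subj"))  # ['mi']
print(dependents(sent, "obj"))   # ['devon']
```

The same skeleton extends naturally to the participial adverbial and the relative clause: each clause contributes its own verb as a sub-root attached to the main verb or to the noun it modifies.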

Apparently Zamenhof liked complex sentence patterns. Perhaps, since this was already the Second Book, he did so deliberately, to demonstrate the language's capacity for expressing complex concepts and relations.

La multaj promesoj, kiujn mi ricevas, kaj el kiuj tre granda parto estas subskribita "senkondiĉe", la leteroj kun kuraĝigoj aŭ konsiloj — ĉio tio ĉi montras al mi, ke mia profunda kredo je l' homaro min ne trompis.

That is the second sentence, which is likewise long and complex. "chio tio chi" — why not simply "chio chi"?

La bona genio de la homaro vekighis: de chiuj flankoj al la laboro chiuhoma venas amasoj, kiuj ordinare estas tiel maldiligentaj por chia nova afero;

The good genius of mankind has awakened: from all sides to the work of every man come masses who are ordinarily so lazy for every new cause;

This passage of the Second Book hails the awakening of human reason in accepting and embracing the newborn Esperanto.

kiam pasus la jaro, mi intencis eldoni libreton, en kiu estus analizitaj chiuj pensoj esprimitaj de la publiko, kaj uzinte tiujn, kiuj efektive estus bonaj, mi donus al la lingvo la finan formon, kaj post tio chi oni jam povus komenci la eldonon de plenaj vortaroj, libroj, gazetoj kaj cetere, char tiam la lingvo jam estus trairinta la jughon de la tuta mondo, kaj chiuj plej gravaj malbonajhoj, kiuj povus esti trovitaj en ghi, char en verko de unu homo, — estus jam pli au malpli forigitaj.

Zamenhof's next sentence is extremely long and complex: a single sentence containing no fewer than 10 clauses. I do not know whether the old master was showing off or simply forgot to break it up. Another, quite possibly subconscious, motive is that he wanted to compose complex thoughts in combination to showcase the richness of Esperanto's expressive devices. A sentence this complex poses real difficulty for parsing, whether by machine or by human. No need to whitewash the sage: in "chiuj plej gravaj malbonajhoj, kiuj povus esti trovitaj en ghi, 【char en verko de unu homo, — 】estus jam pli au malpli forigitaj", I have marked with 【...】 a small flaw in the jade. This awkwardly over-elided causal adverbial clause, and the punctuation around it, read less lucidly than Zamenhof's other writing.

Remarkably, Google's machine translation handles it quite well, which of course benefits from a certain homogeneity of sentence patterns across European languages.

when the year was over, I intended to publish a booklet in which all the thoughts expressed by the public would be analyzed, and using those that would actually be good, I would give the language its final form, after which the publication of full dictionaries, books, newspapers, and so on, for by that time the language would have passed the judgment of the whole world, and all the most important evils that could be found in it, because in the work of one man, would have been more or less removed.

I left the English machine translation untouched, as there was nothing worth correcting; the Chinese machine translation, however, has some defects. The original machine output was:

当这一年结束时,我打算出版一本小册子,对公众表达的所有思想进行分析,并使用那些实际上会很好的思想,给该语言提供最终形式,然后出版该语言。 完整的词典,书籍,报纸等,因为那时该语言已经通过了整个世界的判断,并且其中可能发现的所有最重要的弊端,因为在一个人的工作中,或多或少都会被消除。

My edited version:

当这一年结束时,我打算出版一本小册子,其中对公众表达的所有思想做了分析,并采纳那些实际上会有很好效果的建议,我会给该语言确定最终形式;此后人们就已经可以开始编纂该语言的完整词典、书籍、报纸杂志等,因为届时该语言应该已经通过了全世界的审阅,并且其中所能够发现的由于本来只是一人之力而难免存在的最重要的弊端,或多或少地已被消除了 。

In all fairness, for a sentence this complex, even somewhat wordy and flawed, the Chinese machine translation basically gets the meaning across, which is already quite impressive. The main error in Google's output is at "...然后出版该语言。": that period should not be there, and what gets published is not the language itself but its dictionaries and so on.

Most of the 10 clauses are adverbial and relative clauses; roughly marked up, they are as follows (one of the char/because clauses was originally a full causal adverbial clause, but the old master elided so much that only a prepositional phrase remains, which reads somewhat awkwardly):

(1)[kiam pasus la jaro], (2) [mi intencis eldoni libreton], (3) [en kiu estus analizitaj chiuj pensoj esprimitaj de la publiko], (4) kaj uzinte tiujn, [kiuj efektive estus bonaj], (5) [mi donus al la lingvo la finan formon], (6) [kaj post tio chi oni jam povus komenci la eldonon de plenaj vortaroj, libroj, gazetoj kaj cetere], (7) [char tiam la lingvo jam estus trairinta la jughon de la tuta mondo], (8) [kaj chiuj plej gravaj malbonajhoj, (9) [kiuj povus esti trovitaj en ghi], (10) [char en verko de unu homo], — estus jam pli au malpli forigitaj].
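As a toy illustration of how a preprocessor might approximate these clause boundaries, one can simply flag the Esperanto subordinators and relative pronouns (here in the ASCII h-convention used throughout this post). This is of course far cruder than the real parser; the word list is my own simplification.

```python
# Toy clause-boundary hint: flag subordinators / relative pronouns.
import re

SUBORDINATORS = {"kiam", "kiu", "kiuj", "kiun", "kiujn", "ke", "char"}

def clause_markers(sentence):
    """Return clause-introducing words found in the sentence, in order."""
    words = re.findall(r"[a-zA-Z']+", sentence.lower())
    return [w for w in words if w in SUBORDINATORS]

s = ("kiam pasus la jaro, mi intencis eldoni libreton, en kiu estus "
     "analizitaj chiuj pensoj esprimitaj de la publiko")
print(clause_markers(s))  # ['kiam', 'kiu']
```

Real clause segmentation also needs coordination ("kaj ...") and elliptical cases like clause (10), which is exactly where Zamenhof's over-elided "char" phrase trips up both humans and machines.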

At last I got the parser working on this hugely complex 10-clause Esperanto sentence that old Zamenhof wrote more than 100 years ago:

 

The historical background: on July 26, 1887, under the pen name D-ro Esperanto ("Doctor Hopeful"), Zamenhof for the first time published the scheme of the international language he had been brewing, and using himself, for decades. Known as the First Book (Unua Libro), it was a reader with Russian and Esperanto in parallel. Since almost no earlier constructed-language scheme had ever caught on, Zamenhof was quite anxious. Perhaps the historical timing was right, or perhaps heaven rewarded his hard work and lofty conviction: Esperanto quickly began to spread, and with popularity came many questions and challenges. A great many language enthusiasts, including a number of linguists and some who had themselves concocted constructed-language schemes behind closed doors, began proposing all sorts of modifications, convinced that only major surgery could perfect the language. Everyone had a pile of reasons, and everyone was insistent: some wanted additional word-ending forms, others firmly opposed endings altogether, and so on. Zamenhof kept answering these questions and, for the sake of spreading Esperanto, was prepared to make major concessions and changes. In 1889 he compiled these answers, together with the foreword quoted above, into this Second Book (Dua Libro), and then declared that from that point on he completely renounced his author's rights over Esperanto and would no longer be in charge of it. The authority to modify and interpret Esperanto passed to the editorial offices of Esperanto periodicals and, later, to the Esperanto associations that were founded. After taking over, the organizations debated various reform proposals, but in the end, apart from a few minor changes, essentially kept the original scheme.

 

【Related】

【The First Book】is the "Bible" of Esperanto

Tesla autopilot and its QA issues

Another news report of a Tesla "losing control" has come from China, this time causing 2 deaths and 6 injuries. This year there have been three such incidents in China; in one, a car rushed into a gas station in Shanghai and injured two people.

Tesla has been reported many times for "suddenly speeding up and losing control". Every time, Tesla argues from design principles that this is impossible, while the driver holds his own strong opinions. It seems to have become a great unsolved mystery. Car owners usually insist that the car was indeed out of control. The typical report: a sudden speed-up horrified the driver, who tried to brake in vain, or had no time to brake at all, causing an accident. Tesla usually replies that the speed-up is a feature of autopilot, not a bug; that the accident was caused by improper operation or carelessness on the driver's part; and that the brakes never fail (unlike Toyota at one time, which admitted a brake defect causing failures and had to recall millions of cars). An attentive driver can always take over control at any time. Tesla fans, who are themselves car owners, often side with Tesla and condemn the drivers involved, a phenomenon commonly seen in discussions on Tesla's Facebook fan club pages.

As a Tesla owner as well as a tech geek who has "played with" Tesla (perhaps the biggest gadget toy of my life) for almost a year, I have to say that both sides have their own reasonable narratives. Indeed, almost every case ultimately turns out not to be a real bug. Technically speaking, an "out-of-control speed-up" is not even possible in Tesla's engineering design. Acceleration is by nature part of the definition of any automatic driving, autopilot included. After all, can there be an automatic-driving monster with only deceleration and parking features? So any speed-up can be argued to be an innate feature instead of a bug. Although autopilot is software-controlled hardware operation, and "bugs" are therefore inevitable, at least up to now no one has been able to prove that Tesla's automatic driving has an "out-of-control bug". In fact, "going out of control" is probably a false proposition in software engineering to begin with.

However, on the other hand, Tesla drivers clearly know that they "feel" out of control, and there is no need to question that feeling. In fact, Tesla owners have all had such personal experiences, to varying degrees. Tesla, as a manufacturer, bears its own responsibility for failing to greatly reduce (if not eliminate) the many scenarios that make drivers "feel" out of control. It's not that they haven't tried to address this, but they always (have to) rush new versions out over-the-air (OTA). Due to the incremental nature of software training and engineering in general, Tesla has little time to take care of the "feelings" of all customers, some of whom know little about software and, in a state of panic, think the machine has simply gone crazy. In fact, there is no standard for which kind of speed-up counts as "out of control" (speed limit + alpha may be regarded as the absolute upper limit, which Tesla never exceeds). As long as the speed-up stays within the preset upper limit, it can always be argued to be a feature rather than a bug. Not everyone has the same tolerance for surprise changes of speed, so a driver reporting loss of control is as sincere as the many people who claim to have seen a UFO. The feeling is real, but the felt world is not necessarily the objective world.

I have actually made a serious study of Tesla's reported sudden-acceleration issues. Most of them look like a mixture of misunderstanding and illusion; a genuine loss of control like Toyota's brake failure years ago has never been verified on a Tesla. The so-called sudden acceleration typically happens in the following scenarios.

In automatic driving, if the road ahead is clear, the car will accelerate up to the set speed limit. This is, of course, a feature, not a bug. But in early Teslas, when the car in front suddenly moved to the neighboring lane, the Tesla would accelerate suddenly and quickly, which really made people feel scared and out of control. Later software updates began to moderate the pace of acceleration, taking better care of people's feelings, and effectively reduced the "out-of-control" complaints.
 
More specifically, Tesla's assisted automatic driving currently has two states: one is "traffic-aware cruise control", which controls only the speed but not the steering wheel; the other is the so-called "autopilot", which also controls the steering wheel. Note that there are clear chimes for entering and exiting autopilot, but so far there is no audible prompt for entering or exiting cruise control (there are indicators on the screen, but no alert sound). Sometimes drivers forget that they are in cruise control, especially after they have been using the accelerator pedal for a while; releasing the foot from the accelerator then triggers a machine takeover. That scenario easily creates the illusion that although they are driving (holding the steering wheel and having just used the gas pedal), the car is accelerating out of control by itself! This has happened to me a few times, and over time I have learned to get used to this human-machine interaction without panic.

In any case, people can take back control at any time. As long as you don't panic, the brakes can stop the acceleration immediately. In addition, automatic emergency braking kicks in whenever Tesla detects a collision risk. However, emergency braking only works in emergencies, and hidden bugs have been found where it fails to cover all emergency cases. There are misjudgments: for example, mistaking a big white truck standing still ahead for a white cloud in the blue sky, and bumping right into it; or, in another known case, when the obstacle ahead is a police car parked on the highway shoulder that straddles the lane line rather than blocking the entire lane. Tesla cannot always judge whether to steer clear of such a police car, and numerous such accidents have been reported, to the point that netizens joke that Tesla loves to challenge the police by hitting them. These are known bugs of the current Tesla autopilot, which cannot prevent collisions 100%.

On the other hand, giving few prompts seems to be a deliberate design choice, a feature for "seamless human-machine coupling" (interaction and cooperation). Tesla drivers can take over speed control by stepping on the throttle at any time, whether in cruise control or in autopilot. What drivers need to be warned about in advance is that as soon as the throttle is released, the machine takes over, and the machine will then speed up toward the preset maximum speed limit according to road conditions. This sudden, seamless machine takeover often startles people and makes them feel out of control when acceleration exceeds expectations.

In fact, there is a solution to this problem, and I don't know why Tesla has not implemented it properly. Perhaps Tesla overemphasized seamlessness in human-machine coupling and tolerated the side effects. The solution is fairly straightforward. First, entering and exiting automatic cruise should also trigger some kind of chime by default, or at least this prompt should be configurable on or off. Drivers should always know, without having to figure it out, whether the Tesla is under their own control or the machine's at any given moment. Second, after the machine takes over, even if the road is completely clear and even the adjacent lanes are free of traffic, the acceleration should not be abrupt; it should proceed gradually, considerate of human feelings, not just the objective requirements of safe driving maneuvers. With these two things done, I believe the above-mentioned "out-of-control" reports would be greatly reduced. Is anyone here a Tesla insider? Please help deliver these suggestions to Tesla, to reduce complaints and make Tesla safer and more user-friendly.
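The second suggestion, gradual acceleration after takeover, can be sketched as a simple rate limiter. This is a hypothetical illustration of the idea, not Tesla's actual control code; the step size and units are arbitrary assumptions.

```python
# Hypothetical comfort ramp: move toward the target speed in capped steps
# instead of jumping, so a machine takeover never feels abrupt.
def ramp_speed(current, target, max_step=2.0):
    """Return the next speed, changing by at most max_step per control tick."""
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step

# Example: machine takes over at 30 mph with a 65 mph limit on a clear road.
speeds = [30.0]
while speeds[-1] != 65.0:
    speeds.append(ramp_speed(speeds[-1], 65.0))
print(speeds[:4])  # [30.0, 32.0, 34.0, 36.0]
```

Real controllers would of course shape jerk as well as acceleration, but even this crude cap captures the point: the ceiling is reached gradually, so the driver's "feeling" of control is preserved without sacrificing the speed-limit feature.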

In fact, phantom braking is a more annoying Tesla problem than sudden acceleration. It is called "phantom" because it is usually hard to determine what triggered it. It used to happen frequently: shadows on the road, direct sunlight, and so on could all cause phantom braking, startling people and increasing the risk of rear-end collision. With the autopilot software continuously retrained on more data and updated, phantom-braking cases have decreased significantly, though they still occur occasionally, so there is indeed an adaptation process to be prepared for.

After almost a year as a Tesla owner, how often do I "feel out of control" when driving it? Phantom braking aside, I have experienced about four or five unexpected abrupt speed-ups or lane swings in 9+ months of driving. Each time I feel a little shaken, but with a prepared mindset I have been able to take back control safely every time. I can imagine, though, what might happen to a newbie with an unprepared mind.

Software engineering is by nature incremental; it is normal for software to be imperfect and immature, perpetually a work in progress. When an immature product is put on the market, disputes are inevitable. Amazingly, Elon Musk, unafraid of risk, can withstand disputes involving life and death while Tesla remains a stock-market darling. Part of the reason, I know, is the wonder of innovation realized in such futuristic products, way ahead of the competition.

Tesla's QA (Quality Assurance) is far from satisfactory, and it definitely does not reach the stringent standards often seen at software giants in the IT industry. Part of the reason, I guess, is the big boss. Musk "whips" the engineering team for speed every day. He is known as a tough boss, putting tons of pressure on autopilot development and boasting too many times about features that are far from complete. Under the pressure of such a boss and the stimulation of stock options, how can programmers afford the luxury of pursuing top-level QA? Hence Tesla's over-the-air software updates, pushed as frequently as once every couple of weeks, often take two steps forward and one step back. Regression bugs, as well as even the smallest enhancements, are reported by passionate users all over social media, a phenomenon quite unique in the software world. But in its corporate culture, Tesla can hardly afford to slow down feature development, nor can it afford a long QA process to ensure safety.

I have always loved new gadgets, and Tesla is the most recent one. To tell the truth, the fun brought by this big toy is completely beyond my imagination. Automatic driving, usually beyond a consumer's reach, is now something I play with every day, thanks to Tesla's autopilot. I take great delight in driving such a supercomputer around the valley, often purely to test all the new features and experience real-life AI embodied in the car. In fact, the fun grows when I find "bugs" in the process, because then I am always looking forward to the next upgrade, hoping the issues have been resolved; each upgrade brings a full load of surprises, good or bad. I often go online to check which version is the latest release, and I install a new upgrade as soon as I get the notice (I seem to be in the top 5-10% of users who receive upgrades, and I often envy the beta testers who are always the first to try a big new version). Among the many feature updates since I bought the Tesla are regenerative-braking-supported single-pedal driving (such a joy once you experience the convenience and benefits of it), performance improvements in automatic lane changing, and responses to traffic lights. A recent upgrade added automatic window closing on locking, which is also very good and solved my worry about forgetting to close the windows once and for all. This is another convenience from Tesla, after its long-standing automatic door unlocking and locking feature.

The process is more important than the result. If it were a perfect robot, a finished product from the future world, I would feel like a fool sitting inside: it would have nothing to do with me, I could not get engaged, I would be just another target to serve. You would be you, and the vehicle a vehicle, no different from any other tool we use and forget in life. The current ongoing experience is different: we are coupled with the vehicle seamlessly, and Tesla not only seems like a living friend but often feels like an extension of ourselves. This kind of "man-machine coupling" proves the most fun to a tech geek. No wonder engineers in the valley became the first large wave of Tesla owners. It is an unspeakable joy to drive a supercomputer around every day while the machine pops up "bugs" from time to time (not necessarily engineering bugs in the strict sense). Although you cannot drill into the software to debug it, you can evaluate, guess, and imagine what caused the problem. There are incredible pleasant surprises, too. For example, autopilot at night and in storms used to be thought of as Tesla's worst nightmare, yet it actually performs exceptionally well. I was caught on the road in two heavy storms that made it very difficult for me to keep control myself. I then tried autopilot. As a result, automatic driving turned out to be more stable, slowing down automatically and sticking steadfastly to the lane. As for night driving on the freeway, I found it the safest. I bet you could really sleep or nap for an extended period inside the car with autopilot on (of course we cannot do that now; it is against traffic law). One big reason for night-driving safety is that at night, crazy and wild driving and roaring motorcycles are almost extinct, and everyone follows the rules, focused on getting home. Night driving is often monotonous and lengthy for a human driver, and we are prone to fatigued driving. But autopilot does not know fatigue; for it there is really no challenge at all. Machines do not necessarily find difficult what people find difficult.

 

thanks to Sougou Translate from https://liweinlp.com/?p=7094

 

 

Thoughts on "Tesla out of control again"

News of another Tesla losing control has arrived via WeChat groups, a tragic one this time, with 2 dead and 6 injured: a Tesla lost control in Sichuan, killing 2 and injuring 6! Three such incidents have already occurred this year; in Shanghai, a Tesla once rushed into a gas station and injured 2 people.

Teslas speeding up on their own and losing control have been reported many times. Each time, Tesla argues from first principles that it is impossible; owner and Tesla each stick to their own story, and it has become one of the great unsolved cases of our time. The owner usually says the car was plainly out of control: it suddenly and inexplicably sped up, the owner panicked, scrambled to brake, failed, and an accident resulted. Tesla says the speed-up is a feature of the automatic-driving state, not a bug; the "loss of control" was an accident caused by the owner's improper operation or carelessness; the brakes never fail; and taking over is always possible. Tesla's many fans, themselves owners, tend to stand with Tesla and condemn the driver at fault. It is a curious phenomenon: the proletariat, instead of defending its proletarian brothers and their rights, sides with the capitalist and kicks them while they're down. But it really is very common; you see such comments all the time on the big Tesla Facebook pages.

As a tech geek and an owner who has "played with" a Tesla for almost a year, I have to say both sides have a point. Almost every case can ultimately be shown to involve a feature rather than a bug; purely technically, self-driving cannot "lose control" by design. Acceleration is part of the very definition of automatic driving: what kind of automatic driving would have only deceleration and parking? Granted, "proving a negative is hard", and since automatic driving is software controlling hardware, bugs are inevitable; but at least so far no one has been able to confirm that Tesla's automatic driving has loss-of-control bugs, and from an engineering standpoint this may well be a false proposition.

But on the other hand, owners plainly did "feel" out of control, and that feeling is not to be doubted; in fact owners have all had such first-hand experiences to varying degrees. Tesla, as the manufacturer, bears its own responsibility for failing to greatly reduce or eliminate the situations that give owners this feeling. It is not that no effort was made, but the constant rush to ship, combined with the incremental nature of software training, means Tesla cannot possibly accommodate every owner's "feelings". What is a non-issue for a techie may, for an ordinary owner, be the machine going mad and total loss of control. There is in fact no standard for how much acceleration counts as losing control (speed limit + increment alpha can be taken as the absolute ceiling); anything within that ceiling can be argued to be a feature, not a bug. Everyone's heart has a different tolerance for unexpected self-driving behavior and surprises, so the owner who reports losing control is as sincere as the person who saw a UFO. Feelings do not lie, but the felt world is not necessarily the objective world.

I have seriously investigated the reports of Teslas suddenly accelerating. Most are indeed misunderstanding or illusion; a total loss of control like Toyota's proven brake failure has never been confirmed on a Tesla. So-called automatic acceleration falls roughly into the following situations.

    • In the automatic-driving state, if the traffic ahead clears, the car gradually accelerates up to the set speed limit. This is of course a feature, not a bug. But in early Teslas, when the car ahead suddenly moved into the neighboring lane, the Tesla would accelerate quickly, which was indeed somewhat terrifying. Later software updates began to cap the sudden acceleration, switching to gradual acceleration out of regard for how people feel, and only then did the "out of control" complaints decrease.
    • Assisted automatic driving has two states: "automatic cruise" (traffic-aware cruise control, which controls speed only, not the steering wheel) and so-called "automatic driving" (autopilot, in which the steering wheel is also machine-controlled). Entering and leaving cruise control carries no obvious prompt (there is an indicator on the screen but no chime, whereas entering and leaving autopilot both give clear audible prompts). Sometimes the driver forgets being in this state, creating the illusion that although he is driving (hands on the wheel), the car has gone out of control and is accelerating on its own. In any case, a person can take over at any time; the total-loss-of-control story has so far remained legend or rumor, and as long as you do not panic, the brake immediately stops the speed-up. And not just the brake: if there is an obstacle ahead, or the car is about to hit the vehicle in front, it brakes automatically. This automatic-braking feature, however, still has hidden risks and bugs, with misjudgments (e.g. mistaking a stationary big white truck ahead for a white cloud in the blue sky and ramming straight into it; or, when the obstacle ahead is a police car straddling the lane line rather than occupying the whole lane, sometimes failing to judge whether the overlap requires evasion and hitting the police car; these are known bugs and shortcomings of the current automatic-driving model), so not 100% of accidents can be avoided.

Specifically, the illusion of sudden acceleration most often occurs when the owner lifts his foot off the accelerator. As a "seamless handover" feature, whether in automatic cruise or automatic driving, a person can step on the accelerator at any time to take over speed control. But once the accelerator is released, the machine takes over, and it then accelerates according to road conditions within the preset top-speed limit. This sudden, seamless machine takeover and speed-up often gives people a fright; it feels exactly like losing control. There is actually a solution to this, and I do not know why Tesla has not done it properly; perhaps it is a side effect of overemphasizing "human-machine coupling, seamless handover". The solution: first, entering and leaving automatic cruise should also give some audible prompt by default, or at least make such a prompt configurable; it must not be truly "seamless", leaving people unable to tell at any moment whether they or the machine are in control. Second, after the machine takes over, even on a wide-open road, even with no traffic in the adjacent lanes, acceleration must not be too fast; it should still proceed step by step, without racing to save a second, out of regard for human feelings and not merely the objective requirements of safe driving. With these two things done, the "loss of control" problem above would be greatly reduced. Is anyone here a Tesla insider? Please pass these suggestions up the chain, to spare your Tesla more complaints and to make its self-driving feel more "human".

For a techie, for someone who treats self-driving as a plaything and a research object, every surprise invites an analysis of its cause. The vast majority of surprises, whether acceleration or sudden deceleration/braking (so-called phantom braking), eventually find some reasonable explanation; they really are features. But some cannot be explained. Then the techie, drawing on years of experience as both a maker and a user of software, knows that unsolvable cases are also the norm; at the very least one cannot attribute the unexplained to bugs, because bugs require proof or debugging (impossible for non-insiders, so owners in fact cannot prove them), while failing to explain is merely ignorance. And ignorance is not strictly equivalent to genuine unsolvability; more often one has simply run out of patience, or no longer feels it worth pursuing. Yet all of this, for an old lady or a newcomer behind a Tesla's wheel, is loss of control pure and simple.

Compared with sudden acceleration, phantom braking is a bigger headache on a Tesla. It is called "phantom" precisely because it is usually hard to determine what triggered it. It used to happen at high frequency: shadows on the road, direct sunlight, and so on could all cause misjudgments, and the car would suddenly decelerate, startling people and increasing the chance of being rear-ended. With the continual updates to automatic driving, phantom braking has begun to decrease markedly, but it is still occasionally encountered; there really is an adaptation process.

A bit of personal experience: in nearly a year of driving, how many times have I truly "felt out of control" with no reasonable explanation found, phantom braking aside? About four or five (sudden acceleration or lane swaying). Each time was somewhat unnerving, but each time I managed to take over: a scare, but no harm.

It is the nature of software to be imperfect and immature, forever in process. Pushing an unfinished thing to market inevitably invites disputes. What is astonishing is that amid disputes where lives are at stake, Tesla stands firm and is even wildly celebrated by the stock market, a spectacle in itself. Who else has nerves as thick as Musk's?

Tesla's QA (quality assurance) is barely passable at best, certainly not the industry's rigorous high standard. Part of the reason, I suspect, is the boss breathing down their necks. Musk whips his engineers daily; the grand promises he made years ago keep slipping past their deadlines, which has itself become a Musk-level running joke. Under such circumstances his management instinct is to pile on pressure. With the boss pressing and enormous stock incentives dangling, how could the code monkeys possibly achieve high-standard QA management? Hence Tesla's over-the-air (OTA) software updates, pushed about every two weeks, often take two steps forward and one step back. Regressions are everywhere, something rare at major software houses. But it cannot stop its pace, and the company culture Musk has stamped on it makes that rhythm hard to change. Those he satisfies most are the techies, who scramble to be the guinea pigs, dying to play with new features first even at the risk of regressions. But this rhythm is really not suited to rapid mass adoption.

I have always loved gadgets; back in the day I was obsessed with all the Apple iPods too. But honestly, even though I tease Musk now and then, the fun this big toy brings me completely exceeds imagination. Automatic driving, once out of reach, is now something I play with every day. I cruise around the valley for no particular reason (my usual routes are 280, 880, 680, 92 to Half Moon Bay, and 17 to Santa Cruz, plus of course Pacific Coast Highway 1 and some expressways; I am not fond of the chaotic 101). Short of riding a rocket, probably nothing is more fun than this thing. And precisely because their automatic driving is so imperfect, the fun is all the greater, because one is forever looking forward to the next upgrade, full of one-step-back-two-steps-forward complaints and delights. An old friend said: "Tesla upgrades have been a bit too frequent lately, so much so that every time I get in the car I agonize over whether to upgrade now or wait until 23:30." My answer: I do not mind the fuss or the frequency; I revel in it. Lol. Nowadays I keep checking online whether a new version is out and whether the rollout queue has nearly reached me (Tesla software upgrades go out in batches; I am roughly among the first ten percent of users to receive them, while the "luckiest" little group are the so-called guinea-pig beta testers). Some upgrades are trivial, but the moment the notice arrives I still upgrade at once, eager to try first. Since owning the Tesla, the upgrades I have benefited from most include single-pedal driving (with regenerative braking), improved automatic lane changing, and limited traffic-light driving. A recent upgrade closes the windows automatically, also very nice, solving once and for all my worry about forgetting to close them. It is another convenience from Tesla after its long-standing automatic door unlocking and locking feature.

The process matters more than the result. If it were a perfect full automation, a finished product from the future world, you would sit in it like a fool; it would leave you nothing to criticize and nothing to do. It would be it, and you would be you: no fun at all. The current experience gives you a sense of participation; this "human-machine coupling", with takeover possible at any moment, is the best part of the game. Driving a supercomputer every day, a machine that pops up a few bugs now and then (not necessarily engineering bugs in the strict sense), you cannot crawl inside to debug it, but you can assess, guess, and imagine where the problem lies. Many experiences once seemed inconceivable. Take automatic driving at night or in a raging storm: I used to assume it would be a mess there, or at least a big step backward, but its performance completely exceeded expectations. Twice I ran into violent wind and rain where even driving manually felt very difficult, and the automatic driving turned out to be more stable, persistently slowing down and staying within the lane (there are also times when self-driving actively gives up, presumably when the sensors' and cameras' signals fall below some threshold). As for night driving, it feels even safer than daytime; one could really sleep or doze in the car. Part of the reason is that at night the crazy, wild driving and roaring motorcycles all but vanish, and everyone dutifully hurries home. Long night drives offer no fresh stimulation, monotonous and drawn out, making fatigued driving especially likely. Fortunately self-driving knows no fatigue; for it there is no challenge at all. Scenes that people find hard are not necessarily hard for the machine.

Musk has said internally that if the cost of electric cars cannot come down and ordinary people cannot afford them, Tesla cannot count as a success. It is not merely a matter of cost-performance; the absolute price must come down. Only then can the clean-energy vision of fully replacing gasoline cars be achieved. A "mini Tesla" at around twenty thousand dollars is reportedly under intense development in Shanghai and Berlin. As Tesla spreads further to the masses, "loss of control" cases and accidents will only multiply, as the owner base broadens from techies to the general public. Tesla has so much money; it should quickly expand its software team tenfold or a hundredfold and press on with more "human" automatic driving to reduce trouble and accidents. May Tesla's software quality keep pace with the spread of its vehicles.

 

【Related】

Is it Tesla's self-driving AI that is awesome, or Musk?

No selling one's own melons: a Tesla owner on self-driving

Musk's AI bragging

【Semantic computing: the Li-Bai dialogue series】

【Pinned: An overview of 立委's NLP blog posts】

Table of contents of 《朝华午拾》