Sunday, October 21, 2018

Do I Hit a Tree or a School Bus?



 
By Alex Koyfman
Written Oct. 04, 2018
Dear Reader,
The driverless car has been a longtime fantasy among an extremely varied group of potential users, from science-fiction aficionados all the way to Jagermeister enthusiasts who dream of a machine that can take them home without the risk of injury or arrest. 
Today, thanks to advancements made by some of the world's most prominent automakers, that fantasy is becoming a reality. 

BMW, Mercedes, Ford, Fiat-Chrysler, and, of course, Tesla, among others, have all made major inroads into developing this potentially world-altering product, with several models capable of completely autonomous operation due to be released as early as 2021. 
And while most of these offerings allow the conventional human driver the mere option of sitting back and not doing anything, some automakers, such as GM, have gone so far as to promise a car with no steering wheel and no pedals at all, making it a truly autonomous vehicle. 
If all goes according to plan, this heavily retrofitted Chevy Bolt will be entering service with select taxi fleets as early as next year.
To those of you not paying extra-special attention to your calendars, that's sometime within the next 15 months.
That's a pretty insane timeline when you think about it. We've been living with human-controlled, human-dependent automobiles for more than 110 years now — vehicles where the operator was the only decision maker. 
And just like that, inside of the next five financial quarters, all that will start to change. What most people fail to appreciate, however, is just how much work and complexity goes into achieving anything close to a functional autonomous vehicle. 
Consider this: In the chaotic environment of the modern roadway, some accidents will be inevitable, regardless of how many computer-controlled cars are out there. 
Tree limbs will fall, pedestrians will step into roadways, water mains will break, and any number of other events that no supercomputer will be able to control will occur, instantly creating situations where contact between vehicles and other objects will be unavoidable. 
In such situations, autonomous cars will have to make the same sorts of decisions that human drivers have been making since the very dawn of motorized vehicles: Since I have to crash into something, who or what do I crash into, given my choices?
Hard Decisions
Do I crash into the tree that's just fallen into my path, or do I swerve and hit the minivan containing a family of five?
Do I hit the pedestrian that just stepped into my path, or do I strike an oncoming bus?
Do I careen headlong into a stopped truck, or do I take my chances with a brick wall?
These are all difficult choices, but they're choices that will have to be made nonetheless — and choices the vehicle owner and vehicle manufacturer/software designer will have to live with, both morally and legally. 
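To make that dilemma concrete, here's a toy sketch, invented purely for illustration, of how a car's software might rank unavoidable-collision options by estimated harm. None of the option names, harm scores, or weights below come from any real automaker's system.

```python
# Toy illustration only: rank unavoidable-collision options by estimated harm.
# The options, scores, and weights are invented for this sketch.
from dataclasses import dataclass

@dataclass
class CollisionOption:
    description: str
    risk_to_occupants: float  # 0.0 (none) to 1.0 (severe), hypothetical scale
    risk_to_others: float     # same hypothetical scale

def least_harmful(options, occupant_weight=1.0, others_weight=1.0):
    """Return the option with the lowest weighted harm estimate."""
    return min(
        options,
        key=lambda o: occupant_weight * o.risk_to_occupants
                      + others_weight * o.risk_to_others,
    )

if __name__ == "__main__":
    choices = [
        CollisionOption("fallen tree limb", risk_to_occupants=0.4, risk_to_others=0.0),
        CollisionOption("oncoming minivan", risk_to_occupants=0.7, risk_to_others=0.9),
    ]
    print(least_harmful(choices).description)  # -> "fallen tree limb"
```

However the real systems weigh these trade-offs, the point stands: someone has to encode those weights, and someone has to answer for them.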
In order to even consider those choices, however, the car will need to be equipped with technology that will inform its central processor of the situation.
This crucial function will require arrays of complex sensors to feed a constant stream of accurate, detailed information. 
This is an aspect of the problem that I think many overlook when they think about autonomous vehicles, and it's one of the most challenging to perfect. 
Other, more salient aspects of the package, such as the servo motors that control the steering, acceleration, and braking, or the navigational suite, are fairly straightforward in comparison.
These sensors, especially the visual sensors — the eyes of the car — which will provide a complete situational picture of the vehicle's immediate surroundings, will need to be particularly advanced, in terms of both collecting data and processing it.
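For a rough sense of what "sensors feeding a central processor" means in practice, here's a minimal, hypothetical sketch of a sensing loop. The sensor functions and readings are placeholders I've made up for illustration, not any vendor's actual interface.

```python
# Hypothetical sketch: several sensors feed one situational picture each cycle.
# Sensor names and readings are placeholders, not a real automotive API.
import random
import time

def read_camera():
    # Stand-in for a machine-vision detection step.
    return {"objects": random.choice([[], ["pedestrian"], ["vehicle"]])}

def read_radar():
    # Stand-in for a range measurement.
    return {"nearest_object_m": round(random.uniform(5.0, 100.0), 1)}

def build_situational_picture():
    """Fuse the latest readings into one snapshot for the planner."""
    return {"camera": read_camera(), "radar": read_radar(), "timestamp": time.time()}

if __name__ == "__main__":
    for _ in range(3):
        print(build_situational_picture())
        time.sleep(0.1)
```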
And that's what I'm currently most interested in from an investment angle.
Because this technological subset, referred to by the eggheads as “machine vision,” is becoming an industry unto itself.
It will require billions in investment capital over the coming years, but the idea of machine vision is nothing new.
Machines have been growing eyes and making decisions based on the information those eyes provide since before the turn of the 21st century.
These advanced sensors have their origins in a variety of surprising applications.
An Unexpected Origin
One of the pioneers of this technology, which I've been following for the last several months, made its reputation building visual sensors for machines involved in manufacturing.
These sensors have allowed products to be produced faster and more precisely, while adhering to stricter quality standards.
That testing ground has created some of the most advanced visual sensors in existence, but what few people — its creators included — anticipated was just how important and far-reaching their work would be in this new application. 
Since I started my research on this company, its shares have been trending upward. 
It's my belief that this is just the beginning, the first hints of one of the most profound bull markets in any technological sector of the last decade.
Big enough to take this $10 billion company up several-fold at least — just in the next few years as the autonomous vehicle industry begins to find its rhythm. 
The real show has yet to begin, but if you don't want to be sitting on the sidelines, watching it go by when the real fireworks start over the course of the next year or so, you need to get informed now. 
I've put together a substantial research report detailing all of my findings. 
It's free of charge to access and gives you the full rundown of the company, the technology, the market, and the forecast. 
Don't wait another minute. 
Fortune favors the bold,
Alex Koyfman
