Tesla is ‘checking up on drivers’ with new monitoring software

Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car. 

The sensor and camera suite provides drivers with an awareness of their surroundings that a driver alone would not otherwise have. 

A powerful onboard computer processes these inputs in a matter of milliseconds, which the company says makes driving ‘safer and less stressful.’
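Tesla does not publish the internals of that software. Purely as a rough illustration of how a driver-assistance system might combine readings from several sensors into one picture of nearby obstacles, here is a minimal Python sketch; every name, threshold and value in it is invented for the example.

```python
# Hypothetical illustration only: not Tesla's actual software.
# Combines simulated camera, radar and ultrasonic readings into a
# single confidence-weighted estimate of the nearest obstacle's distance.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SensorReading:
    source: str                   # "camera", "radar" or "ultrasonic"
    distance_m: Optional[float]   # estimated distance to obstacle, None if nothing seen
    confidence: float             # 0.0 to 1.0


def fuse_readings(readings: list[SensorReading]) -> Optional[float]:
    """Return a confidence-weighted distance to the nearest detected obstacle."""
    usable = [r for r in readings if r.distance_m is not None and r.confidence > 0.2]
    if not usable:
        return None
    total_weight = sum(r.confidence for r in usable)
    return sum(r.distance_m * r.confidence for r in usable) / total_weight


if __name__ == "__main__":
    frame = [
        SensorReading("camera", 32.5, 0.9),
        SensorReading("radar", 31.8, 0.8),
        SensorReading("ultrasonic", None, 0.0),  # obstacle out of ultrasonic range
    ]
    print(f"Fused obstacle distance: {fuse_readings(frame):.1f} m")
```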

Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. 

It does not turn a Tesla into a self-driving car, nor does it make the car autonomous.

Before enabling Autopilot, drivers must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’ 

Once engaged, if insufficient steering-wheel torque is detected, Autopilot delivers an escalating series of visual and audio warnings, reminding drivers to place their hands on the wheel. 

If drivers repeatedly ignore the warnings, they are locked out from using Autopilot during that trip.

Any of Autopilot’s features can be overridden at any time by steering or applying the brakes.
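Tesla has not released the code behind this behaviour. As an illustration only, the sketch below models the logic just described (escalating warnings, a per-trip lockout and driver override) in Python; the thresholds, timings and class names are assumptions made for the example.

```python
# Hypothetical sketch of the escalating-warning behaviour described above.
# Thresholds and timings are invented; this is not Tesla's implementation.

VISUAL_WARNING_AFTER_S = 15   # assumed: seconds hands-off before a visual reminder
AUDIO_WARNING_AFTER_S = 30    # assumed: seconds hands-off before an audible alert
LOCKOUT_AFTER_WARNINGS = 3    # assumed: ignored alerts before a per-trip lockout


class DriverMonitor:
    def __init__(self):
        self.hands_off_seconds = 0
        self.ignored_warnings = 0
        self.locked_out_for_trip = False

    def tick(self, torque_detected: bool, driver_override: bool) -> str:
        """Called once per second; returns the action the system would take."""
        if self.locked_out_for_trip:
            return "Autopilot unavailable until next trip"
        if driver_override:                # steering or braking overrides the feature
            return "driver override: Autopilot disengaged"
        if torque_detected:                # hands on the wheel reset the escalation
            self.hands_off_seconds = 0
            return "ok"

        self.hands_off_seconds += 1
        if self.hands_off_seconds >= AUDIO_WARNING_AFTER_S:
            self.ignored_warnings += 1
            self.hands_off_seconds = 0
            if self.ignored_warnings >= LOCKOUT_AFTER_WARNINGS:
                self.locked_out_for_trip = True
                return "locked out for remainder of trip"
            return "audio warning: place hands on wheel"
        if self.hands_off_seconds >= VISUAL_WARNING_AFTER_S:
            return "visual warning: place hands on wheel"
        return "ok"
```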

Autopilot may not function well in poor visibility (heavy rain, snow, fog); in bright light (oncoming headlights, direct sunlight); when sensors are covered by mud, ice or snow; when objects mounted on the vehicle (such as a bike rack) interfere with or obstruct the sensors; when excessive paint or adhesive products (such as wraps, stickers or rubber coatings) are applied to the vehicle; on narrow, winding or sharply curved roads; with a damaged or misaligned bumper; when other equipment generating ultrasonic waves causes interference; or in extremely hot or cold temperatures.

 
