Ford trials robot charging station for electric vehicles


Ford is trialling a robot charging station for electric vehicles, which could make it easier for mobility-impaired people to charge their cars.

The Michigan-based car manufacturer has demonstrated a prototype system, developed by engineers at Dortmund University, Germany.

It consists of a robotic arm that extends all the way into an electric car’s charging port, operated by the driver via their smartphone from inside the vehicle.

After charging, the arm retracts back into place and the driver can be on their way – without having to ever get out of the car.

HOW DOES IT WORK?

A charging station, which could be located in a car park or at a roadside, features a sliding door that conceals the robotic arm.

When the driver parks next to the station, he or she can bring up Ford’s free-to-use FordPass app to open the sliding door and release the arm.

Once activated, the station cover slides open and the charging arm extends towards the car’s charging inlet with the help of a tiny camera.

It perfectly slots into the car’s charging point, and the driver just has to wait as it pumps up the car’s charge.
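The sequence described above – app command, cover opening, camera-guided alignment, charging, retraction – can be sketched as a simple state machine. This is an illustrative assumption of how such a controller might be structured; the class and method names are hypothetical, not Ford's actual FordPass API.

```python
# Illustrative sketch of the robot charging sequence described in the
# article. All names are hypothetical, not Ford's API.

class RobotChargingStation:
    def __init__(self):
        # idle -> open -> aligned -> charging -> retracted
        self.state = "idle"

    def open_cover(self):
        # Triggered remotely, e.g. from the driver's phone app.
        assert self.state == "idle"
        self.state = "open"

    def align_arm(self, camera_offset_mm):
        # The tiny camera reports the arm's offset from the charging
        # inlet; alignment succeeds once the offset is within tolerance.
        assert self.state == "open"
        if abs(camera_offset_mm) <= 2.0:  # assumed tolerance
            self.state = "aligned"
            return True
        return False

    def start_charging(self):
        assert self.state == "aligned"
        self.state = "charging"

    def finish(self):
        # Arm retracts and the cover closes; the driver can leave.
        assert self.state == "charging"
        self.state = "retracted"
```

In this sketch the driver never leaves the car: each transition is triggered either by the app or by the station's own camera feedback.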

Currently, filling a car with petrol or plugging it into a charging point can be challenging for the mobility-impaired, according to Ford.

Disabled drivers represent 5 per cent of the UK driving population and face major challenges getting from one place to another.

‘Ford is committed to ensuring freedom of movement and right now refuelling or charging your vehicle can be a major problem for some drivers,’ said Birger Fricke, research engineer at Ford of Europe.

‘The robot charging station could be an added convenience for some people – but absolutely essential for others.’

Following initial lab testing, Ford researchers are now putting the robot charging station to the test in real-life situations.

FORD RECRUITS ROBOT DRIVERS TO TRIAL VEHICLES

Ford is using two robotic test drivers – Shelby and Miles – to trial its vehicles at extreme temperatures.

The robots are used to test vehicles in conditions that are too extreme for any human to endure.

Shelby and Miles can operate at temperatures ranging from -40°F to 176°F (-40°C to 80°C) as well as at extreme altitudes, Ford says.

Their robotic legs extend to the accelerator, brake and clutch pedals, with one arm positioned to change gear and the other used to start and stop the engine.

For the trial, drivers were able to monitor the charge status via the FordPass app, which is already available and lets Ford drivers unlock and start a car engine with their smartphone.

Ford said the system has undergone successful trials but is not currently available for purchase.

But if rolled out in the future it could be installed at disabled parking spaces, in car parks or at private homes.

In theory, cars from other manufacturers would also be able to use the system, although this would depend on the charging connection used in different markets.

Ultimately, the process could become fully automated, with minimal or no driver involvement, according to Ford.

The driver would simply send an autonomous vehicle to the charging station to get itself ‘charged up’ before returning home.

This would be part of a future where fully self-driving cars are the norm, although this could be as far away as the 2040s, according to research firm IDTechEx.

Autonomous vehicles are powered by artificial intelligence (AI) that’s trained to detect pedestrians in order to know when to stop and avoid a collision.

But they can only be widely adopted once they can be trusted to drive more safely than human drivers – and this seems to be years away.

Autonomous vehicle technology is still learning how to master many of the basics – including recognising dark-skinned faces in the dark.

Several self-driving cars have been involved in nasty accidents – in March 2018, for example, an autonomous Uber vehicle killed a female pedestrian crossing the street in Tempe, Arizona in the US.

The Uber engineer in the vehicle was watching videos on her phone, according to reports at the time.

SELF-DRIVING CARS ‘SEE’ USING LIDAR, CAMERAS AND RADAR

Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.

However, others make use of visible light cameras that capture imagery of the roads and streets.

They are trained with a wealth of information and vast databases of hundreds of thousands of clips which are processed using artificial intelligence to accurately identify people, signs and hazards.

In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.

These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.

While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to it in real time.
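The round-trip timing of those pulses is what gives LiDAR its range measurement: distance is the speed of light multiplied by the echo delay, halved because the pulse travels out and back. A minimal calculation:

```python
SPEED_OF_LIGHT_M_S = 299_792_458  # metres per second

def lidar_range_m(round_trip_s: float) -> float:
    """Distance to an obstacle from a laser pulse's round-trip time.

    The pulse travels to the obstacle and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2

# A pulse that echoes back after about 333.6 nanoseconds bounced off
# something roughly 50 metres away.
print(round(lidar_range_m(333.6e-9), 1))  # → 50.0
```

The nanosecond timescales involved are why LiDAR units need very fast electronics, and the fixed number of laser beams is what limits their resolution compared with a camera.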

In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.

The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.

They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.

Other self-driving cars generally rely on a combination of cameras, sensors and lasers.

An example is Volvo’s self-driving cars, which rely on around 28 cameras, sensors and lasers.

A network of computers processes this information which, together with GPS, generates a real-time map of moving and stationary objects in the environment.

Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.

A wave radar and camera placed on the windscreen reads traffic signs and the road’s curvature and can detect objects on the road such as other road users.

Four radars behind the front and rear bumpers also locate objects.

Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.

Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.
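Tallying the units enumerated above shows how the ‘around 28’ figure is approached. The grouping below is an assumption drawn from the text; Volvo’s own accounting may differ and may include units not listed here.

```python
# Sensor counts as enumerated in the text above; the grouping is an
# assumption, and Volvo's own breakdown may differ.
volvo_sensors = {
    "ultrasonic": 12,         # close-range objects, low-speed driving
    "bumper radars": 4,       # object location, front and rear
    "long-range radars": 2,   # fast traffic approaching from behind
    "surround cameras": 4,    # wing mirrors, grille, rear bumper
    "windscreen camera": 1,   # traffic signs and road curvature
    "wave radar": 1,          # paired with the windscreen camera
}

total = sum(volvo_sensors.values())
print(total)  # 24 enumerated here, against the 'around 28' cited
```

The gap between this tally and the quoted total suggests the article does not enumerate every unit, which is why the figure is hedged as ‘around 28’.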
