MIT Scientists Release Open-Source Photorealistic Simulator for Autonomous Driving


MIT researchers unveil the first open-source simulation engine capable of constructing realistic environments for deployable training and testing of autonomous vehicles. Hyper-realistic virtual worlds have proven to be productive test beds for safely trying out dangerous driving scenarios.

With this in mind, scientists from MIT’s Computer Science and Artificial Intelligence Laboratory created “VISTA 2.0,” a data-driven simulation engine where vehicles can learn to drive in the real world and recover from near-crash scenarios. What’s more, all of the code is being released open-source to the public.

VISTA 2.0, which builds on the team’s previous model, VISTA, is fundamentally different from existing AV simulators because it is data-driven: it was built and photorealistically rendered from real-world data, thereby enabling direct transfer to reality.

The team was able to scale the complexity of interactive driving tasks such as overtaking, following, and negotiating, including multiagent scenarios, in highly photorealistic environments. Lidar sensor data is much harder to synthesize in a data-driven world: you are effectively trying to generate brand-new 3D point clouds, with millions of points, from only sparse views of the world. To synthesize 3D lidar point clouds, the researchers took the data the car collected, projected it into 3D space from the lidar data, and then let a new virtual vehicle drive around locally from where the original vehicle was.
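The core of that resynthesis step is a change of viewpoint: points recorded from the real vehicle's pose are rigidly transformed into the frame of a nearby virtual vehicle, then resampled as that sensor would see them. The snippet below is a minimal sketch of this idea using NumPy; the function names, the simple field-of-view culling stand-in for occlusion handling, and the pose parameters are illustrative assumptions, not VISTA 2.0's actual API.

```python
import numpy as np

def transform_points(points, yaw, translation):
    """Rigidly transform a lidar cloud (N x 3, rows are x/y/z in the
    recorded vehicle's frame) into a new virtual vehicle's frame.
    yaw: heading offset in radians; translation: (dx, dy, dz)."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    # p' = R^T (p - t): undo the virtual vehicle's motion so the
    # same scene is expressed relative to the new sensor pose.
    return (points - np.asarray(translation)) @ R

def cull_to_fov(points, fov_deg=120.0):
    """Keep points inside the virtual sensor's horizontal field of view
    (a crude stand-in for the real occlusion/resampling step)."""
    azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0]))
    return points[np.abs(azimuth) <= fov_deg / 2]

# Example: view a recorded cloud from a vehicle 1 m to the left, turned 5 deg.
cloud = np.random.default_rng(0).uniform(-10.0, 10.0, size=(1000, 3))
virtual_view = cull_to_fov(transform_points(cloud, np.radians(5), (0.0, 1.0, 0.0)))
```

In the real system the transformed points would also be re-rendered into the virtual lidar's scan pattern, with occluded surfaces removed; the sketch only shows the rigid-body part of the pipeline.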

Taking their full-scale car out into the “wild” — a.k.a. Devens, Massachusetts — the team saw immediate transferability of results, with both failures and successes. They were also able to demonstrate the bodacious, magic word of self-driving car models: “robust.” They showed that AVs, trained entirely in VISTA 2.0, were so robust in the real world that they could handle that elusive tail of challenging failures.
