Linux Foundation launches Overture Maps Foundation



The Overture Maps Foundation, created by the Linux Foundation, aims to help developers who build map services or use geospatial data. | Source: Overture Maps Foundation

The Linux Foundation announced it formed the Overture Maps Foundation, a collaborative effort to create interoperable open map data as a shared asset. The Overture Maps Foundation aims to strengthen mapping services worldwide and enable current and next-generation mapping products. These mapping services could be crucial to robotic applications like autonomous driving. 

Currently, companies developing and rolling out autonomous vehicles have to spend massive amounts of time and money meticulously mapping the cities they’re deploying in. Additionally, those companies have to continuously remap those cities to account for any changes in road work or traffic laws. 

The foundation was founded by Amazon Web Services (AWS), Meta, Microsoft and TomTom. Overture hopes to add more members in the future to include a wide range of signals and data inputs. Members of the foundation will combine their resources to create map data that is complete, accurate and refreshed as the physical world changes. The resulting data will be open and extensible under an open data license. 

“Mapping the physical environment and every community in the world, even as they grow and change, is a massively complex challenge that no one organization can manage. Industry needs to come together to do this for the benefit of all,” Jim Zemlin, executive director for the Linux Foundation, said. “We are excited to facilitate this open collaboration among leading technology companies to develop high quality, open map data that will enable untold innovations for the benefit of people, companies, and communities.”

The Overture Maps Foundation aims to build maps using data from multiple sources, including Overture members, civic organizations and open data sources. It plans to simplify interoperability by creating a system that links entities from different datasets to the same real-world entities. All data used by Overture will undergo validation to ensure the mapping data is free of errors, breakage and vandalism. 
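To illustrate the kind of entity linking described above, here is a minimal sketch (not Overture’s actual system) that decides whether two point-of-interest records from different providers refer to the same real-world place, using location proximity plus name similarity. All names, coordinates and thresholds are invented for illustration.

```python
from difflib import SequenceMatcher
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def same_entity(a, b, max_dist_m=50, min_name_sim=0.8):
    """Heuristic: two records describe the same place if they sit
    close together and their names are similar."""
    dist = haversine_m(a["lat"], a["lon"], b["lat"], b["lon"])
    sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return dist <= max_dist_m and sim >= min_name_sim

# Two records for the same (invented) cafe from different providers
rec_a = {"name": "Joe's Coffee", "lat": 47.6205, "lon": -122.3493}
rec_b = {"name": "Joes Coffee", "lat": 47.6206, "lon": -122.3494}
print(same_entity(rec_a, rec_b))  # True
```

A production conflation pipeline would weigh many more signals (categories, addresses, source confidence), but the distance-plus-similarity test captures the core idea of resolving duplicate records to one entity.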

Overture also aims to help drive the adoption of a common, structured and documented data schema to create an easy-to-use ecosystem of map data. Currently, developers looking to create detailed maps have to source and curate their data from disparate sources, which can be difficult and expensive. Not to mention, many datasets use different conventions and vocabulary to reference the same real-world entities. 
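As a toy example of the vocabulary problem, the sketch below maps provider-specific category labels onto one shared vocabulary. The category names and aliases are hypothetical and are not Overture’s actual schema.

```python
# Invented mapping from provider-specific labels to one shared
# vocabulary -- not Overture's actual schema.
COMMON_CATEGORIES = {
    "cafe": {"cafe", "coffee_shop", "coffeehouse"},
    "hospital": {"hospital", "medical_center", "clinic"},
}

def normalize_category(raw: str) -> str:
    """Map a provider's category label onto the shared vocabulary."""
    label = raw.strip().lower().replace(" ", "_")
    for common, aliases in COMMON_CATEGORIES.items():
        if label in aliases:
            return common
    return "unknown"

print(normalize_category("Coffee Shop"))     # cafe
print(normalize_category("Medical Center"))  # hospital
```

A documented shared schema lets every consumer do this normalization once, at the data layer, instead of every developer re-curating the same disparate sources.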

“Microsoft is committed to closing the data divide and helping organizations of all sizes to realize the benefits of data as well as the new technologies it powers, including geospatial data,” Russell Dicker, Corporate Vice President, Product, Maps and Local at Microsoft, said. “Current and next-generation map products require open map data built using AI that’s reliable, easy-to-use and interoperable. We’re proud to contribute to this important work to help empower the global developer community as they build the next generation of location-based applications.” 

Overture hopes to release its first datasets in the first half of 2023. The initial release will include basic layers including buildings, road and administrative information, but Overture plans to steadily add more layers like places, routing or 3D building data. 

The post Linux Foundation launches Overture Maps Foundation appeared first on The Robot Report.


NVIDIA DRIVE OS earns safety certification



NVIDIA DRIVE OS is an operating system for in-vehicle accelerated computing. | Source: NVIDIA

TÜV SÜD has determined that NVIDIA’s DRIVE OS 5.2 software meets the International Organization for Standardization (ISO) 26262 Automotive Safety Integrity Level (ASIL) B standard, which targets functional safety in road vehicles’ systems, hardware and software. 

NVIDIA DRIVE OS is an operating system for in-vehicle accelerated computing powered by the NVIDIA DRIVE platform. DRIVE OS is the foundation of NVIDIA’s DRIVE SDK, which includes NVIDIA’s CUDA libraries for efficient parallel computing, the NVIDIA TensorRT SDK for real-time AI inferencing and the NvMedia library for sensor input processing, among other developer tools and modules. 

To meet the standard, NVIDIA’s software had to be able to detect failures during operation and be developed in a process that handles potential systematic faults along the whole V-model. This includes everything from safety requirements definition to coding, analysis, verification and validation. Essentially, the software has to avoid failures whenever possible, and detect and respond to them if they can’t be avoided. 

TÜV SÜD’s team determined that DRIVE OS 5.2 complies with its strict testing criteria and is suitable for safety-related use in applications up to ASIL B. ISO 26262 identifies four ASILs, A, B, C and D, with A being the lowest degree and D being the highest degree of automotive hazard.

TÜV SÜD, based in Munich, Germany, assesses compliance to national and international standards for safety, durability and quality in various applications, including cars, factories, buildings, bridges and other infrastructure. 

NVIDIA DRIVE is an open platform, which means that experts from top car companies can build upon the company’s industrial-strength system. 

Earlier this year, NVIDIA filed a patent for a system that would help solve one of the biggest issues in autonomous driving: how self-driving cars identify and react to emergency vehicles.

NVIDIA’s patent filing, which was published by the US Patent and Trademark Office in May 2022, seeks to help self-driving cars avoid situations where an autonomous vehicle doesn’t know how to react to emergency vehicles, which could result in a slowed response time and, in turn, more property damage and personal injuries. 

The patent describes a system involving microphones attached to an autonomous or semi-autonomous car to capture the sounds of nearby emergency response vehicles’ sirens. The microphones will work with a Deep Neural Network (DNN) to create audio signals that correspond to the sirens detected.
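The patent describes a learned detector; as a rough illustration of the underlying signal-processing idea, the toy sketch below flags audio whose dominant frequency falls in an assumed siren band. A production system would instead feed spectrogram features to a trained deep neural network; the sample rate and band limits here are assumptions.

```python
import numpy as np

def looks_like_siren(audio, sr=16_000, lo=600.0, hi=1500.0):
    """Toy detector: True if the signal's dominant frequency falls in
    an assumed siren band. A real system would run a trained DNN over
    spectrogram features instead of this single-peak check."""
    spectrum = np.abs(np.fft.rfft(audio))
    freqs = np.fft.rfftfreq(len(audio), d=1 / sr)
    dominant = freqs[np.argmax(spectrum)]
    return lo <= dominant <= hi

sr = 16_000
t = np.linspace(0, 1, sr, endpoint=False)
siren = np.sin(2 * np.pi * 900 * t)   # 900 Hz tone, inside the band
rumble = np.sin(2 * np.pi * 80 * t)   # low-frequency road noise
print(looks_like_siren(siren), looks_like_siren(rumble))  # True False
```

Real sirens sweep in frequency and arrive mixed with road noise, which is exactly why a neural network over time-frequency features, rather than a fixed band check, is the practical approach.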

NVIDIA won a 2022 RBR50 Robotics Innovation Award from our sister publication Robotics Business Review. The company won for its Omniverse Replicator, a data generation engine that produces synthetic data for training deep neural networks based on physical simulations in photorealistic, physically accurate virtual environments.

The post NVIDIA DRIVE OS earns safety certification appeared first on The Robot Report.


ARO launches new RaaS service for any robotics OEM


ARO field employees set up and maintain a delivery robot at the Cincinnati airport. | Credit: ARO

We write often about the benefits of Robots-as-a-Service (RaaS) here at The Robot Report, highlighting companies like Locus Robotics that are building a strong business around RaaS.

For automation buyers, RaaS changes the entire purchase process from a capital expenditure (CAPEX) decision into an operating expense (OPEX) decision. This shortens the decision process (or the sales process for the robot manufacturer) and reduces the risk for the automation client. A RaaS contract guarantees the performance of the solution and enables the client to quickly scale up with additional robots (if the solution is successful) or easily return the solution if it is unsuccessful.
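The CAPEX-versus-OPEX trade-off can be made concrete with a back-of-the-envelope comparison. The prices below are hypothetical round numbers, not figures from ARO or any vendor:

```python
# Hypothetical round numbers for a single robot: a $100,000 purchase
# with $500/month maintenance versus a $3,000/month RaaS subscription.
UPFRONT, MAINTENANCE, RAAS_FEE = 100_000, 500, 3_000

def capex_cost(months: int) -> int:
    """Cumulative cost of buying the robot outright."""
    return UPFRONT + MAINTENANCE * months

def raas_cost(months: int) -> int:
    """Cumulative cost of renting the same robot as a service."""
    return RAAS_FEE * months

# RaaS stays cheaper until the monthly premium repays the upfront price.
break_even_months = UPFRONT / (RAAS_FEE - MAINTENANCE)
print(break_even_months)              # 40.0
print(capex_cost(12), raas_cost(12))  # 106000 36000
```

Under these invented numbers, a buyer who is unsure the solution will work risks $36,000 over a one-year RaaS trial instead of a $100,000 purchase, which is the risk reduction the article describes.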

Either way, RaaS is changing the vendor/customer dynamic and reducing barriers to entry for new robotic solutions.

When there is no capital expenditure required, the purchase decision can often be made more quickly, and with lower levels of signing authority on the client side.

However, implementing a RaaS business model is often difficult for young robotics startups that have no experience in service delivery. A RaaS-based business model requires a different organizational structure from a classic OEM hardware company, and it also requires a lot more capital to build out a fleet of robots. The result is that RaaS co-founders must find patient and knowledgeable investors who understand and believe in the RaaS business model.

Instant service organization

There is an alternative to the problem of scaling up the service organization infrastructure and the capital to finance the operational fleet.

ARO has announced a RaaS partnership program designed to solve these organizational issues and lower customers’ barriers to adoption. ARO can extend capital to the robotics OEM through the new program. Under the ARO business model, ARO owns the fleet of robots (the assets) while preserving all the benefits of a manufacturer-backed RaaS go-to-market.

ARO is operating as a robotics service provider (RSP) for any robotics OEM that wants to sell its solutions using the RaaS business model but remain a hardware OEM. ARO’s model bundles its traditional robotic implementation and managed robot services into the price for a true Robots-as-a-Service offering.

The company covers extended warranties, device upgrades and customer success programs, so customers can be assured that their robots will operate efficiently and effectively while always being supported. ARO provides a 24/7 service center and a remote robotics operations center, as well as a team of field operations engineers who can travel to customer facilities for installs, maintenance and emergency repairs. The company has been around since 1990.

Manufacturers can enjoy capital preservation, device up-sales, and the full weight of ARO’s operational support to ensure their customers have the best experience with their robots.

“We’re incredibly excited to announce our new RaaS program,” says Jeff Pittelkow, Managing Director of Robotics for ARO. “Our partners are always looking to preserve capital, increase sales, and increase customer success and device utilization. This program provides a complete package in making all of those things happen.”

The post ARO launches new RaaS service for any robotics OEM appeared first on The Robot Report.


How Waymo tests its collision avoidance capabilities


Every driver will, inevitably, face unexpected hazards on the road, like other drivers running red lights or suddenly changing lanes. Autonomous vehicles (AVs) are no different, and AV developers have to find ways to prepare their autonomous drivers for as many unexpected events as possible. 

Waymo, the self-driving unit of Google-parent Alphabet, recently gave some insight into how it trains its Waymo Driver to avoid collisions on the road. The company recently published a paper detailing how it judges good collision avoidance performance, how it identifies the right set of scenarios to test and the testing tools it has developed to evaluate the Waymo Driver’s performance. 

Waymo is currently operating fully driverless robotaxi services in Chandler, Arizona, Downtown Phoenix and San Francisco, but before rolling out any of those services, the company tested its Driver extensively. To determine whether its Driver is ready, Waymo compares its performance against the performance of a reference model of a non-impaired human driver that always has eyes on the road, called NIEON for Non-Impaired with Eyes always On the conflict. 

NIEON is a model of a driver that surpasses the abilities of human drivers because it is always able to stay focused on what’s happening on the road. This sets a very high benchmark for the Waymo Driver to compete with, and the company has found that its Driver matches or outperforms NIEON.

Waymo found that the NIEON model could prevent 62% of crashes entirely, and reduce serious injury risk by 84%. The Waymo Driver, however, still did better, preventing 75% of collisions and reducing serious injury risk by 93%. 
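Those percentages imply a concrete gap between the two drivers. A quick calculation, using only the prevention rates reported above, shows what share of the crashes the NIEON model fails to prevent would additionally be avoided by the Waymo Driver:

```python
# Prevention rates reported in the Waymo study above.
nieon_prevented = 0.62
waymo_prevented = 0.75

# Of the crashes NIEON does NOT prevent, the share the Waymo Driver
# additionally avoids.
extra_avoided = (waymo_prevented - nieon_prevented) / (1 - nieon_prevented)
print(round(extra_avoided, 3))  # 0.342
```

In other words, relative to the already-superhuman NIEON baseline, the Waymo Driver avoids roughly a third of the remaining crashes.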

Putting the Waymo Driver to the test

Waymo tests its Driver using three different methods: staging scenarios on closed tracks, using examples Waymo runs into during on-road testing and running fully synthetic simulations. Waymo’s real-world examples are constantly being updated with new scenarios the company encounters on the road. It uses fully synthetic simulations for situations that are too dangerous to stage, like very fast-moving crashes, or scenarios that are too complicated to stage, like multi-lane intersections. 

Along with the millions of miles of driving data Waymo has gathered over years of testing, the company also uses human crash data, like police accident databases and crashes recorded by dash cams, and expert knowledge about its operation design domain, like geographic areas, driving conditions and road types where the Driver will operate, to decide what scenarios are the most important for it to test. 

Waymo has been gathering data for its scenario database since 2016, and it continues to add unique scenarios it encounters on the road. During its research, Waymo has found that the most common types of crashes are similar in any city, so its database can also help it scale quickly in new cities. 

Waymo isn’t the only autonomous vehicle company to give insight into the safety of its robotaxis. Cruise recently released its safety report to give the public insights on what the company does to ensure its robotaxis are safe. The report details the approaches, tenets and processes that help keep Cruise vehicles safe on the road. 

The post How Waymo tests its collision avoidance capabilities appeared first on The Robot Report.


10 most memorable robotics stories of 2022


The robotics industry had its fair share of memorable moments in 2022. Here we take a look back at our picks for the most memorable robotics stories of the year.

The list comprises moments that made us laugh and cringe, as well as moments that left us surprised or amazed at the capabilities of various robots. What did we miss? Please let us know in the comments what you’ll remember most from 2022.

Subscribe to The Robot Report Newsletter to stay updated on the robotics stories you need to know about.

First audio of Martian dust devil

According to NASA, nearly every Mars rover has experienced dust devils. But the Perseverance rover is the first to capture audio of one. In December 2022, NASA released the audio, adding that it could help scientists better understand how dust might affect future Mars missions.

NASA said capturing a passing dust devil takes some luck, adding that this first-of-its-kind recording was around a 1-in-200 shot. Scientists can’t predict when they’ll pass by, so rovers like Perseverance routinely monitor in all directions for them.

Spiderman robot crashes at Disneyland

On June 9, 2022, the Spider-Man robot at Disneyland momentarily lost its Spidey-Sense. The Spider-Man robot failed in mid-air and crashed into the W.E.B. facility that is part of the stunt setup. And it was all caught on video.

“Here goes something,” narrated the attraction’s soundtrack as Spidey took flight. According to Disney, the robot makes its own real-time decisions about when to tuck, somersault and slow down while soaring 85 feet in the air. But halfway through, the robot’s limbs locked up, rendering it a mere projectile. It crashed right into the W.E.B. building. Nobody was hurt during the crash, and the attraction was up and running again a few hours later.

Tesla unveils working humanoid prototype

Elon Musk unveiled at Tesla AI Day the company’s much-anticipated Optimus humanoid robot. Last year, of course, the “robot” was simply a human dressed in a robot suit. This time, Tesla, albeit briefly, actually demoed a working prototype that walked, waved, and danced on stage. Tesla also played videos that showcased the Optimus prototype doing different tasks, such as picking up boxes in a warehouse and watering a plant in an office.

Musk unabashedly claimed Tesla will be able to leapfrog other humanoid developers, such as Agility Robotics and Boston Dynamics, in part because Optimus uses the same neural networks as Tesla’s Autopilot technology. The robotics industry watched Tesla AI Day closely to see exactly what Optimus would be this time around. Many experts agreed the technology inside Optimus lags behind the competition, but they were impressed with how quickly Optimus was built.

Riding in a Waymo around SF

I’ve been fortunate enough to ride in several robotaxis over the years, but none were better than my ride with Waymo around San Francisco in October. It was a roughly 9-mile ride that took just under 30 minutes. Because I’m not a Waymo employee, there was a human safety driver behind the wheel.

The route we took had a myriad of obstacles, including multiple unprotected left turns, pedestrians crossing streets and sidewalks, bicyclists, narrow streets, double-parked Amazon delivery trucks and construction vehicles cutting us off. As you would hope, the trip was flawless and uneventful. Actually, it was quite boring.

First science images from James Webb


NASA released the first science image from the James Webb Space Telescope in July 2022 during a press conference with President Joe Biden. The image was of SMACS J0723.3-7327. What you see in the image is an area of space that includes the faintest objects ever observed in the infrared.

This slice of the vast universe covers a patch of sky approximately the size of a grain of sand held at arm’s length by someone on the ground. The light in this region goes back 13+ billion years.

Cruise robotaxi interacts with SF police

In April 2022, a video surfaced of an interesting interaction between a Cruise robotaxi and San Francisco police officers. The Cruise robotaxi is stopped at a red light, with a police car behind it. As the light turns green, an SF police officer gets out of his car and approaches the robotaxi, apparently because the vehicle’s headlights weren’t on. The police officer tries to open the driver’s side door, but it’s locked. As he heads back to his police car, the Cruise robotaxi drives off before quickly pulling over and flashing its hazard lights less than 10 seconds later.

The internet went crazy with all kinds of jokes about police chases, but Cruise said things went according to plan.

“Our AV yielded to the police vehicle, then pulled over to the nearest safe location for the traffic stop, as intended. An officer contacted Cruise personnel and no citation was issued,” Cruise tweeted. “We work closely with the SFPD on how to interact with our vehicles, including a dedicated phone number for them to call in situations like this.”

Chess-playing robot breaks 7-year-old’s finger

A seven-year-old boy’s finger was broken by a chess-playing robot during a tournament in Russia back in July 2022. The video shows the robot playing on three chessboards simultaneously. The boy makes a move, and the robotic arm lowers onto his hand. Four men rush in to help, the boy’s finger is freed, and he is taken away.

The story captured the world’s attention, and tournament officials at one point blamed the child for making a move the robot didn’t understand.

We discussed this incident at length on The Robot Report Podcast with Nima Fazeli, an assistant professor of robotics at the University of Michigan. Fazeli created a famous Jenga-playing robot during his days at MIT. He discussed the lack of basic safety protocols with the chess-playing robot, the poor choice of using an industrial robot, the system’s lack of intelligence and much more.

Turning dead spiders into robot grippers

Just when you think you’ve seen it all, along comes an idea to use dead spiders as robotic grippers. Engineers at Rice University introduced the world to the concept of “necrobotics,” explaining how they turned dead wolf spiders into mechanical grippers.

According to the researchers, unlike people and other mammals that move their limbs by synchronizing opposing muscles, spiders use hydraulics. A chamber near their heads contracts to send blood to limbs, forcing them to extend. When the pressure is relieved, the legs contract. Internal valves in the spiders’ hydraulic chamber, or prosoma, allow them to control each leg individually.

The researchers ran one dead spider through 1,000 open-close cycles to see how well its limbs held up, and found it to be fairly robust.

Call me close-minded, but I don’t see necrobotics going mainstream.

SF votes in favor of police using robots with lethal force

In late November 2022, San Francisco’s Board of Supervisors voted to allow the San Francisco Police Department to use remote-controlled and potentially lethal robots in emergency situations. The board voted 8-3 in favor of giving police the option to deploy robots as a last resort in emergency situations.

This made a lot of people inside and outside the robotics industry quite angry. And just a week later the decision was unanimously reversed. The proposal would have allowed officers to use robots to kill a suspect “when risk of loss of life to members of the public or officers is imminent and officers cannot subdue the threat after using alternative force options or de-escalation tactics.”

Digit helps human coworkers unload trailers

While Tesla’s humanoid prototype could do little more than walk and wave, Agility Robotics showed us that bipedal robots working alongside humans might not be as far off as we think. In April 2022, Agility Robotics gave us a look at the capabilities of its bipedal robot Digit, which brought in $150 million in funding that month. The video shows Digit handling packages and bins, loading them onto trucks, picking them up from shelving and even taking totes from human workers. 

Digit has a max speed of 1.5 m/s and can run for 3 hours doing light work. Doing heavy work, the robot will last 1.5 hours. While right now Digit can only be used to move totes and packages and unload trailers, the company hopes to expand its capabilities to last-mile deliveries soon. 

The post 10 most memorable robotics stories of 2022 appeared first on The Robot Report.


Remembering robotics companies we lost in 2022



There are many reasons robotics companies fail. From an ill-conceived idea to poor execution or the inability to raise funding, building and running a sustainable robotics company is challenging.

This is never a fun recap to write. We don’t want to see startups fail, but inevitably many do. The last couple of years have been especially difficult thanks to a global pandemic, economic uncertainties and ongoing supply chain issues. But perhaps some lessons can be learned from the companies that didn’t survive.

Here are some of the robotics companies we’ll, unfortunately, remember losing in 2022.

Argo AI (2016-2022)

Argo AI, the self-driving company previously backed by Ford and Volkswagen, abruptly closed its doors in October. For most, this will be the most surprising shutdown on the list. When news broke about the shutdown, Ford said its plan was to shift its focus away from funding Argo AI’s development of Level 4 autonomous driving technology and towards creating its own Level 2 and Level 3 driving systems.

“We still believe in Level 4 autonomy that it will have a big impact on our business of moving people,” Ford’s CEO and President Jim Farley said at the time. “We’ve learned though, in our partnership with Argo and after our own internal investments, that we will have a very long road. It’s estimated that more than $100 billion has been invested in the promise of Level 4 autonomy. And yet no one has defined a profitable business model at scale.”

Farley continued, “Deploying L4 broadly, perhaps the toughest technical problem of our time, will require significant breakthroughs going forward in many areas: reliable and low-cost sensing, it’s not the case today; algorithms that can operate on limited compute resources without constraining the operating time and domain of an electric vehicle; breakthroughs in neural networks that can learn to operate a car more safely than a human, even in very complex urban environments.”

“We’re optimistic about a future for L4 ADAS, but profitable, fully autonomous vehicles at scale are a long way off and we won’t necessarily have to create that technology ourselves.”

Argo AI spun out of Carnegie Mellon in 2016 and came out of stealth in 2017 with a $1 billion investment from Ford. Since then, it raised another $2.6 billion, primarily from Ford and VW, and secured partnerships with Walmart and Lyft.

Kitty Hawk (2010-2022)

After more than a decade of trying to make autonomous flying cars, Kitty Hawk closed its doors in September. The company was founded in 2010 by Sebastian Thrun, who previously founded and led Google’s self-driving car project, which we now know as Waymo.

Kitty Hawk built a number of different aircraft, and in 2021 demonstrated a beyond-visual-line-of-sight flight in Ohio. In June 2021, Kitty Hawk acquired 3D Robotics, a drone company that was once a competitor to DJI. As part of the acquisition, 3D Robotics co-founder Chris Anderson became Kitty Hawk’s chief operating officer. Kitty Hawk said at the time its new focus was on developing a remote-piloted electric vertical takeoff and landing (eVTOL) aircraft.

After the company shut down, Thrun said that “no matter how hard we looked, we could not find a path to a viable business.”

Local Motors (2007-2022)

Olli shuttle

Local Motors, which was building Olli the autonomous shuttle, shut down in early January. Local Motors was founded in 2007, but didn’t start dipping its toes into the world of autonomous vehicles until 2016 when it launched Olli. The company closed due to a lack of funding.

Olli 1.0 was a low-speed pod that could drive for 60 miles on a single charge. The shuttle was designed for environments like hospitals, military bases and universities. In 2019, Local Motors upgraded to Olli 2.0 with a top speed of 25 miles per hour and the ability to run for 100 miles on a single charge.

In October 2020, the company announced it would be testing Olli on the streets of Toronto. Olli hit the streets in 2021, but would only carry out tests until December, when an Olli 1.0 shuttle collided with a tree, resulting in the attendant being critically injured. After the collision, the City of Toronto stopped its trials of the self-driving shuttles. An investigation by the Durham Regional Police Service found that the shuttle was being operated manually during the accident.

Perceptive Automata (2015-2022)

Perceptive Automata was a Boston-based developer of human behavior understanding AI for autonomous vehicles and robots. According to co-founder and CTO Sam Anthony, Perceptive Automata went “kablooey” after it failed to close Series B funding.

Anthony said that the shutdown snuck up on him and the staff. “The part that was lousy was how it went down for the staff. There was a sense that we were blindsided by it falling apart,” he said. “That said, I’m not sure we should’ve been blindsided by it. Part of being a VC-funded company is that you have fairly specific marks you have to hit. If you don’t hit them, the path is cloudy at best. Combined with other factors outside of our control, we were in a tough spot.”

Perceptive Automata had raised $20 million since its founding in 2015.

Skyward (2013-2022)

Skyward built a software platform that helped customers manage drone workflows, including training crews, planning missions, accessing controlled airspace and more. It was acquired by Verizon in 2017 before being shut down in May. At the time of the acquisition, Verizon said it planned to use the company’s technology to streamline drone operation management through one platform.

Skyward sent its customers an email to announce the closure, which came as a surprise to many. Verizon said the decision to shutter Skyward “was about market agility and ensuring that Verizon continues to focus on areas that provide both near and mid-term growth opportunities.”

Chowbotics (2014-2022)

Chowbotics’ Sally fed frontline health workers during the coronavirus crisis.

DoorDash shut down its subsidiary Chowbotics less than 1.5 years after acquiring the business. Chowbotics built Sally, a vending machine-like robot that made salads and other fresh meals. It should be noted many folks in the industry have questioned whether Sally is a robot, but nevertheless.

“At DoorDash, we create an environment to build new products and set high standards to determine when to scale, continue, or cut back investments,” a DoorDash spokesperson said. “We’re always looking for new ways to serve our merchants, exceed consumers’ increasingly higher expectations, and complement our logistics infrastructure.”

Chowbotics was founded in 2014 and acquired by DoorDash in February 2021 for an undisclosed amount. At the time of the acquisition, DoorDash wanted to explore how to deploy Chowbotics’ technology across restaurants. It hoped Sally could help restaurants expand their menu or allow salad bars to pop up in more locations without needing more manpower.

Fifth Season (2016-2022)

Fifth Season was a Pittsburgh-based company that used robotics to grow and harvest various leafy vegetables that were then packaged and sold as salads, mixed greens or variety packs. The Carnegie Mellon University spinout was founded in 2016 and raised more than $75 million in investment. It shut down in October.

Fifth Season had about 100 employees, including about 20 or so that worked shifts at a 60,000-square-foot indoor farming facility in Braddock, Pa.

Rovenso (2016-2022)

Rovenso was a Switzerland-based company developing autonomous robots for security and safety monitoring of industrial sites. The company was founded in 2016 and raised $2.8 million in funding, according to Crunchbase.

Thomas Estier, co-founder and CEO of Rovenso, posted about the shutdown on LinkedIn, saying he and the team didn’t understand the impact of COVID on business development and component sourcing.

The post Remembering robotics companies we lost in 2022 appeared first on The Robot Report.


IDS launches new higher resolution Ensenso N 3D camera


The resolution and accuracy have almost doubled on the Ensenso N camera while the price has remained the same. | Credit: IDS

The Ensenso N-series 3D cameras have a compact body made of aluminum or a plastic composite, depending on the model, and a pattern projector built right in. They can be used to take pictures of both still and moving objects. The integrated projector projects a high-contrast texture onto the objects in question.

A pattern mask with a random dot pattern fills in surface structures that don’t exist or are only faintly detectable. This makes it possible for the cameras to make detailed 3D point clouds even when the lighting is bad.

The Ensenso models N31, N36, N41 and N46 supersede the previously available N30, N35, N40 and N45. Visually, the cameras are identical to their predecessors. Internally, however, they use the new Sony IMX392 sensor, which raises the resolution from 1.3 MP to 2.3 MP. All cameras are pre-calibrated and therefore easy to set up. The Ensenso selector on the IDS website helps to choose the right model.

With Ensenso N, users can choose from a series of 3D cameras that give reliable 3D information for a wide range of applications, whether they are fixed in place or being moved around by a robot arm. The cameras show their worth when they are used to pick up single items, support industrial robots that are controlled remotely, help with logistics, and even help to automate high-volume laundry.

The most recent update of the IDS NXT software adds anomaly detection alongside object detection and classification. Only a minimum of training data is required to reliably identify both known and unknown deviations.

The post IDS launches new higher resolution Ensenso N 3D camera appeared first on The Robot Report.


WPI launches Autonomous Vehicle Mobility Institute



From left to right, researchers Vladimir Vantsevich, Huashuai Fan and Lee Moradi. | Source: WPI

Vladimir Vantsevich and Lee Moradi, two professors in Worcester Polytechnic Institute’s (WPI’s) Department of Mechanical and Materials Engineering, have established the Autonomous Vehicle Mobility Institute (AVMI) at WPI.

The institute aims to expand the university’s interdisciplinary research into autonomous vehicle technologies as well as to boost educational opportunities for students. AVMI will focus on developing technology for off-road autonomous vehicles that travel across rough terrains. This could mean anything from farmland to battlefields to other planets. 

“Much of the current research into autonomous vehicles focuses on cars that travel on roads, but we focus on off-road vehicles, from small robotic vehicles to full-scale vehicles, both manned and unmanned, with as many as 8, 12, or 16 wheels that are driven by electric motors or mechanical drivetrain systems with controls,” Vantsevich said. “The technological challenge for these off-road vehicles is making them intelligent enough to sense and understand the terrain under the wheel to supply in real time the correct amount of power to each wheel and thus improve the vehicle’s terrain mobility, maneuverability, and energy efficiency. We believe that WPI is an excellent place to engage students, other faculty members, and industry partners in this work.”
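The per-wheel power idea Vantsevich describes can be sketched as a toy slip-based allocator. This is purely illustrative (the function names, weighting scheme and numbers are invented, and a real off-road traction controller would model terrain, tire dynamics and actuator limits): it estimates longitudinal slip per wheel and shifts drive power away from wheels that are spinning.

```python
def wheel_slip(wheel_speed, vehicle_speed):
    """Longitudinal slip ratio: 0 = pure rolling, 1 = spinning in place."""
    if wheel_speed <= vehicle_speed:
        return 0.0
    return (wheel_speed - vehicle_speed) / wheel_speed

def allocate_power(total_power, wheel_speeds, vehicle_speed):
    """Split total drive power across wheels, weighting each wheel by
    its traction (1 - slip), so slipping wheels receive less power."""
    traction = [1.0 - wheel_slip(w, vehicle_speed) for w in wheel_speeds]
    s = sum(traction)
    if s == 0:  # every wheel spinning: fall back to an even split
        return [total_power / len(wheel_speeds)] * len(wheel_speeds)
    return [total_power * t / s for t in traction]

# 4 wheels; the third is on loose ground, spinning faster than the vehicle moves
powers = allocate_power(1000.0, [2.0, 2.0, 4.0, 2.0], 2.0)
print([round(p, 1) for p in powers])
```

The same weighting generalizes directly to the 8-, 12- or 16-wheel configurations mentioned in the quote, since the allocation is computed per wheel.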

Researchers at WPI already have several ongoing projects related to autonomous vehicle technology, including models that sift through large amounts of sensor data from autonomous systems and software that will enable groups of lunar robots to collaborate while exploring the moon.

“A significant portion of vehicles on and off roads are expected to be autonomous in the coming decades,” Wole Soboyejo, interim president of WPI, said. “WPI researchers across departments are already doing groundbreaking work in this field, and Vladimir and Lee will allow WPI to transform the scale of our innovations with their expertise and their ability to bring together collaborators with complementary expertise. This will lead to several new opportunities for our students and prepare them for leadership positions in a field that will define the cutting edge of transportation and space exploration.”

AVMI is funded by the U.S. Army, NASA, the U.S. Department of Energy and industry partners in both the U.S. and Western Europe. 

“I’m very excited to join the faculty of WPI to continue working on autonomous off-road vehicles that could be used in agriculture, construction, the military, and especially planetary exploration,” Moradi said. “As humans continue to explore space, developing autonomous vehicles that can function on other planets under harsh conditions will be of the utmost importance.”

Vantsevich has research and engineering experience in mechanical and intelligent mechatronic multi-physics systems, with applications to vehicle system modeling and simulation. Moradi spent more than 18 years working in industry after receiving his BS in engineering and his MS and Ph.D. in civil engineering from the University of Alabama at Birmingham (UAB). Before joining WPI in early 2022, Vantsevich and Moradi worked together as faculty members at UAB. 

The post WPI launches Autonomous Vehicle Mobility Institute appeared first on The Robot Report.


SafeAI brings in $38M for construction retrofit kits


SafeAI announced that it brought in $38 million in Series B funding. The company offers robotic retrofit kits for construction and mining vehicles. 

SafeAI plans to use the funding to accelerate its autonomous vehicle technology roadmap and to scale operations globally to support its growing customer base and deliver on contractual milestones. The company plans to expand its developer and engineering teams. SafeAI especially wants to fill the position of Chief Technology Officer to lead its efforts in the autonomous vehicle engineering space. 

“Autonomy has existed in mining for more than two decades—but its growth has stalled when it needed to skyrocket. I founded SafeAI to change that,” Bibhrajit Halder, founder and CEO at SafeAI, said. “We’ve designed a flexible, interoperable, scalable retrofit model to enable companies across heavy industry to uplevel their operations. The mining industry has been successfully implementing autonomy on a limited scale for years. Our approach is purpose-built to finally accelerate autonomous accessibility and deployment on a significant scale. This funding is just the beginning of our next chapter of expansion and growth.”

The funding round included participation from Builders VC, McKinley Management, George Kaiser Family Foundation and Energy Innovation Capital. Moog Inc also joined the round as a strategic investor. Existing investors Autotech Ventures, Brick & Mortar Ventures, Embark Ventures, Newlab and Vimson Group also participated in the round. 

“Moog Inc. is focused on enabling productivity, safety and sustainability when performance really matters. Our autonomy solutions transform equipment into high integrity robots, enabling autonomous operations in challenging environments,” Joe Baldi, Director of Strategy and Partnerships for the construction sector of Moog Inc, said. “Through this investment in SafeAI, we can collaborate to accelerate the adoption of autonomy for heavy equipment and help companies across construction and mining realize the benefits to their operations.”

SafeAI’s retrofit kits equip aftermarket vehicles and fleets with autonomous capabilities, regardless of manufacturer or vehicle type. 

In 2022, SafeAI announced partnerships with MACA to deploy 100 mining trucks, with Siemens to collaborate on vehicle electrification and autonomy, and with Obayashi Corporation to develop solutions for construction. The company has employees in the U.S., Australia, Japan and India, and more than doubled its headcount in 2022. 

The post SafeAI brings in $38M for construction retrofit kits appeared first on The Robot Report.


Australia establishes National Robotics Strategy Advisory Committee



Australia-based robotics company Lyro Robotics makes an autonomous packing robot. | Source: Lyro Robotics

Ed Husic, Australia’s Minister for Industry and Science, appointed a National Robotics Strategy Advisory Committee. The committee will help to guide Australia’s strategy for emerging automation technologies. 

The committee will develop a national robotics strategy to help the country harness robotics and automation opportunities. It will examine robotics across every industry, from advanced manufacturing to agriculture. 

“We have brought together some of the nation’s leading robotics and technology thinkers and practitioners to guide the way we develop and use robotics,” Husic said. “Australia has a lot of the key elements that can help in the development of national robotics capabilities: our people, research and manufacturing skills. And while we’re recognized as possessing strength in field robotics, we can do better, across a wider range of activities.”

The National Robotics Strategy Advisory Committee is chaired by Professor Bronwyn Fox, the Chief Scientist of CSIRO, Australia’s national science agency. 

Other members of the committee include:

  • Catherine Ball, an associate professor at the Australian National University 
  • Andrew Dettmer, the National President of the Australian Manufacturing Workers’ Union 
  • Hugh Durrant-Whyte, the NSW chief scientist and engineer 
  • Sue Keay, the founder and chair of the Robotics Australia Group
  • Simon Lucey, the director of the Australian Institute of Machine Learning 
  • Julia Powles, the director of the UWA Minderoo Tech & Policy Lab
  • Mike Zimmerman, a partner at Main Sequence Ventures

“Australian-made and maintained robotics and automation systems have the potential to boost local manufacturing, open up export opportunities and create safer and more productive work environments,” Husic said.

Husic also said that the National Robotics Strategy Advisory Committee will aim to develop robotic strength while also developing human skills so that Australians still have access to secure, well-paying jobs. Husic asked for the strategy to be finalized by March 2023. 

The post Australia establishes National Robotics Strategy Advisory Committee appeared first on The Robot Report.
